CN108762488A - A single-base-station portable VR system based on wireless human-body motion capture and optical positioning - Google Patents

A single-base-station portable VR system based on wireless human-body motion capture and optical positioning

Info

Publication number
CN108762488A
CN108762488A (application CN201810419675.8A)
Authority
CN
China
Prior art keywords
human body
joint
optical positioning
hub
base station
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810419675.8A
Other languages
Chinese (zh)
Inventor
Li Rui (李蕊)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Meng Zhuo Technology (Shenzhen) Co., Ltd.
Original Assignee
Meng Zhuo Technology (Shenzhen) Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Meng Zhuo Technology (Shenzhen) Co., Ltd.
Priority to CN201810419675.8A
Publication of CN108762488A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20 Input arrangements for video game devices
    • A63F 13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F 13/211 Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20 Input arrangements for video game devices
    • A63F 13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F 13/212 Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20 Input arrangements for video game devices
    • A63F 13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F 13/213 Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F 13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/80 Special adaptations for executing a specific game genre or game mode
    • A63F 13/837 Shooting of targets
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F 2300/105 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals using inertial sensors, e.g. accelerometers, gyroscopes
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/60 Methods for processing data by generating or executing the game program
    • A63F 2300/6045 Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F 2300/8076 Shooting
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F 2300/8082 Virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/01 Indexing scheme relating to G06F3/01
    • G06F 2203/012 Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • Cardiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A single-base-station portable VR system based on wireless human-body motion capture and optical positioning, comprising: a motion sensor module that monitors the three-dimensional orientation of the human limbs; an optical positioning module that monitors the absolute spatial positions of the human body and the VR head-mounted display (HMD); a computing module that, from the data of the motion sensor module and the optical positioning module, computes the positions of the body and limbs in three-dimensional space and the trajectories of their position and orientation; and a VR terminal device comprising a smartphone and a VR HMD, wherein a VR program on the smartphone refreshes the displayed content according to the position and orientation trajectories, applies the position and orientation information (including that of both hands) in the VR program, and outputs VR images that the user views through the HMD. The system is highly portable, imposes no space or venue constraints, and needs no fixed play area. Using only a single optical tracking base station, it tracks the user's position over a full 360° range and brings body motion into the VR system.

Description

A single-base-station portable VR system based on wireless human-body motion capture and optical positioning
Technical field
The invention belongs to the field of VR applications, and in particular relates to a single-base-station portable VR system based on wireless human-body motion capture and optical positioning.
Background technology
Current VR technologies and devices fall broadly into three classes:
1. Professional VR with time-stamp-based optical positioning
A typical product is the HTC VIVE. It comprises one HMD, one or two handheld controllers, and two positioning base stations. Each base station emits, at a fixed frequency, laser signals carrying time-stamp information; the HMD and controllers carry multiple laser receivers, and from the time stamps in the received signals the distances from the HMD and controllers to the base stations can be computed, which determines their positions. Because the HMD and controllers must receive the base stations' lasers directly, two base stations are required, and they must be mounted near the ceiling, so that the HMD and controllers can be positioned over a 360° range.
2. Professional VR with binocular-camera optical positioning
Typical products are the Oculus Rift and Sony PlayStation VR. They comprise one HMD, one or two controllers, and at least two positioning base stations. The HMD and controllers carry multiple infrared or visible-light emitters, and each base station contains a binocular (stereo) camera. By tracking the infrared or visible light from the HMD and controllers, the stereo camera computes their spatial positions. Two base stations are needed to position the HMD and controllers over a 360° range.
3. VR headsets without optical positioning
Such headsets typically have no display of their own; their main components are lenses. A mobile phone is inserted into the headset as the display, and the user views VR videos or game content on the phone. With no optical tracking unit, the positions of the headset and the hands cannot be known, so the viewpoint in the VR content cannot translate. Such headsets generally use the phone's built-in IMU to obtain the headset's fore-aft and side-to-side tilt and rotation, and update the VR content accordingly.
Summary of the invention
To overcome the above shortcomings of the prior art, the object of the present invention is to provide a single-base-station portable VR system based on wireless human-body motion capture and optical positioning.
To achieve this object, the present invention adopts the following technical solution:
A single-base-station portable VR system based on wireless human-body motion capture and optical positioning, characterized in that it comprises:
a motion sensor module, which monitors the three-dimensional orientation of the human limbs;
an optical positioning module, which monitors the absolute spatial positions of the human body and the VR HMD;
a computing module, which, from the data of the motion sensor module and the optical positioning module, computes the positions of the body and limbs in three-dimensional space and the trajectories of their position and orientation;
a VR terminal device, comprising a smartphone and a VR HMD, wherein a VR program on the smartphone refreshes the displayed content according to the position and orientation trajectories, applies the position and orientation information (including that of both hands) in the VR program, and outputs VR images that the user views through the HMD.
The motion sensor module comprises nodes and finger rings. Each node and ring contains an accelerometer, a gyroscope, and a magnetometer. Nodes are attached at each major joint of the body to capture the joint's orientation in three-dimensional space; a ring is worn on the first joint of the index finger to obtain that joint's orientation in three-dimensional space.
Each ring has a VR control button on the thumb side, convenient for the thumb to click or long-press; the nodes are attached at the major joints of the body with wearable straps.
The optical positioning module comprises a hub storage/charging box fitted with a binocular infrared camera and infrared LED lamps, together with retroreflective infrared patches placed on the user's back and on the VR terminal module; the hub storage/charging box contains a microprocessor. The infrared LED lamps illuminate the retroreflective patches, which form bright patterns; the binocular infrared camera captures and localizes these bright patterns through its imaging, realizing optical positioning.
The hub storage/charging box serves as the base station of the VR system.
The hub storage/charging box stores the nodes and rings, and its built-in charging module charges them.
The computing module runs on the microprocessor; the absolute-position data and all orientation data converge in the microprocessor, which computes the spatial position of each joint and then, introducing the time dimension, tracks the motion trajectories of the body and limbs.
The spatial position of a joint is computed as follows:
1) When the body faces or is side-on to the binocular infrared camera:
Taking the HMD position P0(x0, y0, z0) as the reference, let 1, 2, ..., J be the chain of joints from the head to joint J. The coordinates PJ of joint J are obtained by iterating, for i = 1 to J:

P_i = P_{i-1} - (0, h_{i-1}, 0) * Q_{i-1}

where h_{i-1} is the length of the limb segment of joint i on the head side, and Q_{i-1} is the three-dimensional orientation quaternion of that segment (the product denotes rotating the vector by the quaternion). When i = J, the coordinates P_J are obtained.
2) When the body has its back to the binocular infrared camera:
Let 1, 2, ..., J be the chain of joints from the back marker to joint J; the coordinates P_J are obtained by the same iteration, for i = 1 to J:

P_i = P_{i-1} - (0, h_{i-1}, 0) * Q_{i-1}

where h_{i-1} is the length of the limb segment of joint i on the back side, and Q_{i-1} is the orientation quaternion of that segment. When joint J is above the back, h_0 is the distance from the back pattern to the neck; when joint J is below the back, h_0 is the distance from the back pattern to the waist.
When i = J, the coordinates P_J are obtained.
An accelerometer, gyroscope, and magnetometer may be added inside the hub storage/charging box, and a sensor-fusion algorithm used to compute the hub's own three-dimensional orientation quaternion Q_hub. The binocular-positioning result P0 is first corrected with P0' = P0 * Q_hub^-1, where Q_hub^-1 is the inverse quaternion of Q_hub, giving the coordinates the hub would report if it were level; the subsequent computation then proceeds with P0'.
Compared with the prior art, the present invention monitors the three-dimensional orientation of the limbs with motion sensors and monitors the body's displacement with optical positioning; combining the two, it detects the positions of the body and limbs in three-dimensional space and the trajectories of their position and orientation, and feeds the motion data into a VR or 3D virtual world in real time. Relative to traditional VR systems it is highly portable, has no space or venue constraints, and needs no fixed play area; and, unlike traditional VR systems, it uses only one optical tracking base station yet positions and tracks the user over a full 360° range while bringing body motion into the VR system.
Description of the drawings
Fig. 1 is a schematic diagram of the overall structure of the system.
Fig. 2 is a schematic diagram of the position computation when the body faces or is side-on to the binocular infrared camera.
Fig. 3 is a schematic diagram of the position computation when the body has its back to the binocular infrared camera.
Detailed description of the embodiments
Embodiments of the present invention are described in detail below with reference to the accompanying drawings.
As shown in Fig. 1, the portable VR system of the present invention mainly comprises:
a hub storage/charging box (Hub) 100, several nodes (Node) and finger rings (Ring), wearable straps 400, a VR HMD 500, and a smartphone 600.
The hub storage/charging box 100 is fitted with a binocular infrared camera 101 and infrared LED lamps 102. A retroreflective infrared patch 702 of distinctive shape is affixed to the user's back, and retroreflective infrared patches 701 of distinctive shape are affixed to the body and strap of the VR HMD 500; under the illumination of the infrared LED lamps 102 they strongly reflect infrared light and form bright patterns. These reflective patches can be captured and localized by the binocular infrared camera 101, which thereby positions the HMD 500 and the user's back. The hub also provides storage/charging slots 103 for the nodes and rings, in which it can store and charge them. The hub storage/charging box 100 thus plays the base-station role of a traditional VR system.
Nodes can be attached at the major joints of the body to capture the joints' spatial orientation. Each node contains an accelerometer, a gyroscope, and a magnetometer; after the data of the three sensors are fused, the node's orientation in three-dimensional space is obtained.
A ring is worn on the first joint of the index finger. It likewise contains an accelerometer, a gyroscope, and a magnetometer; after the data of the three sensors are fused, the orientation of the index finger's first joint in three-dimensional space is obtained. The ring also has a button on the thumb side, convenient for the thumb to click or long-press for game input; for example, when the system is used in a VR shooting game, pressing the ring's button with the thumb can fire.
A Sensor Fusion program runs in each node and ring: the accelerometer, gyroscope, and magnetometer data are fused to yield the node's or ring's orientation in 3D space. Since the node or ring is attached to the body, its three-dimensional orientation represents the orientation of the corresponding body segment.
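As an illustration of the kind of fusion such a program performs (the patent names the technique but not its algorithm, so this is a hedged single-axis sketch with an assumed blend factor), a complementary filter integrates the gyroscope rate for short-term accuracy and pulls the estimate toward the accelerometer's gravity-based tilt to cancel drift:

```python
def complementary_filter(pitch_deg, gyro_rate_dps, accel_pitch_deg, dt, alpha=0.98):
    """One step of a single-axis complementary filter (illustrative only).

    pitch_deg       -- previous fused pitch estimate, degrees
    gyro_rate_dps   -- gyroscope angular rate, degrees per second
    accel_pitch_deg -- pitch implied by the accelerometer's gravity vector
    alpha           -- trust in the gyro integral vs. the accel correction
    """
    gyro_prediction = pitch_deg + gyro_rate_dps * dt          # integrate the gyro
    return alpha * gyro_prediction + (1.0 - alpha) * accel_pitch_deg


# Starting from a wrong estimate, repeated accelerometer corrections remove drift.
pitch = 0.0
for _ in range(200):
    pitch = complementary_filter(pitch, gyro_rate_dps=0.0,
                                 accel_pitch_deg=10.0, dt=0.01)
# pitch converges toward 10 degrees
```

A full implementation would fuse all three sensors into a quaternion (magnetometer for heading), but the blend structure is the same.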
The wearable straps 400 may use a hook-and-loop structure, so that nodes can be conveniently attached to each body location and easily removed.
Referring to Fig. 1, nodes are attached by the straps 400 at up to 19 body locations, shown in the figure as: occiput node 201, back-of-neck node 202, right shoulder node 203, back node 204, right upper-arm node 205, spine node 206, waist/hip node 207, right forearm node 208, right hand-back node 209, right thigh node 210, right lower-leg node 211, right foot node 212, left foot node 213, left lower-leg node 214, left thigh node 215, left hand-back node 216, left forearm node 217, left upper-arm node 218, and left shoulder node 219; in addition the left index finger wears left ring 302 and the right index finger wears right ring 301.
A retroreflective infrared patch 702 of distinctive shape is affixed to the back, and patches 701 of distinctive shape are affixed to the body and strap of the HMD 500. In this embodiment the back patch 702 spans left shoulder node 219 and back node 204, right shoulder node 203 and back node 204, and back node 204 and spine node 206, but this is not required. The shapes of patches 701 and 702 are not critical, as long as they are easy to recognize.
The smartphone 600 is inserted into the VR HMD 500, and the HMD 500 is worn on the head. The hub is placed to one side, 2 to 5 meters away, with the binocular infrared camera 101 facing the user. The two cameras capture the same scene; from the pixel disparity of the same pattern in the two images and the physical distance (baseline) between the two cameras, the distance of the pattern from the stereo camera, that is, the pattern's position in space, is computed. The pattern's position in space is the position of the head or the back.
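The disparity-to-distance step described above follows the standard rectified-stereo relation Z = f * B / d (focal length in pixels, baseline in meters, disparity in pixels). A minimal sketch, with illustrative numbers rather than the hub's actual calibration:

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Distance of a pattern from a rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("pattern must be seen by both cameras with positive disparity")
    return focal_px * baseline_m / disparity_px


# The farther the pattern, the smaller the pixel disparity between the two images.
near = stereo_depth(focal_px=700.0, baseline_m=0.10, disparity_px=70.0)  # about 1 m
far = stereo_depth(focal_px=700.0, baseline_m=0.10, disparity_px=14.0)   # about 5 m
```

In practice the cameras must first be calibrated and rectified so that the pattern lies on the same scan line in both images; the patch's pixel coordinates then also give the X and Y of the point by back-projection.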
All nodes and rings transmit their three-dimensional orientation data to the computing module in real time over Bluetooth or Wi-Fi. Meanwhile the hub storage/charging box 100 positions the VR HMD 500 and the user's back in real time through the binocular infrared camera 101. Using these two sets of data, the hub's computing module computes the positions of both hands and all limbs, and then transmits all body positions and orientations wirelessly to the smartphone 600. The VR program on the smartphone 600 refreshes the displayed content according to the head's position and orientation, applies the position and orientation information (including that of both hands) in the VR program, and the result is finally viewed by the user through the HMD 500.
With the HMD 500 and the user's back positioned by the binocular infrared camera 101 as above, and fusing the limbs' three-dimensional orientations, the positions of all limbs are computed as follows:
1. When the body faces or is side-on to the binocular infrared camera 101
The binocular infrared camera 101 can always see the VR HMD 500, i.e. the hub storage/charging box 100 can always position the HMD 500.
As shown in Fig. 2, the body is represented as a linked system of 19 rigid-body joints; the dark-gray segments are primary parts and the light-gray segments are auxiliary parts.
The position of the HMD 500 found by the binocular infrared camera 101 is P0, with coordinates (x0, y0, z0).
Taking the HMD position P0(x0, y0, z0) as the reference, let 1, 2, ..., J be the chain of joints from the head to joint J. The coordinates PJ of joint J are obtained by iterating, for i = 1 to J:

P_i = P_{i-1} - (0, h_{i-1}, 0) * Q_{i-1}

where h_{i-1} is the length of the limb segment of joint i on the head side, and Q_{i-1} is the three-dimensional orientation quaternion of that segment. When i = J, the coordinates P_J are obtained.
Concretely, let the head length be h0 and the head's three-dimensional orientation be the quaternion Q0.
The position of the head-neck junction J1 can then be computed as follows.
The vector V0(0, h0, 0) through the origin has the same length as the head.
V0' = V0 * Q0 rotates V0 into the vector V0' fully aligned with the head's orientation; V0' still passes through the origin.
Let S0 be the translation from V0' to P0, i.e. S0 = P0 - V0'.
Then the coordinate of J1 is P_J1 = (0, 0, 0) + S0 = P0 - V0' = P0 - V0 * Q0.
Given the lengths of the other limb segments, the positions of both ends of each segment can be computed by the same method.
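The iteration P_i = P_{i-1} - (0, h_{i-1}, 0) * Q_{i-1} can be sketched in a few lines of Python. This is an illustration of the formula only, with made-up segment lengths; the patent gives the equation, not an implementation:

```python
def quat_mul(q, r):
    """Hamilton product of quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def quat_rotate(v, q):
    """Rotate 3-vector v by unit quaternion q: v' = q v q*."""
    p = (0.0, v[0], v[1], v[2])
    qc = (q[0], -q[1], -q[2], -q[3])
    _, x, y, z = quat_mul(quat_mul(q, p), qc)
    return (x, y, z)

def joint_position(p0, segment_lengths, segment_quats):
    """Walk the joint chain outward from the optically positioned reference p0."""
    p = p0
    for h, q in zip(segment_lengths, segment_quats):
        v = quat_rotate((0.0, h, 0.0), q)   # bone vector rotated into its orientation
        p = (p[0] - v[0], p[1] - v[1], p[2] - v[2])
    return p

# Head tracked at 2.0 m; one upright head segment of 0.25 m puts the neck at 1.75 m.
IDENTITY = (1.0, 0.0, 0.0, 0.0)
neck = joint_position((0.0, 2.0, 0.0), [0.25], [IDENTITY])
```

Longer chains (head to shoulder to elbow to wrist) simply pass longer length and quaternion lists, exactly as in the formula's i = 1 to J loop.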
2. When the body has its back to the binocular infrared camera 101
When the body has its back to the camera, the camera can usually still see the strap of the HMD 500 and can still position the HMD 500; the positions of the other limbs are then computed as above.
In some cases the binocular infrared camera 101 cannot see the HMD 500 at all; usually, however, it can still see the user's back.
The position of the back reflective area found by the camera is P0, with coordinates (x0, y0, z0); the three-dimensional orientation of the back, i.e. of the torso, is the quaternion Q0.
Let 1, 2, ..., J be the chain of joints from the back marker to joint J; the coordinates PJ are obtained by iterating, for i = 1 to J:

P_i = P_{i-1} - (0, h_{i-1}, 0) * Q_{i-1}

where h_{i-1} is the length of the limb segment of joint i on the back side, and Q_{i-1} is the orientation quaternion of that segment. When joint J is above the back, h0 is the distance from the back pattern to the neck; when joint J is below the back, h0 is the distance from the back pattern to the waist.
When i = J, the coordinates P_J are obtained.
Concretely, let the length from P0 up to J1 be h0u, and the length from P0 down to J2 be h0d.
Let V0u(0, h0u, 0) and V0d(0, h0d, 0) be vectors through the origin.
Then the coordinate of J1 is: P_J1 = P0 + V0u * Q0,
and the coordinate of J2 is: P_J2 = P0 - V0d * Q0,
where * denotes rotating the vector by the quaternion (matrix multiplication in the equivalent rotation-matrix form).
A node yields only spatial orientation, not position. But the body is a connected joint system. If, for example, the upper arm is still while the forearm moves, the relative position change of the forearm's end, i.e. the wrist, can be computed from the forearm's orientation change and its length. Given the lengths of the other limb segments, the positions of both ends of each segment follow by the same method; for instance, knowing the absolute position of the elbow, the absolute position change of the wrist can be computed.
The computations in cases 1 and 2 above assume that the hub storage/charging box 100 is always placed level.
As an improvement, an accelerometer, gyroscope, and magnetometer can be added inside the hub storage/charging box 100, and a sensor-fusion algorithm used to compute the hub's own three-dimensional orientation quaternion Q_hub.
The binocular-positioning result P0 is then first corrected as
P0' = P0 * Q_hub^-1, where Q_hub^-1 is the inverse quaternion of Q_hub,
giving the coordinates the hub would report if it were level; the computation above then continues with P0'.
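The correction P0' = P0 * Q_hub^-1 amounts to rotating the measured position by the inverse of the hub's orientation. A hedged sketch (quaternions in (w, x, y, z) order; the 90-degree tilt is an example chosen for illustration):

```python
import math

def quat_mul(q, r):
    """Hamilton product of quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def level_correct(p_measured, q_hub):
    """Rotate a stereo fix by the inverse of the hub's orientation quaternion."""
    q_inv = (q_hub[0], -q_hub[1], -q_hub[2], -q_hub[3])  # unit-quaternion inverse
    v = (0.0, p_measured[0], p_measured[1], p_measured[2])
    _, x, y, z = quat_mul(quat_mul(q_inv, v), q_hub)
    return (x, y, z)

# Hub rotated 90 degrees about the vertical (z) axis: a point it reports on +y
# is really on +x in level coordinates.
half = math.radians(90.0) / 2.0
q_hub = (math.cos(half), 0.0, 0.0, math.sin(half))
level = level_correct((0.0, 1.0, 0.0), q_hub)
```

A level hub (identity quaternion) leaves the fix unchanged, so the correction is safe to apply unconditionally.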
Estimating the lengths of the body segments
The computations above all require the lengths of the body segments, such as head length, neck length, shoulder width, upper-torso length, and the location of the back node on the upper torso. These parameters can be entered by the user.
A simple estimate: from the user's height, age, and gender, and standard human body proportions, estimate the required segment lengths and the height of the back reflective area.
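Such an estimate might look like the following sketch. The ratios are approximate population-average body proportions (in the spirit of the Drillis-Contini segment data), not values given in the patent, and a real system would also adjust for age and gender:

```python
def estimate_segment_lengths(height_m):
    """Rough limb-segment lengths as fixed fractions of standing height.

    The ratios below are illustrative averages, not patent-specified values.
    """
    ratios = {
        "head": 0.130,            # vertex to chin
        "shoulder_width": 0.259,  # biacromial breadth
        "upper_arm": 0.186,
        "forearm": 0.146,
        "thigh": 0.245,
        "shank": 0.246,
    }
    return {name: ratio * height_m for name, ratio in ratios.items()}


segments = estimate_segment_lengths(1.75)  # a 1.75 m tall user
# e.g. thigh about 0.43 m, forearm about 0.26 m
```

Letting the user override any individual length keeps the forward-kinematics chain accurate for non-average builds.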
Position-error correction
The portable VR system uses only one optical positioning base station, which positions the HMD 500 or the back; from that reference position, combined with the body pose, the positions of the other body parts are computed. Since the segment lengths used by the system (head, neck, shoulder width, back length, upper- and lower-arm lengths, thigh and calf lengths, etc.) are estimated from height, age, and gender, errors are unavoidable; reducing the body to a system of 19 rigid bone segments also introduces error. Error-correction techniques are therefore needed.
In a VR system the positioning of the head is especially critical, because the head's position and angle determine the VR image content. If the head position jumps, the VR image visibly stutters and jumps, hurting the experience. The portable VR system therefore smooths the head position to suppress such errors and improve the experience. The logic is:
a. When the hub (base station) can position the HMD 500, i.e. can position the head, the head position is the reference, and the positions of the hands and other body parts are computed from it.
b. When the hub could position the HMD 500 at the previous instant t_{n-1} but cannot at the current instant t_n, yet can position the back: compute the positions of the head, hands, and other body parts at t_n from the back position. If the computed head position disagrees with the head position at t_{n-1}, keep the t_{n-1} head position. If the back position at t_n differs from the back position computed at t_{n-1}, first try using the t_n back position while adjusting the orientations of the light-gray (auxiliary) segments so that the head and back positions at t_n match those at t_{n-1}; if that attempt fails, use the t_{n-1} back position, record the offset Shift_Back between the t_n and t_{n-1} back positions, and apply this offset whenever the back's optical position is used subsequently.
c. When the hub could not position the HMD 500 at t_{n-1} but can at t_n, and can also position the back: if the head position computed at t_{n-1} differs from the head position observed at t_n, keep the t_{n-1} head position, record the offset Shift_Head between the two, and apply it to subsequently observed optical head positions. From the corrected head position, compute the positions of the hands and other body parts.
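Steps a-c amount to holding the last good head fix and folding any re-acquisition jump into a standing offset. A minimal sketch of that bookkeeping; the 0.10 m jump threshold and the class shape are assumptions, not from the patent:

```python
class HeadTracker:
    """Hold the last good head fix; fold re-acquisition jumps into an offset."""

    def __init__(self, jump_threshold_m=0.10):  # threshold is an assumed value
        self.offset = (0.0, 0.0, 0.0)
        self.last = None
        self.jump_threshold_m = jump_threshold_m

    def update(self, optical_pos):
        """optical_pos is None while the headset marker is occluded."""
        if optical_pos is None:
            return self.last  # hold the last known head position
        corrected = tuple(p + o for p, o in zip(optical_pos, self.offset))
        if self.last is not None and self._dist(corrected, self.last) > self.jump_threshold_m:
            # Re-acquisition jump: keep the old position and remember the shift,
            # mirroring the Shift_Head / Shift_Back bookkeeping in steps b and c.
            self.offset = tuple(o + (l - c) for o, l, c in
                                zip(self.offset, self.last, corrected))
            corrected = self.last
        self.last = corrected
        return corrected

    @staticmethod
    def _dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
```

After a suppressed jump, genuine small motions pass through with the recorded offset applied, so tracking continues smoothly from where the head was last seen.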
This portable VR system uses infrared-reflective paper patches to form the highlighted regions that the infrared binocular camera images and tracks. The reason for this approach is that it works with any generic VR headset 500.
Infrared or visible LED lamps could also be used in place of the reflective patches, but then only a customized VR headset 500 could be used.
In this system, optical positioning determines the absolute spatial position of the helmet or the back; then, from the joint connectivity of the human skeleton and the length and orientation of each limb segment, the absolute position of every body part is calculated, thereby tracking the orientation and position changes of each part of the whole body.
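The skeleton walk just described can be sketched as a forward-kinematics loop over the joint chain; the function names and the quaternion convention (w, x, y, z) below are assumptions for illustration:

```python
import numpy as np

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    u = np.array([x, y, z])
    v = np.asarray(v, dtype=float)
    return v + 2.0 * np.cross(u, np.cross(u, v) + w * v)

def body_part_position(p0, limb_lengths, limb_quats):
    """Walk a joint chain from the optically located anchor p0 (the HMD
    or the back marker): for each limb, subtract the rest-pose vector
    (0, h, 0) rotated into world space by that limb's IMU quaternion,
    i.e. P_i = P_{i-1} - R(Q_{i-1}) (0, h_{i-1}, 0).
    Function and argument names are illustrative, not the patent's own."""
    p = np.asarray(p0, dtype=float)
    for h, q in zip(limb_lengths, limb_quats):
        p = p - quat_rotate(np.asarray(q, dtype=float),
                            np.array([0.0, h, 0.0]))
    return p
```

With identity quaternions (every limb hanging straight down the -y axis), the chain simply stacks the limb lengths below the anchor.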

Claims (10)

1. A single-base-station portable VR system based on wireless human motion capture and optical positioning, characterized in that it comprises:
a motion sensor module, which monitors the three-dimensional orientation of the human limbs;
an optical positioning module, which monitors the absolute spatial positions of the human body and the VR headset;
a computing module, which, from the data of the motion sensor module and the optical positioning module, calculates the positions of the body and limbs in three-dimensional space and the trajectories of their changes in position and orientation;
a VR terminal device, comprising a smartphone and a VR headset, wherein a VR program built into the smartphone refreshes the displayed content according to those position and orientation trajectories, applies the position and orientation information, including that of both hands, within the VR program, and outputs VR imagery through the VR headset for the user to view.
2. The single-base-station portable VR system based on wireless human motion capture and optical positioning according to claim 1, characterized in that the motion sensor module comprises nodes and a finger ring, each node and the finger ring containing an accelerometer, a gyroscope, and a magnetometer; the nodes are attached at the major joints of the human body to capture the three-dimensional orientation of those joints, and the finger ring is worn on the first joint of the index finger to obtain the three-dimensional orientation of that joint.
3. The single-base-station portable VR system based on wireless human motion capture and optical positioning according to claim 2, characterized in that the finger ring has a VR control input button on the thumb-facing side, convenient for the thumb to click or long-press, and the nodes are attached at the major joints of the human body by wearable straps.
4. The single-base-station portable VR system based on wireless human motion capture and optical positioning according to claim 2, characterized in that the optical positioning module comprises a hub storage-and-charging box fitted with a binocular infrared camera and infrared LED lamps, together with infrared-reflective paper patches arranged on the wearer's back and on the VR terminal module, the hub storage-and-charging box having a built-in microprocessor; the infrared LED lamps illuminate the reflective patches to form highlighted patterns, and the binocular infrared camera captures and locates these patterns through its imaging, realizing optical positioning.
5. The single-base-station portable VR system based on wireless human motion capture and optical positioning according to claim 4, characterized in that the hub storage-and-charging box serves as the base station of the VR system.
6. The single-base-station portable VR system based on wireless human motion capture and optical positioning according to claim 4, characterized in that the hub storage-and-charging box stores the nodes and the finger ring, and a built-in charging module charges them.
7. The single-base-station portable VR system based on wireless human motion capture and optical positioning according to claim 4, characterized in that the computing module runs on the microprocessor; the absolute spatial position data and all orientation data converge in the microprocessor, which calculates the spatial position of each joint and then, introducing the time dimension, tracks the motion trajectories of the body and limbs.
8. The single-base-station portable VR system based on wireless human motion capture and optical positioning according to claim 7, characterized in that the spatial position of a joint is calculated as follows:
1) when the wearer faces, or is side-on to, the binocular infrared camera:
taking the VR headset position P0(x0, y0, z0) as the starting point, let 1, 2, …, J be the list of joints from the head to joint J; the coordinates PJ of joint J are then obtained by repeatedly computing, for i = 1 to J:
Pi = Pi-1(xi-1, yi-1, zi-1) − (0, hi-1, 0) * Qi-1
wherein hi-1 is the length of the limb on the head side of joint i and Qi-1 is the spatial-orientation quaternion of that limb; when i = J, the coordinates PJ are obtained;
2) when the wearer's back faces the binocular infrared camera:
let 1, 2, …, J be the list of joints from the back to joint J; the coordinates PJ of joint J are then obtained by repeatedly computing, for i = 1 to J:
Pi = Pi-1(xi-1, yi-1, zi-1) − (0, hi-1, 0) * Qi-1
wherein hi-1 is the length of the limb on the back side of joint i and Qi-1 is the spatial-orientation quaternion of that limb; when joint J is above the back, h0 is the distance from the back pattern to the neck; when joint J is below the back, h0 is the distance from the back pattern to the waist;
when i = J, the coordinates PJ are obtained.
9. The single-base-station portable VR system based on wireless human motion capture and optical positioning according to claim 8, characterized in that an accelerometer, a gyroscope, and a magnetometer are added to the hub storage-and-charging box, and a sensor-fusion algorithm computes the box's own three-dimensional orientation QHinge, a quaternion; the position P0 from binocular positioning is first corrected with the formula P0' = P0 * QHinge-1 to obtain the coordinates the hub storage-and-charging box would report if it were level, after which the subsequent calculations proceed; wherein QHinge-1 is the inverse quaternion of QHinge.
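Interpreting the formula P0' = P0 * QHinge-1 as rotating the measured position vector by the inverse of the hub's fused orientation, a minimal sketch (helper names assumed) is:

```python
import numpy as np

def quat_conj(q):
    """Conjugate (= inverse, for a unit quaternion) of q = (w, x, y, z)."""
    w, x, y, z = q
    return np.array([w, -x, -y, -z])

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    u = np.array([x, y, z])
    v = np.asarray(v, dtype=float)
    return v + 2.0 * np.cross(u, np.cross(u, v) + w * v)

def level_correct(p0, q_hub):
    """Undo the hub box's own tilt: rotate the binocular position fix p0
    by the inverse of the hub's fused orientation quaternion q_hub, giving
    the coordinates the hub would report if it were sitting level."""
    return quat_rotate(quat_conj(q_hub), p0)
```

For example, if the hub is rotated 90 degrees about its vertical axis, the correction rotates every position fix back by 90 degrees before the joint-chain calculation runs.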
10. The single-base-station portable VR system based on wireless human motion capture and optical positioning according to claim 8, characterized in that position errors are corrected by smoothing the head position, specifically:
A. when the hub can locate the VR headset, the head position is used as the reference from which the positions of the other body parts are calculated;
B. when the hub could locate the VR headset at the previous instant tn-1 but cannot locate it at the current instant tn, yet can locate the wearer's back, the positions of the head, both hands, and the other body parts at tn are calculated from the back position; if the calculated head position is inconsistent with the head position at tn-1, the tn-1 head position is used; if the back position at tn differs from the back position calculated at tn-1, attempt 1 is tried: use the back position at tn while adjusting the orientations of the auxiliary body parts so that the head and back positions at tn match those at tn-1; if attempt 1 fails, the back position at tn-1 is used, and the offset between the back positions at tn and tn-1 is recorded as ShiftBack; this offset is applied in subsequent optical positioning of the back;
C. when the hub could not locate the VR headset at tn-1 but can locate it at tn, and can also locate the wearer's back: if the head position calculated at tn-1 differs from the head position located at tn, the tn-1 head position is used, and the offset between the two is recorded as ShiftHead; this offset is applied to head positions subsequently obtained by optical positioning; on the basis of the corrected head position, the positions of the other body parts are calculated.
CN201810419675.8A 2018-05-04 2018-05-04 A kind of single base station portable V R system based on wireless human body motion capture and optical alignment Pending CN108762488A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810419675.8A CN108762488A (en) 2018-05-04 2018-05-04 A kind of single base station portable V R system based on wireless human body motion capture and optical alignment


Publications (1)

Publication Number Publication Date
CN108762488A true CN108762488A (en) 2018-11-06

Family

ID=64010088

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810419675.8A Pending CN108762488A (en) 2018-05-04 2018-05-04 A kind of single base station portable V R system based on wireless human body motion capture and optical alignment

Country Status (1)

Country Link
CN (1) CN108762488A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110567451A (en) * 2019-09-20 2019-12-13 深圳市丰之健电子科技有限公司 Human body posture recognition instrument device and use method thereof
CN110631411A (en) * 2019-09-02 2019-12-31 北京易智时代数字科技有限公司 Virtual shooting training control method and system
CN110955335A (en) * 2019-12-18 2020-04-03 视境技术(深圳)有限公司 Motion capture system and method
CN111372070A (en) * 2018-12-26 2020-07-03 宏碁股份有限公司 Tracking positioning system and positioning and correcting method thereof
CN111947650A (en) * 2020-07-14 2020-11-17 杭州瑞声海洋仪器有限公司 Fusion positioning system and method based on optical tracking and inertial tracking
CN112604272A (en) * 2020-12-16 2021-04-06 深圳康佳电子科技有限公司 Mobile control method and device for VR (virtual reality) game, intelligent terminal and storage medium
US11315530B2 (en) 2018-11-28 2022-04-26 Acer Incorporated Tracking system and related positioning and calibration methods

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020024675A1 (en) * 2000-01-28 2002-02-28 Eric Foxlin Self-referenced tracking
CN103279186A (en) * 2013-05-07 2013-09-04 兰州交通大学 Multiple-target motion capturing system integrating optical localization and inertia sensing
CN105183166A (en) * 2015-09-15 2015-12-23 北京国承万通信息科技有限公司 Virtual reality system
CN105592535A (en) * 2015-12-28 2016-05-18 山东大学 Bluetooth 4.0-based inertia motion capturing system used for realizing low-power wireless data transmission, and data transmission method thereof
CN106256394A (en) * 2016-07-14 2016-12-28 广东技术师范学院 The training devices of mixing motion capture and system



Similar Documents

Publication Publication Date Title
CN108762488A (en) A kind of single base station portable V R system based on wireless human body motion capture and optical alignment
US11210808B2 (en) Systems and methods for augmented reality
US11194386B1 (en) Artificial reality wearable magnetic sensor system for body pose tracking
CN104699247B (en) A kind of virtual reality interactive system and method based on machine vision
Roetenberg et al. Xsens MVN: Full 6DOF human motion tracking using miniature inertial sensors
CN105850113B (en) The calibration of virtual reality system
US8786680B2 (en) Motion capture from body mounted cameras
CN110327048B (en) Human upper limb posture reconstruction system based on wearable inertial sensor
CN106843507B (en) Virtual reality multi-person interaction method and system
US20210349529A1 (en) Avatar tracking and rendering in virtual reality
CN110140099A (en) System and method for tracking control unit
CN108846867A (en) A kind of SLAM system based on more mesh panorama inertial navigations
CN105027030A (en) Wireless wrist computing and control device and method for 3d imaging, mapping, networking and interfacing
CN103279186A (en) Multiple-target motion capturing system integrating optical localization and inertia sensing
CN108733206A (en) A kind of coordinate alignment schemes, system and virtual reality system
CN109344922A (en) A kind of dance movement evaluating system having motion-captured function
CN110609621B (en) Gesture calibration method and human motion capture system based on microsensor
CN110221691A (en) A kind of immersion virtual experience method, system and device
JP2021512388A (en) Systems and methods for augmented reality
US20180216959A1 (en) A Combined Motion Capture System
Wei et al. Real-time 3D arm motion tracking using the 6-axis IMU sensor of a smartwatch
CN111401340B (en) Method and device for detecting motion of target object
CN206819290U (en) A kind of system of virtual reality multi-person interactive
CN108981690A (en) A kind of light is used to fusion and positioning method, equipment and system
Myn et al. Xsens mvn user manual

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication (application publication date: 20181106)