CN114533039B - Human joint position and angle resolving method based on redundant sensor - Google Patents


Info

Publication number
CN114533039B
CN114533039B (Application CN202111614526.5A)
Authority
CN
China
Prior art keywords
joint
quaternion
sensor
sensors
parameters
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111614526.5A
Other languages
Chinese (zh)
Other versions
CN114533039A (en)
Inventor
王伟
江超
杨德伟
姜小明
田�健
冉鹏
李章勇
郭毅军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing University of Post and Telecommunications
Original Assignee
Chongqing University of Post and Telecommunications
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing University of Post and Telecommunications filed Critical Chongqing University of Post and Telecommunications
Priority to CN202111614526.5A priority Critical patent/CN114533039B/en
Publication of CN114533039A publication Critical patent/CN114533039A/en
Application granted granted Critical
Publication of CN114533039B publication Critical patent/CN114533039B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1126: Measuring movement of the entire body or parts thereof using a particular sensing technique
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1121: Determining geometric values, e.g. centre of rotation or angular range of movement

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Physiology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Engineering & Computer Science (AREA)
  • Veterinary Medicine (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Dentistry (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Geometry (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention discloses a redundant-sensor-based method for resolving human joint positions and angles, in the technical field of measurement. The method comprises: mounting two inertial sensors on each of the limbs at the two ends of the joint to be measured, together with the reflective marker points required for infrared motion capture; obtaining the initial quaternion of each sensor through static calibration and computing each sensor's deviation value; during movement, collecting the quaternions output by the sensors in real time and computing the joint-point quaternion output by each sensor; fitting these joint-point quaternions, by multiple linear regression, to the standard joint-point quaternions obtained from the infrared capture equipment, to obtain quaternions based on the redundant information; and finally computing the position and angle of the joint. By using the information of the redundant sensors, the invention eliminates the error caused by mismatch between the sensors and the bone positions during movement and obtains more accurate joint position and angle information.

Description

Human joint position and angle resolving method based on redundant sensor
Technical Field
The invention relates to the technical field of measurement, in particular to a human joint position and angle resolving method based on redundant sensors.
Background
With the rapid development of science and technology, human gait information has become very important; it bears on fields such as rehabilitation of injured limbs, physical training, exoskeletons, and active prosthesis research. Human motion capture systems divide mainly into optical and inertial-sensor methods. Infrared optical capture systems based on multiple marker points are extremely accurate, but this technical scheme is limited by scene size and lighting conditions, and the equipment is too expensive for mass-market scenarios. With the development of inertial sensors in recent years, the accuracy of inertial motion capture systems has risen year by year, and such systems have the advantages of low site requirements and relatively low cost.
Disclosure of Invention
In view of the above problems, the present invention aims to disclose a redundant-sensor-based method for resolving human joint positions and angles, which uses the information of the redundant sensors to eliminate the error caused by mismatch between the sensors and the bone positions during movement and obtains more accurate joint position and angle information.
Specifically, the human joint position and angle resolving method based on the redundant sensor comprises the following steps:
s1: in joint O to be measured 2 Two inertial sensors are respectively arranged on limbs at two ends of the robot, reflective marker points required by infrared motion capture are arranged at the same time, and the attitude angle information of the inertial sensors is read;
s2: the human body keeps standing upright for 5 seconds, and the joint O to be measured 2 Joint O to be measured 2 Superior proximal joint O 1 No rotation is generated, and the initial quaternion q of each sensor is acquired (init) According to
Respectively calculating the deviation value q of each sensor bias Completing static calibration, and taking the average value of the deviation values in the static calibration processThe next calculation is performed, wherein q O1(init) Is O 1 Initial quaternion of joint->For quaternion multiplication, q *_bias The deviation value of the sensor;
s3: in the human body movement process, quaternion q of real-time output of four sensors is respectively acquired s* And pass throughCalculating to obtain quaternions of the joint points output by each sensor;
s4: the quaternion of each node obtained in the step S3 and the standard quaternion q of the joint point obtained by the infrared capturing equipment sta Performing multiple linesFitting, namely solving fitting parameters, and substituting the parameters into a fitting equation to obtain quaternion q based on redundant information output o1 、q o2
S5: From q_o1, q_o2 obtained in step S4, the position of the joint O_2 to be measured is calculated. Writing the limb vector as the pure quaternion

O_1O_2 = (0, 0, 0, -Len_O1O2)

the position of O_2 follows from q_o1 as the vector part of q_o1 ⊗ O_1O_2 ⊗ q_o1^(-1):

P_O2 = (X_o2, Y_o2, Z_o2)

where Len_O1O2 is the length of limb O_1O_2, q_o1^(-1) denotes the inverse of the O_1 quaternion, X_o2, Y_o2, Z_o2 are the coordinates of joint O_2 on the X, Y and Z axes, and the product above is the quaternion representation of the three-dimensional position of joint O_2;
S6: From q_o2 obtained in step S4, the position of O_3, the inferior proximal joint of the joint O_2 to be measured, is calculated. With

O_2O_3 = (0, 0, 0, -(Len_O1O2 + Len_O2O3))

the position of O_3 follows from q_o2 as the vector part of q_o2 ⊗ O_2O_3 ⊗ q_o2^(-1):

P_O3 = (X_o3, Y_o3, Z_o3)

where Len_O2O3 is the length of limb O_2O_3, q_o2^(-1) denotes the inverse of the O_2 quaternion, X_o3, Y_o3, Z_o3 are the coordinates of joint O_3 on the X, Y and Z axes, and the product above is the quaternion representation of the three-dimensional position of joint O_3;
S7: From q_o1 and q_o2 obtained in step S4, the angle of the joint O_2 to be measured is calculated in quaternion form as the relative quaternion

q = q_o1^(-1) ⊗ q_o2 = (q_0, q_1, q_2, q_3)

Then the angle between the extension line of O_1O_2 and O_2O_3 is

α = arctan2(2(q_0 q_3 + q_1 q_2), 1 - 2((q_2)^2 + (q_3)^2))

where q_0 is the real part and q_1, q_2, q_3 are the three imaginary components; the joint angle between limb O_1O_2 and limb O_2O_3 is thus 180° - α.
Further, in step S1, when setting the reflective marker points required for infrared motion capture, three non-collinear points are arranged on each of limb O_1O_2 and limb O_2O_3.
Further, in step S2, q_O1(init) is the initial quaternion of joint O_1 and q_O2(init) is the initial quaternion of joint O_2.
Further, step S3 specifically comprises: the two sensors on limb O_1O_2, above the joint O_2 to be measured, are denoted the S1 and S2 sensors, and the two sensors on limb O_2O_3, below the joint O_2 to be measured, are denoted the S3 and S4 sensors. During human movement, the real-time output quaternions q_s1, q_s2, q_s3, q_s4 of the S1, S2, S3 and S4 sensors are acquired, and the O_1 joint quaternions are calculated as

q_s1→o1 = q_s1 ⊗ q̄_S1_bias,  q_s2→o1 = q_s2 ⊗ q̄_S2_bias

and the O_2 joint quaternions as

q_s3→o2 = q_s3 ⊗ q̄_S3_bias,  q_s4→o2 = q_s4 ⊗ q̄_S4_bias

where q̄_S*_bias is the mean deviation value of the corresponding sensor.
Further, in step S4, the q_s1→o1, q_s2→o1 calculated in step S3 are fitted, by multiple linear regression, against the standard quaternion q_sta_o1 of O_1 acquired by the infrared equipment. The fitting equation is

a·q_s1→o1 + b·q_s2→o1 + c = q_sta_o1

that is, component by component,

a·q_s1→o1,1 + b·q_s2→o1,1 + c_1 = q_sta_o1,1, ……, a·q_s1→o1,4 + b·q_s2→o1,4 + c_4 = q_sta_o1,4

i.e. the system y_sta = a·a1 + b·a2 + c is solved, where a1 and a2 denote the independent variables of the multivariable function, y_sta denotes the standard parameters obtained by the infrared motion capture equipment, i.e. the accurate joint quaternions, y_i denotes the parameters obtained by the inertial motion capture equipment, i.e. the joint quaternions containing errors, and i indicates that there are i groups of data in total. Part of the data is used to solve the fitting parameters a, b and c, and the remainder is used to verify that the parameters are correct; substituting the parameters into the fitting equation yields the quaternion q_o1 output on the basis of the redundant information.
Further, in step S4, the q_s3→o2, q_s4→o2 calculated in step S3 are fitted, by multiple linear regression, against the standard quaternion q_sta_o2 of O_2 acquired by the infrared equipment. The fitting equation is

d·q_s3→o2 + e·q_s4→o2 + f = q_sta_o2

that is, component by component,

d·q_s3→o2,1 + e·q_s4→o2,1 + f_1 = q_sta_o2,1, ……, d·q_s3→o2,4 + e·q_s4→o2,4 + f_4 = q_sta_o2,4

Part of the data is used to solve the fitting parameters d, e and f, and the remainder is used to verify that the parameters are correct; substituting the parameters into the fitting equation yields the quaternion q_o2 output on the basis of the redundant information, with the symbols a1, a2, y_sta, y_i and i defined as above.
Further, the quaternion inverse is q^(-1) = q* / ||q||^2, where the conjugate is

q* = (q_v, q_w)* = (-q_v, q_w)

with q denoting a general quaternion, q_v the imaginary part of the quaternion and q_w the real part of the quaternion.
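As a minimal sketch of the conjugate and inverse defined above (in Python with NumPy; note the components are ordered (q_0, q_1, q_2, q_3) with the real part first, matching the vector form used elsewhere in this document, and the function names are illustrative):

```python
import numpy as np

def quat_conj(q):
    """Conjugate q*: negate the imaginary part q_v, keep the real part."""
    q = np.asarray(q, dtype=float)
    return np.array([q[0], -q[1], -q[2], -q[3]])

def quat_inv(q):
    """Inverse q^(-1) = q* / |q|^2; for unit quaternions this equals q*."""
    q = np.asarray(q, dtype=float)
    return quat_conj(q) / np.dot(q, q)
```

For the unit quaternions produced by the sensors, the inverse and the conjugate coincide, so either may be used in the sandwich products of steps S5 to S7.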
The invention has the beneficial effects that:
the invention discloses a human joint position and angle resolving method based on a redundant sensor, which is characterized in that the resolving result of the redundant sensor and the result of an infrared motion capturing system are fitted by a multi-element linear fitting method to obtain parameters of a linear equation so as to solve the error of mismatching of an inertial sensor and a skeleton position in motion and obtain more accurate joint position and angle information.
Drawings
FIG. 1 is a schematic view of a joint model and sensor arrangement of the present invention;
FIG. 2 is a schematic illustration of a calculated joint angle according to the present invention;
FIG. 3 is a flow chart of the method of the present invention.
Detailed Description
The present invention will be described in detail with reference to the following specific examples:
the invention discloses a human joint position and angle resolving method based on redundant sensors, which specifically comprises the following steps:
s1: as shown in fig. 1, in the joint to be measured O 2 Two inertial sensors are respectively arranged on the limbs at two ends of the joint O to be measured 2 Upper limb, limb O 1 O 2 The upper sensors are respectively represented as an S1 sensor and an S2 sensor, and the joint O to be detected 2 Lower extremity and extremity O 2 O 3 The upper sensors are respectively represented as an S3 sensor and an S4 sensor, and are simultaneously arranged on the limb O 1 O 2 With limb O 2 O 3 Three non-collinear infrared motion capturing stations are respectively arranged on the three non-collinear infrared motion capturing stationsAnd the required reflective marker point starts to read the attitude angle information of the inertial sensor.
S2: the human body keeps standing upright for 5 seconds, and the joint O to be measured 2 Joint O to be measured 2 Superior proximal joint O 1 All without rotation, joint O 1 Joint O 2 Initial quaternion of (a)Collecting initial quaternions q of an S1 sensor, an S2 sensor, an S3 sensor and an S4 sensor s1(init) 、q s2(init) 、q s3(init) 、q s4(init) According toCalculating to obtain the deviation value q of the S1 sensor S1_bias Wherein->Is a quaternion multiplication.
Specifically, since the quaternion can be written as a four-dimensional vector p= (p) 0 ,p 1 ,p 2 ,p 3 ),q=(q 0 ,q 1 ,q 2 ,q 3 ) The quaternion multiplication can be expressed as:
is required to obtainQ in (b) S1_bias The first term on the right is a 4*4 matrix of quaternions of the S1 sensor output during calibration, i.e., the solution equation set:
thus, q can be obtained S1_bias According to the same principle Can calculate q S2_bias ,q S3_bias ,q S4_bias Completing static calibration, taking the mean value of the deviation values in the static calibration process +.>For the deviation value calculated in the next step.
S3: in the human body movement process, real-time output quaternion q of the S1 sensor, the S2 sensor, the S3 sensor and the S4 sensor are respectively acquired s1 、q s2 、q s3 、q s4 O is calculated 1 The quaternion of the joint is:
O 2 the quaternion of the joint is:
wherein:is the mean value of the deviation values of the S1 sensor, the S2 sensor, the S3 sensor and the S4 sensor>And (3) performing matrix multiplication calculation directly in the same calculation mode as the step S2 for quaternion multiplication.
S4: adjusting the frequency of the infrared action capturing device and the frequency of the inertial action device to be the same, namely keeping the data of the two sets of devices in time synchronization, the data amount of the settlement angle obtained in the same time are the same, so that error data of the inertial device can be conveniently fitted to the accurate infrared action capturing device, and the quaternion of each node obtained and the standard quaternion q of the joint point obtained by the infrared capturing device sta Performing multiple linear fitting, solving fitting parameters, substituting the parameters into a fitting equation, and obtaining quaternion q based on redundant information output o1 、q o2 Specifically, the method comprises the steps of,
q is calculated by using the step S3 s1o1 、q s2o1 O acquired with infrared equipment 1 Standard quaternion q sta_o1 A multiple linear fit is performed and the result is a linear fit,
namely:
a*q s11 +b*q s21 +c 1 =q sta_o11 ,……,a*q s14 +b*q s24 +c 4 =q sta_o14
i.e. solvingPartial data is taken to calculate fitting parameters a, b and c, and the rest parts verify whether the parameters are correct or not, so that the parameters are substituted into a fitting equation to obtain quaternion q output based on redundant information o1 Wherein a1, a2 represent the independent variables of the multiple function, y sta Representing standard parameters obtained by the infrared motion capture equipment, namely accurate joint quaternion, y i The method is based on parameters obtained by the inertial motion capture device, namely joint quaternions with errors, and i represents i groups of data in total.
Q is calculated by using the step S3 s3o2 、q s4o2 O acquired with infrared equipment 2 Standard quaternion q sta_o2 And (3) performing multi-element linear fitting, wherein the fitting equation is as follows:
namely:
d*q s11 +e*q s21 +f 1 =q sta_o21 ,……,d*q s14 +e*q s24 +f 4 =q sta_o24
i.e. solvingPartial data is taken to calculate fitting parameters d, e and f, and the rest parts verify whether the parameters are correct or not, so that the parameters are substituted into a fitting equation to obtain a quaternion q based on redundant information output o2 Wherein a1, a2 represent the independent variables of the multiple function, y sta Representing standard parameters obtained by the infrared motion capture equipment, namely accurate joint quaternion, y i The method is based on parameters obtained by the inertial motion capture device, namely joint quaternions with errors, and i represents i groups of data in total.
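One way to read the fitting equations in step S4 is as shared slopes a, b with one intercept per quaternion component (c_1 to c_4), solved in least squares over the synchronized samples. This interpretation, the NumPy formulation, and the final renormalisation are assumptions; the function names are illustrative:

```python
import numpy as np

def fit_redundant(q_a, q_b, q_std):
    """Fit a, b (shared) and c (one intercept per component) so that
    a*q_a + b*q_b + c ≈ q_std in least squares. q_a, q_b are (N, 4)
    joint-quaternion streams from the two redundant sensors; q_std is the
    (N, 4) optical reference. Returns (a, b, c) with c of shape (4,)."""
    n = q_a.shape[0]
    rows, rhs = [], []
    for k in range(4):                      # one block per quaternion component
        onehot = np.eye(4)[k]               # selects intercept c_k
        for i in range(n):
            rows.append(np.concatenate(([q_a[i, k], q_b[i, k]], onehot)))
            rhs.append(q_std[i, k])
    sol, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return sol[0], sol[1], sol[2:]

def apply_fit(q_a, q_b, a, b, c):
    """Fused quaternion from the redundant pair, renormalised to unit length."""
    q = a * q_a + b * q_b + c
    return q / np.linalg.norm(q)
```

Fitting on part of the data and checking the residual on the held-out remainder mirrors the verification step described in the text.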
S5: From q_o1, q_o2 obtained in step S4, the position of the joint O_2 to be measured is calculated. Writing the limb vector as the pure quaternion

O_1O_2 = (0, 0, 0, -Len_O1O2)

the position of O_2 follows from q_o1 as the vector part of q_o1 ⊗ O_1O_2 ⊗ q_o1^(-1):

P_O2 = (X_o2, Y_o2, Z_o2)

where Len_O1O2 is the length of limb O_1O_2 and q_o1^(-1) is the inverse of the O_1 quaternion, i.e. q_o1^(-1) = q_o1* / ||q_o1||^2, in which q* = (q_v, q_w)* = (-q_v, q_w) is the quaternion conjugate, q_v the imaginary part and q_w the real part. X_o2, Y_o2, Z_o2 are the coordinates of joint O_2 on the X, Y and Z axes, and the product above is the quaternion representation of the three-dimensional position of joint O_2.
S6: From q_o2 obtained in step S4, the position of O_3, the inferior proximal joint of the joint O_2 to be measured, is calculated. With

O_2O_3 = (0, 0, 0, -(Len_O1O2 + Len_O2O3))

the position of O_3 follows from q_o2 as the vector part of q_o2 ⊗ O_2O_3 ⊗ q_o2^(-1):

P_O3 = (X_o3, Y_o3, Z_o3)

where Len_O2O3 is the length of limb O_2O_3, q_o2^(-1) is the inverse of the O_2 quaternion (defined as above), X_o3, Y_o3, Z_o3 are the coordinates of joint O_3 on the X, Y and Z axes, and the product above is the quaternion representation of the three-dimensional position of joint O_3.
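Steps S5 and S6 rotate a limb vector, written as a pure quaternion, through the sandwich product q ⊗ v ⊗ q^(-1). A Python sketch under the same real-first component ordering (the names and the 0.30 m default length are illustrative assumptions):

```python
import numpy as np

def quat_mult(p, q):
    """Quaternion product p ⊗ q, components (q_0, q_1, q_2, q_3), real first."""
    p0, p1, p2, p3 = p
    q0, q1, q2, q3 = q
    return np.array([p0*q0 - p1*q1 - p2*q2 - p3*q3,
                     p0*q1 + p1*q0 + p2*q3 - p3*q2,
                     p0*q2 - p1*q3 + p2*q0 + p3*q1,
                     p0*q3 + p1*q2 - p2*q1 + p3*q0])

def rotate_vector(q, v):
    """Rotate 3-vector v by unit quaternion q: vector part of q ⊗ (0, v) ⊗ q*."""
    p = np.concatenate(([0.0], v))
    q_conj = np.array([q[0], -q[1], -q[2], -q[3]])
    return quat_mult(quat_mult(q, p), q_conj)[1:]

def joint_position(q_o1, len_o1o2=0.30):
    """Position of O2: the limb hangs along -Z in the neutral pose."""
    return rotate_vector(q_o1, np.array([0.0, 0.0, -len_o1o2]))
```

With the identity quaternion the limb stays at (0, 0, -Len_O1O2), matching the upright calibration pose; O_3 is obtained the same way from q_o2 and the combined length.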
S7: From q_o1 and q_o2 obtained in step S4, the angle of the joint O_2 to be measured is calculated in quaternion form as the relative quaternion

q = q_o1^(-1) ⊗ q_o2 = (q_0, q_1, q_2, q_3)

The angle between the extension line of limb O_1O_2 and limb O_2O_3 is

α = arctan2(2(q_0 q_3 + q_1 q_2), 1 - 2((q_2)^2 + (q_3)^2))

so the joint angle between limb O_1O_2 and limb O_2O_3 is 180° - α. The Euler angles in the remaining two directions between the extension line of limb O_1O_2 and limb O_2O_3 are

β = arcsin(2(q_0 q_2 - q_1 q_3)),  γ = arctan2(2(q_0 q_1 + q_2 q_3), 1 - 2((q_1)^2 + (q_2)^2))

where q_0 is the real part and q_1, q_2, q_3 are the three imaginary components.
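The angle extraction in step S7 can be sketched as follows (the `np.clip` guard against arcsin domain errors from rounding is an added assumption; names are illustrative):

```python
import numpy as np

def joint_angles(q):
    """From the relative joint quaternion q = (q0, q1, q2, q3), real part
    first, return (joint_angle, beta, gamma) in degrees, where the joint
    angle is 180 - alpha as in the text."""
    q0, q1, q2, q3 = q
    alpha = np.degrees(np.arctan2(2*(q0*q3 + q1*q2), 1 - 2*(q2**2 + q3**2)))
    beta = np.degrees(np.arcsin(np.clip(2*(q0*q2 - q1*q3), -1.0, 1.0)))
    gamma = np.degrees(np.arctan2(2*(q0*q1 + q2*q3), 1 - 2*(q1**2 + q2**2)))
    return 180.0 - alpha, beta, gamma
```

In the upright calibration pose the relative quaternion is the identity, so the reported joint angle is 180° (a fully extended limb), as expected.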
The above embodiments are intended only to illustrate the technical solution of the present invention, not to limit it. Although the present invention has been described in detail with reference to the preferred embodiments, those skilled in the art should understand that modifications and equivalents may be made without departing from the spirit and scope of the technical solution of the present invention, and such modifications and equivalents are intended to be covered by the scope of the claims of the present invention. The technology, shapes, and construction parts of the present invention that are not described in detail are known in the art.

Claims (7)

1. The human joint position and angle resolving method based on the redundant sensor is characterized by comprising the following steps:
s1: in joint O to be measured 2 Two inertial sensors are respectively arranged on limbs at two ends of the robot, reflective marker points required by infrared motion capture are arranged at the same time, and the attitude angle information of the inertial sensors is read;
s2: the human body keeps standing upright for 5 seconds, and the joint O to be measured 2 Joint O to be measured 2 Superior proximal joint O 1 No rotation is generated, and the initial quaternion q of each sensor is acquired (init) According to
Respectively calculating the deviation value q of each sensor bias Completing static calibration, and taking the average value of the deviation values in the static calibration processThe next calculation is performed, wherein q O1(init) Is O 1 Initial quaternion of joint->For quaternion multiplication, q *_bias Sensor bias value;
s3: in the human body movement process, quaternion q of real-time output of four sensors is respectively acquired s* And pass throughCalculating to obtain quaternions of the joint points output by each sensor;
s4: the quaternion of each node obtained in the step S3 and the standard quaternion q of the joint point obtained by the infrared capturing equipment sta Performing multiple linear fitting, solving fitting parameters, substituting the parameters into a fitting equation, and obtaining quaternion q based on redundant information output o1 、q o2
S5: according to SQ obtained in step 4 o1 、q o2 Calculating to obtain the joint O to be measured 2 The positions of (2) are:
O 1 O 2 =(0,0,0,-Len O 1 O 2 )
then according to q o1 The position of O2 can be found as follows:
P O2 =(X o2 ,Y o2 ,Z o2 )
in the Len O 1 O 2 Is limb O 1 O 2 Length, q O1 -1 Represents O 1 Inverse of quaternion, X o2 ,Y o2 ,Z o2 Respectively represent O 2 The values of the joints on the X axis, the Y axis and the Z axis,a quaternion representation of three-dimensional coordinates of the position of joint O2;
s6: q obtained according to step S4 o2 Calculating to obtain the joint O to be measured 2 Inferior proximal joint O 3 The position of (2) is
O 2 O 3 =(0,0,0,-(Len O 1 O 2 +Len O 2 O 3 ))
Then according to q o2 O can be obtained 3 The positions of (2) are:
P O3 =(X o3 ,Y o3 ,Z o3 ),
in the Len O 2 O 3 Is limb O 2 O 3 Length, q O2 -1 Represents O 2 Inverse of quaternion, X o3 ,Y o3 ,Z o3 Respectively represent O 3 Joints being in the X-axisNumerical values on the Y axis and the Z axis,a quaternion representation of three-dimensional coordinates of the position of joint O3;
s7: from the calculation of step S4, q o1 And q o2 Calculating to obtain the joint O to be measured 2 The angular quaternion form of (c) is:
then O 1 O 2 Extension line and O 2 O 3 Included angle between:
α=arctan2(2(q 0 q 3 +q 1 q 2 ),1-2((q 2 ) 2 +(q 3 ) 2 ))
wherein q is 0 As the real part, q 1 ,q 2 ,q 3 For imaginary parts with different imaginary parts, i.e. limbs O 1 O 2 With limb O 2 O 3 The joint angle between the two is as follows: 180 deg. -alpha.
2. The redundant-sensor-based human joint position and angle resolving method according to claim 1, wherein in step S1, when setting the reflective marker points required for infrared motion capture, three non-collinear points are arranged on each of limb O_1O_2 and limb O_2O_3.
3. The redundant-sensor-based human joint position and angle resolving method according to claim 2, wherein in step S2, q_O1(init) is the initial quaternion of joint O_1 and q_O2(init) is the initial quaternion of joint O_2.
4. The redundant-sensor-based human joint position and angle resolving method according to claim 1, wherein step S3 specifically comprises: the two sensors on limb O_1O_2, above the joint O_2 to be measured, are denoted the S1 and S2 sensors, and the two sensors on limb O_2O_3, below the joint O_2 to be measured, are denoted the S3 and S4 sensors; during human movement, the real-time output quaternions q_s1, q_s2, q_s3, q_s4 of the S1, S2, S3 and S4 sensors are acquired, the O_1 joint quaternions are calculated as q_s1→o1 = q_s1 ⊗ q̄_S1_bias, q_s2→o1 = q_s2 ⊗ q̄_S2_bias, and the O_2 joint quaternions as q_s3→o2 = q_s3 ⊗ q̄_S3_bias, q_s4→o2 = q_s4 ⊗ q̄_S4_bias, where q̄_S*_bias is the mean deviation value of the corresponding sensor.
5. The redundant-sensor-based human joint position and angle resolving method according to claim 4, wherein in step S4 the q_s1→o1, q_s2→o1 calculated in step S3 are fitted, by multiple linear regression, against the standard quaternion q_sta_o1 of O_1 acquired by the infrared equipment, the fitting equation being a·q_s1→o1 + b·q_s2→o1 + c = q_sta_o1, i.e., component by component, a·q_s1→o1,1 + b·q_s2→o1,1 + c_1 = q_sta_o1,1, ……, a·q_s1→o1,4 + b·q_s2→o1,4 + c_4 = q_sta_o1,4; part of the data is used to solve the fitting parameters a, b and c, and the remainder is used to verify that the parameters are correct, whereupon the parameters are substituted into the fitting equation to obtain the quaternion q_o1 output on the basis of the redundant information, where a1 and a2 denote the independent variables of the multivariable function, y_sta denotes the standard parameters obtained by the infrared motion capture equipment, i.e. the accurate joint quaternions, y_i denotes the parameters obtained by the inertial motion capture equipment, i.e. the joint quaternions containing errors, and i indicates that there are i groups of data in total.
6. The redundant-sensor-based human joint position and angle resolving method according to claim 4, wherein in step S4 the q_s3→o2, q_s4→o2 calculated in step S3 are fitted, by multiple linear regression, against the standard quaternion q_sta_o2 of O_2 acquired by the infrared equipment, the fitting equation being d·q_s3→o2 + e·q_s4→o2 + f = q_sta_o2, i.e., component by component, d·q_s3→o2,1 + e·q_s4→o2,1 + f_1 = q_sta_o2,1, ……, d·q_s3→o2,4 + e·q_s4→o2,4 + f_4 = q_sta_o2,4; part of the data is used to solve the fitting parameters d, e and f, and the remainder is used to verify that the parameters are correct, whereupon the parameters are substituted into the fitting equation to obtain the quaternion q_o2 output on the basis of the redundant information, with the symbols defined as in claim 5.
7. The redundant-sensor-based human joint position and angle resolving method according to any one of claims 1 to 6, wherein the quaternion inverse is q^(-1) = q* / ||q||^2, with the conjugate
q* = (q_v, q_w)* = (-q_v, q_w)
where q denotes a general quaternion, q_v is the imaginary part of the quaternion and q_w is the real part of the quaternion.
CN202111614526.5A 2021-12-27 2021-12-27 Human joint position and angle resolving method based on redundant sensor Active CN114533039B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111614526.5A CN114533039B (en) 2021-12-27 2021-12-27 Human joint position and angle resolving method based on redundant sensor


Publications (2)

Publication Number Publication Date
CN114533039A CN114533039A (en) 2022-05-27
CN114533039B true CN114533039B (en) 2023-07-25

Family

ID=81670321

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111614526.5A Active CN114533039B (en) 2021-12-27 2021-12-27 Human joint position and angle resolving method based on redundant sensor

Country Status (1)

Country Link
CN (1) CN114533039B (en)

Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2595167A1 (en) * 2006-07-31 2008-01-31 University Of New Brunswick Method for calibrating sensor positions in a human movement measurement and analysis system

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8219245B2 (en) * 2006-05-15 2012-07-10 Kuka Roboter Gmbh Articulated arm robot
US20080091373A1 (en) * 2006-07-31 2008-04-17 University Of New Brunswick Method for calibrating sensor positions in a human movement measurement and analysis system
US10575979B2 (en) * 2009-02-06 2020-03-03 Jamshid Ghajar Subject-mounted device to measure relative motion of human joints
US11182946B2 (en) * 2015-09-21 2021-11-23 TuringSense Inc. Motion management via conductive threads embedded in clothing material
EP3986266A4 (en) * 2019-06-21 2023-10-04 Rehabilitation Institute of Chicago D/b/a Shirley Ryan Abilitylab Wearable joint tracking device with muscle activity and methods thereof

Patent Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2595167A1 (en) * 2006-07-31 2008-01-31 University Of New Brunswick Method for calibrating sensor positions in a human movement measurement and analysis system
JP2009112540A (en) * 2007-11-07 2009-05-28 Toshiba Corp Joint flexing action forecasting apparatus and joint flexing action forecasting method
WO2010027015A1 (en) * 2008-09-05 2010-03-11 The University of Tokyo Motion capture device
JP2013075041A (en) * 2011-09-30 2013-04-25 Equos Research Co Ltd Joint angle measuring apparatus and joint angle measuring method
CN105473097A (en) * 2013-07-29 2016-04-06 Intuitive Surgical Operations, Inc. Shape sensor systems with redundant sensing
JP2017119102A (en) * 2015-12-28 2017-07-06 Sumitomo Rubber Industries, Ltd. Motion analysis device, method and program
CN105809144A (en) * 2016-03-24 2016-07-27 Chongqing University of Posts and Telecommunications Gesture recognition system and method adopting action segmentation
CN106648088A (en) * 2016-12-14 2017-05-10 Yingdong (Beijing) Technology Co., Ltd. Inertial motion capture pose transient calibration method and inertial motion capture system
CN107330967A (en) * 2017-05-12 2017-11-07 Wuhan Business University Equestrian motion posture capture and three-dimensional reconstruction system based on inertial sensing technology
JP2018196575A (en) * 2017-05-24 2018-12-13 Nippon Telegraph and Telephone Corporation Ankle joint angle estimation device, walking support device, ankle joint angle estimation method, walking support method, and program
CN109000633A (en) * 2017-06-06 2018-12-14 Dalian University of Technology Human body posture motion capture algorithm design based on heterogeneous data fusion
CN108720841A (en) * 2018-05-22 2018-11-02 Shanghai Jiao Tong University Wearable lower extremity movement correction system based on cloud detection
WO2020116836A1 (en) * 2018-12-06 2020-06-11 CoreSense Co., Ltd. Motion capture device using movement of center of gravity of human body and method therefor
CN109623878A (en) * 2019-01-22 2019-04-16 Tianjin University Self-calibration method of a sensor system for a humanoid dexterous hand wrist joint
KR20200093185A (en) * 2019-01-28 2020-08-05 Jeonbuk National University Industry-Academic Cooperation Foundation Multi-exercise apparatus with smart mirror
CN111895997A (en) * 2020-02-25 2020-11-06 Harbin Institute of Technology Human body action acquisition method based on inertial sensor without standard posture correction
WO2022133063A1 (en) * 2020-12-16 2022-06-23 New York University Wearable inertial sensor system and methods
CN113793360A (en) * 2021-08-31 2021-12-14 Dalian University of Technology Three-dimensional human body reconstruction method based on inertial sensing technology
CN116027905A (en) * 2023-01-18 2023-04-28 Dalian University of Technology Double kayak upper limb motion capture method based on inertial sensors

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
Calibrating the human hand for haptic interfaces; Robert N. Rohling et al.; Presence: Teleoperators and Virtual Environments; Vol. 2, No. 4; 281-296 *
Effects of dynamic IMU-to-segment misalignment error on 3-DOF knee angle estimation in walking and running; Chao Jiang et al.; Sensors; Vol. 22, No. 22; 1-16 *
Robot humanoid motion based on human action and posture recognition; Wang Mei; Lu Xichang; Tu Dawei; Yu Yuanfang; Zhou Hua; Journal of Mechanical Engineering; No. 21; 32-40 *
Calibration method of robot base coordinate frame based on quaternion representation; Wang Wei et al.; Journal of Beijing University of Aeronautics and Astronautics; Vol. 41, No. 3; 411-417 *
Research progress of human posture angle measurement methods based on inertial sensors; Zhao Xiaohao; Gai Xiang; Xie Xinwu; Zhou Weibin; Ni Aijuan; Tian Feng; Chinese Medical Equipment Journal; No. 10; 106-110, 114 *
Research on motion perception mechanism based on inertial sensors; Huang Dawei; China Masters' Theses Full-text Database, Information Science and Technology; No. 1; 1140-458 *

Also Published As

Publication number Publication date
CN114533039A (en) 2022-05-27

Similar Documents

Publication Publication Date Title
Chiari et al. Human movement analysis using stereophotogrammetry: Part 2: Instrumental errors
CN104757976B Human body gait analysis method and system based on multi-sensor fusion
US7233872B2 (en) Difference correcting method for posture determining instrument and motion measuring instrument
US20100194879A1 (en) Object motion capturing system and method
WO2018081986A1 (en) Wearable device and real-time step length measurement method for device
Li et al. Real-time human motion capture based on wearable inertial sensor networks
CN111895997B (en) Human body action acquisition method based on inertial sensor without standard posture correction
CN112215172A (en) Human body prone position three-dimensional posture estimation method fusing color image and depth information
Kim et al. StrokeTrack: wireless inertial motion tracking of human arms for stroke telerehabilitation
CN106227368B Human joint angle calculation method and device
CN109945889B Joint angle measurement method based on dual posture sensors
CN114533039B (en) Human joint position and angle resolving method based on redundant sensor
KR20190022198A (en) Method for calibrating posture of lower body using wearable sensors, and computer readable medium for performing the method
CN111158482B (en) Human body motion gesture capturing method and system
CN115607146B (en) Wearable single-node device for leg posture estimation and measurement method
CN208591046U Detection device for unstable motion data
CN114748306A (en) Exoskeleton equipment wearing error correction method
CN112057083B (en) Wearable human upper limb pose acquisition equipment and acquisition method
CN114469078A (en) Human motion detection method based on optical-inertial fusion
CN111772640B (en) Limb exercise training guiding method, device and storage medium
CN105575239B Fracture reduction training model angle detection device and method
Bailly et al. Recursive estimation of the human body’s center of mass and angular momentum derivative
Callejas-Cuervo et al. Validation of an inertial sensor-based platform to acquire kinematic information for human joint angle estimation
CN110928420B (en) Human body motion gesture capturing method and system
Sun et al. Development of lower limb motion detection based on LPMS

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant