CN207923150U - A kind of calibration system of depth camera and Inertial Measurement Unit relative attitude - Google Patents

A kind of calibration system of depth camera and Inertial Measurement Unit relative attitude

Info

Publication number
CN207923150U
CN207923150U CN201720973935.7U
Authority
CN
China
Prior art keywords
measurement unit
depth camera
inertial measurement
relative attitude
displacement information
Prior art date
Legal status
Active
Application number
CN201720973935.7U
Other languages
Chinese (zh)
Inventor
朱海飞
陈集辉
谷世超
管贻生
Current Assignee
Guangdong University of Technology
Original Assignee
Guangdong University of Technology
Priority date
Filing date
Publication date
Application filed by Guangdong University of Technology filed Critical Guangdong University of Technology
Priority to CN201720973935.7U priority Critical patent/CN207923150U/en
Application granted granted Critical
Publication of CN207923150U publication Critical patent/CN207923150U/en


Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

The utility model embodiment discloses a calibration system for the relative attitude of a depth camera and an inertial measurement unit. The depth camera is used to obtain three-dimensional spatial information of a target object; it is fixed on one side of the application scene to be calibrated and is connected by wire to the displacement information acquisition module. The inertial measurement unit is used to measure the angular velocity and acceleration of the target object in three-dimensional space; it moves freely relative to the depth camera and is connected to the displacement information acquisition module. The information acquisition module is connected to the relative attitude calibration module and obtains the first displacement information captured by the depth camera while the target object moves, as well as the second displacement information captured by the inertial measurement unit over the corresponding time period. The relative attitude calibration module calculates the relative attitude rotation matrix of the depth camera and the inertial measurement unit from the above displacement information using the calibration principle. The calibration process is easy to operate, requires no additional auxiliary calibration equipment, is contactless, and improves the accuracy of the calibrated relative attitude.

Description

Calibration system for relative attitude of depth camera and inertial measurement unit
Technical Field
The embodiments of the utility model relate to the technical field of somatosensory interaction, and in particular to a calibration system for the relative attitude of a depth camera and an inertial measurement unit.
Background
With the rapid development of computer technology and internet technology, somatosensory interaction technology is widely applied to various industries, such as somatosensory games, interaction between people and robots, and the like. In the somatosensory interaction technology, the accurate capture of the description of human body gestures, actions and the like in a unified observation coordinate system is the key of other upper-layer applications.
The theory and technology of multi-sensor information fusion are an effective way to improve indexes such as the range and precision of human posture and motion capture, and the relative pose calibration between sensors is the basis of multi-sensor information fusion. When a depth camera and an inertial measurement unit are used together to capture human body actions, for example to remotely control a robot, the relative attitude between the two sensor coordinate systems must first be calibrated before their sensing data can be fused; that is, when the depth camera and the inertial measurement unit jointly capture human motion information, an accurate description of their relative attitude should be obtainable by a simple method.
SUMMARY OF THE UTILITY MODEL
The embodiment of the utility model provides a calibration system for the relative attitude of a depth camera and an inertial measurement unit, so as to improve the accuracy of calibrating the relative attitude of the depth camera and the inertial measurement unit.
In order to solve the above technical problem, an embodiment of the present invention provides the following technical solution:
An embodiment of the utility model provides a calibration system for the relative attitude of a depth camera and an inertial measurement unit, including:
the system comprises a depth camera, an inertia measurement unit, a displacement information acquisition module and a relative attitude calibration module;
the depth camera is used for acquiring three-dimensional space information of an object, is fixed on one side of an application scene to be calibrated, and is connected with the displacement information acquisition module through a wire;
the inertial measurement unit is used for measuring the angular velocity and the acceleration of the object in a three-dimensional space, freely moves relative to the depth camera and is connected with the displacement information acquisition module;
the displacement information acquisition module is connected with the relative attitude calibration module and is used for acquiring first displacement information captured by the depth camera while the target object moves in an arbitrary direction according to a preset action sequence, and second displacement information of the target object acquired by the inertial measurement unit over the corresponding time period;
and the relative attitude calibration module is used for calculating to obtain a relative attitude rotation matrix of the depth camera and the inertial measurement unit by utilizing a calibration principle according to the first displacement information and the second displacement information.
Optionally, the inertial measurement unit is bound to a motion part of the target object to move along with the movement of the target object.
Optionally, the inertial measurement unit is a sensor integrating a multi-axis accelerometer, a multi-axis gyroscope, and a magnetometer.
Optionally, the inertia measurement unit is wirelessly connected with the displacement information acquisition module.
Optionally, the method further includes:
and the relative attitude representing module is connected with the relative attitude calibration module and is used for converting the relative attitude of the depth camera and the inertial measurement unit according to a preset attitude representing mode according to the relative attitude rotation matrix.
Optionally, the method further includes:
and the display is connected with the relative attitude calibration module and used for displaying the relative attitude of the depth camera and the inertial measurement unit according to a preset attitude representation mode.
The embodiment of the utility model provides a calibration system for the relative attitude of a depth camera and an inertial measurement unit, including the depth camera, the inertial measurement unit, a displacement information acquisition module, and a relative attitude calibration module. The depth camera acquires three-dimensional spatial information of the target object, is fixed on one side of the application scene to be calibrated, and is connected to the displacement information acquisition module by wire. The inertial measurement unit measures the angular velocity and acceleration of the target object in three-dimensional space, moves freely relative to the depth camera, and is connected to the displacement information acquisition module. The displacement information acquisition module is connected to the relative attitude calibration module and acquires the first displacement information captured by the depth camera while the target object moves, as well as the second displacement information captured by the inertial measurement unit over the corresponding time period. The relative attitude calibration module calculates the relative attitude rotation matrix of the depth camera and the inertial measurement unit from the displacement information using the calibration principle.
The technical solution provided by the application has the advantage that the fixed depth camera and the freely movable inertial measurement unit record the displacement information of the target object as it moves in three-dimensional space, and the relative attitude rotation matrix of the depth camera and the inertial measurement unit is calculated from this displacement information using the calibration principle. The whole relative attitude calibration process is easy to operate, has high precision, requires no additional calibration auxiliary equipment, and is contactless; it improves the efficiency and accuracy of the relative attitude calibration of the depth camera and the inertial measurement unit, thereby providing instructive and accurate calibration information for somatosensory interaction fields such as robot teleoperation and somatosensory game equipment calibration.
Drawings
In order to clearly illustrate the embodiments or technical solutions of the present invention, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a schematic diagram of a frame of an exemplary embodiment of the present invention;
fig. 2 is a schematic structural diagram of a system for calibrating a relative posture between a depth camera and an inertial measurement unit according to an embodiment of the present invention;
fig. 3 is a schematic diagram of different cartesian measurement coordinate systems according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of a system for measuring relative attitude between a depth camera and an inertial measurement unit according to another embodiment of the present invention.
Fig. 5 is a schematic curve diagram of another exemplary embodiment in a specific application scenario according to the present invention.
Detailed Description
In order to enable those skilled in the art to better understand the solution of the present invention, the present invention is described in detail below with reference to the accompanying drawings and specific embodiments. It is to be understood that the described embodiments are only some embodiments of the invention, and not all embodiments. Based on the embodiments of the present invention, all other embodiments obtained by a person skilled in the art without creative work belong to the protection scope of the present invention.
The terms "first," "second," "third," "fourth," and the like in the description and claims of this application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "comprising" and "having," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements but may include other steps or elements not expressly listed.
The inventor of the present application has found through research that, for obtaining an accurate description of the relative attitude of the depth camera and the inertial measurement unit by a simple method when the two are combined to capture human motion information, that is, for the simple and effective calibration of their relative attitude, the current technical schemes cannot meet the requirements of practical application.
Three technical schemes exist in the prior art. In the first, a suboptimal solution can be obtained, but the assistance of a calibration plate is needed. In the second, the depth camera and the inertial measurement unit adopt intelligent algorithms such as the extended Kalman filter to make the relative poses of the two sensor coordinate systems converge to the true value; however, the positional relationship between the depth camera and the inertial measurement unit is fixed, i.e., the two sensors are mounted on the same platform, so this scheme is more suitable for robot positioning and navigation. In the third, the positional relationship between the depth camera and the inertial measurement unit is not fixed, i.e., they are located on two carriers in relative motion, and their relative attitude is calibrated by manual direct observation and estimation; the precision is hard to guarantee because of the inevitable shortcomings of manual operation.
In view of the above, in the present application the depth camera, fixed on one side of the application scene to be calibrated, acquires the first displacement information of the target object during rigid motion; the inertial measurement unit, which moves freely relative to the depth camera, acquires the angular velocity and acceleration of the target object over the same time period, from which the second displacement information of the target object in three-dimensional space is obtained; and the relative attitude rotation matrix of the depth camera and the inertial measurement unit is calculated from the first and second displacement information using the calibration principle, thereby calibrating the relative attitude of the two sensors.
Based on the above technical solution, some possible application scenarios involved in the embodiments of the utility model are first introduced below with reference to fig. 1; fig. 1 is a schematic frame diagram of an illustrative example provided by an embodiment of the utility model.
As shown in fig. 1, a system for calibrating the relative attitude of a depth camera and an inertial measurement unit includes the depth camera, the Inertial Measurement Unit (IMU), and a computer. The depth camera is fixed on one side of the application scene and acquires three-dimensional spatial information of the motion of a human body part. The inertial measurement unit is bound to a moving part of the human body and measures the angular velocity and acceleration of that part over the same time period; the current attitude information is obtained by integrating the angular velocity and acceleration. The computer is connected to the depth camera and the inertial measurement unit and acquires the first displacement information captured by the depth camera while the target object moves in an arbitrary direction according to a preset action sequence, and the second displacement information captured by the inertial measurement unit over the corresponding time period; it then calculates the relative attitude rotation matrix of the depth camera and the inertial measurement unit from the first and second displacement information using the calibration principle.
It should be noted that the above application scenarios are only shown for facilitating understanding of the ideas and principles of the present application, and the embodiments of the present application are not limited in any way in this respect. Rather, embodiments of the present application may be applied to any scenario where applicable.
Having described the technical solutions of the embodiments of the present invention, various non-limiting embodiments of the present application are described in detail below.
Referring first to fig. 2, fig. 2 is a schematic structural diagram of a calibration system for the relative attitude of a depth camera and an inertial measurement unit provided by an embodiment of the present invention in a specific implementation; the embodiment may include the following:
the system comprises a depth camera 101, an inertial measurement unit 102, a displacement information acquisition module 103 and a relative attitude calibration module 104.
The depth camera 101 is used for acquiring three-dimensional space information of a target object, is fixed on one side of an application scene to be calibrated, and is connected with the displacement information acquisition module 103 through a wire.
The depth camera 101 is fixed on one side of the application scene currently to be calibrated; for example, it remains stationary in a scene where a person interacts with a robot, and its fixed position allows it to capture all actions of the person from all directions.
The depth camera 101 is an optical sensor for capturing three-dimensional spatial information of a current target object, and unlike a general camera, can acquire depth information of the target object. Depth cameras refer to, but are not limited to, optical sensors based on binocular vision, structured light, or time-of-flight principles that can acquire three-dimensional spatial information of an object.
The target object is an object undergoing rigid motion, which can be understood as an affine transformation that keeps length, angle, area, and so on unchanged, i.e., keeps the inner product and the measure invariant. From the coordinate-transformation point of view, a rigid rotation corresponds to an orthogonal matrix with determinant 1. For example, the motion of each individual part of the human body is rigid motion, while the motion of the human body as a whole is non-rigid. In addition, physically significant quantities such as gradient, divergence, and curl remain unchanged under rigid body transformations.
For example, the depth camera 101 may obtain the spatial position of a palm joint in real time; the displacement vector, i.e., the first displacement information, is obtained by subtracting the average position of the stationary phase before the motion from the average position of the stationary phase after it.
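As a rough sketch of this averaging step (the function name, array layout, and slice arguments are illustrative assumptions, not taken from the patent):

```python
import numpy as np

def camera_displacement(positions, still_before, still_after):
    """First displacement information from depth-camera joint positions.

    positions: (N, 3) array of palm-joint positions in the camera frame C.
    still_before / still_after: slices covering the stationary phases
    before and after the motion (e.g. the 2 s rest periods).
    """
    start = positions[still_before].mean(axis=0)  # mean rest position before the motion
    end = positions[still_after].mean(axis=0)     # mean rest position after the motion
    return end - start                            # displacement vector in frame C
```

Averaging over the rest phases suppresses per-frame jitter in the camera's joint estimates, which is presumably why the action sequence includes the 2 s stationary periods.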
The inertial measurement unit 102 is configured to measure the angular velocity and acceleration of the target object in three-dimensional space, moves freely relative to the depth camera 101, and is connected to the displacement information acquisition module 103; this connection may be wired or wireless.
The inertial measurement unit 102 measures the angular velocity and acceleration of the rigidly moving target object over the same time period, from which the second displacement information of the target object in three-dimensional space is obtained.
The inertial measurement unit 102 is free to move relative to the depth camera 101, i.e., the depth camera is stationary in the application scenario, and the inertial measurement unit is not fixed and can move freely. Alternatively, the inertial measurement unit may be bound to the target object such that the inertial measurement unit moves as the target object moves. For example, when the target object is a human hand, the inertial measurement unit may be tied to the human hand to move in space with the hand.
The inertial measurement unit refers to, but is not limited to, a sensor that integrates a multi-axis accelerometer, a multi-axis gyroscope, and a magnetometer and can directly measure the angular velocity and acceleration of an object in three-dimensional space. Integrating the angular velocity and acceleration yields the attitude information of the object in three-dimensional space, from which the displacement information of the target object can be obtained.
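The attitude-from-angular-velocity step can be sketched as a first-order integration of body-frame gyroscope samples (an assumed minimal scheme for illustration; real IMU firmware typically fuses accelerometer and magnetometer data as well):

```python
import numpy as np

def integrate_gyro(R0, omegas, dts):
    """Propagate an attitude rotation matrix from body-frame angular
    velocity samples using the Rodrigues (exponential-map) update."""
    R = R0.copy()
    for w, dt in zip(omegas, dts):
        w = np.asarray(w, dtype=float)
        theta = np.linalg.norm(w) * dt          # rotation angle over this step
        if theta < 1e-12:
            continue                            # negligible rotation this step
        k = w / np.linalg.norm(w)               # unit rotation axis
        K = np.array([[0.0, -k[2], k[1]],
                      [k[2], 0.0, -k[0]],
                      [-k[1], k[0], 0.0]])      # skew-symmetric cross-product matrix
        # Rodrigues formula: dR = exp(theta * K)
        dR = np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)
        R = R @ dR                              # right-multiply: w is expressed in the body frame
    return R
```

For example, a constant yaw rate of pi/2 rad/s over one second turns the identity attitude into a 90-degree rotation about the z axis.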
The displacement information acquisition module 103 is connected with the relative attitude calibration module 104 and is used for acquiring the first displacement information captured by the depth camera while the target object moves in an arbitrary direction according to a preset action sequence, and the second displacement information of the target object acquired by the inertial measurement unit over the corresponding time period.
The first displacement information and the second displacement information are obtained at the same time by different acquisition tools while the same target object moves through the same action sequence: the first displacement information is captured by the depth camera, and the second by the inertial measurement unit.
For example, the inertial measurement unit obtains the acceleration of the palm movement and its attitude relative to the geomagnetic coordinate system, from which the displacement vector measured by the inertial measurement unit can be constructed. First, the acceleration is mapped from the inertial measurement unit coordinate system I to the geomagnetic coordinate system G:

A^G = R_I^G · A^I - g

where A^G is the acceleration vector in G; R_I^G is the relative attitude rotation matrix between the inertial measurement unit coordinate system and the geomagnetic coordinate system, which can be read directly from the inertial measurement unit or obtained by integrating the angular velocity components; A^I is the acceleration vector read from the inertial measurement unit; and g is the gravity vector in the G coordinate system, whose magnitude is taken as 9.8 m/s^2 and which, scaled by a certain proportionality factor t, becomes the dimensionless vector g = [0, 0, 9.8t]^T. Then the resultant acceleration in G is compared against a set threshold to detect the start and end points of the motion segment, and the acceleration at each moment of the segment is iterated to obtain the displacement:
V_i^G = V_{i-1}^G + A_i^G · T_i,   D^G = Σ_i V_i^G · T_i

where V_i^G denotes the velocity vector at moment i, and T_i denotes the time interval from moment i-1 to moment i (the instant at which the system reads data drifts slightly in every cycle). Iterating with the acceleration values gives the velocity at each moment and hence the displacement contribution at moment i; iterating until the end of the motion segment yields the displacement vector D^G of the inertial measurement unit expressed in the G coordinate system.
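These two steps, gravity removal followed by double integration over one motion segment, can be sketched as follows (a simplified illustration that omits the threshold-based segment detection; the function name and argument layout are assumptions):

```python
import numpy as np

def imu_displacement(accs_I, Rs_IG, dts, g=np.array([0.0, 0.0, 9.8])):
    """Second displacement information from IMU samples over one motion
    segment: A^G = R_I^G A^I - g, then V_i = V_{i-1} + A_i^G T_i and
    D^G = sum_i V_i^G T_i."""
    v = np.zeros(3)
    d = np.zeros(3)
    for a_I, R_IG, dt in zip(accs_I, Rs_IG, dts):
        a_G = R_IG @ np.asarray(a_I, dtype=float) - g  # acceleration in G, gravity removed
        v = v + a_G * dt                               # velocity at this moment
        d = d + v * dt                                 # accumulate displacement
    return d
```

Note that double-integrating accelerometer noise makes the position drift grow quickly, which is why the segment boundaries are pinned by the stationary phases of the action sequence.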
And if the detection of the motion segment of the inertial measurement unit fails, so that the extraction of the displacement vector fails, the first displacement information and the second displacement information are acquired again.
The relative attitude calibration module 104 is configured to calculate, according to the first displacement information and the second displacement information, a relative attitude rotation matrix between the depth camera and the inertial measurement unit by using a calibration principle.
Based on the invariance of rigid motion described in different Cartesian coordinate systems, a three-dimensional coordinate system C is established on the depth camera and a three-dimensional coordinate system I on the inertial measurement unit, with G denoting the geomagnetic coordinate system (please refer to fig. 3); the same rigid motion is described in each of these frames.
The calibration objective is to determine the attitude transformation relation R_I^C between C and I. The magnetometer in the 9-axis inertial measurement unit automatically calibrates the relative attitude between the coordinate system I and the geomagnetic coordinate system G; that is, in the absence of external magnetic field interference, R_I^G is known. According to the coordinate system transformation relation R_I^C = (R_C^G)^{-1} · R_I^G, solving for R_I^C can therefore be converted into solving for R_C^G, i.e., determining the attitude description of the depth camera in the geomagnetic coordinate system.
Constructing a calibration solution model according to the first displacement information and the second displacement information, and solving the calibration solution model by using a least square method to obtain a relative attitude rotation matrix of the depth camera and the inertial measurement unit, wherein the specific method comprises the following steps:
constructing a first displacement vector matrix according to the displacement vectors corresponding to the multiple groups of first displacement information; constructing a second displacement vector matrix according to the displacement vectors corresponding to the plurality of groups of second displacement information;
constructing a calibration solving model according to the first displacement vector matrix and the second displacement vector matrix based on invariance described by rigid motion under different Cartesian measurement coordinate systems;
solving the calibration solving model according to a preset algorithm to obtain a relative attitude rotation matrix of the depth camera and the inertial measurement unit;
the multiple groups of first displacement information are acquired by the depth camera for multiple times, and the object moves in any direction according to a preset action sequence; the plurality of groups of second displacement information are second displacement information of the object acquired by the inertial measurement unit in the corresponding same time period.
The action sequence is the movement track of the object, has a definite starting point and movement direction, and can obtain displacement information in any direction in space and position information of the starting point and the stopping point, and can be linear movement with displacement or curvilinear movement, which does not affect the realization of the application. For example, stationary 2 s-motion-stationary 2s, where the hand stationary 2s is intended to allow sufficient time for the sensor to determine the start and end positions of the hand motion.
For example, the hand moves in an arbitrary direction in space following a certain action sequence as input to the system, and the movement is performed three times. Upper computer software then records the acceleration and Euler angle information generated by the inertial measurement unit, the hand position information from the depth camera, and the timestamp of each data record; the displacement vectors of the same person's hand motion are thus acquired simultaneously by the depth camera and the inertial measurement unit. Using the three sets of displacement vectors, a rotation matrix can be obtained by:
M_G = R_C^G · M_C,   i.e.   R_C^G = M_G · M_C^{-1}

where M_G is a 3 × 3 displacement vector matrix formed from any 3 displacement vectors in the geomagnetic coordinate system, M_G = [D_1^G D_2^G D_3^G], each D_i^G being an arbitrary 3 × 1 displacement column vector in the geomagnetic coordinate system; similarly, M_C is the displacement matrix in the depth camera coordinate system, with the same form as M_G; and R_C^G is the rotation matrix between the depth camera coordinate system C and the geomagnetic coordinate system G. However, since the measurement data of the sensors always contain deviations, the rotation matrix obtained from this equation alone may have a large deviation.
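With exactly three linearly independent displacement pairs, the direct solve can be sketched as follows (the function name is illustrative, not from the patent):

```python
import numpy as np

def rotation_from_three(D_C, D_G):
    """Direct solution R_C^G = M_G M_C^{-1} from three displacement
    vector pairs; requires the camera-frame vectors to be linearly
    independent so that M_C is invertible."""
    M_C = np.column_stack(D_C)  # 3 x 3, columns are camera-frame displacements
    M_G = np.column_stack(D_G)  # 3 x 3, columns are geomagnetic-frame displacements
    return M_G @ np.linalg.inv(M_C)
```

Because M_C is inverted directly, measurement noise in any one displacement vector propagates straight into the result, which matches the deviation concern noted above.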
Based on the invariance of rigid motion described in different Cartesian measurement coordinate systems, the calibration solution model M_G = R_C^G · M_C is therefore constructed over more than three displacement pairs, and the relative attitude rotation matrix R_C^G between the depth camera and the geomagnetic coordinate system is solved by the least square method:

R_C^G = M_G · M_C^T · (M_C · M_C^T)^{-1}
From the coordinate system transformation relation shown in fig. 3, the attitude of the inertial measurement unit coordinate system in the depth camera coordinate system is described as:

R_I^C = (R_C^G)^{-1} · R_I^G

where R_I^G is the relative attitude rotation matrix between the on-board coordinate system of the inertial measurement unit and the geomagnetic coordinate system; that is, the relative attitude matrix between the depth camera and the inertial measurement unit can be obtained using this formula.
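A least-squares version using more than three pairs can be sketched as below. The SVD projection onto the nearest proper rotation (the Kabsch/Wahba construction) is a common refinement that the patent does not spell out; it is assumed here so that the result stays orthonormal despite sensor deviations:

```python
import numpy as np

def calibrate_R_CG(D_C, D_G):
    """Estimate R_C^G from n >= 3 displacement pairs in a least-squares
    sense, projected onto SO(3) via SVD (Kabsch-style refinement)."""
    M_C = np.column_stack(D_C)   # 3 x n camera-frame displacement matrix
    M_G = np.column_stack(D_G)   # 3 x n geomagnetic-frame displacement matrix
    H = M_G @ M_C.T              # 3 x 3 cross-covariance of the pairs
    U, _, Vt = np.linalg.svd(H)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])  # keep det = +1
    return U @ S @ Vt

def relative_attitude(R_CG, R_IG):
    """R_I^C = (R_C^G)^T R_I^G: the IMU attitude in the camera frame."""
    return R_CG.T @ R_IG
```

With noise-free pairs this recovers the true rotation exactly; with noisy pairs it returns the closest proper rotation in the Frobenius sense.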
It should be noted that the displacement information acquisition module and the relative attitude calibration module may be executed by a computer, which refers to, but is not limited to, a personal computer or an embedded system with computing capability. The depth camera is connected to the computer by wire, so that data can be transmitted to the computer in real time; the inertial measurement unit is connected to the computer by wire or wirelessly and likewise transmits data in real time. The depth camera and the inertial measurement unit are not directly connected and do not share data. The computer runs self-developed upper computer software that acquires and processes the data of the depth camera and the inertial measurement unit and solves their relative attitude rotation matrix according to the calibration principle.
In the technical solution provided by the embodiment of the utility model, the fixed depth camera and the freely movable inertial measurement unit record the displacement information of the target object as it moves in three-dimensional space, and the relative attitude rotation matrix of the depth camera and the inertial measurement unit is calculated from this displacement information using the calibration principle. The whole relative attitude calibration process is easy to operate, has high precision, requires no additional calibration auxiliary equipment, and is contactless; it improves the efficiency and accuracy of the relative attitude calibration of the depth camera and the inertial measurement unit, thereby providing instructive and accurate calibration information for somatosensory interaction fields such as robot teleoperation and somatosensory game equipment calibration.
In a specific embodiment, since the relative attitude of the depth camera and the inertial measurement unit is expressed in different manners in different application scenarios, building on the above embodiment and referring to fig. 4, the present application further provides another embodiment, which may further include:
and a relative attitude representing module 105, connected to the relative attitude calibration module 104, configured to convert the relative attitude of the depth camera and the inertial measurement unit according to a preset attitude representing manner according to the relative attitude rotation matrix.
The attitude representation mode is determined according to the current application scenario or the user's requirements, and the present application does not limit this.
For example, the relative attitude rotation matrix can be converted into an Euler-angle representation, so that the relative attitude of the depth camera and the inertial measurement unit is expressed by a roll angle, a pitch angle and a yaw angle.
In that case, after the calibration process is completed, the output Euler-angle attitude representation has three parameters: the roll angle, the pitch angle and the yaw angle.
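The patent names the three angles but not the rotation-order convention. A sketch of the conversion under the common Z-Y-X (yaw-pitch-roll) convention, which is an assumption here:

```python
import math
import numpy as np

def rotation_to_euler_zyx(R):
    """Convert a 3x3 rotation matrix to (roll, pitch, yaw) in radians.

    Assumes R = Rz(yaw) @ Ry(pitch) @ Rx(roll), the common Z-Y-X
    convention; the patent only names the three angles, so the
    convention is an illustrative choice.
    """
    R = np.asarray(R, dtype=float)
    pitch = math.asin(max(-1.0, min(1.0, -R[2, 0])))
    if abs(R[2, 0]) < 1.0 - 1e-9:            # away from gimbal lock
        roll = math.atan2(R[2, 1], R[2, 2])
        yaw = math.atan2(R[1, 0], R[0, 0])
    else:                                     # gimbal lock: yaw fixed to 0
        roll = math.atan2(-R[0, 1], R[1, 1])
        yaw = 0.0
    return roll, pitch, yaw
```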
Supporting different representation modes improves the applicability of the technical solution and the user experience.
In a specific implementation manner, referring to fig. 4, the present application further provides another embodiment, which may further include:
and a display 106, connected with the relative attitude calibration module 104 and used for displaying the relative attitude of the depth camera and the inertial measurement unit in a preset attitude representation.
Of course, when the relative attitude representation module 105 is included in the system, the display 106 may also be connected to the relative attitude representation module 105.
By adding the display, a user can view the relative attitude of the depth camera and the inertial measurement unit more intuitively, which improves the user experience.
In order to make the technical solutions of the present application more clearly understood by those skilled in the art, specific examples are also provided, which may specifically be:
the operator stands in front of the depth camera and performs the following action sequence: hands hanging naturally, raise one hand forward with the palm facing forward and the arm as parallel to the ground as possible, take one step forward, then lower the hand naturally. The data from the depth camera and the inertial measurement unit are transmitted to the host computer, whose software manages and coordinates the operation of each sensor and records the sensor data at each moment. The curve shown in fig. 5 is the spatial position of the palm measured by the depth camera, converted with the calibrated relative attitude rotation matrix into position information along the Z-axis of the geomagnetic coordinate system. Stages 1, 2 and 3 in fig. 5 correspond to raising the hand, stepping forward and lowering the hand, respectively; the other periods are static. The raised-hand position is referenced to a mark on the wall, preset according to the experimenter's arm length. The length from the experimenter's palm to shoulder is about 63 cm, and with the arm raised as parallel to the ground as possible and the palm facing forward, the palm is about 5 cm above the shoulder, so the vertical distance between the lowered-hand mark and the raised-hand mark is 68 cm. As the figure shows, the solid line is the position information converted from the depth-camera coordinate system to the geomagnetic coordinate system with the calibrated rotation matrix: the difference between the raised and lowered hand positions is 68-70 cm, and the positions before raising and after lowering the hand differ by only 0.81 cm, which demonstrates that the calibration result is feasible.
By contrast, the dashed line is processed with an uncalibrated, manually estimated transformation matrix: its vertical height difference is 78-79 cm, and the positions before raising and after lowering the hand deviate by 5.42 cm. The calibrated result is clearly superior to the uncalibrated one, which shows the practical application value of the utility model.
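The validation step above reduces to rotating the camera-frame palm positions into the geomagnetic frame and reading off the Z-axis range. A minimal sketch of that check (function name and frame conventions are illustrative assumptions, not from the patent):

```python
import numpy as np

def vertical_displacement(positions_cam, R_cam_to_geo):
    """Transform positions from the depth-camera frame into the
    geomagnetic frame with the calibrated rotation matrix, then return
    the range along the Z-axis -- the quantity compared against the
    68 cm wall mark in the hand-raising experiment.
    """
    pts = np.asarray(positions_cam, dtype=float)
    pts_geo = pts @ np.asarray(R_cam_to_geo, dtype=float).T  # rotate each row
    z = pts_geo[:, 2]
    return float(z.max() - z.min())
```

With a well-calibrated matrix this range should land near the expected 68 cm; a poorly estimated matrix leaks horizontal motion into the Z-axis and inflates the figure, as the dashed line's 78-79 cm shows.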
The embodiments are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same or similar parts among the embodiments are referred to each other. The device disclosed by the embodiment corresponds to the method disclosed by the embodiment, so that the description is simple, and the relevant points can be referred to the method part for description.
Those of skill would further appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative components and steps have been described above generally in terms of their functionality in order to clearly illustrate this interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in Random Access Memory (RAM), Read Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The calibration system for the relative attitude of a depth camera and an inertial measurement unit provided by the utility model has been introduced in detail above. The principles and embodiments of the utility model are explained herein using specific examples, and the above description of the embodiments is only intended to help understand the method and core idea of the utility model. It should be noted that those skilled in the art can make further improvements and modifications without departing from the principle of the utility model, and such improvements and modifications also fall within the protection scope of the appended claims.

Claims (6)

1. A system for calibrating the relative attitude of a depth camera and an inertial measurement unit, comprising:
the system comprises a depth camera, an inertia measurement unit, a displacement information acquisition module and a relative attitude calibration module;
the depth camera is used for acquiring three-dimensional space information of an object, is fixed on one side of an application scene to be calibrated, and is connected with the displacement information acquisition module through a wire;
the inertial measurement unit is used for measuring the angular velocity and the acceleration of the object in a three-dimensional space, freely moves relative to the depth camera and is connected with the displacement information acquisition module;
the displacement information acquisition module is connected with the relative attitude calibration module and is used for acquiring first displacement information collected by the depth camera while the target object moves in any direction according to a preset action sequence, and second displacement information of the object collected by the inertial measurement unit over the corresponding same time period;
and the relative attitude calibration module is used for calculating to obtain a relative attitude rotation matrix of the depth camera and the inertial measurement unit by utilizing a calibration principle according to the first displacement information and the second displacement information.
2. The system for calibrating the relative attitude of a depth camera and an inertial measurement unit according to claim 1, wherein the inertial measurement unit is bound to a moving part of the object to move as the object moves.
3. The system for calibrating the relative attitude of a depth camera and an inertial measurement unit according to claim 2, wherein the inertial measurement unit integrates a multi-axis accelerometer, a multi-axis gyroscope and a magnetometer.
4. The system for calibrating the relative attitude of a depth camera and an inertial measurement unit of claim 3, wherein the inertial measurement unit is wirelessly connected to the displacement information acquisition module.
5. The system for calibrating the relative attitude of a depth camera and an inertial measurement unit according to any one of claims 1 to 4, further comprising:
and the relative attitude representing module is connected with the relative attitude calibration module and is used for converting the relative attitude of the depth camera and the inertial measurement unit according to a preset attitude representing mode according to the relative attitude rotation matrix.
6. The system for calibrating the relative attitude of a depth camera and an inertial measurement unit of claim 5, further comprising:
and the display is connected with the relative attitude calibration module and used for displaying the relative attitude of the depth camera and the inertial measurement unit according to a preset attitude representation mode.
CN201720973935.7U 2017-08-04 2017-08-04 A kind of calibration system of depth camera and Inertial Measurement Unit relative attitude Active CN207923150U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201720973935.7U CN207923150U (en) 2017-08-04 2017-08-04 A kind of calibration system of depth camera and Inertial Measurement Unit relative attitude


Publications (1)

Publication Number Publication Date
CN207923150U true CN207923150U (en) 2018-09-28

Family

ID=63611543

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201720973935.7U Active CN207923150U (en) 2017-08-04 2017-08-04 A kind of calibration system of depth camera and Inertial Measurement Unit relative attitude

Country Status (1)

Country Link
CN (1) CN207923150U (en)


Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109685852A (en) * 2018-11-22 2019-04-26 上海肇观电子科技有限公司 The scaling method of camera and inertial sensor, system, equipment and storage medium
CN109798891A (en) * 2019-01-25 2019-05-24 上海交通大学 Inertial Measurement Unit calibration system based on high-precision motion capture system
CN111750850B (en) * 2019-03-27 2021-12-14 杭州海康威视数字技术股份有限公司 Angle information acquisition method, device and system
CN111750850A (en) * 2019-03-27 2020-10-09 杭州海康威视数字技术股份有限公司 Angle information acquisition method, device and system
CN110928432A (en) * 2019-10-24 2020-03-27 中国人民解放军军事科学院国防科技创新研究院 Ring mouse, mouse control device and mouse control system
CN110928432B (en) * 2019-10-24 2023-06-23 中国人民解放军军事科学院国防科技创新研究院 Finger ring mouse, mouse control device and mouse control system
CN112272757A (en) * 2019-11-22 2021-01-26 深圳市大疆创新科技有限公司 External parameter calibration method and device for detection device and movable platform
CN111060138A (en) * 2019-12-31 2020-04-24 上海商汤智能科技有限公司 Calibration method and device, processor, electronic equipment and storage medium
WO2021134960A1 (en) * 2019-12-31 2021-07-08 上海商汤智能科技有限公司 Calibration method and apparatus, processor, electronic device, and storage medium
CN111060138B (en) * 2019-12-31 2022-01-28 上海商汤智能科技有限公司 Calibration method and device, processor, electronic equipment and storage medium
CN111240469A (en) * 2019-12-31 2020-06-05 北京诺亦腾科技有限公司 Calibration method and device for hand motion capture, electronic device and storage medium
CN112577518A (en) * 2020-11-19 2021-03-30 北京华捷艾米科技有限公司 Inertial measurement unit calibration method and device
CN113776556A (en) * 2021-05-30 2021-12-10 南京理工大学 Data fusion-based gyroscope and camera relative position matrix calibration method
CN113776556B (en) * 2021-05-30 2024-05-07 南京理工大学 Gyroscope and camera relative position matrix calibration method based on data fusion
CN113392909A (en) * 2021-06-17 2021-09-14 深圳市睿联技术股份有限公司 Data processing method, data processing device, terminal and readable storage medium

Similar Documents

Publication Publication Date Title
CN107314778B (en) Calibration method, device and system for relative attitude
CN207923150U (en) A kind of calibration system of depth camera and Inertial Measurement Unit relative attitude
JP4989660B2 (en) Motion capture device and method related thereto
Tian et al. Accurate human navigation using wearable monocular visual and inertial sensors
CN109141433A (en) A kind of robot indoor locating system and localization method
CN106052584B (en) A kind of view-based access control model and the orbit space linear measurement method of Inertia information fusion
CN109313417A (en) Help robot localization
WO2016183812A1 (en) Mixed motion capturing system and method
TWI402506B (en) Method and system for motion tracking
KR101708584B1 (en) Method and device for sensing orientation of an object in space in a fixed frame of reference
CN110617814A (en) Monocular vision and inertial sensor integrated remote distance measuring system and method
JP6776882B2 (en) Motion analyzers, methods and programs
KR20160037972A (en) Method for camera motion estimation and correction
CN104834917A (en) Mixed motion capturing system and mixed motion capturing method
JP6145072B2 (en) Sensor module position acquisition method and apparatus, and motion measurement method and apparatus
CN104848861B (en) A kind of mobile device attitude measurement method based on picture drop-out point identification technology
CN208751577U (en) A kind of robot indoor locating system
CN108801250B (en) Real-time attitude acquisition method and device based on underwater robot
CN103105166B (en) Motion data processing method and system for motion practice beat
CN111435083A (en) Pedestrian track calculation method, navigation method and device, handheld terminal and medium
JP2011033489A (en) Marker for motion capture
Zhang et al. Fusion of vision and IMU to track the racket trajectory in real time
JP2009186244A (en) Tilt angle estimation system, relative angle estimation system, and angular velocity estimation system
Qian et al. Optical flow based step length estimation for indoor pedestrian navigation on a smartphone
KR101950453B1 (en) Apparatus and method for wearing position proposal of measuring sensor

Legal Events

Date Code Title Description
GR01 Patent grant