WO2018233514A1 - Pose measurement method, device and storage medium - Google Patents

Pose measurement method, device and storage medium

Info

Publication number
WO2018233514A1
WO2018233514A1 (PCT/CN2018/090821, CN2018090821W)
Authority
WO
WIPO (PCT)
Prior art keywords
dimensional
real
virtual
rotation
matrix
Prior art date
Application number
PCT/CN2018/090821
Other languages
English (en)
French (fr)
Inventor
徐坤
周轶
范国田
Original Assignee
中兴通讯股份有限公司 (ZTE Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 中兴通讯股份有限公司 (ZTE Corporation)
Publication of WO2018233514A1 publication Critical patent/WO2018233514A1/zh

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 - Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04 - Interpretation of pictures
    • G01C11/06 - Interpretation of pictures by comparison of two or more pictures of the same area
    • G01C11/08 - Interpretation of pictures by comparison of two or more pictures of the same area, the pictures not being supported in the same relative position as when they were taken

Definitions

  • The present invention relates to the field of measurement technology, and in particular to a pose measurement method, device and storage medium.
  • Object pose measurement has important application value in many fields such as modern communication, national defense and aerospace.
  • For example, in a communication system the measured antenna pose is closely related to the coverage of the base station, and during robot assembly the pose of the robot arm determines the accuracy of the assembly. How to measure the pose of an object has long been a focus of research.
  • Existing pose measurement methods usually require trained personnel to operate measurement instruments directly on the object being measured.
  • Such methods are feasible for objects that can be reached at close range.
  • For objects such as antennas and aircraft that must be measured from a distance, however, they are very dangerous: the personal safety of the measuring personnel cannot be guaranteed, and excessive manpower is consumed. A simple and practical pose measurement method is therefore highly desirable.
  • Embodiments of the invention provide a pose measurement method, device and storage medium, which solve the problem that prior-art pose measurement methods are labor-intensive and cannot guarantee the personal safety of the measuring personnel.
  • A pose measurement method includes:
  • acquiring multiple photos of an object taken at different positions, and the attitude parameters of the photographing device when the photos were taken;
  • constructing a virtual three-dimensional scene from the object photos, and matching the virtual three-dimensional scene to the real three-dimensional environment according to the attitude parameters of the photographing device;
  • calculating the attitude parameters of the object from the position information of the object in the real three-dimensional environment.
  • Optionally, the attitude parameters of the photographing device include its rotation angle information, and matching the virtual three-dimensional scene to the real three-dimensional environment according to those parameters includes: obtaining a first rotation matrix from the two-dimensional image plane to the virtual three-dimensional scene; obtaining a second rotation matrix from the two-dimensional image plane to the real three-dimensional environment according to the rotation angle information; and calculating a rotation transformation matrix from the virtual three-dimensional scene to the real three-dimensional environment from the first and second rotation matrices.
  • Optionally, the attitude parameters of the photographing device include its positioning information, and the matching further includes:
  • determining a translation-and-scaling transformation matrix from the virtual three-dimensional scene to the real three-dimensional environment according to the positioning information and the position of the photographing device in the virtual three-dimensional scene.
  • Optionally, calculating the rotation transformation matrix from the first and second rotation matrices includes: obtaining the rotation transformation matrices computed at the different photographing positions;
  • calculating the averages of the rotation angles of the individual matrices about the three coordinate axes, and reconstructing the final rotation transformation matrix from the averaged angles.
  • Optionally, calculating the attitude parameters of the object from its position information in the real three-dimensional environment includes:
  • determining two same-named point pairs from the same-named straight lines calibrated in two target photos;
  • determining the positions of two image points in the virtual three-dimensional scene from the two same-named point pairs; then transforming the position of the midpoint of the line joining the two points with the rotation transformation matrix and the translation-and-scaling transformation matrix, and obtaining the positioning value and altitude value of the object in the real three-dimensional environment from the transformed position.
  • Optionally, determining the two same-named point pairs from the same-named lines calibrated in the two target photos includes: after the calibrated lines are detected, computing, according to the binocular epipolar constraint, the epipolar lines at the two endpoints of the same-named line in one target photo;
  • the intersection of each epipolar line with the same-named line in the other target photo is the same-named point of that endpoint.
  • Optionally, determining the positions of the two image points in the virtual three-dimensional scene from the two same-named point pairs includes: constructing, from the projection equation from three-dimensional space to the two-dimensional image plane, the linear equations corresponding to each point of a same-named pair;
  • assembling the linear equations of the pair into a linear system, solving the system by matrix methods, and determining the position of each point in the virtual three-dimensional scene from the solution.
  • Optionally, the linear system is solved iteratively: a new weighting factor for each image point is obtained from the previous solution;
  • the linear equations of each point are divided by the corresponding weighting factor to give a new linear system, and a new solution is computed. This step is repeated until the new solution equals the previous one; the step is then performed once more, and the final solution is the position of the point.
  • Optionally, calculating the attitude parameters of the object from its position information in the real three-dimensional environment includes: determining the position of the same straight line calibrated in three target photos; computing the position information of the line in the real three-dimensional environment with the trifocal tensor algorithm from those positions and the position information of the photographing device;
  • and calculating the attitude angle of the object from the position information of the line in real space.
  • A pose measurement apparatus includes a camera, a processor and a memory, the memory storing a pose measurement program; the processor is configured to execute the program stored in the memory to implement the steps of the pose measurement method described above.
  • A computer-readable storage medium stores a pose measurement program which, when executed by a processor, implements the steps of the pose measurement method described above.
  • The object is photographed at different positions by a photographing device, and the photos are then analyzed together with the attitude parameters of the device, so that the actual pose parameters of the object can be obtained. The embodiments of the invention thus measure the pose of an object remotely; the procedure is simple and convenient, no climbing by measuring personnel is required, and the danger of the measurement is effectively eliminated.
  • FIG. 1 is a flowchart of a pose measurement method according to an embodiment of the present invention.
  • FIGS. 2a and 2b are schematic diagrams of the computed same-named image points in an embodiment of the present invention.
  • FIG. 3 is a flowchart of measuring an antenna with the trifocal tensor according to an embodiment of the present invention.
  • FIG. 4 is a schematic block diagram of a pose measuring device according to an embodiment of the present invention.
  • As shown in FIG. 1, the pose measurement method includes the following steps:
  • Step 101 Acquire a plurality of photos of the objects photographed at different positions and posture parameters of the photographing device when the photographs are taken.
  • The object here is not limited to an antenna; it may also be a robot, an aircraft, or any other object whose attitude parameters need to be monitored.
  • For example, during assembly the attitude parameters of a robot arm are checked to ensure the accuracy of the assembly; or, in an aircraft wind test, the flight attitude is monitored.
  • The photographing device here includes still cameras, video cameras, mobile phones, tablet computers and other devices with a shooting function; the type of device is not specifically limited.
  • When the photographing device is movable, it can acquire its own attitude parameter information; the shooting positions can be chosen freely, and photos of the object taken at different positions can be obtained. Alternatively, fixed cameras can be installed at different positions and the photos collected from them when needed.
  • The attitude parameters of the photographing device include positioning information (for example, Global Positioning System (GPS) information) and the three rotation angles (the rotation angles about the three coordinate axes).
  • When the photographing device is movable, these parameters can be read directly from the corresponding sensors; for fixed cameras, the rotation angles and GPS information can be configured in advance and read directly when needed.
  • Step 102 Construct a virtual three-dimensional scene according to the object photo, and match the virtual three-dimensional scene with the real three-dimensional environment according to the posture parameter of the shooting device.
  • Matching the virtual three-dimensional scene to the real three-dimensional environment according to the attitude parameters of the photographing device mainly means computing the rotation transformation matrix from the device's rotation angle information and the translation-and-scaling matrix from its positioning information.
  • Through the rotation transformation matrix and the translation-and-scaling transformation matrix, the virtual three-dimensional scene can be converted into the real three-dimensional scene.
  • Step 103 Calculate the attitude parameter of the object according to the position information of the object in the real three-dimensional environment.
  • Step 102 has already produced the conversion information from the virtual three-dimensional scene to the real three-dimensional environment; with it, once the user calibrates the position to be measured in the object photos, the position information of the antenna in real space can be obtained.
  • The attitude parameters of the object can then be calculated directly from its position information in real space.
  • In summary, the pose measurement method photographs an object from different positions, analyzes the photos together with the attitude parameters of the photographing device, and finally obtains the actual pose parameters of the photographed object through mathematical computation.
  • The embodiments of the invention thus measure the pose of an object remotely and require no climbing by measuring personnel, so the operation is simple and convenient, the danger of the measurement is effectively eliminated, and the personal safety of the measuring personnel is ensured.
  • In the following, an antenna is taken as an example and the technical solution is introduced through the pose measurement of the antenna.
  • Step 101 Acquire a plurality of photos of the objects photographed at different positions and posture parameters of the photographing device when the photographs are taken.
  • A mobile phone is used as the measuring device: photos of the antenna are taken with the phone's camera, and the GPS position information and attitude parameters are acquired from the phone's sensors (for example, the GPS sensor, gyroscope, accelerometer and electronic compass).
  • Several antenna photos (any number from 5 to 10) are taken with the phone at different positions.
  • Each time a photo is taken, the attitude parameters of the phone are acquired and recorded; the attitude parameters here include the GPS position and the rotation angles about the three coordinate axes.
  • Step 102 Construct a virtual three-dimensional scene according to the object photo, and match the virtual three-dimensional scene with the real three-dimensional environment according to the posture parameter of the shooting device.
  • The rotation transformation matrix is calculated from the three rotation angles recorded when the photographing device took the photos, in the following steps:
  • Step 21: Acquire, with the SFM (structure from motion) algorithm, the first rotation matrix R_SfM of the photographing device from the two-dimensional image plane to the virtual three-dimensional scene.
  • During reconstruction of the virtual three-dimensional scene, the SFM algorithm yields the rotation matrix in each photographing device's extrinsic parameters (the rotation from the two-dimensional image plane to SFM space); this is the first rotation matrix.
  • Through this matrix, the attitude of the photographing device in the virtual three-dimensional space is known from the two-dimensional image plane.
  • Step 22: Acquire the second rotation matrix of the photographing device, from the two-dimensional image plane to the real three-dimensional environment, according to the device's rotation angle information.
  • The rotation matrix of the photographing device in the real three-dimensional environment is obtained from its three rotation angles.
  • The following formula is used to compute the attitude of the phone's own coordinate system with respect to the real three-dimensional environment (the geodetic coordinate system):
  • R_camera = R_phone · R_x(180) · R_z(−90) (1)
  • where R_x(180) denotes a 180-degree counter-clockwise rotation about the x axis of the phone's own coordinate system, R_z(−90) denotes a subsequent 90-degree clockwise rotation about the z axis, and R_phone is the rotation built from the three rotation angles recorded when the photo was taken.
  • Step 23: Calculate the rotation transformation matrix from the virtual three-dimensional scene to the real three-dimensional environment from the first rotation matrix and the second rotation matrix.
  • Steps 21 and 22 give, respectively, the rotation matrix R_SfM of the photographing device in SFM space (the virtual three-dimensional scene) and the rotation matrix R_camera of the device relative to the real three-dimensional environment; the rotation transformation matrix R_trans then follows from formula (2):
  • R_trans = R_camera · (R_SfM)^(−1) (2)
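  • As an illustration only, a minimal Python sketch of formulas (1) and (2) follows (NumPy/SciPy assumed; the 'xyz' Euler order for the phone's sensor angles is an assumption, since the patent does not fix it, and the function names are hypothetical):

    import numpy as np
    from scipy.spatial.transform import Rotation as R

    def camera_rotation(phone_angles_deg):
        # R_phone from the phone's three sensor rotation angles ('xyz' order assumed)
        r_phone = R.from_euler('xyz', phone_angles_deg, degrees=True).as_matrix()
        r_x180 = R.from_euler('x', 180, degrees=True).as_matrix()   # 180 deg CCW about x
        r_zm90 = R.from_euler('z', -90, degrees=True).as_matrix()   # 90 deg CW about z
        return r_phone @ r_x180 @ r_zm90        # formula (1)

    def rotation_transform(r_camera, r_sfm):
        # Formula (2): R_trans = R_camera * (R_SfM)^-1; a rotation matrix is
        # orthogonal, so its inverse is its transpose.
        return r_camera @ r_sfm.T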
  • The photographing device photographs the object at several different positions, so a rotation transformation matrix can be obtained for each position.
  • To ensure the accuracy of the rotation transformation matrix, the embodiment of the invention averages the rotation transformation matrices obtained at the different positions to produce the final rotation transformation matrix: the averages of the rotation angles of the individual matrices about the three coordinate axes are computed, and the final matrix is reconstructed from those averages.
  • Specifically, each rotation transformation matrix is decomposed into three rotation angles R_x, R_y, R_z; the angles are averaged per axis, and the final rotation transformation matrix is rebuilt from the three averaged angles.
  • The conversion between a rotation matrix and rotation angles is well known to those skilled in the art and is not described in detail here.
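  • A possible sketch of this averaging step, under the same assumptions as above (the per-axis angles are treated as 'xyz' Euler angles; naive averaging assumes the angles do not straddle the ±180-degree wrap-around):

    import numpy as np
    from scipy.spatial.transform import Rotation as R

    def average_rotation_transform(matrices):
        # Decompose each rotation transformation matrix into three rotation
        # angles, average the angles per axis, and rebuild the final matrix.
        angles = np.array([R.from_matrix(m).as_euler('xyz', degrees=True)
                           for m in matrices])
        return R.from_euler('xyz', angles.mean(axis=0), degrees=True).as_matrix()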
  • After the rotation transformation, the coordinate system of real space has its Y axis pointing true north, its X axis pointing true east, and its Z axis pointing vertically from the center of the earth towards the sky; the side of a building is then seen to be nearly perpendicular to the Z axis, indicating that the conversion is fairly accurate.
  • To make the size and position of the rotated sparse point cloud agree with the real world, the position of the point cloud must also be translated and scaled.
  • When computing these conversion parameters, the scaling is by the same ratio on every coordinate axis, so the values in the Z-axis direction can be ignored;
  • only the numerical computation for the X axis and the Y axis is explained here.
  • Let the converted position of the photographing device, in meters, be (X_i, Y_i) (that is, the position of the device in the real three-dimensional environment, obtainable from its GPS information), and let its position in the three-dimensional reconstruction space likewise be (x_i, y_i). With scaling coefficient S and translation vector (T_x, T_y), each photo contributes the linear equations X_i = S·x_i + T_x and Y_i = S·y_i + T_y, which stack into the system
  • [x_1 1 0; y_1 0 1; ...; x_n 1 0; y_n 0 1] · [S; T_x; T_y] = [X_1; Y_1; ...; X_n; Y_n]
  • where n is the number of object photos. The system is overdetermined;
  • it can be solved with the least-squares method, QR decomposition, singular value decomposition (SVD) or other common methods of matrix theory, giving the target translation and scaling parameters.
  • Because the best GPS accuracy when photographing is only about 3 m, an individual reading is imprecise; recomputing the target position from all the collected GPS information as above yields a much more accurate value. Alternatively, an interface can be provided for the user to input the scaling parameter: the user supplies the average moving distance between every two antenna photos, and this is divided by the average distance between neighboring photographing devices in the virtual three-dimensional scene to give the scaling parameter. Normally the user's average moving distance is between 0.3 and 1 meter.
  • Once the scaling parameter is known, the translation parameters can still be solved with the system above. The reason for this method is the low accuracy of GPS positioning: if, for example, the user takes 5 photos and moves 0.5 m between adjacent ones, the movement spans only 2.5 m in total, while GPS is often only accurate to about 3 m, so the user may well move while the GPS readings cluster too densely.
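  • A least-squares sketch of this step (NumPy; the array names are illustrative):

    import numpy as np

    def solve_translation_scale(real_xy, sfm_xy):
        # real_xy: (n, 2) device positions in meters from GPS; sfm_xy: (n, 2)
        # device positions in the reconstruction. Solves X_i = S*x_i + T_x,
        # Y_i = S*y_i + T_y for (S, T_x, T_y) in the least-squares sense.
        real_xy = np.asarray(real_xy, float)
        sfm_xy = np.asarray(sfm_xy, float)
        n = len(real_xy)
        A = np.zeros((2 * n, 3))
        b = np.zeros(2 * n)
        A[0::2, 0] = sfm_xy[:, 0]; A[0::2, 1] = 1.0; b[0::2] = real_xy[:, 0]
        A[1::2, 0] = sfm_xy[:, 1]; A[1::2, 2] = 1.0; b[1::2] = real_xy[:, 1]
        (S, Tx, Ty), *_ = np.linalg.lstsq(A, b, rcond=None)
        return S, Tx, Ty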
  • Step 103 Calculate the attitude parameter of the object according to the position information of the object in the real three-dimensional environment.
  • The antenna attitude parameters include the attitude angle of the antenna as well as its GPS and altitude (mounting height) information.
  • The acquisition of the GPS and altitude information is introduced first.
  • After the SFM algorithm has run, the intrinsic and extrinsic parameters of the photographing devices are available.
  • On this basis, same-named image points representing the antenna position are needed to compute the three-dimensional spatial position of the antenna target, as follows:
  • two same-named point pairs are determined from the same-named straight lines calibrated in two target photos;
  • the positions of the two corresponding points in the virtual three-dimensional scene are determined from the two pairs, and the position of the midpoint of the line joining them is transformed with the rotation transformation matrix and the translation-and-scaling transformation matrix; the GPS and altitude values of the antenna in the real three-dimensional environment are obtained from the transformed position.
  • After photographing the antenna, the user selects two of the photos as target photos and calibrates in them the same-named straight lines indicating the antenna position.
  • The final GPS and altitude values are determined from the information of these same-named lines.
  • Same-named image points are obtained from the user-entered same-named line pair together with the binocular epipolar constraint, as follows:
  • the epipolar lines at the two endpoints of the same-named line in one of the target photos are computed according to the binocular epipolar constraint;
  • the intersection of each epipolar line with the same-named line in the other target photo is the same-named point of that endpoint.
  • As shown in FIGS. 2a and 2b, the user selects two photos as target photos, frames the antenna target in each, and selects within it the same-named line pair (l, l′) representing the antenna.
  • According to the binocular epipolar constraint, the epipolar line l_a at the endpoint a of line l in FIG. 2a is computed; this epipolar line intersects the same-named line l′ of FIG. 2b at the point a′, giving the same-named point pair (a, a′), and the pair for the other endpoint is obtained in the same way.
  • Although this way of obtaining same-named points increases the user's workload, current line-matching algorithms cannot accurately match corresponding straight lines; the method of this embodiment guarantees both the accuracy of the line matching and the precision of the same-named points obtained.
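  • A sketch of this endpoint transfer, assuming the fundamental matrix F between the two target photos has already been derived from the SFM camera parameters (the patent does not prescribe how F is formed):

    import numpy as np

    def corresponding_point(F, endpoint, line_pts_other):
        # endpoint: (u, v) of one end of the calibrated line in photo 1;
        # line_pts_other: two (u, v) points on the same-named line in photo 2.
        x = np.array([endpoint[0], endpoint[1], 1.0])
        epiline = F @ x                              # epipolar line in photo 2
        p, q = (np.array([u, v, 1.0]) for u, v in line_pts_other)
        line = np.cross(p, q)                        # the same-named line
        inter = np.cross(epiline, line)              # homogeneous intersection
        return inter[:2] / inter[2]                  # the same-named point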
  • After the two same-named point pairs in the two selected target photos are obtained, the three-dimensional positions of the two corresponding points must be determined from the pairs' positions in the photos. This embodiment optionally proceeds as follows:
  • the linear equations of each point of a same-named pair are constructed from the projection equation from three-dimensional space to the two-dimensional image plane;
  • the linear equations of the pair are assembled into a linear system, the system is solved by matrix methods, and the positions of the points in the virtual three-dimensional scene are determined from the result.
  • Each image point here corresponds to one shooting angle, i.e. the shooting angles of the two target photos (different shooting positions) mentioned above. Specifically, the projection from three-dimensional space to the two-dimensional image plane is x = PX, where x = w(u, v, 1) is a homogeneous coordinate, (u, v) is the coordinate point on the photo (the two-dimensional image-plane coordinate), and w is the projection depth, i.e. the scale factor.
  • X is the position coordinate in three-dimensional space of the point to be computed, written homogeneously as the four-vector X = (x, y, z, 1); P is the projection matrix (available from the SFM algorithm), with rows P^1, P^2, P^3, so the projection equation reads w·u = P^1 X, w·v = P^2 X, w = P^3 X. Eliminating w gives the two linear equations u·(P^3 X) − P^1 X = 0 and v·(P^3 X) − P^2 X = 0 per view.
  • Computing X therefore requires same-named image points and projection matrices from at least two views: two views give four linear equations, which, after moving the right-hand terms to the left, take the form AX = 0.
  • A is then a 4 × 4 matrix. Normally, because of noise, the equations cannot hold exactly; instead one finds the X that minimizes ||AX|| subject to the constraint ||X|| = 1.
  • The optimal solution of this problem is the eigenvector corresponding to the smallest eigenvalue of the matrix A^T A, and general methods such as SVD or QR decomposition can be used to solve for X.
  • The solved X = (a, b, c, d) is a homogeneous vector: dividing each element by the last gives X1 = (a/d, b/d, c/d, d/d), and the final target coordinate is T = (a/d, b/d, c/d). With more than two views, systems of 6, 8, ... linear equations arise, A is overdetermined, and the system can likewise be solved by least squares, SVD or QR.
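  • As an illustration, a sketch of this linear (DLT-style) triangulation for any number of views (NumPy):

    import numpy as np

    def triangulate_dlt(points, projections):
        # points: one (u, v) observation per view; projections: the matching
        # 3x4 matrices P from SFM. Each view contributes the rows
        # u*P^3 - P^1 and v*P^3 - P^2 of A; the minimizer of ||AX|| with
        # ||X|| = 1 is the right singular vector for the smallest singular value.
        rows = []
        for (u, v), P in zip(points, projections):
            rows.append(u * P[2] - P[0])
            rows.append(v * P[2] - P[1])
        _, _, vt = np.linalg.svd(np.asarray(rows))
        X = vt[-1]                       # homogeneous solution (a, b, c, d)
        return X[:3] / X[3]              # final target coordinate T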
  • One problem with the linear method is that the minimized ||AX|| has no geometric meaning and does not match the target function one actually wants to minimize; multiplying a row of A by a weighting factor also changes the result.
  • The space point X computed at first does not satisfy the linear equations exactly; the residual is AX. What one wants to minimize is the distance between the real image point x and the projection point PX of X, and dividing each view's linear equations by the weighting factor w = P^3 X makes the final minimized error meaningful in the photo sense.
  • This embodiment therefore iterates: a new weighting factor for each view is obtained from the previous solution, the linear equations of each view are divided by the corresponding weighting factor to give a new linear system, and a new solution is computed. This step is repeated until the new solution equals the previous one; the solving step is then performed once more with the new weighting factors, and the final solution is computed.
  • Dividing the original per-view linear equations by the optimal weighting factors found by this iteration yields the position of the final target space point; with this method, the minimized equation conforms to the coordinate error in the photographic sense.
  • Space points computed by this linear-iterative method are highly accurate, convergence is generally reached within a few iterations, and the implementation is simple and the program concise.
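  • A sketch of the iteratively re-weighted solve; the iteration cap and tolerance are illustrative choices:

    import numpy as np

    def triangulate_iterative(points, projections, max_iter=20, tol=1e-10):
        weights = np.ones(len(points))       # initial weighting factors w = 1
        X_prev = None
        for _ in range(max_iter):
            rows = []
            for (u, v), P, w in zip(points, projections, weights):
                rows.append((u * P[2] - P[0]) / w)
                rows.append((v * P[2] - P[1]) / w)
            _, _, vt = np.linalg.svd(np.asarray(rows))
            X = vt[-1]                       # unit-norm homogeneous solution
            if X_prev is not None and X @ X_prev < 0:
                X = -X                       # fix the SVD's sign ambiguity
            if X_prev is not None and np.linalg.norm(X - X_prev) < tol:
                break                        # solution stopped changing
            X_prev = X
            # new weighting factor per view: the projection depth w = P^3 . X
            weights = np.array([P[2] @ X for P in projections])
        return X[:3] / X[3]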
  • Once the same-named point pairs at the two ends of the antenna give the positions of the corresponding endpoints of the space line in the virtual three-dimensional scene, the GPS position and altitude in the real three-dimensional scene are computed from the position of the midpoint of the line joining the two endpoints, and taken to represent the GPS position and altitude of the antenna.
  • For the midpoint (x, y, z), the rotated real-space coordinates (X, Y) follow from the translation-and-scaling matrix obtained in step 102 as X = x·S + T_x, Y = y·S + T_y;
  • back-projecting (X, Y) then yields the GPS value.
  • All photos are taken at the same altitude Z_0, and since the X, Y and Z axes scale by the same ratio, the target's altitude is Z = Z_0 + (z − z_camera)·S.
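  • The conversion can be sketched as follows (names are illustrative; R_trans, S, T_x and T_y come from step 102):

    import numpy as np

    def to_real_world(p_mid, R_trans, S, Tx, Ty, Z0, z_camera):
        # p_mid: midpoint of the two triangulated endpoints, in SFM coordinates
        x, y, z = R_trans @ np.asarray(p_mid)   # rotate into the real frame
        X = x * S + Tx                          # X = x*S + T_x
        Y = y * S + Ty                          # Y = y*S + T_y
        Z = Z0 + (z - z_camera) * S             # equal scaling on all axes
        return X, Y, Z                          # (X, Y) back-projects to GPS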
  • Optionally, after the two points representing the space line are obtained by the method above, their spatial position coordinates can be transformed with the rotation transformation matrix and the translation-and-scaling matrix mentioned above;
  • the attitude angle of the antenna can then be computed directly from the coordinates of the two points in the real three-dimensional environment.
  • Alternatively, coordinate points representing the space line are re-acquired by the trifocal tensor method, and the attitude angle of the antenna is computed from them.
  • Specifically: the position of the same straight line calibrated in three target photos is determined; the position information of the line in the real three-dimensional environment is computed with the trifocal tensor algorithm from those positions and the position information of the photographing devices; and the attitude angle of the antenna is computed from the position information of the line in real space.
  • The trifocal tensor expresses the mutual correspondence among three views, a spatial geometric relation independent of the structure of the measured object. As shown in FIG. 3, the lines l_1, l_2, l_3 are the projections of the space line L in the three photos; the projection matrices of the three photographing devices centered at {C_0, C_1, C_2} are P_1, P_2, P_3, computed from the SFM-estimated intrinsic and extrinsic parameters as P = K[R t]. The matrix W = [l_1^T P_1; l_2^T P_2; l_3^T P_3] is decomposed as [u, s, v] = SVD(W); the homogeneous coordinates of two space points are the last two columns X_a = v(:, 3) and X_b = v(:, 4), and dehomogenizing them gives two points of the space line, from which the attitude angle of the line, i.e. of the antenna, follows.
  • An embodiment of the present invention further provides a pose measurement device for implementing the pose measurement method described above.
  • As shown in FIG. 4, the device includes a processor 42 and a memory 41 storing instructions executable by the processor 42, where:
  • the processor 42 may be a general-purpose processor such as a central processing unit (CPU), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement embodiments of the present invention;
  • the memory 41 is configured to store the program code and transmit it to the CPU. The memory 41 may include volatile memory such as random-access memory (RAM); it may also include non-volatile memory such as read-only memory (ROM), flash memory, a hard disk drive (HDD) or a solid-state drive (SSD); it may also include a combination of the above kinds of memory.
  • A pose measurement device includes a camera, a processor and a memory, the memory storing a pose measurement program; the processor is configured to execute the stored program to perform the following steps:
  • acquiring multiple photos of an object taken at different positions, and the attitude parameters of the photographing device when the photos were taken;
  • constructing a virtual three-dimensional scene from the object photos, and matching the virtual three-dimensional scene to the real three-dimensional environment according to the attitude parameters of the photographing device;
  • calculating the attitude parameters of the object from the position information of the object in the real three-dimensional environment.
  • Optionally, the attitude parameters of the photographing device include its rotation angle information, and the matching includes: obtaining, with the SFM algorithm, a first rotation matrix from the two-dimensional image plane to the virtual three-dimensional scene; obtaining a second rotation matrix from the two-dimensional image plane to the real three-dimensional environment according to the rotation angle information;
  • and calculating a rotation transformation matrix from the virtual three-dimensional scene to the real three-dimensional environment from the first and second rotation matrices.
  • Optionally, the attitude parameters of the photographing device include its positioning information, and the matching further includes:
  • determining a translation-and-scaling transformation matrix from the virtual three-dimensional scene to the real three-dimensional environment according to the positioning information and the device's position in the virtual three-dimensional scene.
  • Optionally, calculating the rotation transformation matrix from the first and second rotation matrices includes: obtaining the rotation transformation matrices computed at the different positions;
  • calculating the averages of the rotation angles of the individual matrices about the three coordinate axes, and reconstructing the final rotation transformation matrix from the averages.
  • Optionally, calculating the attitude parameters of the object from its position information in the real three-dimensional environment includes:
  • determining two same-named point pairs from the same-named straight lines calibrated in two target photos; determining the positions of the two corresponding points in the virtual three-dimensional scene from the pairs;
  • transforming the position of the midpoint of the line joining the two points with the rotation transformation matrix and the translation-and-scaling transformation matrix, and obtaining the positioning value and altitude value of the object in the real three-dimensional environment from the transformed position.
  • Optionally, determining the two same-named point pairs includes: after the calibrated same-named lines are detected, computing, according to the binocular epipolar constraint, the epipolar lines at the two endpoints of the same-named line in one target photo;
  • the intersection of each epipolar line with the same-named line in the other target photo is the same-named point of that endpoint.
  • Optionally, determining the positions of the two points in the virtual three-dimensional scene from the two pairs includes:
  • constructing, from the projection equation from three-dimensional space to the two-dimensional image plane, the linear equations corresponding to each point of a same-named pair;
  • assembling them into a linear system, solving the system by matrix methods, and determining the position of each point in the virtual three-dimensional scene from the solution.
  • Optionally, the linear system is solved iteratively:
  • a new weighting factor for each point is obtained from the previous solution, each point's linear equations are divided by the corresponding factor to give a new linear system, and a new solution is computed. This step is repeated until the new solution equals the previous one; the step is then performed once more, and the final solution computed is the position of the point.
  • Optionally, calculating the attitude parameters of the object from its position information in the real three-dimensional environment includes: determining the position of the same straight line calibrated in three target photos; computing the line's position in the real three-dimensional environment with the trifocal tensor algorithm from those positions and the position information of the photographing device;
  • and calculating the attitude angle of the object from the position information of the line in real space.
  • An embodiment of the invention further provides a computer-readable storage medium storing one or more programs.
  • The computer-readable storage medium may include volatile memory such as random-access memory; it may also include non-volatile memory such as read-only memory, flash memory, a hard disk or a solid-state drive; it may also include a combination of the above categories of memory.
  • The one or more programs in the computer-readable storage medium can be executed by one or more processors to implement the pose measurement method provided in the method embodiments.
  • Embodiments of the present invention can be provided as a method, a system, or a computer program product. Accordingly, the present invention can take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware; moreover, it can take the form of a computer program product embodied on one or more computer-usable storage media (including but not limited to disk storage and optical storage) containing computer-usable program code.
  • The computer program instructions can be stored in a computer-readable memory that directs a computer or other programmable data-processing device to operate in a particular manner, so that the instructions stored in that memory produce an article of manufacture including an instruction apparatus,
  • which implements the functions specified in one or more flows of a flowchart and/or one or more blocks of a block diagram.
  • The computer program instructions can also be loaded onto a computer or other programmable data-processing device, so that a series of operational steps is performed on it to produce computer-implemented processing,
  • the instructions executed there providing steps for implementing the functions specified in one or more flows of a flowchart and/or one or more blocks of a block diagram.
  • The actual pose parameter information of the object is obtained by photographing the object at different positions with a photographing device and analyzing the photos together with the attitude parameters of the device.
  • Remote measurement of the object's pose is thus achieved; the procedure is very simple and convenient, requires no climbing by measuring personnel, and effectively eliminates the danger of the measurement.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Image Analysis (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A pose measurement method, device and storage medium. The pose measurement method includes: acquiring multiple photos of an object taken at different positions, and the attitude parameters of the photographing device when the photos were taken (step 101); constructing a virtual three-dimensional scene from the object photos, and matching the virtual three-dimensional scene to the real three-dimensional environment according to the attitude parameters of the photographing device (step 102); and calculating the attitude parameters of the object from the position information of the object in the real three-dimensional environment (step 103).

Description

Pose measurement method, device and storage medium
Cross-Reference to Related Application
This application is based on, and claims priority to, Chinese patent application No. 201710475557.4, filed on 21 June 2017, the content of which is incorporated herein by reference.
Technical Field
The present invention relates to the field of measurement technology, and in particular to a pose measurement method, device and storage medium.
Background
Object pose measurement has important application value in many fields such as modern communication, national defense and aerospace. For example, in a communication system the measured antenna pose is closely related to the coverage of the base station; during robot assembly, the pose of the robot arm determines the accuracy of the assembly. How to measure the pose of an object has long been a focus of research.
Existing pose measurement methods usually require trained personnel to place measurement instruments on the object being measured. This is feasible for objects that can be reached at close range, but for objects such as antennas and aircraft that must be measured from a distance it is very dangerous, cannot guarantee the personal safety of the measuring personnel, and consumes excessive manpower. A simple and practical pose measurement method is therefore highly desirable.
Summary
Embodiments of the present invention provide a pose measurement method, device and storage medium, to solve the problem that prior-art pose measurement methods are labor-intensive and cannot guarantee the personal safety of the measuring personnel.
Embodiments of the present invention adopt the following technical solutions:
According to an aspect of the embodiments of the present invention, a pose measurement method is provided, including:
acquiring multiple photos of an object taken at different positions, and the attitude parameters of the photographing device when the photos were taken;
constructing a virtual three-dimensional scene from the object photos, and matching the virtual three-dimensional scene to the real three-dimensional environment according to the attitude parameters of the photographing device;
calculating the attitude parameters of the object from the position information of the object in the real three-dimensional environment.
In an optional solution, the attitude parameters of the photographing device include rotation angle information of the device, and matching the virtual three-dimensional scene to the real three-dimensional environment according to the attitude parameters of the photographing device includes:
obtaining a first rotation matrix from the two-dimensional image plane to the virtual three-dimensional scene;
obtaining a second rotation matrix from the two-dimensional image plane to the real three-dimensional environment according to the rotation angle information;
calculating a rotation transformation matrix from the virtual three-dimensional scene to the real three-dimensional environment according to the first rotation matrix and the second rotation matrix.
In an optional solution, the attitude parameters of the photographing device include positioning information of the device, and the matching further includes:
determining a translation-and-scaling transformation matrix from the virtual three-dimensional scene to the real three-dimensional environment according to the positioning information and the position of the photographing device in the virtual three-dimensional scene.
In an optional solution, calculating the rotation transformation matrix from the first rotation matrix and the second rotation matrix includes:
obtaining the rotation transformation matrices computed by the photographing device at different positions;
calculating the averages of the rotation angles of the individual matrices about the three coordinate axes, and reconstructing the final rotation transformation matrix from the averages.
In an optional solution, calculating the attitude parameters of the object from its position information in the real three-dimensional environment includes:
determining two same-named point pairs from the same-named straight lines calibrated in two target photos;
determining the positions of two image points in the virtual three-dimensional scene from the two same-named point pairs;
transforming the position of the midpoint of the line joining the two points with the rotation transformation matrix and the translation-and-scaling transformation matrix, and obtaining the positioning value and altitude value of the object in the real three-dimensional environment from the transformed position.
In an optional solution, determining the two same-named point pairs from the same-named lines calibrated in the two target photos includes:
after the calibrated same-named lines are detected, computing, according to the binocular epipolar constraint, the epipolar lines at the two endpoints of the same-named line in one of the target photos;
the intersection of each epipolar line with the same-named line in the other target photo is the same-named point of that endpoint.
In an optional solution, determining the positions of the two image points in the virtual three-dimensional scene from the two same-named point pairs includes:
constructing, from the projection equation from three-dimensional space to the two-dimensional image plane, the linear equations corresponding to each point of a same-named pair;
assembling the linear equations of the pair into a linear system, solving the system by matrix methods, and determining the position of each point in the virtual three-dimensional scene from the solution.
In an optional solution, the solving of the linear system includes:
obtaining a new weighting factor for each point from the previous solution, dividing each point's linear equations by the corresponding weighting factor to obtain a new linear system, and computing a new solution; this step is repeated until the new solution equals the previous one, after which the step is performed once more, and the final solution computed is the position of the point.
In an optional solution, calculating the attitude parameters of the object from its position information in the real three-dimensional environment includes:
determining the position of the same straight line calibrated in three target photos;
computing the position information of the line in the real three-dimensional environment with the trifocal tensor algorithm, according to those positions and the position information of the photographing devices;
calculating the attitude angle of the object from the position information of the line in real space.
According to an aspect of the embodiments of the present invention, a pose measurement device is provided, including a camera, a processor and a memory, the memory storing a pose measurement program; the processor is configured to execute the program stored in the memory to implement the steps of the pose measurement method described above.
According to an aspect of the embodiments of the present invention, a computer-readable storage medium is provided, storing a pose measurement program which, when executed by a processor, implements the steps of the pose measurement method described above.
The beneficial effects of the embodiments of the present invention are as follows:
The object is photographed at different positions by a photographing device, and the photos are then analyzed together with the attitude parameters of the device, yielding the actual pose parameter information of the object. The embodiments of the invention thus measure the pose of an object remotely; the procedure is simple and convenient, no climbing by measuring personnel is required, and the danger of the measurement is effectively eliminated.
The above is merely an overview of the technical solution of the present invention. To make the technical means of the invention clearer and implementable according to this specification, and to make the above and other objects, features and advantages of the invention more apparent, specific embodiments of the invention are set out below.
Brief Description of the Drawings
To describe the embodiments of the present invention or the prior-art solutions more clearly, the drawings needed in the description of the embodiments are introduced briefly below. The drawings described below are only some embodiments of the invention; those of ordinary skill in the art can derive other drawings from them without inventive effort.
FIG. 1 is a flowchart of the pose measurement method provided by an embodiment of the present invention;
FIGS. 2a and 2b are schematic diagrams of the computed same-named image points in a specific embodiment of the present invention;
FIG. 3 is a flowchart of measuring an antenna with the trifocal tensor in a specific embodiment of the present invention;
FIG. 4 is a schematic block diagram of the pose measurement device provided by an embodiment of the present invention.
Detailed Description
The present invention is described in further detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here merely explain the invention and do not limit it.
Method Embodiment
As shown in FIG. 1, the pose measurement method provided by the embodiment of the present invention includes the following steps:
Step 101: Acquire multiple photos of an object taken at different positions, and the attitude parameters of the photographing device when the photos were taken.
In this step, the object is not limited to an antenna; it may also be a robot, an aircraft, or any other object whose attitude parameters need monitoring. For example, during assembly the attitude parameters of a robot arm are checked to ensure the accuracy of the assembly, or the flight attitude is monitored during an aircraft wind test.
The photographing device here includes still cameras, video cameras, mobile phones, tablet computers and other devices with a shooting function; the type of photographing device is not specifically limited. When the photographing device is movable, it can acquire the attitude parameter information by itself; the shooting positions can be chosen freely, and photos of the object taken at different positions can be obtained. Of course, fixed cameras can also be installed at different positions, and the photos at the corresponding positions collected when needed.
The attitude parameters of the photographing device here include positioning information (for example, Global Positioning System (GPS) information) and the three rotation angles (the rotation angles about the three coordinate axes). When the photographing device is movable, these parameters can be acquired directly through the corresponding sensors; when shooting with cameras at fixed positions, the rotation angles and GPS information can be configured in advance and the configuration read directly when needed.
Step 102: Construct a virtual three-dimensional scene from the object photos, and match the virtual three-dimensional scene to the real three-dimensional environment according to the attitude parameters of the photographing device.
In this step, constructing the virtual three-dimensional scene from the object photos requires feature-point extraction, matching, matrix transformation and similar processing of the photos acquired at the different positions, to build a sparse point cloud of the photos; the SFM (Structure from Motion) algorithm is then used to compute the three-dimensional coordinates of the feature points and build a simple, sparse virtual three-dimensional scene. The construction of a virtual three-dimensional scene with the SFM algorithm is a technique well known to those skilled in the art and is not described further here; a conventional two-view sketch is given below for illustration.
Matching the virtual three-dimensional scene to the real three-dimensional environment according to the attitude parameters of the photographing device mainly means computing the rotation transformation matrix from the device's rotation angle information and the translation-and-scaling matrix from its positioning information. Through the rotation transformation matrix and the translation-and-scaling transformation matrix, the virtual three-dimensional scene can be converted into the real three-dimensional scene.
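The patent does not prescribe a particular SFM implementation. The following minimal two-view sketch (Python with OpenCV; the intrinsic matrix K is assumed known, and all names are illustrative) shows the conventional feature-extraction, matching and triangulation steps such a reconstruction rests on:

    import cv2
    import numpy as np

    def two_view_reconstruction(img1, img2, K):
        # Extract and match feature points between two object photos.
        sift = cv2.SIFT_create()
        k1, d1 = sift.detectAndCompute(img1, None)
        k2, d2 = sift.detectAndCompute(img2, None)
        matches = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True).match(d1, d2)
        p1 = np.float32([k1[m.queryIdx].pt for m in matches])
        p2 = np.float32([k2[m.trainIdx].pt for m in matches])
        # Relative pose from the essential matrix, then triangulation.
        E, _ = cv2.findEssentialMat(p1, p2, K, method=cv2.RANSAC)
        _, R, t, _ = cv2.recoverPose(E, p1, p2, K)
        P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
        P2 = K @ np.hstack([R, t])
        X = cv2.triangulatePoints(P1, P2, p1.T, p2.T)   # 4xN homogeneous
        return (X[:3] / X[3]).T, R, t                   # sparse point cloud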
Step 103: Calculate the attitude parameters of the object from the position information of the object in the real three-dimensional environment.
Step 102 has produced the conversion information from the virtual three-dimensional scene to the real three-dimensional environment; with it, once the user calibrates the positions to be measured in the object photos, the position information of the antenna in real space is obtained, and the attitude parameters of the object can be computed directly from it.
In summary, the pose measurement method provided by the embodiments of the present invention photographs an object from different positions, analyzes the photos together with the attitude parameters of the photographing device, and finally obtains the actual pose parameters of the photographed object through mathematical computation. The embodiments of the invention thus measure the pose of an object remotely without any climbing by measuring personnel, so the operation is simple and convenient, the danger of the measurement is effectively eliminated, and the personal safety of the measuring personnel is ensured.
The technical content of the present invention is described in further detail below with reference to a specific embodiment, in which an antenna is taken as the example and the technical solution is introduced through the pose measurement of the antenna.
Step 101: Acquire multiple photos of the object taken at different positions, and the attitude parameters of the photographing device when the photos were taken.
In this embodiment a mobile phone is used as the measuring device: photos of the antenna are taken with the phone's camera, while the GPS position information and attitude parameters are acquired from the phone's sensors (for example, the GPS sensor, gyroscope, accelerometer and electronic compass). Several antenna photos (any number from 5 to 10) are taken with the phone at different positions. Each time an antenna photo is taken, the attitude parameters of the phone are acquired and recorded; these include the GPS position and the rotation angles about the three coordinate axes.
Step 102: Construct a virtual three-dimensional scene from the object photos, and match the virtual three-dimensional scene to the real three-dimensional environment according to the attitude parameters of the photographing device.
As mentioned above, the SFM algorithm yields a three-dimensional reconstruction of the scene. After the sparse three-dimensional point cloud is built, it must be rotated so that its attitude, orientation and distribution agree as closely as possible with the real three-dimensional environment.
The rotation transformation matrix is computed from the three rotation angles acquired when the photographing device took the photos, in the following steps:
Step 21: Acquire, with the SFM algorithm, the first rotation matrix R_SfM of the photographing device from the two-dimensional image plane to the virtual three-dimensional scene. During reconstruction of the virtual three-dimensional scene, the SFM algorithm gives the rotation matrix in each photographing device's extrinsic parameters (the rotation matrix from the two-dimensional image plane to SFM space); this rotation matrix is the first rotation matrix, through which the attitude of the photographing device in the virtual three-dimensional space can be known from the two-dimensional image plane.
Step 22: Acquire the second rotation matrix of the photographing device, from the two-dimensional image plane to the real three-dimensional environment, according to the rotation angle information of the photographing device.
In this step, the rotation matrix of the photographing device in the real three-dimensional environment can be obtained from its three rotation angles. Here, the following formula is used to compute the attitude of the phone's own coordinate system with respect to the real three-dimensional environment (the geodetic coordinate system):
R_camera = R_phone · R_x(180) · R_z(−90) (1)
where R_x(180) denotes a 180-degree counter-clockwise rotation about the x axis of the phone's own coordinate system, R_z(−90) denotes a subsequent 90-degree clockwise rotation about the z axis, and R_phone is the rotation built from the three rotation angles recorded by the photographing device.
Step 23: Calculate the rotation transformation matrix from the virtual three-dimensional scene to the real three-dimensional environment according to the first rotation matrix and the second rotation matrix.
Steps 21 and 22 give, respectively, the rotation matrix R_SfM of the photographing device in SFM space (the virtual three-dimensional scene) and the rotation matrix R_camera of the photographing device relative to the real three-dimensional environment; the rotation transformation matrix R_trans is then obtained from formula (2):
R_trans = R_camera · (R_SfM)^(−1) (2)
In an optional embodiment, because the photographing device photographs the object at several different positions, a rotation transformation matrix is obtained for each position. To ensure the accuracy of the rotation transformation matrix, the embodiment of the invention averages the rotation transformation matrices obtained at the different positions to produce the final rotation transformation matrix: the averages of the rotation angles of the individual matrices about the three coordinate axes are computed, and the final matrix is reconstructed from the averages.
In this embodiment, each rotation transformation matrix is decomposed into three rotation angles R_x, R_y, R_z; the angles are averaged per axis, and the final rotation transformation matrix is rebuilt from the three averaged angles. The conversion between a matrix and rotation angles is well known to those skilled in the art and is not detailed here. After the rotation transformation, the coordinate system of real space has its Y axis pointing true north, its X axis pointing true east, and its Z axis pointing vertically from the center of the earth towards the sky; the side of a building is then seen to be nearly perpendicular to the Z axis, indicating a fairly accurate conversion.
It can be seen that decomposing the rotation transformation matrices into rotation angles and then inverse-transforming the averaged angles into the final rotation transformation matrix effectively reduces the final computation error and improves the accuracy of the subsequent attitude parameters.
In an optional embodiment, to make the size and position of the rotated sparse point cloud agree with the real world and preserve the accuracy of the GPS, the position of the point cloud must be translated and scaled. When computing the conversion parameters, the scaling is by the same ratio on every coordinate axis, so the values in the Z-axis direction can be ignored; only the numerical computation for the X axis and the Y axis is explained here.
Let the converted position of the photographing device, in meters, be (X_i, Y_i) (i.e. the position of the device in the real three-dimensional environment, obtainable from its GPS information), and let its position in the three-dimensional reconstruction space likewise be (x_i, y_i). With scaling coefficient S and translation vector (T_x, T_y), the following linear system can be set up:
[x_1 1 0; y_1 0 1; ...; x_n 1 0; y_n 0 1] · [S; T_x; T_y] = [X_1; Y_1; ...; X_n; Y_n]
where n is the number of object photos. The system is overdetermined; it can be solved with the least-squares method, QR decomposition, singular value decomposition (SVD) or other common methods of matrix theory, giving the target translation and scaling parameters. The solution of such a system is well known to those skilled in the art and is not described further here.
Because the best GPS accuracy when photographing the antenna is only about 3 m, an individual GPS reading is very imprecise; recomputing the target's GPS from all the collected GPS information as above yields a very accurate GPS value.
Besides this way of computing the translation and scaling parameters, an interface can be provided for the user to input the scaling parameter: the user supplies the average moving distance between every two antenna photos, and this is divided by the average distance between neighboring photographing devices in the virtual three-dimensional scene to give the scaling parameter. Normally the user's average moving distance is between 0.3 and 1 meter; once the scaling parameter is known, the translation parameters can still be solved with the system above. The reason for this method is the low accuracy of GPS positioning: if, for example, the user takes 5 photos and moves 0.5 m between adjacent ones, the movement spans only 2.5 m in total, while GPS is often only accurate to about 3 m, so the user may well move while the GPS readings cluster too densely.
Step 103: Calculate the attitude parameters of the object from the position information of the object in the real three-dimensional environment.
The antenna attitude parameters here include the attitude angle of the antenna as well as its GPS and altitude (mounting height) information. The acquisition of the GPS and altitude information is introduced first.
After the SFM algorithm has run, the intrinsic and extrinsic parameters of the photographing devices are available. On this basis, same-named image points representing the antenna position are needed to compute the three-dimensional spatial position of the antenna target, specifically as follows:
two same-named point pairs are determined from the same-named straight lines calibrated in two target photos;
the positions of the two corresponding points in the virtual three-dimensional scene are determined from the two same-named point pairs;
the position of the midpoint of the line joining the two points is transformed with the rotation transformation matrix and the translation-and-scaling transformation matrix, and the GPS positioning value and altitude of the antenna in the real three-dimensional environment are obtained from the transformed position.
Here, after the antenna photos are taken, the user selects two of them as target photos and calibrates in them the same-named straight lines indicating the antenna position; the final GPS and altitude values are determined from the information of these lines.
Same-named image points are obtained from the user-entered same-named line pair together with the binocular epipolar constraint, as follows:
after the calibrated same-named lines in the target photos are detected, the epipolar lines at the two endpoints of the same-named line in one target photo are computed according to the binocular epipolar constraint;
the intersection of each epipolar line with the same-named line in the other target photo is the same-named point of that endpoint.
For example, as shown in FIGS. 2a and 2b, the user selects two photos as target photos, frames the antenna target in each, and selects within it the same-named line pair (l, l′) representing the antenna. According to the binocular epipolar constraint, the epipolar line l_a at the endpoint a of the same-named line l in FIG. 2a is computed; this epipolar line intersects the same-named line l′ of FIG. 2b at the point a′, giving the same-named point pair (a, a′), and the same-named point pair for the other endpoint is obtained in the same way.
From the above, although this way of obtaining same-named points increases the user's workload, current line-matching algorithms cannot precisely match corresponding straight lines, and the method of this embodiment of the invention guarantees both the accuracy of the line matching and the precision of the same-named points obtained.
After the two same-named point pairs in the two selected target photos are obtained, the three-dimensional position coordinates of the two corresponding points must be determined from the pairs' positions in the photos. Optionally, the embodiment of the invention proceeds as follows:
from the projection equation from three-dimensional space to the two-dimensional image plane, the linear equations corresponding to each point of a same-named pair are constructed;
the linear equations of the pair are assembled into a linear system, the system is solved by matrix methods, and the positions of the points in the virtual three-dimensional scene are determined from the result.
Each image point here corresponds to one shooting angle, i.e. the shooting angles of the two target photos (different shooting positions) mentioned above. Specifically, the projection from three-dimensional space to the two-dimensional image plane is x = PX, where x is a homogeneous coordinate with x = w(u, v, 1), (u, v) is the coordinate point on the photo (the two-dimensional image-plane coordinate), and w is the projection depth, i.e. the scale factor. X is the position coordinate in three-dimensional space of the point to be computed; to allow matrix computation it is written homogeneously as the four-vector X = (x, y, z, 1). P is the projection matrix (available from the SFM algorithm), whose i-th row is written P^i; the projection equation can then be rewritten as
w·u = P^1 X, w·v = P^2 X, w = P^3 X
Eliminating the parameter w gives the following two linear equations:
u·(P^3 X) − P^1 X = 0, v·(P^3 X) − P^2 X = 0
Computing the value of X requires same-named image points and projection matrices from at least two views. The formulas above are the two linear equations obtained for a single view; with two views, four linear equations are obtained, and after moving the right-hand terms to the left they take the form AX = 0.
Here A is a 4×4 matrix. Normally, because of noise, the equations cannot hold exactly; instead one finds the X that minimizes ||AX|| subject to the constraint ||X|| = 1. The optimal solution of this problem is the eigenvector corresponding to the smallest eigenvalue of the matrix A^T A, and general methods such as SVD or QR decomposition can be used to solve for X.
The computed X is a homogeneous vector; dividing each of its elements by the last, the first three elements of the result represent the final target position. That is, for the solution X = (a, b, c, d), the homogeneous coordinate is X1 = (a/d, b/d, c/d, d/d) and the final target coordinate is T = (a/d, b/d, c/d).
It should be noted that the solution principle above is explained for two views (two target photos); with more views, systems of 6, 8, ... linear equations can be built, A becomes an overdetermined matrix, and the system can likewise be solved by least squares, singular value decomposition (SVD), QR methods and so on.
In an optional embodiment, one problem with the linear method is that the minimized ||AX|| has no geometric meaning and does not match the target function to be minimized; moreover, multiplying each row of A by a weighting factor changes the result. To solve this, an embodiment of the invention continually changes the weighting factors of the linear equations by a linear-iterative method, so that the weighted equations agree with the error of the photo coordinate points.
For example, the space point X computed at first does not satisfy the linear equations exactly, and the residual is AX. What one wants to minimize is the distance between the real image point x and the projection point PX of X. This means that if the linear equations are divided by the weighting factor w = P^3 X, the final error is the minimization in the photo sense.
To minimize this error, the embodiment of the invention uses an iterative method: a new weighting factor for each view is obtained from the previous solution result; the linear equations of each view are divided by the corresponding weighting factor to give a new linear system; a new solution is computed; this step is repeated until the new solution equals the previous result; the solving step of the linear system is then repeated once with the new weighting factors, and the final solution is computed.
Specifically, set the initial weighting factors w_0 = w_0′ = 1, solve the linear system above, and let the solution be the initial solution X_0. The equations of the first view are divided by the weighting factor w_i = P^3 X_{i−1}, and likewise the equations of the second view by w_i′ = P′^3 X_{i−1}, giving a new linear system whose solution is X_i, where X_{i−1} is the previously computed result. This step is repeated until convergence, X_i = X_{i−1}, with w_i = P^3 X_i; the error at that point is the minimized photo error, the desired error.
Dividing the original linear equations of the individual views by the optimal weighting factors obtained through the iteration, the resulting solution is the final position of the target space point. With this method, the minimized equation conforms to the coordinate error in the photo sense. Space points computed by this linear-iterative method are highly accurate, convergence is generally reached within a few iterations, and the implementation is simple and the program concise.
Once the same-named point pairs at the two ends of the antenna give the positions of the corresponding endpoints of the space line in the virtual three-dimensional scene, the GPS position and altitude in the real three-dimensional scene are computed from the position coordinates of the midpoint of the line joining the two endpoints, and taken to represent the GPS position and altitude of the antenna. The method is as follows:
for the three-dimensional point (x, y) at the midpoint of the line, its rotated real-space coordinates (X, Y) follow from the translation-and-scaling matrix obtained in step 102 as:
X = x·S + T_x, Y = y·S + T_y
Back-projecting (X, Y) gives the GPS value. All photos are taken at the same altitude Z_0, and by the equal scaling of the X, Y and Z axes, the altitude of the target is:
Z = Z_0 + (z − z_camera)·S
The process of computing the attitude angle of the antenna from the positions of the same straight line calibrated in different antenna photos is described next.
Optionally, after the two points representing the space line are obtained by the method above, their spatial position coordinates can be transformed with the rotation transformation matrix and the translation-and-scaling matrix mentioned above, and the attitude angle of the antenna can be computed directly from the coordinates of the two points in the real three-dimensional environment, as sketched below.
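That direct computation can be sketched as follows, assuming the real frame used above (X east, Y north, Z up) and measuring azimuth clockwise from true north; the exact angle conventions for a given antenna are an assumption:

    import numpy as np

    def attitude_angles(p_top, p_bottom):
        # Direction of the space line through the two transformed endpoints.
        d = np.asarray(p_top) - np.asarray(p_bottom)
        azimuth = np.degrees(np.arctan2(d[0], d[1])) % 360.0   # atan2(east, north)
        elevation = np.degrees(np.arctan2(d[2], np.hypot(d[0], d[1])))
        return azimuth, elevation       # downtilt is the negated elevation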
Optionally, the coordinate points representing the space line are instead re-acquired by the trifocal tensor method, and the attitude angle of the antenna is computed from them. Specifically: determine the position of the same straight line calibrated in three target photos; compute the position information of the line in the real three-dimensional environment with the trifocal tensor algorithm, from those positions and the position information of the photographing devices; compute the attitude angle of the antenna from the position information of the line in real space.
The trifocal tensor expresses the mutual correspondence among three views, a spatial geometric relation that does not depend on the structure of the measured object itself. As shown in FIG. 3, the lines l_1, l_2, l_3 are the projections of the space line L onto the three photos, and the projection matrices of the three photographing devices centered at {C_0, C_1, C_2} are P_1, P_2, P_3, computed from the intrinsic and extrinsic parameters estimated by the SFM algorithm as P = K[R t]. The trifocal tensor matrix is built as:
W = [l_1^T P_1; l_2^T P_2; l_3^T P_3]
Decomposing W by singular value decomposition, [u, s, v] = SVD(W), the homogeneous coordinates of two space points correspond to the last two four-dimensional columns of v, X_a = v(:, 3) and X_b = v(:, 4); dehomogenizing these two four-vectors gives two points of the space line, which also represent the position of the calibrated line in space. The attitude angle of the line, i.e. of the antenna (azimuth and downtilt), then follows by simple mathematics. Computing an attitude angle from space points is well known to those skilled in the art and is not repeated here.
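A sketch of this decomposition (NumPy; the image lines are homogeneous 3-vectors, and since numpy returns V transposed, MATLAB's v(:,3) and v(:,4) are the last two rows of vt):

    import numpy as np

    def line_from_three_views(lines, projections):
        # lines: l_1, l_2, l_3 in the three photos; projections: P_1, P_2, P_3.
        # Each row l_i^T P_i is a plane through the space line L, so the
        # two right singular vectors for the smallest singular values of W
        # are homogeneous points spanning L.
        W = np.vstack([l @ P for l, P in zip(lines, projections)])   # 3x4
        _, _, vt = np.linalg.svd(W)
        Xa, Xb = vt[-2], vt[-1]
        return Xa[:3] / Xa[3], Xb[:3] / Xb[3]   # two points of the space line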
Device Embodiment
An embodiment of the present invention further provides a pose measurement device for implementing the pose measurement method above. As shown in FIG. 4, the device includes a processor 42 and a memory 41 storing instructions executable by the processor 42, where:
the processor 42 may be a general-purpose processor such as a central processing unit (CPU), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present invention;
the memory 41 is configured to store the program code and transmit it to the CPU; the memory 41 may include volatile memory such as random-access memory (RAM), non-volatile memory such as read-only memory (ROM), flash memory, a hard disk drive (HDD) or a solid-state drive (SSD), or a combination of these kinds of memory.
A pose measurement device provided by an embodiment of the present invention includes a camera, a processor and a memory, the memory storing a pose measurement program; the processor executes the pose measurement program stored in the memory to perform the following steps:
acquiring multiple photos of an object taken at different positions, and the attitude parameters of the photographing device when the photos were taken;
constructing a virtual three-dimensional scene from the object photos, and matching the virtual three-dimensional scene to the real three-dimensional environment according to the attitude parameters of the photographing device;
calculating the attitude parameters of the object from the position information of the object in the real three-dimensional environment.
Optionally, the attitude parameters of the photographing device include rotation angle information of the device, and the matching includes:
obtaining, with the SFM algorithm, a first rotation matrix from the two-dimensional image plane to the virtual three-dimensional scene;
obtaining a second rotation matrix from the two-dimensional image plane to the real three-dimensional environment according to the rotation angle information;
calculating a rotation transformation matrix from the virtual three-dimensional scene to the real three-dimensional environment from the first and second rotation matrices.
Optionally, the attitude parameters of the photographing device include positioning information of the device, and the matching further includes:
determining a translation-and-scaling transformation matrix from the virtual three-dimensional scene to the real three-dimensional environment according to the positioning information and the position of the device in the virtual three-dimensional scene.
Optionally, calculating the rotation transformation matrix from the first and second rotation matrices includes:
obtaining the rotation transformation matrices computed by the photographing device at different positions;
calculating the averages of the rotation angles of the individual matrices about the three coordinate axes, and reconstructing the final rotation transformation matrix from the averages.
Optionally, calculating the attitude parameters of the object from its position information in the real three-dimensional environment includes:
determining two same-named point pairs from the same-named straight lines calibrated in two target photos;
determining the positions of two image points in the virtual three-dimensional scene from the two pairs;
transforming the position of the midpoint of the line joining the two points with the rotation transformation matrix and the translation-and-scaling transformation matrix, and obtaining the positioning value and altitude value of the object in the real three-dimensional environment from the transformed position.
Optionally, determining the two same-named point pairs from the calibrated same-named lines includes:
after the calibrated same-named lines are detected, computing, according to the binocular epipolar constraint, the epipolar lines at the two endpoints of the same-named line in one target photo;
the intersection of each epipolar line with the same-named line in the other target photo is the same-named point of that endpoint.
Optionally, determining the positions of the two points in the virtual three-dimensional scene from the two pairs includes:
constructing, from the projection equation from three-dimensional space to the two-dimensional image plane, the linear equations corresponding to each point of a same-named pair;
assembling them into a linear system, solving the system by matrix methods, and determining the position of each point in the virtual three-dimensional scene from the solution.
Optionally, the solving of the linear system includes:
obtaining a new weighting factor for each point from the previous solution, dividing each point's linear equations by the corresponding factor to give a new linear system, and computing a new solution; this step is repeated until the new solution equals the previous one, after which the step is performed once more, and the final solution computed is the position of the point.
Optionally, calculating the attitude parameters of the object from its position information in the real three-dimensional environment includes:
determining the position of the same straight line calibrated in three target photos;
computing the position information of the line in the real three-dimensional environment with the trifocal tensor algorithm from those positions and the position information of the photographing device;
calculating the attitude angle of the object from the position information of the line in real space.
Storage Medium Embodiment
An embodiment of the present invention further provides a computer-readable storage medium storing one or more programs. The computer-readable storage medium may include volatile memory such as random-access memory; it may also include non-volatile memory such as read-only memory, flash memory, a hard disk or a solid-state drive; it may also include a combination of these kinds of memory. The one or more programs in the computer-readable storage medium can be executed by one or more processors to implement the pose measurement method provided in the method embodiment.
Those of ordinary skill in the art will understand that all or part of the flows of the methods of the embodiments above can be implemented by a computer program instructing the relevant hardware; the program can be stored in a computer-readable storage medium and, when executed, can include the flows of the embodiments of the methods above.
Although the present application has been described through embodiments, those skilled in the art will know that it admits many variations and changes without departing from the spirit and scope of the invention; if these modifications and variations fall within the scope of the claims of the invention and their equivalents, the invention is intended to cover them as well.
Those skilled in the art will understand that embodiments of the present invention can be provided as a method, a system or a computer program product. Accordingly, the invention can take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware aspects; moreover, it can take the form of a computer program product embodied on one or more computer-usable storage media (including but not limited to disk storage and optical storage) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of the methods, devices (systems) and computer program products of its embodiments. It should be understood that every flow and/or block of the flowcharts and/or block diagrams, and combinations of them, can be implemented by computer program instructions. These instructions can be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor or another programmable data-processing device to produce a machine, such that the instructions executed by the processor produce an apparatus for implementing the functions specified in one or more flows of a flowchart and/or one or more blocks of a block diagram.
These computer program instructions can also be stored in a computer-readable memory that can direct a computer or other programmable data-processing device to work in a particular manner, such that the instructions stored in that memory produce an article of manufacture including an instruction apparatus which implements the functions specified in one or more flows of a flowchart and/or one or more blocks of a block diagram.
These computer program instructions can also be loaded onto a computer or other programmable data-processing device, so that a series of operational steps is performed on it to produce computer-implemented processing, the instructions executed there providing steps for implementing the functions specified in one or more flows of a flowchart and/or one or more blocks of a block diagram.
The above are only preferred embodiments of the present invention and are not intended to limit its scope of protection.
Industrial Applicability
In the embodiments of the present invention, an object is photographed at different positions by a photographing device, and the photos are analyzed together with the attitude parameters of the device, yielding the actual pose parameter information of the object. Remote measurement of the object's pose is thus achieved; the procedure is very simple and convenient, requires no climbing by measuring personnel, and effectively eliminates the danger of the measurement.

Claims (11)

  1. A pose measurement method, comprising:
    acquiring multiple photos of an object taken at different positions, and attitude parameters of a photographing device at the times the photos were taken;
    constructing a virtual three-dimensional scene from the object photos, and matching the virtual three-dimensional scene to a real three-dimensional environment according to the attitude parameters of the photographing device;
    calculating attitude parameters of the object according to position information of the object in the real three-dimensional environment.
  2. The method of claim 1, wherein the attitude parameters of the photographing device comprise rotation angle information of the photographing device;
    matching the virtual three-dimensional scene to the real three-dimensional environment according to the attitude parameters of the photographing device comprises:
    obtaining a first rotation matrix from the two-dimensional image plane to the virtual three-dimensional scene;
    obtaining a second rotation matrix from the two-dimensional image plane to the real three-dimensional environment according to the rotation angle information;
    calculating a rotation transformation matrix from the virtual three-dimensional scene to the real three-dimensional environment according to the first rotation matrix and the second rotation matrix.
  3. The method of claim 2, wherein the attitude parameters of the photographing device comprise positioning information of the photographing device;
    matching the virtual three-dimensional scene to the real three-dimensional environment according to the attitude parameters of the photographing device further comprises:
    determining a translation-and-scaling transformation matrix from the virtual three-dimensional scene to the real three-dimensional environment according to the positioning information and the position of the photographing device in the virtual three-dimensional scene.
  4. The method of claim 2, wherein calculating the rotation transformation matrix from the virtual three-dimensional scene to the real three-dimensional environment according to the first rotation matrix and the second rotation matrix comprises:
    obtaining rotation transformation matrices computed by the photographing device at different positions;
    calculating averages of the rotation angles of the individual rotation transformation matrices about the three coordinate axes, and obtaining from the averages the rotation transformation matrix from the virtual three-dimensional scene to the real three-dimensional environment.
  5. The method of claim 3, wherein calculating the attitude parameters of the object according to the position information of the object in the real three-dimensional environment comprises:
    determining two same-named point pairs according to same-named straight lines calibrated in two target photos, the target photos being any two of the multiple object photos;
    determining positions of two image points in the virtual three-dimensional scene based on the two same-named point pairs;
    transforming the position of the midpoint of the line joining the two points with the rotation transformation matrix and the translation-and-scaling transformation matrix, and obtaining a positioning value and an altitude value of the object in the real three-dimensional environment from the transformed position.
  6. The method of claim 5, wherein determining the two same-named point pairs according to the same-named lines calibrated in the two target photos comprises:
    after the calibrated same-named lines in the target photos are detected, computing, according to the binocular epipolar constraint, epipolar lines at the two endpoints of the same-named line in one of the target photos;
    the intersection of each epipolar line with the same-named line in the other target photo being the same-named point of that endpoint.
  7. The method of claim 5 or 6, wherein determining the positions of the two image points in the virtual three-dimensional scene based on the two same-named point pairs comprises:
    constructing, from the projection equation from three-dimensional space to the two-dimensional image plane, linear equations corresponding to each point of a same-named pair;
    assembling the linear equations of the pair into a linear system, solving the system by matrix methods, and determining the positions of the points in the virtual three-dimensional scene from the result.
  8. The method of claim 7, wherein assembling the linear equations into a linear system, solving it by matrix methods and determining the positions of the points from the result comprises:
    obtaining a new weighting factor for each point from the previous result, dividing the linear equations of each point by the corresponding weighting factor to obtain a new linear system, and computing a new solution; repeating this step until the new solution equals the previous one; then performing the step once more, the final solution computed being the position of the point.
  9. The method of claim 1, wherein calculating the attitude parameters of the object according to the position information of the object in the real three-dimensional environment comprises:
    determining the position of the same straight line calibrated in three target photos, the three target photos being any three of the multiple object photos;
    computing position information of the line in the real three-dimensional environment with the trifocal tensor algorithm according to those positions and position information of the photographing device;
    calculating the attitude angle of the object according to the position information of the line in real space.
  10. A pose measurement device, comprising a camera, a processor and a memory, wherein the memory stores a pose measurement program, and the processor is configured to execute the program stored in the memory to implement the steps of the pose measurement method of any one of claims 1 to 9.
  11. A computer-readable storage medium, storing a pose measurement program which, when executed by a processor, implements the steps of the pose measurement method of any one of claims 1 to 9.
PCT/CN2018/090821 2017-06-21 2018-06-12 Pose measurement method, device and storage medium WO2018233514A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710475557.4A CN109099888A (zh) 2017-06-21 2017-06-21 Pose measurement method, device and storage medium
CN201710475557.4 2017-06-21

Publications (1)

Publication Number Publication Date
WO2018233514A1 true WO2018233514A1 (zh) 2018-12-27

Family

ID=64735483

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/090821 WO2018233514A1 (zh) 2017-06-21 2018-06-12 Pose measurement method, device and storage medium

Country Status (2)

Country Link
CN (1) CN109099888A (zh)
WO (1) WO2018233514A1 (zh)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110296686B (zh) * 2019-05-21 2021-11-09 北京百度网讯科技有限公司 Vision-based positioning method, apparatus and device
CN112815923B (zh) * 2019-11-15 2022-12-30 华为技术有限公司 Visual positioning method and apparatus
CN113781548B (zh) * 2020-06-10 2024-06-14 华为技术有限公司 Multi-device pose measurement method, electronic device and system
CN114274147B (zh) * 2022-02-10 2023-09-22 北京航空航天大学杭州创新研究院 Target tracking control method and apparatus, robot arm control device and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102157011A (zh) * 2010-12-10 2011-08-17 北京大学 Method for dynamic texture acquisition and virtual-real fusion using a mobile photographing device
CN103217147A (zh) * 2012-01-19 2013-07-24 株式会社东芝 Measuring device and measuring method
CN103245337A (zh) * 2012-02-14 2013-08-14 联想(北京)有限公司 Method for obtaining the position of a mobile terminal, mobile terminal and position detection system
CN103759716A (zh) * 2014-01-14 2014-04-30 清华大学 Dynamic target position and attitude measurement method based on monocular vision at the end of a robot arm
US20150029345A1 (en) * 2012-01-23 2015-01-29 Nec Corporation Camera calibration device, camera calibration method, and camera calibration program
CN106296801A (zh) * 2015-06-12 2017-01-04 联想(北京)有限公司 Method for establishing a three-dimensional image model of an object, and electronic device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6369534B2 (ja) * 2014-03-05 2018-08-08 コニカミノルタ株式会社 Image processing apparatus, image processing method, and image processing program
CN106569591A (zh) * 2015-10-26 2017-04-19 苏州梦想人软件科技有限公司 Tracking method and tracking system based on computer-vision tracking and sensor tracking
CN105528082B (zh) * 2016-01-08 2018-11-06 北京暴风魔镜科技有限公司 Interaction method, apparatus and system for three-dimensional space and gesture recognition tracking
CN106651942B (зh) * 2016-09-29 2019-09-17 苏州中科广视文化科技有限公司 Feature-point-based three-dimensional rotational motion detection and rotation-axis positioning method

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102157011A (zh) * 2010-12-10 2011-08-17 北京大学 Method for dynamic texture acquisition and virtual-real fusion using a mobile photographing device
CN103217147A (zh) * 2012-01-19 2013-07-24 株式会社东芝 Measuring device and measuring method
US20150029345A1 (en) * 2012-01-23 2015-01-29 Nec Corporation Camera calibration device, camera calibration method, and camera calibration program
CN103245337A (zh) * 2012-02-14 2013-08-14 联想(北京)有限公司 Method for obtaining the position of a mobile terminal, mobile terminal and position detection system
CN103759716A (zh) * 2014-01-14 2014-04-30 清华大学 Dynamic target position and attitude measurement method based on monocular vision at the end of a robot arm
CN106296801A (zh) * 2015-06-12 2017-01-04 联想(北京)有限公司 Method for establishing a three-dimensional image model of an object, and electronic device

Also Published As

Publication number Publication date
CN109099888A (zh) 2018-12-28

Similar Documents

Publication Publication Date Title
WO2021212844A1 (zh) Point cloud stitching method, apparatus, device and storage device
EP3309751B1 (en) Image processing device, method, and program
US9466143B1 (en) Geoaccurate three-dimensional reconstruction via image-based geometry
CN108592950B (zh) Method for calibrating the relative mounting angle between a monocular camera and an inertial measurement unit
WO2018233514A1 (zh) Pose measurement method, device and storage medium
CN112183171B (zh) Method and apparatus for building a beacon map based on visual beacons
CN106530358A (zh) Method for calibrating a PTZ camera using only two scene images
CN110969665B (zh) Extrinsic parameter calibration method, apparatus, system and robot
CN112288853B (зh) Three-dimensional reconstruction method, three-dimensional reconstruction apparatus, and storage medium
JP2013539147A5 (zh)
IL214151A (en) Method and device for restoring 3D character
CN113048980B (zh) Pose optimization method and apparatus, electronic device and storage medium
CN109949232B (zh) Measurement method combining image and RTK, system, electronic device and medium
CN113820735A (zh) Method for determining position information, position measurement device, terminal and storage medium
JP7114686B2 (ja) Augmented reality device and positioning method
CN115423863B (zh) Camera pose estimation method, apparatus and computer-readable storage medium
Duran et al. Accuracy comparison of interior orientation parameters from different photogrammetric software and direct linear transformation method
JP6928217B1 (ja) Measurement processing apparatus, method and program
Tjahjadi et al. Single image orientation of UAV's imagery using orthogonal projection model
El-Ashmawy A comparison study between collinearity condition, coplanarity condition, and direct linear transformation (DLT) method for camera exterior orientation parameters determination
KR20210009019A (ko) System for determining the position and attitude of a camera using vector inner products and three-dimensional coordinate transformation
Bakuła et al. Capabilities of a smartphone for georeferenced 3dmodel creation: An evaluation
CN108764161B (zh) Remote-sensing image processing method and apparatus for resolving the ill-conditioned singularity caused by sparse arrays, based on a polar coordinate system
Hussein et al. Evaluation of the accuracy of direct georeferencing of smartphones for use in some urban planning applications within smart cities
CN113223163A (zh) Point-cloud map construction method and apparatus, device, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18821112

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18821112

Country of ref document: EP

Kind code of ref document: A1