CN110414101A - Simulation scene measurement method, accuracy measurement method and system - Google Patents

Simulation scene measurement method, accuracy measurement method and system

Info

Publication number
CN110414101A
CN110414101A
Authority
CN
China
Prior art keywords
simulating scenes
eye
point
optical center
binocular camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910637565.3A
Other languages
Chinese (zh)
Other versions
CN110414101B (English)
Inventor
吴程程
吕毅
许澍虹
薛阳
成天壮
武玉芬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Commercial Aircraft Corp of China Ltd
Beijing Aeronautic Science and Technology Research Institute of COMAC
Original Assignee
Commercial Aircraft Corp of China Ltd
Beijing Aeronautic Science and Technology Research Institute of COMAC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Commercial Aircraft Corp of China Ltd and Beijing Aeronautic Science and Technology Research Institute of COMAC
Priority claimed from application CN201910637565.3A
Publication of CN110414101A
Application granted
Publication of CN110414101B
Legal status: Active
Anticipated expiration

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a simulation scene measurement method, comprising: placing a binocular camera, imitating the human eyes, behind a pair of 3D glasses; calibrating the optical center positions of the binocular camera; taking the optical center positions as the eye-point positions; performing real-time rendering of the VR system scene according to the eye-point positions to obtain a simulation scene; and measuring the simulation scene according to the calibration parameters of the binocular camera and the acquired images. Compared with traditional inspection by human observation, the method relies on quantitative analysis and is therefore more objective and more accurate; the whole measurement process involves no human subjective judgment and can be run automatically; physical objects can be dispensed with entirely and virtual objects measured on their own, widening the range of application; and any point in three-dimensional space can be measured by the method, so the results are comprehensive.

Description

Simulation scene measurement method, accuracy measurement method and system
Technical field
The present invention relates to the field of analog simulation, and in particular to a simulation scene measurement method, an accuracy measurement method, and corresponding systems.
Background technique
At present, the common inspection method in the analog simulation industry is to compare a physical object existing in the system with the identical virtual model produced by system simulation, and to measure the geometric simulation accuracy of the system indirectly through this virtual-real comparison.
The existing technical solutions have the following disadvantages:
1) The operation is complicated and cumbersome. The prior art compares an actual object with its virtual counterpart, so the measurement range is limited and the geometric simulation accuracy of the virtual scene can only be measured indirectly;
2) The result accuracy is low. The virtual-real comparison result is obtained by human observation, so quantitative analysis is impossible.
Summary of the invention
(1) goal of the invention
The object of the present invention is to provide a simulation scene measurement method, an accuracy measurement method, and corresponding systems, so as to solve the problems that existing virtual simulation inspection techniques are complicated and cumbersome to operate and yield results of low accuracy.
(2) technical solution
To solve the above problems, a first aspect of the present invention provides a simulation scene measurement method, comprising: placing a binocular camera, imitating the human eyes, behind a pair of 3D glasses; calibrating the optical center positions of the binocular camera and taking the optical center positions as the eye-point positions; performing real-time rendering of the VR system scene according to the eye-point positions to obtain a simulation scene; and measuring the simulation scene according to the calibration parameters of the binocular camera and the acquired images.
Further, the optical center position includes an initial optical center position and a real-time tracked optical center position.
Further, performing real-time rendering of the VR system scene according to the eye-point positions to obtain a simulation scene specifically includes: presetting a virtual space point to be measured; and performing stereoscopic rendering of the virtual space point according to the eye-point positions, displaying a left-eye image and a right-eye image on the screen to form the simulation scene.
Further, measuring the simulation scene according to the calibration parameters of the binocular camera and the acquired images specifically includes: acquiring the left-eye image and the right-eye image through the corresponding lenses of the 3D glasses using the binocular camera; and calculating the physical-world coordinate value of the spatial point from the left-eye image, the right-eye image, and the calibration parameters of the cameras according to a stereoscopic vision algorithm.
According to another aspect of the present invention, a simulation scene rendering accuracy measurement method is provided, comprising:
moving the position of the binocular camera;
executing the steps of the simulation scene measurement method of any of the above technical solutions at each of a plurality of positions, obtaining a plurality of measurement results; and
comparing the deviation between the positions of the plurality of measurement results, and evaluating the simulation scene rendering accuracy through the deviation.
According to another aspect of the present invention, a simulation scene measurement system is provided, comprising:
a binocular camera for imitating the human eyes to acquire a left-eye image and a right-eye image;
an optical center locating module for calibrating the optical center positions of the binocular camera and taking the optical center positions as the eye-point positions;
a scene rendering module for rendering the VR system scene according to the eye-point positions to obtain a simulation scene; and
a simulation scene measurement module for measuring the simulation scene according to the calibration parameters of the binocular camera and the acquired images.
Further, the optical center position includes an initial optical center position and a real-time optical center position.
Further, the scene rendering module includes:
a spatial point simulation module for presetting a virtual space point to be measured; and
a stereoscopic rendering module for performing stereoscopic rendering of the virtual space point according to the eye-point positions, displaying a left-eye image and a right-eye image on the screen to form the simulation scene.
Further, when the stereoscopic rendering module performs stereoscopic rendering of the virtual space point according to the eye-point positions and displays the left-eye image and the right-eye image on the screen to form the simulation scene, the specific execution steps include:
acquiring the left-eye image and the right-eye image through the corresponding lenses of the 3D glasses using the binocular camera; and
calculating the physical-world coordinate value of the spatial point from the left-eye image, the right-eye image, and the calibration parameters of the cameras according to a stereoscopic vision algorithm.
According to another aspect of the present invention, a simulation scene rendering accuracy measurement system is provided, comprising:
a drive module for moving the position of the binocular camera;
a simulation scene measurement module for executing the steps of the simulation scene measurement method of any of the above solutions to obtain a plurality of measurement results; and
a comparison module for comparing the deviation between the positions of the plurality of measurement results and evaluating the simulation scene rendering accuracy through the deviation.
(3) beneficial effect
The above technical solutions of the present invention have the following beneficial technical effects:
(1) Compared with traditional inspection by human observation, the method relies on quantitative analysis and is therefore more objective and more accurate;
(2) the whole measurement process involves no human subjective judgment and can be run automatically;
(3) physical objects can be dispensed with entirely and virtual objects measured on their own, widening the range of application;
(4) any point in three-dimensional space can be measured by the method, so the results are comprehensive.
Detailed description of the invention
Fig. 1 is a flow chart of the simulation scene measurement method according to the first embodiment of the present invention;
Fig. 2 is a flow chart of obtaining the simulation scene by real-time rendering according to the first embodiment of the present invention;
Fig. 3 is a flow chart of the simulation scene measurement according to the first embodiment of the present invention;
Fig. 4 is a flow chart of the simulation scene rendering accuracy measurement method of another aspect of the first embodiment of the present invention;
Fig. 5 is a schematic diagram of the simulation scene measurement method according to an optional embodiment of the present invention;
Fig. 6 is a method flow chart of the simulation scene measurement according to an optional embodiment of the present invention;
Fig. 7 is a schematic diagram of how the general measurement method deviates from the true eye position;
Fig. 8 is a schematic diagram of the transition matrices between the coordinate systems according to an optional embodiment of the present invention.
Specific embodiment
In order to make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is described in more detail below with reference to specific embodiments and the attached drawings. It should be understood that these descriptions are merely illustrative and are not intended to limit the scope of the present invention. In addition, in the following description, descriptions of well-known structures and technologies are omitted to avoid unnecessarily obscuring the concepts of the invention.
Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.
In addition, the technical features involved in the different embodiments of the invention described below can be combined with each other as long as they do not conflict.
As shown in Fig. 1, in a first aspect of the embodiments of the present invention, a simulation scene measurement method is provided, comprising:
S1: placing a binocular camera, imitating the human eyes, behind a pair of 3D glasses;
S2: calibrating the optical center positions of the binocular camera and taking the optical center positions as the eye-point positions. Optionally, the optical center position includes an initial optical center position and a real-time optical center position. Specifically, the initial optical center positions are calibrated; the calibrated content includes the three-dimensional space coordinates of the two camera optical centers under a unified world coordinate system, which yields the relative positional relationship between the initial optical center positions and the 3D glasses. The real-time position information of the 3D glasses is then obtained with a tracking system, and the real-time optical center positions of the cameras are calculated from the initial optical center positions and their relative positional relationship with the 3D glasses. The tracking system may optionally be the optical tracking system of the ART company, or OptiTrack, Vicon, domestic systems such as Green Pupil, etc. To obtain the real-time optical center coordinates, the present embodiment has the camera system whose initial position is to be calibrated and the tracking system jointly measure the same group of physical space points; to carry out this measurement, stereoscopic vision calibration of the camera system itself, including the intrinsic and extrinsic parameters of the cameras, is required. The above method solves the problem that the tracking system can only obtain the position of the 3D glasses and cannot obtain the optical center positions of the real cameras, and in turn the deviation that would otherwise appear when setting the eye points; accurate real-time optical center positions can be obtained by this method.
S3: performing real-time rendering of the VR system scene according to the eye-point positions to obtain a simulation scene. Optionally, as shown in Fig. 2, this specifically includes: S31, presetting a virtual space point to be measured; S32, performing stereoscopic rendering of the virtual space point according to the eye-point positions, displaying a left-eye image and a right-eye image on the screen to form the simulation scene.
S4: measuring the simulation scene according to the calibration parameters of the binocular camera and the acquired images. Optionally, as shown in Fig. 3, this specifically includes: S41, acquiring the left-eye image and the right-eye image through the corresponding lenses of the 3D glasses using the binocular camera; S42, calculating the physical-world coordinate value of the spatial point from the left-eye image, the right-eye image, and the calibration parameters of the cameras according to a stereoscopic vision algorithm. Here "corresponding" means that the camera replacing the person's left eye acquires the on-screen image observed by the left eye, and the camera replacing the person's right eye acquires the on-screen image observed by the right eye.
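The patent does not commit to a particular stereoscopic vision algorithm for step S42. One common concrete choice is linear (DLT) triangulation from the two calibrated views. The sketch below is an editorial illustration under invented assumptions — a toy intrinsic matrix and an assumed 65 mm baseline — not the claimed implementation:

```python
import numpy as np

def triangulate(P_L, P_R, uv_L, uv_R):
    """Linear (DLT) triangulation: recover a 3-D point from its pixel
    coordinates in the left and right images, given the 3x4 projection
    matrices obtained from stereo calibration."""
    u1, v1 = uv_L
    u2, v2 = uv_R
    A = np.vstack([
        u1 * P_L[2] - P_L[0],
        v1 * P_L[2] - P_L[1],
        u2 * P_R[2] - P_R[0],
        v2 * P_R[2] - P_R[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenise

# Toy calibration (assumed, not from the patent): left camera at the origin,
# right camera 0.065 m to its right, focal length 800 px, principal point (320, 240).
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
P_L = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P_R = K @ np.hstack([np.eye(3), np.array([[-0.065], [0], [0]])])

X_true = np.array([0.1, -0.05, 1.5])  # a point 1.5 m in front of the rig
uv = lambda P, X: (P @ np.append(X, 1))[:2] / (P @ np.append(X, 1))[2]
X_est = triangulate(P_L, P_R, uv(P_L, X_true), uv(P_R, X_true))
print(np.allclose(X_est, X_true, atol=1e-6))  # True for noise-free projections
```

With real acquired images the pixel coordinates would come from detecting the rendered point in each camera image rather than from re-projecting a known point, and the projection matrices from the stereo calibration of step S2.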
The method requires no human judgment; by means of quantitative analysis it is more objective and more accurate. The whole measurement process involves no human subjective judgment and can be run automatically; physical objects can be dispensed with entirely and virtual objects measured on their own, widening the range of application; and any point in three-dimensional space can be measured by the method, so the results are comprehensive.
As shown in Fig. 4, in another aspect of the embodiments of the present invention, a simulation scene rendering accuracy measurement method is provided, comprising:
S'1: moving the position of the binocular camera;
S'2: executing the steps of the simulation scene measurement method of the above embodiment at each of a plurality of positions, obtaining a plurality of measurement results;
S'3: comparing the deviation between the positions of the plurality of measurement results, and evaluating the simulation scene rendering accuracy.
In another aspect of the embodiments of the present invention, a simulation scene measurement system is provided, comprising:
a binocular camera for imitating the human eyes to acquire a left-eye image and a right-eye image; optionally, the binocular camera consists of fixed-focus digital cameras;
an optical center locating module for calibrating the optical center positions of the binocular camera and taking the optical center positions as the eye-point positions; optionally, the optical center position includes an initial optical center position and a real-time tracked optical center position;
a scene rendering module for rendering the VR system scene according to the eye-point positions to obtain a simulation scene; optionally, the scene rendering module includes a spatial point simulation module for presetting a virtual space point to be measured, and a stereoscopic rendering module for performing stereoscopic rendering of the virtual space point according to the eye-point positions, displaying a left-eye image and a right-eye image on the screen to form the simulation scene; optionally, the specific execution steps of the stereoscopic rendering module include: the binocular camera acquiring the left-eye image and the right-eye image through the corresponding lenses of the 3D glasses, and calculating the physical-world coordinate value of the spatial point from the left-eye image, the right-eye image, and the calibration parameters of the cameras according to a stereoscopic vision algorithm; and
a simulation scene measurement module for measuring the simulation scene according to the calibration parameters of the binocular camera and the acquired images.
In another aspect of the embodiments of the present invention, a simulation scene rendering accuracy measurement system is provided, comprising:
a drive module for moving the position of the binocular camera;
a simulation scene measurement module for executing the steps of the simulation scene measurement method of the above embodiment to obtain a plurality of measurement results; and
a comparison module for comparing the deviation between the positions of the plurality of measurement results and evaluating the simulation scene rendering accuracy.
As shown in Fig. 5, in an optional embodiment of the invention, in order to measure the simulation scene in the real world, a pair of calibrated fixed-focus digital cameras is placed in the simulation scene to observe the scene in place of the human eyes, and the 3D glasses are placed in front of the two cameras, so that each camera, standing in for one of the person's eyes, can only observe the picture rendered for its own eye-point position, thereby obtaining a disparity pair of the simulation scene. The camera optical center positions are then initially calibrated and tracked in real time, and the tracking result is defined as the eye-point position used by the VR system for rendering, so as to guarantee the match between the rendered simulation scene and the observation position and avoid measurement errors caused by eye-point deviation. The VR system is subsequently started and renders the scene in real time according to the tracked eye positions. The simulation scene is then measured with a stereoscopic vision algorithm from the two acquired images with disparity and the calibration parameters of the camera system. The camera system is moved to change the observation position, and the above measurement process is repeated at the new eye-point position to obtain measurement results at different observation positions. Finally, the average result of the repeated measurements of the simulation scene is compared with the original design data, realizing the measurement of the rendering accuracy of the scene geometry.
As shown in Fig. 6, the measurement of the simulation scene geometry ultimately reduces to measuring the position of any virtual space point in the virtual environment. The specific steps are as follows:
In a first step, the calibrated stereoscopic camera measurement system (two calibrated fixed-focus cameras mounted on a rigid bracket carrying markers (tracking markers) for tracking and positioning, the whole system forming a complete rigid body with no relative position change between its components) is placed at an arbitrary position in the VR system, and the 3D glasses are placed in front of the cameras;
in a second step, the initial positions of the camera optical centers are calibrated, giving the camera optical center positions as the initial rendering eye points;
in a third step, real-time tracking of the camera optical center positions is started based on the calibration result of the initial positions;
in a fourth step, the point P to be measured is set in the virtual space;
in a fifth step, the camera spatial positions are defined as the eye-point positions, and the virtual space point P is rendered stereoscopically according to these eye-point positions, displaying two images IL and IR on the screen S;
in a sixth step, with the aid of the 3D glasses, the cameras acquire the left-eye and right-eye images respectively, and a stereoscopic vision algorithm computes the physical-world coordinate value of point P from the two acquired images with disparity and the calibration parameters of the camera system;
in a seventh step, the cameras are moved to change the observation eye positions, and the third to sixth steps are repeated to obtain measurement results at several different observation positions; finally, the deviation between the set virtual-space position of point P and the barycenter of the measured positions is compared, measuring the geometric rendering accuracy of point P.
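The seventh step — averaging the measurements from several observation positions and comparing their barycenter against the set position — can be sketched numerically as follows. All coordinates here are invented for illustration; they stand in for the measured results of point P:

```python
import numpy as np

# Assumed data: the virtual point P was set at p_v, and n = 5 measurements
# from different observation positions returned slightly scattered results.
p_v = np.array([1.0, 0.5, 2.0])
measurements = p_v + np.array([
    [ 0.002, -0.001,  0.003],
    [-0.001,  0.002, -0.002],
    [ 0.003,  0.000,  0.001],
    [-0.002, -0.003,  0.000],
    [ 0.000,  0.002, -0.001],
])

centroid = measurements.mean(axis=0)        # barycenter of the n results
deviation = np.linalg.norm(centroid - p_v)  # Euclidean distance to the set position
print(deviation)  # a sub-millimeter deviation for this toy data
```

The single scalar `deviation` is then the accuracy figure for point P; averaging first suppresses per-measurement noise before comparing against the design position.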
In the second step, the calibration of the camera optical centers needs to compute the representation of the left and right optical center positions in the coordinates of the virtual space, so as to provide the eye-point positions on which the rendering computation relies. In normal use of a VR display system, this eye-point position is typically taken directly as the lens center position of the 3D glasses, measured by the motion capture system and passed to the rendering system for scene rendering. Since there is a deviation between the lens center positions and the true eye positions, the scene shown to the observer is inaccurate; the deviation is shown in Fig. 7.
When the observer is a person, this error is often ignored because humans lack accurate perception of size; but when a camera is used for precise measurement, this error greatly affects the measurement result, so the eye positions tracked directly by the motion capture equipment cannot be used for measurement. For this problem, this application devises a true eye-position measurement method for the cameras, so as to obtain the accurate positions of the camera optical centers in the rendering system. This part involves the computation of transition matrices between several coordinate systems, as follows.
The coordinate systems involved in the calibration process are shown in Fig. 8, comprising:
(1) the VR system physical world coordinate system COW, used to describe the positions of real-world physical space points;
(2) the motion capture system coordinate system COT, used to describe the real-world physical point positions tracked by the motion capture system; since the location tracking results of the tracking system are also used to provide eye-position information for rendering the virtual scene, coordinates in this system can also be used to describe points in the virtual space;
(3) the VR system virtual space coordinate system COV, used to describe the positions of virtual space points;
(4) the left camera coordinate system COCL, used to describe the camera measurement results; its origin is the left eye point during rendering, and its Z axis is the viewing direction of the left eye point;
(5) the right camera coordinate system COCR, used to describe the camera measurement results; its origin is the right eye point during rendering, and its Z axis is the viewing direction of the right eye point.
Obtaining the eye-point positions accurately means computing the expressions in COV of the coordinate origins and coordinate-axis direction vectors of COCL and COCR; analyzed mathematically, it means computing the coordinate transformation relationships between COCL/COCR and COV. Since the space described by COCL and COCR is the real physical space while COV describes the virtual space, the relationship between the two cannot be obtained directly and must be computed through other coordinate systems. To simplify the problem, COW can first be defined to coincide with COT, i.e. COT is defined as the physical world coordinate system, so the transformation between COT and COW is known. Secondly, as mentioned above, the motion capture system measures the eye positions in the actual physical space and supplies this result to the rendering system for rendering, so the transformation between COT and COV is known. Finally, for the two cameras COCL and COCR, the mutual transformation between the two cameras can be obtained by stereo calibration, so the transformation between COCL and COCR is known. In summary, the problem reduces to finding the transformation between COT and COCL.
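The chain-of-transforms argument can be made concrete with 4×4 homogeneous matrices: once COCL→COT is known, the missing COCL→COV link is simply the product of known links. Both transforms below are invented purely for illustration:

```python
import numpy as np

def hom(R, t):
    """Pack a rotation matrix and a translation vector into a 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Assumed transforms: a 90-degree yaw from COCL to COT, and a pure
# translation from COT to COV (both invented for this sketch).
Rz = np.array([[0.0, -1, 0], [1, 0, 0], [0, 0, 1]])
T_cocl_to_cot = hom(Rz, [0.1, 0.0, 0.0])
T_cot_to_cov = hom(np.eye(3), [0.0, 2.0, 0.0])

# The unknown COCL -> COV mapping is the composition of the two known links.
T_cocl_to_cov = T_cot_to_cov @ T_cocl_to_cot

# The camera optical center is the origin of COCL; the rendering eye point
# is that origin expressed in COV.
eye_point = (T_cocl_to_cov @ np.array([0.0, 0, 0, 1]))[:3]
print(eye_point)  # the optical center expressed in COV: (0.1, 2.0, 0.0) here
```

The same composition, with the COCL↔COCR stereo-calibration transform appended, yields the right eye point from the left.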
Since both the motion capture system and the camera system can measure real-world objects, this patent infers the transformation between the two coordinate systems COT and COCL from measurements of the same objects made by the two systems.
For any $n$ space points to be measured in the physical space, let $M_T = \{m_T^j\} \in R^{3 \times n}$ be the measured coordinates of this group of points in COT, and $M_C = \{m_C^j\} \in R^{3 \times n}$ their measured coordinates in COCL; let $s \in R$ be the proportionality coefficient of the transformation between COT and COCL, $R \in R^{3 \times 3}$ the rotation matrix between the coordinate systems, and $T \in R^{3 \times 1}$ the translation matrix between the two coordinate systems. Then:

$$M_T = s \cdot R \cdot M_C + [T \dots T]$$

Writing $m_C^j$ for the $j$-th column of $M_C$, the center of gravity of the point group is

$$\bar{m}_C := \frac{1}{n} \sum_{j=1}^{n} m_C^j$$

and the mean radius is

$$r_C := \frac{1}{n} \sum_{j=1}^{n} \left\| m_C^j - \bar{m}_C \right\|,$$

and similarly $\bar{m}_T$, $r_T$ for $M_T$.

Let $P := \frac{1}{r_C}\left(M_C - [\bar{m}_C \dots \bar{m}_C]\right)$ and $Q := \frac{1}{r_T}\left(M_T - [\bar{m}_T \dots \bar{m}_T]\right)$; the objective function then simplifies to an orthogonal Procrustes problem, i.e. solving

$$R = \underset{\Omega \in SO(3)}{\arg\min} \; \| \Omega P - Q \|.$$

The solution is $R = U V^T$, where $U$ and $V^T$ are the two orthogonal matrices obtained from the singular value decomposition of $M = Q P^T$; the scale and translation then follow as $s = r_T / r_C$ and $T = \bar{m}_T - s R \bar{m}_C$. Since $(s, R, T)$ has 7 unknowns in total, the solution can be completed as long as $n \geq 3$, yielding the transformation between COT and COCL and completing the true eye-position computation for the cameras. Here $:=$ denotes "is defined as", i.e. a simple symbol stands for an expression; $R^{a \times b}$ denotes an $a$-row, $b$-column matrix with real elements; $\| \cdot \|$ denotes the Euclidean distance; $SO(3)$ denotes the group of three-dimensional rotation matrices; and $R = \arg\min_{\Omega \in SO(3)} \| \Omega P - Q \|$ means that $R$ is the value of $\Omega$, belonging to the three-dimensional rotation group, that minimizes $\| \Omega P - Q \|$.
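The scaled orthogonal Procrustes solution above can be sketched in a few lines of NumPy. The synthetic data are invented, and the reflection guard `D` is a standard practical addition (to keep the result in SO(3)) that the text does not spell out:

```python
import numpy as np

def similarity_from_points(Mc, Mt):
    """Estimate (s, R, T) with Mt = s*R*Mc + T from two 3xN point sets,
    via the SVD-based orthogonal Procrustes solution."""
    mc, mt = Mc.mean(axis=1, keepdims=True), Mt.mean(axis=1, keepdims=True)
    P, Q = Mc - mc, Mt - mt
    rc = np.linalg.norm(P, axis=0).mean()   # mean radius of each point group
    rt = np.linalg.norm(Q, axis=0).mean()
    P, Q = P / rc, Q / rt
    U, _, Vt = np.linalg.svd(Q @ P.T)       # SVD of M = Q P^T
    D = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])  # guard against reflections
    R = U @ D @ Vt
    s = rt / rc
    T = mt - s * R @ mc
    return s, R, T

# Synthetic check: build a known similarity transform and recover it.
rng = np.random.default_rng(0)
Mc = rng.normal(size=(3, 10))
angle = 0.3
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
s_true, T_true = 2.0, np.array([[1.0], [-2.0], [0.5]])
Mt = s_true * R_true @ Mc + T_true

s, R, T = similarity_from_points(Mc, Mt)
print(np.isclose(s, s_true), np.allclose(R, R_true), np.allclose(T, T_true))
# True True True
```

With noise-free correspondences the recovery is exact; with real measurements the same code returns the least-squares fit.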
As for the third step, after the eye-position computation is completed, the eye positions must be tracked in real time, so that correct eye-position information can still be obtained after the cameras are moved, without re-calibrating.
Since the whole binocular camera system is an integral system, the relative positional relationship of the two cameras does not change; the whole system can therefore be regarded as a rigid body, and any movement after the initial eye-position calibration is a rigid-body motion relative to the initially calibrated position. Based on this analysis, this application realizes real-time eye-position tracking by adding anchor points to the camera system. When the eye-position calibration is completed, the 6-DOF pose information Po0 of the markers is recorded with the motion capture system; thereafter the 6-DOF pose information Pot of the markers is tracked in real time, the pose transformation between Pot and Po0 is computed, and applying this transformation to the initially calibrated eye positions yields accurate real-time eye-position information.
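Applying the marker-pose delta Pot·Po0⁻¹ to the initially calibrated optical center can be sketched as follows. The poses are invented, and the rotations are taken as 3×3 matrices rather than whatever parameterization the tracking system actually reports:

```python
import numpy as np

def pose_to_matrix(R, t):
    """Pack a 6-DOF pose (rotation matrix + position) into a 4x4 transform."""
    M = np.eye(4)
    M[:3, :3], M[:3, 3] = R, t
    return M

# Assumed marker poses from the tracking system: Po0 at calibration time,
# Pot at the current frame (a 90-degree yaw plus a 1 m translation).
Po0 = pose_to_matrix(np.eye(3), np.array([0.0, 0.0, 0.0]))
theta = np.pi / 2
Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta),  np.cos(theta), 0],
               [0, 0, 1]])
Pot = pose_to_matrix(Rz, np.array([1.0, 0.0, 0.0]))

# The rig is rigid, so the motion of the marker is the motion of the optical
# centers: delta = Pot * Po0^-1, applied to the calibrated initial eye point.
delta = Pot @ np.linalg.inv(Po0)
eye0 = np.array([0.2, 0.0, 0.0, 1.0])  # calibrated initial optical center
eye_t = (delta @ eye0)[:3]
print(eye_t)  # rotated 90 degrees about z, then translated: about (1.0, 0.2, 0.0)
```

Re-computing `delta` every frame keeps the rendering eye points aligned with the cameras as they move, exactly because the bracket keeps the marker-to-optical-center offset fixed.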
In the seventh step, after all measurements are completed, the n measurements performed at different observation positions yield a group of real-world coordinates Pri (i = 1, ..., n) of the virtual point P, which are compared with the virtual space coordinate Pv set at modeling time to obtain the error of the system when rendering the virtual point P. To obtain a result with more statistical significance, we define P̄r as the mean of all measurement results for point P, i.e. the barycenter position of the Pri (i = 1, ..., n), and use the Euclidean distance between P̄r and Pv as the criterion for judging the geometric simulation accuracy at point P.
Further, if more spatial points are measured, then after multiple groups of data have been acquired, the root-mean-square error RMSE and the coefficient of determination R-square from statistical analysis can be borrowed as inspection indices of the scale-reproduction capability of the whole system. Denoting the total number of acquired test samples by m, the inspection indices are computed as:

$$RMSE = \sqrt{\frac{1}{m} \sum_{i=1}^{m} \left\| \bar{P}_r^i - P_v^i \right\|^2}$$

$$R^2 = 1 - \frac{\sum_{i=1}^{m} \left\| \bar{P}_r^i - P_v^i \right\|^2}{\sum_{i=1}^{m} \left\| P_v^i - \bar{P}_v \right\|^2}$$

The closer the RMSE value is to 0, and the closer the R-square value is to 1, the closer the geometric simulation is to the truth.
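Under this reading of the indices — RMSE over per-point Euclidean errors, and R² measured against the spread of the designed coordinates — the computation is a few lines of NumPy. The sample points and the uniform 1 cm bias are invented for illustration:

```python
import numpy as np

def rmse(pred, true):
    """Root-mean-square of the per-point Euclidean errors (m samples, 3-D points)."""
    return float(np.sqrt(np.mean(np.sum((pred - true) ** 2, axis=1))))

def r_square(pred, true):
    """Coefficient of determination against the spread of the designed points."""
    ss_res = np.sum((true - pred) ** 2)
    ss_tot = np.sum((true - true.mean(axis=0)) ** 2)
    return float(1.0 - ss_res / ss_tot)

# Assumed sample: m = 4 designed virtual points and their measured positions.
true = np.array([[0.0, 0, 0], [1.0, 0, 0], [0, 1.0, 0], [1.0, 1.0, 1.0]])
pred = true + 0.01                     # a uniform 1 cm bias, for illustration
print(round(rmse(pred, true), 6))      # sqrt(3) * 0.01, i.e. about 0.017321
print(r_square(pred, true) > 0.99)     # True: close to 1 -> faithful reproduction
```

RMSE near 0 and R² near 1 then indicate that the system reproduces the designed geometry faithfully at scale.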
This method exploits the fact that the camera is similar to the human eye in structure and function while also being capable of quantitative computation of physical dimensions: the camera replaces the human eye in observing the simulation scene, realizing the measurement and determination of the geometric simulation accuracy of the VR system.
At the same time, by having the stereoscopic camera measurement system and the motion capture system measure the same group of real space points, the transformation between the stereoscopic camera measurement system coordinate system and the motion capture system coordinate system is inferred; combined with the remaining known coordinate-system transformations, this realizes the accurate calibration of the eye positions of the stereoscopic camera measurement system, and adding marker points to the stereoscopic camera measurement system realizes the real-time updating of the calibrated eye positions.
The present invention is intended to protect a simulation scene measurement method, comprising: placing a binocular camera, imitating the human eyes, behind a pair of 3D glasses; calibrating the optical center positions of the binocular camera; taking the optical center positions as the eye-point positions; performing real-time rendering of the VR system scene according to the eye-point positions to obtain a simulation scene; and measuring the simulation scene according to the calibration parameters of the binocular camera and the acquired images. Compared with traditional inspection by human observation, the method relies on quantitative analysis and is more objective and more accurate; the whole measurement process involves no human subjective judgment and can be run automatically; physical objects can be dispensed with entirely and virtual objects measured on their own, widening the range of application; and any point in three-dimensional space can be measured by the method, so the results are comprehensive.
It should be understood that the above specific embodiments of the present invention are used only to exemplify or explain the principles of the present invention and do not limit the present invention. Therefore, any modification, equivalent replacement, improvement, and the like made without departing from the spirit and scope of the present invention shall be included in the protection scope of the present invention. In addition, the appended claims of the present invention are intended to cover all variations and modifications falling within the scope and boundary of the appended claims, or the equivalents of such scope and boundary.

Claims (10)

1. a kind of simulating scenes measurement method characterized by comprising
Before binocular camera imitation eye is placed in 3D glasses;
The optical center position of the binocular camera is demarcated, and using the optical center position as position of eye point;
VR system scenarios real-time rendering, which is carried out, according to the position of eye point obtains simulating scenes;
The simulating scenes are measured according to the calibrating parameters of the binocular camera and the image of acquisition.
2. The simulated-scene measurement method according to claim 1, characterized in that the optical center positions comprise initial optical center positions and real-time optical center positions.
3. The simulated-scene measurement method according to claim 1, characterized in that performing real-time VR scene rendering according to the eye points to obtain a simulated scene specifically comprises:
presetting a virtual space point to be measured;
performing stereoscopic rendering of the virtual space point according to the eye points, and displaying a left-eye image and a right-eye image on the screen to form the simulated scene.
4. The simulated-scene measurement method according to claim 1, characterized in that measuring the simulated scene according to the calibration parameters of the binocular camera and the acquired images specifically comprises:
acquiring the left-eye image and the right-eye image with the binocular camera through the corresponding lenses of the 3D glasses;
calculating the physical-world coordinate value of the spatial point from the left-eye image, the right-eye image, and the calibration parameters of the camera according to a stereo vision algorithm.
5. a kind of simulating scenes render accuracy measuring method characterized by comprising
The position of the mobile binocular camera;
In multiple positions, difference perform claim requires the described in any item simulating scenes measurement method steps of 1-4, obtains multiple surveys Measure result;
The deviation between multiple measurement result positions is compared, it is accurate to render by simulating scenes described in the deviation measuring Property.
6. a kind of simulating scenes measuring system characterized by comprising
Binocular camera, for imitating human eye acquisition left-eye image and eye image;
Optical center locating module, for demarcating the optical center position of the binocular camera, and using the optical center position as position of eye point;
Scene rendering module renders to obtain simulating scenes for carrying out VR system scenarios according to the position of eye point;
Simulating scenes measurement module, for according to the calibrating parameters of the binocular camera and the image of acquisition to the simulating scenes It measures.
7. The simulated-scene measurement system according to claim 6, characterized in that the optical center positions comprise initial optical center positions and real-time optical center positions.
8. The simulated-scene measurement system according to claim 6, characterized in that the scene rendering module comprises:
a spatial point simulation module for presetting a virtual space point to be measured;
a stereoscopic rendering module for performing stereoscopic rendering of the virtual space point according to the eye points and displaying a left-eye image and a right-eye image on the screen to form the simulated scene.
9. The simulated-scene measurement system according to claim 6, characterized in that when the stereoscopic rendering module performs stereoscopic rendering of the virtual space point according to the eye points and displays the left-eye image and the right-eye image on the screen to form the simulated scene, the specifically executed steps comprise:
acquiring the left-eye image and the right-eye image with the binocular camera through the corresponding lenses of the 3D glasses;
calculating the physical-world coordinate value of the spatial point from the left-eye image, the right-eye image, and the calibration parameters of the camera according to a stereo vision algorithm.
10. a kind of simulating scenes rendering accuracy measures system characterized by comprising
Drive module, for moving the position of the binocular camera;
Simulating scenes measurement module requires the described in any item simulating scenes measurement method steps of 1-4 for perform claim, obtains Multiple measurement results;
Comparison module, for comparing the deviation between multiple measurement result positions, by being emulated described in the deviation measuring Scene rendering accuracy.
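The deviation comparison described in claims 5 and 10 can be sketched as follows. The patent does not specify a particular deviation metric, so the RMS distance from the mean measured position is assumed here as one plausible choice, and the input data are hypothetical:

```python
import math

def rendering_accuracy(measurements):
    """Given repeated 3D measurements of the same virtual point taken
    from different camera positions, return the RMS deviation from the
    mean measured position. A small value indicates the rendered point
    stays put as the viewpoint moves, i.e. accurate rendering."""
    n = len(measurements)
    mean = tuple(sum(p[i] for p in measurements) / n for i in range(3))
    # Squared Euclidean distance of each measurement from the mean.
    sq = [sum((p[i] - mean[i]) ** 2 for i in range(3)) for p in measurements]
    return math.sqrt(sum(sq) / n)

# Three measurements (metres) of the same virtual point from three eye positions:
points = [(0.10, 0.05, 1.30), (0.11, 0.05, 1.31), (0.09, 0.05, 1.29)]
rms = rendering_accuracy(points)
```

With these hypothetical numbers the RMS deviation is about 12 mm; a perfectly view-stable rendering would yield zero.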
CN201910637565.3A 2019-07-15 2019-07-15 Simulation scene measurement method, accuracy measurement method and system Active CN110414101B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910637565.3A CN110414101B (en) 2019-07-15 2019-07-15 Simulation scene measurement method, accuracy measurement method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910637565.3A CN110414101B (en) 2019-07-15 2019-07-15 Simulation scene measurement method, accuracy measurement method and system

Publications (2)

Publication Number Publication Date
CN110414101A true CN110414101A (en) 2019-11-05
CN110414101B CN110414101B (en) 2023-08-04

Family

ID=68361483

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910637565.3A Active CN110414101B (en) 2019-07-15 2019-07-15 Simulation scene measurement method, accuracy measurement method and system

Country Status (1)

Country Link
CN (1) CN110414101B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113012270A (en) * 2021-03-24 2021-06-22 纵深视觉科技(南京)有限责任公司 Stereoscopic display method and device, electronic equipment and storage medium
CN113658474A (en) * 2021-08-18 2021-11-16 中国商用飞机有限责任公司 Emergency evacuation training system for airplane
CN115118880A (en) * 2022-06-24 2022-09-27 中广建融合(北京)科技有限公司 XR virtual shooting system based on immersive video terminal is built

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102801994A (en) * 2012-06-19 2012-11-28 西北工业大学 Physical image information fusion device and method
US20130300637A1 (en) * 2010-10-04 2013-11-14 G Dirk Smits System and method for 3-d projection and enhancements for interactivity
CN107093195A (en) * 2017-03-10 2017-08-25 西北工业大学 A kind of locating mark points method that laser ranging is combined with binocular camera
CN107277495A (en) * 2016-04-07 2017-10-20 深圳市易瞳科技有限公司 A kind of intelligent glasses system and its perspective method based on video perspective
CN107408315A (en) * 2015-02-23 2017-11-28 Fittingbox公司 The flow and method of glasses try-in accurate and true to nature for real-time, physics
CN107820075A (en) * 2017-11-27 2018-03-20 中国计量大学 A kind of VR equipment delayed test devices based on light stream camera
CN108413941A (en) * 2018-02-06 2018-08-17 四川大学 A kind of simple and efficient distance measuring method based on cheap binocular camera
CN109598796A (en) * 2017-09-30 2019-04-09 深圳超多维科技有限公司 Real scene is subjected to the method and apparatus that 3D merges display with dummy object
CN111951332A (en) * 2020-07-20 2020-11-17 燕山大学 Glasses design method based on sight estimation and binocular depth estimation and glasses thereof




Similar Documents

Publication Publication Date Title
CN110296691B (en) IMU calibration-fused binocular stereo vision measurement method and system
CN102072706B (en) Multi-camera positioning and tracking method and system
CN108038902A (en) A kind of high-precision three-dimensional method for reconstructing and system towards depth camera
CN108510551B (en) Method and system for calibrating camera parameters under long-distance large-field-of-view condition
CN108510535A (en) A kind of high quality depth estimation method based on depth prediction and enhancing sub-network
CN108550143A (en) A kind of measurement method of the vehicle length, width and height size based on RGB-D cameras
CN107063129A (en) A kind of array parallel laser projection three-dimensional scan method
CN110414101A (en) A kind of simulating scenes measurement method, accuracy measuring method and system
CN108734776A (en) A kind of three-dimensional facial reconstruction method and equipment based on speckle
CN103426168B (en) Based on the general calibration method of common, wide-angle, the flake stereo camera of one-dimension calibration bar
CN105205858A (en) Indoor scene three-dimensional reconstruction method based on single depth vision sensor
CN104376552A (en) Virtual-real registering algorithm of 3D model and two-dimensional image
CN108053437A (en) Three-dimensional model acquiring method and device based on figure
CN104155765A (en) Method and equipment for correcting three-dimensional image in tiled integral imaging display
CN108648274A (en) A kind of cognition point cloud map creation system of vision SLAM
CN108225216A (en) Structured-light system scaling method and device, structured-light system and mobile equipment
CN105378794A (en) 3d recording device, method for producing 3d image, and method for setting up 3d recording device
CN110345921A (en) Stereoscopic fields of view vision measurement and vertical axial aberration and axial aberration bearing calibration and system
CN110458932A (en) Image processing method, device, system, storage medium and image scanning apparatus
CN106705849A (en) Calibration method of linear-structure optical sensor
CN100561118C (en) A kind of color rendering method in the three-dimensional digitized measurement
TW201310004A (en) Correlation arrangement device of digital images
CN108020175A (en) A kind of more optical grating projection binocular vision tongue body surface three dimension entirety imaging methods
CN109035345A (en) The TOF camera range correction method returned based on Gaussian process
CN104182968A (en) Method for segmenting fuzzy moving targets by wide-baseline multi-array optical detection system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant