CN108121068A - Method and device for laser-based depth-of-field setting and optimized display for a virtual reality helmet - Google Patents

Method and device for laser-based depth-of-field setting and optimized display for a virtual reality helmet

Info

Publication number
CN108121068A
CN108121068A (application CN201710543920.1A)
Authority
CN
China
Prior art keywords
depth of field
virtual reality helmet
observation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710543920.1A
Other languages
Chinese (zh)
Inventor
党少军
姜燕冰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Virtual Reality Technology Co Ltd
Original Assignee
Shenzhen Virtual Reality Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Virtual Reality Technology Co Ltd filed Critical Shenzhen Virtual Reality Technology Co Ltd
Publication of CN108121068A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01MTESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M11/00Testing of optical apparatus; Testing structures by optical methods not otherwise provided for
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01MTESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M11/00Testing of optical apparatus; Testing structures by optical methods not otherwise provided for
    • G01M11/02Testing optical properties
    • G01M11/0242Testing optical properties by measuring geometrical properties or aberrations
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01MTESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M11/00Testing of optical apparatus; Testing structures by optical methods not otherwise provided for
    • G01M11/02Testing optical properties
    • G01M11/0242Testing optical properties by measuring geometrical properties or aberrations
    • G01M11/0257Testing optical properties by measuring geometrical properties or aberrations by analyzing the image formed by the object to be tested
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0012Optical design, e.g. procedures, algorithms, optimisation routines
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0176Head mounted characterised by mechanical features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/80Geometric correction
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/011Head-up displays characterised by optical features comprising device for correcting geometrical aberrations, distortion
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0127Head-up displays characterised by optical features comprising devices increasing the depth of field
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0141Head-up displays characterised by optical features characterised by the informative content of the display
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0149Head-up displays characterised by mechanical features
    • G02B2027/0161Head-up displays characterised by mechanical features characterised by the relative positioning of the constitutive elements
    • G02B2027/0163Electric or electronic control thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/012Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Human Computer Interaction (AREA)
  • Geometry (AREA)
  • Eye Examination Apparatus (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Eyeglasses (AREA)

Abstract

The present invention provides a method and device for laser-based depth-of-field setting and optimized display for a virtual reality helmet, comprising a test unit, an observation unit, an image unit and a processing unit. The test unit includes the virtual reality helmet under test and a fixing structure; the helmet under test includes a display screen; the fixing structure includes a clamping mechanism and a limiting mechanism, and the clamping mechanism can be opened to receive the helmet. Compared with the prior art, the combination of test unit, observation unit, image unit and processing unit solves the depth-of-field setting problem simply and effectively. Driving the observation unit along an eyepiece track by motor allows observation from multiple angles, which facilitates the setting of multiple observation points.

Description

Method and device for laser-based depth-of-field setting and optimized display for a virtual reality helmet
Technical field
The present invention relates to the field of virtual reality, and more specifically to a method and device for laser-based depth-of-field setting and optimized display for a virtual reality helmet.
Background technology
Distortion lenses are used in many fields. In a virtual reality system, for example, the device must cover as much of the human visual range as possible so that the user enjoys true visual immersion, and it is therefore fitted with lenses of a specific spherical curvature. When a conventional image is projected into the eye through such curved lenses, however, the image is distorted, and the eye has no way to establish its position in the virtual space: in virtual reality, everything on the periphery appears warped. To solve this problem, the image must first be pre-distorted: a specific algorithm generates the distorted image corresponding to the lens, and once these pre-distorted images are projected into the eye through the distortion lens they appear normal again, giving the viewer correct spatial projection and coverage of a wide field of view. Lens manufacturers currently grind lenses to given distortion parameters, and virtual reality helmet manufacturers assemble these lenses into their helmets. Ordinary users and software developers of such helmets, however, have no instrument for measuring the lens distortion parameters; apart from requesting them from the lens manufacturer, they cannot obtain the parameters directly, which greatly hampers the development and use of virtual reality software. Moreover, because the distortion parameters cannot be obtained, the depth of field of the virtual reality helmet cannot be optimized.
Summary of the invention
To remedy the inability of current virtual reality devices to optimize the depth of field, the present invention provides a method and device for laser-based depth-of-field setting and optimized display for a virtual reality helmet.
The technical solution adopted by the present invention is as follows. A method for laser-based depth-of-field setting and optimized display for a virtual reality helmet comprises the following steps:
S10: dividing the image to be displayed into a precise setting region and a general setting region;
S20: configuring the image in the precise setting region with the laser-based depth-of-field setting method, and the image in the general setting region with the computation-based depth-of-field setting method;
S30: applying blur processing to the image.
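The three steps above can be sketched in code. The patent specifies no image format or blur algorithm, so the fragment below is only a minimal illustration under assumed conventions: a toy greyscale image stored as a list of rows, a boolean mask marking the precise setting region (S10), and a simple box blur standing in for the blur processing of S30.

```python
def selective_blur(img, accurate, radius=1):
    """S10/S30 sketch: pixels inside the precise setting region stay
    sharp; the general setting region gets a box blur. `img` is a toy
    greyscale image (list of rows), `accurate` a same-shaped bool mask."""
    h, w = len(img), len(img[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            if accurate[y][x]:
                row.append(img[y][x])          # precise region: untouched
                continue
            total, n = 0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    if 0 <= y + dy < h and 0 <= x + dx < w:
                        total += img[y + dy][x + dx]
                        n += 1
            row.append(total / n)              # general region: averaged
        out.append(row)
    return out

img = [[0, 0, 0], [0, 9, 0], [0, 0, 0]]
mask = [[False, False, False], [False, True, False], [False, False, False]]
res = selective_blur(img, mask)                # centre pixel kept sharp
```

The masked pixel keeps its value while its unmasked neighbours are averaged with their surroundings, which is the "keep the principal object sharp, soften the rest" behaviour the steps describe.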
Preferably, the computation-based setting comprises the following steps:
S101: storing the distortion parameters of the virtual reality helmet under test in the processing unit;
S102: calculating the corresponding sight-line angle from the depth-of-field relation;
S103: back-calculating, from the distortion parameters of the helmet under test and the sight-line angle, the position of the light point on the screen.
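Step S103 is an inversion problem: given the sight-line angle from S102, find the screen position whose distorted ray leaves the lens at that angle. The patent does not specify the distortion model, so the sketch below assumes a radial polynomial model with illustrative coefficients and inverts it numerically by bisection (the forward model is monotone on the interval used).

```python
def distorted_angle(screen_x, k1=0.22, k2=0.05):
    """Assumed forward model: maps a normalised screen offset (0..1) to
    the sight-line angle (radians) after lens distortion."""
    r = screen_x
    return r * (1.0 + k1 * r**2 + k2 * r**4)

def screen_position_for_angle(target_angle, tol=1e-9):
    """S103 back-calculation: bisect the monotone forward model to find
    the screen position whose distorted ray leaves at target_angle."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if distorted_angle(mid) < target_angle:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

x = screen_position_for_angle(0.5)
# pushing x back through the forward model recovers the requested angle
assert abs(distorted_angle(x) - 0.5) < 1e-6
```

A real implementation would substitute the distortion function fitted from the measurements described later; only the inversion scheme is the point here.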
Preferably, the method further comprises step S100: before the computation-based setting, measuring the distortion parameters of the virtual reality helmet.
Preferably, the laser-based setting mainly comprises the following steps:
S201: determining the image depth position to be adjusted, and calculating the corresponding graduation positions D1 and D2 on the graduated scale from the depth-of-field relation;
S202: adjusting the observation eyepieces so that the beams emitted by the lasers strike the graduated scale at positions D1 and D2 respectively;
S203: the display screen showing test image information according to a preset display rule;
S204: the processing unit processing the test image information observed by the observation unit to obtain the display abscissa and display ordinate corresponding to the image depth position to be adjusted.
Preferably, the display screen displays a vertical light line column by column, in units of pixel columns, from its first end to its second end; when the image unit detects that the displayed information, after distortion, reaches the calibration position of the observation unit, the processing unit records the abscissa of the light line on the display screen at that moment, and this abscissa is the position at which the display screen correctly renders the depth of field.
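The column scan can be simulated end to end. Everything below is an assumed toy model — the screen width, the distortion coefficients, and the `observed_position` camera model are all illustrative — but the loop mirrors the procedure just described: light one pixel column at a time and record the abscissa whose distorted image lands on the calibration position (offset ≈ 0 from the camera axis).

```python
import math

SCREEN_COLS = 1920

def observed_position(col, view_angle_deg):
    """Angular offset (degrees) from the camera axis at which the camera
    sees screen column `col` after the (assumed) lens distortion."""
    x = (col / SCREEN_COLS) * 2.0 - 1.0            # normalise to [-1, 1]
    r = x * (1.0 + 0.2 * x * x)                    # toy radial distortion
    return math.degrees(math.atan(r)) - view_angle_deg

def scan_columns(view_angle_deg, tolerance=0.06):
    """S203/S204 loop: light columns one by one and return the abscissa
    whose distorted image lands on the calibration position (camera
    centre, offset ~ 0), or None if no column reaches it."""
    best = min(range(SCREEN_COLS),
               key=lambda c: abs(observed_position(c, view_angle_deg)))
    return best if abs(observed_position(best, view_angle_deg)) < tolerance else None

col = scan_columns(10.0)   # observation point looking 10 degrees off-axis
```

The same loop, run with horizontal rows instead of vertical columns, yields the ordinate described later in the embodiment.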
Preferably, the calibration position is the center of the image captured by the observation unit.
Preferably, after the image depth position to be adjusted is determined, the sight-line angle is calculated from the depth position, and the corresponding graduation positions D1 and D2 on the graduated scale are calculated from the sight-line angle.
Preferably, after the image depth position to be adjusted is determined, a virtual sight-line direction is simulated, and positions D1 and D2 are determined by the intersection of the extended sight line with the graduated scale.
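The sight-line-extension variant reduces to similar triangles: each eye's line through the virtual depth point is intersected with the plane of the graduated scale. A minimal sketch, assuming a centred point, metre units, and an illustrative interpupillary distance:

```python
def ruler_marks(depth, ruler_dist, ipd=0.064, point_x=0.0):
    """Intersect each eye's sight line through a virtual point at `depth`
    with the graduated-scale plane at `ruler_dist` (similar triangles);
    distances in metres, `ipd` illustrative. Returns (D1, D2)."""
    t = ruler_dist / depth                 # fraction of the way to the point
    return tuple(eye_x + t * (point_x - eye_x) for eye_x in (-ipd / 2, ipd / 2))

d1, d2 = ruler_marks(depth=2.0, ruler_dist=0.5)
# for a centred point the two laser spots straddle the midline symmetrically
assert abs(d1 + d2) < 1e-12
```

Conversely, beams aimed from the two eye positions through D1 and D2 cross exactly at the virtual depth point, which is why the intersection E of the two laser beams lies on the same plumb line as the simulated depth position.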
A device for laser-based depth-of-field setting and optimized display for a virtual reality helmet is also provided, comprising a test unit, an observation unit, an image unit and a processing unit. The test unit includes the virtual reality helmet under test and a fixing structure; the helmet includes a display screen; the fixing structure includes a clamping mechanism, a limiting mechanism and a graduated scale, and the clamping mechanism can be opened to receive the helmet. The observation unit includes an observation eyepiece with a laser mounted above it, and the beam emitted by the laser strikes the graduated scale to form a light spot.
Preferably, the observation unit includes an observation eyepiece, an eyepiece track and a motor, and the observation eyepiece can move along the eyepiece track driven by the motor.
Preferably, the observation unit includes a movable plate, an observation eyepiece, a light shield, an eyepiece track and a motor; the observation eyepiece can move along the eyepiece track driven by the motor; the eyepiece track is mounted on the movable plate; and the movable plate can carry the observation eyepiece, the motor and the eyepiece track along with it.
Compared with the prior art, dividing the image into regions makes the display much closer to what the human eye actually sees and strengthens the immersion of virtual reality. The division into a precise setting region and a general setting region, with a different depth-of-field scheme for each, markedly improves both the efficiency and the accuracy of the depth-of-field setting. Two setting methods, laser-based and computation-based, are provided, making the setting more convenient. The invention simulates the formation of the depth of field by computing the corresponding positions on the graduated scale from the depth-of-field relation and firing the lasers at those positions, so the depth position can be confirmed intuitively. Observation through the eyepieces attached to the lasers yields the display abscissa and ordinate corresponding to the image depth position to be adjusted, giving a simple and convenient way of setting the depth of field.
The calibration position lets the processing unit judge whether the image seen by the observation eyepiece meets the depth-of-field requirement. Sight-line directions are calculated from the depth-of-field relation, and from them the corresponding positions on the graduated scale. When the theoretical and actual light points disagree, the actual light point can be corrected directly against the theoretical position, guaranteeing that the depth of field meets the preset requirement. The combination of test unit, observation unit, image unit and processing unit solves the depth-of-field verification problem simply and effectively. Driving the observation unit along the eyepiece track by motor allows observation from multiple angles and makes it easy to set up multiple observation points.
Description of the drawings
The invention is further described below with reference to the accompanying drawings and embodiments, in which:
Fig. 1 is a block diagram of a first embodiment of the device for laser-based depth-of-field setting and optimized display according to the invention;
Fig. 2 is a block diagram of the test unit of the first embodiment;
Fig. 3 is a schematic view of the first embodiment of the device;
Fig. 4 is a schematic side view of the first embodiment of the device;
Fig. 5 is a schematic diagram of the depth-of-field display principle of a virtual reality helmet;
Fig. 6 is a structural diagram of a second embodiment of the device;
Fig. 7 is a schematic diagram of the laser illumination in the second embodiment;
Fig. 8 is a schematic diagram of the light shield in the second embodiment;
Fig. 9 is a schematic diagram of the laser-based depth-of-field setting and optimized display principle according to the invention.
Detailed description of the embodiments
To remedy the inability of current virtual reality devices to optimize the depth of field, the present invention provides a method and device for laser-based depth-of-field setting and optimized display for a virtual reality helmet.
For a clearer understanding of the technical features, objects and effects of the invention, specific embodiments are now described in detail with reference to the accompanying drawings.
Referring to Figs. 1-2, the device for laser-based depth-of-field setting and optimized display according to the invention comprises a test unit 1, an observation unit 2, an image unit 3 and a processing unit 4. The test unit 1 includes a lens under test 12 and a fixing structure 14, the lens under test 12 being detachably mounted on the fixing structure 14. The image unit 3 is electrically connected to the observation unit 2, and the processing unit 4 to the image unit 3. The observation unit 2 observes the test unit 1 by capturing images: it photographs the test unit 1 and transmits the captured images to the image unit 3 for processing; the image unit 3 processes the images captured by the observation unit 2 and passes the results to the processing unit 4, which processes the data transmitted by the image unit 3.
Figs. 3-4 show a first exemplary embodiment of the device. The display screen 16 is fixed to the fixing structure 14, which carries a lens mount 18 for installing the lens under test 12. The observation unit 2 includes an observation eyepiece 23, an eyepiece track 25, an eyepiece motor 271, a lifting motor 272 and a lifting rod 273. The observation eyepiece 23 can translate along the eyepiece track 25 driven by the eyepiece motor 271, and can also rotate under the same drive to change the viewing angle. The eyepiece 23 is attached to the lifting rod 273 and rises and falls with it; the lifting rod 273 is raised and lowered vertically under the control of the lifting motor 272. In use, the eyepiece motor 271 and lifting motor 272 coordinate translation, rotation and lifting so that the observation eyepiece 23 reaches different observation positions and observes, along a simulated sight-line direction, the light emitted by the display screen 16.
To fit the distortion data initially, the fixing structure 14 is first removed, the lens under test 12 is installed at the lens mount 18, and the fixing structure 14 is then mounted on the base 21. The eyepiece motor 271 is reset to the initial position at one end of the eyepiece track 25; preparation for detection is then complete. After the processing unit 4 receives the command to start detection, the eyepiece motor 271 and lifting motor 272 drive the observation eyepiece 23 to the first observation point, and the processing unit 4 orders the display screen 16 to show detection information. First, the display screen 16 displays a vertical light line column by column, in units of pixel columns, from its first end to its second end. The first and second ends are opposite and may be designated as needed; here we take the left end of the display screen 16, seen from the observation unit 2 toward the fixed test unit 1, as the first end and the right end as the second end. When the image unit 3 detects that the displayed information, after distortion, reaches the calibration position of the observation unit 2, it passes this information to the processing unit 4, which records the current position of the observation unit 2 and the abscissa of the light line on the display screen 16.
The observation unit 2 then moves to the next observation point, the processing unit 4 orders the test unit 1 to display detection information, and the detection process is repeated. The more observation points are set, the finer the lens measurement and the better the data fitting. When all observation points have been measured, the processing unit 4 collects all the correspondences and fits them to the distortion functions stored in its database. If the processing unit 4 successfully fits one or several of these distortion functions, it records and stores the fitting result; if the measured correspondences cannot be fitted to any stored distortion function, the processing unit 4 stores the correspondences as a point-wise function.
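The fit-or-fallback logic of the processing unit 4 can be illustrated with a one-parameter radial model. The patent does not disclose the stored distortion functions, so the model `observed = r * (1 + k1 * r**2)`, the closed-form least-squares solution for `k1`, and the residual threshold are all assumptions; the point-table fallback mirrors the "point-wise function" storage described above.

```python
def fit_distortion(samples, max_residual=1e-3):
    """Try to fit measured (screen_r, observed_r) pairs with a
    one-parameter radial model observed = r * (1 + k1 * r**2) by least
    squares; fall back to storing the raw point table when the residual
    exceeds the threshold (the 'point-wise function' fallback)."""
    num = sum(r**3 * (y - r) for r, y in samples)
    den = sum(r**6 for r, _ in samples)
    k1 = num / den                      # closed-form least-squares solution
    resid = max(abs(y - r * (1 + k1 * r**2)) for r, y in samples)
    if resid <= max_residual:
        return ("model", k1)
    return ("table", dict(samples))

# synthetic measurements generated from k1 = 0.2 are recovered exactly
pts = [(r / 10, (r / 10) * (1 + 0.2 * (r / 10) ** 2)) for r in range(1, 11)]
kind, k1 = fit_distortion(pts)
```

Measurements that no stored model explains (e.g. wildly inconsistent pairs) fall through to the table branch, just as the text describes.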
Fig. 5 illustrates the depth-of-field principle of a virtual reality helmet. As shown, forming a visual image requires the two eyes to cooperate. In Fig. 5, light emitted by the display screen 16 is refracted by the optical lenses and reaches the left and right eyes respectively, so that the eyes perceive an image at A; the corresponding light points on the display screen 16 are A1 and A2. This is what creates the depth-of-field effect. To optimize the depth of field, the helmet's distortion parameters can first be measured, using the same measuring method as for the optical lens in the first embodiment; the distortion function fitted from these measurements establishes the correspondence between the viewing angle of the observation unit 2 and the light point on the display screen 16, i.e. between the human sight line and the on-screen light point.
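The "depth-of-field relation" behind A, A1 and A2 is plain vergence geometry: the nearer the perceived point A, the more the two sight lines must converge. A toy computation (metres and radians, with an illustrative interpupillary distance):

```python
import math

def sight_angles(depth, ipd=0.064, point_x=0.0):
    """Convergence angles (radians, positive toward the nose) of the left
    and right eyes for a point perceived at `depth` metres in front of
    the viewer -- the depth-of-field relation used in S102."""
    left = math.atan((point_x + ipd / 2) / depth)
    right = math.atan((ipd / 2 - point_x) / depth)
    return left, right

near = sight_angles(0.5)
far = sight_angles(2.0)
# a nearer point A demands stronger convergence of both sight lines
assert near[0] > far[0] and near[1] > far[1]
```

These angles are exactly what the computation-based setting feeds into the inverted distortion function to place A1 and A2 on the screen.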
Figs. 6-9 show a second embodiment of the invention, directed mainly at optimizing the displayed depth of field of a virtual reality helmet. This second embodiment of the device comprises the virtual reality helmet under test 13 and a fixing structure 14 on which the helmet 13 is detachably mounted. The fixing structure 14 includes a clamping mechanism 142, a limiting mechanism 141, a graduated scale 144 and a base plate 143. The clamping mechanism 142 contains a torsion spring (not shown) and can be opened; once the helmet 13 is inserted, the torsion spring closes the clamping mechanism 142 and holds the helmet 13 in place. The limiting mechanism 141 precisely constrains the position of the helmet 13, preventing it from sitting too far forward or backward and spoiling the optimization result. The limiting mechanism 141, graduated scale 144 and clamping mechanism 142 are fixed to the base plate 143. The observation unit 2 comprises two observation assemblies, which observe the distorted images for the left and right eyes respectively. The observation unit 2 includes observation eyepieces 23, lasers 24, an eyepiece track 25, motors 27 and light shields 29; each observation eyepiece 23 can rotate along the eyepiece track 25, driven by its motor 27, to change the viewing angle. In use, the motors 27 rotate about the virtual left observation point 26 and right observation point 28, bringing the eyepieces 23 to different observation positions to observe, along simulated sight-line directions, the light emitted by the helmet under test 13. Each laser 24 can fire a beam that forms a light spot on the graduated scale 144. Below each laser 24 is a support rod 241 that can raise the laser vertically, preventing its beam from being blocked by the helmet 13 and failing to reach the graduated scale 144. Fig. 8 shows an exemplary light shield 29, pierced by a slit 291 about 1 mm in diameter; the slit has a certain depth to guarantee a thin-ray imaging condition, so that the observation eyepiece 23 accurately observes the light arriving from the corresponding direction while light from other directions cannot affect the observation. The light shield 29 is detachably mounted on the observation eyepiece 23.
Either computation-based setting or laser-based setting may be used to set the depth of field. For computation-based setting, the distortion parameters of the virtual reality helmet are measured before the depth-of-field display is set, and the distortion function fitted from these measurements establishes the correspondence between the viewing angle of the observation unit 2 and the light point on the display screen 16, i.e. between the human sight line and the on-screen light point. The angles of the left and right sight lines are then calculated from the depth-of-field data, and the corresponding light-spot positions on the display screen 16 are obtained from the distortion function. Iterating this process sets the display for every depth position in the image to be displayed.
Computation-based setting turns the depth-of-field display into a mathematical calculation and offers a simple method whose advantage is that the display data for the depth of field are obtained quickly. Mathematical calculation, however, inevitably carries error and may fail to meet the requirements of high-definition, accurate depth-of-field display; nor does it let one see the effect of the setting directly. For more accurate depth-of-field display, the laser-based scheme can be used.
When performing laser setting, we calculate the angles of the left and right lines of sight from the depth-of-field data, and then calculate the scale positions D1 and D2 corresponding to those angles. Alternatively, a virtual line-of-sight direction can be simulated, and the positions D1 and D2 determined from the intersections of the line-of-sight extensions with the graduated scale 144. The two observation eyepieces 23 are adjusted so that the lasers emitted by the laser devices 24 above them strike the positions D1 and D2 on the graduated scale 144 respectively; the intersection point E of the two laser beams then lies on the same vertical line as the theoretical depth-of-field position. The processing unit 4 orders the display screen 16 to display detection information. First, the display screen 16 displays vertical light column by column, in units of pixel columns, from its first end to its second end. The first end and the second end are opposite to each other and may be designated as needed; in general, viewed from the observation unit 2 toward the fixed test unit 1, we take the left end of the display screen 16 as the first end and the right end as the second end. When the image unit 3 detects that the display information of the display screen 16, after distortion, reaches the calibration position of the observation unit 2, the image unit 3 transmits this information to the processing unit 4, and the processing unit 4 records the abscissa of the light on the display screen 16 at that moment; this is the abscissa at which the display screen 16 correctly displays the depth of field. Since during display the display screen 16 shows separate images on its left and right halves, corresponding to the two images seen by the left and right eyes, there are two such abscissas for the same depth of field, one for each observation eyepiece 23.
After the abscissas for the depth of field have been measured, the display screen 16 displays horizontal light row by row, in units of pixel rows, from its upper end to its lower end. When the image unit 3 detects that the display information of the display screen 16, after distortion, reaches the calibration position of the observation unit 2, the image unit 3 transmits this information to the processing unit 4, and the processing unit 4 records the ordinate of the light on the display screen 16 at that moment; this is the ordinate at which the display screen 16 correctly displays the depth of field. Likewise, there are two ordinates for the same depth of field. The display abscissa and ordinate corresponding to the depth of field are thereby determined. The observation unit 2 then moves to the next observation point, the processing unit 4 orders the test unit 1 to display detection information, and the above detection process is repeated until all depths of field that need to be measured have been measured. The calibration position of the observation unit 2 may be designated as needed; for convenience of measurement it is generally placed at the center of the image captured by the observation unit 2, and a target zone of a certain width may be set around that position, the display information of the display screen 16 being considered to have reached the calibration position after distortion once the light-spot image falls within the target zone.
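As a rough geometric sketch of the passage above (not part of the patent disclosure), place the two eyes at (±ipd/2, 0) in a top-down view, both looking at a point at depth `depth` on the midline; the marks D1 and D2 are where the two gaze lines cross a scale placed at distance `scale_dist` in front of the eye baseline, and the two laser beams aimed through those marks intersect at E, which by construction lies on the same vertical as the theoretical depth point. All names, the coordinate convention, and the default interpupillary distance are assumptions.

```python
def scale_marks(depth, scale_dist, ipd=0.064):
    """Positions D1, D2 where the left/right gaze lines cross a scale
    at distance scale_dist in front of the eye baseline."""
    s = ipd / 2.0
    t = scale_dist / depth      # ray parameter where the line meets the scale
    d1 = -s + s * t             # left-eye ray from (-s, 0) toward (0, depth)
    d2 = s - s * t              # right-eye ray from (+s, 0) toward (0, depth)
    return d1, d2

def laser_intersection(depth, scale_dist, ipd=0.064):
    """Intersection E of the two rays through (eye, mark); by construction
    it recovers the theoretical depth point (0, depth)."""
    d1, d2 = scale_marks(depth, scale_dist, ipd)
    s = ipd / 2.0
    # ray 1: (-s, 0) + u * (d1 + s, scale_dist)
    # ray 2: (+s, 0) + v * (d2 - s, scale_dist); equal y forces u == v
    u = 2 * s / ((d1 + s) - (d2 - s))
    return (-s + u * (d1 + s), u * scale_dist)
```

For example, with the scale halfway to a 2 m depth point, D1 and D2 sit symmetrically at ∓ipd/4 and the beams intersect on the midline at the full 2 m depth.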
The human eye is selective when viewing a displayed image: when the eyes adjust their focus to watch a certain object, other images at depths different from that object become relatively blurred. This is a behavior that humans formed during evolution. Therefore, when displaying an image, if the entire picture is given its depth-of-field setting by the laser method alone, the whole picture becomes uniformly sharp, which produces an artificial sensation and harms the immersion of virtual reality. For this reason, when setting the depth of field, we deliberately blur the images at some visual edge positions to form a more realistic image scene. We perform this processing by combining laser setting with calculation setting. First, when an image needs its depth of field set, we partition the image into an accurate setting area and a general setting area. The criterion for the division is mainly the main display object together with the display objects close to it in depth: since the main display object usually receives most of the observer's attention, the image region it covers during display is the accurate setting area. Because of the depth-of-field relation, display objects at depths close to the main display object also appear relatively sharp, so they likewise belong to the accurate setting area. To meet more refined display requirements, accurate setting areas of different levels can be defined, which developers may configure according to their needs. After the accurate setting area and the general setting area have been marked off, the depth of field of the accurate setting area is set by the laser method, while that of the general setting area is set by the calculation method. This not only saves time but also gives the depth-of-field display of the accurate setting area a higher display quality. After the depth of field has been set, the image is blurred; the degree of blur can be set according to the level of the accurate setting area. Once the blur processing is completed, the processing of the image is finished.
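Purely as an illustrative sketch of the region division and blur steps above (not part of the patent disclosure), the accurate setting area can be approximated as the set of pixels whose depth lies within a tolerance of the main display object's depth, with a simple mean blur applied only to the general area. The function names, the use of a depth map, the tolerance value, and the 3x3 mean kernel are all assumptions; a real implementation could use any blur.

```python
def split_regions(depth_map, main_depth, tol=0.2):
    """Mask of 'accurate' pixels: those whose depth is within tol of the
    main display object's depth; everything else is the 'general' area."""
    return [[abs(d - main_depth) <= tol for d in row] for row in depth_map]

def blur_general(image, accurate_mask):
    """3x3 mean blur applied only where the mask is False (general area);
    accurate-area pixels are kept as-is. Border pixels average only the
    neighbours that exist."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(h):
        for x in range(w):
            if accurate_mask[y][x]:
                continue
            acc, n = 0.0, 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += image[yy][xx]
                        n += 1
            out[y][x] = acc / n
    return out
```

Graded accurate setting areas, as the text suggests, could be modelled by using several tolerance bands with progressively stronger blur kernels.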
Compared with the prior art, the present invention, by dividing the image into regions, makes the displayed image closer to the scene actually seen by the human eye and enhances the immersion of virtual reality. Through the division into an accurate setting area and a general setting area, and by applying different depth-of-field setting schemes to the different regions, the efficiency and accuracy of depth-of-field setting are effectively improved. Two depth-of-field setting methods, laser setting and calculation setting, are provided, making the setting of the depth of field more convenient. The invention simulates the formation of the depth of field by emitting laser light from the laser device 24 onto the position of the graduated scale 144 computed from the depth-of-field relation, so that the depth-of-field position can be confirmed intuitively. By observation through the observation eyepiece 23 connected with the laser device 24, the display abscissa and display ordinate corresponding to the image depth position to be adjusted are determined, which provides a convenient and practicable way of setting the depth of field. By setting the calibration position, the processing unit can judge the image observed by the observation eyepiece 23 and decide whether it meets the depth-of-field requirement. The line-of-sight direction is calculated from the depth-of-field relation, and the corresponding position on the graduated scale 144 is calculated from it.
When there is an error between the theoretical light spot and the actual light spot, the position of the actual light spot can be corrected directly using the theoretical light-spot position, ensuring that the depth of field meets the preset requirement. The combination of the test unit 1, the observation unit 2, the image unit 3 and the processing unit 4 solves the problem of depth-of-field verification effectively and simply. The motor 27 drives the observation unit 2 along the eyepiece track 25, which facilitates observation from multiple angles and the setting of multiple observation points.
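As a minimal sketch of the spot-correction step just described (not part of the patent disclosure; the function name and tolerance are assumptions), the measured spot position is snapped to the theoretical one whenever the deviation exceeds a tolerance:

```python
def correct_spot(theoretical, actual, tol=0.002):
    """If the measured spot deviates from the theoretical position by more
    than tol (metres, an assumed unit), adopt the theoretical position;
    otherwise keep the measured value."""
    if abs(actual - theoretical) > tol:
        return theoretical
    return actual
```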
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the invention is not limited to the specific embodiments described, which are merely illustrative rather than restrictive. Under the inspiration of the present invention, those of ordinary skill in the art can devise many further forms without departing from the inventive concept and the scope of the claimed protection, and these all fall within the protection of the present invention.

Claims (10)

1. A method for optimizing display by laser setting of the depth of field of a virtual reality helmet, characterized in that it comprises the following steps:
S10: dividing an image to be set into an accurate setting area and a general setting area;
S20: setting the depth of field of the image in the accurate setting area by the laser setting method, and setting the depth of field of the image in the general setting area by the calculation setting method;
S30: performing blur processing on the image.
2. The method for optimizing display by laser setting of the depth of field of a virtual reality helmet according to claim 1, characterized in that the calculation setting comprises the following steps:
S100: before the calculation setting, first measuring the distortion parameters of the virtual reality helmet;
S101: storing the distortion parameters of the virtual reality helmet to be set in the processing unit;
S102: calculating the angular position of the corresponding line of sight according to the depth-of-field relation;
S103: back-computing the position of the light spot on the screen from the distortion parameters of the virtual reality helmet under test and the angular position of the line of sight.
3. The method for optimizing display by laser setting of the depth of field of a virtual reality helmet according to claim 1, characterized in that the laser setting mainly comprises the following steps:
S201: determining the image depth position to be adjusted, and calculating the corresponding scale positions D1 and D2 on the graduated scale according to the depth-of-field relation;
S202: adjusting the observation eyepieces so that the lasers emitted by the laser devices strike the positions D1 and D2 on the graduated scale respectively;
S203: the display screen displaying test image information according to a preset display rule;
S204: the processing unit processing the test image information observed by the observation unit to obtain the display abscissa and display ordinate corresponding to the image depth position to be adjusted.
4. The method for optimizing display by laser setting of the depth of field of a virtual reality helmet according to claim 3, characterized in that the display screen displays vertical light column by column, in units of pixel columns, from a first end of the display screen to a second end; when the image unit detects that the display information of the display screen, after distortion, reaches the calibration position of the observation unit, the processing unit records the abscissa of the light on the display screen at that moment, this abscissa being the abscissa at which the display screen correctly displays the depth of field.
5. The method for optimizing display by laser setting of the depth of field of a virtual reality helmet according to claim 4, characterized in that the calibration position is the center of the image captured by the observation unit.
6. The method for optimizing display by laser setting of the depth of field of a virtual reality helmet according to claim 5, characterized in that, after the image depth position to be adjusted is determined, the line-of-sight angle is calculated from the depth-of-field position, and the corresponding scale positions D1 and D2 on the graduated scale are calculated from the line-of-sight angle.
7. The method for optimizing display by laser setting of the depth of field of a virtual reality helmet according to claim 6, characterized in that, after the image depth position to be adjusted is determined, a virtual line-of-sight direction is simulated, and the positions D1 and D2 are determined from the intersection of the line-of-sight extensions with the graduated scale.
8. A device for optimizing display by laser setting of the depth of field of a virtual reality helmet, characterized by comprising a test unit, an observation unit, an image unit and a processing unit; the test unit comprises a virtual reality helmet to be set and a fixing structure; the virtual reality helmet to be set comprises a display screen; the fixing structure comprises a clamping tool, a limiting mechanism and a graduated scale; the clamping tool can be opened to receive the virtual reality helmet; the observation unit comprises observation eyepieces, a laser device being arranged above each observation eyepiece, and the laser emitted by the laser device being able to form a light spot on the graduated scale.
9. The device for laser setting of the depth of field of a virtual reality helmet according to claim 8, characterized in that the observation unit further comprises an eyepiece track and a motor, and the observation eyepieces can move along the eyepiece track driven by the motor.
10. The device for laser setting of the depth of field of a virtual reality helmet according to claim 8, characterized in that the observation unit further comprises a moving plate and a visor; the observation eyepieces can move along the eyepiece track driven by the motor; the eyepiece track is arranged on the moving plate, and the moving plate can drive the observation eyepieces, the motor and the eyepiece track to move together.
CN201710543920.1A 2016-11-30 2017-07-05 Virtual implementing helmet depth of field laser sets the method and device of optimization display Pending CN108121068A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2016213083149 2016-11-30
CN201621308314 2016-11-30

Publications (1)

Publication Number Publication Date
CN108121068A true CN108121068A (en) 2018-06-05

Family

ID=60100336

Family Applications (35)

Application Number Title Priority Date Filing Date
CN201710544195.XA Pending CN107329266A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet depth of field region is set
CN201710544211.5A Pending CN107300775A (en) 2016-11-30 2017-07-05 The depth of field based on image scale sets the method and device of optimization
CN201710544198.3A Pending CN107544149A (en) 2016-11-30 2017-07-05 Region depth of field method to set up and device based on image scale
CN201710543920.1A Pending CN108121068A (en) 2016-11-30 2017-07-05 Virtual implementing helmet depth of field laser sets the method and device of optimization display
CN201710543944.7A Pending CN107544147A (en) 2016-11-30 2017-07-05 The method and device that depth of field laser based on image scale is set
CN201710544204.5A Withdrawn CN107464221A (en) 2016-11-30 2017-07-05 Based on the method and device of virtual reality eyeglass distortion checking and adjustment corresponding to scale
CN201710543925.4A Pending CN107329263A (en) 2016-11-30 2017-07-05 The method and device that the virtual implementing helmet depth of field is shown
CN201710543937.7A Pending CN107490861A (en) 2016-11-30 2017-07-05 The method and device of virtual implementing helmet depth of field optimization display
CN201710544194.5A Pending CN107329265A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet interpupillary distance optimizes with depth of field laser
CN201710543919.9A Pending CN107422479A (en) 2016-11-30 2017-07-05 Based on virtual implementing helmet depth of field method to set up and device corresponding to scale
CN201710544205.XA Pending CN107315252A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet depth of field region laser is set
CN201710543941.3A Pending CN107390364A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet depth of field laser is set
CN201710544210.0A Pending CN107544151A (en) 2016-11-30 2017-07-05 Based on virtual implementing helmet depth of field zone approach and device corresponding to scale
CN201710544203.0A Pending CN107340595A (en) 2016-11-30 2017-07-05 The method and device set based on virtual implementing helmet depth of field region laser corresponding to scale
CN201710543922.0A Pending CN107462400A (en) 2016-11-30 2017-07-05 The method and device detected based on virtual reality eyeglass dispersion corresponding to scale
CN201710544208.3A Pending CN107290854A (en) 2016-11-30 2017-07-05 Virtual implementing helmet interpupillary distance optimizes the method and device of display with the depth of field
CN201710543865.6A Pending CN107702894A (en) 2016-11-30 2017-07-05 The method and device of virtual reality eyeglass dispersion detection
CN201710544200.7A Pending CN107479188A (en) 2016-11-30 2017-07-05 The method and device of virtual implementing helmet depth of field optimization
CN201710543942.8A Pending CN107329264A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet interpupillary distance is set with the depth of field
CN201710544213.4A Withdrawn CN107478412A (en) 2016-11-30 2017-07-05 Virtual implementing helmet distortion checking and the method and device of adjustment
CN201710543938.1A Pending CN107357038A (en) 2016-11-30 2017-07-05 Virtual implementing helmet interpupillary distance and the method and device of depth of field adjustment
CN201710544197.9A Pending CN107505708A (en) 2016-11-30 2017-07-05 Virtual implementing helmet depth of field method to set up and device based on image scale
CN201710543918.4A Pending CN107687936A (en) 2016-11-30 2017-07-05 The method and device detected based on virtual implementing helmet dispersion corresponding to scale
CN201710543936.2A Pending CN107462991A (en) 2016-11-30 2017-07-05 The method and device that the virtual implementing helmet depth of field is set
CN201710543923.5A Pending CN107688387A (en) 2016-11-30 2017-07-05 The method and device of virtual implementing helmet dispersion detection
CN201710544192.6A Pending CN107544148A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet depth of field laser based on image scale is set
CN201710544212.XA Pending CN107300776A (en) 2016-11-30 2017-07-05 Interpupillary distance depth of field method to set up and device based on image scale
CN201710544196.4A Pending CN107315251A (en) 2016-11-30 2017-07-05 Based on the corresponding virtual implementing helmet interpupillary distance of scale and depth of field method to set up and device
CN201710543921.6A Pending CN107300774A (en) 2016-11-30 2017-07-05 Method and device based on the corresponding virtual implementing helmet distortion checking of scale and adjustment
CN201710544189.4A Withdrawn CN107357039A (en) 2016-11-30 2017-07-05 Virtual reality eyeglass distortion checking and the method and device of adjustment
CN201710543939.6A Pending CN107526167A (en) 2016-11-30 2017-07-05 The method and device optimized based on depth of field laser corresponding to scale
CN201710543924.XA Pending CN107357037A (en) 2016-11-30 2017-07-05 The method and device of virtual implementing helmet laser assisted depth of field optimization
CN201710544199.8A Pending CN107544150A (en) 2016-11-30 2017-07-05 The method and device set based on virtual implementing helmet depth of field laser corresponding to scale
CN201710544201.1A Pending CN107291246A (en) 2016-11-30 2017-07-05 The method and device of virtual implementing helmet depth of field measurement based on image scale
CN201710544202.6A Pending CN107402448A (en) 2016-11-30 2017-07-05 The method and device that virtual implementing helmet interpupillary distance is set with depth of field laser

Country Status (1)

Country Link
CN (35) CN107329266A (en)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108008535A (en) * 2017-11-17 2018-05-08 国网山东省电力公司 A kind of augmented reality equipment
CN107977076B (en) * 2017-11-17 2018-11-27 国网山东省电力公司泰安供电公司 A kind of wearable virtual reality device
CN107942517B (en) * 2018-01-02 2020-03-06 京东方科技集团股份有限公司 VR head-mounted display device and display method thereof
CN108303798B (en) * 2018-01-15 2020-10-09 海信视像科技股份有限公司 Virtual reality helmet, virtual reality helmet interpupillary distance adjusting method and device
CN108426702B (en) * 2018-01-19 2020-06-02 华勤通讯技术有限公司 Dispersion measurement device and method of augmented reality equipment
CN108399606B (en) * 2018-02-02 2020-06-26 北京奇艺世纪科技有限公司 Image adjusting method and device
CN108510549B (en) 2018-03-27 2022-01-04 京东方科技集团股份有限公司 Distortion parameter measuring method, device and system of virtual reality equipment
CN109186957B (en) * 2018-09-17 2024-05-10 浙江晶正光电科技有限公司 High-precision automatic detection equipment for diffusion angle of laser diffusion sheet
CN109557669B (en) * 2018-11-26 2021-10-12 歌尔光学科技有限公司 Method for determining image drift amount of head-mounted display equipment and head-mounted display equipment
CN112313699A (en) * 2019-05-24 2021-02-02 京东方科技集团股份有限公司 Method and device for controlling virtual reality display equipment
CN110320009A (en) * 2019-06-25 2019-10-11 歌尔股份有限公司 Optical property detection method and detection device
CN113822104B (en) * 2020-07-07 2023-11-03 湖北亿立能科技股份有限公司 Artificial intelligence surface of water detecting system based on virtual scale of many candidates
CN113768240A (en) * 2021-08-30 2021-12-10 航宇救生装备有限公司 Method for adjusting imaging position of display protection helmet
CN114089508B (en) * 2022-01-19 2022-05-03 茂莱(南京)仪器有限公司 Wide-angle projection lens for detecting optical waveguide AR lens
DE102022207774A1 (en) 2022-07-28 2024-02-08 Robert Bosch Gesellschaft mit beschränkter Haftung Method for an automated calibration of a virtual retinal display for data glasses, calibration device and virtual retinal display
CN117214025B (en) * 2023-11-08 2024-01-12 广东德鑫体育产业有限公司 Helmet lens detection device

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5619373A (en) * 1995-06-07 1997-04-08 Hasbro, Inc. Optical system for a head mounted display
CN102967473B (en) * 2012-11-30 2015-04-29 奇瑞汽车股份有限公司 Driver front-view measuring device
US10228562B2 (en) * 2014-02-21 2019-03-12 Sony Interactive Entertainment Inc. Realtime lens aberration correction from eye tracking
US10262400B2 (en) * 2014-10-31 2019-04-16 Huawei Technologies Co., Ltd. Image processing method and device using reprojection error values
CN104808342B (en) * 2015-04-30 2017-12-12 杭州映墨科技有限公司 The optical lens structure of the wearable virtual implementing helmet of three-dimensional scenic is presented
US10271042B2 (en) * 2015-05-29 2019-04-23 Seeing Machines Limited Calibration of a head mounted eye tracking system
CN105979243A (en) * 2015-12-01 2016-09-28 乐视致新电子科技(天津)有限公司 Processing method and device for displaying stereo images
CN105979252A (en) * 2015-12-03 2016-09-28 乐视致新电子科技(天津)有限公司 Test method and device
CN105867606A (en) * 2015-12-15 2016-08-17 乐视致新电子科技(天津)有限公司 Image acquisition method and apparatus in virtual reality helmet, and virtual reality helmet
CN105869142A (en) * 2015-12-21 2016-08-17 乐视致新电子科技(天津)有限公司 Method and device for testing imaging distortion of virtual reality helmets
CN105787980B (en) * 2016-03-17 2018-12-25 北京牡丹视源电子有限责任公司 A kind of detection virtual reality shows the method and system of equipment field angle
CN106028013A (en) * 2016-04-28 2016-10-12 努比亚技术有限公司 Wearable device, display device, and display output adjusting method
CN105791789B (en) * 2016-04-28 2019-03-19 努比亚技术有限公司 The method of helmet, display equipment and adjust automatically display output
CN106441212B (en) * 2016-09-18 2020-07-28 京东方科技集团股份有限公司 Device and method for detecting field angle of optical instrument
CN106527733A (en) * 2016-11-30 2017-03-22 深圳市虚拟现实技术有限公司 Virtual-reality helmet distortion fitting-detecting method and device
CN106651954A (en) * 2016-12-27 2017-05-10 天津科技大学 Laser simulation method and device for space sight line benchmark

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
叶玉堂 et al.: "Optics Course" (《光学教程》), Tsinghua University Press, 31 August 2005 *
吴启海: "Complete Handbook of Interchangeable Lenses for Digital Cameras" (《数码照相机可换镜头使用完全手册》), Zhejiang Photography Press, 31 January 2015 *
高鸿锦 et al.: "New Display Technologies, Vol. 2" (《新型显示技术 下》), Beijing University of Posts and Telecommunications Press, 31 August 2014 *

Also Published As

Publication number Publication date
CN107544149A (en) 2018-01-05
CN107300774A (en) 2017-10-27
CN107357038A (en) 2017-11-17
CN107478412A (en) 2017-12-15
CN107702894A (en) 2018-02-16
CN107479188A (en) 2017-12-15
CN107329265A (en) 2017-11-07
CN107340595A (en) 2017-11-10
CN107687936A (en) 2018-02-13
CN107544148A (en) 2018-01-05
CN107291246A (en) 2017-10-24
CN107544147A (en) 2018-01-05
CN107462991A (en) 2017-12-12
CN107290854A (en) 2017-10-24
CN107402448A (en) 2017-11-28
CN107357039A (en) 2017-11-17
CN107300775A (en) 2017-10-27
CN107505708A (en) 2017-12-22
CN107357037A (en) 2017-11-17
CN107544151A (en) 2018-01-05
CN107390364A (en) 2017-11-24
CN107329263A (en) 2017-11-07
CN107462400A (en) 2017-12-12
CN107544150A (en) 2018-01-05
CN107464221A (en) 2017-12-12
CN107300776A (en) 2017-10-27
CN107422479A (en) 2017-12-01
CN107315251A (en) 2017-11-03
CN107329266A (en) 2017-11-07
CN107688387A (en) 2018-02-13
CN107329264A (en) 2017-11-07
CN107526167A (en) 2017-12-29
CN107315252A (en) 2017-11-03
CN107490861A (en) 2017-12-19

Similar Documents

Publication Publication Date Title
CN108121068A (en) Virtual implementing helmet depth of field laser sets the method and device of optimization display
Rolland et al. Towards quantifying depth and size perception in virtual environments
KR101812525B1 (en) Method of determining at least one refraction characteristic of an ophthalmic lens
CN103782131B (en) Can the measuring equipment of Touchless manipulation and the control method for this measuring equipment
US20090040460A1 (en) Method and Device for Determining the Eye's Rotation Center
CN106646882A (en) Head-mounted display device and adjusting parameter determining method thereof
US10993612B1 (en) Systems and methods for visual field testing in head-mounted displays
US20150339511A1 (en) Method for helping determine the vision parameters of a subject
CN106441822A (en) Virtual reality headset distortion detection method and device
CN106644404A (en) Virtual reality helmet distortion complete machine detection method and device
US20220125297A1 (en) Device calibration via a projective transform matrix
CN106527733A (en) Virtual-reality helmet distortion fitting-detecting method and device
US20220125298A1 (en) Active calibration of head-mounted displays
CN117268713A (en) Visual angle measuring method and device for display

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20180605