CN107302694A - Method and device for presenting a scene through a virtual reality device, and virtual reality device - Google Patents

Method and device for presenting a scene through a virtual reality device, and virtual reality device

Info

Publication number
CN107302694A
CN107302694A
Authority
CN
China
Prior art keywords
camera
left eye
right eye
image
offset distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710364277.6A
Other languages
Chinese (zh)
Other versions
CN107302694B (en)
Inventor
王鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Goertek Techology Co Ltd
Original Assignee
Goertek Techology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Goertek Techology Co Ltd
Priority to CN201710364277.6A
Publication of CN107302694A
Priority to PCT/CN2017/112383 (published as WO2018214431A1)
Application granted
Publication of CN107302694B
Legal status: Active
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/239 Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H04N13/30 Image reproducers
    • H04N13/332 Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/344 Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a method and device for presenting a scene through a virtual reality device, and a virtual reality device. The method includes: separately obtaining imaging parameters of a left eye imaging device and imaging parameters of a right eye imaging device; obtaining a first offset distance of the left eye imaging device according to the imaging parameters of the left eye imaging device, and obtaining a second offset distance of the right eye imaging device according to the imaging parameters of the right eye imaging device; adjusting the left eye lens, left eye camera, right eye lens and right eye camera according to the first offset distance, the second offset distance and a preset human eye interpupillary distance; and rendering a first image previewed by the left eye camera on the left eye screen while correspondingly rendering a second image simultaneously previewed by the right eye camera on the right eye screen, so that a virtual reality scene is presented to the user. According to the invention, a scene can be presented without introducing a third-party program library, and the sense of reality of the corresponding scene is enhanced. The method is particularly suitable for presenting real scenes of the real world.

Description

Method and device for presenting a scene through a virtual reality device, and virtual reality device
Technical field
The present invention relates to the technical field of virtual reality, and more particularly to a method and device for presenting a scene through a virtual reality device, and to a virtual reality device.
Background art
Virtual reality technology has developed rapidly in recent years. It is not only suitable for presenting virtual scenes to provide users with a nearly real sense of immersion, but can also be used to present real scenes so as to give users a realistic view of the real world. Therefore, virtual reality devices that present scenes based on virtual reality technology, such as virtual reality helmets and virtual reality glasses, are attracting the attention of more and more users.
Current virtual reality devices typically present scenes using a dual-camera configuration in which a left eye camera and a right eye camera are provided. Unlike a single camera, the dual cameras of current virtual reality devices can increase the depth of field of the corresponding scene image when the scene is presented, so that the presented scene image has a stereoscopic effect, which enhances the sense of reality when the user watches the real world.
When presenting a scene with dual cameras, current virtual reality devices need to introduce a third-party open-source program library, for example the FFmpeg library, to apply a fusion algorithm to each frame previewed by the left eye camera and the right eye camera in order to simulate the real world, and the result is then presented to the user by the virtual reality device. However, introducing a third-party open-source program library increases the program size of the virtual reality device and brings the problems of low operating efficiency and high power consumption; correspondingly, the image refresh rate when the scene is presented is affected, so that the sense of reality of the presented scene is poor, which is especially obvious when the user watches the real world through the virtual reality device. In addition, because the dual cameras of some current virtual reality devices are improperly arranged, ghosting or duplicated images appear when the user watches the real world through the virtual reality device, which reduces user comfort and causes visual fatigue.
Therefore, the inventors consider it necessary to improve upon the above problems in the prior art.
Summary of the invention
It is an object of the present invention to provide a new solution for presenting a scene through a virtual reality device.
According to a first aspect of the invention, there is provided a method for presenting a scene through a virtual reality device, implemented on a virtual reality device. The virtual reality device includes a left eye imaging device and a right eye imaging device; the left eye imaging device is provided in sequence with a left eye lens, a left eye camera, a left eye screen and a left eye camera lens, and the right eye imaging device is provided in sequence with a right eye lens, a right eye camera, a right eye screen and a right eye camera lens.
The method includes:
obtaining, respectively, imaging parameters of the left eye imaging device and imaging parameters of the right eye imaging device, the imaging parameters including at least the center vertical distance from the camera lens to the camera in the corresponding imaging device and the refraction angle of the camera lens;
obtaining a first offset distance of the left eye imaging device according to the imaging parameters of the left eye imaging device, and obtaining a second offset distance of the right eye imaging device according to the imaging parameters of the right eye imaging device;
adjusting the left eye lens, the left eye camera, the right eye lens and the right eye camera according to the first offset distance, the second offset distance and a preset human eye interpupillary distance, so that the center horizontal distance between the left eye lens and the right eye lens is not greater than the human eye interpupillary distance, the center of the left eye lens is horizontally offset from the center of the left eye camera by the first offset distance, and the center of the right eye lens is horizontally offset from the center of the right eye camera by the second offset distance; and
rendering a first image previewed by the left eye camera on the left eye screen, and correspondingly rendering a second image simultaneously previewed by the right eye camera on the right eye screen, so that the corresponding scene is presented to the user through the corresponding camera lenses.
Optionally, the step of obtaining the first offset distance and the second offset distance includes:
calculating the first offset distance w1 = h1 × tan β1 according to the center vertical distance h1 from the camera lens of the left eye imaging device to its camera and the refraction angle β1 of that camera lens; and
calculating the second offset distance w2 = h2 × tan β2 according to the center vertical distance h2 from the camera lens of the right eye imaging device to its camera and the refraction angle β2 of that camera lens.
Optionally, the method further includes:
obtaining a first horizontal offset distance based on the center vertical distance from the camera lens of the left eye imaging device to its camera, the first offset distance and the image height of the first image, horizontally shifting the center of the first image based on the first horizontal offset distance to obtain a new image center, and then performing the rendering operation; and
obtaining a second horizontal offset distance based on the center vertical distance from the camera lens of the right eye imaging device to its camera, the second offset distance and the image height of the second image, horizontally shifting the center of the second image based on the second horizontal offset distance to obtain a new image center, and then performing the rendering operation.
Further optionally, the first horizontal offset distance is d1 = (w1 / h1) × (H1 / 2),
where h1 is the center vertical distance from the camera lens of the left eye imaging device to its camera, w1 is the first offset distance and H1 is the image height of the first image;
and the second horizontal offset distance is d2 = (w2 / h2) × (H2 / 2),
where h2 is the center vertical distance from the camera lens of the right eye imaging device to its camera, w2 is the second offset distance and H2 is the image height of the second image.
Optionally, the method further includes:
after determining the center object of the first image based on the image center point of the first image, performing contour detection on the center object to obtain a corresponding center object contour, blurring or defocusing the objects contained in the first image outside the center object contour, and then performing the rendering operation; and
after determining the center object of the second image based on the image center point of the second image, performing contour detection on the center object to obtain a corresponding center object contour, blurring or defocusing the objects contained in the second image outside the center object contour, and then performing the rendering operation.
According to a second aspect of the invention, there is provided a scene display device arranged on the virtual reality device side.
The virtual reality device includes a left eye imaging device and a right eye imaging device; the left eye imaging device is provided in sequence with a left eye lens, a left eye camera, a left eye screen and a left eye camera lens, and the right eye imaging device is provided in sequence with a right eye lens, a right eye camera, a right eye screen and a right eye camera lens.
The scene display device includes:
a parameter acquiring unit, configured to obtain, respectively, imaging parameters of the left eye imaging device and imaging parameters of the right eye imaging device, the imaging parameters including at least the center vertical distance from the camera lens to the camera in the corresponding imaging device and the refraction angle of the camera lens;
an offset distance acquiring unit, configured to obtain a first offset distance of the left eye imaging device according to the imaging parameters of the left eye imaging device, and to obtain a second offset distance of the right eye imaging device according to the imaging parameters of the right eye imaging device;
an element adjustment unit, configured to adjust the left eye lens, the left eye camera, the right eye lens and the right eye camera according to the first offset distance, the second offset distance and a preset human eye interpupillary distance, so that the center horizontal distance between the left eye lens and the right eye lens is not greater than the human eye interpupillary distance, the center of the left eye lens is horizontally offset from the center of the left eye camera by the first offset distance, and the center of the right eye lens is horizontally offset from the center of the right eye camera by the second offset distance; and
an image rendering unit, configured to render a first image previewed by the left eye camera on the left eye screen, and correspondingly render a second image simultaneously previewed by the right eye camera on the right eye screen, so that the corresponding scene is presented to the user through the corresponding camera lenses.
Optionally, the offset distance acquiring unit includes:
a device for calculating the first offset distance w1 = h1 × tan β1 according to the center vertical distance h1 from the camera lens of the left eye imaging device to its camera and the refraction angle β1 of that camera lens; and
a device for calculating the second offset distance w2 = h2 × tan β2 according to the center vertical distance h2 from the camera lens of the right eye imaging device to its camera and the refraction angle β2 of that camera lens.
Optionally, the device further includes an image center offset unit, configured to:
obtain a first horizontal offset distance based on the center vertical distance from the camera lens of the left eye imaging device to its camera, the first offset distance and the image height of the first image, horizontally shift the center of the first image based on the first horizontal offset distance to obtain a new image center, and then perform the rendering operation; and
obtain a second horizontal offset distance based on the center vertical distance from the camera lens of the right eye imaging device to its camera, the second offset distance and the image height of the second image, horizontally shift the center of the second image based on the second horizontal offset distance to obtain a new image center, and then perform the rendering operation.
Optionally, the device further includes a contour processing unit, configured to:
after determining the center object of the first image based on the image center point of the first image, perform contour detection on the center object to obtain a corresponding center object contour, blur or defocus the objects contained in the first image outside the center object contour, and then perform the rendering operation; and
after determining the center object of the second image based on the image center point of the second image, perform contour detection on the center object to obtain a corresponding center object contour, blur or defocus the objects contained in the second image outside the center object contour, and then perform the rendering operation.
According to a third aspect of the invention, there is provided a virtual reality device, including:
a left eye imaging device provided in sequence with a left eye lens, a left eye camera, a left eye screen and a left eye camera lens;
a right eye imaging device provided in sequence with a right eye lens, a right eye camera, a right eye screen and a right eye camera lens; and
any one of the scene display devices provided in the second aspect of the invention.
The inventors of the present invention have found that, in the prior art, there was not yet a method, a device or a virtual reality device for presenting a scene through a virtual reality device that can obtain a realistic presentation of the corresponding scene without introducing a third-party program library to apply a fusion algorithm to each frame previewed by the left and right eye cameras. Therefore, the technical task to be achieved or the technical problem to be solved by the present invention is one that those skilled in the art had never thought of or anticipated, and the present invention is therefore a new technical solution.
Further features of the present invention and advantages thereof will become apparent from the following detailed description of exemplary embodiments of the present invention with reference to the accompanying drawings.
Brief description of the drawings
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
Fig. 1 is a block diagram showing an example of a hardware configuration of an electronic device that can be used to implement embodiments of the invention.
Fig. 2 shows a flowchart of a method for presenting a scene through a virtual reality device according to an embodiment of the invention.
Fig. 3 shows an example schematic diagram of the method for presenting a scene through a virtual reality device according to an embodiment of the invention.
Fig. 4 shows a schematic block diagram of a scene display device according to an embodiment of the invention.
Fig. 5 shows a schematic block diagram of a virtual reality device according to an embodiment of the invention.
Embodiment
Various exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that, unless otherwise specified, the relative arrangement of the components and steps, the numerical expressions and the numerical values set forth in these embodiments do not limit the scope of the invention.
The following description of at least one exemplary embodiment is merely illustrative and is in no way intended to limit the invention, its application or its use.
Techniques, methods and apparatus known to a person of ordinary skill in the relevant art may not be discussed in detail, but where appropriate such techniques, methods and apparatus should be considered part of the specification.
In all of the examples shown and discussed herein, any specific value should be construed as merely exemplary and not as a limitation. Therefore, other examples of the exemplary embodiments may have different values.
It should be noted that similar reference numerals and letters denote similar items in the following figures; therefore, once an item has been defined in one figure, it need not be further discussed in subsequent figures.
<Hardware configuration>
Fig. 1 is a block diagram showing a hardware configuration of an electronic device 1000 that can implement embodiments of the invention.
In one example, the electronic device 1000 may be a virtual reality helmet, virtual reality glasses, or the like. As shown in Fig. 1, the electronic device 1000 may include a processor 1100, a memory 1200, an interface apparatus 1300, a communication apparatus 1400, a display apparatus 1500, an input apparatus 1600, a speaker 1700, a microphone 1800, and the like. The processor 1100 may be, for example, a central processing unit (CPU), a microcontroller (MCU), etc. The memory 1200 includes, for example, a ROM (read-only memory), a RAM (random access memory), and a nonvolatile memory such as a hard disk. The interface apparatus 1300 includes, for example, a USB interface and a headphone interface. The communication apparatus 1400 is capable of, for example, wired or wireless communication, and may specifically include WiFi communication, Bluetooth communication, 2G/3G/4G/5G communication, etc. The display apparatus 1500 is, for example, a liquid crystal display screen, a touch display screen, etc. The input apparatus 1600 may include, for example, a touch screen, a keyboard, somatosensory input, etc. The user can input and output voice information through the speaker 1700 and the microphone 1800.
The electronic device shown in Fig. 1 is merely illustrative and is in no way intended to limit the invention, its application or its use. As applied to embodiments of the invention, the memory 1200 of the electronic device 1000 is used to store instructions, the instructions being used to control the processor 1100 to operate so as to perform any one of the methods for presenting a scene through a virtual reality device provided by the embodiments of the invention. Those skilled in the art should understand that, although a plurality of apparatuses are shown for the electronic device 1000 in Fig. 1, the invention may involve only some of them; for example, the electronic device 1000 may involve only the processor 1100 and the memory 1200. Technicians can design the instructions according to the solution disclosed herein. How the instructions control the processor to operate is well known in the art and will not be described in detail here.
<Embodiment>
<Method>
In this embodiment, a method for presenting a scene through a virtual reality device is provided, implemented on a virtual reality device. The virtual reality device includes a left eye imaging device and a right eye imaging device; the left eye imaging device is provided in sequence with a left eye lens, a left eye camera, a left eye screen and a left eye camera lens, and the right eye imaging device is provided in sequence with a right eye lens, a right eye camera, a right eye screen and a right eye camera lens.
Specifically, the left eye imaging device and the right eye imaging device may be two physically separate entity devices in the virtual reality device, or they may be partially or completely physically integrated together and separated only logically.
In one example, the virtual reality device may be virtual reality glasses or a virtual reality helmet.
As shown in Fig. 2, the method for presenting a scene through a virtual reality device includes:
Step S2100: obtaining, respectively, imaging parameters of the left eye imaging device and imaging parameters of the right eye imaging device, the imaging parameters including at least the center vertical distance from the camera lens to the camera in the corresponding imaging device and the refraction angle of the camera lens.
The imaging parameters may be provided by the producer or manufacturer of the corresponding virtual reality device and stored in advance in a storage area of the virtual reality device, with an interface provided for obtaining them; they may also be supplied as product parameters of the virtual reality device through the product manual, the product's official website, downloadable materials, or the like, which are too numerous to list here.
Step S2200: obtaining a first offset distance of the left eye imaging device according to the imaging parameters of the left eye imaging device, and obtaining a second offset distance of the right eye imaging device according to the imaging parameters of the right eye imaging device.
For example, the imaging parameters of the left eye imaging device obtained in step S2100 include the center vertical distance h1 from the camera lens of the left eye imaging device to its camera and the refraction angle β1 of that camera lens, and the imaging parameters of the right eye imaging device include the center vertical distance h2 from the camera lens of the right eye imaging device to its camera and the refraction angle β2 of that camera lens. Specifically, the step of obtaining the first offset distance and the second offset distance includes:
calculating the first offset distance w1 = h1 × tan β1 according to the center vertical distance h1 from the camera lens of the left eye imaging device to its camera and the refraction angle β1 of that camera lens; and
calculating the second offset distance w2 = h2 × tan β2 according to the center vertical distance h2 from the camera lens of the right eye imaging device to its camera and the refraction angle β2 of that camera lens.
Step S2300: adjusting the left eye lens, the left eye camera, the right eye lens and the right eye camera according to the first offset distance, the second offset distance and a preset human eye interpupillary distance, so that the center horizontal distance between the left eye lens and the right eye lens is not greater than the human eye interpupillary distance, the center of the left eye lens is horizontally offset from the center of the left eye camera by the first offset distance, and the center of the right eye lens is horizontally offset from the center of the right eye camera by the second offset distance.
Through the above adjustment, the left eye lens is horizontally offset from the center of the left eye camera by the first offset distance and the right eye lens is horizontally offset from the center of the right eye camera by the second offset distance, which prevents images from being duplicated or overlapping when the scene is presented through the virtual reality device, enhances the depth-of-field effect of the images subsequently previewed by the left and right cameras, and avoids reduced comfort or even visual fatigue for the user. Making the center horizontal distance between the left eye lens and the right eye lens not greater than the human eye interpupillary distance avoids the visual fatigue that would result from the distance between the left and right eye lenses exceeding the interpupillary distance, further improving comfort of use.
For example, suppose the center vertical distance from the left eye camera lens to the left eye camera is h1 and the refraction angle of the left eye camera lens is β1, the center vertical distance from the right eye camera lens to the right eye camera is h2 and the refraction angle of the right eye camera lens is β2, and the first and second offset distances obtained in step S2200 are w1 and w2 respectively. After the adjustment of step S2300 is performed, the corresponding components are arranged as shown in Fig. 3, in which the center horizontal distance between the left eye lens and the right eye lens equals the human eye interpupillary distance; this is merely illustrative. In other examples, the center horizontal distance between the left eye lens and the right eye lens may also be close to the value of the human eye interpupillary distance.
The human eye interpupillary distance may be an average interpupillary distance of typical users selected according to engineering experience or experimental simulation, or it may be a value obtained through an interface provided by the virtual reality device of this embodiment for the user to operate or input, so that the user actually using the virtual reality device can set the human eye interpupillary distance according to his or her own needs or application scenario, realizing a personalized interpupillary distance setting and further improving the comfort of using the virtual reality device.
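Purely as an illustration of the geometric constraints described above (not taken from the original disclosure), the placement of step S2300 can be expressed as a small calculation; placing the origin midway between the eyes and offsetting each camera outward from its lens are assumptions made only for this sketch:

```python
def place_optical_components(ipd: float, w1: float, w2: float) -> dict:
    """Return horizontal center positions (same unit as the inputs) for the
    lenses and cameras of both eyes.

    Constraints illustrated:
      * the lens centers are at most `ipd` apart (here: exactly `ipd`);
      * each camera center is horizontally offset from its lens center by
        the first / second offset distance (w1 / w2).
    """
    left_lens_x = -ipd / 2.0
    right_lens_x = ipd / 2.0
    return {
        "left_lens_x": left_lens_x,
        "left_camera_x": left_lens_x - w1,    # offset by the first offset distance
        "right_lens_x": right_lens_x,
        "right_camera_x": right_lens_x + w2,  # offset by the second offset distance
    }
```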
Step S2400: rendering the first image previewed by the left eye camera on the left eye screen, and correspondingly rendering the second image simultaneously previewed by the right eye camera on the right eye screen, so that the corresponding scene is presented to the user through the corresponding camera lenses.
Specifically, the scene may be a virtual scene, to provide the user with a nearly real sense of immersion, or a real scene of the real world, to provide the user with a realistic view of the real world.
After the left and right eye cameras have been set in step S2300, the images simultaneously previewed by the left and right eye cameras are obtained respectively and correspondingly rendered on the corresponding left and right eye screens, so that the corresponding scene is presented to the user directly, without introducing a third-party program library to apply a fusion algorithm to each frame previewed by the left and right eye cameras. This reduces the implementation difficulty of the virtual reality device, reduces the program size, and avoids the problems of low operating efficiency and high power consumption. Moreover, the image refresh rate when the scene is presented is increased, enhancing the sense of reality of the presented scene. The method is particularly suitable for presenting real scenes of the real world and provides a stereoscopic sense of reality.
Specifically, the rendering operation can be implemented by the rendering functions provided by Unity3D, a comprehensive, integrated professional game engine developed by Unity Technologies that is generally suitable for developing the scenes presented by virtual reality devices, which will not be described in detail here.
In this embodiment, each frame simultaneously previewed by the left and right eye cameras can be correspondingly rendered on the corresponding left and right eye screens by means of the rendering functions provided by Unity3D, so that the corresponding scene is presented.
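The per-frame flow of step S2400 can be summarised with the following sketch; it is illustrative only, the camera and screen objects and their methods being assumptions (the embodiment itself relies on Unity3D's rendering functions, which are not reproduced here):

```python
def present_scene(left_camera, right_camera, left_screen, right_screen):
    """Render each simultaneously previewed frame pair directly onto the left
    and right eye screens, without fusing the two frames through a
    third-party library such as FFmpeg."""
    while True:
        first_image = left_camera.preview_frame()    # frame previewed by the left eye camera
        second_image = right_camera.preview_frame()  # frame simultaneously previewed by the right eye camera
        left_screen.render(first_image)
        right_screen.render(second_image)
```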
In one example, the image center of each frame simultaneously previewed by the left and right eye cameras may also be shifted before the rendering operation is performed, so as to enhance the depth-of-field effect of the corresponding images and make the presented scene (in particular a real scene of the real world) more realistic. Accordingly, the method for presenting a scene through a virtual reality device provided in this example further includes:
obtaining a first horizontal offset distance based on the center vertical distance from the camera lens of the left eye imaging device to its camera, the first offset distance and the image height of the first image, horizontally shifting the center of the first image based on the first horizontal offset distance to obtain a new image center, and then performing the rendering operation;
and
obtaining a second horizontal offset distance based on the center vertical distance from the camera lens of the right eye imaging device to its camera, the second offset distance and the image height of the second image, horizontally shifting the center of the second image based on the second horizontal offset distance to obtain a new image center, and then performing the rendering operation.
More specifically, with h1 being the center vertical distance from the camera lens of the left eye imaging device to its camera, w1 the first offset distance and H1 the image height of the first image, and h2 being the center vertical distance from the camera lens of the right eye imaging device to its camera, w2 the second offset distance and H2 the image height of the second image, the first horizontal offset distance is d1 = (w1 / h1) × (H1 / 2) and the second horizontal offset distance is d2 = (w2 / h2) × (H2 / 2).
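As a minimal sketch (not part of the original disclosure; OpenCV's warpAffine is used here only as one possible way to shift an image, and the shift direction is an assumption), the horizontal offset distance and the image-center shift can be computed as follows:

```python
import cv2
import numpy as np

def horizontal_offset_distance(w: float, h: float, image_height: int) -> float:
    """d = (w / h) * (H / 2), expressed in pixels."""
    return (w / h) * (image_height / 2.0)

def shift_image_center(image: np.ndarray, w: float, h: float) -> np.ndarray:
    """Horizontally shift the image so that its center moves by d pixels."""
    d = horizontal_offset_distance(w, h, image.shape[0])
    translation = np.float32([[1, 0, d], [0, 1, 0]])  # shift along +x; direction is an assumption
    return cv2.warpAffine(image, translation, (image.shape[1], image.shape[0]))
```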
In another example, the objects outside the contour of the center object of each frame simultaneously previewed by the left and right eye cameras may be blurred or defocused before the rendering operation is performed, so as to enhance the depth-of-field effect of the corresponding images and make the presented scene (in particular a real scene of the real world) more realistic. Accordingly, the method for presenting a scene through a virtual reality device provided in this example further includes:
after determining the center object of the first image based on the image center point of the first image, performing contour detection on the center object to obtain a corresponding center object contour, blurring or defocusing the objects contained in the first image outside the center object contour, and then performing the rendering operation; and
after determining the center object of the second image based on the image center point of the second image, performing contour detection on the center object to obtain a corresponding center object contour, blurring or defocusing the objects contained in the second image outside the center object contour, and then performing the rendering operation.
The contour detection can be implemented by the function findContours() provided in OpenCV, an open-source cross-platform computer vision library, which will not be described in detail here.
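A minimal OpenCV sketch of this contour step is given below, assuming the OpenCV 4 signature of findContours(). Only the use of findContours() comes from the embodiment; the Otsu thresholding, the rule of keeping the contour that contains the image center point, and the Gaussian blur applied outside it are assumptions made for illustration:

```python
import cv2
import numpy as np

def blur_outside_center_object(image: np.ndarray) -> np.ndarray:
    """Detect the contour of the object at the image center point and blur
    everything outside that contour before the rendering operation."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    center = (image.shape[1] / 2.0, image.shape[0] / 2.0)  # image center point (x, y)
    center_contours = [c for c in contours
                       if cv2.pointPolygonTest(c, center, False) >= 0]
    if not center_contours:
        return image  # no center object found; leave the frame unchanged

    mask = np.zeros(gray.shape, dtype=np.uint8)
    cv2.drawContours(mask, center_contours, -1, 255, thickness=cv2.FILLED)

    blurred = cv2.GaussianBlur(image, (21, 21), 0)
    # Keep the center object sharp and blur the surroundings.
    return np.where(mask[:, :, None] == 255, image, blurred)
```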
It should be understood, however, that in a particular application, after the image center of each frame previewed by the left and right eye cameras has been shifted, the objects outside the contour of the image center object may also be blurred or defocused before the rendering operation is performed, so as to better enhance the depth-of-field effect of the corresponding images and make the presented scene more realistic.
<Equipment>
In this embodiment, a scene display device 3000 is also provided. As shown in Fig. 4, it is arranged on the side of a virtual reality device 4000 and includes a parameter acquiring unit 3100, an offset distance acquiring unit 3200, an element adjustment unit 3300 and an image rendering unit 3400, and optionally also an image center offset unit 3500 and a contour processing unit 3600, for implementing the method for presenting a scene through a virtual reality device provided in this embodiment, which will not be described in detail again here.
Specifically, the virtual reality device 4000 includes a left eye imaging device 4100 and a right eye imaging device 4200. The left eye imaging device 4100 is provided in sequence with a left eye lens 4101, a left eye camera 4102, a left eye screen 4103 and a left eye camera lens 4104, and the right eye imaging device 4200 is provided in sequence with a right eye lens 4201, a right eye camera 4202, a right eye screen 4203 and a right eye camera lens 4204.
The scene display device 3000 includes:
a parameter acquiring unit 3100, configured to obtain, respectively, imaging parameters of the left eye imaging device and imaging parameters of the right eye imaging device, the imaging parameters including at least the center vertical distance from the camera lens to the camera in the corresponding imaging device and the refraction angle of the camera lens;
an offset distance acquiring unit 3200, configured to obtain a first offset distance of the left eye imaging device according to the imaging parameters of the left eye imaging device, and to obtain a second offset distance of the right eye imaging device according to the imaging parameters of the right eye imaging device;
an element adjustment unit 3300, configured to adjust the left eye lens, the left eye camera, the right eye lens and the right eye camera according to the first offset distance, the second offset distance and a preset human eye interpupillary distance, so that the center horizontal distance between the left eye lens and the right eye lens is not greater than the human eye interpupillary distance, the center of the left eye lens is horizontally offset from the center of the left eye camera by the first offset distance, and the center of the right eye lens is horizontally offset from the center of the right eye camera by the second offset distance; and
an image rendering unit 3400, configured to render a first image previewed by the left eye camera on the left eye screen, and correspondingly render a second image simultaneously previewed by the right eye camera on the right eye screen, so that the corresponding scene is presented to the user through the corresponding camera lenses.
Optionally, the offset distance acquiring unit 3200 includes:
a device for calculating the first offset distance w1 = h1 × tan β1 according to the center vertical distance h1 from the camera lens of the left eye imaging device to its camera and the refraction angle β1 of that camera lens; and
a device for calculating the second offset distance w2 = h2 × tan β2 according to the center vertical distance h2 from the camera lens of the right eye imaging device to its camera and the refraction angle β2 of that camera lens.
Optionally, the scene display device 3000 further includes an image center offset unit 3500, configured to:
obtain a first horizontal offset distance based on the center vertical distance from the camera lens of the left eye imaging device to its camera, the first offset distance and the image height of the first image, horizontally shift the center of the first image based on the first horizontal offset distance to obtain a new image center, and then perform the rendering operation;
and
obtain a second horizontal offset distance based on the center vertical distance from the camera lens of the right eye imaging device to its camera, the second offset distance and the image height of the second image, horizontally shift the center of the second image based on the second horizontal offset distance to obtain a new image center, and then perform the rendering operation.
Optionally, the scene display device 3000 further includes a contour processing unit 3600, configured to:
after determining the center object of the first image based on the image center point of the first image, perform contour detection on the center object to obtain a corresponding center object contour, blur or defocus the objects contained in the first image outside the center object contour, and then perform the rendering operation; and
after determining the center object of the second image based on the image center point of the second image, perform contour detection on the center object to obtain a corresponding center object contour, blur or defocus the objects contained in the second image outside the center object contour, and then perform the rendering operation.
It should be understood that the connection relationship between the scene display device 3000 and the virtual reality device 4000 shown in Fig. 4 is merely schematic and is not a specific limitation. The scene display device 3000 may be arranged in or integrated into the virtual reality device 4000, or it may be external to and independent of the virtual reality device 4000, cooperating with the virtual reality device 4000 through a wireless or wired connection to perform the scene presentation method of this embodiment, which are too numerous to list here.
In addition, the left eye imaging device 4100 and the right eye imaging device 4200 included in the virtual reality device 4000 shown in Fig. 4 are merely illustrative, and do not necessarily mean that the left eye imaging device 4100 and the right eye imaging device 4200 are two physically separate entity devices; the left eye imaging device 4100 and the right eye imaging device 4200 may be separated only logically. For example, in a specific implementation, the left eye lens 4101, left eye camera 4102, left eye screen 4103 and left eye camera lens 4104 corresponding to the left eye imaging device 4100, and the right eye lens 4201, right eye camera 4202, right eye screen 4203 and right eye camera lens 4204 corresponding to the right eye imaging device 4200, may be elements integrated in or built into the same entity device, rather than being arranged as a physically separate left eye imaging device 4100 and right eye imaging device 4200.
In a specific example, the hardware configuration of the scene display device 3000 may be that of the electronic device 1000 shown in Fig. 1.
Those skilled in the art should also understand that the scene display device 3000 can be implemented in various ways. For example, the scene display device 3000 can be implemented by configuring a processor with instructions: the instructions can be stored in a ROM, and when the device is started, the instructions are read from the ROM into a programmable device to implement the scene display device 3000. As another example, the scene display device 3000 can be solidified into a dedicated device (such as an ASIC). The scene display device 3000 can be divided into mutually independent units, or the units can be combined for implementation. The scene display device 3000 can be implemented by one of the above implementations, or by a combination of two or more of them.
<Virtual reality device>
In this embodiment, a virtual reality device 5000 is also provided. As shown in Fig. 5, it includes:
a left eye imaging device 5100 provided in sequence with a left eye lens 5101, a left eye camera 5102, a left eye screen 5103 and a left eye camera lens 5104;
a right eye imaging device 5200 provided in sequence with a right eye lens 5201, a right eye camera 5202, a right eye screen 5203 and a right eye camera lens 5204; and
the scene display device 3000 provided in this embodiment.
Specifically, the virtual reality device 5000 may be a virtual reality helmet, virtual reality glasses, or the like. In a specific example, the virtual reality device 5000 may be the electronic device 1000 shown in Fig. 1.
In addition, the left eye imaging device 5100 and the right eye imaging device 5200 included in the virtual reality device 5000 shown in Fig. 5 are merely illustrative, and do not necessarily mean that the left eye imaging device 5100 and the right eye imaging device 5200 are two physically separate entity devices; the left eye imaging device 5100 and the right eye imaging device 5200 may be separated only logically. For example, in a specific implementation, the virtual reality device 5000 may include a left eye lens 5101, a left eye camera 5102, a left eye screen 5103 and a left eye camera lens 5104 arranged in sequence, and correspondingly a right eye lens 5201, a right eye camera 5202, a right eye screen 5203 and a right eye camera lens 5204 arranged in sequence, rather than being arranged as a physically separate left eye imaging device 5100 and right eye imaging device 5200.
As described above, this embodiment provides a method and a device for presenting a scene through a virtual reality device, and a virtual reality device, in which, after an adjustment operation is performed on the left and right eye cameras included in the virtual reality device, the images simultaneously previewed by the left and right eye cameras are respectively rendered on the corresponding left and right eye screens, so that there is no need to introduce a third-party program library to apply a fusion algorithm to each frame previewed by the left and right eye cameras. This reduces the implementation difficulty of the virtual reality device, reduces the program size, and avoids the problems of low operating efficiency and high power consumption. Moreover, the image refresh rate when the scene is presented is increased, enhancing the sense of reality of the presented scene. Meanwhile, duplicated or overlapping images when a scene is presented through the virtual reality device are avoided, the depth-of-field effect of the corresponding images is enhanced, and the sense of reality of the presented scene is strengthened. In addition, the comfort of using the virtual reality device is improved. The solution is particularly suitable for presenting real scenes of the real world.
It is well known to those skilled in the art that, with the development of electronic information technology such as large-scale integrated circuit technology and the trend toward softwarization of hardware, it has become difficult to draw a clear boundary between the hardware and software of a computer system, since any operation can be implemented in software or in hardware, and the execution of any instruction can be completed by hardware as well as by software. Whether a hardware implementation or a software implementation is used for a certain machine function depends on non-technical factors such as price, speed, reliability, storage capacity and change cycle. Therefore, for a person of ordinary skill in the field of electronic information technology, the more direct and clear way to describe a technical solution is to describe the operations in that solution. Knowing the operations to be performed, a person skilled in the art can directly design the desired product based on consideration of those non-technical factors.
The present invention may be a system, a method and/or a computer program product. The computer program product may include a computer-readable storage medium carrying computer-readable program instructions for causing a processor to implement various aspects of the present invention.
The computer-readable storage medium may be a tangible device that can retain and store instructions for use by an instruction-executing device. The computer-readable storage medium may be, for example, but is not limited to, an electric storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium include: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a memory stick, a floppy disk, a mechanically encoded device such as a punch card or a raised structure in a groove on which instructions are recorded, and any suitable combination of the foregoing. A computer-readable storage medium as used herein is not to be construed as a transitory signal per se, such as a radio wave or another freely propagating electromagnetic wave, an electromagnetic wave propagating through a waveguide or other transmission medium (for example, a light pulse passing through a fiber-optic cable), or an electrical signal transmitted through a wire.
The computer-readable program instructions described herein can be downloaded from the computer-readable storage medium to the respective computing/processing devices, or downloaded to an external computer or external storage device via a network, for example the Internet, a local area network, a wide area network and/or a wireless network. The network may include copper transmission cables, optical fibers, routers, firewalls, switches, gateway computers and/or edge servers, and wireless transmission. A network adapter or network interface in each computing/processing device receives the computer-readable program instructions from the network and forwards them for storage in a computer-readable storage medium within the respective computing/processing device.
Computer program instructions for carrying out the operations of the present invention may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or source code or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk and C++, and conventional procedural programming languages such as the "C" language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case involving a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or it may be connected to an external computer (for example through the Internet using an Internet service provider). In some embodiments, an electronic circuit such as a programmable logic device, a field-programmable gate array (FPGA) or a programmable logic array (PLA) is personalized by using state information of the computer-readable program instructions, and the electronic circuit can execute the computer-readable program instructions so as to implement various aspects of the present invention.
Aspects of the present invention are described herein with reference to flowcharts and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It should be understood that each block of the flowcharts and/or block diagrams, and combinations of blocks in the flowcharts and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer or another programmable data processing apparatus to produce a machine, such that the instructions, when executed by the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/actions specified in one or more blocks of the flowcharts and/or block diagrams. These computer-readable program instructions may also be stored in a computer-readable storage medium; these instructions cause a computer, a programmable data processing apparatus and/or other devices to operate in a specific way, so that the computer-readable medium storing the instructions comprises an article of manufacture that includes instructions implementing various aspects of the functions/actions specified in one or more blocks of the flowcharts and/or block diagrams.
The computer-readable program instructions may also be loaded onto a computer, another programmable data processing apparatus or other devices, so that a series of operational steps are performed on the computer, other programmable data processing apparatus or other devices to produce a computer-implemented process, such that the instructions executed on the computer, other programmable data processing apparatus or other devices implement the functions/actions specified in one or more blocks of the flowcharts and/or block diagrams.
The flowcharts and block diagrams in the accompanying drawings show the possible architectures, functions and operations of systems, methods and computer program products according to multiple embodiments of the present invention. In this regard, each block in the flowcharts or block diagrams may represent a module, a program segment or a part of instructions, which contains one or more executable instructions for implementing the specified logical function. In some alternative implementations, the functions noted in the blocks may occur in an order different from that noted in the drawings; for example, two consecutive blocks may in fact be executed substantially in parallel, and they may sometimes be executed in the reverse order, depending on the functions involved. It should also be noted that each block of the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a dedicated hardware-based system that performs the specified functions or actions, or by a combination of dedicated hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, implementation by software, and implementation by a combination of software and hardware are all equivalent.
Embodiments of the present invention have been described above. The above description is exemplary, not exhaustive, and is not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the illustrated embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application or technical improvement over technologies in the market, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the present invention is defined by the appended claims.

Claims (10)

1. A method for presenting a scene through a virtual reality device, implemented on a virtual reality device, characterised in that
the virtual reality device includes a left eye imaging device and a right eye imaging device, the left eye imaging device being provided in sequence with a left eye lens, a left eye camera, a left eye screen and a left eye camera lens, and the right eye imaging device being provided in sequence with a right eye lens, a right eye camera, a right eye screen and a right eye camera lens;
the method comprising:
obtaining, respectively, imaging parameters of the left eye imaging device and imaging parameters of the right eye imaging device, the imaging parameters including at least the center vertical distance from the camera lens to the camera in the corresponding imaging device and the refraction angle of the camera lens;
obtaining a first offset distance of the left eye imaging device according to the imaging parameters of the left eye imaging device, and obtaining a second offset distance of the right eye imaging device according to the imaging parameters of the right eye imaging device;
adjusting the left eye lens, the left eye camera, the right eye lens and the right eye camera according to the first offset distance, the second offset distance and a preset human eye interpupillary distance, so that the center horizontal distance between the left eye lens and the right eye lens is not greater than the human eye interpupillary distance, the center of the left eye lens is horizontally offset from the center of the left eye camera by the first offset distance, and the center of the right eye lens is horizontally offset from the center of the right eye camera by the second offset distance; and
rendering a first image previewed by the left eye camera on the left eye screen, and correspondingly rendering a second image simultaneously previewed by the right eye camera on the right eye screen, so that the corresponding scene is presented to the user through the corresponding camera lenses.
2. The method according to claim 1, characterised in that the step of obtaining the first offset distance and the second offset distance comprises:
calculating the first offset distance w1 = h1 × tan β1 according to the center vertical distance h1 from the camera lens of the left eye imaging device to its camera and the refraction angle β1 of that camera lens; and
calculating the second offset distance w2 = h2 × tan β2 according to the center vertical distance h2 from the camera lens of the right eye imaging device to its camera and the refraction angle β2 of that camera lens.
3. The method according to claim 1, characterised by further comprising:
obtaining a first horizontal offset distance based on the center vertical distance from the camera lens of the left eye imaging device to its camera, the first offset distance and the image height of the first image, horizontally shifting the center of the first image based on the first horizontal offset distance to obtain a new image center, and then performing the rendering operation;
and
obtaining a second horizontal offset distance based on the center vertical distance from the camera lens of the right eye imaging device to its camera, the second offset distance and the image height of the second image, horizontally shifting the center of the second image based on the second horizontal offset distance to obtain a new image center, and then performing the rendering operation.
4. The method according to claim 3, characterised in that
the first horizontal offset distance is d1 = (w1 / h1) × (H1 / 2),
where h1 is the center vertical distance from the camera lens of the left eye imaging device to its camera, w1 is the first offset distance and H1 is the image height of the first image; and
the second horizontal offset distance is d2 = (w2 / h2) × (H2 / 2),
where h2 is the center vertical distance from the camera lens of the right eye imaging device to its camera, w2 is the second offset distance and H2 is the image height of the second image.
5. The method according to claim 1 or 3, characterised by further comprising:
after determining the center object of the first image based on the image center point of the first image, performing contour detection on the center object to obtain a corresponding center object contour, blurring or defocusing the objects contained in the first image outside the center object contour, and then performing the rendering operation; and
after determining the center object of the second image based on the image center point of the second image, performing contour detection on the center object to obtain a corresponding center object contour, blurring or defocusing the objects contained in the second image outside the center object contour, and then performing the rendering operation.
6. A scene presentation device, characterized in that it is arranged on the virtual reality device side,
wherein the virtual reality device comprises a left-eye imaging device and a right-eye imaging device, the left-eye imaging device being provided in sequence with a left-eye lens, a left-eye camera, a left-eye screen and a left-eye camera lens, and the right-eye imaging device being provided in sequence with a right-eye lens, a right-eye camera, a right-eye screen and a right-eye camera lens;
the scene presentation device comprising:
a parameter acquiring unit, configured to respectively obtain the imaging parameters of the left-eye imaging device and the imaging parameters of the right-eye imaging device, the imaging parameters including at least the vertical distance from the center of the camera lens in the corresponding imaging device to the camera and the refraction angle of the camera lens;
an offset distance acquiring unit, configured to obtain a first offset distance of the left-eye imaging device according to the imaging parameters of the left-eye imaging device, and obtain a second offset distance of the right-eye imaging device according to the imaging parameters of the right-eye imaging device;
an element adjustment unit, configured to adjust the left-eye lens, the left-eye camera, the right-eye lens and the right-eye camera according to the first offset distance, the second offset distance and a preset human-eye interpupillary distance, so that the horizontal distance between the centers of the left-eye lens and the right-eye lens is no greater than the human-eye interpupillary distance, the centers of the left-eye lens and the left-eye camera are horizontally offset from each other by the first offset distance, and the centers of the right-eye lens and the right-eye camera are horizontally offset from each other by the second offset distance;
an image rendering unit, configured to render a first image previewed by the left-eye camera on the left-eye screen, and correspondingly and simultaneously render a second image previewed by the right-eye camera on the right-eye screen, so that the corresponding scene is presented to the user through the corresponding camera lens.
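Claim 6 decomposes the device into four cooperating units. A purely illustrative wiring of that decomposition, with class and method names that are assumptions rather than identifiers from the patent:

```python
class ScenePresentationDevice:
    """Wires together the four units recited in claim 6 (illustrative only)."""

    def __init__(self, param_unit, offset_unit, adjust_unit, render_unit):
        self.param_unit = param_unit    # parameter acquiring unit
        self.offset_unit = offset_unit  # offset distance acquiring unit
        self.adjust_unit = adjust_unit  # element adjustment unit
        self.render_unit = render_unit  # image rendering unit

    def present_scene(self, ipd: float) -> None:
        left_params, right_params = self.param_unit.acquire()         # per-eye imaging parameters
        w1, w2 = self.offset_unit.compute(left_params, right_params)  # first/second offset distances
        self.adjust_unit.apply(w1, w2, ipd)                           # position lenses and cameras
        self.render_unit.render_previews()                            # render both previews simultaneously
```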
7. The device according to claim 6, characterized in that the offset distance acquiring unit comprises:
a device for calculating the first offset distance w1 = h1 × tan(β1) according to the vertical distance h1 from the center of the camera lens of the left-eye imaging device to the camera and the refraction angle β1 of the camera lens; and
a device for calculating the second offset distance w2 = h2 × tan(β2) according to the vertical distance h2 from the center of the camera lens of the right-eye imaging device to the camera and the refraction angle β2 of the camera lens.
8. The device according to claim 6, characterized by further comprising an image center offset unit configured to:
obtain a first horizontal offset distance based on the vertical distance from the center of the camera lens of the left-eye imaging device to the camera, the first offset distance and the image height of the first image, horizontally shift the center of the first image by the first horizontal offset distance to obtain a new image center, and then perform the rendering operation;
and
obtain a second horizontal offset distance based on the vertical distance from the center of the camera lens of the right-eye imaging device to the camera, the second offset distance and the image height of the second image, horizontally shift the center of the second image by the second horizontal offset distance to obtain a new image center, and then perform the rendering operation.
9. The device according to claim 6 or 8, characterized by further comprising a contour processing unit configured to:
determine a center object of the first image based on the image center of the first image, perform contour detection on the center object to obtain a corresponding center object contour, apply blurring or fuzzy processing to the objects included in the first image outside the center object contour, and then perform the rendering operation; and
determine a center object of the second image based on the image center of the second image, perform contour detection on the center object to obtain a corresponding center object contour, apply blurring or fuzzy processing to the objects included in the second image outside the center object contour, and then perform the rendering operation.
10. A virtual reality device, characterized by comprising:
a left-eye imaging device provided in sequence with a left-eye lens, a left-eye camera, a left-eye screen and a left-eye camera lens;
a right-eye imaging device provided in sequence with a right-eye lens, a right-eye camera, a right-eye screen and a right-eye camera lens; and
the scene presentation device according to any one of claims 6 to 9.
CN201710364277.6A 2017-05-22 2017-05-22 Method, equipment and the virtual reality device of scene are presented by virtual reality device Active CN107302694B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201710364277.6A CN107302694B (en) 2017-05-22 2017-05-22 Method, equipment and the virtual reality device of scene are presented by virtual reality device
PCT/CN2017/112383 WO2018214431A1 (en) 2017-05-22 2017-11-22 Method and apparatus for presenting scene using virtual reality device, and virtual reality apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710364277.6A CN107302694B (en) 2017-05-22 2017-05-22 Method, equipment and the virtual reality device of scene are presented by virtual reality device

Publications (2)

Publication Number Publication Date
CN107302694A true CN107302694A (en) 2017-10-27
CN107302694B CN107302694B (en) 2019-01-18

Family

ID=60137596

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710364277.6A Active CN107302694B (en) 2017-05-22 2017-05-22 Method, equipment and the virtual reality device of scene are presented by virtual reality device

Country Status (2)

Country Link
CN (1) CN107302694B (en)
WO (1) WO2018214431A1 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108830943A (en) * 2018-06-29 2018-11-16 歌尔科技有限公司 A kind of image processing method and virtual reality device
WO2018214431A1 (en) * 2017-05-22 2018-11-29 歌尔科技有限公司 Method and apparatus for presenting scene using virtual reality device, and virtual reality apparatus
CN108986225A (en) * 2018-05-29 2018-12-11 歌尔科技有限公司 Processing method and processing device, equipment when virtual reality device display scene
CN109002164A (en) * 2018-07-10 2018-12-14 歌尔科技有限公司 It wears the display methods for showing equipment, device and wears display equipment
CN111193919A (en) * 2018-11-15 2020-05-22 中兴通讯股份有限公司 3D display method, device, equipment and computer readable medium
CN112235562A (en) * 2020-10-12 2021-01-15 聚好看科技股份有限公司 3D display terminal, controller and image processing method
CN115079826A (en) * 2022-06-24 2022-09-20 平安银行股份有限公司 Virtual reality implementation method, electronic equipment and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105892053A (en) * 2015-12-30 2016-08-24 乐视致新电子科技(天津)有限公司 Virtual helmet lens interval adjusting method and device
CN105954875A (en) * 2016-05-19 2016-09-21 华为技术有限公司 VR (Virtual Reality) glasses and adjustment method thereof
CN106019588A (en) * 2016-06-23 2016-10-12 深圳市虚拟现实科技有限公司 Near-to-eye display device capable of automatically measuring interpupillary distance and method
WO2016176309A1 (en) * 2015-04-30 2016-11-03 Google Inc. Virtual eyeglass set for viewing actual scene that corrects for different location of lenses than eyes
CN106598250A (en) * 2016-12-19 2017-04-26 北京星辰美豆文化传播有限公司 VR display method and apparatus, and electronic device

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW200813469A (en) * 2006-09-08 2008-03-16 Asia Optical Co Inc Micro-type imaging-capturing lens
US9244277B2 (en) * 2010-04-30 2016-01-26 The Arizona Board Of Regents On Behalf Of The University Of Arizona Wide angle and high resolution tiled head-mounted display device
US9025252B2 (en) * 2011-08-30 2015-05-05 Microsoft Technology Licensing, Llc Adjustment of a mixed reality display for inter-pupillary distance alignment
JP6498660B2 (en) * 2013-03-26 2019-04-10 ルソスペース, プロジェクトス エンゲンハリア エリデーアー Display device
US9798145B2 (en) * 2014-05-23 2017-10-24 Qualcomm Incorporated Method and apparatus for see-through near eye display
CN106445167B (en) * 2016-10-20 2019-09-20 网易(杭州)网络有限公司 Simple eye visual field is adaptive to match method of adjustment and device, wear-type visual device
CN106646892A (en) * 2017-03-21 2017-05-10 上海乐蜗信息科技有限公司 Optical system and head-mounted virtual reality device
CN107302694B (en) * 2017-05-22 2019-01-18 歌尔科技有限公司 Method, equipment and the virtual reality device of scene are presented by virtual reality device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016176309A1 (en) * 2015-04-30 2016-11-03 Google Inc. Virtual eyeglass set for viewing actual scene that corrects for different location of lenses than eyes
CN105892053A (en) * 2015-12-30 2016-08-24 乐视致新电子科技(天津)有限公司 Virtual helmet lens interval adjusting method and device
CN105954875A (en) * 2016-05-19 2016-09-21 华为技术有限公司 VR (Virtual Reality) glasses and adjustment method thereof
CN106019588A (en) * 2016-06-23 2016-10-12 深圳市虚拟现实科技有限公司 Near-to-eye display device capable of automatically measuring interpupillary distance and method
CN106598250A (en) * 2016-12-19 2017-04-26 北京星辰美豆文化传播有限公司 VR display method and apparatus, and electronic device

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018214431A1 (en) * 2017-05-22 2018-11-29 歌尔科技有限公司 Method and apparatus for presenting scene using virtual reality device, and virtual reality apparatus
CN108986225A (en) * 2018-05-29 2018-12-11 歌尔科技有限公司 Processing method and processing device, equipment when virtual reality device display scene
CN108986225B (en) * 2018-05-29 2022-10-18 歌尔光学科技有限公司 Processing method, device and equipment for displaying scene by virtual reality equipment
CN108830943A (en) * 2018-06-29 2018-11-16 歌尔科技有限公司 A kind of image processing method and virtual reality device
CN108830943B (en) * 2018-06-29 2022-05-31 歌尔光学科技有限公司 Image processing method and virtual reality equipment
CN109002164A (en) * 2018-07-10 2018-12-14 歌尔科技有限公司 It wears the display methods for showing equipment, device and wears display equipment
CN109002164B (en) * 2018-07-10 2021-08-24 歌尔光学科技有限公司 Display method and device of head-mounted display equipment and head-mounted display equipment
CN111193919A (en) * 2018-11-15 2020-05-22 中兴通讯股份有限公司 3D display method, device, equipment and computer readable medium
CN111193919B (en) * 2018-11-15 2023-01-13 中兴通讯股份有限公司 3D display method, device, equipment and computer readable medium
CN112235562A (en) * 2020-10-12 2021-01-15 聚好看科技股份有限公司 3D display terminal, controller and image processing method
CN112235562B (en) * 2020-10-12 2023-09-15 聚好看科技股份有限公司 3D display terminal, controller and image processing method
CN115079826A (en) * 2022-06-24 2022-09-20 平安银行股份有限公司 Virtual reality implementation method, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN107302694B (en) 2019-01-18
WO2018214431A1 (en) 2018-11-29

Similar Documents

Publication Publication Date Title
CN107302694A (en) Method, equipment and the virtual reality device of scene are presented by virtual reality device
CN109410298B (en) Virtual model manufacturing method and expression changing method
CN111066026B (en) Techniques for providing virtual light adjustment to image data
JP4766877B2 (en) Method for generating an image using a computer, computer-readable memory, and image generation system
CN108604385A (en) A kind of application interface display methods and device
CN101401129B (en) Techniques for creating facial animation using a face mesh
CN106686365A (en) Lens adjusting method and lens adjusting device for head-mounted display equipment, and head-mounted display equipment
CN106782260A (en) For the display methods and device of virtual reality moving scene
CN104581119B (en) A kind of display methods of 3D rendering and a kind of helmet
US10650507B2 (en) Image display method and apparatus in VR device, and VR device
CN107728986A (en) The display methods and display device of a kind of double-display screen
CN107170047A (en) Update method, equipment and the virtual reality device of virtual reality scenario
CN105657408A (en) Method for implementing virtual reality scene and virtual reality apparatus
CN106846487A (en) Subtract face method, equipment and display device
CN115512014A (en) Method for training expression driving generation model, expression driving method and device
CN103905806A (en) System for realizing 3D shooting by using single camera and method
CN109447931A (en) Image processing method and device
US11543655B1 (en) Rendering for multi-focus display systems
US20170178393A1 (en) Recording medium, information processing apparatus, and control method
CN107959845A (en) The method, apparatus of view data transmission, client terminal device and wear display device
CN107479692A (en) Control method, equipment and the virtual reality device of virtual reality scenario
CN104182979A (en) Visual impairment simulation method and device
CN107688241A (en) Wear the control method, equipment and system of display device
CN106385577A (en) Split screen display method under recovery mode, device and virtual reality device
CN111667906B (en) Eyeball structure virtual teaching system and digital model building method thereof

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20201016

Address after: 261031 north of Yuqing street, east of Dongming Road, high tech Zone, Weifang City, Shandong Province (Room 502, Geer electronic office building)

Patentee after: GoerTek Optical Technology Co.,Ltd.

Address before: 266104 Laoshan Qingdao District North House Street investment service center room, Room 308, Shandong

Patentee before: GOERTEK TECHNOLOGY Co.,Ltd.

TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20221122

Address after: 266104 No. 500, Songling Road, Laoshan District, Qingdao, Shandong

Patentee after: GOERTEK TECHNOLOGY Co.,Ltd.

Address before: 261031 north of Yuqing street, east of Dongming Road, high tech Zone, Weifang City, Shandong Province (Room 502, Geer electronics office building)

Patentee before: GoerTek Optical Technology Co.,Ltd.

TR01 Transfer of patent right