WO2018058673A1 - 3D display method and user terminal

3D display method and user terminal

Info

Publication number
WO2018058673A1
Authority
WO
WIPO (PCT)
Prior art keywords
angle
user terminal
projection
display
view
Prior art date
Application number
PCT/CN2016/101374
Other languages
English (en)
French (fr)
Inventor
党茂昌
李毅
王光琳
符玉襄
杜明亮
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Priority to US16/338,216 (US10908684B2)
Priority to PCT/CN2016/101374
Priority to EP16917423.2A (EP3511764B1)
Priority to CN201680077478.1A (CN108476316B)
Publication of WO2018058673A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 Eye tracking input arrangements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 Image reproducers
    • H04N 13/366 Image reproducers using viewer tracking
    • H04N 13/378 Image reproducers using viewer tracking for tracking rotational head movements around an axis perpendicular to the screen
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/10 Geometric effects
    • G06T 15/20 Perspective computation
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G 5/36 Control arrangements or circuits for visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G 5/39 Control of the bit-mapped memory
    • G09G 5/395 Arrangements specially adapted for transferring the contents of the bit-mapped memory to the screen
    • G09G 5/397 Arrangements specially adapted for transferring the contents of two or more bit-mapped memories to the screen simultaneously, e.g. for mixing or overlay
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 Image reproducers
    • H04N 13/302 Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 Image reproducers
    • H04N 13/366 Image reproducers using viewer tracking
    • H04N 13/383 Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes

Definitions

  • the present invention relates to the field of terminal technologies, and in particular, to a 3D display method and a user terminal.
  • Three Dimension (3D) display technology can make displayed images appear stereoscopic and realistic, giving viewers an immersive experience.
  • 3D display technology is widely used in user terminals (such as mobile phones, computers, televisions, etc.).
  • However, the user needs to view the 3D-displayed content from a certain fixed angle in order to have the best 3D experience; that is, the 3D display of the user terminal cannot flexibly adapt to different viewing positions or viewing angles of the user, and 3D display technology needs to be further improved.
  • The embodiments of the invention disclose a 3D display method and a user terminal, which can dynamically adjust the 3D projection angle according to the user's viewing angle, thereby solving the problem that 3D display performed at a fixed 3D projection angle cannot flexibly adapt to different viewing positions or viewing angles of the user.
  • A first aspect provides a 3D display method applied to a user terminal. The method includes: the user terminal detects the user's viewing angle of the display screen; the user terminal determines a 3D projection angle according to the viewing angle; the user terminal performs 3D display on the content to be displayed according to the 3D projection angle.
  • Optionally, the viewing angle may be a central axis viewing angle, where the central axis viewing angle is the angle between the midpoint of the two eyes and the center perpendicular, and the center perpendicular is the line perpendicular to the center position of the display screen.
  • Optionally, the viewing angle may include a left eye viewing angle and a right eye viewing angle, where the left eye viewing angle is the angle between the midpoint of the left eye pupil and the center perpendicular, the right eye viewing angle is the angle between the midpoint of the right eye pupil and the center perpendicular, and the center perpendicular is the line perpendicular to the center position of the display screen.
  • In this way, the user terminal can dynamically adjust the 3D projection angle used for 3D display according to the user's viewing angle and flexibly adapt to different viewing positions or viewing angles, so that the 3D display effect is more realistic, the user views the 3D picture more clearly, and the 3D display effect is improved.
  • Optionally, the user terminal detecting the user's viewing angle of the display screen may include: the user terminal detects the tilt angle of the user terminal relative to the gravity vertical, the rotation angle of the user terminal about its axis of symmetry, and the angle between the midpoint of the eyes and the camera; the user terminal then calculates the central axis viewing angle according to the tilt angle, the rotation angle, and the angle between the midpoint of the eyes and the camera.
  • In this way, the user terminal can accurately determine the central axis viewing angle.
  • the user terminal can detect the tilt angle of the user terminal with respect to the vertical line of gravity, the rotation angle of the user terminal about the axis of symmetry, and the angle between the midpoint of the eyes and the camera in real time. In this way, after the viewing angle of the user is changed, the user terminal can detect the new central axis viewing angle of the user in time, and then adjust the 3D projection angle in time according to the new central axis viewing angle.
  • Optionally, the user terminal detecting the tilt angle of the user terminal relative to the gravity vertical, the rotation angle of the user terminal about its axis of symmetry, and the angle between the midpoint of the two eyes and the camera may include: the user terminal detects the tilt angle of the user terminal relative to the gravity vertical and the rotation angle of the user terminal about its axis of symmetry; when the user terminal detects that the change in the tilt angle or the rotation angle is greater than a preset angle, it detects the angle between the midpoint of the two eyes and the camera.
  • In this way, the angle between the midpoint of the user's eyes and the camera can be obtained without constantly turning on the camera to photograph the user, which helps save CPU resources.
  • The specific manner in which the user terminal detects whether the change in the tilt angle or the rotation angle exceeds the preset angle may be: the user terminal determines whether the absolute value of the difference between the most recently detected tilt angle and a first tilt angle exceeds the preset angle, where the first tilt angle is the tilt angle detected the last time the angle between the midpoint of the user's eyes and the camera was detected; or the user terminal determines whether the absolute value of the difference between the most recently detected rotation angle and a first rotation angle exceeds the preset angle, where the first rotation angle is the rotation angle detected the last time the angle between the midpoint of the user's eyes and the camera was detected.
  • Optionally, the user terminal can calculate the central axis viewing angle by using Equation 1.
  • In Equation 1, θ is the central axis viewing angle, α is the tilt angle of the user terminal relative to the gravity vertical, β is the rotation angle of the user terminal about its axis of symmetry, x is a preset angle correction parameter, and γ is the angle between the midpoint of the two eyes and the camera.
  • the user terminal can accurately determine the central axis angle of view.
  • Optionally, the user terminal can calculate the central axis viewing angle by using Equation 2.
  • In Equation 2, θ is the central axis viewing angle, α is the tilt angle of the user terminal relative to the gravity vertical, β is the rotation angle of the user terminal about its axis of symmetry, x is a preset angle correction parameter, and γ is the angle between the midpoint of the two eyes and the camera.
  • δ is a preset angle value, which may be an empirical value, for example 45° or 40°.
  • the user terminal can accurately determine the central axis angle of view.
  • the 3D projection angle includes a left eye 3D projection angle and a right eye 3D projection angle
  • Optionally, the user terminal determining the 3D projection angle according to the detected central axis viewing angle may include: the user terminal determines the left eye 3D projection angle according to the detected central axis viewing angle and a preset left eye adjustment angle; the user terminal determines the right eye 3D projection angle according to the detected central axis viewing angle and a preset right eye adjustment angle.
  • the user terminal can accurately determine the left eye 3D projection angle and the right eye 3D projection angle.
  • Optionally, the preset left eye adjustment angle is the preset left eye adjustment angle that corresponds, in a stored correspondence between preset central axis viewing angles and preset left eye adjustment angles, to the central axis viewing angle detected by the user terminal; the preset right eye adjustment angle is the preset right eye adjustment angle that corresponds, in a stored correspondence between preset central axis viewing angles and preset right eye adjustment angles, to the central axis viewing angle detected by the user terminal.
  • In this way, multiple preset left eye adjustment angles and preset right eye adjustment angles are stored, and the ones corresponding to the currently detected central axis viewing angle are retrieved, so that more accurate preset left eye and right eye adjustment angles are obtained, and in turn the left eye 3D projection angle and the right eye 3D projection angle can be obtained more accurately.
  • the 3D projection angle includes a left eye 3D projection angle and a right eye 3D projection angle.
  • the viewing angle includes the left eye angle and the right eye angle
  • Optionally, the user terminal determining the 3D projection angle according to the viewing angle may include: the user terminal determines the left eye viewing angle as the left eye 3D projection angle, and the right eye viewing angle as the right eye 3D projection angle.
  • Optionally, the user terminal performing 3D display on the content to be displayed according to the 3D projection angle may include: the user terminal draws the content to be displayed according to the left eye 3D projection angle and the right eye 3D projection angle, and displays the drawing result through a 3D display.
  • In this way, the user terminal can determine different 3D projection angles according to the user's viewing angle, and can then display pictures at different 3D projection angles through the 3D display, which makes the 3D display effect more realistic, lets the user view the picture more clearly, and improves the 3D display effect.
  • determining, by the user terminal, the 3D projection angle according to the viewing angle may include: determining, by the user terminal, the central axis angle of view as the 3D projection angle.
  • the user terminal can accurately determine the 3D projection angle.
  • When the user terminal determines the central axis viewing angle as the 3D projection angle, the specific implementation in which the user terminal performs 3D display on the content to be displayed according to the 3D projection angle may be: the user terminal draws the content to be displayed according to the 3D projection angle, and displays the drawing result through a 2D display or a holographic display.
  • In this way, the user terminal can determine different 3D projection angles according to the user's different viewing angles and display pictures at different angles through the 2D display or the holographic display, thereby making the 3D display effect more realistic and improving the 3D display effect.
  • a user terminal having a function of implementing the behavior of the user terminal in the first aspect or the possible implementation manner of the first aspect.
  • This function can be implemented by hardware, or by hardware executing the corresponding software.
  • the hardware or software includes one or more units corresponding to the functions described above.
  • the unit can be software and/or hardware.
  • a user terminal comprising a display, one or more processors, a memory, a bus system, and one or more programs, the one or more processors and the memory being connected by a bus system;
  • The one or more programs are stored in the memory and include instructions; the one or more processors invoke the instructions stored in the memory to implement the solution in the method design of the first aspect above.
  • For the implementation of the user terminal, reference may be made to the implementation of the method, and details are not repeated here.
  • A computer readable storage medium stores one or more programs, the one or more programs comprising instructions that, when executed by a user terminal, cause the user terminal to perform the method of the first aspect or the possible implementations of the first aspect.
  • FIG. 1 to FIG. 3 are schematic diagrams showing a conventional 3D display effect according to an embodiment of the present invention.
  • FIG. 4 is a schematic flow chart of a 3D display method according to an embodiment of the present invention.
  • FIG. 5 and FIG. 6 are schematic diagrams of an included angle according to an embodiment of the present invention.
  • FIG. 7 is a schematic diagram of a 3D display effect according to an embodiment of the present invention.
  • FIG. 8 is a schematic structural diagram of a user terminal according to an embodiment of the present invention.
  • FIG. 9 is a schematic structural diagram of another user terminal according to an embodiment of the present disclosure.
  • FIG. 10 is a schematic structural diagram of still another user terminal according to an embodiment of the present invention.
  • Two Dimension (2D) display: a display that displays a two-dimensional planar image.
  • 2D displays can also perform 3D display by displaying images of 3D models.
  • the 3D display makes the displayed picture stereoscopic and is no longer limited to the plane of the screen, giving the viewer an immersive experience.
  • Three Dimension (3D) display: a display that performs 3D display using the parallax between a person's two eyes.
  • The 3D display can use glasses-type 3D display technology (display technology that requires auxiliary tools such as glasses or helmets to obtain realistic stereoscopic images with spatial depth) or naked-eye 3D display technology (display technology that obtains realistic stereoscopic images with spatial depth without glasses, helmets, or other auxiliary tools) to perform 3D display.
  • Glasses-type 3D display technology may further include technologies such as anaglyph 3D display technology, active shutter 3D display technology, and polarized 3D display technology.
  • Naked-eye 3D display technology may further include technologies such as light barrier 3D display technology.
  • 3D naked eye display: a display using naked-eye 3D display technology, which is a type of 3D display.
  • Holographic display: a display that performs 3D display using holographic technology.
  • the holographic technique is a technique for reproducing a true three-dimensional image of an object using the principle of diffraction.
  • At present, 3D display is generally performed at a fixed 3D projection angle.
  • For example, suppose the user terminal performs 3D display through a 2D display and displays a rectangular parallelepiped at a fixed 3D projection angle of 0°. When the angle between the midpoint of the user's eyes and the center perpendicular (the center perpendicular in this document is the line perpendicular to the center of the display screen; this angle is the user's central axis viewing angle) equals 0°, the cuboid is displayed as shown in FIG. 1, with the first side of the rectangular parallelepiped (the hatched surface in FIG. 1) output to the screen.
  • When the user's central axis viewing angle is not 0°, the display effect of the rectangular parallelepiped is the same as in FIG. 1, and the first side of the rectangular parallelepiped is still output to the screen. That is to say, no matter how the user's viewing angle changes, the picture displayed by the user terminal through the 2D display is always the same, and the picture viewed by the user does not change. Such a 3D display effect is not realistic, and the 3D display effect is poor.
  • Similarly, suppose the user terminal performs 3D display through a holographic display at a fixed 3D projection angle of 0°. Only when the angle between the midpoint of the user's eyes and the center perpendicular equals 0° is the viewing effect of the displayed picture optimal. When the user's central axis viewing angle is not 0°, the user may not be able to see the displayed picture fully, and the stereoscopic effect of the viewed picture is poor.
  • When the user terminal performs 3D display through a 3D display, there are two 3D projection angles: the left eye 3D projection angle and the right eye 3D projection angle. If the user terminal performs 3D display with a fixed left eye 3D projection angle of -5° and a fixed right eye 3D projection angle of 5°, as shown in FIG. 3, the same problem exists.
  • In summary, when the user terminal performs 3D display at a fixed 3D projection angle, the user experience of the 3D display is poor, and the display needs to be further improved.
  • the present invention provides a 3D display method and a user terminal.
  • The user terminal may be a mobile phone, a tablet computer, a personal computer (PC), a personal digital assistant (PDA), a television, a vehicle-mounted computer, or a wearable device (such as a smart watch).
  • the user terminal can have a 2D display, a 3D display, a holographic display, or other display that can be used for 3D display, which is not limited by the embodiment of the present invention.
  • FIG. 4 is a schematic flowchart diagram of a 3D display method according to an embodiment of the present invention. As shown in FIG. 4, the 3D display method may include portions 401 to 403.
  • the user terminal detects the user's viewing angle of the display screen.
  • the viewing angle in the 401 portion may be the user's central axis viewing angle, that is, the angle between the midpoint of the user's eyes and the center perpendicular.
  • The central axis viewing angle can be as shown in FIG. 2.
  • the viewing angle of view in section 401 may include a right eye viewing angle and a left eye viewing angle.
  • the left eye angle of view is an angle between a midpoint of the left eye pupil and a center perpendicular
  • the right eye angle is an angle between a midpoint of the right eye pupil and a center perpendicular.
  • the left eye angle and the right eye angle can be as shown in FIG.
  • Optionally, the specific implementation of part 401 may include parts 11) and 12) below. Alternatively, the user terminal may detect the user's central axis viewing angle by other means, which is not limited in this embodiment of the present invention. Parts 11) and 12) are:
  • 11) The user terminal detects the tilt angle of the user terminal relative to the gravity vertical, the rotation angle of the user terminal about its axis of symmetry, and the angle between the midpoint of the eyes and the camera.
  • 12) The user terminal calculates the user's central axis viewing angle according to the tilt angle, the rotation angle, and the angle between the midpoint of the eyes and the camera.
  • the user terminal can accurately determine the user's central axis viewing angle.
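  • As a concrete illustration of parts 11) and 12), the C sketch below wires the three detected quantities into a single helper. The patent combines them via its Equation 1, which is not reproduced in this text, so the simple sum with the correction parameter is only a hypothetical stand-in, not the patent's actual formula.

```c
/* Hypothetical sketch of parts 11) and 12): combine the detected tilt,
 * rotation, and eyes-to-camera angles into the central axis viewing angle.
 * The patent's Equation 1 is not reproduced in the text, so the linear
 * combination below is a placeholder assumption only. */
double central_axis_viewing_angle(double tilt_deg,        /* vs. gravity vertical     */
                                  double rotation_deg,    /* about the symmetry axis  */
                                  double eyes_camera_deg, /* eyes midpoint vs. camera */
                                  double x_correction)    /* preset correction param  */
{
    /* substitute the patent's Equation 1 here */
    return tilt_deg + rotation_deg + eyes_camera_deg + x_correction;
}
```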
  • The user's central axis viewing angle (i.e., the angle between the midpoint of the eyes and the center perpendicular), the tilt angle of the user terminal relative to the gravity vertical, and the angle between the midpoint of the eyes and the camera may be as shown in FIG. 5.
  • the inclination angle of the user terminal with respect to the vertical line of gravity is the angle between the axis of symmetry of the user terminal and the vertical line of gravity.
  • the vertical line of gravity is a line that coincides with the direction of gravity.
  • The angle between the midpoint of the eyes and the camera is the angle between the line that passes through the camera parallel to the center perpendicular and the line from the camera to the midpoint of the two eyes.
  • The rotation angle of the user terminal about its axis of symmetry may be as shown in FIG. 6, which is a schematic top view of the user terminal. As shown in FIG. 6, if the angle of the user terminal at the first position is 0° and the user terminal rotates about the axis of symmetry to the second position, the rotation angle of the user terminal about the axis of symmetry is the angle between the first position and the second position.
  • the tilt angle of the user terminal relative to the vertical line of gravity can be detected by a gyroscope or a gravity sensor.
  • the tilt angle can be detected by other instruments, which is not limited in the embodiment of the present invention.
  • the angle of rotation of the user terminal about the axis of symmetry can be detected by the gyroscope.
  • the rotation angle can be detected by other instruments, which is not limited in the embodiment of the present invention.
  • a specific implementation manner in which the user terminal detects an angle between a midpoint of the two eyes and the camera may be: the user terminal captures a picture of the user through the camera, and analyzes the captured picture to obtain a distance between the eyes of the user. The user terminal obtains an angle between the midpoint of the eyes and the camera according to the distance between the midpoint of the eyes and the camera and the distance between the eyes. The distance between the midpoint of the eyes and the camera can be detected by a distance sensor (such as an infrared distance sensor or an ultrasonic distance sensor).
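  • The geometry of this step can be sketched as follows (a minimal illustration, not the patent's exact procedure): assuming the picture analysis yields the eye separation in pixels and the horizontal offset of the eyes' midpoint from the image center, the known real interpupillary distance gives a pixel scale at the face plane, and the distance-sensor reading closes the triangle. All parameter names are illustrative.

```c
#include <math.h>

/* Minimal sketch: angle between the camera's axis (parallel to the center
 * perpendicular) and the line from the camera to the eyes' midpoint. The
 * assumed real eye separation (e.g. 63 mm) and the image measurements come
 * from the face-analysis step described above. */
double eyes_camera_angle_deg(double eye_sep_px,      /* eye separation in the picture */
                             double eye_sep_mm,      /* assumed real separation       */
                             double midpoint_off_px, /* midpoint offset from center   */
                             double cam_dist_mm)     /* from the distance sensor      */
{
    double mm_per_px = eye_sep_mm / eye_sep_px;     /* pixel scale at the face plane */
    double offset_mm = midpoint_off_px * mm_per_px; /* lateral offset of the midpoint */
    return atan2(offset_mm, cam_dist_mm) * 180.0 / M_PI;
}
```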
  • Optionally, the user terminal may include a main camera and at least one auxiliary camera. If the angle between the midpoint of the eyes and the camera in parts 11) and 12) is the angle between the midpoint of the eyes and the main camera, the user terminal may detect it as follows: the user terminal controls the auxiliary camera and the main camera to take pictures simultaneously, and obtains the distance between the eyes and the distance from the eyes to the camera from the pictures taken by the two cameras, from which it obtains the angle between the midpoint of the eyes and the main camera.
  • the angle between the midpoint of the user's eyes and the camera can be detected by other instruments, which is not limited in the embodiment of the present invention.
  • the user terminal performs the 11) and 12) portions only when performing 3D display.
  • When executing part 11), the user terminal can detect in real time the tilt angle of the user terminal relative to the gravity vertical, the rotation angle of the user terminal about its axis of symmetry, and the angle between the midpoint of the eyes and the camera. In this way, after the user's viewing angle changes, the user terminal can detect the user's new central axis viewing angle in time, and then adjust the 3D projection angle in time according to the new central axis viewing angle.
  • Optionally, the tilt angle of the user terminal relative to the gravity vertical and the rotation angle of the user terminal about its axis of symmetry may be detected in real time, while the user terminal detects whether the change in the tilt angle or the rotation angle exceeds a preset angle; only when the change exceeds the preset angle does the user terminal detect the angle between the midpoint of the user's eyes and the camera. In this way, during 3D display the camera does not have to be turned on constantly to photograph the user in order to calculate the angle between the midpoint of the user's eyes and the camera, which helps save CPU resources.
  • The specific manner in which the user terminal detects whether the change in the tilt angle or the rotation angle exceeds the preset angle may be: the user terminal determines whether the absolute value of the difference between the most recently detected tilt angle and a first tilt angle exceeds the preset angle, where the first tilt angle is the tilt angle detected the last time the angle between the midpoint of the user's eyes and the camera was detected; or the user terminal determines whether the absolute value of the difference between the most recently detected rotation angle and a first rotation angle exceeds the preset angle, where the first rotation angle is the rotation angle detected the last time the angle between the midpoint of the user's eyes and the camera was detected.
  • Optionally, the user terminal may also detect the tilt angle of the user terminal relative to the gravity vertical, the rotation angle of the user terminal about its axis of symmetry, and the angle between the midpoint of the eyes and the camera periodically rather than continuously. In this way, the tilt angle, the rotation angle, and the angle between the midpoint of the eyes and the camera need not be detected at all times, which helps save CPU resources.
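  • A minimal sketch of this gating logic follows (names are illustrative, not from the patent): the camera-based measurement is refreshed only when tilt or rotation has drifted past the preset angle since the last camera capture.

```c
#include <math.h>
#include <stdbool.h>

/* Tilt and rotation are sampled continuously; the camera is used again only
 * when either has changed by more than the preset angle since the last time
 * the eyes-to-camera angle was detected. */
typedef struct {
    double first_tilt;     /* tilt angle at the last camera-based detection     */
    double first_rotation; /* rotation angle at the last camera-based detection */
    double preset_angle;   /* threshold, e.g. 5 degrees (assumed value)         */
} gate_state;

bool should_redetect_eyes_angle(const gate_state *g,
                                double tilt_now, double rotation_now)
{
    return fabs(tilt_now - g->first_tilt) > g->preset_angle ||
           fabs(rotation_now - g->first_rotation) > g->preset_angle;
}
```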
  • Optionally, the user terminal can calculate the central axis viewing angle by Equation 1, where θ is the central axis viewing angle, α is the tilt angle of the user terminal relative to the gravity vertical, β is the rotation angle of the user terminal about its axis of symmetry, x is a preset angle correction parameter, and γ is the angle between the midpoint of the two eyes and the camera.
  • Optionally, the central axis viewing angle can be calculated by Equation 2, where θ is the central axis viewing angle, α is the tilt angle of the user terminal relative to the gravity vertical, β is the rotation angle of the user terminal about its axis of symmetry, x is a preset angle correction parameter, γ is the angle between the midpoint of the two eyes and the camera, and δ is a preset angle value, which may be an empirical value, for example 45° or 40°.
  • the user terminal can accurately determine the central axis angle of view.
  • Optionally, the user terminal may also calculate the central axis viewing angle without using Equation 1 or Equation 2 above, which is not limited in this embodiment of the present invention.
  • In part 402, the user terminal determines the 3D projection angle according to the viewing angle detected in part 401.
  • the specific implementation of the 402 portion may be: the user terminal determines the detected central axis viewing angle as the 3D projection angle.
  • the user terminal can directly determine the central axis viewing angle as a 3D projection angle after detecting the central axis angle of view.
  • Alternatively, when the user terminal performs 3D display through a display other than a 2D display or a holographic display, the user terminal can likewise directly determine the central axis viewing angle as the 3D projection angle after detecting it; this is not limited in this embodiment of the present invention.
  • the 3D projection angle may include a left eye 3D projection angle and a right eye 3D projection angle.
  • Optionally, the specific implementation of part 402 may be: the user terminal determines the left eye 3D projection angle according to the detected central axis viewing angle and the preset left eye adjustment angle, and determines the right eye 3D projection angle according to the detected central axis viewing angle and the preset right eye adjustment angle.
  • the preset left eye adjustment angle and the preset right eye adjustment angle may be experience values.
  • the preset left eye adjustment angle may be 3°
  • the preset right eye adjustment angle may be -3°.
  • Specifically, the user terminal obtains the user's left eye viewing angle according to the user's central axis viewing angle and the preset left eye adjustment angle, and determines the calculated left eye viewing angle as the left eye 3D projection angle.
  • Similarly, the user terminal obtains the user's right eye viewing angle according to the user's central axis viewing angle and the preset right eye adjustment angle, and determines the calculated right eye viewing angle as the right eye 3D projection angle.
  • the user terminal can obtain the left eye angle of the user and the right eye angle of the user in other manners according to the user's central axis perspective, which is not limited in the embodiment of the present invention.
  • the user terminal may store a plurality of preset left eye adjustment angles and a plurality of preset right eye adjustment angles.
  • the user terminal can pre-store the correspondence between the preset central axis angle of view and the preset left eye adjustment angle.
  • the user terminal can pre-store the correspondence between the preset central axis angle of view and the preset right eye adjustment angle.
  • After detecting the user's central axis viewing angle, the user terminal acquires the preset left eye adjustment angle and the preset right eye adjustment angle corresponding to that central axis viewing angle, and determines the left eye 3D projection angle and the right eye 3D projection angle according to the central axis viewing angle and the corresponding preset left eye adjustment angle and preset right eye adjustment angle.
  • For example, the user terminal can pre-store that the preset central axis viewing angle 0° corresponds to a preset left eye adjustment angle of 3° and that the preset central axis viewing angle 10° corresponds to a preset left eye adjustment angle of 4°.
  • Likewise, the user terminal can pre-store that the preset central axis viewing angle 0° corresponds to a preset right eye adjustment angle of -3° and that the preset central axis viewing angle 10° corresponds to a preset right eye adjustment angle of -4°.
  • After detecting that the user's central axis viewing angle is 0°, the user terminal acquires the pre-stored preset left eye adjustment angle of 3° and preset right eye adjustment angle of -3° corresponding to the central axis viewing angle 0°, and determines the left eye 3D projection angle and the right eye 3D projection angle according to the central axis viewing angle 0°, the preset left eye adjustment angle 3°, and the preset right eye adjustment angle -3°.
  • The specific implementation in which the user terminal determines the left eye 3D projection angle according to the central axis viewing angle and the preset left eye adjustment angle may be: the user terminal determines the difference between the central axis viewing angle and the preset left eye adjustment angle (i.e., the user's left eye viewing angle) as the left eye 3D projection angle.
  • Similarly, the specific implementation of determining the right eye 3D projection angle according to the central axis viewing angle and the preset right eye adjustment angle may be: the user terminal determines the difference between the central axis viewing angle and the preset right eye adjustment angle (i.e., the user's right eye viewing angle) as the right eye 3D projection angle.
  • For example, if the user's central axis viewing angle is 0°, the preset left eye adjustment angle is 3°, and the preset right eye adjustment angle is -3°, the user terminal subtracts 3° from 0° to obtain a left eye 3D projection angle of -3°, and subtracts -3° from 0° to obtain a right eye 3D projection angle of 3°.
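  • The lookup-plus-difference scheme above can be sketched in C as follows; the table entries mirror the example values given earlier, while the nearest-entry lookup is an illustrative assumption about how a detected angle maps onto the stored preset angles.

```c
#include <math.h>
#include <stddef.h>

/* Stored correspondence between preset central axis viewing angles and
 * preset left/right eye adjustment angles (values from the example above). */
typedef struct { double axis, left_adj, right_adj; } adjust_entry;

static const adjust_entry k_table[] = {
    {  0.0, 3.0, -3.0 },
    { 10.0, 4.0, -4.0 },
};

void eye_projection_angles(double axis_angle, double *left_3d, double *right_3d)
{
    /* nearest-entry lookup: an assumed policy, not specified by the patent */
    const adjust_entry *best = &k_table[0];
    for (size_t i = 1; i < sizeof k_table / sizeof k_table[0]; i++)
        if (fabs(k_table[i].axis - axis_angle) < fabs(best->axis - axis_angle))
            best = &k_table[i];

    /* projection angle = central axis viewing angle minus adjustment angle */
    *left_3d  = axis_angle - best->left_adj;   /* e.g. 0 - 3    = -3 deg */
    *right_3d = axis_angle - best->right_adj;  /* e.g. 0 - (-3) =  3 deg */
}
```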
  • Optionally, when the viewing angle includes the left eye viewing angle and the right eye viewing angle, the specific implementation of part 402 may be: the user terminal determines the detected left eye viewing angle as the left eye 3D projection angle, and the detected right eye viewing angle as the right eye 3D projection angle.
  • the user terminal performs 3D display on the content to be displayed according to the 3D projection angle.
  • After determining the 3D projection angle, the user terminal performs a drawing operation on the content to be displayed according to the 3D projection angle, and displays the drawing result through the corresponding display.
  • Optionally, the specific implementation of part 403 may be: the user terminal draws the content to be displayed according to the 3D projection angle, and displays the drawing result through the 2D display.
  • Specifically, the user terminal may send the 3D projection angle to a graphics processing unit (GPU) of the user terminal; the GPU draws the content to be displayed according to the 3D projection angle, and displays the drawing result through the 2D display.
  • Specifically, the GPU may draw the content to be displayed according to the 3D projection angle and display the drawing result through the 2D display as follows: the GPU rasterizes the content to be displayed into the FrameBuffer according to the 3D projection angle, and the drawing result in the FrameBuffer is displayed through the 2D display.
  • Optionally, the GPU may also draw the content to be displayed according to the 3D projection angle in another manner, which is not limited in this embodiment of the present invention.
  • Optionally, the specific implementation of part 403 may be: the user terminal draws the content to be displayed according to the 3D projection angle, and displays the drawing result through the holographic display.
  • the user terminal may send the 3D projection angle to the GPU of the user terminal.
  • After obtaining the 3D projection angle, the GPU draws the content to be displayed according to the 3D projection angle and, after the drawing is completed, displays the drawing result through the holographic display.
  • Specifically, the GPU can draw the content to be displayed according to the 3D projection angle by using the function glRotatef(GLfloat angle, GLfloat x, GLfloat y, GLfloat z), where angle is the 3D projection angle. The glRotatef function rotates the current coordinate system by angle degrees about the rotation axis given by the vector (x, y, z).
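  • A minimal legacy-OpenGL sketch of this step follows. The choice of the vertical y axis as the rotation axis and the draw_content() placeholder are assumptions; the patent only specifies that glRotatef rotates the coordinate system by the 3D projection angle before drawing.

```c
#include <GL/gl.h>

extern void draw_content(void); /* placeholder for the scene's draw calls */

void draw_with_projection_angle(GLfloat projection_angle_deg)
{
    glMatrixMode(GL_MODELVIEW);
    glPushMatrix();
    /* rotate the current coordinate system by the 3D projection angle;
     * using (0, 1, 0) as the rotation axis is an assumption */
    glRotatef(projection_angle_deg, 0.0f, 1.0f, 0.0f);
    draw_content();
    glPopMatrix();
}
```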
  • the specific implementation of the 403 part may be: the user terminal draws the content to be displayed according to the left eye 3D projection angle and the right eye 3D projection angle, and displays the drawing result through the 3D display.
  • the 3D display can be a 3D naked eye display or a 3D display that requires viewing with glasses, a helmet, and the like.
  • the user terminal may send the left eye 3D projection angle and the right eye 3D projection angle to the GPU of the user terminal.
  • the GPU draws the content to be displayed according to the left eye 3D projection angle and the right eye 3D projection angle, and displays the drawing result through the 3D display.
  • Specifically, the GPU may draw the content to be displayed according to the left eye 3D projection angle and the right eye 3D projection angle as follows: the GPU rasterizes the content to be displayed into one FrameBuffer according to the left eye 3D projection angle, and rasterizes the content to be displayed into another FrameBuffer according to the right eye 3D projection angle; accordingly, the drawing results in the two FrameBuffers are displayed by the 3D display.
  • Optionally, the GPU may also draw the content to be displayed according to the left eye 3D projection angle and the right eye 3D projection angle in another manner, which is not limited in this embodiment of the present invention.
  • Specifically, the GPU can draw the content to be displayed according to the left eye 3D projection angle by using glRotatef(GLfloat angle, GLfloat x, GLfloat y, GLfloat z), where angle is the left eye 3D projection angle; likewise, the GPU can draw the content to be displayed according to the right eye 3D projection angle by using the same function with angle set to the right eye 3D projection angle.
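  • Putting the two per-eye passes together, a hedged sketch of the dual-FrameBuffer drawing might look like this; it assumes a framebuffer-object extension (GL 3.0+ or GL_ARB_framebuffer_object) is available alongside the fixed-function pipeline, and the fbo_left/fbo_right setup is omitted.

```c
#include <GL/gl.h>

extern GLuint fbo_left, fbo_right;  /* assumed pre-created framebuffer objects */
extern void draw_content(void);     /* placeholder for the scene's draw calls  */
/* glBindFramebuffer/GL_FRAMEBUFFER need GL 3.0+ or ARB_framebuffer_object */

static void draw_eye(GLuint fbo, GLfloat eye_angle_deg)
{
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glMatrixMode(GL_MODELVIEW);
    glPushMatrix();
    glRotatef(eye_angle_deg, 0.0f, 1.0f, 0.0f); /* assumed vertical axis */
    draw_content();
    glPopMatrix();
}

void draw_stereo(GLfloat left_deg, GLfloat right_deg)
{
    draw_eye(fbo_left,  left_deg);   /* e.g. -5 degrees */
    draw_eye(fbo_right, right_deg);  /* e.g. +5 degrees */
}
```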
  • Application scenario 1: The user terminal performs 3D display on a rectangular parallelepiped through a 2D display.
  • the user terminal detects that the user's central axis angle of view is 0°
  • the user terminal determines 0° as the 3D projection angle.
  • the user terminal draws the rectangular parallelepiped according to the 3D projection angle of 0°, and displays the drawn rectangular parallelepiped through the 2D display.
  • the display result can be as shown in FIG. 1.
  • If the user terminal detects that the user's central axis viewing angle is 30°, the user terminal determines 30° as the 3D projection angle, draws the rectangular parallelepiped according to the 3D projection angle of 30°, and displays the drawn rectangular parallelepiped through the 2D display. The display result can be as shown in FIG. 7.
  • Application scenario 2: The user terminal performs 3D display on a rectangular parallelepiped through a holographic display.
  • the user terminal detects that the user's central axis angle of view is 0°
  • the user terminal determines 0° as the 3D projection angle.
  • the user terminal draws the rectangular parallelepiped according to the 3D projection angle of 0°, and displays the rectangular parallelepiped by the holographic display.
  • the user terminal detects that the user's central axis angle of view is 30°
  • the user terminal determines 30° as the 3D projection angle.
  • the user terminal draws the rectangular parallelepiped according to the 3D projection angle of 30°, and displays the rectangular parallelepiped by the holographic display.
  • Application scenario 3: The user terminal performs 3D display through a 3D display.
  • the user terminal can detect the left eye angle and the right eye angle of the user, determine the left eye angle of view as the left eye 3D projection angle, and the right eye angle of view as the right eye 3D projection angle. For example, if the user terminal detects that the left eye angle of the user is -5° and the right eye angle of the user is 5°, the user terminal determines -5° as the left eye 3D projection angle and 5° as the right eye 3D projection. Angle, and draw a rectangular parallelepiped according to the left eye 3D projection angle -5° and the right eye 3D projection angle of 5°, and display the drawn rectangular parallelepiped through the 3D display.
  • For another example, if the user terminal detects that the user's left eye viewing angle is 10° and right eye viewing angle is 20°, the user terminal determines 10° as the left eye 3D projection angle and 20° as the right eye 3D projection angle, draws a rectangular parallelepiped according to the left eye 3D projection angle of 10° and the right eye 3D projection angle of 20°, and displays the rectangular parallelepiped through the 3D display.
  • Application scenario 4: The user terminal performs 3D display through a 3D display.
  • In this scenario, the user terminal derives the user's left eye viewing angle and right eye viewing angle from the central axis viewing angle.
  • Suppose the preset left eye adjustment angle pre-stored by the user terminal is 5°, and the preset right eye adjustment angle is -5°.
  • When the user terminal detects that the user's central axis viewing angle is 0°, it subtracts 5° from 0° to obtain a left eye viewing angle of -5°, and determines the left eye viewing angle -5° as the left eye 3D projection angle; the user terminal subtracts -5° from 0° to obtain a right eye viewing angle of 5°, and determines the right eye viewing angle 5° as the right eye 3D projection angle.
  • the user terminal draws the rectangular parallelepiped according to the left-eye 3D projection angle -5° and the right-eye 3D projection angle of 5°, and displays the drawn rectangular parallelepiped through the 3D display.
  • If the user terminal detects that the user's central axis viewing angle is 15°, the user terminal subtracts 5° from 15° to obtain a left eye 3D projection angle of 10°, and subtracts -5° from 15° to obtain a right eye 3D projection angle of 20°. The user terminal then draws a rectangular parallelepiped according to the left eye 3D projection angle of 10° and the right eye 3D projection angle of 20°, and displays the rectangular parallelepiped through the 3D display.
  • In the above application scenarios, the user terminal uses different 3D projection angles for 3D display according to the user's different viewing angles, which makes the 3D display effect more realistic, enables the user to view the 3D picture more clearly, and improves the 3D display effect.
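  • Tying parts 401 to 403 together, an end-to-end sketch using the illustrative helpers from the earlier snippets might read as follows (all names come from those hypothetical sketches, not the patent).

```c
#include <GL/gl.h>

/* prototypes from the earlier hypothetical sketches */
double central_axis_viewing_angle(double tilt_deg, double rotation_deg,
                                  double eyes_camera_deg, double x_correction);
void eye_projection_angles(double axis_angle, double *left_3d, double *right_3d);
void draw_stereo(GLfloat left_deg, GLfloat right_deg);

/* End-to-end flow: detect the viewing angle (401), derive the left/right
 * eye 3D projection angles (402), then draw both eye views (403). */
void on_viewing_update(double tilt_deg, double rotation_deg,
                       double eyes_camera_deg)
{
    double axis = central_axis_viewing_angle(tilt_deg, rotation_deg,
                                             eyes_camera_deg, 0.0);
    double left, right;
    eye_projection_angles(axis, &left, &right);
    draw_stereo((GLfloat)left, (GLfloat)right);
}
```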
  • The embodiment of the present invention may divide the user terminal into functional units according to the foregoing method example.
  • each functional unit may be divided according to each function, or two or more functions may be integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware or in the form of a software functional unit. It should be noted that the division of the unit in the embodiment of the present invention is schematic, and is only a logical function division, and the actual implementation may have another division manner.
  • FIG. 8 is a schematic structural diagram of a user terminal according to an embodiment of the present invention.
  • As shown in FIG. 8, the user terminal includes a detection module 801, a determination module 802, and a display module 803, where:
  • the detecting module 801 is configured to detect a viewing angle of the user on the display screen.
  • the determining module 802 is configured to determine a 3D projection angle according to the viewing angle of view.
  • the display module 803 is configured to perform 3D display on the content that needs to be displayed according to the 3D projection angle.
  • the viewing angle detected by the detecting module 801 is a central axis viewing angle, where the central axis viewing angle is an angle between a midpoint of the two eyes and a center perpendicular, and the center perpendicular is perpendicular to the center of the display screen. The line of the location.
  • the determining module 802 is specifically configured to: determine the central axis viewing angle as a 3D projection angle.
  • Optionally, the display module 803 is specifically configured to draw the content to be displayed according to the 3D projection angle, and display the drawing result through a 2D display or a holographic display.
  • the 3D projection angle includes a left-eye 3D projection angle and a right-eye 3D projection angle.
  • Optionally, the determining module 802 is specifically configured to: determine the left eye 3D projection angle according to the central axis viewing angle detected by the detecting module 801 and the preset left eye adjustment angle; and determine the right eye 3D projection angle according to the central axis viewing angle detected by the detecting module 801 and the preset right eye adjustment angle.
  • the viewing angle detected by the detecting module 801 includes a left-eye viewing angle and a right-eye viewing angle, where the left-eye viewing angle is an angle between a midpoint of the left-eye pupil and a center perpendicular, the right-eye viewing angle. It is the angle between the midpoint of the right eye pupil and the center perpendicular.
  • the 3D projection angle includes a left eye 3D projection angle and a right eye 3D projection angle.
  • Optionally, the determining module 802 is specifically configured to: determine the left eye viewing angle detected by the detecting module 801 as the left eye 3D projection angle, and determine the right eye viewing angle detected by the detecting module 801 as the right eye 3D projection angle.
  • Optionally, the display module 803 is specifically configured to draw the content to be displayed according to the left eye 3D projection angle and the right eye 3D projection angle, and display the drawing result through a 3D display.
  • the detection module 801 is used to perform the method in step 401 in FIG. 4 of the method embodiment of the present invention.
  • the determining module 802 is used to perform the method in the step 402 of FIG. 4 in the method embodiment of the present invention.
  • the display module 803 is used to perform the method of the step 403 in FIG. 4 of the method embodiment of the present invention.
  • the implementation of the display module 803 can refer to the description of the step 403 in FIG. 4 of the method embodiment of the present invention, and details are not described herein again.
  • FIG. 9 is a schematic structural diagram of another user terminal according to an embodiment of the present invention.
  • The user terminal shown in FIG. 9 is an optimization of the user terminal shown in FIG. 8 and includes all the modules shown in FIG. 8.
  • the detecting module 801 of the user terminal of FIG. 9 includes a detecting unit 8011 and a calculating unit 8012, wherein:
  • the detecting unit 8011 is configured to detect an inclination angle of the user terminal with respect to a vertical line of gravity, a rotation angle of the user terminal rotating about the axis of symmetry, and an angle between a midpoint of the eyes and the camera.
  • the calculating unit 8012 is configured to calculate the central axis viewing angle according to the tilt angle, the rotation angle, and the angle between the midpoint of the eyes and the camera.
  • Optionally, the detecting unit 8011 is specifically configured to: detect the tilt angle of the user terminal relative to the gravity vertical and the rotation angle of the user terminal about its axis of symmetry; and when the change in the tilt angle or the rotation angle is greater than a preset angle, detect the angle between the midpoint of the eyes and the camera.
  • the specific implementations of the detecting unit 8011 and the calculating unit 8012 can be referred to the corresponding description of the foregoing method embodiments, and are not described herein for brevity.
  • The principle by which the user terminal provided in this embodiment of the present invention solves the problem is similar to that of the 3D display method in the method embodiment of the present invention; therefore, for the implementation of the user terminal, reference may be made to the implementation of the method, and details are not repeated here.
  • FIG. 10 is a block diagram of part of the structure of a mobile phone 1000 related to an embodiment of the present invention.
  • As shown in FIG. 10, the mobile phone 1000 includes a Radio Frequency (RF) circuit 1001, a memory 1002, other input devices 1003, a display screen 1004, a sensor 1005, an audio circuit 1006, an I/O subsystem 1007, a processor 1008, a power supply 1009, and other components.
  • The structure of the mobile phone shown in FIG. 10 does not constitute a limitation on the mobile phone, which may include more or fewer components than those illustrated, combine some components, split some components, or use a different arrangement of components.
  • The RF circuit 1001 can be used for receiving and transmitting signals during information transmission and reception or during a call. Specifically, downlink information received from the base station is passed to the processor 1008 for processing, and uplink data is sent to the base station.
  • RF circuits include, but are not limited to, an antenna, at least one amplifier, a transceiver, a coupler, an LNA (Low Noise Amplifier), a duplexer, and the like.
  • the RF circuit 1001 can also communicate with the network and other devices through wireless communication.
  • The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA (Code Division Multiple Access), WCDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), e-mail, SMS (Short Messaging Service), and the like.
  • the memory 1002 can be used to store computer executable program code, the program code including instructions; the processor 1008 executes various functional applications and data processing of the mobile phone 1000 by running software programs and modules stored in the memory 1002.
  • The storage program area may store an operating system and the applications required for at least one function (such as a sound playing function or an image playing function), and the storage data area may store data created according to the use of the mobile phone 1000 (such as audio data, a phone book, etc.).
  • The memory 1002 may include a ROM and a RAM, and may also include a high-speed random access memory and a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • Other input devices 1003 can be used to receive input numeric or character information, as well as generate key signal inputs related to user settings and function controls of handset 1000.
  • Specifically, the other input devices 1003 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control buttons and a power switch button), a trackball, a mouse, a joystick, and a light mouse (a light mouse is a touch-sensitive surface that does not display visual output, or an extension of the touch-sensitive surface formed by a touch screen).
  • The other input devices 1003 are connected to the other input device controllers 171 of the I/O subsystem 1007 and exchange signals with the processor 1008 under the control of the other input device controllers 171.
  • the display screen 1004 can be used to display information entered by the user or information provided to the user as well as various menus of the handset 1000, and can also accept user input.
  • The display screen 1004 can display the information that needs to be displayed in the foregoing method embodiments, such as a prompt for an unread message, a selection list including message choices, a selection list including a plurality of time periods, an upward jump arrow, or a downward jump arrow.
  • Specifically, the display screen 1004 can include a display panel 141 and a touch panel 142.
  • the display panel 141 can be configured by using an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like.
  • The touch panel 142, also referred to as a touch screen or a touch-sensitive screen, can collect contact or contactless operations performed by the user on or near it (for example, operations performed by the user on or near the touch panel 142 using a finger, a stylus, or any other suitable object or accessory; somatosensory operations may also be included), including single-point control operations and multi-point control operations, and can drive the corresponding connected device according to a preset program.
  • the touch panel 142 may include two parts: a touch detection device and a touch controller.
  • The touch detection device detects the position and gesture of the user's touch, detects the signal brought by the touch operation, and transmits the signal to the touch controller. The touch controller receives the touch information from the touch detection device, converts it into information the processor can handle, and sends it to the processor 1008; it can also receive commands from the processor 1008 and execute them.
  • The touch panel 142 can be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave, or by any technology developed in the future.
  • The touch panel 142 can cover the display panel 141. When the user operates on or near the touch panel 142 covering the display panel 141 according to the content displayed by the display panel 141 (the displayed content includes, but is not limited to, a soft keyboard, a virtual mouse, virtual buttons, and icons), the touch panel 142 detects the touch operation on or near it and transmits it through the I/O subsystem 1007 to the processor 1008 to determine the type of the touch event and the user input; the processor 1008 then provides the corresponding visual output on the display panel 141 through the I/O subsystem 1007 according to the user input and the type of the touch event.
  • Although in FIG. 10 the touch panel 142 and the display panel 141 are two independent components implementing the input and output functions of the mobile phone 1000, in some embodiments the touch panel 142 may be integrated with the display panel 141 to implement the input and output functions of the mobile phone 1000.
  • The mobile phone 1000 can also include at least one type of sensor 1005, such as a fingerprint sensor, a light sensor, a motion sensor, a gravity sensor, a gyroscope, and other sensors.
  • Specifically, the light sensor may include an ambient light sensor and a proximity sensor: the ambient light sensor can adjust the brightness of the display panel 141 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 141 and/or the backlight when the mobile phone 1000 moves close to the ear.
  • As a kind of motion sensor, the accelerometer can detect the magnitude of acceleration in all directions (usually three axes) and, when stationary, the magnitude and direction of gravity. It can be used for applications that identify the posture of the mobile phone (such as landscape/portrait switching, related games, and magnetometer attitude calibration) and for vibration-recognition functions (such as a pedometer or tap detection). The mobile phone 1000 can also be configured with a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and other sensors, which are not described here.
The audio circuit 1006, the speaker 161, and the microphone 162 can provide an audio interface between the user and the mobile phone 1000. The audio circuit 1006 can transmit a signal converted from received audio data to the speaker 161, which converts it into a sound signal for output; conversely, the microphone 162 converts a collected sound signal into an electrical signal, which the audio circuit 1006 receives and converts into audio data. The audio data is then output to the RF circuit 1001 for transmission to, for example, another mobile phone, or output to the memory 1002 for further processing.
The I/O subsystem 1007 controls external input and output devices and may include an other-input-device controller 171, a sensor controller 172, and a display controller 173. Optionally, one or more other-input-device controllers 171 receive signals from and/or send signals to the other input devices 1003, which may include physical buttons (press buttons, rocker buttons, and the like), a dial pad, a slide switch, a joystick, a click wheel, or an optical mouse (an optical mouse is a touch-sensitive surface that does not display visual output, or an extension of the touch-sensitive surface formed by a touchscreen). It is worth noting that the other-input-device controller 171 may be connected to any one or more of the foregoing devices. The display controller 173 in the I/O subsystem 1007 receives signals from and/or sends signals to the display screen 1004. After the display screen 1004 detects user input, the display controller 173 converts the detected user input into interaction with the user-interface objects displayed on the display screen 1004, thereby implementing human-computer interaction. The sensor controller 172 can receive signals from and/or send signals to one or more sensors 1005.
The processor 1008 is the control center of the mobile phone 1000. It connects the parts of the entire phone through various interfaces and lines and, by running or executing the software programs and/or modules stored in the memory 1002 and invoking the data stored in the memory 1002, performs the various functions of the mobile phone 1000 and processes data, thereby monitoring the phone as a whole. Optionally, the processor 1008 may include one or more processing units. Preferably, the processor 1008 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, applications, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 1008. When the processor 1008 executes the instructions stored in the memory 1002, the instructions cause the mobile phone 1000 to perform the 3D display method of the embodiments of the present invention. Based on the same inventive concept, the principle by which this user terminal solves the problem is similar to that of the method embodiments of the present invention; therefore, for the implementation of the user terminal, reference may be made to the implementation of the foregoing method, and for brevity, details are not repeated here.
The mobile phone 1000 further includes a power supply 1009 (such as a battery) that supplies power to the components. Preferably, the power supply may be logically connected to the processor 1008 through a power management system, so that functions such as charging, discharging, and power-consumption management are implemented through the power management system. Although not shown, the mobile phone 1000 may further include a camera, a Bluetooth module, and the like, which are not described here again.
An embodiment of the present invention further provides a non-volatile computer-readable storage medium storing one or more programs. The non-volatile computer-readable storage medium stores at least one program, and each program includes instructions that, when executed by the user terminal provided in the embodiments of the present invention, cause the user terminal to perform parts 401 to 403 in FIG. 4 or the other execution processes of the user terminal in the foregoing method embodiments; reference may be made to the corresponding descriptions of parts 401 to 403 in FIG. 4 or of the other execution processes of the user terminal in the foregoing method embodiments, which are not repeated here.
A person skilled in the art should be aware that, in one or more of the foregoing examples, the functions described herein may be implemented by hardware, software, firmware, or any combination thereof. When implemented by software, these functions may be stored in a computer-readable medium or transmitted as one or more instructions or code on a computer-readable medium. Computer-readable media include computer storage media and communication media, where communication media include any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium accessible to a general-purpose or special-purpose computer.


Abstract

Embodiments of the present invention disclose a 3D display method and a user terminal. The method includes: the user terminal detects the user's viewing angle with respect to the display screen; the user terminal determines a 3D projection angle according to the viewing angle; and the user terminal performs 3D display of the content to be displayed according to the 3D projection angle. It can be seen that, by implementing the embodiments of the present invention, the user terminal can display images at different 3D projection angles as the user's viewing angle changes, which makes the 3D display effect more realistic and improves the 3D display effect.

Description

3D Display Method and User Terminal
Technical Field
The present invention relates to the field of terminal technologies, and in particular, to a 3D display method and a user terminal.
Background
Three-dimensional (3D) display technology can make the output picture stereoscopic and lifelike, giving the viewer an immersive experience. At present, 3D display technology is widely used in user terminals (such as mobile phones, computers, and televisions).
However, it has been found in practice that the user must view 3D-displayed content from a certain fixed angle to get the best 3D experience; that is, the 3D display of the user terminal cannot flexibly adapt to the user's different viewing positions or viewing angles, and 3D display technology needs further improvement.
Summary of the Invention
Embodiments of the present invention disclose a 3D display method and a user terminal, which can dynamically adjust the 3D projection angle (3D Project Angle) according to the user's viewing angle, thereby solving the problem that 3D display performed at a fixed 3D projection angle cannot flexibly adapt to the user's different viewing positions or viewing angles.
According to a first aspect, a 3D display method applied to a user terminal is provided. The method includes: the user terminal detects the user's viewing angle with respect to the display screen; the user terminal determines a 3D projection angle according to the viewing angle; and the user terminal performs 3D display of the content to be displayed according to the 3D projection angle.
In an optional implementation, the viewing angle may be a central-axis viewing angle, which is the included angle between the midpoint of the two eyes and the central perpendicular, where the central perpendicular is the line perpendicular to the display screen at its center.
In an optional implementation, the viewing angle may include a left-eye viewing angle and a right-eye viewing angle, where the left-eye viewing angle is the included angle between the midpoint of the left pupil and the central perpendicular, the right-eye viewing angle is the included angle between the midpoint of the right pupil and the central perpendicular, and the central perpendicular is the line perpendicular to the display screen at its center.
It can be seen that, by implementing the method provided in the first aspect, the user terminal can dynamically adjust the 3D projection angle used for 3D display according to the user's viewing angle, flexibly adapt to the user's different viewing positions or viewing angles, make the 3D display effect more realistic, let the user see the 3D picture more clearly, and improve the 3D display effect.
In an optional implementation, when the viewing angle detected by the user terminal is the central-axis viewing angle, detecting the user's viewing angle with respect to the display screen may include: the user terminal detects the tilt angle of the user terminal relative to the gravity vertical, the rotation angle of the user terminal about its axis of symmetry, and the angle between the midpoint of the two eyes and the camera; and the user terminal calculates the central-axis viewing angle according to the tilt angle, the rotation angle, and the angle between the midpoint of the two eyes and the camera.
By implementing this implementation, the user terminal can accurately determine the central-axis viewing angle.
In an optional implementation, the user terminal may detect, in real time, the tilt angle of the user terminal relative to the gravity vertical, the rotation angle of the user terminal about the axis of symmetry, and the angle between the midpoint of the two eyes and the camera. In this way, after the user's viewing angle changes, the user terminal can promptly detect the user's new central-axis viewing angle and then promptly adjust the 3D projection angle according to the new central-axis viewing angle.
In an optional implementation, detecting the tilt angle of the user terminal relative to the gravity vertical, the rotation angle of the user terminal about the axis of symmetry, and the angle between the midpoint of the two eyes and the camera may include: the user terminal detects the tilt angle of the user terminal relative to the gravity vertical and the rotation angle of the user terminal about the axis of symmetry; when the user terminal detects that the change of the tilt angle or the rotation angle is greater than a preset angle, it detects the angle between the midpoint of the two eyes and the camera. In this way, during 3D display, the camera does not need to stay on taking pictures of the user to calculate the included angle between the midpoint of the user's eyes and the camera, which helps save CPU resources.
Optionally, a specific implementation in which the user terminal detects whether the change of the tilt angle or the rotation angle exceeds the preset angle may be: the user terminal determines whether the absolute value of the difference between the most recently detected tilt angle and a first tilt angle exceeds the preset angle, where the first tilt angle is the tilt angle detected the last time the included angle between the midpoint of the user's eyes and the camera was detected; or the user terminal determines whether the absolute value of the difference between the most recently detected rotation angle and a first rotation angle exceeds the preset angle, where the first rotation angle is the rotation angle detected the last time the included angle between the midpoint of the user's eyes and the camera was detected.
In an optional implementation, the user terminal may specifically calculate the central-axis viewing angle through the following Formula 1. Formula 1:
[Formula 1 is given as an image in the source: Figure PCTCN2016101374-appb-000001]
Here, θ is the central-axis viewing angle, α is the tilt angle of the user terminal relative to the gravity vertical, β is the rotation angle of the user terminal about the axis of symmetry, x is a preset angle-correction parameter, and λ is the angle between the midpoint of the two eyes and the camera.
Through Formula 1, the user terminal can accurately determine the central-axis viewing angle.
In an optional implementation, the user terminal may specifically calculate the central-axis viewing angle through the following Formula 2. Formula 2:
[Formula 2 is given as an image in the source: Figure PCTCN2016101374-appb-000002]
Here, θ is the central-axis viewing angle, α is the tilt angle of the user terminal relative to the gravity vertical, β is the rotation angle of the user terminal about the axis of symmetry, x is a preset angle-correction parameter, λ is the angle between the midpoint of the two eyes and the camera, and ε is a preset angle value, which may be an empirical value such as 45° or 40°.
Through Formula 2, the user terminal can accurately determine the central-axis viewing angle.
In an optional implementation, the 3D projection angle includes a left-eye 3D projection angle and a right-eye 3D projection angle, and determining the 3D projection angle by the user terminal according to the detected central-axis viewing angle may include: the user terminal determines the left-eye 3D projection angle according to the detected central-axis viewing angle and a preset left-eye adjustment angle; and the user terminal determines the right-eye 3D projection angle according to the detected central-axis viewing angle and a preset right-eye adjustment angle.
By implementing this implementation, the user terminal can accurately determine the left-eye 3D projection angle and the right-eye 3D projection angle.
In an optional implementation, the preset left-eye adjustment angle is the preset left-eye adjustment angle corresponding to the central-axis viewing angle detected by the user terminal in a prestored correspondence between preset central-axis viewing angles and preset left-eye adjustment angles; the preset right-eye adjustment angle is the preset right-eye adjustment angle corresponding to the central-axis viewing angle detected by the user terminal in a prestored correspondence between preset central-axis viewing angles and preset right-eye adjustment angles.
By storing multiple sets of preset left-eye and right-eye adjustment angles in this way and obtaining the pair corresponding to the currently detected central-axis viewing angle, more accurate preset adjustment angles can be obtained, so that the left-eye and right-eye 3D projection angles can be obtained more accurately.
In an optional implementation, the 3D projection angle includes a left-eye 3D projection angle and a right-eye 3D projection angle. When the viewing angle includes the left-eye viewing angle and the right-eye viewing angle, determining the 3D projection angle by the user terminal according to the viewing angle may include: the user terminal determines the left-eye viewing angle as the left-eye 3D projection angle and the right-eye viewing angle as the right-eye 3D projection angle.
In this way, the left-eye 3D projection angle and the right-eye 3D projection angle can be determined accurately.
In an optional implementation, when the 3D projection angle includes a left-eye 3D projection angle and a right-eye 3D projection angle, performing 3D display of the content to be displayed by the user terminal according to the 3D projection angle may include: the user terminal draws the content to be displayed according to the left-eye 3D projection angle and the right-eye 3D projection angle, and displays the drawing result through a 3D display.
By implementing this implementation, as the user's viewing angle changes, the user terminal can determine different 3D projection angles and then display images at different 3D projection angles through the 3D display, which makes the 3D display effect more realistic, makes the picture the user views clearer, and improves the 3D display effect.
In an optional implementation, when the viewing angle detected by the user terminal is the central-axis viewing angle, determining the 3D projection angle by the user terminal according to the viewing angle may include: the user terminal determines the central-axis viewing angle as the 3D projection angle.
By implementing this implementation, the user terminal can accurately determine the 3D projection angle.
In an optional implementation, if the user terminal determines the central-axis viewing angle as the 3D projection angle, performing 3D display of the content to be displayed by the user terminal according to the 3D projection angle may include: the user terminal draws the content to be displayed according to the 3D projection angle and displays the drawing result through a 2D display or a holographic display.
By implementing this implementation, as the user's viewing angle changes, the user terminal can determine different 3D projection angles and then display images at different angles through the 2D display or holographic display, which makes the 3D display effect more realistic and improves the 3D display effect.
According to a second aspect, a user terminal is provided. The user terminal has the function of implementing the behavior of the user terminal in the first aspect or its possible implementations. The function may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more units corresponding to the foregoing function, and a unit may be software and/or hardware. Based on the same inventive concept, since the principle by which the user terminal solves the problem and its beneficial effects may be found in the first aspect, its possible method implementations, and the beneficial effects they bring, the implementation of the user terminal may refer to the first aspect and its possible method implementations, and repeated parts are not described again.
According to a third aspect, a user terminal is provided, including a display, one or more processors, a memory, a bus system, and one or more programs, where the one or more processors and the memory are connected through the bus system; the one or more programs are stored in the memory and include instructions, and the processor invokes the instructions stored in the memory to implement the solution in the method design of the first aspect. Since the implementations by which the user terminal solves the problem and the beneficial effects may be found in the first aspect and its possible method implementations, the implementation of the user terminal may refer to the implementation of the method, and repeated parts are not described again.
According to a fourth aspect, a computer-readable storage medium storing one or more programs is provided. The one or more programs include instructions that, when executed by a user terminal, cause the user terminal to perform the method of the first aspect or its possible implementations.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the following briefly introduces the accompanying drawings required in the embodiments. Apparently, the accompanying drawings described below are only some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from these drawings without creative effort.
FIG. 1 to FIG. 3 are schematic diagrams of an existing 3D display effect according to an embodiment of the present invention;
FIG. 4 is a schematic flowchart of a 3D display method according to an embodiment of the present invention;
FIG. 5 and FIG. 6 are schematic diagrams of included angles according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of a 3D display effect according to an embodiment of the present invention;
FIG. 8 is a schematic structural diagram of a user terminal according to an embodiment of the present invention;
FIG. 9 is a schematic structural diagram of another user terminal according to an embodiment of the present invention;
FIG. 10 is a schematic structural diagram of still another user terminal according to an embodiment of the present invention.
Detailed Description
To make the objectives, technical solutions, and advantages of the present invention clearer, the following describes the technical solutions of the embodiments of the present invention with reference to the accompanying drawings.
To facilitate understanding of the embodiments of the present invention, the technical terms involved are introduced first:
Two-dimensional (2D) display: a display that can present two-dimensional planar images. At present, some 2D displays can also perform 3D display by displaying images of a 3D model. 3D display can make the displayed picture stereoscopic and lifelike, no longer confined to the plane of the screen, giving the viewer an immersive experience.
Three-dimensional (3D) display: a display that performs 3D display by exploiting the parallax between the two human eyes. For example, the 3D display may use glasses-based 3D display technology (that is, display technology in which glasses, a helmet, or another auxiliary tool must be worn to obtain a realistic stereoscopic image with spatial depth) or naked-eye 3D display technology (that is, display technology in which a realistic stereoscopic image with spatial depth can be obtained without wearing glasses, a helmet, or another auxiliary tool). Glasses-based 3D display technology includes anaglyphic 3D, active-shutter 3D, and polarized 3D display technologies. Naked-eye 3D display technology includes light-barrier 3D, lenticular-lens 3D, and directional-backlight 3D display technologies.
Naked-eye 3D display: a display that uses naked-eye 3D display technology; it is a kind of 3D display.
Holographic display: a display that performs 3D display using holographic technology, which reproduces a real three-dimensional image of an object by means of the principle of diffraction.
In existing practical applications, whenever a user terminal performs 3D display through any display, it does so at a fixed 3D projection angle.
For example, when the user terminal performs 3D display through a 2D display, if the user terminal displays a cuboid at a fixed 3D projection angle of 0°, then when the included angle between the midpoint of the user's eyes and the central perpendicular (in this document, the central perpendicular is the line perpendicular to the display screen at its center), that is, the user's central-axis viewing angle, equals 0°, the display effect of the cuboid is as shown in FIG. 1: the first side face of the cuboid is output facing the screen. The first side face is the shaded face of the cuboid in FIG. 1. When the user's central-axis viewing angle becomes 30°, as shown in FIG. 2, the display effect of the cuboid is the same as in FIG. 1, with the first side face still output facing the screen. That is, no matter how the user's viewing angle changes, the picture displayed by the user terminal through the 2D display is always the same, and the picture the user sees does not change. Such a 3D display effect is not realistic, and the 3D display effect is poor.
For another example, when the user terminal performs 3D display through a holographic display, if the user terminal displays a picture at a fixed 3D projection angle of 0°, the viewing effect is best when the included angle between the midpoint of the user's eyes and the central perpendicular equals 0°. When the user's central-axis viewing angle is not 0°, the user may not see the displayed picture clearly, and the stereoscopic effect of the viewed picture is also poor.
For another example, when the user terminal performs 3D display through a 3D display, the user terminal has two 3D projection angles: a left-eye 3D projection angle and a right-eye 3D projection angle. If the user terminal performs 3D display at a fixed left-eye 3D projection angle of -5° and a fixed right-eye 3D projection angle of 5°, then as shown in FIG. 3, the viewing effect is best when the included angle between the midpoint of the user's left pupil and the central perpendicular (that is, the user's left-eye viewing angle) equals -5° and the included angle between the midpoint of the user's right pupil and the central perpendicular (that is, the user's right-eye viewing angle) equals 5°. When the user's left-eye viewing angle is not -5° and the user's right-eye viewing angle is not 5°, the user may not see the displayed picture clearly, and the stereoscopic effect of the viewed picture is also poor.
In summary, in existing practical applications, the user terminal performs 3D display at a fixed 3D projection angle, the user experience of 3D display is poor, and the display needs further improvement.
To solve the foregoing problem of a poor 3D display effect, the present invention provides a 3D display method and a user terminal. The user terminal may be a mobile phone, a tablet computer, a personal computer (PC), a PDA (Personal Digital Assistant), a television, an in-vehicle computer, a wearable device (such as a smart watch), or another terminal. The user terminal may have a 2D display, a 3D display, a holographic display, or another display usable for 3D display, which is not limited in the embodiments of the present invention.
Referring to FIG. 4, FIG. 4 is a schematic flowchart of a 3D display method according to an embodiment of the present invention. As shown in FIG. 4, the 3D display method may include parts 401 to 403.
In part 401, the user terminal detects the user's viewing angle with respect to the display screen.
In an optional implementation, the viewing angle in part 401 may be the user's central-axis viewing angle, that is, the included angle between the midpoint of the user's two eyes and the central perpendicular. For example, the central-axis viewing angle may be as shown in FIG. 2.
In an optional implementation, the viewing angle in part 401 may include a right-eye viewing angle and a left-eye viewing angle, where the left-eye viewing angle is the included angle between the midpoint of the left pupil and the central perpendicular, and the right-eye viewing angle is the included angle between the midpoint of the right pupil and the central perpendicular. For example, the left-eye and right-eye viewing angles may be as shown in FIG. 3.
In an optional implementation, when the viewing angle in part 401 is the user's central-axis viewing angle, a specific implementation of part 401 may include parts 11) and 12); of course, the user terminal may also detect the user's central-axis viewing angle in other ways, which is not limited in the embodiments of the present invention. Parts 11) and 12) are:
11) The user terminal detects the tilt angle of the user terminal relative to the gravity vertical, the rotation angle of the user terminal about the axis of symmetry, and the angle between the midpoint of the two eyes and the camera.
12) The user terminal calculates the user's central-axis viewing angle according to the tilt angle, the rotation angle, and the angle between the midpoint of the two eyes and the camera.
By implementing this implementation, the user terminal can accurately determine the user's central-axis viewing angle.
In this implementation, the user's central-axis viewing angle (that is, the included angle between the midpoint of the two eyes and the central perpendicular), the tilt angle of the user terminal relative to the gravity vertical, and the angle between the midpoint of the two eyes and the camera may be as shown in FIG. 5.
As shown in FIG. 5, the tilt angle of the user terminal relative to the gravity vertical is the angle between the axis of symmetry of the user terminal and the gravity vertical, where the gravity vertical is a line aligned with the direction of gravity.
As shown in FIG. 5, the angle between the midpoint of the two eyes and the camera is the angle between the midpoint of the two eyes and a line that passes perpendicularly through the camera and is parallel to the central perpendicular.
In this implementation, the rotation angle of the user terminal about the axis of symmetry may be as shown in FIG. 6, which is a schematic top view of the user terminal. As shown in FIG. 6, if the angle of the user terminal at a first position is 0° and the user terminal rotates about the axis of symmetry to a second position, the rotation angle of the user terminal about the axis of symmetry is the included angle between the first position and the second position.
In an optional implementation, the tilt angle of the user terminal relative to the gravity vertical may be detected by a gyroscope or a gravity sensor. Of course, the tilt angle may also be detected by other instruments, which is not limited in the embodiments of the present invention.
In an optional implementation, the rotation angle of the user terminal about the axis of symmetry may be detected by a gyroscope. Of course, the rotation angle may also be detected by other instruments, which is not limited in the embodiments of the present invention.
In an optional implementation, a specific implementation in which the user terminal detects the angle between the midpoint of the two eyes and the camera may be: the user terminal takes a picture of the user through the camera and analyzes the picture to obtain the distance between the user's eyes; the user terminal then obtains the included angle between the midpoint of the two eyes and the camera according to the distance between the midpoint of the two eyes and the camera and the distance between the two eyes. The distance between the midpoint of the two eyes and the camera may be detected by a distance sensor (such as an infrared or ultrasonic distance sensor).
How the user terminal obtains the included angle between the midpoint of the two eyes and the camera from the distance between the two eyes and the distance between the midpoint of the two eyes and the camera is a technique well known in the industry and is not described here.
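For illustration only, the following minimal C sketch shows one plausible way to carry out this well-known computation; the function name, the use of the average interpupillary distance as a scale reference, and the pixel-based offset are assumptions of this sketch and are not specified by the patent.

    #include <math.h>

    /* Sketch only: estimate lambda, the angle between the eye midpoint and the
       camera axis. The real interpupillary distance (about 63 mm on average)
       is assumed known and is used to convert the pixel offset of the eye
       midpoint from the image center into meters. */
    double eye_midpoint_camera_angle_deg(double ipd_pixels,             /* eye distance measured in the picture, pixels */
                                         double midpoint_offset_pixels, /* eye-midpoint offset from image center, pixels */
                                         double midpoint_distance_m)    /* eye midpoint to camera, from the distance sensor */
    {
        const double ipd_m = 0.063;  /* assumed average interpupillary distance */
        double meters_per_pixel = ipd_m / ipd_pixels;
        double lateral_offset_m = midpoint_offset_pixels * meters_per_pixel;
        /* angle between the camera axis and the direction to the eye midpoint */
        return atan2(lateral_offset_m, midpoint_distance_m) * 180.0 / M_PI;
    }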
In an optional implementation, the user terminal may include a main camera and at least one auxiliary camera. If the included angle between the midpoint of the two eyes and the camera in parts 11) and 12) is the included angle between the midpoint of the two eyes and the main camera, a specific implementation in which the user terminal detects this angle may be: the user terminal controls the auxiliary camera and the main camera to take pictures simultaneously; the user terminal obtains, from the picture taken by the auxiliary camera and the picture taken by the main camera, the distance between the midpoint of the two eyes and the camera and the distance between the two eyes; and the user terminal obtains the included angle between the midpoint of the two eyes and the main camera according to the distance between the midpoint of the two eyes and the main camera and the distance between the two eyes. How the user terminal obtains these distances from the two pictures is a technique well known in the industry and is not described here.
Of course, the included angle between the midpoint of the user's eyes and the camera may also be detected by other instruments, which is not limited in the embodiments of the present invention.
In an optional implementation, the user terminal performs parts 11) and 12) only while performing 3D display. When performing part 11), the user terminal may detect, in real time, the tilt angle of the user terminal relative to the gravity vertical, the rotation angle of the user terminal about the axis of symmetry, and the angle between the midpoint of the two eyes and the camera. In this way, after the user's central-axis viewing angle changes, the user terminal can promptly detect the new central-axis viewing angle and adjust the 3D projection angle accordingly.
In an optional implementation, when performing part 11), the user terminal may detect in real time the tilt angle relative to the gravity vertical and the rotation angle about the axis of symmetry, and may also detect whether the change of the tilt angle or the rotation angle exceeds a preset angle. Only when the user terminal detects that the change of the tilt angle or the rotation angle exceeds the preset angle does it detect the included angle between the midpoint of the user's eyes and the camera. In this way, during 3D display, the camera does not need to stay on taking pictures of the user to calculate that angle, which helps save CPU resources.
Optionally, a specific implementation in which the user terminal detects whether the change of the tilt angle or the rotation angle exceeds the preset angle may be: the user terminal determines whether the absolute value of the difference between the most recently detected tilt angle and a first tilt angle exceeds the preset angle, where the first tilt angle is the tilt angle detected the last time the included angle between the midpoint of the user's eyes and the camera was detected; or the user terminal determines whether the absolute value of the difference between the most recently detected rotation angle and a first rotation angle exceeds the preset angle, where the first rotation angle is the rotation angle detected the last time the included angle between the midpoint of the user's eyes and the camera was detected.
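For illustration only, the following minimal C sketch expresses this check; the function and parameter names are assumed for the sketch and do not come from the patent.

    #include <math.h>
    #include <stdbool.h>

    /* Sketch only: decide whether the eye/camera angle must be re-detected.
       first_tilt_deg and first_rotation_deg are the tilt and rotation angles
       recorded the last time the angle between the midpoint of the eyes and
       the camera was detected; the camera is re-enabled only when either
       angle has drifted by more than the preset angle, saving CPU resources. */
    bool need_redetect_eye_angle(double tilt_deg, double rotation_deg,
                                 double first_tilt_deg, double first_rotation_deg,
                                 double preset_angle_deg)
    {
        return fabs(tilt_deg - first_tilt_deg) > preset_angle_deg ||
               fabs(rotation_deg - first_rotation_deg) > preset_angle_deg;
    }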
In an optional implementation, when performing part 11), the user terminal may also detect, at a preset time period, the tilt angle relative to the gravity vertical, the rotation angle about the axis of symmetry, and the angle between the midpoint of the two eyes and the camera. In this way, the tilt angle, the rotation angle, and the eye-camera angle do not need to be detected at every moment, which helps save CPU resources.
In an optional implementation, when performing part 12), the user terminal may specifically calculate the central-axis viewing angle through the following Formula 1.
Formula 1:
[Formula 1 is given as an image in the source: Figure PCTCN2016101374-appb-000003]
Here, θ is the central-axis viewing angle, α is the tilt angle of the user terminal relative to the gravity vertical, β is the rotation angle of the user terminal about the axis of symmetry, x is a preset angle-correction parameter, and λ is the angle between the midpoint of the two eyes and one camera.
In an optional implementation, when performing part 12), the user terminal may specifically calculate the central-axis viewing angle through the following Formula 2.
Formula 2:
[Formula 2 is given as an image in the source: Figure PCTCN2016101374-appb-000004]
Here, θ is the central-axis viewing angle, α is the tilt angle of the user terminal relative to the gravity vertical, β is the rotation angle of the user terminal about the axis of symmetry, x is a preset angle-correction parameter, λ is the angle between the midpoint of the two eyes and one camera, and ε is a preset angle value, which may be an empirical value such as 45° or 40°.
Through Formula 1 and Formula 2, the user terminal can accurately determine the central-axis viewing angle.
Of course, the user terminal may also calculate the central-axis viewing angle without using Formula 1 or Formula 2, which is not limited in the embodiments of the present invention.
In part 402, the user terminal determines the 3D projection angle according to the viewing angle detected in part 401.
In an optional implementation, when the viewing angle detected in part 401 is the central-axis viewing angle, a specific implementation of part 402 may be: the user terminal determines the detected central-axis viewing angle as the 3D projection angle.
For example, if the user terminal performs 3D display through a 2D display or a holographic display, the user terminal may directly determine the central-axis viewing angle as the 3D projection angle after detecting it.
Of course, when the user terminal performs 3D display through a display other than a 2D display or a holographic display, the user terminal may also directly determine the central-axis viewing angle as the 3D projection angle after detecting it, which is not limited in the embodiments of the present invention.
In an optional implementation, the 3D projection angle may include a left-eye 3D projection angle and a right-eye 3D projection angle. When the viewing angle detected in part 401 is the central-axis viewing angle, a specific implementation of part 402 may be: the user terminal determines the left-eye 3D projection angle according to the detected central-axis viewing angle and a preset left-eye adjustment angle, and determines the right-eye 3D projection angle according to the detected central-axis viewing angle and a preset right-eye adjustment angle.
The preset left-eye adjustment angle and the preset right-eye adjustment angle may be empirical values. For example, the preset left-eye adjustment angle may be 3°, and the preset right-eye adjustment angle may be -3°.
In this implementation, it can be understood that the user terminal obtains the user's left-eye viewing angle from the user's central-axis viewing angle and the preset left-eye adjustment angle, and determines the calculated left-eye viewing angle as the left-eye 3D projection angle. Similarly, the user terminal obtains the user's right-eye viewing angle from the user's central-axis viewing angle and the preset right-eye adjustment angle, and determines the calculated right-eye viewing angle as the right-eye 3D projection angle. Of course, the user terminal may also obtain the user's left-eye and right-eye viewing angles from the user's central-axis viewing angle in other ways, which is not limited in the embodiments of the present invention.
In an optional implementation, the user terminal may store multiple preset left-eye adjustment angles and multiple preset right-eye adjustment angles. The user terminal may prestore a correspondence between preset central-axis viewing angles and preset left-eye adjustment angles, and a correspondence between preset central-axis viewing angles and preset right-eye adjustment angles. After detecting the user's central-axis viewing angle, the user terminal obtains the prestored preset left-eye and right-eye adjustment angles corresponding to that central-axis viewing angle, and determines the left-eye and right-eye 3D projection angles according to the central-axis viewing angle and the corresponding preset adjustment angles. For example, the user terminal may prestore that a preset central-axis viewing angle of 0° corresponds to a preset left-eye adjustment angle of 3°, and a preset central-axis viewing angle of 10° corresponds to a preset left-eye adjustment angle of 4°; and that a preset central-axis viewing angle of 0° corresponds to a preset right-eye adjustment angle of -3°, and a preset central-axis viewing angle of 10° corresponds to a preset right-eye adjustment angle of -4°. After detecting that the user's central-axis viewing angle is 0°, the user terminal obtains the prestored preset left-eye adjustment angle 3° and preset right-eye adjustment angle -3° corresponding to the central-axis viewing angle 0°, and determines the left-eye and right-eye 3D projection angles according to the central-axis viewing angle 0°, the preset left-eye adjustment angle 3°, and the preset right-eye adjustment angle -3°.
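For illustration only, the following minimal C sketch stores the correspondence from the example above and looks up the entry nearest to the detected central-axis viewing angle; the table structure and the nearest-entry policy are assumptions of this sketch, not requirements of the patent.

    #include <math.h>
    #include <stddef.h>

    /* Sketch only: prestored correspondence between preset central-axis
       viewing angles and preset left/right-eye adjustment angles. The two
       entries reproduce the example values given in the text. */
    typedef struct {
        double central_axis_deg;
        double left_adjust_deg;
        double right_adjust_deg;
    } AdjustEntry;

    static const AdjustEntry kAdjustTable[] = {
        {  0.0, 3.0, -3.0 },
        { 10.0, 4.0, -4.0 },
    };

    /* Return the prestored entry whose preset central-axis viewing angle is
       closest to the detected one. */
    const AdjustEntry *lookup_adjust(double central_axis_deg)
    {
        const AdjustEntry *best = &kAdjustTable[0];
        size_t n = sizeof(kAdjustTable) / sizeof(kAdjustTable[0]);
        for (size_t i = 1; i < n; i++) {
            if (fabs(kAdjustTable[i].central_axis_deg - central_axis_deg) <
                fabs(best->central_axis_deg - central_axis_deg)) {
                best = &kAdjustTable[i];
            }
        }
        return best;
    }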
In an optional implementation, a specific implementation in which the user terminal determines the left-eye 3D projection angle according to the central-axis viewing angle and the preset left-eye adjustment angle may be: the user terminal determines the difference between the central-axis viewing angle and the preset left-eye adjustment angle (that is, the user's left-eye viewing angle) as the left-eye 3D projection angle. A specific implementation in which the user terminal determines the right-eye 3D projection angle according to the central-axis viewing angle and the preset right-eye adjustment angle may be: the user terminal determines the difference between the central-axis viewing angle and the preset right-eye adjustment angle (that is, the user's right-eye viewing angle) as the right-eye 3D projection angle. For example, if the user's central-axis viewing angle is 0°, the preset left-eye adjustment angle is 3°, and the preset right-eye adjustment angle is -3°, the user terminal subtracts 3° from 0° to obtain a left-eye 3D projection angle of -3°, and subtracts -3° from 0° to obtain a right-eye 3D projection angle of 3°.
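For illustration only, the subtraction described above can be written as the following minimal C sketch (names assumed); with a central-axis viewing angle of 0°, a preset left-eye adjustment angle of 3°, and a preset right-eye adjustment angle of -3°, it yields -3° and 3°, matching the example.

    /* Sketch only: derive the left-eye and right-eye 3D projection angles by
       subtracting the preset per-eye adjustment angles from the detected
       central-axis viewing angle. */
    typedef struct {
        double left_deg;   /* left-eye 3D projection angle  */
        double right_deg;  /* right-eye 3D projection angle */
    } EyeProjectionAngles;

    EyeProjectionAngles eye_projection_angles(double central_axis_deg,
                                              double left_adjust_deg,
                                              double right_adjust_deg)
    {
        EyeProjectionAngles a;
        a.left_deg  = central_axis_deg - left_adjust_deg;
        a.right_deg = central_axis_deg - right_adjust_deg;
        return a;
    }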
In an optional implementation, when the viewing angles detected in part 401 are the left-eye viewing angle and the right-eye viewing angle, a specific implementation of part 402 may be: determining the detected left-eye viewing angle as the left-eye 3D projection angle, and determining the detected right-eye viewing angle as the right-eye 3D projection angle.
In part 403, the user terminal performs 3D display of the content to be displayed according to the 3D projection angle.
Specifically, after determining the 3D projection angle, the user terminal performs a drawing operation on the content to be displayed according to the 3D projection angle, and displays the drawing result through the corresponding display.
In an optional implementation, if the user terminal determines the central-axis viewing angle as the 3D projection angle, a specific implementation of part 403 may be: the user terminal draws the content to be displayed according to the 3D projection angle and displays the drawing result through a 2D display.
Specifically, after the user terminal determines the central-axis viewing angle as the 3D projection angle, it may send the 3D projection angle to the graphics processing unit (GPU) of the user terminal. After obtaining the 3D projection angle, the GPU draws the content to be displayed according to the 3D projection angle and displays the drawing result through the 2D display.
Optionally, a specific implementation in which the GPU draws the content to be displayed according to the 3D projection angle and displays the drawing result through the 2D display may be: the GPU rasterizes the content to be displayed into a FrameBuffer according to the 3D projection angle, and the drawing result in the FrameBuffer is displayed through the 2D display. Optionally, the GPU may also draw the content to be displayed according to the 3D projection angle in other ways, which is not limited in the embodiments of the present invention.
In an optional implementation, if the user terminal determines the central-axis viewing angle as the 3D projection angle, a specific implementation of part 403 may also be: the user terminal draws the content to be displayed according to the 3D projection angle and displays the drawing result through a holographic display.
Specifically, after the user terminal determines the central-axis viewing angle as the 3D projection angle, it may send the 3D projection angle to the GPU of the user terminal. After obtaining the 3D projection angle, the GPU draws the content to be displayed according to the 3D projection angle, and after the drawing is completed, the holographic display displays the drawing result.
In an optional implementation, the GPU's drawing of the content to be displayed according to the 3D projection angle may be implemented through the following function: glRotatef(GLfloat angle, GLfloat x, GLfloat y, GLfloat z), where angle is the 3D projection angle. The function rotates the current coordinate system by angle about the rotation axis given by the vector (x, y, z).
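For illustration only, the following minimal fixed-function OpenGL sketch shows glRotatef used as described; the surrounding matrix calls, the choice of the y axis as the rotation axis, and draw_content() are assumptions of this sketch rather than details fixed by the patent.

    #include <GL/gl.h>

    void draw_content(void); /* assumed helper: issues the draw calls for the content */

    /* Sketch only: rotate the current coordinate system by the 3D projection
       angle before drawing, so the rendered view follows the viewing angle. */
    void render_with_projection_angle(GLfloat projection_angle_deg)
    {
        glMatrixMode(GL_MODELVIEW);
        glPushMatrix();
        glRotatef(projection_angle_deg, 0.0f, 1.0f, 0.0f);
        draw_content();
        glPopMatrix();
    }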
In an optional implementation, if the 3D projection angle includes a left-eye 3D projection angle and a right-eye 3D projection angle, a specific implementation of part 403 may be: the user terminal draws the content to be displayed according to the left-eye 3D projection angle and the right-eye 3D projection angle, and displays the drawing result through a 3D display. The 3D display may be a naked-eye 3D display or a 3D display that requires glasses, a helmet, or another auxiliary tool for viewing.
Specifically, after determining the left-eye and right-eye 3D projection angles, the user terminal may send them to the GPU of the user terminal. The GPU draws the content to be displayed according to the left-eye and right-eye 3D projection angles and displays the drawing result through the 3D display.
Optionally, a specific implementation in which the GPU draws the content to be displayed according to the left-eye and right-eye 3D projection angles may be: the GPU rasterizes the content to be displayed into one FrameBuffer according to the left-eye 3D projection angle, and rasterizes the content to be displayed into another FrameBuffer according to the right-eye 3D projection angle. Correspondingly, the 3D display displays the drawing results in these two FrameBuffers. Optionally, the GPU may also draw the content according to the left-eye and right-eye 3D projection angles in other ways, which is not limited in the embodiments of the present invention.
In an optional implementation, the GPU's drawing of the content to be displayed according to the left-eye 3D projection angle may be implemented through the following function: glRotatef(GLfloat angle, GLfloat x, GLfloat y, GLfloat z), where angle is the left-eye 3D projection angle. The function rotates the current coordinate system by angle about the rotation axis given by the vector (x, y, z).
Similarly, the GPU's drawing of the content to be displayed according to the right-eye 3D projection angle may be implemented through the same function, with angle being the right-eye 3D projection angle.
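For illustration only, the two per-eye passes can be sketched as follows in C; bind_framebuffer() and draw_content() are assumed helpers, and the y rotation axis is an assumption of this sketch.

    #include <GL/gl.h>

    void bind_framebuffer(unsigned int fbo); /* assumed helper: makes fbo the render target */
    void draw_content(void);                 /* assumed helper: issues the draw calls */

    /* Sketch only: rasterize the content once per eye, each pass rotated by
       that eye's 3D projection angle, into two separate FrameBuffers that
       the 3D display then presents. */
    void render_stereo(unsigned int left_fbo, unsigned int right_fbo,
                       GLfloat left_angle_deg, GLfloat right_angle_deg)
    {
        bind_framebuffer(left_fbo);            /* left-eye pass */
        glMatrixMode(GL_MODELVIEW);
        glPushMatrix();
        glRotatef(left_angle_deg, 0.0f, 1.0f, 0.0f);
        draw_content();
        glPopMatrix();

        bind_framebuffer(right_fbo);           /* right-eye pass */
        glPushMatrix();
        glRotatef(right_angle_deg, 0.0f, 1.0f, 0.0f);
        draw_content();
        glPopMatrix();
    }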
The following further describes the embodiments of the present invention through specific Application Scenario 1 to Application Scenario 4.
Application Scenario 1: the user terminal performs 3D display of a cuboid through a 2D display. When the user terminal detects that the user's central-axis viewing angle is 0°, it determines 0° as the 3D projection angle, draws the cuboid according to the 3D projection angle 0°, and displays the drawn cuboid through the 2D display; the display result may be as shown in FIG. 1. When the user terminal detects that the user's central-axis viewing angle is 30°, it determines 30° as the 3D projection angle, draws the cuboid according to the 3D projection angle 30°, and displays the drawn cuboid through the 2D display; the display result may be as shown in FIG. 7.
Application Scenario 2: the user terminal performs 3D display of a cuboid through a holographic display. When the user terminal detects that the user's central-axis viewing angle is 0°, it determines 0° as the 3D projection angle, draws the cuboid according to the 3D projection angle 0°, and displays the drawn cuboid through the holographic display. When the user terminal detects that the user's central-axis viewing angle is 30°, it determines 30° as the 3D projection angle, draws the cuboid according to the 3D projection angle 30°, and displays the drawn cuboid through the holographic display.
Application Scenario 3: the user terminal performs 3D display through a 3D display. The user terminal may detect the user's left-eye and right-eye viewing angles, determine the left-eye viewing angle as the left-eye 3D projection angle, and determine the right-eye viewing angle as the right-eye 3D projection angle. For example, if the user terminal detects that the user's left-eye viewing angle is -5° and the user's right-eye viewing angle is 5°, it determines -5° as the left-eye 3D projection angle and 5° as the right-eye 3D projection angle, draws the cuboid according to the left-eye 3D projection angle -5° and the right-eye 3D projection angle 5°, and displays the drawn cuboid through the 3D display. If the user terminal detects that the user's left-eye viewing angle is 10° and the user's right-eye viewing angle is 20°, it determines 10° as the left-eye 3D projection angle and 20° as the right-eye 3D projection angle, draws the cuboid accordingly, and displays the drawn cuboid through the 3D display.
Application Scenario 4: the user terminal performs 3D display through a 3D display. The user terminal derives the user's left-eye and right-eye viewing angles from the central-axis viewing angle. The preset left-eye adjustment angle prestored by the user terminal is 5°, and the preset right-eye adjustment angle is -5°. When the user terminal detects that the user's central-axis viewing angle is 0°, it subtracts 5° from 0° to obtain a left-eye viewing angle of -5° and determines it as the left-eye 3D projection angle; it subtracts -5° from 0° to obtain a right-eye viewing angle of 5° and determines it as the right-eye 3D projection angle. The user terminal draws the cuboid according to the left-eye 3D projection angle -5° and the right-eye 3D projection angle 5°, and displays the drawn cuboid through the 3D display. Similarly, when the user terminal detects that the user's central-axis viewing angle is 15°, it subtracts 5° from 15° to obtain a left-eye 3D projection angle of 10°, and subtracts -5° from 15° to obtain a right-eye 3D projection angle of 20°. The user terminal draws the cuboid according to the left-eye 3D projection angle 10° and the right-eye 3D projection angle 20°, and displays the drawn cuboid through the 3D display.
It can be seen that, by implementing the 3D display method provided in FIG. 4, the user terminal performs 3D display at different 3D projection angles as the user's viewing angle changes, which makes the 3D display effect more realistic, lets the user see the 3D picture more clearly, and improves the 3D display effect.
In the embodiments of the present invention, the user terminal may be divided into functional units according to the foregoing method examples; for example, each functional unit may correspond to one function, or two or more functions may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit. It should be noted that the division of units in the embodiments of the present invention is illustrative and is merely a logical function division; there may be other division manners in actual implementation.
Referring to FIG. 8, FIG. 8 is a schematic structural diagram of a user terminal according to an embodiment of the present invention. As shown in FIG. 8, the user terminal includes a detection module 801, a determination module 802, and a display module 803, where:
the detection module 801 is configured to detect the user's viewing angle with respect to the display screen;
the determination module 802 is configured to determine a 3D projection angle according to the viewing angle; and
the display module 803 is configured to perform 3D display of the content to be displayed according to the 3D projection angle.
In an optional implementation, the viewing angle detected by the detection module 801 is the central-axis viewing angle, which is the included angle between the midpoint of the two eyes and the central perpendicular, where the central perpendicular is the line perpendicular to the display screen at its center.
In an optional implementation, when the viewing angle detected by the detection module 801 is the central-axis viewing angle, the determination module 802 is specifically configured to determine the central-axis viewing angle as the 3D projection angle.
In an optional implementation, if the determination module 802 is specifically configured to determine the central-axis viewing angle as the 3D projection angle, the display module 803 is specifically configured to: draw the content to be displayed according to the 3D projection angle, and display the drawing result through a 2D display or a holographic display.
In an optional implementation, the 3D projection angle includes a left-eye 3D projection angle and a right-eye 3D projection angle. When the viewing angle detected by the detection module 801 is the central-axis viewing angle, the determination module 802 is specifically configured to: determine the left-eye 3D projection angle according to the central-axis viewing angle detected by the detection module 801 and a preset left-eye adjustment angle; and determine the right-eye 3D projection angle according to the central-axis viewing angle detected by the detection module 801 and a preset right-eye adjustment angle.
In an optional implementation, the viewing angles detected by the detection module 801 include a left-eye viewing angle and a right-eye viewing angle, where the left-eye viewing angle is the included angle between the midpoint of the left pupil and the central perpendicular, and the right-eye viewing angle is the included angle between the midpoint of the right pupil and the central perpendicular.
In an optional implementation, the 3D projection angle includes a left-eye 3D projection angle and a right-eye 3D projection angle. When the viewing angles detected by the detection module 801 include the left-eye and right-eye viewing angles, the determination module is specifically configured to: determine the left-eye viewing angle detected by the detection module 801 as the left-eye 3D projection angle, and determine the right-eye viewing angle detected by the detection module 801 as the right-eye 3D projection angle.
In an optional implementation, when the 3D projection angle includes a left-eye 3D projection angle and a right-eye 3D projection angle, the display module is specifically configured to: draw the content to be displayed according to the left-eye and right-eye 3D projection angles, and display the drawing result through a 3D display.
The detection module 801 is configured to perform the method of step 401 in FIG. 4 of the method embodiments of the present invention; for its implementation, reference may be made to the description corresponding to step 401 in FIG. 4, which is not repeated here. The determination module 802 is configured to perform the method of step 402 in FIG. 4; for its implementation, reference may be made to the description corresponding to step 402 in FIG. 4, which is not repeated here. The display module 803 is configured to perform the method of step 403 in FIG. 4; for its implementation, reference may be made to the description corresponding to step 403 in FIG. 4, which is not repeated here.
Referring also to FIG. 9, FIG. 9 is a schematic structural diagram of another user terminal according to an embodiment of the present invention. The user terminal shown in FIG. 9 is an optimization of the user terminal shown in FIG. 8 and includes all the modules shown in FIG. 8. The detection module 801 of the user terminal in FIG. 9 includes a detection unit 8011 and a calculation unit 8012, where:
the detection unit 8011 is configured to detect the tilt angle of the user terminal relative to the gravity vertical, the rotation angle of the user terminal about the axis of symmetry, and the angle between the midpoint of the two eyes and the camera; and
the calculation unit 8012 is configured to calculate the central-axis viewing angle according to the tilt angle, the rotation angle, and the angle between the midpoint of the two eyes and the camera.
In an optional implementation, the detection unit 8011 is specifically configured to: detect the tilt angle of the user terminal relative to the gravity vertical and the rotation angle of the user terminal about the axis of symmetry; and, when detecting that the change of the tilt angle or the rotation angle is greater than a preset angle, detect the angle between the midpoint of the two eyes and the camera.
For specific implementations of the detection unit 8011 and the calculation unit 8012, reference may be made to the corresponding descriptions in the foregoing method embodiments; for brevity, details are not repeated here.
Based on the same inventive concept, the principle by which the user terminal provided in the embodiments of the present invention solves the problem is similar to that of the 3D display method in the method embodiments of the present invention; therefore, for the implementation of the user terminal, reference may be made to the implementation of the method, and for brevity, details are not repeated here.
An embodiment of the present invention further provides a user terminal. Taking a mobile phone as an example of the user terminal, FIG. 10 is a block diagram of part of the structure of a mobile phone 1000 related to an embodiment of the present invention. Referring to FIG. 10, the mobile phone 1000 includes components such as an RF (Radio Frequency) circuit 1001, a memory 1002, other input devices 1003, a display screen 1004, a sensor 1005, an audio circuit 1006, an I/O subsystem 1007, a processor 1008, and a power supply 1009. A person skilled in the art can understand that the mobile phone structure shown in FIG. 10 does not constitute a limitation on the mobile phone; it may include more or fewer components than shown, combine some components, split some components, or have a different component arrangement.
The following specifically introduces each component of the mobile phone 1000 with reference to FIG. 10:
The RF circuit 1001 may be used to receive and send signals in the course of sending and receiving information or during a call; in particular, after receiving downlink information from a base station, it passes the information to the processor 1008 for processing, and it sends designed uplink data to the base station. Generally, the RF circuit includes but is not limited to an antenna, at least one amplifier, a transceiver, a coupler, an LNA (Low Noise Amplifier), and a duplexer. In addition, the RF circuit 1001 may also communicate with networks and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile Communications), GPRS (General Packet Radio Service), CDMA (Code Division Multiple Access), WCDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), email, and SMS (Short Messaging Service).
The memory 1002 may be used to store computer-executable program code, where the program code includes instructions; the processor 1008 performs the various functional applications and data processing of the mobile phone 1000 by running the software programs and modules stored in the memory 1002. The program storage area may store the operating system and the applications required by at least one function (such as a sound playing function and an image playing function); the data storage area may store data created according to the use of the mobile phone 1000 (such as audio data and a phone book). In addition, the memory 1002 may include ROM and RAM, and may also include a high-speed random access memory and a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another volatile solid-state storage device.
The other input devices 1003 may be used to receive input digit or character information and to generate key signal input related to user settings and function control of the mobile phone 1000. Specifically, the other input devices 1003 may include but are not limited to one or more of a physical keyboard, function keys (such as volume control keys and a power key), a trackball, a mouse, a joystick, and an optical mouse (an optical mouse is a touch-sensitive surface that does not display visual output, or an extension of the touch-sensitive surface formed by a touchscreen). The other input devices 1003 are connected to the other-input-device controller 171 of the I/O subsystem 1007 and exchange signals with the processor 1008 under the control of the other-input-device controller 171.
The display screen 1004 may be used to display information entered by the user or information provided to the user, as well as the various menus of the mobile phone 1000, and may also accept user input. For example, the display screen 1004 may display the information to be displayed in the foregoing method embodiments, such as an unread exclusive message, a selection list of message options, a selection list of options for multiple time periods, an upward jump arrow, or a downward jump arrow. Specifically, the display screen 1004 may include a display panel 141 and a touch panel 142. The display panel 141 may be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode) panel, or the like. The touch panel 142, also referred to as a touchscreen or touch-sensitive screen, can collect contact or contactless operations by the user on or near it (such as operations performed by the user on or near the touch panel 142 with a finger, a stylus, or any other suitable object or accessory; somatosensory operations may also be included; the operations include single-point control operations, multi-point control operations, and other operation types) and drive the corresponding connection apparatus according to a preset program. Optionally, the touch panel 142 may include two parts: a touch detection apparatus and a touch controller. The touch detection apparatus detects the position and gesture of the user's touch, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection apparatus, converts it into information the processor can handle, sends it to the processor 1008, and can also receive and execute commands sent by the processor 1008. In addition, the touch panel 142 may be implemented as a resistive, capacitive, infrared, or surface-acoustic-wave panel, or by any technology developed in the future. Further, the touch panel 142 may cover the display panel 141. The user may, according to the content displayed on the display panel 141 (the displayed content includes but is not limited to a soft keyboard, a virtual mouse, virtual buttons, and icons), operate on or near the touch panel 142 covering the display panel 141. After detecting a touch operation on or near it, the touch panel 142 transmits the operation to the processor 1008 through the I/O subsystem 1007 to determine the type of the touch event and thereby the user input; the processor 1008 then provides the corresponding visual output on the display panel 141 through the I/O subsystem 1007 according to the user input. Although in FIG. 10 the touch panel 142 and the display panel 141 are shown as two independent components implementing the input and output functions of the mobile phone 1000, in some embodiments the touch panel 142 and the display panel 141 may be integrated to implement the input and output functions of the mobile phone 1000.
The mobile phone 1000 may further include at least one sensor 1005, such as a fingerprint sensor, a light sensor, a motion sensor, a gravity sensor, a gyroscope, or another sensor. Specifically, the light sensor may include an ambient light sensor and a proximity sensor: the ambient light sensor can adjust the brightness of the display panel 141 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 141 and/or the backlight when the mobile phone 1000 is moved to the ear. As one type of motion sensor, an accelerometer can detect the magnitude of acceleration in all directions (generally on three axes) and, when stationary, can detect the magnitude and direction of gravity; it can be used in applications that recognize the phone's attitude (such as landscape/portrait switching, related games, and magnetometer attitude calibration) and in vibration-recognition functions (such as a pedometer or tap detection). As for the gyroscope, barometer, hygrometer, thermometer, infrared sensor, and other sensors with which the mobile phone 1000 may also be equipped, details are not described here again.
The audio circuit 1006, the speaker 161, and the microphone 162 can provide an audio interface between the user and the mobile phone 1000. The audio circuit 1006 can transmit a signal converted from received audio data to the speaker 161, which converts it into a sound signal for output; conversely, the microphone 162 converts a collected sound signal into an electrical signal, which the audio circuit 1006 receives and converts into audio data. The audio data is then output to the RF circuit 1001 for transmission to, for example, another mobile phone, or output to the memory 1002 for further processing.
The I/O subsystem 1007 controls external input and output devices and may include an other-input-device controller 171, a sensor controller 172, and a display controller 173. Optionally, one or more other-input-device controllers 171 receive signals from and/or send signals to the other input devices 1003, which may include physical buttons (press buttons, rocker buttons, and the like), a dial pad, a slide switch, a joystick, a click wheel, or an optical mouse (an optical mouse is a touch-sensitive surface that does not display visual output, or an extension of the touch-sensitive surface formed by a touchscreen). It is worth noting that the other-input-device controller 171 may be connected to any one or more of the foregoing devices. The display controller 173 in the I/O subsystem 1007 receives signals from and/or sends signals to the display screen 1004. After the display screen 1004 detects user input, the display controller 173 converts the detected user input into interaction with the user-interface objects displayed on the display screen 1004, thereby implementing human-computer interaction. The sensor controller 172 can receive signals from and/or send signals to one or more sensors 1005.
The processor 1008 is the control center of the mobile phone 1000. It connects the parts of the entire phone through various interfaces and lines and, by running or executing the software programs and/or modules stored in the memory 1002 and invoking the data stored in the memory 1002, performs the various functions of the mobile phone 1000 and processes data, thereby monitoring the phone as a whole. Optionally, the processor 1008 may include one or more processing units. Preferably, the processor 1008 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, applications, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 1008. When the processor 1008 executes the instructions stored in the memory 1002, the instructions cause the mobile phone 1000 to perform the 3D display method of the embodiments of the present invention; reference may be made to the descriptions corresponding to parts 401 to 403 in FIG. 4 or to the other execution processes of the user terminal in the foregoing method embodiments, which are not repeated here. Based on the same inventive concept, the principle by which the user terminal solves the problem is similar to that of the method in the method embodiments of the present invention; therefore, for the implementation of the user terminal, reference may be made to the implementation of the foregoing method, and for brevity, details are not repeated here.
The mobile phone 1000 further includes a power supply 1009 (such as a battery) that supplies power to the components. Preferably, the power supply may be logically connected to the processor 1008 through a power management system, so that functions such as charging, discharging, and power-consumption management are implemented through the power management system.
Although not shown, the mobile phone 1000 may further include a camera, a Bluetooth module, and the like, which are not described here again.
In addition, an embodiment of the present invention further provides a non-volatile computer-readable storage medium storing one or more programs. The non-volatile computer-readable storage medium stores at least one program, and each program includes instructions that, when executed by the user terminal provided in the embodiments of the present invention, cause the user terminal to perform parts 401 to 403 in FIG. 4 or the other execution processes of the user terminal in the foregoing method embodiments; reference may be made to the corresponding descriptions of parts 401 to 403 in FIG. 4 or of the other execution processes of the user terminal in the foregoing method embodiments, which are not repeated here.
A person skilled in the art should be aware that, in one or more of the foregoing examples, the functions described in the present invention may be implemented by hardware, software, firmware, or any combination thereof. When implemented by software, these functions may be stored in a computer-readable medium or transmitted as one or more instructions or code on a computer-readable medium. Computer-readable media include computer storage media and communication media, where communication media include any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium accessible to a general-purpose or special-purpose computer.
The foregoing specific implementations further describe in detail the objectives, technical solutions, and beneficial effects of the present invention. It should be understood that the foregoing descriptions are merely specific implementations of the present invention and are not intended to limit the protection scope of the present invention. Any modification, equivalent replacement, or improvement made on the basis of the technical solutions of the present invention shall fall within the protection scope of the present invention.

Claims (24)

  1. A 3D (three-dimensional) display method, applied to a user terminal, wherein the method comprises:
    detecting a user's viewing angle with respect to a display screen;
    determining a 3D projection angle according to the viewing angle; and
    performing 3D display of content to be displayed according to the 3D projection angle.
  2. The method according to claim 1, wherein the viewing angle is a central-axis viewing angle, the central-axis viewing angle is the included angle between the midpoint of the two eyes and a central perpendicular, and the central perpendicular is a line perpendicular to the display screen at its center.
  3. The method according to claim 2, wherein detecting the user's viewing angle with respect to the display screen comprises:
    detecting a tilt angle of the user terminal relative to the gravity vertical, a rotation angle of the user terminal about an axis of symmetry, and an angle between the midpoint of the two eyes and a camera; and
    calculating the central-axis viewing angle according to the tilt angle, the rotation angle, and the angle between the midpoint of the two eyes and the camera.
  4. The method according to claim 3, wherein detecting the tilt angle of the user terminal relative to the gravity vertical, the rotation angle of the user terminal about the axis of symmetry, and the angle between the midpoint of the two eyes and the camera comprises:
    detecting the tilt angle of the user terminal relative to the gravity vertical and the rotation angle of the user terminal about the axis of symmetry; and
    when detecting that a change of the tilt angle or the rotation angle is greater than a preset angle, detecting the angle between the midpoint of the two eyes and the camera.
  5. The method according to any one of claims 2 to 4, wherein the 3D projection angle comprises a left-eye 3D projection angle and a right-eye 3D projection angle, and determining the 3D projection angle according to the viewing angle comprises:
    determining the left-eye 3D projection angle according to the central-axis viewing angle and a preset left-eye adjustment angle; and
    determining the right-eye 3D projection angle according to the central-axis viewing angle and a preset right-eye adjustment angle.
  6. The method according to claim 5, wherein the preset left-eye adjustment angle is the preset left-eye adjustment angle corresponding to the central-axis viewing angle in a prestored correspondence between preset central-axis viewing angles and preset left-eye adjustment angles; and the preset right-eye adjustment angle is the preset right-eye adjustment angle corresponding to the central-axis viewing angle in a prestored correspondence between preset central-axis viewing angles and preset right-eye adjustment angles.
  7. The method according to claim 1, wherein the viewing angle comprises a left-eye viewing angle and a right-eye viewing angle, the left-eye viewing angle is the included angle between the midpoint of the left pupil and a central perpendicular, the right-eye viewing angle is the included angle between the midpoint of the right pupil and the central perpendicular, and the central perpendicular is a line perpendicular to the display screen at its center.
  8. The method according to claim 7, wherein the 3D projection angle comprises a left-eye 3D projection angle and a right-eye 3D projection angle, and determining the 3D projection angle according to the viewing angle comprises:
    determining the left-eye viewing angle as the left-eye 3D projection angle, and determining the right-eye viewing angle as the right-eye 3D projection angle.
  9. The method according to claim 5, 6, or 8, wherein performing 3D display of the content to be displayed according to the 3D projection angle comprises:
    drawing the content to be displayed according to the left-eye 3D projection angle and the right-eye 3D projection angle, and displaying the drawing result through a 3D display.
  10. The method according to any one of claims 2 to 4, wherein determining the 3D projection angle according to the viewing angle comprises:
    determining the central-axis viewing angle as the 3D projection angle.
  11. The method according to claim 10, wherein performing 3D display of the content to be displayed according to the 3D projection angle comprises:
    drawing the content to be displayed according to the 3D projection angle, and displaying the drawing result through a 2D display or a holographic display.
  12. A user terminal, wherein the user terminal comprises:
    a detection module, configured to detect a user's viewing angle with respect to a display screen;
    a determination module, configured to determine a 3D projection angle according to the viewing angle; and
    a display module, configured to perform 3D display of content to be displayed according to the 3D projection angle.
  13. The user terminal according to claim 12, wherein the viewing angle is a central-axis viewing angle, the central-axis viewing angle is the included angle between the midpoint of the two eyes and a central perpendicular, and the central perpendicular is a line perpendicular to the display screen at its center.
  14. The user terminal according to claim 13, wherein the detection module comprises:
    a detection unit, configured to detect a tilt angle of the user terminal relative to the gravity vertical, a rotation angle of the user terminal about an axis of symmetry, and an angle between the midpoint of the two eyes and a camera; and
    a calculation unit, configured to calculate the central-axis viewing angle according to the tilt angle, the rotation angle, and the angle between the midpoint of the two eyes and the camera.
  15. The user terminal according to claim 14, wherein the detection unit is specifically configured to:
    detect the tilt angle of the user terminal relative to the gravity vertical and the rotation angle of the user terminal about the axis of symmetry; and
    when detecting that a change of the tilt angle or the rotation angle is greater than a preset angle, detect the angle between the midpoint of the two eyes and the camera.
  16. The user terminal according to any one of claims 13 to 15, wherein the 3D projection angle comprises a left-eye 3D projection angle and a right-eye 3D projection angle, and the determination module is specifically configured to:
    determine the left-eye 3D projection angle according to the central-axis viewing angle and a preset left-eye adjustment angle; and
    determine the right-eye 3D projection angle according to the central-axis viewing angle and a preset right-eye adjustment angle.
  17. The user terminal according to claim 16, wherein the preset left-eye adjustment angle is the preset left-eye adjustment angle corresponding to the central-axis viewing angle in a prestored correspondence between preset central-axis viewing angles and preset left-eye adjustment angles; and the preset right-eye adjustment angle is the preset right-eye adjustment angle corresponding to the central-axis viewing angle in a prestored correspondence between preset central-axis viewing angles and preset right-eye adjustment angles.
  18. The user terminal according to claim 12, wherein the viewing angle comprises a left-eye viewing angle and a right-eye viewing angle, the left-eye viewing angle is the included angle between the midpoint of the left pupil and a central perpendicular, the right-eye viewing angle is the included angle between the midpoint of the right pupil and the central perpendicular, and the central perpendicular is a line perpendicular to the display screen at its center.
  19. The user terminal according to claim 18, wherein the 3D projection angle comprises a left-eye 3D projection angle and a right-eye 3D projection angle, and the determination module is specifically configured to:
    determine the left-eye viewing angle as the left-eye 3D projection angle, and determine the right-eye viewing angle as the right-eye 3D projection angle.
  20. The user terminal according to claim 16, 17, or 19, wherein the display module is specifically configured to:
    draw the content to be displayed according to the left-eye 3D projection angle and the right-eye 3D projection angle, and display the drawing result through a 3D display.
  21. The user terminal according to any one of claims 13 to 15, wherein the determination module is specifically configured to:
    determine the central-axis viewing angle as the 3D projection angle.
  22. The user terminal according to claim 21, wherein the display module is specifically configured to:
    draw the content to be displayed according to the 3D projection angle, and display the drawing result through a 2D display or a holographic display.
  23. A user terminal, wherein the user terminal comprises a display, one or more processors, a memory, and one or more programs, wherein the one or more programs are stored in the memory and comprise instructions, and the processor invokes the instructions stored in the memory to implement the method according to any one of claims 1 to 11.
  24. A computer-readable storage medium storing one or more programs, wherein the one or more programs comprise instructions that, when executed by a user terminal, cause the user terminal to perform the method according to any one of claims 1 to 11.
PCT/CN2016/101374 2016-09-30 2016-09-30 3D display method and user terminal WO2018058673A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US16/338,216 US10908684B2 (en) 2016-09-30 2016-09-30 3D display method and user terminal
PCT/CN2016/101374 WO2018058673A1 (zh) 2016-09-30 2016-09-30 3D display method and user terminal
EP16917423.2A EP3511764B1 (en) 2016-09-30 2016-09-30 3d display method and user terminal
CN201680077478.1A CN108476316B (zh) 2016-09-30 2016-09-30 3D display method and user terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/101374 WO2018058673A1 (zh) 2016-09-30 2016-09-30 3D display method and user terminal

Publications (1)

Publication Number Publication Date
WO2018058673A1 true WO2018058673A1 (zh) 2018-04-05

Family

ID=61762345

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/101374 WO2018058673A1 (zh) 2016-09-30 2016-09-30 一种3d显示方法及用户终端

Country Status (4)

Country Link
US (1) US10908684B2 (zh)
EP (1) EP3511764B1 (zh)
CN (1) CN108476316B (zh)
WO (1) WO2018058673A1 (zh)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110448906A (zh) * 2018-11-13 2019-11-15 网易(杭州)网络有限公司 Method and apparatus for controlling the viewing angle in a game, and touch terminal
CN109857246A (zh) * 2018-12-28 2019-06-07 努比亚技术有限公司 Terminal, 3D display control method therefor, and computer-readable storage medium
CN110949272A (zh) * 2019-12-23 2020-04-03 斑马网络技术有限公司 In-vehicle display device adjustment method and apparatus, vehicle, medium, and device
CN114566132A (zh) * 2022-02-28 2022-05-31 北京京东方显示技术有限公司 Parameter processing method and apparatus, electronic device, and computer-readable storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103354616A (zh) * 2013-07-05 2013-10-16 南京大学 Method and system for realizing stereoscopic display on a flat-panel display
CN103517060A (zh) * 2013-09-03 2014-01-15 展讯通信(上海)有限公司 Display control method and apparatus for a terminal device
US20150062311A1 (en) * 2012-04-29 2015-03-05 Hewlett-Packard Development Company, L.P. View weighting for multiview displays
CN104503092A (zh) * 2014-11-28 2015-04-08 深圳市亿思达科技集团有限公司 Three-dimensional display method and device adaptive to different angles and distances
CN104581350A (zh) * 2015-02-04 2015-04-29 京东方科技集团股份有限公司 Display method and display apparatus
CN104581113A (zh) * 2014-12-03 2015-04-29 深圳市亿思达科技集团有限公司 Viewing-angle-based adaptive holographic display method and holographic display apparatus
CN104618711A (zh) * 2015-01-12 2015-05-13 深圳市亿思达科技集团有限公司 Device and method for multi-zone free-viewing-angle holographic stereoscopic display

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090282429A1 (en) * 2008-05-07 2009-11-12 Sony Ericsson Mobile Communications Ab Viewer tracking for displaying three dimensional views
JP2010122879A (ja) * 2008-11-19 2010-06-03 Sony Ericsson Mobile Communications Ab Terminal device, display control method, and display control program
KR20110037657A (ko) 2009-10-07 2011-04-13 삼성전자주식会社 Method for providing a GUI using motion, and display apparatus applying the same
US20120200676A1 (en) 2011-02-08 2012-08-09 Microsoft Corporation Three-Dimensional Display with Motion Parallax
TWI530154B (zh) * 2011-03-17 2016-04-11 群邁通訊股份有限公司 System and method for automatically adjusting the 3D visible viewing angle
WO2014067552A1 (en) 2012-10-29 2014-05-08 Telefonaktiebolaget L M Ericsson (Publ) 3d video warning module
CN103000161B (zh) 2012-12-14 2015-08-12 小米科技有限责任公司 Image display method and apparatus, and smart handheld terminal
KR101916663B1 (ko) * 2012-12-18 2018-11-08 삼성전자주식회사 Three-dimensional display apparatus that displays a three-dimensional image using at least one of the user's gaze direction and the direction of gravity
KR102019125B1 (ko) 2013-03-18 2019-09-06 엘지전자 주식회사 3D display device apparatus and control method
CN104601981A (zh) * 2014-12-30 2015-05-06 深圳市亿思达科技集团有限公司 Method for adjusting the viewing angle based on human-eye tracking, and holographic display apparatus
CN105120251A (zh) * 2015-08-19 2015-12-02 京东方科技集团股份有限公司 3D scene presentation method and apparatus

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150062311A1 (en) * 2012-04-29 2015-03-05 Hewlett-Packard Development Company, L.P. View weighting for multiview displays
CN103354616A (zh) * 2013-07-05 2013-10-16 南京大学 Method and system for realizing stereoscopic display on a flat-panel display
CN103517060A (zh) * 2013-09-03 2014-01-15 展讯通信(上海)有限公司 Display control method and apparatus for a terminal device
CN104503092A (zh) * 2014-11-28 2015-04-08 深圳市亿思达科技集团有限公司 Three-dimensional display method and device adaptive to different angles and distances
CN104581113A (zh) * 2014-12-03 2015-04-29 深圳市亿思达科技集团有限公司 Viewing-angle-based adaptive holographic display method and holographic display apparatus
CN104618711A (zh) * 2015-01-12 2015-05-13 深圳市亿思达科技集团有限公司 Device and method for multi-zone free-viewing-angle holographic stereoscopic display
CN104581350A (zh) * 2015-02-04 2015-04-29 京东方科技集团股份有限公司 Display method and display apparatus

Also Published As

Publication number Publication date
CN108476316B (zh) 2020-10-09
EP3511764A1 (en) 2019-07-17
US20190391639A1 (en) 2019-12-26
EP3511764A4 (en) 2019-09-11
EP3511764B1 (en) 2021-07-21
CN108476316A (zh) 2018-08-31
US10908684B2 (en) 2021-02-02

Similar Documents

Publication Publication Date Title
US10950205B2 (en) Electronic device, augmented reality device for providing augmented reality service, and method of operating same
US11231845B2 (en) Display adaptation method and apparatus for application, and storage medium
CN109712224B (zh) 虚拟场景的渲染方法、装置及智能设备
US10055064B2 (en) Controlling multiple devices with a wearable input device
US9586147B2 (en) Coordinating device interaction to enhance user experience
WO2021098697A1 (zh) Screen display control method and electronic device
CN106445340B (zh) Method and apparatus for displaying a stereoscopic image on a dual-screen terminal
WO2020151594A1 (zh) Method, apparatus, and device for viewing-angle rotation, and storage medium
EP3561667B1 (en) Method for displaying 2d application in vr device, and terminal
WO2022134632A1 (zh) Work processing method and apparatus
WO2018058673A1 (zh) 3D display method and user terminal
WO2021115103A1 (zh) Display control method and terminal device
WO2020108041A1 (zh) Ear key-point detection method and apparatus, and storage medium
WO2015014135A1 (zh) Mouse pointer control method and apparatus, and terminal device
WO2022199102A1 (zh) Image processing method and apparatus
CN109618055B (zh) Position sharing method and mobile terminal
US9665232B2 (en) Information-processing device, storage medium, information-processing method, and information-processing system for enlarging or reducing an image displayed on a display device
CN112262364A (zh) Electronic apparatus and system for generating an object
CN112381729B (zh) Image processing method and apparatus, terminal, and storage medium
CN111443796B (zh) Information processing method and apparatus
CN109688064B (zh) Data transmission method and apparatus, electronic device, and storage medium
CN109634503B (zh) Operation response method and mobile terminal
CN110955378A (zh) Control method and electronic device
WO2020220957A1 (zh) Screen display method and terminal
CN110489190B (zh) Display control method and terminal

Legal Events

Date Code Title Description
121  Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 16917423; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
ENP  Entry into the national phase (Ref document number: 2016917423; Country of ref document: EP; Effective date: 20190409)