WO2018233387A1 - Naked-eye 3D display method and apparatus - Google Patents


Info

Publication number
WO2018233387A1
Authority
WO
WIPO (PCT)
Prior art keywords
phase
viewpoint
eye
right eye
screen
Prior art date
Application number
PCT/CN2018/085791
Other languages
English (en)
French (fr)
Inventor
于炀
陈佳搏
Original Assignee
上海玮舟微电子科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN201710479203.7A external-priority patent/CN107249125A/zh
Priority claimed from CN201710479231.9A external-priority patent/CN107172409A/zh
Priority claimed from CN201710479202.2A external-priority patent/CN107167926A/zh
Priority claimed from CN201710479195.6A external-priority patent/CN107454381A/zh
Application filed by 上海玮舟微电子科技有限公司
Publication of WO2018233387A1 publication Critical patent/WO2018233387A1/zh

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00: Stereoscopic video systems; Multi-view video systems; Details thereof

Definitions

  • the present disclosure relates to the field of three-dimensional (3D) technology, for example, to a naked-eye 3D display method and apparatus.
  • the naked-eye 3D display is widely used in various fields such as advertising, media, demonstration teaching, exhibition display, and film and television.
  • the naked-eye 3D display has the unique characteristic that the viewer does not need to wear glasses or a helmet to perceive the 3D effect; it presents a realistic depth of field and stereoscopic effect, which greatly improves the visual impact and immersion of the audience during viewing, making it a preferred display product for product promotion, publicity, and video playback.
  • the principle of naked-eye 3D display is generally to split the image shown on the display through a lens: the lens refracts different display content to different positions in space, so that the content reaching each eye is separated and the two eyes receive two images with parallax, which produces a stereoscopic effect. If the user views from outside the visible area, image reversal may occur.
  • although a multi-viewpoint method can be used to enlarge the visible area, using many viewpoints at a given angular resolution results in reduced sharpness and image aliasing, which degrades the actual viewing effect.
  • Some embodiments of the present application provide a naked-eye 3D display method and apparatus, which address the technical problem that naked-eye 3D displays in the related art are prone to image inversion and aliasing.
  • the embodiment provides a naked eye 3D display method, including:
  • a phase difference between the right-eye phase and the left-eye phase is calculated, and the phases of the viewpoint images corresponding to the left-eye phase and to the right-eye phase are adjusted according to the phase difference.
  • it also includes:
  • multi-viewpoint interleaving (the number of viewpoints is greater than or equal to 2) is performed using the viewpoint map corresponding to the adjusted left-eye phase and the viewpoint map corresponding to the right-eye phase.
  • what is adjusted is the phase of the viewpoint image.
  • the viewpoint corresponds to the physical channel and is fixed for a given design.
  • the interleaving process refers to a process of writing a view map combination to a plurality of physical channels (viewpoints).
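As an illustration of the interleaving step just described, the sketch below writes a view-map combination into fixed physical channels; the function name and data layout are assumptions for illustration only, not the patent's implementation.

```python
# Hedged sketch: interleaving writes a combination of view maps into the
# fixed physical channels (viewpoints). Names and layout are illustrative.

def interleave(view_map_for_viewpoint, viewpoint_of_channel):
    """Write into each physical channel the view map currently assigned
    to the viewpoint that drives that channel."""
    return [view_map_for_viewpoint[vp] for vp in viewpoint_of_channel]

# Five channels, each driven by one of five viewpoints (fixed by the optics);
# the view maps written into them can be re-assigned every frame.
channels = interleave({0: "V1", 1: "V2", 2: "V3", 3: "V4", 4: "V5"},
                      [0, 1, 2, 3, 4])
```

Re-assigning the dictionary values per frame, while keeping the channel list fixed, mirrors the text's point that viewpoints are fixed by the optical design while view maps are not.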
  • the embodiment further provides a naked-eye 3D display device, including:
  • a determining module configured to set the number of viewpoints within the viewing range at the design viewing distance, and to determine the phase of each viewpoint and the viewpoint map corresponding to each phase according to the number of viewpoints;
  • a calculation module configured to acquire the spatial position of the human eye in the viewing area of the screen, calculate the left-eye and right-eye positions from the spatial position, and determine the phases corresponding to a plurality of points on the screen;
  • an adjusting module configured to calculate the phase difference between the right eye and the left eye, and to adjust the viewpoint phases according to the phase difference;
  • an interleaving module configured to interleave the multiple viewpoints (the number of viewpoints is greater than or equal to 2) according to the adjusted viewpoint phases.
  • An embodiment further provides a computer readable storage medium storing computer executable instructions for performing the method of any of the above.
  • An embodiment further provides a naked-eye 3D display device including at least one processor
  • a memory communicatively coupled to the at least one processor
  • the memory stores instructions executable by the at least one processor; when executed by the at least one processor, the instructions cause the processor to perform any of the methods above.
  • the naked-eye 3D display method and device provided by the embodiments combine human-eye tracking with multi-view display technology to enlarge the viewing range of the naked-eye 3D display, improve the display effect, reduce the visible aliasing and inversion areas, and improve the viewing effect and the user's viewing experience.
  • FIG. 1 is a schematic flow chart of a naked eye 3D display method according to Embodiment 1;
  • FIG. 2 is a schematic diagram of multi-viewpoint optical design in the naked-eye 3D display method provided in the first embodiment
  • FIG. 3 is a schematic flow chart of a naked eye 3D display method provided in the second embodiment
  • FIG. 4 is a schematic flow chart of a naked eye 3D display method according to Embodiment 3;
  • FIG. 5 is a schematic diagram showing the relationship between phase calculation positions in the naked eye 3D display method provided in the third embodiment
  • FIG. 6 is a schematic diagram of adjusting a phase range of a multi-view channel in the naked-eye 3D display method provided in the third embodiment
  • FIG. 7 is a schematic flow chart of a naked eye 3D display method provided in Embodiment 4.
  • FIG. 8 is a schematic diagram showing the relationship of phase calculation positions in the naked eye 3D display method provided in the fourth embodiment.
  • FIG. 9 is a schematic flow chart of a naked-eye 3D display method provided in Embodiment 5;
  • FIG. 10 is a schematic diagram showing a relationship of phase calculation positions in a naked-eye 3D display method according to Embodiment 5;
  • FIG. 11 is a schematic flow chart of a naked eye 3D display method provided in Embodiment 6;
  • FIG. 12 is a schematic diagram of multi-viewpoint channel and viewpoint allocation in the naked-eye 3D display method provided in Embodiment 6;
  • FIG. 13 is a schematic diagram of phase adjustment of a multi-view channel;
  • FIG. 14 is a schematic structural diagram of a naked-eye 3D display device provided in Embodiment 7;
  • FIG. 15 is a structural block diagram of a naked-eye 3D display device according to Embodiment 8 of the present invention.
  • FIG. 1 is a schematic flowchart of a naked eye 3D display method according to the first embodiment.
  • the present embodiment is applicable to displaying a naked eye 3D image.
  • the method may be performed by a naked-eye 3D display device, which may be implemented in software and/or hardware and integrated in a display device for playing naked-eye 3D video or images.
  • the naked eye 3D display method includes:
  • step 110 the number of viewpoints within the viewing range at the design viewing distance is set, and the phase of each viewpoint and the viewpoint map corresponding to the phase are determined according to the number of viewpoints.
  • the design viewing distance means that, when the vertical distance from the screen equals a preset viewing distance, the sub-pixel projection positions separated by the lenticular lens match the positions of the human eyes.
  • the viewer's left and right eyes respectively see appropriate corresponding images, forming binocular parallax, resulting in a sense of depth and space.
  • the design viewing distance may be an optimal viewing distance.
  • the phase is defined over the viewing-angle range (i.e., the viewing range) at the design viewing distance.
  • each viewing-angle range is mapped to the interval 0 to 1. The phase of each viewpoint is determined according to the preset number of viewpoints within the viewing-angle range at the design viewing distance; the phase of each viewpoint image, that is, its initial phase, is determined according to the number of displayed viewpoint images.
  • OVD denotes the design viewing distance, and d_OVD denotes the length of the viewing line segment of the central visible area at the design viewing distance.
  • the viewing line segment of the central visible area at the design viewing distance can be set to the corresponding phase [0, 1], and the phase range corresponding to each viewpoint is determined according to the set number of viewpoints.
  • the viewpoints are evenly distributed, from which the phase range corresponding to each viewpoint can be determined.
  • the range of viewpoints is consecutively equal, covering the entire phase range [0, 1].
  • the viewpoint is used to serve a view map.
  • the phase range of the viewpoint is equal to the initial phase range of the viewpoint image.
  • the phase ranges of viewpoints 1, 2, 3, 4, and 5 can be: {[0, 0.2), [0.2, 0.4), [0.4, 0.6), [0.6, 0.8), [0.8, 1)}.
  • rendering produces a K-viewpoint map, where K is greater than or equal to 2.
  • the view-map ranges are consecutive and equal, covering the entire phase range [0, 1]; that is, viewpoint maps 1, 2, 3, 4, and 5 correspond to the phase ranges {[0, 0.2), [0.2, 0.4), [0.4, 0.6), [0.6, 0.8), [0.8, 1)}.
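The even partition of the phase interval described above can be sketched as follows; the function name is an illustrative assumption.

```python
def viewpoint_phase_ranges(n):
    """Evenly partition the phase interval [0, 1) among n viewpoints,
    giving consecutive, equal, half-open ranges."""
    return [(i / n, (i + 1) / n) for i in range(n)]

# Five viewpoints reproduce the ranges listed above:
# [(0.0, 0.2), (0.2, 0.4), (0.4, 0.6), (0.6, 0.8), (0.8, 1.0)]
ranges = viewpoint_phase_ranges(5)
```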
  • step 120 the spatial position of the human eye in the screen viewing area is obtained, the left-eye and right-eye positions are calculated from the spatial position, and the phases of a plurality of points on the screen corresponding to the left eye and to the right eye are determined respectively.
  • an image containing the viewer's face can be obtained by a camera mounted on the display device and facing the screen viewing area. The face is identified in the image, and the spatial positions of the viewer's left and right eyes are determined from it, such as the vertical distance from the screen and the distance from the vertical center line through the center of the screen.
  • infrared devices can also be used to assist in ranging to obtain a more accurate spatial position of the human eye.
  • the mapping relationship between the spatial position of the human eye and the viewing line segment of the central visible area at the design viewing distance is established.
  • the distance between the human-eye position and the vertical center line of the screen is f, and the distance between the human-eye position and the screen is VD.
  • the phase p corresponding to the human-eye position at a preset point on the screen is then calculated.
  • the left-eye phase may be taken at a point on the screen to the left of the screen center, such as the leftmost point on the screen, i.e., the point corresponding to the maximum x' for the left eye; the right-eye phase may be taken at a point to the right of the screen center, such as the rightmost point on the screen, i.e., the point corresponding to the minimum x' for the right eye.
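The equation itself is not reproduced in this text, so the following is only a hedged reconstruction of the mapping it describes: the eye position is projected by similar triangles onto the plane at the design viewing distance and normalized by the segment length d_OVD. The function name and exact form are assumptions.

```python
def eye_position_phase(f, VD, OVD, d_ovd):
    """Hedged reconstruction (the patent's exact equation is not in this
    text): project the eye at horizontal offset f and distance VD through
    the screen center onto the design-viewing-distance plane, then
    normalize by the central visible segment length d_ovd to obtain a
    phase in [0, 1)."""
    x_proj = f * OVD / VD          # similar triangles through the screen center
    return (x_proj / d_ovd) % 1.0
```

When VD equals OVD the projection returns f itself, so the phase reduces to the eye's own offset within the central segment.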
  • the adjustment of the view maps is a global calculation rather than a single-point calculation, whereas viewpoint interleaving is a per-point calculation rather than a global one.
  • step 130 a phase difference between the left eye phase and the right eye phase is calculated, and the distribution of the viewpoint image phase is adjusted according to the phase difference.
  • the phase of the viewpoint image is adjusted.
  • the viewpoint corresponds to the physical channel and is fixed for a given design.
  • the interleaving process is actually a process of writing a view map combination to multiple physical channels (viewpoints).
  • the phase difference between the two eyes is calculated. For example, it can be calculated as follows:
  • Δp = p_R - p_L, where p_R is the right-eye phase and p_L is the left-eye phase.
  • the left-eye phase is subtracted from the right-eye phase because, when the right eye and the left eye are in the same visible region, the right-eye phase is larger than the left-eye phase, which is convenient for later processing.
  • alternatively, the phase difference can be obtained by subtracting the right-eye phase from the left-eye phase, in which case the subsequent processing must be adjusted accordingly.
  • multi-viewpoint or dual-view input is used to generate a multi-viewpoint image (the number of viewpoints is greater than or equal to 2) and complete the multi-view interleaving; the lenticular lens in front of the naked-eye 3D liquid-crystal display projects the R, G, and B sub-pixels in different directions, recombining them into a 3D multi-view display so that the viewer sees different views from different directions. Since the views seen by the left eye and the right eye vary with the viewer's position, the interleaving can be adjusted when the viewer's left-eye and right-eye positions are known, further avoiding image inversion and aliasing in multi-view display.
  • the viewpoint images corresponding to the current viewer's left and right eyes may be adjusted according to changes in the spatial positions of the eyes, so that the viewpoint image seen by the right eye stays to the right of that seen by the left eye, that is, the phase of the viewpoint image seen by the right eye remains larger than the phase of the viewpoint image seen by the left eye.
  • the viewpoint map can be adjusted according to the phase difference.
  • the viewpoint map is adjusted according to the viewpoint corresponding to the original phase and the phase difference, so that the viewpoint maps actually corresponding to the left and right eyes are consistent with the viewpoint maps set for the corresponding positions at the design viewing distance.
  • there are two cases of the phase difference. In the first case, Δp > 0: the left-eye and right-eye phases lie in the same [0, 1] phase range, so the user's left and right eyes can each view the corresponding viewpoint map without adjusting the phase map.
  • in the second case, the left-eye and right-eye phases lie in different [0, 1] phase ranges, differing by one full range (i.e., 1); the phase ranges of the left and right eyes are then re-determined, and the view maps are adjusted according to the viewpoints corresponding to the original phases and the phase difference.
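The two cases above can be sketched as follows; unwrapping a negative difference by one full period is my reading of "the difference between the two phase ranges, i.e. 1", so treat it as an assumption.

```python
def central_phase_difference(p_right, p_left):
    """Phase difference between the eyes. A negative result means the
    eyes straddle the boundary between two [0, 1) phase ranges (the
    second case above), so one full period is added before the view
    maps are adjusted."""
    dp = p_right - p_left
    if dp < 0:
        dp += 1.0   # re-determine the range by adding the period
    return dp
```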
  • the adjusted view is a global adjusted view.
  • the method in this embodiment further includes performing multi-view (the number of views is greater than or equal to 2) interleaving according to the adjusted view point phase, and generating an image corresponding to the plurality of views.
  • the position of the human eye is converted to the design viewing distance, the phases corresponding to points on the screen are determined, and the content of each viewpoint image is adjusted according to those phases, so that the content follows changes in the eye position and both the left and right eyes view the correct viewpoint content, which may be a linear superposition of multiple view images rather than a single view.
  • the embodiment addresses the problems that, in the related art, the angular resolution of multi-view display technology is limited and the field of view is restricted, and that human-eye-tracking 3D display technology cannot effectively handle the display area between the eyes.
  • This embodiment can avoid image aliasing or inversion, and improve the viewing effect and the viewing experience of the user.
  • a phase system is defined by the viewing angle at the design viewing distance and includes a plurality of viewpoints (corresponding to optical channels) and viewpoint images (the corresponding display content); the phases corresponding to a plurality of points on the screen under the current viewing condition, that is, to the human-eye position, are determined based on the eye position; and the interleaving is adjusted to suit viewing from different positions.
  • FIG. 3 is a schematic flow chart of a naked eye 3D display method provided in the second embodiment.
  • this embodiment is modified on the basis of the above embodiment; it takes into account the human-eye tracking error and reduces the error by controlling the display of unobserved viewpoints.
  • acquiring the spatial position of the human eye in the screen viewing area may be: acquiring the spatial positions of the edges of the left-eye and right-eye visible areas; correspondingly, calculating the left-eye and right-eye viewpoint phases from the spatial position may be: calculating the phases of the edges of the left-eye and right-eye visible areas from their spatial positions; correspondingly, adjusting the viewpoint image according to the phase difference may be: calculating the central phase difference between the left eye and the right eye from the phases of the visible-area edges, and adjusting the viewpoint image according to that phase difference.
  • the naked eye 3D display method includes:
  • step 210 the number of viewpoints within the viewing range of the design viewing distance is set, and the phase of each viewpoint and the viewpoint map corresponding to the phase are determined according to the number of the viewpoints.
  • step 220 the spatial position of the human eye in the screen viewing area is obtained, and the spatial position of the left eye and right eye visible area edges is calculated according to the spatial position.
  • an image containing the viewer's face can be obtained by a camera mounted on the display device and facing the screen viewing area. The face is identified in the image, and the spatial positions of the viewer's left and right eyes are determined from it, such as the vertical distance from the screen and the distance from the vertical center line through the center of the screen.
  • infrared devices can also be used to assist in ranging to obtain a more accurate spatial position of the human eye.
  • images containing the face are captured by the camera periodically, and the spatial position of the human eye is determined from multiple face images, to avoid spatial-position deviations caused by occasional movements of the viewer.
  • the spatial positions of the edges of the left-eye and right-eye visible areas can be calculated from the spatial position of the human eye.
  • the distance e between the center and the boundary of each eye's visible area can be set according to the size of the visible area of a normal human eye. From the distance e, the left and right edge positions of each eye's visible area can be determined.
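A minimal sketch of the edge computation; taking the visible region as symmetric about the eye center with half-width e is my assumption, since the text only fixes the distance e.

```python
def visible_region_edges(eye_x, e):
    """Left and right edge positions of one eye's visible region, at the
    preset distance e from the eye center (symmetry assumed)."""
    return eye_x - e, eye_x + e
```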
  • step 230 the phases of the left and right eye visible region edges are calculated based on the spatial positions of the left and right eye visible region edges.
  • the mapping relationship between the spatial position of the visible area of the human eye and the viewing line segment of the central visible area at the design viewing distance is established, and the phase of the visible area of the human eye is determined according to this mapping.
  • a central phase difference between the left eye and the right eye is calculated according to the phases of the left and right eye visible region edges, and the viewpoint image phase is adjusted according to the phase difference.
  • the phases of the edges of the left-eye and right-eye visible regions are obtained; from them, the center phases of the left and right eyes are computed, and from the center phases the phase difference between the two eyes is calculated.
  • the center phases of the left and right eyes can be calculated as follows:
  • p_L = (p_Lr + p_Ll)/2, where p_L is the left-eye center phase, p_Lr is the phase of the right edge of the left-eye visible region, and p_Ll is the phase of the left edge of the left-eye visible region.
  • p_R = (p_Rr + p_Rl)/2, where p_R is the right-eye center phase, p_Rr is the phase of the right edge of the right-eye visible region, and p_Rl is the phase of the left edge of the right-eye visible region.
  • the central phase difference between the left and right eyes can be calculated as follows:
  • Δp = p_R - p_L, where p_R is the right-eye center phase and p_L is the left-eye center phase.
  • the left-eye phase is subtracted from the right-eye phase because, when the right eye and the left eye are in the same visible region, the right-eye phase is usually larger than the left-eye phase, which is convenient for subsequent processing.
  • alternatively, the phase difference can be obtained by subtracting the right-eye phase from the left-eye phase, in which case the subsequent processing must be adjusted accordingly.
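The center phases and their difference, as defined above, in a minimal sketch:

```python
def center_phase(p_left_edge, p_right_edge):
    """Center phase of a visible region: the mean of its edge phases."""
    return (p_left_edge + p_right_edge) / 2

p_L = center_phase(0.2, 0.4)   # left-eye center phase,  p_L = (p_Ll + p_Lr)/2
p_R = center_phase(0.6, 0.8)   # right-eye center phase, p_R = (p_Rl + p_Rr)/2
dp = p_R - p_L                 # central phase difference between the eyes
```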
  • acquiring the spatial position of the human eye in the screen viewing area is changed to: acquiring the spatial positions of the edges of the left-eye and right-eye visible areas; correspondingly, calculating the left-eye and right-eye viewpoint phases from the spatial position is changed to: calculating the phases of the edges of the left-eye and right-eye visible areas from their spatial positions; correspondingly, adjusting the viewpoint image according to the phase difference is changed to: calculating the central phase difference between the left eye and the right eye from the phases of the visible-area edges, and adjusting the viewpoint image according to that phase difference.
  • in this way, the content of the viewpoint image can change with the position of the visible area of the human eye, so that both the left eye and the right eye view the correct viewpoint content, avoiding image aliasing or inversion and improving the viewing effect and the user's viewing experience.
  • calculating the left-eye and right-eye viewpoint phases from the spatial position is changed to: calculating the centers of the design viewpoint phase ranges of the left and right eyes, and adjusting the left-eye and right-eye viewpoint phase ranges according to the actual left-eye and right-eye viewpoint phases; correspondingly, adjusting the viewpoint map according to the phase difference is changed to: adjusting the viewpoint maps of the left-eye and right-eye viewpoint ranges according to the adjusted left-eye and right-eye viewpoint phase ranges.
  • the naked eye 3D display method includes:
  • step 310 the number of viewpoints within the viewing range at the design viewing distance is set, and the phase of each viewpoint and the viewpoint map corresponding to the phase are determined according to the number of viewpoints.
  • step 320 the spatial position of the human eye in the screen viewing area is obtained, and the phases of the left eye and the right eye viewpoint are calculated according to the spatial position.
  • step 330 the centers of the design viewpoint phase ranges of the left eye and the right eye are calculated separately.
  • FIG. 5 is a schematic diagram showing a phase range of a viewpoint image in the naked eye 3D display method according to the third embodiment.
  • the center of the design viewpoint phase range of the left eye may be calculated as v_lc = (v_ll + v_lr)/2, where v_lc is the center of the left eye's design viewpoint phase range, v_ll is its left boundary, and v_lr is its right boundary.
  • the center of the design viewpoint phase range of the right eye is calculated as v_rc = (v_rl + v_rr)/2, where v_rc is the center of the right eye's design viewpoint phase range, v_rl is its left boundary, and v_rr is its right boundary.
  • step 340 the left eye and right eye viewpoint phase ranges are adjusted according to the actual viewpoint phases of the actual left and right eyes, respectively.
  • if the actual viewpoint phase centers of the left and right eyes are p_l and p_r respectively, the actual viewpoint phase ranges of the left and right eyes can be calculated from the new phase centers:
  • the actual viewpoint phase range of the left eye is: v_l - v_lc + p_l;
  • the actual viewpoint phase range of the right eye is: v_r - v_rc + p_r;
  • step 350 the viewpoint maps of the viewpoint ranges of the left and right eyes are adjusted according to the adjusted left-eye and right-eye viewpoint phase ranges.
  • the viewpoint map can be adjusted according to the actual viewpoint phase ranges, so that the viewpoint maps actually corresponding to the left and right eyes are consistent with the design viewpoint maps set for the corresponding positions at the design viewing distance.
  • the viewpoint views of the viewpoint ranges of the left and right eyes can be respectively adjusted as follows:
  • v_lp = mod(v_l - v_lc + p_l, [0, 1]); since v_l - v_lc + p_l may be greater than 1 or less than 0, while the phase range lies within [0, 1], the remainder must be taken to obtain an accurate shifted range.
  • v_rp = mod(v_r - v_rc + p_r, [0, 1]).
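The range shift with wrap-around can be sketched as below; Python's `%` operator implements the mod over [0, 1).

```python
def shift_phase(v, v_c, p):
    """Shift a design viewpoint phase v so that the range center v_c
    lands on the actual eye phase p, wrapped back into [0, 1)."""
    return (v - v_c + p) % 1.0

# A left-eye range [0.0, 0.2) with center 0.1, eye actually at phase 0.95:
lo = shift_phase(0.0, 0.1, 0.95)   # 0.85
hi = shift_phase(0.2, 0.1, 0.95)   # 0.05 (the shifted range wraps around 1)
```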
  • the viewpoint view of adjusting the range of the left eye and the right eye according to the adjusted left eye and right eye viewpoint phase ranges may include:
  • Corresponding channels are determined according to the adjusted view image; the sub-pixels of the channel are assigned according to the adjusted view image.
  • the lens refracts different display contents to different places in the space by refraction of light to form a plurality of optical channels.
  • each phase has a corresponding optical channel. According to the correspondence between the optical channel and the phase, the channel corresponding to the phase range can be determined.
  • the display content is determined by multiple sub-pixels.
  • the display content corresponding to the channel is a view point map.
  • the sub-pixels forming the channel should be adjusted according to the adjusted display content. That is, the sub-pixels of the channel are reassigned.
  • assigning the sub-pixels of the channel according to the adjusted view image may be: determining the distribution of the corresponding view maps in the channel; determining the weight of each view map according to that distribution; and assigning the sub-pixels of the channel according to the weights.
  • the number of viewpoint maps and the number of channels are not necessarily equal; usually the number of viewpoint maps is less than the number of channels, so viewpoint maps and channels are not in one-to-one correspondence.
  • the distribution may be the number of viewpoint maps within the channel, as well as the proportion of each view in the channel, and the like.
  • the weight can be the ratio of each view in the channel.
  • FIG. 6 is a schematic diagram of adjusting the phase range of the multi-view channel in the naked-eye 3D display method provided in the third embodiment.
  • the views corresponding to channel c5 are parts of v5 and v6: the number of view maps corresponding to channel c5 is 2, and parts of v5 and v6 together constitute the view content of channel c5.
  • the weight of each view map may be determined by the length ratio of the portion of the channel covered by that view map, i.e., d_i / L, where d_i is the actual length of view i within the channel and L is the total length of the channel.
  • the sub-pixels may be assigned according to the weights of all the view maps; that is, a weighted average is taken in proportion to obtain the adjusted viewpoint map, and the sub-pixels are assigned according to the final sub-pixel values.
  • for channels not corresponding to an observed view, the channel output may be set to 0 (all black), or the contents of the nearest left and right channels may be inserted, to avoid aliasing in the left and right eyes.
  • the adjusted viewpoint map can be calculated as c_j = Σ_i d_i v_i, where c_j is the adjusted viewpoint map, v_i is any one of the view maps contained in the channel, and d_i is the weight of that view map.
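The weighted blend above, with scalar stand-ins for per-sub-pixel values, can be sketched as:

```python
def blend_channel(view_values, covered_lengths):
    """Weighted average of the view maps covering one channel; each
    weight is the covered length d_i divided by the total channel
    length L, as in the text."""
    total = sum(covered_lengths)
    return sum(v * d for v, d in zip(view_values, covered_lengths)) / total

# Channel c5, covered 25% by view v5 and 75% by view v6 (scalar stand-ins
# for the actual sub-pixel values):
c5 = blend_channel([100.0, 200.0], [0.25, 0.75])   # 175.0
```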
  • calculating the left-eye and right-eye viewpoint phases from the spatial position is changed to: calculating the centers of the design viewpoint phase ranges of the left and right eyes, and adjusting the left-eye and right-eye viewpoint phase ranges according to the actual viewpoint phases of the left and right eyes; correspondingly, adjusting the viewpoint map according to the phase difference is changed to: adjusting the viewpoint maps of the left-eye and right-eye viewpoint ranges according to the adjusted left-eye and right-eye viewpoint phase ranges.
  • FIG. 7 is a schematic flow chart of a naked eye 3D display method provided in the fourth embodiment.
  • this embodiment is modified on the basis of the above embodiments: calculating the left-eye and right-eye viewpoint phases from the spatial position is changed to calculating the distances of the left eye and the right eye from the vertical center line of the screen, and then calculating the viewpoint phases covered by the left eye and the right eye from those distances, the design viewing distance, and the distance of each pixel from the screen center.
  • the vertical center line of the screen refers to a straight line passing through the center of the screen and perpendicular to the plane of the screen.
  • the naked eye 3D display method includes:
  • step 410 the number of viewpoints within the viewing range of the design viewing distance is set, and the phase of each viewpoint and the viewpoint map corresponding to the phase are determined according to the number of viewpoints.
  • step 420 the spatial position of the human eye in the screen viewing area is obtained, and the distances between the left eye and the right eye and the vertical center line of the screen are respectively calculated.
  • FIG. 8 is a schematic diagram showing the positional relationship of the phase in the naked-eye 3D display method provided in the fourth embodiment.
  • taking a single eye as an example, the distance from the screen is taken from the measured spatial position of the human eye.
  • the distance of the human eye from the vertical center line of the screen is obtained from the right-triangle geometry.
  • since the two legs of similar right triangles are proportional, the corresponding distance of the eye from the vertical center line at the design viewing distance can be calculated.
  • step 430 the viewpoint phases included in the left eye and the right eye are respectively calculated according to the distance, the design viewing distance, and the pixel point and the screen center position distance.
  • the image viewed by the human eye is a collection of many pixels on the screen; for description, a single pixel is considered.
  • a pixel generally lies at some distance from the center of the screen. Let the distance between a pixel on the screen and the screen center be x. The position of the human eye at the design viewing distance, relative to the vertical center line, is then modified accordingly:
  • in this way the eye position is converted into the corresponding distance from the vertical center line at the design viewing distance, that is, the position corresponding to the human eye at the design viewing distance is determined.
  • the corresponding phase of the human eye is then calculated from the phase length at the design viewing distance; exemplarily, the eye's distance from the vertical center line at the design viewing distance is divided by the phase length at the design viewing distance to obtain the corresponding phase.
  • the eye's distance from the vertical center line at the design viewing distance, and the pixel's distance from the screen center, may be positive or negative: taking the intersection of the screen's vertical center line with the design-viewing-distance line segment as the origin, the right side is positive and the left side is negative. Since the phase is defined as a positive number, the phase formula is adjusted accordingly:
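A minimal sketch of the per-pixel phase computation described above, assuming the projection f′ = x + (f − x)·OVD/VD onto the design viewing plane and a wrap of f′/d_OVD into [0, 1). The function name and the exact modulo form are assumptions, since the patent's formulas appear only as images in the original:

```python
def viewpoint_phase(f, vd, x, ovd, d_ovd):
    """Phase seen at a pixel offset x from the screen centre by an eye at
    lateral offset f and distance vd from the screen, given the design
    viewing distance ovd and the phase segment length d_ovd."""
    # project the eye position through the pixel onto the design viewing plane
    f_prime = x + (f - x) * ovd / vd
    # divide by the phase length; Python's % wraps negatives into [0, 1),
    # matching the "phase is always positive" convention in the text
    return (f_prime / d_ovd) % 1.0
```

An eye to the left of the center line (negative f) thus still yields a phase in [0, 1).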
  • step 440 the phase difference between the two eyes is calculated, and the phase map phase distribution is adjusted according to the phase difference.
  • in this embodiment, calculating the viewpoint phases of the left eye and the right eye according to the spatial position is changed to: respectively calculating the distances of the left eye and the right eye from the vertical center line of the screen, and then calculating the viewpoint phases included in the left eye and the right eye according to those distances, the design viewing distance, and the distance between the pixel and the screen center.
  • the corresponding phase can thus be calculated from the position at the design viewing distance that corresponds to the acquired eye position, and the correct phase is obtained by conversion when the eye lies on either side of the vertical center line.
  • FIG. 9 is a schematic flow chart of a naked eye 3D display method provided in the fifth embodiment.
  • in this embodiment the screen is defined as a curved screen, and
  • calculating the phases of the left-eye and right-eye viewpoints according to the spatial position is changed to:
  • calculating the viewpoint phases included in the left eye and the right eye, respectively, according to the distances of the left eye and the right eye from the vertical center line of the screen, the design viewing distance, and the distance between the pixel point and the screen center position measured on the plane formed by the two ends of the curved screen.
  • the naked eye 3D display method includes:
  • step 510 the number of viewpoints within the viewing range of the design viewing distance is set, and the phase of each viewpoint and the viewpoint map corresponding to the phase are determined according to the number of the viewpoints.
  • step 520 the spatial position of the human eye in the screen viewing area is obtained, and the distance between the left eye and the right eye and the vertical center line of the screen is calculated respectively.
  • an image containing a human face can be obtained by a camera that is disposed on the display device and faces the screen viewing area; the face in the image is identified, and the spatial positions of the viewer's left and right eyes are determined from the face, such as the vertical distance from the screen and the distance from the center line perpendicular to the screen center.
  • an infrared device can also be used to assist in ranging to obtain a more accurate spatial position of the human eye.
  • optionally, images with faces captured by the camera are acquired periodically, and the spatial position of the human eye is determined from multiple face images, to avoid spatial position deviations caused by accidental swaying of the viewer.
  • FIG. 10 is a schematic diagram showing the positional relationship of the phase in the naked-eye 3D display method according to the fifth embodiment.
  • taking a single eye as an example, the distance from the screen is taken from the measured spatial position of the human eye.
  • the distance of the human eye from the vertical center line of the screen is obtained from the right-triangle geometry.
  • since the two legs of similar right triangles are proportional, the corresponding distance of the eye from the vertical center line at the design viewing distance can be calculated.
  • let the distance between the human eye and the screen be VD, the distance between the design viewing distance and the screen be OVD, and the distance between the eye position and the vertical center line of the screen be f; the corresponding distance of the eye from the vertical center line at the design viewing distance is then f · OVD / VD.
  • step 530 the viewpoint phases included in the left eye and the right eye are respectively calculated according to the distance, the design viewing distance, the distance between the pixel point and the center position of the screen on the plane formed by both ends of the curved screen.
  • the image viewed by the human eye is a collection of many pixels on the screen; for description, a single pixel is considered.
  • a pixel generally lies at some distance from the center of the screen. On a curved screen the pixel and the screen center lie on the same curved surface, so their on-screen distance depends on the curvature. Because the image on an arc segment of the curved screen is itself formed by projecting the visual plane onto that segment, and the curvature of the curved screen is small, for ease of calculation the distance between the pixel point and the screen center measured on the plane formed by the two ends of the curved screen can be taken as an approximation of the distance between the pixel point and the screen center. Let this distance on the plane formed by the two ends of the curved screen be x. The position of the human eye at the design viewing distance, relative to the vertical center line, is then modified accordingly:
  • in this way the eye position is converted into the corresponding distance from the vertical center line at the design viewing distance, that is, the position corresponding to the human eye at the design viewing distance is determined.
  • the corresponding phase of the human eye is then calculated from the phase length at the design viewing distance; exemplarily, the eye's distance from the vertical center line at the design viewing distance is divided by the phase length at the design viewing distance to obtain the corresponding phase.
  • the eye's distance from the vertical center line at the design viewing distance, and the pixel's distance from the screen center, may be positive or negative: taking the intersection of the screen's vertical center line with the design-viewing-distance line segment as the origin, the right side is positive and the left side is negative. Since the phase is defined as a positive number, the phase formula is adjusted accordingly:
  • step 540 the phase difference between the right eye and the left eye is calculated, and the viewpoint map is adjusted according to the phase difference.
  • in this embodiment, calculating the phases of the left-eye and right-eye viewpoints according to the spatial position is changed to: calculating the viewpoint phases included in the left eye and the right eye, respectively, according to the distances of the left eye and the right eye from the vertical center line of the screen, the design viewing distance, and the distance between the pixel point and the screen center position on the plane formed by the two ends of the curved screen.
  • the content of the viewpoint map can thus change with the position of the human eye, so that both the left eye and the right eye view the correct viewpoint map content, avoiding image aliasing or inversion and improving the viewing effect and the user's viewing experience.
  • FIG. 11 is a schematic flow chart of a naked-eye 3D display method provided in the sixth embodiment.
  • the embodiment is modified based on the above embodiment.
  • adjusting the viewpoint maps of the left-eye and right-eye visible regions according to the phase change amount is changed to: determining, according to the phase change amount, the viewpoint channel corresponding to the adjusted viewpoint map, and assigning the sub-pixels of that channel according to the viewpoint map corresponding to the original phase.
  • the naked eye 3D display method includes:
  • step 610 the number of viewpoints within the viewing range of the design viewing distance is set, and the phase of each viewpoint and the viewpoint map corresponding to the phase are determined according to the number of the viewpoints.
  • step 620 the spatial position of the human eye in the screen viewing area is obtained, and the phases of the left eye and the right eye viewpoint are calculated according to the spatial position.
  • step 630 the phase difference between the right eye and the left eye is calculated, the phase change amount is calculated based on the phase difference, and the viewpoint map is adjusted according to the phase change amount.
  • the viewpoint map may be adjusted according to the phase difference, so that the phase maps corresponding to the left eye and the right eye are consistent with the phases set for the corresponding positions at the design viewing distance.
  • to this end, a phase change amount is introduced. Exemplarily, it is first determined whether the phase difference is less than zero; if so, the phases of the left eye and the right eye do not lie in the same (0, 1) phase range. Further, when the absolute value of the phase difference is greater than the right-eye phase, the phase change amount is calculated from the phase difference; when the absolute value of the phase difference is not greater than the right-eye phase, the phase change amount is calculated from the phases of the left-eye and right-eye viewpoints.
  • alternatively, the phase change amount may be calculated directly from the phases of the left-eye and right-eye viewpoints, regardless of the relationship between the absolute value of the phase difference and the right-eye phase; Δ denotes the phase change amount (the formula itself appears as an image in the original document).
  • the viewpoint phase can then be adjusted according to the phase change amount, for example in the following manner:
  • the viewpoint map may be adjusted by determining the channel corresponding to the adjusted viewpoint map according to the phase change amount, and assigning values to the sub-pixels of that channel according to the adjusted viewpoint map.
  • the lens refracts different display contents to different places in space, forming a plurality of optical channels. Since the number of viewpoint images is usually smaller than the number of optical channels, linear interpolation can be used to determine the content of each channel. FIG. 12 is a schematic diagram of the multi-viewpoint channels and viewpoint map distribution in the naked-eye 3D display method provided in the sixth embodiment, and shows the distribution relationship between channels and viewpoints.
  • FIG. 13 is a schematic diagram of multi-view channel phase adjustment.
  • the viewpoint map is transformed and adjusted according to the phase change amount Δ, so that the phase map of the channel corresponding to the human eye changes; concretely, the phase map undergoes a right-shift adjustment by the phase change amount Δ.
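The right-shift adjustment of FIG. 13 can be sketched as follows; wrapping each shifted phase back into [0, 1) is an assumption consistent with the phase convention used throughout:

```python
def shift_phase_map(phases, delta):
    """Right-shift every phase in the map by the phase change amount delta,
    wrapping the result back into [0, 1)."""
    return [(p + delta) % 1.0 for p in phases]
```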
  • assigning the sub-pixels of the viewpoint channel according to the viewpoint map corresponding to the original phase may include: if the original phase has no corresponding viewpoint map, setting the sub-pixels of the channel to all black, or setting them according to the viewpoint content of the nearest channel.
  • otherwise, the adjusted viewpoint image is calculated as c_j = Σ_i d_i · v_i, where c_j is the adjusted viewpoint image, v_i is any one of the viewpoint images included in the channel, and d_i is the weight of that viewpoint image.
  • the sub-pixels are then set according to the adjusted viewpoint image content.
  • in this embodiment, adjusting the viewpoint map according to the phase difference is changed to: calculating the phase change amount according to the phase difference, and adjusting the viewpoint map according to the phase change amount.
  • the corresponding phase change amount can be calculated for a plurality of different situations, and the viewpoint map is adjusted accordingly.
  • the viewpoint map of the channel corresponding to the human eye can thus be adjusted, so that the left eye and the right eye view the correct viewpoint map content, avoiding image aliasing or inversion and improving the viewing effect and the user's viewing experience.
  • FIG. 14 is a schematic structural diagram of a naked-eye 3D display device provided in the seventh embodiment. As shown in FIG. 14, the naked eye 3D display device includes:
  • the determining module 710 is configured to set a number of viewpoints within a viewing range of the design viewing distance, and determine a phase of each viewpoint and a viewpoint map corresponding to the phase according to the number of the viewpoints;
  • the calculating module 720 is configured to acquire a spatial position of a human eye in a viewing area of the screen, and calculate a phase of the left eye and the right eye according to the spatial position;
  • the adjustment module 730 is configured to calculate a phase difference between the right eye and the left eye, and adjust the viewpoint map according to the phase difference.
  • the naked-eye 3D display device provided by this embodiment instantly acquires the position of the human eye, determines the phase corresponding to the optimal viewing distance according to that position, and adjusts the viewpoint map content according to the phase, so that the content changes with the eye position, enabling the left eye and the right eye to view the correct viewpoint map content, avoiding image aliasing or inversion, and improving the viewing effect and the user's viewing experience.
  • the calculating module includes:
  • An acquisition unit configured to obtain spatial locations of edges of the left and right eye viewable regions
  • the calculating module further includes:
  • a first calculating unit configured to calculate a phase of an edge of the left-eye and right-eye visible regions according to spatial positions of edges of the left-eye and right-eye visible regions
  • the adjustment module includes:
  • the first adjusting unit is configured to calculate a central phase difference between the left eye and the right eye according to the phases of the left and right eye visible region edges, and adjust the viewpoint map according to the phase difference.
  • the adjustment module is configured to: respectively calculate a center of a design viewpoint phase range of the left eye and the right eye;
  • the viewpoint map of the left and right eye viewpoint ranges is adjusted according to the adjusted left-eye and right-eye viewpoint phase ranges.
  • the calculation module is set to:
  • the viewpoint phases included in the left eye and the right eye are respectively calculated according to the distances of the left and right eyes from the vertical center line of the screen, the design viewing distance, and the distances between the pixel points and the center position of the screen on the planes formed at both ends of the curved screen.
  • the calculation module is set to
  • the adjusting module further includes:
  • a second calculating unit configured to determine whether the phase difference is less than zero and, if so, to calculate the phase change amount from the phase difference when the absolute value of the phase difference is greater than the right-eye phase, and to calculate the phase change amount from the phases of the left-eye and right-eye viewpoints when the absolute value of the phase difference is not greater than the right-eye phase.
  • the adjusting unit includes:
  • the viewpoint map adjusting unit is configured to adjust the viewpoint map according to the viewpoint map and the phase change amount corresponding to the original phase.
  • the view point adjustment unit includes:
  • a channel determining subunit configured to determine a channel corresponding to the adjusted view image according to the phase change amount
  • the assignment subunit is configured to assign values to the sub-pixels of the channel according to the viewpoint map corresponding to the original phase. Based on the above embodiment, the assignment subunit is set to:
  • set the sub-pixels of the channel to all black, or set the sub-pixels according to the viewpoint content of the nearest channel, if the original phase has no corresponding viewpoint map;
  • otherwise, calculate the adjusted viewpoint image as c_j = Σ_i d_i · v_i, where c_j is the adjusted viewpoint image, v_i is any one of the viewpoint images included in the channel, and d_i is the weight of that viewpoint image; and
  • set the sub-pixels according to the adjusted viewpoint image content.
  • the naked-eye 3D display device may perform the naked-eye 3D display method provided by any embodiment of the present disclosure, and has the functional modules and beneficial effects corresponding to the executed method.
  • the various modules or operations of the present disclosure described above can be implemented by a terminal device as described above.
  • they may be implemented by a program executable by a computing device, so that they can be stored in a storage device and executed by a processor; the program may be stored in a computer-readable storage medium,
  • such as a read-only memory, a magnetic disk, or an optical disk; alternatively, they may be separately fabricated into a plurality of integrated circuit modules, or a plurality of the modules or operations may be implemented as a single integrated circuit module.
  • the embodiment further provides a computer readable storage medium storing computer executable instructions for executing the naked eye 3D display method described in the foregoing embodiments.
  • FIG. 15 is a structural block diagram of a naked-eye 3D display device provided in the eighth embodiment.
  • the display device provided in this embodiment may include a processor 801 and a memory 803, and may further include a communication interface 802, a bus 804, and a display screen 805.
  • the processor 801, the communication interface 802, and the memory 803 can complete communication with each other through the bus 804.
  • Communication interface 802 can be used for information transmission.
  • the processor 801 can call logic instructions in the memory 803 to perform the naked eye 3D display method of the above embodiment.
  • the logic instructions in the memory 803 may be implemented in the form of a software functional unit and, when sold or used as a stand-alone product, stored in a computer-readable storage medium.
  • the technical solution of the present disclosure may be embodied in the form of a software product stored in a storage medium and including a plurality of instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present disclosure.
  • the storage medium may be a non-transitory storage medium, including a USB flash drive, a mobile hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or another medium that can store program code, or a transitory storage medium.
  • the naked-eye 3D display method and apparatus instantly acquire the eye viewing position, determine the phase corresponding to the optimal viewing distance according to the eye position, and adjust the content of the viewed viewpoint map according to the phase, so that the content changes with the position of the human eye, enabling the left eye and the right eye to view the correct viewpoint map content, avoiding image aliasing or inversion, and improving the viewing effect and the user's viewing experience.


Abstract

A naked-eye 3D display method, including: setting the number of viewpoints within the viewing range at a preset viewing distance, and determining, according to the number of viewpoints, the phase of each viewpoint and the viewpoint map corresponding to the phase; acquiring the spatial position of the human eye in the screen viewing area, and calculating the phase of the left-eye viewpoint and the phase of the right-eye viewpoint according to the spatial position; and calculating the phase difference between the phase of the right-eye viewpoint and the phase of the left-eye viewpoint, and adjusting, according to the phase difference, the viewpoint map corresponding to the phase of the left eye and the viewpoint map corresponding to the phase of the right eye.

Description

Naked-eye 3D display method and device — Technical Field
The present disclosure relates to the field of naked-eye 3D (three dimensions) technology, for example to a naked-eye 3D display method and device.
Background
Naked-eye 3D displays are widely used in fields such as advertising, media, demonstration teaching, exhibitions, and film and television. Unlike traditional binocular 3D display technology, naked-eye 3D display has the unique property of requiring no glasses or helmet to view the 3D effect; with its realistic depth of field and stereoscopic impression, it greatly enhances the visual impact and immersion of the viewing experience, making it an excellent display product for product promotion, public publicity, and video playback.
The principle of naked-eye 3D display is generally to split the image shown by the display through lenses: by refraction, the lenses direct different display contents to different places in space, so that the content is separated when it reaches the human eyes, and the two eyes receive two images containing parallax, producing a stereoscopic effect. If the user views from outside the visible region, image inversion may occur. Although the visible region can be enlarged by using multiple viewpoints, at a given angular resolution more viewpoints reduce sharpness and cause image aliasing, degrading the actual viewing effect.
Summary
Some embodiments of the present application provide a naked-eye 3D display method and device, which solve the technical problems of inversion and image aliasing in naked-eye 3D display in the related art.
In a first aspect, this embodiment provides a naked-eye 3D display method, including:
setting the number of viewpoints within the viewing range at a preset viewing distance, and determining the phase of each viewpoint according to the number of viewpoints; and, according to the number of displayed viewpoint maps, determining the phase of each viewpoint map;
acquiring the spatial position of the human eye in the screen viewing area, calculating the left-eye and right-eye positions according to the spatial position, and determining, for a plurality of points on the screen, the phase corresponding to the left eye and the phase corresponding to the right eye; and
calculating the phase difference between the right-eye phase and the left-eye phase, and adjusting, according to the phase difference, the phase of the viewpoint map corresponding to the left-eye phase and the phase of the viewpoint map corresponding to the right-eye phase. Optionally, the method further includes:
performing multi-viewpoint interleaving (with two or more viewpoints) on the adjusted viewpoint map corresponding to the left-eye phase and the adjusted viewpoint map corresponding to the right-eye phase. What is adjusted is the viewpoint map phase; the viewpoints correspond to physical channels and are fixed for a given design. The interleaving process refers to a process of combining and writing the viewpoint maps into a plurality of physical channels (viewpoints).
In a second aspect, this embodiment further provides a naked-eye 3D display device, including:
a determining module, configured to set the number of viewpoints within the viewing range at a design viewing distance, and determine, according to the number of viewpoints, the phase of each viewpoint and the viewpoint map corresponding to the phase;
a calculating module, configured to acquire the spatial position of the human eye in the screen viewing area, calculate the left-eye and right-eye positions according to the spatial position, and determine the phases corresponding to a plurality of points on the screen; and
an adjusting module, configured to calculate the phase difference between the right eye and the left eye, and adjust the viewpoint map phase according to the phase difference;
an interleaving module, configured to perform multi-viewpoint interleaving (with two or more viewpoints) according to the adjusted viewpoint map phases.
An embodiment further provides a computer-readable storage medium storing computer-executable instructions for executing the method of any one of the above.
An embodiment further provides a naked-eye 3D display device, including at least one processor and a memory communicatively connected to the at least one processor, where the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to perform the method of any one of the above.
The naked-eye 3D display method and device provided by this embodiment combine eye tracking with multi-viewpoint display technology, enlarge the viewing range of the naked-eye 3D display, improve the display effect, reduce visible aliasing and inversion regions, and improve the viewing effect and the user's viewing experience.
Brief Description of the Drawings
FIG. 1 is a schematic flow chart of the naked-eye 3D display method provided in the first embodiment;
FIG. 2 is a schematic diagram of the multi-viewpoint optical design in the naked-eye 3D display method provided in the first embodiment;
FIG. 3 is a schematic flow chart of the naked-eye 3D display method provided in the second embodiment;
FIG. 4 is a schematic flow chart of the naked-eye 3D display method provided in the third embodiment;
FIG. 5 is a schematic diagram of the positional relationships for phase calculation in the naked-eye 3D display method provided in the third embodiment;
FIG. 6 is a schematic diagram of multi-viewpoint channel phase-range adjustment in the naked-eye 3D display method provided in the third embodiment;
FIG. 7 is a schematic flow chart of the naked-eye 3D display method provided in the fourth embodiment;
FIG. 8 is a schematic diagram of the positional relationships for phase calculation in the naked-eye 3D display method provided in the fourth embodiment;
FIG. 9 is a schematic flow chart of the naked-eye 3D display method provided in the fifth embodiment;
FIG. 10 is a schematic diagram of the positional relationships for phase calculation in the naked-eye 3D display method provided in the fifth embodiment;
FIG. 11 is a schematic flow chart of the naked-eye 3D display method provided in the sixth embodiment;
FIG. 12 is a schematic diagram of the multi-viewpoint channel and viewpoint map assignment in the naked-eye 3D display method provided in the sixth embodiment;
FIG. 13 is a schematic diagram of multi-viewpoint channel phase adjustment;
FIG. 14 is a schematic structural diagram of the naked-eye 3D display device provided in the seventh embodiment;
FIG. 15 is a structural block diagram of the naked-eye 3D display device provided in the eighth embodiment.
Detailed Description
Embodiment One
FIG. 1 is a schematic flow chart of the naked-eye 3D display method provided in the first embodiment. This embodiment is applicable to displaying naked-eye 3D images. The method may be performed by a naked-eye 3D display device, which may be implemented in software and/or hardware and integrated into a display device for playing naked-eye 3D video or images.
Referring to FIG. 1, the naked-eye 3D display method includes:
In step 110, the number of viewpoints within the viewing range at the design viewing distance is set, and the phase of each viewpoint and the viewpoint map corresponding to the phase are determined according to the number of viewpoints.
For naked-eye 3D there is usually a design viewing distance: at this preset vertical distance from the screen, the sub-pixel projection positions separated by the lenticular lens match the positions of the human eyes, so that the viewer's left eye and right eye each see the appropriate corresponding image, forming binocular parallax and producing a sense of depth and space. Optionally, the design viewing distance may be the optimal viewing distance.
In this embodiment, the phase is defined over the viewing-angle range (i.e., the viewing range) at the design viewing distance; optionally, one viewing angle is covered by the range 0 to 1, and every viewing angle corresponds to the range 0 to 1. The number of viewpoints within the viewing-angle range at the design viewing distance is preset, and the phase of each viewpoint is determined; at the same time, the phase of each viewpoint map, i.e., the initial phase, is determined according to the number of displayed viewpoint maps.
FIG. 2 is a schematic diagram of the multi-viewpoint optical design in the naked-eye 3D display method provided in the first embodiment. Referring to FIG. 2, OVD denotes the design viewing distance, and d_OVD denotes the length of the viewing line segment of the central visible region at the design viewing distance. In this embodiment, the viewing line segment of the central visible region at the design viewing distance can be set to correspond to the phase range [0, 1], and the phase range corresponding to each viewpoint is determined according to the set number of viewpoints. Optionally, the viewpoints are uniformly distributed, from which the phase range of each viewpoint can be determined; the viewpoint ranges are contiguous and equal, covering the whole phase range [0, 1]. The viewpoints are used to project the viewpoint maps. Optionally, the phase range of a viewpoint equals the initial phase range of its viewpoint map. Taking five viewpoint maps as an example, viewpoint maps 1, 2, 3, 4, 5 may correspond to the phase ranges {[0, 0.2), [0.2, 0.4), [0.4, 0.6), [0.6, 0.8), [0.8, 1)}.
Correspondingly, K viewpoint maps are rendered, where K is greater than or equal to 2, and the phase ranges corresponding to the viewpoints are determined. Optionally, the viewpoint ranges are contiguous and equal and cover the whole phase range [0, 1], i.e., the i-th viewpoint map corresponds to the range [(i − 1)/K, i/K) for i = 1, …, K.
Taking five viewpoint maps as an example, viewpoint maps 1, 2, 3, 4, 5 correspond to the phase ranges {[0, 0.2), [0.2, 0.4), [0.4, 0.6), [0.6, 0.8), [0.8, 1)}.
In step 120, the spatial position of the human eye in the screen viewing area is acquired, the left-eye and right-eye positions are calculated according to the spatial position, and the phases of a plurality of points on the screen corresponding to the left eye and the phases of those points corresponding to the right eye are determined respectively.
Exemplarily, an image containing a human face can be obtained by a camera that is disposed on the display device and faces the screen viewing area; the face in the image is identified, and the spatial positions of the viewer's left and right eyes are determined from the face, such as the vertical distance from the screen and the distance from the center line perpendicular to the screen center. In addition, an infrared device may be used to assist in ranging to obtain a more accurate spatial position of the human eye.
After the spatial position of the human eye is acquired, a mapping between the eye's spatial position and the viewing line segment of the central visible region at the design viewing distance is established according to the geometric relationships, and the eye phase is determined from this mapping; see FIG. 8. The method is as follows: let the distance between the eye position and the vertical center line of the screen be f, and the distance between the eye position and the screen be VD. Let the distance between any preset point on the screen and the screen center be x′, where x′ = x + s·y, (x, y) are the coordinates of the preset point on the screen, and s is the slope of the lenticular lens or grating. The phase p of the preset point on the screen with respect to the eye position is calculated by projecting the eye position through the point onto the design viewing plane, f′ = x′ + (f − x′)·OVD/VD, and taking p = mod(f′/d_OVD, 1).
The left-eye phase may be the phase, for the left eye, of a point on the screen located to the left of the screen center, for example the leftmost point on the screen, i.e., the point corresponding to the maximum value of x′; the right-eye phase may be the phase, for the right eye, of a point on the screen located to the right of the screen center, for example the rightmost point on the screen, i.e., the point corresponding to the minimum value of x′.
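The effective horizontal offset x′ = x + s·y of a screen point under the slanted lenticular sheet can be sketched directly:

```python
def slanted_offset(x, y, s):
    """Effective horizontal offset x' = x + s * y of a screen point (x, y)
    under a lenticular sheet (or grating) with slope s."""
    return x + s * y
```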
In some embodiments, adjusting the viewpoint map is a global computation rather than a single-point computation, whereas viewpoint map interleaving is a per-point computation rather than a global one. In step 130, the phase difference between the left-eye phase and the right-eye phase is calculated, and the distribution of the viewpoint map phases is adjusted according to the phase difference.
What is adjusted is the phase of the viewpoint map; the viewpoints correspond to physical channels and are fixed for a given design. The interleaving process is in fact a process of combining and writing the viewpoint maps into a plurality of physical channels (viewpoints).
From the right-eye and left-eye phases calculated by the above method, the phase difference between the two eyes is calculated. Exemplarily, it may be calculated as:
Δp = p_R − p_L, where p_R is the right-eye phase and p_L is the left-eye phase. Optionally, the left-eye phase is subtracted from the right-eye phase because, when the right eye and the left eye are in the same visible region, the right-eye phase is usually greater than the left-eye phase, which simplifies subsequent processing. Alternatively, the phase difference may be obtained by subtracting the right-eye phase from the left-eye phase, in which case the subsequent processing must be adjusted accordingly.
Considering multi-viewpoint and dual-viewpoint input, multi-viewpoint maps (two or more viewpoints) are generated and multi-viewpoint interleaving is completed; relying on the micro-cylindrical lenses in front of the liquid crystal screen of the naked-eye 3D display, the R, G, B sub-pixels of the image are projected in different directions through the lenses, reorganizing the 3D multi-viewpoint display so that viewers can see different views from different directions. Since the left view seen by the left eye and the main view seen by the right eye change with the viewer's position, when the positions of the viewer's left and right eyes are known, image inversion and image aliasing can be further avoided in multi-viewpoint display by adjusting the interleaving method.
Exemplarily, as the spatial positions of the viewer's left eye and right eye change, the viewpoint maps corresponding to the left eye and the right eye can be adjusted so that the viewpoint map seen by the right eye remains to the right of that seen by the left eye, i.e., the phase of the viewpoint map seen by the right eye remains greater than that seen by the left eye. After the phase difference between the right eye and the left eye is calculated, the viewpoint map can be adjusted according to the phase difference. Exemplarily, the viewpoint map is adjusted according to the viewpoint map corresponding to the original phase and the phase difference, so that the phase maps corresponding to the left eye and the right eye are consistent with the phase maps set for the corresponding positions at the design viewing distance. Exemplarily, there are two cases for the phase difference. The first is Δp > 0: the left-eye and right-eye phases lie in the same [0, 1] phase range, so the phase map need not be adjusted, and the user's left eye and right eye each see the corresponding viewpoint map. For Δp < 0, the left-eye and right-eye phases lie in different [0, 1] phase ranges; the difference between the two phase ranges, namely 1, can be used to re-determine the phase ranges corresponding to the left eye and the right eye, for example by taking p_R ← p_R + 1 so that the difference becomes positive again. The viewpoint map is then adjusted according to the viewpoint map corresponding to the original phase and the phase difference; in this embodiment, adjusting the viewpoint map means adjusting the viewpoint map globally.
The method in this embodiment further includes completing multi-viewpoint interleaving (two or more viewpoints) according to the adjusted viewpoint map phases, generating the images corresponding to the plurality of viewpoints.
Optionally, for the plurality of viewpoints, the viewpoint maps are linearly superimposed according to the proportion of the overlap between the phase range of each viewpoint and the phase ranges of the viewpoint maps overlapping it, generating the images corresponding to the plurality of viewpoints.
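The linear superposition by overlap proportion can be sketched as follows; interval endpoints are half-open [lo, hi), and the helper names are illustrative rather than from the patent:

```python
def overlap(a, b):
    """Length of the overlap between two [lo, hi) intervals."""
    return max(0.0, min(a[1], b[1]) - max(a[0], b[0]))

def interleave_viewpoint(view_ranges, view_values, viewpoint_range):
    """Linearly superimpose the viewpoint maps over one viewpoint,
    weighting each by the fraction of the viewpoint's phase range it covers."""
    width = viewpoint_range[1] - viewpoint_range[0]
    return sum(v * overlap(r, viewpoint_range) / width
               for r, v in zip(view_ranges, view_values))
```

A viewpoint whose phase range straddles two viewpoint maps thus receives a proportional blend of both, instead of a single map.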
By instantly acquiring the eye viewing position, determining the phases of the screen corresponding to the design viewing distance from the eye position, and adjusting the viewpoint map content played in each viewpoint according to the phase, this embodiment makes the viewpoint map content change with the eye position, so that both the left eye and the right eye can see the correct viewpoint map content; the viewpoint map content may be a linear superposition of multiple viewpoint maps rather than a single viewpoint map. This embodiment solves the problems of the related multi-viewpoint display technology, in which the angular resolution cannot be too low and the field of view is limited, and of the related eye-tracking 3D display technology, which cannot effectively handle the display region between the two eyes and easily introduces aliasing when the viewer moves. This embodiment can avoid image aliasing or inversion, improving the viewing effect and the user's viewing experience.
This embodiment defines a phase system in terms of the viewing angle at the design viewing distance, including a plurality of viewpoints (corresponding to optical channels) and viewpoint maps (corresponding to display content); determines, based on the eye position, the phases of a plurality of points on the screen under the current viewing condition, i.e., the eye position; and adjusts the interleaving method to suit viewing from different positions.
Embodiment Two
FIG. 3 is a schematic flow chart of the naked-eye 3D display method provided in the second embodiment. This embodiment is modified on the basis of the above embodiment; it takes eye-tracking error into account and reduces the error by controlling the display of unobserved viewpoints. In this embodiment, acquiring the spatial position of the human eye in the screen viewing area may be: acquiring the spatial positions of the edges of the left-eye and right-eye visible regions; correspondingly, calculating the phases of the left-eye and right-eye viewpoints according to the spatial position may be: calculating the phases of the edges of the left-eye and right-eye visible regions according to the spatial positions of those edges; correspondingly, adjusting the viewpoint map according to the phase difference may be: calculating the center phase difference between the left eye and the right eye according to the phases of the edges of the left-eye and right-eye visible regions, and adjusting the viewpoint map according to the phase difference.
Referring to FIG. 3, the naked-eye 3D display method includes:
In step 210, the number of viewpoints within the viewing range at the design viewing distance is set, and the phase of each viewpoint and the viewpoint map corresponding to the phase are determined according to the number of viewpoints.
In step 220, the spatial position of the human eye in the screen viewing area is acquired, and the spatial positions of the edges of the left-eye and right-eye visible regions are calculated according to the spatial position.
Exemplarily, an image containing a human face can be obtained by a camera that is disposed on the display device and faces the screen viewing area; the face in the image is identified, and the spatial positions of the viewer's left and right eyes are determined from the face, such as the vertical distance from the screen and the distance from the center line perpendicular to the screen center. In addition, an infrared device may be used to assist in ranging to obtain a more accurate spatial position of the human eye.
Optionally, images with faces captured by the camera are acquired periodically, and the spatial position of the human eye is determined from multiple face images, to avoid spatial position deviations caused by accidental swaying of the viewer.
In fact, when observing the outside world, each human eye has a visible region within which objects can be seen. The spatial positions of the edges of the left-eye and right-eye visible regions can be calculated from the eye's spatial position. Exemplarily, the distance e between the center of each eye and the boundary can be set according to the usual size of the eye's visible region. From the distance e, the left and right edge positions of the eye's visible region are f_l = f − e and f_r = f + e.
In step 230, the phases of the edges of the left-eye and right-eye visible regions are calculated according to their spatial positions.
After the spatial positions of the left and right edges of the eye's visible region are acquired, a mapping between those positions and the viewing line segment of the central visible region at the design viewing distance is established according to the geometric relationships, and the phases of the eye's visible region are determined from this mapping.
In step 240, the center phase difference between the left eye and the right eye is calculated according to the phases of the edges of the left-eye and right-eye visible regions, and the viewpoint map phase is adjusted according to the phase difference.
From the above operation steps, the phases of the edges of the left-eye and right-eye visible regions are obtained; from these, the center phases of the left eye and the right eye can be derived, and the phase difference between the two eyes is calculated from the center phases. Exemplarily, the center phases of the left eye and the right eye may be calculated as follows:
p_L = (p_Lr + p_Ll)/2, where p_L is the left-eye center phase, p_Lr is the phase of the right edge of the left-eye visible region, and p_Ll is the phase of its left edge;
p_R = (p_Rr + p_Rl)/2, where p_R is the right-eye center phase, p_Rr is the phase of the right edge of the right-eye visible region, and p_Rl is the phase of its left edge.
Exemplarily, the center phase difference between the left eye and the right eye may be calculated as:
Δp = p_R − p_L, where p_R is the right-eye center phase and p_L is the left-eye center phase. The left-eye phase is subtracted from the right-eye phase because, when the right eye and the left eye are in the same visible region, the right-eye phase is usually greater than the left-eye phase, which simplifies subsequent processing. Alternatively, the phase difference may be obtained by subtracting the right-eye phase from the left-eye phase, in which case the subsequent processing must be adjusted accordingly.
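The center-phase computation of step 240 can be sketched as follows (function names are illustrative):

```python
def eye_center_phase(p_left_edge, p_right_edge):
    """Centre phase of one eye's visible region, p = (p_r + p_l) / 2."""
    return (p_left_edge + p_right_edge) / 2.0

def center_phase_difference(left_edges, right_edges):
    """Delta p = p_R - p_L, from the (left, right) edge phases of each eye."""
    return eye_center_phase(*right_edges) - eye_center_phase(*left_edges)
```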
In this embodiment, acquiring the spatial position of the human eye in the screen viewing area is changed to acquiring the spatial positions of the edges of the left-eye and right-eye visible regions; correspondingly, calculating the phases of the left-eye and right-eye viewpoints according to the spatial position is changed to calculating the phases of the edges of the left-eye and right-eye visible regions according to their spatial positions; and adjusting the viewpoint map according to the phase difference is changed to calculating the center phase difference between the left eye and the right eye according to the phases of those edges and adjusting the viewpoint map according to the phase difference. The viewpoint map content can thus change with the position of the eye's visible region, so that both the left eye and the right eye can see the correct viewpoint map content, avoiding image aliasing or inversion and improving the viewing effect and the user's viewing experience.
Embodiment Three
FIG. 4 is a schematic flow chart of the naked-eye 3D display method provided in the third embodiment. This embodiment is modified on the basis of the above embodiment. In this embodiment, calculating the phases of the left-eye and right-eye viewpoints according to the spatial position is changed to: respectively calculating the centers of the design viewpoint phase ranges of the left eye and the right eye, and adjusting the left-eye and right-eye viewpoint phase ranges according to the actual viewpoint phases of the left eye and the right eye; correspondingly, adjusting the viewpoint map according to the phase difference is changed to: adjusting the viewpoint maps of the left-eye and right-eye viewpoint ranges according to the adjusted left-eye and right-eye viewpoint phase ranges.
Referring to FIG. 4, the naked-eye 3D display method includes:
In step 310, the number of viewpoints within the viewing range at the design viewing distance is set, and the phase of each viewpoint and the viewpoint map corresponding to the phase are determined according to the number of viewpoints.
In step 320, the spatial position of the human eye in the screen viewing area is acquired, and the phases of the left-eye and right-eye viewpoints are calculated according to the spatial position.
In step 330, the centers of the design viewpoint phase ranges of the left eye and the right eye are calculated respectively.
The acquired spatial position of the human eye is the center of the eye, while the design viewpoint phase range is a range extending equally to the left and right of the eye center. To make the actual spatial position of the eye correspond to the design phase range, the centers of the design viewpoint phase ranges of the left eye and the right eye must be calculated respectively. FIG. 5 is a schematic diagram of the viewpoint map phase ranges in the naked-eye 3D display method provided in the third embodiment. Referring to FIG. 5, exemplarily, the center of the left eye's design viewpoint phase range can be calculated as
v_lc = (v_ll + v_lr)/2,
where v_lc is the center of the left eye's design viewpoint phase range, v_ll is the left boundary of the left eye's design viewpoint phase range, and v_lr is its right boundary.
Correspondingly, the center of the right eye's design viewpoint phase range is calculated as
v_rc = (v_rl + v_rr)/2,
where v_rc is the center of the right eye's design viewpoint phase range, v_rl is its left boundary, and v_rr is its right boundary.
In step 340, the left-eye and right-eye viewpoint phase ranges are adjusted according to the actual viewpoint phases of the left eye and the right eye.
Since the actual viewpoint phase centers of the left eye and the right eye are p_l and p_r respectively, the actual viewpoint phase ranges of the left eye and the right eye can be calculated from the new phase centers:
the actual left-eye viewpoint phase range is v_l − v_lc + p_l, i.e., [v_ll − v_lc + p_l, v_lr − v_lc + p_l];
the actual right-eye viewpoint phase range is v_r − v_rc + p_r, i.e., [v_rl − v_rc + p_r, v_rr − v_rc + p_r].
In step 350, the viewpoint maps of the left-eye and right-eye viewpoint ranges are adjusted according to the adjusted left-eye and right-eye viewpoint phase ranges.
After the actual viewpoint phase ranges of the right eye and the left eye are calculated, the viewpoint map can be adjusted according to the actual viewpoint phase ranges, so that the viewpoint maps actually corresponding to the left eye and the right eye are consistent with the design viewpoint maps set for the corresponding positions at the design viewing distance. In this embodiment, the viewpoint maps of the left-eye and right-eye viewpoint ranges can be adjusted as follows.
For the left eye: v_lp = mod(v_l − v_lc + p_l, [0, 1]); since v_l − v_lc + p_l may be greater than 1 or less than 0, and the phase range lies in [0, 1], the remainder must be taken to obtain the correct range.
Similarly, for the right eye: v_rp = mod(v_r − v_rc + p_r, [0, 1]).
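A sketch of the range adjustment v_lp = mod(v_l − v_lc + p_l, [0, 1]); applying the wrap independently to each bound is an assumption, and the function name is illustrative:

```python
def adjusted_viewpoint_range(v_lo, v_hi, v_center, p_eye):
    """Shift a designed viewpoint phase range [v_lo, v_hi) so that its
    centre v_center moves to the eye's actual phase p_eye, wrapping each
    bound back into [0, 1)."""
    return ((v_lo - v_center + p_eye) % 1.0,
            (v_hi - v_center + p_eye) % 1.0)
```

A bound pushed below 0 (e.g. for an eye phase near 0) wraps around to the top of the phase range.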
Exemplarily, adjusting the viewpoint maps of the left-eye and right-eye viewpoint ranges according to the adjusted left-eye and right-eye viewpoint phase ranges may include:
determining the corresponding channel according to the adjusted viewpoint map, and assigning values to the sub-pixels of that channel according to the adjusted viewpoint map.
By refracting light, the lens directs different display contents to different places in space, forming a plurality of optical channels. For the line segment at the center position of the design viewing distance OVD, every phase has a corresponding optical channel; from the correspondence between optical channels and phases, the channel corresponding to a phase range can be determined.
The display content is determined by a plurality of sub-pixels. In this embodiment, the display content corresponding to a channel is the viewpoint map. When the display content changes, the sub-pixels forming the channel should be adjusted according to the adjusted display content, i.e., the channel's sub-pixels are re-assigned.
Optionally, assigning values to the sub-pixels of the channel according to the adjusted viewpoint map may be changed to: determining the distribution of the viewpoint maps corresponding to the channel; determining the weight of each viewpoint map according to the distribution; and assigning values to the sub-pixels of the channel according to the weights.
From the above steps, the number of viewpoint maps and the number of channels are not necessarily equal, and usually the number of viewpoint maps is smaller than the number of channels, so viewpoint maps and channels are not in one-to-one correspondence. The distribution may be the number of viewpoint maps within the channel, the proportion of each viewpoint map within the channel, and so on.
The weight may be the proportion of each viewpoint map within the channel. Referring to FIG. 6, a schematic diagram of multi-viewpoint channel phase-range adjustment in the naked-eye 3D display method provided in the third embodiment, it can be seen that the viewpoint maps corresponding to channel c5 are parts of v5 and v6; that is, the number of viewpoint maps corresponding to channel c5 is 2, and parts of v5 and v6 together constitute the viewpoint map corresponding to channel c5. Exemplarily, the weight of each viewpoint map can be determined by setting the length proportion of each viewpoint map within the channel as d_i = D_i / L, where D_i is the actual length of each viewpoint map within the channel and L is the total channel length.
After the weights of all viewpoint maps included in the channel are calculated, the sub-pixels can be assigned according to these weights; that is, a weighted average is taken in proportion to obtain the adjusted viewpoint map, and the sub-pixels are assigned the final sub-pixel values. There are two cases here: if the channel has no corresponding viewpoint map, its output is 0, i.e., an all-black setting, or the content of the nearest left or right channel is inserted, to avoid aliasing between the left and right eyes; if the channel has corresponding viewpoint maps, the adjusted viewpoint map can be calculated as c_j = Σ_i d_i · v_i, where c_j is the adjusted viewpoint map, v_i is any viewpoint map included in the channel, and d_i is its weight.
By changing the calculation of the left-eye and right-eye viewpoint phases to calculating the centers of the design viewpoint phase ranges and adjusting the viewpoint phase ranges according to the actual viewpoint phases, and by adjusting the viewpoint maps of the left-eye and right-eye viewpoint ranges according to the adjusted phase ranges, the corresponding phase change can be computed for a variety of different situations and the viewpoint map adjusted accordingly.
Embodiment 4
FIG. 7 is a schematic flowchart of the naked-eye 3D display method provided in the fourth embodiment. This embodiment is a variation of the above embodiments. In this embodiment, calculating the phases of the left-eye and right-eye viewpoints according to the spatial position is replaced with: calculating the distances of the left eye and the right eye from the vertical center line of the screen respectively, and calculating the viewpoint phases of the left eye and the right eye respectively according to these distances, the design viewing distance, and the distance between a pixel and the screen center position. In this embodiment and the other embodiments, the vertical center line of the screen refers to the straight line passing through the center of the screen and perpendicular to the plane of the screen.
Referring to FIG. 7, the naked-eye 3D display method includes:
In step 410, the number of viewpoints within the viewing range at the design viewing distance is set, and the phase of each viewpoint and the viewpoint image corresponding to the phase are determined according to the number of viewpoints.
In step 420, the spatial position of the human eyes within the screen viewing area is acquired, and the distances of the left eye and the right eye from the vertical center line of the screen are calculated respectively.
FIG. 8 is a schematic diagram of the positional relationship for phase calculation in the naked-eye 3D display method provided in the fourth embodiment. Referring to FIG. 8, taking a single eye as an example, the distance of the eye from the screen is obtained from the measured spatial position of the eye, and the lateral coordinate of a certain pixel in the screen is set. The distance of the eye from the vertical center line of the screen is then obtained from the right triangle. Exemplarily, based on the proportionality of the sides of similar right triangles, the corresponding distance of the eye from the vertical center line at the design viewing distance can be calculated.
Let the distance between the eye position and the screen be VD, the distance between the design viewing distance and the screen be OVD, and the distance between the eye position and the vertical center line of the screen be f. Then the corresponding distance of the eye from the vertical center line at the design viewing distance is

f · OVD / VD
In step 430, the viewpoint phases of the left eye and the right eye are calculated respectively according to the distances, the design viewing distance, and the distance between a pixel and the screen center position.
The image seen by the human eye is a collection of many pixels on the screen; in this embodiment, a single pixel is used for illustration. A pixel is usually at a certain distance from the screen center position. Let the distance between a pixel on the screen and the screen center position be x. Then the corresponding distance of the eye position from the vertical center line at the design viewing distance is modified to:

x + (f - x) · OVD / VD

In this way, the eye position can be converted into the corresponding distance from the vertical center line at the design viewing distance, that is, the position corresponding to the eye at the design viewing distance is determined.
In this embodiment, the corresponding phase of the eye is calculated according to the phase length at the design viewing distance. Exemplarily, the distance of the eye from the vertical center line at the design viewing distance can be divided by the phase length at the design viewing distance to obtain the corresponding phase. In this embodiment, the distance of the eye from the vertical center line at the design viewing distance and the distance of the pixel from the screen center position may be positive or negative: taking the intersection of the vertical center line of the screen and the design-viewing-distance line segment as the origin, the right side is positive and the left side is negative. Since the phase is defined as a positive number, the phase calculation formulas are adjusted accordingly to:

p_l = mod( (x + (f_l - x) · OVD / VD) / λ, 1 )

p_r = mod( (x + (f_r - x) · OVD / VD) / λ, 1 )

where f_l and f_r are the distances of the left eye and the right eye from the vertical center line, and λ is the phase length at the design viewing distance.
In step 440, the phase difference between the two eyes is calculated, and the phase distribution of the viewpoint images is adjusted according to the phase difference.
In this embodiment, calculating the viewpoint phases of the left eye and the right eye according to the spatial position is replaced with: calculating the distances of the left eye and the right eye from the vertical center line of the screen respectively, and calculating the viewpoint phases of the left eye and the right eye respectively according to these distances, the design viewing distance, and the distance between a pixel and the screen center position. The corresponding phase can thus be calculated from the position corresponding to the acquired eye position at the design viewing distance, and the correct phase can be obtained by conversion when the eye position lies on either side of the vertical center line.
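The geometric conversion of steps 420-430 can be sketched as follows (illustrative Python; the helper name and its parameters, including writing the phase length as `phase_length`, are assumptions and not the patent's notation):

```python
def eye_phase(f, x, VD, OVD, phase_length):
    """Convert an eye position into a viewpoint phase.

    f:            signed distance of the eye from the screen's vertical center line
    x:            signed distance of the pixel from the screen center
    VD:           distance between the eye and the screen
    OVD:          design viewing distance
    phase_length: phase length at the design viewing distance

    Projects the eye onto the design-viewing-distance plane along the ray
    from the pixel (similar triangles), then divides by the phase length
    and takes the remainder so the phase is a positive number in [0, 1).
    """
    f_ovd = x + (f - x) * OVD / VD  # position at the design viewing distance
    return (f_ovd / phase_length) % 1.0
```

With the pixel at the screen center (x = 0), this reduces to the simpler relation f · OVD / VD given above.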
Embodiment 5
FIG. 9 is a schematic flowchart of the naked-eye 3D display method provided in the fifth embodiment. This embodiment is a variation of the above embodiments. In this embodiment, the screen is limited to a curved screen, and calculating the phases of the left-eye and right-eye viewpoints according to the spatial position is replaced with: calculating the viewpoint phases of the left eye and the right eye respectively according to the distances of the left eye and the right eye from the vertical center line of the screen, the design viewing distance, and the distance between a pixel and the screen center position on the plane formed by the two ends of the curved screen.
Referring to FIG. 9, the naked-eye 3D display method includes:
In step 510, the number of viewpoints within the viewing range at the design viewing distance is set, and the phase of each viewpoint and the viewpoint image corresponding to the phase are determined according to the number of viewpoints.
In step 520, the spatial position of the human eyes within the screen viewing area is acquired, and the distances of the left eye and the right eye from the vertical center line of the screen are calculated respectively.
Exemplarily, an image containing a human face can be obtained by a camera arranged on the display device and facing the screen viewing area. The face in the image is recognized, and the spatial positions of the viewer's left and right eyes are determined from the face, for example, the perpendicular distance from the screen and the distance from the center line perpendicular to the screen center. In addition, an infrared device may assist in ranging to obtain a more accurate spatial position of the eyes.
Optionally, images of the face captured by the camera are acquired periodically, and the spatial position of the eyes is determined from multiple face images, so as to avoid spatial position deviations caused by occasional swaying of the viewer.
After the spatial position of the eyes has been acquired, the distances of the left eye and the right eye from the vertical center line of the screen are calculated respectively according to the geometric relationship. FIG. 10 is a schematic diagram of the positional relationship for phase calculation in the naked-eye 3D display method provided in the fifth embodiment. Referring to FIG. 10, taking a single eye as an example, the distance of the eye from the screen is obtained from the measured spatial position of the eye, and the lateral coordinate of a certain pixel in the screen is set. The distance of the eye from the vertical center line of the screen is then obtained from the right triangle. Exemplarily, based on the proportionality of the sides of similar right triangles, the corresponding distance of the eye from the vertical center line at the design viewing distance can be calculated.
Let the distance between the eye position and the screen be VD, the distance between the design viewing distance and the screen be OVD, and the distance between the eye position and the vertical center line of the screen be f. Then the corresponding distance of the eye from the vertical center line at the design viewing distance is

f · OVD / VD
In step 530, the viewpoint phases of the left eye and the right eye are calculated respectively according to the distances, the design viewing distance, and the distance between a pixel and the screen center position on the plane formed by the two ends of the curved screen.
The image seen by the human eye is a collection of many pixels on the screen; in this embodiment, a single pixel is used for illustration. A pixel is usually at a certain distance from the screen center position. On a curved screen, the pixel and the screen center position lie on the same curved surface, and their on-screen distance is related to the curvature of the screen. Since the curved image on an arc segment of the curved screen is itself formed by projecting the visual plane onto the arc segment, and the curvature of the curved screen is small, the distance between the pixel and the screen center position on the plane formed by the two ends of the curved screen can, for ease of calculation, be approximated as the distance between the pixel and the screen center position. Let the distance between the pixel and the screen center position on the plane formed by the two ends of the curved screen be x. Then the corresponding distance of the eye position from the vertical center line at the design viewing distance is modified to:

x + (f - x) · OVD / VD

In this way, the eye position can be converted into the corresponding distance from the vertical center line at the design viewing distance, that is, the position corresponding to the eye at the design viewing distance is determined.
In this embodiment, the corresponding phase of the eye is calculated according to the phase length at the design viewing distance. Exemplarily, the distance of the eye from the vertical center line at the design viewing distance can be divided by the phase length at the design viewing distance to obtain the corresponding phase. In this embodiment, the distance of the eye from the vertical center line at the design viewing distance and the distance of the pixel from the screen center position may be positive or negative: taking the intersection of the vertical center line of the screen and the design-viewing-distance line segment as the origin, the right side is positive and the left side is negative. Since the phase is defined as a positive number, the phase calculation formulas are adjusted accordingly to:

p_l = mod( (x + (f_l - x) · OVD / VD) / λ, 1 )

p_r = mod( (x + (f_r - x) · OVD / VD) / λ, 1 )

where f_l and f_r are the distances of the left eye and the right eye from the vertical center line, and λ is the phase length at the design viewing distance.
In step 540, the phase difference between the right eye and the left eye is calculated, and the viewpoint images are adjusted according to the phase difference.
In this embodiment, the screen is limited to a curved screen, and calculating the phases of the left-eye and right-eye viewpoints according to the spatial position is replaced with: calculating the viewpoint phases of the left eye and the right eye respectively according to the distances of the left eye and the right eye from the vertical center line of the screen, the design viewing distance, and the distance between a pixel and the screen center position on the plane formed by the two ends of the curved screen. The viewpoint image content can thus change with the eye position, so that both the left eye and the right eye see the correct viewpoint image content, image aliasing or inversion is avoided, and the viewing effect and the user's viewing experience are improved.
Embodiment 6
FIG. 11 is a schematic flowchart of the naked-eye 3D display method provided in the sixth embodiment. This embodiment is a variation of the above embodiments. In this embodiment, adjusting the viewpoint images of the visible areas of the left eye and the right eye according to the phase change amount is replaced with: determining the viewpoint channel corresponding to the adjusted viewpoint image according to the phase change amount, and assigning values to the sub-pixels of the viewpoint channel according to the viewpoint image corresponding to the original phase.
Referring to FIG. 11, the naked-eye 3D display method includes:
In step 610, the number of viewpoints within the viewing range at the design viewing distance is set, and the phase of each viewpoint and the viewpoint image corresponding to the phase are determined according to the number of viewpoints.
In step 620, the spatial position of the human eyes within the screen viewing area is acquired, and the phases of the left-eye and right-eye viewpoints are calculated according to the spatial position.
In step 630, the phase difference between the right eye and the left eye is calculated, the phase change amount is calculated according to the phase difference, and the viewpoint images are adjusted according to the phase change amount.
After the phase difference between the right eye and the left eye has been calculated, the viewpoint images can be adjusted according to the phase difference, so that the phase maps corresponding to the left eye and the right eye are consistent with the phase maps set for the corresponding positions at the design viewing distance. In this embodiment, a phase change amount is introduced. Exemplarily, it is first judged whether the phase difference is less than zero; if the phase difference is less than zero, the phases of the left eye and the right eye do not lie within the same (0, 1) phase range. Then, when the absolute value of the phase difference is greater than the right-eye phase, the phase change amount is calculated according to the phase difference; when the absolute value of the phase difference is not greater than the right-eye phase, the phase change amount is calculated according to the phases of the left-eye and right-eye viewpoints. Alternatively, the relationship between the absolute value of the phase difference and the right-eye phase may be ignored, and the phase change amount calculated directly from the phases of the left-eye and right-eye viewpoints. Specifically, the phase change amount can be calculated as follows:

Figure PCTCN2018085791-appb-000021

or, directly,

Figure PCTCN2018085791-appb-000022

where φ is the phase change amount. According to the calculated phase change amount, the viewpoint phases can be adjusted. Exemplarily, the following may be used:

Figure PCTCN2018085791-appb-000023

φ ∈ [p_R, p_L]
Exemplarily, the viewpoint images can be adjusted as follows: the channel corresponding to the adjusted viewpoint image is determined according to the phase change amount, and the sub-pixels of the channel are assigned according to the adjusted viewpoint image. By refracting light, the lens directs different display contents to different places in space, forming a plurality of optical channels. Since the number of viewpoint images is usually smaller than the number of optical channels, a linear interpolation method can be used to determine the viewpoint image corresponding to each channel. FIG. 12 is a schematic diagram of the allocation of multi-viewpoint channels and viewpoint images in the naked-eye 3D display method provided in the sixth embodiment, and shows the allocation relationship between channels and viewpoint images.
FIG. 13 is a schematic diagram of multi-viewpoint channel phase adjustment. As can be seen from FIG. 13, according to the calculated phase change amount φ, the viewpoint images are shifted by φ, so that the phase map of the channel corresponding to the human eye changes; the phase map is shifted to the right by the phase change amount φ. For known pixels and known channels, the change of the viewpoint images can be realized by re-assigning the sub-pixels of the pixels.
Optionally, assigning values to the sub-pixels of the viewpoint channel according to the viewpoint image corresponding to the original phase may include: if the original phase has no corresponding viewpoint image, setting the sub-pixels of the channel to all black, or setting the sub-pixels according to the view content of the nearest channel;
if the original phase has a corresponding viewpoint image, calculating the adjusted viewpoint image by

c_j = Σ_i d_i · v_i

where c_j is the adjusted viewpoint image, v_i is any viewpoint image included in the channel, and d_i is the weight of that viewpoint image, and setting the sub-pixels according to the content of the adjusted viewpoint image.
In this embodiment, adjusting the viewpoint images according to the phase difference is replaced with: calculating the phase change amount according to the phase difference, and adjusting the viewpoint images according to the phase change amount. By introducing the phase change amount, the corresponding phase change amount can be calculated for a variety of different situations and the viewpoint images adjusted accordingly. The viewpoint images of the channels corresponding to the human eyes can be adjusted so that both the left eye and the right eye see the correct viewpoint image content, image aliasing or inversion is avoided, and the viewing effect and the user's viewing experience are improved.
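A minimal sketch of the sub-pixel assignment rule above (illustrative Python; the black/nearest-channel fallback and the weighted average follow the text, while the function name and parameters are assumptions):

```python
def assign_subpixel(values, weights, nearest=None):
    """Compute the sub-pixel value for one viewpoint channel.

    If no viewpoint image corresponds to the channel's original phase,
    output black (0) or, if provided, copy the nearest channel's content;
    otherwise return the weighted average sum(d_i * v_i) over the
    viewpoint images covering the channel.
    """
    if not values:
        return nearest if nearest is not None else 0
    return sum(d * v for d, v in zip(weights, values))
```

The fallback branch implements the two no-image cases described above; the weighted branch is the same c_j = Σ_i d_i · v_i blending used in the third embodiment.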
Embodiment 7
FIG. 14 is a schematic structural diagram of the naked-eye 3D display apparatus provided in the seventh embodiment. As shown in FIG. 14, the naked-eye 3D display apparatus includes:
a determination module 710, configured to set the number of viewpoints within the viewing range at the design viewing distance, and determine the phase of each viewpoint and the viewpoint image corresponding to the phase according to the number of viewpoints;
a calculation module 720, configured to acquire the spatial position of the human eyes within the screen viewing area, and calculate the phases of the left-eye and right-eye viewpoints according to the spatial position; and
an adjustment module 730, configured to calculate the phase difference between the right eye and the left eye, and adjust the viewpoint images according to the phase difference.
The naked-eye 3D display apparatus provided in this embodiment acquires the viewing position of the human eyes in real time, determines the phase corresponding to the optimal viewing distance according to the eye position, and adjusts the displayed viewpoint image content according to the phase, so that the viewpoint image content changes with the eye position, both the left eye and the right eye see the correct viewpoint image content, image aliasing or inversion is avoided, and the viewing effect and the user's viewing experience are improved.
On the basis of the above embodiments, the calculation module includes:
an acquisition unit, configured to acquire the spatial positions of the edges of the visible areas of the left eye and the right eye;
correspondingly, the calculation module further includes:
a first calculation unit, configured to calculate the phases of the edges of the visible areas of the left eye and the right eye according to the spatial positions of those edges;
correspondingly, the adjustment module includes:
a first adjustment unit, configured to calculate the center phase difference of the left eye and the right eye according to the phases of the edges of the visible areas of the left eye and the right eye, and adjust the viewpoint images according to the phase difference.
On the basis of the above embodiments, the adjustment module is configured to: calculate the centers of the designed viewpoint phase ranges of the left eye and the right eye respectively;
adjust the viewpoint phase ranges of the left eye and the right eye respectively according to the actual viewpoint phases of the left eye and the right eye; and
adjust the viewpoint images of the viewpoint ranges of the left eye and the right eye according to the adjusted viewpoint phase ranges.
On the basis of the above embodiments, when the screen is a curved display screen, the calculation module is configured to:
calculate the viewpoint phases of the left eye and the right eye respectively according to the distances of the left eye and the right eye from the vertical center line of the screen, the design viewing distance, and the distance between a pixel and the screen center position on the plane formed by the two ends of the curved screen.
On the basis of the above embodiments, the calculation module is configured to:
calculate the distances of the left eye and the right eye from the vertical center line of the screen respectively; and
calculate the viewpoint phases of the left eye and the right eye respectively according to the distances, the design viewing distance, and the distance between the pixel and the screen center position.
On the basis of the above embodiments, the adjustment module further includes:
a second calculation unit, configured to judge whether the phase difference is less than zero; if the phase difference is less than zero and the absolute value of the phase difference is greater than the right-eye phase, calculate the phase change amount according to the phase difference; and if the phase difference is less than zero and the absolute value of the phase difference is not greater than the right-eye phase, calculate the phase change amount according to the phases of the left-eye and right-eye viewpoints.
On the basis of the above embodiments, the adjustment unit includes:
a viewpoint image adjustment unit, configured to adjust the viewpoint image according to the viewpoint image corresponding to the original phase and the phase change amount.
On the basis of the above embodiments, the viewpoint image adjustment unit includes:
a channel determination sub-unit, configured to determine the channel corresponding to the adjusted viewpoint image according to the phase change amount; and
an assignment sub-unit, configured to assign values to the sub-pixels of the channel according to the viewpoint image corresponding to the original phase. On the basis of the above embodiments, the assignment sub-unit is configured to:
if the original phase has no corresponding viewpoint image, set the sub-pixels of the channel to all black, or set the sub-pixels according to the view content of the nearest channel;
if the original phase has a corresponding viewpoint image, calculate the adjusted viewpoint image by

c_j = Σ_i d_i · v_i

where c_j is the adjusted viewpoint image, v_i is any viewpoint image included in the channel, and d_i is the weight of that viewpoint image, and set the sub-pixels according to the content of the adjusted viewpoint image.
The above naked-eye 3D display apparatus can execute the naked-eye 3D display method provided in any embodiment of the present disclosure, and has the functional modules and beneficial effects corresponding to the executed method.
Those skilled in the art should understand that the modules or operations of the present disclosure described above can be implemented by a terminal device as described above. Optionally, this embodiment can be implemented by a program executable by a computer apparatus, so that the modules or operations can be stored in a storage apparatus and executed by a processor; the program can be stored in a computer-readable storage medium, which may be a read-only memory, a magnetic disk, an optical disk, or the like. Alternatively, the modules or operations may be made into individual integrated circuit modules, or multiple of them may be made into a single integrated circuit module.
This embodiment further provides a computer-readable storage medium storing computer-executable instructions for executing the naked-eye 3D display method described in the above embodiments.
FIG. 15 is a structural block diagram of the naked-eye 3D display apparatus provided in this embodiment. The display apparatus provided in this embodiment may include a processor 801 and a memory 803, and may further include a communications interface 802, a bus 804, and a display screen 805. The processor 801, the communications interface 802, and the memory 803 can communicate with one another via the bus 804. The communications interface 802 can be used for information transmission. The processor 801 can invoke the logic instructions in the memory 803 to execute the naked-eye 3D display method of the above embodiments.
In addition, when the logic instructions in the memory 803 are implemented in the form of software functional units and sold or used as an independent product, they can be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present disclosure can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present disclosure. The aforementioned storage medium may be a non-transitory storage medium, including various media capable of storing program code such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk, or may be a transitory storage medium.
Industrial Applicability
The naked-eye 3D display method and apparatus provided by the present disclosure acquire the viewing position of the human eyes in real time, determine the phase corresponding to the optimal viewing distance according to the eye position, and adjust the displayed viewpoint image content according to the phase, so that the viewpoint image content changes with the eye position, both the left eye and the right eye see the correct viewpoint image content, image aliasing or inversion is avoided, and the viewing effect and the user's viewing experience are improved.

Claims (19)

  1. A naked-eye 3D display method, comprising:
    setting the number of a plurality of viewpoints within a viewing range at a preset viewing distance, and determining a phase of each viewpoint and a viewpoint image corresponding to the phase according to the number of viewpoints;
    acquiring a spatial position of human eyes within a screen viewing area, and calculating a phase of a left-eye viewpoint and a phase of a right-eye viewpoint according to the spatial position; and
    calculating a phase difference between the phase of the right-eye viewpoint and the phase of the left-eye viewpoint, and adjusting the viewpoint image corresponding to the phase of the left-eye viewpoint and the viewpoint image corresponding to the phase of the right-eye viewpoint according to the phase difference.
  2. The method according to claim 1, wherein acquiring the spatial position of the human eyes within the screen viewing area comprises:
    acquiring spatial positions of an edge of a left-eye visible area and an edge of a right-eye visible area;
    wherein calculating the phase of the left-eye viewpoint and the phase of the right-eye viewpoint according to the spatial position comprises:
    calculating a phase of the edge of the left-eye visible area and a phase of the edge of the right-eye visible area according to the spatial positions of the edge of the left-eye visible area and the edge of the right-eye visible area;
    wherein adjusting the viewpoint images according to the phase difference comprises:
    calculating a center phase difference between a center phase of the left eye and a center phase of the right eye according to the phases of the edges of the left-eye and right-eye visible areas, and adjusting the viewpoint images according to the center phase difference.
  3. The method according to claim 1, wherein calculating the phase of the left-eye viewpoint and the phase of the right-eye viewpoint according to the spatial position comprises: calculating a center of a designed viewpoint phase range of the left eye and a center of a designed viewpoint phase range of the right eye respectively; and adjusting a left-eye viewpoint phase range and a right-eye viewpoint phase range respectively according to an actual viewpoint phase of the left eye and an actual viewpoint phase of the right eye;
    wherein adjusting the viewpoint image corresponding to the phase of the left-eye viewpoint and the viewpoint image corresponding to the phase of the right-eye viewpoint according to the phase difference comprises:
    adjusting a viewpoint image of the viewpoint range of the left eye and a viewpoint image of the viewpoint range of the right eye according to the adjusted left-eye viewpoint phase range and right-eye viewpoint phase range.
  4. The method according to claim 1, wherein calculating the phase of the left-eye viewpoint and the phase of the right-eye viewpoint according to the spatial position comprises:
    calculating distances of the left eye and the right eye from a vertical center line of the screen respectively; and
    calculating the phase of the left-eye viewpoint and the phase of the right-eye viewpoint respectively according to the distances, a design viewing distance, and a distance between a preset pixel and a screen center position.
  5. The method according to claim 1, wherein, when the screen is a curved display screen, calculating the phase of the left-eye viewpoint and the phase of the right-eye viewpoint according to the spatial position comprises:
    calculating the phase of the left-eye viewpoint and the phase of the right-eye viewpoint respectively according to distances of the left eye and the right eye from a vertical center line of the screen, a design viewing distance, and a distance between a pixel and a screen center position on a plane formed by two ends of the curved screen.
  6. The method according to any one of claims 1, 2 and 4, wherein adjusting the viewpoint image corresponding to the phase of the left-eye viewpoint and the viewpoint image corresponding to the phase of the right-eye viewpoint according to the phase difference comprises:
    calculating a phase change amount according to the phase difference, and adjusting a viewpoint image of the left-eye visible area and a viewpoint image of the right-eye visible area according to the phase change amount.
  7. The method according to claim 6, wherein calculating the phase change amount according to the phase difference comprises:
    judging whether the phase difference is less than zero, and if the phase difference is less than zero, comparing an absolute value of the phase difference with the phase of the right-eye viewpoint;
    when the absolute value of the phase difference is greater than the phase of the right-eye viewpoint, calculating the phase change amount according to the phase difference; and
    when the absolute value of the phase difference is not greater than the phase of the right-eye viewpoint, calculating the phase change amount according to the phase of the left-eye viewpoint and the phase of the right-eye viewpoint.
  8. The method according to claim 6, wherein adjusting the viewpoint image of the left-eye visible area and the viewpoint image of the right-eye visible area according to the phase change amount comprises:
    determining a viewpoint channel corresponding to the adjusted viewpoint image according to the phase change amount; and
    assigning values to sub-pixels of the viewpoint channel according to the viewpoint image corresponding to the original phase.
  9. The method according to claim 8, wherein assigning values to the sub-pixels of the viewpoint channel according to the viewpoint image corresponding to the original phase comprises:
    if the original phase has no corresponding viewpoint image, setting the sub-pixels of the channel to all black, or setting the sub-pixels according to view content of an adjacent channel;
    if the original phase has a corresponding viewpoint image, calculating the adjusted viewpoint image by

    c_j = Σ_i d_i · v_i

    wherein c_j is the adjusted viewpoint image, v_i is any viewpoint image included in the channel, and d_i is the weight of that viewpoint image, and setting the sub-pixels according to content of the adjusted viewpoint image.
  10. A naked-eye 3D display apparatus, comprising:
    a determination module, configured to set the number of a plurality of viewpoints within a viewing range at a design viewing distance, and determine a phase of each viewpoint and a viewpoint image corresponding to the phase according to the number of viewpoints;
    a calculation module, configured to acquire a spatial position of human eyes within a screen viewing area, and calculate a phase of a left-eye viewpoint and a phase of a right-eye viewpoint according to the spatial position; and
    an adjustment module, configured to calculate a phase difference between the phase of the right-eye viewpoint and the phase of the left-eye viewpoint, and adjust the viewpoint image corresponding to the phase of the left-eye viewpoint and the viewpoint image corresponding to the phase of the right-eye viewpoint according to the phase difference.
  11. The apparatus according to claim 10, wherein the calculation module comprises:
    an acquisition unit, configured to acquire spatial positions of edges of the left-eye and right-eye visible areas;
    the calculation module comprises:
    a first calculation unit, configured to calculate a phase of the edge of the left-eye visible area and a phase of the edge of the right-eye visible area according to the spatial positions of the edge of the left-eye visible area and the edge of the right-eye visible area;
    the adjustment module comprises:
    a first adjustment unit, configured to calculate a center phase difference between a center phase of the left eye and a center phase of the right eye according to the phases of the edges of the left-eye and right-eye visible areas, and adjust the viewpoint images according to the phase difference.
  12. The apparatus according to claim 10, wherein the adjustment module is configured to: calculate a center of a designed viewpoint phase range of the left eye and a center of a designed viewpoint phase range of the right eye respectively;
    adjust a left-eye viewpoint phase range and a right-eye viewpoint phase range respectively according to an actual viewpoint phase of the left eye and an actual viewpoint phase of the right eye; and
    adjust a viewpoint image of the viewpoint range of the left eye and a viewpoint image of the viewpoint range of the right eye according to the adjusted left-eye viewpoint phase range and right-eye viewpoint phase range.
  13. The apparatus according to claim 10, wherein the calculation module is configured to:
    calculate distances of the left eye and the right eye from a vertical center line of the screen respectively; and
    calculate the phase of the left-eye viewpoint and the phase of the right-eye viewpoint respectively according to the distances, a design viewing distance, and a distance between a preset pixel and a screen center position.
  14. The apparatus according to claim 10, wherein, when the screen is a curved display screen, the calculation module is configured to:
    calculate the phase of the left-eye viewpoint and the phase of the right-eye viewpoint respectively according to distances of the left eye and the right eye from a vertical center line of the screen, a design viewing distance, and a distance between a preset pixel and a screen center position on a plane formed by two ends of the curved screen.
  15. The apparatus according to any one of claims 10, 11 and 13, wherein the adjustment module further comprises:
    a second calculation unit, configured to calculate a phase change amount according to the phase difference; and
    a second adjustment unit, configured to adjust a viewpoint image of the left-eye visible area and a viewpoint image of the right-eye visible area according to the phase change amount.
  16. The apparatus according to claim 15, wherein the second calculation unit is configured to:
    judge whether the phase difference is less than zero, and if the phase difference is less than zero, compare an absolute value of the phase difference with the phase of the right-eye viewpoint;
    when the absolute value of the phase difference is greater than the phase of the right-eye viewpoint, calculate the phase change amount according to the phase difference; and
    when the absolute value of the phase difference is not greater than the phase of the right-eye viewpoint, calculate the phase change amount according to the phase of the left-eye viewpoint and the phase of the right-eye viewpoint.
  17. The apparatus according to claim 15, wherein the second adjustment unit comprises:
    a channel determination sub-unit, configured to determine a viewpoint channel corresponding to the adjusted viewpoint image according to the phase change amount; and
    an assignment sub-unit, configured to assign values to sub-pixels of the viewpoint channel according to the viewpoint image corresponding to the original phase.
  18. The apparatus according to claim 17, wherein the assignment sub-unit is configured to:
    if the original phase has no corresponding viewpoint image, set the sub-pixels of the channel to all black, or set the sub-pixels according to view content of an adjacent channel; and
    if the original phase has a corresponding viewpoint image, calculate the adjusted viewpoint image by

    c_j = Σ_i d_i · v_i

    wherein c_j is the adjusted viewpoint image, v_i is any viewpoint image included in the channel, and d_i is the weight of that viewpoint image, and set the sub-pixels according to content of the adjusted viewpoint image.
  19. A computer-readable storage medium storing computer-executable instructions for executing the method according to any one of claims 1-9.
PCT/CN2018/085791 2017-06-22 2018-05-07 Naked-eye 3D display method and apparatus WO2018233387A1 (zh)

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
CN201710479203.7 2017-06-22
CN201710479195.6 2017-06-22
CN201710479203.7A CN107249125A (zh) 2017-06-22 2017-06-22 Naked-eye 3D display method and apparatus
CN201710479231.9A CN107172409A (zh) 2017-06-22 2017-06-22 Naked-eye 3D display method and apparatus for a curved display screen
CN201710479202.2A CN107167926A (zh) 2017-06-22 2017-06-22 Naked-eye 3D display method and apparatus
CN201710479231.9 2017-06-22
CN201710479195.6A CN107454381A (zh) 2017-06-22 2017-06-22 Naked-eye 3D display method and apparatus
CN201710479202.2 2017-06-22

Publications (1)

Publication Number Publication Date
WO2018233387A1 true WO2018233387A1 (zh) 2018-12-27

Family

ID=64736180

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/085791 WO2018233387A1 (zh) 2017-06-22 2018-05-07 裸眼3d显示方法及装置

Country Status (1)

Country Link
WO (1) WO2018233387A1 (zh)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030026474A1 (en) * 2001-07-31 2003-02-06 Kotaro Yano Stereoscopic image forming apparatus, stereoscopic image forming method, stereoscopic image forming system and stereoscopic image forming program
CN103002302A (zh) * 2011-09-15 2013-03-27 Sony Display device, display method, and program
CN103238341A (zh) * 2010-12-09 2013-08-07 Sony Image processing apparatus, image processing method, and program
CN104023222A (zh) * 2014-05-29 2014-09-03 BOE Technology Group Naked-eye 3D display control method, apparatus and system
CN206260049U (zh) * 2016-08-09 2017-06-16 Qingzhong Technology (Shanghai) Naked-eye 3D display device
CN107167926A (zh) * 2017-06-22 2017-09-15 Shanghai Weizhou Microelectronics Technology Naked-eye 3D display method and apparatus
CN107172409A (zh) * 2017-06-22 2017-09-15 Shanghai Weizhou Microelectronics Technology Naked-eye 3D display method and apparatus for a curved display screen
CN107249125A (zh) * 2017-06-22 2017-10-13 Shanghai Weizhou Microelectronics Technology Naked-eye 3D display method and apparatus
CN107454381A (zh) * 2017-06-22 2017-12-08 Shanghai Weizhou Microelectronics Technology Naked-eye 3D display method and apparatus



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18820114

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 24.04.2020)

122 Ep: pct application non-entry in european phase

Ref document number: 18820114

Country of ref document: EP

Kind code of ref document: A1