WO2019080295A1 - Naked-eye 3D display method and control system based on human eye tracking - Google Patents

Naked-eye 3D display method and control system based on human eye tracking

Info

Publication number
WO2019080295A1
Authority
WO
WIPO (PCT)
Prior art keywords
coordinates
binocular
viewer
eyes
display screen
Prior art date
Application number
PCT/CN2017/116194
Other languages
English (en)
French (fr)
Inventor
夏正国
于炀
谢春华
Original Assignee
上海玮舟微电子科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 上海玮舟微电子科技有限公司
Publication of WO2019080295A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/18 Eye characteristics, e.g. of the iris
    • G06V 40/19 Sensors therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/01 Indexing scheme relating to G06F3/01
    • G06F 2203/012 Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Definitions

  • The present invention relates to the field of 3D display technologies, and in particular to a naked-eye 3D display method and a control system based on human eye tracking.
  • Naked-eye 3D display technology refers to 3D display technology in which a viewer can directly view, with the naked eye and without wearing dedicated 3D glasses, a stereoscopic image presenting a 3D effect.
  • Naked-eye 3D display technologies include the lenticular grating, the slit grating, the liquid crystal lens, and so on; the most widely used is the lenticular-lens 3D display technology.
  • The lenticular-lens 3D display technology works by attaching a special lenticular lens sheet to the front of a conventional display.
  • The image pixels beneath each cylindrical lens are divided into several sub-pixels, so that the lens projects each sub-pixel in a different direction.
  • The left eye and the right eye thus receive light from different sub-pixels, so that the two eyes see different pictures, which the brain fuses into a 3D image.
  • Because each cylindrical lens projects pixel content separately toward the left and right eyes, a fixed light distribution without eye tracking requires the user to search back and forth for a suitable viewing position in order to see the ideal stereoscopic effect.
  • If the viewing position is unsuitable, light intended for the left eye may enter the right eye; the right eye then sees both the left image and the right image, which is prone to crosstalk and degrades the user experience.
  • One existing solution uses a camera to track the movement of the human eye: two coordinate systems are established, with the display center and the camera position as their origins, and the relationship between the two systems is calibrated.
  • The coordinates of the human eye in the camera coordinate system are acquired and then converted into the coordinate system centered on the display screen, so this scheme must pass through a conversion between two coordinate systems.
  • Moreover, because the camera captures a two-dimensional image, a further conversion into a three-dimensional coordinate system is needed to determine how the human eye moves along the x, y, and z axes.
  • This related technical solution therefore needs to calibrate many parameters when mapping naked-eye 3D content; its debugging efficiency is low, which is unfavorable when a large number of devices must be calibrated.
  • Embodiments of the present invention provide a naked-eye 3D display method and a control system based on human eye tracking, which greatly simplify both conventional eye tracking and the adjustment of layout content based on eye tracking, thereby improving the 3D display effect.
  • An embodiment of the present invention provides a naked-eye 3D display method based on human eye tracking, the method including:
  • acquiring the midpoint coordinates of the viewer's eyes captured by the camera; determining a moving direction of the viewer's eyes and a moving distance from pre-calibrated preset reference coordinates by comparing the binocular midpoint coordinates with the preset reference coordinates; and translating the display content of the display screen according to the moving direction and the moving distance, together with a pre-calibrated starting offset of the two eyes.
  • The starting offset is the distance, when the binocular midpoint coordinates are located at the preset reference coordinates, from the axis of the camera's optical axis to the center point of the layout period that starts from the layout reference point.
  • The preset reference coordinates are the center position of the picture taken by the camera.
  • A line segment parallel to the direction in which the grating film is laid, starting from the pixel coordinates of the camera in the coordinate system of the display screen, is the layout reference line, and any point on the layout reference line is a layout reference point.
  • An embodiment of the present invention further provides a naked-eye 3D control system based on human eye tracking, the control system including:
  • a binocular midpoint coordinate acquisition module, configured to acquire the midpoint coordinates of the viewer's eyes captured by the camera;
  • a moving distance determination module, configured to determine a moving direction of the viewer's eyes and a moving distance from pre-calibrated preset reference coordinates by comparing the binocular midpoint coordinates with the preset reference coordinates; and
  • a display content translation module, configured to translate the display content of the display screen according to the moving direction and the moving distance, together with a pre-calibrated starting offset of the two eyes.
  • The starting offset is the distance, when the binocular midpoint coordinates are located at the preset reference coordinates, from the axis of the camera's optical axis to the center point of the layout period that starts from the layout reference point.
  • The preset reference coordinates are the center position of the picture taken by the camera; a line segment parallel to the direction in which the grating film is laid, starting from the pixel coordinates of the camera in the coordinate system of the display screen, is the layout reference line, and any point on the layout reference line is a layout reference point.
  • In the embodiments of the present invention, the moving direction of the viewer's eyes and their moving distance from the preset reference coordinates can be determined by acquiring the midpoint coordinates of the viewer's eyes captured by the camera and comparing those midpoint coordinates with the preset reference coordinates.
  • The preset reference coordinates are the center position of the picture taken by the camera.
  • The display content of the display screen can then be translated according to the moving direction and moving distance of the viewer's eyes, together with the pre-calibrated starting offset of the two eyes.
  • The starting offset is a constant calibrated with the camera before the display content is translated; specifically, it is the distance, when the binocular midpoint coordinates are at the preset reference coordinates, by which the axis of the camera's optical axis deviates from the center point of the layout period that starts from the layout reference point.
  • In this way, the display content of the display screen moves correspondingly as the viewer's eyes move.
  • As long as the viewer stays within the viewing distance, there is no need to deliberately find or maintain a viewing position.
  • Because the display content also moves correspondingly, crosstalk caused by light intended for the viewer's left eye entering the right eye, or light intended for the right eye entering the left eye, is avoided, making it easy for viewers to experience the ideal naked-eye 3D effect.
  • By setting the layout reference point to the pixel coordinates of the camera in the coordinate system of the display screen, the calibration process and the number of layout parameters are greatly simplified, which simplifies the calculation. By taking the center position of the camera's picture as the preset reference coordinates and calibrating the starting offset, eye tracking becomes faster and more accurate.
  • FIG. 1a is a schematic flowchart of a naked-eye 3D display method based on human eye tracking according to Embodiment 1 of the present invention;
  • FIG. 1b is a schematic diagram of a layout period according to Embodiment 1 of the present invention;
  • FIG. 2 is a schematic flowchart of a naked-eye 3D display method based on human eye tracking according to Embodiment 2 of the present invention;
  • FIG. 3a is a schematic flowchart of a preferred naked-eye 3D display method based on human eye tracking according to Embodiment 3 of the present invention;
  • FIG. 3b is a schematic diagram of a layout manner according to Embodiment 3 of the present invention;
  • FIG. 3c is a schematic diagram of the stripes of the preset optical distribution map following the movement of the binocular center position according to Embodiment 3 of the present invention;
  • FIG. 4a is a schematic flowchart of still another preferred naked-eye 3D display method based on human eye tracking according to Embodiment 4 of the present invention;
  • FIG. 4b is a schematic diagram of the relationship between the moving direction of the eyes and the width of the layout period according to Embodiment 4 of the present invention;
  • FIG. 4c is a schematic diagram of a method for adjusting the layout period according to Embodiment 4 of the present invention;
  • FIG. 5a is a schematic flowchart of still another preferred naked-eye 3D display method based on human eye tracking according to Embodiment 5 of the present invention;
  • FIG. 5b is a schematic diagram of the linear relationship between the binocular imaging distance and the layout period according to Embodiment 5 of the present invention;
  • FIG. 6 is a structural block diagram of a naked-eye 3D control system based on human eye tracking according to Embodiment 6 of the present invention.
  • The technical solution of the present application can be applied to a display device equipped with a camera.
  • The terminal device can be provided with a built-in camera module, or the camera can be fixedly mounted on the display device.
  • The captured binocular midpoint coordinates are compared with a preset optical distribution map pre-calibrated with the camera, converting the movement of the viewer's eyes relative to the screen into movement of the binocular coordinates relative to the preset optical distribution map.
  • By calculating the moving distance of the stripes in the preset optical distribution map, the display content of the display device can be translated correspondingly, ensuring that the image seen by the viewer remains unchanged: the left image data passes through the grating film into the viewer's left eye, the right image data enters the viewer's right eye, and the left-eye and right-eye image data are fused to realize the naked-eye 3D display effect.
  • The preset optical distribution map can be used to represent the content displayed on the display screen; that is, it is a true embodiment of the display screen in two-dimensional space.
  • The preset optical distribution map is a pre-calibrated light-splitting map that includes first stripes and second stripes arranged at intervals.
  • The inclination angle of the first or second stripes is the angle at which the grating film is actually laid relative to a preset direction (which may be the long side or the short side of the display screen); the sum of the per-row pixel counts of each first stripe and each second stripe is the stripe period.
  • The preset optical distribution map can thus be represented by three parameters: the starting offset, the inclination angle of the stripes, and the stripe period.
  • The stripe period corresponds to a layout period of the displayed content.
  • The boundary line between each first stripe and second stripe is the position of the center point of the layout period.
  • To form the naked-eye 3D effect, the first stripes and the second stripes need to enter the viewer's left eye and right eye respectively.
  • Ideally, the midpoint coordinates of the viewer's eyes are located at the center point of the stripe period, that is, on the boundary line between a first stripe and a second stripe.
  • By translating the display content, the boundary line between the first stripe and the second stripe is kept at the binocular midpoint coordinates, so that the light of the first and second stripes enters the viewer's left and right eyes respectively, avoiding the crosstalk that the movement of the viewer's eyes would otherwise cause while forming the naked-eye 3D effect.
  • A premise of using the preset optical distribution map in the present application is that the optical axis of the camera is nearly perpendicular to the display screen, with a deviation angle within 5 degrees.
  • The advantage of this arrangement is that the stripes of the preset optical distribution map are evenly distributed, which simplifies the calculation.
  • FIG. 1a is a schematic flowchart of a naked-eye 3D display method based on human eye tracking according to Embodiment 1 of the present invention.
  • The method of this embodiment can be performed by a naked-eye 3D control system, which can be implemented in software and/or hardware and can generally be integrated into a playback controller of the display content.
  • The method may include steps S110 to S130.
  • S110: Acquire the midpoint coordinates of the viewer's eyes captured by the camera.
  • The camera is preferably a front camera provided on the naked-eye 3D playback device, or it may be an external camera fixedly mounted on the display device, communicatively connected to it, and facing the viewing area of the display screen.
  • The display device may be a small device such as a mobile phone, tablet computer, or notebook computer, or a large device such as a large-screen color TV or an advertising machine.
  • The display device in this embodiment is preferably a 3D display device provided with a grating film arranged to form the naked-eye 3D display effect.
  • The grating film can be a lenticular grating film or a slit grating film.
  • The camera first captures an image containing the viewer's face. Based on a face recognition algorithm, the face image is recognized, the coordinates of the pupils of both eyes in the face image are located, and from them the midpoint coordinates of the eyes in the image are obtained.
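The last step above reduces to averaging the two pupil coordinates. A minimal illustrative sketch (not part of the patent; the pupil coordinates are assumed to come from some unspecified face-detection step, in image pixels):

```python
def binocular_midpoint(left_pupil, right_pupil):
    """Return the midpoint of the two pupil coordinates, in image pixels.

    left_pupil / right_pupil: (x, y) tuples produced by an upstream
    face/eye detection step (hypothetical; not specified in the patent).
    """
    (lx, ly), (rx, ry) = left_pupil, right_pupil
    return ((lx + rx) / 2.0, (ly + ry) / 2.0)
```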
  • S120: Determine the moving direction of the viewer's eyes and their moving distance from the preset reference coordinates by comparing the binocular midpoint coordinates with the pre-calibrated preset reference coordinates.
  • The preset reference coordinates serve as the reference for judging whether the position of the viewer's eyes has moved.
  • This embodiment preferably sets the preset reference coordinates to the center position of the picture taken by the camera. By comparing the binocular midpoint coordinates with the preset reference coordinates, the moving direction of the viewer's eyes and their moving distance from the preset reference coordinates can be determined.
  • The binocular midpoint coordinates are acquired in real time, allowing the movement of the human eye to be tracked more accurately.
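Step S120 can be sketched roughly as follows (illustrative only; the midpoint and the reference are assumed to be plain pixel coordinates, with the center of the camera picture taken as the preset reference coordinates):

```python
def movement_from_reference(midpoint, frame_size):
    """Compare the binocular midpoint with the picture center and return
    the displacement (dx, dy) plus a coarse direction label."""
    w, h = frame_size
    ref = (w / 2.0, h / 2.0)       # preset reference coordinates: picture center
    dx = midpoint[0] - ref[0]      # positive: moved right in the image
    dy = midpoint[1] - ref[1]      # positive: moved down in the image
    horizontal = 'right' if dx > 0 else 'left'
    vertical = 'down' if dy > 0 else 'up'
    return dx, dy, (horizontal, vertical)
```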
  • S130: Translate the display content of the display screen according to the moving direction and the moving distance, together with the pre-calibrated starting offset of the two eyes.
  • The starting offset is related to the installation position and angle of the camera; once the installation position is fixed, the starting offset is determined. Therefore, the starting offset must be calibrated with the camera before the display content of the display screen is translated.
  • In this embodiment, the pixel coordinates of the camera in the coordinate system of the display screen are taken as the layout reference point, and the line segment through this point parallel to the direction in which the grating film is laid is taken as the layout reference line; any point on the layout reference line is a layout reference point.
  • The grating film can be laid perpendicular to the bottom edge of the display screen, or at a certain inclination angle to it.
  • The advantage of this setting is that the display content at the layout reference line does not change when the display content is translated.
  • Even when the width of the layout period changes (this process is described in Embodiments 4 and 5), the light emitted at the layout reference line remains spatially stationary after passing through the grating film. Since the width of the layout period varies only slightly, the influence of that variation on the stripe width in the optical distribution map can be ignored.
  • The technical solution provided by this embodiment thus greatly simplifies the conventional eye-tracking and layout process.
  • The starting offset is preferably the distance, when the binocular midpoint coordinates are at the preset reference coordinates, by which the axis of the camera's optical axis deviates from the center point of the layout period that starts from the layout reference point.
  • The width of each layout period can be quantized from 0 to 1.
  • The width can also be expressed as the number of pixel rows covered by the grating film, based on the width of the grating film.
  • FIG. 1b is a schematic diagram of a layout period according to Embodiment 1 of the present invention. As shown in FIG. 1b, the contents of the multiple pictures are arranged in sequence within the 0-to-1 width of one layout period; the center point of the layout period is the position where the layout period width equals 0.5.
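The 0-to-1 quantization of a layout period can be expressed compactly (an illustrative sketch; `x` stands for a horizontal pixel position measured from the layout reference line and `period` for the layout-period width in pixels, both assumed names):

```python
def position_in_period(x, period):
    """Normalized position of a horizontal pixel coordinate within its
    layout period: 0 at the period start, 0.5 at the center point."""
    return (x % period) / period
```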
  • Each layout period contains at least two pictures arranged in sequence within the width of the layout period, and any two of the at least two pictures enter the viewer's left eye and right eye respectively to create the naked-eye 3D effect.
  • The starting offset can be calibrated at the same time as the preset optical distribution map is calibrated with the camera.
  • The preset optical distribution map is a true reflection of the display screen in two-dimensional space, so one layout period of the display screen corresponds to one stripe period of the preset optical distribution map. If the width of the stripe period is likewise quantized from 0 to 1, the first stripe occupies widths 0 to 0.5 and the second stripe occupies widths 0.5 to 1.
  • Ideally, the axis of the camera's optical axis lies on the boundary line between the first stripe and the second stripe, that is, at stripe width 0.5.
  • In general, however, the axis of the camera's optical axis is not on the boundary line, so the distance by which it deviates from the boundary line must be calibrated first; this distance is the calibrated starting offset. For example, if the horizontal distance of the camera's optical-axis point from the boundary line is 0.2 of a period, the starting offset is 0.2.
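The calibration example above can be sketched as follows (a hypothetical helper; `axis_x` and `boundary_x` stand for the horizontal pixel positions of the optical-axis point and the stripe boundary line, names not taken from the patent):

```python
def starting_offset(axis_x, boundary_x, period):
    """Starting offset as a fraction of one stripe period: the normalized
    horizontal distance of the camera's optical-axis point from the
    boundary line between the first and second stripes."""
    return ((axis_x - boundary_x) / period) % 1.0
```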
  • By translating the display content of the display screen correspondingly, the displayed left image content enters the left eye and the displayed right image content enters the right eye, forming a good naked-eye 3D effect.
  • The moving direction of the viewer's eyes may be any of up, down, left, or right.
  • Since the distribution of light after passing through the grating is similar to pinhole imaging, the translation direction of the display content must be opposite to the moving direction of the eyes; for example, when the eyes move to the left, the display content must move to the right so that the content seen by the eyes does not change.
  • The embodiment of the invention thus provides a naked-eye 3D display method based on human eye tracking: the midpoint coordinates of the viewer's eyes captured by the camera are acquired and compared with the pre-calibrated preset reference coordinates.
  • From this comparison, the moving direction of the viewer's eyes and their moving distance from the preset reference coordinates are determined.
  • The preset reference coordinates are the center position of the picture taken by the camera.
  • The display content of the display screen can then be translated by combining the moving direction and moving distance of the viewer's eyes with the pre-calibrated starting offset of the two eyes.
  • The starting offset is a constant calibrated with the camera before the display content is translated; specifically, it is the distance, when the binocular midpoint coordinates are at the preset reference coordinates, by which the axis of the camera's optical axis deviates from the center point of the layout period that starts from the layout reference point.
  • Crosstalk caused by light intended for the viewer's left eye entering the right eye, or light intended for the right eye entering the left eye, is thereby avoided, making it easy for the viewer to experience the ideal naked-eye 3D effect.
  • By setting the layout reference point to the pixel coordinates of the camera in the coordinate system of the display screen, the calibration process and the number of layout parameters are greatly simplified compared with the prior art, which simplifies the calculation. By taking the center position of the camera's picture as the preset reference coordinates and calibrating the starting offset, eye tracking becomes faster and more accurate.
  • FIG. 2 is a schematic flowchart of a naked-eye 3D display method based on human eye tracking according to Embodiment 2 of the present invention.
  • The second embodiment refines the step of determining the moving direction of the viewer's eyes and the moving distance from the preset reference coordinates by comparing the binocular midpoint coordinates with the pre-calibrated preset reference coordinates.
  • The method of Embodiment 2 of the present invention includes steps S210 to S240.
  • S210: Acquire the midpoint coordinates of the viewer's eyes captured by the camera.
  • S220: Determine the inclination angle of the grating film with respect to a preset direction.
  • The preset direction may be the direction of the bottom edge of the display screen (the x-axis direction) or the vertical direction of its side edge (the y-axis direction).
  • The grating film laid to achieve the naked-eye 3D effect may be disposed on the outer surface of the display screen either vertically or obliquely. If it is laid vertically, its inclination angle to the bottom edge of the display screen is 90 degrees; if it is laid obliquely, the inclination angle relative to the bottom edge can be obtained while the grating film is being laid.
  • Since the pre-calibrated preset optical distribution map represents the light distribution of the display content, the inclination angle of the first and second stripes relative to the bottom edge of the display screen can also represent the inclination angle of the grating film.
  • S230: Calculate the horizontal distance of the binocular midpoint coordinates from the preset reference coordinates according to the inclination angle, the binocular midpoint coordinates, and the preset reference coordinates.
  • When the grating film is laid perpendicular to the bottom edge of the display screen, up-and-down movement of the viewer's eyes does not affect the naked-eye 3D display effect: light that should enter the left eye still enters the left eye, and light that should enter the right eye still enters the right eye. If the viewer's eyes move left or right, the horizontal distance of the binocular midpoint coordinates from the preset reference coordinates is calculated, and the display content is translated in the direction opposite to the eye movement according to that distance, avoiding crosstalk as well as the vertigo caused by a wrong viewing position.
  • When the grating film is laid at an inclination angle to the bottom edge of the display screen, movement of the binocular midpoint coordinates relative to the preset reference coordinates in any of the up, down, left, or right directions can be converted into an equivalent horizontal movement, and the crosstalk caused by the movement of the eyes can be resolved by translating the display content horizontally.
  • The advantage of this setting is that the layout direction is unified to the set horizontal direction, which reduces the calculation burden while realizing the naked-eye 3D display effect.
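The conversion above can be sketched as follows. This is only an illustration of the idea, under the assumption that the tilt is measured from the vertical, so a vertical displacement `dy` shifts the tilted stripe pattern horizontally by `dy * tan(tilt)`; the exact sign convention depends on the geometry in the patent's figures:

```python
import math

def horizontal_distance(midpoint, ref, tilt_deg):
    """Equivalent horizontal displacement of the binocular midpoint from
    the preset reference coordinates, for a grating tilted tilt_deg
    degrees from the vertical (0 means a vertically laid grating)."""
    dx = midpoint[0] - ref[0]
    dy = midpoint[1] - ref[1]
    # For a vertical grating (tilt 0) only dx matters; a tilted grating
    # maps vertical movement into an additional horizontal component.
    return dx + dy * math.tan(math.radians(tilt_deg))
```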
  • S240: Translate the display content of the display screen according to the moving direction and the moving distance, together with the pre-calibrated starting offset of the two eyes.
  • By translating the display content in the horizontal direction, the crosstalk caused by the movement of the viewer's eyes is avoided, the naked-eye 3D display effect is realized, and the user experience is improved.
  • FIG. 3a is a schematic flowchart of a preferred naked-eye 3D display method based on human eye tracking according to Embodiment 3 of the present invention.
  • The third embodiment refines the step of translating the display content of the display screen according to the moving direction and the moving distance, together with the pre-calibrated starting offset of the two eyes.
  • The method of Embodiment 3 of the present invention includes steps S310 to S360.
  • S310: Acquire the midpoint coordinates of the viewer's eyes captured by the camera.
  • S320: Determine the inclination angle of the grating film with respect to a preset direction.
  • S330: Calculate the horizontal distance of the binocular midpoint coordinates from the preset reference coordinates according to the inclination angle, combined with the binocular midpoint coordinates and the preset reference coordinates.
  • FIG. 3b is a schematic diagram of a layout manner according to Embodiment 3 of the present invention.
  • In FIG. 3b, the inclination angle of the grating film with respect to the vertical direction of the side of the display screen (the y-axis direction) is denoted θ.
  • Point C is at the center of the screen; if the pixel coordinates of the camera in the display coordinate system are substituted by the coordinates of point C, the display content can be laid out with the camera position as the layout reference point.
  • The display content within each layout period can then be determined. Since the preset optical distribution map is a true representation of the display screen in two-dimensional space, for convenience of calculation the preset optical distribution map can be used in place of the display screen, and the moving distance of the binocular position captured by the camera is calculated as a distance moved relative to the preset optical distribution map.
  • FIG. 3c is a schematic diagram of the stripes of the preset optical distribution map following the movement of the binocular center position according to Embodiment 3 of the present invention.
  • In FIG. 3c, Z is the axis point of the camera's optical axis and point C is the center position of the eyes captured by the camera.
  • Point C needs to lie on the boundary line between a first stripe and a second stripe to ensure that the first stripe and the second stripe enter the left and right eyes respectively. Therefore, as the binocular midpoint position moves, the horizontal movement amount offset of the boundary line must be calculated.
  • S340: Calculate the quotient of the horizontal distance and the stripe period of the preset optical distribution map, and take the fractional part of the quotient as the offset.
  • The preset optical distribution map is a true reflection of the display content of the display screen in two-dimensional space.
  • It is a pre-calibrated light-splitting map that includes first stripes and second stripes arranged at intervals; the inclination angle of the first or second stripes relative to the bottom edge of the display screen is the inclination angle of the grating film, and the sum of the per-row pixel counts of each first stripe and each second stripe is the stripe period.
  • By moving the layout content in the horizontal direction, the display content seen by the viewer can be kept unchanged.
  • If the display content were not moved when the position of the eyes changes, light that should have entered the left eye could enter the right eye and light that should have entered the right eye could enter the left eye, causing crosstalk, and vertigo could occur because of the poor viewing position.
  • The offset to be applied can be calculated as follows. Taking two pictures as an example, the left and right pictures are arranged within each stripe period in the same order, left picture first and then right picture, so the ordering of the pictures is identical in every stripe period. Therefore it is only necessary to calculate the horizontal movement amount offset of the boundary line as a fraction of one entire stripe period.
  • The specific formula is:
  • offset = fractional part of (BC / period)
  • where period is the stripe period, and the length of BC can be calculated from the pre-calibrated inclination angle of the stripes.
  • in step S340 it was assumed that the axis Z of the camera's optical axis lies exactly on the boundary line between the first stripe and the second stripe when calculating the offset. As shown in Fig. 3c, since point Z is not actually located on the boundary line, the value of the initial offset must be added to the offset of the boundary line to obtain the translation amount of the current layout content relative to the layout reference point.
  • S360 The display content of the display screen is translated according to the amount of translation.
  • this embodiment further refines the above embodiments by comparing the position of the midpoint coordinates of the viewer's eyes captured by the camera with the preset optical distribution map calibrated with that camera.
  • the movement of the center position of the eyes relative to the screen is thereby converted into the movement of the binocular center coordinates relative to the preset optical distribution map.
  • by calculating the horizontal movement distance of the stripes in the optical distribution map, the content displayed on the display screen can be shifted to follow the movement of the viewer's eyes, ensuring that the light that should enter the left eye still enters the left eye and the light that should enter the right
  • eye still enters the right eye, achieving the naked-eye 3D display effect and avoiding the occurrence of crosstalk.
  • by using the preset optical distribution map, the calculation is simplified and the accuracy of the layout is improved.
  • FIG. 4a is a schematic flow chart of still another preferred naked eye 3D display method based on human eye tracking according to Embodiment 4 of the present invention.
  • the foregoing embodiment is optimized, and the method for arranging the display content when the viewer moves toward the screen or away from the screen.
  • the method in this embodiment further includes steps S410 to S430.
  • S410 Acquire a first binocular position coordinate and a second binocular position coordinate of the viewer in the images respectively captured by the first time and the second time of the camera.
  • S420 Determine a moving direction of the viewer according to the relationship between the first binocular position coordinate and the second binocular position coordinate.
  • for most current optical cameras, the distance of a subject from the camera can be determined from the size of its image in the captured picture. Since the camera is mounted on the display, the distance of the subject from the display can be determined. In this embodiment, by acquiring the first binocular position coordinates and the second binocular position coordinates of the viewer in the images captured by the camera at the first time and the second time, and comparing them, the distance between the eyes and the screen can be determined, and thus the movement direction of the viewer's eyes.
  • S430 Determine the manner of adjusting the width of the display screen period according to the moving direction.
  • FIG. 4b is a schematic diagram showing the relationship between the moving direction of both eyes and the width of the drawing period according to Embodiment 4 of the present invention.
  • ideally, when the eye is at point A, to see the content displayed at point O through adjacent cylindrical lenses, the width of the layout period should be the length of the line segment EF; when the eye is at point B,
  • the width of the layout period should be the length of the line segment DF.
  • the width of the layout period is related to the viewing distance. By determining the distance between the viewer's eyes and the display screen, the width of the layout period can be determined.
  • when the movement direction is from the current position toward the display screen, the width of the layout period is increased.
  • for example, as shown in FIG. 4b, when the viewer's binocular midpoint coordinates move from point B to point A, the width of the layout period increases from DF to EF.
  • when the movement direction is from the current position away from the display screen, the width of the layout period is decreased. For example, as shown in FIG. 4b, when the viewer's binocular midpoint coordinates move from point A to point B, the width of the layout period decreases from EF to DF.
  • FIG. 4c is a schematic diagram of a layout period adjustment manner according to Embodiment 4 of the present invention.
  • Figure 4c shows an adjustment to increase the width of the layout period.
  • when the width of the layout period is adjusted, the positions of all points on the layout reference line AB do not change, while the content displayed at other points shifts correspondingly.
  • therefore, if the pixel coordinates of the camera in the display screen are taken as the layout reference point, and the line segment through that reference point parallel to the laying direction of the grating film is taken as the layout reference line, then when the width of the layout period changes, the light emitted by the display content at the layout reference line after passing through the grating film remains static in its spatial distribution, which improves the accuracy of the layout.
  • this embodiment optimizes the above embodiments: by tracking the movement direction of the viewer's eyes through the camera,
  • the width of the display screen's layout period can be adjusted accordingly, so that the display content of the display changes with the viewer's eye position, avoiding the occurrence of crosstalk and improving the viewing experience of the user.
  • FIG. 5a is a schematic flowchart diagram of still another preferred naked eye 3D display method based on human eye tracking according to Embodiment 5 of the present invention.
  • the embodiment is refined in the above embodiment.
  • the method in this embodiment further includes steps S510 to S520.
  • S510 Control the camera to recognize the binocular imaging distance of the viewer in the captured image, and acquire the binocular imaging distance.
  • the camera may recognize the face image based on a face recognition algorithm and identify the pupil coordinates of both eyes in the face image, thereby determining the binocular imaging distance in the face image.
  • S520 Adjust the width of the chart period of the content of the display screen according to the binocular imaging distance.
  • the binocular imaging distance in the face captured by the camera is inversely related to the actual distance between the eyes and the screen; that is, the farther the actual distance between the eyes and the display screen, the smaller the binocular imaging distance.
  • the actual distance between the eyes and the display screen is also inversely related to the width of the layout period; that is, the farther the actual distance between the eyes and the display screen, the smaller the layout period. Composing the two inverse relations, the binocular imaging distance in the camera's captured image is linearly proportional to the width of the layout period.
  • FIG. 5b is a schematic diagram showing a linear relationship between a binocular imaging distance and a chart period according to Embodiment 5 of the present invention. As shown in FIG. 5b, when the imaging distance of the binocular image captured by the camera is larger, the chart period is wider.
  • the adjusted width y of the layout period can be calculated by a preset linear formula: y = kx + b;
  • x is the distance between the two eyes of the viewer
  • y is the width of the chart period
  • k and b are all pre-calibrated constant coefficients.
  • the linear relationship described above requires calibrating the values of k and b.
  • in general, the actual distance between a viewer's eyes is 6.5 cm.
  • this 6.5 cm interpupillary distance can be used as the basis for the calibration parameters, from which the values of k and b are obtained.
  • to accommodate more viewers, a conversion factor can be obtained by dividing 6.5 by the actual distance between the viewer's eyes. Multiplying this factor by the binocular imaging distance recognized by the camera yields a binocular imaging distance applicable to all viewers.
  • the up, down, left and right tracking and front and back tracking of the viewer's eyes can be performed simultaneously or sequentially.
  • the distance of the binocular midpoint coordinates from the preset reference coordinates may be decomposed into the horizontal and vertical directions.
  • the solutions provided by the above embodiments can then be used to compute separately the horizontal offset of the display content and the layout period width to be adjusted, so that the display content accurately tracks the viewer's position, achieving ideal naked-eye 3D display and enhancing the user experience.
  • this embodiment is refined on the basis of the above embodiments: by determining the linear relationship between the binocular imaging distance and the layout period width,
  • when the viewer's eye position changes, the width of the layout period can be determined
  • directly from the binocular imaging distance captured by the camera, which reduces the number of calibration parameters during the layout process.
  • following the change in eye position, adjusting the width of the layout period changes the light entering the viewer's eyes accordingly, avoiding the phenomenon of crosstalk and improving the user's viewing experience.
  • FIG. 6 is a structural block diagram of a naked eye 3D control system based on human eye tracking according to Embodiment 6 of the present invention.
  • the control system can be implemented by software and/or hardware, and is generally configured to perform a naked eye 3D display method based on human eye tracking provided by any of the above embodiments.
  • the control system includes a binocular midpoint coordinate acquisition module 610, a movement distance determination module 620, and a display content translation module 630.
  • the binocular midpoint coordinate acquisition module 610 is configured to acquire the binocular midpoint coordinates of the viewer captured by the camera.
  • the moving distance determining module 620 is configured to determine a moving direction of the eyes of the viewer and a moving distance from the preset reference coordinates by comparing the binocular midpoint coordinates with the pre-calibrated preset reference coordinates.
  • the display content translation module 630 is configured to translate the display content of the display screen according to the movement direction and the movement distance, combined with the pre-calibrated initial binocular offset; wherein the initial offset is the distance by which the axis of the camera's optical axis deviates from the center point of the layout period, taking the layout reference point as the starting point, when the binocular midpoint coordinates are located at the preset reference coordinates; wherein the preset reference coordinates are the center position of the picture taken by the camera; the line segment starting at the pixel coordinates of the camera in the coordinate system of the display screen and parallel to the laying direction of the grating film is the layout reference line, and any point on the layout reference line is the layout reference point.
  • the embodiment of the present invention provides a naked eye 3D control system based on human eye tracking.
  • the coordinates of the midpoint of the viewer's eyes captured by the camera are acquired, and the binocular midpoint coordinates are compared with
  • the pre-calibrated preset reference coordinates to determine the movement direction of the viewer's eyes and the movement distance away from the preset reference coordinates; the preset reference coordinates are the center position of the picture taken by the camera.
  • the display content of the display screen can then be translated according to the movement direction and movement distance of the viewer's eyes and the pre-calibrated initial offset of the eyes.
  • the initial offset is a constant calibrated with the camera before the display content of the display screen is processed; specifically, when the binocular midpoint coordinates are at the preset reference coordinates, it is the distance by which the axis of the camera's optical axis deviates from
  • the center point of the layout period taking the layout reference point as the starting point.
  • since the display content moves together with the viewer's eyes, the crosstalk caused when light meant for the viewer's left eye enters the right eye and light meant for the viewer's right eye enters the left eye is avoided,
  • making it easy for viewers to experience the ideal naked-eye 3D effect.
  • by setting the layout reference point to a pixel of the camera in the coordinate system of the display screen, the calibration process and the number of parameters are greatly simplified compared with the technical solutions provided by the prior art, and the computation is simplified,
  • so that eye tracking becomes faster and more accurate.
  • each layout period includes at least two images, the at least two images being arranged in order within the width of the layout period, wherein any two of the at least two images enter the left and right eyes of the viewer, respectively.
  • the moving distance determining module 620 is specifically configured to: determine an inclination angle of the grating film with respect to a preset direction; according to the tilting angle, simultaneously combine the coordinates of the middle point of the two eyes and the The preset reference coordinates calculate a horizontal distance of the binocular midpoint coordinates from the preset reference coordinates.
  • the display content translation module 630 is specifically configured to: calculate a quotient between the horizontal distance and a stripe period of the preset optical distribution map, and take the quotient value into a fractional portion;
  • wherein the preset optical distribution map is a pre-calibrated light-splitting spectrum, the preset optical distribution map including first stripes and second stripes arranged alternately at intervals, the inclination angle of the first or second stripes being the inclination angle of the grating film, and the sum of the total number of pixels per row of each first stripe and each second stripe being the stripe period;
  • add the calculated fractional part to the initial offset to obtain the translation amount of the current layout content relative to the layout reference point; and translate the display content of the display screen according to the translation amount.
  • control system further includes: a binocular position acquisition module configured to acquire a first binocular position coordinate and a second binocular position coordinate of the viewer in the images respectively captured by the first time and the second time of the camera.
  • a movement direction determining module configured to determine a moving direction of the viewer according to a relationship between the first binocular position coordinate and the second binocular position coordinate; and a width adjustment mode determining module configured to determine according to the moving direction The way to adjust the width of the display layout period.
  • the width adjustment mode determining module is specifically configured to: take the first binocular coordinate position as the current position of the viewer's eyes;
  • when the movement direction is from the current position toward the display screen, increase the width of the display screen's layout period; and when the movement direction is from the current position away from the display screen, decrease the width of the display screen's layout period.
  • control system further includes: a binocular imaging distance acquiring module, configured to control the camera to identify a binocular imaging distance of the viewer in the captured image, and acquire the binocular imaging distance;
  • a period width adjustment module is configured to adjust a width of a layout period of the content of the display screen according to the binocular imaging distance.
  • the layout period width adjustment module is specifically configured to calculate the adjusted layout period width y by a preset linear formula:
  • x is the distance between the two eyes of the viewer
  • y is the width of the chart period
  • k and b are all pre-calibrated constant coefficients.
  • the grating film is a lenticular grating film or a slit grating film.
  • An embodiment of the present invention further provides an electronic device.
  • the electronic device includes at least one processor and a memory coupled to the at least one processor, the memory for storing instructions executable by the at least one processor, the instructions being executed by the at least one processor And causing the at least one processor to perform the naked eye 3D display method based on the human eye tracking in the above embodiment.
  • the embodiment of the present invention further provides a non-transitory storage medium storing computer executable instructions, and the computer executable instructions are configured to perform the above-described naked eye 3D display method based on human eye tracking.
  • Embodiments of the present invention also provide a computer program product comprising a computer program stored on a non-transitory computer readable storage medium, the computer program comprising program instructions, when the program instructions are executed by a computer
  • the computer is caused to perform the above-described naked eye 3D display method based on human eye tracking.
  • the naked eye 3D control system based on the human eye tracking provided by the embodiment of the present invention can perform the naked eye 3D display method based on the human eye tracking provided by any embodiment of the present invention, and has the corresponding functional modules and beneficial effects of the execution method.
  • for technical details not described in detail in the above embodiments, reference may be made to the naked-eye 3D display method based on human eye tracking provided by any embodiment of the present invention.
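As a rough illustration of how modules 610, 620, and 630 fit together, the sketch below wires the acquire-compare-translate pipeline into one small class. All names, the direction convention, and the purely horizontal simplification are hypothetical; the patent specifies the modules' responsibilities, not this code.

```python
class NakedEye3DController:
    """Hypothetical wiring of modules 610 (acquire the binocular
    midpoint), 620 (determine movement vs. the preset reference
    coordinates), and 630 (translate the layout content)."""

    def __init__(self, ref_xy, initial_offset, period):
        self.ref_xy = ref_xy                  # preset reference coordinates (picture center)
        self.initial_offset = initial_offset  # pre-calibrated initial offset, 0..1
        self.period = period                  # layout period width in pixels

    def movement(self, mid_xy):
        """Module 620: direction and distance of the binocular midpoint
        relative to the preset reference coordinates (horizontal only)."""
        dx = mid_xy[0] - self.ref_xy[0]
        direction = "left" if dx < 0 else "right" if dx > 0 else "none"
        return direction, abs(dx)

    def translation(self, mid_xy):
        """Module 630: fractional part of (distance / period), plus the
        calibrated initial offset, as a fraction of one layout period."""
        dx = mid_xy[0] - self.ref_xy[0]
        return ((dx / self.period) % 1.0 + self.initial_offset) % 1.0
```

A caller would feed each camera frame's detected midpoint into `movement` and `translation` and shift the layout content by the returned fraction of a period, in the direction opposite to the eyes.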

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Multimedia (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

Disclosed are a naked-eye 3D display method and control system based on human eye tracking. The method includes: acquiring the binocular midpoint coordinates of a viewer captured by a camera; determining the movement direction of the viewer's eyes and the movement distance away from pre-calibrated preset reference coordinates by comparing the binocular midpoint coordinates with the preset reference coordinates; and translating the display content of the display screen according to the movement direction and the movement distance, combined with a pre-calibrated initial binocular offset. The initial offset is the distance by which the axis of the camera's optical axis deviates from the center point of the layout period, taking the layout reference point as the starting point, when the binocular midpoint coordinates are located at the preset reference coordinates. With this technical solution, the content shown on the screen moves together with the viewer's eyes; within the visible distance range, the viewer need not deliberately search for a viewing position, crosstalk is avoided, and an ideal naked-eye 3D effect can be experienced with ease.

Description

Naked-eye 3D display method and control system based on human eye tracking — Technical Field
The present invention relates to the field of 3D display technology, and in particular to a naked-eye 3D display method and control system based on human eye tracking.
Background Art
Naked-eye 3D display technology allows a viewer to watch three-dimensional images directly with the naked eye, without wearing dedicated 3D glasses. It includes lenticular gratings, slit gratings, liquid crystal lenses, and so on; lenticular 3D display is currently the most widely used. Lenticular 3D display works by attaching a layer of specially made cylindrical lenses to the front of a conventional display screen. The image pixels under each cylindrical lens are divided into several sub-pixels, so that the lens projects each sub-pixel in a different direction. When the user watches 3D content, the left eye and right eye receive light emitted by different sub-pixels, so that the two eyes see different pictures, which the brain fuses into a 3D image.
However, because each cylindrical lens projects the pixel content separately to the left and right eyes, the light distribution is fixed when no eye tracking is performed, and the user must search front, back, left, and right for a suitable viewing position before an ideal stereoscopic effect can be seen. When the viewing position is unsuitable, light that should enter the left eye may enter the right eye; the right eye then sees both the left and right images, crosstalk easily occurs, and the user experience is poor.
One current solution to this problem is to track the eyes with a camera, establish two coordinate systems with the display-screen center and the camera position as origins, and calibrate the relationship between them. When tracking eye movement with the camera, the eye coordinates in the camera coordinate system are obtained and then further converted into the display-center coordinate system, so the solution requires two coordinate-system conversions. Moreover, because the camera captures two-dimensional images, a further conversion into a three-dimensional coordinate system is required to determine how the eyes move along the x, y, and z axes. In addition, one must calibrate the conversion between eye coordinates in the camera image and screen coordinates, the relationship between the two coordinate systems originating at the screen center and the camera position, and the relationships among the screen coordinates, the lenticular light-path distribution, and the layout parameters.
Therefore, when laying out naked-eye 3D content based on the movement of the eye position, the related technical solutions require many calibrated parameters, debugging efficiency is low, and this is unfavorable for the calibration needs of large numbers of devices.
Summary of the Invention
Embodiments of the present invention provide a naked-eye 3D display method and control system based on human eye tracking, which greatly simplify conventional eye-tracking methods and the eye-tracking-based adjustment of the layout content, and improve the 3D display effect.
In a first aspect, an embodiment of the present invention provides a naked-eye 3D display method based on human eye tracking, the method including:
acquiring the binocular midpoint coordinates of the viewer captured by a camera;
determining the movement direction of the viewer's eyes and the movement distance away from pre-calibrated preset reference coordinates by comparing the binocular midpoint coordinates with the preset reference coordinates; and
translating the display content of the display screen according to the movement direction and the movement distance, combined with a pre-calibrated initial binocular offset;
wherein the initial offset is the distance by which the axis of the camera's optical axis deviates from the center point of the layout period, taking the layout reference point as the starting point, when the binocular midpoint coordinates are located at the preset reference coordinates; the preset reference coordinates are the center position of the picture captured by the camera; the line segment that starts at the pixel coordinates of the camera in the coordinate system of the display screen and runs parallel to the laying direction of the grating film is the layout reference line, and any point on the layout reference line is a layout reference point.
In a second aspect, an embodiment of the present invention provides a naked-eye 3D control system based on human eye tracking, the control system including:
a binocular midpoint coordinate acquisition module configured to acquire the binocular midpoint coordinates of the viewer captured by a camera;
a movement distance determination module configured to determine the movement direction of the viewer's eyes and the movement distance away from pre-calibrated preset reference coordinates by comparing the binocular midpoint coordinates with the preset reference coordinates; and
a display content translation module configured to translate the display content of the display screen according to the movement direction and the movement distance, combined with a pre-calibrated initial binocular offset;
wherein the initial offset is the distance by which the axis of the camera's optical axis deviates from the center point of the layout period, taking the layout reference point as the starting point, when the binocular midpoint coordinates are located at the preset reference coordinates; the preset reference coordinates are the center position of the picture captured by the camera; the line segment that starts at the pixel coordinates of the camera in the coordinate system of the display screen and runs parallel to the laying direction of the grating film is the layout reference line, and any point on the layout reference line is a layout reference point.
In the technical solution of the embodiments of the present invention, by acquiring the binocular midpoint coordinates of the viewer captured by the camera and comparing them with the pre-calibrated preset reference coordinates, the movement direction of the viewer's eyes and the movement distance away from the preset reference coordinates can be determined, the preset reference coordinates being the center position of the picture captured by the camera. According to the movement direction and movement distance of the viewer's eyes, combined with the pre-calibrated initial binocular offset, the display content of the display screen can be translated. The initial offset is a constant calibrated with the camera before the display content of the display screen is processed; specifically, it is the distance by which the axis of the camera's optical axis deviates from the center point of the layout period, taking the layout reference point as the starting point, when the binocular midpoint coordinates are located at the preset reference coordinates.
By adopting the above technical solution, the display content of the screen can move correspondingly with the movement of the viewer's eyes. Within the visible distance range, the viewer need not deliberately search for or hold a viewing position. When the viewer's position changes, the display content moves accordingly, preventing the crosstalk caused when light that should enter the viewer's left eye enters the right eye and light that should enter the right eye enters the left eye, so that the viewer can easily experience an ideal naked-eye 3D effect. In addition, by setting the layout reference point to a pixel of the camera in the coordinate system of the display screen, the calibration process and the number of layout parameters are greatly simplified, simplifying the computation. By taking the center position of the camera's picture as the preset reference coordinates and calibrating the initial offset, eye tracking becomes faster and more accurate.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings used in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention; those of ordinary skill in the art can obtain other drawings based on the content of the embodiments and these drawings without creative effort.
FIG. 1a is a schematic flowchart of a naked-eye 3D display method based on human eye tracking according to Embodiment 1 of the present invention;
FIG. 1b is a schematic diagram of a layout period according to Embodiment 1 of the present invention;
FIG. 2 is a schematic flowchart of a naked-eye 3D display method based on human eye tracking according to Embodiment 2 of the present invention;
FIG. 3a is a schematic flowchart of a preferred naked-eye 3D display method based on human eye tracking according to Embodiment 3 of the present invention;
FIG. 3b is a schematic diagram of a layout manner according to Embodiment 3 of the present invention;
FIG. 3c is a schematic diagram of the stripes of a preset optical distribution map following the movement of the binocular center position according to Embodiment 3 of the present invention;
FIG. 4a is a schematic flowchart of still another preferred naked-eye 3D display method based on human eye tracking according to Embodiment 4 of the present invention;
FIG. 4b is a schematic diagram of the relationship between the binocular movement direction and the layout period width according to Embodiment 4 of the present invention;
FIG. 4c is a schematic diagram of a layout period adjustment manner according to Embodiment 4 of the present invention;
FIG. 5a is a schematic flowchart of still another preferred naked-eye 3D display method based on human eye tracking according to Embodiment 5 of the present invention;
FIG. 5b is a schematic diagram of the linear relationship between the binocular imaging distance and the layout period according to Embodiment 5 of the present invention;
FIG. 6 is a structural block diagram of a naked-eye 3D control system based on human eye tracking according to Embodiment 6 of the present invention.
Detailed Description
To make the technical problems solved, the technical solutions adopted, and the technical effects achieved by the present invention clearer, the technical solutions of the embodiments of the present invention are further described in detail below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
To describe the embodiments of the present invention clearly, the implementation principle of the present invention is first briefly introduced. The technical solution of this application can be applied to a display device equipped with a camera: the terminal device may have a built-in camera module, or the camera may be fixedly mounted on the display device. By comparing the binocular midpoint coordinates of the viewer captured by the camera with a preset optical distribution map pre-calibrated with that camera, the movement of the viewer's eyes relative to the screen can be converted into the movement of the binocular coordinates relative to the preset optical distribution map. Following the movement distance of the viewer's eyes, by calculating the movement distance of the stripes in the preset optical distribution map, the display content of the display device can be translated accordingly, so that the image seen by the viewer remains unchanged: the left-image data pass through the grating film into the viewer's left eye, the right-image data enter the viewer's right eye, and the fused left- and right-eye image data produce the naked-eye 3D display effect.
The content shown by the preset optical distribution map can represent the content displayed on the display screen; that is, the preset optical distribution map is the true embodiment of the display screen in two-dimensional space. Specifically, the preset optical distribution map is a pre-calibrated light-splitting spectrum that includes first stripes and second stripes arranged alternately at intervals. The inclination angle of the first or second stripes equals the inclination angle at which the grating film is actually laid relative to a preset direction (which may be the long or short side of the display screen), and the sum of the number of pixels per row of one first stripe and one second stripe is the stripe period. In the concrete computation, therefore, the preset optical distribution map can be represented by three parameters: the initial offset, the stripe inclination angle, and the stripe period. The stripe period corresponds to the layout period of the display content, and the boundary line between each first stripe and second stripe is the position of the center point of a layout period. To achieve naked-eye 3D display, the first and second stripes must enter the viewer's left and right eyes, respectively; in that case the binocular midpoint coordinates lie at the center point of the stripe period, i.e., on the boundary line between the first and second stripes. Thus, when the eyes move in space, translating the light-splitting spectrum correspondingly keeps the boundary line between the first and second stripes at the binocular midpoint coordinates, so that the light of the first and second stripes enters the viewer's left and right eyes, respectively, forming a naked-eye 3D effect while avoiding the crosstalk that the movement of the viewer's eyes would otherwise cause.
It should be noted that processing with the preset optical distribution map in this application presupposes that the camera's optical axis is nearly perpendicular to the display screen, with a deviation within 5 degrees. The benefit of this arrangement is that the stripes of the preset optical distribution are evenly distributed, which simplifies the calculation process.
Embodiment 1
FIG. 1a is a schematic flowchart of a naked-eye 3D display method based on human eye tracking according to Embodiment 1 of the present invention. The method of this embodiment may be executed by a naked-eye 3D control system, which may be implemented in software and/or hardware and can generally be integrated in the playback controller of the display content. As shown in FIG. 1a, the method may include steps S110 to S130.
S110: Acquire the binocular midpoint coordinates of the viewer captured by the camera.
The camera is preferably the front camera built into the naked-eye 3D playback device, but may also be an external camera fixedly mounted on the display device, communicatively connected to it and facing the viewing area of the display screen. For example, the display device may be a small display device such as a mobile phone, tablet, or laptop, or a large display device such as a large-screen television or advertising machine. The display device in this embodiment is preferably a 3D display device fitted with a grating film, configured to form the naked-eye 3D display effect. For example, the grating film may be a lenticular grating film or a slit grating film.
For example, when acquiring the binocular midpoint coordinates with the camera, an image containing the viewer's face may first be captured. A face recognition algorithm can identify the face image and the pupil coordinates of both eyes within it, from which the coordinates of the binocular midpoint in the image are obtained.
S120: Determine the movement direction of the viewer's eyes and the movement distance away from pre-calibrated preset reference coordinates by comparing the binocular midpoint coordinates with the preset reference coordinates.
The preset reference coordinates serve as the reference point for judging whether the position of the viewer's eyes has moved. To facilitate accurate eye tracking, this embodiment preferably sets the preset reference coordinates to the center position of the picture captured by the camera. By comparing the binocular midpoint coordinates with the preset reference coordinates, the movement direction of the viewer's eyes and the movement distance away from the preset reference coordinates can be determined.
Optionally, by periodically acquiring the images containing a face captured by the camera, the binocular midpoint coordinates can be obtained in real time, allowing more accurate tracking of eye movement.
S130: Translate the display content of the display screen according to the movement direction and the movement distance, combined with the pre-calibrated initial binocular offset.
The initial offset depends on the mounting position and mounting angle of the camera; once the camera's mounting position is fixed, the initial offset is determined accordingly. Therefore, before translating the display content of the display screen, the initial offset must be calibrated using the camera.
Before explaining the initial offset in detail, note that this embodiment preferably takes a pixel of the camera in the coordinate system of the display screen as the layout reference point, and the line segment through the layout reference point parallel to the laying direction of the grating film as the layout reference line; any point on the layout reference line is a layout reference point. Those skilled in the art will understand that the grating film may be laid perpendicular to the bottom edge of the display screen, or at a certain inclination angle to it.
The benefit of this arrangement is that when the display content is translated, the content displayed at the layout reference line does not change. When only the width of the layout period is changed (a process detailed in Embodiments 4 and 5), the light emitted by the content at the layout reference line after passing through the grating film is stationary in its spatial distribution. Because the width of the layout period varies only within a small range, the effect of the layout-period-width change on the stripe width in the optical distribution map can further be ignored.
It should also be noted that by taking a pixel of the camera in the coordinate system of the display screen as the layout reference point, the technical solution provided by this embodiment greatly simplifies the calibration process and the number of layout parameters compared with conventional eye tracking and layout methods.
After the position of the layout reference point is determined, the initial offset is preferably: the distance by which the axis of the camera's optical axis deviates from the center point of the layout period, taking the layout reference point as the starting point, when the binocular midpoint coordinates are located at the preset reference coordinates.
For example, the width of each layout period can be quantized to the range 0 to 1. The physical width of each layout period can be based on the width of the grating film, expressed concretely by the number of row pixels the grating film covers. FIG. 1b is a schematic diagram of a layout period according to Embodiment 1 of the present invention. As shown in FIG. 1b, the contents of multiple images are arranged in order within the 0-1 interval as one layout period, and the center point of the layout period is the position where the layout-period width equals 0.5.
For example, each layout period includes at least two images arranged in order within the width of the layout period, any two of which enter the viewer's left and right eyes respectively to form the naked-eye 3D effect.
For example, the initial offset can be calibrated while calibrating the preset optical distribution map with the camera. As introduced in the implementation principle above, the preset optical distribution map is the true reflection of the display screen in two-dimensional space, so one layout period of the display screen is one stripe period of the preset optical distribution map. Because the stripe period corresponds to the layout period, if the stripe-period width is also quantized to 0-1, then the first stripe spans 0-0.5 and the second stripe spans 0.5-1. Ideally, the axis of the camera's optical axis lies on the boundary line between the first and second stripes, i.e., at the position where the stripe width equals 0.5. In most practical cases, however, once the camera's position is fixed, the axis of the camera's optical axis does not lie on the boundary line during calibration of the preset optical distribution map, so the distance of the axis from the boundary line must first be calibrated: this is the calibrated initial offset. For example, if the horizontal distance of the camera's optical axis from the boundary line is 0.2, the initial offset is 0.2.
When the position of the viewer's eyes moves and the display content does not change, then as the eyes move, light that should enter the left eye may enter the right eye and light that should enter the right eye may enter the left eye, and crosstalk easily occurs while the user watches the content on the display screen. By tracking the movement of the eyes, this embodiment translates the display content of the display screen correspondingly to ensure that the displayed left-image content enters the left eye and the displayed right-image content enters the right eye, forming a good naked-eye 3D effect.
It should be noted that the viewer's eyes may move in any of the directions up, down, left, or right. According to the light-path distribution of the display content, since light passing through the grating behaves like pinhole imaging, the display content of the screen must be translated in the direction opposite to the movement of the eyes; for example, when the eyes move to the left, the display content must move to the right so that the content seen by the eyes remains unchanged.
This embodiment of the present invention provides a naked-eye 3D display method based on human eye tracking. By acquiring the binocular midpoint coordinates of the viewer captured by the camera and comparing them with pre-calibrated preset reference coordinates, the movement direction of the viewer's eyes and the movement distance away from the preset reference coordinates can be determined; the preset reference coordinates are the center position of the picture captured by the camera. According to the movement direction and movement distance of the viewer's eyes, combined with the pre-calibrated initial binocular offset, the display content of the display screen can be translated. The initial offset is a constant calibrated with the camera before the display content is processed, specifically the distance by which the axis of the camera's optical axis deviates from the center point of the layout period, taking the layout reference point as the starting point, when the binocular midpoint coordinates are located at the preset reference coordinates. By adopting this technical solution, the display content of the screen moves together with the movement of the viewer's eyes; within the visible distance range the viewer need not deliberately search for or hold a viewing position; when the viewer's position changes, the display content moves accordingly, preventing the crosstalk caused when light meant for the viewer's left eye enters the right eye and light meant for the right eye enters the left eye, so the viewer can easily experience an ideal naked-eye 3D effect. Moreover, setting the layout reference point to a camera pixel in the coordinate system of the display screen greatly simplifies the calibration process and the number of layout parameters compared with the technical solutions of the prior art, simplifying the computation; taking the center position of the camera's picture as the preset reference coordinates and calibrating the initial offset makes eye tracking faster and more accurate.
Embodiment 2
FIG. 2 is a schematic flowchart of a naked-eye 3D display method based on human eye tracking according to Embodiment 2 of the present invention. On the basis of Embodiment 1, this embodiment optimizes the step of determining the movement direction of the viewer's eyes and the movement distance away from the preset reference coordinates by comparing the binocular midpoint coordinates with the pre-calibrated preset reference coordinates. Referring to FIG. 2, the method of Embodiment 2 includes steps S210 to S240.
S210: Acquire the binocular midpoint coordinates of the viewer captured by the camera.
S220: Determine the inclination angle at which the grating film is laid relative to a preset direction.
The preset direction may be the direction of the bottom edge of the display screen (the x-axis direction) or the vertical direction of the side edge (the y-axis direction). For example, the grating film configured to achieve the naked-eye 3D effect may be laid on the outer surface of the display screen either vertically or at an inclination. If laid vertically, the inclination angle relative to the bottom edge of the display screen is 90 degrees; if laid at an inclination, the inclination angle of the grating film relative to the bottom edge can be obtained during the laying process.
For example, since the pre-calibrated preset optical distribution map can represent the light distribution of the display content, the inclination angle of the first and second stripes in the preset optical distribution map relative to the bottom edge of the display screen can represent the inclination angle of the grating film.
S230: According to the inclination angle, combined with the binocular midpoint coordinates and the preset reference coordinates, calculate the horizontal distance of the binocular midpoint coordinates from the preset reference coordinates.
For example, if the grating film is laid perpendicular to the bottom edge of the display screen and the viewer's eyes move up or down, light that should enter the left eye still enters the left eye and light that should enter the right eye still enters the right eye; that is, vertical movement of the viewer's eyes does not affect the naked-eye 3D display effect. If the viewer's eyes move left or right, the horizontal distance of the binocular midpoint coordinates from the preset reference coordinates can be calculated, and the display content can be translated in the direction opposite to the eyes according to the calculated horizontal distance, avoiding crosstalk as well as the dizziness caused by a wrong viewing position.
Optionally, if the grating film is laid at a certain inclination angle to the bottom edge of the display screen, then movement of the binocular midpoint coordinates relative to the preset reference coordinates in any direction — up, down, left, or right — can be converted into movement in the horizontal direction; that is, the crosstalk caused by binocular movement can be resolved by translating the display content of the display screen in the horizontal direction. The benefit of this arrangement is that the layout direction is unified to a set horizontal direction, which simplifies the computation and achieves the naked-eye 3D display effect.
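This horizontal-direction conversion can be sketched numerically. The function name and the sign convention are assumptions for illustration; the underlying idea is that with stripes tilted from the vertical, a vertical displacement dy crosses the same stripe boundaries as a horizontal displacement of dy·tan(tilt), so the two components can be folded into one horizontal distance:

```python
import math

def horizontal_distance(mid_xy, ref_xy, tilt_deg):
    """Fold an arbitrary (dx, dy) displacement of the binocular midpoint
    from the preset reference coordinates into an equivalent horizontal
    distance, for stripes tilted tilt_deg away from the vertical.
    The sign convention is illustrative; a real system would fix it
    during calibration of the optical distribution map."""
    dx = mid_xy[0] - ref_xy[0]   # horizontal component
    dy = mid_xy[1] - ref_xy[1]   # vertical component
    # dy crosses the same tilted stripes as dy * tan(tilt) of horizontal motion
    return dx + dy * math.tan(math.radians(tilt_deg))
```

With a vertical grating (tilt of 0 degrees), the vertical component drops out entirely, matching the observation above that up-down movement does not affect the display.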
S240: Translate the display content of the display screen according to the movement direction and the movement distance, combined with the pre-calibrated initial binocular offset.
On the basis of the above embodiment, by converting the movement distance of the viewer's eyes away from the preset reference coordinates into a movement distance in the horizontal direction, Embodiment 2 translates the display content in the horizontal direction, avoiding the crosstalk caused by the movement of the viewer's eyes, achieving the naked-eye 3D display effect, and improving the user experience.
Embodiment 3
FIG. 3a is a schematic flowchart of a preferred naked-eye 3D display method based on human eye tracking according to Embodiment 3 of the present invention. On the basis of the above embodiments, this embodiment refines the step of translating the display content of the display screen according to the movement direction and the movement distance, combined with the pre-calibrated initial binocular offset. Referring to FIG. 3a, the method of Embodiment 3 includes steps S310 to S360.
S310: Acquire the binocular midpoint coordinates of the viewer captured by the camera.
S320: Determine the inclination angle at which the grating film is laid relative to the preset direction.
S330: According to the inclination angle, combined with the binocular midpoint coordinates and the preset reference coordinates, calculate the horizontal distance of the binocular midpoint coordinates from the preset reference coordinates.
For example, FIG. 3b is a schematic diagram of a layout manner according to Embodiment 3 of the present invention. As shown in FIG. 3b, the grating film is laid at an inclination angle relative to the vertical direction of the display screen's side edge (the y-axis direction). Point C is the center position of the screen; if the pixel coordinates of the camera in the display-screen coordinate system replace the coordinates of point C, the display content can be laid out with the camera as the layout reference point.
Specifically, for any pixel A(x, y) on the display screen, the display content within the layout period can be determined by calculating the relationship between AD, the distance from A to the layout reference line, and the width of the layout period. Since the preset optical distribution map is the true embodiment of the display screen in two-dimensional space, for convenience of calculation the preset optical distribution map can stand in for the display screen, and the movement distance of the binocular position captured by the camera can be computed relative to the preset optical distribution map.
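A minimal sketch of this per-pixel mapping, assuming the layout reference line passes through the camera pixel and is tilted from the vertical; the helper name, parameters, and view-indexing convention are hypothetical illustrations, not the patent's implementation:

```python
import math

def subimage_index(x, y, cam_xy, tilt_deg, period, n_views, phase=0.0):
    """Decide which of n_views sub-images pixel (x, y) should display,
    based on its signed distance AD to the tilted layout reference line
    through the camera pixel cam_xy.  'phase' is the current translation
    amount (0..1 of one layout period), initially zero."""
    # Signed horizontal distance from the tilted reference line.
    ad = (x - cam_xy[0]) + (y - cam_xy[1]) * math.tan(math.radians(tilt_deg))
    pos = (ad / period + phase) % 1.0   # position within one period, 0..1
    return int(pos * n_views)           # e.g. 0 = left image, 1 = right image
```

Rendering then iterates this over every pixel (or sub-pixel) of the screen; shifting `phase` as the eyes move reassigns which columns feed the left and right images.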
Specifically, FIG. 3c is a schematic diagram of the stripes of a preset optical distribution map following the movement of the binocular center position according to Embodiment 3 of the present invention. As shown in FIG. 3c, Z is the axis of the camera's optical axis and point C is the binocular center position captured by the camera. To achieve the naked-eye 3D effect, as introduced in the above embodiments, point C must lie on the boundary line between the first and second stripes so that the first and second stripes enter the left and right eyes, respectively. Therefore, as the binocular midpoint position moves, the horizontal movement amount offset of the boundary line must be calculated.
S340: Calculate the quotient of the horizontal distance and the stripe period of the preset optical distribution map, and take the fractional part of the quotient.
As introduced in the implementation principle above, the preset optical distribution map is the true reflection of the display content of the display screen in two-dimensional space. The preset optical distribution map is a pre-calibrated light-splitting spectrum that includes first and second stripes arranged alternately at intervals; the inclination angle of the first or second stripes relative to the bottom edge of the display screen is the inclination angle of the grating film, and the sum of the number of pixels per row of one first stripe and one second stripe is the stripe period.
For example, if the movement distance of the binocular midpoint coordinates away from the preset reference coordinates is converted into a horizontal distance away from the preset reference coordinates, then by moving the layout content in the horizontal direction, the display content of the display screen can follow the change of the viewer's eye position, avoiding crosstalk and dizziness and improving the naked-eye 3D display effect. If the display content did not change, then when the eye position changes, light that should enter the left eye might enter the right eye and light that should enter the right eye might enter the left eye, producing crosstalk; while watching the display content, the user could easily experience dizziness because of the poor viewing position.
For example, to calculate offset, first assume that the axis Z of the camera's optical axis lies on the boundary line between the first and second stripes, and take that axis as the reference point. As shown in FIG. 3c, once the length BC and the horizontal width of one red-blue stripe pair are known, the offset value to be applied can be calculated. Taking two images as an example: the left and right images are arranged in order within one stripe period, i.e., the first stripe period holds first the left image and then the right image, and the second stripe period follows the same order, first the left image and then the right image, so the arrangement of the left and right images is identical in every stripe period. It is therefore sufficient to calculate the horizontal movement amount offset of the boundary line as a fraction of the whole stripe period, with the formula:
offset = fractional part of (BC / period);
where period is one stripe period, and the length of BC can be calculated from the pre-calibrated inclination angle of the stripes.
S350: Add the calculated fractional part to the initial offset to obtain the translation amount of the current layout content relative to the layout reference point.
Step S340 computed offset under the assumption that the axis Z of the camera's optical axis lies exactly on the boundary line between the first and second stripes. As shown in FIG. 3c, since point Z does not actually lie on the boundary line, the value of the initial offset must be added to the boundary-line offset to obtain the translation amount of the current layout content relative to the layout reference point.
S360: Translate the display content of the display screen according to the translation amount.
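Steps S340 to S350 reduce to two lines of arithmetic once the horizontal distance is known; a hedged numeric sketch follows, with illustrative names and a 0..1 quantized period as described in Embodiment 1:

```python
def translation_amount(horizontal_dist, period, initial_offset):
    """S340: fractional part of (horizontal distance / stripe period).
    S350: add the pre-calibrated initial offset, wrapped back into one
    period.  The result (0..1) is the translation of the current layout
    content relative to the layout reference point (S360 applies it)."""
    frac = (horizontal_dist / period) % 1.0       # S340
    return (frac + initial_offset) % 1.0          # S350
```

For instance, with a 100-pixel stripe period, a horizontal distance of 130 pixels and an initial offset of 0.2 would give a translation of half a period.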
This embodiment further refines the above embodiments. By comparing the position of the binocular midpoint coordinates captured by the camera with the preset optical distribution map calibrated with that camera, the movement of the binocular center position relative to the screen is converted into the movement of the binocular center coordinates relative to the preset optical distribution map. By calculating the horizontal movement distance of the stripes in the optical distribution map, the content displayed on the display screen translates correspondingly as the viewer's eyes move, ensuring that light that should enter the left eye still enters the left eye and light that should enter the right eye still enters the right eye, achieving the naked-eye 3D display effect and avoiding crosstalk. Using the preset optical distribution map simplifies the calculation method and improves the accuracy of the layout.
Embodiment 4
FIG. 4a is a schematic flowchart of still another preferred naked-eye 3D display method based on human eye tracking according to Embodiment 4 of the present invention. This embodiment optimizes the above embodiments, mainly addressing how the display content is laid out when the viewer moves toward or away from the screen. Referring to FIG. 4a, the method of this embodiment further includes steps S410 to S430.
S410: Acquire the first binocular position coordinates and the second binocular position coordinates of the viewer in images captured by the camera at a first time and a second time, respectively.
S420: Determine the movement direction of the viewer according to the relationship between the first binocular position coordinates and the second binocular position coordinates.
For example, most current optical cameras can determine the distance of a subject from the camera from the size of its image in the captured picture; since the camera is mounted on the display screen, the distance of the subject from the display screen can be determined. In this embodiment, by acquiring the first and second binocular position coordinates of the viewer in the images captured at the first and second times and comparing their sizes, the distance between the eyes and the screen can be determined, and hence the movement direction of the viewer's eyes.
S430: Determine, according to the movement direction, the manner of adjusting the width of the display screen's layout period.
The relationship between the movement direction of the eyes and the layout period width of the display screen is first introduced:
The eyes may move forward from the current position toward the display screen, or backward away from it. FIG. 4b is a schematic diagram of the relationship between the binocular movement direction and the layout period width according to Embodiment 4 of the present invention. As shown in FIG. 4b, ideally, when the eye is at point A, to ensure that the content displayed at point O is seen through adjacent cylindrical lenses, the width of the layout period should be the length of line segment EF; when the eye is at point B, to see the content displayed at point O through the two adjacent cylindrical lenses, the width of the layout period should be the length of line segment DF. Clearly, the width of the layout period is related to the viewing distance; by determining the distance between the viewer's eyes and the display screen, the width of the layout period can be determined.
For example, when the movement direction is from the current position toward the display screen, the width of the display screen's layout period is increased. For example, as shown in FIG. 4b, when the viewer's binocular midpoint coordinates move from point B to point A, the width of the layout period increases from DF to EF.
When the movement direction is from the current position away from the display screen, the width of the display screen's layout period is decreased. For example, as shown in FIG. 4b, when the viewer's binocular midpoint coordinates move from point A to point B, the width of the layout period decreases from EF to DF.
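A minimal sketch of this direction test, using the binocular span measured in the image at the two moments as the proxy for distance (a larger span means the viewer is closer). The fixed adjustment step is a hypothetical simplification; Embodiment 5 replaces it with a calibrated linear formula:

```python
def adjust_period_width(span_t1, span_t2, width, step):
    """Compare the binocular image spans at two moments and nudge the
    layout period width: wider when the viewer approaches the screen,
    narrower when the viewer recedes.  'step' is an illustrative tuning
    increment, not a value from the patent."""
    if span_t2 > span_t1:        # viewer moved toward the screen
        return width + step      # e.g. DF -> EF in FIG. 4b
    if span_t2 < span_t1:        # viewer moved away from the screen
        return width - step      # e.g. EF -> DF in FIG. 4b
    return width                 # no front-back movement detected
```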
Specifically, FIG. 4c is a schematic diagram of a layout-period adjustment manner according to Embodiment 4 of the present invention, showing an adjustment that increases the width of the layout period. As shown in FIG. 4c, when the layout period width is adjusted, the positions of all points on the layout reference line AB do not change, while the content displayed at other points shifts correspondingly. Therefore, as described in Embodiment 1, if the pixel coordinates of the camera in the display screen are taken as the layout reference point and the line segment through the layout reference point parallel to the laying direction of the grating film is taken as the layout reference line, then when the width of the layout period changes, the light emitted by the display content at the layout reference line after passing through the grating film remains static in its spatial distribution, improving the accuracy of the layout.
This embodiment optimizes the above embodiments: by tracking the movement direction of the viewer's eyes with the camera, the width of the display screen's layout period can be adjusted correspondingly, so that the display content of the display screen changes with the position of the viewer's eyes, avoiding crosstalk and improving the user's viewing experience.
Embodiment 5
FIG. 5a is a schematic flowchart of still another preferred naked-eye 3D display method based on human eye tracking according to Embodiment 5 of the present invention. This embodiment refines the above embodiments. Referring to FIG. 5a, the method of this embodiment further includes steps S510 to S520.
S510: Control the camera to recognize the binocular imaging distance of the viewer in the captured image, and acquire the binocular imaging distance.
For example, after the camera captures an image containing the viewer's face, a face recognition algorithm can identify the face image and the pupil coordinates of both eyes within it, and thereby determine the binocular imaging distance in the face image.
S520: Adjust the width of the layout period of the display screen's layout content according to the binocular imaging distance.
Those skilled in the art will understand that the binocular imaging distance in the face captured by the camera is inversely related to the actual distance from the eyes to the screen: the farther the eyes are from the display screen, the smaller the binocular imaging distance. The actual distance from the eyes to the display screen is also inversely related to the width of the display screen's layout period: the farther the eyes are from the display screen, the smaller the layout period. Composing these two inverse relations, the binocular imaging distance in the camera's captured image is linearly proportional to the width of the layout period.
For example, FIG. 5b is a schematic diagram of the linear relationship between the binocular imaging distance and the layout period according to Embodiment 5 of the present invention. As shown in FIG. 5b, the larger the binocular imaging distance captured by the camera, the wider the layout period.
For example, the adjusted layout period width y can be calculated by a preset linear formula:
y = kx + b;
where x is the viewer's binocular imaging distance, y is the width of the layout period, and k and b are pre-calibrated constant coefficients. With this linear relationship, when the camera captures an image containing the viewer's eyes, the width of the layout period can be determined directly from the binocular imaging distance recognized in the image, avoiding repeated conversions between the viewer's distance from the screen and the layout period width, reducing the number of calibration parameters, and improving the efficiency and accuracy of tracking the eye position and adjusting the layout period width accordingly.
It should be noted that the above linear relationship requires calibrating the values of k and b. In general, the actual distance between a viewer's eyes is 6.5 cm. In a concrete implementation, this 6.5 cm interpupillary distance can serve as the basis for the calibration parameters, from which the values of k and b are obtained. To accommodate more viewers' eyes, a conversion factor can be obtained by dividing 6.5 by the actual distance between the viewer's eyes; multiplying this factor by the binocular imaging distance recognized by the camera yields a binocular imaging distance applicable to all viewers.
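The linear formula together with the 6.5 cm normalization can be sketched as follows; the values of k, b, and the pixel distances are placeholders to be replaced by calibrated constants:

```python
def period_width(imaging_dist_px, k, b, actual_ipd_cm=6.5):
    """Embodiment 5: layout period width y = k * x + b, where x is the
    binocular imaging distance in the captured image.  k and b are
    assumed calibrated for a 6.5 cm interpupillary distance; a viewer
    with a different IPD is normalized onto that basis by the
    conversion factor 6.5 / actual_ipd_cm."""
    x = imaging_dist_px * (6.5 / actual_ipd_cm)  # normalize to 6.5 cm basis
    return k * x + b
```

For example, a viewer with a 13 cm-equivalent imaging baseline (hypothetical numbers) would have their measured distance halved before the linear formula is applied, so the same calibration serves all viewers.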
It is worth noting that the up-down/left-right tracking and the front-back tracking of the viewer's eyes may be performed simultaneously or in sequence. Preferably, when the viewer moves simultaneously in the horizontal direction parallel to the display screen and in the direction perpendicular to it, the distance of the binocular midpoint coordinates from the preset reference coordinates can be decomposed into the horizontal and vertical directions, and the solutions provided by the above embodiments can then be used to compute separately the horizontal offset of the display content and the layout period width to be adjusted, so that the display content accurately tracks the viewer's position, achieving ideal naked-eye 3D display and enhancing the user experience.
This embodiment refines the above embodiments: by determining the linear relationship between the binocular imaging distance and the layout period width, when the viewer's eye position changes, the width of the display screen's layout period is determined directly from the binocular imaging distance captured by the camera, reducing the number of parameters calibrated during the layout process. Following the change in the viewer's eye position, adjusting the width of the layout period changes the light entering the viewer's eyes accordingly, avoiding crosstalk and improving the user's viewing experience.
实施例六
图6为本发明实施例六提供的一种基于人眼跟踪的裸眼3D控制***的结构框图。该控制***可由软件和/或硬件实现,一般控制显示屏内容播放的播放控制器中,设置为执行上述任意实施例所提供的基于人眼跟踪的裸眼3D显示方法。如图6所示,该控制***包括:双眼中点坐标获取模块610、移动距离确定模块620和显示内容平移模块630。
双眼中点坐标获取模块610,设置为获取摄像头拍摄到的观看者的双眼中点坐标。移动距离确定模块620,设置为通过比较所述双眼中点坐标与预先标定的预设基准坐标,确定所述观看者双眼的移动方向以及偏离所述预设基准坐标的移动距离。显示内容平移模块630,设置为根据所述移动方向和所述移动距离,同时结合预先标定的双眼的起始偏移量,对显示屏的显示内容进行平移处理;其中,所述起始偏移量为当所述双眼中点坐标位于所述预设基准坐标时,所述摄像头光轴的轴心偏离以排图基准点作为起点的排图周期中心点的距离;其中,所述预设基准坐标为所述摄像头拍摄图片的中心位置;以所述摄像头相对于所述显示屏所在坐标系中的像素坐标为起点,平行于所述光栅膜布设方向的线段为排图基准线,所述排图基准线上的任意一点均为所述排图基准点。
本发明实施例提供了一种基于人眼跟踪的裸眼3D控制***,本发明实施例的技术方案中,通过获取摄像头拍摄到的观看者的双眼中点坐标,并将双眼中点坐标与预先标定的预设基准坐标作比较,可以确定观看者双眼的移动方向以及偏离预设基准坐标的移动距离。其中,预设基准坐标为摄像头拍摄图片的中 心位置。根据观看者双眼的移动方向和移动距离,同时结合预先标定的双眼的起始偏移量,可对显示屏的显示内容进行平移处理。其中,起始偏移量为在对显示屏的显示内容进行处理之前,利用摄像头标定的常量,具体为当双眼中点坐标位于预设基准坐标时,摄像头光轴的轴心偏离以排图基准点作为起点的排图周期中心点的距离。通过采用上述技术方案,可以使得显示屏幕的显示内容跟随观看者双眼的移动进行相应的移动。观看者在可视距离范围内,无需刻意寻找观看位置或保持观看位置。当观看者的位置发生改变时,由于显示内容也相应的发生移动,避免了本应进入观看者左眼的光线进入右眼,而本应进入观看者右眼的光线进入左眼所导致的串扰现象,使得观看者可以轻松体验理想的裸眼3D效果。此外,通过将排图基准点设置为以摄像头相对于显示屏所在坐标系中的像素点,相对于现有技术提供的技术方案,极大的简化了参数的标定过程和数量,简化了计算。通过将摄像头拍摄照片的中心位置作为预设基准坐标,并通过标定起始偏移量,可以使得人眼跟踪更加快速,准确。
On the basis of the above embodiments, each layout period contains at least two images arranged in sequence within the width of the layout period, and any two of the at least two images enter the viewer's left eye and right eye respectively.
On the basis of the above embodiments, the movement distance determination module 620 is specifically configured to: determine the inclination angle at which the grating film is laid relative to a preset direction; and compute, from the inclination angle together with the eye-midpoint coordinates and the preset reference coordinates, the horizontal distance by which the eye-midpoint coordinates deviate from the preset reference coordinates.
On the basis of the above embodiments, the display content translation module 630 is specifically configured to: compute the quotient of the horizontal distance and the stripe period of a preset optical distribution map, and take the fractional part of the quotient, where the preset optical distribution map is a pre-calibrated light-splitting spectrum comprising first stripes and second stripes arranged alternately, the inclination angle of the first stripes and of the second stripes is the inclination angle of the grating film, and the total number of pixels per row in each first stripe plus each second stripe is the stripe period;
add the computed fractional part to the initial offset to obtain the translation amount of the current layout content relative to the layout reference point; and
translate the display content of the display screen according to the translation amount.
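The three steps performed by the translation module can be sketched as follows. Units are assumed to be pixels throughout; the text does not specify how negative horizontal distances wrap, so Python's non-negative `%` remainder is used here as one reasonable choice:

```python
def content_translation(horizontal_distance, stripe_period, initial_offset):
    """Translation amount of the current layout content relative to the
    layout reference point: the fractional part of
    (horizontal distance / stripe period), plus the pre-calibrated
    initial offset."""
    fractional = (horizontal_distance / stripe_period) % 1.0  # sub-period part only
    return fractional + initial_offset
```

Taking only the fractional part discards whole stripe periods, which leave the light-splitting pattern unchanged; only the sub-period remainder shifts the layout.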
On the basis of the above embodiments, the control system further includes: an eye position acquisition module, configured to acquire first and second eye position coordinates of the viewer from images captured by the camera at a first time and a second time respectively; a movement direction determination module, configured to determine the viewer's movement direction from the relation between the first eye position coordinates and the second eye position coordinates; and a width adjustment mode determination module, configured to determine, according to the movement direction, the manner of adjusting the width of the display screen's layout period.
On the basis of the above embodiments, the width adjustment mode determination module is specifically configured to: take the first eye position coordinates as the current position of the viewer's eyes;
when the movement direction is from the current position toward the display screen, increase the width of the display screen's layout period; and
when the movement direction is from the current position away from the display screen, decrease the width of the display screen's layout period.
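A sketch of this direction-dependent adjustment; the step size is an illustrative placeholder, since in practice the new width would come from the linear relation of Embodiment 5:

```python
def adjusted_period_width(current_width, moving_closer, step=0.5):
    """Widen the layout period when the viewer moves toward the display
    screen, and narrow it when the viewer moves away from it."""
    return current_width + step if moving_closer else current_width - step
```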
On the basis of the above embodiments, the control system further includes: an imaged interocular distance acquisition module, configured to control the camera to identify the viewer's imaged interocular distance in a captured image and to acquire the imaged interocular distance; and a layout period width adjustment module, configured to adjust the width of the layout period of the display screen's layout content according to the imaged interocular distance.
On the basis of the above embodiments, the layout period width adjustment module is specifically configured to compute the adjusted layout-period width y with a preset linear formula:
y = kx + b;
where x is the viewer's imaged interocular distance, y denotes the width of the layout period, and k and b are both pre-calibrated constant coefficients.
On the basis of the above embodiments, the grating film is a lenticular grating film or a slit grating film.
An embodiment of the present invention further provides an electronic device. The electronic device includes at least one processor and a memory connected to the at least one processor; the memory stores instructions executable by the at least one processor, and the instructions, when executed by the at least one processor, cause the at least one processor to perform the human-eye-tracking-based naked-eye 3D display method of the above embodiments.
An embodiment of the present invention further provides a non-transitory storage medium storing computer-executable instructions configured to perform the above human-eye-tracking-based naked-eye 3D display method.
An embodiment of the present invention further provides a computer program product comprising a computer program stored on a non-transitory computer-readable storage medium; the computer program comprises program instructions which, when executed by a computer, cause the computer to perform the above human-eye-tracking-based naked-eye 3D display method.
The human-eye-tracking-based naked-eye 3D control system provided by the embodiments of the present invention can perform the human-eye-tracking-based naked-eye 3D display method provided by any embodiment of the present invention, and possesses the functional modules and beneficial effects corresponding to that method. For technical details not described exhaustively above, reference may be made to the human-eye-tracking-based naked-eye 3D display method provided by any embodiment of the present invention.
Note that the above are merely preferred embodiments of the present invention and the technical principles applied. Those skilled in the art will understand that the present invention is not limited to the specific embodiments described herein, and that various obvious changes, readjustments, and substitutions can be made without departing from the scope of protection of the present invention. Therefore, although the present invention has been described in some detail through the above embodiments, it is not limited to them; without departing from the concept of the present invention it may include many other equivalent embodiments, and its scope is determined by the appended claims.

Claims (11)

  1. A human-eye-tracking-based naked-eye 3D display method, applied to a display screen provided with a grating film, comprising:
    acquiring the midpoint coordinates of a viewer's eyes captured by a camera;
    determining, by comparing the eye-midpoint coordinates with pre-calibrated preset reference coordinates, the direction in which the viewer's eyes have moved and the distance by which they deviate from the preset reference coordinates; and
    translating the display content of the display screen according to the movement direction and the movement distance, in combination with a pre-calibrated initial offset of the eyes;
    wherein the initial offset is the distance by which the axis of the camera's optical axis deviates from the center point of the layout period that starts at a layout reference point when the eye-midpoint coordinates coincide with the preset reference coordinates; the preset reference coordinates are the center position of the picture captured by the camera; a line segment starting at the camera's pixel coordinates in the coordinate system of the display screen and parallel to the direction in which the grating film is laid is the layout reference line, and any point on the layout reference line is a layout reference point.
  2. The method according to claim 1, wherein each layout period contains at least two images arranged in sequence within the width of the layout period, and any two of the at least two images enter the viewer's left eye and right eye respectively.
  3. The method according to claim 1, wherein determining, by comparing the eye-midpoint coordinates with the pre-calibrated preset reference coordinates, the direction in which the viewer's eyes have moved and the distance by which they deviate from the preset reference coordinates comprises:
    determining the inclination angle at which the grating film is laid relative to a preset direction; and
    computing, from the inclination angle together with the eye-midpoint coordinates and the preset reference coordinates, the horizontal distance by which the eye-midpoint coordinates deviate from the preset reference coordinates.
  4. The method according to claim 3, wherein translating the display content of the display screen according to the movement direction and the movement distance, in combination with the pre-calibrated initial offset of the eyes, comprises:
    computing the quotient of the horizontal distance and the stripe period of a preset optical distribution map, and taking the fractional part of the quotient, wherein the preset optical distribution map is a pre-calibrated light-splitting spectrum comprising first stripes and second stripes arranged alternately, the inclination angle of the first stripes and of the second stripes is the inclination angle at which the grating film is laid relative to the preset direction, and the total number of pixels per row in each first stripe plus each second stripe is the stripe period;
    adding the computed fractional part to the initial offset to obtain the translation amount of the current layout content relative to the layout reference point; and
    translating the display content of the display screen according to the translation amount.
  5. The method according to claim 1, further comprising:
    acquiring first and second eye position coordinates of the viewer from images captured by the camera at a first time and a second time respectively;
    determining the viewer's movement direction from the relation between the first eye position coordinates and the second eye position coordinates; and
    determining, according to the movement direction, the manner of adjusting the width of the display screen's layout period.
  6. The method according to claim 5, wherein determining, according to the movement direction, the manner of adjusting the width of the display screen's layout period comprises:
    taking the first eye position coordinates as the current position of the viewer's eyes;
    when the movement direction is from the current position toward the display screen, increasing the width of the display screen's layout period; and
    when the movement direction is from the current position away from the display screen, decreasing the width of the display screen's layout period.
  7. The method according to claim 5, further comprising:
    controlling the camera to identify the viewer's imaged interocular distance in a captured image, and acquiring the imaged interocular distance; and
    adjusting the width of the layout period of the display screen's layout content according to the imaged interocular distance.
  8. The method according to claim 7, wherein adjusting the width of the layout period of the display screen's layout content according to the imaged interocular distance comprises:
    computing the adjusted layout-period width y with a preset linear formula:
    y = kx + b;
    where x is the viewer's imaged interocular distance, y denotes the width of the layout period, and k and b are both pre-calibrated constant coefficients.
  9. The method according to claim 1, wherein:
    the grating film is a lenticular grating film or a slit grating film.
  10. A human-eye-tracking-based naked-eye 3D control system, comprising:
    an eye-midpoint coordinate acquisition module, configured to acquire the midpoint coordinates of a viewer's eyes captured by a camera;
    a movement distance determination module, configured to determine, by comparing the eye-midpoint coordinates with pre-calibrated preset reference coordinates, the direction in which the viewer's eyes have moved and the distance by which they deviate from the preset reference coordinates; and
    a display content translation module, configured to translate the display content of a display screen according to the movement direction and the movement distance, in combination with a pre-calibrated initial offset of the eyes;
    wherein the initial offset is the distance by which the axis of the camera's optical axis deviates from the center point of the layout period that starts at a layout reference point when the eye-midpoint coordinates coincide with the preset reference coordinates; the preset reference coordinates are the center position of the picture captured by the camera; a line segment starting at the camera's pixel coordinates in the coordinate system of the display screen and parallel to the direction in which the grating film is laid is the layout reference line, and any point on the layout reference line is a layout reference point.
  11. The control system according to claim 10, further comprising:
    an imaged interocular distance acquisition module, configured to control the camera to identify the viewer's imaged interocular distance in a captured image and to acquire the imaged interocular distance; and
    a layout period width adjustment module, configured to adjust the width of the layout period of the display screen's layout content according to the imaged interocular distance.
PCT/CN2017/116194 2017-10-23 2017-12-14 Human-eye-tracking-based naked-eye 3D display method and control system WO2019080295A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710991468.5 2017-10-23
CN201710991468.5A CN107885325B (zh) 2017-10-23 2017-10-23 Human-eye-tracking-based naked-eye 3D display method and control system

Publications (1)

Publication Number Publication Date
WO2019080295A1 true WO2019080295A1 (zh) 2019-05-02

Family

ID=61782031

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/116194 WO2019080295A1 (zh) 2017-10-23 2017-12-14 Human-eye-tracking-based naked-eye 3D display method and control system

Country Status (2)

Country Link
CN (1) CN107885325B (zh)
WO (1) WO2019080295A1 (zh)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110874585A (zh) * 2019-11-28 2020-03-10 西安云海信息技术有限责任公司 一种基于注意力区域的窥视类作弊行为识别方法
CN112532964A (zh) * 2020-11-30 2021-03-19 深圳英伦科技股份有限公司 图像处理方法、设备、装置及可读存储介质
CN112698515A (zh) * 2020-04-18 2021-04-23 彭昊 一种裸眼三维显示成像方法及光场模拟器
CN112767317A (zh) * 2020-12-31 2021-05-07 上海易维视科技有限公司 裸眼3d显示器光栅膜检测方法
CN113079364A (zh) * 2021-03-24 2021-07-06 纵深视觉科技(南京)有限责任公司 一种静态对象的立体显示方法、装置、介质及电子设备
CN113467602A (zh) * 2020-03-31 2021-10-01 ***通信集团浙江有限公司 Vr显示方法及***
CN113660480A (zh) * 2021-08-16 2021-11-16 纵深视觉科技(南京)有限责任公司 一种环视功能实现方法、装置、电子设备及存储介质
CN113938669A (zh) * 2021-10-15 2022-01-14 纵深视觉科技(南京)有限责任公司 一种基于oled显示屏的裸眼3d显示方法、装置、设备及介质
CN114302130A (zh) * 2021-12-06 2022-04-08 嘉兴智瞳科技有限公司 一种智能显微手术成像装置控制方法及***
CN114356273A (zh) * 2021-12-30 2022-04-15 纵深视觉科技(南京)有限责任公司 一种独立驱动位置确定方法、装置、存储介质及电子设备
CN114390267A (zh) * 2022-01-11 2022-04-22 宁波视睿迪光电有限公司 立体图像数据合成方法、装置、电子设备及存储介质
WO2022143426A1 (zh) * 2020-12-31 2022-07-07 华为技术有限公司 会议装置和会议***
CN115113416A (zh) * 2022-07-22 2022-09-27 吉林省钜鸿智能技术有限公司 一种户外裸眼3d显示屏
WO2023036218A1 (zh) * 2021-09-08 2023-03-16 未来科技(襄阳)有限公司 视点宽度的确定方法及其装置
WO2024022086A1 (zh) * 2022-07-29 2024-02-01 京东方科技集团股份有限公司 基于三维显示的视频通信方法及***

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108962182A (zh) * 2018-06-15 2018-12-07 广东康云多维视觉智能科技有限公司 基于眼球追踪的三维图像显示装置及其实现方法
CN108881893A (zh) * 2018-07-23 2018-11-23 上海玮舟微电子科技有限公司 基于人眼跟踪的裸眼3d显示方法、装置、设备和介质
CN108896982A (zh) * 2018-08-01 2018-11-27 上海玮舟微电子科技有限公司 基于Hud***的出入屏距离测量方法及***
CN108989785B (zh) * 2018-08-22 2020-07-24 张家港康得新光电材料有限公司 基于人眼跟踪的裸眼3d显示方法、装置、终端和介质
CN109104603B (zh) * 2018-09-25 2020-11-03 张家港康得新光电材料有限公司 一种视点补偿方法、装置、电子设备和存储介质
CN109360504B (zh) * 2018-12-04 2021-06-11 深圳奇屏科技有限公司 一种成像距离可调整的3d-led大屏幕
CN109933189A (zh) * 2018-12-28 2019-06-25 惠州Tcl移动通信有限公司 动态桌面布局方法、显示设备及计算机存储介质
CN109688403A (zh) * 2019-01-25 2019-04-26 广州杏雨信息科技有限公司 一种应用于手术室内的裸眼3d人眼追踪方法及其设备
CN109674629A (zh) * 2019-01-31 2019-04-26 河南云睛视光科技有限公司 一种云数据计算方式的视光训练方法
CN109602587A (zh) * 2019-01-31 2019-04-12 河南云睛视光科技有限公司 一种智能化多功能视光训练仪
CN111679444A (zh) * 2019-03-05 2020-09-18 邓云天 可增大分光角度的裸眼3d光栅
CN112114440A (zh) * 2019-06-20 2020-12-22 格相科技(北京)有限公司 手机屏保膜实现3d显示的方法
CN110381305B (zh) * 2019-07-31 2021-06-01 南方医科大学南方医院 裸眼3d的去串扰方法、***、存储介质及电子设备
CN111918052B (zh) * 2020-08-14 2022-08-23 广东申义实业投资有限公司 竖直旋转式控制装置及平面图片转3d图像处理方法
WO2022228451A1 (zh) * 2021-04-30 2022-11-03 纵深视觉科技(南京)有限责任公司 基于人眼跟踪的显示处理方法、装置、设备及存储介质
CN113867526A (zh) * 2021-09-17 2021-12-31 纵深视觉科技(南京)有限责任公司 一种基于人眼追踪的优化显示方法、装置、设备及介质
CN113891061B (zh) * 2021-11-19 2022-09-06 深圳市易快来科技股份有限公司 一种裸眼3d显示方法及显示设备
CN114827578B (zh) * 2022-05-20 2024-06-07 庞通 裸眼3d实现方法、装置及存储介质
CN116898704B (zh) * 2023-07-13 2023-12-26 广州视景医疗软件有限公司 一种基于vr的视标物调整方法和装置

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8704879B1 (en) * 2010-08-31 2014-04-22 Nintendo Co., Ltd. Eye tracking enabling 3D viewing on conventional 2D display
CN105072431A (zh) * 2015-07-28 2015-11-18 上海玮舟微电子科技有限公司 Human-eye-tracking-based naked-eye 3D playback method and system
CN106713894A (zh) * 2015-11-17 2017-05-24 深圳超多维光电子有限公司 Tracking-type stereoscopic display method and device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101316795B1 (ko) * 2012-02-02 2013-10-11 한국과학기술연구원 Autostereoscopic three-dimensional image display device for viewing-zone flattening and dynamic crosstalk minimization
CN103760980A (zh) * 2014-01-21 2014-04-30 Tcl集团股份有限公司 Display method, system and display device with dynamic adjustment according to the positions of the two eyes
CN107249128B (zh) * 2017-06-23 2020-04-03 深圳超多维科技有限公司 Camera correction method and apparatus
CN107172417B (zh) * 2017-06-30 2019-12-20 深圳超多维科技有限公司 Image display method, apparatus and system for a naked-eye 3D screen


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"A Stereo Camera Tracking Algorithm Used in Glass-free Stereoscopic Display System", JOURNAL OF COMPUTER-AIDED DESIGN & COMPUTER GRAPHICS, vol. 29, no. 3 *

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110874585A (zh) * 2019-11-28 2020-03-10 西安云海信息技术有限责任公司 一种基于注意力区域的窥视类作弊行为识别方法
CN110874585B (zh) * 2019-11-28 2023-04-18 西安云海信息技术有限责任公司 一种基于注意力区域的窥视类作弊行为识别方法
CN113467602A (zh) * 2020-03-31 2021-10-01 ***通信集团浙江有限公司 Vr显示方法及***
CN113467602B (zh) * 2020-03-31 2024-03-19 ***通信集团浙江有限公司 Vr显示方法及***
CN112698515B (zh) * 2020-04-18 2023-08-22 彭昊 一种裸眼三维显示成像方法及光场模拟器
CN112698515A (zh) * 2020-04-18 2021-04-23 彭昊 一种裸眼三维显示成像方法及光场模拟器
CN112532964A (zh) * 2020-11-30 2021-03-19 深圳英伦科技股份有限公司 图像处理方法、设备、装置及可读存储介质
WO2022143426A1 (zh) * 2020-12-31 2022-07-07 华为技术有限公司 会议装置和会议***
CN112767317A (zh) * 2020-12-31 2021-05-07 上海易维视科技有限公司 裸眼3d显示器光栅膜检测方法
CN113079364A (zh) * 2021-03-24 2021-07-06 纵深视觉科技(南京)有限责任公司 一种静态对象的立体显示方法、装置、介质及电子设备
CN113660480B (zh) * 2021-08-16 2023-10-31 纵深视觉科技(南京)有限责任公司 一种环视功能实现方法、装置、电子设备及存储介质
CN113660480A (zh) * 2021-08-16 2021-11-16 纵深视觉科技(南京)有限责任公司 一种环视功能实现方法、装置、电子设备及存储介质
WO2023036218A1 (zh) * 2021-09-08 2023-03-16 未来科技(襄阳)有限公司 视点宽度的确定方法及其装置
CN113938669A (zh) * 2021-10-15 2022-01-14 纵深视觉科技(南京)有限责任公司 一种基于oled显示屏的裸眼3d显示方法、装置、设备及介质
CN113938669B (zh) * 2021-10-15 2024-06-04 深显科技(南京)有限责任公司 一种基于oled显示屏的裸眼3d显示方法、装置、设备及介质
CN114302130A (zh) * 2021-12-06 2022-04-08 嘉兴智瞳科技有限公司 一种智能显微手术成像装置控制方法及***
CN114356273A (zh) * 2021-12-30 2022-04-15 纵深视觉科技(南京)有限责任公司 一种独立驱动位置确定方法、装置、存储介质及电子设备
CN114356273B (zh) * 2021-12-30 2023-10-17 纵深视觉科技(南京)有限责任公司 一种独立驱动位置确定方法、装置、存储介质及电子设备
CN114390267A (zh) * 2022-01-11 2022-04-22 宁波视睿迪光电有限公司 立体图像数据合成方法、装置、电子设备及存储介质
CN115113416B (zh) * 2022-07-22 2023-08-25 吉林省钜鸿智能技术有限公司 一种户外裸眼3d显示屏
CN115113416A (zh) * 2022-07-22 2022-09-27 吉林省钜鸿智能技术有限公司 一种户外裸眼3d显示屏
WO2024022086A1 (zh) * 2022-07-29 2024-02-01 京东方科技集团股份有限公司 基于三维显示的视频通信方法及***

Also Published As

Publication number Publication date
CN107885325A (zh) 2018-04-06
CN107885325B (zh) 2020-12-08

Similar Documents

Publication Publication Date Title
WO2019080295A1 (zh) Human-eye-tracking-based naked-eye 3D display method and control system
US9325968B2 (en) Stereo imaging using disparate imaging devices
US8717352B2 (en) Tracing-type stereo display apparatus and tracing-type stereo display method
EP2383995B1 (en) Display system, method and computer program for capturing images using multiple integrated image sensors
CN107155104B (zh) Display correction method and device for a naked-eye stereoscopic display device
US9357206B2 (en) Systems and methods for alignment, calibration and rendering for an angular slice true-3D display
US7583307B2 (en) Autostereoscopic display
US9172949B2 (en) Display apparatus and method
JP4616559B2 (ja) Display device and display system
WO2020019548A1 (zh) Human-eye-tracking-based naked-eye 3D display method, apparatus, device and medium
US10502967B2 (en) Method for rendering three-dimensional image, imaging method and system
US9541494B2 (en) Apparatus and method to measure display quality
CN107179613B (zh) Display correction method and device for a naked-eye stereoscopic display device, and electronic device
CN108989785B (zh) Human-eye-tracking-based naked-eye 3D display method, apparatus, terminal and medium
US11754975B2 (en) System and method for holographic image display
JP2019213127A (ja) IP stereoscopic image display device and program therefor
CN105391997A (zh) Three-dimensional viewpoint correction method for a stereoscopic display device
US20140362197A1 (en) Image processing device, image processing method, and stereoscopic image display device
Hopf et al. Multi-user eye tracking suitable for 3D display applications
CN110191331B (zh) True three-dimensional naked-eye 3D image synthesis method, storage medium and synthesis device
CN118264790A (zh) Naked-eye 3D display method, apparatus and system, and electronic device
CN205212984U (zh) Stereoscopic display device
CN115755428A (zh) Grating driving and correction information acquisition method, related device and correction system
CN116830151A (zh) Display method and display apparatus
CN114666566A (zh) Display and detection methods for a three-dimensional display device, storage medium and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17929674

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 20.06.2020)

122 Ep: pct application non-entry in european phase

Ref document number: 17929674

Country of ref document: EP

Kind code of ref document: A1