CN117412020A - Parallax adjustment method, parallax adjustment device, storage medium and computing device - Google Patents


Info

Publication number
CN117412020A
CN117412020A
Authority
CN
China
Prior art keywords
pixel
viewer
interest
image
region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210800791.0A
Other languages
Chinese (zh)
Inventor
宋碧薇
李慧玲
陈宇宸
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202210800791.0A priority Critical patent/CN117412020A/en
Publication of CN117412020A publication Critical patent/CN117412020A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • H04N13/371Image reproducers using viewer tracking for tracking viewers with different interocular distances; for tracking rotational head movements around the vertical axis
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/172Processing image signals image signals comprising non-image signal components, e.g. headers or format information
    • H04N13/178Metadata, e.g. disparity information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N2013/0074Stereoscopic image analysis
    • H04N2013/0081Depth or disparity estimation from stereoscopic image signals

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Library & Information Science (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The application discloses a parallax adjustment method, a parallax adjustment device, a storage medium and a computing device, and relates to the field of computer vision. The method is performed by a computing device and comprises: for a viewer's pixel region of interest in a binocular image, determining the convergence conflict of corresponding pixel points in the pixel region of interest according to the parallax between those corresponding pixel points; then determining, within the pixel region of interest, a target pixel region whose convergence conflict exceeds a conflict threshold, and performing parallax adjustment on the target pixel region. After the parallax adjustment, the convergence conflict of the corresponding pixel points in the target pixel region is reduced, so that even if the viewer gazes at that pixel region for a long time during viewing, eye fatigue is unlikely to occur, and the viewer's stereoscopic viewing experience is improved.

Description

Parallax adjustment method, parallax adjustment device, storage medium and computing device
Technical Field
The present disclosure relates to the field of computer vision, and in particular, to a parallax adjustment method, a parallax adjustment device, a storage medium, and a computing device.
Background
When a person views an object, parallax exists between the left-eye viewing image seen by the left eye and the right-eye viewing image seen by the right eye, which enables people to distinguish distant objects from near ones and to perceive depth. 3D movies, 3D televisions, and the like likewise use the parallax principle to present stereoscopic images to the user.
When watching a 3D film, the perceived stereoscopic image is obtained by the human brain fusing the left-eye and right-eye viewing images based on the parallax between them. When watching scenes with strong 3D effects, viewers are prone to visual fatigue and even physiological discomfort, which reduces the user's 3D viewing experience.
Disclosure of Invention
The present application provides a parallax adjustment method, a parallax adjustment device, a storage medium, and a computing device, to address the problem of visual fatigue arising while a viewer views 3D images.
In a first aspect, the present application provides a parallax adjustment method, the method being performed by a computing device, the method comprising: first determining a pixel region of interest of a viewer in a binocular image, the binocular image comprising a left-eye viewing image and a right-eye viewing image, and the pixel region of interest comprising a first pixel region of interest of the viewer in the left-eye viewing image and a second pixel region of interest in the right-eye viewing image; then determining the convergence conflict of corresponding pixel points in the pixel region of interest according to the parallax between the corresponding pixel points in the first and second pixel regions of interest; and, if a target pixel region whose convergence conflict exceeds a conflict threshold exists in the pixel region of interest, performing parallax adjustment on the target pixel region.
In the scheme of the present application, for the pixel region of interest that the viewer attends to for a longer time in the binocular image, parallax adjustment is performed on the target pixel region whose convergence conflict exceeds the preset threshold, so as to reduce the convergence conflict of the pixels in the target pixel region. Even if the viewer gazes at the target pixel region for a long time, eye fatigue or physiological discomfort is effectively reduced, improving the viewer's stereoscopic viewing experience.
In one possible implementation, determining the pixel region of interest of the viewer in the binocular image includes: determining, according to the viewer's historical gaze information, the viewer's historical gaze region on the imaging surface; and determining the viewer's pixel region of interest on the binocular image according to the historical gaze region.
In this implementation, since the viewer's gaze point on the imaging surface changes little over a short time during viewing, determining the pixel region of interest on the binocular image from past gaze information ensures that the determined region is accurate, which in turn ensures the parallax adjustment effect.
In one possible implementation, the historical gaze information indicates the probability that the viewer gazed at each pixel point on the imaging surface over a historical time period. The historical gaze region may then be determined from the historical gaze information as follows: for each pixel point on the imaging surface, first integrate the multiple probabilities of gazing at that pixel point over the historical time period to determine a reference fixation probability for each pixel point; then determine the set of pixel points on the imaging surface whose reference fixation probability is not lower than a probability threshold, and take that set as the viewer's historical gaze region on the imaging surface.
In the above implementation, the historical gaze region is determined comprehensively from the probabilities that the viewer gazed at each pixel point throughout the historical time period. Compared with determining the historical gaze region from the gaze probability at a single past time point, this method uses more reference data and can better ensure the accuracy of the determined historical gaze region.
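The reference-fixation-probability step above can be sketched as follows. This is a minimal illustration under assumptions of ours: the "integration" is taken as a time average, and the function and parameter names are illustrative, not from the patent.

```python
import numpy as np

def historical_gaze_region(prob_maps: np.ndarray, threshold: float) -> np.ndarray:
    """prob_maps: (T, H, W) per-frame gaze probabilities over the history window.

    Returns a boolean (H, W) mask: the historical gaze region."""
    # Integrate the per-frame probabilities over time into a reference
    # fixation probability per pixel (here: a simple mean over the window).
    reference = prob_maps.mean(axis=0)
    # Keep every pixel whose reference fixation probability reaches the threshold.
    return reference >= threshold
```

With a two-frame history, only pixels that were consistently gazed at survive the threshold, which is the behavior the implementation above describes.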
In one possible implementation, determining the viewer's pixel region of interest on the binocular image from the historical gaze region includes: intersecting each sub-pixel region in the binocular image with the historical gaze region; then selecting, as the viewer's pixel region of interest, at least one sub-pixel region that intersects the historical gaze region.
The above implementation determines the viewer's pixel region of interest in the binocular image by combining the viewer's historical gaze region with the image content of the binocular image. In some scenarios, the pixel area of the binocular image that coincides with the historical gaze region may be only part of the pixel region representing an object, for example the region where a character's head is located. If such a partial region were determined as the pixel region of interest, then after parallax adjustment the parallax of the remaining regions of the object would differ noticeably from that of the adjusted region; that is, different parts of the same object would be presented at visibly different depths, which would greatly degrade the viewing experience. The above implementation ensures that each sub-pixel region in the binocular image is either determined as a pixel region of interest in its entirety or not at all, and never only partially, so the above problem is effectively avoided and the user's viewing experience is preserved.
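The whole-region selection rule described above can be sketched as follows, assuming each sub-pixel region is represented by a boolean mask (e.g., from an object segmentation); the names are illustrative.

```python
import numpy as np

def select_regions_of_interest(subregion_masks: list, gaze_mask: np.ndarray) -> list:
    # A sub-pixel region is selected in its entirety if it has ANY overlap
    # with the historical gaze region; partial regions are never selected.
    return [m for m in subregion_masks if np.logical_and(m, gaze_mask).any()]
```

Because whole masks are returned, a region covering a character is either entirely in the pixel region of interest or entirely outside it.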
In one possible implementation, determining the convergence conflict of corresponding pixel points in the pixel region of interest according to the parallax between the corresponding pixel points in the first pixel region of interest and the second pixel region of interest includes: obtaining the interpupillary distance of the viewer, the viewing distance of the viewer relative to the imaging surface, and the display parameters; and determining the convergence conflict of the corresponding pixel points according to that parallax, the interpupillary distance, the viewing distance, and the display parameters. Further, the convergence conflict can be calculated as follows: first, determine the virtual image distance perceived by the viewer for each corresponding pixel point in the pixel region of interest according to the parallax, the interpupillary distance, the viewing distance, and the display parameters; then determine the convergence conflict of the corresponding pixel point according to the perceived virtual image distance and the viewing distance. In this way, the convergence conflict of the corresponding pixel points in the pixel region of interest is accurately calculated from the viewer's viewing state (interpupillary distance and viewing distance) and the display parameters of the stereoscopic display device corresponding to the imaging surface.
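The two-step calculation above can be sketched with a simple similar-triangles viewing model. This is an assumption for illustration only; the patent does not give an explicit formula, and the pixel-pitch display parameter, the sign convention d = x_r - x_l, and the function names are ours.

```python
def virtual_image_distance(disparity_px: float, pixel_pitch_m: float,
                           pupil_distance_m: float, viewing_distance_m: float) -> float:
    # On-screen separation of a corresponding pixel pair: w = d * pitch.
    # The sight lines from the two eyes (baseline e) through the two screen
    # points meet at depth N = e * L / (e - w): w > 0 gives N > L (behind
    # the screen), w < 0 gives N < L (in front), w = 0 gives N = L.
    w = disparity_px * pixel_pitch_m
    return pupil_distance_m * viewing_distance_m / (pupil_distance_m - w)

def convergence_conflict(viewing_distance_m: float,
                         virtual_image_distance_m: float) -> float:
    # Difference between actual power (1/L) and equivalent power (1/N), in diopters.
    return abs(1.0 / viewing_distance_m - 1.0 / virtual_image_distance_m)
```

For example, with a 65 mm interpupillary distance, a 0.5 mm pixel pitch, a 2 m viewing distance, and a 65-pixel disparity, this model places the perceived virtual image 4 m away, giving a conflict of 0.25 diopters.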
In one possible implementation, the method further includes: acquiring a plurality of historical viewing distances of a viewer relative to an imaging surface over a historical period of time; a viewing distance of the viewer relative to the imaging surface is determined based on the plurality of historical viewing distances.
Based on this implementation, the viewing distance of the viewer while viewing the binocular image is estimated from a plurality of historical viewing distances. Using more reference data to determine the viewing distance ensures the accuracy of the determined distance, which in turn ensures the accuracy of the calculated convergence conflict.
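One way to realize "determined based on the plurality of historical viewing distances" is a median over the history window; this aggregation choice (and the function name) is an assumption of ours, since the patent does not fix the rule.

```python
def estimated_viewing_distance(history_m: list) -> float:
    # Median of the historical measurements: robust to momentary
    # eye-tracking glitches, unlike a plain mean.
    s = sorted(history_m)
    n = len(s)
    return s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])
```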
In one possible implementation, if the imaging surface is a real image imaging surface of a real image display device, the viewing distance of the viewer with respect to the imaging surface is the distance from the eyes of the viewer to the real image imaging surface; alternatively, if the imaging plane is a virtual image imaging plane of a virtual image display apparatus, the viewing distance of the viewer with respect to the imaging plane is the distance from the eyes of the viewer to the virtual image imaging plane.
In one possible implementation, the parallax adjustment may be performed as follows: obtain a first display distance between a first pixel point on the imaging surface and a second pixel point on the imaging surface, where the first pixel point is a pixel point in the first pixel region of interest in the left-eye viewing image and the second pixel point is the corresponding pixel point in the second pixel region of interest in the right-eye viewing image; then adjust the first display distance to a second display distance according to the convergence conflict of the corresponding pixel points in the target pixel region, the second display distance being smaller than the first display distance.
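A minimal sketch of shrinking the display distance according to the conflict. The proportional policy below is an assumption, not the patent's prescribed rule; it simply pulls the perceived virtual image back toward the screen plane until the conflict no longer exceeds the threshold.

```python
def adjusted_display_distance(first_distance_m: float, conflict_d: float,
                              conflict_threshold_d: float) -> float:
    # If the convergence conflict (in diopters) already meets the threshold,
    # keep the original on-screen separation; otherwise shrink it in
    # proportion to the excess conflict.
    if conflict_d <= conflict_threshold_d:
        return first_distance_m
    return first_distance_m * (conflict_threshold_d / conflict_d)
```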
In one possible implementation, the method further includes: determining target display position information of the binocular image on the imaging surface according to the initial display position information and the second display distance of the binocular image on the imaging surface; and displaying the binocular image on the imaging plane according to the target display position information.
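The repositioning step above can be illustrated as follows. Shifting the pixel pair symmetrically about its midpoint is one possible choice, not the patent's stated method, and the names are ours.

```python
def target_positions(left_pos_m: float, right_pos_m: float,
                     second_distance_m: float):
    # Keep the midpoint of the corresponding pixel pair fixed on the screen
    # and set the new display distance, preserving the original ordering
    # (and hence the in-front / behind sign of the perceived depth).
    mid = 0.5 * (left_pos_m + right_pos_m)
    half = 0.5 * second_distance_m
    if right_pos_m >= left_pos_m:
        return mid - half, mid + half
    return mid + half, mid - half
```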
In a second aspect, the present application provides a parallax adjustment device comprising respective modules for implementing the parallax adjustment method of the first aspect or any one of the possible implementations of the first aspect.
In a third aspect, the present application provides a processor for performing the operational steps of the parallax adjustment method of the first aspect or any one of the possible implementations of the first aspect.
In a fourth aspect, the present application provides a computing device comprising at least one processor and memory for storing a set of computer instructions; when the processor executes a set of computer instructions, the operational steps of the parallax adjustment method of the first aspect or any of the possible implementations of the first aspect are performed.
In a fifth aspect, the present application provides a display device comprising a processor, a memory, and a display unit, the memory for storing a set of computer instructions; when the processor executes a set of computer instructions, the operational steps of the parallax adjustment method of the first aspect or any one of the possible implementations of the first aspect are performed; the display unit is used for displaying the binocular image after parallax adjustment.
In a sixth aspect, the present application provides a computer-readable storage medium having computer-readable instructions stored thereon; when executed by a processor, the computer-readable instructions implement a disparity adjustment method as provided in the first aspect or any one of the possible implementations of the first aspect.
In a seventh aspect, the present application provides a computer program product which, when run on a computer, causes a computing device to perform the operational steps of the parallax adjustment method of the first aspect or any one of the possible implementations of the first aspect.
The implementations provided in the above aspects may be further combined to provide additional implementations of the present application.
Drawings
Fig. 1 shows a schematic representation of a three-dimensional virtual image perceived by a viewer at different parallaxes.
Fig. 2 is a schematic diagram of a parallax adjustment system according to an embodiment of the present application.
Fig. 3 is a schematic diagram of a parallax adjustment system according to another embodiment of the present application.
Fig. 4 is a flowchart illustrating a parallax adjustment method according to an embodiment of the present application.
Fig. 5 schematically shows a stereoscopic image presented on a converging real image display device.
Fig. 6 exemplarily shows a schematic diagram of rendering a stereoscopic image on a parallel virtual image display device.
Fig. 7 is a flowchart illustrating a parallax adjustment method according to another embodiment of the present application.
Fig. 8 is a block diagram of a parallax adjustment device according to an embodiment of the present application.
FIG. 9 is a block diagram of a computing device, according to an embodiment of the present application.
Detailed Description
Before describing the embodiments of the present application in detail, the terms referred to in the present application are explained as follows:
binocular image: refers to an image obtained by photographing the same subject from different viewpoints by two image pickup apparatuses, and a binocular image includes a left eye viewing image and a right eye viewing image.
Disparity (parallax): refers to the horizontal pixel difference between the left-eye viewing image and the right-eye viewing image in a binocular image. Parallax is typically computed as the horizontal pixel coordinate of a pixel in the right-eye viewing image minus the horizontal pixel coordinate of the corresponding pixel in the left-eye viewing image. For example, if a point P in physical space has pixel coordinates (x_l, y_l) in the left-eye viewing image and (x_r, y_r) in the right-eye viewing image, the parallax is d_N = x_r - x_l.
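The sign convention above can be stated in one line (a trivial helper of ours, for illustration):

```python
def disparity(x_left: float, x_right: float) -> float:
    # d_N = x_r - x_l for a pair of corresponding pixels: positive for
    # uncrossed (behind-screen) disparity, negative for crossed
    # (in-front-of-screen) disparity, zero on the zero plane.
    return x_right - x_left
```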
Diopter: also known as power, the effect of the eye's refracted rays is called refraction, which is the ability of refraction in diopters, denoted as D. When parallel light passes through a refractive substance (e.g., eye, lens, etc.), the refractive power of the refractive substance is 1 diopter (i.e., 1D) when the focal point is 1 m.
Convergence conflict (vergence-accommodation conflict, VAC): also known as the convergence conflict index. When a viewer views a binocular image having horizontal parallax, the viewer's lines of sight focus on an imaging surface (e.g., a screen), while the three-dimensional virtual image perceived from the parallax lies in front of or behind that surface. When the viewer's sight is focused on the imaging surface, the refractive power of the eye is the actual refractive power; when it is focused on the perceived three-dimensional virtual image, the refractive power is the equivalent refractive power. The difference between the equivalent refractive power and the actual refractive power is referred to as the convergence conflict. A large convergence conflict easily causes the viewer eye fatigue, dizziness, or other physiological discomfort.
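A minimal numeric illustration of this definition, treating the screen distance L and the perceived virtual image distance N as the two focal distances (a simplification of ours):

```python
def vac_diopters(viewing_distance_m: float, virtual_image_distance_m: float) -> float:
    # Convergence conflict in diopters:
    # |actual power - equivalent power| = |1/L - 1/N|.
    return abs(1.0 / viewing_distance_m - 1.0 / virtual_image_distance_m)
```

For instance, a screen 2 m away presenting a virtual image perceived 4 m away yields a conflict of 0.25 D; when the virtual image lies on the screen, the conflict is zero.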
The display device may present a binocular image having horizontal parallax to a viewer using the stereoscopic imaging principle. Based on the left-eye viewing image seen by the viewer's left eye and the right-eye viewing image seen by the right eye, the viewer's brain forms a stereoscopic impression through judgment and analysis, for example a sense that an object floats in front of the screen or recedes into it.
Fig. 1 shows a schematic representation of a three-dimensional virtual image perceived by a viewer at different parallaxes. The right side in fig. 1 exemplarily shows a pixel region where a cube in an actual physical space is located in a left-eye viewing image and a pixel region where it is located in a right-eye viewing image. Let pixel I in the left-eye viewing image and pixel II in the right-eye viewing image represent the same point on the cube in real physical space.
As shown in fig. 1 (a), when the pixel area where the cube is located lies toward the left in the left-eye viewing image and toward the right in the right-eye viewing image, pixel point I in the left-eye viewing image is displayed at position point A1 on the screen and pixel point II in the right-eye viewing image is displayed at position point B1 on the screen; the convergence point of the left-eye and right-eye sight lines then lies behind the screen. Correspondingly, the parallax d_N calculated from the positions of pixel points I and II on the screen satisfies d_N > 0, and the virtual image distance N of the three-dimensional virtual image perceived by the viewer is greater than the actual viewing distance L; the viewer then has the sensation of an object receding into the screen (i.e., an in-screen effect). The virtual image distance N refers to the distance, along the direction perpendicular to the screen, between the perceived three-dimensional virtual image and the viewer's eyes; the viewing distance L refers to the distance, along the same direction, between the viewer's eyes and the screen.
As shown in fig. 1 (b), when the pixel area where the cube is located lies toward the right in the left-eye viewing image and toward the left in the right-eye viewing image, pixel point I in the left-eye viewing image is displayed at position point A2 on the screen and pixel point II in the right-eye viewing image is displayed at position point B2 on the screen; the convergence point of the left-eye and right-eye sight lines then lies in front of the screen. At this time, the parallax d_N < 0, the virtual image distance N of the three-dimensional virtual image perceived by the viewer is smaller than the actual viewing distance L, and the viewer has the sensation of an object floating in front of the screen (i.e., an out-of-screen effect).
As shown in fig. 1 (c), when the position of the pixel area where the cube is located is substantially the same in the left-eye viewing image and the right-eye viewing image, pixel point I in the left-eye viewing image and pixel point II in the right-eye viewing image are displayed at substantially the same position point A3 on the screen. At this time, the parallax d_N = 0, the virtual image distance N perceived by the viewer equals the actual viewing distance L, and the viewer perceives the object as lying exactly on the screen (i.e., an on-screen effect). The plane where the parallax is zero may also be referred to as the zero plane.
As can be seen from (a) and (b) in fig. 1, when the parallax d_N is non-zero, the viewer perceives a stereoscopic image, and the larger the absolute value of d_N, the stronger the stereoscopic effect. However, while the stereoscopic image is presented, the virtual image distance perceived by the viewer differs from the viewer's actual viewing distance, so the actual refractive power and the equivalent refractive power of the viewer's eyes differ; that is, a convergence conflict exists. When the convergence conflict is large, it easily burdens the eyes and can produce obvious visual fatigue or even physiological discomfort, reducing the user's 3D viewing experience. The solution of the present application is proposed to address this problem.
Fig. 2 is a schematic diagram of a parallax adjustment system according to an embodiment of the present application. As shown in fig. 2, the parallax adjustment system 200 includes a stereoscopic display device 220 and an eye tracking device 210. The stereoscopic display device 220 may establish a communication connection with the eye tracking device 210 through a wired or wireless network.
The stereoscopic display device 220 has a 3D (3-dimensional) display function and may be a stereoscopic real image display device or a near-eye stereoscopic display device (e.g., AR glasses, VR glasses, or stereoscopic glasses). The stereoscopic real image display device may be a mobile real image stereoscopic display device (e.g., a smartphone, tablet computer, or game machine with a 3D display function) or a desktop real image stereoscopic display device (e.g., a desktop display with a 3D display function, a projector, an indoor or outdoor large-format 3D screen, a 3D photographing device, or a 3D post-processing device), and is not specifically limited here.
The eye tracking device 210 is used for tracking the eyes of the viewer to obtain the viewer's eye movement data. The eye tracking device 210 may be an eye tracker, an infrared camera, an RGB (Red Green Blue) camera, an RGB-D (RGB-Depth) camera, or the like, and is not specifically limited here. The eye tracking device 210 may transmit the viewer's eye movement data to the stereoscopic display device 220.
The stereoscopic display device 220 includes a processor and a stereoscopic display unit, wherein the stereoscopic display unit may be a display screen of the stereoscopic display device. The parallax adjustment method provided in the present application may be implemented when the processor in the stereoscopic display device 220 executes the computer readable instructions: that is, determining a region of interest pixels of a viewer in a binocular image from eye movement data; and performing convergence conflict calculation according to the parallax between the corresponding pixel points in the pixel region of interest, and then performing parallax adjustment to obtain a binocular image after parallax adjustment. Then, the parallax-adjusted binocular image is displayed by the stereoscopic display unit.
In some embodiments, the eye tracking device 210 may further process the acquired eye movement data to obtain gaze information of the viewer on an imaging plane (i.e. an imaging plane corresponding to the stereoscopic display unit), and send the gaze information to the stereoscopic display device 220, where the stereoscopic display device 220 performs the parallax adjustment method according to the gaze information and the binocular image of the viewer. In a specific embodiment, based on eye movement data of the viewer, gaze information of the viewer on the imaging plane may be obtained by pupil-cornea reflection method, eyeball 3D modeling method, line of sight estimation based on neural network, and the like.
The eye-tracking device 210 may be fixedly mounted on the stereoscopic display device 220 such that the eye-tracking device 210 faces the viewer, and thus the eye-tracking device 210 may perform eye-tracking in real time while the viewer faces the stereoscopic display device 220. The mounting position of the eye-tracking apparatus 210 on the stereoscopic display apparatus 220 is not limited, and may be mounted above, below, left side, right side, etc. of the display screen of the stereoscopic display apparatus 220, for example; of course, the eye-tracking device 210 is not limited to being mounted on the stereoscopic display device 220, and may be placed at other positions where eye-tracking can be accurately performed, for example, in the case where the eye-tracking device 210 is head-mounted, the eye-tracking device 210 may be worn by a viewer.
Fig. 3 is a schematic diagram of a parallax adjustment system according to another embodiment of the present application. Compared with the parallax adjustment system shown in fig. 2, the parallax adjustment system in fig. 3 further comprises a control device 230, which is communicatively connected to the stereoscopic display device 220 and to the eye tracking device 210, respectively. The control device 230 may be a computing device with computing and processing capabilities, such as an in-vehicle device, an edge computing device, a smart home device, a gateway, or a projection control device, and is not specifically limited here.
Based on the parallax adjustment system shown in fig. 3, the eye tracking device 210 may send the acquired eye movement data (or gaze information of the viewer on the imaging plane obtained by processing the eye movement data) to the control device 230, and the control device 230 executes the parallax adjustment method provided in the present application, so as to obtain a binocular image after parallax adjustment. The control device 230 may transmit the parallax-adjusted binocular image to the stereoscopic display device 220, and the parallax-adjusted binocular image is displayed by the stereoscopic display device 220.
Fig. 4 is a flowchart illustrating a parallax adjustment method according to an embodiment of the present application. The method may be performed by the stereoscopic display device 220 shown in fig. 2 or by the control device 230 in fig. 3. As shown in fig. 4, the method includes:
In step 410, a pixel region of interest of a viewer in a binocular image is determined.
The pixel region of interest refers to a pixel region in the binocular image that the viewer pays attention to or is interested in. It can also be understood as the pixel region in the binocular image where an object that the viewer attends to or is interested in is located, or a pixel region in the binocular image with a high probability of being gazed at by the viewer's eyes. The binocular image may be the binocular image currently to be displayed.
The binocular image includes a left-eye viewing image and a right-eye viewing image, and the pixel region of interest includes a first pixel region of interest in the left-eye viewing image and a second pixel region of interest in the right-eye viewing image. The first pixel region of interest refers to the pixel region that the viewer attends to in the left-eye viewing image; the second pixel region of interest refers to the pixel region that the viewer attends to in the right-eye viewing image. In fig. 4, the pixel regions shown by dashed frames in the left-eye and right-eye viewing images are the pixel regions of interest: the region enclosed by the dashed frame in the left-eye viewing image is the first pixel region of interest, and the region enclosed by the dashed frame in the right-eye viewing image is the second pixel region of interest.
It should be noted that pixel points in the first pixel region of interest correspond to pixel points in the second pixel region of interest: if a pixel point in the first pixel region of interest (say, pixel point D1) and a pixel point in the second pixel region of interest (say, pixel point D2) represent the same point in actual physical space, it is determined that pixel point D1 and pixel point D2 correspond to each other.
In some embodiments, the object of attention or object of interest may be preselected by the viewer; object recognition is then performed in the binocular image to determine the viewer's object of attention or interest in the binocular image, and the pixel region where that object is located is taken as the pixel region of interest. In the post-processing of a 3D video, a post-processing operator may likewise select an object of attention or interest according to the scheme of this embodiment, so as to determine the operator's pixel region of interest in the binocular image.
In some embodiments, the pixel region of interest in the current binocular image may be determined from the viewer's historical object of attention (or historical object of interest) in historically presented binocular images. Specifically, after the viewer's historical object of attention in a historical binocular image is determined, the historical object of attention is located in the current binocular image, and the pixel region where it is located in the current binocular image is taken as the viewer's pixel region of interest in the current binocular image.
Step 420, determining a convergence conflict of the corresponding pixel points in the pixel region of interest according to the parallax between the corresponding pixel points in the first pixel region of interest and the second pixel region of interest.
As described above, parallax refers to the horizontal pixel difference between the left-eye viewing image and the right-eye viewing image in a binocular image. In this application, the parallax is the horizontal pixel coordinate in the right-eye viewing image minus the horizontal pixel coordinate in the left-eye viewing image.
A pixel point in the first pixel region of interest and its corresponding pixel point in the second pixel region of interest are two pixel points that represent the same point in actual physical space, one located in the first pixel region of interest and the other in the second pixel region of interest. It is understood that the parallax between corresponding pixel points is the horizontal pixel coordinate of the pixel point in the second pixel region of interest minus the horizontal pixel coordinate of the pixel point in the first pixel region of interest. Further, in the present application, such a pair may also be referred to as corresponding pixel points in the pixel region of interest.
Specifically, stereo matching may be performed between the first pixel region of interest in the left-eye viewing image and the second pixel region of interest in the right-eye viewing image to determine the corresponding pixel points in the two regions; the parallax between corresponding pixel points is then calculated from the pixel coordinates of the pixel points in the two regions under the screen coordinate system.
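The stereo matching step can be illustrated with a minimal sketch. The block-matching routine below is a hypothetical illustration, not the application's algorithm: for each pixel of a left-eye image row, it searches a horizontal window in the right-eye row and keeps the offset with the smallest sum of absolute differences.

```python
# Minimal 1-D block-matching sketch (illustrative, not the application's
# algorithm): the winning offset is the parallax d_N = x_right - x_left.
def sad(a, b):
    return sum(abs(x - y) for x, y in zip(a, b))

def row_disparity(left_row, right_row, block=3, max_disp=8):
    """Return {x_left: d_N} with d_N = x_right - x_left for one image row."""
    half = block // 2
    disparities = {}
    for x in range(half, len(left_row) - half):
        patch = left_row[x - half:x + half + 1]
        best_d, best_cost = 0, float("inf")
        for d in range(-max_disp, max_disp + 1):
            xr = x + d
            if xr - half < 0 or xr + half >= len(right_row):
                continue
            cost = sad(patch, right_row[xr - half:xr + half + 1])
            if cost < best_cost:
                best_cost, best_d = cost, d
        disparities[x] = best_d
    return disparities

# Synthetic rows: the right row is the left row shifted 2 pixels rightward,
# so the recovered parallax is +2 around the textured feature (flat,
# textureless pixels are inherently ambiguous).
left = [0, 0, 10, 80, 10, 0, 0, 0, 0, 0]
right = [0, 0, 0, 0, 10, 80, 10, 0, 0, 0]
print(row_disparity(left, right, block=3, max_disp=4))
```

In practice dense stereo matchers use larger blocks and regularization, but the cost-search structure is the same.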
In other embodiments, since the parallax of a pixel point in the image can be calculated from the depth corresponding to the pixel point and the focal length of the image acquisition device, the parallax of each pixel in the pixel region of interest may also be determined from a pre-acquired depth map and the focal length of the image acquisition device. Specifically, the formula for calculating parallax based on depth is as follows:

d = f · b / z;(equation 1)
wherein f is the focal length of the image acquisition device (camera, video camera, etc.) from which the binocular image is derived; b is the distance between the optical centers of two image acquisition devices from which the binocular images are derived; z is the depth value corresponding to the pixel point in the depth map.
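Assuming the rectified-stereo relationship of equation 1, a depth map can be converted to per-pixel parallax as sketched below; the focal length, baseline, and depth values are illustrative, not from the application.

```python
# Parallax from a depth map via equation 1: d = f * b / z, where f is the
# focal length in pixels, b the distance between the two optical centers,
# and z the depth value from the depth map (b and z in the same unit).
def disparity_from_depth(depth_map, f_pixels, baseline):
    return [[f_pixels * baseline / z for z in row] for row in depth_map]

# Illustrative values: f = 1000 px, b = 0.06 m, depths in meters.
depth = [[1.5, 3.0], [6.0, 12.0]]
print(disparity_from_depth(depth, 1000.0, 0.06))  # nearer points -> larger parallax
```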
Convergence conflict refers to the difference between the equivalent refractive power and the actual refractive power. As described above, since the virtual image perceived by the viewer from corresponding pixel points in the left-eye viewing image and the right-eye viewing image is not necessarily located on the imaging plane, the distance of the virtual image perceived for a corresponding pixel point in the pixel region of interest may differ from the viewing distance of the viewer with respect to the imaging plane. When the viewer's line of sight is focused on the imaging plane, the actual refractive power of the viewer's eyes is the inverse of the viewing distance with respect to the imaging plane; when the line of sight is focused at the position of the perceived virtual image, the equivalent refractive power of the viewer's eyes is the inverse of the virtual image distance perceived by the viewer.
Therefore, for the corresponding pixel points in the pixel region of interest, the calculation formula of the convergence conflict is:

ΔΦ = |1000/L − 1000/N|;(equation 2)
wherein L is the viewing distance of the viewer with respect to the imaging plane, and N is the virtual image distance actually perceived by the viewer from the corresponding pixel points in the pixel region of interest. In equation 2, the units of L and N are millimeters; if L and N are expressed in other length units, equation 2 is converted accordingly according to the conversion relationship between the length units. For example, if the units of L and N are meters, the calculation formula of the convergence conflict becomes:

ΔΦ = |1/L − 1/N|
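The two unit variants of equation 2 can be sketched directly, taking the magnitude of the power difference with L and N either both in millimeters or both in meters.

```python
# Convergence conflict via equation 2: the magnitude of the difference
# between the equivalent power 1/N and the actual power 1/L, in diopters.
# With L and N in millimeters the factor is 1000; in meters it is 1.
def vergence_conflict_mm(L_mm, N_mm):
    return abs(1000.0 / L_mm - 1000.0 / N_mm)

def vergence_conflict_m(L_m, N_m):
    return abs(1.0 / L_m - 1.0 / N_m)

# Example: viewing distance 2 m, perceived virtual image distance 4 m.
print(vergence_conflict_mm(2000.0, 4000.0))  # 0.25 D
print(vergence_conflict_m(2.0, 4.0))         # 0.25 D
```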
the imaging plane may be a real image imaging plane of the real image display apparatus, the viewing distance of the viewer with respect to the imaging plane is a distance from the eyes of the viewer to the real image imaging plane in a direction perpendicular to the imaging plane, and the real image imaging plane may be the screens in (a) - (c) of fig. 1. In this case, the viewing distance of the viewer with respect to the imaging plane can be determined by the position information of the eyes of the viewer and the position information of the real image imaging plane.
The imaging plane may also be a virtual image imaging plane of a virtual image display device, the viewing distance of the viewer with respect to the imaging plane being the distance of the viewer's eyes to the virtual image imaging plane in a direction perpendicular to the imaging plane. The virtual image forming plane can also be understood as an imaging plane passing through the focal point of the virtual image display device and perpendicular to the optical axis of the virtual image display device. The virtual image display device includes, for example, AR glasses, VR glasses, and stereoscopic glasses having a 3D display function. In this case, the distance of the eyes of the viewer with respect to the virtual image imaging plane may be calculated in combination with the positional information of the eyes of the viewer and the optical design parameters of the virtual image display device (e.g., focal length, VID (Virtual Image Distance, virtual image distance)). For example, if the virtual image display apparatus is AR glasses having a 3D display function, the distance of the eyes of the viewer from the lenses of the AR glasses may be determined according to the position information of the eyes of the viewer, and then the focal length of the AR glasses and the distance of the eyes of the viewer from the lenses of the AR glasses may be added to obtain the distance of the eyes of the viewer from the virtual image imaging plane.
The virtual image distance N perceived by the viewer can be calculated by parallax, display parameters of the display device corresponding to the imaging plane, and the viewing distance L. Specifically, step 420 includes: the method comprises the steps of obtaining the interpupillary distance of a viewer, the viewing distance of the viewer relative to an imaging surface and display parameters; and determining the convergence conflict of the corresponding pixel points in the pixel region of interest according to the parallax between the corresponding pixel points in the first pixel region of interest and the second pixel region of interest, the interpupillary distance of the viewer, the viewing distance of the viewer relative to the imaging surface and the display parameters.
According to the parallax of the corresponding pixel points in the pixel region of interest, the pupil distance of the viewer, the viewing distance of the viewer relative to the imaging surface and the display parameters, the virtual image distance perceived by the viewer for the corresponding pixel points in the pixel region of interest can be calculated; then, based on the above formula 2, the convergence conflict of the corresponding pixel point in the pixel region of interest can be determined from the virtual image distance perceived by the viewer for the corresponding pixel point in the pixel region of interest and the viewing distance of the viewer with respect to the imaging plane.
The display parameters may be a lateral resolution (i.e., resolution in a horizontal direction) of the display device corresponding to the imaging plane and a lateral display width of the display device; but also the lateral pixel density of the display device to which the imaging plane corresponds.
When binocular images are displayed on different types of display devices, the manner of calculating the virtual image distance N differs. The calculation of the virtual image distance is described below taking as examples a convergent real image display device displaying a binocular image and a parallel virtual image display device displaying a binocular image.
In the case of presenting a stereoscopic image using a converging real image display device, a left eye viewing image and a right eye viewing image in a binocular image are the same area displayed on the same screen. Fig. 5 schematically illustrates a stereoscopic image presented on a convergent real image display device, and in fig. 5, a stereoscopic virtual image perceived by a viewer is located behind a real image imaging plane (screen).
Based on the schematic diagram of fig. 5, according to the principle of similar triangles, one can obtain:

d′_N / E = (N − L) / N;(equation 3)
transforming equation 3, the virtual image distance N can be obtained as:

N = E · L / (E − d′_N);(equation 4)
wherein d′_N represents the actual physical parallax on the stereoscopic display device, i.e., the physical parallax obtained after converting the parallax in pixel coordinates into the screen coordinate system of the display device; W is the lateral display width of the display device; E is the interpupillary distance of the viewer.
It will be appreciated that if, on the real image imaging plane, the pixel point in the left-eye viewing image lies on the right edge of the imaging plane and the corresponding pixel point in the right-eye viewing image lies on the left edge, the parallax calculated for the two corresponding pixel points is −W. In practice, to ensure the viewing effect, pixel points in the left-eye and right-eye viewing images are generally located within the imaging plane rather than on its edges, so d′_N > −W. Since the display device is a convergent display device, d′_N < E. Combining the two constraints, the value range of d′_N is: −W < d′_N < E. As can be seen from fig. 5, the larger the absolute value of the physical parallax d′_N (coordinates in the right-eye viewing image minus coordinates in the left-eye viewing image), the further the virtual image perceived by the eyes is from the screen.
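Equation 4 together with the constraint −W < d′_N < E can be sketched as follows; the interpupillary distance, viewing distance, and screen width values are illustrative.

```python
# Virtual image distance on a convergent display via equation 4:
# N = E * L / (E - d_phys), valid for -W < d_phys < E, where E is the
# interpupillary distance, L the viewing distance, W the lateral display
# width, and d_phys the physical parallax (all in the same length unit).
def virtual_image_distance(E, L, d_phys, W):
    if not (-W < d_phys < E):
        raise ValueError("physical parallax out of range for a convergent display")
    return E * L / (E - d_phys)

# Illustrative values: E = 63 mm, L = 2000 mm, W = 1000 mm.
print(virtual_image_distance(63.0, 2000.0, 31.5, 1000.0))   # 4000.0: behind the screen
print(virtual_image_distance(63.0, 2000.0, -63.0, 1000.0))  # 1000.0: in front of it
```

A positive physical parallax pushes the perceived virtual image behind the screen (N > L); a negative one pulls it in front (N < L).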
Bringing equation 4 into equation 2, the convergence conflict can be obtained as:

ΔΦ = 1000 · |d′_N| / (E · L);(equation 5)
and the physical parallax d′_N and the parallax d_N in the pixel coordinate system satisfy the following:

d′_N = d_N · W / width;(equation 6)
where width is the lateral resolution of the display device, which refers to the number of pixels that can be displayed by the screen of the display device in the horizontal direction.
In combination with 1 inch = 2.54 cm, there is the following relationship between the lateral resolution of the display device and the lateral pixel density ppi of the display device (i.e., the number of pixels included per inch of length), with W in millimeters:

width = W · ppi / 25.4;(equation 7)
bringing equation 7 into equation 6, the parallax d can be obtained N The method comprises the following steps:
bringing equation 8 into equation 5, the convergence conflict can be obtained as:
as can be seen from equation 9, when viewing distance L, pupil distance E, lateral pixel density ppi of the display device, and parallax d N In the known case, the convergence conflict can be calculated according to equation 9. Of course, if the display device is a converging virtual image display device, the convergence conflict may also be calculated in a similar manner, and will not be described in detail herein.
In the case of presenting a stereoscopic image using a parallel virtual image display apparatus, the left-eye viewing image and the right-eye viewing image in the binocular image are displayed on two screens (or two different areas of the same screen) directly in front of the two eyes. Fig. 6 exemplarily shows a schematic view of presenting a stereoscopic image on a parallel virtual image display device; in fig. 6, the left screen displays the left-eye viewing image, the right screen displays the right-eye viewing image, and the stereoscopic virtual image perceived by the viewer is located in front of the virtual image imaging plane. Based on fig. 6, according to the principle of similar triangles, it is possible to obtain:

N = −E · L / d′_N, wherein −W < d′_N < 0;(equation 10)
Bringing equation 10 into equation 2 yields a convergence conflict of:

ΔΦ = 1000 · |E + d′_N| / (E · L);(equation 11)
bringing equation 8 into equation 11 may further yield a convergence conflict of:
as can be seen from equation 12, when the viewing distance L, the pupil distance E, the lateral pixel density ppi of the display device (or the lateral fraction of the display device)Resolution and lateral width), and parallax d N The convergence conflict can be calculated according to equation 12, under known conditions. Of course, if the display device is a parallel real image display device, the convergence conflict may also be calculated in a similar manner, and will not be described herein. It is worth mentioning that when the display device is a virtual image display device, W is the lateral display width of the virtual image imaging plane of the virtual image display device.
The viewing distance L in the above formula may be determined based on a historical viewing distance between the viewer and the imaging surface detected during the historical period of time. In some embodiments, the last detected historical viewing distance between the viewer and the imaging surface may be taken as the viewing distance L in the above equation.
The viewing distance of the viewer with respect to the imaging plane may change slightly at different points in time; for example, it may differ when binocular image A is displayed and when binocular image B is displayed. In this case, the viewing distance of the viewer with respect to the imaging plane may be determined using a plurality of historical viewing distances of the viewer with respect to the imaging plane over a past historical time period.
In some embodiments, a plurality of historical viewing distances of the viewer relative to the imaging plane in the historical time period may be averaged to obtain an average historical viewing distance, and the average historical viewing distance is taken as the viewing distance of the viewer relative to the imaging plane, namely:

L = (1/n) · Σ_{i=1}^{n} L_i
Wherein L_i is the historical viewing distance of the viewer relative to the imaging plane at historical time point i; n is the number of historical binocular images displayed over the historical time period (t − Δt, t). It will be appreciated that the length of the history period should not be too long; for example, Δt may be 1 second, 1.5 seconds, 1.8 seconds, 2 seconds, 3 seconds, etc., and may be set according to practical needs.
In other embodiments, the historical viewing distance at a designated percentile among the plurality of historical viewing distances may also be taken as the viewing distance of the viewer relative to the imaging plane.
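The two variants for estimating the viewing distance L from historical viewing distances (averaging, or taking a designated percentile) can be sketched with the standard library; the distance samples are illustrative.

```python
import statistics

# Estimating the viewing distance L from historical viewing distances:
# either the mean over the history window, or a designated percentile of
# the recorded distances. The sample distances (mm) are illustrative.
history_mm = [1980.0, 2005.0, 1990.0, 2010.0, 2015.0]

L_mean = statistics.fmean(history_mm)                # average historical distance
L_p50 = statistics.quantiles(history_mm, n=100)[49]  # 50th-percentile variant
print(L_mean, L_p50)
```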
In step 430, if there is, within the pixel region of interest, a target pixel region in which the convergence conflict exceeds the conflict threshold, parallax adjustment is performed on the target pixel region.
The target pixel region is the region, within the pixel region of interest, formed by the pixel points whose convergence conflict exceeds the conflict threshold; the pixel regions enclosed by the thick dot-dash frames in the left-eye viewing image and the right-eye viewing image in fig. 4 can be regarded as target pixel regions.
The collision threshold may be set according to actual needs, for example, 0.35D, 0.38D, 0.39D, 0.4D, 0.42D, 0.44D, 0.45D, etc., and is not particularly limited herein. The conflict threshold may be set according to a convergence conflict range that makes eyes feel comfortable, so that in the case where the convergence conflict is lower than the conflict threshold, the stereoscopic image viewed by the viewer is comfortable to the eyes of the viewer without causing a noticeable tiring feeling or physiological discomfort.
As can be seen from equations 9 and 12, the convergence conflict for the corresponding pixel point is directly related to the corresponding parallax. The corresponding convergence conflict can be correspondingly adjusted by changing the parallax of the corresponding pixel points; for example, when the parallax between the corresponding pixels in the target pixel region is positive, the parallax can be reduced, so that the convergence conflict of the adjusted pixel is reduced, and the convergence conflict of the adjusted corresponding pixel is lower than the conflict threshold.
As can be seen from (a) and (b) in fig. 1 and from fig. 5, the absolute value of the parallax is directly related to the horizontal display pitch of the corresponding pixel points on the imaging plane, where the display pitch of corresponding pixel points on the imaging plane can be understood as the absolute value of the physical parallax d′_N in equations 4 and 5 above; for the relation between the (pixel) parallax and the physical parallax, refer to equation 8 above. Accordingly, the parallax of pixel points can be adjusted by adjusting the display pitch, on the imaging plane, of a corresponding pair of pixel points (one pixel point located in the left-eye viewing image and the other in the right-eye viewing image).
Specifically, step 430 includes: acquiring a first display distance between a first pixel point on an imaging surface and a second pixel point on the imaging surface, wherein the first pixel point is a pixel point in a first attention pixel area of a left eye watching image, and the second pixel point is a pixel point corresponding to the first pixel point in a second attention pixel area of a right eye watching image; and adjusting the first display interval to a second display interval according to the convergence conflict of the corresponding pixel points in the target pixel area, wherein the second display interval is smaller than the first display interval.
It is understood that the first pixel point in the left-eye viewing image and the second pixel point in the right-eye viewing image are the same point representing the actual physical space, in other words, the first pixel point in the left-eye viewing image and the second pixel point in the right-eye viewing image are a corresponding pair of pixel points.
In playing a 3D video (e.g., a 3D movie or 3D television program), the initial display positions on the imaging plane of the left-eye viewing image and the right-eye viewing image in each frame of the binocular image are preset; based on the initial display positions, the display pitch between two corresponding pixel points representing the same point in actual physical space in the left-eye and right-eye viewing images can be determined. In this application, the display pitch on the imaging plane between corresponding pixel points in the left-eye and right-eye viewing images when displayed according to the initial display positions is referred to as the first display pitch.
It will be understood that adjusting the first display pitch to the second display pitch corresponds to adjusting the display positions of the target pixel region in the left-eye viewing image and the target pixel region in the right-eye viewing image on the imaging plane, so that the absolute value of the parallax between the corresponding pixel points in the target pixel region is reduced and, correspondingly, the convergence conflict of the corresponding pixel points in the target pixel region is also reduced in the case of displaying according to the adjusted display positions.
Correspondingly, after the first display interval is adjusted to the second display interval, the target display position information of the binocular image on the imaging surface can be determined according to the initial display position information of the binocular image on the imaging surface and the second display interval; and displaying the binocular image on the imaging plane according to the target display position information.
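The adjustment of the first display pitch to the second display pitch can be sketched as a symmetric horizontal shift of the two viewing images; the coordinate values and the half-shift strategy are illustrative assumptions, not prescribed by the application.

```python
# Sketch of adjusting the first display pitch to the second display pitch:
# the two viewing images are shifted toward each other symmetrically, so the
# display pitch between a corresponding pixel pair shrinks while the pair's
# midpoint on the screen is preserved. Coordinates are horizontal screen
# positions in millimeters.
def adjust_display_positions(left_x, right_x, first_pitch, second_pitch):
    shift = (first_pitch - second_pitch) / 2.0
    return left_x + shift, right_x - shift

left_x, right_x = 100.0, 160.0  # initial display positions of the pixel pair
print(adjust_display_positions(left_x, right_x, 60.0, 20.0))  # (120.0, 140.0)
```

After the shift, the pitch between the pair drops from 60 mm to 20 mm, which by equation 5 lowers the convergence conflict proportionally.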
The initial display position information of the binocular image on the imaging plane is used to indicate the display positions on the imaging plane initially set for the left eye viewing image and the right eye viewing image in the binocular image. As described above, if the binocular image is displayed on the imaging plane in accordance with the initial display position information, the convergence conflict perceived by the viewer for the target pixel area exceeds the set threshold value during viewing of the binocular image.
The target display position information is used for indicating target display positions of the left eye viewing image and the right eye viewing image in the binocular image on the imaging surface, and then the left eye viewing image and the right eye viewing image in the binocular image can be displayed on corresponding target display positions on the imaging surface according to the target display position information.
The first display interval is adjusted to be the second display interval, which is equivalent to adjusting the display position of at least one of the left eye viewing image and the right eye viewing image in the binocular image on the imaging surface, so that the display interval between the corresponding pixel point pairs in the left eye viewing image and the right eye viewing image is reduced when the display is performed according to the adjusted display position.
It can be understood that, when the binocular image is displayed at the target display position in the imaging plane, the convergence conflict is felt to be smaller when the viewer views the target pixel region in the binocular image, compared with the case where the binocular image is displayed at the initial display position in the imaging plane, so that the situation that the viewer is tired in eyes or uncomfortable in physiology during viewing is avoided.
In some embodiments, in order to both preserve the stereoscopic effect presented to the viewer when displaying the binocular image and reduce the viewer's eye fatigue and physiological discomfort, a comfortable convergence conflict interval may be set that keeps the viewer's eyes comfortable while still presenting a good stereoscopic effect. Based on this, in step 430, parallax adjustment may be performed on the target pixel region according to the comfortable convergence conflict interval, so that the convergence conflict of the corresponding pixel points in the adjusted target pixel region falls within the comfortable convergence conflict interval. It is understood that any convergence conflict within the comfortable interval is less than the conflict threshold.
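One way to land the conflict inside a comfortable convergence conflict interval is to invert equation 9 and rescale the parallax, as sketched below with illustrative values; the interval bounds and the scaling strategy are assumptions, not the application's prescribed method.

```python
# Sketch of pulling the convergence conflict of a convergent display into a
# comfort interval [lo, hi] (diopters) by rescaling the parallax. Inverts
# equation 9: d_N = conflict * ppi * E * L / 25400 (E, L in millimeters).
def clamp_parallax(d_N, ppi, E_mm, L_mm, lo, hi):
    conflict = 25400.0 * abs(d_N) / (ppi * E_mm * L_mm)
    if conflict <= hi:
        return d_N                    # already comfortable: leave unchanged
    return d_N * (hi / conflict)      # rescale to the interval's upper end

# Illustrative: 400 px parallax gives 0.8 D; clamp into [0.2 D, 0.4 D].
print(clamp_parallax(400, 100.0, 63.5, 2000.0, 0.2, 0.4))  # 200.0 -> 0.4 D
```

Keeping the adjusted conflict at the upper end of the interval retains as much of the stereoscopic effect as the comfort constraint allows.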
During viewing, the viewer gazes at the pixel region of interest for a relatively long time; when the convergence conflict of pixels in the pixel region of interest is large, the longer the gaze lasts, the stronger the eye fatigue and the more obvious the physiological discomfort. Therefore, in the scheme of the present application, parallax adjustment is performed on the target pixel region, within the viewer's pixel region of interest in the binocular image, in which the convergence conflict exceeds the preset threshold, so as to reduce the convergence conflict of the corresponding pixel points in the target pixel region. Thus, even if the viewer gazes at the target pixel region for a long time, the fatigue or physiological discomfort of the viewer's eyes can be effectively reduced, improving the viewer's stereoscopic viewing experience.
In some embodiments, no parallax adjustment is performed on pixel regions of the binocular image other than the target pixel region, so that the stereoscopic effect of those regions is maintained; since the other regions are not normally viewed for a long time, even if some of them have a large convergence conflict, they do not noticeably fatigue the viewer's eyes. In this case, the convergence conflict of the pixel region of interest is kept within the comfort range of the viewer's eyes, while the regions the viewer does not gaze at for long retain a stronger out-of-screen and into-screen effect, so that the viewer perceives an obvious 3D effect and the immersion of the viewing experience is improved.
In general, 3D movies and the like are photographed and produced by a professional content production team, and a post-production person can fully consider the comfort level of eyes during long-term viewing, and in the post-processing process, the convergence conflict of each frame of picture in the 3D movie can be controlled at a lower level, and the convergence conflict is ensured to be larger only in a small number of pictures for representing 3D shocking feeling, so as to ensure a larger screen entering and exiting effect. However, this way of ensuring the viewing experience for the viewer is difficult to implement due to the need for specialized post-processor participation. The scheme provided by the application does not need special later-stage personnel to participate, has low implementation difficulty, and can be widely applied to various stereoscopic display equipment in the process of presenting stereoscopic images so as to improve stereoscopic viewing experience of viewers.
In other embodiments, step 410 may include: according to the historical gazing information of the viewer, determining a historical gazing area of the viewer on an imaging surface; and determining a pixel region of interest of the viewer on the binocular image according to the historical gazing region.
In general, in the process of continuously playing pictures, multi-frame images continuously played in a short time have continuity in content, and typically, no picture abrupt change occurs. In this case, the area at which the viewer looks on the imaging surface (e.g., screen) is relatively fixed in a short time (e.g., 0.5 seconds, 1 second, 1.5 seconds, 2 seconds, 2.3 seconds, 2.4 seconds, 3 seconds, etc.). Based on this, the region of interest pixel of the viewer on the binocular image to be played can be estimated using the region of the viewer's gaze on the imaging plane for the historical period of time.
The history gaze information is used to indicate a case where the viewer gazes at each pixel point on the imaging surface during the past period of time, for example, the history gaze information may be used to indicate a probability that the viewer gazes at each pixel point on the imaging surface during the past period of time.
In a specific embodiment, eye movement tracking can be performed on eyes of a viewer in a historical time period, and the probability that the line of sight of the viewer stays at each pixel point on an imaging surface in the historical time period is determined based on the eye movement data of the viewer, so that historical gazing information is obtained.
The historical viewing area refers to an area in which a viewer views on an imaging surface in a historical period, or an area in which the viewing probability of the viewer on the imaging surface in the historical period is higher than a probability threshold.
In the above embodiment, the viewer's pixel region of interest on the binocular image to be displayed is determined from the viewer's historical gaze information. Since the picture content is continuous during viewing and the viewer's region of attention in continuously played pictures remains substantially unchanged over a short time, the historical gaze information allows the pixel region of interest on the binocular image to be displayed to be estimated accurately. This ensures the accuracy of the determined pixel region of interest and, in turn, the effectiveness of parallax adjustment based on the convergence conflict of pixels in the target pixel region.
In some embodiments, the historical gaze information may indicate a probability that the viewer gazes at various pixels on the imaging surface over a historical period of time; in this case, the historical gaze area of the viewer on the imaging plane may be determined by the following procedure: and determining the reference fixation probability corresponding to each pixel point on the imaging surface according to the probability of the fixation of the viewer to the pixel point in the historical time period for each pixel point on the imaging surface. And taking the pixel point set with the reference fixation probability not lower than the probability threshold value on the imaging surface as the historical fixation area of the viewer on the imaging surface.
Let P_{x,y|r} represent the probability that the viewer's eyes gaze at the pixel point (x, y) on the imaging plane at time point r. In a specific embodiment, for each frame displayed on the imaging plane within the history time period Δt, the probability that the viewer's eyes gaze at each pixel point on the imaging plane at the corresponding time point can be counted. If n frames of binocular images are displayed in total within the history time period Δt, the average probability that the viewer gazes at each pixel point on the imaging plane within Δt is calculated from the n probabilities counted for that pixel point, namely:

P̄_{x,y} = (1/n) · Σ_{i=1}^{n} P_{x,y|r_i}

wherein r_i denotes the time point at which the i-th frame is displayed.
The average probability calculated for a pixel point may then be used as that pixel point's reference gaze probability. In other embodiments, the value at a set percentile of the plurality of historical gaze probabilities counted for a pixel point may instead be taken as its reference gaze probability, where the percentile may be set according to actual needs, for example 55%, 60%, 75%, or 77%.
Thereafter, the historical gaze region S may be determined according to the following formula:

S = { (x, y) | P̄_{x,y} ≥ P_th, 0 ≤ y ≤ H }

that is, S is the set of pixel points whose reference gaze probability P̄_{x,y} is not lower than the probability threshold P_th, where H is the height of the imaging plane in the vertical direction.
The probability threshold may be set according to actual needs, for example, the probability threshold is 70%, 75%, 77%, 80%, 83%, 85%, 88%, etc.
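As a minimal illustration, the two steps above — computing a per-pixel reference gaze probability (as an average, or a set percentile, over the n frames) and thresholding it into the historical gaze region — can be sketched as follows. The array shapes, the function name, and the default threshold are assumptions made for the sketch, not values fixed by the embodiment:

```python
import numpy as np

def historical_gaze_region(gaze_probs, threshold=0.75, percentile=None):
    """Determine the historical gaze region on the imaging plane.

    gaze_probs: array of shape (n, H, W) -- per-frame probability that the
        viewer gazed at each pixel point during the historical period.
    Returns a boolean (H, W) mask: True where the reference gaze
    probability is not lower than the probability threshold.
    """
    if percentile is None:
        # Reference gaze probability as the average over the n frames.
        reference = gaze_probs.mean(axis=0)
    else:
        # Alternatively, use a set percentile (e.g. 55, 60, 75, 77).
        reference = np.percentile(gaze_probs, percentile, axis=0)
    return reference >= threshold

# Example: 3 frames of a 4x4 imaging plane; the viewer mostly gazed
# at the centre 2x2 block.
probs = np.zeros((3, 4, 4))
probs[:, 1:3, 1:3] = 0.9
region = historical_gaze_region(probs, threshold=0.75)
```

Pixels outside the gazed block never reach the threshold, so the resulting mask covers exactly the centre block in this toy example.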
In other embodiments, the historical gaze region on the imaging plane may be converted into the same coordinate system as the binocular image. For example, the binocular image may be converted into the screen coordinate system of the imaging plane based on the initial display position set for the binocular image; the region of the binocular image that coincides with the historical gaze region is then determined on the imaging plane and taken as the viewer's pixel region of interest in the binocular image. Of course, in other embodiments, the historical gaze region on the imaging plane may instead be converted into the pixel coordinate system of the binocular image, after which the region of the binocular image that coincides with the historical gaze region is taken as the pixel region of interest.
In other embodiments, determining the viewer's pixel region of interest on the binocular image from the historical gaze region includes: determining the intersection region of each sub-pixel region in the binocular image with the historical gaze region; and selecting, from the sub-pixel regions that have an intersection region with the historical gaze region, at least one sub-pixel region as the viewer's pixel region of interest.
Specifically, the binocular image may be segmented into a plurality of sub-pixel regions. Semantic segmentation may be performed on the left-eye viewing image and the right-eye viewing image, taking the pixel region where each object is located as a sub-pixel region. For example, if the left-eye viewing image contains a tree, a dog, and a house, semantic segmentation can divide it into the pixel region where the tree is located, the pixel region where the dog is located, and the pixel region where the house is located, each of which is one sub-pixel region of the left-eye viewing image. It will be appreciated that the sub-pixel regions obtained by semantically segmenting the left-eye viewing image correspond to those obtained by semantically segmenting the right-eye viewing image.
In other embodiments, the left-eye viewing image and the right-eye viewing image may instead be segmented based on image parallax; for example, the left-eye viewing image may be divided into a far-view pixel region, a middle-view pixel region, a near-view pixel region, and the like.
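Such parallax-based segmentation can be sketched, for instance, by binning a per-pixel disparity map into depth bands; the function name and the threshold values below are illustrative assumptions, not values fixed by the embodiment:

```python
import numpy as np

def segment_by_disparity(disparity_map, near_thresh, far_thresh):
    """Split an image into near/middle/far sub-pixel regions by per-pixel
    disparity; in this sketch a larger disparity means a nearer virtual
    image. Returns one boolean mask per depth band."""
    near = disparity_map >= near_thresh
    far = disparity_map <= far_thresh
    middle = ~(near | far)
    return {"near": near, "middle": middle, "far": far}

# Example disparity map (arbitrary illustrative values).
dmap = np.array([[0.0, 0.5],
                 [1.0, 0.2]])
regions = segment_by_disparity(dmap, near_thresh=0.8, far_thresh=0.1)
```

The three masks partition the image, so every pixel belongs to exactly one sub-pixel region.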
Specifically, the sub-pixel regions of the binocular image and the historical gaze region may be converted into the same coordinate system in order to determine their intersection regions. For example, each sub-pixel region may be converted into the screen coordinate system of the imaging plane according to the initial display position of the binocular image on the imaging plane; the intersection region between each sub-pixel region and the historical gaze region is then determined in that common coordinate system.
In some embodiments, the sub-pixel regions serving as the pixel region of interest may be selected according to the area of the intersection region or the number of pixel points in it. Specifically, the first N sub-pixel regions with the largest intersection area with the historical gaze region may be taken as the pixel region of interest, where N is a positive integer; the specific value of N may be set according to actual needs, for example 1, 2, or 3, and is not limited here.
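The top-N selection just described can be sketched as follows, with the sub-pixel regions and the historical gaze region represented as boolean masks already in a common coordinate system; the function name and the example layout are assumptions for illustration:

```python
import numpy as np

def select_pixel_regions_of_interest(subregion_masks, gaze_region, top_n=1):
    """Rank sub-pixel regions by the pixel count of their intersection
    with the historical gaze region and keep the top_n indices that have
    a non-empty overlap."""
    areas = [np.logical_and(m, gaze_region).sum() for m in subregion_masks]
    ranked = sorted(range(len(areas)), key=lambda i: areas[i], reverse=True)
    return [i for i in ranked if areas[i] > 0][:top_n]

# Example on a 4x4 imaging plane: region 0 = top half, region 1 = bottom half.
masks = [np.zeros((4, 4), dtype=bool) for _ in range(2)]
masks[0][:2, :] = True
masks[1][2:, :] = True
gaze = np.zeros((4, 4), dtype=bool)
gaze[1:, :] = True                 # the gaze region covers rows 1 to 3
selected = select_pixel_regions_of_interest(masks, gaze, top_n=1)
```

Here region 1 overlaps the gaze region over two rows versus one row for region 0, so it is selected first.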
Based on the above embodiments, the viewer's pixel region of interest in the binocular image is determined by combining the viewer's historical gaze region with the image content of the binocular image. In some scenarios, the pixel area of the binocular image that coincides with the viewer's historical gaze region may be only part of the pixel area representing an object, such as the pixel area where a character's head is located. In that case, if only part of an object's pixel area were determined as the pixel region of interest, subsequent parallax adjustment would enlarge the difference between the other pixel areas of the object and the adjusted part; that is, the depth discontinuity presented within the same object would grow, greatly degrading the viewer's viewing experience.
With the scheme of this embodiment, the content of the binocular image can be segmented at object granularity (or coarser or finer), yielding a plurality of sub-pixel regions, and at least one sub-pixel region is selected as the pixel region of interest based on its intersection with the historical gaze region. A sub-pixel region is therefore either determined as the pixel region of interest in its entirety or not at all; no partial sub-pixel region is ever selected. This effectively avoids the above problem and safeguards the user's viewing experience.
In other embodiments, while watching continuous pictures, the objects the viewer attends to may not change much over a short time. Based on this, the object of interest in the historically presented binocular images may be determined from the viewer's historical gaze region; the object of interest is then recognized and located in the binocular image to be displayed, and the pixel area where it is located is taken as the viewer's pixel region of interest in that binocular image.
Fig. 7 is a flowchart of a parallax adjustment method according to another embodiment of the present application; steps 710 to 720 in fig. 7 may be performed by the stereoscopic display device 220 in fig. 2 or the control device 230 in fig. 3. As shown in fig. 7, the method specifically includes:
Step 710: vergence conflict calculation. Specifically, according to the viewer's interpupillary distance, viewing distance, and historical gaze information detected by the eye tracking device 210, together with the display parameters of the stereoscopic display unit, the viewer's pixel region of interest in the binocular image is determined first, and the vergence conflict of the corresponding pixel points in that region is then calculated. For the specific determination of the pixel region of interest and the calculation of the vergence conflict, refer to the description above; details are not repeated here.
Step 720: parallax adjustment. That is, parallax adjustment is performed on any target pixel region in the pixel region of interest whose vergence conflict exceeds the conflict threshold, so that the vergence conflict of the corresponding pixel points in the adjusted target pixel region no longer exceeds the conflict threshold. Thereafter, the parallax-adjusted binocular image may be displayed on the stereoscopic display unit.
With the scheme of this embodiment, the parallax of binocular images can be optimized in real time according to the display parameters of different stereoscopic display units and the viewer's state during viewing (such as historical gaze information and viewing distance), keeping the vergence conflict of the corresponding pixel points in each viewer's pixel region of interest within the conflict threshold. This avoids eye fatigue or physiological discomfort for each viewer and improves each viewer's viewing experience.
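As a worked sketch of steps 710 and 720, one standard similar-triangles model of stereoscopic displays relates the on-screen disparity p, the interpupillary distance e, and the viewing distance D to the perceived virtual-image distance Z = e·D/(e − p), and measures the vergence–accommodation conflict as the dioptre difference |1/D − 1/Z|, which simplifies to |p|/(e·D). This is a common model from the stereoscopic-display literature, not necessarily the exact formula of the embodiment, and the function names and dioptre threshold below are assumptions:

```python
import numpy as np

def virtual_image_distance(disparity, ipd, viewing_distance):
    """Perceived virtual-image distance Z = e*D/(e - p), with disparity p
    and interpupillary distance e in the same physical units on the
    imaging plane, and viewing distance D from the eyes to the plane."""
    return ipd * viewing_distance / (ipd - disparity)

def vergence_conflict(disparity, ipd, viewing_distance):
    """Vergence-accommodation conflict in dioptres: |1/D - 1/Z|.
    Algebraically this reduces to |p| / (e*D)."""
    z = virtual_image_distance(disparity, ipd, viewing_distance)
    return abs(1.0 / viewing_distance - 1.0 / z)

def adjust_disparity(disparity, ipd, viewing_distance, conflict_threshold):
    """Step 720 sketch: clamp the disparity magnitude so the conflict no
    longer exceeds the threshold. Because conflict = |p|/(e*D), the
    largest admissible magnitude is p_max = threshold * e * D."""
    p_max = conflict_threshold * ipd * viewing_distance
    return float(np.clip(disparity, -p_max, p_max))

# Example: 63 mm interpupillary distance, 2 m viewing distance,
# 20 mm on-screen disparity, 0.1 dioptre conflict threshold.
before = vergence_conflict(0.02, 0.063, 2.0)     # ~0.159 dioptres
p_adj = adjust_disparity(0.02, 0.063, 2.0, 0.1)  # clamped to ~0.0126 m
after = vergence_conflict(p_adj, 0.063, 2.0)     # back within the threshold
```

Zero disparity places the virtual image on the imaging plane (zero conflict), so shrinking the disparity magnitude toward zero always reduces the conflict in this model.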
The following describes apparatus embodiments of the present application that may be used to perform the methods of the above-described embodiments of the present application. For details not disclosed in the device embodiments of the present application, please refer to the method embodiments described above in the present application.
Fig. 8 is a block diagram of a parallax adjustment apparatus 800 according to an embodiment of the present application, which may be configured in the stereoscopic display device 220 shown in fig. 2 or in the control device 230 shown in fig. 3, to perform the parallax adjustment method provided in the present application. As shown in fig. 8, the parallax adjustment apparatus 800 includes: a pixel-of-interest region determining module 810, configured to determine a viewer's pixel region of interest in a binocular image, the binocular image including a left-eye viewing image and a right-eye viewing image, and the pixel region of interest including a first pixel region of interest of the viewer in the left-eye viewing image and a second pixel region of interest in the right-eye viewing image; a convergence conflict determination module 820, configured to determine the convergence conflict of the corresponding pixel points in the pixel region of interest according to the parallax between the corresponding pixel points in the first and second pixel regions of interest; and a parallax adjustment module 830, configured to perform parallax adjustment on a target pixel region in the pixel region of interest whose convergence conflict exceeds the conflict threshold, if such a region exists, so as to obtain a parallax-adjusted binocular image.
In some embodiments, the pixel-of-interest region determination module 810 is configured to: determine, according to the viewer's historical gaze information, the viewer's historical gaze region on the imaging surface; and determine the viewer's pixel region of interest on the binocular image according to the historical gaze region.
In some embodiments, the historical gaze information indicates a probability that the viewer gazes at each pixel point on the imaging surface over a historical period of time; in the present embodiment, the pixel-of-interest region determination module 810 is further configured to: for each pixel point on the imaging surface, determining the reference fixation probability corresponding to each pixel point on the imaging surface according to the probability of the fixation of the viewer to the pixel point in the historical time period; and taking the pixel point set with the reference fixation probability not lower than the probability threshold value on the imaging surface as the historical fixation area of the viewer on the imaging surface.
In some embodiments, the pixel-of-interest region determination module 810 is further configured to: determine the intersection region of each sub-pixel region in the binocular image with the historical gaze region, and select, from the sub-pixel regions that have an intersection region with the historical gaze region, at least one sub-pixel region as the viewer's pixel region of interest.
In some embodiments, the convergence conflict determination module 820 is configured to: acquire the interpupillary distance of the viewer, the viewing distance of the viewer relative to the imaging surface, and the display parameters; and determine the convergence conflict of each pixel point in the pixel region of interest according to the parallax between the corresponding pixel points in the first and second pixel regions of interest, the viewer's interpupillary distance, the viewer's viewing distance relative to the imaging surface, and the display parameters.
In some embodiments, the vergence conflict determination module 820 is further configured to: determining a virtual image distance perceived by a viewer for a corresponding pixel point in the pixel region of interest according to parallax between the corresponding pixel points in the first pixel region of interest and the second pixel region of interest, pupil distance of the viewer, viewing distance of the viewer relative to the imaging plane and display parameters; and determining the convergence conflict of the corresponding pixel point in the pixel region of interest according to the virtual image distance perceived by the viewer for the corresponding pixel point in the pixel region of interest and the viewing distance of the viewer relative to the imaging surface.
In some embodiments, the parallax adjustment device further comprises: a history viewing distance acquisition module for acquiring a plurality of history viewing distances of a viewer with respect to an imaging plane in a history period; and the viewing distance determining module is used for determining the viewing distance of the viewer relative to the imaging surface according to the plurality of historical viewing distances.
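For instance, the viewing distance can be derived from the plurality of historical viewing distances by a robust statistic such as the median; averaging is another option, and the embodiment does not fix a particular rule, so the function below is an illustrative assumption:

```python
def viewing_distance_from_history(historical_distances):
    """Derive a stable viewing distance from several historical
    eye-to-imaging-plane distances; the median suppresses outliers
    caused by occasional eye-tracking glitches."""
    s = sorted(historical_distances)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else 0.5 * (s[mid - 1] + s[mid])

# Example: one spurious 5 m reading among measurements near 2 m.
d = viewing_distance_from_history([2.0, 2.1, 5.0, 2.05])
```

The spurious reading barely moves the median, whereas a plain average would be pulled well above the true distance.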
In some embodiments, the imaging surface is a real image imaging surface of a real image display device, and the viewing distance of the viewer with respect to the imaging surface is the distance of the eyes of the viewer from the real image imaging surface.
In other embodiments, the imaging plane is a virtual image imaging plane of a virtual image display device, and the viewing distance of the viewer with respect to the imaging plane is the distance of the eyes of the viewer from the virtual image imaging plane.
In some embodiments, the parallax adjustment module is configured to: acquire a first display distance between a first pixel point and a second pixel point on the imaging surface, where the first pixel point is a pixel point in the first pixel region of interest in the left-eye viewing image, and the second pixel point is the pixel point corresponding to the first pixel point in the second pixel region of interest in the right-eye viewing image; and adjust the first display distance to a second display distance according to the convergence conflict of the corresponding pixel points in the target pixel region, the second display distance being smaller than the first display distance.
In some embodiments, the parallax adjustment device further comprises: the target display position information determining module is used for determining target display position information of the binocular image on the imaging surface according to the initial display position information and the second display distance of the binocular image on the imaging surface; and the display module is used for displaying the binocular image on the imaging surface according to the target display position information.
It should be understood that the parallax adjustment device of the embodiments of the present application may correspond to performing the parallax adjustment method described in the embodiments of the present application, and the operations and/or functions of each module in the parallax adjustment device are respectively for implementing the corresponding flow of each method in the embodiments of the methods, which are not described herein for brevity.
The present application also provides a display device, which may be the stereoscopic display device shown in fig. 2, including a memory, a processor, and a display unit, where the memory is configured to store computer readable instructions, and when the computer readable instructions in the memory are executed by the processor, the stereoscopic display device may implement the parallax adjustment method of the present application. The display unit may be used to display the binocular image after parallax adjustment.
In some embodiments, an eye tracking device may further be integrated into the display device, so that during display of the binocular image the display device can perform eye tracking on the viewer and determine, based on the viewer's eye data, the viewer's pixel region of interest in the binocular image to be displayed.
The present application also provides a computing device, which may be the control device 230 in the parallax adjustment system of fig. 3, or the stereoscopic display device 220 of fig. 2. Fig. 9 is a schematic diagram of a computing device according to an embodiment of the present application. As shown in fig. 9, computing device 900 includes a processor 910, a bus 920, a memory 930, a memory unit 950 (also referred to as a main memory unit), and a communication interface 940. Processor 910, memory 930, memory unit 950, and communication interface 940 are connected by bus 920.
It is to be appreciated that in this embodiment the processor 910 may be a CPU, or may be another general-purpose processor, a DSP, an ASIC, an FPGA or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor or any conventional processor.
Communication interface 940 is used to enable communication of computing device 900 with external devices or appliances. In this embodiment, the communication interface 940 may be used to communicate with the eye tracking device 210.
Bus 920 may include a path for transferring information between components such as processor 910, memory unit 950, and memory 930. In addition to a data bus, bus 920 may include a power bus, a control bus, a status signal bus, and the like; for clarity of illustration, however, the various buses are all labeled as bus 920 in the figure. Bus 920 may be a peripheral component interconnect express (PCIe) bus, an extended industry standard architecture (EISA) bus, a unified bus (Ubus or UB), a compute express link (CXL), a cache coherent interconnect for accelerators (CCIX), or the like.
As one example, computing device 900 may include multiple processors, each of which may be a multi-core (multi-CPU) processor. A processor here refers to one or more devices, circuits, and/or computing units for processing data (e.g., computer program instructions). The memory 930 may be a solid state disk or a mechanical hard disk and may be used to store computer-readable instructions; the processor 910 may invoke the computer-readable instructions stored in the memory 930 to implement the parallax adjustment method provided in the present application.
It should be noted that fig. 9 takes, as an example, a computing device 900 that includes one processor 910 and one memory 930; the processor 910 and the memory 930 each indicate a class of device or apparatus, and in a specific embodiment the number of each class of device or apparatus may be determined according to service requirements.
The memory unit 950 may be volatile memory or nonvolatile memory, or may include both. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be random access memory (RAM), which acts as an external cache. By way of example and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct rambus RAM (DR RAM). In this application, the memory unit 950 may be used to store data such as the binocular images to be played.
It should be appreciated that the computing device 900 described above may be a DPU. The computing device 900 according to the present embodiment may correspond to the parallax adjustment device 800 in the present embodiment, and may correspond to the above and other operations and/or functions of the respective modules in the parallax adjustment device 800, which are not described herein for brevity.
The method steps in this embodiment may be implemented by hardware, or by a processor executing software instructions. The software instructions may consist of corresponding software modules, which may be stored in random access memory (RAM), flash memory, read-only memory (ROM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. Alternatively, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC, and the ASIC may reside in a computing device. The processor and the storage medium may also reside as discrete components in a network device or terminal device.
In the above embodiments, it may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product.
The computer program product includes one or more computer programs or instructions. When the computer programs or instructions are loaded and executed on a computing device, the processes or functions of the embodiments of the present application are performed in whole or in part. The computing device may be a general-purpose computer, a special-purpose computer, a computer network, a network device, a user device, or another programmable apparatus. The computer programs or instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another, for example from one website, computer, server, or data center to another by wired or wireless means. The computer-readable storage medium may be any usable medium accessible to a computer, or a data storage device such as a server or data center integrating one or more usable media. The usable medium may be a magnetic medium such as a floppy disk, hard disk, or magnetic tape; an optical medium such as a digital video disc (DVD); or a semiconductor medium such as a solid state drive (SSD).
As another aspect, the present application also provides a computer-readable storage medium that may be contained in the computing device described in the above embodiments; or may exist alone without being assembled into the computing device. The computer-readable storage medium described above carries computer-readable instructions that, when executed by a processor, implement the parallax adjustment method in any of the above embodiments.
It will be appreciated that in order to implement the functionality of the above-described embodiments, the computing device includes corresponding hardware structures and/or software modules that perform the various functions. Those of skill in the art will readily appreciate that the elements and method steps of the examples described in connection with the embodiments disclosed herein may be implemented as hardware or a combination of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application scenario and design constraints imposed on the solution.
In the embodiments of the present application, words such as "exemplary" or "such as" are used to mean serving as examples, illustrations, or descriptions. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In the description of the embodiments of the present application, unless otherwise indicated, the meaning of "a plurality" means two or more. For example, the plurality of processing units refers to two or more processing units; the plurality of systems means two or more systems.
In the embodiment of the present application, the term "and/or" is merely an association relationship describing the association object, which indicates that three relationships may exist, for example, a and/or B may indicate: a exists alone, A and B exist together, and B exists alone.
The terms first and second and the like in the description and in the claims of embodiments of the present application are used for distinguishing between different objects and not necessarily for describing a particular sequential order of objects. For example, the first reference resistor and the second reference resistor, etc., are used to distinguish between different reference resistors, and are not used to describe a particular order of reference resistors.
The foregoing is merely a specific embodiment of the present application, but the protection scope of the present application is not limited thereto, and any equivalent modifications or substitutions will be apparent to those skilled in the art within the scope of the present application, and these modifications or substitutions should be covered in the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (25)

1. A parallax adjustment method, characterized by comprising:
determining a pixel region of interest of a viewer in a binocular image, the binocular image comprising a left eye viewing image and a right eye viewing image, the pixel region of interest comprising a first pixel region of interest of the viewer in the left eye viewing image and a second pixel region of interest of the viewer in the right eye viewing image;
determining a convergence conflict of the corresponding pixel points in the pixel region of interest according to the parallax between the corresponding pixel points in the first pixel region of interest and the second pixel region of interest;
and if a target pixel region whose convergence conflict exceeds a conflict threshold exists in the pixel region of interest, performing parallax adjustment on the target pixel region.
2. The method of claim 1, wherein the determining a pixel region of interest of a viewer in a binocular image comprises:
according to the historical gazing information of the viewer, determining a historical gazing area of the viewer on an imaging surface;
and determining a pixel region of interest of the viewer on the binocular image according to the historical gazing region.
3. The method of claim 2, wherein the historical gaze information indicates a probability that the viewer gazes at each pixel point on the imaging surface over a historical period of time;
The determining the historical gazing area of the viewer on the imaging surface according to the historical gazing information of the viewer comprises the following steps:
for each pixel point on the imaging surface, determining a reference fixation probability corresponding to each pixel point on the imaging surface according to the probability that the viewer fixates the pixel point in a historical time period;
and taking the pixel point set with the reference fixation probability not lower than a probability threshold value on the imaging surface as a historical fixation area of the viewer on the imaging surface.
4. A method according to claim 2 or 3, wherein said determining a region of interest pixels of the viewer on the binocular image from the historical gaze region comprises:
determining intersection areas of each sub-pixel area in the binocular image and the historical gazing area;
and selecting, from the sub-pixel regions having an intersection region with the historical gaze region, at least one sub-pixel region as the pixel region of interest of the viewer.
5. The method of claim 1, wherein the determining a vergence conflict for a corresponding pixel in the pixel of interest area based on a parallax between the corresponding pixels in the first pixel of interest area and the second pixel of interest area comprises:
Acquiring the interpupillary distance of the viewer, the viewing distance of the viewer relative to an imaging surface and display parameters;
and determining the convergence conflict of the corresponding pixel points in the pixel region of interest according to the parallax between the corresponding pixel points in the first pixel region of interest and the second pixel region of interest, the pupil distance of the viewer, the viewing distance of the viewer relative to the imaging surface and the display parameter.
6. The method of claim 5, wherein the determining the vergence conflict of the corresponding pixel point in the pixel of interest area according to the parallax between the corresponding pixel points in the first pixel of interest area and the second pixel of interest area, the interpupillary distance of the viewer, the viewing distance of the viewer with respect to the imaging plane, and the display parameters, comprises:
determining a virtual image distance perceived by the viewer for a corresponding pixel point in the pixel region of interest according to parallax between the corresponding pixel points in the first pixel region of interest and the second pixel region of interest, pupil distance of the viewer, viewing distance of the viewer relative to an imaging plane and display parameters;
And determining the convergence conflict of the corresponding pixel point in the concerned pixel area according to the virtual image distance perceived by the observer for the corresponding pixel point in the concerned pixel area and the watching distance of the observer relative to the imaging surface.
7. The method according to claim 5 or 6, characterized in that the method further comprises:
acquiring a plurality of historical viewing distances of the viewer relative to the imaging surface over a historical period of time;
and determining the viewing distance of the viewer relative to the imaging surface according to the historical viewing distances.
8. The method of claim 5 or 6, wherein the imaging surface is a real image imaging surface of a real image display device, and the viewing distance of the viewer relative to the imaging surface is the distance of the eyes of the viewer from the real image imaging surface.
9. A method as recited in claim 5 or 6, wherein the imaging plane is a virtual image imaging plane of a virtual image display device, and the viewing distance of the viewer relative to the imaging plane is the distance of the viewer's eyes from the virtual image imaging plane.
10. The method according to any one of claims 1 to 3, 5 and 6, wherein, if a target pixel region whose convergence conflict exceeds a conflict threshold exists in the pixel region of interest, the performing parallax adjustment on the target pixel region comprises:
Acquiring a first display distance between a first pixel point on an imaging surface and a second pixel point on the imaging surface, wherein the first pixel point is a pixel point in a first attention pixel area in the left eye viewing image, and the second pixel point is a pixel point corresponding to the first pixel point in a second attention pixel area in the right eye viewing image;
and adjusting the first display interval to a second display interval according to the convergence conflict of the corresponding pixel points in the target pixel area, wherein the second display interval is smaller than the first display interval.
11. The method of claim 10, wherein, after adjusting the first display distance to the second display distance according to the convergence conflict of the corresponding pixel points in the target pixel region, the method further comprises:
determining target display position information of the binocular image on the imaging surface according to initial display position information of the binocular image on the imaging surface and the second display distance;
and displaying the binocular image on the imaging plane according to the target display position information.
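Claims 10 and 11 only require that the second display distance be smaller than the first and that the binocular image be repositioned accordingly; the sketch below assumes a proportional shrink factor and a symmetric shift toward the midpoint, both of which are illustrative choices rather than the claimed rule.

```python
def reduced_display_distance(first_distance_m, conflict_d, threshold_d,
                             max_conflict_d=1.0):
    """Shrink the left/right display distance in proportion to how far the
    conflict exceeds the threshold, never below half (illustrative rule)."""
    excess = min(max(conflict_d - threshold_d, 0.0), max_conflict_d)
    return first_distance_m * (1.0 - 0.5 * excess / max_conflict_d)


def target_positions(left_x_m, right_x_m, second_distance_m):
    """Move both pixel points toward their midpoint so that their separation
    equals the new display distance (claim 11's target display positions;
    assumes left_x_m < right_x_m)."""
    mid = 0.5 * (left_x_m + right_x_m)
    return mid - 0.5 * second_distance_m, mid + 0.5 * second_distance_m
```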
12. A parallax adjustment device, comprising:
a pixel region of interest determination module, configured to determine a pixel region of interest of a viewer in a binocular image, the binocular image including a left-eye viewing image and a right-eye viewing image, the pixel region of interest including a first pixel region of interest of the viewer in the left-eye viewing image and a second pixel region of interest in the right-eye viewing image;
a convergence conflict determination module, configured to determine a convergence conflict of a corresponding pixel point in the pixel region of interest according to a parallax between the corresponding pixel points in the first pixel region of interest and the second pixel region of interest;
and a parallax adjustment module, configured to perform parallax adjustment on a target pixel region if there is, in the pixel region of interest, a target pixel region whose convergence conflict exceeds a conflict threshold.
13. The apparatus of claim 12, wherein the pixel region of interest determination module is configured to:
according to the historical gazing information of the viewer, determining a historical gazing area of the viewer on an imaging surface;
and determining a pixel region of interest of the viewer on the binocular image according to the historical gazing region.
14. The apparatus of claim 13, wherein the historical gaze information indicates a probability that the viewer gazes at each pixel point on the imaging surface over a historical period of time; the pixel region of interest determination module is further configured to:
for each pixel point on the imaging surface, determining a reference gaze probability corresponding to the pixel point according to the probability that the viewer gazes at the pixel point over the historical time period;
and taking the set of pixel points on the imaging surface whose reference gaze probability is not lower than a probability threshold as the historical gazing area of the viewer on the imaging surface.
15. The apparatus of claim 13 or 14, wherein the pixel region of interest determination module is further configured to:
determining the intersection area of each sub-pixel area in the binocular image with the historical gazing area;
and selecting, from the sub-pixel areas having a non-empty intersection with the historical gazing area, at least one sub-pixel area as the pixel region of interest of the viewer.
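The gaze-region logic of claims 13 to 15 can be sketched with boolean masks; the probability threshold, the overlap-based ranking, and the `top_k` parameter are illustrative assumptions (claim 15 only requires selecting at least one overlapping sub-pixel area):

```python
import numpy as np

def historical_gaze_region(gaze_prob, threshold=0.2):
    """Mask of pixels whose reference gaze probability over the historical
    time period is not lower than the threshold (claim 14)."""
    return np.asarray(gaze_prob, dtype=float) >= threshold


def select_regions_of_interest(subarea_masks, gaze_region, top_k=1):
    """Rank sub-pixel areas by intersection area with the gaze region and
    keep the top_k that actually overlap (the ranking rule is an assumption)."""
    overlaps = [(i, int(np.logical_and(m, gaze_region).sum()))
                for i, m in enumerate(subarea_masks)]
    overlaps = [(i, a) for i, a in overlaps if a > 0]
    overlaps.sort(key=lambda t: t[1], reverse=True)
    return [i for i, _ in overlaps[:top_k]]
```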
16. The apparatus of claim 12, wherein the convergence conflict determination module is configured to:
acquiring the interpupillary distance of the viewer, the viewing distance of the viewer relative to an imaging surface and display parameters;
and determining the convergence conflict of the corresponding pixel points in the pixel region of interest according to the parallax between the corresponding pixel points in the first pixel region of interest and the second pixel region of interest, the interpupillary distance of the viewer, the viewing distance of the viewer relative to the imaging surface, and the display parameters.
17. The apparatus of claim 16, wherein the convergence conflict determination module is further configured to:
determining a virtual image distance perceived by the viewer for the corresponding pixel point in the pixel region of interest according to the parallax between the corresponding pixel points in the first pixel region of interest and the second pixel region of interest, the interpupillary distance of the viewer, the viewing distance of the viewer relative to the imaging surface, and the display parameters;
and determining the convergence conflict of the corresponding pixel point in the pixel region of interest according to the virtual image distance perceived by the viewer for the corresponding pixel point in the pixel region of interest and the viewing distance of the viewer relative to the imaging surface.
18. The apparatus according to claim 16 or 17, wherein the parallax adjustment apparatus further comprises:
a historical viewing distance acquisition module, configured to acquire a plurality of historical viewing distances of the viewer relative to the imaging surface over a historical period;
and a viewing distance determination module, configured to determine the viewing distance of the viewer relative to the imaging surface according to the plurality of historical viewing distances.
19. The apparatus of claim 16 or 17, wherein the imaging surface is a real image imaging surface of a real image display device, and the viewing distance of the viewer with respect to the imaging surface is the distance of the eyes of the viewer from the real image imaging surface.
20. The apparatus according to claim 16 or 17, wherein the imaging surface is a virtual image imaging surface of a virtual image display device, and the viewing distance of the viewer relative to the imaging surface is the distance of the viewer's eyes from the virtual image imaging surface.
21. The apparatus of any one of claims 12 to 14, 16 and 17, wherein the binocular image comprises a left eye viewing image and a right eye viewing image; the parallax adjustment module is used for:
acquiring a first display distance between a first pixel point and a second pixel point on the imaging surface, wherein the first pixel point is a pixel point in the first pixel region of interest in the left eye viewing image, and the second pixel point is the pixel point corresponding to the first pixel point in the second pixel region of interest in the right eye viewing image;
and adjusting the first display distance to a second display distance according to the convergence conflict of the corresponding pixel points in the target pixel region, wherein the second display distance is smaller than the first display distance.
22. The apparatus of claim 21, wherein the parallax adjustment apparatus further comprises:
a target display position information determination module, configured to determine target display position information of the binocular image on the imaging surface according to initial display position information of the binocular image on the imaging surface and the second display distance;
and a display module, configured to display the binocular image on the imaging surface according to the target display position information.
23. A computing device, comprising a memory and a processor, the memory being configured to store a set of computer instructions, wherein the processor, when executing the set of computer instructions, implements the method of any one of claims 1 to 11.
24. A display device, comprising a memory, a processor, and a display unit, the memory being configured to store a set of computer instructions, wherein the processor, when executing the set of computer instructions, implements the parallax adjustment method of any one of claims 1 to 11; and the display unit is configured to display the binocular image after parallax adjustment.
25. A computer readable storage medium having computer readable instructions stored thereon, which when executed by a processor, implement the method of any of claims 1 to 11.
CN202210800791.0A 2022-07-06 2022-07-06 Parallax adjustment method, parallax adjustment device, storage medium and computing device Pending CN117412020A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210800791.0A CN117412020A (en) 2022-07-06 2022-07-06 Parallax adjustment method, parallax adjustment device, storage medium and computing device

Publications (1)

Publication Number Publication Date
CN117412020A (en) 2024-01-16

Family

ID=89489532

Country Status (1)

Country Link
CN (1) CN117412020A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118012372A (en) * 2024-04-10 2024-05-10 深圳市万国电器有限公司 Display method, device, equipment and storage medium based on artificial intelligence
CN118012372B (en) * 2024-04-10 2024-07-12 深圳市万国电器有限公司 Display method, device, equipment and storage medium based on artificial intelligence

Similar Documents

Publication Publication Date Title
JP7094266B2 (en) Single-depth tracking-accommodation-binocular accommodation solution
US8913790B2 (en) System and method for analyzing three-dimensional (3D) media content
US20210132693A1 (en) Light Field Displays Incorporating Eye Trackers and Methods for Generating Views for a Light Field Display Using Eye Tracking Information
CN108600733B (en) Naked eye 3D display method based on human eye tracking
Terzić et al. Methods for reducing visual discomfort in stereoscopic 3D: A review
Banks et al. Stereoscopy and the human visual system
CN109901710B (en) Media file processing method and device, storage medium and terminal
WO2010084716A1 (en) Image processing device, program, image processing method, recording method, and recording medium
JP2020202569A (en) Virtual eyeglass set for viewing actual scene that corrects for different location of lenses than eyes
US8692870B2 (en) Adaptive adjustment of depth cues in a stereo telepresence system
WO2019013863A1 (en) Varifocal aberration compensation for near-eye displays
US9754379B2 (en) Method and system for determining parameters of an off-axis virtual camera
US20130021458A1 (en) Apparatus and method for presenting stereoscopic video
Hwang et al. Instability of the perceived world while watching 3D stereoscopic imagery: a likely source of motion sickness symptoms
CN106303498B (en) Video display control method and device, display equipment
CN106293561B (en) Display control method and device and display equipment
WO2022267573A1 (en) Switching control method for glasses-free 3d display mode, and medium and system
JP2002223458A (en) Stereoscopic video image generator
Zhang et al. Depth of field affects perceived depth in stereographs
CN117412020A (en) Parallax adjustment method, parallax adjustment device, storage medium and computing device
CN109298793B (en) Screen position adjusting method and device
Terzic et al. Causes of discomfort in stereoscopic content: a review
JP2014053782A (en) Stereoscopic image data processor and stereoscopic image data processing method
Watt et al. 3D media and the human visual system
CN106303315B (en) Video display control method and device, display equipment

Legal Events

Date Code Title Description
PB01 Publication