WO2021146978A1 - Display system, graphics processing unit (gpu), display controller, and display method - Google Patents

Display system, graphics processing unit (GPU), display controller, and display method

Info

Publication number
WO2021146978A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
pixel
display
value
disparity
Prior art date
Application number
PCT/CN2020/073691
Other languages
French (fr)
Chinese (zh)
Inventor
朱韵鹏
Original Assignee
华为技术有限公司
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司
Priority to PCT/CN2020/073691 (WO2021146978A1)
Priority to CN202080017131.4A (CN113490963A)
Publication of WO2021146978A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/10 Geometric effects
    • G06T15/20 Perspective computation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics

Definitions

  • the embodiments of the present application relate to display systems, and in particular to a display system, a graphics processing unit (GPU), a display controller, and a display method.
  • Stereo rendering is a technology used to display two-dimensional projections of discrete three-dimensional sampled data sets. It can also be understood as a process of converting the three-dimensional light signal emitted or reflected by an object into a two-dimensional image, so that the human eye achieves the effect of viewing a three-dimensional object when viewing the two-dimensional image.
  • two two-dimensional images with parallax can be presented to the user's two eyes respectively, so that the user obtains a more realistic effect of viewing a three-dimensional object through the two two-dimensional images.
  • a graphics processing unit (GPU) can render two images with parallax and transmit the two images to the display controller; the display controller then transmits the two images to the display, so that the display can present the two images to the user's left eye and right eye respectively and the user obtains a stereoscopic viewing effect.
  • the embodiments of the present application provide a display system, a graphics processor GPU, a display controller, and a display method, which are used to reduce the operating load of the GPU while ensuring that the display controller can display two images at the same time.
  • an embodiment of the present application provides a display system, which includes a graphics processor GPU and a display controller.
  • the graphics processor GPU is used to render a first image and provide the first image to the display controller;
  • the display controller is used to obtain the first image from the GPU, obtain disparity information of the first image and a second image, determine at least a part of the second image based on at least a part of the first image and the disparity information, and send the first image and the at least a part of the second image to the display, so that the first image and the at least a part of the second image are used to present a three-dimensional effect on the display.
  • the graphics processor GPU in the display system only renders the first image, and provides the aforementioned first image to the display controller.
  • the display controller determines at least a part of the second image based on at least a part of the first image and the parallax information, and transmits at least a part of the first image and the second image to the display. Therefore, the display can present a three-dimensional effect by displaying at least a part of the first image and the second image.
  • although the display controller transmits two images to the display, the GPU only renders the first image and does not render the second image, and the display controller obtains at least a part of the second image through the disparity information and the first image.
  • this reduces the running load of the GPU for rendering images; for example, it can reduce the GPU's calculation amount and bandwidth.
  • the first image includes a plurality of pixels, each pixel of the plurality of pixels corresponds to a depth value, and the disparity information includes a disparity value between a first pixel in at least a part of the first image and a second pixel in the second image corresponding to the first pixel.
  • the GPU is also used to determine the disparity value based on the depth value of the first pixel.
  • the display controller is specifically configured to obtain the disparity value from the GPU.
  • the disparity information is determined by the graphics processor GPU. Since the graphics processor GPU only needs to render the first image and calculate the disparity information, and provide the disparity information and the first image to the display controller, the display controller performs subsequent calculations. In this process, the graphics processor GPU does not need to render two images. Therefore, it is beneficial to reduce the operating load of the graphics processor GPU, for example, reduce the bandwidth occupied by the graphics processor GPU, and reduce the calculation amount of the graphics processor GPU.
  • the GPU is specifically configured to determine the disparity value based on the depth value, focal length, and baseline length of the first pixel.
  • the graphics processor GPU determines the disparity information, that is, the disparity value of each pixel is calculated separately using the disparity principle, and finally the disparity information of the first image and the second image can be obtained.
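  • For illustration, the sketch below shows how a single disparity value could be derived from a depth value using the classic stereo relation disparity = f × b / d. The function name and the numeric values are assumptions made for this example and are not taken from the application.

```python
def disparity_from_depth(depth, focal_length, baseline):
    """Classic stereo relation: disparity = focal_length * baseline / depth.

    All quantities are assumed to be in consistent units, e.g. depth and
    baseline in metres and focal length (and hence disparity) in pixels.
    """
    if depth <= 0:
        raise ValueError("depth must be positive")
    return focal_length * baseline / depth

# Example: a pixel 2.0 m away, with f = 1000 px and b = 0.063 m,
# gives a disparity of 31.5 px between the first and second images.
print(disparity_from_depth(2.0, 1000.0, 0.063))
```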
  • the first image includes a plurality of pixels, each pixel of the plurality of pixels corresponds to a depth value, and the disparity information includes a disparity value between a first pixel in at least a part of the first image and a second pixel in the second image corresponding to the first pixel.
  • the display controller is specifically configured to obtain the depth value of the first pixel, and determine the disparity value based on the depth value of the first pixel.
  • the disparity information is determined by the display controller. Since the graphics processor GPU only needs to render the first image without calculating the disparity information, the display controller is responsible for subsequent calculations. Therefore, compared with the first implementation manner of the first aspect, the operating load of the graphics processor GPU can be further reduced, for example, the bandwidth occupied by the graphics processor GPU is further reduced, and the calculation amount of the graphics processor GPU is further reduced.
  • the display controller is specifically configured to determine the disparity value based on the depth value, focal length, and baseline length of the first pixel.
  • a specific method for the display controller to determine the disparity information is proposed, that is, the disparity value of each pixel is calculated separately using the disparity principle, and finally the disparity information of the first image and the second image can be obtained.
  • the display controller is specifically configured to locate the second pixel in the at least a part of the second image based on the first pixel and the disparity value, and to use the pixel value of the first pixel as the pixel value of the second pixel.
  • the at least a part of the second image is the first part of the second image.
  • the GPU is also used to render the second part of the second image, and the second part of the second image is not included in the first image.
  • the display controller is also used to obtain the second part of the second image and send the second part of the second image to the display; the first image and the combination of the first part and the second part of the second image are used to present a three-dimensional effect on the display.
  • the graphics processor GPU determines the second part of the second image, and the display controller transmits it to the display for display, which is beneficial to broadening the range of the second image that the display can present and to broadening the range of the first image and the second image presented to the human eye.
  • an embodiment of the present application provides a graphics processor GPU, which includes a processing module and an interface module.
  • the processing module is configured to obtain the first image through rendering, and obtain the disparity information of the first image and the second image.
  • the interface module is configured to provide the first image and the disparity information to a display controller, and at least a part of the first image and the second image are used to present a stereoscopic effect on a display corresponding to the display controller.
  • the graphics processor GPU in the display system only needs to determine the first image and disparity information, and provide the first image and disparity information to the display controller.
  • the display controller determines at least a part of the second image based on at least a part of the first image and the parallax information, and transmits at least a part of the first image and the second image to the display. Therefore, the display can present a three-dimensional effect by displaying at least a part of the first image and the second image.
  • although the display controller transmits two images to the display, the GPU only renders the first image and does not render the second image, and the display controller obtains at least a part of the second image through the disparity information and the first image.
  • this reduces the running load of the GPU for rendering images; for example, it can reduce the GPU's calculation amount and bandwidth.
  • the first image includes a plurality of pixels, each pixel in the first image corresponds to a depth value, and the disparity information includes a disparity value between a first pixel in at least a part of the first image and a second pixel in the second image corresponding to the first pixel.
  • the processing module is specifically configured to determine the disparity value based on the depth value of the first pixel.
  • the disparity information is determined by the graphics processor GPU. Since the graphics processor GPU only needs to render the first image and calculate the disparity information, and provide the disparity information and the first image to the display controller, the display controller performs subsequent calculations. In this process, the graphics processor GPU does not need to render two images. Therefore, it is beneficial to reduce the operating load of the graphics processor GPU, for example, reduce the bandwidth occupied by the graphics processor GPU, and reduce the calculation amount of the graphics processor GPU.
  • the processing module is specifically configured to determine the disparity value based on the depth value, focal length, and baseline length of the first pixel.
  • the graphics processor GPU determines the disparity information, that is, the disparity value of each pixel is calculated separately using the disparity principle, and finally the disparity information of the first image and the second image can be obtained.
  • the at least a part of the second image is the first part of the second image.
  • the processing module is also used to render the second part of the second image, the second part of the second image is not included in the first image;
  • the interface module is also used to provide the second part of the second image to the display controller; the first image and the combination of the first part and the second part of the second image are used to present a stereoscopic effect on a display corresponding to the display controller.
  • the graphics processor GPU determines the second part of the second image, and the display controller transmits it to the display for display, which is beneficial to broadening the range of the second image that the display can present and to broadening the range of the first image and the second image presented to the human eye.
  • an embodiment of the present application provides a display controller, which includes a processing module and an interface module.
  • the processing module is configured to obtain the first image from the graphics processor GPU, obtain the disparity information of the first image and the second image, and determine at least part of the second image based on at least a part of the first image and the disparity information.
  • the interface module is configured to send at least a part of the first image and the second image to a display, and at least a part of the first image and the second image are used to present a stereoscopic effect on the display.
  • the graphics processor GPU in the display system only needs to determine the first image, and the display controller determines at least a part of the second image based on at least a part of the first image and the disparity information and transmits the first image and at least a part of the second image to the display. Therefore, the display can present a three-dimensional effect by displaying the first image and at least a part of the second image.
  • although the display controller transmits two images to the display, the GPU only renders the first image and does not render the second image, and the display controller obtains at least a part of the second image through the disparity information and the first image. This reduces the running load of the GPU for rendering images; for example, it can reduce the GPU's calculation amount and bandwidth.
  • the first image includes a plurality of pixels, each pixel of the plurality of pixels corresponds to a depth value, and the disparity information includes a disparity value between a first pixel in at least a part of the first image and a second pixel in the second image corresponding to the first pixel.
  • the processing module is specifically configured to obtain the disparity value from the GPU.
  • the disparity information is determined by the graphics processor GPU and provided to the display controller to enable the display controller to perform subsequent calculations.
  • the first image includes a plurality of pixels, each pixel of the plurality of pixels corresponds to a depth value, and the disparity information includes a disparity value between a first pixel in at least a part of the first image and a second pixel in the second image corresponding to the first pixel.
  • the processing module is specifically configured to obtain the depth value of the first pixel, and determine the disparity value based on the depth value of the first pixel.
  • the disparity information is determined by the display controller, and the display controller determines the disparity information based on the depth value provided by the GPU, so that the display controller performs subsequent calculations.
  • the processing module is specifically configured to determine the disparity value based on the depth value, focal length, and baseline length of the first pixel.
  • a specific method for the display controller to determine the disparity information is proposed, that is, the disparity value of each pixel is calculated separately using the disparity principle, and finally the disparity information of the first image and the second image can be obtained.
  • the processing module is specifically configured to locate the second pixel in the at least a part of the second image based on the first pixel and the disparity value, and to use the pixel value of the first pixel as the pixel value of the second pixel.
  • the at least a part of the second image is the first part of the second image.
  • the processing module is also used to obtain the second part of the second image from the GPU; the interface module is also used to send the second part of the second image to the display. The second part of the second image is not included in the first image, and the first image and the combination of the first part and the second part of the second image are used to present a three-dimensional effect on the display.
  • the graphics processor GPU determines the second part of the second image, and the display controller transmits it to the display for display, which is beneficial to broadening the range of the second image that the display can present and to broadening the range of the first image and the second image presented to the human eye.
  • an embodiment of the present application provides a display method, in which a display system renders a first image and obtains disparity information of the first image and a second image. Then, the display system determines at least a part of the second image based on at least a part of the first image and the disparity information, and sends the first image and the at least a part of the second image to a display; the first image and the at least a part of the second image are used to present a three-dimensional effect on the display.
  • the display system only renders the first image, determines at least a part of the second image based on at least a part of the first image and the disparity information, and transmits the first image and at least a part of the second image to the display for display.
  • since the display system only renders the first image and does not need to render the second image, the operating load of the display system for rendering images is reduced; for example, the calculation amount and bandwidth of the GPU in the display system can be reduced.
  • the graphics processor GPU in the display system only renders the first image, and provides the aforementioned first image to the display controller.
  • the display controller determines at least a part of the second image based on at least a part of the first image and the parallax information, and transmits at least a part of the first image and the second image to the display. Therefore, the display can present a three-dimensional effect by displaying at least a part of the first image and the second image.
  • although the display controller transmits two images to the display, the GPU only renders the first image and does not render the second image, and the display controller obtains at least a part of the second image through the disparity information and the first image.
  • this reduces the running load of the GPU for rendering images; for example, it can reduce the GPU's calculation amount and bandwidth.
  • FIG. 1 is a schematic diagram of a parallax principle in an embodiment of the application
  • Figure 2 is a schematic diagram of an embodiment of a display system in an embodiment of the application
  • FIG. 3A is a schematic diagram of another parallax principle in an embodiment of this application.
  • FIG. 3B is a diagram of an application scenario of the stand-alone head-mounted display in an embodiment of the application
  • FIG. 3C is a diagram of an application scenario of the bound head-mounted display in an embodiment of the application.
  • FIG. 3D is a schematic diagram of the display effects of the left display screen and the right display screen in an embodiment of the application.
  • FIG. 3E is another schematic diagram of the display effects of the left display screen and the right display screen in an embodiment of the application.
  • FIG. 3F is another schematic diagram of the display effects of the left display screen and the right display screen in an embodiment of the application.
  • FIG. 4 is a schematic diagram of an embodiment of a graphics processor GPU in an embodiment of the application.
  • FIG. 5 is a schematic diagram of an embodiment of a display controller in an embodiment of the application.
  • FIG. 6A is a schematic diagram of the first image and the second image in an embodiment of this application.
  • FIG. 6B is another schematic diagram of the first image and the second image in the embodiment of this application.
  • FIG. 6C is another schematic diagram of the first image and the second image in an embodiment of this application.
  • FIG. 6D is another schematic diagram of the first image and the second image in an embodiment of this application.
  • FIG. 6E is another schematic diagram of the first image and the second image in an embodiment of this application.
  • FIG. 7 is a flowchart of the display method in an embodiment of the application.
  • the embodiments of the present application provide a display system, a graphics processor GPU, a display controller, and a display method, which are used to reduce the operating load of the GPU when the display system provides two images for display on the display.
  • the display system proposed in the embodiments of the present application is mainly applied to augmented reality (AR) devices or virtual reality (VR) devices.
  • the user may wear an AR/VR device loaded with the aforementioned display system, and obtain a stereoscopic viewing effect through the two-dimensional image provided by the AR/VR device to the human eye.
  • AR/VR devices include head-mounted displays (HMDs), which can magnify the image on an ultra-micro display through a set of optical systems and present the images calculated by the display system separately to the left and right eyes of the user, so that the user obtains a three-dimensional visual effect through the images presented to the left and right eyes respectively.
  • the display system in the embodiments of the present application can be integrated in the aforementioned head-mounted display, can be located in a computer connected to the aforementioned head-mounted display, or can be located in other devices, such as a mobile phone; this is not specifically limited here.
  • the parallax principle involved in the display system will be introduced below.
  • Figure 1 is a schematic diagram of the parallax principle.
  • o and o' are the left optical center and the right optical center respectively
  • the pixel p is the imaging point of the object P in image 1 on the left optical axis, and the pixel p' is the imaging point of the object P in image 2 on the right optical axis.
  • the display system proposed in the embodiment of the present application makes use of the aforementioned parallax principle and simultaneously presents two images with a certain parallax to the left and right eyes of the user, so that the two images viewed by the user form a stereoscopic virtual image on the user's retinas.
  • in this way, the user can obtain the effect of viewing three-dimensional objects.
  • the main structure of the display system will be introduced below based on the foregoing principles.
  • FIG. 2 is a schematic structural diagram of the display system 20 proposed in an embodiment of this application.
  • the display system 20 includes a graphics processor GPU 201 and a display controller 202.
  • the graphics processor GPU 201 is used to render the first image.
  • the first image is an image from a certain perspective.
  • the first image is an image provided to the user's left eye or an image provided to the user's right eye, which is not specifically limited here.
  • the aforementioned rendering may be stereo rendering.
  • the graphics processor GPU 201 performs vertex processing, and forms triangles for every three vertices among the multiple vertices obtained by the vertex processing.
  • the graphics processor GPU 201 performs rasterization processing. Finally, the graphics processor GPU 201 performs pixel processing to obtain the aforementioned first image.
  • the display controller 202 is configured to obtain the first image from the graphics processor GPU 201, obtain the disparity information of the first image and the second image, and based on at least a part of the first image and the disparity information Determine at least a part of the second image.
  • the second image is an image from another perspective.
  • if the aforementioned first image is the image provided to the user's left eye, the second image is the image provided to the user's right eye; if the aforementioned first image is the image provided to the user's right eye, the second image is the image provided to the user's left eye. The details are not limited here.
  • the aforementioned first image contains multiple pixels
  • the aforementioned second image contains multiple pixels.
  • the pixels in the aforementioned first image are referred to as first pixels
  • the pixels in the aforementioned second image are referred to as second pixels.
  • the disparity information between the first image and the second image includes the disparity value of the first pixel in the first image and the corresponding second pixel of the first pixel in the second image, which can also be understood as, It includes the disparity value between each first pixel in the aforementioned first image and the corresponding second pixel in the second image.
  • the disparity value may be the difference between the coordinate value of the first pixel and the coordinate value of the second pixel corresponding to the first pixel.
  • the first image in this embodiment may be image 1 in FIG. 1, and the second image in this embodiment may be image 2 in FIG. 1. Since there is a parallax between the first image and the second image, the display controller 202 can determine at least a part of the second image based on at least a part of the first image and the parallax information.
  • the first image is [a1, a2]
  • the second image is [b1, b2]
  • the [a3, a2] part of the first image is the same as the [b1, b3] part of the second image.
  • the display controller 202 may therefore determine at least a part of the second image (for example, the [b1, b3] part of the second image) based on at least a part of the first image (for example, the [a3, a2] part of the first image) and the disparity information.
  • the disparity information here can be understood as the disparity information between the [a3, a2] part of the first image and the [b1, b3] part of the second image.
  • the display controller 202 is also used to send at least a part of the first image and the second image to a display (not shown), so that at least a part of the first image and the second image are displayed on the display.
  • the display may be a head-mounted display, and the head-mounted display is configured with a left display screen and a right display screen. It can be understood that the head-mounted display can be replaced by other types of displays, which is not limited in this embodiment. Since the aforementioned first image is the image observed by the left eye and the aforementioned second image is the image observed by the right eye, the first image will be displayed on the left display screen and the second image will be displayed on the right display screen. For the principle of the head-mounted display, reference may be made to the related introduction in the embodiment corresponding to FIG. 1; the details are not repeated here.
  • the aforementioned display system 20 and the display can be integrated in the same device, and the aforementioned display system 20 and the display can also be distributed in different devices.
  • the head-mounted display can be divided into a stand-alone head-mounted display and a bound head-mounted display.
  • the stand-alone head-mounted display refers to a head-mounted display that integrates computing and processing modules into the stand-alone head-mounted display, and does not need to be connected to an external computer.
  • FIG. 3B shows a stand-alone head-mounted display.
  • the display system 20 and the display are both located in the stand-alone head-mounted display. The user can obtain a stereoscopic viewing effect by watching at least a part of the first image and the second image displayed on the stand-alone head-mounted display.
  • the bound head-mounted display needs to be connected to an external computer, the data is processed by the external computer, and the bound head-mounted display is used for display.
  • FIG. 3C shows a bound head-mounted display.
  • the display system 20 may be located in an external computer connected to the bound head-mounted display, and the display is located in the bound head-mounted display.
  • the display system 20 in the external computer determines the first image and at least a part of the second image, and then transmits the first image and at least a part of the second image to the bound head-mounted display.
  • the bound head-mounted display then presents a three-dimensional effect to the user.
  • the connection between the display system 20 and the display can be a wired connection or a wireless connection.
  • wireless connection methods such as wireless fidelity (Wi-Fi), the ZigBee protocol (ZigBee), Bluetooth, or other short-distance communication methods may be used, which are not specifically limited here.
  • the graphics processor GPU 201 in the display system only renders the first image, and provides the aforementioned first image to the display controller 202.
  • the display controller 202 determines at least a part of the second image based on at least a part of the aforementioned first image and disparity information, and transmits at least a part of the aforementioned first image and the aforementioned second image to the display. Therefore, the display can present a three-dimensional effect by displaying at least a part of the first image and the second image.
  • although the display controller 202 transmits two images to the display, the graphics processor GPU 201 only renders the first image and does not render the second image; the display controller 202 obtains at least a part of the second image through the disparity information and the first image.
  • this reduces the running load of the graphics processor GPU 201 for rendering images; for example, the calculation amount and bandwidth of the graphics processor GPU 201 can be reduced.
  • the display system 20 may further include a storage device 203.
  • the aforementioned graphics processor GPU 201 and the aforementioned display controller 202 share the aforementioned storage device 203, and at least a part of the aforementioned first image, disparity information, and second image are stored in the aforementioned storage device 203.
  • the aforementioned graphics processor GPU 201 stores the rendered first image in the aforementioned storage device 203; the display controller 202 also stores the determined at least a part of the second image in the aforementioned storage device 203; the graphics processor GPU 201 stores the determined disparity information in the aforementioned storage device 203, or the display controller 202 stores the determined disparity information in the aforementioned storage device 203, which is not specifically limited here.
  • the display controller 202 can directly obtain the first image from the graphics processor GPU 201, that is, the graphics processor GPU 201 directly transmits the first image to the display controller 202; the display controller 202 may also obtain the first image from the storage device 203, which is not specifically limited here.
  • the storage device 203 may be a double-rate synchronous dynamic random access memory (double data rate synchronous dynamic random-access memory, DDR SDRAM), and the double-rate synchronous dynamic random access memory is also often referred to as DDR for short.
  • the graphics processor GPU 201 is a device that draws or renders an image.
  • the display controller 202 is also called a display subsystem or a display driver.
  • the display controller 202 can be used to perform layer superimposition processing, and send the formed image after the layer superimposition to the display for display.
  • the display controller 202 may also be used to perform processing such as image inversion, enlargement, or reduction, which is not limited in this embodiment.
  • the layer overlay processing includes, but is not limited to, overlaying the image drawn by the graphics processor GPU 201 with other images, such as background images or windows.
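  • As a toy illustration of layer superimposition (not the display controller's actual implementation; real hardware typically blends layers with fixed-function circuits), overlaying one layer onto a base image could look like this:

```python
import numpy as np

def overlay_layers(base, layer, alpha):
    """Blend `layer` over `base`: out = alpha * layer + (1 - alpha) * base.

    `base` and `layer` are assumed to be same-shaped image arrays and
    `alpha` a blending factor in [0, 1].
    """
    out = alpha * layer.astype(np.float32) + (1.0 - alpha) * base.astype(np.float32)
    return out.astype(base.dtype)
```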
  • the display controller 202 can obtain the disparity information of the first image and the second image in a variety of implementation manners, which will be introduced separately below.
  • the display controller 202 may obtain the disparity information of the first image and the second image from the graphics processor GPU 201.
  • the graphics processor GPU 201 in the display system 20 may directly send the disparity information to the display controller 202, that is, the display controller 202 may directly receive the disparity information from the graphics processor GPU 201;
  • the graphics processor GPU 201 in the display system 20 sends the disparity information to the storage device 203, and the display controller 202 obtains the disparity information from the storage device 203, which is not specifically limited here.
  • the disparity information is determined by the graphics processor GPU 201.
  • the graphics processor GPU 201 is used to obtain the depth value corresponding to each pixel in the first image when rendering the aforementioned first image, and then calculate the disparity value corresponding to each pixel based on the depth value corresponding to that pixel. For example, if the first image includes a first pixel and the first pixel corresponds to a depth value, the graphics processor GPU 201 may determine, based on the depth value of the first pixel, the disparity value between the first pixel and the second pixel corresponding to the first pixel.
  • the disparity value can be calculated as disparity = f × b / d, where f is the focal length, b is the baseline length, and d is the depth value of the first pixel.
  • the graphics processor GPU 201 performs similar processing on each pixel in the first image to obtain disparity information.
  • the disparity information includes multiple disparity values; specifically, the disparity information includes the disparity value between a first pixel in at least a part of the first image and the second pixel in the second image corresponding to the first pixel.
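  • As a rough sketch of this per-pixel processing, and assuming the GPU exposes the depth buffer produced while rendering the first image (the function name and the use of NumPy are illustrative assumptions), a whole disparity map could be computed as follows:

```python
import numpy as np

def disparity_map_from_depth(depth_buffer, focal_length, baseline):
    """Turn a per-pixel depth buffer (H x W) into a disparity map (H x W)."""
    depth = np.asarray(depth_buffer, dtype=np.float32)
    # Treat non-positive depth (e.g. background) as infinitely far away,
    # which yields a disparity of zero instead of a division by zero.
    safe_depth = np.where(depth > 0, depth, np.inf)
    return focal_length * baseline / safe_depth
```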
  • since the graphics processor GPU 201 only needs to render the first image, calculate the disparity information using the depth value corresponding to each pixel in the first image, and provide the disparity information and the first image to the display controller 202, the display controller 202 performs the subsequent calculations.
  • the graphics processor GPU 201 does not need to render the two images. Therefore, it is beneficial to reduce the operating load of the graphics processor GPU 201, for example, reduce the bandwidth occupied by the graphics processor GPU 201, and reduce the calculation amount of the graphics processor GPU 201.
  • in another implementation, the display controller 202 obtains the depth value corresponding to each pixel in the first image from the graphics processor GPU 201, and calculates the disparity value corresponding to each pixel based on the depth value corresponding to that pixel.
  • that is to say, in this embodiment, the disparity information is calculated by the display controller 202 rather than obtained directly from the graphics processor GPU 201.
  • the display controller 202 obtains the depth value corresponding to each pixel in the first image from the graphics processor GPU 201. It can be understood that the graphics processor GPU 201 directly sends the depth value of each pixel to the display controller 202, that is, the display controller 202 directly receives the depth value of each pixel from the graphics processor GPU 201; or the graphics processor GPU 201 sends the depth value of each pixel to the storage device 203, and the display controller 202 obtains the depth value of each pixel from the storage device 203, which is not specifically limited here.
  • the display controller 202 may determine the disparity value based on the depth value, focal length, and baseline length of the first pixel; the formula used by the display controller 202 to calculate the disparity value is the same as that used by the aforementioned graphics processor GPU 201, namely disparity = f × b / d, where f is the focal length, b is the baseline length, and d is the depth value of the first pixel.
  • since the graphics processor GPU 201 only needs to render the first image and does not need to calculate the disparity information, the display controller 202 is responsible for the subsequent calculations. Therefore, on the basis of the foregoing embodiment, the operating load of the graphics processor GPU 201 is further reduced; for example, the bandwidth occupied by the graphics processor GPU 201 and the calculation amount of the graphics processor GPU 201 are further reduced.
  • the display controller 202 then determines at least a part of the second image based on at least a part of the aforementioned first image and the disparity information. Specifically, the display controller 202 may locate the second pixel in the at least a part of the second image based on the first pixel in the first image and the disparity value corresponding to the first pixel, and use the pixel value of the first pixel as the pixel value of the second pixel. Further, the display controller 202 may calculate the difference between the coordinate value of the first pixel and the disparity value corresponding to the first pixel to obtain the coordinate value of the second pixel.
  • the display controller 202 can determine the position of the second pixel in the second image according to the coordinate value of the second pixel. Then, the display controller 202 assigns the pixel value of the aforementioned first pixel to the second pixel in the second image, so that the pixel value of the second pixel can be determined.
  • the display controller 202 can calculate each pixel in the first image according to the aforementioned calculation method, and then obtain the position and pixel value of at least a part of the pixels in the second image.
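  • A minimal sketch of this shift-and-copy step is shown below, assuming only horizontal disparity and ignoring rounding subtleties, occlusions, and hole filling; the names are illustrative, not the patent's.

```python
import numpy as np

def synthesize_second_image(first_image, disparity_map):
    """Reproject the first image into (part of) the second image.

    For each first pixel, the abscissa of the corresponding second pixel is
    the first pixel's abscissa minus its disparity value, and the first
    pixel's value is copied to that position.
    """
    height, width = disparity_map.shape
    second_image = np.zeros_like(first_image)
    for y in range(height):
        for x in range(width):
            x2 = x - int(round(float(disparity_map[y, x])))
            if 0 <= x2 < width:
                second_image[y, x2] = first_image[y, x]
    return second_image
```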
  • RGB pixel value is a two-dimensional image pixel value, which represents color.
  • the depth value of the pixel represents the depth information of the pixel.
  • the depth value and pixel value of the pixel can be used to form three-dimensional pixel information, such as red, green and blue depth (RGBD) values.
  • the depth value (D) can also be stored separately from the aforementioned RGB pixel value; this is not limited in this embodiment.
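  • Purely for illustration, the two storage layouts mentioned above might look as follows (resolution and data types are assumptions):

```python
import numpy as np

height, width = 1080, 1920

# Interleaved layout: one H x W x 4 buffer holding R, G, B and depth (RGBD).
rgbd = np.zeros((height, width, 4), dtype=np.float32)

# Separate layout: an H x W x 3 colour buffer plus an H x W depth buffer,
# i.e. the depth value (D) is stored apart from the RGB pixel value.
rgb = np.zeros((height, width, 3), dtype=np.uint8)
depth = np.zeros((height, width), dtype=np.float32)
```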
  • for ease of understanding, FIG. 3D is taken as an example for introduction.
  • assume that there is a first pixel in the first image, denoted pixel A.
  • pixel A is drawn exemplarily large in the figure; this is only to aid understanding of the solution and is not limiting.
  • the horizontal distance between the pixel A and the left border of the first image is x1, that is, the abscissa value of the pixel A in the first image is x1.
  • assume that the disparity value corresponding to the pixel A is z0.
  • the display controller 202 then determines that the abscissa value of the corresponding second pixel A' in the second image is x1 - z0, and assigns the pixel value of the pixel A to the pixel A'.
  • the display controller 202 can determine at least a part of the second image.
  • the at least a part of the second image is the first part of the second image: as shown in FIG. 3E, the first part of the second image is the part calculated based on the parallax principle. At this time, the pixels of the second part of the second image have not yet been determined.
  • the display system 20 is also used to determine the second part of the second image. Generally, when the second image is an image presented to the user's right eye, the second part of the second image is a part close to the right side of the second image. If the aforementioned second image is an image presented to the left eye of the user, the second part of the second image is a part close to the left side of the second image.
  • the width of the second part of the second image is much smaller than the width of the first part of the second image, which can also be understood as follows: the number of pixels in each row of the second part of the second image is much less than the number of pixels in each row of the first part of the second image.
  • the graphics processor GPU 201 in the display system 20 will render the second part of the second image.
  • the display controller 202 obtains the second part of the second image and sends the second part of the second image to the display, so that the first image and the combination of the first part and the second part of the second image present a three-dimensional effect on the display.
  • take FIG. 3F as an example.
  • the left display screen of the head-mounted display displays the first image
  • the right display screen of the head-mounted display displays the first part of the second image and the second part of the second image.
  • this combination is only an example.
  • since each pixel in the first part of the second image is different from each pixel in the second part of the second image, the content presented by the first part of the second image differs from the content presented by the second part, which helps to broaden the range of the second image that the display can present.
  • the manner in which the display controller 202 obtains the second part of the second image can be implemented in a variety of ways.
  • for example, the graphics processor GPU 201 may directly transmit the second part of the second image to the display controller 202, that is, the display controller 202 directly receives the second part of the second image from the graphics processor GPU 201.
  • alternatively, the graphics processor GPU 201 transmits the second part of the aforementioned second image to the storage device 203, and the display controller 202 obtains the second part of the second image from the storage device 203.
  • the details are not limited here.
  • FIG. 4 is a schematic structural diagram of a graphics processor GPU 40 proposed in an embodiment of this application.
  • the graphics processor GPU 40 includes a processing module 401 and an interface module 402.
  • the processing module 401 is configured to obtain a first image through rendering, and obtain disparity information of the first image and the second image.
  • the interface module 402 is configured to provide the first image and the disparity information to a display controller, and at least a part of the first image and the second image are used to present a stereoscopic effect on a display corresponding to the display controller.
  • the first image is an image of a certain angle of view, for example, the first image is an image provided to the user's left eye or an image provided to the user's right eye.
  • the second image is an image from another perspective.
  • if the aforementioned first image is the image provided to the user's left eye, the second image is the image provided to the user's right eye; if the aforementioned first image is the image provided to the user's right eye, the second image is the image provided to the user's left eye. This is not specifically limited here.
  • the disparity information between the first image and the second image includes the disparity value of a certain first pixel in the aforementioned first image and a corresponding second pixel of the first pixel in the second image.
  • the disparity value may be the difference between the coordinate value of the first pixel and the coordinate value of the second pixel corresponding to the first pixel.
  • the interface module 402 provides the first image and the disparity information to the display controller. It can be understood that the interface module 402 directly sends the first image and the disparity information to the display controller; it can also be understood as the The interface module 402 transmits the first image and disparity information to a storage device, and the display controller obtains the first image and disparity information from the storage device. The details are not limited here.
  • in this solution, the display controller performs the subsequent calculations, and the graphics processor GPU 40 does not need to render two images. Therefore, it is beneficial to reduce the operating load of the graphics processor GPU 40, for example, to reduce the bandwidth occupied by the graphics processor GPU 40 and to reduce the calculation amount of the graphics processor GPU 40.
  • the processing module 401 may determine the disparity information between the foregoing first image and the second image in the following manner.
  • the aforementioned first image includes a plurality of pixels, and each pixel in the aforementioned first image corresponds to a depth value, which can be obtained by the processing module 401 rendering the first image.
  • the aforementioned disparity information includes a disparity value between a first pixel in at least a part of the first image and a second pixel in the second image corresponding to the first pixel, and the disparity value can be understood as the aforementioned first pixel The difference between the coordinate value of and the coordinate value of the aforementioned second pixel.
  • the processing module 401 is further configured to render the second part of the second image, and the second part of the second image is not included in the first image.
  • the interface module 402 is also used to provide the second part of the second image to the display controller, and the first image and the combination of the first part and the second part of the second image are used to present a stereoscopic effect on a display corresponding to the display controller.
  • the interface module 402 may send the second part of the second image to the display controller, and the interface module 402 may also transmit the second part of the second image to the storage device, and the display controller Obtain from the storage device, which is not specifically limited here.
  • the combination of the first part and the second part of the second image refers to a second image formed by the first part of the second image and the second part of the second image.
  • for details, refer to the relevant description corresponding to the aforementioned FIG. 3F, which will not be repeated here.
  • the graphics processor GPU 40 is also used to transmit control information to the storage device.
  • the control information includes one or more of rendering mode information, view information, and preset coordinate ranges.
  • the rendering mode information is used to indicate the rendering mode, for example, 1 represents a stereoscopic rendering mode based on the parallax principle, and 0 represents a normal rendering mode.
  • the view information is used to indicate the perspective of the rendered image. For example, 1 indicates that the output image is the image presented to the left eye, and 0 indicates that the output image is the image presented to the right eye.
  • the preset coordinate range is used to indicate the coordinate value range of the first part of the second image.
  • in other words, when the abscissa value of a pixel in the second image falls within the preset coordinate range, that pixel belongs to the first part of the second image; when the abscissa value falls outside the preset coordinate range (for example, is less than its lower limit or greater than its upper limit), the pixel belongs to the second part of the second image, which the graphics processor GPU 40 renders and provides to the display controller.
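  • A hypothetical sketch of how such a preset coordinate range could be used to decide which columns of the second image come from the disparity-based first part and which come from the GPU-rendered second part (the helper name and the inclusive-range convention are assumptions):

```python
def column_source(x, preset_range):
    """Return which part of the second image supplies column x.

    Columns whose abscissa lies inside the preset coordinate range belong to
    the first part (synthesized by the display controller from the first
    image and the disparity information); columns outside the range belong
    to the second part (rendered by the GPU).
    """
    lower, upper = preset_range
    return "first_part" if lower <= x <= upper else "second_part"
```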
  • since the graphics processor GPU 40 only needs to render the first image, calculate the disparity information using the depth value corresponding to each pixel in the first image, and provide the disparity information and the first image to the display controller, the display controller performs the subsequent calculations.
  • the graphics processor GPU 40 does not need to render the two images. Therefore, it is beneficial to reduce the operating load of the graphics processor GPU 40, for example, reduce the bandwidth occupied by the graphics processor GPU 40, and reduce the calculation amount of the graphics processor GPU 40.
  • as shown in FIG. 5, the display controller 50 includes a processing module 501 and an interface module 502.
  • the processing module 501 is configured to obtain a first image from a graphics processor GPU, obtain disparity information of the first image and a second image, and determine at least a part of the second image based on at least a part of the first image and the disparity information.
  • the interface module 502 is configured to send at least a part of the first image and the second image to a display, and at least a part of the first image and the second image are used to present a stereoscopic effect on the display.
  • the first image is an image of a certain angle of view, for example, the first image is an image provided to the user's left eye or an image provided to the user's right eye.
  • the second image is an image from another perspective.
  • if the aforementioned first image is the image provided to the user's left eye, the second image is the image provided to the user's right eye; if the aforementioned first image is the image provided to the user's right eye, the second image is the image provided to the user's left eye. This is not specifically limited here.
  • the disparity information between the first image and the second image includes the disparity value of a certain first pixel in the aforementioned first image and a corresponding second pixel of the first pixel in the second image.
  • the disparity value may be the difference between the coordinate value of the first pixel and the coordinate value of the second pixel corresponding to the first pixel.
  • when the processing module 501 obtains the first image from the graphics processor GPU, the first image may be directly sent to the display controller 50 by the graphics processor GPU, that is, the processing module 501 directly receives the first image from the interface module 402 of the aforementioned graphics processor GPU; the graphics processor GPU may also transmit the first image to the storage device inside the display system, and the display controller 50 then obtains the first image from the storage device. The specifics are not limited here.
  • the graphics processor GPU in the display system only renders the first image
  • the display controller 50 determines at least a part of the second image based on at least a part of the aforementioned first image and the disparity information, and transmits the aforementioned first image and at least a part of the aforementioned second image to the display. Therefore, the display can present a three-dimensional effect by displaying the first image and at least a part of the second image.
  • the display controller 50 transmits two images to the display, and the graphics processor GPU only renders the first image without rendering the second image.
  • the display controller 50 obtains at least a part of the second image through the disparity information and the first image, which reduces the running load of the graphics processor GPU for rendering images; for example, it can reduce the calculation amount and bandwidth of the graphics processor GPU.
  • the processing module 501 may adopt different implementation manners, which will be introduced separately below.
  • the processing module 501 may obtain the disparity information provided by the graphics processor GPU, that is, the disparity information is calculated by the graphics processor GPU. At this time, if the graphics processor GPU directly sends the disparity information to the display controller 50, the processing module 501 may directly receive the disparity information sent by the graphics processor GPU; if the graphics processor GPU sends the disparity information If it is transmitted to the storage device in the display system, the processing module 501 can obtain the disparity information from the storage device, which is not specifically limited here.
  • the display controller 50 includes a line buffer, and the line buffer is used to store the disparity information acquired by the display controller 50.
  • the processing module 501 can directly use the aforementioned first image and disparity information for subsequent calculations.
  • the aforementioned disparity information is calculated by the display controller based on the parameters provided by the graphics processor GPU, that is, the processing module 501 can calculate the aforementioned disparity information.
  • the processing module 501 needs to obtain a depth value, and then calculate the aforementioned disparity information based on the depth value.
  • the depth value is a depth value corresponding to each pixel in the first image, which can be obtained by rendering the first image by the graphics processor GPU, that is, provided by the graphics processor GPU to the display controller 50.
  • in terms of specific implementation, the graphics processor GPU can directly send the depth value of each pixel to the display controller 50, that is, the processing module 501 can directly receive the depth value of each pixel from the aforementioned graphics processor GPU; the graphics processor GPU can also transmit the depth value of each pixel to the storage device inside the display system, and the processing module 501 obtains the depth value of each pixel from the storage device.
  • the processing module 501 may determine the disparity value corresponding to each pixel based on the depth value of each pixel; that is, for a certain pixel in the first image, the processing module 501 determines the disparity value of this pixel based on the depth value of this pixel, and by analogy, the processing module 501 can determine the disparity value of each pixel in the first image.
  • the processing module 501 can also select some pixels in the first image as needed and determine the disparity value of each of these pixels, which is not specifically limited here. More specifically, if the disparity value of the first pixel in the first image needs to be determined, the processing module 501 may determine the disparity value based on the depth value, focal length, and baseline length of the first pixel.
  • the processing module 501 can use the aforementioned first image and the disparity information determined by the processing module 501 to perform subsequent calculations.
  • the display controller 50 includes a line buffer for storing the calculated disparity information.
  • the processing module 501 then determines at least a part of the second image based on at least a part of the first image and the disparity information. Specifically, the processing module 501 locates the second pixel in the at least a part of the second image based on the first pixel and the disparity value, and uses the pixel value of the first pixel as the pixel value of the second pixel.
  • the processing module 501 may calculate the difference between the coordinate value of the first pixel and the disparity value corresponding to the first pixel to obtain the coordinate value of the second pixel. Therefore, the processing module 501 can determine the position of the second pixel in the second image according to the coordinate value of the second pixel. Then, the processing module 501 assigns the pixel value of the aforementioned first pixel to the second pixel in the second image, so that the pixel value of the second pixel can be determined. By analogy, the processing module 501 can calculate each pixel in the first image according to the aforementioned calculation method, and then obtain the position and pixel value of at least a part of the pixels in the second image.
  • the processing module 401 or 501 may be a processor core for running software program instructions for processing.
  • For example, the processing module 401 is one or more GPU computing cores, and the processing module 501 may be one or more low-power computing cores whose operating power consumption is lower than that of a GPU computing core, such as a small central processing unit or a digital signal processor.
  • the software program instructions run by the above processor core may be stored in the storage device 203 in FIG. 2 or in other memories.
  • the interface module 402 or 502 may be an interface circuit, including but not limited to supporting various interface protocols, for implementing data transmission.
  • the processing module 401 or 501 and the interface module 402 or 502 may include hardware logic circuits without executing software programs. Any module may perform processing through logic circuit operations.
  • the logic circuits include but are not limited to at least one of a transistor, a logic gate, and an arithmetic circuit, and may also optionally include an analog circuit or a digital-analog hybrid circuit.
  • the processing module 501 and the interface module 502 in the display controller 50 are implemented by pure hardware logic circuits to implement hardware acceleration of the display control or display driving functions and improve the performance of the display system 20.
  • the processing module 401 or 501 and the interface module 402 or 502 may be software modules, including software program instructions, for being executed by a processor or a processor core and implementing the aforementioned functions. Therefore, any of the above-mentioned modules can be implemented in whole or in part by software, hardware, firmware, or any combination thereof.
  • When implemented using software, it can be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on the computer, the processes or functions according to the embodiments of the present application are generated in whole or in part.
  • the computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable devices.
  • the computer instructions can be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server or data center integrated with one or more available media.
  • the usable medium may be a dynamic random access memory (for example, a double data rate synchronous dynamic random access memory, DDR SDRAM), a static random access memory, or the like.
  • the first row of pixels of the first image is taken as an example to introduce the process of determining at least a part of the second image by the processing module 501. It should be understood that the following processing considers only the deviation in the horizontal direction, that is, only horizontal parallax is taken as an example. Specifically, as shown in FIG. 6A, it is assumed that there are only 8 pixels in the first row of the first image, and each pixel corresponds to a coordinate value and a pixel value.
  • the processing module 501 can calculate the difference between the abscissa value and the disparity value of each pixel, and then the processing module 501 determines the abscissa value of the pixel in the second image based on the difference. It should be understood that the difference represents the position in the second image to which the pixel value of the pixel is assigned.
  • Since the abscissa value of the pixel A0 is 0 and the disparity value corresponding to the pixel A0 is 1, it can be determined that the difference between the abscissa value and the disparity value is -1, that is, the position of the pixel A0 in the second image. Since -1 < 0, the pixel value a0 cannot fall on the display screen corresponding to the second image, that is, it overflows the display screen. Therefore, the pixel value a0 is not included in the second image.
  • Since the abscissa value of the pixel A1 is 1 and the disparity value corresponding to the pixel A1 is 1, it can be determined that the difference between the abscissa value and the disparity value is 0, that is, the position of the pixel A1 in the second image. Therefore, as shown in FIG. 6A, the pixel value a1 should be assigned to the first row and first column of the second image.
  • Similarly, the difference between the abscissa value of the pixel A2 and the disparity value corresponding to the pixel A2 is 1, that is, the position of the pixel A2 in the second image; the difference between the abscissa value of the pixel A3 and the disparity value corresponding to the pixel A3 is also 1, that is, the positions of the pixel A2 and the pixel A3 in the second image coincide.
  • the processing module 501 may select, from the pixel A2 and the pixel A3, the pixel value corresponding to the pixel with the smaller depth value for display. Assuming the depth value of the pixel A3 is smaller, the processing module 501 retains the pixel value a3 corresponding to the pixel A3, that is, assigns the pixel value a3 to the first row and second column of the second image.
  • Since the disparity value is calculated based on the aforementioned formula 2, the disparity value is not necessarily an integer, which means that the difference between the abscissa value and the disparity value is not necessarily an integer either.
  • rounding can be used to approximate the aforementioned difference. For example, if the abscissa value of the pixel A4 is 4, and the disparity value corresponding to the pixel A4 is 0.7, it can be determined that the difference between the abscissa value and the disparity value is 3.3.
  • the processing module 501 adopts a rounding method to approximate the aforementioned difference to 3, that is, the position of the pixel A4 in the second image. Therefore, as shown in FIG. 6A, the pixel value a4 should be assigned to the first row and fourth column of the second image.
  • Since the abscissa value of the pixel A5 is 5 and the disparity value corresponding to the pixel A5 is 1, it can be determined that the difference between the abscissa value and the disparity value is 4, that is, the position of the pixel A5 in the second image. Therefore, as shown in FIG. 6A, the pixel value a5 should be assigned to the first row and fifth column of the second image.
  • Since the abscissa value of the pixel A6 is 6 and the disparity value corresponding to the pixel A6 is 1.3, it can be determined that the difference between the abscissa value and the disparity value is 4.7, and the approximate value is 5.
  • Since the abscissa value of the pixel A7 is 7 and the disparity value corresponding to the pixel A7 is 2, it can be determined that the difference between the abscissa value and the disparity value is 5.
  • Therefore, the positions of the pixel A6 and the pixel A7 in the second image coincide. Assuming the depth value of the pixel A7 is smaller, the processing module 501 retains the pixel value a7 corresponding to the pixel A7, that is, the pixel value a7 is assigned to the first row and sixth column of the second image.
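  • The row-by-row assignment illustrated with the pixels A0 to A7 can be summarized by the sketch below. This is only an illustration under stated assumptions (rounded differences, positions that overflow the screen are discarded, and when two pixels land on the same position the one with the smaller depth value is kept); the function and variable names are not from the original text.

```python
# Illustrative sketch: warp one row of the first image into one row of the second image.
def warp_row(pixel_values, disparities, depths, width):
    out_values = [None] * width  # pixel values of the second-image row
    out_depths = [None] * width  # depth of the pixel currently occupying each position
    for x1, (value, disparity, depth) in enumerate(zip(pixel_values, disparities, depths)):
        x2 = round(x1 - disparity)          # rounding handles non-integer disparities
        if x2 < 0 or x2 >= width:
            continue                        # overflows the display screen, e.g. pixel A0
        if out_depths[x2] is None or depth < out_depths[x2]:
            out_values[x2] = value          # keep the pixel with the smaller depth value
            out_depths[x2] = depth
    return out_values
```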
  • In the above process, some positions in the second image may not be assigned any pixel value, that is, pixel values may be lost. For example, in FIG. 6A, no pixel value is assigned to the first row and third column of the second image.
  • the processing module 501 may use an interpolation algorithm or a filtering algorithm to determine the pixel value in the first row and third column of the second image. Suppose that the pixel value calculated for the first row and third column of the second image is c1; the first row of the second image is then as shown in FIG. 6B.
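  • A minimal hole-filling sketch follows, assuming scalar pixel values and a simple average of the nearest valid neighbors; the embodiment only states that an interpolation or filtering algorithm may be used, so this particular choice is an assumption.

```python
# Illustrative sketch: fill positions of the second-image row that received no pixel value.
def fill_holes(row_values):
    filled = list(row_values)
    for x, value in enumerate(row_values):
        if value is not None:
            continue
        left = next((row_values[i] for i in range(x - 1, -1, -1) if row_values[i] is not None), None)
        right = next((row_values[i] for i in range(x + 1, len(row_values)) if row_values[i] is not None), None)
        neighbors = [v for v in (left, right) if v is not None]
        if neighbors:
            # e.g. the value c1 assigned to the first row and third column in FIG. 6B
            filled[x] = sum(neighbors) / len(neighbors)
    return filled
```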
  • the processing module 501 can determine the pixel value of each pixel in the first part of the second image based on the pixels in the first image according to the aforementioned calculation method, that is, the processing module 501 can determine the first part of the second image, as shown in FIG. 6C. It should be understood that how many columns of pixels are retained in the first part of the second image depends on the aforementioned disparity values and the abscissa values of the pixels in the first image. FIG. 6C in this embodiment is only an example, and does not specifically limit the number of pixels in the first part of the second image.
  • the display controller 50 may further determine the second part of the second image.
  • the processing module 501 is also used to obtain the second part of the second image from the graphics processor GPU;
  • the interface module 502 is also used to send the second part of the second image to the display; the second part of the second image is not included in the first image, and the first image and the combination of the first part and the second part of the second image are used to present a three-dimensional effect on the display.
  • the graphics processor GPU may directly transmit the second part of the second image to the display controller, that is, the processing module 501 directly receives the second part of the second image from the graphics processor GPU; or the graphics processor GPU transmits the second part of the aforementioned second image to the storage device inside the display system, and the processing module 501 obtains the second part of the second image from the storage device, which is not specifically limited here.
  • the processing module 501 is also used to obtain control information from the aforementioned storage device.
  • the control information includes one or more of rendering mode information, view information, and preset coordinate ranges.
  • the rendering mode information is used to indicate the rendering mode, for example, 1 represents a stereoscopic rendering mode based on the parallax principle, and 0 represents a normal rendering mode.
  • the view information is used to indicate the perspective of the rendered image. For example, 1 indicates that the output image is the image presented to the left eye, and 0 indicates that the output image is the image presented to the right eye.
  • the preset coordinate range is used to indicate the coordinate value range of the first part of the second image.
  • For example, when the aforementioned first image is an image presented to the left eye of the user, the second part of the second image acquired by the processing module 501 is located on the right side of the first part. When the abscissa value in the second image exceeds the upper limit of the aforementioned preset coordinate range, the processing module 501 obtains the second part of the second image and transmits the second part of the second image to the display; when the abscissa value in the second image is less than the upper limit of the aforementioned preset coordinate range, the processing module 501 obtains the first part of the second image and transmits the first part of the second image to the display.
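  • The way the control information and the preset coordinate range might drive this selection is sketched below; the field names, encodings, and boundary handling are assumptions made only for illustration and are not the actual control-word format.

```python
# Illustrative sketch: decide where a column of the second image comes from.
control_info = {
    "render_mode": 1,             # assumed encoding: 1 = parallax-based stereoscopic, 0 = normal
    "view": 1,                    # assumed encoding: 1 = left-eye image, 0 = right-eye image
    "preset_x_range": (0, 1279),  # assumed coordinate range of the first part of the second image
}

def column_source(x, control_info):
    upper_limit = control_info["preset_x_range"][1]
    if x <= upper_limit:
        # First part: derived by the display controller from the first image and the disparity.
        return "first part"
    # Second part: rendered by the GPU and obtained from the GPU or the storage device.
    return "second part"
```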
  • the display controller determines at least a part of the second image based on at least a part of the first image and disparity information, and transmits at least a part of the aforementioned first image and the aforementioned second image to the display. Therefore, the display can present a three-dimensional effect by displaying at least a part of the first image and the second image.
  • the display controller transmits two images to the display, the GPU only renders the first image without rendering the second image, and the display controller obtains at least a part of the second image through the disparity information and the first image.
  • This reduces the running load of rendering images on the GPU; for example, it can reduce the GPU calculation amount and bandwidth.
  • the display system can perform the following steps:
  • the first image is an image of a certain angle of view, for example, the first image is an image provided to the user's left eye or an image provided to the user's right eye, which is not specifically limited here.
  • the first image includes a plurality of pixels, and each pixel corresponds to a depth value and a pixel value.
  • the pixel value is an RGB pixel value.
  • the disparity information between the first image and the second image includes the disparity value of the first pixel in the first image and the corresponding second pixel of the first pixel in the second image.
  • the disparity value may be the difference between the coordinate value of the first pixel and the coordinate value of the second pixel corresponding to the first pixel.
  • The disparity value may also be determined by the aforementioned formula 2, that is, z = f × b / d, where f is the focal length, b is the baseline length, and d is the depth value of the first pixel.
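  • As a purely illustrative example with assumed values, if the focal length f corresponds to 1000 pixels, the baseline length b is 0.06 m, and the depth value d of the first pixel is 2 m, the disparity value is z = 1000 × 0.06 / 2 = 30 pixels; these numbers are not from the original text and are used only to show the calculation.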
  • the display system may determine at least a part of the second image based on at least a part of the first image and the disparity information. Specifically, the display system will locate the second pixel in at least a part of the second image based on the first pixel and the disparity value, and use the pixel value of the first pixel as the pixel value of the second pixel.
  • the display system may calculate the difference between the coordinate value of the first pixel and the disparity value corresponding to the first pixel to obtain the coordinate value of the second pixel. Therefore, the display system can determine the position of the second pixel in the second image according to the coordinate value of the second pixel. Then, the display system assigns the pixel value of the aforementioned first pixel to the second pixel in the second image, so that the pixel value of the second pixel can be determined.
  • the display system can calculate each pixel in the first image according to the aforementioned calculation method, and then obtain the position and pixel value of at least a part of the pixels in the second image. Specifically, reference may be made to the related description in the foregoing embodiment corresponding to FIG. 5, which will not be repeated here.
  • Step 705: Render the second part of the second image, and send the second part of the second image to the display.
  • step 705 is an optional step.
  • the aforementioned at least a part of the second image is the first part of the second image.
  • the second part of the second image is not included in the first image. It can be understood that when the first image and the second image are presented to the user's left and right eyes respectively, the second part of the second image does not overlap with the image presented by the first image on the user's retina.
  • the first image and the combination of the first part and the second part of the second image are used to present a three-dimensional effect on the display. Specifically, reference may be made to the relevant introduction in the aforementioned embodiment corresponding to FIG. 2, which is not repeated here.
  • the display system only renders the first image, then directly determines at least a part of the second image based on at least a part of the first image and the disparity information, and transmits the first image and at least a part of the second image to the display, so that the display presents a three-dimensional effect by displaying at least a part of the first image and the second image.
  • the display shows two images, but the first image is obtained through rendering, and the first part of the second image is determined by the disparity information and the first image. Therefore, it is beneficial to reduce the calculation amount and occupied bandwidth of the display system.
  • Optionally, the display system also renders the second part of the second image. Since the second part of the second image is much smaller than the first part of the second image, the amount of calculation required to render the second part of the second image is much smaller than the amount of calculation required to render the first part of the second image. Therefore, while widening the range of the image presented to the user by the display, only a small amount of calculation and a small bandwidth are consumed, which is beneficial to reducing the operating load of the GPU in the display system.
  • It should be understood that the size of the sequence numbers of the above-mentioned processes does not mean the order of execution; the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
  • the disclosed system, device, and method can be implemented in other ways.
  • the device embodiments described above are merely illustrative.
  • the division of the units is only a logical function division, and there may be other divisions in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Geometry (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

Disclosed in embodiments of the present application are a display system, a graphics processing unit (GPU), a display controller, and a display method, applicable to a VR/AR device, for example, a head-mounted display or a mobile phone. The display system comprises the GPU and the display controller, wherein the GPU is used for rendering a first image; the display controller is used for obtaining the first image from the GPU, obtaining parallax information of the first image and a second image, determining at least part of the second image on the basis of at least part of the first image and the parallax information, and sending the first image and the at least part of the second image to a display, the first image and the at least part of the second image being used for presenting a stereoscopic effect on the display. Because the display system merely renders the first image, the operating load of the GPU can be reduced when the display system provides the two images to the display for displaying.

Description

显示***、图形处理器GPU、显示控制器以及显示方法Display system, graphics processor GPU, display controller and display method 技术领域Technical field
本申请实施例涉及显示***,尤其涉及显示***、图形处理器GPU、显示控制器以及显示方法。The embodiments of the present application relate to a display system, and in particular to a display system, a graphics processor GPU, a display controller, and a display method.
背景技术Background technique
立体渲染(stereo rendering),又称为立体绘制,是一种用于显示离散三维采样数据集的二维投影的技术。也可以理解为,该技术是将某物体发出或反射的三维的光信号处理转换为二维图像的过程,以使得人眼在观看二维图像时也能够达到观看三维物体的效果。当该立体渲染技术应用于虚拟现实技术(virtual reality,VR)时,可以通过分别给用户的双眼呈现出两幅存在视差的二维图像,以使得该用户可以通过前述二维图像获得更加逼真的观看三维物体的效果。Stereo rendering, also known as stereo rendering, is a technology used to display two-dimensional projections of discrete three-dimensional sampled data sets. It can also be understood that this technology is a process of processing and converting a three-dimensional light signal emitted or reflected by an object into a two-dimensional image, so that the human eye can achieve the effect of viewing a three-dimensional object when viewing a two-dimensional image. When the stereoscopic rendering technology is applied to virtual reality (virtual reality, VR), two parallax two-dimensional images can be presented to the user's eyes respectively, so that the user can obtain more realistic images through the aforementioned two-dimensional images Watch the effect of three-dimensional objects.
目前,图形处理器(graphics processing unit,GPU)可以渲染出两个存在视差的图像,并将前述两幅图像传输至显示控制器,再由显示控制器将前述两幅图像传输至显示器,以使得该显示器可以分别对用户的左眼和右眼显示出前述两幅图像,以使得用户获得立体观影效果。At present, a graphics processing unit (GPU) can render two images with parallax, and transmit the aforementioned two images to the display controller, and then the display controller transmits the aforementioned two images to the display, so that The display can respectively display the aforementioned two images to the user's left and right eyes, so that the user can obtain a stereoscopic viewing effect.
在这样的方案中,由于,立体渲染过程需要较大的计算量并占用较大的带宽,当GPU同时对两幅图像进行渲染处理时,将使得GPU的计算量急剧增加,带宽的占用量也急剧增加,进而加重GPU的运行负荷。In such a scheme, since the stereo rendering process requires a large amount of calculation and occupies a large bandwidth, when the GPU is rendering two images at the same time, the calculation amount of the GPU will increase sharply, and the bandwidth occupation will also be A sharp increase, which in turn increases the operating load of the GPU.
发明内容Summary of the invention
本申请实施例提供了一种显示***、图形处理器GPU、显示控制器以及显示方法,用于在保证显示控制器能够同时显示两幅图像的情况下,减少GPU的运行负荷。The embodiments of the present application provide a display system, a graphics processor GPU, a display controller, and a display method, which are used to reduce the operating load of the GPU while ensuring that the display controller can display two images at the same time.
第一方面,本申请实施例提供了一种显示***,该显示***包括图形处理器GPU和显示控制器。其中,该图形处理器GPU,用于渲染第一图像,并将该第一图像提供至该显示控制器;该显示控制器,用于从该GPU获取该第一图像,获取该第一图像与第二图像的视差信息,基于该第一图像的至少一部分和该视差信息确定该第二图像的至少一部分,并将该第一图像和该第二图像的至少一部分发送至显示器,以使得该第一图像和该第二图像的至少一部分用于在该显示器上呈现立体效果。In the first aspect, an embodiment of the present application provides a display system, which includes a graphics processor GPU and a display controller. Wherein, the graphics processor GPU is used to render a first image and provide the first image to the display controller; the display controller is used to obtain the first image from the GPU, to obtain the first image and For the disparity information of the second image, determine at least a part of the second image based on at least a part of the first image and the disparity information, and send at least a part of the first image and the second image to the display, so that the first image An image and at least a part of the second image are used to present a three-dimensional effect on the display.
本申请实施例中,显示***中的图形处理器GPU仅渲染出第一图像,并将前述第一图像提供给显示控制器。该显示控制器基于前述第一图像的至少一部分和视差信息确定第二图像的至少一部分,并将前述第一图像和前述第二图像的至少一部分传输至显示器。于是,该显示器便可以通过显示所述第一图像和所述第二图像的至少一部分呈现出立体效果。在这样的方案中,该显示控制器向显示器传输了两幅图像,GPU仅渲染第一图像而无需渲染第二图像,由显示控制器通过视差信息和第一图像得到第二图像的至少一部分,减少了GPU渲染图像的运行负荷,例如,可以减少GPU的计算量和带宽。In the embodiment of the present application, the graphics processor GPU in the display system only renders the first image, and provides the aforementioned first image to the display controller. The display controller determines at least a part of the second image based on at least a part of the first image and the parallax information, and transmits at least a part of the first image and the second image to the display. Therefore, the display can present a three-dimensional effect by displaying at least a part of the first image and the second image. In such a solution, the display controller transmits two images to the display, the GPU only renders the first image without rendering the second image, and the display controller obtains at least a part of the second image through the disparity information and the first image. Reduce the running load of GPU rendering images, for example, can reduce GPU calculation and bandwidth.
根据第一方面,本申请实施例第一方面的第一种实施方式中,该第一图像包括多个像素,该多个像素中的每个像素对应一个深度值,该视差信息包括该第一图像的至少一部分中的第一像素与该第二图像中与该第一像素对应的第二像素之间的视差值。该GPU,还用于基于该第一像素的深度值确定该视差值。该显示控制器,具体用于从该GPU获取该视差值。According to the first aspect, in the first implementation manner of the first aspect of the embodiments of the present application, the first image includes a plurality of pixels, each pixel of the plurality of pixels corresponds to a depth value, and the disparity information includes the first The disparity value between a first pixel in at least a part of the image and a second pixel in the second image corresponding to the first pixel. The GPU is also used to determine the disparity value based on the depth value of the first pixel. The display controller is specifically configured to obtain the disparity value from the GPU.
本实施方式中,提出视差信息是由图形处理器GPU确定的。由于该图形处理器GPU仅需渲染出第一图像以及计算出视差信息,并将该视差信息和第一图像提供至显示控制器,由该显示控制器进行后续计算。在此过程中,该图形处理器GPU无需对两幅图像进行渲染。因此,有利于减少图形处理器GPU的运行负荷,例如,减少图形处理器GPU占用的带宽,减少图形处理器GPU的计算量。In this embodiment, it is proposed that the disparity information is determined by the graphics processor GPU. Since the graphics processor GPU only needs to render the first image and calculate the disparity information, and provide the disparity information and the first image to the display controller, the display controller performs subsequent calculations. In this process, the graphics processor GPU does not need to render two images. Therefore, it is beneficial to reduce the operating load of the graphics processor GPU, for example, reduce the bandwidth occupied by the graphics processor GPU, and reduce the calculation amount of the graphics processor GPU.
根据第一方面的第一种实施方式,本申请实施例第一方面的第二种实施方式中,该GPU,具体用于基于该第一像素的深度值、焦距以及基线长度确定该视差值。According to the first implementation manner of the first aspect, in the second implementation manner of the first aspect of the embodiments of the present application, the GPU is specifically configured to determine the disparity value based on the depth value, focal length, and baseline length of the first pixel .
本实施方式中,提出该图形处理器GPU确定视差信息的具体方式,即利用视差原理分别对每个像素的视差值进行计算,最终可以获得该第一图像与第二图像的视差信息。In this embodiment, a specific method for the graphics processor GPU to determine the disparity information is proposed, that is, the disparity value of each pixel is calculated separately using the disparity principle, and finally the disparity information of the first image and the second image can be obtained.
根据第一方面,本申请实施例第一方面的第三种实施方式中,该第一图像包括多个像素,该多个像素中的每个像素对应一个深度值,该视差信息包括该第一图像的至少一部分中的第一像素与该第二图像中与该第一像素对应的第二像素之间的视差值。该显示控制器,具体用于获取该第一像素的深度值,并基于该第一像素的深度值确定该视差值。According to the first aspect, in a third implementation manner of the first aspect of the embodiments of the present application, the first image includes a plurality of pixels, each pixel of the plurality of pixels corresponds to a depth value, and the disparity information includes the first The disparity value between a first pixel in at least a part of the image and a second pixel in the second image corresponding to the first pixel. The display controller is specifically configured to obtain the depth value of the first pixel, and determine the disparity value based on the depth value of the first pixel.
本实施方式中,提出该视差信息是由显示控制器确定的。由于,该图形处理器GPU仅需渲染出第一图像,而无需计算视差信息,由该显示控制器负责后续计算。因此,相比于第一方面的第一种实施方式,可以进一步减少图形处理器GPU的运行负荷,例如,进一步减少图形处理器GPU占用的带宽,进一步减少图形处理器GPU的计算量。In this embodiment, it is proposed that the disparity information is determined by the display controller. Since the graphics processor GPU only needs to render the first image without calculating the disparity information, the display controller is responsible for subsequent calculations. Therefore, compared with the first implementation manner of the first aspect, the operating load of the graphics processor GPU can be further reduced, for example, the bandwidth occupied by the graphics processor GPU is further reduced, and the calculation amount of the graphics processor GPU is further reduced.
根据第一方面的第三种实施方式,本申请实施例第一方面的第四种实施方式中,该显示控制器,具体用于基于该第一像素的深度值、焦距以及基线长度确定该视差值。According to the third implementation manner of the first aspect, in the fourth implementation manner of the first aspect of the embodiments of the present application, the display controller is specifically configured to determine the view based on the depth value, focal length, and baseline length of the first pixel. Difference.
本实施方式中,提出该显示控制器确定视差信息的具体方式,即利用视差原理分别对每个像素的视差值进行计算,最终可以获得该第一图像与第二图像的视差信息。In this embodiment, a specific method for the display controller to determine the disparity information is proposed, that is, the disparity value of each pixel is calculated separately using the disparity principle, and finally the disparity information of the first image and the second image can be obtained.
根据第一方面的第一种实施方式至第一方面的第四种实施方式中的任意一种实施方式,本申请实施例第一方面的第五种实施方式中,该显示控制器,具体用于基于该第一像素和该视差值在该第二图像的至少一部分中定位该第二像素,并将该第一像素的像素值作为该第二像素的像素值。According to any one of the first implementation manner of the first aspect to the fourth implementation manner of the first aspect, in the fifth implementation manner of the first aspect of the embodiments of the present application, the display controller is specifically used Positioning the second pixel in at least a part of the second image based on the first pixel and the disparity value, and using the pixel value of the first pixel as the pixel value of the second pixel.
根据第一方面、第一方面的第一种实施方式至第一方面的第五种实施方式中的任意一种实施方式,本申请实施例第一方面的第六种实施方式中,该第二图像的至少一部分为该第二图像的第一部分。该GPU,还用于渲染该第二图像的第二部分,该第二图像的第二部分不包括在该第一图像中。该显示控制器,还用于获取该第二图像的第二部分,将该第二图像的第二部分发送至该显示器,该第一图像、和该第二图像的第一部分与第二部分的结合用于在该显示器上呈现立体效果。According to any one of the first aspect, the first implementation manner of the first aspect to the fifth implementation manner of the first aspect, in the sixth implementation manner of the first aspect of the embodiments of the present application, the second At least a part of the image is the first part of the second image. The GPU is also used to render the second part of the second image, and the second part of the second image is not included in the first image. The display controller is also used to obtain the second part of the second image, and send the second part of the second image to the display, the first image, and the first part and the second part of the second image The combination is used to present a three-dimensional effect on the display.
本实施方式中,该图形处理器GPU确定第二图像的第二部分,并由显示控制器传输至 显示器显示,有利于拓宽该显示器所能呈现的第二图像的范围,有利于拓宽第一图像和第二图像在人眼中呈现的像的范围。In this embodiment, the graphics processor GPU determines the second part of the second image and transmits it to the display for display by the display controller, which is beneficial to broaden the range of the second image that the display can present, and is beneficial to broaden the first image And the range of the second image in the human eye.
第二方面,本申请实施例提供了一种图形处理器GPU,该图形处理器GPU包括处理模块和接口模块。其中,该处理模块,用于通过渲染获得第一图像,并获取该第一图像与第二图像的视差信息。该接口模块,用于将该第一图像和该视差信息提供至显示控制器,该第一图像和该第二图像的至少一部分用于在该显示控制器对应的显示器上呈现立体效果。In the second aspect, an embodiment of the present application provides a graphics processor GPU, which includes a processing module and an interface module. Wherein, the processing module is configured to obtain the first image through rendering, and obtain the disparity information of the first image and the second image. The interface module is configured to provide the first image and the disparity information to a display controller, and at least a part of the first image and the second image are used to present a stereoscopic effect on a display corresponding to the display controller.
本实施例中,显示***中的图形处理器GPU仅需确定第一图像和视差信息,并将第一图像和视差信息提供给显示控制器。由该显示控制器基于前述第一图像的至少一部分和视差信息确定第二图像的至少一部分,并将前述第一图像和前述第二图像的至少一部分传输至显示器。于是,该显示器便可以通过显示所述第一图像和所述第二图像的至少一部分呈现出立体效果。在这样的方案中,该显示控制器向显示器传输了两幅图像,GPU仅渲染第一图像而无需渲染第二图像,由显示控制器通过视差信息和第一图像得到第二图像的至少一部分,减少了GPU渲染图像的运行负荷,例如,可以减少GPU的计算量和带宽。In this embodiment, the graphics processor GPU in the display system only needs to determine the first image and disparity information, and provide the first image and disparity information to the display controller. The display controller determines at least a part of the second image based on at least a part of the first image and the parallax information, and transmits at least a part of the first image and the second image to the display. Therefore, the display can present a three-dimensional effect by displaying at least a part of the first image and the second image. In such a solution, the display controller transmits two images to the display, the GPU only renders the first image without rendering the second image, and the display controller obtains at least a part of the second image through the disparity information and the first image. Reduce the running load of GPU rendering images, for example, can reduce GPU calculation and bandwidth.
根据第二方面,本申请实施例第二方面的第一种实施方式中,该第一图像包括多个像素,该第一图像中的每个像素对应一个深度值,该视差信息包括该第一图像的至少一部分中的第一像素与该第二图像中与该第一像素对应的第二像素之间的视差值。该处理模块,具体用于基于该第一像素的深度值确定该视差值。According to a second aspect, in a first implementation manner of the second aspect of the embodiments of the present application, the first image includes a plurality of pixels, each pixel in the first image corresponds to a depth value, and the disparity information includes the first The disparity value between a first pixel in at least a part of the image and a second pixel in the second image corresponding to the first pixel. The processing module is specifically configured to determine the disparity value based on the depth value of the first pixel.
本实施方式中,提出视差信息是由图形处理器GPU确定的。由于该图形处理器GPU仅需渲染出第一图像以及计算出视差信息,并将该视差信息和第一图像提供至显示控制器,由该显示控制器进行后续计算。在此过程中,该图形处理器GPU无需对两幅图像进行渲染。因此,有利于减少图形处理器GPU的运行负荷,例如,减少图形处理器GPU占用的带宽,减少图形处理器GPU的计算量。In this embodiment, it is proposed that the disparity information is determined by the graphics processor GPU. Since the graphics processor GPU only needs to render the first image and calculate the disparity information, and provide the disparity information and the first image to the display controller, the display controller performs subsequent calculations. In this process, the graphics processor GPU does not need to render two images. Therefore, it is beneficial to reduce the operating load of the graphics processor GPU, for example, reduce the bandwidth occupied by the graphics processor GPU, and reduce the calculation amount of the graphics processor GPU.
根据第二方面的第一种实施方式,本申请实施例第二方面的第二种实施方式中,该处理模块,具体用于基于该第一像素的深度值、焦距以及基线长度确定该视差值。According to the first implementation manner of the second aspect, in the second implementation manner of the second aspect of the embodiments of the present application, the processing module is specifically configured to determine the parallax based on the depth value, focal length, and baseline length of the first pixel value.
本实施方式中,提出该图形处理器GPU确定视差信息的具体方式,即利用视差原理分别对每个像素的视差值进行计算,最终可以获得该第一图像与第二图像的视差信息。In this embodiment, a specific method for the graphics processor GPU to determine the disparity information is proposed, that is, the disparity value of each pixel is calculated separately using the disparity principle, and finally the disparity information of the first image and the second image can be obtained.
根据第二方面、第二方面的第一种实施方式至第二方面的第二种实施方式中的任意一种实施方式,本申请实施例第二方面的第三种实施方式中,该第二图像的至少一部分为该第二图像的第一部分。该处理模块,还用于渲染该第二图像的第二部分,该第二图像的第二部分不包括在该第一图像中;该接口模块,还用于将该第二图像的第二部分提供至该显示控制器,该第一图像、和该第二图像的第一部分与第二部分的结合用于在该显示控制器对应的显示器上呈现立体效果。According to any one of the second aspect, the first implementation manner of the second aspect to the second implementation manner of the second aspect, in the third implementation manner of the second aspect of the embodiments of the present application, the second At least a part of the image is the first part of the second image. The processing module is also used to render the second part of the second image, the second part of the second image is not included in the first image; the interface module is also used to render the second part of the second image Provided to the display controller, the first image and the combination of the first part and the second part of the second image are used to present a stereoscopic effect on a display corresponding to the display controller.
本实施方式中,该图形处理器GPU确定第二图像的第二部分,并由显示控制器传输至显示器显示,有利于拓宽该显示器所能呈现的第二图像的范围,有利于拓宽第一图像和第二图像在人眼中呈现的像的范围。In this embodiment, the graphics processor GPU determines the second part of the second image and transmits it to the display for display by the display controller, which is beneficial to broaden the range of the second image that the display can present, and is beneficial to broaden the first image And the range of the second image in the human eye.
第三方面,本申请实施例提供了一种显示控制器,该显示控制器包括处理模块和接口模块。其中,该处理模块,用于从图形处理器GPU获取第一图像,获取该第一图像与第二 图像的视差信息,基于该第一图像的至少一部分和该视差信息确定该第二图像的至少一部分。该接口模块,用于将该第一图像和该第二图像的至少一部分发送至显示器,该第一图像和该第二图像的至少一部分用于在该显示器上呈现立体效果。In a third aspect, an embodiment of the present application provides a display controller, which includes a processing module and an interface module. Wherein, the processing module is configured to obtain the first image from the graphics processor GPU, obtain the disparity information of the first image and the second image, and determine at least part of the second image based on at least a part of the first image and the disparity information. Part. The interface module is configured to send at least a part of the first image and the second image to a display, and at least a part of the first image and the second image are used to present a stereoscopic effect on the display.
本实施例中,显示***中的图形处理器GPU仅需确定第一图像,由该显示控制器基于前述第一图像的至少一部分和视差信息确定第二图像的至少一部分,并将前述第一图像和前述第二图像的至少一部分传输至显示器。于是,该显示器便可以通过显示所述第一图像和所述第二图像的至少一部分呈现出立体效果。在这样的方案中,该显示控制器向显示器传输了两幅图像,GPU仅渲染第一图像而无需渲染第二图像,由显示控制器通过视差信息和第一图像得到第二图像的至少一部分,减少了GPU渲染图像的运行负荷,例如,可以减少GPU的计算量和带宽。In this embodiment, the graphics processor GPU in the display system only needs to determine the first image, and the display controller determines at least a part of the second image based on at least a part of the first image and the disparity information, and combines the first image And at least a part of the aforementioned second image is transmitted to the display. Therefore, the display can present a three-dimensional effect by displaying at least a part of the first image and the second image. In such a solution, the display controller transmits two images to the display, the GPU only renders the first image without rendering the second image, and the display controller obtains at least a part of the second image through the disparity information and the first image. Reduce the running load of GPU rendering images, for example, can reduce GPU calculation and bandwidth.
根据第三方面,本申请实施例第三方面的第一种实施方式中,该第一图像包括多个像素,该多个像素中的每个像素对应一个深度值,该视差信息包括该第一图像的至少一部分中的第一像素与该第二图像中与该第一像素对应的第二像素之间的视差值。该处理模块,具体用于从该GPU获取该视差值。According to a third aspect, in a first implementation manner of the third aspect of the embodiments of the present application, the first image includes a plurality of pixels, each pixel of the plurality of pixels corresponds to a depth value, and the disparity information includes the first The disparity value between a first pixel in at least a part of the image and a second pixel in the second image corresponding to the first pixel. The processing module is specifically configured to obtain the disparity value from the GPU.
本实施方式中,提出视差信息是由图形处理器GPU确定的,并提供给显示控制器以使得显示控制器进行后续计算。In this embodiment, it is proposed that the disparity information is determined by the graphics processor GPU and provided to the display controller to enable the display controller to perform subsequent calculations.
根据第三方面,本申请实施例第三方面的第二种实施方式中,该第一图像包括多个像素,该多个像素中的每个像素对应一个深度值,该视差信息包括该第一图像的至少一部分中的第一像素与该第二图像中与该第一像素对应的第二像素之间的视差值。该处理模块,具体用于获取该第一像素的深度值,并基于该第一像素的深度值确定该视差值。According to a third aspect, in a second implementation manner of the third aspect of the embodiments of the present application, the first image includes a plurality of pixels, each pixel of the plurality of pixels corresponds to a depth value, and the disparity information includes the first The disparity value between a first pixel in at least a part of the image and a second pixel in the second image corresponding to the first pixel. The processing module is specifically configured to obtain the depth value of the first pixel, and determine the disparity value based on the depth value of the first pixel.
本实施方式中,提出视差信息是由显示控制器确定的,该显示控制器基于GPU提供的深度值确定的视差信息,以使得显示控制器进行后续计算。In this embodiment, it is proposed that the disparity information is determined by the display controller, and the display controller determines the disparity information based on the depth value provided by the GPU, so that the display controller performs subsequent calculations.
根据第三方面的第二种实施方式,本申请实施例第三方面的第三种实施方式中,该处理模块,具体用于基于该第一像素的深度值、焦距以及基线长度确定该视差值。According to the second implementation manner of the third aspect, in the third implementation manner of the third aspect of the embodiments of the present application, the processing module is specifically configured to determine the parallax based on the depth value, focal length, and baseline length of the first pixel value.
本实施方式中,提出该显示控制器确定视差信息的具体方式,即利用视差原理分别对每个像素的视差值进行计算,最终可以获得该第一图像与第二图像的视差信息。In this embodiment, a specific method for the display controller to determine the disparity information is proposed, that is, the disparity value of each pixel is calculated separately using the disparity principle, and finally the disparity information of the first image and the second image can be obtained.
根据第三方面、第三方面的第一种实施方式至第三方面的第三种实施方式中的任意一种实施方式,本申请实施例第三方面的第四种实施方式中,该处理模块,具体用于基于该第一像素和该视差值在该第二图像的至少一部分中定位该第二像素,并将该第一像素的像素值作为该第二像素的像素值。According to any one of the third aspect, the first implementation manner of the third aspect to the third implementation manner of the third aspect, in the fourth implementation manner of the third aspect of the embodiments of the present application, the processing module , Specifically used to locate the second pixel in at least a part of the second image based on the first pixel and the disparity value, and use the pixel value of the first pixel as the pixel value of the second pixel.
根据第三方面、第三方面的第一种实施方式至第三方面的第四种实施方式中的任意一种实施方式,本申请实施例第三方面的第五种实施方式中,该第二图像的至少一部分为该第二图像的第一部分。该处理模块,还用于从该GPU获取该第二图像的第二部分;该接口模块,还用于将该第二图像的第二部分发送至该显示器,该第二图像的第二部分不包括在该第一图像中,该第一图像、和该第二图像的第一部分与第二部分的结合用于在该显示器上呈现立体效果。According to any one of the third aspect, the first implementation manner of the third aspect to the fourth implementation manner of the third aspect, in the fifth implementation manner of the third aspect of the embodiments of the present application, the second At least a part of the image is the first part of the second image. The processing module is also used to obtain the second part of the second image from the GPU; the interface module is also used to send the second part of the second image to the display, and the second part of the second image is not Included in the first image, the first image and the combination of the first part and the second part of the second image are used to present a three-dimensional effect on the display.
本实施方式中,该图形处理器GPU确定第二图像的第二部分,并由显示控制器传输至 显示器显示,有利于拓宽该显示器所能呈现的第二图像的范围,有利于拓宽第一图像和第二图像在人眼中呈现的像的范围。In this embodiment, the graphics processor GPU determines the second part of the second image and transmits it to the display for display by the display controller, which is beneficial to broaden the range of the second image that the display can present, and is beneficial to broaden the first image. And the range of the second image in the human eye.
第四方面,本申请实施例提供了一种显示方法,在该方法中,显示***渲染第一图像,并获取该第一图像与第二图像的视差信息。然后,该显示***基于该第一图像的至少一部分和该视差信息确定该第二图像的至少一部分,并将该第一图像和该第二图像的至少一部分发送至显示器,该第一图像和该第二图像的至少一部分用于在该显示器上呈现立体效果。In a fourth aspect, an embodiment of the present application provides a display method, in which a display system renders a first image, and obtains parallax information of the first image and the second image. Then, the display system determines at least a part of the second image based on at least a part of the first image and the disparity information, and sends at least a part of the first image and the second image to a display, the first image and the disparity information At least a part of the second image is used to present a three-dimensional effect on the display.
本实施例中,显示***仅渲染出第一图像,并基于前述第一图像的至少一部分和视差信息确定第二图像的至少一部分,并将前述第一图像和前述第二图像的至少一部分传输至显示器显示。在这样的方案中,由于该显示***仅渲染第一图像而无需渲染第二图像,减少了显示***渲染图像的运行负荷,例如,可以减少显示***中的GPU的计算量和带宽。In this embodiment, the display system only renders the first image, determines at least a part of the second image based on at least a part of the first image and disparity information, and transmits at least part of the first image and the second image to The display shows. In such a solution, since the display system only renders the first image and does not need to render the second image, the operating load of the display system for rendering images is reduced. For example, the calculation amount and bandwidth of the GPU in the display system can be reduced.
从以上技术方案可以看出,本申请实施例具有以下优点:本申请实施例中,显示***中的图形处理器GPU仅渲染出第一图像,并将前述第一图像提供给显示控制器。该显示控制器基于前述第一图像的至少一部分和视差信息确定第二图像的至少一部分,并将前述第一图像和前述第二图像的至少一部分传输至显示器。于是,该显示器便可以通过显示所述第一图像和所述第二图像的至少一部分呈现出立体效果。在这样的方案中,该显示控制器向显示器传输了两幅图像,GPU仅渲染第一图像而无需渲染第二图像,由显示控制器通过视差信息和第一图像得到第二图像的至少一部分,减少了GPU渲染图像的运行负荷,例如,可以减少GPU的计算量和带宽。It can be seen from the above technical solutions that the embodiments of the present application have the following advantages: In the embodiments of the present application, the graphics processor GPU in the display system only renders the first image, and provides the aforementioned first image to the display controller. The display controller determines at least a part of the second image based on at least a part of the first image and the parallax information, and transmits at least a part of the first image and the second image to the display. Therefore, the display can present a three-dimensional effect by displaying at least a part of the first image and the second image. In such a solution, the display controller transmits two images to the display, the GPU only renders the first image without rendering the second image, and the display controller obtains at least a part of the second image through the disparity information and the first image. Reduce the running load of GPU rendering images, for example, can reduce GPU calculation and bandwidth.
附图说明Description of the drawings
为了更清楚地说明本申请实施例的技术方案,下面将对实施例描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本申请的一些实施例。In order to more clearly describe the technical solutions of the embodiments of the present application, the following will briefly introduce the drawings needed in the description of the embodiments. Obviously, the drawings in the following description are only some embodiments of the present application.
图1为本申请实施例中一个视差原理示意图;FIG. 1 is a schematic diagram of a parallax principle in an embodiment of the application;
图2为本申请实施例中显示***的一个实施例示意图;Figure 2 is a schematic diagram of an embodiment of a display system in an embodiment of the application;
图3A为本申请实施例中另一个视差原理示意图;FIG. 3A is a schematic diagram of another parallax principle in an embodiment of this application;
图3B为本申请实施例中独立式头戴显示器的一个应用场景图;FIG. 3B is a diagram of an application scenario of the stand-alone head-mounted display in an embodiment of the application;
图3C为本申请实施例中绑定式头戴显示器的一个应用场景图;FIG. 3C is a diagram of an application scenario of the bound head-mounted display in an embodiment of the application;
图3D为本申请实施例中左显示屏和右显示屏的显示效果的一个示意图;FIG. 3D is a schematic diagram of the display effects of the left display screen and the right display screen in an embodiment of the application;
图3E为本申请实施例中左显示屏和右显示屏的显示效果的另一个示意图;3E is another schematic diagram of the display effects of the left display screen and the right display screen in an embodiment of the application;
图3F为本申请实施例中左显示屏和右显示屏的显示效果的另一个示意图;3F is another schematic diagram of the display effects of the left display screen and the right display screen in an embodiment of the application;
图4为本申请实施例中图形处理器GPU的一个实施例示意图;FIG. 4 is a schematic diagram of an embodiment of a graphics processor GPU in an embodiment of the application;
图5为本申请实施例中显示控制器的一个实施例示意图;FIG. 5 is a schematic diagram of an embodiment of a display controller in an embodiment of the application;
图6A为本申请实施例中第一图像和第二图像的一个示意图;FIG. 6A is a schematic diagram of the first image and the second image in an embodiment of this application;
图6B为本申请实施例中第一图像和第二图像的另一个示意图;FIG. 6B is another schematic diagram of the first image and the second image in the embodiment of this application;
图6C为本申请实施例中第一图像和第二图像的另一个示意图;FIG. 6C is another schematic diagram of the first image and the second image in an embodiment of this application;
图6D为本申请实施例中第一图像和第二图像的另一个示意图;FIG. 6D is another schematic diagram of the first image and the second image in an embodiment of this application;
图6E为本申请实施例中第一图像和第二图像的另一个示意图;FIG. 6E is another schematic diagram of the first image and the second image in an embodiment of this application;
图7为本申请实施例中显示方法的一个流程图。FIG. 7 is a flowchart of the display method in an embodiment of the application.
具体实施方式Detailed ways
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例仅仅是本申请一部分实施例,而不是全部的实施例。The technical solutions in the embodiments of the present application will be clearly and completely described below in conjunction with the accompanying drawings in the embodiments of the present application. Obviously, the described embodiments are only a part of the embodiments of the present application, rather than all the embodiments.
本申请的说明书和权利要求书及上述附图中的术语“第一”、“第二”、“第三”、“第四”等(如果存在)是用于区别类似的对象,而不必用于描述特定的顺序或先后次序。应该理解这样使用的数据在适当情况下可以互换,以便这里描述的实施例能够以除了在这里图示或描述的内容以外的顺序实施。此外,术语“包括”和“具有”以及他们的任何变形,意图在于覆盖不排他的包含,例如,包含了一系列步骤或单元的过程、方法、***、产品或设备不必限于清楚地列出的那些步骤或单元,而是可包括没有清楚地列出的或对于这些过程、方法、产品或设备固有的其它步骤或单元。The terms "first", "second", "third", "fourth", etc. (if any) in the description and claims of this application and the above-mentioned drawings are used to distinguish similar objects, without having to use To describe a specific order or sequence. It should be understood that the data used in this way can be interchanged under appropriate circumstances so that the embodiments described herein can be implemented in a sequence other than the content illustrated or described herein. In addition, the terms "including" and "having" and any variations of them are intended to cover non-exclusive inclusions. For example, a process, method, system, product, or device that includes a series of steps or units is not necessarily limited to those clearly listed. Those steps or units may include other steps or units that are not clearly listed or are inherent to these processes, methods, products, or equipment.
本申请实施例提供了一种显示***、图形处理器GPU、显示控制器以及显示方法,用于使得显示***提供两幅图像给显示器显示时,可以减少GPU的运行负荷。为便于理解,下面先对本申请实施例所提出的显示***和显示方法的应用场景进行介绍。The embodiments of the present application provide a display system, a graphics processor GPU, a display controller, and a display method, which are used to reduce the operating load of the GPU when the display system provides two images for display on the display. For ease of understanding, the following first introduces the application scenarios of the display system and the display method proposed in the embodiments of the present application.
本申请实施例所提出的显示***主要应用于增强现实(augmented reality,AR)设备或虚拟现实(virtual reality,VR)设备中。具体地,用户可以佩戴装载有前述显示***的AR/VR设备,通过该AR/VR设备给人眼提供的二维图像获得立体观影效果。一般地,AR/VR设备包括头戴式显示器(head-mounted displays,HMDs),该头戴式显示器可以通过一组光学***放大超微显示屏上的图像,将显示***计算出的图像分别呈现于用户的左右眼,以使得用户通过分别呈现于左眼和右眼的图像获得立体的视觉效果。本申请实施例中的显示***可以集成于前述头戴式显示器中,也可以位于与前述头戴式显示器相连的计算机中,还可以位于其他设备中,例如位于手机中,具体此处不做限定。The display system proposed in the embodiments of the present application is mainly applied to augmented reality (AR) devices or virtual reality (VR) devices. Specifically, the user may wear an AR/VR device loaded with the aforementioned display system, and obtain a stereoscopic viewing effect through the two-dimensional image provided by the AR/VR device to the human eye. Generally, AR/VR devices include head-mounted displays (HMDs), which can magnify the image on the ultra-micro display through a set of optical systems, and present the images calculated by the display system separately For the left and right eyes of the user, so that the user can obtain a three-dimensional visual effect through the images presented to the left and right eyes respectively. The display system in the embodiments of the present application can be integrated in the aforementioned head-mounted display, or can be located in a computer connected to the aforementioned head-mounted display, and can also be located in other devices, such as a mobile phone, which is not specifically limited here. .
为便于理解本申请实施例所提出的显示***,下面将对该显示***涉及的视差原理进行介绍。具体如图1所示,为视差原理示意图。其中,o和o’分别为左光心和右光心,像素p为物体P在左光轴上位于图像1中的成像点,像素p’点为物体P在右光轴上位于图像2中的成像点。假设,图像1中的像素p位于距离图像1左侧x1处,图像2中的像素p’位于距离图像2左侧x2处,则图像1中的像素p与图像2中的像素p’之间的视差值为z=x1-x2(公式1)。又由于,该视差值z也可以由如下公式确定:z=f×b/d(公式2);其中,f为焦距,即左光心o与图像1之间的距离,或者,右光心o’与图像2之间的距离;b为前述左光心o和右光心o’之间的距离,也被称为基线长度;d为像素的深度值。因此,当已知焦距f、基线长度b、像素深度值d和p像素在图像1中的坐标值x1时,可以计算出像素p’在图像2中的坐标x2。也就是说,当确定前述两幅图像中的一幅图像时,可以基于视差原理确定另一幅图像。In order to facilitate the understanding of the display system proposed in the embodiment of the present application, the parallax principle involved in the display system will be introduced below. Specifically, as shown in Figure 1, it is a schematic diagram of the parallax principle. Among them, o and o'are the left optical center and the right optical center respectively, the pixel p is the imaging point where the object P is located in the image 1 on the left optical axis, and the pixel p'is the object P is located in the image 2 on the right optical axis.的imaging point. Assuming that the pixel p in image 1 is located x1 from the left of image 1, and the pixel p'in image 2 is located x2 from the left of image 2, then the pixel p in image 1 and the pixel p'in image 2 are between The parallax value of is z=x1-x2 (Equation 1). Moreover, the disparity value z can also be determined by the following formula: z=f×b/d (formula 2); where f is the focal length, that is, the distance between the left optical center o and the image 1, or the right optical The distance between the center o'and the image 2; b is the distance between the aforementioned left optical center o and the right optical center o', also known as the baseline length; d is the depth value of the pixel. Therefore, when the focal length f, the baseline length b, the pixel depth value d, and the coordinate value x1 of the p pixel in the image 1 are known, the coordinate x2 of the pixel p'in the image 2 can be calculated. That is, when determining one of the foregoing two images, the other image can be determined based on the principle of parallax.
本申请实施例提出的显示***正是利用了前述视差原理,同时向用户的左眼和右眼呈现两幅存在一定视差的图像,可以使得该用户观看的两幅图像在用户的视网膜处形成立体的虚像,进而使用户获得观看立体物体的效果。The display system proposed in the embodiment of the present application just utilizes the aforementioned parallax principle, and simultaneously presents two images with a certain parallax to the left and right eyes of the user, so that the two images viewed by the user can form a stereoscopic image on the user’s retina. The virtual image of the user can obtain the effect of viewing three-dimensional objects.
下面将基于前述原理对显示***的主要结构进行介绍,如图2所示,为本申请实施例提出的一种显示***20的结构示意图。该显示***20包括:图形处理器GPU 201和显示控制器202。其中,图形处理器GPU 201,用于渲染第一图像。该第一图像为某一视角的图像,例如,该第一图像为提供给用户左眼的图像或提供给用户右眼的图像,具体此处不做限定。前述渲染可以为立体渲染(stereo rendering)。具体地,该图形处理器GPU 201进行顶点处理(vertex processing),并将顶点处理获得的多个顶点中的每三个顶点形成三角形。然后,该图形处理器GPU 201进行光栅化(rasterization)处理。最后,图形处理器GPU 201进行像素处理,得到前述第一图像。此外,该显示控制器202,用于从该图形处理器GPU 201获取该第一图像,获取该第一图像与第二图像的视差信息,并且,基于该第一图像的至少一部分和该视差信息确定该第二图像的至少一部分。其中,该第二图像为另一视角的图像,例如,若前述第一图像为提供给用户左眼的图像,则该第二图像为提供给用户右眼的图像;若前述第一图像为提供给用户右眼的图像,则该第二图像为提供给用户左眼的图像,具体此处不做限定。The main structure of the display system will be introduced below based on the foregoing principles. As shown in FIG. 2, a schematic structural diagram of a display system 20 proposed in this embodiment of the application. The display system 20 includes a graphics processor GPU 201 and a display controller 202. Among them, the graphics processor GPU 201 is used to render the first image. The first image is an image from a certain perspective. For example, the first image is an image provided to the user's left eye or an image provided to the user's right eye, which is not specifically limited here. The aforementioned rendering may be stereo rendering. Specifically, the graphics processor GPU 201 performs vertex processing, and forms triangles for every three vertices among the multiple vertices obtained by the vertex processing. Then, the graphics processor GPU 201 performs rasterization processing. Finally, the graphics processor GPU 201 performs pixel processing to obtain the aforementioned first image. In addition, the display controller 202 is configured to obtain the first image from the graphics processor GPU 201, obtain the disparity information of the first image and the second image, and based on at least a part of the first image and the disparity information Determine at least a part of the second image. Wherein, the second image is an image from another perspective. For example, if the aforementioned first image is an image provided to the user's left eye, the second image is an image provided to the user's right eye; if the aforementioned first image is an image provided to the user's right eye; The image for the right eye of the user, the second image is the image provided for the left eye of the user, and the details are not limited here.
In addition, the aforementioned first image contains multiple pixels and the aforementioned second image contains multiple pixels. For ease of description, a pixel in the first image is referred to as a first pixel, and a pixel in the second image is referred to as a second pixel. The disparity information between the first image and the second image includes the disparity value between a first pixel in the first image and the corresponding second pixel of that first pixel in the second image; it can also be understood as including the disparity value between each first pixel in the first image and the corresponding second pixel in the second image. Specifically, the disparity value may be the difference between the coordinate value of the first pixel and the coordinate value of the second pixel corresponding to the first pixel. For ease of understanding, the aforementioned FIG. 1 is still used as an example: the first image in this embodiment may be image 1 in FIG. 1, and the second image in this embodiment may be image 2 in FIG. 1. Since there is a parallax between the first image and the second image, the display controller 202 can determine at least a part of the second image based on at least a part of the first image and the disparity information. For details, refer to FIG. 3A, in which the first image is [a1, a2] and the second image is [b1, b2]; there is a parallax between the [a3, a2] part of the first image and the [b1, b3] part of the second image, so that in the formed virtual image the [a3, a2] part of the first image overlaps with the [b1, b3] part of the second image. Therefore, the display controller 202 can determine at least a part of the second image (the [b1, b3] part of the second image) based on at least a part of the first image (for example, the [a3, a2] part of the first image) and the disparity information. In this case, the disparity information can be understood as the disparity information between the [a3, a2] part of the first image and the [b1, b3] part of the second image.
In addition, the display controller 202 is further configured to send the aforementioned first image and at least a part of the aforementioned second image to a display (not shown), so that the first image and at least a part of the second image present a stereoscopic effect on the display. The display may be a head-mounted display, in which a left display screen and a right display screen are arranged. It can be understood that the head-mounted display may be replaced by other types of displays, which is not limited in this embodiment. Since the aforementioned first image is the image observed by the left eye and the aforementioned second image is the image observed by the right eye, the first image is displayed on the left display screen and the second image is displayed on the right display screen. For the principle of the head-mounted display, refer to the related introduction in the embodiment corresponding to FIG. 1; details are not repeated here.
In practical applications, the aforementioned display system 20 may be integrated with the display in the same device, or the display system 20 and the display may be distributed in different devices. Taking the case where the display is located in the aforementioned head-mounted display as an example, head-mounted displays can be divided into stand-alone head-mounted displays and tethered head-mounted displays.
The stand-alone head-mounted display refers to a head-mounted display in which modules such as computing and processing are integrated in the head-mounted display itself, so that no external computer needs to be connected. For example, FIG. 3B shows a stand-alone head-mounted display; in this case, the display system 20 and the display are both located in the stand-alone head-mounted display. The user can obtain a stereoscopic viewing effect by viewing the first image and at least a part of the second image displayed by the stand-alone head-mounted display.
In addition, the tethered head-mounted display needs to be connected to an external computer: the data is processed by the external computer, and the tethered head-mounted display performs the display. For example, FIG. 3C shows a tethered head-mounted display; in this case, the display system 20 may be located in the external computer connected to the tethered head-mounted display, and the display is located in the tethered head-mounted display. The display system 20 in the external computer determines the aforementioned first image and at least a part of the second image, and then transmits the first image and at least a part of the second image to the head-mounted display, which presents the stereoscopic effect to the user. In this implementation, the connection between the display system 20 and the display may be a wired connection or a wireless connection. When a wireless connection is adopted, wireless connection methods such as wireless fidelity (Wi-Fi), ZigBee, and Bluetooth, or other short-distance communication methods, may be used, which is not specifically limited here.
In this embodiment, the graphics processor GPU 201 in the display system renders only the first image and provides the first image to the display controller 202. The display controller 202 determines at least a part of the second image based on at least a part of the first image and the disparity information, and transmits the first image and at least a part of the second image to the display. The display can then present a stereoscopic effect by displaying the first image and at least a part of the second image. In such a solution, the display controller 202 transmits two images to the display, while the graphics processor GPU 201 renders only the first image without rendering the second image; the display controller 202 obtains at least a part of the second image from the disparity information and the first image. This reduces the operating load of the graphics processor GPU 201 for rendering images; for example, the computation amount and bandwidth of the graphics processor GPU 201 can be reduced.
Optionally, as shown in FIG. 2, the display system 20 may further include a storage device 203. Optionally, the aforementioned graphics processor GPU 201 and display controller 202 share the storage device 203, and the aforementioned first image, the disparity information, and at least a part of the second image are all stored in the storage device 203. This can also be understood as follows: the graphics processor GPU 201 stores the rendered first image in the storage device 203; the display controller 202 also stores the determined at least a part of the second image in the storage device 203; and the graphics processor GPU 201 stores the determined disparity information in the storage device 203, or the display controller 202 stores the determined disparity information in the storage device 203, which is not specifically limited here. In this case, when the display controller 202 obtains the first image from the GPU, the display controller 202 may obtain the first image directly from the graphics processor GPU 201, that is, the graphics processor GPU 201 directly transmits the first image to the display controller 202; alternatively, the display controller 202 may obtain the first image from the storage device 203, which is not specifically limited here. Optionally, the storage device 203 may be a double data rate synchronous dynamic random-access memory (DDR SDRAM), often referred to simply as DDR, which is used to store the aforementioned first image and the related information of each pixel in the first image.
In this embodiment, the graphics processor GPU 201 is a device that draws or renders images. In contrast to the graphics processor GPU 201, the display controller 202 is also called a display subsystem or a display driver. The display controller 202 may be used to perform layer overlay processing and send the image formed after layer overlay to the display for display. Optionally, the display controller 202 may also be used to perform processing such as flipping, enlarging, or reducing an image, which is not limited in this embodiment. The layer overlay processing includes, but is not limited to, overlaying the image drawn by the graphics processor GPU 201 with other images, such as a background image or a window.
Based on the foregoing embodiment, the display controller 202 may obtain the disparity information of the first image and the second image in multiple implementation manners, which are introduced separately below.
In an optional implementation manner, the display controller 202 may obtain the disparity information of the first image and the second image from the graphics processor GPU 201. Specifically, the graphics processor GPU 201 in the display system 20 may send the disparity information directly to the display controller 202, that is, the display controller 202 may receive the disparity information directly from the graphics processor GPU 201; alternatively, the graphics processor GPU 201 in the display system 20 sends the disparity information to the storage device 203, and the display controller 202 obtains the disparity information from the storage device 203, which is not specifically limited here. In this implementation manner, the disparity information is determined by the graphics processor GPU 201.
Specifically, the graphics processor GPU 201 is configured to obtain, when rendering the aforementioned first image, the depth value corresponding to each pixel in the first image, and then calculate the disparity value corresponding to each pixel based on the depth value corresponding to that pixel. For example, the first image includes a first pixel, and the first pixel corresponds to a depth value; the graphics processor GPU 201 may then determine, based on the depth value of the first pixel, the disparity value between the first pixel and the second pixel corresponding to the first pixel.
More specifically, the graphics processor GPU 201 is configured to determine the disparity value based on the depth value of the first pixel, the focal length, and the baseline length. Specifically, the disparity value may be determined by the following formula: z = f × b / d (Formula 2), where f is the focal length, b is the baseline length, and d is the depth value of the first pixel. For details, refer to the introduction of the parallax principle corresponding to FIG. 1; details are not repeated here.
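Purely as an illustrative sketch of the per-pixel computation described above (not the claimed implementation): the function name, the use of NumPy, and the assumption that the depth values are available as a 2D array are all choices made for this example.

```python
import numpy as np

def disparity_from_depth(depth, focal_length, baseline):
    """Compute a per-pixel disparity map using Formula 2: z = f * b / d.

    depth: 2D array of per-pixel depth values d of the first image
           (as obtained when rendering the first image).
    focal_length: focal length f (distance between optical center and image plane).
    baseline: baseline length b (distance between left and right optical centers).
    Returns a 2D array of disparity values z, one per first pixel.
    """
    depth = np.asarray(depth, dtype=np.float64)
    # Guard against zero depth to avoid division by zero (an assumption of this sketch).
    safe_depth = np.where(depth > 0, depth, np.inf)
    return focal_length * baseline / safe_depth
```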
The graphics processor GPU 201 performs similar processing on each pixel in the first image to obtain the disparity information, and the disparity information includes multiple disparity values. Specifically, the disparity information includes the disparity value between a first pixel in at least a part of the first image and the second pixel corresponding to that first pixel in the second image.
In this implementation manner, the graphics processor GPU 201 only needs to render the first image, calculate the disparity information using the depth value corresponding to each pixel in the first image, and provide the disparity information and the first image to the display controller 202, which performs the subsequent calculations. In this process, the graphics processor GPU 201 does not need to render two images. Therefore, this helps reduce the operating load of the graphics processor GPU 201, for example, reducing the bandwidth occupied by the graphics processor GPU 201 and reducing the computation amount of the graphics processor GPU 201.
In another optional implementation manner, the display controller 202 obtains the depth value corresponding to each pixel in the first image from the graphics processor GPU 201 and calculates the disparity value corresponding to each pixel based on the depth value corresponding to that pixel. That is, in this implementation manner, the disparity information is calculated by the display controller 202 rather than obtained directly from the graphics processor GPU 201.
The display controller 202 obtaining the depth value corresponding to each pixel in the first image from the graphics processor GPU 201 can be understood as follows: the graphics processor GPU 201 directly sends the depth value of each pixel to the display controller 202, that is, the display controller 202 directly receives the depth value of each pixel from the graphics processor GPU 201; alternatively, the graphics processor GPU 201 sends the depth value of each pixel to the storage device 203, and the display controller 202 obtains the depth value of each pixel from the storage device 203, which is not specifically limited here.
In this implementation manner, the display controller 202 may determine the disparity value based on the depth value of the first pixel, the focal length, and the baseline length, and the formula used by the display controller 202 to calculate the disparity value is the same as the formula used by the aforementioned graphics processor GPU 201. Specifically, the following formula may be used to determine the disparity value: z = f × b / d (Formula 2), where f is the focal length, b is the baseline length, and d is the depth value of the first pixel. For details, refer to the introduction of the parallax principle corresponding to FIG. 1; details are not repeated here.
In this implementation manner, the graphics processor GPU 201 only needs to render the first image and does not need to calculate the disparity information; the display controller 202 is responsible for the subsequent calculations. Therefore, on the basis of the foregoing implementation manner, the operating load of the graphics processor GPU 201 is further reduced; for example, the bandwidth occupied by the graphics processor GPU 201 is further reduced, and the computation amount of the graphics processor GPU 201 is further reduced.
In this embodiment, regardless of whether the display controller 202 obtains the disparity information from the graphics processor GPU 201 or calculates the disparity information itself, the display controller 202 needs to determine at least a part of the second image based on at least a part of the aforementioned first image and the disparity information. Specifically, the display controller 202 may locate the second pixel in at least a part of the second image based on a first pixel in the first image and the disparity value corresponding to that first pixel, and use the pixel value of the first pixel as the pixel value of the second pixel. Further, the display controller 202 may calculate the difference between the coordinate value of the first pixel and the disparity value corresponding to the first pixel to obtain the coordinate value of the second pixel, and can therefore determine the position of the second pixel in the second image according to that coordinate value. The display controller 202 then assigns the pixel value of the aforementioned first pixel to the second pixel in the second image, so that the pixel value of the second pixel is determined. By analogy, the display controller 202 can process each pixel in the first image according to the foregoing calculation manner, thereby obtaining the positions and pixel values of at least a part of the pixels in the second image. It can be understood that a typical example of a pixel value is a red-green-blue (RGB) value. An RGB pixel value is a two-dimensional image pixel value that represents color. The depth value of a pixel represents the depth information of that pixel. The depth value and the pixel value of a pixel can be used to form three-dimensional pixel information, such as a red-green-blue-depth (RGBD) value. In addition, the depth value (D) may also be stored separately from the aforementioned RGB pixel value, which is not limited in this embodiment.
For ease of understanding, FIG. 3D is used as an example. There is a first pixel in the first image, namely pixel A; to show the content clearly, pixel A is drawn relatively large in the figure, which is only intended to aid understanding of the solution and is not limiting. It is known that the horizontal distance between pixel A and the left border of the first image is x1, that is, the abscissa value of pixel A in the first image is x1. Assuming that the disparity value corresponding to pixel A is z0, it can be known from the aforementioned Formula 1 that the coordinate value of the second pixel, pixel A', in the second image is x2 = x1 - z0. The position of pixel A' in the second image can thus be determined. The display controller 202 then assigns the pixel value of pixel A to pixel A'. By analogy, as shown in FIG. 3E, the display controller 202 can determine at least a part of the second image.
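A minimal sketch of this per-pixel relocation, assuming column coordinates are expressed in whole pixels and the disparity value is already in the same unit (the function names and the rounding to the nearest column are assumptions made for the example):

```python
def locate_second_pixel(x1, disparity):
    """Return the column x2 of pixel A' in the second image for a first pixel at
    column x1 with disparity z0, per Formula 1: x2 = x1 - z0 (rounded to a column)."""
    return int(round(x1 - disparity))

def copy_pixel(first_row, second_row, x1, disparity):
    """Assign the pixel value of the first pixel to the located second pixel,
    skipping positions that fall outside the second image."""
    x2 = locate_second_pixel(x1, disparity)
    if 0 <= x2 < len(second_row):
        second_row[x2] = first_row[x1]
```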
Optionally, the at least a part of the second image is a first part of the second image; as shown in FIG. 3E, the first part of the second image is the part calculated based on the parallax principle as described above. At this point, the pixels of the second part of the second image have not yet been determined. To broaden the range of images that the display can present, the display system 20 is further configured to determine the second part of the second image. Generally, when the second image is the image presented to the user's right eye, the second part of the second image is the part close to the right side of the second image; if the aforementioned second image is the image presented to the user's left eye, the second part of the second image is the part close to the left side of the second image. It should be noted that, in practical applications, the width of the second part of the second image is much smaller than the width of the first part of the second image; in other words, the number of pixels in each row of the second part of the second image is much smaller than the number of pixels in each row of the first part of the second image.
Specifically, the graphics processor GPU 201 in the display system 20 renders the second part of the second image; the display controller 202 then obtains the second part of the second image and sends it to the display, so that the first image, together with the combination of the first part and the second part of the second image, presents a stereoscopic effect on the display. For ease of understanding, FIG. 3F is used as an example: the left display screen of the head-mounted display displays the first image, and the right display screen of the head-mounted display displays the combination of the first part and the second part of the second image. Since the pixels in the first part of the second image are different from the pixels in the second part of the second image, the content presented by the first part of the second image is different from the content presented by the second part of the second image, which helps broaden the range of the second image that the display can present.
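As a hedged sketch of the combination step only (simple row concatenation is one possible way to combine the two parts; the layout and function name are assumptions for the example):

```python
def combine_second_image_row(first_part_row, second_part_row):
    """Form one row of the full second image shown on the right display screen
    (FIG. 3F) by appending the GPU-rendered second part to the right of the
    disparity-derived first part; the second part is typically much narrower."""
    return list(first_part_row) + list(second_part_row)
```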
It should be understood that the display controller 202 may obtain the second part of the second image in multiple ways. In an optional implementation manner, the graphics processor GPU 201 may transmit the second part of the second image directly to the display controller 202, that is, the display controller 202 receives the second part of the second image directly from the graphics processor GPU 201. In another optional implementation manner, the graphics processor GPU 201 transmits the second part of the aforementioned second image to the storage device 203, and the display controller 202 obtains the second part of the second image directly from the storage device 203. This is not specifically limited here.
To facilitate a further understanding of the aforementioned display system 20, the graphics processor GPU 201 and the display controller 202 in the display system 20 are introduced separately below. FIG. 4 is a schematic structural diagram of a graphics processor GPU 40 proposed in an embodiment of the present application. The graphics processor GPU 40 includes a processing module 401 and an interface module 402.
The processing module 401 is configured to obtain a first image through rendering and to obtain disparity information of the first image and a second image. The interface module 402 is configured to provide the first image and the disparity information to a display controller, and the first image and at least a part of the second image are used to present a stereoscopic effect on a display corresponding to the display controller.
The first image is an image of a certain viewing angle; for example, the first image is an image provided to the user's left eye or an image provided to the user's right eye. The second image is an image of another viewing angle; for example, if the aforementioned first image is the image provided to the user's left eye, the second image is the image provided to the user's right eye, and if the aforementioned first image is the image provided to the user's right eye, the second image is the image provided to the user's left eye, which is not specifically limited here. For details, refer to the related descriptions corresponding to the aforementioned FIG. 3A to FIG. 3F. The disparity information between the first image and the second image includes the disparity value between a first pixel in the first image and the corresponding second pixel of that first pixel in the second image. Specifically, the disparity value may be the difference between the coordinate value of the first pixel and the coordinate value of the second pixel corresponding to the first pixel.
In addition, the interface module 402 providing the first image and the disparity information to the display controller can be understood as the interface module 402 sending the first image and the disparity information directly to the display controller, or as the interface module 402 transmitting the first image and the disparity information to a storage device, from which the display controller obtains the first image and the disparity information. This is not specifically limited here.
In this embodiment, the graphics processor GPU 40 only needs to provide the first image and the disparity information to the display controller, which performs the subsequent calculations. In this process, the graphics processor GPU 40 does not need to render two images. Therefore, this helps reduce the operating load of the graphics processor GPU 40, for example, reducing the bandwidth occupied by the graphics processor GPU 40 and reducing the computation amount of the graphics processor GPU 40.
Based on the aforementioned graphics processor GPU 40, in an optional implementation manner, the processing module 401 may determine the disparity information between the aforementioned first image and second image in the following manner. The aforementioned first image includes multiple pixels, and each pixel in the first image corresponds to a depth value, which may be obtained by the processing module 401 when rendering the first image. The aforementioned disparity information includes the disparity value between a first pixel in at least a part of the first image and the second pixel corresponding to that first pixel in the second image, and the disparity value can be understood as the difference between the coordinate value of the aforementioned first pixel and the coordinate value of the aforementioned second pixel.
Specifically, the processing module 401 determines the disparity value based on the depth value of the first pixel, the focal length, and the baseline length. More specifically, the processing module 401 may use the aforementioned Formula 2 (z = f × b / d) to determine the disparity value, where f is the focal length, b is the baseline length, and d is the depth value of the first pixel. For details, refer to the introduction of the parallax principle corresponding to FIG. 1; details are not repeated here.
Based on the aforementioned graphics processor GPU 40, in another optional implementation manner, the processing module 401 is further configured to render a second part of the second image, where the second part of the second image is not included in the first image. This can be understood as follows: when the aforementioned first image and the second part of the second image are presented to the user's left and right eyes respectively, the virtual image formed by the second part of the second image on the user's retina does not overlap with that formed by the first image. The interface module 402 is further configured to provide the second part of the second image to the display controller, and the first image, together with the combination of the first part and the second part of the second image, is used to present a stereoscopic effect on the display corresponding to the display controller.
Specifically, the interface module 402 may send the second part of the second image to the display controller, or the interface module 402 may transmit the second part of the second image to a storage device, from which the display controller obtains it; this is not specifically limited here. In addition, the combination of the first part and the second part of the second image refers to the second image formed by the first part of the second image and the second part of the second image. For details, refer to the related description corresponding to the aforementioned FIG. 3F, which is not repeated here.
Optionally, the graphics processor GPU 40 is further configured to transmit control information to the storage device. Optionally, the control information includes one or more of rendering mode information, view information, and a preset coordinate range. The rendering mode information is used to indicate the rendering mode; for example, 1 indicates the stereo rendering mode based on the parallax principle, and 0 indicates the normal rendering mode. The view information is used to indicate the viewing angle of the rendered image; for example, 1 indicates that the output image is the image presented to the left eye, and 0 indicates that the output image is the image presented to the right eye. The preset coordinate range is used to indicate the range of coordinate values of the first part of the second image. If the aforementioned first image is the image presented to the user's left eye, then when an abscissa value in the second image is greater than the upper limit of the aforementioned preset coordinate range, the graphics processor GPU 40 provides the second part of the second image to the display controller; when an abscissa value in the second image is less than the lower limit of the aforementioned preset coordinate range, the graphics processor GPU 40 provides the first part of the second image to the display controller.
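Purely as an illustration of how such control information might be laid out (the field names, types, and single-bit encodings below are assumptions made for this sketch, not a definition of the actual control word):

```python
from dataclasses import dataclass

@dataclass
class ControlInfo:
    rendering_mode: int   # 1: parallax-based stereo rendering mode, 0: normal rendering mode
    view: int             # 1: output image presented to the left eye, 0: presented to the right eye
    coord_lower: int      # lower limit of the preset abscissa range of the first part of the second image
    coord_upper: int      # upper limit of that preset abscissa range

def part_for_abscissa(ctrl: ControlInfo, x: int) -> str:
    """Apply the two rules stated above (first image presented to the left eye):
    abscissa above the upper limit maps to the second part, below the lower
    limit to the first part; other cases default to the first part here."""
    if x > ctrl.coord_upper:
        return "second part"
    return "first part"
```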
In this embodiment, the graphics processor GPU 40 only needs to render the first image, calculate the disparity information using the depth value corresponding to each pixel in the first image, and provide the disparity information and the first image to the display controller, which performs the subsequent calculations. In this process, the graphics processor GPU 40 does not need to render two images. Therefore, this helps reduce the operating load of the graphics processor GPU 40, for example, reducing the bandwidth occupied by the graphics processor GPU 40 and reducing the computation amount of the graphics processor GPU 40.
FIG. 5 is a schematic structural diagram of a display controller 50 proposed in an embodiment of the present application. The display controller 50 includes a processing module 501 and an interface module 502. The processing module 501 is configured to obtain a first image from a graphics processor GPU, obtain disparity information of the first image and a second image, and determine at least a part of the second image based on at least a part of the first image and the disparity information. The interface module 502 is configured to send the first image and at least a part of the second image to a display, and the first image and at least a part of the second image are used to present a stereoscopic effect on the display.
The first image is an image of a certain viewing angle; for example, the first image is an image provided to the user's left eye or an image provided to the user's right eye. The second image is an image of another viewing angle; for example, if the aforementioned first image is the image provided to the user's left eye, the second image is the image provided to the user's right eye, and if the aforementioned first image is the image provided to the user's right eye, the second image is the image provided to the user's left eye, which is not specifically limited here. For details, refer to the related descriptions corresponding to the aforementioned FIG. 3A to FIG. 3F. The disparity information between the first image and the second image includes the disparity value between a first pixel in the first image and the corresponding second pixel of that first pixel in the second image. Specifically, the disparity value may be the difference between the coordinate value of the first pixel and the coordinate value of the second pixel corresponding to the first pixel.
When the processing module 501 obtains the first image from the graphics processor GPU, the graphics processor GPU may send the first image directly to the display controller 50, that is, the processing module 501 receives the first image directly from the interface module 402 of the aforementioned graphics processor GPU; alternatively, the graphics processor GPU may transmit the first image to the storage device inside the display system, and the display controller 50 obtains the first image from the storage device. This is not specifically limited here.
In this embodiment, the graphics processor GPU in the display system renders only the first image; the display controller 50 determines at least a part of the second image based on at least a part of the aforementioned first image and the disparity information, and transmits the first image and at least a part of the second image to the display. The display can then present a stereoscopic effect by displaying the first image and at least a part of the second image. In such a solution, the display controller 50 transmits two images to the display, while the graphics processor GPU renders only the first image without rendering the second image; the display controller 50 obtains at least a part of the second image from the disparity information and the first image. This reduces the operating load of the graphics processor GPU for rendering images; for example, the computation amount and bandwidth of the graphics processor GPU can be reduced.
Based on the aforementioned display controller 50, the processing module 501 may obtain the disparity information of the first image and the second image in different implementation manners, which are introduced separately below.
In an optional implementation manner, the processing module 501 may obtain the disparity information provided by the graphics processor GPU, that is, the disparity information is calculated by the graphics processor GPU. In this case, if the graphics processor GPU sends the disparity information directly to the display controller 50, the processing module 501 may directly receive the disparity information sent by the graphics processor GPU; if the graphics processor GPU transmits the disparity information to the storage device in the display system, the processing module 501 may obtain the disparity information from the storage device, which is not specifically limited here. Optionally, the display controller 50 includes a line buffer, which is used to store the disparity information obtained by the display controller 50.
In this implementation manner, since both the first image and the disparity information are provided by the graphics processor GPU, the processing module 501 can directly use the aforementioned first image and disparity information for the subsequent calculations.
In another optional implementation manner, the aforementioned disparity information is calculated by the display controller based on parameters provided by the graphics processor GPU, that is, the processing module 501 can calculate the aforementioned disparity information.
Specifically, the processing module 501 needs to obtain depth values and then calculate the aforementioned disparity information based on the depth values. The depth values are the depth values corresponding to the pixels in the first image, which may be obtained by the graphics processor GPU when rendering the first image and provided by the graphics processor GPU to the display controller 50. When the depth values are provided by the graphics processor GPU, the graphics processor GPU may send the depth value of each pixel directly to the display controller 50, that is, the processing module 501 may receive the depth value of each pixel directly from the graphics processor GPU; alternatively, the graphics processor GPU may transmit the depth value of each pixel to the storage device inside the display system, and the processing module 501 obtains the depth value of each pixel from the storage device, which is not specifically limited here.
Specifically, the processing module 501 may determine the disparity value corresponding to each pixel based on the depth value of that pixel; that is, for a certain pixel in the first image, the processing module 501 determines the disparity value of that pixel based on the depth value of that pixel, and by analogy, the processing module 501 can determine the disparity value of each pixel in the first image. In practical applications, the processing module 501 may also select some of the pixels in the first image as needed and determine the disparity value of each of those pixels, which is not specifically limited here. More specifically, if the disparity value of a first pixel in the first image is to be determined, the processing module 501 may determine the disparity value based on the depth value of the first pixel, the focal length, and the baseline length. Specifically, the following formula may be used to determine the disparity value: z = f × b / d (Formula 2), where f is the focal length, b is the baseline length, and d is the depth value of the first pixel. For details, refer to the introduction of the parallax principle corresponding to FIG. 1; details are not repeated here. The processing module 501 can then use the aforementioned first image and the disparity information it has determined for the subsequent calculations. Optionally, the display controller 50 includes a line buffer, which is used to store the calculated disparity information.
Based on either of the foregoing two implementation manners, after the processing module 501 has determined the aforementioned first image and disparity information, the processing module 501 determines at least a part of the second image based on at least a part of the first image and the disparity information. Specifically, the processing module 501 locates the second pixel in at least a part of the second image based on the first pixel and the disparity value, and uses the pixel value of the first pixel as the pixel value of the second pixel.
Specifically, the processing module 501 may calculate the difference between the coordinate value of the first pixel and the disparity value corresponding to the first pixel to obtain the coordinate value of the second pixel, and can therefore determine the position of the second pixel in the second image according to that coordinate value. The processing module 501 then assigns the pixel value of the aforementioned first pixel to the second pixel in the second image, so that the pixel value of the second pixel is determined. By analogy, the processing module 501 can process each pixel in the first image according to the foregoing calculation manner, thereby obtaining the positions and pixel values of at least a part of the pixels in the second image.
It should be understood that each of the modules in FIG. 4 and FIG. 5 above may be implemented using hardware, software, or a combination of software and hardware. For example, the processing module 401 or 501 may be a processor core that runs software program instructions to perform processing; for example, the processing module 401 is one or more GPU computing cores, and the processing module 501 may be one or more low-power computing cores. In that case, the operating power consumption of the processing module 501 is lower than that of a GPU computing core; for example, it is a small central processing unit or a digital signal processor. The software program instructions run by the above processor cores may be stored in the storage device 203 in FIG. 2 or in another memory. The interface module 402 or 502 may be an interface circuit, including but not limited to one supporting various interface protocols, for implementing data transmission. For another example, the processing module 401 or 501 and the interface module 402 or 502 may comprise hardware logic circuits that do not execute software programs; any of the modules may perform processing through logic circuit operations, where the logic circuits include at least one of transistors, logic gates, and arithmetic circuits, and may optionally also include analog circuits or digital-analog hybrid circuits. In a more common implementation, the processing module 501 and the interface module 502 in the display controller 50 are implemented by pure hardware logic circuits to achieve hardware acceleration of the display control or display driving functions and improve the performance of the display system 20.
For another example, the processing module 401 or 501 and the interface module 402 or 502 may be software modules including software program instructions, which are executed by a processor or a processor core to implement the above functions. Therefore, any of the modules involved above may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented using software, it may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions according to the embodiments of the present application are generated in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium. The computer-readable storage medium may be any usable medium accessible to a computer, or a data storage device such as a server or a data center that integrates one or more usable media. The usable medium may be a dynamic random-access memory (for example, a double data rate synchronous dynamic random-access memory, DDR), a static random-access memory, or the like.
For ease of understanding, the first row of pixels of the first image is taken as an example to introduce the process by which the aforementioned processing module 501 determines at least a part of the second image. It should be understood that the following processing assumes a deviation only in the horizontal direction, that is, the parallax includes only horizontal parallax. Specifically, as shown in FIG. 6A, assume that the first row of the first image has only 8 pixels, each corresponding to a coordinate value and a pixel value. For example, the coordinates of pixel A0 in the first row and first column are (0, 0) and its pixel value is a0; the coordinates of pixel A1 in the first row and second column are (1, 0) and its pixel value is a1; and so on, up to pixel A7 in the first row and eighth column with coordinates (7, 0) and pixel value a7. In addition, assuming that the disparity value corresponding to each pixel in the first image is as shown in Table 1 below, the processing module 501 can calculate the difference between the abscissa value of each pixel and its disparity value, and then determine, based on that difference, the abscissa value of the corresponding pixel in the second image. It should be understood that the difference represents the position in the second image to which the pixel value of that pixel is assigned.
The abscissa value of pixel A0 is 0 and the disparity value corresponding to pixel A0 is 1, so the difference between the abscissa value and the disparity value is -1, which is the position of pixel A0 in the second image. Since -1 < 0, the pixel value a0 cannot fall on the display screen corresponding to the second image, that is, it overflows the display screen; therefore, the pixel value a0 is not included in the second image. The abscissa value of pixel A1 is 1 and the disparity value corresponding to pixel A1 is 1, so the difference between the abscissa value and the disparity value is 0, which is the position of pixel A1 in the second image; therefore, as shown in FIG. 6A, the pixel value a1 is assigned to the first row, first column of the second image. Similarly, the difference between the abscissa value of pixel A2 and its corresponding disparity value is 1, which is the position of pixel A2 in the second image, and the difference between the abscissa value of pixel A3 and its corresponding disparity value is also 1, which is the position of pixel A3 in the second image. In this case, the positions of pixel A2 and pixel A3 in the second image coincide, and the processing module 501 may select, from pixel A2 and pixel A3, the pixel with the smaller depth value and display its pixel value. From the aforementioned Formula 2, the depth value and the disparity value are inversely related, and the disparity value of pixel A3 is greater than that of pixel A2; therefore, the processing module 501 keeps the pixel value a3 corresponding to pixel A3, that is, assigns the pixel value a3 to the first row, second column of the second image.
In addition, it should be noted that the disparity value is calculated based on the aforementioned Formula 2 and is not necessarily an integer, so the calculated difference between the abscissa value and the disparity value is not necessarily an integer; in this case, the difference may be rounded to the nearest integer. For example, the abscissa value of pixel A4 is 4 and the disparity value corresponding to pixel A4 is 0.7, so the difference between the abscissa value and the disparity value is 3.3. The processing module 501 rounds this difference to 3, which is the position of pixel A4 in the second image; therefore, as shown in FIG. 6A, the pixel value a4 is assigned to the first row, fourth column of the second image. Similarly, the abscissa value of pixel A5 is 5 and the disparity value corresponding to pixel A5 is 1, so the difference is 4, which is the position of pixel A5 in the second image; therefore, as shown in FIG. 6A, the pixel value a5 is assigned to the first row, fifth column of the second image. In addition, the abscissa value of pixel A6 is 6 and the disparity value corresponding to pixel A6 is 1.3, so the difference is 4.7, which is rounded to 5. The abscissa value of pixel A7 is 7 and the disparity value corresponding to pixel A7 is 2, so the difference is 5. In this case, the positions of pixel A6 and pixel A7 in the second image coincide, and since the disparity value of pixel A7 is greater than that of pixel A6, the processing module 501 keeps the pixel value a7 corresponding to pixel A7, that is, assigns the pixel value a7 to the first row, sixth column of the second image.
Table 1
(Figure PCTCN2020073691-appb-000001: the table referred to above is provided as an image in the original publication and is not reproduced here.)
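The per-row shift described above can be summarized in code. The following C sketch is only illustrative and is not part of the patent: the function name reproject_row, the RGB struct, and the valid mask are assumptions introduced here. It assumes the first image is the left-eye image, so a target column is obtained by subtracting the (rounded) disparity from the source column; when two source pixels land on the same target column, the one with the larger disparity (smaller depth) is kept, matching the handling of pixels A2/A3 and A6/A7 above.

```c
#include <math.h>
#include <stdlib.h>
#include <string.h>

typedef struct { unsigned char r, g, b; } RGB;

/* Shift one row of the first (left-eye) image into one row of the second
 * image using the per-pixel disparity values. */
void reproject_row(const RGB *src, const float *disparity, int width,
                   RGB *dst, unsigned char *valid)
{
    /* Disparity already written at each target column, so that when two
     * source pixels land on the same column the one with the larger
     * disparity (smaller depth, closer to the viewer) is kept. */
    float *written = malloc(sizeof(float) * (size_t)width);
    memset(valid, 0, (size_t)width);

    for (int x = 0; x < width; ++x) {
        /* Target column = source column minus disparity, rounded to the
         * nearest integer (e.g. 4 - 0.7 = 3.3 is taken as 3). */
        int tx = (int)lroundf((float)x - disparity[x]);
        if (tx < 0 || tx >= width)       /* overflows the screen, like pixel A0 */
            continue;
        if (!valid[tx] || disparity[x] > written[tx]) {
            dst[tx] = src[x];            /* a3 overwrites a2, a7 overwrites a6  */
            written[tx] = disparity[x];
            valid[tx] = 1;
        }
    }
    free(written);
}
```

Applied to the example row above (using zero-based column indices), this sketch places a1 in column 0, a3 in column 1, a4 in column 3, a5 in column 4, and a7 in column 5, leaving column 2 as a hole, which corresponds to the missing pixel handled next.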
It should be noted that, in the process of calculating the first part of the second image, some pixel values may be missing. For example, the pixel value at the first row and third column of the second image in FIG. 6A is missing. In this case, the processing module 501 may use an interpolation algorithm or a filtering algorithm to determine the pixel value at the first row and third column of the second image. Assuming that the pixel value at the first row and third column of the second image is calculated to be c1, the first row of the second image is as shown in FIG. 6B.
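A possible realization of that fill-in step is sketched below; it continues the previous sketch and reuses its RGB type and valid mask. The simple average of the nearest valid horizontal neighbours is only one assumption for the interpolation or filtering the controller might actually use, and the function name fill_holes_row is likewise illustrative.

```c
/* Estimate each missing target pixel (e.g. row 1, column 3 in FIG. 6A)
 * from its nearest valid horizontal neighbours in the same row. */
void fill_holes_row(RGB *dst, const unsigned char *valid, int width)
{
    for (int x = 0; x < width; ++x) {
        if (valid[x])
            continue;
        int l = x - 1, r = x + 1;
        while (l >= 0 && !valid[l]) --l;      /* nearest valid pixel on the left  */
        while (r < width && !valid[r]) ++r;   /* nearest valid pixel on the right */
        if (l >= 0 && r < width) {            /* average the two neighbours       */
            dst[x].r = (unsigned char)((dst[l].r + dst[r].r) / 2);
            dst[x].g = (unsigned char)((dst[l].g + dst[r].g) / 2);
            dst[x].b = (unsigned char)((dst[l].b + dst[r].b) / 2);
        } else if (l >= 0) {
            dst[x] = dst[l];                  /* copy from the only available side */
        } else if (r < width) {
            dst[x] = dst[r];
        }
    }
}
```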
Similarly, the processing module 501 may determine, based on each pixel in the first image and according to the foregoing calculation, the pixel value of each pixel in the first part of the second image; that is, the processing module 501 can determine the first part of the second image, as shown in FIG. 6C. It should be understood that how many columns of pixels are retained in the first part of the second image depends on the foregoing disparity values and the abscissa values of the pixels in the first image. FIG. 6C in this embodiment is only an example and does not specifically limit the number of pixels in the first part of the second image.
Optionally, the display controller 50 may further determine a second part of the second image. Specifically, the processing module 501 is further configured to obtain the second part of the second image from the graphics processor GPU, and the interface module 502 is further configured to send the second part of the second image to the display. The second part of the second image is not included in the first image, and the first image combined with the first part and the second part of the second image is used to present a stereoscopic effect on the display.
When the processing module 501 obtains the second part of the second image, the graphics processor GPU may directly transmit the second part of the second image to the display controller, that is, the processing module 501 receives the second part of the second image directly from the graphics processor GPU; alternatively, the graphics processor GPU may transmit the second part of the second image to a storage device inside the display system, and the processing module 501 obtains the second part of the second image directly from the storage device. This is not specifically limited here.
Optionally, the processing module 501 is further configured to obtain control information from the foregoing storage device. Optionally, the control information includes one or more of rendering mode information, view information, and a preset coordinate range. The rendering mode information indicates the rendering mode; for example, 1 indicates a stereoscopic rendering mode based on the parallax principle, and 0 indicates a normal rendering mode. The view information indicates the viewing angle of the rendered image; for example, 1 indicates that the output image is the image presented to the left eye, and 0 indicates that the output image is the image presented to the right eye. The preset coordinate range indicates the range of coordinate values of the first part of the second image. If the foregoing first image is the image presented to the user's left eye, then when an abscissa value in the second image is greater than the upper limit of the preset coordinate range, the second part of the second image obtained by the processing module 501 is located to the right of the first part, and the second part of the second image is transmitted to the display; when an abscissa value in the second image is less than the upper limit of the preset coordinate range, what the processing module 501 obtains is the first part of the second image, and the first part of the second image is transmitted to the display.
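Purely as an illustration, the control information described above could be held in a structure such as the following C sketch; the field names, field widths, and the inclusion of a lower bound for the coordinate range are assumptions made here and are not defined by the patent.

```c
/* Hypothetical layout for the control information read from the storage
 * device; field names and widths are illustrative assumptions. */
typedef struct {
    unsigned char  render_mode; /* 1: parallax-based stereo rendering, 0: normal rendering */
    unsigned char  view;        /* 1: image presented to the left eye, 0: to the right eye */
    unsigned short coord_min;   /* assumed lower bound of the preset abscissa range        */
    unsigned short coord_max;   /* upper limit of the preset abscissa range                */
} display_ctrl_info;

/* Sketch of how the preset range might steer the choice described above,
 * assuming the first image is the left-eye image. */
int use_second_part(const display_ctrl_info *c, int x)
{
    return x > c->coord_max;    /* beyond the upper limit: the pixel belongs to the second part */
}
```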
In this embodiment, the display controller determines at least a part of the second image based on at least a part of the first image and the disparity information, and transmits the first image and the at least a part of the second image to the display. The display can then present a stereoscopic effect by displaying the first image and the at least a part of the second image. In such a solution, the display controller transmits two images to the display, but the GPU only renders the first image and does not need to render the second image; the display controller obtains at least a part of the second image from the disparity information and the first image. This reduces the operating load of the GPU for rendering images and, for example, can reduce the GPU's computation load and bandwidth.
The structure of the display system has been described in detail above. The display method proposed in the embodiments of the present application is described below. As shown in FIG. 7, the display system may perform the following steps:
701. Render the first image. The first image is an image of a certain viewing angle; for example, the first image is an image provided to the user's left eye or an image provided to the user's right eye, which is not specifically limited here. The first image includes a plurality of pixels, and each pixel corresponds to one depth value and one pixel value. Optionally, the pixel value is an RGB pixel value. In addition, for the description of the depth value, reference may be made to the related description in the embodiment corresponding to the foregoing FIG. 2, and details are not repeated here.
702. Obtain disparity information of the first image and a second image. The disparity information between the first image and the second image includes a disparity value between a first pixel in the first image and a second pixel, corresponding to the first pixel, in the second image. Specifically, the disparity value may be the difference between the coordinate value of the first pixel and the coordinate value of the second pixel corresponding to the first pixel.
Specifically, the display system may determine the disparity value based on the depth value of the first pixel. More specifically, the display system may determine the disparity value based on the depth value of the first pixel, the focal length, and the baseline length. The disparity value may be determined using the following formula: z = f × b / d (Formula 2), where z is the disparity value, f is the focal length, b is the baseline length, and d is the depth value of the first pixel. For details, reference may be made to the description of the parallax principle corresponding to the foregoing FIG. 1, and details are not repeated here.
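For reference, Formula 2 can be transcribed directly as a small helper; this sketch adds nothing beyond the formula above, and the function name is an assumption introduced here.

```c
/* Formula 2: z = f * b / d, where f is the focal length, b is the baseline
 * length, and d is the depth value of the first pixel. */
float disparity_from_depth(float f, float b, float d)
{
    /* Depth and disparity are inversely related: a smaller depth value
     * (a closer point) yields a larger disparity value, as used in the
     * occlusion handling described earlier. */
    return f * b / d;
}
```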
703. Determine at least a part of the second image based on at least a part of the first image and the disparity information. In this embodiment, after the display system determines the first image and the disparity information, the display system may determine at least a part of the second image based on at least a part of the first image and the disparity information. Specifically, the display system locates the second pixel in the at least a part of the second image based on the first pixel and the disparity value, and uses the pixel value of the first pixel as the pixel value of the second pixel.
Specifically, the display system may calculate the difference between the coordinate value of the first pixel and the disparity value corresponding to the first pixel to obtain the coordinate value of the second pixel. The display system can therefore determine the position of the second pixel in the second image according to the coordinate value of the second pixel. Then, the display system assigns the pixel value of the first pixel to the second pixel in the second image, so that the pixel value of the second pixel is determined. By analogy, the display system may perform this calculation for each pixel in the first image, and thereby obtain the positions and pixel values of at least a part of the pixels in the second image. For details, reference may be made to the related description in the embodiment corresponding to the foregoing FIG. 5, and details are not repeated here.
704. Send the first image and the at least a part of the second image to a display. The first image and the at least a part of the second image are used to present a stereoscopic effect on the display. For details, reference may be made to the related description in the embodiment corresponding to the foregoing FIG. 2, and details are not repeated here.
705. Render a second part of the second image, and send the second part of the second image to the display. In this embodiment, step 705 is an optional step.
Here, the foregoing at least a part of the second image is a first part of the second image. The second part of the second image is not included in the first image; this can be understood as follows: when the first image and the second part of the second image are presented to the user's left and right eyes respectively, the image formed by the second part of the second image on the user's retina does not overlap the image formed by the first image. In addition, the first image combined with the first part and the second part of the second image is used to present a stereoscopic effect on the display. For details, reference may be made to the related description in the embodiment corresponding to the foregoing FIG. 2, and details are not repeated here.
In this embodiment, the display system renders only the first image, then directly determines at least a part of the second image based on the at least a part of the first image and the disparity information, and transmits the first image and the at least a part of the second image to the display, so that the display presents a stereoscopic effect by displaying the first image and the at least a part of the second image. In such a solution the display shows two images, but the first image is obtained by rendering while the first part of the second image is determined from the disparity information and the first image, which helps reduce the computation load and occupied bandwidth of the display system. In addition, the display system also renders the second part of the second image. Because the second part of the second image is much smaller than the first part of the second image, the computation required to render the second part of the second image is much less than the computation that would be required to render the first part of the second image. Therefore, the range of the image presented to the user by the display can be widened at the cost of only a small amount of computation and bandwidth, which helps reduce the operating load of the GPU in the display system.
It should also be understood that the terms "first", "second", and the various numerals used herein are merely for ease of distinction in the description and are not intended to limit the scope of the embodiments of the present application.
It should be understood that, in the various embodiments of the present application, the magnitudes of the sequence numbers of the foregoing processes do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present application.
A person of ordinary skill in the art may be aware that the various illustrative logical blocks and steps described with reference to the embodiments disclosed herein can be implemented by electronic hardware or by a combination of computer software and electronic hardware. Whether these functions are performed by hardware or software depends on the particular application and design constraints of the technical solution. A skilled person may use different methods to implement the described functions for each particular application, but such implementation should not be considered to be beyond the scope of this application.
In the several embodiments provided in this application, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. For example, the described apparatus embodiments are merely illustrative; for example, the division of units is merely a logical functional division, and there may be other divisions in an actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the mutual couplings or direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, apparatuses, or units, and may be in electrical, mechanical, or other forms.
A person skilled in the art can clearly understand that, for convenience and brevity of description, for the specific working processes of the system, apparatus, and units described above, reference may be made to the corresponding processes in the foregoing method embodiments, and details are not repeated here.
The foregoing embodiments are merely intended to describe the technical solutions of this application, not to limit them. Although this application has been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that they may still modify the technical solutions recorded in the foregoing embodiments or make equivalent replacements to some of the technical features thereof; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of this application.

Claims (18)

  1. 一种显示***,其特征在于,包括:A display system is characterized in that it comprises:
    图形处理器GPU,用于渲染第一图像;Graphics processor GPU, used to render the first image;
    所述显示控制器,用于从所述GPU获取所述第一图像,获取所述第一图像与第二图像的视差信息,基于所述第一图像的至少一部分和所述视差信息确定所述第二图像的至少一部分,并将所述第一图像和所述第二图像的至少一部分发送至显示器,所述第一图像和所述第二图像的至少一部分用于在所述显示器上呈现立体效果。The display controller is configured to obtain the first image from the GPU, obtain the disparity information of the first image and the second image, and determine the disparity information based on at least a part of the first image and the disparity information. And send at least a part of the first image and the second image to a display, and at least a part of the first image and the second image are used to present a stereoscopic image on the display Effect.
  2. 根据权利要求1所述的显示***,其特征在于,所述第一图像包括多个像素,所述多个像素中的每个像素对应一个深度值,所述视差信息包括所述第一图像的至少一部分中的第一像素与所述第二图像中与所述第一像素对应的第二像素之间的视差值;The display system according to claim 1, wherein the first image includes a plurality of pixels, each pixel in the plurality of pixels corresponds to a depth value, and the disparity information includes the A disparity value between a first pixel in at least a part and a second pixel in the second image corresponding to the first pixel;
    所述GPU,还用于基于所述第一像素的深度值确定所述视差值;The GPU is further configured to determine the disparity value based on the depth value of the first pixel;
    所述显示控制器,具体用于从所述GPU获取所述视差值。The display controller is specifically configured to obtain the disparity value from the GPU.
  3. 根据权利要求2所述的显示***,其特征在于,所述GPU,具体用于基于所述第一像素的深度值、焦距以及基线长度确定所述视差值。The display system according to claim 2, wherein the GPU is specifically configured to determine the parallax value based on the depth value, focal length, and baseline length of the first pixel.
  4. 根据权利要求1所述的显示***,其特征在于,所述第一图像包括多个像素,所述多个像素中的每个像素对应一个深度值,所述视差信息包括所述第一图像的至少一部分中的第一像素与所述第二图像中与所述第一像素对应的第二像素之间的视差值;The display system according to claim 1, wherein the first image includes a plurality of pixels, each pixel of the plurality of pixels corresponds to a depth value, and the disparity information includes the information of the first image A disparity value between a first pixel in at least a part and a second pixel in the second image corresponding to the first pixel;
    所述显示控制器,具体用于获取所述第一像素的深度值,并基于所述第一像素的深度值确定所述视差值。The display controller is specifically configured to obtain the depth value of the first pixel, and determine the disparity value based on the depth value of the first pixel.
  5. 根据权利要求4所述的显示***,其特征在于,所述显示控制器,具体用于基于所述第一像素的深度值、焦距以及基线长度确定所述视差值。4. The display system according to claim 4, wherein the display controller is specifically configured to determine the parallax value based on the depth value, focal length, and baseline length of the first pixel.
  6. 根据权利要求2至5中任意一项所述的显示***,其特征在于,所述显示控制器,具体用于基于所述第一像素和所述视差值在所述第二图像的至少一部分中定位所述第二像素,并将所述第一像素的像素值作为所述第二像素的像素值。The display system according to any one of claims 2 to 5, wherein the display controller is specifically configured to display at least a part of the second image based on the first pixel and the disparity value Locate the second pixel in, and use the pixel value of the first pixel as the pixel value of the second pixel.
  7. 根据权利要求1至6中任意一项所述的显示***,其特征在于,所述第二图像的至少一部分为所述第二图像的第一部分;The display system according to any one of claims 1 to 6, wherein at least a part of the second image is a first part of the second image;
    所述GPU,还用于渲染所述第二图像的第二部分,所述第二图像的第二部分不包括在所述第一图像中;The GPU is also used to render the second part of the second image, the second part of the second image is not included in the first image;
    所述显示控制器,还用于获取所述第二图像的第二部分,将所述第二图像的第二部分发送至所述显示器,所述第一图像、和所述第二图像的第一部分与第二部分的结合用于在所述显示器上呈现立体效果。The display controller is further configured to obtain the second part of the second image, and send the second part of the second image to the display, and the first image and the second part of the second image The combination of one part and the second part is used to present a three-dimensional effect on the display.
  8. A graphics processor GPU, comprising:
    a processing module, configured to obtain a first image by rendering and obtain disparity information of the first image and a second image; and
    an interface module, configured to provide the first image and the disparity information to a display controller, wherein the first image and at least a part of the second image are used to present a stereoscopic effect on a display corresponding to the display controller.
  9. The GPU according to claim 8, wherein the first image comprises a plurality of pixels, each pixel in the first image corresponds to a depth value, and the disparity information comprises a disparity value between a first pixel in the at least a part of the first image and a second pixel, corresponding to the first pixel, in the second image; and
    the processing module is specifically configured to determine the disparity value based on the depth value of the first pixel.
  10. The GPU according to claim 9, wherein the processing module is specifically configured to determine the disparity value based on the depth value of the first pixel, a focal length, and a baseline length.
  11. The GPU according to any one of claims 8 to 10, wherein the at least a part of the second image is a first part of the second image;
    the processing module is further configured to render a second part of the second image, wherein the second part of the second image is not included in the first image; and
    the interface module is further configured to provide the second part of the second image to the display controller, wherein the first image combined with the first part and the second part of the second image is used to present a stereoscopic effect on the display corresponding to the display controller.
  12. A display controller, comprising:
    a processing module, configured to obtain a first image from a graphics processor GPU, obtain disparity information of the first image and a second image, and determine at least a part of the second image based on at least a part of the first image and the disparity information; and
    an interface module, configured to send the first image and the at least a part of the second image to a display, wherein the first image and the at least a part of the second image are used to present a stereoscopic effect on the display.
  13. The display controller according to claim 12, wherein the first image comprises a plurality of pixels, each of the plurality of pixels corresponds to a depth value, and the disparity information comprises a disparity value between a first pixel in the at least a part of the first image and a second pixel, corresponding to the first pixel, in the second image; and
    the processing module is specifically configured to obtain the disparity value from the GPU.
  14. The display controller according to claim 12, wherein the first image comprises a plurality of pixels, each of the plurality of pixels corresponds to a depth value, and the disparity information comprises a disparity value between a first pixel in the at least a part of the first image and a second pixel, corresponding to the first pixel, in the second image; and
    the processing module is specifically configured to obtain the depth value of the first pixel and determine the disparity value based on the depth value of the first pixel.
  15. The display controller according to claim 14, wherein the processing module is specifically configured to determine the disparity value based on the depth value of the first pixel, a focal length, and a baseline length.
  16. The display controller according to any one of claims 12 to 15, wherein the processing module is specifically configured to locate the second pixel in the at least a part of the second image based on the first pixel and the disparity value, and use the pixel value of the first pixel as the pixel value of the second pixel.
  17. The display controller according to any one of claims 12 to 16, wherein the at least a part of the second image is a first part of the second image;
    the processing module is further configured to obtain a second part of the second image from the GPU; and
    the interface module is further configured to send the second part of the second image to the display, wherein the second part of the second image is not included in the first image, and the first image combined with the first part and the second part of the second image is used to present a stereoscopic effect on the display.
  18. A display method, comprising:
    rendering a first image;
    obtaining disparity information of the first image and a second image;
    determining at least a part of the second image based on at least a part of the first image and the disparity information; and
    sending the first image and the at least a part of the second image to a display, wherein the first image and the at least a part of the second image are used to present a stereoscopic effect on the display.
PCT/CN2020/073691 2020-01-22 2020-01-22 Display system, graphics processing unit (gpu), display controller, and display method WO2021146978A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2020/073691 WO2021146978A1 (en) 2020-01-22 2020-01-22 Display system, graphics processing unit (gpu), display controller, and display method
CN202080017131.4A CN113490963A (en) 2020-01-22 2020-01-22 Display system, graphic processor GPU, display controller and display method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/073691 WO2021146978A1 (en) 2020-01-22 2020-01-22 Display system, graphics processing unit (gpu), display controller, and display method

Publications (1)

Publication Number Publication Date
WO2021146978A1 true WO2021146978A1 (en) 2021-07-29

Family

ID=76991935

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/073691 WO2021146978A1 (en) 2020-01-22 2020-01-22 Display system, graphics processing unit (gpu), display controller, and display method

Country Status (2)

Country Link
CN (1) CN113490963A (en)
WO (1) WO2021146978A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120176364A1 (en) * 2011-01-06 2012-07-12 International Business Machines Corporation Reuse of static image data from prior image frames to reduce rasterization requirements
US20130027394A1 (en) * 2011-07-25 2013-01-31 Samsung Electronics Co., Ltd. Apparatus and method of multi-view rendering
CN102972038A (en) * 2011-07-01 2013-03-13 松下电器产业株式会社 Image processing apparatus, image processing method, program, and integrated circuit
CN103426163A (en) * 2012-05-24 2013-12-04 索尼公司 System and method for rendering affected pixels
US20140176613A1 (en) * 2011-04-19 2014-06-26 Deluxe 3D Llc Alternate viewpoint rendering
CN106127848A (en) * 2015-05-04 2016-11-16 三星电子株式会社 Viewpoint anaglyph is performed equipment and the method rendered
CN110555874A (en) * 2018-05-31 2019-12-10 华为技术有限公司 Image processing method and device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102240564B1 (en) * 2014-07-29 2021-04-15 삼성전자주식회사 Apparatus and method for rendering image
CN110139092A (en) * 2019-05-20 2019-08-16 中国科学院长春光学精密机械与物理研究所 Three-dimensional display system, image processing method, device, equipment and storage medium


Also Published As

Publication number Publication date
CN113490963A (en) 2021-10-08

Similar Documents

Publication Publication Date Title
US10506223B2 (en) Method, apparatus, and device for realizing virtual stereoscopic scene
US11436787B2 (en) Rendering method, computer product and display apparatus
US10776997B2 (en) Rendering an image from computer graphics using two rendering computing devices
EP3673463A1 (en) Rendering an image from computer graphics using two rendering computing devices
US11335066B2 (en) Apparatus and operating method for displaying augmented reality object
US20130293547A1 (en) Graphics rendering technique for autostereoscopic three dimensional display
US11417060B2 (en) Stereoscopic rendering of virtual 3D objects
US9766458B2 (en) Image generating system, image generating method, and information storage medium
JP2022543729A (en) System and method for foveated rendering
TWI602145B (en) Unpacking method, device and system of packed frame
US6559844B1 (en) Method and apparatus for generating multiple views using a graphics engine
KR20120139054A (en) Apparatus for tranforming image
WO2021146978A1 (en) Display system, graphics processing unit (gpu), display controller, and display method
CN114513646B (en) Method and device for generating panoramic video in three-dimensional virtual scene
TWI503788B (en) Method, device and system for restoring resized depth frame into original depth frame
TWI602144B (en) Method, device and system for packing color frame and original depth frame
US20230179754A1 (en) Electronic device and method for generating stereoscopic light-field data
US20240129448A1 (en) Method and system for converting single-view image to 2.5d view for extended reality (xr) applications
TWI812548B (en) Method and computer device for generating a side-by-side 3d image
US11887228B2 (en) Perspective correct vector graphics with foveated rendering
US20240161381A1 (en) Method and computer device for generating a side-by-side 3d image
US20190313077A1 (en) Virtual reality environment
TW202129599A (en) Methods and apparatus for multiple lens distortion correction
TWI524731B (en) Non-transitory storage medium for storing resized depth frame
TWI512678B (en) Non-transitory storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20915862

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20915862

Country of ref document: EP

Kind code of ref document: A1