WO2015161541A1 - A parallel synchronous scaling engine and method for multi-view naked-eye 3D display - Google Patents

A parallel synchronous scaling engine and method for multi-view naked-eye 3D display

Info

Publication number
WO2015161541A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixel
eye
naked
sub
interpolation
Prior art date
Application number
PCT/CN2014/078731
Other languages
English (en)
French (fr)
Inventor
任鹏举
刘庚
余江
孙宏滨
刘跃虎
郑南宁
Original Assignee
西安交通大学
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 西安交通大学, 任鹏举, 刘庚, 余江, 孙宏滨, 刘跃虎, 郑南宁 filed Critical 西安交通大学
Priority to US14/897,076 priority Critical patent/US9924153B2/en
Publication of WO2015161541A1 publication Critical patent/WO2015161541A1/zh

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/189Recording image signals; Reproducing recorded image signals
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4007Scaling of whole images or parts thereof, e.g. expanding or contracting based on interpolation, e.g. bilinear interpolation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/111Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/139Format conversion, e.g. of frame-rate or size
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/167Synchronising or controlling image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/349Multi-view displays for displaying three or more geometrical viewpoints without viewer tracking
    • H04N13/351Multi-view displays for displaying three or more geometrical viewpoints without viewer tracking for displaying simultaneously

Definitions

  • The invention belongs to the technical field of video image display processing, and in particular relates to a hardware-friendly multi-view parallel synchronous scaling engine and method applied in multi-view naked-eye 3D display technology.

Background Art
  • Stereoscopic display technology can express the depth information and depth-of-field of an image, giving viewers an immersive viewing experience, and therefore has broad market prospects.
  • Almost all commercialized 3D display technologies are based on the principle of binocular stereoscopic vision: the left and right eyes receive field-of-view images from different viewpoints, and because these images differ subtly between viewpoints, the brain fuses them so that the viewer perceives a stereoscopic effect. Compared with traditional glasses-type 3D display, naked-eye 3D display frees the viewer from having to wear 3D glasses to see the stereoscopic effect and thus has a greater market advantage.
  • Naked-eye 3D displays are mainly divided into slit-grating displays, lenticular-lens displays, volumetric displays and holographic displays.
  • Slit-grating and lenticular-lens displays add a layer of directionally blocking or refracting optical medium in front of the display terminal to separate the left-eye and right-eye fields of view.
  • Unless otherwise specified, naked-eye 3D display herein refers to slit-grating and lenticular-lens displays.
  • A 1080P four-view naked-eye 3D display with an integer pixel arrangement is taken as an example to briefly describe naked-eye 3D display processing.
  • Each of the four sub-field images of a 1080P four-view naked-eye 3D source has a resolution of 960×540, arranged in a 2×2 grid.
  • The corresponding display processing is mainly divided into the following steps:
  • 1. The four-sub-field image is split into four 960×540 sub-images (a, b, c, d); 2. each sub-image is separately scaled to the physical resolution of the display terminal (1920×1080), giving the scaled image of each sub-field (A, B, C, D);
  • 3. the sub-pixels at the corresponding positions of A, B, C and D are recomputed and combined to obtain the display pixels of the naked-eye stereoscopic image at those positions; 4. the synthesized naked-eye stereoscopic image is displayed on the terminal.
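The conventional "scale-then-combine" flow above can be sketched in a few lines. The nearest-neighbour scaler, the tiny 2×2/4×4 sizes and the sub-pixel selection map are illustrative assumptions, not the patent's actual algorithm or resolutions:

```python
# Illustrative sketch of the conventional scale-then-combine pipeline for a
# four-view integer pixel arrangement (all sizes and the view-selection map
# are invented for the example).

def upscale_nearest(img, out_h, out_w):
    """Scale one sub-view image (rows of (R, G, B) tuples) to out_h x out_w."""
    in_h, in_w = len(img), len(img[0])
    return [[img[y * in_h // out_h][x * in_w // out_w] for x in range(out_w)]
            for y in range(out_h)]

def combine_views(scaled, view_of):
    """Build the stereo image: the sub-pixel of channel c at (y, x) is copied
    from the view selected by view_of(y, x, c) -- an integer arrangement, so
    each output sub-pixel comes from exactly one view."""
    out_h, out_w = len(scaled[0]), len(scaled[0][0])
    return [[tuple(scaled[view_of(y, x, c)][y][x][c] for c in range(3))
             for x in range(out_w)]
            for y in range(out_h)]

# Four 2x2 sub-views, each filled with a distinct constant colour.
views = [[[(v * 10, v * 10 + 1, v * 10 + 2)] * 2 for _ in range(2)]
         for v in range(4)]
scaled = [upscale_nearest(v, 4, 4) for v in views]           # step 2: N scalings
stereo = combine_views(scaled, lambda y, x, c: (x + c) % 4)  # step 3: interleave
```

Note that every sub-view is scaled to full resolution even though the combiner keeps only one sub-pixel per output position per channel; this is exactly the redundancy the invention later removes.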
  • FIG. 2 illustrates one case of a four-view integer arrangement and represents only one possible pixel layout of a naked-eye 3D composite image.
  • The naked-eye stereoscopic display processing system mainly comprises four parts: an input video decoding module, an N-viewpoint view sequence generation module, a video image frame storage control module, and a naked-eye stereogram generation module.
  • The scaling engine is contained in the naked-eye stereogram generation module; its input is the interpolation pixel window of each sub-field, and its output is the display pixels of the synthesized stereogram.
  • Figure 4 shows the scaling engine in an existing naked-eye 3D display system. It mainly comprises the following steps:
  • 1. Each sub-field's image data is fetched from SDRAM and, according to the relevant scaling algorithm, the interpolation pixel window data needed to interpolate the current pixel of each sub-field is obtained; 2. the interpolation pixel window of each sub-field is scaled and interpolated in parallel by N scaling interpolation modules, yielding N scaled pixel values;
  • 3. the N interpolation results of the N sub-fields are combined by the multi-view naked-eye 3D video image combination calculation module to obtain the synthesized display pixel at the current position. The above steps are repeated until one frame is synthesized, and the synthesized naked-eye 3D image is displayed on the naked-eye stereoscopic display terminal.
  • The video signal (analog or digital) is decoded by the input video decoding module into a video digital signal (i.e. an RGB\YUV\RGBY signal) and corresponding video flag signals.
  • If the decoded multi-view video data were shown directly on the display terminal, the user would not perceive a stereoscopic effect.
  • Multiple sub-field image sequences are therefore obtained by image segmentation or by 2D-to-3D conversion in the N-viewpoint view sequence generation module.
  • The video image frame storage control module then stores the video image data into SDRAM (including DDR SDRAM\DDR2 SDRAM\DDR3 SDRAM).
  • The resolution of each sub-field image is scaled to the same physical resolution as the display terminal (such as 1080P, 4K×2K or 8K×4K), resulting in N images with the same physical resolution as the display terminal.
  • The scaling engine shown in FIG. 4 is contained in the naked-eye stereogram generation module: the sub-field image sequences are read out of SDRAM into on-chip memory, and the interpolation pixel window data required by the scaling interpolation modules is then fetched according to the corresponding scaling algorithm.
  • Each sub-field scaling interpolation module performs point-by-point interpolation in parallel according to the scaling algorithm and the corresponding interpolation pixel window.
  • From the N scaled sub-field image sequences, according to the pixel arrangement required by the grating or lenticular lens of the display terminal, the R\G\B (Y\U\V, R\G\B\Y) sub-pixels at the corresponding positions of each sub-field's scaling result are combined by the multi-view naked-eye 3D video image combination calculation module to obtain the display pixels of the naked-eye 3D image at those positions.
  • The processed naked-eye 3D composite image data is transmitted to the display terminal for naked-eye 3D display. The above process is repeated until the frame ends.
  • In the existing method each sub-field image must be scaled separately, so N fields of view require N independent scaling interpolation modules; yet when the naked-eye stereoscopic image is synthesized, only part of the data of each sub-field's scaling result is used, so the multiple scaling interpolation modules compute a large amount of unused redundant data, wasting considerable hardware computing resources.
  • As the number of viewpoints increases further, this enormous consumption of hardware computing resources will eventually make it difficult for the hardware to meet the design requirements.
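The scale of this redundancy can be estimated with simple counting (a back-of-the-envelope figure for illustration, not a number from the patent): each of the N scalers interpolates a full-resolution frame, but only the equivalent of one frame of sub-pixels survives the combination step:

```python
# Back-of-the-envelope count of interpolation work in the conventional engine
# (illustrative estimate, not a figure from the patent): N per-view scalers
# each interpolate a full frame, the combiner keeps ~1/N of the sub-pixels,
# so the wasted fraction grows as (N - 1) / N.

def interp_counts(n_views, width, height, channels=3):
    per_view = width * height * channels    # sub-pixels each scaler computes
    computed = n_views * per_view           # total work of N scalers
    used = per_view                         # one full stereo frame is kept
    return computed, used, (computed - used) / computed

computed, used, wasted = interp_counts(4, 1920, 1080)  # 75% of the work wasted
```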
  • The existing scaling engine shown in Figure 3 can be replaced by the parallel synchronous scaling engine of the present invention (shown in Figure 5), which optimizes the system's resource utilization while functioning correctly.
  • The parallel synchronous scaling engine of the invention is a key functional module of the naked-eye 3D display system: the input of the scaling engine is a multi-field image sequence and the output is the naked-eye synthesized stereogram. The parallel synchronous scaling engine proposed by the invention greatly reduces the hardware resource consumption of the naked-eye 3D display system.
  • A parallel synchronous scaling engine for multi-view naked-eye 3D display comprises: a multi-way image combination calculation module, configured to synchronously and in parallel extract from the on-chip storage unit of each sub-field the interpolation pixel window data used in the interpolation and scaling process, and to combine and screen the sub-field interpolation pixel window data to obtain the interpolation pixel window data of each sub-pixel of the composite field of view, which is fed to the pixel splicing module; a pixel splicing module, configured to splice the sub-pixel interpolation pixel window data of the composite field of view into a composite-field interpolation pixel window, which is fed to the scaling interpolation module; and a scaling interpolation module, configured to interpolate the composite-field interpolation pixel window data according to the scaling coefficients of the corresponding scaling algorithm to compute one display pixel at the corresponding position.
  • While computing the display pixels of a frame, the scaling interpolation module displays the synthesized display pixels on the naked-eye 3D display terminal; the engine thus processes and displays in real time and is easy to implement in hardware.
  • The video data of each sub-field is stored in the corresponding on-chip memory unit with the control states of all sub-fields kept consistent, and the data of each sub-field is written synchronously into its on-chip memory unit.
  • The naked-eye 3D display terminal is an N-view naked-eye 3D display terminal with an integer or floating-point pixel arrangement;
  • When the naked-eye 3D display terminal uses an integer pixel arrangement, exactly one of the N coefficients $F_x^n$ is one and the rest are zero; when the naked-eye 3D display terminal uses a floating-point pixel arrangement, the N coefficients satisfy $\sum_{n=1}^{N} F_x^n = 1$;
  • $F_x^n$ represents the weighting coefficient of the x sub-pixel at the corresponding position of the n-th sub-field when computing the synthesized x sub-pixel data; $x \in \{R, G, B\}$, $n \in \{1, 2, \ldots, N-1, N\}$.
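The coefficient constraint can be illustrated as follows. Only the one-hot/sum-to-one structure comes from the claim; the concrete weight values and sub-pixel data are invented for the example:

```python
# Sketch of the weighting coefficients F_x^n: integer arrangements use a
# one-hot selection, floating-point arrangements use fractional weights that
# sum to 1. The concrete values below are illustrative assumptions.

def synth_subpixel(F, subpix):
    """Combine the sub-pixel values of the N sub-fields with weights F."""
    assert abs(sum(F) - 1.0) < 1e-9   # both arrangements satisfy sum(F) = 1
    return sum(f * s for f, s in zip(F, subpix))

F_int   = [0, 0, 1, 0]          # integer arrangement: one-hot selection
F_float = [0.1, 0.4, 0.4, 0.1]  # floating-point arrangement: fractional mix
vals    = [100, 120, 140, 160]  # R sub-pixel of each of the 4 sub-fields

r_int   = synth_subpixel(F_int, vals)    # picks the 3rd field's value
r_float = synth_subpixel(F_float, vals)  # weighted blend of all four fields
```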
  • A parallel synchronous scaling method for multi-view naked-eye 3D display includes the following steps:
  • the multi-way image combination calculation module synchronously and in parallel extracts the sub-field interpolation pixel windows used in the interpolation and scaling process from the on-chip storage unit corresponding to each sub-field;
  • the multi-way image combination calculation module combines the sub-field interpolation pixel windows to compute new composite-field sub-pixel interpolation pixel windows;
  • the pixel splicing module splices the composite-field sub-pixel interpolation pixel windows to obtain a composite-field interpolation pixel window;
  • the scaling interpolation module computes a display pixel by performing scaling interpolation on the composite-field interpolation pixel window according to the corresponding scaling algorithm.
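The method steps above can be sketched end to end as follows. The shapes, the per-channel weights and the uniform interpolation coefficients are illustrative assumptions; the point is the single interpolation call operating on one composite window:

```python
# Minimal end-to-end sketch of the claimed method: combine the N sub-field
# windows into one composite window (steps 2-3), then interpolate once
# (step 4). Shapes and weights are invented for the example.

def combine_windows(windows, F):
    """Per-channel weighted combination of the N sub-field windows into one
    composite interpolation window; F[c][n] weights field n for channel c."""
    N = len(windows)
    I, J = len(windows[0]), len(windows[0][0])
    return [[tuple(sum(F[c][n] * windows[n][i][j][c] for n in range(N))
                   for c in range(3))
             for j in range(J)]
            for i in range(I)]

def interpolate(window, P):
    """A single scaling interpolation over the I x J composite window."""
    I, J = len(window), len(window[0])
    return tuple(sum(P[i][j] * window[i][j][c]
                     for i in range(I) for j in range(J))
                 for c in range(3))

N, I, J = 4, 2, 2
windows = [[[(n, n + 1, n + 2)] * J for _ in range(I)]
           for n in range(N)]                     # step 1 stand-in data
F = [[0, 0, 1, 0], [0, 1, 0, 0], [1, 0, 0, 0]]   # integer arrangement per channel
P = [[0.25] * J for _ in range(I)]               # uniform 2x2 interpolation weights
pixel = interpolate(combine_windows(windows, F), P)
```

Contrast with the conventional engine, which would run `interpolate` N times (once per sub-field window) and then combine.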
  • While computing the display pixels of a frame, the scaling interpolation module displays the synthesized display pixels on the naked-eye 3D display terminal; the method thus processes and displays in real time and is easy to implement in hardware.
  • A further improvement of the present invention is that, in step 1), while the video data of each sub-field is stored into the corresponding on-chip memory unit, the control states of the sub-fields are kept consistent, and the data of each sub-field is written synchronously into its on-chip memory unit.
  • The naked-eye 3D display terminal is an N-view naked-eye 3D display terminal with an integer or floating-point pixel arrangement;
  • $F_x^n$ represents the weighting coefficient of the x sub-pixel at the corresponding position of the n-th sub-field when computing the synthesized x sub-pixel data.
  • The parallel synchronous scaling method for multi-view naked-eye 3D display first screens and combines the interpolation pixel windows according to the pixel arrangement requirements of the naked-eye 3D display terminal, discarding the redundant data; it then performs the scaling interpolation calculation and finally directly generates the display pixels of the naked-eye 3D composite image.
  • A parallel synchronous scaling method for multi-view naked-eye 3D display comprises:
  • interpolating the interpolation pixel window data obtained in step 2 through the scaling interpolation module to compute the display pixel at the corresponding position, and displaying the display pixels of the naked-eye stereoscopic composite image on the naked-eye 3D display terminal.
  • Compared with the prior art, the present invention has the following beneficial effects:
  • After obtaining the interpolation pixel window data corresponding to each sub-field image sequence, the present invention does not directly perform scaling interpolation; instead, it selects pixel points from each sub-field's interpolation pixel window data according to the pixel arrangement requirements of the display terminal and performs the corresponding combination calculation to obtain new composite-field interpolation pixel window data, on which the scaling interpolation is then performed.
  • The original process of scaling first and then synthesizing the stereoscopic image is thus changed into combining first and then interpolating, so the scaling of the pixels corresponding to the multiple fields of view proceeds synchronously in parallel and redundant pixel calculations are eliminated; the computational complexity is reduced to 1/N of that of the existing method, effectively saving computing resources and making real-time hardware implementation easy. The method supports any number of viewpoints, multiple interpolation algorithms, and naked-eye 3D displays with various integer and floating-point pixel arrangements, without computing resources growing with the number of viewpoints; the performance advantage of the proposed parallel synchronous scaling engine becomes more pronounced as the number of viewpoints increases.
  • Figure 1 is a schematic diagram of a conventional multi-view naked-eye 3D display process.
  • Figure 2 is a schematic diagram of the light-path selection of an existing multi-view naked-eye 3D display.
  • Figure 3 is a schematic diagram of the structure of a multi-view naked-eye 3D display processing system.
  • Figure 4 is a schematic diagram of the scaling engine of a prior-art multi-view naked-eye 3D display system.
  • Figure 5 is a schematic diagram of the parallel synchronous scaling engine of a multi-view naked-eye 3D display system according to the present invention.
  • The present invention provides a parallel synchronous scaling engine for multi-view naked-eye 3D display, comprising a multi-way image combination calculation module, a pixel splicing module, and a scaling interpolation module.
  • The multi-way image combination calculation module is configured to synchronously and in parallel extract the interpolation pixel window data used in the interpolation and scaling process from the on-chip storage unit of each sub-field, combine and screen the sub-field interpolation pixel window data to obtain the interpolation pixel window data of each sub-pixel, and feed it to the pixel splicing module;
  • the pixel splicing module is configured to splice the sub-pixel interpolation pixel window data of the composite field of view into a composite-field interpolation pixel window and output the result to the scaling interpolation module;
  • the scaling interpolation module is configured to interpolate the composite-field interpolation pixel window data according to the scaling coefficients of the corresponding scaling algorithm to compute the display pixel at the corresponding position, which is displayed on the naked-eye 3D display terminal.
  • The invention provides a parallel synchronous scaling method for multi-view naked-eye 3D display, which comprises the following steps:
  • 1. The image data of each sub-field is written into the corresponding on-chip storage unit; 2. the multi-way image combination calculation module synchronously and in parallel extracts the sub-field interpolation pixel windows used in the interpolation and scaling process from the on-chip storage units corresponding to the sub-fields;
  • 3. according to the physical resolution of the naked-eye 3D display terminal and the naked-eye stereoscopic pixel arrangement requirements, the multi-way image combination calculation module combines the sub-field interpolation pixel windows to compute new composite-field sub-pixel interpolation pixel windows, and the pixel splicing module splices them to obtain a composite-field interpolation pixel window;
  • 4. the scaling interpolation module performs scaling interpolation on the composite-field interpolation pixel window according to the corresponding scaling algorithm to obtain the display pixels of the naked-eye stereoscopic image.
  • The operations on the sub-field data must be performed in parallel, synchronously and in real time. Since the parallel synchronous scaling engine must simultaneously use the data at the same position in each sub-field image, when the sub-field video data is stored in the corresponding on-chip memory units the control states of the sub-fields must be consistent and the data of each sub-field must be written synchronously into its on-chip memory unit, which ensures that the downstream parallel synchronous scaling engine runs correctly and in real time.
  • The interpolation pixel window data of the sub-fields must likewise remain fully consistent and synchronized.
  • A single interpolation window address calculation module is used: every sub-field uses the result of this module and synchronously outputs its interpolation pixel window data to the multi-way image combination calculation module. It should be noted that the interpolation window address calculation module may vary with the interpolation algorithm.
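The patent does not spell out the address calculation, but a typical scaler maps each output coordinate back into the source image, using the integer part to address the window and the fractional part as the interpolation phase. A sketch under that standard assumption:

```python
# Sketch of an interpolation-window address calculation (standard scaler
# mapping, assumed here; the patent does not give its exact formulas): the
# output coordinate is mapped into source space, the integer part addresses
# the window and the fractional part is the interpolation phase.

def window_address(x_out, in_w, out_w, taps=2):
    """Return (first source column of the window, fractional phase) for x_out."""
    src = x_out * in_w / out_w         # output pixel position in source space
    left = min(int(src), in_w - taps)  # clamp so the window stays in the image
    return left, src - int(src)

# Upscaling 960 -> 1920: every other output pixel lands on a source pixel.
addr0 = window_address(0, 960, 1920)
addr1 = window_address(1, 960, 1920)
```

Because all sub-fields share one such module, the same window address is applied to every on-chip memory, which is what keeps the N window reads synchronous.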
  • The multi-way image combination calculation module performs a combination calculation on the scaling data windows of the sub-fields to obtain new composite-field sub-pixel interpolation pixel windows;
  • the composite-field sub-pixel interpolation pixel windows are then spliced to obtain the composite-field interpolation pixel window.
  • For N sub-fields, N interpolation pixel windows are obtained, but in naked-eye 3D display only part of their data is useful.
  • The remaining redundant data is discarded before the interpolation calculation, which avoids computing redundant data and saves computing resources.
  • This makes the engine compatible with naked-eye 3D display terminals using various integer and floating-point pixel arrangements.
  • A four-view naked-eye 3D display is taken as an example to briefly explain the process of discarding redundant data.
  • Suppose the display pixel at a certain coordinate is synthesized from the R sub-pixel at the corresponding position of the third field's scaling result, the G sub-pixel of the second field's scaling result, and the B sub-pixel of the first field's scaling result. During processing, only the R component of the third field's scaling window, the G component of the second field's scaling window, and the B component of the first field's scaling window are retained; these are then recombined by the pixel splicing module into a composite-field interpolation pixel window, which serves as the input of the scaling interpolation module, and the interpolation result is exactly the data required by the display terminal. Interpolation calculations on redundant data are thereby avoided.
  • Here the redundant data consists of the B and G components of the third field's scaling window, the B and R components of the second field's scaling window, the G and R components of the first field's scaling window, and the B, G and R components of the fourth field's scaling window.
  • The scaling interpolation module performs the scaling interpolation calculation according to the corresponding scaling algorithm:
  • the I×J interpolation pixel window is interpolated into one pixel according to the specific scaling interpolation algorithm (bicubic, bilinear, multi-edge detection, etc.).
  • By dynamically adjusting the size of the I×J interpolation window and the configuration of the coefficients P, the present invention can meet the requirements of various interpolation and scaling algorithms.
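As a concrete instance of an I×J window algorithm, a 2×2 bilinear kernel derives its coefficients from the fractional phases; the helper names below are illustrative:

```python
# A concrete bilinear instance (I = J = 2) of the window interpolation
# described above: the coefficients P_ij are derived from the horizontal and
# vertical fractional phases and always sum to 1. Helper names are invented.

def bilinear_weights(fx, fy):
    """2x2 coefficient window for fractional offsets (fx, fy)."""
    return [[(1 - fy) * (1 - fx), (1 - fy) * fx],
            [fy * (1 - fx), fy * fx]]

def interp_window(window, P):
    """Weighted sum of an I x J pixel window with coefficient window P."""
    return sum(P[i][j] * window[i][j]
               for i in range(len(window)) for j in range(len(window[0])))

window = [[10, 20], [30, 40]]   # one channel of a 2x2 composite window
val = interp_window(window, bilinear_weights(0.5, 0.5))  # centre of the window
```

Swapping `bilinear_weights` for a bicubic kernel only changes the window size and the coefficient table, which is why one module can serve several scaling algorithms.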
  • After a display pixel is calculated, it is output to the subsequent video encoding module, which encodes the video data according to the interface and encoding standard of the display screen and completes the normal display of the naked-eye 3D image on the display terminal; through the light-path selection of the directional medium, the user then views a comfortable naked-eye stereoscopic effect.
  • The advantage of the present design is that it improves the existing scaling module and completes a parallel synchronous scaling engine design for multi-view naked-eye 3D.
  • The hardware computing resources of the multi-view naked-eye 3D display system are greatly reduced, and the timing of the critical path is more easily met.
  • The invention is compatible with naked-eye 3D displays of various numbers of viewpoints and does not consume additional computing resources as the number of viewpoints increases, so the advantage of the parallel synchronous scaling of the invention becomes more pronounced as the number of viewpoints grows.
  • The invention can dynamically adjust the scaling window size and the values of the scaling coefficients to meet the requirements of different interpolation scaling algorithms, and can dynamically adjust the display pixel synthesis mode to be compatible with various naked-eye 3D display terminals.
  • The image data of each sub-field is written synchronously into the corresponding on-chip storage unit of the naked-eye stereogram generation module.
  • The input of the naked-eye stereogram generation module is the multi-view image sequence of the original video, and the output is video image data matching the resolution of the display terminal.
  • The invention provides a parallel synchronous scaling engine and method for multi-view naked-eye 3D display; real-time display control of the naked-eye 3D composite image is performed according to the output line/field synchronization signals, and each display pixel value of the output data is computed by interpolating and combining the sub-fields, the result containing sub-pixel components such as RGB\YUV\RGBY.
  • The interpolation algorithm differs slightly between designs and implementations.
  • The basic idea of the scaling interpolation algorithm is as follows: first, according to the position of the pixel to be interpolated, the position of the corresponding interpolation pixel window in the original video is determined, and the data in that window is read out. Convolving this data with the scaling interpolation coefficients of the corresponding algorithm (bilinear, bicubic, multi-edge detection, etc.) yields the interpolated pixel value.
  • The existing naked-eye 3D video processing method requires N independent scaling interpolation modules, whereas in the present invention only one scaling interpolation module is needed to perform the same function.
  • The scaling interpolation is not performed directly; instead, pixel points are selected from the interpolation pixel window data of each sub-field according to the pixel arrangement requirements of the display terminal and the corresponding combination calculations are performed to obtain new interpolation pixel window data, on which the scaling interpolation is then carried out.
  • The original process of scaling first and then synthesizing the stereoscopic image is changed into combining first and then interpolating, so the scaling of the pixels of the multiple fields of view proceeds synchronously in parallel, redundant pixel calculations are eliminated, and the computational complexity becomes 1/N of that of the existing method, effectively saving computing resources.
  • $\bar r_{ij}$, $\bar g_{ij}$, $\bar b_{ij}$ are the computed red, green and blue sub-pixel values at position (i, j) of the composite-field interpolation pixel window, i.e. $\bar r_{ij} = \sum_{n=1}^{N} F_R^n R_{ij}^n$ and likewise for $\bar g_{ij}$ and $\bar b_{ij}$; (i, j) are the coordinates of the current pixel value within the window; the parameter N is the number of sub-fields; n denotes the n-th sub-field; $F_x^n$ is the arrangement weighting coefficient defined above.
  • The N-view naked-eye stereoscopic displays supported by the present invention include both integer and floating-point pixel arrangements.
  • For an integer pixel arrangement, each sub-pixel of a naked-eye stereoscopic display pixel is taken directly from the sub-pixel value of one corresponding sub-field; for a floating-point pixel arrangement, each sub-pixel of a naked-eye stereoscopic display pixel is computed as a weighted combination of the sub-pixels at the corresponding positions of several sub-fields.
  • The interpolation scaling algorithm calculates one pixel from an I×J data window; the red (R) sub-pixel is taken as an example to illustrate the correctness of the method.
  • The invention first performs the screening of the interpolation pixel window, i.e. it first completes the pixel synthesis between the sub-fields within the window and then interpolates the resulting composite window. The interpolation result is:

$$R = \sum_{i=1}^{I}\sum_{j=1}^{J} P_{ij}\,\bar r_{ij} = \sum_{i=1}^{I}\sum_{j=1}^{J} P_{ij}\left(\sum_{n=1}^{N} F_R^n\, R_{ij}^n\right) \tag{3}$$

  • R is the red sub-pixel computed by interpolating the I×J interpolation pixel window; I and J give the window size (I pixels horizontally, J pixels vertically); (i, j) is a coordinate within the window; $P_{ij}$ is the scaling coefficient, i.e. the interpolation weight of the data at (i, j); $\bar r_{ij}$ is the red sub-pixel value at (i, j) of the composite window; $F_R^n$ is the arrangement/combination weighting coefficient of the n-th sub-field's red sub-pixel when computing the R component; $R_{ij}^n$ is the red sub-pixel value at (i, j) of the n-th sub-field before interpolation and combination.
  • The traditional method first interpolates each sub-field and then synthesizes the naked-eye stereoscopic image from the N interpolation results, so its final result is:

$$R = \sum_{n=1}^{N} F_R^n \left(\sum_{i=1}^{I}\sum_{j=1}^{J} P_{ij}\, R_{ij}^n\right)$$

which equals equation (3) by exchanging the order of summation, confirming the correctness of the method.
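The equivalence of the two summation orders can also be checked numerically; the coefficient and pixel values below are arbitrary illustrative choices:

```python
# Numerical check of the equivalence argued above: combining sub-fields first
# and interpolating once gives the same result as interpolating each sub-field
# and then combining, because both are the same bilinear form in P_ij and F_R^n.
# All data values are illustrative.

def invention_order(P, F, fields):
    """R = sum_ij P_ij * (sum_n F_n * R^n_ij): combine, then interpolate once."""
    I, J = len(P), len(P[0])
    comp = [[sum(F[n] * fields[n][i][j] for n in range(len(F)))
             for j in range(J)] for i in range(I)]
    return sum(P[i][j] * comp[i][j] for i in range(I) for j in range(J))

def traditional_order(P, F, fields):
    """R = sum_n F_n * (sum_ij P_ij * R^n_ij): N interpolations, then combine."""
    I, J = len(P), len(P[0])
    return sum(F[n] * sum(P[i][j] * fields[n][i][j]
                          for i in range(I) for j in range(J))
               for n in range(len(F)))

P = [[0.1, 0.2], [0.3, 0.4]]                 # interpolation coefficients P_ij
F = [0.25, 0.25, 0.25, 0.25]                 # floating-point arrangement weights
fields = [[[n + i + j for j in range(2)] for i in range(2)] for n in range(4)]
```

The invention's order performs one weighted sum over the window instead of N, which is where the 1/N reduction in interpolation work comes from.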

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Image Processing (AREA)

Abstract

A parallel synchronous scaling engine and method for multi-view naked-eye 3D display: the interpolation pixel windows are first selected and combined, the resulting composite-field interpolation pixel window is then interpolated, and the result is displayed directly on the display terminal. That is, the existing process of scaling the images first and then synthesizing the naked-eye stereoscopic image is improved into first screening and combining the pixel points and then performing scaling interpolation. This greatly reduces the computational complexity and is easy to implement in real-time hardware. The method is compatible with naked-eye 3D displays of various numbers of viewpoints, multiple interpolation algorithms, and both integer and floating-point pixel arrangements; computing resources do not grow with the number of viewpoints, so the performance advantage of this parallel synchronous scaling engine becomes more pronounced as the number of viewpoints increases.

Description

A parallel synchronous scaling engine and method for multi-view naked-eye 3D display

Technical Field

The present invention belongs to the technical field of video image display processing, and specifically relates to a hardware-friendly multi-view parallel synchronous scaling engine and method applied in multi-view naked-eye 3D display technology.

Background Art

Stereoscopic display technology can express the depth information and depth-of-field of an image, giving viewers an immersive viewing experience, and therefore has broad market prospects. Almost all currently commercialized 3D display technologies are based on the principle of binocular stereoscopic vision of the human eye: the left and right eyes receive field-of-view images from different viewpoints, and since these images differ subtly between viewpoints, the brain fuses them so that the viewer perceives a stereoscopic effect. Compared with traditional glasses-type 3D display, naked-eye 3D display frees the viewer from having to wear 3D glasses to see the stereoscopic effect and thus has a greater market advantage. Current naked-eye 3D displays are mainly divided into slit-grating displays, lenticular-lens displays, volumetric displays and holographic displays. Slit-grating and lenticular-lens displays add a layer of directionally blocking or refracting optical medium in front of the display terminal to separate the left-eye and right-eye fields of view. Unless otherwise specified, naked-eye 3D display in this invention refers to slit-grating displays and lenticular-lens displays.
As shown in Figure 1, a 1080P four-view naked-eye 3D display with an integer pixel arrangement is taken as an example to briefly describe the naked-eye 3D display processing. Each of the four sub-field images of a 1080P four-view naked-eye 3D source has a resolution of 960×540, arranged in a 2×2 grid. The corresponding display processing mainly consists of the following steps:

1. Split the four-sub-field image into four sub-images of resolution 960×540 (a, b, c, d);

2. Interpolate and scale the resolution of each sub-image to the physical resolution of the display terminal (1920×1080), obtaining the scaled image of each sub-field (A, B, C, D);

3. According to the correlation coefficients of the grating or lenticular lens and the weighted combination relationship between viewpoints, recompute and combine the sub-pixels at the corresponding positions of A, B, C and D to obtain the display pixels of the naked-eye stereoscopic image at those positions;

4. Display the synthesized naked-eye stereoscopic image on the terminal.
As shown in Figure 2, through the corresponding light-path selection between the slit grating or lenticular lens and the naked-eye stereoscopic composite image, different field-of-view images are seen at different angles and distances. Since there is a distance of about five centimeters between the user's left and right eyes, when the viewer watches the image from a suitable position, the two eyes receive different field-of-view images, which the brain fuses into a stereoscopic sense of depth. It should be noted that Figure 2 illustrates one case of a four-view integer arrangement and represents only one possible pixel layout of a naked-eye 3D composite image.

The naked-eye stereoscopic display processing system corresponding to the above method is shown in Figure 3 and mainly comprises four parts: an input video decoding module, an N-viewpoint view sequence generation module, a video image frame storage control module, and a naked-eye stereogram generation module. The scaling engine is contained in the naked-eye stereogram generation module; its input is the interpolation pixel window of each sub-field and its output is the display pixels of the synthesized stereogram.
Figure 4 shows the scaling engine in an existing naked-eye 3D display system. It mainly comprises the following steps:

1. Fetch each sub-field's image data from SDRAM (including DDR SDRAM\DDR2 SDRAM\DDR3 SDRAM) and, according to the relevant scaling algorithm, obtain the interpolation pixel window data needed to interpolate the current pixel of each sub-field;

2. According to the scaling coefficients of the relevant scaling algorithm, perform scaling interpolation on the interpolation pixel windows of the sub-fields in parallel through N scaling interpolation modules, obtaining N scaled pixel values;

3. According to the pixel arrangement requirements of the display terminal, combine the N interpolation results of the N sub-fields through the multi-view naked-eye 3D video image combination calculation module to obtain the synthesized display pixel at the current position. The above steps are repeated until one frame is synthesized, and the synthesized naked-eye 3D image is displayed on the naked-eye stereoscopic display terminal.
The steps above are described in detail below with reference to Figs. 3 and 4, taking an N-view autostereoscopic 3D display as an example:
First, the video signal (analog or digital) is decoded by the input video decoding module into a digital video signal (i.e. an RGB/YUV/RGBY signal) together with the corresponding video timing signals.
If the video data obtained by directly decoding a multi-view autostereoscopic 3D source were projected straight onto the display terminal, the viewer would perceive no stereoscopic effect. To display a 3D effect, the N-view image sequence generation module first obtains multiple sub-view image sequences by image splitting, 2D-to-3D conversion or similar means. The video frame storage control module then writes the video image data into SDRAM (including DDR SDRAM, DDR2 SDRAM and DDR3 SDRAM). Once the images are stored, the autostereoscopic image generation module scales every sub-view image to the physical resolution of the display terminal (e.g. 1080p, 4K×2K or 8K×4K), yielding N images at that resolution. In this process the scaling engine (Fig. 4), contained in the autostereoscopic image generation module, runs as follows: the sub-view image sequences are read out of SDRAM and stored in on-chip memories; according to the scaling algorithm, the interpolation pixel window data needed by the scaling interpolation modules are fetched; the per-view scaling interpolation modules interpolate point by point in parallel using the algorithm and the corresponding windows; finally, according to the stereoscopic pixel arrangement required by the terminal's barrier or lenticular lens, the R/G/B (Y/U/V, R/G/B/Y) sub-pixels at corresponding positions of the N scaled sub-view images are combined in the multi-view autostereoscopic 3D image combination module to produce the display pixels of the autostereoscopic 3D image at the corresponding positions.
Finally, according to the interface of the display terminal and the corresponding encoding, the processed autostereoscopic 3D composite image data are sent to the terminal for autostereoscopic display. The process repeats until the frame ends.
The drawbacks of the existing method are as follows:
As the resolution of display terminals keeps increasing, multi-view autostereoscopic 3D sources are moving toward ever more views; the views complement each other, improving the viewing experience and allowing more users to watch simultaneously. In the existing method every sub-view image must be scaled separately, so N views require N independent scaling interpolation modules, while autostereoscopic image synthesis uses only part of each view's scaling result. The multiple scaling interpolation modules therefore compute a large amount of unused redundant data, wasting considerable hardware computing resources. Moreover, as the number of views grows further, the enormous consumption of hardware computing resources will eventually make the design requirements impossible to meet.

Summary of the Invention
The object of the present invention is to provide a parallel synchronous scaling engine and method for multi-view autostereoscopic 3D display that is easy to implement in hardware and saves hardware computing resources effectively. The existing scaling engine shown in Fig. 3 can be replaced by the parallel synchronous scaling engine of the invention (Fig. 5), optimizing the system's resource utilization while preserving correct function. The parallel synchronous scaling engine is the key functional module of the autostereoscopic 3D display system; its input is the multi-view image sequence and its output is the autostereoscopic composite image. The engine proposed by the invention greatly reduces the hardware resource consumption of the autostereoscopic 3D display system.
To achieve the above object, the invention adopts the following technical solutions:
A parallel synchronous scaling engine for multi-view autostereoscopic 3D display, comprising: a multi-channel image combination module, configured to synchronously and in parallel fetch, from the on-chip memory unit corresponding to each sub-view, the interpolation pixel window data used in scaling interpolation, and to combine and screen the per-view window data to obtain the per-sub-pixel interpolation window data of the composite view, which are fed to a pixel stitching module; a pixel stitching module, configured to stitch the per-sub-pixel interpolation window data into the composite-view interpolation pixel window, which is fed to a scaling interpolation module; and a scaling interpolation module, configured to interpolate, according to the scaling coefficients of the corresponding scaling algorithm, the composite-view window data into one display pixel at the corresponding position.
In a further improvement of the invention, while computing the display pixels of a frame, the scaling interpolation module simultaneously displays them on the autostereoscopic 3D display terminal; the engine thus processes and displays in real time and is easy to implement in hardware.
In a further improvement, while the video data of each sub-view are written into the corresponding on-chip memory units, the control states of all sub-views are kept identical, and the sub-view data are written synchronously into the corresponding on-chip memory units.
In a further improvement, the autostereoscopic 3D display terminal is an N-view autostereoscopic 3D display terminal with integer or floating-point pixel arrangement;
when the terminal uses an integer pixel arrangement, exactly one of the N coefficients F_x^n equals one and the rest are zero; when the terminal uses a floating-point pixel arrangement, the N coefficients satisfy Σ_{n=1}^{N} F_x^n = 1;
F_x^n denotes the weighting coefficient of the x sub-pixel at the corresponding position of the n-th sub-view when computing the composite x sub-pixel data, with x ∈ {R, G, B} and n ∈ {1, 2, …, N−1, N}. A parallel synchronous scaling method for multi-view autostereoscopic 3D display comprises the following steps:
1) Write the image data of each sub-view into the corresponding on-chip memory unit;
2) The multi-channel image combination module synchronously and in parallel fetches, from the on-chip memory unit of each sub-view, the sub-view interpolation pixel windows used in scaling interpolation;
3) According to the physical resolution and pixel arrangement requirements of the autostereoscopic 3D display terminal, the multi-channel image combination module combines the per-view interpolation pixel windows into new composite-view per-sub-pixel interpolation windows; the pixel stitching module stitches these into the composite-view interpolation pixel window;
4) The scaling interpolation module performs scaling interpolation on the composite-view interpolation pixel window according to the corresponding scaling algorithm to obtain one display pixel.
In a further improvement of the invention, while computing the display pixels of a frame, the scaling interpolation module simultaneously displays them on the autostereoscopic 3D display terminal; the method thus processes and displays in real time and is easy to implement in hardware.
In a further improvement, in step 1) the control states of all sub-views are kept identical while their video data are written into the corresponding on-chip memory units, and the sub-view data are written synchronously.
In a further improvement, the autostereoscopic 3D display terminal is an N-view autostereoscopic 3D display terminal with integer or floating-point pixel arrangement;
when the terminal uses an integer pixel arrangement, exactly one of the N coefficients F_x^n equals one and the rest are zero; when the terminal uses a floating-point pixel arrangement, the N coefficients satisfy Σ_{n=1}^{N} F_x^n = 1;
F_x^n denotes the weighting coefficient of the x sub-pixel at the corresponding position of the n-th sub-view when computing the composite x sub-pixel data, with x ∈ {R, G, B} and n ∈ {1, 2, …, N−1, N}. A parallel synchronous scaling method for multi-view autostereoscopic 3D display first screens and synthesizes the interpolation pixel windows according to the pixel arrangement requirements of the autostereoscopic 3D display terminal, discards the redundant data, then performs scaling interpolation, and finally generates the display pixels of the autostereoscopic 3D composite image directly.
A parallel synchronous scaling method for multi-view autostereoscopic 3D display, comprising:
1. Fetch the image data of each sub-view from SDRAM and, according to the scaling algorithm, derive the interpolation pixel window data needed to interpolate the current pixel of each sub-view;
2. According to the pixel arrangement and combination requirements of the display terminal, the multi-channel image combination module combines and screens the per-view window data to obtain the per-sub-pixel window data of the composite view; the pixel stitching module then stitches them into the composite-view interpolation pixel window, which serves as the input of the scaling interpolation module;
3. According to the scaling coefficients of the corresponding algorithm, the scaling interpolation module interpolates the window data obtained in step 2 into one display pixel at the corresponding position; the display pixels of the autostereoscopic composite image are shown on the autostereoscopic 3D display terminal.
Repeat steps 1 to 3 until the frame ends.
Compared with the prior art, the invention has the following beneficial effects:
After obtaining the interpolation pixel window data of each sub-view image sequence, the invention does not perform scaling interpolation directly. Instead, according to the pixel arrangement requirements of the display terminal, it selects pixels from the per-view window data and combines them accordingly to obtain new composite-view interpolation pixel window data, and only then performs scaling interpolation. The original order of scaling first and synthesizing the stereoscopic image afterwards is thus changed to synthesizing the image first and interpolating afterwards. This completes the scaling of the corresponding pixels of multiple views synchronously and in parallel, eliminates redundant pixel computations, and reduces the computational complexity to 1/N of the existing method, effectively saving computing resources and easing real-time hardware implementation. The method supports any number of views and multiple interpolation algorithms, is compatible with all integer and floating-point pixel arrangements of autostereoscopic 3D displays, and does not require additional computing resources as the number of views increases; on the contrary, the performance advantage of the proposed parallel synchronous scaling engine becomes more pronounced as the number of views grows.
Brief Description of the Drawings
Fig. 1 is a schematic diagram of an existing multi-view autostereoscopic 3D display processing flow.
Fig. 2 is a display principle diagram of an existing multi-view autostereoscopic 3D display processing system.
Fig. 3 is a structural diagram of a multi-view autostereoscopic 3D display processing system.
Fig. 4 is a schematic diagram of the scaling engine architecture of the existing multi-view autostereoscopic 3D display system.
Fig. 5 is a schematic diagram of the parallel synchronous scaling engine architecture of the multi-view autostereoscopic 3D display system of the invention.
Detailed Description of the Embodiments
The invention is described in further detail below with reference to the drawings. Referring to Fig. 5, the parallel synchronous scaling engine of the invention for multi-view autostereoscopic 3D display comprises a multi-channel image combination module, a pixel stitching module and a scaling interpolation module.
The multi-channel image combination module synchronously and in parallel fetches, from the on-chip memory unit of each sub-view, the interpolation pixel window data used in scaling interpolation, combines and screens the per-view window data to obtain the per-sub-pixel interpolation window data, and feeds them to the pixel stitching module;
the pixel stitching module stitches the per-sub-pixel interpolation window data of the composite view into the composite-view interpolation pixel window and outputs the result to the scaling interpolation module; the scaling interpolation module, according to the scaling coefficients of the corresponding scaling algorithm, interpolates the composite-view window data into one display pixel at the corresponding position and displays it on the autostereoscopic 3D display terminal.
The parallel synchronous scaling method of the invention for multi-view autostereoscopic 3D display comprises the following steps:
1. Write the image data of each sub-view into the corresponding on-chip memory unit. 2. The multi-channel image combination module synchronously and in parallel fetches, from the on-chip memory unit of each sub-view, the sub-view interpolation pixel windows used in scaling interpolation. 3. According to the physical resolution of the autostereoscopic 3D display terminal and the autostereoscopic pixel arrangement requirements, the multi-channel image combination module combines the per-view interpolation pixel windows into new composite-view sub-pixel interpolation windows; the pixel stitching module stitches them into the composite-view interpolation pixel window. 4. The scaling interpolation module performs scaling interpolation on the composite-view interpolation pixel window according to the corresponding scaling algorithm to obtain one display pixel of the autostereoscopic image.
Repeat steps 1 to 4 until the frame ends, and display the composite image on the autostereoscopic 3D display terminal.
1. Writing the image data of each sub-view into the corresponding on-chip memory unit:
In the invention, the data operations of all sub-views must proceed in parallel, synchronously and in real time. Because the parallel synchronous scaling engine simultaneously uses data at the same positions of all sub-view images, the control states of the sub-views must stay identical while their video data are written into the corresponding on-chip memory units, and the sub-view data must be written synchronously; this guarantees that the engine subsequently runs correctly in real time.
2. Synchronously and in parallel fetching, from the on-chip memory units of the sub-views, the interpolation pixel windows needed in scaling interpolation:
In this step 2, the interpolation pixel window data of all sub-views must remain fully consistent and synchronized. The implementation uses a single interpolation-window address calculation module whose result is shared by all sub-views, which output their interpolation pixel window data synchronously to the multi-channel image combination module. Note that the address calculation module differs depending on the interpolation algorithm.
3. According to the physical resolution and pixel arrangement requirements of the display terminal, the multi-channel image combination module combines the per-view scaling data windows into new composite-view per-sub-pixel interpolation windows, and the pixel stitching module stitches them into the composite-view interpolation pixel window:
For an N-view autostereoscopic 3D display system, step 2 yields N interpolation pixel windows. In autostereoscopic 3D display mode only part of the data in these N windows is useful; the remaining redundant data are discarded before interpolation, avoiding redundant computation and saving computing resources. In the invention, data selection and discarding are performed by adjusting the configuration of the parameters F_x^n, making the engine compatible with autostereoscopic 3D display terminals of all integer and floating-point pixel arrangements.
Taking a four-view autostereoscopic 3D display with integer pixel arrangement as an example, the discarding of redundant data is briefly described. Suppose that in the final autostereoscopic composite image the display pixel at a certain coordinate is synthesized from the R sub-pixel at the corresponding position of the third view's scaling result, the G sub-pixel of the second view's scaling result, and the B sub-pixel of the first view's scaling result. Then during processing only the R component of the third view's scaling window, the G component of the second view's window and the B component of the first view's window are retained; the pixel stitching module recombines them into the composite-view interpolation pixel window, which serves as the input of the scaling interpolation module, and the interpolation result is exactly the data required by the display terminal. In this way the interpolation of redundant data is avoided. In this example the redundant data are the B and G components of the third view's window, the B and R components of the second view's window, the G and R components of the first view's window, and the B, G and R components of the fourth view's window at the corresponding positions.
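A minimal sketch of this screening step (hypothetical code; in the patent this is a hardware module, and the function and parameter names are invented for illustration):

```python
import numpy as np

def compose_window(windows, view_for_channel):
    """Build the composite interpolation window from the N per-view windows.

    windows:          list of N arrays of shape (J, I, 3), one per sub-view.
    view_for_channel: 0-based view index supplying (R, G, B); (2, 1, 0)
                      encodes "R from view 3, G from view 2, B from view 1".
    Only three of the 3*N channel planes are ever read; the other 3*N - 3
    planes are the redundant data that are never interpolated.
    """
    composite = np.empty_like(windows[0])
    for channel, n in enumerate(view_for_channel):
        composite[..., channel] = windows[n][..., channel]
    return composite
```

The composite window then feeds a single scaling interpolation module, instead of N modules each consuming a full per-view window.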
4. The scaling interpolation module performs scaling interpolation according to the corresponding scaling algorithm:
According to the scaling interpolation algorithm actually used (bicubic, bilinear, multi-edge detection, etc.), the I×J interpolation pixel window is interpolated into one pixel. In practice, by dynamically adjusting the window size I×J and the configuration of the scaling coefficients, the invention can meet the requirements of various scaling interpolation algorithms. In this way the display pixel at the corresponding address of the display terminal is computed; proceeding point by point, scaling interpolation produces a complete frame. Each computed display pixel is output to the subsequent video encoding module, which encodes the video data according to the interface and coding standard of the display screen, completing the normal display of the autostereoscopic 3D image on the terminal; through the light-path selection of the directional optical medium, the user then perceives a comfortable autostereoscopic effect.
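As one concrete instance of the I×J interpolation step, a bilinear kernel (I = J = 2) reduces the composite window to one display pixel. The fractional-phase parameters and names below are illustrative, not taken from the patent:

```python
import numpy as np

def bilinear_pixel(window, fy, fx):
    """Interpolate a (2, 2, 3) composite window into one display pixel.

    fy, fx in [0, 1) are the vertical/horizontal phase of the output pixel.
    `weights` plays the role of the scaling coefficients P_ij and sums to 1,
    so one call yields the three sub-pixels of one display pixel.
    """
    wy = np.array([1.0 - fy, fy])
    wx = np.array([1.0 - fx, fx])
    weights = np.outer(wy, wx)                    # shape (2, 2), sums to 1
    return (weights[:, :, None] * window).sum(axis=(0, 1))
```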
The advantage of the design lies in improving the existing scaling module to accomplish a parallel synchronous scaling engine for multi-view autostereoscopic 3D display. With the invention, the hardware computing resources of the multi-view autostereoscopic 3D display system are greatly reduced and the timing of the critical paths is easier to meet. The invention is compatible with autostereoscopic 3D displays of any number of views and consumes no additional computing resources as the number of views increases, so the advantage of the parallel synchronous scaling of the invention grows with the number of views. At the same time, the invention can dynamically adjust the scaling window size and the values of the scaling coefficients to meet the needs of different interpolation algorithms, and can dynamically adjust the display-pixel synthesis mode to be compatible with different autostereoscopic 3D display terminals.
The method of the invention is further explained and described below:
First, the image data of each sub-view are written synchronously into the corresponding on-chip memory units of the autostereoscopic image generation module.
The input of the autostereoscopic image generation module is the multi-view image sequence of the original video, and its output is video image data matching the resolution of the display terminal. The parallel synchronous scaling engine and method of the invention control the real-time display of the autostereoscopic 3D composite image according to the output horizontal and vertical sync signals; every display pixel value of the output data is obtained by interpolating and combining the sub-views, and contains sub-pixel components such as RGB/YUV/RGBY. Interpolation algorithms differ slightly across designs and implementations, but the basic idea of scaling interpolation is as follows: first determine, from the position of the pixel to interpolate, the position of the corresponding interpolation pixel window in the original video; then fetch the data in that window and convolve it with the scaling interpolation coefficients of the chosen algorithm (bilinear, bicubic, multi-edge detection, etc.) to obtain the interpolated pixel data.
For an N-view autostereoscopic 3D display system the existing processing method needs N independent scaling interpolation modules, whereas the invention accomplishes the same function with a single scaling interpolation module. Unlike the existing method, after the interpolation pixel window data of the sub-view image sequences are obtained, the invention does not interpolate directly; instead, according to the pixel arrangement requirements of the display terminal, it selects and combines the per-view window data into new interpolation pixel window data, and only then performs scaling interpolation. The original order of scaling first and synthesizing the stereoscopic image afterwards thus becomes synthesizing first and interpolating afterwards, which scales the corresponding pixels of multiple views synchronously in parallel, eliminates redundant pixel computation, and reduces the computational complexity to 1/N of the existing method, effectively saving computing resources.
The composite-view interpolation pixel window of the invention is computed, for each sub-pixel x ∈ {R, G, B}, as:

x_{i,j} = Σ_{n=1}^{N} F_x^n · X_{i,j}^n    (1)

that is:

r_{i,j} = Σ_{n=1}^{N} F_R^n · R_{i,j}^n,  g_{i,j} = Σ_{n=1}^{N} F_G^n · G_{i,j}^n,  b_{i,j} = Σ_{n=1}^{N} F_B^n · B_{i,j}^n    (2)

Parameters in formulas (1) and (2): r_{i,j}, g_{i,j}, b_{i,j} are the computed red, green and blue sub-pixel values at coordinate (i, j) of the composite-view interpolation pixel window; i, j give the coordinate (i, j) of the current pixel value within the window; N is the number of sub-views and n denotes the n-th sub-view; F_x^n (x ∈ {R, G, B}, n ∈ {1, 2, …, N−1, N}) is the weighting coefficient of the x sub-pixel at the corresponding position of the n-th sub-view when computing the composite x sub-pixel data; R_{i,j}^n, G_{i,j}^n, B_{i,j}^n are the R, G, B data at coordinate (i, j) of the n-th sub-view. Note that for an integer pixel arrangement exactly one of the N coefficients F_x^n is one and the rest are zero, while for a floating-point pixel arrangement the N coefficients F_x^n may be combined in many ways subject to Σ_{n=1}^{N} F_x^n = 1; the method is therefore compatible with all kinds of autostereoscopic 3D display terminals.
The N-view autostereoscopic display supported by the invention includes both integer and floating-point pixel arrangements. When synthesizing an autostereoscopic display pixel: in the integer arrangement, each sub-pixel of the display pixel is given by the sub-pixel value of one sub-view at the corresponding position; in the floating-point arrangement, each sub-pixel is computed by jointly combining the sub-pixels of all sub-views at the corresponding position.
Assume the scaling interpolation algorithm computes one pixel from a data window of size I×J. Taking the red (R) sub-pixel as an example, the correctness of the method is shown as follows. The invention first screens the interpolation pixel windows, i.e. it first completes the synthesis between the sub-views within the window, and then interpolates the resulting composite window. The interpolation result is:

R = Σ_{i=1}^{I} Σ_{j=1}^{J} P_{i,j} · ( Σ_{n=1}^{N} F_R^n · E_{i,j}^n )    (3)

Parameters in formula (3): R is the red sub-pixel result obtained by interpolating the I×J interpolation pixel window; I, J give the window size, I pixels horizontally and J pixels vertically; i, j denote the coordinate (i, j) within the window; P_{i,j} is the scaling coefficient, i.e. the interpolation weight of the data at coordinate (i, j); F_R^n is the arrangement-combination weighting coefficient of the red sub-pixel of the n-th sub-view when computing the R component; E_{i,j}^n is the red sub-pixel value at coordinate (i, j) of the n-th sub-view before interpolation and combination.
The traditional method first interpolates each sub-view to obtain the per-view interpolation results, and then synthesizes the autostereoscopic image from them, giving the final result:

R = Σ_{n=1}^{N} F_R^n · ( Σ_{i=1}^{I} Σ_{j=1}^{J} P_{i,j} · E_{i,j}^n )    (4)

Comparing formulas (3) and (4) shows that the invention and the traditional method produce exactly the same result, since the two summations merely exchange order. With the invention, however, the computation of each pixel saves N−1 scaling interpolation passes, which greatly reduces the consumption of hardware computing resources.
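The summation-interchange argument behind formulas (3) and (4) can be checked numerically. The window sizes and random data below are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(0)
N, I, J = 4, 4, 4
P = rng.random((J, I)); P /= P.sum()   # interpolation weights P_ij, sum to 1
F = rng.random(N);      F /= F.sum()   # view weights F_R^n, sum to 1
E = rng.random((N, J, I))              # per-view red sub-pixel windows E_ij^n

# Invention, formula (3): combine the N views first, interpolate once.
r_new = (P * np.einsum('n,nji->ji', F, E)).sum()

# Prior art, formula (4): interpolate each view (N passes), then combine.
r_old = (F * np.einsum('ji,nji->n', P, E)).sum()
```

Both orders give the same red sub-pixel, while the first needs a single interpolation pass instead of N.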

Claims

1. A parallel synchronous scaling engine for multi-view autostereoscopic 3D display, characterized by comprising:
a multi-channel image combination module, configured to synchronously and in parallel fetch, from the on-chip memory unit corresponding to each sub-view, the interpolation pixel window data used in scaling interpolation, and to combine and screen the per-view interpolation pixel window data to obtain the per-sub-pixel interpolation pixel window data of the composite view, the result being output to a pixel stitching module;
a pixel stitching module, configured to stitch the per-sub-pixel interpolation pixel window data of the composite view into the composite-view interpolation pixel window, the result being output to a scaling interpolation module;
a scaling interpolation module, configured to interpolate, according to the scaling coefficients of the corresponding scaling algorithm, the interpolation pixel window data into one display pixel at the corresponding position.
2. The parallel synchronous scaling engine for multi-view autostereoscopic 3D display according to claim 1, characterized in that the scaling interpolation module computes all display pixels of a frame, and the composite image is displayed on an autostereoscopic 3D display terminal.
3. The parallel synchronous scaling engine for multi-view autostereoscopic 3D display according to claim 1, characterized in that, while the video data of each sub-view are written into the corresponding on-chip memory units, the control states of all sub-views are kept identical and the sub-view data are written synchronously into the corresponding on-chip memory units.
4. The parallel synchronous scaling engine for multi-view autostereoscopic 3D display according to claim 2, characterized in that the autostereoscopic 3D display terminal is an N-view autostereoscopic 3D display terminal with integer or floating-point pixel arrangement;
when the terminal uses an integer pixel arrangement, exactly one of the N coefficients F_x^n equals one and the rest are zero; when the terminal uses a floating-point pixel arrangement, the N coefficients satisfy Σ_{n=1}^{N} F_x^n = 1;
F_x^n denotes the weighting coefficient of the x sub-pixel at the corresponding position of the n-th sub-view when computing the composite x sub-pixel data;
x ∈ {R, G, B}, n ∈ {1, 2, …, N−1, N}.
5. A parallel synchronous scaling method for multi-view autostereoscopic 3D display, characterized by comprising the following steps:
1) writing the image data of each sub-view into the corresponding on-chip memory unit;
2) a multi-channel image combination module synchronously and in parallel fetching, from the on-chip memory unit corresponding to each sub-view, the sub-view interpolation pixel windows used in scaling interpolation;
3) according to the physical resolution of the autostereoscopic 3D display terminal and the pixel arrangement requirements of the composite stereoscopic image, the multi-channel image combination module combining the per-view interpolation pixel windows into new composite-view per-sub-pixel interpolation windows, and a pixel stitching module stitching the composite-view per-sub-pixel interpolation windows into the composite-view interpolation pixel window;
4) a scaling interpolation module performing scaling interpolation on the composite-view interpolation pixel window according to the corresponding scaling algorithm to obtain one display pixel.
6. The parallel synchronous scaling method for multi-view autostereoscopic 3D display according to claim 5, characterized in that the scaling interpolation module computes all display pixels of a frame and displays the display pixels of the composite image on the autostereoscopic 3D display terminal in real time.
7. The parallel synchronous scaling method for multi-view autostereoscopic 3D display according to claim 6, characterized in that in step 1), while the video data of each sub-view are written into the corresponding on-chip memory units, the control states of all sub-views are kept identical and the sub-view data are written synchronously into the corresponding on-chip memory units.
8. The parallel synchronous scaling method for multi-view autostereoscopic 3D display according to claim 6, characterized in that the autostereoscopic 3D display terminal is an N-view autostereoscopic 3D display terminal with integer or floating-point pixel arrangement;
when the terminal uses an integer pixel arrangement, exactly one of the N coefficients F_x^n equals one and the rest are zero; when the terminal uses a floating-point pixel arrangement, the N coefficients satisfy Σ_{n=1}^{N} F_x^n = 1;
F_x^n denotes the weighting coefficient of the x sub-pixel at the corresponding position of the n-th sub-view when computing the composite x sub-pixel data;
x ∈ {R, G, B}, n ∈ {1, 2, …, N−1, N}.
9. A parallel synchronous scaling method for multi-view autostereoscopic 3D display, characterized in that the interpolation pixel windows are first screened and synthesized according to the autostereoscopic pixel arrangement requirements of the autostereoscopic 3D display terminal, the remaining redundant data are discarded, scaling interpolation is then performed, and the display pixels of the autostereoscopic 3D composite image are finally generated directly.
PCT/CN2014/078731 2014-04-24 2014-05-29 一种针对多视点裸眼3d显示的并行同步缩放引擎及方法 WO2015161541A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/897,076 US9924153B2 (en) 2014-04-24 2014-05-29 Parallel scaling engine for multi-view 3DTV display and method thereof

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201410167890.5A CN103945208B (zh) 2014-04-24 2014-04-24 一种针对多视点裸眼3d显示的并行同步缩放引擎及方法
CN201410167890.5 2014-04-24

Publications (1)

Publication Number Publication Date
WO2015161541A1 true WO2015161541A1 (zh) 2015-10-29

Family

ID=51192656

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2014/078731 WO2015161541A1 (zh) 2014-04-24 2014-05-29 一种针对多视点裸眼3d显示的并行同步缩放引擎及方法

Country Status (3)

Country Link
US (1) US9924153B2 (zh)
CN (1) CN103945208B (zh)
WO (1) WO2015161541A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108573521A (zh) * 2018-04-12 2018-09-25 东南大学 基于cuda并行计算框架的实时交互式裸眼3d显示方法

Families Citing this family (59)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9396554B2 (en) 2014-12-05 2016-07-19 Symbol Technologies, Llc Apparatus for and method of estimating dimensions of an object associated with a code in automatic response to reading the code
US10352689B2 (en) 2016-01-28 2019-07-16 Symbol Technologies, Llc Methods and systems for high precision locationing with depth values
US10145955B2 (en) 2016-02-04 2018-12-04 Symbol Technologies, Llc Methods and systems for processing point-cloud data with a line scanner
US10721451B2 (en) * 2016-03-23 2020-07-21 Symbol Technologies, Llc Arrangement for, and method of, loading freight into a shipping container
US10776661B2 (en) 2016-08-19 2020-09-15 Symbol Technologies, Llc Methods, systems and apparatus for segmenting and dimensioning objects
WO2018077394A1 (en) 2016-10-26 2018-05-03 Huawei Technologies Co., Ltd. Method and device for depth detection using stereo images
US11042161B2 (en) 2016-11-16 2021-06-22 Symbol Technologies, Llc Navigation control method and apparatus in a mobile automation system
US10451405B2 (en) 2016-11-22 2019-10-22 Symbol Technologies, Llc Dimensioning system for, and method of, dimensioning freight in motion along an unconstrained path in a venue
US10354411B2 (en) 2016-12-20 2019-07-16 Symbol Technologies, Llc Methods, systems and apparatus for segmenting objects
US11449059B2 (en) 2017-05-01 2022-09-20 Symbol Technologies, Llc Obstacle detection for a mobile automation apparatus
US11093896B2 (en) 2017-05-01 2021-08-17 Symbol Technologies, Llc Product status detection system
US10949798B2 (en) 2017-05-01 2021-03-16 Symbol Technologies, Llc Multimodal localization and mapping for a mobile automation apparatus
US10591918B2 (en) 2017-05-01 2020-03-17 Symbol Technologies, Llc Fixed segmented lattice planning for a mobile automation apparatus
US10726273B2 (en) 2017-05-01 2020-07-28 Symbol Technologies, Llc Method and apparatus for shelf feature and object placement detection from shelf images
US11367092B2 (en) 2017-05-01 2022-06-21 Symbol Technologies, Llc Method and apparatus for extracting and processing price text from an image set
US10663590B2 (en) 2017-05-01 2020-05-26 Symbol Technologies, Llc Device and method for merging lidar data
EP3619600A4 (en) 2017-05-01 2020-10-21 Symbol Technologies, LLC METHOD AND APPARATUS FOR OBJECT STATE DETECTION
WO2018201423A1 (en) 2017-05-05 2018-11-08 Symbol Technologies, Llc Method and apparatus for detecting and interpreting price label text
CN107147890B (zh) * 2017-05-11 2018-12-07 西安交通大学 一种兼容不同分辨率和宽长比的多视频缩放模块及并行工作方法
US10572763B2 (en) 2017-09-07 2020-02-25 Symbol Technologies, Llc Method and apparatus for support surface edge detection
US10521914B2 (en) 2017-09-07 2019-12-31 Symbol Technologies, Llc Multi-sensor object recognition system and method
CN107872683B (zh) * 2017-11-21 2021-04-20 广州视源电子科技股份有限公司 一种视频数据处理方法、装置、设备及存储介质
CN108040246B (zh) * 2017-12-21 2019-09-24 四川长虹电器股份有限公司 多视点裸眼3d显示***中降低时钟频率的方法
CN110191331B (zh) * 2018-02-22 2022-01-04 深圳市华胜软件技术有限公司 一种真三维裸眼3d图像合成方法、存储介质及合成装置
US10740911B2 (en) 2018-04-05 2020-08-11 Symbol Technologies, Llc Method, system and apparatus for correcting translucency artifacts in data representing a support structure
US10809078B2 (en) 2018-04-05 2020-10-20 Symbol Technologies, Llc Method, system and apparatus for dynamic path generation
US11327504B2 (en) 2018-04-05 2022-05-10 Symbol Technologies, Llc Method, system and apparatus for mobile automation apparatus localization
US10823572B2 (en) 2018-04-05 2020-11-03 Symbol Technologies, Llc Method, system and apparatus for generating navigational data
US10832436B2 (en) 2018-04-05 2020-11-10 Symbol Technologies, Llc Method, system and apparatus for recovering label positions
CN108932707B (zh) * 2018-08-17 2022-06-07 一艾普有限公司 一种图像处理方法及装置
US11010920B2 (en) 2018-10-05 2021-05-18 Zebra Technologies Corporation Method, system and apparatus for object detection in point clouds
US11506483B2 (en) 2018-10-05 2022-11-22 Zebra Technologies Corporation Method, system and apparatus for support structure depth determination
US11090811B2 (en) 2018-11-13 2021-08-17 Zebra Technologies Corporation Method and apparatus for labeling of support structures
US11003188B2 (en) 2018-11-13 2021-05-11 Zebra Technologies Corporation Method, system and apparatus for obstacle handling in navigational path generation
US11079240B2 (en) 2018-12-07 2021-08-03 Zebra Technologies Corporation Method, system and apparatus for adaptive particle filter localization
US11416000B2 (en) 2018-12-07 2022-08-16 Zebra Technologies Corporation Method and apparatus for navigational ray tracing
US11100303B2 (en) 2018-12-10 2021-08-24 Zebra Technologies Corporation Method, system and apparatus for auxiliary label detection and association
US11015938B2 (en) 2018-12-12 2021-05-25 Zebra Technologies Corporation Method, system and apparatus for navigational assistance
US10731970B2 (en) 2018-12-13 2020-08-04 Zebra Technologies Corporation Method, system and apparatus for support structure detection
CA3028708A1 (en) 2018-12-28 2020-06-28 Zih Corp. Method, system and apparatus for dynamic loop closure in mapping trajectories
WO2020210937A1 (en) * 2019-04-15 2020-10-22 Shanghai New York University Systems and methods for interpolative three-dimensional imaging within the viewing zone of a display
US11341663B2 (en) 2019-06-03 2022-05-24 Zebra Technologies Corporation Method, system and apparatus for detecting support structure obstructions
US11080566B2 (en) 2019-06-03 2021-08-03 Zebra Technologies Corporation Method, system and apparatus for gap detection in support structures with peg regions
US11662739B2 (en) 2019-06-03 2023-05-30 Zebra Technologies Corporation Method, system and apparatus for adaptive ceiling-based localization
US11402846B2 (en) 2019-06-03 2022-08-02 Zebra Technologies Corporation Method, system and apparatus for mitigating data capture light leakage
US11151743B2 (en) 2019-06-03 2021-10-19 Zebra Technologies Corporation Method, system and apparatus for end of aisle detection
US11200677B2 (en) 2019-06-03 2021-12-14 Zebra Technologies Corporation Method, system and apparatus for shelf edge detection
US11960286B2 (en) 2019-06-03 2024-04-16 Zebra Technologies Corporation Method, system and apparatus for dynamic task sequencing
US11507103B2 (en) 2019-12-04 2022-11-22 Zebra Technologies Corporation Method, system and apparatus for localization-based historical obstacle handling
US11107238B2 (en) 2019-12-13 2021-08-31 Zebra Technologies Corporation Method, system and apparatus for detecting item facings
CN113141501A (zh) * 2020-01-20 2021-07-20 北京芯海视界三维科技有限公司 实现3d显示的方法、装置及3d显示***
US11822333B2 (en) 2020-03-30 2023-11-21 Zebra Technologies Corporation Method, system and apparatus for data capture illumination control
US11450024B2 (en) 2020-07-17 2022-09-20 Zebra Technologies Corporation Mixed depth object detection
US11593915B2 (en) 2020-10-21 2023-02-28 Zebra Technologies Corporation Parallax-tolerant panoramic image generation
US11392891B2 (en) 2020-11-03 2022-07-19 Zebra Technologies Corporation Item placement detection and optimization in material handling systems
US11847832B2 (en) 2020-11-11 2023-12-19 Zebra Technologies Corporation Object classification for autonomous navigation systems
CN114967170B (zh) * 2021-02-18 2023-07-18 清华大学 基于柔性裸眼三维显示设备的显示处理方法及其装置
CN112668672A (zh) * 2021-03-16 2021-04-16 深圳市安软科技股份有限公司 基于TensorRT的目标检测模型加速方法及装置
US11954882B2 (en) 2021-06-17 2024-04-09 Zebra Technologies Corporation Feature-based georegistration for mobile computing devices

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120127286A1 (en) * 2010-11-22 2012-05-24 Sony Corporation 3d display device and 3d display method
CN102752620A (zh) * 2012-06-20 2012-10-24 四川长虹电器股份有限公司 3d视频的电视播放方法
CN103581652A (zh) * 2013-11-27 2014-02-12 重庆卓美华视光电有限公司 多视点立体视频数据处理方法及装置

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08126034A (ja) * 1994-10-20 1996-05-17 Canon Inc 立体画像表示装置および方法
CA2380105A1 (en) * 2002-04-09 2003-10-09 Nicholas Routhier Process and system for encoding and playback of stereoscopic video sequences
KR100657275B1 (ko) * 2004-08-26 2006-12-14 삼성전자주식회사 입체 영상 신호 발생 방법 및 이에 적합한 스케일링 방법
JP4440067B2 (ja) * 2004-10-15 2010-03-24 キヤノン株式会社 立体表示のための画像処理プログラム、画像処理装置および立体表示システム
CN102005063B (zh) * 2005-07-05 2012-10-03 三洋电机株式会社 立体图像处理方法和立体图像处理装置及程序以及存储程序的记录介质
JP4643727B2 (ja) * 2009-05-29 2011-03-02 株式会社東芝 画像処理装置及び画像処理方法
US8284237B2 (en) * 2009-09-09 2012-10-09 Nokia Corporation Rendering multiview content in a 3D video system
CN101754038B (zh) * 2009-12-09 2012-05-30 青岛海信网络科技股份有限公司 一种视差提取方法
US8619123B2 (en) * 2010-01-20 2013-12-31 Kabushiki Kaisha Toshiba Video processing apparatus and method for scaling three-dimensional video
KR20120109158A (ko) * 2011-03-28 2012-10-08 삼성디스플레이 주식회사 표시 시스템
KR101322910B1 (ko) * 2011-12-23 2013-10-29 한국과학기술연구원 다수의 관찰자에 적용가능한 동적 시역 확장을 이용한 다시점 3차원 영상표시장치 및 그 방법
JP2014082541A (ja) * 2012-10-12 2014-05-08 National Institute Of Information & Communication Technology 互いに類似した情報を含む複数画像のデータサイズを低減する方法、プログラムおよび装置
KR101944911B1 (ko) * 2012-10-31 2019-02-07 삼성전자주식회사 영상 처리 방법 및 영상 처리 장치
CN103581650B (zh) * 2013-10-21 2015-08-19 四川长虹电器股份有限公司 双目3d视频转多目3d视频的方法

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120127286A1 (en) * 2010-11-22 2012-05-24 Sony Corporation 3d display device and 3d display method
CN102752620A (zh) * 2012-06-20 2012-10-24 四川长虹电器股份有限公司 3d视频的电视播放方法
CN103581652A (zh) * 2013-11-27 2014-02-12 重庆卓美华视光电有限公司 多视点立体视频数据处理方法及装置

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108573521A (zh) * 2018-04-12 2018-09-25 东南大学 基于cuda并行计算框架的实时交互式裸眼3d显示方法
CN108573521B (zh) * 2018-04-12 2022-02-08 东南大学 基于cuda并行计算框架的实时交互式裸眼3d显示方法

Also Published As

Publication number Publication date
CN103945208B (zh) 2015-10-28
US9924153B2 (en) 2018-03-20
US20160156898A1 (en) 2016-06-02
CN103945208A (zh) 2014-07-23

Similar Documents

Publication Publication Date Title
WO2015161541A1 (zh) 一种针对多视点裸眼3d显示的并行同步缩放引擎及方法
US9525858B2 (en) Depth or disparity map upscaling
JP4740135B2 (ja) 3次元画像ディスプレイの画面に3次元画像を描画するシステム及び方法
JP5083052B2 (ja) 立体視画像生成装置、立体視画像生成方法およびプログラム
CN103369337B (zh) 3d显示设备及使用该3d显示设备处理图像的方法
US20110298898A1 (en) Three dimensional image generating system and method accomodating multi-view imaging
CN102685523B (zh) 深度信息产生器、深度信息产生方法及深度调整装置
JP6128442B2 (ja) 立体画像および画像シーケンスのステレオベース拡張のための方法および装置{method and device for stereo base extension of stereoscopic images and image sequences}
CN103945205B (zh) 兼容2d与多视点裸眼3d显示的视频处理装置及方法
CN104506872A (zh) 一种平面视频转立体视频的方法及装置
JP2012120057A (ja) 画像処理装置、および画像処理方法、並びにプログラム
CN103561255B (zh) 一种裸眼立体显示方法
KR20170055930A (ko) 3d 디스플레이 시스템에서 입체 이미지 디스플레이 방법 및 장치
CN110958442A (zh) 用于处理全息图像数据的方法和装置
US20120163700A1 (en) Image processing device and image processing method
US20120050465A1 (en) Image processing apparatus and method using 3D image format
JP5700998B2 (ja) 立体映像表示装置及びその制御方法
KR100372177B1 (ko) 2차원 영상을 3차원 영상으로 변환하는 방법
JP2012134885A (ja) 画像処理装置及び画像処理方法
Salman et al. Overview: 3D Video from capture to Display
CN104796684A (zh) 裸眼3d视频处理方法
KR20130112679A (ko) 3d 디스플레이 장치 및 그 영상 처리 방법
Liu et al. Deinterlacing of depth-image-based three-dimensional video for a depth-image-based rendering system
Liu et al. 3D video rendering adaptation: a survey
TWI410120B (zh) 三維成像系統及方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14890217

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 14897076

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14890217

Country of ref document: EP

Kind code of ref document: A1