WO2017054114A1 - Display system and display method thereof - Google Patents

Display system and display method thereof

Info

Publication number
WO2017054114A1
WO2017054114A1 (PCT/CN2015/090973)
Authority
WO
WIPO (PCT)
Prior art keywords
display
unit
image
solid object
display panel
Prior art date
Application number
PCT/CN2015/090973
Other languages
English (en)
French (fr)
Inventor
那庆林
黄彦
麦浩晃
Original Assignee
神画科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 神画科技(深圳)有限公司 filed Critical 神画科技(深圳)有限公司
Priority to PCT/CN2015/090973 priority Critical patent/WO2017054114A1/zh
Publication of WO2017054114A1 publication Critical patent/WO2017054114A1/zh

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00Stereoscopic photography
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof

Definitions

  • the present invention relates to a display system, and more particularly to a display system and a display method thereof, which can generate a display image that matches an intersecting surface and/or a solid object based on a variable unit provided in the display panel.
  • a typical projection imaging surface is a smooth plane or curved surface, that is, the projection display area of the projector is usually a plane or a circular arc surface, and there is no three-dimensional solid object in the range of the projection display area.
  • liquid crystal displays are usually also smooth flat or curved surfaces, and their display can only be based on a fixed display surface, and cannot be adapted to the application requirements of various plots.
  • the present invention firstly solves the problem that the existing display system can only display an image based on a fixed display surface.
  • the present invention provides a display system including an image processing unit, further including a display panel whose area is greater than or equal to a display area, the display area being formed on the display panel. The display panel contains at least one variable unit and a driving mechanism in one-to-one correspondence with each variable unit; the driving mechanism can drive the variable unit to work in a placed state or a hidden state, and when a variable unit works in the placed state, at least two intersecting faces and/or at least one solid object appear on the display panel. The system further comprises a picture storage unit for storing an application screen corresponding to each variable unit. The image processing unit processes the application screen based on the variable units currently working in the placed state to form a picture pattern-corrected for the at least two intersecting faces and/or at least one solid object, then maps the corrected picture and forms, in the display area, a display image matching the at least two intersecting faces and/or at least one solid object.
  • the surface of the display panel, the at least two intersecting faces, and/or the at least one solid object may be a liquid crystal display or an OLED.
  • the display system of the present invention may be a projection display system, in which case it further includes a projection lens unit, the projection display area corresponding to the projection lens unit being projected onto the display panel. The image processing unit processes the application screen to form a picture pattern-corrected for the at least two intersecting faces and/or at least one solid object, which is then mapped by the projection lens unit to form, in the display area, a display image matching the at least two intersecting faces and/or at least one solid object.
  • the display system of the present invention may further include an information processing unit and a drive control unit. According to the at least two intersecting faces and/or at least one solid object corresponding to the preset application screen currently to be played, the information processing unit sends a corresponding signal to the drive control unit in a wired or wireless manner; the drive control unit then sends a corresponding drive control signal to the drive mechanism, driving at least one of the variable units to work in the placed state or the hidden state.
  • when the variable unit works in the placed state, it forms at least two intersecting faces and/or at least one solid object that are recessed, or at least two intersecting faces and/or at least one solid object that are raised;
  • the solid object is a simple three-dimensional structure comprising a combination of one or more of a cube, a cuboid, a pyramid, a cone, a sphere, a cylinder, or a frustum.
  • At least one of the variable units is a two-stage or multi-stage lifting structure, and/or at least one of the variable units is a laterally movable structure.
  • the display image is a display image with an expected augmented-reality effect.
  • the application screen includes at least one overall image to be displayed on at least two intersecting faces; the overall image either shows a part of the picture on each face, or is displayed continuously across the faces, showing a portion of a continuous motion on each face in turn.
  • when the display system is a projection display system, it further includes an interactive manipulation unit that can perform interactive manipulation based on the projected display image; the interactive manipulation unit cooperates with the information processing unit to realize interactive manipulation based on the projected display image. Interactive manipulation is thus a part of the display system of the present invention.
  • the interactive manipulation unit includes a remote controller based on wireless communication, the remote controller comprising a combination of one or more of a mobile phone, a tablet computer, a game controller, or an air mouse; the remote controller transmits interactive manipulation signals using 2.4 GHz, Bluetooth, or Wi-Fi communication.
  • the interactive manipulation unit includes an external remote controller based on infrared light and an infrared monitoring lens; the external remote controller can form an infrared spot in the projection display area, the infrared monitoring lens captures the infrared spot, the image processing unit displays a corresponding visible icon according to the position information of the infrared spot, and the visible icon is controlled through the external remote controller to realize interactive manipulation.
  • the interactive manipulation unit includes a direct-touch interactive manipulation unit, and one interactive manipulation mode is selected from the dual-monitoring-lens mode, the TOF sensor mode, or the structured-light sensor mode.
  • the surface of the solid object is attached with an optical material layer capable of enhancing the contrast of the projected display image under bright ambient light conditions.
  • the display panel is provided with a bracket for supporting the projection lens unit, and the projection lens unit is mounted on the bracket and located above or on one side of the display panel.
  • the present invention also provides a display method that uses a display panel whose area is greater than or equal to a display area, the display area being formed on the display panel; at least one variable unit that can work in a placed state or a hidden state is provided inside the display panel, and when the variable unit works in the placed state, the display panel presents at least two intersecting faces and/or at least one solid object. The method comprises the following steps:
  • an application screen corresponding to the variable unit is preset; the application screen is processed based on the variable units currently working in the placed state to form a pattern-corrected picture, which is then mapped to form, in the display area, a display image matching the at least two intersecting faces and/or at least one solid object.
  • with this solution, variable units are provided inside a liquid crystal display or an OLED, or inside a projection display panel; a display image matching the intersecting faces and/or solid objects can then be generated based on the variable units built into the display panel.
  • FIG. 1 is a schematic diagram of a projection display system and an application scenario thereof in an embodiment of the present invention
  • Figure 2 is a schematic view showing a variable unit forming a concave structure on the basis of Figure 1;
  • Figure 3 is a schematic view showing two variable units rising one layer and two layers respectively on the basis of Figure 1;
  • Figure 4 is a schematic view showing the formation of a stepped structure on the basis of Figure 1;
  • Figure 5 is a schematic block diagram of a projection display system in accordance with a preferred embodiment of the present invention.
  • the projection display system includes a projection lens unit, an image processing unit, and a picture storage unit.
  • the picture storage unit is configured to store a preset projection application screen.
  • the system further includes a projection display panel whose area is greater than or equal to the projection display area, the projection display area corresponding to the projection lens unit being projected onto the display panel. At least one variable unit and driving mechanisms in one-to-one correspondence with the variable units are disposed in the projection display panel; the driving mechanism can drive the variable unit to work in a placed state or a hidden state, and when a variable unit works in the placed state, the display panel presents at least two intersecting faces and/or at least one solid object.
  • the application screen corresponding to each variable unit is preset; the image processing unit then processes the application screen based on the variable units currently working in the placed state to form a picture pattern-corrected for the at least two intersecting surfaces and/or at least one solid object, which is mapped by the projection lens unit to form, in the projection display area, a projection display image matching the at least two intersecting surfaces and/or the at least one solid object.
  • the system further includes an information processing unit and a drive control unit; in operation, according to the at least two intersecting surfaces and/or at least one solid object corresponding to the preset application screen currently to be played, the information processing unit sends a corresponding signal to the drive control unit in a wired or wireless manner; the drive control unit then sends a corresponding drive control signal to the drive mechanism, driving at least one variable unit to work in the placed state or the hidden state.
  • in other words, the projection display system of the present invention can automatically control the corresponding variable units according to the projection display image to be played, so that the corresponding solid objects are correctly placed at the corresponding positions on the display panel; the solid objects can be switched automatically as needed, forming a variety of combined display modes rather than a single mode.
  • an interactive manipulation unit that can perform interactive manipulation based on the projected display image is also included, and the interactive manipulation unit cooperates with the information processing unit to implement interactive manipulation based on the projected display image.
  • the interactive manipulation unit may be a remote controller based on wireless communication, such as a combination of one or more of a mobile phone, a tablet computer, a game controller, or an air mouse; the remote controller may transmit interactive manipulation signals using 2.4 GHz, Bluetooth, or Wi-Fi communication.
  • the interactive manipulation unit may also be an external remote controller based on infrared light plus an infrared monitoring lens; in operation, the external remote controller forms an infrared spot in the projection display area, the infrared monitoring lens captures the infrared spot, the image processing unit displays a corresponding visible icon according to the position information of the infrared spot, and the visible icon is controlled through the external remote controller for interactive manipulation.
  • the interactive manipulation unit may also be a direct-touch interactive manipulation unit, with one interactive manipulation mode selected from the dual-monitoring-lens mode, the TOF sensor mode, or the structured-light sensor mode.
  • FIG. 1 is a schematic diagram of a projection display system and an application scenario thereof according to an embodiment of the present invention.
  • the projector 100 is mounted on a bracket 140 above the display panel 108; two monitoring lenses 104 and 106 are disposed on both sides of the projection lens unit 102, which can project the preset projection application screen into the projection display area.
  • there is a display panel 108 whose area is greater than or equal to the area of the projection display area, and the projection display area projected by the projector 100 is formed on this panel; the display panel 108 comprises a number of grid cells, each grid cell being a variable unit 110.
  • when a variable unit works in the placed state, it may form at least two intersecting faces and/or at least one solid object that are recessed, or at least two intersecting faces and/or at least one solid object that are raised; the solid object is a simple three-dimensional structure, including a combination of one or more of a cube, a cuboid, a pyramid, a cone, a sphere, a cylinder, or a frustum.
  • the variable unit may be a two-stage or multi-stage lifting structure or a laterally movable structure.
  • an optical material layer capable of enhancing the contrast of the projected display image under bright ambient light conditions may be attached to the surface of the solid object.
  • a drive mechanism (not shown) is further provided for each of the variable units in Fig. 1 for driving the variable unit to operate in a placed state or a hidden state.
  • a direct-touch interactive manipulation unit, specifically the dual-monitoring-lens mode, is used: two monitoring lenses 104 and 106 are disposed on both sides of the projection lens unit 102 in the projector 100.
  • An infrared light source (not shown) is also integrated that emits infrared light that covers the entire projected display area.
  • the infrared light source can also be mounted outside the projection lens to directly emit infrared light and cover the entire projection display area.
  • the two infrared monitoring lenses must first be calibrated and corrected; an image disparity map is then acquired, spatial reconstruction is achieved through image tracking, image segmentation, and image recognition, and the three-dimensional position of the finger is computed algorithmically, together with the three-dimensional relief of the projected scene.
  • the two monitoring lenses 104 and 106 can monitor the spatial position information of the finger, and can further determine the currently performed touch operation by monitoring the motion track of the finger and the spatial coordinate information. For example, clicking, sliding, etc. are realized at different heights of the space.
  • the direct touch interactive control unit in this embodiment may also be a TOF (Time of Flight) sensor mode or a structured light sensor mode.
  • the basic principle of the TOF sensor is to receive the light returned from the object by the sensor, and to obtain the target distance by detecting the flight (round trip) time of the light pulse.
  • the TOF sensor is similar to the general machine vision imaging process. It consists of several units, such as a light source, an optical component, a sensor, a control circuit, and a processing circuit.
  • the TOF sensor obtains the target distance by detecting the emitted light and its reflection; by reconstructing the stereoscopic information of the projection space with the TOF sensor, the three-dimensional position of the finger can be computed, and a touch in space thereby determined.
  • the structured-light sensor works by encoding the measurement space with continuous (near-infrared) light, reading the encoded light through a sensor, and decoding it by chip computation to produce a depth image.
  • unlike a conventional structured-light source, the light source projects not a periodically varying two-dimensional image code but a "volume code" with three-dimensional depth.
  • this kind of light source is called laser speckle: random diffraction spots formed when a laser strikes a rough object or passes through frosted glass. These speckles are highly random and change pattern with distance; any two places in space show different speckle patterns, which effectively marks the whole space, so when any object enters the space and moves, its position can be recorded exactly.
  • FIG. 2 shows a variation of FIG. 1 in which one variable unit has formed a recessed structure 112.
  • after the recess forms, three visible projection surfaces appear at the current viewing angle: the bottom surface 114 and the two side surfaces 116, 118. Pattern correction of the corresponding application screen can be performed based on these three surfaces and the surface of the display panel.
  • in this embodiment, pattern correction forms a door on the side surface 116, while projection onto the side surface 118 and the bottom surface 114 forms a virtual ladder; the door on side 116 and the ladder on side 118 and bottom surface 114 are preset application screens.
  • when the preset application plays to this point, the information processing unit sends a corresponding signal to the drive control unit in a wired or wireless manner; the drive control unit sends a corresponding drive control signal to the drive mechanism, which drives the bottom surface 114 to sink from its original position to the position shown, producing the desired image of a door and a ladder at the recess.
  • FIG. 3 shows another variation of FIG. 1, in which the bracket 140 has moved to the upper-right corner of the display panel, one variable unit 128 in the lower-right part has risen by one level, and another variable unit 120 in the upper-left part has risen by two levels.
  • for the variable unit 128 raised by one level, two visible projection surfaces form at the current viewing angle: the upper surface 130 and the front surface 132; the right side surface, although visible, can no longer serve as a projection surface given the position of the projector 100.
  • for the variable unit 120 raised by two levels, three visible projection surfaces form at the current viewing angle: the upper surface 122, the front surface 124, and the right side surface 126. As in FIG. 2, the system can preset a projection application screen, and the information processing unit drives the variable units 120 and 128 to this position according to the content to be played, so that mapping turns the variable units into a desired image, such as a building or a castle.
  • FIG. 4 shows another variation of FIG. 1, in which some variable units remain unchanged, some rise by one level, and some rise by two levels, forming a stepped display panel.
  • the foregoing description is based on a projection display system; those skilled in the art will understand that the solution of the present invention can also be applied to a liquid crystal display or an OLED, in which case the aforementioned projector and related components are not needed. Instead, the display panel and the visible surfaces of each variable unit are all made of liquid crystal display or OLED, so that the corresponding images can be displayed directly, achieving the same effect.
  • the display image thereof is preferably a display image with an expected augmented-reality effect, and at least one overall image in the application screen needs to be displayed on at least two intersecting surfaces.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The present invention discloses a display system and method comprising a display panel in which at least one variable unit and driving mechanisms in one-to-one correspondence with the variable units are provided. When a variable unit works in the placed state, the display panel presents at least two intersecting faces and/or at least one solid object. An image processing unit processes an application picture based on the variable units currently working in the placed state to form a pattern-corrected picture, then maps the corrected picture and forms, in the display area, a display image matching the at least two intersecting faces and/or at least one solid object.

Description

Display System and Display Method Thereof
Technical Field
The present invention relates to display systems, and more particularly to a display system and a display method thereof that can generate, based on variable units provided inside a display panel, a display image matching intersecting faces and/or solid objects.
Background Art
A typical projection imaging surface is a smooth plane or curved surface; that is, the projection display area of a projector is usually a plane or a circular-arc surface, and no three-dimensional solid object lies within the projection display area.
As projection technology applications have expanded, products with stereoscopic scene projection have come to market; for example, in some theater stage settings or light shows, such projectors project virtual scenes that conform to the shape of the projected objects. However, such virtual scenes are usually based only on a fixed projection display surface and cannot adapt to the application needs of varied plots, such as the requirements of games, let alone offer interactive manipulation.
Likewise, existing liquid crystal displays are usually smooth flat or curved surfaces; their display can only be based on a fixed display surface and cannot adapt to the application needs of varied plots.
Summary of the Invention
In view of the above defects of the prior art, the present invention first solves the problem that existing display systems can only display images on a fixed display surface.
To solve this technical problem, the present invention provides a display system including an image processing unit and further including a display panel whose area is greater than or equal to a display area, the display area being formed on the display panel. The display panel contains at least one variable unit and driving mechanisms in one-to-one correspondence with the variable units; the driving mechanism can drive the variable unit to work in a placed state or a hidden state; when a variable unit works in the placed state, the display panel presents at least two intersecting faces and/or at least one solid object. The system further includes a picture storage unit for storing application pictures corresponding to the respective variable units. The image processing unit processes the application picture based on the variable units currently working in the placed state to form a picture pattern-corrected for the at least two intersecting faces and/or at least one solid object, then maps the corrected picture and forms, in the display area, a display image matching the at least two intersecting faces and/or at least one solid object.
In the display system of the present invention, the surfaces of the display panel, the at least two intersecting faces, and/or the at least one solid object may be liquid crystal displays or OLEDs.
The display system of the present invention may be a projection display system, in which case it further includes a projection lens unit, the projection display area corresponding to the projection lens unit being projected onto the display panel. The image processing unit processes the application picture to form a picture pattern-corrected for the at least two intersecting faces and/or at least one solid object, which is then mapped by the projection lens unit to form, in the display area, a display image matching the at least two intersecting faces and/or at least one solid object.
The display system of the present invention may further include an information processing unit and a drive control unit. According to the at least two intersecting faces and/or at least one solid object corresponding to the preset application picture currently to be played, the information processing unit sends a corresponding signal to the drive control unit in a wired or wireless manner; the drive control unit then sends a corresponding drive control signal to the drive mechanism, driving at least one variable unit to work in the placed state or the hidden state.
In the display system of the present invention, when a variable unit works in the placed state, it forms at least two intersecting faces and/or at least one solid object that are recessed, or at least two intersecting faces and/or at least one solid object that are raised. The solid object is a simple three-dimensional structure, including a combination of one or more of a cube, a cuboid, a pyramid, a cone, a sphere, a cylinder, or a frustum.
In the display system of the present invention, at least one variable unit is a two-level or multi-level lifting structure, and/or at least one variable unit is a laterally movable structure.
In the display system of the present invention, the display image is a display image with an expected augmented-reality effect; the application picture includes at least one overall image to be displayed on at least two intersecting faces. The overall image either shows a part of the picture on each face, or is displayed continuously across the faces, showing a portion of a continuous motion on each face in turn.
When the display system of the present invention is a projection display system, it further includes an interactive manipulation unit capable of interactive manipulation based on the projected display image; the interactive manipulation unit cooperates with the information processing unit to realize interactive manipulation based on the projected display image.
In the projection display system of the present invention, the interactive manipulation unit includes a remote controller based on wireless communication, comprising a combination of one or more of a mobile phone, a tablet computer, a game controller, or an air mouse; the remote controller transmits interactive manipulation signals using 2.4 GHz, Bluetooth, or Wi-Fi communication.
In the projection display system of the present invention, the interactive manipulation unit includes an external remote controller based on infrared light and an infrared monitoring lens. The external remote controller can form an infrared spot within the projection display area; the infrared monitoring lens captures the infrared spot, the image processing unit displays a corresponding visible icon according to the position information of the infrared spot, and the visible icon is controlled through the external remote controller to realize interactive manipulation.
In the projection display system of the present invention, the interactive manipulation unit includes a direct-touch interactive manipulation unit, selecting one interactive manipulation mode from a dual-monitoring-lens mode, a TOF sensor mode, or a structured-light sensor mode.
In the projection display system of the present invention, an optical material layer that can enhance the contrast of the projected display image under bright ambient light is attached to the surface of the solid object.
In the projection display system of the present invention, a bracket supporting the projection lens unit is provided on the display panel; the projection lens unit is mounted on the bracket above or to one side of the display panel.
The present invention also provides a display method using a display panel whose area is greater than or equal to a display area, the display area being formed on the display panel. At least one variable unit that can work in a placed state or a hidden state is provided inside the display panel; when a variable unit works in the placed state, the display panel presents at least two intersecting faces and/or at least one solid object. The method includes the following steps:
presetting an application picture corresponding to the variable unit;
processing the application picture based on the variable units currently working in the placed state to form a picture pattern-corrected for the at least two intersecting faces and/or at least one solid object, then mapping the corrected picture and forming, in the display area, a display image matching the at least two intersecting faces and/or at least one solid object.
With the technical solution of the present invention, variable units are provided inside a liquid crystal display or an OLED, or inside a projection display panel, and a display image matching the intersecting faces and/or solid objects can then be generated based on the variable units built into the display panel. Applying such a display system to a stage or a game increases its sense of reality.
Brief Description of the Drawings
The present invention is further described below with reference to the accompanying drawings and embodiments, in which:
FIG. 1 is a schematic diagram of a projection display system and its application scenario in an embodiment of the present invention;
FIG. 2 is a schematic diagram, based on FIG. 1, after one variable unit forms a recessed structure;
FIG. 3 is a schematic diagram, based on FIG. 1, after two variable units rise by one level and two levels respectively;
FIG. 4 is a schematic diagram, based on FIG. 1, after a stepped structure is formed;
FIG. 5 is a schematic block diagram of a projection display system in a preferred embodiment of the present invention.
Detailed Description of the Embodiments
A projection display system is described first as an example. FIG. 5 is a schematic block diagram of a projection display system in a preferred embodiment of the present invention. The projection display system includes a projection lens unit, an image processing unit, and a picture storage unit; the picture storage unit stores preset projection application pictures.
It also includes a projection display panel whose area is greater than or equal to the projection display area, the projection display area corresponding to the projection lens unit being projected onto the display panel. At least one variable unit and driving mechanisms in one-to-one correspondence with the variable units are provided inside the projection display panel; the driving mechanism can drive the variable unit to work in a placed state or a hidden state, and when a variable unit works in the placed state, the display panel presents at least two intersecting faces and/or at least one solid object.
In operation, application pictures corresponding to the respective variable units are preset; the image processing unit then processes the application picture based on the variable units currently working in the placed state to form a picture pattern-corrected for the at least two intersecting faces and/or at least one solid object, which is mapped by the projection lens unit to form, in the projection display area, a projection display image matching the at least two intersecting faces and/or at least one solid object.
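The pattern correction described above amounts, for each flat face, to a projective warp (a planar homography) that maps the stored application picture onto the face so it appears undistorted. Below is a minimal sketch of that idea; the four point correspondences are invented for illustration and are not taken from the patent.

```python
import numpy as np

def homography(src, dst):
    """Solve the 3x3 homography H mapping 4 src points to 4 dst points (DLT)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography is the null vector of this 8x9 system (up to scale).
    _, _, vt = np.linalg.svd(np.array(rows, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def warp_point(H, p):
    """Apply H to a 2-D point in homogeneous coordinates."""
    q = H @ np.array([p[0], p[1], 1.0])
    return (q[0] / q[2], q[1] / q[2])

# Corners of one face in the stored application picture (pixels), and the
# positions the projector must send them to so the face looks rectangular
# to the viewer. Both sets of values are purely illustrative.
src = [(0, 0), (100, 0), (100, 100), (0, 100)]
dst = [(10, 5), (90, 12), (95, 108), (6, 98)]
H = homography(src, dst)
```

One such homography per visible face, chained with the panel-to-projector mapping, would realize the pattern-correction step the paragraph describes.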
As FIG. 5 shows, the system also includes an information processing unit and a drive control unit. In operation, according to the at least two intersecting faces and/or at least one solid object corresponding to the preset application picture currently to be played, the information processing unit sends a corresponding signal to the drive control unit in a wired or wireless manner; the drive control unit then sends a corresponding drive control signal to the drive mechanism, driving at least one variable unit to work in the placed state or the hidden state. In other words, the projection display system of the present invention can automatically control the corresponding variable units according to the projection display image to be played, so that the corresponding solid objects are correctly placed at the corresponding positions on the display panel; the solid objects can be switched automatically as needed, forming a variety of combined display modes rather than a single mode.
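The control flow just described — each preset picture determining which units are placed and which stay hidden — can be sketched as a small dispatch table. The scene names, unit states, and grid coordinates below are invented for illustration only.

```python
# Each preset scene lists which variable units must be placed (and how),
# while every other unit stays hidden, flush with the panel.
SCENES = {
    "castle":  {(0, 0): "raised_2", (3, 2): "raised_1"},
    "dungeon": {(1, 1): "recessed"},
}

def drive_signals(scene, grid_size=(4, 4)):
    """Return one drive-control signal per variable unit in the grid."""
    placed = SCENES[scene]
    return {
        (row, col): placed.get((row, col), "hidden")
        for row in range(grid_size[0])
        for col in range(grid_size[1])
    }

signals = drive_signals("castle")
```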
As FIG. 5 also shows, the system includes an interactive manipulation unit capable of interactive manipulation based on the projected display image; it cooperates with the information processing unit to realize interactive manipulation based on the projected display image.
The interactive manipulation unit may be a remote controller based on wireless communication, for example a combination of one or more of a mobile phone, a tablet computer, a game controller, or an air mouse; the remote controller may transmit interactive manipulation signals using 2.4 GHz, Bluetooth, or Wi-Fi communication.
The interactive manipulation unit may also be an external remote controller based on infrared light plus an infrared monitoring lens. In operation, the external remote controller forms an infrared spot within the projection display area, the infrared monitoring lens captures the infrared spot, the image processing unit displays a corresponding visible icon according to the position information of the infrared spot, and the visible icon is controlled through the external remote controller for interactive manipulation.
The interactive manipulation unit may also be a direct-touch interactive manipulation unit, selecting one interactive manipulation mode from the dual-monitoring-lens mode, the TOF sensor mode, or the structured-light sensor mode.
FIG. 1 is a schematic diagram of a projection display system and its application scenario in an embodiment of the present invention. The projector 100 is mounted on a bracket 140 above the display panel 108; two monitoring lenses 104 and 106 are disposed on both sides of the projection lens unit 102, which can project the preset projection application picture into the projection display area.
As can be seen from FIG. 1, this embodiment has a display panel 108 whose area is greater than or equal to that of the projection display area; the projection display area projected by the projector 100 is formed on this panel. The display panel 108 comprises a number of grid cells, each of which is a variable unit 110. When a variable unit works in the placed state, it may form at least two intersecting faces and/or at least one solid object that are recessed, or at least two intersecting faces and/or at least one solid object that are raised; the solid object is a simple three-dimensional structure, including a combination of one or more of a cube, a cuboid, a pyramid, a cone, a sphere, a cylinder, or a frustum. The variable unit may be a two-level or multi-level lifting structure, or a laterally movable structure. In addition, an optical material layer that enhances the contrast of the projected display image under bright ambient light may be attached to the surface of the solid object.
A driving mechanism (not shown) is also provided for each variable unit in FIG. 1 to drive the variable unit to work in the placed state or the hidden state. In a specific implementation, the display panel 108 may contain only one variable unit, or two or more.
The embodiment of FIG. 1 uses a direct-touch interactive manipulation unit, specifically the dual-monitoring-lens mode: two monitoring lenses 104 and 106 are disposed on both sides of the projection lens unit 102, and an infrared light source (not shown) integrated in the projector 100 emits infrared light covering the entire projection display area. The infrared light source may also be mounted outside the projection lens, emitting infrared light directly over the entire projection display area. When a finger enters the projection scene to touch the projected image, the images simultaneously captured by the two infrared monitoring lenses are passed to an interaction algorithm module, which computes the spatial position of the finger and thus realizes touch operation on the stereoscopic image. The two infrared monitoring lenses must first be calibrated and corrected; an image disparity map is acquired, spatial reconstruction is achieved through image tracking, segmentation, and recognition, the three-dimensional position of the finger is computed algorithmically, and the three-dimensional relief of the projected scene is computed at the same time.
In this embodiment, when a hand enters the projection scene, the two monitoring lenses 104 and 106 monitor the spatial position of the finger; by monitoring the finger's motion trajectory and spatial coordinates, the touch operation currently being performed can be determined, for example clicks and slides at different heights in space.
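Once the calibrated lens pair yields a disparity for the fingertip, its depth follows from the standard rectified-stereo relation Z = f·B/d, and a touch can be reported when the fingertip comes within a threshold of the reconstructed surface. The focal length, baseline, and threshold below are illustrative assumptions, not values from the patent.

```python
def depth_from_disparity(disparity_px, focal_px=800.0, baseline_m=0.06):
    """Rectified-stereo relation Z = f * B / d (pinhole camera model)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

def is_touch(finger_z_m, surface_z_m, threshold_m=0.01):
    """Report a touch when the fingertip is within threshold of the surface."""
    return abs(finger_z_m - surface_z_m) <= threshold_m

finger_z = depth_from_disparity(96.0)  # 800 px focal, 6 cm baseline -> 0.5 m
```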
The direct-touch interactive manipulation unit in this embodiment may also use a TOF (Time of Flight) sensor mode or a structured-light sensor mode.
The basic principle of a TOF sensor is to receive light returned from an object and obtain the target distance by detecting the round-trip flight time of a light pulse. A TOF sensor resembles an ordinary machine-vision imaging chain, consisting of a light source, optical components, a sensor, control circuitry, and processing circuitry; it obtains the target distance by detecting the emitted and reflected light. By reconstructing the stereoscopic information of the projection space with the TOF sensor, the three-dimensional position of the finger can be computed and a touch in space determined.
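The round-trip relation the paragraph states is simply distance = c · t / 2; a one-line sketch:

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_s):
    """Target distance from the round-trip flight time of a light pulse."""
    return SPEED_OF_LIGHT * round_trip_s / 2.0
```

A 10 ns round trip, for instance, corresponds to roughly 1.5 m, which illustrates why TOF sensing at room scale requires sub-nanosecond timing.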
A structured-light sensor works by encoding the measurement space with continuous (near-infrared) light; the encoded light is read by a sensor and decoded by chip computation to produce a depth image. Unlike the traditional structured-light method, however, the light source projects not a periodically varying two-dimensional image code but a "volume code" with three-dimensional depth. This light source is called laser speckle: random diffraction spots formed when a laser strikes a rough object or passes through frosted glass. The speckles are highly random and change pattern with distance; any two places in space show different patterns, which effectively marks the whole space, so when any object enters the space and moves, its position can be recorded exactly. By reconstructing the stereoscopic information of the projection space with the structured-light sensor, the three-dimensional position of the finger can be computed and a touch in space determined.
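The "marking the whole space" idea reduces to matching an observed speckle patch against reference patches recorded at known distances. A toy sketch of that decoding step — the patches are tiny invented bit tuples standing in for real speckle imagery:

```python
# Reference speckle patches recorded at known distances (meters). At run
# time the observed patch is matched against them; the best match gives
# the depth at that pixel. All values here are illustrative.
REFERENCE = {
    0.5: (1, 0, 1, 1, 0, 0, 1, 0),
    1.0: (0, 1, 1, 0, 1, 0, 0, 1),
    1.5: (1, 1, 0, 0, 0, 1, 1, 0),
}

def match_depth(observed):
    """Return the reference distance whose patch agrees on the most pixels."""
    def agreement(distance):
        return sum(a == b for a, b in zip(observed, REFERENCE[distance]))
    return max(REFERENCE, key=agreement)
```

Real decoders correlate windowed patches against a dense reference volume, but the principle — depth from the best-matching speckle pattern — is the same.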
FIG. 2 shows a variation of FIG. 1 in which one variable unit has formed a recessed structure 112. After the recess forms, three visible projection surfaces appear at the current viewing angle: the bottom surface 114 and two side surfaces 116, 118. In a specific implementation, pattern correction of the corresponding application picture can be performed based on these three surfaces and the surface of the display panel. In this embodiment, pattern correction forms a door on the side surface 116, while projection mapping onto the side surface 118 and the bottom surface 114 forms a virtual ladder; the door on side 116 and the ladder on side 118 and bottom 114 are preset application pictures. When the preset application plays to this point, the information processing unit sends a corresponding signal to the drive control unit in a wired or wireless manner; the drive control unit sends a corresponding drive control signal to the drive mechanism, which drives the bottom surface 114 to sink from its original position to the position shown, producing the desired image of a door and a ladder at the recess.
FIG. 3 shows another variation of FIG. 1, in which the bracket 140 has moved to the upper-right corner of the display panel, one variable unit 128 in the lower-right part has risen by one level, and another variable unit 120 in the upper-left part has risen by two levels.
For the variable unit 128 raised by one level, two visible projection surfaces form at the current viewing angle: the upper surface 130 and the front surface 132. Its right side surface, although visible, can no longer serve as a projection surface given the position of the projector 100.
For the variable unit 120 raised by two levels, three visible projection surfaces form at the current viewing angle: the upper surface 122, the front surface 124, and the right side surface 126. As in FIG. 2, the system can preset a projection application picture, and the information processing unit drives the variable units 120 and 128 to this position according to the content to be played, so that mapping turns the variable units into a desired image, for example a building or a castle.
FIG. 4 shows another variation of FIG. 1, in which some variable units remain unchanged, some rise by one level, and some rise by two levels, forming a stepped display panel.
To aid understanding, the foregoing description uses a projection display system; those skilled in the art will understand that the solution of the present invention can also be applied to a liquid crystal display or an OLED. In that case the aforementioned projector and related components are not needed; instead, the display panel and the visible surfaces of each variable unit are all made of liquid crystal display or OLED, so that the corresponding images can be displayed directly, achieving the same effect.
Since the display panel of the present invention is no longer a single flat or curved surface, its display image is preferably one with an expected augmented-reality effect. The application picture then includes at least one overall image to be displayed on at least two intersecting faces; the overall image either shows a part of the picture on each face, or is displayed continuously across the faces, showing a portion of a continuous motion on each face in turn.
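Dividing one overall image across the intersecting faces amounts to slicing the source picture at the fold lines between faces. A toy sketch with an invented one-dimensional "image row" and face widths:

```python
def split_across_faces(image_row, face_widths):
    """Cut one row of an overall image into consecutive per-face segments."""
    segments, pos = [], 0
    for width in face_widths:
        segments.append(image_row[pos:pos + width])
        pos += width
    return segments

row = list(range(10))                        # a 10-pixel-wide image row
parts = split_across_faces(row, [4, 3, 3])   # three intersecting faces
```

Each segment would then be pattern-corrected for its own face, so the pieces read as one continuous picture across the fold.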

Claims (16)

  1. A display system comprising an image processing unit, characterized in that:
    it further comprises a display panel whose area is greater than or equal to a display area, the display area being formed on the display panel; at least one variable unit and driving mechanisms in one-to-one correspondence with the variable units are provided inside the display panel; the driving mechanism can drive the variable unit to work in a placed state or a hidden state; when the variable unit works in the placed state, the display panel presents at least two intersecting faces and/or at least one solid object;
    the system further comprises a picture storage unit for storing application pictures corresponding to the respective variable units;
    the image processing unit processes the application picture based on the variable units currently working in the placed state to form a picture pattern-corrected for the at least two intersecting faces and/or at least one solid object, then maps the corrected picture and forms, in the display area, a display image matching the at least two intersecting faces and/or at least one solid object.
  2. The display system of claim 1, characterized in that the surfaces of the display panel, the at least two intersecting faces, and/or the at least one solid object are liquid crystal displays or OLEDs.
  3. The display system of claim 1, characterized in that it further comprises a projection lens unit, the projection display area corresponding to the projection lens unit being projected onto the display panel;
    the image processing unit processes the application picture to form a picture pattern-corrected for the at least two intersecting faces and/or at least one solid object, which is then mapped by the projection lens unit to form, in the display area, a projection display image matching the at least two intersecting faces and/or at least one solid object.
  4. The display system of any one of claims 1-3, characterized in that it further comprises an information processing unit and a drive control unit, wherein, according to the at least two intersecting faces and/or at least one solid object corresponding to the preset application picture currently to be played, the information processing unit sends a corresponding signal to the drive control unit in a wired or wireless manner; the drive control unit then sends a corresponding drive control signal to the drive mechanism, driving at least one variable unit to work in the placed state or the hidden state.
  5. The display system of claim 4, characterized in that when the variable unit works in the placed state, it forms at least two intersecting faces and/or at least one solid object that are recessed, or at least two intersecting faces and/or at least one solid object that are raised; the solid object is a simple three-dimensional structure, including a combination of one or more of a cube, a cuboid, a pyramid, a cone, a sphere, a cylinder, or a frustum.
  6. The display system of claim 5, characterized in that at least one variable unit is a two-level or multi-level lifting structure.
  7. The display system of claim 5, characterized in that at least one variable unit is a laterally movable structure.
  8. The display system of any one of claims 1-3, characterized in that the display image is a display image with an expected augmented-reality effect; the application picture includes at least one overall image to be displayed on at least two intersecting faces; the overall image either shows a part of the picture on each face, or is displayed continuously across the faces, showing a portion of a continuous motion on each face in turn.
  9. The display system of claim 4, characterized in that it further comprises an interactive manipulation unit capable of interactive manipulation based on the projected display image, the interactive manipulation unit cooperating with the information processing unit to realize interactive manipulation based on the projected display image.
  10. The display system of claim 9, characterized in that the interactive manipulation unit comprises a remote controller based on wireless communication, including a combination of one or more of a mobile phone, a tablet computer, a game controller, or an air mouse; the remote controller transmits interactive manipulation signals using 2.4 GHz, Bluetooth, or Wi-Fi communication.
  11. The display system of claim 9, characterized in that the interactive manipulation unit comprises an external remote controller based on infrared light and an infrared monitoring lens; the external remote controller can form an infrared spot within the projection display area; the infrared monitoring lens captures the infrared spot; the image processing unit displays a corresponding visible icon according to the position information of the infrared spot, and the visible icon is controlled through the external remote controller to realize interactive manipulation.
  12. The display system of claim 9, characterized in that the interactive manipulation unit comprises a direct-touch interactive manipulation unit, selecting one interactive manipulation mode from the dual-monitoring-lens mode, the TOF sensor mode, or the structured-light sensor mode.
  13. The display system of claim 3, characterized in that an optical material layer that can enhance the contrast of the projected display image under bright ambient light is attached to the surface of the solid object.
  14. The display system of claim 3, characterized in that a bracket supporting the projection lens unit is provided on the display panel, the projection lens unit being mounted on the bracket above or to one side of the display panel.
  15. A display method, characterized in that it uses a display panel whose area is greater than or equal to a display area, the display area being formed on the display panel; at least one variable unit that can work in a placed state or a hidden state is provided inside the display panel; when the variable unit works in the placed state, the display panel presents at least two intersecting faces and/or at least one solid object; the method comprises the following steps:
    presetting an application picture corresponding to the variable unit;
    processing the application picture based on the variable units currently working in the placed state to form a picture pattern-corrected for the at least two intersecting faces and/or at least one solid object, then mapping the corrected picture and forming, in the display area, a display image matching the at least two intersecting faces and/or at least one solid object.
  16. The display method according to claim 15, characterized in that the display image is a display image with an intended augmented-reality effect, the application picture containing at least one whole image to be displayed on at least two intersecting faces; the whole image either shows a portion of the picture on each face, or is displayed continuously across the faces, presenting a portion of a continuous motion on each face in turn.
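Claims 9 to 11 describe an infrared interaction path: the remote forms a light spot, the monitoring lens captures it, and a visible icon is displayed at the corresponding position. A minimal sketch of that path, assuming brightest-pixel spot detection and a pre-calibrated 3x3 homography (both are assumptions for the sketch, not stated in the claims):

```python
import numpy as np

def locate_ir_spot(frame):
    """Find the infrared spot in a monitoring-lens frame; the spot is
    assumed to be the single brightest pixel (a simplification)."""
    return np.unravel_index(np.argmax(frame), frame.shape)

def camera_to_display(point, homography):
    """Map a camera-space spot position into display coordinates with a
    3x3 homography assumed to be calibrated in advance; the visible
    icon would then be drawn at the returned position."""
    v = homography @ np.array([point[0], point[1], 1.0])
    return v[:2] / v[2]  # dehomogenize
```

A production system would replace the brightest-pixel heuristic with thresholding plus blob detection, and would recalibrate the homography whenever the projection geometry changes.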
PCT/CN2015/090973 2015-09-28 2015-09-28 Display system and display method therefor WO2017054114A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2015/090973 WO2017054114A1 (zh) 2015-09-28 2015-09-28 Display system and display method therefor

Publications (1)

Publication Number Publication Date
WO2017054114A1 (zh) 2017-04-06

Family

ID=58422467

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130093660A1 (en) * 2011-10-14 2013-04-18 Alexander Samson Hirsch Method and system to control a process with bend movements
CN203350807U (zh) * 2013-07-31 2013-12-18 神画科技(深圳)有限公司 Display system with interactive function
CN104423919A (zh) * 2013-09-10 2015-03-18 Lenovo (Beijing) Co., Ltd. Image processing method and electronic device
CN104823134A (zh) * 2012-10-05 2015-08-05 Samsung Electronics Co., Ltd. Flexible display apparatus and method of controlling flexible display apparatus
CN104834394A (zh) * 2014-02-09 2015-08-12 神画科技(深圳)有限公司 Interactive display system
CN105245866A (zh) * 2015-09-28 2016-01-13 神画科技(深圳)有限公司 Display system and display method therefor

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application
  Ref document number: 15905026; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
  Ref country code: DE
122 Ep: PCT application non-entry in European phase
  Ref document number: 15905026; Country of ref document: EP; Kind code of ref document: A1