WO2021212463A1 - Display device and screen projection method - Google Patents

Display device and screen projection method

Info

Publication number
WO2021212463A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
screen
information
terminal
display device
Prior art date
Application number
PCT/CN2020/086665
Other languages
English (en)
French (fr)
Inventor
宋子全
庞秀娟
Original Assignee
海信视像科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 海信视像科技股份有限公司 filed Critical 海信视像科技股份有限公司
Priority to PCT/CN2020/086665 priority Critical patent/WO2021212463A1/zh
Publication of WO2021212463A1 publication Critical patent/WO2021212463A1/zh


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/64 Constructional details of receivers, e.g. cabinets or dust covers
    • H04N5/655 Construction or mounting of chassis, e.g. for varying the elevation of the tube
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units

Definitions

  • This application relates to the technical field of smart TVs, and in particular to a display device and a screen projection method.
  • Screen projection is an interactive operation between a terminal and a display device.
  • Generally, a wireless local area network is used to transmit a video stream so that the picture on the terminal device can be presented through the display device.
  • Taking mobile phone screen projection as an example, for a phone and a smart TV connected to the same WiFi network, a screen projection instruction can be executed on the phone to send the picture displayed on the phone to the smart TV in the form of a video stream, so as to use the large screen of the smart TV and obtain a better user experience.
  • the screen ratio of terminals such as mobile phones and the screen ratio of display devices are often different.
  • For example, under conventional operation, the screen display aspect ratio of a mobile phone is 1080:1940, while the display aspect ratio of a smart TV is 1940:1080; that is, the picture on the phone is in a vertical orientation while the picture on the smart TV is in a horizontal orientation. Therefore, when a terminal picture is projected and displayed through a smart TV, the projected picture often cannot be displayed normally because the aspect ratio of the terminal picture does not match that of the display.
  • In order to fully display the picture on the phone, the picture needs to be scaled based on the height of the phone screen.
  • However, when the projected picture is scaled, the difference in screen ratio will result in large black areas on both sides of the picture displayed by the smart TV, which not only reduces the user's viewing experience but also wastes display space on the screen.
  • the present application provides a display device and a screen projection method to solve the problem of wasting the display space on the screen when the traditional display device displays the projection screen.
  • In one aspect, the present application provides a display device, including: a display; a rotating assembly configured to drive the display to rotate so that the display is in one of a horizontal screen (landscape) state or a vertical screen (portrait) state; a user interface configured to be connected to a terminal; and a controller in communication with the display, the rotating assembly, and the user interface.
  • The controller is configured to: receive image information sent by the terminal, where, when the terminal is in portrait mode, the image information includes valid information and left and right black areas; if the current rotation state of the display does not match the portrait mode of the terminal, rotate the display to the vertical screen state; and control the display to present a projected picture based on the valid information, where the projected picture is obtained by enlarging the valid information by a preset factor.
  • In another aspect, this application also provides a screen projection method applied to a display device, including: receiving image information sent by the terminal, where, when the terminal is in portrait mode, the image information includes valid information and left and right black information, and the valid information corresponds to the screen display content of the terminal; if the current rotation state of the display does not match the portrait mode of the terminal, rotating the display to the vertical screen state; and controlling the display to present a projected picture based on the valid information, where the projected picture is obtained by enlarging the valid information by a preset factor.
  • FIG. 1A is an application scenario diagram of a display device of this application
  • FIG. 1B is a rear view of a display device of this application.
  • FIG. 2 is a block diagram of the hardware configuration of the control device of the application
  • FIG. 3 is a block diagram of the architecture configuration of the operating system in the storage device of the display device of this application;
  • FIG. 4A is a schematic diagram of the landscape mode of the mobile terminal of this application.
  • FIG. 4B is a schematic diagram of the vertical mode of the mobile terminal of this application.
  • FIG. 5A is a schematic diagram of a mobile terminal in landscape mode sending image information to a display device in the landscape state in an embodiment of the application;
  • FIG. 5B is a schematic diagram of a mobile terminal in landscape mode sending image information to a display device in the portrait state in an embodiment of the application;
  • FIG. 6A is a schematic diagram of a mobile terminal in portrait mode sending image information to a display device in the landscape state in an embodiment of the application;
  • FIG. 6B is a schematic diagram of a mobile terminal in portrait mode sending image information to a display device in the portrait state in an embodiment of the present application;
  • FIG. 6C is a schematic diagram of the effect of presenting a picture when the display device is in the portrait state according to the screen projection protocol;
  • FIG. 7 is a schematic diagram of a flow executed by a controller of an exemplary display device of this application.
  • FIG. 8 is a schematic flow chart of the application for determining the magnification of effective information according to the ratio of the height and width of the display device and the effective screen;
  • Figure 9 is a schematic diagram of the black information and image information of the application.
  • FIG. 10 is a schematic structural diagram of a display device of this application.
  • the display device may be an electrical device with a larger screen, such as a smart TV, that presents video and audio signals to users.
  • the display device can have an independent operating system and support function expansion.
  • Various applications can be installed on the display device according to user needs, for example, traditional video applications, social applications such as short video, and reading applications such as comics and e-books. These applications can use the screen of the display device to display application content and provide users with richer media resources.
  • the display device can also perform data interaction and resource sharing with different terminals.
  • a smart TV can be connected to a mobile phone through wireless communication methods such as local area network, Bluetooth, etc., so as to play resources in the mobile phone or directly cast a screen to display the screen on the mobile phone.
  • For this purpose, an embodiment of the present application provides a display device and a computer storage medium.
  • The display device is, for example, a rotating TV. It should be noted that the method provided in this embodiment is applicable not only to rotating TVs but also to other display devices, such as computers and tablet computers.
  • The term "module" used in the various embodiments of this application may refer to any known or later-developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code capable of executing the functions related to the component.
  • The term "remote control" used in the various embodiments of this application refers to a component of an electronic device (such as the display device disclosed in this application) that can generally control the electronic device wirelessly over a short distance.
  • the component can generally use infrared and/or radio frequency (RF) signals and/or Bluetooth to connect with electronic devices, and can also include functional modules such as WiFi, wireless USB, Bluetooth, and motion sensors.
  • a handheld touch remote control uses a user interface in a touch screen to replace most of the physical built-in hard keys in general remote control devices.
  • The term "gesture" used in the embodiments of the present application refers to a user behavior that expresses an expected idea, action, goal, and/or result through a change of hand shape or a hand movement.
  • the term "hardware system” used in the various embodiments of this application may refer to an integrated circuit (IC), printed circuit board (Printed circuit board, PCB) and other mechanical, optical, electrical, and magnetic devices with computing , Control, storage, input and output functions of the physical components.
  • the hardware system is also usually referred to as a motherboard or a main chip or a controller.
  • FIG. 1A is an application scenario diagram of a display device provided by some embodiments of this application.
  • the control device 100 and the display device 200 can communicate in a wired or wireless manner.
  • The control device 100 is configured to control the display device 200; it can receive operation instructions input by the user and convert the operation instructions into instructions that the display device 200 can recognize and respond to, acting as an intermediary between the user and the display device 200.
  • For example, the user operates the channel up/down keys on the control device 100, and the display device 200 responds to the channel up/down operations.
  • The control device 100 may be a remote controller 100A, which controls the display device 200 wirelessly or in another wired manner through infrared protocol communication, Bluetooth protocol communication, or other short-distance communication methods.
  • the user can control the display device 200 by inputting user instructions through keys on the remote control, voice input, control panel input, etc.
  • For example, the user can control the functions of the display device 200 by inputting corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input keys, menu keys, and power on/off keys on the remote control.
  • the control device 100 may also be a smart device, such as a mobile terminal 100B, a tablet computer, a computer, a notebook computer, and the like.
  • an application program running on a smart device is used to control the display device 200.
  • the application can be configured to provide users with various controls through an intuitive user interface (UI) on the screen associated with the smart device.
  • For example, a software application associated with the display device 200 may be installed on the mobile terminal 100B to establish connection and communication through a network communication protocol, thereby achieving one-to-one control operation and data communication.
  • the mobile terminal 100B can establish a control instruction protocol with the display device 200, and the functions of the physical keys arranged on the remote control 100A can be realized by operating various function keys or virtual controls of the user interface provided on the mobile terminal 100B.
  • the audio and video content displayed on the mobile terminal 100B can also be transmitted to the display device 200 to realize the synchronous display function.
  • The display device 200 may provide a broadcast receiving function as well as a network TV function supported by computer capabilities.
  • the display device can be implemented as digital TV, Internet TV, Internet Protocol TV (IPTV), and so on.
  • the display device 200 may be a liquid crystal display, an organic light emitting display, or a projection device.
  • the specific display device type, size and resolution are not limited.
  • the display device 200 also performs data communication with the server 300 through a variety of communication methods.
  • the display device 200 may be allowed to communicate through a local area network (LAN), a wireless local area network (WLAN), and other networks.
  • the server 300 may provide various contents and interactions to the display device 200.
  • the display device 200 can send and receive information, such as receiving electronic program guide (EPG) data, receiving software program updates, or accessing a remotely stored digital media library.
  • the server 300 can be one group or multiple groups, and can be one type or multiple types of servers.
  • the server 300 provides other network service contents such as video-on-demand and advertising services.
  • The display device 200 includes a controller 250, a display 275, a terminal interface 278 protruding from a gap in the backplane, and a rotating assembly 276 connected to the backplane; the rotating assembly 276 can cause the display 275 to rotate.
  • The rotating assembly 276 can rotate the display screen to the vertical screen state, that is, the state where the vertical side length of the screen is greater than the horizontal side length, or it can rotate the screen to the horizontal screen state, that is, the state where the horizontal side length of the screen is greater than the vertical side length.
  • FIG. 2 exemplarily shows a block diagram of the hardware configuration of the display device 200.
  • the display device 200 may include a tuner and demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a memory 260, a user interface 265, a video processor 270, a display 275, Rotating component 276, audio processor 280, audio output interface 285, power supply 290.
  • the rotating assembly 276 may include components such as a drive motor and a rotating shaft.
  • The driving motor can be connected to the controller 250 and output a rotation angle under the control of the controller 250; one end of the rotating shaft is connected to the power output shaft of the driving motor, and the other end is connected to the display 275, so that the display 275 can be fixedly installed, through the rotating assembly 276, on a wall or a bracket.
  • the rotating assembly 276 may also include other components, such as transmission components, detection components, and so on.
  • The transmission component can adjust the rotation speed and torque output by the rotating assembly 276 through a specific transmission ratio, and may adopt a gear transmission mode.
  • the detection component can be composed of sensors arranged on the rotating shaft, such as an angle sensor, an attitude sensor, and the like. These sensors can detect parameters such as the angle of rotation of the rotating component 276 and send the detected parameters to the controller 250 so that the controller 250 can determine or adjust the state of the display device 200 according to the detected parameters.
  • the rotating assembly 276 may include, but is not limited to, one or more of the aforementioned components.
  • The tuner and demodulator 210 receives broadcast television signals through wired or wireless means, can perform modulation and demodulation processing such as amplification, mixing, and resonance, and is used to demodulate, from multiple wireless or cable broadcast television signals, the audio and video signals carried in the frequency of the television channel selected by the user, as well as additional information (such as EPG data).
  • The tuner and demodulator 210 can, according to the user's selection and under the control of the controller 250, respond to the frequency of the television channel selected by the user and the television signal carried by that frequency.
  • The tuner and demodulator 210 can receive signals in many ways according to the broadcasting format of the TV signal, such as terrestrial broadcasting, cable broadcasting, satellite broadcasting, or Internet broadcasting; according to the modulation type, it can use digital modulation or analog modulation; and according to the type of received TV signal, it can demodulate analog signals and digital signals.
  • the tuner demodulator 210 may also be in an external device, such as an external set-top box.
  • In this case, the set-top box outputs a TV signal after modulation and demodulation, which is input to the display device 200 through the external device interface 240.
  • the communicator 220 is a component used to communicate with external devices or external servers according to various types of communication protocols.
  • the display device 200 may transmit content data to an external device connected via the communicator 220, or browse and download content data from an external device connected via the communicator 220.
  • The communicator 220 may include network communication protocol modules such as a WiFi module 221, a Bluetooth communication protocol module 222, a wired Ethernet communication protocol module 223, or a near field communication protocol module, so that the communicator 220 can receive control signals from the control device 100 under the control of the controller 250 and implement the control signals as WiFi signals, Bluetooth signals, radio frequency signals, and so on.
  • the detector 230 is a component of the display device 200 for collecting signals from the external environment or interacting with the outside.
  • The detector 230 may include a sound collector 231, such as a microphone, which may be used to receive the user's voice, for example a voice signal of a control instruction for controlling the display device 200; it may also collect environmental sounds used to identify the type of environmental scene, so that the display device 200 can adapt to environmental noise.
  • The detector 230 may also include an image collector 232, such as a camera, which may be used to collect external environment scenes so as to adaptively change the display parameters of the display device 200, and to collect user attributes or interactive gestures with the user so as to realize interaction between the display device and the user.
  • the detector 230 may further include a light receiver, which is used to collect the ambient light intensity to adapt to changes in display parameters of the display device 200 and so on.
  • the detector 230 may also include a temperature sensor.
  • the display device 200 may adaptively adjust the display color temperature of the image. Exemplarily, when the temperature is relatively high, the color temperature of the display device 200 can be adjusted to be colder; when the temperature is relatively low, the color temperature of the display device 200 can be adjusted to be warmer.
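  • The following minimal sketch illustrates the kind of temperature-to-color-temperature adaptation described above; the thresholds and Kelvin values are illustrative assumptions, not values from this application.

```python
# Illustrative sketch of adapting display color temperature to ambient temperature,
# as described above. Thresholds and Kelvin values are assumptions for illustration.

def pick_color_temperature(ambient_celsius: float) -> int:
    """Return a display color temperature in Kelvin: cooler when hot, warmer when cold."""
    if ambient_celsius >= 28:
        return 9000   # relatively high temperature -> cooler (bluer) color temperature
    if ambient_celsius <= 15:
        return 5500   # relatively low temperature -> warmer (yellower) color temperature
    return 6500       # neutral default

print(pick_color_temperature(31))  # -> 9000
print(pick_color_temperature(10))  # -> 5500
```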
  • The external device interface 240 is a component that enables the controller 250 to control data transmission between the display device 200 and external devices.
  • The external device interface 240 can be connected in a wired or wireless manner to external devices such as set-top boxes, game devices, and notebook computers, and can receive data from the external devices, such as video signals (for example, moving images), audio signals (for example, music), and additional information (for example, EPG).
  • The external device interface 240 may include any one or more of the following: a high-definition multimedia interface (HDMI) terminal 241, a composite video blanking synchronization (CVBS) terminal 242, an analog or digital component terminal 243, a universal serial bus (USB) terminal 244, a component terminal (not shown in the figure), red, green and blue (RGB) terminals (not shown in the figure), and so on.
  • the controller 250 controls the work of the display device 200 and responds to user operations by running various software control programs (such as an operating system and various application programs) stored on the memory 260.
  • the controller 250 includes a random access memory (RAM), a read only memory (ROM), a graphics processor, a CPU processor, a communication interface, and a communication bus.
  • The RAM, ROM, graphics processor, CPU processor, and communication interface are connected through the communication bus.
  • The ROM 252 is used to store various system startup instructions. For example, when a power-on signal is received, the display device 200 starts to power up, and the CPU processor 254 runs the system startup instructions in the ROM 252 to copy the operating system stored in the memory 260 into the RAM 251 and begin running the operating system. After the operating system is started, the CPU processor 254 copies the various application programs in the memory 260 into the RAM 251 and then starts running these application programs.
  • The graphics processor 253 is used to generate various graphics objects, such as icons, operation menus, and display graphics for user input instructions.
  • The graphics processor 253 may include an arithmetic unit, which performs operations by receiving the various interactive instructions input by the user and then displays various objects according to their display attributes, and a renderer, which renders the various objects generated by the arithmetic unit; the rendered result is displayed on the display 275.
  • the CPU processor 254 is configured to execute operating system and application program instructions stored in the memory 260. And according to the received user input instructions, to execute various applications, data and content processing, so as to finally display and play various audio and video content.
  • the CPU processor 254 may include multiple processors.
  • the multiple processors may include a main processor and multiple or one sub-processors.
  • The main processor is configured to perform some initialization operations of the display device 200 in the display device preloading mode and/or to perform picture display operations in the normal mode. The one or more sub-processors are used to perform operations when the display device is in standby mode.
  • the communication interface 255 may include the first interface to the nth interface. These interfaces may be network interfaces connected to external devices via a network.
  • the controller 250 may control the overall operation of the display device 200. For example, in response to receiving a user input command for selecting a GUI object displayed on the display 275, the controller 250 may perform an operation related to the object selected by the user input command.
  • the object can be any one of the selectable objects, such as a hyperlink or an icon.
  • the operation related to the selected object for example, the operation of displaying the page, document, image, etc. connected to the hyperlink, or the operation of executing the program corresponding to the object.
  • the user input command for selecting the GUI object may be a command input through various input devices (for example, a mouse, a keyboard, a touch pad, etc.) connected to the display device 200 or a voice command corresponding to a voice spoken by the user.
  • the memory 260 is used to store various types of data, software programs or application programs for driving and controlling the operation of the display device 200.
  • the memory 260 may include volatile and/or non-volatile memory.
  • the term "memory" includes the memory 260, the RAM and ROM of the controller 250, or the memory card in the display device 200.
  • The memory 260 is specifically used to store the operating programs that drive the controller 250 in the display device 200; to store various application programs built into the display device 200 and downloaded by the user from external devices; and to store data such as the various GUIs provided by the display 275, various objects related to the GUIs, and visual effect images of the selector used to select GUI objects.
  • The memory 260 is specifically used to store drivers and related data of the tuner and demodulator 210, the communicator 220, the detector 230, the external device interface 240, the video processor 270, the display 275, the audio processor 280, and so on; external data such as audio and video data; and user data such as key information, voice information, and touch information.
  • the memory 260 specifically stores software and/or programs used to represent an operating system (OS). These software and/or programs may include, for example, a kernel, middleware, application programming interface (API), and/or application.
  • Among them, the kernel can control or manage system resources and the functions implemented by other programs (such as the middleware, APIs, or application programs); at the same time, the kernel can provide interfaces that allow the middleware, APIs, or application programs to control or manage system resources.
  • FIG. 3 exemplarily shows a block diagram of the architecture configuration of the operating system in the memory of the display device 200.
  • the operating system architecture consists of the application layer, the middleware layer, and the kernel layer from top to bottom.
  • Application layer: system built-in applications and non-system-level applications belong to the application layer, which is responsible for direct interaction with users.
  • the application layer can include multiple applications, such as settings applications, e-post applications, media center applications, and so on. These applications can be implemented as web applications, which are executed based on the WebKit engine, and specifically can be developed and executed based on HTML5, Cascading Style Sheets (CSS) and JavaScript.
  • HTML, whose full name is HyperText Markup Language, is a markup language for describing web pages through markup tags. HTML tags are used to describe text, graphics, animation, sound, tables, links, and so on; the browser reads the HTML document, interprets the content of the tags in the document, and displays it in the form of a web page.
  • CSS, the full name of which is Cascading Style Sheets, is a computer language used to express the style of HTML documents and can be used to define style structures such as fonts, colors, and positions. CSS styles can be stored directly in HTML web pages or in separate style files, so as to control the styles in web pages.
  • JavaScript is a language used in web page programming, which can be inserted into HTML pages and interpreted and executed by the browser.
  • the interaction logic of the web application is implemented through JavaScript.
  • Through the browser, JavaScript can encapsulate a JavaScript extension interface to realize communication with the kernel layer.
  • the middleware layer can provide some standardized interfaces to support the operation of various environments and systems.
  • For example, the middleware layer can be implemented as middleware related to data broadcasting, such as Multimedia and Hypermedia Information Coding Expert Group (MHEG) middleware; as middleware related to external device communication, such as DLNA middleware; or as middleware that provides the browser environment in which each application in the display device runs.
  • the kernel layer provides core system services, such as file management, memory management, process management, network management, system security authority management and other services.
  • the kernel layer can be implemented as a kernel based on various operating systems, for example, a kernel based on the Linux operating system.
  • The kernel layer also provides communication between system software and hardware, providing device driver services for various hardware, such as a display driver for the display, a camera driver for the camera, a button driver for the remote control, a WiFi driver for the WiFi module, an audio driver for the audio output interface, and a power management driver for the power management (PM) module.
  • the user interface 265 receives various user interactions. Specifically, it is used to send the input signal of the user to the controller 250, or to transmit the output signal from the controller 250 to the user.
  • For example, the remote control 100A may send input signals entered by the user, such as a power switch signal, a channel selection signal, and a volume adjustment signal, to the user interface 265, which then forwards them to the controller 250; or the remote control 100A may receive output signals such as audio, video, or data that have been processed by the controller 250 and output through the user interface 265, and display the received output signals or output them in the form of audio or vibration.
  • the user may input a user command on a graphical user interface (GUI) displayed on the display 275, and the user interface 265 receives the user input command through the GUI.
  • the user interface 265 may receive user input commands for controlling the position of the selector in the GUI to select different objects or items.
  • “user interface” is a medium interface for interaction and information exchange between applications or operating systems and users. It realizes the conversion between the internal form of information and the form acceptable to users.
  • the commonly used form of the user interface is a graphical user interface (GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It can be an icon, window, control and other interface elements displayed on the display screen of an electronic device.
  • The controls may include visual interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, channel bars, and widgets.
  • the user may input a user command by inputting a specific sound or gesture, and the user interface 265 recognizes the sound or gesture through the sensor to receive the user input command.
  • The video processor 270 is used to receive external video signals and, according to the standard codec protocol of the input signal, perform video data processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis, so as to obtain a video signal that can be displayed or played directly on the display 275.
  • the video processor 270 includes a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, and the like.
  • The demultiplexing module is used to demultiplex the input audio and video data stream; for example, when an MPEG-2 stream (a compression standard for digital storage media moving images and audio) is input, the demultiplexing module demultiplexes it into video signals, audio signals, and so on.
  • the video decoding module is used to process the demultiplexed video signal, including decoding and scaling.
  • The image synthesis module, such as an image synthesizer, is used to superimpose and mix the GUI signal generated by the graphics generator, according to user input or by the system itself, with the scaled video image, so as to generate an image signal for display.
  • The frame rate conversion module is used to convert the frame rate of the input video, for example converting a 60 Hz input video to a frame rate of 120 Hz or 240 Hz, usually by means of frame interpolation (a simplified illustrative sketch of frame interpolation is given after this list).
  • The display formatting module is used to convert the signal output by the frame rate conversion module into a signal conforming to the display format of the display, for example formatting the signal output by the frame rate conversion module to output RGB data signals.
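  • As a rough illustration of the frame rate doubling mentioned above, the sketch below inserts a blended frame between every pair of consecutive frames; real TV pipelines typically use motion-compensated interpolation, so this naive blend is only a simplified assumption.

```python
# Simplified sketch of 60 Hz -> 120 Hz conversion by inserting an averaged frame
# between consecutive frames. Real frame rate converters are motion compensated;
# this naive blend is only meant to illustrate the idea of frame interpolation.
import numpy as np

def double_frame_rate(frames):
    """frames: list of H x W x 3 uint8 arrays at 60 fps. Returns ~120 fps list."""
    out = []
    for cur, nxt in zip(frames, frames[1:]):
        out.append(cur)
        out.append(((cur.astype(np.uint16) + nxt.astype(np.uint16)) // 2).astype(np.uint8))
    out.append(frames[-1])
    return out

clip = [np.full((1080, 1920, 3), v, dtype=np.uint8) for v in (0, 60, 120)]
print(len(double_frame_rate(clip)))  # 3 input frames -> 5 output frames
```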
  • the display 275 is used to receive the image signal input from the video processor 270 to display video content, images, and a menu control interface.
  • the displayed video content can be from the video content in the broadcast signal received by the tuner and demodulator 210, or from the video content input by the communicator 220 or the external device interface 240.
  • the display 275 simultaneously displays a user manipulation interface UI generated in the display device 200 and used to control the display device 200.
  • the display 275 may include a display screen component for presenting a picture and a driving component for driving image display.
  • the display 275 may also include a projection device and a projection screen.
  • At the same time, the controller can send a control signal so that the rotating assembly 276 rotates the display 275.
  • The audio processor 280 is used to receive external audio signals and, according to the standard codec protocol of the input signal, perform decompression and decoding as well as audio data processing such as noise reduction, digital-to-analog conversion, and amplification, so as to obtain an audio signal that can be played in the speaker 286.
  • the audio processor 280 may support various audio formats. Such as MPEG-2, MPEG-4, Advanced Audio Coding (AAC), High Efficiency AAC (HE-AAC) and other formats.
  • the audio output interface 285 is used to receive the audio signal output by the audio processor 280 under the control of the controller 250.
  • The audio output interface 285 may include a speaker 286, or may output to an external audio output terminal 287 of an external device, such as a headphone output terminal.
  • the video processor 270 may include one or more chip components.
  • the audio processor 280 may also include one or more chips.
  • the video processor 270 and the audio processor 280 may be separate chips, or may be integrated with the controller 250 in one or more chips.
  • the power supply 290 is used to provide power supply support for the display device 200 with power input from an external power supply under the control of the controller 250.
  • the power supply 290 may be a built-in power supply circuit installed inside the display device 200, or may be a power supply installed outside the display device 200.
  • the mobile terminal 100B may send display screen data to the display device 200 through a wireless connection, such as the Miracast protocol, to form a screen projection video stream.
  • the controller 250 can decode the projected video stream, analyze the processed frame and form a projected image and send it to the display 275 for display.
  • the mobile terminal 100B may be a smart terminal device with display and human-computer interaction functions, such as a mobile phone, a tablet computer, and the like. Since the mobile terminal 100B has different operation modes, the formed projection screen also has different layout modes. For example, when the user holds the mobile phone in a landscape orientation for operation, the screen presented on the mobile phone has a landscape layout, that is, the width of the screen is greater than the height of the screen, and the phone is in landscape mode, as shown in FIG. 4A. When the user holds the mobile phone vertically for operation, the screen presented on the mobile phone has a vertical layout, that is, the width of the screen is smaller than the height of the screen, and the mobile phone is in portrait mode, as shown in FIG. 4B.
  • the screen aspect ratio of a mobile phone is usually 9:16, 10:16, etc.; the screen aspect ratio of a tablet computer is 3:4, etc.
  • There are also smart terminal devices with a screen aspect ratio of 1:1, such as smart watches.
  • the screen layout presented in the horizontal state and the vertical state is generally the same, and the orientation is different only when displayed on the display screen of the smart terminal device. Therefore, for the mobile terminal 100B with a display screen aspect ratio of 1:1, the projection screen formed during projection does not distinguish the horizontal and vertical states.
  • A screen projection protocol, for example one based on the Miracast standard, can be configured between the display device 200 and the mobile terminal 100B.
  • The screen projection protocol is used to transmit the screen video stream, so that the picture on the mobile terminal 100B can be projected onto the display device 200.
  • Under such a protocol, however, the video stream projected to the display device 200 is always a 1920×1080 landscape stream, which means that the display device 200 cannot automatically rotate the TV screen based on the video stream alone.
  • When the mobile terminal 100B is in landscape mode, the image information sent to the display device does not contain left and right black information, as shown in FIGS. 5A and 5B.
  • When the mobile terminal 100B is in portrait mode, the image information sent to the display device includes left and right black information, as shown in FIGS. 6A and 6C.
  • In the image information, the black areas on both sides of the picture are referred to as black information, or black borders, and the display area in the middle of the picture is called the effective picture (that is, the picture corresponding to the screen display content of the mobile terminal).
  • The effective picture constitutes the effective information in the image information from the mobile terminal.
  • The effective picture presents the operation screen of the mobile terminal 100B, and the widths and heights of the black borders are determined according to the screen ratios of the mobile terminal 100B and the display 275.
  • When the display device 200 displays the projected picture, the direction corresponding to the shorter side is used as the reference, such as the height direction in the landscape state. Therefore, the height direction of the projected video stream presented by the display device 200 is generally unchanged; that is, regardless of whether the mobile terminal 100B is horizontal or vertical, the height of the projected picture received by the display device 200 is 1080.
  • Even if the display 275 is rotated to the portrait state, it cannot present the display state shown in FIG. 6B; instead, the picture is displayed according to the height of the entire image information including the left and right black information, as shown in FIG. 6C. That is, in the portrait state, not only are the left and right sides of the effective picture filled with black information, but the top and bottom of the projected picture also show black areas because there is no effective picture to fill them, which greatly affects the user's viewing experience.
  • For example, the display resolution corresponding to the screen projection video stream received by the display device 200 is 1920×1080; that is, in the height direction, the projected picture is based on the terminal screen and requires 1080 pixels for display.
  • In the width direction, the projected picture is based on the terminal display picture plus the width of the black borders on both sides, and requires 1920 pixels for display.
  • If displayed in this way, the height of the effective area in the projected picture is 1080 and the width is scaled in proportion to the height, which greatly wastes the display area on the display 275.
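  • To make the scale of this waste concrete, the sketch below assumes a 9:16 phone picture inside the 1920×1080 cast frame and a 2160×3840 portrait panel (the 9:16 ratio and the 4K panel size are taken from examples elsewhere in this text); the exact percentage is only illustrative.

```python
# Rough estimate of display-area waste when a 1920x1080 cast frame
# (portrait phone content letterboxed with left/right black borders)
# is shown as-is on a portrait 2160x3840 panel.
# Assumes a 9:16 phone screen; figures are illustrative only.

FRAME_W, FRAME_H = 1920, 1080          # resolution of the cast video frame
PANEL_W, PANEL_H = 2160, 3840          # 4K panel rotated to portrait

effective_w = round(FRAME_H * 9 / 16)  # width of the effective picture inside the frame
effective_h = FRAME_H

# Scale the whole frame to fit the panel width (height is then fixed too).
scale = PANEL_W / FRAME_W
shown_effective = (effective_w * scale) * (effective_h * scale)
panel_area = PANEL_W * PANEL_H

print(f"effective picture in frame: {effective_w}x{effective_h}")
print(f"share of panel actually used: {shown_effective / panel_area:.1%}")  # about 10%
```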
  • To this end, the rotating assembly 276 can be used to rotate the display 275, and the effective information can be enlarged by a certain proportion to adapt to the projection video stream sent by the mobile terminal 100B.
  • When the mobile terminal is in landscape mode, it presents the state shown in FIG. 4A, and when the screen is projected, the parsed frame data of the video stream sent to the display device has no left and right black borders.
  • the display device receives image information that does not contain the left and right black borders, it detects whether the display is in a horizontal screen state.
  • If the display is in the landscape state, the controller 250 receives the image information and can present it on the display directly.
  • If the display is in the portrait state, the controller 250 may send a rotation instruction to the rotating assembly 276 to control the rotating assembly 276 to rotate the display to the landscape state.
  • When the mobile terminal is in portrait mode, it presents the state shown in FIG. 4B.
  • the parsed frame data of the video stream sent to the display device has left and right black borders and valid images.
  • the display device receives the image information with left and right black borders, it detects whether the display is in a portrait state.
  • the controller 250 may send a rotation instruction to the rotation component 276 to control the rotation component 276 to rotate the display 275 to the portrait state.
  • In the portrait state, the width of the display 275 is smaller than its height, which is consistent with the display orientation of the terminal screen.
  • The frame data in the projected video stream is then enlarged, and the effective picture is displayed on the display.
  • After enlargement, the left and right black areas fall outside the display area and are not shown, thereby reducing the black borders on both sides of the effective picture, as shown in FIG. 6B.
  • Based on this, the present application provides a display device that can determine, according to whether the image information in the projection data stream contains left and right black information, whether it is necessary to rotate the display and adjust the picture, so as to make maximum use of the display area.
  • A display device provided by this application includes: a display;
  • the rotating component is configured to drive the display to rotate so that the display is in one of a horizontal screen state or a vertical screen state;
  • a user interface configured to be connected to a terminal; and
  • a controller which, referring to FIG. 7, is configured to execute the following:
  • receiving image information sent by the terminal, where, when the terminal is in portrait mode, the image information includes valid information and left and right black area information.
  • the user may first perform a screen projection display operation on the mobile terminal 100B to send the display screen of the mobile terminal 100B to the display device 200. For example, the user selects "Settings-Connection and Sharing-Screencasting" on the mobile phone, and selects a display device in the current network as the screencasting object in the screencasting device list to perform the screencasting operation.
  • the mobile terminal 100B After performing the screen projection operation, the mobile terminal 100B will send the displayed screen to the display device 200 through a screen projection protocol, such as the Miracast protocol or other screen projection and mirroring protocols. As new interactive pictures are continuously generated during the screen projection process, the mobile terminal 100B will send the pictures to the display device 200 frame by frame to form a projection video stream.
  • users can also perform screen projection operations through third-party applications.
  • a user opens a video application, and a screencast icon is set on the video playback interface of the video application. The user can click the icon to perform the screen projection operation.
  • The projected picture produced by a screen projection operation performed through a third-party application is based on the video resource being played. For example, when the video resource being played is a horizontal media asset such as a movie or TV series, the width of the effective picture in the projected picture is greater than its height; when the video resource being played is a vertical media asset such as a short video or comic, the width of the effective picture is smaller than its height.
  • the controller obtains the rotation angle callback information of the display, and determines the target rotation state of the display according to whether the left and right black borders are included in the image information obtained from the mobile terminal.
  • If the image information does not include left and right black information, the mobile terminal is currently in landscape mode; when the current rotation state of the display is detected to be the landscape state, the two match and there is no need to rotate the TV.
  • If the current rotation state of the display is the portrait state, the two do not match, and the display needs to be rotated to the landscape state.
  • If the image information contains left and right black areas, the mobile terminal is currently in portrait mode; when the current rotation state of the display is detected to be the landscape state, the two do not match, and the display needs to be rotated to the portrait state.
  • If the current rotation state of the display is the portrait state, the two match and there is no need to rotate the display.
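  • A minimal sketch of this matching logic is given below; the function and state names are hypothetical, since this application does not define a concrete programming interface.

```python
# Illustrative sketch of the landscape/portrait matching logic described above.
# All names are hypothetical; the patent does not define a concrete API.

LANDSCAPE, PORTRAIT = "landscape", "portrait"

def target_state(has_side_black_bars: bool) -> str:
    """Image info with left/right black bars implies the terminal is in portrait mode."""
    return PORTRAIT if has_side_black_bars else LANDSCAPE

def reconcile(display_state: str, has_side_black_bars: bool, rotate) -> str:
    """Rotate the display only when its current state does not match the terminal mode."""
    wanted = target_state(has_side_black_bars)
    if display_state != wanted:
        rotate(wanted)          # e.g. send a rotation instruction to the rotating assembly
    return wanted

# Example: display currently landscape, cast frame has black bars -> rotate to portrait.
print(reconcile(LANDSCAPE, True, lambda s: print("rotating to", s)))
```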
  • the mobile terminal 100B After the user performs a screen projection operation through the mobile terminal 100B, the mobile terminal 100B will send the screen projection image to the display device 200 through a mirroring protocol or a screen projection protocol.
  • the controller 250 may receive the image information sent by the terminal, and detect the current rotation state of the display 275. Among them, the detection of the rotation state of the display 275 can be completed by a built-in sensor in the display device 200.
  • sensor devices such as a gyroscope and a gravity acceleration sensor can be set on the display 275 of the display device 200, and the posture data of the display 275 relative to the direction of gravity can be determined by measuring the angular acceleration or the direction of gravity. Then, the detected posture data is compared with the posture data in the horizontal screen state and the vertical screen state, respectively, to determine the current rotation state of the display 275.
  • a grating angle sensor, a magnetic field angle sensor, or a sliding resistance angle sensor, etc. can be arranged on the rotating component 276, and the angle rotated by the rotating component 276 can be measured and compared with the angles in the horizontal screen state and the vertical screen state to determine The current rotation state of the display 275.
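  • The sketch below illustrates how a measured rotation angle could be compared with landscape and portrait reference angles, as described above; the reference angles and the tolerance are assumptions for illustration.

```python
# Sketch of classifying the display's rotation state from a measured angle,
# as described above for grating / magnetic / sliding-resistance angle sensors.
# The reference angles and tolerance are illustrative assumptions.

LANDSCAPE_ANGLE = 0.0     # degrees reported when the display is in the landscape state
PORTRAIT_ANGLE = 90.0     # degrees reported when the display is in the portrait state
TOLERANCE = 5.0           # allowed measurement error

def rotation_state(measured_angle: float) -> str:
    """Compare the measured angle with the landscape/portrait reference angles."""
    if abs(measured_angle - LANDSCAPE_ANGLE) <= TOLERANCE:
        return "landscape"
    if abs(measured_angle - PORTRAIT_ANGLE) <= TOLERANCE:
        return "portrait"
    return "rotating"     # mid-rotation: neither reference angle matches

print(rotation_state(1.8))   # -> landscape
print(rotation_state(88.5))  # -> portrait
```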
  • the controller is further configured to:
  • After the display device 200 receives the projected video stream, its controller 250 can analyze the received projected video stream frame by frame. For example, the screen aspect ratio of the screencast video stream sent by the mobile terminal 100B is 1920:1080. After receiving the screencast video stream, the controller 250 can obtain frame images by analyzing the screencast video stream. The resolution of the extracted frame image is 1920×1080.
  • the frame images in the projected video stream can be sampled again to extract the effective resolution.
  • the frame of picture used for sampling is called the sampling picture.
  • the effective resolution is the resolution of the effective picture on the frame data extracted from the projection video stream. Specifically, the effective resolution can be obtained in the projection video stream at a preset time interval.
  • the pixel color of the sampled picture can be traversed.
  • Since the pixel color value of the black area is black and the pixel color values of the effective area are usually not all black, by traversing each pixel of the sampled picture it is possible to determine that the black rectangular regions at the sides are black borders and that the remaining region is the effective picture.
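  • The following sketch shows one way such a pixel traversal could locate the left and right black borders; it uses NumPy for brevity, and the near-black threshold is an assumption (as noted below, the "black" area may in fact use another fill color).

```python
# Sketch of locating the effective picture by traversing pixels of a sampled frame,
# as described above. Uses NumPy for brevity; the threshold is an assumption,
# since the patent notes the "black" area may in fact be another fill color.
import numpy as np

def effective_bounds(frame: np.ndarray, threshold: int = 16):
    """frame: H x W x 3 array. Returns (left, right) columns bounding the effective picture."""
    # A column belongs to a side black border if every pixel in it is (near-)black.
    column_is_black = (frame.max(axis=2) <= threshold).all(axis=0)
    width = frame.shape[1]
    left = 0
    while left < width and column_is_black[left]:
        left += 1
    right = width
    while right > left and column_is_black[right - 1]:
        right -= 1
    return left, right   # effective width = right - left

# Example: a 1080x1920 frame whose middle 960 columns are non-black.
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
frame[:, 480:1440, :] = 128
print(effective_bounds(frame))  # -> (480, 1440)
```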
  • the color filled in the black area is not limited to black.
  • the black area may be gray, blue, or other colors, and may also be gradient colors, specific patterns, and so on. For these situations, this application still refers to it as a black area or a black border for the convenience of subsequent description.
  • After the effective resolution is extracted, it can be compared with the frame image resolution to determine the current conditions of the effective picture in the projected video stream (such as its ratio and orientation), so as to decide whether to display the picture according to the effective picture.
  • If the effective resolution is equal to the frame image resolution, that is, the resolution of the frame image is 1920×1080 and the resolution of the effective area determined from the sampled picture is also 1920×1080, then the current projected picture has no black borders and can fill the display area. In other words, the display requirement of the projected picture can be met directly by displaying in the landscape state of the display 275.
  • The resolution of the display screen is usually expressed by the number of pixels occupied by the width and height of the screen, such as 1920×1080.
  • For comparison purposes, a resolution of 1920×1080 can be regarded as equal to 1080×1920. Therefore, in the actual comparison process, the effective resolution can be compared with the frame image resolution by extracting part of the values in the resolutions, or by converting the resolutions into other comparable values before performing the comparison, so as to obtain the comparison result.
  • the width or height of the overall picture can be extracted from the frame image resolution and compared with the height or width of the effective picture to determine its effective resolution.
  • It can be seen that the effective resolution detection method for the projected video stream provided by this application can, after receiving the projected video stream, extract the frame image resolution and the effective resolution from the projected video stream and compare them to determine the current effective resolution of the video stream.
  • the projected screen can be displayed according to the effective resolution, so as to adapt to the display direction of the projected screen, reduce the impact of black borders, and achieve a better user experience.
  • the method further includes:
  • Suppose the width of the picture from the mobile terminal is W0 and its height is Hph; the width of the display device in the portrait state is W0, and its height is Htv.
  • The range of the continuous black area can be detected from the left side of the image toward the effective picture, giving the range of the black area: the width a of the left black border and the height Hph of the left black border.
  • Likewise, the range of the continuous black area is detected from the right side of the image, giving the width a of the right black border and the height Hph of the right black border, forming the black border data.
  • The resolution of the effective picture is therefore (W0 - 2a) * Hph.
  • The resolution of the display device is W0 * Htv.
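  • The sketch below restates these quantities in code and shows one plausible way to derive the magnification of the valid information from the ratios of the display device and the effective picture (the determination shown in FIG. 8); choosing the largest factor that still fits the portrait panel is an assumption, since the text only refers to a preset factor.

```python
# Sketch of the quantities defined above, plus one plausible way to choose the
# magnification of the effective information. The patent only says "preset factor";
# taking the largest factor that still fits the portrait panel is an assumption.

def effective_resolution(w0: int, h_ph: int, a: int):
    """Frame is w0 x h_ph with symmetric side borders of width a."""
    return (w0 - 2 * a, h_ph)          # (W0 - 2a) x Hph

def magnification(eff_w: int, eff_h: int, panel_w: int, panel_h: int) -> float:
    """Largest uniform scale at which the effective picture fits the panel."""
    return min(panel_w / eff_w, panel_h / eff_h)

eff_w, eff_h = effective_resolution(1920, 1080, 480)   # -> 960 x 1080
scale = magnification(eff_w, eff_h, 2160, 3840)        # portrait 4K panel
print(eff_w, eff_h, scale)                             # 960 1080 2.25
```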
  • the screen content of the screen projected by the mobile terminal 100B is likely to affect the judgment of the effective screen area in the sampled screen.
  • For example, if the display picture of the mobile terminal 100B corresponding to the sampled picture happens to be black, and the black area range is still determined from consecutive black pixels at the edges of the projected picture, the black picture will affect the determination of the black area range, which in turn affects the final effective resolution extraction result. Therefore, in order to alleviate the effect of black picture content on the effective resolution, the step of extracting the effective resolution from the projected video stream further includes:
  • By presetting a sampling time interval, it is possible to sample multiple times in the projected video stream and extract the effective resolution of the picture in each sample. For example, a frame image is acquired at every time interval T, and the effective resolution values obtained by the above resolution algorithm are, respectively, Sx0, Sx1, ..., Sxn.
  • Since not all of the multiple sampled pictures are affected by black picture content, collecting multiple sampled pictures can reduce the influence of the picture content on the determination of the black area range, thereby improving the accuracy of the effective resolution determination.
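  • One plausible way to aggregate the sampled values Sx0 ... Sxn is sketched below; taking the maximum effective width is an assumption, based on the observation that dark picture content can only make the detected borders look wider, never narrower.

```python
# Sketch of aggregating effective widths measured on several sampled frames,
# as described above. Taking the widest observed effective picture is an assumption:
# a mostly-black frame can only make the detected borders look wider, never narrower,
# so the maximum over samples is a robust estimate.

def aggregate_effective_width(sampled_widths):
    """sampled_widths: effective widths Sx0..Sxn measured every T seconds."""
    return max(sampled_widths)

samples = [960, 640, 960, 960]   # one sample was disturbed by dark picture content
print(aggregate_effective_width(samples))  # -> 960
```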
  • In some embodiments, the method further includes: if the frame image resolution of the video stream is different from the effective resolution, controlling the display 275 of the display device 200 to rotate to the portrait state.
  • For example, the frame image resolution is 1920×1080 while the effective resolution is 960×1080. It can therefore be determined that the corresponding display picture on the mobile terminal 100B is a 960×1080 vertical picture.
  • Such a vertical picture is more suitable for display in the portrait state. Therefore, after determining the effective resolution, the controller 250 can send a control instruction to the rotating assembly 276 so that the rotating assembly 276 drives the display 275 to rotate counterclockwise (or clockwise) to the portrait state.
  • In this way, the projected picture can be displayed at an aspect ratio of 960:1080.
  • For a display device such as a smart TV, the display resolution is usually 3840×2160 in the landscape state (corresponding to 2160×3840 in the portrait state). Therefore, in order to display the projected picture with a resolution of 960×1080, the projected picture needs to be zoomed so that the display 275 can display it completely.
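  • The sketch below illustrates this zoom step for the 960×1080 effective picture on a 2160×3840 portrait panel; centering the picture and fitting it entirely within the panel are assumptions, since the text only states that the picture is zoomed so the display can show it completely.

```python
# Sketch of fitting the 960x1080 projected picture onto a 2160x3840 portrait panel,
# i.e. the zoom mentioned above. Centering and the fit-inside rule are assumptions;
# the patent only states that the picture is zoomed so the display can show it fully.

def fit_rect(src_w: int, src_h: int, dst_w: int, dst_h: int):
    """Return (x, y, w, h) of the zoomed picture centered on the destination panel."""
    scale = min(dst_w / src_w, dst_h / src_h)
    w, h = round(src_w * scale), round(src_h * scale)
    return (dst_w - w) // 2, (dst_h - h) // 2, w, h

print(fit_rect(960, 1080, 2160, 3840))   # -> (0, 705, 2160, 2430)
```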
  • It can be seen that the display device provided by the present application can determine whether the mobile terminal is in portrait mode according to the image information and automatically adjust the rotation state of the display, thereby using a larger display space to present the projected picture and alleviating the problem that a traditional smart TV cannot display the projected picture properly.
  • Correspondingly, a screen projection method applied to a display device includes: receiving image information sent by a terminal, where, when the terminal is in portrait mode, the image information includes valid information and left and right black information; if the current rotation state of the display does not match the portrait mode of the terminal, rotating the display to the portrait state; and controlling the display to present a projected picture based on the valid information, where the projected picture is obtained by enlarging the valid information by a preset factor.
  • This application also provides a computer storage medium, where the computer storage medium may store a program, and when executed, the program may include some or all of the steps in each embodiment of the method provided in this application.
  • When the program is executed by the controller, the controller performs the steps that the controller is configured to perform in the present application.
  • the storage medium can be a magnetic disk, an optical disc, a read-only memory (ROM) or a random access memory (RAM), etc.
  • the technology in the embodiments of the present application can be implemented by means of software plus a necessary general hardware platform.
  • the technical solutions in the embodiments of the present application can be embodied in the form of software products, which can be stored in a storage medium, such as ROM/RAM. , Magnetic disks, optical disks, etc., including a number of instructions to enable a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the various embodiments or some parts of the embodiments of the present application.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

本申请提供一种显示设备及投屏方法,显示设备的控制器被配置为:当终端处于纵向模式时,接收终端发送包括有效信息和左、右黑色信息的图像信息,若显示器的当前旋转状态与终端的纵向模式不匹配时,将显示器旋转至竖屏状态;基于有效信息控制显示器呈现投屏画面,该投屏画面为有效信息放大预设倍数得到的。

Description

一种显示设备及投屏方法 技术领域
本申请涉及智能电视技术领域,尤其涉及一种显示设备及投屏方法。
背景技术
投屏是一种终端与显示设备的互动操作。一般利用无线局域网络传递视频流,以通过显示设备展示终端设备上的画面。以手机投屏为例,对于连接在同一个WiFi网络中的手机和智能电视,可以通过手机端执行投屏操作指令,以将手机端显示的画面以视频流的方式发送给智能电视,以利用智能电视的大屏幕,获得更好的用户体验。
然而,手机等终端的画面显示比例与显示设备的屏幕比例往往存在差异。例如,常规操作下,手机的屏幕显示宽高比为1080:1940;而智能电视的显示器宽高比为1940:1080,即手机端的画面是竖向的状态,而智能电视的画面是横向的状态。因此,在通过智能电视投屏显示终端画面时,容易因终端画面宽高比与显示器宽高比不匹配,而无法正常显示投屏画面。
为了将手机上的画面完全显示,需要以手机画面的高度为准,对画面进行缩放。但在对投屏画面进行缩放时,画面比例的差异将导致智能电视显示的画面两侧拥有较大的黑色区域,不仅降低用户的观影体验,而且浪费屏幕上的显示空间。
发明内容
本申请提供了一种显示设备及投屏方法,以解决传统显示设备显示投屏画面时,浪费屏幕上的显示空间的问题。
一方面,本申请提供一种显示设备,包括:显示器;旋转组件,被配置为带动所述显示器旋转,以使所述显示器处于横屏状态或竖屏状态中的一种旋转状态;用户接口,被配置为连接到终端;以及与上述显示器、旋转组件和用户接口通信的控制器。其中,控制器被配置为:
接收所述终端发送的图像信息,其中,当所述终端处于纵向模式时,所述图像信息包括有效信息和左、右黑色信息;
若所述显示器的当前旋转状态与所述终端的纵向模式不匹配时,将所述显示器旋转至竖屏状态;
基于所述有效信息控制所述显示器呈现投屏画面,其中,所述投屏画面为所述有效信息放大预设倍数得到的。
另一方面,本申请还提供一种投屏方法,应用于显示设备,包括:
接收所述终端发送的图像信息,其中,当所述终端处于纵向模式时,所述图像信息包括有效信息和左、右黑色信息,所述有效信息对应于所述终端的屏幕显示内容;
若所述显示器的当前旋转状态与所述终端的纵向模式不匹配时,将所述显示器旋转至竖屏状态;
基于所述有效信息控制所述显示器呈现投屏画面,其中,所述投屏画面为所述有效信息放大预设倍数得到的。
附图说明
为了更清楚地说明本申请或相关技术中的实施方式,下面将对实施例或相关技术描述中所需要使用的附图作一简单地介绍,显而易见地,下面描述中的附图是本申请的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动性的前提下,还可以根据这些附图获得其他的附图。
图1A为本申请一种显示设备的应用场景图;
图1B为本申请一种显示设备的后视图;
图2为本申请控制装置的硬件配置框图;
图3为本申请显示设备存储器中操作***的架构配置框图;
图4A为本申请移动终端横向模式示意图;
图4B为本申请移动终端纵向模式示意图;
图5A为本申请一实施例中移动终端处于横向模式下发送图像信息至处于横屏状态的显示设备的示意图;
图5B为本申请一实施例中移动终端处于横向模式下发送图像信息至处于竖屏状态的显示设备的示意图;
图6A为本申请一实施例中移动终端处于纵向模式下发送图像信息至处于横屏状态的显示设备的示意图;
图6B为本申请一实施例中移动终端处于纵向模式下发送图像信息至处于竖屏状态的显示设备的示意图;
图6C为根据投屏协议在显示设备为竖屏状态下呈现画面的效果示意图;
图7为本申请一示例性的显示设备的控制器执行的流程示意图;
图8为本申请根据显示设备和有效画面的高度和宽度的比例确定有效信息放大倍数的流程示意图;
图9为本申请黑色信息和图像信息示意图;
图10为本申请一种显示设备的结构示意图。
具体实施方式
下面将详细地对实施例进行说明,其示例表示在附图中。下面的描述涉及附图时,除非另有表示,不同附图中的相同数字表示相同或相似的要素。以下实施例中描述的实施方式并不代表与本申请相一致的所有实施方式。仅是与权利要求书中所详述的、本申请的一些方面相一致的***和方法的示例。
本申请提供的技术方案中,所述显示设备可以为智能电视等带有较大屏幕、为用户呈现视频和音频信号的电器设备。显示设备可以拥有独立的操作***,并支持功能扩展。可以根据用户需要在显示设备中安装各种应用程序,例如,传统视频应用、短视频等社交应用以及漫画、看书等阅读应用。这些应用可利用显示设备的屏幕展示应用画面,为用户提供更丰富的媒体资源。同时,显示设备还可以与不同的终端进行数据交互和资源共享。例如,智能电视可以通过局域网、蓝牙等无线通信方式与手机连接,从而播放手机中的资源或者直接进行投屏显示手机上的画面。
为方便用户在显示器不同的横竖屏状态展示目标媒资详情页,便于提升显示设备在不同观看状态时的用户观看体验,本申请实施例提供了一种显示设备及计算机存储介质,显示设备,如旋转电视。需要说明的是,本实施例提供的方法不仅适用于旋转电视,还适用于其它显示设备,如计算机、平板电脑等。
本申请各实施例中使用的术语“模块”,可以是指任何已知或后来开发的硬件、软件、固件、人工智能、模糊逻辑或硬件或/和软件代码的组合,能够执行与该 元件相关的功能。
本申请各实施例中使用的术语“遥控器”,是指电子设备(如本申请中公开的显示设备)的一个组件,该组件通常可在较短的距离范围内无线控制电子设备。该组件一般可以使用红外线和/或射频(RF)信号和/或蓝牙与电子设备连接,也可以包括WiFi、无线USB、蓝牙、动作传感器等功能模块。例如:手持式触摸遥控器,是以触摸屏中用户界面取代一般遥控装置中的大部分物理内置硬键。
本申请各实施例中使用的术语“手势”,是指用户通过一种手型的变化或手部运动等动作,用于表达预期想法、动作、目的/或结果的用户行为。
本申请各实施例中使用的术语“硬件***”,可以是指由集成电路(Integrated Circuit,IC)、印刷电路板(Printed circuit board,PCB)等机械、光、电、磁器件构成的具有计算、控制、存储、输入和输出功能的实体部件。在本申请各个实施例中,硬件***通常也会被称为主板(motherboard)或主芯片或控制器。
参见图1A,为本申请一些实施例提供的一种显示设备的应用场景图。如图1所示,控制装置100和显示设备200之间可以有线或无线方式进行通信。
其中,控制装置100被配置为控制显示设备200,其可接收用户输入的操作指令,且将操作指令转换为显示设备200可识别和响应的指令,起着用户与显示设备200之间交互的中介作用。如:用户通过操作控制装置100上频道加减键,显示设备200响应频道加减的操作。
控制装置100可以是遥控器100A,包括红外协议通信或蓝牙协议通信,及其他短距离通信方式等,通过无线或其他有线方式来控制显示设备200。用户可以通过遥控器上按键、语音输入、控制面板输入等输入用户指令,来控制显示设备200。如:用户可以通过遥控器上音量加减键、频道控制键、上/下/左/ 右的移动按键、语音输入按键、菜单键、开关机按键等输入相应控制指令,来实现控制显示设备200的功能。
控制装置100也可以是智能设备,如移动终端100B、平板电脑、计算机、笔记本电脑等。例如,使用在智能设备上运行的应用程序控制显示设备200。该应用程序通过配置可以在与智能设备关联的屏幕上,通过直观的用户界面(UI)为用户提供各种控制。
示例性的,移动终端100B可与显示设备200安装软件应用,通过网络通信协议实现连接通信,实现一对一控制操作的和数据通信的目的。如:可以使移动终端100B与显示设备200建立控制指令协议,通过操作移动终端100B上提供的用户界面的各种功能键或虚拟控件,来实现如遥控器100A布置的实体按键的功能。也可以将移动终端100B上显示的音视频内容传输到显示设备200上,实现同步显示功能。
显示设备200可提供广播接收功能和计算机支持功能的网络电视功能。显示设备可以实施为,数字电视、网络电视、互联网协议电视(IPTV)等。
显示设备200,可以是液晶显示器、有机发光显示器、投影设备。具体显示设备类型、尺寸大小和分辨率等不作限定。
显示设备200还与服务器300通过多种通信方式进行数据通信。这里可允许显示设备200通过局域网(LAN)、无线局域网(WLAN)和其他网络进行通信连接。服务器300可以向显示设备200提供各种内容和互动。示例的,显示设备200可以发送和接收信息,例如:接收电子节目指南(EPG)数据、接收软件程序更新、或访问远程储存的数字媒体库。服务器300可以一组,也可以多组,可以一类或多类服务器。通过服务器300提供视频点播和广告服务等其他网络服 务内容。
在一些实施例中,如图1B所示,显示设备200包括旋转组件276,控制器250,显示器275,从背板上空隙处伸出的端子接口278以及和背板连接的旋转组件276,旋转组件276可以使显示器275进行旋转。从显示设备正面观看的角度,旋转组件276可以将显示屏旋转到竖屏状态,即屏幕竖向的边长大于横向的边长的状态,也可以将屏幕旋转至横屏状态,即屏幕横向的边长大于竖向的边长的状态。
图2中示例性示出了显示设备200的硬件配置框图。如图2所示,显示设备200中可以包括调谐解调器210、通信器220、检测器230、外部装置接口240、控制器250、存储器260、用户接口265、视频处理器270、显示器275、旋转组件276、音频处理器280、音频输出接口285、供电电源290。
其中,旋转组件276可以包括驱动电机、旋转轴等部件。其中,驱动电机可以连接控制器250,受控制器250的控制输出旋转角度;旋转轴的一端连接驱动电机的动力输出轴,另一端连接显示器275,以使显示器275可以通过旋转组件276固定安装在墙壁或支架上。
旋转组件276还可以包括其他部件,如传动部件、检测部件等。其中,传动部件可以通过特定传动比,调整旋转组件276输出的转速和力矩,可以为齿轮传动方式;检测部件可以由设置在旋转轴上的传感器组成,例如角度传感器、姿态传感器等。这些传感器可以对旋转组件276旋转的角度等参数进行检测,并将检测的参数发送给控制器250,以使控制器250能够根据检测的参数判断或调整显示设备200的状态。实际应用中,旋转组件276可以包括但不限于上述部件中的一种或多种。
调谐解调器210,通过有线或无线方式接收广播电视信号,可以进行放大、混频和谐振等调制解调处理,用于从多个无线或有线广播电视信号中解调出用户所选择的电视频道的频率中所携带的音视频信号,以及附加信息(例如EPG数据)。
调谐解调器210,可根据用户选择,以及由控制器250控制,响应用户选择的电视频道的频率以及该频率所携带的电视信号。
调谐解调器210,根据电视信号的广播制式不同,可以接收信号的途径有很多种,诸如:地面广播、有线广播、卫星广播或互联网广播等;以及根据调制类型不同,可以数字调制方式或模拟调制方式;以及根据接收电视信号的种类不同,可以解调模拟信号和数字信号。
在其他一些示例性实施例中,调谐解调器210也可在外部设备中,如外部机顶盒等。这样,机顶盒通过调制解调后输出电视信号,经过外部装置接口240输入至显示设备200中。
通信器220,是用于根据各种通信协议类型与外部设备或外部服务器进行通信的组件。例如显示设备200可将内容数据发送至经由通信器220连接的外部设备,或者,从经由通信器220连接的外部设备浏览和下载内容数据。通信器220可以包括WIFI模块221、蓝牙通信协议模块222、有线以太网通信协议模块223等网络通信协议模块或近场通信协议模块,从而通信器220可根据控制器250的控制接收控制装置100的控制信号,并将控制信号实现为WIFI信号、蓝牙信号、射频信号等。
检测器230,是显示设备200用于采集外部环境或与外部交互的信号的组件。检测器230可以包括声音采集器231,如麦克风,可以用于接收用户的声 音,如用户控制显示设备200的控制指令的语音信号;或者,可以采集用于识别环境场景类型的环境声音,实现显示设备200可以自适应环境噪声。
在其他一些示例性实施例中,检测器230,还可以包括图像采集器232,如相机、摄像头等,可以用于采集外部环境场景,以自适应变化显示设备200的显示参数;以及用于采集用户的属性或与用户交互手势,以实现显示设备与用户之间互动的功能。
在其他一些示例性实施例中,检测器230,还可以包括光接收器,用于采集环境光线强度,以自适应显示设备200的显示参数变化等。
在其他一些示例性实施例中,检测器230,还可以包括温度传感器,如通过感测环境温度,显示设备200可自适应调整图像的显示色温。示例性的,当温度偏高的环境时,可调整显示设备200显示图像色温偏冷色调;当温度偏低的环境时,可以调整显示设备200显示图像色温偏暖色调。
外部装置接口240,是提供控制器250控制显示设备200与外部设备间数据传输的组件。外部装置接口240可按照有线/无线方式与诸如机顶盒、游戏装置、笔记本电脑、等外部设备连接,可接收外部设备的诸如视频信号(例如运动图像)、音频信号(例如音乐)、附加信息(例如EPG)等数据。
其中,外部装置接口240可以包括:高清多媒体接口(HDMI)端子241、复合视频消隐同步(CVBS)端子242、模拟或数字分量端子243、通用串行总线(USB)端子244、组件(Component)端子(图中未示出)、红绿蓝(RGB)端子(图中未示出)等任一个或多个。
控制器250,通过运行存储在存储器260上的各种软件控制程序(如操作***和各种应用程序),来控制显示设备200的工作和响应用户的操作。
在一些示例性的实施方式中,控制器250包括随机存取存储器(RAM)、只读存储器(ROM)、图形处理器、CPU处理器、通信接口、以及通信总线。其中,RAM、ROM以及图形处理器、CPU处理器通信接口通过通信总线相连接。
ROM,用于存储各种***启动指令。如在接收到开机信号时,显示设备200电源开始启动,CPU处理器254运行ROM252中的***启动指令,将存储在存储器260的操作***拷贝至RAM251中,以开始运行启动操作***。当操作***启动完成后,CPU处理器254再将存储器260中各种应用程序拷贝至RAM251中,然后,开始运行启动各种应用程序。
图形处理器253,用于产生各种图形对象,如图标、操作菜单、以及用户输入指令显示图形等。图形处理器253可以包括运算器,用于通过接收用户输入各种交互指令进行运算,进而根据显示属性显示各种对象;以及包括渲染器,用于产生基于运算器得到的各种对象,将进行渲染的结果显示在显示器275上。
CPU处理器254,用于执行存储在存储器260中的操作***和应用程序指令。以及根据接收的用户输入指令,来执行各种应用程序、数据和内容的处理,以便最终显示和播放各种音视频内容。
在一些示例性实施例中,CPU处理器254,可以包括多个处理器。多个处理器可包括一个主处理器以及多个或一个子处理器。主处理器,用于在显示设备预加载模式中执行显示设备200的一些初始化操作,和/或,在正常模式下显示画面的操作。多个或一个子处理器,用于执行在显示设备待机模式等状态下的一种操作。
通信接口255,可包括第一接口到第n接口。这些接口可以是经由网络被连接到外部设备的网络接口。
控制器250可以控制显示设备200的整体操作。例如:响应于接收到用于选择在显示器275上显示的GUI对象的用户输入命令,控制器250便可以执行与由用户输入命令选择的对象有关的操作。
其中,该对象可以是可选对象中的任何一个,例如超链接或图标。该与所选择的对象有关的操作,例如显示连接到超链接页面、文档、图像等操作,或者执行与对象相对应的程序的操作。该用于选择GUI对象的用户输入命令,可以是通过连接到显示设备200的各种输入装置(例如,鼠标、键盘、触摸板等)输入命令或者与由用户说出语音相对应的语音命令。
存储器260,用于存储驱动和控制显示设备200运行的各种类型的数据、软件程序或应用程序。存储器260可以包括易失性和/或非易失性存储器。而术语“存储器”包括存储器260、控制器250的RAM和ROM、或显示设备200中的存储卡。
在一些实施例中,存储器260具体用于存储驱动显示设备200中控制器250的运行程序;存储显示设备200内置的和用户从外部设备下载的各种应用程序;存储用于配置由显示器275提供的各种GUI、与GUI相关的各种对象及用于选择GUI对象的选择器的视觉效果图像等数据。
在一些实施例中,存储器260具体用于存储调谐解调器210、通信器220、检测器230、外部装置接口240、视频处理器270、显示器275、音频处理器280等的驱动程序和相关数据,例如从外部装置接口接收的外部数据(例如音视频数据)或用户接口接收的用户数据(例如按键信息、语音信息、触摸信息等)。
在一些实施例中,存储器260具体存储用于表示操作***(OS)的软件和/ 或程序,这些软件和/或程序可包括,例如:内核、中间件、应用编程接口(API)和/或应用程序。示例性的,内核可控制或管理***资源,以及其它程序所实施的功能(如所述中间件、API或应用程序);同时,内核可以提供接口,以允许中间件、API或应用程序访问控制器,以实现控制或管理***资源。
图3中示例性示出了显示设备200存储器中操作***的架构配置框图。该操作***架构从上到下依次是应用层、中间件层和内核层。
应用层,***内置的应用程序以及非***级的应用程序都是属于应用层。负责与用户进行直接交互。应用层可包括多个应用程序,如设置应用程序、电子帖应用程序、媒体中心应用程序等。这些应用程序可被实现为Web应用,其基于WebKit引擎来执行,具体可基于HTML5、层叠样式表(CSS)和JavaScript来开发并执行。
这里,HTML,全称为超文本标记语言(Hyper Text Markup Language),是一种用于创建网页的标准标记语言,通过标记标签来描述网页,HTML标签用以说明文字、图形、动画、声音、表格、链接等,浏览器会读取HTML文档,解释文档内标签的内容,并以网页的形式显示出来。
CSS,全称为层叠样式表(Cascading Style Sheets),是一种用来表现HTML文件样式的计算机语言,可以用来定义字体、颜色、位置等样式结构。CSS样式可以直接存储于HTML网页或者单独的样式文件中,实现对网页中样式的控制。
JavaScript,是一种应用于Web网页编程的语言,可以***HTML页面并由浏览器解释执行。其中Web应用的交互逻辑都是通过JavaScript实现。JavaScript可以通过浏览器,封装JavaScript扩展接口,实现与内核层的通信,
中间件层,可以提供一些标准化的接口,以支持各种环境和***的操作。例如,中间件层可以实现为与数据广播相关的中间件的多媒体和超媒体信息编码专家组(MHEG),还可以实现为与外部设备通信相关的中间件的DLNA中间件,还可以实现为提供显示设备内各应用程序所运行的浏览器环境的中间件等。
内核层,提供核心***服务,例如:文件管理、内存管理、进程管理、网络管理、***安全权限管理等服务。内核层可以被实现为基于各种操作***的内核,例如,基于Linux操作***的内核。
内核层也同时提供***软件和硬件之间的通信,为各种硬件提供设备驱动服务,例如:为显示器提供显示驱动程序、为摄像头提供摄像头驱动程序、为遥控器提供按键驱动程序、为WIFI模块提供WiFi驱动程序、为音频输出接口提供音频驱动程序、为电源管理(PM)模块提供电源管理驱动等。
图2中,用户接口265,接收各种用户交互。具体的,用于将用户的输入信号发送给控制器250,或者,将从控制器250的输出信号传送给用户。示例性的,遥控器100A可将用户输入的诸如电源开关信号、频道选择信号、音量调节信号等输入信号发送至用户接口265,再由用户接口265转送至控制器250;或者,遥控器100A可接收经控制器250处理从用户接口265输出的音频、视频或数据等输出信号,并且显示接收的输出信号或将接收的输出信号输出为音频或振动形式。
在一些实施例中,用户可在显示器275上显示的图形用户界面(GUI)输入用户命令,则用户接口265通过GUI接收用户输入命令。确切的说,用户接口265可接收用于控制选择器在GUI中的位置以选择不同的对象或项目的用户输入命令。其中,“用户界面”,是应用程序或操作***与用户之间进行交互和信 息交换的介质接口,它实现信息的内部形式与用户可以接受形式之间的转换。用户界面常用的表现形式是图形用户界面(graphic user interface,GUI),是指采用图形方式显示的与计算机操作相关的用户界面。它可以是在电子设备的显示屏中显示的一个图标、窗口、控件等界面元素,其中控件可以包括图标、控件、菜单、选项卡、文本框、对话框、状态栏、频道栏、Widget等可视的界面元素。
或者,用户可通过输入特定的声音或手势进行输入用户命令,则用户接口265通过传感器识别出声音或手势,来接收用户输入命令。
视频处理器270,用于接收外部的视频信号,根据输入信号的标准编解码协议,进行解压缩、解码、缩放、降噪、帧率转换、分辨率转换、图像合成等视频数据处理,可得到直接在显示器275上显示或播放的视频信号。
示例的,视频处理器270,包括解复用模块、视频解码模块、图像合成模块、帧率转换模块、显示格式化模块等。
其中,解复用模块,用于对输入音视频数据流进行解复用处理,如输入MPEG-2流(基于数字存储媒体运动图像和语音的压缩标准),则解复用模块将其进行解复用成视频信号和音频信号等。
视频解码模块,用于对解复用后的视频信号进行处理,包括解码和缩放处理等。
图像合成模块,如图像合成器,其用于将图形生成器根据用户输入或自身生成的GUI信号,与缩放处理后视频图像进行叠加混合处理,以生成可供显示的图像信号。
帧率转换模块,用于对输入视频的帧率进行转换,如将输入的60Hz视频的 帧率转换为120Hz或240Hz的帧率,通常的格式采用如插帧方式实现。
显示格式化模块,用于将帧率转换模块输出的信号,改变为符合诸如显示器显示格式的信号,如将帧率转换模块输出的信号进行格式转换以输出RGB数据信号。
显示器275,用于接收源自视频处理器270输入的图像信号,进行显示视频内容、图像以及菜单操控界面。显示视频内容,可以来自调谐解调器210接收的广播信号中的视频内容,也可以来自通信器220或外部装置接口240输入的视频内容。显示器275,同时显示显示设备200中产生且用于控制显示设备200的用户操控界面UI。
以及,显示器275可以包括用于呈现画面的显示屏组件以及驱动图像显示的驱动组件。或者,倘若显示器275为一种投影显示器,还可以包括一种投影装置和投影屏幕。
旋转组件276,控制器可以发出控制信号使旋转组件276旋转显示器275。
音频处理器280,用于接收外部的音频信号,根据输入信号的标准编解码协议,进行解压缩和解码,以及降噪、数模转换、和放大处理等音频数据处理,得到可以在扬声器286中播放的音频信号。
示例性的,音频处理器280可以支持各种音频格式。例如MPEG-2、MPEG-4、高级音频编码(AAC)、高效AAC(HE-AAC)等格式。
音频输出接口285,用于在控制器250的控制下接收音频处理器280输出的音频信号,音频输出接口285可包括扬声器286,或输出至外接设备的发生装置的外接音响输出端子287,如耳机输出端子。
在其他一些示例性实施例中,视频处理器270可以包括一个或多个芯片组 成。音频处理器280,也可以包括一个或多个芯片组成。
以及,在其他一些示例性实施例中,视频处理器270和音频处理器280,可以为单独的芯片,也可以与控制器250一起集成在一个或多个芯片中。
供电电源290,用于在控制器250的控制下,将外部电源输入的电力为显示设备200提供电源供电支持。供电电源290可以是安装在显示设备200内部的内置电源电路,也可以是安装在显示设备200外部的电源。
在投屏过程中,移动终端100B可以通过无线连接方式,如通过Miracast协议,向显示设备200发送显示画面数据,形成投屏视频流。当显示设备200在接收到投屏视频流后,可以通过控制器250对投屏视频流进行解码,解析出帧画面经处理后形成投屏画面发送给显示器275进行显示。
其中,移动终端100B可以为具有显示和人机交互功能的智能终端设备,例如手机、平板电脑等。由于移动终端100B具有不同的操作模式,因此所形成的投屏画面也具有不同的布局模式。例如,当用户在横向握持手机进行操作时,手机上所呈现的画面为横向布局,即画面的宽度大于画面的高度,手机处于横向模式,如图4A所示。当用户在竖向握持手机进行操作时,手机上所呈现的画面为竖向布局,即画面的宽度小于画面的高度,手机处于纵向模式,如图4B所示。
针对不同类型的移动终端100B,其显示屏比例也存在多种不同形式。例如,手机的屏幕宽高比通常为9:16、10:16等;平板电脑的屏幕宽高比为3:4等。还可能有部分智能终端设备的屏幕宽高比为1:1,例如智能手表等。对于屏幕宽高比为1:1的智能终端设备,其横向状态和竖向状态下所呈现的画面布局一般是相同的,仅仅在智能终端设备的显示屏上显示时,方向不同。因此,对于显示屏 幕宽高比1:1的移动终端100B,其投屏时形成的投屏画面并不区分横竖向状态。
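As a small illustration of the aspect-ratio cases discussed above, the helper below classifies a screen or picture as landscape, portrait, or square (the 1:1 case, for which the projected picture does not distinguish between the two orientations). The function name is an illustrative assumption.

```python
def classify_aspect(width: int, height: int) -> str:
    """Classify a screen or picture by its aspect ratio."""
    if width == height:
        return "square"      # e.g. 1:1 smart-watch screens: orientation is moot
    return "landscape" if width > height else "portrait"


if __name__ == "__main__":
    print(classify_aspect(1080, 1940))  # phone held upright -> portrait
    print(classify_aspect(3840, 2160))  # TV panel -> landscape
    print(classify_aspect(320, 320))    # 1:1 device -> square
```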
为了使显示设备200能够根据移动终端100B在投屏时的横竖模式实现自动旋转屏幕,从而达到更好的用户体验,显示设备200和移动终端100B之间可以配置有投屏协议,例如基于Miracast标准的投屏协议,通过投屏协议实现画面视频流的传输,使移动终端100B上的画面能够投屏给显示设备200。但目前Miracast投屏协议的规定中,无论移动终端100B处于横向模式还是竖向模式,投屏给显示设备200的视频流一直为1920×1080的横屏流,导致显示设备200无法通过视频流实现自动旋转电视屏幕。
例如,移动终端在与显示设备进行投屏交互时,当移动终端处于横向模式时,无论与该移动终端交互的显示设备处于横屏状态还是竖屏状态,其向显示设备发送的图像信息均不包含左和右黑色信息,如图5A和5B。
而当移动终端处于纵向模式时,无论与该移动终端交互的显示设备处于横屏状态还是竖屏状态,其向显示设备发送的图像信息均包含左和右黑色信息,如图6A和6C。
在图6A所对应的投屏画面中,画面两侧的黑色区域所示的图像信息为黑色信息,或者称为黑边,画面中部的显示画面区域称为有效画面(即为移动终端相对应的屏幕显示画面),其该有效画面即为移动终端的图像信息中的有效信息。在有效画面对应的区域内,显示的是移动终端100B上的操作画面,黑边为根据移动终端100B显示器275的屏幕比例确定具有不同的宽度和高度。
即,显示设备200显示投屏画面时,以较短边对应的方向为基准,如在横屏状态下的高度方向。因此,显示设备200呈现的投屏视频流高度方向一般是不变的。即无论移动终端100B的放置方向为横向还是竖向,显示设备200接 收到的投屏画面高度都是1080P的。即使显示器275在旋转到竖屏状态后,并不能呈现如图6B所示的显示状态,而是按照整个包含左、右黑色信息的图像信息的高度进行显示,如图6C所示。即在竖屏状态下,不仅有效画面的左右两侧填充有黑色信息,投屏画面的顶部和底部也因没有有效画面做填充而显示黑色区域,这将大大影响用户的观看体验。
在实际投屏过程中,显示设备200接收到的投屏视频流对应画面显示分辨率为1920×1080,即在高度方向,投屏画面以终端上的画面为基准,需要1080个像素点进行显示。而在宽度方向,投屏画面以终端显示画面加两侧黑边宽度为基准,需要1920个像素点进行显示。相应的,投屏画面中有效区域的高度为1080,宽度按照高度的比例进行缩放显示,这将大大浪费显示器275上的显示区域。
因此,为了提高显示设备上显示区域的利用率,在实际应用中,可以通过旋转组件276旋转显示器275的方向,并对有效信息进行一定比例的放大,以适应移动终端100B发送的投屏视频流。
在一些实施方式中,当移动终端处于横向模式时,呈现图4A所示的状态,进行投屏时,发送到显示设备的视频流被解析后的帧数据没有左、右黑边。当显示设备接收到不包含左、右黑边的图像信息时,检测显示器是否处于横屏状态。当显示器处于横屏状态时,控制器250可直接将接收到的图像信息呈现在显示器上。当显示器处于竖屏状态时,控制器250可以向旋转组件276发送旋转指令,控制旋转组件276旋转,以使显示器旋转至横屏状态。
当移动终端处于纵向模式时,呈现图4B所示的状态,进行投屏时,发送到显示设备的视频流被解析后的帧数据带有左、右黑边和有效画面。当显示设备 接收到带有左、右黑边的图像信息时,检测显示器是否处于竖屏状态。当处于横屏状态时,控制器250可以向旋转组件276发送旋转指令,控制旋转组件276,以使显示器275旋转至竖屏状态。在竖屏状态下,显示器275的宽度小于高度,与终端画面的显示比例相符,此时,对投屏视频流中的帧数据进行放大,将有效画面显示到显示器上,左右黑色区域因放大而无法显示在显示器上,从而减小有效画面两侧的黑边面积,如图6B所示。
为了改善用户的观影体验,减少画面两侧的黑色信息,本申请提供一种显示设备,可以根据投屏数据流的图像信息中是否包含左、右黑色信息,判断是否需要旋转显示器并对画面进行调整,以最大化利用显示区域。
本申请提供的一种显示设备包括:
显示器;
旋转组件,被配置为带动所述显示器旋转,以使所述显示器处于横屏状态或竖屏状态中的一种旋转状态;
用户接口,被配置为连接到终端;
以及控制器,参见图7,被配置为执行:
S1,接收所述终端发送的图像信息,其中,当所述终端处于纵向模式时,所述图像信息包括有效信息和左、右黑色信息。
实际应用中,用户可以先在移动终端100B上执行投屏显示操作,以将移动终端100B的显示画面发送给显示设备200。例如,用户通过在手机上先后选择“设置-连接与共享-投屏”,并且在投屏操作的设备列表中选中当前网络中的一个显示设备作为投屏对象,执行投屏操作。
在执行投屏操作后,移动终端100B会通过投屏协议,如采用Miracast协 议或其他投屏以及镜像协议,将所显示的画面发送给显示设备200。随着投屏过程中不断产生新的交互画面,移动终端100B会逐帧将画面发送给显示设备200,形成投屏视频流。
需要说明的是,用户还可以根据通过第三方应用程序执行投屏操作。例如,用户打开视频应用,在视频应用的视频播放界面上,设有投屏图标。用户可以点击该图标执行投屏操作。通常,通过第三方应用程序执行的投屏操作的投屏画面以所播放的视频资源为准。例如,在播放的视频资源为电影、电视剧等横向媒资时,则投屏画面中有效画面的宽度大于高度;在播放的视频资源为短视频、漫画等竖向媒资时,则投屏画面中有效画面的宽度小于高度。
S2,若所述显示器的当前旋转状态与所述终端的纵向模式不匹配时,将所述显示器旋转至竖屏状态;
实际应用中,可以通过分别分析图像信息中是否包含左、右黑色信息,判断显示器的当前旋转状态与所述终端的显示状态是否匹配。
在一些示例性的实施方式中,控制器获取显示器的旋转角度回调信息,根据从移动终端获取的图像信息中是否包含左右黑边确定显示器的目标旋转状态。
当图像信息中不包含左、右黑色信息时,说明移动终端目前为横向模式。检测到显示器的当前旋转状态为横屏状态时,二者匹配,不需要旋转显示器;检测到显示器的当前旋转状态为竖屏状态时,二者不匹配,需要将显示器旋转至横屏状态。
当图像信息中包含左、右黑色信息时,说明移动终端目前为纵向模式。检测到显示器的当前旋转状态为横屏状态时,二者不匹配,需要将显示器旋转至竖屏状态;检测到显示器的当前旋转状态为竖屏状态时,二者匹配,不需要旋转显示器。
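To make the matching rule above concrete, the sketch below maps the presence or absence of left/right black information in the received image to a target rotation state and rotates the panel only when it differs from the current state. The enum values and the `rotate_panel` callback are illustrative assumptions, not details taken from the patent.

```python
from enum import Enum


class Rotation(Enum):
    LANDSCAPE = "landscape"
    PORTRAIT = "portrait"


def target_rotation(has_side_black_bars: bool) -> Rotation:
    """Left/right black information means the terminal is in portrait mode."""
    return Rotation.PORTRAIT if has_side_black_bars else Rotation.LANDSCAPE


def reconcile(current: Rotation, has_side_black_bars: bool, rotate_panel) -> Rotation:
    """Rotate the panel only when the current and target states differ."""
    target = target_rotation(has_side_black_bars)
    if current != target:
        rotate_panel(target)  # e.g. a command sent to the rotating component 276
    return target


if __name__ == "__main__":
    # Portrait source arriving while the panel is in the landscape state.
    reconcile(Rotation.LANDSCAPE, True, lambda t: print("rotate panel to", t.value))
```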
用户通过移动终端100B执行投屏操作后,移动终端100B会通过镜像协议或者投屏协议向显示设备200发送投屏画面。控制器250可以接收终端发送的图像信息,检测显示器275当前的旋转状态。其中,对于显示275旋转状态的检测可以通过显示设备200中内置的传感器完成。
例如,可以在显示设备200的显示器275上设置陀螺仪、重力加速度传感器等传感器设备,通过测量角加速度或重力方向确定显示器275相对于重力方向的姿态数据。再将检测的姿态数据分别与横屏状态和竖屏状态下的姿态数据进行比较,确定显示器275当前所处的旋转状态。又例如,可以在旋转组件276上设置光栅角度传感器、磁场角度传感器或滑动电阻角度传感器等,通过测量旋转组件276所旋转的角度,分别与横屏状态和竖屏状态下的角度进行比较,确定显示器275当前所处的旋转状态。
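The sketch below illustrates the comparison described above: a measured angle from a gyroscope/gravity sensor or from an angle sensor on the rotating component is matched against the landscape and portrait reference attitudes within a tolerance. The 0°/90° reference angles and the tolerance value are illustrative assumptions.

```python
def classify_rotation(angle_deg: float, tolerance_deg: float = 5.0) -> str:
    """Map a measured panel angle to 'landscape' (~0 deg) or 'portrait' (~90/270 deg).

    Returns 'unknown' when the panel is still mid-rotation.
    """
    angle = angle_deg % 360.0
    if min(angle, 360.0 - angle) <= tolerance_deg:  # close to 0 / 360 degrees
        return "landscape"
    if abs(angle - 90.0) <= tolerance_deg or abs(angle - 270.0) <= tolerance_deg:
        return "portrait"
    return "unknown"


if __name__ == "__main__":
    print(classify_rotation(2.5))   # -> landscape
    print(classify_rotation(88.0))  # -> portrait
    print(classify_rotation(45.0))  # -> unknown
```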
在一些示例性的实施方式中,控制器被进一步配置为:
计算所述投屏画面的旋转方向和旋转角度;所述投屏画面的旋转方向与所述显示器的转动方向相反;所述投屏画面的旋转角度与所述显示器的转动角度相等;
按照所述旋转方向和所述旋转角度旋转所述投屏画面。
S3,基于所述有效信息控制所述显示器呈现投屏画面,其中,所述投屏画面为所述有效信息放大预设倍数得到的。
显示设备200在接收到投屏视频流后,其控制器250可以对接收到的投屏视频流进行逐帧分析。例如,移动终端100B发送的投屏视频流的画面宽高比为1920:1080,控制器250在接收到投屏视频流后,可以通过解析该投屏视频 流,获取帧图像。提取帧图像分辨率为1920×1080。
在提取分辨率后,还可以针对投屏视频流中的帧图像再进行采样,提取出有效分辨率。其中,用于进行采样的那一帧画面称为采样画面。有效分辨率为在所述投屏视频流中提取的,帧数据上有效画面的分辨率。具体的,有效分辨率可以按照预设时间间隔在投屏视频流中获取。
在采样过程中,为了从采样画面中确定有效画面,可以对采样画面的像素点颜色进行遍历。显然,黑色区域的像素色值为黑色,有效区域的像素色值通常不全是黑色,因此,可以通过遍历采样画面的每个像素,确定黑色且呈矩形的区域为黑边,其他区域则为有效画面。
需要说明的是,根据不同显示设备200的适应显示方法,黑色区域所填充的颜色不仅仅局限于黑色。例如,为了适应操作***的整体UI设计风格,黑色区域可以为灰色、蓝色或其他颜色,还可能是渐变色,特定图案等。对于这些情况,本申请为了便于后续描述,仍然称之为黑色区域或者黑边。
在针对投屏视频流获取帧图像分辨率和有效分辨率后,可以将有效分辨率与帧图像分辨率进行对比,以根据有效分辨率和帧图像分辨率之间的差异,确定当前投屏视频流中有效画面的情况(如比例、方向等),从而选择是否根据有效画面进行显示。
例如,如果有效分辨率与帧图像分辨率相等,即第一帧画面的分辨率为1920×1080,在采样画面中确定的有效区域分辨率也为1920×1080,则代表当前投屏画面不存在黑边,投屏画面可以充满显示区域。即,直接通过显示器275横屏状态显示即可满足投屏画面的显示要求。
需要说明的是,由于显示画面的分辨率通常采用画面宽度和高度方向所占 像素数量进行表示,例如1920×1080。而单纯通过分辨率的数值通常难以直接进行对比。例如,从数值上比较,分辨率1920×1080等于1080×1920。因此,在实际对比过程中,可以通过提取分辨率中的部分数值或者将分辨率转化为其他可比较的数值后,再进行对比,以获得所述有效分辨率与所述帧图像分辨率的对比结果。例如,可以在帧图像分辨率中提取整体画面的宽度或高度,并与有效画面的高度或宽度进行比较,从而确定其有效分辨率。
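A minimal sketch of the component-wise comparison suggested above follows: instead of comparing the resolutions as bare numbers, the width and height are compared separately to decide whether left/right black borders are present. The interpretation that matching heights plus a narrower effective width signals black borders is the reading assumed here.

```python
def has_side_black_bars(frame_res, effective_res) -> bool:
    """Compare width and height separately instead of comparing raw values.

    `frame_res` and `effective_res` are (width, height) pairs taken from the
    projected video stream and from the sampled-frame analysis respectively.
    """
    (frame_w, frame_h), (eff_w, eff_h) = frame_res, effective_res
    # The stream keeps the source height; a narrower effective width therefore
    # means left and right black borders are present.
    return eff_h == frame_h and eff_w < frame_w


if __name__ == "__main__":
    print(has_side_black_bars((1920, 1080), (1920, 1080)))  # False: no borders
    print(has_side_black_bars((1920, 1080), (960, 1080)))   # True: portrait source
```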
由以上技术方案可知,本申请提供的投屏视频流有效分辨率检测方法可以在接收投屏视频流后,从投屏视频流中提取帧图像分辨率和有效分辨率,并进行对比,确定当前视频流的有效分辨率。通过设置视频流的有效分辨率,可以按照有效分辨率对投屏画面进行显示,从而适应投屏画面的显示方向,减小黑边影响,达到更优的用户体验。
为了计算有效画面的放大倍数,在本申请的部分实施例中,如图8所示,所述方法还包括:
S81:通过遍历所述采样画面中连续的黑色像素点数,提取黑边数据;
S82:提取初始画面数据;
S83:计算所述显示设备和所述有效画面的高度比例R_H和宽度比例R_W;
S84:比较R_H与R_W的大小。
如图9所示,以移动终端的宽度为W_0,高度为H_ph;显示设备在竖屏状态时的宽度为W_0,高度为H_tv。可以从有效画面对应的图像左侧开始检测连续黑色区域的范围,并得出黑色区域的范围:左侧黑边宽度a、左侧黑边高度H_ph。再从图像右侧开始检测连续黑色区域的范围,并得出黑色区域的范围:右侧黑边宽度a以及右侧黑边高度H_ph,形成黑边数据。此时,得到的移动终端发送的图像信息中,有效画面的分辨率为(W_0-2a)*H_ph,显示设备的分辨率为W_0*H_tv。
显示设备和所述有效信息的高度比例R_H=H_tv/H_ph;宽度比例R_W=W_0/(W_0-2a)。若R_H>R_W,则执行S85,将所述有效信息放大R_W倍,得到投屏画面;若R_H<R_W,则执行S86,将所述有效信息放大R_H倍,得到投屏画面。
实际应用中,由于移动终端100B所投屏的画面内容容易对采样画面中有效画面区域的判断造成影响,例如,采样画面对应的移动终端100B的显示画面刚好是黑色时,如果仍旧以投屏画面边缘连续黑色像素点进行范围判断,黑色的画面会影响黑色区域的范围判断,进而影响最终的有效分辨率提取结果。因此,为了缓解黑色画面内容对有效分辨率的影响,在所述投屏视频流提取有效分辨率的步骤,还包括:
在所述投屏视频流中按相等时间间隔获取多帧采样画面;
分别计算每一帧所述采样画面的有效分辨率。
通过预设采样时间间隔,可以在投屏视频流中,多次进行采样,并分别提取出每次采样中画面的有效分辨率。例如,每隔T时间获取一帧画面图像,并通过上述分辨率算法得出的有效分辨率数值分别为:Sx0、Sx1、……、Sxn。
通过多次进行采样,可以随着移动终端100B上显示画面的变化,采集到多帧采样画面。通常多帧采样画面不会全部都受黑色画面内容的影响,因此采集多帧采样画面可以降低画面内容对黑色区域范围判断造成影响,从而提高有效分辨率判断时的准确率。
在一种实现方式中,所述方法还包括:如果设置所述视频流的帧图像分辨率不同于有效分辨率,则可以控制旋转显示设备200的显示器275至竖屏状态。 例如,帧图像分辨率为1920×1080,有效分辨率为960×1080。从而可以确定移动终端100B上对应的显示画面为960×1080的竖向画面。
而竖向画面更适合在竖屏状态下进行显示,因此在确定有效分辨率为有效分辨率后,控制器250可以向旋转组件276发送控制指令,使旋转组件276驱动显示器275逆时针(或顺时针)旋转至竖屏状态。
显示器275旋转至竖屏状态后,可以按照宽高比为960:1080的比例对投屏画面进行显示。但是由于显示器275的屏幕较大,通常其显示分辨率为3840×2160(横屏状态,对应竖屏状态则为2160×3840)。因此,为了显示分辨率为960×1080的投屏画面需要对投屏画面进行缩放,使显示器275能够完全显示投屏画面。
另外,由于单纯的对投屏画面进行大小的调整,容易使显示的投屏画面在大屏幕上较模糊,严重降低用户体验。因此,在对投屏画面进行缩放的同时,还可以对投屏画面进行插像素相关的画质调整,以改善模糊画面,提高画面显示效果。
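The quality adjustment above is only described as pixel-interpolation-related; as one common choice, the sketch below upscales the effective picture with bilinear interpolation. The patent does not name a specific filter, so this particular interpolation and the function name are assumptions for illustration.

```python
def bilinear_upscale(pixels, new_w, new_h):
    """Upscale an image given as rows of (R, G, B) tuples with bilinear interpolation."""
    old_h, old_w = len(pixels), len(pixels[0])
    out = []
    for y in range(new_h):
        fy = y * (old_h - 1) / max(new_h - 1, 1)
        y0, ty = int(fy), fy - int(fy)
        y1 = min(y0 + 1, old_h - 1)
        row = []
        for x in range(new_w):
            fx = x * (old_w - 1) / max(new_w - 1, 1)
            x0, tx = int(fx), fx - int(fx)
            x1 = min(x0 + 1, old_w - 1)
            # Blend the four neighbouring pixels channel by channel.
            row.append(tuple(
                round((1 - ty) * ((1 - tx) * pixels[y0][x0][c] + tx * pixels[y0][x1][c])
                      + ty * ((1 - tx) * pixels[y1][x0][c] + tx * pixels[y1][x1][c]))
                for c in range(3)))
        out.append(row)
    return out


if __name__ == "__main__":
    img = [[(0, 0, 0), (255, 255, 255)],
           [(255, 0, 0), (0, 0, 255)]]
    print(len(bilinear_upscale(img, 4, 4)), "rows after upscaling")  # -> 4 rows
```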
由以上技术方案可知,如图10所示,本申请提供的显示设备可以根据图像信息判断移动终端是否处于纵向模式,并自动调整显示器的旋转状态,从而使用更大的显示空间显示投屏画面,缓解传统智能电视无法正常显示投屏画面的问题。
基于上述显示设备,本申请还提供一种投屏方法,如图7所示,一种投屏方法,应用于显示设备,包括:
接收所述终端发送的图像信息,其中,当所述终端处于纵向模式时,所述图像信息包括有效信息和左、右黑色信息;
若所述显示器的当前旋转状态与所述终端的纵向模式不匹配时,将所述显示器旋转至竖屏状态;
基于所述有效信息控制所述显示器呈现投屏画面,其中,所述投屏画面为所述有效信息放大预设倍数得到的。
具体实现中,本申请还提供一种计算机存储介质,其中,该计算机存储介质可存储有程序,该程序执行时可包括本申请提供的方法的各实施例中的部分或全部步骤,当本申请提供的显示设备的控制器运行所述计算机程序指令时,所述控制器执行本申请所述的控制器被配置的步骤。所述的存储介质可为磁碟、光盘、只读存储记忆体(read-only memory,ROM)或随机存储记忆体(random access memory,RAM)等。
本领域的技术人员可以清楚地了解到本申请实施例中的技术可借助软件加必需的通用硬件平台的方式来实现。基于这样的理解,本申请实施例中的技术方案本质上或者说对现有技术做出贡献的部分可以以软件产品的形式体现出来,该计算机软件产品可以存储在存储介质中,如ROM/RAM、磁碟、光盘等,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)执行本申请各个实施例或者实施例的某些部分所述的方法。
本说明书中各个实施例之间相同相似的部分互相参见即可。尤其,对于实施例而言,由于其基本相似于方法实施例,所以描述的比较简单,相关之处参见方法实施例中的说明即可。
以上所述的本申请实施方式并不构成对本申请保护范围的限定。

Claims (10)

  1. 一种显示设备,包括:
    显示器;
    旋转组件,被配置为带动所述显示器旋转,以使所述显示器处于横屏状态或竖屏状态中的一种旋转状态;
    用户接口,被配置为连接到终端;
    控制器,被配置为:
    接收所述终端发送的图像信息,其中,当所述终端处于纵向模式时,所述图像信息包括有效信息和左、右黑色信息,所述有效信息对应于所述终端的屏幕显示内容;
    若所述显示器的当前旋转状态与所述终端的纵向模式不匹配时,将所述显示器旋转至竖屏状态;
    基于所述有效信息控制所述显示器呈现投屏画面,其中,所述投屏画面为所述有效信息放大预设倍数得到的。
  2. 根据权利要求1所述的显示设备,所述控制器被进一步配置为:
    若所述显示器的当前旋转状态与所述终端的纵向模式匹配,基于所述有效信息控制所述显示器呈现投屏画面。
  3. 根据权利要求1所述的显示设备,所述控制器判断所述显示器的当前旋转状态与所述终端的纵向模式是否匹配,被进一步配置为:
    获取所述显示器旋转角度回调信息;
    根据所述图像信息确定所述显示器的目标旋转状态。
  4. 根据权利要求3所述的显示设备,所述控制器根据所述图像信息确 定所述显示器的目标旋转状态,被进一步配置为:
    如果接收到所述终端发送的所述图像信息中包括左、右黑色信息,则确定所述显示器的目标旋转状态为竖屏状态;
    如果接收到所述终端发送的所述图像信息中不包括左、右黑色信息,则确定所述显示器的目标旋转状态为横屏状态。
  5. 根据权利要求1所述的显示设备,所述控制器执行基于所述有效信息控制所述显示器呈现投屏画面,其中,所述投屏画面为所述有效信息放大预设倍数得到的,进一步被配置为:
    计算所述显示设备和所述有效信息的高度比例R_H和宽度比例R_W;
    若R_H>R_W,则将所述有效信息放大R_W倍,得到投屏画面;
    若R_H<R_W,则将所述有效信息放大R_H倍,得到投屏画面。
  6. 根据权利要求3所述的显示设备,所述控制器被进一步配置为:
    计算所述投屏画面的旋转方向和旋转角度;所述投屏画面的旋转方向与所述显示器的转动方向相反;所述投屏画面的旋转角度与所述显示器的转动角度相等;
    按照所述旋转方向和所述旋转角度旋转所述投屏画面。
  7. 一种投屏方法,应用于显示设备,包括:
    接收所述终端发送的图像信息,其中,当所述终端处于纵向模式时,所述图像信息包括有效信息和左、右黑色信息,所述有效信息对应于所述终端的屏幕显示内容;
    若所述显示器的当前旋转状态与所述终端的纵向模式不匹配时,将所述显示器旋转至竖屏状态;
    基于所述有效信息控制所述显示器呈现投屏画面,其中,所述投屏画面为所述有效信息放大预设倍数得到的。
  8. 根据权利要求7所述的投屏方法,包括:
    若所述显示器的当前旋转状态与所述终端的纵向模式匹配,基于所述有效信息控制所述显示器呈现投屏画面,其中,所述投屏画面为所述有效信息放大预设倍数得到的。
  9. 根据权利要求7所述的投屏方法,所述判断所述显示器的当前旋转状态与所述终端的纵向模式是否匹配,包括:
    获取所述显示器旋转角度回调信息;
    根据所述图像信息确定所述显示器的目标旋转状态。
  10. 根据权利要求9所述的投屏方法,所述根据所述图像信息确定所述显示器的目标旋转状态,包括:
    如果接收到所述终端发送的所述图像信息中包括左、右黑色信息,则确定所述显示器的目标旋转状态为竖屏状态;
    如果接收到所述终端发送的所述图像信息中不包括左、右黑色信息,则确定所述显示器的目标旋转状态为横屏状态。
PCT/CN2020/086665 2020-04-24 2020-04-24 一种显示设备及投屏方法 WO2021212463A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/086665 WO2021212463A1 (zh) 2020-04-24 2020-04-24 一种显示设备及投屏方法

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/086665 WO2021212463A1 (zh) 2020-04-24 2020-04-24 一种显示设备及投屏方法

Publications (1)

Publication Number Publication Date
WO2021212463A1 true WO2021212463A1 (zh) 2021-10-28

Family

ID=78270826

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/086665 WO2021212463A1 (zh) 2020-04-24 2020-04-24 一种显示设备及投屏方法

Country Status (1)

Country Link
WO (1) WO2021212463A1 (zh)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080303805A1 (en) * 2007-06-07 2008-12-11 Chih-Ta Chien Digital picture frame and method of displaying digital image data on a display unit of the digital picture frame
CN108762702A (zh) * 2012-07-06 2018-11-06 Lg 电子株式会社 移动终端、图像显示装置及使用其的用户接口提供方法
CN105230005A (zh) * 2013-05-10 2016-01-06 三星电子株式会社 显示装置及其控制方法
CN104683704A (zh) * 2013-11-26 2015-06-03 现代自动车株式会社 图像处理的***和方法、支持该***和方法的设备和终端
CN107135193A (zh) * 2016-02-26 2017-09-05 Lg电子株式会社 无线装置
EP3444798A1 (en) * 2017-08-16 2019-02-20 Vestel Elektronik Sanayi ve Ticaret A.S. Display system, method and computer program for controlling an electronic display

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114222188A (zh) * 2021-12-28 2022-03-22 深圳小湃科技有限公司 基于旋转屏的全屏显示方法、装置、设备及存储介质
CN114422839A (zh) * 2021-12-31 2022-04-29 当趣网络科技(杭州)有限公司 一种多路投屏显示处理方法、装置以及***
CN114428596A (zh) * 2022-01-30 2022-05-03 深圳创维-Rgb电子有限公司 投屏显示方法、电子设备及可读存储介质
CN114428596B (zh) * 2022-01-30 2023-07-07 深圳创维-Rgb电子有限公司 投屏显示方法、电子设备及可读存储介质
CN115482797A (zh) * 2022-08-19 2022-12-16 合肥讯飞读写科技有限公司 图像的展示方法、装置、电子相框和存储介质

Similar Documents

Publication Publication Date Title
WO2021179359A1 (zh) 一种显示设备及显示画面旋转适配方法
WO2021212463A1 (zh) 一种显示设备及投屏方法
CN112565839B (zh) 投屏图像的显示方法及显示设备
CN112165644B (zh) 一种显示设备及竖屏状态下视频播放方法
WO2021179363A1 (zh) 一种显示设备及开机动画显示方法
CN111866593B (zh) 一种显示设备及开机界面显示方法
CN113556593B (zh) 一种显示设备及投屏方法
WO2021189712A1 (zh) 网页视频全屏播放切换小窗口播放的方法及显示设备
US11662971B2 (en) Display apparatus and cast method
CN112565861A (zh) 一种显示设备
WO2021179361A1 (zh) 一种显示设备
WO2021212470A1 (zh) 一种显示设备及投屏画面显示方法
WO2021179362A1 (zh) 一种显示设备及界面切换方法
CN113556591A (zh) 一种显示设备及投屏画面旋转显示方法
CN113556590B (zh) 一种投屏视频流有效分辨率检测方法及显示设备
WO2022193475A1 (zh) 显示设备、接收投屏内容的方法及投屏方法
WO2021180223A1 (zh) 一种显示方法及显示设备
CN113542824B (zh) 一种显示设备及应用界面的显示方法
CN112565915A (zh) 显示设备和显示方法
WO2021195919A1 (zh) 一种显示设备及开机信号源显示适配方法
WO2021208016A1 (zh) 一种显示设备及应用界面的显示方法
WO2021184387A1 (zh) 一种动画配置方法及显示设备
CN113497965B (zh) 旋转动画的配置方法及显示设备
CN113497962B (zh) 旋转动画的配置方法及显示设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20932524

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20932524

Country of ref document: EP

Kind code of ref document: A1