WO2021213097A1 - Display device and screen projection method - Google Patents

Display device and screen projection method

Info

Publication number
WO2021213097A1
WO2021213097A1 PCT/CN2021/081889 CN2021081889W WO2021213097A1 WO 2021213097 A1 WO2021213097 A1 WO 2021213097A1 CN 2021081889 W CN2021081889 W CN 2021081889W WO 2021213097 A1 WO2021213097 A1 WO 2021213097A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
screen
display device
information
terminal
Prior art date
Application number
PCT/CN2021/081889
Other languages
English (en)
French (fr)
Inventor
宋子全
庞秀娟
王之奎
于颜梅
李乃金
Original Assignee
海信视像科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN202010334727.9A external-priority patent/CN113556593B/zh
Priority claimed from CN202010331501.3A external-priority patent/CN113556590B/zh
Application filed by 海信视像科技股份有限公司
Priority to CN202180042822.4A priority Critical patent/CN115836528A/zh
Publication of WO2021213097A1 publication Critical patent/WO2021213097A1/zh
Priority to US17/805,276 priority patent/US11662971B2/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454 - Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/2092 - Details of a display terminals using a flat panel, the details relating to the control arrangement of the display terminal and to the interfaces thereto
    • G09G3/2096 - Details of the interface to the display terminal specific for a flat panel
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 - Structure of client; Structure of client peripherals
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 - Details of television systems
    • H04N5/64 - Constructional details of receivers, e.g. cabinets or dust covers
    • H04N5/655 - Construction or mounting of chassis, e.g. for varying the elevation of the tube
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00 - Aspects of display data processing
    • G09G2340/04 - Changes in size, position or resolution of an image
    • G09G2340/0442 - Handling or displaying different aspect ratios, or changing the aspect ratio
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00 - Aspects of display data processing
    • G09G2340/04 - Changes in size, position or resolution of an image
    • G09G2340/045 - Zooming at least part of an image, i.e. enlarging it or shrinking it
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00 - Aspects of display data processing
    • G09G2340/04 - Changes in size, position or resolution of an image
    • G09G2340/0492 - Change of orientation of the displayed image, e.g. upside-down, mirrored
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00 - Aspects of data communication
    • G09G2370/06 - Consumer Electronics Control, i.e. control of another device by a display or vice versa
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00 - Aspects of data communication
    • G09G2370/16 - Use of wireless transmission of display information

Definitions

  • This application relates to the technical field of smart TVs, and in particular to a display device and a screen projection method.
  • Screen projection is an interactive operation between a terminal and a display device.
  • For example, a wireless local area network is used to transmit a video stream, so that the screen displayed on the terminal device can be presented through the display device.
  • For example, a screen projection operation can be performed through the mobile phone to send the screen displayed on the mobile phone to the smart TV in the form of a video stream, so that the large screen of the smart TV provides a better user experience.
  • the screen ratio of terminals such as mobile phones and the screen ratio of display devices are often different.
  • For example, the screen display aspect ratio of the mobile phone is 1080:1920, while the display aspect ratio of the smart TV is 1920:1080; that is, the screen on the mobile phone is in the vertical state while the screen of the smart TV is in the horizontal state. Therefore, when a terminal screen is projected and displayed through a smart TV, the projected screen often cannot be displayed normally because the aspect ratio of the terminal screen does not match the display aspect ratio.
  • In order to fully display the screen of the phone, the projected picture needs to be scaled based on the height of the phone screen.
  • the difference in screen ratio will result in large black areas on both sides of the screen displayed by the smart TV, which not only reduces the user's viewing experience, but also wastes the display space on the screen.
  • the present application provides a display device and a screen projection method to solve the problem of wasting the display space on the screen when the traditional display device displays the projection screen.
  • the present application provides a display device, which is characterized in that it includes:
  • the rotating component is configured to drive the display to rotate so that the display is in one of a horizontal screen state or a vertical screen state;
  • a communicator, configured to be connected to a terminal;
  • the controller is configured to:
  • the image information includes valid information and left and right black areas;
  • this application also provides a screen projection method applied to a display device, which is characterized in that it includes:
  • the image information includes valid information and left and right black information, and the valid information corresponds to the screen display content of the terminal;
  • the controller of the display can receive the image information sent by the terminal.
  • the image information includes valid information and left and right black information; if the current rotation state of the display does not match the portrait mode of the terminal, rotate the display to the portrait state; control the display to present a projection screen based on the valid information, wherein the projected picture is obtained by enlarging the valid information by a preset multiple.
  • the display device provided in this application can determine whether the mobile terminal is in portrait mode according to the image information, and automatically adjust the rotation state of the display, thereby using a larger display space to display the projection screen and alleviating the problem that a traditional smart TV cannot display the projection screen normally.
  • FIG. 1A is an application scenario diagram of a display device of this application
  • FIG. 1B is a rear view of a display device of this application.
  • FIG. 2 is a block diagram of the hardware configuration of the control device of the application
  • FIG. 3 is a block diagram of the architecture configuration of the operating system in the storage device of the display device of this application;
  • FIG. 4A is a schematic diagram of the landscape mode of the mobile terminal of this application.
  • FIG. 4B is a schematic diagram of the vertical mode of the mobile terminal of this application.
  • 5A is a schematic diagram of a mobile terminal in a landscape mode sending image information to a display device in a landscape mode in an embodiment of the application;
  • 5B is a schematic diagram of a mobile terminal in a landscape mode sending image information to a display device in a portrait state in an embodiment of the application;
  • 6A is a schematic diagram of a mobile terminal in a portrait mode sending image information to a display device in a landscape mode in an embodiment of the application;
  • FIG. 6B is a schematic diagram of the effect of presenting a picture when the display device is in a portrait state according to the projection protocol
  • FIG. 6C is a schematic diagram of sending image information to a display device in a portrait mode when the mobile terminal is in a portrait mode in an embodiment of the present application;
  • FIG. 7 is a schematic flow chart of a method for detecting effective resolution of a projection video stream according to this application.
  • Figure 8 is a schematic diagram of the application process for comparing sampling resolution and initial resolution
  • FIG. 9A is a schematic diagram of the process of calculating the reference value and the comparison value in this application.
  • Figure 9B is a schematic diagram of the black border data and initial screen data of the application.
  • Fig. 10 is a schematic diagram of the application process for judging preset execution conditions
  • FIG. 11 is a schematic diagram of the process of comparing multiple sampling resolutions and initial resolutions in this application.
  • FIG. 12 is a schematic diagram of a flow executed by a controller of an exemplary display device of this application.
  • FIG. 13 is a schematic flowchart of the application for determining the magnification of effective information according to the ratio of the height and width of the display device and the effective screen;
  • FIG. 14 is a schematic structural diagram of a display device of this application.
  • Figures 15A-15C are schematic diagrams of the user interface of a display device of this application receiving a screencast screen in a vertical screen situation.
  • the display device may be an electrical device with a larger screen, such as a smart TV, that presents video and audio signals to users.
  • the display device can have an independent operating system and support function expansion.
  • Various applications can be installed in the display device according to user needs, for example, traditional video applications, short-video and social applications, and reading applications such as comics and books. These applications can use the screen of the display device to display the application picture and provide users with richer media resources.
  • the display device can also perform data interaction and resource sharing with different terminals.
  • a smart TV can be connected to a mobile phone through wireless communication methods such as local area network, Bluetooth, etc., so as to play resources in the mobile phone or directly cast a screen to display the screen on the mobile phone.
  • an embodiment of the present application provides a display device and a computer storage medium.
  • The display device is, for example, a rotatable TV. It should be noted that the method provided in this embodiment is not only applicable to rotatable TVs, but also applicable to other display devices, such as computers and tablet computers.
  • The term "module" used in the various embodiments of this application can refer to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that can execute the function of the related component.
  • The term "remote control" used in the various embodiments of this application refers to a component of an electronic device (such as the display device disclosed in this application), which can generally control the electronic device wirelessly within a short distance.
  • the component can generally use infrared and/or radio frequency (RF) signals and/or Bluetooth to connect with electronic devices, and can also include functional modules such as WiFi, wireless USB, Bluetooth, and motion sensors.
  • a handheld touch remote control uses a user interface in a touch screen to replace most of the physical built-in hard keys in general remote control devices.
  • The term "gesture" used in the embodiments of the present application refers to a user's behavior of expressing an expected idea, action, goal, and/or result through a change of hand shape or a hand movement.
  • The term "hardware system" used in the various embodiments of this application may refer to a physical component that has computing, control, storage, input, and output functions, composed of an integrated circuit (IC), a printed circuit board (PCB), and other mechanical, optical, electrical, and magnetic devices.
  • the hardware system is also usually referred to as a motherboard or a main chip or a controller.
  • FIG. 1A is an application scenario diagram of a display device provided by some embodiments of this application.
  • the control device 100 and the display device 200 can communicate in a wired or wireless manner.
  • The control device 100 is configured to control the display device 200; it can receive operation instructions input by the user and convert the operation instructions into instructions that the display device 200 can recognize and respond to, acting as an intermediary for the interaction between the user and the display device 200.
  • For example, the user operates the channel up/down keys on the control device 100, and the display device 200 responds to the channel up/down operations.
  • The control device 100 may be a remote controller 100A, which controls the display device 200 through infrared protocol communication, Bluetooth protocol communication, other short-distance communication methods, etc., in a wireless or wired manner.
  • the user can control the display device 200 by inputting user instructions through keys on the remote control, voice input, control panel input, etc.
  • For example, the user can control the functions of the display device 200 by inputting corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input keys, menu keys, and power on/off keys on the remote control.
  • the control device 100 may also be a smart device, such as a mobile terminal 100B, a tablet computer, a computer, a notebook computer, and the like.
  • an application program running on a smart device is used to control the display device 200.
  • the application can be configured to provide users with various controls through an intuitive user interface (UI) on the screen associated with the smart device.
  • the mobile terminal 100B may install a software application with the display device 200, realize connection communication through a network communication protocol, and realize the purpose of one-to-one control operation and data communication.
  • the mobile terminal 100B can establish a control instruction protocol with the display device 200, and the functions of the physical keys arranged on the remote control 100A can be realized by operating various function keys or virtual controls of the user interface provided on the mobile terminal 100B.
  • the audio and video content displayed on the mobile terminal 100B can also be transmitted to the display device 200 to realize the synchronous display function.
  • The display device 200 may provide a broadcast receiving function and a computer-supported network TV function.
  • the display device can be implemented as digital TV, Internet TV, Internet Protocol TV (IPTV), and so on.
  • the display device 200 may be a liquid crystal display, an organic light emitting display, or a projection device.
  • the specific display device type, size and resolution are not limited.
  • the display device 200 also performs data communication with the server 300 through a variety of communication methods.
  • the display device 200 may be allowed to communicate through a local area network (LAN), a wireless local area network (WLAN), and other networks.
  • the server 300 may provide various contents and interactions to the display device 200.
  • the display device 200 can send and receive information, such as receiving electronic program guide (EPG) data, receiving software program updates, or accessing a remotely stored digital media library.
  • the server 300 can be one group or multiple groups, and can be one type or multiple types of servers.
  • the server 300 provides other network service content such as video-on-demand and advertising services.
  • As shown in FIG. 1B, the display device 200 includes a controller 250, a display 275, a terminal interface 278 protruding from a gap in the backplane, and a rotating assembly 276 connected to the backplane.
  • The rotating assembly 276 can cause the display 275 to rotate.
  • For example, the rotating assembly 276 can rotate the display screen to the vertical screen state, that is, the state where the vertical side length of the screen is greater than the horizontal side length, or rotate the screen to the horizontal screen state, that is, the state where the horizontal side length of the screen is greater than the vertical side length.
  • FIG. 2 exemplarily shows a block diagram of the hardware configuration of the display device 200.
  • the display device 200 may include a tuner and demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a memory 260, a user interface 265, a video processor 270, a display 275, Rotating component 276, audio processor 280, audio output interface 285, power supply 290.
  • the rotating assembly 276 may include components such as a drive motor and a rotating shaft.
  • The driving motor can be connected to the controller 250 and outputs a rotation angle under the control of the controller 250; one end of the rotating shaft is connected to the power output shaft of the driving motor, and the other end is connected to the display 275, so that the display 275 can be fixedly installed, through the rotating assembly 276, on a wall or bracket.
  • the rotating assembly 276 may also include other components, such as transmission components, detection components, and so on.
  • The transmission component can adjust the rotation speed and torque output by the rotating assembly 276 through a specific transmission ratio, and may adopt, for example, a gear transmission mode.
  • the detection component can be composed of sensors arranged on the rotating shaft, such as an angle sensor, an attitude sensor, and the like. These sensors can detect parameters such as the angle of rotation of the rotating component 276 and send the detected parameters to the controller 250 so that the controller 250 can determine or adjust the state of the display device 200 according to the detected parameters.
  • the rotating assembly 276 may include, but is not limited to, one or more of the aforementioned components.
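  • For illustration only, the following Python sketch models a hypothetical rotating-assembly interface consistent with the description above (a drive motor commanded by the controller and an angle sensor reporting back); the class and method names, and the angle values chosen for the horizontal and vertical screen states, are assumptions and do not come from this application.

```python
# Hypothetical sketch of the rotating assembly 276 / controller 250 interaction.
# Angle values and names are assumptions for illustration only.

LANDSCAPE_ANGLE = 0    # assumed shaft angle for the horizontal screen state
PORTRAIT_ANGLE = 90    # assumed shaft angle for the vertical screen state


class RotatingAssembly:
    """Drive motor plus angle-sensor detection component."""

    def __init__(self, angle: int = LANDSCAPE_ANGLE):
        self._angle = angle

    def detected_angle(self) -> int:
        # In a real device this value would come from the angle/attitude sensor.
        return self._angle

    def rotate_to(self, target_angle: int) -> None:
        # In a real device this would drive the motor and wait for confirmation.
        self._angle = target_angle


class Controller:
    def __init__(self, assembly: RotatingAssembly):
        self.assembly = assembly

    def ensure_portrait(self) -> None:
        if self.assembly.detected_angle() != PORTRAIT_ANGLE:
            self.assembly.rotate_to(PORTRAIT_ANGLE)

    def ensure_landscape(self) -> None:
        if self.assembly.detected_angle() != LANDSCAPE_ANGLE:
            self.assembly.rotate_to(LANDSCAPE_ANGLE)


controller = Controller(RotatingAssembly())
controller.ensure_portrait()
print(controller.assembly.detected_angle())   # 90
```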
  • The tuner and demodulator 210 receives broadcast television signals through wired or wireless means, and can perform modulation and demodulation processing such as amplification, mixing, and resonance; it is used to demodulate, from multiple wireless or cable broadcast television signals, the audio and video signals carried in the frequency of the television channel selected by the user, as well as additional information (such as EPG data).
  • The tuner and demodulator 210 can respond, according to the user's selection and under the control of the controller 250, to the frequency of the television channel selected by the user and the television signal carried by that frequency.
  • The tuner and demodulator 210 can receive signals in many ways according to different broadcasting formats of TV signals, such as terrestrial broadcasting, cable broadcasting, satellite broadcasting, or Internet broadcasting; according to different modulation types, it can use digital modulation or analog modulation; and according to the different types of received TV signals, it can demodulate analog signals and digital signals.
  • the tuner demodulator 210 may also be in an external device, such as an external set-top box.
  • the set-top box outputs a TV signal after modulation and demodulation, and is input to the display device 200 through the external device interface 240.
  • the communicator 220 is a component used to communicate with external devices or external servers according to various types of communication protocols.
  • the display device 200 may transmit content data to an external device connected via the communicator 220, or browse and download content data from an external device connected via the communicator 220.
  • The communicator 220 may include network communication protocol modules such as a WiFi module 221, a Bluetooth communication protocol module 222, a wired Ethernet communication protocol module 223, or a near field communication protocol module, so that the communicator 220 can receive, under the control of the controller 250, control signals from the control device 100 and implement the control signals as WiFi signals, Bluetooth signals, radio frequency signals, etc.
  • the detector 230 is a component of the display device 200 for collecting signals from the external environment or interacting with the outside.
  • The detector 230 may include a sound collector 231, such as a microphone, which may be used to receive the user's voice, for example a voice signal of a control instruction for the user to control the display device 200; or it may collect environmental sounds used to identify the type of environmental scene, so that the display device 200 can adapt to environmental noise.
  • The detector 230 may also include an image collector 232, such as a camera, which may be used to collect external environment scenes so as to adaptively change the display parameters of the display device 200, and to collect attributes of the user or gestures for interaction with the user so as to realize the interaction function between the display device and the user.
  • the detector 230 may further include a light receiver, which is used to collect the ambient light intensity to adapt to changes in display parameters of the display device 200 and so on.
  • the detector 230 may also include a temperature sensor.
  • the display device 200 may adaptively adjust the display color temperature of the image. Exemplarily, when the temperature is relatively high, the color temperature of the display device 200 can be adjusted to be colder; when the temperature is relatively low, the color temperature of the display device 200 can be adjusted to be warmer.
  • The external device interface 240 is a component that enables the controller 250 to control data transmission between the display device 200 and external devices.
  • the external device interface 240 can be connected to external devices such as set-top boxes, game devices, notebook computers, etc. in a wired/wireless manner, and can receive external devices such as video signals (such as moving images), audio signals (such as music), and additional information (such as EPG) and other data.
  • The external device interface 240 may include any one or more of: a high-definition multimedia interface (HDMI) terminal 241, a composite video blanking synchronization (CVBS) terminal 242, an analog or digital component terminal 243, a universal serial bus (USB) terminal 244, a component terminal (not shown in the figure), a red-green-blue (RGB) terminal (not shown in the figure), etc.
  • the controller 250 controls the work of the display device 200 and responds to user operations by running various software control programs (such as an operating system and various application programs) stored on the memory 260.
  • the controller 250 includes a random access memory (RAM), a read only memory (ROM), a graphics processor, a CPU processor, a communication interface, and a communication bus.
  • The RAM, ROM, graphics processor, CPU processor, and communication interface are connected through a communication bus.
  • The ROM 252 is used to store various system startup instructions. For example, when a power-on signal is received, the display device 200 starts to power up, and the CPU processor 254 runs the system startup instructions in the ROM 252 to copy the operating system stored in the memory 260 into the RAM 251 so as to run the operating system. After the operating system is started, the CPU processor 254 copies the various application programs in the memory 260 into the RAM 251, and then runs and starts the various application programs.
  • The graphics processor 253 is used to generate various graphics objects, such as icons, operation menus, and graphics displayed in response to user input instructions.
  • The graphics processor 253 may include an arithmetic unit, which performs operations by receiving various interactive instructions input by the user and then displays various objects according to their display attributes, and a renderer, which renders the various objects obtained by the arithmetic unit; the rendered result is displayed on the display 275.
  • the CPU processor 254 is configured to execute operating system and application program instructions stored in the memory 260. And according to the received user input instructions, to execute various applications, data and content processing, so as to finally display and play various audio and video content.
  • the CPU processor 254 may include multiple processors.
  • the multiple processors may include a main processor and multiple or one sub-processors.
  • The main processor is configured to perform some initialization operations of the display device 200 in the display device preloading mode and/or to perform display screen operations in the normal mode. The one or more sub-processors are used to perform operations in the standby mode of the display device.
  • the communication interface 255 may include the first interface to the nth interface. These interfaces may be network interfaces connected to external devices via a network.
  • the controller 250 may control the overall operation of the display device 200. For example, in response to receiving a user input command for selecting a GUI object displayed on the display 275, the controller 250 may perform an operation related to the object selected by the user input command.
  • the object can be any one of the selectable objects, such as a hyperlink or an icon.
  • the operation related to the selected object for example, the operation of displaying the page, document, image, etc. connected to the hyperlink, or the operation of executing the program corresponding to the object.
  • the user input command for selecting the GUI object may be a command input through various input devices (for example, a mouse, a keyboard, a touch pad, etc.) connected to the display device 200 or a voice command corresponding to a voice spoken by the user.
  • the memory 260 is used to store various types of data, software programs or application programs for driving and controlling the operation of the display device 200.
  • the memory 260 may include volatile and/or non-volatile memory.
  • the term "memory" includes the memory 260, the RAM and ROM of the controller 250, or the memory card in the display device 200.
  • The memory 260 is specifically used to store the operating program that drives the controller 250 in the display device 200; to store various application programs built into the display device 200 and downloaded by the user from external devices; and to store data such as the various GUIs provided by the display 275, various objects related to the GUIs, and visual effect images of the selector used to select GUI objects.
  • the memory 260 is specifically used to store drivers and related data of the tuner and demodulator 210, the communicator 220, the detector 230, the external device interface 240, the video processor 270, the display 275, the audio processor 280, etc.
  • The memory 260 also stores received external data (such as audio and video data) and user data (such as key information, voice information, touch information, etc.).
  • the memory 260 specifically stores software and/or programs for representing an operating system (OS). These software and/or programs may include, for example, a kernel, middleware, application programming interface (API), and/or application.
  • The kernel can control or manage system resources and the functions implemented by other programs (such as the middleware, APIs, or application programs); at the same time, the kernel can provide interfaces to allow the middleware, APIs, or application programs to access the controller, so as to control or manage system resources.
  • FIG. 3 exemplarily shows a block diagram of the architecture configuration of the operating system in the memory of the display device 200.
  • the operating system architecture consists of the application layer, the middleware layer, and the kernel layer from top to bottom.
  • Application layer: system built-in applications and non-system-level applications belong to the application layer, which is responsible for direct interaction with users.
  • the application layer can include multiple applications, such as settings applications, e-post applications, media center applications, and so on. These applications can be implemented as web applications, which are executed based on the WebKit engine, and specifically can be developed and executed based on HTML5, Cascading Style Sheets (CSS) and JavaScript.
  • HTML, the full name of which is HyperText Markup Language: web pages are described through markup tags, and HTML tags are used to describe text, graphics, animations, sounds, tables, links, etc. The browser reads the HTML document, interprets the content of the tags in the document, and displays it in the form of a web page.
  • CSS, the full name of which is Cascading Style Sheets, is a computer language used to express the style of HTML documents, and can be used to define style structures such as fonts, colors, and positions. CSS styles can be stored directly in HTML web pages or in separate style files, so as to control the styles in web pages.
  • JavaScript is a language used in web page programming, which can be inserted into HTML pages and interpreted and executed by the browser.
  • the interaction logic of the web application is implemented through JavaScript.
  • JavaScript can encapsulate a JavaScript extension interface through the browser to realize communication with the kernel layer.
  • the middleware layer can provide some standardized interfaces to support the operation of various environments and systems.
  • The middleware layer can be implemented as middleware related to data broadcasting, such as MHEG (Multimedia and Hypermedia information coding Expert Group) middleware, as middleware related to external device communication, such as DLNA middleware, or as middleware that provides the browser environment in which each application in the display device runs.
  • the kernel layer provides core system services, such as file management, memory management, process management, network management, system security authority management and other services.
  • the kernel layer can be implemented as a kernel based on various operating systems, for example, a kernel based on the Linux operating system.
  • The kernel layer also provides communication between system software and hardware, and provides device driver services for various hardware, for example: a display driver for the display, a camera driver for the camera, a button driver for the remote control, a WiFi driver for the WiFi module, an audio driver for the audio output interface, a power management driver for the power management (PM) module, etc.
  • the user interface 265 receives various user interactions. Specifically, it is used to send the input signal of the user to the controller 250, or to transmit the output signal from the controller 250 to the user.
  • For example, the remote control 100A may send input signals input by the user, such as a power switch signal, a channel selection signal, and a volume adjustment signal, to the user interface 265, which then transfers them to the controller 250; or the remote control 100A may receive output signals such as audio, video, or data that are processed by the controller 250 and output through the user interface 265, and display the received output signals or output them as audio or vibration.
  • the user may input a user command on a graphical user interface (GUI) displayed on the display 275, and the user interface 265 receives the user input command through the GUI.
  • the user interface 265 may receive user input commands for controlling the position of the selector in the GUI to select different objects or items.
  • the "user interface” is a medium interface for interaction and information exchange between applications or operating systems and users. It realizes the conversion between the internal form of information and the form acceptable to users.
  • the commonly used form of the user interface is a graphical user interface (GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It can be an icon, window, control and other interface elements displayed on the display screen of an electronic device.
  • The controls may include visual interface elements such as icons, menus, tabs, text boxes, dialog boxes, status bars, channel bars, widgets, etc.
  • the user may input a user command by inputting a specific sound or gesture, and the user interface 265 recognizes the sound or gesture through the sensor to receive the user input command.
  • The video processor 270 is used to receive external video signals and, according to the standard codec protocol of the input signal, perform video data processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis, so as to obtain a video signal that can be displayed or played directly on the display 275.
  • the video processor 270 includes a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, and the like.
  • The demultiplexing module is used to demultiplex the input audio and video data stream. For example, for an input MPEG-2 stream (a compression standard for moving images and audio on digital storage media), the demultiplexing module demultiplexes it into a video signal, an audio signal, and so on.
  • the video decoding module is used to process the demultiplexed video signal, including decoding and scaling.
  • The image synthesis module, such as an image synthesizer, is used to superimpose and mix the GUI signal, generated by the graphics generator according to user input or by the device itself, with the scaled video image, so as to generate an image signal for display.
  • The frame rate conversion module is used to convert the frame rate of the input video, for example converting the frame rate of an input 60Hz video to a frame rate of 120Hz or 240Hz, which is usually realized by frame interpolation.
  • The display formatting module is used to convert the signal output by the frame rate conversion module into a signal conforming to the display format of the display, for example formatting the signal output by the frame rate conversion module into RGB data signals for output.
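  • As a minimal, non-authoritative sketch of the data flow through the modules listed above (demultiplexing, decoding and scaling, image synthesis, frame rate conversion, display formatting), the placeholder Python below chains toy stage functions; none of the function names or data shapes are taken from this application.

```python
# Toy data-flow sketch of the video processor 270 stages; all stages are placeholders.

def demultiplex(stream):
    """Split the input audio/video data stream into video and audio signals."""
    return {"video": stream, "audio": b""}          # placeholder split

def decode_and_scale(video_signal):
    """Decode the video signal and scale it to the panel size."""
    return [f"frame{i}" for i in range(2)]          # placeholder decoded frames

def compose_with_gui(frames, gui="GUI"):
    """Superimpose the GUI signal generated by the graphics processor on each frame."""
    return [f"{f}+{gui}" for f in frames]

def convert_frame_rate(frames):
    """Double the frame rate (e.g. 60Hz -> 120Hz) by inserting interpolated frames."""
    doubled = []
    for f in frames:
        doubled += [f, f"{f}*interp"]
    return doubled

def format_for_display(frames):
    """Format frames into RGB data signals for the display."""
    return [f"RGB({f})" for f in frames]

streams = demultiplex(b"mpeg2")
print(format_for_display(convert_frame_rate(compose_with_gui(decode_and_scale(streams["video"])))))
```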
  • the display 275 is used to receive the image signal input from the video processor 270 to display video content, images, and a menu control interface.
  • the displayed video content can be from the video content in the broadcast signal received by the tuner and demodulator 210, or from the video content input by the communicator 220 or the external device interface 240.
  • the display 275 simultaneously displays a user manipulation interface UI generated in the display device 200 and used to control the display device 200.
  • the display 275 may include a display screen component for presenting a picture and a driving component for driving image display.
  • the display 275 may also include a projection device and a projection screen.
  • The controller can send a control signal to make the rotating assembly 276 rotate the display 275.
  • The audio processor 280 is used to receive external audio signals and, according to the standard codec protocol of the input signal, perform decompression and decoding, as well as audio data processing such as noise reduction, digital-to-analog conversion, and amplification, so as to obtain an audio signal that can be played by the speaker 286.
  • The audio processor 280 may support various audio formats, such as MPEG-2, MPEG-4, Advanced Audio Coding (AAC), High Efficiency AAC (HE-AAC), and other formats.
  • the audio output interface 285 is used to receive the audio signal output by the audio processor 280 under the control of the controller 250.
  • The audio output interface 285 may include a speaker 286, or an external audio output terminal 287 for outputting to a sound-generating device of an external device, such as a headphone output terminal.
  • the video processor 270 may include one or more chips.
  • the audio processor 280 may also include one or more chips.
  • the video processor 270 and the audio processor 280 may be separate chips, or may be integrated with the controller 250 in one or more chips.
  • the power supply 290 is used to provide power supply support for the display device 200 with power input from an external power supply under the control of the controller 250.
  • the power supply 290 may be a built-in power supply circuit installed inside the display device 200, or may be a power supply installed outside the display device 200.
  • the mobile terminal 100B may send display screen data to the display device 200 through a wireless connection, such as the Miracast protocol, to form a screen projection video stream.
  • The controller 250 can decode the projection video stream, parse the processed frames to form a projection picture, and send it to the display 275 for display.
  • the mobile terminal 100B may be a smart terminal device with display and human-computer interaction functions, such as a mobile phone, a tablet computer, and the like. Since the mobile terminal 100B has different operation modes, the formed projection screen also has different layout modes. For example, when the user holds the mobile phone in a landscape orientation for operation, the screen presented on the mobile phone has a landscape layout, that is, the width of the screen is greater than the height of the screen, and the phone is in landscape mode, as shown in FIG. 4A. When the user holds the mobile phone vertically to perform operations, the screen presented on the mobile phone has a vertical layout, that is, the width of the screen is smaller than the height of the screen, and the mobile phone is in portrait mode, as shown in Figure 4B.
  • the screen aspect ratio of a mobile phone is usually 9:16, 10:16, etc.; the screen aspect ratio of a tablet computer is 3:4, etc.
  • For smart terminal devices with a screen aspect ratio of 1:1, such as smart watches, the screen layout presented in the horizontal state and the vertical state is generally the same, and only the orientation differs when displayed on the display screen of the smart terminal device. Therefore, for a mobile terminal 100B with a display screen aspect ratio of 1:1, the projection screen formed when the screen is projected does not distinguish between the horizontal and vertical states.
  • A projection protocol can be configured between the display device 200 and the mobile terminal 100B, for example a projection protocol based on the Miracast standard, a projection protocol based on the DLNA standard, a projection protocol based on the AirPlay standard, etc.; the transmission of the screen video stream is realized through the projection protocol, so that the screen on the mobile terminal 100B can be projected to the display device 200.
  • However, the video stream projected to the display device 200 is always a 1920×1080 horizontal-screen stream, and the display device cannot obtain the horizontal/vertical relationship of the screen from the resolution or aspect ratio of the video stream, so the display device 200 cannot automatically rotate the TV screen based on the video stream.
  • When the mobile terminal is in the landscape mode, the image information sent to the display device does not contain left and right black information, as shown in FIGS. 5A and 5B.
  • When the mobile terminal is in the portrait mode, the image information sent to the display device includes left and right black information, as shown in FIGS. 6A and 6B.
  • When the mobile terminal establishes a communication connection with the display device, the mobile terminal has already added black borders to the screenshot before sending the screenshot to the display device.
  • The purpose is to ensure that, when the original display device is in the horizontal screen state, a horizontal-screen picture can be displayed regardless of whether the mobile terminal is placed horizontally or vertically.
  • In the image information, the black areas on both sides of the picture are the black information, also called black borders.
  • The display area in the middle of the picture is called the effective picture (that is, it corresponds to the screen display content of the mobile terminal).
  • The effective picture is the valid information in the image information of the mobile terminal.
  • The effective picture displays the operation screen on the mobile terminal 100B, and the black borders have different widths and heights determined according to the screen aspect ratio of the mobile terminal 100B.
  • When the display device 200 displays the projected screen, the direction corresponding to the shorter side is used as a reference, for example the height direction in the landscape state. Therefore, the height direction of the projection video stream presented by the display device 200 is generally unchanged; that is, regardless of whether the orientation of the mobile terminal 100B is horizontal or vertical, the height of the projection screen received by the display device 200 is 1080 pixels.
  • Even if the display 275 is rotated to the portrait state, it cannot present the display state shown in FIG. 6C, but is displayed according to the height of the entire image information including the left and right black information, as shown in FIG. 6B. That is, in the vertical screen state, not only are the left and right sides of the effective picture filled with black information, but the top and bottom of the projected picture also display black areas because there is no effective picture to fill them, which greatly affects the user's viewing experience.
  • The screen display resolution corresponding to the screen projection video stream received by the display device 200 is 1920×1080; that is, in the height direction, the projection picture is based on the screen of the terminal and requires 1080 pixels for display.
  • the projection screen is based on the terminal display screen plus the width of the black borders on both sides, which requires 1920 pixels for display.
  • the height of the effective area in the projection screen is 1080, and the width is scaled and displayed in proportion to the height, which will greatly waste the display area on the display 275.
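  • A rough back-of-the-envelope calculation (assuming a 9:16 terminal screen, which is not a value fixed by this application) illustrates the wasted area described above and the enlargement needed once the display is rotated to the vertical screen state:

```python
# Assumed numbers: a 9:16 phone mirrored into the fixed 1920x1080 projection stream.

stream_w, stream_h = 1920, 1080                 # projection video stream (always landscape)
phone_aspect = 9 / 16                           # assumed terminal screen aspect ratio

effective_h = stream_h                          # effective picture keeps the 1080-pixel height
effective_w = stream_h * phone_aspect           # 607.5 pixels; the rest is black border

# Fraction of a 1920x1080 landscape display actually showing terminal content:
landscape_used = effective_w / stream_w         # about 0.32

# After rotating the display to 1080x1920 portrait, enlarging by this preset multiple
# lets the effective picture fill the panel width, pushing the black borders off screen:
preset_multiple = 1080 / effective_w            # about 1.78 (exactly fills for 9:16)

print(round(effective_w), round(landscape_used, 2), round(preset_multiple, 2))
```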
  • When the mobile terminal is in the landscape mode, it presents the state shown in FIG. 4A, and when the screen is projected, the parsed frame data of the video stream sent to the display device has no left and right black borders.
  • When the display device receives image information that does not contain left and right black borders, it detects whether the display is in the horizontal screen state.
  • If the display is in the horizontal screen state, the controller 250 receives the image information and can directly present the image information on the display.
  • If the display is not in the horizontal screen state, the controller 250 may send a rotation instruction to the rotating assembly 276 to control the rotating assembly 276 to rotate the display to the horizontal screen state.
  • When the mobile terminal is in the portrait mode, it presents the state shown in FIG. 4B.
  • the parsed frame data of the video stream sent to the display device has left and right black borders and valid images.
  • When the display device receives image information with left and right black borders, it detects whether the display is in the vertical screen state.
  • the controller 250 may send a rotation instruction to the rotation component 276 to control the rotation component 276 to rotate the display 275 to the portrait state.
  • In the vertical screen state, the width of the display 275 is smaller than the height, which is consistent with the display ratio of the terminal screen.
  • The frame data in the projection video stream is enlarged so that the effective picture is displayed on the display, while the left and right black areas, because of the enlargement, fall outside the display and cannot be shown, thereby reducing the area of the black borders on both sides of the effective picture, as shown in FIG. 6C.
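  • The following minimal Python sketch (with hypothetical helper names, not taken from this application) summarizes the decision flow described above: the orientation of the display is chosen according to whether the received image information carries left and right black borders, and in the portrait case only the effective picture is enlarged so that the black borders fall outside the display.

```python
# Sketch of the controller-side decision flow; action strings stand in for real commands.

def handle_image_info(has_left_right_black: bool, display_state: str) -> list:
    """Return the ordered actions the controller would take for one frame of image info."""
    actions = []
    if not has_left_right_black:                      # terminal is in landscape mode
        if display_state != "landscape":
            actions.append("rotate display to landscape")
        actions.append("present image information directly")
    else:                                             # terminal is in portrait mode
        if display_state != "portrait":
            actions.append("rotate display to portrait")
        actions.append("enlarge effective picture by the preset multiple")
        actions.append("present enlarged picture; black borders fall outside the display")
    return actions

print(handle_image_info(has_left_right_black=True, display_state="landscape"))
```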
  • In view of this, this application provides a display device that can determine, according to whether the image information in the projection data stream contains left and right black information, whether it is necessary to rotate the display and adjust the picture, so as to maximize the use of the display area.
  • a display device provided by this application includes:
  • the rotating component is configured to drive the display to rotate so that the display is in one of a horizontal screen state or a vertical screen state;
  • a communicator, configured to be connected to a terminal;
  • a controller, referring to FIG. 7, configured to execute:
  • the user may first perform a screen projection display operation on the mobile terminal 100B to send the display screen of the mobile terminal 100B to the display device 200. For example, the user selects "Settings-Connection and Sharing-Screencasting" on the mobile phone, and selects a display device in the current network as the screencasting object in the screencasting device list to perform the screencasting operation.
  • the mobile terminal 100B After performing the screen projection operation, the mobile terminal 100B will send the displayed screen to the display device 200 through a screen projection protocol, such as the Miracast protocol or other mirroring protocols. As new interactive pictures are continuously generated during the screen projection process, the mobile terminal 100B will send the pictures to the display device 200 frame by frame to form a screen projection data stream (hereinafter referred to as a screen projection video stream).
  • users can also perform screen projection operations through third-party applications.
  • a user opens a video application, and a screencast icon is set on the video playback interface of the video application. The user can click the icon to perform the screen projection operation.
  • The screencast picture of a screencast operation performed through a third-party application is based on the video resource being played. For example, when the video resource being played is a horizontal media asset such as a movie or a TV series, the width of the effective picture in the projection screen is greater than the height; when the video resource being played is a vertical media asset such as a short video or a comic, the width of the effective picture in the projection screen is smaller than the height.
  • After the display device 200 receives the projection video stream, its controller 250 can parse the received projection video stream frame by frame to extract the initial resolution.
  • the initial resolution is the overall resolution of the first frame in the projected video stream.
  • the screen aspect ratio of the screencast video stream sent by the mobile terminal 100B is 1920:1080.
  • The controller 250 can obtain each frame of the picture by parsing the screencast video stream, and the initial resolution is extracted from the first frame picture; that is, the extracted initial resolution is 1920×1080.
  • the first frame picture is not limited to the first frame picture in the entire screen projection process, and may also be the first few frames of the projection video stream or the corresponding frame picture at a specified time node. Since the initial resolution is the overall resolution of the projected screen, it is easier to obtain the resolution information, and in practical applications, the earlier the initial resolution is obtained, the more conducive to timely detection of the effective resolution of the projected video stream. Thus, the sooner the display screen can be adjusted to a better state.
  • the sampling resolution is the resolution of the effective area (that is, the effective picture) on the sampled picture extracted from the projected video stream at a preset time interval.
  • the preset time interval for the designated sampling can be set according to the computing capability of the controller 250, for example, a picture 10s after the first frame of the picture is used as the sampling picture.
  • The pixel colors of the sampled picture can be traversed. Obviously, the pixel color value of the black area is black, while the pixel color values of the effective area are usually not all black. Therefore, by traversing each pixel of the sampled picture, it is possible to determine that the continuous black rectangular areas are the black borders and that the remaining area is the effective area.
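  • As an illustration of the traversal just described (a simplified sketch, not this application's implementation), the snippet below walks over a sampled picture modelled as a 2D list of grayscale values and reports the size of the non-black bounding box as the sampling resolution:

```python
# Determine the sampling resolution (effective-area size) by traversing pixel values.
# A value of 0 is treated as black; real frames would use RGB/YUV thresholds instead.

def sampling_resolution(picture):
    rows = len(picture)
    cols = len(picture[0]) if rows else 0
    xs = [x for y in range(rows) for x in range(cols) if picture[y][x] != 0]
    ys = [y for y in range(rows) for x in range(cols) if picture[y][x] != 0]
    if not xs:                        # entirely black sample: no effective area found
        return 0, 0
    width = max(xs) - min(xs) + 1     # effective-picture width in pixels
    height = max(ys) - min(ys) + 1    # effective-picture height in pixels
    return width, height

sample = [[0, 5, 5, 5, 5, 0]] * 3     # one black column on each side of a 3x6 sample
print(sampling_resolution(sample))    # (4, 3)
```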
  • After extracting the initial resolution and the sampling resolution from the projection video stream, the sampling resolution can be compared with the initial resolution, and the picture conditions of the current projection video stream (such as scale, direction, etc.) can be determined based on the difference between the sampling resolution and the initial resolution, so as to choose whether to display according to the effective picture.
  • If the sampling resolution is equal to the initial resolution, that is, the resolution of the first frame picture is 1920×1080 and the resolution of the effective area determined in the sampled picture is also 1920×1080, it means that there are no black borders in the current projection picture.
  • In this case the projection picture can fill the display area; that is, the display requirement of the projection picture can be met by displaying it directly with the display 275 in the horizontal screen state.
  • The resolution of the display screen is usually represented by the number of pixels occupied by the width and height of the screen, for example, 1920×1080.
  • The resolution of 1920×1080 is equal to 1080×1920. Therefore, in the actual comparison process, part of the values in the resolution can be extracted, or the resolution can be converted into other comparable values, and then the comparison can be performed to obtain the comparison result of the sampling resolution and the initial resolution.
  • the width or height of the overall picture can be extracted from the initial resolution and compared with the height or width of the effective picture extracted in the sampling resolution to determine its effective resolution.
  • If the value corresponding to the sampling resolution is greater than or equal to the value corresponding to the initial resolution, it means that the black border area in the current projection picture is already at a minimum under the premise of ensuring complete display. In this case, even if the picture is zoomed, the area of the black border region will not increase or decrease, and it is determined that the effective resolution of the video stream has not changed and remains the initial resolution.
  • if the value corresponding to the sampling resolution is smaller than that of the initial resolution, the current projection picture is filled with a large black area, and the display device 200 can improve the display effect by rotating the display 275 and zooming the projection picture.
  • the effective resolution detection method for a projection video stream provided by this application can, after receiving the projection video stream, extract the initial resolution and the sampling resolution from it and compare them to determine the effective resolution of the current video stream. If the sampling resolution is less than the initial resolution, the effective resolution of the video stream is set to the sampling resolution.
  • the projected screen can be displayed according to the effective resolution, so as to adapt to the display direction of the projected screen, reduce the impact of black borders, and achieve a better user experience.
  • in one implementation, to compare the sampling resolution with the initial resolution, data that can be compared is extracted from each of them. That is, as shown in FIG. 8, the step of comparing the sampling resolution with the initial resolution further includes: extracting a reference value from the initial resolution; traversing the effective picture of the sampled picture to generate a comparison value; if the comparison value is greater than or equal to the reference value, determining that the sampling resolution is greater than or equal to the initial resolution; and if the comparison value is less than the reference value, determining that the sampling resolution is less than the initial resolution.
  • after the initial resolution is obtained, part of its data can be extracted as the reference value. For example, when the height of the projection picture does not change within the projection video stream, the overall picture height of the first frame is used as the reference value. The color values of the pixels in the sampled picture are then traversed to determine the extent of the effective picture, and the comparison value is generated.
  • the comparison value is the picture width of the effective picture in the sampled picture.
  • for example, if the initial resolution extracted from the first frame is 1920×1080, the reference value is 1080; if traversing the pixel color values of the sampled picture shows that the effective picture, with the black areas removed, has a resolution of 960×1080, the comparison value is 960.
  • after the reference value and the comparison value are determined, the relationship between the sampling resolution and the initial resolution can be determined directly by comparing the two values: if the comparison value is greater than or equal to the reference value, the sampling resolution is greater than or equal to the initial resolution; if the comparison value is less than the reference value, the sampling resolution is less than the initial resolution.
  • for example, if the reference value is 1080 and the comparison value is 960, then since 960 is less than 1080 the sampling resolution is less than the initial resolution, and the effective resolution of the current projection video stream is set to the sampling resolution. In the subsequent display process, the projection picture of the projection video stream can be zoomed and displayed according to this effective resolution.
  • to calculate the comparison value, in some embodiments of this application, as shown in FIG. 9A, the method further includes:
  • S421: Extract black-edge data by traversing the number of consecutive black pixels in the sampled picture.
  • S422: Extract initial picture data.
  • S423: Calculate the reference value and the comparison value.
  • as shown in FIG. 9B, the range of the continuous black area can be detected starting from the left side of the image corresponding to the sampled picture, giving the left black-border range: left black-border width H_L and left black-border height V_L. The range of the continuous black area is then detected from the right side of the image, giving the right black-border range: right black-border width H_R and right black-border height V_R, which together form the black-edge data.
  • the initial picture data can also be extracted from the first frame picture.
  • the initial picture data includes the initial width H_0 and the initial height V_0 of the first frame picture, which are used to calculate the reference value and the comparison value.
  • the reference value and the comparison value can be calculated according to the following formulas: the reference value S_0 = V_0 = V_L = V_R, and the comparison value S_X = H_0 − (H_L + H_R). A sketch of this calculation is given below.
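  • As a non-authoritative sketch of these two formulas, with variable names mirroring the description; the 480-pixel border widths in the example are an assumption chosen to match the 960-wide effective picture discussed above:

```python
def reference_and_comparison(h0: int, v0: int, hl: int, hr: int):
    """Reference value S0 is the overall frame height V0; comparison value
    SX is the frame width H0 minus the left and right black-border widths."""
    s0 = v0
    sx = h0 - (hl + hr)
    return s0, sx

# Example: a 1920x1080 first frame and a sampled picture with 480-pixel
# black borders on each side.
s0, sx = reference_and_comparison(h0=1920, v0=1080, hl=480, hr=480)
# s0 == 1080 and sx == 960, so sx < s0 and the effective resolution is
# taken to be the sampling resolution (960 x 1080).
```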
  • in one implementation, as shown in FIG. 10, before the step of calculating the reference value and the comparison value, the method further includes:
  • S4231: Determine whether the sampled picture meets a preset execution condition.
  • S4232: If the sampled picture does not meet the preset condition, set the effective resolution to the initial resolution.
  • S4233: If the sampled picture meets the preset condition, perform the step of calculating the reference value and the comparison value.
  • in this embodiment, after a sampled picture is obtained, the picture display condition in the sampled picture can be checked first to determine whether it meets the preset execution condition, namely that the left black-border width H_L equals the right black-border width H_R and that the left black-border height V_L and the right black-border height V_R are both equal to the initial height V_0.
  • if the sampled picture meets this condition, the effective resolution can be judged further; that is, the step of calculating the reference value and the comparison value is performed so that the sampling resolution can be compared with the initial resolution (see the sketch below).
  • if the sampled picture does not meet the condition, the two black borders differ in width or height, which is usually caused by the content displayed on the mobile terminal 100B affecting the sampling result; since the effective resolution is hard to judge in this case, the projection picture is output at the original resolution, that is, the effective resolution is set to the initial resolution.
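  • A minimal sketch of this precondition check; the symmetric-border condition comes from the description, while the function name and the fallback comment are assumptions:

```python
def meets_execution_condition(hl: int, hr: int, vl: int, vr: int, v0: int) -> bool:
    """True when the left and right black borders are symmetric and both span
    the full initial frame height, i.e. H_L == H_R and V_L == V_R == V_0."""
    return hl == hr and vl == vr == v0

# If the condition fails (for example because dark picture content distorted
# the border measurement), the effective resolution is kept at the initial
# resolution for this sample.
```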
  • in practice, the picture content projected by the mobile terminal 100B can easily affect the determination of the effective picture area in the sampled picture.
  • for example, if the display of the mobile terminal 100B happens to show a black picture at the moment a picture is sampled, and the range is still determined from consecutive black pixels at the edges of the projection picture, the black picture content will distort the determination of the black-border range and, in turn, the final sampling-resolution result. Therefore, to alleviate the influence of black picture content on the sampling resolution, the step of extracting the sampling resolution from the projection video stream further includes:
  • S201: Acquire multiple frames of sampled pictures from the projection video stream at equal time intervals.
  • S202: Calculate the sampling resolution of each frame of sampled picture separately.
  • by presetting the sampling time interval, the projection video stream can be sampled multiple times and the sampling resolution of the picture extracted in each sample. For example, one frame image is acquired every interval T, and the sampling resolution values obtained with the above resolution algorithm are S_x0, S_x1, …, S_xn respectively.
  • multiple frames of sampled pictures can be collected as the display picture on the mobile terminal 100B changes.
  • usually the multiple sampled pictures are not all affected by black picture content, so collecting multiple frames reduces the influence of picture content on the judgment of the black-border range and improves the accuracy of the effective-resolution judgment.
  • further, as shown in FIG. 11, after the sampling resolutions of the multiple frames are obtained, each can be compared with the initial resolution to obtain an overall comparison result; that is, the method further includes: S211, comparing the sampling resolution of each frame of sampled picture with the initial resolution; S212, if all of the sampling resolutions are greater than or equal to the initial resolution, setting the effective resolution of the video stream to the initial resolution; and S213, if all of the sampling resolutions are less than the initial resolution, setting the effective resolution of the video stream to the sampling resolution (a sketch of this decision follows below).
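  • A hedged sketch of this multi-frame decision, assuming every sampled frame has already been reduced to a comparison value and the first frame to a reference value; the handling of mixed results is an assumption, since the description only spells out the two consistent cases:

```python
def effective_from_samples(reference, samples):
    """Decide the effective value from several sampled comparison values.

    If every sample is >= the reference, the effective resolution stays at the
    initial resolution; if every sample is < the reference, the (last) sampled
    value is used.  Mixed results fall back to the initial resolution here.
    """
    if all(s >= reference for s in samples):
        return reference
    if all(s < reference for s in samples):
        return samples[-1]
    return reference

print(effective_from_samples(1080, [960, 960, 960]))  # -> 960
print(effective_from_samples(1080, [1920, 1920]))     # -> 1080
```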
  • in one implementation, the method further includes: if the effective resolution of the video stream is set to the sampling resolution, controlling the display 275 of the display device 200 to rotate to the portrait state.
  • for example, the initial resolution is 1920×1080 and the sampling resolution is 960×1080; by calculating the reference value and the comparison value, the width of the effective picture in the current sampled picture is found to be S_X = H_0 − (H_L + H_R) = 960 while the first-frame height is 1080, so the effective resolution is the sampling resolution of 960×1080 and the corresponding picture on the mobile terminal 100B is a 960×1080 portrait picture.
  • a portrait picture is better suited to display in the portrait state. Therefore, after determining that the effective resolution is the sampling resolution, the controller 250 can send a control instruction to the rotating component 276 so that the rotating component 276 drives the display 275 to rotate counterclockwise (or clockwise) to the portrait state.
  • after the display 275 has rotated to the portrait state, the projection picture can be displayed at an aspect ratio of 960:1080.
  • however, because the screen of the display 275 is large, its display resolution is usually 3840×2160 in the landscape state (2160×3840 in the portrait state). Therefore, to present the projection picture with a resolution of 960×1080, the projection picture must be scaled so that the display 275 can show it completely (see the sketch below).
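  • Purely as an illustration of this zooming step, assuming a 960×1080 effective picture and a 2160×3840 portrait panel; choosing the largest uniform scale at which the picture still fits is one straightforward way to size it, and the helper name and numbers are assumptions:

```python
def fit_scale(effective_w: int, effective_h: int, panel_w: int, panel_h: int) -> float:
    """Largest uniform scale at which the effective picture still fits the panel."""
    return min(panel_w / effective_w, panel_h / effective_h)

scale = fit_scale(960, 1080, 2160, 3840)              # min(2.25, 3.55...) == 2.25
target = (round(960 * scale), round(1080 * scale))    # (2160, 2430)
# The 960x1080 projection picture would be enlarged 2.25x to 2160x2430 so that
# the portrait panel (2160 pixels wide) shows it at full width.
```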
  • as can be seen from the above, the effective resolution detection method for a projection video stream provided by this application can, after receiving the projection video stream, extract the initial resolution and the sampling resolution from it and compare them to determine the effective resolution of the current video stream. If the sampling resolution is less than the initial resolution, the effective resolution of the video stream is set to the sampling resolution.
  • the projected screen can be displayed according to the effective resolution, so as to adapt to the display direction of the projected screen, reduce the impact of black borders, and achieve a better user experience.
  • in another exemplary implementation, this application provides another way of processing the projection picture, specifically another display device, including:
  • a display;
  • a rotating component, configured to drive the display to rotate so that the display is in one of a landscape state or a portrait state;
  • a user interface, configured to connect to a terminal;
  • and a controller which, referring to FIG. 12, is configured to execute:
  • S1′: Receive the image information sent by the terminal, where, when the terminal is in portrait mode, the image information includes valid information and left and right black area information.
  • the user may first perform a screen projection display operation on the mobile terminal 100B to send the display screen of the mobile terminal 100B to the display device 200. For example, the user selects "Settings-Connection and Sharing-Screencasting" on the mobile phone, and selects a display device in the current network as the screencasting object in the screencasting device list to perform the screencasting operation.
  • after performing the screen projection operation, the mobile terminal 100B sends the displayed picture to the display device 200 through a screen projection protocol, such as the Miracast protocol or another screen projection or mirroring protocol. As new interactive pictures are continuously generated during the projection process, the mobile terminal 100B sends them to the display device 200 frame by frame, forming a projection video stream.
  • S2′: If the current rotation state of the display does not match the portrait mode of the terminal, rotate the display to the portrait state. In some exemplary implementations, the controller obtains the rotation-angle callback information of the display and determines the target rotation state of the display according to whether left and right black borders are included in the image information obtained from the mobile terminal.
  • if the image information does not include left and right black information, the mobile terminal is currently in landscape mode. When the current rotation state of the display is detected as the landscape state, the two match and there is no need to rotate the display.
  • when the current rotation state of the display is detected as the portrait state, the two do not match, and the display needs to be rotated to the landscape state.
  • if the image information contains left and right black areas, the mobile terminal is currently in portrait mode. When the current rotation state of the display is detected as the landscape state, the two do not match, and the display needs to be rotated to the portrait state.
  • when the current rotation state of the display is detected as the portrait state, the two match and there is no need to rotate the display (this matching rule is sketched below).
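  • A non-authoritative sketch of this matching rule; the enum values and function names are assumptions, while the rule itself (rotate only when the detected state differs from the state implied by the left and right black borders) follows the description:

```python
from enum import Enum

class ScreenState(Enum):
    LANDSCAPE = "landscape"
    PORTRAIT = "portrait"

def target_state(has_left_right_black: bool) -> ScreenState:
    """Left and right black borders imply the terminal is in portrait mode."""
    return ScreenState.PORTRAIT if has_left_right_black else ScreenState.LANDSCAPE

def needs_rotation(current: ScreenState, has_left_right_black: bool) -> bool:
    return current != target_state(has_left_right_black)

# Example: the display is currently landscape and the incoming frames carry
# black borders on both sides, so the display should rotate to portrait.
assert needs_rotation(ScreenState.LANDSCAPE, has_left_right_black=True)
```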
  • after the user performs a screen projection operation through the mobile terminal 100B, the mobile terminal 100B sends the projection picture to the display device 200 through a mirroring protocol or a screen projection protocol.
  • the controller 250 may receive the image information sent by the terminal and detect the current rotation state of the display 275. The detection of the rotation state of the display 275 can be performed by a sensor built into the display device 200.
  • sensor devices such as a gyroscope and a gravity acceleration sensor can be set on the display 275 of the display device 200, and the posture data of the display 275 relative to the direction of gravity can be determined by measuring the angular acceleration or the direction of gravity. Then, the detected posture data is compared with the posture data in the horizontal screen state and the vertical screen state, respectively, to determine the current rotation state of the display 275.
  • alternatively, a grating angle sensor, a magnetic-field angle sensor, or a sliding-resistance angle sensor can be arranged on the rotating component 276; the angle through which the rotating component 276 has rotated is measured and compared with the angles of the landscape state and the portrait state to determine the current rotation state of the display 275 (a small sketch of this classification follows below).
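  • As an illustrative sketch only; the reference angles and tolerance below are assumptions, since the description only states that the measured angle is compared with the landscape-state and portrait-state angles:

```python
def classify_rotation(measured_deg: float, landscape_deg: float = 0.0,
                      portrait_deg: float = 90.0, tolerance: float = 5.0):
    """Map an angle reported by the rotating component's angle sensor to a
    rotation state, or None while the display is still mid-rotation."""
    if abs(measured_deg - landscape_deg) <= tolerance:
        return "landscape"
    if abs(measured_deg - portrait_deg) <= tolerance:
        return "portrait"
    return None
```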
  • in some exemplary implementations, the controller is further configured to: calculate the rotation direction and rotation angle of the projection picture, where the rotation direction of the projection picture is opposite to the rotation direction of the display and the rotation angle of the projection picture is equal to the rotation angle of the display; and rotate the projection picture according to that rotation direction and rotation angle.
  • S3′: Control the display to present the projection picture based on the valid information, where the projection picture is obtained by enlarging the valid information by a preset multiple. After the display device 200 receives the projection video stream, its controller 250 can analyze the received projection video stream frame by frame.
  • the screen aspect ratio of the projection video stream sent by the mobile terminal 100B is 1920:1080, and the controller 250 may obtain frame images by analyzing the projection video stream after receiving the projection video stream.
  • the resolution of the extracted frame image is 1920×1080, which is the initial resolution.
  • the frame images in the projected video stream can be sampled again to extract the effective resolution.
  • the frame of picture used for sampling is called the sampling picture.
  • the effective resolution is the resolution of the effective picture on the frame data extracted from the projection video stream. Specifically, the effective resolution can be obtained in the projection video stream at a preset time interval.
  • in the sampling process, to determine the effective picture within the sampled picture, the pixel colors of the sampled picture can be traversed.
  • the pixel color of the black area is black, while the pixel colors of the effective area are usually not all black; therefore, by traversing every pixel of the sampled picture, the black rectangular regions can be identified as black borders and the remaining region as the effective picture.
  • the color filled in the black area is not limited to black.
  • the black area may be gray, blue, or other colors, and may also be gradient colors, specific patterns, and so on. For these situations, this application still refers to it as a black area or a black border for the convenience of subsequent description.
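  • Where the border fill is a solid non-black colour such as gray or blue, the same edge traversal can be generalised by comparing edge columns against a sampled border colour instead of pure black; the sketch below makes that assumption (and assumes a corner pixel lies inside the border), so gradient or patterned fills would need a more elaborate test:

```python
import numpy as np

def detect_colored_borders(frame: np.ndarray, tolerance: int = 12):
    """Like the black-border scan, but uses the frame's own top-left pixel as
    the border colour so gray, blue, or similar solid fills are handled."""
    height, width, _ = frame.shape
    border_color = frame[0, 0].astype(int)
    diff = np.abs(frame.astype(int) - border_color).max(axis=2)   # per-pixel distance
    border_columns = (diff <= tolerance).all(axis=0)              # fully-border columns

    left = 0
    while left < width and border_columns[left]:
        left += 1
    right = 0
    while right < width - left and border_columns[width - 1 - right]:
        right += 1
    return left, right, width - left - right, height
```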
  • after the frame image resolution and the effective resolution have been obtained for the projection video stream, the effective resolution can be compared with the frame image resolution to determine, from the difference between them, the condition of the effective picture in the current projection video stream (such as its ratio and orientation), so that the device can choose whether to display according to the effective picture.
  • if the effective resolution is equal to the frame image resolution, that is, the resolution of the first frame is 1920×1080 and the resolution of the effective area determined in the sampled picture is also 1920×1080, there are no black borders on the current projection picture and the projection picture can fill the display area; displaying it in the landscape state of the display 275 directly meets the display requirement.
  • the resolution of a display picture is usually represented by the number of pixels occupied by its width and height, for example, 1920×1080.
  • compared purely as numbers, a resolution of 1920×1080 is the same as 1080×1920. Therefore, in the actual comparison process, part of the values in each resolution can be extracted, or the resolutions can be converted into other comparable values, before the effective resolution is compared with the frame image resolution to obtain the comparison result.
  • the width or height of the overall picture can be extracted from the frame image resolution and compared with the height or width of the effective picture to determine its effective resolution.
  • as can be seen from the above, the effective resolution detection method for a projection video stream provided by this application can, after receiving the projection video stream, extract the frame image resolution and the effective resolution from it and compare them to determine the effective resolution of the current video stream.
  • the projected screen can be displayed according to the effective resolution, so as to adapt to the display direction of the projected screen, reduce the impact of black borders, and achieve a better user experience.
  • to calculate the magnification factor for the effective picture, in some embodiments of this application, as shown in FIG. 13, the method further includes: S131, extracting black-edge data by traversing the number of consecutive black pixels in the sampled picture; S132, extracting initial picture data; S133, calculating the height ratio R_H and the width ratio R_W between the display device and the effective picture; and S134, comparing R_H with R_W.
  • the width of the image frame transmitted by the mobile terminal is denoted H_0, and its height H_ph (also written V_L); the width of the display device in the portrait state is W_tv, and its height is H_tv.
  • the range of the continuous black area can be detected starting from the left side of the image, giving the left black-border range: width a and height H_ph.
  • the range of the continuous black area is then detected from the right side of the image, giving the right black-border range: width a and height H_ph, which together form the black-edge data.
  • in the image information sent by the mobile terminal, the resolution of the effective picture is therefore (H_0 − 2a) × H_ph.
  • the resolution of the display device is W_tv × H_tv. The height ratio between the display device and the valid information is R_H = H_tv / H_ph, and the width ratio is R_W = W_0 / (W_0 − 2a); if R_H > R_W, the valid information is enlarged by R_W times to obtain the projection picture, and if R_H < R_W, it is enlarged by R_H times (a sketch of this rule follows below).
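  • A minimal sketch of this magnification rule; the description writes the width ratio with W_0, which is read here as the width of the transmitted frame (H_0), and that reading, the helper name, and the example numbers are all assumptions:

```python
def projection_scale(h_tv: int, h_ph: int, w0: int, a: int) -> float:
    """Enlarge the valid information by R_W when R_H > R_W, otherwise by R_H,
    i.e. by the smaller of the two ratios."""
    r_h = h_tv / h_ph          # display height : effective-picture height
    r_w = w0 / (w0 - 2 * a)    # frame width : effective-picture width
    return r_w if r_h > r_w else r_h

# Example: a 1920x1080 transmitted frame with 480-pixel borders on each side
# shown on a 2160x3840 portrait panel: R_H = 3840/1080 ~= 3.56 and
# R_W = 1920/960 = 2.0, so the valid information is enlarged 2.0x.
print(projection_scale(h_tv=3840, h_ph=1080, w0=1920, a=480))
```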
  • as shown in FIG. 14, the display device provided by this application can determine from the image information whether the mobile terminal is in portrait mode and automatically adjust the rotation state of the display, thereby using a larger display space to present the projection picture and alleviating the problem that a traditional smart TV cannot display the projection picture properly.
  • in some screen projection protocols, the display device can obtain the real screen video stream of the mobile terminal. The Airplay protocol, for example, sends the image data of the terminal's own screen to the display device, so the display device can determine directly from the aspect ratio of the image whether it is a landscape or a portrait image.
  • in other projection protocols, the display device cannot obtain the real screen video stream of the mobile terminal; that is, the mobile device processes the screen data before transmitting the projection data to the display device. Under the Miracast standard projection protocol, for example, the smart device always sends landscape media resources to the display device: whether the terminal screen is placed horizontally or vertically, the display device always receives landscape video data.
  • for example, when the mobile terminal is placed vertically for mirroring, black borders are added to the two sides of the screen data before it is sent to the display device. The original intention of this is to adapt to display devices placed horizontally.
  • however, after display devices became able to rotate between landscape and portrait, the current Miracast standard projection protocol has not yet been adapted to portrait display devices and still sends landscape resources with black borders to the display device.
  • in the implementations shown next, when the display device receives the projection resource it judges only whether the image has black borders; it therefore cannot know how the mobile terminal is actually placed, and in this case it cannot rotate according to the effective picture in the image. That is, regardless of whether the phone is in landscape or portrait mode, the received landscape projection data is displayed directly.
  • the display device just does not rotate according to the projection data. Regardless of whether the display device is in a landscape or portrait state, the user can still actively issue a rotation command to rotate the display device to a different state than before.
  • the rotating component 276 can rotate the display screen to the vertical screen state, that is, the state where the vertical side length of the screen is greater than the horizontal side length, or it can rotate the screen to the horizontal screen state, that is, the screen horizontally. The state where the side length is greater than the vertical side length.
  • for example, if the display device is in the landscape position, it displays the picture shown in FIG. 5A or FIG. 6A.
  • in FIG. 5A, the mobile terminal is placed horizontally, and the projection data stream is a full-screen landscape resource without black borders on either side.
  • in FIG. 6A, the mobile terminal is placed vertically, and the projection data stream is a landscape resource in which the effective picture, with black borders on both sides, occupies only the middle of the stream. In both placements, from the display device's point of view, what it receives is a landscape projection resource.
  • if the display device is in the portrait position, it displays the picture shown in FIG. 5B or FIG. 6B. In the case of FIG. 6B, the effective picture occupies only a small part of the screen, so to improve the user experience a control instruction can be issued through the remote control, voice, gestures, or the like to enlarge the current mirrored interface.
  • a zoom ruler is shown on the user interface of the display device, as shown in FIG. 15A.
  • after the zoom operation is received, the operation bar can optionally disappear automatically after a preset interval. For example, the ruler can be called up with the remote control's up, down, left, right, and OK keys, and the picture can be manually enlarged to an appropriate multiple with the remote control, as shown in FIGS. 15B and 15C.
  • in some exemplary implementations, the display device shows the zoom ruler only in the portrait state and does not show it in the landscape state (a behavioural sketch follows below).
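  • Purely as a behavioural sketch of this zoom ruler; the key names, zoom steps, and auto-hide timeout are assumptions, and no particular UI framework is implied:

```python
import time

class ZoomRuler:
    """Call up and step a zoom ruler for the mirrored picture in portrait mode."""

    def __init__(self, steps=(1.0, 1.25, 1.5, 2.0, 2.25), hide_after_s=5.0):
        self.steps = steps
        self.index = 0
        self.visible = False
        self.hide_after_s = hide_after_s
        self._last_input = 0.0

    def handle_key(self, key: str, display_is_portrait: bool) -> float:
        """Remote-control keys call up the ruler and step the zoom multiple."""
        if not display_is_portrait:        # the ruler is only offered in portrait
            self.visible = False
            return self.steps[self.index]
        self.visible = True
        self._last_input = time.monotonic()
        if key in ("UP", "RIGHT") and self.index < len(self.steps) - 1:
            self.index += 1
        elif key in ("DOWN", "LEFT") and self.index > 0:
            self.index -= 1
        return self.steps[self.index]

    def tick(self) -> None:
        """Hide the ruler automatically after the preset idle interval."""
        if self.visible and time.monotonic() - self._last_input > self.hide_after_s:
            self.visible = False
```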
  • in other implementations, if a rotation instruction from the portrait state to the landscape state is received from the user during projection, the display rotates from the portrait state to the landscape state, the mirrored display picture is reset at the ConfigurationChange event, and the new display effect is shown after the rotation is completed.
  • in a specific implementation, this application also provides a computer storage medium, where the computer storage medium may store a program which, when executed, may perform some or all of the steps in each embodiment of the method provided in this application.
  • when the controller of the display device provided by this application runs the computer program instructions, the controller performs the steps it is configured to perform as described in this application.
  • the storage medium can be a magnetic disk, an optical disc, a read-only memory (ROM) or a random access memory (RAM), etc.
  • the technology in the embodiments of the present application can be implemented by means of software plus a necessary general hardware platform.
  • the technical solutions in the embodiments of the present application can be embodied in the form of software products, which can be stored in a storage medium such as a ROM/RAM, magnetic disk, or optical disc, and which include a number of instructions to enable a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the various embodiments or some parts of the embodiments of the present application.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Hardware Design (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

This application provides a display device, including: a display; a rotating component; and a controller configured to receive image information sent by a terminal, where, when the terminal is in portrait mode, the image information includes valid information and left and right black information, the valid information corresponding to the screen display content of the terminal, and to control the display to present a projection picture based on the valid information, where the projection picture is obtained by enlarging the valid information by a preset multiple.

Description

一种显示设备及投屏方法
本申请要求在2020年4月24日提交中国专利局、申请号为202010334727.9、申请名称为“一种显示设备及投屏方法”的中国专利申请的优先权,以及在2020年4月24日提交中国专利局、申请号为202010331501.3、申请名称为“一种投屏视频流有效分辨率检测方法及显示设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及智能电视技术领域,尤其涉及一种显示设备及投屏方法。
背景技术
投屏是一种终端与显示设备的互动操作。一般利用无线局域网络传递视频流,以通过显示设备展示终端设备上的画面。以手机投屏为例,对于连接在同一个WiFi网络中的手机和智能电视,可以通过手机端执行投屏操作指令,以将手机端显示的画面以视频流的方式发送给智能电视,以利用智能电视的大屏幕,获得更好的用户体验。
然而,手机等终端的画面显示比例与显示设备的屏幕比例往往存在差异。例如,常规操作下,手机的屏幕显示宽高比为1080:1940;而智能电视的显示器宽高比为1940:1080,即手机端的画面是竖向的状态,而智能电视的画面是横向的状态。因此,在通过智能电视投屏显示终端画面时,容易因终端画面宽高比与显示器宽高比不匹配,而无法正常显示投屏画面。
为了将手机上的画面完全显示,需要以手机画面的高度为准,对画面进行缩放。但在对投屏画面进行缩放时,画面比例的差异将导致智能电视显示的画面两侧拥有较大的黑色区域,不仅降低用户的观影体验,而且浪费屏幕上的显示空间。
发明内容
本申请提供了一种显示设备及投屏方法,以解决传统显示设备显示投屏画面时,浪费屏幕上的显示空间的问题。
一方面,本申请提供一种显示设备,其特征在于,包括:
显示器;
旋转组件,被配置为带动所述显示器旋转,以使所述显示处于横屏状态或竖屏状态中的一种旋转状态;
用户接口,所述通信息被配置为连接到终端;
控制器,被配置为:
接收所述终端发送的图像信息,其当所述终端处于纵向模式时,所述图像信息包括有效信息和左、右黑色区域;
若所述显示器的当前旋转状态与所述终端的纵向模式不匹配时,将所述显示器旋转至竖屏状态;
基于所述有效信息控制所述显示器呈现投屏画面,其中,所述投屏画面为所述有效信息放大预设倍数得到的。
另一方面,本申请还提供一种投屏方法,应用于显示设备,其特征在于,包括:
接收所述终端发送的图像信息,其中,当所述终端处于纵向模式时,所述图像信息包括有效信息和左、右黑色信息,所述有效信息对应于所述终端的屏幕显示内容;
若所述显示器的当前旋转状态与所述终端的纵向模式不匹配时,将所述显示器旋转至竖屏状态;
基于所述有效信息控制所述显示器呈现投屏画面,其中,所述投屏画面为所述有效信息放大预设倍数得到的。
由以上技术方案可知,本申请第一方面提供的显示设备及投屏方法,显示器的控制器可以接收所述终端发送的图像信息,其当所述终端处于纵向模式时,所述图像信息包括有效信息和左、右黑色信息;若所述显示器的当前旋转状态与所述终端的纵向模式不匹配时,将所述显示器旋转至竖屏状态;基于所述有效信息控制所述显示器呈现投屏画面,其中,所述投屏画面为所述有效信息放大预设倍数得到的。本申请提供的显示设备可以根据图像信息判断移动终端是否处于纵向模式,并自动调整显示器的旋转状态,从而使用更大的显示空间显示投屏画面,缓解传统智能电视无法正常显示投屏画面的问题。
附图说明
为了更清楚地说明本申请的技术方案,下面将对实施例中所需要使用的附图作简单地介绍,显而易见地,对于本领域普通技术人员而言,在不付出创造性劳动的前提下, 还可以根据这些附图获得其他的附图。
图1A为本申请一种显示设备的应用场景图;
图1B为本申请一种显示设备的后视图;
图2为本申请控制装置的硬件配置框图;
图3为本申请显示设备存储器中操作***的架构配置框图;
图4A为本申请移动终端横向模式示意图;
图4B为本申请移动终端纵向模式示意图;
图5A为本申请一实施例中移动终端处于横向模式下发送图像信息至处于横屏状态的显示设备的示意图;
图5B为本申请一实施例中移动终端处于横向模式下发送图像信息至处于竖屏状态的显示设备的示意图;
图6A为本申请一实施例中移动终端处于纵向模式下发送图像信息至处于横屏状态的显示设备的示意图;
图6B为根据投屏协议在显示设备为竖屏状态下呈现画面的效果示意图;
图6C为为本申请一实施例中移动终端处于纵向模式下发送图像信息至处于竖屏状态的显示设备的示意图;
图7为本申请一种投屏视频流有效分辨率检测方法流程示意图;
图8为本申请对比采样分辨率与初始分辨率流程示意图;
图9A为本申请计算基准值和对比值的流程示意图;
图9B为本申请黑边数据和初始画面数据示意图;
图10为本申请判断预设执行条件的流程示意图;
图11为本申请对比多个采样分辨率和初始分辨率的流程示意图;
图12为本申请一示例性的显示设备的控制器执行的流程示意图;
图13为本申请根据显示设备和有效画面的高度和宽度的比例确定有效信息放大倍数的流程示意图;
图14为本申请一种显示设备的结构示意图;
图15A-15C为本申请一种显示设备在竖屏情况下接收到投屏画面的用户界面示意图。
具体实施方式
下面将详细地对实施例进行说明,其示例表示在附图中。下面的描述涉及附图时,除非另有表示,不同附图中的相同数字表示相同或相似的要素。以下实施例中描述的实施方式并不代表与本申请相一致的所有实施方式。仅是与权利要求书中所详述的、本申请的一些方面相一致的***和方法的示例。
本申请提供的技术方案中,所述显示设备可以为智能电视等带有较大屏幕、为用户呈现视频和音频信号的电器设备。显示设备可以拥有独立的操作***,并支持功能扩展。可以根据用户需要在显示设备中安装各种应用程序,例如,传统视频应用、短视频等社交应用以及漫画、看书等阅读应用。这些应用可利用显示设备的屏幕展示应用画面,为用户提供更丰富的媒体资源。同时,显示设备还可以与不同的终端进行数据交互和资源共享。例如,智能电视可以通过局域网、蓝牙等无线通信方式与手机连接,从而播放手机中的资源或者直接进行投屏显示手机上的画面。
为方便用户在显示器不同的横竖屏状态展示目标媒资详情页,便于提升显示设备在不同观看状态时的用户观看体验,本申请实施例提供了一种显示设备及计算机存储介质,显示设备,如旋转电视。需要说明的是,本实施例提供的方法不仅适用于旋转电视,还适用于其它显示设备,如计算机、平板电脑等。
本申请各实施例中使用的术语“模块”,可以是指任何已知或后来开发的硬件、软件、固件、人工智能、模糊逻辑或硬件或/和软件代码的组合,能够执行与该元件相关的功能。
本申请各实施例中使用的术语“遥控器”,是指电子设备(如本申请中公开的显示设备)的一个组件,该组件通常可在较短的距离范围内无线控制电子设备。该组件一般可以使用红外线和/或射频(RF)信号和/或蓝牙与电子设备连接,也可以包括WiFi、无线USB、蓝牙、动作传感器等功能模块。例如:手持式触摸遥控器,是以触摸屏中用户界面取代一般遥控装置中的大部分物理内置硬键。
本申请各实施例中使用的术语“手势”,是指用户通过一种手型的变化或手部运动等动作,用于表达预期想法、动作、目的/或结果的用户行为。
本申请各实施例中使用的术语“硬件***”,可以是指由集成电路(Integrated Circuit,IC)、印刷电路板(Printed circuit board,PCB)等机械、光、电、磁器件 构成的具有计算、控制、存储、输入和输出功能的实体部件。在本申请各个实施例中,硬件***通常也会被称为主板(motherboard)或主芯片或控制器。
参见图1A,为本申请一些实施例提供的一种显示设备的应用场景图。如图1所示,控制装置100和显示设备200之间可以有线或无线方式进行通信。
其中,控制装置100被配置为控制显示设备200,其可接收用户输入的操作指令,且将操作指令转换为显示设备200可识别和响应的指令,起着用户与显示设备200之间交互的中介作用。如:用户通过操作控制装置100上频道加减键,显示设备200响应频道加减的操作。
控制装置100可以是遥控器100A,包括红外协议通信或蓝牙协议通信,及其他短距离通信方式等,通过无线或其他有线方式来控制显示设备200。用户可以通过遥控器上按键、语音输入、控制面板输入等输入用户指令,来控制显示设备200。如:用户可以通过遥控器上音量加减键、频道控制键、上/下/左/右的移动按键、语音输入按键、菜单键、开关机按键等输入相应控制指令,来实现控制显示设备200的功能。
控制装置100也可以是智能设备,如移动终端100B、平板电脑、计算机、笔记本电脑等。例如,使用在智能设备上运行的应用程序控制显示设备200。该应用程序通过配置可以在与智能设备关联的屏幕上,通过直观的用户界面(UI)为用户提供各种控制。
示例性的,移动终端100B可与显示设备200安装软件应用,通过网络通信协议实现连接通信,实现一对一控制操作的和数据通信的目的。如:可以使移动终端100B与显示设备200建立控制指令协议,通过操作移动终端100B上提供的用户界面的各种功能键或虚拟控件,来实现如遥控器100A布置的实体按键的功能。也可以将移动终端100B上显示的音视频内容传输到显示设备200上,实现同步显示功能。
显示设备200可提供广播接收功能和计算机支持功能的网络电视功能。显示设备可以实施为,数字电视、网络电视、互联网协议电视(IPTV)等。
显示设备200,可以是液晶显示器、有机发光显示器、投影设备。具体显示设备类型、尺寸大小和分辨率等不作限定。
显示设备200还与服务器300通过多种通信方式进行数据通信。这里可允许显示设备200通过局域网(LAN)、无线局域网(WLAN)和其他网络进行通信连接。服务器300可以向显示设备200提供各种内容和互动。示例的,显示设备200可以发送和接收信息,例 如:接收电子节目指南(EPG)数据、接收软件程序更新、或访问远程储存的数字媒体库。服务器300可以一组,也可以多组,可以一类或多类服务器。通过服务器300提供视频点播和广告服务等其他网络服务内容。
在一些实施例中,如图1B所示,显示设备200包括旋转组件276,控制器250,显示器275,从背板上空隙处伸出的端子接口278以及和背板连接的旋转组件276,旋转组件276可以使显示器275进行旋转。从显示设备正面观看的角度,旋转组件276可以将显示屏旋转到竖屏状态,即屏幕竖向的边长大于横向的边长的状态,也可以将屏幕旋转至横屏状态,即屏幕横向的边长大于竖向的边长的状态。
图2中示例性示出了显示设备200的硬件配置框图。如图2所示,显示设备200中可以包括调谐解调器210、通信器220、检测器230、外部装置接口240、控制器250、存储器260、用户接口265、视频处理器270、显示器275、旋转组件276、音频处理器280、音频输出接口285、供电电源290。
其中,旋转组件276可以包括驱动电机、旋转轴等部件。其中,驱动电机可以连接控制器250,受控制器250的控制输出旋转角度;旋转轴的一端连接驱动电机的动力输出轴,另一端连接显示器275,以使显示器275可以通过旋转组件276固定安装在墙壁或支架上。
旋转组件276还可以包括其他部件,如传动部件、检测部件等。其中,传动部件可以通过特定传动比,调整旋转组件276输出的转速和力矩,可以为齿轮传动方式;检测部件可以由设置在旋转轴上的传感器组成,例如角度传感器、姿态传感器等。这些传感器可以对旋转组件276旋转的角度等参数进行检测,并将检测的参数发送给控制器250,以使控制器250能够根据检测的参数判断或调整显示设备200的状态。实际应用中,旋转组件276可以包括但不限于上述部件中的一种或多种。
调谐解调器210,通过有线或无线方式接收广播电视信号,可以进行放大、混频和谐振等调制解调处理,用于从多个无线或有线广播电视信号中解调出用户所选择的电视频道的频率中所携带的音视频信号,以及附加信息(例如EPG数据)。
调谐解调器210,可根据用户选择,以及由控制器250控制,响应用户选择的电视频道的频率以及该频率所携带的电视信号。
调谐解调器210,根据电视信号的广播制式不同,可以接收信号的途径有很多种, 诸如:地面广播、有线广播、卫星广播或互联网广播等;以及根据调制类型不同,可以数字调制方式或模拟调制方式;以及根据接收电视信号的种类不同,可以解调模拟信号和数字信号。
在其他一些示例性实施例中,调谐解调器210也可在外部设备中,如外部机顶盒等。这样,机顶盒通过调制解调后输出电视信号,经过外部装置接口240输入至显示设备200中。
通信器220,是用于根据各种通信协议类型与外部设备或外部服务器进行通信的组件。例如显示设备200可将内容数据发送至经由通信器220连接的外部设备,或者,从经由通信器220连接的外部设备浏览和下载内容数据。通信器220可以包括WIFI模块221、蓝牙通信协议模块222、有线以太网通信协议模块223等网络通信协议模块或近场通信协议模块,从而通信器220可根据控制器250的控制接收控制装置100的控制信号,并将控制信号实现为WIFI信号、蓝牙信号、射频信号等。
检测器230,是显示设备200用于采集外部环境或与外部交互的信号的组件。检测器230可以包括声音采集器231,如麦克风,可以用于接收用户的声音,如用户控制显示设备200的控制指令的语音信号;或者,可以采集用于识别环境场景类型的环境声音,实现显示设备200可以自适应环境噪声。
在其他一些示例性实施例中,检测器230,还可以包括图像采集器232,如相机、摄像头等,可以用于采集外部环境场景,以自适应变化显示设备200的显示参数;以及用于采集用户的属性或与用户交互手势,以实现显示设备与用户之间互动的功能。
在其他一些示例性实施例中,检测器230,还可以包括光接收器,用于采集环境光线强度,以自适应显示设备200的显示参数变化等。
在其他一些示例性实施例中,检测器230,还可以包括温度传感器,如通过感测环境温度,显示设备200可自适应调整图像的显示色温。示例性的,当温度偏高的环境时,可调整显示设备200显示图像色温偏冷色调;当温度偏低的环境时,可以调整显示设备200显示图像色温偏暖色调。
外部装置接口240,是提供控制器250控制显示设备200与外部设备间数据传输的组件。外部装置接口240可按照有线/无线方式与诸如机顶盒、游戏装置、笔记本电脑、等外部设备连接,可接收外部设备的诸如视频信号(例如运动图像)、音频信号(例如音 乐)、附加信息(例如EPG)等数据。
其中,外部装置接口240可以包括:高清多媒体接口(HDMI)端子241、复合视频消隐同步(CVBS)端子242、模拟或数字分量端子243、通用串行总线(USB)端子244、组件(Component)端子(图中未示出)、红绿蓝(RGB)端子(图中未示出)等任一个或多个。
控制器250,通过运行存储在存储器260上的各种软件控制程序(如操作***和各种应用程序),来控制显示设备200的工作和响应用户的操作。
在一些示例性的实施方式中,控制器250包括随机存取存储器(RAM)、只读存储器(ROM)、图形处理器、CPU处理器、通信接口、以及通信总线。其中,RAM、ROM以及图形处理器、CPU处理器通信接口通过通信总线相连接。
ROM,用于存储各种***启动指令。如在接收到开机信号时,显示设备200电源开始启动,CPU处理器254运行ROM252中的***启动指令,将存储在存储器260的操作***拷贝至RAM251中,以开始运行启动操作***。当操作***启动完成后,CPU处理器254再将存储器260中各种应用程序拷贝至RAM251中,然后,开始运行启动各种应用程序。
图形处理器253,用于产生各种图形对象,如图标、操作菜单、以及用户输入指令显示图形等。图形处理器253可以包括运算器,用于通过接收用户输入各种交互指令进行运算,进而根据显示属性显示各种对象;以及包括渲染器,用于产生基于运算器得到的各种对象,将进行渲染的结果显示在显示器275上。
CPU处理器254,用于执行存储在存储器260中的操作***和应用程序指令。以及根据接收的用户输入指令,来执行各种应用程序、数据和内容的处理,以便最终显示和播放各种音视频内容。
在一些示例性实施例中,CPU处理器254,可以包括多个处理器。多个处理器可包括一个主处理器以及多个或一个子处理器。主处理器,用于在显示设备预加载模式中执行显示设备200的一些初始化操作,和/或,在正常模式下显示画面的操作。多个或一个子处理器,用于执行在显示设备待机模式等状态下的一种操作。
通信接口255,可包括第一接口到第n接口。这些接口可以是经由网络被连接到外部设备的网络接口。
控制器250可以控制显示设备200的整体操作。例如:响应于接收到用于选择在显 示器275上显示的GUI对象的用户输入命令,控制器250便可以执行与由用户输入命令选择的对象有关的操作。
其中,该对象可以是可选对象中的任何一个,例如超链接或图标。该与所选择的对象有关的操作,例如显示连接到超链接页面、文档、图像等操作,或者执行与对象相对应的程序的操作。该用于选择GUI对象的用户输入命令,可以是通过连接到显示设备200的各种输入装置(例如,鼠标、键盘、触摸板等)输入命令或者与由用户说出语音相对应的语音命令。
存储器260,用于存储驱动和控制显示设备200运行的各种类型的数据、软件程序或应用程序。存储器260可以包括易失性和/或非易失性存储器。而术语“存储器”包括存储器260、控制器250的RAM和ROM、或显示设备200中的存储卡。
在一些实施例中,存储器260具体用于存储驱动显示设备200中控制器250的运行程序;存储显示设备200内置的和用户从外部设备下载的各种应用程序;存储用于配置由显示器275提供的各种GUI、与GUI相关的各种对象及用于选择GUI对象的选择器的视觉效果图像等数据。
在一些实施例中,存储器260具体用于存储调谐解调器210、通信器220、检测器230、外部装置接口240、视频处理器270、显示器275、音频处理器280等的驱动程序和相关数据,例如从外部装置接口接收的外部数据(例如音视频数据)或用户接口接收的用户数据(例如按键信息、语音信息、触摸信息等)。
在一些实施例中,存储器260具体存储用于表示操作***(OS)的软件和/或程序,这些软件和/或程序可包括,例如:内核、中间件、应用编程接口(API)和/或应用程序。示例性的,内核可控制或管理***资源,以及其它程序所实施的功能(如所述中间件、API或应用程序);同时,内核可以提供接口,以允许中间件、API或应用程序访问控制器,以实现控制或管理***资源。
图3中示例性示出了显示设备200存储器中操作***的架构配置框图。该操作***架构从上到下依次是应用层、中间件层和内核层。
应用层,***内置的应用程序以及非***级的应用程序都是属于应用层。负责与用户进行直接交互。应用层可包括多个应用程序,如设置应用程序、电子帖应用程序、媒体中心应用程序等。这些应用程序可被实现为Web应用,其基于WebKit引擎来执行,具 体可基于HTML5、层叠样式表(CSS)和JavaScript来开发并执行。
这里,HTML,全称为超文本标记语言(Hyper Text Markup Language),是一种用于创建网页的标准标记语言,通过标记标签来描述网页,HTML标签用以说明文字、图形、动画、声音、表格、链接等,浏览器会读取HTML文档,解释文档内标签的内容,并以网页的形式显示出来。
CSS,全称为层叠样式表(Cascading Style Sheets),是一种用来表现HTML文件样式的计算机语言,可以用来定义样式结构,如字体、颜色、位置等的语言。CSS样式可以直接存储与HTML网页或者单独的样式文件中,实现对网页中样式的控制。
JavaScript,是一种应用于Web网页编程的语言,可以***HTML页面并由浏览器解释执行。其中Web应用的交互逻辑都是通过JavaScript实现。JavaScript可以通过浏览器,封装JavaScript扩展接口,实现与内核层的通信,
中间件层,可以提供一些标准化的接口,以支持各种环境和***的操作。例如,中间件层可以实现为与数据广播相关的中间件的多媒体和超媒体信息编码专家组(MHEG),还可以实现为与外部设备通信相关的中间件的DLNA中间件,还可以实现为提供显示设备内各应用程序所运行的浏览器环境的中间件等。
内核层,提供核心***服务,例如:文件管理、内存管理、进程管理、网络管理、***安全权限管理等服务。内核层可以被实现为基于各种操作***的内核,例如,基于Linux操作***的内核。
内核层也同时提供***软件和硬件之间的通信,为各种硬件提供设备驱动服务,例如:为显示器提供显示驱动程序、为摄像头提供摄像头驱动程序、为遥控器提供按键驱动程序、为WIFI模块提供WiFi驱动程序、为音频输出接口提供音频驱动程序、为电源管理(PM)模块提供电源管理驱动等。
图2中,用户接口265,接收各种用户交互。具体的,用于将用户的输入信号发送给控制器250,或者,将从控制器250的输出信号传送给用户。示例性的,遥控器100A可将用户输入的诸如电源开关信号、频道选择信号、音量调节信号等输入信号发送至用户接口265,再由用户接口265转送至控制器250;或者,遥控器100A可接收经控制器250处理从用户接口265输出的音频、视频或数据等输出信号,并且显示接收的输出信号或将接收的输出信号输出为音频或振动形式。
在一些实施例中,用户可在显示器275上显示的图形用户界面(GUI)输入用户命令,则用户接口265通过GUI接收用户输入命令。确切的说,用户接口265可接收用于控制选择器在GUI中的位置以选择不同的对象或项目的用户输入命令。其中,“用户界面”,是应用程序或操作***与用户之间进行交互和信息交换的介质接口,它实现信息的内部形式与用户可以接受形式之间的转换。用户界面常用的表现形式是图形用户界面(graphic user interface,GUI),是指采用图形方式显示的与计算机操作相关的用户界面。它可以是在电子设备的显示屏中显示的一个图标、窗口、控件等界面元素,其中控件可以包括图标、控件、菜单、选项卡、文本框、对话框、状态栏、频道栏、Widget等可视的界面元素。
或者,用户可通过输入特定的声音或手势进行输入用户命令,则用户接口265通过传感器识别出声音或手势,来接收用户输入命令。
视频处理器270,用于接收外部的视频信号,根据输入信号的标准编解码协议,进行解压缩、解码、缩放、降噪、帧率转换、分辨率转换、图像合成等视频数据处理,可得到直接在显示器275上显示或播放的视频信号。
示例的,视频处理器270,包括解复用模块、视频解码模块、图像合成模块、帧率转换模块、显示格式化模块等。
其中,解复用模块,用于对输入音视频数据流进行解复用处理,如输入MPEG-2流(基于数字存储媒体运动图像和语音的压缩标准),则解复用模块将其进行解复用成视频信号和音频信号等。
视频解码模块,用于对解复用后的视频信号进行处理,包括解码和缩放处理等。
图像合成模块,如图像合成器,其用于将图形生成器根据用户输入或自身生成的GUI信号,与缩放处理后视频图像进行叠加混合处理,以生成可供显示的图像信号。
帧率转换模块,用于对输入视频的帧率进行转换,如将输入的60Hz视频的帧率转换为120Hz或240Hz的帧率,通常的格式采用如插帧方式实现。
显示格式化模块,用于将帧率转换模块输出的信号,改变为符合诸如显示器显示格式的信号,如将帧率转换模块输出的信号进行格式转换以输出RGB数据信号。
显示器275,用于接收源自视频处理器270输入的图像信号,进行显示视频内容、图像以及菜单操控界面。显示视频内容,可以来自调谐解调器210接收的广播信号中的 视频内容,也可以来自通信器220或外部装置接口240输入的视频内容。显示器275,同时显示显示设备200中产生且用于控制显示设备200的用户操控界面UI。
以及,显示器275可以包括用于呈现画面的显示屏组件以及驱动图像显示的驱动组件。或者,倘若显示器275为一种投影显示器,还可以包括一种投影装置和投影屏幕。
旋转组件276,控制器可以发出控制信号使旋转组件276旋转显示器255。
音频处理器280,用于接收外部的音频信号,根据输入信号的标准编解码协议,进行解压缩和解码,以及降噪、数模转换、和放大处理等音频数据处理,得到可以在扬声器286中播放的音频信号。
示例性的,音频处理器280可以支持各种音频格式。例如MPEG-2、MPEG-4、高级音频编码(AAC)、高效AAC(HE-AAC)等格式。
音频输出接口285,用于在控制器250的控制下接收音频处理器280输出的音频信号,音频输出接口285可包括扬声器286,或输出至外接设备的发生装置的外接音响输出端子287,如耳机输出端子。
在其他一些示例性实施例中,视频处理器270可以包括一个或多个芯片组成。音频处理器280,也可以包括一个或多个芯片组成。
以及,在其他一些示例性实施例中,视频处理器270和音频处理器280,可以为单独的芯片,也可以与控制器250一起集成在一个或多个芯片中。
供电电源290,用于在控制器250的控制下,将外部电源输入的电力为显示设备200提供电源供电支持。供电电源290可以是安装在显示设备200内部的内置电源电路,也可以是安装在显示设备200外部的电源。
在投屏过程中,移动终端100B可以通过无线连接方式,如通过Miracast协议,向显示设备200发送显示画面数据,形成投屏视频流。当显示设备200在接收到投屏视频流后,可以通过控制器250对投屏视频流进行解码,解析出帧画面经处理后形成投屏画面发送给显示器275进行显示。
其中,移动终端100B可以为具有显示和人机交互功能的智能终端设备,例如手机、平板电脑等。由于移动终端100B具有不同的操作模式,因此所形成的投屏画面也具有不同的布局模式。例如,当用户在横向握持手机进行操作时,手机上所呈现的画面为横向布局,即画面的宽度大于画面的高度,手机处于横向模式,如图4A所示。当用户在竖向 握持手机进行操作时,手机上所呈现的画面为竖向布局,即画面的宽度小于画面的高度,手机处于纵向模式,如图4B所示。
针对不同类型的移动终端100B,其显示屏比例也存在多种不同形式。例如,手机的屏幕宽高比通常为9:16、10:16等;平板电脑的屏幕宽高比为3:4等。还可能有部分智能终端设备的屏幕宽高比为1:1,例如智能手表等。对于屏幕宽高比为1:1的智能终端设备,其横向状态和竖向状态下所呈现的画面布局一般是相同的,仅仅在智能终端设备的显示屏上显示时,方向不同。因此,对于显示屏幕宽高比1:1的移动终端100B,其投屏时形成的投屏画面并不区分横竖向状态。
为了使显示设备200能够根据移动终端100B在投屏时的横竖模式实现自动旋转屏幕,从而达到更好的用户体验,显示设备200和移动终端100B之间可以配置有投屏协议,例如基于Miracast标准的投屏协议,基于DLNA标准的投屏协议,基于Airplay标准的投屏协议等,通过投屏协议实现画面视频流的传输,使移动终端100B上的画面能够投屏给显示设备200。
目前Miracast投屏协议的规定中,无论移动终端100B处于横向模式还是竖向模式,投屏给显示设备200的视频流一直为1920×1080的横屏流,显示设备无法根据视频流的分辨率或者宽高比获取屏幕的横竖关系,导致显示设备200无法通过视频流实现自动旋转电视屏幕。
例如,移动终端在与显示设备进行投屏交互时,当移动终端处于横向模式时,无论与该移动终端交互的显示设备处于横屏状态还是竖屏状态,其向显示设备发送的图像信息均不包含左和右黑色信息,如图5A和5B。
而当移动终端处于纵向模式时,无论与该移动终端交互的显示设备处于横屏状态还是竖屏状态,其向显示设备发送的图像信息均包含左和右黑色信息,如图6A和6B。
在Miracast协议中,当移动终端与显示设备建立通信连接,移动终端在将截屏画面传出至显示设备之前,已经对截屏画面作了加黑边的处理,目的在于保证原显示设备在横屏状态下,无论是移动终端是横向放置还是竖向放置,都可以展现横向画面。
在图6A所对应的投屏画面中,画面两侧的黑色区域所示的图像信息为黑色信息,或者称为黑边,画面中部的显示画面区域称为有效画面(即为移动终端相对应的屏 幕显示画面),其该有效画面即为移动终端的图像信息中的有效信息。在有效画面对应的区域内,显示的是移动终端100B上的操作画面,黑边为根据移动终端100B显示器275的屏幕比例确定具有不同的宽度和高度。
即,显示设备200显示投屏画面时,以较短边对应的方向为基准,如在横屏状态下的高度方向。因此,显示设备200呈现的投屏视频流高度方向一般是不变的。即无论移动终端100B的放置方向为横向还是竖向,显示设备200接收到的投屏画面高度都是1080P的。即使显示器275在旋转到竖屏状态后,并不能呈现如图6C所示的显示状态,而是按照整个包含左、右黑色信息的图像信息的高度进行显示,如图6B所示。即在竖屏状态下,不仅有效画面的左右两侧填充有黑色信息,投屏画面的顶部和底部也因没有有效画面做填充而显示黑色区域,这将大大影响用户的观看体验。
在实际投屏过程中,显示设备200接收到的投屏视频流对应画面显示分辨率为1920×1080,即在高度方向,投屏画面以终端上的画面为基准,需要1080个像素点进行显示。而在宽度方向,投屏画面以终端显示画面加两侧黑边宽度为基准,需要1920个像素点进行显示。相应的,投屏画面中有效区域的高度为1080,宽度按照高度的比例进行缩放显示,这将大大浪费显示器275上的显示区域。
为了提高显示设备上显示区域的利用率,并对有效信息进行一定比例的放大,以适应移动终端100B发送的投屏视频流。
在一些实施方式中,当移动终端处于横向模式时,呈现图4A所示的状态,进行投屏时,发送到显示设备的视频流被解析后的帧数据没有左、右黑边。当显示设备接收到不包含左、右黑边的图像信息时,检测显示器是否处于横屏状态。当时横屏状态时,控制器250接收到图像信息,可直接将该图像信息呈现在显示器上。当显示器处于竖屏状态时,控制器250可以向旋转组件276发送旋转指令,控制旋转组件276旋转,以使显示器旋转至横屏状态。
当移动终端处于纵向模式时,呈现图4B所示的状态,进行投屏时,发送到显示设备的视频流被解析后的帧数据带有左、右黑边和有效画面。当显示设备接收到带有左、右黑边的图像信息时,检测显示器是否处于竖屏状态。当处于横屏状态时,控制器250可以向旋转组件276发送旋转指令,控制旋转组件276,以使显示器275旋转至竖屏状态。在竖屏状态下,显示器275的宽度小于高度,与终端画面的显示比例相符,此时, 对投屏视频流中的帧数据进行放大,将有效画面显示到显示器上,左右黑色区域因放大而无法显示在显示器上,从而减小有效画面两侧的黑边面积,如图6C所示。
为了改善用户的观影体验,减少画面两侧的黑色信息,本申请提供一种显示设备,可以根据投屏数据流的图像信息中是否包含左、右黑色信息,判断是否需要旋转显示器并对画面进行调整,以最大化利用显示区域。
本申请提供的一种显示设备包括:
显示器;
旋转组件,被配置为带动所述显示器旋转,以使所述显示处于横屏状态或竖屏状态中的一种旋转状态;
用户接口,所述通信息被配置为连接到终端;
以及控制器,参见图7,被配置为执行:
S1:接收投屏视频流。
实际应用中,用户可以先在移动终端100B上执行投屏显示操作,以将移动终端100B的显示画面发送给显示设备200。例如,用户通过在手机上先后选择“设置-连接与共享-投屏”,并且在投屏操作的设备列表中选中当前网络中的一个显示设备作为投屏对象,执行投屏操作。
在执行投屏操作后,移动终端100B会通过投屏协议,如采用Miracast协议或其他镜像协议,将所显示的画面发送给显示设备200。随着投屏过程中不断产生新的交互画面,移动终端100B会逐帧将画面发送给显示设备200,形成投屏数据流(下面以投屏视频流)。
需要说明的是,用户还可以根据通过第三方应用程序执行投屏操作。例如,用户打开视频应用,在视频应用的视频播放界面上,设有投屏图标。用户可以点击该图标执行投屏操作。通常,通过第三方应用程序执行的投屏操作的投屏画面以所播放的视频资源为准。例如,在播放的视频资源为电影、电视剧等横向媒资时,则投屏画面中有效画面的宽度大于高度;在播放的视频资源为短视频、漫画等竖向媒资时,则投屏画面中有效画面的宽度小于高度。
S2:在所述投屏视频流中提取初始分辨率。
显示设备200在接收到投屏视频流后,其控制器250可以对接收到的投屏视频流进 行逐帧分析,从而提取初始分辨率。其中,所述初始分辨率为所述投屏视频流中首帧画面整体的分辨率。
例如,移动终端100B发送的投屏视频流的画面宽高比为1920:1080,控制器250在接收到投屏视频流后,可以通过解析该投屏视频流,获取每一帧画面。并且在第一帧画面(首帧画面)中提取初始分辨率,即提取初始分辨率为1920×1080。
显然,所述首帧画面并不仅仅局限于整个投屏过程中第一帧画面,还可以是投屏视频流的前几帧或者在指定时间节点上的对应帧画面。由于初始分辨率是投屏画面的整体分辨率,因此较容易获取该分辨率信息,并且在实际应用时,越早获得初始分辨率,越有利于及时检测出投屏视频流的有效分辨率,从而越早将显示画面调整至较佳的状态。
S3:在所述投屏视频流中提取采样分辨率。
在提取初始分辨率后,还可以针对投屏视频流中的画面再进行采样,提取出采样分辨率。其中,用于进行采样的那一帧画面称为采样画面。所述采样分辨率为按照预设时间间隔在所述投屏视频流中提取的,采样画面上有效区域(即有效画面)的分辨率。
在采样过程中,指定采样的预设时间间隔可以根据控制器250的运算能力进行设定,例如相对于首帧画面10s后的画面作为采样画面。为了从采样画面中确定有效画面,可以对采样画面的像素点颜色进行遍历。显然,黑边区域的像素色值为黑色,有效区域的像素色值通常不全是黑色,因此,可以通过遍历采样画面的每个像素,确定黑色且呈矩形的区域为黑边,其他区域则为有效区域。
S4:对比所述采样分辨率与所述初始分辨率。
在针对投屏视频流提取初始分辨率和采样分辨率后,可以将采样分辨率与初始分辨率进行对比,以根据采样分辨率和初始分辨率之间的差异,确定当前投屏视频流中有效画面情况(如比例、方向等),从而选择是否根据有效画面进行显示。
例如,如果采样分辨率与初始分辨率相等,即第一帧画面的分辨率为1920×1080,在采样画面中确定的有效区域分辨率也为1920×1080,则代表当前投屏画面不存在黑边,投屏画面可以充满显示区域。即,直接通过显示器275横屏状态显示即可满足投屏画面的显示要求。
需要说明的是,由于显示画面的分辨率通常采用画面宽度和高度方向所占像素数量进行表示,例如1920×1080。而单纯通过分辨率的数值通常难以直接进行对比。例如, 从数值上比较,分辨率1920×1080等于1080×1920。因此,在实际对比过程中,可以通过提取分辨率中的部分数值或者将分辨率转化为其他可比较的数值后,再进行对比,以获得所述采样分辨率与所述初始分辨率的对比结果。例如,可以在初始分辨率中提取整体画面的宽度或高度,并与采样分辨率中提取的有效画面的高度或宽度进行比较,从而确定其有效分辨率。
S5:如果所述采样分辨率大于或等于所述初始分辨率,设置所述视频流的有效分辨率为初始分辨率。
实际应用中,如果采样分辨率对应的数值大于或等于初始分辨率对应的数值,则说明在保证显示完全的前提下,当前投屏画面中黑边区域已是最小情况。此时,即使对画面进行缩放,也不会增缩小黑边区域面积,则确定视频流的有效分辨率未发生改变,仍为初始分辨率。
S6:如果所述采样分辨率小于所述初始分辨率,设置所述视频流的有效分辨率为采样分辨率。
如果采样分辨率对应的数值小于初始分辨率对应的数值,则确定当前投屏画面填充了大量的黑色区域,显示设备200可以通过对显示器275的旋转以及投屏画面的缩放提高显示效果。
由以上技术方案可知,本申请提供的投屏视频流有效分辨率检测方法可以在接收投屏视频流后,从投屏视频流中提取初始分辨率和采样分辨率,并进行对比,确定当前视频流的有效分辨率。如果采样分辨率小于初始分辨率,设置视频流的有效分辨率为采样分辨率。通过设置视频流的有效分辨率,可以按照有效分辨率对投屏画面进行显示,从而适应投屏画面的显示方向,减小黑边影响,达到更优的用户体验。
在一种实现方式中,为了实现采样分辨率与初始分辨率,可以分别在采样分辨率和初始分辨率中,提取能够用于对比的数据。即,如图8所示,对比所述采样分辨率与所述初始分辨率的步骤,还包括:
S41:根据所述初始分辨率提取基准值;
S42:遍历所述采样画面的有效画面,生成对比值;
S43:如果所述对比值大于或等于所述基准值,确定所述采样分辨率大于或等于所述初始分辨率;
S44:如果所述对比值小于所述基准值,确定所述采样分辨率小于所述初始分辨率。
在获取初始分辨率后,可以在初始分辨率中提取部分数据作为基准值。例如,在投屏视频流中投屏画面高度不改变时,以首帧画面的整体画面高度作为所述基准值。再通过遍历采样画面中的像素点色值,确定有效画面的比例,从而生成对比值。所述对比值为所述采样画面中有效画面的画面宽度。
例如,在首帧画面中提取初始分辨率为1920×1080,则提取基准值为1080。再通过遍历采样画面中的像素点色值,确定剔除黑色区域后的有效画面的分辨率为960×1080,生成对比值为960。
在确定基准值和对比值后,可以直接通过对比基准值和对比值的大小,确定采样分辨率和初始分辨率之间的关系,即如果对比值大于或等于基准值,确定采样分辨率大于或等于初始分辨率;如果对比值小于基准值,确定采样分辨率小于初始分辨率。
例如,基准值为1080,对比值为960,由于对比值960小于基准值1080,则确定采样分辨率小于初始分辨率,即设置当前投屏视频流的有效分辨率为采样分辨率。并且在后续显示过程中,可以按照有效分辨率对投屏视频流的投屏画面进行缩放显示。
为了计算对比值,在本申请的部分实施例中,如图9A所示,所述方法还包括:
S421:通过遍历所述采样画面中连续的黑色像素点数,提取黑边数据;
S422:提取初始画面数据;
S423:计算所述基准值和所述对比值。
为了计算对比值,需要确定投屏画面中有效画面的分辨率,因此,如图9B所示,可以从采样画面对应的图像左侧开始检测连续黑色区域的范围,并得出黑边区域的范围:左侧黑边宽度H L、左侧黑边高度V L。再从图像右侧开始检测连续黑色区域的范围,并得出黑边区域的范围:右侧黑边宽度H R以及右侧黑边高度V R,形成黑边数据。同时,还可以在首帧画面中提取初始画面数据。所述初始画面数据包括首帧画面的初始宽度H 0和初始高度V 0,以计算基准值和对比值。
其中,基准值和对比值可以按照下式进行计算:
所述基准值:S 0=V 0=V L=V R
所述对比值:S X=H 0-(H L+H R)。
在一种实现方式中,如图10所示,在计算所述基准值和所述对比值的步骤前,所述 方法还包括:
S4231:判断所述采样画面是否满足预设执行条件。
S4232:如果所述采样画面不满足预设判断条件,设置所述有效分辨率为所述初始分辨率;
S4233:如果所述采样画面满足预设判断条件,执行计算所述基准值和所述对比值的步骤。
本实施例中,可以在获取采样画面以后,先对采样画面中的画面显示状况进行判断,确定其是否满足预设执行条件。其中,所述预设执行条件为:左侧黑边宽度H L等于所述右侧黑边宽度H R,且左侧黑边高度V L、所述右侧黑边高度V R与所述初始高度V 0相等。即,判断采样画面是否满足“H L=H R且V L=V R=V 0”。
如果采样画面不满足“H L=H R且V L=V R=V 0”,则确定采样画面的两侧黑边宽度和高度不相等。这种情况一般是由于移动终端100B的显示内容对采样结果造成了影响。由于这种情况下很难判断出有效分辨率,因此可以保持原始分辨率输出投屏画面,即设置所述有效分辨率为所述初始分辨率,有效分辨率对应的画面宽度H S=H 0,对应的画面高度V S=V 0
如果采样画面满足“H L=H R且V L=V R=V 0”,则可以进一步进行有效分辨率的判断,即可以执行计算所述基准值和所述对比值的步骤,以对比所述采样分辨率与所述初始分辨率。
例如,判断采样画面是否满足“H 0-(H L+H R)<V 0”,若不满足“H 0-(H L+H R)<V 0”,则保持原始分辨率输出,即设置所述有效分辨率为所述初始分辨率,有效分辨率对应的画面宽度H S=H 0,对应的画面高度V S=V 0。若满足“H 0-(H L+H R)<V 0”,则设置所述有效分辨率为采样分辨率,有效分辨率对应的输出画面宽度H S=H 0-(H L+H R),对应的画面高度V S=V 0
实际应用中,由于移动终端100B所投屏的画面内容容易对采样画面中有效画面区域的判断造成影响,例如,采样画面对应的移动终端100B的显示画面刚好是黑色时,如果仍旧以投屏画面边缘连续黑色像素点进行范围判断,黑色的画面会影响黑色区域的范围判断,进而影响最终的采样分辨率提取结果。因此,为了缓解黑色画面内容对采样分辨率的影响,在所述投屏视频流提取采样分辨率的步骤,还包括:
S201:在所述投屏视频流中按相等时间间隔获取多帧采样画面;
S202:分别计算每一帧所述采样画面的采样分辨率。
通过预设采样时间间隔,可以在投屏视频流中,多次进行采样,并分别提取出每次 采样中画面的采样分辨率。例如,每隔T时间获取一帧画面图像,并通过上述分辨率算法得出的采样分辨率数值分别为:S x0、S x1、……、S xn
通过多次进行采样,可以随着移动终端100B上显示画面的变化,采集到多帧采样画面。通常多帧采样画面不会全部都受黑色画面内容的影响,因此采集多帧采样画面可以降低画面内容对黑边区域范围判断造成影响,从而提高有效分辨率判断时的准确率。
进一步地,如图11所示,在获取多帧画面对应的采样分辨率后,还可以分别对采样分辨率进行判断,以获得采样分辨率与初始分辨率的对比结果,即所述方法还包括:
S211:分别对比每一帧所述采样画面的采样分辨率与所述初始分辨率;
S212:如果所有所述采样分辨率均大于或等于与所述初始分辨率,设置所述视频流的有效分辨率为初始分辨率;
S213:如果所有所述采样分辨率均小于所述初始分辨率,设置所述视频流的有效分辨率为采样分辨率。
通过分别对比每一帧采样画面的采样分辨率与初始分辨率,若连续S x0≤S 0、S x1≤S 0、……、S xn≤S 0,则得出有效分辨率为S xn,并将S 0=S xn。若连续S x0≥S 0、S x1≥S 0、……、S xn≥S 0则得出有效分辨率为S 0
在一种实现方式中,所述方法还包括:如果设置所述视频流的有效分辨率为采样分辨率,则可以控制旋转显示设备200的显示器275至竖屏状态。例如,初始分辨率为1920×1080,采样分辨率为960×1080。通过计算基准值和对比值,可以确定当前采样画面中有效画面的宽度S X=H 0-(H L+H R)=960,而首帧画面的高度为1080,因此可以确定当前有效分辨率为采样分辨率,即有效画面的分辨率:960×1080。从而可以确定移动终端100B上对应的显示画面为960×1080的竖向画面。
而竖向画面更适合在竖屏状态下进行显示,因此在确定有效分辨率为采样分辨率后,控制器250可以向旋转组件276发送控制指令,使旋转组件276驱动显示器275逆时针(或顺时针)旋转至竖屏状态。
显示器275旋转至竖屏状态后,可以按照宽高比为960:1080的比例对投屏画面进行显示。但是由于显示器275的屏幕较大,通常其显示分辨率为3840×2160(横屏状态,对应竖屏状态则为2160×3840)。因此,为了显示分辨率为960×1080的投屏画面需要对投屏画面进行缩放,使显示器275能够完全显示投屏画面。
另外,由于单纯的对投屏画面进行大小的调整,容易使显示的投屏画面在大屏幕上较模糊,严重降低用户体验。因此,在对投屏画面进行缩放的同时,还可以对投屏画面进行插像素相关的画质调整,以改善模糊画面,提高画面显示效果。
由以上技术方案可知,本申请提供的投屏视频流有效分辨率检测方法可以在接收投屏视频流后,从投屏视频流中提取初始分辨率和采样分辨率,并进行对比,确定当前视频流的有效分辨率。如果采样分辨率小于初始分辨率,设置视频流的有效分辨率为采样分辨率。通过设置视频流的有效分辨率,可以按照有效分辨率对投屏画面进行显示,从而适应投屏画面的显示方向,减小黑边影响,达到更优的用户体验。
本申请另一种示例性的实施方式中,提供了另一种处理投屏画面的方式,具体提供另一种显示设备,包括:
显示器;
旋转组件,被配置为带动所述显示器旋转,以使所述显示处于横屏状态或竖屏状态中的一种旋转状态;
用户接口,所述通信息被配置为连接到终端;
以及控制器,参见图12,被配置为执行:
S1′,接收所述终端发送的图像信息,其当所述终端处于纵向模式时,所述图像信息包括有效信息和左、右黑色区信息。
实际应用中,用户可以先在移动终端100B上执行投屏显示操作,以将移动终端100B的显示画面发送给显示设备200。例如,用户通过在手机上先后选择“设置-连接与共享-投屏”,并且在投屏操作的设备列表中选中当前网络中的一个显示设备作为投屏对象,执行投屏操作。
在执行投屏操作后,移动终端100B会通过投屏协议,如采用Miracast协议或其他投屏以及镜像协议,将所显示的画面发送给显示设备200。随着投屏过程中不断产生新的交互画面,移动终端100B会逐帧将画面发送给显示设备200,形成投屏视频流。
S2′,若所述显示器的当前旋转状态与所述终端的纵向模式不匹配时,将所述显示器旋转至竖屏状态;
实际应用中,可以通过分别分析图像信息中是否包含左、右黑色信息,判断显示器 的当前旋转状态与所述终端的显示状态是否匹配。
在一些示例性的实施方式中,控制器获取显示器的旋转角度回调信息,根据从移动终端获取的图像信息中是否包含左右黑边确定显示器的目标旋转状态。
当图像信息中是否不包含左、右黑色信息,则说明移动终端目前为横向模式,检测显示器的当前旋转状态为横屏状态时,二者为匹配。不需要旋转电视。检测显示器的当前选装状态为竖屏状态时,二者为不匹配,需要将显示器旋转至横屏状态。
当图像信息中是否包含左、右黑色区域,则说明移动终端目前为纵向模式,检测显示器的当前旋转状态为横屏状态时,二者为不匹配,需要将显示器旋转至竖屏状态。检测显示器的当前选装状态为竖屏状态时,二者为匹配,不需要旋转显示器。
用户通过移动终端100B执行投屏操作后,移动终端100B会通过镜像协议或者投屏协议向显示设备200发送投屏画面。控制器250可以接收终端发送的图像信息,检测显示器275当前的旋转状态。其中,对于显示275旋转状态的检测可以通过显示设备200中内置的传感器完成。
例如,可以在显示设备200的显示器275上设置陀螺仪、重力加速度传感器等传感器设备,通过测量角加速度或重力方向确定显示器275相对于重力方向的姿态数据。再将检测的姿态数据分别与横屏状态和竖屏状态下的姿态数据进行比较,确定显示器275当前所处的旋转状态。又例如,可以在旋转组件276上设置光栅角度传感器、磁场角度传感器或滑动电阻角度传感器等,通过测量旋转组件276所旋转的角度,分别与横屏状态和竖屏状态下的角度进行比较,确定显示器275当前所处的旋转状态。
在一些示例性的实施方式中,控制器被进一步配置为:
计算所述投屏画面的旋转方向和旋转角度;所述投屏画面的旋转方向与所述显示器的转动方向相反;所述投屏画面的旋转角度与所述显示器的转动角度相等;
按照所述旋转方向和所述旋转角度旋转所述投屏画面。
S3′,基于所述有效信息控制所述显示器呈现投屏画面,其中,所述投屏画面为所述有效信息放大预设倍数得到的。
显示设备200在接收到投屏视频流后,其控制器250可以对接收到的投屏视频流进行逐帧分析。例如,移动终端100B发送的投屏视频流的画面宽高比为1920:1080,控制器250在接收到投屏视频流后,可以通过解析该投屏视频流,获取帧图像。提取帧图 像分辨率为1920×1080,此为初始分辨率。
在提取分辨率后,还可以针对投屏视频流中的帧图像再进行采样,提取出有效分辨率。其中,用于进行采样的那一帧画面称为采样画面。有效分辨率为在所述投屏视频流中提取的,帧数据上有效画面的分辨率。具体的,有效分辨率可以按照预设时间间隔在投屏视频流中获取。
在采样过程中,为了从采样画面中确定有效画面,可以对采样画面的像素点颜色进行遍历。显然,黑色区域的像素色值为黑色,有效区域的像素色值通常不全是黑色,因此,可以通过遍历采样画面的每个像素,确定黑色且呈矩形的区域为黑边,其他区域则为有效画面。
需要说明的是,根据不同显示设备200的适应显示方法,黑色区域所填充的颜色不仅仅局限于黑色。例如,为了适应操作***的整体UI设计风格,黑色区域可以为灰色、蓝色或其他颜色,还可能是渐变色,特定图案等。对于这些情况,本申请为了便于后续描述,仍然称之为黑色区域或者黑边。
在针对投屏视频流获取帧图像分辨率和有效分辨率后,可以将有效分辨率与帧图像分辨率进行对比,以根据采样分辨率有效分辨率和帧图像分辨率之间的差异,确定当前投屏视频流中有效画面情况(如比例、方向等),从而选择是否根据有效画面进行显示。
例如,如果有效分辨率与帧图像分辨率相等,即第一帧画面的分辨率为1920×1080,在采样画面中确定的有效区域分辨率也为1920×1080,则代表当前投屏画面不存在黑边,投屏画面可以充满显示区域。即,直接通过显示器275横屏状态显示即可满足投屏画面的显示要求。
需要说明的是,由于显示画面的分辨率通常采用画面宽度和高度方向所占像素数量进行表示,例如1920×1080。而单纯通过分辨率的数值通常难以直接进行对比。例如,从数值上比较,分辨率1920×1080等于1080×1920。因此,在实际对比过程中,可以通过提取分辨率中的部分数值或者将分辨率转化为其他可比较的数值后,再进行对比,以获得所述有效分辨率与所述帧图像分辨率的对比结果。例如,可以在帧图像分辨率中提取整体画面的宽度或高度,并与有效画面的高度或宽度进行比较,从而确定其有效分辨率。
由以上技术方案可知,本申请提供的投屏视频流有效分辨率检测方法可以在接收投 屏视频流后,从投屏视频流中提取帧图像分辨率和有效分辨率,并进行对比,确定当前视频流的有效分辨率。通过设置视频流的有效分辨率,可以按照有效分辨率对投屏画面进行显示,从而适应投屏画面的显示方向,减小黑边影响,达到更优的用户体验。
为了计算有效画面的放大倍数,在本申请的部分实施例中,如图13所示,所述方法还包括:
S131:通过遍历所述采样画面中连续的黑色像素点数,提取黑边数据;
S132:提取初始画面数据;
S133:计算所述显示设备和所述有效画面的高度比例R H和宽度比例R W
S134:判断R H>R W的大小。
如图9所示,以移动终端传出的图像帧的宽度为a或者记为H 0,高度为H ph或者记为V L;显示设备在竖屏状态时的宽度为W tv,高度为H tv。可以从有效画面对应的图像左侧开始检测连续黑色区域的范围,并得出黑色区域的范围:左侧黑边宽度a、左侧黑边高度H ph。再从图像右侧开始检测连续黑色区域的范围,并得出黑色区域的范围:右侧黑边宽度a以及右侧黑边高度H ph,形成黑边数据。此时,得到的移动终端发送的图像信息中,有效画面的分辨率为(H 0-2a)*V Lh。显示设备的分辨率为W tv*H tv
显示设备和所述有效信息的高度比例R H=H tv/H ph;宽度比例R W=W 0/(W 0-2a),若R H>R W,则执行S85,将所述有效信息放大R W倍,得到投屏画面;若R H<R W,则执行S86,将所述有效信息放大R H倍,得到投屏画面。
由以上技术方案可知,如图14所示,本申请提供的显示设备可以根据图像信息判断移动终端是否处于纵向模式,并自动调整显示器的旋转状态,从而使用更大的显示空间显示投屏画面,缓解传统智能电视无法正常显示投屏画面的问题。
目前投屏以及镜像的协议众多,如例如基于Miracast标准的投屏协议,基于DLNA标准的投屏协议,基于Airplay标准的投屏协议或者可自定义的投屏方式等。
其中,有的投屏协议中显示设备可获取移动终端真实的视频流,如Airplay协议,可将移动终端本身屏幕本身的图像数据发送至显示设备;这样,显示设备可以根据图像的宽高比直接确定图像是横向还是纵向。
而另一些投屏协议中,显示设备无法获取移动终端真实的屏幕视频流,也即移动设备在将投屏数据传输至显示设备之前,先将屏幕数据信息处理,如于Miracast标准的投 屏协议中,智能设备总是发送横向的媒体资源到显示设备,无论是屏幕处于横向放置还是竖向放置,显示设备总是获取到的是横向的视频数据。
例如,在竖向放置移动终端进行镜像时,在将屏幕数据发送至显示设备之前,先将屏幕数据两边做加黑边的处理。其本意在于适配横向放置的显示设备。
但是在显示设备可横竖旋转以后,目前的Miracast标准的投屏协议暂时未对竖屏的显示设备进行适配的处理,还是发送带有黑边的横屏资源到显示设备。
在接下来所示出的实施方式中,显示设备在接收到投屏资源时,均对图像是否有黑边进行判断,因此,显示设备将无法获知移动终端的真实放置情况。这种情况下,显示设备将无法根据图像中有效画面的情况进行旋转。也即无论手机是横屏还是竖屏,在接收到横向的投屏数据后,将所述投屏数据直接进行展示。
当然,显示设备仅仅是不根据投屏数据进行旋转,不论显示设备处在是横屏状态还是竖屏状态,用户仍然可以主动发出旋转指令,将显示设备旋转电视旋转至与之前不同的所处状态。从显示设备正面观看的角度,旋转组件276可以将显示屏旋转到竖屏状态,即屏幕竖向的边长大于横向的边长的状态,也可以将屏幕旋转至横屏状态,即屏幕横向的边长大于竖向的边长的状态。
例如,如果显示设备处在横向位置时,则显示如图5A或者6A的画面。对于图5A,移动终端横向放置,投屏数据流为两边不存在黑边的满屏的横向资源。对于图6A,移动终端竖向放置,投屏数据流为两边带有黑边的有效画面仅在数据流中间的横向资源。这两中显示放置中,对于显示设备而言,其接收的都是横向的竖屏资源。
如果显示设备处在竖向位置时,则显示如图5B或者6B的画面。对于图6B的情况,由于有效画面在屏幕中占比较少,为提高用户体验,在申请示例性的实施方式中,可通过遥控器或者语音、手势等方式,发出控制指令,以对当前镜像界面进行放大,相适应的,在显示设备的用户界面上,示出缩放标尺,如图15A所示。在放大操作接收后,可选的,预设之间内,操作条自动消失。例如,可通过遥控器上下左右OK呼出,可通过遥控器手动放大到合适的倍数,即如图15B和15C所示。
在一些示例性的实施方式中,显示设备仅在竖屏状态下展示缩放标尺,而在横屏桩体下,不显示缩放标尺。
在另一些实施方式中,在投屏过程中,如果接收用户输出的从竖屏状态转至横屏状 态的旋转指令时,显示器从竖屏状态旋转到横屏状态,会在ConfigurationChange时重新设置镜像显示画面,旋转完成后展示新的显示效果。
具体实现中,本申请还提供一种计算机存储介质,其中,该计算机存储介质可存储有程序,该程序执行时可包括本申请提供的方法的各实施例中的部分或全部步骤,当本申请提供的显示设备的控制器运行所述计算机程序指令时,所述控制器执行本申请所述的控制器被配置的步骤。所述的存储介质可为磁碟、光盘、只读存储记忆体(read-only memory,ROM)或随机存储记忆体(random access memory,RAM)等。
本领域的技术人员可以清楚地了解到本申请实施例中的技术可借助软件加必需的通用硬件平台的方式来实现。基于这样的理解,本申请实施例中的技术方案本质上或者说对现有技术做出贡献的部分可以以软件产品的形式体现出来,该计算机软件产品可以存储在存储介质中,如ROM/RAM、磁碟、光盘等,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)执行本申请各个实施例或者实施例的某些部分所述的方法。
本说明书中各个实施例之间相同相似的部分互相参见即可。尤其,对于实施例而言,由于其基本相似于方法实施例,所以描述的比较简单,相关之处参见方法实施例中的说明即可。
以上所述的本申请实施方式并不构成对本申请保护范围的限定。

Claims (10)

  1. 一种显示设备,其特征在于,包括:
    显示器;
    旋转组件,被配置为带动所述显示器旋转,以使所述显示器处于横屏状态或竖屏状态中的一种旋转状态;
    用户接口,所述通信息被配置为连接到终端;
    控制器,被配置为:
    接收所述终端发送的图像信息,其中,当所述终端处于纵向模式时,所述图像信息包括有效信息和左、右黑色信息,所述有效信息对应于所述终端的屏幕显示内容;
    若所述显示器的当前旋转状态与所述终端的纵向模式不匹配时,将所述显示器旋转至竖屏状态;
    基于所述有效信息控制所述显示器呈现投屏画面,其中,所述投屏画面为所述有效信息放大预设倍数得到的。
  2. 根据权利要求1所述的显示设备,其特征在于,所述控制器被进一步配置为:
    若所述显示器的当前旋转状态与所述终端的纵向模式匹配,基于所述有效信息控制所述显示器呈现投屏画面。
  3. 根据权利要求1所述的显示设备,其特征在于,所述控制器判断所述显示器的当前旋转状态与所述终端的纵向模式是否匹配,被进一步配置为:
    获取所述显示器旋转角度回调信息;
    根据所述图像信息确定所述显示器的目标旋转状态。
  4. 根据权利要求3所述的显示设备,其特征在于,所述控制器根据所述图像信息确定所述显示器的目标旋转状态,被进一步配置为:
    如果接收到所述终端你发送的所述图像信息中包括左、右黑色信息,则确定所述显示器的目标旋转状态为竖屏状态;
    如果接收到所述终端你发送的所述图像信息中不包括左、右黑色信息,则确定所述显示器的目标旋转状态为横屏状态。
  5. 根据权利要求1所述的显示设备,其特征在于,所述控制器执行基于所述有效信息控制所述显示器呈现投屏画面,其中,所述投屏画面为所述有效信息放大预设倍数得到的,进一步被配置为:
    计算所述显示设备和所述有效信息的高度比例R H和宽度比例R W
    若R H>R W,则将所述有效信息放大R W倍,得到投屏画面;
    若R H<R W,则将所述有效信息放大R H倍,得到投屏画面。
  6. 根据权利要求3所述的显示设备,其特征在于,所述控制器被进一步配置为:
    计算所述投屏画面的旋转方向和旋转角度;所述投屏画面的旋转方向与所述显示器的转动方向相反;所述投屏画面的旋转角度与所述显示器的转动角度相等;
    按照所述旋转方向和所述旋转角度旋转所述投屏画面。
  7. 一种投屏方法,应用于显示设备,其特征在于,包括:
    接收所述终端发送的图像信息,其中,当所述终端处于纵向模式时,所述图像信息包括有效信息和左、右黑色信息,所述有效信息对应于所述终端的屏幕显示内容;
    若所述显示器的当前旋转状态与所述终端的纵向模式不匹配时,将所述显示器旋转至竖屏状态;
    基于所述有效信息控制所述显示器呈现投屏画面,其中,所述投屏画面为所述有效信息放大预设倍数得到的。
  8. 根据权利要求7所述的投屏方法,其特征在于,包括:
    若所述显示器的当前旋转状态与所述终端的纵向模式匹配,基于所述有效信息控制所述显示器呈现投屏画面,其中,所述投屏画面为所述有效信息放大预设倍数得到的。
  9. 根据权利要求7所述的投屏方法,其特征在于,所述判断所述显示器的当前旋转状态与所述终端的纵向模式是否匹配,包括:
    获取所述显示器旋转角度回调信息;
    根据所述图像信息确定所述显示器的目标旋转状态。
  10. 根据权利要求9所述的投屏方法,其特征在于,所述根据所述图像信息确定所述显示器的目标旋转状态,包括:
    如果接收到所述终端发送的所述图像信息中包括左、右黑色信息,则确定所述显示器的目标旋转状态为竖屏状态;
    如果接收到所述终端你发送的所述图像信息中不包括左、右黑色信息,则确定所述显示器的目标旋转状态为横屏状态。
PCT/CN2021/081889 2020-04-24 2021-03-19 一种显示设备及投屏方法 WO2021213097A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202180042822.4A CN115836528A (zh) 2020-04-24 2021-03-19 一种显示设备及投屏方法
US17/805,276 US11662971B2 (en) 2020-04-24 2022-06-03 Display apparatus and cast method

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN202010334727.9A CN113556593B (zh) 2020-04-24 2020-04-24 一种显示设备及投屏方法
CN202010334727.9 2020-04-24
CN202010331501.3 2020-04-24
CN202010331501.3A CN113556590B (zh) 2020-04-24 2020-04-24 一种投屏视频流有效分辨率检测方法及显示设备

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/805,276 Continuation US11662971B2 (en) 2020-04-24 2022-06-03 Display apparatus and cast method

Publications (1)

Publication Number Publication Date
WO2021213097A1 true WO2021213097A1 (zh) 2021-10-28

Family

ID=78271115

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/081889 WO2021213097A1 (zh) 2020-04-24 2021-03-19 一种显示设备及投屏方法

Country Status (3)

Country Link
US (1) US11662971B2 (zh)
CN (1) CN115836528A (zh)
WO (1) WO2021213097A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023086582A1 (en) * 2021-11-12 2023-05-19 Danvas, Inc. Exchange and display of digital content
KR20230128649A (ko) * 2022-02-28 2023-09-05 엘지전자 주식회사 디스플레이 장치 및 그의 동작 방법

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140333671A1 (en) * 2013-05-10 2014-11-13 Samsung Electronics Co., Ltd. Display apparatus and control method thereof
CN110469751A (zh) * 2019-08-19 2019-11-19 四川长虹电器股份有限公司 用于液晶电视的安装结构
CN110581960A (zh) * 2019-09-12 2019-12-17 广州视源电子科技股份有限公司 视频处理方法、装置、***、存储介质和处理器
CN110740364A (zh) * 2019-11-14 2020-01-31 四川长虹电器股份有限公司 一种智能旋转电视装置、***及其工作方法

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012220595A (ja) 2011-04-06 2012-11-12 Seiko Epson Corp 表示装置、表示装置の制御方法、及び、プログラム
CN104753989B (zh) 2013-12-27 2018-09-14 阿里巴巴集团控股有限公司 基于Web-based OS运行环境的屏幕影像传输播放方法及装置
EP3190763B1 (en) 2014-11-03 2022-01-05 Huawei Technologies Co., Ltd. Screen sharing method and sharing device
JP6631181B2 (ja) 2015-11-13 2020-01-15 セイコーエプソン株式会社 画像投射システム、プロジェクター、及び、画像投射システムの制御方法
CN107105184A (zh) 2017-04-01 2017-08-29 深圳市蓝莓派科技有限公司 一种移动终端在竖屏广告机上的同屏投射方法
CN109597268B (zh) 2017-09-30 2020-09-22 昆山国显光电有限公司 显示装置
CN110109636B (zh) 2019-04-28 2022-04-05 华为技术有限公司 投屏方法、电子设备以及***
CN110286864A (zh) 2019-05-15 2019-09-27 武汉卡比特信息有限公司 一种手机与计算机类终端自适应互联投屏方法
CN110267073A (zh) 2019-07-24 2019-09-20 深圳市颍创科技有限公司 一种投屏画面显示及投屏画面旋转方法

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140333671A1 (en) * 2013-05-10 2014-11-13 Samsung Electronics Co., Ltd. Display apparatus and control method thereof
CN110469751A (zh) * 2019-08-19 2019-11-19 四川长虹电器股份有限公司 用于液晶电视的安装结构
CN110581960A (zh) * 2019-09-12 2019-12-17 广州视源电子科技股份有限公司 视频处理方法、装置、***、存储介质和处理器
CN110740364A (zh) * 2019-11-14 2020-01-31 四川长虹电器股份有限公司 一种智能旋转电视装置、***及其工作方法

Also Published As

Publication number Publication date
US11662971B2 (en) 2023-05-30
US20220300241A1 (en) 2022-09-22
CN115836528A (zh) 2023-03-21

Similar Documents

Publication Publication Date Title
CN112565839B (zh) 投屏图像的显示方法及显示设备
WO2021179359A1 (zh) 一种显示设备及显示画面旋转适配方法
WO2021212463A1 (zh) 一种显示设备及投屏方法
WO2021179363A1 (zh) 一种显示设备及开机动画显示方法
CN111866593B (zh) 一种显示设备及开机界面显示方法
CN113556593B (zh) 一种显示设备及投屏方法
WO2021212470A1 (zh) 一种显示设备及投屏画面显示方法
US11662971B2 (en) Display apparatus and cast method
CN112565861A (zh) 一种显示设备
CN113556591A (zh) 一种显示设备及投屏画面旋转显示方法
WO2021179361A1 (zh) 一种显示设备
CN113395600B (zh) 一种显示设备的界面切换方法及显示设备
CN113556590B (zh) 一种投屏视频流有效分辨率检测方法及显示设备
WO2022193475A1 (zh) 显示设备、接收投屏内容的方法及投屏方法
WO2021180223A1 (zh) 一种显示方法及显示设备
CN113630639B (zh) 一种显示设备
CN113542824B (zh) 一种显示设备及应用界面的显示方法
CN112565915A (zh) 显示设备和显示方法
WO2021208016A1 (zh) 一种显示设备及应用界面的显示方法
WO2021195919A1 (zh) 一种显示设备及开机信号源显示适配方法
CN113497965B (zh) 旋转动画的配置方法及显示设备
WO2021184387A1 (zh) 一种动画配置方法及显示设备
CN113497962B (zh) 旋转动画的配置方法及显示设备
CN113015006A (zh) 显示设备及显示方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21792807

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21792807

Country of ref document: EP

Kind code of ref document: A1