CN112363683B - Method and display device for supporting multi-layer display by webpage application

Info

Publication number
CN112363683B
Authority
CN
China
Prior art keywords
layer
application
display
core
core layer
Legal status
Active
Application number
CN202011323324.0A
Other languages
Chinese (zh)
Other versions
CN112363683A (en)
Inventor
张小涛
李源
Current Assignee
Vidaa Netherlands International Holdings BV
Vidaa USA Inc
Original Assignee
Vidaa Netherlands International Holdings BV
Vidaa USA Inc
Application filed by Vidaa Netherlands International Holdings BV and Vidaa USA Inc
Priority to CN202011323324.0A
Publication of CN112363683A
Application granted
Publication of CN112363683B
Status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842: Selection of displayed objects or displayed text elements
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a method and a display device enabling a web application to support multi-layer display. A controller splits the UI layer of the displayed web application into a UI core layer located at the bottom of the uppermost OSD layer and a UI system layer located at the top of the uppermost OSD layer. When a specified application is started, the UI core layer sends a layer display instruction to the application management module, and the application management module forwards the instruction to the UI system layer, so that the UI system layer displays the system functions of the web application. In this way, the same web application can be displayed across multiple layers: when a new specified application is started, the underlying functions of the previously started web application are still displayed in the UI core layer (the bottom of the uppermost OSD layer), while the system functions that need to appear at the top of the uppermost OSD layer are displayed in the UI system layer. Different functions of the same web application are thus displayed on different layers at the same time, improving the user experience.

Description

Method and display device for supporting multi-layer display by webpage application
Technical Field
The present application relates to the technical field of multi-layer management, and in particular to a method and a display device for supporting multi-layer display by a web application.
Background
With the rapid development of display devices, their functions have become increasingly rich and their performance increasingly powerful. Display devices currently include smart televisions, smart set-top boxes, smart boxes, products with smart display screens, and the like. To implement different functions, different applications may be preset in the display device, and some applications typically present a user interface using a browser configured in the display device.
When a user interface is presented by a browser, it is displayed by a UI (User Interface) layer in the display device, and an application that is displayed through the UI layer is referred to as a web application. The web application is displayed at the uppermost on-screen display (OSD) layer of the system for the user to view. However, since the display device may support launching multiple web applications at the same time, a newly launched web application may also need to be displayed at the uppermost layer, which may cause it to obscure functions of a previously launched web application that also need to be displayed at the uppermost layer, such as a volume bar and device or status prompt icons.
It can be seen that, for the same web application, some functions need to be displayed at the uppermost layer while others need to be located at the lowermost layer; however, generally only one layer is set for the same web application, so different functions of the same web application cannot be displayed at the same time.
Disclosure of Invention
The present application provides a method and a display device for supporting multi-layer display by a web application, which solve the problem that an existing web application cannot display its different functions at the same time.
In a first aspect, the present application provides a display apparatus comprising:
a display configured to present a user interface for displaying different functions of the web application;
a controller connected to the display and configured to split the UI layer of the displayed web application into a UI core layer and a UI system layer, where the UI system layer is located at the top of the uppermost OSD layer and is used to display the system functions of the web application that need to be displayed at the top of the uppermost OSD layer, and the UI core layer is located at the bottom of the uppermost OSD layer and is used to display the underlying functions of the web application that need to be displayed at the bottom of the uppermost OSD layer;
an application management module for realizing communication between the UI core layer and the UI system layer is configured in the controller, and the application management module is configured to:
receiving a layer display instruction sent by the UI core layer, where the layer display instruction is an instruction generated by the UI core layer after receiving an application starting instruction for starting a specified application, and the specified application is an application that needs to be displayed at the uppermost OSD layer;
and, in response to the layer display instruction, displaying the specified function of the specified application between the UI core layer and the UI system layer, and sending the layer display instruction to the UI system layer, where the layer display instruction instructs the UI system layer to display the system functions of the web application.
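The following is an illustrative sketch only, in TypeScript, of how such an application management module might relay the layer display instruction; the class, method, and message names are hypothetical and do not represent the claimed implementation.

```typescript
// Illustrative sketch of the application management module relaying a layer
// display instruction from the UI core layer to the UI system layer.
// Message shapes and names are hypothetical, not the claimed implementation.
type LayerInstruction =
  | { type: "layer-display"; specifiedApp: string } // sent by the UI core layer when a specified application starts
  | { type: "layer-exit"; specifiedApp: string };

interface LayerEndpoint {
  onInstruction(instruction: LayerInstruction): void;
}

class ApplicationManagementModule {
  private uiSystemLayer?: LayerEndpoint;

  attachUiSystemLayer(endpoint: LayerEndpoint): void {
    this.uiSystemLayer = endpoint;
  }

  // Called when the UI core layer sends a layer display (or exit) instruction;
  // the module forwards it so the UI system layer shows (or hides) the system
  // functions of the web application.
  handleFromUiCoreLayer(instruction: LayerInstruction): void {
    this.uiSystemLayer?.onInstruction(instruction);
  }
}
```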
In some embodiments of the application, the application management module is further configured to:
after the webpage application is started, receiving registration information sent by the UI core layer and the UI system layer respectively;
and establishing a communication connection between the application management module itself and the UI core layer based on the registration information sent by the UI core layer, and establishing a communication connection between the application management module itself and the UI system layer based on the registration information sent by the UI system layer.
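A minimal sketch of the registration step follows, under the assumption that each layer registers a send channel with the module after the web application starts; the names are hypothetical.

```typescript
// Illustrative sketch: after the web application starts, the UI core layer and
// the UI system layer each register with the application management module,
// which then holds a communication connection to both sides. Hypothetical names.
type LayerName = "ui-core" | "ui-system";

interface RegistrationInfo {
  layer: LayerName;
  send: (message: unknown) => void; // channel back to the registering layer
}

class LayerConnectionRegistry {
  private connections = new Map<LayerName, RegistrationInfo>();

  // Establishes the connection between the module itself and the registering layer.
  register(info: RegistrationInfo): void {
    this.connections.set(info.layer, info);
  }

  connectionTo(layer: LayerName): RegistrationInfo | undefined {
    return this.connections.get(layer);
  }
}
```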
In some embodiments of the application, the application management module is further configured to:
receiving request information sent by the UI system layer;
And sending the request information to the UI core layer, and establishing communication connection between the UI system layer and the UI core layer.
In some embodiments of the application, the application management module is further configured to:
receiving request information sent by the UI core layer;
and sending the request information to the UI system layer, and establishing communication connection between the UI core layer and the UI system layer.
In some embodiments of the application, the application management module is further configured to:
receiving a layer exit instruction sent by the UI core layer;
and sending the layer exit instruction to a UI system layer, wherein the layer exit instruction is used for indicating the UI system layer to cancel the system function of displaying the webpage application.
In some embodiments of the present application, a main process for storing data is configured in the controller, and the main process is configured to:
receiving data storage instructions respectively sent by the UI core layer and the UI system layer;
and respectively storing the data corresponding to the UI core layer and the data corresponding to the UI system layer in response to each data storage instruction.
In some embodiments of the application, the main process is further configured to:
Generating a data change notification when the stored data changes;
and sending the data change notification to the UI core layer and the UI system layer respectively.
In some embodiments of the application, the main process is further configured to:
receiving a core data change notification carrying changed data sent by the UI core layer, wherein the core data change notification is used for representing that data corresponding to the UI core layer changes;
and responding to the core data change notification, storing changed data carried by the core data change notification, and sending the core data change notification to the UI system layer.
In some embodiments of the application, the main process is further configured to:
receiving a system data change notification carrying changed data sent by the UI system layer, wherein the system data change notification is used for representing that the data corresponding to the UI system layer changes;
and responding to the system data change notification, storing changed data carried by the system data change notification, and sending the system data change notification to the UI core layer.
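A minimal sketch of the shared-data main process described in the preceding embodiments follows, assuming a simple key-value store and one change-notification callback per layer; the names are hypothetical, not the claimed implementation.

```typescript
// Illustrative sketch of the main process that stores data for the UI core
// layer and the UI system layer and propagates change notifications between
// them. Hypothetical names, not the claimed implementation.
type LayerName = "ui-core" | "ui-system";

interface DataChangeNotification {
  source: LayerName;                 // which layer reported the change
  changes: Record<string, unknown>;  // the changed data carried by the notification
}

class SharedDataMainProcess {
  private store = new Map<string, unknown>();
  private listeners = new Map<LayerName, (n: DataChangeNotification) => void>();

  subscribe(layer: LayerName, onChange: (n: DataChangeNotification) => void): void {
    this.listeners.set(layer, onChange);
  }

  // Handles a data storage instruction or a data change notification from one
  // layer: the changed data is stored, and the other layer is notified.
  handleDataChange(notification: DataChangeNotification): void {
    for (const [key, value] of Object.entries(notification.changes)) {
      this.store.set(key, value);
    }
    const target: LayerName = notification.source === "ui-core" ? "ui-system" : "ui-core";
    this.listeners.get(target)?.(notification);
  }
}
```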
In a second aspect, the present application further provides a method for supporting multi-layer display by a web application, applied to a controller, where the controller is configured to split the UI layer of the displayed web application into a UI core layer and a UI system layer; the UI system layer is located at the top of the uppermost OSD layer and is configured to display the system functions of the web application that need to be displayed at the top of the uppermost OSD layer; the UI core layer is located at the bottom of the uppermost OSD layer and is configured to display the underlying functions of the web application that need to be displayed at the bottom of the uppermost OSD layer; the method comprises the following steps:
An application management module in the controller receives a layer display instruction sent by the UI core layer, wherein the layer display instruction refers to an instruction generated by the UI core layer after receiving an application starting instruction for starting a specified application, and the specified application refers to an application which needs to be displayed on the uppermost layer of the OSD;
and, in response to the layer display instruction, displaying the specified function of the specified application between the UI core layer and the UI system layer, and sending the layer display instruction to the UI system layer, where the layer display instruction instructs the UI system layer to display the system functions of the web application.
In a third aspect, the present application further provides a storage medium storing a program which, when executed, can implement some or all of the steps of the embodiments of the method for supporting multi-layer display by a web application provided by the present application.
As can be seen from the above technical solutions, in the method and display device for supporting multi-layer display by a web application provided by the embodiments of the present application, the controller splits the UI layer of the displayed web application into a UI core layer located at the bottom of the uppermost OSD layer and a UI system layer located at the top of the uppermost OSD layer. When a specified application is started, the UI core layer sends a layer display instruction to the application management module, and the application management module forwards the layer display instruction to the UI system layer so that the UI system layer displays the system functions of the web application. In this way, the same web application can be displayed across multiple layers: when a new specified application is started, the underlying functions of the previously started web application are still displayed in the UI core layer (the bottom of the uppermost OSD layer), while the system functions that need to be displayed at the top of the uppermost OSD layer are displayed in the UI system layer. Different functions of the same web application are thus displayed on different layers at the same time, improving the user experience.
Drawings
In order to more clearly illustrate the technical solution of the present application, the drawings that are needed in the embodiments will be briefly described below, and it will be obvious to those skilled in the art that other drawings can be obtained from these drawings without inventive effort.
A schematic diagram of an operational scenario between a display device and a control apparatus according to some embodiments is schematically shown in fig. 1;
a hardware configuration block diagram of a display device 200 according to some embodiments is exemplarily shown in fig. 2;
a hardware configuration block diagram of the control device 100 according to some embodiments is exemplarily shown in fig. 3;
a schematic diagram of the software configuration in a display device 200 according to some embodiments is exemplarily shown in fig. 4;
an icon control interface display schematic of an application in a display device 200 according to some embodiments is illustrated in fig. 5;
a display schematic of OSD layer priorities and applications according to some embodiments is illustrated in fig. 6;
a display schematic of OSD layer priorities and new applications according to some embodiments is illustrated in fig. 7;
a schematic display of a Layer1 Layer according to some embodiments is shown schematically in fig. 8;
A schematic diagram of a new layer relationship is shown schematically in fig. 9, in accordance with some embodiments;
a relationship diagram of functional modules in a display device according to some embodiments is exemplarily shown in fig. 10;
a schematic diagram of interactions between applications according to some embodiments is shown schematically in fig. 11;
a flowchart of a method for a web application to support multi-layer display according to some embodiments is illustrated in fig. 12;
an interaction diagram of a method for a web application to support multi-layer display according to some embodiments is illustrated in FIG. 13;
an interaction diagram for data sharing according to some embodiments is illustrated in fig. 14.
Detailed Description
For the purposes of making the objects, embodiments and advantages of the present application more apparent, an exemplary embodiment of the present application will be described more fully hereinafter with reference to the accompanying drawings in which exemplary embodiments of the application are shown, it being understood that the exemplary embodiments described are merely some, but not all, of the examples of the application.
Based on the exemplary embodiments described herein, all other embodiments that may be obtained by one of ordinary skill in the art without making any inventive effort are within the scope of the appended claims. Furthermore, while the present disclosure has been described in terms of an exemplary embodiment or embodiments, it should be understood that each aspect of the disclosure can be practiced separately from the other aspects.
It should be noted that the brief description of the terminology in the present application is for the purpose of facilitating understanding of the embodiments described below only and is not intended to limit the embodiments of the present application. Unless otherwise indicated, these terms should be construed in their ordinary and customary meaning.
The terms first, second, third and the like in the description, in the claims, and in the above-described figures are used for distinguishing between similar or identical objects or entities and are not necessarily meant to describe a particular sequential or chronological order, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances, such that the embodiments of the application are, for example, capable of operation in sequences other than those illustrated or otherwise described herein.
Furthermore, the terms "comprise" and "have," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to those elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" as used in this disclosure refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the function associated with that element.
The term "remote control" as used herein refers to a component of an electronic device (such as a display device as disclosed herein) that can be controlled wirelessly, typically over a relatively short distance. Typically, the electronic device is connected to the electronic device using infrared and/or Radio Frequency (RF) signals and/or bluetooth, and may also include functional modules such as WiFi, wireless USB, bluetooth, motion sensors, etc. For example: the hand-held touch remote controller replaces most of the physical built-in hard keys in a general remote control device with a touch screen user interface.
The term "gesture" as used herein refers to a user behavior by which a user expresses an intended idea, action, purpose, and/or result through a change in hand shape or movement of a hand, etc.
A schematic diagram of an operational scenario between a display device and a control apparatus according to some embodiments is schematically shown in fig. 1. As shown in fig. 1, a user may operate the display apparatus 200 through the mobile terminal 300 and the control device 100.
In some embodiments, the control apparatus 100 may be a remote controller, and the communication between the remote controller and the display device includes infrared protocol communication or bluetooth protocol communication, and other short-range communication modes, etc., and the display device 200 is controlled by a wireless or other wired mode. The user may control the display device 200 by inputting user instructions through keys on a remote control, voice input, control panel input, etc. Such as: the user can input corresponding control instructions through volume up-down keys, channel control keys, up/down/left/right movement keys, voice input keys, menu keys, on-off keys, etc. on the remote controller to realize the functions of the control display device 200.
In some embodiments, mobile terminals, tablet computers, notebook computers, and other smart devices may also be used to control the display device 200. For example, the display device 200 is controlled using an application running on a smart device. The application program, by configuration, can provide various controls to the user in an intuitive User Interface (UI) on a screen associated with the smart device.
In some embodiments, the mobile terminal 300 may install a software application with the display device 200, implement connection communication through a network communication protocol, and achieve the purpose of one-to-one control operation and data communication. Such as: it is possible to implement a control command protocol established between the mobile terminal 300 and the display device 200, synchronize a remote control keyboard to the mobile terminal 300, and implement a function of controlling the display device 200 by controlling a user interface on the mobile terminal 300. The audio/video content displayed on the mobile terminal 300 can also be transmitted to the display device 200, so as to realize the synchronous display function.
As also shown in fig. 1, the display device 200 is also in data communication with the server 400 via a variety of communication means. The display device 200 may be permitted to make communication connections via a Local Area Network (LAN), a Wireless Local Area Network (WLAN), and other networks. The server 400 may provide various contents and interactions to the display device 200. By way of example, display device 200 receives software program updates, or accesses a remotely stored digital media library by sending and receiving information, as well as Electronic Program Guide (EPG) interactions. The server 400 may be a cluster, or may be multiple clusters, and may include one or more types of servers. Other web service content such as video on demand and advertising services are provided through the server 400.
The display device 200 may be a liquid crystal display, an OLED display, a projection display device. The particular display device type, size, resolution, etc. are not limited, and those skilled in the art will appreciate that the display device 200 may be modified in performance and configuration as desired.
The display apparatus 200 may additionally provide a smart network television function of a computer support function, including, but not limited to, a network television, a smart television, an Internet Protocol Television (IPTV), etc., in addition to the broadcast receiving television function.
A hardware configuration block diagram of a display device 200 according to some embodiments is illustrated in fig. 2.
In some embodiments, at least one of the controller 250, the modem 210, the communicator 220, the detector 230, the input/output interface 255, the display 275, the audio output interface 285, the memory 260, the power supply 290, the user interface 265, and the external device interface 240 is included in the display apparatus 200.
In some embodiments, the display 275 is configured to receive image signals from the first processor output, and to display video content and images and components of the menu manipulation interface.
In some embodiments, display 275 includes a display screen assembly for presenting pictures, and a drive assembly for driving the display of images.
In some embodiments, the displayed video content may come from broadcast television content or from various broadcast signals received via wired or wireless communication protocols. Alternatively, various image contents sent by a network server via a network communication protocol may be displayed.
In some embodiments, the display 275 is used to present a user-manipulated UI interface generated in the display device 200 and used to control the display device 200.
In some embodiments, depending on the type of display 275, a drive assembly for driving the display is also included.
In some embodiments, display 275 is a projection display and may further include a projection device and a projection screen.
In some embodiments, communicator 220 is a component for communicating with external devices or external servers according to various communication protocol types. For example: the communicator 220 may include at least one of a Wifi module 221, a bluetooth module 222, a wired ethernet module 223, or other network communication protocol module or a near field communication protocol module, and an infrared receiver.
In some embodiments, the display device 200 may establish control signal and data signal transmission and reception between the communicator 220 and the external control device 100 or the content providing device.
In some embodiments, the user interface 265 may be used to receive infrared control signals from the control device 100 (e.g., an infrared remote control, etc.).
In some embodiments, the detector 230 is a component used by the display device 200 to collect signals from, or interact with, the external environment.
In some embodiments, the detector 230 includes an optical receiver, i.e., a sensor for capturing the intensity of ambient light, so that display parameters can be adaptively changed according to the captured ambient light, and the like.
In some embodiments, the detector 230 may further include an image collector 232, such as a camera, a video camera, etc., which may be used to collect external environmental scenes, collect attributes of a user or interact with a user, adaptively change display parameters, and recognize a user gesture to implement a function of interaction with the user.
In some embodiments, the detector 230 may further include a temperature sensor or the like for sensing the ambient temperature.
In some embodiments, the display device 200 may adaptively adjust the display color temperature of the image accordingly, for example switching to a cooler color temperature when the ambient temperature is relatively high and to a warmer color temperature when the ambient temperature is relatively low.
In some embodiments, the detector 230 also includes a sound collector 231 or the like, such as a microphone, that may be used to receive the user's voice. Illustratively, it receives a voice signal containing a control instruction with which the user controls the display device 200, or collects environmental sound to recognize the type of environmental scene, so that the display device 200 can adapt to the environmental noise.
In some embodiments, as shown in fig. 2, the input/output interface 255 is configured to enable data transfer between the controller 250 and external other devices or other controllers 250. Such as receiving video signal data and audio signal data of an external device, command instruction data, or the like.
In some embodiments, external device interface 240 may include, but is not limited to, the following: any one or more interfaces of a high definition multimedia interface HDMI interface, an analog or data high definition component input interface, a composite video input interface, a USB input interface, an RGB port, and the like can be used. The plurality of interfaces may form a composite input/output interface.
In some embodiments, as shown in fig. 2, the modem 210 is configured to receive the broadcast television signal by a wired or wireless receiving manner, and may perform modulation and demodulation processes such as amplification, mixing, and resonance, and demodulate the audio/video signal from a plurality of wireless or wired broadcast television signals, where the audio/video signal may include a television audio/video signal carried in a television channel frequency selected by a user, and an EPG data signal.
In some embodiments, the frequency point demodulated by the modem 210 is controlled by the controller 250, and the controller 250 may send a control signal according to the user selection, so that the modem responds to the television signal frequency selected by the user and modulates and demodulates the television signal carried by the frequency.
In some embodiments, the broadcast television signal may be classified into a terrestrial broadcast signal, a cable broadcast signal, a satellite broadcast signal, an internet broadcast signal, or the like according to a broadcasting system of the television signal. Or may be differentiated into digital modulation signals, analog modulation signals, etc., depending on the type of modulation. Or it may be classified into digital signals, analog signals, etc. according to the kind of signals.
In some embodiments, the controller 250 and the modem 210 may be located in separate devices, i.e., the modem 210 may also be located in an external device to the main device in which the controller 250 is located, such as an external set-top box or the like. In this way, the set-top box outputs the television audio and video signals modulated and demodulated by the received broadcast television signals to the main body equipment, and the main body equipment receives the audio and video signals through the first input/output interface.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored on the memory. The controller 250 may control the overall operation of the display apparatus 200. For example: in response to receiving a user command to select to display a UI object on the display 275, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the object may be any one of selectable objects, such as a hyperlink or an icon. Operations related to the selected object, such as: displaying an operation of connecting to a hyperlink page, a document, an image, or the like, or executing an operation of a program corresponding to the icon. The user command for selecting the UI object may be an input command through various input means (e.g., mouse, keyboard, touch pad, etc.) connected to the display device 200 or a voice command corresponding to a voice uttered by the user.
As shown in fig. 2, the controller 250 includes at least one of a random access memory 251 (RAM), a read-only memory 252 (ROM), a video processor 270, an audio processor 280, other processors 253 (e.g., a graphics processing unit (GPU)), a central processing unit 254 (CPU), a communication interface, and a communication bus 256 that connects the respective components.
In some embodiments, RAM 251 is used to store temporary data for the operating system or other running programs.
In some embodiments, ROM 252 is used to store instructions for various system boots.
In some embodiments, ROM 252 is used to store a Basic Input Output System (BIOS), which includes the driver programs and the booting of the operating system; the driver programs are responsible for completing the power-on self-test of the system, initializing each functional module in the system, and providing the basic input/output of the system.
In some embodiments, upon receipt of a power-on signal, the display device 200 begins to boot, and the processor 254 executes the system boot instructions in ROM 252, copying the temporary data of the operating system stored in memory into RAM 251 so that the operating system can be started or run. After the operating system is started, the processor 254 copies the temporary data of the various applications in memory into RAM 251 so that the various applications can be started or run.
In some embodiments, processor 254 is used to execute operating system and application program instructions stored in memory. And executing various application programs, data and contents according to various interactive instructions received from the outside, so as to finally display and play various audio and video contents.
In some example embodiments, the processor 254 may include a plurality of processors. The plurality of processors may include one main processor and one or more sub-processors. The main processor performs some operations of the display apparatus 200 in the pre-power-up mode and/or displays a picture in the normal mode. The one or more sub-processors perform operations in a standby mode or the like.
In some embodiments, the graphics processor 253 is configured to generate various graphical objects, such as icons, operation menus, and graphics displayed in response to user input instructions. The graphics processor includes an arithmetic unit, which performs operations on the various interaction instructions input by the user and displays various objects according to their display attributes, and a renderer, which renders the various objects produced by the arithmetic unit so that they can be displayed on the display.
In some embodiments, the video processor 270 is configured to receive an external video signal and perform video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image composition according to the standard codec protocol of the input signal, so as to obtain a signal that can be directly displayed or played on the display device 200.
In some embodiments, video processor 270 includes a demultiplexing module, a video decoding module, an image compositing module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module is used for demultiplexing the input audio/video data stream, such as the input MPEG-2, and demultiplexes the input audio/video data stream into video signals, audio signals and the like.
And the video decoding module is used for processing the demultiplexed video signals, including decoding, scaling and the like.
An image synthesis module, such as an image synthesizer, is used to superimpose and mix the GUI signal, input by the user or generated by the graphics generator, with the scaled video image, so as to generate an image signal for display.
The frame rate conversion module is configured to convert the frame rate of the input video, for example converting a 60 Hz frame rate into a 120 Hz or 240 Hz frame rate, usually by means of frame interpolation.
The display formatting module is used to convert the frame-rate-converted video signal into a video output signal conforming to the display format, for example an RGB data signal.
In some embodiments, the graphics processor 253 may be integrated with the video processor or configured separately. When integrated, it can process the graphics signals output to the display; when configured separately, the two can perform different functions, for example a GPU + FRC (Frame Rate Conversion) architecture.
In some embodiments, the audio processor 280 is configured to receive an external audio signal, decompress and decode the audio signal according to a standard codec protocol of an input signal, and perform noise reduction, digital-to-analog conversion, and amplification processing, so as to obtain a sound signal that can be played in a speaker.
In some embodiments, video processor 270 may include one or more chips. The audio processor may also comprise one or more chips.
In some embodiments, video processor 270 and audio processor 280 may be separate chips or may be integrated together with the controller in one or more chips.
In some embodiments, the audio output receives, under the control of the controller 250, the sound signal output by the audio processor 280, for example through the speaker 286. Besides the speaker carried by the display device 200 itself, the audio output may be an external sound output terminal, such as an external sound interface or an earphone interface, that outputs to a sound-producing device of an external device, and may also include a near-field communication module in the communication interface, for example a Bluetooth module for outputting sound through a Bluetooth speaker.
The power supply 290 supplies power to the display device 200 from an external power source under the control of the controller 250. The power supply 290 may include a built-in power circuit installed inside the display device 200, or may be an external power supply, with a power interface provided in the display device 200 for connecting the external power supply.
The user interface 265 is used to receive an input signal from a user and then transmit the received user input signal to the controller 250. The user input signal may be a remote control signal received through an infrared receiver, and various user control signals may be received through a network communication module.
In some embodiments, the user inputs a user command through the control apparatus 100 or the mobile terminal 300, the user input interface forwards the input through the controller 250, and the display device 200 then responds to the user input.
In some embodiments, a user may input a user command through a Graphical User Interface (GUI) displayed on the display 275, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface recognizes the sound or gesture through the sensor to receive the user input command.
In some embodiments, a "user interface" is a media interface for interaction and exchange of information between an application or operating system and a user that enables conversion between an internal form of information and a form acceptable to the user. A commonly used presentation form of the user interface is a graphical user interface (Graphic User Interface, GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in a display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
The memory 260 includes memory storing various software modules for driving the display device 200. Such as: various software modules stored in the first memory, including: at least one of a base module, a detection module, a communication module, a display control module, a browser module, various service modules, and the like.
The base module is a bottom software module for signal communication between the various hardware in the display device 200 and for sending processing and control signals to the upper modules. The detection module is used for collecting various information from various sensors or user input interfaces and carrying out digital-to-analog conversion and analysis management.
For example, the voice recognition module includes a voice analysis module and a voice instruction database module. The display control module is used for controlling the display to display the image content, and can be used for playing the multimedia image content, the UI interface and other information. And the communication module is used for carrying out control and data communication with external equipment. And the browser module is used for executing data communication between the browsing servers. And the service module is used for providing various services and various application programs. Meanwhile, the memory 260 also stores received external data and user data, images of various items in various user interfaces, visual effect maps of focus objects, and the like.
Fig. 3 illustrates a block diagram of a configuration of the control device 100 according to some embodiments. As shown in fig. 3, the control device 100 includes a controller 110, a communication interface 130, a user input/output interface, a memory, and a power supply.
The control device 100 is configured to control the display device 200; it can receive the user's input operation instructions and convert them into instructions that the display device 200 can recognize and respond to, acting as an intermediary between the user and the display device 200. For example, the user operates the channel up/down keys on the control device 100, and the display device 200 responds to the channel up/down operation.
In some embodiments, the control device 100 may be a smart device. Such as: the control apparatus 100 may install various applications for controlling the display apparatus 200 according to user's needs.
In some embodiments, as shown in fig. 1, the mobile terminal 300 or another intelligent electronic device may function similarly to the control device 100 after installing an application that manipulates the display device 200. For example, by installing such an application, the user can use the various function keys or virtual buttons of the graphical user interface available on the mobile terminal 300 or other intelligent electronic device to implement the functions of the physical keys of the control device 100.
The controller 110 includes a processor 112 and RAM 113 and ROM 114, a communication interface 130, and a communication bus. The controller is used to control the operation and operation of the control device 100, as well as the communication collaboration among the internal components and the external and internal data processing functions.
The communication interface 130 enables communication of control signals and data signals with the display device 200 under the control of the controller 110. Such as: the received user input signal is transmitted to the display device 200. The communication interface 130 may include at least one of a WiFi chip 131, a bluetooth module 132, an NFC module 133, and other near field communication modules.
A user input/output interface 140, wherein the input interface includes at least one of a microphone 141, a touchpad 142, a sensor 143, keys 144, and other input interfaces. Such as: the user can implement a user instruction input function through actions such as voice, touch, gesture, press, and the like, and the input interface converts a received analog signal into a digital signal and converts the digital signal into a corresponding instruction signal, and sends the corresponding instruction signal to the display device 200.
The output interface includes an interface that transmits the received user instruction to the display device 200. In some embodiments, an infrared interface may be used, as well as a radio frequency interface. Such as: when the infrared signal interface is used, the user input instruction needs to be converted into an infrared control signal according to an infrared control protocol, and the infrared control signal is sent to the display device 200 through the infrared sending module. And the following steps: when the radio frequency signal interface is used, the user input instruction is converted into a digital signal, and then the digital signal is modulated according to a radio frequency control signal modulation protocol and then transmitted to the display device 200 through the radio frequency transmission terminal.
In some embodiments, the control device 100 includes at least one of a communication interface 130 and an input-output interface 140. The control device 100 is provided with a communication interface 130 such as: the WiFi, bluetooth, NFC, etc. modules may send the user input instruction to the display device 200 through a WiFi protocol, or a bluetooth protocol, or an NFC protocol code.
A memory 190 is used to store various operating programs, data, and applications for driving and controlling the control device 100 under the control of the controller. The memory 190 may store various control signal instructions input by the user.
A power supply 180 provides operating power support for the various elements of the control device 100 under the control of the controller, and may be a battery and associated control circuitry.
In some embodiments, the system may include a Kernel (Kernel), a command parser (shell), a file system, and an application. The kernel, shell, and file system together form the basic operating system architecture that allows users to manage files, run programs, and use the system. After power-up, the kernel is started, the kernel space is activated, hardware is abstracted, hardware parameters are initialized, virtual memory, a scheduler, signal and inter-process communication (IPC) are operated and maintained. After the kernel is started, shell and user application programs are loaded again. The application program is compiled into machine code after being started to form a process.
A schematic diagram of the software configuration in the display device 200 according to some embodiments is schematically shown in fig. 4. Referring to fig. 4, in some embodiments, the system is divided into four layers, from top to bottom: an application layer (referred to as the "application layer"), an application framework layer (referred to as the "framework layer"), an Android runtime and system library layer (referred to as the "system runtime layer"), and a kernel layer.
In some embodiments, at least one application program is running in the application program layer, and these application programs may be a Window (Window) program of an operating system, a system setting program, a clock program, a camera application, and the like; and may be an application program developed by a third party developer, such as a hi-see program, a K-song program, a magic mirror program, etc. In particular implementations, the application packages in the application layer are not limited to the above examples, and may actually include other application packages, which the embodiments of the present application do not limit.
The framework layer provides an application programming interface (API) and a programming framework for the application programs of the application layer. The application framework layer includes a number of predefined functions. The application framework layer acts as a processing center that decides what the applications in the application layer should do. Through the API interface, an application program can access the resources in the system and obtain the services of the system during execution.
As shown in fig. 4, the application framework layer in the embodiment of the present application includes a manager (Manager), a Content Provider, and the like, where the manager includes at least one of the following modules: an Activity Manager, which interacts with all activities running in the system; a Location Manager, which provides system services or applications with access to the system location services; a Package Manager, which retrieves various information about the application packages currently installed on the device; a Notification Manager, which controls the display and clearing of notification messages; and a Window Manager, which manages the icons, windows, toolbars, wallpaper, and desktop components on the user interface.
In some embodiments, the activity manager is to: the lifecycle of each application program is managed, as well as the usual navigation rollback functions, such as controlling the exit of the application program (including switching the currently displayed user interface in the display window to the system desktop), opening, backing (including switching the currently displayed user interface in the display window to the previous user interface of the currently displayed user interface), etc.
In some embodiments, the window manager is configured to manage all window procedures, such as obtaining a display screen size, determining whether there is a status bar, locking the screen, intercepting the screen, controlling display window changes (e.g., scaling the display window down, dithering, distorting, etc.), and so on.
In some embodiments, the system runtime layer provides support for the upper layer, the framework layer, and when the framework layer is in use, the android operating system runs the C/C++ libraries contained in the system runtime layer to implement the functions to be implemented by the framework layer.
In some embodiments, the kernel layer is a layer between hardware and software. As shown in fig. 4, the kernel layer contains at least one of the following drivers: audio drive, display drive, bluetooth drive, camera drive, WIFI drive, USB drive, HDMI drive, sensor drive (e.g., fingerprint sensor, temperature sensor, touch sensor, pressure sensor, etc.), and the like.
In some embodiments, the kernel layer further includes a power driver module for power management.
In some embodiments, the software programs and/or modules corresponding to the software architecture in fig. 4 are stored in the first memory or the second memory shown in fig. 2 or fig. 3.
In some embodiments, taking a magic mirror application (photographing application) as an example, when the remote control receiving device receives an input operation of the remote control, a corresponding hardware interrupt is sent to the kernel layer. The kernel layer processes the input operation into the original input event (including the value of the input operation, the timestamp of the input operation, etc.). The original input event is stored at the kernel layer. The application program framework layer acquires an original input event from the kernel layer, identifies a control corresponding to the input event according to the current position of the focus and takes the input operation as a confirmation operation, wherein the control corresponding to the confirmation operation is a control of a magic mirror application icon, the magic mirror application calls an interface of the application framework layer, the magic mirror application is started, and further, a camera driver is started by calling the kernel layer, so that a still image or video is captured through a camera.
In some embodiments, for a display device with a touch function, taking a split screen operation as an example, the display device receives an input operation (such as a split screen operation) acted on a display screen by a user, and the kernel layer may generate a corresponding input event according to the input operation and report the event to the application framework layer. The window mode (e.g., multi-window mode) and window position and size corresponding to the input operation are set by the activity manager of the application framework layer. And window management of the application framework layer draws a window according to the setting of the activity manager, then the drawn window data is sent to a display driver of the kernel layer, and the display driver displays application interfaces corresponding to the window data in different display areas of the display screen.
An icon control interface display schematic of an application in a display device 200 according to some embodiments is illustrated in fig. 5. In some embodiments, as shown in fig. 5, the application layer contains at least one icon control that the application can display in the display, such as: a live television application icon control, a video on demand application icon control, a media center application icon control, an application center icon control, a game application icon control, and the like.
In some embodiments, the live television application may provide live television via different signal sources. For example, a live television application may provide television signals using inputs from cable television, radio broadcast, satellite services, or other types of live television services. And, the live television application may display video of the live television signal on the display device 200.
In some embodiments, the video on demand application may provide video from different storage sources. Unlike live television applications, video-on-demand provides video displays from some storage sources. For example, video-on-demand may come from the server side of cloud storage, from a local hard disk storage containing stored video programs.
In some embodiments, the media center application may provide various multimedia content playing applications. For example, a media center may be a different service than live television or video on demand, and a user may access various images or audio through a media center application.
In some embodiments, an application center may be provided to store various applications. The application may be a game, an application, or some other application associated with a computer system or other device but which may be run in a smart television. The application center may obtain these applications from different sources, store them in local storage, and then be run on the display device 200.
In some embodiments, when a web application is displayed in a display device using a browser (e.g., Cobalt), the browser implementation is generally divided into a core layer and a docking layer. The docking layer docks with the specific underlying hardware of the display device to implement hardware-related functions, such as window drawing and media playback. The core layer connects to the docking layer and communicates with the underlying platform (the display device) through it.
A window interface layer (window API) is configured in the docking layer for calling the underlying platform to create a window, and window creation involves many attributes, such as the buffer width, the buffer size, and the layer number (layer). The layer number (layer) is used to finally fuse the content rendered on different layers together, so as to achieve a content-rich interface.
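As an illustrative sketch only (the names and fields below are hypothetical and do not represent the actual Cobalt/Starboard window API), a window-creation call in the docking layer that accepts a layer attribute might look as follows:

```typescript
// Hypothetical sketch of a docking-layer window API that accepts a layer
// attribute alongside the buffer parameters; not the actual Cobalt/Starboard
// interface.
interface WindowOptions {
  width: number;        // buffer width in pixels
  height: number;       // buffer height in pixels
  bufferSize?: number;  // optional buffer size hint
  layer: number;        // OSD layer the window is composited into
}

interface PlatformWindow {
  id: string;
  options: WindowOptions;
}

// The docking layer asks the underlying platform to create a window; the
// compositor later fuses the content rendered on different layers together.
function createPlatformWindow(options: WindowOptions): PlatformWindow {
  // A real port would call into the underlying platform here.
  return { id: `win-layer${options.layer}-${Date.now()}`, options };
}

// Example: the UI window is created on Layer1 while an application window sits on Layer0.
const uiWindow = createPlatformWindow({ width: 1920, height: 1080, layer: 1 });
const appWindow = createPlatformWindow({ width: 1920, height: 1080, layer: 0 });
```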
When implementing graphics composition, taking video playback on a display device as an example, both a video layer and an OSD (On Screen Display) layer are needed: the video layer is used to render the video itself, while the OSD layer is used to display the playback interface. To facilitate graphical display, different layers are given different priorities, and a higher-priority layer is drawn over a lower-priority layer. For example, the video layer has a lower priority than the OSD layer, that is, the OSD layer is displayed above the video layer while the video is playing.
A display schematic of OSD layer priorities and applications according to some embodiments is illustrated in fig. 6. Referring to fig. 6, the OSD layers generally include Layer0, Layer1, Layer2, and so on, with priorities Layer2 < Layer0 < Layer1. The arrow direction is the user's line of sight, i.e., Layer1 is the uppermost layer and Layer2 is the lowermost layer.
In some embodiments, when a user interface of an application is presented on the display device, the UI used to present the user interface is typically located at the uppermost layer (Layer1), and the application is located at the middle layer (Layer0), where the application may be YouTube, Amazon (a video website), EPOS (Electronic Point of Sale), and so on.
The YouTube, Amazon, and EPOS applications (located in Layer0) and the UI (located in Layer1) are single-page applications (SPA) implemented on the browser, i.e., web applications. In the window interface layer (Window API) of the Cobalt browser's docking layer, the related implementation only supports windows with one set of attributes (such as the layer setting); that is, the window created each time the current Cobalt starts has the same attributes, so each application is displayed at a fixed layer after it starts. However, as the variety of applications supported by the display device increases, new applications also need to be presented in Layer1.
A display schematic of OSD layer priorities and new applications according to some embodiments is illustrated in fig. 7; a display schematic of the Layer1 layer according to some embodiments is illustrated in fig. 8. Referring to figs. 7 and 8, the new applications that need to be shown in Layer1 may be Google Assistant and Amazon APL (Alexa Presentation Language), in which case the Google Assistant, Amazon APL, and UI layers are all located in the same Layer1.
Although the three are shown in the same layer (Layer1), there is still a fixed order among them; for example, Google Assistant is at the front, Amazon APL is next, and the UI is at the bottom. It can be seen that this arrangement causes those parts of the UI that need to be displayed at the very front to be blocked. That is, when a new web application is launched, it blocks the functions of the previously launched web application (the application already displayed in the UI layer) that need to be displayed at the uppermost position, such as a volume bar and device or status prompt icons.
Therefore, to avoid such occlusion, these functions need to be displayed at the top of the uppermost OSD layer (i.e., the top of Layer1). In some embodiments, a function that needs to be displayed at the top of the OSD is referred to as a system function, which is a sub-module in the UI layer, such as a system menu, a volume bar, or a device or status prompt icon.
However, because the existing UI layer is displayed as a single layer, all UI-related functions are displayed in the same layer. When some functions need to be displayed at the top of the OSD while other functions remain at the bottom of the OSD, this cannot be achieved with single-layer display, so different functions of the web application presented in the UI layer cannot be displayed at the same time.
Therefore, in order to enable different functions of the web application displayed in the UI layer to be shown simultaneously, an embodiment of the present invention provides a mechanism for a web application to support multi-layer display: the UI layer is split into two different layers, namely a UI core layer and a UI system layer, and a communication mechanism, a data sharing mechanism, and a life cycle management mechanism between the UI core layer and the UI system layer are provided, so that different functions of the web application can be displayed in the UI core layer and the UI system layer respectively. That is, different functions of the same web application are displayed simultaneously on multiple layers, improving the user experience.
In order to execute the mechanism by which a web application supports multi-layer display, an embodiment of the present invention provides a display device, including: a display configured to present a user interface for displaying different functions of a web application; and a controller connected with the display and configured to split the UI layer of the displayed web application into a UI Core layer (UI-Core) and a UI System layer (UI-System), so that different functions of the web application are displayed on multiple layers. Here, the displayed web application refers to the previously launched web application, i.e., the application presented at the front and viewable by the user in the user interface of the display.
A schematic diagram of the new layer relationships according to some embodiments is illustrated in fig. 9. Referring to fig. 9, the UI core layer is located at the bottom of the uppermost OSD layer (i.e., Layer1), with Amazon APL and Google Assistant above it in order, and the UI system layer is located at the top of the uppermost OSD layer (i.e., Layer1).
In some embodiments, the UI system layer is located at the top of the uppermost OSD layer and is used to display the system functions of the web application that need to be displayed at the top of the uppermost OSD layer; the UI core layer is located at the bottom of the uppermost OSD layer and is used to display the bottom-layer functions of the web application that need to be displayed at the bottom of the uppermost OSD layer.
After the UI layer is split into a UI core layer and a UI system layer, in order to facilitate communication, in some embodiments, interactions between the UI core layer and the UI system layer are performed through the APM (Application Program Manager, i.e., the application management module) and TV Main (the main process).
A relationship diagram of functional modules in a display device according to some embodiments is illustrated in fig. 10. Referring to fig. 10, when a web application is already displayed in the UI layer, that is, the UI system layer and the UI core layer of that web application are already shown in the browser presented on the display, and the new applications Google Assistant and Amazon APL are then started, the display priorities of the UI core layer, the UI system layer, Google Assistant, and Amazon APL are: UI system layer > Google Assistant > Amazon APL > UI core layer. The window window1.1 corresponding to the UI system layer is at the front, the window window1.4 corresponding to the UI core layer is at the rear, and window1.2 and window1.3, corresponding to Google Assistant and Amazon APL, lie between window1.1 and window1.4 in that order.
For control and data transfer, the UI core layer, the UI system layer, Google Assistant, and Amazon APL all interact with the controller in the display device through the application management module and the main process. The controller in the display device serves as the underlying platform and is connected with the application management module and the main process respectively, so that the controller can transfer data, notifications, instructions, and the like to both of them.
The UI core layer and the UI system layer cannot interact with each other directly; data transfer and control must go through the application management module and the main process. When the UI core layer requests the UI system layer to perform a certain function, the UI core layer sends a request/instruction to the application management module, which forwards it to the UI system layer, thereby establishing communication from the UI core layer to the UI system layer; when the UI system layer requests the UI core layer to perform a certain function, the UI system layer sends a request/instruction to the application management module, which forwards it to the UI core layer, thereby establishing communication from the UI system layer to the UI core layer. It can be seen that communication between the UI core layer and the UI system layer is implemented by the application management module.
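By way of illustration only, the sketch below models the application management module as a simple message router in TypeScript: each layer registers a handler, and every request from one layer to the other passes through the module. The names `AppManager`, `register`, `forward`, and the message fields are assumptions made for this example, not interfaces defined by the embodiment.

```typescript
type Endpoint = "ui-core" | "ui-system";

interface RequestMessage {
  from: Endpoint;
  to: Endpoint;
  action: string;    // e.g. "show-system-function" (illustrative value)
  payload?: unknown;
}

type Handler = (msg: RequestMessage) => void;

// A stand-in for the application management module: the only path between the two layers.
class AppManager {
  private handlers = new Map<Endpoint, Handler>();

  // Each layer registers a handler so the module can deliver messages to it.
  register(endpoint: Endpoint, handler: Handler): void {
    this.handlers.set(endpoint, handler);
  }

  // Forward a request from one layer to the other; there is no direct layer-to-layer call.
  forward(msg: RequestMessage): void {
    const handler = this.handlers.get(msg.to);
    if (handler) {
      handler(msg);
    }
  }
}

// Usage: the UI core layer asks the UI system layer to perform a function.
const apm = new AppManager();
apm.register("ui-system", (msg) => console.log("UI system layer received:", msg.action));
apm.register("ui-core", (msg) => console.log("UI core layer received:", msg.action));
apm.forward({ from: "ui-core", to: "ui-system", action: "show-system-function" });
```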
In order to ensure real-time interaction between the UI core layer and the UI system layer and accurate responses to the corresponding functions, the data of the two layers needs to be kept synchronized, i.e., data sharing must be realized. To this end, in some embodiments, data sharing between the UI core layer and the UI system layer is implemented with the main process. The UI core layer sends its real-time data to the main process for storage, and the UI system layer likewise sends its real-time data to the main process for storage. All data required by the display device for presenting smart television functions is stored in the main process, including channel information, program information, volume information, personalized setting parameters, and the like.
The data stored in the main process may be accessed by all web applications configured in the display device. The ways in which applications access the data include, but are not limited to, two: one is that each web application establishes a communication connection with the main process and actively acquires the data stored there; the other is that, when the data stored in the main process changes, the change is notified in real time to the UI core layer and the UI system layer of each web application, so that they can update their data in time and the UI picture remains correct. The data transfer process and the process of sending requests/instructions are shown by the solid lines in fig. 10.
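The two access paths described above (active reads and change notifications) can be summarized in a short TypeScript sketch of a shared store. `MainProcessStore`, `get`, `subscribe`, and `set` are hypothetical names used only for this illustration and are not an actual interface of the display device's main process.

```typescript
type Listener = (key: string, value: unknown) => void;

// A simplified stand-in for the data store kept by the main process.
class MainProcessStore {
  private data = new Map<string, unknown>();
  private listeners: Listener[] = [];

  // Pull path: a web application actively reads stored data.
  get(key: string): unknown {
    return this.data.get(key);
  }

  // Push path: layers subscribe and are notified when stored data changes.
  subscribe(listener: Listener): void {
    this.listeners.push(listener);
  }

  set(key: string, value: unknown): void {
    this.data.set(key, value);
    for (const listener of this.listeners) {
      listener(key, value); // notify the UI core layer and UI system layer in real time
    }
  }
}

// Usage: both layers subscribe, so a volume change reaches them without polling.
const store = new MainProcessStore();
store.subscribe((k, v) => console.log(`UI core layer sees ${k} = ${v}`));
store.subscribe((k, v) => console.log(`UI system layer sees ${k} = ${v}`));
store.set("volume", 35);
console.log("active read:", store.get("volume"));
```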
When a new application such as Google Assistant or Amazon APL is started, the application management module controls how Google Assistant and Amazon APL are displayed, i.e., it controls the positions of the layers on which they are shown, so that multiple web applications are displayed on multiple layers at the same time and in the correct order on their different layers. This control process is shown by the dashed lines in fig. 10.
Therefore, when executing the mechanism by which a web application supports multi-layer display, the display device provided by the embodiment of the present invention can display different functions of the same web application on different layers, and can also display multiple web applications in order on different layers, improving the user experience.
A schematic diagram of interactions between applications according to some embodiments is illustrated in fig. 11. For real-time interaction between the UI core layer and the UI system layer, each of them must first establish a connection with the application management module. In some embodiments, in establishing these connections, the application management module is further configured to perform the following steps:
Step 011: after the web application is started, receive the registration information sent by the UI core layer and by the UI system layer respectively.
Step 012: based on the registration information sent by the UI core layer, establish a communication connection between the application management module itself and the UI core layer; and based on the registration information sent by the UI system layer, establish a communication connection between the application management module itself and the UI system layer.
When a web application has been launched in the display device, a user interface of that web application is presented on the display. The controller in the display device splits the UI layer used to present the user interface of the web application into a UI core layer and a UI system layer; after the display device boots and the web application is started, the controller starts the application management module and, at the same time, the UI core layer and the UI system layer corresponding to the web application.
Referring to fig. 11 (a), after the UI core layer and the UI system layer of the same web application are started, each sends its registration information to the application management module, so as to establish a connection with the module and thereby enable communication between the UI core layer and the UI system layer. The registration information includes a registration mode, an object, state information, and the like. The registration mode indicates the registration means that the UI core layer and the UI system layer can each support with the application management module; the object is used for transferring messages and, as a data aggregate, provides an interface for sending and receiving messages; the state information includes the start-up states of the UI core layer and the UI system layer, which they send to the application management module to inform it that the corresponding UI layer has started.
After receiving the registration information of the UI core layer, the application management module establishes a communication connection with the UI core layer; after receiving the registration information of the UI system layer, it establishes a communication connection with the UI system layer. Once the application management module has established communication connections with both, communication between the UI core layer and the UI system layer can be realized.
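As a rough illustration of what such registration information might contain, the TypeScript sketch below paraphrases the registration mode, object, and state information as fields of a message handed to the application management module. The field names and the example values (`"callback"`, `"socket"`) are assumptions for this sketch only.

```typescript
interface RegistrationInfo {
  layer: "ui-core" | "ui-system"; // which split layer is registering
  mode: "callback" | "socket";    // registration means supported with the module (assumed values)
  endpoint: {                     // the "object": an interface for sending and receiving messages
    send(message: string): void;
  };
  state: "started";               // start-up state reported to the application management module
}

// Registration table kept by the application management module.
const registrations = new Map<string, RegistrationInfo>();

function registerLayer(info: RegistrationInfo): void {
  registrations.set(info.layer, info);
  console.log(`${info.layer} registered via ${info.mode}, state: ${info.state}`);
}

// After start-up, the two layers of the same web application register separately.
registerLayer({
  layer: "ui-core",
  mode: "callback",
  endpoint: { send: (m) => console.log("to ui-core:", m) },
  state: "started",
});
registerLayer({
  layer: "ui-system",
  mode: "callback",
  endpoint: { send: (m) => console.log("to ui-system:", m) },
  state: "started",
});
```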
After communication is established between the UI core layer and the UI system layer, the UI core layer can send requests/instructions to the UI system layer through the application management module, and the UI system layer can likewise send requests/instructions to the UI core layer through the application management module.
In some embodiments, when the UI system layer sends a request/instruction to the UI core layer, the application management module is further configured to perform the following steps:
Step 021: receive the request information sent by the UI system layer.
Step 022: send the request information to the UI core layer, thereby establishing a communication connection between the UI system layer and the UI core layer.
Referring to fig. 11 (b), in actual use, if the UI system layer needs to send a request/instruction to the UI core layer to ask it to perform a certain function, the UI system layer sends request information containing the request or instruction to the application management module. After receiving it, the application management module forwards the request information to the UI core layer, which receives it and can execute the corresponding action; at this point, a communication connection between the UI system layer and the UI core layer has been established through the application management module.
In some embodiments, when the UI core layer sends a request/instruction to the UI system layer, the application management module is further configured to:
Step 031: receive the request information sent by the UI core layer;
Step 032: send the request information to the UI system layer, thereby establishing a communication connection between the UI core layer and the UI system layer.
Referring to fig. 11 (c), in actual use, if the UI core layer needs to send a request/instruction to the UI system layer to ask it to perform a certain function, the UI core layer sends request information containing the request or instruction to the application management module. After receiving it, the application management module forwards the request information to the UI system layer, which receives it and can execute the corresponding action; at this point, a communication connection between the UI core layer and the UI system layer has been established through the application management module.
For example, when the UI core layer needs the UI system layer to display a system function, the UI core layer sends a system function display instruction to the application management module, the application management module forwards it to the UI system layer, and the UI system layer displays the system function in response to the instruction.
It can be seen that the controller in the display device splits the UI layer into a UI core layer and a UI system layer, and communication between them is implemented by the application management module. After the communication connection between the UI core layer and the UI system layer is established, the system function can be displayed on the UI system layer when a new application is started, preventing the system function from being blocked by the new application, so that the user can always see the system function, which improves the user experience.
A flowchart of a method for a web application to support multi-layer display according to some embodiments is illustrated in fig. 12; an interaction diagram of the method according to some embodiments is illustrated in fig. 13. To facilitate displaying a web application on multiple layers, the embodiment of the present invention provides a display device that executes the method shown in figs. 12 and 13 by means of the application management module configured in the controller, so that the system function of the previously started web application is displayed at the top of the OSD. Specifically, the application management module is configured to perform the following steps:
S1: receive a layer display instruction sent by the UI core layer, where the layer display instruction is an instruction generated by the UI core layer after receiving an application start instruction for starting a specified application, and the specified application is an application that needs to be displayed in the uppermost OSD layer.
When a web application is started in the current display device, the related functions of that web application are displayed in the uppermost OSD layer (Layer1); that is, the user interface content presented on the display consists of the related functions of that web application.
If the user then starts other web applications, they will block the web application that was started first. At this time, the UI core layer of the previously started web application may send a layer display instruction to the application management module to notify the UI system layer to display the system function.
In this process, referring to fig. 13 (a), when the user starts a new web application, that is, starts a specified application, an application start instruction is generated and sent to the UI core layer of the previously started web application. After receiving the application start instruction, the UI core layer can determine that a new web application has been started on the display device and that the system function needs to be displayed at the top of the uppermost OSD layer, so that the user can still see it.
The number of newly launched web applications may be one or more. A newly launched web application (specified application) is a web application that needs to be displayed in the uppermost OSD layer, and the specified applications may affect one another or the previously launched web application, for example by blocking one another.
At this time, the UI core layer generates a layer display instruction based on the application start instruction and sends it to the application management module, and the application management module forwards the layer display instruction to the UI system layer.
S2: in response to the layer display instruction, display the specified function of the specified application between the UI core layer and the UI system layer, and send the layer display instruction to the UI system layer, where the layer display instruction is used to instruct the UI system layer to display the system function of the web application.
After receiving the layer display instruction sent by the UI core layer, the application management module can display the specified function of the specified application in the uppermost OSD layer and, at the same time, send the layer display instruction to the UI system layer.
Display information is carried in the layer display instruction. On receiving the instruction, the UI system layer learns from this display information which system function of the web application needs to be displayed, i.e., the system function of the web application is displayed on the UI system layer.
At this time, since the UI system layer is at the top of the uppermost OSD layer and the specified function of the specified application is also displayed in the uppermost OSD layer, the specified function of the newly started specified application lies between the UI core layer and the UI system layer: it blocks the bottom-layer function of the previously started web application, while the system function of the previously started web application in turn covers the specified function of the specified application.
Since the system function is usually a volume bar or a device or status prompt icon, its display area is small; the area in which the system function occludes the specified function is therefore small, giving the effect that the system function of the previously started web application floats on top of the specified function of the specified application.
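The resulting stacking order inside the uppermost OSD layer can be pictured with a short sketch that sorts the split layers and the newly started applications by an assumed z value; the numbers and labels are illustrative only and are not values prescribed by the embodiment.

```typescript
interface Surface {
  name: string;
  z: number; // larger z is drawn closer to the viewer inside the uppermost OSD layer
}

// Assumed ordering: UI system layer on top, newly started specified
// applications in the middle, UI core layer at the bottom.
const surfaces: Surface[] = [
  { name: "UI core layer (bottom-layer functions)", z: 0 },
  { name: "Amazon APL (specified application)", z: 1 },
  { name: "Google Assistant (specified application)", z: 2 },
  { name: "UI system layer (volume bar, status icons)", z: 3 },
];

// Composite from back to front: the UI system layer ends up above the specified
// applications, so the small system functions remain visible to the user.
const drawOrder = [...surfaces].sort((a, b) => a.z - b.z);
for (const s of drawOrder) {
  console.log(`draw ${s.name}`);
}
```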
Therefore, when executing the method for a web application to support multi-layer display, the display device provided by the embodiment of the present invention works on the basis of the UI core layer and the UI system layer obtained by splitting the UI layer: when a specified application is newly started, the UI core layer receives the application start instruction and sends a layer display instruction to the application management module, which forwards it to the UI system layer so that the UI system layer displays the system function of the web application. In this way, the display device can display the same web application on multiple layers. When a new specified application is started, the bottom-layer function of the previously started web application is still displayed on the UI core layer (the bottom of the uppermost OSD layer), while the system function that needs to be displayed at the top of the uppermost OSD layer is displayed on the UI system layer, so that different functions of the same web application are displayed simultaneously on different layers, improving the user experience.
In some embodiments, if the user needs to return from or exit a certain display, for example to stop displaying a system function or to stop presenting a specified application, the UI core layer still performs the exit operation on the UI system layer through the application management module. In this case the application management module is further configured to perform the following steps:
Step 31: receive a layer exit instruction sent by the UI core layer.
Step 32: send the layer exit instruction to the UI system layer, where the layer exit instruction is used to instruct the UI system layer to stop displaying the system function of the web application.
Referring to fig. 13 (b), when the user performs an exit operation through the remote control, an exit instruction is generated and sent to the UI core layer; the UI core layer generates a layer exit instruction based on the exit instruction and sends it to the application management module. The application management module forwards the layer exit instruction to the UI system layer, so that the UI system layer no longer displays the system function of the previously started web application.
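Seen from the UI system layer side, the layer display instruction and the layer exit instruction amount to toggling the visibility of the system function. The following TypeScript sketch is a minimal illustration of that idea; `SystemLayer`, `onInstruction`, and the instruction strings are names invented for this example.

```typescript
type LayerInstruction = "layer-display" | "layer-exit";

// A stand-in for the UI system layer: it shows or hides the system function
// depending on the instruction forwarded by the application management module.
class SystemLayer {
  private systemFunctionVisible = false;

  onInstruction(instruction: LayerInstruction): void {
    if (instruction === "layer-display") {
      this.systemFunctionVisible = true;   // e.g. show the volume bar on top
    } else {
      this.systemFunctionVisible = false;  // stop displaying the system function
    }
    console.log("system function visible:", this.systemFunctionVisible);
  }
}

// Instructions from the UI core layer reach the UI system layer only via the module.
const uiSystem = new SystemLayer();
const forwardToSystemLayer = (i: LayerInstruction) => uiSystem.onInstruction(i);
forwardToSystemLayer("layer-display"); // a specified application was started
forwardToSystemLayer("layer-exit");    // the user pressed exit on the remote control
```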
Therefore, after the UI layer is split into the UI core layer and the UI system layer, the application management module implements communication between them, so that when a new application is started, the system function of the previously started web application can be displayed on the UI system layer in time. This prevents the specified function of the newly started application from blocking the previously started web application, keeps the system function that must be shown at the very front on the top of the uppermost OSD layer (the UI system layer) where the user can see it, and thereby displays different functions of the same web application simultaneously on multiple layers, improving the user experience.
In some embodiments, in order for the split UI core layer and UI system layer to share data in real time, data sharing between them is implemented by a main process configured in the display device.
An interaction diagram for data sharing according to some embodiments is illustrated in fig. 14. To this end, a main process for storing data is configured in the controller. For data sharing, referring to fig. 14, the main process is configured to perform the following steps:
and step 41, receiving data storage instructions respectively sent by the UI core layer and the UI system layer.
Step 42: in response to each data storage instruction, store the data corresponding to the UI core layer and the data corresponding to the UI system layer respectively.
Referring to fig. 14 (a), to achieve data sharing, communication connections between the main process and the UI core layer and UI system layer must first be established. Therefore, after the UI core layer and the UI system layer are started, each sends its data storage instruction to the main process.
After receiving a data storage instruction sent by the UI core layer, the main process stores data corresponding to the UI core layer; and after receiving the data storage instruction sent by the UI system layer, the main process stores the data corresponding to the UI system layer.
Storing the data of the UI core layer and of the UI system layer in the main process enables data sharing between the two layers, ensures that the display device stays synchronized while presenting picture content, and improves the user experience.
Because the data stored in the main process comes not only from the UI core layer and the UI system layer but also from other application modules in the controller, the UI core layer and the UI system layer need to be notified in time when the stored data changes, so as to keep the data synchronized and shared in real time. At this time, the main process is further configured to perform the following steps:
Step 51: when the stored data changes, generate a data change notification.
Step 52: send the data change notification to the UI core layer and the UI system layer respectively.
Referring to fig. 14 (b), when the data stored in the main process changes, a data change notification is generated and sent to the UI core layer and the UI system layer respectively, informing them that the data in the controller has changed and that their own data needs to be updated in time.
For example, if the user adjusts the playback sound of the display device, the volume parameter stored in the main process changes, and the UI system layer then needs to display the adjusted volume value synchronously. In this process, the main process sends a volume parameter change notification to the UI core layer and the UI system layer, and the UI core layer, through the application management module, controls the UI system layer to synchronize the value on the volume bar with the new volume parameter, so that the user can adjust the volume again based on the value shown on the volume bar.
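A minimal sketch of this volume-synchronization flow, with all function and variable names assumed for the example, might look as follows: the main process stores the new volume, notifies the layers, and the UI core layer asks the UI system layer (through a stand-in for the application management module) to bring the volume bar up to date.

```typescript
// Volume parameter kept by the main process (illustrative initial value).
let storedVolume = 20;

// UI system layer: the value shown on the volume bar mirrors the stored parameter.
let volumeBarValue = storedVolume;

// Stand-in for the application management module forwarding the sync request.
function forwardSyncRequest(newVolume: number): void {
  volumeBarValue = newVolume;
  console.log(`volume bar now shows ${volumeBarValue}`);
}

// UI core layer: on a change notification from the main process, it asks the
// UI system layer (through the module) to synchronize the volume bar.
function onVolumeChangeNotification(newVolume: number): void {
  forwardSyncRequest(newVolume);
}

// Main process: the user adjusts the playback sound, the stored value changes,
// and the layers are notified.
function setVolume(newVolume: number): void {
  storedVolume = newVolume;
  onVolumeChangeNotification(storedVolume);
}

setVolume(35);
```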
In some embodiments, when the data corresponding to the UI core layer changes, the change needs to be synchronized to the main process for storage and the UI system layer needs to be informed at the same time, so as to keep the data synchronized. In this case the main process is further configured to perform the following steps:
Step 61: receive a core data change notification carrying the changed data sent by the UI core layer, where the core data change notification is used to indicate that the data corresponding to the UI core layer has changed.
Step 62: in response to the core data change notification, store the changed data carried by it and send the core data change notification to the UI system layer.
Referring to fig. 14 (c), when the data corresponding to the UI core layer of the previously started web application changes, the UI core layer sends a core data change notification carrying the changed data to the main process. After receiving it, the main process stores the changed data and forwards the core data change notification to the UI system layer, informing it that the UI core layer has updated data, so that the new data can be obtained in time and the UI system layer stays synchronized with the UI core layer.
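The store-and-forward behaviour of steps 61 and 62 can be summarized in a few lines; the notification shape and the example key are assumptions made for this sketch, not data defined by the embodiment.

```typescript
interface CoreDataChangeNotification {
  key: string;     // which item of UI core layer data changed
  value: unknown;  // the changed data carried by the notification
}

// Simplified main-process handling: store the changed data, then forward the
// same notification to the UI system layer so both layers stay synchronized.
const mainProcessData = new Map<string, unknown>();

function notifySystemLayer(change: CoreDataChangeNotification): void {
  console.log(`UI system layer informed that ${change.key} changed`);
}

function onCoreDataChange(change: CoreDataChangeNotification): void {
  mainProcessData.set(change.key, change.value); // keep the latest data in the main process
  notifySystemLayer(change);                     // step 62: forward to the UI system layer
}

// Example: the UI core layer reports a change in one of its data items.
onCoreDataChange({ key: "focusedChannel", value: "channel-1" });
```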
In some embodiments, when the data corresponding to the UI system layer changes, the change likewise needs to be synchronized to the main process for storage and the UI core layer needs to be informed at the same time, so as to keep the data synchronized. In this case the main process is further configured to perform the following steps:
and step 71, receiving a system data change notification carrying changed data sent by the UI system layer, wherein the system data change notification is used for representing that the corresponding data of the UI system layer changes.
Step 72: in response to the system data change notification, store the changed data carried by it and send the system data change notification to the UI core layer.
Referring to fig. 14 (d), when the data corresponding to the UI system layer of the previously started web application changes, the UI system layer sends a system data change notification carrying the changed data to the main process. After receiving it, the main process stores the changed data and forwards the system data change notification to the UI core layer, informing it that the UI system layer has updated data, so that the new data can be obtained in time and the UI core layer stays synchronized with the UI system layer.
Therefore, after the UI layer is split into the UI core layer and the UI system layer, the display device provided by the embodiment of the present invention uses the main process to realize data sharing between the two layers; whenever the main process, the UI core layer, or the UI system layer changes its data, the other two parties are notified in time, so that the latest data is always stored in the main process and the data of the UI system layer and the UI core layer is kept synchronized.
A flowchart of a method for a web application to support multi-layer display according to some embodiments is illustrated in fig. 12. Referring to fig. 12, an embodiment of the present invention provides a method for a web application to support multi-layer display, applied to a controller, where the controller is configured to split the UI layer of the displayed web application into a UI core layer and a UI system layer; the UI system layer is located at the top of the uppermost OSD layer and is used to display the system functions of the web application that need to be displayed there, while the UI core layer is located at the bottom of the uppermost OSD layer and is used to display the bottom-layer functions of the web application that need to be displayed there. The method is performed by the application management module and includes:
S1: the application management module in the controller receives a layer display instruction sent by the UI core layer, where the layer display instruction is an instruction generated by the UI core layer after receiving an application start instruction for starting a specified application, and the specified application is an application that needs to be displayed in the uppermost OSD layer;
S2: in response to the layer display instruction, display the specified function of the specified application between the UI core layer and the UI system layer, and send the layer display instruction to the UI system layer, where the layer display instruction is used to instruct the UI system layer to display the system function of the web application.
As can be seen from the above technical solutions, in the method and display device for a web application to support multi-layer display provided by the embodiments of the present invention, the controller splits the UI layer of the displayed web application into a UI core layer located at the bottom of the uppermost OSD layer and a UI system layer located at the top of the uppermost OSD layer. When a specified application is newly started, the UI core layer sends a layer display instruction to the application management module, which forwards it to the UI system layer so that the UI system layer displays the system function of the web application. In this way, the method and display device can display the same web application on multiple layers: when a new specified application is started, the bottom-layer function of the previously started web application is still displayed on the UI core layer (the bottom of the uppermost OSD layer), while the system function that needs to be displayed at the top of the uppermost OSD layer is displayed on the UI system layer, so that different functions of the same web application are displayed simultaneously on different layers, improving the user experience.
In a specific implementation, the present invention further provides a computer storage medium, where the computer storage medium may store a program that, when executed, may perform some or all of the steps in each embodiment of the method for a web application to support multi-layer display provided by the present invention. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
It will be apparent to those skilled in the art that the techniques of the embodiments of the present invention may be implemented by software plus a necessary general-purpose hardware platform. Based on such an understanding, the technical solutions in the embodiments of the present invention, in essence or in the part contributing to the prior art, may be embodied in the form of a software product, which may be stored in a storage medium such as a ROM/RAM, a magnetic disk, or an optical disk, and which includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute the methods described in the embodiments, or in some parts of the embodiments, of the present invention.
The same or similar parts of the various embodiments in this specification may be referred to each other. In particular, since the embodiment of the method for a web application to support multi-layer display is substantially similar to the display device embodiment, its description is relatively brief, and reference may be made to the description of the display device embodiment.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solution of the present application, not to limit it. Although the application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the application.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (10)

1. A display device, characterized by comprising:
a display configured to present a user interface for displaying different functions of the web application;
The controller is connected with the display and is used for splitting a UI layer of the displayed webpage application into a UI core layer and a UI system layer, the UI system layer is positioned on the top layer of the uppermost OSD layer, and the UI system layer is used for displaying system functions which need to be displayed on the top layer of the uppermost OSD layer in the webpage application; the UI core layer is positioned at the bottom layer of the uppermost OSD layer and is used for displaying the bottom layer function which is required to be displayed at the bottom layer of the uppermost OSD layer in the webpage application;
an application management module for realizing communication between the UI core layer and the UI system layer is configured in the controller, and the application management module is configured to:
receiving a layer display instruction sent by the UI core layer, wherein the layer display instruction refers to an instruction generated by the UI core layer after receiving an application starting instruction for starting a specified application, and the specified application refers to an application which needs to be displayed on the uppermost layer of the OSD;
and responding to the layer display instruction, displaying the designated function of the designated application between the UI core layer and the UI system layer, and sending the layer display instruction to the UI system layer, wherein the layer display instruction is used for indicating the UI system layer to display the system function of the webpage application.
2. The display device of claim 1, wherein the application management module is further configured to:
after the webpage application is started, receiving registration information sent by the UI core layer and the UI system layer respectively;
and establishing communication connection between the local terminal and the UI core layer based on the registration information sent by the UI core layer, and establishing communication connection between the local terminal and the UI system layer based on the registration information sent by the UI system layer.
3. The display device of claim 2, wherein the application management module is further configured to:
receiving request information sent by the UI system layer;
and sending the request information to the UI core layer, and establishing communication connection between the UI system layer and the UI core layer.
4. The display device of claim 2, wherein the application management module is further configured to:
receiving request information sent by the UI core layer;
and sending the request information to the UI system layer, and establishing communication connection between the UI core layer and the UI system layer.
5. The display device of claim 1, wherein the application management module is further configured to:
Receiving a layer exit instruction sent by the UI core layer;
and sending the layer exit instruction to a UI system layer, wherein the layer exit instruction is used for indicating the UI system layer to cancel displaying the system function of the webpage application.
6. The display device of claim 1, wherein a host process for storing data is configured within the controller, the host process configured to:
receiving data storage instructions respectively sent by the UI core layer and the UI system layer;
and respectively storing the data corresponding to the UI core layer and the data corresponding to the UI system layer in response to each data storage instruction.
7. The display device of claim 6, wherein the host process is further configured to:
generating a data change notification when the stored data changes;
and sending the data change notification to the UI core layer and the UI system layer respectively.
8. The display device of claim 6, wherein the host process is further configured to:
receiving a core data change notification carrying changed data sent by the UI core layer, wherein the core data change notification is used for representing that data corresponding to the UI core layer changes;
And responding to the core data change notification, storing changed data carried by the core data change notification, and sending the core data change notification to the UI system layer.
9. The display device of claim 6, wherein the host process is further configured to:
receiving a system data change notification carrying changed data sent by the UI system layer, wherein the system data change notification is used for representing that the data corresponding to the UI system layer changes;
and responding to the system data change notification, storing changed data carried by the system data change notification, and sending the system data change notification to the UI core layer.
10. The method for supporting multi-layer display by the webpage application is characterized by being applied to a controller, wherein the controller is used for splitting a UI layer of the displayed webpage application into a UI core layer and a UI system layer, the UI system layer is positioned at the top layer of an OSD uppermost layer, and the UI system layer is used for displaying system functions which need to be displayed at the top layer of the OSD uppermost layer in the webpage application; the UI core layer is positioned at the bottom layer of the uppermost OSD layer and is used for displaying the bottom layer function which is required to be displayed at the bottom layer of the uppermost OSD layer in the webpage application; the method comprises the following steps:
An application management module in the controller receives a layer display instruction sent by the UI core layer, wherein the layer display instruction refers to an instruction generated by the UI core layer after receiving an application starting instruction for starting a specified application, and the specified application refers to an application which needs to be displayed on the uppermost layer of the OSD;
and responding to the layer display instruction, displaying the designated function of the designated application between the UI core layer and the UI system layer, and sending the layer display instruction to the UI system layer, wherein the layer display instruction is used for indicating the UI system layer to display the system function of the webpage application.
CN202011323324.0A 2020-11-23 2020-11-23 Method and display device for supporting multi-layer display by webpage application Active CN112363683B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011323324.0A CN112363683B (en) 2020-11-23 2020-11-23 Method and display device for supporting multi-layer display by webpage application

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011323324.0A CN112363683B (en) 2020-11-23 2020-11-23 Method and display device for supporting multi-layer display by webpage application

Publications (2)

Publication Number Publication Date
CN112363683A CN112363683A (en) 2021-02-12
CN112363683B true CN112363683B (en) 2023-10-31

Family

ID=74533232

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011323324.0A Active CN112363683B (en) 2020-11-23 2020-11-23 Method and display device for supporting multi-layer display by webpage application

Country Status (1)

Country Link
CN (1) CN112363683B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118276747A (en) * 2022-06-23 2024-07-02 荣耀终端有限公司 Picture display method and device and terminal equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101499084A (en) * 2008-11-14 2009-08-05 深圳市茁壮网络技术有限公司 Application element indication method, web page browser and web page image generation system
CN110750748A (en) * 2019-10-24 2020-02-04 杭州网景汇网络科技有限公司 Webpage display method
CN111343492A (en) * 2020-02-17 2020-06-26 海信电子科技(深圳)有限公司 Display method and display device of browser in different layers
CN111935530A (en) * 2020-07-31 2020-11-13 海信视像科技股份有限公司 Display device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2792662C (en) * 2011-10-18 2017-11-14 Research In Motion Limited Method of rendering a user interface
US11074053B2 (en) * 2018-09-07 2021-07-27 Boyd Cannon Multerer User interface generation system

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101499084A (en) * 2008-11-14 2009-08-05 深圳市茁壮网络技术有限公司 Application element indication method, web page browser and web page image generation system
CN110750748A (en) * 2019-10-24 2020-02-04 杭州网景汇网络科技有限公司 Webpage display method
CN111343492A (en) * 2020-02-17 2020-06-26 海信电子科技(深圳)有限公司 Display method and display device of browser in different layers
CN111935530A (en) * 2020-07-31 2020-11-13 海信视像科技股份有限公司 Display device

Also Published As

Publication number Publication date
CN112363683A (en) 2021-02-12

Similar Documents

Publication Publication Date Title
CN112214189B (en) Image display method and display device
CN112019782B (en) Control method and display device of enhanced audio return channel
CN111970549B (en) Menu display method and display device
CN112165640B (en) Display device
CN112087671B (en) Display method and display equipment for control prompt information of input method control
CN112243141B (en) Display method and display equipment for screen projection function
CN112118400A (en) Display method of image on display device and display device
CN114095769B (en) Live broadcast low-delay processing method of application-level player and display device
CN111984167B (en) Quick naming method and display device
CN112269668A (en) Application resource sharing and display equipment
CN112017415A (en) Recommendation method of virtual remote controller, display device and mobile terminal
CN112040340A (en) Resource file acquisition method and display device
CN112363683B (en) Method and display device for supporting multi-layer display by webpage application
CN114390190B (en) Display equipment and method for monitoring application to start camera
CN111935530B (en) Display equipment
CN111988646B (en) User interface display method and display device of application program
CN115185392A (en) Display device, image processing method and device
CN113971049A (en) Background service management method and display device
CN112291600B (en) Caching method and display device
CN113438553B (en) Display device awakening method and display device
CN112199612B (en) Bookmark adding and combining method and display equipment
CN112231088B (en) Browser process optimization method and display device
CN113194355B (en) Video playing method and display equipment
CN111935519B (en) Channel switching method and display device
CN113194361B (en) Legal statement content display and display device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant