CN114339332B - Mobile terminal, display device and cross-network screen projection method - Google Patents


Info

Publication number: CN114339332B (application number CN202110534832.1A)
Authority: CN (China)
Legal status: Active (granted)
Original and current assignee: Hisense Visual Technology Co Ltd
Inventors: 肖成创, 刘美玉
Other languages: Chinese (zh)
Other versions: CN114339332A
Prior art keywords: screen projection, mobile terminal, service, application, display device
Related applications:
PCT/CN2022/084106 (published as WO2022242328A1)
CN202280026627.7A (published as CN117157987A)
US18/510,339 (published as US20240089526A1)

Abstract

The application provides a mobile terminal, a display device, and a cross-network screen projection method. The mobile terminal establishes a cross-network communication connection with the display device through a communication unit and starts a proxy screen projection service; the display device establishes the cross-network communication connection with the mobile terminal through a communicator. The proxy screen projection service obtains a screen projection instruction input by a user in a third-party application, establishes a screen projection connection with the third-party application, and obtains the screen projection information of the third-party application. The mobile terminal then sends the screen projection information to the display device through the proxy screen projection service, and the display device parses the screen projection information and presents the screen projection video. Because the proxy screen projection service connects to the third-party application over DLNA and relays the result over the cross-network connection to the display device, DLNA-based screen projection from third-party applications can cross networks, improving the user experience.

Description

Mobile terminal, display device and cross-network screen projection method
Technical Field
The application relates to the technical field of smart televisions, in particular to a mobile terminal, a display device and a cross-network screen projection method.
Background
As a large-screen device, a smart television can present video playback better and give users a better viewing experience, so video on a mobile terminal is often projected to the smart television for playback.
The main way to play video from a mobile terminal on a smart television is DLNA screen projection: the video link on the mobile terminal is pushed to the television, and the television's player fetches the audio and video stream from the network and plays it. However, DLNA screen projection works only when the mobile terminal and the smart television are on the same local area network.
As everyday needs grow, users increasingly demand screen projection across networks. At present, a mobile terminal and a smart television from the same manufacturer can establish a screen projection connection through a binding mechanism built into the operating system. However, such a connection covers only video resources owned by the mobile terminal itself; when the video resources are provided by third-party applications such as iQiyi, Tencent Video, or Youku, the resources that can be projected are limited, which degrades the user experience.
Disclosure of Invention
The application provides a mobile terminal, a display device, and a cross-network screen projection method to address the technical problem that DLNA-based screen projection from a third-party application works only within the same local area network.
In a first aspect, the present application provides a mobile terminal comprising a display unit, a communication unit, and a processor. The communication unit is configured to establish a communication connection with the display device, and the processor is configured to:
establish a cross-network communication connection with the display device through the communication unit;
start a proxy screen projection service;
receive a screen projection instruction input by a user in a third-party application;
in response to the screen projection instruction, establish, through the proxy screen projection service, a screen projection connection with the third-party application and obtain the screen projection information of the third-party application;
and send the screen projection information of the third-party application to the display device through the proxy screen projection service.
With reference to the first aspect, in one possible implementation, in the step of starting the proxy screen projection service, the processor is further configured to:
start, through the proxy screen projection service, a function for receiving screen projection requests;
bind the proxy screen projection service and the third-party application into the same local area network;
if the user selects the proxy screen projection service in the screen projection device list of the third-party application, establish a DLNA-based screen projection connection between the proxy screen projection service and the third-party application;
and receive, through the proxy screen projection service, the screen projection instruction sent by the third-party application.
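In DLNA terms, the screen projection connection described above is typically driven by the UPnP AVTransport service: the third-party application (acting as a DLNA controller) sends a `SetAVTransportURI` action to the selected renderer, here the proxy screen projection service. As a hedged sketch (the patent does not specify the wire format; this only builds the standard SOAP body such an action carries):

```python
# Builds the SOAP body of the UPnP AVTransport "SetAVTransportURI" action,
# which a DLNA controller (the third-party app) sends to a renderer
# (here, the proxy screen projection service) to start a cast.
def build_set_uri_request(video_url: str, metadata: str = "") -> str:
    return (
        '<?xml version="1.0" encoding="utf-8"?>'
        '<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/" '
        's:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">'
        "<s:Body>"
        '<u:SetAVTransportURI xmlns:u="urn:schemas-upnp-org:service:AVTransport:1">'
        "<InstanceID>0</InstanceID>"
        f"<CurrentURI>{video_url}</CurrentURI>"
        f"<CurrentURIMetaData>{metadata}</CurrentURIMetaData>"
        "</u:SetAVTransportURI>"
        "</s:Body>"
        "</s:Envelope>"
    )
```

The proxy service, on receiving such a request, treats it as the "screen projection instruction" of the claims rather than playing the media itself.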
With reference to the first aspect, in one possible implementation, the third-party application and the proxy screen projection service are bound into the same local area network by binding the proxy screen projection service to a default route or to the loopback address.
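Binding to the loopback address works because the third-party application and the proxy service run on the same device: a socket on 127.0.0.1 is always reachable locally, regardless of which external network the device is on. A minimal sketch (port selection and SSDP advertisement are omitted; a real service would announce the resulting port via SSDP):

```python
import socket

# Sketch: bind the proxy screen projection service's control socket to the
# loopback address so the third-party application (running on the same
# device) can reach it as if both were on one local network. Port 0 asks
# the OS for any free port.
def open_loopback_endpoint() -> socket.socket:
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))
    srv.listen(1)
    return srv
```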
With reference to the first aspect, in one possible implementation, after sending the screen projection information of the third-party application to the display device, the processor is further configured to:
receive a playback control instruction of the third-party application through the proxy screen projection service;
and send the playback control instruction of the third-party application to the display device through the proxy screen projection service.
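The playback control instructions the proxy relays would normally be the standard UPnP AVTransport actions (Play, Pause, Stop, Seek). A hedged sketch of the forwarding step; the relay message shape is an assumption, not specified by the patent:

```python
# Sketch: translate playback control actions received from the third-party
# app into relay messages for the display device. The action names are
# standard UPnP AVTransport actions; the dict message format is illustrative.
SUPPORTED_ACTIONS = {"Play", "Pause", "Stop", "Seek"}

def forward_control(action: str, target: str = "") -> dict:
    if action not in SUPPORTED_ACTIONS:
        raise ValueError(f"unsupported control action: {action}")
    msg = {"type": "control", "action": action}
    if action == "Seek":
        msg["target"] = target  # e.g. a time position such as "00:01:30"
    return msg
```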
With reference to the first aspect, in one possible implementation, the cross-network communication connection with the display device may be established either through a server or directly; in both cases, the connection is established according to a binding relationship between the proxy screen projection service and the display device.
With reference to the first aspect, in one possible implementation, establishing the cross-network communication connection between the mobile terminal and the display device through a server includes: the proxy screen projection service establishes a communication connection with the server, and the display device establishes a communication connection with the server.
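When the connection is relayed through a server, both sides typically register with the server under some bound identity, and the server then forwards casting messages between them. A hedged sketch of such signaling messages (the message names and JSON shape are assumptions for illustration, not taken from the patent):

```python
import json

# Illustrative signaling messages a relay server might forward between the
# proxy screen projection service and the display device.
def make_register_msg(device_id: str, role: str) -> str:
    # role: "proxy" for the mobile terminal's service, "display" for the TV
    return json.dumps({"type": "register", "device_id": device_id, "role": role})

def make_cast_msg(video_url: str, video_name: str) -> str:
    return json.dumps({"type": "cast", "url": video_url, "name": video_name})

def parse_msg(raw: str) -> dict:
    return json.loads(raw)
```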
As can be seen from the above technical solutions, the first aspect of the present application provides a mobile terminal comprising a display unit, a communication unit, and a processor. The communication unit is configured to establish a communication connection with the display device, and the processor is configured to: establish a cross-network communication connection with the display device through the communication unit; start a proxy screen projection service; receive a screen projection instruction input by a user in a third-party application, the instruction being directed at the proxy screen projection service; in response to the instruction, establish a screen projection connection between the proxy screen projection service and the third-party application and obtain the screen projection information of the third-party application; and send the screen projection information to the display device through the proxy screen projection service. This solves the problem that DLNA-based screen projection from a third-party application cannot cross networks.
In a second aspect, the present application provides a display device comprising a display, a communicator, and a controller. The communicator is configured to establish a communication connection with a mobile terminal, and the controller is configured to:
establish a cross-network communication connection with the mobile terminal through the communicator;
and receive the screen projection information and playback control instructions of the third-party application sent by the proxy screen projection service, and parse the screen projection information.
With reference to the second aspect, in one possible implementation, in the step of parsing the screen projection information, the controller is further configured to:
parse the video link and the video name from the screen projection information.
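In a DLNA-based flow, the screen projection information would plausibly arrive as DIDL-Lite metadata, from which the video link and name can be extracted. A hedged sketch, assuming a DIDL-Lite payload (real payloads vary and the patent does not fix the format):

```python
import xml.etree.ElementTree as ET

# Parses the video link and title from DLNA DIDL-Lite metadata — one
# plausible shape of the "screen projection information" the display
# device receives.
_NS = {
    "didl": "urn:schemas-upnp-org:metadata-1-0/DIDL-Lite/",
    "dc": "http://purl.org/dc/elements/1.1/",
}

def parse_cast_info(didl_xml: str) -> dict:
    item = ET.fromstring(didl_xml).find("didl:item", _NS)
    title = item.findtext("dc:title", default="", namespaces=_NS)
    url = item.findtext("didl:res", default="", namespaces=_NS)
    return {"name": title, "url": url}
```

The display device would hand the extracted URL to its player and show the title in the playback UI.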
As can be seen from the foregoing technical solutions, the second aspect of the present application provides a display device comprising a display, a communicator, and a controller. The communicator is configured to establish a communication connection with a mobile terminal, and the controller is configured to: establish a cross-network communication connection with the mobile terminal through the communicator; receive the screen projection information and playback control instructions of the third-party application sent by the proxy screen projection service; and parse the screen projection information. This solves the problem that DLNA-based screen projection from a third-party application cannot cross networks.
In a third aspect, the application provides a cross-network screen projection method, including:
the mobile terminal establishes a cross-network communication connection with the display device through the communication unit and starts a proxy screen projection service;
the display device establishes the cross-network communication connection with the mobile terminal through the communicator;
the proxy screen projection service obtains a screen projection instruction input by a user in a third-party application;
the proxy screen projection service establishes a screen projection connection with the third-party application and obtains the screen projection information of the third-party application;
the mobile terminal sends the screen projection information of the third-party application to the display device through the proxy screen projection service;
the display device receives and parses the screen projection information of the third-party application and presents the screen projection video.
As can be seen from the above technical solution, the third aspect of the application provides a cross-network screen projection method: the mobile terminal establishes a cross-network communication connection with the display device through the communication unit and starts a proxy screen projection service, and the display device establishes the connection with the mobile terminal through the communicator; the proxy screen projection service obtains a screen projection instruction input by a user in a third-party application, establishes a screen projection connection with the third-party application, and obtains its screen projection information; the mobile terminal sends the screen projection information to the display device through the proxy screen projection service; and the display device parses the screen projection information and presents the screen projection video. This solves the problem that DLNA-based screen projection from a third-party application cannot cross networks.
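The steps above can be sketched end to end with the network hops replaced by in-process calls (all class and method names are illustrative, not from the patent):

```python
# End-to-end sketch of the cross-network screen projection method: the
# proxy service receives a cast from the third-party app and forwards it
# over the (here simulated) cross-network connection to the display device.
class DisplayDevice:
    def __init__(self):
        self.now_playing = None

    def receive_cast(self, info: dict):
        # "Parse the screen projection information and present the video."
        self.now_playing = (info["name"], info["url"])

class ProxyCastService:
    def __init__(self, display: DisplayDevice):
        self.display = display  # stands in for the cross-network link

    def on_cast_instruction(self, video_url: str, video_name: str):
        # Received from the third-party app over local DLNA; forward
        # the screen projection information across networks.
        self.display.receive_cast({"name": video_name, "url": video_url})
```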
In a fourth aspect, the application provides a cross-network screen projection system comprising a mobile terminal, a display device, and a server. The mobile terminal has a built-in communication unit, and the display device has a built-in communicator.
The mobile terminal is configured to establish a cross-network communication connection with the display device through the communication unit and start a proxy screen projection service; the proxy screen projection service obtains a screen projection instruction input by a user in a third-party application, establishes a screen projection connection with the third-party application, and obtains the screen projection information of the third-party application.
The display device is configured to establish the cross-network communication connection with the mobile terminal through the communicator, receive and parse the screen projection information of the third-party application, and present the screen projection video.
According to the above technical solution, the fourth aspect of the application thus provides a cross-network screen projection system in which the mobile terminal, the display device, and the server cooperate as described above.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description show only some embodiments of the present application, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic diagram of an operation scenario between a display device and a control apparatus in an embodiment of the present application;
fig. 2 is a block diagram of a hardware configuration of a display device 200 in an embodiment of the present application;
fig. 3 is a software configuration diagram of a display device 200 in an embodiment of the present application;
FIG. 4 is a schematic diagram of an icon control interface display of an application program of a display device in an embodiment of the present application;
fig. 5 is a schematic structural diagram of a mobile terminal 300 in the embodiment of the present application;
fig. 6 is a schematic diagram of a software architecture of the mobile terminal 300 according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a user interface of a mobile terminal according to an embodiment of the present application;
fig. 8a is a schematic structural diagram of a cross-network screen projection system in an embodiment of the present application;
FIG. 8b is a schematic cross-network screen projection operation flow diagram in the embodiment of the present application;
fig. 8c is a connection signaling diagram transferred by a server in the embodiment of the present application;
FIG. 9a is a schematic structural diagram of another cross-network screen projection system in an embodiment of the present application;
fig. 9b is a connection signaling diagram of a direct connection in the embodiment of the present application;
fig. 10 is a schematic diagram of a cross-network screen projection connection structure of a mobile terminal in an embodiment of the present application;
fig. 11 is a schematic diagram of a cross-network screen projection connection structure of a display device in an embodiment of the present application.
Detailed Description
To make the objects, technical solutions, and advantages of the exemplary embodiments of the present application clearer, the technical solutions in the exemplary embodiments will be described clearly and completely below with reference to the drawings. It is obvious that the described exemplary embodiments are only some, not all, of the embodiments of the present application.
All other embodiments that a person skilled in the art can derive from the exemplary embodiments shown in the present application without inventive effort shall fall within the scope of protection of the present application. Moreover, while the disclosure herein is presented in terms of one or more exemplary examples, it is to be understood that each aspect of the disclosure can also be utilized independently and separately from the other aspects.
It should be understood that the terms "first," "second," "third," and the like in the description and in the claims of the present application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used are interchangeable under appropriate circumstances and can be implemented in sequences other than those illustrated or otherwise described herein with respect to the embodiments of the application, for example.
Furthermore, the terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or device that comprises a list of elements is not necessarily limited to those elements explicitly listed, but may include other elements not expressly listed or inherent to such product or device.
The term "module," as used herein, refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
Reference throughout this specification to "embodiments," "some embodiments," "one embodiment," or "an embodiment," etc., means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, appearances of the phrases "in various embodiments," "in some embodiments," "in at least one other embodiment," or "in an embodiment," or the like, throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. Thus, the particular features, structures, or characteristics illustrated or described in connection with one embodiment may be combined, in whole or in part, with the features, structures, or characteristics of one or more other embodiments, without limitation. Such modifications and variations are intended to be included within the scope of the present application.
Fig. 1 is a schematic diagram of a usage scenario of a display device according to an embodiment. As shown in fig. 1, the display apparatus 200 is also in data communication with a server 400, and a user can operate the display apparatus 200 through the mobile terminal 300 or the control device 100.
In some embodiments, the control apparatus 100 may be a remote controller that controls the display device 200 in a wireless or wired manner; communication between the remote controller and the display device includes at least one of infrared protocol communication, Bluetooth protocol communication, or other short-distance communication methods. The user may control the display apparatus 200 by inputting user instructions through keys on the remote controller, voice input, control panel input, and the like.
In some embodiments, the mobile terminal 300 may include any of a cell phone, a tablet, a computer, a laptop, an AR/VR device, and the like.
In some embodiments, the mobile terminal 300 may also be used to control the display device 200. For example, the display apparatus 200 is controlled using an application program running on the mobile terminal.
In some embodiments, the mobile terminal 300 and the display device 200 may also be used for communication of data.
In some embodiments, the display device 200 may also be controlled in a manner other than the control apparatus 100 and the mobile terminal 300, for example, the voice instruction control of the user may be directly received by a module configured inside the display device 200 to obtain a voice instruction, or may be received by a voice control apparatus provided outside the display device 200.
In some embodiments, the display device 200 is also in data communication with a server 400. The display device 200 may be communicatively connected through a Local Area Network (LAN), a Wireless Local Area Network (WLAN), or other networks. The server 400 may provide various contents and interactions to the display device 200, and may be one cluster or a plurality of clusters, including one or more types of servers.
In some embodiments, software steps executed by one step execution agent may be migrated on demand to another step execution agent in data communication therewith for execution. Illustratively, software steps performed by the server may be migrated to be performed on a display device in data communication therewith, and vice versa, as desired.
Fig. 2 exemplarily shows a block diagram of a configuration of the control apparatus 100 according to an exemplary embodiment. As shown in fig. 2, the control device 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory, and a power supply. The control apparatus 100 may receive an input operation instruction from a user and convert the operation instruction into an instruction recognizable and responsive by the display device 200, serving as an interaction intermediary between the user and the display device 200.
In some embodiments, the communication interface 130 is used for external communication, and includes at least one of a WIFI chip, a bluetooth module, NFC, or an alternative module.
In some embodiments, the user input/output interface 140 includes at least one of a microphone, a touchpad, a sensor, a key, or an alternative module.
Fig. 3 illustrates a hardware configuration block diagram of the display apparatus 200 according to an exemplary embodiment.
In some embodiments, the display apparatus 200 includes at least one of a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, a user interface.
In some embodiments the controller comprises a central processor, a video processor, an audio processor, a graphics processor, a RAM, a ROM, a first interface to an nth interface for input/output.
In some embodiments, the display 260 includes a display screen component for displaying pictures and a driving component for driving image display; it receives image signals output from the controller and displays video content, image content, menu manipulation interfaces, and user manipulation UI interfaces.
In some embodiments, the display 260 may be at least one of a liquid crystal display, an OLED display, and a projection display, and may also be a projection device and a projection screen.
In some embodiments, the tuner demodulator 210 receives broadcast television signals via wired or wireless reception and demodulates audio/video signals and associated data signals, such as EPG data, from a plurality of wireless or wired broadcast television signals.
In some embodiments, communicator 220 is a component for communicating with external devices or servers according to various communication protocol types. For example: the communicator may include at least one of a Wifi module, a bluetooth module, a wired ethernet module, and other network communication protocol chips or near field communication protocol chips, and an infrared receiver. The display apparatus 200 may establish transmission and reception of control signals and data signals with the control device 100 or the server 400 through the communicator 220.
In some embodiments, the detector 230 is used to collect signals of the external environment or interaction with the outside. For example, detector 230 includes a light receiver, a sensor for collecting the intensity of ambient light; alternatively, the detector 230 includes an image collector, such as a camera, which can be used to collect external environment scenes, attributes of the user, or user interaction gestures, or the detector 230 includes a sound collector, such as a microphone, which is used to receive external sounds.
In some embodiments, the external device interface 240 may include, but is not limited to, the following: high Definition Multimedia Interface (HDMI), analog or data high definition component input interface (component), composite video input interface (CVBS), USB input interface (USB), RGB port, and the like. The interface may be a composite input/output interface formed by the plurality of interfaces.
In some embodiments, the controller 250 and the modem 210 may be located in different separate devices, that is, the modem 210 may also be located in an external device of the main device where the controller 250 is located, such as an external set-top box.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored in memory. The controller 250 controls the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 260, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the object may be any one of selectable objects, such as a hyperlink, an icon, or other operable control. The operations related to the selected object are: displaying an operation of connecting to a hyperlink page, document, image, etc., or performing an operation of a program corresponding to the icon.
In some embodiments, the controller comprises at least one of a Central Processing Unit (CPU), a video processor, an audio processor, a Graphics Processing Unit (GPU), Random Access Memory (RAM), Read-Only Memory (ROM), first through nth input/output interfaces, a communication bus, and the like.
The CPU executes operating-system and application-program instructions stored in the memory, and executes various application programs, data, and content according to interactive instructions received from external input, so as to finally display and play various audio and video content. The CPU may include a plurality of processors, for example a main processor and one or more sub-processors.
In some embodiments, a graphics processor for generating various graphics objects, such as: at least one of an icon, an operation menu, and a user input instruction display figure. The graphic processor comprises an arithmetic unit, which performs operation by receiving various interactive instructions input by a user and displays various objects according to display attributes; the system also comprises a renderer for rendering various objects obtained based on the arithmetic unit, wherein the rendered objects are used for being displayed on a display.
In some embodiments, the video processor is configured to receive an external video signal and perform at least one of decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, image synthesis, and other video processing according to the standard codec protocol of the input signal, so as to obtain a signal that can be displayed or played directly on the display device 200.
In some embodiments, the video processor includes at least one of a demultiplexing module, a video decoding module, an image composition module, a frame rate conversion module, a display formatting module, and the like. The demultiplexing module is used for demultiplexing the input audio and video data stream. And the video decoding module is used for processing the video signal after demultiplexing, including decoding, scaling and the like. And the image synthesis module is used for carrying out superposition mixing processing on the GUI signal input by the user or generated by the user and the video image after the zooming processing by the graphic generator so as to generate an image signal for display. And the frame rate conversion module is used for converting the frame rate of the input video. And the display formatting module is used for converting the received video output signal after the frame rate conversion, and changing the signal to be in accordance with the signal of the display format, such as an output RGB data signal.
In some embodiments, the audio processor is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform at least one of noise reduction, digital-to-analog conversion, and amplification processing to obtain a sound signal that can be played in the speaker.
In some embodiments, a user may enter user commands on a Graphical User Interface (GUI) displayed on display 260, and the user input interface receives the user input commands through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
In some embodiments, a "user interface" is a media interface for interaction and information exchange between an application or operating system and a user that enables conversion between an internal form of information and a form that is acceptable to the user. A commonly used presentation form of the User Interface is a Graphical User Interface (GUI), which refers to a User Interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include at least one of an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc. visual interface elements.
In some embodiments, user interface 280 is an interface that may be used to receive control inputs (e.g., physical buttons on the body of the display device, or the like).
In some embodiments, the system of the display device may include a kernel (Kernel), a command parser (shell), a file system, and applications. The kernel, shell, and file system together make up the basic operating system structure that allows users to manage files, run programs, and use the system. After power-on, the kernel starts, activates kernel space, abstracts the hardware, initializes hardware parameters, and operates and maintains virtual memory, the scheduler, signals, and inter-process communication (IPC). After the kernel starts, the shell and user applications are loaded. When an application is launched, it is compiled into machine code and runs as a process.
Referring to fig. 4, in some embodiments, the system is divided into four layers, which are, from top to bottom, an Application (Applications) layer (abbreviated as "Application layer"), an Application Framework (Application Framework) layer (abbreviated as "Framework layer"), an Android runtime (Android runtime) and system library layer (abbreviated as "system runtime library layer"), and a kernel layer.
In some embodiments, at least one application runs in the application layer; the applications may be Window programs bundled with the operating system, system settings programs, clock programs, and the like, or applications developed by third-party developers. In particular implementations, the application packages in the application layer are not limited to the above examples.
The framework layer provides an Application Programming Interface (API) and a programming framework for the applications of the application layer. The application framework layer includes a number of predefined functions and acts as a processing center that determines the actions of the applications in the application layer. Through the API interface, an application can access system resources and obtain system services during execution.
As shown in fig. 4, in the embodiment of the present application, the application framework layer includes a manager (Managers), a Content Provider (Content Provider), and the like, where the manager includes at least one of the following modules: an Activity Manager (Activity Manager) is used for interacting with all activities running in the system; the Location Manager (Location Manager) is used for providing the system service or application with the access of the system Location service; a Package Manager (Package Manager) for retrieving various information related to an application Package currently installed on the device; a Notification Manager (Notification Manager) for controlling display and clearing of Notification messages; a Window Manager (Window Manager) is used to manage the icons, windows, toolbars, wallpapers, and desktop components on a user interface.
In some embodiments, the activity manager is used to manage the lifecycle of the various applications as well as general navigation fallback functions, such as controlling the exit, opening, and fallback of applications. The window manager is used to manage all window programs, for example obtaining the size of the display screen, determining whether there is a status bar, locking the screen, capturing the screen, and controlling changes to the display window (for example, shrinking the display window, or displaying it with a shake or a distortion deformation).
In some embodiments, the system runtime library layer provides support for the upper layer, i.e., the framework layer; when the framework layer is used, the Android operating system runs the C/C++ libraries included in the system runtime library layer to implement the functions required by the framework layer.
In some embodiments, the kernel layer is a layer between hardware and software. As shown in fig. 4, the kernel layer includes at least one of the following drivers: an audio driver, a display driver, a Bluetooth driver, a camera driver, a WiFi driver, a USB driver, an HDMI driver, sensor drivers (such as fingerprint sensor, temperature sensor, pressure sensor, etc.), a power driver, and the like.
Fig. 5 shows a schematic structural diagram of the mobile terminal 300.
The following describes an embodiment of the mobile terminal 300 as an example. It should be understood that the mobile terminal 300 shown in fig. 5 is merely an example, and that the mobile terminal 300 may have more or fewer components than shown in fig. 5, may combine two or more components, or may have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
As shown in fig. 5, the mobile terminal 300 includes: a Radio Frequency (RF) circuit 310, a memory 320, a display unit 330, a camera 340, a sensor 350, an audio circuit 360, a Wireless Fidelity (Wi-Fi) circuit 370, a processor 380, a Bluetooth circuit 381, and a power supply 390.
The RF circuit 310 may be used for receiving and transmitting signals during information transmission and reception or during a call, and may receive downlink data of a base station and then send the downlink data to the processor 380 for processing; the uplink data may be transmitted to the base station. Typically, the RF circuitry includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
Memory 320 may be used to store software programs and data. The processor 380 performs various functions of the mobile terminal 300 and processes data by executing the software programs or data stored in the memory 320. The memory 320 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. The memory 320 stores an operating system that enables the mobile terminal 300 to operate, and may also store various application programs as well as code for performing the methods described in the embodiments of the present application.
The display unit 330 may be used to receive input numeric or character information and generate signal input related to user settings and function control of the mobile terminal 300, and particularly, the display unit 330 may include a touch screen 331 disposed on the front surface of the mobile terminal 300 and collecting touch operations of a user thereon or nearby, such as clicking a button, dragging a scroll box, and the like.
The display unit 330 may also be used to display information input by the user or information provided to the user and a Graphical User Interface (GUI) of various menus of the mobile terminal 300. In particular, the display unit 330 may include a display screen 332 disposed on the front surface of the mobile terminal 300. The display screen 332 may be configured in the form of a liquid crystal display, a light emitting diode, or the like. The display unit 330 may be used to display various graphical user interfaces described herein.
The touch screen 331 may be covered on the display screen 332, or the touch screen 331 and the display screen 332 may be integrated to implement the input and output functions of the mobile terminal 300, and after the integration, the touch screen may be referred to as a touch display screen for short. The display unit 330 in this application can display the application programs and the corresponding operation steps.
Camera 340 may be used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing elements convert the light signals into electrical signals which are then passed to the processor 380 for conversion into digital image signals.
The mobile terminal 300 may further comprise at least one sensor 350, such as an acceleration sensor 351, a distance sensor 352, a fingerprint sensor 353, a temperature sensor 354. The mobile terminal 300 may also be configured with other sensors such as a gyroscope, barometer, hygrometer, thermometer, infrared sensor, light sensor, motion sensor, and the like.
The audio circuitry 360, speaker 361, microphone 362 may provide an audio interface between a user and the mobile terminal 300. The audio circuit 360 may transmit the electrical signal converted from the received audio data to the speaker 361, and the audio signal is converted by the speaker 361 and output. The mobile terminal 300 may be further provided with a volume button for adjusting the volume of the sound signal. On the other hand, the microphone 362 converts collected sound signals into electrical signals, which are received by the audio circuit 360 and converted into audio data, which are then output to the RF circuit 310 for transmission to, for example, another terminal or to the memory 320 for further processing. In this application, the microphone 362 may capture the voice of the user.
Wi-Fi is a short-range wireless transmission technology, and the mobile terminal 300 can help a user send and receive e-mails, browse webpages, access streaming media and the like through the Wi-Fi circuit 370, and provides wireless broadband Internet access for the user.
The processor 380 is a control center of the mobile terminal 300; it connects the various parts of the entire terminal using various interfaces and lines, and performs the various functions of the mobile terminal 300 and processes data by running or executing software programs stored in the memory 320 and calling data stored in the memory 320. In some embodiments, the processor 380 may include one or more processing units; the processor 380 may also integrate an application processor, which primarily handles the operating system, user interfaces, applications, and the like, and a baseband processor, which primarily handles wireless communications. It will be appreciated that the baseband processor may not be integrated into the processor 380. The processor 380 in the present application may run an operating system, application programs, user interface display, touch response, and the processing methods described in the embodiments of the present application. Further, the processor 380 is coupled to the display unit 330 and the camera 340.
And the bluetooth circuit 381 is used for performing information interaction with other bluetooth devices with bluetooth circuits through a bluetooth protocol. For example, the mobile terminal 300 may establish a bluetooth connection with a wearable electronic device (e.g., a smart watch) having a bluetooth circuit through the bluetooth circuit 381, so as to perform data interaction.
The mobile terminal 300 also includes a power supply 390 (e.g., a battery) that powers the various components. The power supply may be logically coupled to the processor 380 through a power management system to manage charging, discharging, and power consumption functions through the power management system. The mobile terminal 300 may also be configured with power buttons for powering the terminal on and off, and for locking the screen.
Fig. 6 is a block diagram of a software configuration of the mobile terminal 300 according to the embodiment of the present invention.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 6, the application package may include applications such as camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 6, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide a communication function of the mobile terminal 300. Such as management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables applications to display notification information in the status bar and can be used to convey notification-type messages, which may disappear automatically after a short dwell without requiring user interaction, for example notifying of download completion or message alerts. The notification manager may also present notifications in the form of a chart or scrolling text in the system's top status bar, such as notifications of background-running applications, or as a dialog window on the screen. For example, text information is prompted in the status bar, a prompt tone sounds, the mobile terminal vibrates, or an indicator light flashes.
The Android Runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part is the functional interfaces that the Java language needs to call, and the other is the Android core library.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files, and performs functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules, for example: a surface manager (Surface Manager), media libraries (Media Libraries), three-dimensional graphics processing libraries (e.g., OpenGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media libraries support playback and recording of a variety of commonly used audio and video formats, as well as still image files. The media libraries can support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. It includes at least a display driver, a camera driver, an audio driver, and a sensor driver.
The following describes exemplary software and hardware workflow of the mobile terminal 300 in connection with capturing a photo scene.
When the touch screen 331 receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into an original input event (including touch coordinates, a time stamp of the touch operation, and other information). The raw input events are stored at the kernel layer. And the application program framework layer acquires the original input event from the kernel layer and identifies the control corresponding to the input event. Taking the touch operation as a touch click operation, and taking a control corresponding to the click operation as a control of a camera application icon as an example, the camera application calls an interface of an application framework layer, starts the camera application, further starts a camera drive by calling a kernel layer, and captures a still image or a video through the camera 340.
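The dispatch path above — raw input event at the kernel layer, control identification at the framework layer, application launch — can be sketched with a minimal stand-in in Python. The icon bounds, control names, and return strings below are hypothetical and only illustrate the flow, not any real Android API:

```python
from dataclasses import dataclass
import time

@dataclass
class RawInputEvent:
    """Raw input event as stored in the kernel layer: touch coordinates plus a timestamp."""
    x: int
    y: int
    timestamp: float

# Hypothetical control registry: screen regions mapped to control names
# (left, top, right, bottom); the bounds are illustrative only.
CONTROLS = {
    "camera_icon": (0, 0, 100, 100),
    "gallery_icon": (100, 0, 200, 100),
}

def identify_control(event: RawInputEvent):
    """Framework-layer step: map a raw input event to the control it hit."""
    for name, (l, t, r, b) in CONTROLS.items():
        if l <= event.x < r and t <= event.y < b:
            return name
    return None

def dispatch(event: RawInputEvent) -> str:
    """If the click lands on the camera icon, 'launch' the camera application."""
    control = identify_control(event)
    if control == "camera_icon":
        return "camera application started"
    return "event ignored"

print(dispatch(RawInputEvent(x=50, y=50, timestamp=time.time())))
```

In the real system the last step would additionally call down into the kernel layer to start the camera driver and capture through the camera 340.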
In some implementations, as shown in FIG. 7, a user can open a corresponding application by touching an application icon on the user interface, or can open a corresponding folder by touching a folder icon on the user interface.
Based on the foregoing, the mobile terminal 300 and the display device 200 implement a screen-casting connection through different connection modes. Based on DLNA (Digital Living Network Alliance), the mobile terminal 300 and the display device 200 establish a communication connection in the same local area network; the display device 200 may obtain the link address and name of a resource on the mobile terminal 300 through the screen-casting data channel, and obtain the media content by accessing the link address, thereby playing the media data. Meanwhile, while playing the media data, the display device 200 may also receive control instructions sent by the mobile terminal 300 through the screen-casting data channel, such as fast-forward, pause, and stop casting. However, DLNA screen projection is technically limited to devices in the same local area network and cannot operate across networks.
For cross-network operation, a display device 200 and a mobile terminal 300 from the same manufacturer can establish a screen-casting connection directly through built-in rules. For example, the APP on the mobile terminal 300 and the APP on the display device 200 of the same manufacturer are bound through the same account. However, this cross-network screen-casting approach applies only to the manufacturer APP's own resources, whether free media resources or partner media resources, and cannot cast playback resources owned by other manufacturers across networks.
In order to screen resources on APPs of different vendors across networks, as shown in fig. 8a, in some embodiments of the present application, a system for screen-casting across networks is provided, including a mobile terminal 300, a display device 200, and a server 400; wherein, the mobile terminal 300 includes a display unit 330, a communication unit, and a processor 380; the display device 200 includes a display 275, a communicator 220, and a controller 250.
In practical applications, a user may start a proxy screen projection service through the mobile terminal 300, that is, start a screen-projection-receiving DMR (Digital Media Renderer) service, which can receive and play media files pushed by a DMC (Digital Media Controller). DMR is a main functional module of DLNA; DLNA is implemented based on UPnP (Universal Plug and Play); SSDP (Simple Service Discovery Protocol) is the device discovery protocol defined by UPnP; and SOAP (Simple Object Access Protocol) is the control protocol defined by UPnP, which runs over HTTP. The mobile terminal 300 communicates with the display device 200 through the proxy screen projection service.
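The SSDP discovery step named above can be sketched as follows: a DMC multicasts an M-SEARCH request over UDP to find DMR devices, and each responder replies with a LOCATION header pointing at its device description. This is a minimal sketch of the message format; the sample reply and its addresses are fabricated for illustration:

```python
SSDP_ADDR, SSDP_PORT = "239.255.255.250", 1900  # UPnP multicast group and port

def build_msearch(search_target: str = "urn:schemas-upnp-org:device:MediaRenderer:1") -> bytes:
    """Build the SSDP M-SEARCH request a DMC multicasts to discover DMRs."""
    return (
        "M-SEARCH * HTTP/1.1\r\n"
        f"HOST: {SSDP_ADDR}:{SSDP_PORT}\r\n"
        'MAN: "ssdp:discover"\r\n'
        "MX: 2\r\n"                       # maximum wait time in seconds
        f"ST: {search_target}\r\n"
        "\r\n"
    ).encode("ascii")

def parse_location(response: bytes) -> str:
    """Extract the LOCATION header (device description URL) from an SSDP reply."""
    for line in response.decode("ascii", errors="replace").split("\r\n"):
        if line.lower().startswith("location:"):
            return line.split(":", 1)[1].strip()
    raise ValueError("no LOCATION header in SSDP response")

# Fabricated reply for illustration only.
sample = (b"HTTP/1.1 200 OK\r\n"
          b"LOCATION: http://192.168.1.20:49152/desc.xml\r\n"
          b"ST: urn:schemas-upnp-org:device:MediaRenderer:1\r\n\r\n")
print(parse_location(sample))  # → http://192.168.1.20:49152/desc.xml
```

A real DMC would send `build_msearch()` to the multicast group from a UDP socket and collect replies for MX seconds; the proxy screen projection service answers these searches so that it appears in the third-party APP's device list.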
The proxy screen projection service may be implemented by an application installed in the mobile terminal 300, or in the form of a system application, a system service, a background service, or the like.
If the proxy screen projection service is implemented by an application installed in the mobile terminal 300, that application is the manufacturer's own APP. As shown in fig. 8b and 8c, when the display device 200 is turned on and a login account is entered, the display device 200 starts a receiving service for receiving instruction messages and screen projection information sent by the server 400. The own APP is opened and the user logs in with an account identical to the login account of the display device 200; at this point, the server 400 establishes a communication connection with the own APP. The proxy screen projection service is started, which may enable the DMR function of DLNA; when DLNA is started, the IP bound by SSDP and HTTP may be 0.0.0.0. A third-party APP is then started; the third-party APP supports the DMC function of DLNA, that is, it can search for media resource files on a DMS (Digital Media Server) and designate a DMR to play them, or control the uploading or downloading of multimedia files to the DMS. When the user selects a film to be pushed for screen projection and opens the screen projection service, the own APP appears in the third-party APP's screen projection device list; by selecting it, the user establishes a screen projection connection between the third-party APP and the own APP, and the third-party APP can also send control instructions to the own APP.
After receiving a screen projection instruction sent by the third-party APP, the own APP sends the screen projection information to the display device 200 with which a communication connection has been established. The display device 200 receives and parses the screen projection information, extracting action information such as the media link and media name, or playback control instructions such as pause, and performs the corresponding media playback or control.
In some embodiments, the own APP may process the received screen projection instruction, or may transmit the screen projection information to the display device 200 without processing it.
Binding 0.0.0.0 is equivalent to binding all IPv4 addresses of the mobile terminal 300, so data packets can be received from all network interfaces; whether the third-party APP is on the mobile terminal's WiFi network or its mobile data network, it is in the same local area network as the own APP. In some embodiments, the bound IP may be a loopback address, such as 127.0.0.1, which can establish a binding relationship for same-host network communication.
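The binding semantics described above can be illustrated with a small, self-contained socket sketch. This shows only the wildcard vs. loopback distinction, not the actual DLNA stack; port 0 lets the OS pick a free port:

```python
import socket

# Binding 0.0.0.0 (INADDR_ANY) accepts traffic arriving on ANY interface of
# the device -- WiFi, mobile data, loopback -- which is why the third-party
# APP can always reach the proxy service regardless of which network it uses.
any_sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
any_sock.bind(("0.0.0.0", 0))            # port 0: let the OS pick a free port
print("wildcard bind:", any_sock.getsockname())

# Binding 127.0.0.1 restricts the service to same-host traffic, which still
# works here because the proxy service and the third-party APP both run on
# the same mobile terminal.
lo_sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
lo_sock.bind(("127.0.0.1", 0))
print("loopback bind:", lo_sock.getsockname())

any_sock.close()
lo_sock.close()
```

The same choice applies to both the SSDP socket and the HTTP server that the DMR function starts.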
The mobile terminal 300 communicates with the display device 200 through the proxy screen projection service. The communication connection may be relayed through a server, or cross-network communication may be achieved through a direct connection without relay. The binding between the mobile terminal 300 and the display device 200 may use the same account, a PIN code, the MAC address of the display device 200, a unique identifier, and so on; through this binding the mobile terminal 300 knows which display device to cast to, and the communication connection established through the proxy screen projection service is used to transmit the screen projection information. For a direct connection without relay, both the mobile terminal 300 and the display device 200 may hold public network IP addresses to achieve a cross-network connection between the two. If the mobile terminal 300 is directly connected to the display device 200, as shown in fig. 9a, some embodiments of the present application provide a cross-network screen projection system including the mobile terminal 300 and the display device 200, wherein the mobile terminal 300 includes a display unit 330, a communication unit, and a processor 380, and the display device 200 includes a display 275, a communicator 220, and a controller 250.
As shown in fig. 9b, when the display device 200 is turned on and a login account is entered, the display device 200 starts a receiving service for receiving instruction messages and screen projection information sent by the mobile terminal 300. If the application is the own APP, the own APP is opened and the user logs in with an account identical to the login account of the display device 200; at this point, the display device 200 establishes a communication connection with the own APP. The proxy screen projection service is started, which may enable the DMR function of DLNA; when DLNA is started, the IP bound by SSDP and HTTP may be 0.0.0.0. A third-party APP is then started; the third-party APP supports the DMC function of DLNA, that is, it can search for media resource files on the DMS and designate a DMR to play them, or control the uploading or downloading of multimedia files to the DMS. When the user selects a film to be pushed for screen projection and opens the screen projection service, the own APP appears in the third-party APP's screen projection device list; by selecting it, the user establishes a screen projection connection between the third-party APP and the own APP, and the third-party APP can also send control instructions to the own APP.
After receiving a screen projection instruction sent by the third-party APP, the own APP sends the screen projection information to the display device 200 with which a communication connection has been established. The display device 200 receives and parses the screen projection information, extracting action information such as the media link and media name, or playback control instructions such as pause, and performs the corresponding media playback or control.
In the present application, the mobile terminal 300 starts a screen-projection-receiving DMR service in the local area network. After successfully receiving a command, this service does not execute playback itself; instead, it acts as a proxy for the remote target playback device. A channel capable of cross-network communication is established between the proxy service and the display device 200; after receiving a command, the proxy service either performs minimal processing on it or passes it through unchanged, and transfers the necessary information over the communication channel to the actual executor, the display device 200, to complete the command.
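The proxy's "receive locally, forward remotely" behavior can be sketched as follows. The SOAP envelope follows the UPnP AVTransport `SetAVTransportURI` action format that a DMC sends; the JSON forwarding payload is a hypothetical shape chosen for illustration, not the patent's actual wire format:

```python
import json
import xml.etree.ElementTree as ET

AVT_NS = "urn:schemas-upnp-org:service:AVTransport:1"

def proxy_to_remote(soap_body: str) -> str:
    """Extract the media URI and metadata from a SetAVTransportURI SOAP action
    and repackage only the necessary information for the cross-network channel."""
    root = ET.fromstring(soap_body)
    action = root.find(f".//{{{AVT_NS}}}SetAVTransportURI")
    uri = action.findtext("CurrentURI")
    meta = action.findtext("CurrentURIMetaData") or ""
    # The proxy does not play the media itself; it forwards the screen
    # projection information to the real executor, the display device.
    return json.dumps({"cmd": "cast", "media_url": uri, "meta": meta})

soap = """<?xml version="1.0"?>
<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/">
  <s:Body>
    <u:SetAVTransportURI xmlns:u="urn:schemas-upnp-org:service:AVTransport:1">
      <InstanceID>0</InstanceID>
      <CurrentURI>http://example.com/movie.mp4</CurrentURI>
      <CurrentURIMetaData></CurrentURIMetaData>
    </u:SetAVTransportURI>
  </s:Body>
</s:Envelope>"""

print(proxy_to_remote(soap))
```

Playback control actions (Play, Pause, Stop) would be translated the same way: parse the SOAP action locally, then forward a compact command over the channel to the display device.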
Based on the above cross-network screen projection system, some embodiments of the present application further provide a mobile terminal 300 including a display unit 330, a communication unit, and a processor 380. The communication unit is configured to establish a communication connection with the display device. The processor 380 is configured to: start a proxy screen projection service, which sends a request to the display device through the communication unit to establish a communication connection; receive a screen projection instruction input by the user in a third-party application, where the screen projection instruction points to the proxy screen projection service; in response to the screen projection instruction, establish a screen projection connection between the proxy screen projection service and the third-party application, and obtain the screen projection information; and send the screen projection information to the display device through the proxy screen projection service.
In the step of starting the proxy screen projection service, the processor is further configured to: start the screen projection request receiving function of the proxy screen projection service; bind the proxy screen projection service and the third-party application in the same local area network; and, if the proxy screen projection service is selected among the screen projection devices of the third-party application, receive, via the proxy screen projection service, the screen projection instruction input by the user in the third-party application.
Some embodiments of the present application also provide a display device 200 including a display 275, a communicator 220, and a controller 250. The communicator 220 is configured to establish a communication connection with the mobile terminal. The controller 250 is configured to: receive a communication connection establishment request sent by the mobile terminal; and, in response to the communication connection request, receive the screen projection information and playback control instructions sent by the proxy screen projection service, and parse the screen projection information, extracting the video link and video name information therein.
Some embodiments of the present application further provide a cross-network screen projection method applied to a cross-network screen projection system including a mobile terminal 300 and a display device 200. The method includes the following steps: the mobile terminal 300 starts a proxy screen projection service, which sends a request to the display device through the communication unit to establish a communication connection; the display device 200 receives the communication connection request and establishes a communication connection with the mobile terminal accordingly; the proxy screen projection service obtains a screen projection instruction input by the user in a third-party application; the proxy screen projection service establishes a screen projection connection with the third-party application and obtains the screen projection information; the mobile terminal 300 sends the screen projection information to the display device through the proxy screen projection service; and the display device 200 parses the screen projection information and presents the screen projection video.
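The steps of the method can be sketched end to end with two minimal stand-in classes. Class names, method names, and message shapes are hypothetical; a real embodiment would use the DLNA stack on the local side and a relayed or direct cross-network channel on the remote side:

```python
class DisplayDevice:
    """Stand-in for display device 200: accepts a connection, parses casting info."""
    def __init__(self):
        self.connected = False
        self.now_playing = None

    def accept_connection(self, request: dict) -> bool:
        # Step 2: receive the connection request and establish the connection.
        self.connected = request.get("type") == "connect"
        return self.connected

    def receive_cast(self, info: dict) -> str:
        # Step 6: parse the screen projection info and present the video.
        assert self.connected, "no communication connection established"
        self.now_playing = info["media_url"]
        return f"playing '{info['media_name']}' from {info['media_url']}"

class ProxyScreenCastService:
    """Stand-in for the proxy screen projection service on mobile terminal 300."""
    def __init__(self, display: DisplayDevice):
        self.display = display
        # Steps 1-2: send the connection request; the display accepts it.
        self.display.accept_connection({"type": "connect"})

    def on_cast_instruction(self, media_name: str, media_url: str) -> str:
        # Steps 3-5: receive the third-party app's screen projection
        # instruction and forward the screen projection information.
        return self.display.receive_cast(
            {"media_name": media_name, "media_url": media_url})

tv = DisplayDevice()
proxy = ProxyScreenCastService(tv)
print(proxy.on_cast_instruction("movie", "http://example.com/movie.mp4"))
```

Note that the proxy never plays the media itself; the display device is the only renderer, which is what allows the third-party APP's DLNA cast to cross networks.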
The embodiments provided in the present application are only a few examples of its general concept and do not limit its scope. Any other embodiment that a person skilled in the art derives from the solution of the present application without inventive effort falls within its scope of protection.

Claims (10)

1. A mobile terminal, comprising:
a display unit;
a communication unit configured to establish a communication connection with a display device;
a processor configured to:
establish a cross-network communication connection with the display device through the communication unit;
start a proxy screen projection service, wherein the proxy screen projection service is used for the communication connection between the mobile terminal and the display device;
when the proxy screen projection service is an own application, receive, through the own application, a screen projection instruction input by a user in a third-party application, wherein the own application and the display device share the same login account;
in response to the screen projection instruction, establish a screen projection connection between the own application and the third-party application, and obtain screen projection information of the third-party application; and
send the screen projection information of the third-party application to the display device through the own application.
2. The mobile terminal of claim 1, wherein in the step of starting the proxy screen projection service, the processor is further configured to:
enable the proxy screen projection service to start receiving screen projection requests;
bind the proxy screen projection service and the third-party application within the same local area network;
if the user selects the proxy screen projection service among the screen projection devices of the third-party application, establish a DLNA-based screen projection connection between the proxy screen projection service and the third-party application; and
receive, by the proxy screen projection service, a screen projection instruction sent by the third-party application.
3. The mobile terminal of claim 2, wherein binding the proxy screen projection service and the third-party application within the same local area network comprises:
binding the third-party application and the proxy screen projection service within the same local area network by binding the proxy screen projection service to a default route or to a loopback address.
4. The mobile terminal of claim 1, wherein after sending the screen projection information of the third-party application to the display device, the processor is further configured to:
receive a playback control instruction of the third-party application through the proxy screen projection service; and
send the playback control instruction of the third-party application to the display device through the proxy screen projection service.
5. The mobile terminal of claim 1, wherein the cross-network communication connection with the display device is established either through a server or directly;
the direct cross-network communication connection is established through a binding relationship between the proxy screen projection service and the display device.
6. The mobile terminal of claim 5, wherein establishing the cross-network communication connection between the mobile terminal and the display device through the server comprises:
the proxy screen projection service establishes a communication connection with the server; and
the display device establishes a communication connection with the server.
7. A display device, comprising:
a display;
a communicator configured to establish a communication connection with a mobile terminal;
a controller configured to:
establish a cross-network communication connection with the mobile terminal through the communicator, wherein the screen projection communication between the display device and the mobile terminal is carried by a proxy screen projection service;
receive screen projection information and playback control instructions of a third-party application sent by the proxy screen projection service, and parse the screen projection information;
wherein, when the proxy screen projection service is an own application, the own application and the display device share the same login account.
8. The display device of claim 7, wherein in the step of parsing the screen projection information, the controller is further configured to:
parse the video link and the video name information from the screen projection information.
9. A cross-network screen projection method, comprising the following steps:
a mobile terminal establishes a cross-network communication connection with a display device through a communication unit and starts a proxy screen projection service, wherein the proxy screen projection service is used for the communication connection between the mobile terminal and the display device;
the display device establishes the cross-network communication connection with the mobile terminal through a communicator;
when the proxy screen projection service is an own application, the own application obtains a screen projection instruction input by a user in a third-party application;
the own application establishes a screen projection connection with the third-party application and obtains screen projection information of the third-party application;
the mobile terminal sends the screen projection information of the third-party application to the display device through the own application; and
the display device receives and parses the screen projection information of the third-party application and presents the screen projection video.
10. A cross-network screen projection system, comprising a mobile terminal, a display device, and a server; the mobile terminal is provided with a communication unit; the display device is provided with a communicator;
the mobile terminal is configured to establish a cross-network communication connection with the display device through the communication unit and start a proxy screen projection service, wherein the proxy screen projection service is used for the communication connection between the mobile terminal and the display device; when the proxy screen projection service is an own application, the own application obtains a screen projection instruction input by a user in a third-party application; and the own application establishes a screen projection connection with the third-party application and obtains screen projection information of the third-party application;
the display device is configured to establish the cross-network communication connection with the mobile terminal through the communicator, receive and parse the screen projection information of the third-party application, and present the screen projection video.
CN202110534832.1A 2021-05-17 2021-05-17 Mobile terminal, display device and cross-network screen projection method Active CN114339332B (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN202110534832.1A CN114339332B (en) 2021-05-17 2021-05-17 Mobile terminal, display device and cross-network screen projection method
PCT/CN2022/084106 WO2022242328A1 (en) 2021-05-17 2022-03-30 Method for playback in split screen and display device
CN202280026627.7A CN117157987A (en) 2021-05-17 2022-03-30 Split-screen playing method and display device
US18/510,339 US20240089526A1 (en) 2021-05-17 2023-11-15 Method for playback in split screen and display device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110534832.1A CN114339332B (en) 2021-05-17 2021-05-17 Mobile terminal, display device and cross-network screen projection method

Publications (2)

Publication Number Publication Date
CN114339332A (en) 2022-04-12
CN114339332B (en) 2023-01-20

Family

ID=81044276

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110534832.1A Active CN114339332B (en) 2021-05-17 2021-05-17 Mobile terminal, display device and cross-network screen projection method

Country Status (1)

Country Link
CN (1) CN114339332B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115334365A (en) * 2022-04-27 2022-11-11 亦非云科技(上海)有限公司 Design method for directly using smart television without installing third-party application
CN115426523A (en) * 2022-08-25 2022-12-02 咪咕视讯科技有限公司 Software screen projection method, device, equipment and storage medium
CN116136751B (en) * 2023-04-04 2023-07-25 北京智象信息技术有限公司 Mirror image method for cross-operating system of primary screen and secondary screen
CN116796368A (en) * 2023-06-30 2023-09-22 飞虎互动科技(北京)有限公司 Method, device, equipment and medium for processing internet banking data

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
EP2879398B1 (en) * 2013-11-27 2020-05-20 LG Electronics, Inc. Digital device and method of processing a service thereof
CN105897521A (en) * 2015-12-15 2016-08-24 乐视网信息技术(北京)股份有限公司 Screen projection method
CN107135417B (en) * 2017-06-08 2019-08-20 深圳市耐飞科技有限公司 A kind of throwing screen method and system of HLS protocol
CN110312160A (en) * 2019-06-17 2019-10-08 广州视源电子科技股份有限公司 It is wireless to throw screen method, apparatus, Intelligent flat, terminal and system
CN110324701A (en) * 2019-08-12 2019-10-11 深圳新智联软件有限公司 A kind of wired throwing screen based on DLNA

Also Published As

Publication number Publication date
CN114339332A (en) 2022-04-12

Similar Documents

Publication Publication Date Title
CN114339332B (en) Mobile terminal, display device and cross-network screen projection method
CN114286165B (en) Display equipment, mobile terminal, and screen-throwing data transmission method and system
CN112367543B (en) Display device, mobile terminal, screen projection method and screen projection system
WO2022048203A1 (en) Display method and display device for manipulation prompt information of input method control
CN112165640B (en) Display device
CN114286152A (en) Display device, communication terminal and screen projection picture dynamic display method
CN111970549A (en) Menu display method and display device
CN113784200A (en) Communication terminal, display device and screen projection connection method
CN111954059A (en) Screen saver display method and display device
CN113593279B (en) Vehicle, interaction parameter adjusting method thereof and mobile terminal
CN112351334B (en) File transmission progress display method and display equipment
CN112269668A (en) Application resource sharing and display equipment
CN111818654A (en) Channel access method and display device
CN114390190B (en) Display equipment and method for monitoring application to start camera
CN111787115B (en) Server, display device and file transmission method
CN114286320A (en) Display device, mobile terminal and Bluetooth connection method
CN111782606A (en) Display device, server, and file management method
CN114302197A (en) Voice separation control method and display device
CN113971049A (en) Background service management method and display device
CN111787117A (en) Data transmission method and display device
CN114079827A (en) Menu display method and display device
CN112231088B (en) Browser process optimization method and display device
CN113438553B (en) Display device awakening method and display device
CN112087651B (en) Method for displaying inquiry information and smart television
CN111914511B (en) Remote file browsing method, intelligent terminal and display device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant