CN115113832A - Cross-device synchronous display control method and system - Google Patents

Info

Publication number: CN115113832A
Application number: CN202110287070.XA
Authority: CN (China)
Prior art keywords: display, control data, remote control, interface, display content
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Inventors: 杨帆, 卢曰万, 张乐乐
Current assignee: Huawei Technologies Co., Ltd. (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original assignee: Huawei Technologies Co., Ltd.
Priority date: the priority date is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed

Events:
• Application filed by Huawei Technologies Co., Ltd.
• Priority to CN202110287070.XA (publication CN115113832A)
• Priority to PCT/CN2022/079942 (publication WO2022194005A1)
• Publication of CN115113832A

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 — Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F 3/1423 — controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F 3/147 — using display panels

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Transforming Electric Information Into Light Information (AREA)
  • Selective Calling Equipment (AREA)

Abstract

The application relates to a cross-device synchronous display control method and system. The method comprises the following steps: receiving remote control data by a first device having a first display, wherein the first device is coupled with a second device having a second display, and the first display synchronously displays first display content that is also displayed by the second display; sending the remote control data, or control data matched with the remote control data, to the second device; receiving second display content, wherein the second display content is obtained by the second device adjusting the first display content according to the remote control data or the control data; and displaying the second display content on the first display.

Description

Cross-device synchronous display control method and system
Technical Field
The application relates to the technical field of intelligent equipment, in particular to a cross-equipment synchronous display control method and system.
Background
Screen projection (screen mirroring) technology is widely used in work and daily life. A user can connect a smart terminal to a display, either through the display's built-in screen projection module or through an external screen projection device, and mirror the content shown on the smart terminal to the display, gaining a larger viewing area.
In the related art, the user has to operate the smart terminal itself in order to control the content shown on the display. The user's gaze therefore has to switch back and forth between the smart terminal and the display, which is inconvenient.
A screen projection method that is more convenient to operate is therefore needed.
Disclosure of Invention
In view of this, a method and a system for controlling cross-device synchronous display are provided.
In a first aspect, an embodiment of the present application provides a control method for cross-device synchronous display, including:
receiving remote control data by a first device having a first display, the first device coupled to a second device having a second display, the first display synchronously displaying first display content, wherein the first display content is displayed by the second display;
sending the remote control data to the second device;
receiving second display content, wherein the second display content is obtained by the second device adjusting the first display content according to the remote control data;
displaying the second display content on the first display.
In this embodiment of the application, the first device may forward the remote control data it receives to the second device, while synchronously displaying the content shown by the second device. After receiving the remote control data, the second device may adjust the displayed first display content accordingly, generate second display content, and send it back to the first device. In this way, a user watching the mirrored content on the first display can adjust the display interface without shifting their gaze to the second device, which greatly improves convenience of use.
According to a first possible implementation manner of the first aspect, the first display is further configured to display an interface focus, and the remote control data includes operation data of the interface focus.
In this embodiment, an interface focus is shown in the display interface, so that the user can see at a glance where the focus currently is and operate accordingly.
According to a second possible implementation manner of the first aspect, the interface focus is disposed on a control of the display interface.
In this embodiment of the application, the interface focus is placed on a control of the display interface, so that the focus lands on a relatively important position in the interface, which improves the efficiency with which the user adjusts the display interface.
According to a third possible implementation manner of the first aspect, the sending the remote control data to the second device includes:
converting the remote control data into a control instruction matched with the second device according to a preset conversion rule;
and sending the control instruction to the second device.
In this embodiment of the application, the remote control data is converted on the first device side into a control instruction matched with the second device, so that the second device can directly execute the instruction upon receiving it.
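The "preset conversion rule" described above can be as simple as a lookup table. The sketch below does not appear in the patent; the key codes and instruction names are invented for illustration only.

```python
# Hypothetical preset conversion rule: the first device maps raw remote-control
# key codes to control instructions the second device understands.
# All key codes and instruction names below are assumptions for illustration.
REMOTE_TO_INSTRUCTION = {
    0x01: "FOCUS_UP",
    0x02: "FOCUS_DOWN",
    0x03: "FOCUS_LEFT",
    0x04: "FOCUS_RIGHT",
    0x05: "CONFIRM",
    0x06: "BACK",
}

def convert_remote_data(key_code: int) -> str:
    """Convert raw remote-control data into a control instruction.

    Raises ValueError for key codes outside the preset conversion rule,
    so unknown remote events are never forwarded to the second device.
    """
    try:
        return REMOTE_TO_INSTRUCTION[key_code]
    except KeyError:
        raise ValueError(f"no conversion rule for key code {key_code:#x}")
```

With such a table, the second device only ever receives instructions it can execute directly, which is the point of doing the conversion on the first device side.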
According to a fourth possible implementation manner of the first aspect, the receiving second display content includes:
receiving the second display content and the position of the interface focus in the second display content;
superimposing the interface focus at that position on the second display content.
In this embodiment, the interface focus and the second display content may be superimposed on the first device side.
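A minimal sketch of this first-device-side superimposition follows. The frame and rectangle types are assumptions not specified by the patent; the only real logic is clamping the received focus position so the overlay stays inside the frame.

```python
# Illustrative sketch: the second display content arrives as a frame of known
# size, together with the interface focus position; the first device clamps
# the focus rectangle to the frame before drawing it on top.
from dataclasses import dataclass

@dataclass
class Rect:
    x: int
    y: int
    w: int
    h: int

def place_focus(frame_w: int, frame_h: int, focus: Rect) -> Rect:
    """Clamp the received focus rectangle so the overlay stays on-screen."""
    x = max(0, min(focus.x, frame_w - focus.w))
    y = max(0, min(focus.y, frame_h - focus.h))
    return Rect(x, y, focus.w, focus.h)
```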
In a second aspect, an embodiment of the present application provides a control method for cross-device synchronous display, including:
a second display of the second device displays the first display content;
the second device receiving remote control data, wherein the remote control data is directed at a first device having a first display, the second device being coupled to the first device, and the first display synchronously displaying the first display content;
adjusting the first display content according to the remote control data to generate second display content;
and transmitting the second display content.
In this embodiment of the application, the first device may forward the remote control data it receives to the second device, while synchronously displaying the content shown by the second device. After receiving the remote control data, the second device may adjust the displayed first display content accordingly, generate second display content, and send it back to the first device. In this way, a user watching the mirrored content on the first display can adjust the display interface without shifting their gaze to the second device, which greatly improves convenience of use.
According to a first possible implementation form of the second aspect, the remote control data comprises operational data for an interface focus displayed by the first display.
In this embodiment, an interface focus is shown in the display interface, so that the user can see at a glance where the focus currently is and operate accordingly.
According to a second possible implementation manner of the second aspect, the adjusting the first display content according to the remote control data includes:
determining a new position of the interface focus according to the remote control data;
and adjusting the first display content according to the new position to generate second display content.
In this embodiment, the second display content is determined according to the new position of the interface focus, so that the displayed focus automatically meets the preset requirement.
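The patent does not prescribe how the new focus position is computed from the remote control data; a common approach for directional remote keys is to move the focus to the nearest control lying in the pressed direction. The geometry below is an assumed sketch of that idea.

```python
# Hypothetical focus-navigation step: given the centers of the controls on the
# interface and a directional instruction, pick the nearest control lying in
# that direction. If no control lies that way, the focus stays where it is.
def next_focus(current: tuple, controls: list, direction: str) -> tuple:
    cx, cy = current

    def in_direction(p):
        px, py = p
        return {"LEFT": px < cx, "RIGHT": px > cx,
                "UP": py < cy, "DOWN": py > cy}[direction]

    candidates = [p for p in controls if p != current and in_direction(p)]
    if not candidates:
        return current
    # choose the candidate closest to the current focus position
    return min(candidates, key=lambda p: (p[0] - cx) ** 2 + (p[1] - cy) ** 2)
```

Once the new position is known, the second device regenerates the display content with the focus drawn at (or sent alongside) that position, as the subsequent implementations describe.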
According to a third possible implementation manner of the second aspect, the sending the second display content includes:
superimposing the interface focus at the new location of the second display content;
and sending the second display content with the interface focus superimposed on it.
In this embodiment, after superimposing the interface focus, the second device may send the resulting second display content to the first device, which can then display it directly.
According to a fourth possible implementation manner of the second aspect, the sending the second display content includes:
and sending the second display content and the new position of the interface focus.
In this embodiment, the superimposition of the interface focus onto the second display content can be performed by the first device.
According to a fifth possible implementation manner of the second aspect, the interface focus includes a control in the first display content.
In this embodiment of the application, the interface focus is placed on a control of the display interface, so that the focus lands on a relatively important position in the interface, which improves the efficiency with which the user adjusts the display interface.
According to a sixth possible implementation manner of the second aspect, the remote control data includes a control instruction matched with the second device, obtained by converting the original remote control data according to a preset conversion rule.
In this embodiment of the present application, the remote control data includes a control instruction that the first device has already converted according to the preset conversion rule.
According to a seventh possible implementation manner of the second aspect, the adjusting the first display content according to the remote control data includes:
parsing the remote control data into a control instruction matched with the second device according to a preset parsing rule;
and adjusting the first display content using the control instruction.
In this embodiment of the application, the second device can itself convert the original remote control data into a matched control instruction.
According to an eighth possible implementation manner of the second aspect, the parsing of the remote control data into the control instruction matched with the second device according to the preset parsing rule includes:
acquiring identification information of the first device;
and parsing the remote control data into a control instruction matched with the second device according to the preset parsing rule, wherein the preset parsing rule is related to the identification information of the first device.
In this embodiment of the application, when first devices are diverse and their remote control data follow different design rules, the second device can determine the preset parsing rule matched with a given first device according to that device's identification information, and parse the remote control data using that rule.
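Selecting a parsing rule by identification information can be sketched as a registry of per-device rules. Nothing below comes from the patent; the model names and code maps are invented to illustrate the lookup step.

```python
# Hypothetical registry of preset parsing rules, keyed by the first device's
# identification information. Different vendors may encode the same logical
# key with different raw codes; the second device picks the matching rule.
PARSE_RULES = {
    "vendor-a-tv": {0x10: "FOCUS_UP", 0x11: "FOCUS_DOWN"},
    "vendor-b-tv": {0x20: "FOCUS_UP", 0x21: "FOCUS_DOWN"},
}

def parse_remote_data(device_id: str, raw_code: int) -> str:
    """Select the preset parsing rule for this first device, then parse."""
    rule = PARSE_RULES.get(device_id)
    if rule is None:
        raise ValueError(f"no parsing rule registered for {device_id!r}")
    return rule[raw_code]
```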
In a third aspect, an embodiment of the present application provides a first device, including a first display, a first network module, a first data receiving module, and a first data sending module, wherein,
the first network module for coupling with a second device having a second display;
the first display is used for synchronously displaying first display content, wherein the first display content is displayed by the second display, and the first display is further used for displaying second display content;
the first data receiving module is configured to receive remote control data and receive the second display content, where the second display content is obtained by adjusting the first display content by the second device according to the remote control data or control data matched with the remote control data;
and the first data sending module is used for sending the remote control data or the control data to the second device.
Optionally, in an embodiment of the present application, the first display is further configured to display an interface focus, and the remote control data includes operation data of the interface focus.
Optionally, in an embodiment of the present application, the interface focus is disposed on a control of the display interface.
Optionally, in an embodiment of the present application, the first device further includes a first data processing module,
the first data processing module is used for converting the remote control data into a control instruction matched with the second device according to a preset conversion rule;
correspondingly, the first data sending module is configured to send the control instruction to the second device.
Optionally, in an embodiment of the application, the first data receiving module is specifically configured to:
receiving the second display content and the position of the interface focus in the second display content;
superimposing the interface focus at that position on the second display content.
In a fourth aspect, an embodiment of the present application provides a second device, where the second device includes a second display, a second network module, a second data receiving module, a second data processing module, and a second data sending module, wherein,
the second display is used for displaying first display content;
the second network module is used for being coupled with a first device with a first display, and the first display synchronously displays the first display content;
the second data receiving module is used for receiving remote control data directed at the first device, or control data matched with the remote control data;
the second data processing module is used for adjusting the first display content according to the remote control data or the control data and generating second display content;
and the second data sending module is used for sending the second display content.
Optionally, in an embodiment of the present application, the remote control data includes operation data for an interface focus displayed by the first display.
Optionally, in an embodiment of the application, the second data processing module is specifically configured to:
determining a new position of the interface focus according to the remote control data or the control data;
and adjusting the first display content according to the new position to generate second display content.
Optionally, in an embodiment of the application, the second data sending module is specifically configured to:
superimposing the interface focus at the new location of the second display content;
and sending the second display content with the interface focus superimposed on it.
Optionally, in an embodiment of the application, the second data sending module is specifically configured to:
and sending the second display content and the new position of the interface focus.
Optionally, in an embodiment of the present application, the interface focus includes a control in the first display content.
Optionally, in an embodiment of the present application, the control data includes a control instruction executable by the second device, where the control instruction is obtained by converting the remote control data according to a preset conversion rule.
Optionally, in an embodiment of the application, the second data processing module is specifically configured to:
parsing the remote control data into a control instruction matched with the second device according to a preset parsing rule;
and adjusting the first display content using the control instruction.
Optionally, in an embodiment of the application, the second data processing module is specifically configured to:
acquiring identification information of the first device;
determining a preset parsing rule matched with the first device according to the identification information;
and parsing the remote control data into a control instruction matched with the second device according to the preset parsing rule.
In a fifth aspect, an embodiment of the present application provides a terminal device, including: a processor; a memory for storing processor-executable instructions; the processor is configured to execute the instructions to implement the control method for cross-device synchronous display of the first/second aspect or one or more of the multiple possible implementation manners of the first/second aspect.
In a sixth aspect, an embodiment of the present application provides a control system for cross-device synchronous display, including a first device in the third aspect or multiple possible implementation manners of the third aspect, and a second device in the fourth aspect or multiple possible implementation manners of the fourth aspect.
In a seventh aspect, an embodiment of the present application provides a computer program product, which includes computer readable code or a non-transitory computer readable storage medium carrying computer readable code, and when the computer readable code runs in an electronic device, a processor in the electronic device executes a method for controlling cross-device synchronous display of the first aspect or one or more of the multiple possible implementations of the first aspect.
In an eighth aspect, embodiments of the present application provide a chip including at least one processor, where the processor is configured to execute a computer program or computer instructions stored in a memory to perform a method that may be implemented by any one of the above aspects.
Optionally, the chip may further comprise a memory for storing a computer program or computer instructions.
Optionally, the chip may further include a communication interface for communicating with other modules outside the chip.
Alternatively, one or more such chips may constitute a chip system.
These and other aspects of the present application will be more readily apparent from the following description of the embodiment(s).
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate exemplary embodiments, features, and aspects of the application and, together with the description, serve to explain the principles of the application.
Fig. 1 shows a schematic structural diagram of a control system for cross-device synchronous display according to an embodiment of the present application.
Fig. 2 shows a schematic block diagram of the first device 101 and the remote control device 103.
FIG. 3 shows a workflow diagram of the control system for cross-device synchronized display.
FIG. 4 is a method flow diagram illustrating an embodiment of a method for controlling a cross-device synchronous display.
FIG. 5 illustrates a schematic diagram of a user interface 500 having a plurality of controls.
Fig. 6 illustrates a schematic diagram of a user interface 600.
Fig. 7 is a method flow diagram illustrating another embodiment of a control method for cross-device synchronous display.
FIG. 8 illustrates a method flow diagram of one embodiment of a method of adjusting the first display interface.
Fig. 9 shows a schematic diagram of a user interface 900.
Fig. 10 illustrates a schematic diagram of a user interface 1000.
Fig. 11 shows a schematic block diagram of an embodiment of the first device 101 and the second device 105.
Fig. 12 shows a schematic structural diagram of a terminal device according to an embodiment of the present application.
Fig. 13 shows a block diagram of a software configuration of a terminal device according to an embodiment of the present application.
Detailed Description
Various exemplary embodiments, features and aspects of the present application will be described in detail below with reference to the accompanying drawings. In the drawings, like reference numbers indicate functionally identical or similar elements. While the various aspects of the embodiments are presented in drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
The word "exemplary" is used exclusively herein to mean "serving as an example, embodiment, or illustration. Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
Furthermore, in the following detailed description, numerous specific details are set forth in order to provide a better understanding of the present application. It will be understood by those skilled in the art that the present application may be practiced without some of these specific details. In some instances, methods, means, elements and circuits that are well known to those skilled in the art have not been described in detail so as not to obscure the present application.
In the embodiments of the present application, "/" may indicate a relationship in which the objects associated before and after are "or", for example, a/B may indicate a or B; "and/or" may be used to describe that there are three relationships associated with an object, e.g., a and/or B, which may represent: a exists alone, A and B exist simultaneously, and B exists alone, wherein A and B can be singular or plural. For convenience of describing the technical solutions of the embodiments of the present application, in the embodiments of the present application, words such as "first", "second", and the like may be used to distinguish technical features having the same or similar functions. The terms "first", "second", and the like do not necessarily limit the number and execution order, and the terms "first", "second", and the like do not necessarily differ. In the embodiments of the present application, the words "exemplary" or "such as" are used to indicate examples, illustrations or illustrations, and any embodiment or design described as "exemplary" or "e.g.," should not be construed as preferred or advantageous over other embodiments or designs. The use of the terms "exemplary" or "such as" are intended to present relevant concepts in a concrete fashion for ease of understanding.
In the embodiments of the present application, markers such as "first", "second", "third", "A", "B", "C", and "D" distinguish technical features of the same kind; they imply no sequential or size order.
To facilitate understanding of the embodiments of the present application, the structure of one control system for cross-device synchronous display provided by the embodiments is described below. Referring to fig. 1, fig. 1 is a schematic structural diagram of a cross-device synchronous display control system 100 according to an embodiment of the present application. The system includes a first device 101, a remote control device 103 of the first device 101, and a second device 105, where the first device 101 has a first display 107 and the second device 105 has a second display 109. In some examples, the first device 101 may be a device with a display function such as a smart display device, a smart television, or a projection device, and the remote control device 103 may be the remote controller configured with it, such as a television remote controller, a display device remote controller, or a projector remote controller. The second device 105 may be a smart terminal with a display, such as a smartphone, a tablet, or a personal digital assistant (PDA). The size of the first display 107 is typically larger than that of the second display 109.
A data transmission channel is established between the first device 101 and the second device 105. For example, both devices may access the same wireless local area network (WLAN), such as a Wireless Fidelity (Wi-Fi) network, and transmit data over it. Of course, the data transmission channel is not limited to a wireless LAN; it may also be a short-range wireless channel such as Bluetooth or infrared, or even a wired connection, which is not limited herein. The second device 105 may transmit the content shown on the second display 109 to the first device 101 over this channel, and the first device 101 may show that content on the first display 107, producing the effect of synchronous display on the first display 107 and the second display 109. In a specific example, the first device 101 and the second device 105 access the same Wi-Fi network, and, based on a mirroring protocol, the second device 105 sends screenshot images to the first device 101 at a rate of several frames per second, which the first device 101 displays continuously. The mirroring protocol may be, for example, AirPlay mirroring, Lelink mirroring, or any other protocol capable of synchronizing the display interface of the second device 105 to the first device 101, which is not limited herein. In a specific scenario, a smartphone and a smart display device join the same Wi-Fi network; the smartphone transmits the displayed content to the smart display device over the network, and the device shows it on its screen, achieving synchronous display between the two.
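The mirroring loop just described can be sketched abstractly. The code below is not from the patent or any particular mirroring protocol: the channel is modeled as an in-memory queue standing in for the Wi-Fi transport, and the capture/display callables are placeholders.

```python
# Illustrative mirroring loop: the second device captures its screen several
# times per second and pushes each frame over the data channel; the first
# device displays every frame it receives. The channel is a simple in-memory
# queue standing in for a real mirroring protocol over Wi-Fi.
from collections import deque

class MirrorChannel:
    def __init__(self):
        self.frames = deque()

    def send(self, frame):          # second device side
        self.frames.append(frame)

    def receive(self):              # first device side
        return self.frames.popleft() if self.frames else None

def mirror_once(capture_screen, channel, display):
    """One iteration of the mirroring loop (capture, send, display)."""
    channel.send(capture_screen())
    frame = channel.receive()
    if frame is not None:
        display(frame)
```

A real implementation would run `mirror_once` on a timer matching the frame rate and encode frames before transmission; this sketch only shows the data flow.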
Of course, in other embodiments, the data transmission channel between the first device 101 and the second device 105 may also be established through a screen projection device. In particular, for a second device 105 without a mirroring protocol, the screen projection device can act as a function-extension module that gives the second device 105 the ability to display synchronously across devices. Fig. 2 shows an exemplary structure in which the data transmission channel is established through the screen projection device 201. As shown in fig. 2, the screen projection device 201 may be connected to the first device 101 either by wire, for example through a High Definition Multimedia Interface (HDMI) port of the first device 101, or wirelessly, for example over Bluetooth, which is not limited herein. The screen projection device 201 and the second device 105 may both connect to the routing device 203, establishing a data transmission channel between the screen projection device 201 and the second device 105, and hence between the first device 101 and the second device 105.
The first device 101 and the remote control device 103 are matched with each other, and a user can send a control instruction to the first device 101 by operating the remote control device 103. In a specific example, as shown in fig. 3, the remote control device 103 may include a keyboard 301, an encoding module 303, a modulation module 305, a signal transmitting module 307, and the like, and the first device 101 may include a signal receiving module 309 (e.g., a photoelectric conversion and amplification circuit), a demodulation module 311, a decoding module 313, an execution module 315, and the like. Specifically, after the user presses a key on the keyboard 301, the encoding module 303 encodes the selected key, the modulation module 305 converts the generated code into a modulated wave, and the signal transmitting module 307 transmits it. On the first device 101 side, the signal receiving module 309 receives the modulated wave, the demodulation module 311 and the decoding module 313 demodulate and decode it respectively, and a pulse signal with a certain frequency is generated. Pulse signals of different frequencies correspond to different control commands; after identifying the control command corresponding to the pulse signal, the first device 101 executes it via the execution module 315. It should be noted that in the embodiments of the present application, the signal transmitting module 307 may transmit the modulated wave over short-range radio such as infrared or Bluetooth, and the present application is not limited herein.
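The last step of that receive path, mapping a decoded pulse frequency to a control command, can be sketched as follows. The specific frequencies and the tolerance value are assumptions; the patent only states that different frequencies correspond to different commands.

```python
# Hypothetical frequency-to-command table for the decoded pulse signal.
# A measured frequency is matched to the nearest known entry, within a
# tolerance, before the execution module runs the command.
PULSE_TO_COMMAND = {  # frequency in Hz -> command (values are assumptions)
    1000: "VOLUME_UP",
    1200: "VOLUME_DOWN",
    1400: "CHANNEL_UP",
}

def command_for_pulse(freq_hz: float, tolerance_hz: float = 50.0) -> str:
    """Match a measured pulse frequency to the nearest known command."""
    nearest = min(PULSE_TO_COMMAND, key=lambda f: abs(f - freq_hz))
    if abs(nearest - freq_hz) > tolerance_hz:
        raise ValueError(f"unrecognized pulse frequency {freq_hz} Hz")
    return PULSE_TO_COMMAND[nearest]
```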
The workflow of the control system for cross-device synchronous display is described below with reference to the interaction flow chart shown in fig. 4.
As shown in fig. 4, in the embodiment of the present application, before step 1, a data transmission channel, such as a wired channel, a wireless local area network, or a short-range wireless communication channel (e.g., Bluetooth, infrared), may be established between the first device 101 and the second device 105. In step 1, under the operation of the user, the remote control device 103 may transmit remote control data to the first device 101, where the remote control data may include data such as the above-mentioned modulated wave. In step 2, the first device 101 may transmit the remote control data, or control data matching the remote control data, to the second device 105. Here, the control data may include control instructions executable by the second device 105, obtained by converting the remote control data according to a preset conversion rule. Of course, this conversion process may be executed on the first device 101 or on the second device 105, and the present application is not limited herein. In step 3, the second device 105 may adjust the first display content it currently displays according to the control instruction obtained by parsing the remote control data, generating the second display content. The first display content includes a first display interface, and the second display content may include a second display interface. In step 4, the second device 105 may send the second display interface to the first device 101 through the data transmission channel. In step 5, the first device 101 may display the second display interface, which may include the first device 101 presenting the same data content as the second display interface using the first display 107.
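The five steps above can be simulated end to end with two in-memory objects standing in for the devices. All key names, instruction strings, and interface labels below are hypothetical; real devices would exchange these messages over the established data transmission channel.

```python
# Minimal simulation of the Fig. 4 flow; names are hypothetical.
class SecondDevice:
    def __init__(self):
        self.interface = "first_display_interface"

    def handle(self, instruction: str) -> str:
        # Step 3: adjust the current display content per the instruction.
        if instruction == "user_interface_move_down":
            self.interface = "second_display_interface"
        return self.interface  # Step 4: send the result back.

class FirstDevice:
    def __init__(self, peer: SecondDevice):
        self.peer = peer
        self.shown = None

    def on_remote_data(self, key: str) -> str:
        # Step 1 delivered `key`; step 2 converts it and forwards it.
        instruction = {"down_key": "user_interface_move_down"}[key]
        self.shown = self.peer.handle(instruction)
        return self.shown  # Step 5: display the received interface.

tv = FirstDevice(SecondDevice())
print(tv.on_remote_data("down_key"))  # second_display_interface
```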
For example, in a case that the size of the first display 107 of the first device 101 does not match the size of the second display 109 of the second device 105, the first device 101 may perform size adaptation on the second display interface in the process of displaying it, so that the adapted second display interface matches the size of the first display 107.
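One plausible size-adaptation rule is a uniform scale that fits the second display interface inside the first display while preserving its aspect ratio (letterboxing the remainder). This is a sketch of one such rule, not necessarily the adaptation the embodiment uses.

```python
def adapt_size(src_w: int, src_h: int, dst_w: int, dst_h: int) -> tuple[int, int]:
    """Scale a (src_w x src_h) interface so it fits a (dst_w x dst_h)
    display without distortion, letterboxing the remainder."""
    scale = min(dst_w / src_w, dst_h / src_h)
    return round(src_w * scale), round(src_h * scale)

# A 1080x2340 portrait phone interface shown on a 3840x2160 display:
print(adapt_size(1080, 2340, 3840, 2160))  # (997, 2160)
```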
With the control method for cross-device synchronous display provided by the embodiment of the application, the second device 105 can be controlled using the remote control device 103 of the first device 101, based on the data transmission channel between the first device 101 and the second device 105. In this way, while the first device 101 synchronously displays the display interface of the second device 105, the user only needs to operate the remote control device 103 of the first device 101, guided by the interface displayed on the first device 101, to adjust that interface. From the perspective of user experience, the user's gaze does not need to switch back and forth between the first device 101 and the second device 105 to complete the adjustment of the display interface.
The following describes the control method for cross-device synchronous display of the present application in detail with reference to the accompanying drawings. Although the present application provides method steps as shown in the following embodiments or figures, the method may include more or fewer steps based on conventional or non-inventive effort. For steps with no logically necessary causal relationship, the order of execution is not limited to that provided by the embodiments of the present application. In an actual control process of cross-device synchronous display, or when executed by a device, the steps may be performed sequentially or in parallel (for example, in a parallel-processor or multi-threaded processing environment) according to the methods shown in the embodiments or figures.
The following describes an embodiment of the control method for cross-device synchronous display shown in fig. 4 in detail.
In the embodiment of the present application, as shown in fig. 4, the first device 101 may transmit remote control data received from the remote control device 103 to the second device 105. In practical applications, there is a specific encoding and decoding protocol between the first device 101 and the remote control device 103, and for the remote control data sent by the remote control device 103, generally, only the first device 101 can parse out its corresponding meaning. Therefore, the remote control data needs to be converted into control data matching the second device 105 so as to be recognized by the second device 105. Based on this, in one embodiment of the present application, the first device 101 may further convert the remote control data into control data matching the second device 105, and then transmit the control data to the second device 105.
In this embodiment, the control data includes a control instruction executable by the second device 105, obtained by converting the remote control data according to a preset conversion rule. The preset conversion rule may include a correspondence between the remote control data and the control instructions recognizable by the second device 105, that is, it converts the remote control data recognizable by the first device 101 into control instructions recognizable by the second device 105. Table 1 shows an example of the preset conversion rule. As shown in table 1, the first column lists remote control data of the first device 101, such as an "up key", "down key", "back key", and the like, and a corresponding control instruction can be set for each item of remote control data. For example, after receiving the remote control data "down key", the first device 101 may convert it into the control instruction "user interface moves down" for the second device 105 according to the preset conversion rule shown in table 1. Of course, the preset conversion rule is not limited to the example shown in table 1, and any conversion relationship from remote control data to control instructions may be set, which is not limited herein.
Table 1: Example of a preset conversion rule

| Remote control data | Control instruction matched to the second device 105 |
| --- | --- |
| Up key | User interface moves up |
| Down key | User interface moves down |
| Left key | User interface moves left |
| Right key | User interface moves right |
| Back key | Exit the current user interface |
| Home key | Click the home key of the second device 105 |
| Volume down key | Zoom out the user interface |
| Volume up key | Zoom in the user interface |
Of course, in other embodiments, the conversion of the remote control data may also be accomplished by a third-party device coupled to the first device 101. For example, the third-party device may include a cloud server or a server cluster, or an external device of the first device 101, such as an external storage or processing device like a USB flash drive.
In the above embodiments, the first device 101 converts the remote control data into the control data matching the second device 105, and the second device 105 can directly execute the control data after receiving the control data, reducing the resource consumption on the second device 105 side.
However, in practical applications, many first devices 101, such as display devices of older design, lack such processing capability. In that case, the first device 101 may forward the original remote control data to the second device 105. After receiving the remote control data, the second device 105 may parse it into a control instruction matching the second device 105 according to a preset parsing rule. Specifically, the preset parsing rule may include the preset conversion rule shown in table 1; moreover, the remote control data may include not only key data of the kind shown in table 1 as parsed by the first device 101, but also the original modulated wave and the like, which is not limited herein.
In practical applications, display devices of different brands and models encode their remote control data differently. For the infrared remote control mode, different display devices may use modulated waves of different frequencies for the same control instruction. Based on this, in an embodiment of the present application, the second device 105 parsing the remote control data into a control instruction matched with the second device according to a preset parsing rule includes:
S101: Acquire the identification information of the first device 101;
S103: Parse the remote control data into a control instruction matched with the second device 105 according to a preset parsing rule, where the preset parsing rule is related to the identification information of the first device.
In the embodiment of the application, in the process of converting the remote control data, the second device 105 may first obtain the identification information of the first device 101. Specifically, in the process of establishing the data channel between the first device 101 and the second device 105, the second device 105 may obtain the identification information of the first device 101, where the identification information may include information capable of identifying the first device, such as its brand and model. Table 2 shows an example of the preset parsing rule. As shown in table 2, for different first devices 101, the parsing rules between remote control data and control instructions differ. For example, for the same control instruction "user interface moves up", the remote control data corresponding to the first device 101 identified as 0001 is the "up key", while the remote control data corresponding to the first device 101 identified as 0002 is the "number key 2", which is not limited herein.
Table 2: Example of a preset parsing rule

| First device identification | Remote control data | Control instruction matched to the second device 105 |
| --- | --- | --- |
| 0001 | Up key | User interface moves up |
| 0002 | Number key 2 | User interface moves up |
In this embodiment, after receiving or parsing out the executable control instruction corresponding to the remote control data, the second device 105 may adjust the currently displayed first display interface according to the control instruction, generate a second display interface, and send the second display interface to the first device 101. After receiving the second display interface, the first device 101 may display it using the first display 107, thereby implementing synchronous display with the second device 105. Of course, in a case that the sizes of the first display 107 of the first device 101 and the second display 109 of the second device 105 do not match, the first device 101 may perform size adaptation on the second display interface in the process of displaying it, so that the adapted second display interface matches the size of the first display 107. From the user's field of view, the first display 107 and the second display 109 each present a picture adapted to its own display size.
In one embodiment of the present application, the first display 107 may not only display the display interface synchronized by the second device 105, but also display the interface focus in the display interface. Wherein the interface focus comprises a focused position in the display interface, and the interface focus in the display interface functions like a mouse cursor. The interface focus can be arranged at any position of the display interface, including a control, a picture, a character and the like. In the embodiment of the present application, the interface focus may be displayed in any manner that can be highlighted in the display interface, such as by using a cursor, a border frame, a highlight, a masking layer, and the like, for example, the interface focus is displayed in a text input box by using a flashing cursor, or a border frame is added to a picture where the interface focus is located, or a control where the interface focus is located is highlighted. Displaying the interface focus by using the first display 107 can enable a user to visually recognize the position of the focus in the currently displayed interface, so that the user can operate according to the interface focus. Based on this, the remote control data may include an operation on the interface focus. The operation includes, for example, moving up, moving down, moving left, and moving right the interface focus, and may also include an operation on a page element where the interface focus is located, for example, opening a link, a picture, and the like where the interface focus is located.
In one embodiment of the application, the interface focus may be set on a control of the display interface. The control may include a visual image disposed in the display interface, such as a button, a file editing box, and the like. The control may have the function of executing a function or causing code to run and complete a response through an event. FIG. 5 illustrates a user interface 500 having a plurality of controls, as shown in FIG. 5, with the interface focus in user interface 500 on control 2. In the embodiment of the application, the interface focus is arranged on the control of the display interface, so that the interface focus can be focused at a relatively important position in the display interface, and the adjustment efficiency of a user on the display interface is improved.
In an embodiment of the present application, in a case where the remote control data includes operation data for an interface focus displayed by the first display 107, the second device 105 may adjust the first display interface according to the remote control data and generate the second display interface after receiving the remote control data. Specifically, in one embodiment, the second device 105 adjusts the first display interface according to the remote control data, including:
S201: Determine a new position of the interface focus according to the remote control data;
S203: Adjust the first display interface according to the new position to generate a second display interface.
In the embodiment of the application, the new position of the interface focus can be determined according to the remote control data. The new position may include, for example, an identifier of a control, or the coordinates of a cursor in the display interface. Where the remote control data include operation data for the interface focus, the parsed control instructions are all operation instructions for the interface focus. For example, the remote control data "up key" indicates that the interface focus moves up; where the interface focus is set on a control, the remote control data "up key" corresponds to a control instruction that switches the interface focus to the next control. After the new position is determined, the first display interface may be adjusted according to it to generate a second display interface. One specific adjustment is to adjust the first display interface so that the new position of the interface focus lies in the upper half of the generated second display interface.
An exemplary scenario is described below with reference to the application scenario diagram shown in fig. 6, the method flow chart shown in fig. 7, and the user interface diagrams shown in fig. 8 and 9. As shown in fig. 6, the first device 101 and the second device 105 are simultaneously displaying the user interface 500. The user's gaze may rest on the first display 107 of the first device 101; at this point the user wants to move the interface focus down from control 2 to control 4, and so the user may operate the remote control device 103, for example by clicking the down key of the remote control device 103. From the perspective of the method executed by the second device 105, as shown in fig. 7, the second device 105 receives the remote control data "down key" from the first device 101 in step 701. In step 702, the second device 105 may convert the remote control data "down key" into the control instruction "interface focus moves down" according to the preset parsing rule. Then, in step 703, the second device 105 may determine the new position of the interface focus according to the control instruction. For example, for the user interface 500 shown in fig. 5, the new position of the interface focus is determined to be control 4. In step 704, it may be determined whether the new position of the interface focus is in the upper portion of the first display interface. If so, in step 705, the generated second display interface may remain the same as the first display interface. For example, in the user interface 500 shown in fig. 5, the new position of the interface focus, i.e., the position of control 4, remains in the upper half of the user interface 500, and thus the generated user interface remains the user interface 500.
In case the determination result is no, in step 707, the second device 105 may move the first display interface upward so that the new position of the interface focus is located at the upper half of the generated second display interface. As shown in fig. 8, in user interface 800, the interface focus is located on control 6, and after moving the interface focus down onto control 7, it is determined that control 7 is located in the lower half of user interface 800. Therefore, the user interface 800 needs to be moved up to show more content to generate the user interface 900 shown in fig. 9. As shown in FIG. 9, the new position of the interface focus, control 7, is in the top half of user interface 900.
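The repositioning rule of steps 704 through 707 can be sketched as follows: if the focus's new on-screen position falls in the lower half of the viewport, scroll the interface up just enough to bring the focus to the midline; otherwise leave it alone. Bringing the focus exactly to the midline is an assumption made for the sketch; the text only requires it to end up in the upper half.

```python
def adjust_scroll(focus_y: float, scroll: float, viewport_h: float) -> float:
    """Return the scroll offset after a focus move: unchanged when the
    focus stays in the upper half of the viewport, otherwise increased
    so the focus lands on the viewport midline."""
    visible_y = focus_y - scroll
    if visible_y > viewport_h / 2:
        scroll = focus_y - viewport_h / 2
    return scroll

print(adjust_scroll(900, 0, 1000))  # 400.0 -- interface moves up 400 px
print(adjust_scroll(300, 0, 1000))  # 0 -- focus already in the upper half
```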
Through the adjustment of the first display interface in each of the above embodiments, a comfortable page display is achieved automatically, keeping the focus of the page at a prominent position on the display.
In other embodiments, in a case that the user chooses to execute the function of the control at the focus, the display may also jump from the first display interface to the second display interface. For example, in the user interface 800, after the user selects a new tab, the second device 105 may jump from the current page to a new user interface in response to the corresponding control instruction. In addition, optionally, the interface focus may be placed on the first control in the new user interface, or on a control in the middle of the new user interface, which is not limited herein.
In one embodiment of the present application, the second display 109 of the second device 105 may or may not display the interface focus. In a case that the second display 109 does not display the interface focus, the second device 105, after determining the new position of the interface focus in the second display interface, may superimpose the interface focus at that position and then send the result to the first device 101; after receiving the second display interface with the interface focus superimposed, the first device 101 may display it directly. Specifically, in the embodiment of the present application, the interface focus may be superimposed on the second display interface according to a preset superimposition rule. In a case that the second display interface includes images, the superimposition rule may include, for example, highlighting the corresponding image location, adding a flashing cursor, or adding a bounding box over the corresponding control image. Of course, the superimposition rule may include any manner capable of making the interface focus stand out in the second display interface, and the application is not limited thereto.
Of course, in another embodiment of the present application, the interface focus may also be superimposed in the second display interface by the first device 101. Based on this, the second device 105 may send information of the second display interface and the new position of the interface focus to the first device 101 after determining the new position of the second display interface and the interface focus in the second display interface.
After receiving the second display interface and the new position information of the interface focus, the first device 101 may superimpose the interface focus at the new position in the second display interface. Since the first display 107 and the second display 109 may not match in size, the first device 101, after receiving the new position information of the interface focus, may adapt it to the corresponding position in the first display 107.
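One way to adapt the focus position between mismatched displays is proportional coordinate scaling, sketched below. This ignores letterboxing offsets and is an assumption made for illustration, not an adaptation the embodiment mandates.

```python
def adapt_focus_position(pos, src_size, dst_size):
    """Map an (x, y) focus position from the second display's coordinate
    space to the first display's space by proportional scaling."""
    (x, y), (sw, sh), (dw, dh) = pos, src_size, dst_size
    return round(x * dw / sw), round(y * dh / sh)

# The center of a 1080x2340 display maps to the center of a 2160x4680 one:
print(adapt_focus_position((540, 1170), (1080, 2340), (2160, 4680)))  # (1080, 2340)
```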
In one example, for the user interface 500 shown in fig. 5, the user sends the remote control data "down key" via the remote control device 103, attempting to move the interface focus down from control 2 to control 4. Based on this, the second device 105, after receiving the remote control data or the control data corresponding to it, adjusts the user interface 500 into the user interface 1000 shown in fig. 10. According to the user interface adjustment rules described above, the position of control 4 remains in the top half of the user interface, and therefore the user interface 1000 is the same as the user interface 500. The second device 105 also determines the position of the interface focus in the user interface 1000, i.e., the position of control 4, and sends the position information of control 4 to the first device 101. After receiving the user interface 1000 and the position of the interface focus in it, that is, the position of control 4, the first device 101 may superimpose the interface focus at the position of control 4 in the user interface 1000, producing the effect shown in fig. 10.
The application also provides a control method for cross-device synchronous display from the perspective of the first device 101, and the control method can be applied to any first device 101 with a display function, such as an intelligent display device, an intelligent television, a projection device and the like. The first device 101 is coupled to a second device 105 having a second display 109, and a first display 107 of the first device synchronously displays a first display content of the second display 109. The first device may receive remote control data from the corresponding remote control device 103. Based on the coupling relationship with the second device 105, the first device 101 transmits the remote control data or the control data matching the remote control data to the second device 105. The first device 101 may further receive second display content, where the second display content includes display content adjusted according to the remote control data.
Optionally, in an embodiment of the present application, the first display is further configured to display an interface focus, and the remote control data includes operation data of the interface focus.
Optionally, in an embodiment of the present application, the interface focus is disposed on a control of the display interface.
Optionally, in an embodiment of the application, the control data includes a control instruction executable by the second device and obtained by converting the remote control data according to a preset conversion rule.
Optionally, in an embodiment of the present application, the receiving the second display interface includes:
receiving a second display interface and a position of the interface focus in the second display interface;
superimposing the interface focus at the location of the second display interface.
An embodiment of the control method for cross-device synchronous display is described below from the perspective of the second device 105. The control method can be applied to any second device 105 with a data processing function, including devices such as a smart phone or a tablet computer. The second device 105, having a second display 109, may receive remote control data for a first device 101 having a first display 107, or control data matching the remote control data, the second device 105 being coupled to the first device 101 and the first display 107 synchronously displaying a first display interface of the second display 109. The second device 105 may adjust the first display interface according to the remote control data or the control data, generate a second display interface, and finally send the second display interface to the first device 101.
Optionally, in an embodiment of the present application, the remote control data includes operation data for an interface focus displayed by the first display.
Optionally, in an embodiment of the present application, the adjusting the first display interface according to the remote control data includes:
determining a new position of the interface focus according to the remote control data;
and adjusting the first display interface according to the new position to generate a second display interface.
Optionally, in an embodiment of the application, the sending the second display interface includes:
superimposing the interface focus at the new location of the second display interface;
and sending a second display interface after the interface focus is superposed.
Optionally, in an embodiment of the present application, the sending the second display interface includes:
and sending the second display interface and the new position of the interface focus.
Optionally, in an embodiment of the present application, the interface focus includes a control in the first display interface.
Optionally, in an embodiment of the application, the control data includes a control instruction executable by the second device and obtained by converting the remote control data according to a preset conversion rule.
Optionally, in an embodiment of the present application, the adjusting the first display interface according to the remote control data includes:
analyzing the remote control data into a control instruction matched with the second equipment according to a preset analysis rule;
and adjusting the first display interface by using the control instruction.
Optionally, in an embodiment of the application, the converting the remote control data into the control instruction matched with the second device according to a preset conversion rule includes:
acquiring identification information of the first device;
determining a preset analysis rule matched with the first equipment according to the identification information;
and analyzing the remote control data into a control instruction matched with the second equipment according to the preset analysis rule.
In another aspect of the present application, an embodiment of the first device 101 is further provided, fig. 11 shows a schematic block diagram of the embodiment of the first device 101, as shown in fig. 11, the first device 101 includes a first display 107, a first network module 1301, a first data receiving module 1303, and a first data sending module 1305, where,
the first network module 1301, configured to couple with a second device having a second display;
the first display 107 is configured to synchronously display a first display interface of the second display;
the first data receiving module 1303 is configured to receive remote control data and receive a second display interface, where the second display interface includes a display interface obtained by adjusting the first display interface according to the remote control data or control data matched with the remote control data;
the first data sending module 1305 is configured to send the remote control data or the control data to the second device.
Optionally, in an embodiment of the present application, the first display 107 is further configured to display an interface focus, and the remote control data includes operation data of the interface focus.
Optionally, in an embodiment of the present application, the interface focus is disposed on a control of the display interface.
Optionally, in an embodiment of the present application, the first device 101 further comprises a first data processing module 1307,
the first data processing module 1307 is configured to convert the remote control data into a control instruction matched with the second device according to a preset conversion rule;
correspondingly, the first data sending module 1305 is configured to send the control instruction to the second device.
Optionally, in an embodiment of the present application, the first data receiving module 1303 is specifically configured to:
receiving a second display interface and a position of the interface focus in the second display interface;
superimposing the interface focus at the location of the second display interface.
Another aspect of the present application further provides an embodiment of the second device 105, fig. 11 shows a schematic block structure diagram of the embodiment of the second device 105, as shown in fig. 11, the second device 105 includes a second display 109, a second network module 1301 ', a second data receiving module 1303', a second data processing module 1307 ', and a second data sending module 1305', wherein,
the second network module 1301' is used for coupling with a first device 101 having a first display 107, and the first display synchronously displays a first display interface of the second display;
the second data receiving module 1303' is configured to receive remote control data for the first device 101 or control data matched with the remote control data;
the second data processing module 1307' is configured to adjust the first display interface according to the remote control data or the control data, and generate a second display interface;
the second data sending module 1305' is configured to send the second display interface.
Optionally, in an embodiment of the present application, the remote control data includes operation data for an interface focus displayed by the first display.
Optionally, in an embodiment of the present application, the second data processing module 1307' is specifically configured to:
determining a new position of the interface focus according to the remote control data or the control data;
and adjusting the first display interface according to the new position to generate a second display interface.
Optionally, in an embodiment of the present application, the second data sending module 1305' is specifically configured to:
superimposing the interface focus at the new location of the second display interface;
and sending a second display interface after the interface focus is superposed.
Optionally, in an embodiment of the present application, the second data sending module 1305' is specifically configured to:
and sending the second display interface and the new position of the interface focus.
Optionally, in an embodiment of the present application, the interface focus includes a control in the first display interface.
Optionally, in an embodiment of the present application, the control data includes a control instruction executable by the second device 105, and the control instruction is obtained by converting the remote control data according to a preset conversion rule.
Optionally, in an embodiment of the present application, the second data processing module 1307' is specifically configured to:
analyzing the remote control data into a control instruction matched with the second equipment according to a preset analysis rule;
and adjusting the first display interface by using the control instruction.
Optionally, in an embodiment of the present application, the second data processing module 1307' is specifically configured to:
acquiring identification information of the first device;
determining a preset parsing rule matched with the first device according to the identification information;
and parsing the remote control data into a control instruction matched with the second device according to the preset parsing rule.
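The rule-selection step above (look up a preset rule by the first device's identification information, then convert the raw remote control data into an instruction the second device can execute) can be sketched as a table lookup. The device IDs, key codes, and instruction names below are hypothetical; the patent specifies only that a rule is selected by identification information.

```python
# Hypothetical parsing rules keyed by the first device's identification;
# all IDs, key codes, and instruction names are invented for illustration.
PARSING_RULES = {
    "remote-gen1": {0x01: "FOCUS_UP", 0x02: "FOCUS_DOWN", 0x10: "CONFIRM"},
    "remote-gen2": {0xA1: "FOCUS_UP", 0xA2: "FOCUS_DOWN", 0xB0: "CONFIRM"},
}

def parse_remote_data(device_id, raw_key_code):
    """Parse remote control data into a control instruction matched with the second device."""
    rule = PARSING_RULES.get(device_id)  # select the preset rule by identification info
    if rule is None:
        raise ValueError(f"no preset parsing rule for device {device_id!r}")
    instruction = rule.get(raw_key_code)  # convert the raw data into an instruction
    if instruction is None:
        raise ValueError(f"key code {raw_key_code:#x} not defined for {device_id!r}")
    return instruction
```

The point of the lookup is that two different first devices can send different raw key codes yet produce the same instruction on the second device.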
In the embodiments of the present application, a party that initiates data transmission across devices and sends the data may be referred to as a source (source) side, and a party that receives the data may be referred to as a sink (sink) side. It should be noted that a device acting as a source in one pair of relationships may also act as a sink in another pair of relationships, that is, a terminal device may act as a source of another terminal device or as a sink of another terminal device.
The terminal devices involved in the present application (including the device at the source end and the device at the sink end) may refer to devices having a wireless connection function; that is, devices that can connect to other terminal devices through wireless connection modes such as Wi-Fi and Bluetooth, and that may also be able to communicate through wired connections. A terminal device may have a touch screen or a non-touch screen: a touch-screen device is controlled by clicking, sliding, and similar operations performed on the display screen with a finger, a stylus, or the like, while a non-touch-screen device can be connected to input devices such as a mouse, a keyboard, or a touch panel and controlled through those input devices. A device without a screen may be, for example, a Bluetooth speaker.
For example, the terminal device of the present application may be a smart phone, a netbook, a tablet computer, a notebook computer, a wearable electronic device (such as a smart band or a smart watch), a TV, a virtual reality device, a smart speaker, an electronic ink reader, and the like.
Fig. 12 shows a schematic structural diagram of a terminal device according to an embodiment of the present application. Taking the terminal device as a mobile phone as an example, fig. 12 shows a schematic structural diagram of the mobile phone 200.
The mobile phone 200 may include a processor 210, an external memory interface 220, an internal memory 221, a USB interface 230, a charging management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 251, a wireless communication module 252, an audio module 270, a speaker 270A, a receiver 270B, a microphone 270C, an earphone interface 270D, a sensor module 280, keys 290, a motor 291, an indicator 292, a camera 293, a display 294, a SIM card interface 295, and the like. The sensor module 280 may include a gyroscope sensor 280A, an acceleration sensor 280B, a proximity light sensor 280G, a fingerprint sensor 280H, and a touch sensor 280K (of course, the mobile phone 200 may further include other sensors, such as a temperature sensor, a pressure sensor, a distance sensor, a magnetic sensor, an ambient light sensor, an air pressure sensor, a bone conduction sensor, and the like, which are not shown in the figure).
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the mobile phone 200. In other embodiments of the present application, handset 200 may include more or fewer components than shown, or combine certain components, or split certain components, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 210 may include one or more processing units, such as: the processor 210 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a Neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors. Wherein the controller can be the neural center and the command center of the cell phone 200. The controller can generate an operation control signal according to the instruction operation code and the time sequence signal to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 210 for storing instructions and data. In some embodiments, the memory in the processor 210 is a cache. The memory may hold instructions or data that the processor 210 has just used or uses cyclically. If the processor 210 needs the instructions or data again, it can call them directly from this memory, which avoids repeated accesses, reduces the waiting time of the processor 210, and thereby improves system efficiency.
The processor 210 may execute the control method for cross-device synchronous display provided in the embodiments of the present application. When the processor 210 integrates different devices, for example a CPU and a GPU, the CPU and the GPU may cooperate to execute the control method for cross-device synchronous display provided in the embodiments of the present application; for example, part of the algorithms in the method are executed by the CPU and another part by the GPU, so as to achieve higher processing efficiency.
The display screen 294 is used to display images, video, and the like. The display screen 294 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and the like. In some embodiments, the mobile phone 200 may include 1 or N display screens 294, N being a positive integer greater than 1. The display screen 294 may be used to display information input by or provided to the user as well as various graphical user interfaces (GUIs). For example, the display screen 294 may display a photograph, a video, a web page, or a file. As another example, the display screen 294 may display a graphical user interface comprising a status bar, a hidden navigation bar, a time and weather widget, and application icons, such as a browser icon. The status bar includes the name of the operator (e.g., China Mobile), the mobile network (e.g., 4G), the time, and the remaining power. The navigation bar includes a back key icon, a home screen (home) key icon, and a forward key icon. Further, it is understood that in some embodiments, a Bluetooth icon, a Wi-Fi icon, an add-on icon, and the like may also be included in the status bar. It will also be appreciated that in other embodiments, a Dock bar may also be included in the graphical user interface, and the Dock bar may include commonly used application icons. When the processor 210 detects a touch event of a user's finger (or a stylus, etc.) on a certain application icon, in response to the touch event, the user interface of the application corresponding to that icon is opened and displayed on the display screen 294.
In the embodiment of the present application, the display screen 294 may be an integrated flexible display screen, or a spliced display screen formed by two rigid screens and a flexible screen located between the two rigid screens may be adopted.
After the processor 210 runs the control method for cross-device synchronous display provided in the embodiment of the present application, the terminal device may establish a connection with another terminal device through the antenna 1, the antenna 2, or the USB interface, and transmit data according to the control method for cross-device synchronous display provided in the embodiment of the present application and control the display screen 294 to display a corresponding graphical user interface.
The camera 293 (a front camera or a rear camera, or one camera that serves as both) is used for capturing still images or video. In general, the camera 293 may include photosensitive elements such as a lens group and an image sensor, where the lens group includes a plurality of lenses (convex or concave) for collecting the optical signal reflected by the object to be photographed and transferring it to the image sensor, and the image sensor generates an original image of the object from the optical signal.
Internal memory 221 may be used to store computer-executable program code, including instructions. The processor 210 executes various functional applications and data processing of the cellular phone 200 by executing instructions stored in the internal memory 221. The internal memory 221 may include a program storage area and a data storage area. Wherein the storage program area may store an operating system, code for application programs (such as a camera application, a WeChat application, etc.), and the like. The data storage area can store data (such as images, videos and the like acquired by a camera application) and the like created in the use process of the mobile phone 200.
The internal memory 221 may further store one or more computer programs 1310 corresponding to the control method for cross-device synchronous display provided in the embodiments of the present application. The one or more computer programs 1310 are stored in the internal memory 221 and configured to be executed by the one or more processors 210, and include instructions that can be used to execute the control method for cross-device synchronous display of any of the above embodiments.
In addition, the internal memory 221 may include a high speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a Universal Flash Storage (UFS), and the like.
Of course, the code of the control method for cross-device synchronous display provided in the embodiment of the present application may also be stored in the external memory. In this case, the processor 210 may execute the code of the control method of the cross-device synchronous display stored in the external memory through the external memory interface 220.
The function of the sensor module 280 is described below.
The gyro sensor 280A may be used to determine the motion attitude of the mobile phone 200. In some embodiments, the angular velocity of the mobile phone 200 about three axes (i.e., the x, y, and z axes) may be determined through the gyro sensor 280A; that is, the gyro sensor 280A may be used to detect the current motion state of the mobile phone 200, such as shaking or standing still.
When the display screen in the embodiment of the present application is a foldable screen, the gyro sensor 280A may be used to detect a folding or unfolding operation acting on the display screen 294. The gyro sensor 280A may report the detected folding operation or unfolding operation as an event to the processor 210 to determine the folded state or unfolded state of the display screen 294.
The acceleration sensor 280B can detect the magnitude of the acceleration of the mobile phone 200 in various directions (typically along three axes); that is, the acceleration sensor 280B may be used to detect the current motion state of the mobile phone 200, such as shaking or standing still. When the display screen in the embodiment of the present application is a foldable screen, the acceleration sensor 280B may be used to detect a folding or unfolding operation acting on the display screen 294. The acceleration sensor 280B may report the detected folding operation or unfolding operation as an event to the processor 210 to determine the folded state or unfolded state of the display screen 294.
The proximity light sensor 280G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The mobile phone emits infrared light outward through the light-emitting diode and uses the photodiode to detect the infrared light reflected from nearby objects. When sufficient reflected light is detected, the mobile phone can determine that there is an object near it; when insufficient reflected light is detected, the mobile phone can determine that there is no object nearby. When the display screen in the embodiment of the present application is a foldable display screen, the proximity light sensor 280G may be disposed on a first screen of the foldable display screen 294 and may detect the folding angle or unfolding angle of the first screen and the second screen according to the optical path difference of the infrared signal.
The gyro sensor 280A (or the acceleration sensor 280B) may transmit the detected motion state information (such as an angular velocity) to the processor 210. The processor 210 determines whether the mobile phone 200 is currently in the hand-held state or the tripod state (for example, when the angular velocity is not 0, it indicates that the mobile phone 200 is in the hand-held state) based on the motion state information.
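The handheld-versus-tripod decision described above can be sketched in a few lines. The patent states only that a non-zero angular velocity indicates the handheld state; the noise threshold below is an assumption added to make the check robust for real sensor readings.

```python
# Minimal sketch of the motion-state decision; the threshold is an assumption,
# the patent only says a non-zero angular velocity indicates the handheld state.
def motion_state(angular_velocity_xyz, noise_threshold=0.01):
    """Classify the motion state from gyroscope angular velocity (rad/s, three axes)."""
    moving = any(abs(w) > noise_threshold for w in angular_velocity_xyz)
    return "handheld" if moving else "tripod"
```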
The fingerprint sensor 280H is used to collect a fingerprint. The mobile phone 200 can utilize the collected fingerprint characteristics to realize fingerprint unlocking, access to an application lock, fingerprint photographing, fingerprint incoming call answering and the like.
The touch sensor 280K is also referred to as a "touch panel". The touch sensor 280K may be disposed on the display screen 294, and the touch sensor 280K and the display screen 294 form a touch screen, which is also called a "touch screen". The touch sensor 280K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operations may be provided through the display screen 294. In other embodiments, the touch sensor 280K can be disposed on the surface of the mobile phone 200, different from the position of the display screen 294.
Illustratively, the display 294 of the cell phone 200 displays a home interface that includes icons for a plurality of applications (e.g., a camera application, a WeChat application, etc.). The user clicks an icon of the camera application in the main interface through the touch sensor 280K, and the processor 210 is triggered to start the camera application and open the camera 293. Display screen 294 displays an interface, such as a viewfinder interface, for a camera application.
The wireless communication function of the mobile phone 200 can be implemented by the antenna 1, the antenna 2, the mobile communication module 251, the wireless communication module 252, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the handset 200 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 251 can provide a solution including 2G/3G/4G/5G wireless communication applied to the handset 200. The mobile communication module 251 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 251 can receive electromagnetic waves from the antenna 1, and filter, amplify, etc. the received electromagnetic waves, and transmit the electromagnetic waves to the modem processor for demodulation. The mobile communication module 251 can also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave to radiate the electromagnetic wave through the antenna 1. In some embodiments, at least some of the functional modules of the mobile communication module 251 may be disposed in the processor 210. In some embodiments, at least some of the functional modules of the mobile communication module 251 may be disposed in the same device as at least some of the modules of the processor 210. In this embodiment, the mobile communication module 251 may also be used for information interaction with other terminal devices.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 270A, the receiver 270B, etc.) or displays images or video through the display screen 294. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 251 or other functional modules, independent of the processor 210.
The wireless communication module 252 may provide solutions for wireless communication applied to the mobile phone 200, including wireless local area networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), Global Navigation Satellite System (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 252 may be one or more devices integrating at least one communication processing module. The wireless communication module 252 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on the electromagnetic wave signals, and transmits the processed signals to the processor 210. The wireless communication module 252 may also receive a signal to be transmitted from the processor 210, frequency modulate it, amplify it, and convert it into electromagnetic waves radiated via the antenna 2. In this embodiment, the wireless communication module 252 is configured to transmit data with other terminal devices under the control of the processor 210. For example, when the processor 210 executes the control method for cross-device synchronous display provided in this embodiment, it may control the wireless communication module 252 to send a determination request to another terminal device and receive the determination result made by that device based on the request, the determination result indicating whether the data to be transmitted can be transmitted to that device. The processor may then control the display screen 294 to display the determination result, providing visual feedback for the user, avoiding erroneous and repeated operations, and improving operation efficiency.
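The determination-request exchange just described can be sketched as a simple request/response pair. The message fields and the capability check below are hypothetical; the patent says only that the source sends a determination request and receives a result indicating whether the data can be transmitted.

```python
# Hypothetical message shapes for the determination-request exchange;
# field names and the acceptance criteria are invented for illustration.
def make_determination_request(payload_type, payload_size):
    """Source side: build a determination request for the data to be transmitted."""
    return {"type": "determination_request",
            "payload_type": payload_type,
            "payload_size": payload_size}

def handle_determination_request(request, supported_types, free_bytes):
    """Sink side: decide whether the data to be transmitted can be accepted."""
    ok = (request["payload_type"] in supported_types
          and request["payload_size"] <= free_bytes)
    return {"type": "determination_result", "accepted": ok}
```

On the source side, the `accepted` field of the returned result is what would be rendered on the display screen 294 as visual feedback.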
In addition, the mobile phone 200 can implement an audio function through the audio module 270, the speaker 270A, the receiver 270B, the microphone 270C, the earphone interface 270D, and the application processor. Such as music playing, recording, etc. The handset 200 may receive key 290 inputs, generating key signal inputs relating to user settings and function control of the handset 200. Cell phone 200 can generate a vibration alert (such as an incoming call vibration alert) using motor 291. The indicator 292 in the mobile phone 200 may be an indicator light, and may be used to indicate a charging status, a power change, or an indication message, a missed call, a notification, or the like. The SIM card interface 295 in the handset 200 is used to connect a SIM card. The SIM card can be attached to and detached from the mobile phone 200 by being inserted into the SIM card interface 295 or being pulled out from the SIM card interface 295.
It should be understood that in practical applications, the mobile phone 200 may include more or fewer components than those shown in fig. 12, and the embodiments of the present application are not limited in this respect. The illustrated mobile phone 200 is merely an example; it may combine two or more components or adopt a different component configuration. The various components shown in the figure may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application-specific integrated circuits.
The software system of the terminal device may adopt a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the application takes an Android system with a layered architecture as an example, and exemplarily illustrates a software structure of a terminal device.
Fig. 13 is a block diagram of a software configuration of a terminal device according to an embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 13, the application package may include phone, camera, gallery, calendar, talk, map, navigation, WLAN, bluetooth, music, video, short message, etc. applications.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 13, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like. The window manager may also be configured to detect whether a device expansion transmission operation, such as a drag operation, exists according to an embodiment of the present disclosure.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The telephone manager is used for providing a communication function of the terminal equipment. Such as management of call status (including connection, hangup, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages that disappear automatically after a short stay without requiring user interaction, such as notifications of download completion or message alerts. The notification manager may also present notifications that appear in the top status bar of the system in the form of a chart or scroll-bar text, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is given, the terminal device vibrates, or an indicator light flickers.
The Android runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing the Android system.
The core library comprises two parts: one part is the function libraries that the Java language needs to call, and the other part is the core libraries of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), Media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., OpenGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports a variety of commonly used audio, video format playback and recording, and still image files, among others. The media library may support a variety of audio-video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
An embodiment of the present application provides a control device for cross-device synchronous display, including: a processor and a memory for storing processor-executable instructions; wherein the processor is configured to implement the above method when executing the instructions.
Embodiments of the present application provide a non-transitory computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the above-described method.
Embodiments of the present application provide a computer program product comprising computer-readable code, or a non-transitory computer-readable storage medium carrying computer-readable code; when the code runs in a processor of an electronic device, the processor performs the above method.
The computer-readable storage medium may be a tangible device that can hold and store instructions for use by an instruction execution device. The computer-readable storage medium may be, for example, but not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium include the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a memory stick, a floppy disk, a mechanical encoding device (for example, a punch card or a raised structure in a groove having instructions stored thereon), and any suitable combination of the foregoing.
The computer readable program instructions or code described herein may be downloaded to the respective computing/processing device from a computer readable storage medium, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present application may be assembler instructions, instruction set architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state-setting data, or source or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk or C++ and conventional procedural programming languages such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, electronic circuitry such as programmable logic circuits, field-programmable gate arrays (FPGAs), or programmable logic arrays (PLAs) can execute computer-readable program instructions, utilizing state information of the instructions to personalize the circuitry, thereby implementing aspects of the present application.
Various aspects of the present application are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
It is also noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by hardware (e.g., a Circuit or an ASIC) for performing the corresponding function or action, or by combinations of hardware and software, such as firmware.
While the invention has been described in connection with various embodiments, other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a review of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the word "a" or "an" does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
Having described embodiments of the present application, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein is chosen in order to best explain the principles of the embodiments, the practical application, or improvements made to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (19)

1. A control method for cross-device synchronous display, characterized by comprising:
receiving, by a first device having a first display, remote control data, the first device being coupled to a second device having a second display, the first display synchronously displaying first display content, wherein the first display content is also displayed by the second display;
sending the remote control data, or control data matching the remote control data, to the second device;
receiving second display content, wherein the second display content is obtained by the second device adjusting the first display content according to the remote control data or the control data;
and displaying the second display content on the first display.
2. The method of claim 1, wherein the first display is further configured to display an interface focus, and wherein the remote control data comprises operational data for the interface focus.
3. The method of claim 2, wherein the interface focus is disposed on a control of a display interface.
4. The method according to any one of claims 1-3, wherein the control data comprises a control instruction executable by the second device, the control instruction being obtained by converting the remote control data according to a preset conversion rule.
5. The method of claim 2 or 3, wherein receiving second display content comprises:
receiving the second display content and a position of the interface focus in the second display content;
and superimposing the interface focus at the position in the second display content.
6. A control method for cross-device synchronous display, characterized by comprising:
displaying, by a second display of a second device, first display content;
receiving, by the second device, remote control data or control data matching the remote control data, wherein the remote control data is directed to a first device having a first display, the second device is coupled to the first device, and the first display synchronously displays the first display content;
adjusting the first display content according to the remote control data or the control data to generate second display content;
and sending the second display content.
7. The method of claim 6, wherein the remote control data comprises operational data for an interface focus displayed by the first display.
8. The method of claim 7, wherein the adjusting the first display content according to the remote control data comprises:
determining a new position of the interface focus according to the remote control data;
and adjusting the first display content according to the new position to generate second display content.
9. The method of claim 8, wherein the sending the second display content comprises:
superimposing the interface focus at the new position in the second display content;
and sending the second display content with the interface focus superimposed.
10. The method of claim 8, wherein the sending the second display content comprises:
and sending the second display content and the new position of the interface focus.
11. The method of any of claims 7-10, wherein the interface focus comprises a control in the first display content.
12. The method according to any one of claims 6-11, wherein the control data comprises a control instruction executable by the second device, the control instruction being obtained by converting the remote control data according to a preset conversion rule.
13. The method according to any one of claims 6-11, wherein the adjusting the first display content according to the remote control data comprises:
parsing the remote control data into a control instruction matching the second device according to a preset parsing rule;
and adjusting the first display content using the control instruction.
14. The method of claim 13, wherein the parsing the remote control data into the control instruction matching the second device according to the preset parsing rule comprises:
acquiring identification information of the first device;
and parsing the remote control data into the control instruction matching the second device according to the preset parsing rule, wherein the preset parsing rule is related to the identification information of the first device.
15. A terminal device, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to implement the method of any one of claims 1-5 when executing the instructions.
16. A terminal device, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to implement the method of any one of claims 6-14 when executing the instructions.
17. A control system for cross-device synchronous display, characterized by comprising the terminal device of claim 15 and the terminal device of claim 16.
18. A computer readable storage medium having computer program instructions stored thereon, wherein the computer program instructions, when executed by a processor, implement the method of any of claims 1-5 or implement the method of any of claims 6-14.
19. A computer program product comprising computer-readable code, or a non-transitory computer-readable storage medium carrying computer-readable code, wherein when the computer-readable code runs in a processor of an electronic device, the processor performs the method of any one of claims 1-5 or the method of any one of claims 6-14.
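As a non-authoritative illustration only, the flow recited in the claims above (a first device forwarding remote control data, and a second device moving the interface focus, adjusting the content, and returning it with the new focus position, as in claims 1, 5, 8 and 10) can be sketched as a minimal simulation. All class names, method names and key codes below are hypothetical and are not taken from the patent.

```python
class SecondDevice:
    """Source device (e.g. a phone) that renders the shared interface."""

    def __init__(self, controls):
        self.controls = controls   # focusable controls, left to right
        self.focus = 0             # index of the interface focus

    def adjust(self, remote_control_data):
        # Parse the remote control data into a control instruction
        # (claim 13) and determine the new focus position (claim 8).
        if remote_control_data == "RIGHT":
            self.focus = min(self.focus + 1, len(self.controls) - 1)
        elif remote_control_data == "LEFT":
            self.focus = max(self.focus - 1, 0)
        # Generate the second display content and return it together
        # with the new position of the interface focus (claim 10).
        return list(self.controls), self.focus


class FirstDevice:
    """Sink device (e.g. a TV) that mirrors the content and receives
    the remote control data (claim 1)."""

    def __init__(self, second):
        self.second = second
        self.shown = None

    def on_remote_control(self, data):
        # Forward the remote control data to the second device, receive
        # the adjusted content plus focus position, and superimpose the
        # focus locally (claim 5); brackets mark the focused control.
        frame, pos = self.second.adjust(data)
        self.shown = [f"[{c}]" if i == pos else c
                      for i, c in enumerate(frame)]
        return self.shown


tv = FirstDevice(SecondDevice(["Home", "Search", "Settings"]))
print(tv.on_remote_control("RIGHT"))  # ['Home', '[Search]', 'Settings']
```

In this sketch the focus rendering happens on the first device (claims 5 and 10); claim 9 instead has the second device superimpose the focus before sending, which would move the bracket-marking step into `SecondDevice.adjust`.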
CN202110287070.XA 2021-03-17 2021-03-17 Cross-device synchronous display control method and system Pending CN115113832A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110287070.XA CN115113832A (en) 2021-03-17 2021-03-17 Cross-device synchronous display control method and system
PCT/CN2022/079942 WO2022194005A1 (en) 2021-03-17 2022-03-09 Control method and system for synchronous display across devices

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110287070.XA CN115113832A (en) 2021-03-17 2021-03-17 Cross-device synchronous display control method and system

Publications (1)

Publication Number Publication Date
CN115113832A true CN115113832A (en) 2022-09-27

Family

ID=83321569

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110287070.XA Pending CN115113832A (en) 2021-03-17 2021-03-17 Cross-device synchronous display control method and system

Country Status (2)

Country Link
CN (1) CN115113832A (en)
WO (1) WO2022194005A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI832728B (en) * 2023-02-27 2024-02-11 大陸商泓凱電子科技(東莞)有限公司 Wireless touch Bluetooth reverse-control system

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9632648B2 (en) * 2012-07-06 2017-04-25 Lg Electronics Inc. Mobile terminal, image display device and user interface provision method using the same
US20150147961A1 (en) * 2013-07-19 2015-05-28 Google Inc. Content Retrieval via Remote Control
CN104333789A (en) * 2014-10-30 2015-02-04 向火平 On-screen interacting system and control method thereof
KR102500558B1 (en) * 2016-03-16 2023-02-17 엘지전자 주식회사 Display device and method for operating thereof
CN106502604A (en) * 2016-09-28 2017-03-15 北京小米移动软件有限公司 Throw screen changing method and device
CN111880870B (en) * 2020-06-19 2024-06-07 维沃移动通信有限公司 Method and device for controlling electronic equipment and electronic equipment
CN114071207B (en) * 2020-07-30 2023-03-24 华为技术有限公司 Method and device for controlling display of large-screen equipment, large-screen equipment and storage medium


Also Published As

Publication number Publication date
WO2022194005A1 (en) 2022-09-22

Similar Documents

Publication Publication Date Title
US20230099824A1 (en) Interface layout method, apparatus, and system
US20220342850A1 (en) Data transmission method and related device
WO2022100237A1 (en) Screen projection display method and related product
CN111666055B (en) Data transmission method and device
CN112558825A (en) Information processing method and electronic equipment
CN112286618A (en) Device cooperation method, device, system, electronic device and storage medium
EP4228244A1 (en) Video processing method and apparatus, and storage medium
CN111221845A (en) Cross-device information searching method and terminal device
CN112527174B (en) Information processing method and electronic equipment
CN113050841A (en) Method, electronic equipment and system for displaying multiple windows
CN112527222A (en) Information processing method and electronic equipment
CN114065706A (en) Multi-device data cooperation method and electronic device
CN114442969B (en) Inter-equipment screen collaboration method and equipment
WO2022194005A1 (en) Control method and system for synchronous display across devices
CN114666433A (en) Howling processing method and device in terminal equipment and terminal
CN114520867B (en) Camera control method based on distributed control and terminal equipment
WO2022105793A1 (en) Image processing method and device
US20240125603A1 (en) Road Recognition Method and Apparatus
WO2022121751A1 (en) Camera control method and apparatus, and storage medium
EP4273679A1 (en) Method and apparatus for executing control operation, storage medium, and control
CN114513760B (en) Font library synchronization method, device and storage medium
CN116777740A (en) Screen capturing method, electronic equipment and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination