CN116069224A - Input control method and electronic equipment


Info

Publication number
CN116069224A
Authority
CN
China
Prior art keywords: electronic device, display area, input, display, screen
Prior art date
Legal status: Pending (assumed; not a legal conclusion)
Application number
CN202111276091.8A
Other languages
Chinese (zh)
Inventor
卢跃东 (Lu Yuedong)
周学而 (Zhou Xue'er)
魏凡翔 (Wei Fanxiang)
Current Assignee (listing may be inaccurate)
Huawei Device Co Ltd
Original Assignee
Huawei Device Co Ltd
Priority date (assumed; not a legal conclusion)
Filing date
Publication date
Application filed by Huawei Device Co Ltd
Priority to CN202111276091.8A
Priority to PCT/CN2022/119062 (published as WO2023071590A1)
Publication of CN116069224A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485 Scrolling or panning
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1407 General aspects irrespective of display type, e.g. determination of decimal point position, display with fixed or driving decimal point, suppression of non-significant zeros

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

This application provides an input control method and an electronic device, and relates to the field of terminal technologies. Only after determining an operation for switching the input focus position does the first electronic device intercept input and forward it to the second electronic device, which improves user experience. The method includes the following steps: after the first electronic device determines a first operation acting on a first display area of the second electronic device, it intercepts a first input and forwards the first input to the second electronic device, so that the second electronic device displays the first input in the first display area. The first operation is used to indicate switching the input focus position to the first display area of the second electronic device.

Description

Input control method and electronic equipment
Technical Field
Embodiments of this application relate to the field of terminal technologies, and in particular, to an input control method and an electronic device.
Background
With the development of terminal technologies, multi-screen collaboration can be implemented between electronic devices: one electronic device controls the display screens of two electronic devices, which enlarges the display space and allows content on one device to be quickly transferred to the other device for display. Generally, during multi-screen collaboration, a plurality of electronic devices share one set of keyboard and mouse, and the display position of keyboard input content switches as the display position of the mouse pointer switches between display screens.
For example, as shown in FIG. 1(a), a multi-screen collaboration connection is established between a computer 101 and a tablet 102, and the computer 101 and the tablet 102 share one set of keyboard and mouse. If the mouse pointer 11 is displayed on the display screen of the computer 101, content entered by the user through the keyboard is displayed on the display screen of the computer 101. Then, in response to a user operation, the mouse pointer 11 moves toward the tablet 102 and past the edge of the display screen of the computer 101, so that the computer 101 hides the mouse pointer 11. As shown in FIG. 1(b), a mouse pointer 12 is then displayed on the display screen of the tablet 102, so that content entered by the user through the keyboard is displayed on the display screen of the tablet 102.
It can be seen that if the user inadvertently moves the mouse so that the display position of the mouse pointer switches between display screens, the display position of keyboard input content switches along with the pointer. If the user does not notice that the pointer's display position has switched, input may land in the wrong place, affecting the user's experience.
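As a concrete illustration of this coupling (not part of the patent; all names below are hypothetical), the prior-art behavior can be modeled in a few lines of Python:

```python
# Hypothetical model of the prior-art behavior: the keyboard input target
# follows the mouse pointer as soon as the pointer crosses the screen edge.
class PriorArtSharedInput:
    def __init__(self):
        self.pointer_screen = "computer"  # screen currently showing the pointer

    def move_pointer(self, x, screen_width):
        # Moving past the right edge of the computer's screen hands the
        # pointer, and implicitly the keyboard focus, over to the tablet.
        if x > screen_width:
            self.pointer_screen = "tablet"

    def type_text(self, text):
        # Keyboard input is displayed wherever the pointer currently is,
        # even if the user never intended to switch input targets.
        return (self.pointer_screen, text)

shared = PriorArtSharedInput()
shared.move_pointer(x=2000, screen_width=1920)  # accidental nudge off-screen
print(shared.type_text("hello"))                # input now lands on the tablet
```

An unnoticed pointer move is enough to redirect all subsequent typing, which is exactly the problem the method below avoids.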
Disclosure of Invention
To resolve the foregoing technical problem, embodiments of this application provide an input control method and an electronic device. According to the technical solutions provided in the embodiments of this application, the first electronic device intercepts input and forwards it to the second electronic device only after determining an operation for switching the input focus position, which improves user experience.
To achieve the foregoing technical objective, the embodiments of this application provide the following technical solutions:
In a first aspect, an input control method is provided, applied to a first electronic device. The method includes the following steps: determining a first operation acting on a first display area, where the first operation is used to indicate switching the input focus position to the first display area of a second electronic device; and after determining the first operation acting on the first display area, intercepting a first input and sending the first input to the second electronic device, so that the first input is displayed in the first display area of the second electronic device.
Optionally, the first operation is a mouse click operation acting on the first display area, or the first operation is a touch operation acting on the first display area.
In some embodiments, the first display area may be, for example, all or part of the area of a display screen. Specifically, the first display area may be a display area for displaying an application, a function, a component, or the like, or any component area that can accept input.
In some embodiments, the first electronic device intercepts the first input through a keyboard HOOK. After determining the first operation, the first electronic device installs the keyboard HOOK, and intercepts and forwards the first input through that hook. Before the first operation is determined, the first electronic device does not install the keyboard HOOK (or uninstalls it), and delivers any received input to the local application, so that the input is displayed in the display area of the local application.
Thus, the first electronic device does not switch the input focus position merely upon detecting an operation of moving the mouse pointer to another display area. Only after determining that the user intends to switch the input focus does the first electronic device install the keyboard HOOK, intercept subsequent keyboard input events, and send them to the second electronic device, thereby switching the input focus, ensuring that input is displayed where the user expects, and improving user experience.
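The hook-based interception described above can be sketched as a small state machine. This is an illustrative Python model, not the patent's implementation: the class and method names are invented, and a real implementation would install a platform keyboard hook rather than flip a boolean flag.

```python
# Illustrative model of the first device's keyboard-hook logic: the hook is
# installed only after a focus-switch operation is determined; while it is
# installed, keystrokes are intercepted and forwarded instead of being
# delivered to the local application.
class FirstDevice:
    def __init__(self):
        self.hook_installed = False
        self.local_display = []   # what the local application shows
        self.forwarded = []       # what is sent to the second device

    def on_first_operation(self):
        # e.g. a mouse click or touch in the second device's display area
        self.hook_installed = True    # "install" the keyboard hook

    def on_key_input(self, key):
        if self.hook_installed:
            self.forwarded.append(key)       # intercept and forward
        else:
            self.local_display.append(key)   # deliver to the local app

dev = FirstDevice()
dev.on_key_input("a")       # before the first operation: shown locally
dev.on_first_operation()
dev.on_key_input("b")       # after: intercepted and forwarded
print(dev.local_display, dev.forwarded)  # ['a'] ['b']
```

The key design point is that pointer movement alone never sets `hook_installed`; only an explicit click or touch does.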
According to the first aspect, or any implementation manner of the first aspect, after determining the first operation acting on the first display area, intercepting the first input and sending the first input to the second electronic device includes: after detecting a mouse click operation in the first display area, or after determining a touch operation acting on the first display area, intercepting a first input of the keyboard and sending the first input to the second electronic device.
In this way, the first electronic device determines the user's intention from the click position of the mouse, or from the position of the touch operation, and switches the input focus position to the position where the first operation acts only after determining that intention. The display position of subsequent input therefore meets the user's requirements, the problem of an abnormal input position caused by an accidental mouse movement is avoided, and user experience is improved.
According to the first aspect, or any implementation manner of the first aspect, determining the first operation acting on the first display area includes: detecting the first operation acting on the first display area; or receiving first information sent by the second electronic device, where the first information is sent after the second electronic device detects the first operation acting on the first display area.
In some embodiments, the first electronic device acts as a master device, and the multi-screen collaboration mode applied by the first electronic device and the second electronic device includes: a shared collaboration mode, a windowed collaboration mode, or a full-screen extended collaboration mode.
For example, in the shared collaboration mode, the display screen of the first electronic device and the display screen of the second electronic device display content separately; the first display area is a display area displayed on the display screen of the second electronic device, and the second display area is an area displayed by the first electronic device. The first electronic device is capable of detecting a mouse click operation on the display screen of the second electronic device.
For another example, in the windowed collaboration mode, the second display area is a screen-casting area of the first electronic device on the display screen of the second electronic device, and the first display area is the display area on the second electronic device's display screen other than the screen-casting area. After detecting a touch operation acting on the first display area, the second electronic device sends first information to the first electronic device; upon receiving the first information, the first electronic device determines that the first operation has occurred, that is, determines that the input focus is to be switched to the first display area.
In this way, the first electronic device needs to determine the first operation before switching the input focus position, and even when the first operation does not act on the first electronic device itself, the first operation can be determined through the connection with the second electronic device. Only after determining the first operation does the first electronic device switch the input focus position, intercept input, and forward it to the second electronic device, thereby meeting the user's input requirements.
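The two determination paths just described, local detection versus first information received from the second electronic device, can be sketched as follows. The event fields are assumptions made for illustration only:

```python
# Sketch of the two ways the first device can determine the first operation:
# it either observes the operation itself (e.g. a mouse click it detects), or
# it receives "first information" that the second device sends after detecting
# a touch in the first display area. Field names are illustrative.
def determine_first_operation(event):
    # Path 1: the first device detects the operation locally.
    if event["source"] == "local" and event["type"] == "mouse_click" \
            and event["area"] == "first_display_area":
        return True
    # Path 2: the second device detected the touch and reported it.
    if event["source"] == "second_device" and event["type"] == "first_information":
        return True
    return False

# A local click in the first display area counts as the first operation:
print(determine_first_operation(
    {"source": "local", "type": "mouse_click", "area": "first_display_area"}))
```

Either path yields the same decision: the input focus switches to the first display area.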
According to the first aspect, or any implementation manner of the first aspect, the method further includes: after detecting a second operation acting on the second display area, determining to scroll display the display content displayed in the second display area; the second display area is a display area corresponding to the first electronic device.
According to the first aspect, or any implementation manner of the first aspect, the second operation is a mouse scroll wheel operation when the second display area displays the mouse pointer.
According to the first aspect, or any implementation manner of the first aspect, determining to scroll the display content displayed in the second display area includes: when the second display area is a display area of the first electronic device, scrolling the display content of the second display area; or, when the second display area is a display area, displayed by the second electronic device, for showing screen-casting content of the first electronic device, sending the scrolled display content of the second display area to the second electronic device.
For example, assume that the multi-screen collaboration mode applied by the first electronic device and the second electronic device is the shared collaboration mode. While intercepting the first input and sending it to the second electronic device, the first electronic device detects an operation of scrolling the mouse wheel acting on the second display area of the first electronic device, and scrolls the display content of the second display area.
For another example, assume that the multi-screen collaboration mode applied by the first electronic device and the second electronic device is the windowed collaboration mode or the full-screen extended collaboration mode. After detecting the second operation acting on the second display area, the first electronic device sends first screen-casting content to the second electronic device, where the first screen-casting content is the scrolled screen-casting content.
In this way, the display position of the mouse pointer is decoupled from the position of the input focus, the input focus is managed uniformly by the first electronic device, and the user's need to input content in one display area while scrolling and browsing in another is met.
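The decoupling described above can be modeled as a small event router: keyboard input goes to the focus target chosen by the last explicit operation, while wheel events follow the pointer. A hypothetical Python sketch, with invented names:

```python
# Illustrative router decoupling the pointer from the input focus: wheel
# events are routed by where the pointer currently is, while keyboard input
# keeps going to the display area selected by the last first/third operation.
class InputRouter:
    def __init__(self):
        self.keyboard_target = "second_device"      # set by the first operation
        self.pointer_area = "second_display_area"   # where the pointer sits

    def route(self, event):
        if event == "key":
            return self.keyboard_target   # focus does not follow the pointer
        if event == "wheel":
            return self.pointer_area      # scrolling follows the pointer
        raise ValueError(f"unknown event: {event}")

r = InputRouter()
# Typing lands on the tablet while the wheel scrolls the PC's own area:
print(r.route("key"), r.route("wheel"))
```

This is the behavior the patent attributes to unified focus management by the first electronic device: one pointer, two independently targeted input streams.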
According to the first aspect, or any implementation manner of the first aspect, the method further includes: determining a third operation acting on the second display area; the third operation is used for indicating to switch the input focus position to a second display area corresponding to the first electronic device. Based on the second input, it is determined to display the second input in the second display area.
In some embodiments, after determining the third operation, the first electronic device uninstalls the keyboard HOOK and no longer intercepts input; after subsequently receiving a second input, it determines, according to the second input, to display the second input in the second display area. Optionally, the third operation is a mouse click operation acting on the second display area, or the third operation is a touch operation acting on the second display area.
In this way, only after determining a mouse click operation or a touch operation acting on a display area does the first electronic device decide whether to install or uninstall the keyboard HOOK, according to whether that display area is the display area corresponding to the first electronic device. This ensures that input is displayed where the user expects and improves the user's experience.
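Combining the first operation (install the hook) with the third operation (uninstall it) gives a simple focus controller. The sketch below is an illustrative model with invented names; on Windows, for instance, the hook itself would be installed with the real SetWindowsHookEx API rather than a flag.

```python
# Illustrative focus controller: a click/touch in the second device's area
# ("first display area") installs the keyboard hook; a click/touch back in
# the first device's own area ("second display area") uninstalls it.
class FocusController:
    def __init__(self):
        self.hook_installed = False

    def on_click(self, area):
        if area == "first_display_area":      # area on the second device
            self.hook_installed = True        # first operation: install hook
        elif area == "second_display_area":   # area of the first device
            self.hook_installed = False       # third operation: uninstall hook

    def dispatch_key(self, key):
        # With the hook installed, input is intercepted and forwarded;
        # otherwise it is delivered to the local application.
        return ("second_device", key) if self.hook_installed else ("local", key)

fc = FocusController()
fc.on_click("first_display_area")
print(fc.dispatch_key("x"))   # forwarded to the second device
fc.on_click("second_display_area")
print(fc.dispatch_key("y"))   # delivered locally again
```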
In a second aspect, an input control method is provided, applied to a second electronic device. The method includes the following steps: after detecting a first operation acting on a first display area, sending first information to the first electronic device, where the first operation is used to indicate switching the input focus position to the first display area of the second electronic device; receiving a first input sent by the first electronic device, where the first input is input intercepted by the first electronic device after receiving the first information; and displaying the first input in the first display area.
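The second device's side of this exchange, detect the touch, send first information, then display the forwarded input, can be sketched as follows. The message format and transport are assumptions for illustration:

```python
# Illustrative model of the second device's protocol role: on a touch in its
# first display area it sends "first information" to the first device, then
# displays whatever intercepted input the first device forwards back.
class SecondDevice:
    def __init__(self, send_to_first_device):
        self.send = send_to_first_device   # transport callback (assumed)
        self.first_display_area = []       # content shown in the area

    def on_touch(self, area):
        if area == "first_display_area":
            self.send({"type": "first_information", "area": area})

    def on_forwarded_input(self, key):
        self.first_display_area.append(key)   # display the first input

sent = []
tablet = SecondDevice(sent.append)
tablet.on_touch("first_display_area")   # -> first information to first device
tablet.on_forwarded_input("h")          # <- intercepted input comes back
print(sent, tablet.first_display_area)
```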
According to the second aspect, the first operation is a mouse click operation acting on the first display area, or the first operation is a touch operation acting on the first display area.
According to the second aspect, or any implementation manner of the second aspect, the second electronic device further displays a second display area, where the second display area is used to display screen-casting content sent by the first electronic device, and the method further includes: displaying, in the second display area, first screen-casting content sent by the first electronic device, where the first screen-casting content is scrolled screen-casting content, sent by the first electronic device after it detects the second operation acting on the second display area.
According to a second aspect, or any implementation manner of the second aspect, the second operation is a mouse scroll wheel operation when the second display area displays the mouse pointer.
The technical effects corresponding to the second aspect and any implementation manner of the second aspect may be referred to the technical effects corresponding to the first aspect and any implementation manner of the first aspect, which are not described herein.
In a third aspect, an electronic device is provided. The electronic device is a first electronic device, comprising: a processor and a memory coupled to the processor, the memory for storing computer program code, the computer program code comprising computer instructions that, when read from the memory by the processor, cause the electronic device to perform the operations of: determining a first operation acting on the first display area; the first operation is for indicating to switch the input focus position to a first display area of the second electronic device. After determining the first operation on the first display area, intercepting the first input and transmitting the first input to the second electronic device to enable display of the first input in the first display area of the second electronic device.
According to the third aspect, the first operation is a mouse click operation acting on the first display area, or the first operation is a touch operation acting on the first display area.
According to the third aspect, or any implementation manner of the third aspect, after determining the first operation acting on the first display area, intercepting the first input and sending the first input to the second electronic device includes: after detecting a mouse click operation in the first display area, or after determining a touch operation acting on the first display area, intercepting a first input of the keyboard and sending the first input to the second electronic device.
According to a third aspect, or any implementation manner of the above third aspect, determining a first operation acting on the first display area includes: a first operation is detected that is applied to the first display area. Or receiving first information sent by the second electronic device, and determining a first operation according to the first information, wherein the first information is information sent after the second electronic device detects the first operation acting on the first display area.
According to a third aspect, or any implementation manner of the above third aspect, when the processor reads the computer instructions from the memory, the electronic device is further caused to perform the following operations: after detecting a second operation acting on the second display area, determining to scroll display the display content displayed in the second display area; the second display area is a display area corresponding to the first electronic device.
According to a third aspect, or any implementation manner of the third aspect, the second operation is a mouse scroll wheel operation when the second display area displays a mouse pointer.
According to the third aspect, or any implementation manner of the third aspect, determining to scroll the display content displayed in the second display area includes: when the second display area is a display area of the first electronic device, scrolling the display content of the second display area; or, when the second display area is a display area, displayed by the second electronic device, for showing screen-casting content of the first electronic device, sending the scrolled display content of the second display area to the second electronic device.
According to a third aspect, or any implementation manner of the above third aspect, when the processor reads the computer instructions from the memory, the electronic device is further caused to perform the following operations: determining a third operation acting on the second display area; the third operation is used for indicating to switch the input focus position to a second display area corresponding to the first electronic device. Based on the second input, it is determined to display the second input in the second display area.
The technical effects corresponding to the third aspect and any implementation manner of the third aspect may be referred to the technical effects corresponding to the first aspect and any implementation manner of the first aspect, and are not described herein again.
In a fourth aspect, an electronic device is provided. The electronic device is a second electronic device, comprising: a processor, a memory, and a display screen, the memory, the display screen coupled to the processor, the memory for storing computer program code, the computer program code comprising computer instructions that, when read from the memory by the processor, cause the electronic device to perform operations comprising: after detecting a first operation acting on the first display area, sending first information to the first electronic equipment; the first operation is for indicating to switch the input focus position to a first display area of the second electronic device. Receiving a first input sent by first electronic equipment; the first input is an input intercepted by the first electronic device after receiving the first information. The first input is displayed in the first display area.
According to the fourth aspect, the first operation is a mouse click operation acting on the first display area, or the first operation is a touch operation acting on the first display area.
According to the fourth aspect, or any implementation manner of the fourth aspect, the second electronic device further displays a second display area, where the second display area is used to display screen-casting content sent by the first electronic device, and when the processor reads the computer instructions from the memory, the electronic device is further caused to perform the following operation: displaying, in the second display area, first screen-casting content sent by the first electronic device, where the first screen-casting content is scrolled screen-casting content, sent by the first electronic device after it detects the second operation acting on the second display area.
According to a fourth aspect, or any implementation manner of the fourth aspect, the second operation is a mouse scroll wheel operation when the second display area displays a mouse pointer.
The technical effects corresponding to any implementation manner of the fourth aspect and the fourth aspect may refer to the technical effects corresponding to any implementation manner of the second aspect and the second aspect, and are not described herein.
In a fifth aspect, an embodiment of the present application provides an electronic device, where the electronic device has a function of implementing the input control method as described in the first aspect and any one of possible implementation manners of the first aspect; alternatively, the electronic device has the functionality to implement the input control method as described in the second aspect and any one of the possible implementations. The functions may be implemented by hardware, or by corresponding software executed by hardware. The hardware or software includes one or more modules corresponding to the functions described above.
The technical effects corresponding to the fifth aspect and any implementation manner of the fifth aspect may be referred to the technical effects corresponding to the first aspect and any implementation manner of the first aspect, which are not described herein.
In a sixth aspect, embodiments of the present application provide a computer-readable storage medium. The computer-readable storage medium stores a computer program (which may also be referred to as instructions or code) that, when executed by an electronic device, causes the electronic device to perform the method of the first aspect or any implementation of the first aspect; or causes the electronic device to perform the method of the second aspect or any implementation of the second aspect.
The technical effects corresponding to the sixth aspect and any implementation manner of the sixth aspect may be referred to the technical effects corresponding to the first aspect and any implementation manner of the first aspect, which are not described herein.
In a seventh aspect, embodiments of the present application provide a computer program product for, when run on an electronic device, causing the electronic device to perform the method of the first aspect or any of the embodiments of the first aspect; or cause the electronic device to perform the method of the second aspect or any of the embodiments of the second aspect.
The technical effects corresponding to the seventh aspect and any implementation manner of the seventh aspect may be referred to the technical effects corresponding to the first aspect and any implementation manner of the first aspect, which are not described herein again.
In an eighth aspect, embodiments of the present application provide a circuit system comprising a processing circuit configured to perform the method of the first aspect or any one of the embodiments of the first aspect; alternatively, the processing circuitry is configured to perform the second aspect or a method of any one of the embodiments of the second aspect.
The technical effects corresponding to the eighth aspect and any implementation manner of the eighth aspect may be referred to the technical effects corresponding to the first aspect and any implementation manner of the first aspect, which are not described herein again.
In a ninth aspect, an embodiment of the present application provides a chip system, including at least one processor and at least one interface circuit, where the at least one interface circuit is configured to perform a transceiver function and send an instruction to the at least one processor, and when the at least one processor executes the instruction, the at least one processor performs the method of the first aspect or any implementation manner of the first aspect; alternatively, at least one processor, when executing instructions, performs the method of the second aspect or any of the embodiments of the second aspect.
The technical effects corresponding to the ninth aspect and any implementation manner of the ninth aspect may be referred to the technical effects corresponding to the first aspect and any implementation manner of the first aspect, and are not described herein again.
Drawings
FIG. 1 is a schematic diagram of an interface provided in an embodiment of the present application;
fig. 2A is a schematic diagram of a communication system to which the input control method provided in the embodiment of the present application is applied;
fig. 2B is a schematic diagram of a scenario in which an input control method provided in an embodiment of the present application is applied;
fig. 3 is a schematic hardware structure of an electronic device according to an embodiment of the present application;
fig. 4A is a schematic software structural diagram of a first electronic device according to an embodiment of the present application;
fig. 4B is a schematic software architecture diagram of a second electronic device according to an embodiment of the present application;
FIG. 5 is a second schematic diagram of an interface provided in an embodiment of the present application;
FIG. 6 is a third schematic diagram of an interface provided in an embodiment of the present application;
FIG. 7 is a fourth schematic diagram of an interface provided in an embodiment of the present application;
FIG. 8 is a fifth schematic diagram of an interface provided in an embodiment of the present application;
FIG. 9 is a sixth schematic diagram of an interface provided in an embodiment of the present application;
FIG. 10 is a seventh schematic diagram of an interface provided in an embodiment of the present application;
FIG. 11 is an eighth schematic diagram of an interface provided in an embodiment of the present application;
FIG. 12 is a flowchart of an input control method according to an embodiment of the present application;
FIG. 13 is a schematic structural diagram of a first electronic device according to an embodiment of the present application;
FIG. 14 is a schematic structural diagram of a second electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below with reference to the drawings in the embodiments of the present application. In the description of the embodiments of the present application, the terminology used in the embodiments below is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the specification and the appended claims, the singular forms "a," "an," and "the" are intended to include expressions such as "one or more," unless the context clearly indicates otherwise. It should also be understood that in the embodiments below, "at least one" and "one or more" mean one, two, or more than two.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise. The term "coupled" includes both direct and indirect connections, unless stated otherwise. The terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated.
In the embodiments of the present application, words such as "exemplary" or "such as" are used to mean serving as examples, illustrations, or descriptions. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In the embodiments of the present application, the "user" is not limited to the owner of the electronic device or a particular person; an animal, a hard object, or another object may also act as the user. Accordingly, an operation performed by the user on the electronic device may also be an operation performed on the electronic device by an animal, a hard object, or another object. Words such as "user" and "user operation" are intended to be illustrative of the operations performed, and do not limit who or what performs them.
Fig. 2A is a schematic diagram of a communication system to which an input control method according to an embodiment of the present application is applied. As shown in fig. 2A, the communication system includes a first electronic device 100 and a second electronic device 200.
The first electronic device 100 may establish a wireless communication connection with the second electronic device 200 through a wireless communication technology. The wireless communication technology includes, but is not limited to, at least one of the following: near field communication (near field communication, NFC), Bluetooth (bluetooth, BT) (e.g., conventional Bluetooth or Bluetooth low energy (bluetooth low energy, BLE)), wireless local area network (wireless local area networks, WLAN) (e.g., a wireless fidelity (wireless fidelity, Wi-Fi) network), Zigbee (Zigbee), frequency modulation (frequency modulation, FM), infrared (infrared, IR), and the like.
In some embodiments, both the first electronic device 100 and the second electronic device 200 support a proximity discovery function. Illustratively, both the first electronic device 100 and the second electronic device 200 are capable of implementing the proximity discovery function through NFC sensing. After the first electronic device 100 approaches the second electronic device 200, the first electronic device 100 and the second electronic device 200 can discover each other, and then establish a wireless communication connection such as a Wi-Fi peer-to-peer (peer-to-peer, P2P) connection, a Bluetooth connection, or the like. Thereafter, the first electronic device 100 and the second electronic device 200 implement multi-screen collaboration.
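The discover-then-connect sequence described above can be sketched as follows. This is a minimal illustrative model only: the `Device` class, its method names, and the link-string format are assumptions for this sketch, not the devices' actual APIs.

```python
from dataclasses import dataclass, field


@dataclass
class Device:
    """Hypothetical stand-in for an electronic device supporting proximity discovery."""
    name: str
    discovered: set = field(default_factory=set)   # names of devices found via NFC
    connections: set = field(default_factory=set)  # established wireless links

    def nfc_discover(self, other: "Device") -> None:
        # When the two devices come within NFC range, each records the other.
        self.discovered.add(other.name)
        other.discovered.add(self.name)

    def connect(self, other: "Device", transport: str) -> str:
        # A Wi-Fi P2P or Bluetooth connection is only established after discovery.
        if other.name not in self.discovered:
            raise RuntimeError("devices have not discovered each other yet")
        link = f"{self.name}<->{other.name} via {transport}"
        self.connections.add(link)
        other.connections.add(link)
        return link


first_device = Device("first_electronic_device_100")
second_device = Device("second_electronic_device_200")
first_device.nfc_discover(second_device)
link = first_device.connect(second_device, "Wi-Fi P2P")
```

After the link is established, both devices hold the same connection record, which is the starting point for the multi-screen collaboration modes described below.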
In some embodiments, the first electronic device 100 and the second electronic device 200 establish a wireless communication connection through a local area network. For example, the first electronic device 100 and the second electronic device 200 are both connected to the same router.
In some embodiments, the first electronic device 100 establishes a wireless communication connection with the second electronic device 200 through a cellular network, the internet, or the like. For example, the second electronic device 200 accesses the internet through a router, and the first electronic device 100 accesses the internet through a cellular network; further, the first electronic device 100 establishes a wireless communication connection with the second electronic device 200.
Alternatively, the first electronic device 100 or the second electronic device 200 may be, for example, a personal computer (personal computer, PC), a tablet (Pad), a mobile phone (mobile phone), a notebook computer, a desktop computer, a handheld computer with a transceiver function, a wearable device, a vehicle-mounted device, an artificial intelligence (artificial intelligence, AI) device, or another terminal device. The operating system installed on the first electronic device 100 or the second electronic device 200 includes, but is not limited to, [operating-system trademarks omitted] or other operating systems. In some embodiments, the device types of the first electronic device 100 and the second electronic device 200 are the same or different, and the operating systems installed on the first electronic device 100 and the second electronic device 200 are the same or different. In some embodiments, the first electronic device 100 or the second electronic device 200 may be a fixed device or a portable device. The specific types of the first electronic device 100 and the second electronic device 200, and the installed operating systems, are not limited in this application.
In some embodiments, the first electronic device 100 or the second electronic device 200 is configured with its own keyboard and mouse. For example, assume that the first electronic device 100 is a PC configured with a keyboard and a touchpad. Optionally, the PC may also be configured with an external keyboard and/or mouse. Alternatively, in the case where the first electronic device 100 or the second electronic device 200 is configured with a touch screen, it may or may not be configured with a mouse or a keyboard; for example, it may display a virtual keyboard directly on the touch screen and receive touch operations. For example, as shown in fig. 2A, assume that the second electronic device 200 is a Pad configured with a touch screen. Optionally, the Pad may be configured with a stylus 21. Optionally, the Pad may be connected to an external keyboard and mouse.
In some embodiments, after the first electronic device 100 and the second electronic device 200 establish a connection, the input control method may be applied to the multi-screen collaboration scenarios shown in fig. 2B.
As shown in fig. 2B (a), after the first electronic device 100 and the second electronic device 200 are connected, the display screen of the first electronic device 100 and the display screen of the second electronic device 200 display content separately, and the keyboard and mouse of the first electronic device 100 can control the second electronic device 200. This is equivalent to the first electronic device 100 being externally connected to an additional display screen, thereby enlarging the display space of the first electronic device 100. For example, the first electronic device 100 transmits the received keyboard input to the second electronic device 200 for display on its display screen. Alternatively, the current multi-screen collaboration scenario may be referred to as a shared collaboration mode of the first electronic device 100 and the second electronic device 200.
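The input routing in the shared collaboration mode can be sketched as below: the first device receives the keystrokes and forwards them to whichever screen it currently controls. The function name `route_keyboard_input` and the `screens` dictionary are invented for illustration; the patent does not specify the forwarding mechanism.

```python
def route_keyboard_input(text: str, target_screen: str, screens: dict) -> dict:
    # In the shared collaboration mode, the first electronic device receives the
    # keystrokes and forwards them to the controlled screen, modeled here as
    # appending the input to that screen's displayed text.
    screens[target_screen] = screens.get(target_screen, "") + text
    return screens


# The first device types "hello" while its keyboard controls the second device.
screens = {"first_device_100": "", "second_device_200": ""}
route_keyboard_input("hello", "second_device_200", screens)
```

The same routing function could equally direct input back to the first device's own screen, which is what makes the two screens behave like one extended workspace.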
Also for example, as shown in fig. 2B (B), after the first electronic device 100 and the second electronic device 200 are connected, a display area 22 is displayed on the second electronic device 200, and the display area 22 is used to display the content sent by the first electronic device 100. In this scenario, the content displayed on the first electronic device 100 and the content displayed in the display area 22 on the second electronic device 200 may be the same or different. Alternatively, the first electronic device 100 changes its display content in response to a user operation without affecting the content displayed in the display area 22 on the second electronic device 200. For example, the first electronic device 100 drags application A onto the display area 22 of the second electronic device 200 for display, and the user's operation on application B in the first electronic device 100 does not affect application A displayed in the display area 22 of the second electronic device 200. Alternatively, the current multi-screen collaboration scenario may be referred to as a windowed collaboration mode of the first electronic device 100 and the second electronic device 200.
As shown in fig. 2B (c), after the first electronic device 100 and the second electronic device 200 are connected, the second electronic device 200 displays a display area 23, where the display area 23 is used for displaying the screen-projection content sent by the first electronic device 100. Alternatively, the current multi-screen collaboration scenario may be referred to as a full-screen extended-screen collaboration mode of the first electronic device 100 and the second electronic device 200.
It should be noted that the three collaboration modes shown in fig. 2B, in which the first electronic device 100 and the second electronic device 200 establish a multi-screen collaborative connection, are merely illustrative. As technology advances, the first electronic device 100 and the second electronic device 200 may establish a multi-screen collaborative connection in further collaboration modes, and multi-screen collaboration scenarios may be divided in other ways, which is not specifically limited in this application.
In addition, in the scenarios shown in fig. 2B, the first electronic device 100 and the second electronic device 200 may determine the master device according to a user operation during the process of establishing the connection. Alternatively, the first electronic device 100 and the second electronic device 200 negotiate to determine the master device. In the embodiments of the present application, the input control method is described by taking the multi-screen collaboration scenarios shown in fig. 2B as examples, in which the first electronic device 100 acts as the master device and is configured with a keyboard and a mouse capable of controlling the second electronic device 200. It will be appreciated that the second electronic device 200 may also act as the master device. Optionally, the number of second electronic devices 200 is one or more.
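The two ways of determining the master device (an explicit user choice during connection setup, or negotiation between the devices) could be sketched as follows. The tie-breaking rule used here (prefer the device configured with a keyboard, then lexicographic order) is purely an assumption for illustration; the patent does not specify the negotiation criteria.

```python
def negotiate_master(devices: dict, user_choice=None) -> str:
    # Honor an explicit user selection made while establishing the connection.
    if user_choice in devices:
        return user_choice
    # Otherwise negotiate: as an assumed rule, prefer a device configured with
    # a keyboard and mouse, falling back to lexicographic order for determinism.
    with_keyboard = [name for name, caps in devices.items() if caps.get("keyboard")]
    return sorted(with_keyboard or devices)[0]


devices = {
    "first_electronic_device_100": {"keyboard": True},
    "second_electronic_device_200": {"keyboard": False},
}
master = negotiate_master(devices)  # -> "first_electronic_device_100"
```

A user selection always overrides the negotiated result, matching the order in which the two mechanisms are described above.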
Optionally, in the scenario shown in fig. 2B (B) or fig. 2B (c) above, in order to avoid a conflict with the keyboard input of the first electronic device 100, the second electronic device 200 may no longer display the virtual keyboard, and/or the keyboard of the second electronic device 200 may no longer operate. This is not described again below.
By way of example, fig. 3 shows a schematic diagram of a structure of the first electronic device 100 or the second electronic device 200.
The first electronic device 100 or the second electronic device 200 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a sensor module 180, keys 190, a motor 191, a keyboard and mouse module 192, a camera 193, a display screen 194, a subscriber identity module (subscriber identification module, SIM) card interface 195, and the like.
It is to be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the first electronic device 100 or the second electronic device 200. In other embodiments of the present application, the first electronic device 100 or the second electronic device 200 may include more or less components than illustrated, or certain components may be combined, certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
For example, in the case where the first electronic device 100 is a PC, the first electronic device 100 may not include the mobile communication module 150 and the SIM card interface 195.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated accesses, reduces the waiting time of the processor 110, and thereby improves the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (serial data line, SDA) and a serial clock line (serial clock line, SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to a touch sensor, a charger, a flash, the camera 193, etc., respectively, through different I2C bus interfaces. For example, the processor 110 may be coupled to the touch sensor through an I2C interface, such that the processor 110 communicates with the touch sensor through the I2C bus interface to implement a touch function of the first electronic device 100 or the second electronic device 200.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as a display 194, a camera 193, and the like. The MIPI interfaces include camera serial interfaces (camera serial interface, CSI), display serial interfaces (display serial interface, DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the photographing function of first electronic device 100 or second electronic device 200. The processor 110 and the display 194 communicate via a DSI interface to implement the display functionality of the first electronic device 100 or the second electronic device 200.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the first electronic device 100 or the second electronic device 200, or may be used to transfer data between the first electronic device 100 or the second electronic device 200 and a peripheral device. It can also be used to connect a headset and play audio through the headset. The interface may further be used to connect other electronic devices, such as AR devices.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and is not limited to the structure of the first electronic device 100 or the second electronic device 200. In other embodiments of the present application, the first electronic device 100 or the second electronic device 200 may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the first electronic device 100 or the second electronic device 200. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the first electronic device 100 or the second electronic device 200 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the first electronic device 100 or the second electronic device 200 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied on the first electronic device 100 or the second electronic device 200. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing. After being processed by the baseband processor, the low-frequency baseband signal is transferred to the application processor. The application processor outputs a sound signal through an audio device or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be independent of the processor 110 and provided in the same device as the mobile communication module 150 or another functional module.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., applied on the first electronic device 100 or the second electronic device 200. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, the antenna 1 of the first electronic device 100 or the second electronic device 200 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160 such that the first electronic device 100 or the second electronic device 200 can communicate with a network and other devices through wireless communication technology. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The first electronic device 100 or the second electronic device 200 realizes a display function through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may use a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flex light-emitting diode, FLED), a Mini-LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the first electronic device 100 or the second electronic device 200 may include 1 or N display screens 194, N being a positive integer greater than 1.
The keyboard and mouse module 192 may include a mouse, a keyboard, a touchpad for implementing keyboard and mouse functionality, and the like. In some embodiments, after receiving the content input by the user through the keyboard and mouse module 192, the first electronic device 100 may display it on the display screen of the first electronic device 100, or on the display screen of the second electronic device 200 that has established a communication connection with the first electronic device 100. Optionally, the keyboard and mouse module 192 is an optional module. For example, if the second electronic device 200 is a Pad, the user's input operations may be received directly through the display screen 194 without configuring the keyboard and mouse module 192.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, the first electronic device 100 or the second electronic device 200 may include 1 or N cameras 193, N being a positive integer greater than 1.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the first electronic device 100 or the second electronic device 200. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data (such as audio data, phonebook, etc.) created during use of the first electronic device 100 or the second electronic device 200, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 110 performs various functional applications and data processing of the first electronic device 100 or the second electronic device 200 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110. The first electronic device 100 or the second electronic device 200 may play, record, etc. music through the audio module 170. The audio module 170 may include a speaker, a receiver, a microphone, a headphone interface, an application processor, etc. to implement audio functions.
The sensor module 180 may include a pressure sensor, a gyroscope sensor, a barometric sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like.
The pressure sensor is used for sensing a pressure signal and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor may be provided on the display screen 194. There are many kinds of pressure sensors, such as resistive pressure sensors, inductive pressure sensors, capacitive pressure sensors, etc. A capacitive pressure sensor may comprise at least two parallel plates made of conductive material. When a force is applied to the pressure sensor, the capacitance between the electrodes changes, and the first electronic device 100 or the second electronic device 200 determines the intensity of the pressure according to the change of the capacitance. When a touch operation is applied to the display screen, the first electronic device 100 or the second electronic device 200 detects the intensity of the touch operation according to the pressure sensor. The first electronic device 100 or the second electronic device 200 may also calculate the position of the touch from the detection signal of the pressure sensor. In some embodiments, touch operations that act on the same touch location but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction for viewing the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
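The threshold behavior in the short-message example can be expressed as a small dispatch function. The function name, the target identifier, and the concrete threshold value are illustrative assumptions; only the less-than/greater-than-or-equal split around the first pressure threshold comes from the text above.

```python
FIRST_PRESSURE_THRESHOLD = 0.5  # assumed normalized pressure value


def instruction_for_touch(target: str, pressure: float) -> str:
    # Same touch location, different touch intensity -> different instruction,
    # mirroring the short-message-icon example above.
    if target == "sms_app_icon":
        if pressure < FIRST_PRESSURE_THRESHOLD:
            return "view_short_message"
        return "create_new_short_message"
    return "default_tap"
```

A light press on the icon maps to viewing, while a press at or above the threshold maps to composing a new message.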
The touch sensor is also known as a "touch device". The touch sensor may be disposed on the display screen 194, and the touch sensor and the display screen 194 together form a touch screen. The touch sensor is used to detect a touch operation acting on or near it, and may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor may also be disposed on a surface of the first electronic device 100 or the second electronic device 200 at a location different from that of the display screen 194.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The first electronic device 100 or the second electronic device 200 may receive key inputs, generating key signal inputs related to user settings and function controls of the first electronic device 100 or the second electronic device 200.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also correspond to different vibration feedback effects by touching different areas of the display screen 194. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195 or withdrawn from the SIM card interface 195 to enable contact and separation with the first electronic device 100 or the second electronic device 200. The first electronic device 100 or the second electronic device 200 may support 1 or N SIM card interfaces, N being a positive integer greater than 1.
In some scenarios, the same or different operating systems may be installed in the first electronic device 100 and the second electronic device 200. The software structures of the electronic devices are described below, taking the operating system of the first electronic device 100 and the operating system of the second electronic device 200 as examples.
Illustratively, assume that a first operating system is installed in the first electronic device 100. Fig. 4A is a software structural block diagram of the first electronic device 100 according to an embodiment of the present application.
As shown in fig. 4A, a first application is installed in the first electronic device 100, where the first application may be used to manage a connection with another electronic device, for example, the first application is a computer manager application, and the first electronic device 100 can establish a multi-screen collaborative connection with the second electronic device 200 through the computer manager application.
As shown in fig. 4A, the first application includes a transmission management module, a keyboard and mouse management module, and an event interception module. The transmission management module is configured to manage transmission based on the multi-screen cooperative connection; for example, the first electronic device 100 sends, to the second electronic device 200 through the transmission management module, content input by a user through a keyboard of the first electronic device 100. The keyboard and mouse management module is used for managing the keyboard and mouse of the first electronic device 100 and for managing the input focus. If it determines that the input focus is on the display area of the first electronic device 100, the user's input content may be received and sent to the display area of the first electronic device 100 for display; if it determines that the input focus is on the display area of the second electronic device 200, the event interception module may be started, and turned off after it determines that the input focus has switched back to the display area of the first electronic device 100. The event interception module is used for intercepting an input event of the keyboard in the operating system. If the keyboard and mouse management module determines that the input focus is in the display area of the second electronic device 200 and starts the event interception module, the event interception module intercepts the input event of the keyboard in the operating system, and the input content corresponding to the input event is sent to the second electronic device 200 for display through the transmission management module, so that user input is received through the keyboard of the first electronic device 100 and displayed in the display area of the second electronic device 200.
Alternatively, the function of the event interception module may be implemented by a HOOK (HOOK) function. If the keyboard and mouse management module determines to start the event interception module, it may instruct the keyboard HOOK to be mounted, after which the event interception module can intercept input events of the keyboard; if the keyboard and mouse management module determines to shut down the event interception module, it may instruct the keyboard HOOK to be unloaded, after which the event interception module no longer intercepts input events of the keyboard.
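The mount/unmount behavior of the event interception module can be modeled as follows. This is a minimal sketch, not the embodiment's actual implementation: the class and method names, and the callback representing the transmission management module, are assumptions for illustration. (On a Windows implementation, "mounting" a keyboard HOOK would typically correspond to installing a low-level keyboard hook with `SetWindowsHookEx` and removing it with `UnhookWindowsHookEx`.)

```python
class EventInterceptionModule:
    """Illustrative model of the event interception module: while the keyboard
    HOOK is mounted, keyboard input events are intercepted and handed to the
    transmission path instead of the local display. All names are assumed."""

    def __init__(self, send_to_peer):
        self.hook_mounted = False
        self.send_to_peer = send_to_peer  # e.g. the transmission management module

    def mount_hook(self):
        self.hook_mounted = True          # start intercepting keyboard events

    def unmount_hook(self):
        self.hook_mounted = False         # stop intercepting keyboard events

    def on_keyboard_event(self, event):
        if self.hook_mounted:
            self.send_to_peer(event)      # forward to the second electronic device
            return True                   # intercepted: not delivered locally
        return False                      # passes through to the local system

sent = []
module = EventInterceptionModule(sent.append)
module.mount_hook()
module.on_keyboard_event("a")  # intercepted and forwarded
module.unmount_hook()
module.on_keyboard_event("b")  # delivered locally, not forwarded
print(sent)  # ['a']
```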
It should be noted that the module division manner in the first application shown in fig. 4A is only an exemplary illustration. Other module division manners may exist for implementing the functions of the transmission management module, the keyboard and mouse management module, and the event interception module, and the first application may also contain more or fewer modules; the module division manner and the number of modules are not limited in the embodiments of the present application.
Also illustratively, assume that the Android system is installed in the second electronic device 200. Fig. 4B is a software architecture block diagram of the second electronic device 200 according to an embodiment of the present application.
The software system of the second electronic device 200 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In this embodiment, taking an Android system with a layered architecture as an example, a software structure of the second electronic device 200 is illustrated.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, an application layer, an application framework layer, the Android runtime (Android runtime) and system libraries, and a kernel layer.
The application layer may include a series of application packages.
As shown in fig. 4B, the application package may include a first application, a short message, a camera, a calendar, music, a gallery, a map, a call, a video, and the like.
Optionally, the first application may be used to manage connections with other electronic devices, and the second electronic device 200 may be capable of establishing a multi-screen collaborative connection with the first electronic device 100 through the first application. For example, the second electronic device 200 is a cell phone and the first application is a cell phone manager. The module dividing manner and the functions of each module in the first application of the second electronic device 200 may refer to the module dividing manner and the functions of each module in the first application shown in fig. 4A, which are not described herein.
It should be noted that the first application in the second electronic device 200 and the first application in the first electronic device 100 may be the same application or different applications. For example, the first electronic device 100 and the second electronic device 200 are the same type of electronic device, and the first electronic device 100 and the second electronic device 200 may install the same first application for establishing the multi-screen cooperative connection. For another example, the first electronic device 100 and the second electronic device 200 are different types of electronic devices and different operating systems are installed, and then the first application in the first electronic device 100 and the first application in the second electronic device 200 may be different applications.
In some embodiments, after the first electronic device 100 and the second electronic device 200 establish a multi-screen collaborative connection, if it is determined that the keyboard and mouse of one of the electronic devices are in use, or that the keyboard and mouse of the other electronic device are unavailable, or that the other electronic device is not configured with a keyboard and mouse, the electronic device whose keyboard and mouse are in use may be determined to be the master device. The keyboard and mouse management module in the master device is started, and the event interception module in the master device is started or closed according to the user's needs, while the event interception module in the other electronic device is in a closed or dormant state. For example, after the PC and the PAD establish a multi-screen cooperative connection, the master device is determined to be the PC, and the user can operate both the PC and the PAD through the keyboard and mouse of the PC. The keyboard and mouse management module in the PC starts or closes the event interception module in the PC according to the user's needs, and correspondingly mounts or unloads the keyboard HOOK in the PC.
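The master-device selection rule above can be sketched as follows. The capability flags and function name are illustrative assumptions; the sketch only encodes the stated rule (the device whose keyboard and mouse are in use becomes the master, falling back to the only device with a usable keyboard and mouse).

```python
def select_master(devices):
    """Pick the master device: prefer the device whose keyboard and mouse are
    in use; otherwise fall back to the only device that has a usable keyboard
    and mouse. `devices` maps a device name to assumed capability flags."""
    for name, caps in devices.items():
        if caps.get("kbd_mouse_in_use"):
            return name
    usable = [n for n, c in devices.items() if c.get("kbd_mouse_available")]
    return usable[0] if len(usable) == 1 else None

devices = {
    "PC":  {"kbd_mouse_available": True,  "kbd_mouse_in_use": True},
    "PAD": {"kbd_mouse_available": False, "kbd_mouse_in_use": False},
}
print(select_master(devices))  # PC
```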
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 4B, the application framework layer may include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the second electronic device 200. Such as the management of call status (including on, hung-up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar. It can be used to convey notification-type messages that automatically disappear after a short stay without user interaction; for example, the notification manager is used to notify that a download is complete, to provide message alerts, etc. The notification manager may also present notifications in the form of a chart or scroll-bar text in the system top status bar, such as a notification of an application running in the background, or a notification that appears on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, an indicator light blinks, etc.
The Android runtime includes a core library and a virtual machine. The Android runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the function library that the java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), two-dimensional graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is the layer between hardware and software. The kernel layer contains at least a display driver, a camera driver, an audio driver, and a sensor driver.
In the following embodiments, the first electronic device 100 and the second electronic device 200 are used to establish a multi-screen cooperative connection, and the first electronic device 100 is used as a main device, and an input control method provided in the embodiments of the present application is described as an example. It should be noted that, the input control method using the second electronic device 200 as the main device may refer to the input control method using the first electronic device 100 as the main device, which is not described herein.
In some scenarios, the first electronic device 100 and the second electronic device 200 establish a multi-screen collaborative connection that shares a collaborative mode as shown in fig. 2B (a). That is, after the first electronic device 100 and the second electronic device 200 establish a connection, the display screen of the first electronic device 100 and the display screen of the second electronic device 200 may be displayed separately. Also, the keyboard and mouse of the first electronic device 100 may control the second electronic device 200.
In some embodiments, after the first electronic device 100 detects that the mouse pointer moves out of the display screen where the input focus is located, the display screen where the input focus is located is not switched, ensuring that the user can continue to input on the original display screen. The input focus is moved to the corresponding display screen only after a click operation by the user is detected on another display screen. Optionally, the click operation may be a single-click or double-click operation of a mouse, a touch operation by the user on a display screen, or a click by a stylus.
For example, as shown in fig. 5 (a), the first electronic device 100 is configured with a keyboard and a mouse, and a user can use the functions of the mouse of the first electronic device 100 through the external mouse 51 and/or the touch pad 52. It is assumed that the user inputs display contents on the display screen of the first electronic device 100 through the keyboard, and the mouse pointer 53 is displayed on the display screen of the first electronic device 100. As shown in fig. 5 (b), the first electronic device 100 detects that the user moves the mouse pointer 53 out of the edge of the display screen toward the direction where the second electronic device 200 is located, and can hide the mouse pointer displayed on the display screen of the first electronic device 100 and instruct the second electronic device 200 to display the mouse pointer 53. Alternatively, the first electronic device 100 may detect the movement position of the mouse pointer in real time, and determine whether the mouse pointer moves out of the display area of the display screen. Alternatively, the first electronic device 100 determines whether the mouse pointer is moved out of the display area through a display area range of the display screen configured in advance. The first electronic device 100 can determine that the mouse needs to be moved to the display screen of the second electronic device 200 for display according to the connection relationship and the positional relationship with the second electronic device 200. It should be noted that, the implementation of the movement position of the mouse pointer and the switching of the display screen may refer to the prior art, and the embodiments of the present application are not specifically limited.
Thereafter, as shown in fig. 5 (b), in the case where the second electronic device 200 displays the mouse pointer 53, the first electronic device 100 continues to display the input content on its own display screen after receiving the user's input, as shown by reference numeral 54. That is, the input focus of the electronic device no longer follows the mouse pointer when the pointer switches display screens, thereby avoiding input anomalies caused by a user's misoperation.
As shown in fig. 5 (c), the first electronic device 100 detects a mouse click operation of the user on the display screen of the second electronic device 200, determines that the user needs to input display content on the display screen of the second electronic device 200, and therefore needs to switch the input focus to the second electronic device, as shown by reference numeral 55. Thus, the event interception module shown in fig. 4A is started, and the input focus is switched to the second electronic device 200. For example, after receiving a mouse-down event, the keyboard and mouse management module in the first electronic device 100 determines that the operation position is on the second electronic device 200, and mounts the keyboard HOOK through the event interception module, so as to intercept subsequent keyboard input events and send them to the second electronic device 200. For another example, the second electronic device 200 detects a touch operation (such as a touch by a finger, a stylus, or the like) of the user on its display screen and sends network signaling to the first electronic device 100 instructing it to start the event interception module and mount the keyboard HOOK; thereafter, the first electronic device 100 intercepts keyboard input events using the keyboard HOOK and sends them to the second electronic device 200.
As shown in fig. 5 (d), after the first electronic device 100 mounts the keyboard HOOK, the first electronic device 100 detects the input of the user and forwards the input data to the second electronic device 200, and the input content of the user is displayed on the display screen of the second electronic device 200 as shown by reference numeral 56.
In this way, the first electronic device 100 can mount the keyboard HOOK after determining that the user intends to switch the input focus, intercept subsequent keyboard input events, and send them to the second electronic device 200, thereby switching the input focus, ensuring that the input display meets the user's requirements, and improving the user experience.
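The focus rule of figs. 5 (a)-(d) can be summarized in a small state machine: moving the mouse pointer to another screen does not move the input focus; only a click (or touch) on a screen moves the focus there and mounts or unmounts the keyboard HOOK accordingly. The class, screen names, and hook flag below are illustrative assumptions.

```python
class InputFocusManager:
    """Sketch of the rule: the pointer follows mouse movement, but the input
    focus follows clicks. Mounting the HOOK routes keyboard input to the
    second electronic device; otherwise it stays on the first."""

    def __init__(self):
        self.pointer_screen = "first"
        self.focus_screen = "first"
        self.hook_mounted = False   # True => keyboard events are forwarded

    def move_pointer(self, screen):
        self.pointer_screen = screen              # pointer moves freely ...

    def click(self, screen):
        self.focus_screen = screen                # ... but focus follows clicks
        self.hook_mounted = (screen == "second")  # mount/unmount keyboard HOOK

    def route_keyboard_input(self):
        return "second" if self.hook_mounted else "first"

m = InputFocusManager()
m.move_pointer("second")         # pointer leaves the first screen
print(m.route_keyboard_input())  # first  (focus unchanged)
m.click("second")                # click on the second screen
print(m.route_keyboard_input())  # second (HOOK mounted, input forwarded)
```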
Also exemplary, corresponding to the scenario shown in fig. 5, as shown in fig. 6 (a), after the first electronic device 100 mounts the keyboard HOOK, the keyboard HOOK intercepts keyboard input events and forwards them to the second electronic device 200. Content entered by the user via the keyboard of the first electronic device 100 is then displayed on the second electronic device 200, as indicated by reference numeral 61. During the user's input, assume that the mouse pointer 53 is moved from the display screen of the second electronic device 200, as shown in fig. 6 (a), to the display screen of the first electronic device 100, as shown in fig. 6 (b). After the movement, the second electronic device 200 hides the mouse pointer and the first electronic device 100 redisplays it, but the input focus does not switch to the first electronic device 100. As indicated by reference numeral 62, content entered by the user through the keyboard of the first electronic device 100 will still be intercepted by the keyboard HOOK and forwarded by the transmission management module shown in fig. 4A to the second electronic device 200 for display.
Thereafter, as shown in fig. 6 (c), the first electronic device 100 detects an operation (such as a mouse click or a touch operation) of the user on its display screen, and determines that the user needs to input display content on the display screen of the first electronic device 100. It therefore turns off the event interception module shown in fig. 4A and switches the input focus to the first electronic device 100. For example, after receiving a mouse-down event, the keyboard and mouse management module in the first electronic device 100 determines that the operation position is on the first electronic device 100, and unloads the keyboard HOOK through the event interception module, so that subsequent keyboard input events are no longer intercepted by the keyboard HOOK.
As shown in fig. 6 (d), after the first electronic device 100 unloads the keyboard HOOK, the first electronic device 100 detects the input of the user, and as shown by reference numeral 64, the first electronic device 100 directly displays the content input by the user on its own display screen. And, even if it is detected that the user moves the mouse pointer 53 to the second electronic device 200 to display during the user input, the user input content remains displayed on the display screen of the first electronic device 100.
In this way, the electronic apparatus can switch the input focus position according to the user's intention. During input and display, the input focus position is not switched merely because the mouse pointer moves, improving the user experience.
In some embodiments, after the first electronic device 100 switches the input focus position, the display screen on which the scroll wheel acts may still switch along with the movement of the display position of the mouse pointer. Thus, while one display screen displays the user's input content, the display content of another display screen can be scrolled in response to a user operation.
For example, as shown in fig. 7 (a), assume that the first electronic device 100 has started the event interception module shown in fig. 4A and mounted the keyboard HOOK, so that the first electronic device 100 forwards the user input intercepted by the keyboard HOOK to the display screen of the second electronic device 200 for display. While receiving user input, the first electronic device 100 detects the user's scroll operation on the wheel of the mouse 51 and determines that the mouse pointer 53 is currently displayed on the display screen of the first electronic device 100. Accordingly, the first electronic device 100 may determine that the user needs to scroll the display content on the display screen of the first electronic device 100, and may control the display content of that screen to scroll in the direction indicated by the arrow 71. During the scrolling display, as shown in fig. 7 (b), after the first electronic device 100 receives the user's input through the keyboard, the keyboard HOOK intercepts the input event and forwards it to the second electronic device 200, and the second electronic device 200 then displays the content input by the user, as shown by reference numeral 72.
Accordingly, assume that the first electronic device 100 has unloaded the keyboard HOOK and no longer intercepts input events. When it detects that the user scrolls the mouse wheel while the mouse pointer is displayed on the display screen of the second electronic device 200, the display content of the second electronic device 200 may be scrolled while the first electronic device 100 displays the user's input.
Thus, after the display position of the mouse pointer is decoupled from the position of the input focus, the input focus is uniformly managed by the first electronic device 100, meeting the requirement that the user inputs content on one display screen while scrolling and browsing on another. During use, the user does not perceive any input boundary between the multiple devices; it is as if a single system were connected to multiple display screens, with the other display screens serving as extended screens.
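The routing policy in figs. 7 (a)-(b) can be sketched as: wheel-scroll events act on the screen under the mouse pointer, while keyboard events follow the input focus. The function and event names are illustrative assumptions.

```python
def route_event(event_type, pointer_screen, focus_screen):
    """Scroll events follow the pointer; keyboard events follow the focus."""
    if event_type == "scroll":
        return pointer_screen
    if event_type == "keyboard":
        return focus_screen
    raise ValueError(f"unknown event type: {event_type}")

# Keyboard HOOK mounted: focus on the second screen, pointer on the first.
print(route_event("scroll", "first", "second"))    # first  - scroll local screen
print(route_event("keyboard", "first", "second"))  # second - typed text forwarded
```

This expresses why the user can browse one screen and type on another at the same time: the two input channels are routed independently.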
In other scenarios, the first electronic device 100 and the second electronic device 200 establish a multi-screen collaborative connection in a windowed collaborative mode as shown in fig. 2B (B). That is, after the first electronic device 100 and the second electronic device 200 establish a connection, a display area for displaying the content transmitted by the first electronic device 100 is set on the display screen of the second electronic device 200. Also, the keyboard and mouse of the first electronic device 100 may control the second electronic device 200.
In some embodiments, after the first electronic device 100 detects that the mouse pointer moves out of the display screen or display area where it is located, the position of the input focus is not switched, ensuring that the user can continue to input on the original display screen or display area. After detecting an operation such as a mouse click or touch operation by the user within the screen projection area, the position of the input focus is likewise not switched; the input focus is moved to the second electronic device 200 only after an operation such as a mouse click or touch by the user is detected at a position outside the screen projection area of the second electronic device 200.
For example, as shown in fig. 8 (a), the first electronic device 100 is configured with a keyboard and a mouse, and likewise the user can use the mouse function of the first electronic device 100 through an external mouse and/or a touch pad. The second electronic device 200 includes a screen projection area 81 for displaying the screen projection content sent by the first electronic device 100, and the display area outside the screen projection area 81 is used for displaying the display content of the second electronic device 200. Alternatively, the content displayed on the display screen of the first electronic device 100 and the display content of the screen projection area 81 may be the same or different, which is not particularly limited in the embodiments of the present application. For the related content of screen projection and the specific screen projection method, reference may be made to the prior art, which is not repeated in the embodiments of the present application.
Alternatively, as shown in fig. 8 (a), the movement of the mouse pointer 82 does not affect the display of the content being input. For example, when the first electronic device 100 has unloaded the keyboard HOOK and the content input by the user is being displayed on the display screen of the first electronic device 100 and/or in the screen projection area 81, moving the mouse pointer 82 to the display screen of the second electronic device 200 and into the display area outside the screen projection area 81 does not affect the input being displayed. For another example, when the first electronic device 100 has mounted the keyboard HOOK and transmits the content input by the user to the display area outside the screen projection area 81 of the second electronic device 200 for display, moving the mouse pointer 82 into the screen projection area 81 or onto the first electronic device 100 does not affect the user input being displayed in that display area.
Thereafter, as shown in fig. 8 (b), while the first electronic device 100 displays the user's input content and/or the input content is displayed in the screen projection area 81, as shown by reference numeral 83, after the second electronic device 200 detects a click operation (for example, a touch operation) of the user in the display area outside the screen projection area 81, it is determined that the user needs to input content in the display area of the second electronic device 200, and network signaling is therefore transmitted to the first electronic device 100. After receiving the network signaling, the first electronic device 100 starts the event interception module shown in fig. 4A and mounts the keyboard HOOK through the event interception module. Thereafter, as shown in fig. 8 (c), the first electronic device 100 detects the user's input, the keyboard HOOK intercepts it and forwards it to the second electronic device 200, and after receiving and processing the user's input, the second electronic device 200 displays it in the display area outside the screen projection area 81 that displays the content of the second electronic device 200, as shown by reference numeral 84.
Alternatively, the click operation of the user in the display area outside the screen projection area 81 may include, for example, one or more of a touch operation by the user, a click operation by a stylus, a click operation of a mouse, and the like. If the click operation is an operation of the user on the mouse, the first electronic device 100 can directly detect the position of the user's click operation and determine that the user needs to input in the display area of the second electronic device 200; the second electronic device 200 then need not send network signaling to the first electronic device 100, saving signaling consumption and improving efficiency. Alternatively, the second electronic device 200 still transmits network signaling to the first electronic device 100, guaranteeing the accuracy of the determination of the user's intent.
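The click-routing rule of figs. 8-9 can be sketched as follows: a click outside the screen projection area moves the input focus to the second device (mount the keyboard HOOK), and a click inside the projection area moves it back to the first device (unmount the HOOK). The function and region names are illustrative assumptions.

```python
def handle_click(region, hook_mounted):
    """Return the new keyboard-HOOK state after a click in the given region."""
    if region == "outside_projection":
        return True    # mount HOOK: keyboard input forwarded to the second device
    if region == "inside_projection":
        return False   # unmount HOOK: keyboard input handled by the first device
    return hook_mounted  # any other region: state unchanged

hook = False
hook = handle_click("outside_projection", hook)
print(hook)  # True  - typing now appears in the second device's own display area
hook = handle_click("inside_projection", hook)
print(hook)  # False - typing now appears in the screen projection content
```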
Therefore, the electronic equipment can switch the input focus after determining that the user has the intention of switching the input focus, so that the input display is ensured to meet the user requirement, and the user experience is improved.
Also exemplary, corresponding to the scenario shown in fig. 8, as shown in fig. 9 (a), the first electronic device 100 mounts a keyboard HOOK, and as shown by reference numeral 91, the content input by the user through the keyboard of the first electronic device 100 is displayed on a display area other than the screen-throwing area 81 of the display screen of the second electronic device 200. And movement of the mouse pointer 82 does not change the display area in which the user input content is displayed during the user input.
Thereafter, as shown in fig. 9 (b), the second electronic device 200 detects a click operation (for example, a touch operation) of the user within the screen-casting area 81, determines that the user needs to input display content within the screen-casting area, and therefore sends network signaling to the first electronic device 100. After the first electronic device 100 receives the network signaling, it turns off the event interception module shown in fig. 4A and unmounts the keyboard HOOK, so that subsequent keyboard input events are no longer intercepted by the keyboard HOOK. Then, as shown in fig. 9 (c), after unmounting the keyboard HOOK, the first electronic device 100 detects the input of the user, processes the input to determine the screen-cast content, and sends the screen-cast content to the second electronic device 200. As shown by reference numeral 92, the second electronic device 200, upon receiving the screen-cast content, displays it in the screen-casting area 81.
Optionally, in a scenario in which the display content of the display screen of the first electronic device 100 is synchronized with that of the screen-casting area 81 of the second electronic device 200, corresponding to the scenario shown in fig. 9, while the keyboard HOOK is unmounted (that is, while the second electronic device 200 displays the user input in a display area other than the screen-casting area 81), if the first electronic device 100 detects a click operation of the user on its own display screen, the first electronic device 100 will also mount the keyboard HOOK. This switches the input focus to the first electronic device 100, and the user's input content is displayed on the display screen of the first electronic device 100 and in the screen-casting area 81 of the second electronic device 200.
In some embodiments, after the first electronic device 100 switches the input focus position, the user can still switch the active display area by moving the mouse pointer when operating the mouse wheel. In this way, while the user input content is displayed in one display area, the display content of another display area can be scrolled in response to a user operation.
For example, in the scenarios shown in fig. 8 and fig. 9, the user can also scroll through the display content in one display area while inputting content in another display area. For example, as shown in fig. 10 (a), it is assumed that the first electronic device 100 has mounted the keyboard HOOK. As shown by reference numeral 1001, while the screen-casting area 81 of the second electronic device 200 displays the input content of the user, the first electronic device 100 detects an operation of the user moving the mouse pointer 82 into a display area of the second electronic device 200 other than the screen-casting area 81, and detects an operation of the user on the wheel of the mouse 1002. The first electronic device 100 sends a scroll instruction to the second electronic device 200, and the second electronic device 200 can, according to the scroll instruction, control the display content of the display area other than the screen-casting area 81 to scroll in the direction indicated by the arrow 1003. Thus, the second electronic device 200 can scroll the display content of the display area other than the screen-casting area 81 while displaying the user input content in the screen-casting area 81.
Correspondingly, as shown in fig. 10 (b), it is assumed that the first electronic device 100 has unmounted the keyboard HOOK. As shown by reference numeral 1004, while the display area of the second electronic device 200 other than the screen-casting area 81 displays the input content of the user, the first electronic device 100 detects an operation of the user moving the mouse pointer 82 into the screen-casting area 81 of the second electronic device 200, and detects an operation of the user on the wheel of the mouse 1002. The first electronic device 100 sends the scrolled screen-cast content to the second electronic device 200 for display, and the second electronic device 200 can control the display content of the screen-casting area 81 to scroll in the direction indicated by the arrow 1005. Thus, the second electronic device 200 can scroll the display content of the screen-casting area 81 while displaying the user input content in the display area other than the screen-casting area 81.
In this way, after decoupling the display position of the mouse pointer from the position of the input focus, the input focus is uniformly managed by the first electronic device 100, so as to meet the requirement that the user inputs content in one display area while scrolling and browsing in another display area.
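The decoupling just described can be expressed as a simple routing rule: wheel events follow the display area currently under the mouse pointer, while key events follow the separately managed input focus. The sketch below is illustrative only; the names are assumptions, not taken from the embodiment.

```python
def route_event(event_type, pointer_area, focus_area):
    """Return the display area that should handle the event.

    pointer_area: area currently under the mouse pointer
    focus_area:   area currently holding the input focus
    """
    if event_type == "wheel":
        return pointer_area   # scrolling follows the pointer position
    if event_type == "key":
        return focus_area     # typing follows the input focus
    raise ValueError(f"unknown event type: {event_type!r}")

# While typing into the screen-casting area 81, a wheel event over the other
# display area scrolls that area without moving the input focus.
scroll_target = route_event("wheel", "other_area", "cast_area_81")
typing_target = route_event("key", "other_area", "cast_area_81")
```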
In yet other scenarios, the first electronic device 100 and the second electronic device 200 establish a multi-screen collaborative connection in the full-screen extended-screen collaboration mode shown in fig. 2B (c). That is, after the first electronic device 100 and the second electronic device 200 establish the connection, a display area for displaying the second electronic device 200's own content is set on the display screen of the second electronic device 200. In addition, the keyboard and mouse of the first electronic device 100 can control the second electronic device 200.
In some embodiments, after the first electronic device 100 detects that the mouse pointer moves out of its display screen or out of the display area where the pointer is located, the position of the input focus is not switched, ensuring that the user can continue inputting in the original display area. After detecting an operation such as a mouse click or a touch performed by the user in a display area other than the display area for displaying the second electronic device 200's own content, the position of the input focus is likewise not switched; the input focus is moved to the second electronic device 200 only after an operation such as a mouse click or a touch is detected in the display area of the second electronic device 200's own content.
For example, as shown in fig. 11 (a), the first electronic device 100 is configured with a keyboard and a mouse, and likewise the user can use the mouse function of the first electronic device 100 through an external mouse and/or a touch pad. The second electronic device 200 includes a display area 1101 for displaying the display content of the second electronic device 200 itself, and the display area other than the display area 1101 is used for displaying the screen-cast content sent by the first electronic device 100. Optionally, the content displayed on the display screen of the first electronic device 100 and the display content of the display area other than the display area 1101 may be the same or different, which is not specifically limited in the embodiments of the present application. For the specific screen-casting method and its related content, reference may be made to the prior art, which is not repeated in the embodiments of the present application.
Alternatively, as shown in fig. 11 (a), the movement of the mouse pointer 1102 does not affect the display of the content being input. For example, when the first electronic device 100 has unmounted the keyboard HOOK and the content input by the user is displayed on the display screen of the first electronic device 100 and/or in the display area of the second electronic device 200 other than the display area 1101, moving the mouse pointer 1102 into the display area 1101 does not affect the input being displayed in the display area other than the display area 1101. For another example, when the first electronic device 100 has mounted the keyboard HOOK and sends the intercepted user input content to the display area 1101 of the second electronic device 200 for display, moving the mouse pointer 1102 into the display area other than the display area 1101 or onto the display screen of the first electronic device 100 does not affect the user input being displayed in the display area 1101 of the second electronic device 200.
Thereafter, as shown in fig. 11 (a), while the user input content is displayed on the first electronic device 100 and/or in the display area of the second electronic device 200 other than the display area 1101 as shown by reference numeral 1103, after the second electronic device 200 detects a click operation (for example, a touch operation) of the user in the display area 1101 as shown by reference numeral 1104, it determines that the user needs to input in the display area 1101, and therefore sends network signaling to the first electronic device 100. After receiving the network signaling, the first electronic device 100 starts the event interception module shown in fig. 4A and mounts the keyboard HOOK through the event interception module. Then, as shown in fig. 11 (b), the first electronic device 100 detects the input of the user, intercepts the input data through the keyboard HOOK and forwards it to the second electronic device 200, and the second electronic device 200 displays the user's input content in the display area 1101 according to the received input data, as shown by reference numeral 1105.
Thereafter, as shown in fig. 11 (b), as indicated by reference numeral 1106, the second electronic device 200 detects a click operation (for example, a touch operation) by the user in a display area other than the display area 1101, determines that the user needs to input display content in a display area other than the display area 1101, and therefore sends network signaling to the first electronic device 100. After the first electronic device 100 receives the network signaling, it turns off the event interception module shown in fig. 4A and unmounts the keyboard HOOK, so that subsequent keyboard input events are no longer intercepted by the keyboard HOOK. Then, as shown in fig. 11 (c), after unmounting the keyboard HOOK, the first electronic device 100 detects the input of the user, processes the input data to determine the screen-cast content, and sends the screen-cast content to the second electronic device 200 for display. As indicated by reference numeral 1107, the user input content received by the first electronic device 100 is displayed, by way of screen casting, in the display area of the second electronic device 200 other than the display area 1101.
Alternatively, for other content of the scenario shown in fig. 11, reference may be made to the related content shown in fig. 8 or fig. 9 above. In addition, in the scenario shown in fig. 11, the user can also scroll through one display area while inputting content in another display area; for details, reference may be made to the related content shown in fig. 10. This is not described in detail herein.
In this way, the input focus can be managed in a unified manner between electronic devices configured with the same or different systems, with input managed by the master device. The master device switches the input focus only after determining that the user intends to switch it, ensuring that the input display meets the user's needs and improving the user experience.
Fig. 12 is a schematic flow chart of an input control method according to an embodiment of the present application. As shown in fig. 12, the method includes the following steps.
S1201, the first electronic device determines a first operation acting on the first display area.
In some embodiments, the first electronic device and the second electronic device establish a multi-screen collaborative connection, wherein the first electronic device acts as a master device. The multi-screen collaboration mode of the first electronic device and the second electronic device application includes: a shared collaboration mode, a windowed collaboration mode, or a full-screen extended collaboration mode.
In some embodiments, the first operation is to instruct switching of the input focus position to a first display area of the second electronic device. Alternatively, the first operation is a mouse click operation acting on the first display area, or the first operation is a touch operation acting on the first display area.
For example, in the sharing collaboration mode, the display screen of the first electronic device and the display screen of the second electronic device are displayed separately, the first display area is a display area displayed by the display screen of the second electronic device, and the second display area is an area displayed by the first electronic device. The first electronic device is configured with a mouse and a keyboard, and the first electronic device is capable of detecting a mouse click operation acting on a display screen of the second electronic device. Alternatively, the specific implementation manner of the first operation may refer to the related content shown in fig. 5, which is not described herein.
For another example, in the windowed collaboration mode, the second display area is a screen projection area of the first electronic device on the second electronic device display screen, and the first display area is a display area other than the screen projection area on the second electronic device display screen. Then, the second electronic device transmits first information to the first electronic device after detecting a touch operation acting on the first display area, and the first electronic device determines the first operation according to the first information. Alternatively, the specific implementation manner of the first operation may refer to the related content shown in fig. 8, which is not described herein.
S1202, the first electronic device intercepts a first input.
In some embodiments, the first electronic device determines to mount the keyboard HOOK after detecting a mouse click operation on the first display area, or after determining a touch operation acting on the first display area. The first electronic device then intercepts the first input of the keyboard through the keyboard HOOK. That is, the keyboard HOOK is not mounted merely because the mouse pointer moves to another display area; the first electronic device mounts the keyboard HOOK to intercept keyboard input only after determining the first operation.
In the embodiments of the present application, the input module of the first electronic device is described by taking a keyboard as an example; the keyboard may be built-in or external. The method by which the first electronic device intercepts keyboard input is described by taking the keyboard HOOK method as an example. However, regardless of the names of the input module and the module for intercepting input, and regardless of whether they are two modules or one, as long as they have similar functions of receiving input and intercepting input, they conform to the technical idea of the method provided in the embodiments of the present application and fall within the protection scope of the present application.
S1203, the first electronic device sends a first input to the second electronic device.
The first electronic device, after intercepting the first input of the keyboard, sends the first input to the second electronic device. Correspondingly, the second electronic device receives the first input sent by the first electronic device.
S1204, the second electronic device displays the first input in the first display area.
In some embodiments, the second electronic device displays the first input in the first display area after receiving the first input sent by the first electronic device.
Therefore, the first electronic device can mount the keyboard HOOK after determining that the user has the intention of switching the input focus, intercept the subsequent keyboard input event and send the subsequent keyboard input event to the second electronic device, so that the switching of the input focus is realized, the input display is ensured to meet the user requirement, and the user experience is improved.
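Steps S1201–S1204 can be condensed into the following illustrative sketch. Here the "network" between the two devices is reduced to a dictionary of display buffers, and all function and area names are assumptions for illustration, not part of the embodiment.

```python
def input_control_flow(operation_area, keystrokes):
    """Model of S1201-S1204 for one focus decision plus a burst of typing.

    operation_area: where the first operation (click/touch) landed;
    "first_display_area" belongs to the second electronic device,
    "second_display_area" corresponds to the first electronic device.
    """
    display = {"first_display_area": [], "second_display_area": []}

    # S1201: the first electronic device determines the first operation.
    # A click/touch on the first display area mounts the keyboard HOOK.
    hook_mounted = (operation_area == "first_display_area")

    for key in keystrokes:
        if hook_mounted:
            # S1202: the keyboard HOOK intercepts the first input.
            # S1203: the first device sends the input to the second device.
            # S1204: the second device displays it in the first display area.
            display["first_display_area"].append(key)
        else:
            # Without the HOOK, input stays in the area corresponding
            # to the first electronic device.
            display["second_display_area"].append(key)
    return display

result = input_control_flow("first_display_area", list("ok"))
```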
In some embodiments, the first electronic device determines to scroll through the display content displayed in the second display area after detecting the second operation on the second display area. Optionally, the second operation is a mouse scroll wheel operation when the second display area displays the mouse pointer.
In an exemplary embodiment, in a case where the second display area is the display area of the first electronic device, the display content of the second display area is scroll-displayed. For example, the multi-screen collaboration mode of the first electronic device and the second electronic device application is a shared collaboration mode. The first electronic device detects an operation of scrolling the mouse wheel acting on the second display area of the first electronic device in the process of intercepting and transmitting the first input to the second electronic device, and scrolls and displays the display content of the second display area. Alternatively, the specific implementation manner of scrolling the display content of the second display area according to the second operation may refer to the related content shown in fig. 7, which is not described herein.
As another example, in a case where the second display area is a display area, displayed by the second electronic device, for displaying the screen-cast content of the first electronic device, the scrolled display content of the second display area is sent to the second electronic device. For example, the multi-screen collaboration mode applied by the first electronic device and the second electronic device is the windowed collaboration mode or the full-screen extended-screen collaboration mode. After detecting the second operation acting on the second display area, the first electronic device sends first screen-cast content to the second electronic device, where the first screen-cast content is the scrolled screen-cast content to be displayed. Alternatively, for the specific implementation of scrolling the display content of the second display area according to the second operation, reference may be made to the related content shown in fig. 10, which is not described herein.
Therefore, the display position of the mouse pointer is decoupled from the position of the input focus, the input focus is uniformly managed by the first electronic equipment, and the requirement that a user inputs content in one display area while scrolling and browsing in the other display area is met.
In some embodiments, the first electronic device determines a third operation on the second display region for instructing to switch the input focus position to the corresponding second display region of the first electronic device. The first electronic device uninstalls the keyboard HOOK after determining the third operation, does not intercept the first input any more, and determines to display the second input in the second display area according to the second input after subsequently receiving the second input. Alternatively, the third operation is a mouse click operation acting on the second display area, or the third operation is a touch operation acting on the second display area.
In this way, the first electronic device determines whether to mount or unmount the keyboard HOOK only after determining a mouse click operation or a touch operation acting on a display area, according to whether that display area is the display area corresponding to the first electronic device. This ensures that the input display meets the user's needs and improves the user experience.
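The mount/unmount rule in this embodiment — the HOOK state changes only on a click or touch, never on mere pointer movement — can be sketched as a small state-transition function. The names below are hypothetical and used only for illustration.

```python
def next_hook_state(hook_mounted, event_type, clicked_area_owner):
    """Return the new keyboard-HOOK state after an input event.

    clicked_area_owner: "first" or "second", i.e. which electronic device
    the clicked/touched display area corresponds to (ignored otherwise).
    """
    if event_type in ("mouse_click", "touch"):
        # First operation (area of the second device) -> mount (True);
        # third operation (area of the first device) -> unmount (False).
        return clicked_area_owner == "second"
    # Pointer movement and all other events leave the HOOK state unchanged.
    return hook_mounted
```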
The input control method provided in the embodiment of the present application is described in detail above with reference to fig. 5 to 12. The electronic device provided in the embodiment of the present application is described in detail below with reference to fig. 13 and 14.
In one possible design, fig. 13 is a schematic structural diagram of a first electronic device according to an embodiment of the present application. As shown in fig. 13, the first electronic device 1300 may include: a processing unit 1301 and a transceiving unit 1302. The first electronic device 1300 may be used to implement the functions of the first electronic device 100 referred to in the above-described method embodiments.
Alternatively, the processing unit 1301 is configured to support the first electronic apparatus 1300 to execute S1201 and S1202 in fig. 12.
Optionally, the transceiver unit 1302 is configured to support the first electronic device 1300 to perform S1203 in fig. 12.
The transceiver unit may include a receiving unit and a transmitting unit, may be implemented by a transceiver or transceiver-related circuit components, and may be a transceiver or a transceiver module. The operations and/or functions of each unit in the first electronic device 1300 implement the corresponding flows of the input control method described in the above method embodiments; for brevity, reference may be made to the functional description of the corresponding functional unit, and the relevant content of each step of the above method embodiments is not repeated herein.
Optionally, the first electronic device 1300 shown in fig. 13 may further include a storage unit (not shown in fig. 13) in which a program or instructions are stored. When the processing unit 1301 and the transceiver unit 1302 execute the program or instructions, the first electronic apparatus 1300 shown in fig. 13 is enabled to execute the input control method described in the above method embodiment.
The technical effects of the first electronic device 1300 shown in fig. 13 may refer to the technical effects of the input control method described in the above method embodiment, and will not be described herein.
In addition to the form of the first electronic device 1300, the technical solution provided in the present application may also be a functional unit or a chip in the first electronic device, or a device used in cooperation with the first electronic device.
In one possible design, fig. 14 is a schematic structural diagram of a second electronic device according to an embodiment of the present application. As shown in fig. 14, the second electronic device 1400 may include: a transceiver unit 1401 and a display unit 1402. The second electronic device 1400 may be used to implement the functionality of the second electronic device 200 as referred to in the method embodiments described above.
Optionally, the transceiver unit 1401 is configured to support the second electronic device 1400 to perform S1203 in fig. 12.
Optionally, a display unit 1402 is configured to support the second electronic device 1400 to execute S1204 in fig. 12.
The transceiver unit may include a receiving unit and a transmitting unit, may be implemented by a transceiver or a transceiver related circuit component, and may be a transceiver or a transceiver module. The operations and/or functions of each unit in the second electronic device 1400 are respectively for implementing the corresponding flow of the input control method described in the above method embodiment, and all relevant contents of each step related to the above method embodiment may be referred to the functional description of the corresponding functional unit, which is not repeated herein for brevity.
Optionally, the second electronic device 1400 shown in fig. 14 may further include a processing unit (not shown in fig. 14), which may be implemented as a processing module or a processing circuit, for processing the first input sent by the first electronic device, to implement display of the display unit 1402.
Optionally, the second electronic device 1400 shown in fig. 14 may further include a storage unit (not shown in fig. 14) in which a program or instructions are stored. When the transceiver unit 1401 and the display unit 1402 execute the program or instructions, the second electronic device 1400 shown in fig. 14 is enabled to execute the input control method described in the above-described method embodiment.
The technical effects of the second electronic device 1400 shown in fig. 14 may refer to the technical effects of the input control method described in the above method embodiment, and will not be described herein.
In addition to the form of the second electronic device 1400, the technical solution provided in the present application may also be a functional unit or a chip in the second electronic device, or a device matched with the second electronic device for use.
The embodiment of the application also provides a chip system, which comprises: a processor coupled to a memory for storing programs or instructions which, when executed by the processor, cause the system-on-a-chip to implement the method of any of the method embodiments described above.
Alternatively, the processor in the system-on-chip may be one or more. The processor may be implemented in hardware or in software. When implemented in hardware, the processor may be a logic circuit, an integrated circuit, or the like. When implemented in software, the processor may be a general purpose processor, implemented by reading software code stored in a memory.
Alternatively, the memory in the system-on-chip may be one or more. The memory may be integrated with the processor or may be separate from the processor, and embodiments of the present application are not limited. For example, the memory may be a non-transitory processor, such as a ROM, which may be integrated on the same chip as the processor, or may be separately disposed on different chips, and the type of memory and the manner of disposing the memory and the processor in the embodiments of the present application are not specifically limited.
Illustratively, the chip system may be a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), a system on chip (SoC), a central processing unit (CPU), a network processor (NP), a digital signal processor (DSP), a microcontroller unit (MCU), a programmable logic device (PLD), or another integrated chip.
It should be understood that the steps in the above-described method embodiments may be accomplished by integrated logic circuitry in hardware in a processor or instructions in the form of software. The steps of a method disclosed in connection with the embodiments of the present application may be embodied directly in a hardware processor or in a combination of hardware and software modules in a processor.
The present application also provides a computer-readable storage medium having a computer program stored therein, which when run on a computer causes the computer to perform the above-described related steps to implement the input control method in the above-described embodiments.
The present application also provides a computer program product, which when run on a computer, causes the computer to perform the above-mentioned related steps to implement the input control method in the above-mentioned embodiments.
In addition, the embodiment of the application also provides a device. The apparatus may be a component or module in particular, and may comprise one or more processors and memory coupled. Wherein the memory is for storing a computer program. The computer program, when executed by one or more processors, causes the apparatus to perform the input control method in the method embodiments described above.
Wherein an apparatus, a computer-readable storage medium, a computer program product, or a chip provided by embodiments of the present application are each configured to perform the corresponding method provided above. Therefore, the advantages achieved by the method can be referred to as the advantages in the corresponding method provided above, and will not be described herein.
The steps of a method or algorithm described in connection with the disclosure of the embodiments disclosed herein may be embodied in hardware, or in software instructions executed by a processor. The software instructions may consist of corresponding software modules, which may be stored in random access memory (RAM), flash memory, read-only memory (ROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), registers, a hard disk, a removable disk, a compact disc read-only memory (CD-ROM), or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an application specific integrated circuit (ASIC).
From the foregoing description of the embodiments, it will be apparent to those skilled in the art that the foregoing functional-block divisions are merely illustrative, for convenience and brevity of description. In practical applications, the above functions may be allocated to different functional modules as needed; that is, the internal structure of the device may be divided into different functional modules to perform all or part of the functions described above. For the specific working processes of the systems, devices, and units described above, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated herein.
In the several embodiments provided in this application, it should be understood that the disclosed methods may be implemented in other ways. The device embodiments described above are merely illustrative. For example, the division into modules or units is only one division by logical function; other division modes may be adopted in actual implementation. For example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, modules, or units, and may be in electrical, mechanical, or other form.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
Computer readable storage media include, but are not limited to, any of the following: a U-disk, a removable hard disk, a read-only memory (ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely a specific embodiment of the present application, but the protection scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered in the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (16)

1. An input control method, applied to a first electronic device, the method comprising:
determining a first operation acting on a first display area, the first operation being used to indicate that an input focus position is to be switched to the first display area of a second electronic device; and
after determining the first operation acting on the first display area, intercepting a first input and transmitting the first input to the second electronic device, so that the first input is displayed in the first display area of the second electronic device.
2. The method of claim 1, wherein the first operation is a mouse click operation on the first display area, or the first operation is a touch operation on the first display area.
3. The method of claim 2, wherein, after determining the first operation acting on the first display area, intercepting the first input and transmitting the first input to the second electronic device comprises:
after detecting a mouse click operation acting on the first display area, or after determining a touch operation acting on the first display area, intercepting the first input from a keyboard and transmitting the first input to the second electronic device.
4. The method according to any one of claims 1-3, wherein the determining a first operation acting on the first display area comprises:
detecting the first operation acting on the first display area; or
receiving first information sent by the second electronic device, wherein the first information is sent after the second electronic device detects the first operation acting on the first display area.
5. The method according to any one of claims 1-4, further comprising:
after detecting a second operation acting on a second display area, determining to scroll the display content displayed in the second display area, wherein the second display area is a display area corresponding to the first electronic device.
6. The method of claim 5, wherein the second operation is a mouse scroll wheel operation when the second display area displays a mouse pointer.
7. The method of claim 5 or 6, wherein the determining to scroll the display content displayed in the second display area comprises:
scrolling the display content of the second display area when the second display area is a display area of the first electronic device; or
sending the scrolled display content of the second display area to the second electronic device when the second display area is a display area, displayed by the second electronic device, that presents the screen-cast content of the first electronic device.
8. The method according to any one of claims 1-7, further comprising:
determining a third operation acting on the second display area, the third operation being used to indicate that the input focus position is to be switched to the second display area corresponding to the first electronic device; and
determining, according to a second input, to display the second input in the second display area.
9. An input control method, applied to a second electronic device, the method comprising:
after detecting a first operation acting on a first display area, sending first information to a first electronic device, wherein the first operation is used to indicate that an input focus position is to be switched to the first display area of the second electronic device;
receiving a first input sent by the first electronic device, wherein the first input is the input intercepted by the first electronic device after receiving the first information; and
displaying the first input in the first display area.
10. The method of claim 9, wherein the first operation is a mouse click operation on the first display area, or the first operation is a touch operation on the first display area.
11. The method of claim 9 or 10, wherein the second electronic device further displays a second display area, the second display area being used to display screen-cast content sent by the first electronic device, the method further comprising:
displaying, in the second display area, first screen-cast content sent by the first electronic device, wherein the first screen-cast content is screen-cast content being scrolled and is sent by the first electronic device after the first electronic device detects a second operation acting on the second display area.
12. The method of claim 11, wherein the second operation is a mouse scroll wheel operation when the second display area displays a mouse pointer.
13. An electronic device, comprising: a processor and a memory coupled to the processor, the memory for storing computer program code, the computer program code comprising computer instructions that, when read from the memory by the processor, cause the electronic device to perform the method of any of claims 1-8.
14. An electronic device, comprising: a processor, a memory, and a display screen, the processor being coupled to the memory and the display screen, respectively, the memory being configured to store computer program code, the computer program code comprising computer instructions that, when read from the memory by the processor, cause the electronic device to perform the method of any of claims 9-12.
15. A computer-readable storage medium, wherein the computer-readable storage medium comprises a computer program which, when run on an electronic device, causes the electronic device to perform the method of any one of claims 1-8, or causes the electronic device to perform the method of any one of claims 9-12.
16. A computer program product which, when run on a computer, causes the computer to perform the method of any one of claims 1-8, or causes the computer to perform the method of any one of claims 9-12.
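The focus-switching and interception flow of claims 1 and 8 can be sketched as follows. This is a minimal illustration only, not the patented implementation: the class and method names (`InputRouter`, `on_pointer_event`, `on_key_event`) and the area identifiers are hypothetical, and the transmission to the second device is modeled as a list rather than a real connection.

```python
class InputRouter:
    """Sketch of input routing on the first electronic device.

    The device has its own display area (the "second display area") and a
    window mirroring the second device (the "first display area").  The
    input focus position follows whichever area was last clicked or touched.
    """

    def __init__(self):
        self.focus = "second_display_area"  # input focus starts locally
        self.local_text = []                # text shown in the local area
        self.sent_to_peer = []              # inputs forwarded to the second device

    def on_pointer_event(self, area):
        # A mouse click or touch on a display area is the "first operation"
        # (claim 1) or "third operation" (claim 8): it switches the input
        # focus position to that area.
        self.focus = area

    def on_key_event(self, ch):
        if self.focus == "first_display_area":
            # Focus is on the second device's area: intercept the keyboard
            # input and transmit it instead of handling it locally.
            self.sent_to_peer.append(ch)
        else:
            self.local_text.append(ch)


router = InputRouter()
router.on_key_event("a")                       # focus local: shown locally
router.on_pointer_event("first_display_area")  # first operation
router.on_key_event("b")                       # intercepted and forwarded
router.on_pointer_event("second_display_area") # third operation
router.on_key_event("c")                       # shown locally again
```

The point of the sketch is that interception is a pure consequence of focus state: no per-keystroke decision is sent between devices, only the intercepted inputs themselves.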
CN202111276091.8A 2021-10-29 2021-10-29 Input control method and electronic equipment Pending CN116069224A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111276091.8A CN116069224A (en) 2021-10-29 2021-10-29 Input control method and electronic equipment
PCT/CN2022/119062 WO2023071590A1 (en) 2021-10-29 2022-09-15 Input control method and electronic device


Publications (1)

Publication Number Publication Date
CN116069224A true CN116069224A (en) 2023-05-05

Family

ID=86159106




Also Published As

Publication number Publication date
WO2023071590A1 (en) 2023-05-04


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination