CN114579016A - Method for sharing input equipment, electronic equipment and system


Info

Publication number
CN114579016A
Authority
CN
China
Prior art keywords
electronic device
display interface
boundary
input
cursor
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110131920.7A
Other languages
Chinese (zh)
Inventor
熊彬
李斌飞
罗朴良
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Application filed by Huawei Technologies Co Ltd
Priority to US18/254,984 (US20240045557A1)
Priority to PCT/CN2021/134032 (WO2022111690A1)
Priority to EP21897190.1A (EP4235371A4)
Publication of CN114579016A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03543 Mice or pucks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812 Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a method, an electronic device, and a system for sharing an input device, which can support a plurality of electronic devices in sharing the same input device. After the first electronic device establishes the first connection with the second electronic device, the mouse pointer of the first electronic device may cross the boundary of its display interface and appear on the display interface of the second electronic device. While the mouse pointer is displayed on the display screen of the second electronic device and the user performs input operations using an input device of the first electronic device, such as a mouse or a keyboard, the second electronic device can receive the input events from the first electronic device and respond to them, while the corresponding input events are masked on the first electronic device side. By implementing the method, the user can use the input device of the first electronic device to perform input operations on the second electronic device; the user's input can be conveniently and quickly switched among different devices, input operations become more coherent, and the user's input experience is improved.

Description

Method for sharing input equipment, electronic equipment and system
Technical Field
The present application relates to the field of terminals, and in particular, to a method, an electronic device, and a system for sharing an input device.
Background
In daily life, intelligent terminal devices are various, including mobile phones, Personal Computers (PCs), tablet computers (PADs), televisions, and the like. Each terminal device is independent, and there may be ecosystem differences between terminals (for example, a PC and a tablet computer may run different operating systems). Each electronic device has its own input device, and the input devices of different electronic devices cannot be shared. This causes input device redundancy and forces the user to switch input devices back and forth when using different electronic devices, which is inconvenient and degrades the user experience.
Disclosure of Invention
The application provides a method for sharing input equipment, related electronic equipment and a system, which can support a user to use the same input equipment to perform input operation among different electronic equipment.
The above and other objects are achieved by the features of the independent claims. Further implementations are presented in the dependent claims, the description and the drawings.
In a first aspect, an embodiment of the present application provides a method for sharing an input device, where the method includes: the first electronic device establishes a first connection with the second electronic device. The first electronic device detects a first movement operation, where the first movement operation is an operation indicating that the first cursor moves out of the first display interface of the first electronic device; the first movement operation corresponds to a third offset, and the third offset is used to represent that the first cursor moves out of the first display interface. The first electronic device may then send a first message to the second electronic device over the first connection, and the first message may be used to notify the second electronic device to display the second cursor. The second electronic device may display a second cursor at a first position in the second display interface according to the first message. After the second cursor is displayed on the second electronic device, the first electronic device detects a second movement operation, and the second movement operation corresponds to the first offset. The first electronic device sends a second message to the second electronic device through the first connection, where the second message carries the first offset. The second electronic device moves the second cursor from the first position to a second position, and the offset of the second position relative to the first position is the first offset. For example, in some embodiments, after the PC establishes a connection with the PAD, an input device of the PC, such as a mouse or a keyboard, may be used to make inputs on the PAD. When the mouse pointer on the PC moves to the boundary of the PC display interface and continues to move beyond the boundary, the PC sends a message to the PAD to inform the PAD to display the mouse pointer. In response to the PC's message, the PAD may display a mouse pointer in its display interface. The PAD may then receive and respond to input events from the PC as the user inputs content using the PC's input device. For example, when the user moves the mouse a certain distance, the mouse pointer in the PAD moves a corresponding offset; when the user types on the keyboard of the PC, the input characters and the like can be displayed on the PAD.
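The first aspect thus defines a small two-message protocol: a first message that makes the cursor appear on the peer, followed by second messages that carry raw offsets. The following Python sketch illustrates that flow on the sending side; it is only an illustration under assumed names (SharingSource, the JSON message shapes, the right-edge-only crossing check), none of which come from the patent itself.

```python
import json
import socket

class SharingSource:
    """First electronic device: detects a border crossing, then forwards offsets."""

    def __init__(self, conn: socket.socket, width: int, height: int):
        self.conn = conn                  # the established "first connection"
        self.width, self.height = width, height   # first display interface resolution
        self.x, self.y = width // 2, height // 2  # first cursor position
        self.remote_active = False        # True while the second cursor is shown

    def _send(self, msg: dict) -> None:
        self.conn.sendall((json.dumps(msg) + "\n").encode())

    def on_mouse_move(self, dx: int, dy: int) -> None:
        if not self.remote_active:
            if self.x + dx >= self.width:
                # First message: the cursor left through the right boundary,
                # so tell the peer to display the second cursor.
                self._send({"type": "show_cursor", "enter_y": self.y})
                self.remote_active = True
            else:
                # Ordinary local move, clamped to the first display interface.
                self.x = max(0, self.x + dx)
                self.y = min(max(0, self.y + dy), self.height - 1)
        else:
            # Second message: forward the first offset; the peer moves the
            # second cursor from the first position to the second position.
            self._send({"type": "move", "dx": dx, "dy": dy})
```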
Implementing the method of the first aspect, the input device may be shared among a plurality of electronic devices, and input may be performed on the second electronic device using the input device of the first electronic device. For example, the user can use the mouse or keyboard of the PC to make an input on the PAD. The method makes the user's input operations more coherent: input can be conveniently and quickly switched among different devices without switching input devices back and forth, improving the user's input experience.
In combination with the first aspect, in some embodiments, the method may further include: the first electronic device detects a third movement operation, where the third movement operation is an operation indicating that the second cursor moves out of the second display interface of the second electronic device, and the third movement operation corresponds to a second offset, where the second offset is used to represent that the second cursor moves out of the second display interface of the second electronic device. The first electronic device displays a first cursor at a third location in the first display interface. For example, the mouse pointer may be displayed again on the display interface of the PC when the mouse pointer reaches the boundary of the PAD display interface and continues to move outward.
In combination with the first aspect, in some embodiments, the first electronic device includes a first input device, which may include one or more of: a mouse, keyboard, tablet, camera, touch pad, scanner, stylus, remote control stick, voice input device, etc.
In combination with the first aspect, in some embodiments, the first movement operation, the second movement operation, or the third movement operation is detected by the first electronic device through the first input device.
With reference to the first aspect, in some embodiments, the first location is located on the second boundary of the second display interface and the third location is located on the first boundary of the first display interface.
With reference to the first aspect, in some embodiments, the first movement operation, the second movement operation, or the third movement operation is an input operation of a user moving a mouse, or a user operating a touch-sensitive panel, or a user operating a keyboard, or a user operating a writing pad, or a user operating a remote control stick, or a user operating a voice input device, or a user moving an eye, or a program instruction indicating a cursor movement.
In combination with the first aspect, in some embodiments, the first message carries coordinates of the first location.
In some embodiments, in combination with the first aspect, the first offset is used to indicate that the second cursor is moving in a direction within the second display interface if the first location is on a boundary of the second display interface.
With reference to the first aspect, in some embodiments, the first display interface has four boundaries, where the four boundaries are an upper boundary, a lower boundary, a left boundary, and a right boundary of the display interface in the forward layout, and the second display interface has four boundaries, where the four boundaries are an upper boundary, a lower boundary, a left boundary, and a right boundary of the display interface in the forward layout. Wherein if the first boundary is the left boundary of the first display interface, the second boundary is the right boundary of the second display interface; if the first boundary is the right boundary of the first display interface, the second boundary is the left boundary of the second display interface; if the first boundary is the upper boundary of the first display interface, the second boundary is the lower boundary of the second display interface; if the first boundary is a lower boundary of the first display interface, then the second boundary is an upper boundary of the second display interface.
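The edge pairing described above is a fixed opposite-edge rule, which can be captured in a lookup table; a minimal sketch (boundary names are illustrative, not from the patent):

```python
# A cursor leaving one boundary of the first display interface enters through
# the opposite boundary of the second display interface.
ENTRY_BOUNDARY = {
    "left": "right",
    "right": "left",
    "top": "bottom",
    "bottom": "top",
}

def entry_boundary(exit_boundary: str) -> str:
    """Second boundary on the peer display for a given first (exit) boundary."""
    return ENTRY_BOUNDARY[exit_boundary]
```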
With reference to the first aspect, in some embodiments, the first electronic device hosts a first operating system and the second electronic device hosts a second operating system.
In combination with the first aspect, in some embodiments, the method further comprises: with a second cursor displayed on the second display interface, the first electronic device detects a first input event, where the first input event is from an input operation collected by a first input device of the first electronic device. The first electronic device maps the first input event into a second input event, where a first mapping table is stored in the first electronic device, and the first mapping table stores the mapping relationship between the first input event and the second input event. The first electronic device sends a third message to the second electronic device, and the third message carries the second input event. The second electronic device receives the second input event.
In combination with the first aspect, in some embodiments, where the first input device is a keyboard, the first input event includes a first key value generated by a user tapping the keyboard, and the second input event includes a second key value, and a character or a control command corresponding to the first key value in the first operating system is consistent with a character or a control command corresponding to the second key value in the second operating system.
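Since the two devices may run different operating systems, the same key can carry different key values on each side, which is what the first mapping table reconciles. The sketch below assumes, purely for illustration, Windows-style virtual-key codes on the first device and Android-style keycodes on the second; the patent names neither system, and the table would differ for other OS pairs.

```python
# Hypothetical first mapping table: Windows virtual-key code -> Android keycode.
# Each pair must denote the same character or control command on both systems.
KEY_MAP = {
    0x41: 29,   # 'A'       : VK code 0x41   -> KEYCODE_A (29)
    0x0D: 66,   # Enter     : VK_RETURN 0x0D -> KEYCODE_ENTER (66)
    0x08: 67,   # Backspace : VK_BACK 0x08   -> KEYCODE_DEL (67)
}

def map_key_event(first_key_value: int) -> int:
    """Map a first input event's key value to the second input event's key value."""
    return KEY_MAP[first_key_value]
```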
With reference to the first aspect, in some embodiments, the display area of the first display interface is a visible area corresponding to the first resolution, and the display area of the second display interface is a visible area corresponding to the second resolution.
In combination with the first aspect, in some embodiments, the method further comprises: after the first electronic device detects the first movement operation, the first electronic device determines the coordinate of a first position of a second cursor displayed on a second display interface according to the coordinate of the first cursor on the first display interface, the first resolution of the first display interface and the second resolution of the second display interface acquired from the second electronic device.
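The patent does not give the exact formula for this computation, but a natural choice is to scale the exit coordinate by the ratio of the two resolutions. A sketch under that assumption, for a cursor leaving through the right boundary of the first display:

```python
def first_position(exit_y: int, first_res: tuple[int, int],
                   second_res: tuple[int, int]) -> tuple[int, int]:
    """Coordinates of the first position on the second display interface,
    assuming the first cursor left through the right boundary."""
    _w1, h1 = first_res    # e.g. (1920, 1080) on the PC
    _w2, h2 = second_res   # e.g. (2560, 1600), obtained from the second device
    entry_x = 0                          # left boundary of the peer display
    entry_y = round(exit_y * h2 / h1)    # keep the relative vertical position
    return entry_x, min(entry_y, h2 - 1)
```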
In combination with the first aspect, in some embodiments, the method further comprises: the first electronic device creates a virtual screen, and the resolution of the virtual screen is the second resolution. The first electronic device moves the first cursor to a fourth position on the virtual screen while the second electronic device displays the second cursor at the first position in the second display interface, where the coordinate value of the fourth position is the same as that of the first position. When the second movement operation is detected, the first electronic device moves the first cursor from the fourth position to a fifth position, and the coordinate values of the fifth position and the second position are the same. For example, the PC creates a virtual screen with the same resolution as the PAD display screen; after the mouse pointer moves out of the boundary of the PC display screen, it moves onto the virtual screen while a mouse pointer is displayed on the PAD display interface. The position of the mouse pointer on the virtual screen corresponds one-to-one to the coordinate position of the mouse pointer displayed on the PAD, with identical coordinate values, so the coordinates of the mouse pointer on the virtual screen can be sent to the PAD directly, no complex coordinate conversion is needed, and processor power consumption is reduced.
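The benefit of the virtual screen is purely numeric: because it shares the second display's resolution, the hidden first cursor's coordinates equal the second cursor's coordinates and can be transmitted verbatim. A minimal sketch of that bookkeeping (class name hypothetical):

```python
class VirtualScreen:
    """Off-screen area created by the first device at the second display's
    resolution; the first cursor parks here while the second cursor is shown."""

    def __init__(self, second_resolution: tuple[int, int]):
        self.width, self.height = second_resolution
        self.x, self.y = 0, 0               # set to the fourth position on entry

    def move(self, dx: int, dy: int) -> tuple[int, int]:
        # Clamp inside the virtual screen. Because the resolutions match,
        # these coordinates can be sent to the second device without conversion.
        self.x = min(max(0, self.x + dx), self.width - 1)
        self.y = min(max(0, self.y + dy), self.height - 1)
        return self.x, self.y
```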
With reference to the first aspect, in some embodiments, the first cursor is not displayed on the first display interface of the first electronic device when the second cursor is displayed on the second display interface. For example, while the mouse pointer is displayed in the PAD display interface, input events on the PC side can be masked, i.e., the PC does not respond to input events of the input device.
In combination with the first aspect, in some embodiments, the method further comprises: when the layout of the display interface of the second electronic device is changed from the second display interface to a third display interface, the second electronic device changes the position of the second cursor from the second position to a sixth position, wherein the second display interface and the third display interface contain the same interface elements, and the resolution of the second display interface is different from that of the third display interface. For example, when the PAD changes from landscape to portrait orientation, the layout of the PAD display interface changes; the position of the mouse pointer may change accordingly, and the mouse pointer may keep pointing to the same pixel point or the same interface element. Likewise, when a foldable-screen mobile phone switches between the unfolded and folded states, the position of the mouse pointer may also change with the change of the display interface.
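One plausible way to realize this behavior is to resolve the interface element under the cursor before the relayout and re-query that element's bounds afterwards. The sketch below does exactly that; hit_test and bounds_of are hypothetical layout queries, not APIs from the patent.

```python
def reposition_cursor_after_relayout(cursor_xy, old_layout, new_layout):
    """Keep the second cursor on the same interface element after the display
    interface changes, e.g. from landscape (second display interface) to
    portrait (third display interface)."""
    element = old_layout.hit_test(*cursor_xy)        # element under the cursor
    if element is None:
        return cursor_xy                             # nothing to anchor to
    left, top, right, bottom = new_layout.bounds_of(element)
    # Sixth position: center of the same element in the new layout.
    return (left + right) // 2, (top + bottom) // 2
```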
In some embodiments, in combination with the first aspect, the second location and the sixth location point to the same interface element.
In a second aspect, an embodiment of the present application provides a method for sharing an input device, which is applied to a first electronic device, and the method includes: the first electronic device establishes a first connection with the second electronic device. The first electronic device detects a first movement operation, the first movement operation is an operation of indicating the first cursor to move out of the first display interface of the first electronic device, the first movement operation corresponds to a third offset, and the third offset is used for representing the movement of the first cursor out of the first display interface. The first electronic device sends a first message to the second electronic device through the first connection, and the first message is used for informing the second electronic device to display a second cursor. The first electronic device detects a second movement operation, the second movement operation corresponding to the first offset. The first electronic device sends a second message to the second electronic device through the first connection, the second message carries a first offset, the second message is used for informing the second electronic device to move a second cursor from a first position to a second position, and the offset of the second position relative to the first position is the first offset. For example, in some embodiments, after the PC establishes a connection with the PAD, an input device such as a mouse or a keyboard of the PC may be used to make an input on the PAD. When the mouse pointer on the PC moves to the boundary of the PC display interface and continues to move out of the boundary, the PC sends a message to the PAD to inform the PAD to display the mouse pointer. In response to the PC's message, the PAD may display a mouse pointer in a display interface. The PAD may then receive and respond to input events from the PC when the user inputs content using the PC's input device. For example, when a user moves a mouse for a certain distance, a mouse pointer in the PAD can move a corresponding offset; the user inputs on the keyboard of the PC, and the input characters and the like can be displayed in the PAD.
Implementing the method of the second aspect, the input device may be shared among a plurality of electronic devices, and input may be performed on the second electronic device using the input device of the first electronic device. For example, the user can use the mouse or keyboard of the PC to make an input on the PAD. The method makes the user's input operations more coherent: input can be conveniently and quickly switched among different devices without switching input devices back and forth, improving the user's input experience.
In some embodiments, in combination with the second aspect, the method further includes the first electronic device detecting a third movement operation, the third movement operation being an operation that indicates movement of the second cursor out of the second display interface of the second electronic device, the third movement operation corresponding to a second offset amount, the second offset amount being usable to characterize movement of the second cursor out of the second display interface of the second electronic device. The first electronic device displays a first cursor at a third location in the first display interface.
In combination with the second aspect, in some embodiments, the first electronic device includes a first input device, which may include one or more of: a mouse, keyboard, tablet, camera, touch pad, scanner, stylus, remote control stick, voice input device, etc.
In combination with the second aspect, in some embodiments, the first movement operation, the second movement operation, or the third movement operation is detected by the first electronic device through the first input device.
In some embodiments, the first location is located on the second boundary of the second display interface and the third location is located on the first boundary of the first display interface.
In some embodiments, the first movement operation, the second movement operation, or the third movement operation is an input operation of a user moving a mouse, or a user operating a touch-sensitive panel, or a user operating a keyboard, or a user operating a writing pad, or a user operating a remote control stick, or a user operating a voice input device, or a user moving an eye, or a program instruction indicating a cursor movement.
In combination with the second aspect, in some embodiments, the first message carries coordinates of the first location.
In conjunction with the second aspect, in some embodiments, the first offset is used to indicate that the second cursor is moving in a direction within the second display interface if the first location is on a boundary of the second display interface.
With reference to the second aspect, in some embodiments, the first display interface has four boundaries, where the four boundaries are an upper boundary, a lower boundary, a left boundary, and a right boundary of the display interface in the forward layout, and the second display interface has four boundaries, where the four boundaries are an upper boundary, a lower boundary, a left boundary, and a right boundary of the display interface in the forward layout. If the first boundary is the left boundary of the first display interface, the second boundary is the right boundary of the second display interface; if the first boundary is the right boundary of the first display interface, the second boundary is the left boundary of the second display interface; if the first boundary is the upper boundary of the first display interface, the second boundary is the lower boundary of the second display interface; if the first boundary is a lower boundary of the first display interface, then the second boundary is an upper boundary of the second display interface.
In some embodiments, in combination with the second aspect, the first electronic device hosts a first operating system and the second electronic device hosts a second operating system.
In combination with the second aspect, in some embodiments, the method further includes: with the second cursor displayed on the second display interface, the first electronic device detects a first input event, where the first input event is from an input operation collected by the first input device of the first electronic device. The first electronic device then maps the first input event into a second input event, where a first mapping table is stored in the first electronic device, and the first mapping table stores the mapping relationship between the first input event and the second input event. The first electronic device sends a third message to the second electronic device, and the third message carries the second input event.
With reference to the second aspect, in some embodiments, the display area of the first display interface is a visible area corresponding to a first resolution, and the display area of the second display interface is a visible area corresponding to a second resolution, and the method further includes, after the first electronic device detects the first movement operation, the first electronic device determines, according to the coordinates of the first cursor on the first display interface, the first resolution of the first display interface, and the second resolution of the second display interface acquired from the second electronic device, the coordinates of the first position where the second cursor is displayed on the second display interface.
In combination with the second aspect, in some embodiments, where the first input device is a keyboard, the first input event comprises a first key value generated by a user tapping the keyboard, and the second input event comprises a second key value, and a character or a control command corresponding to the first key value in the first operating system is consistent with a character or a control command corresponding to the second key value in the second operating system.
With reference to the second aspect, in some embodiments, the method further includes: the first electronic device creates a virtual screen, and the resolution of the virtual screen is the second resolution. The first electronic device moves the first cursor to a fourth position on the virtual screen while the second electronic device displays the second cursor at the first position in the second display interface, where the coordinate value of the fourth position is the same as that of the first position. When the second movement operation is detected, the first electronic device moves the first cursor from the fourth position to a fifth position, and the coordinate values of the fifth position and the second position are the same.
In combination with the second aspect, in some embodiments, the first cursor is not displayed on the first display interface of the first electronic device when the second cursor is displayed on the second display interface. For example, while the mouse pointer is displayed in the PAD display interface, input events on the PC side can be masked, i.e., the PC does not respond to input events of the input device.
In a third aspect, an embodiment of the present application provides a method for sharing an input device, which is applied to a second electronic device, and the method may include: the second electronic device establishes a first connection with the first electronic device. The second electronic device receives a first message from the first electronic device through the first connection; the first message is used by the first electronic device to inform the second electronic device to display a second cursor, and the first message is generated after the first electronic device detects a first movement operation, where the first movement operation is an operation indicating that the first cursor moves out of a first display interface of the first electronic device, the first movement operation corresponds to a third offset, and the third offset is used to represent that the first cursor moves out of the first display interface of the first electronic device. The second electronic device displays a second cursor at a first position in the second display interface according to the first message. The second electronic device receives a second message from the first electronic device through the first connection; the second message carries the first offset, and the second message is generated after the first electronic device detects a second movement operation, where the second movement operation corresponds to the first offset. The second electronic device moves the second cursor from the first position to the second position, and the offset of the second position relative to the first position is the first offset. For example, in some embodiments, after the PAD establishes a connection with the PC, input may be performed on the PAD using an input device of the PC, such as a mouse or a keyboard. When the mouse pointer on the PC moves to the boundary of the PC display interface and continues to move beyond the boundary, the PC sends a message to the PAD to inform the PAD to display the mouse pointer. In response to the PC's message, the PAD may display a mouse pointer in its display interface. The PAD may then receive and respond to input events from the PC when the user inputs content using the PC's input device. For example, when the user moves the mouse a certain distance, the mouse pointer in the PAD moves a corresponding offset; when the user types on the keyboard of the PC, the input characters and the like may be displayed on the PAD.
Implementing the method of the third aspect, the input device may be shared among a plurality of electronic devices; in cases where using the second electronic device's own input means is inconvenient, input may be performed on the second electronic device using the input device of the first electronic device. For example, the user can use the mouse or keyboard of the PC to make an input on the PAD. The method makes the user's input operations more coherent: input can be conveniently and quickly switched among different devices without switching input devices back and forth, improving the user's input experience.
In combination with the third aspect, in some embodiments, the method may further include: the second electronic device receives a second offset sent by the first electronic device, where the second offset is the offset corresponding to a third movement operation; the third movement operation is an operation indicating that the second cursor moves out of the second display interface of the second electronic device, and the second offset is used to represent that the second cursor moves out of the second display interface of the second electronic device. The second electronic device then cancels displaying the second cursor.
With reference to the third aspect, in some embodiments, the first cursor is displayed at a third location in the first display interface of the first electronic device when the second cursor is not displayed by the second electronic device.
With reference to the third aspect, in some embodiments, the first location is located on the second boundary of the second display interface and the third location is located on the first boundary of the first display interface.
In combination with the third aspect, in some embodiments, the first electronic device includes a first input device, which may include one or more of: a mouse, keyboard, tablet, camera, touch pad, scanner, stylus, remote control stick, voice input device, etc.
With reference to the third aspect, in some embodiments, the first movement operation, the second movement operation, or the third movement operation is detected by the first electronic device through the first input device.
With reference to the third aspect, in some embodiments, the first movement operation, the second movement operation, or the third movement operation is an input operation of a user moving a mouse, or a user operating a touch-sensitive panel, or a user operating a keyboard, or a user operating a writing pad, or a user operating a remote control stick, or a user operating a voice input device, or a user moving an eye, or a program instruction indicating a cursor movement.
In combination with the third aspect, in some embodiments, the first message carries coordinates of the first location.
With reference to the third aspect, in some embodiments, if the first location is on a boundary of the second display interface, the first offset is used to indicate that the second cursor is moving in a direction within the second display interface.
With reference to the third aspect, in some embodiments, the first display interface has four boundaries, where the four boundaries are an upper boundary, a lower boundary, a left boundary, and a right boundary of the display interface in the forward layout, and the second display interface has four boundaries, where the four boundaries are an upper boundary, a lower boundary, a left boundary, and a right boundary of the display interface in the forward layout. If the first boundary is the left boundary of the first display interface, the second boundary is the right boundary of the second display interface; if the first boundary is the right boundary of the first display interface, the second boundary is the left boundary of the second display interface; if the first boundary is the upper boundary of the first display interface, the second boundary is the lower boundary of the second display interface; if the first boundary is a lower boundary of the first display interface, then the second boundary is an upper boundary of the second display interface.
With reference to the third aspect, in some embodiments, the method may further include, in a case that the second cursor is displayed on the second display interface, the second electronic device receives, through the first connection, a third message from the first electronic device, where the third message carries a second input event, and the second input event is an input event mapped correspondingly to the first input event, where the first electronic device stores a first mapping table, the first mapping table stores a mapping relationship between the first input event and the second input event, and the first input event is from an input operation acquired by the first input device of the first electronic device.
In combination with the third aspect, in some embodiments, where the first input device is a keyboard, the first input event comprises a first key value generated by a user tapping the keyboard, and the second input event comprises a second key value, and the character or control command corresponding to the first key value in the first operating system is consistent with the character or control command corresponding to the second key value in the second operating system.
With reference to the third aspect, in some embodiments, the first electronic device hosts a first operating system and the second electronic device hosts a second operating system.
With reference to the third aspect, in some embodiments, the display area of the first display interface is a visible area corresponding to the first resolution, the display area of the second display interface is a visible area corresponding to the second resolution, and the coordinates of the first position of the second cursor displayed on the second display interface are determined by the first electronic device according to the coordinates of the first cursor on the first display interface, the first resolution of the first display interface, and the second resolution of the second display interface acquired from the second electronic device.
With reference to the third aspect, in some embodiments, the coordinate value of the first position of the second cursor in the second display interface is the same as the coordinate value of the fourth position of the first cursor in the virtual screen, the virtual screen is created by the first electronic device, the resolution of the virtual screen is the second resolution, and the fourth position is a position where the first cursor appears on the virtual screen after moving out of the first display interface. The coordinate value of the second position of the second cursor in the second display interface is the same as the coordinate value of the fifth position of the first cursor in the virtual screen, and the offset of the fifth position relative to the fourth position is the first offset.
In combination with the third aspect, in some embodiments, the method may further include: when the layout of the display interface of the second electronic device is changed from the second display interface to a third display interface, the second electronic device changes the position of the second cursor from the second position to a sixth position, wherein the second display interface and the third display interface contain the same interface elements, and the resolution of the second display interface is different from that of the third display interface. For example, when the PAD changes from landscape to portrait orientation, the layout of the PAD display interface changes; the position of the mouse pointer may change accordingly, and the mouse pointer may keep pointing to the same pixel point or the same interface element. Likewise, when a foldable-screen mobile phone switches between the unfolded and folded states, the position of the mouse pointer may also change with the change of the display interface.
In combination with the third aspect, in some embodiments, the second location and the sixth location point to the same interface element.
In a fourth aspect, an embodiment of the present application provides an electronic device, which may include a communication apparatus, a memory, one or more processors, and one or more programs, where the one or more processors are configured to execute the one or more computer programs stored in the memory, so that the electronic device can implement any of the functions of the first electronic device in the second aspect.
In a fifth aspect, embodiments of the present application provide an electronic device, which may include a communication apparatus, a memory, one or more processors, and one or more programs, where the one or more processors are configured to execute the one or more computer programs stored in the memory, so that the electronic device may implement any of the functions that the second electronic device has in the third aspect.
In a sixth aspect, the present application provides a communication system, which may include the first electronic device and the second electronic device described in the foregoing aspects. It can be understood that, based on the same inventive concept, the steps executed by the first electronic device and the second electronic device in the system of the sixth aspect may refer to the steps executed when the first electronic device and the second electronic device in the method of the first aspect implement the corresponding functions, and the functions and other descriptions of the first electronic device and the second electronic device may refer to the relevant descriptions in the fourth aspect and the fifth aspect, which are not repeated herein.
In a seventh aspect, an embodiment of the present application provides a computer storage medium, where a computer program is stored in the storage medium; the computer program includes executable instructions, and when the executable instructions are executed by a processor, the processor is caused to perform operations corresponding to the method provided in the first aspect, the second aspect, or the third aspect.
In an eighth aspect, an embodiment of the present application provides a chip system, where the chip system may be applied to an electronic device, and the chip includes one or more processors, where the processors are configured to invoke computer instructions to enable the electronic device to implement any one of the possible implementations according to the first aspect, or any one of the possible implementations according to the second aspect, or any one of the possible implementations according to the third aspect.
In a ninth aspect, an embodiment of the present application provides a computer program product containing instructions, which is characterized in that, when the computer program product is run on an electronic device, the electronic device is caused to execute any one of the possible implementations as in the first aspect, or any one of the possible implementations as in the second aspect, or any one of the possible implementations as in the third aspect.
By implementing the method for sharing an input device provided by the application, the input device can be shared among a plurality of electronic devices; for example, a user can use the mouse and keyboard of a PC to make inputs on a PAD. The method makes the user's input operations more coherent: input can be conveniently and quickly switched among different devices without switching input devices back and forth, improving the user's input experience. The method also streamlines the implementation steps, reducing complex coordinate conversion and processor power consumption.
Drawings
Fig. 1 is a schematic diagram of a communication system according to an embodiment of the present application;
fig. 2 is a schematic hardware structure diagram of an electronic device according to an embodiment of the present application;
fig. 3 is a schematic software architecture diagram of a communication system according to an embodiment of the present application;
FIGS. 4A-4B are schematic diagrams of an internal implementation method provided by an embodiment of the present application;
FIGS. 5A-5C are a set of schematic user interfaces provided by embodiments of the present application;
FIGS. 6A-6C are a set of scene interface diagrams provided by embodiments of the present application;
FIG. 7 is a schematic diagram of a user interface provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of a user interface provided by an embodiment of the present application;
FIGS. 9A-9C are a set of schematic views of a scenario provided by an embodiment of the present application;
FIGS. 10A-10C are schematic views of another set of scenarios provided by embodiments of the present application;
FIG. 11 is a schematic view of a user interface provided by an embodiment of the present application;
FIG. 12 is a schematic view of a user interface provided by an embodiment of the present application;
fig. 13 is a flowchart of a method for sharing an input device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and in detail with reference to the accompanying drawings. In the description of the embodiments herein, "/" means "or" unless otherwise specified; for example, A/B may mean A or B. "And/or" in the text merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. The terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features referred to. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. Further, in the description of the embodiments of the application, "plurality" means two or more.
There are many types of intelligent terminal devices, including mobile phones, PCs, PADs, televisions, and the like. Each terminal device is independent, and there may be ecosystem differences between terminals; for example, a PC and a tablet computer may run different operating systems.
Each electronic device has an input device matched to itself, and usually the input devices of different electronic devices cannot be shared; for example, a PC generally uses a mouse, a keyboard, a writing pad, and the like for input, while a PAD or a mobile phone generally receives input through touch operations. However, in some cases, text input using a keyboard or a mouse is more convenient and faster than a touch soft keyboard, so a user may want to use a mouse or a keyboard when writing a document on a PAD. When a user uses a PC and a PAD at the same time, equipping the PAD with an additional keyboard and mouse causes input device redundancy and forces the user to switch input devices back and forth between different electronic devices, which is inconvenient and degrades the user experience.
Taking the case where a PC and a PAD share the PC's input devices, such as a mouse and a keyboard, as an example: in one screen projection scheme, a screen projection interface of the PAD is created in the display interface of the PC; the user can perform input operations in the screen projection interface using input devices such as the mouse and the keyboard, the input operations are then sent to the PAD through the screen projection connection, and the PAD responds to them. In this implementation, the PAD screen projection interface blocks and occupies part of the PC display interface, wasting PC display space; the mouse position in the PAD screen projection interface requires relatively complex conversion to correspond to positions in the PAD's actual display interface; in addition, maintaining the screen projection function consumes processor power.
The embodiment of the application provides a method for sharing an input device, which can be applied to a communication system comprising a plurality of electronic devices. Taking the case where a PC and a PAD share the PC's input devices, such as a mouse and a keyboard, as an example: according to the method provided by the application, after the PC establishes a connection for sharing input devices with the PAD, when the mouse pointer of the PC reaches the boundary of the PC display interface and an offset of the mouse moving beyond the boundary is detected, the PC can calculate the first position at which the mouse pointer should appear on the PAD display interface, send the first position to the PAD, and notify the PAD to display the mouse pointer on its display interface. The PAD can then draw a mouse pointer at the first position based on the message sent by the PC. The first position may be located on a boundary of the PAD display interface. In some embodiments, after establishing the connection for sharing input devices with the PAD, the PC may create a virtual screen with the same resolution as the PAD. When the mouse pointer reaches the boundary of the PC display interface and continues to move outward, the mouse pointer moves from the PC display interface onto the virtual screen, and at the same time the PAD draws the mouse pointer on the PAD display interface according to the message sent by the PC. Because the resolution of the virtual screen is the same as that of the PAD display screen, the position of the mouse pointer on the virtual screen corresponds one-to-one to the coordinate position displayed on the PAD, so the coordinates of the mouse pointer on the virtual screen can be sent to the PAD directly without complex coordinate conversion.
When the mouse pointer is displayed on the PAD, each time the PC receives an input event, such as a mouse or keyboard event, it can map the event to a corresponding input event acting on the PAD and transmit it to the PAD. After receiving an input event from the PC's input device, the PAD responds to it. Meanwhile, input events on the PC side are masked, i.e., the PC does not respond to the input events of its input device.
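In other words, while the pointer is on the PAD the PC acts as a pure forwarder: each local event is mapped, sent, and then swallowed. A sketch of that dispatch, continuing the hypothetical names used in the earlier sketches:

```python
def dispatch_input_event(event, remote_active: bool, send) -> bool:
    """Return True if the event was consumed (masked) on the PC side.
    `send` transmits a message over the first connection."""
    if not remote_active:
        return False                          # the PC handles its own input
    if event.kind == "key":
        # Translate via the first mapping table (see map_key_event above).
        send({"type": "key", "value": map_key_event(event.key_value)})
    elif event.kind == "mouse_move":
        send({"type": "move", "dx": event.dx, "dy": event.dy})
    else:
        # Clicks, wheel and other events are forwarded as-is.
        send({"type": event.kind, "data": event.data})
    return True                               # mask the event locally
```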
By implementing the method for sharing an input device provided by the application, the input device can be shared among a plurality of electronic devices; for example, a user can use the mouse and keyboard of a PC to make inputs on a PAD. The method makes the user's input operations more coherent: input can be conveniently and quickly switched among different devices without switching input devices back and forth, improving the user's input experience. Moreover, compared with the screen projection scheme, the method reduces complex coordinate conversion and reduces processor power consumption.
Some relevant terms, concepts, and concepts referred to in the embodiments of the application are described below.
A pixel (PX) is the basic unit of image display. Each pixel may have its own color value and may be displayed using primary colors; for example, it may be divided into three sub-pixels of red, green, and blue (RGB color gamut), or into cyan, magenta, yellow, and black (CMYK color gamut). An image is a set of pixel points; generally, the more pixels per unit area, the higher the resolution, and the closer the displayed image is to the real object. On an electronic device, the number of pixels can be divided into a horizontal pixel count and a vertical pixel count. The horizontal pixel count indicates the number of pixels included in the horizontal direction, and the vertical pixel count indicates the number of pixels included in the vertical direction.
Resolution refers to the number of pixels in the horizontal and vertical directions, measured in px, where 1 px equals 1 pixel. The resolution determines how much information can be displayed and is expressed as horizontal pixel count by vertical pixel count, i.e., resolution = horizontal pixel count × vertical pixel count, such as 1960 × 1080. For images of the same physical size, when the resolution is relatively low (for example, 640 × 480), few pixels are displayed, each pixel is large, and the display effect is coarse; when the resolution is relatively high (for example, 1600 × 1200), many pixels are displayed, each pixel is small, and the display effect is fine.
The User Interface (UI) is a media interface for interaction and information exchange between an application program or an operating system and a user, and it realizes conversion between an internal form of information and a form acceptable to the user. A common presentation form of the user interface is a Graphical User Interface (GUI), which refers to a user interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
An input device is a device that inputs data and information to an electronic device; it is the bridge between a user and an electronic device, or between one electronic device and another. The input device is one of the main means for exchanging information between a user and an electronic device, and includes, but is not limited to, a keyboard, a mouse, a camera, a scanner, a writing tablet, a stylus pen, a remote control stick, a touch screen (touch panel), a voice input device, and so on. The input device inputs detected user operations to the electronic device to generate input data, which may be numeric or non-numeric, such as graphics, images, and sounds. The embodiment of the application does not limit the type of the input device or the input data it generates.
Fig. 1 illustrates a communication system 10 provided by an embodiment of the present application. Communication system 10 may include electronic device 100 and electronic device 200 with a first connection 105 established between electronic device 100 and electronic device 200. The electronic device 100 may be a smart phone, a tablet computer, a notebook computer, a desktop computer, or other types of electronic devices, which is not limited in this application. In some embodiments, as shown in fig. 1, the electronic device 100 may be a PC and the electronic device 200 may be a tablet computer.
In the communication system 10 shown in fig. 1, an electronic device 100, such as a PC, may include a display 101 and input devices such as a mouse 102 and a keyboard 103. The input devices, for example the mouse 102 and the keyboard 103, may be connected to the electronic device 100 through a wired connection, such as a Universal Serial Bus (USB) connection, or through a wireless connection, such as a Bluetooth (BT) connection or a wireless fidelity (Wi-Fi) connection; the connection manner of the input device is not particularly limited in the present application. After input devices such as the mouse 102 and the keyboard 103 access the electronic device 100, a user may input content to the electronic device 100 via the mouse 102 and the keyboard 103.
The electronic device 200 may include a screen 106, and the like, and the screen 106 may be used to receive a touch operation of a user and display a corresponding user interface.
A first connection 105 is established between the electronic device 100 and the electronic device 200. The first connection 105 may be a wired connection, such as a USB connection, or a wireless connection, such as a Bluetooth connection or a Wi-Fi connection; the type of the first connection is not limited in this embodiment of the application. The electronic device 100 and the electronic device 200 may each have a Bluetooth (BT) module and/or a wireless local area network (WLAN) module. The Bluetooth module may provide one or more Bluetooth communication solutions including classic Bluetooth (Bluetooth 2.1) or Bluetooth Low Energy (BLE), and the WLAN module may provide one or more solutions including wireless fidelity peer-to-peer (Wi-Fi P2P), wireless fidelity local area network (Wi-Fi LAN), or wireless fidelity software access point. In some embodiments, the first connection 105 may be a Wi-Fi P2P connection. Wi-Fi P2P refers to a connection that allows devices in a wireless network to connect to each other in a point-to-point fashion without going through a wireless router, and may also be referred to as wireless fidelity direct (Wi-Fi Direct). Devices that have established a Wi-Fi P2P connection can exchange data directly through Wi-Fi (they must be on the same frequency band) without connecting to a network or a hotspot, thereby realizing point-to-point communication, such as transmission of files, pictures, videos, and other data. Compared with Bluetooth, Wi-Fi P2P offers faster search and transmission speeds and a longer transmission distance.
The electronic device 100 and the electronic device 200 may transmit data over the first connection 105. For example, in some embodiments, the electronic device 100 may send coordinate data and input events of the mouse 102 and input events of the keyboard 103 to the electronic device 200 via the first connection 105. Upon receiving a message sent by the electronic device 100, the electronic device 200 may display a mouse pointer 110 on the screen 106 or respond correspondingly to the input events of the mouse 102 and the keyboard 103. In this way, a user can perform input operations on the electronic device 200 using the mouse 102 and the keyboard 103 of the electronic device 100; the electronic device 100 and the electronic device 200 share one set of input devices, and the electronic device 200 does not need to be additionally equipped with input devices.
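As an illustration (not part of the patent's claimed method), the data carried over the first connection 105 might be organized as a small tagged message structure. The following C++ sketch is a hypothetical wire format; every name in it is an assumption made for this example.

#include <cstdint>

// Hypothetical message format for input sharing, sent from the electronic
// device 100 to the electronic device 200 over the first connection 105.
enum class InputMsgType : uint8_t {
    ShowPointer,  // display the mouse pointer at coordinates (x, y)
    MouseMove,    // pointer position update
    MouseButton,  // mouse button press or release
    MouseWheel,   // wheel scroll delta
    KeyEvent,     // keyboard key press or release
};

struct InputMsg {
    InputMsgType type;
    int32_t x;        // e.g. padX, or a horizontal offset/scroll value
    int32_t y;        // e.g. padY, or a vertical offset/scroll value
    int32_t code;     // button identifier or mapped key value
    uint8_t pressed;  // 1 = press/down, 0 = release/up
};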
The electronic device 100 and the electronic device 200 may each run an operating system such as Windows or Android, or other types of operating systems; the operating systems of the electronic device 100 and the electronic device 200 may be the same or different, and the application is not limited thereto.
In some embodiments, as shown in fig. 1, the electronic device 100 displays an interface 104, the interface 104 is a desktop of the electronic device 100, and a control 109 and a control 111 may be displayed in the interface 104. The control 111 may indicate that the electronic device 100 has established a Bluetooth connection with another electronic device (here, the electronic device 200), and the control 109 may indicate that the electronic device 100 shares its input devices, such as the mouse 102 and the keyboard 103, with the connected device (here, the electronic device 200). The screen 106 of the electronic device 200 may display an interface 107, the interface 107 may be a desktop of the electronic device 200, and a control 108 and a control 112 may be displayed in the interface 107. The control 112 may indicate that the electronic device 200 has established a Bluetooth connection with another electronic device (here, the electronic device 100). The control 108 may indicate that the electronic device 200 has established a connection for sharing input devices; here, this means that the electronic device 200 may use the input devices, such as the mouse 102 and the keyboard 103, of the connected device (here, the electronic device 100). When the user moves the mouse cursor (which may also be referred to as a mouse pointer) to an edge of the interface 104 of the electronic device 100, the mouse pointer 110 may shuttle to an edge of the interface 107 of the electronic device 200; the mouse pointer 110 is then displayed on the screen 106 of the electronic device 200 and may shift position as the mouse 102 moves.
Fig. 2 shows a schematic structural diagram of the electronic device 100. Fig. 2 may also be a schematic structural diagram of the electronic device 200.
The electronic device 100 may be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a Personal Digital Assistant (PDA), an Augmented Reality (AR) device, a Virtual Reality (VR) device, an Artificial Intelligence (AI) device, a wearable device, a vehicle-mounted device, a smart home device, and/or a smart city device, which may be provided with or externally connected to an input device, such as a keyboard or a mouse.
The electronic device 100 may include a processor 210, an external memory interface 220, an internal memory 221, a Universal Serial Bus (USB) interface 230, a charging management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 270A, a receiver 270B, a microphone 270C, an earphone interface 270D, a sensor module 280, a key 290, a motor 291, an indicator 292, a camera 293, a display screen 294, a Subscriber Identification Module (SIM) card interface 295, and the like. The sensor module 280 may include a pressure sensor 280A, a gyroscope sensor 280B, an air pressure sensor 280C, a magnetic sensor 280D, an acceleration sensor 280E, a distance sensor 280F, a proximity light sensor 280G, a fingerprint sensor 280H, a temperature sensor 280J, a touch sensor 280K, an ambient light sensor 280L, a bone conduction sensor 280M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present invention does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 210 may include one or more processing units, such as: the processor 210 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), among others. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 210 for storing instructions and data. In some embodiments, the memory in the processor 210 is a cache memory. The memory may hold instructions or data that have just been used or recycled by processor 210. If the processor 210 needs to use the instruction or data again, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 210, thereby increasing the efficiency of the system.
In some embodiments, processor 210 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 210 may include multiple sets of I2C buses. The processor 210 may be coupled to the touch sensor 280K, the charger, the flash, the camera 293, etc. through different I2C bus interfaces. For example: the processor 210 may be coupled to the touch sensor 280K via an I2C interface, such that the processor 210 and the touch sensor 280K communicate via an I2C bus interface to implement the touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 210 may include multiple sets of I2S buses. Processor 210 may be coupled to audio module 270 via an I2S bus to enable communication between processor 210 and audio module 270. In some embodiments, the audio module 270 may communicate audio signals to the wireless communication module 260 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, audio module 270 and wireless communication module 260 may be coupled by a PCM bus interface. In some embodiments, the audio module 270 may also transmit audio signals to the wireless communication module 260 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 210 with the wireless communication module 260. For example: the processor 210 communicates with the bluetooth module in the wireless communication module 260 through the UART interface to implement the bluetooth function. In some embodiments, the audio module 270 may transmit the audio signal to the wireless communication module 260 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 210 with peripheral devices such as the display screen 294, the camera 293, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 210 and camera 293 communicate via a CSI interface to implement the capture functionality of electronic device 100. The processor 210 and the display screen 294 communicate through the DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect processor 210 with camera 293, display 294, wireless communication module 260, audio module 270, sensor module 280, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 230 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 230 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. It may also be used to connect a headset and play audio through the headset, or to connect a mouse and a keyboard, so that operation instructions are input through the mouse and the keyboard and character strings are input through the keyboard. The interface may also be used to connect other electronic devices, such as AR devices and the like.
It should be understood that the connection relationship between the modules according to the embodiment of the present invention is only illustrative, and is not limited to the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charge management module 240 is configured to receive a charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 240 may receive charging input from a wired charger via the USB interface 230. In some wireless charging embodiments, the charging management module 240 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 240 may also supply power to the electronic device through the power management module 241 while charging the battery 242.
The power management module 241 is used to connect the battery 242, the charging management module 240 and the processor 210. The power management module 241 receives input from the battery 242 and/or the charging management module 240, and provides power to the processor 210, the internal memory 221, the display 294, the camera 293, and the wireless communication module 260. The power management module 241 may also be used to monitor parameters such as battery capacity, battery cycle number, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 241 may also be disposed in the processor 210. In other embodiments, the power management module 241 and the charging management module 240 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 250 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 250 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 250 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 250 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 250 may be disposed in the processor 210. In some embodiments, at least some of the functional modules of the mobile communication module 250 may be disposed in the same device as at least some of the modules of the processor 210.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 270A, the receiver 270B, etc.) or displays images or video through the display screen 294. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be separate from the processor 210, and may be disposed in the same device as the mobile communication module 250 or other functional modules.
The wireless communication module 260 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 260 may be one or more devices integrating at least one communication processing module. The wireless communication module 260 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 210. The wireless communication module 260 may also receive a signal to be transmitted from the processor 210, frequency-modulate and amplify the signal, and convert the signal into electromagnetic waves via the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of the electronic device 100 is coupled to the mobile communication module 250 and antenna 2 is coupled to the wireless communication module 260, so that the electronic device 100 may communicate with networks and other devices via wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The electronic device 100 implements display functions via the GPU, the display screen 294, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 294 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 210 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 294 is used to display images, video, and the like. The display screen 294 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and the like. In some embodiments, the electronic device 100 may include 1 or N display screens 294, N being a positive integer greater than 1.
The electronic device 100 may implement a shooting function through the ISP, the camera 293, the video codec, the GPU, the display screen 294, the application processor, and the like.
The ISP is used to process the data fed back by the camera 293. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 293.
The camera 293 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, electronic device 100 may include 1 or N cameras 293, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
Internal memory 221 may include one or more Random Access Memories (RAMs) and one or more non-volatile memories (NVMs).
The random access memory may include static random-access memory (SRAM), dynamic random-access memory (DRAM), synchronous dynamic random-access memory (SDRAM), double data rate synchronous dynamic random-access memory (DDR SDRAM), such as fifth generation DDR SDRAM generally referred to as DDR5 SDRAM, and the like;
the nonvolatile memory may include a magnetic disk storage device, a flash memory (flash memory).
The FLASH memory may include NOR FLASH, NAND FLASH, 3D NAND FLASH, etc. according to the operation principle, may include single-level cells (SLC), multi-level cells (MLC), three-level cells (TLC), four-level cells (QLC), etc. according to the level order of the memory cells, and may include universal FLASH memory (UFS), embedded multimedia memory cards (eMMC), etc. according to the storage specification.
The random access memory may be read and written directly by the processor 210, may be used to store executable programs (e.g., machine instructions) of an operating system or other programs in operation, and may also be used to store data of users and applications, etc.
The nonvolatile memory may also store executable programs, store data of users and application programs, and the like, and may be loaded into the random access memory in advance for the processor 210 to directly read and write.
The external memory interface 220 may be used to connect an external nonvolatile memory to expand the storage capability of the electronic device 100. The external non-volatile memory communicates with the processor 210 through the external memory interface 220 to perform data storage functions. For example, files such as music, video, etc. are saved in an external nonvolatile memory.
Electronic device 100 may implement audio functions via audio module 270, speaker 270A, receiver 270B, microphone 270C, headset interface 270D, and an application processor, among other things. Such as music playing, recording, etc.
Audio module 270 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. Audio module 270 may also be used to encode and decode audio signals. In some embodiments, the audio module 270 may be disposed in the processor 210, or some functional modules of the audio module 270 may be disposed in the processor 210.
The speaker 270A, also called a "horn", is used to convert an audio electrical signal into an acoustic signal. The electronic apparatus 100 can listen to music through the speaker 270A or listen to a hands-free call.
The receiver 270B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 270B close to the ear of the person.
The microphone 270C is used to convert acoustic signals into electrical signals. When making a call or sending voice information, the user can input a voice signal to the microphone 270C by speaking with the mouth close to the microphone 270C. The electronic device 100 may be provided with at least one microphone 270C. In other embodiments, the electronic device 100 may be provided with two microphones 270C to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further include three, four, or more microphones 270C to collect sound signals, reduce noise, identify sound sources, implement directional recording functions, and so on.
The headphone interface 270D is used to connect wired headphones. The headphone interface 270D may be the USB interface 230, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a cellular telecommunications industry association of the USA (CTIA) standard interface.
The pressure sensor 280A is used for sensing a pressure signal, which can be converted into an electrical signal. In some embodiments, the pressure sensor 280A may be disposed on the display screen 294. The pressure sensor 280A can be of a wide variety of types, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 280A, the capacitance between the electrodes changes. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 294, the electronic apparatus 100 detects the intensity of the touch operation according to the pressure sensor 280A. The electronic apparatus 100 may also calculate the touched position from the detection signal of the pressure sensor 280A. In some embodiments, the touch operations that are applied to the same touch position but different touch operation intensities may correspond to different operation instructions. For example: and when the touch operation with the touch operation intensity smaller than the first pressure threshold value acts on the short message application icon, executing an instruction for viewing the short message. And when the touch operation with the touch operation intensity larger than or equal to the first pressure threshold value acts on the short message application icon, executing an instruction of newly building the short message.
The gyro sensor 280B may be used to determine the motion pose of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., x, y, and z axes) may be determined by gyroscope sensor 280B. The gyro sensor 280B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 280B detects a shake angle of the electronic device 100, calculates a distance to be compensated for by the lens module according to the shake angle, and allows the lens to counteract the shake of the electronic device 100 through a reverse movement, thereby achieving anti-shake. The gyro sensor 280B may also be used for navigation, somatosensory gaming scenes.
The air pressure sensor 280C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by barometric pressure sensor 280C.
The magnetic sensor 280D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip holster using the magnetic sensor 280D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip cover according to the magnetic sensor 280D. Features such as automatic unlocking upon flip-open may then be set according to the detected open or closed state of the holster or the flip cover.
The acceleration sensor 280E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity can be detected when the electronic device 100 is stationary. The method can also be used for recognizing the posture of the electronic equipment, and is applied to horizontal and vertical screen switching, pedometers and other applications.
The distance sensor 280F is used to measure distance. The electronic device 100 may measure distance by infrared or laser. In some embodiments, in a shooting scenario, the electronic device 100 may use the distance sensor 280F to measure distance to achieve fast focusing.
The proximity light sensor 280G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light outward through the light emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100; when insufficient reflected light is detected, the electronic device 100 may determine that there is no object nearby. The electronic device 100 can use the proximity light sensor 280G to detect that the user is holding the electronic device 100 close to the ear for a call, so as to automatically turn off the screen to save power. The proximity light sensor 280G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 280L is used to sense the ambient light level. The electronic device 100 may adaptively adjust the brightness of the display screen 294 based on the perceived ambient light level. The ambient light sensor 280L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 280L may also cooperate with the proximity light sensor 280G to detect whether the electronic device 100 is in a pocket to prevent inadvertent contact.
The fingerprint sensor 280H is used to collect a fingerprint. The electronic device 100 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, photograph the fingerprint, answer an incoming call with the fingerprint, and so on.
The temperature sensor 280J is used to detect temperature. In some embodiments, electronic device 100 implements a temperature processing strategy using the temperature detected by temperature sensor 280J. For example, when the temperature reported by the temperature sensor 280J exceeds the threshold, the electronic device 100 performs a reduction in performance of a processor located near the temperature sensor 280J, so as to reduce power consumption and implement thermal protection. In other embodiments, electronic device 100 heats battery 242 when the temperature is below another threshold to avoid an abnormal shutdown of electronic device 100 due to low temperatures. In other embodiments, when the temperature is below a further threshold, the electronic device 100 performs a boost on the output voltage of the battery 242 to avoid abnormal shutdown due to low temperature.
The touch sensor 280K is also referred to as a "touch device". The touch sensor 280K may be disposed on the display screen 294, and the touch sensor 280K and the display screen 294 form a touch screen, which is also called a "touch screen". The touch sensor 280K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display screen 294. In other embodiments, the touch sensor 280K can be disposed on a surface of the electronic device 100, different from the position of the display screen 294.
The bone conduction sensor 280M may acquire a vibration signal. In some embodiments, the bone conduction transducer 280M may acquire a vibration signal of the human voice vibrating a bone mass. The bone conduction sensor 280M may also contact the pulse of the human body to receive the blood pressure pulsation signal. In some embodiments, bone conduction sensor 280M may also be disposed in a headset, integrated into a bone conduction headset. The audio module 270 may analyze a voice signal based on the vibration signal of the bone mass vibrated by the sound part acquired by the bone conduction sensor 280M, so as to implement a voice function. The application processor can analyze heart rate information based on the blood pressure pulsation signal acquired by the bone conduction sensor 280M, so as to realize a heart rate detection function.
The keys 290 include a power-on key, a volume key, etc. The keys 290 may be mechanical keys. Or may be touch keys. The electronic apparatus 100 may receive a key input, and generate a key signal input related to user setting and function control of the electronic apparatus 100.
The motor 291 may generate a vibration cue. The motor 291 can be used for incoming call vibration prompting, and can also be used for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 291 may also respond to different vibration feedback effects for touch operations on different areas of the display 294. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 292 may be an indicator light that may be used to indicate a state of charge, a change in charge, or may be used to indicate a message, missed call, notification, etc.
The SIM card interface 295 is used to connect a SIM card. The SIM card can be attached to and detached from the electronic device 100 by being inserted into or pulled out of the SIM card interface 295. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 295 may support a Nano-SIM card, a Micro-SIM card, a SIM card, etc. Multiple cards can be inserted into the same SIM card interface 295 at the same time; the types of the cards may be the same or different. The SIM card interface 295 may also be compatible with different types of SIM cards and with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, namely an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
A system software architecture provided in the embodiments of the present application is described below. Taking the communication system 10 composed of the electronic device 100 and the electronic device 200 of the present invention as an example, a system software structure provided in the embodiment of the present application is exemplarily explained.
Fig. 3 is a block diagram of a system software architecture of the communication system 10 according to an embodiment of the present application.
As shown in fig. 3, the software architecture of the communication system 10 includes the electronic device 100 and the electronic device 200, a first connection may be established between the electronic device 100 and the electronic device 200 and perform communication through the first connection, where the first connection may be a bluetooth connection, a Wi-Fi connection, and the like, and the connection manner of the first connection is not limited in this embodiment.
In some embodiments, the software architecture of communication system 10 may be divided into an application & kernel (application & kernel) layer and a device (equipment) layer. Wherein the application layer may comprise a series of application packages. The system software architecture is described herein by taking the electronic device 100 as a PC and the electronic device 200 as a tablet PC. The input device of the PC may be a mouse 102, a keyboard 103, etc., and the tablet PC may have a touch screen.
The device layer of the electronic device 100 may include a display 101, a mouse 102, a keyboard 103, and other input and output devices. Among them:
The display 101 is an output device for displaying images, videos, and the like. The display 101 includes a display panel. The electronic device 100 may include 1 or N displays 101, N being a positive integer greater than 1.
The mouse 102 is an input device and serves as an indicator for positioning the vertical and horizontal coordinates in the display system of the electronic device; using a mouse can make operating the electronic device more convenient. Mouse types may include rolling-ball mice, optical mice, wireless mice, and the like; in this application, the mouse may be extended to any device capable of generating a cursor and performing point selection.
The keyboard 103 is an input device through which a user can input characters, numerals, punctuation marks, control instructions, etc. to the electronic device.
The application & kernel layer of the electronic device 100 may include a display driver 311, a mouse driver 312, a keyboard driver 313, and the like. A driver can communicate with a hardware device through the bus, control the hardware to enter various working states, and read the values of device-related registers, thereby obtaining the state of the device. For example, user operation events such as mouse input, keyboard input, and rotating the electronic device can be acquired through the drivers and converted into data.
The display driver 311 may be a program for driving a display.
The mouse driver 312 may be responsible for the following three things: first, displaying the mouse cursor on the screen and maintaining its movement; second, providing applications with the state of the mouse, including the position of the mouse cursor on the screen and whether each mouse button is pressed or released; and third, providing applications with some auxiliary functions for mouse operations.
The keyboard driver 313 is an interrupt routine that generates a scan code according to a pressed key, and then obtains an American Standard Code for Information Interchange (ASCII) code from the scan code, and then puts the ASCII code into a buffer queue for output or other calls.
The application & kernel layer of the electronic device 100 may further include a virtual screen management module 314, an input event generation module 315, an input event transmission module 316, and the like.
The virtual screen management module 314 may be used to create a virtual screen. In some embodiments, for example in the Windows 10 operating system, the virtual screen may be created by creating an IDDCX_MONITOR object; the virtual screen may have the same resolution as the display screen of the electronic device 200 and may be invisible to the user. The electronic device 100 creates the virtual screen so that the mouse pointer can pass beyond the edge of the display screen of the electronic device 100, and the coordinates of the mouse pointer on the virtual screen can be sent directly to the electronic device 200 without complicated coordinate conversion. If no virtual screen were created and no additional display screen were externally connected, the mouse pointer would be confined within the edges of the display screen of the electronic device 100, and the pointer could not jump between different display screens (including the virtual screen). After the mouse pointer moves onto the virtual screen, because the resolution of the virtual screen is the same as that of the display screen of the electronic device 200, the coordinates of the mouse pointer on the virtual screen can be sent directly to the electronic device 200 without complex coordinate conversion, which is simple and reduces CPU resource consumption.
The input event generating module 315 may be configured to convert an acquired input event of the input device into a corresponding input event that can act on the electronic device 200. For example, when detecting that the mouse pointer reaches the edge of the display screen of the electronic device 100, the electronic device 100 may calculate the starting position at which the mouse pointer will be displayed on the display screen of the electronic device 200 and send it to the electronic device 200 through the input event sending module 316; the electronic device 200 receives the message and displays the mouse pointer at the corresponding position, forming the visual effect of the mouse pointer shuttling from the electronic device 100 to the electronic device 200. After the mouse pointer has moved onto the display screen of the electronic device 200, input events of the input devices (e.g., mouse, keyboard, tablet) of the electronic device 100 are all captured, and the corresponding input events that can act on the electronic device 200 are generated according to the mapping relationships in a first mapping table and sent to the electronic device 200. Input events include, but are not limited to, mouse movement events, mouse click events, mouse wheel scroll events, keyboard input events, remote joystick movement events, voice input events, and the like. For example, through the first mapping table, the electronic device 100 may map input events of the Windows system into input events of the Android system, and the Android input events can act on the electronic device 200. For example, an event of clicking the left mouse button in the Windows system may be mapped to a tap (single-click) event in the Android system, and an event of clicking the right mouse button in the Windows system may be mapped to a long-press event in the Android system. A key value of a first key in the Windows system may be mapped to a corresponding key value in the Android system; for the same character, for example "a", the key code in the Windows system may differ from the key code in the Android system.
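By way of a hedged sketch, such a first mapping table could be represented in code as a lookup from source events to target events. The event identifiers below are placeholders invented for this example; a real implementation would use the actual Windows input messages and Android input actions.

#include <cstdint>
#include <unordered_map>

// Hypothetical event identifiers; real code would substitute the actual
// Windows messages (e.g. a left-button-down message) and Android actions.
enum class PcEvent : uint32_t { LeftClick, RightClick, WheelScroll, KeyA };
enum class PadEvent : uint32_t { Tap, LongPress, Scroll, KeyA };

// First mapping table: input events on the electronic device 100 mapped to
// input events that can act on the electronic device 200.
const std::unordered_map<PcEvent, PadEvent> kFirstMappingTable = {
    {PcEvent::LeftClick,   PadEvent::Tap},       // left click  -> tap
    {PcEvent::RightClick,  PadEvent::LongPress}, // right click -> long press
    {PcEvent::WheelScroll, PadEvent::Scroll},    // wheel       -> scroll
    {PcEvent::KeyA,        PadEvent::KeyA},      // key codes may differ per system
};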
The input event transmission module 316 may be used for the electronic device 100 to transmit an input event or the like to the electronic device 200 through the first connection.
The device layer of the electronic device 200 may include a touch sensor 321, a display screen 322, and the like. Among them:
the touch sensor 321 is also referred to as a "touch panel". The touch sensor 321 may be disposed on the display screen 322, and the touch sensor 321 and the display screen 322 form a touch screen, which is also called a "touch screen". The touch sensor 321 is used to detect a touch operation applied thereto or nearby. The touch sensor 321 can pass the detected touch operation to the application processor to determine the touch event type.
The display screen 322 is an output device that can be used to display images and colors. The display screen 322 may provide visual output related to touch operations.
The application & kernel layer of the electronic device 200 may include a touch sensor driver 323, a display screen driver 324, an input event receiving module 325, and an input event response module 326. The touch sensor driver 323 and the display screen driver 324 are the programs that drive the hardware devices, namely the touch sensor and the display screen.
The input event receiving module 325 may be configured to monitor the communication interface, and obtain a message sent by the electronic device 100 through the first connection, where the message includes, but is not limited to, an instruction to display a mouse pointer, absolute coordinates of the mouse pointer, offset coordinates of the mouse pointer, a pressing event of a mouse button, a scrolling event of a mouse wheel, a pressing event of a keyboard button, a key value corresponding to the keyboard button, and the like.
The input event response module 326 may be used to process an input event after the input event receiving module 325 receives a message from the electronic device 100. For example, when the input event receiving module 325 receives a message from the electronic device 100 carrying the mouse pointer display coordinates (padX, padY), the input event response module 326 may, in response to the message, draw the mouse pointer at the coordinates (padX, padY) and display it on the display screen 322. For another example, after the input event receiving module 325 receives an input event such as a mouse movement, a mouse click, a mouse wheel scroll, a keyboard input, or a joystick movement from the electronic device 100, the input event response module 326 may process the input event.
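A minimal sketch of the dispatch in the input event response module 326 might look as follows; the message layout and helper functions here are assumptions made for illustration, not the patent's actual implementation.

#include <cstdint>

// Assumed helpers for the PAD-side drawing/injection layer.
void DrawPointerAt(int32_t x, int32_t y);   // draw pointer on display screen 322
void InjectTap(int32_t x, int32_t y);       // act as a tap on the PAD
void InjectScroll(int32_t dx, int32_t dy);  // act as a scroll on the PAD

enum class MsgType : uint8_t { ShowPointer, Tap, Scroll };

// Called by the input event receiving module 325 after parsing a message
// received from the electronic device 100 over the first connection.
void OnMessage(MsgType type, int32_t a, int32_t b) {
    switch (type) {
        case MsgType::ShowPointer: DrawPointerAt(a, b); break; // (padX, padY)
        case MsgType::Tap:         InjectTap(a, b);     break;
        case MsgType::Scroll:      InjectScroll(a, b);  break;
    }
}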
For specific implementation, reference may be made to the following description, which is not repeated herein.
The above description of the software architecture of the communication system 10 is only an example, and it should be understood that the software architecture illustrated in the embodiment of the present invention is not specifically limited to the present application. In other embodiments of the present application, communication system 10 may include more or fewer modules than shown, or combine certain modules, or split certain modules, or a different architectural arrangement. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
An internal implementation method provided by the embodiment of the present application is described below.
In this embodiment, the sharing of input devices such as a mouse and a keyboard between a PC running the Windows operating system and a tablet computer running the Android operating system is taken as an example for description. The example provided in this embodiment does not limit the present application; the method for sharing an input device provided in this application may also be implemented between a plurality of electronic devices of different types, such as a mobile phone, a notebook computer, and a tablet computer. The embodiment of the application also does not limit the types of the operating systems and the input devices carried by the different electronic devices; for example, the input device may be a handwriting tablet, a voice input device, a camera, and the like.
In this application, the directional concepts of up, down, left, and right are relative. For example, they may be defined by the direction of gravity when the electronic device is in an upright state. Alternatively, when the user's eyes face a display interface laid out in the forward direction, "left" is the left-hand side from the user's perspective, "right" is the right-hand side, "up" is the direction from the eyes toward the top of the head, and "down" is the direction from the eyes toward the torso; a display interface laid out in the forward direction means that the characters, symbols, icons, and the like of the display interface are arranged in the direction that best suits the user's reading. The above is merely an example and not a limitation, as orientation is a relative concept.
In this embodiment, the PC establishes the first connection with the PAD. The PC has a display screen for displaying a user interface, and also has input devices such as a mouse and a keyboard. The PC can create a virtual screen invisible to the user, and the virtual screen has the same size and resolution as the display screen of the PAD. When the mouse pointer reaches the edge of the PC display screen, it can pass beyond the PC display screen and move onto the virtual screen; meanwhile, the PAD can draw the mouse pointer at the position on the PAD display screen corresponding to its position on the PC virtual screen. While the mouse pointer is displayed on the PAD, when the PC receives an input event from the mouse or the keyboard, it can map the event to a corresponding input event for the PAD and transmit it to the PAD. After receiving the input event, the PAD responds to it correspondingly. In this way, the user can make inputs on the PAD using the input devices of the PC.
The specific implementation of this embodiment can be divided into three parts: (I) creation of the virtual screen; (II) shuttling of the mouse pointer; and (III) response to input events.
Creation of virtual screen
Specifically, the PC may create one or more virtual screens through application programming interface (API) functions provided by the Windows operating system. The created virtual screen has the same resolution as the PAD display screen and is not visible to the user. Of course, if there is more than one electronic device that needs to share the input devices, a plurality of virtual screens corresponding to those electronic devices may be created, with the screen resolution of each virtual screen being the same as that of the corresponding electronic device. Here, the PC creating one virtual screen with the same screen resolution as the PAD is taken as an example. In some embodiments, for example in the Windows operating system, the virtual screen may be created by creating an IDDCX_MONITOR object; specifically, the virtual screen may be created through the following steps:
(1) The specification of the virtual screen is defined.
First, initialization is performed and relevant parameters are configured: initialization is done with functions such as IDD_CX_CLIENT_CONFIG_INIT, callback functions are set, and display modes, such as resolution and refresh rate parameters, are configured. For example, the resolution is set to be the same as the resolution of the PAD display screen acquired by the PC.
(2) A virtual screen is created.
After initialization is complete, the IddCxMonitorCreate function may be used to create an IDDCX_MONITOR object, i.e., a virtual screen.
(3) A virtual screen is inserted.
After the IDDCX_MONITOR object is created successfully, the IddCxMonitorArrival function is called to inform the system that this virtual screen has been plugged in. When the system returns a successfully-inserted message, it indicates that the virtual screen has been created successfully and is available for use. At this point, the virtual display screen can be detected in the display settings of the Windows system.
If this virtual screen is to be deleted or logged out of the system later, the IddCxMonitorDeparture function may be called. "Inserting" a virtual display corresponds to successfully creating an IDDCX_MONITOR object in the system, and "deleting" the virtual display corresponds to destroying that IDDCX_MONITOR object.
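The following condensed C++ sketch shows steps (2) and (3) in code, following the structure of Microsoft's public IddCx indirect-display sample driver; the EDID block (which would encode the PAD's resolution obtained in step (1)), the WDF object attributes, and most error handling are elided, so this is an outline under stated assumptions rather than a complete driver.

#include <iddcx.h>

// Sketch: create the IDDCX_MONITOR object and report it as plugged in.
// Runs inside an indirect display driver; Adapter is the IDDCX_ADAPTER
// obtained during driver initialization, Edid describes the virtual screen.
NTSTATUS CreateAndPlugInVirtualMonitor(IDDCX_ADAPTER Adapter,
                                       BYTE* Edid, UINT EdidSize,
                                       UINT ConnectorIndex)
{
    IDDCX_MONITOR_INFO MonitorInfo = {};
    MonitorInfo.Size = sizeof(MonitorInfo);
    MonitorInfo.MonitorType = DISPLAYCONFIG_OUTPUT_TECHNOLOGY_HDMI;
    MonitorInfo.ConnectorIndex = ConnectorIndex;
    MonitorInfo.MonitorDescription.Size = sizeof(MonitorInfo.MonitorDescription);
    MonitorInfo.MonitorDescription.Type = IDDCX_MONITOR_DESCRIPTION_TYPE_EDID;
    MonitorInfo.MonitorDescription.DataSize = EdidSize;
    MonitorInfo.MonitorDescription.pData = Edid;

    IDARG_IN_MONITORCREATE MonitorCreate = {};
    MonitorCreate.pMonitorInfo = &MonitorInfo;
    // A real driver also sets MonitorCreate.ObjectAttributes (WDF context).

    // Step (2): create the IDDCX_MONITOR object, i.e., the virtual screen.
    IDARG_OUT_MONITORCREATE MonitorCreateOut = {};
    NTSTATUS Status = IddCxMonitorCreate(Adapter, &MonitorCreate, &MonitorCreateOut);
    if (!NT_SUCCESS(Status)) return Status;

    // Step (3): inform the system that the virtual screen has been plugged in.
    IDARG_OUT_MONITORARRIVAL ArrivalOut = {};
    return IddCxMonitorArrival(MonitorCreateOut.MonitorObject, &ArrivalOut);
}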
At this point, the PC has created a virtual screen with the same resolution as the PAD display screen. In addition, the relative position of the PC display screen and the virtual screen can be set. For example, the virtual screen may be located to the right of the PC display screen, that is, the right edge of the PC display screen is joined to the left edge of the virtual screen; the mouse pointer can then pass out through the right edge of the PC display screen and enter at the corresponding position on the left edge of the virtual screen. If the virtual screen is arranged below the PC display screen, that is, the lower edge of the PC display screen is joined to the upper edge of the virtual screen, the mouse pointer can pass out through the lower edge of the PC display screen and enter at the corresponding position on the upper edge of the virtual screen. The other edges correspond in the same way. If a plurality of virtual screens are created, then generally, to avoid conflicts, the virtual screens may be joined to different edges of the PC display screen; for example, if virtual screen 1 is set to be on the right side of the PC display screen, virtual screen 2 may be set to be on the lower side of the PC display screen.
Shuttling of mouse pointer
In terms of visual effect, the mouse pointer can "shuttle" from the PC display screen to the PAD display screen. The mouse pointer "shuttling" from the right edge of the PC display screen to the left edge of the PAD is taken as an example here. The PC and the PAD are both placed in landscape orientation, that is, the left and right edges of the PC and the PAD are the short edges of the devices. Referring to fig. 4A, the process may specifically include the following steps:
(1) Detect that the mouse pointer reaches the edge of the PC display screen.
Specifically, the PC may use the GetPhysicalCursorPos function of the Windows system to acquire the absolute coordinates (pcX, pcY) of the position of the mouse pointer; the maximum values of the absolute coordinates (pcX, pcY) are bounded by the display resolution of the PC display screen. For example, if the current display resolution (screenWidth × screenHeight) of the PC display screen is 1920 × 1080 (pixels), the absolute coordinate of the upper left corner of the display screen may be the origin (0, 0), the absolute coordinate of the lower right corner may be (1920, 1080), the absolute coordinate of the lower left corner may be (0, 1080), and the absolute coordinate of the upper right corner may be (1920, 0). The absolute coordinates of any position on the PC display screen fall within the range (0, 0) to (1920, 1080). The display resolution of the PC display screen can be obtained through the GetSystemMetrics function.
When the mouse moves a certain distance, the PC may obtain the offset of the mouse pointer corresponding to the mouse movement through the Raw Input interface; the offset is the difference between the movement start position and the movement end position. The offset is a vector, including direction and distance, and may be represented by offset coordinates (relX, relY).
Here, taking detection that the mouse pointer reaches the right edge of the PC display screen as an example: if (pcX + relX) > screenWidth, the result that the mouse pointer has reached the right edge of the PC desktop is returned. If (pcY + relY) > screenHeight, the result that the mouse pointer has reached the lower edge of the PC desktop is returned. The cases in which the mouse pointer reaches the upper edge and the left edge of the PC display screen can be deduced by analogy and are not described again.
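A minimal user-mode sketch of this edge test is given below; it assumes the program has registered for Raw Input mouse data so that (relX, relY) is available, and the helper name DetectEdge is illustrative.

#include <windows.h>

enum class Edge { None, Left, Right, Top, Bottom };

// relX/relY come from a WM_INPUT handler, roughly:
//   GetRawInputData(..., RID_INPUT, &raw, &size, sizeof(RAWINPUTHEADER));
//   LONG relX = raw.data.mouse.lLastX, relY = raw.data.mouse.lLastY;
Edge DetectEdge(LONG relX, LONG relY)
{
    POINT pt = {};
    GetPhysicalCursorPos(&pt);  // absolute position (pcX, pcY)
    const int screenWidth  = GetSystemMetrics(SM_CXSCREEN);
    const int screenHeight = GetSystemMetrics(SM_CYSCREEN);

    if (pt.x + relX > screenWidth)  return Edge::Right;
    if (pt.x + relX < 0)            return Edge::Left;
    if (pt.y + relY > screenHeight) return Edge::Bottom;
    if (pt.y + relY < 0)            return Edge::Top;
    return Edge::None;
}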
(2) Calculate the starting position of the mouse pointer on the PAD display screen.
When the PC detects that the mouse pointer has reached the edge of the PC display screen, the PC calculates the coordinates of the starting point at which the mouse pointer will be displayed on the PAD display screen. The PC may calculate the starting position (padX, padY) of the mouse pointer on the PAD display screen based on the absolute coordinates (pcX, pcY) at which the mouse pointer currently reaches the edge of the PC display screen, the display resolution (screenWidth, screenHeight) of the PC display screen, and the display resolution (remoteWidth, remoteHeight) of the PAD display screen.
Taking the upper left corner of the PAD display screen as the origin of absolute coordinates here, in some embodiments, when the mouse pointer moves to the right edge of the PC display screen at (pcXmax, pcY), where pcXmax indicates that the abscissa is at its maximum value, i.e., screenWidth, the starting position (padX, padY) of the mouse pointer on the PAD display screen is calculated as:
padX = 0, padY = pcY × remoteHeight / screenHeight
According to the above formula, the abscissa of the initial position at which the mouse pointer appears on the PAD display screen is 0, and its ordinate is the PC ordinate scaled by the ratio of the PAD display resolution to the PC display resolution; that is, the initial position at which the mouse pointer appears is on the left edge of the PAD display screen. For example, if the mouse pointer leaves from the middle of the right edge of the PC display screen, it appears at the middle of the left edge of the PAD display screen, so that the whole shuttle of the mouse pointer feels continuous to the user and the user experience is better.
Similarly, when the mouse pointer moves to the left edge of the PC display screen at (pcXmin, pcY), where pcXmin indicates that the abscissa is at its minimum value, i.e., 0, the starting position (padX, padY) of the mouse pointer on the PAD display screen is calculated as:
padX = remoteWidth, padY = pcY × remoteHeight / screenHeight
That is, the starting position at which the mouse pointer appears on the PAD display screen is a position on the right edge of the PAD display screen.
Similarly, when the mouse pointer moves to the upper edge of the PC display screen at (pcX, pcYmin), where pcYmin indicates that the ordinate is at its minimum value, i.e., 0, the starting position (padX, padY) of the mouse pointer on the PAD display screen is calculated as:
padX = pcX × remoteWidth / screenWidth, padY = remoteHeight
That is, the starting position at which the mouse pointer appears on the PAD display screen is a position on the lower edge of the PAD display screen.
Similarly, when the mouse pointer moves to the lower edge of the PC display screen at (pcX, pcYmax), where pcYmax indicates that the ordinate is at its maximum value, i.e., screenHeight, the starting position (padX, padY) of the mouse pointer on the PAD display screen is calculated as:
padX = pcX × remoteWidth / screenWidth, padY = 0
That is, the starting position at which the mouse pointer appears on the PAD display screen is a position on the upper edge of the PAD display screen.
The above method for calculating the starting position of the mouse pointer on the PAD display screen is only an example and does not limit the present application in any way.
In other embodiments, when the mouse pointer is detected to move off any edge of the PC display screen, the mouse pointer may appear at a fixed coordinate of the PAD display screen, e.g., at the origin, i.e., (padX, padY) is (0, 0). The embodiment of the application does not limit the position of the mouse pointer appearing in the PAD display screen.
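The four formulas above can be collected into a single mapping routine. The following C sketch uses the patent's variable names; the function and type names are illustrative assumptions.

```c
/* Edge codes as in the earlier edge-test sketch. */
typedef enum { EDGE_LEFT, EDGE_RIGHT, EDGE_TOP, EDGE_BOTTOM } Edge;
typedef struct { long x, y; } Pos;

/* Map the departure point (pcX, pcY) on the PC edge to the starting
 * position (padX, padY) on the PAD display screen. */
Pos pad_start(Edge edge, long pcX, long pcY,
              long screenWidth, long screenHeight,
              long remoteWidth, long remoteHeight)
{
    Pos p = {0, 0};
    switch (edge) {
    case EDGE_RIGHT:          /* appear on the PAD's left edge  */
        p.x = 0;
        p.y = pcY * remoteHeight / screenHeight;
        break;
    case EDGE_LEFT:           /* appear on the PAD's right edge */
        p.x = remoteWidth;
        p.y = pcY * remoteHeight / screenHeight;
        break;
    case EDGE_TOP:            /* appear on the PAD's lower edge */
        p.x = pcX * remoteWidth / screenWidth;
        p.y = remoteHeight;
        break;
    case EDGE_BOTTOM:         /* appear on the PAD's upper edge */
        p.x = pcX * remoteWidth / screenWidth;
        p.y = 0;
        break;
    default:
        break;
    }
    return p;
}
```

With the numbers used in the example below (a 2560 × 1600 PC, a 1280 × 800 PAD, departure at (2560, 1000) on the right edge), this yields (0, 500), consistent with the text.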
(3) Displaying the mouse pointer on the PAD display screen.
After the PC calculates the starting position (padX, padY) at which the mouse pointer will be displayed on the PAD display screen, the PC may send a message to the PAD carrying the coordinates (padX, padY), informing the PAD to display the mouse pointer at those coordinates. Meanwhile, the mouse pointer of the PC moves to coordinates (vmX, vmY) on the PC's virtual screen; the coordinates (vmX, vmY) correspond one-to-one to the coordinates (padX, padY), and the coordinate values are the same.
For example, suppose the display resolution of the PC display screen is 2560 × 1600 (pixels), the display resolution of the PAD display screen is 1280 × 800 (pixels), and the display resolution of the virtual screen created by the PC is the same as that of the PAD, also 1280 × 800 (pixels). When the mouse pointer reaches the coordinates (pcX, pcY) of (2560, 1000) on the PC's right edge, the PC calculates that the starting position (padX, padY) of the mouse pointer on the PAD display screen may be (0, 500). Likewise, the coordinates (vmX, vmY) to which the mouse pointer moves on the virtual screen are also (0, 500). The PC's mouse pointer is then moved from the PC's right edge (2560, 1000) to the virtual screen's left edge (0, 500), while the PC sends a message carrying the coordinates (0, 500) to the PAD, informing the PAD to display the mouse pointer at (0, 500). In response to the PC's notification, the PAD displays a mouse pointer at (0, 500). Thus, the mouse pointer appears to shuttle from the display screen of the PC to the display screen of the PAD. This example of the mouse pointer "shuttling" from the PC to the PAD does not set any limit on the present embodiment.
The PC creates the virtual screen to facilitate the jump display of the mouse pointer across the edge of the PC display screen onto a different display screen (here, the virtual screen). If no virtual screen were created and no new display screen attached, the mouse pointer would be confined within the edges of the PC display screen. After the mouse pointer moves onto the virtual screen, because the resolution of the virtual screen is the same as that of the PAD display screen, the coordinates of the mouse pointer on the virtual screen can be sent directly to the PAD without complex coordinate conversion, which is simple and convenient and reduces the consumption of CPU resources.
(III) Response to input events
After the mouse pointer moves to the PAD display screen, input events from the PC's input devices (e.g., mouse, keyboard, tablet) are captured and sent to the PAD, and the PAD may respond to them. Meanwhile, input events are masked on the PC side, i.e., the PC does not respond to the input devices. Referring to fig. 4B, the following steps may be included:
(1) Acquiring an input event.
When an input device such as a mouse, keyboard, voice input device, tablet, or camera detects an input operation made by the user, the PC may capture the input operation, such as movement of the mouse, a mouse click, or the tapping of a keyboard key, and generate a corresponding input event. For example, a PC running the Windows system may use the RawInput function to obtain the mouse's MOUSE_MOVE event and generate the corresponding offset coordinates (relX, relY) of the mouse pointer. That is, the PC can acquire the distance and direction of the mouse movement and convert them into the offset direction and offset distance of the mouse pointer. If MOUSE_MOVE is specified as the MOUSE_MOVE_RELATIVE value, the offset coordinates (relX, relY) are offsets relative to the last position of the mouse; if MOUSE_MOVE is specified as the MOUSE_MOVE_ABSOLUTE value, the offset coordinates (relX, relY) are offsets relative to a fixed position. In either case, the offset coordinates (relX, relY) represent movement data. With respect to the screen coordinate axes, relY is negative when the mouse moves upward and positive when it moves downward, and relX is negative when the mouse moves leftward and positive when it moves rightward.
In addition, the PC can also acquire mouse button press events and mouse wheel scroll events. For example, in the Windows system, an RI_MOUSE_LEFT_BUTTON_DOWN value can indicate that the left mouse button is pressed, and an RI_MOUSE_LEFT_BUTTON_UP value can indicate that it is released; an RI_MOUSE_MIDDLE_BUTTON_DOWN value can indicate that the middle mouse button is pressed, and an RI_MOUSE_MIDDLE_BUTTON_UP value can indicate that it is released; an RI_MOUSE_RIGHT_BUTTON_DOWN value can indicate that the right mouse button is pressed, and an RI_MOUSE_RIGHT_BUTTON_UP value can indicate that it is released; and so on. RI_MOUSE_WHEEL can indicate input from the mouse wheel, with positive wheel-increment values indicating forward rotation and negative values indicating backward rotation. RI_MOUSE_HWHEEL can indicate input from a horizontal mouse wheel, with positive wheel-increment values indicating rotation to the right and negative values indicating rotation to the left.
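On Windows, these movement offsets and button flags arrive in the RAWMOUSE structure of a WM_INPUT message. The following C sketch shows how they might be extracted; registration of the raw input device and error handling are omitted, and the handler name is an illustrative assumption.

```c
#include <windows.h>

LRESULT on_wm_input(LPARAM lParam)
{
    RAWINPUT raw;
    UINT size = sizeof(raw);
    GetRawInputData((HRAWINPUT)lParam, RID_INPUT, &raw, &size,
                    sizeof(RAWINPUTHEADER));

    if (raw.header.dwType == RIM_TYPEMOUSE) {
        const RAWMOUSE *m = &raw.data.mouse;

        /* MOUSE_MOVE_RELATIVE is 0, so relative motion is the default. */
        if (!(m->usFlags & MOUSE_MOVE_ABSOLUTE)) {
            LONG relX = m->lLastX;   /* negative = left, positive = right */
            LONG relY = m->lLastY;   /* negative = up,   positive = down  */
            (void)relX; (void)relY;  /* feed into the edge test sketched earlier */
        }
        if (m->usButtonFlags & RI_MOUSE_LEFT_BUTTON_DOWN)  { /* left pressed  */ }
        if (m->usButtonFlags & RI_MOUSE_LEFT_BUTTON_UP)    { /* left released */ }
        if (m->usButtonFlags & RI_MOUSE_WHEEL) {
            SHORT delta = (SHORT)m->usButtonData;  /* + forward, - backward */
            (void)delta;
        }
    }
    return 0;
}
```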
The PC may obtain keyboard input events through a Hook function. For example, when a first key of the keyboard is pressed, the PC may obtain the pressed state KEYDOWN of the first key and the key value (key code) of the first key; when the first key is released, the released state KEYUP is returned. Each key of the keyboard corresponds to a key value, and the key values may follow the ASCII code table. When a character key is pressed, the corresponding character can be input; when a control key is pressed, the corresponding control function can be invoked.
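The "Hook function" mentioned here plausibly corresponds to a low-level keyboard hook installed with SetWindowsHookEx; the following C sketch is a minimal illustration under that assumption.

```c
#include <windows.h>

/* Low-level keyboard hook: reports KEYDOWN / KEYUP together with the
 * key value (virtual-key code) of the key. */
static LRESULT CALLBACK kbd_proc(int nCode, WPARAM wParam, LPARAM lParam)
{
    if (nCode == HC_ACTION) {
        const KBDLLHOOKSTRUCT *k = (const KBDLLHOOKSTRUCT *)lParam;
        if (wParam == WM_KEYDOWN || wParam == WM_SYSKEYDOWN) {
            /* state KEYDOWN, key value k->vkCode */
        } else if (wParam == WM_KEYUP || wParam == WM_SYSKEYUP) {
            /* state KEYUP, key value k->vkCode */
        }
        (void)k;
    }
    return CallNextHookEx(NULL, nCode, wParam, lParam);
}

/* Installation, e.g. at startup:
 *   SetWindowsHookExW(WH_KEYBOARD_LL, kbd_proc, GetModuleHandleW(NULL), 0);
 */
```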
The above embodiments are only examples and do not limit the embodiments of the present application. The input event may also be a voice input, a handwriting input, a touch input, combinations thereof, and the like.
(2) Mapping of input events.
In some embodiments, the PC may store a first mapping table (Map), and the first mapping table may indicate the mapping relationship between input events of the PC and input events of the PAD. Input events include, but are not limited to, mouse movement events, mouse click events, mouse wheel scroll events, keyboard input events, remote joystick movement events, voice input events, and the like. For example, the PC may map input events of the Windows system into input events of the Android system, and those Android input events may act on the PAD. For instance, an event of clicking the left mouse button on the Windows system can be mapped to a tap event in the Android system, and an event of clicking the right mouse button on the Windows system can be mapped to a long-press event in the Android system.
The key code value of a first key in the Windows system can be mapped to the corresponding key code value in the Android system; for example, even for the same character "a", the key code in the Windows system may not be identical to the key code in the Android system.
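A first mapping table might be sketched as a simple lookup array, as in the following C fragment; the Android-side action codes and all names here are illustrative placeholders, not values defined by the patent.

```c
#include <windows.h>   /* RI_MOUSE_* button flags */

/* Hypothetical actions injected on the PAD side. */
typedef enum { ACT_TAP, ACT_LONG_PRESS, ACT_SCROLL } RemoteAction;

typedef struct {
    USHORT       pcEvent;   /* Windows-side Raw Input button flag */
    RemoteAction action;    /* event to synthesize on the PAD     */
} MapEntry;

static const MapEntry firstMap[] = {
    { RI_MOUSE_LEFT_BUTTON_DOWN,  ACT_TAP        },  /* left click  -> tap        */
    { RI_MOUSE_RIGHT_BUTTON_DOWN, ACT_LONG_PRESS },  /* right click -> long press */
    { RI_MOUSE_WHEEL,             ACT_SCROLL     },  /* wheel       -> scroll     */
};
```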
Because the resolutions of the PC virtual screen and the PAD display screen are the same, the coordinate values of the mouse pointer on the PC virtual screen can be the same as those on the PAD. The absolute coordinates and offset coordinates of the mouse pointer on the PC virtual screen can therefore be sent directly to the PAD without complex coordinate conversion, which is simple and convenient and saves CPU resources.
(3) Sending of and responding to input data.
In some embodiments, the Android system is provided with an input subsystem that manages input events uniformly. The uinput mechanism, implemented on top of the input subsystem, facilitates the simulation of input events in user space (userspace). For example, a virtual device (e.g., a virtual mouse or a virtual keyboard) may be created through uinput and its attributes configured; the input event sequence obtained from the PC is then written into the /dev/uinput device file and the input events are delivered, so that the PAD can obtain the input events of the PC's mouse, keyboard, and other input devices without itself being equipped with such devices.
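As a minimal sketch of creating such a virtual device through uinput (using the Linux uinput interface that the Android input subsystem is built on; the device name and bit selection are illustrative):

```c
#include <fcntl.h>
#include <linux/uinput.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>

/* Create a virtual mouse backed by /dev/uinput and return its fd. */
int create_virtual_mouse(void)
{
    int fd = open("/dev/uinput", O_WRONLY | O_NONBLOCK);

    /* Declare the event types and codes the virtual device may emit. */
    ioctl(fd, UI_SET_EVBIT, EV_KEY);
    ioctl(fd, UI_SET_KEYBIT, BTN_LEFT);
    ioctl(fd, UI_SET_KEYBIT, BTN_RIGHT);
    ioctl(fd, UI_SET_EVBIT, EV_REL);
    ioctl(fd, UI_SET_RELBIT, REL_X);
    ioctl(fd, UI_SET_RELBIT, REL_Y);
    ioctl(fd, UI_SET_RELBIT, REL_WHEEL);

    struct uinput_setup usetup;
    memset(&usetup, 0, sizeof(usetup));
    usetup.id.bustype = BUS_VIRTUAL;
    snprintf(usetup.name, UINPUT_MAX_NAME_SIZE, "shared-mouse");
    ioctl(fd, UI_DEV_SETUP, &usetup);   /* uinput setup ioctl (kernel 4.5+) */
    ioctl(fd, UI_DEV_CREATE);
    return fd;
}
```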
The PC may package the input event obtained in the first mapping table into a data packet conforming to a first connection transport protocol format, and then send the data packet to the PAD through a first connection (e.g., Wi-Fi direct). After the PAD receives an input event, the input event is injected into the uinput and the PAD responds to the input event. The data of the input event may include an occurrence time of the input event, an input event type, a code of the input event type, a value of the input event, and the like.
For example, if the input event type is EVENT_KEY (keyboard), the code of the input event type is the key code, with code values 0 to 127 being the key codes on the keyboard; a value of 1 for the input event means the key is pressed, and a value of 0 means the key is released. If the input event type is EVENT_MOUSE, the code of the input event type is the mouse key code, with code values 0x110 to 0x116 being the key codes of the mouse, where 0x110 (BTN_LEFT) is the left mouse button code, 0x111 (BTN_RIGHT) is the right mouse button code, and 0x112 (BTN_MIDDLE) is the middle mouse button code; a value of 1 means the button is pressed and a value of 0 means it is released. If the input event type is EVENT_REL (relative coordinates), the code value of the input event type indicates the type of track: REL_X (code 0x00) indicates displacement of the mouse pointer along the X axis, REL_Y (code 0x01) indicates displacement along the Y axis, and REL_WHEEL (code 0x08) indicates the direction of wheel movement, with positive and negative values representing the two different directions. The meanings of the codes of other input events can be found in the include/linux/input.h file.
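The record layout described here matches the struct input_event of include/linux/input.h, where the patent's EVENT_KEY / EVENT_MOUSE / EVENT_REL types appear to correspond to the header's EV_KEY and EV_REL constants. A minimal injection sketch follows (fd is the descriptor from the device-creation sketch above; the helper names are illustrative):

```c
#include <linux/input.h>
#include <string.h>
#include <unistd.h>

/* Write one event record into the uinput device file; the kernel stamps
 * the occurrence time on delivery. */
static void emit(int fd, unsigned short type, unsigned short code, int value)
{
    struct input_event ev;
    memset(&ev, 0, sizeof(ev));
    ev.type  = type;    /* e.g. EV_KEY, EV_REL */
    ev.code  = code;    /* e.g. BTN_LEFT (0x110), REL_X (0x00) */
    ev.value = value;   /* 1 = pressed, 0 = released, or a signed offset */
    write(fd, &ev, sizeof(ev));
}

/* Replay a left-button click received from the PC: */
void inject_left_click(int fd)
{
    emit(fd, EV_KEY, BTN_LEFT, 1);    /* press */
    emit(fd, EV_SYN, SYN_REPORT, 0);  /* flush the report */
    emit(fd, EV_KEY, BTN_LEFT, 0);    /* release */
    emit(fd, EV_SYN, SYN_REPORT, 0);
}
```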
After the above steps, once the first connection between the PC and the PAD is established, when the mouse pointer moves beyond the edge of the PC display screen, the PAD display screen may correspondingly display the mouse pointer. While the mouse pointer is displayed on the PAD, input events from the PC's input devices may be converted and transmitted to the PAD, and the PAD responds to them, such as mouse movement, mouse clicks, keyboard character entry, and so forth.
The implementation manner described in the above embodiments is only an example and does not limit the present application in any way. The specific internal implementation manner may be different according to different types of electronic devices, different operating systems installed, different programs used, and different interfaces called.
Fig. 5A, 5B, and 5C illustrate some user interface diagrams provided by embodiments of the present application. In some embodiments, as shown in fig. 5A, the electronic device 100 may add an input device sharing setting interface, such as interface 501, based on the input device sharing function provided in other embodiments of the present application. A title bar 502 may be included in the interface 501, and the title bar 502 may display the title "input device sharing" and an icon. Also included in interface 501 are a plurality of option lists, such as a connectable device list 503, a shared input device list 504, a connected device location 505, and the like.
Devices that can currently be connected to use the shared input device function may be displayed in the connectable device list 503. These devices may be listed in order in the connectable device list 503, together with their connection status, such as: connected, paired but not connected, or unpaired. In fig. 5A, the electronic device 200 is in the connected state and can share the input devices with this electronic device; the electronic device 300 is in the paired-but-unconnected state and must be connected before the input devices can be shared; and the electronic device 400 is in the unpaired state and can be connected only after the electronic device 100 initiates pairing and pairing succeeds. There is also a refresh control in the connectable device list 503, and selecting it refreshes the current list of connectable devices. Also in the connectable device list 503 is a show/hide list control 506; clicking the show/hide list control 506 may show or hide all or part of the connectable devices. The connectable device list 503 may show only some commonly used or nearby electronic devices; if the sought electronic device is not listed, the "more" control may be clicked, and the electronic device 100 may display all connectable devices in another user interface. The connection between the electronic device 100 and the electronic device 200 may be initiated manually by the user, or may be set to be established automatically the next time the function is turned on after a successful connection. The user operation for initiating the connection may be the user clicking a "connect" control in the setting interface, or bringing the electronic device 200 close to the electronic device 100, for example a "touch" operation that triggers the connection, or a user operation such as a first gesture, which is not limited in the embodiments of the present application.
Existing input devices, such as a mouse, a keyboard, a tablet, and a camera, may be displayed in the shared input device list 504, and a user may select which input devices to share. The switch control 507 is an on/off control for selecting an input device; in fig. 5A, the switch controls 507 for the mouse and the keyboard are shown in the on state, that is, the mouse and the keyboard of the electronic device 100 may currently be shared with the electronic device 200. Similarly, the shared input device list 504 also includes a show/hide list control, and clicking it may show or hide all or part of the input devices. Likewise, only some commonly used input devices may be displayed in the shared input device list 504; if the sought input device is not listed, the "more" control may be clicked, and the electronic device 100 may display other input devices.
The connected device location 505 control makes it convenient for a user to set the relative positional relationship among multiple electronic devices; when the control 508 is clicked, a detailed setting interface as shown in fig. 5B and 5C can be displayed.
Fig. 5B is a schematic diagram of setting the position of the connected devices. As shown in fig. 5B, the connected device location interface 510 may contain a schematic box 511 of the device location relationship and a prompt 512, "drag the boxes to adjust the relative positions". The device location relationship schematic box 511 in fig. 5B illustrates that the right edge of the electronic device 100 (this device) is connected to the left edge of the tablet computer, so the mouse pointer can pass out of the right edge of the electronic device 100 (this device) and then pass in at the left edge of the tablet computer. A horizontal/vertical selection box 513 may also be displayed in the interface 510; after selecting a device in the schematic box 511, the user may choose the orientation of its long and short sides in the horizontal/vertical selection box 513. The horizontal orientation indicates that the long sides of the device are at the top and bottom and the short sides are at the left and right; the vertical orientation indicates that the long sides are at the left and right and the short sides are at the top and bottom. In fig. 5B, the tablet computer is selected by the user, its area is displayed in gray to indicate the selected state, and the horizontal/vertical selection box 513 is set to vertical, that is, the long sides of the tablet computer are at the left and right and the short sides are at the top and bottom. The horizontal/vertical orientation affects whether the edges connected between the electronic devices are long sides or short sides, and hence the positions at which the mouse pointer appears on the device edges. For example, if the tablet is vertical, the right edge of this device is connected to a long side of the tablet; if the tablet is horizontal, the right edge of this device is connected to a short side of the tablet. Different edge connections give different positions at which the mouse pointer shuttles between the devices. After the positional relationship of the devices is adjusted, the apply control 514 can be clicked to save the settings and make them take effect; if the change is not wanted, the cancel control 515 can be clicked.
The user may drag the schematic devices in the schematic box 511 to change the edge connection relationship between the devices. For example, in fig. 5B, if the user drags the schematic diagram of the tablet computer from the right side of this device to its lower side, the connected device location interface 520 shown in fig. 5C may be displayed; that is, as shown in the schematic box 521, the tablet computer is located below this device, which indicates that the mouse pointer may pass out of the lower edge of this device and pass in at the upper edge of the tablet computer. The tablet computer is set to horizontal in the horizontal/vertical selection box 522, i.e., the long sides of the tablet computer are at the top and bottom.
The setting-related user interfaces shown in fig. 5A, 5B, and 5C are only examples and do not limit the embodiments of the present application in any way; the setting interface of the shared input device may include more or fewer controls or functions than the example interfaces, such as setting font size, setting resolution, and the like. In addition, the setting interface of the shared input device may be provided not only on the electronic device 100 but also on the peer electronic device; for example, the electronic device 200 may have the same or a similar setting interface. The related settings of the shared input device can be configured on either end device, with the same effect.
Fig. 6A and 6B illustrate a process of moving the mouse pointer from the edge of the display screen of the electronic device 100 to the edge of the display screen of the electronic device 200, that is, the process of the aforementioned mouse pointer shuttling.
As shown in fig. 6A and 6B, the electronic device 100 and the electronic device 200 have established a first connection 610 that supports the shared input device function. After the first connection 610 is established, the electronic device 100 may create a virtual screen 500 having the same resolution as the electronic device 200. In the example of fig. 6A, the left edge of the virtual screen 500 may be arranged to meet the right edge of the electronic device 100, so that the mouse pointer 601 may pass out of the right edge of the interface 104 of the electronic device 100 and in at the left edge of the interface of the virtual screen 500. Meanwhile, a mouse pointer 603 is displayed on the left edge of the interface 107 of the electronic device 200, as shown in fig. 6B.
As shown in fig. 6A, an interface 104 is displayed on the display screen 101 of the electronic device 100; the interface 104 is the desktop of the electronic device 100, and a mouse pointer 601 is displayed in the interface 104. When the user moves the mouse 102 to the right, as shown in the figure from the dashed-line position to the solid-line position, the mouse pointer 601 may move in the direction of the arrow to the position (pcX, pcY) of the mouse pointer 602 at the right edge of the interface 104. Then, the electronic device 100 may calculate, from (pcX, pcY), the position (padX, padY) where the mouse pointer is to appear correspondingly at the left edge of the interface 107 of the electronic device 200, and the position (vmX, vmY) where the mouse pointer 602 is to be displayed at the left edge of the virtual screen 500. Here, since the resolution of the virtual screen 500 is set to be the same as that of the electronic device 200 in order to simplify coordinate conversion, the position (vmX, vmY) of the mouse pointer on the virtual screen 500 has the same coordinate values as the position (padX, padY) of the mouse pointer on the electronic device 200.
As shown in fig. 6B, with the mouse pointer 602 of the electronic device 100 located at the right edge of the interface 104, when the user continues to move the mouse 102 to the right, as the mouse moves from the dashed-line position to the solid-line position in the illustration, the mouse pointer 602 may move in the direction of the arrow to the left-edge position (vmX, vmY) of the virtual screen 500. Meanwhile, the electronic device 100 may send a message carrying the position (padX, padY) to the electronic device 200 through the first connection 610, informing the electronic device 200 to display the mouse pointer at (padX, padY). Upon receiving the message, the electronic device 200 displays the mouse pointer 603 at the position (padX, padY) on the left edge of the interface 107. The position (vmX, vmY) of the mouse pointer 602 on the virtual screen 500 corresponds to the position (padX, padY) of the mouse pointer 603 on the electronic device 200. Visually, a continuous effect of the mouse pointer shuttling from the right edge of the desktop interface 104 of the electronic device 100 to the left edge of the desktop interface 107 of the electronic device 200 may be presented.
As shown in fig. 6C, if the user continues to move the mouse 102 to the right after the mouse pointer 603 is displayed at the left-edge position of the electronic device 200, with the mouse moving from the dashed-line position to the solid-line position as shown, the distance moved by the mouse 102 corresponds to the offset coordinates (relX, relY), and the mouse pointer 602 can move in the arrow direction on the virtual screen 500 to the position of the mouse pointer 604 according to these offset coordinates. Meanwhile, the electronic device 100 may send a message carrying the offset coordinates (relX, relY) to the electronic device 200 through the first connection 610, notifying the electronic device 200 to shift the mouse pointer 603 rightward by (relX, relY). Upon receiving the message, the mouse pointer 603 in the interface 107 of the electronic device 200 shifts to the right to the position of the mouse pointer 605. The position (vmX, vmY) of the mouse pointer 604 on the virtual screen 500 corresponds to the position (padX, padY) of the mouse pointer 605 on the electronic device 200. From the user's perspective, a visual effect may be presented in which the mouse pointer on the desktop interface 107 of the electronic device 200 moves as the mouse 102 moves. The situation is not limited to the mouse pointer passing out of the right edge of the electronic device 100 and in at the left edge of the electronic device 200 as shown in fig. 6A, 6B, and 6C; the mouse pointer may also pass out of other edges of the electronic device 100, such as the upper, lower, or left edge, and then in at any other edge of the electronic device 200. For the display interfaces, refer to fig. 6A to 6C; details are not repeated here.
FIG. 7 is a diagram of the electronic device 200 responding to a mouse 102 click event. After "shuttling" the mouse pointer to the interface 107 of the electronic device 200 as shown in fig. 6A-6C, the user may manipulate the input device of the electronic device 100, such as the mouse 102, the keyboard 103, generate an input event, and the electronic device 200 may respond after receiving the input event from the electronic device 100, as shown in fig. 7. For example, in fig. 7, a mouse pointer 701 is moved to an icon of a "music" application of the interface 107 of the electronic device 200, and a user may perform a mouse click operation 702. After obtaining the mouse click operation 702, the electronic device 100 may map the mouse click operation 702 event to a click event that may act on the electronic device 200 according to the first mapping table, and send the click event to the electronic device 200 through the first connection 610. In response to the click event of the "music" application icon, the electronic device 200 opens a music application and displays the application interface 703. Then, the user can also perform operations such as playing music, adjusting volume, changing songs, etc. on the application interface 703 by manipulating the mouse 102.
Fig. 8 is a schematic diagram of the electronic device 200 responding to a keyboard 103 input event. The electronic device 200 in the figure has established a first connection 610 with the electronic device 100, sharing the input device, and the electronic device 200 may receive input from the keypad 103 and respond.
As shown in fig. 8, the electronic device 200 is displayed with an interface 800, and when a mouse pointer 801 clicks an input box 802, contents can be input in the input box 802. For example, when the user taps a first key of the keyboard 103, and the electronic device 100 acquires that the first key is pressed, the electronic device 100 may send a message of inputting content and a key value of the corresponding first key to the electronic device 200 through the first connection 610. The electronic device 200 may display the corresponding character in the input box 802 after receiving the message of the electronic device 100. As shown in fig. 8, the electronic apparatus 200 displays the character string "nihao" in the input box 802 in response to a key operation by the user hitting a key corresponding to the string of characters "nihao" on the keyboard 103. Of course, the electronic device 200 may respond to a control key command of the keyboard, such as "enter", besides the characters, which is not limited in this embodiment.
Fig. 7 and 8 illustrate only scenarios of a mouse click event and a keyboard input event. Of course, the input event includes, but is not limited to, a mouse moving event, a mouse clicking event, a mouse wheel scrolling event, a keyboard input event, a remote joystick moving event, a voice input event, etc., any operation of the input device of the electronic device 100 may be applied to the electronic device 200, which is not limited in the above embodiment.
Fig. 9A-9C are schematic diagrams of interfaces in a scenario where multiple electronic devices share an input device. Here, a connection scenario among three devices, namely the electronic device 100 (this device), a tablet computer, and a mobile phone, is taken as an example for description; cases in which other numbers of electronic devices share an input device can be deduced similarly and are not described in detail.
In some embodiments, after the electronic device 100 establishes connections for sharing input devices with the tablet and the cell phone, respectively, the electronic device 100 may create a virtual screen with the same resolution as the tablet and a virtual screen with the same resolution as the cell phone. Then in the input device sharing setting, a connected device location interface 901 as shown in fig. 9A may be displayed. In a device connection relationship block 902, a user may adjust a connection relationship between the electronic device 100 and a tablet computer or a mobile phone. For example, in the device position relationship diagram box 902 in fig. 9A, it is illustrated that the tablet pc is placed on the right side of the electronic device 100 (the device), the mobile phone is placed on the lower side of the device, that is, the right edge of the device is connected to the left edge of the tablet pc, and the lower edge of the device is connected to the upper edge of the mobile phone. Indicating that the mouse pointer may pass out from the right edge of the electronic device 100 (the present device) and then pass in from the left edge of the tablet computer, and that the mouse pointer may pass out from the lower edge of the electronic device 100 (the present device) and then pass in from the upper edge of the handset.
As shown in fig. 9B, the electronic device 100 establishes a connection 915 for sharing the input device with the electronic device 200 (tablet computer), while the electronic device 100 establishes a connection 914 for sharing the input device with the electronic device 300 (mobile phone). The input devices of the electronic device 100, such as a mouse, a keyboard, etc., may be shared for use by the electronic device 200 and the electronic device 300, for example, the electronic device 200 and the electronic device 300 may perform a pointing operation using the mouse of the electronic device 100 or perform an operation of inputting characters using the keyboard. As shown in fig. 9B, after the electronic device 100 is connected to the mouse device, a mouse pointer 916 may be displayed on the interface 911 of the display screen. After passing through the connected device position setting as shown in fig. 9A, in response to the moving operation of the mouse, the mouse pointer 916 may pass out from the lower edge of the display interface 911 and then pass in from the upper edge of the display interface 912 of the electronic device 300, and the mouse pointer 917 is displayed on the interface 912; the mouse pointer 916 may also be passed out from the right edge of the display interface 911 and then passed in from the left edge of the display interface 913 of the electronic device 200, displaying the mouse pointer 918 on the interface 913. The mouse pointer 917 can move with the movement of the mouse, and when the mouse pointer 917 is located in the interface 912 of the electronic device 300, the electronic device 300 can respond to an input operation of an input device such as a click of the mouse, a character input of the keyboard, or the like. Similarly, the mouse pointer 918 may move along with the movement of the mouse, and when the mouse pointer 918 is located in the interface 913 of the electronic device 200, the electronic device 200 may respond to input operations of an input device such as a click of the mouse, a character input of the keyboard, and the like.
Fig. 9C shows a process in which the mouse pointer is moved from the display screen of the electronic device 300 to the display screen of the electronic device 200 in the scenario shown in fig. 9B. In the scenario shown in fig. 9C, since no direct communication connection is established between the electronic device 200 and the electronic device 300, the mouse pointer cannot be moved directly from the electronic device 300 to the electronic device 200, but needs to pass through the electronic device 100. For example, as shown by the arrow in fig. 9C, as the user moves the mouse upward, the mouse pointer 921 located on the electronic device 300 may pass through the upper edge of the interface 912 and then pass through the lower edge of the interface 911 of the electronic device 100, and the mouse pointer 922 is displayed on the interface 911 of the electronic device 100. Then, the user may continue to move the mouse to the right, and the mouse pointer 922 located on the electronic device 100 may continue to exit from the right edge of the interface 911 and then enter from the left edge of the interface 913 of the electronic device 200, and the mouse pointer 923 is displayed on the interface 913 of the electronic device 200.
Of course, the user may drag the schematic device in the device connection relationship schematic box 1002 to change the edge connection relationship between the devices. For example, in fig. 10A, a user may adjust a connection relationship between the electronic device 100 and a mobile phone or a tablet computer, and may display that, as shown in a schematic frame 1002 in a connection device location interface 1001, the tablet computer is disposed on the right side of the electronic device 100 (the device), the mobile phone is disposed on the left side of the device, that is, the right edge of the device is connected to the left edge of the tablet computer, and the left edge of the device is connected to the right edge of the mobile phone. Indicating that the mouse pointer may pass out from the right edge of the electronic device 100 (the present device) and then pass in from the left edge of the tablet computer, and that the mouse pointer may pass out from the left edge of the electronic device 100 (the present device) and then pass in from the right edge of the handset.
Similarly, as shown in fig. 10B, the electronic device 100 and the electronic device 200 (tablet computer) establish a connection 1015 for sharing the input device, while the electronic device 100 and the electronic device 300 (mobile phone) establish a connection 1014 for sharing the input device. The input devices of the electronic device 100, such as a mouse, a keyboard, etc., may be shared for use by the electronic device 200 and the electronic device 300, for example, the electronic device 200 and the electronic device 300 may perform a pointing operation using the mouse of the electronic device 100 or perform an operation of inputting characters using the keyboard. As shown in fig. 10B, after the electronic device 100 is connected to the mouse device, a mouse pointer 1016 may be displayed on the interface 1011 of the display screen. After passing through the connected device position setting as shown in fig. 10A, in response to a movement operation of the mouse, the mouse pointer 1016 may pass out from the left edge of the display interface 1011 and then pass in from the right edge of the display interface 1012 of the electronic device 300, displaying the mouse pointer 1017 on the interface 1012; the mouse pointer 1016 may also be passed out from the right edge of the display interface 1011 and then in from the left edge of the display interface 1013 of the electronic device 200, displaying the mouse pointer 1018 on the interface 1013. The mouse pointer 1017 may move with the movement of the mouse, and the electronic device 300 may respond to an input operation of an input device such as a click of the mouse, a character input of the keyboard, and the like, while the mouse pointer 1017 is located in the interface 1012 of the electronic device 300. Similarly, the mouse pointer 1018 may move along with the movement of the mouse, and when the mouse pointer 1018 is located in the interface 1013 of the electronic apparatus 200, the electronic apparatus 200 may respond to an input operation of an input device such as a click of the mouse, a character input of the keyboard, or the like.
Fig. 10C shows the process in which the mouse pointer moves from the display screen of the electronic device 300 to the display screen of the electronic device 200 in the scenario shown in fig. 10B. In the scenario shown in fig. 10C, since no direct communication connection is established between the electronic device 200 and the electronic device 300, the mouse pointer cannot move directly from the electronic device 300 to the electronic device 200 but needs to pass through the electronic device 100. For example, as shown by the arrow in fig. 10C, as the user moves the mouse to the right, the mouse pointer 1021 on the electronic device 300 may pass out of the right edge of the interface 1012 and then in at the left edge of the interface 1011 of the electronic device 100, and the mouse pointer 1022 is displayed on the interface 1011 of the electronic device 100. Then, the user may continue to move the mouse to the right, and the mouse pointer 1022 on the electronic device 100 may continue out of the right edge of the interface 1011 and in at the left edge of the interface 1013 of the electronic device 200, and the mouse pointer 1023 is displayed on the interface 1013 of the electronic device 200.
The arrangements of multiple devices in the above embodiments do not limit the present application; a user may add or remove devices and/or adjust the connection relationships among multiple devices as needed.
Fig. 11 illustrates the change in the position of the mouse pointer when the electronic device 200 (tablet computer) changes from the landscape state to the portrait state in some embodiments. Referring to the foregoing embodiments, the tablet computer shown in fig. 11 has established a shared input device connection with the electronic device 100, and a user can provide input on the tablet computer using an input device (such as a mouse, keyboard, or tablet) of the electronic device 100. With the direction of gravity defined as downward, the landscape state of the tablet computer means that the long sides of the tablet display screen 1102 are at the top and bottom and the short sides are at the left and right, and the portrait state means that the short sides of the tablet display screen 1102 are at the top and bottom and the long sides are at the left and right. If the orientation of the display interface of the electronic device 200 is not locked, then when the electronic device 200 changes from the landscape state to the portrait state, the layout of the display interface may change based on gravity sensing: with the direction of gravity taken as downward, the interface layout in the landscape state differs from that in the portrait state, and the electronic device 200 may automatically adjust the arrangement of characters, symbols, icons, and the like in the display interface to facilitate reading by the user. As the layout of the display interface changes, the position of the mouse pointer may also change.
As shown in fig. 11, the tablet computer in the landscape state displays a mouse pointer 1101 on the display interface 1103. When the tablet computer in the landscape state is rotated 90 degrees, it enters the portrait state, and the layout in the display interface 1104 changes along with the change from landscape to portrait; the layout of the icons, controls, and characters in the interface 1104 is one suited to the portrait screen. The position of the mouse pointer 1105 in the display interface 1104 may change accordingly. In some embodiments, as shown in fig. 11, the mouse pointer 1101 points to the application icon "information" in the landscape state, and after the tablet computer rotates into the portrait state, the tablet computer may redraw and display the mouse pointer 1105 at the position of the application icon "information" in the changed layout. In this case, the target pointed to by the mouse pointer remains unchanged before and after the landscape/portrait switch, which is more convenient for the user and provides a better experience.
In other embodiments, the absolute position of the mouse pointer may remain unchanged between the landscape and portrait states. For example, suppose the resolution of the tablet computer is 1280 × 800 (pixels) with the top-left vertex as the origin (0, 0), and the position of the mouse pointer in the landscape state is point A (600, 500). If the tablet computer is rotated 90 degrees to the right into the portrait state, the position of the mouse pointer in the portrait state becomes point B (300, 600). Visually, the mouse pointer occupies the same position in both landscape and portrait, pointing at the same pixel.
In still other embodiments, the proportional position of the mouse pointer along the horizontal and vertical sides may be kept unchanged. For example, suppose the resolution of the tablet computer is 1280 × 800 (pixels) with the top-left vertex as the origin (0, 0), and the position of the mouse pointer in the landscape state is point C (320, 600): its horizontal coordinate is one quarter of the horizontal side and its vertical coordinate is three quarters of the vertical side. If the tablet computer is rotated 90 degrees to the right into the portrait state and the ratios of the mouse pointer's position to the horizontal and vertical sides are kept consistent with the landscape state, the position of the mouse pointer in the portrait state becomes point D (200, 960).
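The two numeric strategies above amount to simple coordinate transforms. A sketch in C, with names chosen for illustration, assuming a 90-degree clockwise rotation from a w × h landscape screen to an h × w portrait screen:

```c
typedef struct { long x, y; } Pt;

/* Strategy 1: the pointer stays on the same physical pixel. */
Pt same_pixel(Pt p, long w, long h)
{
    Pt q = { h - p.y, p.x };   /* (600,500) on 1280x800 -> (300,600) */
    (void)w;
    return q;
}

/* Strategy 2: the fractional position along each side is preserved. */
Pt same_ratio(Pt p, long w, long h)
{
    Pt q = { p.x * h / w,      /* (320,600) on 1280x800 -> (200,960) */
             p.y * w / h };
    return q;
}
```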
When the position, the state, the layout, and the like of the electronic device change, the position of the mouse pointer may change correspondingly, which is not limited in the embodiments of the present application, and the position of the mouse pointer may change in various ways.
In some embodiments, as shown in fig. 12, the electronic device 400 is a folding-screen mobile phone, and when the state of the folding-screen mobile phone changes, for example when the display screen changes from the unfolded state to the folded state, the position of the mouse pointer may change accordingly. Referring to the foregoing embodiments, the folding-screen mobile phone shown in fig. 12 has established a shared input device connection with the electronic device 100, and a user can provide input on the folding-screen mobile phone using an input device (e.g., a mouse, keyboard, or tablet) of the electronic device 100. The unfolded state of the folding-screen mobile phone means that the first display screen (large screen) 1201 is visible, and the folded state means that the second display screen (small screen) 1202 is visible, where the size of the first display screen is larger than that of the second display screen. As shown in fig. 12, in the unfolded state the first display screen 1201 displays an interface 1203; the first display screen 1201 may be folded along the bending portion until the two edges of the display screen overlap, entering the folded state, in which the second display screen 1202 displays an interface 1204.
As shown in fig. 12, the folding-screen mobile phone in the unfolded state displays a mouse pointer 1205 in the display interface 1203. When the screen is completely folded, the folding-screen mobile phone enters the folded state. As the unfolded state changes to the folded state, the layout in the display interface may also change, and the layout of the icons, controls, and characters in the interface 1204 is one adapted to the folded state. The position of the mouse pointer 1206 on the display interface 1204 may also change accordingly. For example, as shown in fig. 12, when the folding-screen mobile phone is in the unfolded state, the mouse pointer 1205 points to the application icon "information"; after the folding-screen mobile phone changes to the folded state, the mouse pointer 1206 may be redrawn and displayed at the position of the application icon "information" in the changed layout. In this case, the target pointed to by the mouse pointer remains unchanged before and after the unfolded/folded switch, which is more convenient for the user and provides a better experience. The above examples do not limit the embodiments of the present application, and the position of the mouse pointer may change in various ways.
With reference to the foregoing embodiments, a method for sharing an input device provided in an embodiment of the present application is described below.
Fig. 13 is a flowchart of a method for sharing an input device according to an embodiment of the present application. The method may be applied to a first electronic device and a second electronic device. The method can comprise the steps of cursor shuttling, cursor moving, input event responding and the like, and the specific implementation steps are as follows:
S101. A first connection is established between the first electronic device and the second electronic device.
In some embodiments, the first electronic device may be the aforementioned electronic device 100 (e.g., a PC), the second electronic device may be the aforementioned electronic device 200 (e.g., a PAD or mobile phone), and the first electronic device and the second electronic device establish a first connection to form the communication system 10. For the description of the communication system 10, reference may be made to the foregoing embodiments. The first electronic device and the second electronic device may be installed with Windows, Android, or other types of operating systems; the operating systems of the first electronic device and the second electronic device may be the same or different, which is not limited in this application.
In some embodiments, the first connection may be a wireless connection, such as a bluetooth connection, a Wi-Fi connection, or the like, or may be a wired connection, such as a USB connection, or the like, and the embodiment does not limit the type of the first connection. The embodiment also does not limit the process of establishing the first connection, and in an implementation manner, the first connection may be established between the first electronic device and the second electronic device by using an NFC short-range communication technology when the first electronic device and the second electronic device touch each other.
The first electronic device may be configured with input devices such as a mouse, a keyboard, and a tablet, and after the first connection is established between the first electronic device and the second electronic device, the second electronic device may also use the input devices of the first electronic device to input content.
In a possible implementation, after the first electronic device establishes the first connection with the second electronic device, the first electronic device may create a virtual screen that is invisible to the user, and the virtual screen has the same resolution as the second display interface of the second electronic device. The first electronic device creates the virtual screen to facilitate the jump display of the mouse pointer, or another cursor used to indicate a position, across the edges of the display interface of the first electronic device between different display screens (including the virtual screen). If no virtual screen is created and no new display screen is attached, the mouse pointer or other position-indicating cursor may be confined within the boundary of the display interface of the first electronic device. After the mouse pointer or other indicating cursor moves onto the virtual screen, because the resolution of the virtual screen is the same as that of the display screen of the second electronic device, the coordinates of the mouse pointer on the virtual screen can be sent directly to the second electronic device without complex coordinate conversion, which is simple and convenient and saves CPU resources. For a specific description, reference may be made to the foregoing embodiments, which are not repeated here.
1. Cursor shuttle (S102-S106)
S102, the first electronic device detects that the first cursor moves to a third position of the first boundary of the first display interface.
The first cursor is used to indicate a target position in the display interface; for example, the first cursor may be a mouse pointer. The user may control the movement of the first cursor by operating a first input device of the first electronic device. For example, when the first input device is a mouse and the user moves the mouse, the first electronic device may instruct the mouse pointer in the display interface to move a certain distance in the corresponding direction according to the detected moving direction and distance of the mouse. The correspondence between the distance moved by the mouse and the distance moved by the mouse pointer in the display interface may be adjusted in the mouse pointer sensitivity setting. If the first input device is a trackpad or other touch-sensitive panel, the user may control the movement of the first cursor by sliding a finger across the trackpad. The first input device may also be a keyboard, with the user controlling the movement of the first cursor by operating the up, down, left, and right direction keys. The first input device may also be a remote joystick, with the user controlling the movement of the first cursor by operating the joystick. The first input device may also be a camera, with the movement of the first cursor controlled by detecting the movement of the user's pupils through the camera. The first input device is not limited in any way by the present application.
In this embodiment, the display interface may be a two-dimensional planar interface, and the third position on the first boundary may be represented as a two-dimensional coordinate. The first display interface is a display interface for displaying content on the first electronic device, the display screen of the first electronic device is not necessarily covered by the first display interface, in some cases, the aspect ratio of the first display interface may not be consistent with the aspect ratio of the display screen, a black area may be generated outside the first display interface in the display screen, and no content is displayed in the black area.
The display interface of the first electronic device may have a plurality of boundaries, such as four boundaries of a PC, a PAD, and a mobile phone in general, and for convenience of understanding and description, the display interface in which the user is facing the forward layout is taken as an example, and may be referred to as an upper boundary, a lower boundary, a left boundary, and a right boundary in a distinguishing manner. In some embodiments, as shown in fig. 5B, 5C, the first electronic device display screen may be placed in a border interfacing relationship with the second electronic device/virtual screen. For example, the second electronic device/virtual screen may be set to be located on the right side of the display screen of the first electronic device, that is, the right boundary of the display screen of the first electronic device is connected to the left boundary of the second electronic device/virtual screen, so that the first cursor may penetrate through the right boundary of the display interface of the first electronic device and penetrate through a corresponding position of the left boundary of the display interface of the second electronic device/virtual screen; of course, the first cursor may also pass back from the left boundary of the second electronic device/virtual screen interface to the right boundary of the first electronic device display interface. If the second electronic equipment/virtual screen is arranged below the display screen of the first electronic equipment, namely the lower boundary of the display screen of the first electronic equipment is connected with the upper boundary of the second electronic equipment/virtual screen, the first cursor can penetrate out of the lower boundary of the display interface of the first electronic equipment and penetrate into the corresponding position of the upper boundary of the display interface of the second electronic equipment/virtual screen; of course, the first cursor may also pass from the upper boundary of the second electronic device/virtual screen interface back to the lower boundary of the first electronic device display interface. The same applies to the connection of other boundaries. If a plurality of other devices are connected to the first electronic device, generally speaking, to avoid a conflict, the plurality of devices may be correspondingly connected to different boundaries of the display screen of the first electronic device, for example, the second electronic device is set to be located on the right of the first electronic device, and then the third electronic device may be set to be located below the first electronic device. For specific description, reference may be made to the foregoing embodiments, which are not described herein again.
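The boundary-interfacing relationship described above can be thought of as a small adjacency table consulted when the first cursor reaches a boundary. The following C sketch is purely illustrative; none of the names come from the patent.

```c
/* Which device (or virtual screen) each boundary of the first display
 * interface shuttles into, and at which boundary the cursor reappears. */
typedef enum { B_LEFT, B_RIGHT, B_TOP, B_BOTTOM } Boundary;

typedef struct {
    Boundary localBoundary;   /* boundary of the first display interface  */
    int      remoteDeviceId;  /* device whose interface the cursor enters */
    Boundary remoteBoundary;  /* boundary at which the cursor appears     */
} BoundaryLink;

static const BoundaryLink links[] = {
    { B_RIGHT,  2, B_LEFT },  /* second electronic device sits to the right */
    { B_BOTTOM, 3, B_TOP  },  /* third electronic device sits below         */
};
```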
In this method embodiment, the first boundary of the display interface of the first electronic device may be connected to the second boundary of the display interface of the second electronic device. Here, "connected" does not mean that the display screens of the two electronic devices are physically adjacent; it means that when the first cursor reaches the first boundary of the display interface of the first electronic device and then continues to move in the direction of the first boundary, the first cursor may be displayed on the second boundary of the display interface of the second electronic device. That is, visually, the first cursor "passes out" of the first boundary of the first electronic device and then "passes into" the second boundary of the second electronic device; the first cursor "shuttles" from the display interface of the first electronic device to the display interface of the second electronic device. Similarly, the first cursor may also "pass back" from the display interface of the second electronic device to the display interface of the first electronic device: when the first cursor reaches the second boundary of the display interface of the second electronic device and then continues to move in the direction of the second boundary, the first cursor may be displayed on the first boundary of the display interface of the first electronic device.
S103, the first electronic device detects a first movement operation, where the first movement operation is an operation instructing the first cursor to move out of the first display interface of the first electronic device; the first movement operation corresponds to a third offset, and the third offset is used to represent that the first cursor moves out of the first display interface.
For example, the first movement operation may be the user moving a mouse, and the first movement generated by the user moving the mouse may be converted into a third offset indicating a movement of the mouse pointer. The third offset is a vector including an offset direction and an offset distance, and is used to represent that the first cursor moves in a first direction, where the first direction is a direction beyond a boundary of the first display interface. If the first boundary is the right boundary of the first display interface and the first cursor is located on the first boundary, the first electronic device may execute step S104 when it detects that the direction of the third offset includes a rightward component.
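A minimal sketch of this boundary check, assuming a first display interface of the given width and a third offset with horizontal component dx; all names are illustrative rather than taken from this application:

def crosses_right_boundary(cursor_x, dx, width):
    # The cursor sits on the right boundary when its x coordinate equals
    # the last visible column; crossing also needs a rightward component.
    on_boundary = (cursor_x >= width - 1)
    rightward = (dx > 0)
    return on_boundary and rightward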
S104, the first electronic device may calculate a first position where the second cursor is to be displayed on the second display interface of the second electronic device according to the third position.
In some embodiments, the first position and the third position may be two-dimensional coordinates, and the first electronic device may calculate the first position where the first cursor will appear in the second display interface of the second electronic device according to the first resolution of the first display interface, the second resolution of the second display interface, and the third position where the first cursor is located. The third position may be a coordinate position on the first boundary of the display interface of the first electronic device, and the first position may be a coordinate position on the second boundary of the display interface of the second electronic device. For specific methods of calculating the first position from the third position, reference may be made to the embodiment described in fig. 4A, and details are not repeated here. Similarly, the second display interface refers to the interface area in which the second electronic device displays content, and the second display interface may not cover the entire display screen of the second electronic device.
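One plausible way to compute the first position from the third position is to scale the boundary coordinate proportionally between the two resolutions. The sketch below assumes a crossing from the right boundary of the first display interface to the left boundary of the second; the exact method of the fig. 4A embodiment may differ:

def map_exit_to_entry(third_pos, first_res, second_res):
    # third_pos: (x, y) on the right boundary of the first display interface.
    x, y = third_pos
    w1, h1 = first_res    # e.g. (1920, 1080)
    w2, h2 = second_res   # e.g. (2560, 1600)
    # Preserve the relative vertical position; enter at the left boundary.
    entry_y = round(y * (h2 - 1) / (h1 - 1))
    return (0, entry_y)

# e.g. map_exit_to_entry((1919, 540), (1920, 1080), (2560, 1600)) -> (0, 800)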
S105, the first electronic device sends a first message to the second electronic device, where the first message may carry information such as the first position.
The first electronic device may send a first message carrying the first position information to the second electronic device, to notify the second electronic device to display the second cursor at the first position. In other embodiments, the first message may carry no designated position and merely notify the second electronic device to display the second cursor; after receiving the message, the second electronic device may determine by itself the position at which the second cursor is to be displayed in the second display interface.
S106, the second electronic device displays the second cursor at the first position of the second display interface according to the first message.
After receiving the first message, the second electronic device may draw and display the second cursor at the first position. The first position may be a coordinate position on the second boundary of the display interface of the second electronic device, or may be a fixed position, which is not limited in this embodiment.
Accordingly, the cursor may also "pass back" from the second display interface of the second electronic device to the first display interface of the first electronic device. For example, when the second cursor is located on the second boundary of the second display interface and the first electronic device detects a third movement operation, where the third movement operation is an operation in which the user generates a third movement instructing the second cursor to move out of the second display interface, the third movement operation corresponds to a second offset; the second offset is used to instruct the second cursor to move in a second direction, and the second direction is a direction beyond the boundary of the second display interface. The first electronic device may then display the first cursor at a third position in the first display interface.
In some embodiments, when the second cursor is displayed on the second display interface, the first cursor is not displayed on the first display interface of the first electronic device, so as to avoid confusing the user.
2. Cursor movement (S107-S109)
S107, the first electronic device detects a first offset.
In some embodiments, the first electronic device may detect a second movement operation of the user, the second movement operation generating a second movement, and the second movement corresponding to a first offset, where the first offset is a vector including a movement direction and a movement distance. For example, the second movement operation may be the user moving a mouse a certain distance, moving a finger a certain distance on a touch pad, moving a joystick, or the like, and the second movement operation may instruct the second cursor to move by the first offset. This embodiment does not limit the input device or the type of the input event; the input event causing the cursor to move may also be a gesture operation of the user or a movement of an eye pupil detected by the camera of the first electronic device, or the like.
If the first position is a position on a boundary of the second display interface, the first offset produced by the second movement indicates that the second cursor moves in a direction toward the interior of the second display interface.
S108, the first electronic device sends a second message to the second electronic device, where the second message carries information such as the first offset.
The first electronic device sends the second message carrying the first offset information to the second electronic device, to notify the second electronic device to displace the second cursor by the first offset.
S109, the second electronic device moves the second cursor from the first position to the second position, where the offset of the second position relative to the first position is the first offset.
Because the first electronic device creates a virtual screen with the same resolution as that of the second electronic device, while the second electronic device displays the second cursor at the first position in the second display interface, the first cursor moves to a fourth position on the virtual screen, and the coordinate values of the fourth position are the same as those of the first position. When the second movement operation is detected, the second electronic device moves the second cursor from the first position to the second position, and at the same time the first electronic device moves the first cursor from the fourth position to a fifth position, where the coordinate values of the fifth position are the same as those of the second position.
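As an illustrative sketch of this lockstep behavior (the class name and clamping are assumptions, not taken from this application), the virtual screen can simply replay each offset so that the hidden first cursor always shares the second cursor's coordinates:

class VirtualScreen:
    # Illustrative stand-in for the virtual screen created by the
    # first electronic device; its resolution matches the second device's.
    def __init__(self, width, height):
        self.width, self.height = width, height
        self.cursor = (0, 0)  # fourth position once the cursor shuttles over

    def apply_offset(self, dx, dy):
        x, y = self.cursor
        self.cursor = (min(max(x + dx, 0), self.width - 1),
                       min(max(y + dy, 0), self.height - 1))
        # These coordinates equal those of the second cursor by construction.
        return self.cursor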
3. Input event response (S110-S112)
Because the first operating system run by the first electronic device may differ from the second operating system run by the second electronic device, when the second cursor is displayed in the display interface of the second electronic device and the first electronic device receives an input event from an input device such as a mouse, a keyboard, or a stylus, the first electronic device may convert the input event into a corresponding input event on the second electronic device and send it to the second electronic device. After receiving the input event, the second electronic device can respond to it accordingly, thereby implementing the function of the user providing input on the second electronic device using the input device of the first electronic device.
S110, the first electronic device detects a first input event and may map the first input event to a second input event according to a first mapping table.
The first input event comes from an input operation collected by a first input device of the first electronic device, and the first input device may include: a mouse, a keyboard, a handwriting tablet, a camera, a touch pad, a scanner, a stylus, a joystick, a voice input device, or the like. For example, the first input event may be an input event such as a click of the left mouse button, a click of the right mouse button, a click of the middle mouse button, a double-click of the left mouse button, a double-click of the right mouse button, a scroll of the mouse wheel, or a press of a first key of the keyboard, or may be an input event such as a voice input, a stylus input, a touch tap, or a combination thereof, which is not limited in this application.
In some embodiments, a first mapping table may be stored in the first electronic device, and the first mapping table may indicate a mapping relationship between a first input event acting on the first electronic device and a second input event acting on the second electronic device. The first electronic device may convert the collected first input event of the input device into a corresponding second input event that can act on the second electronic device according to the first mapping table, and send the second input event to the second electronic device.
For example, the first electronic device, such as a PC running the first operating system, may, according to the first mapping table, map an input event acting on the first operating system to an input event acting on the second operating system run by the second electronic device, such as a PAD. For example, an event of clicking the left mouse button on the first operating system may be mapped to a single-tap event on the second operating system, and an event of clicking the right mouse button on the first operating system may be mapped to a long-press event on the second operating system. A key code value of a first key on the first operating system may be mapped to a corresponding key value on the second operating system; for the same character "a", for example, the key value on the first operating system may differ from the key value on the second operating system.
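A sketch of what such a first mapping table could look like, using Windows-style event names on one side and Android-style gestures on the other purely as an example; this application does not fix the operating systems or these identifiers:

FIRST_MAPPING_TABLE = {
    # first-OS input event -> second-OS input event
    "MOUSE_LEFT_CLICK": "TAP",           # left click  -> single tap
    "MOUSE_RIGHT_CLICK": "LONG_PRESS",   # right click -> long press
    "MOUSE_WHEEL_UP": "SCROLL_UP",
    # Key code values for the same character may differ between systems.
    "KEY_A_FIRST_OS": "KEY_A_SECOND_OS",
}

def map_input_event(first_event):
    # Return the corresponding second-OS event, or None if unmapped.
    return FIRST_MAPPING_TABLE.get(first_event)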
S111, the first electronic device sends a third message to the second electronic device, where the third message carries the second input event.
In some embodiments, the first electronic device may package the second input event into a data packet conforming to a transport protocol format and then send it to the second electronic device over the first connection (e.g., Wi-Fi direct). The data of the input event may include the occurrence time of the input event, the input event type, the code of the input event type, the value of the input event, and the like.
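For illustration, these fields could be serialized roughly as follows; the field widths and ordering below are assumptions loosely modeled on the Linux input_event layout, not the transport format actually used by this application:

import struct
import time

def pack_input_event(event_type, code, value):
    # 8-byte seconds, 8-byte microseconds, u16 type, u16 code, s32 value.
    now = time.time()
    sec = int(now)
    usec = int((now - sec) * 1_000_000)
    return struct.pack("<qqHHi", sec, usec, event_type, code, value)

# e.g. pack_input_event(1, 30, 1)  # EV_KEY, KEY_A, key down (illustrative)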
For example, a PAD running the second operating system is provided with an input subsystem that uniformly manages input events. The uinput mechanism implemented on the basis of the input subsystem facilitates simulating input events in user space (userspace). The PAD may create a virtual device (e.g., a virtual mouse or a virtual keyboard) through uinput, configure the attributes of the virtual device, and then write the input event sequence obtained from the first electronic device, e.g., a PC, into the /dev/uinput device file. In this way, input on the PAD using the mouse or keyboard of the PC can be achieved. For a specific description, reference may be made to the foregoing embodiments, and details are not repeated here.
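On a Linux-based device like the PAD described above, the uinput flow can be exercised from user space with the python-evdev bindings. This is a generic illustration of the mechanism (it requires permission to open /dev/uinput), not code from this application:

from evdev import UInput, ecodes

# Create a virtual keyboard capable of emitting the "a" key.
capabilities = {ecodes.EV_KEY: [ecodes.KEY_A]}
ui = UInput(capabilities, name="virtual-shared-keyboard")

# Replay one key press/release received from the peer device.
ui.write(ecodes.EV_KEY, ecodes.KEY_A, 1)  # key down
ui.write(ecodes.EV_KEY, ecodes.KEY_A, 0)  # key up
ui.syn()    # flush the batch into the input subsystem
ui.close()  # destroy the virtual device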
S112, the second electronic device receives and responds to the second input event.
After receiving the second input event, the second electronic device may respond to it accordingly, such as playing music in response to a single-tap instruction, dragging an icon in response to a long-press instruction, or displaying the character corresponding to a first key of the first electronic device's keyboard.
It may be understood that, with reference to the embodiments shown in fig. 11 and fig. 12, in some embodiments, when the display interface layout of the second electronic device changes from the second display interface to a third display interface, the second cursor may be moved from the second position to a sixth position, where the second display interface and the third display interface include the same interface elements and the resolution of the second display interface differs from that of the third display interface. For example, after a tablet computer changes from the landscape state to the portrait state, the layout of the display interface changes, but the mouse pointer can point to the same icon before and after the change. Likewise, when a foldable screen changes from the folded state to the unfolded state, the size and layout of the display interface change, and the mouse pointer can be set to point to the same icon before and after the change.
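A rough sketch of the "same icon before and after" behavior, under the assumption that the device records which element the cursor is over and re-centers the cursor on that element's new bounds after relayout; the dictionary-based layout lookup is purely illustrative:

def remap_cursor_to_element(element_id, new_layout):
    # new_layout maps element ids to (left, top, right, bottom) bounds
    # in the coordinates of the third display interface.
    left, top, right, bottom = new_layout[element_id]
    return ((left + right) // 2, (top + bottom) // 2)

# e.g. after rotation:
# remap_cursor_to_element("music_icon",
#     {"music_icon": (100, 900, 260, 1060)})  # -> (180, 980)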
In some embodiments, when the second cursor is displayed on the second display interface of the second electronic device and the second electronic device is locked, turned off, or restarted, the first cursor may be redisplayed at a certain position in the first display interface of the first electronic device, which makes it convenient for the user to locate and operate the cursor.
As used in the above embodiments, the term "when …" may, depending on the context, be interpreted to mean "if …", "after …", "in response to determining …", or "in response to detecting …". Similarly, depending on the context, the phrase "upon determining …" or "if (a stated condition or event) is detected" may be interpreted to mean "if it is determined …", "in response to determining …", "upon detecting (the stated condition or event)", or "in response to detecting (the stated condition or event)".
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When software is used, the implementation may take the form, in whole or in part, of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of this application are wholly or partially generated. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (e.g., coaxial cable, optical fiber, or digital subscriber line) or a wireless manner (e.g., infrared, radio, or microwave). The computer-readable storage medium may be any usable medium accessible to a computer, or a data storage device such as a server or data center integrating one or more usable media. The usable medium may be a magnetic medium (e.g., a floppy disk, hard disk, or magnetic tape), an optical medium (e.g., a DVD), a semiconductor medium (e.g., a solid-state drive), or the like.
One of ordinary skill in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by a computer program instructing relevant hardware. The program may be stored in a computer-readable storage medium, and when executed, may include the processes of the above method embodiments. The aforementioned storage medium includes any medium capable of storing program code, such as a ROM, a RAM, a magnetic disk, or an optical disc.
The above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they may still modify the technical solutions described in the foregoing embodiments or make equivalent replacements of some technical features, and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.

Claims (28)

1. A method of sharing an input device, the method comprising:
a first electronic device establishes a first connection with a second electronic device;
the first electronic device detects a first movement operation, wherein the first movement operation is an operation of indicating that a first cursor moves out of a first display interface of the first electronic device;
the first electronic device sends a first message to the second electronic device through the first connection;
the second electronic device displays a second cursor at a first position in a second display interface according to the first message;
the first electronic device detects a second movement operation, wherein the second movement operation corresponds to a first offset;
the first electronic device sends a second message to the second electronic device through the first connection, wherein the second message carries the first offset;
and the second electronic device moves the second cursor from the first position to a second position, wherein an offset of the second position relative to the first position is the first offset.
2. The method of claim 1, further comprising:
the first electronic device detects a third movement operation, wherein the third movement operation is an operation of indicating that the second cursor moves out of the second display interface of the second electronic device;
the first electronic device displays the first cursor at a third position in the first display interface.
3. The method of claim 2, wherein the first position is located on a second boundary of the second display interface, and the third position is located on a first boundary of the first display interface.
4. The method of claim 3, wherein the first display interface has four boundaries, the four boundaries being an upper boundary, a lower boundary, a left boundary, and a right boundary of the display interface in the forward layout, and the second display interface has four boundaries, the four boundaries being an upper boundary, a lower boundary, a left boundary, and a right boundary of the display interface in the forward layout;
if the first boundary is a left boundary of the first display interface, the second boundary is a right boundary of the second display interface; if the first boundary is a right boundary of the first display interface, the second boundary is a left boundary of the second display interface; if the first boundary is an upper boundary of the first display interface, the second boundary is a lower boundary of the second display interface; and if the first boundary is a lower boundary of the first display interface, the second boundary is an upper boundary of the second display interface.
5. The method according to any one of claims 1-4, further comprising:
in the case that the second cursor is displayed on the second display interface, the first electronic device detects a first input event, wherein the first input event comes from an input operation collected by a first input device of the first electronic device, and the first input device comprises one or more of the following: a mouse, a keyboard, a handwriting tablet, a camera, a touch pad, a scanner, a stylus, a joystick, and a voice input device;
the first electronic device maps the first input event into a second input event, wherein a first mapping table is stored in the first electronic device, and the first mapping table stores a mapping relation between the first input event and the second input event;
the first electronic device sends a third message to the second electronic device, wherein the third message carries the second input event;
the second electronic device receives the second input event.
6. The method according to any one of claims 1-5, wherein the display area of the first display interface is a visible area corresponding to a first resolution, and the display area of the second display interface is a visible area corresponding to a second resolution, the method further comprising:
after the first electronic device detects the first movement operation, the first electronic device determines the coordinates of the first position of the second cursor displayed on the second display interface according to the coordinates of the first cursor on the first display interface, the first resolution of the first display interface, and the second resolution of the second display interface acquired from the second electronic device.
7. The method of claim 6, further comprising:
the first electronic device creates a virtual screen, wherein a resolution of the virtual screen is the second resolution;
the first electronic device moves the first cursor to a fourth position of the virtual screen while the second electronic device displays a second cursor at a first position in a second display interface, wherein the coordinate value of the fourth position is the same as that of the first position;
when the second movement operation is detected, the first electronic device moves the first cursor from the fourth position to a fifth position, and the coordinate value of the fifth position is the same as that of the second position.
8. The method of any one of claims 1-7, further comprising:
when the display interface layout of the second electronic device is changed from the second display interface to a third display interface, the second electronic device changes the position of the second cursor from the second position to a sixth position, wherein the second display interface and the third display interface contain the same interface elements, and the resolution of the second display interface is different from that of the third display interface.
9. A method of sharing an input device, the method comprising:
a first electronic device establishes a first connection with a second electronic device;
the first electronic device detects a first movement operation, wherein the first movement operation is an operation of indicating that a first cursor moves out of a first display interface of the first electronic device;
the first electronic device sends a first message to the second electronic device through the first connection, wherein the first message is used for informing the second electronic device to display a second cursor;
the first electronic device detects a second movement operation, wherein the second movement operation corresponds to a first offset;
the first electronic device sends a second message to the second electronic device through the first connection, where the second message carries the first offset, the second message is used to notify the second electronic device to move the second cursor from the first position to a second position, and an offset of the second position relative to the first position is the first offset.
10. The method of claim 9, further comprising:
the first electronic device detects a third movement operation, wherein the third movement operation is an operation of indicating that the second cursor moves out of the second display interface of the second electronic device;
the first electronic device displays the first cursor at a third position in the first display interface.
11. The method of claim 10, wherein the first position is located on a second boundary of the second display interface, and the third position is located on a first boundary of the first display interface.
12. The method of claim 11, wherein the first display interface has four boundaries, the four boundaries being an upper boundary, a lower boundary, a left boundary, and a right boundary of the display interface in the forward layout, and the second display interface has four boundaries, the four boundaries being an upper boundary, a lower boundary, a left boundary, and a right boundary of the display interface in the forward layout;
if the first boundary is a left boundary of the first display interface, the second boundary is a right boundary of the second display interface; if the first boundary is a right boundary of the first display interface, the second boundary is a left boundary of the second display interface; if the first boundary is an upper boundary of the first display interface, the second boundary is a lower boundary of the second display interface; and if the first boundary is a lower boundary of the first display interface, the second boundary is an upper boundary of the second display interface.
13. The method according to any one of claims 9-12, further comprising:
in the case that the second cursor is displayed on the second display interface, the first electronic device detects a first input event, wherein the first input event comes from an input operation collected by a first input device of the first electronic device, and the first input device comprises one or more of the following: a mouse, a keyboard, a handwriting tablet, a camera, a touch pad, a scanner, a stylus, a joystick, and a voice input device;
the first electronic device maps the first input event into a second input event, wherein a first mapping table is stored in the first electronic device, and the first mapping table stores a mapping relation between the first input event and the second input event;
and the first electronic device sends a third message to the second electronic device, wherein the third message carries the second input event.
14. The method according to any one of claims 9-13, wherein the display area of the first display interface is a visible area corresponding to a first resolution, and the display area of the second display interface is a visible area corresponding to a second resolution, the method further comprising:
after the first electronic device detects the first movement operation, the first electronic device determines the coordinates of the first position of the second cursor displayed on the second display interface according to the coordinates of the first cursor on the first display interface, the first resolution of the first display interface, and the second resolution of the second display interface acquired from the second electronic device.
15. The method as recited in claim 14, further comprising:
the first electronic device creates a virtual screen, wherein a resolution of the virtual screen is the second resolution;
the first electronic device moves the first cursor to a fourth position of the virtual screen while the second electronic device displays a second cursor at a first position in a second display interface, wherein the coordinate value of the fourth position is the same as that of the first position;
when the second movement operation is detected, the first electronic device moves the first cursor from the fourth position to a fifth position, and the coordinate value of the fifth position is the same as that of the second position.
16. A method of sharing an input device, the method comprising:
a second electronic device establishes a first connection with a first electronic device;
the second electronic device receives a first message from the first electronic device through the first connection, wherein the first message is generated after the first electronic device detects a first movement operation, and the first movement operation is an operation of indicating a first cursor to move out of a first display interface of the first electronic device;
the second electronic device displays the second cursor at a first position in a second display interface according to the first message;
the second electronic device receives a second message from the first electronic device through the first connection, wherein the second message carries a first offset, the second message is generated after the first electronic device detects a second mobile operation, and the second mobile operation corresponds to the first offset;
and the second electronic device moves the second cursor from the first position to a second position, wherein an offset of the second position relative to the first position is the first offset.
17. The method of claim 16, further comprising:
the second electronic device receives a second offset sent by the first electronic device, wherein the second offset is an offset corresponding to a third movement operation, and the third movement operation is an operation of indicating the second cursor to move out of the second display interface of the second electronic device;
and the second electronic device cancels displaying the second cursor.
18. The method of claim 17, wherein the first cursor is displayed at a third position in the first display interface of the first electronic device when the second electronic device cancels displaying the second cursor.
19. The method of claim 18, wherein the first position is located on a second boundary of the second display interface, and the third position is located on a first boundary of the first display interface.
20. The method of claim 19, wherein the first display interface has four borders, and wherein the four borders are an upper border, a lower border, a left border, and a right border of the display interface in the forward layout;
if the first boundary is a left boundary of the first display interface, the second boundary is a right boundary of the second display interface; if the first boundary is a right boundary of the first display interface, the second boundary is a left boundary of the second display interface; if the first boundary is an upper boundary of the first display interface, the second boundary is a lower boundary of the second display interface; and if the first boundary is a lower boundary of the first display interface, the second boundary is an upper boundary of the second display interface.
21. The method according to any one of claims 16-20, further comprising:
in the case that the second cursor is displayed on the second display interface, the second electronic device receives a third message from the first electronic device through the first connection, wherein the third message carries a second input event, the second input event is an input event to which a first input event is correspondingly mapped, a first mapping table is stored in the first electronic device, the first mapping table stores a mapping relationship between the first input event and the second input event, the first input event comes from an input operation collected by a first input device of the first electronic device, and the first input device comprises one or more of the following: a mouse, a keyboard, a handwriting tablet, a camera, a touch pad, a scanner, a stylus, a joystick, and a voice input device.
22. The method according to any one of claims 16 to 21, wherein a display area of the first display interface is a visible area corresponding to a first resolution, a display area of the second display interface is a visible area corresponding to a second resolution, and coordinates of the first position of the second cursor displayed on the second display interface are determined by the first electronic device according to coordinates of the first cursor on the first display interface, the first resolution of the first display interface, and the second resolution of the second display interface obtained from the second electronic device.
23. The method of claim 22, wherein the coordinate value of the first position of the second cursor in the second display interface is the same as the coordinate value of the fourth position of the first cursor in a virtual screen created by the first electronic device, the resolution of the virtual screen is the second resolution, and the fourth position is a position on the virtual screen where the first cursor appears after moving out of the first display interface;
the coordinate value of the second position of the second cursor in the second display interface is the same as the coordinate value of the fifth position of the first cursor in the virtual screen, and the offset of the fifth position relative to the fourth position is the first offset.
24. The method of any one of claims 16-23, further comprising:
when the display interface layout of the second electronic device is changed from the second display interface to a third display interface, the second electronic device changes the position of the second cursor from the second position to a sixth position, wherein the second display interface and the third display interface contain the same interface elements, and the resolution of the second display interface is different from that of the third display interface.
25. An electronic device, characterized in that the electronic device comprises: a communication apparatus, a display screen, a memory, a processor coupled to the memory, a plurality of applications, and one or more programs; the memory stores computer-executable instructions that, when executed by the processor, cause the electronic device to implement the method of any one of claims 9-15.
26. An electronic device, characterized in that the electronic device comprises: a communication apparatus, a display screen, a memory, a processor coupled to the memory, a plurality of applications, and one or more programs; the memory stores computer-executable instructions that, when executed by the processor, cause the electronic device to implement the method of any one of claims 16-24.
27. A communication system, comprising: a first electronic device and a second electronic device, wherein:
the first electronic device is the electronic device of claim 25;
the second electronic device is the electronic device recited in claim 26.
28. A computer storage medium, wherein a computer program is stored in the storage medium, and wherein the computer program comprises executable instructions that, when executed by a processor, cause the processor to perform operations corresponding to the method of any one of claims 9 to 15 and 16 to 24.
CN202110131920.7A 2020-11-30 2021-01-30 Method for sharing input equipment, electronic equipment and system Pending CN114579016A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US18/254,984 US20240045557A1 (en) 2020-11-30 2021-11-29 Method for Sharing Input Device, Electronic Device, and System
PCT/CN2021/134032 WO2022111690A1 (en) 2020-11-30 2021-11-29 Method for sharing input device, electronic devices, and system
EP21897190.1A EP4235371A4 (en) 2020-11-30 2021-11-29 Method for sharing input device, electronic devices, and system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2020113775390 2020-11-30
CN202011377539 2020-11-30

Publications (1)

Publication Number Publication Date
CN114579016A true CN114579016A (en) 2022-06-03

Family

ID=81769920

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110131920.7A Pending CN114579016A (en) 2020-11-30 2021-01-30 Method for sharing input equipment, electronic equipment and system

Country Status (1)

Country Link
CN (1) CN114579016A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023241257A1 (en) * 2022-06-13 2023-12-21 荣耀终端有限公司 Method for establishing connection between devices and terminal device
WO2023241558A1 (en) * 2022-06-17 2023-12-21 华为技术有限公司 Communication method, communication system and mouse
WO2024046117A1 (en) * 2022-09-02 2024-03-07 荣耀终端有限公司 Data transmission method and terminal device
WO2024046315A1 (en) * 2022-09-02 2024-03-07 荣耀终端有限公司 Event processing method and apparatus for input device

Similar Documents

Publication Publication Date Title
CN109917956B (en) Method for controlling screen display and electronic equipment
CN112130742B (en) Full screen display method and device of mobile terminal
CN112217923B (en) Display method of flexible screen and terminal
KR102534354B1 (en) System navigation bar display control method, graphical user interface and electronic device
CN110536004B (en) Method for applying multiple sensors to electronic equipment with flexible screen and electronic equipment
CN113645351B (en) Application interface interaction method, electronic device and computer-readable storage medium
WO2021036770A1 (en) Split-screen processing method and terminal device
WO2022042285A1 (en) Method for displaying interface of application program and electronic device
CN112598594A (en) Color consistency correction method and related device
CN114579016A (en) Method for sharing input equipment, electronic equipment and system
CN112506386A (en) Display method of folding screen and electronic equipment
CN110559645B (en) Application operation method and electronic equipment
CN112671976A (en) Control method of electronic equipment and electronic equipment
CN114115769A (en) Display method and electronic equipment
CN110401768B (en) Method and device for adjusting working state of electronic equipment
WO2022001258A1 (en) Multi-screen display method and apparatus, terminal device, and storage medium
WO2022143180A1 (en) Collaborative display method, terminal device, and computer readable storage medium
CN115129410A (en) Desktop wallpaper configuration method and device, electronic equipment and readable storage medium
CN115016697A (en) Screen projection method, computer device, readable storage medium, and program product
CN115150542A (en) Video anti-shake method and related equipment
WO2022111690A1 (en) Method for sharing input device, electronic devices, and system
CN115032640B (en) Gesture recognition method and terminal equipment
CN113391775A (en) Man-machine interaction method and equipment
WO2022062902A1 (en) File transfer method and electronic device
CN108881715B (en) Starting method and device of shooting mode, terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination