CN112527222A - Information processing method and electronic equipment - Google Patents

Information processing method and electronic equipment

Info

Publication number
CN112527222A
Authority
CN
China
Prior art keywords
electronic device
screen
cursor
content
display screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910937409.9A
Other languages
Chinese (zh)
Inventor
邵凯
薛朝栋
徐亚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Publication of CN112527222A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1454: Digital output to display device involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346: Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Abstract

The application provides an information processing method, which includes the following steps: generating screen projection content; and sending the screen projection content to a second electronic device, so that a display screen of the second electronic device displays the screen projection content, where the screen projection content includes a cursor and interface content of the first electronic device, and the cursor indicates an operation position in the interface content. In the application, because the cursor is added to the screen projection content, the user can determine the position to be operated from the cursor displayed by the second electronic device alone, without looking at the interface content of the first electronic device, and can then execute an operation at the position of the cursor.

Description

Information processing method and electronic equipment
Technical Field
The present application relates to the field of electronic technologies, and in particular, to an information processing method and an electronic device.
Background
With the popularization of smart televisions and network set-top boxes, the screen projection function is used by more and more people. In family, work, and teaching scenarios, the picture of a mobile phone can be displayed on a large screen, and the content can be shared with the people around.
In the prior art, in a screen projection scenario, a user can control the remote screen only by operating the display screen of the mobile phone, so the user's gaze has to shift back and forth between the mobile phone and the remote screen, which results in a poor experience.
Disclosure of Invention
In a first aspect, the present application provides an information processing method, applied to a first electronic device, the method including:
generating screen projection content;
sending the screen projection content to a second electronic device, so that a display screen of the second electronic device displays the screen projection content, where the screen projection content includes a cursor and interface content of the first electronic device, and the cursor indicates an operation position in the interface content.
Optionally, in a design of the first aspect, the generating screen projection content includes:
acquiring interface content of a front-end application of the first electronic device, and generating a cursor on the interface content to obtain the screen projection content.
Optionally, in a design of the first aspect, the generating screen projection content includes:
acquiring interface content of a front-end application of the first electronic device, and generating a cursor and a menu bar on the interface content to obtain the screen projection content, where the menu bar does not belong to the interface content of the first electronic device.
Optionally, in a design of the first aspect, before the generating the screen projection content, the method further includes:
detecting that the first electronic device and the second electronic device establish screen projection connection.
Optionally, in a design of the first aspect, the method further includes:
acquiring pose change information of the first electronic equipment;
causing the cursor to move in the screen projection content displayed by the second electronic device based on the pose change information.
Optionally, in one design of the first aspect, the pose change information includes a first rotation angle of the first electronic device on a horizontal plane and a second rotation angle of the first electronic device on a vertical plane, and the causing the cursor to move in the screen-projected content displayed by the second electronic device based on the pose change information includes:
determining a horizontal displacement of the cursor according to the first rotation angle;
determining a vertical displacement of the cursor according to the second rotation angle;
and moving the cursor in the screen projection content displayed by the second electronic device according to the horizontal displacement and the vertical displacement.
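For illustration only, the following Kotlin sketch shows one way such a rotation-to-displacement mapping could be implemented. The class name, the pixels-per-degree sensitivity, and the clamping behavior are assumptions made for this sketch and are not specified by the application.

```kotlin
// Illustrative sketch: maps the first/second rotation angles of the first
// electronic device to horizontal/vertical cursor displacement on the second
// electronic device. Sensitivity and clamping are assumed values.
data class CursorPosition(var x: Float, var y: Float)

class PoseCursorMapper(
    private val screenWidth: Float,        // width of the second device's display, in px
    private val screenHeight: Float,       // height of the second device's display, in px
    private val pxPerDegree: Float = 20f   // assumed sensitivity: pixels per degree
) {
    val cursor = CursorPosition(screenWidth / 2f, screenHeight / 2f)

    // firstRotationDeg: rotation on the horizontal plane (yaw delta).
    // secondRotationDeg: rotation on the vertical plane (pitch delta).
    fun onPoseChange(firstRotationDeg: Float, secondRotationDeg: Float): CursorPosition {
        val horizontalDisplacement = firstRotationDeg * pxPerDegree
        val verticalDisplacement = secondRotationDeg * pxPerDegree
        // Clamp so the cursor stays inside the projected content.
        cursor.x = (cursor.x + horizontalDisplacement).coerceIn(0f, screenWidth)
        cursor.y = (cursor.y + verticalDisplacement).coerceIn(0f, screenHeight)
        return cursor
    }
}
```

The displacement-based design described next can reuse the same structure, multiplying the first and second displacements by a gain instead of the rotation angles.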
Optionally, in a design of the first aspect, the pose change information includes a first displacement of the first electronic device in a first direction and a second displacement of the first electronic device in a second direction, where the first direction is parallel to the left-right moving direction of the cursor displayed on the display screen of the second electronic device, the second direction is parallel to the up-down moving direction of the cursor displayed on the display screen of the second electronic device, and the causing the cursor to move in the screen projection content displayed by the second electronic device based on the pose change information includes:
determining a horizontal displacement of the cursor according to the first displacement;
determining a vertical displacement of the cursor according to the second displacement;
and moving the cursor in the screen projection content displayed by the second electronic device according to the horizontal displacement and the vertical displacement.
Optionally, in a design of the first aspect, the method further includes:
when the cursor moves to a target object in the screen projection content displayed by the second electronic device, and a first touch operation on the display screen of the first electronic device is received,
causing the target object in the screen projection content displayed by the second electronic device to respond to the first touch operation.
Optionally, in a design of the first aspect, an operation position of the first touch operation corresponds to a second object in the interface content of the first electronic device, and the method further includes:
and shielding the response of the second object to the first touch operation.
Optionally, in a design of the first aspect, the first touch operation includes at least a click operation and a first sliding operation, and the target object includes at least an application and a functionality control.
Optionally, in a design of the first aspect, the first touch operation includes at least a click operation and a first sliding operation, and the causing the target object in the screen projection content displayed by the second electronic device to respond to the first touch operation includes:
generating a click event corresponding to the click operation, and executing the click event on the target object in the interface content of the first electronic device, so that the target object in the screen projection content displayed by the second electronic device responds to the click operation; or
generating a sliding event corresponding to the first sliding operation, and executing the sliding event on the target object in the interface content of the first electronic device, so that the target object in the screen projection content displayed by the second electronic device responds to the first sliding operation.
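As a rough sketch of this event-injection idea, a touch received on the first device's display can be re-targeted to the cursor position before being executed on the front-end application. The FrontEndApp interface and the event types below are illustrative stand-ins, not a real platform API.

```kotlin
// Illustrative sketch: a touch on the first device's display becomes an event
// at the *cursor* position and is executed on the front-end application, so
// the object under the cursor (not under the finger) responds.
sealed class InjectedEvent {
    data class Click(val x: Float, val y: Float) : InjectedEvent()
    data class Slide(val fromX: Float, val fromY: Float,
                     val toX: Float, val toY: Float) : InjectedEvent()
}

interface FrontEndApp {                  // stand-in for the projected application
    fun execute(event: InjectedEvent)
}

class TouchForwarder(private val app: FrontEndApp) {
    // Returning true consumes the original touch, which "shields" the object
    // at the finger's position on the first device from responding (see the
    // shielding designs above).
    fun onClick(cursorX: Float, cursorY: Float): Boolean {
        app.execute(InjectedEvent.Click(cursorX, cursorY))
        return true
    }

    fun onSlide(cursorX: Float, cursorY: Float, dx: Float, dy: Float): Boolean {
        app.execute(InjectedEvent.Slide(cursorX, cursorY, cursorX + dx, cursorY + dy))
        return true
    }
}
```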
Optionally, in a design of the first aspect, the method further includes:
determining the displacement of the cursor according to the first sliding operation;
causing the cursor to move in the screen projection content displayed by the second electronic device based on the displacement of the cursor.
Optionally, in a design of the first aspect, the method further includes:
receiving a second sliding operation on the display screen of the first electronic device;
determining the displacement of the cursor according to the second sliding operation;
causing the cursor to move in the screen projection content displayed by the second electronic device based on the displacement.
Optionally, in a design of the first aspect, an operation position corresponding to the second sliding operation corresponds to a third object in the interface content of the first electronic device, and the method further includes:
and shielding the response of the third object to the second sliding operation.
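A minimal sketch of this touchpad-style mapping is given below, assuming a simple linear gain; the gain value is an assumption for illustration, not part of the application.

```kotlin
// Illustrative sketch: the slide distance on the first device's display is
// scaled into a cursor displacement in the screen projection content; the
// original slide is consumed so the third object under the finger does not respond.
class TouchpadMapper(private val gain: Float = 1.5f) {
    fun onSecondSlide(dxOnPhone: Float, dyOnPhone: Float): Pair<Float, Float> {
        val cursorDx = dxOnPhone * gain
        val cursorDy = dyOnPhone * gain
        return cursorDx to cursorDy   // applied to the cursor; the slide itself is shielded
    }
}
```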
Optionally, in a design of the first aspect, the method further includes:
when the cursor moves to a target object in the screen projection content displayed by the second electronic device, and a click operation on the display screen of the first electronic device is received,
causing the target object in the screen projection content displayed by the second electronic device to respond to the click operation.
Optionally, in a design of the first aspect, an operation position of the click operation corresponds to a fourth object in the interface content of the first electronic device, and the method further includes:
and shielding the response of the fourth object to the click operation.
Optionally, in a design of the first aspect, the method further includes:
receiving a pressing operation on a physical key of the first electronic equipment;
generating a corresponding second event according to the pressing operation;
and executing the second event on the front-end application of the first electronic device.
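A toy sketch of this key-event path follows; the key codes and the event type are assumptions introduced for illustration.

```kotlin
// Illustrative sketch: a physical key press on the first device is turned into
// a corresponding "second event" and executed on the front-end application.
enum class PhysicalKey { VOLUME_UP, VOLUME_DOWN, POWER, BACK }

data class KeyEvent(val key: PhysicalKey)

class KeyEventInjector(private val executeOnFrontEndApp: (KeyEvent) -> Unit) {
    fun onKeyPressed(key: PhysicalKey) {
        val secondEvent = KeyEvent(key)       // generate the corresponding second event
        executeOnFrontEndApp(secondEvent)     // execute it on the front-end application
    }
}
```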
Optionally, in a design of the first aspect, the method further includes:
receiving a first operation on a display screen of the first electronic device, wherein the first operation is a preset shortcut operation;
causing the first electronic device to respond to the first operation.
Optionally, in a design of the first aspect, an operation position of the first operation corresponds to a fifth object in interface content of the second electronic device, and the fifth object does not respond to the first operation.
Optionally, in a design of the first aspect, the preset shortcut operation at least includes:
contacting a first preset area of the display screen of the first electronic device, and sliding from the first preset area in a first preset direction; or
contacting a second preset area of the display screen of the first electronic device, and sliding from the second preset area in a second preset direction, where the duration of contact with the display screen of the first electronic device is longer than a preset duration.
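One way such preset shortcut operations could be recognized is sketched below; the edge width and hold-time thresholds are illustrative assumptions, not values given by the application.

```kotlin
// Illustrative sketch: recognizing the two preset shortcut operations.
data class TouchPoint(val x: Float, val y: Float, val timeMs: Long)

class ShortcutDetector(
    private val edgeWidthPx: Float = 40f,     // assumed size of the "first preset area"
    private val holdThresholdMs: Long = 500L  // assumed "preset duration" for the second shortcut
) {
    // First shortcut: the touch starts in the preset edge area and slides inward.
    fun isEdgeSwipe(down: TouchPoint, move: TouchPoint): Boolean =
        down.x <= edgeWidthPx && move.x > down.x

    // Second shortcut: contact held longer than the preset duration, then a slide.
    fun isLongPressSlide(down: TouchPoint, move: TouchPoint): Boolean =
        move.timeMs - down.timeMs > holdThresholdMs &&
            (move.x != down.x || move.y != down.y)
}
```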
Optionally, in a design of the first aspect, the method further includes:
reducing the display brightness of the display screen of the first electronic device; or
executing a screen-off operation on the display screen of the first electronic device.
Optionally, in a design of the first aspect, a display area of the display screen of the first electronic device is smaller than a display area of the display screen of the second electronic device.
In a second aspect, the present application provides an electronic device, used as a first electronic device, including:
a processing module, used for generating screen projection content; and
a sending module, used for sending the screen projection content to a second electronic device, so that a display screen of the second electronic device displays the screen projection content, where the screen projection content includes a cursor and interface content of the first electronic device, and the cursor indicates an operation position in the interface content.
Optionally, in a design of the second aspect, the processing module is specifically configured to:
acquiring interface content of a front-end application of the first electronic device, and generating a cursor on the interface content to obtain the screen projection content.
Optionally, in a design of the second aspect, the processing module is specifically configured to:
acquiring interface content of a front-end application of the first electronic device, and generating a cursor and a menu bar on the interface content to obtain the screen projection content, where the menu bar does not belong to the interface content of the first electronic device.
Optionally, in a design of the second aspect, the processing module is further configured to detect that the first electronic device and the second electronic device establish a screen-casting connection.
Optionally, in a design of the second aspect, the processing module is further configured to acquire pose change information of the first electronic device;
causing the cursor to move in the screen projection content displayed by the second electronic device based on the pose change information.
Optionally, in a design of the second aspect, the pose change information includes a first rotation angle of the first electronic device on a horizontal plane and a second rotation angle of the first electronic device on a vertical plane, and the processing module is specifically configured to:
determining a horizontal displacement of the cursor according to the first rotation angle;
determining a vertical displacement of the cursor according to the second rotation angle;
and moving the cursor in the screen projection content displayed by the second electronic device according to the horizontal displacement and the vertical displacement.
Optionally, in a design of the second aspect, the pose change information includes a first displacement of the first electronic device in a first direction and a second displacement of the first electronic device in a second direction, where the first direction is parallel to the left-right moving direction of the cursor displayed on the display screen of the second electronic device, the second direction is parallel to the up-down moving direction of the cursor displayed on the display screen of the second electronic device, and the processing module is further configured to:
determining a horizontal displacement of the cursor according to the first displacement;
determining a vertical displacement of the cursor according to the second displacement;
and moving the cursor in the screen projection content displayed by the second electronic device according to the horizontal displacement and the vertical displacement.
Optionally, in a design of the second aspect, the processing module is further configured to:
when the cursor moves to a target object in the screen projection content displayed by the second electronic device, and a first touch operation on the display screen of the first electronic device is received,
causing the target object in the screen projection content displayed by the second electronic device to respond to the first touch operation.
Optionally, in a design of the second aspect, an operation position of the first touch operation corresponds to a second object in the interface content of the first electronic device, and the processing module is further configured to:
and shielding the response of the second object to the first touch operation.
Optionally, in a design of the second aspect, the first touch operation includes at least a click operation and a first sliding operation, and the target object includes at least an application and a functionality control.
Optionally, in a design of the second aspect, the processing module is further configured to:
receiving a second sliding operation on a display screen of the first electronic equipment;
determining the displacement of the cursor according to the second sliding operation;
causing the cursor to move in the screen projection content displayed by the second electronic device based on the displacement.
Optionally, in a design of the second aspect, an operation position corresponding to the second sliding operation corresponds to a third object in the interface content of the first electronic device, and the processing module is further configured to:
and shielding the response of the third object to the second sliding operation.
Optionally, in a design of the second aspect, the processing module is further configured to:
when the cursor moves to a target object in the screen projection content displayed by the second electronic device, and a click operation on the display screen of the first electronic device is received,
causing the target object in the screen projection content displayed by the second electronic device to respond to the click operation.
Optionally, in a design of the second aspect, an operation position of the click operation corresponds to a fourth object in the interface content of the first electronic device, and the processing module is further configured to:
and shielding the response of the fourth object to the click operation.
Optionally, in a design of the second aspect, the processing module is further configured to:
receiving a pressing operation on a physical key of the first electronic equipment;
generating a corresponding second event according to the pressing operation;
and executing the second event on the front-end application of the first electronic device.
Optionally, in a design of the second aspect, the processing module is further configured to:
receiving a first operation on a display screen of the first electronic device, wherein the first operation is a preset shortcut operation;
causing the first electronic device to respond to the first operation.
Optionally, in a design of the second aspect, the operation position of the first operation corresponds to a fifth object in the interface content of the second electronic device, and the fifth object does not respond to the first operation.
Optionally, in a design of the second aspect, the preset shortcut operation at least includes:
contacting a first preset area of the display screen of the first electronic device, and sliding from the first preset area in a first preset direction; or
contacting a second preset area of the display screen of the first electronic device, and sliding from the second preset area in a second preset direction, where the duration of contact with the display screen of the first electronic device is longer than a preset duration.
Optionally, in a design of the second aspect, the processing module is further configured to:
reducing the display brightness of the display screen of the first electronic device; or
executing a screen-off operation on the display screen of the first electronic device.
Optionally, in a design of the second aspect, a display area of the display screen of the first electronic device is smaller than a display area of the display screen of the second electronic device.
In a third aspect, the present application provides an operating method applied to screen projection, the method including:
projecting interface content of the first electronic device to a second electronic device, so that a second display screen of the second electronic device displays the interface content;
receiving a first operation acting on a first display screen of the first electronic device; and causing the interface content displayed by the second display screen to respond to the first operation, where an operation position of the first operation corresponds to a first object in the interface content of the first electronic device, and the first object does not respond to the first operation.
Optionally, in a design of the third aspect, the causing the interface content displayed by the second display screen to respond to the first operation specifically includes:
determining a first position in the interface content displayed by the first electronic device, and causing the interface content displayed by the second display screen to respond to the first operation based on the first position, where the first position is independent of the operation position of the first operation.
Optionally, in a design of the third aspect, the method further includes:
projecting a cursor to the second electronic device, so that the second display screen displays the cursor.
Optionally, in a design of the third aspect, the position corresponding to the cursor in the interface content displayed by the first electronic device is the first position.
Optionally, in a design of the third aspect, the determining a first position in the interface content displayed by the first electronic device includes:
acquiring pose change information of the first electronic equipment;
determining a first position of the cursor in interface content displayed by the first electronic device based on the pose change information.
Optionally, in a design of the third aspect, the pose change information includes a first rotation angle of the first electronic device on a horizontal plane and a second rotation angle of the first electronic device on a vertical plane, and the determining the first position of the cursor in the interface content displayed by the first electronic device based on the pose change information includes:
determining a horizontal displacement of the cursor according to the first rotation angle;
determining a vertical displacement of the cursor according to the second rotation angle;
and determining a first position of the cursor in the interface content displayed by the first electronic equipment according to the horizontal displacement and the vertical displacement.
Optionally, in a design of the third aspect, the pose change information includes a first displacement of the first electronic device in a first direction and a second displacement of the first electronic device in a second direction, where the first direction is parallel to the left-right moving direction of the cursor displayed on the display screen of the second electronic device and the second direction is parallel to the up-down moving direction of the cursor, and the determining the first position of the cursor in the interface content displayed by the first electronic device based on the pose change information includes:
determining a horizontal displacement of the cursor according to the first displacement;
determining a vertical displacement of the cursor according to the second displacement;
and determining a first position of the cursor in the interface content displayed by the first electronic equipment according to the horizontal displacement and the vertical displacement.
Optionally, in a design of the third aspect, the first operation includes at least a click operation and a first sliding operation, and the first object includes at least one of an application and a functionality control.
Optionally, in a design of the third aspect, the determining a first position in the interface content displayed by the first electronic device includes:
receiving a second sliding operation on a display screen of the first electronic equipment;
determining the displacement of the cursor according to the second sliding operation;
and determining a first position of the cursor in the interface content displayed by the first electronic equipment according to the displacement of the cursor.
Optionally, in a design of the third aspect, the first operation includes at least a click operation, and the first object includes at least one of an application and a functionality control.
Optionally, in a design of the third aspect, the first operation is a preset shortcut operation, and after the receiving a first operation acting on the first display screen, the method further includes:
causing the first electronic device to respond to the first operation.
Optionally, in a design of the third aspect, the operation position of the first operation corresponds to a third object in the interface content of the second electronic device, and the third object does not respond to the first operation.
Optionally, in a design of the third aspect, the preset shortcut operation at least includes:
contacting a first preset area of the display screen of the first electronic device, and sliding from the first preset area in a first preset direction; or
contacting a second preset area of the display screen of the first electronic device, and sliding from the second preset area in a second preset direction, where the duration of contact with the display screen of the first electronic device is longer than a preset duration.
Optionally, in a design of the third aspect, the method further includes:
reducing the display brightness of the first display screen; or
executing a screen-off operation on the first display screen.
Optionally, in a design of the third aspect, the method further includes:
generating a menu bar;
and projecting the menu bar to the second electronic device, so that the second display screen displays the menu bar.
Optionally, in one design of the third aspect, before the interface content of the first electronic device is projected on the second electronic device, so that the second display screen displays the interface content, the method further includes:
detecting that the first electronic device and the second electronic device establish screen projection connection.
In a fourth aspect, the present application provides a first electronic device, comprising:
the processing module is used for projecting the interface content of the first electronic device to the second electronic device, so that the second display screen of the second electronic device displays the interface content;
the processing module is further used for receiving a first operation acting on a first display screen of the first electronic device, and causing the interface content displayed by the second display screen to respond to the first operation, where the operation position of the first operation corresponds to a first object in the interface content of the first electronic device, and the first object does not respond to the first operation.
Optionally, in a design of the fourth aspect, the processing module is specifically configured to:
determining a first position in the interface content displayed by the first electronic equipment; causing the interface content displayed by the second display screen to respond to the first operation based on the first location; wherein the first position is independent of an operating position of the first operation.
Optionally, in a design of the fourth aspect, the processing module is further configured to:
projecting a cursor to the second electronic device, so that the second display screen displays the cursor.
Optionally, in a design of the fourth aspect, the position corresponding to the cursor in the interface content displayed by the first electronic device is the first position.
Optionally, in a design of the fourth aspect, the processing module is specifically configured to:
acquiring pose change information of the first electronic equipment;
determining a first position of the cursor in interface content displayed by the first electronic device based on the pose change information.
Optionally, in a design of the fourth aspect, the pose change information includes a first rotation angle of the first electronic device in a horizontal plane and a second rotation angle of the first electronic device in a vertical plane, and the processing module is specifically configured to:
determining a horizontal displacement of the cursor according to the first rotation angle;
determining a vertical displacement of the cursor according to the second rotation angle;
and determining a first position of the cursor in the interface content displayed by the first electronic equipment according to the horizontal displacement and the vertical displacement.
Optionally, in a design of the fourth aspect, the pose change information includes a first displacement of the first electronic device in a first direction and a second displacement of the first electronic device in a second direction, where the first direction is parallel to a left-right moving direction of a cursor displayed on a display screen of the second electronic device, and the second direction is parallel to an up-down moving direction of the cursor displayed on the display screen of the second electronic device, and the processing module is specifically configured to:
determining a horizontal displacement of the cursor according to the first displacement;
determining a vertical displacement of the cursor according to the second displacement;
and determining a first position of the cursor in the interface content displayed by the first electronic equipment according to the horizontal displacement and the vertical displacement.
Optionally, in a design of the fourth aspect, the first operation includes at least a click operation and a first sliding operation, and the first object includes at least one of an application and a functionality control.
Optionally, in a design of the fourth aspect, the processing module is specifically configured to:
receiving a second sliding operation on a display screen of the first electronic equipment;
determining the displacement of the cursor according to the second sliding operation;
and determining a first position of the cursor in the interface content displayed by the first electronic equipment according to the displacement of the cursor.
Optionally, in a design of the fourth aspect, the first operation includes at least a click operation, and the first object includes at least one of an application and a functionality control.
Optionally, in a design of the fourth aspect, the first operation is a preset shortcut operation, and the processing module is further configured to:
causing the first electronic device to respond to the first operation.
Optionally, in a design of the fourth aspect, the operation position of the first operation corresponds to a third object in the interface content of the second electronic device, and the third object does not respond to the first operation.
Optionally, in a design of the fourth aspect, the preset shortcut operation at least includes:
contacting a first preset area of the display screen of the first electronic device, and sliding from the first preset area in a first preset direction; or
contacting a second preset area of the display screen of the first electronic device, and sliding from the second preset area in a second preset direction, where the duration of contact with the display screen of the first electronic device is longer than a preset duration.
Optionally, in a design of the fourth aspect, the processing module is further configured to:
reducing the display brightness of the first display screen; or
executing a screen-off operation on the first display screen.
Optionally, in a design of the fourth aspect, the processing module is further configured to:
generating a menu bar;
and projecting the menu bar to the second electronic device, so that the second display screen displays the menu bar.
Optionally, in a design of the fourth aspect, the processing module is further configured to:
detecting that the first electronic device and the second electronic device establish screen projection connection.
In a fifth aspect, the present application provides an electronic device for use as a first electronic device, comprising a processor, an input device, an output device, and a memory, wherein the memory is configured to store a computer program comprising program instructions that, when executed by the processor, cause the first electronic device to perform the method according to the first aspect.
In a sixth aspect, the present application provides an electronic device, which is used as a first electronic device and comprises a processor, an input device, an output device and a memory, wherein the memory is used for storing a computer program comprising program instructions, which when executed by the processor, cause the first electronic device to perform the method according to the third aspect.
In a seventh aspect, the present application provides a screen projection system, including a first electronic device and a second electronic device, where the first electronic device and the second electronic device are connected through a screen projection connection.
In an eighth aspect, the present application provides a screen projection system, including a first electronic device and a second electronic device, where the first electronic device and the second electronic device are connected through a screen projection connection.
In a ninth aspect, the present application provides a computer readable storage medium having stored thereon a computer program comprising program instructions which, when executed by a computer, cause the computer to perform the method of the first aspect as described above.
In a tenth aspect, the present application provides a computer-readable storage medium having stored thereon a computer program comprising program instructions which, when executed by a computer, cause the computer to perform the method of the third aspect as described above.
In an eleventh aspect, the present application provides a manipulation method applied to a first electronic device connected to a second electronic device, wherein the second electronic device includes an imaging system; the method comprises the following steps:
displaying the interface content of N applications running on the first electronic device in N display areas of the imaging system, respectively, where the first electronic device includes a first display screen, and N is an integer greater than 1;
displaying a cursor in the imaging system, wherein the cursor is used for determining an operation object in the content displayed by the imaging system;
receiving a first sliding operation acting on the first display screen;
determining a displacement of the cursor in content displayed by the imaging system according to the first sliding operation;
where the starting position of the first sliding operation corresponds to a first object in the current interface content of the first electronic device, the first object does not respond to the first sliding operation, and the current interface content is the interface content of one of the N applications.
Optionally, in a design of the eleventh aspect, the second electronic device includes a television, an AR device, a VR device, or an MR device.
Optionally, in a design of the eleventh aspect, the method further includes:
determining a first operation object in interface contents of N applications displayed by the imaging system according to the cursor, wherein the first operation object at least comprises an application program and a function control;
and receiving a second operation, and causing the first operation object to respond to the second operation.
Optionally, in a design of the eleventh aspect, the method further includes:
reducing the display brightness of the first display screen; or
executing a screen-off operation on the first display screen.
Optionally, in a design of the eleventh aspect, the method further includes:
displaying a menu bar in the imaging system; the menu bar includes functionality controls for adding or deleting interface content for an application in the imaging system.
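As a sketch of how the first electronic device might track the N projected display areas and apply the menu bar's add/delete controls, consider the following; the types and the re-indexing policy are illustrative assumptions.

```kotlin
// Illustrative sketch: bookkeeping for the N application interfaces shown in
// the imaging system, with add/delete operations driven by the menu bar.
data class ProjectedApp(val appId: String, val displayAreaIndex: Int)

class ImagingSystemLayout {
    private val areas = mutableListOf<ProjectedApp>()

    fun addApp(appId: String): ProjectedApp {
        val app = ProjectedApp(appId, areas.size)   // next free display area
        areas += app
        return app
    }

    fun deleteApp(appId: String) {
        areas.removeAll { it.appId == appId }
        // Re-index the remaining display areas (simplistic policy for the sketch).
        for (i in areas.indices) areas[i] = areas[i].copy(displayAreaIndex = i)
    }

    fun displayAreas(): List<ProjectedApp> = areas.toList()
}
```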
In a twelfth aspect, the present application provides a first electronic device, wherein the first electronic device is connected to a second electronic device; the first electronic device comprises a first display screen, and the second electronic device comprises an imaging system; the first electronic device includes:
the sending module is used for respectively displaying the interface contents of N applications operated by the first electronic equipment in N display areas in the imaging system; the first electronic equipment comprises a first display screen, and N is an integer greater than 1; further for displaying a cursor in the imaging system, the cursor for determining an operational object in content displayed by the imaging system;
the processing module is used for receiving a first sliding operation acting on the first display screen, and determining the displacement of the cursor in the content displayed by the imaging system according to the first sliding operation;
where the starting position of the first sliding operation corresponds to a first object in the current interface content of the first electronic device, the first object does not respond to the first sliding operation, and the current interface content is the interface content of one of the N applications.
Optionally, in a design of the twelfth aspect, the second electronic device includes a television, an AR device, a VR device, or an MR device.
Optionally, in a design of the twelfth aspect, the processing module is further specifically configured to:
determining a first operation object in interface contents of N applications displayed by the imaging system according to the cursor, wherein the first operation object at least comprises an application program and a function control;
and receiving a second operation, and causing the first operation object to respond to the second operation.
Optionally, in a design of the twelfth aspect, the processing module is further configured to:
reducing the display brightness of the first display screen; or
executing a screen-off operation on the first display screen.
Optionally, in a design of the twelfth aspect, the sending module is further configured to:
displaying a menu bar in the imaging system; the menu bar includes functionality controls for adding or deleting interface content for an application in the imaging system.
In a thirteenth aspect, the present application provides a first electronic device, comprising:
a processor, an input device, an output device, and a memory;
wherein the input device is used for receiving data and instructions; the output device is used for sending data and instructions; the memory is for storing a computer program comprising program instructions that, when executed by the processor, cause the first electronic device to perform the method of the eleventh aspect.
The embodiment of the application provides an information processing method, which includes the following steps: generating screen projection content; and sending the screen projection content to a second electronic device, so that a display screen of the second electronic device displays the screen projection content, where the screen projection content includes a cursor and interface content of the first electronic device, and the cursor indicates an operation position in the interface content. In this manner, because the cursor is added to the screen projection content, the user can determine the position to be operated from the cursor displayed by the second electronic device alone, without looking at the interface content of the first electronic device, and can then execute an operation at the position of the cursor.
Drawings
FIG. 1 is a system architecture diagram of a projection system provided by an embodiment of the present application;
fig. 2 is a schematic structural diagram of a first electronic device provided in an embodiment of the present application;
FIG. 3a is a block diagram of a software architecture of an electronic device according to an embodiment of the present application;
fig. 3b is a schematic flowchart of an information processing method according to an embodiment of the present application;
fig. 4a is a schematic interface diagram of a first electronic device according to an embodiment of the present disclosure;
fig. 4b is a schematic interface diagram of a first electronic device according to an embodiment of the present disclosure;
fig. 4c is a schematic interface diagram of a first electronic device according to an embodiment of the present disclosure;
fig. 5a is a schematic interface diagram of a first electronic device according to an embodiment of the present disclosure;
fig. 5b is a schematic interface diagram of a first electronic device according to an embodiment of the present disclosure;
fig. 5c is a schematic interface diagram of a first electronic device according to an embodiment of the present disclosure;
FIG. 6a is a schematic diagram of an actual screen projection scenario;
FIG. 6b is a schematic diagram of an actual screen projection scenario;
FIG. 6c is a schematic diagram of an actual screen projection scenario;
FIG. 6d is a schematic diagram of an actual screen projection scenario;
FIG. 7a is a schematic diagram of an actual screen projection scenario;
FIG. 7b is a schematic diagram illustrating a pose change of the first electronic device in the horizontal direction;
FIG. 7c is a schematic diagram illustrating the displacement of the cursor displayed by the second electronic device;
FIG. 7d is a schematic diagram of an actual screen projection scenario;
FIG. 7e is a schematic diagram illustrating a pose change of the first electronic device in the vertical direction;
FIG. 7f is a schematic diagram illustrating the displacement of the cursor displayed by the second electronic device;
FIG. 7g is a schematic diagram of an actual screen projection scenario;
FIG. 7h is a schematic diagram of a sliding operation performed by a user;
FIG. 7i is a schematic diagram of a sliding operation performed by a user;
FIG. 7j is a schematic diagram illustrating the displacement of the cursor displayed by the second electronic device;
fig. 8a is a schematic screen projection content diagram of a second electronic device according to an embodiment of the present application;
fig. 8b is a schematic screen projection content diagram of a second electronic device according to an embodiment of the present application;
fig. 9a is a schematic view of screen projection contents of a second electronic device according to an embodiment of the present application;
fig. 9b is a schematic view of screen projection contents of a second electronic device according to an embodiment of the present application;
fig. 9c is a schematic view of screen projection contents of a second electronic device according to an embodiment of the present application;
fig. 9d is a schematic diagram illustrating a user operating a first electronic device according to an embodiment of the present application;
Fig. 10a is a schematic screen projection content diagram of a second electronic device according to an embodiment of the present application;
fig. 10b is a schematic screen projection content diagram of a second electronic device according to an embodiment of the present application;
fig. 10c is an interaction diagram of a first electronic device according to an embodiment of the present application;
fig. 10d is a schematic interface content diagram of a first electronic device according to an embodiment of the present application;
FIG. 11a is an interaction diagram of a first electronic device and a second electronic device;
FIG. 11b is an interaction diagram of a first electronic device and a second electronic device;
FIG. 11c is an interaction diagram of a first electronic device and a second electronic device;
FIG. 11d is an interaction diagram of a first electronic device and a second electronic device;
FIG. 11e is an interaction diagram of a first electronic device and a second electronic device;
FIG. 11f is an interaction diagram of a first electronic device and a second electronic device;
FIG. 11g is an interaction diagram of a first electronic device and a second electronic device;
fig. 12a is a schematic interface diagram of a second electronic device according to an embodiment of the present disclosure;
fig. 12b is a schematic interface diagram of a second electronic device according to an embodiment of the present disclosure;
Fig. 12c is a schematic interface diagram of a second electronic device according to an embodiment of the present application;
fig. 12d is a schematic diagram illustrating a user operating a first electronic device according to an embodiment of the present application;
fig. 12e is a schematic interface diagram of a second electronic device according to an embodiment of the present application;
fig. 12f is a schematic interface diagram of a second electronic device according to an embodiment of the present application;
fig. 12g is a schematic diagram illustrating a user operating a first electronic device according to an embodiment of the present application;
fig. 12h is a schematic interface diagram of a second electronic device according to an embodiment of the present disclosure;
fig. 12i is a schematic interface diagram of a second electronic device according to an embodiment of the present disclosure;
fig. 12j is an interface schematic diagram of a second electronic device according to an embodiment of the present application;
fig. 12k is a schematic interface diagram of a second electronic device according to an embodiment of the present application;
fig. 12l is a schematic diagram illustrating a user operating a first electronic device according to an embodiment of the present application;
fig. 12m is a schematic diagram illustrating a user operating a first electronic device according to an embodiment of the present application;
fig. 12n is a schematic interface diagram of a second electronic device according to an embodiment of the present application;
fig. 12o is a schematic diagram illustrating a user operating a first electronic device according to an embodiment of the present application;
Fig. 12p is a schematic interface diagram of a second electronic device according to an embodiment of the present application;
fig. 12q is a schematic diagram of a user operating a first electronic device according to an embodiment of the present application;
fig. 12r is a schematic interface diagram of a second electronic device according to an embodiment of the present application;
fig. 12s is a schematic diagram illustrating a user operating a first electronic device according to an embodiment of the present application;
fig. 12t is a schematic interface diagram of a second electronic device according to an embodiment of the present disclosure;
fig. 12u is a schematic diagram illustrating a user operating a first electronic device according to an embodiment of the present application;
fig. 13 is a schematic view of a screen projection scene provided in an embodiment of the present application;
fig. 14a is a schematic screen projection content diagram of a second electronic device according to an embodiment of the present application;
fig. 14b is a schematic screen projection content diagram of a second electronic device according to an embodiment of the present application;
fig. 14c is a schematic view of screen projection contents of a second electronic device according to an embodiment of the present application;
fig. 14d is a schematic diagram illustrating a user operating a first electronic device according to an embodiment of the present application;
fig. 14e is a schematic view of screen projection contents of a second electronic device according to an embodiment of the present application;
fig. 14f is a schematic diagram illustrating a user operating a first electronic device according to an embodiment of the present application;
Fig. 14g is a schematic view of screen projection contents of a second electronic device according to an embodiment of the present application;
fig. 15 is a schematic flowchart of an information processing method according to an embodiment of the present application;
FIG. 16 is a schematic view of an embodiment of the present application;
FIG. 17 is a schematic view of an embodiment of the present application;
fig. 18 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 19 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 20 is an interaction diagram of a first electronic device and a second electronic device;
FIG. 21a is a schematic diagram illustrating a connection of a first electronic device to a second electronic device;
FIG. 21b is a schematic diagram illustrating the connection of a first electronic device to a second electronic device;
FIG. 22 includes schematic diagrams 22a, 22b, 22c, and 22d illustrating an interaction between a first electronic device and a second electronic device;
FIG. 23a is a diagram illustrating a user interaction with a first electronic device;
FIG. 23b is a diagram illustrating a user interaction with a first electronic device;
FIG. 23c is a schematic diagram of a user interaction with a first electronic device;
FIG. 23d is a schematic diagram of a user interaction with the first electronic device;
FIG. 24a is a schematic diagram of an interaction of a user using a first electronic device according to an embodiment of the present application;
FIG. 24b is a schematic diagram illustrating an interaction of a user with a first electronic device according to an embodiment of the present application;
FIG. 24c is a schematic diagram of an interaction of a user with a first electronic device according to an embodiment of the present application;
fig. 25 is an operation diagram of a first electronic device according to an embodiment of the present application;
FIG. 26 is a schematic diagram illustrating an interaction of a user using a first electronic device according to an embodiment of the present application;
FIG. 27 is a block diagram of a system architecture according to an embodiment of the present application;
FIG. 28 is a block diagram of a system architecture according to an embodiment of the present application;
fig. 29 is a schematic flowchart of an information processing method according to an embodiment of the present application;
fig. 30 is a control diagram of multiple screens according to an embodiment of the present application.
Detailed Description
The embodiments of the application provide an information processing method and an electronic device, with which a user can use a mobile phone as a control device for an augmented reality (AR) device, a virtual reality (VR) device, or a mixed reality (MR) device.
Embodiments of the present application are described below with reference to the accompanying drawings. As can be known to those skilled in the art, with the development of technology and the emergence of new scenarios, the technical solution provided in the embodiments of the present application is also applicable to similar technical problems.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the terms so used are interchangeable under appropriate circumstances and are merely descriptive of the various ways in which objects of the same nature may be described in connection with the embodiments of the application. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of elements is not necessarily limited to those elements, but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
The screen projection method may include wired screen projection and wireless screen projection. In wired screen projection, a connection between electronic devices may be established through a high-definition multimedia interface (HDMI), and media data may be transmitted over an HDMI transmission line. In wireless screen projection, a connection between multiple electronic devices may be established through, for example, the Miracast protocol, and media data may be transmitted over a wireless local area network (WLAN), for example, Wi-Fi.
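Purely as an illustration of this wired/wireless split, a device could pick a transport as sketched below; the capability flags and the fallback order are assumptions, and real HDMI/Miracast negotiation is far more involved.

```kotlin
// Illustrative sketch: choosing a screen projection transport.
enum class Transport { HDMI_WIRED, MIRACAST_WIRELESS }

fun chooseTransport(hdmiCableAttached: Boolean, wlanAvailable: Boolean): Transport? =
    when {
        hdmiCableAttached -> Transport.HDMI_WIRED        // media data over the HDMI cable
        wlanAvailable     -> Transport.MIRACAST_WIRELESS // media data over the WLAN
        else              -> null                        // no screen projection possible
    }
```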
The screen projection system of the present application may include at least two electronic devices and a screen projection port, where the screen projection port may include a wired port and/or a wireless port. The wired port may be an HDMI port; the wireless port may be an application programming interface (API) or a hardware screen projection module. Referring to fig. 1, fig. 1 is a system architecture diagram of a screen projection system provided by an embodiment of the present application. As shown in fig. 1, the screen projection system includes a first electronic device 100 and a second electronic device 200. The first electronic device 100 may include a first wired port 101 and/or a first wireless port 102, and the second electronic device 200 may include a second wired port 201 and/or a second wireless port 202. The first wired port 101 and the first wireless port 102 may be integrated on the first electronic device 100 or exist independently of the first electronic device 100; the second wired port 201 and the second wireless port 202 may be integrated on the second electronic device 200 or exist independently of the second electronic device 200, which is not limited in this embodiment. The first electronic device 100 and the second electronic device 200 may establish a screen projection connection through a screen projection port (a wired port or a wireless port). The first electronic device 100 at least has a screen projection transmission (Source) capability.
The first electronic device 100 may include an enhanced interaction service. The enhanced interaction service may obtain sensor information of the electronic device (e.g., from the sensor inputs in fig. 1), including, but not limited to, posture information of the first electronic device, and may also obtain touch screen information of the electronic device (e.g., from the touch screen inputs in fig. 1), including, but not limited to, touch information on the touch screen. How the enhanced interaction service of the first electronic device uses the obtained sensor information and touch screen information will be described in the following embodiments, and details are not repeated here.
In an implementation manner, the enhanced interaction service may also obtain interface content of a front-end application of the first electronic device, draw other images (for example, a cursor and a menu bar) on the interface content, and send the drawn interface content to the screen-casting service, where the screen-casting service may generate screen-casting content based on the drawn interface content, and send the screen-casting content to the second electronic device 200, so that the display screen of the second electronic device displays the screen-casting content.
In another implementation manner, the enhanced interaction service may add other images (e.g., a cursor and a menu bar) to the interface content of the front-end application of the first electronic device through a floating-window interface of the system. The screen-casting service may then acquire the interface content of the front-end application to which the other images have been added, generate screen-casting content based on it, and send the screen-casting content to the second electronic device 200, so that the display screen of the second electronic device displays the screen-casting content.
The first electronic device 100 may include a screen projection service, which is configured to implement a screen projection sending (source) capability, for example, the screen projection service may acquire interface content of the first electronic device as screen projection data, or may take interface content drawn by the enhanced interaction service as screen projection content, and send the screen projection content to the second electronic device 200 through the first wireless port or the first wired port.
The second electronic device 200 may be, but is not limited to, equipped with a screen projection reception (Sink) capability and an image display capability. The second electronic device 200 may include a screen-casting service for implementing the screen projection reception (Sink) capability, and the screen-casting service may display received screen-casting content on a display screen of the second electronic device.
Examples of the first electronic device 100 include, but are not limited to, electronic devices equipped with iOS, Android, Microsoft, or another operating system. Optionally, the first electronic device 100 may be an electronic device such as a mobile phone, a tablet computer, a personal digital assistant (PDA), or a desktop computer. The second electronic device 200 may be a television, a tablet computer, or a desktop computer.
It should be noted that the first electronic device may be an electronic device having a display function.
It should be noted that, in some scenarios, if the second electronic device is a television, a tablet computer, or a desktop computer, the size of the display area of the display screen of the second electronic device may be larger than the size of the display area of the display screen of the first electronic device.
For ease of understanding, the structure of the first electronic device 100 provided in the embodiments of the present application will be described below by way of example. Referring to fig. 2, fig. 2 is a schematic structural diagram of a first electronic device provided in an embodiment of the present application.
As shown in fig. 2, the first electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the first electronic device 100. In other embodiments of the present application, the first electronic device 100 may include more or fewer components than shown, or combine certain components, or split certain components, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
Wherein the controller may be a neural center and a command center of the first electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
It should be understood that the interfacing relationship between the modules illustrated in the embodiment of the present application is only an exemplary illustration, and does not constitute a structural limitation for the first electronic device 100. In other embodiments of the present application, the first electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like.
The wireless communication function of the first electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
In some possible implementations, the first electronic device 100 may communicate with other devices using wireless communication capabilities. For example, the first electronic device 100 may communicate with the second electronic device 200, the first electronic device 100 establishes a screen-cast connection with the second electronic device 200, the first electronic device 100 outputs screen-cast data to the second electronic device 200, and so on. The screen projection data output by the first electronic device 100 may be audio and video data. The communication process of the first electronic device 100 and the second electronic device 200 can refer to the related description of the following embodiments, and the details are not repeated herein.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the first electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G applied to the first electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform filtering, amplification, and other processing on the received electromagnetic waves, and transmit them to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor and convert it into electromagnetic waves through the antenna 1 for radiation. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays images or videos through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the first electronic device 100, including wireless local area networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on it, and convert it into electromagnetic waves through the antenna 2 for radiation.
In some embodiments, the antenna 1 of the first electronic device 100 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160 so that the first electronic device 100 can communicate with networks and other devices through wireless communication technology. The wireless communication technology may include global system for mobile communications (GSM), General Packet Radio Service (GPRS), code division multiple access (code division multiple access, CDMA), Wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), Long Term Evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The first electronic device 100 implements the display function through the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, connected to a display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the first electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
In some possible implementations, the display screen 194 may be used to display various interfaces of the system output of the first electronic device 100. The interfaces output by the first electronic device 100 can refer to the relevant description of the subsequent embodiments.
The first electronic device 100 may implement a photographing function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP to be converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the first electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals.
Video codecs are used to compress or decompress digital video. The first electronic device 100 may support one or more video codecs. In this way, the first electronic device 100 can play or record video in a plurality of encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, and MPEG-4.
The NPU is a neural-network (NN) computing processor, which processes input information quickly by referring to a biological neural network structure, for example, by referring to a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU may implement applications such as intelligent recognition of the first electronic device 100, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the first electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications and data processing of the first electronic device 100 by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. Wherein, the storage program area can store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, a phonebook, etc.) created during use of the first electronic device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
The first electronic device 100 can implement audio functions, such as music playing and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. In some possible implementations, the audio module 170 may be used to play the sound corresponding to a video. For example, when the display screen 194 displays a video playing picture, the audio module 170 outputs the sound of the video.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal.
The microphone 170C, also referred to as a "microphone," is used to convert sound signals into electrical signals.
The headphone interface 170D is used to connect a wired headphone. The headphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a cellular telecommunications industry association of the USA (CTIA) standard interface.
The pressure sensor 180A is used for sensing a pressure signal, and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The gyro sensor 180B may be used to determine the motion attitude of the first electronic device 100. The air pressure sensor 180C is used to measure air pressure.
The acceleration sensor 180E can detect the magnitude of acceleration of the first electronic device 100 in various directions (typically along three axes). The magnitude and direction of gravity can be detected when the first electronic device 100 is stationary. The acceleration sensor can also be used to identify the posture of the electronic device, and is applied to landscape/portrait switching, pedometers, and other applications.
A distance sensor 180F for measuring a distance.
The ambient light sensor 180L is used to sense the ambient light level.
The fingerprint sensor 180H is used to collect a fingerprint.
The temperature sensor 180J is used to detect temperature.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen. The touch sensor 180K is used to detect a touch operation applied on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the type of touch event. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the first electronic device 100 at a position different from that of the display screen 194.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys. The first electronic device 100 may receive a key input, and generate a key signal input related to user setting and function control of the first electronic device 100.
The motor 191 may generate a vibration cue.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card.
The above is a description of the structure of the first electronic device 100; next, the software structure of the first electronic device is described. The software system of the first electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. Examples of the first electronic device 100 include, but are not limited to, electronic devices equipped with iOS, Android, Microsoft, or another operating system. In the embodiments of the present application, an Android system is taken as an example to illustrate the software structure of the first electronic device 100.
Fig. 3a is a block diagram of a software structure of the first electronic device 100 according to the embodiment of the present application.
As shown in fig. 3a, the layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom. The application layer may include a series of application packages.
As shown in fig. 3a, the application package may include applications such as camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 3a, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The interface content may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide communication functions of the first electronic device 100, for example, management of call status (including connected, disconnected, and the like).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar, and can be used to convey notification-type messages that automatically disappear after a short stay without user interaction. For example, the notification manager is used to notify download completion, message reminders, and the like. The notification manager may also present notifications in the form of a chart or scroll-bar text in the top status bar of the system, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is sounded, the electronic device vibrates, or an indicator light flashes.
The Android runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing the Android system.
The core library comprises two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application layer and the application framework layer as binary files. The virtual machine is used for performing the functions of object life cycle management, stack management, thread management, safety and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), media libraries (media libraries), three-dimensional graphics processing libraries (e.g., OpenGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG-4, H.264, MP3, AAC, AMR, JPG, and PNG.
The graphics processing library is used for realizing drawing, image rendering, synthesis, layer processing and the like of 2D or 3D graphics.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
It should be noted that the second electronic device may have all or a part of the structure shown in fig. 2 and fig. 3a, and is not limited herein. For convenience of understanding, the following embodiments of the present application will specifically describe a screen projection method provided by the embodiments of the present application by taking a first electronic device having a structure shown in fig. 2 and fig. 3a as an example, with reference to the accompanying drawings and application scenarios.
In this embodiment, the second electronic device 200 may be an electronic device such as a television, a tablet computer, or a desktop computer, and the screen projection method provided in this embodiment is described below by taking the second electronic device as a television.
First, a process of establishing a screen-cast connection between the first electronic device 100 and the second electronic device 200 will be described.
It should be noted that the number of the second electronic devices may be one or more, that is, the first electronic device may establish a screen-projecting connection with one second electronic device, or may establish a screen-projecting connection with multiple second electronic devices at the same time.
Next, how the first electronic device establishes a screen-casting connection with the second electronic device is explained.
In an alternative mode, the first electronic device can realize screen projection connection with the second electronic device by selecting a screen projection control carried by the system.
Optionally, referring to fig. 4a, fig. 4a is an interface schematic diagram of a first electronic device provided in an embodiment of the present application. As shown in fig. 4a, the user's finger may slide down on the top area of the main interface of the first electronic device 100. When the first electronic device 100 detects the slide-down operation on the main interface, the notification management interface 40 shown in fig. 4b is displayed. As shown in fig. 4b, the notification management interface 40 includes, among others, a mobile data icon 402, a wireless network icon 403, and a wireless screen projection icon 401. The first electronic device 100 may start the wireless screen projection function when the user clicks the wireless screen projection icon 401 on the notification management interface 40.
Optionally, in this embodiment of the application, if the wireless network is not enabled when the user clicks the wireless screen projection icon 401 on the notification management interface 40, the first electronic device 100 may prompt the user to enable the wireless network or enable it automatically, and prompt the user to select a wireless network connection. In the embodiment of the present application, the first electronic device 100 automatically searches, through wireless fidelity (Wi-Fi), for screen projection devices (electronic devices with screen projection transmission/reception capability) connected to the Wi-Fi network. The first electronic device 100 may display a search/selection box 404 shown in fig. 4c on the notification management interface 40, where the search/selection box 404 includes the names of the one or more screen projection devices found, to prompt the user to select one of them to establish a screen projection connection. After the user selects the screen projection device to be projected to, the first electronic device 100 may establish a screen projection connection with the screen projection device selected by the user.
In an alternative, the first electronic device may also enable a screen-cast connection with the second electronic device by selecting a screen-cast control in some applications (e.g., a video application or an application for presentation).
Optionally, referring to fig. 5a, fig. 5a is an interface schematic diagram of a first electronic device according to an embodiment of the present application. As shown in fig. 5a, a plurality of applications, such as a video application (APP), may be installed in the first electronic device 100. The video APP can be used for watching videos, live broadcasts, novels, and/or comics, and at least supports the screen projection sending function. The video APP may be pre-installed when the first electronic device 100 leaves the factory, or may be downloaded and installed by the user. The video APP may be developed by the manufacturer of the first electronic device 100 or by a third-party manufacturer.
As shown in fig. 5a, the interface of the first electronic device may further include: a status bar, and icons for a plurality of other applications, such as an icon for a social APP, etc. The status bar may include a WiFi icon, a signal strength, a current remaining power, a current time, and/or the like. In some possible implementations, the status bar may also include a bluetooth icon, an alarm icon, and the like. When the first electronic device 100 detects a click event of a finger or a stylus of a user for an application icon, in response to the click event, the first electronic device 100 starts an application and displays a user interface corresponding to the application icon. For example, when the first electronic device 100 detects that a finger of the user touches the icon 501 of the video APP, the first electronic device 100 starts the video APP in response to the touch event and displays a main interface of the video APP. The user can select a video on the main interface of the video APP, and accordingly, the first electronic device 100 can receive the video selected by the user and display the video playing interface.
Optionally, in an embodiment, after the first electronic device 100 starts the video APP, the user may search for or select a video to play on the main interface of the video APP; the first electronic device 100 searches the cloud platform for the video name entered by the user in the search bar and displays the icons of the videos found. After the user selects the video to be played, the user can click the icon of the video to enter the video playing interface. The video APP of the first electronic device 100 acquires the resource (e.g., a video stream) of the video selected by the user from the cloud platform, and parses and outputs the video image. Referring to fig. 5b, fig. 5b is a schematic view of a video playing interface of the video APP of the first electronic device provided in an embodiment of the present application. Illustratively, the video selected by the user is the first episode of "Lovely Oneself", and the corresponding video playing interface 50 can be as shown in fig. 5b. Illustratively, the video playing interface 50 may include a video image 504, a screen projection control 502, an episode selection control 506, a return control 503, a full-screen control 505, and the like.
In addition to the elements and controls described above, the video playing interface 50 may include more content, such as a cache control and a comment control. The user can touch the cache control on the video playing interface 50 with a finger to download the video, or touch the comment control on the video playing interface 50 to view and make comments. It can be understood that the description of the interface is only an example; for different video APPs or different videos, the corresponding video playing interfaces, the elements and controls included in the full-screen playing interface, the arrangement of those elements and controls, and the like may differ, which is not limited here.
If the user wants to screen-cast the video on the second electronic device 200 for playing, the user can click the screen-casting control 502 on the video playing interface 50 through a finger or a stylus pen, and accordingly, the first electronic device 100 can receive a screen-casting indication generated by the user clicking the screen-casting control 502. The first electronic device 100 may detect whether the first electronic device 100 has currently established a screen-cast connection. When the first electronic device 100 detects that the screen projection connection is not currently established, the first electronic device 100 may search for a screen projection device, display the searched screen projection device, and prompt the user to select the screen projection device.
For example, when the first electronic device 100 detects that a screen projection connection is not currently established, the first electronic device 100 searches, through the wireless network, for one or more screen projection devices connected to the wireless network. The screen projection device in the embodiment of the application is an electronic device with screen projection transmission (Source)/reception (Sink) capability. When the first electronic device 100 finds one or more screen projection devices, referring to fig. 5c, which is a schematic view of the interface content for screen projection device selection provided in an embodiment of the present application, the first electronic device 100 may display a search/selection box 507 on the interface, where the search/selection box 507 includes the names of the one or more screen projection devices found, to prompt the user to select one of them to establish a screen projection connection. Illustratively, the search/selection box 507 includes the names of the found devices with screen projection reception capability: the television in office 1, the television in office 2, and my computer. After the user selects the device to be projected to, the first electronic device 100 establishes a screen projection connection with the selected device through the Miracast protocol of the Wi-Fi display (WFD) technology. Illustratively, the screen projection device selected by the user is the television in office 1, and the first electronic device 100 establishes a screen projection connection with that television. For convenience of description, the screen projection device selected by the user is referred to below as the second electronic device 200.
It should be noted that the interface content for screen projection device selection may further include a refresh control 509 and an exit control 508, which is not limited in the present application.
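As an illustration of the device search step described above, the following is a minimal sketch of peer discovery over Android's Wi-Fi P2P API, on which Miracast device discovery is commonly built. The class name and callback wiring are illustrative assumptions, not part of the patent, and the runtime location/nearby-devices permission check is omitted.

```java
import android.content.Context;
import android.net.wifi.p2p.WifiP2pManager;
import android.net.wifi.p2p.WifiP2pManager.Channel;

// Minimal sketch of discovering nearby screen projection (Sink) devices
// over Wi-Fi Direct. Requires the location / nearby-devices runtime
// permission, which is assumed to have been granted already.
public class ProjectionDeviceFinder {
    private final WifiP2pManager manager;
    private final Channel channel;

    public ProjectionDeviceFinder(Context context) {
        manager = (WifiP2pManager) context.getSystemService(Context.WIFI_P2P_SERVICE);
        channel = manager.initialize(context, context.getMainLooper(), null);
    }

    public void startDiscovery() {
        manager.discoverPeers(channel, new WifiP2pManager.ActionListener() {
            @Override
            public void onSuccess() {
                // Peers arrive asynchronously via the
                // WIFI_P2P_PEERS_CHANGED_ACTION broadcast; the UI can then
                // populate a search/selection box like the one in fig. 5c.
            }

            @Override
            public void onFailure(int reason) {
                // e.g. prompt the user to enable the wireless network first
            }
        });
    }
}
```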
Optionally, in another embodiment, the user may also start the wireless screen projection function of the first electronic device 100 by clicking on another control on the first electronic device 100, which is not limited in this application.
Next, a process for establishing a screen-casting connection between the first electronic device and the second electronic device is described.
Optionally, in an embodiment, the user may establish the wired screen-casting connection between the first electronic device 100 and the second electronic device 200 through a High Definition Multimedia Interface (HDMI).
Optionally, in another embodiment, the first electronic device 100 may establish a screen-casting connection with the second electronic device 200 through a signal distribution terminal. In this embodiment, the signal distribution terminal may include a signal distributor or a signal splitter, which is not limited in this application. During operation, the electronic device serving as the signal distribution terminal may run a certain application program to receive the screen projection data sent by the first electronic device and further distribute the screen projection data to the second electronic device, thereby implementing the screen projection function in this application. The application program may be a screen projection application dedicated to the screen projection function, or another application that includes a screen projection function.
It should be noted that commonly used electronic devices can already support a wireless screen projection service through conventional processing methods, for example, by configuring the device to support a wireless screen projection protocol. Commonly used protocols include Miracast, DLNA (Digital Living Network Alliance), the AirPlay protocol, and the like; a conventional approach is to install, on the electronic device, a wireless screen projection application compatible with its operating system so that the device can support the wireless screen projection service. Of course, the electronic devices (the first electronic device and the second electronic device) may be configured in other ways to support the wireless screen projection service, which is not particularly limited in this application.
Optionally, in this embodiment, after the first electronic device establishes the screen projection connection with the second electronic device, the first electronic device may send the screen projection data to the second electronic device, and the second electronic device may display the screen projection content corresponding to the screen projection data sent by the first electronic device.
Optionally, in this embodiment, after the first electronic device establishes the screen-projecting connection with the second electronic device, the first electronic device may indirectly send the screen-projecting data to the second electronic device: the first electronic device may send the screen projection data to the signal distribution terminal, and then the signal distribution terminal further sends the screen projection data to the second electronic device.
In fact, this application only explains that the screen projection data originates from the first electronic device and is finally obtained and displayed by the second electronic device; how the screen projection data is transmitted from the first electronic device to the second electronic device is not limited in this application.
The process of establishing the screen-casting connection between the first electronic device 100 and the second electronic device 200 is described above, and how the user interacts with the screen-casting content displayed on the second electronic device during the screen-casting process based on the operation of the first electronic device is described next.
Referring to fig. 3b, fig. 3b is a flowchart of an information processing method according to an embodiment of the present application, and as shown in fig. 3b, the information processing method includes:
301. The first electronic device generates screen projection content.
In this embodiment of the application, the first electronic device may generate the screen-projecting content after detecting that the first electronic device establishes the screen-projecting connection with the second electronic device.
In the embodiment of the application, the screen projection content may include, but is not limited to, a cursor and interface content of the first electronic device.
Optionally, in this embodiment of the application, the first electronic device may obtain interface content of a front-end application of the first electronic device, and generate a cursor on the interface content to obtain screen projection content.
The shape of the cursor may be a mouse pointer or another shape, which is not limited in this application. It should be noted that the cursor may be used to locate an operation position in the interface content, and the cursor may move on the display screen of the second electronic device in response to the user's operation on the first electronic device (changing its posture or sliding on its display screen). How the cursor moves on the display screen of the second electronic device based on the user's operation on the first electronic device will be described in the following embodiments, and details are not repeated here.
Specifically, the first electronic device may obtain the interface content of the current front-end application. For example, the first electronic device may obtain the interface content of the current front-end application based on a screen recording interface provided by the system (e.g., the MediaProjection interface provided by Android) and draw a cursor on the obtained interface content. The first electronic device may use the drawn content as the screen projection content, and the screen projection service of the first electronic device may obtain the screen projection content and send it (possibly after an encoding operation and/or size conversion of the content) to the second electronic device based on a certain screen projection protocol, so that the display screen of the second electronic device displays the screen projection content.
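The "draw a cursor on the captured interface content" step can be sketched as follows. This is a minimal illustration that assumes the MediaProjection/VirtualDisplay capture pipeline (not shown) delivers each frame as a Bitmap; the class and method names are illustrative, not the patent's own.

```java
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.drawable.Drawable;

// Minimal sketch: overlay the cursor on each captured frame before the
// frame is handed to the screen projection service for encoding.
public class CursorComposer {
    private final Drawable cursorIcon;   // e.g. a mouse-pointer drawable
    private int cursorX, cursorY;        // updated from sensor/touch input

    public CursorComposer(Drawable cursorIcon) {
        this.cursorIcon = cursorIcon;
    }

    public void moveCursor(int x, int y) {
        cursorX = x;
        cursorY = y;
    }

    // Returns a mutable copy of the captured frame with the cursor drawn in.
    public Bitmap composeFrame(Bitmap capturedFrame) {
        Bitmap out = capturedFrame.copy(Bitmap.Config.ARGB_8888, true);
        Canvas canvas = new Canvas(out);
        cursorIcon.setBounds(cursorX, cursorY,
                cursorX + cursorIcon.getIntrinsicWidth(),
                cursorY + cursorIcon.getIntrinsicHeight());
        cursorIcon.draw(canvas);
        return out;  // then encoded and sent to the second electronic device
    }
}
```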
Optionally, in this embodiment of the application, the first electronic device may also add the image information corresponding to the cursor to the interface content of the front-end application of the first electronic device based on a floating-window interface, to generate the screen projection data. In this case, the screen projection service of the first electronic device may obtain the screen projection data and send it (which may require encoding and size conversion of the content) to the second electronic device based on a certain screen projection protocol, so that the display screen of the second electronic device displays the screen projection content corresponding to the screen projection data.
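The floating-window variant can be sketched with Android's overlay window mechanism as follows. The sketch assumes the "display over other apps" (SYSTEM_ALERT_WINDOW) permission has been granted; because the cursor is an overlay view on top of the front-end application, an ordinary screen capture already contains it.

```java
import android.content.Context;
import android.graphics.PixelFormat;
import android.view.Gravity;
import android.view.View;
import android.view.WindowManager;

// Minimal sketch: add the cursor as a non-touchable system overlay so it
// appears on top of the front-end application's interface content.
public class FloatingCursor {
    private final WindowManager windowManager;
    private final View cursorView;   // e.g. a small ImageView with the cursor icon
    private final WindowManager.LayoutParams params;

    public FloatingCursor(Context context, View cursorView) {
        this.windowManager = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
        this.cursorView = cursorView;
        params = new WindowManager.LayoutParams(
                WindowManager.LayoutParams.WRAP_CONTENT,
                WindowManager.LayoutParams.WRAP_CONTENT,
                WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY,
                WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE
                        | WindowManager.LayoutParams.FLAG_NOT_TOUCHABLE,
                PixelFormat.TRANSLUCENT);
        params.gravity = Gravity.TOP | Gravity.START;
        windowManager.addView(cursorView, params);
    }

    // Reposition the cursor overlay; (x, y) in screen pixels.
    public void moveTo(int x, int y) {
        params.x = x;
        params.y = y;
        windowManager.updateViewLayout(cursorView, params);
    }
}
```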
It should be noted that, after the second electronic device establishes connection with the first electronic device, the cursor may be directly displayed on the display screen of the second electronic device, or the cursor may not be directly displayed on the display screen of the second electronic device, but the cursor is displayed on the display screen of the second electronic device only after the user performs a certain operation on the first electronic device.
Specifically, after the second electronic device establishes a connection with the first electronic device, the second electronic device may directly display a cursor in a central area of the displayed screen-shot content or in another preset area.
In addition, after the second electronic device establishes a connection with the first electronic device, the second electronic device may, instead of immediately displaying the cursor in the central area of the displayed screen projection content or in another predetermined area, detect the attitude information of the first electronic device, and display the cursor in the central area of the displayed screen projection content or in another predetermined area once a change in the attitude information of the first electronic device is detected. From the user's perspective, if the user does not see the cursor on the display screen of the second electronic device, the user can wave the first electronic device to change its posture information and thereby trigger the display of the cursor on the second electronic device.
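By way of illustration only: the posture change itself would be sensed by the gyroscope of the first electronic device (the gyro sensor 180B above), which could then notify its peer to show the cursor. The following sketch shows such detection; the threshold value and callback wiring are assumptions, not details from the patent.

```java
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Sketch: treat a gyroscope reading above a small threshold as
// "attitude information changed", e.g. the user waving the device.
public class CursorActivationDetector implements SensorEventListener {
    private static final float THRESHOLD_RAD_PER_S = 0.5f;  // illustrative
    private final Runnable onAttitudeChanged;

    public CursorActivationDetector(Context context, Runnable onAttitudeChanged) {
        this.onAttitudeChanged = onAttitudeChanged;
        SensorManager sm = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        sm.registerListener(this, sm.getDefaultSensor(Sensor.TYPE_GYROSCOPE),
                SensorManager.SENSOR_DELAY_GAME);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float x = event.values[0], y = event.values[1], z = event.values[2];
        if (Math.sqrt(x * x + y * y + z * z) > THRESHOLD_RAD_PER_S) {
            onAttitudeChanged.run();  // e.g. draw the cursor in the central area
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```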
Optionally, in an embodiment, after the first electronic device and the second electronic device establish the screen projection connection, the cursor may not be displayed on the second electronic device at first, and the user may activate the cursor display of the second electronic device by performing a touch operation on the display screen of the first electronic device, where the touch operation may be a click operation, a double-click operation, or a long-press operation in a preset area on the display screen of the first electronic device, which is not limited in this application.
Illustratively, refer to fig. 6a, which is a schematic view of an actual screen projection scene. As shown in fig. 6a, the user holds the first electronic device 100, and the second electronic device 200 displays the screen projection content 60 sent by the first electronic device 100, where the screen projection content 60 includes the current interface content of the first electronic device 100 (e.g., the interface shown in fig. 4a). It should be noted that the interface content of the first electronic device 100 may be, but is not limited to, the interface content of the front-end application.
Alternatively, in another embodiment, based on the example shown in fig. 5c, after the user selects the screen projection device, refer to fig. 6d, which is a schematic view of an actual screen projection scene. As shown in fig. 6d, the second electronic device 200 displays the screen projection content 60 sent by the first electronic device 100, where the screen projection content 60 includes the current interface content of the first electronic device 100 (the video playing interface shown in fig. 5b), and the screen projection content 60 further includes a cursor 601.
In this embodiment of the application, after the first electronic device detects that the screen projection connection with the second electronic device is established, the display brightness of the display screen of the first electronic device may also be reduced, or the screen-off operation may be performed on the display screen of the first electronic device.
Optionally, in this embodiment of the application, refer to fig. 6b, which is a schematic view of an actual screen projection scene. As shown in fig. 6b, the interface content of the first electronic device 100 held by the user is the interface shown in fig. 4a (the main interface of the first electronic device). After the screen projection connection is established between the first electronic device and the second electronic device, as shown in fig. 6c, the first electronic device turns off its screen. It should be noted that, at this time, the first electronic device does not display the page in fig. 4a; fig. 4a merely shows the page that the first electronic device would otherwise present, and what the first electronic device actually displays at this time is shown in fig. 6c.
Optionally, in another embodiment, after the first electronic device and the second electronic device establish the screen projection connection, a control for choosing whether to enter the screen-off state may be displayed on the first electronic device (or invoked through another predetermined operation), and the user may tap this control to cause the first electronic device to enter the screen-off state.
Compared with the prior art, in which the first electronic device needs to keep its screen lit during screen projection, in this embodiment of the application, after the screen projection connection is established between the first electronic device and the second electronic device, the first electronic device can reduce the display brightness of its display screen or directly enter the screen-off state, thereby reducing the energy consumption of the first electronic device.
It should be noted that, in this embodiment of the application, after a first electronic device detects that a screen projection connection is established with a second electronic device, display brightness of a display screen of the first electronic device may be reduced through a brightness adjustment interface of a system, or a screen-off operation is performed on the display screen of the first electronic device.
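As a sketch of the brightness-reduction option, Android's per-window brightness attribute can be used as shown below. An actual implementation might instead use a system-level brightness-adjustment interface or a true screen-off path, which are vendor specific, so this is illustrative only.

```java
import android.view.Window;
import android.view.WindowManager;

// Sketch: dim the first electronic device's display once the screen
// projection connection is established, and restore it afterwards.
public final class BrightnessHelper {
    private BrightnessHelper() { }

    public static void dimForProjection(Window window) {
        WindowManager.LayoutParams lp = window.getAttributes();
        lp.screenBrightness = 0.05f;  // range 0.0f..1.0f; near-dark, not off
        window.setAttributes(lp);
    }

    public static void restore(Window window) {
        WindowManager.LayoutParams lp = window.getAttributes();
        // Fall back to the user's system brightness setting.
        lp.screenBrightness = WindowManager.LayoutParams.BRIGHTNESS_OVERRIDE_NONE;
        window.setAttributes(lp);
    }
}
```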
It should be noted that, in this embodiment of the application, the first electronic device is only in the screen-off state; at this time, the applications in the first electronic device are still running, and the screen projection service in the first electronic device can still obtain the interface content of the current front-end application.
302. The first electronic device sends the screen projection content to a second electronic device so that a display screen of the second electronic device displays the screen projection content, wherein the screen projection content comprises a cursor and interface content of the first electronic device, and the cursor is used for being positioned at an operation position in the interface content.
In this embodiment of the application, after the first electronic device generates the screen projection content, the screen projection content may be sent to the second electronic device, so that the display screen of the second electronic device displays the screen projection content, where the screen projection content includes a cursor and interface content of the first electronic device, and the cursor is used to be positioned at an operation position in the interface content.
For the implementation of step 302, reference may be made to the related description in the above embodiments of the first electronic device sending the screen projection data to the second electronic device, and details are not repeated here.
In this embodiment of the application, to reduce the energy consumption of the first electronic device, the first electronic device enters the screen-off state after the screen projection connection is established with the second electronic device. At this time, the display screen of the first electronic device is black, and the user cannot operate objects on the first electronic device by operating on the interface content. For example, in the scene shown in fig. 6d, the user wants to tap the return control or drag the progress bar; however, because the display screen of the first electronic device is black, the user cannot locate the return control or the progress bar on the display screen of the first electronic device, and therefore cannot tap the return control or drag the progress bar. The following discusses how, in this embodiment of the application, the user operates the screen projection content displayed by the second electronic device while the first electronic device is in the screen-off state.
First, how the user adjusts the display position of the cursor on the display screen of the second electronic device is described.
In this embodiment, the cursor displayed by the second electronic device may be used to locate the operation position in the interface content, so the user may change the operation position by changing the display position of the cursor on the display screen of the second electronic device. The following describes how the user changes the display position of the cursor on the display screen of the second electronic device.
In this embodiment of the application, the user may adjust the display position of the cursor on the display screen of the second electronic device either by changing the posture of the first electronic device or by performing a sliding gesture on the display screen of the first electronic device.
First, adjusting the display position of the cursor on the display screen of the second electronic device by changing the posture of the first electronic device is described.
In this embodiment of the application, the first electronic device may acquire pose change information of the first electronic device, and cause the cursor to move in the screen projection content displayed by the second electronic device based on the pose change information.
In the embodiment of the application, a cursor 601 is displayed on the screen projection content of the second electronic device, and the cursor 601 can perform corresponding displacement along with the change of the posture of the first electronic device. Specifically, the user may change the posture of the first electronic device in the three-dimensional space by waving the first electronic device, and then the cursor 601 on the second electronic device may perform corresponding displacement on the screen projection content of the second electronic device according to the posture change of the first electronic device in the three-dimensional space.
Alternatively, referring to fig. 7a, fig. 7a is a schematic view of an actual scene of a screen projection, as shown in fig. 7a, a user may change the posture (horizontal direction angle and/or vertical direction angle) of the first electronic device by waving the first electronic device, and accordingly, the cursor 601 displayed on the second electronic device 200 may be displaced in the horizontal direction and/or vertical direction.
Next, how the cursor 601 displayed on the second electronic device 200 is displaced in the horizontal direction and the vertical direction respectively will be described.
1) The horizontal direction:
in this embodiment, the first electronic device may determine a horizontal displacement of the cursor according to the second rotation angle, and move the cursor in the screen-projected content displayed by the second electronic device according to the horizontal displacement.
Specifically, referring to fig. 7b, fig. 7b is a schematic diagram of the posture change of the first electronic device rotating in the horizontal direction. As shown in fig. 7b, the first electronic device rotates from posture 1 to posture 2 in the horizontal plane, and the angle change component of the rotation is θ1. In this embodiment of the application, the first electronic device 100 may monitor the posture change in the three-dimensional space and obtain a spatial orientation parameter; taking the posture change shown in fig. 7b as an example, the first electronic device 100 may obtain the spatial orientation parameter (the angle change component of the horizontal rotation) θ1.
In this embodiment, the first electronic device may obtain a mapping relationship between the angle change component of the horizontal rotation and the horizontal displacement L1 of the cursor 601 on the second electronic device, where the mapping relationship may indicate that the larger the angle change component of the horizontal rotation of the first electronic device is, the larger the horizontal displacement L1 of the cursor 601 on the second electronic device is. The first electronic device may determine the displacement L1 of the cursor 601 in the screen projection content of the second electronic device based on the acquired angle change component of the horizontal rotation and the mapping relationship.
For example, the mapping relationship between the angle change component of the horizontal rotation acquired by the first electronic device and the horizontal displacement of the cursor 601 in the second electronic device is as follows: every time the first electronic device rotates 1 ° on the horizontal plane, the horizontal pixel coordinate of the cursor 601 in the screen projection content of the second electronic device changes by 30 pixels.
For example, if the user waves the first electronic device such that the first electronic device rotates by 15 ° on a horizontal plane, the horizontal pixel coordinate of the cursor 601 in the screen projection content of the second electronic device changes by 450 pixels.
It should be noted that, in the process of establishing the screen-casting connection between the first electronic device 100 and the second electronic device 200, the first electronic device 100 and the second electronic device 200 may perform exchange and negotiation of performance parameters, that is, the first electronic device 100 may obtain size parameters of interface contents on the second electronic device 200. The first electronic device may adjust the mapping relationship based on the size parameter of the interface content on the second electronic device 200. For example, the larger the horizontal size of the interface content on the second electronic device 200, the larger the horizontal pixel coordinate change of the cursor 601 in the screen projection content of the second electronic device when the first electronic device rotates in the horizontal direction by the same angular change component.
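As an illustration only, the angle-to-pixel mapping described above can be sketched as follows; the 30 px/° figure is the example value from this embodiment, and the reference width is an assumption introduced here to show the size-based scaling:

```kotlin
// Illustrative only: maps a horizontal rotation of the first device to a
// horizontal cursor displacement on the second device. BASE_PIXELS_PER_DEGREE
// and REFERENCE_TARGET_WIDTH are assumed example values, not patent constants.
const val BASE_PIXELS_PER_DEGREE = 30.0
const val REFERENCE_TARGET_WIDTH = 1920.0

fun horizontalCursorDisplacement(angleDegrees: Double, targetWidthPx: Double): Double {
    // The mapping is scaled by the negotiated size of the second device's
    // interface content: larger screens get larger displacements per degree.
    val scale = targetWidthPx / REFERENCE_TARGET_WIDTH
    return angleDegrees * BASE_PIXELS_PER_DEGREE * scale
}

fun main() {
    // The 15-degree example from above: 15 * 30 = 450 pixels at the reference width.
    println(horizontalCursorDisplacement(15.0, 1920.0)) // 450.0
}
```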
It should be noted that the first electronic device may rotate left or right (i.e., rotate counterclockwise or clockwise from the perspective of the main interface of the first electronic device), and accordingly, the cursor 601 may shift left or right in the screen-projected content of the second electronic device. For example, the first electronic device may be rotated to the left and, correspondingly, the cursor 601 may be displaced to the left in the screen-projected content of the second electronic device. For another example, the first electronic device may be rotated to the right, and accordingly, the cursor 601 may be displaced to the right in the screen-projected content of the second electronic device.
Referring to fig. 7c, fig. 7c is a displacement diagram of the cursor displayed by the second electronic device. As shown in fig. 7c, when the angle change component of the horizontal rotation in the three-dimensional space acquired by the first electronic device is θ1, the first electronic device may determine, according to the above mapping relationship, that the horizontal pixel coordinate change of the cursor 601 in the screen projection content of the second electronic device is L1, and send information carrying the horizontal pixel coordinate change L1 to the second electronic device; accordingly, the second electronic device may change the display position of the cursor 601 based on the horizontal pixel coordinate change L1. Because the first electronic device rotates leftward (counterclockwise) when rotating from posture 1 to posture 2, the cursor 601 is also displaced leftward on the screen projection content of the second electronic device, which ensures consistency between the movement of the cursor 601 and the user's operation.
2) The vertical direction:
in this embodiment, the first electronic device may determine a vertical displacement of the cursor according to the second rotation angle, and move the cursor in the screen-projected content displayed by the second electronic device according to the vertical displacement.
Similarly, referring to fig. 7d, fig. 7d is a schematic view of an actual scene of a screen projection, as shown in fig. 7d, a user may change the posture (vertical direction angle) of the first electronic device by waving the first electronic device, and accordingly, the cursor 601 displayed on the second electronic device 200 may be displaced in the vertical direction.
The change of the vertical angle of the first electronic device in the three-dimensional space may refer to the angle change component of the first electronic device rotating in the vertical direction in the three-dimensional space.
Referring to fig. 7e, fig. 7e is a schematic diagram of the posture change of the first electronic device rotating in the vertical direction. As shown in fig. 7e, the first electronic device rotates from posture 1 to posture 2 in the vertical direction, and the angle change component of the rotation is θ2. In this embodiment of the application, the first electronic device 100 may monitor the posture change in the three-dimensional space and obtain a spatial orientation parameter; taking the posture change shown in fig. 7e as an example, the first electronic device 100 may obtain the spatial orientation parameter (the angle change component of the vertical rotation) θ2.
In this embodiment of the application, the first electronic device may obtain a mapping relationship between the angle change component of the vertical rotation and a vertical displacement L2 of the cursor 601 in the second electronic device, and determine a displacement L2 of the cursor 601 in the screen projection content of the second electronic device based on the obtained angle change component of the vertical rotation and the mapping relationship.
For how the first electronic device determines the displacement size L2 of the cursor 601 in the screen-projected content of the second electronic device, reference may be made to the description in the above embodiments, and details are not repeated here.
Referring to fig. 7f, fig. 7f is a displacement diagram of the cursor displayed by the second electronic device. As shown in fig. 7f, when the first electronic device acquires that the angle change component of the vertical rotation in the three-dimensional space is θ2, the first electronic device may determine, according to the mapping relationship, that the vertical pixel coordinate change of the cursor 601 in the screen projection content of the second electronic device is L2, and send information carrying the vertical pixel coordinate change L2 to the second electronic device; accordingly, the second electronic device may change the display position of the cursor 601 based on the vertical pixel coordinate change L2. Because the first electronic device rotates upward when rotating from posture 1 to posture 2, the cursor 601 is also displaced upward on the screen projection content of the second electronic device, which ensures consistency between the movement of the cursor 601 and the user's operation.
Optionally, in another embodiment, the first electronic device may acquire a translation of the first electronic device in a first horizontal direction, and determine the horizontal displacement of the cursor on the interface content based on the size of that translation. The first horizontal direction may be a horizontal direction parallel to the display screen of the second electronic device.
In this embodiment of the application, the first electronic device may obtain a mapping relationship between the displacement in the horizontal direction and the horizontal displacement of the cursor 601 in the second electronic device, and determine the horizontal displacement of the cursor 601 in the screen projection content of the second electronic device based on the obtained displacement in the horizontal direction and the mapping relationship.
Optionally, in another embodiment, the first electronic device may acquire a translation of the first electronic device in a first vertical direction, and determine the vertical displacement of the cursor on the interface content based on the size of that translation. The first vertical direction may be a vertical direction parallel to the display screen of the second electronic device.
In the embodiment of the application, the first electronic device may obtain a mapping relationship between the displacement in the vertical direction and the vertical displacement of the cursor 601 in the second electronic device, and determine the vertical displacement of the cursor 601 in the screen projection content of the second electronic device based on the obtained displacement in the vertical direction and the mapping relationship.
It should be noted that, in practical applications, the first electronic device may monitor the posture change and obtain the spatial orientation parameter through its built-in motion sensors (e.g., a gravitational acceleration sensor or a gyroscope), or through an infrared spatial detection technique or an acoustic detection technique. This embodiment does not specifically limit the means by which the first electronic device acquires the spatial orientation parameter.
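For reference, a minimal Android-style sketch of one of the options above, monitoring the posture change with the built-in gyroscope, might look like this; the class name and the rotation callback are illustrative, not part of the patent:

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

// Integrates gyroscope angular velocity into angle change components, i.e. the
// spatial orientation parameters (θ1 horizontal, θ2 vertical) described above.
class PoseTracker(
    context: Context,
    private val onRotation: (dxDeg: Double, dyDeg: Double) -> Unit
) : SensorEventListener {

    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private var lastTimestampNs = 0L

    fun start() {
        val gyro = sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE) ?: return
        sensorManager.registerListener(this, gyro, SensorManager.SENSOR_DELAY_GAME)
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        if (lastTimestampNs != 0L) {
            val dt = (event.timestamp - lastTimestampNs) * 1e-9 // seconds
            // values[0]/values[1] are angular velocities (rad/s) around the
            // device's x/y axes; integrating them over dt gives angle changes.
            val dxDeg = Math.toDegrees(event.values[1].toDouble()) * dt // horizontal swing
            val dyDeg = Math.toDegrees(event.values[0].toDouble()) * dt // vertical swing
            onRotation(dxDeg, dyDeg)
        }
        lastTimestampNs = event.timestamp
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) {}
}
```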
Second, the user may adjust the position of the cursor 601 on the screen projection content of the second electronic device by performing a sliding operation on the display screen of the first electronic device.
In this embodiment, the first electronic device may receive a second sliding operation on the display screen of the first electronic device, determine a displacement of the cursor according to the second sliding operation, and move the cursor in the screen-casting content displayed by the second electronic device based on the displacement.
It should be noted that the track of the second sliding operation passes through one or more objects in the interface content of the first electronic device, where the "one or more objects" may be all objects that pass through the track of the second sliding operation.
In the embodiment of the present application, a cursor 601 may be displayed on the screen projection content of the second electronic device, and the cursor 601 may be displaced correspondingly based on the user's sliding operation on the display screen (in the screen-off state) of the first electronic device. Specifically, the user may slide on the display screen with a finger or a stylus, and the cursor 601 on the second electronic device performs the corresponding displacement on the screen projection content based on that sliding operation.
Alternatively, referring to fig. 7g, fig. 7g is a schematic view of a real scene of a screen projection, as shown in fig. 7g, a user may slide on a display screen by a finger or a touch pen, and accordingly, a cursor 601 on the second electronic device may perform corresponding displacement on the screen projection content of the second electronic device based on the sliding operation of the user.
In the embodiment of the application, the first electronic device may acquire a mapping relationship between a sliding displacement L3 of the user on the display screen of the first electronic device and a displacement L4 of the cursor 601 on the second electronic device, and determine the displacement L4 of the cursor 601 in the screen projection content of the second electronic device based on the acquired sliding displacement L3 and the mapping relationship.
For example, the mapping relationship between the sliding displacement L3 on the display screen of the first electronic device and the displacement L4 of the cursor 601 on the second electronic device may be as follows: every time the user slides 1 pixel on the display screen of the first electronic device, the pixel coordinates of the cursor 601 in the screen projection content of the second electronic device change by 30 pixels. The above is merely an example, and the present application is not limited thereto.
It should be noted that, in the process of establishing the screen-casting connection between the first electronic device 100 and the second electronic device 200, the first electronic device 100 and the second electronic device 200 may perform exchange and negotiation of performance parameters, that is, the first electronic device 100 may obtain size parameters of interface contents on the second electronic device 200. The first electronic device may adjust the mapping relationship based on the size parameter of the interface content on the second electronic device 200. For example, the larger the horizontal size of the interface content on the second electronic device 200, the larger the pixel displacement (pixel coordinate change) of the cursor 601 in the screen projection content of the second electronic device in the case where the user slides the same displacement on the display screen of the first electronic device.
It should be noted that the sliding displacement of the user on the display screen of the first electronic device may include displacements in two directions (x direction and y direction) perpendicular to each other, and referring to fig. 7h, fig. 7h is a sliding operation schematic of the user, and accordingly, as shown in fig. 7i, the sliding displacement L3 of the user on the display screen of the first electronic device may include a displacement L5 in a first direction perpendicular to the central axis of the first electronic device and a displacement L6 in a second direction parallel to the central axis of the first electronic device. Accordingly, the pixel displacement of the cursor 601 in the screen projection content of the second electronic device includes a displacement component in the horizontal direction and a displacement component in the vertical direction. At this time, the first electronic device may determine the displacement magnitude of the cursor 601 in the horizontal direction and the displacement magnitude in the vertical direction in the screen-projected content of the second electronic device, respectively, based on the above-described mapping relationship.
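A hedged sketch of this decomposition, with the scale factor taken from the illustrative 30x example above; in practice the factor would be adjusted per the negotiated size of the second device's interface content:

```kotlin
// Illustrative decomposition of a slide on the first device's display into the
// two components described above (L5 across the central axis, L6 along it),
// each scaled to cursor pixels on the second device. The factor 30 is the
// example value only.
const val SLIDE_TO_CURSOR_SCALE = 30f

data class CursorDelta(val dx: Float, val dy: Float)

fun cursorDelta(downX: Float, downY: Float, nowX: Float, nowY: Float): CursorDelta {
    val dx = (nowX - downX) * SLIDE_TO_CURSOR_SCALE // horizontal component (from L5)
    val dy = (nowY - downY) * SLIDE_TO_CURSOR_SCALE // vertical component (from L6)
    return CursorDelta(dx, dy)
}
```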
Referring to fig. 7j, fig. 7j is a displacement diagram of a cursor displayed by a second electronic device, as shown in fig. 7j, in a case that a first electronic device acquires a sliding displacement L3 of a user on a display screen of the first electronic device, the first electronic device may determine, according to the mapping relationship, that a displacement of the cursor 601 in screen projection content of the second electronic device is L4, and send information carrying L4 to the second electronic device, and accordingly, the second electronic device may change a display position of the cursor 601 based on L4. At this time, since the user slides to the right obliquely upward on the display screen of the first electronic device, the cursor 601 is also obliquely upward to the right on the screen projection content of the second electronic device, and consistency between movement of the cursor 601 and user operation is ensured.
In this embodiment of the application, the operation position corresponding to the second sliding operation corresponds to a third object in the interface content of the first electronic device, and the first electronic device may shield the response of the third object to the second sliding operation.
In the embodiment of the present application, the user may implement the movement of the cursor 601 in the display screen of the second electronic device based on the operation (changing the posture or sliding on the display screen of the first electronic device) on the first electronic device in the above manner.
Next, how the user performs operations such as clicking and sliding on the object on which the cursor is located by operating the first electronic device will be described.
In this embodiment of the application, when the cursor is moved to a target object in the screen-projected content displayed by the second electronic device, if the first electronic device receives a first touch operation on the display screen of the first electronic device, the target object in the screen-projected content displayed by the second electronic device responds to the first touch operation.
Illustratively, the first touch operation may be a click operation or a slide operation.
In this embodiment, the operation position of the first touch operation corresponds to a second object in the interface content of the first electronic device, and the first electronic device may shield a response of the second object to the first touch operation.
Specifically, after receiving the first touch operation on the display screen of the first electronic device, the first electronic device may shield the response of the interface content at the operation position of the first touch operation, and determine the position of the cursor on the display screen of the second electronic device in order to determine the response position of the interface content to the first touch operation. For example, if the user performs a click operation on object A on the display screen of the first electronic device while the cursor is located on object B in the second electronic device, the first electronic device may mask the response of object A to the click operation and instead respond to the click operation on object B of the interface content.
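A minimal Android sketch of this redirection, assuming the cursor position has already been mapped back into the first device's interface coordinates; `rootView` and the 50 ms tap duration are illustrative assumptions:

```kotlin
import android.os.SystemClock
import android.view.MotionEvent
import android.view.View

// Hedged sketch: the physical touch is consumed elsewhere (shielded), and a
// synthesized down/up pair is dispatched at the position the cursor maps to,
// so the object under the cursor responds instead of the object under the finger.
fun redirectTapToCursor(rootView: View, cursorX: Float, cursorY: Float) {
    val now = SystemClock.uptimeMillis()
    val down = MotionEvent.obtain(now, now, MotionEvent.ACTION_DOWN, cursorX, cursorY, 0)
    val up = MotionEvent.obtain(now, now + 50, MotionEvent.ACTION_UP, cursorX, cursorY, 0)
    rootView.dispatchTouchEvent(down) // object B (under the cursor) responds
    rootView.dispatchTouchEvent(up)   // object A (under the finger) was shielded
    down.recycle()
    up.recycle()
}
```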
The first touch operation may be, but is not limited to, a click operation or a slide operation, and the first touch operation is described as an example of the click operation.
In the embodiment of the application, after the user moves the cursor 601 to the area where the target object desired to be operated is located on the screen-projecting content displayed by the second electronic device, the user may perform a click operation on the display screen of the first electronic device, and accordingly, the click operation may be performed in the area where the cursor 601 is located.
Specifically, in an embodiment, after the cursor 601 moves to the target area desired to be operated on the screen projection content of the second electronic device, the first electronic device may acquire a specific pixel coordinate position of the cursor 601 in the screen projection content of the second electronic device, and determine a pixel coordinate position corresponding to the specific pixel coordinate position in the interface content applied at the front end of the first electronic device according to the specific pixel coordinate position of the cursor 601 in the screen projection content of the second electronic device.
For example, if the user moves the cursor 601 in the second electronic device into the area of the video APP icon, at this time, the first electronic device may determine a pixel coordinate position (a pixel coordinate position of the area of the video APP icon) corresponding to the interface content of the cursor 601 in the first electronic device.
The user performs a click operation on the display screen of the first electronic device, at this time, the first electronic device may shield a response of the interface content applied at the front end of the first electronic device to an operation position of the click operation, and the first electronic device may perform an event corresponding to the click operation at a pixel coordinate position corresponding to the interface content of the cursor 601 in the first electronic device, which is equivalent to causing the first electronic device to perform the click operation at a pixel coordinate position corresponding to the cursor 601 in the first electronic device.
For example, if the user moves the cursor 601 in the second electronic device into the area of the video APP icon and performs a click operation on the display screen of the first electronic device, the first electronic device may perform the click operation on the video APP in the current interface content and send the opened interface (screen projection data) to the second electronic device, and the second electronic device may display the received screen projection data.
Next, the slide operation is described as an example of the first touch operation.
In this embodiment of the application, the first electronic device may generate a sliding event corresponding to the first sliding operation, and execute the sliding event on a target object in interface content of the first electronic device, so that the target object in the screen-casting content displayed by the second electronic device responds to the first sliding operation.
For example, if the user moves the cursor 601 in the second electronic device into the area of the video APP icon and performs a sliding operation on the display screen of the first electronic device, the first electronic device may perform a dragging operation on the video APP icon in the current interface content.
It should be noted that, in addition to the cursor 601, a menu bar may be displayed on the display screen of the second electronic device.
In the embodiment of the application, the first electronic device can obtain interface content of a front-end application of the first electronic device, and generate a cursor and a menu bar on the interface content to obtain screen projection content; wherein the menu bar does not belong to the interface content of the first electronic device. That is, the menu bar is not the interface content originally on the first electronic device, but is the newly added content.
Specifically, the first electronic device may add a menu bar to the interface content of the current front-end application based on the floating window interface to generate the screen projection content. The screen projection service of the first electronic device may then obtain the screen projection content and send it (after encoding and size conversion as needed) to the second electronic device based on a certain screen projection protocol, so that the display screen of the second electronic device displays the screen projection content.
Optionally, the first electronic device may also obtain the interface content of the current front-end application in another way. For example, the first electronic device may obtain the interface content of the current front-end application based on a screen recording interface provided by the system (e.g., the MediaProjection interface provided by Android), and draw a menu bar on the obtained interface content. The first electronic device may use the drawn content as the screen projection content, and the screen projection service of the first electronic device may obtain the screen projection content and send it (after an encoding operation and/or size conversion of the content) to the second electronic device based on a certain screen projection protocol, so that the display screen of the second electronic device displays the screen projection content.
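A minimal sketch of this capture path using Android's MediaProjection API; the projection object, the encoder's input surface, and the sizes are assumed to be obtained elsewhere, and error handling is omitted:

```kotlin
import android.hardware.display.DisplayManager
import android.media.projection.MediaProjection
import android.view.Surface

// Mirrors the current screen content (including any floating-window overlays
// such as the cursor and menu bar) into a virtual display whose frames flow
// straight to the video encoder for transmission to the second device.
fun startCapture(projection: MediaProjection, encoderSurface: Surface,
                 width: Int, height: Int, dpi: Int) {
    projection.createVirtualDisplay(
        "screen-cast-capture",                           // illustrative display name
        width, height, dpi,
        DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR, // mirror the current content
        encoderSurface,                                  // frames go to the encoder
        null, null
    )
}
```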
In this embodiment, the screen projection content of the second electronic device may further include a menu bar that does not belong to the interface content of the first electronic device, for example, the function control 800 shown in fig. 8a. As shown in fig. 8a, the user may move the cursor 601 on the second electronic device to the area where the function control 800 is located by changing the posture of the first electronic device 100, and perform a click operation 80; in response to the click operation 80, the first electronic device may perform a click operation on the target object where the cursor 601 is located and display a function selection area 801. Accordingly, referring to fig. 8b, which shows the screen projection content of a second electronic device provided in this embodiment of the application, the second electronic device displays the function selection area 801, where the function selection area 801 may include a sensitivity adjustment control, a mouse size adjustment control, a sliding mode control 802, a brightness adjustment control, and a normal mode control.
Illustratively, the user may switch between the gesture-based interaction mode shown in fig. 7a and 7d and the sliding interaction mode shown in fig. 7g by clicking the sliding mode control 802. As shown in fig. 8b, when the user clicks the sliding mode control 802, the mode of changing the position of the cursor 601 based on the posture of the first electronic device is switched to the sliding interaction mode shown in fig. 7g. It should be noted that, in an embodiment, after switching to the sliding interaction mode, the sliding mode control 802 in the function selection area 801 may be replaced by a gesture mode control; if the user clicks the gesture mode control, the sliding interaction mode shown in fig. 7g is switched back to the gesture-based interaction mode shown in fig. 7a and 7d.
It should be noted that, after the first electronic device and the second electronic device establish the screen-casting connection, the gesture-based interaction mode shown in fig. 7a and 7d may be used by default, or the sliding interaction mode shown in fig. 7g may also be used by default, which is not limited in the present application.
Referring to fig. 9a, fig. 9a is a schematic diagram of screen projection content of a second electronic device provided in this embodiment of the application. As shown in fig. 9a, the user may click a sensitivity adjustment control 901; accordingly, the second electronic device may display a sensitivity adjustment area 90. Referring to fig. 9b, fig. 9b is a schematic diagram of screen projection content of a second electronic device provided in this embodiment of the application. As shown in fig. 9b, the second electronic device displays the sensitivity adjustment area 90, which may include a sliding control; the user may adjust the sensitivity of the manipulation of the cursor 601 by dragging the sliding control. In addition, the sensitivity adjustment area 90 may further include a sensitivity value prompt. It should be noted that the above arrangement of the interface and controls is merely an illustration, and the application is not limited thereto.
Optionally, in the mode in which the position of the cursor 601 is adjusted based on the posture of the first electronic device, the first electronic device may acquire the angle change of rotation in the horizontal or vertical direction, and determine the displacement of the cursor 601 in the screen projection content of the second electronic device based on the mapping relationship between the horizontal angle change and the horizontal displacement of the cursor 601 on the second electronic device, and the mapping relationship between the vertical angle change and the vertical displacement of the cursor 601 on the second electronic device. Both mapping relationships can be adjusted through the sensitivity adjustment, so that the same posture change of the first electronic device produces a different displacement of the cursor 601 on the second electronic device.
For example, if the user feels that the movement of the cursor 601 is too slow when operating the cursor 601, the user may increase the sensitivity of the cursor 601 by dragging the sliding control shown in fig. 9b to the right, and conversely, if the user feels that the movement of the cursor 601 is too fast when operating the cursor 601, the user may decrease the sensitivity of the cursor 601 by dragging the sliding control shown in fig. 9b to the left.
Illustratively, at a sensitivity of 40, the horizontal pixel coordinate of the cursor 601 in the screen projection content of the second electronic device changes by 30 pixels for every 1° of rotation of the first electronic device in the horizontal plane, and the vertical pixel coordinate changes by 50 pixels for every 1° of rotation in the vertical plane. By dragging the sliding control shown in fig. 9b to the right, the user may adjust the sensitivity to, for example, 58; then the horizontal pixel coordinate of the cursor 601 changes by 45 pixels for every 1° of horizontal rotation, and the vertical pixel coordinate changes by 75 pixels for every 1° of vertical rotation. It should be noted that the above mapping relationships are only examples and do not constitute a limitation to the present application.
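One possible reading of this sensitivity setting is a simple linear scale, sketched below; the embodiment only requires that a higher sensitivity produce a larger displacement per degree, so the exact curve here is an assumption:

```kotlin
// Illustrative linear sensitivity mapping; baseSensitivity = 40 matches the
// default example above, but the actual curve is not fixed by the embodiment.
fun pixelsPerDegree(basePixels: Double, sensitivity: Double,
                    baseSensitivity: Double = 40.0): Double =
    basePixels * (sensitivity / baseSensitivity)

fun main() {
    println(pixelsPerDegree(30.0, 40.0)) // 30.0 px/deg horizontally at the default
    println(pixelsPerDegree(50.0, 40.0)) // 50.0 px/deg vertically at the default
    println(pixelsPerDegree(30.0, 60.0)) // 45.0 px/deg after raising the sensitivity
}
```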
Optionally, in an embodiment, after the first electronic device and the second electronic device establish the screen projection connection, the first electronic device may add image data corresponding to the function control 800 to the current interface content through the floating window interface, so that the function control 800 is superimposed on the current interface content of the first electronic device. At this time, the screen projection service of the first electronic device may obtain screen projection data whose interface content includes the function control 800, and send it to the second electronic device based on a certain screen projection protocol; accordingly, the interface content including the function control 800 may be displayed on the second electronic device. The first electronic device may obtain an operation of the user (sliding on, or changing the posture of, the first electronic device) to change the position of the cursor 601 in the current interface content. When the user controls the cursor 601 on the second electronic device to move within the range of the function control 800, the first electronic device may determine that the position of the cursor 601 in the current interface content is within the range of the function control 800. If the user then clicks the display screen of the first electronic device, the first electronic device may add image data corresponding to the function selection area 801 to the current interface content through the floating window interface, so that the function selection area 801 is superimposed on the current interface content of the first electronic device. The screen projection service of the first electronic device may then obtain screen projection data whose interface content includes the function selection area 801, and send it to the second electronic device based on a certain screen projection protocol; accordingly, the interface content including the function selection area 801 may be displayed on the second electronic device.
Taking the function selection area 801 shown in fig. 8b as an example, the function selection area 801 may include a sensitivity adjustment control, a mouse size adjustment control, a sliding mode control 802, a brightness adjustment control, and a normal mode control. The first electronic device may acquire the positions of these controls in the current interface content. If the position of the cursor 601 is within the area of one of the controls in the function selection area 801 and the first electronic device detects a click operation of the user on the display screen, the first electronic device may respond to the user's operation. For example, as shown in fig. 9a, if the user clicks the sensitivity control, then, in response to the click operation, the first electronic device may add image data corresponding to the sensitivity adjustment area 90 to the current interface content through the floating window interface, so that the sensitivity adjustment area 90 is superimposed on the current interface content of the first electronic device, and remove the original function selection area 801 (or superimpose the sensitivity adjustment area 90 on the function selection area 801). At this time, the screen projection service of the first electronic device may obtain screen projection data whose interface content includes the sensitivity adjustment area 90, and send the screen projection data to the second electronic device based on a certain screen projection protocol; accordingly, the interface content including the sensitivity adjustment area 90 shown in fig. 9b may be displayed on the second electronic device. The user may then perform a sliding operation on the display screen of the first electronic device to drag the sliding button in the sensitivity adjustment area 90; in response, the first electronic device adds image data corresponding to the dragged sliding button to the current interface content through the floating window interface, and at the same time modifies the relevant sensitivity parameter for operating the cursor 601 based on the size of the drag. The above description is merely an example and is not intended to limit the present application.
It should be noted that the controls included in the function selection area 801 are merely illustrative; in practical applications, a screen projection ending control may also be provided, and the user may move the cursor on the second electronic device to the screen projection ending control and click the display screen of the first electronic device to trigger the first electronic device to end the screen projection connection with the second electronic device.
Optionally, the user may also trigger the first electronic device to end the screen projection connection with the second electronic device by other means, such as pressing a power key, which is not limited here.
Optionally, referring to fig. 9c, fig. 9c is a schematic diagram of screen projection content of a second electronic device provided in this embodiment of the application. As shown in fig. 9c, the user may click the normal mode control 902; accordingly, the first electronic device may change the interaction mode with the user to the normal interaction mode. At this time, the display screen of the first electronic device lights up, and the user may operate normally on the display screen of the first electronic device.
Optionally, referring to fig. 9d, fig. 9d is a schematic diagram of a user operating a first electronic device according to an embodiment of the present application. As shown in fig. 9d, the user may press the power key of the first electronic device; accordingly, the first electronic device may change the interaction mode with the user to the normal interaction mode. At this time, the display screen of the first electronic device lights up, and the user may operate normally on the display screen of the first electronic device.
The above manner of switching the first electronic device to the normal interaction mode is merely an illustration, and does not limit the present application.
The above switching of the first electronic device to the normal interaction mode is described next with reference to a specific scenario.
Referring to fig. 10a, fig. 10a is a schematic diagram of screen projection content of a second electronic device provided in the embodiment of the present application. As shown in fig. 10a, when the first electronic device receives a chat message sent by another electronic device, a chat prompt window is displayed on the second electronic device (the data of the prompt window is sent by the first electronic device; for details, refer to the above embodiments, which are not repeated here).
As shown in fig. 10a, the user may move the cursor 601 into the area of the prompt window and perform a click operation on the first electronic device; accordingly, the second electronic device may display a chat interface as shown in fig. 10b (the process is implemented by the first electronic device detecting the user's click operation and executing the click event at the position of the cursor; for details, refer to the above embodiments, which are not repeated here). The chat interface includes an input keyboard. At this point, if sentence input were performed by adjusting the position of the cursor displayed on the second electronic device combined with click operations on the first electronic device, the user experience could be poor (since the display screen of the second electronic device may be large, this interactive manner is not well suited to using an input keyboard).
Referring to fig. 10c, fig. 10c is an interaction schematic of a first electronic device according to an embodiment of the present application. As shown in fig. 10c, the user may press the power key of the first electronic device; accordingly, the first electronic device may change the interaction mode with the user to the normal interaction mode, the display screen of the first electronic device lights up, and the user may operate normally on the display screen. Referring to fig. 10d, fig. 10d illustrates the interface content of a first electronic device according to an embodiment of the present application; as shown in fig. 10d, the user may operate directly on the input keyboard displayed on the first electronic device. Optionally, after the sentence input is completed, the user may press the power key of the first electronic device again to switch the first electronic device back to the screen-off state and the screen projection interaction mode.
It should be noted that the above is only a schematic description of one scenario; in practical applications, in any scenario that requires operating directly on the display screen of the first electronic device, the user can switch to the normal mode during screen projection in the above manner.
Optionally, after the screen projection connection is established between the first electronic device and the second electronic device, a call request (for example, a voice call request or a video call request) from another electronic device may be received. Taking a voice call request as an example, the interface content corresponding to the call request of the first electronic device may be displayed on the second electronic device, where the interface content includes a call receiving control and a call rejecting control.
Optionally, the user may move the cursor displayed on the second electronic device to the call receiving control by operating the first electronic device, and click on the display screen of the first electronic device, so that the user may then perform the call through the first electronic device.
Optionally, the user may switch the current interaction mode to the normal mobile phone interaction mode by pressing the power key of the first electronic device or in other ways, and then click the call receiving control directly on the display screen of the first electronic device, so that the user may perform the call through the first electronic device.
Next, it is described how, in combination with the above sensitivity adjustment, the orientation of the front end of the first electronic device can be kept consistent with the display position of the cursor on the second electronic device as much as possible when the user swings the first electronic device.
In this embodiment of the application, after the first electronic device establishes the screen projection connection with the second electronic device, the first electronic device may set the cursor 601 at a preset position in the interface content, for example, at the geometric center of the interface content. Referring to fig. 11a, fig. 11a is an interaction schematic of the first electronic device and the second electronic device. After the screen projection connection is established, the first electronic device may set the cursor 601 at the geometric center of the interface content, and the display content of the second electronic device may be as shown in fig. 11a. However, the front end of the first electronic device may not face the center of the second electronic device when the connection is established; in fig. 11a it faces to the left. If the user then swings the first electronic device to the right so that the display position of the cursor on the second electronic device is displaced to the right, the orientation of the first electronic device remains shifted to the left of the cursor whenever the user controls the cursor, which does not match the ideal operation mode (ideally, the orientation of the front end of the first electronic device should coincide with the display position of the cursor on the second electronic device as much as possible). To solve this technical problem, when the user swings the first electronic device to the right so that its front end faces near the center of the second electronic device, the display position of the cursor on the second electronic device reaches the right boundary, as shown in fig. 11b. The user may then continue to swing the first electronic device to the right; as shown in fig. 11c, the cursor stays at the right boundary on the second electronic device. After rotating the first electronic device to the right by a certain angle, the user may rotate it back to the left; as shown in fig. 11d, the cursor on the second electronic device is displaced to the left, and when the front end of the first electronic device faces the center of the second electronic device, the display position of the cursor is further to the left than before the adjustment. In this manner, the user can adjust the cursor to be near the center of the second electronic device while the front end of the first electronic device faces the center of the second electronic device; combined with the sensitivity adjustment, the orientation of the front end of the first electronic device can then be kept consistent with the display position of the cursor on the second electronic device as much as possible when the user swings the first electronic device.
In this embodiment, the first electronic device may obtain the size of the display area of the display screen of the second electronic device. The first electronic device may then determine, according to that size and the pixel coordinate position of the cursor in the screen projection content displayed on the second electronic device, whether the cursor has moved to a boundary of the display area of the second electronic device; when the first electronic device determines that the cursor has moved to a boundary of the display area, the display position of the cursor on the display screen of the second electronic device stays at that boundary. It should be understood that when the cursor moves to the left or right boundary of the display area, the cursor cannot move beyond that boundary but may still move up and down. Similarly, when the cursor moves to the upper or lower boundary, it cannot move beyond that boundary but may still move left and right. Likewise, when the cursor moves to a corner point (the upper left, lower left, upper right, or lower right corner) of the display area, it cannot move beyond the boundary but may still move in certain directions; for example, when the cursor moves to the upper left corner of the display area, it may still move rightward, downward, and down-rightward.
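The boundary behavior described above amounts to clamping each cursor coordinate independently, for example:

```kotlin
// Each coordinate is clamped independently, so a cursor pinned to the left or
// right boundary can still move vertically, and a cursor pinned to a corner
// can still move away from that corner.
fun clampCursor(x: Int, y: Int, displayWidth: Int, displayHeight: Int): Pair<Int, Int> =
    Pair(x.coerceIn(0, displayWidth - 1), y.coerceIn(0, displayHeight - 1))
```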
Optionally, in another embodiment, a quick adjustment mechanism for the cursor position may be set, for example, after the user touches the display screen of the first electronic device for more than a preset time, the first electronic device may initialize the position of the cursor in the interface content in response to the operation of the user.
Illustratively, referring to fig. 11e, fig. 11e is an interaction schematic of the first electronic device and the second electronic device. As shown in fig. 11e, after the screen projection connection is established, the front end of the first electronic device faces a position to the left of the geometric center of the second electronic device. The user then swings the first electronic device to the right so that its front end points near the center of the second electronic device; as shown in fig. 11f, the display position of the cursor is in the vicinity of the center of the second electronic device. At this time, the user may press the display screen of the first electronic device for 5 seconds or more; as illustrated in fig. 11g, the first electronic device may adjust the display position of the cursor to the geometric center of the interface content in response to the user's long press of 5 s on the display screen. In this manner, the user can adjust the cursor to be near the center of the second electronic device while the front end of the first electronic device faces the center of the second electronic device; combined with the sensitivity adjustment, the orientation of the front end of the first electronic device can be kept consistent with the display position of the cursor on the second electronic device as much as possible when the user waves the first electronic device.
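A hedged sketch of this quick re-centering mechanism follows; the 5-second threshold is the example value from this embodiment, and the detector class and callback are illustrative:

```kotlin
// A press held for at least RESET_HOLD_MS triggers a reset of the cursor to
// the geometric center of the interface content.
const val RESET_HOLD_MS = 5_000L

class CursorResetDetector(private val onReset: () -> Unit) {
    private var downAtMs = -1L

    fun onTouchDown(nowMs: Long) { downAtMs = nowMs }

    fun onTouchUp(nowMs: Long) {
        if (downAtMs >= 0 && nowMs - downAtMs >= RESET_HOLD_MS) onReset()
        downAtMs = -1L
    }
}

// Hypothetical usage: move the cursor back to the center of the interface content.
// val detector = CursorResetDetector { moveCursorTo(contentWidth / 2, contentHeight / 2) }
```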
The above method for adjusting the cursor position is only an example, and does not limit the present application.
Optionally, in an embodiment, if the first electronic device is not operated by the user within a certain time after the screen projection connection is established, the cursor displayed on the second electronic device may be hidden, and the user may reactivate the cursor display on the display screen of the second electronic device through a touch operation and/or by changing the posture of the first electronic device.
In this embodiment of the application, the first electronic device may further receive a first operation on a display screen of the first electronic device, where the first operation is a preset shortcut operation, and the first electronic device may respond to the first operation.
In this embodiment of the application, the first electronic device may detect a first operation on its display screen and recognize that the first operation is a preset shortcut operation. At this time, the first electronic device may generate a first event corresponding to the shortcut operation, and may also generate another event (referred to as a second event, to distinguish it from the first event) according to the position of the cursor and the first operation. The first electronic device may determine the priorities of the first event and the second event, and decide whether to execute the first event or the second event based on those priorities.
Optionally, in this embodiment of the application, if the first electronic device recognizes that the first operation is a preset shortcut operation, the first electronic device may generate the first event corresponding to the shortcut operation, and may also generate the second event according to the position of the cursor and the first operation; because the execution priority of the first event is higher than that of the second event, the first electronic device executes the first event. In this case, the operation position of the first operation corresponds to a fifth object in the interface content displayed by the second electronic device, and the fifth object does not respond to the first operation because the execution priority of the second event is lower than that of the first event.
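The priority rule described above can be sketched as follows; the event types and priority values are illustrative assumptions, not terms from the claims:

```kotlin
// When one touch yields both a shortcut event (first event) and a
// cursor-position event (second event), only the higher-priority event is
// executed; the other is masked, so the object under the touch position
// never sees the lower-priority event.
sealed interface CastEvent { val priority: Int }
data class ShortcutEvent(val name: String, override val priority: Int = 1) : CastEvent
data class CursorEvent(val x: Int, val y: Int, override val priority: Int = 0) : CastEvent

fun dispatch(candidates: List<CastEvent>, execute: (CastEvent) -> Unit) {
    candidates.maxByOrNull { it.priority }?.let(execute)
}
```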
Next, the screen projection method provided by the present application is further described in conjunction with two application scenarios.
Video playing
Referring to fig. 12a, fig. 12a is an interface schematic diagram of a second electronic device according to an embodiment of the present application, as shown in fig. 12a, a user may click a video APP application by operating a first electronic device, and after the user clicks a video that the user wants to play, the second electronic device may display a video playing interface as shown in fig. 12 b.
Referring to fig. 12c, fig. 12c is an interface schematic diagram of a second electronic device provided in an embodiment of the present application. As shown in fig. 12c, the user may click a full-screen control in the interface to play the video in full screen; in another embodiment, as shown in fig. 12d, the user may rotate the first electronic device (from portrait to landscape) to play the video in full screen. The full-screen playing interface of the video may refer to fig. 12e; the full-screen playing interface may include, but is not limited to, a video image, a pause/play control, a next-episode control, and the like.
In this embodiment of the application, a user may click on the video playing area in fig. 12e, and accordingly, referring to fig. 12f, the screen projection content of the second electronic device may only include a video image, or, if the user does not operate the first electronic device for a long time, the screen projection content of the second electronic device may only include a video image (see fig. 12 f).
Referring to fig. 12g, fig. 12g is an operation diagram of a first electronic device according to an embodiment of the present application. As shown in fig. 12g, when the second electronic device displays the video playing interface shown in fig. 12e and the user performs a horizontal sliding operation on the display screen of the first electronic device, the video displayed by the second electronic device may fast-forward or rewind. When the user slides to the right on the display screen of the first electronic device, as shown in fig. 12h and 12i, the screen projection content of the second electronic device includes a fast-forward preview image and a fast-forward progress bar. After the user determines the target position of the fast-forward based on the preview image displayed on the second electronic device, the user may end the sliding operation on the display screen of the first electronic device, and the video displayed by the second electronic device fast-forwards to the target position determined by the user (10:03), as shown in fig. 12j and 12k.
Referring to fig. 12l, fig. 12l is an operation schematic diagram of a first electronic device according to an embodiment of the present application. As shown in fig. 12l, the user may perform a vertical sliding operation in the right area of the display screen of the first electronic device to adjust the volume of the currently playing video; alternatively, as shown in fig. 12m, the user may adjust the volume through a physical volume key 1201 of the first electronic device. Fig. 12n shows the corresponding volume adjustment interface of the second electronic device.
Referring to fig. 12o, fig. 12o is an operation schematic diagram of a first electronic device according to an embodiment of the present application. As shown in fig. 12o, the user may perform a vertical sliding operation in the left area of the display screen of the first electronic device to adjust the display brightness of the currently playing video. Fig. 12p shows the corresponding brightness adjustment interface of the second electronic device.
Referring to fig. 12q, fig. 12q is an operation schematic diagram of a first electronic device according to an embodiment of the present application. As shown in fig. 12q, the user may slide from the right boundary area of the display screen of the first electronic device toward the screen center; the second electronic device then displays the interface content of the previous level (that is, the first electronic device performs a return operation and returns to the previous interface content). As shown in fig. 12r, the screen projection content of the second electronic device is now the previous-level interface of the full-screen playing interface, i.e., the video playing interface.
Referring to fig. 12s, fig. 12s is an operation schematic diagram of a first electronic device according to an embodiment of the present application. As shown in fig. 12s, the user may slide from the lower boundary area of the display screen of the first electronic device toward the screen center; the second electronic device then displays the main interface (that is, the first electronic device performs an operation of returning to the main interface). As shown in fig. 12t, the screen projection content of the second electronic device is now the main interface of the first electronic device.
In this embodiment of the application, the first electronic device may further receive a first operation on a display screen of the first electronic device, where the first operation is a preset shortcut operation, and the first electronic device may respond to the first operation.
In this embodiment of the application, the first electronic device may detect a first operation on its display screen and recognize that the first operation is a preset shortcut operation. The first electronic device may then generate a first event corresponding to the shortcut operation, and may also generate another event (referred to as a second event, to distinguish it from the first event) according to the position of the cursor and the first operation. The first electronic device may determine the priorities of the first event and the second event, and determine which of the two to execute based on those priorities.
Optionally, in this embodiment of the application, if the first electronic device recognizes that the first operation is a preset shortcut operation, the first electronic device may generate a first event corresponding to the shortcut operation and may also generate a second event according to the position of the cursor and the first operation. The execution priority of the first event is higher than that of the second event, so the first electronic device executes the first event. In this case, although the operation position of the first operation corresponds to a fifth object in the interface content of the second electronic device, the fifth object does not respond to the first operation, because the execution priority of the second event is lower than that of the first event.
For example, referring to fig. 12u, fig. 12u is an operation schematic diagram of a first electronic device provided in an embodiment of the present application. In addition to the click confirmation operation, the first electronic device may support the common full-screen gesture operations (shortcut operations), consistent with the original way of using the first electronic device, which reduces the user's learning cost. As shown in fig. 12u, sliding from the left or right edge to the middle represents the return key; sliding from the lower edge to the middle represents the home key; sliding from the lower edge to the middle and staying represents the menu key; and up, down, left, and right sliding operations can be used to implement fast forward and rewind in a video scene.
Next, the principle of implementing the above full-screen gesture operations is described:
1. Return key
In this embodiment, sliding from the left edge or the right edge to the middle represents the return key. Specifically, when the first electronic device is in the screen-off state, if a sliding operation from the left or right edge to the middle is detected, the operation is not executed at the current cursor position; instead, a return event is directly injected into the current front-end application. In response to the return event, the interface content of the system returns to the previous level, which is equivalent to the return operation performed by the mobile phone when the user slides from the left or right edge to the middle in the normal mode.
2. Home key
In this embodiment of the application, sliding from the lower edge to the middle represents the home key. Specifically, when the first electronic device is in the screen-off state, if a sliding operation from the lower edge to the middle is detected, the operation is not executed at the current cursor position; instead, an event of returning to the main interface is directly injected into the current front-end application. In response to this event, the interface content of the system returns to the main interface, which is equivalent to the user clicking the home key in the normal mode.
3. Menu key
In this embodiment, sliding from the lower edge to the middle and staying represents the menu key. Specifically, when the first electronic device is in the screen-off state, if a slide-and-stay operation from the lower edge to the middle is detected, the operation is not executed at the current cursor position; instead, an event for displaying a menu is directly injected into the current front-end application. In response to this event, the interface content of the system displays a pull-up menu, which is equivalent to the user sliding from the lower edge to the middle and staying in the normal mode, upon which the mobile phone displays the pull-up menu (the menu may include a list of currently running applications, or a history list of running applications, which is not limited in this application).
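As an illustrative sketch only (the class name, thresholds, and the use of Android's Instrumentation for key injection are assumptions of this illustration, not part of the method described above), the three gestures could be classified and mapped to injected key events roughly as follows:

```java
import android.app.Instrumentation;
import android.view.KeyEvent;
import android.view.MotionEvent;

// Minimal sketch: classify edge swipes and inject the matching key event
// instead of dispatching the touch at the cursor position. Detection of the
// "toward the middle" direction is omitted for brevity.
public class ShortcutGestureHandler {
    private static final float EDGE_PX = 48f;  // width of the edge band (assumed value)
    private static final long HOLD_MS = 500L;  // "slide and stay" threshold (assumed value)
    private final Instrumentation inst = new Instrumentation();

    /** Returns true if the gesture was consumed as a shortcut operation. */
    public boolean onGesture(MotionEvent down, MotionEvent up, int screenW, int screenH) {
        float x0 = down.getX(), y0 = down.getY();
        long duration = up.getEventTime() - down.getDownTime();

        if (x0 < EDGE_PX || x0 > screenW - EDGE_PX) {
            injectKey(KeyEvent.KEYCODE_BACK);          // left/right edge -> return key
            return true;
        }
        if (y0 > screenH - EDGE_PX) {
            injectKey(duration < HOLD_MS
                    ? KeyEvent.KEYCODE_HOME            // bottom edge swipe -> home key
                    : KeyEvent.KEYCODE_APP_SWITCH);    // swipe and stay -> menu key
            return true;
        }
        return false;  // an ordinary touch is handled at the cursor position instead
    }

    private void injectKey(final int keyCode) {
        // sendKeyDownUpSync blocks, must not run on the main thread, and needs
        // event-injection privileges (e.g. an instrumentation/system context).
        new Thread(() -> inst.sendKeyDownUpSync(keyCode)).start();
    }
}
```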
In this embodiment of the application, the first electronic device may receive a second touch operation on the display screen of the first electronic device, determine a corresponding third event according to the operation form of the second touch operation (different operation forms correspond to different third events), and execute the third event on the front-end application of the first electronic device, where the operation form of the second touch operation includes at least one of the following: contacting a first preset area of the display screen of the first electronic device and sliding from the first preset area in a first preset direction; or contacting a second preset area of the display screen of the first electronic device, sliding from the second preset area in a second preset direction, and contacting the display screen of the first electronic device for longer than a preset time.
Specifically, after acquiring a first operation input through the touch screen, the enhanced interaction service may identify whether the first operation is a preset shortcut operation, and, based on the first operation conforming to the preset shortcut operation, generate the event corresponding to that shortcut operation. The enhanced interaction service may inject the event directly into the front-end application instead of executing it at the position of the cursor (that is, the first electronic device may respond to the first operation directly). For details, refer to fig. 12u and the description of the corresponding embodiments, which are not repeated here.
Optionally, in another implementation, the display screen may be divided into two regions (for example, an upper half region and a lower half region). If the user touches the upper half region of the display screen, the display position of the cursor on the interface content may be controlled; if the user touches the lower half region, an operation manner similar to that in fig. 12u may be performed.
In addition, it should be noted that, in the above video playing scenario, after the user selects the video to be played, the second electronic device may directly play the video in full screen.
Optionally, the first electronic device 100 may obtain the video 400 corresponding to the current video playing interface and process the video 400 into a video stream. In some possible embodiments, the first electronic device 100 acquires the image data of the video and compresses the image corresponding to each image frame, so that the compressed image size matches the image size on the second electronic device. For example, with the image size measured in pixels, if the image size on the second electronic device is 400 × 800 pixels and the image corresponding to each image frame is 800 × 800 pixels, the first electronic device compresses each 800 × 800 image to 400 × 800 pixels to obtain the compressed image corresponding to each image frame. The first electronic device may then perform video compression coding on the compressed images corresponding to the plurality of image frames to obtain a video stream. The plurality of image frames may be the image frames of the video's image data at a plurality of consecutive time nodes, with one image frame at each time node. In other possible embodiments, after obtaining the image data of the video, the first electronic device may directly perform video compression encoding on the image frames at the plurality of consecutive time nodes to obtain the video stream.
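A minimal sketch of this frame-size normalization step, using the 800 × 800 → 400 × 800 example above (the class name is illustrative, and a production pipeline would more likely scale on the GPU than via Bitmap):

```java
import android.graphics.Bitmap;

public final class FrameScaler {
    private FrameScaler() {}

    /** Scale one captured frame to the image size expected on the second device. */
    public static Bitmap toSinkSize(Bitmap frame, int sinkW, int sinkH) {
        if (frame.getWidth() == sinkW && frame.getHeight() == sinkH) {
            return frame;  // already matches the receiving device's image size
        }
        // filter=true enables bilinear filtering for a smoother downscale
        return Bitmap.createScaledBitmap(frame, sinkW, sinkH, true);
    }
}

// e.g. Bitmap out = FrameScaler.toSinkSize(in, 400, 800); // 800x800 -> 400x800
```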
In some possible embodiments, the first electronic device may obtain the audio data of the video over the period of time determined by the plurality of consecutive time nodes, and perform audio compression encoding, such as Advanced Audio Coding (AAC), on the audio data of that period. The first electronic device then mixes the video-compression-encoded image frames and the audio-compression-encoded audio data of that period into a video stream. The data format of the video stream is any data format that the second electronic device can receive, such as a video stream in MP4 (MPEG-4 Part 14) format. The images and audio corresponding to the video stream are presented synchronously on the second electronic device.
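A configuration sketch of such an encode-and-mux pipeline on Android (H.264 video plus AAC audio muxed into MP4). The bit rates, frame rate, and sample rate are illustrative assumptions, and the encoding loop itself is elided:

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.media.MediaMuxer;
import java.io.IOException;

public class StreamEncoderConfig {
    public static void configure(int width, int height, String outPath) throws IOException {
        MediaFormat video = MediaFormat.createVideoFormat(
                MediaFormat.MIMETYPE_VIDEO_AVC, width, height);     // H.264
        video.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        video.setInteger(MediaFormat.KEY_BIT_RATE, 4_000_000);
        video.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        video.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

        MediaFormat audio = MediaFormat.createAudioFormat(
                MediaFormat.MIMETYPE_AUDIO_AAC, 44100, 2);          // AAC, stereo
        audio.setInteger(MediaFormat.KEY_AAC_PROFILE,
                MediaCodecInfo.CodecProfileLevel.AACObjectLC);
        audio.setInteger(MediaFormat.KEY_BIT_RATE, 128_000);

        MediaCodec videoEncoder =
                MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
        MediaCodec audioEncoder =
                MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_AUDIO_AAC);
        videoEncoder.configure(video, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        audioEncoder.configure(audio, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);

        // Both elementary streams are written into one MP4 container so that
        // image and audio stay synchronized on the receiving device.
        MediaMuxer muxer = new MediaMuxer(outPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
        // ... videoEncoder.createInputSurface(), start both codecs, feed frames/PCM,
        // muxer.addTrack() on format change, muxer.writeSampleData(), then release.
    }
}
```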
The first electronic device may transmit the video stream to the second electronic device. Correspondingly, after receiving the video stream, the second electronic device processes it into image and audio output. In some possible embodiments, the first electronic device may output the video stream as the screen projection data to the second electronic device through the projection port using a video streaming media protocol (such as the Real Time Streaming Protocol (RTSP)). The second electronic device performs streaming protocol reception and video/audio decoding on the screen projection data (namely, the video stream), and then renders and outputs it; at this time, the second electronic device displays the image corresponding to the screen projection data in full screen and/or plays the corresponding audio. For example, the first electronic device 100 has a screen width of S-W1 and a screen height of S-H1, and the second electronic device 200 has a screen width of S-W2 and a screen height of S-H2. When the second electronic device 200 displays the image corresponding to the screen projection data, the aspect ratio of the image may be adjusted to the screen aspect ratio S-W2:S-H2 of the second electronic device 200 before display. Assuming that the video 400 is played on the first electronic device 100 in full screen without black or white bars, and the aspect ratio S-W1:S-H1 of the first electronic device 100 is the same as the aspect ratio S-W2:S-H2 of the second electronic device 200, the video playing picture displayed by the second electronic device 200 also has no black or white bars, and the two devices display the same image and play the same audio.
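The receiver-side size adaptation reduces to a small aspect-ratio fit computation. A sketch (class and method names assumed) that centers the scaled image, with any remainder becoming the black or white bars mentioned above:

```java
import android.graphics.Rect;

public final class FitCalculator {
    private FitCalculator() {}

    /** Largest centered rectangle of aspect srcW:srcH inside a screenW x screenH screen. */
    public static Rect fitInside(int srcW, int srcH, int screenW, int screenH) {
        float scale = Math.min(screenW / (float) srcW, screenH / (float) srcH);
        int outW = Math.round(srcW * scale);
        int outH = Math.round(srcH * scale);
        int left = (screenW - outW) / 2;
        int top = (screenH - outH) / 2;
        return new Rect(left, top, left + outW, top + outH);
    }
}

// If S-W1:S-H1 equals S-W2:S-H2, fitInside() fills the whole screen: no bars.
```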
In some possible implementations, if the user moves the cursor on the second electronic device to the pause control and performs a click operation on the display screen of the first electronic device, the first electronic device pauses the playing of the video 400 in response. Meanwhile, the first electronic device 100 may pause the transmission of the video stream of the video 400, and the second electronic device 200 also pauses playing because no video stream arrives. If the user then moves the cursor to the pause/play control and performs a click operation on the display screen of the first electronic device, the first electronic device resumes playing from the current progress of the video 400 and continues transmitting the video stream, so the second electronic device 200 receives the stream and also resumes playing. If the video stream is transmitted in fragments, the first electronic device 100 transmits only a fixed period of video per transmission. For example, if the playing time of the video 400 is 25 minutes and 34 seconds and the first electronic device 100 transmits a 10-second video stream each time, the video 400 needs 154 transmissions to be fully transmitted to the second electronic device 200.
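The fragment arithmetic in the example above, checked as code (154 = ceil(1534 s / 10 s); class and method names are illustrative):

```java
public final class SegmentMath {
    private SegmentMath() {}

    /** Number of fixed-length fragments needed to cover the whole video. */
    public static int segmentCount(int totalSeconds, int segmentSeconds) {
        return (totalSeconds + segmentSeconds - 1) / segmentSeconds;  // integer ceiling
    }

    public static void main(String[] args) {
        System.out.println(segmentCount(25 * 60 + 34, 10));  // prints 154
    }
}
```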
The foregoing describes an embodiment of the screen projection method provided by the present application in a video playing scene; next, an embodiment of the screen projection method provided by the present application in another application scene is described.
Demonstration scene
Fig. 13 is a schematic view of a screen projection scene provided in an embodiment of the present application. As shown in fig. 13, assume that user A, user B, user C, user D, user E, user F, user G, user H, and user I are holding a conference in an enterprise conference room; user A sends screen projection data containing the conference information to the second electronic device 200 through the first electronic device, and the second electronic device 200 displays the screen projection data. Generally, the device screen of the first electronic device 100 is small, so it is difficult for user A to share the conference information with the other users through the first electronic device 100 alone, whereas the second electronic device 200 (e.g., a television) can display the screen projection data on a large screen, which is convenient for the other users to view.
Users B, C, D, and E, who are close to the second electronic device 200, can view the conference information shared by user A simply by looking at the display content of the second electronic device 200, without separately carrying and using additional electronic devices. However, due to the screen size limitation of the second electronic device 200 and visual problems such as myopia that some users may have, users such as user F, user G, user H, and user I may not be able to see the display content of the second electronic device 200 clearly, which affects the normal progress of the conference.
Therefore, the first electronic device 100 may establish screen projection connections with a plurality of second electronic devices 200 at the same time. Based on the technical solution of the present application, user F, user G, user H, user I, and the like may each use their own second electronic device 200 to receive and display the screen projection data sent by the first electronic device 100, which is convenient for the corresponding users to view. The processing procedure in the present application is the same for each of the plurality of second electronic devices 200.
Referring to fig. 14a, fig. 14a is a schematic diagram of screen projection content of a second electronic device provided by an embodiment of the present application. As shown in fig. 14a, the user may click a PPT application by operating the first electronic device, and after the user clicks the PPT to be presented, the second electronic device may display a PPT presentation interface as shown in fig. 14b.
As shown in fig. 14b, the PPT presentation interface may include a presentation control 1403, a current slide display area 1404, and a slide list area 1405. The current slide display area 1404 shows the slide currently to be presented, and the slide list area 1405 may include a list of slides from which the user can select the slide to present. For example, the user may select slide C by operating the first electronic device, and accordingly the slide displayed in the slide display area 1404 is the selected slide C.
As shown in fig. 14b, the user can click the presentation control 1403 in the PPT presentation interface to start presenting the slides, as shown in fig. 14c. Referring to fig. 14d, fig. 14d is an operation schematic diagram of a first electronic device according to an embodiment of the present application; as shown in fig. 14d, the user may perform a leftward sliding operation on the display screen of the first electronic device, and the second electronic device then displays the presentation interface shown in fig. 14e (in which the presented slide has been switched to the next one). Referring to fig. 14f, fig. 14f is an operation schematic diagram of a first electronic device provided by an embodiment of the present application; as shown in fig. 14f, the user may trigger a full-screen presentation of the slides by rotating the first electronic device (from a vertical screen to a horizontal screen), and the full-screen presentation interface is shown in fig. 14g.
Next, a specific flow of an information processing method provided in an embodiment of the present application is described. Referring to fig. 15, fig. 15 is a flowchart illustrating the information processing method provided in an embodiment of the present application; as shown in fig. 15, the information processing method provided in the present application includes:
1501. The first electronic equipment monitors the screen projection connection state.
In the embodiment of the application, the first electronic device may be equipped with the enhanced interaction service, and the enhanced interaction service may monitor the screen projection connection state of the first electronic device.
1502. The first electronic device detects that a screen projection connection is established with the second electronic device.
In this embodiment of the application, the enhanced interaction service may detect that the first electronic device and the second electronic device establish the screen-projecting connection, and the manner of establishing the screen-projecting connection between the first electronic device and the second electronic device may refer to the description of the embodiments corresponding to fig. 4a to 5c, which is not described herein again.
1503. The first electronic device generates screen projection content.
In the embodiment of the application, the enhanced interaction service of the first electronic device may obtain interface content of a front-end application of the first electronic device, and generate the cursor on the interface content to obtain the screen-projecting content.
In this embodiment, the enhanced interaction service of the first electronic device may further generate the cursor and the menu bar on the interface content to obtain the screen projection content.
Specifically, referring to fig. 16, fig. 16 is an architecture schematic diagram of an embodiment of the present application. As shown in fig. 16, the enhanced interaction service may add a cursor and a menu bar to the interface content of the current front-end application of the first electronic device based on a floating window interface to generate screen projection data. The screen projection service of the first electronic device may then obtain the screen projection data and send it (after encoding and size conversion of the content) to the second electronic device based on a certain screen projection protocol, so that the display screen of the second electronic device displays the corresponding screen projection content.
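A sketch of such a floating-window cursor, assuming Android's overlay window type (API 26+) and the SYSTEM_ALERT_WINDOW permission; the class and method names are illustrative:

```java
import android.content.Context;
import android.graphics.PixelFormat;
import android.view.Gravity;
import android.view.View;
import android.view.WindowManager;

public class CursorOverlay {
    private final WindowManager wm;
    private final View cursorView;  // e.g. an ImageView holding a cursor drawable
    private final WindowManager.LayoutParams lp;

    public CursorOverlay(Context context, View cursorView) {
        this.wm = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
        this.cursorView = cursorView;
        // Non-focusable and non-touchable: the cursor is composited above the
        // front-end application's interface content without stealing its input.
        lp = new WindowManager.LayoutParams(
                WindowManager.LayoutParams.WRAP_CONTENT,
                WindowManager.LayoutParams.WRAP_CONTENT,
                WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY,
                WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE
                        | WindowManager.LayoutParams.FLAG_NOT_TOUCHABLE,
                PixelFormat.TRANSLUCENT);
        lp.gravity = Gravity.TOP | Gravity.START;
    }

    public void show(int x, int y)   { lp.x = x; lp.y = y; wm.addView(cursorView, lp); }
    public void moveTo(int x, int y) { lp.x = x; lp.y = y; wm.updateViewLayout(cursorView, lp); }
    public void hide()               { wm.removeView(cursorView); }
}
```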
Optionally, the first electronic device may obtain posture change information of the first electronic device input by the sensor and move the position of the cursor in the interface content based on the posture change information. For how the enhanced interaction service moves the cursor on the display screen of the second electronic device based on the posture change information of the first electronic device, refer to the description in the foregoing embodiments, which is not repeated here.
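A sketch of one possible pose-to-cursor mapping, integrating gyroscope angular velocity into cursor displacement; the gain constant and the axis/sign conventions are assumptions for illustration:

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class PoseCursorController implements SensorEventListener {
    /** Receiver for cursor displacement, e.g. the overlay sketch shown earlier. */
    public interface CursorSink { void moveBy(int dx, int dy); }

    private static final float GAIN = 600f;  // pixels per radian, assumed tuning value
    private final CursorSink sink;
    private long lastTimestampNs;

    public PoseCursorController(SensorManager sm, CursorSink sink) {
        this.sink = sink;
        sm.registerListener(this, sm.getDefaultSensor(Sensor.TYPE_GYROSCOPE),
                SensorManager.SENSOR_DELAY_GAME);
    }

    @Override public void onSensorChanged(SensorEvent e) {
        if (lastTimestampNs != 0) {
            float dt = (e.timestamp - lastTimestampNs) * 1e-9f;  // ns -> s
            // Integrate angular velocity (rad/s) over dt. Rotation about the
            // z axis is mapped to horizontal cursor motion and rotation about
            // the x axis to vertical motion; the signs are a design choice.
            int dx = Math.round(-e.values[2] * dt * GAIN);
            int dy = Math.round(-e.values[0] * dt * GAIN);
            if (dx != 0 || dy != 0) sink.moveBy(dx, dy);
        }
        lastTimestampNs = e.timestamp;
    }

    @Override public void onAccuracyChanged(Sensor s, int accuracy) { }
}
```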
Optionally, the first electronic device may receive a second sliding operation on the display screen of the first electronic device, determine the displacement of the cursor according to the second sliding operation, and move the cursor on the screen projection content based on that displacement. For how the enhanced interaction service moves the cursor on the display screen of the second electronic device according to the second sliding operation, refer to the description in the foregoing embodiments, which is not repeated here.
Optionally, the first electronic device may acquire a first touch operation input through the touch screen, generate a corresponding first event according to the first touch operation, and execute the first event at the target object of the interface content (the object on which the cursor is located at this time). For example, the first touch operation may be a click operation or a slide operation, and accordingly the first event may be a click event or a slide event.
Optionally, the first electronic device may shield the response of the interface content at the operation position of the first touch operation. Specifically, after receiving a first touch operation on the display screen of the first electronic device, the first electronic device may shield the response of the current front-end application's interface content at the operation position of the first touch operation, and instead use the position of the cursor on the second display screen as the response position of the interface content to the first touch operation. For example, if the user performs a click operation on object A on the display screen of the first electronic device while the cursor is located on object B on the second electronic device, the first electronic device may shield object A's response to the click operation and respond to the click operation on object B of the interface content.
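A sketch of this redirection: the original tap (on object A) is consumed, and an equivalent down/up pair is synthesized at the cursor position (object B). The dispatch target is left as an interface because the actual injection path is system-internal; all names here are illustrative:

```java
import android.os.SystemClock;
import android.view.MotionEvent;

public class TouchRedirector {
    /** Wherever synthesized events are handed to, e.g. the front-end app's window. */
    public interface EventSink { void dispatch(MotionEvent event); }

    private final EventSink sink;
    public TouchRedirector(EventSink sink) { this.sink = sink; }

    /** Consume the original tap and re-issue it at the cursor position instead. */
    public void redirectTap(MotionEvent original, float cursorX, float cursorY) {
        // The original event is deliberately not dispatched: this "shields"
        // the object under the finger (object A) from responding.
        long t = SystemClock.uptimeMillis();
        MotionEvent down = MotionEvent.obtain(t, t,
                MotionEvent.ACTION_DOWN, cursorX, cursorY, 0);
        MotionEvent up = MotionEvent.obtain(t, t + 50,
                MotionEvent.ACTION_UP, cursorX, cursorY, 0);
        sink.dispatch(down);  // object B under the cursor receives the tap
        sink.dispatch(up);
        down.recycle();
        up.recycle();
    }
}
```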
Optionally, the first electronic device may receive a pressing operation on a physical key of the first electronic device, generate a corresponding second event according to the pressing operation, and execute the second event on the front-end application of the first electronic device. For example, the first electronic device may receive a pressing operation on a volume key (for example, volume down), generate a corresponding volume-down event, and execute that event on the front-end application, so that the playback volume of the front-end application is reduced.
Optionally, the first electronic device may receive a second touch operation on the display screen of the first electronic device, determine a corresponding third event according to the operation form of the second touch operation (different operation forms correspond to different third events), and execute the third event on the front-end application of the first electronic device, where the operation form of the second touch operation includes at least one of the following: contacting a first preset area of the display screen of the first electronic device and sliding from the first preset area in a first preset direction; or contacting a second preset area of the display screen of the first electronic device, sliding from the second preset area in a second preset direction, and contacting the display screen of the first electronic device for longer than a preset time.
Specifically, after the enhanced interaction service acquires a touch operation input through the touch screen, it may identify whether the touch operation conforms to a preset operation form and, based on the touch operation conforming to the preset operation form, generate the event corresponding to that operation form. The enhanced interaction service may inject the event directly into the front-end application instead of executing it at the position of the cursor.
Optionally, referring to fig. 17, fig. 17 is an architecture diagram of an embodiment of the present application. As shown in fig. 17, the enhanced interaction service may obtain the interface content of the current front-end application. Optionally, the enhanced interaction service may obtain this interface content based on a screen recording interface provided by the system (for example, the MediaProjection interface provided by Android), and draw a cursor and a menu bar on the obtained interface content. The enhanced interaction service may send the drawn content as screen projection data to the screen projection service of the first electronic device; the screen projection service then obtains the screen projection data and sends it (after encoding and/or size conversion of the content) to the second electronic device based on a certain screen projection protocol.
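A sketch of this capture path using MediaProjection and an ImageReader surface; the Image-to-Bitmap conversion, permission flow, and error handling are omitted, and the cursor is drawn as a simple circle for illustration:

```java
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.graphics.PixelFormat;
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;
import android.media.ImageReader;
import android.media.projection.MediaProjection;

public class CapturePipeline {
    private final Paint cursorPaint = new Paint(Paint.ANTI_ALIAS_FLAG);

    /** Mirror the current front-end application's frames into an ImageReader. */
    public VirtualDisplay start(MediaProjection projection, int w, int h, int dpi) {
        cursorPaint.setColor(Color.WHITE);
        ImageReader reader = ImageReader.newInstance(w, h, PixelFormat.RGBA_8888, 2);
        return projection.createVirtualDisplay("screen-projection", w, h, dpi,
                DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
                reader.getSurface(), /*callback=*/null, /*handler=*/null);
    }

    /** Draw the cursor on one captured frame before it is encoded and sent. */
    public Bitmap drawCursor(Bitmap frame, float cursorX, float cursorY) {
        Bitmap out = frame.copy(Bitmap.Config.ARGB_8888, /*mutable=*/true);
        new Canvas(out).drawCircle(cursorX, cursorY, 12f, cursorPaint);
        return out;
    }
}
```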
1504. The first electronic device sends screen projection content to the second electronic device.
In the embodiment of the application, after the first electronic device sends the screen projection data to the second electronic device, the display screen of the second electronic device may display the screen projection content corresponding to the screen projection data, where the screen projection content includes a cursor and the interface content of the first electronic device, and the cursor is used to indicate an operation position in the interface content.
For the implementation of step 1504, reference may be made to the above embodiment, where the first electronic device sends a description related to the screen projection data to the second electronic device, and details are not described here again.
It should be noted that steps 1503 and 1504 may also be implemented independently, rather than only on the basis of steps 1501 and 1502 having been performed.
The embodiment of the application provides an information processing method, which includes: generating screen projection content; and sending the screen projection content to a second electronic device so that a display screen of the second electronic device displays the screen projection content, where the screen projection content includes a cursor and interface content of the first electronic device, and the cursor is used to indicate an operation position in the interface content. In the above manner, because a cursor is added to the screen projection content, the user can determine the position to be operated through the cursor displayed by the second electronic device: the user can determine the position to be operated based on the position of the cursor in the screen projection content displayed by the second electronic device, without looking at the interface content of the first electronic device, and can then perform the operation at the position of the cursor.
An embodiment of the present application further provides an electronic device, please refer to fig. 18, where fig. 18 is a schematic structural diagram of the electronic device according to the embodiment of the present application, and the electronic device includes:
a processing module 1801, configured to generate screen projection content;
a sending module 1802, configured to send the screen projection content to a second electronic device, so that a display screen of the second electronic device displays the screen projection content, where the screen projection content includes a cursor and interface content of the first electronic device, and the cursor is used to indicate an operation position in the interface content.
Optionally, the processing module 1801 is specifically configured to:
acquiring interface content of the front-end application of the first electronic device, and generating a cursor on the interface content to obtain the screen projection content.
Optionally, the processing module 1801 is specifically configured to:
acquiring interface content of a front-end application of the first electronic equipment, and generating a cursor and a menu bar on the interface content to obtain screen projection content; wherein the menu bar does not belong to the interface content of the first electronic device.
Optionally, the processing module 1801 is further configured to detect that the first electronic device and the second electronic device establish a screen-casting connection.
Optionally, the processing module 1801 is further configured to obtain pose change information of the first electronic device;
causing the cursor to move in the screen-cast content displayed by the second electronic device based on the pose change information.
Optionally, the pose change information includes a first rotation angle of the first electronic device on a horizontal plane and a second rotation angle of the first electronic device on a vertical plane, and the processing module 1801 is specifically configured to:
determining a horizontal displacement of the cursor according to the first rotation angle;
determining a vertical displacement of the cursor according to the second rotation angle;
and moving the cursor in the screen-casting content displayed by the second electronic device according to the horizontal displacement and the vertical displacement.
Optionally, the pose change information includes a first displacement of the first electronic device in a first direction and a second displacement of the first electronic device in a second direction, where the first direction is parallel to a left-right moving direction of a cursor displayed on a display screen of the second electronic device, the second direction is parallel to an up-down moving direction of the cursor displayed on the display screen of the second electronic device, and the processing module 1801 is further configured to:
determining a horizontal displacement of the cursor according to the first displacement;
determining a vertical displacement of the cursor according to the second displacement;
and moving the cursor in the screen-casting content displayed by the second electronic device according to the horizontal displacement and the vertical displacement.
Optionally, the processing module 1801 is further configured to:
when the cursor moves to a target object in the screen projection content displayed by the second electronic device, if a first touch operation on the display screen of the first electronic device is received,
enabling the target object in the screen projection content displayed by the second electronic device to respond to the first touch operation.
Optionally, the operation position of the first touch operation corresponds to a second object in the interface content of the first electronic device, and the processing module 1801 is further configured to:
shielding the response of the second object to the first touch operation.
Optionally, the first touch operation at least includes a click operation and a first sliding operation, and the target object at least includes an application program and a function control.
Optionally, the processing module 1801 is further configured to:
receiving a second sliding operation on a display screen of the first electronic equipment;
determining the displacement of the cursor according to the second sliding operation;
and causing the cursor to move in the screen-cast content displayed by the second electronic device based on the displacement.
Optionally, the operation position corresponding to the second sliding operation corresponds to a third object in the interface content of the first electronic device, and the processing module 1801 is further configured to:
shielding the response of the third object to the second sliding operation.
Optionally, the processing module 1801 is further configured to:
when the cursor moves to a target object in the screen projection content displayed by the second electronic device, if a click operation on the display screen of the first electronic device is received,
enabling the target object in the screen projection content displayed by the second electronic device to respond to the click operation.
Optionally, the operation position of the click operation corresponds to a fourth object in the interface content of the first electronic device, and the processing module 1801 is further configured to:
shielding the response of the fourth object to the click operation.
Optionally, the processing module 1801 is further configured to:
receiving a pressing operation on a physical key of the first electronic equipment;
generating a corresponding second event according to the pressing operation;
and executing the second event on the front-end application of the first electronic device.
Optionally, the processing module 1801 is further configured to:
receiving a first operation on the display screen of the first electronic device, wherein the first operation is a preset shortcut operation; and
responding to the first operation.
Optionally, the operation position of the first operation corresponds to a fifth object in the interface content of the second electronic device, and the fifth object does not respond to the first operation.
Optionally, the preset shortcut operation at least includes:
contacting a first preset area of the display screen of the first electronic device and sliding from the first preset area in a first preset direction; or
contacting a second preset area of the display screen of the first electronic device, sliding from the second preset area in a second preset direction, and contacting the display screen of the first electronic device for longer than a preset time.
Optionally, the processing module 1801 is further configured to:
reducing the display brightness of the display screen of the first electronic device; or
and executing screen-off operation on the display screen of the first electronic equipment.
Optionally, a display area of the display screen of the first electronic device is smaller than a display area of the display screen of the second electronic device.
Referring to fig. 19, fig. 19 is a schematic structural diagram of an electronic device according to an embodiment of the present application, and an electronic device 1900 may be embodied as a mobile phone, a tablet, an intelligent wearable device, and the like, which is not limited herein. Specifically, the electronic device 1900 includes: a receiver 1901, a transmitter 1902, a processor 1903, and a memory 1904 (wherein the number of processors 1903 in the electronic device 1900 may be one or more, and one processor is taken as an example in fig. 19), wherein the processor 1903 may include an application processor 19031 and a communication processor 19032. In some embodiments of the present application, the receiver 1901, the transmitter 1902, the processor 1903, and the memory 1904 may be connected by a bus or other means.
The memory 1904 may include a read-only memory and a random access memory, and provides instructions and data to the processor 1903. A portion of the memory 1904 may also include a non-volatile random access memory (NVRAM). The memory 1904 stores operating instructions executable by the processor, executable modules or data structures, or a subset or an extended set thereof, where the operating instructions may include various operating instructions for implementing various operations.
The processor 1903 controls the operation of the electronic device. In a particular application, the various components of the electronic device are coupled together by a bus system that may include a power bus, a control bus, a status signal bus, etc., in addition to a data bus. For clarity of illustration, the various buses are referred to in the figures as a bus system.
The method disclosed in the above embodiments of the present application may be applied to the processor 1903, or implemented by the processor 1903. The processor 1903 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware or instructions in the form of software in the processor 1903. The processor 1903 may be a general-purpose processor, a digital signal processor (DSP), a microprocessor or a microcontroller, and may further include an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The processor 1903 may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the methods disclosed in connection with the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in a decoding processor. The software modules may be located in a storage medium well known in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or a register. The storage medium is located in the memory 1904, and the processor 1903 reads the information in the memory 1904 and completes the steps of the above method in combination with its hardware.
The receiver 1901 may be used to receive input numeric or character information and generate signal inputs related to settings and function control of the electronic device. The transmitter 1902 may be configured to output numeric or character information through a first interface; the transmitter 1902 may also be configured to send, through the first interface, instructions to a disk group to modify data in the disk group; the transmitter 1902 may also include a display device such as a display screen.
In this embodiment, in one case, the processor 1903 is configured to execute the information processing method described in the foregoing embodiments.
Next, an embodiment provided by the present application is described taking the second electronic device 200 as a head-mounted display (HMD) as an example, wherein the HMD may be a VR device display, an AR device display or an MR device display.
In embodiments of the application, a user may be immersed in an augmented reality environment, a virtual reality environment, or a mixed reality environment by wearing a head-mounted display (HMD). The user may be immersed in a 3D virtual environment and interact with the virtual environment through a variety of different types of input. For example, these inputs may include physical interactions, such as manipulation of the first electronic device 100 separate from the HMD, manipulation of the HMD itself (based on head movement), and so on.
Referring to fig. 20, fig. 20 is an interaction schematic of a first electronic device and a second electronic device. As shown in fig. 20, a user wearing the second electronic device 200 (HMD) is holding the first electronic device 100 as a control device of the second electronic device 200. It should be noted that although only one first electronic device 100 is illustrated as the control device of the second electronic device 200 in the example shown in fig. 20, two (or more) additional external devices may also be paired and/or interact with the HMD (second electronic device 200) in the virtual environment. During operation (after pairing), the first electronic device 100 (and/or other external devices) may communicate with the second electronic device 200 via, for example, a wired connection, or a wireless connection such as a WiFi or Bluetooth connection, or other communication modes available to both devices.
Fig. 21a shows a schematic diagram of the first electronic device 100 connected to the second electronic device 200 using a cable 2100. The first electronic device 100 may connect to the second electronic device 200 using one or more high-speed communication protocols (e.g., USB 2.0, USB 3.0, and USB 3.1). In some cases, the first electronic device 100 may be connected to the second electronic device 200 using an audio/video interface, such as a high-definition multimedia interface (HDMI). In some cases, the first electronic device 100 may connect to the second electronic device 200 using the DisplayPort Alternate Mode of the USB Type-C standard interface. The DisplayPort Alternate Mode may include a high-speed USB communication interface and DisplayPort functionality.
The cable 2100 may include suitable connectors that plug into the second electronic device 200 and the first electronic device 100 at either end. For example, the cable may include universal serial bus (USB) connectors at both ends. The USB connectors may be the same type of USB connector, or each may be a different type of USB connector. The various types of USB connectors may include, but are not limited to, USB Type-A connectors, USB Type-B connectors, Micro-USB A connectors, Micro-USB B connectors, Micro-USB AB connectors, USB five-pin Mini-B connectors, USB four-pin Mini-B connectors, USB 3.0 Type-A connectors, USB 3.0 Type-B connectors, USB 3.0 Micro-B connectors, USB Type-C connectors, and the like.
Fig. 21b is a schematic diagram illustrating the use of a wireless connection 1601 to connect the first electronic device 100 to the second electronic device 200 without a cable (e.g., without the cable 2100 shown in fig. 21a). The first electronic device 100 may connect to the second electronic device 200 using the wireless connection 1601 by implementing one or more high-speed communication protocols, such as WiFi, Bluetooth, or Bluetooth Low Energy (BLE).
It should be noted that the second electronic device 200 can also be connected to other control devices, such as a handle.
As shown in fig. 22a, the user here uses the handle as the interaction device. However, in some scenarios the handle is less portable: an independent handle needs its own battery and a wireless connection to the first electronic device 100, which consumes extra power, and the handles of some AR/VR devices are large and heavy, so long-time use easily fatigues the user.
The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present application takes an Android system with a layered architecture as an example, and exemplarily illustrates a software structure of the electronic device 100.
Fig. 22b is a block diagram of a software structure of the first electronic device 100 according to the embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 22b, the application packages may include applications such as phone, camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message, and the VR glasses application. The VR glasses application comprises a 3D background drawing module, a handle event management module, an application icon loading module, a virtual screen management module, and a virtual screen content acquisition module.
The 3D background drawing module is used for finishing drawing of a background picture displayed in a 3D virtual environment, so that a user can obtain the feeling of being in a certain real scene.
The handle event management module is used for processing events from the handle, so that the user can operate controls in the virtual display interface through the handle.
The application icon loading module is used for loading and displaying the icons of a plurality of applications on the electronic device (such as WeChat, Weibo, Douyin, and the like) in the virtual environment of the VR glasses.
The virtual screen management module is used for creating a virtual screen when the user clicks an application icon to start the application, and destroying the virtual screen when the user closes the application.
The virtual screen content acquisition module is used for acquiring the content in an application when the user clicks the started application, and rendering the content with distortion processing so that it can be displayed in the virtual environment.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 22b, the application framework layer may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether there is a status bar, lock the screen, take screenshots, and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide communication functions of the electronic device 100. Such as management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables the application to display notification information in the status bar, can be used to convey notification-type messages, can disappear automatically after a short dwell, and does not require user interaction. Such as a notification manager used to inform download completion, message alerts, etc. The notification manager may also be a notification that appears in the form of a chart or scroll bar text at the top status bar of the system, such as a notification of a background running application, or a notification that appears on the screen in the form of a dialog window. For example, prompting text information in the status bar, sounding a prompt tone, vibrating the electronic device, flashing an indicator light, etc.
In the embodiment of the present application, in addition to the Activity Manager Service (AMS), the Window Management Service (WMS), and the Download Management Service (DMS), the application framework layer may further include an application keep-alive module, an event injection module, and a virtual screen management module.
The application keep-alive module is used for controlling the electronic equipment to enter a VR multi-screen display mode after the application with the multi-screen display mode function is started. In this mode, the electronic device may run multiple applications simultaneously and support each application being active at the same time.
The event injection module is used for acquiring an event corresponding to the user's operation in the multi-screen display mode and distributing the event to the virtual screen corresponding to the application.
The virtual screen management module is used for providing the electronic device with the capability of creating and destroying virtual screens.
The Android Runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application layer and the application framework layer as binary files. The virtual machine is used for performing the functions of object life cycle management, stack management, thread management, safety and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), Media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., OpenGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports a variety of commonly used audio, video format playback and recording, and still image files, among others. The media library may support a variety of audio-video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
As shown in fig. 22c, in this embodiment of the application, the VR scene display adapter module (HwVrDisplayAdapter) implements the creation and destruction of virtual screens in the multi-screen mode, implements management of the virtual screens, and opens a create-virtual-screen interface (createVrDisplay{ }) and a destroy-virtual-screen interface (destroyVrDisplay{ }) to other services (for example, the display management module (DisplayManager), the display management global module (DisplayManagerGlobal), the display management service module (DisplayManagerService), and the like). The display management module, the display management global module, and the display management service module complete the creation of a virtual screen when entering the VR multi-screen display mode by calling the create-virtual-screen interface layer by layer, and complete the destruction of the virtual screen when exiting the VR multi-screen display mode by calling the destroy-virtual-screen interface layer by layer. In fig. 22c, the display management service module registers the callback required for creating a virtual screen when the electronic device is initialized; that is, the display management service module first calls the interface for registering the VR scene display adapter module (registerVrDisplayAdapterLocked{ }) and then calls the registration interface (registerLocked{ }) to complete the registration and creation of the virtual screen.
As shown in fig. 22d, in this embodiment of the application, Android supports injecting an event into a designated screen, and a custom interface opens the capability of injecting events into a screen to other services. In fig. 22d, the input management module (InputManager) sequentially calls the injectInputEvent(event, mode, displayId) interface, the injectInputEventToDisplay(event, mode) interface, and the injectInputEventInternal(event, displayId) interface to inject an event into a specified virtual screen.
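Expressed as code, the call chain of fig. 22d looks roughly as follows. These method names mirror the internal/custom interfaces named above; they are not part of the public Android SDK, and the final dispatch step is shown as a stub:

```java
import android.view.InputEvent;

public class InputInjectionChain {
    public boolean injectInputEvent(InputEvent event, int mode, int displayId) {
        return injectInputEventToDisplay(event, mode, displayId);
    }

    private boolean injectInputEventToDisplay(InputEvent event, int mode, int displayId) {
        // permission / calling-service checks would happen here
        return injectInputEventInternal(event, displayId);
    }

    private boolean injectInputEventInternal(InputEvent event, int displayId) {
        // Hand the event to the input dispatcher bound to the virtual screen
        // identified by displayId (system-internal; stubbed for illustration).
        return true;
    }
}
```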
Referring to fig. 23a, fig. 23a is an interaction diagram of a user using a first electronic device; here the user operates a VR/AR application of the first electronic device through a handle. As shown in fig. 23a, the display interface of the second electronic device includes the current display image 2300 of the first electronic device, a function selection area 2302, and an interaction mode display area 2301, where the interaction mode display area 2301 indicates that the current interaction mode is the handle, and the user can operate the current interface through the handle.
Referring to fig. 23b, fig. 23b is a schematic diagram of the user's interaction with the first electronic device. As shown in fig. 23b, the user can click the interaction mode display area 2301 by operating the touch screen 2303 on the handle, so that the handle-based interaction mode is switched to the interaction mode based on the first electronic device (e.g., the interface shown in fig. 23d).
In this embodiment of the application, when the handle is connected to the second electronic device, the handle or the first electronic device may be used as a control device of the second electronic device; when only the first electronic device establishes connection with the second electronic device, the first electronic device may be used as a control device of the second electronic device.
In this embodiment, the handle may first be used as the control device of the second electronic device, and upon receiving a first interaction mode switching instruction, the control device of the second electronic device may be switched from the handle to the first electronic device in response to the instruction.
In this embodiment of the application, the first electronic device may receive a first interaction mode switching instruction sent by the handle, may generate a corresponding first interaction mode switching instruction upon receiving a second operation on the first electronic device, or may receive a first interaction mode switching instruction sent by the second electronic device.
Optionally, in an embodiment, a corresponding physical key may further be disposed on the handle to implement the function of switching to the interaction mode based on the first electronic device. In this embodiment of the application, a user may press the physical key to switch to the interaction mode based on the first electronic device (that is, to switch the control device of the second electronic device from the handle to the first electronic device).
Optionally, in another embodiment, the interaction mode based on the first electronic device may be switched to by pressing a physical button on the first electronic device (for example, the power button shown in fig. 23c).
Optionally, in another embodiment, the interaction mode based on the first electronic device may be switched to by pressing a physical key on the second electronic device.
Optionally, in another embodiment, if it is detected that the handle does not establish a connection with the second electronic device, the interaction mode based on the first electronic device may be directly used.
Similarly, the first electronic device may switch to the handle-based interaction mode in the same manner.
In the handle-based interaction mode, the user can control the display position of the cursor on the second electronic device based on the operation of the handle.
In an interaction mode based on a first electronic device, a user can control the display position of the cursor on the second electronic device based on the operation of the first electronic device.
How to control the display position of the cursor on the second electronic device by the user based on the operation of the first electronic device may refer to the description in the above embodiments, and details are not repeated here.
Alternatively, in an embodiment, instead of indicating the user operation object by displaying a cursor, a ray may be displayed in the second electronic device, extending from the bottom side of the screen (or from the mobile phone image displayed by the second electronic device) toward the currently displayed content, where the ray includes an end point, and the end point may indicate the position currently selected by the user (corresponding to the cursor).
At this time, the user may adjust the direction of the ray by adjusting the posture of the first electronic device, so that the position of the end point of the ray can be adjusted onto the target object to be operated. Referring to fig. 24a, fig. 24a is an interaction schematic diagram of a user using a first electronic device according to an embodiment of the present application. As shown in fig. 24a, if the user wants to operate the position corresponding to area A, the user may adjust the direction of the ray by adjusting the posture of the first electronic device, so that the position of the end point of the ray is adjusted onto the target position to be operated (area A).
Referring to fig. 24b, fig. 24b is an interaction schematic diagram of a user using a first electronic device according to an embodiment of the present application, as shown in fig. 24b, the user may rotate the first electronic device on a horizontal plane, and accordingly, a direction of a ray displayed by a second electronic device may be changed accordingly, and an end position of the ray may be shifted in the horizontal direction.
Referring to fig. 24c, fig. 24c is an interaction schematic diagram of a user using a first electronic device according to an embodiment of the present application, as shown in fig. 24c, the user may rotate the first electronic device on a vertical plane, and accordingly, an end position of a ray displayed by a second electronic device may be shifted in a vertical direction.
In the embodiment of the application, after the user adjusts the position of the end point of the ray to the target position which the user wants to operate, the user can click the display screen of the first electronic device, and accordingly, the user can perform the click operation at the position of the end point of the ray.
Specifically, in an embodiment, after the user adjusts the position of the end point of the ray to the target position that the user wants to operate, the first electronic device may acquire the specific pixel coordinate position of the end point of the ray in the display interface of the second electronic device, and determine the corresponding pixel coordinate position in the display interface of the first electronic device according to that position.
The user performs a click operation on the display screen of the first electronic device. At this time, the click operation is not responded to by the foreground application (that is, by the interface currently displayed on the first electronic device); instead, the first electronic device injects an event corresponding to the click operation at the pixel coordinate position, corresponding to the end point of the ray, in the display interface of the first electronic device, which is equivalent to causing the first electronic device to perform the click operation at that pixel coordinate position.
Optionally, in another embodiment, the user performs a sliding operation on the display screen of the first electronic device, where the sliding operation is not responded to by the foreground application (that is, by the interface currently displayed on the first electronic device); instead, the first electronic device injects an event corresponding to the sliding operation at the pixel coordinate position, corresponding to the end point of the ray, in the display interface of the first electronic device, so that the sliding operation is performed at that pixel coordinate position in the first electronic device. For more details, reference may be made to the above-described embodiments, which are not repeated here.
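A minimal sketch of the mapping-and-injection step in the last two embodiments follows. The linear scaling between the two pixel spaces and the class names are assumptions; the synthesized events would then be injected (for example through a path such as the EventInjector sketch above) rather than handled by the foreground interface.

    import android.os.SystemClock;
    import android.view.MotionEvent;

    // Sketch: scale the ray end point from the second device's display
    // interface into the first device's interface, then synthesize a click
    // there. The linear mapping is an assumption.
    public final class RayClickDispatcher {
        public static float[] mapEndpoint(float rayX, float rayY,
                                          float srcW, float srcH,
                                          float dstW, float dstH) {
            return new float[] {rayX * dstW / srcW, rayY * dstH / srcH};
        }

        // Build a down/up pair at the mapped position; injecting these events
        // is equivalent to clicking at the ray end point.
        public static MotionEvent[] buildClick(float x, float y) {
            long t = SystemClock.uptimeMillis();
            MotionEvent down =
                    MotionEvent.obtain(t, t, MotionEvent.ACTION_DOWN, x, y, 0);
            MotionEvent up =
                    MotionEvent.obtain(t, t + 50, MotionEvent.ACTION_UP, x, y, 0);
            return new MotionEvent[] {down, up};
        }
    }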
Referring to fig. 25, fig. 25 is an operation schematic diagram of a first electronic device according to an embodiment of the present application. Besides the click confirmation operation, the first electronic device can support common full-screen gesture operations that are consistent with the original handle-based mode, so as to reduce the learning cost for the user. As shown in fig. 25: sliding from the left edge or the right edge toward the middle represents the back key (corresponding to the back key on the handle); sliding from the lower edge toward the middle represents the home key (corresponding to the home key on the handle); sliding up, down, left, and right corresponds to sliding on the touch screen of the handle; the volume keys adjust the volume (corresponding to the volume adjustment keys on the handle); sliding from the lower edge toward the middle and pausing in the middle implements the view return (corresponding to a long press of the home key on the handle); pressing the volume down key and the power key at the same time implements a screen capture; and the like.
The principle of implementing the full-screen gesture operation may refer to the description of the embodiment in the screen projection scene, and is not described herein again.
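For illustration, the sketch below classifies the full-screen gestures listed above from the start and end points of a touch; the edge widths and swipe thresholds are assumed values, not figures from this application.

    // Illustrative classification of the full-screen gestures in fig. 25;
    // the edge band and swipe threshold are assumptions.
    public final class FullScreenGestureClassifier {
        private static final float EDGE = 40f;       // edge band, px (assumed)
        private static final float MIN_SWIPE = 120f; // swipe threshold (assumed)

        public enum Gesture { BACK, HOME, VIEW_RETURN, NONE }

        public static Gesture classify(float downX, float downY,
                                       float upX, float upY,
                                       boolean pausedMidway,
                                       float screenW, float screenH) {
            boolean fromLeft = downX < EDGE && (upX - downX) > MIN_SWIPE;
            boolean fromRight =
                    downX > screenW - EDGE && (downX - upX) > MIN_SWIPE;
            boolean fromBottom =
                    downY > screenH - EDGE && (downY - upY) > MIN_SWIPE;

            if (fromLeft || fromRight) return Gesture.BACK;         // back key
            if (fromBottom && pausedMidway) return Gesture.VIEW_RETURN;
            if (fromBottom) return Gesture.HOME;                    // home key
            return Gesture.NONE;
        }
    }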
Referring to fig. 26, fig. 26 is an interaction schematic diagram of a user using a first electronic device according to an embodiment of the present application. As shown in fig. 26, the user may perform a sliding operation on the display screen of the first electronic device, and accordingly the pointer displayed by the second electronic device is displaced. For how the user performs the sliding operation on the display screen of the first electronic device to displace the pointer displayed by the second electronic device, reference may be made to the description in the foregoing embodiments, and details are not repeated here.
Referring to fig. 27, fig. 27 is a schematic diagram of a system architecture provided in an embodiment of the present application. As shown in fig. 27, taking the first electronic device as a mobile phone as an example, the system architecture includes: an AR device/VR device/MR device, a mobile phone, and an independent handle. The independent handle can be connected with the AR device/VR device/MR device, and the mobile phone can be connected with the AR device/VR device/MR device.
In one implementation, a user may interact with the first electronic device by operating a separate handle and control the display content of the AR/VR/MR device.
Specifically, the independent handle may acquire its own posture information or sliding information on its touch panel, and send the acquired posture information or sliding information to the mobile phone through the AR device/VR device/MR device. The mobile phone may process the posture information or sliding information based on the independent-handle interaction manner, and move the pointer in the interface content based on the posture information or sliding information. In addition, the independent handle may acquire a selection instruction (for example, through a physical button on the independent handle) and send the selection instruction to the mobile phone, and the mobile phone may process the selection instruction based on the independent-handle interaction manner.
In one implementation, a user may interact with the first electronic device by operating a cell phone and control the display content of the AR/VR/MR device.
Specifically, the mobile phone may acquire its own posture information or sliding information on its display screen, process the posture information or sliding information based on the mobile phone interaction manner, and move the pointer in the interface content based on the posture information or sliding information. In addition, the mobile phone may acquire a touch operation, generate a corresponding event, and execute the generated event in the current front-end application.
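The two interaction manners can be seen as two sources feeding the same pointer logic; the sketch below is a hypothetical illustration of that routing, with all class names assumed.

    // Hypothetical routing of the two interaction manners described above.
    public class InteractionRouter {
        public enum Mode { HANDLE, PHONE }

        public static final class Pointer {
            float x, y;
            void moveBy(float dx, float dy) { x += dx; y += dy; }
        }

        private Mode mode = Mode.HANDLE;
        private final Pointer pointer = new Pointer();

        public void switchMode(Mode newMode) { mode = newMode; }
        public Mode mode() { return mode; }

        // Posture changes or slides move the pointer in the same way,
        // whichever device (handle or phone) produced them.
        public void onMotion(float dx, float dy) { pointer.moveBy(dx, dy); }

        // A selection instruction (handle button, or a tap on the phone's
        // touch screen) takes effect at the pointer's current position.
        public float[] onSelect() { return new float[] {pointer.x, pointer.y}; }
    }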
Referring to fig. 28, fig. 28 is a schematic diagram of a system architecture provided in an embodiment of the present application. As shown in fig. 28, the system architecture includes a system (for example, an Android system), an enhanced interaction service, a screen-off and screen-on service, and an AR/VR/MR service.
The system can send information input by the sensors (such as posture information) to the enhanced interaction service, and can send information input by the touch screen (such as touch events) to the AR/VR/MR service; the AR/VR/MR service can forward the information input by the touch screen to the enhanced interaction service; and the enhanced interaction service can send a screen-off or screen-on instruction to the screen-off and screen-on service, which in turn sends the instruction to the system to turn the screen of the first electronic device off or on.
The enhanced interaction service may process the received information (information input by the sensors or by the touch screen), draw the pointer, and deliver a corresponding event to the AR/VR/MR application (for example, based on the pointer's location).
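A schematic sketch of this information flow is given below; every class and method name is a hypothetical stand-in for the services in fig. 28.

    // Hypothetical sketch of the enhanced interaction service in fig. 28.
    public class EnhancedInteractionService {
        public interface ScreenService {
            void setScreenOn(boolean on); // relayed to the system
        }

        private final ScreenService screenService;

        public EnhancedInteractionService(ScreenService screenService) {
            this.screenService = screenService;
        }

        // Information input by the sensors (posture changes).
        public void onSensorInput(float yawDeg, float pitchDeg) {
            // ... update the pointer position from the pose change ...
        }

        // Information input by the touch screen, forwarded by the
        // AR/VR/MR service.
        public void onTouchInput(float x, float y) {
            // ... dispatch a corresponding event to the AR/VR/MR
            // application, e.g. based on the pointer's location ...
        }

        public void requestScreenOff() { screenService.setScreenOn(false); }
        public void requestScreenOn()  { screenService.setScreenOn(true); }
    }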
An embodiment of the present application further provides an electronic device, where the electronic device includes:
a sending module, configured to establish a connection with a second electronic device, where the second electronic device displays a cursor and interface content of the first electronic device, the first electronic device includes a touch screen, and the second electronic device is an augmented reality AR device, a virtual reality VR device, or a mixed reality MR device;
and a processing module, configured to acquire an operation on the first electronic device and control the display position of the cursor on the second electronic device based on the operation on the first electronic device.
Optionally, the interface content is interface content of a front-end application of the first electronic device.
Optionally, the second electronic device further displays: a menu bar, the menu bar not belonging to the interface content of the first electronic device.
Optionally, the processing module is specifically configured to acquire pose change information of the first electronic device, and move the cursor on the display content of the second electronic device based on the pose change information.
Optionally, the pose change information includes a first rotation angle of the first electronic device on a horizontal plane and a second rotation angle of the first electronic device on a vertical plane, and the processing module is specifically configured to:
determining a horizontal displacement of the cursor according to the first rotation angle;
determining a vertical displacement of the cursor according to the second rotation angle;
and moving the cursor on the display content of the second electronic equipment according to the horizontal displacement and the vertical displacement.
Optionally, the pose change information includes a first displacement of the first electronic device in a first direction and a second displacement of the first electronic device in a second direction, where the first direction is parallel to a left-right movement direction of a cursor displayed by the second electronic device, and the second direction is parallel to an up-down movement direction of the cursor displayed by the second electronic device, and the processing module is specifically configured to:
determining a horizontal displacement of the cursor according to the first displacement;
determining a vertical displacement of the cursor according to the second displacement;
and moving the cursor on the display content of the second electronic equipment according to the horizontal displacement and the vertical displacement.
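Both mappings (rotation-based, in the preceding embodiment, and translation-based, here) reduce to a linear conversion from pose change to cursor displacement; the sketch below illustrates this with assumed gain constants.

    // The two pose-to-cursor mappings described above, with assumed gains.
    public final class PoseToCursor {
        private static final float DEG_GAIN = 20f; // px per degree (assumed)
        private static final float MM_GAIN  = 5f;  // px per millimetre (assumed)

        // Rotation-based: the first angle on the horizontal plane gives the
        // horizontal displacement, the second angle on the vertical plane
        // gives the vertical displacement.
        public static float[] fromRotation(float yawDeg, float pitchDeg) {
            return new float[] {yawDeg * DEG_GAIN, pitchDeg * DEG_GAIN};
        }

        // Translation-based: the first displacement is parallel to the
        // cursor's left-right direction, the second to its up-down direction.
        public static float[] fromTranslation(float dxMm, float dyMm) {
            return new float[] {dxMm * MM_GAIN, dyMm * MM_GAIN};
        }
    }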
Optionally, the processing module is further configured to:
when the cursor moves to a target object in the display content of the second electronic device, if a first touch operation on a display screen of the first electronic device is received;
causing a target object in the display content of the second electronic device to respond to the first touch operation.
Optionally, the operation position of the first touch operation corresponds to a second object in the interface content of the first electronic device, and the processing module is further configured to:
And shielding the response of the second object to the first touch operation.
Optionally, the first touch operation at least includes a click operation and a first sliding operation, and the target object at least includes an application program and a function control.
Optionally, the processing module is specifically configured to:
receiving a second sliding operation on a display screen of the first electronic equipment;
determining the displacement of the cursor according to the second sliding operation;
causing the cursor to move in display content of the second electronic device based on the displacement.
Optionally, the starting operation position corresponding to the second sliding operation corresponds to a third object in the interface content of the first electronic device, and the processing module is further configured to:
and shielding the response of the third object to the second sliding operation.
Optionally, the processing module is further configured to:
when the cursor moves to a target object in the display content of the second electronic device, if a click operation on a display screen of the first electronic device is received;
causing a target object in the display content of the second electronic device to respond to the clicking operation.
Optionally, the operation position of the click operation corresponds to a fourth object in the interface content of the first electronic device, and the processing module is further configured to:
And shielding the response of the fourth object to the click operation.
Optionally, the second electronic device further displays: a ray, where an end point of the ray is the cursor.
Optionally, the processing module is further configured to:
receiving a pressing operation on a physical key of first electronic equipment;
generating a corresponding second event according to the pressing operation;
and executing the second event to the first electronic equipment front-end application.
Optionally, the processing module is further configured to receive a second operation on the display screen of the first electronic device, where the second operation is a preset shortcut operation, and enable the first electronic device to respond to the second operation.
Optionally, the operation position of the second operation corresponds to a fifth object in the interface content of the second electronic device, and the fifth object does not respond to the second operation.
Optionally, the preset shortcut operation at least includes:
contacting a first preset area of the display screen of the first electronic device and sliding from the first preset area in a first preset direction; and
contacting a second preset area of the display screen of the first electronic device and sliding from the second preset area in a second preset direction, where the duration of contact with the display screen of the first electronic device is longer than a preset time.
Optionally, the processing module is further configured to reduce the display brightness of the display screen of the first electronic device; or
perform a screen-off operation on the display screen of the first electronic device.
Optionally, the processing module is specifically configured to:
acquiring pose change information of the first electronic equipment based on the display of a first application on the second electronic equipment, and enabling the cursor to move on the display content of the second electronic equipment based on the pose change information;
the method includes the steps of receiving a second sliding operation on a display screen of the first electronic device based on the fact that a second application is displayed on the second electronic device, determining displacement of a cursor according to the second sliding operation, and enabling the cursor to move in display content of the second electronic device based on the displacement, wherein the first application and the second application are different applications.
The embodiment of the present application further provides a first electronic device, where the first electronic device is connected to a second electronic device, the first electronic device includes a touch screen, and the second electronic device is an augmented reality AR device, a virtual reality VR device, or a mixed reality MR device. The first electronic device includes:
The processing module is used for taking the handle or the first electronic equipment as control equipment of the second electronic equipment when the handle is connected with the second electronic equipment;
and when only the first electronic equipment is connected with the second electronic equipment, the first electronic equipment is used as the control equipment of the second electronic equipment.
Optionally, the processing module is specifically configured to:
using the handle as a control device for the second electronic device;
receiving a first interaction mode switching instruction;
and responding to the first interaction mode switching instruction, and switching the control device of the second electronic device from the handle to the first electronic device.
Optionally, the processing module is specifically configured to:
receiving a first interaction mode switching instruction sent by the handle, generating a corresponding first interaction mode switching instruction upon receiving a second operation on the first electronic device, or receiving a first interaction mode switching instruction sent by the second electronic device.
Optionally, the processing module is specifically configured to:
the first electronic equipment is used as the control equipment of the second electronic equipment;
the first electronic device further comprises an acquisition module, configured to receive a second interaction mode switching instruction;
the processing module is specifically configured to switch the control device of the second electronic device from the first electronic device to the handle in response to the second interaction mode switching instruction.
Optionally, the processing module is specifically configured to:
receiving a second interaction mode switching instruction sent by the handle, generating a corresponding second interaction mode switching instruction upon receiving a second operation on the first electronic device, or receiving a second interaction mode switching instruction sent by the second electronic device.
Optionally, the second electronic device displays a cursor and interface content of the first electronic device, and the processing module is specifically configured to:
controlling a display position of the cursor on the second electronic device based on the operation of the handle;
and controlling the display position of the cursor on the second electronic equipment based on the operation of the first electronic equipment.
Optionally, the interface content is interface content of a front-end application of the first electronic device.
Optionally, the processing module is specifically configured to:
Acquiring pose change information of the first electronic equipment;
causing the cursor to move on display content of the second electronic device based on the pose change information.
Optionally, the processing module is further configured to:
when the cursor moves to a target object in the display content of the second electronic device, if a first touch operation on a display screen of the first electronic device is received;
causing a target object in the display content of the second electronic device to respond to the first touch operation.
Optionally, the processing module is further configured to:
receiving a second sliding operation on a display screen of the first electronic equipment;
determining the displacement of the cursor according to the second sliding operation;
causing the cursor to move in display content of the second electronic device based on the displacement.
Optionally, the processing module is further configured to:
when the cursor moves to a target object in the display content of the second electronic device, if a click operation on a display screen of the first electronic device is received;
causing a target object in the display content of the second electronic device to respond to the clicking operation.
An embodiment of the present application further provides an electronic device, including:
a sending module, configured to display interface content of the first electronic device in a second electronic device, where the first electronic device includes a touch screen, and the second electronic device is an augmented reality AR device, a virtual reality VR device, or a mixed reality MR device;
the processing module is used for receiving a first operation acted on a first display screen of the first electronic equipment; causing the interface content displayed by the second electronic device to be responsive to the first operation; wherein the operation position of the first operation corresponds to a first object in the interface content of the first electronic device; the first object does not respond to the first operation.
Optionally, the causing of the interface content displayed by the second electronic device to respond to the first operation specifically includes:
determining a first position in the interface content displayed by the first electronic equipment; causing the interface content displayed by the second electronic device to respond to the first operation based on the first location; wherein the first position is independent of an operating position of the first operation.
Optionally, the processing module is further configured to:
project the cursor into the second electronic device, so that the second electronic device displays the cursor.
Optionally, a position of the cursor corresponding to the interface content displayed by the first electronic device is a first position.
Optionally, the determining a first position in the interface content displayed by the first electronic device includes:
acquiring pose change information of the first electronic equipment;
determining a first position of the cursor in interface content displayed by the first electronic device based on the pose change information.
Optionally, the determining the first position of the cursor in the interface content displayed by the first electronic device based on the pose change information includes:
determining a horizontal displacement of the cursor according to the first rotation angle;
determining a vertical displacement of the cursor according to the second rotation angle;
and determining a first position of the cursor in the interface content displayed by the first electronic equipment according to the horizontal displacement and the vertical displacement.
Optionally, the determining, based on the pose change information, a first position of the cursor in the interface content displayed by the first electronic device includes:
determining a horizontal displacement of the cursor according to the first displacement;
determining a vertical displacement of the cursor according to the second displacement;
and determining a first position of the cursor in the interface content displayed by the first electronic equipment according to the horizontal displacement and the vertical displacement.
Optionally, the first operation includes at least a click operation and a first slide operation, and the first object includes at least one of an application and a functionality control.
Optionally, the determining a first position in the interface content displayed by the first electronic device includes:
receiving a second sliding operation on a display screen of the first electronic equipment;
Determining the displacement of the cursor according to the second sliding operation;
and determining a first position of the cursor in the interface content displayed by the first electronic equipment according to the displacement of the cursor.
Optionally, the first operation at least includes a click operation, and the first object at least includes one of an application program and a function control.
Optionally, the processing module is further configured to:
reduce the display brightness of the first display screen; or
perform a screen-off operation on the first display screen.
Optionally, the processing module is further configured to:
display the menu bar in the second electronic device.
Referring to fig. 29, fig. 29 is a flowchart illustrating an operation method applied to screen projection according to an embodiment of the present disclosure, where a first electronic device has a first display screen, and a second electronic device has a second display screen; as shown in fig. 29, the operation method applied to screen projection provided by the embodiment of the present application includes:
2901. The first electronic device projects the interface content of the first electronic device into the second electronic device, so that the second display screen of the second electronic device displays the interface content.
In the embodiment of the application, the first electronic device may be equipped with the enhanced interaction service, and the enhanced interaction service may monitor the screen projection connection state of the first electronic device. The enhanced interaction service may detect that the first electronic device and the second electronic device establish the screen-casting connection, and the manner of establishing the screen-casting between the first electronic device and the second electronic device may refer to the description of the embodiments corresponding to fig. 4a to 5c, which is not described herein again.
In the embodiment of the application, after it is detected that the screen projection connection is established between the first electronic device and the second electronic device, the first electronic device projects the interface content displayed by the first display screen into the second electronic device, so that the second display screen displays the interface content.
Optionally, the first electronic device may generate a cursor and project the cursor on the second electronic device, so that the second display screen displays the cursor. How the first electronic device generates the cursor may be described with reference to the corresponding embodiment in step 301, and is not described herein again.
Alternatively, the first electronic device may generate a menu bar and screen-cast the menu bar in the second electronic device, so that the second display screen displays the menu bar. How the first electronic device generates the menu bar may be described with reference to the corresponding embodiment in step 301, and is not described herein again.
Alternatively, the first electronic device may acquire pose change information of the first electronic device, and cause the cursor to move on the content displayed on the second display screen based on the pose change information. As to how to move the cursor in the display screen of the second electronic device based on the posture change information of the first electronic device, reference may be made to the description in the above embodiments, and details are not repeated here.
Optionally, the first electronic device may determine a first position in the interface content displayed by the first electronic device, and cause the interface content displayed by the second display screen to respond to the first operation based on the first position; wherein the first position is independent of an operating position of the first operation.
Optionally, the first electronic device may acquire pose change information of the first electronic device, and determine a first position of the cursor in interface content displayed by the first electronic device based on the pose change information. At this time, the position of the cursor corresponding to the interface content displayed by the first electronic device is a first position.
Optionally, the first electronic device may further receive a second sliding operation acting on the first display screen, determine a displacement of the cursor according to the second sliding operation, and move the cursor on the content displayed on the second display screen based on the displacement of the cursor. As to how to move the cursor in the display screen of the second electronic device according to the second sliding operation, reference may be made to the description in the foregoing embodiments, and details are not repeated here.
Optionally, the first electronic device may receive a second sliding operation on the display screen of the first electronic device, determine a displacement of the cursor according to the second sliding operation, and determine a first position of the cursor in the interface content displayed by the first electronic device according to the displacement of the cursor.
In this embodiment, after a first position is determined in the interface content displayed by the first electronic device, the interface content displayed by the second display screen may respond to the first operation based on the first position. As to how the first position responds to the first operation, reference may be made to the description of step 302 in the foregoing embodiment, which is not repeated here.
Optionally, after the first electronic device detects that the first electronic device establishes the screen projection connection with the second electronic device, the display brightness of the first display screen may also be reduced; or executing screen-off operation on the first display screen.
2902. The first electronic equipment receives a first operation acted on a first display screen of the first electronic equipment; causing the interface content displayed by the second display screen to be responsive to the first operation; wherein the operation position of the first operation corresponds to a first object in the interface content of the first electronic device; the first object does not respond to the first operation.
In an embodiment of the present application, the first object may include at least one of an application and a functionality control.
In this embodiment, the first electronic device may shield a response of the content of the front-end application to the operation position of the first operation, that is, the operation position of the first operation corresponds to a first object in the interface content of the first electronic device, and the first object does not respond to the first operation.
Specifically, after receiving a first operation on the display screen of the first electronic device, the first electronic device may mask the response of the interface content at the operation position of the first operation, and determine the position of the cursor on the second display screen in order to determine the response position (the first position) of the interface content to the first operation. For example, if the user performs a click operation on object A on the first display screen of the first electronic device while the cursor is located on object B in the second electronic device, the first electronic device may mask the response of object A to the click operation and respond to the click operation on object B of the interface content. Correspondingly, the first position is independent of the operation position of the first operation and depends only on the position of the cursor on the second display screen of the second electronic device.
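For illustration, the masking behaviour in this example can be sketched as follows; the class is hypothetical and only shows that, while projection is active, the response position depends on the cursor rather than on the touch position.

    // Hypothetical sketch of the masking described above: while projecting,
    // a tap on object A is re-dispatched at the cursor position (object B).
    public class ProjectionTouchFilter {
        private boolean projecting;
        private float cursorX, cursorY; // cursor position on the 2nd display

        public void setProjecting(boolean on) { projecting = on; }
        public void setCursor(float x, float y) { cursorX = x; cursorY = y; }

        // Returns the position at which the tap should take effect; the
        // original touch position is deliberately ignored while projecting.
        public float[] resolveTap(float touchX, float touchY) {
            if (projecting) {
                return new float[] {cursorX, cursorY}; // respond at object B
            }
            return new float[] {touchX, touchY};       // normal behaviour
        }
    }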
In this embodiment of the application, the first electronic device responds to the first operation at the first position of the content displayed on the first display screen, and the second display screen of the second electronic device also synchronously displays the content updated after the content displayed on the first display screen responds to the first operation at the first position, that is, the content displayed on the second display screen responds to the first operation.
In the embodiment of the application, the interface content of the first electronic equipment is projected to the second electronic equipment, so that the second display screen of the second electronic equipment displays the interface content; receiving a first operation acting on a first display screen of the first electronic device; causing the interface content displayed by the second display screen to be responsive to the first operation; wherein the operation position of the first operation corresponds to a first object in the interface content of the first electronic device; the first object does not respond to the first operation. Through the above manner, after the screen-casting connection is established between the first electronic device and the second electronic device, the content in the first display screen does not respond to the first object corresponding to the operation position where the first operation is located, and the content displayed by the second display screen responds to the first operation, so that the user can operate the content of the first display screen based on the content displayed by the second display screen without seeing the content of the first display screen of the first electronic device.
An embodiment of the present application further provides a first electronic device, where the first electronic device includes:
the processing module is used for projecting the interface content of the first electronic equipment into the second electronic equipment so that the second display screen of the second electronic equipment displays the interface content;
the processing module is further used for receiving a first operation acted on a first display screen of the first electronic device; causing the interface content displayed by the second display screen to respond to the first operation; wherein the operation position of the first operation corresponds to a first object in the interface content of the first electronic device; the first object is not responsive to the first operation.
Optionally, the processing module is specifically configured to:
determining a first position in the interface content displayed by the first electronic equipment; causing the interface content displayed by the second display screen to respond to the first operation based on the first location; wherein the first position is independent of an operating position of the first operation.
Optionally, the processing module is further configured to:
the cursor is projected in the second electronic device, so that the second display screen displays the cursor.
Optionally, a position of the cursor corresponding to the interface content displayed by the first electronic device is a first position.
Optionally, the processing module is specifically configured to:
acquiring pose change information of the first electronic equipment;
determining a first position of the cursor in interface content displayed by the first electronic device based on the pose change information.
Optionally, the pose change information includes a first rotation angle of the first electronic device on a horizontal plane and a second rotation angle of the first electronic device on a vertical plane, and the processing module is specifically configured to:
determining a horizontal displacement of the cursor according to the first rotation angle;
determining a vertical displacement of the cursor according to the second rotation angle;
and determining a first position of the cursor in the interface content displayed by the first electronic equipment according to the horizontal displacement and the vertical displacement.
Optionally, the pose change information includes a first displacement of the first electronic device in a first direction and a second displacement of the first electronic device in a second direction, where the first direction is parallel to a left-right moving direction of a cursor displayed on a display screen of the second electronic device, and the second direction is parallel to an up-down moving direction of the cursor displayed on the display screen of the second electronic device, and the processing module is specifically configured to:
Determining a horizontal displacement of the cursor according to the first displacement;
determining a vertical displacement of the cursor according to the second displacement;
and determining a first position of the cursor in the interface content displayed by the first electronic equipment according to the horizontal displacement and the vertical displacement.
Optionally, the first operation includes at least a click operation and a first slide operation, and the first object includes at least one of an application and a functionality control.
Optionally, the processing module is specifically configured to:
receiving a second sliding operation on a display screen of the first electronic equipment;
determining the displacement of the cursor according to the second sliding operation;
and determining a first position of the cursor in the interface content displayed by the first electronic equipment according to the displacement of the cursor.
Optionally, the first operation at least includes a click operation, and the first object at least includes one of an application program and a function control.
Optionally, the first operation is a preset shortcut operation, and the processing module is further configured to:
causing the first electronic device to respond to the first operation.
Optionally, the operation position of the first operation corresponds to a third object in the interface content of the second electronic device, and the third object does not respond to the first operation.
Optionally, the preset shortcut operation at least includes:
contacting a first preset area of the display screen of the first electronic device and sliding from the first preset area in a first preset direction; and
contacting a second preset area of the display screen of the first electronic device and sliding from the second preset area in a second preset direction, where the duration of contact with the display screen of the first electronic device is longer than a preset time.
Optionally, the processing module is further configured to:
reduce the display brightness of the first display screen; or
perform a screen-off operation on the first display screen.
Optionally, the processing module is further configured to:
generating a menu bar;
and projecting the menu bar into the second electronic equipment so that the second display screen displays the menu bar.
Optionally, the processing module is further configured to:
detecting that the first electronic device and the second electronic device establish screen projection connection.
The embodiment of the application also provides a control method for multi-application display.
Specifically, the imaging system (e.g., a display screen, glasses, etc.) of the second electronic device (e.g., a large screen, a television, or an AR, VR, or MR device) may display multiple display interfaces of the first electronic device.
Optionally, the first electronic device may generate a menu bar when content transmission with the second electronic device is established; the menu bar may be displayed in the imaging system of the second electronic device, and the user may add a new independent display interface of the first electronic device (e.g., a mobile phone interface) in the imaging system of the second electronic device through an "add" button in the menu bar, as shown in fig. 30.
Optionally, each time the user clicks "add", an independent mobile phone interface may be newly added to the imaging system. The state of the newly added interface may be the home interface of the mobile phone, the current display interface of the mobile phone, a preset default interface, the interface of a certain preset application, a copy of a mobile phone interface currently displayed in the imaging system, or a random interface; these are examples only, and the present invention is not limited thereto. Because multiple independent interfaces of the mobile phone (such as a first interface and a second interface) can be presented in the imaging system, the content on the large screen can be enriched.
Optionally, the user may also select a new application to run in the imaging system, which may generate a new, separate application interface in the imaging system of the second electronic device.
Optionally, when the user clicks an entry in an application, a new independent second-level interface of the application may be opened, for example, a chat with a specific contact in WeChat, or a specific article in a news app.
Optionally, when the user establishes content connection and transmission between the first electronic device and the second electronic device, and the first electronic device already runs N applications, the interface content of the N applications may be correspondingly displayed in N display areas in the imaging system, respectively.
Optionally, the interface contents may be distributed without overlapping, or may be distributed in a stacked manner (a display interface of a frontmost application currently running is kept displayed at the frontmost end), which is not limited in this application.
The first electronic device may generate a cursor (as in the cursor generation method described above) when content transmission with the second electronic device is established, transmit it, and display it in the imaging system. The range of movement of the cursor may be adapted to the full range of the imaging system. Specifically, the first electronic device may first acquire parameter information of the imaging system, such as the size specification of the display screen of a large screen, the field-of-view parameter of the AR/VR device, and the like, and determine the moving range and the mapping rule of the cursor based on the parameter information. The cursor is not limited to a single display interface of the first electronic device; it can operate across multiple different display interfaces of the first electronic device and can reach the display boundary of the whole imaging system. In this way, when multiple display interfaces of the first electronic device are generated in the imaging system, the operation range of the cursor in the imaging system can cover the content in these display interfaces, and the cursor can move freely among them. The cursor is used to determine an operation object in the content of the entire display range of the imaging system. The operation object can be an object in a display interface of the first electronic device, or another object in the imaging system that does not belong to a display interface of the first electronic device.
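A minimal sketch of this range adaptation follows; the parameter names are assumptions, and the point illustrated is that only the imaging system's own boundary limits the cursor.

    // Sketch: the cursor's movement range is clamped to the full extent of
    // the imaging system, not to a single phone display interface.
    public final class ImagingSystemCursor {
        private final float maxX, maxY; // full range of the imaging system
        private float x, y;

        public ImagingSystemCursor(float imagingWidth, float imagingHeight) {
            this.maxX = imagingWidth;
            this.maxY = imagingHeight;
        }

        // The cursor crosses freely between phone display interfaces; only
        // the imaging system's display boundary clamps it.
        public void moveBy(float dx, float dy) {
            x = Math.max(0f, Math.min(maxX, x + dx));
            y = Math.max(0f, Math.min(maxY, y + dy));
        }

        public float[] position() { return new float[] {x, y}; }
    }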
Optionally, the user may control the position of the cursor in the imaging system by adjusting the pose of the first electronic device or sliding on the screen of the first electronic device, so that the operation object can be determined in the content displayed in the imaging system according to the operation of the user. The starting position of the sliding operation corresponds to a first object in the interface content of the first electronic device; the first object does not respond to the sliding operation. It should be understood that the sliding operation is an arbitrary operation that may be performed multiple times. The longer the trajectory of the sliding operation, the larger the movement range of the corresponding cursor in the imaging system.
Optionally, when the cursor is located on an object in the imaging system (including but not limited to an application icon, a function key, an option of any type, or a preset position (e.g., a blank area)), indicating that the object has been confirmed as the operation object, the user inputs an operation representing "confirmation" on the terminal, such as a key press, a touch, a click, or a special slide, and the operation object responds to the "confirmation" command, for example by entering an application, enabling a certain function key, selecting a certain option, or performing a certain shortcut operation; the present invention is not exhaustive here. It should be understood that, at this time, if the "confirm" operation is an operation on the touch screen, the touch position of the "confirm" operation corresponds to a second object in the interface content of the first electronic device; the second object does not respond to the "confirm" operation.
In a specific implementation, the content of the first electronic device interface may be adapted to the content in the imaging system.
Optionally, the current interface content of the first electronic device may be "single screen". For example, if a cursor in a display interface in the imaging system is located in a first application interface, interface content of the first electronic device is synchronized to the first application interface at this time; if the cursor in the imaging system transitions from the first application interface to the second application interface, the interface content of the first electronic device also switches from the first application interface to the second application interface at this time.
Optionally, in a specific implementation process, the content of the interface of the first electronic device may be "multi-screen", for example, the display interface of the first electronic device remains corresponding to all display interfaces in the second electronic device. Specifically, if the display interface in the imaging system includes the first application interface and the second application interface, the interface content of the first electronic device is synchronized to the first application interface and the second application interface; if the display interface in the imaging system comprises the first application interface, the second application interface and the third application interface, the interface content of the first electronic device is synchronized to the first application interface, the second application interface and the third application interface.
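As a hypothetical sketch, the "single screen" adaptation described above can be expressed as follows: the phone's own interface follows whichever application interface in the imaging system currently contains the cursor.

    // Hypothetical sketch of the "single screen" adaptation described above.
    public class SingleScreenSync {
        private String currentInterfaceId; // e.g. "first", "second" (assumed)

        // Called whenever the cursor crosses into another application
        // interface inside the imaging system; returns true when the phone
        // should switch its own display interface accordingly.
        public boolean onCursorEntered(String interfaceId) {
            if (!interfaceId.equals(currentInterfaceId)) {
                currentInterfaceId = interfaceId;
                return true;
            }
            return false;
        }
    }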
In order to save power, when the first electronic device transmits the content to be displayed to the imaging system, the display brightness of the first electronic device may be reduced; or, a screen-off operation may be performed on the first electronic device.
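The application does not specify the dimming mechanism; as one possible application-level illustration, the per-window brightness attribute of Android could be used, as sketched below.

    import android.app.Activity;
    import android.view.WindowManager;

    // One possible (assumed) way to dim the phone's display while content is
    // shown on the imaging system; the patent does not prescribe this API.
    public final class PowerSaver {
        public static void dim(Activity activity) {
            WindowManager.LayoutParams lp = activity.getWindow().getAttributes();
            lp.screenBrightness = 0.01f; // minimum window brightness
            activity.getWindow().setAttributes(lp);
        }

        public static void restore(Activity activity) {
            WindowManager.LayoutParams lp = activity.getWindow().getAttributes();
            lp.screenBrightness =
                    WindowManager.LayoutParams.BRIGHTNESS_OVERRIDE_NONE;
            activity.getWindow().setAttributes(lp);
        }
    }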
It should be understood that the embodiments of the present invention are numerous, and any of the above steps can be freely combined without violating natural laws; not all possible scenarios and implementations can be described in detail in the present application. The description of the relevant features may also be applied to further embodiments.
It should be noted that the above-described embodiments of the apparatus are merely schematic, where the units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. In addition, in the drawings of the embodiments of the apparatus provided by the present application, the connection relationship between the modules indicates that there is a communication connection between them, and may be specifically implemented as one or more communication buses or signal lines.
Through the above description of the embodiments, those skilled in the art will clearly understand that the present application can be implemented by software plus necessary general-purpose hardware, and certainly can also be implemented by special-purpose hardware including special-purpose integrated circuits, special-purpose CPUs, special-purpose memories, special-purpose components, and the like. Generally, functions performed by computer programs can be easily implemented by corresponding hardware, and the specific hardware structures for implementing the same function can be various, such as analog circuits, digital circuits, or dedicated circuits. However, for the present application, an implementation by a software program is preferable in most cases. Based on such understanding, the technical solutions of the present application may be substantially embodied in the form of a software product, which is stored in a readable storage medium, such as a floppy disk, a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk of a computer, and includes several instructions for enabling a computer device (which may be a personal computer, a training device, or a network device) to execute the methods according to the embodiments of the present application.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product.
The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the application are produced, in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, training device, or data center to another website, computer, training device, or data center by wire (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wirelessly (e.g., infrared, radio, microwave, etc.). The computer-readable storage medium can be any available medium that a computer can access, or a data storage device such as a training device or a data center that integrates one or more available media. The usable medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a Solid State Disk (SSD)), among others.

Claims (43)

1. An information processing method characterized by comprising:
generating screen projection content;
sending the screen projection content to a second electronic device to enable a display screen of the second electronic device to display the screen projection content, wherein the screen projection content comprises a cursor and interface content of the first electronic device, and the cursor is used for being positioned at an operation position in the interface content.
2. The method of claim 1, wherein the generating the screen shot content comprises:
and acquiring interface content of the front-end application of the first electronic equipment, and generating a cursor on the interface content to obtain screen projection content.
3. The method of claim 1 or 2, wherein the generating the screen-shot content comprises:
acquiring interface content of a front-end application of the first electronic equipment, and generating a cursor and a menu bar on the interface content to obtain screen projection content; wherein the menu bar does not belong to the interface content of the first electronic device.
4. The method of any of claims 1 to 3, wherein prior to the generating the screen-shot content, the method further comprises:
detecting that the first electronic device and the second electronic device establish screen projection connection.
5. The method of any of claims 1 to 4, further comprising:
acquiring pose change information of the first electronic equipment;
causing the cursor to move in the screen projection content displayed by the second electronic device based on the pose change information.
6. The method according to claim 5, wherein the pose change information includes a first rotation angle of the first electronic device on a horizontal plane and a second rotation angle of the first electronic device on a vertical plane, and the causing the cursor to move in the screen-projected content displayed by the second electronic device based on the pose change information includes:
determining a horizontal displacement of the cursor according to the first rotation angle;
determining a vertical displacement of the cursor according to the second rotation angle;
and moving the cursor in the screen-casting content displayed by the second electronic device according to the horizontal displacement and the vertical displacement.
7. The method of claim 5, wherein the pose change information comprises a first displacement of the first electronic device in a first direction parallel to a left-right movement direction of the cursor displayed on a display screen of the second electronic device and a second displacement of the first electronic device in a second direction parallel to an up-down movement direction of the cursor displayed on the display screen of the second electronic device, and the causing the cursor to move in the screen projection content displayed by the second electronic device based on the pose change information comprises:
determining a horizontal displacement of the cursor according to the first displacement;
determining a vertical displacement of the cursor according to the second displacement;
and moving the cursor in the screen projection content displayed by the second electronic device according to the horizontal displacement and the vertical displacement.
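
Read together, claims 6 and 7 differ only in the pose signal: claim 6 converts rotation angles into pixel displacement, claim 7 converts physical translation into pixel displacement, and both clamp the cursor to the projected screen. A Kotlin sketch reusing the Cursor type from the example after claim 1; the gain constants are invented for illustration and are not taken from the patent:

    // Hypothetical gains mapping pose deltas to pixels.
    const val PX_PER_DEGREE = 12.0 // rotation gain (claim 6)
    const val PX_PER_MM = 8.0      // translation gain (claim 7)

    // Claim 6: rotation in the horizontal plane drives horizontal displacement,
    // rotation in the vertical plane drives vertical displacement.
    fun moveCursorByRotation(c: Cursor, yawDeg: Double, pitchDeg: Double, w: Int, h: Int) {
        c.x = (c.x + yawDeg * PX_PER_DEGREE).toInt().coerceIn(0, w - 1)
        c.y = (c.y + pitchDeg * PX_PER_DEGREE).toInt().coerceIn(0, h - 1)
    }

    // Claim 7: device displacement parallel to the cursor's left-right / up-down
    // movement directions drives the corresponding pixel displacement.
    fun moveCursorByTranslation(c: Cursor, dxMm: Double, dyMm: Double, w: Int, h: Int) {
        c.x = (c.x + dxMm * PX_PER_MM).toInt().coerceIn(0, w - 1)
        c.y = (c.y + dyMm * PX_PER_MM).toInt().coerceIn(0, h - 1)
    }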
8. The method of any of claims 5 to 7, further comprising:
when the cursor moves to a target object in the screen projection content displayed by the second electronic device, if a first touch operation on a display screen of the first electronic device is received,
enabling the target object in the screen projection content displayed by the second electronic device to respond to the first touch operation.
9. The method of claim 8, wherein the operational position of the first touch operation corresponds to a second object in the interface content of the first electronic device, the method further comprising:
shielding the response of the second object to the first touch operation.
10. The method according to claim 8 or 9, wherein the first touch operation comprises at least a click operation and a first sliding operation, and the target object comprises at least an application and a function control.
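
Claims 8 to 10 amount to a dispatch rule: a touch on the first device's own screen is resolved against the cursor position in the projected content, while the object that happens to sit under the finger locally is shielded from responding (claim 9). A sketch of that rule with hypothetical hit-testing types:

    // Illustrative sketch of claims 8-10: route a local touch to the object
    // under the remote cursor and mask the object under the finger.
    data class Obj(val name: String, val xs: IntRange, val ys: IntRange) {
        fun hit(px: Int, py: Int) = px in xs && py in ys
        fun respondTo(op: String) = println("$name handles $op")
    }

    fun onLocalTouch(op: String, fingerX: Int, fingerY: Int, cursor: Cursor,
                     projectedScene: List<Obj>, localScene: List<Obj>) {
        // Claim 8: the target under the cursor in the projected content responds.
        projectedScene.firstOrNull { it.hit(cursor.x, cursor.y) }?.respondTo(op)
        // Claim 9: the second object under the finger is found but deliberately
        // NOT invoked, i.e. its response is shielded.
        val shielded = localScene.firstOrNull { it.hit(fingerX, fingerY) }
        if (shielded != null) println("shielding local response of ${shielded.name}")
    }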
11. The method of any of claims 1 to 4, further comprising:
receiving a second sliding operation on a display screen of the first electronic device;
determining a displacement of the cursor according to the second sliding operation;
causing the cursor to move in the screen projection content displayed by the second electronic device based on the displacement.
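
Claim 11 describes a trackpad-style mode: finger movement on the first device's display becomes cursor displacement in the projected content, with no pose sensing involved. A sketch, again reusing Cursor; the sensitivity factor is an invented placeholder:

    // Illustrative sketch of claim 11: a slide on the handset's screen moves
    // the cursor on the projected screen. SENSITIVITY is hypothetical.
    const val SENSITIVITY = 1.5

    fun onSlide(c: Cursor, prevX: Int, prevY: Int, curX: Int, curY: Int, w: Int, h: Int) {
        c.x = (c.x + (curX - prevX) * SENSITIVITY).toInt().coerceIn(0, w - 1)
        c.y = (c.y + (curY - prevY) * SENSITIVITY).toInt().coerceIn(0, h - 1)
    }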
12. The method of claim 11, wherein a start position of the second sliding operation corresponds to a third object in the interface content of the first electronic device, and wherein the method further comprises:
shielding the response of the third object to the second sliding operation.
13. The method according to claim 11 or 12, characterized in that the method further comprises:
when the cursor moves to a target object in the screen projection content displayed by the second electronic device, if a click operation on a display screen of the first electronic device is received,
enabling the target object in the screen projection content displayed by the second electronic device to respond to the click operation.
14. The method of claim 13, wherein an operation position of the click operation corresponds to a fourth object in interface content of the first electronic device, the method further comprising:
shielding the response of the fourth object to the click operation.
15. The method of any one of claims 1 to 14, further comprising:
receiving a pressing operation on a physical key of the first electronic device;
generating a corresponding second event according to the pressing operation;
and executing the second event on a front-end application of the first electronic device.
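
Claim 15 keeps physical keys bound to the local front-end application even while casting, rather than redirecting them into the projected content. A sketch with hypothetical key codes and event names:

    // Illustrative sketch of claim 15: a physical key press becomes a "second
    // event" that is executed on the first device's front-end application.
    enum class Key { VOLUME_UP, VOLUME_DOWN, POWER }

    fun onKeyPress(key: Key, frontEndApp: (String) -> Unit) {
        val event = when (key) {
            Key.VOLUME_UP -> "volume_up"
            Key.VOLUME_DOWN -> "volume_down"
            Key.POWER -> "power"
        }
        frontEndApp(event) // delivered locally, not to the projected content
    }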
16. The method of any one of claims 1 to 15, further comprising:
receiving a first operation on a display screen of the first electronic device, wherein the first operation is a preset shortcut operation;
and causing the first electronic device to respond to the first operation.
17. The method of claim 16, wherein the operation position of the first operation corresponds to a fifth object in the interface content of the second electronic device, and the fifth object does not respond to the first operation.
18. The method according to claim 16 or 17, characterized in that the preset shortcut operation comprises at least one of:
contacting a first preset area of a display screen of the first electronic device and sliding from the first preset area in a first preset direction;
contacting a second preset area of the display screen of the first electronic device and sliding from the second preset area in a second preset direction;
and contacting the display screen of the first electronic device for longer than a preset duration.
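
The shortcuts of claim 18 can be told apart from three observables: where the contact starts, which way it slides, and how long it lasts. A sketch in which the preset areas, directions, and duration threshold are all invented placeholders:

    // Illustrative sketch of claim 18: classify a touch as a preset shortcut.
    const val LONG_PRESS_MS = 800L

    fun classifyShortcut(startX: Int, endX: Int, durationMs: Long, screenW: Int): String? = when {
        durationMs > LONG_PRESS_MS -> "long-press"                      // longer than the preset time
        startX < screenW / 10 && endX > startX -> "edge-swipe-right"    // from the first preset area
        startX > screenW * 9 / 10 && endX < startX -> "edge-swipe-left" // from the second preset area
        else -> null                                                    // an ordinary touch, not a shortcut
    }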
19. The method of any one of claims 1 to 18, further comprising:
reducing a display brightness of the display screen of the first electronic device; or
performing a screen-off operation on the display screen of the first electronic device.
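
Claim 19 saves power during casting by dimming or blanking the handset's own panel; on the reading sketched after claim 1, the projection stream is rendered from the interface content rather than scraped from the lit panel, so the cast continues. A sketch against a hypothetical display-control interface:

    // Illustrative sketch of claim 19. DisplayController is hypothetical.
    interface DisplayController {
        fun setBrightness(level: Double) // 0.0 .. 1.0
        fun screenOff()
    }

    fun enterPowerSaving(display: DisplayController, blank: Boolean) {
        if (blank) display.screenOff()  // second branch: screen-off operation
        else display.setBrightness(0.1) // first branch: reduce display brightness
    }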
20. The method of any of claims 1-19, wherein a display area of a display screen of the first electronic device is smaller than a display area of a display screen of the second electronic device.
21. An electronic device, used as a first electronic device, comprising:
a processing module, used for generating screen projection content;
a sending module, used for sending the screen projection content to a second electronic device so that a display screen of the second electronic device displays the screen projection content, wherein the screen projection content comprises a cursor and interface content of the first electronic device, and the cursor is used to indicate an operation position in the interface content.
22. The electronic device of claim 21, wherein the processing module is specifically configured to:
acquiring interface content of a front-end application of the first electronic device, and generating a cursor on the interface content to obtain the screen projection content.
23. The electronic device according to claim 21 or 22, wherein the processing module is specifically configured to:
acquiring interface content of a front-end application of the first electronic device, and generating a cursor and a menu bar on the interface content to obtain the screen projection content; wherein the menu bar does not belong to the interface content of the first electronic device.
24. The electronic device of any of claims 21 to 23, wherein the processing module is further configured to detect that the first electronic device establishes a screen projection connection with the second electronic device.
25. The electronic device according to any one of claims 21 to 24, wherein the processing module is further configured to acquire pose change information of the first electronic device;
causing the cursor to move in the screen projection content displayed by the second electronic device based on the pose change information.
26. The electronic device according to claim 25, wherein the pose change information includes a first rotation angle of the first electronic device in a horizontal plane and a second rotation angle of the first electronic device in a vertical plane, and wherein the processing module is specifically configured to:
determining a horizontal displacement of the cursor according to the first rotation angle;
determining a vertical displacement of the cursor according to the second rotation angle;
and moving the cursor in the screen projection content displayed by the second electronic device according to the horizontal displacement and the vertical displacement.
27. The electronic device of claim 25, wherein the pose change information comprises a first displacement of the first electronic device in a first direction parallel to a left-right movement direction of the cursor displayed on a display screen of the second electronic device and a second displacement of the first electronic device in a second direction parallel to an up-down movement direction of the cursor displayed on the display screen of the second electronic device, and the processing module is specifically configured to:
determining a horizontal displacement of the cursor according to the first displacement;
determining a vertical displacement of the cursor according to the second displacement;
and moving the cursor in the screen projection content displayed by the second electronic device according to the horizontal displacement and the vertical displacement.
28. The electronic device of any of claims 25-27, wherein the processing module is further configured to:
when the cursor moves to a target object in the screen projection content displayed by the second electronic device, if a first touch operation on a display screen of the first electronic device is received,
enabling the target object in the screen projection content displayed by the second electronic device to respond to the first touch operation.
29. The electronic device of claim 28, wherein an operation position of the first touch operation corresponds to a second object in the interface content of the first electronic device, and the processing module is further configured to:
shielding the response of the second object to the first touch operation.
30. The electronic device according to claim 28 or 29, wherein the first touch operation comprises at least a click operation and a first sliding operation, and the target object comprises at least an application and a function control.
31. The electronic device of any of claims 21 to 24, wherein the processing module is further configured to:
receiving a second sliding operation on a display screen of the first electronic device;
determining a displacement of the cursor according to the second sliding operation;
causing the cursor to move in the screen projection content displayed by the second electronic device based on the displacement.
32. The electronic device of claim 31, wherein a start position of the second sliding operation corresponds to a third object in the interface content of the first electronic device, and wherein the processing module is further configured to:
shielding the response of the third object to the second sliding operation.
33. The electronic device of claim 31 or 32, wherein the processing module is further configured to:
when the cursor moves to a target object in the screen projection content displayed by the second electronic device, if a click operation on a display screen of the first electronic device is received,
enabling the target object in the screen projection content displayed by the second electronic device to respond to the click operation.
34. The electronic device of claim 33, wherein an operation position of the click operation corresponds to a fourth object in the interface content of the first electronic device, and wherein the processing module is further configured to:
shielding the response of the fourth object to the click operation.
35. The electronic device of any of claims 21-34, wherein the processing module is further configured to:
receiving a pressing operation on a physical key of the first electronic device;
generating a corresponding second event according to the pressing operation;
and executing the second event on a front-end application of the first electronic device.
36. The electronic device of any of claims 21-35, wherein the processing module is further configured to:
receiving a first operation on a display screen of the first electronic device, wherein the first operation is a preset shortcut operation;
and causing the first electronic device to respond to the first operation.
37. The electronic device of claim 36, wherein the operation position of the first operation corresponds to a fifth object in the interface content of the second electronic device, and the fifth object is not responsive to the first operation.
38. The electronic device according to claim 36 or 37, wherein the preset shortcut operation comprises at least one of:
contacting a first preset area of a display screen of the first electronic device and sliding from the first preset area in a first preset direction;
contacting a second preset area of the display screen of the first electronic device and sliding from the second preset area in a second preset direction;
and contacting the display screen of the first electronic device for longer than a preset duration.
39. The electronic device of any of claims 21-38, wherein the processing module is further configured to:
reducing a display brightness of the display screen of the first electronic device; or
performing a screen-off operation on the display screen of the first electronic device.
40. The electronic device of any of claims 21-39, wherein a display area of the display screen of the first electronic device is smaller than a display area of the display screen of the second electronic device.
41. An electronic device, for use as a first electronic device, comprising a processor, an input device, an output device, and a memory, wherein the memory is configured to store a computer program comprising program instructions that, when executed by the processor, cause the first electronic device to perform the method of any of claims 1 to 20.
42. A screen projection system, comprising: the first electronic device of claim 41, and a second electronic device, the first electronic device being connected to the second electronic device for screen projection.
43. A computer-readable storage medium, characterized in that it stores a computer program comprising program instructions which, when executed by a computer, cause the computer to carry out the method according to any one of claims 1 to 20.
CN201910937409.9A 2019-09-19 2019-09-27 Information processing method and electronic equipment Pending CN112527222A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910888929 2019-09-19
CN2019108889295 2019-09-19

Publications (1)

Publication Number Publication Date
CN112527222A (en) 2021-03-19

Family

ID=74974549

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910937409.9A Pending CN112527222A (en) 2019-09-19 2019-09-27 Information processing method and electronic equipment

Country Status (1)

Country Link
CN (1) CN112527222A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103576901A (en) * 2012-08-02 2014-02-12 腾讯科技(深圳)有限公司 Implement method for performing cursor control on screen through handheld electronic device and handheld electronic device
CN105227986A (en) * 2015-09-24 2016-01-06 小米科技有限责任公司 Synchronization processing method and device
CN105512086A (en) * 2016-02-16 2016-04-20 联想(北京)有限公司 Information processing device and information processing method
US20180121009A1 (en) * 2016-10-28 2018-05-03 Nanning Fugui Precision Industrial Co., Ltd. Interface control method and electronic device using the same
CN108646997A (en) * 2018-05-14 2018-10-12 刘智勇 A method of virtual and augmented reality equipment is interacted with other wireless devices
CN109960449A (en) * 2019-03-22 2019-07-02 深圳前海达闼云端智能科技有限公司 A kind of throwing screen display methods and relevant apparatus
CN110221798A (en) * 2019-05-29 2019-09-10 华为技术有限公司 A kind of throwing screen method, system and relevant apparatus

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113242284A (en) * 2021-04-30 2021-08-10 北京有竹居网络技术有限公司 Communication method, device and communication system
WO2022242408A1 (en) * 2021-05-19 2022-11-24 华为技术有限公司 Display method and terminal device
CN114063951A (en) * 2021-09-26 2022-02-18 荣耀终端有限公司 Screen projection abnormity processing method and electronic equipment
CN114089935A (en) * 2021-10-25 2022-02-25 青岛海尔科技有限公司 Screen projection processing method, device, equipment and storage medium
CN114089935B (en) * 2021-10-25 2024-01-23 青岛海尔科技有限公司 Screen projection processing method, device, equipment and storage medium
CN114115691A (en) * 2021-10-27 2022-03-01 荣耀终端有限公司 Electronic device and interaction method and medium thereof
WO2024012402A1 (en) * 2022-07-15 2024-01-18 华为技术有限公司 Display method and electronic device
WO2024046024A1 (en) * 2022-08-29 2024-03-07 华为技术有限公司 Screencasting method and device

Similar Documents

Publication Publication Date Title
WO2021057830A1 (en) Information processing method and electronic device
CN110109636B (en) Screen projection method, electronic device and system
CN111324327B (en) Screen projection method and terminal equipment
WO2020238874A1 (en) Vr multi-screen display method and electronic device
CN112527174B (en) Information processing method and electronic equipment
WO2022100315A1 (en) Method for generating application interface, and related apparatus
CN113553014B (en) Application interface display method under multi-window screen projection scene and electronic equipment
WO2022100237A1 (en) Screen projection display method and related product
CN112527222A (en) Information processing method and electronic equipment
CN112394895B (en) Picture cross-device display method and device and electronic device
WO2021212922A1 (en) Object dragging method and device
JP2023503679A (en) MULTI-WINDOW DISPLAY METHOD, ELECTRONIC DEVICE AND SYSTEM
CN112398855B (en) Method and device for transferring application contents across devices and electronic device
WO2022105445A1 (en) Browser-based application screen projection method and related apparatus
WO2022017393A1 (en) Display interaction system, display method, and device
WO2020248714A1 (en) Data transmission method and device
WO2023030099A1 (en) Cross-device interaction method and apparatus, and screen projection system and terminal
CN114286152A (en) Display device, communication terminal and screen projection picture dynamic display method
CN112383664A (en) Equipment control method, first terminal equipment and second terminal equipment
WO2021254113A1 (en) Control method for three-dimensional interface and terminal
WO2021052488A1 (en) Information processing method and electronic device
US20230236714A1 (en) Cross-Device Desktop Management Method, First Electronic Device, and Second Electronic Device
CN115016697A (en) Screen projection method, computer device, readable storage medium, and program product
WO2023045597A1 (en) Cross-device transfer control method and apparatus for large-screen service
WO2022111701A1 (en) Screen projection method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination