CN112558825A - Information processing method and electronic equipment

Information processing method and electronic equipment

Info

Publication number
CN112558825A
CN112558825A
Authority
CN
China
Prior art keywords
electronic device
cursor
screen
interface
touch screen
Prior art date
Legal status
Pending
Application number
CN202010882797.8A
Other languages
Chinese (zh)
Inventor
王提政
邵凯
徐亚
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to PCT/CN2020/117308 (published as WO2021057830A1)
Publication of CN112558825A

Classifications

    • G PHYSICS
        • G06 COMPUTING; CALCULATING OR COUNTING
            • G06F ELECTRIC DIGITAL DATA PROCESSING
                • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
                    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
                        • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
                            • G06F 3/0481 Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
                                • G06F 3/04812 Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
                                • G06F 3/0482 Interaction with lists of selectable items, e.g. menus
                            • G06F 3/0484 Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
                                • G06F 3/04842 Selection of displayed objects or displayed text elements
                                • G06F 3/0485 Scrolling or panning
                            • G06F 3/0487 Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
                                • G06F 3/0488 Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
                • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
            • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T 19/00 Manipulating 3D models or images for computer graphics
                    • G06T 19/006 Mixed reality

Abstract

The application provides a touch-screen-based control method applied to a first electronic device, where the first electronic device includes a touch screen and the touch screen can be used to display an interface. The method includes: when the first electronic device establishes a screen projection connection with a second electronic device, generating a cursor and displaying the cursor in the interface of the touch screen; generating screen projection content based on the interface content in the touch screen, where the screen projection content includes the cursor; sending the screen projection content to the second electronic device; and when a target touch operation of a user on the touch screen is received, responding to the target touch operation based on the display position of the cursor in the current interface. The display position of the cursor in the current interface indicates the response position for the target touch operation in the current interface, and the response position is independent of the operation position of the target touch operation on the touch screen. In this way, the first electronic device, the source of the screen projection content, itself becomes the control device without any additional hardware: by watching the screen projection content on the second electronic device, the user can perceive and control the touch screen of the first electronic device and thereby operate the interface of the content source, obtaining a very good eyes-free ("blind") operation experience.

Description

Information processing method and electronic equipment
Technical Field
The present application relates to the field of electronic technologies, and in particular, to an information processing method and an electronic device.
Background
With the popularization of augmented reality (AR), virtual reality (VR), and mixed reality (MR) technologies, AR, VR, and MR devices have come into wide use in work, entertainment, and many other scenarios. Unlike conventional terminal devices, current AR, VR, and MR devices use an independent handle (a handheld controller) as their main interaction solution.
However, such handles are large and heavy, their manipulation schemes are complicated, and long sessions easily cause fatigue.
In some presentation scenarios, a user additionally needs a control device such as a laser pointer, so the control system also depends heavily on third-party hardware.
Disclosure of Invention
In a first aspect, an embodiment of the present application provides an information processing method, including:
connecting a first electronic device with a second electronic device, where the second electronic device displays a cursor and the interface content of the first electronic device, the first electronic device includes a touch screen, and the second electronic device is an augmented reality (AR) device, a virtual reality (VR) device, or a mixed reality (MR) device;
acquiring a first operation on the first electronic device;
and controlling the display position of the cursor on the second electronic device based on the first operation of the first electronic device.
Optionally, in a design of the first aspect, the interface content is interface content of a front-end application of the first electronic device.
Optionally, in a design of the first aspect, the second electronic device further displays: a menu bar, the menu bar not belonging to the interface content of the first electronic device.
Optionally, in a design of the first aspect, the controlling the display position of the cursor on the second electronic device based on the first operation of the first electronic device includes:
acquiring pose change information of the first electronic device;
causing the cursor to move on display content of the second electronic device based on the pose change information.
Optionally, in one design of the first aspect, the pose change information includes a first rotation angle of the first electronic device in a horizontal plane and a second rotation angle of the first electronic device in a vertical plane, and the causing the cursor to move on the display content of the second electronic device based on the pose change information includes:
determining a horizontal displacement of the cursor according to the first rotation angle;
determining a vertical displacement of the cursor according to the second rotation angle;
and moving the cursor on the display content of the second electronic equipment according to the horizontal displacement and the vertical displacement.
Optionally, in a design of the first aspect, the pose change information includes a first displacement of the first electronic device in a first direction and a second displacement of the first electronic device in a second direction, where the first direction is parallel to the left-right moving direction of the cursor displayed by the second electronic device and the second direction is parallel to the up-down moving direction of the cursor displayed by the second electronic device, and the moving the cursor on the display content of the second electronic device based on the pose change information includes:
determining a horizontal displacement of the cursor according to the first displacement;
determining a vertical displacement of the cursor according to the second displacement;
and moving the cursor on the display content of the second electronic equipment according to the horizontal displacement and the vertical displacement.
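Neither pose-based design above fixes a concrete mapping from rotation angles or device displacements to cursor pixels. Purely as an illustration, a minimal Kotlin sketch of both mappings under an assumed linear gain (all names and constants are assumptions, not from the application):

```kotlin
data class Cursor(var x: Float, var y: Float)

// Illustrative gains; the application leaves the angle- and
// displacement-to-pixel mapping unspecified.
const val PX_PER_DEGREE = 40f
const val PX_PER_MM = 8f

/** Rotation-based design: the first rotation angle (horizontal plane) drives
 *  the horizontal displacement and the second rotation angle (vertical plane)
 *  drives the vertical displacement; the result is clamped to the display. */
fun moveByRotation(c: Cursor, yawDeg: Float, pitchDeg: Float, w: Float, h: Float) {
    c.x = (c.x + yawDeg * PX_PER_DEGREE).coerceIn(0f, w)
    c.y = (c.y + pitchDeg * PX_PER_DEGREE).coerceIn(0f, h)
}

/** Translation-based design: the first displacement (parallel to the cursor's
 *  left-right axis) and the second displacement (parallel to its up-down axis)
 *  drive the horizontal and vertical displacement respectively. */
fun moveByTranslation(c: Cursor, dxMm: Float, dyMm: Float, w: Float, h: Float) {
    c.x = (c.x + dxMm * PX_PER_MM).coerceIn(0f, w)
    c.y = (c.y + dyMm * PX_PER_MM).coerceIn(0f, h)
}
```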
Optionally, in a design of the first aspect, the method further includes:
when the cursor moves to a target object in the display content of the second electronic device, if a first touch operation on a display screen of the first electronic device is received,
causing the target object in the display content of the second electronic device to respond to the first touch operation.
Optionally, in a design of the first aspect, the operation position of the first touch operation corresponds to a second object in the interface content of the first electronic device, and the method further includes:
masking the response of the second object to the first touch operation.
Optionally, in a design of the first aspect, the first touch operation at least includes a click operation and a first sliding operation, and the target object at least includes an application and a functionality control.
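The redirection-and-masking behaviour described above can be pictured as an interception layer that consumes the raw touch and re-issues it at the cursor's position, so that only the target object under the cursor responds. A hedged Kotlin sketch (the event and object types are invented for illustration):

```kotlin
data class TouchEvent(val x: Float, val y: Float)

fun interface UiObject { fun onTouch(e: TouchEvent) }

/** Redirects a raw touch to the target object under the cursor and masks the
 *  second object that happens to lie under the finger. */
class CursorRedirector(
    private val cursorPos: () -> Pair<Float, Float>,    // current cursor position
    private val objectAt: (Float, Float) -> UiObject?,  // hit test in the interface
) {
    fun dispatch(e: TouchEvent) {
        // The second object at (e.x, e.y) is deliberately never notified:
        // its response to the first touch operation is masked.
        val (cx, cy) = cursorPos()
        objectAt(cx, cy)?.onTouch(e)  // only the object under the cursor responds
    }
}
```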
Optionally, in a design of the first aspect, the controlling the display position of the cursor on the second electronic device based on the operation of the first electronic device includes:
receiving a second sliding operation on a display screen of the first electronic equipment;
determining the displacement of the cursor according to the second sliding operation;
causing the cursor to move in display content of the second electronic device based on the displacement.
Optionally, in a design of the first aspect, the starting operation position of the second sliding operation corresponds to a third object in the interface content of the first electronic device, and the method further includes:
masking the response of the third object to the second sliding operation.
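In the sliding design above, the touch screen effectively behaves like a laptop trackpad: the cursor moves by the slide's relative displacement while the third object under the slide's starting point receives nothing. A minimal Kotlin sketch, with an assumed gain value:

```kotlin
// Illustrative trackpad-style gain; the application does not fix a value.
const val SLIDE_GAIN = 1.5f

/** Trackpad-style cursor driver: the cursor moves by the finger's relative
 *  displacement, while the third object under the slide's starting point
 *  receives no scroll or drag of its own. */
class SlideToCursor(private val displayW: Float, private val displayH: Float) {
    var cursorX = displayW / 2
        private set
    var cursorY = displayH / 2
        private set
    private var lastX = 0f
    private var lastY = 0f

    fun onSlideStart(x: Float, y: Float) { lastX = x; lastY = y }

    /** Each move sample advances the cursor by the finger's delta times the gain. */
    fun onSlideMove(x: Float, y: Float) {
        cursorX = (cursorX + (x - lastX) * SLIDE_GAIN).coerceIn(0f, displayW)
        cursorY = (cursorY + (y - lastY) * SLIDE_GAIN).coerceIn(0f, displayH)
        lastX = x
        lastY = y
    }
}
```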
Optionally, in a design of the first aspect, the method further includes:
when the cursor moves to a target object in the display content of the second electronic device, if a click operation on a display screen of the first electronic device is received,
causing the target object in the display content of the second electronic device to respond to the click operation.
Optionally, in a design of the first aspect, the operation position of the click operation corresponds to a fourth object in the interface content of the first electronic device, and the method further includes:
masking the response of the fourth object to the click operation.
Optionally, in a design of the first aspect, the second electronic device further displays: a ray whose end point is the cursor.
Optionally, in a design of the first aspect, the method further includes:
receiving a pressing operation on a physical key of the first electronic device;
generating a corresponding second event according to the pressing operation;
and executing the second event on the front-end application of the first electronic device.
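The physical-key design amounts to translating a key press into an event for the front-end application rather than handling it locally. A schematic Kotlin sketch with invented key and event names:

```kotlin
// Invented key and event names; the application only says a press of a
// physical key is turned into a corresponding "second event".
enum class Key { BACK, VOLUME_UP, VOLUME_DOWN }
enum class AppEvent { NAVIGATE_BACK, SCROLL_UP, SCROLL_DOWN }

/** Maps a pressed physical key to the second event that is then executed
 *  against the front-end application of the first electronic device. */
fun toSecondEvent(key: Key): AppEvent = when (key) {
    Key.BACK -> AppEvent.NAVIGATE_BACK
    Key.VOLUME_UP -> AppEvent.SCROLL_UP
    Key.VOLUME_DOWN -> AppEvent.SCROLL_DOWN
}
```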
Optionally, in a design of the first aspect, the method further includes:
receiving a second operation on a display screen of the first electronic device, where the second operation is a preset shortcut operation;
and causing the first electronic device to respond to the second operation.
Optionally, in a design of the first aspect, an operation position of the second operation corresponds to a fifth object in the interface content of the second electronic device, and the fifth object does not respond to the first operation.
Optionally, in a design of the first aspect, the preset shortcut operation at least includes:
contacting a first preset area of the display screen of the first electronic device and sliding from the first preset area in a first preset direction; or,
contacting a second preset area of the display screen of the first electronic device and sliding from the second preset area in a second preset direction, where the contact with the display screen of the first electronic device lasts longer than a preset time.
Optionally, in a design of the first aspect, the method further includes:
reducing the display brightness of the display screen of the first electronic device; or,
executing a screen-off operation on the display screen of the first electronic device.
Optionally, in a design of the first aspect, the controlling, based on the operation of the first electronic device, a display position of the cursor on the second electronic device includes:
acquiring, when a first application is displayed on the second electronic device, pose change information of the first electronic device, and causing the cursor to move on the display content of the second electronic device based on the pose change information; or,
receiving, when a second application is displayed on the second electronic device, a second sliding operation on a display screen of the first electronic device, determining the displacement of the cursor according to the second sliding operation, and causing the cursor to move in the display content of the second electronic device based on the displacement, where the first application and the second application are different applications.
In a second aspect, the present application provides an information processing method applied to a first electronic device connected to a second electronic device, where the first electronic device includes a touch screen and the second electronic device is an augmented reality (AR) device, a virtual reality (VR) device, or a mixed reality (MR) device, the method including:
when a handle is connected with the second electronic device, using the handle or the first electronic device as the control device of the second electronic device;
and when only the first electronic device is connected with the second electronic device, using the first electronic device as the control device of the second electronic device.
Optionally, in a design of the second aspect, the using the handle or the first electronic device as a control device of the second electronic device includes:
using the handle as a control device for the second electronic device;
receiving a first interaction mode switching instruction;
and in response to the first interaction mode switching instruction, switching the control device of the second electronic device from the handle to the first electronic device.
Optionally, in a design of the second aspect, the receiving a first interaction mode switching instruction includes:
receiving a first interaction mode switching instruction sent by the handle; generating a corresponding first interaction mode switching instruction based on receiving a second operation on the first electronic device; or receiving a first interaction mode switching instruction sent by the second electronic device.
Optionally, in a design of the second aspect, the using the handle or the first electronic device as a control device of the second electronic device includes:
using the first electronic device as the control device of the second electronic device;
receiving a second interaction mode switching instruction;
and in response to the second interaction mode switching instruction, switching the control device of the second electronic device from the first electronic device to the handle.
Optionally, in a design of the second aspect, the receiving a second interaction mode switching instruction includes:
receiving a second interaction mode switching instruction sent by the handle; generating a corresponding second interaction mode switching instruction based on receiving a second operation on the first electronic device; or receiving a second interaction mode switching instruction sent by the second electronic device.
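The switching logic of the second aspect behaves like a two-state machine whose transitions are the two interaction mode switching instructions, each of which may originate from the handle, from an operation on the first electronic device, or from the second electronic device. A compact Kotlin sketch (the default controller choice is an assumption):

```kotlin
enum class Controller { HANDLE, FIRST_DEVICE }
enum class SwitchInstruction { FIRST, SECOND }  // the two switching instructions

/** Two-state control-device manager: with a handle connected the handle
 *  controls by default (an assumption; the application allows either), and
 *  the two interaction mode switching instructions flip the active controller. */
class ControlDeviceManager(handleConnected: Boolean) {
    var active: Controller =
        if (handleConnected) Controller.HANDLE else Controller.FIRST_DEVICE
        private set

    fun onInstruction(i: SwitchInstruction) {
        active = when (i) {
            SwitchInstruction.FIRST -> Controller.FIRST_DEVICE  // handle -> first device
            SwitchInstruction.SECOND -> Controller.HANDLE       // first device -> handle
        }
    }
}
```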
Optionally, in a design of the second aspect, the second electronic device displays a cursor and the interface content of the first electronic device, and the using the handle or the first electronic device as the control device of the second electronic device includes:
controlling a display position of the cursor on the second electronic device based on the operation of the handle;
and controlling the display position of the cursor on the second electronic device based on the operation of the first electronic device.
Optionally, in a design of the second aspect, the interface content is interface content of a front-end application of the first electronic device.
Optionally, in a design of the second aspect, the controlling a display position of the cursor on the second electronic device based on the operation of the first electronic device includes:
acquiring pose change information of the first electronic device;
causing the cursor to move on display content of the second electronic device based on the pose change information.
Optionally, in a design of the second aspect, the method further includes:
when the cursor moves to a target object in the display content of the second electronic device, if a first touch operation on a display screen of the first electronic device is received,
causing the target object in the display content of the second electronic device to respond to the first touch operation.
Optionally, in a design of the second aspect, the controlling the display position of the cursor on the second electronic device based on the operation of the first electronic device includes:
receiving a second sliding operation on a display screen of the first electronic device;
determining the displacement of the cursor according to the second sliding operation;
causing the cursor to move in display content of the second electronic device based on the displacement.
Optionally, in a design of the second aspect, the method further includes:
when the cursor moves to a target object in the display content of the second electronic device, if a click operation on a display screen of the first electronic device is received,
causing the target object in the display content of the second electronic device to respond to the click operation.
In a third aspect, the present application provides an electronic device, comprising:
a sending module, configured to establish a connection with a second electronic device, where the second electronic device displays a cursor and the interface content of the first electronic device, the first electronic device includes a touch screen, and the second electronic device is an augmented reality (AR) device, a virtual reality (VR) device, or a mixed reality (MR) device;
and a processing module, configured to acquire an operation on the first electronic device and control the display position of the cursor on the second electronic device based on the operation of the first electronic device.
Optionally, in a design of the third aspect, the interface content is interface content of a front-end application of the first electronic device.
Optionally, in a design of the third aspect, the second electronic device further displays: a menu bar, the menu bar not belonging to the interface content of the first electronic device.
Optionally, in a design of the third aspect, the processing module is specifically configured to acquire pose change information of the first electronic device, and move the cursor on the display content of the second electronic device based on the pose change information.
Optionally, in a design of the third aspect, the pose change information includes a first rotation angle of the first electronic device on a horizontal plane and a second rotation angle of the first electronic device on a vertical plane, and the processing module is specifically configured to:
determining a horizontal displacement of the cursor according to the first rotation angle;
determining a vertical displacement of the cursor according to the second rotation angle;
and moving the cursor on the display content of the second electronic equipment according to the horizontal displacement and the vertical displacement.
Optionally, in a design of the third aspect, the pose change information includes a first displacement of the first electronic device in a first direction and a second displacement of the first electronic device in a second direction, where the first direction is parallel to the left-right moving direction of the cursor displayed by the second electronic device and the second direction is parallel to the up-down moving direction of the cursor displayed by the second electronic device, and the processing module is specifically configured to:
determining a horizontal displacement of the cursor according to the first displacement;
determining a vertical displacement of the cursor according to the second displacement;
and moving the cursor on the display content of the second electronic equipment according to the horizontal displacement and the vertical displacement.
Optionally, in a design of the third aspect, the processing module is further configured to:
when the cursor moves to a target object in the display content of the second electronic device, if a first touch operation on a display screen of the first electronic device is received,
causing the target object in the display content of the second electronic device to respond to the first touch operation.
Optionally, in a design of the third aspect, the operation position of the first touch operation corresponds to a second object in the interface content of the first electronic device, and the processing module is further configured to:
masking the response of the second object to the first touch operation.
Optionally, in a design of the third aspect, the first touch operation at least includes a click operation and a first sliding operation, and the target object at least includes an application and a functionality control.
Optionally, in a design of the third aspect, the processing module is specifically configured to:
receiving a second sliding operation on a display screen of the first electronic device;
determining the displacement of the cursor according to the second sliding operation;
causing the cursor to move in display content of the second electronic device based on the displacement.
Optionally, in a design of the third aspect, the starting operation position of the second sliding operation corresponds to a third object in the interface content of the first electronic device, and the processing module is further configured to:
masking the response of the third object to the second sliding operation.
Optionally, in a design of the third aspect, the processing module is further configured to:
when the cursor moves to a target object in the display content of the second electronic device, if a click operation on a display screen of the first electronic device is received,
causing the target object in the display content of the second electronic device to respond to the click operation.
Optionally, in a design of the third aspect, the operation position of the click operation corresponds to a fourth object in the interface content of the first electronic device, and the processing module is further configured to:
masking the response of the fourth object to the click operation.
Optionally, in a design of the third aspect, the second electronic device further displays: a ray whose end point is the cursor.
Optionally, in a design of the third aspect, the processing module is further configured to:
receiving a pressing operation on a physical key of the first electronic device;
generating a corresponding second event according to the pressing operation;
and executing the second event on the front-end application of the first electronic device.
Optionally, in a design of the third aspect, the processing module is further configured to receive a second operation on the display screen of the first electronic device, where the second operation is a preset shortcut operation, and enable the first electronic device to respond to the second operation.
Optionally, in a design of the third aspect, the operation position of the second operation corresponds to a fifth object in the interface content of the second electronic device, and the fifth object does not respond to the first operation.
Optionally, in a design of the third aspect, the preset shortcut operation at least includes:
contacting a first preset area of the display screen of the first electronic device and sliding from the first preset area in a first preset direction; or,
contacting a second preset area of the display screen of the first electronic device and sliding from the second preset area in a second preset direction, where the contact with the display screen of the first electronic device lasts longer than a preset time.
Optionally, in a design of the third aspect, the processing module is further configured to reduce the display brightness of the display screen of the first electronic device; or,
execute a screen-off operation on the display screen of the first electronic device.
Optionally, in a design of the third aspect, the processing module is specifically configured to:
acquiring, when a first application is displayed on the second electronic device, pose change information of the first electronic device, and causing the cursor to move on the display content of the second electronic device based on the pose change information; or,
receiving, when a second application is displayed on the second electronic device, a second sliding operation on a display screen of the first electronic device, determining the displacement of the cursor according to the second sliding operation, and causing the cursor to move in the display content of the second electronic device based on the displacement, where the first application and the second application are different applications.
In a fourth aspect, the present application provides a first electronic device, where the first electronic device is connected with a second electronic device, the first electronic device includes a touch screen, and the second electronic device is an augmented reality (AR) device, a virtual reality (VR) device, or a mixed reality (MR) device, the first electronic device including:
a processing module, configured to use the handle or the first electronic device as the control device of the second electronic device when a handle is connected with the second electronic device;
and to use the first electronic device as the control device of the second electronic device when only the first electronic device is connected with the second electronic device.
Optionally, in a design of the fourth aspect, the processing module is specifically configured to:
using the handle as a control device for the second electronic device;
receiving a first interaction mode switching instruction;
and in response to the first interaction mode switching instruction, switching the control device of the second electronic device from the handle to the first electronic device.
Optionally, in a design of the fourth aspect, the processing module is specifically configured to:
receiving a first interaction mode switching instruction sent by the handle; generating a corresponding first interaction mode switching instruction based on receiving a second operation on the first electronic device; or receiving a first interaction mode switching instruction sent by the second electronic device.
Optionally, in a design of the fourth aspect, the processing module is specifically configured to:
using the first electronic device as the control device of the second electronic device;
the first electronic device further includes an acquisition module, configured to receive a second interaction mode switching instruction;
the processing module is specifically configured to switch the control device of the second electronic device from the first electronic device to the handle in response to the second interaction mode switching instruction.
Optionally, in a design of the fourth aspect, the processing module is specifically configured to:
receiving a second interaction mode switching instruction sent by the handle; generating a corresponding second interaction mode switching instruction based on receiving a second operation on the first electronic device; or receiving a second interaction mode switching instruction sent by the second electronic device.
Optionally, in a design of the fourth aspect, the second electronic device displays a cursor and interface content of the first electronic device, and the processing module is specifically configured to:
controlling a display position of the cursor on the second electronic device based on the operation of the handle;
and controlling the display position of the cursor on the second electronic equipment based on the operation of the first electronic equipment.
Optionally, in a design of the fourth aspect, the interface content is interface content of a front-end application of the first electronic device.
Optionally, in a design of the fourth aspect, the processing module is specifically configured to:
acquiring pose change information of the first electronic equipment;
causing the cursor to move on display content of the second electronic device based on the pose change information.
Optionally, in a design of the fourth aspect, the processing module is further configured to:
when the cursor moves to a target object in the display content of the second electronic device, if a first touch operation on a display screen of the first electronic device is received,
causing the target object in the display content of the second electronic device to respond to the first touch operation.
Optionally, in a design of the fourth aspect, the processing module is further configured to:
receiving a second sliding operation on a display screen of the first electronic device;
determining the displacement of the cursor according to the second sliding operation;
causing the cursor to move in display content of the second electronic device based on the displacement.
Optionally, in a design of the fourth aspect, the processing module is further configured to:
when the cursor moves to a target object in the display content of the second electronic device, if a click operation on a display screen of the first electronic device is received,
causing the target object in the display content of the second electronic device to respond to the click operation.
In a fifth aspect, the present application provides an information processing method, including:
displaying the interface content of a first electronic device in a second electronic device, where the first electronic device includes a touch screen, and the second electronic device is an augmented reality (AR) device, a virtual reality (VR) device, or a mixed reality (MR) device;
receiving a first operation acting on a first display screen of the first electronic device, and causing the interface content displayed by the second electronic device to respond to the first operation, where the operation position of the first operation corresponds to a first object in the interface content of the first electronic device and the first object does not respond to the first operation.
Optionally, in a design of the fifth aspect, the causing the interface content displayed by the second electronic device to respond to the first operation specifically includes:
determining a first position in the interface content displayed by the first electronic device, and causing the interface content displayed by the second electronic device to respond to the first operation based on the first position, where the first position is independent of the operation position of the first operation.
Optionally, in a design of the fifth aspect, the method further includes:
projecting the cursor to the second electronic device, so that the second electronic device displays the cursor.
Optionally, in a design of the fifth aspect, the position of the cursor in the interface content displayed by the first electronic device is the first position.
Optionally, in a design of the fifth aspect, the determining a first position in the interface content displayed by the first electronic device includes:
acquiring pose change information of the first electronic equipment;
determining a first position of the cursor in interface content displayed by the first electronic device based on the pose change information.
Optionally, in one design of the fifth aspect, the pose change information includes a first rotation angle of the first electronic device in a horizontal plane and a second rotation angle of the first electronic device in a vertical plane, and the determining the first position of the cursor in the interface content displayed by the first electronic device based on the pose change information includes:
determining a horizontal displacement of the cursor according to the first rotation angle;
determining a vertical displacement of the cursor according to the second rotation angle;
and determining a first position of the cursor in the interface content displayed by the first electronic equipment according to the horizontal displacement and the vertical displacement.
Optionally, in a design of the fifth aspect, the pose change information includes a first displacement of the first electronic device in a first direction and a second displacement of the first electronic device in a second direction, where the first direction is parallel to the left-right moving direction of the cursor displayed on the second electronic device and the second direction is parallel to the up-down moving direction of the cursor displayed on the second electronic device, and the determining, based on the pose change information, a first position of the cursor in the interface content displayed by the first electronic device includes:
Determining a horizontal displacement of the cursor according to the first displacement;
determining a vertical displacement of the cursor according to the second displacement;
and determining a first position of the cursor in the interface content displayed by the first electronic equipment according to the horizontal displacement and the vertical displacement.
Optionally, in a design of the fifth aspect, the first operation at least includes a click operation and a first sliding operation, and the first object includes at least one of an application and a functionality control.
Optionally, in a design of the fifth aspect, the determining a first position in the interface content displayed by the first electronic device includes:
receiving a second sliding operation on a display screen of the first electronic equipment;
determining the displacement of the cursor according to the second sliding operation;
and determining a first position of the cursor in the interface content displayed by the first electronic equipment according to the displacement of the cursor.
Optionally, in a design of the fifth aspect, the first operation includes at least a click operation, and the first object includes at least one of an application and a functionality control.
Optionally, in a design of the fifth aspect, the method further includes:
reducing the display brightness of the first display screen; or,
executing a screen-off operation on the first display screen.
Optionally, in a design of the fifth aspect, the method further includes:
displaying a menu bar in the second electronic device.
In a sixth aspect, the present application provides an electronic device, comprising:
a sending module, configured to display the interface content of a first electronic device in a second electronic device, where the first electronic device includes a touch screen, and the second electronic device is an augmented reality (AR) device, a virtual reality (VR) device, or a mixed reality (MR) device;
and a processing module, configured to receive a first operation acting on a first display screen of the first electronic device and cause the interface content displayed by the second electronic device to respond to the first operation, where the operation position of the first operation corresponds to a first object in the interface content of the first electronic device and the first object does not respond to the first operation.
Optionally, in a design of the sixth aspect, the causing the interface content displayed by the second electronic device to respond to the first operation specifically includes:
determining a first position in the interface content displayed by the first electronic device, and causing the interface content displayed by the second electronic device to respond to the first operation based on the first position, where the first position is independent of the operation position of the first operation.
Optionally, in a design of the sixth aspect, the method further includes:
projecting the cursor to the second electronic device, so that the second electronic device displays the cursor.
Optionally, in a design of the sixth aspect, the position of the cursor in the interface content displayed by the first electronic device is the first position.
Optionally, in a design of the sixth aspect, the determining a first position in the interface content displayed by the first electronic device includes:
acquiring pose change information of the first electronic equipment;
determining a first position of the cursor in interface content displayed by the first electronic device based on the pose change information.
Optionally, in one design of the sixth aspect, the pose change information includes a first rotation angle of the first electronic device in a horizontal plane and a second rotation angle of the first electronic device in a vertical plane, and the determining the first position of the cursor in the interface content displayed by the first electronic device based on the pose change information includes:
determining a horizontal displacement of the cursor according to the first rotation angle;
determining a vertical displacement of the cursor according to the second rotation angle;
and determining a first position of the cursor in the interface content displayed by the first electronic equipment according to the horizontal displacement and the vertical displacement.
Optionally, in a design of the sixth aspect, the pose change information includes a first displacement of the first electronic device in a first direction and a second displacement of the first electronic device in a second direction, where the first direction is parallel to the left-right moving direction of the cursor displayed on the second electronic device and the second direction is parallel to the up-down moving direction of the cursor displayed on the second electronic device, and the determining, based on the pose change information, a first position of the cursor in the interface content displayed by the first electronic device includes:
determining a horizontal displacement of the cursor according to the first displacement;
determining a vertical displacement of the cursor according to the second displacement;
and determining a first position of the cursor in the interface content displayed by the first electronic equipment according to the horizontal displacement and the vertical displacement.
Optionally, in a design of the sixth aspect, the first operation at least includes a click operation and a first sliding operation, and the first object includes at least one of an application and a functionality control.
Optionally, in a design of the sixth aspect, the determining a first position in the interface content displayed by the first electronic device includes:
receiving a second sliding operation on a display screen of the first electronic equipment;
determining the displacement of the cursor according to the second sliding operation;
and determining a first position of the cursor in the interface content displayed by the first electronic equipment according to the displacement of the cursor.
Optionally, in a design of the sixth aspect, the first operation includes at least a click operation, and the first object includes at least one of an application and a functionality control.
Optionally, in a design of the sixth aspect, the method further includes:
reducing the display brightness of the first display screen; or,
executing a screen-off operation on the first display screen.
Optionally, in a design of the sixth aspect, the method further includes:
displaying a menu bar in the second electronic device.
In a seventh aspect, the present application provides an electronic device, serving as a first electronic device, comprising a processor, an input device, an output device and a memory, wherein the memory is configured to store a computer program comprising program instructions that, when executed by the processor, cause the first electronic device to perform the method according to any of the first, second or fifth aspects.
In an eighth aspect, the present application provides a screen projection system, including a first electronic device as described in the first aspect, the second aspect, or the fifth aspect, and a second electronic device, where the first electronic device is connected to the second electronic device.
In a ninth aspect, an embodiment of the present application provides a computer-readable storage medium, wherein the computer-readable storage medium stores a computer program, the computer program includes program instructions, which, when executed by a computer, cause the computer to perform the method according to any one of the first, second or fifth aspects.
In a tenth aspect, the present application provides a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method of any of the first, second or fifth aspects as described above.
In an eleventh aspect, the present application provides a manipulation method applied to a first electronic device connected to a second electronic device, wherein the second electronic device includes an imaging system; the method comprises the following steps:
displaying the interface content of each of N applications run by the first electronic device in N respective display areas of the imaging system, where the first electronic device includes a first display screen and N is an integer greater than 1;
displaying a cursor in the imaging system, wherein the cursor is used for determining an operation object in the content displayed by the imaging system;
receiving a first sliding operation acting on the first display screen;
determining a displacement of the cursor in content displayed by the imaging system according to the first sliding operation;
where the starting position of the first sliding operation corresponds to a first object in the current interface content of the first electronic device, the first object does not respond to the first sliding operation, and the current interface content is the interface content of one of the N applications.
Optionally, in a design of the eleventh aspect, the second electronic device includes a television, an AR device, a VR device, or an MR device.
Optionally, in a design of the eleventh aspect, the method further includes:
determining, according to the cursor, a first operation object in the interface content of the N applications displayed by the imaging system, where the first operation object at least includes an application program and a functionality control;
and receiving a second operation and causing the first operation object to respond to the second operation.
Optionally, in a design of the eleventh aspect, the method further includes:
reducing the display brightness of the first display screen; or,
executing a screen-off operation on the first display screen.
Optionally, in a design of the eleventh aspect, the method further includes:
displaying a menu bar in the imaging system; the menu bar includes functionality controls for adding or deleting interface content for applications in the imaging system.
In a twelfth aspect, the present application provides a first electronic device, wherein the first electronic device is connected to a second electronic device; the first electronic device comprises a first display screen, and the second electronic device comprises an imaging system; the first electronic device includes:
a sending module, configured to display the interface content of each of N applications run by the first electronic device in N respective display areas of the imaging system, where the first electronic device includes a first display screen and N is an integer greater than 1, and further configured to display a cursor in the imaging system, the cursor being used to determine an operation object in the content displayed by the imaging system;
and a processing module, configured to receive a first sliding operation acting on the first display screen and determine the displacement of the cursor in the content displayed by the imaging system according to the first sliding operation;
where the starting position of the first sliding operation corresponds to a first object in the current interface content of the first electronic device, the first object does not respond to the first sliding operation, and the current interface content is the interface content of one of the N applications.
Optionally, in a design of the twelfth aspect, the second electronic device includes a television, an AR device, a VR device, or an MR device.
Optionally, in a design of the twelfth aspect, the processing module is further specifically configured to:
determining, according to the cursor, a first operation object in the interface content of the N applications displayed by the imaging system, where the first operation object at least includes an application program and a functionality control;
and receiving a second operation and causing the first operation object to respond to the second operation.
Optionally, in a design of the twelfth aspect, the processing module is further configured to:
reducing the display brightness of the first display screen; or,
executing a screen-off operation on the first display screen.
Optionally, in a design of the twelfth aspect, the sending module is further configured to:
displaying a menu bar in the imaging system; the menu bar includes functionality controls for adding or deleting interface content for applications in the imaging system.
In a thirteenth aspect, the present application provides a first electronic device, comprising:
a processor, an input device, an output device, and a memory;
wherein the input device is used for receiving data and instructions; the output device is used for sending data and instructions; the memory is for storing a computer program comprising program instructions that, when executed by the processor, cause the first electronic device to perform the method of the eleventh aspect.
In a fourteenth aspect, the present application provides a touch-screen-based control method applied to a first electronic device, where the first electronic device includes a touch screen and the touch screen can be used to display an interface. The method includes: when the first electronic device establishes a screen projection connection with a second electronic device, generating a cursor and displaying the cursor in the interface of the touch screen; generating screen projection content based on the interface content in the touch screen, where the screen projection content includes the cursor; sending the screen projection content to the second electronic device; and when a target touch operation of a user on the touch screen is received, responding to the target touch operation based on the display position of the cursor in the current interface, where the display position of the cursor in the current interface is used to indicate the response position for the target touch operation in the current interface, and the response position is independent of the operation position of the target touch operation on the touch screen. The first electronic device includes a mobile phone or a tablet; the second electronic device includes a display, a television, a projector, an AR device, a VR device, or an MR device.
In a fifteenth aspect, the present application provides a touch-screen-based control apparatus applied to a first electronic device, where the first electronic device includes a touch screen and the touch screen can be used to display an interface. The apparatus includes: a generation module, configured to generate a cursor when the first electronic device establishes a screen projection connection with the second electronic device, where the cursor is displayed in the interface of the touch screen; a screen projection module, configured to generate screen projection content based on the interface content in the touch screen and send the screen projection content to the second electronic device, where the screen projection content includes the cursor; a receiving module, configured to receive a target touch operation of a user on the touch screen; and a response module, configured to respond to the target touch operation based on the display position of the cursor in the current interface, where the display position of the cursor in the current interface is used to indicate the response position for the target touch operation in the current interface, and the response position is independent of the operation position of the target touch operation on the touch screen.
In a possible design according to the fourteenth or fifteenth aspect, when the first electronic device has not established a screen projection connection with the second electronic device, the interface of the touch screen does not include a cursor. The screen projection content includes part or all of the content in the interface and is used to keep part or all of the content presented in the display system of the second electronic device synchronized with part or all of the content displayed in the touch screen. The screen projection content may further include at least one of an interface of a background application run by the first electronic device, a newly drawn menu bar, or a preset functionality control. Both the first electronic device and the second electronic device may display the cursor.
In a possible design according to the fourteenth or fifteenth aspect, the manner of generating the cursor includes: acquiring the interface content of the front-end application corresponding to the current interface and drawing the cursor into that interface content; or drawing the cursor and adding it to the current interface through a floating window interface; or drawing a ray through a floating window interface such that the ray intersects the current interface at an end point, where the end point serves as the cursor. The method may be performed by the generation module. In a subsequent operation, a predefined operation for activating or hiding the display of the cursor may also be received.
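As a concrete illustration of the floating-window manner of generating the cursor, the following is a minimal sketch assuming Android's WindowManager overlay API and the corresponding overlay permission; the class name CursorOverlay is an assumption, and the built-in drawable merely stands in for a real cursor asset.

```java
// A minimal sketch: the cursor is an ImageView in a small floating window
// added on top of the current interface. SYSTEM_ALERT_WINDOW permission
// is assumed to have been granted.
import android.content.Context;
import android.graphics.PixelFormat;
import android.view.Gravity;
import android.view.WindowManager;
import android.widget.ImageView;

public class CursorOverlay {
    private final WindowManager windowManager;
    private final ImageView cursorView;
    private final WindowManager.LayoutParams params;

    public CursorOverlay(Context context) {
        windowManager = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
        cursorView = new ImageView(context);
        // Framework drawable used as a stand-in for a real cursor asset.
        cursorView.setImageResource(android.R.drawable.ic_menu_mylocation);
        params = new WindowManager.LayoutParams(
                WindowManager.LayoutParams.WRAP_CONTENT,
                WindowManager.LayoutParams.WRAP_CONTENT,
                WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY,
                // The cursor itself must not consume touches:
                WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE
                        | WindowManager.LayoutParams.FLAG_NOT_TOUCHABLE,
                PixelFormat.TRANSLUCENT);
        params.gravity = Gravity.TOP | Gravity.START;
    }

    public void show(int x, int y) {
        params.x = x;
        params.y = y;
        windowManager.addView(cursorView, params);
    }

    public void moveTo(int x, int y) {
        params.x = x;
        params.y = y;
        windowManager.updateViewLayout(cursorView, params);
    }

    public void hide() {
        windowManager.removeView(cursorView);
    }
}
```

Because the overlay is flagged as not touchable, it never intercepts input itself; interception of touch operations is handled separately, as sketched after the next design.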
In one possible design according to the fourteenth or fifteenth aspect, a first event processing system is invoked and the touch screen event processing system is masked; the target touch operation is re-parsed into a target event, and the target event is injected into the first event processing system, so that the first electronic device responds to the target touch operation according to the first event processing system. Alternatively, the first event processing system is invoked and a transparent target floating window is drawn on the current interface; the target floating window masks the touch screen event processing system; the target event corresponding to the target touch operation is parsed through the target floating window and injected into the first event processing system, so that the first electronic device responds to the target touch operation according to the first event processing system. The response of the touch screen event processing system to the target touch operation depends on the operation position of the target touch operation on the touch screen, whereas the response of the first event processing system to the target touch operation is based on the relationship between the position of the cursor and the operation position of the target touch operation on the touch screen. The method may be performed by a calling module.
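The paragraph above can be made concrete with a minimal sketch, assuming Android: a full-screen transparent window consumes raw touches (masking the default touch screen event handling), re-parses them into cursor movement or a synthetic tap, and injects the tap at the cursor position. Dispatching through the root view of the current interface stands in for true event injection, which on a real system would require system-level rights; all class names are assumptions.

```java
// A minimal sketch of the transparent target floating window: raw touches
// act as touchpad input, and a tap is re-parsed into a synthetic DOWN/UP
// pair dispatched at the cursor position rather than at the finger position.
import android.os.SystemClock;
import android.view.MotionEvent;
import android.view.View;

public class TouchpadInterceptor implements View.OnTouchListener {
    private final View targetRoot;      // root view of the current interface
    private final CursorOverlay cursor; // overlay from the previous sketch
    private float cursorX, cursorY;
    private float lastX, lastY;

    public TouchpadInterceptor(View targetRoot, CursorOverlay cursor) {
        this.targetRoot = targetRoot;
        this.cursor = cursor;
    }

    @Override
    public boolean onTouch(View maskWindow, MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                lastX = event.getRawX();
                lastY = event.getRawY();
                return true;
            case MotionEvent.ACTION_MOVE:
                // Relative movement: the response position depends on the
                // cursor, not on where the finger touches the screen.
                cursorX += event.getRawX() - lastX;
                cursorY += event.getRawY() - lastY;
                lastX = event.getRawX();
                lastY = event.getRawY();
                cursor.moveTo((int) cursorX, (int) cursorY);
                return true;
            case MotionEvent.ACTION_UP:
                // Simplification: every UP is treated as a tap; a real
                // implementation would distinguish taps from drags.
                injectTapAtCursor();
                return true;
        }
        return false;
    }

    private void injectTapAtCursor() {
        long now = SystemClock.uptimeMillis();
        MotionEvent down = MotionEvent.obtain(now, now,
                MotionEvent.ACTION_DOWN, cursorX, cursorY, 0);
        MotionEvent up = MotionEvent.obtain(now, now + 50,
                MotionEvent.ACTION_UP, cursorX, cursorY, 0);
        targetRoot.dispatchTouchEvent(down);
        targetRoot.dispatchTouchEvent(up);
        down.recycle();
        up.recycle();
    }
}
```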
In one possible design according to the fourteenth or fifteenth aspect, before invoking the first event processing system, the method further comprises: detecting that the first electronic device has established a screen projection connection with the second electronic device; or detecting a first switching command input by a user, where the first switching command instructs the first electronic device to invoke the first event processing system and mask the touch screen event processing system. The method may be performed by a detection module.
In a possible design according to the fourteenth or fifteenth aspect, the display brightness of the touch screen may be reduced, or the touch screen may be turned off. This may be performed by a brightness control module.
In a possible design according to the fourteenth aspect or the fifteenth aspect, when the target touch operation is a sliding operation, the first electronic device changes the position of the cursor in the current interface according to the sliding operation. This may be performed by the response module.
In a possible design, when the target touch operation is a click operation, the first electronic device performs a confirmation operation on the object corresponding to the cursor in the current interface. This may be performed by the response module.
In a possible design, when the target touch operation is a long-press sliding operation, the first electronic device may slide or drag the object corresponding to the cursor in the interface. This may be performed by the response module.
In a possible design according to the fourteenth aspect or the fifteenth aspect, when the motion trajectory of the cursor in the current interface caused by the target touch operation matches a preset shortcut operation command, or when the target touch operation itself matches a preset shortcut operation command, the first electronic device executes the shortcut operation command. The target touch operation includes: a slide from a first preset area of the touch screen in a first preset direction; or a slide from a second preset area of the touch screen in a second preset direction where the contact time with the touch screen exceeds a preset duration; or clicking the touch screen a preset number of times; or a sliding track of the target touch operation that matches a preset pattern. This may be performed by the response module.
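A minimal sketch of such shortcut matching is given below; the edge-region sizes and the command names are illustrative assumptions, not values defined by this application.

```java
// A minimal sketch: a swipe that starts in a preset edge region and moves
// in a preset direction is matched to a shortcut command.
import android.view.MotionEvent;

public class ShortcutMatcher {
    private static final int EDGE = 48;          // assumed edge-region width, px
    private static final int MIN_TRAVEL = 200;   // assumed minimum swipe distance, px

    public enum Command { NONE, GO_BACK, GO_HOME }

    public Command match(MotionEvent down, MotionEvent up, int screenHeight) {
        float dx = up.getRawX() - down.getRawX();
        float dy = up.getRawY() - down.getRawY();
        // Swipe right from the left edge: interpreted here as "back".
        if (down.getRawX() < EDGE && dx > MIN_TRAVEL) {
            return Command.GO_BACK;
        }
        // Swipe up from the bottom edge: interpreted here as "home".
        if (down.getRawY() > screenHeight - EDGE && dy < -MIN_TRAVEL) {
            return Command.GO_HOME;
        }
        return Command.NONE;
    }
}
```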
In a possible design according to the fourteenth or fifteenth aspect, the method further includes: starting a multi-screen display mode; creating N virtual screens in the interface of the touch screen; determining N interfaces to be displayed; and displaying the N interfaces to be displayed in the N virtual screens respectively. The N interfaces to be displayed belong to M applications, where M is an integer not greater than N, and N is an integer greater than 1. The cursor may be located at any position in the N virtual screens. This may be performed by a multi-screen creation module.
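Assuming Android, the N virtual screens could be backed by DisplayManager virtual displays; the sketch below creates N of them, each rendering into its own Surface, and leaves the side-by-side composition of the surfaces out.

```java
// A minimal sketch of creating N virtual screens; each virtual display
// renders into a Surface supplied here by an ImageReader.
import android.content.Context;
import android.graphics.PixelFormat;
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;
import android.media.ImageReader;

public class VirtualScreenFactory {
    public static VirtualDisplay[] createScreens(Context context, int n,
                                                 int width, int height, int dpi) {
        DisplayManager dm =
                (DisplayManager) context.getSystemService(Context.DISPLAY_SERVICE);
        VirtualDisplay[] screens = new VirtualDisplay[n];
        for (int i = 0; i < n; i++) {
            ImageReader reader =
                    ImageReader.newInstance(width, height, PixelFormat.RGBA_8888, 2);
            screens[i] = dm.createVirtualDisplay(
                    "virtual-screen-" + i,   // display name
                    width, height, dpi,
                    reader.getSurface(),     // where this screen renders
                    DisplayManager.VIRTUAL_DISPLAY_FLAG_PRESENTATION);
        }
        return screens;
    }
}
```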
In a possible design according to the fourteenth or fifteenth aspect, the first electronic device may further obtain a second switching instruction, where the second switching instruction instructs the first electronic device to invoke a second event processing system and to mask the first event processing system and the touch screen event processing system; the second event processing system is then invoked. When the posture of the first electronic device changes, the position of the cursor in the current interface is changed based on the response of the second event processing system to the posture change of the first electronic device. For example, the horizontal displacement of the cursor is determined from the rotation angle in the left-right direction, and the vertical displacement of the cursor is determined from the rotation angle in the up-down direction. The first event processing system may be a touchpad event processing system, and the second event processing system may be a mouse event processing system; any event processing system can be flexibly defined. This may be performed by the calling module.
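A minimal sketch of such a mouse-like second event processing system, assuming Android's gyroscope API: angular velocity is integrated into cursor displacement, so left-right rotation maps to horizontal movement and up-down rotation to vertical movement. The sensitivity constant and the axis mapping are illustrative assumptions.

```java
// A minimal sketch: gyroscope rotation rates are integrated over time
// and converted into a cursor displacement.
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class AirMouse implements SensorEventListener {
    private static final float SENSITIVITY = 300f; // px per radian, assumed
    private final CursorOverlay cursor;            // overlay from the earlier sketch
    private float cursorX, cursorY;
    private long lastTimestampNs;

    public AirMouse(SensorManager sm, CursorOverlay cursor) {
        this.cursor = cursor;
        Sensor gyro = sm.getDefaultSensor(Sensor.TYPE_GYROSCOPE);
        sm.registerListener(this, gyro, SensorManager.SENSOR_DELAY_GAME);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (lastTimestampNs != 0) {
            float dt = (event.timestamp - lastTimestampNs) * 1e-9f; // seconds
            // Assumed mapping for a device pointed at the remote screen:
            // values[2] (rotation around z, left/right turn) -> horizontal,
            // values[0] (rotation around x, up/down tilt)    -> vertical.
            cursorX -= event.values[2] * dt * SENSITIVITY;
            cursorY -= event.values[0] * dt * SENSITIVITY;
            cursor.moveTo((int) cursorX, (int) cursorY);
        }
        lastTimestampNs = event.timestamp;
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```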
The embodiment of the application provides an information processing method, which comprises the following steps: a first electronic device is connected with a second electronic device, wherein the second electronic device displays a cursor and the interface content of the first electronic device, the first electronic device comprises a touch screen, and the second electronic device is an augmented reality (AR) device, a virtual reality (VR) device, or a mixed reality (MR) device; a first operation on the first electronic device is acquired; and the display position of the cursor on the second electronic device is controlled based on the first operation on the first electronic device. In this way, the first electronic device with the touch screen can replace a handle as the control device for the AR/VR/MR device.
Drawings
FIG. 1 is a system architecture diagram of a screen projection system according to an embodiment of the present application;
FIG. 2 is a schematic structural diagram of a first electronic device according to an embodiment of the present application;
FIG. 3a is a block diagram of a software architecture of an electronic device according to an embodiment of the present application;
FIG. 3b is a schematic flowchart of an information processing method according to an embodiment of the present application;
FIG. 4a is a schematic interface diagram of a first electronic device according to an embodiment of the present application;
FIG. 4b is a schematic interface diagram of a first electronic device according to an embodiment of the present application;
FIG. 4c is a schematic interface diagram of a first electronic device according to an embodiment of the present application;
FIG. 5a is a schematic interface diagram of a first electronic device according to an embodiment of the present application;
FIG. 5b is a schematic interface diagram of a first electronic device according to an embodiment of the present application;
FIG. 5c is a schematic interface diagram of a first electronic device according to an embodiment of the present application;
FIG. 6a is a schematic diagram of an actual screen projection scenario;
FIG. 6b is a schematic diagram of an actual screen projection scenario;
FIG. 6c is a schematic diagram of an actual screen projection scenario;
FIG. 6d is a schematic diagram of an actual screen projection scenario;
FIG. 7a is a schematic diagram of an actual screen projection scenario;
FIG. 7b is a schematic diagram illustrating a posture change of the first electronic device in the horizontal direction;
FIG. 7c is a schematic diagram illustrating the displacement of a cursor displayed on a second electronic device;
FIG. 7d is a schematic diagram of an actual screen projection scenario;
FIG. 7e is a schematic diagram illustrating a posture change of the first electronic device in the vertical direction;
FIG. 7f is a schematic diagram illustrating the displacement of the cursor displayed on the second electronic device;
FIG. 7g is a schematic diagram of an actual screen projection scenario;
FIG. 7h is a schematic diagram of a sliding operation performed by a user;
FIG. 7i is a schematic diagram of a sliding operation performed by a user;
FIG. 7j is a schematic diagram illustrating the displacement of a cursor displayed by a second electronic device;
FIG. 8a is a schematic diagram of screen projection content of a second electronic device according to an embodiment of the present application;
FIG. 8b is a schematic diagram of screen projection content of a second electronic device according to an embodiment of the present application;
FIG. 9a is a schematic diagram of screen projection content of a second electronic device according to an embodiment of the present application;
FIG. 9b is a schematic diagram of screen projection content of a second electronic device according to an embodiment of the present application;
FIG. 9c is a schematic diagram of screen projection content of a second electronic device according to an embodiment of the present application;
FIG. 9d is a schematic diagram of a user operating a first electronic device according to an embodiment of the present application;
FIG. 10a is a schematic diagram of screen projection content of a second electronic device according to an embodiment of the present application;
FIG. 10b is a schematic diagram of screen projection content of a second electronic device according to an embodiment of the present application;
FIG. 10c is an interaction diagram of a first electronic device according to an embodiment of the present application;
FIG. 10d is a schematic diagram of the interface content of a first electronic device according to an embodiment of the present application;
FIG. 11a is an interaction diagram of a first electronic device and a second electronic device;
FIG. 11b is an interaction diagram of a first electronic device and a second electronic device;
FIG. 11c is an interaction diagram of a first electronic device and a second electronic device;
FIG. 11d is an interaction diagram of a first electronic device and a second electronic device;
FIG. 11e is an interaction diagram of a first electronic device and a second electronic device;
FIG. 11f is an interaction diagram of a first electronic device and a second electronic device;
FIG. 11g is an interaction diagram of a first electronic device and a second electronic device;
FIG. 12a is a schematic interface diagram of a second electronic device according to an embodiment of the present application;
FIG. 12b is a schematic interface diagram of a second electronic device according to an embodiment of the present application;
FIG. 12c is a schematic interface diagram of a second electronic device according to an embodiment of the present application;
FIG. 12d is a schematic diagram of a user operating a first electronic device according to an embodiment of the present application;
FIG. 12e is a schematic interface diagram of a second electronic device according to an embodiment of the present application;
FIG. 12f is a schematic interface diagram of a second electronic device according to an embodiment of the present application;
FIG. 12g is a schematic diagram of a user operating a first electronic device according to an embodiment of the present application;
FIG. 12h is a schematic interface diagram of a second electronic device according to an embodiment of the present application;
FIG. 12i is a schematic interface diagram of a second electronic device according to an embodiment of the present application;
FIG. 12j is a schematic interface diagram of a second electronic device according to an embodiment of the present application;
FIG. 12k is a schematic interface diagram of a second electronic device according to an embodiment of the present application;
FIG. 12l is a schematic diagram of a user operating a first electronic device according to an embodiment of the present application;
FIG. 12m is a schematic diagram of a user operating a first electronic device according to an embodiment of the present application;
FIG. 12n is a schematic interface diagram of a second electronic device according to an embodiment of the present application;
FIG. 12o is a schematic diagram of a user operating a first electronic device according to an embodiment of the present application;
FIG. 12p is a schematic interface diagram of a second electronic device according to an embodiment of the present application;
FIG. 12q is a schematic diagram of a user operating a first electronic device according to an embodiment of the present application;
FIG. 12r is a schematic interface diagram of a second electronic device according to an embodiment of the present application;
FIG. 12s is a schematic diagram of a user operating a first electronic device according to an embodiment of the present application;
FIG. 12t is a schematic interface diagram of a second electronic device according to an embodiment of the present application;
FIG. 12u is a schematic diagram of a user operating a first electronic device according to an embodiment of the present application;
FIG. 13 is a schematic diagram of a screen projection scenario according to an embodiment of the present application;
FIG. 14a is a schematic diagram of screen projection content of a second electronic device according to an embodiment of the present application;
FIG. 14b is a schematic diagram of screen projection content of a second electronic device according to an embodiment of the present application;
FIG. 14c is a schematic diagram of screen projection content of a second electronic device according to an embodiment of the present application;
FIG. 14d is a schematic diagram of a user operating a first electronic device according to an embodiment of the present application;
FIG. 14e is a schematic diagram of screen projection content of a second electronic device according to an embodiment of the present application;
FIG. 14f is a schematic diagram of a user operating a first electronic device according to an embodiment of the present application;
FIG. 14g is a schematic diagram of screen projection content of a second electronic device according to an embodiment of the present application;
FIG. 15 is a schematic flowchart of an information processing method according to an embodiment of the present application;
FIG. 16 is a schematic diagram of an embodiment of the present application;
FIG. 17 is a schematic diagram of an embodiment of the present application;
FIG. 18 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 19 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 20 is an interaction diagram of a first electronic device and a second electronic device;
FIG. 21a is a schematic diagram illustrating a connection between a first electronic device and a second electronic device;
FIG. 21b is a schematic diagram illustrating a connection between a first electronic device and a second electronic device;
FIG. 22a is a schematic diagram of a user using a handle as the interaction device;
FIG. 22b is a schematic diagram of a possible software structure of the first electronic device;
FIG. 22c is a schematic diagram of a virtual screen management method in a VR scenario;
FIG. 22d is a schematic diagram of the Android system processing a screen injection event;
FIG. 23a is a schematic diagram of a user interacting with a first electronic device;
FIG. 23b is a schematic diagram of a user interacting with a first electronic device;
FIG. 23c is a schematic diagram of a user interacting with a first electronic device;
FIG. 23d is a schematic diagram of a user interacting with a first electronic device;
FIG. 24a is a schematic diagram of a user interacting with a first electronic device according to an embodiment of the present application;
FIG. 24b is a schematic diagram of a user interacting with a first electronic device according to an embodiment of the present application;
FIG. 24c is a schematic diagram of a user interacting with a first electronic device according to an embodiment of the present application;
FIG. 25 is a schematic operation diagram of a first electronic device according to an embodiment of the present application;
FIG. 26 is a schematic diagram of a user interacting with a first electronic device according to an embodiment of the present application;
FIG. 27 is a block diagram of a system architecture according to an embodiment of the present application;
FIG. 28 is a block diagram of a system architecture according to an embodiment of the present application;
FIG. 29 is a schematic flowchart of an information processing method according to an embodiment of the present application;
FIG. 30 is a schematic diagram illustrating manipulation of multiple screens according to an embodiment of the present application;
FIG. 31 is a flowchart of a control method according to an embodiment of the present application;
FIG. 32 is a schematic diagram of an event handling process according to an embodiment of the present application;
FIG. 33 is a schematic diagram of an operation device according to an embodiment of the present application.
Detailed Description
The embodiments of the present application provide an information processing method and an electronic device, with which a user can use a mobile phone as the control device of an augmented reality (AR) device, a virtual reality (VR) device, or a mixed reality (MR) device.
Embodiments of the present application are described below with reference to the accompanying drawings. As those skilled in the art will appreciate, with the development of technology and the emergence of new scenarios, the technical solutions provided in the embodiments of the present application are likewise applicable to similar technical problems.
The terms "first," "second," and the like in the description and in the claims of the present application and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the terms so used are interchangeable under appropriate circumstances and are merely descriptive of the various embodiments of the application and how objects of the same nature can be distinguished. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of elements is not necessarily limited to those elements, but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
Screen projection may be wired or wireless. In wired screen projection, a connection between electronic devices may be established through a high definition multimedia interface (HDMI), and media data is transmitted over an HDMI transmission line. In wireless screen projection, a connection between a plurality of electronic devices may be established through, for example, the Miracast protocol, and media data is transmitted over a wireless local area network (WLAN, e.g., Wi-Fi).
The screen projection system of the present application may include at least two electronic devices and a screen projection port, where the screen projection port may include a wired port and/or a wireless port. The wired port may be an HDMI port; the wireless port may be an application programming interface (API) or a hardware screen projection module. Referring to fig. 1, fig. 1 is a system architecture diagram of a screen projection system according to an embodiment of the present application. As shown in fig. 1, the screen projection system includes a first electronic device 100 and a second electronic device 200; the first electronic device 100 may include a first wired port 101 and/or a first wireless port 102, and the second electronic device 200 may include a second wired port 201 and/or a second wireless port 202. The first wired port 101 and the first wireless port 102 may be integrated on the first electronic device 100 or exist independently of it; likewise, the second wired port 201 and the second wireless port 202 may be integrated on the second electronic device 200 or exist independently of it, which is not limited in this embodiment of the application. The first electronic device 100 and the second electronic device 200 may establish a screen projection connection through a screen projection port (wired or wireless). The first electronic device 100 at least has a screen projection transmission (Source) capability.
The first electronic device 100 may include an enhanced interaction service that can obtain sensor information of the electronic device (e.g., from the sensor inputs in fig. 1), including but not limited to posture information of the first electronic device, and can also obtain touch screen information of the electronic device (e.g., from the touch screen inputs in fig. 1), including but not limited to touch information on the touch screen. How the enhanced interaction service of the first electronic device uses the acquired sensor information and touch screen information is described in the following embodiments and not detailed here.
In one implementation, the enhanced interaction service may obtain the interface content of the front-end (foreground) application of the first electronic device, draw additional images (such as a cursor and a menu bar) onto that interface content, and send the drawn interface content to the screen projection service; the screen projection service may then generate screen projection content based on the drawn interface content and send it to the second electronic device 200, so that the display screen of the second electronic device displays the screen projection content.
In another implementation, the enhanced interaction service may add the additional images (e.g., a cursor and a menu bar) to the interface content of the front-end application of the first electronic device through a floating window interface of the system. The screen projection service may acquire the interface content of the front-end application with the additional images added, generate screen projection content based on it, and send the screen projection content to the second electronic device 200, so that the display screen of the second electronic device displays the screen projection content.
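Where the enhanced interaction service draws the cursor into the interface content itself (the first implementation above), the compositing step can be as simple as the following sketch, which assumes each captured frame is available as a mutable Bitmap; the helper name is hypothetical.

```java
// A minimal sketch of compositing a cursor onto a captured interface frame
// before the frame is handed to the screen projection service.
import android.graphics.Bitmap;
import android.graphics.Canvas;

public final class CursorCompositor {
    private CursorCompositor() { }

    /** Draws the cursor bitmap onto the frame at the cursor position and
     *  returns the same frame for encoding/projection. */
    public static Bitmap compose(Bitmap frame, Bitmap cursorBitmap,
                                 float cursorX, float cursorY) {
        Canvas canvas = new Canvas(frame); // requires a mutable bitmap
        canvas.drawBitmap(cursorBitmap, cursorX, cursorY, null);
        return frame;
    }
}
```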
The first electronic device 100 may include a screen projection service configured to implement the screen projection transmission (Source) capability. For example, the screen projection service may acquire the interface content of the first electronic device as screen projection data, or may take the interface content drawn by the enhanced interaction service as the screen projection content, and send the screen projection content to the second electronic device 200 through the first wireless port or the first wired port.
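A minimal sketch, assuming Android's MediaProjection API, of how such a screen projection service might mirror the interface (with the cursor already composited) into a Surface that feeds the encoder or transport toward the second electronic device; obtaining the user-granted MediaProjection token via createScreenCaptureIntent() is omitted.

```java
// A minimal sketch: everything rendered to the device screen is mirrored
// into encoderInput, which the cast transport consumes.
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;
import android.media.projection.MediaProjection;
import android.view.Surface;

public class ScreenCastSource {
    public static VirtualDisplay startCapture(MediaProjection projection,
                                              Surface encoderInput,
                                              int width, int height, int dpi) {
        return projection.createVirtualDisplay(
                "screen-cast-source",
                width, height, dpi,
                DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
                encoderInput,
                null /* callback */, null /* handler */);
    }
}
```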
The second electronic device 200 is equipped with, but is not limited to, a screen projection reception (Sink) capability and an image display capability. The second electronic device 200 may include a screen projection service configured to implement the screen projection reception (Sink) capability, which may display the received screen projection content on the display screen of the second electronic device.
Examples of the first electronic device 100 include, but are not limited to, electronic devices equipped with iOS, Android, Microsoft, or other operating systems; for example, the first electronic device 100 may be a mobile phone, a tablet computer, a personal digital assistant (PDA), or a desktop computer. The second electronic device 200 may be a television, a tablet computer, or a desktop computer.
It should be noted that the first electronic device may be an electronic device having a display function.
It should be noted that, in some scenarios, if the second electronic device is a television, a tablet computer, or a desktop computer, the size of the display area of the display screen of the second electronic device may be larger than the size of the display area of the display screen of the first electronic device.
For ease of understanding, the structure of the first electronic device 100 provided in the embodiments of the present application will be described below by way of example. Referring to fig. 2, fig. 2 is a schematic structural diagram of a first electronic device provided in an embodiment of the present application.
As shown in fig. 2, the first electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the first electronic device 100. In other embodiments of the present application, the first electronic device 100 may include more or fewer components than shown, or combine certain components, or split certain components, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
Wherein the controller may be a neural center and a command center of the first electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
It should be understood that the interfacing relationship between the modules illustrated in the embodiment of the present application is only an illustration, and does not constitute a limitation on the structure of the first electronic device 100. In other embodiments of the present application, the first electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like.
The wireless communication function of the first electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
In some possible implementations, the first electronic device 100 may communicate with other devices using wireless communication capabilities. For example, the first electronic device 100 may communicate with the second electronic device 200, the first electronic device 100 establishes a screen-cast connection with the second electronic device 200, the first electronic device 100 outputs screen-cast data to the second electronic device 200, and so on. The screen projection data output by the first electronic device 100 may be audio and video data. The communication process of the first electronic device 100 and the second electronic device 200 can refer to the related description of the following embodiments, and the details are not repeated here.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the first electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G applied to the first electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor and convert it into an electromagnetic wave radiated through the antenna 1. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the first electronic device 100, including wireless local area networks (WLAN) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on it, and convert it into an electromagnetic wave radiated through the antenna 2.
In some embodiments, the antenna 1 of the first electronic device 100 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160, so that the first electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
The first electronic device 100 implements the display function through the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the first electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
In some possible implementations, the display screen 194 may be used to display various interfaces of the system output of the first electronic device 100. The interfaces output by the first electronic device 100 can refer to the relevant description of the subsequent embodiments.
The first electronic device 100 may implement a photographing function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the first electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals.
Video codecs are used to compress or decompress digital video. The first electronic device 100 may support one or more video codecs, so that the first electronic device 100 can play or record video in a plurality of encoding formats, such as: moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU may implement applications such as intelligent recognition of the first electronic device 100, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the first electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications and data processing of the first electronic device 100 by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, a phone book, etc.) created during the use of the first electronic device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
The first electronic device 100 can implement an audio function through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playing, recording, etc. In some possible implementations, the audio module 170 may be used to play sound corresponding to video. For example, when the display screen 194 displays a video playing screen, the audio module 170 outputs the sound of the video playing.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal.
The microphone 170C, also referred to as a "microphone," is used to convert sound signals into electrical signals.
The headphone interface 170D is used to connect a wired headphone. The headset interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
The pressure sensor 180A is used for sensing a pressure signal, and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The gyro sensor 180B may be used to determine the motion attitude of the first electronic device 100. The air pressure sensor 180C is used to measure air pressure.
The acceleration sensor 180E may detect the magnitude of acceleration of the first electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity can be detected when the first electronic device 100 is stationary. The method can also be used for recognizing the posture of the electronic equipment, and is applied to horizontal and vertical screen switching, pedometers and other applications.
A distance sensor 180F for measuring a distance.
The ambient light sensor 180L is used to sense the ambient light level.
The fingerprint sensor 180H is used to collect a fingerprint.
The temperature sensor 180J is used to detect temperature.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touchscreen, also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the type of touch event. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the first electronic device 100 at a position different from that of the display screen 194.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys. The first electronic device 100 may receive a key input, and generate a key signal input related to user setting and function control of the first electronic device 100.
The motor 191 may generate a vibration cue.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card.
The above describes the structure of the first electronic device 100; next, the software structure of the first electronic device is described. The software system of the first electronic device 100 may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture. Examples of the first electronic device 100 include, but are not limited to, electronic devices equipped with iOS, Android, Microsoft, or other operating systems. In the embodiments of the present application, the first electronic device 100 equipped with an Android system is taken as an example to illustrate the software structure of the first electronic device 100.
Fig. 3a is a block diagram of a software structure of the first electronic device 100 according to the embodiment of the present application.
As shown in fig. 3a, the layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom. The application layer may include a series of application packages.
As shown in fig. 3a, the application package may include applications such as camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 3a, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The interface content may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide the communication functions of the first electronic device 100, for example, management of call status (including connected, disconnected, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar; it can be used to convey notification-type messages, which may disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify download completion, message alerts, and the like. The notification manager may also present notifications in the top status bar of the system in the form of a chart or scroll-bar text, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window, for example, prompting text information in the status bar, sounding a prompt tone, vibrating the electronic device, or flashing an indicator light.
The Android runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part is the functions that the java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules, for example: a surface manager, media libraries, a three-dimensional graphics processing library (e.g., OpenGL ES), a 2D graphics engine (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording in a variety of commonly used audio and video formats, as well as still image files and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The graphics processing library is used to implement drawing, image rendering, composition, layer processing, and the like for 2D or 3D graphics.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, and a sensor driver.
It should be noted that the second electronic device may have all or a part of the structure shown in fig. 2 and fig. 3a, and is not limited herein. For convenience of understanding, the following embodiments of the present application will specifically describe a screen projection method provided by the embodiments of the present application by taking a first electronic device having a structure shown in fig. 2 and fig. 3a as an example, with reference to the accompanying drawings and application scenarios.
In this embodiment, the second electronic device 200 may be an electronic device such as a television, a tablet computer, or a desktop computer, and the screen projection method provided in this embodiment is described below by taking the second electronic device as a television.
First, a process of establishing a screen-cast connection between the first electronic device 100 and the second electronic device 200 will be described.
It should be noted that the number of the second electronic devices may be one or more, that is, the first electronic device may establish a screen-projecting connection with one second electronic device, or may establish a screen-projecting connection with multiple second electronic devices at the same time.
Next, how the first electronic device establishes a screen-casting connection with the second electronic device is explained.
In one alternative, the first electronic device can establish a screen projection connection with the second electronic device through a screen projection control provided by the system.
Optionally, referring to fig. 4a, fig. 4a is a schematic interface diagram of a first electronic device according to an embodiment of the present application. As shown in fig. 4a, the user's finger may slide down from the top area of the main interface of the first electronic device 100. When the first electronic device 100 detects the slide-down operation on the main interface, it displays the notification management interface 40 shown in fig. 4b. As shown in fig. 4b, the notification management interface 40 includes a mobile data icon 402, a wireless network icon 403, a wireless screen projection icon 401, and the like. When the user clicks the wireless screen projection icon 401 on the notification management interface 40, the first electronic device 100 may start the wireless screen projection function.
Optionally, in this embodiment of the application, if the wireless network is not enabled when the user clicks the wireless screen projection icon 401 on the notification management interface 40, the first electronic device 100 may prompt the user to enable the wireless network or enable it automatically, and prompt the user to select a wireless network connection. In the embodiment of the present application, the first electronic device 100 automatically searches, over wireless fidelity (Wi-Fi), for screen projection devices (electronic devices with screen projection transmission/reception capability) connected to the same Wi-Fi network. The first electronic device 100 may display a search/selection box 404, as shown in fig. 4c, on the notification management interface 40, where the search/selection box 404 includes the names of the one or more screen projection devices found, prompting the user to select one of them to establish a screen projection connection. After the user selects the screen projection device to project to, the first electronic device 100 may establish a screen projection connection with the selected device.
In another alternative, the first electronic device may also establish a screen projection connection with the second electronic device by selecting a screen projection control in certain applications (e.g., a video application or a presentation application).
Optionally, referring to fig. 5a, fig. 5a is a schematic interface diagram of a first electronic device according to an embodiment of the present application. As shown in fig. 5a, a plurality of applications, such as a video application (APP), may be installed on the first electronic device 100. The video APP can be used to watch videos and live broadcasts, and to read novels and/or comics. The video APP at least has a screen projection transmission function. It may be pre-installed when the first electronic device 100 leaves the factory, or installed after being downloaded by the user, and it may be developed by the manufacturer of the first electronic device 100 or by a third-party vendor.
As shown in fig. 5a, the interface of the first electronic device may further include: a status bar, and icons for a plurality of other applications, such as an icon for a social APP, etc. The status bar may include a WiFi icon, a signal strength, a current remaining power, a current time, and/or the like. In some possible implementations, the status bar may also include a bluetooth icon, an alarm icon, and the like. When the first electronic device 100 detects a click event of a finger or a stylus of a user for an application icon, in response to the click event, the first electronic device 100 starts an application and displays a user interface corresponding to the application icon. For example, when the first electronic device 100 detects that a finger of the user touches the icon 501 of the video APP, the first electronic device 100 starts the video APP in response to the touch event and displays a main interface of the video APP. The user can select a video on the main interface of the video APP, and accordingly, the first electronic device 100 can receive the video selected by the user and display the video playing interface.
Optionally, in an embodiment, after the first electronic device 100 starts the video APP, the user may search for or select the video to be played in the main interface of the video APP; the first electronic device 100 searches the cloud platform for the video name entered by the user in the search bar and displays icons of the videos found. After selecting the video to be played, the user can click its icon to enter the video playing interface. The video APP of the first electronic device 100 acquires the resource (e.g., a video stream) of the selected video from the cloud platform, and parses and outputs its video images. Referring to fig. 5b, fig. 5b is a schematic diagram of a video playing interface of the video APP of a first electronic device according to an embodiment of the present application. Illustratively, the video selected by the user is the first episode of "lovely oneself", and the corresponding video playing interface 50 may be as shown in fig. 5b. Illustratively, the video playing interface 50 may include a video image 504, a screen projection control 502, an episode selection control 506, a return control 503, a full screen control 505, and the like.
In addition to the elements and controls described above, the video playing interface 50 may include more content, such as a cache control and a comment control. The user can touch the cache control on the video playing interface 50 with a finger to download the video, or touch the comment control on the video playing interface 50 to view and post comments, and the like. It can be understood that the description of the interface is only an example; different video APPs or different videos may differ in the corresponding video playing interface, in the elements and controls included in the full-screen playing interface, in the arrangement of those elements and controls, and the like, which is not limited herein.
If the user wants to cast the video to the second electronic device 200 for playing, the user can click the screen-casting control 502 on the video playing interface 50 with a finger or a stylus, and accordingly, the first electronic device 100 can receive the screen-casting indication generated by the click on the screen-casting control 502. The first electronic device 100 may detect whether it has currently established a screen-cast connection. When the first electronic device 100 detects that no screen projection connection is currently established, the first electronic device 100 may search for screen projection devices, display the devices found, and prompt the user to select one.
For example, when the first electronic device 100 detects that no screen-casting connection is currently established, the first electronic device 100 searches, through the wireless network, for one or more screen-casting devices connected to that wireless network. A screen projection device in the embodiments of the application is an electronic device with screen projection transmission (Source)/reception (Sink) capabilities. When the first electronic device 100 finds one or more screen projection devices, referring to fig. 5c, fig. 5c is a schematic diagram of a screen-projection device selection interface according to an embodiment of the present application. As shown in fig. 5c, the first electronic device 100 may display a search/selection box 507 on the interface, where the search/selection box 507 includes the names of the one or more screen projection devices found, to prompt the user to select one of them to establish a screen projection connection. Illustratively, the search/selection box 507 includes the names of the devices found with screen projection reception capability: the television of office 1, the television of office 2, and my computer. After the user selects a device to project to, the first electronic device 100 establishes a screen projection connection with the selected device through the Miracast protocol, a wireless display standard in the wireless video display (WFD) technology. Illustratively, the screen-casting device selected by the user is the television of office 1, and the first electronic device 100 establishes a screen-casting connection with that television. For ease of description, the screen projection device selected by the user is referred to below as the second electronic device 200.
It should be noted that the screen-projection device selection interface may further include a refresh control 509 and an exit control 508, which is not limited in the present application.
Optionally, in another embodiment, the user may also start the wireless screen projection function of the first electronic device 100 by clicking on another control on the first electronic device 100, which is not limited in this application.
Next, a process for establishing a screen-casting connection between the first electronic device and the second electronic device is described.
Optionally, in an embodiment, the user may establish a wired screen-casting connection between the first electronic device 100 and the second electronic device 200 through a High Definition Multimedia Interface (HDMI).
Optionally, in another embodiment, the first electronic device 100 may establish a screen projection connection with the second electronic device 200 through a signal distribution terminal; in this embodiment, the signal distribution terminal may include a signal distributor, which is not limited in this application. During operation, the electronic device serving as the signal distribution terminal can run a certain application program to receive the screen projection data sent by the first electronic device and further distribute the screen projection data to the second electronic device, thereby realizing the screen projection function in this application. The application program may be a screen projection application dedicated to the screen projection function, or another application that includes a screen projection function.
It should be noted that commonly used electronic devices can already support a wireless screen projection service by conventional processing methods, for example, by configuring an intelligent device to support a wireless screen-casting related protocol. Currently common protocols include Miracast, DLNA (Digital Living Network Alliance), the AirPlay protocol, and the like; a conventional processing manner is to install, on the electronic device, a wireless screen-casting application compatible with its intelligent operating system, so that the intelligent device can support the wireless screen-casting service. Of course, the electronic devices (the first electronic device and the second electronic device) may be configured in other ways to support the wireless screen projection service, which is not particularly limited in this application.
Optionally, in this embodiment, after the first electronic device establishes the screen projection connection with the second electronic device, the first electronic device may send screen projection data to the second electronic device, and the second electronic device may display the screen projection content corresponding to the screen projection data sent by the first electronic device.
Optionally, in this embodiment, after the first electronic device establishes the screen-projecting connection with the second electronic device, the first electronic device may indirectly send the screen-projecting data to the second electronic device: the first electronic device may send the screen projection data to the signal distribution terminal, and then the signal distribution terminal further sends the screen projection data to the second electronic device.
In fact, the present application only explains how the screen projection data originates from the first electronic device and is finally obtained and presented by the second electronic device; how the screen projection data is transmitted from the first electronic device to the second electronic device is not limited in this application.
The process of establishing the screen-casting connection between the first electronic device 100 and the second electronic device 200 has been described above; next, how the user interacts, based on operations on the first electronic device, with the screen-casting content displayed by the second electronic device during the screen-casting process is described.
Referring to fig. 3b, fig. 3b is a flowchart of an information processing method according to an embodiment of the present application, and as shown in fig. 3b, the information processing method includes:
301. The first electronic device generates screen projection content.
In this embodiment of the application, the first electronic device may generate the screen-projecting content after detecting that the first electronic device establishes the screen-projecting connection with the second electronic device.
In the embodiment of the application, the screen projection content may include, but is not limited to, a cursor and interface content of the first electronic device.
Optionally, in this embodiment of the application, the first electronic device may obtain interface content of a front-end application of the first electronic device, and generate a cursor on the interface content to obtain screen projection content.
The shape of the cursor may be a mouse pointer shape or another shape, which is not limited in this application. It should be noted that the cursor is used to locate an operation position in the interface content, and the cursor can move on the display screen of the second electronic device in response to the user's operation on the first electronic device (changing its posture information or sliding on its display screen). How the cursor moves on the display screen of the second electronic device based on the user's operation on the first electronic device will be described in the following embodiments and is not described here again.
Specifically, the first electronic device may obtain the interface content of the current front-end application. For example, the first electronic device may obtain the interface content of the current front-end application based on a screen recording interface provided by the system (e.g., the MediaProjection interface provided by Android) and draw a cursor on the obtained interface content. The first electronic device may use the drawn content as the screen projection content, and the screen projection service of the first electronic device may obtain the screen projection content and send it (possibly after an encoding operation and/or size transformation of the content) to the second electronic device based on a certain screen projection protocol, so that the display screen of the second electronic device displays the screen projection content.
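For illustration only, a minimal Kotlin sketch of such a capture-and-compose step on Android is given below. It assumes a MediaProjection has already been obtained through the system's screen-capture consent flow; the function name startCastCapture, the cursorPos callback, and the frame hand-off onFrame are assumptions of this sketch, not part of the embodiment.

```kotlin
// Illustrative sketch: capture the front-end UI via MediaProjection and composite
// a cursor bitmap onto each frame before it is encoded and sent to the sink.
import android.graphics.Bitmap
import android.graphics.Canvas
import android.graphics.PixelFormat
import android.hardware.display.DisplayManager
import android.media.ImageReader
import android.media.projection.MediaProjection

fun startCastCapture(
    projection: MediaProjection, width: Int, height: Int, dpi: Int,
    cursor: Bitmap,                       // cursor image drawn on top of each frame
    cursorPos: () -> Pair<Float, Float>,  // current cursor position (kept elsewhere)
    onFrame: (Bitmap) -> Unit             // downstream: encode + send via cast protocol
) {
    val reader = ImageReader.newInstance(width, height, PixelFormat.RGBA_8888, 2)
    projection.createVirtualDisplay(
        "castCapture", width, height, dpi,
        DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
        reader.surface, null, null
    )
    reader.setOnImageAvailableListener({ r ->
        val image = r.acquireLatestImage() ?: return@setOnImageAvailableListener
        try {
            val plane = image.planes[0]
            val frame = Bitmap.createBitmap(
                plane.rowStride / plane.pixelStride, height, Bitmap.Config.ARGB_8888
            )
            frame.copyPixelsFromBuffer(plane.buffer)
            val (cx, cy) = cursorPos()
            Canvas(frame).drawBitmap(cursor, cx, cy, null) // draw cursor at its position
            onFrame(frame)
        } finally {
            image.close()
        }
    }, null)
}
```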
Optionally, in this embodiment of the application, the first electronic device may instead add the image information corresponding to the cursor to the interface content of the front-end application of the first electronic device based on a floating window interface, to generate the screen projection data. In this case, the screen projection service of the first electronic device may obtain the screen projection data and send it (which may require encoding and/or size transformation of the content) to the second electronic device based on a certain screen projection protocol, so that the display screen of the second electronic device displays the corresponding screen projection content.
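A minimal sketch of this floating-window alternative follows, assuming the overlay permission has been granted; the helper name addCursorOverlay is illustrative. The overlay view is made non-touchable so that it does not intercept the user's touch operations.

```kotlin
// Illustrative sketch: add a cursor as a floating window over the front-end UI,
// so the cursor is part of whatever the screen projection service captures.
import android.content.Context
import android.graphics.PixelFormat
import android.view.Gravity
import android.view.WindowManager
import android.widget.ImageView

fun addCursorOverlay(context: Context, cursorDrawable: Int): ImageView {
    val wm = context.getSystemService(Context.WINDOW_SERVICE) as WindowManager
    val cursorView = ImageView(context).apply { setImageResource(cursorDrawable) }
    val params = WindowManager.LayoutParams(
        WindowManager.LayoutParams.WRAP_CONTENT,
        WindowManager.LayoutParams.WRAP_CONTENT,
        WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY, // needs overlay permission
        WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE or
            WindowManager.LayoutParams.FLAG_NOT_TOUCHABLE,   // must not intercept touches
        PixelFormat.TRANSLUCENT
    ).apply { gravity = Gravity.TOP or Gravity.START }
    wm.addView(cursorView, params)
    return cursorView // move later by updating params.x/params.y + updateViewLayout()
}
```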
It should be noted that, after the second electronic device establishes connection with the first electronic device, the cursor may be directly displayed on the display screen of the second electronic device, or the cursor may not be directly displayed on the display screen of the second electronic device, but the cursor is displayed on the display screen of the second electronic device only after the user performs a certain operation on the first electronic device.
Specifically, after the second electronic device establishes a connection with the first electronic device, the second electronic device may directly display a cursor in a central area of the displayed screen-shot content or in another preset area.
Alternatively, after the second electronic device establishes the connection with the first electronic device, the cursor may not be displayed immediately in the central area of the displayed screen-projected content or another predetermined area; instead, the attitude information of the first electronic device may be monitored, and the cursor is displayed in the central area of the displayed screen-projected content or the other predetermined area once a change in the attitude information of the first electronic device is detected. From the user's perspective, if the user does not see the cursor on the display screen of the second electronic device, the user can change the attitude information of the first electronic device by waving it, thereby triggering the display of the cursor on the second electronic device.
Optionally, in an embodiment, after the first electronic device and the second electronic device establish the screen-casting connection, the cursor may not be displayed on the second electronic device at first, and the user may activate the cursor display of the second electronic device by performing a touch operation on the display screen of the first electronic device, where the touch operation may be a click operation, a double-click operation, or a long-press operation in a preset area of the display screen of the first electronic device, which is not limited in this application.
Illustratively, referring to fig. 6a, fig. 6a is a schematic view of an actual screen projection scene. As shown in fig. 6a, the user holds the first electronic device 100, and the second electronic device 200 displays the screen projection content 60 sent by the first electronic device 100, where the screen projection content 60 includes the current interface content of the first electronic device 100 (e.g., the interface shown in fig. 4a). It should be noted that the interface content of the first electronic device 100 may be, but is not limited to, the interface content of the front-end application.
Optionally, in another embodiment, based on what is shown in fig. 5c, after the user selects the screen projection device, referring to fig. 6d, fig. 6d is a schematic view of an actual screen projection scene. As shown in fig. 6d, the second electronic device 200 displays the screen projection content 60 sent by the first electronic device 100, where the screen projection content 60 includes the current interface content of the first electronic device 100 (the video playing interface shown in fig. 5b), and the screen projection content 60 further includes a cursor 601.
In the embodiment of the application, after the first electronic device detects that the screen projection connection with the second electronic device is established, the display brightness of the display screen of the first electronic device can be reduced, or the screen-off operation is executed on the display screen of the first electronic device.
Optionally, in this embodiment of the application, referring to fig. 6b, fig. 6b is a schematic view of an actual screen projection scene. As shown in fig. 6b, the interface content of the first electronic device 100 held by the user is the interface shown in fig. 4a (the main interface of the first electronic device); after the screen projection connection is established between the first electronic device and the second electronic device, as shown in fig. 6c, the first electronic device turns off its screen. It should be noted that at this time the first electronic device does not display the page of fig. 4a: fig. 4a is only the page that the first electronic device would otherwise present, and what the first electronic device actually displays is fig. 6c.
Optionally, in another embodiment, after the first electronic device and the second electronic device establish the screen-casting connection, a control for choosing whether to enter the screen-off state may be displayed on the first electronic device (or offered through another predetermined operation), and the user may cause the first electronic device to enter the screen-off state by clicking the control for entering the screen-off state.
In the prior art, the first electronic device needs to remain in a screen-on state during screen projection. In contrast, in the embodiment of the application, after the screen projection connection is established between the first electronic device and the second electronic device, the first electronic device can reduce the display brightness of its display screen or directly enter a screen-off state, so that the energy consumption of the first electronic device is reduced.
It should be noted that, in this embodiment of the application, after the first electronic device detects that the screen projection connection is established with the second electronic device, the display brightness of the display screen of the first electronic device may be reduced through a brightness adjustment interface of the system, or a screen-off operation may be performed on the display screen of the first electronic device.
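For illustration only, a window-level brightness reduction on Android might look like the following sketch; the dimming level of 0.05 is an arbitrary assumption.

```kotlin
// Illustrative sketch: dim the local display via the window-level brightness
// attribute once casting starts; 0.05f is an arbitrary example level.
import android.app.Activity

fun dimForCasting(activity: Activity, level: Float = 0.05f) {
    val lp = activity.window.attributes
    lp.screenBrightness = level.coerceIn(0f, 1f) // 0.0..1.0 overrides system brightness
    activity.window.attributes = lp
}
```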
It should be noted that, in this embodiment of the application, the first electronic device is only in the screen-off state; the applications in the first electronic device are still running, and the screen projection service in the first electronic device can still obtain the interface content of the current front-end application.
302. The first electronic device sends the screen projection content to a second electronic device so that a display screen of the second electronic device displays the screen projection content, wherein the screen projection content comprises a cursor and interface content of the first electronic device, and the cursor is used for being positioned at an operation position in the interface content.
In this embodiment of the application, after generating the screen projection content, the first electronic device may send the screen projection content to the second electronic device, so that the display screen of the second electronic device displays the screen projection content, where the screen projection content includes a cursor and interface content of the first electronic device, and the cursor is used to be positioned at an operation position in the interface content.
For the implementation of step 302, reference may be made to the description in the foregoing embodiment of the first electronic device sending the screen projection data to the second electronic device, and details are not described here again.
In this embodiment of the application, to reduce the energy consumption of the first electronic device, after the first electronic device and the second electronic device establish the screen projection connection, the first electronic device enters the screen-off state. At this time, the display screen of the first electronic device is black, and the user cannot operate an object on the first electronic device by operating on the interface content. For example, in the scene shown in fig. 6d, the user wants to click the return control or drag the progress bar; however, because the display screen of the first electronic device is black, the user cannot locate the return control and the progress bar of the first electronic device on its display screen, and therefore cannot click the return control or drag the progress bar. Next, how the user operates the screen projection content displayed by the second electronic device while the first electronic device is in the screen-off state is discussed in this embodiment of the application.
First, how the user adjusts the display position of the cursor on the display screen of the second electronic device is described.
In this embodiment, the cursor displayed by the second electronic device may be used to locate the operation position in the interface content, so that the user may change the operation position by changing the display position of the cursor on the display screen of the second electronic device, and it is described next how the user changes the display position of the cursor on the display screen of the second electronic device.
In the embodiment of the application, the user can adjust the display position of the cursor on the display screen of the second electronic device by changing the posture of the first electronic device, and can also adjust the display position of the cursor on the display screen of the second electronic device by executing sliding operation on the display screen of the first electronic device.
First, adjusting the display position of the cursor on the display screen of the second electronic device by changing the posture of the first electronic device is described.
In this embodiment of the application, the first electronic device may acquire pose change information of the first electronic device, and cause the cursor to move in the screen projection content displayed by the second electronic device based on the pose change information.
In the embodiment of the application, a cursor 601 is displayed on the screen projection content of the second electronic device, and the cursor 601 can perform corresponding displacement along with the change of the posture of the first electronic device. Specifically, the user may change the posture of the first electronic device in the three-dimensional space by waving the first electronic device, and then the cursor 601 on the second electronic device may perform corresponding displacement on the screen projection content of the second electronic device according to the posture change of the first electronic device in the three-dimensional space.
Optionally, referring to fig. 7a, fig. 7a is a schematic view of an actual screen projection scene. As shown in fig. 7a, the user may change the posture (horizontal direction angle and/or vertical direction angle) of the first electronic device by waving it, and accordingly, the cursor 601 displayed on the second electronic device 200 is displaced in the horizontal direction and/or the vertical direction.
Next, how the cursor 601 displayed on the second electronic device 200 is displaced in the horizontal direction and the vertical direction, respectively, will be described.
1) Horizontal direction:
In this embodiment, the first electronic device may determine the horizontal displacement of the cursor according to the second rotation angle, and move the cursor in the screen-projected content displayed by the second electronic device according to the horizontal displacement.
Specifically, referring to fig. 7b, fig. 7b is a schematic diagram of a posture change of the first electronic device rotating in the horizontal direction. As shown in fig. 7b, the first electronic device rotates from a posture 1 to a posture 2 in the horizontal plane, and the angle change component of the rotation is θ1. In this embodiment of the application, the first electronic device 100 may monitor its posture change in three-dimensional space and obtain a spatial orientation parameter; taking the posture change shown in fig. 7b as an example, the first electronic device 100 may obtain the spatial orientation parameter (the angle change component of the horizontal rotation) θ1.
In this embodiment, the first electronic device may obtain a mapping relationship between the angle change component of the horizontal rotation and the horizontal displacement L1 of the cursor 601 in the second electronic device, where the mapping relationship may indicate that the larger the angle change component of the horizontal rotation of the first electronic device, the larger the horizontal displacement L1 of the cursor 601 in the second electronic device. The first electronic device may determine the displacement L1 of the cursor 601 in the screen projection content of the second electronic device based on the acquired angle change component of the horizontal rotation and the mapping relationship.
For example, the mapping relationship between the angle change component of the horizontal rotation acquired by the first electronic device and the horizontal displacement of the cursor 601 in the second electronic device is as follows: every time the first electronic device rotates 1 ° on the horizontal plane, the horizontal pixel coordinate of the cursor 601 in the screen projection content of the second electronic device changes by 30 pixels.
For example, if the user waves the first electronic device such that the first electronic device rotates by 15 ° on a horizontal plane, the horizontal pixel coordinate of the cursor 601 in the screen projection content of the second electronic device changes by 450 pixels.
It should be noted that, in the process of establishing the screen-casting connection between the first electronic device 100 and the second electronic device 200, the first electronic device 100 and the second electronic device 200 may perform exchange and negotiation of performance parameters, that is, the first electronic device 100 may obtain size parameters of interface contents on the second electronic device 200, and the like. The first electronic device may adjust the mapping relationship based on the size parameter of the interface content on the second electronic device 200. For example, the larger the horizontal size of the interface content on the second electronic device 200, the larger the horizontal pixel coordinate change of the cursor 601 in the screen projection content of the second electronic device when the first electronic device rotates in the horizontal direction by the same angular change component.
It should be noted that the first electronic device may rotate left or right (i.e., rotate counterclockwise or clockwise from the perspective of the main interface of the first electronic device), and accordingly, the cursor 601 may shift left or right in the screen-projected content of the second electronic device. For example, the first electronic device may be rotated to the left and, correspondingly, the cursor 601 may be displaced to the left in the screen-projected content of the second electronic device. For another example, the first electronic device may be rotated to the right, and accordingly, the cursor 601 may be displaced to the right in the screen-projected content of the second electronic device.
Referring to fig. 7c, fig. 7c is a displacement diagram of the cursor displayed by the second electronic device. As shown in fig. 7c, where the angle change component of the horizontal rotation in three-dimensional space acquired by the first electronic device is θ1, the first electronic device may determine, according to the above mapping relationship, that the horizontal pixel coordinate variation of the cursor 601 in the screen projection content of the second electronic device is L1, and transmit information carrying the horizontal pixel coordinate variation L1 to the second electronic device; accordingly, the second electronic device may change the display position of the cursor 601 based on the horizontal pixel coordinate variation L1. Since the first electronic device rotates leftwards (anticlockwise) when rotating from posture 1 to posture 2, the cursor 601 is also displaced leftwards on the screen projection content of the second electronic device, ensuring consistency between the movement of the cursor 601 and the operation of the user.
2) Vertical direction:
In this embodiment, the first electronic device may determine the vertical displacement of the cursor according to the second rotation angle, and move the cursor in the screen-projected content displayed by the second electronic device according to the vertical displacement.
Similarly, referring to fig. 7d, fig. 7d is a schematic view of an actual screen projection scene. As shown in fig. 7d, the user may change the posture (vertical direction angle) of the first electronic device by waving it, and accordingly, the cursor 601 displayed on the second electronic device 200 is displaced in the vertical direction.
The change of the vertical angle of the first electronic device in three-dimensional space may refer to the angle change component of the first electronic device rotating in the vertical direction in three-dimensional space. Referring to fig. 7e, fig. 7e is a schematic diagram of a posture change of the first electronic device rotating in the vertical direction. As shown in fig. 7e, the first electronic device rotates from a posture 1 to a posture 2 in the vertical direction, and the angle change component of the rotation is θ2. In this embodiment of the application, the first electronic device 100 may monitor its posture change in three-dimensional space and obtain a spatial orientation parameter; taking the posture change shown in fig. 7e as an example, the first electronic device 100 may obtain the spatial orientation parameter (the angle change component of the vertical rotation) θ2.
In this embodiment of the application, the first electronic device may obtain a mapping relationship between the angle change component of the vertical direction rotation and a vertical displacement L2 of the cursor 601 in the second electronic device, and determine a displacement L2 of the cursor 601 in the screen projection content of the second electronic device based on the obtained angle change component of the vertical direction rotation and the mapping relationship.
For how the first electronic device determines the displacement size L2 of the cursor 601 in the screen-projected content of the second electronic device, reference may be made to the description in the above embodiments, and details are not repeated here.
Referring to fig. 7f, fig. 7f is a displacement diagram of the cursor displayed by the second electronic device. As shown in fig. 7f, where the angle change component of the vertical rotation in three-dimensional space acquired by the first electronic device is θ2, the first electronic device may determine, according to the mapping relationship, that the vertical pixel coordinate change of the cursor 601 in the screen projection content of the second electronic device is L2, and send information carrying the vertical pixel coordinate change L2 to the second electronic device; accordingly, the second electronic device may change the display position of the cursor 601 based on the vertical pixel coordinate change L2. Since the first electronic device rotates upward when rotating from posture 1 to posture 2, the cursor 601 is also displaced upward on the screen projection content of the second electronic device, ensuring consistency between the movement of the cursor 601 and the user operation.
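For illustration only, the angle-to-pixel mapping described above can be expressed as a simple linear function, as in the Kotlin sketch below; it uses the example factors of 30 pixels per degree horizontally and 50 pixels per degree vertically, which in practice could be renegotiated from the size parameters of the second electronic device.

```kotlin
// Illustrative sketch: map the angle change components θ1 (horizontal) and
// θ2 (vertical) to cursor pixel displacements with per-axis scale factors.
data class CursorDelta(val dxPx: Float, val dyPx: Float)

fun cursorDeltaFromRotation(
    horizontalDeg: Float,       // θ1: rotation in the horizontal plane
    verticalDeg: Float,         // θ2: rotation in the vertical plane
    pxPerDegH: Float = 30f,     // example: 30 px per degree horizontally
    pxPerDegV: Float = 50f      // example: 50 px per degree vertically
): CursorDelta = CursorDelta(horizontalDeg * pxPerDegH, verticalDeg * pxPerDegV)

// e.g. a 15° horizontal rotation yields a 450-pixel horizontal cursor move,
// matching the example above.
```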
Optionally, in another embodiment, the first electronic device may acquire the displacement of the first electronic device in a first horizontal direction, and determine the horizontal displacement of the cursor on the interface content based on the translation of the first electronic device in that horizontal direction. The first horizontal direction may be a horizontal direction parallel to the display screen of the second electronic device.
In this embodiment of the application, the first electronic device may obtain a mapping relationship between the displacement in the horizontal direction and the horizontal displacement of the cursor 601 in the second electronic device, and determine the horizontal displacement of the cursor 601 in the screen projection content of the second electronic device based on the obtained displacement in the horizontal direction and the mapping relationship.
Optionally, in another embodiment, the first electronic device may acquire the displacement of the first electronic device in a first vertical direction, and determine the vertical displacement of the cursor on the interface content based on the translation of the first electronic device in that vertical direction. The first vertical direction may be a vertical direction parallel to the display screen of the second electronic device.
In the embodiment of the application, the first electronic device may obtain a mapping relationship between the displacement in the vertical direction and the vertical displacement of the cursor 601 in the second electronic device, and determine the vertical displacement of the cursor 601 in the screen projection content of the second electronic device based on the obtained displacement in the vertical direction and the mapping relationship.
In practical applications, the first electronic device may monitor the posture change and obtain the spatial orientation parameter through a built-in motion sensor (e.g., a gravitational acceleration sensor or a gyroscope), or through an infrared spatial detection technique or an acoustic detection technique. This embodiment does not specifically limit the means by which the first electronic device acquires the spatial orientation parameter.
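For illustration only, a sketch of accumulating gyroscope readings into the angle change components θ1 and θ2 is given below; the assignment of device axes to the horizontal and vertical components is an assumption that depends on how the device is held.

```kotlin
// Illustrative sketch: integrate gyroscope readings into the angle change
// components used above; the axis conventions are assumptions of this sketch.
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

class PoseTracker(
    context: Context,
    private val onAngles: (degH: Float, degV: Float) -> Unit
) : SensorEventListener {
    private val sm = context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private var lastTs = 0L

    fun start() {
        sm.registerListener(
            this, sm.getDefaultSensor(Sensor.TYPE_GYROSCOPE),
            SensorManager.SENSOR_DELAY_GAME
        )
    }

    fun stop() = sm.unregisterListener(this)

    override fun onSensorChanged(e: SensorEvent) {
        if (lastTs != 0L) {
            val dt = (e.timestamp - lastTs) * 1e-9f // timestamps are in nanoseconds
            // values[] holds angular speed in rad/s around the device axes;
            // integrate over dt and convert to degrees.
            val degH = Math.toDegrees((e.values[2] * dt).toDouble()).toFloat()
            val degV = Math.toDegrees((e.values[0] * dt).toDouble()).toFloat()
            onAngles(degH, degV)
        }
        lastTs = e.timestamp
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) {}
}
```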
Secondly, the user can adjust the position of the cursor 601 in the screen-projected content of the second electronic device by performing a sliding operation on the display screen of the first electronic device.
In this embodiment, the first electronic device may receive a second sliding operation on the display screen of the first electronic device, determine a displacement of the cursor according to the second sliding operation, and move the cursor in the screen-projected content displayed by the second electronic device based on the displacement.
It should be noted that the track of the second sliding operation passes through one or more objects in the interface content of the first electronic device, where the "one or more objects" may be all objects that the track of the second sliding operation passes through.
Specifically, in the embodiment of the present application, a cursor 601 may be displayed on the screen projection content of the second electronic device, and the cursor 601 may perform corresponding displacement based on a sliding operation of the user on the display screen (in the screen-off state) of the first electronic device. Specifically, the user may slide on the display screen by using a finger or a touch pen, and accordingly, the cursor 601 on the second electronic device may perform corresponding displacement on the screen-projected content of the second electronic device based on the sliding operation of the user.
Optionally, referring to fig. 7g, fig. 7g is a schematic view of an actual screen projection scene. As shown in fig. 7g, the user may slide on the display screen with a finger or a stylus, and accordingly, the cursor 601 on the second electronic device performs a corresponding displacement on the screen projection content of the second electronic device based on the sliding operation.
In this embodiment of the application, the first electronic device may obtain a mapping relationship between a sliding displacement L3 of the user on the display screen of the first electronic device and a displacement L4 of the cursor 601 in the second electronic device, and determine the displacement L4 of the cursor 601 in the screen projection content of the second electronic device based on the acquired sliding displacement L3 on the display screen of the first electronic device and the mapping relationship.
For example, the mapping relationship between the sliding displacement L3 on the display screen of the first electronic device and the displacement L4 of the cursor 601 in the second electronic device may be as follows: every time the user slides 1 pixel on the display screen of the first electronic device, the pixel coordinates of the cursor 601 in the screen projection content of the second electronic device change by 30 pixels. The above is merely an example, and the present application is not limited thereto.
It should be noted that, in the process of establishing the screen-casting connection between the first electronic device 100 and the second electronic device 200, the first electronic device 100 and the second electronic device 200 may perform exchange and negotiation of performance parameters, that is, the first electronic device 100 may obtain size parameters of interface contents on the second electronic device 200, and the like. The first electronic device may adjust the mapping relationship based on the size parameter of the interface content on the second electronic device 200. For example, the larger the horizontal size of the interface content on the second electronic device 200, the larger the pixel displacement (pixel coordinate change) of the cursor 601 in the screen projection content of the second electronic device in the case where the user slides the same displacement on the display screen of the first electronic device.
It should be noted that the sliding displacement of the user on the display screen of the first electronic device may include displacements in two mutually perpendicular directions (the x direction and the y direction). Referring to fig. 7h, fig. 7h is a schematic diagram of a sliding operation of the user; accordingly, as shown in fig. 7i, the sliding displacement L3 of the user on the display screen of the first electronic device may include a displacement L5 in a first direction and a displacement L6 in a second direction, where the first direction is perpendicular to the central axis of the first electronic device, and the second direction is parallel to the central axis of the first electronic device. Accordingly, the pixel displacement of the cursor 601 in the screen projection content of the second electronic device includes a displacement component in the horizontal direction and a displacement component in the vertical direction. The first electronic device may then determine, based on the above mapping relationship, the horizontal displacement and the vertical displacement of the cursor 601 in the screen projection content of the second electronic device respectively.
Referring to fig. 7j, fig. 7j is a displacement diagram of the cursor displayed by the second electronic device. As shown in fig. 7j, when the first electronic device acquires the sliding displacement L3 of the user on its display screen, the first electronic device may determine, according to the mapping relationship, that the displacement of the cursor 601 in the screen projection content of the second electronic device is L4, and send information carrying L4 to the second electronic device; accordingly, the second electronic device may change the display position of the cursor 601 based on L4. Since the user slides obliquely upward to the right on the display screen of the first electronic device, the cursor 601 also moves obliquely upward to the right on the screen projection content of the second electronic device, ensuring consistency between the movement of the cursor 601 and the user operation.
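For illustration only, the slide-to-cursor mapping can be sketched as follows, using the example factor of 30 sink pixels per source pixel; the function and parameter names are illustrative.

```kotlin
// Illustrative sketch: decompose a slide on the (screen-off) source display into
// its x/y components (L5, L6) and scale them to a cursor displacement on the sink.
fun cursorDeltaFromSlide(
    slideDxPx: Float,       // L5: component perpendicular to the central axis
    slideDyPx: Float,       // L6: component parallel to the central axis
    scale: Float = 30f      // example: 1 source pixel -> 30 sink pixels
): Pair<Float, Float> = Pair(slideDxPx * scale, slideDyPx * scale)
```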
In this embodiment of the application, the operation position corresponding to the second sliding operation corresponds to a third object in the interface content of the first electronic device, and the first electronic device may shield the response of the third object to the second sliding operation.
In the embodiment of the present application, through the above manners, the user can move the cursor 601 on the display screen of the second electronic device based on operations on the first electronic device (changing its posture or sliding on its display screen).
Next, how the user performs operations such as clicking and sliding on the object on which the cursor is located by operating the first electronic device will be described.
In this embodiment of the application, when the cursor moves to a target object in the screen-projected content displayed by the second electronic device, if the first electronic device receives a first touch operation on the display screen of the first electronic device, the target object in the screen-projected content displayed by the second electronic device is made to respond to the first touch operation.
Illustratively, the first touch operation may be a click operation or a slide operation.
In this embodiment, the operation position of the first touch operation corresponds to a second object in the interface content of the first electronic device, and the first electronic device may shield a response of the second object to the first touch operation.
Specifically, after receiving the first touch operation on the display screen of the first electronic device, the first electronic device may shield the response of the interface content to the operation position of the first touch operation, and determine the position of the cursor on the display screen of the second electronic device to determine the response position of the interface content to the first touch operation. For example, if the user performs a click operation on an object A on the display screen of the first electronic device while the cursor is located on an object B in the second electronic device, the first electronic device may mask the response of object A to the click operation and instead respond to the click operation on object B of the interface content.
The first touch operation may be, but is not limited to, a click operation or a slide operation, and the first touch operation will be described as an example of the click operation.
In the embodiment of the application, after the user moves the cursor 601 to the area where the target object desired to be operated is located on the screen-projecting content displayed by the second electronic device, the user may perform a click operation on the display screen of the first electronic device, and accordingly, the click operation may be performed in the area where the cursor 601 is located.
Specifically, in an embodiment, after the cursor 601 moves to the target area desired to be operated on the screen projection content of the second electronic device, the first electronic device may acquire a specific pixel coordinate position of the cursor 601 in the screen projection content of the second electronic device, and determine, according to the specific pixel coordinate position of the cursor 601 in the screen projection content of the second electronic device, a pixel coordinate position corresponding to the position in the interface content applied at the front end of the first electronic device.
For example, if the user moves the cursor 601 in the second electronic device into the area of the video APP icon, at this time, the first electronic device may determine a pixel coordinate position (pixel coordinate position of the area of the video APP icon) corresponding to the interface content of the cursor 601 in the first electronic device.
When the user performs a click operation on the display screen of the first electronic device, the first electronic device may shield the response of the interface content of its front-end application to the operation position of the click operation, and may execute an event corresponding to the click operation at the pixel coordinate position corresponding to the cursor 601 in the interface content of the first electronic device, which is equivalent to the first electronic device performing the click operation at the pixel coordinate position corresponding to the cursor 601.
For example, if the user moves the cursor 601 in the second electronic device into the area of the video APP icon and performs a click operation on the display screen of the first electronic device, the first electronic device may perform the click operation on the video APP in the current interface content and send the opened interface (screen projection data) to the second electronic device, and the second electronic device may display the received screen projection data.
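For illustration only, one way to replay the user's tap at the cursor's mapped position inside the front-end application is to synthesize a DOWN/UP MotionEvent pair on the application's root view, as in the sketch below; whether the embodiment injects events at the view level or through a lower system layer is not specified, so this is an assumption.

```kotlin
// Illustrative sketch: replay the user's tap at the cursor's mapped coordinates by
// synthesizing a DOWN/UP event pair on the front-end application's root view; the
// tap's original coordinates are simply not dispatched (i.e., masked).
import android.os.SystemClock
import android.view.MotionEvent
import android.view.View

fun injectClickAtCursor(rootView: View, cursorX: Float, cursorY: Float) {
    val t = SystemClock.uptimeMillis()
    val down = MotionEvent.obtain(t, t, MotionEvent.ACTION_DOWN, cursorX, cursorY, 0)
    val up = MotionEvent.obtain(t, t + 50, MotionEvent.ACTION_UP, cursorX, cursorY, 0)
    rootView.dispatchTouchEvent(down) // the view under (cursorX, cursorY) receives it
    rootView.dispatchTouchEvent(up)
    down.recycle()
    up.recycle()
}
```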
Next, the first touch operation will be described as an example of the slide operation.
In this embodiment of the application, the first electronic device may generate a sliding event corresponding to the first sliding operation, and execute the sliding event on a target object in interface content of the first electronic device, so that the target object in the screen-casting content displayed by the second electronic device responds to the first sliding operation.
For example, if the user moves the cursor 601 in the second electronic device into the area of the video APP icon and performs a sliding operation on the display screen of the first electronic device, the first electronic device may perform a dragging operation on the video APP icon in the current interface content.
It should be noted that, in addition to the cursor 601, a menu bar may be displayed on the display screen of the second electronic device.
In the embodiment of the application, the first electronic device can obtain interface content of a front-end application of the first electronic device, and generate a cursor and a menu bar on the interface content to obtain screen projection content; wherein the menu bar does not belong to the interface content of the first electronic device. That is, the menu bar is not the interface content originally on the first electronic device, but is the newly added content.
Specifically, the first electronic device may add a menu bar to the interface content of the front-end application of the first electronic device based on a floating window interface to generate the screen projection content; at this time, the screen projection service of the first electronic device may obtain the screen projection content and send it (which may require encoding and/or size transformation of the content) to the second electronic device based on a certain screen projection protocol, so that the display screen of the second electronic device displays the screen projection content.
Optionally, the first electronic device may also obtain the interface content of the current front-end application. For example, the first electronic device may obtain the interface content of the current front-end application based on a screen recording interface provided by the system (e.g., the MediaProjection interface provided by Android) and draw a menu bar on the obtained interface content. The first electronic device may use the drawn content as the screen projection content, and the screen projection service of the first electronic device may obtain the screen projection content and send it (possibly after an encoding operation and/or size transformation of the content) to the second electronic device based on a certain screen projection protocol, so that the display screen of the second electronic device displays the screen projection content.
In this embodiment, the screen projection content of the second electronic device may further include a menu bar that does not belong to the interface content of the first electronic device, for example, the function control 800 shown in fig. 8a. As shown in fig. 8a, the user may move the cursor 601 on the second electronic device to the area where the function control 800 is located by changing the posture of the first electronic device 100, and perform a click operation 80. In response to the click operation 80, the first electronic device may perform a click operation on the target object where the cursor 601 is located and display a function selection area 801. Accordingly, referring to fig. 8b, fig. 8b is a schematic diagram of screen projection content of the second electronic device provided in this embodiment. As shown in fig. 8b, the second electronic device displays the function selection area 801, where the function selection area 801 may include a sensitivity adjustment control, a mouse size adjustment control, a sliding mode control 802, a brightness adjustment control, and a normal mode control.
Illustratively, the user may switch between the gesture-based interaction mode shown in figs. 7a and 7d and the sliding interaction mode shown in fig. 7g by clicking the sliding mode control 802. As shown in fig. 8b, the user may click the sliding mode control 802, whereupon the mode of changing the position of the cursor 601 based on the posture of the first electronic device is switched to the sliding interaction mode shown in fig. 7g. It should be noted that, in an embodiment, after the switch to the sliding interaction mode, the sliding mode control 802 in the function selection area 801 may be replaced with a gesture mode control; if the user clicks the gesture mode control, the mode is switched from the sliding interaction mode shown in fig. 7g back to the gesture-based interaction mode shown in figs. 7a and 7d.
It should be noted that, after the first electronic device and the second electronic device establish the screen-casting connection, the gesture-based interaction mode shown in fig. 7a and 7d may be used by default, or the sliding interaction mode shown in fig. 7g may also be used by default, which is not limited in the present application.
Referring to fig. 9a, fig. 9a is a schematic diagram of screen projection content of a second electronic device provided in this embodiment of the present application. As shown in fig. 9a, the user may click a sensitivity adjustment control 901, and accordingly, the second electronic device may display a sensitivity adjustment area 90. Referring to fig. 9b, fig. 9b is a schematic diagram of screen projection content of a second electronic device provided in this embodiment of the present application. As shown in fig. 9b, the second electronic device displays the sensitivity adjustment area 90, which may include a sliding control; the user may adjust the sensitivity of the manipulation of the cursor 601 by dragging the sliding control. In addition, the sensitivity adjustment area 90 may further include a sensitivity level prompt. It should be noted that the above arrangement of the interface and controls is only an illustration, and the application is not limited thereto.
Optionally, in the mode in which the position of the cursor 601 is adjusted based on the posture of the first electronic device, the first electronic device may acquire the angle change component of rotation in the horizontal or vertical direction, and determine the displacement of the cursor 601 in the screen projection content of the second electronic device based on the mapping relationship between the horizontal angle change component and the horizontal displacement of the cursor 601 in the second electronic device, and the mapping relationship between the vertical angle change component and the vertical displacement of the cursor 601 in the second electronic device. Both mapping relationships can be adjusted through the sensitivity setting, so that the same posture change of the first electronic device produces a different displacement of the cursor 601 on the second electronic device.
For example, if the user feels that the movement of the cursor 601 is too slow when operating the cursor 601, the user may increase the sensitivity of the cursor 601 by dragging the slide control shown in fig. 9b to the right, and conversely, if the user feels that the movement of the cursor 601 is too fast when operating the cursor 601, the user may decrease the sensitivity of the cursor 601 by dragging the slide control shown in fig. 9b to the left.
Illustratively, at a sensitivity of 40, the horizontal pixel coordinate of the cursor 601 in the screen projection content of the second electronic device changes by 30 pixels for every 1° of rotation of the first electronic device in the horizontal plane, and the vertical pixel coordinate changes by 50 pixels for every 1° of rotation in the vertical plane. By dragging the sliding control shown in fig. 9b to the right, the user can adjust the sensitivity to 58, after which the horizontal pixel coordinate of the cursor 601 changes by 45 pixels for every 1° of horizontal rotation, and the vertical pixel coordinate changes by 75 pixels for every 1° of vertical rotation. It should be noted that the above mapping relationship is only an example and does not constitute a limitation of the present application.
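For illustration only, one simple way a sensitivity slider could rescale the angle-to-pixel factor is linear interpolation between a slow and a fast factor, as sketched below; the endpoint values are assumptions of this sketch, and the concrete numbers in the example above need not follow this exact formula.

```kotlin
// Illustrative sketch: linearly interpolate the px-per-degree factor from the
// sensitivity slider (0..100). The endpoint values are assumptions.
fun pxPerDegree(sensitivity: Int, minPx: Float = 10f, maxPx: Float = 60f): Float {
    val s = sensitivity.coerceIn(0, 100) / 100f
    return minPx + (maxPx - minPx) * s // sensitivity 0 -> minPx, 100 -> maxPx
}

// e.g. with these endpoints, sensitivity 40 gives pxPerDegree(40) = 30f,
// matching the 30 px/° horizontal example above.
```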
Optionally, in an embodiment, after the first electronic device and the second electronic device establish the screen-casting connection, the first electronic device may add the image data corresponding to the function control 800 to the current interface content through the floating window interface, so that the function control 800 is superimposed on the current interface content of the first electronic device. The screen-casting service of the first electronic device may then obtain the screen-casting data including the interface content with the function control 800 and send it to the second electronic device based on a certain screen-casting protocol; accordingly, the interface content including the function control 800 may be displayed on the second electronic device. The first electronic device may obtain the user's operation (sliding on, or changing the posture of, the first electronic device) to change the position of the cursor 601 in the current interface content. When the user moves the cursor 601 on the second electronic device into the range of the function control 800, the first electronic device may determine that the position of the cursor 601 in the current interface content is within that range. If the user then clicks the display screen of the first electronic device, the first electronic device may add the image data corresponding to the function selection area 801 to the current interface content through the floating window interface, so that the function selection area 801 is superimposed on the current interface content of the first electronic device. The screen-casting service of the first electronic device may then obtain the screen-casting data including the interface content with the function selection area 801 and send it to the second electronic device based on a certain screen-casting protocol; accordingly, the interface content including the function selection area 801 may be displayed on the second electronic device.
Taking the function selection area 801 shown in fig. 8b as an example, the function selection area 801 may include a sensitivity adjustment control, a mouse size adjustment control, a sliding mode control 802, a brightness adjustment control, and a normal mode control. The first electronic device may acquire the positions of the controls in the function selection area 801 in the current interface content. If the position of the cursor 601 is within the area of one of the controls in the function selection area 801 and the first electronic device detects a click operation of the user on the display screen, the first electronic device may respond to that operation. For example, as shown in fig. 9a, if the user clicks the sensitivity control, in response to the click operation, the first electronic device may add the image data corresponding to the sensitivity adjustment area 90 to the current interface content through the floating window interface, so that the sensitivity adjustment area 90 is superimposed on the current interface content of the first electronic device and the original function selection area 801 is removed (or the sensitivity adjustment area 90 is superimposed on the function selection area 801). The screen projection service of the first electronic device may then obtain the screen projection data including the interface content with the sensitivity adjustment area 90 and send the screen projection data to the second electronic device based on a certain screen projection protocol; accordingly, the interface content including the sensitivity adjustment area 90, as shown in fig. 9b, is displayed on the second electronic device. The user may then perform a sliding operation on the display screen of the first electronic device to drag the sliding button in the sensitivity adjustment area 90; accordingly, in response to the sliding operation, the first electronic device adds the image data corresponding to the dragged sliding button to the current interface content through the floating window interface, and at the same time modifies the relevant parameter of the sensitivity with which the user operates the cursor 601 based on the extent of the drag. The above description is merely an example and is not intended to limit the present application.
It should be noted that the controls included in the function selection area 801 above are merely illustrative. In practical applications, a screen projection ending control may also be provided; the user may move the cursor on the second electronic device to the screen projection ending control and click the display screen of the first electronic device to trigger the first electronic device to end the screen projection connection with the second electronic device.
Optionally, the user may also trigger the first electronic device to end the screen-casting connection with the second electronic device by other means, such as pressing a power key, which is not limited herein.
Optionally, referring to fig. 9c, fig. 9c is a schematic diagram of screen projection content of a second electronic device provided in this embodiment of the present application. As shown in fig. 9c, the user may click the normal mode control 902; accordingly, the first electronic device may change its interaction mode with the user to the normal interaction mode. At this time, the display screen of the first electronic device may light up, and the user may operate normally on the display screen of the first electronic device.
Optionally, referring to fig. 9d, fig. 9d is a schematic diagram of a user operating a first electronic device according to an embodiment of the present application. As shown in fig. 9d, the user may press a power key of the first electronic device; accordingly, the first electronic device may change its interaction mode with the user to the normal interaction mode. At this time, the display screen of the first electronic device may light up, and the user may operate normally on the display screen of the first electronic device.
The above manners of switching the first electronic device to the normal interaction mode are merely illustrative and do not limit the present application.
The above switching of the first electronic device to the normal interaction mode is described next with reference to a specific scenario.
Referring to fig. 10a, fig. 10a is a schematic diagram of screen projection content of a second electronic device according to an embodiment of the present application. As shown in fig. 10a, when the first electronic device receives a chat message sent by another electronic device, a chat prompt window is displayed on the second electronic device (the data of the prompt window is sent by the first electronic device; for details, refer to the above embodiments, which are not described here again).
As shown in fig. 10a, the user may move the cursor 601 into the area of the prompt window and perform a click operation on the first electronic device. Accordingly, the second electronic device may display a chat interface as shown in fig. 10b (this is implemented by the first electronic device detecting the click operation of the user and executing the click event at the position of the cursor; for details, refer to the foregoing embodiments, which are not described here again). The chat interface includes an input keyboard. At this point, if sentence input were performed by adjusting the position of the cursor displayed on the second electronic device and performing click operations on the first electronic device, the user experience could be poor (since the display screen of the second electronic device may be large, the user is not well adapted to using the input keyboard through the above interaction manner).
Referring to fig. 10c, fig. 10c is an interaction schematic of a first electronic device according to an embodiment of the present application. As shown in fig. 10c, the user may press a power key of the first electronic device; accordingly, the first electronic device may change its interaction mode with the user to the normal interaction mode. At this time, the display screen of the first electronic device may light up, and the user may operate normally on it. Referring to fig. 10d, fig. 10d is an illustration of interface contents of a first electronic device according to an embodiment of the present application. As shown in fig. 10d, the user may operate directly on the input keyboard displayed on the first electronic device. Optionally, after the sentence input is completed, the user may press the power key of the first electronic device again to switch the first electronic device back to the screen-off interaction mode.
It should be noted that the above is only a schematic description of one scenario. In practical applications, whenever a scenario requires operating while facing the display screen of the first electronic device, the user can switch to the normal interaction mode in the above manner.
Optionally, after the screen-casting connection is established between the first electronic device and the second electronic device, a call request (for example, a voice call request or a video call request) from another electronic device may be received. Taking a voice call request as an example, interface content corresponding to the call request of the first electronic device may be displayed on the second electronic device, where the interface content includes a call receiving control and a call rejecting control.
Optionally, the user may move a cursor displayed on the second electronic device to the call receiving control by operating the first electronic device, and click on the display screen of the first electronic device, so that the user may perform a call through the first electronic device.
Optionally, the user may switch the current interaction mode to the normal mobile phone interaction mode by pressing a power key of the first electronic device or by other means, and then click the call receiving control directly on the display screen of the first electronic device, so that the user may conduct the call through the first electronic device.
Next, it is described how, in combination with the sensitivity adjustment above, the orientation of the front end of the first electronic device can be kept as consistent as possible with the display position of the cursor on the second electronic device when the user waves the first electronic device.
In the embodiment of the present application, after the first electronic device establishes the screen-casting connection with the second electronic device, the first electronic device may set the cursor 601 at a preset position in the interface content, for example, at the geometric center of the interface content. Referring to fig. 11a, fig. 11a is an interaction schematic of a first electronic device and a second electronic device. After the screen-casting connection is established, the first electronic device may set the cursor 601 at the geometric center of the interface content; accordingly, the display content of the second electronic device may be as shown in fig. 11a. However, the front end of the first electronic device may not face the center position of the second electronic device when the connection is established; suppose the front end faces to the left of that center. If the user then waves the first electronic device to the right, the display position of the cursor on the second electronic device is displaced to the right, so whenever the user controls the cursor, the orientation of the first electronic device is offset to the left of the cursor. This does not match the ideal operation mode, in which the orientation of the front end of the first electronic device is as consistent as possible with the display position of the cursor on the second electronic device. To solve this technical problem, when the user swings the first electronic device to the right so that its front end faces near the center position of the second electronic device, as shown in fig. 11b, the display position of the cursor on the second electronic device reaches the right boundary. The user may then continue to swing the first electronic device to the right; as shown in fig. 11c, the cursor stays at the right boundary on the second electronic device. After rotating the first electronic device further to the right by a certain angle, the user may rotate it back to the left; as shown in fig. 11d, the cursor on the second electronic device is displaced to the left, and when the front end of the first electronic device faces the center position of the second electronic device, the display position of the cursor is further to the left than before the adjustment. By repeating this process, the user can bring the cursor near the center of the second electronic device while the front end of the first electronic device faces the center of the second electronic device; combined with the sensitivity adjustment, the orientation of the front end of the first electronic device can then be kept as consistent as possible with the display position of the cursor when the user waves the first electronic device.
In this embodiment, the first electronic device may obtain the size of the display area of the display screen of the second electronic device. The first electronic device may then determine, according to that size and the pixel coordinate position of the cursor in the screen projection content displayed on the second electronic device, whether the cursor has moved to a boundary of the display area of the second electronic device; when the first electronic device determines that the cursor has moved to such a boundary, the display position of the cursor on the display screen of the second electronic device stays at that boundary. It should be understood that when the cursor has moved to the left or right boundary of the display area, although it may not exceed that boundary, it may still move up and down. Similarly, when the cursor has moved to the upper or lower boundary of the display area, although it may not exceed that boundary, it may still move left and right. Likewise, when the cursor has moved to a corner point of the display area (the upper left, lower left, upper right, or lower right corner), although it may not exceed the boundary, it may still move in certain directions; for example, when the cursor is at the upper left corner of the display area, it may still move to the right, downward, or down and to the right.
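A minimal sketch of this boundary behaviour, in Kotlin; the cursor type and field names are illustrative and not part of the method.

```kotlin
// Cursor position in the pixel coordinates of the second electronic
// device's display area (illustrative type).
data class Cursor(var x: Float, var y: Float)

// Apply a proposed displacement but clamp the result to the display area,
// so that at a left/right boundary the cursor can still move vertically,
// at an upper/lower boundary it can still move horizontally, and at a
// corner it can still move away from that corner.
fun moveClamped(cursor: Cursor, dx: Float, dy: Float, displayW: Int, displayH: Int) {
    cursor.x = (cursor.x + dx).coerceIn(0f, (displayW - 1).toFloat())
    cursor.y = (cursor.y + dy).coerceIn(0f, (displayH - 1).toFloat())
}
```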
Optionally, in another embodiment, a quick adjustment mechanism for the cursor position may be provided. For example, after the user touches the display screen of the first electronic device for more than a preset time, the first electronic device may initialize the position of the cursor in the interface content in response to that operation.
Illustratively, referring to fig. 11e, fig. 11e is an interaction schematic of a first electronic device and a second electronic device. As shown in fig. 11e, after the first electronic device and the second electronic device are connected by screen projection, the front end of the first electronic device faces to the left of the geometric center of the second electronic device. The user then waves the first electronic device to the right so that its front end points near the center position of the second electronic device; as shown in fig. 11f, the display position of the cursor is then offset from the center position of the second electronic device. At this time, the user may press the display screen of the first electronic device for 5 seconds or longer; as illustrated in fig. 11g, the first electronic device may, in response to the user's long press of 5 s on the display screen, adjust the display position of the cursor to the geometric center of the interface content. In this way, the user can bring the cursor near the center of the second electronic device while the front end of the first electronic device faces the center of the second electronic device; combined with the sensitivity adjustment, the orientation of the front end of the first electronic device can be kept as consistent as possible with the display position of the cursor on the second electronic device when the user waves the first electronic device.
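The long-press reset might be realized along the following lines; the 5000 ms threshold follows the example above, and everything else (names, the way touch timestamps are obtained) is assumed for illustration.

```kotlin
// Illustrative long-press handling: if the finger stayed on the display
// screen of the first electronic device longer than the threshold,
// re-initialize the cursor to the geometric center of the interface content.
const val LONG_PRESS_MS = 5_000L

data class CursorPos(var x: Float, var y: Float)

fun onTouchUp(downTimeMs: Long, upTimeMs: Long, cursor: CursorPos, contentW: Int, contentH: Int) {
    if (upTimeMs - downTimeMs >= LONG_PRESS_MS) {
        cursor.x = contentW / 2f   // geometric center, as in fig. 11g
        cursor.y = contentH / 2f
    }
}
```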
The above method for adjusting the cursor position is only an example, and does not limit the present application.
Optionally, in an embodiment, if the user does not operate the first electronic device within a certain time after the screen-casting connection is established, the cursor displayed on the second electronic device may be hidden; the user may reactivate the cursor display on the second electronic device through a touch operation and/or by changing the posture of the first electronic device.
Next, the screen projection method provided by the present application is further described in conjunction with two application scenarios.
Video playing
Referring to fig. 12a, fig. 12a is an interface schematic diagram of a second electronic device according to an embodiment of the present application. As shown in fig. 12a, the user may click a video APP by operating the first electronic device, and after the user clicks a video to be played, the second electronic device may display a video playing interface as shown in fig. 12b.
Referring to fig. 12c, fig. 12c is an interface schematic diagram of a second electronic device provided in this embodiment of the application. As shown in fig. 12c, the user may click a full-screen control in the interface to play the video in full screen. In another embodiment, as shown in fig. 12d, the user may rotate the first electronic device (from portrait to landscape) to play the video in full screen. The full-screen playing interface of the video is shown in fig. 12e and may include, but is not limited to, a video image, a pause/play control, a next-episode control, and the like.
In this embodiment of the application, the user may click the video playing area in fig. 12e; accordingly, referring to fig. 12f, the screen projection content of the second electronic device may then include only the video image. Alternatively, if the user does not operate the first electronic device for a long time, the screen projection content of the second electronic device may likewise include only the video image (see fig. 12f).
Referring to fig. 12g, fig. 12g is an operation diagram of a first electronic device according to an embodiment of the present application. As shown in fig. 12g, while the second electronic device displays a video playing interface such as the one shown in fig. 12e, the user may perform a horizontal sliding operation on the display screen of the first electronic device to fast-forward or rewind the video displayed by the second electronic device. When the user slides to the right on the display screen of the first electronic device, as shown in fig. 12h and fig. 12i, the screen projection content of the second electronic device includes a fast-forward preview image and a fast-forward progress bar. After the user determines the fast-forward target position based on the preview image displayed on the second electronic device, the sliding operation on the display screen of the first electronic device may be ended, and the video displayed by the second electronic device fast-forwards to the target position determined by the user (10:03), as shown in fig. 12j and 12k.
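The method does not fix how the slide distance maps to a fast-forward amount; one plausible mapping is sketched below in Kotlin, with the sensitivity constant purely an assumption.

```kotlin
// Illustrative mapping from a horizontal slide (in pixels) on the first
// electronic device to a seek target in the played video: sliding right
// fast-forwards, sliding left rewinds, clamped to the video duration.
const val MS_PER_PIXEL = 50L  // assumed sensitivity, not specified by the method

fun seekTargetMs(currentMs: Long, slideDxPx: Float, durationMs: Long): Long =
    (currentMs + (slideDxPx * MS_PER_PIXEL).toLong()).coerceIn(0L, durationMs)
```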
Referring to fig. 12l, fig. 12l is an operation schematic diagram of a first electronic device according to an embodiment of the present application. As shown in fig. 12l, the user may perform a vertical sliding operation in the right area of the display screen of the first electronic device to adjust the volume of the currently played video; alternatively, as shown in fig. 12m, the user may adjust the volume through a physical volume adjustment key 1201 of the first electronic device. Fig. 12n shows the volume adjustment interface of the second electronic device.
Referring to fig. 12o, fig. 12o is an operation schematic diagram of a first electronic device according to an embodiment of the present application. As shown in fig. 12o, the user may perform a vertical sliding operation in the left area of the display screen of the first electronic device to adjust the display brightness of the currently played video. Fig. 12p shows the brightness adjustment interface of the second electronic device.
Referring to fig. 12q, fig. 12q is an operation schematic diagram of a first electronic device according to an embodiment of the present application. As shown in fig. 12q, the user may slide from the right boundary area toward the center of the display screen of the first electronic device; the second electronic device then displays the interface content of the previous level (i.e., the first electronic device performs a return operation and returns to the previous interface content). As shown in fig. 12r, the screen projection content of the second electronic device is then the previous-level interface (the video playing interface) of the full-screen video playing interface.
Referring to fig. 12s, fig. 12s is an operation schematic diagram of a first electronic device according to an embodiment of the present application. As shown in fig. 12s, the user may slide from the lower boundary area toward the center of the display screen of the first electronic device; the second electronic device then displays the main interface (i.e., the first electronic device performs an operation of returning to the main interface). As shown in fig. 12t, the screen projection content of the second electronic device is then the main interface of the first electronic device.
In this embodiment of the application, the first electronic device may further receive a first operation on a display screen of the first electronic device, where the first operation is a preset shortcut operation, and the first electronic device may respond to the first operation.
In this embodiment of the application, the first electronic device may detect a first operation on its display screen and recognize that the first operation is a preset shortcut operation. In that case, the first electronic device may generate a first event corresponding to the shortcut operation; the first electronic device may also generate another event (referred to herein as a second event, to distinguish it from the first event) according to the position of the cursor and the first operation. The first electronic device may determine the priorities of the first event and the second event, and decide which of the two to execute based on those priorities.
Optionally, in this embodiment of the application, if the first electronic device recognizes that the first operation is a preset shortcut operation, the first electronic device may generate a first event corresponding to the shortcut operation, and may also generate a second event according to the position of the cursor and the first operation. The execution priority of the first event is higher than that of the second event, so the first electronic device executes the first event. In this case, although the operation position of the first operation corresponds to a fifth object in the interface content of the second electronic device, the fifth object does not respond to the first operation, because the execution priority of the second event is lower than that of the first event.
For example, referring to fig. 12u, fig. 12u is an operation schematic diagram of a first electronic device provided in an embodiment of the present application. Besides the click confirmation operation, the first electronic device may support common full-screen gesture operations (shortcut operations) that are consistent with the original way of using the first electronic device, thereby reducing the user's learning cost. As shown in fig. 12u, sliding from the left or right edge to the middle represents the return key, sliding from the lower edge to the middle represents the home key, and sliding from the lower edge to the middle and holding represents the menu key; up, down, left, and right sliding operations can be used to implement functions such as fast-forward and rewind in a video scenario.
The principle of implementing the full-screen gesture operation described above is described next:
1. Return key
In this embodiment, sliding from the left edge or the right edge to the middle represents the return key. Specifically, when the first electronic device is in the screen-off state, if a sliding operation from the left edge or the right edge to the middle is detected, the sliding operation is not executed at the current cursor position; instead, a return event is directly injected into the current front-end application. In response to the return event, the interface content of the system returns to the previous level, which is equivalent to the return operation the mobile phone performs when the user slides from the left or right edge to the middle in the normal mode.
2. Home key
In this embodiment of the application, sliding from the lower edge to the middle represents the home key. Specifically, when the first electronic device is in the screen-off state, if a sliding operation from the lower edge to the middle is detected, the sliding operation is not executed at the current cursor position; instead, an event of returning to the main interface is directly injected into the current front-end application. In response to this event, the interface content of the system returns to the main interface, which is equivalent to the user clicking the home key in the normal mode.
3. Menu key
In this embodiment, sliding from the lower edge to the middle and staying represents the menu key. Specifically, when the first electronic device is in the screen-off state, if an operation of sliding from the lower edge to the middle and staying is detected, the operation is not executed at the current cursor position; instead, an event of displaying the menu is directly injected into the current front-end application. In response to this event, the interface content of the system displays a pull-up menu, which is equivalent to the mobile phone displaying the pull-up menu when the user slides from the lower edge to the middle and stays in the normal mode (the menu may include a list of currently running applications or a history list of applications, which is not limited in this application).
In this embodiment of the application, the first electronic device may receive a second touch operation on the display screen of the first electronic device, determine a corresponding third event according to the operation form of the second touch operation (different operation forms correspond to different third events), and execute the third event on the front-end application of the first electronic device. The operation form of the second touch operation includes at least one of the following: contacting a first preset area of the display screen of the first electronic device and sliding from the first preset area in a first preset direction; or contacting a second preset area of the display screen of the first electronic device, sliding from the second preset area in a second preset direction, and contacting the display screen of the first electronic device for longer than a preset time.
Specifically, after acquiring the first operation input through the touch screen, the enhanced interaction service may identify whether the first operation is a preset shortcut operation; if the first operation conforms to a preset shortcut operation, the enhanced interaction service generates the event corresponding to that shortcut operation and injects it directly into the front-end application instead of executing it at the position of the cursor (that is, the first electronic device responds directly to the first operation). Reference may be made to fig. 12u and the description of the corresponding embodiments, which is not repeated here.
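As a rough Kotlin illustration of this interception logic (the gesture classifier is assumed to exist elsewhere, and injecting key events this way on Android normally requires system privileges):

```kotlin
import android.app.Instrumentation
import android.view.KeyEvent

// Illustrative gesture categories for the shortcut operations of fig. 12u.
enum class EdgeGesture { EDGE_TO_MIDDLE, BOTTOM_TO_MIDDLE, BOTTOM_TO_MIDDLE_AND_HOLD, NONE }

// A recognized shortcut is injected into the front-end application as a key
// event instead of being executed at the cursor position; an ordinary touch
// falls through to the cursor-position dispatch.
fun dispatchShortcut(gesture: EdgeGesture, executeAtCursor: () -> Unit) {
    val inst = Instrumentation()
    when (gesture) {
        EdgeGesture.EDGE_TO_MIDDLE -> inst.sendKeyDownUpSync(KeyEvent.KEYCODE_BACK)              // return key
        EdgeGesture.BOTTOM_TO_MIDDLE -> inst.sendKeyDownUpSync(KeyEvent.KEYCODE_HOME)            // home key
        EdgeGesture.BOTTOM_TO_MIDDLE_AND_HOLD -> inst.sendKeyDownUpSync(KeyEvent.KEYCODE_APP_SWITCH) // menu
        EdgeGesture.NONE -> executeAtCursor()
    }
}
```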
Optionally, in another implementation, the display screen may be divided into two regions (for example, an upper half and a lower half): if the user touches the upper half of the display screen, the touch controls the display position of the cursor in the interface content; if the user touches the lower half of the display screen, operations similar to those in fig. 12u may be performed.
In addition, it should be noted that, in the above video playing scenario, after the user selects the video to be played, the second electronic device may directly play the video in full screen.
Optionally, the first electronic device 100 may obtain the video 400 corresponding to the current video playing interface and process the video 400 into a video stream. In some possible embodiments, the first electronic device 100 acquires image data of the video and compresses the image corresponding to each image frame in the image data, where the compressed image size of each frame equals the image size on the second electronic device. For example, when the image size is measured in pixels, assuming the image size on the second electronic device is 400 x 800 pixels and the image corresponding to each image frame is 800 x 800 pixels, the first electronic device may compress each 800 x 800 frame to 400 x 800 pixels to obtain the compressed image corresponding to that frame. The first electronic device may then perform video compression coding on the compressed images corresponding to the plurality of image frames to obtain a video stream. The plurality of image frames may be the image frames of the video's image data at a plurality of consecutive time nodes, with one image frame per time node. In other possible embodiments, after obtaining the image data of the video, the first electronic device may directly perform video compression encoding on a plurality of image frames at a plurality of consecutive time nodes to obtain the video stream.
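On Android, the per-frame size compression in the 800 x 800 to 400 x 800 example could be done with a standard bitmap scale; a sketch under that assumption, not a mandated implementation:

```kotlin
import android.graphics.Bitmap

// Scale one captured frame to the image size of the second electronic
// device (e.g. from 800 x 800 to 400 x 800 pixels in the example above)
// before video compression coding.
fun compressFrame(frame: Bitmap, targetW: Int, targetH: Int): Bitmap =
    Bitmap.createScaledBitmap(frame, targetW, targetH, /* filter = */ true)
```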
In some possible embodiments, the first electronic device may obtain the audio data of the video over the period of time determined by the plurality of consecutive time nodes. The first electronic device may perform audio compression encoding, such as Advanced Audio Coding (AAC), on the audio data in that period. The first electronic device then mixes the video-compression-coded image frames with the audio data of that period into a video stream. The data format of the video stream is any data format that the second electronic device can receive, such as the MP4 (MPEG-4 Part 14) format. The images and audio corresponding to the video stream are presented synchronously on the second electronic device.
The first electronic device may transmit the video stream to the second electronic device. Correspondingly, after receiving the video stream, the second electronic device processes it into image and audio output. In some possible implementations, the first electronic device may output the video stream as the screen projection data to the second electronic device through the screen projection port using a video streaming media protocol, such as the Real Time Streaming Protocol (RTSP). The second electronic device performs streaming protocol reception and video/audio decoding on the screen projection data (i.e., the video stream), then renders and outputs it; the second electronic device then displays the image corresponding to the screen projection data in full screen and/or plays the corresponding audio. For example, suppose the first electronic device 100 has a screen width of S-W1 and a screen height of S-H1, and the second electronic device 200 has a screen width of S-W2 and a screen height of S-H2. When the second electronic device 200 displays the image corresponding to the screen projection data, the width-to-height ratio of the image may first be adjusted to be the same as the screen width-to-height ratio S-W2:S-H2 of the second electronic device 200, and the image may then be displayed. Assuming that the video 400 is played on the first electronic device 100 in full screen without black or white edges, and the screen ratio S-W1:S-H1 of the first electronic device 100 is the same as the screen ratio S-W2:S-H2 of the second electronic device 200, the video playing picture displayed by the second electronic device 200 also has no black or white edges, and the first electronic device 100 and the second electronic device 200 display the same image and play the same audio.
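One common way to realize the ratio adjustment on the receiving side is a uniform fit-to-screen scale, sketched below with illustrative types; in the example above, where S-W1:S-H1 equals S-W2:S-H2, the fitted image fills the whole screen and no black or white edges appear.

```kotlin
data class Size(val w: Int, val h: Int)

// Scale the decoded image uniformly so that it fits the screen of the
// second electronic device; when the image ratio already equals
// S-W2 : S-H2, the result is exactly full screen (no black/white edges),
// otherwise letterbox bars fill the remainder.
fun fitToScreen(image: Size, screen: Size): Size {
    val scale = minOf(screen.w.toFloat() / image.w, screen.h.toFloat() / image.h)
    return Size((image.w * scale).toInt(), (image.h * scale).toInt())
}
```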
In some possible implementations, if the user moves the cursor on the second electronic device to the pause control and performs a click operation on the display screen of the first electronic device, the first electronic device pauses the playing of the video 400 in response to that operation. Meanwhile, the first electronic device 100 may pause the transmission of the video stream of the video 400, and the second electronic device 200 also pauses playing because no video stream is being transmitted. If the user again moves the cursor on the second electronic device to the pause control and performs a click operation on the display screen of the first electronic device, the first electronic device responds by continuing playback from the current playing progress of the video 400; the first electronic device 100 resumes transmitting the video stream of the video 400, so the second electronic device 200 receives the video stream and also continues playing. If the transmission of the video stream is fragmented, the first electronic device 100 transmits only a fixed period of the video stream each time. For example, if the playing time of the video 400 is 25 minutes and 34 seconds, and the first electronic device 100 transmits a 10-second video stream each time, the video 400 needs to be transmitted 154 times to be fully transmitted to the second electronic device 200.
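The transmission count in that example can be checked directly: 25 minutes 34 seconds is 1534 seconds, and 1534 seconds in fixed 10-second segments requires a rounded-up 154 transmissions.

```kotlin
// Worked check of the fragmented-transmission example above.
fun segmentCount(durationSeconds: Int, segmentSeconds: Int): Int =
    (durationSeconds + segmentSeconds - 1) / segmentSeconds  // ceiling division

fun main() {
    val duration = 25 * 60 + 34          // 1534 s
    println(segmentCount(duration, 10))  // prints 154
}
```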
The above describes an embodiment of the screen projection method provided by the present application in a video playing scenario; next, an embodiment of the screen projection method provided by the present application in another application scenario is described.
Demonstration scene
Fig. 13 is a schematic view of a screen projection scene provided in an embodiment of the present application. As shown in fig. 13, assume that in a conference room of a certain enterprise, user a, user B, user C, user D, user E, user F, user G, user H, and user I hold a conference; the user a sends the screen projection data containing the conference information to the second electronic device 200 through the first electronic device, and the second electronic device 200 displays the screen projection data. Generally, the device screen of the first electronic device 100 is small, it is difficult for the user a to share conference information with other users through the first electronic device 100, and the second electronic device 200 (e.g., a television) can display screen projection data through a large screen so as to be conveniently viewed by other users.
For the user B, the user C, the user D, the user E, and the like, which are close to the second electronic device 200, the conference information shared by the user a can be viewed by viewing the display content of the second electronic device 200 without separately carrying and using additional electronic devices for assistance. However, due to the screen size limitation of the second electronic device 200 and the visual problems such as myopia that some users may have, users such as user F, user G, user H and user I may not be able to clearly see the display contents of the second electronic device 200, which affects the normal progress of the conference.
Therefore, the first electronic device 100 may establish a screen-projecting connection with a plurality of second electronic devices 200 at the same time, and based on the technical solution of the present application, the user F, the user G, the user H, the user I, and the like may respectively use their respective second electronic devices 200 to receive and display screen-projecting data sent by the first electronic device 100, so as to facilitate viewing by corresponding users. Wherein, the processing procedure in the present application is the same for each of the plurality of second electronic devices 200.
Referring to fig. 14a, fig. 14a is a schematic diagram of screen projection content of a second electronic device provided by an embodiment of the present application. As shown in fig. 14a, the user may click a PPT application by operating the first electronic device, and after the user clicks the PPT to be presented, the second electronic device may display a PPT presentation interface as shown in fig. 14b.
As shown in fig. 14b, the PPT presentation interface may include a presentation control 1403, a current film presentation area 1404, and a film list area 1405. The current film presentation area 1404 shows the film currently to be presented, and the film list area 1405 may include a list of films. The user may select the film to be presented in the film list area 1405; for example, the user may select film C by operating the first electronic device, and accordingly, the film displayed in the film presentation area 1404 is the selected film C.
As shown in fig. 14b, the user may click the presentation control 1403 in the PPT presentation interface to start presenting the film, as shown in fig. 14c. Referring to fig. 14d, fig. 14d is an operation schematic diagram of a first electronic device according to an embodiment of the present application. As shown in fig. 14d, the user may perform a leftward sliding operation on the display screen of the first electronic device; the second electronic device then displays the presentation interface shown in fig. 14e (in which the presented film is switched to the next one). Referring to fig. 14f, fig. 14f is an operation schematic diagram of a first electronic device provided by an embodiment of the present application. As shown in fig. 14f, the user may present the film in full screen by rotating the first electronic device (from portrait to landscape); fig. 14g shows the full-screen presentation interface of the film.
Referring to fig. 15, fig. 15 is a flowchart illustrating an information processing method according to an embodiment of the present application, and as shown in fig. 15, the information processing method according to the present application includes:
1501. The first electronic device monitors the screen projection connection state.
In the embodiment of the application, the first electronic device may be equipped with the enhanced interaction service, and the enhanced interaction service may monitor the screen projection connection state of the first electronic device.
1502. The first electronic device detects that a screen projection connection is established with the second electronic device.
In this embodiment of the application, the enhanced interaction service may detect that the first electronic device and the second electronic device establish the screen-projecting connection, and the manner of establishing the screen-projecting connection between the first electronic device and the second electronic device may refer to the description of the embodiments corresponding to fig. 4a to 5c, which is not described herein again.
1503. The first electronic device generates screen projection content.
In the embodiment of the application, the enhanced interaction service of the first electronic device may obtain the interface content of the front-end application of the first electronic device, and generate the cursor on the interface content to obtain the screen projection content.
In this embodiment, the enhanced interaction service of the first electronic device may further generate the cursor and the menu bar on the interface content to obtain the screen projection content.
Specifically, referring to fig. 16, fig. 16 is an architecture schematic diagram of an embodiment of the present application. As shown in fig. 16, the enhanced interaction service may add a cursor and a menu bar to the interface content of the current front-end application of the first electronic device based on the floating window interface to generate screen projection data. The screen projection service of the first electronic device may then obtain the screen projection data and send it (after encoding and size conversion of the content) to the second electronic device based on a certain screen projection protocol, so that the display screen of the second electronic device displays the screen projection content corresponding to the screen projection data.
Optionally, the first electronic device may obtain posture change information of the first electronic device input by the sensor and move the position of the cursor in the interface content based on that posture change information. For how the enhanced interaction service moves the cursor on the display screen of the second electronic device based on the posture change information of the first electronic device, refer to the description in the above embodiments, which is not repeated here.
Optionally, the first electronic device may receive a second sliding operation on the display screen of the first electronic device, determine the displacement of the cursor according to the second sliding operation, and move the cursor in the screen projection content based on that displacement. For how the enhanced interaction service moves the cursor on the display screen of the second electronic device according to the second sliding operation, refer to the description in the foregoing embodiments, which is not repeated here.
Optionally, the first electronic device may acquire a first touch operation input through the touch screen, generate a corresponding first event according to the first touch operation, and execute the first event on the target object of the interface content (the target object where the cursor is currently located). For example, the first touch operation may be a click operation or a slide operation, and accordingly the first event may be a click event or a slide event.
Optionally, the first electronic device may mask the response of the interface content at the operation position of the first touch operation. Specifically, after receiving a first touch operation on the display screen of the first electronic device, the first electronic device may shield the response of the current front-end application's interface content at the operation position of the first touch operation, and instead use the position of the cursor to determine the response position of the interface content to the first touch operation. For example, if the user performs a click operation on object A on the display screen of the first electronic device while the cursor is located on object B on the second electronic device, the first electronic device may mask object A's response to the click operation and respond to the click operation on object B of the interface content.
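In sketch form (Kotlin, with illustrative names), the masking simply discards the raw touch coordinates and dispatches the generated event at the cursor instead:

```kotlin
data class Point(val x: Float, val y: Float)

// The raw touch lands on object A, but the response at that physical
// operation position is masked; the click event generated from the touch
// is delivered at the cursor's position (object B) instead. The dispatch
// function is assumed to perform hit testing on the interface content.
fun handleClick(rawTouch: Point, cursor: Point, dispatchClickAt: (Point) -> Unit) {
    // rawTouch is intentionally ignored here.
    dispatchClickAt(cursor)
}
```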
Optionally, the first electronic device may receive a pressing operation on a physical key of the first electronic device, generate a corresponding second event according to the pressing operation, and execute the second event on the front-end application of the first electronic device. For example, the first electronic device may receive a pressing operation on a volume key (for example, volume down), generate a corresponding volume-down event according to the pressing operation, and execute the volume-down event on the front-end application, so that the volume played by the front-end application of the first electronic device decreases.
Optionally, the first electronic device may receive a second touch operation on the display screen of the first electronic device, determine a corresponding third event according to an operation form of the second touch operation, where different operation forms correspond to different third events, and execute the third event on the front-end application of the first electronic device, where the operation form of the second touch operation at least includes one of the following operation forms: the method comprises the steps of contacting a first preset area of a display screen of the first electronic device, and sliding towards a first preset direction from the first preset area; the second preset area is contacted with the display screen of the first electronic equipment, the second preset area slides to the second preset direction, and the time of contacting the display screen of the first electronic equipment is longer than the preset time.
Specifically, after the enhanced interaction service acquires the touch operation input through the touch screen, it may recognize whether the touch operation conforms to a preset operation form and, if it does, generate the event corresponding to that preset operation form; the enhanced interaction service may inject this event directly into the front-end application instead of executing it at the position of the cursor.
Optionally, referring to fig. 17, fig. 17 is an architecture diagram of an embodiment of the present application. As shown in fig. 17, the enhanced interaction service may obtain the interface content of the current front-end application. Optionally, the enhanced interaction service may obtain the interface content of the current front-end application based on a screen recording interface provided by the system (for example, the MediaProjection interface provided by Android), and draw a cursor and a menu bar on the obtained interface content. The enhanced interaction service may send the drawn content as screen projection data to the screen projection service of the first electronic device, and the screen projection service obtains the screen projection data and sends it (after encoding and/or size conversion of the content) to the second electronic device based on a certain screen projection protocol.
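A rough sketch of the MediaProjection-based capture path follows; buffer processing and the cursor/menu drawing step are omitted, and the parameter values and display name are placeholders.

```kotlin
import android.graphics.PixelFormat
import android.hardware.display.DisplayManager
import android.media.ImageReader
import android.media.projection.MediaProjection

// Mirror the front-end interface content into an ImageReader surface;
// the enhanced interaction service can then draw the cursor and menu bar
// onto the captured frames before handing them to the screen projection
// service for encoding and transmission.
fun captureInterfaceContent(projection: MediaProjection, w: Int, h: Int, dpi: Int): ImageReader {
    val reader = ImageReader.newInstance(w, h, PixelFormat.RGBA_8888, 2)
    projection.createVirtualDisplay(
        "enhanced-interaction-capture",  // arbitrary display name
        w, h, dpi,
        DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
        reader.surface,                  // captured frames arrive here
        null, null
    )
    return reader
}
```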
1504. The first electronic device sends screen projection content to the second electronic device.
In the embodiment of the application, after the first electronic device sends the screen projection data to the second electronic device, the display screen of the second electronic device may display screen projection content corresponding to the screen projection data, where the screen projection content includes a cursor and interface content of the first electronic device, and the cursor is used to be positioned at an operation position in the interface content.
For the implementation of step 1504, reference may be made to the above embodiment, where the first electronic device sends a description related to the screen projection data to the second electronic device, and details are not described here again.
It should be noted that, the above steps 1503 and 1504 may be implemented independently, instead of being implemented on the basis of the execution of the steps 1501 and 1502.
The embodiment of the application provides an information processing method, which includes: generating screen projection content; and sending the screen projection content to a second electronic device so that a display screen of the second electronic device displays the screen projection content, where the screen projection content includes a cursor and the interface content of the first electronic device, and the cursor is used to indicate an operation position in the interface content. In this way, because a cursor is added to the screen projection content, the user can determine the position to be operated through the cursor displayed by the second electronic device; the user does not need to look at the interface content of the first electronic device, but can determine the position to be operated based on the position of the cursor in the screen projection content displayed on the display screen of the second electronic device, and then perform an operation at the position of the cursor.
An embodiment of the present application further provides an electronic device, please refer to fig. 18, where fig. 18 is a schematic structural diagram of the electronic device according to the embodiment of the present application, and the electronic device includes:
A processing module 1801, configured to generate screen projection content;
a sending module 1802, configured to send the screen projection content to a second electronic device, so that a display screen of the second electronic device displays the screen projection content, where the screen projection content includes a cursor and interface content of the first electronic device, and the cursor is used to be positioned at an operation position in the interface content.
Optionally, the processing module 1801 is specifically configured to:
and acquiring interface content of the front-end application of the first electronic equipment, and generating a cursor on the interface content to obtain screen projection content.
Optionally, the processing module 1801 is specifically configured to:
acquiring interface content of a front-end application of the first electronic equipment, and generating a cursor and a menu bar on the interface content to obtain screen projection content; wherein the menu bar does not belong to the interface content of the first electronic device.
Optionally, the processing module 1801 is further configured to detect that the first electronic device and the second electronic device establish a screen-casting connection.
Optionally, the processing module 1801 is further configured to obtain pose change information of the first electronic device;
causing the cursor to move in the screen projection content displayed by the second electronic device based on the pose change information.
Optionally, the pose change information includes a first rotation angle of the first electronic device on a horizontal plane and a second rotation angle of the first electronic device on a vertical plane, and the processing module 1801 is specifically configured to:
determining a horizontal displacement of the cursor according to the first rotation angle;
determining a vertical displacement of the cursor according to the second rotation angle;
and moving the cursor in the screen-casting content displayed by the second electronic device according to the horizontal displacement and the vertical displacement.
Optionally, the pose change information includes a first displacement of the first electronic device in a first direction and a second displacement of the first electronic device in a second direction, where the first direction is parallel to a left-right moving direction of a cursor displayed on a display screen of the second electronic device, the second direction is parallel to an up-down moving direction of the cursor displayed on the display screen of the second electronic device, and the processing module 1801 is further configured to:
determining a horizontal displacement of the cursor according to the first displacement;
determining a vertical displacement of the cursor according to the second displacement;
and moving the cursor in the screen-casting content displayed by the second electronic device according to the horizontal displacement and the vertical displacement.
Optionally, the processing module 1801 is further configured to:
when the cursor moves to a target object in the screen projection content displayed by the second electronic device, if a first touch operation on a display screen of the first electronic device is received;
enabling a target object in the screen projection content displayed by the second electronic equipment to respond to the first touch operation.
Optionally, the processing module 1801 is further configured to:
and shielding the response of a second object to the first touch operation, where the operation position of the first touch operation corresponds to the second object in the interface content of the first electronic device.
Optionally, the first touch operation at least includes a click operation and a first slide operation, and the target object at least includes an application program and a function control.
Optionally, the processing module 1801 is further configured to:
receiving a second sliding operation on a display screen of the first electronic equipment;
determining the displacement of the cursor according to the second sliding operation;
causing the cursor to move in the screen-cast content displayed by the second electronic device based on the displacement.
Optionally, the operation position corresponding to the second sliding operation corresponds to a third object in the interface content of the first electronic device, and the processing module 1801 is further configured to:
and shielding the response of the third object to the second sliding operation.
Optionally, the processing module 1801 is further configured to:
when the cursor moves to a target object in the screen projection content displayed by the second electronic device, if a click operation on a display screen of the first electronic device is received;
enabling a target object in the screen projection content displayed by the second electronic device to respond to the clicking operation.
Optionally, the operation position of the click operation corresponds to a fourth object in the interface content of the first electronic device, and the processing module 1801 is further configured to:
and shielding the response of the fourth object to the click operation.
Optionally, the processing module 1801 is further configured to:
receiving a pressing operation on a physical key of the first electronic equipment;
generating a corresponding second event according to the pressing operation;
and executing the second event to the first electronic equipment front-end application.
Optionally, the processing module 1801 is further configured to:
receiving a first operation on a display screen of the first electronic device, wherein the first operation is a preset shortcut operation;
the first electronic device is responsive to the first operation.
Optionally, the operation position of the first operation corresponds to a fifth object in the interface content of the second electronic device, and the fifth object does not respond to the first operation.
Optionally, the preset shortcut operation at least includes:
contacting a first preset area of the display screen of the first electronic device and sliding from the first preset area in a first preset direction; or
contacting a second preset area of the display screen of the first electronic device, sliding from the second preset area in a second preset direction, and contacting the display screen of the first electronic device for longer than a preset time.
Optionally, the processing module 1801 is further configured to:
reducing the display brightness of the display screen of the first electronic device; or
performing a screen-off operation on the display screen of the first electronic device.
Optionally, a display area of the display screen of the first electronic device is smaller than a display area of the display screen of the second electronic device.
Referring to fig. 19, fig. 19 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure, and an electronic device 1900 may be embodied as a mobile phone, a tablet, a smart wearable device, and the like, which is not limited herein. Specifically, the electronic device 1900 includes: a receiver 1901, a transmitter 1902, a processor 1903, and a memory 1904 (wherein the number of processors 1903 in the electronic device 1900 may be one or more, and one processor is taken as an example in fig. 19), wherein the processor 1903 may include an application processor 19031 and a communication processor 19032. In some embodiments of the present application, the receiver 1901, the transmitter 1902, the processor 1903, and the memory 1904 may be connected by a bus or other means.
The memory 1904 may include both read-only memory and random access memory, and provides instructions and data to the processor 1903. A portion of the memory 1904 may also include non-volatile random access memory (NVRAM). The memory 1904 stores operating instructions, executable modules or data structures, or a subset or an expanded set thereof, where the operating instructions may include various operating instructions for performing various operations.
The processor 1903 controls the operation of the electronic device. In a particular application, the various components of the electronic device are coupled together by a bus system that may include a power bus, a control bus, a status signal bus, etc., in addition to a data bus. For clarity of illustration, the various buses are referred to in the figures as a bus system.
The method disclosed in the above embodiments of the present application may be applied to the processor 1903, or implemented by the processor 1903. The processor 1903 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware or instructions in the form of software in the processor 1903. The processor 1903 may be a general-purpose processor, a digital signal processor (DSP), a microprocessor or a microcontroller, and may further include an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic device, or discrete hardware components. The processor 1903 may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM, EPROM, or a register. The storage medium is located in the memory 1904, and the processor 1903 reads the information in the memory 1904 and completes the steps of the method in combination with the hardware.
The receiver 1901 may be used to receive input numeric or character information and generate signal inputs related to settings and function controls of the electronic device. The transmitter 1902 may be configured to output numeric or character information through a first interface; the transmitter 1902 may also be configured to send instructions to a disk group via the first interface to modify data in the disk group; the transmitter 1902 may also include a display device such as a display screen.
In this embodiment, in one case, the processor 1903 is configured to execute the information processing method in the corresponding embodiments described above.
Next, an embodiment provided by the present application is described taking the second electronic device 200 as a head-mounted display (HMD) as an example, wherein the HMD may be a VR device display, an AR device display or an MR device display.
In embodiments of the application, a user may be immersed in an augmented reality environment, a virtual reality environment, or a mixed reality environment by wearing a head mounted display (HMD); the user is immersed in a 3D virtual environment and may interact with the virtual environment through a variety of different types of input. For example, the inputs may include physical interactions, such as manipulation of the first electronic device 100 separate from the HMD, manipulation of the HMD itself (based on head movement), and so forth.
Referring to fig. 20, fig. 20 is an interaction schematic of a first electronic device and a second electronic device. As shown in fig. 20, a user wearing the second electronic device 200 (HMD) is holding the first electronic device 100 as the control device of the second electronic device 200. It should be noted that although only one first electronic device 100 is illustrated as the control device of the second electronic device 200 in the example shown in fig. 20, two (or more) additional external devices may also be paired with and/or interact with the HMD in the virtual environment. During operation (after pairing), the first electronic device 100 (and/or other external devices) may communicate with the second electronic device 200 via, for example, a wired connection, or a wireless connection such as WiFi or Bluetooth, or other communication modes available to both devices.
Fig. 21a shows a schematic diagram of the first electronic device 100 connected to the second electronic device 200 using a cable 2100. The first electronic device 100 may connect to the second electronic device 200 using one or more high-speed communication protocols (e.g., USB 2.0, USB 3.0, and USB 3.1). In some cases, the first electronic device 100 may be connected to the second electronic device 200 using an audio/video interface, such as a high-definition multimedia interface (HDMI). In some cases, the first electronic device 100 may connect to the second electronic device 200 using the DisplayPort alternate mode of the USB Type-C standard interface. The DisplayPort alternate mode may include a high-speed USB communication interface and DisplayPort functionality.
The cable 2100 may include suitable connectors that plug into the second electronic device 200 and the first electronic device 100 at either end. For example, the cable may include Universal Serial Bus (USB) connectors at both ends. The USB connectors may be identical USB type connectors, or each USB connector may be a different type of USB connector. The various types of USB connectors may include, but are not limited to, USB A-type connectors, USB B-type connectors, Micro-USB A connectors, Micro-USB B connectors, Micro-USB AB connectors, USB five pin Mini-B connectors, USB four pin Mini-B connectors, USB 3.0A-type connectors, USB 3.0B-type connectors, USB 3.0Micro B connectors, USB C-type connectors, and the like.
Fig. 21b is a schematic diagram illustrating the use of a wireless connection 1601 to connect the first electronic device 100 to the second electronic device 200 without a cable (e.g., without the cable 2100 shown in fig. 21a). The first electronic device 100 may connect to the second electronic device 200 using the wireless connection 1601 by implementing one or more high-speed communication protocols, such as WiFi, Bluetooth, or Bluetooth Low Energy (BLE).
It should be noted that the second electronic device 200 can also be connected to other control devices, such as a handle.
As shown in fig. 22a, the user here uses the handle as the interaction device. However, in some scenarios the handle is less portable: an independent handle needs a battery and a wireless connection to the first electronic device 100, which requires extra power, and the handles of some AR/VR devices are large and heavy, so long-time use easily fatigues the user.
The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present application takes an Android system with a layered architecture as an example, and exemplarily illustrates a software structure of the electronic device 100.
Fig. 22b is a block diagram of a software structure of the first electronic device 100 according to the embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 22b, the application packages may include applications such as phone, camera, gallery, calendar, calls, maps, navigation, WLAN, Bluetooth, music, video, short message, and the VR glasses application. The VR glasses application comprises a 3D background drawing module, a handle event management module, an application icon loading module, a virtual screen management module, and a virtual screen content acquisition module.
The 3D background drawing module completes the drawing of the background picture displayed in the 3D virtual environment, so that the user feels placed in a real scene.
The handle event management module processes events from the handle, so that the user can touch controls in the virtual display interface by operating the handle.
The application icon loading module loads and displays the icons of a plurality of applications (such as WeChat, Weibo, Douyin, and the like) on the electronic device in the virtual environment of the VR glasses.
The virtual screen management module creates a virtual screen when the user clicks an application icon to start the application, and destroys the virtual screen when the user closes the application.
The virtual screen content acquisition module acquires the content of an application that the user has clicked and started, and renders that content through distortion processing so that it can be displayed in the virtual environment.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 22b, the application framework layer may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide communication functions of the electronic device 100. Such as management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables the application to display notification information in the status bar, can be used to convey notification-type messages, can disappear automatically after a short dwell, and does not require user interaction. Such as a notification manager used to inform download completion, message alerts, etc. The notification manager may also be a notification that appears in the form of a chart or scroll bar text at the top status bar of the system, such as a notification of a background running application, or a notification that appears on the screen in the form of a dialog window. For example, prompting text information in the status bar, sounding a prompt tone, vibrating the electronic device, flashing an indicator light, etc.
In the embodiment of the present application, in addition to the activity manager service (AMS), the window manager service (WMS), and the download manager service (DMS), the application framework layer may further include an application keep-alive module, an event injection module, and a virtual screen management module.
The application keep-alive module is used for controlling the electronic equipment to enter a VR multi-screen display mode after the application with the multi-screen display mode function is started. In this mode, the electronic device can run multiple applications simultaneously and support the applications to be active at the same time.
The event injection module acquires the event corresponding to the user's operation in the multi-screen display mode and dispatches the event to the virtual screen corresponding to the application.
The virtual screen management module provides the electronic device with the capability to create and destroy virtual screens.
The Android Runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part is the functions that the java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine performs functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), Media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., OpenGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, and a sensor driver.
As shown in fig. 22c, in this embodiment of the application, the VR scene display adapter module (HwVRDisplayAdapter) implements creating and destroying a virtual screen in the multi-screen mode, implements management of the virtual screen, and opens a create virtual screen interface (createVRDisplay()) and a destroy virtual screen interface (destroyVRDisplay()) to other services (for example, the display management module (DisplayManager), the display management global module (DisplayManagerGlobal), the display management service module (DisplayManagerService), and the like). The display management module, the display management global module, and the display management service module complete the function of creating the virtual screen when entering the VR multi-screen display mode by calling the create virtual screen interface layer by layer, and complete the function of destroying the virtual screen when exiting the VR multi-screen display mode by calling the destroy virtual screen interface layer by layer. In fig. 22c, the display management service module registers the callback required for creating a virtual screen when the electronic device is initialized; that is, the display management service module first calls the interface of the VR scene display adapter module (registerVRDisplayAdapterLocked()), and then calls the registration interface (registerLocked()) to complete the registration for creating the virtual screen.
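As a concrete illustration of this create/destroy flow, the following is a minimal sketch using the public Android DisplayManager.createVirtualDisplay() API; the internal HwVRDisplayAdapter interfaces above are not publicly available, so the class name, the screen name, and the ImageReader-backed Surface here are illustrative assumptions, not the embodiment's exact implementation.

import android.content.Context;
import android.graphics.PixelFormat;
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;
import android.media.ImageReader;

public final class VirtualScreenHelper {
    private VirtualDisplay display;
    private ImageReader reader;

    // Creates a virtual screen and returns its display id, which is later
    // used as the target for event injection.
    public int create(Context ctx, int width, int height, int dpi) {
        reader = ImageReader.newInstance(width, height, PixelFormat.RGBA_8888, 2);
        DisplayManager dm = ctx.getSystemService(DisplayManager.class);
        display = dm.createVirtualDisplay("vr-app-screen", width, height, dpi,
                reader.getSurface(), DisplayManager.VIRTUAL_DISPLAY_FLAG_OWN_CONTENT_ONLY);
        return display.getDisplay().getDisplayId();
    }

    // Destroys the virtual screen when the application is closed.
    public void destroy() {
        if (display != null) { display.release(); display = null; }
        if (reader != null) { reader.close(); reader = null; }
    }
}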
As shown in fig. 22d, in the embodiment of the present application, Android supports injecting an event into a designated screen, and a custom interface opens the capability of injecting an event into a screen to other services. In fig. 22d, the input management module (InputManager) sequentially calls the injectInputEvent(event, mode, displayId), injectInputEventToDisplay(event, mode), and injectInputEventInternal(event, displayId) interfaces to inject an event into a specified virtual screen.
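For illustration only, the sketch below shows how a system-privileged component might inject a tap at a given position. InputManager.injectInputEvent(InputEvent, int) is a hidden framework API that normal applications cannot call, so the reflection-based access and the asynchronous mode constant 0 are assumptions rather than the exact call chain of fig. 22d.

import android.hardware.input.InputManager;
import android.os.SystemClock;
import android.view.InputDevice;
import android.view.InputEvent;
import android.view.MotionEvent;
import java.lang.reflect.Method;

public final class EventInjector {
    // Injects a DOWN/UP pair at (x, y); assumes hidden-API access is permitted
    // (e.g., a system component), since injectInputEvent is not in the public SDK.
    public static void tap(float x, float y) throws Exception {
        long now = SystemClock.uptimeMillis();
        MotionEvent down = MotionEvent.obtain(now, now, MotionEvent.ACTION_DOWN, x, y, 0);
        MotionEvent up = MotionEvent.obtain(now, now + 30, MotionEvent.ACTION_UP, x, y, 0);
        down.setSource(InputDevice.SOURCE_TOUCHSCREEN);
        up.setSource(InputDevice.SOURCE_TOUCHSCREEN);

        Method getInstance = InputManager.class.getMethod("getInstance");
        Object im = getInstance.invoke(null);
        Method inject = InputManager.class.getMethod("injectInputEvent",
                InputEvent.class, int.class);
        inject.invoke(im, down, 0); // 0: asynchronous injection mode (assumed)
        inject.invoke(im, up, 0);
        down.recycle();
        up.recycle();
    }
}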
Referring to fig. 23a, fig. 23a is an interaction schematic of a user using a first electronic device, where the user operates a VR/AR application on the first electronic device based on the handle. As shown in fig. 23a, the display interface of the second electronic device includes the current display image 2300 of the first electronic device, a function selection area 2302, and an interaction mode display area 2301. The interaction mode display area 2301 indicates that the current interaction mode is handle-based, and the user can operate the handle to operate the current interface.
Referring to fig. 23b, fig. 23b is a schematic diagram of the user's interaction with the first electronic device. As shown in fig. 23b, the user can slide the touch screen 2303 on the handle to click the interaction mode display area 2301, so that the handle-based interaction mode is switched to the interaction mode based on the first electronic device (e.g., the interface shown in fig. 23d).
In this embodiment of the application, when the handle is connected to the second electronic device, the handle or the first electronic device may be used as a control device of the second electronic device; when only the first electronic device establishes connection with the second electronic device, the first electronic device may be used as a control device of the second electronic device.
In this embodiment, the handle may first be used as the control device of the second electronic device, and, upon receiving a first interaction mode switching instruction, the control device of the second electronic device may be switched from the handle to the first electronic device in response to that instruction.
In this embodiment of the application, the first electronic device may receive a first interaction mode switching instruction sent by the handle, generate a corresponding first interaction mode switching instruction based on receiving a second operation on the first electronic device, or receive a first interaction mode switching instruction sent by the second electronic device.
Optionally, in an embodiment, a corresponding physical key may further be disposed on the handle to implement the function of switching to the interaction mode based on the first electronic device. In this embodiment of the application, the user may press the physical key to switch to the interaction mode based on the first electronic device (that is, switch the control device of the second electronic device from the handle to the first electronic device).
Optionally, in another embodiment, the interaction mode based on the first electronic device may also be switched by pressing a physical key on the first electronic device (for example, a power key shown in fig. 23 c).
Optionally, in another embodiment, the interaction mode based on the first electronic device may be switched to by pressing a physical key on the second electronic device.
Optionally, in another embodiment, if it is detected that the handle does not establish a connection with the second electronic device, the interaction mode based on the first electronic device may be directly used.
Similarly, the first electronic device may switch to the handle-based interaction mode in the same manner.
In the handle-based interaction mode, the user can control the display position of the cursor on the second electronic device based on the operation of the handle.
In an interaction mode based on a first electronic device, a user can control the display position of the cursor on the second electronic device based on the operation of the first electronic device.
How to control the display position of the cursor on the second electronic device by the user based on the operation of the first electronic device may refer to the description in the above embodiments, and details are not repeated here.
Alternatively, in an embodiment, different from indicating the user's operation object by displaying a cursor, a ray emitted from the bottom side of the screen (or from the mobile phone image displayed by the second electronic device) toward the currently displayed content may be displayed in the second electronic device, where the ray has an end point, and the end point indicates the position currently selected by the user (corresponding to the cursor).
At this time, the user may adjust the direction of the ray by adjusting the posture of the first electronic device, and thereby adjust the position of the end point of the ray. Referring to fig. 24a, fig. 24a is an interaction schematic of the user using the first electronic device provided in the embodiment of the present application. As shown in fig. 24a, if the user wants to operate the position corresponding to area A, the user may adjust the direction of the ray by adjusting the posture of the first electronic device, so that the end point of the ray moves to the target position to be operated (area A).
Referring to fig. 24b, fig. 24b is an interaction schematic diagram of a user using a first electronic device according to an embodiment of the present application. As shown in fig. 24b, the user may rotate the first electronic device on a horizontal plane; accordingly, the direction of the ray displayed by the second electronic device changes, and the end position of the ray shifts in the horizontal direction.
Referring to fig. 24c, fig. 24c is an interaction schematic diagram of a user using a first electronic device according to an embodiment of the present application, as shown in fig. 24c, the user may rotate the first electronic device on a vertical plane, and accordingly, an end position of a ray displayed by a second electronic device may be vertically displaced.
In the embodiment of the application, after the user adjusts the position of the end point of the ray to the target position which the user wants to operate, the user can click the display screen of the first electronic device, and accordingly, the user can perform the click operation at the position of the end point of the ray.
Specifically, in an embodiment, after the user adjusts the position of the end point of the ray to the target position that the user wants to operate, the first electronic device may acquire the specific pixel coordinate position of the end point of the ray in the display interface of the second electronic device, and determine the corresponding pixel coordinate position in the display interface of the first electronic device according to that position.
The user performs a click operation on the display screen of the first electronic device. At this time, the click operation is not responded to by the foreground application (the current display interface of the first electronic device); instead, the first electronic device injects an event corresponding to the click operation at the pixel coordinate position, in the display interface of the first electronic device, that corresponds to the end point of the ray, which is equivalent to causing the first electronic device to perform the click operation at that pixel coordinate position.
Optionally, in another embodiment, the user performs a sliding operation on the display screen of the first electronic device. At this time, the sliding operation is not responded to by the foreground application; instead, the first electronic device injects an event corresponding to the sliding operation at the pixel coordinate position, in the display interface of the first electronic device, that corresponds to the end point of the ray, which is equivalent to causing the first electronic device to perform the sliding operation at that pixel coordinate position. For more details, reference may be made to the above embodiments; details are not repeated here.
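A minimal sketch of this coordinate mapping is shown below, assuming the projected interface is drawn at offset (left, top) with a uniform scale inside the display of the second electronic device; the class and parameter names are illustrative, not from the embodiment. The resulting phone-side coordinates would then be used as the injection position, as in the injection sketch earlier.

public final class RayMapper {
    // Maps the ray end point, given in pixel coordinates of the second
    // electronic device's display interface, back into the pixel coordinate
    // system of the first electronic device's display interface.
    public static float[] toPhoneCoordinates(float endX, float endY,
                                             float left, float top, float scale) {
        float phoneX = (endX - left) / scale;
        float phoneY = (endY - top) / scale;
        return new float[] { phoneX, phoneY };
    }
}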
Referring to fig. 25, fig. 25 is an operation schematic diagram of a first electronic device according to an embodiment of the present application. Besides the click confirmation operation, the first electronic device can support common full-screen gesture operations that are consistent with the original handle-based mode, so as to reduce the user's learning cost. As shown in fig. 25: sliding from the left edge or the right edge toward the middle represents the back key (corresponding to the back key on the handle); sliding from the lower edge toward the middle represents the home key (corresponding to the home key on the handle); sliding up, down, left, and right corresponds to sliding on the touch screen of the handle; the volume keys adjust the volume (corresponding to the volume adjustment keys on the handle); sliding from the lower edge toward the middle and pausing realizes the view return (corresponding to a long press of the home key on the handle); and pressing the volume down key and the power key simultaneously realizes a screenshot.
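As an illustration of how such full-screen gestures might be recognized from raw touch events, the following is a minimal sketch; the edge band and minimum travel thresholds, and the restriction to only the back and home swipes, are simplifying assumptions.

import android.view.MotionEvent;

public final class FullScreenGestures {
    private static final float EDGE = 48f;       // edge band width in px (assumed)
    private static final float MIN_SWIPE = 120f; // minimum swipe travel in px (assumed)
    private final int screenW, screenH;
    private float downX, downY;

    public FullScreenGestures(int screenW, int screenH) {
        this.screenW = screenW;
        this.screenH = screenH;
    }

    // Returns "BACK", "HOME", or null when no full-screen gesture is matched.
    public String onTouch(MotionEvent e) {
        switch (e.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                downX = e.getX();
                downY = e.getY();
                return null;
            case MotionEvent.ACTION_UP:
                float dx = e.getX() - downX;
                float dy = e.getY() - downY;
                boolean fromLeftEdge = downX < EDGE && dx > MIN_SWIPE;
                boolean fromRightEdge = downX > screenW - EDGE && dx < -MIN_SWIPE;
                if (fromLeftEdge || fromRightEdge) return "BACK";             // edge-in swipe
                if (downY > screenH - EDGE && dy < -MIN_SWIPE) return "HOME"; // bottom-up swipe
                return null;
            default:
                return null;
        }
    }
}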
The principle of implementing the full-screen gesture operation may refer to the description of the embodiment in the screen projection scene, and is not described herein again.
Referring to fig. 26, fig. 26 is an interaction schematic diagram of a user using a first electronic device according to an embodiment of the present application. As shown in fig. 26, the user may perform a sliding operation on the display screen of the first electronic device, and accordingly the pointer displayed by the second electronic device is displaced. For how the user's sliding operation on the display screen of the first electronic device displaces the pointer displayed by the second electronic device, reference may be made to the description in the foregoing embodiments; details are not repeated here.
Referring to fig. 27, fig. 27 is a schematic diagram of a system architecture provided in this example. As shown in fig. 27, taking the first electronic device as a mobile phone as an example, the system architecture includes an AR/VR/MR device, the mobile phone, and an independent handle. The independent handle can be connected to the AR/VR/MR device, and the mobile phone can be connected to the AR/VR/MR device.
In one implementation, a user may interact with the first electronic device by operating a separate handle and control the display content of the AR/VR/MR device.
Specifically, the independent handle may acquire its own gesture information or sliding information on its touch panel and send the acquired information to the mobile phone through the AR/VR/MR device. The mobile phone may process the gesture information or sliding information based on the independent handle interaction manner and move the pointer in the interface content based on that information. In addition, the independent handle may acquire a selection instruction (for example, through a physical button on the independent handle) and send the selection instruction to the mobile phone, and the mobile phone may process the selection instruction based on the independent handle interaction manner.
In one implementation, a user may interact with the first electronic device by operating a cell phone and control the display content of the AR/VR/MR device.
Specifically, the mobile phone may obtain its own gesture information or sliding information on its display screen, process the gesture information or sliding information based on the mobile phone interaction manner, and move the pointer in the interface content based on that information. In addition, the mobile phone may obtain a touch operation, generate a corresponding event, and execute the generated event on the current front-end application.
Referring to fig. 28, fig. 28 is a schematic diagram of a system architecture provided in the embodiment of the present application. As shown in fig. 28, the system architecture includes a system (for example, an Android system), an enhanced interaction service, a screen-on and screen-off service, and an AR/VR/MR service.
The system can send sensor input information (such as gesture information) to the enhanced interaction service, and can send touch screen input information (such as touch events) to the AR/VR/MR service. The AR/VR/MR service can send the touch screen input information to the enhanced interaction service. The enhanced interaction service can send a screen-off or screen-on instruction to the screen-on and screen-off service, and the screen-on and screen-off service can send the screen-off or screen-on instruction to the system to turn the screen of the first electronic device off or on.
The enhanced interaction service may process the received information (sensor input information or touch screen input information), draw a pointer, and execute a corresponding event (which may be based on the position of the pointer, for example) on the AR/VR/MR application.
An embodiment of the present application further provides an electronic device, where the electronic device includes:
a sending module, configured to establish a connection with a second electronic device, where the second electronic device displays a cursor and the interface content of the first electronic device, the first electronic device comprises a touch screen, and the second electronic device is an augmented reality AR device, a virtual reality VR device, or a mixed reality MR device; and
a processing module, configured to acquire an operation on the first electronic device and control the display position of the cursor on the second electronic device based on the operation on the first electronic device.
Optionally, the interface content is interface content of a front-end application of the first electronic device.
Optionally, the second electronic device further displays: a menu bar, the menu bar not belonging to the interface content of the first electronic device.
Optionally, the processing module is specifically configured to acquire pose change information of the first electronic device, and move the cursor on the display content of the second electronic device based on the pose change information.
Optionally, the pose change information includes a first rotation angle of the first electronic device on a horizontal plane and a second rotation angle of the first electronic device on a vertical plane, and the processing module is specifically configured to:
determining a horizontal displacement of the cursor according to the first rotation angle;
determining a vertical displacement of the cursor according to the second rotation angle;
and moving the cursor on the display content of the second electronic equipment according to the horizontal displacement and the vertical displacement.
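For concreteness, the following is a minimal sketch of mapping the two rotation angles to the cursor's horizontal and vertical displacement; the linear pixels-per-degree sensitivity is an assumption, not a parameter of the embodiment.

public final class PoseCursorMapper {
    private static final float PX_PER_DEGREE = 20f; // sensitivity (assumed)

    // yawDelta: first rotation angle on the horizontal plane, in degrees.
    // pitchDelta: second rotation angle on the vertical plane, in degrees.
    public static float[] toCursorDisplacement(float yawDelta, float pitchDelta) {
        float horizontal = yawDelta * PX_PER_DEGREE;  // horizontal displacement
        float vertical = pitchDelta * PX_PER_DEGREE;  // vertical displacement
        return new float[] { horizontal, vertical };
    }
}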
Optionally, the pose change information includes a first displacement of the first electronic device in a first direction and a second displacement of the first electronic device in a second direction, where the first direction is parallel to a left-right moving direction of a cursor displayed by the second electronic device, and the second direction is parallel to an up-down moving direction of the cursor displayed by the second electronic device, and the processing module is specifically configured to:
determining a horizontal displacement of the cursor according to the first displacement;
determining a vertical displacement of the cursor according to the second displacement;
and moving the cursor on the display content of the second electronic equipment according to the horizontal displacement and the vertical displacement.
Optionally, the processing module is further configured to:
when the cursor moves to a target object in the display content of the second electronic device, if a first touch operation on the display screen of the first electronic device is received,
causing the target object in the display content of the second electronic device to respond to the first touch operation.
Optionally, the operation position of the first touch operation corresponds to a second object in the interface content of the first electronic device, and the processing module is further configured to:
shielding the response of the second object to the first touch operation.
Optionally, the first touch operation at least includes a click operation and a first slide operation, and the target object at least includes an application program and a function control.
Optionally, the processing module is specifically configured to:
receiving a second sliding operation on a display screen of the first electronic equipment;
determining the displacement of the cursor according to the second sliding operation;
causing the cursor to move in display content of the second electronic device based on the displacement.
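A minimal sketch of this slide-to-displacement mapping follows; the ratio-based scaling between the two display sizes is an assumption (a fixed sensitivity factor would work the same way).

public final class SlideCursorMapper {
    private final float scaleX, scaleY;

    public SlideCursorMapper(int phoneWidth, int phoneHeight,
                             int targetWidth, int targetHeight) {
        // Scale touch-screen travel on the first electronic device up to the
        // (typically larger) display of the second electronic device.
        this.scaleX = (float) targetWidth / phoneWidth;
        this.scaleY = (float) targetHeight / phoneHeight;
    }

    // touchDx/touchDy: finger travel of the second sliding operation, in px.
    public float[] toCursorDisplacement(float touchDx, float touchDy) {
        return new float[] { touchDx * scaleX, touchDy * scaleY };
    }
}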
Optionally, the starting operation position corresponding to the second sliding operation corresponds to a third object in the interface content of the first electronic device, and the processing module is further configured to:
shielding the response of the third object to the second sliding operation.
Optionally, the processing module is further configured to:
when the cursor moves to a target object in the display content of the second electronic device, if a click operation on the display screen of the first electronic device is received,
causing the target object in the display content of the second electronic device to respond to the click operation.
Optionally, the operation position of the click operation corresponds to a fourth object in the interface content of the first electronic device, and the processing module is further configured to:
shielding the response of the fourth object to the click operation.
Optionally, the second electronic device further displays: a ray, where the end point of the ray is the cursor.
Optionally, the processing module is further configured to:
receiving a pressing operation on a physical key of the first electronic device;
generating a corresponding second event according to the pressing operation; and
executing the second event on the front-end application of the first electronic device.
Optionally, the processing module is further configured to receive a second operation on the display screen of the first electronic device, where the second operation is a preset shortcut operation, and enable the first electronic device to respond to the second operation.
Optionally, the operation position of the second operation corresponds to a fifth object in the interface content of the second electronic device, and the fifth object does not respond to the second operation.
Optionally, the preset shortcut operation at least includes:
touching a first preset area of the display screen of the first electronic device and sliding from the first preset area in a first preset direction; or
touching a second preset area of the display screen of the first electronic device and sliding from the second preset area in a second preset direction, where the time of contact with the display screen of the first electronic device is longer than a preset time.
Optionally, the processing module is further configured to reduce the display brightness of the display screen of the first electronic device; or
perform a screen-off operation on the display screen of the first electronic device.
Optionally, the processing module is specifically configured to:
acquiring pose change information of the first electronic device based on a first application being displayed on the second electronic device, and moving the cursor on the display content of the second electronic device based on the pose change information; and
receiving a second sliding operation on the display screen of the first electronic device based on a second application being displayed on the second electronic device, determining the displacement of the cursor according to the second sliding operation, and moving the cursor in the display content of the second electronic device based on the displacement, where the first application and the second application are different applications.
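A minimal sketch of this application-dependent mode selection is given below; the Mode enum, the package-name check, and the example package name are illustrative assumptions.

public final class InteractionModeSelector {
    public enum Mode { POSE, SLIDE }

    // Chooses the cursor-control mode according to which application is
    // currently displayed on the second electronic device.
    public static Mode modeFor(String foregroundPackage) {
        // e.g., a 3D application (the first application) uses pose control,
        // while other applications (the second application) use sliding.
        if ("com.example.vr3dapp".equals(foregroundPackage)) {
            return Mode.POSE;
        }
        return Mode.SLIDE;
    }
}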
The embodiment of the present application further provides a first electronic device, where the first electronic device is connected to a second electronic device, the first electronic device comprises a touch screen, and the second electronic device is an augmented reality AR device, a virtual reality VR device, or a mixed reality MR device. The first electronic device includes:
The processing module is used for taking the handle or the first electronic equipment as control equipment of the second electronic equipment when the handle is connected with the second electronic equipment;
and when only the first electronic equipment is connected with the second electronic equipment, the first electronic equipment is used as the control equipment of the second electronic equipment.
Optionally, the processing module is specifically configured to:
using the handle as a control device for the second electronic device;
receiving a first interaction mode switching instruction;
and responding to the first interaction mode switching instruction, and switching the control device of the second electronic device from the handle to the first electronic device.
Optionally, the processing module is specifically configured to:
receiving a first interaction mode switching instruction sent by the handle, generating a corresponding first interaction mode switching instruction based on receiving a second operation on the first electronic device, or receiving a first interaction mode switching instruction sent by the second electronic device.
Optionally, the processing module is specifically configured to:
the first electronic equipment is used as the control equipment of the second electronic equipment;
The first electronic device further comprises an acquisition module, configured to receive a second interaction mode switching instruction;
the processing module is specifically configured to switch the control device of the second electronic device from the first electronic device to the handle in response to the second interaction mode switching instruction.
Optionally, the processing module is specifically configured to:
receiving a second interaction mode switching instruction sent by the handle, generating a corresponding second interaction mode switching instruction based on receiving a second operation on the first electronic equipment, or receiving a second interaction mode switching instruction sent by the second electronic equipment.
Optionally, the second electronic device displays a cursor and interface content of the first electronic device, and the processing module is specifically configured to:
controlling a display position of the cursor on the second electronic device based on the operation of the handle;
and controlling the display position of the cursor on the second electronic equipment based on the operation of the first electronic equipment.
Optionally, the interface content is interface content of a front-end application of the first electronic device.
Optionally, the processing module is specifically configured to:
Acquiring pose change information of the first electronic equipment;
causing the cursor to move on display content of the second electronic device based on the pose change information.
Optionally, the processing module is further configured to:
when the cursor moves to a target object in the display content of the second electronic device, if a first touch operation on a display screen of the first electronic device is received;
causing a target object in the display content of the second electronic device to respond to the first touch operation.
Optionally, the processing module is further configured to:
receiving a second sliding operation on a display screen of the first electronic equipment;
determining the displacement of the cursor according to the second sliding operation;
causing the cursor to move in display content of the second electronic device based on the displacement.
Optionally, the processing module is further configured to:
when the cursor moves to a target object in the display content of the second electronic device, if a click operation on a display screen of the first electronic device is received;
causing a target object in the display content of the second electronic device to respond to the clicking operation.
An embodiment of the present application further provides an electronic device, including:
a sending module, configured to display the interface content of the first electronic device in the second electronic device, where the first electronic device comprises a touch screen, and the second electronic device is an augmented reality AR device, a virtual reality VR device, or a mixed reality MR device;
the processing module is used for receiving a first operation acted on a first display screen of the first electronic equipment; causing the interface content displayed by the second electronic device to be responsive to the first operation; wherein the operation position of the first operation corresponds to a first object in the interface content of the first electronic device; the first object does not respond to the first operation.
Optionally, the causing of the interface content displayed by the second electronic device to respond to the first operation specifically includes:
determining a first position in the interface content displayed by the first electronic equipment; causing the interface content displayed by the second electronic device to respond to the first operation based on the first location; wherein the first position is independent of an operating position of the first operation.
Optionally, the processing module is further configured to:
project the cursor to the second electronic device, so that the second electronic device displays the cursor.
Optionally, a position of the cursor corresponding to the interface content displayed by the first electronic device is a first position.
Optionally, the determining a first position in the interface content displayed by the first electronic device includes:
acquiring pose change information of the first electronic equipment;
determining a first position of the cursor in interface content displayed by the first electronic device based on the pose change information.
Optionally, the determining the first position of the cursor in the interface content displayed by the first electronic device based on the pose change information includes:
determining a horizontal displacement of the cursor according to the first rotation angle;
determining a vertical displacement of the cursor according to the second rotation angle;
and determining a first position of the cursor in the interface content displayed by the first electronic equipment according to the horizontal displacement and the vertical displacement.
Optionally, the determining, based on the pose change information, a first position of the cursor in the interface content displayed by the first electronic device includes:
determining a horizontal displacement of the cursor according to the first displacement;
determining a vertical displacement of the cursor according to the second displacement;
and determining a first position of the cursor in the interface content displayed by the first electronic equipment according to the horizontal displacement and the vertical displacement.
Optionally, the first operation includes at least a click operation and a first slide operation, and the first object includes at least one of an application and a functionality control.
Optionally, the determining a first position in the interface content displayed by the first electronic device includes:
receiving a second sliding operation on a display screen of the first electronic equipment;
Determining the displacement of the cursor according to the second sliding operation;
and determining a first position of the cursor in the interface content displayed by the first electronic equipment according to the displacement of the cursor.
Optionally, the first operation at least includes a click operation, and the first object at least includes one of an application and a functionality control.
Optionally, the processing module is further configured to:
reduce the display brightness of the first display screen; or
perform a screen-off operation on the first display screen.
Optionally, the processing module is further configured to:
display the menu bar in the second electronic device.
Referring to fig. 29, fig. 29 is a flowchart illustrating an operation method applied to screen projection according to an embodiment of the present disclosure, where a first electronic device has a first display screen, and a second electronic device has a second display screen; as shown in fig. 29, the operation method applied to screen projection provided by the embodiment of the present application includes:
2901. The first electronic device projects the interface content of the first electronic device to the second electronic device, so that the second display screen of the second electronic device displays the interface content.
In the embodiment of the application, the first electronic device may be equipped with the enhanced interaction service, and the enhanced interaction service may monitor the screen projection connection state of the first electronic device. The enhanced interaction service may detect that the first electronic device and the second electronic device establish the screen-projecting connection, and the manner of establishing the screen-projecting connection between the first electronic device and the second electronic device may refer to the description of the embodiments corresponding to fig. 4a to 5c, which is not described herein again.
In the embodiment of the application, after it is detected that the first electronic device and the second electronic device establish screen projection connection, the first electronic device projects the interface content displayed by the first display screen into the second electronic device, so that the second display screen displays the interface content.
Optionally, the first electronic device may generate a cursor and project the cursor on the second electronic device, so that the second display screen displays the cursor. How the first electronic device generates the cursor may be described with reference to the corresponding embodiment in step 301, and is not described herein again.
Alternatively, the first electronic device may generate a menu bar and screen-cast the menu bar in the second electronic device, so that the second display screen displays the menu bar. How the first electronic device generates the menu bar may be described with reference to the corresponding embodiment in step 301, and is not described herein again.
Alternatively, the first electronic device may acquire pose change information of the first electronic device, and cause the cursor to move on the content displayed on the second display screen based on the pose change information. As to how to move the cursor in the display screen of the second electronic device based on the posture change information of the first electronic device, reference may be made to the description in the above embodiments, and details are not repeated here.
Optionally, the first electronic device may determine a first position in the interface content displayed by the first electronic device, and cause the interface content displayed by the second display screen to respond to the first operation based on the first position; wherein the first position is independent of an operating position of the first operation.
Optionally, the first electronic device may acquire pose change information of the first electronic device, and determine a first position of the cursor in interface content displayed by the first electronic device based on the pose change information. At this time, the position of the cursor corresponding to the interface content displayed by the first electronic device is a first position.
Optionally, the first electronic device may further receive a second sliding operation acting on the first display screen, determine a displacement of the cursor according to the second sliding operation, and move the cursor on the content displayed on the second display screen based on the displacement of the cursor. As to how to move the cursor in the display screen of the second electronic device according to the second sliding operation, reference may be made to the description in the above embodiments, which is not repeated here.
Optionally, the first electronic device may receive a second sliding operation on the display screen of the first electronic device, determine a displacement of the cursor according to the second sliding operation, and determine a first position of the cursor in the interface content displayed by the first electronic device according to the displacement of the cursor.
In this embodiment, after a first position is determined in the interface content displayed by the first electronic device, the interface content displayed by the second display screen may respond to the first operation based on the first position. As to how the first position responds to the first operation, reference may be made to the description of step 302 in the above embodiment, which is not described herein again.
Optionally, after the first electronic device detects that the first electronic device establishes the screen projection connection with the second electronic device, the display brightness of the first display screen may also be reduced; or executing screen-off operation on the first display screen.
2902. The first electronic device receives a first operation acting on the first display screen of the first electronic device, and causes the interface content displayed by the second display screen to respond to the first operation, where the operation position of the first operation corresponds to a first object in the interface content of the first electronic device, and the first object does not respond to the first operation.
In an embodiment of the present application, the first object may include at least one of an application and a functionality control.
In this embodiment, the first electronic device may shield a response of the content of the front-end application to the operation position of the first operation, that is, the operation position of the first operation corresponds to a first object in the interface content of the first electronic device, and the first object does not respond to the first operation.
Specifically, after receiving a first operation on the display screen of the first electronic device, the first electronic device may mask the response of the interface content at the operation position of the first operation and determine the position of the cursor on the second display screen as the response position (first position) of the interface content to the first operation. For example, if the user performs a click operation on object A on the first display screen of the first electronic device while the cursor is located on object B in the second electronic device, the first electronic device may mask the response of object A to the click operation and respond to the click operation on object B of the interface content. Correspondingly, the first position is independent of the operation position of the first operation and depends only on the position of the cursor on the second display screen of the second electronic device.
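A minimal sketch of this masking-and-retargeting behavior is shown below; the Cursor interface is illustrative, and EventInjector refers to the reflection-based helper sketched earlier, so the same hidden-API caveats apply.

import android.view.MotionEvent;

public final class ProjectionTouchFilter {
    public interface Cursor {
        float x();
        float y();
    }

    // Consumes the raw touch so the object at the physical operation position
    // (object A) never responds, and re-targets a tap to the cursor position
    // (object B) instead.
    public static boolean onTouch(MotionEvent e, Cursor cursor) throws Exception {
        if (e.getActionMasked() == MotionEvent.ACTION_UP) {
            EventInjector.tap(cursor.x(), cursor.y());
        }
        return true; // true: the event is swallowed at its original position
    }
}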
In this embodiment of the application, the first electronic device responds to the first operation at the first position of the content displayed on the first display screen, and the second display screen of the second electronic device synchronously displays the content updated after the first display screen responds to the first operation at the first position; that is, the content displayed on the second display screen responds to the first operation.
In the embodiment of the application, the interface content of the first electronic device is projected to the second electronic device, so that the second display screen of the second electronic device displays the interface content; a first operation acting on the first display screen of the first electronic device is received; and the interface content displayed by the second display screen is caused to respond to the first operation, where the operation position of the first operation corresponds to a first object in the interface content of the first electronic device and the first object does not respond to the first operation. In this manner, after the screen projection connection is established between the first electronic device and the second electronic device, the first object at the operation position of the first operation in the first display screen does not respond, while the content displayed by the second display screen does respond to the first operation, so that the user can operate the content of the first display screen based on the content displayed by the second display screen without watching the first display screen of the first electronic device.
An embodiment of the present application further provides a first electronic device, where the first electronic device includes:
the processing module is used for projecting the interface content of the first electronic equipment into the second electronic equipment so that the second display screen of the second electronic equipment displays the interface content;
the processing module is further used for receiving a first operation acted on a first display screen of the first electronic device; causing the interface content displayed by the second display screen to be responsive to the first operation; wherein the operation position of the first operation corresponds to a first object in the interface content of the first electronic device; the first object does not respond to the first operation.
Optionally, the processing module is specifically configured to:
determining a first position in the interface content displayed by the first electronic equipment; causing the interface content displayed by the second display screen to respond to the first operation based on the first position; wherein the first position is independent of an operating position of the first operation.
Optionally, the processing module is further configured to:
projecting the cursor to the second electronic device, so that the second display screen displays the cursor.
Optionally, the position of the cursor in the interface content displayed by the first electronic device is the first position.
Optionally, the processing module is specifically configured to:
acquiring pose change information of the first electronic equipment;
determining a first position of the cursor in interface content displayed by the first electronic device based on the pose change information.
Optionally, the pose change information includes a first rotation angle of the first electronic device on a horizontal plane and a second rotation angle of the first electronic device on a vertical plane, and the processing module is specifically configured to:
determining a horizontal displacement of the cursor according to the first rotation angle;
determining a vertical displacement of the cursor according to the second rotation angle;
and determining a first position of the cursor in the interface content displayed by the first electronic equipment according to the horizontal displacement and the vertical displacement.
Optionally, the pose change information includes a first displacement of the first electronic device in a first direction and a second displacement of the first electronic device in a second direction, where the first direction is parallel to a left-right moving direction of a cursor displayed on a display screen of the second electronic device, and the second direction is parallel to an up-down moving direction of the cursor displayed on the display screen of the second electronic device, and the processing module is specifically configured to:
determining a horizontal displacement of the cursor according to the first displacement;
determining a vertical displacement of the cursor according to the second displacement;
and determining a first position of the cursor in the interface content displayed by the first electronic equipment according to the horizontal displacement and the vertical displacement.
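Both pose-based mappings above reduce to "pose delta times gain equals cursor displacement". The following is a minimal Kotlin sketch under that assumption, for illustration only; the class name, gain constants, and centre-screen starting position are assumptions, not taken from this application.

```kotlin
// Sketch of the pose-based cursor mapping described above.
class AirMouseMapper(
    private val screenW: Float,
    private val screenH: Float,
    private val gainPerDegree: Float = 20f,  // assumed: cursor pixels per degree of rotation
    private val gainPerMm: Float = 5f        // assumed: cursor pixels per millimetre of translation
) {
    var x = screenW / 2  // assumed default initial position: screen centre
    var y = screenH / 2

    // Variant 1: first rotation angle (horizontal plane) -> horizontal displacement,
    // second rotation angle (vertical plane) -> vertical displacement.
    fun onRotation(firstAngleDeg: Float, secondAngleDeg: Float) =
        move(firstAngleDeg * gainPerDegree, secondAngleDeg * gainPerDegree)

    // Variant 2: first displacement (parallel to the cursor's left-right axis) ->
    // horizontal displacement, second displacement (up-down axis) -> vertical displacement.
    fun onTranslation(firstMm: Float, secondMm: Float) =
        move(firstMm * gainPerMm, secondMm * gainPerMm)

    private fun move(dx: Float, dy: Float) {
        // Clamp so the first position always falls inside the interface content.
        x = (x + dx).coerceIn(0f, screenW)
        y = (y + dy).coerceIn(0f, screenH)
    }
}
```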
Optionally, the first operation at least includes a click operation and a first slide operation, and the first object includes at least one of an application and a function control.
Optionally, the processing module is specifically configured to:
receiving a second sliding operation on a display screen of the first electronic equipment;
determining the displacement of the cursor according to the second sliding operation;
and determining a first position of the cursor in the interface content displayed by the first electronic equipment according to the displacement of the cursor.
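As a minimal sketch of this touchpad-style mapping: only the finger's displacement, scaled by an assumed sensitivity constant, moves the cursor; the absolute touch coordinates never become a response position. The names and constant below are illustrative.

```kotlin
// Minimal sketch of deriving the cursor position from the second sliding operation.
class SlideToCursor(private val sensitivity: Float = 1.5f) {  // assumed tuning constant
    var cursorX = 0f
    var cursorY = 0f
    private var lastX = 0f
    private var lastY = 0f

    fun onTouchDown(x: Float, y: Float) {
        lastX = x; lastY = y  // remember where the slide starts; no response happens here
    }

    fun onTouchMove(x: Float, y: Float) {
        // The relative displacement of the finger moves the cursor; the touch
        // position itself never becomes the response position.
        cursorX += (x - lastX) * sensitivity
        cursorY += (y - lastY) * sensitivity
        lastX = x; lastY = y
    }
}
```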
Optionally, the first operation at least includes a click operation, and the first object at least includes one of an application and a function control.
Optionally, the first operation is a preset shortcut operation, and the processing module is further configured to:
causing the first electronic device to respond to the first operation.
Optionally, the operation position of the first operation corresponds to a third object in the interface content of the second electronic device, and the third object does not respond to the first operation.
Optionally, the preset shortcut operation at least includes:
contacting a first preset area of the display screen of the first electronic device and sliding from the first preset area in a first preset direction; or,
contacting a second preset area of the display screen of the first electronic device and sliding from the second preset area in a second preset direction, wherein the time of contact with the display screen is longer than a preset time.
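For illustration only, the two preset shortcut forms above could be recognized along the following lines; the edge areas, directions, travel distances, and time threshold are all assumed values, not part of this application.

```kotlin
// Illustrative recognizer for the two preset shortcut operations listed above.
data class Swipe(
    val startX: Float, val startY: Float,
    val endX: Float, val endY: Float,
    val contactMs: Long            // how long the finger stayed on the screen
)

fun matchesPresetShortcut(
    s: Swipe,
    edgePx: Float = 40f,           // assumed width of the preset edge areas
    minTravelPx: Float = 120f,     // assumed minimum slide distance
    presetTimeMs: Long = 500       // assumed preset contact time
): Boolean {
    // First form: starts inside a first preset area (here: the left edge)
    // and slides in a first preset direction (here: to the right).
    val firstForm = s.startX <= edgePx && (s.endX - s.startX) >= minTravelPx

    // Second form: starts inside a second preset area (here: the top edge),
    // slides in a second preset direction (downwards), and the contact time
    // exceeds the preset time.
    val secondForm = s.startY <= edgePx &&
        (s.endY - s.startY) >= minTravelPx &&
        s.contactMs > presetTimeMs

    return firstForm || secondForm
}
```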
Optionally, the processing module is further configured to:
reducing the display brightness of the first display screen; or,
performing a screen-off operation on the first display screen.
Optionally, the processing module is further configured to:
generating a menu bar;
and projecting the menu bar into the second electronic equipment so that the second display screen displays the menu bar.
Optionally, the processing module is further configured to:
detecting that the first electronic device and the second electronic device establish screen projection connection.
The embodiment of the application also provides a control method for multi-application display.
Specifically, the imaging system (e.g., a display screen, glasses, etc.) of the second electronic device (e.g., a large screen, a television, or an AR, VR, or MR device) may display multiple display interfaces of the first electronic device.
Optionally, the first electronic device may generate a menu bar when content transmission with the second electronic device is established; the menu bar may be displayed in the imaging system of the second electronic device, and the user may add a new independent display interface of the first electronic device (e.g., a mobile phone interface) in the imaging system by using an "add" button in the menu bar, as shown in fig. 30.
Optionally, each time the user clicks "add", an independent mobile phone interface may be newly added to the imaging system. The state of the newly added interface may be the home interface of the mobile phone, the current display interface of the mobile phone, a preset default interface, an interface of a certain preset application, a copy of a mobile phone interface currently displayed in the imaging system, or a random interface; the present invention is illustrative only and is not limited thereto. Because a plurality of independent interfaces of the mobile phone (such as a first interface and a second interface) can be presented in the imaging system, the content on the large screen can be enriched.
Optionally, the user may also select a new application to run in the imaging system, which may generate a new, separate application interface in the imaging system of the second electronic device.
Optionally, when the user clicks an entry in an application, a new independent second-level interface extended from that application may be opened, for example, the chat of a specific contact in WeChat, or a specific article in a news APP.
Optionally, when the user establishes content connection and transmission between the first electronic device and the second electronic device, and the first electronic device already runs N applications, the interface content of the N applications may be correspondingly displayed in N display areas in the imaging system, respectively.
Optionally, the interface contents may be distributed without overlapping, or may be distributed in a stacked manner (a display interface of the frontmost application currently running is kept displayed at the frontmost end), which is not limited in this application.
The first electronic device may generate a cursor (as in the cursor generation method described above) when content transmission with the second electronic device is established, and the cursor may be displayed in the imaging system through the data transmission. The movement range of the cursor may be adapted to the full range of the imaging system. Specifically, the first electronic device may first acquire parameter information of the imaging system, such as the size specification of a large-screen display or the field-of-view parameters of an AR/VR device, and determine the movement range and the mapping rule of the cursor based on that parameter information. The cursor can then operate across a plurality of different display interfaces of the first electronic device, instead of being limited to a single display interface, and can reach the display boundary of the whole imaging system. In this way, when a plurality of display interfaces of the first electronic device are generated in the imaging system, the operation range of the cursor covers the content in all of them, and the cursor can move freely among them. The cursor is used to determine an operation object within the entire display range of the imaging system; the operation object may be an object in a display interface of the first electronic device, or another object in the imaging system that does not belong to any display interface of the first electronic device.
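A minimal sketch of such a mapping rule: the cursor is bounded by the acquired imaging-system parameters rather than by any single projected interface. The parameter form below is an assumption for illustration.

```kotlin
// Sketch: the cursor's movement range is adapted to the full range of the
// imaging system (a large-screen size, or an AR/VR field of view mapped to
// pixels), so it can cross all projected display interfaces.
data class ImagingParams(val widthPx: Int, val heightPx: Int)  // assumed parameter form

class GlobalCursor(params: ImagingParams) {
    private val maxX = params.widthPx.toFloat()
    private val maxY = params.heightPx.toFloat()
    var x = maxX / 2
    var y = maxY / 2

    fun moveBy(dx: Float, dy: Float) {
        // Bounded only by the imaging system's display boundary, not by the
        // boundary of any one display interface of the first electronic device.
        x = (x + dx).coerceIn(0f, maxX)
        y = (y + dy).coerceIn(0f, maxY)
    }
}
```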
Optionally, the user may control the position of the cursor in the imaging system by adjusting the pose of the first electronic device (in this embodiment, pose may also be understood as posture) or by sliding on the screen of the first electronic device, so that an operation object can be determined in the content displayed in the imaging system according to the user's operation. The starting position of the sliding operation corresponds to a first object in the interface content of the first electronic device, and the first object does not respond to the sliding operation. It should be understood that the sliding operation is an arbitrary operation that may be performed multiple times; the longer the trajectory of the first sliding operation, the larger the corresponding movement range of the cursor in the imaging system.
Optionally, when the cursor is located on an object in the imaging system (including but not limited to an application icon, a function key, an option of any type, or a preset position such as a blank area), indicating that the object has been confirmed as the operation object, the user inputs an operation representing "confirm" on the terminal, such as a key press, a touch, a click, or a special slide, and the operation object responds to the "confirm" command, for example by entering an application, enabling a certain function key, selecting a certain option, or generating a certain shortcut operation; the invention is not exhaustive here. It should be understood that, at this time, the "confirm" operation is an operation on the touch screen, and the touch position of the "confirm" operation corresponds to a second object in the interface content of the first electronic device; the second object does not respond to the "confirm" operation.
In a specific implementation, the content of the first electronic device interface may be adapted to the content in the imaging system.
Optionally, the current interface content of the first electronic device may be "single screen". For example, if a cursor in a display interface in the imaging system is located in a first application interface, interface content of the first electronic device is synchronized to the first application interface at this time; if the cursor in the imaging system transitions from the first application interface to the second application interface, the interface content of the first electronic device also switches from the first application interface to the second application interface at this time.
Optionally, in a specific implementation process, the content of the interface of the first electronic device may be "multi-screen", for example, the display interface of the first electronic device remains corresponding to all display interfaces in the second electronic device. Specifically, if the display interface in the imaging system includes the first application interface and the second application interface, the interface content of the first electronic device is synchronized to the first application interface and the second application interface; if the display interface in the imaging system comprises the first application interface, the second application interface and the third application interface, the interface content of the first electronic equipment is synchronized to be the first application interface, the second application interface and the third application interface.
In order to save power consumption, when the first electronic device transmits the content to be displayed to the imaging system, the display brightness of the first electronic device can be reduced, or a screen-off operation can be performed on the first electronic device.
In conjunction with the foregoing embodiments, the present application proposes a novel touch screen-based control method. The method can be applied to screen projection scenarios, in which an intelligent terminal device such as a mobile phone, a tablet computer, or a notebook computer transmits and projects data content to a large display terminal such as a television or a large screen; the intelligent terminal device can also transmit and present data content to imaging devices such as AR, VR, and MR devices; the present application is given by way of illustration and not limitation. The method can be applied to general screen projection scenarios as well as presentation scenarios: for example, when a user explains the content of the terminal to an audience, the user needs to interact with the audience and is not suited to watching the terminal screen all the time, and here the terminal is not only the content source device but also the control device. The invention can also be applied to blind operation of devices such as AR and VR devices: if the user wears an AR, VR, or MR device, the user's eyes are already covered and the terminal screen is invisible to the user; however, the mobile phone is usually the content input source of the AR, VR, or MR device, so the user still needs to operate the mobile phone, and in the prior art this has to be done by means of a handle.
The method can be applied to first electronic equipment, wherein the first electronic equipment can include but is not limited to a mobile phone, a tablet computer, a notebook computer and other portable small intelligent terminals; the first electronic device comprises one or more touch screens, and the touch screens can be used for displaying an interface of the terminal; referring to fig. 31, the method may include:
s1001: when the first electronic device and the second electronic device are connected in a screen projection mode, a cursor is generated and displayed in an interface of the touch screen. The second electronic device may include, but is not limited to, the aforementioned television, large screen, display, AR, VR or MR display device; it should be understood that AR, VR or MR do not have an oversized display volume, but they are in close proximity to the user's glasses when worn by the user, and thus also produce a relatively large viewing field, and thus may also be understood as a type of projection screen. The first electronic device includes, but is not limited to, the above-mentioned smart terminal such as a mobile phone, a tablet computer, a notebook computer, etc.
The method for establishing the screen-casting connection may refer to the related description in the foregoing embodiments, such as wireless direct connection or command switching in the setting options, and is not repeated here. "When the screen projection connection is established" can be understood as any time at which the first electronic device and the second electronic device are in a screen projection state; for example, it may include the moment at which the screen projection connection is successfully established, or a certain time or period of time after the connection is successfully established; the invention is not limited in this respect.
Optionally, after the screen-casting connection between the first electronic device and the second electronic device is successfully established, the touch screen of the first electronic device may be immediately used as a touch panel, that is, the first electronic device is triggered to call a preset touch panel event processing system, and a factory default touch screen event processing system of the first electronic device is shielded.
Optionally, after the screen-casting connection is established, the interface of the first electronic device or the related settings may include a first function control; the first function control may be configured to respond to a first switching command of a user, instruct the first electronic device to call a preset touch panel event processing system, and shield a factory default touch screen event processing system of the first electronic device.
It should be appreciated that, considered separately, touch screen event processing systems and touch pad event processing systems are both mature prior art.
From the technical principle, the touch screen uses a set of instruction/event processing systems that position based on absolute coordinates: the user directly touches wherever a response is wanted, no cursor is needed, and the response position of a touch is correlated to the actual operation position of the touch. The touch pad uses a set of instruction/event processing systems that position based on relative coordinates: a cursor is needed to indicate the operation position in the interface in real time, the movement of the user's finger is sensed to move the cursor, and a touch of the user's finger is sensed to perform operations such as confirmation; it can be understood as a replacement for a mouse, and the response position of a touch is related not to the actual operation position of the touch but to the cursor position. Therefore, the presence of a cursor can serve as a distinguishing element of whether the first electronic device employs a touch screen event processing system or a touch pad event processing system: the touch screen event processing system does not need a cursor, while the touch pad event processing system requires one. It should be understood that the touch screen of a terminal in the prior art is configured with only a touch screen event processing system.
In general, a touch operation on a touch screen may include three basic operations:
Touchdown(x, y), i.e., "press down";
Touching(x, y), i.e., "slide" or "long press";
Touchup(x, y), i.e., "lift off";
where (x, y) represents the actual touch position, i.e., the actual operation position, of the touch operation on the screen.
In the touchpad event processing system, a cursor is controlled by using the relative position of a touch operation, and the initial position of the cursor is default or random.
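The contrast between the two systems can be sketched as two implementations of one interface, reusing the SlideToCursor sketch above; all names here are illustrative, not from this application.

```kotlin
// Minimal sketch contrasting absolute positioning (touch screen event
// processing system) with relative positioning (touch pad event processing
// system). The callbacks stand in for whatever responds in the interface.
interface EventSystem {
    fun touchdown(x: Float, y: Float)
    fun touching(x: Float, y: Float)
    fun touchup(x: Float, y: Float)
}

// Absolute coordinates: the response position is the operation position.
class TouchScreenSystem(private val respondAt: (Float, Float) -> Unit) : EventSystem {
    override fun touchdown(x: Float, y: Float) = respondAt(x, y)
    override fun touching(x: Float, y: Float) = respondAt(x, y)
    override fun touchup(x: Float, y: Float) = respondAt(x, y)
}

// Relative coordinates: finger movement only steers the cursor; the response
// position is the cursor position, regardless of where the finger actually is.
class TouchPadSystem(
    private val cursor: SlideToCursor,             // from the sketch above
    private val confirmAt: (Float, Float) -> Unit
) : EventSystem {
    override fun touchdown(x: Float, y: Float) = cursor.onTouchDown(x, y)
    override fun touching(x: Float, y: Float) = cursor.onTouchMove(x, y)
    override fun touchup(x: Float, y: Float) =
        confirmAt(cursor.cursorX, cursor.cursorY)  // simplified: every lift confirms at the cursor
}
```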
For the first electronic device, one possible implementation of the touch screen event processing system and the touch pad event processing system may be as shown in Table 1 below:
TABLE 1
(Table 1 is reproduced as an image in the original publication; like Table 2 below, it maps each function/instruction to the corresponding operation modes of the touch screen event processing system and the touch pad event processing system.)
Optionally, an air mouse event processing system, similar to the touch pad event processing system, may also be predefined, in which the position of the cursor in the interface is changed by changing the posture of the mobile phone; for the specific implementation, refer to the related description in the foregoing embodiments. One possible implementation of the touch screen event processing system and the air mouse event processing system is shown in Table 2 below:
TABLE 2
| Function/instruction | Operation mode of the touch screen event processing system | Operation mode of the air mouse event processing system |
| --- | --- | --- |
| Changing the cursor position | / | Changing the posture of the mobile phone |
| Confirmation | Press down | Press down at the cursor |
| End | Lift up | Lift up |
| Dragging an object | Slide | After a long press, change the posture of the mobile phone |
| Return | Single-click the Return key | Full-screen gesture: slide from the side of the screen toward the centre |
| Enter the main interface | Single-click the Home key | Full-screen gesture: slide up from the bottom of the screen |
| …… | …… | …… |
It should be understood that the first electronic device may invoke the event processing systems in Tables 1 and 2 independently or in combination. For example, the position of the cursor in the interface can be changed by a sliding operation, by changing the posture of the mobile phone, or by the two in cooperation; that is, the touch pad event processing system can be combined with the air mouse event processing system, where some identical instruction definitions may be multiplexed.
It should be understood that the above touch pad event processing system or air mouse event processing system may include more or fewer function definitions than the examples given here. Any preset event processing system may be defined in advance, and the number, types, and triggering manners of its commands may be flexibly defined by a user or a developer according to experience or requirements, so that an end user can freely define and invoke preset event processing systems for different scenarios when using the terminal. The embodiments of the present application are not intended to be exhaustive or limiting.
It should be understood that when the first electronic device has not established a screen-casting connection with any electronic device, the interface of the touch screen does not include a cursor. Before establishing a screen-casting connection, the first electronic device works in the normal mode, i.e., it applies the conventional touch screen event processing system, which needs no cursor as an indication. When the first electronic device establishes a screen-casting connection with the second electronic device, the first electronic device may invoke a new event processing system, such as the above touch pad event processing system or air mouse event processing system; at this time, a cursor needs to be generated in the interface to indicate the real-time operation position under the new instruction system.
Optionally, the first electronic device obtains interface content of a front-end application corresponding to the current interface, and draws a cursor in the interface content.
Optionally, the first electronic device may draw a cursor and add the cursor to the current interface based on the floating window interface.
Optionally, the first electronic device may draw a ray and intersect the ray with the current interface or a virtual plane/panel of the current interface at the end point based on the floating window interface; the end point is the cursor. Rays can be widely applied in 3D immersive display scenes with AR/VR engines.
It should be understood that the drawn cursor does not belong to the same process as the currently running foreground application.
It should be understood that, during the process that the terminal calls the touch pad event processing system or the air mouse event processing system, a cursor can be displayed in the interface of the first electronic device so as to clearly prompt the user for operation. Specifically, the method for generating the cursor and the presentation form of the cursor may refer to the related description in the foregoing embodiments, and are not repeated here.
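As one concrete possibility (an Android-flavoured sketch, not the patent's prescribed implementation), the floating-window approach could draw the cursor as a small, non-touchable overlay in its own window, so that it belongs to neither the process nor the view tree of the foreground application; the overlay permission is assumed to have been granted.

```kotlin
import android.content.Context
import android.graphics.PixelFormat
import android.view.Gravity
import android.view.View
import android.view.WindowManager

// Sketch: the cursor as a floating overlay window above the current interface.
class CursorOverlay(context: Context, private val cursorView: View) {
    private val wm = context.getSystemService(Context.WINDOW_SERVICE) as WindowManager
    private val lp = WindowManager.LayoutParams(
        WindowManager.LayoutParams.WRAP_CONTENT,
        WindowManager.LayoutParams.WRAP_CONTENT,
        WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY,   // floats above the current interface
        WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE or
            WindowManager.LayoutParams.FLAG_NOT_TOUCHABLE,     // the cursor itself never consumes touches
        PixelFormat.TRANSLUCENT
    ).apply { gravity = Gravity.TOP or Gravity.START }

    fun show() = wm.addView(cursorView, lp)

    fun moveTo(x: Int, y: Int) {
        lp.x = x; lp.y = y
        wm.updateViewLayout(cursorView, lp)                    // reposition the drawn cursor
    }
}
```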
S1002: after the screen projection connection is established, the first electronic device generates screen projection content and sends it to the second electronic device in real time, where the screen projection content includes the cursor.
generally, the screen projection is to display the content displayed in the screen projection source device in another "wider-view" display device; the display contents of the two electronic devices are basically kept consistent.
Specifically, after the screen projection connection between the first electronic device and the second electronic device is successfully established, the first electronic device generates the screen projection content in real time. The screen projection content may include some or all of the content in the interface of the touch screen; for example, it may further include an interface of a background application run by the first electronic device, a newly drawn menu bar, or preset function controls. The first electronic device sends the screen projection content to the second electronic device in real time, where the screen projection content is used to synchronize some or all of the content presented in the display system of the second electronic device with some or all of the content displayed in the touch screen.
Optionally, the screen projection content generated by the first electronic device may be the whole of its display interface, and the second electronic device synchronously displays the screen projection content; in addition, the display interface of the second electronic device may contain not only the screen projection content but also more function options, backgrounds, or menus. Optionally, the screen projection content may be only a part of the display interface: for example, when the first electronic device is showing news or videos, the screen projection content may be only the content corresponding to the news or video application, while other parts of the terminal interface, such as the signal, battery, and network indicator icons at the top, may be excluded from the screen projection content.
It should be understood that after the first electronic device and the second electronic device are connected, the display interfaces of both may include the real-time screen projection content. Changes of the interface and of the content in the first electronic device are also reflected, in whole or in part, in the second electronic device in real time along with the screen projection content. Therefore, the cursor in the first electronic device is also displayed in the second electronic device, and the movement of the cursor and operations on an object can be displayed synchronously in the second electronic device. The user can thus perceive feedback on the operation of the first electronic device by watching the screen projection content synchronously displayed in the second electronic device.
It should be understood that an operation or instruction received by the first electronic device is first responded to in the first electronic device, and the resulting response change is synchronously displayed in the display system of the second electronic device along with the projected interface content.
S1003: receiving a touch operation of a user on the touch screen.
As described above, the touch screen-based operation may include:
Touchdown(x, y), i.e., "press down";
Touching(x, y), i.e., "slide" or "long press";
Touchup(x, y), i.e., "lift off".
Usually, one quick "Touchdown + Touchup" is called a click operation; a "Touching" at a constant position can be understood as a long press operation; and a "Touching" whose position shifts can be understood as a sliding operation.
Touch screen-based operations may generally include the three basic operations described above, or combinations thereof distinguished by time, number, position, orientation, and/or force.
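For illustration, a completed touch can be classified into these composites with two assumed thresholds (tap slop and long-press time); the values below are assumptions, not from this application.

```kotlin
import kotlin.math.hypot

// Illustrative classification of a completed touch into the composite
// operations named above: click, long press, or slide.
enum class Gesture { CLICK, LONG_PRESS, SLIDE }

fun classify(
    downX: Float, downY: Float,   // position of Touchdown
    upX: Float, upY: Float,       // position of Touchup
    durationMs: Long,             // time between Touchdown and Touchup
    tapSlopPx: Float = 16f,       // assumed movement tolerance for a tap
    longPressMs: Long = 500       // assumed long-press threshold
): Gesture = when {
    hypot(upX - downX, upY - downY) > tapSlopPx -> Gesture.SLIDE  // Touching with shifting position
    durationMs >= longPressMs -> Gesture.LONG_PRESS               // Touching at a constant position
    else -> Gesture.CLICK                                         // quick Touchdown + Touchup
}
```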
It should be understood that the user may input operations other than touch operations during the use of the first electronic device, such as shaking the device, pressing a key, or touching other parts of the device body; the invention is not limited thereto.
S1004: responding to the target touch operation based on the display position of the cursor in the current interface.
The display position of the cursor in the current interface is used for indicating the response position to the target touch operation in the current interface; the response position is independent of the operation position of the target touch operation on the touch screen.
Optionally, after the first electronic device invokes the preset event processing system, the touch screen event processing system is shielded, so that the target touch operation cannot be injected into the touch screen event processing system to form a processing path; the target touch operation is re-parsed into a target event, and the target event is injected into the preset event processing system, so that the first electronic device responds to the target touch operation according to the preset event processing system.
Specifically, taking the Linux system as an example and referring to fig. 32, a general event processing procedure is as follows:
1) The Linux kernel writes raw events (e.g., touch screen operations) into device nodes. In Linux, all devices are stored in the /dev directory in file form and are accessed in the manner of files; a device node is the Linux kernel's abstraction of a device.
2) The InputReader (also called the Reader) continuously extracts raw input events from the bottom layer (such as the EventHub) in a thread loop and, after processing them, temporarily places the processed events into a dispatch queue.
3) The InputDispatcher (also called the Dispatcher) fetches events from the dispatch queue in a thread loop, looks up the appropriate window, and writes the events into the window's event-receiving pipe (to the APP); that is, the InputDispatcher is responsible for dispatching input events to the application layer.
4) The window's event-receiving thread Looper takes the event out of the pipe and sends it to the event processing function for response, and the application receives it through that response function. The InputDispatcher and each application are not in the same process; the window pipe, the InputChannel, mainly completes the cross-process communication and transmits the input data to the application.
5) The Looper is a loop in the application that constantly fetches messages and then invokes the event receivers to handle them. The event receivers are mainly the Views in the application, such as buttons and input boxes.
The whole process has three threads connected end to end, delivering events layer by layer to the event processing function like three water pumps. Modifying any one link can cut off the path formed by the three threads, thereby masking the injection of input events into the touch screen event processing system.
At this time, the first electronic device may intercept the input event and forward it to the newly invoked event processing system: the input event is converted into a new operation event and injected into the relevant link of the new event processing system, forming a new event processing thread path and thereby implementing the new event instructions.
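The idea can be modelled conceptually as follows, reusing the EventSystem interface from the earlier sketch. Real interception happens inside the input framework threads described above, not in application-level Kotlin; this sketch only illustrates the data flow of masking and re-injection.

```kotlin
// Conceptual sketch of "cut off and re-inject": raw input events are diverted
// away from the default touch screen path, re-parsed into target events, and
// injected into the newly invoked event processing system.
data class RawTouch(val x: Float, val y: Float, val action: Int)  // 0 = down, 1 = move, 2 = up

class InputInterceptor(
    private val presetSystem: EventSystem,  // e.g. the TouchPadSystem sketched earlier
    var masked: Boolean = true              // true while the touch screen system is shielded
) {
    fun dispatch(e: RawTouch) {
        if (!masked) { passToDefaultPipeline(e); return }
        // Re-parse the raw event into a target event for the preset system;
        // the default touch screen event processing system never sees it.
        when (e.action) {
            0 -> presetSystem.touchdown(e.x, e.y)
            1 -> presetSystem.touching(e.x, e.y)
            2 -> presetSystem.touchup(e.x, e.y)
        }
    }

    private fun passToDefaultPipeline(e: RawTouch) {
        // Placeholder for the unmodified Reader -> Dispatcher -> window path.
    }
}
```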
Optionally, after the first electronic device invokes the preset event processing system, a transparent target floating window is drawn on the current interface; the target floating window is used for shielding the touch screen event processing system, so that the target touch operation cannot be injected into the touch screen event processing system to form a processing path. The target event corresponding to the target touch operation is parsed by the target floating window and injected into the preset event processing system, so that the first electronic device responds according to the preset event processing system.
Specifically, a developer can develop an APP that, at run time, displays a transparent floating window on the currently running application interface; the floating window can intercept touch events and obtain input events preferentially without affecting the display of the current application interface. The path of the event processing may refer to the threads described above and is not repeated here.
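An Android-flavoured sketch of this transparent floating-window variant follows, reusing the InputInterceptor and RawTouch types from the previous sketch; it assumes the overlay permission is granted and simplifies the event translation.

```kotlin
import android.content.Context
import android.graphics.PixelFormat
import android.view.MotionEvent
import android.view.View
import android.view.WindowManager

// Sketch: a full-screen, transparent but touchable overlay that consumes every
// touch first and forwards it to the interceptor, leaving the interface below
// visually unchanged.
class TouchMaskWindow(context: Context, private val interceptor: InputInterceptor) {
    private val wm = context.getSystemService(Context.WINDOW_SERVICE) as WindowManager

    private val mask = object : View(context) {
        override fun onTouchEvent(event: MotionEvent): Boolean {
            val action = when (event.actionMasked) {
                MotionEvent.ACTION_DOWN -> 0
                MotionEvent.ACTION_MOVE -> 1
                else -> 2
            }
            interceptor.dispatch(RawTouch(event.x, event.y, action))
            return true   // consume: the touch never reaches the touch screen system below
        }
    }

    fun attach() {
        val lp = WindowManager.LayoutParams(
            WindowManager.LayoutParams.MATCH_PARENT,
            WindowManager.LayoutParams.MATCH_PARENT,
            WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY,
            WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE,  // transparent to keys, opaque to touches
            PixelFormat.TRANSLUCENT
        )
        wm.addView(mask, lp)
    }
}
```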
Some specific examples are given below.
Specifically, when the first electronic device invokes the touch pad event processing system, if the touch operation is a sliding operation, the sliding operation is used for changing the position of the cursor in the current interface, while the functions that the touch screen event processing system would produce at the actual operation position of the slide, such as page turning, sliding up, or sliding down, are shielded. The objects referred to in this application include, but are not limited to, applications, icons, menu options, content of interest in an interface, blank areas, and function controls.
Specifically, when the first electronic device invokes the touch pad event processing system, if the touch operation is a click operation, the click operation is used for performing a confirmation operation on the object corresponding to the cursor in the current interface, while functions such as "confirm the object at the absolute position of the click" of the touch screen event processing system are masked.
Specifically, when the first electronic device invokes the touch pad event processing system, if the touch operation is a long press plus slide operation, it is used for performing slide-type operations on the object corresponding to the cursor in the current interface, such as "page turning", "slide up", "pull down", "move icon", or "adjust progress bar/intensity", which is equivalent to simulating a sliding operation of the touch screen event processing system. Meanwhile, functions such as "select the object at the absolute position of the long press and then move it" of the touch screen event processing system are shielded.
Specifically, when a target touch operation makes the motion track of the cursor in the current interface match a preset shortcut operation command, or when the target touch operation itself matches a preset shortcut operation command, the target touch operation is used for executing that shortcut operation command. Meanwhile, the operations that the touch screen event processing system would perform on the object at the actual operation position of the target touch operation, such as "page turning", "slide up", "pull down", or "adjust progress bar/intensity", are shielded.
Specifically, a target touch operation matching a preset shortcut operation command includes: sliding from a first preset area of the touch screen in a first preset direction; or sliding from a second preset area of the touch screen in a second preset direction, where the time of contact with the touch screen is longer than a preset time; or clicking the touch screen a preset number of times; or the sliding track of the target touch operation matching a preset pattern.
For the specific functional responses, reference may be made to, but is not limited to, the command correspondences of the touch pad event processing system and the air mouse event processing system recorded in Tables 1 and 2 above, as well as the descriptions in the foregoing related embodiments, such as how a specific slide or a change of the electronic device's posture controls the specific movement of the cursor, or how a shortcut is triggered. It should be understood that, for any event processing system, there is a preset correspondence table between functions/instructions and operation modes; when the first electronic device receives or recognizes an operation corresponding to a target function/instruction, it can respond with the corresponding target function/instruction of that event processing system. These correspondences are not exhaustively repeated here.
In addition to the above steps, the technical solution may further include the following optional embodiments.
S1005: after the first electronic device invokes the new event processing system, the user does not need to view the display content of the first electronic device in real time. At this time, the first electronic device may reduce the display brightness of the touch screen, or turn off the touch screen; reference may be made to the description of the foregoing embodiments. The dimming time may be after the screen-casting connection between the first electronic device and the second electronic device is successfully established, when the user clicks the screen-casting switching command, or when the screen-casting connection has existed for a preset time.
S1006: when it is detected that the first electronic device and the second electronic device establish a screen-casting connection, invocation of the first event processing system is triggered; or a first switching command input by the user is detected, where the first switching command is used to instruct or trigger the first electronic device to invoke the first event processing system and shield the touch screen event processing system.
S1007: reducing the display brightness of the touch screen or turning off the touch screen, so that the power consumption of the first electronic device is reduced when the user does not need to look at its screen.
S1008: referring to fig. 30, when the first electronic device establishes a screen projection connection with the second electronic device, the method further includes: starting a multi-screen display mode; creating N virtual screens in the interface of the touch screen; determining N interfaces to be displayed; and displaying the N interfaces to be displayed in the N virtual screens respectively. The N interfaces to be displayed belong to M applications that can run simultaneously, where M is an integer not greater than N and N is an integer greater than 1. The N interfaces may include, but are not limited to, a native interface of a first application, a primary interface or a secondary extended interface of the first application, a floating playback window, a picture-in-picture, a parallel page, a split screen, or other display interfaces; in addition, desktop interfaces such as the home interface or the minus-one screen can also be regarded as applications. The cursor can be located at any position in the N virtual screens, that is, the cursor can move freely across interfaces and operate on multiple interfaces, which enhances the control capability of the first electronic device. Optionally, a menu bar may be included in the interface; the menu bar includes function controls for adding or deleting interface content of applications in the imaging system.
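As a rough illustration of S1008 on Android, the N virtual screens could build on DisplayManager.createVirtualDisplay; how each interface's Surface is produced and how applications are placed on the displays are simplified assumptions here, not prescribed by this application.

```kotlin
import android.hardware.display.DisplayManager
import android.hardware.display.VirtualDisplay
import android.view.Surface

// Sketch: create one virtual display per interface to be displayed; the N
// backing surfaces are later composed into the projected content.
fun createVirtualScreens(
    dm: DisplayManager,
    surfaces: List<Surface>,   // one Surface per interface to be displayed
    width: Int, height: Int, dpi: Int
): List<VirtualDisplay> =
    surfaces.mapIndexed { i, surface ->
        dm.createVirtualDisplay("virtual-screen-$i", width, height, dpi, surface, 0)
    }
```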
S1009: if the time that the cursor stays in a second application exceeds a preset time, or the second application is the application currently selected by the user, the first electronic device may close the multi-screen display mode and keep or enlarge the interface corresponding to the second application as the front-end application interface, while switching the other applications to the background. The first application and the second application described above each belong to the M applications. Optionally, if the first electronic device receives no external input for a long time, the cursor may be hidden, for example while the user is playing a video or music; when the first electronic device receives a predefined operation, the cursor may be reactivated, as described in the foregoing embodiments. Seamless control among multiple applications or interfaces can thus be realized in the multi-screen display interface, presenting a richer interface with a "wider field of view".
S1010: the first electronic device receives a second switching command; the second switching command is used to instruct the first electronic device to turn off the preset event processing system and resume the touch screen event processing system, or to switch to another predefined event processing system. The second switching command may be input by the user, or the first electronic device may be triggered when a preset switching condition is met, for example by shaking.
S1011: a floating-window menu bar may also be drawn, which may include controls for a variety of predefined functions, such as a toggle control that switches the "ray" control described in the foregoing embodiments to air mouse control. In a specific implementation process, the switching between ray control and air mouse control can also be triggered by shaking.
It should be understood that the above-mentioned "drawing" can be implemented by the screen projection service of the first electronic device, a preset service in the underlying system, or a running specific application.
For the first electronic device in this application, the touch screen both displays the interface, i.e., serves as the content source, and integrates a touch sensor to respond to touch operations; generally, the touch screen ships with only the single touch screen event processing system that users are accustomed to. The invention provides at least one new event processing system for the touch screen. The new event processing system may be written in advance before the terminal leaves the factory, written by a developer, or written by an application program. The invention can write a predefined touch pad, mouse, or similar event processing system into the system of a terminal such as a mobile phone; while using the terminal, a consumer can switch the conventional touch screen event processing system to a new event processing system according to the scene requirements, so that the first electronic device serves both as the content source and as a touch pad, and screen projection, presentation, or blind operation can be performed without an additional control device. In particular, for application scenarios in which the user cannot or should not watch the first electronic device, the content source in the first electronic device can still be accurately controlled through the feedback of the changing screen projection content on the second electronic device; the touch screen, with its excellent touch feel, is used as the touch pad, and almost all operations on the content source can be completed with only one terminal, which provides great convenience for the user.
Correspondingly, the present application also provides a touch screen-based control device 2000, shown in fig. 33, which can be applied to a first electronic device, where the first electronic device includes a touch screen and the touch screen can be used to display an interface. The device includes:
a generating module 2001, configured to generate a cursor when the first electronic device and the second electronic device establish a screen-casting connection, where the cursor is displayed in an interface of the touch screen; the generation module 2001 may be implemented by a processor calling a corresponding program instruction.
A screen projection module 2002 for generating screen projection content based on the interface content in the touch screen; sending the screen projection content to the second electronic equipment; the screen projection content comprises a cursor; the screen projection module 2002 may be implemented by a processor invoking corresponding program instructions, cooperating with an associated screen recording interface, screen projection interface, and/or transceiver.
A receiving module 2003, configured to receive a target touch operation of a user on a touch screen; the receiving module 2003 may be implemented by a processor invoking corresponding program instructions in conjunction with a touch screen sensing event.
A response module 2004, configured to respond to the target touch operation based on a display position of the cursor in the current interface;
The display position of the cursor in the current interface is used for indicating a response position for the target touch operation in the current interface; the response position is irrelevant to the operation position of the target touch operation on the touch screen.
In a specific implementation process, the apparatus may further include a calling module 2005, configured to call the first event processing system and shield the touch screen event processing system before the generating module generates the cursor; or to call the first event processing system and draw a transparent target floating window on the front-end application corresponding to the current interface, where the target floating window is used for shielding the touch screen event processing system. The response of the touch screen event processing system to the target touch operation is related to the operation position of the target touch operation on the touch screen; the response of the first event processing system to the target touch operation is related to the position of the cursor and unrelated to the operation position of the target touch operation on the touch screen. For the specific responses of the response module under the first event processing system, reference may be made to the foregoing method embodiments, and details are not repeated here.
In a specific implementation process, the apparatus may further include a detection module 2006, configured to detect that the first electronic device and the second electronic device establish a screen-casting connection before the calling module calls the first event processing system; or, detecting a first switching command input by a user; the first switch command is used to instruct the first electronic device to invoke the first event processing system and to mask the touch screen event processing system.
In a specific implementation process, the apparatus may further include a brightness control module 2007, configured to reduce the display brightness of the touch screen or turn off the touch screen, so that the power consumption of the first electronic device is reduced when the user does not need to look at its screen.
In a specific implementation process, the apparatus may further include a multi-screen creating module 2008, configured to start a multi-screen display mode; creating N virtual screens in an interface of a touch screen; determining N interfaces to be displayed; respectively displaying N interfaces to be displayed in N virtual screens; the N interfaces to be displayed belong to M applications, M is an integer not greater than N, and N is an integer greater than 1; the cursor may be located at any position in the N virtual screens.
In a specific implementation, the apparatus may further include an activation module 2009, configured to receive a predefined operation, where the predefined operation is used to activate or hide the display of the cursor; for scenarios of hiding and activating the cursor, refer to the foregoing embodiments.
In a specific implementation process, the apparatus may further include a switching module 2010, configured to receive a second switching command; the second toggle command is used to instruct the first electronic device to turn off the preset event processing system and resume enabling the touch screen event processing system, or to toggle to another predefined event processing system. The second switching command can be input by a user or meets a preset switching condition, so that the first electronic device is triggered; for example by shaking or the like. Alternatively, the switching module may switch among different event processing systems. The form of the cursor in different systems may be multiplexed or different, such as the aforementioned ray or cursor.
In a specific implementation process, the generating module 2001 is specifically configured to execute the method mentioned in S1001 and equivalent methods; the screen projection module 2002 is specifically configured to perform the method mentioned in S1002 and equivalent methods; the receiving module 2003 is specifically configured to perform the method mentioned in S1003 and equivalent alternatives; the response module 2004 is particularly adapted to carry out the method mentioned in S1004 and equivalent methods; the calling module 2005 is specifically configured to execute the method mentioned in S1005 and equivalent methods. The detection module 2006 is specifically configured to perform the method mentioned in S1006 and equivalents thereof. The brightness control module 2007 is specifically configured to perform the method mentioned in S1007 and methods that may be equally substituted. The multi-screen creation module 2008 is specifically configured to execute the method mentioned in S1008 and equivalents thereof. The activation module 2009 is specifically configured to perform the method mentioned in S1009 and equally alternative methods. The switching module 2010 is specifically adapted to perform the method mentioned in S1010 and equivalent alternatives.
The aforementioned specific method embodiments, explanations and expressions of application scenarios and technical features, and extensions and additions of various implementation forms are also applicable to the execution of the related methods related to each module in the apparatus, and are not repeated in the apparatus embodiments.
It should be understood that the above division of the modules in the apparatus 2000 is only a logical division; in actual implementation they may be wholly or partially integrated into one physical entity or physically separated. For example, each of the above modules may be a separately established processing element, may be integrated in a certain chip of the terminal, or may be stored in a storage element of the controller in the form of program code, with a certain processing element of the processor calling and executing the functions of the modules. In addition, the modules can be integrated together or implemented independently. The processing element described here may be an integrated circuit chip with signal processing capability. In implementation, each step of the above method, or each module above, may be completed by an integrated logic circuit of hardware in a processor element or by instructions in the form of software. The processing element may be a general-purpose processor, such as a central processing unit (CPU), or may be one or more integrated circuits configured to implement the above method, such as one or more application-specific integrated circuits (ASICs), one or more digital signal processors (DSPs), or one or more field-programmable gate arrays (FPGAs).
It should be noted that the above-described embodiments of the apparatus are merely schematic, where the units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. In addition, in the drawings of the embodiments of the apparatus provided in the present application, the connection relationship between the modules indicates that there is a communication connection therebetween, and may be implemented as one or more communication buses or signal lines.
It should be understood that the embodiments of the present invention are numerous, and any of the above steps can be freely combined without violating natural law; detailed descriptions cannot be given for all possible scenarios and implementations of the present application. The descriptions of the relevant features also apply to the further embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the present application can be implemented by software plus the necessary general-purpose hardware, and certainly also by special-purpose hardware including application-specific integrated circuits, special-purpose CPUs, special-purpose memories, special-purpose components, and the like. Generally, functions performed by a computer program can easily be implemented by corresponding hardware, and the specific hardware structures implementing the same function may be various, such as analog circuits, digital circuits, or dedicated circuits. For the present application, however, an implementation as a software program is preferable in most cases. Based on such understanding, the technical solutions of the present application may be embodied substantially in the form of a software product stored in a readable storage medium, such as a floppy disk, a USB drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk of a computer, and including several instructions for enabling a computer device (which may be a personal computer, a training device, or a network device) to execute the methods according to the embodiments of the present application.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product.
The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, they may be transmitted from one website, computer, training device, or data center to another website, computer, training device, or data center in a wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) manner. The computer-readable storage medium may be any available medium that a computer can access, or a data storage device, such as a training device or a data center, integrating one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a solid state disk (SSD)), among others.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While some embodiments of the invention have been described, those skilled in the art may make additional variations and modifications to these embodiments once they learn of the basic inventive concepts. It is therefore intended that the appended claims be interpreted as including the described embodiments and all such alterations and modifications as fall within the scope of the invention. It will be apparent to those skilled in the art that various modifications and variations can be made to the embodiments of the present invention without departing from the spirit or scope of the embodiments of the invention. If such modifications and variations fall within the scope of the claims of the present invention and their equivalents, the present invention is intended to include them as well.

Claims (58)

1. A touch-screen-based control method, applied to a first electronic device, wherein the first electronic device comprises a touch screen, and the touch screen is operable to display an interface; the method comprises the following steps:
when the first electronic device and the second electronic device establish a screen projection connection, generating a cursor, wherein the cursor is displayed in an interface of the touch screen;
generating screen projection content based on interface content in the touch screen, wherein the screen projection content comprises the cursor;
sending the screen projection content to the second electronic device;
when receiving a target touch operation of a user on the touch screen, responding to the target touch operation based on the display position of the cursor in the current interface; wherein the display position of the cursor in the current interface is used for indicating a response position of the current interface to the target touch operation, and the response position is independent of the operation position of the target touch operation on the touch screen.
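For illustration only, and not as part of the claims: the following Kotlin sketch models the cursor indirection of claim 1. All names (`CursorState`, `respond`, the screen dimensions) are hypothetical assumptions; the point shown is that the response position is taken from the cursor, never from the raw touch coordinates.

```kotlin
// Illustrative sketch only; names and values are assumptions, not from the patent.
data class Point(val x: Float, val y: Float)

sealed class Touch {
    data class Slide(val dx: Float, val dy: Float) : Touch() // finger movement delta
    object Tap : Touch()                                     // single click
}

class CursorState(var pos: Point, val width: Float, val height: Float) {
    // A slide anywhere on the touch screen moves the cursor by its delta,
    // clamped to the interface bounds; the touch-down position itself is ignored.
    fun onSlide(dx: Float, dy: Float) {
        pos = Point(
            (pos.x + dx).coerceIn(0f, width),
            (pos.y + dy).coerceIn(0f, height)
        )
    }
}

// The response position is the cursor position, independent of where
// on the touch screen the finger actually landed.
fun respond(cursor: CursorState, touch: Touch): String = when (touch) {
    is Touch.Slide -> { cursor.onSlide(touch.dx, touch.dy); "cursor -> ${cursor.pos}" }
    Touch.Tap -> "confirm object at ${cursor.pos}"
}

fun main() {
    val cursor = CursorState(Point(0f, 0f), width = 1080f, height = 2340f)
    println(respond(cursor, Touch.Slide(300f, 500f))) // moves the cursor
    println(respond(cursor, Touch.Tap))               // clicks at the cursor, not at the finger
}
```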
2. The method of claim 1, wherein the cursor is not included in the interface of the touch screen when the first electronic device has not established a screen projection connection with the second electronic device.
3. The method of claim 1 or 2, wherein the screen projection content comprises part or all of the content in an interface, and the screen projection content is used for synchronizing part or all of the content presented in a display system of the second electronic device with part or all of the content displayed in the touch screen.
4. The method of any one of claims 1-3, wherein the generating a cursor comprises:
acquiring interface content of a front-end application corresponding to the current interface, and drawing a cursor in the interface content; or,
drawing a cursor, and adding the cursor to the current interface based on a floating window interface; or,
drawing a ray, and intersecting the ray with the current interface at an end point based on the floating window interface, wherein the end point is the cursor.
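A hedged sketch of the third option of claim 4, the cursor as the end point where a drawn ray intersects the interface: the planar interface model (plane z = 0) and the helper types below are assumptions made purely for illustration.

```kotlin
// Illustrative only: intersect a ray with the interface modeled as the plane z = 0.
// Vec3 and Ray are hypothetical helper types, not APIs from the patent.
data class Vec3(val x: Double, val y: Double, val z: Double)
data class Ray(val origin: Vec3, val dir: Vec3)

// Returns the (x, y) end point on the plane z = 0, or null if the ray
// is parallel to the interface or points away from it.
fun cursorEndPoint(ray: Ray): Pair<Double, Double>? {
    if (ray.dir.z == 0.0) return null
    val t = -ray.origin.z / ray.dir.z
    if (t < 0) return null
    return Pair(ray.origin.x + t * ray.dir.x, ray.origin.y + t * ray.dir.y)
}

fun main() {
    // A ray cast from slightly in front of the screen, angled down and to the right.
    val hit = cursorEndPoint(Ray(Vec3(100.0, 200.0, 50.0), Vec3(0.2, 0.4, -1.0)))
    println("cursor at $hit") // the intersection point is what gets drawn as the cursor
}
```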
5. The method of any one of claims 1-4, wherein before the generating a cursor, the method further comprises:
calling a first event processing system and shielding the touch screen event processing system;
wherein the responding to the target touch operation based on the display position of the cursor in the current interface comprises:
re-parsing the target touch operation into a target event, and injecting the target event into the first event processing system, so that the first electronic device responds to the target touch operation according to the first event processing system;
wherein the response of the touch screen event processing system to the target touch operation is related to the operation position of the target touch operation on the touch screen, and the response of the first event processing system to the target touch operation is based on the relationship between the position of the cursor and the operation position of the target touch operation on the touch screen.
6. The method of any one of claims 1-4, wherein before the generating a cursor, the method further comprises:
calling a first event processing system;
drawing a transparent target floating window on the current interface, wherein the target floating window is used for shielding the touch screen event processing system;
wherein the responding to the target touch operation based on the display position of the cursor in the current interface comprises:
parsing out a target event corresponding to the target touch operation through the target floating window, and injecting the target event into the first event processing system, so that the first electronic device responds to the target touch operation according to the first event processing system;
wherein the response of the touch screen event processing system to the target touch operation is related to the operation position of the target touch operation on the touch screen, and the response of the first event processing system to the target touch operation is based on the relationship between the position of the cursor and the operation position of the target touch operation on the touch screen.
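A non-authoritative sketch of the event routing in claims 5 and 6: while a first event processing system is active, raw touch events are re-parsed into target events and injected into it, and the ordinary touch screen system is shielded. Every interface and class name below is a hypothetical stand-in, not an Android or patent API.

```kotlin
// Illustrative sketch; the interfaces below are hypothetical, not real platform APIs.
data class RawTouch(val x: Float, val y: Float, val action: String) // "down", "move", "up"

interface EventProcessingSystem { fun handle(e: RawTouch) }

// Ordinary behavior: respond where the finger is.
class TouchScreenSystem : EventProcessingSystem {
    override fun handle(e: RawTouch) = println("touchscreen: ${e.action} at (${e.x}, ${e.y})")
}

// Cursor behavior: interpret the same raw event relative to a cursor position.
class FirstEventSystem(private var cx: Float, private var cy: Float) : EventProcessingSystem {
    private var lastX = 0f
    private var lastY = 0f
    override fun handle(e: RawTouch) {
        when (e.action) {
            "down" -> { lastX = e.x; lastY = e.y }
            "move" -> {
                cx += e.x - lastX; cy += e.y - lastY
                lastX = e.x; lastY = e.y
                println("cursor moved to ($cx, $cy)")
            }
            "up" -> println("release at cursor ($cx, $cy)")
        }
    }
}

class EventRouter(private val touchScreen: EventProcessingSystem,
                  private val first: EventProcessingSystem) {
    var casting = false // set true when the screen projection connection is established
    // The "shielding" of the claims: while casting, events never reach the
    // touch screen system; they are injected into the first system instead.
    fun inject(e: RawTouch) = if (casting) first.handle(e) else touchScreen.handle(e)
}

fun main() {
    val router = EventRouter(TouchScreenSystem(), FirstEventSystem(540f, 1170f))
    router.casting = true
    router.inject(RawTouch(100f, 100f, "down"))
    router.inject(RawTouch(220f, 160f, "move"))
    router.inject(RawTouch(220f, 160f, "up"))
}
```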
7. The method of claim 5 or 6, wherein before the calling of the first event processing system, the method further comprises: detecting that the first electronic device and the second electronic device establish a screen projection connection; or,
detecting a first switching command input by a user, wherein the first switching command is used to instruct the first electronic device to call the first event processing system and to shield the touch screen event processing system.
8. The method of any one of claims 1-7, further comprising:
reducing the display brightness of the touch screen, or turning off the display of the touch screen.
9. The method of any one of claims 1 to 8,
when the target touch operation is a slide operation,
the responding to the target touch operation based on the display position of the cursor in the current interface comprises:
changing the position of the cursor in the current interface according to the sliding operation.
10. The method of any one of claims 1 to 8,
when the target touch operation is a click operation,
the responding to the target touch operation based on the display position of the cursor in the current interface comprises:
executing a confirmation operation on the object corresponding to the cursor in the current interface.
11. The method of any one of claims 1 to 8,
when the target touch operation is a long press-and-slide operation,
the responding to the target touch operation based on the display position of the cursor in the current interface comprises:
sliding or dragging the object corresponding to the cursor in the interface.
12. The method according to any one of claims 1-8, wherein when the target touch operation causes the motion track of the cursor in the current interface to match a preset shortcut operation command, or when the target touch operation matches a preset shortcut operation command,
the responding to the target touch operation based on the display position of the cursor in the current interface comprises:
executing the shortcut operation command.
13. The method of claim 12, wherein the target touch operation comprises:
sliding from a first preset area of the touch screen in a first preset direction; or,
sliding from a second preset area of the touch screen in a second preset direction, wherein the duration of contact with the touch screen is longer than a preset duration; or,
clicking the touch screen a number of times that reaches a preset number; or,
a sliding track of the target touch operation matching a preset pattern.
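A sketch, under assumed thresholds, of how the target touch operations of claim 13 might be matched to shortcut commands. The edge widths, durations, tap counts, and command names are all invented for illustration.

```kotlin
// Illustrative matcher for claim 13; every threshold and name is an assumption.
data class Gesture(
    val startX: Float, val startY: Float,
    val endX: Float, val endY: Float,
    val contactMillis: Long,
    val tapCount: Int
)

const val EDGE = 50f             // assumed width of the preset edge areas, in pixels
const val LONG_CONTACT_MS = 800L // assumed "longer than preset time" threshold
const val PRESET_TAPS = 3        // assumed preset click count

fun matchShortcut(g: Gesture, screenW: Float): String? = when {
    // slide from a first preset area (left edge) in a first preset direction (rightward)
    g.startX < EDGE && g.endX - g.startX > 200f -> "GO_BACK"
    // slide from a second preset area (right edge) with a long contact duration
    g.startX > screenW - EDGE && g.contactMillis > LONG_CONTACT_MS -> "OPEN_MENU"
    // click count reaches the preset number
    g.tapCount >= PRESET_TAPS -> "SHOW_HOME"
    else -> null                 // no shortcut matched; handle as an ordinary touch
}

fun main() {
    val g = Gesture(startX = 10f, startY = 900f, endX = 400f, endY = 910f,
                    contactMillis = 200L, tapCount = 1)
    println(matchShortcut(g, screenW = 1080f)) // GO_BACK
}
```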
14. The method of any one of claims 1-13, wherein after the first electronic device establishes a screen projection connection with the second electronic device, the method further comprises:
Starting a multi-screen display mode;
creating N virtual screens in an interface of the touch screen;
determining N interfaces to be displayed;
respectively displaying the N interfaces to be displayed in the N virtual screens; the N interfaces to be displayed belong to M applications, wherein M is an integer not greater than N, and N is an integer greater than 1; the cursor may be located at any position in the N virtual screens.
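A minimal sketch of the multi-screen mode of claim 14: N virtual screens in one cast interface, displaying N interfaces that belong to M applications with M not greater than N. The data model below is an assumption for illustration only.

```kotlin
// Illustrative model of claim 14's virtual screens; the types are assumptions.
data class VirtualScreen(val index: Int, val appId: String, val interfaceName: String)

fun createVirtualScreens(interfaces: List<Pair<String, String>>): List<VirtualScreen> {
    require(interfaces.size > 1) { "N must be an integer greater than 1" }
    return interfaces.mapIndexed { i, (app, iface) -> VirtualScreen(i, app, iface) }
}

fun main() {
    // N = 3 interfaces drawn from M = 2 applications (M <= N).
    val screens = createVirtualScreens(listOf(
        "video" to "player",
        "chat" to "conversation",
        "chat" to "contact list"
    ))
    val apps = screens.map { it.appId }.toSet()
    println("N = ${screens.size}, M = ${apps.size}")
    screens.forEach { println("virtual screen ${it.index}: ${it.appId}/${it.interfaceName}") }
}
```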
15. The method of any one of claims 1-14, wherein the screen projection content further comprises at least one of an interface of a background application running on the first electronic device, a newly drawn menu bar, or a preset function control.
16. The method of any one of claims 1-15, wherein the first electronic device comprises a cell phone or tablet; the second electronic device comprises a display, a television, a projector, an AR device, a VR device, or an MR device.
17. The method of any of claims 1-16, wherein the first electronic device further comprises a second event processing system; the method further comprises the following steps:
acquiring a second switching instruction; the second switching instruction is used for indicating the first electronic equipment to call a second event processing system and shielding the first event processing system and the touch screen event processing system;
calling the second event processing system; wherein, when the posture of the first electronic device changes, the position of the cursor in the current interface is changed based on the response of the second event processing system to the posture change of the first electronic device.
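A sketch of the second event processing system of claim 17, which claim 44 identifies as an "air mouse" mode: cursor displacement derived from the device's posture change. The gain constant and the integration scheme are assumptions; real input would come from a motion sensor such as a gyroscope.

```kotlin
// Illustrative air-mouse cursor update; the gain and clamping are assumed values.
class AirMouseCursor(var x: Float, var y: Float,
                     private val width: Float, private val height: Float) {
    private val gain = 400f // assumed pixels per radian of rotation

    // yawDelta / pitchDelta: change of device posture since the last sample,
    // in radians (e.g. integrated from gyroscope angular rates).
    fun onPostureChange(yawDelta: Float, pitchDelta: Float) {
        x = (x + yawDelta * gain).coerceIn(0f, width)
        y = (y - pitchDelta * gain).coerceIn(0f, height)
    }
}

fun main() {
    val cursor = AirMouseCursor(540f, 1170f, 1080f, 2340f)
    cursor.onPostureChange(yawDelta = 0.10f, pitchDelta = -0.05f)
    println("cursor at (${cursor.x}, ${cursor.y})") // moved by rotating the device
}
```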
18. The method of any one of claims 1-17, wherein prior to the cursor being displayed in the interface of the touch screen, the method further comprises: receiving a predefined operation for activating or hiding display of the cursor.
19. A touch-screen-based control apparatus, applied to a first electronic device, wherein the first electronic device comprises a touch screen, and the touch screen is operable to display an interface; the apparatus comprises:
a generation module, used for generating a cursor when the first electronic device and the second electronic device establish a screen projection connection, wherein the cursor is displayed in an interface of the touch screen;
a screen projection module, used for generating screen projection content based on the interface content in the touch screen and sending the screen projection content to the second electronic device, wherein the screen projection content comprises the cursor;
a receiving module, used for receiving a target touch operation of a user on the touch screen; and
a response module, used for responding to the target touch operation based on the display position of the cursor in the current interface;
wherein the display position of the cursor in the current interface is used for indicating a response position of the current interface to the target touch operation, and the response position is independent of the operation position of the target touch operation on the touch screen.
20. The apparatus of claim 19, wherein the cursor is not included in the interface of the touch screen when the first electronic device has not established a screen projection connection with the second electronic device.
21. The apparatus of claim 19 or 20, wherein the screen projection content comprises part or all of the content in an interface, and is used for synchronizing part or all of the content presented in the display system of the second electronic device with part or all of the content displayed in the touch screen.
22. The apparatus of any one of claims 19-21, wherein the generation module is specifically configured to:
acquire interface content of a front-end application corresponding to the current interface, and draw a cursor in the interface content; or,
draw a cursor, and add the cursor to the current interface based on a floating window interface; or,
draw a ray, and intersect the ray with the current interface at an end point based on the floating window interface, wherein the end point is the cursor.
23. The apparatus of any one of claims 19-22, wherein the apparatus further comprises:
a calling module, used for calling a first event processing system and shielding the touch screen event processing system before the generation module generates the cursor;
wherein the response module is specifically configured to re-parse the target touch operation into a target event and to inject the target event into the first event processing system, so that the first electronic device responds to the target touch operation according to the first event processing system;
and the response of the touch screen event processing system to the target touch operation is related to the operation position of the target touch operation on the touch screen, while the response of the first event processing system to the target touch operation is based on the relationship between the position of the cursor and the operation position of the target touch operation on the touch screen.
24. The apparatus of any one of claims 19-23, wherein the apparatus further comprises:
a calling module, used for calling a first event processing system before the generation module generates the cursor, and for drawing a transparent target floating window on the front-end application corresponding to the current interface, wherein the target floating window is used for shielding the touch screen event processing system;
wherein the response module is specifically configured to parse out a target event corresponding to the target touch operation through the target floating window and to inject the target event into the first event processing system, so that the first electronic device responds to the target touch operation according to the first event processing system;
and the response of the touch screen event processing system to the target touch operation is related to the operation position of the target touch operation on the touch screen, while the response of the first event processing system to the target touch operation is based on the relationship between the position of the cursor and the operation position of the target touch operation on the touch screen.
25. The apparatus of claim 23 or 24, wherein the apparatus further comprises a detection module configured to, before the calling module calls the first event processing system:
detect that the first electronic device and the second electronic device establish a screen projection connection; or,
detect a first switching command input by a user, wherein the first switching command is used to instruct the first electronic device to call the first event processing system and to shield the touch screen event processing system.
26. The apparatus of any one of claims 19-25, wherein the apparatus further comprises:
a brightness control module, used for reducing the display brightness of the touch screen, or turning off the display of the touch screen.
27. The apparatus of any one of claims 19-26, wherein
when the target touch operation is a sliding operation, the response module is used for changing the position of the cursor in the current interface according to the sliding operation; or,
when the target touch operation is a click operation, the response module is used for executing a confirmation operation on the object corresponding to the cursor in the current interface; or,
when the target touch operation is a long-press sliding operation, the response module is used for sliding or dragging the object corresponding to the cursor in the interface; or,
when the target touch operation enables the motion track of the cursor in the current interface to be matched with a preset shortcut operation command, or when the target touch operation is matched with the preset shortcut operation command, the response module is used for executing the shortcut operation command.
28. The apparatus of claim 27, wherein the target touch operation matching the preset shortcut operation command comprises:
sliding from a first preset area of the touch screen in a first preset direction; or,
sliding from a second preset area of the touch screen in a second preset direction, wherein the duration of contact with the touch screen is longer than a preset duration; or,
clicking the touch screen a number of times that reaches a preset number; or,
a sliding track of the target touch operation matching a preset pattern.
29. The apparatus of any one of claims 19-28, wherein the apparatus further comprises a multi-screen creation module, used for: starting a multi-screen display mode;
creating N virtual screens in an interface of the touch screen;
determining N interfaces to be displayed;
respectively displaying the N interfaces to be displayed in the N virtual screens; the N interfaces to be displayed belong to M applications, wherein M is an integer not greater than N, and N is an integer greater than 1; the cursor may be located at any position in the N virtual screens.
30. The apparatus of any one of claims 19-29, wherein the screen projection content further comprises at least one of an interface of a background application running on the first electronic device, a newly drawn menu bar, or a preset function control.
31. The apparatus of any one of claims 19-30, wherein the first electronic device comprises a cell phone or tablet; the second electronic device comprises a display, a television, a projector, an AR device, a VR device, or an MR device.
32. The apparatus of any of claims 19-31, wherein the first electronic device further comprises a second event processing system; the calling module is further configured to:
acquire a second switching instruction, wherein the second switching instruction is used for instructing the first electronic device to call a second event processing system and to shield the first event processing system and the touch screen event processing system; and
call the second event processing system; wherein, when the posture of the first electronic device changes, the position of the cursor in the current interface is changed based on the response of the second event processing system to the posture change of the first electronic device.
33. The apparatus of any one of claims 19-32, wherein the apparatus further comprises an activation module to receive a predefined operation to activate or hide the display of the cursor.
34. A terminal device, comprising a memory, a processor, a touch screen, and a bus, wherein the memory, the touch screen, and the processor are connected through the bus;
the memory is used for storing computer programs and instructions;
the touch screen is used for receiving touch operations and displaying an interface;
the processor is used for calling the computer programs and instructions stored in the memory and executing the method of any one of claims 1-18.
35. The terminal device of claim 34, further comprising an antenna system, wherein the antenna system, under the control of the processor, transmits and receives wireless communication signals to communicate wirelessly with a mobile communication network; the mobile communication network comprises one or more of the following: GSM networks, CDMA networks, 3G networks, 4G networks, 5G networks, FDMA, TDMA, PDC, TACS, AMPS, WCDMA, TDSCDMA, WIFI, and LTE networks.
36. A touch-screen-based control method, applied to a first electronic device, wherein the first electronic device comprises a touch screen, and the touch screen is operable to display an interface; the method comprises the following steps:
when the first electronic device and the second electronic device establish a screen projection connection, switching a touch screen event processing system of the first electronic device to a touch pad event processing system;
generating a cursor, wherein the cursor is displayed in an interface of the touch screen;
generating screen projection content based on interface content in the touch screen, and sending the screen projection content to the second electronic device, wherein the screen projection content comprises the cursor;
when receiving a target touch operation of a user on the touch screen, responding to the target touch operation based on the touch pad event processing system; wherein the response of the touch pad event processing system to the target touch operation is related to the position of the cursor and independent of the operation position of the target touch operation on the touch screen; and the touch screen event processing system does not include the cursor.
37. The method of claim 36, wherein the generating a cursor comprises:
acquiring interface content of a front-end application corresponding to the current interface, and drawing a cursor in the interface content; or, drawing a cursor, and adding the cursor to the current interface based on a floating window interface; or,
drawing a ray, and intersecting the ray with the current interface at an end point based on the floating window interface, wherein the end point is the cursor.
38. The method of claim 36 or 37, wherein before the switching of the touch screen event processing system of the first electronic device to the touch pad event processing system, the method further comprises: detecting that the first electronic device and the second electronic device establish a screen projection connection; or,
detecting a first switching command input by a user, wherein the first switching command is used for instructing the first electronic device to call the touch pad event processing system and to shield the touch screen event processing system.
39. The method of any one of claims 36-38, further comprising:
reducing the display brightness of the touch screen, or turning off the display of the touch screen.
40. The method of any one of claims 36-39,
when the target touch operation is a slide operation,
the responding to the target touch operation based on the display position of the cursor in the current interface comprises:
changing the position of the cursor in the current interface according to the sliding operation;
when the target touch operation is a click operation,
the responding to the target touch operation based on the display position of the cursor in the current interface comprises:
executing a confirmation operation on the object corresponding to the cursor in the current interface.
41. The method of any one of claims 36-40,
when the target touch operation is a long press-and-slide operation,
the responding to the target touch operation based on the display position of the cursor in the current interface comprises:
sliding or dragging the object corresponding to the cursor in the interface.
42. The method of any one of claims 36-41,
when the target touch operation enables the motion track of the cursor in the current interface to be matched with a preset shortcut operation command, or when the target touch operation is matched with the preset shortcut operation command,
the responding to the target touch operation based on the display position of the cursor in the current interface comprises:
executing the shortcut operation command;
wherein the target touch operation comprises:
sliding from a first preset area of the touch screen in a first preset direction; or,
sliding from a second preset area of the touch screen in a second preset direction, wherein the duration of contact with the touch screen is longer than a preset duration; or,
clicking the touch screen a number of times that reaches a preset number; or,
a sliding track of the target touch operation matching a preset pattern.
43. The method of any one of claims 36-42, wherein after the first electronic device establishes a screen projection connection with the second electronic device, the method further comprises:
starting a multi-screen display mode;
creating N virtual screens in an interface of the touch screen;
determining N interfaces to be displayed;
respectively displaying the N interfaces to be displayed in the N virtual screens; the N interfaces to be displayed belong to M applications, wherein M is an integer not greater than N, and N is an integer greater than 1; the cursor may be located at any position in the N virtual screens.
44. The method of any one of claims 36-43, further comprising:
acquiring a second switching instruction, wherein the second switching instruction is used for instructing the first electronic device to switch the touch pad event processing system to an air mouse event processing system; wherein, when the posture of the first electronic device changes, the position of the cursor in the current interface is changed based on the response of the air mouse event processing system to the posture change of the first electronic device.
45. The method of any one of claims 36-44,
the first electronic device comprises a mobile phone or a tablet; the second electronic device comprises a display, a television, a projector, an AR device, a VR device, or an MR device.
46. A touch-screen-based control apparatus, applied to a first electronic device, wherein the first electronic device comprises a touch screen, and the touch screen is operable to display an interface; the apparatus comprises:
a switching module, used for switching a touch screen event processing system of the first electronic device to a touch pad event processing system when the first electronic device and the second electronic device establish a screen projection connection;
a generation module, used for generating a cursor, wherein the cursor is displayed in an interface of the touch screen;
a screen projection module, used for generating screen projection content based on the interface content in the touch screen and sending the screen projection content to the second electronic device, wherein the screen projection content comprises the cursor; and
a response module, used for responding to the target touch operation based on the touch pad event processing system when receiving a target touch operation of a user on the touch screen;
wherein the response of the touch pad event processing system to the target touch operation is related to the position of the cursor and independent of the operation position of the target touch operation on the touch screen; and the touch screen event processing system does not include the cursor.
47. The apparatus of claim 46, wherein the generation module is specifically configured to:
acquire interface content of a front-end application corresponding to the current interface, and draw a cursor in the interface content; or, draw a cursor, and add the cursor to the current interface based on a floating window interface; or,
draw a ray, and intersect the ray with the current interface at an end point based on the floating window interface, wherein the end point is the cursor.
48. The apparatus of claim 46 or 47, wherein the apparatus further comprises a detection module configured to:
before the switching module switches the touch screen event processing system of the first electronic device to a touch pad event processing system, detect that the first electronic device and the second electronic device establish a screen projection connection; or,
detect a first switching command input by a user, wherein the first switching command is used for instructing the first electronic device to call the touch pad event processing system and to shield the touch screen event processing system.
49. The apparatus of any one of claims 46-48, further comprising a brightness control module, used for reducing the display brightness of the touch screen or turning off the display of the touch screen.
50. The apparatus of any one of claims 46-49, wherein
when the target touch operation is a sliding operation, the response module is used for changing the position of the cursor in the current interface according to the sliding operation; or,
when the target touch operation is a click operation, the response module is used for executing a confirmation operation on the object corresponding to the cursor in the current interface; or,
when the target touch operation is a long-press sliding operation, the response module is used for sliding or dragging the object corresponding to the cursor in the interface; or,
when the target touch operation enables the motion track of the cursor in the current interface to be matched with a preset shortcut operation command, or when the target touch operation is matched with the preset shortcut operation command, the response module is used for executing the shortcut operation command.
51. The apparatus of any one of claims 46-50, wherein the apparatus further comprises a multi-screen creation module, used for:
starting a multi-screen display mode;
creating N virtual screens in an interface of the touch screen;
determining N interfaces to be displayed;
respectively displaying the N interfaces to be displayed in the N virtual screens; the N interfaces to be displayed belong to M applications, wherein M is an integer not greater than N, and N is an integer greater than 1; the cursor may be located at any position in the N virtual screens.
52. The apparatus of any one of claims 46-51, wherein the switching module is further configured to:
acquire a second switching instruction, wherein the second switching instruction is used for instructing the first electronic device to switch the touch pad event processing system to an air mouse event processing system; wherein, when the posture of the first electronic device changes, the position of the cursor in the current interface is changed based on the response of the air mouse event processing system to the posture change of the first electronic device.
53. The apparatus according to any of claims 46-52,
the first electronic device comprises a mobile phone or a tablet; the second electronic device comprises a display, a television, a projector, an AR device, a VR device, or an MR device.
54. A terminal device, comprising a memory, a processor, a touch screen, and a bus, wherein the memory, the touch screen, and the processor are connected through the bus;
the memory is used for storing computer programs and instructions;
the touch screen is used for receiving touch operations and displaying an interface;
the processor is used for calling the computer programs and instructions stored in the memory and executing the method of any one of claims 36-45.
55. The terminal device of claim 54, further comprising an antenna system, wherein the antenna system, under the control of the processor, transmits and receives wireless communication signals to communicate wirelessly with a mobile communication network; the mobile communication network comprises one or more of the following: GSM networks, CDMA networks, 3G networks, 4G networks, 5G networks, FDMA, TDMA, PDC, TACS, AMPS, WCDMA, TDSCDMA, WIFI, and LTE networks.
56. A control method, applied to a first electronic device connected to a second electronic device, wherein the second electronic device comprises an imaging system; the method comprises:
displaying interface contents of N applications running on the first electronic device in N display areas of the imaging system, respectively, wherein the first electronic device comprises a first display screen, and N is an integer greater than 1;
displaying a cursor in the imaging system, wherein the cursor is used for determining an operation object in the content displayed by the imaging system;
receiving a first sliding operation acting on the first display screen;
determining a displacement of the cursor in the content displayed by the imaging system according to the first sliding operation;
wherein the starting position of the first sliding operation corresponds to a first object in the current interface content of the first electronic device, the first object does not respond to the first sliding operation, and the current interface content is the interface content of one of the N applications.
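For claim 56, a hedged sketch of the key behavior: a slide on the first electronic device's own display moves the cursor shown in the second device's imaging system, while the object under the slide's start position is deliberately left unresponsive. All types and names below are invented for illustration.

```kotlin
// Illustrative sketch of claim 56; all names are hypothetical.
data class Slide(val startX: Float, val startY: Float, val dx: Float, val dy: Float)

class ImagingSystemCursor(var x: Float, var y: Float)

// The first object under the slide's start position is looked up only so it can
// be suppressed: it must NOT respond, because the slide is consumed as cursor input.
fun handleFirstSlide(slide: Slide, cursor: ImagingSystemCursor,
                     objectAt: (Float, Float) -> String?) {
    val firstObject = objectAt(slide.startX, slide.startY)
    if (firstObject != null) println("suppressing response of '$firstObject'")
    cursor.x += slide.dx // displacement applied in the imaging system's content
    cursor.y += slide.dy
    println("imaging-system cursor at (${cursor.x}, ${cursor.y})")
}

fun main() {
    val cursor = ImagingSystemCursor(200f, 200f)
    handleFirstSlide(Slide(300f, 600f, 120f, -40f), cursor) { _, _ -> "photo thumbnail" }
}
```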
57. A screen projection system, comprising: a first electronic device according to any one of claims 34, 35, 54, or 55, and a second electronic device, wherein the first electronic device and the second electronic device are connected through a screen projection connection.
58. A computer-readable storage medium, characterized in that it stores a computer program comprising program instructions which, when executed by a computer, cause the computer to perform the method of any of claims 1-18 or 36-45.
CN202010882797.8A 2019-09-26 2020-08-25 Information processing method and electronic equipment Pending CN112558825A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/117308 WO2021057830A1 (en) 2019-09-26 2020-09-24 Information processing method and electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910922738 2019-09-26
CN2019109227386 2019-09-26

Publications (1)

Publication Number Publication Date
CN112558825A (en) 2021-03-26

Family

ID=75040941

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010882797.8A Pending CN112558825A (en) 2019-09-26 2020-08-25 Information processing method and electronic equipment

Country Status (2)

Country Link
CN (1) CN112558825A (en)
WO (1) WO2021057830A1 (en)

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW200743010A (en) * 2006-05-10 2007-11-16 Compal Communications Inc Portable communication device with a projection function and control method thereof
KR101733115B1 (en) * 2011-11-08 2017-05-10 한국전자통신연구원 Method and apparatus for controlling content of the remote screen
CN103941845A (en) * 2013-01-18 2014-07-23 源贸科技股份有限公司 Display operation system and operation method thereof
CN105630452A (en) * 2013-02-21 2016-06-01 海信集团有限公司 Screen transmission method and electronic devices
CN104123058A (en) * 2013-04-24 2014-10-29 广明光电股份有限公司 Method for touch host computer to control mobile device
WO2014189984A1 (en) * 2013-05-20 2014-11-27 Abalta Technologies, Inc. Interactive multi-touch remote control
CN104202643B (en) * 2014-09-16 2019-04-05 北京云视触动科技有限责任公司 Touch screen remote terminal screen map method, the control method and system of touch screen remote terminal of smart television
CN105512086B (en) * 2016-02-16 2018-08-10 联想(北京)有限公司 Information processing equipment and information processing method
CN106681632A (en) * 2016-12-09 2017-05-17 北京小米移动软件有限公司 Projection control method, device and system, terminal device and display device
CN110262985B (en) * 2019-06-26 2021-09-14 联想(北京)有限公司 Processing method and electronic equipment

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160132139A1 (en) * 2014-11-11 2016-05-12 Qualcomm Incorporated System and Methods for Controlling a Cursor Based on Finger Pressure and Direction
CN107613373A (en) * 2017-09-12 2018-01-19 中广热点云科技有限公司 A kind of method that multi-screen continuously watches TV programme
CN108646997A (en) * 2018-05-14 2018-10-12 刘智勇 A method of virtual and augmented reality equipment is interacted with other wireless devices
CN109960449A (en) * 2019-03-22 2019-07-02 深圳前海达闼云端智能科技有限公司 A kind of throwing screen display methods and relevant apparatus

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114527920A (en) * 2020-10-30 2022-05-24 华为终端有限公司 Man-machine interaction method and electronic equipment
CN115248655A (en) * 2021-04-28 2022-10-28 闪耀现实(无锡)科技有限公司 Method and apparatus for displaying information
WO2022242408A1 (en) * 2021-05-19 2022-11-24 华为技术有限公司 Display method and terminal device
WO2022252945A1 (en) * 2021-05-31 2022-12-08 华为技术有限公司 Screen-projection reverse control method and device
CN113467737A (en) * 2021-06-29 2021-10-01 阿波罗智联(北京)科技有限公司 Method, device and equipment for transmitting picture data and storage medium
WO2023273084A1 (en) * 2021-06-29 2023-01-05 阿波罗智联(北京)科技有限公司 Method and apparatus for transmitting picture data, device, and storage medium
CN114201103A (en) * 2021-08-16 2022-03-18 荣耀终端有限公司 Data input method and terminal equipment
CN114201103B (en) * 2021-08-16 2023-11-21 荣耀终端有限公司 Data input method and terminal equipment
CN113766304B (en) * 2021-08-23 2022-10-11 珠海格力电器股份有限公司 Screen projection method and device, electronic equipment and storage medium
CN113766304A (en) * 2021-08-23 2021-12-07 珠海格力电器股份有限公司 Screen projection method and device, electronic equipment and storage medium
CN113721773A (en) * 2021-08-27 2021-11-30 高创(苏州)电子有限公司 Input device sharing method and intelligent display device
CN113835802A (en) * 2021-08-30 2021-12-24 荣耀终端有限公司 Device interaction method, system, device and computer readable storage medium
WO2023030519A1 (en) * 2021-09-06 2023-03-09 维沃移动通信有限公司 Screen projection processing method and related device
CN113760143A (en) * 2021-09-09 2021-12-07 联想(北京)有限公司 Information processing method and device and electronic equipment
CN114040242A (en) * 2021-09-30 2022-02-11 荣耀终端有限公司 Screen projection method and electronic equipment
CN114153542A (en) * 2021-11-30 2022-03-08 阿波罗智联(北京)科技有限公司 Screen projection method and device, electronic equipment and computer readable storage medium
CN114153368A (en) * 2021-12-07 2022-03-08 Oppo广东移动通信有限公司 Application control method and system
CN114326446A (en) * 2021-12-28 2022-04-12 北京真视通科技股份有限公司 Automatic release method and device of electronic equipment
CN114356162A (en) * 2021-12-30 2022-04-15 Oppo广东移动通信有限公司 Content display method and related product
WO2023202606A1 (en) * 2022-04-21 2023-10-26 华为技术有限公司 Multi-screen interaction method and electronic device
CN116033158A (en) * 2022-05-30 2023-04-28 荣耀终端有限公司 Screen projection method and electronic equipment
CN116033158B (en) * 2022-05-30 2024-04-16 荣耀终端有限公司 Screen projection method and electronic equipment
CN114779970A (en) * 2022-06-20 2022-07-22 艾视雅健康科技(苏州)有限公司 Device and method for displaying object position on second display screen near to eye
WO2024012402A1 (en) * 2022-07-15 2024-01-18 华为技术有限公司 Display method and electronic device
CN115964011A (en) * 2023-03-16 2023-04-14 深圳市湘凡科技有限公司 Method and related device for displaying application interface based on multi-screen cooperation
CN115964011B (en) * 2023-03-16 2023-06-06 深圳市湘凡科技有限公司 Method and related device for displaying application interface based on multi-screen cooperation

Also Published As

Publication number Publication date
WO2021057830A1 (en) 2021-04-01

Similar Documents

Publication Publication Date Title
CN112558825A (en) Information processing method and electronic equipment
WO2022022495A1 (en) Cross-device object dragging method and device
WO2022100315A1 (en) Method for generating application interface, and related apparatus
US20220342850A1 (en) Data transmission method and related device
WO2022100237A1 (en) Screen projection display method and related product
WO2020052529A1 (en) Method for quickly adjusting out small window in fullscreen display during video, graphic user interface and terminal
CN112527174B (en) Information processing method and electronic equipment
CN112394895B (en) Picture cross-device display method and device and electronic device
EP4130963A1 (en) Object dragging method and device
WO2021129253A1 (en) Method for displaying multiple windows, and electronic device and system
WO2021036770A1 (en) Split-screen processing method and terminal device
WO2021115194A1 (en) Application icon display method and electronic device
CN112527222A (en) Information processing method and electronic equipment
US20210342044A1 (en) System navigation bar display method, system navigation bar control method, graphical user interface, and electronic device
WO2022100304A1 (en) Method and apparatus for transferring application content across devices, and electronic device
WO2022105445A1 (en) Browser-based application screen projection method and related apparatus
WO2022017393A1 (en) Display interaction system, display method, and device
WO2020238759A1 (en) Interface display method and electronic device
WO2023030099A1 (en) Cross-device interaction method and apparatus, and screen projection system and terminal
US20230119849A1 (en) Three-dimensional interface control method and terminal
WO2021052488A1 (en) Information processing method and electronic device
US20230236714A1 (en) Cross-Device Desktop Management Method, First Electronic Device, and Second Electronic Device
CN115016697A (en) Screen projection method, computer device, readable storage medium, and program product
WO2021190524A1 (en) Screenshot processing method, graphic user interface and terminal
WO2023045597A1 (en) Cross-device transfer control method and apparatus for large-screen service

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination