CN109144396B - Touch information processing method, device, equipment and storage medium


Info

Publication number
CN109144396B
Authority
CN
China
Prior art keywords
touch
event
data
information processing
touch information
Legal status (assumed; not a legal conclusion)
Active
Application number
CN201811030140.8A
Other languages
Chinese (zh)
Other versions
CN109144396A (en)
Inventor
邱伟波
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Shiyuan Electronics Thecnology Co Ltd
Guangzhou Shirui Electronics Co Ltd
Original Assignee
Guangzhou Shiyuan Electronics Thecnology Co Ltd
Guangzhou Shirui Electronics Co Ltd
Application filed by Guangzhou Shiyuan Electronics Thecnology Co Ltd and Guangzhou Shirui Electronics Co Ltd
Priority to CN201811030140.8A
Publication of application: CN109144396A
Application granted
Publication of granted patent: CN109144396B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/46: Multiprogramming arrangements
    • G06F 9/54: Interprogram communication
    • G06F 9/542: Event management; Broadcasting; Multicasting; Notifications
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/46: Multiprogramming arrangements
    • G06F 9/54: Interprogram communication
    • G06F 9/545: Interprogram communication where tasks reside in different layers, e.g. user- and kernel-space

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a touch information processing method, apparatus, device, and storage medium, relating to the technical field of smart interactive tablets. The method comprises the following steps: receiving touch information detected by a control terminal, wherein the touch information comprises touch coordinates; parsing and converting the touch information in the kernel driver layer to obtain touch data; encapsulating the touch data into a touch event; and responding to a control operation according to the touch event. This method addresses the complexity of the prior art, in which a touch event must be reported back to the application layer before it can be forwarded: forwarding of the touch event to multiple modules is instead completed directly in the kernel driver layer, so that multi-window touch operations can be responded to and the touch information processing flow is streamlined.

Description

Touch information processing method, device, equipment and storage medium
Technical Field
Embodiments of the present invention relate to the technical field of smart interactive tablets, and in particular to a touch information processing method, apparatus, device, and storage medium.
Background
With the development of electronic technology, touch technology has been applied in many fields for user convenience. In a traditional all-in-one touch terminal, a background service receives and parses touch points and notifies the foreground active window through inter-process communication; after receiving the touch information, the foreground active window alone performs touch operations such as clicking and annotating. In addition, in the existing screen-mirroring touch system, the background service's application layer receives and parses the touch points and processes the touch information following the native Android touch-event flow: it creates a virtual input device through a native interface of the Linux system, encapsulates the parsed touch information into a touch event in the kernel driver layer, reports the touch event to InputFlinger through the Linux input subsystem, and then forwards the event to each module in the application layer, thereby achieving a multi-module response.
In the course of implementing screen mirroring between a mobile terminal and an interactive smart tablet, the inventor found the following defects in the prior art. In the touch system of a traditional all-in-one touch terminal, only a single window application can respond to a touch operation; if other windows need to respond, they must be developed separately, which increases development cost. In the existing screen-mirroring touch system, touch information is processed following the native Android touch-event flow: the touch information is parsed in the application layer, the touch event is processed and encapsulated in the kernel layer, and the event is then passed back up to the application layer to be forwarded to multiple modules, making the flow complex.
Disclosure of Invention
In view of this, embodiments of the present invention provide a touch information processing method, apparatus, device, and storage medium, so as to solve the prior-art problem that a multi-module touch response across multiple devices cannot be implemented in a screen-mirroring touch system, and to streamline the touch information processing flow.
In a first aspect, an embodiment of the present invention provides a touch information processing method, including:
receiving touch information detected by a control terminal, wherein the touch information comprises touch coordinates;
acquiring touch data, wherein the touch data is obtained by parsing and converting the touch information, and the parsing process occurs in the kernel driver layer;
encapsulating the touch data into a touch event;
and responding to a control operation, wherein the control operation is determined by the touch event.
Further, before the acquiring the touch data, the method includes:
creating a virtual touch device in the kernel driver layer, wherein the virtual touch device is created according to the touch information;
the acquiring touch data includes:
acquiring the logical resolution of the virtual touch device;
parsing the touch information to acquire the touch coordinates of a touch action on the control terminal;
converting the touch coordinates into virtual resolution coordinates;
converting the virtual resolution coordinates into control coordinates according to the logical resolution of the virtual touch device;
and acquiring the control coordinates.
Further, the touch event is sent to a television module;
the encapsulating of the touch data into touch events comprises:
and packaging the touch data into the touch event according to a specified event structure format.
Further, the touch event is sent to a system module;
the encapsulating of the touch data into touch events comprises:
the touch data is encapsulated into a touch event using USB protocol format encapsulation.
Further, the response control operation occurs at the television module;
the response control operation includes:
reporting the touch event;
distributing the touch event to all components, wherein the components comprise a plurality of sub-components;
finding a target sub-component, wherein the target sub-component is used for processing the touch event;
responding to the touch event.
Further, the response control operation occurs at a system module;
the response control operation includes:
transmitting the touch event to the system module;
and the system module responds to the control operation corresponding to the touch event.
Further, after the response control operation, the method further includes:
and sending display data to the control terminal, wherein the display data is a display picture refreshed after responding to the control operation.
In a second aspect, an embodiment of the present invention further provides a touch information processing apparatus, including:
the touch information receiving module is used for receiving touch information detected by the control terminal, and the touch information comprises touch coordinates;
the touch data acquisition module is used for acquiring touch data, wherein the touch data is obtained by parsing and converting the touch information, and the parsing process occurs in the kernel driver layer;
the touch event packaging module is used for packaging the touch data into a touch event;
and the control operation response module is used for responding to control operation, and the control operation is determined by the touch event.
In a third aspect, an embodiment of the present invention further provides a touch information processing device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the touch information processing method according to the first aspect when executing the program.
In a fourth aspect, embodiments of the present invention also provide a storage medium containing computer-executable instructions, which when executed by a computer processor, are used to perform the touch information processing method according to the first aspect.
According to the touch information processing method, apparatus, device, and storage medium provided above, touch information detected by a control terminal is received, the touch information comprising touch coordinates; the touch information is parsed and converted in the kernel driver layer to obtain touch data; the touch data is encapsulated into a touch event; and a control operation is responded to according to the touch event. This solves the prior-art problem that a touch event must be reported back to the application layer before it can be forwarded, which makes the flow complex: forwarding of the touch event to multiple modules is completed directly in the kernel driver layer, multi-window touch operations are responded to, and the touch information processing flow is streamlined.
Drawings
Fig. 1 is a flowchart of a touch information processing method according to an embodiment of the present invention;
fig. 2 is a flowchart of a touch information processing method according to a second embodiment of the present invention;
fig. 3 is a schematic structural diagram of a touch system of a touch screen to which a touch information processing method is applied according to a second embodiment of the present invention;
fig. 4 is a schematic structural diagram of a touch information processing apparatus according to a third embodiment of the present invention;
fig. 5 is a schematic structural diagram of a touch information processing apparatus according to a fourth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a flowchart of a touch information processing method according to an embodiment of the present invention. The touch information processing method provided in this embodiment may be executed by a touch information processing device, which may be implemented in software and/or hardware and may consist of one or more physical entities. The touch information processing device may be a computer, a mobile phone, a tablet, a projector, an interactive smart tablet, or the like. In this embodiment, an interactive smart tablet is taken as the example of the touch information processing device. The interactive smart tablet may be an integrated device that controls the content shown on its display and implements human-computer interaction through touch technology, integrating one or more of the functions of a projector, electronic whiteboard, projection screen, speaker, television, video conference terminal, and so on.
In an embodiment, the interactive smart tablet establishes a data connection with at least one control terminal. The control terminal includes, but is not limited to: a mobile phone, a notebook computer, a USB flash drive, a tablet computer, a desktop computer, and the like. The communication mode of the data connection between the control terminal and the interactive smart tablet is not limited in this embodiment; it may be USB, the Internet, a local area network, Bluetooth, Wi-Fi, ZigBee, or the like.
Further, when the interactive smart tablet exchanges data with the at least one control terminal, the control terminal sends screen-projection data to the interactive smart tablet, and the interactive smart tablet displays the corresponding screen-projection content; the control terminal thus serves as the screen-projection client.
Optionally, screen-projection application software is installed on the screen-projection client and/or the interactive smart tablet. The software may be pre-installed, or downloaded from a third-party device or server and installed when a screen-projection application is started; the third-party device is not limited in this embodiment. Specifically, the screen-projection application software captures the content displayed by the screen-projection client, uses it as screen-projection data, and instructs the interactive smart tablet to display the content. In this embodiment, the screen-projection client and the interactive smart tablet are both assumed to have the software installed. The client-side software acquires the screen-projection data and sends it to the interactive smart tablet either directly or indirectly. If sent indirectly, the data may pass through a relay device, which can be a wireless screen-transfer device or any other device with data relay/processing capability. The software on the interactive smart tablet receives the screen-projection data and converts it into the corresponding content so that the interactive smart tablet can display it.
It should be noted that the resolutions of the screen-projection client's display and the interactive smart tablet's display differ, and the screen-projection data is generated at the client's resolution. To display it on the interactive smart tablet, the screen-projection application software must determine a screen mapping relationship from the two display resolutions and then convert the screen-projection data according to that mapping to obtain the screen-projection content. In this embodiment, the projected content and the original screen-projection data display substantially the same picture; only the resolution differs.
Specifically, referring to fig. 1, the touch information processing method provided in this embodiment specifically includes the following steps:
and S110, receiving the touch information detected by the control terminal.
Wherein the touch information includes touch coordinates.
The control terminal, serving as the touch signal source, is the device that receives touch actions; it detects touch operations and processes them into corresponding touch information. Acting as the screen-projection client, the control terminal may be a mobile electronic device with display and touch capability, such as a mobile phone, notebook computer, or tablet computer, and it sends the touch information to the touch information processing device over the communication link. The control terminal comprises a main control unit, a touch module, and a display screen. The display screen is arranged on the display surface of the control terminal and independently displays a picture. The touch surface of the touch module covers the display screen; it detects touch operations and generates the corresponding touch data. The main control unit is connected to the display screen and the touch module; it receives and processes the touch data and drives the display screen to show the picture generated after processing, so as to respond to the touch operation. The main control unit may be a chip with control and processing capability, such as a central processing unit or a controller. The touch module is an integrated assembly of the touch system and touch components that implement the touch function through software and hardware, and may include a touch screen, a drive circuit, a control circuit, a signal processing circuit, and so on. Touch module types include, but are not limited to: capacitive, resistive, infrared, and surface acoustic wave.
It should be noted that in this embodiment the touch information processing method is executed on the second terminal, where the second terminal and the touch information processing device may be the same device. The second terminal runs Android as its operating system; Android is developed on the Linux kernel. The second terminal contains multiple modules, such as a television module and a system module, where the system module further includes a computer module, an Android module, and the like. The second terminal, as the end that receives the touch information sent by the control terminal, may be an interactive smart tablet, a conference tablet, or a similar device; in this embodiment the interactive smart tablet is used as the second terminal. The received touch information includes touch coordinates, and may further include information such as the ID of the touch point and the width and height of the control terminal's display screen. The touch information is packed into a fixed format according to a predefined communication standard and sent to the second terminal over the communication channel; it includes: the ID of the touch point, the touch state, the X-axis and Y-axis coordinates of the touch point, and the width and height of the control terminal's display screen.
Specifically, when there are multiple touch points, each touch point is assigned an ID so that each distinct touch point can be identified. The touch state includes: pressed, moved, and lifted. The resolution of the control terminal's display screen is determined by its width and height; for example, a resolution of 1080 × 1920 indicates that the screen is 1080 pixels wide and 1920 pixels high. The action of receiving the touch information is executed in the Linux application layer of the second terminal.
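The fixed-format packing of a touch point described above can be sketched as follows. This is an illustrative Python sketch only: the patent does not specify the actual byte layout of the predefined communication standard, so the field order, field widths, and endianness below are assumptions.

```python
import struct

# Hypothetical fixed wire format for one touch point: little-endian,
# six unsigned 32-bit fields (id, state, x, y, screen width, screen height).
TOUCH_FMT = "<6I"

# Assumed encodings for the three touch states (pressed/moved/lifted).
STATE_DOWN, STATE_MOVE, STATE_UP = 0, 1, 2

def pack_touch(point_id, state, x, y, width, height):
    """Serialize one touch point into the fixed-format packet."""
    return struct.pack(TOUCH_FMT, point_id, state, x, y, width, height)

def unpack_touch(packet):
    """Parse a packet back into its touch fields."""
    pid, state, x, y, w, h = struct.unpack(TOUCH_FMT, packet)
    return {"id": pid, "state": state, "x": x, "y": y, "width": w, "height": h}
```

For example, `pack_touch(0, STATE_DOWN, 540, 960, 1080, 1920)` yields a 24-byte packet describing a press at (540, 960) on a 1080 × 1920 screen, and `unpack_touch` recovers the same fields on the receiving side.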
S120, acquiring touch data.
The touch data is obtained by parsing and converting the touch information, and the parsing process occurs in the kernel driver layer.
When the application layer of the second terminal receives the touch information detected by the externally connected control terminal, it does not parse the information; instead it passes the touch information directly to the kernel driver layer, where the information packed in the predefined communication-standard format is parsed to obtain the required touch data, including: the ID of the touch point, the touch time, the touch state, the X-axis and Y-axis coordinates of the touch point, and the width and height of the control terminal's display screen.
Specifically, since the width and height of the second terminal's display window may differ from those of the control terminal's display screen, the touch coordinates obtained when the user touches the control terminal are coordinates on the control terminal. To control the second terminal, those coordinates must be calibrated: they are converted, according to the width and height of the control terminal's display screen, into control coordinates on the second terminal. The second terminal parses the received touch information so that the touched position on the control terminal is accurately matched to the corresponding position on the second terminal, yielding touch data that includes the control coordinates and an operation instruction. Converting the touch coordinates according to the width and height of the second terminal's display window yields coordinates suited to that window, so the position indicated on the control terminal can be located on the second terminal; the second terminal then performs the corresponding operation according to the received instruction, achieving control of the second terminal by touching the control terminal.
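The coordinate calibration described above can be sketched as simple proportional scaling per axis. The patent does not give the exact conversion formula, so the rounding and the per-axis scaling below are assumptions for illustration.

```python
def calibrate(touch_x, touch_y, src_w, src_h, dst_w, dst_h):
    """Map a touch coordinate from the control terminal's screen
    (src_w x src_h) to the second terminal's display window (dst_w x dst_h)
    by scaling each axis independently."""
    ctrl_x = touch_x * dst_w / src_w
    ctrl_y = touch_y * dst_h / src_h
    return round(ctrl_x), round(ctrl_y)
```

For example, the center of a 1080 × 1920 control-terminal screen, (540, 960), maps to the center of a 3840 × 2160 display window, (1920, 1080).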
S130, encapsulating the touch data into a touch event.
Storing the touch data according to a specified event structure format is called encapsulating an event, where the touch data may include: the touch time, the ID of the touch point, the touch state, and the X-axis and Y-axis coordinates of the touch point. Touch actions can be divided into three types: press (ACTION_DOWN), slide (ACTION_MOVE), and lift (ACTION_UP). When the user's finger contacts the screen, a press touch event is generated; if the finger then continues to slide and the sliding distance exceeds a distance constant predefined in the system, a slide touch event is generated; when the finger leaves the screen, a lift touch event is generated. When a finger touches the screen, clicks or slides on it, and leaves it again, this process generates a series of touch events that together constitute a control event. Depending on the specific position of the touch point on the second terminal, the touch time, the touch distance, the number of taps within a limited time, the touch pressure, and other variables, different touch events may be triggered, and the resulting series of touch events forms different control events, which may include: create, fill, draw a line, delete, undo, zoom in, zoom out, and so on.
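The press/slide/lift classification above can be sketched as a small state machine that turns a finger's raw contact samples into touch events. The slide threshold value and the tuple-based event representation are illustrative assumptions, not the patent's actual event structure.

```python
ACTION_DOWN, ACTION_MOVE, ACTION_UP = "ACTION_DOWN", "ACTION_MOVE", "ACTION_UP"

# Hypothetical stand-in for the "predefined distance constant" mentioned above.
TOUCH_SLOP = 8  # pixels

def to_events(samples):
    """Turn one finger's contact samples [(x, y), ...] into touch events:
    first sample -> ACTION_DOWN; later samples farther than TOUCH_SLOP
    (Manhattan distance) from the last reported point -> ACTION_MOVE;
    end of contact -> ACTION_UP."""
    events = []
    last = None
    for x, y in samples:
        if last is None:
            events.append((ACTION_DOWN, x, y))
        elif abs(x - last[0]) + abs(y - last[1]) > TOUCH_SLOP:
            events.append((ACTION_MOVE, x, y))
        else:
            continue  # jitter below the threshold is ignored
        last = (x, y)
    if last is not None:
        events.append((ACTION_UP, last[0], last[1]))
    return events
```

A press at (0, 0) followed by a small jitter and two genuine slides thus yields one ACTION_DOWN, two ACTION_MOVE events, and a final ACTION_UP — the series of touch events that constitutes one control event.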
S140, responding to the control operation.
After the kernel driver layer encapsulates the touch data into a touch event, the event is distributed directly to the multiple modules without being passed up to the application layer. The touch event carries the operation instruction corresponding to the event, and the television module and the system module receive the instruction and perform the corresponding control operation. For example, when the touch event is a "create" event, the television module and the system module respond to it and the second terminal performs the "create" presentation operation.
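The fan-out described above, where one encapsulated event is delivered to every registered module without a round trip through the application layer, can be sketched as a simple dispatcher. The module names and handler interface below are illustrative assumptions, not the patent's actual implementation.

```python
class EventDispatcher:
    """Deliver each encapsulated touch event to all registered modules
    (e.g. a television module and a system module) in one pass."""

    def __init__(self):
        self._modules = {}

    def register(self, name, handler):
        """Register a module's event handler under a name."""
        self._modules[name] = handler

    def dispatch(self, event):
        """Deliver the same event to every module; collect each response."""
        return {name: handler(event) for name, handler in self._modules.items()}

# Two hypothetical modules responding to the same "create" event.
dispatcher = EventDispatcher()
dispatcher.register("tv", lambda e: f"tv handled {e}")
dispatcher.register("system", lambda e: f"system handled {e}")
responses = dispatcher.dispatch("create")
```

Both modules receive the same event and respond independently, which mirrors the multi-module, multi-window response the method aims for.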
It should be noted that in the existing screen-mirroring touch system, the control terminal packs the detected touch information and sends it to the second terminal; the second terminal parses the packed information according to its protocol format to obtain specific touch data such as the touch state, touch point coordinates, and touch point ID; the touch data is passed to the kernel driver layer of the Android system and processed by calling the system's native interfaces to obtain touch control signals; the signals are then passed up to the application layer and forwarded to the other modules of the second terminal for response. This flow is complex, and relying on native interfaces and existing functions offers little flexibility. In the method of this application, after the second terminal receives the packed touch information, it passes the information directly from the application layer to the kernel driver layer; the touch signal is parsed in the kernel driver layer, the touch data is encapsulated into a touch event, and the event is delivered directly from the kernel driver layer to the system's multiple modules, without being passed back to the application layer for forwarding. The flow is thereby simplified and the flexibility of touch signal processing is improved.
According to the technical scheme provided in this embodiment, touch information detected by an externally connected control terminal is received, the touch information including the touch state and touch coordinates; the touch information is parsed and converted in the kernel driver layer to obtain touch data; the touch data is encapsulated into a touch event; and a control operation is responded to according to the touch event. This solves the prior-art problem that, in an information processing flow based on the native Android pipeline, a touch event must be reported back to the application layer to be forwarded, making the process complex; forwarding of the touch event to multiple modules is completed directly in the kernel driver layer, multi-window touch operations are responded to, and the touch information processing flow is streamlined.
Example two
Fig. 2 is a flowchart of a touch information processing method according to a second embodiment of the present invention. This embodiment builds on the first embodiment. Here, the touch information processing method is executed on the second terminal, the control terminal serves as the touch signal source, and the second terminal and the control terminal may be connected in a wired or wireless manner. Fig. 3 is a schematic structural diagram of a touch system to which the touch information processing method is applied according to the second embodiment; as shown in Fig. 3, the second terminal 20 and the control terminal 10 establish a wireless communication connection for the user to operate. The second terminal 20 may be an interactive smart tablet running Android as its operating system, and contains multiple modules, such as a television module 211 and a system module 212, where the system module 212 includes a computer module, an Android module, and the like. Different modules may be connected by USB cable or wirelessly.
Referring to fig. 2, the touch information processing method provided in this embodiment specifically includes the following steps:
S201, receiving the touch information detected by the control terminal.
It should be noted that before step S201, the method may include: establishing a communication connection with the control terminal 10. Before the touch information detected by the control terminal 10 can be received, a communication connection must be established with it for information transmission, enabling the subsequent transfer of the display picture.
It should further be noted that before step S201, the method may also include: sending a screen information request to the control terminal 10. After the connection is established, in order to acquire the display picture of the control terminal 10, a screen information request is sent to request that the control terminal 10 transmit its display picture.
S202, creating a virtual touch device in the kernel driver layer.
The received touch information detected by the control terminal 10 is not parsed at this point; instead, it is passed directly from the application layer to the kernel driver layer through a write operation, and a virtual touch device is created in the kernel driver layer. A virtual touch device is a touch device that does not physically exist but is simulated in software on the operating system, so that multiple user processes can use it simultaneously. The virtual touch devices include a virtual single-touch device, corresponding to single-point touch actions detected by the control terminal, and a virtual multi-touch device, corresponding to multi-point touch actions detected by the control terminal. In this embodiment, the user processes include the touch information processing and response process of the television module and that of the system module.
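To make the idea concrete, the following is a minimal Python model of such a registry of virtual touch devices. It only illustrates the concept described above (one virtual single-touch device and one virtual multi-touch device shared by several consumers); a real implementation would register the devices with the Linux input subsystem, for example through the uinput interface, and all class names, field choices, and default values here are assumptions.

```python
# Minimal model of the kernel-side registry of virtual touch devices.
# A real implementation would register these with the Linux input
# subsystem (e.g. via uinput); names and defaults are illustrative.

class VirtualTouchDevice:
    def __init__(self, name, max_contacts, logical_width, logical_height):
        self.name = name
        self.max_contacts = max_contacts          # 1 = single-touch, >1 = multi-touch
        self.logical_resolution = (logical_width, logical_height)
        self.subscribers = []                     # user processes sharing the device

    def subscribe(self, consumer):
        # Several consumers (e.g. TV module, system module) may read
        # events from the same virtual device at the same time.
        self.subscribers.append(consumer)

    def emit(self, event):
        # Forward one touch event to every subscribed consumer.
        for consumer in self.subscribers:
            consumer(event)

def create_virtual_devices(logical_width=32767, logical_height=32767):
    # One single-touch and one multi-touch device, matching the two
    # kinds of touch action the control terminal can report.
    single = VirtualTouchDevice("virt-single-touch", 1, logical_width, logical_height)
    multi = VirtualTouchDevice("virt-multi-touch", 10, logical_width, logical_height)
    return single, multi
```

The `subscribers` list stands in for the multiple user processes (the television module's and the system module's touch processing and response processes) that read from the same virtual device.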
S203, acquiring the logical resolution of the virtual touch device.
The control terminal 10 and the second terminal 20 are hardware devices whose display screens have different resolutions; here, resolution refers to the physical resolution inherent to the hardware. The virtual touch device simulated in software also has a resolution, namely a logical resolution realized by a software algorithm. Since the physical resolution of the control terminal's display screen, the physical resolution of the second terminal's display screen, and the logical resolution of the virtual touch device all differ, the virtual touch device serves as an intermediary for coordinate conversion between the control terminal 10 and the second terminal 20, and the touch coordinates must be calibrated on it. The logical resolution of the virtual touch device is acquired, and a mapping relationship between this logical resolution and the physical resolution of the control terminal 10 is established to facilitate the subsequent conversion of touch coordinates.
S204, parsing the touch information and acquiring the touch coordinates of the touch action on the control terminal.
Since the touch information sent by the control terminal 10 to the second terminal 20 is packed and compressed according to a predefined communication standard, the second terminal 20 cannot read it directly and must first parse it. Parsing decompresses the touch information packaged in the specific protocol format so that the second terminal 20 can obtain the specific fields within it, such as the ID of the touch point, the touch time, the touch state, and the X-axis and Y-axis coordinates of the touch point, where the X-axis and Y-axis coordinates are the touch coordinates of the touch action on the control terminal 10.
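As a sketch of what such parsing might look like, the snippet below packs and unpacks one touch record with Python's `struct` module. The patent does not publish the actual wire format, so the field order, field widths, and state encoding here are purely illustrative assumptions.

```python
import struct

# Hypothetical wire format for one touch record, standing in for the
# packed protocol the control terminal actually uses: point ID (1 byte),
# timestamp in ms (8 bytes), touch state (0 = up, 1 = down, 2 = move),
# then X and Y coordinates (2 bytes each), all little-endian.
TOUCH_RECORD = struct.Struct("<BQBHH")

def pack_touch_record(point_id, timestamp_ms, state, x, y) -> bytes:
    return TOUCH_RECORD.pack(point_id, timestamp_ms, state, x, y)

def parse_touch_record(payload: bytes) -> dict:
    # Decompress/decode one record into named fields the second
    # terminal can use downstream.
    point_id, timestamp_ms, state, x, y = TOUCH_RECORD.unpack(payload)
    return {"id": point_id, "time_ms": timestamp_ms,
            "state": state, "x": x, "y": y}
```

A real protocol would carry several such records per packet plus framing and compression; the round trip above only shows the field extraction step.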
S205, converting the touch coordinates into virtual resolution coordinates.
After the logical resolution of the virtual touch device is obtained, the width and height of the control terminal's display screen are obtained through parsing. Since the width and height of the virtual touch device's display window may differ from those of the control terminal's display screen, the touch coordinates must be calibrated before the virtual touch device can be driven: the touch coordinates are converted, according to the width and height of the display screen of the control terminal 10, into virtual resolution coordinates on the virtual touch device. That is, the touch coordinates on the control terminal 10 are converted into virtual resolution coordinates according to the relationship between the logical resolution and the physical resolution.
S206, converting the virtual resolution coordinates into control coordinates according to the resolution of the second terminal.
The virtual touch device can be regarded as an intermediary for touch coordinate conversion between the control terminal 10 and the second terminal 20, and the virtual resolution coordinates obtained above must be converted into control coordinates on the second terminal 20 in order to control the designated area of the second terminal 20. Since the width and height of the virtual touch device's display window may differ from those of the second terminal's display screen, the virtual resolution coordinates must be calibrated: they are converted, according to the width and height of the virtual touch device's display window, into control coordinates on the second terminal 20. The second terminal's own resolution is acquired, and the virtual resolution coordinates are converted according to the relationship between this resolution and the logical resolution of the virtual touch device; this completes the calibration of the virtual resolution coordinates and generates control coordinates matched to the size of the display screen of the second terminal 20.
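The two-step calibration described in S205 and S206 is plain integer scaling between resolutions. A minimal sketch follows; the example resolutions are assumptions, since the actual logical resolution of the virtual touch device is not specified in the text.

```python
def to_virtual(x, y, src_res, logical_res):
    # S205: map a touch coordinate from the control terminal's physical
    # resolution onto the virtual device's logical resolution.
    sw, sh = src_res
    lw, lh = logical_res
    return x * lw // sw, y * lh // sh

def to_control(vx, vy, logical_res, dst_res):
    # S206: map a virtual-resolution coordinate onto the second
    # terminal's own display resolution, completing the calibration.
    lw, lh = logical_res
    dw, dh = dst_res
    return vx * dw // lw, vy * dh // lh
```

For example, the center of a 1920x1080 control terminal maps to the center of the logical space and then to the center of a 3840x2160 second terminal, so the control coordinates designate the same on-screen position as the original touch.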
S207, acquiring the control coordinates.
The control coordinates are coordinates adapted to the size of the second terminal's display window, and they designate the same position as the touch coordinates do on the control terminal. In this way, the control terminal 10 can control any position on the display screen of the second terminal 20, and the second terminal 20 performs the corresponding operation according to the received operation instruction; that is, the second terminal 20 is controlled by touching the control terminal.
S208, encapsulating the touch data into a touch event.
The data structure used to encapsulate the event differs according to the module to which the touch event is sent; specifically, the modules can be divided into the television module and the system module.
If encapsulating the touch data into the touch event occurs in the television module 211, where the television module 211 is the main chip module, step S208 may specifically include:
encapsulating the touch data into the touch event according to a specified event structure format.
Illustratively, a touch event is generated by storing the touch data in an EventEntry structure format. From the data stored in the event structure, the specifics of the touch event can be read out: for example, reading the storage bits of the touch state yields the state of the touch event, and reading the storage bits of the touch point coordinates yields the position of the touch event.
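A toy stand-in for this step is sketched below. The real EventEntry layout used by the main chip module is not published, so the field names and types are assumptions; the point is only that the touch data is copied into a fixed event structure from which the state and position can later be read back.

```python
from dataclasses import dataclass

# Illustrative stand-in for the EventEntry structure named in the text;
# the actual field layout in the TV module is not published.
@dataclass
class EventEntry:
    event_time: int     # timestamp of the touch event
    state: int          # 0 = up, 1 = down, 2 = move (assumed encoding)
    x: int              # touch point X coordinate
    y: int              # touch point Y coordinate

def encapsulate(touch_data: dict) -> EventEntry:
    # Copy the parsed touch data fields into the event structure.
    return EventEntry(event_time=touch_data["time_ms"],
                      state=touch_data["state"],
                      x=touch_data["x"], y=touch_data["y"])
```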
If encapsulating the touch data into the touch event occurs in the system module 212, where the system module 212 may include a computer module, an Android module, and the like, step S208 may specifically include:
encapsulating the touch data into the touch event using the USB protocol format.
For example, the system module 212 may be connected to the television module 211 through a Universal Serial Bus (USB). USB has a master-slave structure for interconnecting a host and slave devices, and the USB protocol enables communication between different devices. The touch data is stored according to the USB protocol format to generate a touch event that can be transmitted over a USB transmission line. Touch data is stored at different byte offsets according to the USB protocol format; for example, if the 25th byte of the USB report stores the touch state, reading the information stored in that byte yields the touch state information of the touch event.
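The byte-offset scheme can be sketched as follows. Only the detail that the 25th byte holds the touch state comes from the text (interpreted here as zero-based offset 24); the report length and the coordinate offsets are invented for illustration.

```python
# Sketch of laying touch data out at fixed byte offsets of a USB-style
# report. Offsets other than the touch-state byte are assumptions.
REPORT_LEN = 32
STATE_OFFSET = 24        # "25th byte" of the report, zero-based index 24
X_OFFSET, Y_OFFSET = 25, 27

def build_usb_report(state: int, x: int, y: int) -> bytes:
    report = bytearray(REPORT_LEN)
    report[STATE_OFFSET] = state
    report[X_OFFSET:X_OFFSET + 2] = x.to_bytes(2, "little")
    report[Y_OFFSET:Y_OFFSET + 2] = y.to_bytes(2, "little")
    return bytes(report)

def read_touch_state(report: bytes) -> int:
    # Reading the fixed byte recovers the touch state of the event.
    return report[STATE_OFFSET]
```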
S209, responding to the control operation.
The manner of response differs according to the module in which the response to the control operation occurs; specifically, the modules can be divided into the television module 211 and the system module 212.
Specifically, if the response to the control operation occurs in the television module 211, step S209 may specifically include:
It should be noted that responding to the control operation involves both the distribution of the touch event and the response to the touch event. The distribution flow of a touch event in the Android system, from top to bottom, is: Activity -> ViewGroup1 -> ViewGroup1's child ViewGroup2 -> … -> TargetView.
S20911, reporting the touch event.
The touch event is reported to InputFlinger through the input subsystem of the Linux system, where InputFlinger is a native service based on the Linux system.
S20912, distributing the touch event.
The touch event is distributed to the components, i.e. the Views, through the InputFlinger processing flow, where each View may contain several sub-components, i.e. child Views. The InputFlinger processing flow refers to the procedure by which a touch event is delivered to each View. When a touch event is generated, the system sends it to the TargetView for processing; this process of delivering the touch event to the TargetView is the distribution of the touch event.
Specifically, the MotionEvent encapsulating the touch information is first delivered to the current Activity, whose dispatchTouchEvent method is responsible for distributing the touch event toward the TargetView. The distribution of the touch event is completed by the Activity's Window, which passes the touch event to the top-level View of the current user interface, and the Activity passes the touch event to the corresponding ViewGroup according to the display state of the current user interface.
S20913, finding the target sub-component.
All sub-components of the ViewGroup are traversed according to the touch coordinates until a sub-component satisfying the touch-accepting condition is found, i.e. a sub-component capable of processing the touch event. If the currently traversed sub-component can process the touch event, that is, if the touch coordinates of the event fall within that sub-component's area, its dispatchTouchEvent method is called to process the event.
It should be noted that if the touch event is still unprocessed after all sub-components have been traversed, the ViewGroup processes the touch event itself.
S20914, responding to the touch event.
The sub-component capable of processing the touch event, or the ViewGroup, calls back the dispatchTouchEvent or onTouchEvent method of the Activity or View, and the application implements the processing logic of the touch event in the callback method and responds to the touch operation.
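The distribution and hit-testing walk of S20912 to S20914 can be modeled in a few lines. The class and method names deliberately echo the Android ones (View, ViewGroup, dispatchTouchEvent), but this is a simplified model of the traversal logic, not the framework code: each ViewGroup offers the event to its children in order and handles it itself only when no child accepts it.

```python
class View:
    def __init__(self, name, bounds):          # bounds = (left, top, right, bottom)
        self.name = name
        self.bounds = bounds

    def contains(self, x, y):
        l, t, r, b = self.bounds
        return l <= x < r and t <= y < b

    def dispatch_touch_event(self, x, y):
        # A plain View handles the event if the touch point is inside it.
        return self.name if self.contains(x, y) else None

class ViewGroup(View):
    def __init__(self, name, bounds, children=()):
        super().__init__(name, bounds)
        self.children = list(children)

    def dispatch_touch_event(self, x, y):
        # S20913: traverse sub-components until one accepts the event.
        for child in self.children:
            handled = child.dispatch_touch_event(x, y)
            if handled:
                return handled
        # No child consumed the event: the ViewGroup handles it itself.
        return self.name if self.contains(x, y) else None
```

The return value names the component that responded, standing in for the TargetView of the distribution chain.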
Specifically, if the response to the control operation occurs in the system module 212, step S209 may specifically include:
S20921, transmitting the touch event to the system module.
The touch event packaged according to the USB protocol format is transmitted to the system module 212 over a USB transmission line, and the system module 212 extracts the touch data from the touch event for parsing.
S20922, the system module responding to the control operation corresponding to the touch event.
The control operation is performed according to the touch instruction corresponding to the obtained touch event, such as drawing a line, creating a new object, or undoing an action, and the display effect of the control operation is shown on the display screen of the second terminal 20.
S210, sending display data to the control terminal.
The display data is the display picture refreshed after the control operation is responded to.
The display data is mainly the information of the display screen. After receiving the screen information request sent by the control terminal 10, the second terminal 20 responds by capturing its display screen, encoding it, and sending it to the control terminal 10 over the wireless communication connection, so that the control terminal 10 can receive the encoded display data to display and refresh its screen.
It should be noted that steps S202 to S209 are performed in the kernel driver layer of the second terminal, while steps S201 and S210 are performed in the application layer of the second terminal.
On the basis of the above embodiment, after step S210 the method may further include: judging whether to maintain the screen-transmission touch service with the control terminal; if so, repeating steps S201 to S210; if not, closing the program and releasing the resources.
According to the technical scheme provided by this embodiment, the touch information detected by the control terminal is received; a virtual touch device is created in the kernel driver layer; the logical resolution of the virtual touch device is acquired; the touch information is parsed; the touch coordinates are converted into virtual resolution coordinates; the virtual resolution coordinates are converted into control coordinates according to the resolution of the second terminal; the control coordinates are acquired; the touch data is encapsulated into a touch event; and the control operation is responded to. This solves the problem that the process is complex because a touch event must be reported back to the application layer before it can be forwarded: forwarding and transmission of the touch event to multiple modules are completed directly in the kernel driver layer, multi-window touch operations are responded to, and the touch information processing flow is optimized.
Example three
Fig. 4 is a schematic structural diagram of a touch information processing apparatus according to a third embodiment of the present invention. The touch information processing apparatus provided in this embodiment may be integrated in a touch information processing device, which may be formed by two or more physical entities or by a single physical entity, and which may be a computer, a mobile phone, a tablet, a projector, an interactive smart tablet, or the like. Referring to fig. 4, the touch information processing apparatus provided in this embodiment specifically includes: a touch information receiving module 310, a touch data acquiring module 320, a touch event encapsulating module 330, and a control operation responding module 340.
The touch information receiving module 310 is configured to receive touch information detected by a control terminal, where the touch information includes touch coordinates; a touch data obtaining module 320, configured to obtain touch data, where the touch data is obtained through analysis and conversion of the touch information, and the analysis process occurs in a kernel driver layer; a touch event encapsulation module 330, configured to encapsulate touch data into touch events; a control operation response module 340, configured to respond to a control operation, where the control operation is determined by the touch event.
According to the technical scheme provided by this embodiment, touch information detected by a control terminal is received, where the touch information includes touch coordinates; the touch information is parsed and converted in the kernel driver layer to obtain touch data; the touch data is encapsulated into a touch event; and a control operation is responded to according to the touch event. This solves the problem in the prior art that the process is complex because a touch event must be reported back to the application layer before it can be forwarded: forwarding and transmission of the touch event to multiple modules are completed directly in the kernel driver layer, multi-window touch operations are responded to, and the touch information processing flow is optimized.
On the basis of the above embodiment, the touch information processing apparatus further includes: a virtual touch device creating module, configured to create a virtual touch device in the kernel driver layer, where the virtual touch device is obtained according to the touch information. The touch data acquisition module 320 includes: a logical resolution acquiring unit, configured to acquire the logical resolution of the virtual touch device; a touch information parsing unit, configured to parse the touch information and acquire the touch state and touch coordinates of the touch action on the control terminal; a virtual resolution coordinate conversion unit, configured to convert the touch coordinates into virtual resolution coordinates; a control coordinate conversion unit, configured to convert the virtual resolution coordinates into control coordinates according to the resolution of the second terminal; and a control coordinate acquisition unit, configured to acquire the control coordinates.
On the basis of the above embodiment, the touch event is sent to the television module; the touch event encapsulation module 330 includes: a first touch event encapsulation unit, configured to encapsulate the touch data into the touch event according to a specified event structure format.
On the basis of the above embodiment, the touch event is sent to the system module; the touch event encapsulation module 330 includes: a second touch event encapsulation unit, configured to encapsulate the touch data into the touch event using the USB protocol format.
On the basis of the above embodiment, the response to the control operation occurs in the television module; the control operation response module 340 includes: a reporting unit, configured to report the touch event; a distribution unit, configured to distribute the touch event to the components, where each component includes a plurality of sub-components; a target sub-component searching unit, configured to find the target sub-component, which is used to process the touch event; and a first response unit, configured to respond to the touch event.
On the basis of the above embodiment, the response to the control operation occurs in the system module; the control operation response module 340 includes: a transmission unit, configured to transmit the touch event to the system module; and a second response unit, configured for the system module to respond to the control operation corresponding to the touch event.
On the basis of the above embodiment, the touch information processing apparatus further includes: and the display data sending module is used for sending display data to the control terminal, wherein the display data is a display picture refreshed after responding to the control operation.
The touch information processing device provided by the embodiment can be used for executing the touch information processing method provided by any of the above embodiments, and has corresponding functions and beneficial effects.
Example four
Fig. 5 is a schematic structural diagram of a touch information processing device according to a fourth embodiment of the present invention. As shown in fig. 5, the touch information processing device includes: a processor 60, a memory 61, a display 62 with touch functionality, an input device 63, an output device 64, and a communication device 65. The number of processors 60 in the touch information processing device may be one or more, and one processor 60 is illustrated in fig. 5. The number of memories 61 in the touch information processing device may be one or more, and one memory 61 is illustrated in fig. 5. The processor 60, the memory 61, the display 62, the input device 63, the output device 64, and the communication device 65 of the touch information processing device may be connected by a bus or other means; the bus connection is illustrated in fig. 5. In an embodiment, the touch information processing device may be a computer, a mobile phone, a tablet, a projector, an interactive smart tablet, or the like. In this embodiment, the touch information processing device is described taking an interactive smart tablet as an example.
The memory 61 is a computer-readable storage medium, and can be used for storing software programs, computer-executable programs, and modules, such as program instructions/modules corresponding to the touch information processing method according to any embodiment of the present invention (for example, the touch information receiving module 310, the touch data acquiring module 320, the touch event encapsulating module 330, and the control operation responding module 340 in the touch information processing apparatus). The processor 60 executes various functional applications of the device and data processing by running software programs, instructions, and modules stored in the memory 61, that is, implements the touch information processing method described above.
The memory 61 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function; the storage data area may store data created according to use of the device, and the like. Further, the memory 61 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some examples, the memory 61 may further include memory located remotely from the processor 60, which may be connected to the device over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The display screen 62 is a display screen with a touch function, and may be a capacitive screen, an electromagnetic screen, or an infrared screen. In general, the display screen 62 is used for displaying data according to instructions from the processor 60, and for receiving touch operations applied to it and sending the corresponding signals to the processor 60 or other devices. Optionally, when the display screen 62 is an infrared screen, it further includes an infrared touch frame disposed around the display screen 62, which may also be configured to receive infrared signals and send them to the processor 60 or other devices.
The communication device 65 is used for establishing a communication connection with other devices, and may be a wired communication device and/or a wireless communication device.
The input device 63 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the touch information processing apparatus, and may also be a camera for acquiring images and a sound pickup apparatus for acquiring audio data. The output device 64 may include an audio device such as a speaker. It should be noted that the specific composition of the input device 63 and the output device 64 can be set according to actual conditions.
The touch information processing device provided by the above can be used for executing the touch information processing method provided by any of the above embodiments, and has corresponding functions and beneficial effects.
Example five
An embodiment of the present invention further provides a storage medium containing computer-executable instructions, which when executed by a computer processor, perform a method for touch information processing, including:
receiving touch information detected by a control terminal, wherein the touch information comprises touch coordinates;
acquiring touch data, wherein the touch data is obtained by analyzing and converting the touch information, and the analyzing process occurs in a kernel driving layer;
encapsulating the touch data into a touch event;
responding to a control operation, wherein the control operation is determined by the touch event.
Of course, the storage medium provided by the embodiment of the present invention contains computer-executable instructions, and the computer-executable instructions are not limited to the operations of the touch information processing method described above, and may also perform related operations in the touch information processing method provided by any embodiment of the present invention, and have corresponding functions and advantages.
From the above description of the embodiments, it is obvious for those skilled in the art that the present invention can be implemented by software and necessary general hardware, and certainly, can also be implemented by hardware, but the former is a better embodiment in many cases. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which may be stored in a computer-readable storage medium, such as a floppy disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a FLASH Memory (FLASH), a hard disk or an optical disk of a computer, and includes several instructions to enable a computer device (which may be a robot, a personal computer, a server, or a network device) to execute the touch information processing method according to any embodiment of the present invention.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. A touch information processing method, applied to a screen projection touch system, characterized by comprising the following steps: receiving, by a second terminal, touch information detected by a control terminal, wherein the control terminal serves as a screen projection client and the touch information comprises touch coordinates;
acquiring touch data, wherein the touch data is obtained by analyzing and converting the touch information, and the analyzing process occurs in the kernel driver layer of the second terminal;
the kernel driver layer encapsulates the touch data into a touch event;
responding to a control operation, wherein the control operation is determined by the touch event.
2. The touch information processing method according to claim 1, wherein before the acquiring of the touch data, the method further comprises: creating a virtual touch device on the kernel driver layer, wherein the virtual touch device is obtained according to the touch information;
and the acquiring of the touch data comprises:
acquiring the logical resolution of the virtual touch device;
analyzing the touch information, and acquiring a touch coordinate of a touch action on the control terminal;
converting the touch coordinates into virtual resolution coordinates;
converting the virtual resolution coordinates into control coordinates according to the resolution of the second terminal;
and acquiring control coordinates.
3. The touch information processing method according to claim 1, wherein the touch event is transmitted to a television module;
the encapsulating of the touch data into touch events comprises: and packaging the touch data into the touch event according to a specified event structure format.
4. The touch information processing method according to claim 1, wherein the touch event is transmitted to a system module;
the encapsulating of the touch data into touch events comprises: the touch data is encapsulated into a touch event using USB protocol format encapsulation.
5. The touch information processing method according to claim 1, wherein the response to the control operation occurs in a television module;
the response control operation includes:
reporting the touch event;
distributing the touch event to all components, wherein the components comprise a plurality of sub-components;
finding a target sub-component, the target sub-component for processing the touch event;
responding to the touch event.
6. The touch information processing method according to claim 1, wherein the response to the control operation occurs in a system module;
the response control operation includes:
transmitting the touch event to the system module;
and the system module responds to the control operation corresponding to the touch event.
7. The touch information processing method according to claim 1, wherein after the response control operation, further comprising: and sending display data to the control terminal, wherein the display data is a display picture refreshed after responding to the control operation.
8. A touch information processing apparatus applied to a screen-projection touch system, comprising: the touch information receiving module is used for receiving touch information detected by a control terminal, the control terminal is used as a screen projection client, and the touch information comprises touch coordinates;
the touch data acquisition module is used for acquiring touch data, the touch data is obtained by analyzing and converting the touch information, and the analyzing process occurs in the kernel driver layer;
the touch event encapsulation module is used for encapsulating the touch data into a touch event in the kernel driver layer;
and the control operation response module is used for responding to control operation, and the control operation is determined by the touch event.
9. A touch information processing apparatus comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the touch information processing method according to any one of claims 1 to 7 when executing the program.
10. A storage medium containing computer-executable instructions, which when executed by a computer processor, are for performing the touch information processing method of any one of claims 1-7.
CN201811030140.8A 2018-09-05 2018-09-05 Touch information processing method, device, equipment and storage medium Active CN109144396B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811030140.8A CN109144396B (en) 2018-09-05 2018-09-05 Touch information processing method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811030140.8A CN109144396B (en) 2018-09-05 2018-09-05 Touch information processing method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN109144396A CN109144396A (en) 2019-01-04
CN109144396B true CN109144396B (en) 2021-06-15

Family

ID=64826900

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811030140.8A Active CN109144396B (en) 2018-09-05 2018-09-05 Touch information processing method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN109144396B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111124241B (en) * 2019-12-10 2021-07-09 深圳市创易联合科技有限公司 Writing touch identification method and device based on infrared touch screen and readable storage medium
CN111552402B (en) * 2020-04-22 2022-04-15 湖南安元信息科技有限公司 Mapping method of multi-display touch component system, terminal and readable storage medium
CN111831166B (en) * 2020-07-10 2023-06-02 深圳市康冠商用科技有限公司 Android and Linux-based touch area segmentation method and device, computer equipment and storage medium
CN114063948A (en) * 2020-08-06 2022-02-18 福建天泉教育科技有限公司 Method and system for realizing remote control by combining software and hardware
CN114063949A (en) * 2020-08-06 2022-02-18 福建天泉教育科技有限公司 Remote control method and system thereof
CN112714425A (en) * 2020-12-25 2021-04-27 广州朗国电子科技有限公司 Touch forwarding method applied to conference education integrated machine
CN112882766A (en) * 2021-02-03 2021-06-01 广州华欣电子科技有限公司 Data processing method, device and system
CN113553198A (en) * 2021-06-01 2021-10-26 刘启成 Data processing method and device
CN113253877B (en) * 2021-06-23 2022-01-21 广州朗国电子科技股份有限公司 Electronic whiteboard system and control method thereof
CN113918043B (en) * 2021-09-08 2023-12-05 华人运通(上海)云计算科技有限公司 Touch control method
CN114327151A (en) * 2022-01-05 2022-04-12 深圳康咏科技有限公司 Touch data conversion method and device, computer equipment and storage medium
CN114791780B (en) * 2022-06-21 2022-09-20 武汉海微科技有限公司 Display screen touch control method, device, equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102802068A (en) * 2012-05-07 2012-11-28 海信集团有限公司 Remote control method and system smart television
CN102915113A (en) * 2012-08-24 2013-02-06 深圳创维-Rgb电子有限公司 Method and apparatus for supporting data input
CN106648402A (en) * 2016-09-05 2017-05-10 广州视睿电子科技有限公司 Information sending method and device and information processing method and device
CN107024984A (en) * 2017-01-12 2017-08-08 瑞声科技(新加坡)有限公司 The feedback response method and terminal of a kind of button

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101980153B (en) * 2010-10-22 2015-07-22 中兴通讯股份有限公司 Method and mobile terminal for identifying hardware gestures
TW201423560A (en) * 2012-12-10 2014-06-16 Hon Hai Prec Ind Co Ltd Handheld device and remote control method therefore
CN103281575B (en) * 2013-03-28 2016-06-15 四川长虹电器股份有限公司 A kind of Novel control method of intelligent television
CN103593185B (en) * 2013-11-04 2017-01-04 浙江大学 The method that many android system based on single linux kernel share input display device
CN103929668B (en) * 2014-04-28 2018-01-16 Tcl集团股份有限公司 A kind of control method and device of the television set based on using Android system
CN103984438B (en) * 2014-04-30 2017-01-04 南京航空航天大学 Display screen touch-control system and method for cockpit display system
CN106685959A (en) * 2016-12-28 2017-05-17 福建星网视易信息***有限公司 Method for controlling set top box by mobile terminal in real time, device and system

Similar Documents

Publication Publication Date Title
CN109144396B (en) Touch information processing method, device, equipment and storage medium
US10922041B2 (en) Wireless screen transmission method, extension device, and wireless screen transmission system
CN109032390B (en) Handwriting data caching method, device, equipment and storage medium
WO2019120054A1 (en) Interactive control method and system, and terminal and storage medium
TWI601055B (en) A unified extensible firmware interface (uefi) basic input/output system (bios)-controlled computing device and method and non-transitory medium thereof
US9532098B2 (en) Method, apparatus and system for controlling focus on TV interface
WO2021072926A1 (en) File sharing method, apparatus, and system, interactive smart device, source end device, and storage medium
US20160139797A1 (en) Display apparatus and contol method thereof
CN103472996A (en) Method and device for receiving touch in mobile device
US20190051147A1 (en) Remote control method, apparatus, terminal device, and computer readable storage medium
CN106990908B (en) Local touch shielding method, device, system, equipment and storage medium
US20140361991A1 (en) Method and electronic device for controlling mouse module
WO2020147677A1 (en) Method and apparatus for displaying content, and terminal and storage medium
CN109408072B (en) Application program deleting method and terminal equipment
CN104391581A (en) Mobile terminal and method for projecting operation interface of application program thereof
CN115437542A (en) Display device and screen projection inverse control method
WO2019205634A1 (en) Touch control method and device for smart interactive tablet and smart interactive tablet
WO2019006958A1 (en) View scrolling method and apparatus, and smart television
CN111880759A (en) Control method and device for multi-split screen display picture, display and storage medium
CN103607632A (en) Previewing method and device based on desktop live broadcast
CN111414097A (en) Interaction method, interaction system and display equipment
CN107038024B (en) Operation configuration method and equipment thereof
US11334240B2 (en) Method, device, electronic device, and storage medium for sending and receiving message
CN104506909A (en) Screen capturing remote-control method and screen capturing remote-control device
CN115623255A (en) Display device, terminal device and mirror image screen projection inverse control method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant