CN117093125A - Touch response method based on first system, touch screen and electronic equipment - Google Patents


Info

Publication number
CN117093125A
Authority
CN
China
Prior art keywords
touch
external
module
signal source
osd
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311054795.XA
Other languages
Chinese (zh)
Inventor
张振业
杨洋
卢习昌
王传文
李婷婷
李金磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Honghe Innovation Information Technology Co Ltd
Original Assignee
Shenzhen Honghe Innovation Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Honghe Innovation Information Technology Co Ltd filed Critical Shenzhen Honghe Innovation Information Technology Co Ltd
Priority to CN202311054795.XA priority Critical patent/CN117093125A/en
Publication of CN117093125A publication Critical patent/CN117093125A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

The application relates to a touch response method, a system, a touch screen, an electronic device and a computer-readable storage medium based on a first system. The touch screen can simultaneously display an OSD window, an external signal source interactive application window and other pictures of the first system, and these windows and pictures can use the touch function synchronously. When touch data is received, it is judged whether the touch position is located in the OSD region; if so, the touch data is sent to the OSD module so that the OSD module responds to it; if not, the touch module, the MCU module or the first system judges whether the touch position is located in the external signal source display area, and if so, the touch data is subjected to coordinate conversion and then format conversion and sent to the external device so that the external device responds to the touch data. According to the application, no switching between touch channels or display interfaces is required, the touch operation process is simplified, and the user experience is improved.

Description

Touch response method based on first system, touch screen and electronic equipment
Technical Field
The application relates to the technical field of touch control, in particular to a touch control response method based on a first system, a touch control screen, electronic equipment and a computer readable storage medium.
Background
With the wide application of interactive large-screen devices, a large-screen device is often connected to an external device, such as an OPS (Open Pluggable Specification) computer, to enrich its functions. A large screen is usually developed based on the Android system, i.e., its main system is Android, but some large screens are based on the Windows system, i.e., the main system is Windows. In an application scenario based on the Windows system, for example when the current interface belongs to the Windows system, the touch channel is connected to the Windows system and the external device (also called an external signal source) cannot receive touch operations; the external device therefore cannot be operated by touch directly on the large screen, which makes touch control cumbersome and degrades the user experience.
Disclosure of Invention
In view of the above situation, the main objective of the present application is to provide a touch response method based on a first system, a touch screen, an electronic device and a computer-readable storage medium, in which the OSD window, the external signal source interactive application window of the first system and other pictures can use the touch function synchronously, thereby further improving the user experience.
In order to achieve the above purpose, the technical scheme adopted by the application is as follows:
the first aspect of the application provides a touch response method based on a first system, which is applied to a touch screen; the touch screen can simultaneously display an OSD window, an external signal source interaction application window of the first system and other pictures, wherein the external signal source interaction application window is a window corresponding to the external signal source interaction application and comprises an external signal source display area for displaying an input picture of external equipment; the OSD window, the external signal source interactive application window and other pictures of the first system can synchronously use touch operation, and the external equipment adopts a second system;
the touch response method comprises the following steps:
OSD judging step: when touch data are received, judging whether the touch position is positioned in an OSD region or not according to the touch data, if so, sending the touch data to an OSD module; if not, executing an external connection region judging step; the OSD region is a region occupied by an OSD window corresponding to the OSD module on a touch screen;
OSD response step: the OSD module responds to the touch data when receiving the touch data;
External judging step: judging whether the touch position is located in the external signal source display area; if so, performing coordinate conversion on the touch data to obtain external coordinates corresponding to the external device;
Information output step: converting the external coordinates into external data in a format that the external device can receive, and sending the external data to the external device so that the external device responds to the touch data.
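Purely for illustration, the dispatch flow described by these steps can be sketched as follows. The module interfaces (osd_module.respond, first_system.dispatch, external_device.send), the Rect type and the packing layout are assumptions made for the sketch, not the implementation of this application.

```python
# Minimal sketch of the dispatch flow described above (names are assumptions).
from dataclasses import dataclass

@dataclass
class Rect:
    left: int
    top: int
    right: int
    bottom: int

    def contains(self, x: int, y: int) -> bool:
        return self.left <= x <= self.right and self.top <= y <= self.bottom

def to_external_data(ext_x: int, ext_y: int) -> bytes:
    # Information output step: format conversion into data the external device
    # can receive (placeholder layout: two little-endian 16-bit coordinates).
    return ext_x.to_bytes(2, "little") + ext_y.to_bytes(2, "little")

def handle_touch(x, y, osd_region: Rect, source_area: Rect, ext_res,
                 osd_module, first_system, external_device):
    """OSD judging step -> external judging step -> information output step."""
    if osd_region.contains(x, y):
        osd_module.respond(x, y)              # OSD response step
        return
    if source_area.contains(x, y):
        # External judging step: convert touch-screen coordinates into
        # coordinates in the external device's resolution.
        ext_w, ext_h = ext_res
        ext_x = (x - source_area.left) * ext_w // (source_area.right - source_area.left)
        ext_y = (y - source_area.top) * ext_h // (source_area.bottom - source_area.top)
        external_device.send(to_external_data(ext_x, ext_y))
    else:
        first_system.dispatch(x, y)           # handled by the first system itself
```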
Preferably, in the OSD judging step, if the touch position is not located in the OSD region, the external judging step is executed after the touch data is sent to the first system;
the external judgment step comprises the following steps:
a first substep: the first system generates a touch event when receiving the touch data, judges whether the touch position is positioned in an external signal source interaction application window, and if yes, executes a second sub-step; if not, other applications corresponding to the touch position on the first system respond to the touch data;
a second substep: and the first system judges whether the touch position is positioned in the external signal source display area or not through the external signal source interaction application, if so, the external signal source interaction application performs coordinate conversion on the touch data to obtain external coordinates corresponding to external equipment and sends the external coordinates to the MCU module.
Preferably, the external signal source interactive application window further comprises a menu area; in the external judging step, if the touch position is not located in the external signal source display area, the first system generates the touch event according to the touch data and judges whether the touch position is located in the menu area; if so, the first system responds to the touch data through the external signal source interactive application; if not, the first system responds to the touch data through other applications.
Preferably, in the external judging step, the touch data is subjected to coordinate conversion according to the window position information of the external signal source interactive application and the resolution information of the external device.
A second aspect of the present application provides a touch screen based on a first system, configured to execute the touch response method described in any one of the foregoing. The touch screen includes a touch module, an OSD module, the first system and an MCU module, wherein the first system is provided with an external signal source interactive application, the external signal source interactive application window corresponding to that application has an external signal source display area, and the external signal source display area is used for displaying the input picture of the external device; the touch screen can simultaneously display the OSD window, the external signal source interactive application window and other pictures of the first system, and these windows and pictures can use touch operation synchronously;
When the touch module receives touch data, the touch module or the MCU module judges whether the touch position is located in the OSD region according to the touch data; if so, the touch data is sent to the OSD module, and the OSD module responds to the touch data; if not, the touch module, the MCU module or the first system judges whether the touch position is located in the external signal source display area; if so, coordinate conversion is performed on the touch data to obtain external coordinates corresponding to the external device, and the MCU module then converts the external coordinates into external data that the external device can receive and sends the external data to the external device so that the external device responds to the touch data;
the OSD region is a region occupied by an OSD window corresponding to the OSD module on a touch screen; the external device adopts a second system.
Preferably, the touch module bidirectionally transmits signals with the OSD module through a serial port, and transmits signals to the first system through USB; the first system transmits signals to the MCU module through a serial port or USB; the MCU module transmits signals to the external device through USB; the external device transmits signals to the first system through HDMI;
wherein the touch module judges whether the touch position is located in the OSD region;
if so, the touch module sends the touch data to the OSD module;
if not, the touch module sends the touch data to the first system; the first system judges whether the touch position is located in the external signal source display area, and if so, performs coordinate conversion on the touch data to obtain external coordinates corresponding to the external device and sends the external coordinates to the MCU module.
Preferably, the OSD module transmits a signal to the touch module through a serial port; the touch module bidirectionally transmits signals with the MCU module through USB, and transmits signals to the first system through USB; the MCU module transmits signals to the OSD module through a serial port and transmits signals to the external device through USB; the external device transmits signals to the first system through HDMI;
wherein the touch module judges whether the touch position is located in the OSD region;
if so, the touch module sends the touch data to the OSD module;
if not, the touch module judges whether the touch position is located in the external signal source display area; if so, the touch module performs coordinate conversion on the touch data to obtain external coordinates corresponding to the external device and sends the external coordinates to the MCU module; if not, the touch module sends the touch data to the first system, and the first system generates a touch event when receiving the touch data and responds to the touch event.
Preferably, the touch module transmits a signal to the MCU module through USB; the MCU module is used for bidirectionally transmitting signals with the OSD module through a serial port, bidirectionally transmitting signals with the first system through a USB and transmitting signals to the external equipment through the USB; the external device transmits signals to the first system through HDMI;
when the touch module receives the touch data, the touch data is sent to the MCU module; the MCU module judges whether the touch position is located in the OSD region;
if so, the MCU module sends the touch data to the OSD module;
if not, the MCU module sends the touch data to the first system, and the first system judges whether the touch position is located in the external signal source display area; if so, coordinate conversion is performed on the touch data to obtain external coordinates corresponding to the external device, and the external coordinates are sent to the MCU module; or the MCU module judges whether the touch position is located in the external signal source display area, and if so, the MCU module performs coordinate conversion on the touch data to obtain external coordinates corresponding to the external device; if not, the MCU module sends the touch data to the first system, and the first system generates a touch event when receiving the touch data and responds to the touch event.
A third aspect of the present application provides an electronic device, including the touch screen described in any one of the foregoing, where the electronic device includes an interactive large screen or an electronic whiteboard.
A fourth aspect of the present application provides a computer-readable storage medium storing a computer program, wherein the computer program when run controls a device in which the computer-readable storage medium is located to execute the touch response method according to any one of the above.
According to the touch response method of the touch screen of the present application, the window corresponding to the OSD module, the picture of the external device and other pictures of the first system can be displayed simultaneously on one interface. When a touch operation occurs, it is first judged whether the OSD region is touched; if not, it is then judged whether the external signal source display area is touched. In this way, the OSD window, the external signal source interactive application window of the first system and other windows or pictures can use the touch function synchronously, and at any moment the user can operate whichever system or module is needed without switching the touch channel or the display interface, which simplifies the touch operation process and improves the user experience.
In addition, in the touch response method, the external signal source display area only needs to be acquired when the touch position is not in the OSD region; the touch module or the MCU module does not need to acquire the region information of multiple windows at the same time (such as the OSD region and the external signal source display area, and possibly the window regions of other applications of the first system in the subsequent embodiments). This reduces the amount of real-time data interaction, avoids data loss during interaction as much as possible, and saves system resources, thereby improving the accuracy of the touch response.
Other advantages of the present application will be set forth in the description of specific technical features and solutions, by which those skilled in the art should understand the advantages that the technical features and solutions bring.
Drawings
Preferred embodiments of the present application will be described below with reference to the accompanying drawings. In the figures:
FIG. 1 is a schematic diagram of an interface of a touch screen according to a preferred embodiment of the touch response method of the present application;
FIG. 2 is a flow chart of a touch response method according to a preferred embodiment of the present application;
FIG. 3 is a system block diagram of a preferred embodiment of a touch screen of the present application;
FIG. 4 is a system block diagram of another preferred embodiment of a touch screen of the present application;
FIG. 5 is a system block diagram of yet another preferred embodiment of a touch screen of the present application;
FIG. 6 is a system block diagram of yet another preferred embodiment of a touch screen of the present application;
fig. 7 is a system block diagram of still another preferred embodiment of the touch screen of the present application.
Detailed Description
The present application is described below based on embodiments, but it is not limited to these embodiments. In the following detailed description, certain specific details are set forth in order to provide a thorough understanding of the present application; in order not to obscure the present application, well-known methods, procedures, flows and components are not described in detail.
Moreover, those of ordinary skill in the art will appreciate that the drawings are provided herein for illustrative purposes and that the drawings are not necessarily drawn to scale.
Unless the context clearly requires otherwise, throughout the description and the claims, the words "comprise", "comprising", and the like are to be construed in an inclusive sense as opposed to an exclusive or exhaustive sense; that is, in the sense of "including but not limited to".
In the description of the present application, it should be understood that the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. Furthermore, in the description of the present application, unless otherwise indicated, the meaning of "a plurality" is two or more.
The application provides a touch response method based on a first system, applied to a touch screen whose main system is the first system. The touch screen may be implemented on an x86 architecture, or of course on other architectures such as ARM. Besides the main system, the touch screen can be connected to external equipment through a signal source interface, such as an external PC (personal computer); the system of the external device is a second system. The first system and the second system may be the same system or different systems; specifically, each of them may be a Windows, Android, Mac or Linux system. It can be understood that the touch screen also has an OSD (on-screen display) module, i.e. a screen menu adjusting module, for displaying characters, graphics and images on the touch screen and controlling part of the states of the touch screen. Specifically, the OSD module can pop up a menu of various state information of the touch screen (including volume, brightness, etc.) on the touch screen, and each item of state information of the touch screen can be adjusted through the menu.
An existing touch screen whose main system is the Windows system can only realize touch operation for a single interface and a single system, even if an external device can be connected. That is, when the main interface of the touch screen is the Windows interface of the main system, only the Windows system can use the touch function; even if a window displays the picture of an external signal source, the external device cannot be touched directly through that window. Conversely, if the main interface of the touch screen shows the external signal source picture, the Windows system, including each of its applications, cannot use the touch function.
The touch screen of the present application can display the pictures of multiple systems on a single interface and allow touch operation on the multiple systems synchronously. That is, the display interfaces of multiple systems can be shown at the same time on the same interface of the touch screen, so that the OSD window, the external signal source interactive application window of the first system and other pictures of the first system (such as other application windows) can be displayed simultaneously, and touch operation can be performed in each of these windows or pictures. The other pictures of the first system refer to the parts of the picture displayed by the first system other than the external signal source interactive application window.
The touch screen includes a touch module 100, an OSD module 400, a first system 200 and an MCU module 300 (i.e., a main control module). The first system has an external signal source interactive application, the external signal source interactive application window 210 corresponding to that application has an external signal source display area 211, and the external signal source display area is used for displaying the input picture of the external device; that is, the external signal source interactive application provided in the first system can display the input picture of the external device. It can be understood that, besides the external signal source interactive application, the first system has pictures of its own, such as other applications, whose corresponding windows are denoted as other windows. The single-interface multi-system display is shown in fig. 1: the OSD window 410 corresponding to the OSD module and the picture of the first system are displayed on the current interface of the touch screen at the same time, where the picture of the first system includes the external signal source interactive application window and other applications or windows of the first system.
As shown in fig. 2, the touch response method of the present application includes:
OSD judging step: when touch data is received, judging whether the touch position is positioned in an OSD region according to the touch data, if so, sending the touch data to an OSD module; if not, executing the circumscribed area judging step. The OSD region is a region occupied by an OSD window corresponding to the OSD module on the touch screen.
In this step, when a touch operation occurs, the touch screen (specifically, its touch module) acquires the touch data and judges, using the OSD region provided by the OSD module, whether the touch falls on the OSD window. If so, the touch data is sent to the OSD module, and the OSD module responds to the touch data; if not, the touch operation is considered to be a touch on some part of the first system picture, and the external judging step is executed. The OSD region may be stored in advance, or may be provided by the OSD module after the touch screen is started. When provided by the OSD module, it may be determined according to the window occupation coordinates of the OSD window, which may be the upper-left and lower-right corner coordinates of the window, a certain corner coordinate (such as the upper-left corner) together with the length and width of the window, or any other existing way of representing the position and size of a window.
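As a small illustration of the two window-occupation representations mentioned above (corner pair versus corner plus length and width), the sketch below shows how either form reduces to the same hit test; the function names are assumptions, not part of this application.

```python
# Sketch only: both representations of the OSD window's occupied region reduce
# to the same rectangle, against which the touch position is tested.
def rect_from_corners(x1, y1, x2, y2):
    # Upper-left and lower-right corner coordinates of the window.
    return (x1, y1, x2, y2)

def rect_from_corner_and_size(x1, y1, width, height):
    # One corner coordinate (e.g. upper-left) plus the window's width and height.
    return (x1, y1, x1 + width, y1 + height)

def in_osd_region(x, y, osd_rect):
    x1, y1, x2, y2 = osd_rect
    return x1 <= x <= x2 and y1 <= y <= y2
```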
External judging step: judging whether the touch position is located in the external signal source display area; if so, performing coordinate conversion on the touch data to obtain external coordinates corresponding to the external device. The first system can obtain the external signal source display area.
In this step, it is judged whether the touch position falls in the external signal source display area, that is, whether the touch is an operation on the external device. If so, because the coordinate information in the touch data is based on the touch screen while the input picture of the external device only occupies a smaller area of the touch screen, coordinate conversion is performed on the touch data to obtain the external coordinates corresponding to the external device. The specific coordinate conversion may use existing methods for converting coordinates between two different display windows, which will not be repeated here. When the external signal source interactive application is started, the first system acquires the external signal source interactive application window, and the external signal source interactive application reports its external signal source display area, so that the first system obtains the external signal source display area.
Information output step: the MCU module converts the external coordinates into external data in a format that the external device can receive, and sends the external data to the external device so that the external device responds to the touch data.
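As a hedged illustration of this format conversion, the sketch below packs the external coordinates into a byte sequence. The layout shown is purely a placeholder assumption; an actual MCU module would use whatever report format the external device expects (for example, a USB HID digitizer report).

```python
import struct

def to_external_data(ext_x: int, ext_y: int, pressed: bool = True) -> bytes:
    # Placeholder packet: a 1-byte contact flag followed by two little-endian
    # 16-bit coordinates. A real implementation would follow the external
    # device's own protocol instead.
    return struct.pack("<BHH", 1 if pressed else 0, ext_x, ext_y)
```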
In the touch response method, when the touch module 100 receives the touch data, the touch module 100 or the MCU module 300 judges whether the touch position is located in the OSD region according to the touch data; if so, the touch data is sent to the OSD module 400, and the OSD module responds to the touch data; if not, the touch module, the MCU module or the first system judges whether the touch position is located in the external signal source display area; if so, coordinate conversion is performed on the touch data to obtain external coordinates corresponding to the external device, and the MCU module then converts the external coordinates into external data that the external device can receive and sends the external data to the external device.
According to the touch screen and its touch response method of the present application, the display area of the touch screen is divided into the OSD region and the region corresponding to the first system (i.e., the picture displayed by the first system); the external signal source interactive application window (including the external signal source display area) belongs to the region corresponding to the first system, and the external signal source interactive application is an application of the first system. The touch screen can simultaneously display the window of the OSD module, the picture of the external device and other pictures of the first system on one interface. When a touch operation occurs, it is first judged whether the OSD region is touched; if not, it is then judged whether the external signal source display area is touched. In this way, the OSD window, the external signal source interactive application window of the first system and other pictures of the first system can use the touch function synchronously, and at any moment the user can operate whichever system or module is needed without switching the touch channel or the display interface, which simplifies the touch operation process and improves the user experience. In addition, the control method of the present application only needs to acquire the external signal source display area when the touch position is not in the OSD region; that is, the touch module or the MCU module does not need to acquire the region information of multiple windows at the same time (such as the OSD region and the external signal source display area, and possibly the window regions of other applications of the first system in the subsequent embodiments), which reduces the amount of real-time data interaction, avoids data loss during interaction as much as possible, saves system resources, and improves the accuracy of the touch response.
In the external judging step, if the touch position is not located in the external signal source display area, the first system generates a touch event according to the touch data and responds to it. Specifically, when the external signal source interactive application window further includes a menu area (detailed below), the first system judges whether the touch position is located in the menu area; if so, the first system responds to the touch data through the external signal source interactive application; if not, the first system responds to the touch data through other applications.
In a preferred embodiment of the present application, in the OSD determining step, the touch module determines whether the touch location is located in the OSD region, and if so, the touch module sends the touch data to the OSD module. In this way, the touch module does not need to send out the touch data when receiving the touch data, and can directly judge the touch data, so that the real-time performance of the judgment is better. In this embodiment, the determination whether the touch location is located in the external signal source interactive application window may be performed by the first system, or may be performed continuously by the touch module, or may be performed by the MCU module.
In the embodiment in which the first system judges whether the touch position is located in the external signal source display area, the first system does not need to send the external signal source display area to the touch module in real time; the touch module only needs to send the touch data to the first system. This reduces the amount of real-time data transmission; in particular, when a user moves the external signal source interactive application window frequently, the first system does not need to send the moved external signal source display area to the touch module in real time, which further reduces the data loss, and even the touch errors, that could be caused during data transmission, thereby better improving the accuracy of the user's touch operation and the user experience. Specifically, in the OSD judging step, if the touch module judges that the touch position is not in the OSD region, the touch module sends the touch data to the first system, and the first system then executes the external judging step, i.e., the first system judges whether the touch position is in the external signal source display area when it receives the touch data.
When the first system executes the external judging step, it can directly judge whether the external signal source display area is touched, and if not, then judge whether the touch falls in the external signal source interactive application window (at this point, what is actually judged is whether the touch position is in the menu area described below). In a preferred embodiment of the present application, when receiving the touch data, the first system first judges whether the touch position is located in the external signal source interactive application window, and if so, then judges whether the touch position is located in the external signal source display area. Specifically, the external judging step includes:
a first substep: the first system generates a touch event when receiving touch data, judges whether the touch position is positioned in an external signal source interactive application window, and if yes, executes a second sub-step;
a second substep: the first system judges whether the touch position is located in the external signal source display area through the external signal source interaction application, if so, the external signal source interaction application performs coordinate conversion on the touch data to obtain external coordinates corresponding to the external equipment, and the external coordinates are sent to the MCU module.
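For illustration, the two sub-steps might be organized as in the sketch below. The class and method names are assumptions; the point is only that the first system performs the window hit test itself (first sub-step) and delegates the display-area test and coordinate conversion to the external signal source interactive application (second sub-step).

```python
# Sketch of the two sub-steps (names are assumptions, not the patent's code).
def in_rect(x, y, rect):
    left, top, right, bottom = rect
    return left <= x <= right and top <= y <= bottom

class ExternalSourceApp:
    """Stands in for the external signal source interactive application."""
    def __init__(self, window_rect, display_rect, ext_resolution, mcu):
        self.window_rect = window_rect      # whole interactive application window
        self.display_rect = display_rect    # external signal source display area
        self.ext_w, self.ext_h = ext_resolution
        self.mcu = mcu

    def on_touch(self, x, y):
        # Second sub-step: display-area test, then coordinate conversion.
        if not in_rect(x, y, self.display_rect):
            return self.respond_in_menu_area(x, y)
        left, top, right, bottom = self.display_rect
        ext_x = (x - left) * self.ext_w // (right - left)
        ext_y = (y - top) * self.ext_h // (bottom - top)
        self.mcu.send(ext_x, ext_y)         # forwarded to the MCU module

    def respond_in_menu_area(self, x, y):
        pass                                # e.g. copy/paste menu handling

def on_first_system_touch_event(first_system, app, x, y):
    # First sub-step: the first system itself tests the application window.
    if in_rect(x, y, app.window_rect):
        app.on_touch(x, y)
    else:
        first_system.dispatch_to_other_application(x, y)
```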
As described above, when the external signal source interactive application is opened, the first system itself directly stores the window information of the corresponding external signal source interactive application window, whereas to acquire the external signal source display area the external signal source interactive application has to pass the information of that display area to the first system. In this way, when the first system judges whether the touch position is located in the external signal source interactive application window (the first sub-step), it does not need to go through the external signal source interactive application but can judge directly; if the touch position is in the window, the external signal source interactive application handles it, and if not, the external signal source interactive application does not need to handle it at all. This reduces the number of data interactions to a certain extent and shortens the touch response time.
Further, in the first sub-step, if the first system judges that the touch position is not located in the external signal source interactive application window, the touch is considered to fall on other pictures of the first system, and the other application corresponding to the touch position on the first system responds to the touch data.
The external signal source interactive application window 210 may include a menu area 212 in addition to the external signal source display area 211, as shown in fig. 1, and conventional menu operations such as copy and paste may be implemented by touching a corresponding position in the area, and the specific function of the menu area is not limited in the present application. In this embodiment, in the second substep, if the touch position is not located in the external signal source display area, the touch position is considered to be located in the menu area, and the external signal source interactive application directly responds to the touch data.
It can be seen that, in this embodiment, the advantage of having the first system first judge whether the touch position is located in the external signal source interactive application window is even more apparent.
In the embodiment in which the touch module judges whether the touch position is located in the external signal source display area, in the OSD judging step, after the touch module judges that the touch position is not located in the OSD region, it does not need to send the touch data out; instead, the first system only needs to send the information of the external signal source display area to the touch module, so that the touch module can judge, according to the acquired external signal source display area, whether the touch position is located in it. Specifically, in the external judging step, the touch module judges whether the touch position is located in the external signal source display area; if so, the touch module performs coordinate conversion on the touch data to obtain external coordinates corresponding to the external device and sends the external coordinates to the MCU module. That is, if the touch module judges that the touch position is located in the external signal source display area, the touch operation is considered to be directed at the external device, and coordinate conversion must be performed before the data is sent to the MCU module and subsequently to the external device. In this way, the judgment of the touch position against the OSD region and the external signal source display area, as well as the coordinate conversion of the touch data, are performed by the touch module, and the MCU module only performs format conversion before transmitting the data to the external device.
Further, if the touch module judges that the touch position is not located in the external signal source display area, the touch module sends touch data to the first system; the touch response method further comprises the following steps:
Windows response step: the first system generates a touch event when receiving the touch data and responds to it.
That is, if the touch module determines that the touch position is neither in the OSD region nor in the external signal source display region, the touch is considered to be other regions in the first system screen (including the menu region of the external signal source interactive application window), and the touch data is responded by the first system (specific response mode is described in detail below).
In another preferred embodiment of the present application, whether the touch location is located in the OSD region is determined by the MCU module, and specifically, the OSD determining step includes:
when the touch control module receives touch control data, the touch control data are sent to the MCU module;
the MCU module judges whether the touch position is located in an OSD region, if so, the MCU module sends touch data to the OSD module.
In this embodiment, the touch module sends the touch data to the MCU module as soon as it is received, and the MCU module further obtains the OSD region; specifically, the OSD region may be stored in the MCU module in advance, or may be sent to the MCU module by the OSD module, and the MCU module then makes the judgment. In this way, the occupation of touch module resources can be reduced, and the accuracy and timeliness of touch data detection are improved. In this embodiment, the judgment of whether the touch position is located in the external signal source interactive application window may be performed by the first system, may continue to be performed by the MCU module, or may be performed by the touch module.
In the embodiment in which the first system judges whether the touch position is located in the external signal source display area, the first system does not need to send the external signal source display area to the MCU module in real time; only the touch module or the MCU module (preferably the MCU module) needs to send the touch data to the first system. This reduces the amount of real-time data transmission; in particular, when a user moves the external signal source interactive application window frequently, the first system does not need to send the moved external signal source display area to the MCU module in real time, which further reduces the data loss, and even the touch errors, that could be caused during data transmission, thereby better improving the accuracy of the user's touch operation and the user experience. In a specific embodiment, in the OSD judging step, if the touch position is not located in the OSD region, the MCU module sends the touch data to the first system, and the first system executes the external judging step. That is, after the MCU module judges that the touch position is not in the OSD region, it does not need to ask the touch module to send the touch data to the first system; it sends the touch data directly to the first system, and the first system then judges which position is actually touched. This reduces the number of data interactions between the MCU module and the touch module and between the first system and the MCU module, further reduces the probability of errors in data transmission, and improves the accuracy of the touch response. For the manner in which the first system executes the external judging step, reference may be made to the way the first system executes that step in the embodiment in which the touch module judges the OSD region, which will not be repeated here.
In the embodiment in which the MCU module judges whether the touch position is located in the external signal source display area, in the OSD judging step, after the MCU module judges that the touch position is not located in the OSD region, it does not need to send the touch data out; the first system only needs to send the information of the external signal source display area to the MCU module, so that the MCU module can judge, according to the acquired external signal source display area, whether the touch position is located in it. Specifically, in the external judging step, the MCU module judges whether the touch position is located in the external signal source display area; if so, the MCU module performs coordinate conversion on the touch data to obtain external coordinates corresponding to the external device. That is, if the MCU module judges that the touch position is located in the external signal source display area, the touch is considered to be an operation on the external device; and since both the coordinate conversion and the format conversion are performed by the MCU module, in this embodiment the external coordinates do not need to be sent out after the coordinate conversion, and the MCU module directly performs the subsequent format conversion. In this way, the judgment of the touch position against the OSD region and the external signal source display area, as well as the coordinate conversion and format conversion of the touch data, are all performed by the MCU module, which reduces the number of times the touch data has to be transmitted, reduces the probability of errors in data transmission, and further improves touch accuracy.
Further, if the MCU module judges that the touch position is not located in the external signal source display area, the MCU module sends touch data to the first system; correspondingly, the touch response method further comprises the following steps:
Windows response step: the first system generates a touch event when receiving the touch data and responds to it.
That is, if the MCU module determines that the touch location is neither in the OSD region nor in the external signal source display region, the touch is considered to be other regions in the first system screen (including the menu region of the external signal source interactive application window), and the touch data is responded by the first system.
When the touch module or the MCU module judges that the touch position is neither in the OSD region nor in the external signal source display area, the touch data is sent to the first system, and the Windows response step specifically includes: judging whether the touch position is located in the external signal source interactive application window (specifically, whether it is located in the menu area); if so, the first system responds to the touch data through the external signal source interactive application; if not, the first system responds to the touch data through other applications. That is, a touch position that is neither in the OSD region nor in the external signal source display area may still fall in the external signal source interactive application window, namely in the menu area described above, in which case the external signal source interactive application in the first system responds to the touch data; of course, it may also fall in a window region corresponding to another application in the Windows picture, in which case the other application corresponding to the touch position in the first system responds to the touch event generated from the touch data, and the specific way the touch event is responded to can follow the first system's own existing response mechanism. Adding the judgment of the external signal source interactive application window improves the accuracy of the touch response and the user experience, especially when the external signal source interactive application window further includes a menu area. Of course, the external signal source interactive application window may also not include the menu area.
The above-mentioned external signal source interactive application window, external signal source display area, menu area, etc. may each be determined according to its corresponding window position information. Taking the external signal source interactive application window as an example, its window position information may specifically be the coordinates, on the touch screen, of the upper-left and lower-right corners of the window, or the coordinates of the upper-left corner of the window on the touch screen together with the length and width of the window; of course, other ways of representing the external signal source interactive application window are also possible.
The window occupation coordinates of the OSD window can be obtained when the OSD module is started; the first system learns the window position information corresponding to the external signal source display area after the external signal source interactive application is started, and the window position information corresponding to the menu area can likewise be obtained after the external signal source interactive application is started.
In the external judging step of each embodiment, whether the coordinate conversion is performed by the external signal source interactive application of the first system, by the touch module or by the MCU module, the touch data is preferably converted according to the window position information of the external signal source interactive application and the resolution information of the external device, which can be provided by the first system. That is, the window position information of the external signal source interactive application window lets the converting party know the position and size of the window, and the conversion is then carried out according to the resolution of the external signal source display area and the resolution of the external device, which the external signal source interactive application itself knows. The specific coordinate conversion can be implemented with existing methods for converting between different coordinate systems, which will not be repeated here.
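As a concrete illustration of such a conversion, under the common assumption that the display area and the external device's picture are related by a simple linear scaling, the sketch below maps a touch point inside the display area onto the external device's resolution; the function name and the example numbers are illustrative only.

```python
def to_external_coords(x, y, area_left, area_top, area_width, area_height,
                       ext_width, ext_height):
    # Linear mapping: position relative to the display area, scaled to the
    # external device's resolution. The area geometry comes from the window
    # position information; ext_width/ext_height come from the device's resolution.
    ext_x = round((x - area_left) * ext_width / area_width)
    ext_y = round((y - area_top) * ext_height / area_height)
    return ext_x, ext_y

# Example: a 1920x1080 external picture shown in an 800x450 area at (200, 300).
# A touch at (600, 525) maps to (960, 540), the centre of the external picture.
print(to_external_coords(600, 525, 200, 300, 800, 450, 1920, 1080))
```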
In each of the above embodiments, any two of the touch module, the MCU module and the first system that interact with each other or transmit data in one direction may be connected through USB or serial signals. The external device is connected to the first system by signal through HDMI.
In the embodiment in which the touch module judges whether the touch position is located in the OSD region (i.e., the touch module executes the OSD judging step) and the first system judges whether the touch position is located in the external signal source display area, preferably, as shown in fig. 3, the touch module 100 bidirectionally transmits signals with the OSD module 400 through a serial port, and transmits signals to the first system 200 through USB; the first system 200 transmits signals to the MCU module 300 through a serial port or USB; the MCU module 300 transmits signals to the external device 500 through USB; and the external device 500 transmits signals to the first system 200 through HDMI. The touch module judges whether the touch position is located in the OSD region; if so, the touch module sends the touch data to the OSD module; if not, the touch module sends the touch data to the first system, and the first system judges whether the touch position is located in the external signal source display area; if so, it performs coordinate conversion on the touch data to obtain external coordinates corresponding to the external device and sends the external coordinates to the MCU module. In this way, the reliability of the connections between the modules, the system and the devices can be ensured and the efficiency of data transmission improved; and since the first system does not need to frequently send information about the external signal source display area or the external signal source interactive application window to the touch module 100 or the MCU module 300, the probability of errors in data transmission is reduced, the accuracy of the touch response and the user's touch experience are improved, and the cost of the whole touch screen can also be reduced.
In the embodiment in which the touch module first judges whether the touch position is located in the OSD region (i.e., the touch module executes the OSD judging step) and then judges whether it is located in the external signal source display area, if the touch position is in the OSD region, the touch module sends the touch data to the OSD module; if not, the touch module judges whether the touch position is located in the external signal source display area, and if so, performs coordinate conversion on the touch data to obtain external coordinates corresponding to the external device and sends the external coordinates to the MCU module; if not, the touch module sends the touch data to the first system, and the first system generates a touch event when receiving the touch data and responds to it. In this embodiment, the first system needs to send the information of the external signal source display area to the touch module. In one embodiment, the touch module 100 sends data to the OSD module 400 through the MCU module 300, and the OSD module 400 sends the OSD region directly to the touch module 100; as shown in fig. 4, the OSD module 400 transmits signals to the touch module 100 through a serial port; the touch module 100 bidirectionally transmits signals with the MCU module 300 through USB and transmits signals to the first system 200 through USB; the MCU module 300 transmits signals to the OSD module 400 through a serial port and transmits signals to the external device 500 through USB; and the external device 500 transmits signals to the first system 200 through HDMI. In another embodiment, the OSD module 400 also sends its data (specifically, the OSD region) to the touch module 100 through the MCU module; as shown in fig. 5, compared with the embodiment of fig. 4, the direct serial connection between the touch module 100 and the OSD module 400 is omitted. In yet another embodiment, the touch module 100 and the OSD module 400 are directly connected through a serial port and transmit data directly to each other; compared with the embodiment of fig. 4, the serial port between the MCU module 300 and the OSD module 400 is omitted, and the touch module 100 and the OSD module 400 are connected by a bidirectional serial signal, as shown in fig. 6.
In the embodiment in which the MCU module first judges whether the touch position is located in the OSD region and then judges whether it is located in the external signal source display area, the touch module sends the touch data to the MCU module when receiving it; the MCU module judges whether the touch position is located in the OSD region, and if so, sends the touch data to the OSD module; if not, the MCU module sends the touch data to the first system, the first system judges whether the touch position is located in the external signal source display area, and if so, performs coordinate conversion on the touch data to obtain external coordinates corresponding to the external device and sends the external coordinates to the MCU module; or the MCU module judges whether the touch position is located in the external signal source display area, and if so, performs coordinate conversion on the touch data to obtain external coordinates corresponding to the external device; if not, the MCU module sends the touch data to the first system, and the first system generates a touch event when receiving the touch data and responds to it. In this embodiment, as shown in fig. 7, the touch module 100 transmits signals to the MCU module 300 through USB; the MCU module 300 bidirectionally transmits signals with the OSD module 400 through a serial port, bidirectionally transmits signals with the first system 200 through USB, and transmits signals to the external device 500 through USB; and the external device 500 transmits signals to the first system through HDMI.
The application also provides electronic equipment which can be an interactive large screen or an electronic whiteboard, and comprises the touch screen in any embodiment and the external equipment in each embodiment.
The application also provides a touch response system, which comprises the touch screen and the external device in any embodiment, wherein the external device is in signal connection with the touch screen, and the specific connection mode is described with reference to the foregoing, so that the detailed description is omitted.
It should be noted that the multiple systems may be the same operating system, for example, the external device may be the first system, or may be different operating systems, for example, the external device may be a non-first system, which is all included in the present application. The OSD window may be in an expanded state (as shown in fig. 1) or in a minimized state on the touch screen, and the OSD region in the present application refers to a region occupied by the OSD window in the expanded state.
The application further relates to a computer readable storage medium, wherein the computer readable storage medium stores a computer program, and the device where the computer readable storage medium is located is controlled to execute the touch response method when the computer program runs.
The computer readable storage medium according to the embodiments of the present disclosure is not limited to the above-described embodiments, and may be, for example, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the above. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In an embodiment of the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
Those skilled in the art will appreciate that the above-described preferred embodiments can be freely combined and stacked where no conflict arises. The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code that comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures; for example, two blocks shown in succession may in fact be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special-purpose hardware-based systems that perform the specified functions or acts, or by combinations of special-purpose hardware and computer instructions.
It should be noted that step numbers (letters or numerals) are used in the present application to refer to certain specific method steps only for convenience and brevity, and are not intended to limit the order of the method steps. It will be apparent to those skilled in the art that the sequence of steps of the relevant method is determined by the technique itself, should not be unduly limited by the presence of step numbers, and that one skilled in the art can determine various allowable, reasonable orders of steps based on the technique itself.
It will be understood that the above-described embodiments are merely illustrative and not restrictive, and that all obvious or equivalent modifications and substitutions of the details given above that may be made by those skilled in the art without departing from the underlying principles of the application are intended to be included within the scope of the appended claims.

Claims (10)

1. A touch response method based on a first system, applied to a touch screen, characterized in that the touch screen can simultaneously display an OSD window, an external signal source interactive application window of the first system, and other pictures, wherein the external signal source interactive application window is a window corresponding to an external signal source interactive application and comprises an external signal source display area for displaying an input picture of an external device; the OSD window, the external signal source interactive application window, and the other pictures of the first system can all be operated by touch synchronously, and the external device adopts a second system;
The touch response method comprises the following steps:
OSD judging step: when touch data is received, judging, according to the touch data, whether the touch position is located in an OSD region; if so, sending the touch data to an OSD module; if not, executing the external judging step; wherein the OSD region is a region occupied on the touch screen by an OSD window corresponding to the OSD module;
OSD response step: the OSD module responds to the touch data when receiving the touch data;
external judging step: judging whether the touch position is located in the external signal source display area; if so, performing coordinate conversion on the touch data to obtain external coordinates corresponding to the external device;
information output step: converting the external coordinates into external data in a format receivable by the external device, and sending the external data to the external device so that the external device can respond to the touch data.
2. The touch response method according to claim 1, wherein, in the OSD judging step, if the touch position is not located in the OSD region, the external judging step is performed after the touch data is sent to the first system;
and the external judging step comprises the following sub-steps:
a first sub-step: the first system generates a touch event upon receiving the touch data and judges whether the touch position is located in the external signal source interactive application window; if so, a second sub-step is executed; if not, another application on the first system corresponding to the touch position responds to the touch data;
a second sub-step: the first system judges, through the external signal source interactive application, whether the touch position is located in the external signal source display area; if so, the external signal source interactive application performs coordinate conversion on the touch data to obtain external coordinates corresponding to the external device and sends the external coordinates to the MCU module.
3. The touch response method of claim 1, wherein the external signal source interactive application window further comprises a menu area; in the external judging step, if the touch position is not located in the external signal source display area, the first system generates the touch event according to the touch data and judges whether the touch position is located in the menu area; if so, the first system responds to the touch data through the external signal source interactive application; if not, the first system responds to the touch data through another application.
4. The touch response method according to any one of claims 1-3, wherein, in the external judging step, the coordinate conversion is performed on the touch data according to window flag bit information of the external signal source interactive application and resolution information of the external device.
5. A touch screen based on a first system, configured to perform the touch response method of any one of claims 1-4, the touch screen comprising: a touch module, an OSD module, the first system, and an MCU module, wherein the first system is provided with an external signal source interactive application, an external signal source interactive application window corresponding to the external signal source interactive application is provided with an external signal source display area, and the external signal source display area is used for displaying an input picture of an external device; the touch screen can simultaneously display an OSD window, the external signal source interactive application window, and other pictures of the first system, and these windows and pictures can all be operated by touch synchronously;
wherein, when the touch module receives touch data, the touch module or the MCU module judges whether the touch position is located in an OSD region according to the touch data; if so, the touch data is sent to the OSD module, and the OSD module responds to the touch data; if not, the touch module, the MCU module, or the first system judges whether the touch position is located in the external signal source display area; if so, coordinate conversion is performed on the touch position to obtain external coordinates corresponding to the external device, and the MCU module then converts the external coordinates into external data in a format receivable by the external device and sends the external data to the external device so that the external device responds to the touch data;
and the OSD region is a region occupied on the touch screen by an OSD window corresponding to the OSD module; the external device adopts a second system.
6. The touch screen of claim 5, wherein the touch module transmits signals to the first system through USB and bidirectionally transmits signals with the OSD module through a serial port; the first system transmits signals to the MCU module through a serial port or USB; the MCU module transmits signals to the external device through USB; and the external device transmits signals to the first system through HDMI;
wherein the touch module judges whether the touch position is located in the OSD region;
if so, the touch module sends the touch data to the OSD module;
if not, the touch module sends the touch data to the first system; the first system then judges whether the touch position is located in the external signal source display area and, if so, performs coordinate conversion on the touch position to obtain external coordinates corresponding to the external device and sends the external coordinates to the MCU module.
7. The touch screen of claim 5, wherein the OSD module transmits signals to the touch module through a serial port; the touch module bidirectionally transmits signals with the first system through USB and transmits signals to the MCU module through USB; the MCU module transmits signals to the OSD module through a serial port and transmits signals to the external device through USB; and the external device transmits signals to the first system through HDMI;
wherein the touch module judges whether the touch position is located in the OSD region;
if so, the touch module sends the touch data to the OSD module;
if not, the touch module judges whether the touch position is located in the external signal source display area; if so, the touch module performs coordinate conversion on the touch data to obtain external coordinates corresponding to the external device and sends the external coordinates to the MCU module; if not, the touch module sends the touch data to the first system, and the first system generates a touch event upon receiving the touch data and responds to the touch event.
8. The touch screen of claim 5, wherein the touch module transmits signals to the MCU module through USB; the MCU module bidirectionally transmits signals with the OSD module through a serial port, bidirectionally transmits signals with the first system through USB, and transmits signals to the external device through USB; and the external device transmits signals to the first system through HDMI;
wherein, when the touch module receives touch data, the touch data is sent to the MCU module, and the MCU module judges whether the touch position is located in the OSD region;
if so, the MCU module sends the touch data to the OSD module;
if not, the MCU module sends the touch data to the first system, and the first system judges whether the touch position is located in the external signal source display area; if so, the first system performs coordinate conversion on the touch position to obtain external coordinates corresponding to the external device and sends the external coordinates to the MCU module; alternatively, the MCU module judges whether the touch position is located in the external signal source display area; if so, the MCU module performs coordinate conversion on the touch data to obtain external coordinates corresponding to the external device; if not, the MCU module sends the touch data to the first system, and the first system generates a touch event upon receiving the touch data and responds to the touch event.
9. An electronic device comprising the touch screen of any one of claims 5-8, the electronic device being an interactive large screen or an electronic whiteboard.
10. A computer readable storage medium, wherein the computer readable storage medium stores a computer program, and wherein the computer program when executed controls a device in which the computer readable storage medium is located to perform the touch response method according to any one of claims 1 to 4.
CN202311054795.XA 2023-08-18 2023-08-18 Touch response method based on first system, touch screen and electronic equipment Pending CN117093125A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311054795.XA CN117093125A (en) 2023-08-18 2023-08-18 Touch response method based on first system, touch screen and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311054795.XA CN117093125A (en) 2023-08-18 2023-08-18 Touch response method based on first system, touch screen and electronic equipment

Publications (1)

Publication Number Publication Date
CN117093125A true CN117093125A (en) 2023-11-21

Family

ID=88776412

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311054795.XA Pending CN117093125A (en) 2023-08-18 2023-08-18 Touch response method based on first system, touch screen and electronic equipment

Country Status (1)

Country Link
CN (1) CN117093125A (en)

Similar Documents

Publication Publication Date Title
US10459545B2 (en) All-in-one machine and method and computer memory medium for realizing quick touch in all channels thereof
CN109164964B (en) Content sharing method and device, terminal and storage medium
US20100017744A1 (en) Image display control method, image supply device, and image display control program product
US11243737B2 (en) Method and system for remote collaboration
CN103455292A (en) Business data display and processing method and device and user equipment
CN109885270B (en) Display device control method, display device, and storage medium
CN110688190A (en) Control method and device of intelligent interactive panel
CN110865718A (en) Method and device for supporting application of input method to multi-screen switching
KR100633161B1 (en) Display Apparatus and Data Writing Device
US20140035816A1 (en) Portable apparatus
US20150145749A1 (en) Image processing apparatus and image processing method
CN117093125A (en) Touch response method based on first system, touch screen and electronic equipment
US20190179474A1 (en) Control method, electronic device, and non-transitory computer readable recording medium
CN112926420B (en) Display device and menu character recognition method
CN114237482A (en) Handwriting display processing method, device, system, equipment and storage medium
CN107390981B (en) Global menu control method, device, equipment and storage medium
WO2018126585A1 (en) Display content updating method and mobile terminal
CN111142994A (en) Data display method and device, storage medium and electronic equipment
WO2024109286A1 (en) Multi-window switching method and apparatus, electronic device, and computer-readable storage medium
CN113316022B (en) Video playing method, device, equipment, system and storage medium
CN114077413B (en) Display module control system, display device and control method
KR102317091B1 (en) Apparatus and method for processing image
CN213122917U (en) KVM multi-system combined control device based on OSD
CN110851066B (en) Method and device for supporting touch control of multiple display screens
CN113419651B (en) Multi-window double-screen switching method and system, intelligent terminal and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination