WO2023004600A1 - Method and apparatus for controlling an application window, interactive flat panel, and storage medium - Google Patents


Info

Publication number
WO2023004600A1
WO2023004600A1 (application PCT/CN2021/108753; family member CN2021108753W)
Authority
WO
WIPO (PCT)
Prior art keywords
touch
screen
interface
application window
touch position
Prior art date
Application number
PCT/CN2021/108753
Other languages
English (en)
Chinese (zh)
Inventor
丁静静
黄业
Original Assignee
广州视源电子科技股份有限公司
广州视源创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 广州视源电子科技股份有限公司 and 广州视源创新科技有限公司
Priority to PCT/CN2021/108753
Priority to CN202180005735.1A
Publication of WO2023004600A1


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]

Definitions

  • the present application relates to the technical field of application window management, in particular to an application window control method, device, interactive panel and storage medium.
  • The interactive tablet is a representative large-scale integrated device, suitable for group-interaction occasions such as conferences, teaching, and business exhibitions.
  • This device integrates multiple functions such as screen projection and video conferencing, and its information interaction is realized mainly on the basis of touch technology.
  • The user can exit the full-screen mode of the application window through the following operations: 1) click the "Exit Full Screen" icon in the operation bar of the application window to make the application window exit full-screen mode; 2) click the "Minimize Window" icon in the operation bar of the application window to close the application window into process management; 3) click the "Home" icon in the system operation bar to close the application window into process management and display the desktop.
  • the operation bar of the application window is often set at the top of the window.
  • In full-screen mode, the operation bar is located at the top of the screen of the interactive tablet, which makes it difficult for the user to reach; if the operation bar is needed, the user must swipe down from the top edge of the interactive tablet to call it out, an operation that is both hard to perform and hard for the interactive tablet to perceive.
  • the embodiments of the present application provide an application window management method, device, interactive panel and storage medium, which realize convenient manipulation of application windows and improve the usability of the system.
  • an application window management method including:
  • the first touch operation is that the touch object slides from the edge of the screen to the middle of the screen and stays at the first touch position;
  • the second touch operation is that the touch object slides from the first touch position to the middle of the screen, and stays at the second touch position;
  • the third touch operation being that the touch object leaves the screen from the second touch position
  • a first interface of the target application window is displayed, and the first interface has the same interface position as the second preview interface.
  • the third touch operation is that the touch object leaves the screen from the second touch position, and the sliding speed of the touch object at the second touch position is less than a set first speed threshold.
  • the method also includes:
  • the fourth touch operation is that the touch object leaves the screen from the second touch position, and the sliding speed of the touch object to the second touch position is greater than or equal to said first speed threshold;
  • a second interface of the target application window is displayed, and the display size of the second interface is a set multiple of the display size of the target application window in a full-screen state.
  • the method also includes:
  • the fifth touch operation being that the touch object slides from the second touch position to the edge of the screen and stays at the third touch position;
  • the sixth touch operation is that the touch object slides from the third touch position to the edge of the screen and stays at the fourth touch position;
  • a fourth preview interface of the target application window is displayed, and an interface position of the fourth preview interface is related to the fourth touch position.
  • the method also includes:
  • the seventh touch operation is that the touch object leaves the screen from the fourth touch position
  • a third interface of the target application window is displayed, where the display position of the third interface is the same as that of the fourth preview interface.
  • the seventh touch operation is that the touch object leaves the screen from the fourth touch position, and the sliding speed of the touch object at the fourth touch position is lower than a set second speed threshold.
  • the method also includes:
  • the eighth touch operation is that the touch object leaves the screen from the fourth touch position, and the sliding speed of the touch object to the fourth touch position is greater than or equal to said second speed threshold;
  • the target application window in a full screen state is displayed.
  • the method also includes:
  • the ninth touch operation is sliding the touch object from the fourth touch position to the edge of the screen;
  • the target application window in a full screen state is displayed.
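Taken together, the third through ninth touch operations above amount to a small decision rule applied when the touch object leaves the screen. The following Python sketch is illustrative only: the function name, the rectangle representation `(x, y, w, h)`, and the threshold and multiple values are assumptions, not taken from the application.

```python
# Illustrative sketch of the release logic described in the claims above.
# All names and numeric values are assumptions for illustration.

FIRST_SPEED_THRESHOLD = 1200.0   # px/s, assumed value
SECOND_SPEED_THRESHOLD = 1200.0  # px/s, assumed value
SMALL_WINDOW_MULTIPLE = 0.5      # the "set multiple" of the full-screen size, assumed

def on_release(direction, speed, preview_rect, full_rect):
    """Decide the final window form when the touch object leaves the screen.

    direction: 'inward' if the last slide moved toward the middle of the
               screen, 'outward' if it moved back toward the edge.
    speed:     sliding speed at the release position (px/s).
    Rectangles are (x, y, w, h) tuples.
    """
    if direction == 'inward':
        if speed < FIRST_SPEED_THRESHOLD:
            # Third touch operation: pin the window at the preview position.
            return preview_rect
        # Fourth touch operation: show the second interface, sized as a
        # set multiple of the full-screen window.
        w = full_rect[2] * SMALL_WINDOW_MULTIPLE
        h = full_rect[3] * SMALL_WINDOW_MULTIPLE
        return (full_rect[0], full_rect[1], w, h)
    else:  # sliding back toward the edge of the screen
        if speed < SECOND_SPEED_THRESHOLD:
            # Seventh touch operation: keep the fourth preview position.
            return preview_rect
        # Eighth/ninth touch operation: restore the full-screen state.
        return full_rect
```

A slow release pins the window at its preview position, while a fast release either snaps it to a small fixed size (inward flick) or restores full screen (outward flick).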
  • the interface aspect ratio of the preview interface is the same as the aspect ratio of the screen, and the interface position of the preview interface is represented by the vertex coordinates of the interface vertices;
  • the coordinates of each vertex are determined from the target touch coordinates of the target touch position combined with the width and height values of the screen, where the target touch position is the position of the touch point at which the touch object stays when the preview interface is displayed.
  • the step of determining the coordinates of each vertex according to the target touch coordinates of the target touch position in combination with the width and height values of the screen includes:
  • the abscissa of each interface vertex is determined from the abscissa of the target touch coordinates, and the ordinate corresponding to each interface vertex is determined by combining the abscissa of each vertex with the width and height values of the screen.
  • the step of determining the coordinates of each vertex according to the target touch coordinates of the target touch position in combination with the width and height values of the screen includes:
  • the ordinate of each interface vertex is determined from the ordinate of the target touch coordinates, and the abscissa corresponding to each interface vertex is determined by combining the ordinate of each vertex with the width and height values of the screen.
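The vertex computation described in the bullets above can be sketched in Python. The application only states that the vertices are derived from the target touch coordinates combined with the screen width and height while preserving the screen's aspect ratio; the concrete mapping below (bottom edge through the touch point, size scaled by the touch ordinate, abscissa clamped to the screen) is an assumption for illustration, written for an upward swipe from the bottom edge.

```python
def preview_vertices(touch_x, touch_y, screen_w, screen_h):
    """Four vertex coordinates of the preview interface for an upward
    swipe from the bottom edge. Origin is top-left, y grows downward.
    The mapping itself is an assumed example, not the patented formula."""
    # Scale factor: 1.0 when the touch point is at the bottom edge,
    # smaller as the touch object slides up toward the middle (assumed).
    scale = touch_y / screen_h
    w = screen_w * scale          # screen aspect ratio is preserved
    h = screen_h * scale
    # Bottom edge of the preview passes through the touch ordinate;
    # the abscissa is centred on the touch point and clamped on-screen.
    left = min(max(touch_x - w / 2, 0), screen_w - w)
    top = touch_y - h
    return [(left, top), (left + w, top),
            (left, top + h), (left + w, top + h)]
```

For a 1920x1080 screen and a finger resting at mid-height, this yields a half-size preview whose bottom edge passes through the finger, consistent with the figure descriptions later in the text.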
  • the sliding speed when the touch object slides from one touch position to another touch position is determined by the touch coordinates and touch time points of each touch position.
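The sliding-speed computation described in the bullet above reduces to distance over elapsed time between two recorded touch points. A minimal sketch (function and argument names are assumptions):

```python
import math

def sliding_speed(p1, t1, p2, t2):
    """Sliding speed between two touch positions, computed from their
    touch coordinates (px) and touch time points (s), as described above."""
    distance = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    elapsed = t2 - t1
    # Guard against identical time stamps from the touch frame.
    return distance / elapsed if elapsed > 0 else float('inf')
```

For example, moving 500 px in 0.5 s gives 1000 px/s, which would then be compared against the first or second speed threshold.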
  • an application window control device which includes:
  • a full-screen display module configured to display a target application window in a full-screen state
  • the first receiving module is configured to receive a first touch operation, the first touch operation is that the touch object slides from the edge of the screen to the middle of the screen, and stays at the first touch position;
  • a first display module configured to display a first preview interface of the target application window, where the interface position of the first preview interface is related to the first touch position;
  • the second receiving module is configured to receive a second touch operation, the second touch operation is that the touch object slides from the first touch position to the middle of the screen and stays at the second touch position;
  • a second display module configured to display a second preview interface of the target application window, where the interface position of the second preview interface is related to the second touch position;
  • the third receiving module is configured to receive a third touch operation, the third touch operation is that the touch object leaves the screen from the second touch position;
  • the third display module is configured to display the first interface of the target application window, and the first interface has the same interface position as the second preview interface.
  • an interactive tablet including:
  • the touch component is used to respond, through its hardware circuit, to the touch operations of the touch object;
  • the display screen is covered with the touch component to form a touch screen for displaying application windows;
  • one or more processors;
  • the one or more processors are caused to implement the method provided in the first aspect of the present application.
  • the embodiment of the present application further provides a storage medium containing computer-executable instructions, and the computer-executable instructions are used to execute the method as described in the first aspect when executed by a computer processor.
  • The foregoing provides an application window control method, device, interactive panel and storage medium.
  • The proposed method can be executed by an interactive tablet: the target application window is first displayed in a full-screen state; after the first touch operation is received, a first preview interface can be displayed, whose interface position is related to the first touch position at which the first touch operation stays; after the second touch operation is received, a second preview interface can be displayed, whose interface position is related to the second touch position at which the second touch operation stays; and after the third touch operation is received, the first interface can be displayed, whose interface position is the same as that of the second preview interface associated with the second touch position from which the touch object leaves the screen during the third touch operation.
  • The touch operations generated by the interaction between the user and the interactive tablet involve only the touch object sliding from the edge of the screen toward the middle of the screen, and by responding to each touch operation the interactive tablet can directly adjust the interface position of the target application window and control it to exit the full-screen state.
  • This overcomes the problem that, owing to the large size of the interactive tablet, the user cannot effectively control an application window to exit full screen, and also avoids the frequent interactions otherwise required to do so.
  • User manipulation is thus made convenient, and the system usability of the interactive panel is greatly improved.
  • FIG. 1 is a schematic flowchart of an application window control method provided in Embodiment 1 of the present invention
  • Fig. 1a is a schematic interface diagram of the first preview interface displayed on the interactive panel in the first embodiment
  • Fig. 1b is a schematic diagram of the interface of the second preview interface displayed on the interactive tablet in the first embodiment
  • Fig. 1c is a schematic interface diagram of the first interface displayed on the interactive panel in the first embodiment
  • Fig. 1d is a schematic interface diagram of the second interface displayed on the interactive panel in the first embodiment
  • FIG. 2 is a schematic flowchart of an application window control method provided in Embodiment 2 of the present invention.
  • FIG. 2a is a schematic diagram of the interface of the third preview interface displayed on the interactive panel in the second embodiment
  • Fig. 2b is a schematic interface diagram of the fourth preview interface displayed on the interactive panel in the second embodiment
  • Fig. 2c is an interface schematic diagram of the third interface displayed on the interactive panel in the second embodiment
  • FIG. 2d is a schematic diagram of one interface of the target application window in the full-screen state displayed on the interactive tablet in the second embodiment
  • FIG. 2e is another schematic diagram of the interface of the target application window in the full-screen state displayed on the interactive tablet in the second embodiment
  • FIG. 3 is a schematic structural diagram of an application window control device provided in Embodiment 3 of the present invention.
  • FIG. 4 is a schematic structural diagram of an interactive panel provided in Embodiment 4 of the present application.
  • FIG. 1 is a schematic flowchart of a method for controlling an application window provided by Embodiment 1 of the present invention.
  • This embodiment is applicable to controlling the display position and display size of an application window on a screen.
  • the method can be executed by an application window control device, the application window control device can be realized by software and/or hardware, and can be configured in the interactive panel, especially in the processor of the interactive panel, and the processor can be an intelligent processing system
  • the interactive panel is equipped with a touch screen, which can be regarded as an electrical connection combination of the touch frame and the display screen.
  • The content on the display screen may be a window interface of an application in the smart interactive tablet, a screen-projection interface sent by an external device, or video data sent by an external device.
  • the external device can be: a mobile phone, a notebook computer, a tablet computer, a desktop computer, etc., and the external device establishes a data connection with the smart interactive tablet.
  • This embodiment takes the window interface of an application program as the main processing object of the display interface, so as to realize convenient control over the display of the application window interface.
  • this embodiment takes the intelligent teaching based on the interactive tablet as the actual application scenario.
  • When the teacher conducts online teaching, it is usually necessary to open multiple functional application windows and display them on the screen of the interactive tablet.
  • So that the contents of these application windows can be displayed on the screen without occlusion, each application window needs to exit the full-screen state and be displayed as a small window interface.
  • the existing operation mode (such as clicking the minimize/exit full-screen button at the top of the screen) provided to the user for interactive control of the display state of the application window is not suitable for a large-sized terminal device such as an interactive tablet.
  • the method for controlling the application window provided by this embodiment can respond to the touch operation generated by the user relative to the application window by adopting an interactive operation mode different from the existing ones, so as to control the application window to display from full screen to other display sizes.
  • the interaction process controlled by the entire application window in this embodiment is more convenient for the user's operation.
  • the application window control method provided by Embodiment 1 of the present application specifically includes the following operations:
  • After a functional application is triggered, its application window may be presented in a full-screen state on the display screen of the interactive panel by default.
  • multiple full-screen application windows can exist on the interactive tablet, but only the content of the application window at the top of the screen can be displayed to the user, and the contents of other application windows are covered by the top application window.
  • When the user interacts with the interactive panel, only the topmost element (application window) on the screen can be manipulated.
  • the uppermost application window displayed in a full-screen state can be recorded as the target application window that can be interactively manipulated by the user, and the subsequent steps in this embodiment can be considered as execution steps for the target application window.
  • the first touch operation is that the touch object slides from the edge of the screen to the middle of the screen, and stays at the first touch position.
  • the touch object may specifically be a user's finger, an active stylus or a passive stylus, etc., and the user may manipulate the touch object to slide on the display surface of the interactive panel.
  • The sliding behavior performed by the user through the touch object can be determined by the relevant hardware and software of the execution subject working together; by analyzing the data related to the sliding behavior, it can be determined what operation the user has performed on the display screen, so that the touch operation triggered by the user can be received through this step.
  • the execution body analyzes and determines that the user has performed the first touch operation on the screen.
  • The specific implementation can be described as follows: the touch frame configured on the execution body responds to the touch object touching the screen and generates a touch signal, obtains the touch-point information generated when the touch object touches the screen, and feeds it back to the upper processing system.
  • By analyzing the information of each touch point, the processing system can determine whether the user has performed a sliding operation on the display screen, and what the sliding direction and sliding trajectory of that operation are.
  • the processing system can determine that the user manipulates the touch object to move from the edge of the screen to the middle of the screen, and currently stays at the first touch position.
  • the subject of execution may consider that the user has performed a first touch operation interaction with the target application window, and thus the first touch operation generated by interacting with the user may be received through this step.
  • The stay of the touch object at the first touch position may be a very short, transitory stay (for example, the touch object merely passes the first touch position during sliding and then continues to slide under the user's manipulation), or a stay of a certain interval (for example, the touch object slides to a certain position on the screen and remains there for a certain time interval).
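The recognition pipeline described above, in which the touch frame reports touch points and the processing system infers direction and trajectory, can be sketched as a simple classifier over the recorded touch points. The edge-band width and the `(x, y, t)` point format are assumptions for illustration:

```python
EDGE_BAND = 20  # px from the screen border that counts as "edge", assumed

def classify_swipe(points, screen_w, screen_h):
    """Classify a stream of (x, y, t) touch points as an inward edge swipe.

    Returns the edge the swipe started from ('bottom', 'top', 'left',
    'right') or None. A sketch of the recognition the processing system
    performs; thresholds are assumptions, not from the application.
    """
    if len(points) < 2:
        return None
    x0, y0, _ = points[0]
    xn, yn, _ = points[-1]
    if y0 >= screen_h - EDGE_BAND and yn < y0:
        return 'bottom'      # sliding up from the bottom
    if y0 <= EDGE_BAND and yn > y0:
        return 'top'         # sliding down from the top
    if x0 <= EDGE_BAND and xn > x0:
        return 'left'        # sliding from left to right
    if x0 >= screen_w - EDGE_BAND and xn < x0:
        return 'right'       # sliding from right to left
    return None              # did not start at an edge or moved outward
```

A swipe that starts inside the screen rather than at an edge is ignored, matching the claim that the first touch operation begins at the edge of the screen.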
  • the hardware part of the interactive tablet as the execution subject is composed of a display screen, an intelligent processing system, etc., which are combined by integral structural parts and supported by a dedicated software system.
  • The display screen may specifically include a light-emitting diode (Light Emitting Diode, LED) display screen, an organic light-emitting diode (Organic Light-Emitting Diode, OLED) display screen, a liquid crystal display (Liquid Crystal Display, LCD) screen, and the like.
  • The display screen of an interactive panel refers to a touch screen or touch panel, which is an inductive liquid crystal display device.
  • Through the tactile feedback system on the screen, a program can drive various connection devices, which can replace a mechanical button panel and create vivid audio-visual effects through the liquid crystal display screen.
  • touch screens can be divided into five basic types: vector pressure sensing technology touch screens, resistive technology touch screens, capacitive technology touch screens, infrared technology touch screens, and surface acoustic wave technology touch screens.
  • the touch screen can be divided into four types: resistive type, capacitive induction type, infrared type and surface acoustic wave type.
  • When the touch object touches the screen, the coordinates of the touch point are located, so as to realize control of the intelligent processing system, and different functional applications are then realized by the software built into the intelligent processing system.
  • optical touch sensors are arranged on both sides of the surface of the display screen to form a touch frame, so as to form a touch display screen.
  • the touch information recognition process can be described as: the optical touch sensor constituting the touch frame can scan the touch object, such as the user's finger, stylus, etc., using light signals on the surface of the display screen.
  • the touch object touches the display screen, triggers a certain interface on the display screen, or performs positioning and other operations
  • the touch frame can respond to the above touch operations and transmit the corresponding touch operation information to the intelligent processing system at the application level, so that through the intelligent The processing system implements various interactive applications.
  • The edge of the screen can be regarded as the frame position of the display screen on the interactive panel; since the display screen has four sides, the edge of the screen may be the left edge, right edge, top edge, or bottom edge of the display screen.
  • the sliding trajectory corresponding to the bottom edge can be described as sliding up from the bottom;
  • the sliding trajectory corresponding to the top edge can be described as sliding down from the top;
  • the sliding trajectory corresponding to the left edge can be described as sliding from left to right, and the sliding trajectory corresponding to the right edge as sliding from right to left.
  • this embodiment does not limit the sliding track from the edge of the screen to the middle of the screen as horizontal sliding or vertical sliding.
  • The object acted on by the first touch operation is the target application window, and the purpose of the touch is to adjust the window position and display size of the target application window.
  • the result after responding to the first touch operation is displayed, specifically displaying the first preview interface of the target application window.
  • the first touch operation corresponds to the user interaction operation specifically: the touch object slides from the edge of the screen to the middle of the screen, and stays at the first touch position.
  • the execution subject receives and responds to the first touch operation, it can control the target application window to temporarily appear on the display screen in the form of the first preview interface.
  • the display form (interface position, interface size) of the first preview interface presented is specifically related to the first touch position where the touch object stays at this time.
  • From the first touch position, the interface position of the first preview interface on the screen can be determined, and correspondingly the interface size of the first preview interface is determined.
  • The relationship between the interface position of the first preview interface and the first touch position may be that the touch coordinates of the first touch position lie exactly on one of the four sides of the displayed first preview interface.
  • The interface state of the target application window changes as the touch object slides from the edge of the screen toward the middle of the screen, and the interface position of the first preview interface formed by this change is related to the touch position that the touch object passes through (stays at) while sliding.
  • This step is equivalent to receiving the touch operation performed by the user manipulating the touch object; the received touch operation is recorded as the second touch operation.
  • The second touch operation is specifically characterized as the touch object continuing to slide from the above-mentioned first touch position toward the middle of the screen and staying at the second touch position during the sliding process.
  • The stay of the touch object at the second touch position can likewise be a short transitory stay or a stay of a certain interval.
  • The executive body can first respond, through the equipped touch frame, to the touch signal generated as the touch object slides, feed the corresponding touch-point information back to the upper layer, and then analyze it; when the sliding direction and sliding trajectory both indicate that the touch object has continued to slide from the first touch position toward the middle of the screen and stayed at the second touch position, it can be determined that the operation performed by the user is the second touch operation, which can then be received through this step.
  • The display form of the target application window can also be adjusted along with the obtained information about the second touch position: as the touch object slides from the first touch position toward the middle of the screen to the second touch position, the target application window is adjusted from the first preview interface to the display form of the second preview interface, whose interface position is related to the second touch position at which the touch object currently stays.
  • For the second preview interface, one of its sides may be controlled to contain the touch coordinates of the second touch position, or one of its interface vertices may be controlled to coincide with the touch coordinates of the second touch position.
  • This embodiment considers that the interface position of the preview interface of the target window gradually approaches the center of the screen as the touch object slides toward the middle of the screen, and that the interface size of the preview interface also gradually becomes smaller as the touch object slides toward the middle of the screen.
  • The interface positions of the above-mentioned first preview interface and second preview interface are related to the first touch position and the second touch position respectively; this embodiment can use the touch coordinates of the first touch position and the second touch position, combined with the width and height values of the screen, to determine the specific position information of the corresponding interface positions.
  • During the process in which the user manipulates the touch object to slide from the edge of the screen toward the middle of the screen, the executive body can determine the various touch operations performed (such as the first touch operation and the second touch operation); it can likewise monitor the touch operation in which the user manipulates the touch object to leave the display screen.
  • This touch operation can be recorded as the third touch operation; this step can receive the third touch operation correspondingly generated when the touch object leaves the screen from the second touch position.
  • the executive body can monitor the leaving event of the touch object leaving the screen through a preset touch monitoring function. Specifically, during the process of the user manipulating the touch object to perform various interactive operations on the display screen, the execution subject can also use the touch monitoring function to monitor whether an exit event of manipulating the touch object to leave the screen occurs. If the leaving event is detected, it can be considered that the execution subject recognizes that the user has manipulated the touch object to leave the screen from a touch position, and correspondingly generates a touch operation that the touch object leaves the screen.
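The "preset touch monitoring function" described above can be pictured as a callback registry fed by the touch frame's event stream. The class name and event shapes below are assumptions for illustration, not the application's API:

```python
class TouchMonitor:
    """Minimal sketch of the preset touch monitoring function: callers
    register a callback that fires when the touch object leaves the
    screen. All names here are assumed for illustration."""

    def __init__(self):
        self._on_leave = []

    def on_leave(self, callback):
        # Register a callback invoked with the leave position (x, y).
        self._on_leave.append(callback)

    def feed(self, event):
        # event: ('move', x, y) or ('up', x, y), as reported by the
        # touch frame; only an 'up' event counts as a leave event.
        if event[0] == 'up':
            for cb in self._on_leave:
                cb(event[1], event[2])
```

When a leave event fires, the registered handler receives the touch position from which the touch object left the screen, which is exactly the position used to choose the final display form.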
  • This step presents the final display form of the target application interface on the screen, that is, the first interface.
  • the display form of the first interface is specifically related to the corresponding touch position when the touch object leaves the screen, and specifically may be the same as the interface position and interface size of the preview interface presented by the target application window when the touch object leaves the screen.
  • the second preview interface associated with the target application window at the second touch position may be directly displayed as the first interface. That is, the interface size and interface position of the first interface presented in this step are the same as those of the second preview interface.
  • the first preview interface and the second preview interface presented in this embodiment may or may not be the interface form that the user expects the target application window to finally appear.
  • The display form of the target application window changes with the touch position at which the touch object stays. For example, when the touch object slides from the edge of the screen to the first touch position, the display form of the target application window is adjusted from the full-screen state to the interface position of the first preview interface; and when the touch object slides from the first touch position toward the middle of the screen to the second touch position, the display form of the target application window is adjusted from the first preview interface to the interface position of the second preview interface.
  • When the touch object leaves the screen, the target application window is presented on the screen, as its final form, in the display form of the preview interface associated with the touch position. For example, if the touch object leaves from the second touch position, and the display form associated with the second touch position is the second preview interface, then the second preview interface can be displayed as the final form through this step and recorded as the first interface.
  • this embodiment uses the following example to describe the specific implementation of controlling the adjustment of the display form of an application window based on user interaction manipulation from a visualization perspective.
  • the application window in full screen state is displayed on the interactive tablet, and the screen edges on the interactive tablet include the left edge, right edge, bottom edge and top edge.
  • This example uses the bottom edge as an example.
  • the user uses a finger (the number of fingers is not limited in this embodiment, it can be one finger, two fingers or even multiple fingers) as a touch object, and slides upward from the bottom edge of the interactive tablet.
• This embodiment uses the provided application window control method to process the user's operation of sliding the finger upward from the bottom edge, and can give different responses to different operations; the response results can be specifically reflected through the display form of the interface presented by the target application window.
• FIG. 1a is a schematic diagram of the first preview interface displayed on the interactive tablet in the first embodiment. As shown in FIG. 1a, the first preview interface 14 is the interface presented relative to the target application window after the interactive tablet 1 receives the user's first touch operation.
  • the first touch operation is as follows: the finger 11 slides upward from the bottom edge 12 of the interactive tablet 1 and stays at the first touch position 13 on the screen.
• In response, the preview interface of the target application window is presented in the display form of the first preview interface 14.
• The first touch position 13 determines the interface position of the first preview interface 14; specifically, the first touch position 13 is located on the bottom edge of the first preview interface 14. In addition, to ensure that the first preview interface 14 achieves a better display effect, it may be preferred that the first preview interface 14 has the same aspect ratio as the screen.
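The geometric relationship described above — the touch position lying on the bottom edge of a preview that keeps the screen's aspect ratio — can be sketched as follows. This is only an illustrative model: the function name, the linear shrink rule, and the `min_scale` floor are assumptions of this sketch, not details taken from the patent.

```python
def preview_rect(screen_w, screen_h, touch_y, min_scale=0.25):
    """Preview rectangle for a touch `touch_y` pixels above the bottom edge.

    The preview keeps the screen's aspect ratio, is centred horizontally,
    and its bottom edge passes through the touch position, matching the
    behaviour described for the first and second preview interfaces.
    The linear shrink rule below is an assumption of this sketch.
    """
    # Shrink the preview as the finger travels further from the edge,
    # but never below the (assumed) minimum scale.
    scale = max(min_scale, 1.0 - touch_y / screen_h)
    w, h = screen_w * scale, screen_h * scale
    left = (screen_w - w) / 2.0   # centred horizontally
    bottom = touch_y              # touch lies on the preview's bottom edge
    return left, bottom, w, h
```

For a 1920×1080 screen, a touch 216 px above the bottom edge yields a 1536×864 preview (the 16:9 ratio is preserved) whose bottom edge sits exactly at the touch position.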
• FIG. 1b is a schematic diagram of the second preview interface displayed on the interactive tablet in the first embodiment. As shown in FIG. 1b, the second preview interface 16 is the interface presented relative to the target application window after the interactive tablet 1 receives the user's second touch operation.
  • the second touch operation is shown as: the finger 11 continues to slide upward from the first touch position 13 of the interactive tablet 1, and stays at the second touch position 15 on the screen.
• In response, the preview interface of the target application window is presented in the display form of the second preview interface 16.
• The second touch position 15 determines the interface position of the second preview interface 16; specifically, the second touch position 15 is located on the bottom edge of the second preview interface 16. In addition, to ensure that the second preview interface 16 achieves a better display effect, it is also preferable that the second preview interface 16 has the same aspect ratio as the screen.
  • FIG. 1c is a schematic interface diagram of the first interface displayed on the interactive panel in the first embodiment.
  • the first interface 17 is an interface presented to the target application window after the interactive tablet 1 receives the user's third touch operation.
  • the third touch operation is shown as: the finger 11 leaves the screen from the second touch position 15 of the interactive tablet 1 .
  • the target application window is no longer presented in the form of a preview interface, but is presented in a fixed form of the first interface 17 .
  • the interface position of the presented fixed interface (eg, the first interface 17 ) is the same as the interface position of the preview interface presented at the touch position when leaving the screen.
  • the first interface 17 in FIG. 1c is at the same interface position as the second preview interface in FIG. 1b , and the interface sizes and the like are also the same.
• The touch operation generated by the interaction between the user and the interactive tablet is related only to the sliding of the touch object from the edge of the screen toward the middle of the screen, and the interactive tablet can directly respond to each touch operation by adjusting the interface position of the target application window and controlling the target application window to exit the full-screen state.
• The problem that the user cannot effectively control the application window to exit the full screen due to the large size of the interactive tablet is overcome, and the problem of requiring frequent user interaction to control the application window to exit the full screen is also solved.
  • the convenience of user manipulation is realized, and the system usability of the interactive panel is greatly improved.
• The third touch operation is further optimized as follows: the touch object leaves the screen from the second touch position, and the sliding speed at which the touch object slides to the second touch position is less than a set first speed threshold.
  • the first interface displayed in the above embodiment is performed after receiving the third touch operation, and this optional embodiment provides further limitations on the third touch operation.
• The third touch operation also includes a limiting condition: the sliding speed at which the touch object slides to the second touch position, during its slide toward the middle of the screen, must be less than the first speed threshold.
• The execution subject can check the above limiting condition by monitoring the sliding speed during the sliding process of the touch object. When it determines that the sliding speed is less than the first speed threshold, it determines that the generation conditions of the third touch operation are met, generates the third touch operation, and performs the display operation of S107 described above.
  • the execution subject may determine the touch coordinates and the touch time points of the above two touch positions.
• The process of determining the sliding speed when the touch object slides from one touch position to another can be described as follows: record the touch position where the touch object starts to slide as the initial touch position, and the touch position reached by sliding as the target touch position; obtain the initial touch time point when the touch object is at the initial touch position and the target touch time point when the touch object is at the target touch position; and also obtain the initial touch coordinates of the initial touch position and the target touch coordinates of the target touch position. The distance between the two touch positions can then be determined from the initial and target touch coordinates, and the sliding time from the target touch time point and the initial touch time point. Finally, the ratio of the distance to the sliding time can be used as the sliding speed at which the touch object reaches the target touch position.
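The speed calculation described above can be sketched as below; the function name and the use of pixels and seconds as units are assumptions of this sketch.

```python
import math

def sliding_speed(start_pos, start_time, target_pos, target_time):
    """Speed at which the touch object reaches the target touch position.

    `start_pos` and `target_pos` are (x, y) touch coordinates in pixels
    and the time points are in seconds, so the result is in pixels per
    second; both units are assumptions of this sketch.
    """
    distance = math.hypot(target_pos[0] - start_pos[0],
                          target_pos[1] - start_pos[1])
    duration = target_time - start_time
    if duration <= 0:
        raise ValueError("target time point must be later than start")
    # Ratio of the distance value to the sliding time, as described above.
    return distance / duration
```

A slide of 300 px completed in 0.5 s gives 600 px/s; under one reading of the text, this value would then be compared against a threshold of 0.3 times the screen height (per unit time).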
  • the first speed threshold may preferably be a set multiple of the screen height of the interactive tablet, and the set multiple may be 0.3.
  • this optional embodiment also includes:
  • the fourth touch operation is that the touch object leaves the screen from the second touch position, and the sliding speed of the touch object to the second touch position is greater than or equal to said first speed threshold;
  • a second interface of the target application window is displayed, and the display size of the second interface is a set multiple of the display size of the target application window in a full-screen state.
• After determining that the touch object has left the second touch position, the execution subject further compares the sliding speed at which the touch object slid to the second touch position with the first speed threshold. When the sliding speed is less than the first speed threshold, the interactive manipulation performed by the user is considered to meet the generation condition of the third touch operation; when the sliding speed is greater than or equal to the first speed threshold, the interactive operation performed by the user is considered to meet the fourth touch operation.
• This optional embodiment may also receive the fourth touch operation recognized by the execution subject. Responding to the received fourth touch operation is equivalent to first determining that the touch object has left the screen at the second touch position, where the leaving operation triggers displaying the target application window in a fixed form; then, when it is determined that the sliding speed at which the touch object slid to the second touch position is greater than or equal to the first speed threshold, it can be determined that the target application window should be presented with a fixed display size, and the interface presented at the determined display size can be recorded as the second interface.
  • the display size of the second interface may preferably be a set multiple of the corresponding display size in a full-screen state of the target application window.
  • the setting multiple may preferably be 1/4.
  • FIG. 1d is a schematic diagram of the interface of the second interface displayed on the interactive tablet in the first embodiment.
• The fourth touch operation is performed as follows: the finger 11 leaves the screen from the second touch position 15 of the interactive tablet 1, and the sliding speed at which the finger 11 slid to the second touch position 15 before leaving the screen is greater than or equal to 0.3 times the screen height.
  • the target application window is also no longer presented in the form of a preview interface, but a second interface 18 in a fixed form is presented.
  • the characteristic of the presented second interface 18 is that its display size is 1/4 of the display size when the target application window is in full-screen state.
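One way to sketch the decision made when the finger lifts — third touch operation below the threshold, fourth at or above it — is shown below. The function and key names are invented for illustration, and reading the 1/4 "set multiple" as an area ratio (half the width and half the height) is an assumption; the text does not state whether the multiple applies to the area or to each linear dimension.

```python
def window_on_lift(speed, threshold, last_preview_rect, screen_w, screen_h):
    """Fixed form chosen when the finger leaves the screen (Embodiment 1).

    Below the first speed threshold (third touch operation) the window
    freezes at the last preview rectangle as the first interface; at or
    above it (fourth touch operation) the window is shown at the set
    multiple of the full-screen size as the second interface.
    """
    if speed < threshold:
        return {"kind": "first_interface", "rect": last_preview_rect}
    # Reading the 1/4 multiple as an area ratio is an assumption here.
    return {"kind": "second_interface",
            "size": (screen_w / 2.0, screen_h / 2.0)}
```

Under this reading, the fast branch on a 1920×1080 screen yields a 960×540 window, i.e. one quarter of the full-screen area, matching the FIG. 1d description.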
• The dotted line diagrams appearing in each of the above figures can be understood as the historical state corresponding to a historical event; the virtual finger pointing to the second touch position 15 can be regarded as the historical state of the finger touching the screen before it leaves the screen.
• This embodiment provides a specific example of an application interaction scenario to further illustrate the changes in the display size of the application window interface realized through the interactive operation between the user and the interactive tablet.
  • the user manipulates the finger to slide upward from the bottom edge of the interactive tablet.
• The target application window changes in real time in the form of a preview interface following the finger position, and the window display size of the preview interface shrinks continuously as the finger slides up (refer to the variation from FIG. 1a to FIG. 1b).
• When the finger leaves the screen, the target application window can be triggered to exit the full screen, and the touch position of the finger at that moment is used as the presentation position of the bottom of the target application window interface (refer to the interface effect shown in FIG. 1c).
• If the finger leaves the screen after a fast slide, the target application window can be triggered to exit the full screen, and the target application window can be controlled to display its interface at 1/4 times the full-screen interface (refer to the interface effect shown in FIG. 1d).
• This optional embodiment further optimizes, on the basis of the first embodiment above, the situation after the touch object leaves the screen from a touch position, introducing the sliding speed at which the touch object slides to that touch position and responding with different display results depending on it.
• A gesture operation mode for application window control is thereby added, the user can flexibly control the display state of the application window, and the usability of the interactive tablet is effectively improved.
  • Fig. 2 is a schematic flowchart of an application window control method provided by Embodiment 2 of the present invention.
  • This Embodiment 2 is optimized on the basis of the above-mentioned embodiment. On the basis of the above-mentioned embodiment, the optimization is added: receiving the fifth touch operation, the fifth touch operation is sliding the touch object from the second touch position to the edge of the screen and staying at the third touch position; displaying a third preview interface of the target application window, The interface position of the third preview interface is related to the third touch position; receiving a sixth touch operation, the sixth touch operation is that the touch object moves from the third touch position to the screen Slide the edge and stay at the fourth touch position; display a fourth preview interface of the target application window, where the interface position of the fourth preview interface is related to the fourth touch position.
  • an optimization is added: receiving a seventh touch operation, the seventh touch operation is that the touch object leaves the screen from the fourth touch position; displaying the third interface of the target application window, so The display positions of the third interface and the fourth preview interface are the same.
  • an optimization is added: receiving a ninth touch operation, the ninth touch operation is sliding the touch object from the fourth touch position to the edge of the screen; displaying the target application in a full-screen state window.
  • a method for controlling an application window provided in Embodiment 2 specifically includes the following steps:
  • the state displayed in this step can be used as the initial presentation state of the application window control, specifically, the application window in the uppermost full-screen state on the screen can be determined as the target application window that the user can interact with.
• Sliding the touch object from the edge of the screen toward the middle of the screen can serve as a trigger operation for application window control; that is, when the user wants to adjust the application window by interacting with the interactive tablet, the user first needs to manipulate the touch object to slide from the edge of the screen toward the middle of the screen.
  • the executive body can also recognize different touch operations corresponding to when the touch object passes through or stays at different touch positions during the sliding process.
  • the touch object stays at the first touch position during the sliding process.
  • the executive body can execute the response step of the touch operation, and can display the corresponding response result through this step.
• For example, responding to the received first touch operation is equivalent to determining that the interface display form of the target application window currently needs to be adjusted.
  • the specific adjustment of the interface display form is related to the first touch position where the touch object stays in the first touch operation, so the interface position of the interface to be displayed in the target application window can be determined through the first touch position, and the first preview The form of the interface is displayed at the determined interface position.
  • this step can also receive the second touch operation recognized due to the sliding and staying of the touch object.
  • this step can also present the result of the response to the received second touch operation, and specifically present a second preview interface whose interface position is related to the second touch position.
  • this step can be regarded as a continuation of the above steps, and the received fifth touch operation is also related to the user's manipulation of the touch object.
  • the sliding direction of the touch object is changed.
  • the touch object no longer follows the sliding direction from the edge of the screen to the middle of the screen, but instead slides from the current touch position (such as the second touch position) to the edge of the screen.
• The execution subject is equivalent to recognizing the generated fifth touch operation, and can receive the fifth touch operation through this step.
• The execution subject can also respond, through the equipped touch frame, to the touch signal generated as the touch object slides, feed the corresponding touch point information back to the upper layer, and then analyze that touch point information to determine the sliding direction and sliding trajectory. When the sliding direction and trajectory indicate that the touch object has slid from the current touch position (such as the second touch position) toward the edge of the screen and stayed at the third touch position, it can finally be determined that the performed operation is the fifth touch operation, which may be received through this step.
• The fifth touch operation meets the condition for presenting the preview interface of the target application window on the screen; therefore, the display form of the preview interface of the target application window is adjusted again.
  • the adjustment may specifically be: as the touch object slides from the second touch position to the edge of the screen to the third touch position, the target application window is also adjusted and changed from the second preview interface displayed to the display of the third preview interface form, and the interface position of the third preview interface is related to the third touch position where the touch object currently stays.
  • the interface position of the preview interface of the target application window can be adjusted based on the third touch position until reaching the interface position corresponding to the third preview interface.
  • the sixth touch operation received in this step is also equivalent to receiving the touch operation generated by the user manipulating the touch object. Similar to the above-mentioned fifth touch operation, the sliding direction of the touch object corresponding to the sixth touch operation is also sliding from the current touch position (the third touch position) to the edge of the screen.
  • the sixth touch operation may specifically be characterized as: the touch object continues to slide from the third touch position to the edge of the screen, and stays at the fourth touch position during the sliding process. It can be known that, in this embodiment, the stay of the touch object at the third touch position and the fourth touch position may be for a short time or for a certain interval. And the recognition of the sixth touch operation by the executive body is also based on the combination of the equipped touch frame and the upper-level processing system.
• In response to the sixth touch operation received in S208 above, it can be determined that the sixth touch operation also meets the condition for presenting the preview interface of the target application window on the screen, and the relevant information about the fourth touch position is used to adjust the display form of the preview interface of the target application window again.
  • the adjustment may specifically be: as the touch object slides from the third touch position to the edge of the screen to the fourth touch position, the target application window is also adjusted from the displayed third preview interface to the fourth preview The display form of the interface, and the interface position of the fourth preview interface is related to the fourth touch position where the touch object currently stays.
  • association relationship between the fourth preview interface and the fourth touch position can also refer to the above description of the association relationship between the third preview interface and the third touch position, and will not be repeated here.
  • the executive body recognizes various touch operations performed during the sliding process of the user's manipulation of the touch object, it can also recognize the touch operation of the user's manipulation of the touch object to leave the display screen.
  • the touch operation may be recorded as a seventh touch operation.
  • a seventh touch operation corresponding to the touch object leaving the screen from the fourth touch position may be received.
• The execution subject can also monitor, through a preset touch monitoring function, the leaving event of the touch object leaving the screen. Specifically, while the user slides the touch object toward the edge of the screen, the execution subject can use the touch monitoring function to monitor whether the touch object leaves the screen. If a leaving event is detected, the execution subject can be considered to have recognized that the user manipulated the touch object to leave the screen from a touch position; correspondingly, the seventh touch operation of the touch object leaving the screen can be generated and received through this step.
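The leave-event monitoring described above can be sketched as a minimal observer; the class and method names are invented for illustration and do not reflect any actual touch-framework API.

```python
class TouchMonitor:
    """Minimal observer for screen-leave events.

    The touch frame / upper-layer split described in the text is reduced
    here to a plain callback registry; all names are illustrative only.
    """

    def __init__(self):
        self._leave_listeners = []

    def on_leave(self, callback):
        """Register a callback invoked with the last touch position."""
        self._leave_listeners.append(callback)

    def report_leave(self, touch_position):
        """Called by the lower layer when the touch object leaves the screen."""
        for callback in self._leave_listeners:
            callback(touch_position)
```

The upper layer would register a listener that, for instance, turns a leave event at the fourth touch position into the seventh touch operation.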
• This step presents the final display form of the target application interface on the screen, that is, the third interface.
  • the display form of the third interface is specifically related to the corresponding touch position when the touch object leaves the screen, and may specifically be the same as the interface position and interface size of the preview interface presented by the target application window when the touch object leaves the screen.
  • the fourth preview interface associated with the target application window at the fourth touch position may be directly displayed as the third interface. That is, the interface size and interface position of the third interface presented in this step are the same as those of the fourth preview interface.
  • the display form of the target application window will also change with the touch position where the touch object stays.
• When the touch object slides from the second touch position toward the edge of the screen to the third touch position, the display form of the target application window is adjusted from the second preview interface to the interface position of the third preview interface; and when the touch object slides from the third touch position to the fourth touch position, the display form of the target application window is adjusted from the third preview interface to the interface position of the fourth preview interface.
• During this process, the preview interface of the target application window gradually approaches the edge of the screen as the touch position gradually approaches the edge of the screen, and the size of the interface it presents relative to the target application window gradually increases accordingly.
  • this embodiment optimizes and limits the seventh touch operation on the basis of the above S210.
• The seventh touch operation is defined as an operation in which the touch object leaves the screen from the fourth touch position, and it also includes a limiting condition: the sliding speed at which the touch object slides to the fourth touch position, during its slide toward the edge of the screen, must be less than the second speed threshold.
  • this embodiment is equivalent to optimizing the seventh touch operation, specifically: the touch object leaves the screen from the fourth touch position, and the touch object slides to the fourth touch position. The speed is less than the set second speed threshold.
  • the execution subject in addition to recognizing that the touch object has left the screen at the fourth touch position, the execution subject also needs to monitor the sliding speed during the sliding process of the touch object to determine the sliding speed-related limiting conditions. Therefore, when it is determined that the sliding speed of sliding to the fourth touch position is less than the second speed threshold, it is determined that the generation condition of the seventh touch operation is met, thereby generating the seventh touch operation, and performing the above-mentioned display operation related to S211.
  • the second speed threshold may preferably be a set multiple of the screen height of the interactive tablet, and the value of the set multiple may also be 0.3.
• The execution subject may recognize that the touch object has left the fourth touch position, and when it determines that the sliding speed to the fourth touch position is greater than or equal to the second speed threshold, it considers that the interactive operation performed by the user satisfies the eighth touch operation.
  • the executive body can also receive the recognized eighth touch operation, and respond to the eighth touch operation.
• Responding to the eighth touch operation is equivalent to first determining that the display form of the target application window needs to be fixed. Then, when it is determined that the sliding speed at which the touch object slid to the fourth touch position is greater than or equal to the second speed threshold, it is determined that the target application window should be displayed in the full-screen state. That is, in this embodiment, when the eighth touch operation is received, the target application window can be displayed in the full-screen state.
  • the sliding speed at which the touch object slides to the fourth touch position can be determined by referring to the method for determining the sliding speed mentioned in the first embodiment above, and the process of determining the sliding speed will not be repeated here.
• This step and the above S210 can be regarded as two execution branches.
  • the executive body may determine that a ninth touch operation is correspondingly generated when it is recognized that the user manipulates the touch object to slide from the fourth touch position to the edge of the screen and slide all the way to the edge of the screen. Different from the seventh touch operation corresponding to the user manipulating the touch object to leave the screen at the fourth touch position, this embodiment can perform the receiving of the ninth touch operation through this step when the ninth touch operation is recognized. operate.
  • the presented response result is to directly display the target application window in a full-screen state on the screen.
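Taken together, S210 through S213 resolve the end of an edge-ward slide in three ways, which can be sketched as a single dispatch. All function and key names here are illustrative assumptions.

```python
def resolve_edgeward_slide(reached_edge, lifted, speed, threshold,
                           last_preview_rect):
    """Resolve the end of a slide back toward the screen edge (Embodiment 2).

    Sliding all the way to the edge (ninth touch operation) restores the
    full-screen state regardless of speed.  Lifting mid-slide compares the
    sliding speed with the second speed threshold: below it the window
    freezes at the fourth preview rectangle as the third interface
    (seventh touch operation); otherwise it returns to full screen
    (eighth touch operation).
    """
    if reached_edge:
        return {"kind": "fullscreen"}   # ninth touch operation
    if not lifted:
        return {"kind": "preview"}      # still tracking the finger
    if speed < threshold:
        return {"kind": "third_interface", "rect": last_preview_rect}
    return {"kind": "fullscreen"}       # eighth touch operation
```

The three branches correspond to FIG. 2e, FIG. 2c, and FIG. 2d respectively.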
  • the execution of the above S212 and S213 in this embodiment is equivalent to providing an implementation manner of controlling the application window to restore to the full screen state again.
  • this implementation is mainly aimed at the situation where the user manipulates the touch object to slide from the edge of the screen to the middle of the screen first, and then slides back to the edge of the screen after reaching a certain touch position.
• That is, after the touch object slides from the edge of the screen toward the middle of the screen, it does not leave the screen while sliding back toward the edge of the screen in the opposite direction; in other words, the target application window is not presented on the screen in the final form of the first interface, the second interface or the third interface during the whole process.
  • this embodiment still uses the following example to describe the specific implementation of controlling and adjusting the display form of an application window based on user interaction manipulation from a visualization perspective.
  • an application window in a full-screen state is displayed on the interactive tablet, and the user uses a finger as a touch object to perform interactive operations on the interactive tablet.
• The display states of the target application window are shown in FIG. 1a, FIG. 1b, FIG. 1c and FIG. 1d, and the above-mentioned touch operations performed by the user relative to the interactive tablet will not be repeated here.
• FIG. 2a is a schematic diagram of the third preview interface displayed on the interactive tablet in the second embodiment. As shown in FIG. 2a, the third preview interface 25 is the interface presented relative to the target application window after the interactive tablet 2 receives the user's fifth touch operation.
• The fifth touch operation is performed as follows: the finger 21 slides from the second touch position 22 of the interactive tablet 2 toward the bottom edge 23 and stays at the third touch position 24.
• In response, the preview interface of the target application window is presented in the display form of the third preview interface 25.
• The third touch position 24 determines the interface position of the third preview interface 25, embodied in that the third touch position 24 is located on the bottom edge of the third preview interface 25. In addition, to ensure that the third preview interface 25 achieves a better display effect, it may be preferred that the third preview interface 25 has the same aspect ratio as the screen.
• FIG. 2b is a schematic diagram of the fourth preview interface displayed on the interactive tablet in the second embodiment. As shown in FIG. 2b, the fourth preview interface 27 is the interface presented relative to the target application window after the interactive tablet 2 receives the user's sixth touch operation.
  • the sixth touch operation is as follows: the finger 21 continues to slide downward from the third touch position 24 of the interactive tablet 2, and stays at the fourth touch position 26 on the screen.
• In response, the preview interface of the target application window is presented in the display form of the fourth preview interface 27.
• The fourth touch position 26 determines the interface position of the fourth preview interface 27, embodied in that the fourth touch position 26 is located on the bottom edge of the fourth preview interface 27. In addition, to ensure that the fourth preview interface 27 achieves a better display effect, it may also be preferred that the fourth preview interface 27 has the same aspect ratio as the screen.
  • FIG. 2c is a schematic interface diagram of the third interface displayed on the interactive panel in the second embodiment.
  • FIG. 2d is a schematic diagram of one interface of the target application window in full screen state displayed on the interactive tablet in the second embodiment.
  • the third interface 28 is an interface presented to the target application window after the interactive tablet 2 receives the user's seventh touch operation.
  • the seventh touch operation is manifested as: the finger 21 leaves the screen from the fourth touch position 26 of the interactive tablet 2, and the sliding speed of the finger 21 sliding to the fourth touch position 26 is less than the second speed threshold (e.g. 0.3 times the screen height).
  • the target application window is no longer presented as a preview interface, but is presented as a third interface 28 in a fixed form.
• The departure of the finger 21 from a touch position determines that the target application window will no longer be adjusted in preview form with the movement of the finger 21, but will be presented as a fixed interface; the fixed interface to be presented is further limited by another condition, namely that the sliding speed at which the finger 21 slid to the fourth touch position 26 is less than the second speed threshold.
  • the interface position of the presented fixed interface is the same as the interface position of the preview interface presented at the touch position when leaving the screen. Referring to FIG. 2b and FIG. 2c , it can be seen that the third interface 28 in FIG. 2c is at the same interface position as the fourth preview interface in FIG. 2b , and the interface sizes and the like are also the same.
  • the target application window is displayed in a full-screen state in FIG. 2d , and the full-screen state can be considered as the state presented when the eighth touch operation is received.
  • the eighth touch operation is shown as: the finger 21 leaves the screen from the fourth touch position 26 of the interactive panel 2, and the sliding speed of the finger 21 sliding to the fourth touch position 26 is greater than or equal to the second Velocity threshold (e.g. 0.3 times the screen height).
  • the target application window is no longer presented in the form of a preview interface, but directly presented in a full-screen state.
  • FIG. 2e is a schematic diagram of another interface of the target application window in the full-screen state displayed on the interactive tablet in the second embodiment.
  • the full-screen state presented in FIG. 2e can also be regarded as the state presented after receiving the ninth touch operation.
  • the ninth touch operation is specifically represented as: the finger 21 continues to slide downward from the fourth touch position 26 of the interactive tablet 2 until it reaches the bottom edge 23 of the screen. At this time, it is also equivalent to triggering the target application window to return to the full-screen state for display. Therefore, in this embodiment, after receiving the ninth touch operation, the target application window is also directly presented in the full-screen state.
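The release handling described for the seventh, eighth, and ninth touch operations can be sketched as a single decision function. This is a minimal illustration under stated assumptions: the function name, return values, and the bottom-edge test (y axis pointing downward) are not from the patent, and the threshold of 0.3 times the screen height is only the example given in the text.

```python
# Sketch of the release-handling logic for the seventh, eighth, and
# ninth touch operations. All names and the bottom-edge test are
# illustrative assumptions; the y axis is taken to point downward.

def on_finger_release(touch_y, sliding_speed, screen_height,
                      second_speed_threshold):
    """Decide how to present the target application window when the
    touch object leaves the screen or reaches the bottom edge."""
    if touch_y >= screen_height:
        # Ninth touch operation: the finger slid all the way down to
        # the bottom edge, so restore the full-screen state directly
        # (no limit on the sliding speed in this case).
        return "fullscreen"
    if sliding_speed >= second_speed_threshold:
        # Eighth touch operation: a fast release restores full screen.
        return "fullscreen"
    # Seventh touch operation: a slow release fixes the window at the
    # interface position of the preview shown at the release point.
    return "fixed_at_release_position"
```

For a 1080-pixel-high screen, the example threshold would be `0.3 * 1080 = 324`; a release at mid-screen below that speed leaves the window fixed where the preview was.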
  • the dotted-line diagrams appearing in the above-mentioned figures can all be understood as the historical state corresponding to a historical event; for example, the third preview interface 25 shown in dotted lines in Figure 2b indicates the state presented before the fourth preview interface 27 is displayed.
  • the virtual finger pointing to the fourth touch position 26 can be regarded as the historical state of the finger touching the screen before leaving the screen at the fourth touch position 26 .
  • the third preview interface 25 and the fourth preview interface 27 shown in dotted lines in FIG. 2d can both represent the historical preview status of the corresponding preview interface before the target application window is displayed in a full screen state.
  • the virtual finger continuously extending downward in Figure 2e likewise illustrates the historical sliding state of the finger sliding down to the bottom edge.
  • this embodiment also provides a specific example of an application interaction scenario to further illustrate the application realized through the interactive operation between the user and the interactive tablet.
  • the user manipulates the finger to slide upwards from the bottom edge of the interactive tablet. During this period, the change of the target application window on the interactive tablet is as described in the above embodiment, and will not be elaborated here.
  • the sliding direction of the finger is changed, that is, the manipulation finger slides downward from the touch position toward the bottom edge of the screen.
  • the target application window will first change in real time in the form of a preview interface as the finger position changes, and the window display size of the preview interface will continue to increase as the finger slides down (see Fig. 2a to Fig. 2b for the variation).
  • when the finger leaves the screen, the target application window can also be triggered to exit the full screen, and the touch position where the finger is when leaving the screen is used as the presentation position of the bottom of the target application window's interface (refer to the interface effect shown in FIG. 2c).
  • if the sliding speed of the finger at the touch position is greater than or equal to 0.3 times the height of the screen, it can be considered that the operation of restoring the target application window to full screen is triggered, thereby restoring the interface of the target application window to the full-screen state and displaying it in the full-screen display size (refer to the interface effect shown in FIG. 2d).
  • the user can also manipulate the finger to slide down until it reaches the bottom edge of the screen (there is no limit to the sliding speed of the finger).
  • This kind of interactive operation can also be considered as triggering the operation of restoring the target application window to full screen.
  • Embodiment 2 of the present invention provides an application window control method.
  • the touch operation generated by the interaction between the user and the interactive panel is not only related to the sliding of the touch object from the edge of the screen to the middle of the screen, but also related to the sliding of the touch object from the middle of the screen to the edge of the screen in the opposite direction.
  • the interactive tablet can flexibly adjust the interface position of the target application window directly through the corresponding touch operations, and can control the target application window to exit the full-screen state, and can also control the target application window to return to the full-screen state.
  • this embodiment overcomes the problem that the user cannot effectively control the application window to exit the full screen due to the large size of the interactive tablet, and also solves the problem that frequent interaction with the user is required to control the application window to exit the full screen.
  • the interactive control gestures between the user and the interactive tablet are enriched, the convenience of user manipulation is realized, and the system usability of the interactive tablet is greatly improved.
  • this optional embodiment further provides, from the perspective of the underlying implementation, a specific realization of how different touch positions influence the interface position of the preview interface of the target application window.
  • the interface aspect ratio of the preview interface is the same as the aspect ratio of the screen, and the interface position of the preview interface can be represented by the vertex coordinates of the interface vertices.
  • each vertex coordinate is determined from the target touch coordinates of the target touch position in combination with the width and height values of the screen, wherein the target touch position is the touch position where the touch object stays when the preview interface is displayed.
  • the preview interface in this optional embodiment includes the first preview interface, the second preview interface, the third preview interface, and the fourth preview interface in Embodiment 1 and Embodiment 2.
  • the target touch position in this optional embodiment includes the first touch position, the second touch position, the third touch position, the fourth touch position and the like passed by the touch object in Embodiment 1 and Embodiment 2.
  • on the premise that the interface aspect ratio of the target application window remains the same as the screen aspect ratio, this embodiment can extract the target touch coordinates of the target touch position;
  • the vertex coordinates of each interface vertex in the preview interface can then be determined through the target touch coordinates and the width and height values of the screen.
  • the edge of the screen selected by the user has an influence on the determination of the coordinates of vertices of each interface in the preview interface.
  • the specific implementation of determining the coordinates of each vertex is described according to the situation.
  • the abscissa of each vertex of the interface can be determined through the abscissa of the target touch coordinates;
  • the abscissa is combined with the width and height values of the screen to determine the vertex ordinate corresponding to each interface vertex.
  • the touch coordinates of the target touch position are first set to be on the side that is parallel to and closest to the edge of the screen; that is, one side of the preview interface moves with the movement of the touch position.
  • the execution subject can obtain the touch coordinates of the target touch position fed back by the touch frame.
  • the touch coordinates are marked as target touch coordinates, and the target touch coordinates include an abscissa and an ordinate.
  • the edge of the screen is the left edge or the right edge
  • the main direction of sliding from the edge of the screen to the middle of the screen is horizontal sliding.
  • when horizontal sliding is the main sliding direction, it is equivalent to horizontally translating the side of the preview interface that is parallel to the left or right edge, and the specific amount of horizontal translation can be determined by the abscissa of the target touch position.
  • the preview interface of the target application window is represented by ABCD, where side AB represents the side parallel to and closest to the left/right edge, and side CD represents the side parallel to the left/right edge but farther from it.
  • point E is used to represent the target touch position. At this time, point E can be considered to be on side AB. Therefore, the abscissa of vertex A and vertex B can be taken as the abscissa of point E, and can thus be determined from the touch coordinates.
  • combined with the width and height values of the screen, the specific lengths of sides AC and BD can be calculated.
  • the specific lengths of side AB and side CD can also be determined, so that the ordinates of vertices A, B, C and D can be obtained, and finally the vertex coordinates of A, B, C, and D can be obtained.
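A minimal numeric sketch of the left/right-edge case follows, under stated assumptions the text does not fix: that the preview spans horizontally from the touch abscissa to the opposite edge, and that it is centered on the touch ordinate. Only the near side lying at the touch abscissa and the preserved screen aspect ratio come from the description; the function name is illustrative.

```python
def preview_vertices_left_edge(touch_x, touch_y, screen_w, screen_h):
    """Vertex coordinates A, B, C, D of the preview interface ABCD when
    the touch object slid in from the left edge and stays at point
    E = (touch_x, touch_y) on side AB.

    Assumptions (not fixed by the text): the preview spans from the
    touch abscissa to the opposite edge and is centered on the touch
    ordinate. The screen aspect ratio is always preserved.
    """
    width = screen_w - touch_x             # horizontal span (assumption)
    height = width * screen_h / screen_w   # same aspect ratio as the screen
    top = touch_y - height / 2             # centered on touch_y (assumption)
    a = (touch_x, top)                     # near side AB at x = touch_x
    b = (touch_x, top + height)
    c = (touch_x + width, top)             # far side CD
    d = (touch_x + width, top + height)
    return a, b, c, d
```

The key invariant is visible in the arithmetic: side AB follows the touch abscissa, and the height is derived from the width so the screen aspect ratio is kept.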
  • the ordinate in the target touch coordinates can be used to determine the vertex ordinate of each interface vertex;
  • the ordinate is combined with the width and height values of the screen to determine the abscissa corresponding to each interface vertex.
  • the sliding direction from the edge of the screen to the middle of the screen takes vertical sliding as the main direction.
  • when vertical sliding is the main sliding direction, it is equivalent to vertically translating the side of the preview interface that is parallel to the top or bottom edge, and the specific amount of vertical translation can be determined by the ordinate of the target touch position.
  • the preview interface of the target application window is still represented by ABCD, where side AC represents the side parallel to and closest to the top/bottom edge, and side BD represents the side parallel to the top/bottom edge but farther from it.
  • point E is still used to represent the target touch position. At this time, point E can be considered to be on side AC.
  • the ordinate of vertex A and vertex C can be taken as the ordinate of point E, and can thus be determined from the touch coordinates.
  • combined with the width and height values of the screen, the specific lengths of side AB and side CD can be calculated.
  • the specific lengths of side AC and side BD can also be determined, so that the abscissas of vertices A, B, C and D can be obtained, and finally the vertex coordinates of A, B, C, and D can be obtained.
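The top/bottom-edge case can be sketched the same way. Again the assumptions are labeled: the coordinate convention (origin at the top-left, y pointing down), the vertical span from the top edge to the touch ordinate, and the horizontal centering are illustrative choices, not statements from the text; only side AC lying at the touch ordinate and the preserved aspect ratio come from the description.

```python
def preview_vertices_bottom_edge(touch_x, touch_y, screen_w, screen_h):
    """Vertex coordinates A, B, C, D of the preview interface ABCD when
    the touch object slid in from the bottom edge and stays at point
    E = (touch_x, touch_y) on side AC. Coordinate convention assumed
    here: origin at the top-left corner, y axis pointing down.

    Assumptions (not fixed by the text): the preview spans vertically
    from the top edge down to the touch ordinate and is centered on the
    touch abscissa. The screen aspect ratio is always preserved.
    """
    height = touch_y                       # vertical span (assumption)
    width = height * screen_w / screen_h   # same aspect ratio as the screen
    left = touch_x - width / 2             # centered on touch_x (assumption)
    a = (left, touch_y)                    # near side AC at y = touch_y
    c = (left + width, touch_y)
    b = (left, touch_y - height)           # far side BD
    d = (left + width, touch_y - height)
    return a, b, c, d
```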
  • the above optional embodiment of Embodiment 2 provides the specific underlying implementation of determining the interface position of the preview interface through the touch coordinates of the touch position where the touch object stays, and provides the underlying technical support for the realization of the application window control method provided in this embodiment.
  • Fig. 3 is a schematic structural diagram of an application window control device provided in Embodiment 3 of the present invention. As shown in Fig. 3, the device includes: a full-screen display module 31, a first receiving module 32, a first display module 33, a second receiving module 34, a second display module 35, a third receiving module 36, and a third display module 37;
  • the full-screen display module 31 is used to display the target application window in a full-screen state
  • the first receiving module 32 is configured to receive a first touch operation, the first touch operation is that the touch object slides from the edge of the screen to the middle of the screen, and stays at the first touch position;
  • the first display module 33 is configured to display a first preview interface of the target application window, where the interface position of the first preview interface is related to the first touch position;
  • the second receiving module 34 is configured to receive a second touch operation, the second touch operation is that the touch object slides from the first touch position to the middle of the screen and stays at the second touch position;
  • the second display module 35 is configured to display a second preview interface of the target application window, where the interface position of the second preview interface is related to the second touch position;
  • the third receiving module 36 is configured to receive a third touch operation, the third touch operation is that the touch object leaves the screen from the second touch position;
  • the third display module 37 is configured to display the first interface of the target application window, the first interface has the same interface position as the second preview interface.
  • with the device of this embodiment, the touch operation generated by the interaction between the user and the interactive panel need only involve the sliding of the touch object from the edge of the screen to the middle of the screen, and the interactive panel can directly respond to each touch operation to adjust the interface position of the target application window and control the target application window to exit the full-screen state.
  • the problem that the user cannot effectively control the application window to exit the full screen due to the large size of the interactive tablet is overcome, and the problem that requires frequent interaction with the user to control the application window to exit the full screen is also solved.
  • the convenience of user manipulation is realized, and the system usability of the interactive panel is greatly improved.
  • the third touch operation is that the touch object leaves the screen from the second touch position, and the sliding speed of the touch object sliding to the second touch position is less than the set first speed threshold.
  • the device also includes:
  • the fourth receiving module is configured to receive a fourth touch operation, the fourth touch operation being that the touch object leaves the screen from the second touch position, and the sliding speed of the touch object sliding to the second touch position is greater than or equal to the first speed threshold;
  • the fourth display module is configured to display the second interface of the target application window, and the display size of the second interface is a set multiple of the display size of the target application window in a full-screen state.
  • the device also includes:
  • the fifth receiving module is configured to receive a fifth touch operation, the fifth touch operation is that the touch object slides from the second touch position to the edge of the screen and stays at the third touch position;
  • a fifth display module configured to display a third preview interface of the target application window, where the interface position of the third preview interface is related to the third touch position;
  • the sixth receiving module is configured to receive a sixth touch operation, the sixth touch operation is that the touch object slides from the third touch position to the edge of the screen and stays at the fourth touch position;
  • the sixth display module is configured to display a fourth preview interface of the target application window, where the interface position of the fourth preview interface is related to the fourth touch position.
  • the device also includes:
  • a seventh receiving module configured to receive a seventh touch operation, the seventh touch operation is that the touch object leaves the screen from the fourth touch position;
  • a seventh display module configured to display a third interface of the target application window, where the display position of the third interface is the same as that of the fourth preview interface.
  • the seventh touch operation is that the touch object leaves the screen from the fourth touch position, and the sliding speed of the touch object to the fourth touch position is lower than the set second speed threshold.
  • the device also includes:
  • the eighth receiving module is configured to receive an eighth touch operation, the eighth touch operation being that the touch object leaves the screen from the fourth touch position, and the sliding speed of the touch object sliding to the fourth touch position is greater than or equal to the second speed threshold;
  • An eighth display module configured to display the target application window in a full-screen state.
  • the device also includes:
  • a ninth receiving module configured to receive a ninth touch operation, where the ninth touch operation is sliding the touch object from the fourth touch position to the edge of the screen;
  • a ninth display module configured to display the target application window in a full-screen state.
  • the interface aspect ratio of the preview interface is the same as the aspect ratio of the screen, and the interface position of the preview interface is represented by the vertex coordinates of the interface vertices;
  • Each of the vertex coordinates is determined by an information determination module included in the device,
  • the information determination module is used to determine each vertex coordinate from the target touch coordinates of the target touch position in combination with the width and height values of the screen, wherein the target touch position is the touch position where the touch object stays when the preview interface is displayed.
  • the information determination module is specifically used for:
  • the abscissa of each interface vertex is determined by the abscissa of the target touch coordinates, and the abscissa is combined with the width and height values of the screen to determine the vertex ordinate corresponding to each interface vertex;
  • the information determination module is also specifically used for:
  • the vertex ordinate of each interface vertex is determined by the ordinate of the target touch coordinates, and the ordinate is combined with the width and height values of the screen to determine the abscissa corresponding to each interface vertex.
  • the operation of the touch object moving away from a touch position is identified by calling a touch monitoring function.
  • the sliding speed of the touch object when sliding from one touch position to another touch position is determined by the touch coordinates and touch time points of each touch position.
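The statement above suggests computing the sliding speed from the touch coordinates and touch time points of two samples. A minimal sketch of that computation follows; the function name and the unit of the result (pixels per second) are illustrative assumptions:

```python
import math

def sliding_speed(pos1, t1, pos2, t2):
    """Sliding speed of the touch object between two touch samples,
    computed from the touch coordinates and touch time points.
    pos1/pos2 are (x, y) pixel coordinates; t1/t2 are timestamps in
    seconds. Returns pixels per second (unit is an assumption)."""
    dt = t2 - t1
    if dt <= 0:
        return 0.0  # guard against out-of-order or duplicate samples
    distance = math.hypot(pos2[0] - pos1[0], pos2[1] - pos1[1])
    return distance / dt
```

This value is what the embodiments compare against the first and second speed thresholds when the touch object leaves the screen.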
  • FIG. 4 is a schematic structural diagram of an interactive panel provided in Embodiment 4 of the present application.
  • the interactive panel includes: a processor 40 , a memory 41 , a display screen 42 , an input device 43 , an output device 44 , and a touch component 45 .
  • the number of processors 40 in the interactive panel may be one or more, and one processor 40 is taken as an example in FIG. 4 .
  • the number of memory 41 in the interactive panel can be one or more, and one memory 41 is taken as an example in FIG. 4 .
  • the processor 40 , memory 41 , display screen 42 , input device 43 , output device 44 and touch component 45 of the interactive panel can be connected through a bus or in other ways. In FIG. 4 , connection through a bus is taken as an example.
  • the memory 41 can be used to store software programs, computer-executable programs and modules, such as the program instructions/modules corresponding to the interactive panel described in any embodiment of the present invention (for example, the full-screen display module 31, first receiving module 32, first display module 33, second receiving module 34, second display module 35, third receiving module 36, and third display module 37 of the application window control device).
  • the memory 41 can mainly include a program storage area and a data storage area, wherein the program storage area can store an operating system and at least one application required by a function; the data storage area can store data created according to the use of the device, etc.
  • the memory 41 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage devices. In some instances, the memory 41 may further include memory located remotely relative to the processor 40, and these remote memories may be connected to the device through a network. Examples of the aforementioned networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
  • the display screen 42 is covered with the touch component 45 (the covering relationship is not shown in FIG. 4), and together they can constitute a touch screen for displaying interactive content.
  • the touch screen is used to display data according to the instructions of the processor 40, and is also used to receive touch operations on the display screen 42 and send corresponding signals to the processor 40 or other devices.
  • the input device 43 can be used to receive input digital or character information, and generate key signal input related to user settings and function control of the display device, and can also be a camera for obtaining graphics and a sound pickup device for obtaining audio data.
  • the output device 44 may include an audio device such as a speaker. It should be noted that the specific composition of the input device 43 and the output device 44 can be set according to actual conditions.
  • the touch component 45 is used to respond to the touch operation of the touch object through the included hardware circuit.
  • the processor 40 executes various functional applications and data processing of the device by running software programs, instructions and modules stored in the memory 41 , that is, implements the application window control method provided by any of the above embodiments.
  • the interactive panel provided above can be used to execute the application window control method provided by any of the above embodiments, and has corresponding functions and beneficial effects.
  • Embodiment 5 of the present application also provides a storage medium containing computer-executable instructions, the computer-executable instructions are used to execute an application window control method when executed by a computer processor, including:
  • the first touch operation is that the touch object slides from the edge of the screen to the middle of the screen, and stays at the first touch position;
  • the second touch operation is that the touch object slides from the first touch position to the middle of the screen, and stays at the second touch position;
  • the third touch operation being that the touch object leaves the screen from the second touch position
  • a first interface of the target application window is displayed, and the first interface has the same interface position as the second preview interface.
  • in the storage medium containing computer-executable instructions provided in the embodiments of the present application, the computer-executable instructions are not limited to the operations of the above-mentioned application window control method, and can also execute relevant operations in the application window control method provided in any embodiment of the present invention, with corresponding functions and beneficial effects.
  • Embodiment 6 of the present application also provides a computer program, which is used to implement an application window control method when executed by a computer processor, and the method includes:
  • the first touch operation is that the touch object slides from the edge of the screen to the middle of the screen and stays at the first touch position;
  • the second touch operation is that the touch object slides from the first touch position to the middle of the screen, and stays at the second touch position;
  • the third touch operation being that the touch object leaves the screen from the second touch position
  • a first interface of the target application window is displayed, and the first interface has the same interface position as the second preview interface.
  • the third touch operation is that the touch object leaves the screen from the second touch position, and the sliding speed of the touch object sliding to the second touch position is less than the set first speed threshold.
  • the method also includes:
  • the fourth touch operation is that the touch object leaves the screen from the second touch position, and the sliding speed of the touch object to the second touch position is greater than or equal to said first speed threshold;
  • a second interface of the target application window is displayed, and the display size of the second interface is a set multiple of the display size of the target application window in a full-screen state.
  • the method also includes:
  • the fifth touch operation being that the touch object slides from the second touch position to the edge of the screen and stays at the third touch position;
  • the sixth touch operation is that the touch object slides from the third touch position to the edge of the screen and stays at the fourth touch position;
  • a fourth preview interface of the target application window is displayed, and an interface position of the fourth preview interface is related to the fourth touch position.
  • the method also includes:
  • the seventh touch operation is that the touch object leaves the screen from the fourth touch position
  • a third interface of the target application window is displayed, where the display position of the third interface is the same as that of the fourth preview interface.
  • the seventh touch operation is that the touch object leaves the screen from the fourth touch position, and the sliding speed of the touch object sliding to the fourth touch position is lower than the set second speed threshold.
  • the method also includes:
  • the eighth touch operation is that the touch object leaves the screen from the fourth touch position, and the sliding speed of the touch object to the fourth touch position is greater than or equal to said second speed threshold;
  • the target application window in a full screen state is displayed.
  • the method also includes:
  • the ninth touch operation is sliding the touch object from the fourth touch position to the edge of the screen;
  • the target application window in a full screen state is displayed.
  • the interface aspect ratio of the preview interface is the same as the aspect ratio of the screen, and the interface position of the preview interface is represented by the vertex coordinates of the interface vertices;
  • the coordinates of each vertex are determined by the target touch coordinates of the target touch position combined with the width and height values of the screen, wherein the target touch position is the touch position where the touch object stays when the preview interface is displayed.
  • the step of determining the coordinates of each vertex according to the target touch coordinates of the target touch position in combination with the width and height values of the screen includes:
  • the abscissa of each interface vertex is determined by the abscissa of the target touch coordinates;
  • the vertex ordinate corresponding to each interface vertex is determined in combination with the width and height values of the screen.
  • the step of determining the coordinates of each vertex according to the target touch coordinates of the target touch position in combination with the width and height values of the screen includes:
  • the vertex ordinate of each interface vertex is determined by the ordinate of the target touch coordinates, and the ordinate is combined with the width and height values of the screen to determine the abscissa corresponding to each interface vertex.
  • the sliding speed when the touch object slides from one touch position to another touch position is determined by the touch coordinates and touch time points of each touch position.
  • the present application can be realized by means of software plus necessary general-purpose hardware, and of course it can also be realized by hardware, but in many cases the former is the better implementation.
  • the technical solution of the present application, in essence or in the part that contributes to the prior art, can be embodied in the form of a software product, and the computer software product can be stored in a computer-readable storage medium, such as a computer floppy disk, read-only memory (ROM), random access memory (RAM), flash memory (FLASH), hard disk or CD, etc., and includes several instructions to make an interactive tablet (which may be a robot, a personal computer, a server, or a network device, etc.) execute the application window control method described in any embodiment of the present application.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application discloses an application window control method and apparatus, an interactive flat panel, and a storage medium. The method comprises: displaying a target application window in a full-screen state; receiving a first touch operation, the first touch operation being that a touch object slides from the edge of a screen toward the middle of the screen and stays at a first touch position; displaying a first preview interface of the target application window; receiving a second touch operation, the second touch operation being that the touch object slides from the first touch position toward the middle of the screen and stays at a second touch position; displaying a second preview interface of the target application window; receiving a third touch operation, the third touch operation being that the touch object leaves the screen from the second touch position; and displaying a first interface of the target application window. By means of this method, the problem that a user cannot effectively control an application window to exit full screen because an interactive flat panel is too large is overcome, thereby making user operation convenient and greatly improving the system usability of the interactive flat panel.
PCT/CN2021/108753 2021-07-27 2021-07-27 Procédé et appareil de commande de fenêtre d'application, panneau plat interactif et support de stockage WO2023004600A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2021/108753 WO2023004600A1 (fr) 2021-07-27 2021-07-27 Procédé et appareil de commande de fenêtre d'application, panneau plat interactif et support de stockage
CN202180005735.1A CN115885245A (zh) 2021-07-27 2021-07-27 应用窗口控制方法、装置、交互平板及存储介质

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/108753 WO2023004600A1 (fr) 2021-07-27 2021-07-27 Procédé et appareil de commande de fenêtre d'application, panneau plat interactif et support de stockage

Publications (1)

Publication Number Publication Date
WO2023004600A1 (fr)

Family

ID=85086107

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/108753 WO2023004600A1 (fr) 2021-07-27 2021-07-27 Procédé et appareil de commande de fenêtre d'application, panneau plat interactif et support de stockage

Country Status (2)

Country Link
CN (1) CN115885245A (fr)
WO (1) WO2023004600A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101702111A (zh) * 2009-11-13 2010-05-05 宇龙计算机通信科技(深圳)有限公司 Method and terminal for implementing touch screen content scaling
CN102981596A (zh) * 2012-12-21 2013-03-20 东莞宇龙通信科技有限公司 Terminal and screen interface display method
CN104503682A (zh) * 2014-11-07 2015-04-08 联发科技(新加坡)私人有限公司 Method for processing screen display window, and mobile terminal
CN105549824A (zh) * 2015-12-26 2016-05-04 魅族科技(中国)有限公司 Display control method and mobile terminal
CN110199252A (zh) * 2017-01-19 2019-09-03 微软技术许可有限责任公司 Computing device with window repositioning preview interface
CN111124338A (zh) * 2019-12-18 2020-05-08 青岛海信商用显示股份有限公司 Screen control method and touch display device
CN111966252A (zh) * 2020-05-14 2020-11-20 华为技术有限公司 Application window display method and electronic device

Also Published As

Publication number Publication date
CN115885245A (zh) 2023-03-31 Application window control method and apparatus, interactive tablet, and storage medium

Similar Documents

Publication Publication Date Title
US12045440B2 (en) Method, device, and graphical user interface for tabbed and private browsing
US11967039B2 (en) Automatic cropping of video content
US10416789B2 (en) Automatic selection of a wireless connectivity protocol for an input device
WO2021184375A1 (fr) Hand motion instruction execution method, apparatus and system, and storage medium
KR102611858B1 (ko) Operating method of intelligent interactive tablet, storage medium, and related device
US9965039B2 (en) Device and method for displaying user interface of virtual input device based on motion recognition
US11474614B2 (en) Method and device for adjusting the control-display gain of a gesture controlled electronic device
US20140139430A1 (en) Virtual touch method
US20130257734A1 (en) Use of a sensor to enable touch and type modes for hands of a user via a keyboard
US11112959B2 (en) Linking multiple windows in a user interface display
JP2013539580A (ja) Method and apparatus for motion control on a device
US10474324B2 (en) Uninterruptable overlay on a display
US10656746B2 (en) Information processing device, information processing method, and program
WO2020143387A1 (fr) Table processing method, device and system, storage medium, and intelligent interactive tablet
WO2023004600A1 (fr) Application window control method and apparatus, interactive flat panel, and storage medium
CN109739422B (zh) 一种窗口控制方法、装置及设备
KR102480568B1 (ko) Device and method for displaying a user interface (UI) of a virtual input device based on motion recognition
CN114756159A (zh) Intelligent interactive tablet, and data processing method, apparatus and computer storage device thereof
WO2023065939A1 (fr) Touch response method and apparatus, interactive panel, and storage medium
WO2024001135A1 (fr) Split-screen display method and apparatus, terminal, and storage medium
CN118020044A (zh) Method and *** for displaying edge interaction in a gesture control device
KR102244547B1 (ko) Apparatus and method for touch control on a touch screen
CN114296599A (zh) Interface interaction method and apparatus, terminal, and storage medium
CN117422057A (zh) Display device and sticky-note display method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21951216

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE