CN111580713A - Display interaction system - Google Patents

Display interaction system

Info

Publication number
CN111580713A
Authority
CN
China
Prior art keywords
display
sub
main
window
signal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010340666.7A
Other languages
Chinese (zh)
Other versions
CN111580713B (en)
Inventor
何安琪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Shiyuan Electronics Thecnology Co Ltd
Guangzhou Shirui Electronics Co Ltd
Original Assignee
Guangzhou Shiyuan Electronics Thecnology Co Ltd
Guangzhou Shirui Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Shiyuan Electronics Thecnology Co Ltd, Guangzhou Shirui Electronics Co Ltd filed Critical Guangzhou Shiyuan Electronics Thecnology Co Ltd
Priority to CN202010340666.7A priority Critical patent/CN111580713B/en
Publication of CN111580713A publication Critical patent/CN111580713A/en
Application granted granted Critical
Publication of CN111580713B publication Critical patent/CN111580713B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0486 - Drag-and-drop
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1423 - Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F 3/1446 - Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display display composed of modules, e.g. video walls
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 - Arrangements for program control, e.g. control units
    • G06F 9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 - Arrangements for executing specific programs
    • G06F 9/451 - Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the invention discloses a display interaction system comprising a main display device, an auxiliary display device and a processor, where a main display controller is built into the main display device and an auxiliary display controller is built into the auxiliary display device. The main display controller and the auxiliary display controller perform display and interaction detection for the main display device and the auxiliary display device respectively; the processor produces the display response of the whiteboard application according to the operation signals obtained by interaction detection, and sends the correspondingly generated display signals to the main display device and the auxiliary display device for display. The display pictures of the whiteboard application are shown on the main display device and the auxiliary display device either on multiple paths or on the same path; the display area of a sub-window picture is switched between the different regions according to a moving operation on the sub-window picture of the whiteboard application interface, and the multi-path and same-path display modes are switched accordingly, so that partitioned operation can be performed according to actual interaction requirements and multi-user cooperative operation is realized on a smart interactive tablet composed of multiple display screens.

Description

Display interaction system
Technical Field
The embodiment of the invention relates to the technical field of interaction, in particular to a display interaction system.
Background
With the development of intelligent technology, the electronic products that people encounter in daily life are increasingly varied. Among them, interactive electronic products based on touch technology show a trend toward more comprehensive function integration because of their good human-computer interaction experience. The smart interactive tablet is a representative integrated device: it is suitable for group interaction occasions such as conferences, teaching and commercial exhibitions, and integrates multiple functions such as those of a projector and a video conference terminal.
Many applications have been developed on the basis of the smart interactive tablet for different application scenarios. Among them, the whiteboard application is one of the most frequently used applications on the smart interactive tablet, and the user interface presented while the whiteboard application is in use is called the whiteboard application interface. The whiteboard application can capture operations performed with a user's finger or stylus on the whiteboard application interface, obtain a series of touch points from those operations, generate the user's writing handwriting from the touch points, and also insert other multimedia elements such as graphics, pictures and tables into the whiteboard application interface.
To meet increasingly rich application scenarios, the smart interactive tablet achieves a better application effect by combining several display screens. However, when the inventor used the whiteboard application on a smart interactive tablet composed of multiple display screens, the whiteboard application could only be displayed and interacted with mainly on one display screen, while the other display screens could only mirror that screen or display other applications; multi-user cooperative operation based on multiple display screens was lacking while the whiteboard application was in use.
Disclosure of Invention
The invention provides a display interaction system to solve the technical problem that, in the prior art, the whiteboard application used on a smart interactive tablet composed of multiple display screens lacks multi-user cooperative operation based on the multiple display screens.
In a first aspect, an embodiment of the present invention provides a display interaction system, including a main display device, an auxiliary display device, and a processor, where the main display device is provided with a main display controller, and the auxiliary display device is provided with an auxiliary display controller;
the processor generates a first main display signal and a first auxiliary display signal, or a second main display signal and a second auxiliary display signal, according to the operation of the whiteboard application; the first main display signal and the second main display signal carry the main window picture of the whiteboard application, and the first auxiliary display signal and the second main display signal carry the sub-window picture of the whiteboard application;
the processor transmits the first main display signal or the second main display signal to the main display controller and transmits the first auxiliary display signal or the second auxiliary display signal to the auxiliary display controller;
the main display controller is used for controlling the main display equipment to display according to the first main display signal or the second main display signal; the auxiliary display controller is used for controlling the auxiliary display equipment to display according to the first auxiliary display signal or the second auxiliary display signal;
the main display controller receives a main operation signal acting on main display equipment and sends the main operation signal to the processor; the auxiliary display controller receives an auxiliary operation signal acting on auxiliary display equipment and sends the auxiliary operation signal to the processor;
and the processor confirms the operation of the whiteboard application according to the main operation signal or the auxiliary operation signal.
The display interaction system provided by the embodiment of the invention thus includes a main display device with a built-in main display controller, an auxiliary display device with a built-in auxiliary display controller, and a processor. The processor generates the first main display signal and the first auxiliary display signal, or the second main display signal and the second auxiliary display signal, according to the operation of the whiteboard application, distributes them to the main display controller and the auxiliary display controller for display, and confirms the operation of the whiteboard application from the main operation signal or the auxiliary operation signal returned by the controllers. Because a moving operation on a sub-window picture of the whiteboard application interface switches the sub-window picture between different display screens, the whiteboard application interface can be operated in partitions on different display screens according to actual interaction requirements, realizing multi-user cooperative operation on a smart interactive tablet composed of multiple display screens.
Drawings
Fig. 1 is a schematic structural diagram of a display interaction system according to an embodiment of the present invention;
Fig. 2 is a schematic interface diagram illustrating a touch input of a display interaction system according to an embodiment of the present invention;
Fig. 3 is a flowchart illustrating an interaction process of a display interaction system according to an embodiment of the present invention;
Figs. 4 and 5 are schematic diagrams of the display contents of the sub-window picture according to the embodiment of the present invention;
Figs. 6-13 are schematic diagrams illustrating interface changes when interaction is implemented by the intelligent interaction system according to the embodiment of the present invention;
Fig. 14 is a schematic structural diagram of an interaction apparatus according to a second embodiment of the present invention;
Fig. 15 is a schematic structural diagram of a terminal device according to a third embodiment of the present invention.
Detailed Description
The present invention will be described in further detail below with reference to the accompanying drawings and embodiments. It is to be understood that the specific embodiments described herein are intended to explain the invention, not to limit it. It should further be noted that, for convenience of description, the drawings show only the structures related to the present invention rather than all structures.
It should be noted that, for the sake of brevity, this description does not exhaust all alternative embodiments; after reading this description, those skilled in the art will understand that any combination of features may constitute an alternative embodiment as long as the features are not mutually inconsistent.
For example, one embodiment of the first example describes the technical feature that the moving operation of the sub-window picture is triggered on a set area of the sub-window picture, while another embodiment of the first example describes the technical feature that, during a window swing, the speed is changed in a reflective manner when the sub-window picture touches a second-type boundary. Since these two features do not contradict each other, an embodiment having both of them is also an optional embodiment: after the moving operation of the sub-window picture is triggered on the set area of the sub-window picture, and the moving operation is a window swing operation, the speed is changed in a reflective manner when the sub-window picture contacts a second-type boundary.
It should also be noted that the embodiments do not exhaust every combination of the technical features described in the first example; some technical features are described only for the preferred implementation of the scheme. A combination of several of the technical features described in the first example may serve as an independent embodiment, as long as it conforms to the design intent of the scheme, and may of course also serve as a specific product form.
The following examples are described in detail.
Example one
Fig. 1 is a schematic structural diagram of a display interaction system according to an embodiment of the present invention. As shown in Fig. 1, the display interaction system includes a main display device 21, an auxiliary display device 22 and a processor 23, where the main display device 21 is provided with a main display controller 211 and the auxiliary display device 22 is provided with an auxiliary display controller 221;
the processor 23 generates a first main display signal and a first auxiliary display signal, or a second main display signal and a second auxiliary display signal, according to the operation of the whiteboard application; the first main display signal and the second main display signal carry the main window picture of the whiteboard application, and the first auxiliary display signal and the second main display signal carry the sub-window picture of the whiteboard application;
the processor 23 transmits the first main display signal or the second main display signal to the main display controller 211, and transmits the first auxiliary display signal or the second auxiliary display signal to the auxiliary display controller 221;
the main display controller 211 is configured to control the main display device 21 to display according to the first main display signal or the second main display signal; the auxiliary display controller 221 is configured to control the auxiliary display device 22 to display according to the first auxiliary display signal or the second auxiliary display signal;
the main display controller 211 receives a main operation signal acting on the main display device 21 and sends it to the processor 23; the auxiliary display controller 221 receives an auxiliary operation signal acting on the auxiliary display device 22 and sends it to the processor 23;
the processor 23 confirms the operation of the whiteboard application according to the main operation signal or the auxiliary operation signal.
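For readers who find it easier to follow in code, the wiring described above can be summarized by the minimal Kotlin sketch below. It only illustrates the signal flow; all class, field and signal names (Frame, DisplaySignal, DisplayController, Processor and so on) are assumptions made for this illustration and are not taken from the patent's implementation.

```kotlin
// Illustrative sketch of the signal flow in Fig. 1 (all names are assumed, not from the patent).
data class Frame(val description: String)

sealed interface DisplaySignal {
    data class FirstMain(val mainWindow: Frame) : DisplaySignal                        // main window only
    data class FirstSub(val subWindow: Frame) : DisplaySignal                          // sub-window shown on the auxiliary display
    data class SecondMain(val mainWindow: Frame, val subWindow: Frame) : DisplaySignal // main window and sub-window on the main display
    data class SecondSub(val otherContent: Frame?) : DisplaySignal                     // auxiliary display without the whiteboard sub-window
}

data class OperationSignal(val x: Float, val y: Float, val fromMainDisplay: Boolean)

class DisplayController(private val processor: Processor, private val isMain: Boolean) {
    fun show(signal: DisplaySignal) { /* drive the panel according to the received display signal */ }
    fun onTouch(x: Float, y: Float) = processor.onOperation(OperationSignal(x, y, fromMainDisplay = isMain))
}

class Processor {
    lateinit var mainController: DisplayController
    lateinit var subController: DisplayController

    fun onOperation(op: OperationSignal) {
        // Confirm the whiteboard-application operation from the main or auxiliary operation
        // signal, regenerate the display signals, and push them back via show().
    }
}
```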
The display interaction system provided by this embodiment can take the smart interactive tablet as its specific product form. For ease of understanding, this embodiment uses the smart interactive tablet as the actual carrier and describes, by way of example, the detailed interaction process of the display interaction system, such as moving a sub-window picture generated from the whiteboard application interface, confirming its final display position, and operating on the sub-window picture. The smart interactive tablet may be an integrated device that controls the content displayed on its display panel and realizes human-computer interaction through touch technology, and it may integrate one or more functions of a projector, an electronic whiteboard, a projection screen, a loudspeaker, a television, a video conference terminal and the like.
In the display interaction system shown in Fig. 1, the processor 23 is installed in the main display device 21. In actual operation, however, there is no hardware difference between the main display device 21 and the auxiliary display device 22, or between the corresponding main display controller 211 and auxiliary display controller 221; the two display devices are identical, and the processor 23 may be physically installed in either of them. The main display device 21 is simply the display device that displays the main window picture of the whiteboard application. While the whiteboard application is in use, the user may choose either display device to display the main window picture according to personal habit and the specific scene; the display device showing the main window picture is the main display device 21, and the other is the auxiliary display device 22. If the display body of the main window picture is switched during use, the roles of main display device 21 and auxiliary display device 22 switch accordingly. In other words, "main display device" and "auxiliary display device" are not fixed designations of the two display devices, but real-time designations according to their display content (the display device displaying the main window picture of the whiteboard application is the main display device).
On the basis of the above embodiment, when the main operation signal is an operation signal for moving the sub-window picture from the main display device to the auxiliary display device, the processor generates the first main display signal and the first auxiliary display signal.
Correspondingly, when the operation signal indicates that the sub-window picture is moved from the auxiliary display device to the main display device, the processor generates the second main display signal and the second auxiliary display signal.
When the processor detects, on the main display device, an operation signal moving the sub-window picture from the main display device to the auxiliary display device, or detects, on the auxiliary display device, an operation signal moving the sub-window picture from the auxiliary display device to the main display device, the operation signal indicates that the display body of the sub-window picture needs to change. The processor then regenerates the corresponding display signals according to the change of display body as the response to the operation of the whiteboard application. The main change in the display signals is that the sub-window picture is now displayed on the auxiliary display device (the main window picture and the sub-window picture on different display devices) or on the main display device (the main window picture and the sub-window picture on the same display device).
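Continuing the sketch above (same assumed types), this branch can be pictured as a single mapping from the device that will host the sub-window picture to the signal pair the processor generates:

```kotlin
// Which device hosts the sub-window picture determines which signal pair is regenerated.
enum class SubWindowHost { MAIN_DISPLAY, SUB_DISPLAY }

fun regenerateSignals(host: SubWindowHost, mainWindow: Frame, subWindow: Frame): Pair<DisplaySignal, DisplaySignal> =
    when (host) {
        // Sub-window moved onto the auxiliary display: the two pictures are on different devices.
        SubWindowHost.SUB_DISPLAY ->
            DisplaySignal.FirstMain(mainWindow) to DisplaySignal.FirstSub(subWindow)
        // Sub-window moved back onto the main display: both pictures on the same device.
        SubWindowHost.MAIN_DISPLAY ->
            DisplaySignal.SecondMain(mainWindow, subWindow) to DisplaySignal.SecondSub(otherContent = null)
    }
```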
In another case, when the main operation signal is a main moving operation signal of the sub-window picture inside the main display device, the processor updates the display position of the sub-window picture in the main display device according to the main moving operation signal and updates the second main display signal accordingly.
Likewise, when the auxiliary operation signal is an auxiliary moving operation signal of the sub-window picture inside the auxiliary display device, the processor updates the display position of the sub-window picture in the auxiliary display device according to the auxiliary moving operation signal and updates the first auxiliary display signal accordingly.
In short, when the display body of the sub-window picture is switched, the display signals must be changed correspondingly; and when a moving operation signal is detected for the sub-window picture inside one display body, the display signal must also be updated, the main content of the update being the display position of the sub-window picture within that display body. Of course, other updates to the content of the main window picture and the sub-window picture also lead to updates of the display signals, but that content-update process is not the focus of this scheme and is not described separately here.
During a specific move, when the terminal moving speed corresponding to the main operation signal or the auxiliary operation signal is greater than a preset threshold, the processor continues to update the display position of the sub-window picture based on the terminal moving speed and a preset acceleration. In the display interaction system of this scheme, when the terminal moving speed is greater than the preset threshold, i.e., within the swing speed range, the moving operation is determined to be a window swing operation; after the touch operation corresponding to the operation signal ends, the sub-window picture is controlled to keep moving and decelerate at the set acceleration until its moving speed drops to zero.
During this deceleration, in order to keep the sub-window picture fully displayed on the main display device or the auxiliary display device, when the sub-window picture contacts a boundary of the main display device or the auxiliary display device and there is no auxiliary display device or main display device in the moving direction, the boundary is used as a reflecting surface to change the moving direction of the sub-window picture.
During display, because the main window picture is displayed in full screen, the sub-window picture is displayed on the top layer of the main display device to ensure that it remains visible.
In addition, to ensure the normal display of other application interfaces when the sub-window picture is shown on the auxiliary display device, the display priority of the sub-window picture on the auxiliary display device is the same as that of the other application interfaces.
While the sub-window picture is displayed, the processor is further configured to receive a sub-window adjustment operation and adjust the display state and/or the display content of the sub-window picture according to that adjustment operation.
The display interaction system of this scheme is described comprehensively below for specific devices and application scenarios. As shown in Fig. 2, the present application is mainly directed to a smart interactive tablet 1 composed of at least two display screens. The embodiment of the invention is described mainly on the basis of two display screens (a first display screen 11 and a second display screen 12); in the following description the first display screen 11 serves as the main display device and the second display screen 12 serves as the auxiliary display device. The smart interactive tablet 1 is provided with two display screens with touch functions, which may be capacitive screens, resistive screens or electromagnetic screens. On the smart interactive tablet provided in this embodiment, the user can perform touch operations by touching a display screen with a finger or a stylus; correspondingly, the smart interactive tablet 1 detects the touch position and responds according to it, thereby realizing the touch function. Typically, the smart interactive tablet 1 is installed with at least one operating system, including but not limited to an Android system, a Linux system and a Windows system. Further, the smart interactive tablet 1 may install at least one application having a writing function. The application may be one carried by the operating system, or an application downloaded from a third-party device or a server. Optionally, the application has other editing functions besides writing, such as inserting tables, inserting pictures, inserting graphics, drawing tables and drawing graphics. A drawn table or graphic is a standard element drawn by the computer; computer-drawn standard elements may be understood as print-standard elements drawn by the smart interactive tablet, which are distinguished from elements written by the user.
The touch screen configured on intelligent handwriting equipment such as an electronic whiteboard, an electronic blackboard, a digitizer or a smart conference tablet is provided with a writing area that responds to the user's writing operation to display input content. When writing is performed in the writing area, for example when a stylus or a finger touches the touch screen, the touch screen senses a change of current, voltage or magnetic flux (depending on whether it is a capacitive, resistive or electromagnetic touch screen) and thereby obtains a touch signal containing the coordinates of the touch position and the trigger time of the touch signal. From those coordinates and trigger times, the track data of the writing track input between pen-down and pen-up can be obtained, and the writing track input by the user is displayed in the writing area of the touch screen in real time according to the track data. Of course, this process is not limited to writing: any entry process implemented on the touch screen of the intelligent handwriting equipment for displaying an operation track can be regarded as a writing process. In addition, a touch operation may be responded to as a click, a drag and the like, depending on the display element at the position where it occurs; these different responses are processed in the same way at the bottom layer. Generally, the area where touch writing occurs coincides with the display area. On top of this hardware implementation of basic functions such as touch and display, the functions of the whiteboard application can be further implemented on the smart interactive tablet, and the embodiments of this scheme implement the scheme for the whiteboard application in the display interaction system.
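The track-data step can be pictured as accumulating timestamped touch coordinates between pen-down and pen-up and handing the resulting stroke to the display for real-time rendering. The sketch below is illustrative only; the types and method names are assumptions, not the patent's implementation.

```kotlin
// A writing track is the ordered list of touch samples between pen-down and pen-up.
data class TouchSample(val x: Float, val y: Float, val timeMillis: Long)

class StrokeRecorder {
    private val samples = mutableListOf<TouchSample>()

    fun onTouchDown(x: Float, y: Float, t: Long) { samples.clear(); samples += TouchSample(x, y, t) }
    fun onTouchMove(x: Float, y: Float, t: Long) { samples += TouchSample(x, y, t) }

    // On lift-off the accumulated samples form the track data drawn in the writing area.
    fun onTouchUp(): List<TouchSample> = samples.toList()
}
```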
This technical scheme is, further, the interaction scheme used when the whiteboard application runs on a smart interactive tablet composed of multiple display screens. Specifically, the whiteboard application interface (i.e., the main window picture) is displayed full screen on one of the at least two display screens of the smart interactive tablet, and a sub-window picture generated from the whiteboard application interface is displayed on one of the at least two display screens.
An application interface is an interactive interface used for receiving and/or displaying information. For a video playing application, the interactive interface is mainly used for displaying a changing video picture; for a real-time communication application, it is mainly used for receiving content input by a near-end user and displaying content input by a far-end user; for a file editing application, it is mainly used for receiving and displaying content input by the user; for a browser application, it is mainly used for receiving keywords input by the user and displaying web page content obtained from those keywords.
In this embodiment, the application interface specifically targeted refers to a user interface presented in the whiteboard application use process, that is, a whiteboard application interface. As described above, the whiteboard application refers to an application for a user to perform operations such as writing and displaying, and may be used to generate writing traces according to a writing track of the user on the whiteboard application interface, and may also be used to insert other multimedia elements such as graphics, pictures, forms, and the like on the whiteboard application interface. In the whiteboard application interface, a user can realize writing, drawing, erasing and other operations similar to those of an entity blackboard, and further has better digital functions of moving, storing, zooming, inserting pictures, adjusting colors, setting stroke weights and the like. Writing-based operations in whiteboard applications are a mature realization of the prior art and are not described in detail in this solution. In practical applications, the whiteboard application may also be named as a writing application, an electronic whiteboard application, a collaborative whiteboard application, and the like, and the application for realizing the above functions is equivalent to the whiteboard application of the present application regardless of changes in the name.
When the whiteboard application is used, it is typically displayed full screen on one display screen; in this embodiment it is displayed on the first display screen 11. Referring to Fig. 2, in addition to the content written and added during use and the content inserted for display (not shown in Fig. 2), the whiteboard application interface normally shows a toolbar 111 containing the tool controls used to operate on that content. In the example of this scheme, the toolbar 111 is displayed at the bottom of the whiteboard application interface; in an actual layout it may also be displayed on the left and/or right side, or even across the left, right and bottom. Some tool controls in the toolbar 111 are digital counterparts of entity blackboard functions, such as writing, erasing and page turning (some entity blackboards support page turning); other tool controls supplement functions implemented on top of the whiteboard application, such as undoing input, redoing input, handwriting selection and insertion. To support multi-user cooperative operation, a toolbar 121 is also arranged at the bottom of the second display screen 12; compared with the first display screen 11, which serves as the display body of the whiteboard application interface, the second display screen 12 needs relatively fewer operations, so the controls in the toolbar 121 are also relatively fewer.
A sub-window picture is a display area on the whiteboard application interface in which different content can be displayed. As shown in Fig. 4, the sub-window picture 112 displays a file selected by a file presentation operation in the whiteboard application interface. As shown in Fig. 5, the sub-window picture 115 displays the display element 114 selected by an element selection operation from the display elements 113 and 114 currently shown in the whiteboard application interface; of course, the element selection operation may select several elements from the whiteboard application interface. A sub-window picture can also display a whole page of the whiteboard application interface, and so on. Files are seldom formally edited inside the whiteboard application interface; mostly their content is presented, annotated and recorded in real time, so the sub-window picture may keep only the basic controls for presenting, annotating and recording the communicated content in real time. The sub-window picture is suitable for various display contents (and hence different file types); the sub-window pictures corresponding to different display contents share only the same basic design style, while their underlying implementations and basic controls differ. For example, the sub-window picture for a document file encapsulates a picture-browsing control at its core (the document is shown page by page in the sub-window picture as single-page screenshots) and its basic controls correspondingly include page turning; the file display window for a web page file encapsulates a web browser control at its core and its basic controls include a URL input box. The generation of the above pictures during whiteboard application interaction is completed by the processor of the display interaction system, and the result is finally sent to the corresponding display screen for display.
Specifically, referring to fig. 3, the interaction method of the whiteboard application implemented in the display interaction system specifically includes:
step S101: and receiving a window moving operation on the child window picture.
Operations on a touch device are mostly realized through touch. In this scheme the window moving operation is triggered either by a direct touch on a set area of the sub-window picture (for example, a strip of a certain width at the top, or a corner) or by a long press (for example, 1 s) at any position inside the sub-window picture; that is, when the corresponding initial touch action is detected at such a position, the window moving operation is confirmed to be triggered, and the display position of the sub-window picture then responds to the moving track and touch parameters of the window moving operation. The detailed description of this embodiment explains the position change of the sub-window picture with reference to the drawings. The window moving operation is detected by the main display controller or the auxiliary display controller, determined by where the sub-window picture is displayed: if the sub-window picture is displayed on the main display device, the main display controller detects the touch operation and obtains a main operation signal; if it is displayed on the auxiliary display device, the auxiliary display controller detects the touch operation and obtains an auxiliary operation signal. Both the main operation signal and the auxiliary operation signal are sent to the processor for the corresponding response.
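In other words, a touch either lands in the set area or dwells long enough anywhere inside the sub-window picture. A sketch of that decision follows; the 1 s threshold comes from the example above, while the strip height and all names are assumptions made for the illustration.

```kotlin
// Decide whether a touch-down starts a window moving operation on the sub-window picture.
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
}

const val LONG_PRESS_MILLIS = 1_000L   // "long press (for example, 1 s)"
const val GRAB_STRIP_HEIGHT = 48f      // assumed height of the set area at the top of the sub-window

fun startsWindowMove(window: Rect, x: Float, y: Float, pressDurationMillis: Long): Boolean {
    if (!window.contains(x, y)) return false
    val inGrabStrip = y <= window.top + GRAB_STRIP_HEIGHT        // direct touch on the set area
    val longPressed = pressDurationMillis >= LONG_PRESS_MILLIS   // long press anywhere inside the window
    return inGrabStrip || longPressed
}
```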
Step S102: and adjusting the display position of the sub-window picture along with the window moving operation.
In terms of moving range, the window moving operation mainly has two types: movement across display screens and movement within one display screen.
The moving track of the window moving operation is essentially the route along which the touch point's position changes. When the window moving operation is triggered, a positional association is established between the touch point and the sub-window picture, and the display position of the sub-window picture is translated following the position change of the touch point, which completes the adjustment of the display position of the sub-window picture.
The touch point can move freely within one display screen, but in the usage scenario of this scheme (a smart interactive tablet spliced from multiple display screens) the movement of the sub-window picture is constrained by the interaction requirements of that scenario. Specifically, while the sub-window picture is not in contact with a boundary of the display screen, it moves freely with the position change of the touch point; when it contacts a boundary of the display screen, it performs a position adjustment that does not fully follow the movement of the touch point, depending on the type of the boundary.
Specifically, if the sub-window picture contacts a boundary of the display screen during the position adjustment, the type of the contacted boundary needs to be determined, and there are the following two moving modes:
if the sub-window picture is confirmed to be in contact with a first-type boundary of the smart interactive tablet during the following adjustment, the movement component perpendicular to the first-type boundary continues to be responded to, and the sub-window picture is displayed across the display screens;
if the sub-window picture is confirmed to be in contact with a second-type boundary of the smart interactive tablet during the following adjustment, the response to the movement component of the window moving operation perpendicular to the second-type boundary stops;
where the first-type boundary is the boundary determined by the common side edge where the two display screens adjoin, and a second-type boundary is a boundary determined by a side edge adjacent to or opposite the common side edge.
Here, when the sub-window picture contacts a boundary of the main display device or the auxiliary display device and there is no auxiliary display device or main display device in the moving direction, the boundary is used as a reflecting surface to change the moving direction of the sub-window picture. For a smart interactive tablet composed of multiple display screens, adjacent display screens exist, and the two adjacent side edges of two such screens are defined as a common side edge; a sub-window picture can continue to move into the adjacent display screen after contacting the common side edge, and within a display screen the display boundary determined by the common side edge is a first-type boundary. In some moving direction there may be no adjacent display screen; after contacting the side edge in that direction, the sub-window picture cannot remain completely displayed if it keeps moving forward. The display boundary determined by a side edge that is not adjacent to another display screen is a second-type boundary; in other words, a side edge of a display screen that is not a first-type boundary is a second-type boundary. In a smart interactive tablet composed of multiple display screens, the screens at the periphery have side edges not adjacent to other screens, so a tablet of this type has both first-type and second-type boundaries; for a display screen surrounded by adjacent screens on all sides, every side edge is a common side edge, i.e., all of its boundaries are first-type boundaries. Thus every smart interactive tablet has second-type boundaries, although some individual display screens might have none. Of course, for a display screen without any second-type boundary to appear, the smart interactive tablet would have to include at least 9 (3 × 3) display screens; given the trend toward larger single screens, such a design is no longer suitable for display and interaction in the application scenarios targeted by the smart interactive tablet, and the whiteboard application is not displayed on a tablet with that layout.
If the sub-window picture contacts a first-type boundary during the following move, the user is tending to move the sub-window picture toward the adjacent display screen; in response to this tendency the sub-window picture is gradually moved onto the adjacent display screen, progressively realizing the cross-screen movement of the sub-window picture. If the sub-window picture contacts a second-type boundary during the following move and the window moving operation keeps the same movement tendency, continuing to respond would make part of the sub-window picture disappear. At that point the response to the movement component of the window moving operation perpendicular to the second-type boundary stops; that is, the window moving operation is decomposed into a component perpendicular to the second-type boundary and a component parallel to it, the perpendicular component is not responded to, and only the parallel component is responded to, so the sub-window picture moves along the second-type boundary.
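A sketch of the two responses, using the first/second boundary classification above. It assumes two screens side by side (so every boundary is either vertical or horizontal); the type and function names are illustrative only.

```kotlin
// During a follow move, only the component parallel to a second-type boundary is applied;
// at a first-type (common-side) boundary the full move is applied and the window crosses screens.
enum class BoundaryType { FIRST, SECOND, NONE }

data class Move(val dx: Float, val dy: Float)

fun constrainMove(move: Move, touched: BoundaryType, boundaryIsVertical: Boolean): Move = when (touched) {
    BoundaryType.NONE, BoundaryType.FIRST -> move   // free move, or move across the common side edge
    BoundaryType.SECOND ->
        // Drop the component perpendicular to the boundary, keep the parallel one.
        if (boundaryIsVertical) Move(dx = 0f, dy = move.dy) else Move(dx = move.dx, dy = 0f)
}
```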
Step S103: and displaying the sub-window picture at the target position of the window moving operation, wherein the sub-window picture is completely displayed on one of the at least two display screens.
In the application scenario of a smart interactive tablet composed of multiple display screens, the screens are constrained by their physical structure: the display boundaries introduced by the structural side edges cannot be eliminated for now, and it is difficult to perform a touch operation across display screens. To guarantee the display and interaction effect and the subsequent cooperative operation, the sub-window picture is therefore required to be displayed completely within one display screen. If the position of the sub-window picture determined directly after the window moving operation already lies completely within one display screen, that position is the target position and the sub-window picture can be displayed there directly; if the position determined directly after the window moving operation spans two display screens, the display proportion of the sub-window picture on each screen is used to decide onto which screen it should be adjusted, i.e., the target position of the window moving operation has to be re-confirmed according to the display proportion.
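The re-confirmation can be pictured as comparing how much of the sub-window picture falls on each screen and translating it wholly onto the screen holding the larger share. The sketch below reuses the Rect helper from the earlier sketch and assumes two equally tall screens placed side by side, so only a horizontal translation is needed; function names are assumptions.

```kotlin
// Translate a cross-screen sub-window picture wholly onto the screen showing the larger share of it.
fun overlapArea(a: Rect, b: Rect): Float {
    val w = maxOf(0f, minOf(a.right, b.right) - maxOf(a.left, b.left))
    val h = maxOf(0f, minOf(a.bottom, b.bottom) - maxOf(a.top, b.top))
    return w * h
}

fun snapToLargestShare(window: Rect, firstScreen: Rect, secondScreen: Rect): Rect {
    val target = if (overlapArea(window, firstScreen) >= overlapArea(window, secondScreen)) firstScreen else secondScreen
    // Clamp the window inside the chosen screen, preserving its size.
    val dx = when {
        window.left < target.left   -> target.left - window.left
        window.right > target.right -> target.right - window.right
        else -> 0f
    }
    return Rect(window.left + dx, window.top, window.right + dx, window.bottom)
}
```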
According to the touch parameters of the window moving operation (especially the terminal moving speed), the window moving operation can be further subdivided into two types: a window dragging operation and a window swing operation. For a window dragging operation, the movement stops as soon as the operation stops, and whether the position of the sub-window picture needs to be adjusted is judged by whether the sub-window picture lies wholly within one display screen at that moment. For a window swing operation, after the operation stops the sub-window picture continues to decelerate along the moving direction of the touch point at the end of the touch until its speed drops to zero, and whether its position needs to be adjusted is judged by whether the sub-window picture then lies wholly within one display screen. Because a smart interactive tablet composed of multiple display screens is usually large, the window swing operation allows a long-distance move to be achieved with a small gesture, which simplifies the interaction.
A window moving operation whose terminal moving speed is within a preset dragging speed range is confirmed as a window dragging operation;
correspondingly, displaying the sub-window picture at the target position of the window moving operation includes:
if the sub-window picture is completely displayed on one display screen when the window moving operation is finished, the display position of the sub-window picture is kept;
and if the sub-window picture is displayed across the display screen when the window moving operation is finished, the sub-window picture is wholly translated to the display screen with the largest display proportion to be displayed.
A window moving operation whose terminal moving speed is within a preset swing speed range is confirmed as a window swing operation;
correspondingly, displaying the sub-window picture at the target position of the window moving operation includes:
continuously moving the sub-window picture at a preset acceleration based on the terminal moving speed;
if the sub-window picture is contacted with the second type of boundary in the continuous moving process, changing the moving direction of the sub-window picture by taking the second type of boundary as a reflecting surface;
if the moving speed of the sub-window picture is zero, the sub-window picture is completely displayed on a display screen, and the display position of the sub-window picture is kept;
and if the moving speed of the sub-window picture is zero, the sub-window picture is displayed across the display screen, and the whole sub-window picture is translated to the display screen with the largest display proportion to be displayed.
The main difference between the window dragging operation and the window swing operation is whether the sub-window picture keeps moving after the touch point disappears. Whether it keeps moving is decided by the moving speed at the moment the touch point disappears: if that speed is within the dragging speed range there is no continued movement, and if it is within the swing speed range there is. The two can therefore be distinguished by a speed threshold, and, considering interaction habits, the dragging speed is lower than the swing speed. The mechanism for confirming the target position after all movement ends is the same for both: if the sub-window picture is displayed completely on one display screen, its display state is kept; if it is displayed across screens, it is adjusted onto the display screen with the largest display proportion and displayed there as a whole.
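The decision boils down to one threshold on the terminal (lift-off) moving speed. The threshold value in the sketch below is an arbitrary placeholder, not a value given in the patent.

```kotlin
// Classify a window moving operation by the moving speed at the moment the touch point disappears.
enum class MoveKind { DRAG, SWING }

const val SWING_SPEED_THRESHOLD = 1_200f   // px/s, placeholder value; dragging speed < swing speed

fun classifyMove(terminalSpeed: Float): MoveKind =
    if (terminalSpeed > SWING_SPEED_THRESHOLD) MoveKind.SWING else MoveKind.DRAG
```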
The continued movement of the window swing operation is controlled by the moving speed at the moment the touch point disappears and the preset acceleration. The main case to consider is contact with a second-type boundary during the movement: the direction of the velocity is then changed with the second-type boundary acting as a reflecting surface, while the magnitude of the velocity is preserved. Specifically, a normal is constructed at the contact point between the sub-window picture and the second-type boundary; the velocities before and after the change lie on the two sides of this normal at the same angle to it, and the sub-window picture continues its decelerating movement at the changed velocity and the preset acceleration until the speed is zero.
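For an axis-aligned second-type boundary this reflection amounts to negating the velocity component perpendicular to the boundary while keeping the speed's magnitude, after which the window keeps decelerating until it stops. A minimal sketch follows; the deceleration value is a placeholder for the preset acceleration, and all names are assumptions.

```kotlin
import kotlin.math.sqrt

// Reflect the velocity at a second-type boundary and keep decelerating until the speed is zero.
data class Velocity(val vx: Float, val vy: Float) {
    val magnitude: Float get() = sqrt(vx * vx + vy * vy)
}

const val DECELERATION = 2_000f   // px/s^2, placeholder for the preset acceleration

fun reflect(v: Velocity, boundaryIsVertical: Boolean): Velocity =
    if (boundaryIsVertical) Velocity(-v.vx, v.vy) else Velocity(v.vx, -v.vy)   // same angle to the normal, same magnitude

fun decelerate(v: Velocity, dtSeconds: Float): Velocity {
    val speed = v.magnitude
    if (speed == 0f) return v
    val newSpeed = maxOf(0f, speed - DECELERATION * dtSeconds)
    val scale = newSpeed / speed
    return Velocity(v.vx * scale, v.vy * scale)
}
```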
On the basis of the above embodiment, if the sub-window picture and the whiteboard application interface are displayed on the same display screen, the sub-window picture is displayed on the whiteboard application interface;
and if the sub-window picture and the whiteboard application interface are displayed on different display screens, the display priority of the sub-window picture and the application interface displayed in the display screen is the same.
For the sub-window picture displayed on the whiteboard application interface, its display hierarchy can further be fixed as the topmost layer of the whiteboard application interface. Specifically, operations on other display elements outside the sub-window picture do not change its display level, so during its lifetime the sub-window picture can only be occluded by other sub-window pictures, and other display elements outside it (such as writing tracks, inserted pictures and inserted tables) cannot affect its display. If the sub-window picture and the whiteboard application interface are displayed on different display screens, the display priority of the sub-window picture is the same as that of the application interfaces displayed on that screen: whichever (sub-window picture or application interface) is the current operation focus is displayed on the topmost layer, and when the current topmost one is closed, the previous one becomes the operation focus and is displayed on the topmost layer.
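A compact way to express the two rules (names assumed, illustrative only): on the screen hosting the whiteboard application interface the sub-window picture is pinned topmost; on the other screen the top layer simply follows the current operation focus, and closing the topmost window lets the previous focus holder come back to the top.

```kotlin
// On the non-whiteboard screen the topmost layer follows the current operation focus.
class FocusStack {
    private val order = ArrayDeque<String>()   // window ids, last element = topmost

    fun bringToTop(id: String) { order.remove(id); order.addLast(id) }
    fun close(id: String) { order.remove(id) }         // the previous focus becomes topmost again
    fun topmost(): String? = order.lastOrNull()
}

// On the whiteboard screen the sub-window picture ignores the stack and stays on top.
fun isSubWindowTopmost(onWhiteboardScreen: Boolean, focus: FocusStack, subWindowId: String): Boolean =
    onWhiteboardScreen || focus.topmost() == subWindowId
```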
On the basis of the above embodiment, the interaction method further includes:
and receiving the adjustment operation on the sub-window picture, and adjusting the display state and/or the display content of the sub-window picture.
The adjustment operation on the sub-window screen can be subdivided into a state adjustment operation and a content adjustment operation. The state adjustment operation mainly comprises window maximization, window minimization, window restoration, window closing and the like; the content adjustment operation mainly includes adding writing handwriting, deleting writing handwriting, and the like.
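The two sub-categories can be handled by a single dispatcher. The operation names mirror the list above; everything else in this sketch (types, fields, function names) is an assumption for illustration.

```kotlin
// Dispatch sub-window adjustment operations into state adjustments and content adjustments.
interface SubWindowAdjustment

enum class StateAdjustment : SubWindowAdjustment { MAXIMIZE, MINIMIZE, RESTORE, CLOSE }

sealed interface ContentAdjustment : SubWindowAdjustment {
    data class AddStroke(val points: List<Pair<Float, Float>>) : ContentAdjustment
    data class DeleteStroke(val strokeId: Long) : ContentAdjustment
}

fun applyAdjustment(adj: SubWindowAdjustment) {
    when (adj) {
        is StateAdjustment                -> { /* maximize / minimize / restore / close the sub-window picture */ }
        is ContentAdjustment.AddStroke    -> { /* append writing handwriting to the sub-window content */ }
        is ContentAdjustment.DeleteStroke -> { /* remove the selected handwriting */ }
    }
}
```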
The present solution is further exemplified with reference to fig. 7-13 based on the display state shown in fig. 6. In fig. 6, the child window screen 116 is displayed on the first display screen 11, and the toolbar 111 can confirm that the whiteboard application interface is also displayed on the first display screen 11.
Fig. 7 and fig. 8 are combined on the basis of fig. 6, and a process of implementing the window drag operation is described. In fig. 7, the sub-window screen 116 moves in response to a window moving operation, and when a touch point disappears, it is detected that the speed of the touch point is within the range of the dragging speed, and it is confirmed that the window moving operation is the window dragging operation. The sub-window frame 116d is in a cross-screen display state and does not need to move continuously, 40% of the sub-window frame 116d is displayed on the first display screen 11, 60% is displayed on the second display screen 12, and the display scale of the second display screen 12 is the largest, so that the sub-window frame 116e is displayed on the second display screen 12 as a whole as shown in fig. 8.
Fig. 9 and fig. 10 are combined on the basis of fig. 6, and a process for implementing the window shaking operation is described. In fig. 9, the sub-window screen 116 moves in response to the window moving operation, and when the touch point disappears, it is detected that the speed of the touch point is within the swing speed range, and it is confirmed that the window moving operation is the window swing operation. When the sub-window frame 116f is displayed across screens and needs to move continuously, 55% of the sub-window frame 116g is displayed on the first display screen 11 and 45% of the sub-window frame 116g is displayed on the second display screen 12 when the sub-window frame 116g stops, and the display scale in the first display screen 11 is the largest, so that the sub-window frame 116h is displayed on the first display screen 11 as a whole as shown in fig. 10.
Fig. 11 is a conventional implementation process of a window drag operation on the basis of fig. 6, and the sub-window screen 116 moves to obtain a sub-window screen 116a in response to the window drag operation. When the sub-window picture 116 contacts the second type boundary along with the moving process, the sub-window picture is moved along the second type boundary, and if the window dragging operation has a component far away from the second type boundary, the sub-window picture is moved to the center of the first display screen 11.
Fig. 12 and fig. 13, taken together on the basis of fig. 5, show a further implementation of the window swing operation. The sub-window picture 116 moves following the window moving operation; the touch point disappears after the sub-window picture 116b is obtained, and at that moment the speed of the touch point is detected to be within the swing speed range, so the window moving operation is confirmed to be a window swing operation. The sub-window picture 116b is completely displayed on the first display screen 11 and needs to continue moving; after it contacts the second-type boundary during the continued movement, its velocity is changed with the second-type boundary acting as a reflecting surface, and it continues to decelerate at the changed velocity until the speed reaches zero, yielding the sub-window picture 116c. Referring to fig. 13, the contact point between the sub-window picture and the second-type boundary is A; v1 is the velocity at contact, v2 is the velocity after the change, the velocities before and after the change lie on opposite sides of the normal, and each makes an angle α with the normal.
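The velocity change at the reflecting surface is an ordinary mirror reflection about the boundary normal, which matches the equal angles α on both sides of the normal in fig. 13: v2 = v1 - 2 (v1 · n) n for a unit normal n. A short sketch:

```python
def reflect(v1, n):
    """Reflect velocity v1 about a boundary with unit normal n."""
    vx, vy = v1
    nx, ny = n
    dot = vx * nx + vy * ny
    return (vx - 2 * dot * nx, vy - 2 * dot * ny)


# Example: hitting a horizontal boundary with normal (0, 1)
# flips the vertical component only: (3, -4) -> (3, 4).
assert reflect((3, -4), (0, 1)) == (3, 4)
```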
This embodiment describes as many operations as possible within a single implementation, rather than one unique implementation. In practice, an implementation may perform and respond to only one requirement, such as cross-screen flicking; it may also handle and respond to multiple requirements, not necessarily in the sequence described here, for example, moving the sub-window picture from the other display screen back to the display screen on which the whiteboard application interface is located. All such implementations can be regarded as implementations of the present scheme as long as they do not depart from its core design idea.
Touch detection and screen display are mature prior-art technologies and are not described in detail in this embodiment. The foregoing judgment of operation types, the responses, and the picture generation are all implemented by a processor; that is, from the processor's point of view, the specific implementation is a process of receiving operation signals, responding to them, and generating pictures.
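From the processor's point of view, judging the operation type reduces to checking which preset range the terminal speed of the touch point falls into. The numeric ranges below are purely illustrative assumptions, not values given in this disclosure:

```python
DRAG_SPEED_RANGE = (0.0, 300.0)       # px/s, assumed
SWING_SPEED_RANGE = (300.0, 5000.0)   # px/s, assumed


def classify_move_operation(terminal_speed: float) -> str:
    """Map the speed of the touch point at release to an operation type."""
    if DRAG_SPEED_RANGE[0] <= terminal_speed < DRAG_SPEED_RANGE[1]:
        return "drag"    # settle immediately (figs. 7-8, 11)
    if SWING_SPEED_RANGE[0] <= terminal_speed < SWING_SPEED_RANGE[1]:
        return "swing"   # keep moving and decelerate (figs. 9-10, 12-13)
    return "ignore"      # outside both preset ranges
```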
The display interaction system comprises a main display device, an auxiliary display device, and a processor, where a main display controller is built into the main display device and an auxiliary display controller is built into the auxiliary display device. The processor generates a first main display signal and a first auxiliary display signal, or a second main display signal and a second auxiliary display signal, according to the operation of the whiteboard application; the first main display signal and the second main display signal carry main window pictures of the whiteboard application, and the first auxiliary display signal and the second main display signal carry sub-window pictures of the whiteboard application. The processor transmits the first main display signal or the second main display signal to the main display controller, and transmits the first auxiliary display signal or the second auxiliary display signal to the auxiliary display controller. The main display controller is used for controlling the main display device to display according to the first main display signal or the second main display signal; the auxiliary display controller is used for controlling the auxiliary display device to display according to the first auxiliary display signal or the second auxiliary display signal. The main display controller receives a main operation signal acting on the main display device and sends it to the processor; the auxiliary display controller receives an auxiliary operation signal acting on the auxiliary display device and sends it to the processor; and the processor confirms the operation of the whiteboard application according to the main operation signal or the auxiliary operation signal. Through the moving operation on the sub-window picture associated with the whiteboard application interface, the sub-window picture is switched between different display screens, so that the whiteboard application interface can be operated in partitions on different display screens according to actual interaction requirements, realizing multi-user cooperative operation on an intelligent interactive tablet that combines multiple display screens.
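A rough sketch of this signal routing is given below; the class layout, method names, and printed strings are illustrative stand-ins (a real system would drive display hardware and touch controllers rather than print):

```python
class DisplayController:
    def __init__(self, name: str):
        self.name = name

    def show(self, signal: str) -> None:
        print(f"{self.name} displays: {signal}")


class Processor:
    def __init__(self, main_ctrl: DisplayController, aux_ctrl: DisplayController):
        self.main_ctrl = main_ctrl
        self.aux_ctrl = aux_ctrl

    def on_operation(self, sub_window_on_aux: bool) -> None:
        """Regenerate the display signal pair after confirming a whiteboard operation."""
        if sub_window_on_aux:
            main_sig = "first main display signal (main window picture)"
            aux_sig = "first auxiliary display signal (sub-window picture)"
        else:
            main_sig = "second main display signal (main window and sub-window pictures)"
            aux_sig = "second auxiliary display signal"
        self.main_ctrl.show(main_sig)
        self.aux_ctrl.show(aux_sig)


processor = Processor(DisplayController("main display device"),
                      DisplayController("auxiliary display device"))
processor.on_operation(sub_window_on_aux=True)    # sub-window shown on the auxiliary device
processor.on_operation(sub_window_on_aux=False)   # sub-window shown on the main device
```

In the full system the controllers would also forward the main and auxiliary operation signals back to the processor; that return path is omitted from this sketch.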
Example two
Fig. 14 is a schematic structural diagram of a display interaction device according to a second embodiment of the present invention. Referring to fig. 14, the display interaction apparatus includes: a mobile receiving unit 201, a following display unit 202, and a target display unit 203.
The intelligent interactive panel comprises at least two display screens, one of which displays a whiteboard application interface in a full-screen manner, and one of which displays a sub-window picture generated by the whiteboard application interface. In the interaction apparatus, the mobile receiving unit 201 is configured to receive a window moving operation on the sub-window picture; the following display unit 202 is configured to adjust the display position of the sub-window picture following the window moving operation; and the target display unit 203 is configured to display the sub-window picture at the target position of the window moving operation, where the sub-window picture is completely displayed on one of the at least two display screens.
On the basis of the above embodiment, the following display unit 202 includes:
the first following module is used for confirming that the sub-window picture contacts a first-type boundary of the intelligent interaction tablet during the following adjustment, and continuing to respond to the movement component of the window moving operation in the direction perpendicular to the first-type boundary, so that the sub-window picture is displayed across the display screens;
the second confirming module is used for confirming that the sub-window picture contacts a second-type boundary of the intelligent interaction tablet during the following adjustment, and stopping responding to the movement component of the window moving operation in the direction perpendicular to the second-type boundary;
the first-type boundary is the boundary determined by the common side along which the two display screens adjoin, and the second-type boundary is a boundary determined by a side adjacent to or opposite the common side.
On the basis of the embodiment, the window moving operation comprises a window dragging operation, and the terminal moving speed of the window dragging operation is within a preset dragging speed range;
correspondingly, the target display unit 203 includes:
the first display module is used for keeping the display position of the sub-window picture if the sub-window picture is completely displayed on one display screen when the window moving operation is finished;
and the second display module is used for translating the whole sub-window picture to the display screen with the largest display proportion for display if the sub-window picture is displayed across the display screens when the window moving operation is finished.
On the basis of the above embodiment, the window moving operation includes a window swing operation, and the terminal moving speed of the window swing operation is within a preset swing speed range;
correspondingly, the target display unit 203 includes:
the deceleration moving module is used for continuously moving the sub-window picture at a preset acceleration based on the terminal moving speed;
the third display module is used for keeping the display position of the sub-window picture if the moving speed of the sub-window picture is zero and the sub-window picture is completely displayed on one display screen;
and the fourth display module is used for translating the whole sub-window picture to the display screen with the largest display proportion for display if the moving speed of the sub-window picture is zero and the sub-window picture is displayed across the display screens.
On the basis of the above embodiment, the deceleration moving module includes:
and the reflection turning submodule is used for changing the moving direction of the sub-window picture by taking the second type of boundary as a reflecting surface if the sub-window picture is contacted with the second type of boundary in the continuous moving process.
On the basis of the above embodiment, the display interaction apparatus further includes:
the top display unit is used for displaying the sub-window picture on top of the whiteboard application interface if the sub-window picture and the whiteboard application interface are displayed on the same display screen;
and the peer display unit is used for, if the sub-window picture and the whiteboard application interface are displayed on different display screens, giving the sub-window picture the same display priority as the application interfaces displayed on the display screen where the sub-window picture is located.
On the basis of the above embodiment, the display interaction apparatus further includes:
and the window adjusting unit is used for receiving the adjusting operation on the sub-window picture and adjusting the display state and/or the display content of the sub-window picture.
The display interaction device provided by the embodiment of the invention is operated by the processor, can be used for executing any display interaction method provided by the first embodiment, and has corresponding functions and beneficial effects.
Example three
Fig. 15 is a schematic structural diagram of a terminal device according to a third embodiment of the present invention; the terminal device is a specific hardware implementation of the display interaction system described above. As shown in fig. 15, the terminal device includes a processor 310, a memory 320, an input device 330, an output device 340, and a communication device 350; the number of processors 310 in the terminal device may be one or more, and one processor 310 is taken as an example in fig. 15; the processor 310, the memory 320, the input device 330, the output device 340, and the communication device 350 in the terminal device may be connected by a bus or in other ways, and connection by a bus is taken as an example in fig. 15.
The memory 320, as a computer-readable storage medium, is used to store software programs, computer-executable programs, and modules, such as the program instructions/modules corresponding to the display interaction method in the embodiments of the present invention (for example, the mobile receiving unit 201, the following display unit 202, and the target display unit 203 in the display interaction apparatus). The processor 310 performs the various functional applications and data processing of the terminal device by running the software programs, instructions, and modules stored in the memory 320, thereby implementing the display interaction method described above.
The memory 320 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system and an application program required by at least one function, and the data storage area may store data created according to the use of the terminal device, and the like. Further, the memory 320 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some examples, the memory 320 may further include memory located remotely from the processor 310, and such remote memory may be connected to the terminal device via a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 330 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the terminal apparatus. The output device 340 may include a display device such as a display screen.
The terminal equipment comprises a display interaction device, can be used for executing any display interaction method, and has corresponding functions and beneficial effects.
Example four
Embodiments of the present invention further provide a storage medium containing computer-executable instructions, which when executed by a computer processor, are configured to perform operations related to the display interaction method provided in any of the embodiments of the present application, and have corresponding functions and advantages.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product.
Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.

The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions.

These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.

These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.

These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process, such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory. The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). The memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. A display interaction system is characterized by comprising main display equipment, auxiliary display equipment and a processor, wherein the main display equipment is internally provided with a main display controller, and the auxiliary display equipment is internally provided with an auxiliary display controller;
the processor generates a first main display signal and a first auxiliary display signal or generates a second main display signal and a second auxiliary display signal according to the operation of the whiteboard application; the first main display signal and the second main display signal carry main window pictures of the whiteboard application, and the first auxiliary display signal and the second main display signal carry sub-window pictures of the whiteboard application;
the processor transmits the first main display signal or the second main display signal to the main display controller and transmits the first auxiliary display signal or the second auxiliary display signal to the auxiliary display controller;
the main display controller is used for controlling the main display equipment to display according to the first main display signal or the second main display signal; the auxiliary display controller is used for controlling the auxiliary display equipment to display according to the first auxiliary display signal or the second auxiliary display signal;
the main display controller receives a main operation signal acting on main display equipment and sends the main operation signal to the processor; the auxiliary display controller receives an auxiliary operation signal acting on auxiliary display equipment and sends the auxiliary operation signal to the processor;
and the processor confirms the operation of the whiteboard application according to the main operation signal or the auxiliary operation signal.
2. The display interaction system according to claim 1, wherein the processor generates a first main display signal and a first sub display signal when the main operation signal is an operation signal for moving the sub window screen from a main display device to the sub display device.
3. The display interaction system of claim 1, wherein the processor generates a second main display signal and a second sub display signal when the sub window screen is moved from a sub display device to the main display device.
4. The display interaction system according to claim 1, wherein when the main operation signal is a main movement operation signal of the sub-window frame in the main display device, the processor updates a display position of the sub-window frame in the main display device according to the main movement operation signal, and correspondingly updates the second main display signal.
5. The display interaction system according to claim 1, wherein when the sub-operation signal is a sub-movement operation signal of the sub-window picture in the sub-display device, the processor updates a display position of the sub-window picture in the sub-display device according to the sub-movement operation signal, and correspondingly updates the first sub-display signal.
6. The display interaction system of claim 1, wherein when a terminal moving speed corresponding to the main operation signal or the sub operation signal is greater than a preset threshold value, the processor continuously updates the display position of the sub-window screen based on the terminal moving speed and a preset acceleration.
7. The display interaction system according to claim 6, wherein when the sub-window screen contacts a boundary of the main display device or the sub-display device and the moving direction is not the sub-display device or the main display device, the moving direction of the sub-window screen is changed by using the boundary as a reflection surface.
8. The display interaction system of claim 1, wherein the sub-window frame is displayed on top of the main display device.
9. The display interaction system of claim 1, wherein the sub-window screen has the same display priority on the secondary display device as other application interfaces.
10. The display interaction system of claim 1, wherein the processor is further configured to receive a sub-window adjustment operation, and adjust a display state and/or display content of the sub-window screen according to the sub-window adjustment operation.
CN202010340666.7A 2020-04-26 2020-04-26 Display interaction system Active CN111580713B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010340666.7A CN111580713B (en) 2020-04-26 2020-04-26 Display interaction system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010340666.7A CN111580713B (en) 2020-04-26 2020-04-26 Display interaction system

Publications (2)

Publication Number Publication Date
CN111580713A true CN111580713A (en) 2020-08-25
CN111580713B CN111580713B (en) 2021-09-17

Family

ID=72114990

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010340666.7A Active CN111580713B (en) 2020-04-26 2020-04-26 Display interaction system

Country Status (1)

Country Link
CN (1) CN111580713B (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030011678A1 (en) * 2001-07-14 2003-01-16 Chun Doo-Hwan Multichannel image processor and security system employing the same
CN1397888A (en) * 2001-07-14 2003-02-19 三星电子株式会社 Multichannel image processor and security system using same
CN102203722A (en) * 2008-09-03 2011-09-28 智能技术无限责任公司 Method of displaying applications in a multi-monitor computer system and multi-monitor computer system employing the method
CN101986384A (en) * 2009-07-29 2011-03-16 赛丽电子***(上海)有限公司 Method for processing display of multi-layer picture in picture for DLP multi-screen splicing display wall
CN103049136A (en) * 2013-01-05 2013-04-17 锐达互动科技股份有限公司 Double-board interaction implementation method on basis of electronic white boards
CN104750440A (en) * 2013-12-30 2015-07-01 纬创资通股份有限公司 Multi-screen window management method, electronic device and computer program product
CN105204771A (en) * 2015-10-27 2015-12-30 广东威创视讯科技股份有限公司 Mobile terminal control-based splicing screen windowing position determining method and mobile terminal
CN105867870A (en) * 2016-05-04 2016-08-17 广东威创视讯科技股份有限公司 Back-display method and device of spliced wall windows
CN108874331A (en) * 2017-05-08 2018-11-23 Tcl新技术(惠州)有限公司 A kind of video is across screen display methods, storage equipment and device

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI797929B (en) * 2021-12-29 2023-04-01 華碩電腦股份有限公司 Displaying control method
US11687306B1 (en) 2021-12-29 2023-06-27 Asustek Computer Inc. Displaying control method
CN116541121A (en) * 2023-07-06 2023-08-04 深圳市微克科技有限公司 Dial fish swimming method, system and storage medium based on intelligent wearable device
CN116541121B (en) * 2023-07-06 2023-09-19 深圳市微克科技有限公司 Dial fish swimming method, system and storage medium based on intelligent wearable device

Also Published As

Publication number Publication date
CN111580713B (en) 2021-09-17

Similar Documents

Publication Publication Date Title
CN110928468A (en) Page display method, device, equipment and storage medium of intelligent interactive tablet
JP7345052B2 (en) Intelligent interactive panel control method and device
CN114217726B (en) Operation method and device of intelligent interaction panel, terminal equipment and storage medium
CN111813302B (en) Screen projection display method and device, terminal equipment and storage medium
CN110928459B (en) Writing operation method, device, equipment and storage medium of intelligent interactive tablet
CN110928475B (en) Page interaction method, device, equipment and storage medium of intelligent interaction panel
CN110941373B (en) Interaction method and device for intelligent interaction panel, terminal equipment and storage medium
CN113934356B (en) Display operation method, device, equipment and storage medium of intelligent interaction panel
CN111338538A (en) Page operation method, device, equipment and storage medium of intelligent interactive tablet
CN111580713B (en) Display interaction system
CN111428455B (en) Form management method, device, equipment and storage medium
WO2021068405A1 (en) Element transfer method, apparatus and device, and storage medium
CN112462972A (en) White board page new method and device, interactive panel and storage medium
KR102682276B1 (en) Control method and device for smart interactive tablet
WO2023093504A1 (en) Interactive board split-screen control method, apparatus and device and storage medium
CN114155326A (en) Demonstration manuscript blackboard writing display method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant