CN111078117B - Courseware system based on multi-screen interaction - Google Patents

Courseware system based on multi-screen interaction

Info

Publication number
CN111078117B
Authority
CN (China)
Prior art keywords
interactive, interaction, touch, unit, screen
Legal status
Active
Application number
CN202010210582.1A
Other languages
Chinese (zh)
Other versions
CN111078117A
Inventors
黄敦笔, 杜武平, 唐金腾
Current Assignee
Hangzhou Sairobo Network Technology Co., Ltd.
Original Assignee
Hangzhou Sairobo Network Technology Co., Ltd.
Priority and filing date
2020-03-24
Publication of CN111078117A
2020-04-28
Grant and publication of CN111078117B
2020-07-21

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/0488 — Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/0412 — Digitisers, e.g. for touch screens or touch pads, structurally integrated in a display
    • G06F 3/0416 — Control or interface arrangements specially adapted for digitisers
    • G06F 3/04847 — Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G09B — EDUCATIONAL OR DEMONSTRATION APPLIANCES
    • G09B 5/067 — Electrically-operated educational appliances: combinations of audio and projected visual presentation, e.g. film, slides

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A courseware system based on multi-screen interaction comprises: a first presentation unit, for displaying the visual output content and instant interactive response content of a first user's human-computer interaction; a first interaction unit, for collecting and generating interaction data packets of the first user's interactions and feeding an interaction response back to the first presentation unit to indicate that an interaction has occurred; a first computing unit, for receiving the sequence of interaction data packets output by the first interaction unit and computing the interaction process of the interaction object and the overall interaction behavior result; a first communication unit, for receiving the trajectory, dwell-time, and speed transient data and the overall interaction behavior result generated by the first computing unit; and a first interactive response unit, for monitoring, receiving, and parsing the transient data and overall interaction behavior result data sent by the first communication unit. The invention realizes real-time sensing of, and multi-screen response to, the behavior of the overall interaction subject, so that human-computer interaction is no longer limited to simple touch.

Description

Courseware system based on multi-screen interaction
Technical Field
The invention relates to the technical field of internet, in particular to a courseware system based on multi-screen interaction.
Background
Existing touch-screen interaction schemes include infrared and lidar (laser radar) technologies.
In typical existing infrared technology, infrared multi-touch (IRMT) detects and locates a person's touch gestures with a dense matrix of infrared beams in the X and Y directions. An infrared touch screen mounts a circuit-board frame in front of the display, with infrared emitting tubes and receiving tubes arranged in one-to-one correspondence along the four sides of the screen, forming a grid of horizontally and vertically crossing infrared beams. When a finger touches the screen, it blocks the horizontal and the vertical beam passing through that position, from which the coordinates of the touch point on the screen can be determined. Any opaque object interrupts the infrared beams at the touch point, enabling touch-screen operation.
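To make the geometry concrete, here is a minimal sketch (not from the patent; the function name and grid pitch are hypothetical) of how the indices of blocked horizontal and vertical beams yield a touch coordinate:

def locate_touch(blocked_x, blocked_y, beam_pitch_mm=5.0):
    """Estimate a touch point's screen coordinates from the indices of the
    blocked vertical-beam columns (x) and horizontal-beam rows (y)."""
    if not blocked_x or not blocked_y:
        return None  # no beam interrupted: nothing is touching the screen
    # The centre of each blocked band approximates the finger position.
    x = sum(blocked_x) / len(blocked_x) * beam_pitch_mm
    y = sum(blocked_y) / len(blocked_y) * beam_pitch_mm
    return x, y

# A finger blocking beam columns 23-25 and rows 11-12:
print(locate_touch([23, 24, 25], [11, 12]))  # -> (120.0, 57.5)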
Lidar is a radar system that emits laser beams to measure a target's position, velocity, and other characteristic quantities. It works by transmitting a probe laser beam toward the target, comparing the received signal reflected from the target with the transmitted signal, and, after suitable processing, obtaining information about the target such as its distance and bearing.
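For example, with pulsed time-of-flight ranging the target distance follows directly from the round-trip delay of the laser pulse: d = c × Δt / 2, where c is the speed of light and Δt is the interval between emission and reception; a delay of 100 ns thus corresponds to a target roughly 15 m away.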
Multi-screen interactive courseware built on infrared technology and lidar can thus sense parameters of an interaction target in the physical scene, such as distance, bearing, and coordinate position, and complete the touch response of a single-contact interaction. An overall interaction behavior, however, is composed of a series of single-touch interactions and cannot be inferred from the results of simple single-touch interactions alone. The prior art therefore does not adequately solve the detection of overall interaction behavior, nor multi-screen interactive response based on that behavior. A system and method for detecting and synchronizing interactions based on multi-screen interaction behavior is therefore desirable.
Disclosure of Invention
To solve these problems, the invention provides a courseware system based on multi-screen interaction that realizes real-time sensing of, and multi-screen response to, the behavior of the overall interaction subject, so that human-computer interaction is no longer limited to the touch response of a single contact point.
The technical scheme of the invention is as follows:
a courseware system based on multi-screen interaction comprises:
a first presentation unit: for displaying the visual output content and instant interactive response content of a first user's human-computer interaction;
a first interaction unit: for collecting and generating interaction data packets of the first user's interactions, and feeding an interaction response back to the first presentation unit to indicate that an interaction has occurred;
a first computing unit: for receiving the sequence of interaction data packets output by the first interaction unit, and computing the interaction process of the interaction object and the overall interaction behavior result;
a first communication unit: for receiving the trajectory, dwell-time, and speed transient data and the overall interaction behavior result generated by the first computing unit, and sending them, encapsulated in a message protocol, to the first interactive response unit;
a first interactive response unit: for monitoring, receiving, and parsing the transient data and overall interaction behavior result data sent by the first communication unit, and dispatching and triggering the corresponding response events according to each;
a second presentation unit: for presenting the resulting response events in the form of courseware files, animations, sound effects, and video multimedia content.
Preferably, the visual output content of the human-computer interaction comprises courseware files, animations, sound effects, and videos; the instant interactive response content is the immediate effect feedback produced by the first presentation unit on receiving an interaction response, and comprises visual animation effects and audio resources.
Preferably, an interaction data packet includes at least an interaction touch serial number ID, a touch-point coordinate P, an acquisition timestamp TS, and a touch TYPE. The serial number identifies one touch within a multi-touch sequence; the coordinate gives the touch point's screen position relative to the first presentation unit; the timestamp records when the touch record was generated; and the type is one of the touch-point states enter, continue, and leave.
Preferably, computing the overall interaction behavior result comprises: matching the interaction touch-point coordinates output in the packet sequence against the regions of the preset interaction objects visually presented on the first presentation unit, to determine whether the touch points belong to the same interaction object; computing trajectory, dwell-time, and speed transient data from the sequence data and changes of the serial numbers ID, touch-point coordinates P, acquisition timestamps TS, and touch TYPEs of the touch points belonging to the same interaction object; and computing the coverage of the interaction object by the touch points inside it, to judge whether the object's prefabricated interaction response is triggered.
Preferably, the first presentation unit includes a preset user-interaction object, which is either an image or video resource of a single geometric shape, or an overall image or video resource composed of multiple independently interactive element objects.
Preferably, the first and second presentation units each comprise a display device, a display accessory, and an audio playback device. The display device is a display screen based on LED, LCD, CRT, or IPS technology, a curtain or white wall used with a projector, or a video wall spliced from multiple liquid-crystal modules of such materials. The display accessory is a protective panel of tempered glass or transparent PVC mounted on the physical surface of the display to bear the weight and pressure loads of human-computer interaction. The audio playback device is any device capable of playing audio resources.
Preferably, the first interaction unit comprises an infrared frame, an infrared camera, radar touch, capacitive touch, or resistive touch.
Preferably, the first computing unit judges that the same interaction object has triggered an overall interaction response when one of the following conditions is met:
when the preset user-interaction object is an overall image or video resource composed of multiple independently interactive element objects, touch interaction must have occurred on every geometric image or video resource within the interaction object;
when the preset user-interaction object is an image or video resource of a single geometric shape, the geometric area of the object covered by the current touch points is computed, and the coverage condition holds if and only if the covered area S_cover exceeds most of the object's total geometric area S_total, i.e. S_cover ≥ λ × S_total, where λ is a threshold in the range (0, 1); in addition, a feature vector generated from all touch points in the current interaction subject is matched against the preset user-interaction geometric object, whereupon the overall interaction object is judged to be triggered.
Preferably, computing the touch-point coverage area S_cover involves smoothing and filtering the touch points, clustering them, reconstructing the touch region, and computing its area. The clustering takes, for each point cluster, the outermost touch points (those at maximum radius from the cluster center) as the cluster's result; connecting the result points of each cluster in order yields a connected, closed geometric figure; and the area computation is then performed on that closed figure.
More preferably, the first communication unit encapsulates the interaction transient data and the overall interaction behavior result data in a message protocol whose format is implemented with XML, JSON, or Protocol Buffers, and applies data encryption to ensure secure communication.
Beneficial effects of the invention: compared with the prior art, the system realizes real-time sensing of, and multi-screen response to, the behavior of the overall interaction subject, so that human-computer interaction is no longer limited to simple touch; the system has low computational complexity and good cross-platform portability, and is well suited to wide deployment.
Drawings
Fig. 1 is an interaction diagram of each component unit in the embodiment of the present invention.
Detailed Description
Embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
As shown in Fig. 1, a courseware system based on multi-screen interaction includes at least a first presentation unit, a first interaction unit, a first computing unit, a first communication unit, a first interactive response unit, and a second presentation unit.
The first presentation unit is responsible for displaying the visual output content and instant interactive response content of the first user's human-computer interaction. Object subjects requiring user interaction are preset in the courseware content. The visual output content includes courseware files, animations, sound effects, and videos. The instant interactive response content is the immediate effect feedback produced when the first presentation unit receives an interaction response, and includes visual animation effects, audio resources, and the like.
The first presentation unit comprises a display device, a display accessory, and an audio playback device. The display device is a display screen based on LED, LCD, CRT, or IPS technology, a curtain or white wall used with a projector, or a video wall spliced from multiple liquid-crystal modules of such materials. The display accessory is a protective panel of tempered glass or transparent PVC mounted on the physical surface of the display to bear the weight and pressure loads of human-computer interaction. The audio playback device is any device capable of playing audio resources, such as a loudspeaker, speaker system, or other sound equipment.
The first interaction unit comprises the devices and working drivers of existing touch-interaction technology, such as an infrared frame, an infrared camera, radar touch, capacitive touch, or resistive touch.
The first interaction unit is responsible for collecting and generating the data packets of the first user's interactions, and for feeding an interaction response back to the first presentation unit to indicate that an interaction has occurred. Each packet includes at least an interaction touch serial number ID, a touch-point coordinate P, an acquisition timestamp TS, and a touch TYPE. The serial number identifies one touch within a multi-touch sequence; the coordinate gives the touch point's screen position relative to the first presentation unit; the timestamp records when the touch record was generated; and the type is one of the touch-point states enter, continue, and leave. The generated interaction packet sequence has the form:
{ID_1, P_1, TS_1, TYPE_1}
{ID_2, P_2, TS_2, TYPE_2}
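Purely as an illustration (the patent does not prescribe a concrete encoding), such a packet can be modeled as follows; the field names mirror the ID/P/TS/TYPE notation above:

from dataclasses import dataclass
from enum import Enum

class TouchType(Enum):
    ENTER = "enter"       # touch point enters the screen
    CONTINUE = "continue" # touch point continues moving or holding
    LEAVE = "leave"       # touch point leaves the screen

@dataclass
class TouchPacket:
    touch_id: int          # interaction touch serial number ID
    point: tuple           # touch-point coordinate P: (x, y) on the first presentation unit
    timestamp: float       # acquisition timestamp TS, in seconds
    touch_type: TouchType  # touch TYPE

# A short packet sequence of the form shown above:
sequence = [
    TouchPacket(1, (120.0, 57.5), 0.000, TouchType.ENTER),
    TouchPacket(1, (124.0, 60.0), 0.016, TouchType.CONTINUE),
    TouchPacket(1, (126.0, 61.0), 0.032, TouchType.LEAVE),
]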
and the first calculation unit receives the interactive data packet sequence output by the first interaction unit, and calculates and generates an interaction process and an overall interaction behavior result of the interaction object. The man-machine interaction result generated by calculation comprises the following steps: firstly, performing area matching according to the coordinates of interactive touch points output by a data packet sequence and a preset interactive object body of visual output in a given first display unit, and calculating whether the interactive touch points belong to the same object body; secondly, calculating transient data information such as interactive track, dwell time, speed and the like according to serial numbers ID of touch points, coordinates P of the touch points, acquired time stamps TS and sequence data and changes of touch TYPE TYPE in the same interactive object; thirdly, the touch points in the interactive object body are covered on the object body, and the interactive response of the prefabricated interactive object body is judged to be triggered.
The first communication unit receives the trajectory, dwell-time, and speed transient data and the overall interaction behavior result generated by the first computing unit, encapsulates them in a message protocol, and sends them to the first interactive response unit.
The first interactive response unit monitors, receives, and parses the transient data and overall interaction behavior result data sent by the first communication unit, and dispatches and triggers the corresponding response events according to the transient data and the result data respectively.
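The dispatch mechanism itself is not specified in the patent; one plausible sketch, with hypothetical message fields, is a simple handler table keyed by message kind:

def dispatch(message, handlers):
    """Route a parsed message to the response handler registered for its
    kind; `handlers` maps "transient" / "result" to callbacks."""
    handler = handlers.get(message["kind"])
    if handler is not None:
        handler(message["payload"])  # triggers the corresponding response event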
The second presentation unit has the same functions as the first presentation unit and comprises a display device, a display accessory, and an audio playback device. The display device is a display screen based on LED, LCD, CRT, or IPS technology, a curtain or white wall used with a projector, or a video wall spliced from multiple liquid-crystal modules of such materials. The display accessory is a protective panel of tempered glass or transparent PVC mounted on the physical surface of the display to bear the weight and pressure loads of human-computer interaction.
The preset user-interaction object associated with the first presentation unit is either an image or video resource of a single geometric shape, or an overall image or video resource composed of multiple independently interactive element objects.
The first computing unit judges that the same object has triggered an overall interaction response when one of two conditions is met. In the first case, when the preset user-interaction object is an overall image or video resource composed of multiple independently interactive element objects, touch interaction must have occurred on every geometric image or video resource within the object. In the second case, when the preset user-interaction object is an image or video resource of a single geometric shape, the geometric area of the object covered by the current touch points is first computed, and the coverage condition holds if and only if the covered area S_cover exceeds most of the object's total geometric area S_total:

S_cover ≥ λ × S_total

where λ is a threshold in the range (0, 1), a typical value being 0.8. Then, a feature vector generated from all touch points in the current interaction subject is matched against the preset user-interaction geometric object, whereupon the overall interaction object is judged to be triggered.
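A minimal sketch of the area test, assuming S_cover has already been computed by the procedure described next; λ, S_cover, and S_total are the symbols defined above:

def coverage_triggered(s_cover, s_total, lam=0.8):
    """True when the touch-covered area reaches the threshold fraction
    lam of the object's total geometric area: S_cover >= lam * S_total."""
    assert 0.0 < lam < 1.0, "the threshold lambda must lie in (0, 1)"
    return s_cover >= lam * s_total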
Computing the touch-point coverage area involves smoothing and filtering the touch points, clustering them, reconstructing the touch region, and computing its area. The smoothing filter removes jitter errors from the interaction process; the clustering takes, for each point cluster, the outermost touch points (those at maximum radius from the cluster center) as the cluster's result; connecting the result points of each cluster in order yields a connected, closed geometric figure; and the area computation is then performed on that closed figure.
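The patent does not name a specific area algorithm; as an illustrative sketch, the area of the connected, closed figure obtained by joining the ordered cluster result points can be computed with the standard shoelace formula:

def polygon_area(points):
    """Shoelace formula: area of a simple polygon whose vertices are
    given in order (the closed figure of outermost cluster points)."""
    area2 = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]  # wrap around to close the figure
        area2 += x1 * y2 - x2 * y1
    return abs(area2) / 2.0

# Outermost points forming a 10 x 10 square -> area 100.0
print(polygon_area([(0, 0), (10, 0), (10, 10), (0, 10)]))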
The first communication unit encapsulates the interaction transient data and the overall interaction behavior result data in a message protocol whose format is implemented with XML, JSON, Protocol Buffers, or the like, and applies data encryption to ensure secure communication.
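A minimal sketch of a JSON encapsulation, with illustrative field names; the encryption step is a stand-in, since the patent fixes neither the message schema nor the cipher:

import json

def encapsulate(transient, result, encrypt=lambda raw: raw):
    """Pack transient data and the overall interaction behavior result
    into one JSON message; `encrypt` stands in for the chosen cipher."""
    message = {
        "kind": "interaction",
        "transient": transient,  # e.g. trajectory, dwell time, speed
        "result": result,        # overall interaction behavior result
    }
    return encrypt(json.dumps(message).encode("utf-8"))

payload = encapsulate({"trajectory": 42.5, "dwell": 1.2, "speed": 35.4},
                      {"object_id": 7, "triggered": True})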
Finally, it should be noted that the above embodiments are merely specific embodiments of the invention, intended to illustrate its technical solution rather than to limit it, and the scope of protection of the invention is not restricted to them. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical features described therein may still be modified or replaced by equivalents within the technical scope of this disclosure, and such modifications or substitutions do not depart from the spirit and scope of the invention. The scope of protection of the invention is therefore defined by the claims.

Claims (6)

1. A courseware system based on multi-screen interaction, characterized by comprising:
a first presentation unit: for displaying the visual output content and instant interactive response content of a first user's human-computer interaction;
a first interaction unit: for collecting and generating interaction data packets of the first user's interactions, and feeding an interaction response back to the first presentation unit to indicate that an interaction has occurred;
a first computing unit: for receiving the sequence of interaction data packets output by the first interaction unit, and computing the interaction process of the interaction object and the overall interaction behavior result;
a first communication unit: for receiving the trajectory, dwell-time, and speed transient data and the overall interaction behavior result generated by the first computing unit, and sending them, encapsulated in a message protocol, to the first interactive response unit;
a first interactive response unit: for monitoring, receiving, and parsing the transient data and overall interaction behavior result data sent by the first communication unit, and dispatching and triggering the corresponding response events according to each;
a second presentation unit: for presenting the resulting response events in the form of courseware files, animations, sound effects, and video multimedia content;
wherein an interaction data packet includes at least an interaction touch serial number ID, a touch-point coordinate P, an acquisition timestamp TS, and a touch TYPE; the serial number identifies one touch within a multi-touch sequence, the coordinate gives the touch point's screen position relative to the first presentation unit, the timestamp records when the touch record was generated, and the type is one of the touch-point states enter, continue, and leave;
computing the overall interaction behavior result comprises: matching the interaction touch-point coordinates output in the packet sequence against the regions of the preset interaction objects visually presented on the first presentation unit, to determine whether the touch points belong to the same interaction object; computing trajectory, dwell-time, and speed transient data from the sequence data and changes of the serial numbers ID, touch-point coordinates P, acquisition timestamps TS, and touch TYPEs of the touch points belonging to the same interaction object; and computing the coverage of the interaction object by the touch points inside it, to judge whether the object's prefabricated interaction response is triggered;
the first presentation unit includes a preset user-interaction object, which is either an image or video resource of a single geometric shape, or an overall image or video resource composed of multiple independently interactive element objects;
the first computing unit judges that the same interaction object has triggered an overall interaction response when one of the following conditions is met:
when the preset user-interaction object is an overall image or video resource composed of multiple independently interactive element objects, touch interaction must have occurred on every geometric image or video resource within the object;
when the preset user-interaction object is an image or video resource of a single geometric shape, the geometric area of the object covered by the current touch points is computed, and the coverage condition holds if and only if the covered area S_cover exceeds most of the object's total geometric area S_total; and a feature vector generated from all touch points in the current interaction subject is matched against the preset user-interaction geometric object, whereupon the overall interaction object is judged to be triggered.
2. A courseware system based on multi-screen interaction as claimed in claim 1, wherein the visual output content of the human-computer interaction comprises courseware files, animations, sound effects, and videos; the instant interactive response content is the immediate effect feedback produced by the first presentation unit on receiving an interaction response, and comprises visual animation effects and audio resources.
3. A courseware system based on multi-screen interaction as claimed in claim 1, wherein the first and second presentation units each comprise a display device, a display accessory, and an audio playback device; the display device is a display screen based on LED, LCD, CRT, or IPS technology, a curtain or white wall used with a projector, or a video wall spliced from multiple liquid-crystal modules of such materials; the display accessory is a protective panel of tempered glass or transparent PVC mounted on the physical surface of the display to bear the weight and pressure loads of human-computer interaction; and the audio playback device is a device capable of playing audio resources.
4. A courseware system based on multi-screen interaction according to claim 1, wherein the first interaction unit comprises an infrared frame, an infrared camera, radar touch, capacitive touch, and resistive touch.
5. A courseware system based on multi-screen interaction as claimed in claim 1, wherein computing the touch-point coverage area S_cover involves smoothing and filtering the touch points, clustering them, reconstructing the touch region, and computing its area; the clustering takes, for each point cluster, the outermost touch points (those at maximum radius from the cluster center) as the cluster's result; connecting the result points of each cluster in order yields a connected, closed geometric figure; and the area computation is performed on that closed figure.
6. A courseware system based on multi-screen interaction as claimed in claim 1, wherein the first communication unit encapsulates the interaction transient data and the overall interaction behavior result data in a message protocol whose format is implemented with XML, JSON, or Protocol Buffers, and applies data encryption to ensure secure communication.
CN202010210582.1A 2020-03-24 2020-03-24 Courseware system based on multi-screen interaction Active CN111078117B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010210582.1A CN111078117B (en) 2020-03-24 2020-03-24 Courseware system based on multi-screen interaction


Publications (2)

Publication Number Publication Date
CN111078117A 2020-04-28
CN111078117B (granted) 2020-07-21

Family

ID=70324637

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010210582.1A Active CN111078117B (en) 2020-03-24 2020-03-24 Courseware system based on multi-screen interaction

Country Status (1)

Country Link
CN (1) CN111078117B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108563391A (en) * 2018-04-04 2018-09-21 广州视源电子科技股份有限公司 Wireless screen transmission touch return method and system
CN109101172A (en) * 2017-10-10 2018-12-28 北京仁光科技有限公司 Multi-screen ganged system and its interaction display methods
CN110489177A (en) * 2019-09-03 2019-11-22 惠州Tcl移动通信有限公司 Application control method, apparatus, storage medium and terminal device
CN110837326A (en) * 2019-10-24 2020-02-25 浙江大学 Three-dimensional target selection method based on object attribute progressive expression



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant