CN109857314B - Gesture control method and device for screen equipment - Google Patents

Gesture control method and device for screen equipment

Info

Publication number
CN109857314B
CN109857314B (application CN201811645716.1A)
Authority
CN
China
Prior art keywords
preset
gesture
information
behavior
preset gesture
Prior art date
Legal status
Active
Application number
CN201811645716.1A
Other languages
Chinese (zh)
Other versions
CN109857314A (en)
Inventor
陈果果
钟镭
陈轶博
宋愷晟
李璇
关岱松
张静雅
李思琪
刘星彤
Current Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Shanghai Xiaodu Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Shanghai Xiaodu Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd and Shanghai Xiaodu Technology Co Ltd
Priority to CN201811645716.1A
Publication of CN109857314A
Application granted
Publication of CN109857314B
Legal status: Active

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the application discloses a gesture control method and apparatus for a screen-equipped device. One embodiment of the method comprises: acquiring indication information for associating a first preset gesture in a preset gesture group with a preset manipulation behavior for manipulating the screen-equipped device; and associating the first preset gesture with the preset manipulation behavior in response to detecting positive feedback information from the user on first query information generated according to the indication information, where the first query information is used for querying whether to associate the first preset gesture with the preset manipulation behavior. This implementation enables user-defined association between gestures and manipulation behaviors of the screen-equipped device, so that the user can define which manipulation behavior a gesture triggers according to personal usage habits and needs, improving manipulation efficiency.

Description

Gesture control method and device for screen equipment
Technical Field
The embodiment of the application relates to the technical field of computers, in particular to the field of human-computer interaction, and particularly relates to a gesture control method and device for a screen device.
Background
Non-contact human-computer interaction is a convenient interaction mode with strong manipulation flexibility. Because it places few constraints on the relative position between the user and the electronic device, non-contact interaction meets the user's need for convenient control and is applied in many fields such as smart living and smart office.
The human-computer interaction modes of a screen-equipped device include interaction through an additional wireless transmitting device (such as a remote controller) and voice interaction. Interaction based on an additional device involves many keys (including virtual keys), and the number of operation steps depends on the design of the device's interface; moreover, the user's attention must shift from the screen to the remote controller during operation, so operation efficiency leaves room for improvement. Voice interaction can resolve the user's intention and directly provide the content the user wishes to obtain, but it is not applicable in some scenarios, for example when ambient noise is loud or when the screen-equipped device is playing multimedia at high volume.
Disclosure of Invention
The embodiment of the application provides a gesture control method and device for a screen device.
In a first aspect, an embodiment of the present application provides a gesture control method for a screen-equipped device, including: acquiring indication information for associating a first preset gesture in a preset gesture group with a preset manipulation behavior for manipulating the screen-equipped device; and in response to detecting positive feedback information of the user on first query information generated according to the indication information, associating the first preset gesture with the preset manipulation behavior, where the first query information is used for querying whether to associate the first preset gesture with the preset manipulation behavior.
In some embodiments, the preset gesture group includes at least two gestures with different hand movement directions, and the preset manipulation behavior includes a preset manipulation behavior with a preset direction characteristic; and the acquiring of indication information associating a first preset gesture in the preset gesture group with a preset manipulation behavior for manipulating the screen-equipped device includes: acquiring indication information associating a first preset gesture in the preset gesture group, whose hand movement direction is a first preset direction, with a preset manipulation behavior having a second preset direction characteristic.
In some embodiments, the preset gesture group includes a preset gesture pair, and the hand movement directions of the two gestures in the preset gesture pair are opposite; and the method further includes: in response to detecting negative feedback information of the user on the first query information generated according to the indication information, associating a second preset gesture in the preset gesture pair, whose hand movement direction is opposite to the first preset direction, with the preset manipulation behavior having the second preset direction characteristic.
In some embodiments, the preset gesture pair includes a preset gesture of moving the hand leftward and a preset gesture of moving the hand rightward, or the preset gesture pair includes a preset gesture of moving the hand upward and a preset gesture of moving the hand downward; and the preset manipulation behavior with the preset direction characteristic includes a preset manipulation behavior of switching the content presented by the screen-equipped device according to the preset direction characteristic.
In some embodiments, the above method further comprises: generating second inquiry information in response to detecting negative feedback information of the user on the first inquiry information generated according to the indication information, wherein the second inquiry information is used for inquiring whether a third preset gesture in a preset gesture group is associated with a preset control behavior, and the first preset gesture is different from the third preset gesture; and in response to detecting positive feedback information of the user on the second inquiry information, associating the third preset gesture with the preset manipulation behavior.
In some embodiments, the above method further comprises: presenting manipulation scene information in which the first preset gesture triggers the screen-equipped device to execute the preset manipulation behavior, so that the user sends feedback information on the first query information based on the manipulation scene information.
In a second aspect, an embodiment of the present application provides a gesture control apparatus for a screen-equipped device, including: the device comprises an acquisition unit, a display unit and a display unit, wherein the acquisition unit is configured to acquire indication information for associating a first preset gesture in a preset gesture group with a preset control behavior for controlling the screen-equipped device; the first association unit is configured to associate the first preset gesture with the preset manipulation behavior in response to detecting positive feedback information of the user on first inquiry information generated according to the indication information, wherein the first inquiry information is used for inquiring whether the first preset gesture is associated with the preset manipulation behavior.
In some embodiments, the preset gesture group includes at least two gestures with different hand movement directions, and the preset manipulation behavior includes a preset manipulation behavior with a preset direction characteristic; and the acquisition unit is further configured to: acquire indication information associating a first preset gesture in the preset gesture group, whose hand movement direction is a first preset direction, with a preset manipulation behavior having a second preset direction characteristic.
In some embodiments, the preset gesture group includes a preset gesture pair, and the hand movement directions of the two gestures in the preset gesture pair are opposite; and the apparatus further includes: a second association unit, configured to, in response to detecting negative feedback information of the user on the first query information generated according to the indication information, associate a second preset gesture in the preset gesture pair, whose hand movement direction is opposite to the first preset direction, with the preset manipulation behavior having the second preset direction characteristic.
In some embodiments, the preset gesture pairs include a gesture of moving the hand left and a preset gesture of moving the hand right, or the preset gesture pairs include a preset gesture of moving the hand up and a preset gesture of moving the hand down; and the preset control behavior with the preset direction characteristic comprises a preset control behavior for switching the presented content of the screen equipment according to the preset direction characteristic.
In some embodiments, the above apparatus further comprises: the generating unit is configured to generate second inquiry information in response to detecting negative feedback information of the user on the first inquiry information generated according to the indication information, wherein the second inquiry information is used for inquiring whether a third preset gesture in a preset gesture group is associated with a preset control behavior, and the first preset gesture is different from the third preset gesture; a third associating unit configured to associate a third preset gesture with the preset manipulation behavior in response to detecting positive feedback information of the user to the second query information.
In some embodiments, the above apparatus further comprises: a presenting unit, configured to present manipulation scene information in which the first preset gesture triggers the screen-equipped device to execute the preset manipulation behavior, so that the user sends feedback information on the first query information based on the manipulation scene information.
In a third aspect, an embodiment of the present application provides an electronic device, including: one or more processors; a display device; and a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the gesture control method for a screen-equipped device provided in the first aspect.
In a fourth aspect, the present application provides a computer readable medium, on which a computer program is stored, where the program, when executed by a processor, implements the gesture control method for a screen-equipped device provided in the first aspect.
According to the gesture control method and apparatus for a screen-equipped device provided by the embodiments of the application, indication information associating a first preset gesture in a preset gesture group with a preset manipulation behavior for manipulating the screen-equipped device is acquired, and the first preset gesture is associated with the preset manipulation behavior in response to detecting positive feedback information of the user on first query information generated according to the indication information, where the first query information is used for querying whether to associate the first preset gesture with the preset manipulation behavior. This realizes user-defined association between gestures and manipulation behaviors of the screen-equipped device, allows the user to define which manipulation behavior a gesture triggers according to personal usage habits and needs, and improves manipulation efficiency.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram to which embodiments of the present application may be applied;
FIG. 2 is a flow diagram of one embodiment of a gesture control method for a screen device according to the present application;
FIG. 3 is a flow diagram of another embodiment of a gesture control method of a screen device according to the present application;
FIG. 4 is a schematic diagram of an application scenario of the gesture control method of the screen-mounted device shown in FIG. 3;
FIG. 5 is a flow diagram of yet another embodiment of a gesture control method of a screen device according to the present application;
FIG. 6 is a schematic diagram of an application scenario of the gesture control method of the screen-enabled device shown in FIG. 5;
FIG. 7 is a schematic structural diagram of an embodiment of a gesture control apparatus of a screen device of the present application;
FIG. 8 is a block diagram of a computer system suitable for use in implementing the electronic device of an embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
The gesture control is a flexible and convenient non-contact man-machine interaction mode. In the gesture manipulation scene, a manipulation intention of the user may be recognized through a gesture operation of the user and then responded. In the existing gesture control method, a user initiates a gesture to control according to a predefined correspondence between the gesture and a control command. However, in some scenarios, due to individual differences, the predefined correspondence between the gesture and the control command may not be consistent with the usage habits and the control requirements of the user, so that a misoperation is easily caused during the control or a plurality of gesture adjustments are required to achieve the control intention.
Fig. 1 illustrates an exemplary system architecture to which the gesture control method of the screen device or the gesture control apparatus of the screen device of the present application may be applied.
As shown in fig. 1, the system architecture 100 may include a screen-equipped device 110 and a server 120. The screen-equipped device 110 may interact with the server 120 over a network to receive or send messages and the like. The screen-equipped device 110 may be an electronic device having a display screen, such as a smart television, a smart display, or a screen-equipped smart speaker. Various human-computer interaction applications, such as a browser application, a search application, and a multimedia resource playing application, may be installed on the screen-equipped device.
User 130 may interact with server 120 using a screen device 110 to obtain services provided by server 120. User 130 may control screen device 110 to initiate a service request to server 120 in a variety of ways, such as a non-contact gesture interaction, a voice interaction, an auxiliary device (e.g., remote control) interaction, and so forth.
A human motion sensing device 111, such as an image acquisition device based on visible or infrared light, a ranging device based on laser, sound waves, or similar signals, or a device for three-dimensional modeling, may be disposed on the screen-equipped device 110. The human motion sensing device 111 may collect human motion information and transmit it to a processor of the screen-equipped device 110 or to a server 120 connected to the screen-equipped device 110 for processing.
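For illustration only, the following Python sketch models this sensing pipeline: a sensor object stands in for the human motion sensing device 111 and hands each captured frame to a local or server-side handler. All class and function names here (MotionFrame, MotionSensor, forward_motion) are assumptions made for the sketch and do not come from the patent.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class MotionFrame:
    timestamp_ms: int
    keypoints: List[Tuple[float, float]]  # e.g. 2-D hand keypoints from the sensor

class MotionSensor:
    """Stands in for the human motion sensing device 111 (camera, ranging unit, etc.)."""
    def capture(self) -> MotionFrame:
        # A real device would read from the camera / depth / ranging hardware here.
        return MotionFrame(timestamp_ms=0, keypoints=[])

def forward_motion(sensor: MotionSensor,
                   handler: Callable[[MotionFrame], None]) -> None:
    """Collects one frame and hands it to the local processor or a server-side handler."""
    handler(sensor.capture())

# Example: route frames to a trivial local handler.
forward_motion(MotionSensor(), handler=lambda frame: None)
```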
The server 120 may be a content server that provides the content displayed by the screen-equipped device 110, or a server that provides functional services for the screen-equipped device 110. The server 120 may receive a request sent by the screen-equipped device 110, parse the request, generate response information according to the parsing result, and return the generated response information to the screen-equipped device 110. The screen-equipped device 110 may output the response information.
It should be noted that the gesture control method for the screen-equipped device provided in the embodiment of the present application may be executed by the screen-equipped device 110, and accordingly, the gesture control apparatus for the screen-equipped device may be disposed in the screen-equipped device 110. In these scenarios, the system architecture described above may not include server 120.
In some scenarios, the gesture control method for the screen-equipped device provided in the embodiment of the present application may be performed by the server 120 communicatively connected to the screen-equipped device 110, and accordingly, the gesture control apparatus for the screen-equipped device may be disposed in the server 120 connected to the screen-equipped device 110.
It should be understood that the number of screen devices, servers, and users in fig. 1 is merely illustrative. There may be any number of screened devices, servers, users, as desired for implementation.
With continued reference to FIG. 2, a flow diagram 200 of one embodiment of a gesture control method for a screen device in accordance with the present application is shown. The gesture control method of the screen equipment comprises the following steps:
step 201, acquiring indication information associating a first preset gesture in a preset gesture group with a preset control behavior for controlling the screen-equipped device.
In this embodiment, the executing body of the gesture control method for a screen-equipped device (for example, the screen-equipped device shown in fig. 1) may obtain indication information that associates a first preset gesture in a preset gesture group with a preset manipulation behavior for manipulating the screen-equipped device, where the indication information may be information sent by a user. The preset gesture group may include at least one preset gesture, where a preset gesture is a gesture operation performed without touching the manipulated screen-equipped device, that is, a mid-air gesture for controlling the screen-equipped device. The preset manipulation behavior may be a behavior of performing various types of control on the screen-equipped device, such as browsing control, media playing control, and global control.
In practice, the user may select a gesture from the preset gesture group, or the screen-equipped device may present a gesture from the preset gesture group on its display screen, and the user may send a request to associate that preset gesture with a preset manipulation behavior of the screen-equipped device; the preset gesture group may be obtained in advance. The executing body may take the gesture selected by the user as the first preset gesture and, according to the request, generate indication information associating the first preset gesture with the preset manipulation behavior.
In one exemplary scenario, a screen-equipped device may provide a "setup" function. After entering the "setup" menu, the user may select a preset gesture to be customized, or select the preset gesture group that needs to be customized. The screen-equipped device may demonstrate the association between the user-selected preset gesture or gesture group and a manipulation behavior under the default setting or a previous setting. A preset gesture may then be presented as the first preset gesture, for example a gesture of crossing both arms, and the user is prompted to select, from a preset manipulation behavior library, one manipulation behavior to be associated with the first preset gesture, for example a shutdown operation. Indication information associating the first preset gesture with the manipulation behavior selected by the user may then be generated, for example indication information associating the crossed-arms gesture with the shutdown operation.
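A minimal Python sketch of how such indication information might be represented and produced from the user's selections in the "setup" flow is given below; the type and the identifiers (IndicationInfo, "arms_crossed", "power_off") are hypothetical names chosen for illustration, not names taken from the patent.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class IndicationInfo:
    """The user's request to associate a preset gesture with a manipulation behavior."""
    gesture_id: str   # e.g. "arms_crossed" -- hypothetical identifier
    behavior_id: str  # e.g. "power_off"    -- hypothetical identifier

def build_indication_info(selected_gesture: str, selected_behavior: str) -> IndicationInfo:
    # The gesture chosen in the "setup" menu becomes the first preset gesture;
    # the behavior picked from the preset behavior library is the association target.
    return IndicationInfo(gesture_id=selected_gesture, behavior_id=selected_behavior)

# The shutdown example from the scenario above:
indication = build_indication_info("arms_crossed", "power_off")
```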
Step 202, in response to detecting positive feedback information of the user on the first inquiry information generated according to the indication information, associating the first preset gesture with a preset manipulation behavior.
The first query information is used for querying whether to associate the first preset gesture with the preset manipulation behavior.
In this embodiment, first query information for querying whether to associate the first preset gesture with the preset manipulation behavior may be generated according to the indication information and sent to the user. After acquiring the first inquiry information, the user may send out positive feedback information, i.e. confirm "yes". If positive feedback information sent by the user to the first inquiry information is detected, the first preset gesture and the preset manipulation behavior can be associated.
Here, associating the first preset gesture with the preset manipulation behavior means that the first preset gesture may trigger execution of the preset manipulation behavior. After the setting is completed, if it is detected that the user initiates a first preset gesture, an instruction for executing a preset manipulation behavior on the screen-equipped device may be generated.
The first query information may be presented in the form of a pop-up window containing the content of the query, e.g. the text "whether to associate the gesture with the XXX operation" is contained in the pop-up window. The executing body may also provide a user feedback interface, for example a "confirm" button and a "cancel" button in the pop-up window. If the user clicks the "confirm" button, it is determined that the user has sent positive feedback information on the first query information. If the user clicks the "cancel" button, the user has not sent positive feedback information, and a gesture selection interface or a manipulation behavior selection interface can be returned for the user to reselect and set.
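The confirm/cancel flow just described could be organized roughly as in the following Python sketch, in which a simple callback stands in for the pop-up window and a dictionary holds the stored associations; the final function shows how a stored association later triggers the preset manipulation behavior at runtime. All names are illustrative assumptions.

```python
from typing import Callable, Dict

# gesture_id -> behavior_id: the stored association table
associations: Dict[str, str] = {}

def handle_first_query(gesture_id: str, behavior_id: str,
                       ask_user: Callable[[str], bool]) -> bool:
    """Presents the first query; positive feedback ("confirm") stores the association,
    negative feedback ("cancel") leaves it unset so the caller can reopen the
    selection interface."""
    question = f"Associate gesture '{gesture_id}' with the '{behavior_id}' operation?"
    if ask_user(question):
        associations[gesture_id] = behavior_id
        return True
    return False

def on_gesture_detected(gesture_id: str) -> None:
    """Runtime dispatch: once associated, detecting the gesture triggers the behavior."""
    behavior_id = associations.get(gesture_id)
    if behavior_id is not None:
        print(f"execute manipulation behavior: {behavior_id}")

# Example: simulate the user clicking "confirm" in the pop-up window.
handle_first_query("arms_crossed", "power_off", ask_user=lambda q: True)
on_gesture_detected("arms_crossed")   # -> execute manipulation behavior: power_off
```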
According to the gesture control method for a screen-equipped device provided by this embodiment, indication information associating a first preset gesture in a preset gesture group with a preset manipulation behavior for manipulating the screen-equipped device is acquired, and the first preset gesture is associated with the preset manipulation behavior in response to detecting positive feedback information of the user on first query information generated according to the indication information, where the first query information is used for querying whether to associate the first preset gesture with the preset manipulation behavior. This realizes user-defined association between gestures and manipulation behaviors of the screen-equipped device, allows the user to define which manipulation behavior a gesture triggers according to personal usage habits and needs, and improves manipulation efficiency.
In some optional implementation manners of the foregoing embodiment, before associating the first preset gesture with the preset manipulation behavior in response to detecting positive feedback information of the user on the first query information generated according to the indication information, the process 200 of the gesture control method for a screen-equipped device may further include: presenting manipulation scene information in which the first preset gesture triggers the screen-equipped device to execute the preset manipulation behavior, so that the user sends feedback information on the first query information based on the manipulation scene information.
Here, the manipulation scene information in which the first preset gesture triggers the screen-equipped device to execute the preset manipulation behavior may include a dynamic diagram of the manipulation scene, or description information of the manipulation scene combined with the dynamic diagram. In the manipulation scene, after the user initiates the first preset gesture, the screen-equipped device is triggered to execute the preset manipulation behavior. The manipulation scene information may be pre-stored in the executing body. The executing body may pre-store the manipulation scene information of each preset manipulation behavior triggered by each preset gesture, and identify the different pre-stored manipulation scene information by the name of the corresponding preset gesture and the name of the preset manipulation behavior. In this way, the corresponding manipulation scene information can be found according to this identifier and presented to the user before the first preset gesture and the preset manipulation behavior are associated, so that the user can more accurately judge whether the indication information meets the user's requirements and expectations, avoiding misoperation caused by association errors.
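The following Python sketch shows one way the pre-stored manipulation scene information could be keyed by the gesture name and the manipulation behavior name and looked up before the first query is answered; the dictionary entries and file names are invented for illustration.

```python
from typing import Dict, Optional, Tuple

# Pre-stored manipulation scene information, identified by the names of the
# preset gesture and the preset manipulation behavior (hypothetical entries).
SCENE_INFO: Dict[Tuple[str, str], str] = {
    ("swipe_left", "page_backward"): "demo_swipe_left_page_backward.gif",
    ("arms_crossed", "power_off"): "demo_arms_crossed_power_off.gif",
}

def find_scene_info(gesture_id: str, behavior_id: str) -> Optional[str]:
    """Returns the demo clip to present before the first query is answered, if any."""
    return SCENE_INFO.get((gesture_id, behavior_id))
```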
Referring to FIG. 3, shown is a flow diagram of another embodiment of a gesture control method for a screen device according to the present application. As shown in fig. 3, a process 300 of the gesture control method for a screen-equipped device of the present embodiment may include the following steps:
Step 301, acquiring indication information associating a first preset gesture in the preset gesture group, whose hand movement direction is a first preset direction, with a preset manipulation behavior having a second preset direction characteristic.
In this embodiment, the preset gesture group includes at least two gestures with different hand movement directions, that is, the movement directions of the preset gestures in the preset gesture group differ; for example, the preset gesture group may include a hand left-moving gesture, a hand right-moving gesture, a hand upward-moving gesture, and a hand downward-moving gesture. Optionally, the hand postures of the preset gestures in the same preset gesture group are the same, that is, the preset gesture group includes at least two gestures with the same hand posture and different hand movement directions.
The preset manipulation behavior comprises a preset manipulation behavior with a preset directional characteristic. Here, the directional characteristic of the manipulation behavior may represent a change direction of a manipulation object for which the manipulation behavior is directed, or a change direction of a presentation attribute of a manipulation object for which the operation behavior is directed. In the manipulation for the existing device, the manipulation behavior having the directional characteristic may include a browsing content switching operation, a playing content switching operation, a brightness adjustment operation, a window scaling operation, and the like. As an example, in a browsing operation, "browse previous page" may represent a manipulation intention to browse forward, and the corresponding manipulation direction feature is a forward manipulation direction feature; "browsing the next page" may represent a manipulation intention to browse backward, and the corresponding manipulation direction feature is a backward manipulation direction feature. The second preset direction may be a designated direction, and the control behavior having the second preset direction characteristic is a control behavior for switching the control object and adjusting the presentation attribute of the control object along or towards the designated direction.
In this embodiment, the executing body of the gesture control method for a screen-equipped device (for example, the screen-equipped device shown in fig. 1) may obtain indication information indicating that a first preset gesture in the preset gesture group, whose hand movement direction is a first preset direction, is to be associated with a preset manipulation behavior having a second preset direction characteristic. For example, the information may indicate that the first preset gesture of "extending the index finger and sliding left" is associated with the backward browsing manipulation behavior, or that this gesture is associated with the forward browsing manipulation behavior. As another example, the information may associate a first preset gesture of swiping a finger upward with the manipulation behavior of browsing downward, or associate a first preset gesture of swiping a finger downward with the manipulation behavior of browsing downward.
In one exemplary scenario, a screen-equipped device may provide a "setup" function. After entering the "setup" menu, the user may select, from the preset gesture group, a first preset gesture to be customized whose hand movement direction is a first preset direction, for example a gesture of sliding left. The screen-equipped device may demonstrate the manipulation behavior associated with the first preset gesture under the default setting or a previous setting. A dynamic diagram of the first preset gesture may then be presented, and the user is prompted to select, from a preset manipulation behavior library, a manipulation behavior to be associated with the first preset gesture; the user may select a manipulation behavior with a second preset direction characteristic, such as backward browsing. Thereafter, indication information associating the first preset gesture with the user-selected manipulation behavior having the second preset direction characteristic may be generated, for example indication information associating a finger left-swipe gesture with backward browsing.
Step 302, in response to detecting positive feedback information of the user on the first inquiry information generated according to the indication information, associating the first preset gesture with a preset manipulation behavior.
The first query information is used for querying whether to associate the first preset gesture with the preset manipulation behavior.
In this embodiment, first query information for querying whether to associate the first preset gesture, whose hand movement direction is the first preset direction, with the preset manipulation behavior having the second preset direction characteristic may be generated according to the indication information acquired in step 301, and sent to the user. After acquiring the first query information, the user may send positive feedback information, i.e. confirm "yes". If positive feedback information sent by the user on the first query information is detected, the first preset gesture whose hand movement direction is the first preset direction may be associated with the preset manipulation behavior having the second preset direction characteristic.
The first query information may be presented in the form of a pop-up window containing the content of the query, and may further contain a user feedback interface. The user may send positive feedback information on the first query information to the execution main body through the feedback interface, or may send negative feedback information on the first query information to the execution main body through the feedback interface. The execution main body can acquire the feedback information of the user through the feedback interface.
In this embodiment, indication information associating a first preset gesture whose hand movement direction is a first preset direction with a preset manipulation behavior having a second preset direction characteristic is acquired, the user is asked through the first query information whether to associate the first preset gesture with the preset manipulation behavior, and the two are associated when the user confirms, thereby achieving user-defined association between gestures with direction characteristics and manipulation behaviors with direction characteristics. Because different users may differ in how they habitually relate gesture directions to manipulation directions, this embodiment allows each user to customize the correspondence between gesture direction and manipulation direction according to personal habit, which improves the flexibility of gesture manipulation; once the association between gesture and manipulation behavior is determined, gesture operations can be performed according to the user-defined association, providing a manipulation mode that matches the user's habits and improving operation efficiency.
In some optional implementation manners of this embodiment, the preset gesture group may include a preset gesture pair, and hand movement directions of two preset gestures in the preset gesture pair are opposite. At this time, the process 300 of the gesture control method for the screen-equipped device may further include:
Step 303, in response to detecting negative feedback information of the user on the first query information generated according to the indication information, associating a second preset gesture, whose hand movement direction is opposite to the first preset direction, with the preset manipulation behavior having the second preset direction characteristic.
Specifically, if, after receiving the first query information, the user determines that the association between the first preset gesture and the preset manipulation behavior having the second preset direction characteristic is inconsistent with the desired association, negative feedback information may be issued, for example by selecting "no" in the feedback page provided with the first query information. If the executing body detects negative feedback information of the user on the first query information, it may associate the other preset gesture in the gesture pair to which the first preset gesture belongs, namely a second preset gesture whose hand movement direction is opposite to the first preset direction, with the preset manipulation behavior having the second preset direction characteristic.
In practice, if the user determines that the gesture of the hand moving in a certain direction is not associated with the preset manipulation behavior, it may be determined that the user's intention is to associate the gesture in the opposite direction with the preset manipulation behavior, and the gesture of the same pair of gestures moving in the opposite direction may be directly associated with the preset manipulation behavior without asking the user again whether to associate the gesture of the same pair of gestures moving in the opposite direction with the preset manipulation behavior. Therefore, the association efficiency of the gestures and the manipulation behaviors can be improved.
Optionally, the hand postures of the two gestures in the preset gesture pair are the same.
Further optionally, after one preset gesture in the preset gesture pair is associated with the preset manipulation behavior having the second preset direction characteristic, the other preset gesture in the pair may also be associated with the preset manipulation behavior having the direction characteristic opposite to the second preset direction. That is, a pair of hand movements with opposite directions is associated with a pair of manipulation behaviors with opposite direction characteristics. In this way, hand movements in different directions under the same hand posture can respectively correspond to manipulation behaviors with two different direction characteristics, which improves the convenience of gesture manipulation.
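The paired-direction behavior described in the preceding paragraphs might look roughly like the following Python sketch, under the assumption that gestures and manipulation behaviors are identified by plain strings; the opposite-direction tables and identifiers are illustrative, not taken from the patent.

```python
from typing import Dict

# Gestures of the same pair share a hand posture and move in opposite directions.
OPPOSITE_GESTURE: Dict[str, str] = {
    "swipe_left": "swipe_right", "swipe_right": "swipe_left",
    "swipe_up": "swipe_down", "swipe_down": "swipe_up",
}
# Manipulation behaviors with opposite direction characteristics.
OPPOSITE_BEHAVIOR: Dict[str, str] = {
    "page_backward": "page_forward", "page_forward": "page_backward",
}

def associate_directional_pair(first_gesture: str, behavior: str,
                               positive_feedback: bool,
                               table: Dict[str, str]) -> None:
    """On positive feedback, bind the first preset gesture to the behavior; on
    negative feedback, bind the opposite gesture of the same pair instead, without
    asking again. The remaining gesture of the pair is bound to the behavior with
    the opposite direction characteristic."""
    chosen = first_gesture if positive_feedback else OPPOSITE_GESTURE[first_gesture]
    table[chosen] = behavior
    table[OPPOSITE_GESTURE[chosen]] = OPPOSITE_BEHAVIOR[behavior]

# Example: the user rejects "swipe_left" -> "page_backward", so "swipe_right"
# gets "page_backward" and "swipe_left" gets "page_forward".
table: Dict[str, str] = {}
associate_directional_pair("swipe_left", "page_backward",
                           positive_feedback=False, table=table)
```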
Optionally, the preset gesture pairs include a gesture of moving the hand leftward and a preset gesture of moving the hand rightward. Or the preset gesture pair comprises a preset gesture of upward movement of the hand and a preset gesture of downward movement of the hand. And the preset control behavior with the preset direction characteristic comprises a preset control behavior for switching the display content of the screen-equipped device according to the preset direction characteristic. The preset manipulation behavior for switching the presentation content of the screen-equipped device according to the preset direction characteristic may specifically be a manipulation behavior for switching a browsing page, presenting different parts of content in a loaded page, or switching a playing content according to a switching sequence indicated by the preset direction. Therefore, the control behaviors of a pair of switching presentation contents with different direction characteristics can be associated through the gesture pairs in the up-down direction or the gesture pairs in the left-right direction, the difference of individual operation habits can be better adapted through the association of the simple gesture direction and the control direction, and the control efficiency is improved.
In a further embodiment, similar to the process 200, before the step of associating the first preset gesture with the preset manipulation behavior in response to detecting positive feedback information of the user on the first query information generated according to the indication information, the process 300 of the gesture control method for a screen-equipped device may further include: presenting manipulation scene information in which the first preset gesture triggers the screen-equipped device to execute the preset manipulation behavior, so that the user sends feedback information on the first query information based on the manipulation scene information.
With continued reference to FIG. 4, a schematic diagram of an application scenario of the gesture control method of the screen-equipped device shown in FIG. 3 is shown. The preset gesture group includes a gesture of waving to the left and a gesture of waving to the right, and the preset manipulation behavior includes sliding the page to the left. As shown in fig. 4, after acquiring the indication information in which the user associates the waving-left gesture 401 with the manipulation behavior 402 of sliding the page to the left, a dynamic gesture manipulation diagram showing the waving-left gesture controlling the page presented by the screen-equipped device to slide to the left can be presented on the screen-equipped device 400, and the first query information 403 can also be presented on the screen-equipped device 400. The first query information is a text query of "whether to confirm associating the gesture with the operation" and includes a user feedback interface: the option entries "yes" and "no". If the user is detected to select the "yes" option entry, the waving-left gesture may be associated with sliding the page to the left. Alternatively, if the user selects the "no" option entry, the waving-right gesture may be associated with sliding the page to the left.
As can be seen from fig. 4, the gesture control method of the embodiment can associate gesture operations with different hand movement directions with operation behaviors with different directional characteristics, and can prompt a user by presenting scene information of corresponding operation behaviors triggered by gestures, thereby more intuitively and efficiently helping the user to realize the association between the gestures and the operation behaviors.
With continued reference to FIG. 5, shown is a flow diagram of yet another embodiment of a gesture control method of a screen device according to the present application. As shown in fig. 5, a process 500 of the gesture control method for a screen-equipped device of the present embodiment includes the following steps:
step 501, acquiring indication information associating a first preset gesture in a preset gesture group with a preset manipulation behavior for manipulating a device with a screen.
Step 502, in response to detecting positive feedback information of the user on the first query information generated according to the indication information, associating the first preset gesture with a preset manipulation behavior.
The first query information is used for querying whether to associate the first preset gesture with the preset manipulation behavior.
The steps 501 and 502 correspond to the steps 201 and 202 of the foregoing embodiment, respectively, and specific implementation manners of the steps 501 and 502 may refer to the foregoing description of the steps 201 and 202, which is not described herein again.
In some embodiments, the step 501 and the step 502 may also be performed according to the manner of the step 301 and the step 302 in the foregoing embodiments, and the specific implementation manner may refer to the description of the step 301 and the step 302, which is not described herein again.
Step 503, in response to detecting negative feedback information of the user to the first query information generated according to the indication information, generating second query information.
The second query information is used for querying whether a third preset gesture in the preset gesture group is associated with the preset control behavior, and the first preset gesture is different from the third preset gesture.
If negative feedback information of the user on the first inquiry information generated according to the indication information is detected, it can be determined that the user does not wish to associate the first preset gesture with the preset manipulation behavior. At this time, one preset gesture may be reselected from the preset gesture group as a third preset gesture, and second query information inquiring whether to associate the third preset gesture with the preset manipulation behavior is generated.
And step 504, in response to detecting positive feedback information of the user on the second inquiry information, associating the third preset gesture with the preset manipulation behavior.
If the user sends positive feedback information to the second inquiry information, that is, the user confirms that the third preset gesture is associated with the preset control behavior, the third preset gesture can be associated with the preset control behavior, and the association relationship between the third preset gesture and the preset control behavior is stored, so that in the subsequent gesture-based control, when the user initiates the third preset gesture, the preset control behavior associated with the third preset gesture is executed.
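A compact Python sketch of the query loop in steps 501 to 504 follows, under the simplifying assumption that the executing body keeps proposing further gestures from the preset gesture group until the user confirms one; the function and identifiers are illustrative.

```python
from typing import Callable, Dict, Iterable, Optional

def associate_by_queries(gesture_group: Iterable[str], behavior: str,
                         first_gesture: str,
                         ask_user: Callable[[str], bool],
                         table: Dict[str, str]) -> Optional[str]:
    """Asks the first query about the first preset gesture; on negative feedback,
    generates further queries for other gestures in the group (the "third preset
    gesture" above) until one is confirmed or the group is exhausted."""
    candidates = [first_gesture] + [g for g in gesture_group if g != first_gesture]
    for gesture in candidates:
        if ask_user(f"Associate gesture '{gesture}' with '{behavior}'?"):
            table[gesture] = behavior
            return gesture
    return None

# Example: the user rejects the waving-left gesture and confirms the waving-right one.
answers = iter([False, True])
table: Dict[str, str] = {}
associate_by_queries(["wave_left", "wave_right"], "page_left", "wave_left",
                     ask_user=lambda q: next(answers), table=table)
# table == {"wave_right": "page_left"}
```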
According to the gesture control method for a screen-equipped device of this embodiment, when the user decides not to associate one gesture with the preset manipulation behavior, query information for associating another gesture with the preset manipulation behavior can be provided immediately, so that the gesture the user wishes to associate with the preset manipulation behavior can be found quickly without requiring the user to go back and reselect a gesture, which improves the efficiency of the association operation.
Optionally, when the preset gesture group contains only two preset gestures, generating the second query information, detecting the user's feedback on it, and then associating the gesture with the preset manipulation behavior as the user expects based on that feedback can further improve the efficiency of the association operation.
Referring to fig. 6, a schematic diagram of an application scenario of the gesture control method shown in fig. 5 is shown. As shown in fig. 6, on the basis of the scenario shown in fig. 4, if the feedback information of the user on the first query information 403 is "no", second query information 404 may be generated, which is used to query whether to associate another gesture in the preset gesture group to which the waving-left gesture 401 belongs, namely the waving-right gesture 405, with the preset manipulation behavior 402 of sliding the page to the left. At this time, the screen-equipped device 400 may also present a dynamic gesture manipulation diagram demonstrating the waving-right gesture controlling the page presented by the screen-equipped device to slide to the left. The second query information 404 may also include a user feedback interface: the option entries "yes" and "no". If it is detected that the user selects the "yes" option entry for the second query information, the waving-right gesture 405 may be associated with sliding the page to the left.
As can be seen from fig. 6, in the gesture control method of this embodiment, when the user determines from the first query information that the indication information associating the current first preset gesture with the preset manipulation behavior does not match the user's expectation, a query about associating another gesture with the preset manipulation behavior, together with a feedback interface for responding to it, is quickly provided, which helps the user quickly complete the establishment of the association between gesture operation and manipulation behavior.
With further reference to fig. 7, as an implementation of the methods shown in the above-mentioned figures, the present application provides an embodiment of a gesture control apparatus for a screen-equipped device, where the apparatus embodiment corresponds to the method embodiments shown in fig. 2, fig. 3, and fig. 5, and the apparatus may be applied to various electronic devices.
As shown in fig. 7, the gesture control apparatus 700 of the screen device of the present embodiment includes an obtaining unit 701 and a first associating unit 702. The acquiring unit 701 is configured to acquire indication information associating a first preset gesture in a preset gesture group with a preset manipulation behavior for manipulating a device with a screen; the first associating unit 702 is configured to associate the first preset gesture with the preset manipulation behavior in response to detecting positive feedback information of the user on first query information generated according to the indication information, wherein the first query information is used for querying whether to associate the first preset gesture with the preset manipulation behavior.
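For illustration only, the unit structure of apparatus 700 might be organized along the following lines in Python; the class name and method signatures are assumptions made for this sketch rather than definitions from the patent.

```python
from typing import Callable, Dict, Tuple

class GestureControlApparatus:
    """Loosely mirrors apparatus 700: an acquisition step that obtains the indication
    information and a first-association step that stores the mapping once positive
    feedback on the first query is detected."""

    def __init__(self, ask_user: Callable[[str], bool]):
        self.ask_user = ask_user                  # feedback interface, e.g. the pop-up window
        self.associations: Dict[str, str] = {}    # gesture_id -> behavior_id

    def acquisition_unit(self, gesture_id: str, behavior_id: str) -> Tuple[str, str]:
        # Obtains the indication information from the user's selection.
        return (gesture_id, behavior_id)

    def first_association_unit(self, indication: Tuple[str, str]) -> bool:
        gesture_id, behavior_id = indication
        query = f"Associate gesture '{gesture_id}' with the '{behavior_id}' operation?"
        if self.ask_user(query):                  # positive feedback on the first query
            self.associations[gesture_id] = behavior_id
            return True
        return False

# Example usage with a stand-in feedback callback that always confirms.
apparatus = GestureControlApparatus(ask_user=lambda q: True)
apparatus.first_association_unit(apparatus.acquisition_unit("wave_left", "page_left"))
```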
In some embodiments, the preset gesture group may include at least two gestures with different hand movement directions, and the preset manipulation behavior may include a preset manipulation behavior with a preset direction characteristic; and the obtaining unit 701 may be further configured to: acquire indication information associating a first preset gesture in the preset gesture group, whose hand movement direction is a first preset direction, with a preset manipulation behavior having a second preset direction characteristic.
In some embodiments, the preset gesture group may include a preset gesture pair, where hand movement directions of two gestures in the preset gesture pair are opposite;
the apparatus 700 may further include: a second association unit, configured to, in response to detecting negative feedback information of the user on the first query information generated according to the indication information, associate a second preset gesture in the preset gesture pair, whose hand movement direction is opposite to the first preset direction, with the preset manipulation behavior having the second preset direction characteristic.
In some embodiments, the preset gesture pair may include a gesture of moving the hand left and a preset gesture of moving the hand right, or the preset gesture pair may include a preset gesture of moving the hand up and a preset gesture of moving the hand down; and the preset control behavior with the preset direction characteristic comprises a preset control behavior for switching the presented content of the screen equipment according to the preset direction characteristic.
In some embodiments, the apparatus 700 may further include: the generating unit is configured to generate second inquiry information in response to detecting negative feedback information of the user on the first inquiry information generated according to the indication information, wherein the second inquiry information is used for inquiring whether a third preset gesture in a preset gesture group is associated with a preset control behavior, and the first preset gesture is different from the third preset gesture; a third associating unit configured to associate a third preset gesture with the preset manipulation behavior in response to detecting positive feedback information of the user to the second query information.
In some embodiments, the apparatus 700 may further include: a presenting unit, configured to present manipulation scene information in which the first preset gesture triggers the screen-equipped device to execute the preset manipulation behavior, so that the user sends feedback information on the first query information based on the manipulation scene information.
It should be understood that the units recited in the apparatus 700 correspond to the various steps in the methods described with reference to fig. 2, 3, 5. Thus, the operations and features described above for the method are equally applicable to the apparatus 700 and the units included therein, and are not described in detail here.
According to the gesture control apparatus 700 for a screen-equipped device of this embodiment, indication information associating a first preset gesture in a preset gesture group with a preset manipulation behavior for manipulating the screen-equipped device is acquired, and the first preset gesture is associated with the preset manipulation behavior in response to detecting positive feedback information of the user on first query information generated according to the indication information, where the first query information is used for querying whether to associate the first preset gesture with the preset manipulation behavior. This realizes user-defined association between gestures and manipulation behaviors of the screen-equipped device, allows the user to define which manipulation behavior a gesture triggers according to personal usage habits and needs, and improves manipulation efficiency.
An embodiment of the present application further provides an electronic device, including: one or more processors; a display device; and a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the gesture control method for a screen-equipped device of the above embodiments.
Referring now to FIG. 8, shown is a block diagram of a computer system 800 suitable for use in implementing the electronic device of an embodiment of the present application. The electronic device shown in fig. 8 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 8, the computer system 800 includes a Central Processing Unit (CPU) 801 that can perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 802 or a program loaded from a storage section 808 into a Random Access Memory (RAM) 803. In the RAM 803, various programs and data necessary for the operation of the system 800 are also stored. The CPU 801, ROM 802, and RAM 803 are connected to each other via a bus 804. An input/output (I/O) interface 805 is also connected to the bus 804.
The following components are connected to the I/O interface 805: an input section 806 including a keyboard, a mouse, and the like; an output section 807 including a display such as a Cathode Ray Tube (CRT) or a Liquid Crystal Display (LCD), a speaker, and the like; a storage section 808 including a hard disk and the like; and a communication section 809 including a network interface card such as a LAN card or a modem. The communication section 809 performs communication processing via a network such as the Internet. A drive 810 is also connected to the I/O interface 805 as necessary. A removable medium 811, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 810 as necessary, so that a computer program read out from it can be installed into the storage section 808 as necessary.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program can be downloaded and installed from a network through the communication section 809 and/or installed from the removable medium 811. The computer program performs the above-described functions defined in the method of the present application when executed by the Central Processing Unit (CPU) 801. It should be noted that the computer readable medium of the present application can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, or C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or by hardware. The described units may also be provided in a processor, which may, for example, be described as: a processor comprising an acquisition unit and a first association unit. The names of these units do not, in some cases, constitute a limitation on the units themselves; for example, the acquisition unit may also be described as a unit for acquiring indication information for associating a first preset gesture in a preset gesture group with a preset control behavior for controlling the screen-equipped device.
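To illustrate the unit arrangement described above, the following is a minimal, hypothetical sketch in Python. The class names (AcquisitionUnit, FirstAssociationUnit, GestureControlProcessor) and the dictionary-based association table are assumptions made for exposition only and are not identifiers or structures defined in the present application.

```python
# A minimal, hypothetical sketch of the unit layout described above.
# All names below are illustrative assumptions, not identifiers from
# the present application.

class AcquisitionUnit:
    """Acquires indication information that requests associating a preset
    gesture with a preset control behavior."""

    def acquire(self, gesture_id, behavior_id):
        # In practice the indication information might come from a voice
        # command or a settings screen; here it is passed in directly.
        return {"gesture": gesture_id, "behavior": behavior_id}


class FirstAssociationUnit:
    """Associates the gesture with the behavior once positive feedback on
    the first inquiry information has been detected."""

    def __init__(self):
        self.association_table = {}  # gesture id -> behavior id

    def associate(self, indication, positive_feedback):
        if positive_feedback:
            self.association_table[indication["gesture"]] = indication["behavior"]
        return self.association_table


class GestureControlProcessor:
    """Mirrors 'a processor comprising an acquisition unit and a first
    association unit' from the description above."""

    def __init__(self):
        self.acquisition_unit = AcquisitionUnit()
        self.first_association_unit = FirstAssociationUnit()
```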
As another aspect, the present application also provides a computer-readable medium, which may be contained in the apparatus described in the above embodiments, or may exist separately without being assembled into the apparatus. The computer readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to: acquire indication information for associating a first preset gesture in a preset gesture group with a preset control behavior for controlling a screen-equipped device; and, in response to detecting positive feedback information of a user on first inquiry information generated according to the indication information, associate the first preset gesture with the preset control behavior, wherein the first inquiry information is used for inquiring whether to associate the first preset gesture with the preset control behavior.
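The program flow carried on such a medium could be sketched, under the same caveat, roughly as follows. ask_user(), handle_indication(), and the example gesture and behavior names are assumptions for illustration only; a real screen-equipped device would present the first inquiry information on screen or by voice rather than through a console prompt.

```python
# A hypothetical sketch of the program flow described above; names are
# assumptions, not part of the present application.

def ask_user(inquiry_text: str) -> bool:
    """Stand-in for presenting first inquiry information and detecting
    the user's positive or negative feedback information."""
    return input(f"{inquiry_text} [y/n] ").strip().lower() == "y"


def handle_indication(indication: dict, association_table: dict) -> dict:
    gesture = indication["gesture"]    # e.g. "palm_move_left" (assumed name)
    behavior = indication["behavior"]  # e.g. "previous_page" (assumed name)

    # First inquiry information: confirm before creating the association.
    inquiry = f"Associate gesture '{gesture}' with behavior '{behavior}'?"
    if ask_user(inquiry):  # positive feedback information detected
        association_table[gesture] = behavior
    return association_table


if __name__ == "__main__":
    table = {}
    handle_indication({"gesture": "palm_move_left",
                       "behavior": "previous_page"}, table)
    print(table)
```

Extending this sketch with the negative-feedback branch of claim 1 (generating second inquiry information for a different preset gesture in the group) would amount to adding an else clause around a second call to ask_user().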
The above description is merely a preferred embodiment of the present application and an illustration of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention disclosed herein is not limited to the particular combination of features described above, but also encompasses other arrangements formed by any combination of the above features or their equivalents without departing from the spirit of the invention. For example, the above features may be replaced with (but are not limited to) technical features having similar functions disclosed in the present application.

Claims (12)

1. A gesture control method for a screen-equipped device, comprising:
acquiring indication information for associating a first preset gesture in a preset gesture group with a preset control behavior for controlling the screen-equipped device, wherein the preset gesture group comprises at least two gestures having the same hand posture and different hand movement directions;
in response to detecting positive feedback information of a user on first inquiry information generated according to the indication information, associating the first preset gesture with the preset control behavior, wherein the first inquiry information is used for inquiring whether to associate the first preset gesture with the preset control behavior;
generating second inquiry information in response to detecting negative feedback information of the user on the first inquiry information generated according to the indication information, wherein the second inquiry information is used for inquiring whether to associate a third preset gesture in the preset gesture group with the preset control behavior, and the first preset gesture is different from the third preset gesture; and in response to detecting positive feedback information of the user on the second inquiry information, associating the third preset gesture with the preset control behavior.
2. The method of claim 1, wherein the preset control behavior comprises a preset control behavior having a preset direction characteristic;
the acquiring of the indication information associating the first preset gesture in the preset gesture group with the preset control behavior for controlling the screen-equipped device includes:
acquiring indication information for associating a first preset gesture in the preset gesture group, whose hand movement direction is a first preset direction, with a preset control behavior having a second preset direction characteristic.
3. The method of claim 2, wherein the preset gesture group comprises a preset gesture pair, and the hand movement directions of the two gestures in the preset gesture pair are opposite;
the method further comprises the following steps:
in response to detecting negative feedback information of the user on the first inquiry information generated according to the indication information, associating a second preset gesture, in which the middle hand movement direction of the preset gesture is opposite to the first preset movement direction, with the preset manipulation behavior with a second preset direction characteristic.
4. The method according to claim 3, wherein the preset gesture pair comprises a gesture of moving the hand leftward and a gesture of moving the hand rightward, or the preset gesture pair comprises a gesture of moving the hand upward and a gesture of moving the hand downward; and
the preset control behavior having the preset direction characteristic comprises a preset control behavior for switching the display content of the screen-equipped device according to the preset direction characteristic.
5. The method of any of claims 1-4, wherein the method further comprises:
presenting control scenario information of the first preset gesture triggering execution of the preset control behavior on the screen-equipped device, so that a user can send feedback information on the first inquiry information based on the control scenario information.
6. A gesture control apparatus for a screen-equipped device, comprising:
an acquisition unit configured to acquire indication information for associating a first preset gesture in a preset gesture group with a preset control behavior for controlling the screen-equipped device, wherein the preset gesture group comprises at least two gestures having the same hand posture and different hand movement directions;
a first association unit configured to associate the first preset gesture with the preset control behavior in response to detecting positive feedback information of a user on first inquiry information generated according to the indication information, wherein the first inquiry information is used for inquiring whether to associate the first preset gesture with the preset control behavior;
a generation unit configured to generate second inquiry information in response to detecting negative feedback information of the user on the first inquiry information generated according to the indication information, wherein the second inquiry information is used for inquiring whether to associate a third preset gesture in the preset gesture group with the preset control behavior, and the first preset gesture is different from the third preset gesture;
a third association unit configured to associate the third preset gesture with the preset control behavior in response to detecting positive feedback information of the user on the second inquiry information.
7. The apparatus of claim 6, wherein the preset control behavior comprises a preset control behavior having a preset direction characteristic;
the acquisition unit is further configured to:
acquire indication information for associating a first preset gesture in the preset gesture group, whose hand movement direction is a first preset direction, with a preset control behavior having a second preset direction characteristic.
8. The apparatus of claim 7, wherein the preset gesture group comprises a preset gesture pair, and the hand movement directions of the two gestures in the preset gesture pair are opposite;
the apparatus further comprises:
a second association unit configured to associate a second preset gesture in the preset gesture pair, whose hand movement direction is opposite to the first preset direction, with the preset control behavior having the second preset direction characteristic, in response to detecting negative feedback information of the user on the first inquiry information generated according to the indication information.
9. The apparatus of claim 8, wherein the preset gesture pair comprises a gesture of moving the hand leftward and a gesture of moving the hand rightward, or the preset gesture pair comprises a gesture of moving the hand upward and a gesture of moving the hand downward; and
the preset control behavior having the preset direction characteristic comprises a preset control behavior for switching the display content of the screen-equipped device according to the preset direction characteristic.
10. The apparatus of any of claims 6-9, wherein the apparatus further comprises:
a presentation unit configured to present control scenario information of the first preset gesture triggering execution of the preset control behavior on the screen-equipped device, so that a user can send feedback information on the first inquiry information based on the control scenario information.
11. An electronic device, comprising:
one or more processors;
a display device;
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-5.
12. A computer-readable medium, on which a computer program is stored, wherein the program, when executed by a processor, implements the method of any one of claims 1-5.
CN201811645716.1A 2018-12-29 2018-12-29 Gesture control method and device for screen equipment Active CN109857314B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811645716.1A CN109857314B (en) 2018-12-29 2018-12-29 Gesture control method and device for screen equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811645716.1A CN109857314B (en) 2018-12-29 2018-12-29 Gesture control method and device for screen equipment

Publications (2)

Publication Number Publication Date
CN109857314A CN109857314A (en) 2019-06-07
CN109857314B true CN109857314B (en) 2021-06-29

Family

ID=66893439

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811645716.1A Active CN109857314B (en) 2018-12-29 2018-12-29 Gesture control method and device for screen equipment

Country Status (1)

Country Link
CN (1) CN109857314B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116560509A (en) * 2023-05-17 2023-08-08 山东格物智能科技有限公司 Man-machine interaction system and method based on visual core algorithm

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7603381B2 (en) * 2004-09-30 2009-10-13 Microsoft Corporation Contextual action publishing
DE102011075467A1 (en) * 2011-05-06 2012-11-08 Deckel Maho Pfronten Gmbh DEVICE FOR OPERATING AN AUTOMATED MACHINE FOR HANDLING, ASSEMBLING OR MACHINING WORKPIECES
CN103135746B (en) * 2011-11-25 2018-01-02 夏普株式会社 Non-contact control method, system and equipment based on static posture and dynamic posture
CN102710908A (en) * 2012-05-31 2012-10-03 无锡商业职业技术学院 Device for controlling television based on gesture
US9047508B2 (en) * 2012-11-07 2015-06-02 Xerox Corporation System and method for identifying and acting upon handwritten action items
US9884257B2 (en) * 2013-03-06 2018-02-06 Tencent Technology (Shenzhen) Company Limited Method for preventing misoperations of intelligent terminal, and intelligent terminal
JP5677623B1 (en) * 2013-08-21 2015-02-25 三菱電機株式会社 Program editing program
CN103778361A (en) * 2014-01-16 2014-05-07 宇龙计算机通信科技(深圳)有限公司 Electronic equipment and function setting method thereof
CN105468270A (en) * 2014-08-18 2016-04-06 腾讯科技(深圳)有限公司 Terminal application control method and device
CN105138263A (en) * 2015-08-17 2015-12-09 百度在线网络技术(北京)有限公司 Method and device for jumping to specific page in application
CN108271078A (en) * 2018-03-07 2018-07-10 康佳集团股份有限公司 Pass through voice awakening method, smart television and the storage medium of gesture identification

Also Published As

Publication number Publication date
CN109857314A (en) 2019-06-07

Similar Documents

Publication Publication Date Title
CN111752442B (en) Method, device, terminal and storage medium for displaying operation guide information
RU2581840C2 (en) Registration for system level search user interface
CN108132744B (en) Method and equipment for remotely controlling intelligent equipment
US20130234959A1 (en) System and method for linking and controlling terminals
CN109725724B (en) Gesture control method and device for screen equipment
CN104484193A (en) Method and device for rapidly starting application program
CN106201219A (en) The quick call method of function of application and system
US11706476B2 (en) User terminal apparatus, electronic apparatus, system, and control method thereof
CN111324252B (en) Display control method and device in live broadcast platform, storage medium and electronic equipment
CN111263175A (en) Interaction control method and device for live broadcast platform, storage medium and electronic equipment
US20230291955A1 (en) User terminal apparatus, electronic apparatus, system, and control method thereof
CN111221456A (en) Interactive panel display method, device, equipment and storage medium thereof
US20170185422A1 (en) Method and system for generating and controlling composite user interface control
CN111565320A (en) Barrage-based interaction method and device, storage medium and electronic equipment
CN104822078A (en) Shielding method and apparatus for video caption
WO2022017421A1 (en) Interaction method, display device, emission device, interaction system, and storage medium
CN111061452A (en) Voice control method and device of user interface
CN109857314B (en) Gesture control method and device for screen equipment
CN113918229A (en) Operation guiding method and device of application program, guiding file generation method and related device
WO2020221076A1 (en) Hosted application generation method and device
CN112866798A (en) Video generation method, device, equipment and storage medium
US10613622B2 (en) Method and device for controlling virtual reality helmets
CN113126863B (en) Object selection implementation method and device, storage medium and electronic equipment
CN111723343B (en) Interactive control method and device of electronic equipment and electronic equipment
KR20140142629A (en) Method and apparatus for processing key pad input received on touch screen of mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210514

Address after: 100085 Baidu Building, 10 Shangdi Tenth Street, Haidian District, Beijing

Applicant after: BEIJING BAIDU NETCOM SCIENCE AND TECHNOLOGY Co.,Ltd.

Applicant after: Shanghai Xiaodu Technology Co.,Ltd.

Address before: 100085 Baidu Building, 10 Shangdi Tenth Street, Haidian District, Beijing

Applicant before: BEIJING BAIDU NETCOM SCIENCE AND TECHNOLOGY Co.,Ltd.

TA01 Transfer of patent application right
GR01 Patent grant
GR01 Patent grant