CN112799574A - Display control method and display device - Google Patents

Info

Publication number
CN112799574A
CN112799574A
Authority
CN
China
Prior art keywords
interface
identity information
user
control method
display control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110204855.6A
Other languages
Chinese (zh)
Inventor
王明月
王慧娟
马珵
祝国帅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BOE Technology Group Co Ltd
Original Assignee
BOE Technology Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BOE Technology Group Co Ltd
Priority to CN202110204855.6A
Publication of CN112799574A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a display control method and a display device, relates to the field of display technology, and aims to improve the convenience of human-machine interaction. The display control method includes: periodically acquiring first user identity information; displaying a first interface when the first user identity information matches first preconfigured identity information; displaying a main interface in response to a first air gesture of a user, the main interface being different from the first interface; and displaying a second interface in response to a second air gesture of the user, wherein the second interface is displayed after a control instruction corresponding to the second air gesture is executed on the currently displayed interface.

Description

Display control method and display device
Technical Field
The present invention relates to the field of display technologies, and in particular, to a display control method and a display device.
Background
Display devices may be classified into commercial display devices and home display devices according to the application scenario. Commercial display devices are often located in conference rooms or other high-traffic areas, so the displayed content reaches a larger audience.
Existing commercial display devices are widely used in teaching and meetings, presenting multimedia file content to an audience on a display screen. Because the displayed content must be controlled with a remote controller, the speaker has to hold the remote controller throughout the presentation, which makes this mode of human-machine interaction inconvenient.
Disclosure of Invention
The invention provides a display control method and a display device, which are used to improve the convenience of human-machine interaction.
In order to achieve the above purpose, the embodiment of the invention adopts the following technical scheme:
in a first aspect, a display control method is provided, including: periodically acquiring first user identity information; displaying a first interface when the first user identity information matches first preconfigured identity information; displaying a main interface in response to a first air gesture of a user, the main interface being different from the first interface; and displaying a second interface in response to a second air gesture of the user, wherein the second interface is displayed after a control instruction corresponding to the second air gesture is executed on the currently displayed main interface.
In some embodiments, after the displaying the main interface, the display control method further includes: periodically acquiring second user identity information, and, when the second user identity information matches second preconfigured identity information, continuing to perform the step of displaying a second interface in response to the second air gesture of the user.
In some embodiments, the period of acquiring the second user identity information is greater than the period of acquiring the first user identity information.
In some embodiments, when the second interface is the first interface of an application, after the displaying the second interface the display control method further includes: when the currently displayed interface includes a content page of a first multimedia file, switching the first multimedia file to a second multimedia file or changing the play state of the first multimedia file in response to a third air gesture of the user, the play state including play and pause.
In some embodiments, after the displaying the second interface, the display control method further includes: returning from the currently displayed interface to a third interface in response to a first user operation, the third interface being the last interface displayed before the second interface.
In some embodiments, the display control method further includes: adding at least one multimedia file in response to a second user operation, the at least one multimedia file including the first multimedia file; and confirming the playing of the first multimedia file in response to a third user operation.
In some embodiments, the display control method further includes: acquiring identity information to be added; when the identity information to be added meets an entry standard, adding it to an identity information database as preconfigured identity information; and when it does not meet the entry standard, returning to the step of acquiring identity information to be added. The first and second preconfigured identity information are stored in the identity information database.
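The entry flow just described (acquire candidate identity information, check it against an entry standard, store it or return to the acquisition step) can be sketched as a small loop. This is a minimal illustration under assumed interfaces, not the patented implementation; the names `acquire_candidate` and `meets_entry_standard` are hypothetical placeholders.

```python
def enroll_identity(acquire_candidate, meets_entry_standard, database):
    """Repeatedly acquire identity information to be added; once a candidate
    meets the entry standard, store it as preconfigured identity information.
    Candidates that fail the standard are discarded and acquisition repeats."""
    while True:
        candidate = acquire_candidate()      # e.g. capture a face image
        if meets_entry_standard(candidate):  # e.g. check sharpness, pose, size
            database.append(candidate)       # becomes preconfigured identity info
            return candidate
        # standard not met: fall through and return to the acquisition step
```

For example, a blurry capture would be rejected and acquisition would repeat until a usable image arrives.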
In some embodiments, the preconfigured identity information is a user face image.
In a second aspect, a display device is provided, including: at least one processor and at least one memory, wherein the at least one memory stores one or more computer programs comprising instructions that, when executed by the at least one processor, cause the display device to perform the display control method of any of the above embodiments; and a display screen configured to display the first interface, the main interface, and the second interface.
In a third aspect, a computer-readable storage medium is provided, which stores computer program instructions that, when run on a display device, cause the display device to perform the display control method according to any one of the above embodiments.
According to the display control method provided by the embodiments of the invention, the content displayed by the display device and its state are controlled and adjusted through air gestures. This simplifies the user's control operations, reduces dependence on a physical control device, and improves the human-machine interaction experience, making human-machine interaction highly convenient.
Drawings
In order to illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. The drawings described below show only some embodiments of the present invention, and those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a block diagram of a display device according to some embodiments of the present disclosure;
FIG. 2 is a block diagram of a display control device according to some embodiments of the present disclosure;
FIG. 3 is a flow chart of a display control method according to some embodiments of the present disclosure;
FIG. 4 is a first interface structure diagram according to some embodiments of the present disclosure;
FIG. 5 is a diagram of a wake-up interface structure according to some embodiments of the present disclosure;
FIG. 6 is a diagram of a main interface structure according to some embodiments of the present disclosure;
FIG. 7 is a second interface structure diagram according to some embodiments of the present disclosure;
FIG. 8 is a flow chart from the main interface to a second interface according to some embodiments of the present disclosure;
FIG. 9 is a flow chart illustrating switching between the content and the playing status of multimedia files included in a display interface according to some embodiments of the disclosure;
FIG. 10 is a structure diagram of a second interface that does not include multimedia file content according to some embodiments of the present disclosure;
fig. 11 is a flow diagram of entry of preconfigured identity information, in accordance with some embodiments of the present disclosure;
FIG. 12 is a system setup interface structure diagram of identity information entry according to some embodiments of the present disclosure;
fig. 13 is a block diagram of an interface for obtaining identity information to be added in accordance with some embodiments of the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Unless the context requires otherwise, throughout the description and the claims the term "comprise" and its other forms, such as the third-person singular "comprises" and the present participle "comprising", are to be interpreted in an open, inclusive sense, i.e., as "including, but not limited to". In the description of the specification, the terms "one embodiment", "some embodiments", "example", "specific example", "some examples", and the like indicate that a particular feature, structure, material, or characteristic associated with the embodiment or example is included in at least one embodiment or example of the present disclosure. Such schematic representations do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
In the following, the terms "first", "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present disclosure, "a plurality" means two or more unless otherwise specified.
In describing some embodiments, expressions of "coupled" and "connected," along with their derivatives, may be used. For example, the term "connected" may be used in describing some embodiments to indicate that two or more elements are in direct physical or electrical contact with each other. As another example, some embodiments may be described using the term "coupled" to indicate that two or more elements are in direct physical or electrical contact. However, the terms "coupled" or "communicatively coupled" may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other. The embodiments disclosed herein are not necessarily limited to the contents herein.
"At least one of A, B and C" has the same meaning as "at least one of A, B, or C", each including the following combinations: A alone; B alone; C alone; A and B; A and C; B and C; and A, B, and C.
"A and/or B" includes the following three combinations: a alone, B alone, and a combination of A and B.
"plurality" means at least two.
The use of "adapted to" or "configured to" herein is meant to be an open and inclusive language that does not exclude devices adapted to or configured to perform additional tasks or steps.
Additionally, the use of "based on" means open and inclusive, as a process, step, calculation, or other action that is "based on" one or more stated conditions or values may in practice be based on additional conditions or values beyond those stated.
As used herein, the terms "about" or "approximately" and the like include the stated values as well as average values that are within an acceptable deviation range for the particular value, as determined by one of ordinary skill in the art in view of the measurement in question and the error associated with the measurement of the particular quantity (i.e., the limitations of the measurement system).
Referring to fig. 1, the display device 100 may include: a display screen 1 and a display control device 2. The display screen 1 is connected with the display control device 2, and the display control device 2 can control the display content of the display screen 1.
The display screen 1 may be, for example, an OLED (organic light-emitting diode) screen, a QLED (quantum-dot light-emitting diode) screen, an LCD (liquid-crystal display) screen, or a mini/micro LED (including miniLED and microLED) screen, without limitation. The size of the display screen 1 is likewise not particularly limited; it may be of conventional size, or a large or oversized display screen. For example, the display screen 1 may measure 15 inches, 17 inches, 19 inches, 21.5 inches, 22.1 inches, 23 inches, 24 inches, 27 inches, 29 inches, and so on.
In some embodiments of the present disclosure, referring to fig. 2, the display control device 2 includes at least one (e.g., one) processor 21 and at least one (e.g., one) memory 22. One or more computer programs are stored in the memory 22, and the one or more computer programs include instructions that, when executed by the processor 21, cause the display device 100 to perform a corresponding display control method.
Illustratively, the processor 21 may be one or more general-purpose central processing units (CPUs), microcontrollers (MCUs), logic devices, application-specific integrated circuits (ASICs), or other integrated circuits for controlling the execution of programs according to some embodiments of the present disclosure. A CPU may be a single-core processor (single-CPU) or a multi-core processor (multi-CPU). A processor 21 here may refer to one or more devices, circuits, or processing cores for processing data (e.g., computer program instructions).
Illustratively, the memory 22 may store computer programs and data. It may include high-speed random access memory, and may also include non-volatile memory such as a magnetic disk storage device or a flash memory device. More generally, it may be a read-only memory (ROM) or another type of static storage device that can store static information and instructions, a random access memory (RAM) or another type of dynamic storage device that can store information and instructions, a one-time-programmable (OTP) memory, an electrically erasable programmable read-only memory (EEPROM), a magnetic disk storage medium or other magnetic storage device, or any other medium that can carry or store program code in the form of instructions or data structures and that can be accessed by a computer. The memory 22 may be separate and connected to the processor 21 via a communication line, or it may be integrated with the processor 21.
Based on the display device 100 described above, some embodiments of the present disclosure provide a display control method, and the execution subject of the display control method may be the display control device 2 described above, or may be a product including the display control device 2 described above, such as the display device 100.
As shown in fig. 3, the display control method may include the steps of:
s101, first user identity information is acquired periodically.
Illustratively, the first user identity information may be a user face image. The display device 100 may further include one or more image-acquisition units, which periodically collect the user face image and transmit it in real time to the display control device 2 (e.g., to the processor 21 in the display control device 2). The acquisition period is the first period; its length is not limited in this embodiment and may, for example, be 3 seconds. The image-acquisition unit may be implemented as a camera, for example an RGB (red-green-blue) camera or a depth camera. An RGB camera can collect user face image data including RGB color information, or grayscale user face image data; a depth camera can collect user face image data including depth information. It should be noted that the embodiments of the present disclosure do not limit the type of camera used, as long as it can acquire the user's face image. In addition, the image-acquisition unit (e.g., the camera) may be built into the display device 100, for example assembled with the display screen 1 into an integral structure that is generally not disassembled during use. The image-acquisition unit may also be provided independently and connected to the display control device through an interface (e.g., a USB interface); this is not limited here.
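The periodic acquisition in S101 amounts to a polling loop: capture a frame, try to match it, and wait one period before retrying. The sketch below is a minimal Python illustration under assumed interfaces; `capture_frame` stands in for the image-acquisition unit and `match_identity` for the matching step performed by the processor, and neither name comes from the patent.

```python
import time

FIRST_PERIOD_S = 3  # example length of the first acquisition period (seconds)

def acquire_identity_periodically(capture_frame, match_identity,
                                  period_s=FIRST_PERIOD_S):
    """Poll the image-acquisition unit once per period until a captured
    face image matches preconfigured identity information."""
    while True:
        face_image = capture_frame()          # e.g. one RGB or depth frame
        matched = match_identity(face_image)  # matched identity, or None
        if matched is not None:
            return matched                    # proceed to the first interface
        time.sleep(period_s)                  # discard frame; retry next period
```

A real implementation would run this alongside the display loop rather than blocking on it; the loop structure is what matters here.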
S102, displaying a first interface.
A first interface is displayed when the first user identity information matches the first preconfigured identity information.
Illustratively, in a first period the image-acquisition unit sends the collected user face image to the processor 21 of the display device 100. In the same period, the processor 21 matches the received user face image against the preconfigured identity information stored in an identity information database. If the match succeeds, the successfully matched preconfigured identity information is the first preconfigured identity information, and in response to the successful match the display screen 1 displays the first interface shown in fig. 4. If the match fails, the user face image collected in that first period is discarded, a new user face image is collected in the next period, and the above steps are repeated. The preconfigured identity information may be a plurality of user face images pre-stored in the identity information database; during matching, the processor 21 compares the received user face image with the pre-stored face images one by one, and the pre-stored image that matches successfully with the highest similarity to the received image serves as the first preconfigured identity information. Specifically, when the matching similarity between the preconfigured identity information and the first user identity information falls within a preset range, the match is considered successful and the display screen 1 displays the first interface shown in fig. 4.
For example, when the matching similarity between the preconfigured identity information and the first user identity information is higher than 95%, the match is considered successful; the preset range is then 95% to 100%. The preset range can be designed as required and is not limited here.
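The one-by-one matching with a similarity threshold can be illustrated as follows. This is a sketch, not the patented matcher: `similarity` is a stand-in parameter for a real face-comparison model, and the 0.95 threshold mirrors the 95% example above.

```python
def match_face(received, database, similarity, threshold=0.95):
    """Compare a captured face image against every preconfigured entry and
    return the identity of the best-scoring entry if its similarity falls
    within the preset range (here: >= threshold); otherwise return None,
    signalling that the frame should be discarded."""
    best_id, best_score = None, 0.0
    for identity, stored_image in database.items():
        score = similarity(received, stored_image)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id if best_score >= threshold else None
```

In practice `similarity` would be a face-embedding distance converted to a score; here any callable returning a value in [0, 1] works.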
Illustratively, referring to fig. 4, the first interface may display the date, the network status, and a gesture prompt icon. The user can make the gesture shown by the gesture prompt icon; that is, while at some distance from the display device 100 and without touching it, the user can control the content displayed on the display screen 1 through air gestures.
S103, displaying the main interface.
When the display device 100 displays the first interface, it may display a main interface in response to a first air gesture of the user, wherein the main interface is different from the first interface.
Specifically, the user's first air gesture may be captured by one or more of the image-acquisition units included in the display device; the unit used to capture air gestures may be the same as or different from the unit used to capture the user's face image. The method of capturing the user's air gesture is similar to that of capturing the user's face image and is not repeated here.
Illustratively, the main interface may include a plurality of icons, which may include at least one of: at least one (e.g., several) application icons and at least one (e.g., several) system setting icons. The main interface may also show the date, the network status, and so on. An icon may include at least one of text, an image, and the like. Corresponding applications can be opened through the application icons, and the user can personalize the system parameters of the display device 100 through the system setting icons. For example, referring to fig. 6, the main interface includes application icons such as a file manager, a smart exhibition, and a virtual exhibition hall, as well as system setting icons such as gesture recognition, quick setting, and signal source. The gesture-recognition system setting icon controls whether the gesture recognition function of the display device 100 is turned on. This function is turned off by default at the factory; the user can turn it on manually, and once it is on, the user can turn it off again with an air gesture.
For example, the user's air gesture refers to a gesture made without touching the display device in order to control the picture it displays, and may be characterized by at least one of the user's hand contour and the motion trajectory of the user's hand. The main interface may be entered from the first interface through at least one first air gesture, for example through two first air gestures. Specifically, referring to fig. 4, following the gesture prompt icon on the first interface, the user performs a first air gesture: pinching the tips of the thumb and index finger together, naturally straightening the other fingers, and keeping the hand still. This gesture enters the wake-up interface of the gesture recognition function of the display device 100 shown in fig. 5. Then, following the gesture prompt icon on the wake-up interface, the user performs another first air gesture: opening all five fingers and moving the palm upward in a direction parallel to the plane of the display screen 1. This gesture enters the main interface shown in fig. 6.
For example, in addition to the two air gestures described above, further air gestures differing in at least one of hand contour and hand motion trajectory may be provided, such as: five fingers open, palm moving downward parallel to the plane of the display screen 1; five fingers open, palm moving leftward parallel to the plane of the display screen 1; five fingers open, palm moving rightward parallel to the plane of the display screen 1; five fingers clenched into a fist, hand moving toward the display screen 1 along the normal of its plane; and so on.
It can be understood that different air gestures correspond to different control instructions issued by the processor 21, and different control instructions have different control effects on the display device 100. Those skilled in the art can set the correspondence between air gestures and control instructions as required; the present disclosure does not limit this.
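A gesture-to-instruction correspondence of the kind described can be represented as a simple lookup table keyed on (hand contour, motion trajectory). The descriptors and instruction names below are illustrative only, since the patent explicitly leaves the concrete mapping to the implementer.

```python
# Illustrative mapping from recognized air-gesture descriptors to control
# instructions; none of these pairs is prescribed by the patent.
GESTURE_COMMANDS = {
    ("open_palm", "up"):       "enter_main_interface",
    ("open_palm", "down"):     "scroll_down",
    ("open_palm", "left"):     "select_previous_icon",
    ("open_palm", "right"):    "select_next_icon",
    ("fist", "toward_screen"): "confirm_selection",
}

def dispatch(hand_contour, trajectory):
    """Return the control instruction for a recognized air gesture,
    or None if the gesture is not in the configured mapping."""
    return GESTURE_COMMANDS.get((hand_contour, trajectory))
```

Keeping the mapping in data rather than code makes it easy to reconfigure, which matches the statement that implementers may set the correspondence as required.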
S104, displaying a second interface.
A second interface is displayed in response to a second air gesture of the user, wherein the second interface is displayed after a control instruction corresponding to the second air gesture is executed on the currently displayed main interface.
For example, when the display device 100 displays the main interface, it may display a second interface in response to a second air gesture of the user, where the second interface is displayed after a control instruction corresponding to at least one second air gesture is executed on the currently displayed interface (either the initial main interface or a main interface in which one icon has been selected by the user; the initial main interface is the first main interface displayed after the main interface is entered).
For example, when the initial main interface is entered from the first interface, one icon in it is in the selected state; referring to fig. 6, this is the quick-setting icon among the system setting icons. The user switches the selected icon to the signal-source icon among the system setting icons through a second air gesture; the main interface in which the signal-source icon is selected, shown in fig. 7, is then the second interface. As another example, as shown in fig. 8(a), when the main interface is entered from the first interface with the quick-setting icon selected, the user enters the second interface UI2_a shown in fig. 8(b) through one second air gesture from fig. 8(a), enters the second interface UI2_b shown in fig. 8(c) through one second air gesture from fig. 8(b), and enters the second interface UI2_c shown in fig. 8(d) through one second air gesture from fig. 8(c). In the second interface UI2_c, the selected icon is switched to the HDMI2 (8K) sub-icon under the signal-source icon, as shown in fig. 8(d). The user then confirms the selected HDMI2 (8K) sub-icon through one more second air gesture, and the signal-source setting of the display device is switched so that a video signal with a resolution of 8K is transmitted through the HDMI2 interface. It is understood that the hand contours and hand motion trajectories of the several second air gestures may be the same as or different from one another.
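The icon-selection behavior described above (a selected icon moves under successive second air gestures until a final gesture confirms it) can be sketched as a small state object. The icon names are illustrative only and follow the fig. 6 example.

```python
class MainInterface:
    """Minimal sketch of selected-icon navigation driven by second air
    gestures; one icon is always in the selected state, as on entry."""

    def __init__(self, icons, selected=0):
        self.icons = icons
        self.selected = selected  # e.g. the quick-setting icon on entry

    def next_icon(self):
        """Advance the selected state to the next icon (one gesture)."""
        self.selected = (self.selected + 1) % len(self.icons)

    def confirm(self):
        """Confirm the currently selected icon (confirmation gesture)."""
        return self.icons[self.selected]
```

A full implementation would also redraw the interface after each gesture; only the selection state machine is shown here.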
Illustratively, after displaying the main interface, the display control method further includes: periodically acquiring second user identity information, and, when the second user identity information matches second preconfigured identity information, continuing to perform the step of displaying a second interface in response to the second air gesture of the user; the second preconfigured identity information may be the same as or different from the first preconfigured identity information. For example, in the first of two consecutive periods, the second user identity information is the face image of user A; if the face image of user A is successfully matched with preconfigured identity information, that preconfigured identity information is the second preconfigured identity information of user A, and in response to the successful match the display device executes the control instruction of the second air gesture. In the second of the two consecutive periods, the second user identity information is the face image of user B; the face image of user B is successfully matched with the second preconfigured identity information of user B, and the display device continues to execute the control instruction of the second air gesture.
Next, the above process is explained in detail for the case where the second preconfigured identity information is the same as the first preconfigured identity information, both being user face images.
Illustratively, after the main interface is displayed, the image acquisition unit (i.e., the camera) periodically captures the user's face image. The capture period of the camera is a second period whose length is not limited; for example, it may be 5 seconds. The matching of the second user identity information against the second preconfigured identity information is the same as the matching of the first user identity information against the first preconfigured identity information, and is not repeated here. If the second user identity information matches the second preconfigured identity information successfully, the device continues to execute the control instructions of the user's second air gestures and displays the second interface. If the match fails, the face image acquired in that second period is discarded, a new face image is acquired in the next second period, and so on. If the second user identity information fails to match the preconfigured identity information over several consecutive second periods, the device stops responding to the user's second air gestures. The number of second periods before the response stops is not limited; for example, if no match succeeds in six consecutive second periods, the response to the user's second air gestures stops, and the second interface thereafter remains the control-result interface corresponding to the last second air gesture before the sixth failed period.
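The periodic re-verification loop can be sketched as follows. The function name, the predicate interface, and the assumption that gestures keep being honored during failed periods before the cutoff are all illustrative; only the "six consecutive failed periods" figure comes from the text above.

```python
# Minimal sketch (assumed names) of periodic identity re-verification:
# each second-period capture is matched against preconfigured identity
# information; after MAX_FAILURES consecutive mismatches the device stops
# responding to second air gestures, freezing the last result interface.

MAX_FAILURES = 6  # example value from the text: six consecutive failed periods

def reverify(captures, match):
    """captures: iterable of face images, one per second period.
    match: predicate against the preconfigured identity database.
    Returns a per-period list of states: 'respond' or 'stopped'."""
    failures = 0
    states = []
    for image in captures:
        if match(image):
            failures = 0            # a successful match resets the streak
            states.append("respond")
        else:
            failures += 1           # discard the image, try again next period
            states.append("stopped" if failures >= MAX_FAILURES else "respond")
    return states

# A streak of six mismatching periods suspends gesture response:
states = reverify(["a", "x", "x", "x", "x", "x", "x"], lambda img: img == "a")
assert states == ["respond"] * 6 + ["stopped"]
```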
Illustratively, after stopping the response, the display device 100 continues to acquire and match second user identity information, and resumes responding to second air gestures once a match succeeds again.
Illustratively, the period for acquiring the second user identity information is longer than the period for acquiring the first user identity information; that is, the second period is longer than the first period. Acquiring the first user identity information in the shorter first period allows face images to be captured and matched at a higher frequency, so the display device 100 displays the first interface sooner and responds more promptly. After the first interface is entered, acquiring the second user identity information in the longer second period lowers the frequency of capturing and matching face images, which reduces the energy consumption of the display device 100 while still responding to user operations.
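The two polling rates can be captured in a trivial helper: a short first period before the first interface is shown (fast response) and a longer second period afterwards (lower energy use). The 1 s value is an assumption; the text only gives 5 s as an example second period.

```python
# Illustrative two-rate capture schedule. FIRST_PERIOD_S is an assumed
# value; SECOND_PERIOD_S uses the 5-second example from the text.

FIRST_PERIOD_S = 1.0    # assumed short capture period before the first match
SECOND_PERIOD_S = 5.0   # example second period from the text

def capture_period(main_interface_shown):
    """Return the face-capture interval for the current device state."""
    return SECOND_PERIOD_S if main_interface_shown else FIRST_PERIOD_S

# Faster polling before the main interface, slower polling after it:
assert capture_period(False) < capture_period(True)
```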
For example, after user identity information is acquired and matches preconfigured identity information successfully, the display device 100 may respond to the user's air gestures; the successfully matched identity information and the controlling air gesture need not come from the same user. For example, the successfully matched first user identity information may come from user A, the first air gesture that causes the main interface to be displayed from user B, the second air gesture that causes the second interface to be displayed from user C, and the successfully matched second user identity information acquired periodically while the second interface is displayed from user D. Since the display device 100 responds to air gestures whenever the periodically acquired user identity information matches preconfigured identity information, multiple users can participate in human-computer interaction at the same time, which improves the user experience.
After the second interface is displayed, the method further includes returning from the currently displayed interface to a third interface in response to a first user operation, where the third interface is the last interface before the second interface. Specifically, the user operation may be a remote-controller signal or a gesture, or another control manner; this is not limited. For example, the currently displayed interface is the second interface, which is the initial interface of the virtual exhibition hall application, and the third interface is the main interface in which the selected icon is the virtual exhibition hall application icon. The user can then exit the virtual exhibition hall application through a key on the remote controller (i.e., the first user operation) and return to the main interface, that is, return from the currently displayed second interface to the third interface.
Exemplarily, in the case that the second interface is the initial interface of an application, after displaying the second interface, the display control method further includes: in the case that the currently displayed interface contains a content page of a first multimedia file, in response to a third air gesture of the user, switching the first multimedia file to a second multimedia file or changing the play state of the first multimedia file, the play state including playing and paused. That is, the user may enter the application interface (i.e., the second interface) through a plurality of second air gestures on the main interface; the application may be a multimedia (e.g., text, audio, video, picture) file viewing application or another type of application. When the application is a multimedia file viewing application and a viewed multimedia file is present in the interface, the user can switch the viewed multimedia file or change its play state through at least one third air gesture.
As described above, different air gestures correspond to different control commands issued by the processor 21, and different control commands have different control effects on the display device 100. For example, at least one first air gesture wakes up the gesture recognition function of the display device 100 and causes it to display the main interface. While the main interface is displayed, at least one second air gesture can switch the icon in the selected state and confirm the last icon switched into the selected state: if that icon is the system setting icon or one of its sub-icons, the system settings of the display device 100 are changed accordingly upon confirmation; if it is an application icon, the application interface is entered upon confirmation. When the current application interface contains a content page of a first multimedia file, at least one third air gesture can switch the first multimedia file to a second multimedia file or change the play state of the first multimedia file to playing or paused. Thus, while the first interface, the main interface, and the application interface are displayed, different air gestures have different control effects on the display device 100; that is, the first, second, and third air gestures each control the display device 100 differently. Specifically, the hand contours and hand motion trajectories corresponding to the different air gestures are different, and so are the control effects each air gesture achieves on the display device 100.
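The gesture-to-command mapping above naturally becomes a dispatch table. The sketch below is an assumption about structure only: handler names, the state dictionary, and the play-state behavior are illustrative, while the three gesture classes and their roles follow the text.

```python
# Hypothetical dispatch table for the three air-gesture classes: first
# wakes recognition and shows the main interface, second switches/confirms
# icons (here collapsed to entering the second interface), third toggles
# the play state inside a multimedia application.

def on_first_gesture(state):
    state["interface"] = "main"       # wake gesture recognition, show main UI
    return state

def on_second_gesture(state):
    state["interface"] = "second"     # switch/confirm the selected icon
    return state

def on_third_gesture(state):
    # only meaningful once an application (second interface) is displayed
    if state["interface"] == "second":
        state["playing"] = not state.get("playing", False)
    return state

DISPATCH = {"first": on_first_gesture,
            "second": on_second_gesture,
            "third": on_third_gesture}

state = {"interface": "first"}
for gesture in ["first", "second", "third"]:
    state = DISPATCH[gesture](state)
assert state == {"interface": "second", "playing": True}
```

In a real recognizer, the dictionary key would come from classifying the hand contour and motion trajectory, which is exactly what distinguishes the gesture classes in the text.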
For example, if the application entered by the user is a video player, the second interface is the initial interface of the video player. Referring to (a) in fig. 9, when there is a historically played video 01 (i.e., a first multimedia file), the user may switch video 01 to video 02 (i.e., a second multimedia file), as shown in (b) of fig. 9, through one third air gesture; then, referring to (c) in fig. 9, the user may switch the play state of video 02 from paused to playing through another, different third air gesture.
Illustratively, in the case that the currently displayed interface does not contain a content page of the first multimedia file, the display apparatus 100 may add at least one multimedia file in response to a second user operation, where the at least one multimedia file includes the first multimedia file; subsequently, the display apparatus 100 may confirm playing the selected first multimedia file in response to a third user operation. As before, the user operation may be a remote-controller signal or a gesture, or another control manner; this is not limited. For example, the application entered by the user is a video player and, referring to fig. 10, the second interface contains no historically played videos. The user can add video files from the system's own file manager to the video player through the remote controller (i.e., the second user operation), then select one of the added video files and confirm playback through the remote controller (i.e., the third user operation). After playback is confirmed, the user can switch the played video file or change its play state through third air gestures.
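The add-then-play flow can be sketched as a small player object: the second user operation populates the file list, the third user operation confirms playback, and subsequent third air gestures switch files or toggle the play state. The class and its method names are assumptions for illustration only.

```python
# Hedged sketch of the video-player flow: remote-controller operations add
# files and confirm playback; third air gestures then switch the played
# file or toggle playing/paused. All names are illustrative.

class VideoPlayer:
    def __init__(self):
        self.files = []
        self.current = None
        self.playing = False

    def add_files(self, paths):        # second user operation (remote control)
        self.files.extend(paths)

    def confirm_play(self, index):     # third user operation (remote control)
        self.current = self.files[index]
        self.playing = True

    def on_third_gesture(self, kind):  # subsequent air-gesture control
        if kind == "switch" and self.files:
            i = (self.files.index(self.current) + 1) % len(self.files)
            self.current = self.files[i]
        elif kind == "toggle":
            self.playing = not self.playing

p = VideoPlayer()
p.add_files(["video01.mp4", "video02.mp4"])
p.confirm_play(0)
p.on_third_gesture("switch")   # video01 -> video02
p.on_third_gesture("toggle")   # playing -> paused
assert (p.current, p.playing) == ("video02.mp4", False)
```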
Switching multimedia files and their play state through third air gestures allows the display content of the display device 100 to be controlled without a physical control device such as a remote controller, which simplifies user operation, reduces the user's dependence on physical control devices, and improves the human-computer interaction experience.
In some embodiments of the present disclosure, the display control method may further include entry of preconfigured identity information. Specifically, referring to fig. 11, the following steps may be included:
S201, acquiring identity information to be added
For example, referring to fig. 12, the user may enter a system setting interface for entering identity information (for example, a face) and select and confirm the identity-information adding icon (shown as "+" in the figure), thereby entering the interface shown in fig. 13 for acquiring the identity information to be added. Specifically, acquiring the identity information to be added may be performed by one or more image acquisition units included in the display apparatus 100.
S202, entering identity information that meets the entry standard
If the identity information to be added meets the entry standard, it is added to an identity information database as preconfigured identity information; if it does not meet the entry standard, the method returns to the step of acquiring identity information to be added.
Illustratively, the processor 21 evaluates the acquired identity information to be added against the corresponding indexes in the entry standard (such as the clarity of the face contour, the number of required feature points, and the position of the face in the acquired face image). If the identity information meets the entry standard, it is added to the identity information database as preconfigured identity information; if it does not, the method returns to the acquisition step, acquires the identity information again, and evaluates it again, until the identity information meets the entry standard and is successfully added. The preconfigured identity information includes the first preconfigured identity information and the second preconfigured identity information, and is stored in the identity information database.
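The capture-check-retry loop can be sketched as below. The three checks mirror the indexes named in the text (contour clarity, feature-point count, face position); the metric representation and all thresholds are assumptions.

```python
# Sketch of the enrollment loop: capture, check against the entry standard,
# retry until a capture passes, then store it as preconfigured identity
# information. Metric names and thresholds are illustrative assumptions.

def meets_entry_standard(face):
    """face: dict of extracted metrics; thresholds here are illustrative."""
    return (face.get("contour_clarity", 0) >= 0.8    # clear face contour
            and face.get("feature_points", 0) >= 68  # enough feature points
            and face.get("centered", False))         # face well positioned

def enroll(captures, database):
    """Consume captures until one meets the standard, then store it."""
    for face in captures:
        if meets_entry_standard(face):
            database.append(face)  # added as preconfigured identity info
            return face
    return None  # ran out of captures without a valid one

db = []
captures = [{"contour_clarity": 0.5, "feature_points": 30, "centered": False},
            {"contour_clarity": 0.9, "feature_points": 70, "centered": True}]
assert enroll(iter(captures), db) == captures[1]
assert len(db) == 1
```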
Illustratively, there is an upper limit on the number of preconfigured identity information entries that can be stored in the identity information database. For example, the upper limit may be 15. Before identity information to be added is acquired, the number of existing preconfigured identity information entries in the database is checked; if that number has reached the upper limit, at least one (for example, one) existing entry must be deleted before new identity information can be added. The upper limit ensures that the display device 100 only needs to match an acquired face image against a limited number of preconfigured identity information entries, which improves matching efficiency, lets the display device 100 respond to face recognition in a short time, and improves satisfaction with the human-computer interaction experience. At the same time, limiting the number of preconfigured identity information entries effectively gives the display device a degree of operation-authority control, so the display device 100 can satisfy a wider variety of control requirements.
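A bounded database with a delete-before-add rule can be sketched as follows. The class shape and the error behavior are assumptions; only the count-check-before-adding and the example limit of 15 come from the text.

```python
# Sketch of the bounded identity database: before adding, the entry count
# is checked against the upper limit (15 in the text's example); when full,
# an existing entry must be deleted first. API shape is an assumption.

UPPER_LIMIT = 15  # example upper limit from the text

class IdentityDatabase:
    def __init__(self, limit=UPPER_LIMIT):
        self.limit = limit
        self.entries = []

    def can_add(self):
        return len(self.entries) < self.limit

    def add(self, identity):
        if not self.can_add():
            raise RuntimeError("database full: delete an entry before adding")
        self.entries.append(identity)

    def delete(self, index):
        del self.entries[index]

db = IdentityDatabase(limit=2)
db.add("user_a")
db.add("user_b")
assert not db.can_add()    # at the limit: a delete must precede any add
db.delete(0)
db.add("user_c")
assert db.entries == ["user_b", "user_c"]
```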
Illustratively, when the execution subject of a display control method provided in some embodiments of the present disclosure is a product including the above display control apparatus 2, for example the display apparatus 100, the display screen 1 of the display apparatus 100 is configured to display the first interface, the main interface, and the second interface.
Further embodiments of the present disclosure provide a computer-readable storage medium (e.g., a non-transitory computer-readable storage medium) having computer program instructions stored therein, which, when executed on a processor, cause a computer (e.g., a display device) to perform one or more steps of a display control method as described in any one of the above embodiments.
By way of example, such computer-readable storage media may include, but are not limited to: magnetic storage devices (e.g., hard disk, floppy disk, magnetic tape), optical discs (e.g., CD (Compact Disc), DVD (Digital Versatile Disc)), smart cards, and flash memory devices (e.g., EPROM (Erasable Programmable Read-Only Memory), card, stick, key drive). Various computer-readable storage media described in this disclosure can represent one or more devices and/or other machine-readable storage media for storing information. The term "machine-readable storage medium" can include, without being limited to, wireless channels and various other media capable of storing, containing, and/or carrying instructions and/or data.
Some embodiments of the present disclosure also provide a computer program product. The computer program product comprises computer program instructions which, when executed on a computer, cause the computer to perform one or more steps of the display control method as described in the above embodiments.
Some embodiments of the present disclosure also provide a computer program. When the computer program is executed on a computer, the computer program causes the computer to execute one or more steps of the display control method according to the above-described embodiment.
The advantages of the computer-readable storage medium, the computer program product, and the computer program are the same as those of the display control method according to some embodiments, and are not described herein again.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. A display control method, comprising:
periodically acquiring first user identity information;
displaying a first interface under the condition that the first user identity information is matched with first pre-configured identity information;
in response to a first air gesture of a user, displaying a main interface, the main interface being different from the first interface;
and in response to a second air gesture of the user, displaying a second interface, wherein the second interface is displayed after a control instruction corresponding to the second air gesture is executed on the currently displayed main interface.
2. The display control method according to claim 1, wherein after displaying the main interface, the display control method further comprises:
periodically acquiring second user identity information and, in the case that the second user identity information matches second preconfigured identity information, continuing to respond to the second air gesture of the user and display the second interface.
3. The display control method according to claim 2,
the period for acquiring the second user identity information is greater than the period for acquiring the first user identity information.
4. The display control method according to claim 1, wherein in the case that the second interface is an initial interface of an application, after displaying the second interface, the display control method further comprises:
in the case that a content page of a first multimedia file is included in the currently displayed interface, in response to a third air gesture of the user, switching the first multimedia file to a second multimedia file or changing a play state of the first multimedia file, the play state including playing and paused.
5. The display control method according to claim 4, further comprising, after displaying the second interface:
returning, in response to a first user operation, from the currently displayed interface to a third interface, wherein the third interface is the last interface before the second interface.
6. The display control method according to claim 4, characterized by further comprising:
adding at least one multimedia file in response to a second user operation, wherein the at least one multimedia file comprises the first multimedia file;
confirming playing of the first multimedia file in response to a third user operation.
7. The display control method according to claim 1, further comprising:
acquiring identity information to be added;
adding the identity information to be added to an identity information database as preconfigured identity information in the case that the identity information to be added meets an entry standard;
returning to the step of acquiring identity information to be added in the case that the identity information to be added does not meet the entry standard;
wherein the first and second preconfigured identity information are stored in the identity information database.
8. The display control method according to claim 7, wherein the preconfigured identity information is a user face image.
9. A display device, comprising:
at least one processor and at least one memory; wherein the at least one memory has stored therein one or more computer programs, the one or more computer programs comprising instructions, which when executed by the at least one processor, cause the display apparatus to perform the display control method of any one of claims 1 to 8;
a display screen configured to display the first interface, the main interface, and the second interface.
10. A computer readable storage medium storing computer program instructions which, when run on a display device, cause the display device to perform the display control method of any one of claims 1 to 8.
CN202110204855.6A 2021-02-23 2021-02-23 Display control method and display device Pending CN112799574A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110204855.6A CN112799574A (en) 2021-02-23 2021-02-23 Display control method and display device


Publications (1)

Publication Number Publication Date
CN112799574A true CN112799574A (en) 2021-05-14

Family

ID=75815584

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110204855.6A Pending CN112799574A (en) 2021-02-23 2021-02-23 Display control method and display device

Country Status (1)

Country Link
CN (1) CN112799574A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117111727A (en) * 2023-02-22 2023-11-24 荣耀终端有限公司 Hand direction detection method, electronic device and readable medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109542219A (en) * 2018-10-22 2019-03-29 广东精标科技股份有限公司 A kind of gesture interaction system and method applied to smart classroom
CN109725727A (en) * 2018-12-29 2019-05-07 百度在线网络技术(北京)有限公司 There are the gestural control method and device of screen equipment
CN110045819A (en) * 2019-03-01 2019-07-23 华为技术有限公司 A kind of gesture processing method and equipment
CN110058777A (en) * 2019-03-13 2019-07-26 华为技术有限公司 The method and electronic equipment of shortcut function starting
CN110442294A (en) * 2019-07-10 2019-11-12 杭州鸿雁智能科技有限公司 Interface display method, device, system and the storage medium of operation panel
CN210721362U (en) * 2019-07-19 2020-06-09 广东墨痕教育科技有限公司 Non-contact control device of wisdom classroom all-in-one
CN111527468A (en) * 2019-11-18 2020-08-11 华为技术有限公司 Air-to-air interaction method, device and equipment



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination