CN117999537A - Electronic display device, display control method and device thereof, and storage medium - Google Patents


Info

Publication number
CN117999537A
CN117999537A (application CN202280004180.3A)
Authority
CN
China
Prior art keywords
interface
display device
hand
electronic display
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202280004180.3A
Other languages
Chinese (zh)
Inventor
张逸帆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd
Publication of CN117999537A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

An electronic display device, a display control method and apparatus thereof, and a storage medium. The method includes: in response to detecting an instruction to start a one-hand operation mode, generating a floating window interface according to the display interface in the current display state of the electronic display device (S11); displaying the floating window interface on an upper layer of the display interface within a target display area on the screen of the electronic display device, where the target display area is determined based on the gesture with which a user holds the electronic display device with one hand (S12); when it is detected that the user performs a first one-hand touch operation in the floating window interface, controlling a first target interface to respond to the first one-hand touch operation (S13); and controlling a second target interface to change synchronously with the change of the first target interface, where the first target interface is one of the display interface and the floating window interface, and the second target interface is the other (S14). This method makes it convenient for the user to perform one-hand touch operations.

Description

Electronic display device, display control method and device thereof, and storage medium
Technical Field
The disclosure relates to the technical field of electronic display devices, and in particular relates to an electronic display device, a display control method and device thereof, and a storage medium.
Background
With the rapid development of screen technology, large-screen electronic display devices and folding-screen electronic display devices are becoming increasingly popular.
People prefer to purchase large-screen electronic display devices for watching movies, browsing web pages, or viewing documents. However, large-screen electronic display devices have the problem that touch operation is inconvenient.
Disclosure of Invention
In order to overcome the problems in the related art, the present disclosure provides an electronic display device, a display control method, a display control apparatus, and a storage medium.
According to a first aspect of embodiments of the present disclosure, there is provided a display control method of an electronic display device, the method including:
in response to detecting an instruction to start a one-hand operation mode, generating a floating window interface according to the display interface in the current display state of the electronic display device, where the floating window interface is a reduced-size version of the display interface;
displaying the floating window interface on an upper layer of the display interface within a target display area on the screen of the electronic display device, where the target display area is determined based on the gesture with which a user holds the electronic display device with one hand;
when it is detected that the user performs a first one-hand touch operation in the floating window interface, controlling a first target interface to respond to the first one-hand touch operation, where the first one-hand touch operation is performed by the hand with which the user holds the electronic display device; and
controlling a second target interface to change synchronously according to the change of the first target interface, where the first target interface is one of the display interface and the floating window interface, and the second target interface is the other of the two.
Optionally, the first target interface is configured with an interface of an application program, and the controlling the first target interface to respond to the first one-hand touch operation includes:
determining a target application program according to the first one-hand touch operation; and
controlling the first target interface to respond to the first one-hand touch operation by calling the interface of the target application program.
Optionally, the method further comprises:
controlling the first target interface to respond to a second one-hand touch operation when it is detected that the user performs the second one-hand touch operation in the display interface.
Optionally, if the first target interface is the display interface and the second target interface is the floating window interface, controlling, when detecting that the user performs the first one-hand touch operation in the floating window interface, the first target interface to respond to the first one-hand touch operation includes:
mapping the first one-hand touch operation to the display interface; and
and controlling the display interface to respond to the mapping result of the first one-hand touch operation.
Optionally, displaying the floating window interface on an upper layer of the display interface in a target display area on a screen of the electronic display device includes:
acquiring touch data of the user on the electronic display device;
determining, according to the touch data, the gesture with which the user holds the electronic display device with one hand;
determining, according to that gesture, the operation area on the screen of the electronic display device within which the hand holding the device can perform one-hand touch operations;
determining the target display area according to the operation area; and
displaying the floating window interface on the upper layer of the display interface within the target display area.
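The step sequence above can be sketched as a minimal pipeline. The rectangle-based reachability model, the 0.6 reach factor, and all function names below are illustrative assumptions rather than the disclosed implementation; the grip-detection step is stubbed out.

```python
# Illustrative sketch of the target-display-area pipeline described above.
# The grip-detection step is stubbed; a real device would classify the grip
# from touch images or sensor data as described in the disclosure.

def determine_grip(touch_data):
    """Stub: classify the one-hand grip as 'left' or 'right'."""
    return touch_data.get("grip", "right")

def operation_area(grip, screen_w, screen_h, reach=0.6):
    """Screen region reachable by the thumb of the holding hand, modeled
    here (for illustration only) as a rectangle anchored to the bottom
    corner on the side of the holding hand."""
    w, h = int(screen_w * reach), int(screen_h * reach)
    x = 0 if grip == "left" else screen_w - w
    return (x, screen_h - h, w, h)  # (x, y, width, height)

def target_display_area(touch_data, screen_w, screen_h):
    """Determine the target display area from the holding gesture."""
    grip = determine_grip(touch_data)
    return operation_area(grip, screen_w, screen_h)
```

The floating window interface would then be drawn on the upper layer within the returned rectangle.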
Optionally, the touch data includes a touch image, and determining, according to the touch data, a gesture of a user holding the electronic display device with one hand includes:
inputting the touch image into a trained gesture recognition model, and obtaining a recognition result output by the gesture recognition model that represents the gesture with which the user holds the electronic display device with one hand.
Optionally, the electronic display device includes an electromagnetic wave energy absorption rate sensor, the touch data includes sensor data collected by the electromagnetic wave energy absorption rate sensor, and the determining, according to the touch data, a gesture of a user holding the electronic display device with one hand includes:
determining a contact area between the user and the electronic display device according to the sensor data; and
determining the gesture with which the user holds the electronic display device according to the position distribution of the contact area on the electronic display device.
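A minimal sketch of inferring the holding hand from the position distribution of contact areas, as described above. The majority-side heuristic and all names are illustrative assumptions; the disclosure only states that the position distribution is used.

```python
def grip_from_contacts(contacts, screen_w):
    """Infer which hand holds the device from the horizontal distribution
    of contact-area centers (x, y). A one-hand grip typically concentrates
    contact near one edge; this majority-side rule is an illustrative
    assumption, not the disclosed model. Returns 'left', 'right', or None
    when the contacts are absent or evenly balanced."""
    if not contacts:
        return None
    left = sum(1 for x, _ in contacts if x < screen_w / 2)
    right = len(contacts) - left
    if left == right:
        return None
    return "left" if left > right else "right"
```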
According to a second aspect of embodiments of the present disclosure, there is provided a display control apparatus of an electronic display device, the apparatus including:
the generating module is configured to respond to the detection of an instruction for starting a one-hand operation mode, and generate a floating window interface according to a display interface in a current display state of the electronic display device, wherein the floating window interface is an interface with the reduced size of the display interface;
a display module configured to display the floating window interface on top of the display interface within a target display area on a screen of the electronic display device, wherein the target display area is determined based on a gesture of a user holding the electronic display device with one hand;
The response module is configured to control a first target interface to respond to a first one-hand touch operation under the condition that the first one-hand touch operation of a user in the floating window interface is detected, wherein the first one-hand touch operation is performed by the hand of the user holding the electronic display device by one hand;
and the synchronization module is configured to control a second target interface to perform synchronous change according to the change of the first target interface, wherein the first target interface is one of the display interface and the floating window interface, and the second target interface is the other of the display interface and the floating window interface.
Optionally, the first target interface is configured with an interface of an application program, and the response module includes:
The first determining submodule is configured to determine a target application program according to the first one-hand touch operation;
The calling module is configured to control the first target interface to respond to the first one-hand touch operation by calling the interface of the target application program.
Optionally, the apparatus further comprises:
and the control module is configured to control the first target interface to respond to a second one-hand touch operation when it is detected that the user performs the second one-hand touch operation in the display interface.
Optionally, the response module is configured to:
If the first target interface is the display interface and the second target interface is the floating window interface, mapping the first one-hand touch operation to the display interface; and controlling the display interface to respond to the mapping result of the first one-hand touch operation.
Optionally, the display module includes:
an acquisition submodule configured to acquire touch data of the user on the electronic display device;
a second determining submodule configured to determine, according to the touch data, the gesture with which the user holds the electronic display device with one hand;
a third determining submodule configured to determine, according to that gesture, the operation area on the screen of the electronic display device within which the hand holding the device performs one-hand touch operations;
a fourth determining submodule configured to determine the target display area according to the operation area; and
a display submodule configured to display the floating window interface on the upper layer of the display interface within the target display area.
Optionally, the touch data includes a touch image, and the second determining submodule is configured to:
inputting the touch image into a trained gesture recognition model, and obtaining a recognition result output by the gesture recognition model that represents the gesture with which the user holds the electronic display device with one hand.
Optionally, the electronic display device includes an electromagnetic wave energy absorption rate sensor, the touch data includes sensor data collected by the electromagnetic wave energy absorption rate sensor, and the second determination submodule is configured to:
determining a contact area between the user and the electronic display device according to the sensor data; and
determining the gesture with which the user holds the electronic display device according to the position distribution of the contact area on the electronic display device.
According to a third aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the steps of the display control method of an electronic display device provided in the first aspect of the present disclosure.
According to a fourth aspect of embodiments of the present disclosure, there is provided an electronic display device comprising:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to implement the steps of the display control method of the electronic display device provided in the first aspect of the present disclosure.
The technical solutions provided by the embodiments of the present disclosure may include the following beneficial effects:
In response to detecting an instruction to start the one-hand operation mode, the electronic display device generates a floating window interface according to the display interface in its current display state, where the floating window interface is a reduced-size version of the display interface. The floating window interface is displayed on an upper layer of the display interface within a target display area on the screen, where the target display area is determined based on the gesture with which the user holds the electronic display device with one hand. When it is detected that the user performs a first one-hand touch operation in the floating window interface, the first target interface is controlled to respond to the first one-hand touch operation, where the first one-hand touch operation is performed by the hand holding the electronic display device. A second target interface is controlled to change synchronously with the change of the first target interface, where the first target interface is one of the display interface and the floating window interface, and the second target interface is the other. Because the floating window interface is a reduced-size version of the display interface, a one-hand touch operation performed by the user in the floating window interface can achieve the corresponding one-hand touch operation on the display interface. And because the floating window interface is smaller than the display interface, it is more convenient than the display interface for one-hand touch operation.
This solves the problem in the related art that one-hand touch operation is inconvenient because the screen of a large-screen electronic display device is too large.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a flowchart illustrating a display control method of an electronic display device according to an exemplary embodiment.
Fig. 2 is a block diagram illustrating a display control apparatus of an electronic display device according to an exemplary embodiment.
Fig. 3 is a block diagram of an electronic display device, according to an example embodiment.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure as detailed in the appended claims.
It should be noted that all actions of acquiring signals, information, instructions, or data are performed in compliance with the applicable data protection laws and policies of the relevant country and with the authorization of the owner of the corresponding device.
Fig. 1 is a flowchart illustrating a display control method of an electronic display device according to an exemplary embodiment, and as shown in fig. 1, the display control method of the electronic display device may include the following steps.
In step S11, in response to detecting an instruction to start the one-hand operation mode, a floating window interface is generated according to the display interface in the current display state of the electronic display device, where the floating window interface is a reduced-size version of the display interface.
The one-hand operation mode may also be referred to as a one-hand touch mode, a small-screen operation mode, a virtual operation interface mode, or the like. The floating window interface is a reduced-size version of the display interface, and the two have the same display content. It should be noted that, in the present disclosure, the floating window interface does not belong to the display content of the display interface itself.
In some embodiments, the reduced-size floating window interface may be obtained by acquiring the display interface and scaling down its length and width in equal proportion. In other embodiments, the size of the floating window interface may be a default value, such as 3.5 inches, 4 inches, or 4.5 inches; the present disclosure does not specifically limit the size of the floating window interface.
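The two sizing options above (proportional scaling versus a default size) can be sketched as follows; the 0.4 scale factor and the function name are illustrative assumptions, not values from the disclosure.

```python
def floating_window_size(display_w, display_h, scale=0.4, default=None):
    """Compute the floating-window size either by uniformly scaling the
    display interface (preserving its aspect ratio) or by using a fixed
    default size, mirroring the two embodiments described above. The
    scale factor of 0.4 is an illustrative assumption."""
    if default is not None:
        return default
    return (int(display_w * scale), int(display_h * scale))
```

For a 1080x2400 display, proportional scaling at 0.4 would yield a 432x960 floating window, while passing `default` returns the fixed size unchanged.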
In response to detecting the instruction to start the one-hand operation mode, the floating window interface is generated according to the display interface in the current display state of the electronic display device. The instruction to start the one-hand operation mode may be an action such as the user tapping the screen in a preset manner, sliding on the screen along a preset track, pressing a preset button, inputting a preset voice command, or holding the electronic display device with a preset gesture.
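One of the trigger actions mentioned above, sliding along a preset track, might be detected as sketched below. The specific track (a roughly diagonal upward swipe) and the thresholds are illustrative assumptions only; the disclosure does not specify the track.

```python
def is_one_hand_mode_gesture(track, min_len=300):
    """Check whether a touch track (list of (x, y) points, y growing
    downward) matches an assumed preset trigger: an upward swipe of at
    least min_len pixels with a noticeable horizontal component."""
    if len(track) < 2:
        return False
    (x0, y0), (x1, y1) = track[0], track[-1]
    dx, dy = x1 - x0, y0 - y1  # upward swipe gives positive dy
    return dy >= min_len and abs(dx) >= min_len * 0.5
```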
In step S12, the floating window interface is displayed on an upper layer of the display interface within a target display area on a screen of the electronic display device, wherein the target display area is determined based on a gesture in which a user holds the electronic display device with one hand.
It should be noted that, since the electronic display device may display a superposition of multiple interfaces, in some embodiments the floating window interface may be displayed on an upper layer of the display interface within the target display area on the screen. The transparency of the floating window interface can be freely set by the user; by adjusting it, the degree to which the floating window interface occludes the display interface can be reduced.
The target display area is determined based on the gesture with which the user holds the electronic display device with one hand, and is an area outside the blind area of the user's one-hand touch operation.
In some embodiments, the display interface in the current display state of the electronic display device may refer to a main screen interface of the electronic display device, and may also refer to a superposition result of multiple interfaces other than the floating window interface in the present disclosure.
In step S13, when it is detected that the user performs a first one-hand touch operation in the floating window interface, the first target interface is controlled to respond to the first one-hand touch operation, where the first one-hand touch operation is performed by the hand with which the user holds the electronic display device.
For example, the first one-hand touch operation may be an operation performed by the hand with which the user holds the electronic display device; the specific operation may be tapping a function icon (such as a button control) of an application program in the floating window interface with a finger (or a finger wearing an auxiliary tool such as a touch glove). The first one-hand touch operation may also be zooming the floating window interface in or out with a finger, sliding the floating window interface with a finger (such as a page-turning operation), inputting characters in the floating window interface, drawing in the floating window interface, or playing video or audio in the floating window interface.
It should be noted that the first one-hand touch operation performed by the user in the floating window interface does not include dragging the floating window interface to adjust its display position on the screen of the electronic display device.
In step S14, a second target interface is controlled to perform synchronous change according to the change of the first target interface, where the first target interface is one of the display interface and the floating window interface, and the second target interface is the other of the display interface and the floating window interface.
When the first target interface changes in response to the first one-hand touch operation, the second target interface is controlled to change synchronously with it. That is, in the present disclosure, the first target interface and the second target interface map to each other: their display content remains consistent, while the sizes of the displayed content may differ.
For example, suppose the first target interface is the display interface and the second target interface is the floating window interface. When the display interface changes in response to the first one-hand touch operation, the floating window interface is controlled to change synchronously. In this case, an interface of an application program is configured on the display interface, and the corresponding interface can be called according to the first one-hand touch operation; calling the interface triggers the background application program to respond, so that the display interface changes accordingly.
As another example, suppose the first target interface is the floating window interface and the second target interface is the display interface. When the floating window interface changes in response to the first one-hand touch operation, the display interface is controlled to change synchronously. In this case, an interface of an application program is configured on the floating window interface, and the corresponding interface can be called according to the first one-hand touch operation; calling the interface triggers the background application program to respond, so that the floating window interface changes accordingly.
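The mutual mapping described above can be sketched with a pair of mirrored objects whose display content is kept consistent while their sizes differ. The class and attribute names are illustrative assumptions, not the disclosed implementation.

```python
class Interface:
    """Minimal stand-in for a display or floating-window interface:
    display content is shared between the pair, sizes stay independent
    (an illustrative model of the mapping relationship)."""
    def __init__(self, size):
        self.size = size
        self.content = ""
        self.mirror = None  # the other interface in the mapping

    def apply_change(self, new_content):
        # The first target interface responds to the touch operation...
        self.content = new_content
        # ...and the second target interface changes synchronously.
        if self.mirror is not None and self.mirror.content != new_content:
            self.mirror.content = new_content

# Either interface may act as the first target interface.
display = Interface((1080, 2400))
floating = Interface((432, 960))
display.mirror, floating.mirror = floating, display
```

A change applied to either interface propagates to the other, while the two sizes remain different, matching the statement that display content stays consistent but sizes may not.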
Optionally, the method further comprises:
controlling the first target interface to respond to a second one-hand touch operation when it is detected that the user performs the second one-hand touch operation in the display interface.
The second one-hand touch operation may be performed by the hand other than the one holding the electronic display device; for example, if the user holds the electronic display device with the right hand, the second one-hand touch operation is performed with the left hand.
In some implementations, the second one-hand touch operation may also be performed by the hand holding the electronic display device.
For example, the second one-hand touch operation may be the user (with the left hand) tapping a function icon (such as a button control) of an application program in the display interface using a finger, a stylus, or a mouse. It may also be zooming the display interface in or out with a finger, sliding the display interface (for example, turning pages) using a finger, stylus, or mouse, inputting characters in the display interface, drawing in the display interface, or playing video or audio in the display interface.
In some embodiments, if the user performs a touch operation in the floating window interface, the floating window interface is determined to be a first target interface, and the display interface is determined to be a second target interface. Thus, the display interface is a mapping interface (projection interface) of the floating window interface.
In other embodiments, if the user performs a touch operation in the display interface, the display interface is determined to be a first target interface, and the floating window interface is determined to be a second target interface. Thus, the floating window interface is a mapping interface (projection interface) of the display interface.
In other embodiments, to more simply and quickly generate the floating window interface, the display interface may be defaulted to a first target interface and the floating window interface may be defaulted to a second target interface. Thus, the floating window interface is a mapping interface (projection interface) of the display interface, and has a function of displaying a projection result.
With this method, in response to detecting an instruction to start the one-hand operation mode, the electronic display device generates a floating window interface according to the display interface in its current display state, where the floating window interface is a reduced-size version of the display interface. The floating window interface is displayed on an upper layer of the display interface within a target display area on the screen, where the target display area is determined based on the gesture with which the user holds the electronic display device with one hand. When it is detected that the user performs a first one-hand touch operation in the floating window interface, the first target interface is controlled to respond to that operation, where the first one-hand touch operation is performed by the hand holding the electronic display device. A second target interface is controlled to change synchronously with the change of the first target interface, where the first target interface is one of the display interface and the floating window interface, and the second target interface is the other. Because the floating window interface is a reduced-size version of the display interface, a one-hand touch operation performed in the floating window interface can achieve the corresponding one-hand touch operation on the display interface; and because the floating window interface is smaller than the display interface, it is more convenient for one-hand touch operation.
This solves the problem in the related art that one-hand touch operation is inconvenient because the screen of a large-screen electronic display device is too large.
Moreover, since the user can perform touch operations both in the floating window interface and in the display interface, the method allows the user to perform a one-hand touch operation in the floating window interface while operating in the display interface with the other hand. For example, in a teaching scenario, a teacher writes text in the floating window interface with the right hand while sliding the display interface leftward with the left hand to move the displayed image to the left, making it convenient to keep writing with the right hand and avoiding overlap of the written text.
Optionally, the first target interface is configured with an interface of an application program, and the controlling the first target interface to respond to the first one-hand touch operation includes:
determining a target application program according to the first single-hand touch operation; and controlling the first target interface to respond to the first one-hand touch operation by calling the interface of the target application program.
For example, assume the first one-hand touch operation is the user tapping the icon of application program A in the floating window interface. According to the operation position and the operation characteristic (such as a tap or a slide) of the first one-hand touch operation, the target application program is determined to be application program A, and the interface to call is the one that starts application program A. By calling that start interface of application program A, the first target interface is controlled to respond to the first one-hand touch operation and application program A is started; the first target interface then changes accordingly, for example into the main interface displayed after entering application program A.
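The dispatch just described — resolving which application a floating-window touch targets and invoking that application's start interface — can be sketched in Python. Everything here (the `App` class, `handle_first_touch`, the icon layout) is a hypothetical illustration, not an API from the disclosure:

```python
# Hypothetical sketch: dispatch a one-hand touch in the floating window to the
# interface of the application it targets. All names are invented for illustration.

class App:
    def __init__(self, name):
        self.name = name
        self.launched = False

    def launch(self):
        # The application's "start" interface; returns its main screen.
        self.launched = True
        return f"main screen of {self.name}"

def resolve_target_app(touch, icon_layout):
    """Map the touch position to the app whose icon occupies that spot."""
    return icon_layout.get(touch["position"])

def handle_first_touch(touch, icon_layout):
    """Determine the target app from position + operation characteristic, then
    call its start interface so the first target interface can change."""
    app = resolve_target_app(touch, icon_layout)
    if app is None or touch["feature"] != "tap":
        return None
    return app.launch()

app_a = App("Application A")
layout = {(0, 0): app_a}                      # icon grid position -> app
result = handle_first_touch({"position": (0, 0), "feature": "tap"}, layout)
# result -> "main screen of Application A"
```

The key point is that the floating window never renders the app itself; it only resolves the touch and calls the target application's interface, and the visible change follows from that call.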
Optionally, if the first target interface is the display interface and the second target interface is the floating window interface, controlling, when detecting that the user performs the first one-hand touch operation in the floating window interface, the first target interface to respond to the first one-hand touch operation includes:
Mapping the first single-hand touch operation to the display interface; and controlling the display interface to respond to the mapping result of the first one-hand touch operation.
For example, if the first target interface is the display interface and the second target interface is the floating window interface, then when it is detected that the user performs the first one-hand touch operation in the floating window interface, the first one-hand touch operation is mapped to the display interface, and the display interface responds to the mapping result. For instance, assume the first one-hand touch operation is the user tapping the icon of application program B in the floating window interface; the operation is mapped to the display interface, the mapping result is a tap on the icon of application program B in the display interface, and the display interface changes in response to that tap.
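The mapping step can be illustrated with a simple coordinate transform, under the assumption that the floating window is a uniformly scaled copy of the display interface; the rectangle and screen values below are made up for the example:

```python
def map_to_display(touch_xy, float_rect, display_size):
    """Scale a touch point inside the floating window to display-interface
    coordinates, assuming the floating window is a uniformly shrunk copy."""
    fx, fy, fw, fh = float_rect     # floating window origin (fx, fy) and size
    dw, dh = display_size           # full display-interface size
    x, y = touch_xy
    return ((x - fx) * dw / fw, (y - fy) * dh / fh)

# A 360x720 floating window placed at (600, 1200) on a 1080x2160 display:
mapped = map_to_display((780, 1560), (600, 1200, 360, 720), (1080, 2160))
# mapped -> (540.0, 1080.0): the tap lands at the display-interface centre
```

A tap on application B's icon in the floating window thus becomes a tap at the corresponding (larger) coordinates of the display interface, which is the "mapping result" the display interface responds to.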
Optionally, if the first target interface is a floating window interface and the second target interface is a display interface, mapping the second one-hand touch operation to the floating window interface when detecting that the user performs the second one-hand touch operation in the display interface. The floating window interface is responsive to a mapping result of the second one-hand touch operation.
In some embodiments, in the case where the electronic display device is a tablet computer, a touch-operable notebook, or a folding screen phone, one way to determine the target display area of the floating window interface on the screen of the electronic display device may be:
Touch data of the user on the electronic display device is acquired. The gesture with which the user holds the electronic display device with one hand is determined from the touch data. Based on that gesture, an operable area is determined on the screen of the electronic display device, i.e. the area within which the holding hand can perform one-hand touch operations. The target display area is then determined from this operable area: for example, the whole operable area is used as the display area, or only a part of the operable area is used as the display area.
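As a toy illustration of turning a recognized holding gesture into a target display area, here is a heuristic sketch; the half-screen `scale` and the corner placement are assumptions for the example, not values from the disclosure:

```python
def target_display_area(grip, screen_w, screen_h, scale=0.5):
    """Pick a thumb-reachable region near the holding hand.
    Returns (x, y, w, h) of the target display area, or None for a two-hand grip."""
    w, h = int(screen_w * scale), int(screen_h * scale)
    if grip == "left":
        return (0, screen_h - h, w, h)              # bottom-left corner
    if grip == "right":
        return (screen_w - w, screen_h - h, w, h)   # bottom-right corner
    return None  # both hands free: no floating window needed

area = target_display_area("right", 1080, 2160)
# area -> (540, 1080, 540, 1080): lower-right quadrant for a right-hand grip
```

A real implementation would derive the reachable region from the thumb's arc rather than a fixed quadrant, but the flow — gesture in, rectangle out — is the same.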
The touch data may include a touch image, and according to the touch data, one embodiment of determining a gesture of a user holding the electronic display device with one hand may be:
And inputting the touch image into the gesture recognition model after training, and obtaining a recognition result which is output by the gesture recognition model and is used for representing the gesture of the user holding the electronic display device by one hand.
Assume an m×n capacitance lattice is distributed on the screen of the electronic display device. Because a metal object or a human body close to the screen changes the capacitance at the corresponding position, when the user touches the screen of the electronic display device, the screen yields an m×n×1 touch image (capacitance image).
Since the user's hand may contact the screen of the electronic display device (e.g., at the screen edge) while holding it, an m×n×1 touch image can be obtained while the user holds the electronic display device. This m×n×1 touch image is input into the trained gesture recognition model to obtain a recognition result, output by the model, that characterizes the gesture with which the user holds the electronic display device.
The training mode of the gesture recognition model may be that a touch image sample and a gesture label of the touch image sample are determined, and the gesture recognition model is trained according to the touch image sample and the gesture label of the touch image sample, so as to obtain a trained gesture recognition model.
In one example, an m×n×1 touch image (assume 16×40×1) is input into a neural network (e.g., ResNet, ResNeXt, MobileNet, ShuffleNet), and X1 feature values are calculated. Taking MobileNet as an illustration: the 16×40×1 touch image is processed with the MobileNet convolution layers (conv) to obtain a 2×5×X1 convolution result. The average pooling layer (avg pool) of MobileNet computes the mean over the 2×5×X1 result to obtain a 1×1×X1 feature value (i.e., X1 feature values), which can represent the 16×40×1 touch image. The 1×1×X1 feature value is input into the fully connected + softmax layer of MobileNet to obtain a probability vector. The probability vector characterizes the gesture with which the user holds the electronic display device, such as holding with the right hand, holding with the left hand, or holding with neither hand alone, i.e., holding with both hands.
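The pooling and classification head described above can be mimicked in plain Python, omitting the convolution layers and using tiny made-up shapes (X1 = 2 feature maps of 2×5, three gesture classes). The weights are arbitrary; this only shows how average pooling followed by fully connected + softmax yields a probability vector:

```python
import math

def global_avg_pool(feature_maps):
    """Collapse each 2x5 feature map to one value: the 1x1xX1 feature vector."""
    return [sum(sum(row) for row in fm) / (len(fm) * len(fm[0]))
            for fm in feature_maps]

def softmax(logits):
    m = max(logits)                      # subtract max for numerical stability
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify(pooled, weights, biases):
    """Fully connected + softmax head over the pooled feature vector."""
    logits = [sum(w * f for w, f in zip(ws, pooled)) + b
              for ws, b in zip(weights, biases)]
    return softmax(logits)

# X1 = 2 feature maps of shape 2x5, then a 3-class head
# (classes: right hand, left hand, both hands — order is illustrative)
maps = [[[1.0] * 5 for _ in range(2)], [[0.5] * 5 for _ in range(2)]]
pooled = global_avg_pool(maps)                                  # -> [1.0, 0.5]
probs = classify(pooled, weights=[[1, 0], [0, 1], [1, 1]], biases=[0, 0, 0])
```

With these arbitrary weights the third class gets the largest logit (1.5), so the probability vector peaks there; in a trained model the weights would of course be learned from labelled touch-image samples.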
In another example, an m×n×1 touch image (assume 16×40×1) is segmented to obtain a first segmented image and a second segmented image, where the first segmented image is an image containing the peripheral edge pixels of the touch image. The number of pixels of the first segmented image is (16+40)×2×4 = 112×4, and the number of pixels of the second segmented image is (16−2×4)×(40−2×4) = 8×32, where 16 is the width of the touch image, 40 is its length, 2 indicates that the touch image has two long sides and two wide sides, and 4 is the segmentation width (i.e., each of the four sides of the touch image is moved inwards by 4 pixels to segment a sub-image of width 4; the four sub-images together form the first segmented image).
The first segmented image is input into a neural network (e.g., ResNet, ResNeXt, MobileNet, ShuffleNet), and Y1 feature values are calculated. Taking MobileNet as an illustration: the 112×4×1 first segmented image is processed with the MobileNet convolution layers (conv) to obtain a 28×1×Y1 convolution result. The average pooling layer (avg pool) of MobileNet computes the mean over the 28×1×Y1 result to obtain a 1×1×Y1 feature value (i.e., Y1 feature values), which can represent the 112×4×1 first segmented image.

Then, the second segmented image is input into a neural network (e.g., ResNet, ResNeXt, MobileNet, ShuffleNet), and Y2 feature values are calculated. Taking MobileNet as an illustration: the 8×32×1 second segmented image is processed with the MobileNet convolution layers (conv) to obtain a 2×5×Y2 convolution result. The average pooling layer (avg pool) of MobileNet computes the mean over the 2×5×Y2 result to obtain a 1×1×Y2 feature value (i.e., Y2 feature values), which can represent the 8×32×1 second segmented image.

The 1×1×Y1 feature value and the 1×1×Y2 feature value are concatenated (spliced) to obtain a 1×1×(Y1+Y2) vector, which is input into the fully connected + softmax layer of MobileNet to obtain a probability vector. The probability vector characterizes the gesture with which the user holds the electronic display device, such as holding with the right hand, holding with the left hand, or holding with both hands.
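The edge/interior split used by this example can be sketched directly. Note that taking four inward strips of width 4 counts the corner pixels twice, which is exactly why the strip total comes to (16+40)×2×4 = 448 pixels while the interior is 8×32 — the code below reproduces both counts:

```python
def segment(image, band=4):
    """Split a touch image into four overlapping edge strips plus an interior
    block, mirroring the segmentation described in the text."""
    rows, cols = len(image), len(image[0])
    strips = [
        [row[:band] for row in image],    # left edge strip,  rows x band
        [row[-band:] for row in image],   # right edge strip, rows x band
        [r[:] for r in image[:band]],     # top edge strip,   band x cols
        [r[:] for r in image[-band:]],    # bottom edge strip, band x cols
    ]
    # Corner pixels appear in two strips, so this equals (cols+rows)*2*band.
    edge_pixels = sum(len(r) for strip in strips for r in strip)
    interior = [row[band:cols - band] for row in image[band:rows - band]]
    return edge_pixels, interior

touch = [[0] * 16 for _ in range(40)]     # a 16x40 touch image (40 rows of 16)
edge_pixels, interior = segment(touch)
# edge_pixels -> 448 (= 112*4); interior -> 32 rows of 8 (= 8x32)
```

In the patent's pipeline the edge strips go to one feature branch and the interior to another, and the two pooled feature vectors are concatenated before the softmax head.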
In another example, an m×n×1 touch image (assume 16×40×1) is segmented to obtain a first segmented image, which is an image containing the peripheral edge pixels of the touch image. The number of pixels of the first segmented image is (16+40)×2×4 = 112×4, where 16 is the width of the touch image, 40 is its length, 2 indicates that the touch image has two long sides and two wide sides, and 4 is the segmentation width (i.e., each of the four sides of the touch image is moved inwards by 4 pixels to segment a sub-image of width 4).

The first segmented image is input into a neural network (e.g., ResNet, ResNeXt, MobileNet, ShuffleNet), and Y1 feature values are calculated. Taking MobileNet as an illustration: the 112×4×1 first segmented image is processed with the MobileNet convolution layers (conv) to obtain a 28×1×Y1 convolution result. The average pooling layer (avg pool) of MobileNet computes the mean over the 28×1×Y1 result to obtain a 1×1×Y1 feature value (i.e., Y1 feature values), which can represent the 112×4×1 first segmented image.

In addition, the m×n×1 touch image itself (assume 16×40×1) is input into a neural network (e.g., ResNet, ResNeXt, MobileNet, ShuffleNet), and Y2 feature values are calculated. Taking MobileNet as an illustration: the 16×40×1 touch image is processed with the MobileNet convolution layers (conv) to obtain a 2×5×Y2 convolution result. The average pooling layer (avg pool) of MobileNet computes the mean over the 2×5×Y2 result to obtain a 1×1×Y2 feature value (i.e., Y2 feature values), which can represent the 16×40×1 touch image.

The 1×1×Y1 feature value and the 1×1×Y2 feature value are concatenated (spliced) to obtain a 1×1×(Y1+Y2) vector, which is input into the fully connected + softmax layer of MobileNet to obtain a probability vector. The probability vector characterizes the gesture with which the user holds the electronic display device, such as holding with the right hand, holding with the left hand, or holding with both hands.
In another example, an m×n×1 touch image (assume 16×40×1) is input into a target detection model (such as a ResNet, ResNeXt, MobileNet, or ShuffleNet model), and X2 feature values are calculated. A first fully connected + softmax layer calculates the position (x, y, w, h) of each anchor box and the class probability value corresponding to each anchor box. The classes include finger false touch, purlicue (tiger's mouth) false touch, left-hand press, right-hand press, and so on. A second fully connected + softmax layer calculates a total probability vector from the positions (x, y, w, h) of the anchor boxes and their class probability values. The total probability vector characterizes the gesture with which the user holds the electronic display device, such as holding with the right hand, holding with the left hand, or holding with both hands.
An anchor box refers to a prior box used in target detection algorithms: several boxes with different predefined aspect ratios, centred on an anchor point.
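A minimal sketch of generating such prior boxes — equal area, varying aspect ratio, centred on an anchor point — might look as follows; the base area and the ratio set are arbitrary choices for illustration:

```python
def make_anchors(cx, cy, base_area=64.0, ratios=(0.5, 1.0, 2.0)):
    """Prior boxes of equal area but different aspect ratios, centred on the
    anchor point (cx, cy). Each box is returned as (x, y, w, h), matching the
    position format used in the text."""
    boxes = []
    for r in ratios:
        w = (base_area * r) ** 0.5      # width grows with the aspect ratio
        h = (base_area / r) ** 0.5      # height shrinks, keeping w*h constant
        boxes.append((cx - w / 2, cy - h / 2, w, h))
    return boxes

boxes = make_anchors(8.0, 20.0)
# three boxes, all with area 64; the ratio-1.0 box is the 8x8 square
```

The detection head then predicts, for every such prior box, an offset from (x, y, w, h) plus a class probability (finger false touch, left-hand press, etc.), exactly the pairing the example above describes.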
In the case where the electronic display device is a large-screen electronic display device such as a television set capable of touch operation, an electronic blackboard for teaching, or the like, one embodiment of determining a display area of the floating window interface on a screen of the electronic display device may be:
The position of the user relative to the electronic display device is acquired, and an operable area convenient for the user is determined based on that position; alternatively, the gesture with which the user holds the electronic display device with one hand is determined in order to find the operable area. The display area of the floating window interface is then determined from the operable area. For example, if the user is detected standing at the lower right corner of the electronic display device, the lower right corner of the screen of the electronic display device may be determined as the operable area convenient for the user.
Optionally, the electronic display device in the present disclosure may include an electromagnetic wave specific absorption rate (SAR) sensor, and the touch data may include sensor data acquired by the SAR sensor. One embodiment of determining the gesture with which the user holds the electronic display device from the touch data may be:
The contact areas between the user and the electronic display device are determined from the sensor data, and the gesture with which the user holds the electronic display device is determined from the position distribution of the contact areas on the device. For example, if the contact areas are located on the right rear side and the right front side of the electronic display device, it may be determined that the user is holding the device with the left hand.
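The position-distribution rule in this example can be captured by a trivial heuristic; the side labels and return strings below are invented for illustration:

```python
def infer_grip(contact_sides):
    """Map SAR-sensed contact positions to a holding hand.
    Heuristic from the example: a left-hand grip wraps the fingers around
    the device's right edge, so right-side contacts imply a left hand."""
    sides = set(contact_sides)
    if sides and sides <= {"right-rear", "right-front"}:
        return "left hand"
    if sides and sides <= {"left-rear", "left-front"}:
        return "right hand"
    return "both hands or unknown"

grip = infer_grip(["right-rear", "right-front"])
# grip -> "left hand"
```

A production implementation would work from the sensor's capacitance map rather than discrete side labels, but the inference direction (contact distribution to holding hand) is the same.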
The SAR sensor can determine the contact state between the user and the electronic display device (such as a mobile phone) by detecting capacitance changes.
In some embodiments, the display control method of the electronic display device may further include:
If no first one-hand touch operation by the user is detected in the floating window interface within a preset time period, the floating window interface is controlled to enter a hidden state. For example, the floating window interface is hidden in an area at the edge of the screen that is invisible to the user, and the user can drag it back out so that it is redisplayed in the visible area. As another example, the floating window interface is changed into a floating ball (floating point) with a smaller area, and tapping the floating ball changes it back into the floating window interface.
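The timeout-then-hide behaviour can be sketched with an injected clock so it is testable without real waiting; the state names and the timeout value are illustrative only:

```python
class FloatingWindow:
    """Auto-hide sketch: the window hides after `timeout` seconds without a
    touch, and any touch (tap/drag) restores it. The clock is injected so the
    logic is testable; a real device would pass time.monotonic."""
    def __init__(self, timeout, clock):
        self.timeout = timeout
        self.clock = clock
        self.last_touch = clock()
        self.state = "visible"

    def on_touch(self):
        # A first one-hand touch operation resets the idle timer.
        self.last_touch = self.clock()
        self.state = "visible"

    def tick(self):
        # Called periodically; hides the window once the idle timeout elapses,
        # e.g. by collapsing it into a floating ball at the screen edge.
        if self.state == "visible" and self.clock() - self.last_touch >= self.timeout:
            self.state = "hidden"
        return self.state

now = [0.0]
win = FloatingWindow(timeout=5.0, clock=lambda: now[0])
now[0] = 4.0
state_before = win.tick()    # 4 s idle: still visible
now[0] = 5.0
state_after = win.tick()     # 5 s idle: timeout reached, hidden
```

Injecting the clock rather than calling the system timer directly is a standard trick that keeps this kind of idle-timeout logic deterministic under test.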
In some embodiments, the display control method of the electronic display device may further include:
If the gesture of the user for holding the electronic display device is recognized to be holding by both hands, the floating window interface can be closed or controlled to enter a hidden state.
If it is detected that the user changes the gesture of holding the electronic display device, the display position of the floating window interface is adjusted according to the changed gesture.
In some embodiments, another way of determining the display area of the floating window interface on the screen of the electronic display device is: a fixed target display area is determined based on the user's selection or a preset.
Fig. 2 is a block diagram illustrating a display control apparatus of an electronic display device according to an exemplary embodiment. Referring to fig. 2, the apparatus 200 includes:
A generating module 210, configured to generate a floating window interface according to a display interface in a current display state of the electronic display device in response to detecting an instruction for starting a one-hand operation mode, where the floating window interface is an interface with a reduced size of the display interface;
A display module 220 configured to display the floating window interface on top of the display interface within a target display area on a screen of the electronic display device, wherein the target display area is determined based on a gesture of a user holding the electronic display device with one hand;
a response module 230, configured to control a first target interface to respond to a first one-hand touch operation when detecting that a user performs the first one-hand touch operation in the floating window interface, where the first one-hand touch operation is performed by a hand of the user holding the electronic display device with one hand;
The synchronization module 240 is configured to control a second target interface to perform a synchronization change according to a change of the first target interface, wherein the first target interface is one of the display interface and the floating window interface, and the second target interface is the other of the display interface and the floating window interface.
With this apparatus 200, in response to detecting an instruction to start the one-hand operation mode, the electronic display device generates a floating window interface according to the display interface in the current display state of the electronic display device, where the floating window interface is a size-reduced version of the display interface. The floating window interface is displayed on an upper layer of the display interface within a target display area on the screen of the electronic display device, where the target display area is determined based on the gesture with which the user holds the electronic display device with one hand. When it is detected that the user performs a first one-hand touch operation in the floating window interface, a first target interface is controlled to respond to the first one-hand touch operation, where the first one-hand touch operation is performed by the hand with which the user holds the electronic display device. A second target interface is controlled to change synchronously with the change of the first target interface, where the first target interface is one of the display interface and the floating window interface, and the second target interface is the other of the two. Because the floating window interface is a size-reduced version of the display interface, a one-hand touch operation performed by the user in the floating window interface can realize the corresponding one-hand touch operation on the display interface. And because the floating window interface is smaller than the display interface, it is more convenient than the display interface for the user to operate with one hand.
The apparatus thus solves the problem in the related art that one-hand touch operation is inconvenient because the screen of a large-screen electronic display device is oversized.
Optionally, the first target interface is configured with an interface of an application program, and the response module 230 includes:
The first determining submodule is configured to determine a target application program according to the first one-hand touch operation;
And the calling module is configured to control the first target interface to respond to the first one-hand touch operation by calling the interface of the target application program.
Optionally, the apparatus 200 further includes:
and the control module is configured to control the first target interface to respond to the second one-hand touch operation under the condition that the second one-hand touch operation of the user in the display interface is detected.
Optionally, the response module 230 is configured to:
If the first target interface is the display interface and the second target interface is the floating window interface, mapping the first one-hand touch operation to the display interface; and controlling the display interface to respond to the mapping result of the first one-hand touch operation.
Optionally, the display module 220 includes:
An acquisition sub-module configured to acquire touch data of a user on the electronic display device;
A second determining sub-module configured to determine a gesture of a user holding the electronic display device with one hand according to the touch data;
a third determining submodule configured to determine, according to the gesture with which the user holds the electronic display device with one hand, the operation area on the screen of the electronic display device within which the holding hand can perform one-hand touch operations;
a fourth determination sub-module configured to determine the target display area from the operation area;
and the display sub-module is configured to display the floating window interface on the upper layer of the display interface in the target display area.
Optionally, the touch data includes a touch image, and the second determining submodule is configured to:
And inputting the touch image into a gesture recognition model after training, and obtaining a recognition result which is output by the gesture recognition model and is used for representing the gesture of a user holding the electronic display device by one hand.
Optionally, the electronic display device includes an electromagnetic wave energy absorption rate sensor, the touch data includes sensor data collected by the electromagnetic wave energy absorption rate sensor, and the second determination submodule is configured to:
Determining a contact area between a user and the electronic display device according to the sensor data;
And determining the gesture of a user for holding the electronic display device according to the position distribution of the contact area on the electronic display device.
The specific manner in which the various modules perform operations in the apparatus of the above embodiments has been described in detail in the embodiments of the method and will not be repeated here.
The present disclosure also provides a computer-readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the steps of the display control method of an electronic display device provided by the present disclosure.
Fig. 3 is a block diagram of an electronic display device 800, according to an example embodiment. For example, electronic display device 800 may be a mobile phone, computer, digital broadcast terminal, messaging device, game console, tablet device, medical device, exercise device, personal digital assistant, or the like.
Referring to fig. 3, the electronic display device 800 may include one or more of the following components: a processing component 802, a memory 804, a power component 806, a multimedia component 808, an audio component 810, an input/output interface 812, a sensor component 814, and a communication component 816.
The processing component 802 generally controls overall operation of the electronic display device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 802 may include one or more processors 820 to execute instructions to perform all or part of the steps of the display control method of an electronic display device described above. Further, the processing component 802 can include one or more modules that facilitate interactions between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operation at the electronic display device 800. Examples of such data include instructions for any application or method operating on electronic display device 800, contact data, phonebook data, messages, pictures, videos, and the like. The memory 804 may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
The power supply component 806 provides power to the various components of the electronic display device 800. The power supply components 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the electronic display device 800.
The multimedia component 808 includes a screen that provides an output interface between the electronic display device 800 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may sense not only the boundary of a touch or swipe action, but also the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 808 includes a front camera and/or a rear camera. When the electronic display device 800 is in an operational mode, such as a photographing mode or a video mode, the front camera and/or the rear camera may receive external multimedia data. Each of the front and rear cameras may be a fixed optical lens system or have focal length and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the electronic display device 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may be further stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 further includes a speaker for outputting audio signals.
Input/output interface 812 provides an interface between processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: homepage button, volume button, start button, and lock button.
The sensor assembly 814 includes one or more sensors for providing status assessment of various aspects of the electronic display device 800. For example, the sensor assembly 814 may detect an on/off state of the electronic display device 800, a relative positioning of the components, such as a display and keypad of the electronic display device 800, the sensor assembly 814 may also detect a change in position of the electronic display device 800 or a component of the electronic display device 800, the presence or absence of a user's contact with the electronic display device 800, an orientation or acceleration/deceleration of the electronic display device 800, and a change in temperature of the electronic display device 800. The sensor assembly 814 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscopic sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the electronic display device 800 and other devices. The electronic display device 800 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 816 further includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the electronic display device 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), digital Signal Processors (DSPs), digital Signal Processing Devices (DSPDs), programmable Logic Devices (PLDs), field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements for performing the display control methods of the electronic display devices described above.
In an exemplary embodiment, a non-transitory computer readable storage medium is also provided, such as memory 804 including instructions executable by processor 820 of electronic display device 800 to perform the display control method of the electronic display device described above. For example, the non-transitory computer readable storage medium may be ROM, random Access Memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, etc.
In another exemplary embodiment, a computer program product is also provided, which comprises a computer program executable by a programmable apparatus, the computer program having code portions for performing the above-mentioned display control method of an electronic display device when being executed by the programmable apparatus.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.
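For illustration only — the sketch below is not part of the patent disclosure — the one-hand-mode flow of the display control method described above (generate a reduced-size floating window, then place it in a target display area reachable by the holding hand) might look as follows in Python. The `Rect` type, the 0.5 scale factor, and the corner-placement heuristic are all assumptions, not details taken from the disclosure:

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

def make_floating_window(display: Rect, scale: float = 0.5) -> Rect:
    """Generate a reduced-size copy of the current display interface."""
    return Rect(0.0, 0.0, display.w * scale, display.h * scale)

def place_in_target_area(win: Rect, screen: Rect, holding_hand: str) -> Rect:
    """Anchor the floating window in the lower corner on the side of the
    hand holding the device, where the thumb can reach it."""
    y = screen.h - win.h
    x = 0.0 if holding_hand == "left" else screen.w - win.w
    return Rect(x, y, win.w, win.h)
```

Under these assumptions, on a 1080x2400 screen held in the right hand, the half-size window is anchored at (540, 1200), i.e. the bottom-right quadrant near the right thumb.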

Claims (10)

  1. A display control method of an electronic display device, the method comprising:
    in response to detecting an instruction to start a one-hand operation mode, generating a floating window interface according to a display interface in a current display state of the electronic display device, wherein the floating window interface is a reduced-size version of the display interface;
    displaying the floating window interface on an upper layer of the display interface within a target display area on a screen of the electronic display device, wherein the target display area is determined based on a gesture with which a user holds the electronic display device with one hand;
    controlling a first target interface to respond to a first one-hand touch operation when it is detected that the user performs the first one-hand touch operation in the floating window interface, wherein the first one-hand touch operation is performed by the hand with which the user holds the electronic display device; and
    controlling a second target interface to change synchronously with the change of the first target interface, wherein the first target interface is one of the display interface and the floating window interface, and the second target interface is the other of the display interface and the floating window interface.
  2. The method of claim 1, wherein the first target interface is configured with an interface of an application program, and controlling the first target interface to respond to the first one-hand touch operation comprises:
    determining a target application program according to the first one-hand touch operation; and
    controlling the first target interface to respond to the first one-hand touch operation by calling the interface of the target application program.
  3. The method of claim 1, further comprising:
    controlling the first target interface to respond to a second one-hand touch operation when it is detected that the user performs the second one-hand touch operation in the display interface.
  4. The method of claim 1, wherein, when the first target interface is the display interface and the second target interface is the floating window interface, controlling the first target interface to respond to the first one-hand touch operation when it is detected that the user performs the first one-hand touch operation in the floating window interface comprises:
    mapping the first one-hand touch operation onto the display interface; and
    controlling the display interface to respond to a result of the mapping of the first one-hand touch operation.
  5. The method of any one of claims 1-4, wherein displaying the floating window interface on an upper layer of the display interface within a target display area on a screen of the electronic display device comprises:
    acquiring touch data of the user on the electronic display device;
    determining, according to the touch data, a gesture with which the user holds the electronic display device with one hand;
    determining, according to the gesture with which the user holds the electronic display device with one hand, an operation area on the screen of the electronic display device for one-hand touch operation by the hand holding the electronic display device;
    determining the target display area according to the operation area; and
    displaying the floating window interface on the upper layer of the display interface within the target display area.
  6. The method of claim 5, wherein the touch data comprises a touch image, and determining the gesture with which the user holds the electronic display device with one hand according to the touch data comprises:
    inputting the touch image into a trained gesture recognition model, and obtaining a recognition result output by the gesture recognition model that represents the gesture with which the user holds the electronic display device with one hand.
  7. The method of claim 5, wherein the electronic display device comprises an electromagnetic wave energy absorption rate sensor, the touch data comprises sensor data collected by the electromagnetic wave energy absorption rate sensor, and determining the gesture with which the user holds the electronic display device with one hand according to the touch data comprises:
    determining a contact area between the user and the electronic display device according to the sensor data; and
    determining the gesture with which the user holds the electronic display device according to the position distribution of the contact area on the electronic display device.
  8. A display control apparatus of an electronic display device, the apparatus comprising:
    a generating module configured to, in response to detecting an instruction to start a one-hand operation mode, generate a floating window interface according to a display interface in a current display state of the electronic display device, wherein the floating window interface is a reduced-size version of the display interface;
    a display module configured to display the floating window interface on an upper layer of the display interface within a target display area on a screen of the electronic display device, wherein the target display area is determined based on a gesture with which a user holds the electronic display device with one hand;
    a response module configured to control a first target interface to respond to a first one-hand touch operation when it is detected that the user performs the first one-hand touch operation in the floating window interface, wherein the first one-hand touch operation is performed by the hand with which the user holds the electronic display device; and
    a synchronization module configured to control a second target interface to change synchronously with the change of the first target interface, wherein the first target interface is one of the display interface and the floating window interface, and the second target interface is the other of the display interface and the floating window interface.
  9. A computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the steps of the method of any one of claims 1-7.
  10. An electronic display device, comprising:
    a memory having a computer program stored thereon; and
    a processor configured to execute the computer program in the memory to implement the steps of the method of any one of claims 1-7.
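As a purely illustrative sketch (not part of the claims), the mapping in claim 4 — projecting a touch inside the floating window onto the full-size display interface — amounts to a linear rescaling of coordinates. The function name and the (x, y, width, height) rectangle convention below are assumptions:

```python
def map_touch_to_display(touch, win, display):
    """Map a touch point inside the floating window to the corresponding
    point on the full-size display interface.

    `win` and `display` are (x, y, width, height) tuples; `touch` is (x, y).
    """
    tx, ty = touch
    wx, wy, ww, wh = win
    dx, dy, dw, dh = display
    nx = (tx - wx) / ww  # normalized position inside the floating window
    ny = (ty - wy) / wh
    return dx + nx * dw, dy + ny * dh
```

Under this convention, a tap at the center of the floating window maps to the center of the display interface, so the display interface can be controlled to respond as if it had been touched there directly.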
CN202280004180.3A 2022-06-20 2022-06-20 Electronic display device, display control method and device thereof, and storage medium Pending CN117999537A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/099937 WO2023245373A1 (en) 2022-06-20 2022-06-20 Electronic display device and display control method and apparatus therefor, and storage medium

Publications (1)

Publication Number Publication Date
CN117999537A true CN117999537A (en) 2024-05-07

Family

ID=89378946

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280004180.3A Pending CN117999537A (en) 2022-06-20 2022-06-20 Electronic display device, display control method and device thereof, and storage medium

Country Status (2)

Country Link
CN (1) CN117999537A (en)
WO (1) WO2023245373A1 (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110037040A (en) * 2009-10-05 2011-04-13 삼성전자주식회사 Method for displaying screen thereof and a portable terminal
CN106569672A (en) * 2016-11-09 2017-04-19 宇龙计算机通信科技(深圳)有限公司 Application icon managing method and terminal equipment
CN108366301B (en) * 2018-04-24 2021-03-09 中国广播电视网络有限公司 Android-based video suspension playing method
CN109165076B (en) * 2018-10-17 2022-03-29 Oppo广东移动通信有限公司 Game application display method, device, terminal and storage medium
CN111124201A (en) * 2019-11-29 2020-05-08 华为技术有限公司 One-hand operation method and electronic equipment
CN113805745B (en) * 2021-08-12 2022-09-20 荣耀终端有限公司 Control method of suspension window and electronic equipment

Also Published As

Publication number Publication date
WO2023245373A1 (en) 2023-12-28

Similar Documents

Publication Publication Date Title
US10788978B2 (en) Method and apparatus for displaying interface and storage medium
US10007841B2 (en) Human face recognition method, apparatus and terminal
US20190235716A1 (en) Method and device for displaying interface
CN111031398A (en) Video control method and electronic equipment
US20210333948A1 (en) Method, device, and storage medium for controlling display of floating window
JP6093088B2 (en) Method and apparatus for coordinating web pages and electronic devices
CN107172347B (en) Photographing method and terminal
CN106980409B (en) Input control method and device
CN110941375B (en) Method, device and storage medium for locally amplifying image
CN107729880A (en) Method for detecting human face and device
CN113655929A (en) Interface display adaptation processing method and device and electronic equipment
CN111522498A (en) Touch response method and device and storage medium
KR20220093091A (en) Labeling method and apparatus, electronic device and storage medium
CN112333395A (en) Focusing control method and device and electronic equipment
CN111273979A (en) Information processing method, device and storage medium
US20230393649A1 (en) Method and device for inputting information
CN117999537A (en) Electronic display device, display control method and device thereof, and storage medium
CN115543064A (en) Interface display control method, interface display control device and storage medium
CN114245017A (en) Shooting method and device and electronic equipment
CN112165584A (en) Video recording method, video recording device, electronic equipment and readable storage medium
CN113518149B (en) Screen display method and device and storage medium
CN117762358A (en) Display control method, device and storage medium
CN114546203A (en) Display method, display device, electronic apparatus, and readable storage medium
CN114546576A (en) Display method, display device, electronic apparatus, and readable storage medium
CN117149317A (en) Method, device, equipment and storage medium for displaying interface

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination