WO2018103257A1 - Method for controlling screen state and user terminal - Google Patents


Info

Publication number
WO2018103257A1
Authority
WO
WIPO (PCT)
Prior art keywords
screen
state
touch points
touch
user terminal
Prior art date
Application number
PCT/CN2017/082625
Other languages
English (en)
Chinese (zh)
Inventor
兰娟
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2018103257A1

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • the present application relates to the field of terminal technologies, and in particular, to a method for controlling a screen state and a user terminal.
  • Embodiments of the present invention provide a method for controlling a screen state and a user terminal, which determine the screen states of the two screens of the user terminal from the number of touch points on each screen, thereby reducing the operations the user performs to light or turn off a screen and further improving the operating efficiency of the user terminal.
  • an embodiment of the present invention provides a method for controlling a state of a screen, where the method is applied to a user terminal, including:
  • The screen states of the two screens are determined from the number of touch points on the two screens of the user terminal, which reduces the operations the user performs to light or turn off a screen, thereby improving the operating efficiency of the user terminal.
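  • As a minimal illustration of the comparison just described, the following Python sketch maps the two touch-point counts to target screen states. The function name and the "lit"/"off" labels are illustrative assumptions, not taken from the patent:

```python
# Illustrative sketch only: maps touch-point counts to target screen states.
# The function name and "lit"/"off" labels are assumed, not from the patent.

def decide_screen_states(num_first: int, num_second: int):
    """Fewer touch points on a screen suggests the user's fingers are not
    wrapped around that side, i.e. the user is about to view that screen."""
    if num_first < num_second:
        return ("lit", "off")    # about to use the first screen
    if num_first > num_second:
        return ("off", "lit")    # about to use the second screen
    return (None, None)          # tie: fall back to occlusion area or duration

print(decide_screen_states(1, 4))  # one-handed grip of FIG. 3: first screen lit
```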
  • Optionally, before determining the screen state of at least one of the first screen and the second screen, the user terminal may detect the current screen states of the first screen and the second screen, and adjust the screen state of the at least one screen according to those current screen states.
  • The current screen state is the screen state before the touch signal on the at least one screen is detected.
  • Optionally, the method further includes: when the number of first touch points is greater than the number of second touch points, the user terminal determines that the screen state of the first screen is the off state and/or determines that the screen state of the second screen is the lit state.
  • Optionally, the method further includes: when the number of first touch points is equal to the number of second touch points, the user terminal may further detect the occlusion areas to determine the screen state of the at least one screen.
  • Specifically, the user terminal detects a first occlusion area of the first screen and a second occlusion area of the second screen; if the first occlusion area is larger than the second occlusion area, it determines that the screen state of the first screen is the off state and/or that the screen state of the second screen is the lit state; if the first occlusion area is smaller than the second occlusion area, it determines that the screen state of the first screen is the lit state and/or that the screen state of the second screen is the off state.
  • The first occlusion area and the second occlusion area are determined differently depending on whether the user holds the user terminal with one hand or with both hands.
  • When it is detected that the user holds the user terminal with one hand, the first occlusion area may be calculated from the position of each touch point on the first screen and the contact position of the user's palm with the side of the user terminal; likewise, the second occlusion area in the one-handed case is calculated in the same manner as the first occlusion area.
  • When it is detected that the user holds the user terminal with both hands, the first occlusion area may be determined by summing the occlusion areas of the two hands, where the occlusion area of each hand may be determined as in the one-handed case; likewise, the second occlusion area in the two-handed case is calculated in the same manner as the first occlusion area.
  • Optionally, the method further includes: when the first occlusion area is equal to the second occlusion area, the user terminal may further acquire the durations for which the touch points have stayed on the screens, specifically a first duration of the first touch points on the first screen and a second duration of the second touch points on the second screen.
  • Specifically, when the first occlusion area is equal to the second occlusion area, the user terminal acquires the first duration of the first touch points on the first screen and the second duration of the second touch points on the second screen; if the first duration is greater than the second duration, it determines that the screen state of the first screen is the off state and/or that the screen state of the second screen is the lit state; if the first duration is less than the second duration, it determines that the screen state of the first screen is the lit state and/or that the screen state of the second screen is the off state.
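  • The chain of tie-breakers (touch-point counts, then occlusion areas, then durations) can be sketched as follows. This is a hedged illustration with assumed names: the patent describes the decisions but not code, and in every metric the screen with less contact is treated as the one the user is about to view:

```python
# Illustrative decision chain: counts first, then occlusion areas, then
# durations. The screen with less contact is the one about to be used.
# All names here are assumptions for illustration only.

def decide(num1, num2, area1=0.0, area2=0.0, dur1=0.0, dur2=0.0):
    """Return (first_screen_state, second_screen_state), or (None, None)
    if every metric is tied."""
    for a, b in ((num1, num2), (area1, area2), (dur1, dur2)):
        if a < b:
            return ("lit", "off")   # less contact on the first screen
        if a > b:
            return ("off", "lit")   # less contact on the second screen
    return (None, None)

# Equal counts, but the first screen is occluded more: turn it off.
print(decide(3, 3, area1=20.0, area2=15.0))
```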
  • Optionally, the method further includes: when the number of first touch points is equal to the number of second touch points, the user terminal may further obtain the durations of the touch points on the screens to determine the screen state of the at least one screen, specifically the first duration of the first touch points on the first screen and the second duration of the second touch points on the second screen.
  • Specifically, the first duration of the first touch points on the first screen and the second duration of the second touch points on the second screen are acquired; if the first duration is greater than the second duration, it is determined that the screen state of the first screen is the off state and/or that the screen state of the second screen is the lit state; if the first duration is less than the second duration, it is determined that the screen state of the first screen is the lit state and/or that the screen state of the second screen is the off state.
  • The first duration may be obtained by first calculating, for each first touch point on the first screen, the duration for which it has stayed at the same touch position up to the current moment, then calculating the average of these durations over all first touch points and determining that average as the first duration; alternatively, the maximum of these durations may be determined as the first duration. The method for determining the first duration is not limited in this embodiment.
  • The second duration may be calculated in the same manner as the first duration.
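  • The two aggregation strategies just mentioned, averaging the per-point dwell times or taking their maximum, can be sketched as follows (the function and parameter names are illustrative):

```python
# Aggregates per-touch-point dwell times (seconds) into a single screen
# duration, using either of the two strategies the text describes.

def screen_duration(dwell_times, method="mean"):
    if not dwell_times:
        return 0.0
    if method == "mean":
        return sum(dwell_times) / len(dwell_times)
    if method == "max":
        return max(dwell_times)
    raise ValueError(f"unknown method: {method}")

dwell = [1.2, 0.8, 1.0]               # dwell time of each touch point
print(screen_duration(dwell))          # average dwell time
print(screen_duration(dwell, "max"))   # longest dwell time
```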
  • When the number of first touch points is less than the number of second touch points, the specific implementation in which the user terminal determines that the screen state of the first screen is the lit state and/or determines that the screen state of the second screen is the off state is:
  • If the current screen states of the first screen and the second screen are both the off state, the screen state of the first screen is adjusted to the lit state, and/or the screen state of the second screen is controlled to remain off;
  • If the current screen state of the first screen is the off state and that of the second screen is the lit state, the screen state of the first screen is adjusted to the lit state, and/or the screen state of the second screen is adjusted to the off state;
  • If the current screen state of the first screen is the lit state and that of the second screen is the off state, the screen state of the first screen is controlled to remain lit, and/or the screen state of the second screen is controlled to remain off;
  • When the number of first touch points is greater than the number of second touch points, the specific implementation in which the user terminal determines that the screen state of the first screen is the off state and/or determines that the screen state of the second screen is the lit state is:
  • If the current screen states of the first screen and the second screen are both the off state, the screen state of the first screen is controlled to remain off, and/or the screen state of the second screen is adjusted to the lit state;
  • If the current screen state of the first screen is the off state and that of the second screen is the lit state, the screen state of the first screen is controlled to remain off, and/or the screen state of the second screen is controlled to remain lit;
  • If the current screen state of the first screen is the lit state and that of the second screen is the off state, the screen state of the first screen is adjusted to the off state, and/or the screen state of the second screen is adjusted to the lit state;
  • If the current screen states of the first screen and the second screen are both the lit state, the screen state of the first screen is adjusted to the off state, and/or the screen state of the second screen is controlled to remain lit.
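  • The case analysis above reduces to one rule per screen: if a screen is already in its target state it is held there, otherwise it is adjusted. A hypothetical sketch, with assumed names:

```python
# Illustrative: combine a screen's current state with its target state.
# "hold" and "adjust" correspond to "control ... to remain" and "adjust ..."
# in the text above; the names are assumptions, not from the patent.

def apply_state(current: str, target: str) -> str:
    return "hold" if current == target else "adjust"

# Fewer first touch points => target: first screen lit, second screen off.
for cur1, cur2 in [("off", "off"), ("off", "lit"), ("lit", "off"), ("lit", "lit")]:
    print(cur1, cur2, "->", apply_state(cur1, "lit"), "/", apply_state(cur2, "off"))
```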
  • an embodiment of the present invention provides a user terminal, where the user terminal is configured with a first screen and a second screen, where the user terminal includes:
  • a detecting module configured to detect a touch signal on at least one of the first screen and the second screen;
  • a quantity determining module configured to determine, according to the at least one touch signal, a number of first touch points on the first screen and a number of second touch points on the second screen;
  • a state determining module configured to determine, if the number of first touch points is less than the number of second touch points, that the screen state of the first screen is the lit state and/or that the screen state of the second screen is the off state.
  • In structure, a user terminal includes a processor and a touch screen, where the touch screen is communicatively coupled to the processor, and the processor is configured to perform the method provided by the first aspect of the present application.
  • Optionally, the user terminal may further include a memory, where the memory is used to store application code that supports the user terminal in performing the above method, and the processor is configured to execute the application stored in the memory.
  • A computer storage medium is provided, which stores program code; when the program code is executed by a computing device, the screen state control method provided by the first aspect or any implementation of the first aspect is performed.
  • the storage medium includes, but is not limited to, a flash memory, a hard disk drive (HDD), or a solid state drive (SSD).
  • A computer program product is provided; when the computer program product is executed by a computing device, the screen state control method provided by the first aspect or any implementation of the first aspect is performed.
  • The names of the user terminal, the touch screen, the first screen, and the second screen are not limited in the embodiments of the present invention.
  • The devices may appear under other names; as long as the functions of the respective devices are similar to those in the present application, they fall within the scope of the claims and their equivalents.
  • In the embodiments, the user terminal is configured with a first screen and a second screen; it detects a touch signal on at least one of the first screen and the second screen, and determines, according to the at least one touch signal, the number of first touch points on the first screen and the number of second touch points on the second screen; if the number of first touch points is less than the number of second touch points, the screen state of the first screen is determined to be the lit state and/or the screen state of the second screen is determined to be the off state.
  • The screen states of the two screens are determined from the number of touch points on the two screens of the user terminal, which reduces the operations the user performs to light or turn off a screen, thereby improving the operating efficiency of the user terminal.
  • FIG. 1A is a front elevational view of a possible user terminal according to an embodiment of the present invention.
  • FIG. 1B is a schematic rear view of a possible user terminal according to an embodiment of the present invention.
  • FIG. 2 is a schematic flowchart of a method for screen state control according to an embodiment of the present invention
  • FIG. 3 is a diagram showing an example of indication of a touch point in a single-handed grip according to an embodiment of the present invention
  • FIG. 4 is a diagram showing an example of indication of a touch point in a case of two-handed grip according to an embodiment of the present invention
  • FIG. 5A is a diagram showing an example of an occlusion area in a single-handed grip according to an embodiment of the present invention.
  • FIG. 5B is a diagram showing an example of an occlusion area in a single-handed grip according to an embodiment of the present invention.
  • 6A is a diagram showing an example of an occlusion area in a case of holding both hands according to an embodiment of the present invention
  • 6B is a diagram showing an example of an occlusion area in the case of two-handed grip according to an embodiment of the present invention.
  • FIG. 7 is a schematic structural diagram of a user terminal according to an embodiment of the present invention.
  • FIG. 8 is a functional block diagram of another user terminal according to an embodiment of the present invention.
  • Referring to FIG. 1A and FIG. 1B, a front view and a rear view of a possible user terminal are provided according to an embodiment of the present invention.
  • The front view in FIG. 1A shows the A-side screen of the user terminal, e.g., the first screen, and the rear view in FIG. 1B shows the B-side screen of the user terminal, e.g., the second screen.
  • In the process of using the user terminal, suppose the user is using the first screen on the front side and not the second screen, that is, the first screen is currently in the lit state and the second screen is in the off state. If the user rotates the terminal from the front side to the back side, then after the rotation ends the user needs to light the second screen on the back side and turn off the first screen on the front side in order to use the second screen without causing misoperation on the first screen. It can be seen that, in this way, the user needs multiple operations to complete the switch between the first screen and the second screen, which reduces the user's operating efficiency.
  • In the embodiments of the present invention, upon receiving a touch signal on at least one of the first screen and the second screen, the user terminal may determine, according to the at least one touch signal, the number of first touch points on the first screen and the number of second touch points on the second screen; if the number of first touch points is less than the number of second touch points, it determines that the screen state of the first screen is the lit state and/or that the screen state of the second screen is the off state.
  • The user terminal may include, but is not limited to, electronic devices configured with two screens, such as a mobile phone, a tablet, or a smart band.
  • the user terminal is configured with a first screen and a second screen.
  • A screen state control method is provided according to an embodiment of the present invention. As shown in FIG. 2, the method includes steps 101 to 111, described in detail below.
  • the user terminal detects at least one touch signal for the first screen and the second screen.
  • The touch signal may be generated by a finger or palm of the user performing a touch operation on the first screen or the second screen.
  • the user terminal determines the number of the first touch points on the first screen and the number of the second touch points on the second screen, respectively, according to the detected at least one touch signal.
  • The number of first touch points and the number of second touch points are each an integer greater than or equal to zero; since the user terminal has detected at least one touch signal, the number of first touch points and the number of second touch points are not both zero.
  • FIG. 3 is a schematic diagram of an indication of a touch point in a single-handed grip according to an embodiment of the present invention.
  • In FIG. 3, the user holds the user terminal in the right hand; the front side of the user terminal is the first screen and the back side is the second screen.
  • The user terminal can detect the touch signal generated by the touch operation of the thumb on the first screen; the first touch point on the first screen is the position where the thumb performs the touch operation, so the number of first touch points on the first screen is determined to be 1.
  • Likewise, the user terminal detects the touch signals generated by the touch operations of the other fingers on the second screen; the second touch points on the second screen are the positions where the other fingers perform touch operations. As can be seen from the figure, four fingers perform touch operations on the second screen, at positions B1, B2, B3, and B4 respectively, and it is further determined that the number of second touch points on the second screen is 4.
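  • Counting the touch points per screen, as in the FIG. 3 example above, can be sketched as follows; the (screen, point) event representation is an assumption for illustration:

```python
# Illustrative sketch of determining per-screen touch point counts from raw
# touch events. The (screen_id, point_id) event format is assumed.
from collections import Counter

def count_touch_points(events):
    """events: iterable of (screen_id, point_id); returns counts per screen."""
    distinct = {(screen, point) for screen, point in events}
    return Counter(screen for screen, _ in distinct)

# One-handed grip: thumb on the first screen, four fingers on the second.
events = [("first", "thumb"),
          ("second", "index"), ("second", "middle"),
          ("second", "ring"), ("second", "little")]
counts = count_touch_points(events)
print(counts["first"], counts["second"])  # 1 4
```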
  • FIG. 4 is a schematic diagram of an indication of a touch point in a case of holding both hands according to an embodiment of the present invention.
  • In FIG. 4, the user holds the user terminal with both hands; the front side of the user terminal is the first screen and the back side is the second screen.
  • When the user terminal is held with both hands, as shown in FIG. 4, the user terminal can detect the touch signals generated by the touch operations of the thumbs on the first screen; the first touch points on the first screen are the positions A2 and A3 where the two thumbs perform touch operations, so the number of first touch points on the first screen is determined to be 2. It also detects the touch signals generated by the touch operations of the user's other fingers on the second screen; the second touch points on the second screen are the positions where the other fingers perform touch operations. As can be seen from the figure, the four fingers of the left hand and three fingers of the right hand perform touch operations on the second screen, at positions B5, B6, B7, B8, B9, B10, and B11 respectively, so the number of second touch points on the second screen is determined to be 7.
  • The user terminal then compares the two touch point counts. It can be understood that the number of first touch points may be less than, equal to, or greater than the number of second touch points. For the specific cases, refer to the descriptions of steps 104 to 111 below.
  • If, when comparing the two counts, the number of first touch points is less than the number of second touch points, the user terminal determines that the screen state of the first screen is the lit state and/or determines that the screen state of the second screen is the off state. A smaller number of first touch points indicates that the user is about to use the first screen, and the screen state of the first screen may be determined in response to the user's need to use it.
  • Thus, the user does not need to light the first screen by other operations such as pressing a button, which reduces the user's screen-lighting operations. In addition, since the user is about to use the first screen, the user terminal can turn off the second screen, which prevents the user from misoperating the second screen and also saves power; the user does not need to turn off the second screen by other operations such as pressing a button, which likewise reduces the user's screen-off operations.
  • Specifically, by comparing the number of first touch points with the number of second touch points, the user terminal may adjust the screen state of at least one of the first screen and the second screen, or control at least one of them to maintain its current screen state.
  • The current screen state here refers to the screen state before the user terminal detects the touch signal on the at least one screen; whether the user terminal adjusts a screen state or keeps it therefore depends on the current screen state. This is described in detail below.
  • Optionally, before determining that the screen state of the first screen is the lit state and/or determining that the screen state of the second screen is the off state, the user terminal may also detect the current screen states of the first screen and the second screen, and determine, according to the number of first touch points, the number of second touch points, the current screen state of the first screen, and the current screen state of the second screen, that the screen state of the first screen is the lit state and/or that the screen state of the second screen is the off state.
  • If the current screen state of the first screen is the off state and the current screen state of the second screen is the off state, and the number of first touch points is less than the number of second touch points, the screen state of the first screen is adjusted to the lit state, and/or the screen state of the second screen is controlled to remain off.
  • In this way, when the user needs to use the first screen, at least one of adjusting the screen state of the first screen from the off state to the lit state and controlling the screen state of the second screen to remain off can be implemented, which reduces the user's key presses and other operations for lighting or turning off a screen, thereby improving the operating efficiency of the user terminal.
  • For example, if the user terminal detects that the number of first touch points is less than the number of second touch points, this indicates that the user is about to use the first screen, so the user terminal adjusts the screen state of the first screen to the lit state and/or controls the screen state of the second screen to remain off.
  • If the current screen state of the first screen is the off state and the current screen state of the second screen is the lit state, and the number of first touch points is less than the number of second touch points, the screen state of the first screen is adjusted to the lit state, and/or the screen state of the second screen is adjusted to the off state.
  • In this way, at least one of adjusting the screen state of the first screen to the lit state and adjusting the screen state of the second screen to the off state can be implemented, which reduces the user's key presses and other operations for lighting or turning off a screen, thereby improving the operating efficiency of the user terminal.
  • For example, if the user is using the second screen of the user terminal, that is, the current screen state of the first screen is the off state and the current screen state of the second screen is the lit state, and the user terminal detects that the number of first touch points is less than the number of second touch points, this indicates that the user is about to use the first screen, so the user terminal adjusts the screen state of the first screen to the lit state and/or adjusts the screen state of the second screen to the off state.
  • If the current screen state of the first screen is the lit state and the current screen state of the second screen is the off state, and the number of first touch points is less than the number of second touch points, the screen state of the first screen is controlled to remain lit, and/or the screen state of the second screen is controlled to remain off.
  • For example, if the user is using the first screen of the user terminal, that is, the current screen state of the first screen is the lit state and the current screen state of the second screen is the off state, and the user terminal detects that the number of first touch points is less than the number of second touch points, this indicates that the user continues to use the first screen, so the user terminal does not adjust the screen states of the first screen and the second screen, thereby improving the operating efficiency of the user terminal.
  • If the current screen state of the first screen is the lit state and the current screen state of the second screen is the lit state, and the number of first touch points is less than the number of second touch points, the screen state of the first screen is controlled to remain lit, and/or the screen state of the second screen is adjusted to the off state.
  • In this way, when the user needs to use the first screen of the user terminal, at least one of controlling the screen state of the first screen to remain lit and adjusting the screen state of the second screen to the off state can be implemented, which reduces the user's key presses and other operations for lighting or turning off a screen, thereby improving the operating efficiency of the user terminal.
  • For example, if both screens of the user terminal are in the lit state and the user terminal detects that the number of first touch points is less than the number of second touch points, the user is about to use the first screen, so the user terminal controls the screen state of the first screen to remain lit and/or adjusts the screen state of the second screen to the off state.
  • After determining the number of first touch points and the number of second touch points, if the number of first touch points is greater than the number of second touch points, the user terminal determines that the screen state of the first screen is the off state and/or determines that the screen state of the second screen is the lit state. A greater number of first touch points indicates that the user is about to use the second screen.
  • The screen state of the second screen is determined to be the lit state, so the user does not need to light the second screen by other operations such as pressing a button, which reduces the user's screen-lighting operations. In addition, since the user is about to use the second screen, the user terminal can turn off the first screen, which prevents the user from misoperating the first screen and also saves power; the user does not need to turn off the first screen by other operations such as pressing a button, which likewise reduces the user's screen-off operations.
  • Specifically, by comparing the number of first touch points with the number of second touch points, the user terminal may adjust the screen state of at least one of the first screen and the second screen, or control at least one of them to maintain its current screen state, where the current screen state again refers to the screen state before the touch signal on the at least one screen was detected. This is described in detail below.
  • Optionally, before determining that the screen state of the first screen is the off state and/or determining that the screen state of the second screen is the lit state, the user terminal may also detect the current screen states of the first screen and the second screen, and determine, according to the number of first touch points, the number of second touch points, the current screen state of the first screen, and the current screen state of the second screen, that the screen state of the first screen is the off state and/or that the screen state of the second screen is the lit state.
  • If the current screen state of the first screen is the off state and the current screen state of the second screen is the off state, and the number of first touch points is greater than the number of second touch points, the screen state of the first screen is controlled to remain off, and/or the screen state of the second screen is adjusted to the lit state.
  • If the current screen state of the first screen is the off state and the current screen state of the second screen is the lit state, and the number of first touch points is greater than the number of second touch points, the screen state of the first screen is controlled to remain off, and/or the screen state of the second screen is controlled to remain lit.
  • If the current screen state of the first screen is the lit state and the current screen state of the second screen is the off state, and the number of first touch points is greater than the number of second touch points, the screen state of the first screen is adjusted to the off state, and/or the screen state of the second screen is adjusted to the lit state.
  • If the current screen state of the first screen is the lit state and the current screen state of the second screen is the lit state, and the number of first touch points is greater than the number of second touch points, the screen state of the first screen is adjusted to the off state, and/or the screen state of the second screen is controlled to remain lit.
  • After determining the number of first touch points and the number of second touch points, if the number of first touch points is equal to the number of second touch points, the user terminal detects the first occlusion area of the first screen and the second occlusion area of the second screen.
  • The first occlusion area and the second occlusion area are determined differently depending on whether the user holds the user terminal with one hand or with both hands.
  • When it is detected that the user holds the user terminal with one hand, the first occlusion area may be calculated from the position of each touch point on the first screen and the contact position of the user's palm with the side of the user terminal; likewise, the second occlusion area in the one-handed case is calculated in the same manner as the first occlusion area.
  • When it is detected that the user holds the user terminal with both hands, the first occlusion area may be determined by summing the occlusion areas of the two hands, where the occlusion area of each hand may be determined as in the one-handed case; likewise, the second occlusion area in the two-handed case is calculated in the same manner as the first occlusion area.
  • The embodiment of the present invention does not limit the method for detecting whether the user holds the user terminal with one hand or with both hands.
  • When the numbers of touch points differ, the user terminal does not need to detect the occlusion area of each screen.
  • The user terminal determines, according to the position of each touch point and the contact position between the user's palm and the side of the user terminal, a first occlusion area of the user's palm on the first screen and a second occlusion area on the second screen, respectively.
  • FIG. 5A and FIG. 5B provide an exemplary diagram of the occlusion area in the one-handed grip case according to an embodiment of the present invention.
  • FIG. 5A and FIG. 5B show the same holding state, in which the touch point of the user's finger on the first screen on the front of the user terminal is A4, and the touch point of the user's finger on the second screen on the back of the user terminal is B12. It can be seen that the number of first touch points on the first screen equals the number of second touch points on the second screen.
  • For the first screen, the position of a touch boundary point is determined according to the touch point of the user's finger on the first screen on the front of the user terminal, shown in the figure as the position of T1 near touch point A4; a front side boundary point is then determined according to the contact position between the user's palm and the side of the front of the user terminal, shown as the position of T1 along the front side. The area enclosed by T1 in the figure can then be determined as the first occlusion area on the first screen.
  • For the second screen, the position of a touch boundary point is determined according to the touch point of the user's finger on the second screen on the back of the user terminal, shown in the figure as the position of T2 near touch point B12; a back side boundary point is then determined according to the contact position between the user's palm and the side of the back of the user terminal, shown as the position of T2 along the back side. The area enclosed by T2 in the figure can then be determined as the second occlusion area on the second screen.
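The description does not prescribe how the area enclosed by a boundary such as T1 or T2 is computed. One plausible sketch treats the boundary as an ordered polygon (touch boundary point, side contact points, and the screen-edge corners that close the region) and applies the shoelace formula; all names here are illustrative assumptions:

```python
def polygon_area(points):
    """Shoelace formula: area enclosed by an ordered list of (x, y) vertices.

    `points` would be the ordered boundary of the occluded region, e.g.
    the touch boundary point near A4, the palm/side contact points, and
    the screen corners closing the region. Purely illustrative; the patent
    does not specify a particular area computation.
    """
    area = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]  # wrap around to close the polygon
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0
```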
  • FIG. 6A and FIG. 6B provide an exemplary diagram of the occlusion area in the two-handed grip case according to an embodiment of the present invention.
  • FIG. 6A and FIG. 6B show the same holding state, in which the touch points of the user's fingers on the first screen on the front of the user terminal are A5 and A6, and the touch points on the second screen on the back of the user terminal are B13 and B14. It can be seen that the number of first touch points on the first screen equals the number of second touch points on the second screen.
  • The first screen and the second screen may each be divided equally into a left half and a right half; that is, the first screen is divided into a left-side first screen and a right-side first screen, and the second screen is divided into a left-side second screen and a right-side second screen.
  • For the first screen, the position of a touch boundary point is determined according to the touch point of the user's finger on the left-side or right-side first screen on the front of the user terminal. As shown in the figure, in the left-side first screen the position of T11 is near touch point A5; a front-left side boundary point is then determined according to the contact position between the user's palm and the side of the left-side first screen on the front of the user terminal, shown as the position of T11 along the left side. The area enclosed by T11 in the figure can be determined as the occlusion area on the left side of the first screen. In the same manner, the position of T12 near touch point A6 in the right-side first screen is determined, and a front-right side boundary point is determined according to the contact position between the user's palm and the side of the right-side first screen; the area enclosed by T12 in the figure can be determined as the occlusion area on the right side of the first screen. Finally, the left-side and right-side occlusion areas are summed to obtain the first occlusion area corresponding to the first screen.
  • For the second screen, the position of a touch boundary point is determined according to the touch point of the user's finger on the left-side or right-side second screen on the back of the user terminal. As shown in the figure, in the left-side second screen the position of T21 is near touch point B13; a back-left side boundary point is then determined according to the contact position between the user's palm and the side of the left-side second screen on the back of the user terminal, shown as the position of T21 along the left side. The area enclosed by T21 in the figure can be determined as the occlusion area on the left side of the second screen. In the same manner, the position of T22 near touch point B14 in the right-side second screen is determined, and a back-right side boundary point is determined according to the contact position between the user's palm and the side of the right-side second screen; the area enclosed by T22 in the figure can be determined as the occlusion area on the right side of the second screen. Finally, the left-side and right-side occlusion areas are summed to obtain the second occlusion area corresponding to the second screen.
  • Optionally, the user terminal may be provided with a pressure sensor, a distance sensor, a light sensor, or the like on its sides, so as to determine the side boundary points from the contact position between the user's palm and the side of the front or back of the user terminal, and thereby determine the occlusion areas.
  • the manner of determining the occlusion area is not limited in the embodiment of the present invention.
  • If the user terminal determines that the first occlusion area is larger than the second occlusion area, it determines that the screen state of the first screen is the off state and/or that the screen state of the second screen is the lit state. A first occlusion area larger than the second indicates that the user is about to use the second screen; in this case, in response to the user's need to use the second screen, the screen state of the second screen is determined to be the lit state.
  • In this way, the user does not need to press a button or perform another operation to light the second screen, which reduces the user's lighting operations. In addition, since the user is about to use the second screen, the user terminal can turn off the first screen, which prevents erroneous operations on the first screen and saves power, and the user does not need to turn off the first screen by pressing a button or performing another operation, which also reduces the user's screen-off operations.
  • In determining the screen state of the first screen and/or the second screen, the user terminal refers to the current screen states of the first screen and the second screen, where the current screen state refers to the screen state before the user terminal detects the touch signal on at least one of the first screen and the second screen.
  • If the current screen state of the first screen is the off state and the current screen state of the second screen is the off state, and the first occlusion area is larger than the second occlusion area, the screen state of the first screen is controlled to remain off, and/or the screen state of the second screen is adjusted to the lit state.
  • If the current screen state of the first screen is the off state and the current screen state of the second screen is the lit state, and the first occlusion area is larger than the second occlusion area, the screen state of the first screen is controlled to remain off, and/or the screen state of the second screen is controlled to remain lit;
  • If the current screen state of the first screen is the lit state and the current screen state of the second screen is the off state, and the first occlusion area is larger than the second occlusion area, the screen state of the first screen is adjusted to the off state, and/or the screen state of the second screen is adjusted to the lit state.
  • If the current screen state of the first screen is the lit state and the current screen state of the second screen is the lit state, and the first occlusion area is larger than the second occlusion area, the screen state of the first screen is adjusted to the off state, and/or the screen state of the second screen is controlled to remain lit.
  • If the first occlusion area is smaller than the second occlusion area, determine that the screen state of the first screen is the lit state and/or determine that the screen state of the second screen is the off state.
  • If the user terminal detects that the first occlusion area is smaller than the second occlusion area, it determines that the screen state of the first screen is the lit state and/or that the screen state of the second screen is the off state. A first occlusion area smaller than the second indicates that the user is about to use the first screen; in this case, in response to the user's need to use the first screen, the screen state of the first screen is determined to be the lit state.
  • In addition, the user terminal can turn off the second screen to prevent erroneous operations on the second screen and save power, and the user does not need to turn off the second screen by pressing a button or performing another operation, which also reduces the user's screen-off operations.
  • In determining the screen state of the first screen and/or the second screen, the user terminal refers to the current screen states of the first screen and the second screen, where the current screen state refers to the screen state before the user terminal detects the touch signal on at least one of the first screen and the second screen.
  • If the current screen state of the first screen is the off state and the current screen state of the second screen is the off state, and the first occlusion area is smaller than the second occlusion area, the screen state of the first screen is adjusted to the lit state, and/or the screen state of the second screen is controlled to remain off.
  • If the current screen state of the first screen is the off state and the current screen state of the second screen is the lit state, and the first occlusion area is smaller than the second occlusion area, the screen state of the first screen is adjusted to the lit state, and/or the screen state of the second screen is adjusted to the off state.
  • If the current screen state of the first screen is the lit state and the current screen state of the second screen is the off state, and the first occlusion area is smaller than the second occlusion area, the screen state of the first screen is controlled to remain lit, and/or the screen state of the second screen is controlled to remain off.
  • If the current screen state of the first screen is the lit state and the current screen state of the second screen is the lit state, and the first occlusion area is smaller than the second occlusion area, the screen state of the first screen is controlled to remain lit, and/or the screen state of the second screen is adjusted to the off state.
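The occlusion-area tie-breaker described above, together with the distinction between "adjust ... to" and "control ... to remain", can be sketched as follows. The dict-based state representation is an assumption made for illustration, not the patent's implementation:

```python
def occlusion_decision(area_first, area_second, current):
    """Occlusion-area tie-breaker used when the touch-point counts are equal.

    `current` is {'first': 'lit'|'off', 'second': 'lit'|'off'}.
    Returns (new_states, actions), where `actions` marks each screen as
    'adjust' (state changed) or 'remain' (state kept), mirroring the
    patent's wording. The more-occluded screen is assumed to face the
    palm, so it ends up off and the other screen lit. Illustrative only.
    """
    if area_first == area_second:
        return current, {}  # equal areas: fall back to touch durations
    if area_first > area_second:
        target = {'first': 'off', 'second': 'lit'}
    else:
        target = {'first': 'lit', 'second': 'off'}
    actions = {s: ('remain' if current[s] == t else 'adjust')
               for s, t in target.items()}
    return target, actions
```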
  • If the first occlusion area is equal to the second occlusion area, acquire a first duration of the first touch points on the first screen and a second duration of the second touch points on the second screen.
  • The user terminal acquires a first duration of the first touch points on the first screen and a second duration of the second touch points on the second screen.
  • The first duration may be obtained by calculating, for each first touch point on the first screen, the duration from when the touch began to the current time, summing these durations over all first touch points, and taking the average; the resulting average is determined as the first duration. Alternatively, the first duration may be determined according to the duration for which each first touch point on the first screen has remained at the same touch position up to the current time. The second duration may be calculated in the same manner as the first duration.
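The first of the two duration options above (elapsed time per touch point, averaged over the screen) might look like the sketch below; the `touch_start_times` parameter and the use of a monotonic clock are assumptions for illustration:

```python
import time

def average_touch_duration(touch_start_times, now=None):
    """Average duration of the touch points on one screen.

    For each touch point still in contact, take the elapsed time from its
    touch-down timestamp to the current moment, then average over all
    points on that screen. Illustrative sketch of one of the two options
    the description mentions.
    """
    if now is None:
        now = time.monotonic()
    if not touch_start_times:
        return 0.0  # no touch points on this screen
    durations = [now - t0 for t0 in touch_start_times]
    return sum(durations) / len(durations)
```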
  • If the first duration is greater than the second duration, determine that the screen state of the first screen is the off state and/or determine that the screen state of the second screen is the lit state.
  • Optionally, the case in which the first duration equals the second duration may be merged into the case in step 110 in which the first duration is greater than the second duration; that is, if the first duration is greater than or equal to the second duration, the screen state of the first screen is determined to be the off state and/or the screen state of the second screen is determined to be the lit state. Alternatively, it may be merged into the case in step 111 in which the first duration is less than the second duration; that is, if the first duration is less than or equal to the second duration, the screen state of the first screen is determined to be the lit state and/or the screen state of the second screen is determined to be the off state. Alternatively, the user terminal may leave the screen states of the first screen and the second screen unchanged when the first duration and the second duration are the same.
  • Optionally, the user terminal may determine the first occlusion area and the second occlusion area at the moment it detects the at least one touch signal for the first screen and the second screen; likewise, it may determine the first duration and the second duration at that moment. This ensures that the determined numbers of touch points, the determined occlusion areas, and the determined durations correspond to the same moment or the same time range, improving the accuracy of the screen state adjustment.
  • Optionally, the occlusion area factor may be omitted: if the number of first touch points is equal to the number of second touch points, the user terminal may directly acquire the first duration of the first touch points on the first screen and the second duration of the second touch points on the second screen, and then perform steps 110 and 111.
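Putting the pieces together, the overall decision cascade (touch counts first, then occlusion areas unless skipped as in the variant just described, then durations) can be sketched as one function. This is an illustrative reconstruction of the described flow, not the patent's implementation; names and state labels are assumptions:

```python
def decide_screens(n1, n2, area1, area2, dur1, dur2, use_areas=True):
    """Decision cascade for a dual-screen terminal.

    Compare touch-point counts; on a tie compare occlusion areas (optional,
    per the variant that skips them); on a further tie compare touch
    durations. Returns ('lit'|'off'|None, 'lit'|'off'|None), where None
    means the screen state is left unchanged. Illustrative sketch.
    """
    comparisons = ((n1, n2),) \
        + (((area1, area2),) if use_areas else ()) \
        + ((dur1, dur2),)
    for a, b in comparisons:
        if a > b:   # first screen more touched/occluded/held: switch to second
            return ('off', 'lit')
        if a < b:   # second screen more touched/occluded/held: switch to first
            return ('lit', 'off')
    return (None, None)  # complete tie: leave both screens unchanged
```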
  • In the embodiment of the present invention, the user terminal is configured with a first screen and a second screen, and detects a touch signal for at least one of the first screen and the second screen; according to the at least one touch signal, it determines the number of first touch points on the first screen and the number of second touch points on the second screen; if the number of first touch points is less than the number of second touch points, it determines that the screen state of the first screen is the lit state and/or that the screen state of the second screen is the off state.
  • In this way, the screen states of the two screens are determined from the numbers of touch points on the two screens of the user terminal, which reduces the operations the user must perform to light or turn off a screen and thus improves the operating efficiency of the user terminal.
  • the user terminal is configured with a touch screen to present a first screen and a second screen.
  • Each touch screen can be configured as a capacitive touch panel or an infrared touch panel.
  • The touch screens of the user terminal may be two separate touch screens, each corresponding to one screen; or the touch screen may be a single bendable touch screen, such as a flexible screen, in which the two display screens are presented.
  • the embodiment of the present invention does not limit the number of touch screens.
  • the touch screen can be used to receive a touch operation of a user.
  • the touch operation here refers to an operation in which a user's finger or hand directly touches the screen.
  • Applications supported by the user terminal may include, but are not limited to, social applications such as Facebook; image management applications such as photo albums; map applications such as Google Maps; browsers such as Safari, Google Chrome, and the like. These applications can have a common input and output device: a touch screen.
  • the touch screen is used to receive the touch operation of the user, and displays the output content of the application when the touch screen is lit.
  • FIG. 7 is a structural block diagram of an implementation of the user terminal 100.
  • The user terminal 100 can include a baseband chip 110, a memory 115 (including one or more computer readable storage media), a radio frequency (RF) module 116, and a peripheral system 117. These components can communicate over one or more communication buses 114.
  • The peripheral system 117 is mainly used to implement the interaction function between the user terminal 100 and the user/external environment, and mainly includes the input and output devices of the user terminal 100.
  • the peripheral system 117 can include a touch screen controller 118, a camera controller 119, an audio controller 120, and a sensor management module 121.
  • Each controller may be coupled to a respective peripheral device, such as a touch screen 123, a camera 124, an audio circuit 125, and a sensor 126.
  • the gesture sensor in sensor 126 can be used to receive gesture control operations input by the user.
  • the pressure sensor in the sensor 126 can be disposed under the touch screen 123 and can be used to collect the touch pressure applied to the touch screen 123 when the user inputs the touch operation through the touch screen 123. It should be noted that the peripheral system 117 may also include other I/O peripherals.
  • the baseband chip 110 can be integrated to include one or more processors 111, a clock module 112, and a power management module 113.
  • the clock module 112 integrated in the baseband chip 110 is primarily used to generate the clocks required for data transfer and timing control for the processor 111.
  • the power management module 113 integrated in the baseband chip 110 is mainly used to provide a stable, high-precision voltage for the processor 111, the radio frequency module 116, and the peripheral system.
  • a radio frequency (RF) module 116 is used to receive and transmit radio frequency signals, primarily integrating the receiver and transmitter of the user terminal 100.
  • a radio frequency (RF) module 116 communicates with the communication network and other communication devices via radio frequency signals.
  • the radio frequency (RF) module 116 may include, but is not limited to: an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chip, a SIM card, and Storage media, etc.
  • a radio frequency (RF) module 116 can be implemented on a separate chip.
  • Memory 115 is coupled to processor 111 for storing various software programs and/or sets of instructions.
  • memory 115 may include high speed random access memory, and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid state storage devices.
  • the memory 115 can store an operating system such as an embedded operating system such as ANDROID, IOS, WINDOWS, or LINUX.
  • The memory 115 can also store a network communication program that can be used to communicate with one or more additional devices or one or more user terminals.
  • The memory 115 can also store a user interface program, which can vividly display the content of an application through a graphical operation interface and receive user control operations on the application through input controls such as menus, dialog boxes, and buttons.
  • the memory 115 can also store one or more programs. As shown in FIG. 7, these programs may include: social applications such as Facebook; image management applications such as photo albums; map applications such as Google Maps; browsers such as Safari, Google Chrome, and the like.
  • The memory 115 is further configured to store application code for executing the solution of the present application, the execution of which is controlled by the touch screen controller 118, the touch screen 123, and the baseband chip 110; alternatively, the execution may be controlled by other parts of the peripheral system 117, so as to implement the actions of the user terminal provided in any of the embodiments shown in FIG. 2 to FIG. 6B.
  • The user terminal 100 is only an example provided by an embodiment of the present invention, and the user terminal 100 may have more or fewer components than shown, may combine two or more components, or may have a different configuration of components.
  • FIG. 8 is a functional block diagram of a user terminal according to an embodiment of the present invention.
  • the functional blocks of the user terminal can be implemented by hardware, software or a combination of hardware and software to implement the inventive solution.
  • the functional blocks depicted in Figure 8 can be combined or separated into several sub-blocks to implement the inventive arrangements. Accordingly, the above description of the invention may support any possible combination or separation or further definition of the functional modules described below.
  • The user terminal 200 may include a detection module 201, a quantity determination module 202, and a status determination module 203, wherein:
  • the detecting module 201 is configured to detect a touch signal to at least one of the first screen and the second screen;
  • the quantity determining module 202 is configured to determine, according to the at least one touch signal, the number of first touch points on the first screen and the number of second touch points on the second screen;
  • the state determining module 203 is configured to: if the number of the first touch points is less than the number of the second touch points, determine that the screen state of the first screen is the lit state and/or determine that the screen state of the second screen is the off state.
  • the detecting module 201 may be the touch screen controller 118 and/or the sensor management module 121 in FIG. 7, and may also be other input devices.
  • the quantity determining module 202 can be the processor 111 in FIG. 7.
  • the state determining module may be implemented by the processor 111 and the touch screen controller 118 in FIG. 7, and may further include other modules of the baseband chip.
  • For the manner in which the detection module 201, the quantity determining module 202, and the state determining module 203 determine the screen states of the first screen and the second screen, refer to the foregoing embodiments; details are not described herein again.
  • the detecting module 201 is further configured to detect a current screen state of the first screen and the second screen, where the current screen state is a touch that detects the at least one screen The state of the screen before the signal.
  • the state determining module 203 is further configured to: if the number of the first touch points is greater than the number of the second touch points, determine that the screen state of the first screen is Turning off the state and/or determining that the screen state of the second screen is a lit state.
  • the detecting module 201 is further configured to: if the number of the first touch points is equal to the number of the second touch points, detect the first occlusion area of the first screen And a second occlusion region in the second screen;
  • the state determining module 203 is further configured to: if the first occlusion area is greater than the second occlusion area, determine that the screen state of the first screen is the off state and/or determine that the screen state of the second screen is the lit state;
  • the status determining module 203 is further configured to: if the first occlusion area is smaller than the second occlusion area, determine that the screen state of the first screen is the lit state and/or determine that the screen state of the second screen is the off state.
  • Optionally, a first obtaining module 204 is further included:
  • the first obtaining module 204 is configured to: if the first occlusion area is equal to the second occlusion area, acquire a first duration of the first touch points on the first screen and a second duration of the second touch points on the second screen;
  • the state determining module 203 is further configured to: if the first duration is greater than the second duration, determine that the screen state of the first screen is the off state and/or determine that the screen state of the second screen is the lit state;
  • the state determining module 203 is further configured to: if the first duration is less than the second duration, determine that the screen state of the first screen is the lit state and/or determine that the screen state of the second screen is the off state.
  • Optionally, a second obtaining module 205 is further included:
  • the second obtaining module 205 is configured to: if the number of the first touch points is equal to the number of the second touch points, acquire a first duration of the first touch points on the first screen and a second duration of the second touch points on the second screen;
  • the state determining module 203 is further configured to: if the first duration is greater than the second duration, determine that the screen state of the first screen is the off state and/or determine that the screen state of the second screen is the lit state;
  • the state determining module 203 is further configured to: if the first duration is less than the second duration, determine that the screen state of the first screen is the lit state and/or determine that the screen state of the second screen is the off state.
  • The manner in which the state determining module 203 determines, if the number of the first touch points is less than the number of the second touch points, that the screen state of the first screen is the lit state and/or that the screen state of the second screen is the off state is specifically as follows:
  • the current screen state of the first screen is the off state and the current screen state of the second screen is the off state, and the number of the first touch points is less than the number of the second touch points, Adjusting a screen state of the first screen to a lighting state, and/or controlling a screen state of the second screen to remain a closed state;
  • the current screen state of the first screen is the off state and the current screen state of the second screen is the lighting state, and the number of the first touch points is less than the number of the second touch points, Adjusting a screen state of the first screen to a lighting state, and/or adjusting a screen state of the second screen to a closed state;
  • the current screen state of the first screen is the lighting state and the current screen state of the second screen is the off state, and the number of the first touch points is less than the number of the second touch points, Controlling that the screen state of the first screen remains lit, and/or controlling the screen state of the second screen to remain off;
  • the current screen state of the first screen is the lighting state and the current screen state of the second screen is the lighting state, and the number of the first touch points is less than the number of the second touch points, Then controlling the screen state of the first screen to remain lit, and/or adjusting the screen state of the second screen to a closed state.
  • The manner in which the state determining module 203 determines, if the number of the first touch points is greater than the number of the second touch points, that the screen state of the first screen is the off state and/or that the screen state of the second screen is the lit state is specifically as follows:
  • the current screen state of the first screen is the off state and the current screen state of the second screen is the lighting state, and the number of the first touch points is greater than the number of the second touch points, Controlling that the screen state of the first screen remains off, and/or controlling the screen state of the second screen to remain in a lit state;
  • the current screen state of the first screen is the lighting state and the current screen state of the second screen is the off state, and the number of the first touch points is greater than the number of the second touch points, Adjusting a screen state of the first screen to a closed state, and/or adjusting a screen state of the second screen to a lighting state;
  • if the current screen state of the first screen is the lighting state and the current screen state of the second screen is the lighting state, and the number of the first touch points is greater than the number of the second touch points, then the screen state of the first screen is adjusted to a closed state, and/or the screen state of the second screen is controlled to remain a lighted state.
  • An embodiment of the present invention also provides a computer storage medium for storing computer software instructions for the user terminal, including a program designed to perform the above aspects for the user terminal.
  • In the embodiment of the present invention, the user terminal is configured with a first screen and a second screen, and detects a touch signal for at least one of the first screen and the second screen; according to the at least one touch signal, the number of first touch points on the first screen and the number of second touch points on the second screen is determined; if the number of first touch points is less than the number of second touch points, the screen state of the first screen is determined to be the lit state and/or the screen state of the second screen is determined to be the off state.
  • In this way, the screen states of the two screens are determined from the numbers of touch points on the two screens of the user terminal, which reduces the operations the user must perform to light or turn off a screen and thus improves the operating efficiency of the user terminal.
  • Embodiments of the invention may be provided as a method, a system, or a computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware. Moreover, the present invention can take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage and optical storage) containing computer-usable program code.
  • These computer program instructions can also be stored in a computer readable memory that can direct a computer or other programmable data processing device to operate in a particular manner, such that the instructions stored in the computer readable memory produce an article of manufacture including an instruction apparatus that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
  • These computer program instructions can also be loaded onto a computer or other programmable data processing device such that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing; the instructions executed on the computer or other programmable device thereby provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the invention relate to a screen state control method and a user terminal. The method applies to a user terminal configured with a first screen and a second screen, and comprises: detecting touch signals on the first screen and/or the second screen; determining, according to at least one touch signal, the number of first touch points on the first screen and the number of second touch points on the second screen; and, if the number of first touch points is smaller than the number of second touch points, determining that the first screen is in a lit state and/or that the second screen is in an off state. With the invention, the states of a user terminal's two screens are determined from the number of touch points on the two screens, which reduces the operations a user must perform to light or turn off a screen and improves the operating efficiency of the user terminal.
PCT/CN2017/082625 2016-12-09 2017-04-28 Procédé de commande d'état d'écran et terminal utilisateur WO2018103257A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201611131170 2016-12-09
CN201611131170.9 2016-12-09

Publications (1)

Publication Number Publication Date
WO2018103257A1 true WO2018103257A1 (fr) 2018-06-14

Family

ID=62490769

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/082625 WO2018103257A1 (fr) 2016-12-09 2017-04-28 Procédé de commande d'état d'écran et terminal utilisateur

Country Status (1)

Country Link
WO (1) WO2018103257A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115048018A (zh) * 2021-02-25 2022-09-13 博泰车联网科技(上海)股份有限公司 State control method and apparatus for a vehicle-mounted screen, in-vehicle unit, and readable storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104834463A (zh) * 2015-03-31 2015-08-12 努比亚技术有限公司 Grip recognition method and device for a mobile terminal
CN104869230A (zh) * 2015-04-23 2015-08-26 努比亚技术有限公司 Mobile terminal control method and device
CN106168879A (zh) * 2016-06-30 2016-11-30 努比亚技术有限公司 Dual-surface screen interaction method and terminal
CN106406610A (zh) * 2016-09-13 2017-02-15 努比亚技术有限公司 False-touch prevention method and ***


Similar Documents

Publication Publication Date Title
AU2020201096B2 (en) Quick screen splitting method, apparatus, and electronic device, display UI, and storage medium
US9024877B2 (en) Method for automatically switching user interface of handheld terminal device, and handheld terminal device
WO2017101566A1 (fr) Procédé et dispositif de réveil d'écran
CN106055097B (zh) Screen lighting control method and apparatus, and electronic device
EP2905679B1 (fr) Dispositif électronique et procédé de contrôle dudit dispositif
US20230068100A1 (en) Widget processing method and related apparatus
EP3118726B1 (fr) Procédé de détection de rotation de l'élément de rotation et dispositif électronique l'appliquant
CN106020670B (zh) Screen light-up control method and apparatus, and electronic device
WO2015176484A1 (fr) Procédé et dispositif de commande de saisie tactile
WO2017161803A1 (fr) Procédé et terminal de réglage de paramètres
US20130154947A1 (en) Determining a preferred screen orientation based on known hand positions
AU2015297122A1 (en) Electronic device operating in idle mode and method thereof
WO2017161826A1 (fr) Procédé de commande fonctionnelle et terminal
KR20150129423A (ko) 전자 장치 및 전자 장치의 제스처 인식 방법 및 전자 장치
US20150177972A1 (en) Unlocking method and electronic device
KR20170076359A (ko) 터치 이벤트 처리 방법 및 그 장치
EP2950188A1 (fr) Procédé et dispositif électronique pour commander un affichage
KR102360493B1 (ko) 센서를 갖는 전자 장치 및 그 운용 방법
CN106227375B (zh) Method for controlling a display of an electronic device, and electronic device therefor
WO2017161824A1 (fr) Procédé et dispositif de commande d'un terminal
KR102536148B1 (ko) 전자 장치의 동작 방법 및 장치
US10528248B2 (en) Method for providing user interface and electronic device therefor
JP2019526112A (ja) Touch response method, apparatus, and terminal for a touch screen
US10599326B2 (en) Eye motion and touchscreen gestures
WO2018103257A1 (fr) Procédé de commande d'état d'écran et terminal utilisateur

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17878476

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17878476

Country of ref document: EP

Kind code of ref document: A1