WO2017101384A1 - 屏幕触控方法及装置 - Google Patents

屏幕触控方法及装置 Download PDF

Info

Publication number
WO2017101384A1
Authority
WO
WIPO (PCT)
Prior art keywords
screen
area
touch
display mode
touchable
Application number
PCT/CN2016/089010
Other languages
English (en)
French (fr)
Inventor
张帆
Original Assignee
乐视控股(北京)有限公司
乐视移动智能信息技术(北京)有限公司
Application filed by 乐视控股(北京)有限公司 and 乐视移动智能信息技术(北京)有限公司
Publication of WO2017101384A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Definitions

  • the present application relates to the field of electronic technologies, and in particular, to a screen touch method and apparatus.
  • current smartphones can apply visually borderless technology to the screen, virtualizing the original physical border around the screen so that the smartphone's display has a better visual effect for the user.
  • the original physical border is virtualized into a screen border that is not displayed to the user on the screen and on which the user cannot perform touch operations.
  • because the screen border of the smartphone is getting narrower and narrower, while the width of the user's finger pressed against the screen edge is often greater than the width of the reserved screen border, smartphones suffer from misoperations.
  • the embodiments of the present application provide a screen touch method and device to solve the problem of erroneous screen operations caused by the width of the user's finger pressing the screen edge being greater than the width of the reserved screen border.
  • an embodiment of the present application provides a screen touch method, including: receiving a touch instruction on the touch screen; acquiring the current position of the touch instruction on the screen; determining whether the position is within a touchable area of the screen, where the touchable area is a target area selected from the display area of the screen and the target area is configured to respond to the user's click operations on the terminal; and, if the determination result is yes, executing the touch instruction.
  • an embodiment of the present application provides a screen touch device, including:
  • a first receiving module, configured to receive a touch instruction on the touch screen;
  • an acquiring module, configured to acquire the current position of the touch instruction on the screen;
  • a determining module, configured to determine whether the position is within a touchable area of the screen, where the touchable area is a target area selected from the display area of the screen and the target area is used to respond to the user's click operations on the terminal;
  • an execution module, configured to execute the touch instruction when the determining module determines that the position is within the touchable area of the screen.
  • an embodiment of the present application provides a screen touch device, including a memory, one or more processors, and one or more programs, wherein the one or more programs, when executed by the one or more processors, perform the following operations: receiving a touch instruction on the touch screen; acquiring the current position of the touch instruction on the screen; determining whether the position is within a touchable area of the screen, where the touchable area is a target area selected from the display area of the screen and the target area is used to respond to the user's click operations on the terminal; and, if the determination result is yes, executing the touch instruction.
  • embodiments of the present application provide a computer readable storage medium storing computer executable instructions that, in response to execution, cause a screen touch device to perform operations including: receiving a touch instruction on the touch screen; acquiring the current position of the touch instruction on the screen; determining whether the position is within a touchable area of the screen, where the touchable area is a target area selected from the display area of the screen and the target area is used to respond to the user's click operations on the terminal; and, if the determination result is yes, executing the touch instruction.
  • the screen touch method and device of the embodiments of the present application receive a touch instruction on the touch screen, acquire the current position of the touch instruction on the screen, and determine whether the position is within a touchable area of the screen, where the touchable area is a target area selected from the display area of the screen and is used to respond to the user's operation instructions to the terminal; if the determination result is yes, the touch instruction is executed.
  • in the embodiments of the present application, a target area is selected in the display area of the screen as the user's touchable area; when the touch point of a touch instruction falls within the touchable area, the instruction is executed, otherwise execution is refused, reducing the probability of user misoperation.
  • FIG. 1 is a schematic flowchart of a screen touch method according to Embodiment 1 of the present application.
  • FIG. 2 is a schematic flowchart of a screen touch method according to Embodiment 2 of the present application.
  • FIG. 3 is a schematic flowchart of a screen touch method according to Embodiment 3 of the present application.
  • FIG. 4 is a schematic structural diagram of a screen touch device according to Embodiment 4 of the present application.
  • FIG. 5 is a schematic structural diagram of a screen touch device according to Embodiment 5 of the present application.
  • FIG. 6 is a schematic structural diagram of a screen touch device according to Embodiment 6 of the present application.
  • FIG. 7 is a schematic structural diagram of a computer program product for screen touch provided in Embodiment 7 of the present application.
  • as shown in FIG. 1, a schematic flowchart of the screen touch method according to Embodiment 1 of the present application, the screen touch method includes:
  • Step 101: Receive a touch instruction on the touch screen.
  • in this embodiment, the smartphone can display various applications to the user through the screen. When the user wants to invoke an application or change the smartphone's settings, the user can send a touch instruction to the smartphone by tapping the display screen.
  • Step 102: Obtain the current position of the touch instruction on the screen.
  • upon receiving the touch instruction, the smartphone can use its sensors to detect the current position on the screen of the touch point corresponding to the instruction.
  • Step 103: Determine whether the position is within a touchable area of the screen.
  • after obtaining the position of the touch point, the smartphone determines whether that position lies within the touchable area of the screen.
  • in this embodiment, a touchable area is set for the smartphone in advance. The touchable area is a target area selected from the display area of the screen; click operations performed by the user inside the target area will be executed. In other words, the target area is used to respond to the user's operation instructions to the terminal.
  • if it is determined that the touch point lies within the touchable area of the screen, step 104 is performed; otherwise, step 105 is performed.
  • Step 104: Execute the touch instruction.
  • when the touch point of the touch instruction is determined to be within the touchable area, the smartphone regards the instruction as intentionally issued by the user rather than a misoperation, and can execute it.
  • Step 105: Refuse to execute the touch instruction.
  • when the touch point of the touch instruction is determined to be outside the touchable area, the smartphone regards the instruction as unintentionally issued by the user, i.e., a misoperation, and can refuse to execute it.
  • in this embodiment, a target area is selected in the display area of the screen as the user's touchable area, which effectively divides out a touchable area for the smartphone on top of the display area. The smartphone responds to a click operation only when it occurs within the touchable area; when a click occurs in the non-touchable area, the smartphone does not respond to it.
  • the screen touch method provided in this embodiment receives a touch instruction on the touch screen, acquires the current position of the instruction on the screen, and determines whether the position is within a touchable area of the screen, where the touchable area is a target area selected from the display area of the screen and is used to respond to the user's operation instructions to the terminal; if the determination result is yes, the touch instruction is executed. In this embodiment, a target area is selected in the display area of the screen as the user's touchable area; when the touch point of a touch instruction falls within the touchable area, the instruction is executed, otherwise execution is refused, reducing the probability of user misoperation.
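The flow of steps 101 to 105 amounts to a simple hit test against the touchable area. The following is an illustrative sketch only, not the patent's implementation; the `Rect` type, function names, and coordinate values are assumptions.

```python
# Illustrative sketch of steps 101-105: a touch instruction is executed
# only when its touch point lies inside the touchable area, a target
# area selected from the screen's display area. Names are assumptions.
from dataclasses import dataclass

@dataclass
class Rect:
    left: int
    top: int
    right: int
    bottom: int

    def contains(self, x: int, y: int) -> bool:
        # Step 103: is the touch point inside the touchable area?
        return self.left <= x <= self.right and self.top <= y <= self.bottom

def handle_touch(touchable: Rect, x: int, y: int) -> str:
    # Steps 101-102 are assumed already done: (x, y) is the touch point.
    if touchable.contains(x, y):
        return "execute"  # step 104: treated as an intentional touch
    return "refuse"       # step 105: likely an accidental edge press

# A 100x80 display area with the touchable area inset from the edges.
touchable = Rect(left=3, top=5, right=97, bottom=75)
print(handle_touch(touchable, 50, 40))  # execute
print(handle_touch(touchable, 99, 40))  # refuse
```

A touch near the screen edge (the reserved border region) falls outside the inset rectangle and is refused, which is the misoperation filter the embodiment describes.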
  • as shown in FIG. 2, a schematic flowchart of the screen touch method according to Embodiment 2 of the present application, the method further includes the following steps before step 101 of receiving the touch instruction on the touch screen:
  • Step 201: Receive a first adjustment instruction.
  • the first adjustment instruction carries a first value range for the first area corresponding to the landscape display mode, and a second value range for the second area corresponding to the portrait display mode.
  • generally, a smartphone's system has a default range for the touchable area; in use, the user can set the range of the touchable area according to his own needs.
  • specifically, the user issues, through the display screen, a first adjustment instruction for adjusting the touchable area pre-stored on the smartphone, where the first adjustment instruction carries the first value range of the first area corresponding to the landscape display mode and the second value range of the second area corresponding to the portrait display mode.
  • the first area represents the touchable area corresponding to the landscape display mode, and the second area represents the touchable area corresponding to the portrait display mode. It should be noted that "first" and "second" merely distinguish the landscape and portrait display modes; they do not impose an order on the touchable areas.
  • Step 202: Adjust the pre-stored touchable area for the landscape display mode to the first area according to the first value range in the first adjustment instruction.
  • Step 203: Adjust the pre-stored touchable area for the portrait display mode to the second area according to the second value range in the first adjustment instruction.
  • after obtaining the first adjustment instruction, the smartphone can obtain the first value range of the touchable area in the landscape display mode and the second value range of the touchable area in the portrait display mode, then adjust the pre-stored landscape touchable area to the first area according to the first value range, and adjust the pre-stored portrait touchable area to the second area according to the second value range.
  • for example, in the landscape display mode, if the display area of the smartphone screen is 100×80, the operable range of the touchable area may be set to 95×60.
  • in practice, when the smartphone is held horizontally, the adjustment can mainly reduce the operable range along the length; when the smartphone is held vertically, the adjustment can mainly reduce the operable range along the width, thereby reducing misoperation.
  • Step 204: Determine the display mode the screen is currently in.
  • in this embodiment, the smartphone is equipped with a gyroscope, which can determine whether the smartphone is in the landscape display mode or the portrait display mode. The landscape display mode is the display mode when the smartphone is held horizontally in the user's hand, and the portrait display mode is the display mode when it is held vertically in the user's hand.
  • when it is determined that the screen is in the landscape display mode, step 205 is performed; when it is determined that the screen is in the portrait display mode, step 206 is performed.
  • Step 205: When the screen is in the landscape display mode, set the first area corresponding to the landscape display mode as the touchable area.
  • when it is determined that the screen is currently in the landscape display mode, the smartphone can directly set the first area, as adjusted by the first adjustment instruction, as the current touchable area.
  • Step 206: When the screen is in the portrait display mode, set the second area corresponding to the portrait display mode as the touchable area.
  • when it is determined that the screen is currently in the portrait display mode, the smartphone can directly set the second area, as adjusted by the first adjustment instruction, as the current touchable area.
  • the screen touch method provided in this embodiment receives a touch instruction on the touch screen, acquires the current position of the instruction on the screen, and determines whether the position is within a touchable area of the screen, where the touchable area is a target area selected from the display area of the screen and is used to respond to the user's operation instructions to the terminal; if the determination result is yes, the touch instruction is executed. In this embodiment, a target area is selected in the display area of the screen as the user's touchable area; when the touch point of a touch instruction falls within the touchable area, the instruction is executed, otherwise execution is refused, reducing the probability of user misoperation. Further, in this embodiment the user can adjust the range of the touchable area according to his own needs, making the screen display more flexible and improving the user experience.
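The mode-dependent handling of steps 201 to 206 can be sketched as keeping two adjustable areas and selecting one by the current display mode. A minimal illustrative sketch; the dict-based store, function names, and size values are all assumptions, not the patent's implementation:

```python
# Hypothetical sketch of Embodiment 2: a first adjustment instruction
# carries value ranges for the landscape and portrait areas (steps
# 201-203); the display mode reported by the gyroscope then selects the
# active touchable area (steps 204-206). All identifiers are assumed.

# Pre-stored defaults: (width, height) of the operable range.
touchable_areas = {"landscape": (100, 80), "portrait": (80, 100)}

def apply_first_adjustment(areas, first_range, second_range):
    # Steps 202-203: overwrite the pre-stored areas with the value
    # ranges carried by the first adjustment instruction.
    areas["landscape"] = first_range
    areas["portrait"] = second_range

def current_touchable_area(areas, display_mode):
    # Steps 204-206: pick the area matching the current display mode
    # ("landscape" or "portrait", as determined by the gyroscope).
    return areas[display_mode]

# E.g. shrink a 100x80 landscape display area to a 95x60 operable
# range, mainly reducing the length, as the text suggests for
# horizontal holding; reduce the width for vertical holding.
apply_first_adjustment(touchable_areas,
                       first_range=(95, 60), second_range=(60, 95))
print(current_touchable_area(touchable_areas, "landscape"))  # (95, 60)
print(current_touchable_area(touchable_areas, "portrait"))   # (60, 95)
```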
  • as shown in FIG. 3, a schematic flowchart of the screen touch method according to Embodiment 3 of the present application, the method further includes the following steps after the touch instruction is executed in step 104 of Embodiment 1:
  • Step 301: Receive a second adjustment instruction.
  • the second adjustment instruction carries a third value range for the adjusted touchable area.
  • Step 302: Adjust the current touchable area according to the third value range.
  • in actual use, the user can also adjust the touchable area of the smartphone at any time. Specifically, the user brings up an adjustment menu on the display screen and adjusts the touchable area in that menu; accordingly, the smartphone receives the second adjustment instruction issued by the user through the adjustment menu, where the instruction carries the third value range corresponding to the adjusted touchable area. After receiving the second adjustment instruction, the smartphone adjusts the value range of the current touchable area to the third value range.
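Steps 301 and 302 reduce to replacing the stored touchable-area range with the third value range from the second adjustment instruction. A minimal sketch under assumed names (the patent does not specify data structures):

```python
# Minimal sketch of Embodiment 3 (steps 301-302): a second adjustment
# instruction, issued from an adjustment menu after a touch instruction
# has been executed, carries a third value range that replaces the
# current touchable area. Identifiers are illustrative assumptions.

state = {"touchable_area": (95, 60)}  # current touchable area range

def apply_second_adjustment(state, third_range):
    # Step 302: adjust the current touchable area to the third range.
    state["touchable_area"] = third_range
    return state["touchable_area"]

print(apply_second_adjustment(state, (90, 55)))  # (90, 55)
```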
  • the screen touch method provided in this embodiment receives a touch instruction on the touch screen, acquires the current position of the instruction on the screen, and determines whether the position is within a touchable area of the screen, where the touchable area is a target area selected from the display area of the screen and is used to respond to the user's operation instructions to the terminal; if the determination result is yes, the touch instruction is executed. In this embodiment, a target area is selected in the display area of the screen as the user's touchable area; when the touch point of a touch instruction falls within the touchable area, the instruction is executed, otherwise execution is refused, reducing the probability of user misoperation. Further, in this embodiment the user can adjust the range of the touchable area according to his own needs, making the screen display more flexible and improving the user experience.
  • as shown in FIG. 4, a schematic structural diagram of the screen touch device according to Embodiment 4 of the present application, the screen touch device includes: a first receiving module 11, an obtaining module 12, a determining module 13, and an execution module 14.
  • the first receiving module 11 is configured to receive a touch instruction of the touch screen.
  • the obtaining module 12 is configured to obtain a current position of the touch instruction in the screen.
  • the determining module 13 is configured to determine whether the location is within a touchable area of the screen.
  • the touchable area is a target area selected from a display area of the screen; the target area is used to respond to a user's click operation on the terminal.
  • the execution module 14 is configured to execute a touch instruction when the determining module determines that the location is within the touchable area of the screen.
  • in this embodiment, the smartphone can display various applications to the user through the screen. When the user wants to invoke an application or change the smartphone's settings, the user can send a touch instruction to the first receiving module 11 by tapping the display screen.
  • upon receiving the touch instruction, the acquiring module 12 detects the current position on the screen of the touch point corresponding to the instruction. After the position of the touch point is obtained, the determining module 13 determines whether that position lies within the touchable area of the screen.
  • in this embodiment, a touchable area is set for the smartphone in advance. The touchable area is a target area selected from the display area of the screen; click operations performed by the user inside the target area will be executed. In other words, the target area is used to respond to the user's operation instructions to the terminal.
  • when the touch point of the touch instruction is determined to be within the touchable area, the smartphone regards the instruction as intentionally issued by the user rather than a misoperation, and the execution module 14 can execute it.
  • when the touch point is determined to be outside the touchable area, the smartphone regards the instruction as unintentionally issued by the user, i.e., a misoperation, and the execution module 14 can refuse to execute it.
  • in this embodiment, a target area is selected in the display area of the screen as the user's touchable area, which effectively divides out a touchable area for the smartphone on top of the display area. The smartphone responds to a click operation only when it occurs within the touchable area; when a click occurs in the non-touchable area, the smartphone does not respond to it.
  • the screen touch device provided in this embodiment receives a touch instruction on the touch screen, acquires the current position of the instruction on the screen, and determines whether the position is within a touchable area of the screen, where the touchable area is a target area selected from the display area of the screen and is used to respond to the user's operation instructions to the terminal; if the determination result is yes, the touch instruction is executed. In this embodiment, a target area is selected in the display area of the screen as the user's touchable area; when the touch point of a touch instruction falls within the touchable area, the instruction is executed, otherwise execution is refused, reducing the probability of user misoperation.
  • as shown in FIG. 5, a schematic structural diagram of the screen touch device according to Embodiment 5 of the present application.
  • in addition to the first receiving module 11, the obtaining module 12, the determining module 13, and the execution module 14 of Embodiment 4, the screen touch device further includes a determination module 15, a setting module 16, a second receiving module 17, and an adjusting module 18.
  • the determination module 15 is configured to determine, before the first receiving module 11 receives the touch instruction on the touch screen, the display mode the screen is currently in, where the display mode includes a landscape display mode and a portrait display mode.
  • the setting module 16 is configured to set the first area corresponding to the landscape display mode as the touchable area when the screen is in the landscape display mode, and to set the second area corresponding to the portrait display mode as the touchable area when the screen is in the portrait display mode.
  • the second receiving module 17 is configured to receive, before the determination module 15 determines the current display mode, a first adjustment instruction carrying a first value range for the first area corresponding to the landscape display mode and a second value range for the second area corresponding to the portrait display mode.
  • the adjusting module 18 is configured to adjust the pre-stored touchable area for the landscape display mode to the first area according to the first value range in the first adjustment instruction, and to adjust the pre-stored touchable area for the portrait display mode to the second area according to the second value range.
  • the second receiving module 17 is further configured to receive, after the execution module 14 executes the touch instruction, a second adjustment instruction carrying a third value range for the adjusted touchable area.
  • the adjusting module 18 is further configured to adjust the current touchable area according to the third value range.
  • the functional modules of the screen touch device provided in this embodiment can be used to perform the flows of the screen touch methods shown in FIG. 1 to FIG. 3; their specific working principles are not repeated here, and reference may be made to the description of the method embodiments.
  • the screen touch device provided in this embodiment receives a touch instruction on the touch screen, acquires the current position of the instruction on the screen, and determines whether the position is within a touchable area of the screen, where the touchable area is a target area selected from the display area of the screen and is used to respond to the user's operation instructions to the terminal; if the determination result is yes, the touch instruction is executed. In this embodiment, a target area is selected in the display area of the screen as the user's touchable area; when the touch point of a touch instruction falls within the touchable area, the instruction is executed, otherwise execution is refused, reducing the probability of user misoperation. Further, in this embodiment the user can adjust the range of the touchable area according to his own needs, making the screen display more flexible and improving the user experience.
  • FIG. 6 is a schematic structural diagram of a screen touch device according to Embodiment 6 of the present application.
  • the screen touch device of the embodiment of the present application includes a memory 61, one or more processors 62, and one or more programs 63.
  • the one or more programs 63, when executed by the one or more processors 62, perform the screen touch method of any of the above embodiments.
  • the screen touch device of the embodiment of the present application receives a touch instruction on the touch screen, acquires the current position of the instruction on the screen, and determines whether the position is within a touchable area of the screen, where the touchable area is a target area selected from the display area of the screen and is configured to respond to the user's operation instructions to the terminal; if the determination result is yes, the touch instruction is executed. In this embodiment, a target area is selected in the display area of the screen as the user's touchable area; when the touch point of a touch instruction falls within the touchable area, the instruction is executed, otherwise execution is refused, reducing the probability of user misoperation.
  • FIG. 7 is a schematic structural diagram of a computer program product for screen touch provided in Embodiment 7 of the present application.
  • the computer program product 71 for screen touch of the embodiment of the present application may include a signal bearing medium 72.
  • Signal bearing medium 72 may include one or more instructions 73 that, when executed by, for example, a processor, may provide the functionality described above with respect to Figures 1-5.
  • for example, the instructions 73 may include: one or more instructions for receiving a touch instruction on the touch screen; one or more instructions for acquiring the current position of the touch instruction on the screen; one or more instructions for determining whether the position is within a touchable area of the screen, where the touchable area is a target area selected from the display area of the screen and is used to respond to the user's click operations on the terminal; and one or more instructions for executing the touch instruction if the determination result is yes.
  • the screen touch device can perform one or more of the steps shown in FIG. 1 in response to the instructions 73.
  • signal bearing medium 72 can include computer readable media 74 such as, but not limited to, a hard disk drive, a compact disk (CD), a digital versatile disk (DVD), a digital tape, a memory, and the like.
  • the signal bearing medium 72 can include a recordable medium 75 such as, but not limited to, a memory, a read/write (R/W) CD, an R/W DVD, and the like.
  • the signal bearing medium 72 can include a communication medium 76 such as, but not limited to, a digital and/or analog communication medium (eg, fiber optic cable, waveguide, wired communication link, wireless communication link, etc.).
  • the computer program product 71 can be transmitted to one or more modules of the screen touch device via the signal bearing medium 72, where the signal bearing medium 72 is transmitted by a wireless communication medium (e.g., a wireless communication medium compliant with the IEEE 802.11 standard).
  • the computer program product of the embodiment of the present application receives a touch instruction on the touch screen, acquires the current position of the instruction on the screen, and determines whether the position is within a touchable area of the screen, where the touchable area is a target area selected from the display area of the screen and is used to respond to the user's operation instructions to the terminal; if the determination result is yes, the touch instruction is executed. In this embodiment, a target area is selected in the display area of the screen as the user's touchable area; when the touch point of a touch instruction falls within the touchable area, the instruction is executed, otherwise execution is refused, reducing the probability of user misoperation.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A screen touch method and device: a touch instruction on the touch screen is received (101), the current position of the touch instruction on the screen is acquired (102), and it is determined whether the position is within a touchable area of the screen (103), where the touchable area is a target area selected from the display area of the screen and the target area is used to respond to the user's operation instructions to the terminal; if the determination result is yes, the touch instruction is executed (104). In the method and device, a target area is selected in the display area of the screen as the user's touchable area; when the touch point of a touch instruction falls within the touchable area, the instruction is executed, otherwise execution is refused, reducing the probability of user misoperation.

Description

Screen Touch Method and Device
This patent application claims priority to Chinese patent application No. 2015109233618, filed on December 14, 2015, which is incorporated herein by reference in its entirety.
技术领域
本申请涉及电子技术领域,尤其涉及一种屏幕触控方法及装置。
背景技术
目前的智能手机可以对屏幕采用视觉无边框技术,将屏幕四周原有的物理边框进行虚拟化,以使用户对智能手机的显示屏幕具有较好的视觉效果。现有将原来的物理边框虚拟化成屏幕的边框,该屏幕边框并不会在屏幕中显示给用户,并且用户不能对其进行触控操作。
由于智能手机的屏幕边框越来越小,而用户的手指按在屏幕边缘的宽度往往大约预留的屏幕边框的宽度,使得智能手机出现一些误操作。
发明内容
本申请实施例提供一种屏幕触控方法及装置,用于解决由于用户手指按在屏幕边缘的宽度大于预留屏幕边缘的宽度而导致对屏幕误操作的问题。
为了实现上述目的,本申请实施例提供了一种屏幕触控方法,包括:
接收触摸屏幕的触控指令;
获取所述触控指令在屏幕中当前所处的位置;
判断所述位置是否在所述屏幕中的可触控区域内;其中,所述可触控区域为从所述屏幕的显示区域中选取的目标区域;所述目标区域用于响应用户对终端的点击操作;
如果判断结果为是,执行所述触控指令。
为了实现上述目的,本申请实施例提供了一种屏幕触控装置,包 括:
第一接收模块,用于接收触摸屏幕的触控指令;
获取模块,用于获取所述触控指令在屏幕中当前所处的位置;
判断模块,用于判断所述位置是否在所述屏幕中的可触控区域内;其中,所述可触控区域为从所述屏幕的显示区域中选取的目标区域;所述目标区域用于响应用户对终端的点击操作;
执行模块,用于在所述判断模块判断出所述位置在所述屏幕中的可触控区域内时,执行所述触控指令。
另一方面,本申请实施例提供一种屏幕触控装置,包括存储器、一个或多个处理器以及一个或多个程序,其中,所述一个或多个程序在由所述一个或多个处理器执行时执行下述操作:接收触摸屏幕的触控指令;获取所述触控指令在屏幕中当前所处的位置;判断所述位置是否在所述屏幕中的可触控区域内;其中,所述可触控区域为从所述屏幕的显示区域中选取的目标区域;所述目标区域用于响应用户对终端的点击操作;如果判断结果为是,执行所述触控指令。
另一方面,本申请实施例提供一种计算机可读存储介质,所述计算机可读存储介质上存储有计算机可执行指令,所述计算机可执行指令响应于执行使得屏幕触控装置执行操作,所述操作包括:接收触摸屏幕的触控指令;获取所述触控指令在屏幕中当前所处的位置;判断所述位置是否在所述屏幕中的可触控区域内;其中,所述可触控区域为从所述屏幕的显示区域中选取的目标区域;所述目标区域用于响应用户对终端的点击操作;如果判断结果为是,执行所述触控指令。
本申请实施例的屏幕触控方法及装置,通过接收触摸屏幕的触控指令,获取所述触控指令在屏幕中当前所处的位置,判断所述位置是否在所述屏幕中的可触控区域内,其中,所述可触控区域为从所述屏幕的显示区域中选取的目标区域,所述目标区域用于响应用户对终端的操作指令,如果判断结果为是,执行所述触控指令。本申请实施例中在屏幕的显示区域中选取一个目标区域作为用户的可触控区域,当触控指令的触控点处于该可触控区域范围内,执行触控指令,否则拒 绝执行该触控指令,降低了用户误操作的概率。
附图说明
图1为本申请实施例一的屏幕触控方法的流程示意图;
图2为本申请实施例二的屏幕触控方法的流程示意图;
图3为本申请实施例三的屏幕触控方法的流程示意图;
图4为本申请实施例四的屏幕触控装置的结构示意图;
图5为本申请实施例五的屏幕触控装置的结构示意图;
图6为本申请实施例六提供的屏幕触控装置的结构示意图;
图7为本申请实施例七提供的用于屏幕触控的计算机程序产品的结构示意图。
具体实施方式
下面结合附图对本申请实施例提供的屏幕触控方法及装置进行详细描述。
实施例一
如图1所示,其为本申请实施例一的屏幕触控方法的流程示意图,该屏幕触控方法包括:
步骤101、接收触摸屏幕的触控指令。
本实施例中,智能手机可以通过屏幕向用户显示各种应用程序,当用户试图调用某个程序,或者对智能手机进行设置等操作时,用户可以通过点击显示屏向智能手机发送触控指令。
步骤102、获取所述触控指令在屏幕中当前所处的位置。
在接收到触控指令,智能手机基于传感器可以探测获取到该触控指令对应的触控点在屏幕中当前所处的位置。
步骤103、判断所述位置是否在所述屏幕中的可触控区域内。
在获取到触控指令触控点所处位置后,智能手机对该位置进行判断,判断该位置是否在屏幕的可触控区域内。本实施例中,优先为智能手机设置一个可触控区域,该可触控区域为从屏幕的显示区域中选取的目标区域,在目标区域中用户进行点击操作将会被执行。也就是说该目标区域用于响应用户对终端的操作指令。
如果判断出触控点所处的位置在屏幕的可触控区域内,则执行步骤104;否则,执行步骤105。
步骤104、执行所述触控指令。
当判断出触控指令的触控点所处位置在可触控区域内,智能手机则认为该触控指令是用户有意发出的,并不是误操作。智能手机就可以执行该触控指令。
步骤105、拒绝执行所述触控指令。
当判断出触控指令的触控点所处位置不在可触控区域内,智能手机则认为该触控指令是用户无意发出的,是一个误操作。智能手机就可以拒绝执行该触控指令。
本实施例中在屏幕的显示区域中选取一个目标区域作为用户的可触控区域,相当于在显示区域的基础上又为智能手机划分出一个可触控区域,当点击操作发生在可触控区域内,智能手机才会对点击操作进行响应,当点击操作发生在非可触控区域内,智能手机将不会对点击操作进行响应。
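上述"命中可触控区域则执行、否则拒绝"的判断流程,可以用如下示意性草图表示(Python,仅为按本实施例思路给出的假设性实现,TouchRegion、handle_touch 等名称均为说明用的假设命名,并非本申请的实际代码):

```python
# 示意:以矩形范围表示从显示区域中选取的可触控区域,
# 对触控点坐标做命中判断,决定执行或拒绝触控指令。

class TouchRegion:
    """矩形可触控区域:从屏幕显示区域中选取的目标区域。"""

    def __init__(self, left, top, right, bottom):
        self.left, self.top = left, top
        self.right, self.bottom = right, bottom

    def contains(self, x, y):
        # 判断触控点 (x, y) 是否落在可触控区域内
        return self.left <= x <= self.right and self.top <= y <= self.bottom


def handle_touch(region, x, y):
    """命中则执行触控指令(步骤104),否则拒绝执行(步骤105)。"""
    if region.contains(x, y):
        return "execute"
    return "reject"


# 示例:显示区域为 100×80,可触控区域向内收缩,屏幕边缘视为误触
region = TouchRegion(2, 2, 98, 78)
print(handle_touch(region, 50, 40))  # execute
print(handle_touch(region, 99, 40))  # reject
```

靠近屏幕边缘的触控点落在可触控区域之外,因而被拒绝执行,从而降低误操作的概率。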
本实施例提供的屏幕触控方法,通过接收触摸屏幕的触控指令,获取触控指令在屏幕中当前所处的位置,判断所述位置是否在屏幕中的可触控区域内,其中,可触控区域为从屏幕的显示区域中选取的目标区域,目标区域用于响应用户对终端的操作指令,如果判断结果为是,执行触控指令。本实施例中在屏幕的显示区域中选取一个目标区域作为用户的可触控区域,当触控指令的触控点处于该可触控区域范围内,执行触控指令,否则拒绝执行该触控指令,降低了用户误操作的概率。
实施例二
如图2所示,其为本申请实施例二的屏幕触控方法的流程示意图,该屏幕触控方法在上述步骤101接收触摸屏幕的触控指令之前,还包括以下步骤:
步骤201、接收第一调整指令。
其中,所述第一调整指令中携带所述横屏显示模式对应的第一区域的第一取值范围,以及所述竖屏显示模式对应的第二区域的第二取值范围。
一般智能手机的系统中有默认可触控区域的范围设置,在使用时,用户可以根据自己的需求对智能手机的可触控区域进行范围设置。具体地,用户通过显示屏发送对智能手机预存的可触控区域进行调整的第一调整指令,其中,第一调整指令中携带有横屏显示模式对应的第一区域的第一取值范围,以及竖屏显示模式对应的第二区域的第二取值范围。
其中,第一区域表示横屏显示模式对应的可触控区域,第二区域表示竖屏显示模式对应的可触控区域。此处需要说明,第一区域和第二区域只是为了横屏显示模式和竖屏显示模式进行区分,并不是对可触控区域进行排序。
步骤202、根据所述第一调整指令中所述第一取值范围将预存的所述横屏显示模式下的所述可触控区域调整为所述第一区域。
步骤203、根据所述第一调整指令中所述第二取值范围将预存的所述竖屏显示模式下的所述可触控区域调整为所述第二区域。
在获取到第一调整指令后,智能手机可以获取到横屏显示模式下可触控区域的第一取值范围,以及竖屏显示模式下可触控区域的第二取值范围,然后根据第一取值范围将预存的横屏显示模式下的可触控区域调为第一区域,以及根据第二取值范围将预存的竖屏显示模式下的可触控区域调为第二区域。例如,在横屏显示模式下,智能手机屏幕显示区域为100×80,可以将可触控区域的操作范围设置为95×60。
实际应用中,当智能手机横握时,在对可触控区域调整过程中可以多减小长度的可操作范围,而当智能手机竖握时,在对可触控区域调整过程中可以多减小宽度的可操作范围,这样达到减小误操作的目的。
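按显示模式收缩不同方向可操作范围的做法,可以用如下示意性草图表示(Python,仅为说明用的假设实现;其中收缩量 5 和 20 仅沿用正文 100×80 调整为 95×60 的示例数值,并非本申请限定的取值):

```python
# 示意:横屏(横握)时多收缩长度方向的可操作范围,
# 竖屏(竖握)时多收缩宽度方向的可操作范围,以减小误操作。

def adjusted_region(display_w, display_h, mode):
    """返回 (可触控宽度, 可触控高度);mode 取 'landscape' 或 'portrait'。"""
    if mode == "landscape":
        # 横握时手指易压到左右两端,长度方向收缩较少、高度方向收缩较多
        return display_w - 5, display_h - 20
    # 竖握时则在宽度方向多收缩
    return display_w - 20, display_h - 5


print(adjusted_region(100, 80, "landscape"))  # (95, 60)
```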
步骤204、确定所述屏幕当前所处的显示模式。
本实施例中,智能手机上设置有陀螺仪,通过陀螺仪可以确定出智能手机处于横屏显示模式还是竖屏显示模式。其中,横屏显示模式就是当智能手机在用户手中被横握时的显示模式,竖屏显示模式就是当智能手机在用户手中被竖握时的显示模式。
当确定出屏幕处于横屏显示模式时,执行步骤205;当确定出屏幕处于竖屏显示模式时,执行步骤206。
步骤205、当所述屏幕处于所述横屏显示模式时,将与所述横屏显示模式对应的第一区域设置为所述可触控区域。
在确定出屏幕当前处于横屏显示模式时,智能手机就可以直接将根据第一调整指令调整后的横屏显示模式对应的第一区域设置为当前的可触控区域。
步骤206、当所述屏幕处于所述竖屏显示模式时,将与所述竖屏显示模式对应的第二区域设置为所述可触控区域。
在确定出屏幕当前处于竖屏显示模式时,智能手机就可以直接将根据第一调整指令调整后的竖屏显示模式对应的第二区域设置为当前的可触控区域。
本实施例提供的屏幕触控方法,通过接收触摸屏幕的触控指令,获取触控指令在屏幕中当前所处的位置,判断所述位置是否在屏幕中的可触控区域内,其中,可触控区域为从屏幕的显示区域中选取的目标区域,目标区域用于响应用户对终端的操作指令,如果判断结果为是,执行触控指令。本实施例中在屏幕的显示区域中选取一个目标区域作为用户的可触控区域,当触控指令的触控点处于该可触控区域范围内,执行触控指令,否则拒绝执行该触控指令,降低了用户误操作的概率。进一步地,本实施例中用户可以根据自身的需求对可触控区域的范围进行调整,使得屏幕的显示更加灵活,提高了用户体验。
实施例三
如图3所示,其为本申请实施例三的屏幕触控方法的流程示意图,该屏幕触控方法在上述实施例一中步骤104执行所述触控指令之后,还包括以下步骤:
步骤301、接收第二调整指令。
其中,所述第二调整指令中携带调整后所述可触控区域的第三取值范围。
步骤302、按照所述第三取值范围对当前的所述可触控区域进行调整。
在智能手机的实际使用过程中,用户也可以对智能手机的可触控区域进行调整。具体地,用户基于显示屏调用调整菜单,在调整菜单里面对可触控区域进行调整,相应地智能手机接收用户基于调整菜单发送的第二调整指令,其中该第二调整指令中携带有调整后可触控区域对应的第三取值范围。在接收到第二调整指令后,智能手机就按照第三取值范围,将当前的可触控区域的取值范围调整为第三取值范围。
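第二调整指令按第三取值范围覆盖当前可触控区域的过程,可以用如下示意性草图表示(Python,类名与方法名均为说明用的假设命名,并非本申请的实际实现):

```python
# 示意:接收携带第三取值范围的第二调整指令,
# 将当前可触控区域的取值范围直接调整为第三取值范围。

class TouchController:
    def __init__(self, region):
        self.region = region  # 当前可触控区域取值范围,如 (宽, 高)

    def apply_second_adjustment(self, third_range):
        """按第三取值范围对当前的可触控区域进行调整。"""
        self.region = third_range
        return self.region


ctrl = TouchController((95, 60))
print(ctrl.apply_second_adjustment((90, 55)))  # (90, 55)
```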
本实施例提供的屏幕触控方法,通过接收触摸屏幕的触控指令,获取触控指令在屏幕中当前所处的位置,判断所述位置是否在屏幕中的可触控区域内,其中,可触控区域为从屏幕的显示区域中选取的目标区域,目标区域用于响应用户对终端的操作指令,如果判断结果为是,执行触控指令。本实施例中在屏幕的显示区域中选取一个目标区域作为用户的可触控区域,当触控指令的触控点处于该可触控区域范围内,执行触控指令,否则拒绝执行该触控指令,降低了用户误操作的概率。进一步地,本实施例中用户可以根据自身的需求对可触控区域的范围进行调整,使得屏幕的显示更加灵活,提高了用户体验。
实施例四
如图4所示,其为本申请实施例四的屏幕触控装置的结构示意图,该屏幕触控装置包括:第一接收模块11、获取模块12、判断模块13和执行模块14。
其中,第一接收模块11,用于接收触摸屏幕的触控指令。
获取模块12,用于获取触控指令在屏幕中当前所处的位置。
判断模块13,用于判断所述位置是否在屏幕中的可触控区域内。
其中,可触控区域为从屏幕的显示区域中选取的目标区域;目标区域用于响应用户对终端的点击操作。
执行模块14,用于在判断模块判断出位置在屏幕中的可触控区域内时,执行触控指令。
本实施例中,智能手机可以通过屏幕向用户显示各种应用程序,当用户试图调用某个程序,或者对智能手机进行设置等操作时,用户可以通过点击显示屏向第一接收模块11发送触控指令。
在接收到触控指令后,获取模块12探测获取该触控指令对应的触控点在屏幕中当前所处的位置。在获取到触控指令触控点所处位置后,判断模块13对该位置进行判断,判断该位置是否在屏幕的可触控区域内。本实施例中,预先为智能手机设置一个可触控区域,该可触控区域为从屏幕的显示区域中选取的目标区域,在目标区域中用户进行的点击操作将会被执行。也就是说该目标区域用于响应用户对终端的操作指令。
当判断出触控指令的触控点所处位置在可触控区域内,智能手机则认为该触控指令是用户有意发出的,并不是误操作。执行模块14就可以执行该触控指令。
而当判断出触控指令的触控点所处位置不在可触控区域内,智能手机则认为该触控指令是用户无意发出的,是一个误操作。执行模块14就可以拒绝执行该触控指令。
本实施例中在屏幕的显示区域中选取一个目标区域作为用户的可触控区域,相当于在显示区域的基础上又为智能手机划分出一个可触控区域,当点击操作发生在可触控区域内,智能手机才会对点击操作进行响应,当点击操作发生在非可触控区域内,智能手机将不会对点击操作进行响应。
本实施例提供的屏幕触控装置,通过接收触摸屏幕的触控指令,获取触控指令在屏幕中当前所处的位置,判断位置是否在屏幕中的可触控区域内,其中,可触控区域为从屏幕的显示区域中选取的目标区域,目标区域用于响应用户对终端的操作指令,如果判断结果为是,执行触控指令。本实施例中在屏幕的显示区域中选取一个目标区域作为用户的可触控区域,当触控指令的触控点处于该可触控区域范围内,执行触控指令,否则拒绝执行该触控指令,降低了用户误操作的概率。
实施例五
如图5所示,其为本申请实施例五的屏幕触控装置的结构示意图,该屏幕触控装置除了包括上述实施例四中的第一接收模块11、获取模块12、判断模块13和执行模块14之外,还包括:确定模块15、设置模块16、第二接收模块17和调整模块18。
进一步地,确定模块15,用于在第一接收模块11接收触摸屏幕的触控指令之前,确定屏幕当前所处的显示模式。
其中,显示模式包括横屏显示模式和竖屏显示模式;
设置模块16,用于在屏幕处于横屏显示模式时,将与横屏显示模式对应的第一区域设置为可触控区域,以及在屏幕处于竖屏显示模式时,将与竖屏显示模式对应的第二区域设置为可触控区域。
进一步地,第二接收模块17,用于在确定模块15确定屏幕当前所处的显示模式之前,接收第一调整指令,第一调整指令中携带横屏显示模式对应的第一区域的第一取值范围,以及竖屏显示模式对应的第二区域的第二取值范围;
调整模块18,用于根据第一调整指令中第一取值范围将预存的横屏显示模式下的可触控区域调整为第一区域,以及根据第一调整指令中第二取值范围将预存的竖屏显示模式下的可触控区域调整为第二区域。
进一步地,第二接收模块17,还用于在执行模块14执行触控指令后,接收第二调整指令;其中,第二调整指令中携带调整后可触控区域的第三取值范围;
进一步地,调整模块18,还用于按照第三取值范围对当前的可触控区域进行调整。
本实施例提供的屏幕触控装置的各功能模块可用于执行图1~3中所示的屏幕触控方法的流程,其具体工作原理不再赘述,详见方法实施例的描述。
本实施例提供的屏幕触控装置,通过接收触摸屏幕的触控指令,获取触控指令在屏幕中当前所处的位置,判断所述位置是否在屏幕中的可触控区域内,其中,可触控区域为从屏幕的显示区域中选取的目标区域,目标区域用于响应用户对终端的操作指令,如果判断结果为是,执行触控指令。本实施例中在屏幕的显示区域中选取一个目标区域作为用户的可触控区域,当触控指令的触控点处于该可触控区域范围内,执行触控指令,否则拒绝执行该触控指令,降低了用户误操作的概率。
进一步地,本实施例中用户可以根据自身的需求对可触控区域的范围进行调整,使得屏幕的显示更加灵活,提高了用户体验。
实施例六
图6为本申请实施例六提供的屏幕触控装置的结构示意图。如图6所示,本申请实施例的屏幕触控装置包括:存储器61、一个或多个处理器62以及一个或多个程序63。
其中,所述一个或多个程序63在由一个或多个处理器62执行时执行上述实施例中的任意一种方法。
本申请实施例的屏幕触控装置,通过接收触摸屏幕的触控指令,获取所述触控指令在屏幕中当前所处的位置,判断所述位置是否在所述屏幕中的可触控区域内,其中,所述可触控区域为从所述屏幕的显示区域中选取的目标区域,所述目标区域用于响应用户对终端的操作指令,如果判断结果为是,执行所述触控指令。本申请实施例中在屏幕的显示区域中选取一个目标区域作为用户的可触控区域,当触控指令的触控点处于该可触控区域范围内,执行触控指令,否则拒绝执行该触控指令,降低了用户误操作的概率。
实施例七
图7为本申请实施例七提供的用于屏幕触控的计算机程序产品的结构示意图。如图7所示,本申请实施例的用于屏幕触控的计算机程序产品71,可以包括信号承载介质72。信号承载介质72可以包括一个或更多个指令73,该指令73在由例如处理器执行时,处理器可以提供以上针对图1-5描述的功能。例如,指令73可以包括:用于接收触摸屏幕的触控指令的一个或多个指令;用于获取所述触控指令在屏幕中当前所处的位置的一个或多个指令;用于判断所述位置是否在所述屏幕中的可触控区域内的一个或多个指令;其中,所述可触控区域为从所述屏幕的显示区域中选取的目标区域;所述目标区域用于响应用户对终端的点击操作;以及用于如果判断结果为是,执行所述触控指令的一个或多个指令。因此,例如,参照图4,屏幕触控装置可以响应于指令73来进行图1中所示的步骤中的一个或更多个。
在一些实现中,信号承载介质72可以包括计算机可读介质74,诸如但不限于硬盘驱动器、压缩盘(CD)、数字通用盘(DVD)、数字带、存储器等。在一些实现中,信号承载介质72可以包括可记录介质75,诸如但不限于存储器、读/写(R/W)CD、R/W DVD等。在一些实现中,信号承载介质72可以包括通信介质76,诸如但不限于数字和/或模拟通信介质(例如,光纤线缆、波导、有线通信链路、无线通信链路等)。因此,例如,计算机程序产品71可以通过RF信号承载介质72传送给屏幕触控装置的一个或多个模块,其中,信号承载介质72由无线通信介质(例如,符合IEEE 802.11标准的无线通信介质)传送。
本申请实施例的计算机程序产品,通过接收触摸屏幕的触控指令,获取所述触控指令在屏幕中当前所处的位置,判断所述位置是否在所述屏幕中的可触控区域内,其中,所述可触控区域为从所述屏幕的显示区域中选取的目标区域,所述目标区域用于响应用户对终端的操作指令,如果判断结果为是,执行所述触控指令。本申请实施例中在屏幕的显示区域中选取一个目标区域作为用户的可触控区域,当触控指令的触控点处于该可触控区域范围内,执行触控指令,否则拒绝执行该触控指令,降低了用户误操作的概率。
通过以上的实施方式的描述,本领域的技术人员可以清楚地了解到各实施方式可借助软件加必需的通用硬件平台的方式来实现,当然也可以通过硬件。基于这样的理解,上述技术方案本质上或者说对现有技术做出贡献的部分可以以软件产品的形式体现出来,该计算机软件产品可以存储在计算机可读存储介质中,如ROM/RAM、磁碟、光盘等,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)执行各个实施例或者实施例的某些部分所述的方法。
最后应说明的是:以上各实施例仅用以说明本申请的技术方案,而非对其限制;尽管参照前述各实施例对本申请进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分或者全部技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本申请各实施例技术方案的范围。

Claims (8)

  1. 一种屏幕触控方法,其特征在于,包括:
    接收触摸屏幕的触控指令;
    获取所述触控指令在屏幕中当前所处的位置;
    判断所述位置是否在所述屏幕中的可触控区域内;其中,所述可触控区域为从所述屏幕的显示区域中选取的目标区域;所述目标区域用于响应用户对终端的点击操作;
    如果判断结果为是,执行所述触控指令。
  2. 根据权利要求1所述的屏幕触控方法,其特征在于,所述接收触摸屏幕的触控指令之前,还包括:
    确定所述屏幕当前所处的显示模式;其中,所述显示模式包括横屏显示模式和竖屏显示模式;
    当所述屏幕处于所述横屏显示模式时,将与所述横屏显示模式对应的第一区域设置为所述可触控区域;
    当所述屏幕处于所述竖屏显示模式时,将与所述竖屏显示模式对应的第二区域设置为所述可触控区域。
  3. 根据权利要求2所述的屏幕触控方法,其特征在于,所述确定屏幕当前所处的显示模式之前,还包括:
    接收第一调整指令,所述第一调整指令中携带所述横屏显示模式对应的所述第一区域的第一取值范围,以及所述竖屏显示模式对应的所述第二区域的第二取值范围;
    根据所述第一调整指令中所述第一取值范围将预存的所述横屏显示模式下的所述可触控区域调整为所述第一区域;
    根据所述第一调整指令中所述第二取值范围将预存的所述竖屏显示模式下的所述可触控区域调整为所述第二区域。
  4. 根据权利要求1所述的屏幕触控方法,其特征在于,所述执行所述触控指令之后,还包括:
    接收第二调整指令;其中,所述第二调整指令中携带调整后所述可触控区域的第三取值范围;
    按照所述第三取值范围对当前的所述可触控区域进行调整。
  5. 一种屏幕触控装置,其特征在于,包括:
    第一接收模块,用于接收触摸屏幕的触控指令;
    获取模块,用于获取所述触控指令在屏幕中当前所处的位置;
    判断模块,用于判断所述位置是否在所述屏幕中的可触控区域内;其中,所述可触控区域为从所述屏幕的显示区域中选取的目标区域;所述目标区域用于响应用户对终端的点击操作;
    执行模块,用于在所述判断模块判断出所述位置在所述屏幕中的可触控区域内时,执行所述触控指令。
  6. 根据权利要求5所述的屏幕触控装置,其特征在于,还包括:
    确定模块,用于在所述第一接收模块接收触摸屏幕的触控指令之前,确定屏幕当前所处的显示模式;其中,所述显示模式包括横屏显示模式和竖屏显示模式;
    设置模块,用于在所述屏幕处于所述横屏显示模式时,将与所述横屏显示模式对应的第一区域设置为所述可触控区域,以及在所述屏幕处于所述竖屏显示模式时,将与所述竖屏显示模式对应的第二区域设置为所述可触控区域。
  7. 根据权利要求6所述的屏幕触控装置,其特征在于,还包括:
    第二接收模块,用于在所述确定模块确定屏幕当前所处的显示模式之前,接收第一调整指令,所述第一调整指令中携带所述横屏显示模式对应的所述第一区域的第一取值范围,以及所述竖屏显示模式对应的所述第二区域的第二取值范围;
    调整模块,用于根据所述第一调整指令中所述第一取值范围将预存的所述横屏显示模式下的所述可触控区域调整为所述第一区域,以及根据所述第一调整指令中所述第二取值范围将预存的所述竖屏显示模式下的所述可触控区域调整为所述第二区域。
  8. 根据权利要求7所述的屏幕触控装置,其特征在于,所述第二接收模块,还用于在所述执行模块执行所述触控指令后,接收第二调整指令;其中,所述第二调整指令中携带调整后所述可触控区域的第三取值范围;
    所述调整模块,还用于按照所述第三取值范围对当前的所述可触控区域进行调整。
PCT/CN2016/089010 2015-12-14 2016-07-07 屏幕触控方法及装置 WO2017101384A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510923361.8 2015-12-14
CN201510923361.8A CN105892842A (zh) 2015-12-14 2015-12-14 屏幕触控方法及装置

Publications (1)

Publication Number Publication Date
WO2017101384A1 true WO2017101384A1 (zh) 2017-06-22

Family

ID=57002943

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/089010 WO2017101384A1 (zh) 2015-12-14 2016-07-07 屏幕触控方法及装置

Country Status (2)

Country Link
CN (1) CN105892842A (zh)
WO (1) WO2017101384A1 (zh)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106569707A (zh) * 2016-10-21 2017-04-19 深圳众思科技有限公司 基于触控屏的触控屏蔽方法和终端
CN106572245B (zh) * 2016-10-31 2020-03-27 努比亚技术有限公司 一种防误触装置及方法
CN107831966A (zh) * 2017-10-27 2018-03-23 北京珠穆朗玛移动通信有限公司 防误触的方法、移动终端及存储介质
CN110865734B (zh) * 2019-11-13 2022-10-25 北京字节跳动网络技术有限公司 目标对象显示方法、装置、电子设备和计算机可读介质
CN114741039B (zh) * 2020-12-24 2023-09-08 华为技术有限公司 设备控制方法和终端设备
CN113504840B (zh) * 2021-05-12 2023-11-07 深圳市爱协生科技股份有限公司 触控数据处理方法、装置以及触控屏

Citations (6)

CN101339475A (zh) * 2008-09-08 2009-01-07 友达光电股份有限公司 触控屏幕装置的控制方法
CN101515220A (zh) * 2009-04-13 2009-08-26 青岛海信移动通信技术股份有限公司 对触摸屏幕中窗口移动、缩放和控制的方法及相关装置
CN101937311A (zh) * 2010-09-09 2011-01-05 宇龙计算机通信科技(深圳)有限公司 一种移动终端及图标控制方法
CN102929504A (zh) * 2012-11-12 2013-02-13 吴增国 触摸屏按键优化设置方法及***
WO2013167063A2 (zh) * 2012-12-14 2013-11-14 中兴通讯股份有限公司 一种触摸屏终端、控制装置以及触摸屏终端的工作方法
CN104932821A (zh) * 2015-06-02 2015-09-23 青岛海信移动通信技术股份有限公司 一种智能终端操作界面的显示方法及智能终端

Family Cites Families (2)

JP4198527B2 (ja) * 2003-05-26 2008-12-17 富士通コンポーネント株式会社 タッチパネル及び表示装置
CN105120160B (zh) * 2015-08-27 2019-05-03 努比亚技术有限公司 拍摄装置和拍摄方法

Also Published As

Publication number Publication date
CN105892842A (zh) 2016-08-24

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16874446

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16874446

Country of ref document: EP

Kind code of ref document: A1