CN110308821B - Touch response method and electronic equipment - Google Patents

Touch response method and electronic equipment

Info

Publication number
CN110308821B
CN110308821B CN201910583474.6A CN201910583474A
Authority
CN
China
Prior art keywords
touch
touch operation
opening area
input information
display screen
Prior art date
Legal status
Active
Application number
CN201910583474.6A
Other languages
Chinese (zh)
Other versions
CN110308821A (en)
Inventor
钱明贵
Current Assignee
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201910583474.6A
Publication of CN110308821A
Application granted
Publication of CN110308821B

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 - Control or interface arrangements specially adapted for digitisers
    • G06F3/044 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means, by capacitive means

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure provides a touch response method, including: acquiring touch input information; determining, based on the touch input information, a touch operation represented by the touch input information, where the touch operation includes a touch operation directed at an opening area of a display screen; and responding to the touch operation directed at the opening area of the display screen. The present disclosure also provides an electronic device.

Description

Touch response method and electronic equipment
Technical Field
The disclosure relates to a touch response method and an electronic device.
Background
With the rapid development of electronic devices, their displays are trending toward full-screen designs. To retain the front-facing camera, a display screen with an opening may be adopted, with the front-facing camera arranged in the opening area of the display screen.
Because of the opening area, no content can be displayed there. However, the display pages of the applications preinstalled on the electronic device are designed for the full display output area, i.e., they include display content corresponding to the opening area. If that content includes operable content that supports interaction, the user cannot operate it by touching the opening area. Even when the user knows that an interactive control lies in the opening area, existing electronic devices can neither detect a touch in the opening area nor respond to it. The response therefore fails to match the user's intent, and the user experience is poor.
Disclosure of Invention
One aspect of the present disclosure provides a touch response method capable of responding to a touch operation on an opening area of a display screen. The method includes: acquiring touch input information; determining, based on the touch input information, a touch operation represented by the touch input information, where the touch operation includes a touch operation directed at the opening area of the display screen; and responding to the touch operation directed at the opening area of the display screen.
Optionally, obtaining the touch input information includes: obtaining image information of an operation body through an image acquisition assembly whose lens is at least partially disposed in the opening of the display screen. Determining the touch operation directed at the opening area of the display screen includes: processing the image information into a touch operation of the operation body directed at the opening area of the display screen.
Optionally, the display screen includes a display panel and a touch layer located above the display panel. The display panel has an opening, the touch layer covers both the display panel and the opening area, and the opening area corresponds to the opening.
Optionally, the display screen includes a display panel and a touch layer located above the display panel, the touch layer corresponding to the display panel. The display panel has an opening, the touch layer covers the display panel but does not cover the opening area, the opening area corresponds to the opening, and the touch layer is provided with touch sensing units. Obtaining the touch input information includes: obtaining touch sensing parameters through the touch layer. Determining the touch operation directed at the opening area of the display screen includes: if the touch sensing parameters include a first parameter sensed by the touch sensing units surrounding the opening, processing the first parameter into a touch operation of the operation body directed at the opening area.
Optionally, the touch operation further includes a touch operation directed at the area outside the opening area. Determining the touch operation outside the opening area includes: if the touch sensing parameters include a second parameter sensed by the touch sensing units, processing the second parameter into a touch operation of the operation body directed at the area outside the opening area. The touch response method further includes: responding to the touch operation outside the opening area. The first parameter is a partial parameter of the second parameter.
Optionally, before determining the touch operation represented by the touch input information, the touch response method further includes: determining the number of touch operations and the operation position of each touch operation according to the touch sensing parameters. Determining the touch operation represented by the touch input information then includes: processing the touch input information, according to the number of operations and the operation positions, into a touch operation of the operation body directed at the opening area of the display screen and a touch operation of the operation body directed at the area outside the opening area of the display screen.
Optionally, before determining the touch operation represented by the touch input information, the touch response method further includes: determining the operation position of the touch operation according to the touch sensing parameters. Determining the touch operation represented by the touch input information includes: processing the first parameter, according to the operation position, into a touch operation of the operation body directed at the opening area of the display screen.
Optionally, responding to the touch operation directed at the opening area of the display screen includes: responding to the touch operation based on the touch operation and the display content in the display interface that corresponds to the opening area.
Another aspect of the disclosure provides an electronic device including a display screen and a processor. The display screen has an opening, and the processor is configured to perform the following operations: acquiring touch input information; determining, based on the touch input information, a touch operation represented by the touch input information, where the touch operation includes a touch operation directed at an opening area corresponding to the opening of the display screen; and responding to the touch operation directed at the opening area of the display screen.
Optionally, the electronic device further includes an image acquisition assembly that includes a lens at least partially disposed in the opening of the display screen. The image acquisition assembly is configured to acquire image information of an operation body, and the processor is configured to process the image information into a touch operation of the operation body directed at the opening area of the display screen.
Optionally, the display screen includes a display panel and a touch layer located above the display panel. The display panel has an opening, and the touch layer covers both the display panel and the opening area.
Optionally, the display screen includes a display panel and a touch layer located above the display panel, the touch layer corresponding to the display panel. The display panel has an opening, the touch layer covers the display panel but does not cover the opening area, and the touch layer is provided with touch sensing units. The processor is configured to: obtain touch sensing parameters through the touch layer; and, if the touch sensing parameters include a first parameter sensed by the touch sensing units surrounding the opening, process the first parameter into a touch operation of the operation body directed at the opening area.
Optionally, the touch operation further includes a touch operation directed at the area outside the opening area. The processor is further configured to: if the touch sensing parameters include a second parameter sensed by the touch sensing units, process the second parameter into a touch operation of the operation body directed at the area outside the opening area; and respond to the touch operation outside the opening area. The first parameter is a partial parameter of the second parameter.
Optionally, the processor is further configured to: before determining the touch operation represented by the touch input information, determine the number of touch operations and the operation position of each touch operation according to the touch sensing parameters. Determining the touch operation represented by the touch input information then includes: processing the touch input information, according to the number of operations and the operation positions, into a touch operation of the operation body directed at the opening area of the display screen and a touch operation of the operation body directed at the area outside the opening area of the display screen.
Optionally, the processor is further configured to determine the operation position of the touch operation according to the touch sensing parameters before determining the touch operation represented by the touch input information. Determining the touch operation represented by the touch input information includes: processing the first parameter, according to the operation position, into a touch operation of the operation body directed at the opening area of the display screen.
Optionally, the processor is configured to respond to the touch operation based on the touch operation and the display content in the display interface that corresponds to the opening area.
Another aspect of the present disclosure provides an electronic device including a display screen, one or more processors, and a storage device storing one or more programs which, when executed by the one or more processors, cause the one or more processors to perform the touch response method described above.
Another aspect of the present disclosure provides a computer-readable storage medium storing computer-executable instructions that, when executed by a processor, cause the processor to perform the touch response method described above.
Another aspect of the disclosure provides a computer program comprising computer executable instructions for implementing the touch response method as described above when executed.
Drawings
For a more complete understanding of the present disclosure and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:
fig. 1 schematically illustrates an application scenario diagram of a touch response method and an electronic device according to an embodiment of the present disclosure;
fig. 2 schematically shows a flowchart of a touch response method according to a first exemplary embodiment of the disclosure;
fig. 3A schematically shows a block diagram of an electronic device according to a first exemplary embodiment of the present disclosure;
fig. 3B schematically shows a flowchart of a touch response method according to a second exemplary embodiment of the present disclosure;
fig. 4A schematically shows a block diagram of an electronic device according to a second exemplary embodiment of the present disclosure;
fig. 4B schematically shows a flowchart of a touch response method according to a third exemplary embodiment of the present disclosure;
fig. 4C schematically illustrates a flowchart of a touch response method according to a fourth exemplary embodiment of the present disclosure;
FIG. 5A schematically illustrates a partial flowchart of a touch response method according to a fifth exemplary embodiment of the present disclosure;
FIG. 5B schematically shows a partial flowchart of a touch response method according to a sixth exemplary embodiment of the present disclosure;
fig. 6 schematically shows a block diagram of an electronic device according to a third exemplary embodiment of the present disclosure; and
fig. 7 schematically shows a block diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It should be understood that the description is illustrative only and is not intended to limit the scope of the present disclosure. In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the disclosure. It may be evident, however, that one or more embodiments may be practiced without these specific details. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. The terms "comprises," "comprising," and the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It is noted that the terms used herein should be interpreted as having a meaning that is consistent with the context of this specification and should not be interpreted in an idealized or overly formal sense.
Where a convention analogous to "at least one of A, B and C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B and C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). Where a convention analogous to "at least one of A, B or C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B or C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.).
Some block diagrams and/or flow diagrams are shown in the figures. It will be understood that some blocks of the block diagrams and/or flowchart illustrations, or combinations thereof, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the instructions, which execute via the processor, create means for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks. The techniques of this disclosure may be implemented in hardware and/or software (including firmware, microcode, etc.). In addition, the techniques of this disclosure may take the form of a computer program product on a computer-readable storage medium having instructions stored thereon for use by or in connection with an instruction execution system.
The embodiments of the present disclosure provide a touch response method and an electronic device. The touch response method includes: acquiring touch input information; determining, based on the touch input information, a touch operation represented by the touch input information, where the touch operation includes a touch operation directed at an opening area of a display screen; and responding to the touch operation directed at the opening area of the display screen.
According to the touch response method and the electronic device of the embodiments of the present disclosure, the electronic device can respond to a touch operation on the opening area of the display screen while preserving the front-facing camera function. The response to the touch operation therefore matches the user's intent, and the user experience is improved.
Fig. 1 schematically shows an application scene diagram of a touch response method and an electronic device according to an embodiment of the disclosure. It should be noted that fig. 1 is only an example of a scenario in which the embodiments of the present disclosure may be applied to help those skilled in the art understand the technical content of the present disclosure, but does not mean that the embodiments of the present disclosure may not be applied to other devices, systems, environments or scenarios.
As shown in fig. 1, an application scenario 100 of the present disclosure includes an electronic device 110 and an operator 120.
The electronic device 110 has a display screen 111, which may be, for example, a display screen with an opening. For example, as shown in fig. 1, the display screen 111 has an opening 112. According to an embodiment of the present disclosure, the electronic device 110 includes, but is not limited to, a laptop computer, a tablet computer, a smart phone, a smart wearable device, and the like.
The electronic device 110 may have, for example, a processing function and a touch sensing function, so that it can respond to a touch operation of the operation body 120 on the display screen 111 when the touch operation is detected. The touch operation may be detected, for example, by acquiring image information above the display screen, by detecting a change in capacitance with an integrated circuit chip (IC chip) integrated in the touch layer, or by various sensors (e.g., pressure sensors).
The screen displayed on the display screen 111 may include, for example, operable content. When the electronic device detects a touch operation of the operation body on the area where the operable content is located, it responds to the touch operation according to that operable content. For example, if the operable content controls the movement of an object in the displayed screen, responding to a touch operation on it causes the object to move.
The operation body 120 may be, for example, any operation body that the electronic device can sense, such as a user's finger, a capacitive stylus, or a touch-screen pen, which is not limited in this disclosure.
It should be noted that the touch response method provided by the embodiment of the present disclosure may be generally executed by the electronic device 110. Accordingly, the electronic device provided by the embodiments of the present disclosure may generally be the electronic device 110.
It should be understood that the number and type of electronic devices and operation bodies in fig. 1 are merely illustrative. There may be any number and type of electronic devices and operation bodies, as required by the implementation.
Fig. 2 schematically shows a flowchart of a touch response method according to a first exemplary embodiment of the disclosure.
As shown in fig. 1, the electronic device 110 of an embodiment of the present disclosure may include a display screen 111 and a processor.
The display screen 111 has an opening 112, and the processor is used to respond to a touch operation. According to an embodiment of the present disclosure, to enable the processor to respond to the touch operation, the touch response method shown in fig. 2 may be performed by the electronic device. The touch response method may include operations S210 to S230.
In operation S210, touch input information is obtained.
According to an embodiment of the present disclosure, the touch input information may include, for example, parameter values of the various areas of the display screen 111 detected by the electronic device 110, where the parameter values may include, for example, capacitance values or pressure values. When the operation body 120 performs a touch operation on a certain area of the display screen 111, the parameters detected by the electronic device for that area change. The electronic device 110 can therefore determine whether a touch operation exists according to the parameters of the respective areas.
According to an embodiment of the present disclosure, the touch input information may include, for example, the capacitance of each area of the display screen 111 detected by the electronic device 110. When the operation body 120 performs a touch operation on a certain area of the display screen, the capacitance value detected for that area changes; for example, it increases. The electronic device 110 can therefore determine whether a touch operation exists from the capacitance variation of each area.
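The capacitance-change detection described above can be illustrated with a minimal sketch. The grid layout, baseline value, and threshold below are illustrative assumptions, not values from the patent; the patent only states that a touch increases the sensed capacitance of the touched area.

```python
# Illustrative sketch of detecting touched areas from per-region capacitance readings.
BASELINE_PF = 10.0      # assumed idle capacitance per region (picofarads)
TOUCH_DELTA_PF = 1.5    # assumed minimum increase that counts as a touch

def detect_touched_regions(capacitance_grid):
    """Return (row, col) indices whose capacitance rose above the baseline."""
    touched = []
    for r, row in enumerate(capacitance_grid):
        for c, value in enumerate(row):
            if value - BASELINE_PF >= TOUCH_DELTA_PF:
                touched.append((r, c))
    return touched

# Example: one region reports an elevated value, so it is reported as touched.
grid = [[10.0, 10.1, 10.0],
        [10.0, 12.3, 10.1],
        [10.0, 10.0, 10.0]]
print(detect_touched_regions(grid))  # [(1, 1)]
```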
In operation S220, a touch operation represented by the touch input information is determined based on the touch input information.
According to the embodiment of the present disclosure, when it is determined from the parameter values detected in different areas of the display screen that a touch operation exists, the area of the display screen where the touch operation of the operation body is located can be determined from the relative magnitudes of the parameter values. The parameters detected in that area are then converted into the touch operation represented by the touch input information.
According to an embodiment of the present disclosure, the determined touch operation may include, for example, a touch operation on the opening area of the display screen 111 and/or a touch operation outside the opening area of the display screen. The opening area may be, for example, the area covered by the opening 112 of the display screen 111. Alternatively, the opening area may be a region of arbitrary shape with a predetermined area (for example, 2 cm²) that is centered on the center of the opening 112 and covers the opening 112. The method for determining a touch operation on the opening area is described in detail with reference to fig. 3A and fig. 3B, and the method for determining a touch operation outside the opening area is described with reference to fig. 4B and fig. 4C, which are not repeated here.
After the electronic device 110 determines the touch operation, it may respond to the touch operation. According to the embodiment of the present disclosure, by responding to the touch operation, the electronic device can execute the operation that corresponds to the user's intent.
According to an embodiment of the present disclosure, when the touch operation includes a touch operation directed at the opening area of the display screen 111, responding to the touch operation includes operation S230 of responding to the touch operation directed at the opening area of the display screen 111.
According to an embodiment of the present disclosure, in order to respond accurately to the touch operation on the opening area, operation S230 may include responding to the touch operation based on the touch operation and the display content in the display interface that corresponds to the opening area. The screen presented by the display interface has at least one piece of operable content, and in the current display state of the electronic device (for example, a landscape display state), one piece of that operable content lies in the opening area of the display screen and therefore cannot be displayed by the display interface. Operation S230 may include, for example: when it is determined that the touch operation includes a touch operation directed at the opening area of the display screen, responding to the touch operation according to the operable content located in the opening area, so that the operation body can operate the operable content located in the opening area of the display screen.
According to the embodiment of the present disclosure, when the operable content located in the opening area of the display screen is an operation control that makes an object in the displayed screen jump, the object can be made to jump by responding to the touch operation directed at the opening area of the display screen. When the electronic device changes its current state (for example, from a landscape display state to a portrait display state), the operable content that was originally located in the opening area may no longer be located there and can then be displayed by the display interface.
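A hedged sketch of operation S230 follows: an opening-area touch is dispatched to the operable content whose position corresponds to the opening area. The Control class, the bounds format, and the "jump" example are illustrative assumptions layered on the behavior described above, not structures defined by the patent.

```python
from dataclasses import dataclass

@dataclass
class Control:
    name: str
    bounds: tuple          # (x, y, width, height) in screen coordinates (assumed layout)
    action: callable

    def contains(self, x, y):
        bx, by, bw, bh = self.bounds
        return bx <= x <= bx + bw and by <= y <= by + bh

def respond_to_opening_touch(opening_center, controls):
    """Invoke the control, if any, whose bounds cover the opening area."""
    x, y = opening_center
    for control in controls:
        if control.contains(x, y):
            return control.action()
    return None  # no operable content under the opening area

# Example: in landscape mode a "jump" control happens to sit under the opening.
controls = [Control("jump", (20, 10, 40, 40), lambda: "character jumps")]
print(respond_to_opening_touch((35, 25), controls))  # character jumps
```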
In summary, the electronic device and the touch response method executed by it according to the embodiments of the present disclosure can respond to a touch operation on the opening area of the display screen. This avoids the poor user experience caused by being unable to operate a control when the displayed screen contains operable content corresponding to the opening area. The response result therefore meets the user's expectation, and the user experience is improved.
Fig. 3A schematically shows a structure diagram of an electronic device according to a first exemplary embodiment of the present disclosure, and fig. 3B schematically shows a flowchart of a touch response method according to a second exemplary embodiment of the present disclosure.
According to an embodiment of the present disclosure, as shown in the exploded structural view of fig. 3A, the electronic device may further include an image acquisition assembly 303 in addition to the display screen 111 and the processor. The display screen 111 may include a display panel 301 and a touch layer 302 disposed above the display panel 301, and the display panel 301 has the opening 112.
The image acquisition assembly 303 includes a lens, which is at least partially disposed in the opening 112 of the display screen. Through the lens, the image acquisition assembly 303 can capture images within the lens's field of view. For example, the image acquisition assembly 303 captures, through the lens, an object located on the side of the touch layer 302 facing away from the display panel 301.
Therefore, when the operation body 120 performs a touch operation on the display screen 111, it actually touches the side of the touch layer 302 facing away from the display panel 301. The image acquisition assembly 303 may obtain image information of the operation body 120 through the lens and send it to the processor. The processor may determine, for example by recognition processing on the image, whether the acquired image information includes image information captured while the operation body 120 performs a touch operation on the opening area of the display screen.
When the processor determines that the obtained image information includes image information captured while the operation body 120 performs a touch operation on the opening area of the display screen, it may take that image information as the touch input information, and the touch operation represented by the touch input information can then be determined through operation S220. In this case, as shown in fig. 3B, the touch input information may be obtained through operation S310: obtaining image information of the operation body through the image acquisition assembly 303. Accordingly, the touch operation represented by the touch input information may be determined through operation S320: processing the image information into a touch operation of the operation body directed at the opening area of the display screen.
According to an embodiment of the present disclosure, the processor's method of recognizing the image information may include, for example: acquiring image features of the image information (for example, image contour features and image texture information), and comparing them with pre-stored image features of the operation body. When the acquired image features match the pre-stored image features of the operation body, it is determined that the acquired image includes image information captured while the operation body 120 performs a touch operation on the opening area of the display screen.
According to another embodiment, the processor's method of recognizing the image information may include: acquiring image features of the image, and determining from them the object included in the image and the distance between that object and the lens of the image acquisition assembly 303. When the image includes the operation body 120 and the distance between the operation body 120 and the lens is smaller than a predetermined distance, it is determined that the acquired image information includes image information captured while the operation body 120 performs a touch operation on the opening area of the display screen.
According to yet another embodiment, the processor's method of recognizing the image information may include: comparing a plurality of pieces of image information captured by the image acquisition assembly within a predetermined time and determining whether there is image information with a large pixel contrast between them (for example, when the operation body touches the opening area the captured image is black or another dark color, and when the operation body leaves the opening area the captured image is white or another light color). If such image information exists, it can be determined that the operation body performed a touch operation on the opening area, and thus that the acquired image information includes image information captured while the operation body 120 performed a touch operation on the opening area of the display screen.
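The brightness-contrast heuristic just described can be sketched as follows. The grayscale frame format, the two brightness levels, and the frame-to-frame comparison are simplifying assumptions for illustration; the patent only requires detecting a large contrast between images captured within a predetermined time.

```python
DARK_LEVEL = 40     # assumed mean gray level of a covered lens (0-255 scale)
BRIGHT_LEVEL = 120  # assumed mean gray level of an uncovered lens

def mean_brightness(frame):
    pixels = [p for row in frame for p in row]
    return sum(pixels) / len(pixels)

def detect_opening_tap(frames):
    """Return True if a dark frame is followed by a bright frame within the window."""
    levels = [mean_brightness(f) for f in frames]
    for earlier, later in zip(levels, levels[1:]):
        if earlier <= DARK_LEVEL and later >= BRIGHT_LEVEL:
            return True
    return False

# A covered frame (finger on the opening) followed by an uncovered frame reads as a tap.
dark = [[20, 22], [21, 20]]
bright = [[150, 148], [149, 151]]
print(detect_opening_tap([dark, bright]))  # True
```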
According to embodiments of the present disclosure, the image acquisition assembly may operate, for example, under the control of the processor. For example, the processor may first determine whether the display content currently shown on the display screen includes operable content whose position corresponds to the opening area. If it does, the processor keeps the image acquisition assembly in the working state so that it can acquire the image information; if it does not, the processor keeps the image acquisition assembly in a non-working state. Alternatively, the image acquisition assembly may be placed in the working or non-working state in response to a user setting of the electronic device system.
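A minimal sketch of this control decision, assuming a rectangular bounds format for operable content and a camera object with start()/stop() methods; both are illustrative assumptions rather than APIs defined by the patent.

```python
def overlaps_opening(bounds, opening_center):
    """bounds: assumed (x, y, width, height) rectangle; opening_center: (x, y)."""
    x, y = opening_center
    bx, by, bw, bh = bounds
    return bx <= x <= bx + bw and by <= y <= by + bh

def update_camera_state(camera, operable_bounds, opening_center):
    """Keep image acquisition working only if some operable content covers the opening."""
    if any(overlaps_opening(b, opening_center) for b in operable_bounds):
        camera.start()   # assumed method of the image acquisition component
    else:
        camera.stop()    # assumed method of the image acquisition component
```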
According to the embodiment of the present disclosure, by using the image information of the operation body obtained by the image acquisition assembly as the touch input information and processing that image information into a touch operation, an electronic device with the image acquisition assembly 303 arranged in the opening can respond to a touch operation on the opening area, so that the response result better matches the user's intent and the user experience is improved. For example, when a user plays a game on the electronic device 110 and knows from experience that the opening area hides operable content, an existing electronic device cannot respond to the user's touch on that operable content, which results in a poor experience. With the electronic device of the embodiment of the present disclosure, the image information of the user's finger during the touch operation is acquired through the lens. When it is determined from that image information that the touch operation includes a touch operation on the opening area of the display screen, processing the image information into the touch operation enables a response to the user's touch on the operable content of the opening area.
Fig. 4A schematically shows a structure diagram of an electronic device according to a second exemplary embodiment of the present disclosure, and fig. 4B schematically shows a flowchart of a touch response method according to a third exemplary embodiment of the present disclosure.
According to an embodiment of the present disclosure, as shown in fig. 4A, in order to sense the touch operation of the operation body 120, the display screen 111 of this embodiment differs from that of the electronic device in fig. 3A in that the touch layer 402 is provided with a plurality of touch sensing units 404 for sensing touch operations of the operation body. According to an embodiment of the present disclosure, the display screen 111 may be, for example, a capacitive screen.
According to the embodiment of the present disclosure, to give the electronic device a front-facing camera function, as shown in fig. 4A, the electronic device also has an image acquisition assembly 303 whose lens is at least partially disposed in the opening 112. Since the lens is disposed in the opening 112, it protrudes from the display panel 301 toward the touch layer 402. In addition, to avoid adding weight to the electronic device, no touch layer is provided at the position corresponding to the opening area. Therefore, the touch layer 402 of this embodiment covers only the display panel 301 and does not cover the opening area of the display panel 301.
In this case, the electronic device can detect a touch operation through the touch sensing units 404. For example, each touch sensing unit 404 detects the capacitance of the touch layer 402 at its position. Under normal conditions, the capacitance value detected by each touch sensing unit is a fixed value. When the operation body touches the touch layer during a touch operation, the capacitance value detected by the touch sensing units corresponding to the touched region increases with the contact area between the operation body and the touch layer. The sensing results of the touch sensing units can therefore be sent to the processor, which determines whether a touch operation of the operation body exists.
Accordingly, as shown in fig. 4B, the processor is configured to perform operation S410 of obtaining touch sensing parameters through the touch layer. The touch sensing parameters obtained through the touch layer may be, for example, the capacitance values detected by the plurality of touch sensing units. The larger the capacitance value sensed by a touch sensing unit, the larger the contact area between the operation body and the touch layer at that unit. When the processor determines that the capacitance values sensed by the touch sensing units include a value larger than the fixed value, it can determine that the operation body touched the touch layer at the position of the unit that detected that value, and that value can be used as the touch input signal.
According to the embodiment of the present disclosure, in order to locate the position of the touch operation, the position of each touch sensing unit that detected a capacitance value larger than the fixed value may also be determined. The touch sensing parameters therefore include not only the capacitance values but also the positions of the touch sensing units that detected them.
After the processor obtains the touch input signal through operation S410, it may perform operation S420: if the touch sensing parameters include a first parameter detected by the touch sensing units surrounding the opening, the first parameter is processed into a touch operation of the operation body directed at the opening area. The touch signal produced by a touch operation of the operation body may be a complete sine-wave signal whose peak corresponds to the center position of the touch. Operation S420 may therefore include, for example: first, determining whether the capacitance values larger than the fixed value include values detected by the touch sensing units located at the edge of the opening of the display panel 301. If so, determining whether the values sensed by those edge units include the value at the peak position of the sine-wave distribution. If they do not, the distribution of capacitance values sensed by the touch sensing units at the edge of the opening is determined to be the first parameter. In this case, the center of the operation body lies in the opening area and the touch operation is essentially an operation on the opening area, so the first parameter is processed into a touch operation of the operation body directed at the opening area.
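A hedged sketch of the check in operation S420. The patent's test asks whether the values sensed by the units at the opening edge contain the peak of the sine-wave distribution; the code below uses a simplified stand-in: if the largest elevated reading sits on an edge unit, the true peak is assumed to lie, unobserved, inside the opening. Unit ids, the fixed value, and the data layout are illustrative assumptions.

```python
FIXED_VALUE_PF = 10.0   # assumed idle ("fixed") capacitance per touch sensing unit

def is_opening_area_touch(readings, edge_unit_ids):
    """readings: dict of touch-sensing-unit id -> sensed capacitance value."""
    elevated = {uid: v for uid, v in readings.items() if v > FIXED_VALUE_PF}
    edge_elevated = {uid: v for uid, v in elevated.items() if uid in edge_unit_ids}
    if not edge_elevated:
        return False                     # nothing sensed around the opening
    peak_unit = max(elevated, key=elevated.get)
    return peak_unit in edge_unit_ids    # peak unobserved -> centre over the opening

# Units u3 and u4 sit at the opening edge; values rise toward the opening.
readings = {"u1": 10.0, "u2": 10.9, "u3": 11.8, "u4": 12.4}
print(is_opening_area_touch(readings, {"u3", "u4"}))  # True
```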
In summary, the electronic device of the embodiment of the present disclosure can detect a touch operation of the operation body on the opening area through the capacitance signals detected by the touch sensing units located at the edge of the opening. This makes it possible to respond to touch operations on the opening area and improves the user experience.
Fig. 4C schematically shows a flowchart of a touch response method according to a fourth exemplary embodiment of the present disclosure.
According to the embodiment of the present disclosure, in order to improve the accuracy with which the electronic device responds to the touch operation, the operation position of the touch operation can be determined before responding, so that the operable content corresponding to that position can be identified. As shown in fig. 4C, the touch response method executed by the electronic device may therefore further include operation S440 of determining the operation position of the touch operation according to the touch sensing parameters. Accordingly, operation S420 may be implemented as operation S420': processing the first parameter, according to the operation position, into a touch operation of the operation body directed at the opening area of the display screen.
Operation S440 may include, for example: first determining the capacitance values in the touch sensing parameters that are larger than the fixed value; then determining, among those values, the value at the peak position of the sine-wave distribution; then determining the position on the touch layer corresponding to the touch sensing unit that detected that peak value; and finally taking that position on the touch layer as the operation position of the touch operation.
According to the embodiment of the present disclosure, when the capacitance values sensed by the touch sensing units at the edge of the opening constitute the first parameter, the partial distribution can be completed from the first parameter to obtain the complete sine-wave distribution of the capacitance values. The position in the opening area corresponding to the peak of the completed distribution is then taken as the operation position. In this case, operation S420' may include: determining, according to the operation position of the touch operation, the operable content in the opening area corresponding to that position, and then processing the first parameter into a touch operation of the operation body on the corresponding control in the opening area according to that operable content.
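The "completion" step can be sketched as extrapolating the edge samples to locate the unobserved peak inside the opening. A quadratic fit stands in for the sine-wave model, positions are one-dimensional for brevity, and all numbers are illustrative assumptions rather than values from the patent.

```python
import numpy as np

def estimate_peak_position(positions, values):
    """Fit c(x) = a*x^2 + b*x + c to the edge samples and return the vertex x."""
    a, b, _c = np.polyfit(positions, values, 2)
    if a >= 0:
        return None   # no interior maximum recoverable from these samples
    return -b / (2.0 * a)

# Edge units at x = 0, 1, 2 mm sense values rising toward the opening (x > 2).
print(estimate_peak_position([0.0, 1.0, 2.0], [10.5, 11.4, 11.9]))  # 2.75, inside the opening
```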
According to the embodiment of the present disclosure, some complex operations may require a plurality of operation bodies; for example, a zoom-in operation requires the user to touch with two fingers. In that case the touch operation includes not only a touch operation on the opening area but also a touch operation outside the opening area, and at least two non-adjacent touch sensing units among the touch sensing units 404 on the touch layer 402 detect capacitance values larger than the fixed value. Determining the touch operation represented by the touch input information may therefore also include determining the touch operation directed at the area outside the opening area; that is, operation S220 shown in fig. 2 may further include determining the touch operation outside the opening area.
Fig. 5A schematically shows a partial flowchart of a touch response method according to a fifth exemplary embodiment of the present disclosure.
As shown in fig. 5A, determining the touch operation directed at the area outside the opening area may include the following operation S550.
In operation S550, if the touch sensing parameters include a second parameter sensed by the touch sensing units, the second parameter is processed into a touch operation of the operation body directed at the area outside the opening area.
When the touch signal produced by a touch operation of the operation body is a complete sine-wave signal, the second parameter is that complete signal, while the first parameter is the part of the signal that excludes the peak. The first parameter is therefore a partial parameter of the second parameter.
Accordingly, as shown in fig. 5A, in order to respond to the touch operation outside the opening area so that the response result better matches the user's intent, the touch response method executed by the electronic device may further include operation S560 of responding to the touch operation outside the opening area. According to an embodiment of the present disclosure, operation S560 may include: responding to the touch operation outside the opening area based on that touch operation and the display content shown in the display interface at the position of that touch operation.
In addition, according to the embodiment of the present disclosure, in order to determine the touch operation accurately and thereby improve the response accuracy, the electronic device may also determine the number of touch operations and their operation positions before determining the touch operation represented by the touch input information, for example before operation S220 shown in fig. 2.
Fig. 5B schematically shows a flowchart of a touch response method according to a sixth exemplary embodiment of the present disclosure.
As shown in fig. 5B, before operation S220 shown in fig. 2, the touch response method according to an embodiment of the present disclosure may further include the following operation S540.
In operation S540, the number of touch operations and the operation position of each touch operation are determined according to the touch sensing parameters. Operation S540 may include, for example: first determining, from the capacitance values detected by the touch sensing units, the units whose detected values are larger than the fixed value; then determining the number of sine-wave distributions formed by those values, the number of touch operations being the number of sine-wave distributions; then determining the capacitance value corresponding to the peak position of each distribution and the position on the touch layer corresponding to the touch sensing unit that detected it; and finally taking those positions on the touch layer as the operation positions of the touch operations. According to the embodiment of the present disclosure, for a touch operation on the opening area, the complete sine-wave distribution is obtained by completing the values detected by the touch sensing units at the edge of the opening area, and the position in the opening area corresponding to the peak of the completed distribution is determined, thereby obtaining the operation position of the touch operation on the opening area.
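The counting step of operation S540 can be sketched as counting separate peaks (sine-wave distributions) in the sensed values. The one-dimensional scan and the threshold are simplifying assumptions for illustration.

```python
TOUCH_THRESHOLD_PF = 11.0   # assumed level above the idle value that indicates contact

def count_touches(values):
    """Count local maxima above the threshold in a 1-D scan of unit readings."""
    peaks = 0
    for i in range(1, len(values) - 1):
        if (values[i] > TOUCH_THRESHOLD_PF
                and values[i] >= values[i - 1]
                and values[i] > values[i + 1]):
            peaks += 1
    return peaks

# Two fingers produce two separate humps in the readings.
print(count_touches([10.0, 10.2, 12.1, 10.3, 10.1, 11.8, 10.2, 10.0]))  # 2
```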
After the number of touch operations and the operation positions are determined, if the operation positions include a position in the opening area, the touch operations may be determined through operation S550' in fig. 5B. In operation S550', the touch input information is processed, according to the number of operations and the operation positions, into a touch operation of the operation body directed at the opening area of the display screen and a touch operation of the operation body directed at the area outside the opening area of the display screen. Operation S550' may include, for example: first determining the operation type from the number of operations, for example whether it is a two-finger or a three-finger operation; then determining the operable content corresponding to the touch operation according to the operation type and the operation positions; and finally converting the capacitance values detected by the touch sensing units into touch operations on that operable content according to the operable content and the operation type. The operable content includes the non-displayed content located in the opening area and the displayed content located outside the opening area.
Accordingly, the response to the touch operations is completed by performing operation S560'. In operation S560', the touch operation directed at the opening area of the display screen and the touch operation directed at the area outside the opening area of the display screen are both responded to.
According to the embodiment of the present disclosure, when the user's touch operation includes a slide from the opening area to the outside of the opening area, the touch input information may be obtained jointly by the image acquisition assembly and the touch sensing units in the touch layer. For example, at a first moment the image acquisition assembly captures image information that includes the operation body, and at a second moment immediately following the first, the touch sensing units at the edge of the opening detect capacitance values forming a complete sine-wave distribution. The image information acquired by the image acquisition assembly and the complete sine-wave capacitance distribution detected by the touch sensing units are then used together as the touch input information, from which the touch operation can be determined to be a slide from the opening area to the outside of the opening area.
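A hedged sketch of that fused case: an image-based detection at the opening at one moment followed by a complete capacitance peak outside the opening at the next moment is interpreted as a slide from the opening area outward. The event representation and the time gap are hypothetical simplifications, not data structures from the patent.

```python
def classify_slide(events, max_gap_s=0.3):
    """events: time-ordered (timestamp, source, location) tuples, where source is
    'camera' or 'touch_layer' and location is 'opening' or 'outside'."""
    for (t1, src1, loc1), (t2, src2, loc2) in zip(events, events[1:]):
        if (src1 == "camera" and loc1 == "opening"
                and src2 == "touch_layer" and loc2 == "outside"
                and 0 < t2 - t1 <= max_gap_s):
            return "slide from opening area to outside"
    return None

events = [(0.00, "camera", "opening"), (0.12, "touch_layer", "outside")]
print(classify_slide(events))  # slide from opening area to outside
```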
Fig. 6 schematically shows a block diagram of an electronic device according to a third exemplary embodiment of the present disclosure.
According to the embodiment of the present disclosure, the electronic device's response to a touch operation on the opening area can be simplified. As shown in fig. 6, the electronic device of this embodiment differs from the electronic device in fig. 4A in that its touch layer 602 may cover, for example, both the display panel 301 and the opening area of the display panel, and a touch sensing unit 604 is disposed in the region of the touch layer 602 corresponding to the opening area to sense touch operations on the opening area.
Fig. 7 schematically shows a block diagram of an electronic device according to an embodiment of the present disclosure.
As shown in fig. 7, electronic device 700 includes a processor 710, a computer-readable storage medium 720, and a display screen 730. The electronic device 700 may perform a touch response method according to an embodiment of the present disclosure.
In particular, processor 710 may comprise, for example, a general purpose microprocessor, an instruction set processor and/or associated chipset, and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), and/or the like. The processor 710 may also include on-board memory for caching purposes. Processor 710 may be a single processing unit or a plurality of processing units for performing the different actions of the method flows according to embodiments of the present disclosure.
Computer-readable storage medium 720, for example, may be a non-volatile computer-readable storage medium, specific examples including, but not limited to: magnetic storage devices, such as magnetic tape or Hard Disk Drives (HDDs); optical storage devices, such as compact disks (CD-ROMs); a memory, such as a Random Access Memory (RAM) or a flash memory; and so on.
The computer-readable storage medium 720 may include a computer program 721, which computer program 721 may include code/computer-executable instructions that, when executed by the processor 710, cause the processor 710 to perform a method according to an embodiment of the disclosure, or any variation thereof.
The computer program 721 may be configured with, for example, computer program code comprising computer program modules. For example, the code in the computer program 721 may include one or more program modules, for example module 721A, module 721B, and so on. It should be noted that the division and number of modules are not fixed; those skilled in the art may use suitable program modules or combinations of program modules according to the actual situation, so that when these program modules are executed by the processor 710, the processor 710 can perform the method according to the embodiments of the present disclosure or any variation thereof.
Specifically, the display screen 730 is a display screen with an opening, so as to maximize the display area. The display screen 730 is connected to the processor 710, so that the processor 710 responds to a touch operation of the operation body by performing processing that matches the touch operation.
The present disclosure also provides a computer-readable storage medium, which may be contained in the apparatus/device/system described in the above embodiments; or may exist separately and not be assembled into the device/apparatus/system. The computer-readable storage medium carries one or more programs which, when executed, implement the method according to an embodiment of the disclosure.
According to embodiments of the present disclosure, the computer-readable storage medium may be a non-volatile computer-readable storage medium, which may include, for example but is not limited to: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Those skilled in the art will appreciate that various combinations and/or sub-combinations of the features recited in the various embodiments and/or claims of the present disclosure can be made, even if such combinations or sub-combinations are not expressly recited in the present disclosure. In particular, various combinations and/or sub-combinations of the features recited in the various embodiments and/or claims of the present disclosure may be made without departing from the spirit or teaching of the present disclosure. All such combinations and/or sub-combinations fall within the scope of the present disclosure.
While the disclosure has been shown and described with reference to certain exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the disclosure as defined by the appended claims and their equivalents. Accordingly, the scope of the present disclosure should not be limited to the above-described embodiments, but should be defined not only by the appended claims, but also by equivalents thereof.

Claims (7)

1. A touch response method, executed by an electronic device comprising a display screen, wherein the display screen comprises a display panel and a touch layer positioned above the display panel and corresponding to the display panel; the display panel is provided with an opening, the touch layer covers the display panel but does not cover an opening area, and the opening area corresponds to the opening; the method comprises:
acquiring touch input information;
if the touch input information comprises touch input information for the touch layer surrounding the opening, determining a touch position of the touch input, the touch position comprising a position within the opening;
determining, based on the touch position, a touch operation represented by the touch input information, wherein the touch operation comprises a touch operation for the opening area; and
responding to the touch operation for the opening area,
wherein acquiring the touch input information comprises: obtaining touch sensing parameters through the touch layer; and determining the touch operation for the opening area of the display screen comprises: if the touch sensing parameters comprise first parameters sensed by touch sensing units surrounding the opening, processing the first parameters into a touch operation of an operation body for the opening area.
2. The method of claim 1, wherein the touch operation further comprises a touch operation for outside the opening area;
determining the touch operation for outside the opening area comprises: if the touch sensing parameters comprise second parameters sensed by the touch sensing units, processing the second parameters into a touch operation of the operation body for outside the opening area;
the method further comprises: responding to the touch operation for outside the opening area,
wherein the first parameters are a part of the second parameters.
3. The method of claim 2, wherein, prior to determining the touch operation represented by the touch input information, the method further comprises:
determining the number of touch operations according to the touch sensing parameters; and
determining the touch operation represented by the touch input information comprises: processing the touch input information into a touch operation of the operation body for the opening area of the display screen and a touch operation of the operation body for outside the opening area of the display screen according to the number of operations and the touch position.
4. The method of claim 1, wherein:
determining a touch position of the touch operation according to the touch sensing parameters;
determining the touch operation represented by the touch input information comprises: processing the first parameters into a touch operation of an operation body for the opening area of the display screen according to the touch position.
5. The method of claim 1, wherein responding to the touch operation for the opening area of the display screen comprises: responding to the touch operation based on the touch operation and the display content corresponding to the opening area in the display interface.
6. An electronic device, comprising:
a display screen, wherein the display screen comprises a display panel and a touch layer positioned above the display panel and corresponding to the display panel; the display panel is provided with an opening, the touch layer covers the display panel but does not cover an opening area, and the opening area corresponds to the opening; and
a processor configured to perform the following operations:
acquiring touch input information;
if the touch input information comprises touch input information for the touch layer surrounding the opening, determining a touch position of the touch input, the touch position comprising a position within the opening;
determining, based on the touch position, a touch operation represented by the touch input information, wherein the touch operation comprises a touch operation for the opening area; and
responding to the touch operation for the opening area,
wherein acquiring the touch input information comprises: obtaining touch sensing parameters through the touch layer; and determining the touch operation for the opening area of the display screen comprises: if the touch sensing parameters comprise first parameters sensed by touch sensing units surrounding the opening, processing the first parameters into a touch operation of an operation body for the opening area.
7. The electronic device of claim 6, further comprising:
an image acquisition assembly comprising a lens, wherein the lens is at least arranged in the opening of the display screen, and the image acquisition assembly is configured to obtain image information of the operation body.
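To make the claimed flow easier to follow, the sketch below illustrates, under stated assumptions, the kind of logic recited in claims 1 and 2: because the touch layer does not cover the opening, a press on the opening area registers only on the touch sensing units surrounding it (the first parameters, which are a part of the second parameters); estimating a position from those readings and testing it against the opening decides whether the operation targets the opening area or the area outside it. The cell pitch, threshold, centroid estimate, and all names are illustrative assumptions, not the patented algorithm.

```python
# Hypothetical sketch: classify a touch as "for the opening area" or
# "for outside the opening area" from readings of the touch sensing units.
from typing import Dict, Optional, Tuple

CELL = 10.0        # assumed pitch of the touch sensing units, in pixels
THRESHOLD = 0.3    # assumed minimum reading that counts as a touch


def touch_position(readings: Dict[Tuple[int, int], float]) -> Optional[Tuple[float, float]]:
    """Weighted centroid of all readings above THRESHOLD (the 'second parameters')."""
    active = {cell: v for cell, v in readings.items() if v >= THRESHOLD}
    if not active:
        return None
    total = sum(active.values())
    x = sum((col + 0.5) * CELL * v for (_, col), v in active.items()) / total
    y = sum((row + 0.5) * CELL * v for (row, _), v in active.items()) / total
    return x, y


def classify(readings: Dict[Tuple[int, int], float],
             opening_center: Tuple[float, float],
             opening_radius: float) -> str:
    pos = touch_position(readings)
    if pos is None:
        return "no touch"
    dx, dy = pos[0] - opening_center[0], pos[1] - opening_center[1]
    if dx * dx + dy * dy <= opening_radius ** 2:
        # The centroid falls inside the opening, so the readings (dominated by
        # the units ringing the opening, i.e. the 'first parameters') are
        # processed into a touch operation for the opening area.
        return "touch operation for the opening area"
    return "touch operation for outside the opening area"


if __name__ == "__main__":
    # A press centred on the opening: only the ring of units around the hole responds.
    ring = {(5, 9): 0.6, (5, 11): 0.6, (4, 10): 0.6, (6, 10): 0.6}
    print(classify(ring, opening_center=(105.0, 55.0), opening_radius=15.0))
    # An ordinary press elsewhere on the touch layer.
    print(classify({(20, 3): 0.8}, opening_center=(105.0, 55.0), opening_radius=15.0))
```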
CN201910583474.6A 2019-06-28 2019-06-28 Touch response method and electronic equipment Active CN110308821B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910583474.6A CN110308821B (en) 2019-06-28 2019-06-28 Touch response method and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910583474.6A CN110308821B (en) 2019-06-28 2019-06-28 Touch response method and electronic equipment

Publications (2)

Publication Number Publication Date
CN110308821A (en) 2019-10-08
CN110308821B (en) 2021-05-18

Family

ID=68078120

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910583474.6A Active CN110308821B (en) 2019-06-28 2019-06-28 Touch response method and electronic equipment

Country Status (1)

Country Link
CN (1) CN110308821B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111078056B (en) * 2019-12-30 2021-07-16 Lenovo (Beijing) Ltd Electronic device and signal processing method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107885389A (en) * 2017-11-29 2018-04-06 Wuhan Tianma Microelectronics Co., Ltd. A kind of display panel and display device
CN107977114A (en) * 2017-11-30 2018-05-01 Wuhan Tianma Microelectronics Co., Ltd. Display panel and its display device
CN109828690A (en) * 2019-01-31 2019-05-31 Shanghai Tianma AM-OLED Co., Ltd. Touch-control display panel and touch control display apparatus

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101397084B1 (en) * 2012-07-02 2014-05-20 LG Electronics Inc. Mobile terminal
CN107765919A (en) * 2017-10-25 2018-03-06 Wuhan Tianma Microelectronics Co., Ltd. A kind of touch-control display panel and display device
CN108196714B (en) * 2018-01-02 2021-01-15 Lenovo (Beijing) Ltd Electronic equipment
CN208271151U (en) * 2018-05-14 2018-12-21 Kunshan Govisionox Optoelectronics Co., Ltd. Touch panel, touch control display screen and touch control display device
CN108920071B (en) * 2018-06-21 2021-06-08 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Control method and device based on special-shaped display screen, storage medium and mobile terminal
CN109358765A (en) * 2018-08-20 2019-02-19 ZTE Corporation Terminal, the method, apparatus and storage medium for executing operation at the terminal

Also Published As

Publication number Publication date
CN110308821A (en) 2019-10-08

Similar Documents

Publication Publication Date Title
US20210096651A1 (en) Vehicle systems and methods for interaction detection
CN110352446B (en) Method and apparatus for obtaining image and recording medium thereof
KR102230630B1 (en) Rapid gesture re-engagement
US9912853B2 (en) Switching between cameras of an electronic device
EP3923037A1 (en) Foreign object detection method, foreign object detection device, and electronic apparatus
US20170083741A1 (en) Method and device for generating instruction
KR102614046B1 (en) Method for obtaining bio data and an electronic device thereof
CN108958627B (en) Touch operation method and device, storage medium and electronic equipment
CN106981048B (en) Picture processing method and device
US9535604B2 (en) Display device, method for controlling display, and recording medium
CN108874284B (en) Gesture triggering method
CN106815809B (en) Picture processing method and device
US10969827B2 (en) Electronic device and method for controlling user interface therein
SE1450769A1 (en) Improved tracking of an object for controlling a non-touch user interface
WO2021102969A1 (en) Method and apparatus for triggering fingerprint detection, electronic device, and storage medium
CN103713849A (en) Method and device for image shooting and terminal device
CN110308821B (en) Touch response method and electronic equipment
EP2799970A1 (en) Touch screen panel display and touch key input system
CN106845190B (en) Display control system and method
US9898183B1 (en) Motions for object rendering and selection
CN110519433B (en) Camera application control method, device, equipment and storage medium
WO2017174114A1 (en) Improving readability of content displayed on a screen
US20200150812A1 (en) Information-processing device and information-processing program
WO2018161421A1 (en) Performance test method and performance test apparatus for touch display screen of terminal device
US20120013550A1 (en) Method for controlling the interactions of a user with a given zone of a touch screen panel

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant