CN116107449A - Control method and electronic equipment - Google Patents

Control method and electronic equipment

Info

Publication number
CN116107449A
Authority
CN
China
Prior art keywords
ultrasonic
sensing area
touch sensing
electronic device
input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211535009.3A
Other languages
Chinese (zh)
Inventor
张晗玉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202211535009.3A
Publication of CN116107449A
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0412Digitisers structurally integrated in a display
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/043Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using propagating acoustic waves
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/044Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a control method and an electronic device. The method includes: identifying the usage scenario in which the electronic device is located; and, when the usage scenario matches a target usage scenario, turning off the touch function of the touch screen and turning on the ultrasonic sensor to implement the touch function.

Description

Control method and electronic equipment
Technical Field
This application belongs to the technical field of electronic devices, and in particular relates to a control method and an electronic device.
Background
Most touch screens currently used in electronic devices are capacitive touch screens; that is, the electronic device relies on the touch function of the capacitive touch screen to respond to a user's touch operations. However, responding to touch operations only through the touch function of the capacitive touch screen results in a single input mode and a poor user experience.
Disclosure of Invention
The embodiments of this application aim to provide a control method and an electronic device that can solve the problems of a single input mode and a poor user experience.
In a first aspect, an embodiment of the present application provides a control method, where the method includes:
identifying the usage scenario in which the electronic device is located;
and, when the usage scenario matches a target usage scenario, turning off the touch function of the touch screen and turning on the ultrasonic sensor to implement the touch function.
In a second aspect, an embodiment of the present application provides an electronic device, including:
an identification module, configured to identify the usage scenario in which the electronic device is located;
and a control module, configured to, when the usage scenario matches a target usage scenario, turn off the touch function of the touch screen and turn on the ultrasonic sensor to implement the touch function.
In a third aspect, embodiments of the present application provide an electronic device comprising a processor and a memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the method as described in the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium having stored thereon a program or instructions which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, embodiments of the present application provide a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and where the processor is configured to execute a program or instructions to implement a method according to the first aspect.
In a sixth aspect, embodiments of the present application provide a computer program product stored in a storage medium, the program product being executable by at least one processor to implement the method according to the first aspect.
In the embodiments of this application, the usage scenario in which the electronic device is located can be identified, and when the usage scenario matches a target usage scenario, the touch function of the touch screen is turned off and the ultrasonic sensor is turned on to implement the touch function. Because the touch mode is tied to the usage scenario, the input modes are diversified; moreover, when the usage scenario matches the target usage scenario, turning off the touch function of the touch screen and implementing the touch function through the ultrasonic sensor avoids situations in which touch input cannot be achieved, thereby improving the user experience.
Drawings
Fig. 1 is a flowchart of a control method of an electronic device according to some embodiments of the present application;
fig. 2 is a first schematic diagram of a display interface of an electronic device according to some embodiments of the present application;
fig. 3 is a second schematic diagram of a display interface of an electronic device according to some embodiments of the present application;
fig. 4 is a third schematic diagram of a display interface of an electronic device according to some embodiments of the present application;
fig. 5 is a fourth schematic diagram of a display interface of an electronic device according to some embodiments of the present application;
fig. 6 is a fifth schematic diagram of a display interface of an electronic device according to some embodiments of the present application;
fig. 7 is a sixth schematic diagram of a display interface of an electronic device according to some embodiments of the present application;
fig. 8 is a seventh schematic diagram of a display interface of an electronic device according to some embodiments of the present application;
fig. 9 is an eighth schematic diagram of a display interface of an electronic device according to some embodiments of the present application;
fig. 10 is a ninth schematic diagram of a display interface of an electronic device according to some embodiments of the present application;
fig. 11 is a tenth schematic diagram of a display interface of an electronic device according to some embodiments of the present application;
fig. 12 is an eleventh schematic diagram of a display interface of an electronic device according to some embodiments of the present application;
fig. 13 is a first schematic structural diagram of an electronic device according to some embodiments of the present application;
fig. 14 is a second schematic structural diagram of an electronic device according to some embodiments of the present application;
fig. 15 is a schematic diagram of the hardware structure of an electronic device implementing an embodiment of the present application.
Detailed Description
Technical solutions in the embodiments of the present application will be clearly described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application are within the scope of the protection of the present application.
The terms "first", "second", and the like in the description and in the claims are used to distinguish between similar objects and are not necessarily used to describe a particular sequential or chronological order. It is to be understood that the terms so used are interchangeable where appropriate, so that the embodiments of the present application can be implemented in sequences other than those illustrated or described herein. In addition, the objects distinguished by "first", "second", etc. are generally of one type, and the number of objects is not limited; for example, the first object may be one or more than one. Furthermore, in the description and the claims, "and/or" means at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the associated objects.
The execution subject of the control method of the electronic device provided by the embodiment of the application may be the electronic device provided by the embodiment of the application.
The control method of the electronic device provided by the embodiment of the application is described in detail below through specific embodiments and application scenarios thereof with reference to the accompanying drawings.
Before introducing the steps of the control method of the electronic device in the embodiments of the present application, an application scenario of the control method is first described. Since the touch screens widely used in current electronic devices are capacitive touch screens, the touch function based on the capacitive touch screen responds to the user's touch operations; in other words, the existing touch mode is a capacitive touch mode. When the user's fingers are wet, the user's fingers are oily, or the user is in an underwater scenario (such as diving or swimming), the touch function of the capacitive touch screen may fail, so that the user cannot use the electronic device. In such special scenarios, the electronic device may disable the touch function of the capacitive touch screen and turn on the ultrasonic sensor to implement the touch function; that is, by responding to the user's touch operations in an ultrasonic touch mode, the user can still use the electronic device normally.
As shown in fig. 1, the control method provided in the embodiments of the application may be applied to an electronic device and may include the following steps 1100 to 1200, which are described in detail below.
Step 1100, identify a usage scenario in which the electronic device is located.
In this embodiment, while the electronic device is in use, it may identify the usage scenario in which it is located in real time; alternatively, to reduce power consumption, the electronic device may identify its usage scenario periodically.
In an alternative embodiment, the usage scenario in which the electronic device is located may include a target usage scenario and other usage scenarios, where the target usage scenario may be a scenario in which the user's fingers are wet, the user's fingers are oily, or the user is underwater. The other usage scenarios are usage scenarios other than the target usage scenario.
Specifically, a water-sensitive sensor may be provided in the electronic device to detect whether liquid is present on the touch screen; if liquid is present on the touch screen, the usage scenario in which the electronic device is located is the target usage scenario.
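For illustration only, the following Kotlin sketch shows one possible way to organize the scenario identification of step 1100; the LiquidSensor interface, the UsageScenario names, and the class structure are assumptions of this sketch and are not part of the disclosed method.

```kotlin
// Illustrative sketch of step 1100 (scenario identification).
// LiquidSensor and the scenario names are assumptions, not the patent's API.

enum class UsageScenario { TARGET, OTHER }

interface LiquidSensor {
    /** Returns true if the water-sensitive sensor detects liquid on the touch screen. */
    fun liquidOnScreen(): Boolean
}

class ScenarioIdentifier(private val sensor: LiquidSensor) {
    /** Step 1100: map the sensor reading to a usage scenario (polled in real time or periodically). */
    fun identify(): UsageScenario =
        if (sensor.liquidOnScreen()) UsageScenario.TARGET else UsageScenario.OTHER
}
```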
Step 1200, when the usage scenario matches the target usage scenario, turn off the touch function of the touch screen and turn on the ultrasonic sensor to implement the touch function.
In this embodiment, the touch screen of the electronic device may be a capacitive touch screen; the electronic device may respond to a user's touch operations based on the touch function of the capacitive touch screen, and may also implement the touch function through the ultrasonic sensor. Correspondingly, the touch mode of the electronic device may be a capacitive touch mode or an ultrasonic touch mode, and different touch modes have different characteristics; for example, the capacitive touch mode is sensitive and supports multi-touch, while the ultrasonic touch mode resists water stains.
It will be appreciated that ultrasonic waves are mechanical waves of extremely short wavelength, which need to propagate by means of a medium, cannot exist in vacuum, and have a greater propagation distance in water than in air. In the related art, an electronic device has implemented an ultrasonic fingerprint scheme, that is, fingerprint recognition based on ultrasonic waves to unlock the electronic device.
Specifically, if the user's fingers are wet, the user's fingers are oily, or the user is underwater, and the user performs a touch operation on the electronic device, the touch function of the capacitive touch screen cannot respond to the touch operation. In this case, the electronic device turns off the touch function of the touch screen, i.e., disables the capacitive touch mode, and turns on the ultrasonic sensor to implement the touch function, i.e., responds to the user's touch operation through the ultrasonic touch mode, thereby avoiding the situation in which the touch purpose cannot be achieved.
Specifically, in the case where the ultrasonic sensor is in the on state, the ultrasonic sensor can detect the touch operation of the user according to the set detection frequency.
The set detection frequency can be chosen as required, for example 50 Hz, which corresponds to one detection every 20 ms (50 detections per second). For example, when the ultrasonic sensor is in the on state, it scans for touch operations at a detection frequency of 50 Hz; if a touch operation is detected, the electronic device responds to the touch operation, and the ultrasonic sensor continues to scan at 50 Hz.
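For illustration only, the following Kotlin sketch shows one possible way to combine step 1200 with the 50 Hz scan loop described above; the TouchScreen and UltrasonicSensor interfaces and the blocking timing loop are assumptions of this sketch, not the disclosed implementation.

```kotlin
// Illustrative sketch of step 1200: switch touch modes and scan at the set
// detection frequency (50 Hz, i.e. one detection every 20 ms).

interface TouchScreen { fun setCapacitiveTouchEnabled(enabled: Boolean) }

interface UltrasonicSensor {
    fun setEnabled(enabled: Boolean)
    /** Returns true if a touch operation is detected in this scan. */
    fun scanOnce(): Boolean
}

class TouchModeController(
    private val screen: TouchScreen,
    private val ultrasonic: UltrasonicSensor,
    private val detectionFrequencyHz: Int = 50   // set detection frequency (assumed value)
) {
    fun enterUltrasonicMode(onTouch: () -> Unit, scans: Int) {
        screen.setCapacitiveTouchEnabled(false)   // turn off the capacitive touch function
        ultrasonic.setEnabled(true)               // turn on the ultrasonic sensor
        val periodMs = 1000L / detectionFrequencyHz   // 20 ms at 50 Hz
        repeat(scans) {
            if (ultrasonic.scanOnce()) onTouch()  // respond, then keep scanning
            Thread.sleep(periodMs)
        }
    }
}
```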
In the embodiments of this application, the usage scenario in which the electronic device is located can be identified, and when the usage scenario matches the target usage scenario, the touch function of the touch screen is turned off and the ultrasonic sensor is turned on to implement the touch function. Because the touch mode is tied to the usage scenario, the input modes are diversified; moreover, when the usage scenario matches the target usage scenario, turning off the touch function of the touch screen and implementing the touch function through the ultrasonic sensor avoids situations in which touch input cannot be achieved, thereby improving the user experience.
In an embodiment, in a case that the ultrasonic sensor is in an on state, the control method of the electronic device of the embodiment of the application may further include: highlighting the ultrasonic touch sensing area.
The ultrasonic touch sensing area is the area in which the user performs touch operations.
In this embodiment, the ultrasonic touch sensing area can be highlighted when the ultrasonic sensor is in the on state, so that the user can perform touch operations based on the ultrasonic touch sensing area. The manner of highlighting the ultrasonic touch sensing area includes, but is not limited to, either of the following: displaying an icon of the ultrasonic touch sensing area in the ultrasonic touch sensing area, or displaying a border of the ultrasonic touch sensing area in the ultrasonic touch sensing area.
In this embodiment, the ultrasonic touch sensing area may be of two types according to its area: one whose area is smaller than a target value and one whose area is larger than the target value. The user may select which of the two ultrasonic touch sensing areas to highlight according to their own needs.
Referring to fig. 2, in the case that the ultrasonic sensor is in the on state, an icon 201 of the ultrasonic touch sensing area, whose area is smaller than the target value, may be displayed in the ultrasonic touch sensing area.
Referring to fig. 3, in the case that the ultrasonic sensor is in the on state, an icon 301 of the ultrasonic touch sensing area, the area of which is larger than the target value, may be displayed in the ultrasonic touch sensing area.
According to the embodiment, the electronic device can highlight the ultrasonic touch sensing area, so that a user can conveniently implement touch operation based on the ultrasonic touch sensing area.
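For illustration only, the following Kotlin sketch shows the two highlighting manners mentioned above (icon or border); the Display interface and the Region type are assumptions of this sketch.

```kotlin
// Illustrative sketch: highlighting the ultrasonic touch sensing area.
// Display and Region are assumed types used only for illustration.

data class Region(val x: Int, val y: Int, val width: Int, val height: Int)

interface Display {
    fun drawIcon(region: Region)    // e.g. icon 201 (small) or icon 301 (large)
    fun drawBorder(region: Region)  // alternative: highlight by drawing the area's border
}

/** Highlight the ultrasonic touch sensing area while the ultrasonic sensor is on. */
fun highlightSensingArea(display: Display, region: Region, byBorder: Boolean) {
    if (byBorder) display.drawBorder(region) else display.drawIcon(region)
}
```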
In one embodiment, in the case that the ultrasonic sensor is in the on state, the control method of the electronic device according to the embodiment of the application further includes the following steps 2100 to 2300:
Step 2100, receive an input to the ultrasonic touch sensing area.
Optionally, the input to the ultrasonic touch sensing area may be a click input, which may be a single click, a double click, or any number of clicks, or may be a long press or a short press.
Referring to fig. 2, in a case where the ultrasonic sensor is in an on state, a user may click on a center area of an icon 201 of the ultrasonic touch sensing area.
Referring to fig. 3, in a case where the ultrasonic sensor is in an on state, a user may click on a center area of an icon 301 of the ultrasonic touch sensing area.
Step 2200, perform a first operation when the area of the ultrasonic touch sensing area is smaller than a target value.
In this embodiment, if the area of the ultrasonic touch sensing area is smaller than the target value, the electronic device responds to the input to the ultrasonic touch sensing area by performing the first operation.
Referring to fig. 2, if the area of the icon 201 of the ultrasonic touch sensing area is smaller than the target value and the electronic device is playing a target video, the electronic device may return to the previous interface in response to the user clicking the central area of the icon 201.
Step 2300, perform a second operation when the area of the ultrasonic touch sensing area is greater than the target value.
Wherein the first operation and the second operation are different.
In this embodiment, if the area of the ultrasonic touch sensing area is greater than the target value, the electronic device responds to the input to the ultrasonic touch sensing area by performing the second operation.
Referring to fig. 3, if the area of the icon 301 of the ultrasonic touch sensing area is greater than the target value and the electronic device is playing a target video, the electronic device may pause the playing of the target video in response to the user clicking the central area of the icon 301.
According to the embodiment of the application, for the ultrasonic touch sensing areas with different areas, the electronic equipment can execute different operations.
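For illustration only, the following Kotlin sketch shows how the area-dependent dispatch of steps 2100 to 2300 could be expressed; the threshold value and the concrete operations (returning to the previous interface, pausing the target video) follow the examples above, and the class and parameter names are assumptions of this sketch.

```kotlin
// Illustrative sketch of steps 2100-2300: dispatch on the sensing-area size.

class AreaDispatcher(
    private val targetValue: Int,
    private val firstOperation: () -> Unit,   // e.g. return to the previous interface
    private val secondOperation: () -> Unit   // e.g. pause playback of the target video
) {
    /** Step 2100: receive an input to the sensing area; steps 2200/2300: act on its area. */
    fun onInput(sensingAreaSize: Int) {
        if (sensingAreaSize < targetValue) firstOperation() else secondOperation()
    }
}

fun main() {
    val dispatcher = AreaDispatcher(
        targetValue = 10_000,                                         // assumed threshold
        firstOperation = { println("return to previous interface") },
        secondOperation = { println("pause target video") }
    )
    dispatcher.onInput(sensingAreaSize = 4_000)   // small area -> first operation
    dispatcher.onInput(sensingAreaSize = 40_000)  // large area -> second operation
}
```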
In one embodiment, in the case that the ultrasonic sensor is in the on state, the operations performed by the electronic device are correspondingly different for different application scenarios.
In an application scenario in which the user manually sets a navigation mode for the ultrasonic touch sensing area, the control method of the electronic device according to the embodiments of the application may further include: displaying a first interface indicated by a navigation control in the ultrasonic touch sensing area when an input by the user to the navigation control is detected by ultrasonic waves.
Referring to the icon 301 in the ultrasonic touch sensing area shown in fig. 5, since the area of the icon 301 in the ultrasonic touch sensing area is large, different navigation controls can be directly displayed on the icon 301 in the ultrasonic touch sensing area, thereby realizing different navigation modes. For example, a first navigation control 3011, a second navigation control 3012, and a third navigation control 3013 are displayed on the icon 301 of the ultrasonic touch sensitive area. The first navigation control 3011 may be a return control for returning to the previous interface. The second navigational control 3012 may be a home screen control for displaying a home screen interface. The third navigation control 3013 may be a background control for displaying a running interface of a background application.
Here, when the ultrasonic sensor is in the on state, it scans for touch operations at a frequency of 50 Hz. If a click on the first navigation control 3011 is detected, the electronic device returns to and displays the previous interface; if a click on the second navigation control 3012 is detected, the home screen interface can be displayed; if a click on the third navigation control 3013 is detected, the running interface of a background application can be displayed.
Referring to the icon 201 of the ultrasonic touch sensing area shown in fig. 4, since the icon 201 is small, different navigation controls can be distinguished by different touch durations, so as to realize different navigation modes. For example, a long press on the icon 201 of the ultrasonic touch sensing area may be set as the home screen control, and a short press may be set as the return control. The durations of the long press and the short press are not particularly limited; typically, a short press is 600 ms or less and a long press is more than 600 ms.
Here, when the ultrasonic sensor is in the on state, it scans for touch operations at a frequency of 50 Hz. If a long press on the icon 201 of the ultrasonic touch sensing area is detected, the home screen interface is displayed; if a short press on the icon 201 is detected, the previous interface is displayed.
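For illustration only, the following Kotlin sketch expresses the duration-based navigation rule described above (a press of 600 ms or less acts as the return control, a longer press as the home screen control); the enum names and the helper function are assumptions of this sketch.

```kotlin
// Illustrative sketch of duration-based navigation on the small icon 201.
// NavigationTarget names are assumed; the 600 ms threshold follows the text above.

enum class NavigationTarget { PREVIOUS_INTERFACE, HOME_SCREEN }

fun navigationForPress(durationMs: Long, longPressThresholdMs: Long = 600): NavigationTarget =
    if (durationMs <= longPressThresholdMs) NavigationTarget.PREVIOUS_INTERFACE  // short press: return control
    else NavigationTarget.HOME_SCREEN                                            // long press: home screen control
```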
In an application scenario in which the user manually sets the starting manner of a first application, the control method of the electronic device according to the embodiments of the present application may further include: determining a first application indicated by an azimuth control in the ultrasonic touch sensing area when an input by the user to the azimuth control is detected by ultrasonic waves.
Referring to the icon 301 of the ultrasonic touch sensing area shown in fig. 5, four azimuth controls may be displayed on the icon 301, namely a first azimuth control 3021, a second azimuth control 3022, a third azimuth control 3023, and a fourth azimuth control 3024. When the ultrasonic sensor is in the on state, it scans for touch operations at a frequency of 50 Hz; when a click on at least one of the four azimuth controls is detected, it can be determined that the at least one azimuth control indicates browser application A. After browser application A is selected, it can be started through a preset scheme, which serves as a confirmation control. Since the icon 301 of the ultrasonic touch sensing area shown in fig. 5 has a large operable area, the preset scheme can be: clicking the central area of the four azimuth controls starts browser application A.
Referring to the icon 201 of the ultrasonic touch sensing area shown in fig. 4, four azimuth controls may be displayed on the icon 201, namely a first azimuth control 2021, a second azimuth control 2022, a third azimuth control 2023, and a fourth azimuth control 2024. When the ultrasonic sensor is in the on state, it scans for touch operations at a frequency of 50 Hz; when a click on at least one of the four azimuth controls is detected, it can be determined that the at least one azimuth control indicates browser application A. After browser application A is selected, since the icon 201 of the ultrasonic touch sensing area shown in fig. 4 has a smaller operable area, the preset scheme can be: double-clicking the central area of the icon 201 of the ultrasonic touch sensing area starts browser application A.
It can be appreciated that after setting the starting manner of the first application, the user can normally start or exit the first application, but cannot yet operate the controls within the first application. Referring to fig. 8 and 9, taking browser application A as the first application as an example, the controls within browser application A may include, but are not limited to: an input box 801 at the upper edge, a control bar 802 at the lower edge, and shortcut icons in other areas. Likewise, the user may operate a target control within browser application A by combining the confirmation control with the azimuth controls.
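For illustration only, the following Kotlin sketch shows one way the azimuth-control selection and the preset confirmation gesture could cooperate to start the first application; the Direction and Gesture names and the application registry are assumptions of this sketch.

```kotlin
// Illustrative sketch: selecting a first application via azimuth controls and
// starting it with the preset confirmation gesture.

enum class Direction { UP, DOWN, LEFT, RIGHT }
enum class Gesture { CLICK_CENTER, DOUBLE_CLICK_CENTER }

class AppLauncher(
    private val appsByDirection: Map<Direction, String>,
    private val confirmGesture: Gesture   // click for the large icon, double click for the small one
) {
    private var selected: String? = null

    /** An ultrasonic click on an azimuth control selects the application it indicates. */
    fun onAzimuthClick(direction: Direction) {
        selected = appsByDirection[direction]
    }

    /** The preset confirmation gesture starts the selected application (returned here by name). */
    fun onGesture(gesture: Gesture): String? =
        if (gesture == confirmGesture) selected else null
}

fun main() {
    val launcher = AppLauncher(
        appsByDirection = mapOf(Direction.UP to "Browser application A"),
        confirmGesture = Gesture.CLICK_CENTER
    )
    launcher.onAzimuthClick(Direction.UP)
    println(launcher.onGesture(Gesture.CLICK_CENTER))  // -> Browser application A
}
```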
In an application scenario in which the user manually sets a sliding operation, the control method of the electronic device according to the embodiments of the present application may further include: when an input by the user to the ultrasonic touch sensing area is detected by ultrasonic waves, updating first content of the running first application according to the direction corresponding to the input.
In this scenario, the ultrasonic touch sensing area supports a sliding operation; for example, the browser application may be scrolled up and down to update its displayed content. For example, an operation in which the finger's sliding track covers the entire ultrasonic touch sensing area in the Y direction may be treated as a sliding operation, with the sliding time limited to less than 1 second.
Referring to fig. 6, taking the icon 201 of the ultrasonic touch sensing area as an example, when the ultrasonic sensor is in the on state, it scans for touch operations at a frequency of 50 Hz; if an upward sliding operation performed by the user on the icon 201 is detected, the electronic device can scroll the content of the current browsing interface of browser application A upwards.
Referring to fig. 7, taking the icon 301 of the ultrasonic touch sensing area as an example, when the ultrasonic sensor is in the on state, it scans for touch operations at a frequency of 50 Hz; if a downward sliding operation performed by the user on the icon 301 is detected, the electronic device can scroll the content of the current browsing interface of browser application A downwards.
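For illustration only, the following Kotlin sketch expresses the sliding rule described above (full coverage of the sensing area in the Y direction within less than 1 second, with the movement direction deciding the scroll direction); the TouchSample type and the coordinate convention (Y increasing downwards) are assumptions of this sketch.

```kotlin
// Illustrative sketch: classifying an ultrasonic touch trace as a slide.

data class TouchSample(val y: Float, val timeMs: Long)

enum class Scroll { UP, DOWN, NONE }

fun classifySlide(trace: List<TouchSample>, areaTop: Float, areaBottom: Float): Scroll {
    if (trace.size < 2) return Scroll.NONE
    val durationMs = trace.last().timeMs - trace.first().timeMs
    val minY = trace.minOf { it.y }
    val maxY = trace.maxOf { it.y }
    val coversArea = minY <= areaTop && maxY >= areaBottom   // full Y coverage of the sensing area
    if (!coversArea || durationMs >= 1000) return Scroll.NONE // sliding time limited to < 1 s
    return if (trace.last().y < trace.first().y) Scroll.UP else Scroll.DOWN
}
```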
In an embodiment, the ultrasonic touch sensing area further supports input method operation, where the control method of the electronic device in the embodiment of the application further includes the following steps 3100 to 3400:
step 3100, displaying a virtual keyboard in a first area of the ultrasonic touch sensing area.
The first area may be an area above the ultrasonic touch sensing area.
Referring to fig. 10, a virtual keyboard 202 is displayed in an area above an icon 201 of an ultrasonic touch sensing area.
Referring to fig. 11, a virtual keyboard 302 is displayed in an area above an icon 301 of an ultrasonic touch sensing area.
Step 3200, determining a target identifier in the virtual keyboard indicated by the azimuth control when the input of the user to the azimuth control of the ultrasonic touch sensing area is detected through ultrasonic waves.
The target identifier may be any identifier in the virtual keyboard, for example, letters on the virtual keyboard, numbers on the virtual keyboard, an input mode identifier on the virtual keyboard, and the like. Wherein the input mode identifier is used to provide different input modes, such as handwriting, pinyin, strokes, etc.
Referring to fig. 10 and 11, when the ultrasonic sensor is in the on state, it scans for touch operations at a frequency of 50 Hz; if a touch operation by the user on an azimuth control is detected, it can be determined that the azimuth control indicates the input mode identifier in the virtual keyboard.
Step 3300, display a first handwriting area in the ultrasonic touch sensing area when the target identifier is a target input mode identifier.
Wherein the first handwriting area is used for a user to input text content.
For example, in the case where the target identifier is a handwriting identifier, referring to fig. 12, the first handwriting area 303 may be displayed on an icon of the ultrasonic touch-sensitive area.
It can be appreciated that the area of the first handwriting area is the same as the area of the ultrasonic touch sensing area, i.e. the area of the first handwriting area is the same as the area of the icon of the ultrasonic touch sensing area.
Step 3400, acquire second content corresponding to the input when the user's input to the first handwriting area is detected by ultrasonic waves.
Referring to fig. 12, the ultrasonic sensor continues to scan for touch operations at a frequency of 50 Hz; if it is detected that the user inputs text content, such as "wang", in the first handwriting area 303, the electronic device may acquire the text content "wang".
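For illustration only, the following Kotlin sketch outlines steps 3100 to 3400 (virtual keyboard, target identifier, handwriting area, and acquisition of the second content); the TargetIdentifier type, the HandwritingRecognizer interface, and the stroke representation are assumptions of this sketch.

```kotlin
// Illustrative sketch of steps 3100-3400: the identifiers, the handwriting
// recogniser and the keyboard model are assumptions, not the patent's API.

sealed class TargetIdentifier {
    data class Key(val label: String) : TargetIdentifier()        // letters, digits, ...
    data class InputMode(val name: String) : TargetIdentifier()   // handwriting, pinyin, strokes, ...
}

interface HandwritingRecognizer {
    /** Returns the text content recognised from strokes drawn in the handwriting area. */
    fun recognise(strokes: List<List<Pair<Float, Float>>>): String
}

class UltrasonicKeyboard(private val recognizer: HandwritingRecognizer) {
    var handwritingAreaShown = false
        private set

    /** Steps 3200/3300: an azimuth input selects a target identifier; the handwriting
     *  input mode identifier opens the first handwriting area over the sensing area. */
    fun onIdentifierSelected(identifier: TargetIdentifier) {
        if (identifier is TargetIdentifier.InputMode && identifier.name == "handwriting") {
            handwritingAreaShown = true
        }
    }

    /** Step 3400: acquire the second content entered in the handwriting area, e.g. "wang". */
    fun onHandwritingInput(strokes: List<List<Pair<Float, Float>>>): String? =
        if (handwritingAreaShown) recognizer.recognise(strokes) else null
}
```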
Corresponding to the above embodiments, referring to fig. 13, the embodiment of the present application further provides an electronic device 1300, where the electronic device 1300 includes an identification module 1301 and a control module 1302.
The identifying module 1301 is configured to identify a usage scenario in which the electronic device is located;
the control module 1302 is configured to close a touch function of the touch screen and open the ultrasonic sensor to implement the touch function when the usage scenario matches with the target usage scenario.
In the embodiment of the application, the method and the device can identify the use scene of the electronic equipment, close the touch function of the touch screen and open the ultrasonic sensor to realize the touch function under the condition that the use scene is matched with the target use scene. Therefore, in the embodiment, the touch function is related to the use scene, so that the input mode is diversified; moreover, under the condition that the use scene is matched with the target use scene, the touch function of the touch screen is closed, the touch function is realized by opening the ultrasonic sensor, the problem that the touch purpose cannot be realized is avoided, and therefore the use experience of a user is improved.
In one embodiment, the electronic device 1300 further includes a first display module, a first determination module, and an update module (none of which are shown).
The first display module is used for displaying a first interface indicated by a navigation control in the ultrasonic touch sensing area when an input by the user to the navigation control is detected by ultrasonic waves; or
the first determining module is used for determining a first application indicated by an azimuth control in the ultrasonic touch sensing area when an input by the user to the azimuth control is detected by ultrasonic waves; or
the updating module is used for updating first content of the running first application according to the direction corresponding to the input when an input by the user to the ultrasonic touch sensing area is detected by ultrasonic waves.
In one embodiment, the electronic device further comprises a second display module (not shown in the figures).
The second display module is used for highlighting the ultrasonic touch sensing area.
In one embodiment, the electronic device further includes a third display module, and a second determination module (none of which are shown).
The third display module is used for displaying the virtual keyboard in the first area of the ultrasonic touch sensing area;
and the second determining module is used for determining a target identifier in the virtual keyboard indicated by the azimuth control under the condition that the input of the user to the azimuth control of the ultrasonic touch sensing area is detected through ultrasonic waves.
In one embodiment, the electronic device further comprises a receiving module, a first executing module and a second executing module (not shown in the figure).
The receiving module is used for receiving input to the ultrasonic touch sensing area;
the first execution module is used for executing a first operation under the condition that the area of the ultrasonic touch sensing area is smaller than a target value;
the second execution module is used for executing a second operation under the condition that the area of the ultrasonic touch sensing area is larger than a target value;
wherein the first operation and the second operation are different.
The electronic device in the embodiments of the present application may be a terminal, or may be a device other than a terminal. By way of example, the electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a mobile internet device (MID), an augmented reality (AR)/virtual reality (VR) device, a robot, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a personal digital assistant (PDA), etc., and may also be a server, a network attached storage (NAS), a personal computer (PC), a television (TV), a teller machine or a self-service machine, etc.; the embodiments of the present application are not specifically limited in this respect.
The electronic device in the embodiments of the application may be an electronic device having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
The electronic device provided in this embodiment of the present application may implement each process implemented by the method embodiment of fig. 1, and in order to avoid repetition, details are not repeated here.
Optionally, as shown in fig. 14, the embodiment of the present application further provides an electronic device 1600, which includes a processor 1601 and a memory 1602, where the memory 1602 stores a program or an instruction that can be executed on the processor 1601, and the program or the instruction implements each step of the control method embodiment of the electronic device described above when executed by the processor 1601, and the steps achieve the same technical effects, so that repetition is avoided, and no further description is given here.
The electronic device in the embodiment of the application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 15 is a schematic hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 1900 includes, but is not limited to: radio frequency unit 1901, network module 1902, audio output unit 1903, input unit 1904, sensor 1905, display unit 1906, user input unit 1907, interface unit 1908, memory 1909, processor 1910, and the like.
Those skilled in the art will appreciate that the electronic device 1900 may further include a power source (e.g., a battery) for powering the various components, which may be logically connected to the processor 1910 by a power management system for performing functions such as managing charge, discharge, and power consumption by the power management system. The electronic device structure shown in fig. 15 does not constitute a limitation of the electronic device, and the electronic device may include more or less components than those shown in the drawings, or may combine some components, or may be arranged in different components, which will not be described in detail herein.
The processor 1910 is configured to identify the usage scenario in which the electronic device is located; and, when the usage scenario matches the target usage scenario, turn off the touch function of the touch screen and turn on the ultrasonic sensor to implement the touch function.
According to this embodiment, the usage scenario in which the electronic device is located can be identified, and when the usage scenario matches the target usage scenario, the touch function of the touch screen is turned off and the ultrasonic sensor is turned on to implement the touch function. Because the touch mode is tied to the usage scenario, the input modes are diversified; moreover, when the usage scenario matches the target usage scenario, turning off the touch function of the touch screen and implementing the touch function through the ultrasonic sensor avoids situations in which touch input cannot be achieved, thereby improving the user experience.
In one embodiment, the processor 1910 is further configured to display a first interface indicated by a navigation control in an ultrasonic touch sensitive area if a user input to the navigation control is detected by ultrasonic waves; or under the condition that the input of a user to the azimuth control in the ultrasonic touch sensing area is detected through ultrasonic waves, determining a first application indicated by the azimuth control; or under the condition that the input of the user to the ultrasonic touch sensing area is detected through the ultrasonic wave, updating the first content of the running first application according to the direction corresponding to the input.
In one embodiment, the processor 1910 is further configured to highlight the ultrasonic touch sensitive area.
In one embodiment, the processor 1910 is further configured to display a virtual keyboard in a first area of the ultrasonic touch sensing area; and under the condition that the input of a user to the azimuth control of the ultrasonic touch sensing area is detected through ultrasonic waves, determining a target identifier in the virtual keyboard indicated by the azimuth control.
In one embodiment, the processor 1910 is further configured to receive input to an ultrasonic touch sensitive area; executing a first operation under the condition that the area of the ultrasonic touch sensing area is smaller than a target value; executing a second operation under the condition that the area of the ultrasonic touch sensing area is larger than a target value; wherein the first operation and the second operation are different.
It should be appreciated that in embodiments of the present application, the input unit 1904 may include a graphics processor (Graphics Processing Unit, GPU) 19041 and a microphone 19042, where the graphics processor 19041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The display unit 1906 may include a display panel 19061, and the display panel 19061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 1907 includes at least one of a touch panel 19071 and other input devices 19072. Touch panel 19071, also referred to as a touch screen. Touch panel 19071 may include two parts, a touch detection device and a touch controller. Other input devices 19072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and so forth, which are not described in detail herein.
The memory 1909 may be used to store software programs and various data. The memory 1909 may mainly include a first storage area storing programs or instructions and a second storage area storing data, where the first storage area may store an operating system, and application programs or instructions required for at least one function (such as a sound playing function and an image playing function), and the like. Further, the memory 1909 may include a volatile memory or a nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), a static RAM (SRAM), a dynamic RAM (DRAM), a synchronous DRAM (SDRAM), a double data rate SDRAM (DDR SDRAM), an enhanced SDRAM (ESDRAM), a synchlink DRAM (SLDRAM), or a direct Rambus RAM (DRRAM). The memory 1909 in the embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
The processor 1910 may include one or more processing units; optionally, processor 1910 integrates an application processor that primarily handles operations related to the operating system, user interface, and applications, etc., and a modem processor that primarily handles wireless communication signals, such as a baseband processor. It is to be appreciated that the modem processor described above may not be integrated into the processor 1910.
The embodiment of the application further provides a readable storage medium, on which a program or an instruction is stored, where the program or the instruction realizes each process of the control method embodiment of the electronic device when executed by the processor, and the same technical effect can be achieved, so that repetition is avoided, and no redundant description is provided herein.
Wherein the processor is a processor in the electronic device described in the above embodiment. The readable storage medium includes computer readable storage medium such as computer readable memory ROM, random access memory RAM, magnetic or optical disk, etc.
The embodiment of the application further provides a chip, the chip includes a processor and a communication interface, the communication interface is coupled with the processor, and the processor is used for running a program or an instruction, so as to implement each process of the control method embodiment of the electronic device, and achieve the same technical effect, so that repetition is avoided, and no further description is provided here.
It should be understood that the chips referred to in the embodiments of the present application may also be referred to as system-on-chip chips, chip systems, or system-on-chip chips, etc.
The embodiments of the present application provide a computer program product stored in a storage medium, where the program product is executed by at least one processor to implement the respective processes of the control method embodiments of the electronic device, and achieve the same technical effects, and are not described herein in detail for avoiding repetition.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or electronic device that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or electronic device. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or electronic device that comprises the element. Furthermore, it should be noted that the scope of the methods and electronic devices in the embodiments of the present application is not limited to performing the functions in the order shown or discussed, but may also include performing the functions in a substantially simultaneous manner or in an opposite order depending on the functions involved, e.g., the described methods may be performed in an order different from that described, and various steps may also be added, omitted, or combined. Additionally, features described with reference to certain examples may be combined in other examples.
From the above description of the embodiments, it will be clear to those skilled in the art that the above-described embodiment method may be implemented by means of software plus a necessary general hardware platform, but of course may also be implemented by means of hardware, but in many cases the former is a preferred embodiment. Based on such understanding, the technical solutions of the present application may be embodied essentially or in a part contributing to the prior art in the form of a computer software product stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk), comprising several instructions for causing a terminal (which may be a mobile phone, a computer, a server, or a network device, etc.) to perform the methods described in the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above-described embodiments, which are merely illustrative and not restrictive, and many forms may be made by those of ordinary skill in the art without departing from the spirit of the present application and the scope of the claims, which are also within the protection of the present application.

Claims (10)

1. A control method, characterized in that the method comprises:
identifying a usage scenario in which an electronic device is located;
and, when the usage scenario matches a target usage scenario, turning off a touch function of a touch screen and turning on an ultrasonic sensor to implement the touch function.
2. The method of claim 1, further comprising any one of:
displaying a first interface indicated by a navigation control when an input by a user to the navigation control in an ultrasonic touch sensing area is detected by ultrasonic waves; or
determining a first application indicated by an azimuth control when an input by a user to the azimuth control in the ultrasonic touch sensing area is detected by ultrasonic waves; or
updating first content of a running first application according to a direction corresponding to an input when the input by a user to the ultrasonic touch sensing area is detected by ultrasonic waves.
3. The method according to claim 1, wherein the method further comprises:
highlighting the ultrasonic touch sensing area.
4. The method according to claim 1, wherein the method further comprises:
displaying a virtual keyboard in a first area of the ultrasonic touch sensing area;
and under the condition that the input of a user to the azimuth control of the ultrasonic touch sensing area is detected through ultrasonic waves, determining a target identifier in the virtual keyboard indicated by the azimuth control.
5. The method according to claim 1, wherein the method further comprises:
receiving input to an ultrasonic touch sensing area;
executing a first operation under the condition that the area of the ultrasonic touch sensing area is smaller than a target value;
executing a second operation under the condition that the area of the ultrasonic touch sensing area is larger than a target value;
wherein the first operation and the second operation are different.
6. An electronic device, the electronic device comprising:
an identification module, configured to identify a usage scenario in which the electronic device is located;
and a control module, configured to, when the usage scenario matches a target usage scenario, turn off a touch function of a touch screen and turn on an ultrasonic sensor to implement the touch function.
7. The electronic device of claim 6, wherein the electronic device further comprises:
the first display module is used for displaying a first interface indicated by the navigation control under the condition that the input of the user to the navigation control in the ultrasonic touch sensing area is detected through ultrasonic waves; or alternatively, the process may be performed,
the first determining module is used for determining a first application indicated by the azimuth control under the condition that the input of the user to the azimuth control in the ultrasonic touch sensing area is detected through ultrasonic waves; or alternatively, the process may be performed,
and the updating module is used for updating the first content of the running first application according to the direction corresponding to the input under the condition that the input of the user to the ultrasonic touch sensing area is detected through ultrasonic waves.
8. The electronic device of claim 6, wherein the electronic device further comprises:
and the second display module is used for highlighting the ultrasonic touch sensing area.
9. The electronic device of claim 6, wherein the electronic device further comprises:
the third display module is used for displaying the virtual keyboard in the first area of the ultrasonic touch sensing area;
and the second determining module is used for determining a target identifier in the virtual keyboard indicated by the azimuth control under the condition that the input of the user to the azimuth control of the ultrasonic touch sensing area is detected through ultrasonic waves.
10. The electronic device of claim 6, wherein the electronic device further comprises:
the receiving module is used for receiving input to the ultrasonic touch sensing area;
the first execution module is used for executing a first operation under the condition that the area of the ultrasonic touch sensing area is smaller than a target value;
the second execution module is used for executing a second operation under the condition that the area of the ultrasonic touch sensing area is larger than a target value;
wherein the first operation and the second operation are different.
CN202211535009.3A 2022-11-30 2022-11-30 Control method and electronic equipment Pending CN116107449A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211535009.3A CN116107449A (en) 2022-11-30 2022-11-30 Control method and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211535009.3A CN116107449A (en) 2022-11-30 2022-11-30 Control method and electronic equipment

Publications (1)

Publication Number Publication Date
CN116107449A true CN116107449A (en) 2023-05-12

Family

ID=86266512

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211535009.3A Pending CN116107449A (en) 2022-11-30 2022-11-30 Control method and electronic equipment

Country Status (1)

Country Link
CN (1) CN116107449A (en)

Similar Documents

Publication Publication Date Title
CN105824559B (en) False touch recognition and processing method and electronic equipment
US9678659B2 (en) Text entry for a touch screen
US20120290291A1 (en) Input processing for character matching and predicted word matching
US20140362119A1 (en) One-handed gestures for navigating ui using touch-screen hover events
EP2395419A2 (en) Mobile terminal and displaying method therefor
CN107506130B (en) Character deleting method and mobile terminal
CN112433693B (en) Split screen display method and device and electronic equipment
CN114779977A (en) Interface display method and device, electronic equipment and storage medium
CN113655929A (en) Interface display adaptation processing method and device and electronic equipment
US20150378443A1 (en) Input for portable computing device based on predicted input
CN112699363A (en) Graphic code identification method and device and electronic equipment
US8766937B2 (en) Method of facilitating input at an electronic device
CN114327726A (en) Display control method, display control device, electronic equipment and storage medium
WO2023093661A1 (en) Interface control method and apparatus, and electronic device and storage medium
CN116107531A (en) Interface display method and device
CN116107449A (en) Control method and electronic equipment
CN113253884A (en) Touch method, touch device and electronic equipment
CN111930296A (en) Electronic equipment control method and device and electronic equipment
CN111522488B (en) Interaction method for calling task panel by mobile phone terminal
CN116774872A (en) Control method, control device, electronic equipment and storage medium
CN115700446A (en) Application management method and device, electronic equipment and storage medium
CN115480664A (en) Touch response method and device, electronic equipment and storage medium
CN117170565A (en) Screen capturing method and device
CN113641275A (en) Interface control method and electronic equipment
CN116302229A (en) Icon display method and device and foldable electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination