CN110568989A - service processing method, service processing device, terminal and medium - Google Patents

service processing method, service processing device, terminal and medium

Info

Publication number
CN110568989A
CN110568989A (application CN201910788301.8A)
Authority
CN
China
Prior art keywords
area
gesture
fingerprint
screen
service
Prior art date
Legal status
Pending
Application number
CN201910788301.8A
Other languages
Chinese (zh)
Inventor
洪帆
Current Assignee
Shenzhen Transsion Holdings Co Ltd
Original Assignee
Shenzhen Transsion Holdings Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Transsion Holdings Co Ltd filed Critical Shenzhen Transsion Holdings Co Ltd
Priority to CN201910788301.8A
Publication of CN110568989A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the invention discloses a service processing method, a service processing device, a terminal and a medium. The service processing method is applied to a terminal that comprises a first area and a second area, and comprises the following steps: acquiring a gesture starting point in the first area; acquiring a gesture end point in the second area; and determining gesture information according to the gesture starting point and the gesture end point, and performing service processing according to the gesture information. The embodiment of the invention enables services to be processed more effectively and improves the convenience and efficiency of service processing.

Description

service processing method, service processing device, terminal and medium
Technical Field
The present invention relates to the field of electronic technologies, and in particular, to a service processing method, a service processing apparatus, a terminal, and a computer storage medium.
Background
With the development of science and technology, terminals can provide more and more service processing functions for users, such as a screen brightness adjustment function and a screen display mode switching function. At present, if a user wants to perform a certain target service on a terminal, the user needs to manually find and start the corresponding target service processing function in the terminal, and the target service is then executed through that function; for example, when a user wants to adjust the screen brightness, the user needs to find the screen brightness adjustment function in the terminal, start it, and then perform the brightness adjustment. Existing service processing therefore requires the user to manually look up the corresponding service processing function, which offers little convenience; moreover, the service processing flow is cumbersome, which results in low processing efficiency.
Disclosure of Invention
In order to solve the foregoing technical problems, embodiments of the present invention provide a service processing method, an apparatus, a terminal, and a computer storage medium, which enable services to be processed more effectively and improve the convenience and processing efficiency of service processing.
In one aspect, an embodiment of the present invention provides a service processing method, where the service processing method is applied to a terminal, and the terminal includes a first area and a second area, and the method includes:
acquiring a gesture starting point in the first area;
Acquiring a gesture end point in the second area;
And determining gesture information according to the gesture starting point and the gesture end point, and performing service processing according to the gesture information.
Optionally, a specific implementation manner of acquiring the gesture starting point in the first area may be: detecting a gesture trigger event in the first area; and in response to the gesture trigger event detected in the first area, determining a gesture starting point in the first area.
Optionally, a specific implementation manner of acquiring the gesture end point in the second area may be: detecting a sliding operation in the second area; and determining a gesture end point in the second area according to the sliding operation.
Optionally, the first area is a screen area, the second area is a fingerprint area, and the bottom edge of the screen area is adjacent to the top edge of the fingerprint area; the gesture trigger event comprises: an event in which a sliding curve is detected in the screen area and the sliding curve intersects the bottom edge of the screen area; and the gesture starting point is the point where the sliding curve intersects the bottom edge of the screen area.
Optionally, a specific implementation manner of determining the gesture end point in the second area according to the sliding operation may be: detecting a target fingerprint in the fingerprint area according to the sliding operation, and recording the position point of the target fingerprint in the fingerprint area; and if the fingerprint area of the detected target fingerprint gradually increases from the position point along a first preset direction, determining the position point as the gesture end point.
Optionally, the first area is a fingerprint area, the second area is a screen area, and the bottom edge of the screen area is adjacent to the top edge of the fingerprint area; the gesture trigger event comprises: detecting a target fingerprint in the fingerprint area, where the fingerprint area of the detected target fingerprint gradually decreases along a second preset direction; and the gesture starting point is the point where the target fingerprint disappears in the fingerprint area.
Optionally, a specific implementation manner of determining the gesture end point in the second area according to the sliding operation may be: determining the intersection point of a line formed by the sliding operation and the bottom edge of the screen area; and determining that intersection point as the gesture end point.
Optionally, the gesture information includes at least one of: coordinate information and a gesture sequence; the coordinate information comprises at least one of the following coordinates: the starting point coordinates of the gesture starting point and the end point coordinates of the gesture end point; and the gesture sequence includes: an order from the screen area to the fingerprint area, or an order from the fingerprint area to the screen area.
Optionally, a specific implementation manner of performing service processing according to the gesture information may be: determining a target service to be processed according to the gesture sequence, where the target service comprises any one of the following: a mode adjustment service, a screen brightness adjustment service, a volume adjustment service, and a multimedia playing adjustment service; and determining an adjustment rule about the target service according to the coordinate information, and executing the target service according to the adjustment rule.
In another aspect, an embodiment of the present invention provides a service processing method, where the service processing method is applied to a terminal, and the terminal includes a first area and a second area, and the method includes:
Acquiring first fingerprint information in the first area;
Acquiring second fingerprint information in the second area;
And determining a target service to be processed according to the first fingerprint information and the second fingerprint information, and processing.
Optionally, the first area is a first fingerprint acquisition area, and the second area is a second fingerprint acquisition area; the fingerprint collection area includes: an under-screen fingerprint acquisition area, and/or a non-under-screen fingerprint acquisition area.
Optionally, when the fingerprint acquisition area is an under-screen fingerprint acquisition area, the first fingerprint acquisition area is a first screen area of the screen of the terminal, and the second fingerprint acquisition area is a second screen area of the screen of the terminal.
Optionally, the target service to be processed includes any one of: mode adjustment service, screen brightness adjustment service, volume adjustment service, multimedia playing adjustment service.
In another aspect, an embodiment of the present invention provides a service processing method, where the service processing method is applied to a terminal, and the terminal includes a first area and a second area, and the method includes:
acquiring a first gesture track in the first area;
Acquiring a second gesture track in the second area;
And determining a target service to be processed according to the first gesture track and the second gesture track, and processing.
Optionally, the first area is a first screen area of the terminal; the second area is a second screen area of the terminal; the first screen area and the second screen area are any two area positions of the screen of the terminal.
Optionally, the target service to be processed includes any one of: mode adjustment service, screen brightness adjustment service, volume adjustment service, multimedia playing adjustment service.
In another aspect, an embodiment of the present invention provides a service processing apparatus, where the service processing apparatus operates in a terminal, and the terminal includes a first area and a second area, and the apparatus includes:
an acquisition unit configured to acquire first information in the first area and second information in the second area;
and the processing unit is used for determining the target service to be processed according to the first information and the second information and processing the target service.
Optionally, the obtaining unit specifically includes:
The contact acquisition module is used for acquiring a gesture starting point in the first area and acquiring a gesture end point in the second area;
The fingerprint acquisition module is used for acquiring first fingerprint information in the first area and acquiring second fingerprint information in the second area;
and the track acquisition module is used for acquiring a first gesture track in the first area and acquiring a second gesture track in the second area.
Optionally, the first information includes: at least one of a gesture start point, first fingerprint information, and a first gesture trajectory;
the second information includes: at least one of a gesture end point, second fingerprint information, and a second gesture trajectory.
In another aspect, an embodiment of the present invention provides a terminal, including a processor, an input device, an output device, and a memory, where the processor, the input device, the output device, and the memory are connected to each other, where the memory is used to store a computer program, and the computer program includes program instructions, and the processor is configured to call the program instructions to execute the service processing method.
In still another aspect, an embodiment of the present invention provides a computer storage medium, where a computer program is stored, where the computer program includes program instructions, and the program instructions, when executed by a processor, cause the processor to execute the service processing method described above.
In the embodiment of the invention, a gesture starting point can be acquired in the first area, and a gesture end point can be acquired in the second area; gesture information is then determined according to the gesture starting point and the gesture end point, and service processing is performed according to the gesture information. Therefore, during service processing the user only needs to input a gesture on the terminal, and the terminal can automatically perform the service processing according to the gesture information; the user does not need to search for the corresponding service processing function, which improves the convenience and efficiency of service processing.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description show only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1a is a schematic diagram of a simple gesture according to an embodiment of the present invention;
FIG. 1b is a schematic diagram illustrating a detection process for the simple gesture according to an embodiment of the present invention;
FIG. 2a is a schematic diagram of another simple gesture according to an embodiment of the present invention;
FIG. 2b is a schematic diagram illustrating a detection process for the other simple gesture according to an embodiment of the present invention;
FIG. 3 is a schematic flowchart of a service processing method according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a sliding curve according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a service process according to an embodiment of the present invention;
FIG. 6 is a schematic flowchart of a service processing method according to another embodiment of the present invention;
FIG. 7 is a schematic diagram of a coordinate system according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of a central axis according to an embodiment of the present invention;
FIG. 9 is a schematic flowchart of a service processing method according to another embodiment of the present invention;
FIG. 10 is a schematic flowchart of a service processing method according to another embodiment of the present invention;
FIG. 11 is a schematic structural diagram of a service processing apparatus according to an embodiment of the present invention;
FIG. 12 is a schematic structural diagram of a terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the present invention will be described clearly and completely below with reference to the accompanying drawings. It should be understood that the described embodiments are some, but not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art based on the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
The embodiment of the invention provides a novel simple gesture, so that a terminal can automatically perform service processing for a user based on the simple gesture, thereby improving the convenience of service processing. A simple gesture is a gesture in which a user's finger slides from one area of the terminal to another area within a preset time period. For example, the simple gesture may be gesture 1, in which the user's finger slides from the screen area to the fingerprint area within the preset time period, as shown in fig. 1a; the detection process performed by the terminal for gesture 1 is shown in fig. 1b. Specifically: (1) if the terminal detects a sliding curve of the user's finger in the screen area, and the sliding curve intersects the bottom edge of the screen area of the terminal at intersection point 1 shown in fig. 1a, the terminal acquires the coordinate of intersection point 1 as X. (2) The terminal then starts fingerprint detection. (3) If a user fingerprint is detected to start appearing at intersection point 2 within the preset time period and its fingerprint area gradually increases, the coordinate of intersection point 2 is acquired as Z. (4) According to steps (1)-(3), the user's gesture can be determined to be gesture 1, and its order is from the screen area to the fingerprint area.
For another example, the simple gesture may also be gesture 2, in which the user's finger slides from the fingerprint area to the screen area within a preset time period, as shown in fig. 2a; the detection process performed by the terminal for gesture 2 is shown in fig. 2b. Specifically: (1) if the terminal detects a user fingerprint in the fingerprint area, and the area of the fingerprint gradually decreases until the fingerprint disappears at intersection point 1, the terminal acquires the coordinate of intersection point 1 as z. (2) The terminal then starts screen detection. (3) If a line drawn into the screen from the bottom edge of the screen area is detected within the preset time period, the coordinate of intersection point 2, where the line crosses the bottom edge of the screen area, is acquired as x. (4) According to steps (1)-(3), the user's gesture can be determined to be gesture 2, and its order is from the fingerprint area to the screen area.
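To make the two detection flows concrete, the following minimal Kotlin sketch shows how the results of screen-edge detection and fingerprint detection could be combined into a single gesture result within a preset time window; the class names, the 800 ms window and the event shapes are illustrative assumptions and are not taken from the patent itself.

```kotlin
// Minimal sketch of the two-area "simple gesture" detection described above.
// Class names, the 800 ms window and the event shapes are illustrative
// assumptions, not definitions taken from the patent.

data class Point(val x: Float, val y: Float)

enum class GestureOrder { SCREEN_TO_FINGERPRINT, FINGERPRINT_TO_SCREEN }

data class SimpleGesture(val order: GestureOrder, val start: Point, val end: Point)

// Reported when a sliding curve in the screen area crosses its bottom edge.
data class ScreenEdgeCross(val point: Point, val timeMs: Long)

// Reported when a matched fingerprint appears (area growing) or
// disappears (area shrinking) in the fingerprint area.
data class FingerprintEvent(val point: Point, val areaGrowing: Boolean, val timeMs: Long)

const val GESTURE_WINDOW_MS = 800L // assumed "preset time period"

// Gesture 1: the finger slides from the screen area into the fingerprint area.
fun detectGesture1(cross: ScreenEdgeCross, fp: FingerprintEvent): SimpleGesture? =
    if (fp.areaGrowing && (fp.timeMs - cross.timeMs) in 0L..GESTURE_WINDOW_MS)
        SimpleGesture(GestureOrder.SCREEN_TO_FINGERPRINT, cross.point, fp.point)
    else null

// Gesture 2: the finger slides from the fingerprint area into the screen area.
fun detectGesture2(fp: FingerprintEvent, cross: ScreenEdgeCross): SimpleGesture? =
    if (!fp.areaGrowing && (cross.timeMs - fp.timeMs) in 0L..GESTURE_WINDOW_MS)
        SimpleGesture(GestureOrder.FINGERPRINT_TO_SCREEN, fp.point, cross.point)
    else null

fun main() {
    val g1 = detectGesture1(
        ScreenEdgeCross(Point(3f, 0f), timeMs = 100),
        FingerprintEvent(Point(3f, -1f), areaGrowing = true, timeMs = 400)
    )
    println(g1) // SimpleGesture(order=SCREEN_TO_FINGERPRINT, start=..., end=...)
}
```

In this sketch, gesture 1 requires the fingerprint area to be growing after the edge crossing, while gesture 2 requires it to have been shrinking before the line appears in the screen area, mirroring steps (1)-(4) above.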
Based on the above description of the simple gesture, an embodiment of the present invention provides a service processing method as shown in fig. 3. The service processing method may be performed by a terminal, and the terminal may include a first area and a second area, where the terminal may include but is not limited to: smart phones, tablets, laptops, etc. As shown in fig. 3, the service processing method may include the following steps S301 to S303:
S301, acquiring a gesture starting point in the first area.
In a specific implementation process, a gesture trigger event can be detected in the first area; if a gesture trigger event is detected, a gesture starting point may be determined in the first area in response to that event.
In one embodiment, the first area may be a screen area and the second area may be a fingerprint area, with the bottom edge of the screen area adjacent to the top edge of the fingerprint area; here, adjacent means that the bottom edge of the screen area and the top edge of the fingerprint area are parallel to each other, or that the bottom edge of the screen area and the top edge of the fingerprint area are the same edge, as shown in fig. 1a. In this embodiment, the gesture trigger event may include: an event in which a sliding curve is detected in the screen area and intersects the bottom edge of the screen area; the gesture starting point is the point where the sliding curve intersects the bottom edge of the screen area, such as intersection point 1 shown in fig. 1a.
It should be noted that, in the embodiment of the present invention, the shape of the sliding curve is not limited; that is, the sliding curve may be a curved line segment, a closed figure (e.g., a circle or a triangle), and so on. Consideration is given to the fact that a user may accidentally touch the terminal screen and thereby form a sliding curve in the screen area, which would trigger the terminal to perform subsequent service processing. Therefore, the terminal in the embodiment of the invention considers a gesture trigger event to have been detected only when a sliding curve is detected in the screen area and the sliding curve intersects the bottom edge of the screen area; that is, if the terminal detects a sliding curve in the screen area but the sliding curve does not intersect the bottom edge of the screen area, as shown in fig. 4, the terminal considers that no gesture trigger event has been detected. In this way, the terminal can avoid being falsely triggered into service processing by an accidental touch, which improves accuracy and saves the processing resources of the terminal.
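As a hypothetical illustration of this false-touch filter (reusing the Point class from the sketch above), a slide is only promoted to a gesture trigger event when its last sample reaches the bottom edge of the screen area; the coordinate convention, with y increasing downward and the bottom edge at y == screenHeight, is an assumption made for the example.

```kotlin
// Hypothetical helper for the false-touch filter above, reusing the Point class
// from the earlier sketch. The coordinate convention (y grows downward and the
// bottom edge of the screen area lies at y == screenHeight) is an assumption.
fun screenTriggerPoint(slide: List<Point>, screenHeight: Float): Point? {
    // A slide that never reaches the bottom edge is treated as an accidental
    // touch and produces no gesture trigger event (cf. fig. 4).
    val last = slide.lastOrNull() ?: return null
    return if (last.y >= screenHeight) Point(last.x, screenHeight) else null
}
```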
In another embodiment, the first area may be a fingerprint area and the second area may be a screen area, with the bottom edge of the screen area adjacent to the top edge of the fingerprint area. In this embodiment, the gesture trigger event may include: detecting a target fingerprint in the fingerprint area, where the fingerprint area of the detected target fingerprint gradually decreases along a second preset direction. The second preset direction may be any direction, and the target fingerprint refers to a fingerprint that is currently detected by the terminal and matches the terminal's fingerprint database. For example, suppose the fingerprint database includes a previously entered fingerprint 1, fingerprint 2, and fingerprint 3: if the fingerprint currently detected by the terminal is fingerprint 1, that is, the currently detected fingerprint matches the fingerprint database, the terminal can be considered to have detected the target fingerprint in the fingerprint area, and the target fingerprint is fingerprint 1; if the fingerprint currently detected by the terminal is a fingerprint 4, that is, the currently detected fingerprint does not match the fingerprint database, the terminal can be considered not to have detected the target fingerprint in the fingerprint area. Accordingly, the gesture starting point is the point where the target fingerprint disappears in the fingerprint area, such as intersection point 1 shown in fig. 2a.
It should be noted that a user may simply want to input the target fingerprint in the fingerprint area in order to unlock the terminal. Therefore, the terminal in the embodiment of the present invention considers a gesture trigger event to have been detected only when the target fingerprint is detected in the fingerprint area and the fingerprint area of the target fingerprint gradually decreases along the second preset direction; that is, if the terminal detects the target fingerprint in the fingerprint area but the fingerprint area of the target fingerprint does not gradually decrease along the second preset direction, the terminal considers that no gesture trigger event has been detected. In this way, the terminal can avoid being falsely triggered into service processing when the user merely unlocks the terminal with the target fingerprint, which improves accuracy and saves the processing resources of the terminal.
S302, acquiring a gesture end point in the second area.
In a specific implementation, a sliding operation may be detected in the second area, and a gesture end point is then determined in the second area according to the sliding operation. When the first area is a screen area and the second area is a fingerprint area, the specific implementation of determining the gesture end point in the second area according to the sliding operation may be: detecting a target fingerprint in the fingerprint area according to the sliding operation, and recording the position point of the target fingerprint in the fingerprint area; if the fingerprint area of the detected target fingerprint gradually increases from that position point along a first preset direction, the position point is determined to be the gesture end point, such as intersection point 2 shown in fig. 1a. The first preset direction may be any direction.
When the first area is a fingerprint area and the second area is a screen area, the specific implementation of determining the gesture end point in the second area according to the sliding operation may be: determining the intersection point of the line formed by the sliding operation and the bottom edge of the screen area; this intersection point is determined to be the gesture end point, such as intersection point 2 shown in fig. 2a.
S303, determining gesture information according to the gesture starting point and the gesture end point, and performing service processing according to the gesture information.
After determining the gesture starting point and the gesture ending point, determining gesture information according to the gesture starting point and the gesture ending point; the gesture information includes at least one of: coordinate information and gesture order. Wherein the coordinate information comprises at least one of the following coordinates: starting point coordinates of a gesture starting point and end point coordinates of a gesture end point; the gesture sequence includes: the order from the screen area to the fingerprint area, or the order from the fingerprint area to the screen area.
After the gesture information is obtained, service processing can be performed according to the gesture information. In one embodiment, the service processing may be performed according to the gesture sequence in the gesture information. For example, mode-switching service processing can be performed according to the gesture sequence in the gesture information. Specifically, the target mode corresponding to the gesture sequence may be determined according to a preset mapping relationship between sequences and modes, and the current mode is switched to the target mode; the current mode and the target mode here may each be any of the following: a normal display mode, an eye-protection mode, a one-handed mode, and so on. For example, let the current mode be the normal display mode: if the gesture sequence is from the screen area to the fingerprint area, the target mode can be determined to be the eye-protection mode according to the mapping relationship, and the terminal switches from the normal display mode to the eye-protection mode; for another example, if the gesture sequence is from the fingerprint area to the screen area, the target mode can be determined to be the one-handed mode according to the mapping relationship, and the terminal switches from the normal display mode to the one-handed mode, as shown in fig. 5. It should be noted that, if the current mode of the terminal is the same as the target mode, the mode switching may be skipped, or the terminal may switch to the normal display mode by default. In another embodiment, interface-switching service processing may be performed according to the gesture sequence in the gesture information. For example, if the gesture sequence is from the screen area to the fingerprint area, the terminal may switch from the current interface to its main interface, and so on; for a specific implementation, reference may be made to the above implementation of mode-switching service processing, which is not described again here.
In another embodiment, the service processing may be performed according to the coordinate information in the gesture information. In one embodiment, the service processing of starting a target application may be executed according to the starting point coordinates of the gesture starting point. Specifically, according to a mapping relationship between coordinates and applications, the target application corresponding to the starting point coordinates of the gesture starting point is determined, and the target application is started. For example, if the starting point coordinates of the gesture starting point are (0, 1) and the corresponding target application is an instant messaging application, the instant messaging application may be opened. In another embodiment, the service processing of starting a target application may also be executed according to the end point coordinates of the gesture end point; for a specific implementation, reference may be made to the implementation of starting the target application according to the starting point coordinates, which is not described again here. It should be noted that, in other embodiments, the service processing of mode switching or interface switching may also be performed according to the starting point coordinates of the gesture starting point, or according to the end point coordinates of the gesture end point; for a specific implementation, reference may be made to the above implementation of performing service processing according to the gesture sequence, which is not described again here.
In still another embodiment, the service processing may be performed according to both the gesture sequence and the coordinate information in the gesture information. Specifically, the target service to be processed may be determined according to the gesture sequence in the gesture information; an adjustment rule for the target service is then determined according to the coordinate information in the gesture information, and the target service is executed according to the adjustment rule. It should be noted that the above specific embodiments of performing service processing according to gesture information are only examples and are not exhaustive.
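A small sketch of this kind of dispatch is given below, reusing the GestureOrder type from the earlier sketch; the particular mappings (screen-to-fingerprint selects the eye-protection mode, fingerprint-to-screen selects the one-handed mode, and a same-mode gesture falls back to the normal display mode) follow the examples above, but the function and type names are assumptions.

```kotlin
// Sketch of dispatching mode-switching service processing from the gesture
// sequence, reusing GestureOrder from the earlier sketch. The concrete
// mappings are assumptions drawn from the examples above.

enum class DisplayMode { NORMAL, EYE_PROTECTION, ONE_HANDED }

val orderToMode = mapOf(
    GestureOrder.SCREEN_TO_FINGERPRINT to DisplayMode.EYE_PROTECTION,
    GestureOrder.FINGERPRINT_TO_SCREEN to DisplayMode.ONE_HANDED
)

fun switchMode(current: DisplayMode, order: GestureOrder): DisplayMode {
    val target = orderToMode[order] ?: return current
    // If the target mode equals the current mode, fall back to the normal mode.
    return if (target == current) DisplayMode.NORMAL else target
}
```

A coordinate-to-application table for the application-launching variant could be expressed in the same way, keyed on the starting point coordinates instead of the gesture sequence.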
In the embodiment of the invention, a gesture starting point can be acquired in the first area, and a gesture end point can be acquired in the second area; gesture information is then determined according to the gesture starting point and the gesture end point, and service processing is performed according to the gesture information. Therefore, during service processing the user only needs to input a gesture on the terminal, and the terminal can automatically perform the service processing according to the gesture information; the user does not need to search for the corresponding service processing function, which improves the convenience and efficiency of service processing.
Based on the above description, an embodiment of the present invention further provides a service processing method as shown in fig. 6, and the service processing method can implement the service processing function. The service processing method may be performed by a terminal, and the terminal may include a first area and a second area, where the terminal may include but is not limited to: smart phones, tablets, laptops, etc. As shown in fig. 6, the service processing method may include the following steps S601-S605:
S601, in response to a gesture trigger event detected in the first area, determining a gesture starting point in the first area.
S602, detecting a sliding operation in the second area, and determining a gesture end point in the second area according to the sliding operation.
S603, determining gesture information according to a gesture starting point and a gesture end point; the gesture information includes coordinate information and a gesture order.
In a specific implementation process, the terminal can preset a reference line and establish a coordinate system based on the reference line; the reference line may be the horizontal line on which the bottom edge of the screen area (or the top edge of the fingerprint area) lies, or the horizontal line on which the top edge of the screen area lies, or the horizontal line on which the bottom edge of the fingerprint area lies, and so on, which is not limited in the embodiment of the present invention; the coordinate system established based on the reference line is shown in fig. 7. When the coordinate information is determined according to the gesture starting point and the gesture end point, the starting point coordinates of the gesture starting point can be determined according to the position of the gesture starting point in the coordinate system, and/or the end point coordinates of the gesture end point can be determined according to the position of the gesture end point in the coordinate system; the starting point coordinates of the gesture starting point and/or the end point coordinates of the gesture end point are then added to the coordinate information.
When the gesture sequence is determined according to the gesture starting point and the gesture ending point, the gesture sequence may be determined according to a region where the gesture starting point is located and a region where the gesture ending point is located. Specifically, if the area where the gesture starting point is located is the screen area and the area where the gesture ending point is located is the fingerprint area, the gesture sequence is the sequence from the screen area to the fingerprint area; if the area where the gesture starting point is located is the fingerprint area and the area where the gesture ending point is located is the screen area, the gesture sequence is the sequence from the fingerprint area to the screen area.
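The following fragment sketches step S603 under the same assumptions as the earlier snippets (Point and GestureOrder reused): the gesture sequence is derived purely from which area contributed the starting point and which contributed the end point.

```kotlin
// Sketch of step S603 under the same assumptions as the earlier snippets
// (Point and GestureOrder reused): the gesture sequence follows directly from
// which area reported the starting point and which reported the end point.

enum class Area { SCREEN, FINGERPRINT }

data class GestureInfo(val start: Point, val end: Point, val order: GestureOrder)

fun buildGestureInfo(start: Point, startArea: Area, end: Point, endArea: Area): GestureInfo? =
    when {
        startArea == Area.SCREEN && endArea == Area.FINGERPRINT ->
            GestureInfo(start, end, GestureOrder.SCREEN_TO_FINGERPRINT)
        startArea == Area.FINGERPRINT && endArea == Area.SCREEN ->
            GestureInfo(start, end, GestureOrder.FINGERPRINT_TO_SCREEN)
        else -> null // same-area gestures are not simple gestures in this scheme
    }
```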
S604, determining the target service to be processed according to the gesture sequence.
In a specific implementation process, the target service corresponding to the gesture sequence can be determined according to a correspondence between gesture sequences and services; the target service here includes any one of the following: a mode adjustment service, a screen brightness adjustment service, a volume adjustment service, and a multimedia playing adjustment service. The mode adjustment service refers to a service for switching from one mode to another mode.
S605, determining an adjustment rule about the target service according to the coordinate information, and executing the target service according to the adjustment rule.
In a specific implementation process, an adjustment rule for the target service may be determined according to the starting point coordinates of the gesture starting point or the end point coordinates of the gesture end point, and the target service is then executed according to the adjustment rule. The embodiment of the present invention is described by taking as an example the determination of the adjustment rule according to the starting point coordinates of the gesture starting point; the specific implementation of determining the adjustment rule according to the end point coordinates of the gesture end point can refer to this description and is not repeated here.
If the target service is a mode adjustment service, the mode adjustment service may be a service that is switched from a current mode to a single-handed mode, and the single-handed mode may include a left-handed mode or a right-handed mode; the rule for adjusting to the left-hand mode or the right-hand mode from the current mode may be determined according to the coordinates of the start point of the gesture start point. If the abscissa included by the coordinates of the starting point of the gesture is smaller than the abscissa of the central axis of the terminal, namely the gesture starting point is positioned on the left side of the central axis, determining that the adjustment rule is a rule for switching from the current mode to the left-hand mode; if the abscissa included in the coordinates of the starting point of the gesture is larger than the abscissa of the central axis, that is, the starting point of the gesture is located on the right side of the central axis, the adjustment rule is determined to be a rule for switching from the current mode to the right-hand mode. The central axis is a line that equally divides the terminal into left and right portions, as shown in fig. 8.
If the target service is a screen brightness adjustment service, a corresponding target brightness value can be determined according to the starting point coordinates of the gesture starting point, and the adjustment rule is determined to be a rule for adjusting the screen brightness of the terminal to the target brightness value; for example, if the starting point coordinates of the gesture starting point are (3, 5) and the corresponding target brightness value is 80 candela per square metre (cd/m²), the adjustment rule is a rule for adjusting the screen brightness of the terminal to 80 cd/m². If the target service is a volume adjustment service, a corresponding target volume value can be determined according to the starting point coordinates of the gesture starting point, and the adjustment rule is determined to be a rule for adjusting the volume of the terminal (such as the media volume, ring volume, alarm volume, or call volume) to the target volume value.
If the target service is a multimedia playing adjustment service, the adjustment rule can be determined, according to the starting point coordinates of the gesture starting point, to be a rule for ending the currently played multimedia and playing the next multimedia. If the abscissa included in the starting point coordinates of the gesture starting point is smaller than the abscissa of the central axis of the terminal, that is, the gesture starting point is located on the left side of the central axis, the next multimedia is the adjacent multimedia in the multimedia list that precedes the currently played multimedia; if the abscissa included in the starting point coordinates of the gesture starting point is larger than the abscissa of the central axis of the terminal, that is, the gesture starting point is located on the right side of the central axis, the next multimedia is the adjacent multimedia in the multimedia list that follows the currently played multimedia. For example, the multimedia list includes: ... multimedia 4, multimedia 5, multimedia 6 ..., where the currently played multimedia is multimedia 5; if the gesture starting point is located on the left side of the central axis of the terminal, the next multimedia is multimedia 4; if the gesture starting point is located on the right side of the central axis of the terminal, the next multimedia is multimedia 6.
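The adjustment rules above can be summarised in a short sketch; the central-axis comparison follows the description, while the concrete brightness mapping (scaling the x coordinate to a 0-255 range) is an invented example of a coordinate-to-value rule and is not specified in the patent.

```kotlin
// Sketch of S605 for some of the target services above. The central-axis
// comparison follows the description; the brightness mapping (x scaled to
// 0..255) is an invented example of a coordinate-to-value rule.

sealed class AdjustRule {
    data class SwitchToOneHanded(val leftHanded: Boolean) : AdjustRule()
    data class SetBrightness(val value: Int) : AdjustRule()
    data class PlayAdjacent(val previous: Boolean) : AdjustRule()
}

// Start point left of the central axis selects the left-hand mode.
fun oneHandedRule(startX: Float, centralAxisX: Float) =
    AdjustRule.SwitchToOneHanded(leftHanded = startX < centralAxisX)

// Map the start point's x coordinate onto a brightness value.
fun brightnessRule(startX: Float, screenWidth: Float) =
    AdjustRule.SetBrightness(value = (startX / screenWidth * 255).toInt().coerceIn(0, 255))

// Start point left of the central axis plays the previous item, right plays the next.
fun multimediaRule(startX: Float, centralAxisX: Float) =
    AdjustRule.PlayAdjacent(previous = startX < centralAxisX)
```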
In the embodiment of the invention, in response to a gesture trigger event detected in the first area, a gesture starting point is determined in the first area; a sliding operation is detected in the second area, and a gesture end point is determined in the second area according to the sliding operation; gesture information is then determined according to the gesture starting point and the gesture end point, and service processing is performed according to the gesture information. Therefore, during service processing the user only needs to input a gesture on the terminal, and the terminal can automatically perform the service processing according to the gesture information; the user does not need to search for the corresponding service processing function, which improves the convenience and efficiency of service processing.
Based on the above description, an embodiment of the present invention further provides a service processing method as shown in fig. 9. The service processing method may be performed by a terminal, and the terminal may include a first area and a second area, where the terminal may include but is not limited to: smart phones, tablets, laptops, etc. As shown in fig. 9, the service processing method may include the following steps S901 to S903:
S901, acquiring first fingerprint information in the first area.
S902, acquiring second fingerprint information in the second area.
In steps S901-S902, the first area may be a first fingerprint acquisition area, and the second area may be a second fingerprint acquisition area. Wherein, the fingerprint collection area can include: an under-screen fingerprint acquisition area and/or a non-under-screen fingerprint acquisition area; the under-screen fingerprint collection area is an area hidden below a terminal screen and used for collecting fingerprints, and the non-under-screen fingerprint collection area is an area where physical keys which are configured on the terminal and used for collecting fingerprints are located. And when the fingerprint collection area is a fingerprint collection area under the screen, the first fingerprint collection area can be a first screen area of the screen of the terminal, the second fingerprint collection area is a second screen area of the screen of the terminal, and the first screen area and the second screen area are two different and mutually independent screen areas in the screen of the terminal.
The terminal may acquire the first fingerprint information and the second fingerprint information in the first area and the second area, respectively, where the fingerprint information may include at least one of: fingerprint data, fingerprint acquisition time and other information. That is, the first fingerprint information may include at least one of: fingerprint data of the first fingerprint and fingerprint acquisition time of the first fingerprint; the second fingerprint information may include at least one of: fingerprint data of the second fingerprint and a fingerprint acquisition time of the second fingerprint. It should be noted that the first fingerprint and the second fingerprint may be fingerprints of the same finger or fingerprints of different fingers.
S903, determining the target service to be processed according to the first fingerprint information and the second fingerprint information, and processing.
After the terminal acquires the first fingerprint information and the second fingerprint information, the target service to be processed can be determined according to the first fingerprint information and the second fingerprint information; the target service to be processed here may include any one of the following: mode adjustment service, screen brightness adjustment service, volume adjustment service, multimedia playing adjustment service. After determining the target service, the terminal may execute the target service.
The first fingerprint information comprises the fingerprint data and fingerprint acquisition time of the first fingerprint, and the second fingerprint information comprises the fingerprint data and fingerprint acquisition time of the second fingerprint. In one embodiment, the specific implementation of step S903 may be: first, the fingerprint data of the first fingerprint and the fingerprint data of the second fingerprint are used to form a target fingerprint data set, and the target service corresponding to the target fingerprint data set is then determined according to a preset mapping relationship between fingerprint data sets and services. In the process of executing the target service, a fingerprint acquisition order is determined according to the fingerprint acquisition time of the first fingerprint and the fingerprint acquisition time of the second fingerprint; the adjustment rule of the target service is then determined according to the fingerprint acquisition order, and the target service is executed according to the adjustment rule. For example, suppose the target service determined from the target fingerprint data set is a mode adjustment service: if the fingerprint acquisition order is first fingerprint → second fingerprint, the adjustment rule of the mode adjustment service can be determined to be a rule for switching from the current mode to the right-hand mode; if the fingerprint acquisition order is second fingerprint → first fingerprint, the adjustment rule of the mode adjustment service can be determined to be a rule for switching from the current mode to the left-hand mode. For another example, suppose the target service determined from the target fingerprint data set is a screen brightness adjustment service: if the fingerprint acquisition order is first fingerprint → second fingerprint, the adjustment rule can be determined to be a rule for turning up the current screen brightness of the terminal; if the fingerprint acquisition order is second fingerprint → first fingerprint, the adjustment rule can be determined to be a rule for turning down the current screen brightness of the terminal, and so on.
In another embodiment, the specific implementation manner of step S903 may be: firstly, determining a fingerprint acquisition sequence according to the fingerprint acquisition time of the first fingerprint and the fingerprint acquisition time of the second fingerprint, and then determining a target service corresponding to the fingerprint acquisition sequence according to a preset mapping relation between the acquisition sequence and the service. In the process of executing the target service, a target fingerprint data set can be determined according to the fingerprint data of the first fingerprint and the fingerprint data of the second fingerprint; then, an adjustment rule of the target service is determined according to the target fingerprint data set, and the target service is executed according to the adjustment rule.
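A minimal sketch of the first variant of step S903 is shown below; the finger identifiers and the pairing table are assumptions used only to illustrate how a fingerprint data set selects the target service while the acquisition order selects the adjustment direction.

```kotlin
// Minimal sketch of the first variant of step S903: the pair of matched
// fingerprints selects the target service, and the acquisition order selects
// the adjustment direction. Finger identifiers and the pairing table are
// assumptions for the example.

enum class TargetService { MODE_ADJUST, BRIGHTNESS_ADJUST, VOLUME_ADJUST, MULTIMEDIA_ADJUST }

data class FingerprintSample(val fingerId: String, val capturedAtMs: Long)

val fingerPairToService = mapOf(
    setOf("thumb", "index") to TargetService.BRIGHTNESS_ADJUST,
    setOf("index", "middle") to TargetService.VOLUME_ADJUST
)

fun handleFingerprints(first: FingerprintSample, second: FingerprintSample) {
    val service = fingerPairToService[setOf(first.fingerId, second.fingerId)] ?: return
    // The acquisition order decides the adjustment rule, e.g. turn up vs. turn down.
    val firstAreaFirst = first.capturedAtMs <= second.capturedAtMs
    println("execute $service, increase = $firstAreaFirst")
}
```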
In the embodiment of the present invention, the first fingerprint information may be acquired in the first area, and the second fingerprint information may be acquired in the second area; the target service to be processed is then determined according to the first fingerprint information and the second fingerprint information and is processed. Therefore, during service processing the user only needs to input fingerprint information on the terminal, and the terminal can automatically perform the service processing according to the fingerprint information; the user does not need to search for the corresponding service processing function, which improves the convenience and efficiency of service processing.
Based on the above description, an embodiment of the present invention further provides a service processing method as shown in fig. 10. The service processing method may be performed by a terminal, and the terminal may include a first area and a second area, where the terminal may include but is not limited to: smart phones, tablets, laptops, etc. As shown in fig. 10, the service processing method may include the following steps S1001 to S1003:
S1001, acquiring a first gesture track in the first area.
S1002, acquiring a second gesture track in the second area.
In steps S1001-S1002, the first area may be a first screen area of the terminal, and the second area may be a second screen area of the terminal. The first screen area and the second screen area are any two area positions of the screen of the terminal, that is, the first screen area and the second screen area are two different and mutually independent screen areas in the screen of the terminal. The terminal may acquire a first gesture track and a second gesture track in the first area and the second area, respectively, where a gesture track may be a curved line segment, a closed figure (e.g., a circle or triangle), and so on.
S1003, determining a target service to be processed according to the first gesture track and the second gesture track, and processing.
After the terminal acquires the first gesture track and the second gesture track, the target service to be processed can be determined according to the first gesture track and the second gesture track; the target service to be processed here may include any one of the following: mode adjustment service, screen brightness adjustment service, volume adjustment service, multimedia playing adjustment service. After determining the target service, the terminal may execute the target service.
In a specific implementation process, a target graphic formed by the first gesture track and the second gesture track can be determined, and the target service corresponding to the target graphic is then determined according to a preset mapping relationship between graphics and services. In the process of executing the target service, a gesture sequence can be determined according to the first gesture track and the second gesture track; an adjustment rule for the target service is then determined according to the gesture sequence, and the target service is executed according to the adjustment rule.
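The sketch below illustrates the idea with a deliberately crude shape classifier (reusing TargetService from the previous sketch); both the classifier and the graphic-to-service table are assumptions for the example, since the patent does not prescribe how the target graphic is recognised.

```kotlin
// Sketch of the two-trajectory variant, reusing TargetService from the
// previous sketch. The crude shape classifier and the graphic-to-service
// table are assumptions; the patent does not prescribe either.

enum class TrackShape { LINE, CLOSED, OTHER }

val shapePairToService = mapOf(
    Pair(TrackShape.LINE, TrackShape.LINE) to TargetService.BRIGHTNESS_ADJUST,
    Pair(TrackShape.CLOSED, TrackShape.LINE) to TargetService.MULTIMEDIA_ADJUST
)

fun classify(track: List<Pair<Float, Float>>): TrackShape {
    if (track.size < 3) return TrackShape.OTHER
    val (sx, sy) = track.first()
    val (ex, ey) = track.last()
    // A track whose end returns close to its start is treated as a closed figure.
    val closed = Math.hypot((ex - sx).toDouble(), (ey - sy).toDouble()) < 30.0
    return if (closed) TrackShape.CLOSED else TrackShape.LINE
}

fun serviceFor(first: List<Pair<Float, Float>>, second: List<Pair<Float, Float>>): TargetService? =
    shapePairToService[Pair(classify(first), classify(second))]
```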
In the embodiment of the invention, a first gesture track can be acquired in the first area, and a second gesture track can be acquired in the second area; the target service to be processed is then determined according to the first gesture track and the second gesture track and is processed. Therefore, during service processing the user only needs to input gesture tracks on the terminal, and the terminal can automatically perform the service processing according to the gesture tracks; the user does not need to search for the corresponding service processing function, which improves the convenience and efficiency of service processing.
Based on the description of the foregoing method embodiments, an embodiment of the present invention further provides a service processing apparatus, whose structure is shown schematically in fig. 11. As shown in fig. 11, the service processing apparatus in the embodiment of the present invention operates in a terminal, where the terminal includes a first area and a second area, and the service processing apparatus may include:
an obtaining unit 101, configured to obtain first information in the first area and obtain second information in the second area;
and a processing unit 102, configured to determine a target service to be processed according to the first information and the second information, and perform processing.
In an embodiment, the obtaining unit specifically includes:
the contact acquisition module is used for acquiring a gesture starting point in the first area and acquiring a gesture end point in the second area;
the fingerprint acquisition module is used for acquiring first fingerprint information in the first area and acquiring second fingerprint information in the second area;
and the track acquisition module is used for acquiring a first gesture track in the first area and acquiring a second gesture track in the second area.
In another embodiment, the first information includes: at least one of a gesture start point, first fingerprint information, and a first gesture trajectory;
The second information includes: at least one of a gesture endpoint, second fingerprint information, and a second gesture trajectory.
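Expressed as code, the apparatus can be pictured as two cooperating interfaces; the Kotlin types below merely mirror the description of the obtaining unit and the processing unit and are not an actual API of any terminal platform.

```kotlin
// Structural sketch of the service processing apparatus of fig. 11, expressed
// as interfaces. The types merely mirror the description of the obtaining unit
// and the processing unit; they are not an actual API of any platform.

data class AreaInfo(
    val gesturePoint: Pair<Float, Float>? = null,   // gesture start or end point
    val fingerprint: ByteArray? = null,             // first or second fingerprint data
    val track: List<Pair<Float, Float>>? = null     // first or second gesture track
)

interface ObtainingUnit {
    fun obtainFirstInfo(): AreaInfo   // information collected in the first area
    fun obtainSecondInfo(): AreaInfo  // information collected in the second area
}

interface ProcessingUnit {
    // Determine the target service from the two pieces of information and execute it.
    fun process(first: AreaInfo, second: AreaInfo)
}
```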
In the embodiment of the invention, a gesture starting point can be acquired in the first area, and a gesture end point can be acquired in the second area; gesture information is then determined according to the gesture starting point and the gesture end point, and service processing is performed according to the gesture information. Therefore, during service processing the user only needs to input a gesture on the terminal, and the terminal can automatically perform the service processing according to the gesture information; the user does not need to search for the corresponding service processing function, which improves the convenience and efficiency of service processing.
Fig. 12 is a schematic structural diagram of a terminal according to an embodiment of the present invention. The terminal in the present embodiment as shown in fig. 12 may include: one or more processors 201; one or more input devices 202, one or more output devices 203, and memory 204. The processor 201, the input device 202, the output device 203, and the memory 204 are connected by a bus 205. The memory 204 is used for storing a computer program comprising program instructions, and the processor 201 is configured to call the program instructions stored in the memory 204 to execute the service processing method.
In one embodiment, the processor 201 may be a central processing unit (CPU) or another general-purpose processor, such as a microprocessor or any conventional processor. The memory 204 may include both read-only memory and random access memory, and provides instructions and data to the processor 201. The specific forms of the processor 201 and the memory 204 are therefore not limited herein.
In the embodiments of the present invention, one or more program instructions stored in a computer storage medium are loaded and executed by the processor 201 to implement the corresponding steps of the methods in the corresponding embodiments described above; in a specific implementation:
In one embodiment, at least one first program instruction in a computer storage medium may be loaded by processor 201 and perform the steps of:
Acquiring a gesture starting point in the first area;
Acquiring a gesture end point in the second area;
And determining gesture information according to the gesture starting point and the gesture end point, and performing service processing according to the gesture information.
In one embodiment, when the gesture start point is obtained in the first region, the at least one first program instruction may be loaded by the processor 201 and specifically configured to perform: detecting a gesture trigger event in the first region; in response to a gesture triggering event detected in the first region, a gesture starting point is determined in the first region.
In another embodiment, when the gesture end point is acquired in the second area, the at least one first program instruction may be loaded by the processor 201 and specifically configured to perform: detecting a sliding operation in the second area; and determining a gesture end point in the second area according to the sliding operation.
In yet another embodiment, the first area is a screen area, the second area is a fingerprint area, and the bottom edge of the screen area is adjacent to the top edge of the fingerprint area; the gesture triggering event comprises: detecting an event that a sliding curve is detected in the screen area and the sliding curve intersects with the bottom side of the screen area; the gesture starting point is a point where the sliding curve intersects with the bottom edge of the screen area.
In yet another embodiment, when the gesture end point is determined in the second area according to the sliding operation, the at least one first program instruction may be loaded by the processor 201 and specifically configured to perform: detecting a target fingerprint in the fingerprint area according to the sliding operation, and recording the position point of the target fingerprint in the fingerprint area; and if the detected fingerprint area of the target fingerprint is gradually increased along a first preset direction from the position point, determining the position point as the gesture end point.
In yet another embodiment, the first area is a fingerprint area, the second area is a screen area, and the bottom edge of the screen area is adjacent to the top edge of the fingerprint area. The gesture trigger event comprises: detecting a target fingerprint in the fingerprint area, wherein the detected fingerprint area of the target fingerprint gradually decreases along a second preset direction; the gesture starting point is the point where the target fingerprint disappears from the fingerprint area.
In yet another embodiment, when the gesture end point is determined in the second area according to the sliding operation, the at least one first program instruction may be loaded by the processor 201 and specifically configured to perform: determining the intersection point of the line drawn by the sliding operation and the bottom edge of the screen area; and determining that intersection point as the gesture end point.
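Continuing the sketch above for this reverse, fingerprint-to-screen case, the end point can be obtained by locating where the line drawn in the screen area crosses the screen's bottom edge, for example by linear interpolation between the two consecutive samples that straddle that edge. Again, this is only an assumed illustration reusing the Point type from the previous sketch.

    // Intersection of the drawn line with the horizontal bottom edge of the screen area.
    fun intersectionWithBottomEdge(trace: List<Point>, screenBottomY: Float): Point? {
        for (i in 0 until trace.size - 1) {
            val a = trace[i]
            val b = trace[i + 1]
            // The segment crosses the edge when its endpoints lie on opposite sides of it.
            if ((a.y - screenBottomY) * (b.y - screenBottomY) <= 0f && a.y != b.y) {
                val t = (screenBottomY - a.y) / (b.y - a.y)   // interpolation factor in [0, 1]
                return Point(a.x + t * (b.x - a.x), screenBottomY)
            }
        }
        return null
    }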
In yet another embodiment, the gesture information includes at least one of: coordinate information and a gesture sequence. The coordinate information comprises at least one of the following coordinates: the starting point coordinates of the gesture starting point and the end point coordinates of the gesture end point. The gesture sequence includes: an order from the screen area to the fingerprint area, or an order from the fingerprint area to the screen area.
In another embodiment, when performing service processing according to the gesture information, the at least one first program instruction may be loaded by the processor 201 and specifically configured to perform: determining a target service to be processed according to the gesture sequence, wherein the target service comprises any one of the following: a mode adjustment service, a screen brightness adjustment service, a volume adjustment service, and a multimedia playing adjustment service; and determining an adjustment rule for the target service according to the coordinate information, and executing the target service according to the adjustment rule.
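As a rough illustration of this step, the following sketch maps a gesture sequence to a target service and turns the start/end coordinates into an adjustment value, reusing the Point type from the earlier sketch. Which sequence selects which service, and the scaling factor used, are assumptions made purely for the example; the embodiment does not prescribe a specific mapping.

    enum class GestureSequence { SCREEN_TO_FINGERPRINT, FINGERPRINT_TO_SCREEN }
    enum class TargetService { MODE_ADJUSTMENT, SCREEN_BRIGHTNESS, VOLUME, MULTIMEDIA_PLAYBACK }

    data class GestureInfo(val start: Point?, val end: Point?, val sequence: GestureSequence)

    // Assumed mapping from gesture sequence to target service.
    fun selectTargetService(sequence: GestureSequence): TargetService = when (sequence) {
        GestureSequence.SCREEN_TO_FINGERPRINT -> TargetService.SCREEN_BRIGHTNESS
        GestureSequence.FINGERPRINT_TO_SCREEN -> TargetService.VOLUME
    }

    // Example adjustment rule: scale the vertical distance between start and end into a step value.
    fun adjustmentStep(info: GestureInfo, stepPerPixel: Float = 0.05f): Float {
        val start = info.start ?: return 0f
        val end = info.end ?: return 0f
        return (start.y - end.y) * stepPerPixel
    }

For example, an upward gesture (end above start) would yield a positive step that could raise the brightness or volume by that amount, while a downward gesture would lower it.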
In yet another embodiment, at least one second program instruction in the computer storage medium may be loaded and executed by the processor 201 to perform the following steps:
acquiring first fingerprint information in the first area;
acquiring second fingerprint information in the second area;
and determining a target service to be processed according to the first fingerprint information and the second fingerprint information, and processing the target service.
In one embodiment, the first area is a first fingerprint acquisition area and the second area is a second fingerprint acquisition area; the fingerprint acquisition area includes: an under-screen fingerprint acquisition area and/or a non-under-screen fingerprint acquisition area.
In another embodiment, when the fingerprint acquisition area is an under-screen fingerprint acquisition area, the first fingerprint acquisition area is a first screen area of the screen of the terminal, and the second fingerprint acquisition area is a second screen area of the screen of the terminal.
In another embodiment, the target service to be processed includes any one of the following: a mode adjustment service, a screen brightness adjustment service, a volume adjustment service, and a multimedia playing adjustment service.
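One conceivable way to realize this fingerprint-based embodiment is a lookup keyed by the pair of fingers recognized in the two acquisition areas, as sketched below with the TargetService type from the earlier sketch. The finger labels and the pair-to-service mapping are assumptions for illustration only, and fingerprint matching itself is left abstract.

    // Result of matching a fingerprint against enrolled templates; the label is illustrative.
    data class FingerprintInfo(val fingerId: String)

    // Assumed mapping: which pair of recognized fingers selects which target service.
    val servicesByFingerPair: Map<Pair<String, String>, TargetService> = mapOf(
        ("thumb" to "index") to TargetService.MODE_ADJUSTMENT,
        ("index" to "middle") to TargetService.VOLUME,
    )

    fun selectService(first: FingerprintInfo, second: FingerprintInfo): TargetService? =
        servicesByFingerPair[first.fingerId to second.fingerId]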
In yet another embodiment, at least one third program instruction in the computer storage medium may be loaded and executed by the processor 201 to perform the following steps:
acquiring a first gesture track in the first area;
acquiring a second gesture track in the second area;
and determining a target service to be processed according to the first gesture track and the second gesture track, and processing the target service.
In one embodiment, the first area is a first screen area of the terminal, the second area is a second screen area of the terminal, and the first screen area and the second screen area are any two area positions on the screen of the terminal.
In another embodiment, the target service to be processed includes any one of the following: a mode adjustment service, a screen brightness adjustment service, a volume adjustment service, and a multimedia playing adjustment service.
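For this trajectory-based embodiment, one simple realization is to reduce each of the two gesture tracks to a coarse direction and key the service selection on the pair of directions, as in the sketch below. The direction pairs and the services they select are, again, assumptions made only to illustrate the idea; the Point and TargetService types come from the earlier sketches.

    import kotlin.math.abs

    enum class Direction { UP, DOWN, LEFT, RIGHT }

    // Reduce a track to its dominant direction from first to last sample (y grows downward).
    fun coarseDirection(track: List<Point>): Direction? {
        val first = track.firstOrNull() ?: return null
        val last = track.lastOrNull() ?: return null
        val dx = last.x - first.x
        val dy = last.y - first.y
        return if (abs(dx) >= abs(dy)) {
            if (dx >= 0) Direction.RIGHT else Direction.LEFT
        } else {
            if (dy >= 0) Direction.DOWN else Direction.UP
        }
    }

    // Assumed mapping from the pair of coarse directions to a target service.
    fun selectServiceByTracks(firstTrack: List<Point>, secondTrack: List<Point>): TargetService? =
        when (coarseDirection(firstTrack) to coarseDirection(secondTrack)) {
            Direction.UP to Direction.UP -> TargetService.SCREEN_BRIGHTNESS
            Direction.DOWN to Direction.DOWN -> TargetService.VOLUME
            Direction.LEFT to Direction.RIGHT -> TargetService.MULTIMEDIA_PLAYBACK
            else -> null
        }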
In the embodiment of the invention, a gesture starting point can be acquired in the first area and a gesture end point can be acquired in the second area; gesture information is then determined according to the gesture starting point and the gesture end point, and service processing is performed according to the gesture information. Therefore, during service processing, the user only needs to input a gesture on the terminal, and the terminal automatically processes the corresponding service according to the gesture information without the user having to search for the corresponding service processing function, which improves the convenience and efficiency of service processing.
It should be noted that, for the specific working processes of the terminal and the units described above, reference may be made to the relevant descriptions in the foregoing embodiments; details are not described here again.
It will be understood by those skilled in the art that all or part of the processes of the methods in the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium; when the program is executed, the processes of the above method embodiments may be included. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
While the invention has been described with reference to a number of embodiments, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (21)

1. A service processing method applied to a terminal, wherein the terminal comprises a first area and a second area, and the method comprises the following steps:
acquiring a gesture starting point in the first area;
acquiring a gesture end point in the second area;
and determining gesture information according to the gesture starting point and the gesture end point, and performing service processing according to the gesture information.
2. The method of claim 1, wherein said acquiring a gesture starting point in the first area comprises:
detecting a gesture trigger event in the first area;
and determining a gesture starting point in the first area in response to the gesture trigger event detected in the first area.
3. The method of claim 2, wherein said acquiring a gesture end point in the second area comprises:
detecting a sliding operation in the second area;
and determining a gesture end point in the second area according to the sliding operation.
4. The method of claim 3, wherein the first area is a screen area, the second area is a fingerprint area, and a bottom edge of the screen area is adjacent to a top edge of the fingerprint area;
the gesture trigger event comprises: an event in which a sliding curve is detected in the screen area and the sliding curve intersects with the bottom edge of the screen area; and the gesture starting point is the point where the sliding curve intersects with the bottom edge of the screen area.
5. The method of claim 4, wherein the determining a gesture end point in the second area according to the sliding operation comprises:
detecting a target fingerprint in the fingerprint area according to the sliding operation, and recording the position point of the target fingerprint in the fingerprint area;
and if the detected fingerprint area of the target fingerprint gradually increases along a first preset direction from the position point, determining the position point as the gesture end point.
6. The method of claim 3, wherein the first area is a fingerprint area, the second area is a screen area, and a bottom edge of the screen area is adjacent to a top edge of the fingerprint area;
the gesture trigger event comprises: detecting a target fingerprint in the fingerprint area, wherein the detected fingerprint area of the target fingerprint gradually decreases along a second preset direction; and the gesture starting point is the point where the target fingerprint disappears from the fingerprint area.
7. The method of claim 6, wherein the determining a gesture end point in the second area according to the sliding operation comprises:
determining the intersection point of the line drawn by the sliding operation and the bottom edge of the screen area;
and determining the intersection point of the drawn line and the bottom edge of the screen area as the gesture end point.
8. The method of any of claims 4-7, wherein the gesture information comprises at least one of: coordinate information and a gesture sequence; wherein:
the coordinate information includes at least one of the following coordinates: the starting point coordinates of the gesture starting point and the end point coordinates of the gesture end point;
the gesture sequence includes: an order from the screen area to the fingerprint area, or an order from the fingerprint area to the screen area.
9. The method of claim 8, wherein the performing service processing according to the gesture information comprises:
determining a target service to be processed according to the gesture sequence, wherein the target service comprises any one of the following: a mode adjustment service, a screen brightness adjustment service, a volume adjustment service, and a multimedia playing adjustment service;
and determining an adjustment rule for the target service according to the coordinate information, and executing the target service according to the adjustment rule.
10. A service processing method applied to a terminal, wherein the terminal comprises a first area and a second area, and the method comprises the following steps:
acquiring first fingerprint information in the first area;
acquiring second fingerprint information in the second area;
and determining a target service to be processed according to the first fingerprint information and the second fingerprint information, and processing the target service.
11. the method of claim 10, wherein the first area is a first fingerprint acquisition area and the second area is a second fingerprint acquisition area;
the fingerprint acquisition area includes: an under-screen fingerprint acquisition area and/or a non-under-screen fingerprint acquisition area.
12. The method of claim 11, wherein when the fingerprint acquisition area is an under-screen fingerprint acquisition area, the first fingerprint acquisition area is a first screen area of the screen of the terminal, and the second fingerprint acquisition area is a second screen area of the screen of the terminal.
13. The method according to any of claims 10-12, wherein the target service to be processed comprises any one of the following: a mode adjustment service, a screen brightness adjustment service, a volume adjustment service, and a multimedia playing adjustment service.
14. A service processing method applied to a terminal, wherein the terminal comprises a first area and a second area, and the method comprises the following steps:
acquiring a first gesture track in the first area;
acquiring a second gesture track in the second area;
and determining a target service to be processed according to the first gesture track and the second gesture track, and processing the target service.
15. The method of claim 14, wherein the first area is a first screen area of the terminal; the second area is a second screen area of the terminal;
the first screen area and the second screen area are any two area positions of the screen of the terminal.
16. The method according to any of claims 14-15, wherein the target service to be processed comprises any one of the following: a mode adjustment service, a screen brightness adjustment service, a volume adjustment service, and a multimedia playing adjustment service.
17. A service processing apparatus, wherein the service processing apparatus operates in a terminal, the terminal includes a first area and a second area, and the apparatus comprises:
an acquisition unit, configured to acquire first information in the first area and second information in the second area;
and a processing unit, configured to determine a target service to be processed according to the first information and the second information and to process the target service.
18. The service processing apparatus of claim 17, wherein the acquisition unit specifically includes:
a contact acquisition module, configured to acquire a gesture starting point in the first area and a gesture end point in the second area;
a fingerprint acquisition module, configured to acquire first fingerprint information in the first area and second fingerprint information in the second area;
and a track acquisition module, configured to acquire a first gesture track in the first area and a second gesture track in the second area.
19. The service processing apparatus of claim 18, wherein:
the first information includes: at least one of a gesture start point, first fingerprint information, and a first gesture trajectory;
the second information includes: at least one of a gesture endpoint, second fingerprint information, and a second gesture trajectory.
20. A terminal, comprising a processor, an input device, an output device, and a memory, the processor, the input device, the output device, and the memory being interconnected, wherein the memory is configured to store a computer program comprising program instructions, the processor being configured to invoke the program instructions to perform the method of any of claims 1-16.
21. A computer-readable storage medium, characterized in that the computer storage medium stores a computer program comprising program instructions that, when executed by a processor, cause the processor to perform the method according to any of claims 1-16.
CN201910788301.8A 2019-08-23 2019-08-23 service processing method, service processing device, terminal and medium Pending CN110568989A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910788301.8A CN110568989A (en) 2019-08-23 2019-08-23 service processing method, service processing device, terminal and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910788301.8A CN110568989A (en) 2019-08-23 2019-08-23 service processing method, service processing device, terminal and medium

Publications (1)

Publication Number Publication Date
CN110568989A true CN110568989A (en) 2019-12-13

Family

ID=68776178

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910788301.8A Pending CN110568989A (en) 2019-08-23 2019-08-23 service processing method, service processing device, terminal and medium

Country Status (1)

Country Link
CN (1) CN110568989A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190029578A1 (en) * 2013-10-07 2019-01-31 Masimo Corporation Regional oximetry user interface
CN107493389A (en) * 2017-08-29 2017-12-19 深圳市金立通信设备有限公司 Singlehanded mode implementation method, terminal and computer-readable medium
CN107862196A (en) * 2017-11-29 2018-03-30 努比亚技术有限公司 Fingerprint verification method, mobile terminal and computer-readable recording medium
CN108984095A (en) * 2018-07-04 2018-12-11 Oppo广东移动通信有限公司 gesture interaction method, device, storage medium and electronic equipment

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115082971A (en) * 2022-07-22 2022-09-20 深圳市必凡娱乐科技有限公司 Method for reading touch data information to realize image track tracking
CN115082971B (en) * 2022-07-22 2022-11-08 深圳市必凡娱乐科技有限公司 Method for reading touch data information to realize image track tracking

Similar Documents

Publication Publication Date Title
US10649581B1 (en) Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback
CN105824559B (en) False touch recognition and processing method and electronic equipment
US8633909B2 (en) Information processing apparatus, input operation determination method, and input operation determination program
US9304656B2 (en) Systems and method for object selection on presence sensitive devices
US20120032891A1 (en) Device, Method, and Graphical User Interface with Enhanced Touch Targeting
US20120212438A1 (en) Methods and apparatuses for facilitating interaction with touch screen apparatuses
CN106415472B (en) Gesture control method and device, terminal equipment and storage medium
US10599903B2 (en) Information processing method and electronic device
WO2017166357A1 (en) Icon arrangement method, icon arrangement apparatus and terminal
US20150286283A1 (en) Method, system, mobile terminal, and storage medium for processing sliding event
KR20140078629A (en) User interface for editing a value in place
WO2014121626A1 (en) Displaying method, device and storage medium of mobile terminal shortcuts
US20150309690A1 (en) Method and system for searching information records
CN110806833A (en) Single-hand mode starting method, terminal and computer storage medium
WO2016173307A1 (en) Message copying method and device, and smart terminal
WO2016145827A1 (en) Terminal control method and device
CN108491152B (en) Touch screen terminal control method, terminal and medium based on virtual cursor
CN114327726A (en) Display control method, display control device, electronic equipment and storage medium
CN113268182A (en) Application icon management method and electronic equipment
CN106845190B (en) Display control system and method
CN110568989A (en) service processing method, service processing device, terminal and medium
US10514843B2 (en) Method for displaying virtual keypad overlapping an application and electronic device
WO2017045277A1 (en) Search method, device and apparatus, and non-volatile computer storage medium
CN108021313B (en) Picture browsing method and terminal
EP4095690A1 (en) Application program data processing method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination