CN107203319A - Interface operation control method and system applying the method - Google Patents


Info

Publication number
CN107203319A
CN107203319A
Authority
CN
China
Prior art keywords
user interface
main body
operating main
module
interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610153114.9A
Other languages
Chinese (zh)
Inventor
张文信
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanning Fulian Fugui Precision Industrial Co Ltd
Original Assignee
Nanning Fugui Precision Industrial Co Ltd
Hon Hai Precision Industry Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanning Fugui Precision Industrial Co Ltd and Hon Hai Precision Industry Co Ltd
Priority to CN201610153114.9A
Publication of CN107203319A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

An interface operation control system and method are applied to the user interface of an electronic device. The method includes: sensing whether an operating body approaches the user interface, and outputting a sensing signal when the distance between the operating body and the user interface is less than a predetermined distance; receiving the sensing signal output by the sensing module, and extracting a feature of the operating body from the sensing signal; running data of an application mode pre-stored in a memory that corresponds to the extracted feature, so as to start the application mode; and recognizing the area over which the operating body contacts the user interface, so as to determine the varying strength the operating body applies to the user interface, and executing different instructions on the application mode according to the differences in strength. The invention makes user operation convenient and fast and greatly enriches the user experience.

Description

Interface operation control method and system applying the method
Technical field
The present invention relates to an interface operation control method and to a system applying the method.
Background technology
Nowadays, many consumer electronic products, such as mobile phones and tablet computers, are equipped with touch screens. Through a touch screen, a user can watch videos and pictures, browse web pages, play games, and freely select and manipulate objects on the screen.
In related-art electronic products, a touch sensor is usually arranged below the main button. According to the position at which the user presses the main button, the touch sensor controls left-right switching of the displayed page or left-right movement of objects on the screen. However, when the main button is not triggered, the touch sensor remains in a low-power, non-sensing state, and when the main button is pressed, the touch sensor takes a long time to transition from the non-sensing state to a state in which it can receive input signals. In addition, arranging a touch sensor below the main button not only increases cost but also increases design difficulty.
Summary of the invention
In view of the foregoing, it is necessary to provide an interface operation control method that is simple and versatile to operate, and a system applying the method.
An interface operation control system is applied to the user interface of an electronic device. The system includes:
a sensing module for sensing whether an operating body approaches the user interface, and for outputting a sensing signal when the distance between the operating body and the user interface is less than a predetermined distance;
a recognition module for receiving the sensing signal output by the sensing module and extracting a feature of the operating body from the sensing signal;
a bottom-layer operating module for running, according to the feature of the operating body extracted by the recognition module, data of an application mode pre-stored in a memory that corresponds to the feature, so as to start the application mode; and
an operation module for recognizing the area over which the operating body contacts the user interface, so as to determine the varying strength the operating body applies to the user interface, and for executing different instructions on the application mode according to the differences in strength.
An interface operation control method uses the interface operation control system described above, the method including the following steps:
A. sensing whether an operating body approaches the user interface, and outputting a sensing signal when the distance between the operating body and the user interface is less than a predetermined distance;
B. receiving the sensing signal output by the sensing module and extracting a feature of the operating body from the sensing signal;
C. running, according to the extracted feature of the operating body, data of an application mode pre-stored in a memory that corresponds to the feature, so as to start the application mode; and
D. recognizing the area over which the operating body contacts the user interface, so as to determine the varying strength the operating body applies to the user interface, and executing different instructions on the application mode according to the differences in strength.
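Steps A through D can be sketched as a minimal pipeline. This is only an illustration of the described flow, not the patent's implementation; every name, threshold, and table entry below (`PREDETERMINED_DISTANCE_MM`, `APP_MODES`, the 80 mm² area cutoff) is an assumption:

```python
# Hedged sketch of steps A-D; all names and threshold values are illustrative.
PREDETERMINED_DISTANCE_MM = 50  # step A: proximity threshold (assumed value)

# Step C lookup table: pre-stored feature -> application mode (assumed entries).
APP_MODES = {"right_thumb": "camera", "left_thumb": "telephone"}

def step_a_sense(distance_mm):
    """Step A: emit a sensing signal once the operating body is close enough."""
    if distance_mm < PREDETERMINED_DISTANCE_MM:
        return {"signal": True, "distance_mm": distance_mm}
    return None

def step_b_extract_feature(sensing_signal, captured_gesture):
    """Step B: extract the operating body's feature from the sensing signal."""
    return captured_gesture if sensing_signal else None

def step_c_start_mode(feature):
    """Step C: run the pre-stored application-mode data matching the feature."""
    return APP_MODES.get(feature)

def step_d_execute(contact_area_mm2):
    """Step D: a larger contact area is read as greater strength -> different command."""
    return "strong_command" if contact_area_mm2 > 80 else "weak_command"

def run_pipeline(distance_mm, gesture, contact_area_mm2):
    signal = step_a_sense(distance_mm)
    if signal is None:
        return None
    feature = step_b_extract_feature(signal, gesture)
    mode = step_c_start_mode(feature)
    if mode is None:
        return None
    return mode, step_d_execute(contact_area_mm2)

print(run_pipeline(30, "right_thumb", 120))  # -> ('camera', 'strong_command')
```

A reading outside the predetermined distance (step A failing) or an unknown gesture (step C failing) short-circuits the pipeline, matching the method's ordering of the steps.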
With the interface operation control system and method described above, the sensing module senses whether an operating body approaches the user interface and outputs a sensing signal when the distance between the operating body and the user interface is less than a predetermined distance; the recognition module extracts a feature of the operating body from the sensing signal; the bottom-layer operating module starts the application mode corresponding to the feature; and the operation module executes different instructions on the application mode according to the varying strength applied to the user interface. In this way, the user can quickly enter an operating mode of the system without the finger touching the user interface, and can execute different operation instructions on the application mode through the contact area of the finger on the user interface. Compared with the touch sensor of the prior art, this saves both time and cost, and greatly facilitates and enriches the user's operation and experience.
Brief description of the drawings
Fig. 1 is an architecture diagram of the running environment of an interface operation control system according to a preferred embodiment of the present invention.
Fig. 2 is a schematic diagram of the target sensing zone of the interface operation control system according to a preferred embodiment of the present invention.
Fig. 3 is a block diagram of the operation module of the interface operation control system of the present invention.
Fig. 4 is a schematic diagram of a user's finger pressing on the user interface of the interface operation control system of the present invention and tilting from an initial position in various directions.
Fig. 5 is a flow chart of the interface operation control method of the present invention.
Main element symbol description
Electronic equipment 1
User interface 11
Memory 12
Processor 13
Interface operation control system 10
Induction module 101
Identification module 102
Bottom operating module 103
Operation module 104
Control module 105
Image capturing unit 1011
Proximity sensor 1012
Target sensing zone A
Low detection zone B
High detection zone C
Time detecting module 1040
Area detecting module 1041
Computing module 1042
Function control module 1043
The following embodiments will further illustrate the present invention with reference to the above drawings.
Embodiment
The present invention is described in further detail below with reference to the drawings and a preferred embodiment:
As shown in Fig. 1, which is an architecture diagram of the running environment of the interface operation control system 10 of the present invention, the interface operation control system 10 is applied in an electronic device 1. The electronic device 1 further includes a user interface 11, a memory 12, and a processor 13. It can be understood that the electronic device 1 may be, but is not limited to, an intelligent terminal such as a mobile phone, a tablet computer, a personal digital assistant (PDA), or an electronic reader.
The interface operation control system 10 includes a sensing module 101, a recognition module 102, a bottom-layer operating module 103, an operation module 104, and a control module 105. The specific functions performed by each of the above modules are solidified in the operating system of the electronic device 1 in the form of software program segments, or are stored in a readable storage medium of the electronic device 1 or another storage device, such as the memory 12, and the processor 13 controls the execution of, and interaction between, the modules. It should be noted that each of the above modules may be further divided into multiple sub-modules. In this embodiment, the electronic device 1 is a mobile phone, and the user interface 11 is the mobile phone screen. The interface operation control system 10 is described in detail below.
The control module 105 is used to send a start instruction to the sensing module 101 when the interface operation control system 10 starts. The sensing module 101 may be any one of, or a combination of, an image sensor, an infrared sensor, and an ultrasonic sensor. According to the start instruction sent by the control module 105, the sensing module 101 senses whether an operating body approaches the user interface 11 and how far it is from the user interface 11, and outputs sensing information. Specifically, the image sensor, infrared sensor, and ultrasonic sensor respectively sense the image, infrared waves, or ultrasonic waves of an operating body, and thereby sense whether the operating body approaches the user interface 11 and its distance from the user interface 11.
Referring also to Fig. 2, in this embodiment the sensing module 101 includes an image capturing unit 1011, such as the front camera of a mobile phone, and a proximity sensor 1012. The operating body is illustrated here by taking the user's hand as an example; of course, the operating body may also be another part of the user's body. The region within which the sensing module 101 can acquire information is a target sensing zone A. When the user's hand approaches the user interface 11, the image capturing unit 1011 shoots the operating body in the target sensing zone A and outputs the shooting information. The proximity sensor 1012 detects the distance of the operating body relative to the user interface 11 and outputs distance information. The aforementioned sensing information includes the shooting information and the distance information.
The target sensing zone A is divided into a low detection zone B and a high detection zone C according to the distance of the operating body relative to the user interface 11. The low detection zone B is the region in which the operating body has entered the target sensing zone A but produces no instruction to the user interface 11, that is, the region in which the user's hand has entered the target sensing zone A but the distance between the hand and the user interface 11 is greater than a predetermined distance. The high detection zone C is the region in which the operating body is close to the user interface 11 and can produce an instruction to the user interface 11; in this region, the distance of the operating body relative to the user interface 11 is less than the predetermined distance.
The control module 105 controls the shooting frequency of the image capturing unit 1011 according to the above sensing information. When the control module 105 determines from the sensing information that the user's hand is located in the low detection zone B, the control module 105 controls the image capturing unit 1011 to shoot the operating body at a lower frequency.
When the control module 105 determines from the sensing information that the operating body is located in the high detection zone C, the control module 105 controls the image capturing unit 1011 to shoot the user's hand at a higher frequency, so as to obtain the shooting information more frequently.
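The zone subdivision of Fig. 2 and the frequency switching above amount to thresholding a proximity reading. The sketch below illustrates the idea; the zone boundaries and frame rates (`TARGET_ZONE_MM`, `HIGH_ZONE_MM`, 5 fps / 30 fps) are assumed values, not figures from the patent:

```python
# Illustrative sketch of the low/high detection zones; all thresholds are assumptions.
TARGET_ZONE_MM = 200  # outer boundary of target sensing zone A (assumed)
HIGH_ZONE_MM = 50     # "predetermined distance": boundary between zones B and C (assumed)

def classify_zone(distance_mm):
    """Map a proximity-sensor reading onto zone A's subdivisions."""
    if distance_mm > TARGET_ZONE_MM:
        return "outside"          # not in target sensing zone A at all
    if distance_mm > HIGH_ZONE_MM:
        return "low_detection_B"  # inside A, but no instruction is produced yet
    return "high_detection_C"     # close enough to produce instructions

def shooting_fps(zone):
    """Zone B -> lower shooting frequency; zone C -> higher frequency (assumed rates)."""
    return {"outside": 0, "low_detection_B": 5, "high_detection_C": 30}[zone]
```

The power-saving behaviour described in the text then falls out of the mapping: the camera only runs at the higher rate while the hand is inside zone C.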
Under the control of the control module 105, the recognition module 102 receives the shooting information output by the image capturing unit 1011 and extracts an operating feature of the operating body, such as a gesture or a facial feature, from the shooting information. This embodiment takes a gesture as an example; the gesture includes any one of the left hand, the right hand, and the thumb, index finger, middle finger, ring finger, or little finger of the left or right hand, or any combination thereof. Different operating features of the operating body correspond to different application modes. In this embodiment, different gestures of the user correspond to different application modes, for example: the right thumb corresponds to a camera mode, the left thumb corresponds to a telephone mode, the combination of the right index finger and the left index finger corresponds to a mail mode, the left middle finger corresponds to a music mode, and so on. The application modes may be stored in the memory 12 in advance. The user can preset any combination of gestures to correspond to different application modes as needed, and the correspondence between those gestures and the application modes is stored in the memory 12. The recognition module 102 compares the extracted gesture feature with the gesture features pre-stored in the memory 12 to find the application mode corresponding to the gesture.
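The gesture-to-mode correspondence stored in the memory 12 can be pictured as a simple lookup table. The entries below mirror the examples given in the text, but the representation itself (frozensets of hypothetical finger labels) is only one possible design, not the patent's data format:

```python
# Sketch of the pre-stored gesture -> application-mode table (memory 12).
# frozenset keys make multi-finger combinations order-independent.
GESTURE_TO_MODE = {
    frozenset({"right_thumb"}): "camera",
    frozenset({"left_thumb"}): "telephone",
    frozenset({"right_index", "left_index"}): "mail",
    frozenset({"left_middle"}): "music",
}

def find_application_mode(extracted_fingers):
    """Compare the extracted gesture feature against the pre-stored table;
    return None when no matching application mode exists."""
    return GESTURE_TO_MODE.get(frozenset(extracted_fingers))

print(find_application_mode(["left_index", "right_index"]))  # prints "mail"
```

Returning `None` for an unknown gesture corresponds to the flow chart's fallback of resuming low-frequency sensing when no pre-stored mode matches.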
The control module 105 controls the bottom-layer operating module 103 to read the data of the application mode found by the recognition module 102, so as to start the application mode and present the application mode on the user interface 11.
As shown in Fig. 3, the operation module 104 includes a time detecting module 1040, an area detecting module 1041, a computing module 1042, and a function control module 1043. Fig. 4 schematically shows the change in the contact area with the user interface 11 during operation events in which the user's finger, starting from an initial position A on the user interface 11, presses and tilts to the right, presses and tilts to the left, presses and tilts forward, or presses and tilts backward. The time detecting module 1040 detects the total duration of an operation event, that is, the total time from when the user's finger contacts the user interface 11, presses and tilts in the direction of movement, to when the user's finger leaves the user interface 11. The area detecting module 1041 recognizes the area over which the user's finger contacts the user interface 11 during the operation event; the area includes the initial area over which the user's finger contacts the user interface 11 and the area after the finger presses and tilts on the user interface 11 over time. The computing module 1042 determines the strength and tilt direction the finger applies to the user interface 11 according to the time detected by the time detecting module 1040 and the area and/or area change detected by the area detecting module 1041. The function control module 1043 executes different instruction operations on the application mode according to the strength and tilt direction determined by the computing module 1042.
Fig. 5 shows a flow chart of a preferred embodiment of the interface operation control method of the present invention.
Step S601: the control module 105 controls the image capturing unit 1011 of the sensing module 101 to shoot the target sensing zone A at a lower frequency, so as to sense whether an operating body appears in the target sensing zone A. If so, step S602 is executed; if not, this step is re-executed. In this embodiment, the operating body is the user's hand; of course, it may also be another part of the body, such as the user's face.
Step S602: the control module 105 determines, from the distance information of the sensing module 101, whether the distance of the operating body relative to the user interface 11 is less than a predetermined distance, so as to determine whether the operating body has entered the high detection zone C. If the distance is less than the predetermined distance, it is determined that the operating body has entered the high detection zone C, and step S603 is executed; otherwise, this step is re-executed.
Step S603: the control module 105 controls the image capturing unit 1011 of the sensing module 101 to shoot at a higher frequency, so as to shoot the operating body more frequently and output the shooting information.
Step S604: the control module 105 controls the recognition module 102 to extract an operating feature from the shooting information. The operating feature may be a gesture, for example: any one of the left hand, the right hand, and the thumb, index finger, middle finger, ring finger, or little finger of the left or right hand, or any combination thereof.
Step S605: the recognition module 102 compares the extracted operating feature with the operating features pre-stored in the memory 12, and determines whether the current gesture feature exists in the memory 12 and corresponds to an application mode in the memory 12. If it exists, step S606 is executed; otherwise, the flow returns to step S601. In this embodiment, the application modes corresponding to different gestures of the user may be stored in the memory 12 in advance, for example: the right thumb corresponds to a camera mode, the left thumb corresponds to a telephone mode, the combination of the right index finger and the left index finger corresponds to a mail mode, the left middle finger corresponds to a music mode, and so on. The user can preset any combination of gestures to correspond to different application modes as needed, and the application modes corresponding to those gestures are stored in the memory 12.
Step S606: the bottom-layer operating module 103 reads the data of the application mode corresponding to the gesture found by the recognition module 102, so as to start the application mode and present the application mode on the user interface 11.
Step S607: the time detecting module 1040 of the operation module 104 detects the time over which the user's finger contacts the user interface 11.
Step S608: the area detecting module 1041 recognizes the area over which the user's finger contacts the user interface 11; the area includes the initial area over which the user's finger contacts the user interface 11 and the area change after the finger presses and tilts on the user interface 11 over time.
Step S609: the computing module 1042 determines the operating characteristics the finger applies to the user interface 11, such as the operating strength and the operating direction, according to the time detected by the time detecting module 1040 and the area and/or area change detected by the area detecting module 1041. In this embodiment, the longer the detected time and the larger the detected area, the greater the operating strength the computing module 1042 determines the finger to apply to the user interface 11; the extension direction of the detected area change is the operating direction the user inputs to the electronic device 1. For example, when the user's finger tilts to the left on the user interface 11, the area detecting module 1041 detects that the contact area moves to the left from the initial position, and it is determined that the operation the user inputs to the electronic device 1 has a leftward direction value.
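One way to picture the computation in step S609 is to fold a short time series of contact samples into a strength value and a tilt direction. The weighting formula and the centroid-movement test below are assumptions for illustration only, not the patent's actual rule; the patent states only that longer time and larger area mean greater strength, and that the direction of area movement gives the operating direction:

```python
# Hedged sketch of a strength/direction computation for one operation event.
# Each sample: (t_seconds, contact_area_mm2, centroid_x, centroid_y).
def analyze_event(samples):
    duration = samples[-1][0] - samples[0][0]
    area_growth = samples[-1][1] - samples[0][1]
    # Assumed rule: longer contact and larger area growth -> greater strength.
    strength = duration * 10 + max(area_growth, 0)
    # Direction of the contact-area movement, from centroid displacement.
    dx = samples[-1][2] - samples[0][2]
    dy = samples[-1][3] - samples[0][3]
    if abs(dx) >= abs(dy):
        direction = "right" if dx > 0 else "left"
    else:
        direction = "down" if dy > 0 else "up"
    return strength, direction

# A finger pressing harder (area 40 -> 90) while sliding left (x 10 -> 0):
event = [(0.0, 40, 10, 10), (0.5, 70, 4, 11), (1.0, 90, 0, 12)]
print(analyze_event(event))  # -> (60.0, 'left')
```

On this example the growing area dominates the strength term and the leftward centroid drift yields the leftward direction value described in the text.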
Step S610: the function control module 1043 executes different operation instructions on the application mode according to the operating characteristics determined by the computing module 1042, namely the operating strength and the operating direction, so as to realize different functions. In this embodiment, when the application mode the user is currently operating is a picture mode, and the user's finger contacts the user interface 11 and tilts to the left, the area detecting module 1041 transmits the detected area and area change to the computing module 1042. According to the size of the contact area and the area change, the computing module 1042 outputs a first instruction carrying the corresponding operating strength and operating direction to the function control module 1043. According to the first instruction, the function control module 1043 flips at a slower speed to the earlier photos one, two, three or more pages before the current page. As the contact area increases, indicating that the strength the user's finger applies to the user interface 11 is greater, the function control module 1043 correspondingly flips at a faster speed to more photos before the current page, until the finger releases and leaves the user interface 11. When the user's finger contacts the user interface 11 and tilts to the right, the operating principle is similar to the leftward-tilt operation described above, except that the computing module 1042 outputs a second instruction carrying a corresponding strength value and direction value to the function control module 1043 according to the size of the contact area and the area change. According to the second instruction, the function control module 1043 flips at a slower speed to the later photos one, two, three or more pages after the current page. As the contact area increases, indicating that the operating strength the user's finger applies to the user interface 11 is greater, the function control module 1043 correspondingly flips at a faster speed to more photos after the current page, until the finger releases and leaves the user interface 11.
When the user's finger contacts the user interface 11 and tilts toward the top of the screen, the operating principle is similar to the above, except that the computing module 1042 outputs a third instruction carrying a corresponding strength value and direction value to the function control module 1043 according to the size of the contact area and the area change. According to the third instruction, the function control module 1043 magnifies the photo currently presented on the user interface 11 with a smaller magnification factor. As the contact area increases, indicating that the strength the user's finger applies to the user interface 11 is greater, the function control module 1043 correspondingly magnifies the photo currently presented on the user interface 11 with a larger magnification factor. When the user's finger contacts the user interface 11 and tilts toward the bottom of the screen, the operating principle is similar to the above, except that the computing module 1042 outputs a fourth instruction carrying a corresponding strength value and direction value to the function control module 1043. According to the fourth instruction, the function control module 1043 shrinks the photo currently presented on the user interface 11 with a smaller reduction factor. As the contact area increases, indicating that the strength the user's finger applies to the user interface 11 is greater, the function control module 1043 correspondingly shrinks the photo currently presented on the user interface 11 with a larger reduction factor.
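The four instruction cases of step S610 reduce to a dispatch on direction plus strength: the tilt direction selects the command family (first through fourth instruction), and the strength scales its rate. A hedged sketch, with the strength threshold and the speed labels invented for illustration:

```python
# Sketch of the function control module 1043's dispatch in picture mode.
# The 50-unit strength threshold and "slow"/"fast" labels are assumptions.
def picture_mode_command(direction, strength):
    speed = "fast" if strength > 50 else "slow"
    if direction == "left":
        return ("flip_to_previous_photos", speed)  # first instruction
    if direction == "right":
        return ("flip_to_next_photos", speed)      # second instruction
    if direction == "up":
        return ("zoom_in", speed)                  # third instruction
    if direction == "down":
        return ("zoom_out", speed)                 # fourth instruction
    return ("no_op", "none")

print(picture_mode_command("left", 80))  # hard leftward tilt -> fast flip backward
```

The same dispatch shape would carry over to the web-page and volume examples mentioned later, with only the command families swapped out.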
In this way, by detecting operation events in which the operating body tilts to the left or right on the user interface 11, the operation module 104 controls the left-right switching of the current page, so that the user can quickly find any photo in the album mode of the electronic device 1; by detecting operation events in which the operating body tilts up or down, the operation module 104 zooms the photo or image in or out.
Of course, it can be understood that the operation module 104 is also applicable to web browsing, where a finger tilt controls the left-right switching of the current web page, to video or music, where a finger tilt controls the change in volume, and to other modules with similar functions.
In summary, the user can quickly enter an operating mode of the system without the finger touching the user interface, and can execute different operation instructions on the application mode through the tilt of the finger on the user interface. Compared with the touch sensor of the prior art, this saves both time and cost, and greatly facilitates and enriches the user's operation and experience.
The above embodiments are merely illustrative of the technical solutions of the present invention and are not restrictive. Although the present invention has been described in detail with reference to the above preferred embodiments, those skilled in the art should understand that modifications or equivalent substitutions of the technical solutions of the present invention do not depart from the spirit and scope of the technical solutions of the present invention.

Claims (10)

1. An interface operation control system applied to a user interface of an electronic device, wherein the system includes:
a sensing module for sensing whether an operating body approaches the user interface, and for outputting a sensing signal when the distance between the operating body and the user interface is less than a predetermined distance;
a recognition module for receiving the sensing signal output by the sensing module and extracting a feature of the operating body from the sensing signal;
a bottom-layer operating module for running, according to the feature of the operating body extracted by the recognition module, data of an application mode pre-stored in a memory that corresponds to the feature, so as to start the application mode; and
an operation module for recognizing the area over which the operating body contacts the user interface, so as to determine the varying strength the operating body applies to the user interface, and for executing different instructions on the application mode according to the differences in strength.
2. The interface operation control system of claim 1, wherein the operation module is further used to recognize the contact area that changes after the operating body presses and tilts on the user interface, to determine the tilt direction the operating body applies to the user interface according to the changed contact area, and to execute different instructions on the application mode according to different tilt directions.
3. The interface operation control system of claim 2, wherein the operation module is further used to detect the duration over which the operating body contacts the user interface, and to determine a greater strength as the duration extends, so as to execute different instructions on the application mode according to the strength and the tilt direction.
4. The interface operation control system of claim 3, wherein the operation module is used to flip to more photos before or after the current page: when the user's finger contacts the user interface and tilts to the left, the operation module flips to more photos before the current page, and as the finger contact area increases, the operation module correspondingly speeds up the flipping to more photos before the current page; when the user's finger contacts the user interface and tilts to the right, the operation module flips to more photos after the current page, and as the finger contact area increases, the operation module correspondingly speeds up the flipping to more photos after the current page.
5. The interface operation control system of claim 4, wherein the operation module is used to zoom the photo currently presented on the user interface in or out: when the user's finger contacts the user interface and tilts toward the top of the screen, the operation module magnifies the photo currently presented on the user interface, and as the finger contact area increases, the operation module correspondingly increases the magnification factor to magnify the photo currently presented on the user interface; when the user's finger contacts the user interface and tilts toward the bottom of the screen, the operation module shrinks the photo currently presented on the user interface, and as the finger contact area increases, the operation module correspondingly increases the reduction factor to shrink the photo currently presented on the user interface.
6. The interface operation control system of claim 1, wherein the sensing module detects whether the operating body approaches the user interface by sensing the image, infrared waves, or ultrasonic waves of the operating body.
7. An interface operation control method applied to the interface operation control system of claim 1, wherein the method includes the following steps:
A. sensing whether an operating body approaches the user interface, and outputting a sensing signal when the distance between the operating body and the user interface is less than a predetermined distance;
B. receiving the sensing signal output by the sensing module and extracting a feature of the operating body from the sensing signal;
C. running, according to the extracted feature of the operating body, data of an application mode pre-stored in a memory that corresponds to the feature, so as to start the application mode; and
D. recognizing the area over which the operating body contacts the user interface, so as to determine the varying strength the operating body applies to the user interface, and executing different instructions on the application mode according to the differences in strength.
8. The interface operation control method as claimed in claim 7, characterized in that step D further comprises: identifying the contact area that changes when the operating main body exerts force and tilts on the user interface, distinguishing the tilt direction the operating main body applies to the user interface according to the changed contact area, and executing different instructions on the application model according to the different tilt directions.
9. The interface operation control method as claimed in claim 8, characterized in that step D further comprises: detecting the duration for which the operating main body remains in contact with the user interface, and scaling up the applied intensity and tilt response as the duration lengthens, so as to execute different instructions on the application model.
10. The interface operation control method as claimed in claim 9, characterized in that in step A, whether an operating main body is approaching the user interface is detected by sensing an image of, infrared waves from, or ultrasonic waves reflected by the operating main body.
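Method steps A through D of claim 7 form a simple pipeline: sense proximity, extract a feature of the operating body, start the matching application model, then classify press intensity from the contact area. The sketch below illustrates that flow under assumed thresholds and an assumed feature-to-model table; none of these values or names appear in the patent:

```python
# Hypothetical end-to-end sketch of method steps A-D in claim 7.
# All thresholds and the feature->application-model mapping are illustrative.

PROXIMITY_THRESHOLD_MM = 30.0  # step A: the "predetermined" approach distance
APP_MODELS = {"finger": "photo_browser", "stylus": "sketchpad"}  # assumed table

def step_a_sense(distance_mm: float):
    """Step A: emit an induced signal once the operating body is close enough."""
    if distance_mm < PROXIMITY_THRESHOLD_MM:
        return {"distance_mm": distance_mm}
    return None  # no signal: the body is still too far from the interface

def step_b_extract(signal: dict) -> str:
    """Step B: extract a feature of the operating body from the induced signal.
    A trivial stand-in: a very near return is read as a finger."""
    return "finger" if signal["distance_mm"] < 15.0 else "stylus"

def step_c_start(feature: str) -> str:
    """Step C: start the pre-stored application model matching the feature."""
    return APP_MODELS[feature]

def step_d_intensity(contact_area_mm2: float) -> str:
    """Step D: classify press intensity from the contact area."""
    return "strong" if contact_area_mm2 > 80.0 else "light"

def handle(distance_mm: float, contact_area_mm2: float):
    """Run steps A-D; return (application model, intensity) or None."""
    signal = step_a_sense(distance_mm)
    if signal is None:
        return None
    model = step_c_start(step_b_extract(signal))
    return model, step_d_intensity(contact_area_mm2)
```

A body hovering beyond the threshold produces no induced signal and nothing runs; once it approaches and touches, the extracted feature selects the application model and the contact area selects which instruction the model receives.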
CN201610153114.9A 2016-03-17 2016-03-17 The system of interface operation control method and application this method Pending CN107203319A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610153114.9A CN107203319A (en) 2016-03-17 2016-03-17 The system of interface operation control method and application this method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610153114.9A CN107203319A (en) 2016-03-17 2016-03-17 The system of interface operation control method and application this method

Publications (1)

Publication Number Publication Date
CN107203319A true CN107203319A (en) 2017-09-26

Family

ID=59903964

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610153114.9A Pending CN107203319A (en) 2016-03-17 2016-03-17 The system of interface operation control method and application this method

Country Status (1)

Country Link
CN (1) CN107203319A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108762564A (en) * 2018-05-30 2018-11-06 维沃移动通信有限公司 A kind of method of controlling operation thereof and terminal device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102023772A (en) * 2010-11-26 2011-04-20 中兴通讯股份有限公司 Capacitive touch screen signal processing method and device
TW201411469A (en) * 2012-07-16 2014-03-16 Samsung Electronics Co Ltd Touch and gesture input-based control method and terminal therefor
CN103869947A (en) * 2012-12-14 2014-06-18 联想(北京)有限公司 Method for controlling electronic device and electronic device
TWI493426B (en) * 2008-04-27 2015-07-21 Htc Corp Electronic device and user interface display method thereof

Similar Documents

Publication Publication Date Title
CN105912163B (en) Physical button component, terminal, touch-control response method and device
RU2636104C1 (en) Method and device for implementing touch-sensitive button and identifying fingerprints and terminal device
EP2817704B1 (en) Apparatus and method for determining the position of a user input
US9696767B2 (en) Command recognition method including determining a hold gesture and electronic device using the method
CN109428969B (en) Edge touch method and device of double-screen terminal and computer readable storage medium
KR102165818B1 (en) Method, apparatus and recovering medium for controlling user interface using a input image
CN106951884A (en) Gather method, device and the electronic equipment of fingerprint
CN104781779A (en) Method and apparatus for creating motion effect for image
CN106815546A (en) fingerprint identification method and device
CN102906675A (en) Information input device, information input method and program
US8248366B2 (en) Image display device and operation method thereof
WO2017032006A1 (en) Method and apparatus for displaying information
TW201626204A (en) Touch control photography method and touch terminal thereof
CN103809895B (en) It is a kind of can dynamic generation button mobile terminal and method
EP3046317A1 (en) Method and apparatus for capturing images
TWI609314B (en) Interface operating control system method using the same
CN108021322A (en) Control the method and touch control terminal that display interface slides
CN107203319A (en) The system of interface operation control method and application this method
CN105278753B (en) Touch-control response method and device
US10114469B2 (en) Input method touch device using the input method, gesture detecting device, computer-readable recording medium, and computer program product
CN108021327A (en) Control the method and touch control terminal that display interface slides
CN105468194A (en) Touch response method and apparatus
CN107765962B (en) Display interface zooming control method and device and terminal
US20170322638A1 (en) Input device and input method
KR102289497B1 (en) Method, apparatus and recovering medium for controlling user interface using a input image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20180226

Address after: No. 3 Plant, Phase 5, China-ASEAN Enterprise Headquarters Base, No. 18 Headquarters Road, Hi-Tech Zone, Guangxi Zhuang Autonomous Region, China, 530007

Applicant after: NANNING FUGUI PRECISION INDUSTRIAL CO., LTD.

Address before: No. 3 Plant, Phase 5, China-ASEAN Enterprise Headquarters Base, No. 18 Headquarters Road, Hi-Tech Zone, Guangxi Zhuang Autonomous Region, China, 530007

Applicant before: NANNING FUGUI PRECISION INDUSTRIAL CO., LTD.

Applicant before: Hon Hai Precision Industry Co., Ltd.

WD01 Invention patent application deemed withdrawn after publication
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20170926