CN105117020A - Edge interactive operation processing method and mobile terminal - Google Patents

Edge interactive operation processing method and mobile terminal

Info

Publication number
CN105117020A
CN105117020A (application CN201510613259.8A)
Authority
CN
China
Prior art keywords
user operation
gesture
mobile terminal
report point
fit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201510613259.8A
Other languages
Chinese (zh)
Inventor
Li Cheng (李成)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nubia Technology Co Ltd
Original Assignee
Nubia Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nubia Technology Co Ltd filed Critical Nubia Technology Co Ltd
Priority to CN201510613259.8A priority Critical patent/CN105117020A/en
Publication of CN105117020A publication Critical patent/CN105117020A/en
Pending legal-status Critical Current

Landscapes

  • Telephone Function (AREA)

Abstract

The invention discloses a method for processing edge interactive operations and a mobile terminal. The method includes: monitoring in real time whether a user operation occurs in an enabled sensing region; if so, acquiring the report-point parameters of the user operation; judging whether the acquired report-point parameters match a preset FiT-defined gesture; and, if they do, determining from the acquired report-point parameters that the user operation is a valid operation and triggering the FiT function corresponding to the user operation. The technical scheme improves the usability of FiT and reduces power consumption.

Description

Method and mobile terminal for processing edge interactive operations
Technical field
The present invention relates to intelligent terminal technology, and in particular to a method and a mobile terminal for processing edge interactive operations.
Background technology
Bezel-less mobile terminals have become the most advanced mobile-terminal manufacturing technology, bringing users a strong visual impact and a richer interactive experience. Edge interaction (FiT), pioneered on bezel-less terminals, places a rich set of interactive modes and functions at the screen edge and is highly convenient for users. While users embrace and look forward to edge interaction, they also place higher demands on its availability and ease of use. As the technology spreads further, FiT is gradually being applied to narrow-bezel mobile terminals as well, such as the My Prague handset. Although the application of FiT technology greatly improves interaction and user experience on mobile terminals, all current FiT application scenarios require the screen to be lit before FiT can be used, which to a certain extent limits the usage scenarios of FiT and reduces its ease of use. For example, the "grip to unlock" scheme in FiT requires the screen to be lit with the power key before the edge "grip to unlock" function can take effect. Although this unlocking manner has some advantages over traditional unlocking (such as fingerprint or digit entry after lighting the screen), it still requires lighting the screen in advance, so the user experience is poor and the ease of use is limited.
The relationship between the sensors and the processor is shown in Fig. 1: an integrated sensor group (Sensors) is connected to an application processor (AP), where the AP generally uses an ARM-based CPU. If a function such as FiT is to be usable in the screen-off state, then, because the sensors are attached directly to the AP, the AP must remain open (always on) and continuously process the signals transmitted by the sensors in order to detect that the function can be used, which causes a serious power-consumption problem.
With the development of mobile-terminal technology, the processing speed of the central processing unit (CPU) of a mobile terminal keeps increasing, and so does its power consumption. Sensors such as the Global Positioning System (GPS), light, compass, acceleration and proximity sensors, and hardware components such as Bluetooth, wireless local area network (WLAN), infrared, near field communication (NFC) and the touch panel (TP) all consume considerable power. Every operation of a sensor or hardware component passes through the CPU, and keeping a sensor or hardware component open for a long time consumes a great deal of electricity.
Therefore, to save power, the built-in sensors and hardware components of a mobile terminal are normally kept closed. However, keeping the sensors and hardware closed limits their application scenarios; for example, the terminal cannot navigate in the screen-off state, nor can certain applications be woken quickly by a specific gesture in the screen-off state.
Summary of the invention
To solve the above technical problem, the present invention provides a method and a mobile terminal for processing edge interactive operations, which can improve the ease of use of FiT and reduce power consumption.
To achieve the object of the invention, the present invention provides a method for processing edge interactive operations, comprising:
monitoring in real time whether a user operation occurs in an enabled sensing region;
when a user operation is detected in the enabled sensing region, acquiring the report-point parameters of the user operation;
judging whether the acquired report-point parameters of the user operation match a preset FiT-defined gesture; when they do, determining from the acquired report-point parameters that the user operation is a valid operation;
and triggering the FiT function corresponding to the user operation.
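As an illustrative sketch only (not part of the claimed implementation), the steps above can be pictured as a small pipeline: acquire report points, match them against a preset gesture table, validate against a threshold and device state, then trigger. All names, the gesture rule, and the threshold value below are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class ReportPoint:
    x: float
    y: float
    pressure: float  # value sampled by the pre-designated sensor

def is_edge_swipe(points):
    # hypothetical rule: every point near the left edge, net downward motion
    return all(p.x < 10 for p in points) and points[-1].y > points[0].y

# hypothetical preset FiT gesture table: gesture name -> matching predicate
FIT_GESTURES = {"edge_swipe": is_edge_swipe}

PRESSURE_THRESHOLD = 0.2  # hypothetical validity threshold

def process_edge_operation(points, fit_usable=True):
    """Monitor -> acquire -> match -> validate -> trigger, as in the method steps."""
    if not points:                      # no user operation in the sensing region
        return None
    for name, matches in FIT_GESTURES.items():
        if matches(points):             # report points match a preset gesture
            # validity: sensed values reach the preset threshold and the
            # terminal is in a state in which this FiT function is usable
            if fit_usable and all(p.pressure >= PRESSURE_THRESHOLD for p in points):
                return name             # trigger the corresponding FiT function
    return None
```

For example, a run of points down the left edge with sufficient pressure would return `"edge_swipe"`, while an empty point list or a sub-threshold press would return `None`.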
Further, before the method, the method also comprises: presetting the sensing region;
and opening a pre-designated sensor and the preset sensing region of the touch panel (TP) of the mobile terminal.
Further, judging whether the acquired report-point parameters of the user operation match the preset FiT-defined gesture comprises:
determining the gesture of the user operation from the acquired report-point parameters;
judging whether the determined gesture of the user operation is among the preset FiT-defined gestures;
when the determined gesture is among the preset FiT-defined gestures, judging that the acquired report-point parameters match a preset FiT-defined gesture;
when the determined gesture is not among the preset FiT-defined gestures, judging that the acquired report-point parameters do not match a preset FiT-defined gesture.
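A minimal sketch of this two-stage judgment, assuming hypothetical classification heuristics and gesture names not specified by the patent: first map the report points to a gesture name, then test membership in the preset set.

```python
# Hypothetical preset FiT-defined gesture set
PRESET_FIT_GESTURES = {"edge_swipe", "grip", "edge_tap"}

def classify_gesture(points):
    """Map (x, y) report points to a gesture name (heuristics are illustrative)."""
    if len(points) == 1:
        return "edge_tap"
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    # little horizontal spread but substantial vertical travel: an edge swipe
    if max(xs) - min(xs) < 10 and max(ys) - min(ys) >= 30:
        return "edge_swipe"
    return "unknown"

def matches_preset(points):
    # stage 1: determine the gesture; stage 2: check it is among the presets
    return classify_gesture(points) in PRESET_FIT_GESTURES
```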
Further, determining from the acquired report-point parameters that the user operation is a valid operation includes, but is not limited to:
the value of the user operation collected by the pre-designated sensor being greater than or equal to a corresponding preset threshold; or,
the mobile terminal being in a state in which this FiT function is usable.
Further, the report-point parameters comprise: the number of report points, and/or the value of the user operation collected by the pre-designated sensor, and/or the movement radius, and/or the movement velocity, and/or the movement direction, and/or the movement duration.
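The parameters listed above can be grouped into one record; the sketch below is only one possible structuring, with every field optional to mirror the claim's "and/or" phrasing (field names are hypothetical):

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ReportParams:
    """Report-point parameters named in the claim; each is optional ('and/or')."""
    point_count: Optional[int] = None        # number of report points
    sensor_value: Optional[float] = None     # value collected by the designated sensor
    radius: Optional[float] = None           # movement radius
    velocity: Optional[float] = None         # movement velocity
    direction: Optional[Tuple[float, float]] = None  # movement direction (unit vector)
    duration_ms: Optional[float] = None      # movement duration
```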
Further, the FiT-defined gestures include, but are not limited to: an edge swipe gesture, a repeated edge swipe gesture, a bilateral (double-edge) swipe gesture, a grip gesture, a hold-and-shake gesture, an edge tap, and a landscape-grip gesture.
The present invention also provides a mobile terminal which, when the screen-off edge-interaction (FiT) function is enabled, comprises: a monitoring module, an acquisition module, a judging module and a processing module; wherein,
the monitoring module is configured to monitor in real time whether a user operation occurs in the enabled sensing region and, when a user operation is detected, to send an acquisition notice to the acquisition module;
the acquisition module is configured to receive the acquisition notice and acquire the report-point parameters of the user operation;
the judging module is configured to judge whether the acquired report-point parameters match a preset FiT-defined gesture and, when they do, to determine from the acquired report-point parameters that the user operation is a valid operation and send a response notice to the processing module;
the processing module is configured to receive the response notice and trigger the FiT function corresponding to the user operation.
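The cooperation of the four modules above can be sketched as objects passing notices to one another; the class names, the toy classification rule, and the return strings are all hypothetical illustration, not the patented implementation:

```python
class AcquisitionModule:
    def on_acquire_notice(self, raw_event):
        # acquire the report-point parameters from the raw touch event
        return {"points": raw_event.get("points", [])}

class JudgingModule:
    def __init__(self, preset_gestures):
        self.preset = preset_gestures
    def judge(self, params):
        # match the report-point parameters against the preset FiT gestures
        gesture = "edge_tap" if len(params["points"]) == 1 else "unknown"
        return gesture if gesture in self.preset else None

class ProcessingModule:
    def on_response_notice(self, gesture):
        return f"trigger FiT function for {gesture}"

class MonitoringModule:
    """Wires the pipeline: monitor -> acquire -> judge -> process."""
    def __init__(self, acq, judge, proc):
        self.acq, self.judge, self.proc = acq, judge, proc
    def on_user_operation(self, raw_event):
        params = self.acq.on_acquire_notice(raw_event)    # acquisition notice
        gesture = self.judge.judge(params)                # validity judgment
        if gesture is not None:
            return self.proc.on_response_notice(gesture)  # response notice
        return None
```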
Further, the mobile terminal also comprises a control module configured to open a pre-designated sensor and the preset sensing region of the touch panel (TP) of the mobile terminal to which the control module belongs.
Further, the judging module judging whether the acquired report-point parameters of the user operation match the preset FiT-defined gesture comprises:
determining the gesture of the user operation from the acquired report-point parameters;
judging whether the determined gesture of the user operation is among the preset FiT-defined gestures;
when the determined gesture is among the preset FiT-defined gestures, judging that the acquired report-point parameters match a preset FiT-defined gesture;
when the determined gesture is not among the preset FiT-defined gestures, judging that the acquired report-point parameters do not match a preset FiT-defined gesture.
Further, the judging module determining from the acquired report-point parameters that the user operation is a valid operation includes, but is not limited to:
the value of the user operation collected by the pre-designated sensor being greater than or equal to a corresponding preset threshold; or,
the mobile terminal being in a state in which this FiT function is usable.
Further, the report-point parameters comprise: the number of report points, and/or the value of the user operation collected by the pre-designated sensor, and/or the movement radius, and/or the movement velocity, and/or the movement direction, and/or the movement duration.
Further, the FiT-defined gestures include, but are not limited to: an edge swipe gesture, a repeated edge swipe gesture, a bilateral (double-edge) swipe gesture, a grip gesture, a hold-and-shake gesture, an edge tap, and a landscape-grip gesture.
The technical scheme of the present invention comprises: monitoring in real time whether a user operation occurs in an enabled sensing region; when a user operation is detected in the enabled sensing region, acquiring the report-point parameters of the user operation; judging whether the acquired report-point parameters match a preset FiT-defined gesture; when they do, determining from the acquired report-point parameters that the user operation is a valid operation; and triggering the FiT function corresponding to the user operation. The technical scheme of the present invention improves the ease of use of FiT and reduces power consumption.
Brief description of the drawings
The accompanying drawings described herein provide a further understanding of the present invention and form a part of this application. The schematic embodiments of the present invention and their description explain the present invention and do not constitute an improper limitation of it. In the drawings:
Fig. 1 is a connection diagram of an existing application processor and sensors;
Fig. 2 is a schematic diagram of the hardware structure of a mobile terminal implementing embodiments of the present invention;
Fig. 3 is a schematic diagram of a communication system supporting communication between mobile terminals of the present invention;
Fig. 4 is a flowchart of the method for processing edge interactive operations of the present invention;
Fig. 5 is a schematic structural diagram of the mobile terminal of the present invention;
Fig. 6(a) is a schematic diagram of partitioning the touch region in an embodiment of the present invention;
Fig. 6(b) is another schematic diagram of partitioning the touch region in an embodiment of the present invention;
Fig. 7 is a connection diagram of the application processor and sensors of the present invention;
Fig. 8 is a flowchart of a first embodiment of the touch operation method of the mobile terminal of the present invention;
Fig. 9 is a flowchart of a second embodiment of the touch operation method of the mobile terminal of the present invention;
Fig. 10 is a frame diagram of the region-C event handling system of the present invention;
Fig. 11 is a flowchart of the tap-action recognition method of the mobile terminal of the present invention;
Fig. 12 is a flowchart of the region-C slide judgment method in a first embodiment of the present invention;
Fig. 13 is a flowchart of the region-C slide judgment method in a second embodiment of the present invention;
Fig. 14 is a schematic diagram of contact movement in two kinds of region C in an embodiment of the present invention.
Detailed description of the embodiments
The technical scheme of the present invention is described in detail below with reference to the drawings and embodiments.
A mobile terminal implementing embodiments of the present invention is now described with reference to the drawings. In the following description, suffixes such as "module", "part" or "unit" used for denoting elements are used only to facilitate the description of the invention and have no specific meaning in themselves; therefore, "module" and "part" can be used interchangeably.
A mobile terminal may be implemented in various forms. For example, the terminals described in the present invention may include mobile terminals such as mobile phones, smart phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable media players) and navigation devices, as well as fixed terminals such as digital TVs and desktop computers. In the following, the terminal is assumed to be a mobile terminal. However, those skilled in the art will appreciate that, apart from the elements used specifically for mobile purposes, the structure according to the embodiments of the present invention can also be applied to terminals of the fixed type.
Fig. 2 is a schematic diagram of the hardware structure of a mobile terminal implementing embodiments of the present invention.
The mobile terminal 100 may comprise a wireless communication unit 110, an A/V (audio/video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190 and the like. Fig. 2 shows a mobile terminal with various components, but it should be understood that not all of the illustrated components are required; more or fewer components may be implemented instead. The elements of the mobile terminal are described in detail below.
The wireless communication unit 110 typically comprises one or more components allowing radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit may comprise at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114 and a location information module 115.
The broadcast receiving module 111 receives broadcast signals and/or broadcast-related information from an external broadcast management server via a broadcast channel. The broadcast channel may comprise a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and sends broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and sends them to the terminal. The broadcast signal may comprise a TV broadcast signal, a radio broadcast signal, a data broadcast signal and the like, and may further comprise a broadcast signal combined with a TV or radio broadcast signal. The broadcast-related information may also be provided via a mobile communication network, in which case it may be received by the mobile communication module 112. The broadcast signal may exist in various forms; for example, it may exist in the form of the electronic program guide (EPG) of digital multimedia broadcasting (DMB) or the electronic service guide (ESG) of digital video broadcast-handheld (DVB-H). The broadcast receiving module 111 can receive signal broadcasts using various types of broadcast systems; in particular, it can receive digital broadcasts using digital broadcast systems such as digital multimedia broadcasting-terrestrial (DMB-T), digital multimedia broadcasting-satellite (DMB-S), digital video broadcasting-handheld (DVB-H), the MediaFLO forward link media data broadcast system, and integrated services digital broadcasting-terrestrial (ISDB-T). The broadcast receiving module 111 can be constructed to suit the various broadcast systems providing broadcast signals as well as the above digital broadcast systems. The broadcast signals and/or broadcast-related information received via the broadcast receiving module 111 may be stored in the memory 160 (or another type of storage medium).
The mobile communication module 112 sends radio signals to and/or receives radio signals from at least one of a base station (e.g., an access point, a Node B, etc.), an external terminal and a server. Such radio signals may comprise voice call signals, video call signals, or various types of data sent and/or received according to text and/or multimedia messages.
The wireless Internet module 113 supports wireless Internet access for the mobile terminal. The module may be internally or externally coupled to the terminal. The wireless Internet access technologies involved may include WLAN (wireless local area network, Wi-Fi), WiBro (wireless broadband), WiMAX (worldwide interoperability for microwave access), HSDPA (high-speed downlink packet access) and the like.
The short-range communication module 114 is a module for supporting short-range communication. Examples of short-range communication technology include Bluetooth™, radio-frequency identification (RFID), Infrared Data Association (IrDA), ultra-wideband (UWB), ZigBee™ and the like.
The location information module 115 is a module for checking or acquiring the location information of the mobile terminal. A typical example of the location information module is GPS (Global Positioning System). With current technology, the GPS module 115 calculates distance information from three or more satellites together with accurate time information, applies triangulation to the calculated information, and thereby calculates accurate three-dimensional current location information in terms of longitude, latitude and altitude. The method currently used for calculating location and time information uses three satellites and corrects the errors of the calculated location and time information with one further satellite. In addition, the GPS module 115 can calculate velocity information by continuously calculating the current location in real time.
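To illustrate the triangulation idea only, here is a two-dimensional analogue: given distances to three known anchor points, the position follows from a small linear system. Real GPS solves the three-dimensional problem and additionally estimates the receiver clock error with a fourth satellite; the function below is a simplified sketch, not the module's actual algorithm.

```python
def trilaterate_2d(anchors, dists):
    """Position from distances to three non-collinear anchors (2-D analogue
    of the satellite calculation; real GPS is 3-D plus a clock-bias term)."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    # subtracting circle equations pairwise yields two linear equations in (x, y)
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # nonzero when anchors are not collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```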
The A/V input unit 120 is for receiving audio or video signals. The A/V input unit 120 may comprise a camera 121 and a microphone 122. The camera 121 processes image data of still pictures or video obtained by an image capture device in a video capture mode or an image capture mode, and the processed image frames may be displayed on a display unit 151. The image frames processed by the camera 121 may be stored in the memory 160 (or another storage medium) or sent via the wireless communication unit 110; two or more cameras 121 may be provided depending on the structure of the mobile terminal. The microphone 122 can receive sound (audio data) in operating modes such as a phone call mode, a recording mode and a voice recognition mode, and can process such sound into audio data. In the phone call mode, the processed audio (voice) data may be converted into a format that can be sent to a mobile communication base station via the mobile communication module 112. The microphone 122 may implement various types of noise-cancelling (or noise-suppressing) algorithms to cancel (or suppress) noise or interference produced in the process of receiving and sending audio signals.
The user input unit 130 may generate key input data according to commands input by the user to control various operations of the mobile terminal. The user input unit 130 allows the user to input various types of information and may comprise a keyboard, a dome switch, a touch pad (e.g., a touch-sensitive component that detects changes in resistance, pressure, capacitance and the like caused by being touched), a jog wheel, a jog switch and the like. In particular, when the touch pad is superimposed on the display unit 151 as a layer, a touch screen may be formed.
The sensing unit 140 detects the current state of the mobile terminal 100 (e.g., the open or closed state of the mobile terminal 100), the location of the mobile terminal 100, the presence or absence of user contact with the mobile terminal 100 (i.e., touch input), the orientation of the mobile terminal 100, the acceleration or deceleration movement and direction of the mobile terminal 100 and the like, and generates commands or signals for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type mobile phone, the sensing unit 140 can sense whether the slide-type phone is open or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 supplies power and whether the interface unit 170 is coupled to an external device. The sensing unit 140 may comprise a proximity sensor 141, which will be described below in conjunction with the touch screen.
The interface unit 170 serves as an interface through which at least one external device can connect with the mobile terminal 100. For example, the external device may comprise a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port and the like. The identification module may store various information for authenticating the user of the mobile terminal 100 and may comprise a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM) and the like. In addition, the device having the identification module (hereinafter, "identification device") may take the form of a smart card; hence, the identification device can be connected with the mobile terminal 100 via a port or other connecting means. The interface unit 170 can be used to receive input (e.g., data, information, power, etc.) from an external device and transfer the received input to one or more elements within the mobile terminal 100, or can be used to transfer data between the mobile terminal and an external device.
In addition, when the mobile terminal 100 is connected with an external cradle, the interface unit 170 can serve as a path through which power is supplied from the cradle to the mobile terminal 100, or as a path through which various command signals input from the cradle are transferred to the mobile terminal. The various command signals or power input from the cradle can serve as signals for identifying whether the mobile terminal is correctly mounted on the cradle. The output unit 150 is constructed to provide output signals (e.g., audio signals, video signals, alarm signals, vibration signals, etc.) in a visual, audible and/or tactile manner. The output unit 150 may comprise a display unit 151, an audio output module 152, an alarm unit 153 and the like.
The display unit 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in the phone call mode, the display unit 151 may display a user interface (UI) or graphical user interface (GUI) related to the call or other communication (e.g., text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in the video call mode or the image capture mode, the display unit 151 may display the captured and/or received image, a UI or GUI showing the video or image and related functions, and the like.
Meanwhile, when the display unit 151 and the touch pad are superimposed on each other as layers to form a touch screen, the display unit 151 can serve as both an input device and an output device. The display unit 151 may comprise at least one of a liquid crystal display (LCD), a thin-film-transistor LCD (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display and the like. Some of these displays may be constructed to be transparent to allow the user to view from outside; these may be called transparent displays, a typical transparent display being, for example, a TOLED (transparent organic light-emitting diode) display. Depending on the particular intended embodiment, the mobile terminal 100 may comprise two or more display units (or other display devices); for example, the mobile terminal may comprise an external display unit (not shown) and an internal display unit (not shown). The touch screen can be used to detect touch input pressure as well as touch input position and touch input area.
The audio output module 152 can, when the mobile terminal is in modes such as the call signal receiving mode, the call mode, the recording mode, the voice recognition mode and the broadcast receiving mode, convert audio data received by the wireless communication unit 110 or stored in the memory 160 into audio signals and output them as sound. Moreover, the audio output module 152 can provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 152 may comprise a loudspeaker, a buzzer and the like.
The alarm unit 153 can provide output informing of the occurrence of an event of the mobile terminal 100. Typical events may comprise call reception, message reception, key signal input, touch input and the like. Besides audio or video output, the alarm unit 153 can provide output in different manners to notify the occurrence of an event. For example, the alarm unit 153 can provide output in the form of vibration; when a call, a message or some other incoming communication is received, the alarm unit 153 can provide a tactile output (i.e., vibration) to notify the user. By providing such tactile output, the user can recognize the occurrence of various events even when the user's mobile phone is in the user's pocket. The alarm unit 153 can also provide output notifying the occurrence of an event via the display unit 151 or the audio output module 152.
The memory 160 can store software programs for the processing and control operations performed by the controller 180, or temporarily store data that has been output or is to be output (e.g., a phone book, messages, still images, video, etc.). Moreover, the memory 160 can store data about the various modes of vibration and audio signals output when a touch is applied to the touch screen.
The memory 160 may comprise at least one type of storage medium, including flash memory, hard disk, multimedia card, card-type memory (e.g., SD or DX memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, magnetic disk, optical disk and the like. Moreover, the mobile terminal 100 can cooperate, via a network connection, with a network storage device that performs the storage function of the memory 160.
The controller 180 typically controls the overall operation of the mobile terminal. For example, the controller 180 performs control and processing related to voice calls, data communication, video calls and the like. In addition, the controller 180 may comprise a multimedia module 181 for reproducing (or playing back) multimedia data; the multimedia module 181 may be configured within the controller 180 or configured separately from it. The controller 180 can perform pattern recognition processing to recognize handwriting input or drawing input performed on the touch screen as characters or images.
The power supply unit 190 receives external power or internal power under the control of the controller 180 and provides the appropriate power needed to operate each element and component.
The various embodiments described herein can be implemented in a computer-readable medium using, for example, computer software, hardware or any combination thereof. For hardware implementation, the embodiments described herein can be implemented using at least one of an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field-programmable gate array (FPGA), a processor, a controller, a microcontroller, a microprocessor, and an electronic unit designed to perform the functions described herein; in some cases, such embodiments can be implemented in the controller 180. For software implementation, embodiments such as processes or functions can be implemented with separate software modules that allow at least one function or operation to be performed. The software code can be implemented by a software application (or program) written in any suitable programming language, and can be stored in the memory 160 and executed by the controller 180.
Thus far, the mobile terminal has been described in terms of its functions. In the following, for brevity, a slide-type mobile terminal will be taken as an example among the various types of mobile terminals, such as folder-type, bar-type, swing-type and slide-type mobile terminals. Accordingly, the present invention can be applied to any type of mobile terminal and is not limited to the slide-type mobile terminal.
The mobile terminal 100 shown in Fig. 2 can be constructed to operate with wired and wireless communication systems that send data via frames or packets, as well as with satellite-based communication systems.
A communication system in which a mobile terminal according to the present invention can operate is now described with reference to Fig. 3.
Such communication system can use different air interfaces and/or Physical layer.Such as, the air interface used by communication system comprises such as frequency division multiple access (FDMA), time division multiple access (TDMA) (TDMA), CDMA (CDMA) and universal mobile telecommunications system (UMTS) (especially, Long Term Evolution (LTE)), global system for mobile communications (GSM) etc.As non-limiting example, description below relates to cdma communication system, but such instruction is equally applicable to the system of other type.
Referring to Fig. 3, a CDMA wireless communication system may comprise a plurality of mobile terminals 100, a plurality of base stations (BS) 270, base station controllers (BSC) 275, and a mobile switching center (MSC) 280. The MSC 280 is constructed to interface with a public switched telephone network (PSTN) 290. The MSC 280 is also constructed to interface with the BSCs 275, which may be coupled to the base stations 270 via backhaul links. The backhaul links may be constructed according to any of several known interfaces, including, for example, E1/T1, ATM, IP, PPP, frame relay, HDSL, ADSL, or xDSL. It will be appreciated that the system as shown in Fig. 3 may comprise a plurality of BSCs 275.
Each BS 270 may serve one or more sectors (or regions), each sector covered by an omnidirectional antenna or by an antenna pointed in a particular direction radially away from the BS 270. Alternatively, each sector may be covered by two or more antennas for diversity reception. Each BS 270 may be constructed to support a plurality of frequency assignments, each frequency assignment having a particular spectrum (e.g., 1.25 MHz, 5 MHz, etc.).
The intersection of a sector and a frequency assignment may be called a CDMA channel. A BS 270 may also be called a base transceiver subsystem (BTS) or another equivalent term. In such a case, the term "base station" may be used to broadly denote a single BSC 275 together with at least one BS 270. A base station may also be called a "cell site". Alternatively, each sector of a particular BS 270 may be called a cell site.
As shown in Fig. 3, a broadcast transmitter (BT) 295 transmits a broadcast signal to the mobile terminals 100 operating within the system. The broadcast receiving module 111 as shown in Fig. 2 is provided at the mobile terminal 100 to receive the broadcast signal transmitted by the BT 295. In Fig. 3, several global positioning system (GPS) satellites 300 are shown. The satellites 300 help locate at least one of the plurality of mobile terminals 100.
In Fig. 3, a plurality of satellites 300 are depicted, but it is understood that useful positioning information may be obtained with any number of satellites. The GPS module 115 as shown in Fig. 2 is typically constructed to cooperate with the satellites 300 to obtain the desired positioning information. Instead of or in addition to GPS tracking technology, other technologies capable of tracking the position of the mobile terminal may be used. In addition, at least one GPS satellite 300 may selectively or additionally process satellite DMB transmission.
As a typical operation of the wireless communication system, the BSs 270 receive reverse link signals from the various mobile terminals 100. The mobile terminals 100 typically engage in calls, messaging, and other types of communication. Each reverse link signal received by a given BS 270 is processed by that BS 270, and the resulting data is forwarded to the associated BSC 275. The BSC provides call resource allocation and mobility management functions, including coordination of the soft handoff procedures between BSs 270. The BSC 275 also routes the received data to the MSC 280, which provides the additional routing service of interfacing with the PSTN 290. Similarly, the PSTN 290 interfaces with the MSC 280, the MSC interfaces with the BSCs 275, and the BSCs 275 in turn control the BSs 270 to transmit forward link signals to the mobile terminals 100.
Based on the above-described mobile terminal hardware structure and communication system, various embodiments of the method of the present invention are proposed.
Fig. 4 is a flowchart of the method for processing edge interactive operations of the present invention, applied when the edge interaction (FiT) function is enabled. As shown in Fig. 4, the method comprises:
Step 401: monitor in real time whether there is a user operation in an activated induction region.
Further, before this, the method also comprises: activating a pre-designated sensor and the touch panel (TP) of a preset induction region of the mobile terminal. Here, monitoring in real time whether there is a user operation in the activated induction region comprises: monitoring in real time whether the touch panel (TP) of the preset induction region produces a report point. The pre-designated sensor includes but is not limited to: a pressure sensor, and/or a three-axis gyroscope, and/or a light sensor, and/or an infrared sensor, and/or an acceleration sensor.
Here, the preset induction region of the mobile terminal is a special partition 20, preferably located at the edge of the touch area as shown in Fig. 6(a), with the remaining area of the touch area being the common partition 10. Alternatively, as shown in Fig. 6(b), the common partition 10 comprises an A area and a B area, where the A area is the operable area used for detecting touch point coordinates, and the B area is the virtual key area used for detecting the Menu key, Home key, Return key, etc.; the special partition 20 is located at the edge of the touch area on both sides of the A area.
Here, as shown in Fig. 7, the pre-designated sensor and the TP of the preset induction region may be integrated into a micro control unit (MCU, MicroController Unit), or a digital signal processor (DSP) may be employed as the sensor-side processor, in order to activate the pre-designated sensor and the TP of the preset induction region of the mobile terminal. When the pre-designated sensor and the TP of the preset induction region are integrated into the MCU, the application processor (AP) comprises the main processor (CPU). Even when the CPU is idle, the MCU can control the sensors and collect data; its main function in the system is to process the various kinds of information coming from the various sensors, waking the CPU from sleep mode only when necessary, thereby reducing system power consumption. The DSP, by contrast, is a dedicated chip integrating special-purpose computing capability; its distinguishing feature is that it can process data immediately, which makes the DSP most suitable for applications that cannot tolerate any delay, such as the voice, audio, and image processing common on mobile terminals. The MCU in Fig. 7 keeps collecting data, and the AP is aroused from the dormant state only when the function is actually invoked; in most cases only the MCU needs to remain active while the AP sleeps, which saves a great deal of energy. It should be noted that how to integrate the pre-designated sensor and the TP of the preset induction region into the MCU is a conventional technical means well known to those skilled in the art; it is not repeated here and is not used to limit the present invention.
Further, before this, the method also comprises: setting the correspondence between the FiT-defined gestures and the FiT functions.
Step 402: when a user operation is detected in the activated induction region, obtain the report point parameters of the user operation.
That is, in this step, when the touch panel (TP) of the preset induction region is detected to produce report points, the parameters of the produced report points are obtained.
Here, the report point parameters comprise: the number of report points, and/or the values of the user operation collected by the pre-designated sensor, and/or the moving radius, and/or the movement speed, and/or the movement direction, and/or the movement duration.
Here, the user operation and the touch operation are as shown in Figs. 8-10. As shown in Fig. 8, a first embodiment of the touch operation method of the mobile terminal of the present invention is proposed, the method comprising the following steps:
Step S10: divide the touch area of the mobile terminal into two partitions, a common partition and a special partition.
The touch area of the mobile terminal is the region used to receive the user's touch operations, and the present invention divides this touch area into two partitions, namely a common partition and a special partition. The common partition, like the touch area in the prior art, normally receives the user's touch operations and executes the corresponding instructions; the special partition is a partition specially defined by the present invention, independent of the common partition, for preventing inadvertent operation.
When the mobile terminal sets up the partitions, the special partition may be set first, determining its position and size (such as width and length); after the special partition is set, the remaining region of the touch area is the common partition. The special partition is as shown in Fig. 6(a) and Fig. 6(b).
Step S11: virtualize two input devices and allocate one input device for each partition.
In a specific implementation, the touch screen driver of the mobile terminal registers two input devices (input) via the input_register_device() call during initialization, such as input device 0 (input0) and input device 1 (input1), and allocates one input device for each partition via the input_allocate_device() call; for example, the common partition corresponds to input device 0 and the special partition corresponds to input device 1.
After these two input devices are registered, the upper layer identifies whether the current user touch area is the common partition or the special partition according to the name of the input device reported by the driver layer; different partitions are processed differently by the upper layer, as will be introduced in detail in steps S14 and S16.
The upper layer of the present invention generally refers to the framework (Framework) layer, the application layer, and so on. Mobile terminal systems, such as Android, iOS, and other custom systems, generally comprise a bottom layer (physical layer and driver layer) and an upper layer (framework layer and application layer). The signal flow is as follows: the touch panel receives the user's touch operation; the integrated circuit (IC) in the touch panel converts the physical pressing signal into an electrical signal and transfers it to the driver layer; the driver layer resolves the pressed position, obtaining the concrete coordinates of the location point, the duration, the pressure, and other parameters, and uploads these parameters to the framework layer; the framework layer communicates with the driver layer through the corresponding interface, receives the input device (input) reported by the driver layer, and parses it, thereby selecting whether or not to respond to this input device; valid input is then passed upward to the specific application concerned, so that the application layer executes different application operations according to different events.
Step S12: receive a touch operation. The mobile terminal receives the touch operation through the driver layer.
Step S13: judge whether the touch operation occurs in the special partition.
A touch operation is generally an operation such as a click or a slide, and each touch operation is composed of one or more touch points. The mobile terminal can therefore judge whether the touch operation occurs in the common partition or the special partition by detecting which region the touch points of the touch operation fall into. In a specific implementation, the driver layer of the mobile terminal obtains the coordinates of the touch points of the touch operation and judges which partition the coordinates fall into. When the coordinates of a touch point fall into the special partition, it is judged that the touch operation occurs in the special partition, and step S14 is entered; when the coordinates of the touch point do not fall into the special partition but fall into the common partition, it is judged that the touch operation occurs in the common partition, and step S16 is entered.
Step S14: report the touch operation through the input device corresponding to the special partition.
Specifically, when a touch point of the touch operation falls into the special partition, the driver layer of the mobile terminal reports this touch point through the input device corresponding to the special partition.
After the framework (Framework) layer receives the reported event (which comprises the input device, the touch point parameters, etc.), it first identifies which region is concerned according to the name of the input device. Since the driver layer (kernel) in the previous step identified the touch as occurring in the special partition, the input device the driver layer reports to the framework layer is input1 rather than input0. That is, the framework layer does not need to judge which partition the current touch point is in, nor does it need to judge the size and position of the partitions; these judgment operations are completed on the driver layer. Moreover, besides reporting which specific input device is concerned, the driver layer also reports the parameters of the touch point to the framework layer, such as the pressing duration, the position coordinates, the pressure, and so on.
It should be noted that after receiving the reported event, the framework layer switches from a single channel to a multi-channel mechanism and reports the event to the application layer. Specifically: a channel is first registered and the reported event is transmitted through this channel; the event is monitored by a listener and then passed through different channels to the corresponding application modules, producing different application operations. The application modules comprise common applications such as the camera and contacts, and different application operations are produced; for example, in the camera application, when the user clicks in the special partition, operations such as focusing, shooting, or adjusting camera parameters can be produced. It should be noted that before the reported event is delivered to the listener it travels on a single channel; after the listener monitors it, the reported event travels on multiple channels, and these channels exist simultaneously. The benefit is that the event can be passed to different application modules at the same time, with different application modules producing different operation responses.
Alternatively, the above steps may be implemented as follows: using an object-oriented approach, define the classes and implementations of the common partition and the special partition; after it is judged that the special partition is concerned, convert the touch point coordinates at different resolutions into LCD coordinates through the EventHub function; define single-channel functions (such as serverchannel and clientchannel, etc.) whose role is, after the reported event is received, to pass the event through this channel to the event manager (TouchEventManager); through the monitoring of the listener, the event is then passed through multiple channels, simultaneously or one by one, to multiple responding application modules, or to only one of them; the application modules are, for example, the camera, the gallery, and so on, with different application modules producing corresponding operations. Of course, the above steps may also be implemented in other ways; the embodiments of the present invention do not limit this.
Step S15: ignore this touch operation.
In the present embodiment, the special processing of the touch operation is to ignore it. In a specific implementation, after the framework (Framework) layer of the mobile terminal receives the touch point reported by the input device corresponding to the special partition, it stops continuing to report this touch point, i.e., stops the normal reporting flow, and shields this touch point, ignoring this touch operation. That is, the mobile terminal does not react to this touch operation, thereby achieving the effect of preventing inadvertent operation.
Step S16: report the touch operation through the input device corresponding to the common partition.
Specifically, when a touch point of the touch operation falls into the common partition, the driver layer of the mobile terminal reports this touch point through the input device corresponding to the common partition.
After the framework (Framework) layer receives the reported event (which comprises the input device, the touch point parameters, etc.), it first identifies which region is concerned according to the name of the input device. Since the driver layer (kernel) in the previous step identified the touch as occurring in the common partition, the input device the driver layer reports to the framework layer is input0 rather than input1. That is, the framework layer does not need to judge which partition the current touch point is in, nor does it need to judge the size and position of the partitions; these judgment operations are completed on the driver layer. Moreover, besides reporting which specific input device is concerned, the driver layer also reports the parameters of the touch point to the framework layer, such as the pressing duration, the position coordinates, the pressure, and so on.
It should be noted that after receiving the reported event, the framework layer no longer switches from a single channel to multiple channels but directly uses the multi-channel mechanism: the event is monitored by a listener and passed through different channels to the corresponding application modules, producing different application operations. The application modules comprise common applications such as the camera and contacts; for example, in the camera application, when the user clicks in the common partition, operations such as focusing, shooting, or adjusting camera parameters can be produced. It should be noted that in this case the multiple channels do not exist simultaneously; only a single channel is produced in a unit of time.
Alternatively, the above steps may be implemented as follows: after it is judged that the common partition is concerned, convert the touch point coordinates at different resolutions into LCD coordinates through the EventHub function; define single-channel functions (such as serverchannel and clientchannel, etc.) whose role is, after the reported event is received, to pass the event through this channel to the event manager (TouchEventManager); through the monitoring of the listener, the event is then passed one by one through multiple channels to multiple responding application modules, or to only one of them; the application modules are, for example, the camera, the gallery, and so on, with different application modules producing corresponding operations. Of course, the above steps may also be implemented in other ways; the embodiments of the present invention do not limit this.
Step S17: process this touch operation normally.
Processing this touch operation normally means processing it according to the normal flow of the prior art. For example, after the framework layer of the mobile terminal receives the touch point reported by the input device corresponding to the common partition, it continues to report this touch point according to the normal flow so as to execute the corresponding operation instruction.
Thus, the present embodiment separates out the region of the touch area prone to inadvertent user operation as the special partition, and ignores the touch operations reported by this special partition, achieving the effect of preventing inadvertent operation.
As shown in Fig. 9, a second embodiment of the touch operation method of the mobile terminal of the present invention is proposed, the method comprising the following steps:
Step S20: divide the touch area of the mobile terminal into two partitions, a common partition and a special partition.
Step S21: virtualize two input devices and allocate one input device for each partition.
Step S22: receive a touch operation.
In the present embodiment, steps S20-S22 are similar to steps S10-S12 in the first embodiment, respectively, and are not repeated here.
Step S23: judge whether the touch operation occurs in the special partition.
A touch operation is generally an operation such as a click or a slide, and each touch operation is composed of one or more touch points. The mobile terminal can therefore judge whether the touch operation occurs in the common partition or the special partition by detecting which region the touch points of the touch operation fall into. In a specific implementation, the driver layer of the mobile terminal obtains the coordinates of the touch points of the touch operation and judges which partition the coordinates fall into. When the coordinates of a touch point fall into the special partition, it is judged that the touch operation occurs in the special partition, and step S24 is entered; when the coordinates of the touch point fall into the common partition, it is judged that the touch operation occurs in the common partition, and step S26 is entered.
Step S24: report the touch operation through the input device corresponding to the special partition.
Specifically, when a touch point of the touch operation falls into the special partition, the driver layer of the mobile terminal reports this touch point through the input device corresponding to the special partition.
Step S25: generate a special effect according to this touch operation.
After the framework (Framework) layer receives the reported event (which comprises the input device, the touch point parameters, etc.), it first identifies which region is concerned according to the name of the input device. Since the driver layer (kernel) identified, as in the general step above, that the touch occurred in the special partition, the input device the driver layer reports to the framework layer is input1 rather than input0. That is, the framework layer does not need to judge which partition the current touch point is in, nor does it need to judge the size and position of the partitions; these judgment operations are completed on the driver layer. Moreover, besides reporting which specific input device is concerned, the driver layer also reports the parameters of the touch point to the framework layer, such as the pressing duration, the position coordinates, the pressure, and so on.
In the present embodiment, the special processing of the touch operation is to generate a special effect according to it. In a specific implementation, after the framework layer of the mobile terminal receives the touch point reported by the input device corresponding to the special partition, it stops continuing to report this touch point, i.e., stops the normal reporting flow, and generates a special effect at the position of this touch point or at an arbitrary position. The special effect is a visual effect such as a color change, a pattern change, or a brightness change. The effect of preventing inadvertent operation is thus likewise achieved.
Step S26: report the touch operation through the input device corresponding to the common partition.
Specifically, when a touch point of the touch operation falls into the common partition, the driver layer of the mobile terminal reports this touch point through the input device corresponding to the common partition.
Step S27: process this touch operation normally.
After the framework (Framework) layer receives the reported event (which comprises the input device, the touch point parameters, etc.), it first identifies which region is concerned according to the name of the input device. Since the driver layer (kernel) in the previous step identified the touch as occurring in the common partition, the input device the driver layer reports to the framework layer is input0 rather than input1. That is, the framework layer does not need to judge which partition the current touch point is in, nor does it need to judge the size and position of the partitions; these judgment operations are completed on the driver layer. Moreover, besides reporting which specific input device is concerned, the driver layer also reports the parameters of the touch point to the framework layer, such as the pressing duration, the position coordinates, the pressure, and so on.
Processing this touch operation normally means processing it according to the normal flow of the prior art. For example, after the framework layer of the mobile terminal receives the touch point reported by the input device corresponding to the common partition, it continues to report this touch point according to the normal flow so as to execute the corresponding operation instruction.
Thus, by not processing the touch operations reported by the special partition normally but instead generating special effects according to them, the present embodiment likewise achieves the prevention of inadvertent operation.
Besides the special processing described in the previous embodiments, such as ignoring or special-effect processing, the touch operations reported by the special partition may also be specially processed according to a user-defined processing mode, i.e., other self-defined operation instructions, including lighting the screen, extinguishing the screen, adjusting the volume, adjusting the brightness, and so on.
With reference to Fig. 10, the touch operation flow of the present invention is further described in another way. For simplicity, in Fig. 10 the common partition is abbreviated as the A area and the special partition as the C area. The reporting flow of a touch event is as follows:
The driver layer receives the touch event through the physical hardware, such as the touch screen, judges whether the touch operation occurs in the A area or the C area, and reports the event through the device file node of the A area or the C area. The Native layer reads events from the device files of the A area and the C area and processes them, for example by calculating coordinates; it distinguishes A-area events from C-area events by the device id, and finally distributes the A-area and C-area events separately. A-area events follow the native flow and are processed in the common way, that is, through the multi-channel mechanism; C-area events are distributed through the C-area dedicated channel registered in advance with the Native layer, input through the Native port, and output through the system port to the C-area event receiving end and the system service, where the C-area events are monitored by a listener and then reported to each application through the C-area event receiving external interface.
Step 403: judge whether the obtained report point parameters of the user operation satisfy a preset FiT-defined gesture; when the obtained report point parameters of the user operation satisfy a preset FiT-defined gesture, judge from the obtained report point parameters that this user operation is a valid operation.
Here, the FiT-defined gestures include but are not limited to: an edge sliding gesture, a repeated edge sliding gesture, a bilateral sliding gesture, a grip gesture, a one-handed grip-and-shake gesture, an edge click, and a landscape-orientation holding gesture.
Here, judging whether the obtained report point parameters of the user operation satisfy a preset FiT-defined gesture comprises:
determining the gesture of the user operation according to the obtained report point parameters;
judging whether the determined gesture of the user operation is among the preset FiT-defined gestures;
when the determined gesture of the user operation is among the preset FiT-defined gestures, judging that the obtained report point parameters of the user operation satisfy the preset FiT-defined gesture;
when the determined gesture of the user operation is not among the preset FiT-defined gestures, judging that the obtained report point parameters of the user operation do not satisfy the preset FiT-defined gesture.
Here, Fig. 11 is a flowchart of the click action recognition method of the mobile terminal of the present invention. As shown in Fig. 11, the method comprises:
S31: divide the touch area of this mobile terminal into a common partition and a special partition.
S32: when a touch operation is sensed on this special partition, obtain the touch time and the displacement of this touch operation.
The mobile terminal receives the touch operation through the driver layer. A touch operation is generally an operation such as a click, a double-click, a slide, or a selection, and each touch operation is composed of one or more touch points. The mobile terminal can therefore judge whether the touch operation occurs in the common partition or the special partition by detecting which region the touch points fall into. In a specific implementation, the driver layer of the mobile terminal obtains the coordinates of the touch points and judges which partition they fall into. When the coordinates of a touch point fall into the special partition, the touch operation is considered to have been sensed on the special partition, and the driver layer of the mobile terminal reports this touch point through the input device corresponding to the special partition.
In this step, this touch operation is one complete action, lasting from the moment the user's pressing signal on the special partition is received until the moment the lifting signal is received.
The touch time T of this touch operation can be obtained from the moment t1 of its starting touch point A(X0, Y0) and the moment t2 of its ending touch point B(X1, Y1), i.e., T = t2 - t1.
The displacement D of this touch operation can be obtained from the movement track from the starting touch point A(X0, Y0) to the ending touch point B(X1, Y1).
After the framework (Framework) layer receives the reported event (which comprises the input device, the touch point parameters, etc.), it first identifies which region is concerned according to the name of the input device. Since the driver layer (kernel) in the previous step identified the touch as occurring in the special partition, the input device the driver layer reports to the framework layer is input1 rather than input0. That is, the framework layer does not need to judge which partition the current touch point is in, nor does it need to judge the size and position of the partitions; these judgment operations are completed on the driver layer. Moreover, besides reporting which specific input device is concerned, the driver layer also reports the parameters of the touch point to the framework layer, such as the pressing duration, the position coordinates, the pressure, and so on.
S33: judge whether the displacement of the touch operation is less than or equal to a preset distance threshold; if so, perform step S34.
The preset distance threshold can be set according to user needs, for example 2 millimeters. In this step, if the displacement of the touch operation is less than or equal to the preset distance threshold, the touch operation is considered to meet the first condition of a click action; if the displacement is greater than the preset distance threshold, the displacement is considered too large for a click, and the touch operation may instead be a move operation or the like.
S34: judge whether the touch time of the touch operation is less than or equal to a preset time threshold; if so, perform step S35.
The preset time threshold can likewise be set according to user needs. In this step, if the touch time T of the touch operation is less than or equal to the preset time threshold, the touch operation is considered to meet the second condition of a click action; if T is greater than the preset time threshold, the touch is considered too long for a click, and the operation may instead be a long press, a selection, or the like.
S35: determine that the touch operation is a click action.
Only when the displacement of the touch operation is less than or equal to the preset distance threshold and its touch time is less than or equal to the preset time threshold is the operation determined to be a click action; this allows touch operations to be judged accurately and avoids misjudgment.
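As a hedged sketch of steps S33-S35 (the function and field names are illustrative, not from the patent), the two checks combine into a single predicate: a touch operation is a click action only when both its displacement and its touch time stay within the preset thresholds.

```c
#include <assert.h>
#include <stdbool.h>

/* Illustrative model of one touch operation; names are hypothetical. */
typedef struct {
    double displacement_mm; /* distance between the down and up points */
    double touch_time_ms;   /* time between press and release */
} touch_op;

/* S33 + S34 + S35: a click requires small displacement AND short duration. */
bool is_click(touch_op op, double max_dist_mm, double max_time_ms)
{
    if (op.displacement_mm > max_dist_mm) /* S33 fails: likely a move */
        return false;
    if (op.touch_time_ms > max_time_ms)   /* S34 fails: likely a long press */
        return false;
    return true;                          /* S35: click action */
}
```

With the patent's example distance threshold of 2 millimeters, a 1.5 mm, 100 ms touch would be classified as a click, while a 3 mm movement or a 500 ms press would not.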
Wherein, as shown in Figure 12, the slide recognition method for the virtual frame region in this embodiment is applicable to both dividing modes of the C district, and specifically comprises the following steps:
501: sense a touch event concurrent with a contact in the virtual frame region.
502: report the current coordinate position of the contact in real time, and judge, from the initial coordinate position and the current coordinate position of the contact, whether the touch event concurrent with the contact is a slide event.
503: if the touch event is judged to be a slide event, further judge the direction attribute of the slide event.
As shown in Figure 13, when the C district is divided in a fixed manner, the slide recognition method for the C district comprises the following steps:
601: define the position and width of the C district, dividing the whole touchable region of the touch screen into C districts located at the two edges and an A district located in the middle.
602: when the touch screen driver initializes, allocate two virtual input devices (named input0 and input1 respectively) with input_allocate_device(), and register them with input_register_device(), where input0 corresponds to the C district and input1 corresponds to the A district.
Once these two virtual input devices are registered, the upper layer can identify, from the name of the virtual input device reported by the driver layer, whether the user's current touch area is the C district or the A district; the upper layer handles different subregions differently, as introduced in step 604.
In the present invention, the upper layer generally refers to the framework (Framework) layer, the application layer, and so on. A mobile terminal system, such as Android or a custom system such as iOS, generally comprises a bottom layer (the physical layer and the driver layer) and an upper layer (the framework layer and the application layer). The signal flow is as follows: the physical layer (touch panel) receives the user's touch operation and converts the physical press into an electrical signal, which the touch panel (TP) passes to the driver layer; the driver layer parses the pressed position to obtain parameters such as the concrete coordinates of the location point, the duration and the pressure, and uploads these parameters to the framework layer; the framework layer communicates with the driver layer through corresponding interfaces, receives the input device (input) from the driver layer and parses it, thereby choosing whether or not to respond to it, and passes valid input up to the specific application concerned, so that the application layer performs different application operations according to different events.
603: when the C district senses a touch event concurrent with a contact, the virtual input device input0 reports the initial coordinate position of the contact (downX, downY) and the initial press time (downTime) to the upper layer; the upper layer (system framework layer or application layer) records this information as the basis for the subsequent judgement of sliding. As shown in Figure 14, the middle section of the touch screen is the A district, the narrow strips on the left and right sides are the C districts, and the grey dots represent contacts in the C district.
A touch operation is generally a click, a slide or the like, and each touch operation is made up of one or more contacts; therefore, by detecting which region the contacts of a touch operation fall into, the mobile terminal can judge whether the touch operation occurs in the C district or the A district.
After the framework (Framework) layer receives the reported event (which includes the input device, the touch point parameters, and so on), it first identifies the region from the name of the input device. If the driver (kernel) layer identified the touch as falling in the C district, the driver layer reports it to the framework layer as input0 rather than input1. That is, the framework layer does not need to determine which subregion the current contact falls in, nor the size and position of the subregions; these determinations are completed on the driver layer. Moreover, besides reporting which input device is involved, the driver layer also reports the parameters of the contact to the framework layer, such as the pressing time, position coordinates and pressure.
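A minimal userspace sketch of the dispatch decision described above, assuming a fixed C-district width (the real decision is made inside the kernel touch driver; the screen width and strip width used here are illustrative values): the driver encodes the subregion entirely in the choice of virtual input device, so the framework layer only needs to read the device name.

```c
#include <assert.h>
#include <string.h>

/* Returns the name of the virtual input device through which the driver
 * layer would report a contact at horizontal coordinate x. Contacts in
 * either narrow edge strip (the C district) go through input0; contacts
 * in the central A district go through input1. */
const char *device_for_contact(int x, int screen_width, int c_width)
{
    if (x < c_width || x >= screen_width - c_width)
        return "input0"; /* C district: left or right edge strip */
    return "input1";     /* A district: centre of the screen */
}
```

Because the subregion test lives in this one place on the driver side, changing the C-district width never touches the framework layer, which matches the division of labour the description emphasises.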
604: as the contact moves, the virtual input device input0 reports its current coordinate position (currentX, currentY) to the upper layer in real time at a preset period; meanwhile, the upper layer judges, from the initial coordinate position and the current coordinate position of the contact, whether the touch event is a slide event and, if so, further judges its direction attribute and performs the corresponding preset special processing, such as ignoring the event or generating a special effect.
To make this judgement more accurate, the report period of the virtual input device input0 can be set to a short value, for example 1/85 second.
In step 604, the concrete method of judging whether the touch event concurrent with the contact is a slide event is: compute the displacement between the current position and the initial position of the contact; if the displacement exceeds a preset threshold, judge the touch event to be a slide event; otherwise, judge that it is not a slide event.
The displacement of the contact is computed as:
Displacement = √((currentX − downX)² + (currentY − downY)²).
Because the C district is a virtual frame region and is generally set narrow, the displacement in the X direction is small and negligible, so the above formula can be simplified to:
Displacement = |currentY − downY|.
Wherein, judging from the report point parameters of the obtained user operation that the user operation is a valid operation includes but is not limited to:
the value of the user operation collected by the pre-designated sensor is greater than or equal to the corresponding preset threshold; or,
the mobile terminal is in a state in which this FiT function is operable.
It should be noted that different sensors correspond to different preset thresholds, and the preset thresholds can be set according to actual conditions. The operable state can be judged with the help of specific sensors: for example, when the mobile terminal is in a pocket or a bag it is not in an operable state, whereas when it directly faces the user it is. To illustrate: when the pre-designated sensor is a pressure sensor, the pressure value must be greater than a preset pressure threshold, which prevents false triggering caused by light touches; when the pre-designated sensor is a light sensor, it can be judged whether the mobile terminal is currently in the pocket anti-mistouch mode; likewise, when taking a photograph from the edge in the screen-off state, the built-in acceleration of the mobile terminal must not exceed a preset acceleration threshold, and "grip to unlock" requires the mobile terminal to directly face the user with a deviation angle no greater than a preset deviation-angle threshold, and so on.
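The validity test can be sketched as below. The sensor fields, names and thresholds are illustrative assumptions; the sketch combines the pressure-threshold check with the pocket anti-mistouch check described above, and a real implementation would consult whichever sensor is pre-designated for the gesture.

```c
#include <assert.h>
#include <stdbool.h>

/* Hypothetical snapshot of the sensor readings relevant to validity. */
typedef struct {
    double pressure;       /* reading from the designated pressure sensor */
    bool   sensor_covered; /* range/light sensor: terminal in a pocket/bag */
} sensor_state;

/* A gesture is treated as a valid operation only if the terminal is in an
 * operable state (not pocketed) and the designated sensor value reaches
 * its preset threshold. */
bool operation_is_valid(sensor_state s, double pressure_threshold)
{
    if (s.sensor_covered)                     /* pocket anti-mistouch mode */
        return false;
    return s.pressure >= pressure_threshold;  /* filters light accidental touches */
}
```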
Step 404: trigger the FiT function corresponding to the user operation.
The FiT functions include but are not limited to: edge slide to start an application, edge slide to switch applications, repeated edge slides to accelerate the mobile terminal, bilateral slide to adjust brightness, bilateral slide to adjust volume, grip to unlock, squeeze to take a screenshot or start one-handed operation, edge click to take a photograph, edge click to collect photographs, landscape grip to start the camera, and other functions.
Further, when it is judged that the report point parameters of the obtained user operation do not meet the preset FiT definition gesture, or that the user operation is not a valid operation, the method also comprises: not responding to the user operation.
In the method of the invention, by monitoring the opened induction region in real time and judging from the user operation on the opened induction region whether it is a valid operation, the FiT function corresponding to the user operation is triggered only for a valid operation, which improves the ease of use of FiT and reduces power consumption. In addition, judging the user operation avoids misoperation.
To illustrate: in the prior art, using the FiT "grip to unlock" function requires first waking the screen with the power key and then unlocking with the "grip password". With the above method of the invention, unlocking directly with the "grip password" in the screen-off state can be realized, eliminating the operation of lighting the screen. When report points are detected being uploaded from the edge C district of the mobile terminal (the preset induction region), it is judged whether the conditions of "grip to unlock" are met: 1) the number of report points equals the number recorded for the entered grip password; 2) the positional offset between the report point coordinates and those recorded for the entered grip password is less than a preset threshold; 3) the report point duration exceeds a preset time threshold. If all three conditions are met, the user's operation is judged to satisfy the "grip to unlock" definition gesture.
Further, sensors such as the light sensor and the gyroscope can be used to judge the validity of a "grip to unlock" operation in the screen-off state, thereby screening out user misoperation. The judgement conditions are: 1) if the range sensor detects that the mobile terminal is currently blocked, the mobile terminal is in the pocket anti-mistouch mode, and the user's unlocking operation is shielded; 2) the gyroscope sensor, combined with the front-facing camera, judges whether the mobile terminal is currently in an operable orientation relative to the user; if it is not (for example, the back of the mobile terminal faces the user, or the mobile terminal faces downwards), the unlocking operation is shielded.
In addition, in the prior art the "landscape grip at the edge to start the camera" function can only be used in the bright-screen state. The problem with this is that, when the camera needs to be started quickly or a snapshot captured, the screen must first be lit and unlocked before the camera is started, which involves too many steps. With the above method of the invention, "landscape grip at the edge to start the camera" can be realized directly in the screen-off state, greatly reducing the steps and flow of the operation. The method comprises: through a sensor fusion algorithm, when report points are detected being uploaded from the edge C district of the mobile terminal (as described above), judging whether the conditions of "landscape grip at the edge to start the camera" are met: 1) the gyroscope parameters indicate that the mobile terminal is currently in the landscape state; 2) the number of C-district report points equals a predetermined number; 3) the report point coordinates lie in preset areas, namely two report points at the upper edge and two at the lower edge, with each report point no further from the edge than a preset threshold; 4) the report point duration exceeds a preset time threshold. The user's operation is then judged to satisfy the "landscape grip at the edge to start the camera" definition gesture.
Further, through the sensor fusion algorithm, the pressure sensor can be used for shielding: it is judged whether the pressing force reaches a preset threshold, and if the pressing force is less than the preset threshold, the camera start is shielded.
Finally, users sometimes only remember that they forgot to clear the background after the mobile terminal's screen has gone off; if the background is not cleared, the mobile terminal produces a large amount of cache and increases power consumption. In the prior art, using the FiT "repeated edge slides to clear the background" function, or a general clearing operation, requires lighting the screen and unlocking again; with the above method of the invention, "repeated edge slides to clear the background" can be realized directly in the screen-off state, greatly reducing the steps and flow of the operation.
Fig. 5 is a structural schematic diagram of the mobile terminal of the invention when the screen-off edge interaction FiT function is enabled. As shown in Fig. 5, the mobile terminal comprises: a monitoring module 50, an acquisition module 51, a judgement module 52 and a processing module 53. Wherein,
the monitoring module 50 is configured to monitor in real time whether there is a user operation on the opened induction region and, when a user operation is monitored on the opened induction region, to send an acquisition notice to the acquisition module 51;
the acquisition module 51 is configured to receive the acquisition notice and obtain the report point parameters of the user operation;
the judgement module 52 is configured to judge whether the report point parameters of the obtained user operation meet the preset FiT definition gesture and, when they do, to judge from the report point parameters of the obtained user operation that the user operation is a valid operation and send a response notice to the processing module 53.
Wherein, the judgement module 52 judging whether the report point parameters of the obtained user operation meet the preset FiT definition gesture comprises:
determining the gesture of the user operation according to the obtained report point parameters;
judging whether the determined gesture of the user operation is among the preset FiT definition gestures;
when the determined gesture of the user operation is among the preset FiT definition gestures, judging that the report point parameters of the obtained user operation meet the preset FiT definition gesture;
when the determined gesture of the user operation is not among the preset FiT definition gestures, judging that the report point parameters of the obtained user operation do not meet the preset FiT definition gesture.
Wherein, the judgement module 52 judging from the report point parameters of the obtained user operation that the user operation is a valid operation includes but is not limited to:
the value of the user operation collected by the pre-designated sensor being greater than or equal to the corresponding preset threshold; or,
the mobile terminal being in a state in which this FiT function is operable.
Further, the judgement module 52 is also configured, when it is judged that the report points of the obtained user operation do not meet the preset FiT definition gesture or that the user operation is not a valid operation, to send a no-response notice to the processing module.
The processing module 53 is configured to receive the response notice and trigger the FiT function corresponding to the user operation.
Further, the processing module 53 is also configured to receive the no-response notice and not respond to the user operation.
Further, the mobile terminal also comprises a control module 54 for opening the pre-designated sensor and the preset induction region of the touch panel TP of the mobile terminal to which it belongs. The control module 54 may be a microcontroller unit (MCU, Microcontroller Unit) or a sensor that performs processing with a digital signal processor (DSP).
Wherein, the report point parameters comprise: report point quantity, and/or the value of the user operation collected by the pre-designated sensor, and/or movement radius, and/or movement speed, and/or movement direction, and/or movement duration.
Wherein, the FiT definition gestures include but are not limited to: edge slide gesture, repeated edge slide gesture, bilateral slide gesture, grip gesture, squeeze gesture, edge click and landscape grip.
It should be noted that, herein, the terms "comprise", "include" and any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article or device comprising a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article or device. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the existence of other identical elements in the process, method, article or device comprising that element.
The serial numbers of the above embodiments of the invention are for description only and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be realized by software plus a necessary general hardware platform, and of course also by hardware, although in many cases the former is the better embodiment. Based on this understanding, the technical solution of the invention, in essence or in the part contributing to the prior art, can be embodied in the form of a software product stored in a storage medium (such as ROM/RAM, a magnetic disc or an optical disc) and comprising a number of instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device or the like) to perform the methods described in the embodiments of the invention.
The above are only preferred embodiments of the invention and do not thereby limit the scope of its claims; every equivalent structure or equivalent flow transformation made using the contents of the specification and drawings of the invention, or directly or indirectly used in other related technical fields, is likewise included in the patent protection scope of the invention.

Claims (11)

1. A method for processing an edge interaction operation when the screen-off edge interaction FiT function is enabled, characterized in that it comprises:
monitoring in real time whether there is a user operation on an opened induction region;
when a user operation is monitored on the opened induction region, obtaining report point parameters of the user operation;
judging whether the report point parameters of the obtained user operation meet a preset FiT definition gesture; when the report point parameters of the obtained user operation meet the preset FiT definition gesture, judging from the report point parameters of the obtained user operation that the user operation is a valid operation;
triggering the FiT function corresponding to the user operation.
2. The method according to claim 1, characterized in that the method further comprises, beforehand: presetting said induction region;
opening a pre-designated sensor and the preset induction region of the touch panel TP of said mobile terminal.
3. The method according to claim 1, characterized in that said judging whether the report point parameters of the obtained user operation meet the preset FiT definition gesture comprises:
determining the gesture of the user operation according to said obtained report point parameters;
judging whether said determined gesture of the user operation is among said preset FiT definition gestures;
when said determined gesture of the user operation is among said preset FiT definition gestures, judging that the report point parameters of the obtained user operation meet said preset FiT definition gesture;
when said determined gesture of the user operation is not among said preset FiT definition gestures, judging that the report point parameters of the obtained user operation do not meet the preset FiT definition gesture.
4. The method according to claim 1, characterized in that said judging from the report point parameters of the obtained user operation that the user operation is a valid operation includes but is not limited to:
the value of the user operation collected by said pre-designated sensor being greater than or equal to the corresponding preset threshold; or,
said mobile terminal being in a state in which this FiT function is operable.
5. The method according to any one of claims 1-4, characterized in that said report point parameters comprise: report point quantity, and/or the value of the user operation collected by the pre-designated sensor, and/or movement radius, and/or movement speed, and/or movement direction, and/or movement duration.
6. The method according to any one of claims 1-4, characterized in that said FiT definition gestures include but are not limited to: edge slide gesture, repeated edge slide gesture, bilateral slide gesture, grip gesture, squeeze gesture, edge click and landscape grip.
7. A mobile terminal in which the screen-off edge interaction FiT function is enabled, characterized in that it comprises: a monitoring module, an acquisition module, a judgement module and a processing module; wherein,
the monitoring module is configured to monitor in real time whether there is a user operation on an opened induction region and, when a user operation is monitored on the opened induction region, to send an acquisition notice to the acquisition module;
the acquisition module is configured to receive the acquisition notice and obtain report point parameters of the user operation;
the judgement module is configured to judge whether the report point parameters of the obtained user operation meet a preset FiT definition gesture and, when they do, to judge from the report point parameters of the obtained user operation that the user operation is a valid operation and send a response notice to the processing module;
the processing module is configured to receive the response notice and trigger the FiT function corresponding to the user operation.
8. The mobile terminal according to claim 7, characterized in that it further comprises a control module for opening a pre-designated sensor and the preset induction region of the touch panel TP of said mobile terminal to which it belongs.
9. The mobile terminal according to claim 7, characterized in that said judgement module judging whether the report point parameters of the obtained user operation meet the preset FiT definition gesture comprises:
determining the gesture of the user operation according to said obtained report point parameters;
judging whether said determined gesture of the user operation is among said preset FiT definition gestures;
when said determined gesture of the user operation is among said preset FiT definition gestures, judging that the report point parameters of the obtained user operation meet said preset FiT definition gesture;
when said determined gesture of the user operation is not among said preset FiT definition gestures, judging that the report point parameters of the obtained user operation do not meet said preset FiT definition gesture.
10. The mobile terminal according to claim 7, characterized in that said judgement module judging from the report point parameters of the obtained user operation that the user operation is a valid operation includes but is not limited to:
the value of the user operation collected by said pre-designated sensor being greater than or equal to the corresponding preset threshold; or,
said mobile terminal being in a state in which this FiT function is operable.
11. The mobile terminal according to any one of claims 7-10, characterized in that said report point parameters comprise: report point quantity, and/or the value of the user operation collected by the pre-designated sensor, and/or movement radius, and/or movement speed, and/or movement direction, and/or movement duration.
CN201510613259.8A 2015-09-23 2015-09-23 Edge interactive operation processing method and mobile terminal Pending CN105117020A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510613259.8A CN105117020A (en) 2015-09-23 2015-09-23 Edge interactive operation processing method and mobile terminal

Publications (1)

Publication Number Publication Date
CN105117020A true CN105117020A (en) 2015-12-02

Family

ID=54665035

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510613259.8A Pending CN105117020A (en) 2015-09-23 2015-09-23 Edge interactive operation processing method and mobile terminal

Country Status (1)

Country Link
CN (1) CN105117020A (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101482785A (en) * 2008-01-04 2009-07-15 苹果公司 Selective rejection of touch contacts in an edge region of a touch surface
CN102207818A (en) * 2010-02-19 2011-10-05 微软公司 Page manipulations using on and off-screen gestures
CN102692984A (en) * 2011-03-24 2012-09-26 国基电子(上海)有限公司 Electronic device
CN102999216A (en) * 2012-11-21 2013-03-27 汕头超声显示器(二厂)有限公司 Low-power consumption touch display module, touch electronic device with same and software system thereof
CN103257820A (en) * 2012-02-20 2013-08-21 联想(北京)有限公司 Control method and electronic device
CN103475919A (en) * 2013-09-04 2013-12-25 深圳Tcl新技术有限公司 Remote control method and system based on side face sliding touch and remote control
CN104156073A (en) * 2014-08-29 2014-11-19 深圳市中兴移动通信有限公司 Mobile terminal and operation method thereof
CN104615303A (en) * 2015-01-30 2015-05-13 深圳市中兴移动通信有限公司 Mistaken touch prevention method of mobile terminal and mobile terminal
CN104636039A (en) * 2015-01-30 2015-05-20 深圳市中兴移动通信有限公司 Application control method and device based on borderless terminal
CN104765543A (en) * 2015-04-17 2015-07-08 努比亚技术有限公司 Audio playing parameter adjustment method and device
CN104915137A (en) * 2015-03-31 2015-09-16 努比亚技术有限公司 Music playing method and device based on frame-free terminal

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105512525A (en) * 2015-12-01 2016-04-20 小米科技有限责任公司 Unlock method and device
CN105528154A (en) * 2015-12-09 2016-04-27 广东欧珀移动通信有限公司 Control method, control device and electronic device
CN112596847A (en) * 2016-01-08 2021-04-02 创新先进技术有限公司 Method and device for calling functions in application
WO2016197713A1 (en) * 2016-01-08 2016-12-15 中兴通讯股份有限公司 Battery deformation detection method and device
WO2017148016A1 (en) * 2016-03-01 2017-09-08 北京佳拓思科技有限公司 Light-based unlocking apparatus
CN105681594A (en) * 2016-03-29 2016-06-15 努比亚技术有限公司 Edge interaction system and method for terminal
CN105681594B (en) * 2016-03-29 2019-03-01 努比亚技术有限公司 A kind of the edge interactive system and method for terminal
CN105892920A (en) * 2016-03-31 2016-08-24 努比亚技术有限公司 Display control method and device
CN106055153B (en) * 2016-05-30 2019-01-22 努比亚技术有限公司 A kind of method and mobile terminal for correcting edge interactive operation
CN106055153A (en) * 2016-05-30 2016-10-26 努比亚技术有限公司 Method for correcting edge interaction operation and mobile terminal
CN106686235A (en) * 2016-12-28 2017-05-17 努比亚技术有限公司 Method and apparatus for preventing mistaken touches on terminal
CN108334515A (en) * 2017-01-20 2018-07-27 阿里巴巴集团控股有限公司 The method, apparatus and system of stack address in file are collapsed in a kind of processing
CN108334515B (en) * 2017-01-20 2022-07-15 阿里巴巴集团控股有限公司 Method, device and system for processing stack address in crash file
CN108572727A (en) * 2017-03-13 2018-09-25 精工爱普生株式会社 Input unit, input control method and recording medium
WO2019072254A1 (en) * 2017-10-13 2019-04-18 华为技术有限公司 Control method and terminal
US11216116B2 (en) 2017-10-13 2022-01-04 Huawei Technologies Co., Ltd. Control method and terminal
CN110192170A (en) * 2017-12-11 2019-08-30 深圳市汇顶科技股份有限公司 Touch controller, device, terminal and touch control method
WO2019119361A1 (en) * 2017-12-21 2019-06-27 深圳市为通博科技有限责任公司 Method for avoiding mistakenly touching edge, touch control device and storage medium
US11093083B2 (en) 2017-12-21 2021-08-17 Shenzhen Weitongbo Technology Co., Ltd. Method for preventing an edge from being unintentionally touched, touch control apparatus and storage medium
CN109344586A (en) * 2018-10-24 2019-02-15 Oppo广东移动通信有限公司 Unlocking method, device, storage medium and mobile terminal

Similar Documents

Publication Publication Date Title
CN105117020A (en) Edge interactive operation processing method and mobile terminal
CN104850345A (en) Mobile terminal and fast task switching method therefor
CN104750417A (en) Frameless terminal application switching method and frameless terminal
CN104793867A (en) Mobile terminal and sliding recognition method for virtual frame area of mobile terminal
CN104834462A (en) Unlocking method for terminal equipment and terminal equipment
CN104796552A (en) Quick screen luminance adjustment method and quick screen luminance adjustment device
CN104935725A (en) Mobile terminal and method for realizing function adjustment by utilizing virtual frame areas of mobile terminal
EP3691234B1 (en) Photographing method and terminal
CN105511776A (en) Mobile terminal and control method
CN104834517A (en) Mobile terminal power-saving method and mobile terminal
CN105302596A (en) Mobile terminal application opening device and method
CN105487802A (en) Screen projection management method, device and system
CN104765543A (en) Audio playing parameter adjustment method and device
CN105468153A (en) Mobile terminal and control realizing method thereof
CN104731483A (en) Method and terminal for fast adjusting set parameters
CN104915058A (en) Method and device for awakening intelligent terminal
CN104766002A (en) Method and device for unlocking
CN104793885A (en) Mobile terminal and memory cleaning control method thereof
CN104834463A (en) Holding recognition method and device of mobile terminal
CN104731339A (en) Holding mode recognition method and device for mobile terminal
CN105094673A (en) Rapid starting method and apparatus for applications
CN104731504A (en) Application control method and device based on border-free terminal
CN106843724A (en) Mobile terminal screen anti-mistouch device and method, and mobile terminal
CN104731411A (en) Single-click action recognition method and device of mobile terminal
CN105117155B (en) Mobile terminal and its control method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20151202