WO2015020283A1 - Mobile terminal and control method thereof - Google Patents
- Publication number
- WO2015020283A1 (PCT/KR2013/011885)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display unit
- mobile terminal
- tap
- touch sensor
- touch
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/26—Power supply means, e.g. regulation thereof
- G06F1/32—Means for saving power
- G06F1/3203—Power management, i.e. event-based initiation of a power-saving mode
- G06F1/3206—Monitoring of events, devices or parameters that trigger a change in power modality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/26—Power supply means, e.g. regulation thereof
- G06F1/32—Means for saving power
- G06F1/3203—Power management, i.e. event-based initiation of a power-saving mode
- G06F1/3206—Monitoring of events, devices or parameters that trigger a change in power modality
- G06F1/3231—Monitoring the presence, absence or movement of users
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/26—Power supply means, e.g. regulation thereof
- G06F1/32—Means for saving power
- G06F1/3203—Power management, i.e. event-based initiation of a power-saving mode
- G06F1/3234—Power saving characterised by the action undertaken
- G06F1/3287—Power saving characterised by the action undertaken by switching off individual functional units in the computer system
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0412—Digitisers structurally integrated in a display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04B—TRANSMISSION
- H04B1/00—Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
- H04B1/38—Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
- H04B1/40—Circuits
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/163—Indexing scheme relating to constructional details of the computer
- G06F2200/1636—Sensing arrangement for detection of a tap gesture on the housing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/66—Substation equipment, e.g. for use by subscribers with means for preventing unauthorised or fraudulent calling
- H04M1/667—Preventing unauthorised calls from a telephone set
- H04M1/67—Preventing unauthorised calls from a telephone set by electronic means
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/72409—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
- H04M1/72412—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories using two-way short-range wireless interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/12—Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/22—Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D10/00—Energy efficient computing, e.g. low power processors, power management or thermal management
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D30/00—Reducing energy consumption in communication networks
- Y02D30/50—Reducing energy consumption in communication networks in wire-line communication networks, e.g. low power modes or reduced link rate
Definitions
- the present invention relates to a mobile terminal and a control method thereof capable of controlling a function of the mobile terminal in response to an external force.
- Terminals may be divided into mobile / portable terminals and stationary terminals according to their mobility.
- the mobile terminal may be further classified into a handheld terminal and a vehicle mount terminal according to whether a user can directly carry it.
- the terminal is implemented in the form of a multimedia player having complex functions such as taking pictures or videos, playing music or video files, playing games, and receiving broadcasts.
- One object of the present invention is to provide a mobile terminal that can be controlled simply by tapping the terminal body or an area around the terminal, and a control method thereof.
- a mobile terminal may include a display unit, a touch sensor configured to sense a tap hitting the display unit, and a control unit configured to control at least one of the functions executable in the terminal when a tap corresponding to a preset scheme is applied to the display unit, wherein the touch sensor is configured to sense the tap in different ways depending on whether the display unit is in an activated or a deactivated state.
- the method may be related to an activation cycle of the touch sensor, and the touch sensor may be activated at different cycles according to whether the display unit is activated.
- the touch sensor may be activated periodically so as to correspond to a preset specific period while the display unit is in a deactivated state.
- when the display unit is in an activated state, the touch sensor may be continuously activated.
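The activation-cycle scheme described in the items above can be sketched as follows; the concrete period values and the function name are illustrative assumptions, not figures taken from the patent:

```python
# Illustrative duty-cycle values (assumptions, not from the patent).
DISPLAY_ON_PERIOD_MS = 0      # 0 = touch sensor stays continuously active
DISPLAY_OFF_PERIOD_MS = 100   # wake the sensor only every 100 ms while the screen is off

def touch_sensor_period(display_active: bool) -> int:
    """Return the activation period of the touch sensor in milliseconds.

    While the display is activated the sensor runs continuously (period 0);
    while it is deactivated the sensor is woken only at a preset period,
    trading a little tap-detection latency for lower power consumption.
    """
    return DISPLAY_ON_PERIOD_MS if display_active else DISPLAY_OFF_PERIOD_MS
```

Waking the sensor only at a fixed period while the screen is off is what lets a tap still be caught without paying the cost of a continuously powered touch controller.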
- the power consumed by the touch sensor to detect a touch may vary depending on whether the display unit is activated.
- the mobile terminal may further include a proximity sensor configured to sense an object located within a reference distance from the deactivated display unit, and the touch sensor may determine whether to be activated depending on whether the object is sensed through the proximity sensor.
- the touch sensor may be inactivated when the object is sensed through the proximity sensor, and periodically activated when the object is not sensed.
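As a rough sketch of the proximity-gated behavior above (the state names are assumptions): while the display is off, an object within the reference distance, e.g. the phone in a pocket, keeps the touch sensor off; otherwise the sensor polls periodically; with the display on, it runs continuously.

```python
def touch_sensor_state(display_active: bool, object_near: bool) -> str:
    """Decision table for the touch sensor, following the items above.

    - display on            -> sensor continuously active
    - display off, object   -> sensor deactivated (e.g. phone in a pocket)
    - display off, no object-> sensor periodically activated to catch taps
    """
    if display_active:
        return "continuous"
    return "inactive" if object_near else "periodic"
```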
- the control unit may control the at least one function when a tap that continuously strikes the display unit is detected through the touch sensor while the display unit is inactivated.
- the consecutive taps may include a first tap and a second tap applied within a preset time after the first tap.
- when the second tap corresponds to a preset invalid condition, the control unit may restrict control of the at least one function for a preset time after the second tap is detected, even if a tap corresponding to the preset scheme is detected during that time.
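The consecutive-tap scheme with an invalid-condition lockout might be sketched as follows; the window and lockout durations, and the class name, are illustrative assumptions:

```python
class DoubleTapDetector:
    """Sketch of the 'first tap + second tap within a preset time' scheme.

    A second tap must arrive within `window` seconds of the first. If the
    second tap matches a preset invalid condition, further taps are ignored
    for `lockout` seconds, restricting control of the function as described.
    """

    def __init__(self, window: float = 0.5, lockout: float = 1.0):
        self.window = window
        self.lockout = lockout
        self.first_tap_at = None   # timestamp of a pending first tap
        self.locked_until = -1.0   # taps before this time are ignored

    def on_tap(self, t: float, invalid: bool = False) -> bool:
        """Feed a tap at time t; return True when a valid double tap completes."""
        if t < self.locked_until:
            return False  # within the restriction period
        if self.first_tap_at is not None and t - self.first_tap_at <= self.window:
            self.first_tap_at = None
            if invalid:
                self.locked_until = t + self.lockout
                return False
            return True
        self.first_tap_at = t  # treat as a (new) first tap
        return False
```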
- when the display unit is switched from an inactive state to an active state, first screen information corresponding to an initial screen is displayed on the display unit, and among the first screen information, information related to the information displayed at the position where the tap is applied is displayed.
- the first screen information may correspond to a lock screen; when the tap is applied to a first area of the display area of the display unit, time information is displayed, and when the tap is applied to a second area of the display area different from the first area, a home screen page is output.
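A minimal sketch of the region-dependent lock-screen behavior above; the split into an upper and a lower half is an assumption, since the patent only states that the first and second areas differ:

```python
def lock_screen_action(tap_x: int, tap_y: int, width: int, height: int) -> str:
    """Map a tap position on the deactivated/lock screen to an action.

    Hypothetical layout: taps in the upper half (the assumed first area)
    show time information; taps in the lower half (the assumed second
    area) output a home screen page.
    """
    if tap_y < height // 2:
        return "show_time_info"
    return "open_home_screen"
```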
- the control unit may control the at least one function in response to a tap applied to a preset specific area of the display unit when the display unit is deactivated.
- the touch sensor may be disposed to correspond to a display area of the display unit, and when the display unit is in an inactive state, at least one area of the touch sensor may be inactivated.
- the mobile terminal may control its functions in response to taps striking an object. Therefore, the user is provided with a user interface through which the functions of the mobile terminal can be controlled simply by applying a plurality of taps, without otherwise operating the terminal.
- the mobile terminal according to an embodiment of the present disclosure may control different functions or change different setting information according to the position where the tap is applied. Therefore, as the user applies the tap to various positions, the user can control various functions only by tapping the mobile terminal.
- since the mobile terminal according to an embodiment of the present invention detects a tap using an acceleration sensor, it can sense not only a tap on the terminal body but also a tap applied to a point outside the main body. Therefore, when the terminal is placed some distance away, or when the user cannot apply a touch because of wearing gloves, the user can still control various functions using taps.
- since the mobile terminal periodically activates the touch sensor while the display unit is deactivated, it can accurately detect a tap applied to the display unit using the touch sensor, and because the touch sensor is activated only periodically, power can be used more efficiently.
- the acceleration sensor of the mobile terminal according to an embodiment of the present invention may remain always on until the battery is discharged, even while other sensing units are deactivated, so that it can detect a tap applied to the mobile terminal at any time.
- the mobile terminal according to an embodiment of the present disclosure may activate various sensors such as a touch sensor when the first tap is detected. Therefore, since the mobile terminal detects the second tap using various sensors together with the acceleration sensor, it is possible to prevent a malfunction and to minimize power consumption.
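The two-stage sensing described above (an always-on accelerometer catches the first tap and wakes additional sensors so the second tap can be cross-checked) could be sketched like this; the class name and the confirmation rule are assumptions:

```python
class TwoStageTapSensing:
    """Sketch of two-stage tap sensing: the accelerometer is always on,
    while other sensors (e.g. the touch sensor) are woken only after a
    first tap, so the second tap is confirmed by several sensors together.
    """

    def __init__(self):
        self.extra_sensors_active = False

    def on_accelerometer_tap(self) -> None:
        """First tap detected by the always-on accelerometer: wake the rest."""
        self.extra_sensors_active = True

    def on_second_tap(self, accel_hit: bool, touch_hit: bool) -> bool:
        """Second tap counts only if the accelerometer and the newly
        activated touch sensor agree, preventing malfunctions while the
        extra sensors stay off most of the time to minimize power use."""
        if not self.extra_sensors_active:
            return False
        confirmed = accel_hit and touch_hit
        self.extra_sensors_active = False  # extra sensors sleep again
        return confirmed
```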
- FIG. 1 is a block diagram illustrating a mobile terminal according to one embodiment disclosed herein.
- FIGS. 2A and 2B are conceptual views of a communication system capable of operating a mobile terminal according to the present invention.
- 3A is a front perspective view of an example of a mobile terminal according to the present invention.
- FIG. 3B is a rear perspective view of the mobile terminal shown in FIG. 3A.
- FIG. 4 is a flowchart illustrating a control method of a mobile terminal according to an embodiment of the present invention.
- FIG. 5 is a flowchart for describing in more detail a method of using an acceleration sensor in the control method shown in FIG. 4.
- FIG. 6 is a diagram for describing a method of sensing a tap by an acceleration sensor according to the control method of FIG. 5.
- 7A, 7B, 7C, 7D, and 7E are conceptual views illustrating the control method described with reference to FIG. 4.
- 8A, 8B, and 8C are conceptual views illustrating a method of executing different functions according to the object of a tap in a mobile terminal according to one embodiment of the present invention.
- 9A, 9B, 9C, 10A, 10B, and 11 are conceptual views illustrating a method of executing different functions according to a target point of a tap in a mobile terminal according to an embodiment of the present invention.
- 12A and 12B are conceptual views illustrating a method of executing different functions according to a pattern of taps in a mobile terminal according to an embodiment of the present invention.
- FIGS. 13, 14, 15A, 15B, 15C, and 15D are conceptual views illustrating a method of controlling a function according to a touch input applied after tapping in a mobile terminal according to an embodiment of the present invention.
- FIG. 16 is a conceptual view illustrating a method of controlling a mobile terminal in response to a tap on the mobile terminal in a specific situation, according to one embodiment of the present invention.
- FIG. 17 is a conceptual view illustrating a method of connecting a plurality of mobile terminals that sense the same tap, in a mobile terminal according to an embodiment of the present disclosure.
- FIG. 18 is a conceptual view illustrating an operation example of deactivating a display unit in response to a tap, in a mobile terminal according to one embodiment of the present invention.
- FIG. 19 is a flowchart for describing a method of using a touch sensor in the control method illustrated in FIG. 4 in more detail.
- FIG. 20 is a diagram for describing the current consumption of a touch sensor in a mobile terminal according to one embodiment of the present invention.
- FIG. 21 illustrates a mode in which a display unit and a touch sensor operate in a mobile terminal according to an embodiment of the present disclosure.
- FIG. 22 is a flowchart illustrating a method of controlling a touch sensor using a proximity sensor in the method illustrated in FIG. 19.
- FIG. 23 is a flowchart illustrating a method for preventing a malfunction in a mobile terminal according to an embodiment of the present invention.
- FIG. 24 is a conceptual view illustrating an operation example in which a specific region of a touch sensor is deactivated in a state in which a display unit is deactivated in a mobile terminal according to an embodiment of the present disclosure.
- the mobile terminal described herein may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant, a portable multimedia player, a navigation device, a slate PC, a tablet PC, an ultrabook, and the like.
- the configuration according to the embodiments described herein may also be applied to fixed terminals such as digital TVs and desktop computers, except for cases applicable only to mobile terminals.
- FIG. 1 is a block diagram illustrating a mobile terminal 100 according to an exemplary embodiment disclosed herein.
- the mobile terminal 100 may include a wireless communication unit 110, an A/V input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply unit 190.
- the components shown in FIG. 1 are not essential, so that a mobile terminal having more or fewer components may be implemented.
- the wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and the wireless communication system or between the mobile terminal 100 and a network in which the mobile terminal 100 is located.
- the wireless communication unit 110 may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short-range communication module 114, and a location information module 115.
- the broadcast receiving module 111 receives a broadcast signal and / or broadcast related information from an external broadcast management server through a broadcast channel.
- the broadcast channel may include a satellite channel and a terrestrial channel.
- the broadcast management server may mean a server that generates and transmits a broadcast signal and / or broadcast related information or a server that receives a previously generated broadcast signal and / or broadcast related information and transmits the same to a terminal.
- the broadcast signal may include not only a TV broadcast signal, a radio broadcast signal, and a data broadcast signal, but also a broadcast signal having a data broadcast signal combined with a TV broadcast signal or a radio broadcast signal.
- the broadcast related information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider.
- the broadcast related information may also be provided through a mobile communication network. In this case, it may be received by the mobile communication module 112.
- the broadcast related information may exist in various forms. For example, it may exist in the form of Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB) or Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H).
- the broadcast receiving module 111 may receive digital broadcast signals using digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Media Forward Link Only (MediaFLO), Digital Video Broadcast-Handheld (DVB-H), and Integrated Services Digital Broadcast-Terrestrial (ISDB-T).
- the broadcast receiving module 111 may be configured to be suitable for not only the above-described digital broadcasting system but also other broadcasting systems.
- the broadcast signal and / or broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.
- the mobile communication module 112 transmits and receives a wireless signal with at least one of a base station, an external terminal, and a server on a mobile communication network.
- the wireless signal may include various types of data according to transmission and reception of a voice call signal, a video call signal, or a text/multimedia message.
- the mobile communication module 112 is configured to implement a video call mode and a voice call mode.
- the video call mode refers to a state of making a call while viewing the other party's image.
- the voice call mode refers to a state of making a call without viewing the other party's image.
- the mobile communication module 112 is configured to transmit and receive at least one of audio and video.
- the wireless internet module 113 refers to a module for wireless internet access and may be embedded or external to the mobile terminal 100.
- Wireless internet technologies that may be used include Wireless LAN (WLAN), Wireless Fidelity (WiFi) Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), and the like.
- the short range communication module 114 refers to a module for short range communication.
- Short-range communication technologies that may be used include Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and Near Field Communication (NFC).
- the location information module 115 is a module for obtaining a location of a mobile terminal, and a representative example thereof is a Global Position System (GPS) module or a Wireless Fidelity (WiFi) module.
- the A / V input unit 120 is for inputting an audio signal or a video signal, and may include a camera 121 and a microphone 122.
- the camera 121 processes image frames such as still images or moving images obtained by the image sensor in a video call mode or a photographing mode.
- the processed image frame may be displayed on the display unit 151.
- the image frame processed by the camera 121 may be stored in the memory 160 or transmitted to an external device through the wireless communication unit 110.
- the position information of the user may be calculated from the image frame obtained by the camera 121.
- Two or more cameras 121 may be provided according to the use environment.
- the microphone 122 receives an external sound signal in a call mode, a recording mode, a voice recognition mode, and the like, and processes it into electrical voice data.
- the processed voice data may be converted into a form transmittable to the mobile communication base station through the mobile communication module 112 and output in the call mode.
- the microphone 122 may implement various noise removing algorithms for removing noise generated in the process of receiving an external sound signal.
- the user input unit 130 generates input data according to a control command for controlling the operation of the mobile terminal 100 applied from the user.
- the user input unit 130 may include a key pad, a dome switch, a touch pad (constant voltage / capacitance), a jog wheel, a jog switch, and the like.
- the sensing unit 140 detects the current state of the mobile terminal 100, such as its open/closed state, position, presence or absence of user contact, orientation, and acceleration/deceleration, and generates a sensing signal for controlling the operation of the mobile terminal 100.
- the sensing unit 140 may detect whether the slide phone is opened or closed when the mobile terminal 100 is in the form of a slide phone.
- the sensing unit 140 may detect whether the power supply unit 190 supplies power or whether the interface unit 170 is coupled to an external device.
- the output unit 150 is used to generate output related to the visual, auditory, or tactile senses, and may include a display unit 151, an audio output module 153, an alarm unit 154, and a haptic module 155.
- the display unit 151 displays (outputs) information processed by the mobile terminal 100. For example, when the mobile terminal is in a call mode, the mobile terminal displays a user interface (UI) or a graphic user interface (GUI) related to the call. When the mobile terminal 100 is in a video call mode or a photographing mode, the display unit 151 displays a photographed and / or received image, a UI, and a GUI.
- the display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED), a flexible display, a 3D display, and an e-ink display.
- Some of these displays may be configured to be transparent or light-transmissive so that the outside can be seen through them; such a display may be referred to as a transparent display.
- a representative example of the transparent display is the TOLED (transparent OLED).
- the rear structure of the display unit 151 may also be configured as a light transmissive structure. With this structure, the user can see the object located behind the terminal body through the area occupied by the display unit 151 of the terminal body.
- two or more display units 151 may exist.
- a plurality of display units may be spaced apart or integrally disposed on one surface of the mobile terminal 100, or may be disposed on different surfaces, respectively.
- the display unit 151 may be configured as a stereoscopic display unit 152 for displaying a stereoscopic image.
- the stereoscopic image represents a 3D stereoscopic image, which is an image that makes the viewer feel the gradual depth and realism of an object on a monitor or screen as if in real space.
- 3D stereoscopic images are implemented using binocular disparity, the parallax caused by the separated positions of the two eyes. When the two eyes see different two-dimensional images and those images are transferred through the retinas to the brain and fused, the viewer perceives the depth and realism of the stereoscopic image.
- a 3D display scheme such as a stereoscopic scheme (glasses type), an auto-stereoscopic scheme (glasses-free type), or a projection scheme (holographic) may be applied to the stereoscopic display unit 152. Stereoscopic schemes commonly used in home television receivers include the Wheatstone stereoscope scheme.
- Examples of the auto-stereoscopic scheme include a parallax barrier scheme, a lenticular scheme, an integral imaging scheme, a switchable lens scheme, and the like.
- Projection methods include reflective holographic methods and transmissive holographic methods.
- a 3D stereoscopic image is composed of a left image (left eye image) and a right image (right eye image).
- Depending on how the left and right images are merged into a 3D stereoscopic image, the methods include a top-down method in which the left and right images are arranged vertically in one frame, an L-to-R (left-to-right, side-by-side) method in which they are arranged horizontally, a checker-board method in which pieces of the left and right images are arranged in tile form, an interlaced method in which the left and right images are alternately arranged in columns or rows, and a time-sequential (frame-by-frame) method in which the left and right images are alternately displayed over time.
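The merging methods above can be sketched on toy "images" represented as lists of pixel rows; the function names and the 2×2 sample data are illustrative, not from the patent:

```python
def pack_top_down(left, right):
    """Top-down packing: left image on top, right image below, in one frame."""
    return left + right

def pack_side_by_side(left, right):
    """L-to-R (side-by-side) packing: each row joined left-to-right."""
    return [l_row + r_row for l_row, r_row in zip(left, right)]

def pack_interlaced_rows(left, right):
    """Interlaced packing: alternate rows taken from the left and right images."""
    frame = []
    for i, (l_row, r_row) in enumerate(zip(left, right)):
        frame.append(l_row if i % 2 == 0 else r_row)
    return frame

# 2x2 toy "images" (rows of pixel labels)
L = [["L00", "L01"], ["L10", "L11"]]
R = [["R00", "R01"], ["R10", "R11"]]
```

Each packing trades spatial or temporal resolution for the ability to carry both views in one frame.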
- a 3D thumbnail image may be generated by generating a left-image thumbnail and a right-image thumbnail from the left image and the right image of the original image frame, respectively, and combining them into one image.
- a thumbnail refers to a reduced image or a reduced still image.
- the left-image thumbnail and the right-image thumbnail generated in this way are displayed with a left-right distance difference on the screen corresponding to the disparity of the left image and the right image, thereby giving a sense of three-dimensional space.
- the left image and the right image necessary for implementing the 3D stereoscopic image may be displayed on the stereoscopic display 152 by a stereoscopic processor (not shown).
- the stereo processing unit receives a 3D image and extracts a left image and a right image therefrom, or receives a 2D image and converts the image into a left image and a right image.
- When the display unit 151 and a sensor for detecting a touch operation (hereinafter referred to as a "touch sensor") form a mutual layer structure (hereinafter referred to as a "touch screen"), the display unit 151 may be used as an input device in addition to an output device.
- the touch sensor may have, for example, a form of a touch film, a touch sheet, a touch pad, or the like.
- the touch sensor may be configured to convert a change in pressure applied to a specific portion of the display unit 151 or capacitance generated in a specific portion of the display unit 151 into an electrical input signal.
- the touch sensor may be configured to detect not only the position and area where the touch object is touched on the touch sensor but also the pressure at the touch.
- the touch object is an object applying a touch to the touch sensor and may be, for example, a finger, a touch pen or a stylus pen, a pointer, or the like.
- When there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller. The touch controller processes the signal(s) and then transmits the corresponding data to the controller 180. In this way, the controller 180 can know which area of the display unit 151 has been touched.
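A hedged sketch of this signal flow, assuming the sensor reports a grid of per-cell capacitance changes and the touch controller reduces it to (row, column, strength) data with a simple threshold (both the grid representation and the threshold are hypothetical):

```python
def locate_touch(cap_delta, threshold=5):
    """Scan a grid of capacitance changes and report the touched cell.

    Hypothetical sketch of the touch-sensor-to-controller flow: the
    sensor converts a capacitance change at a specific portion of the
    display into a signal, and the touch controller reduces it to
    (row, col, strength) data passed on to the controller 180.
    """
    best = None
    for r, row in enumerate(cap_delta):
        for c, v in enumerate(row):
            if v >= threshold and (best is None or v > best[2]):
                best = (r, c, v)
    return best  # None when no cell exceeds the threshold

grid = [[0, 1, 0],
        [2, 9, 3],   # strong capacitance change at cell (1, 1)
        [0, 1, 0]]
```

A real controller also reports the touched area and pressure, as the text notes; this sketch keeps only the position and strength.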
- a proximity sensor 141 may be disposed in an inner region of a mobile terminal surrounded by the touch screen or near the touch screen.
- the proximity sensor 141 may be provided as an example of the sensing unit 140.
- the proximity sensor 141 refers to a sensor that detects the presence or absence of an object approaching a predetermined detection surface or an object present in the vicinity without using a mechanical contact by using an electromagnetic force or infrared rays.
- the proximity sensor 141 has a longer life and higher utilization than a contact sensor.
- Examples of the proximity sensor 141 include a transmission photoelectric sensor, a direct reflection photoelectric sensor, a mirror reflection photoelectric sensor, a high frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor.
- When the touch screen is capacitive, it is configured to detect the proximity of a conductive object (hereinafter referred to as a "pointer") by a change in the electric field according to the proximity of that object.
- In this case, the touch screen may be classified as a proximity sensor.
- a "proximity touch" refers to an act of positioning the pointer over the touch screen without contact so that the pointer is recognized.
- a "contact touch" refers to an act of actually bringing the pointer into contact with the touch screen.
- the position at which a proximity touch is made by the pointer on the touch screen means the position at which the pointer is perpendicular to the touch screen when the pointer makes the proximity touch.
- the proximity sensor 141 detects a proximity touch and a proximity touch pattern (for example, a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, and a proximity touch movement state), and information corresponding to the detected proximity touch operation and proximity touch pattern may be output on the touch screen.
- When the stereoscopic display unit 152 and a touch sensor form a mutual layer structure (hereinafter referred to as a "stereoscopic touch screen"), or when the stereoscopic display unit 152 is combined with a 3D sensor that detects a touch operation, the stereoscopic display unit 152 may also be used as a three-dimensional input device.
- the sensing unit 140 may include a proximity sensor 141, a stereoscopic touch sensing unit 142, an ultrasonic sensing unit 143, and a camera sensing unit 144.
- the proximity sensor 141 measures the distance between a sensing object (for example, a user's finger or a stylus pen) and a detection surface to which a touch is applied without mechanical contact by using an electromagnetic force or infrared rays.
- the terminal recognizes which part of the stereoscopic image is touched using this distance.
- When the touch screen is capacitive, the proximity of the sensing object is detected by a change in the electric field according to the proximity of the sensing object, and the touch screen is configured to recognize a three-dimensional touch using this proximity.
- the stereoscopic touch sensing unit 142 is configured to detect the strength or duration of a touch applied to the touch screen. For example, the stereoscopic touch sensing unit 142 detects the pressure with which a touch is applied; if the pressure is strong, the touch is recognized as a touch on an object located farther from the touch screen toward the inside of the terminal.
- the ultrasonic sensing unit 143 uses ultrasonic waves to recognize position information of the sensing object.
- the ultrasonic sensing unit 143 may be formed of, for example, an optical sensor and a plurality of ultrasonic sensors.
- the optical sensor is configured to detect light
- the ultrasonic sensor is configured to detect ultrasonic waves. Because light is much faster than ultrasonic waves, light reaches the optical sensor far sooner than the ultrasonic waves reach the ultrasonic sensor. Therefore, with the light serving as a reference signal, the position of the wave generation source can be calculated from the time difference between the arrival of the light and the arrival of the ultrasonic waves.
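Treating the light arrival as an effectively instantaneous reference, the distance to the wave source follows directly from the ultrasound delay. The speed-of-sound constant and the single-sensor simplification below are assumptions for illustration; locating the source in 2D or 3D would combine such distances from the plurality of ultrasonic sensors:

```python
SPEED_OF_SOUND = 343.0  # m/s at room temperature (assumed value)

def distance_from_delay(t_light_s, t_ultrasound_s):
    """Distance to the wave source from the light/ultrasound arrival gap.

    As described above, light reaches the optical sensor almost
    instantly, so its arrival time serves as the reference signal;
    the later the ultrasound arrives, the farther the source.
    """
    return SPEED_OF_SOUND * (t_ultrasound_s - t_light_s)

# Ultrasound arriving 2 ms after the light reference -> roughly 0.69 m away
d = distance_from_delay(t_light_s=0.0, t_ultrasound_s=0.002)
```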
- the camera sensing unit 144 includes at least one of a camera 121, a photo sensor, and a laser sensor.
- For example, the camera 121 and the laser sensor may be combined to sense a touch of the sensing object on a 3D stereoscopic image.
- When distance information detected by the laser sensor is added to the two-dimensional image captured by the camera, 3D information may be obtained.
- a photo sensor may be stacked on the display element.
- the photo sensor is configured to scan the movement of the sensing object in proximity to the touch screen. More specifically, photo diodes and transistors (TRs) are mounted in rows and columns in the photo sensor, and content placed on the photo sensor is scanned using an electrical signal that varies with the amount of light applied to the photo diodes. That is, the photo sensor calculates the coordinates of the sensing object according to the amount of change in the light, thereby obtaining the position information of the sensing object.
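One possible sketch of that coordinate calculation, assuming the scanned light-change values form a grid and using an intensity-weighted centroid (an illustrative choice, not a method specified by the patent):

```python
def sensing_object_position(light_delta):
    """Estimate the sensing object's coordinates from light-amount change.

    Sketch of the photo-sensor scan described above: photo diodes in
    rows/columns report how much the incident light changed, and the
    coordinates are taken as the intensity-weighted centroid of those
    changes. The centroid choice is an assumption for illustration.
    """
    total = wsum_r = wsum_c = 0.0
    for r, row in enumerate(light_delta):
        for c, v in enumerate(row):
            total += v
            wsum_r += r * v
            wsum_c += c * v
    if total == 0:
        return None          # no light change: nothing is being sensed
    return (wsum_r / total, wsum_c / total)

shadow = [[0, 0, 0],
          [0, 4, 4],   # object hovering over the right-middle area
          [0, 0, 0]]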
- the acceleration sensor 145 may detect the movement of the terminal body.
- the acceleration sensor may detect the spatial movement of the main body based on the x-axis, y-axis, and z-axis.
- the acceleration sensor 145 may measure the moving speed, the angular velocity, and the like, as well as dynamic forces such as acceleration, vibration, and impact of the terminal body.
- the sound output module 153 may output audio data received from the wireless communication unit 110 or stored in the memory 160 in a call signal reception, a call mode or a recording mode, a voice recognition mode, a broadcast reception mode, and the like.
- the sound output module 153 may also output a sound signal related to a function (for example, a call signal reception sound or a message reception sound) performed in the mobile terminal 100.
- the sound output module 153 may include a receiver, a speaker, a buzzer, and the like.
- the alarm unit 154 outputs a signal for notifying occurrence of an event of the mobile terminal 100.
- Examples of events generated in the mobile terminal 100 include call signal reception, message reception, key signal input, and touch input.
- the alarm unit 154 may output a signal for notifying occurrence of an event by using a form other than a video signal or an audio signal, for example, vibration.
- the video signal or the audio signal may also be output through the display unit 151 or the sound output module 153, so the display unit 151 and the sound output module 153 may be classified as part of the alarm unit 154.
- the haptic module 155 generates various tactile effects that a user can feel.
- a representative example of the tactile effect generated by the haptic module 155 may be vibration.
- the intensity and pattern of vibration generated by the haptic module 155 may be controlled by the user's selection or the setting of the controller. For example, the haptic module 155 may output different synthesized vibrations or sequentially output them.
- In addition to vibration, the haptic module 155 may generate various tactile effects, such as the effect of a pin array moving vertically against the skin surface, a jetting or suction force of air through a jet or suction port, grazing of the skin surface, contact with an electrode, an electrostatic force, and the effect of reproducing a sense of cold or warmth using an element capable of absorbing or generating heat.
- the haptic module 155 may not only transmit a tactile effect through direct contact, but also be implemented so that the user can feel a tactile effect through muscle sense, such as through a finger or an arm. Two or more haptic modules 155 may be provided depending on the configuration of the mobile terminal 100.
- the memory 160 may store a program for the operation of the controller 180 and may temporarily store input / output data (for example, a phone book, a message, a still image, a video, etc.).
- the memory 160 may store data relating to various patterns of vibration and sound output when a touch input on the touch screen is performed.
- the memory 160 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, and an optical disk.
- the mobile terminal 100 may be operated in connection with a web storage that performs a storage function of the memory 160 on the Internet.
- the interface unit 170 serves as a path with all external devices connected to the mobile terminal 100.
- the interface unit 170 receives data from an external device, receives power, transfers the power to each component inside the mobile terminal 100, or transmits data inside the mobile terminal 100 to an external device.
- an audio input/output (I/O) port, a video input/output (I/O) port, an earphone port, and the like may be included in the interface unit 170.
- the identification module is a chip that stores various information for authenticating the usage rights of the mobile terminal 100, and may include a user identity module (UIM), a subscriber identity module (SIM), and a universal subscriber identity module (USIM).
- a device equipped with an identification module (hereinafter referred to as an 'identification device') may be manufactured in the form of a smart card. Therefore, the identification device may be connected to the terminal 100 through the interface unit 170.
- When the mobile terminal 100 is connected to an external cradle, the interface unit 170 may serve as a passage through which power from the cradle is supplied to the mobile terminal 100, or as a passage through which various command signals input by the user at the cradle are transmitted to the mobile terminal 100.
- The various command signals or the power input from the cradle may operate as signals for recognizing that the mobile terminal 100 is correctly mounted on the cradle.
- the controller 180 typically controls the overall operation of the mobile terminal 100. For example, control and processing related to voice calls, data communications, video calls, and the like are performed.
- the controller 180 may include a multimedia module 181 for playing multimedia.
- the multimedia module 181 may be implemented in the controller 180 or may be implemented separately from the controller 180.
- controller 180 may perform a pattern recognition process for recognizing a writing input or a drawing input performed on the touch screen as text and an image, respectively.
- the controller 180 may execute a lock state for limiting input of a user's control command to applications.
- the controller 180 may control the lock screen displayed in the locked state based on the touch input sensed by the display unit 151 in the locked state.
- the power supply unit 190 receives external power or internal power under the control of the controller 180 and supplies the power required for the operation of each component.
- Various embodiments described herein may be implemented in a recording medium readable by a computer or similar device using, for example, software, hardware or a combination thereof.
- the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions. In some cases, the embodiments described herein may be implemented by the controller 180 itself.
- embodiments such as the procedures and functions described herein may be implemented as separate software modules.
- Each of the software modules may perform one or more functions and operations described herein.
- the software code may be implemented as a software application written in a suitable programming language.
- the software code may be stored in the memory 160 and executed by the controller 180.
- FIGS. 2A and 2B are conceptual views of a communication system in which the mobile terminal 100 according to the present invention can operate.
- a communication system may use different air interfaces and / or physical layers.
- Radio interfaces that can be used by a communication system include frequency division multiple access (FDMA), time division multiple access (TDMA), code division multiple access (CDMA), Universal Mobile Telecommunications Systems (UMTS) (in particular, Long Term Evolution (LTE)), Global System for Mobile Communications (GSM), and the like.
- the CDMA wireless communication system includes at least one terminal 100, at least one base station (BS) 270, at least one base station controller (BSC) 275, and a mobile switching center (MSC) 280.
- the MSC 280 is configured to connect with a Public Switched Telephone Network (PSTN) 290 and BSCs 275.
- the BSCs 275 may be coupled to the BS 270 through a backhaul line.
- the backhaul line may be provided according to at least one of E1 / T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL.
- a plurality of BSCs 275 may be included in the system shown in FIG. 2A.
- Each of the plurality of BSs 270 may include at least one sector, and each sector may include an omnidirectional antenna or an antenna pointing in a specific radial direction from the BS 270.
- each sector may include two or more antennas of various types.
- Each BS 270 may be configured to support multiple frequency assignments, each of which may have a specific spectrum (eg, 1.25 MHz, 5 MHz, etc.).
- BS 270 may be referred to as Base Station Transceiver Subsystems (BTSs).
- one BSC 275 and at least one BS 270 may be collectively referred to as a “base station”.
- the base station may also indicate “cell site”.
- each of the plurality of sectors for a particular BS 270 may be called a plurality of cell sites.
- the broadcasting transmitter 291 transmits a broadcast signal to the terminals 100 operating in the system.
- the broadcast receiving module 111 illustrated in FIG. 1 is provided in the terminal 100 to receive a broadcast signal transmitted by the BT 295.
- FIG. 2A illustrates a satellite 300 of a Global Positioning System (GPS).
- the satellite 300 helps to locate the mobile terminal 100. Although two satellites are shown in FIG. 2A, useful location information may be obtained with fewer or more than two satellites.
- the location information module 115 shown in FIG. 1 cooperates with the satellite 300 shown in FIG. 2A to obtain desired location information.
- the location of the mobile terminal 100 may be tracked using all the technologies capable of tracking the location as well as the GPS tracking technology.
- at least one of the GPS satellites 300 may optionally or additionally be responsible for satellite DMB transmission.
- BS 270 receives a reverse link signal from mobile terminal 100.
- In this case, the mobile terminal 100 may be connecting a call, transmitting or receiving a message, or performing another communication operation.
- Each of the reverse link signals received by a particular BS 270 is processed within that BS 270.
- the data generated as a result of the processing is transmitted to the connected BSC 275.
- BSC 275 provides call resource allocation and mobility management functionality, including the organization of soft handoffs between base stations 270.
- the BSCs 275 transmit the received data to the MSC 280, and the MSC 280 provides an additional transmission service for the connection with the PSTN 290.
- Similarly, the PSTN 290 is connected to the MSC 280, the MSC 280 is connected to the BSCs 275, and the BSCs 275 control the BSs 270 so that a forward link signal is transmitted to the mobile terminal 100.
- FIG. 2B illustrates a method of acquiring location information of a mobile terminal using a Wi-Fi location tracking system (WPS: WiFi Positioning System).
- the Wi-Fi Positioning System (WPS) 300 refers to a location positioning technology based on a wireless local area network (WLAN) using Wi-Fi, which tracks the location of the mobile terminal 100 by using the Wi-Fi module provided in the mobile terminal 100 and a wireless access point (AP) 320 that transmits and receives wireless signals to and from the Wi-Fi module.
- the Wi-Fi location tracking system 300 may include a Wi-Fi location server 310, the mobile terminal 100, a wireless AP 320 connected to the mobile terminal 100, and a database 330 in which arbitrary wireless AP information is stored.
- the Wi-Fi location server 310 extracts information of the wireless AP 320 connected to the mobile terminal 100 based on a location information request message (or signal) from the mobile terminal 100. The information of the wireless AP 320 connected to the mobile terminal 100 may be transmitted to the Wi-Fi positioning server 310 through the mobile terminal 100, or may be transmitted from the wireless AP 320 to the Wi-Fi positioning server 310.
- the extracted wireless AP information may be at least one of the MAC address, SSID, RSSI, channel information, privacy, network type, signal strength, and noise strength.
- the Wi-Fi positioning server 310 receives the information of the wireless AP 320 connected to the mobile terminal 100, compares the information included in the pre-built database 330 with the received information of the wireless AP 320, and extracts (or analyzes) the location information of the mobile terminal 100.
- the wireless AP connected to the mobile terminal 100 is illustrated as the first, second and third wireless APs 320 as an example.
- the number of wireless APs connected to the mobile terminal 100 may vary depending on the wireless communication environment in which the mobile terminal 100 is located.
- the Wi-Fi location tracking system 300 may track the location of the mobile terminal 100 when the mobile terminal 100 is connected to at least one wireless AP.
- the database 330 may store various information of arbitrary wireless APs disposed at different locations.
- the information on arbitrary wireless APs stored in the database 330 may include the MAC address, SSID, RSSI, channel information, privacy, network type, latitude and longitude coordinates of the wireless AP, the name and floor of the building where the wireless AP is located, detailed indoor location information (GPS coordinates where available), the AP owner's address, a phone number, and the like.
- In this way, the Wi-Fi positioning server 310 may search the database 330 for wireless AP information corresponding to the information of the wireless AP 320 connected to the mobile terminal 100, and extract the location information matched to the retrieved wireless AP information, thereby extracting the location information of the mobile terminal 100.
- The extracted location information of the mobile terminal 100 is transmitted to the mobile terminal 100 through the Wi-Fi positioning server 310, so that the mobile terminal 100 can obtain its location information.
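A simplified sketch of the matching step described above, using a hypothetical pre-built database and an RSSI-weighted average that stands in for whatever matching method the positioning server actually uses (both the database contents and the weighting are illustrative assumptions):

```python
# Hypothetical pre-built database: MAC address -> (latitude, longitude)
AP_DB = {
    "00:11:22:33:44:55": (37.5665, 126.9780),
    "66:77:88:99:aa:bb": (37.5651, 126.9895),
    "cc:dd:ee:ff:00:11": (37.5700, 126.9768),
}

def estimate_position(scanned_aps):
    """Match scanned APs against the database and blend their coordinates.

    Each scanned AP (MAC address + RSSI in dBm) is looked up in the
    database; the terminal position is approximated as an RSSI-weighted
    average of the matched AP coordinates. The weighting scheme is an
    illustrative assumption, not the method defined by the patent.
    """
    total_w = lat = lon = 0.0
    for mac, rssi_dbm in scanned_aps:
        if mac not in AP_DB:
            continue                      # unknown AP: no contribution
        w = 1.0 / max(1.0, -rssi_dbm)     # stronger signal -> larger weight
        ap_lat, ap_lon = AP_DB[mac]
        total_w += w
        lat += w * ap_lat
        lon += w * ap_lon
    if total_w == 0:
        return None                       # no AP matched the database
    return (lat / total_w, lon / total_w)
```

With at least one matched AP the estimate degrades gracefully toward that AP's own coordinates, which mirrors the statement that tracking works as soon as the terminal is connected to a single wireless AP.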
- FIG. 3A is a front perspective view of an example of a mobile terminal 100 according to the present invention.
- the disclosed mobile terminal 100 has a terminal body in the form of a bar.
- However, the present invention is not limited thereto and may be applied to various structures, such as a watch type, a clip type, a glasses type, or a folder type, flip type, slide type, swing type, or swivel type in which two or more bodies are coupled to be movable relative to each other.
- the body includes a case (frame, housing, cover, etc.) that forms an exterior.
- the case may be divided into a front case 101 and a rear case 102.
- Various electronic components are built in the space formed between the front case 101 and the rear case 102.
- At least one intermediate case may be additionally disposed between the front case 101 and the rear case 102, and a battery cover 103 covering the battery 191 may be detachably attached to the rear case 102.
- the cases may be formed by injecting synthetic resin, or may be formed of metal, for example, stainless steel (STS), aluminum (Al), titanium (Ti), or the like.
- A display unit 151, a first sound output module 153a, a first camera 121a, a first manipulation unit 131, and the like are disposed on the front surface of the terminal body, and a microphone 122, an interface unit 170, and a second manipulation unit 132 may be provided on the side surface.
- the display unit 151 is configured to display (output) information processed by the mobile terminal 100.
- the display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, a 3D display, and an e-ink display for visually presenting information.
- the display unit 151 may include a touch sensing means to receive a control command by a touch method.
- the touch sensing means may be configured to detect the touch and input content corresponding to the touched position.
- the content input by the touch method may be letters or numbers or menu items that can be indicated or designated in various modes.
- the touch sensing means is formed to be light-transmissive so that the visual information output from the display unit 151 can be seen, and may include a structure for increasing the visibility of the touch screen in a bright place. As shown in FIG. 3A, the display unit 151 occupies most of the front surface of the front case 101.
- the first sound output module 153a and the first camera 121a are disposed in an area adjacent to one end of both ends of the display unit 151, and the first operation unit 131 and the microphone 122 are located in an area adjacent to the other end. ) Is placed.
- the second manipulation unit 132 (see FIG. 3B), the interface unit 170, and the like may be disposed on the side of the terminal body.
- the first sound output module 153a may be implemented in the form of a receiver for transmitting a call sound to a user's ear or a loud speaker for outputting various alarm sounds or reproduction sounds of multimedia.
- Sound generated from the first sound output module 153a may be configured to be emitted along an assembly gap between the structures.
- In this case, no hole formed independently for sound output is visible from the outside, so the appearance of the mobile terminal 100 may be simplified.
- the present invention is not limited thereto, and a hole for emitting the sound may be formed in the window.
- the first camera 121a processes an image frame such as a still image or a moving image obtained by the image sensor in a video call mode or a photographing mode.
- the processed image frame may be displayed on the display unit 151.
- the user input unit 130 is operated to receive a command for controlling the operation of the mobile terminal 100 and may include first and second manipulation units 131 and 132.
- the first and second manipulation units 131 and 132 may be collectively referred to as a manipulating portion, and any manner in which the user operates them with a tactile feeling, such as touch, push, or scroll, may be employed.
- the first operation unit 131 is illustrated as a touch key, but the present invention is not limited thereto.
- the first manipulation unit 131 may be a push key or a combination of a touch key and a push key.
- Content input by the first and / or second manipulation units 131 and 132 may be variously set.
- For example, the first manipulation unit 131 may receive commands such as menu, home key, cancel, and search, and the second manipulation unit 132 may receive commands such as adjusting the volume of the sound output from the first sound output module 153a or switching the display unit 151 to a touch recognition mode.
- the microphone 122 is formed to receive a user's voice, other sounds, and the like.
- the microphone 122 may be provided at a plurality of locations and configured to receive stereo sound.
- the interface unit 170 serves as a path for allowing the mobile terminal 100 to exchange data with an external device.
- For example, the interface unit 170 may be at least one of a connection terminal for wired or wireless connection to an earphone, a port for short-range communication (for example, an infrared (IrDA) port, a Bluetooth port, or a wireless LAN port), and a power supply terminal for supplying power to the mobile terminal 100.
- the interface unit 170 may be implemented in the form of a socket for receiving an external card, such as a subscriber identification module (SIM), a user identity module (UIM), or a memory card for storing information.
- FIG. 3B is a rear perspective view of the mobile terminal 100 shown in FIG. 3A.
- a second camera 121b may be additionally mounted on the rear of the terminal body, that is, the rear case 102.
- the second camera 121b has a photographing direction substantially opposite to that of the first camera 121a (see FIG. 3A), and may be a camera having different pixels from the first camera 121a.
- For example, it is preferable that the first camera 121a has a low pixel count, since the user's face is photographed and transmitted to the counterpart during a video call, while the second camera 121b has a high pixel count, since it often photographs a general subject that is not transmitted immediately.
- the first and second cameras 121a and 121b may be installed in the terminal body in a rotatable or pop-up manner.
- the flash 123 and the mirror 124 are further disposed adjacent to the second camera 121b.
- the flash 123 shines light toward the subject when the subject is photographed by the second camera 121b.
- the mirror 124 allows the user to see his / her own face or the like when photographing (self-photographing) the user using the second camera 121b.
- the second sound output module 153b may be further disposed on the rear surface of the terminal body.
- the second sound output module 153b may implement a stereo function together with the first sound output module 153a (see FIG. 3A), and may be used to implement a speakerphone mode during a call.
- An antenna (not shown) for receiving a broadcast signal may be additionally disposed on the side of the terminal body.
- An antenna that forms part of the broadcast receiving module 111 (refer to FIG. 1) may be installed to be pulled out of the terminal body.
- the terminal body is provided with a power supply unit 190 (see FIG. 1) for supplying power to the mobile terminal 100.
- the power supply unit 190 may include a battery 191 embedded in the terminal body or detachably configured from the outside of the terminal body.
- the battery cover 103 is coupled to the rear case 102 so as to cover the battery 191, thereby limiting detachment of the battery 191 and protecting the battery 191 from external shocks and foreign matter.
- Meanwhile, according to the present invention, a function of the mobile terminal may be controlled in response to a tap applied to an object. That is, in the mobile terminal according to an embodiment of the present invention, a function or an application running in the mobile terminal may be controlled in response to the tap. In addition, a function that is executable or drivable on the mobile terminal may be executed even if it is not currently running. Accordingly, the user may control at least one of the functions executable on the mobile terminal by the simple gesture of tapping an object, even without applying a touch input to the mobile terminal.
- FIG. 4 is a flowchart illustrating a control method of a mobile terminal according to an embodiment of the present invention.
- first, a step of sensing a tap hitting an object is performed (S410).
- here, a tap or tap gesture may mean a gesture of tapping the main body 100 of the mobile terminal or an object. More specifically, the tap may be understood as an operation of lightly hitting the mobile terminal main body 100 or an object with a tap object such as a finger, or an operation of lightly bringing the tap object into contact with the mobile terminal main body 100 or the object.
- the tap object applying the tap may be anything capable of applying an external force to the mobile terminal main body 100 or the object, for example, a finger (a portion having a fingerprint), a stylus pen, a pointer, or a fist (finger joints).
- the tap object is not limited to objects capable of applying a touch input to the mobile terminal according to the present invention; it may be of any kind as long as it can apply an external force to the mobile terminal main body 100 or the object.
- meanwhile, the object to which the tap is applied may include at least one of the terminal body and a point outside the main body. That is, the input area of the mobile terminal can extend beyond the terminal body, so the region outside the terminal body in which the tap can be sensed becomes a virtual input area.
- the virtual input area may vary in size depending on the place or object on which the terminal is placed, or on the strength of the tap. For example, when the terminal is placed on a table or the like, tapping the table moves the terminal, and the tap can thereby be sensed; the stronger the tap, the larger the virtual input area. As another example, when the user holds the terminal body, the virtual input area may disappear.
- meanwhile, when a single tap is applied, the controller 180 may recognize the single tap as a touch input. That is, in this case, the controller 180 does not control a function corresponding to the single tap, but instead controls a function according to the touch input corresponding to the single tap (for example, a function of selecting an icon output at the point where the touch input is applied).
- accordingly, the sensing unit 140 may generate a control signal for controlling one or more functions only when at least two (or more) taps are continuously applied within a time limit.
- hereinafter, the continuous sensing of at least two taps within a time limit is referred to as "knockknock".
- that is, when a second tap is sensed within a time limit from when a first tap is sensed, "knockknock" is regarded as sensed, which may mean that the object is knocked a plurality of times.
- more specifically, "knockknock" may be regarded as sensed when, after a first tap hitting the main body or a point outside the main body a first reference number of times or more is sensed, a second tap hitting the main body or a point outside the main body a second reference number of times or more is sensed within the time limit.
- in response to the first tap being sensed, the sensing unit 140 may place the terminal in a ready state (or activated state) and, when the second tap is applied, generate a control signal for controlling the terminal. That is, by first applying the first tap, the user may inform the mobile terminal that it is to be controlled using taps.
- the number of the first reference number of times and the number of the second reference number of times may be the same or different.
- the first reference number may be three times
- the second reference number may be two times.
- both the first reference number of times and the second reference number of times may be two or more times.
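The two-stage criterion above can be combined into a single check, as a minimal sketch; the reference counts and the time limit below are illustrative assumptions, not values fixed by this document:

```python
# Hedged sketch of the "knockknock" criterion: a first tap of at least
# FIRST_REF hits, followed within TIME_LIMIT by a second tap of at least
# SECOND_REF hits. All constant values are assumed for illustration.
FIRST_REF = 2        # first reference number of hits
SECOND_REF = 2       # second reference number of hits
TIME_LIMIT = 0.7     # seconds between first and second tap (assumed)

def is_knockknock(first_hits, second_hits, gap_seconds):
    """True when both taps reach their reference counts within the limit."""
    return (first_hits >= FIRST_REF
            and second_hits >= SECOND_REF
            and gap_seconds <= TIME_LIMIT)
```

For example, `is_knockknock(2, 2, 0.3)` satisfies all three conditions, while a single-hit first tap or a gap longer than the limit does not.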
- first and second taps of “knockknock” may be input in various patterns.
- an operation of tapping an object corresponds to a dot of a Morse code
- an operation of not releasing a contact for a predetermined time in contact with an object corresponds to a dash of a Morse code.
- for example, gestures in which two taps are applied may nonetheless be generated in different patterns depending on how dots and dashes are combined (e.g., tap-tap, tap-dash, dash-tap).
- meanwhile, the sensing unit 140 may determine that "knockknock" is sensed only when the first and second taps are sensed within the time limit and, further, are applied within a "predetermined area".
- knockknock may mean a plurality of tappings that are continuously detected within a predetermined area within a time limit.
- the time limit may be a very short time, for example, may be a time within 300ms to 2s.
- the predetermined area may mean the same area where the tap is applied or a narrow area that can be viewed as the same point.
- the sensing unit 140 may calculate a predetermined area from the point where the first tap is sensed, and when the second tap is sensed within that predetermined area within the time limit, may determine that "knockknock" is sensed.
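The time-limit and predetermined-area conditions can be sketched together as follows; the radius and the concrete time limit are assumed, illustrative values (the document only bounds the limit to roughly 300 ms to 2 s):

```python
import math

# Hedged sketch: two taps count as "knockknock" only when the second tap
# lands within a predetermined radius of the first tap and within the time
# limit. Radius and time limit are illustrative assumptions.
TIME_LIMIT_S = 0.5       # assumed, within the 300 ms .. 2 s range
AREA_RADIUS_MM = 10.0    # "narrow area that can be viewed as the same point"

def detects_knockknock(first, second):
    """first/second: (x_mm, y_mm, t_s) of each tap."""
    dt = second[2] - first[2]
    dist = math.hypot(second[0] - first[0], second[1] - first[1])
    return 0 < dt <= TIME_LIMIT_S and dist <= AREA_RADIUS_MM
```

A second tap 5 mm away after 0.2 s passes both checks; one 30 mm away, or one arriving after 1 s, fails.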
- when "knockknock" with respect to the terminal body or an object located outside the terminal body is sensed as described above, the sensing unit 140 generates a control signal, and the generated control signal is transmitted to the controller 180.
- a step of controlling at least one of functions executable on the terminal is performed (S420). That is, the controller 180 may control at least one of functions executable on the terminal in response to the control signal.
- the functions executable on the terminal may mean all kinds of functions capable of being executed or driven on the mobile terminal.
- one of the executable functions may be an application installed in the mobile terminal.
- 'an arbitrary function is executed' may mean 'an arbitrary application is executed or driven'.
- the function executable in the mobile terminal may be a function for receiving an event.
- the received event may be a message reception event, a call reception event, or the like.
- the event may be an event generated from an application installed in the mobile terminal.
- a function executable in the mobile terminal may be a function necessary for basic driving of the mobile terminal.
- for example, a function required for basic driving may be a function of turning on/off the lighting provided in the display unit 151, a function of switching the mobile terminal from the unlocked state to the locked state or vice versa, a function of setting a communication network, a function of changing setting information of the mobile terminal, and the like.
- the controller 180 may control at least one of the functions executable on the mobile terminal in response to the control signal.
- control signal may vary depending on the characteristics of "knockknock".
- the characteristic of the "knockknock” may be related to at least one of the number of taps applied, the position, the speed, the intensity, the pattern, and the area.
- the sensing unit 140 may generate the first control signal when the tap is applied "2" times and may generate the second control signal when the tap is applied "3" times.
- the controller 180 may control a function corresponding to the first and second control signals.
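The "two taps generate the first control signal, three taps the second" example above amounts to a simple lookup; the signal names below are hypothetical placeholders:

```python
# Hedged sketch of mapping the tap count to a control signal, following the
# "2 taps -> first control signal, 3 taps -> second control signal" example.
# Signal names are invented for illustration.
CONTROL_SIGNALS = {2: "first_control_signal", 3: "second_control_signal"}

def control_signal_for(tap_count):
    # Return the control signal matched to the tap count, or None when no
    # function is matched to that count.
    return CONTROL_SIGNALS.get(tap_count)
```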
- the controller 180 may change setting information related to a function currently being driven or a function corresponding to screen information output on the display unit 151 among functions currently being driven, in response to a control signal.
- in this case, the controller 180 may output, on the display unit 151, guide information about the setting information controllable according to the position at which "knockknock" is applied.
- the function controlled in response to the control signal generated by "knockknock” may vary depending on the state of the current mobile terminal, or may vary depending on the characteristics of "knockknock”.
- more specifically, the controller 180 detects the state of the mobile terminal, that is, the function currently being driven on the mobile terminal, the type of screen information currently displayed on the display unit 151, and the application corresponding to the screen information currently being output on the display unit 151, and may perform different control according to the on/off state of the lighting of the display unit 151, the locked/unlocked state of the mobile terminal, and the like.
- for example, even when the same "knockknock" is sensed, the controller 180 may execute a "voice recognition function" in a state in which the lighting of the display unit 151 is 'off'; may control the application related to the currently displayed screen information when the lighting of the display unit 151 is 'on'; and, if the currently displayed screen information is a lock screen, may release the locked state and output a home screen page on the display unit 151.
- in addition, a function executable in response to a tap on the main body, or on a point away from the main body (or an object on which the main body is placed), may be changing the setting of a function currently running on the mobile terminal, changing the setting of an application related to the screen information output on the mobile terminal, or changing the setting of the function corresponding to the screen information output on the mobile terminal.
- meanwhile, the sensing unit 140 may generate different control signals based on the position at which the tap is applied, the component (microphone, speaker, etc.) disposed at that position, the strength of the tap, the speed of the tap, the area of the tap, and the pattern of the tap. That is, the controller 180 may control different functions according to the characteristics of "knockknock". Alternatively, the control signal may include information on the characteristics of "knockknock", and the controller 180 may control different functions by using the information included in the control signal.
- FIG. 5 is a flowchart illustrating, in more detail, a method of sensing a tap using the acceleration sensor in the control method of FIG. 4.
- FIG. 6 is a diagram illustrating a method of sensing a tap by the acceleration sensor according to the control method shown in FIG. 5.
- the acceleration sensor 145 may detect movement of the main body along at least one of the x-axis, the y-axis, and the z-axis, and may generate an acceleration signal corresponding to the movement of the main body.
- in FIG. 6, an acceleration signal along the x-axis according to the movement of the main body is shown as an example.
- an acceleration sensor 145 detects a first tap that exceeds a threshold reference (S510).
- the threshold reference is to prevent malfunction of the acceleration sensor 145 and is a criterion for determining whether a tap for generating a control signal is detected.
- the acceleration sensor 145 may determine whether the movement of the main body is generated by the first tap by comparing a difference value between the generated n th acceleration signal and the generated n-1 th acceleration signal with a threshold reference. If the difference between the acceleration signals is greater than the threshold reference, it can be seen that the first tap is applied.
- in operation S520, it is determined whether the movement of the main body corresponding to the first tap disappears within a first reference time TI1. For example, when the terminal falls from a height to the ground, movement beyond the threshold reference may be continuously detected; in that case, the first tap is not regarded as sensed. Accordingly, if the movement corresponding to the first tap does not disappear within the first reference time TI1, the process returns to the previous step.
- when the movement disappears in time, a step of setting a no-operation period, during which movement of the main body is ignored, is performed (S530).
- for example, when an object is tapped, movement beyond the threshold reference is sensed at the moment of the tap, and the main body may continue to move as an aftershock of the tap.
- the acceleration sensor 145 ignores acceleration signals generated during the time set as the no-operation period.
- meanwhile, if movement beyond the threshold reference is continuously detected even after the time set as the no-operation period has elapsed, the tap cannot be regarded as sensed; therefore, after the aftershock of the first tap disappears, movement of the terminal should remain within a set range for a predetermined time TI2 (S540).
- the set range means a range in which the terminal can be seen as not moving.
- next, a step of determining whether a second tap exceeding the threshold reference is sensed within the time limit is performed (S550). "knockknock" may be regarded as sensed only when the time taken from the point at which the first tap is sensed to the point at which the second tap is sensed is within the time limit; if the second tap is not sensed within the time limit, the process returns to the first step.
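The flow of steps S510 through S550 can be sketched as a scan over timestamped acceleration magnitudes; the threshold, TI1, TI2, the no-operation period, and the time limit below are all illustrative assumptions:

```python
# Hedged sketch of S510-S550 over timestamped acceleration magnitudes.
# All thresholds and windows are assumed, illustrative values.
THRESHOLD = 2.0     # minimum magnitude treated as a tap (S510)
TI1 = 0.10          # first tap's movement must die out within this time (S520)
NO_OP = 0.05        # no-operation period: aftershock samples ignored (S530)
TI2 = 0.10          # quiet period required after the aftershock (S540)
TIME_LIMIT = 0.70   # second tap must arrive within this time (S550)

def detect_knockknock(samples):
    """samples: time-sorted list of (t_seconds, magnitude)."""
    over = [t for t, m in samples if m > THRESHOLD]
    if not over:
        return False
    first = over[0]                 # S510: first tap exceeding the threshold
    after = [t for t in over if t > first]
    # S520/S530: movement inside TI1 plus the no-operation window is treated
    # as the first tap's aftershock and ignored.
    quiet_start = first + TI1 + NO_OP
    # S540: no over-threshold movement allowed for TI2 after the aftershock.
    if any(quiet_start < t <= quiet_start + TI2 for t in after):
        return False
    # S550: second tap must land within the time limit from the first tap.
    return any(quiet_start + TI2 < t <= first + TIME_LIMIT for t in after)
```

With these assumed values, a second peak at 0.4 s counts, a peak during the required quiet window is rejected as continuous movement, and a peak after the time limit is ignored.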
- the control signal may vary depending on characteristics of "knockknock" such as the peaks of the first and second taps, the time from when the first tap is sensed to when the second tap is sensed, and the points to which the first and second taps are applied, or may include information on these characteristics.
- the controller 180 can control at least one of the functions that can be controlled on the terminal using the control signal.
- the acceleration sensor may remain continuously activated while power is supplied to the controller. That is, even when a sleep mode, in which components other than those strictly necessary are deactivated to minimize battery consumption, is executed, the acceleration sensor 145 always detects movement of the terminal body unless the battery is exhausted, and may generate a control signal as "knockknock" is detected.
- in the sleep mode, the sensors other than the acceleration sensor may be deactivated.
- when the first tap is sensed, the remaining sensors may be activated to sense the second tap.
- the remaining sensors include a touch sensor, a microphone sensor, a proximity sensor, an RGB sensor, a pressure sensor, and the like, and may be used to distinguish characteristics of "knockknock" (the strength of the tap, the tap position, the time interval between the first and second taps, the object of the tap, etc.).
- the touch sensor may be disposed on a main body to sense the second tap by using a touch generated on the main body.
- the touch sensor may calculate the position at which the second tap is applied, and may distinguish the object of the second tap (e.g., finger, fingernail, palm, etc.) by using the touched area of the second tap.
- the microphone sensor may detect the second tap by using sounds generated around the main body.
- the pattern of the tap and the object of the second tap (e.g., finger, fingernail, palm, pen, etc.) may be distinguished by using the frequency characteristics of the received sound information.
- the proximity sensor may detect the second tap using the presence or absence of an object located around the main body.
- for example, when the proximity sensor senses an object adjacent to the front of the terminal body, the controller 180 may reject the control signal generated by the acceleration sensor, because the mobile terminal 100 placed in a bag may malfunction due to shaking of the bag.
- the RGB sensor may sense a color of the object of the second tap and distinguish the type of the object by using the sensed color.
- the pressure sensor may sense the second tap by using the pressure applied to the main body, and calculate the strength of the pressure generated by the second tap.
- a piezo sensor (or impact sensor), which uses the property that electricity is generated on the surface of a crystal when pressure is applied in a specific direction, may sense the second tap. Whereas the acceleration sensor can detect movement corresponding to several hundred Hz, the piezo sensor can detect movement corresponding to several kHz, so the movement (or impact) of the terminal can be sensed more accurately.
- the piezo sensor may also be used to identify the object and pattern of the tap described above. Since the physical pattern generated differs according to the object causing the impact and the pattern of the tap, the object and the pattern of the tap can be determined by using experimentally obtained physical patterns.
- the experimentally obtained physical pattern may be created at the factory release stage and stored in the memory 160, and may be periodically updated or changed by a user.
- the sensors other than the acceleration sensor are deactivated in order to prevent battery drain.
- the controller 180 may perform control by "knockknock" only when both the acceleration sensor and at least one of the other sensors sense the second tap. Since multiple sensors are used, malfunction of the terminal can be prevented, and since the sensors other than the acceleration sensor are activated only for a limited time after the first tap is sensed, power can be used efficiently.
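The dual-confirmation rule above can be sketched as follows; the sensor names and the activation-window length are assumptions made for illustration:

```python
# Hedged sketch: the second tap counts only when the acceleration sensor AND
# at least one of the temporarily activated sensors (touch, microphone, ...)
# report it inside the limited activation window. Names/values are assumed.
ACTIVATION_WINDOW = 0.7  # seconds the extra sensors stay on after a first tap

def second_tap_confirmed(accel_hit_t, other_hits):
    """accel_hit_t: time the accelerometer saw the second tap (or None).
    other_hits: dict sensor_name -> detection time (or None)."""
    if accel_hit_t is None:
        return False
    return any(
        t is not None and abs(t - accel_hit_t) <= ACTIVATION_WINDOW
        for t in other_hits.values()
    )
```

Requiring agreement between sensors is what prevents, for example, bag shaking from being mistaken for "knockknock".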
- in general, when a tap such as "knockknock" is applied, a signal exceeding the threshold reference is generated on only one of the three axes of the acceleration sensor.
- however, when the terminal is shaken or dropped, a motion similar to "knockknock" may be detected on several axes at once.
- therefore, when movement beyond the threshold reference is detected on one axis while movement is simultaneously detected on another axis, the acceleration sensor may reject (ignore) the motion.
- various methods for preventing malfunction due to “knockknock” may be applied to the terminal.
- the mobile terminal may be provided with a dedicated processor (micro controller unit, hereinafter referred to as 'MCU') for controlling the sensors.
- the MCU may serve as a hub of sensors, collect signals of the sensors, and determine whether "knockknock" has occurred. That is, the MCU may generate a control signal by synthesizing the signals of the sensors.
- that is, not the application processor (hereinafter referred to as 'AP'), which is the main processor of the terminal, but the MCU may collect the signals of the sensors and generate the control signal.
- the MCU may remain always on while the power is supplied.
- the MCU activates the AP using the control signal only when needed, which may produce the effect of significantly reducing current consumption.
- the MCU may activate other sensors to detect the second tap in response to the first tap sensed by the acceleration sensor. Since the sensors are controlled by the MCU and the MCU determines whether "knockknock" occurs using a plurality of sensors, malfunctions can be prevented in advance.
- the MCU may include an algorithm for identifying the characteristic of "knockknock”, and may determine the characteristic of "knockknock” by comprehensively using signals from the sensors.
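The MCU-as-hub arrangement above can be sketched with stub classes: the always-on MCU collects raw sensor signals, decides whether "knockknock" occurred, and only then wakes the AP with a control signal. The class and method names below are invented for illustration:

```python
# Hedged sketch of the MCU acting as a sensor hub. Class/method names are
# hypothetical; the knock detector is injected as a function over the
# collected signals.
class SensorHubMCU:
    def __init__(self, knock_detector, ap):
        self.detect = knock_detector   # decides whether "knockknock" occurred
        self.ap = ap                   # application processor stub
        self.signals = []              # raw signals collected from sensors

    def on_sensor_signal(self, signal):
        self.signals.append(signal)
        if self.detect(self.signals):
            self.ap.wake("knockknock")  # AP stays asleep until this point
            self.signals.clear()

class APStub:
    def __init__(self):
        self.events = []
    def wake(self, event):
        self.events.append(event)
```

For example, with a detector that fires on two collected signals, the AP is woken exactly once after the second signal, while the first signal alone leaves it asleep.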
- 7A, 7B, 7C, 7D, and 7E are conceptual views for explaining the control method described with reference to FIG. 4.
- a mobile terminal for controlling a function in response to "knockknock" is shown.
- in response to sensing "knockknock" (T), the controller 180 can switch the display unit 151 to the activated state, that is, turn on the lighting of the display unit 151. In this case, if the mobile terminal is in the locked state, a lock screen may be displayed on the display unit 151.
- meanwhile, the information output as the display unit 151 is activated may vary, and the controller 180 may output different information according to the position at which the display unit 151 is tapped.
- for example, "knockknock" (T) may be applied to the area 710 in which time information is displayed on the lock screen (the location of the area may vary depending on the mobile terminal).
- the controller 180 may turn on the lighting of the display unit 151 and output screen information 701 specialized for time information, as shown in FIG. 7A (b).
- the screen information 701 may include various types of time information such as current time information and world time information.
- meanwhile, the mobile terminal may remain locked while the screen information 701 is output; therefore, in this case, the user may switch the locked state to the released state by using a touch on the display unit 151.
- the controller 180 may immediately switch the locked state to the unlocked state and output a home screen page.
- here, the first screen output on the display unit 151 need not necessarily be the home screen page.
- for example, the screen information most recently output on the display unit 151 before the locked state was executed may be displayed on the display unit 151 as the first screen.
- the home button here is a button by which a home screen page is output on the display unit 151 when it is pressed (or selected), regardless of the function being executed on the mobile terminal. That is, the controller 180 may output the home screen page on the display unit 151 when the home button is pressed or touched.
- the home screen page may not be output.
- such a home button may be implemented as a hardware key or a virtual key.
- meanwhile, the present invention is not limited to the embodiment in which "knockknock" (T) is applied to the position where the home button is disposed; when "knockknock" (T) is applied to an area in which a key of another function (for example, a volume key, a power key, etc.) is disposed, the controller 180 may control the function corresponding to that key.
- when a control command is not applied to the mobile terminal for a predetermined time, the lighting of the display unit 151 may be turned off, as shown in (b) of FIG. 7B (in this case, the display unit 151 may be said to be turned off by a time out). In this case, when "knockknock" is applied, the controller 180 may again output the screen information that was output before the lighting was turned off, as shown in FIG. 7B.
- if any function was activated, for example, if a text input function was activated in a memo application, the controller 180 may equally reactivate the arbitrary function (e.g., the text input function) when the screen information is restored by "knockknock" (T).
- when a specific function is being executed on the mobile terminal in a state where the lighting of the display unit 151 is turned off (this case may be referred to as a locked state), the controller 180 may control the specific function in response to "knockknock" (T) sensed while the lighting of the display unit 151 is turned off.
- different control may be performed according to the tapping T applied to different positions.
- these different locations may be locations that the user can grasp conventionally and conceptually, and through this, the controller 180 can provide a user experience (UX) more familiar to the user.
- the controller 180 may output guide information (or guide information) for a function to be controlled as the second tap is applied.
- the guide information may be information for guiding a position to which the second tap is applied, and information about a function to be controlled according to the second tap.
- the guide information may be output in at least one of visual, auditory, and tactile manners.
- the controller 180 may control only the music playing function while keeping the lighting of the display unit 151 deactivated. As such, when a specific function is executed while the display unit is deactivated, the controller 180 may control the specific function while maintaining the deactivated state of the display unit, in response to the sensed tap. Accordingly, power consumed to activate the display unit 151 may be reduced.
- as another example, as shown in FIG. 7C, when "knockknock" (T) is applied while the display unit 151 is deactivated, the voice recognition function may be activated. Accordingly, the controller 180 may activate, execute, or perform a function related to a voice command input from the user.
- for example, when a voice command (e.g., "open sesame") for releasing the locked state is recognized, the controller 180 may switch the locked state to the unlocked state and turn on the illumination of the display unit 151, as shown in (c) of FIG. 7C.
- the controller 180 may output notification information indicating that the voice recognition function is activated by using at least one of visual, tactile, and auditory methods.
- the controller 180 may activate the display unit 151 only to output the notification information.
- the controller 180 may perform a function previously matched to the applied "knockknock" when the sensed characteristic of the "knockknock" (T) corresponds to a preset condition. For example, when "knockknock" having a first characteristic is sensed, the controller 180 may perform a first function matched thereto, and when "knockknock" having a second characteristic different from the first characteristic is sensed, it may perform a second function matched thereto. In addition, the first or second function may be performed only when the state of the mobile terminal satisfies a specific condition. For example, if the first function is set to be performed only when a tap having the first characteristic is sensed in the locked state, the controller 180 may not perform the first function even when "knockknock" having the first characteristic is sensed in the released state.
- as illustrated in FIG. 7D, "knockknock" (T) having the first characteristic may be a tap over a predetermined area or more; when such "knockknock" (T) over the predetermined area or more is applied to the display unit 151, the controller 180 may perform the function matched with the first characteristic.
- the function matched with the first characteristic may be a function of outputting context information by voice.
- that is, in response to sensing "knockknock" (T) over the predetermined area or more, the controller 180 may output, by voice, status information such as event reception information, current time information, weather information, and status information of the mobile terminal (battery, communication status, location, etc.).
- meanwhile, even if "knockknock" (T) over the predetermined area or more is applied, the controller 180 may not perform the function matched with the first characteristic when the state of the mobile terminal does not satisfy a preset condition (for example, a condition that the lighting of the mobile terminal is off, or a condition that the mobile terminal is locked).
- on the other hand, when the controller 180 senses "knockknock" (T) having a second characteristic different from the first characteristic (for example, taps applied sequentially to different areas), it may perform the function matched with the second characteristic.
- for example, the controller may output a virtual keyboard (or visual keyboard) for inputting information.
- control unit 180 may provide a user experience (UX) that is more familiar to a user.
- the user of the mobile terminal according to the present invention can control the mobile terminal simply by tapping the mobile terminal while the display unit is inactivated. That is, the mobile terminal according to the present invention can provide a user with a more intuitive and relatively simple user interface environment.
- meanwhile, the screen information output on the display unit 151, or an application corresponding to the screen information, may be controlled in response to "knockknock".
- the controller 180 may change setting of an application corresponding to the screen information or change setting information on information output through the application.
- the controller 180 may perform different control according to the position where the "knockknock" is applied.
- FIG. 8A, 8B, and 8C are conceptual views illustrating a method of executing different functions according to a tap object in a mobile terminal according to an embodiment of the present invention.
- the tap object may be classified by the at least one sensor described above with reference to FIG. 6.
- the controller 180 can immediately perform a function previously matched to a corresponding attribute according to the type of the object.
- for example, the controller 180 may output at least one of music and an image, as shown in (b) of FIG. 8A.
- the output music and images may be preset by the user or may be automatically selected by the controller 180.
- the controller 180 may vary the music and image output according to the intensity of "knockknock" (T). For example, when the intensity of "knockknock" (T) is very strong, calm music may be output.
- as another example, the controller 180 may execute an application related to a social networking service (SNS), such as Facebook, and display its execution screen on the display unit 151.
- the executed application may be changed by the user's setting.
- as illustrated in FIG. 8B, when the terminal body 100 or the display unit 151 is tapped by a tap object for which a touch input cannot be sensed, the locked state may be immediately released, or the voice recognition function may be executed as shown in (c) of FIG. 8B.
- the tap object that is not touch-sensitive may be a hand of a user wearing gloves.
- as illustrated in FIG. 8C, when the terminal body 100 or the display unit 151 is tapped by a touch pen (or stylus pen), a memo function (or memo application) may be immediately activated, as shown in (b) of FIG. 8C.
- 9A, 9B, 9C, 10A, 10B, and 11 are conceptual views illustrating a method of executing different functions according to a target point of a tap in a mobile terminal according to one embodiment of the present invention.
- different functions may be controlled according to the target point of the tap (or a position where "knockknock” is applied).
- for example, when "knockknock" (T) is applied to the upper end of the main body 100, a task manager screen may be output to the display unit 151.
- the task manager screen may include information on at least one running application, information on an event which has occurred, and short icons.
- the task manager screen may be terminated in response to "knockknock" (T) being applied to the upper end of the main body 100 again.
- the display unit 151 may be automatically deactivated.
- as another example, when "knockknock" (T) is applied to the side surface of the main body 100 while a screen according to the execution of a web browser is being output, the controller 180 may output a favorites screen for the web browser.
- as another example, when "knockknock" (T) is applied to the rear case 102 (refer to FIG. 1), the controller 180 may output the execution screen of a previously executed application to the display unit 151; that is, a screen switching function may be executed.
- in addition, the controller 180 may terminate at least one of a plurality of running applications and output the execution screen of another application to the display unit 151; whenever "knockknock" (T) is applied, the running applications may be terminated sequentially.
- the controller 180 may execute a function related to "voice". In such a case, it is very likely that the display unit 151 cannot currently be used; thus, by executing a function related to "voice", user convenience can be improved.
- for example, the controller 180 can output current status information of the mobile terminal (e.g., event reception information, current time information, weather information, and status information of the mobile terminal (battery, communication status, location, etc.)) by voice. Further, after the status information output is completed, the voice recognition function can be executed continuously.
- the target point of the tap may be a point away from the main body rather than on the main body itself.
- the tap (T) may be applied to an object on which the terminal body is placed, and the mobile terminal 100 may sense the tap (T) applied to a point outside the main body.
- different functions may be executed depending on where the tap (T) is applied relative to the main body. For example, while an application related to photos or books is running and an image is output to the display unit 151, when the tap (T) is applied to the right side of the main body, the next image may be output instead of the current image, and when the tap (T) is applied to the left side, the previous image may be output instead of the current image.
- similarly, while an application related to music is running and music is being played, when the tap (T) is applied to the right side of the main body, the next track may be played instead of the current one, and when the tap (T) is applied to the left side, the previous track may be played instead of the current one.
- different control may be performed according to the tapping T applied to different positions.
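The position-dependent dispatch described above can be sketched as follows; the helper name and the "side" parameter are illustrative assumptions, since the patent does not specify an API:

```python
def handle_side_tap(side, items, current_index):
    """Return the new index after a tap beside the terminal body.

    A tap to the right of the body advances to the next item (image,
    page, or track); a tap to the left returns to the previous one.
    """
    if side == "right":
        return min(current_index + 1, len(items) - 1)
    if side == "left":
        return max(current_index - 1, 0)
    return current_index  # taps elsewhere leave the selection unchanged

# Example: viewing the second of three photos
photos = ["a.jpg", "b.jpg", "c.jpg"]
print(handle_side_tap("right", photos, 1))  # → 2
print(handle_side_tap("left", photos, 1))   # → 0
```

The same dispatch works for the music example: the item list is the playlist and the index is the current track.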
- these different locations may be locations that the user can grasp conventionally and conceptually, and through this, the controller 180 can provide a user experience (UX) more familiar to the user.
- the controller 180 may output notification information. That is, in the mobile terminal according to the present invention, when the user does not know where the terminal is located, notification information may be output when a tap (T) is applied to a surrounding object.
- the notification information may be output through at least one of a visual, tactile (eg, vibration) and an auditory manner.
- the controller 180 may output the notification information only when the tap on the peripheral object is detected at a predetermined distance or more from the terminal body.
- FIGS. 12A and 12B are conceptual views illustrating a method of executing different functions according to a pattern of taps in a mobile terminal according to an embodiment of the present invention.
- the mobile terminal may distinguish the pattern of "knockknock" by using the acceleration sensor 145 and other sensors included in the sensing unit 140.
- the acceleration sensor 145 may generate the first control signal when "knockknock” is applied in the "knockknockknockknock” pattern, and may generate the second control signal when it is applied in the "knockknockknock” pattern.
- the controller 180 may control a function corresponding to the first and second control signals.
- "-" may refer to an operation of keeping the tap in contact with the object without releasing it for a predetermined time, as in Morse code. Alternatively, it may mean a state in which no tap is input for a predetermined time between one tap and the next.
- in "knockknock", the interval between taps may be less than one second, while in "knock-knock" the interval between taps may be one second or more.
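These timing distinctions can be sketched as follows; the one-second threshold comes from the text, but the pattern encodings and control-signal names are illustrative assumptions:

```python
def classify_tap_pattern(timestamps, gap_threshold=1.0):
    """Encode a tap sequence by its inter-tap gaps.

    Gaps shorter than gap_threshold seconds are 'short', others 'long',
    mirroring the Morse-code-like distinction drawn in the text.
    """
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return tuple("short" if g < gap_threshold else "long" for g in gaps)

# Two hypothetical control signals for two three-tap patterns
PATTERNS = {
    ("short", "short"): "first_control_signal",
    ("short", "long"): "second_control_signal",
}
signal = PATTERNS.get(classify_tap_pattern([0.0, 0.4, 0.8]))
print(signal)  # → first_control_signal
```

In this sketch, three quick taps map to one control signal, while two quick taps followed by a pause-then-tap map to another, so the same sensor can drive different functions.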
- FIGS. 13, 14, 15A, 15B, 15C, and 15D are conceptual views illustrating a method of controlling a function according to a touch input applied after tapping in a mobile terminal according to an embodiment of the present invention.
- the touch sensor may be deactivated in the sleep mode, but the acceleration sensor may detect “knockknock” in the activated state.
- the touch sensor may be activated and detect the second tap.
- the controller 180 may execute a function corresponding to the touch input detected by the touch sensor.
- the controller 180 may display a trajectory corresponding to a user's touch operation using the pen P or the like. That is, the controller 180 may execute an application related to a memo in response to the tap (T), and execute a memo function that displays the user's touch trace in response to the touch input.
- the memo function according to the tap (T) and the touch input may be executed while the display unit 151 is deactivated.
- the second tap may transition into a touch input that moves continuously from the point where the second tap is detected to an arbitrary point. This may be referred to as "knockknock and drag".
- the user may apply "knockknock" consisting of the first and second taps to the touch screen with a finger.
- the second tap may move continuously from the point where it is applied to an arbitrary point while remaining in contact with the touch screen.
- the second tap may then be released at the arbitrary point.
- the controller 180 may execute an application previously matched with a symbol formed by the touch trace. For example, as illustrated in FIGS. 14A and 14B, when the symbol formed by tapping and dragging is “C”, the controller 180 may execute a calendar application previously matched with “C”. This allows the user to immediately execute a desired application by tapping the touch screen and drawing a specific symbol with a touch trace, without having to search for an icon of an application to be executed.
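The symbol-to-application matching can be sketched as a lookup table keyed by a recognized symbol; the recognizer itself is out of scope here, and the application names other than the calendar example are illustrative assumptions:

```python
# Hypothetical mapping from a recognized touch-trace symbol to an app.
# "C" -> calendar follows the example in the text; the rest are invented.
SYMBOL_APPS = {
    "C": "calendar",
    "M": "music_player",
    "W": "web_browser",
}

def launch_for_symbol(symbol):
    """Return the app matched to the drawn symbol, or None if unmatched."""
    return SYMBOL_APPS.get(symbol.upper())

print(launch_for_symbol("c"))  # → calendar
```

A table like this lets the user skip hunting for an icon: tap the screen, draw the symbol, and the matched application launches directly.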
- the controller 180 may execute different functions according to the direction of the drag input when the drag input moves continuously from the point where the second tap of "knockknock" is detected to an arbitrary point.
- when "knockknock and drag" is applied to the display unit 151, the controller 180 may control the music playback function. For example, the controller 180 may adjust the volume of the music being played or switch to another track in response to "knockknock and drag". In addition, the controller 180 may control the music playback function in various ways according to the position at which the continuously moving second tap is released (that is, the drag direction of the second tap).
- the controller 180 may switch directly to the track queued after the one currently playing. And, as shown in FIG. 14B, when the second tap is released after moving to the left from the point where it was applied, the controller 180 may replay the track that was played before the current one.
- the controller 180 may raise the volume. Also, as shown in FIG. 15D, when the second tap moves continuously toward the lower side of the main body from the point where it was applied and is released, the controller 180 may lower the volume.
- the controller may vary the degree of control according to the moving distance of the drag input. For example, when "knockknock and drag" for adjusting the volume is sensed, the controller 180 may control how much the volume is adjusted according to the moving distance of the drag input. As a more specific example, when the drag input moves upward by a distance of '1', the controller 180 raises the volume by one step, and when it detects a drag input with a distance of '3', greater than '1', it raises the volume by three steps.
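A minimal sketch of this direction- and distance-scaled volume control; the step size and the volume range are assumptions, not values fixed by the text:

```python
def adjust_volume(volume, direction, distance, step_per_unit=1,
                  vol_min=0, vol_max=15):
    """Adjust volume by drag direction and distance.

    An upward drag raises the volume, a downward drag lowers it, and a
    longer drag changes it by proportionally more steps.
    """
    steps = int(distance) * step_per_unit
    if direction == "up":
        volume += steps
    elif direction == "down":
        volume -= steps
    return max(vol_min, min(vol_max, volume))

print(adjust_volume(5, "up", 1))    # → 6
print(adjust_volume(5, "up", 3))    # → 8
print(adjust_volume(1, "down", 3))  # → 0 (clamped at the minimum)
```

Clamping to the valid range keeps a long drag from driving the volume past its limits, matching the "degree of control follows the distance" behavior described above.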
- the mobile terminal according to the present invention can provide a user experience (UX) more familiar to the user by providing the new "knockknock and drag" interface as well as "knockknock".
- FIG. 16 is a conceptual view illustrating a method of controlling a mobile terminal in response to a tapping of the mobile terminal in a specific situation in the mobile terminal according to one embodiment of the present invention.
- the controller 180 may activate the camera in response to "knockknock" and run the application associated with it. In this case, the controller 180 may output the camera function screen 801 even when the mobile terminal is in the locked state.
- FIG. 17 is a conceptual view illustrating a method of connecting a plurality of mobile terminals that sense the same tap in a mobile terminal according to an embodiment of the present invention.
- the first and second terminals 100a and 100b may be placed on the same object, such as a table.
- the first and second terminals 100a and 100b may detect the tap (T) at similar times. Since the distance from each terminal's position to the target point of the tap (T) differs, the exact moments of detection may not be the same.
- when detecting a tap (T) generated at a point away from the main body, the mobile terminal may activate the wireless communication unit and search for another nearby mobile terminal that detected the same tap (T). When such a terminal is found, a channel for sharing data may be established. Therefore, by applying "knockknock" to an object on which a plurality of terminals are placed, devices can be synced and connected simply.
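One way to sketch the pairing criterion: two terminals treat a tap as "the same" when their detection timestamps fall within a small window, since differing distances to the tap point make the times close but not identical. The tolerance value and terminal names are assumptions:

```python
def same_tap(t_self, t_other, tolerance=0.05):
    """Treat two tap-detection timestamps (seconds) as the same event
    if they differ by no more than `tolerance`."""
    return abs(t_self - t_other) <= tolerance

def find_sync_peers(t_self, nearby):
    """Return names of nearby terminals whose tap time matches ours."""
    return [name for name, t in nearby.items() if same_tap(t_self, t)]

# terminal_b felt a tap 20 ms after us; terminal_c's tap is unrelated
nearby = {"terminal_b": 10.02, "terminal_c": 11.40}
print(find_sync_peers(10.00, nearby))  # → ['terminal_b']
```

A matched peer would then be offered a data-sharing channel, as the text describes.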
- the controller 180 can output the name of the connected terminal or a list of transmittable contents to the display unit 151.
- the selected content may be transmitted to another connected terminal.
- the controller 180 may deactivate the display unit in response to tapping.
- a method of deactivating the display unit will be described in detail with reference to FIG. 18.
- FIG. 18 is a conceptual view illustrating an operation example of deactivating a display unit in response to tapping in a mobile terminal according to an embodiment of the present disclosure.
- screen information such as an execution screen, a standby screen, or a lock screen corresponding to one or more functions may be output on the display unit 151.
- the controller 180 may deactivate the display unit 151.
- the controller 180 may terminate at least one of the functions. For example, when “knockknock” is applied while audio is being output through the speaker, the speaker may be deactivated together with the display unit 151.
- a sleep mode in which the terminal waits with minimal power or a doze mode in which the touch sensor is periodically activated may be executed by "knockknock".
- the sleep mode may be executed only when "knockknock" is applied to an empty space, that is, an area of the terminal where no touch-executable object is located. For example, when "knockknock" is applied to a region where no icon is displayed while a home screen is displayed on the display unit 151, the sleep mode may be executed. As another example, when "knockknock" is applied to an area other than the display unit, the sleep mode may be executed.
- a tap that hits the display unit, that is, "knockknock", may be sensed using the touch sensor.
- the touch sensor may be activated (or on) even when the display unit is inactivated (or off) to detect a tap that strikes the display unit.
- rather than keeping the touch sensor always activated (or on), the controller 180 may deactivate it (or turn it off) for a predetermined time and then activate it for a certain time. That is, the controller 180 may periodically switch the touch sensor between the activated and deactivated states, thereby reducing power consumption compared to keeping it always on.
- the terminal 100 may sense "knockknock", a tap applied to the display, using the touch sensor. That is, when a first tap is applied and a second tap is input within a time limit to a predetermined area, the controller 180 determines that "knockknock" has been detected and may control at least one of the functions executable on the terminal.
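A minimal sketch of this two-tap test; the time limit and the radius of the "predetermined area" are illustrative values, not ones the text fixes:

```python
import math

def is_knockknock(tap1, tap2, time_limit=0.5, area_radius=50):
    """Decide whether two taps form a 'knockknock'.

    tap1/tap2 are (x, y, t) tuples: the second tap must arrive within
    `time_limit` seconds of the first and land within `area_radius`
    pixels of it (the 'predetermined area').
    """
    x1, y1, t1 = tap1
    x2, y2, t2 = tap2
    in_time = 0 < (t2 - t1) <= time_limit
    in_area = math.hypot(x2 - x1, y2 - y1) <= area_radius
    return in_time and in_area

print(is_knockknock((100, 200, 0.00), (110, 205, 0.30)))  # → True
print(is_knockknock((100, 200, 0.00), (110, 205, 0.90)))  # → False (too slow)
```

When the predicate returns true, the controller would go on to select and execute one of the terminal's functions.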
- FIG. 19 is a flowchart describing in more detail, among the control methods described with reference to FIG. 4, a method of detecting a tap that hits the terminal using the touch sensor, and FIG. 20 is a diagram for explaining the current consumption of the touch sensor in a mobile terminal according to an embodiment of the present disclosure.
- the touch sensor may be periodically activated while the display unit is inactivated (S1910).
- the "display unit 151 deactivated" means that the illumination provided inside the illumination of the display unit 151 is off (off) state. That is, no information or graphic image is displayed on the display unit 151 when the display unit 151 is inactivated.
- the "display unit 151 is activated” refers to an on state of illumination provided inside the lighting unit, and an execution screen corresponding to one or more functions on the display unit 151, Screen information such as a standby screen or a lock screen may be output.
- the touch sensor may form a mutual layer structure with the display unit 151 and may be disposed to correspond to the display area of the display unit 151.
- the touch sensor may detect a tap in which a touch object hits a specific portion of the display unit 151, sensing not only the position and area tapped on the touch sensor but also the pressure with which the tap is applied.
- the touch object is an object applying a touch to the touch sensor and may be, for example, a finger, a touch pen or a stylus pen, a pointer, or the like.
- the touch sensor may be formed to detect the tap using different methods in the activated and deactivated state of the display unit 151.
- the different methods may be related to the activation period of the touch sensor. More specifically, the touch sensor may be activated at different periods depending on whether the display unit 151 is activated; that is, the touch sensor may sense a tap applied to it while operating with different activation cycles according to whether the display unit 151 is activated.
- when the display unit 151 is deactivated, the touch sensor may be activated at a preset specific period.
- the specific period may be a period corresponding to a time greater than zero.
- when the display unit 151 is activated, the touch sensor may be kept in an activated state at all times. That is, in this case, the activation period of the touch sensor may be zero, or a period with a time very close to zero.
- whether the touch sensor is activated may be distinguished using its power consumption. For example, when the power consumption of the touch sensor is at or below a preset reference value near zero, the touch sensor may be said to be deactivated; when it exceeds that reference value, the touch sensor may be said to be activated.
- when the display unit 151 is in the active state (hereinafter referred to as the active mode), the touch sensor remains activated and waits for a tap to be applied to the display unit 151.
- when the display unit 151 is in the inactive state (hereinafter referred to as the "doze mode"), the touch sensor may be activated every predetermined period.
- the shorter the specific period of activation of the touch sensor, the faster the tap hitting the display unit 151 is detected, but the power consumed by the touch sensor may increase accordingly.
- the longer the period at which the touch sensor is activated, the less power the touch sensor consumes, but the slower the tap hitting the display unit 151 is detected.
- the specific period may be set to increase the efficiency of power consumption while detecting the tap hitting the display unit 151 so fast that the detection speed is not recognized by the user.
- the specific period may be set such that the touch sensor is deactivated and activated 20 times per second (20 Hz).
- the touch sensor may also be activated, and in the activated state its activation period (T) may be zero or very close to zero.
- the activation period of the touch sensor may be several times shorter than the specific period at which it is activated while the display unit 151 is deactivated. That is, the touch sensor may be activated at different periods depending on whether the display unit 151 is activated.
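The trade-off between detection latency and power described above can be illustrated with a simple duty-cycle estimate; the current values and cycle lengths are invented for illustration only:

```python
def avg_sensor_current(active_ma, idle_ma, period_s, active_s):
    """Average current of a touch sensor that wakes for `active_s`
    seconds out of every `period_s`-second cycle."""
    duty = active_s / period_s
    return active_ma * duty + idle_ma * (1 - duty)

# Always-on sensor (display active): duty cycle = 1
print(avg_sensor_current(2.0, 0.1, 1.0, 1.0))  # → 2.0

# Doze mode: awake 5 ms out of every 50 ms cycle (a 20 Hz scan)
print(round(avg_sensor_current(2.0, 0.1, 0.050, 0.005), 3))  # → 0.29
```

Shortening the cycle raises the duty cycle (faster tap detection, more current); lengthening it does the opposite, which is exactly the balance the specific period is tuned for.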
- the controller 180 may control at least one of functions executable in the terminal (S1920).
- the controller 180 may switch the doze mode to an active mode in which the display unit and the touch sensor are activated.
- since the mobile terminal according to an embodiment of the present invention can detect "knockknock" with the touch sensor, it can accurately detect the tap applied to the display unit 151.
- since the touch sensor is periodically activated in the mobile terminal according to an embodiment of the present disclosure, power-use efficiency may be increased.
- FIG. 21 is a diagram illustrating a mode in which a display unit and a touch sensor operate in a mobile terminal according to an exemplary embodiment.
- an operation mode of the mobile terminal may be an active mode 2010, a sleep mode 2030, or a doze mode 2020, according to the operating states of the display unit 151 and the touch sensor.
- the active mode 2010 refers to a state in which both the display unit 151 and the touch sensor are activated. That is, the lighting of the display unit 151 is on, and the touch sensor is activated to receive user input for icons or graphic objects output on the display unit 151, continuously consuming power.
- the sleep mode 2030 means a state in which both the display unit 151 and the touch sensor are deactivated. Illumination of the display unit 151 may be turned off, and even if a touch is applied to the display unit 151, no function may be executed.
- the doze mode 2020 means a state in which the touch sensor is periodically activated while the display unit 151 is deactivated.
- the doze mode 2020 may be described as a state for receiving "knockknock" while the display unit 151 is deactivated.
- the touch sensor may sense a tap applied to the display unit 151 in different ways in the doze mode 2020 and the active mode 2010.
- settings related to the operation of the touch sensor may differ between the doze mode 2020 and the active mode 2010.
- for example, the threshold set for recognizing a tap may be set differently; that is, the sensitivity of the touch sensor may be higher in the active mode 2010 than in the doze mode 2020. This is because the doze mode 2020 is a mode for detecting "knockknock" while reducing power consumption, whereas the active mode 2010 is a mode for accurately detecting user input.
- the controller 180 may selectively switch the active mode 2010 to the sleep mode 2030 or the doze mode 2020 according to a setting or the state of the terminal. That is, the doze mode 2020 may be executed instead of the sleep mode 2030, or vice versa. For example, when the touch sensor is set to recognize "knockknock", the doze mode 2020 is executed; when set not to recognize "knockknock", the sleep mode 2030 may be executed. This setting can be changed by the user.
- the main body may include a button, such as a home button or a power button, for switching between the active mode 2010 and the sleep mode 2030, or between the active mode 2010 and the doze mode 2020.
- the controller 180 can change the operation state of the mobile terminal.
- the controller 180 may execute the active mode 2010. Conversely, if no user input is received for a predetermined time in the active mode 2010, the controller 180 may execute the sleep mode 2030 or the doze mode 2020.
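The transitions among the three operating modes can be sketched as a small table-driven state machine; the event names (timeouts, button presses, proximity detection) are assumptions used to tie together the cases this section describes:

```python
# States: both on / both off / display off but sensor periodically on.
ACTIVE, SLEEP, DOZE = "active", "sleep", "doze"

TRANSITIONS = {
    # (current mode, event) -> next mode
    (ACTIVE, "idle_timeout_knock_on"):  DOZE,   # "knockknock" enabled
    (ACTIVE, "idle_timeout_knock_off"): SLEEP,  # "knockknock" disabled
    (DOZE,   "knockknock"):             ACTIVE,
    (DOZE,   "object_near"):            SLEEP,  # e.g. face-down on a table
    (SLEEP,  "object_removed"):         DOZE,
    (SLEEP,  "power_button"):           ACTIVE,
}

def next_mode(mode, event):
    """Look up the next mode; unknown events leave the mode unchanged."""
    return TRANSITIONS.get((mode, event), mode)

print(next_mode(DOZE, "knockknock"))   # → active
print(next_mode(DOZE, "object_near"))  # → sleep
```

Driving the modes from a single table makes it easy to see that doze and sleep are interchangeable low-power states selected by setting and context, as the preceding bullets describe.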
- the controller 180 may switch between the doze mode 2020 and the sleep mode 2030. The switching method will be described in detail with reference to the accompanying drawings below.
- FIG. 22 is a flowchart illustrating a method of controlling a touch sensor using a proximity sensor in the method illustrated in FIG. 19.
- a proximity sensor of the mobile terminal may sense an object located within a reference distance of the display unit 151, or of its periphery, while the display unit 151 is deactivated (S2110).
- the proximity sensor is disposed near the display unit 151, and may sense an object approaching the display unit 151 or an object located within a reference distance from the display unit 151.
- the reference distance is a distance adjacent to the display unit 151 and may refer to a distance so close that the user cannot check the display unit 151 because of an object covering the display unit 151.
- for example, when the front of the main body on which the display unit 151 is disposed faces a table, or when a case protecting the main body covers the front of the display unit 151, the object may be located within the reference distance of the display unit 151.
- an activation state of the touch sensor may be determined according to whether the object is sensed through the proximity sensor (S2120).
- the controller 180 may deactivate the touch sensor while the display unit 151 is deactivated. That is, referring to FIG. 21, the controller 180 may switch the doze mode to the sleep mode.
- the controller 180 may switch the sleep mode to the doze mode.
- the mobile terminal may change a setting related to the touch sensor according to whether an object located within a reference distance is sensed through the proximity sensor while the display unit 151 is inactivated.
- since the touch sensor is deactivated or only periodically activated according to the state of the terminal, power consumption can be reduced.
- FIG. 23 is a flowchart illustrating a method for preventing a malfunction in a mobile terminal according to an embodiment of the present invention.
- the controller 180 may control at least one function.
- the tap that continuously hits the display unit 151 may include a first tap and a second tap applied within a preset time after the first tap.
- a step of detecting a first tap may be performed (S2310).
- a step of determining whether the second tap corresponds to a preset condition may be performed (S2320).
- the controller 180 may determine that a second tap corresponding to the preset condition has been detected only when at least two (or more) taps are applied continuously within a reference time to a predetermined region.
- the reference time may be a very short time, for example, may be a time within 300ms to 2s.
- the predetermined area may mean a narrow area where the points to which the tap gestures are applied are the same or viewed as the same point.
- the present invention is not necessarily limited thereto, and even if the second tap corresponds to a preset condition, whether the second tap corresponds to the invalid condition may be determined.
- when the second tap corresponds to a preset invalid condition, the controller 180 may restrict control of the function for a preset time after the second tap is detected, even if a tap corresponding to the preset method is sensed (S2340); that is, control of the function is not performed.
- the controller 180 may prevent the malfunction by limiting the operation on the touch sensor for a limited time from the time when the second tap is released.
- when the second tap is not a gesture of hitting the display unit 151 but a gesture that remains in contact with the display unit 151 without being released for a predetermined time, the touch sensor may be deactivated for a limited time after the second tap is released. As a result, malfunctions in the mobile terminal can be prevented in advance.
- the controller 180 may not process tap inputs for a limited time after the reference number of taps. That is, if another tap is detected continuously after the taps corresponding to the reference number, this corresponds to an invalid condition, in which case the function is not performed.
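The malfunction guard described above can be sketched as a small lockout object: after an invalid second tap, further taps are ignored for a limited window. The class name and window length are illustrative assumptions:

```python
class KnockGuard:
    """Ignore taps for `lockout_s` seconds after an invalid second tap."""

    def __init__(self, lockout_s=1.0):
        self.lockout_s = lockout_s
        self.locked_until = -float("inf")  # initially unlocked

    def report_invalid(self, now):
        """Record that an invalid second tap was released at time `now`."""
        self.locked_until = now + self.lockout_s

    def accepts(self, now):
        """Whether a tap at time `now` may control a function."""
        return now >= self.locked_until

guard = KnockGuard(lockout_s=1.0)
guard.report_invalid(now=5.0)
print(guard.accepts(5.5))  # → False (within the limited time)
print(guard.accepts(6.5))  # → True
```

During the locked window the controller simply drops taps, so an accidental long press or a burst of extra taps cannot trigger an unintended function.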
- FIGS. 24 and 25 are conceptual views illustrating an operation example in which a specific region of a touch sensor is deactivated in a state in which a display unit is deactivated in a mobile terminal according to one embodiment of the present invention.
- while the display unit 151 is deactivated, its entire area may be divided into a specific region 151a that receives the preset tap and a region 151b that does not. For example, when the user holds the terminal, an unintended touch may occur from a finger gripping the main body.
- the controller 180 may control at least one function in response to a tap applied to a preset specific region of the display unit 151. That is, when the display unit 151 is in an inactive state, at least one area of the touch sensor may be inactive.
- part of the entire area of the display unit 151 may be covered by the case 2400 or the like.
- the touch sensor may distinguish between an area surrounded by the case 2400 and an area that is not, and activate at least one area of the touch sensor based on the divided area.
- the controller 180 may output screen information to the display unit 151 in response to the applied tap.
- the controller 180 may activate only an area not covered by the case 2400 among the entire areas of the display unit 151 and output screen information to the activated area.
- the tap may be sensed by at least one of an acceleration sensor and a touch sensor included in the sensing unit 140.
- here, an acceleration sensor is a sensor that can measure dynamic forces such as acceleration, vibration, and impact of the terminal body.
- the acceleration sensor may detect the movement (or vibration) of the terminal body generated by a tap gesture, to determine whether a tap has been applied to an object. Accordingly, the acceleration sensor may detect a tap on the terminal body, or a tap on an object located close to the terminal body, by detecting whether movement or vibration has occurred in the terminal body.
- the acceleration sensor may detect the tap not only when the tap is applied to the terminal body but also when it is applied to a point outside the main body.
- in order to detect a tap through the acceleration sensor or the touch sensor, the mobile terminal may operate in a specific mode in which minimal current or power is consumed, even in the locked state.
- a specific mode may be referred to as a 'doze mode'.
- the touch corresponding to a tap applied to the terminal body may be sensed through the touch sensor, or the acceleration sensor may detect that the terminal body or an object around it has been tapped.
- a mode using only the acceleration sensor may be referred to as a first mode, a mode using only the touch sensor as a second mode, and a mode utilizing both the acceleration sensor and the touch sensor (simultaneously or sequentially) as a third mode or a hybrid mode.
- the mobile terminal may control its functions in response to "knockknock"; therefore, a user interface that can easily control the functions of the mobile terminal can be provided.
- the mobile terminal according to an embodiment of the present invention may control different functions or change different setting information according to the characteristics of "knockknock”. Accordingly, the user can control various functions by variously applying "knockknock”.
- the above-described method may be implemented as code that can be read by a processor in a medium in which a program is recorded.
- processor-readable media include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage, and also include implementations in the form of carrier waves (for example, transmission over the Internet).
- the above-described mobile terminal is not limited to the configurations and methods of the embodiments described above; rather, all or some of the embodiments may be selectively combined so that various modifications can be made.
- by proposing a method of controlling functions of a mobile terminal in response to an external force applied to it, embodiments of the present invention can be applied to various related industrial fields.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- Computer Security & Cryptography (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Software Systems (AREA)
- Computing Systems (AREA)
- Health & Medical Sciences (AREA)
- Audiology, Speech & Language Pathology (AREA)
- General Health & Medical Sciences (AREA)
- Telephone Function (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
Claims (14)
- A mobile terminal comprising: a display unit; a touch sensor configured to sense a tap hitting the display unit; and a controller configured to control at least one of the functions executable on the terminal when a tap corresponding to a preset scheme is applied to the display unit, wherein the touch sensor is configured to sense the tap using different schemes in the activated and deactivated states of the display unit.
- The mobile terminal of claim 1, wherein the different schemes are related to an activation period of the touch sensor, and the touch sensor is activated at different periods depending on whether the display unit is activated.
- The mobile terminal of claim 2, wherein the touch sensor is periodically activated, in correspondence with a preset specific period, while the display unit is deactivated.
- The mobile terminal of claim 3, wherein the touch sensor remains continuously activated when the display unit is in the activated state.
- The mobile terminal of claim 2, wherein the power consumed by the touch sensor to sense a touch, when a touch is sensed by the touch sensor, varies depending on whether the display unit is activated.
- The mobile terminal of claim 1, further comprising a proximity sensor configured to sense an object located within a reference distance of the deactivated display unit, wherein whether the touch sensor is activated is determined according to whether the object is sensed through the proximity sensor.
- The mobile terminal of claim 6, wherein the touch sensor is deactivated when the object is sensed through the proximity sensor, and is periodically activated when the object is not sensed.
- The mobile terminal of claim 1, wherein the controller controls the at least one function when a tap continuously hitting the display unit is sensed through the touch sensor while the display unit is deactivated.
- The mobile terminal of claim 8, wherein the continuously hitting tap comprises a first tap and a second tap applied within a preset time after the first tap is applied.
- The mobile terminal of claim 9, wherein, when the second tap corresponds to a preset invalid condition, the controller restricts control of the at least one function for a preset time after the second tap is sensed, even if a tap corresponding to the preset scheme is sensed.
- The mobile terminal of claim 1, wherein, when a tap corresponding to the preset condition is applied while the display unit is deactivated, information related to the information displayed at the position where the tap was applied, among the initial screen information displayed on the display unit when it switches from the deactivated state to the activated state, is displayed on the display unit.
- The mobile terminal of claim 11, wherein the initial screen information corresponds to a lock screen; time information is displayed when the tap is applied to a first region of the display area of the display unit, and a home screen page is output when the tap is applied to a second region of the display area different from the first region.
- The mobile terminal of claim 1, wherein, when the display unit is deactivated, the controller controls the at least one function in response to a tap applied to a preset specific region of the display unit.
- The mobile terminal of claim 1, wherein the touch sensor is disposed to correspond to the display area of the display unit, and at least one region of the touch sensor is deactivated when the display unit is in the deactivated state.
Priority Applications (12)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910418856.3A CN110262743B (zh) | 2013-08-06 | 2013-12-19 | 移动终端 |
CN201380036933.XA CN104781757B (zh) | 2013-08-06 | 2013-12-19 | 移动终端及其控制方法 |
US14/389,990 US9811196B2 (en) | 2013-08-06 | 2013-12-19 | Mobile terminal performing a different operation based on a type of a tap applied to a display and control method thereof |
EP19193791.1A EP3614251B1 (en) | 2013-08-06 | 2013-12-19 | Mobile terminal and control method thereof |
ES13886627T ES2770677T3 (es) | 2013-08-06 | 2013-12-19 | Terminal móvil |
EP13886627.2A EP2857932B1 (en) | 2013-08-06 | 2013-12-19 | Mobile terminal |
JP2016533001A JP2016531357A (ja) | 2013-08-06 | 2013-12-19 | 移動端末機およびその制御方法 |
US15/622,958 US9836162B2 (en) | 2013-08-06 | 2017-06-14 | Mobile terminal performing a different operation based on a type of a tap applied to a display and control method thereof |
US15/797,642 US9977536B2 (en) | 2013-08-06 | 2017-10-30 | Mobile terminal performing a different operation based on a type of a tap applied to a display and control method thereof |
US15/797,708 US10025426B2 (en) | 2013-08-06 | 2017-10-30 | Mobile terminal performing a different operation based on a type of a tap applied to a display and control method thereof |
US15/958,742 US10095353B2 (en) | 2013-08-06 | 2018-04-20 | Mobile terminal performing a different operation based on a type of a tap applied to a display and control method thereof |
US16/120,747 US10509511B2 (en) | 2013-08-06 | 2018-09-04 | Mobile terminal performing a different operation based on a type of a tap applied to a display and control method thereof |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2013-0093363 | 2013-08-06 | ||
KR1020130093363A KR101444091B1 (ko) | 2013-08-06 | 2013-08-06 | Mobile terminal and control method thereof |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/389,990 A-371-Of-International US9811196B2 (en) | 2013-08-06 | 2013-12-19 | Mobile terminal performing a different operation based on a type of a tap applied to a display and control method thereof |
US15/622,958 Continuation US9836162B2 (en) | 2013-08-06 | 2017-06-14 | Mobile terminal performing a different operation based on a type of a tap applied to a display and control method thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015020283A1 true WO2015020283A1 (ko) | 2015-02-12 |
Family
ID=51761029
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2013/011885 WO2015020283A1 (ko) | 2013-12-19 | Mobile terminal and control method thereof |
Country Status (8)
Country | Link |
---|---|
US (6) | US9811196B2 (ko) |
EP (3) | EP3343330A1 (ko) |
JP (1) | JP2016531357A (ko) |
KR (1) | KR101444091B1 (ko) |
CN (2) | CN104781757B (ko) |
DE (1) | DE202013012697U1 (ko) |
ES (1) | ES2770677T3 (ko) |
WO (1) | WO2015020283A1 (ko) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017116893A (ja) * | 2015-12-26 | 2017-06-29 | Murata Manufacturing Co., Ltd. | Stereoscopic image display device |
JP2018528537A (ja) * | 2015-08-20 | 2018-09-27 | Huawei Technologies Co., Ltd. | System and method for double-knuckle touch screen control |
Families Citing this family (45)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101444091B1 (ko) | 2013-08-06 | 2014-09-26 | LG Electronics Inc. | Mobile terminal and control method thereof |
KR20150098115A (ko) * | 2014-02-19 | 2015-08-27 | LG Electronics Inc. | Mobile terminal and control method thereof |
TW201610780A (zh) * | 2014-06-03 | 2016-03-16 | Dongbu HiTek Co., Ltd. | Smart device and control method thereof |
EP2996012B1 (en) | 2014-09-15 | 2019-04-10 | LG Electronics Inc. | Mobile terminal and controlling method thereof |
KR101537629B1 (ko) * | 2014-12-05 | 2015-07-17 | LG Electronics Inc. | Mobile terminal and control method thereof |
KR102251542B1 (ko) * | 2014-11-25 | 2021-05-14 | LG Electronics Inc. | Mobile terminal and control method thereof |
KR102290892B1 (ko) * | 2014-12-12 | 2021-08-19 | LG Electronics Inc. | Mobile terminal and control method thereof |
WO2016106541A1 (zh) * | 2014-12-30 | 2016-07-07 | Shenzhen Royole Technologies Co., Ltd. | Touch operation method, touch operation assembly, and electronic device |
KR20160120560A (ko) | 2015-04-08 | 2016-10-18 | Hyundai Motor Company | Apparatus and method for recognizing user input |
KR20160122517A (ko) * | 2015-04-14 | 2016-10-24 | LG Electronics Inc. | Mobile terminal |
EP3091422B1 (en) * | 2015-05-08 | 2020-06-24 | Nokia Technologies Oy | Method, apparatus and computer program product for entering operational states based on an input type |
CN105094640B (zh) * | 2015-07-08 | 2018-09-04 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Terminal operation method and user terminal |
CN105022484B (zh) * | 2015-07-08 | 2019-02-05 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Terminal operation method and user terminal |
CN105204741B (zh) * | 2015-09-25 | 2019-03-29 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Photographing method and apparatus for a mobile terminal |
CN105242870A (zh) * | 2015-10-30 | 2016-01-13 | Xiaomi Inc. | Method and apparatus for preventing accidental touches on a terminal with a touch screen |
US9977530B2 (en) * | 2015-12-11 | 2018-05-22 | Google Llc | Use of accelerometer input to change operating state of convertible computing device |
KR20170076500A (ko) * | 2015-12-24 | 2017-07-04 | Samsung Electronics Co., Ltd. | Method, storage medium, and electronic device for performing a function based on a biometric signal |
CN105681659A (zh) * | 2016-01-20 | 2016-06-15 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Camera setting method and camera setting apparatus |
US11357936B2 (en) * | 2016-02-25 | 2022-06-14 | Altria Client Services Llc | Method and devices for controlling electronic vaping devices |
CN108780515B (zh) * | 2016-03-14 | 2021-10-26 | Mitsubishi Electric Corporation | Tag, authentication system, electronic lock control system, automatic door control system, and elevator control system |
CN108710469B (zh) | 2016-04-28 | 2021-06-01 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Application startup method, mobile terminal, and medium product |
CN107436622B (zh) * | 2016-05-27 | 2022-10-25 | Fu Tai Hua Industry (Shenzhen) Co., Ltd. | Electronic device with a night light function and night light control method |
US20170357322A1 (en) * | 2016-06-08 | 2017-12-14 | Mplus Co., Ltd. | Terminal device on which piezo actuator module using piezo has been installed |
DK179594B1 (en) * | 2016-06-12 | 2019-02-25 | Apple Inc. | USER INTERFACE FOR MANAGING CONTROLLABLE EXTERNAL DEVICES |
CN107544295A (zh) * | 2016-06-29 | 2018-01-05 | 单正建 | Control method for automobile equipment |
US10620689B2 (en) * | 2016-10-21 | 2020-04-14 | Semiconductor Energy Laboratory Co., Ltd. | Display device, electronic device, and operation method thereof |
KR102667413B1 (ko) * | 2016-10-27 | 2024-05-21 | Samsung Electronics Co., Ltd. | Method and apparatus for executing an application based on a voice command |
CN106897009B (zh) * | 2017-02-10 | 2020-01-10 | Beijing Xiaomi Mobile Software Co., Ltd. | Time display method and apparatus for a mobile device |
KR102389063B1 (ko) * | 2017-05-11 | 2022-04-22 | Samsung Electronics Co., Ltd. | Method for providing haptic feedback and electronic device performing the same |
US10365814B2 (en) * | 2017-05-16 | 2019-07-30 | Apple Inc. | Devices, methods, and graphical user interfaces for providing a home button replacement |
JP2019067214A (ja) * | 2017-10-02 | 2019-04-25 | Yahoo Japan Corporation | Determination program, determination method, terminal device, learning data, and model |
US10571994B1 (en) * | 2017-10-30 | 2020-02-25 | Snap Inc. | Systems and methods for reduced IMU power consumption in a wearable device |
US10194019B1 (en) * | 2017-12-01 | 2019-01-29 | Qualcomm Incorporated | Methods and systems for initiating a phone call from a wireless communication device |
KR102462096B1 (ко) * | 2017-12-13 | 2022-11-03 | Samsung Display Co., Ltd. | Electronic device and driving method thereof |
CN110415388A (zh) * | 2018-04-27 | 2019-11-05 | Carrier Corporation | Knock-gesture access control system |
AU2019267527A1 (en) | 2018-05-07 | 2020-11-19 | Apple Inc. | User interfaces for viewing live video feeds and recorded video |
KR20200071841A (ko) * | 2018-12-03 | 2020-06-22 | Hyundai Motor Company | Apparatus and method for processing voice commands for a vehicle |
US10904029B2 (en) | 2019-05-31 | 2021-01-26 | Apple Inc. | User interfaces for managing controllable external devices |
US11363071B2 (en) | 2019-05-31 | 2022-06-14 | Apple Inc. | User interfaces for managing a local network |
KR20210101580A (ko) | 2020-02-10 | 2021-08-19 | Samsung Electronics Co., Ltd. | Electronic device for distinguishing different input operations and method thereof |
US11429401B2 (en) * | 2020-03-04 | 2022-08-30 | Landis+Gyr Innovations, Inc. | Navigating a user interface of a utility meter with touch-based interactions |
CN111443821B (zh) * | 2020-03-09 | 2023-01-13 | Vivo Mobile Communication Co., Ltd. | Processing method and apparatus |
US11513667B2 (en) | 2020-05-11 | 2022-11-29 | Apple Inc. | User interface for audio message |
DE102021105256A1 (de) | 2021-03-04 | 2022-09-08 | Audi Aktiengesellschaft | Mobile communication device and method for operating a mobile communication device |
US20240094795A1 (en) * | 2022-09-16 | 2024-03-21 | Lenovo (United States) Inc. | Computing system power-on using circuit |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20100023326A (ko) * | 2008-08-21 | 2010-03-04 | LG Electronics Inc. | Mobile terminal and control method thereof |
KR20100052378A (ko) * | 2008-11-10 | 2010-05-19 | Samsung Electronics Co., Ltd. | Motion input device for a portable terminal and operating method thereof |
KR20110054415A (ко) * | 2009-11-17 | 2011-05-25 | Samsung Electronics Co., Ltd. | Screen display method and apparatus |
JP2012146156A (ja) * | 2011-01-13 | 2012-08-02 | Canon Inc. | Information processing apparatus, control method therefor, program, and recording medium |
KR20130081673A (ко) * | 2012-01-09 | 2013-07-17 | Broadcom Corporation | Fast touch detection in a mutual capacitive touch system |
Family Cites Families (38)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6057830A (en) * | 1997-01-17 | 2000-05-02 | Tritech Microelectronics International Ltd. | Touchpad mouse controller |
US20050024341A1 (en) | 2001-05-16 | 2005-02-03 | Synaptics, Inc. | Touch screen with user interface enhancement |
US7126588B2 (en) * | 2002-06-27 | 2006-10-24 | Intel Corporation | Multiple mode display apparatus |
US8373660B2 (en) | 2003-07-14 | 2013-02-12 | Matt Pallakoff | System and method for a portable multimedia client |
US8970501B2 (en) | 2007-01-03 | 2015-03-03 | Apple Inc. | Proximity and multi-touch sensor detection and demodulation |
US7872652B2 (en) | 2007-01-07 | 2011-01-18 | Apple Inc. | Application programming interfaces for synchronization |
KR20080073872A (ко) | 2007-02-07 | 2008-08-12 | LG Electronics Inc. | Mobile communication terminal having a touch screen and information input method using the same |
US8600816B2 (en) * | 2007-09-19 | 2013-12-03 | T1visions, Inc. | Multimedia, multiuser system and associated methods |
US20090194341A1 (en) * | 2008-02-05 | 2009-08-06 | Nokia Corporation | Method and device for operating a resistive touch input component as a proximity sensor |
US20090239581A1 (en) | 2008-03-24 | 2009-09-24 | Shu Muk Lee | Accelerometer-controlled mobile handheld device |
US20100066677A1 (en) | 2008-09-16 | 2010-03-18 | Peter Garrett | Computer Peripheral Device Used for Communication and as a Pointing Device |
KR101572071B1 (ко) * | 2009-01-06 | 2015-11-26 | Samsung Electronics Co., Ltd. | Method and apparatus for controlling on/off of the display unit of a portable terminal |
TWI419034B (zh) * | 2009-04-03 | 2013-12-11 | Novatek Microelectronics Corp | Control method for detecting touch events in a touch panel and related device |
CH701440A2 (fr) * | 2009-07-03 | 2011-01-14 | Comme Le Temps Sa | Touch-screen wristwatch and method of displaying on a touch-screen watch. |
US8970507B2 (en) * | 2009-10-02 | 2015-03-03 | Blackberry Limited | Method of waking up and a portable electronic device configured to perform the same |
EP2315101B1 (en) * | 2009-10-02 | 2014-01-29 | BlackBerry Limited | A method of waking up and a portable electronic device configured to perform the same |
US8519974B2 (en) * | 2010-01-19 | 2013-08-27 | Sony Corporation | Touch sensing device, touch screen device comprising the touch sensing device, mobile device, method for sensing a touch and method for manufacturing a touch sensing device |
US8261213B2 (en) * | 2010-01-28 | 2012-09-04 | Microsoft Corporation | Brush, carbon-copy, and fill gestures |
US9405404B2 (en) * | 2010-03-26 | 2016-08-02 | Autodesk, Inc. | Multi-touch marking menus and directional chording gestures |
KR20110125358A (ко) | 2010-05-13 | 2011-11-21 | Samsung Electronics Co., Ltd. | Method and apparatus for controlling the display unit of a portable terminal |
KR101685145B1 (ко) * | 2010-06-09 | 2016-12-09 | LG Electronics Inc. | Mobile terminal and control method thereof |
US8811948B2 (en) * | 2010-07-09 | 2014-08-19 | Microsoft Corporation | Above-lock camera access |
CN102576276B (zh) * | 2010-08-23 | 2017-05-10 | Parade Technologies, Ltd. | Capacitance scanning proximity detection |
US8311514B2 (en) | 2010-09-16 | 2012-11-13 | Microsoft Corporation | Prevention of accidental device activation |
US9262002B2 (en) * | 2010-11-03 | 2016-02-16 | Qualcomm Incorporated | Force sensing touch screen |
US8866735B2 (en) * | 2010-12-16 | 2014-10-21 | Motorola Mobility LLC | Method and apparatus for activating a function of an electronic device |
TW201232349A (en) * | 2011-01-21 | 2012-08-01 | Novatek Microelectronics Corp | Single finger gesture determination method, touch control chip, touch control system and computer system |
US8635560B2 (en) | 2011-01-21 | 2014-01-21 | Blackberry Limited | System and method for reducing power consumption in an electronic device having a touch-sensitive display |
US9250798B2 (en) * | 2011-01-24 | 2016-02-02 | Apple Inc. | Device, method, and graphical user interface with a dynamic gesture disambiguation threshold |
WO2013057048A1 (en) * | 2011-10-18 | 2013-04-25 | Slyde Watch Sa | A method and circuit for switching a wristwatch from a first power mode to a second power mode |
US20130100044A1 (en) * | 2011-10-24 | 2013-04-25 | Motorola Mobility, Inc. | Method for Detecting Wake Conditions of a Portable Electronic Device |
US9489061B2 (en) * | 2011-11-14 | 2016-11-08 | Logitech Europe S.A. | Method and system for power conservation in a multi-zone input device |
US9785217B2 (en) | 2012-09-28 | 2017-10-10 | Synaptics Incorporated | System and method for low power input object detection and interaction |
US20140168126A1 (en) * | 2012-12-14 | 2014-06-19 | Jeffrey N. Yu | Dynamic border control systems and methods |
CN103019796A (zh) | 2012-12-28 | 2013-04-03 | Shenzhen Goodix Technology Co., Ltd. | Wake-up method and system for a touch terminal, and touch terminal |
KR102034584B1 (ко) * | 2013-06-20 | 2019-10-21 | LG Electronics Inc. | Portable device and control method thereof |
KR102160767B1 (ко) * | 2013-06-20 | 2020-09-29 | Samsung Electronics Co., Ltd. | Portable terminal and method for controlling a function by sensing a gesture |
KR101444091B1 (ко) * | 2013-08-06 | 2014-09-26 | LG Electronics Inc. | Mobile terminal and control method thereof |
- 2013-08-06 KR KR1020130093363A patent/KR101444091B1/ko active IP Right Grant
- 2013-12-19 WO PCT/KR2013/011885 patent/WO2015020283A1/ko active Application Filing
- 2013-12-19 CN CN201380036933.XA patent/CN104781757B/zh active Active
- 2013-12-19 CN CN201910418856.3A patent/CN110262743B/zh active Active
- 2013-12-19 EP EP18152464.6A patent/EP3343330A1/en not_active Ceased
- 2013-12-19 EP EP19193791.1A patent/EP3614251B1/en active Active
- 2013-12-19 JP JP2016533001A patent/JP2016531357A/ja active Pending
- 2013-12-19 US US14/389,990 patent/US9811196B2/en active Active
- 2013-12-19 EP EP13886627.2A patent/EP2857932B1/en active Active
- 2013-12-19 ES ES13886627T patent/ES2770677T3/es active Active
- 2013-12-19 DE DE202013012697.3U patent/DE202013012697U1/de not_active Expired - Lifetime
- 2017-06-14 US US15/622,958 patent/US9836162B2/en active Active
- 2017-10-30 US US15/797,708 patent/US10025426B2/en active Active
- 2017-10-30 US US15/797,642 patent/US9977536B2/en active Active
- 2018-04-20 US US15/958,742 patent/US10095353B2/en active Active
- 2018-09-04 US US16/120,747 patent/US10509511B2/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20100023326A (ко) * | 2008-08-21 | 2010-03-04 | LG Electronics Inc. | Mobile terminal and control method thereof |
KR20100052378A (ко) * | 2008-11-10 | 2010-05-19 | Samsung Electronics Co., Ltd. | Motion input device for a portable terminal and operating method thereof |
KR20110054415A (ко) * | 2009-11-17 | 2011-05-25 | Samsung Electronics Co., Ltd. | Screen display method and apparatus |
JP2012146156A (ja) * | 2011-01-13 | 2012-08-02 | Canon Inc. | Information processing apparatus, control method therefor, program, and recording medium |
KR20130081673A (ко) * | 2012-01-09 | 2013-07-17 | Broadcom Corporation | Fast touch detection in a mutual capacitive touch system |
Non-Patent Citations (1)
Title |
---|
See also references of EP2857932A4 * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018528537A (ja) * | 2015-08-20 | 2018-09-27 | Huawei Technologies Co., Ltd. | System and method for double-knuckle touch screen control |
JP2017116893A (ja) * | 2015-12-26 | 2017-06-29 | Murata Manufacturing Co., Ltd. | Stereoscopic image display device |
Also Published As
Publication number | Publication date |
---|---|
JP2016531357A (ja) | 2016-10-06 |
US9836162B2 (en) | 2017-12-05 |
EP2857932A4 (en) | 2016-08-31 |
US20170285851A1 (en) | 2017-10-05 |
US20180373388A1 (en) | 2018-12-27 |
EP2857932A1 (en) | 2015-04-08 |
EP3343330A1 (en) | 2018-07-04 |
US9977536B2 (en) | 2018-05-22 |
CN110262743B (zh) | 2022-03-18 |
US10509511B2 (en) | 2019-12-17 |
CN110262743A (zh) | 2019-09-20 |
US10095353B2 (en) | 2018-10-09 |
US20180239490A1 (en) | 2018-08-23 |
US20180067609A1 (en) | 2018-03-08 |
DE202013012697U1 (de) | 2018-08-07 |
US20180046317A1 (en) | 2018-02-15 |
CN104781757B (zh) | 2019-06-14 |
ES2770677T3 (es) | 2020-07-02 |
US10025426B2 (en) | 2018-07-17 |
EP3614251A1 (en) | 2020-02-26 |
US9811196B2 (en) | 2017-11-07 |
EP2857932B1 (en) | 2019-11-20 |
EP3614251B1 (en) | 2023-05-10 |
US20160259459A1 (en) | 2016-09-08 |
KR101444091B1 (ko) | 2014-09-26 |
CN104781757A (zh) | 2015-07-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2015020283A1 (ко) | Mobile terminal and control method thereof | |
WO2015020284A1 (ко) | Mobile terminal and control method thereof | |
WO2018034402A1 (en) | Mobile terminal and method for controlling the same | |
WO2020171287A1 (ко) | Mobile terminal and electronic device having a mobile terminal | |
WO2016182132A1 (ко) | Mobile terminal and control method thereof | |
WO2015199270A1 (ко) | Mobile terminal and control method thereof | |
WO2015111778A1 (ко) | Glasses-type terminal and control method thereof | |
WO2017082508A1 (ко) | Watch-type terminal and control method thereof | |
WO2017119531A1 (ко) | Mobile terminal and control method thereof | |
WO2017131319A1 (en) | Mobile terminal for one-hand operation mode of controlling paired device, notification and application | |
WO2017030223A1 (ко) | Mobile terminal having a card unit and control method thereof | |
WO2017104860A1 (ко) | Rollable mobile terminal | |
WO2016032045A1 (ко) | Mobile terminal and control method thereof | |
WO2017057803A1 (ко) | Mobile terminal and control method thereof | |
WO2017039094A1 (en) | Mobile terminal and method for controlling the same | |
WO2015072677A1 (en) | Mobile terminal and method of controlling the same | |
WO2017047854A1 (ко) | Mobile terminal and control method thereof | |
WO2017119529A1 (ко) | Mobile terminal | |
WO2017034126A1 (ко) | Mobile terminal | |
WO2016010202A1 (en) | Mobile terminal and control method for the mobile terminal | |
WO2017052043A1 (en) | Mobile terminal and method for controlling the same | |
WO2017039051A1 (ко) | Watch-type mobile terminal and control method thereof | |
WO2016039498A1 (ко) | Mobile terminal and control method thereof | |
WO2017051959A1 (ко) | Terminal device and control method of terminal device | |
WO2016114437A1 (ко) | Mobile terminal and control method thereof | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| WWE | Wipo information: entry into national phase | Ref document number: 14389990; Country of ref document: US |
| WWE | Wipo information: entry into national phase | Ref document number: 2013886627; Country of ref document: EP |
| ENP | Entry into the national phase | Ref document number: 2016533001; Country of ref document: JP; Kind code of ref document: A |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 13886627; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |