CN111418197A - Mobile terminal and control method thereof

Info

Publication number: CN111418197A
Application number: CN201880076824.3A
Authority: CN (China)
Prior art keywords: visual information, mobile terminal, state, displayed, user input
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 任素演, 李珍姬
Current Assignee: LG Electronics Inc
Original Assignee: LG Electronics Inc
Application filed by LG Electronics Inc
Publication of CN111418197A

Classifications

    • H: ELECTRICITY
        • H04: ELECTRIC COMMUNICATION TECHNIQUE
            • H04M: TELEPHONIC COMMUNICATION
                • H04M1/00: Substation equipment, e.g. for use by subscribers
                    • H04M1/72: Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
                        • H04M1/724: User interfaces specially adapted for cordless or mobile telephones
                            • H04M1/72403: ... with means for local support of applications that increase the functionality
                            • H04M1/72448: ... with means for adapting the functionality of the device according to specific conditions
                                • H04M1/72454: ... according to context-related or environment-related conditions
                • H04M2201/00: Electronic components, circuits, software, systems or apparatus used in telephone systems
                    • H04M2201/34: Microprocessors
                    • H04M2201/36: Memories
                • H04M2250/00: Details of telephonic subscriber devices
                    • H04M2250/12: ... including a sensor for measuring a physical value, e.g. temperature or motion
                    • H04M2250/22: ... including a touch pad, a touch sensor or a touch detector
    • G: PHYSICS
        • G06: COMPUTING; CALCULATING OR COUNTING
            • G06F: ELECTRIC DIGITAL DATA PROCESSING
                • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
                    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
                        • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
                            • G06F3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
                                • G06F3/0354: ... with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
                                    • G06F3/03547: Touch pads, in which fingers can move on a surface
                        • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
                            • G06F3/0481: ... based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
                                • G06F3/04817: ... using icons
                            • G06F3/0487: ... using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
                                • G06F3/0488: ... using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Environmental & Geological Engineering (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

The present invention relates to a mobile terminal capable of detecting a motion of gripping the terminal, the mobile terminal including: a main body provided with a case forming its exterior; a memory storing a plurality of pieces of visual information; a touch screen disposed on the front surface of the main body and configured to display at least one of the plurality of pieces of visual information; a grip sensor disposed on a side surface of the main body, attached to an inner surface of the case, and detecting a user input applied to the side surface; and a control unit that, in an editing mode in which the plurality of pieces of visual information are edited, executes an all-selection function of setting the displayed at least one piece of visual information to an editable selection state in accordance with a user input detected by the grip sensor, and that, even when the all-selection function is executed, does not set the remaining visual information other than the displayed at least one piece to the selection state.

Description

Mobile terminal and control method thereof
Technical Field
The present invention relates to a mobile terminal capable of detecting a motion of grasping the terminal.
Background
Terminals can be classified into mobile/portable terminals and stationary terminals according to whether they are movable. Mobile terminals can further be classified into handheld terminals and vehicle-mounted terminals according to whether the user can carry them directly.
The functions of mobile terminals are becoming increasingly diverse. Examples include data and voice communication, photo and video capture through a camera, voice recording, music file playback through a speaker system, and output of images or video on a display unit. Some terminals additionally offer electronic game or multimedia playback functions. In particular, recent mobile terminals can receive multicast signals that provide visual content such as broadcasts, videos, or television programs.
As the functions of terminals diversify, terminals are increasingly implemented as multimedia devices (multimedia players) with integrated functions such as taking pictures or videos, playing music or video files, playing games, and receiving broadcasts.
In order to support and expand the functions of such a terminal, improvement of the structural part and/or software of the terminal may be considered.
Recently, terminals equipped with grip sensors that detect user inputs applied to the side surfaces of the terminal body have been emerging, both to simplify the exterior design of the terminal and to control various functions. A grip sensor is a sensor that detects the pressure applied when the user grips the terminal.
Since grip sensors make new forms of user input possible, there is growing market demand for new user experiences that control the terminal in new ways.
Disclosure of Invention
Problems to be solved by the invention
It is an object of the present invention to provide a variety of user interfaces that utilize user input applied through a grip sensor.
Means for solving the problems
The present invention is characterized by comprising: a main body provided with a case forming an appearance; a memory storing a plurality of visual information; a touch screen disposed on a front surface of the main body and displaying at least one of the plurality of visual information; a grip sensor disposed at a side of the body and attached to an inner surface of the housing, detecting a user input applied to the side; and a control unit configured to execute, when an edit mode for editing the plurality of pieces of visual information is being executed, an all-selection function for setting the displayed at least one piece of visual information to an editable selection state in accordance with a user input detected by the grip sensor, and the control unit does not set remaining visual information other than the displayed at least one piece of visual information to the selection state even if the all-selection function is executed.
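To make this behavior concrete, the following Kotlin sketch illustrates the displayed-only all-selection logic, together with the display-count and all-cancel variants described in the embodiments below. This is a minimal illustration under stated assumptions, not the disclosed implementation; `GalleryEditController`, its fields, and `onGripInput` are hypothetical names.

```kotlin
// Minimal sketch (hypothetical names): grip-triggered all-selection that
// applies only to the visual information currently displayed.
data class VisualItem(val id: Long, var selected: Boolean = false)

class GalleryEditController(
    private val stored: List<VisualItem>,  // all visual information in the memory
    private var displayedCount: Int        // how many items the touch screen shows
) {
    var editMode = false

    // A preset touch input (e.g., a pinch) changes how many items are shown,
    // and with it the scope of the next all-selection.
    fun onDisplayCountChanged(newCount: Int) {
        displayedCount = newCount.coerceIn(0, stored.size)
    }

    // Called when the grip sensor reports a squeeze on the side surface.
    fun onGripInput() {
        if (!editMode) return
        val displayed = stored.take(displayedCount)
        if (displayed.isNotEmpty() && displayed.all { it.selected }) {
            // All-cancel: with everything selected, a further grip input
            // returns the items to the unselected state.
            displayed.forEach { it.selected = false }
        } else {
            // All-selection: only the displayed items enter the editable
            // selection state; the remaining stored items stay unselected.
            displayed.forEach { it.selected = true }
        }
    }
}
```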
In one embodiment, the control unit determines the number of pieces of visual information to be displayed on the touch screen based on a touch input applied in a preset manner, and in a state where visual information corresponding to the determined display number is displayed on the touch screen, the control unit executes the all-selection function based on the user input detected by the grip sensor, such that among the visual information stored in the memory, only the visual information corresponding to the determined display number is set to the selection state.
In one embodiment, in a state where a first number of pieces of the visual information stored in the memory are displayed, the control unit reduces the number of pieces displayed on the touch screen to a second number smaller than the first number in accordance with a touch input applied in a predetermined manner, and when a user input is applied through the grip sensor in a state where the second number of pieces of visual information are displayed, the control unit executes the all-selection function so that the second number of pieces of visual information are set to the selected state.
In one embodiment, in a state where some items of list information including a plurality of items are displayed on the touch screen, the control unit executes a scroll function for scrolling the list information in accordance with a drag input applied to the touch screen, and when a user input is detected by the grip sensor after the scroll function is executed, the control unit selects the items of the list information that have been displayed on the touch screen through execution of the scroll function.
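A small sketch of this scroll-then-grip selection follows, under the reading that every item revealed while scrolling joins the selection (the claim language could also be read as selecting only the items visible after the scroll). All names here are hypothetical.

```kotlin
// Hypothetical sketch: after scrolling, a grip input selects the list items
// that have been displayed on the touch screen through the scroll function.
class ScrollSelectController(
    private val items: List<String>,
    private val pageSize: Int
) {
    private var firstVisible = 0
    private val seen = mutableSetOf<Int>()   // indices displayed so far
    val selected = mutableSetOf<Int>()

    init { markVisible() }

    private fun markVisible() {
        val last = minOf(firstVisible + pageSize, items.size)
        for (i in firstVisible until last) seen += i
    }

    // A drag input scrolls the list; newly revealed items become "displayed".
    fun onScroll(delta: Int) {
        firstVisible = (firstVisible + delta)
            .coerceIn(0, maxOf(0, items.size - pageSize))
        markVisible()
    }

    // Grip input: select exactly the items displayed through scrolling.
    fun onGripInput() { selected += seen }
}
```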
In one embodiment, in a selected state in which the plurality of pieces of visual information are all selected, the control unit executes an all-cancel function in accordance with a user input detected by the grip sensor, so that the plurality of pieces of visual information are set to an unselected state.
In one embodiment, the plurality of pieces of visual information are each associated with functions different from each other, and the control unit activates all of the different functions or deactivates all of the different functions in accordance with a user input detected by the grip sensor.
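As a minimal sketch of this toggle-all behavior (hypothetical names; the function list is purely illustrative):

```kotlin
// Hypothetical sketch: one grip input activates or deactivates a group of
// mutually different functions together.
class FunctionPanel(private val toggles: MutableMap<String, Boolean>) {
    fun onGripInput() {
        // If any function is currently active, deactivate all of them;
        // otherwise activate all of them.
        val anyActive = toggles.values.any { it }
        for (key in toggles.keys.toList()) toggles[key] = !anyActive
    }
}

fun main() {
    val panel = FunctionPanel(mutableMapOf("Wi-Fi" to true, "Bluetooth" to false))
    panel.onGripInput()   // both functions become inactive
}
```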
In one embodiment, the touch screen further includes an input window that displays characters entered according to a user control command, and the control unit executes the all-selection function in accordance with the user input detected by the grip sensor, so that all characters displayed in the input window are selected.
In one embodiment, the mobile terminal further includes: a proximity sensor that detects an object located around the touch screen; and a gyro sensor that detects the inclination of the main body. When an object around the touch screen is detected and a specific function is being executed while the main body is horizontal, the control unit stops execution of the specific function if a user input is detected by the grip sensor; when an object around the touch screen is detected and the mobile terminal is in a standby state while the main body is horizontal, the control unit executes a mute mode in which warning sounds are controlled not to be output audibly.
In one embodiment, the mobile terminal further includes: a proximity sensor that detects an object located around the touch screen; a gyro sensor that detects the inclination of the main body; and an illuminance sensor that detects the illuminance around the touch screen. The control unit stops execution of a specific function when the grip sensor detects a user input while the specific function is being executed in a state where an object around the touch screen is detected, the main body is not horizontal, and the illuminance around the touch screen is at or below a reference value; the control unit executes a mute mode, in which notification information is not output, when the grip sensor detects a user input in a standby state under the same sensor conditions.
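The two embodiments above amount to one context-dependent decision rule: the same grip input is interpreted differently for the face-down-on-a-table case and the in-pocket case. A hedged Kotlin sketch (all names and the string results are illustrative stand-ins):

```kotlin
// Hypothetical sketch of context-dependent grip handling: proximity, posture,
// and illuminance decide whether a grip stops the running function or mutes.
data class DeviceContext(
    val objectNearby: Boolean,    // proximity sensor: object near the touch screen
    val horizontal: Boolean,      // gyro sensor: main body lying flat
    val lowLight: Boolean,        // illuminance at or below the reference value
    val runningFunction: String?  // null means standby
)

fun onGripInput(ctx: DeviceContext): String = when {
    // Covered and flat (e.g., face down on a table), or covered, tilted, and
    // dark (e.g., inside a pocket or bag):
    ctx.objectNearby && (ctx.horizontal || (!ctx.horizontal && ctx.lowLight)) ->
        if (ctx.runningFunction != null) "stop ${ctx.runningFunction}"
        else "enter mute mode"
    else -> "no context-specific action"
}
```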
In one embodiment, the mobile terminal further includes an acceleration sensor that detects the acceleration of the main body. The touch screen is a rectangle having a first side longer than a second side, and the main body assumes either a portrait posture, in which the first side is arranged parallel to the direction of gravity, or a landscape posture, in which the first side is arranged perpendicular to the direction of gravity. When it is detected that the posture of the main body has changed from the portrait posture to the landscape posture and a user input is detected by the grip sensor in the landscape posture, the control unit changes the display direction of the visual information displayed on the touch screen.
In one embodiment, when it is detected that the posture of the main body has changed from the landscape posture back to the portrait posture, the control unit returns the display direction of the visual information to its state before the change.
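A minimal sketch of this posture-plus-grip rotation behavior (hypothetical names):

```kotlin
// Hypothetical sketch: a grip input while in the landscape posture changes
// the display direction; returning to portrait restores the previous state.
enum class Posture { PORTRAIT, LANDSCAPE }

class DisplayDirectionController {
    var rotatedDisplay = false
        private set
    private var posture = Posture.PORTRAIT

    fun onPostureChanged(newPosture: Posture) {
        if (posture == Posture.LANDSCAPE && newPosture == Posture.PORTRAIT) {
            rotatedDisplay = false   // return to the state before the change
        }
        posture = newPosture
    }

    fun onGripInput() {
        if (posture == Posture.LANDSCAPE) rotatedDisplay = true
    }
}
```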
In one embodiment, the mobile terminal includes a fingerprint sensor that detects the fingerprint of a user's finger, and if a user input is detected by the grip sensor after the fingerprint sensor detects the fingerprint, the control unit executes a lock function so that the screen information displayed on the touch screen does not change.
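A sketch of this fingerprint-then-grip lock, similar in spirit to screen pinning; all names are hypothetical:

```kotlin
// Hypothetical sketch: a grip input following fingerprint detection pins the
// current screen so the displayed screen information does not change.
class ScreenPinController {
    private var fingerprintVerified = false
    var screenPinned = false
        private set

    fun onFingerprintDetected(verified: Boolean) {
        fingerprintVerified = verified
    }

    fun onGripInput() {
        if (fingerprintVerified) {
            screenPinned = true        // lock function: freeze screen changes
            fingerprintVerified = false
        }
    }
}
```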
In one embodiment, when the main body rotates in a state where a user input is detected by the grip sensor, the control unit executes a multi-window function of dividing the touch screen into a plurality of regions and displaying different visual information in each region, and determines the display region of first visual information according to the rotation direction of the main body.
In one embodiment, the touch screen includes a first area and a second area, and the control unit displays the first visual information in the first area when the rotation direction of the main body is a first direction, and displays the first visual information in the second area when the rotation direction is a second direction.
In one embodiment, when the multi-window function is executed, the control unit displays, in one region of the touch screen, second visual information different from the first visual information, the second visual information being either an icon of a frequently used application or an execution screen of the most recently executed application among the applications running in the background.
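The rotation-direction rule in the last three embodiments can be sketched as follows (hypothetical names; the choice of second content follows the embodiment above):

```kotlin
// Hypothetical sketch: rotating the main body while gripping splits the touch
// screen, and the rotation direction decides where the current content lands.
enum class Rotation { FIRST_DIRECTION, SECOND_DIRECTION }

data class MultiWindow(val firstArea: String, val secondArea: String)

fun enterMultiWindow(
    firstVisualInfo: String,   // what was on screen when the grip began
    direction: Rotation,
    secondVisualInfo: String   // e.g., a frequent app icon or a recent app screen
): MultiWindow = when (direction) {
    Rotation.FIRST_DIRECTION ->
        MultiWindow(firstArea = firstVisualInfo, secondArea = secondVisualInfo)
    Rotation.SECOND_DIRECTION ->
        MultiWindow(firstArea = secondVisualInfo, secondArea = firstVisualInfo)
}
```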
A control method of a mobile terminal provided with a grip sensor that detects a user input applied to a side surface of a main body, the method comprising: a step of displaying part of a plurality of pieces of visual information stored in a memory on the touch screen; a step of detecting a user input through the grip sensor while an editing mode for editing the plurality of pieces of visual information is being executed; and a step of executing, in accordance with the user input, an all-selection function of setting the displayed at least one piece of visual information to an editable selection state, the all-selection function not setting the remaining visual information other than the displayed at least one piece to the selection state.
In one embodiment, the step of displaying the part of the visual information on the touch screen further includes: a step of displaying the part of the visual information together with new visual information in accordance with a touch input applied in a preset manner to the part of the visual information; and a step of setting the part of the visual information and the new visual information all to the selection state when the grip sensor detects a user input in a state where the part of the visual information and the new visual information are displayed.
In one embodiment, the method further includes: a step of setting the selection state of the part of the visual information to a non-editable canceled state in accordance with a user input detected by the grip sensor in a state where the part of the visual information is displayed.
In one embodiment, some items of list information including a plurality of items are displayed on the touch screen, and the step of executing the all-selection function further includes: a step of executing a scroll function of scrolling the list information in accordance with a drag input applied to the touch screen in a state where the some items are displayed; and a step of selecting, from the list information, the items displayed on the touch screen through execution of the scroll function when the user input is detected by the grip sensor after the scroll function is executed.
In one embodiment, when the main body rotates in a state where first visual information is displayed on the touch screen and a user input is detected by the grip sensor, a step of executing a multi-window function of dividing the touch screen into a plurality of regions and displaying different visual information in each region is performed, and the first visual information is displayed in a specific region among the plurality of divided regions according to the rotation direction of the main body.
Effects of the invention
According to the mobile terminal of the present invention, while the editing mode for editing the plurality of pieces of visual information displayed on the touch screen is being executed, functions associated with the plurality of pieces of visual information are executed in response to a user input applied through the grip sensor, which can improve user convenience.
In addition, according to the mobile terminal of the present invention, user convenience can be improved by performing operation control associated with the state of the terminal in response to a user input applied through the grip sensor.
Drawings
Fig. 1a is a block diagram for explaining a mobile terminal associated with the present invention, and fig. 1b and 1c are conceptual views of an example of the mobile terminal associated with the present invention as viewed from different directions.
Fig. 2a to 2b are conceptual views for explaining a grip sensor mounted in a mobile terminal according to the present invention.
Fig. 2c is a conceptual diagram illustrating a grip sensor attached to a substrate.
Fig. 3a to 3c are conceptual views for explaining a grip sensor disposed on a side surface of a main body.
Fig. 4a and 4b are conceptual views showing a method of selecting a plurality of image information at once when running an album application.
Fig. 5a to 5c are conceptual views illustrating a method of controlling list information including a plurality of items through a grip input.
Fig. 6a to 6c are conceptual views showing an example of an operation of turning on/off a plurality of functions at once according to a user input detected by a grip sensor.
Fig. 7a and 7b are conceptual views for explaining a method of selecting the entire text information using the user input detected by the grip sensor.
Fig. 8a to 10b are conceptual views illustrating a method of executing a function associated with state information of a mobile terminal according to a user input detected by a grip sensor.
Fig. 11a and 11b are conceptual views illustrating a method of performing multitasking using a user input detected by a grip sensor.
Detailed Description
Hereinafter, embodiments disclosed in the present specification will be described in detail with reference to the accompanying drawings, where the same or similar structural elements are given the same reference numerals regardless of the figure number, and redundant description thereof will be omitted. The suffixes "module" and "portion" for structural elements used in the following description are given or mixed only in consideration of the writing of the specification, and do not have meanings or effects distinguished from each other by themselves. Moreover, in the description of the embodiments disclosed in the present specification, if it is determined that the detailed description of the related known technology would obscure the technical idea of the embodiments disclosed in the present specification, the detailed description thereof will be omitted. The drawings attached hereto are only for the purpose of facilitating understanding of the embodiments disclosed herein, and the technical idea disclosed herein is not limited to the drawings attached hereto, but rather, the present invention is intended to cover all modifications, equivalents, and alternatives included in the technical scope and spirit of the present invention.
The terms "first", "second", and the like, including ordinal numbers, may be used to describe various structural elements, but the structural elements are not limited by the terms. The terms are used only for the purpose of distinguishing one structural element from other structural elements.
If a structural element is referred to as being "connected" or "coupled" to another structural element, it may be directly connected or coupled to the other structural element, but it is also understood that other structural elements may be present therebetween. Conversely, if a structural element is referred to as being "directly connected" or "directly coupled" to another structural element, it is understood that no other structural element exists therebetween.
Unless the context clearly dictates otherwise, singular expressions shall include plural expressions.
In the present application, the terms "including" or "having" are used only for specifying the presence of the features, numerals, steps, actions, structural elements, components, or combinations thereof described in the specification, and are not intended to exclude the possibility of the presence or addition of one or more other features, numerals, steps, actions, structural elements, components, or combinations thereof.
The portable electronic devices described in this specification may include a mobile phone, a smart phone, a notebook computer (laptop computer), a digital broadcasting terminal, a PDA (personal digital assistant), a PMP (portable multimedia player), a navigation device, a tablet PC, an ultrabook, a wearable device (e.g., a watch-type terminal (smartwatch), a glasses-type terminal (smart glasses), a head mounted display (HMD)), and the like.
However, those skilled in the art can easily understand that the structure of the embodiments described in the present specification can be applied to a stationary terminal such as a digital TV, a desktop computer, and a digital signage, in addition to a portable electronic device.
Referring to fig. 1a to 1c, fig. 1a is a block diagram for explaining a mobile terminal related to the present invention, and fig. 1b and 1c are conceptual views of an example of the mobile terminal related to the present invention as viewed from directions different from each other.
The mobile terminal 100 may include a wireless communication unit 110, an input unit 120, a detection unit 140, an output unit 150, an interface unit 160, a memory 170, a control unit 180, a power supply unit 190, and the like. The constituent elements shown in fig. 1a are not essential to implementing the mobile terminal, so that the mobile terminal described in this specification may have more or less than the above-listed constituent elements.
More specifically, among the components, the wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal 100, or between the mobile terminal 100 and an external server. The wireless communication unit 110 may include one or more modules for connecting the mobile terminal 100 to one or more networks.
Such a wireless communication part 110 may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless network module 113, a short-range communication module 114, and a location information module 115.
The input part 120 may include a camera 121 or an image input part for inputting an image signal, a microphone 122 or an audio input part for inputting an audio signal, a user input part 123 (e.g., a touch key, a mechanical key, etc.) for receiving an information input from a user. The voice data or image data collected by the input unit 120 may be analyzed and processed as a control command of the user.
The detection unit 140 may include one or more sensors for detecting at least one of information within the mobile terminal, surrounding environment information of the mobile terminal, and user information. For example, the detection unit 140 may include a proximity sensor 141, an illuminance sensor 142, a touch sensor, an acceleration sensor, a magnetic sensor, a gravity sensor (G-sensor), a gyro sensor (gyroscope sensor), a motion sensor, an RGB sensor, an infrared sensor (IR sensor), a fingerprint recognition sensor (finger scan sensor), an ultrasonic sensor, an optical sensor (e.g., the camera 121), a microphone 122, a battery fuel gauge, an environmental sensor (e.g., a barometer, a hygrometer, a thermometer, a radiation detection sensor, a heat sensor, a gas sensor, etc.), and a chemical sensor (e.g., an electronic nose, a healthcare sensor, a biometric sensor, etc.). On the other hand, the mobile terminal disclosed in this specification may combine and use information detected by at least two of these sensors.
The output section 150 is used to generate an output related to a visual sense, an auditory sense, a tactile sense, or the like, and may include at least one of a display section 151, a sound output section 152, a haptic module 153, and a light output section 154. The display portion 151 may implement a touch screen by forming a layer structure or integrally with a touch sensor. Such a touch screen functions not only as a user input part 123 providing an input interface between the mobile terminal 100 and the user, but also can provide an output interface between the mobile terminal 100 and the user.
The interface unit 160 functions as a path between the mobile terminal 100 and various external devices connected to the mobile terminal 100. Such an interface part 160 may include at least one of a wired/wireless headset port (port), an external charger port (port), a wired/wireless data port (port), a memory card (memory card) port, a port (port) to which a device provided with an identification module is connected, an audio I/O (Input/Output) port (port), a video I/O (Input/Output) port (port), and an earphone port (port). The mobile terminal 100 is capable of performing appropriate control relating to an external device connected thereto in response to a situation in which the external device is connected to the interface section 160.
In addition, the memory 170 stores data supporting various functions of the mobile terminal 100. Memory 170 may store a plurality of applications (applications) driven by mobile terminal 100, data, instructions, and the like for the operation of mobile terminal 100. At least a portion of such applications may be downloaded from an external server via wireless communication. In addition, at least a portion of such applications may be already present in the mobile terminal 100 at the time of factory shipment to implement basic functions (e.g., a call receiving function, a call making function, a message receiving function, a message sending function) of the mobile terminal 100. On the other hand, the application program is stored in the memory 170, installed in the mobile terminal 100, and driven by the control unit 180 to execute the operation (or function) of the mobile terminal.
In general, the control unit 180 controls the overall operation of the mobile terminal 100 in addition to the operation related to the application. The control unit 180 processes signals, data, information, and the like inputted or outputted through the above-described components, or drives an application stored in the memory 170, thereby providing or processing appropriate information or functions to the user.
The control unit 180 may control at least some of the components shown in fig. 1a in order to drive the application stored in the memory 170. Further, the control section 180 may combine and operate at least two or more of the components included in the mobile terminal 100 to drive the application.
Under the control of the control unit 180, the power supply unit 190 receives an external power supply and an internal power supply and supplies power to each component included in the mobile terminal 100. Such a power supply section 190 includes a battery, which may be a built-in battery or a replaceable battery.
In order to implement the operation, control, or control method of the mobile terminal of the various embodiments described below, at least some of the respective constituent elements may operate in cooperation with each other. In addition, the operation, control or control method of the mobile terminal may be implemented on the mobile terminal by driving at least one application stored in the memory 170.
Before describing various embodiments implemented by the mobile terminal 100 described above, the above-listed components will be described in detail with reference to fig. 1 a.
First, the radio communication unit 110 will be described, and the broadcast receiving module 111 of the radio communication unit 110 receives a broadcast signal and/or broadcast-related information from an external broadcast management server via a broadcast channel. The broadcast channels may include satellite channels, terrestrial wave channels. In order to simultaneously receive at least two broadcast channels or switch broadcast channels, the mobile terminal 100 may be provided with more than two broadcast receiving modules.
The mobile communication module 112 transmits and receives wireless signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network established according to a technical standard or communication method for mobile communication (for example, GSM (Global System for Mobile communication), CDMA (Code Division Multiple Access), CDMA2000 (Code Division Multiple Access 2000), EV-DO (Enhanced Voice-Data Optimized or Enhanced Voice-Only Data), WCDMA (Wideband CDMA), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), LTE-A (Long Term Evolution-Advanced), etc.).
The wireless signal may include a voice call signal, a video call signal or data according to various modalities of text/multimedia messaging.
The wireless network module 113 refers to a module for connecting to a wireless internet, and may be built in or out of the mobile terminal 100. The wireless network module 113 is configured to transceive wireless signals over a communication network based on a wireless internet technology.
Examples of wireless internet technologies include WLAN (Wireless LAN), Wi-Fi (Wireless-Fidelity), Wi-Fi Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), and LTE-A (Long Term Evolution-Advanced); the wireless network module 113 transceives data according to at least one wireless internet technology, in a range that also includes internet technologies not listed above.
From the viewpoint that wireless internet access based on WiBro, HSDPA, HSUPA, GSM, CDMA, WCDMA, LTE, LTE-A, etc. is implemented through a mobile communication network, the wireless network module 113 that performs wireless internet access through the mobile communication network may also be understood as one type of the mobile communication module 112.
The short range communication module 114 is used for short range communication, and may support short range communication using at least one of Bluetooth™, RFID (Radio Frequency Identification), Infrared Data Association (IrDA), UWB (Ultra Wideband), ZigBee, NFC (Near Field Communication), Wi-Fi (Wireless-Fidelity), Wi-Fi Direct, and Wireless USB (Wireless Universal Serial Bus) technologies. Such a short range communication module 114 may support wireless communication between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal 100, or between the mobile terminal 100 and a network in which another mobile terminal 100 (or an external server) is located, through short range wireless networks (Wireless Area Networks). The short range wireless networks may be Wireless Personal Area Networks.
Here, the other mobile terminal 100 may be a wearable device (e.g., smart watch (smart watch), smart glasses (smart glass), HMD (head mounted display)) capable of exchanging data with (or being capable of cooperating with) the mobile terminal 100 of the present invention. The near field communication module 114 may detect (or identify) wearable devices capable of communicating with the mobile terminal 100 in the vicinity of the mobile terminal 100. Further, in the case where the detected wearable device is a device that has been verified to communicate with the mobile terminal 100 of the present invention, the control part 180 may transmit at least a part of the data processed at the mobile terminal 100 to the wearable device through the near field communication module 114. Accordingly, the user of the wearable device can use the data processed at the mobile terminal 100 through the wearable device. For example, according to this, when the mobile terminal 100 receives an incoming call, the user may make a phone call through the wearable device, or when the mobile terminal 100 receives a message, the received message may be confirmed through the wearable device.
The location information module 115 is a module for obtaining a location (or a current location) of the mobile terminal, and a typical example thereof is a GPS (Global Positioning System) module or a wifi (wireless fidelity) module. For example, when the mobile terminal uses a GPS module, the location of the mobile terminal may be obtained by using signals transmitted from GPS satellites. As another example, when the mobile terminal uses the Wi-Fi module, the location of the mobile terminal may be obtained based on information of a Wireless Access Point (AP) that transceives Wireless signals with the WiFi module. The location information module 115 may perform some functions in other modules of the wireless communication part 110 in order to replace or additionally obtain data regarding the location of the mobile terminal, as needed. The location information module 115 is a module used to obtain the location (or current location) of the mobile terminal, and is not limited to a module that directly calculates or obtains the location of the mobile terminal.
Next, the input part 120 is a unit for inputting image information (or signals), audio information (or signals), data, or information input by a user, and the mobile terminal 100 may be provided with one or more cameras 121 for the input of the image information. In the video call mode or the shooting mode, the camera 121 processes image frames of still pictures, video, or the like obtained by the image sensor. The processed image frames may be displayed on the display 151 or stored in the memory 170. On the other hand, the plurality of cameras 121 provided to the mobile terminal 100 may be arranged in a matrix structure, and a plurality of image information having various angles or focuses can be input to the mobile terminal 100 by forming the cameras 121 in the matrix structure. In addition, the plurality of cameras 121 may be configured in a stereoscopic structure to obtain left and right images for implementing a stereoscopic image.
The microphone 122 processes an external sound signal into electric voice data. The processed voice data may be used in various ways according to the function (or running application) in progress of the mobile terminal 100. On the other hand, various noise removal algorithms may be implemented at the microphone 122 to remove noise (noise) generated during the reception of an external sound signal.
The user input unit 123 is used to receive information from a user, and when information is input through the user input unit 123, the control unit 180 can control the operation of the mobile terminal 100 in accordance with the input information. Such a user input part 123 may include a mechanical (mechanical) input means (or mechanical keys, for example, keys located on the front, rear or side of the mobile terminal 100, dome switches (dome switches), jog wheels (jog dials), jog switches (jog switches), etc.) and a touch input means. For example, the touch input means may be implemented by a virtual key (virtual key), a soft key (soft key), or a visual key (visual key) displayed on a touch panel or may be implemented by a touch key (touch key) disposed in a portion other than the touch panel through software processing. On the other hand, the virtual keys or the visual keys may be displayed on the touch screen in various forms, for example, by graphics (graphic), text (text), icons (icon), video (video), or a combination thereof.
On the other hand, the detection unit 140 detects at least one of the in-mobile-terminal information, the surrounding environment information of the mobile terminal, and the user information, and generates a detection signal corresponding thereto. The control section 180 may control driving or operation of the mobile terminal 100 or execute data processing, functions or operation related to an application provided to the mobile terminal 100 based on such a detection signal. Next, representative sensors among the plurality of sensors that the detection unit 140 may include will be described in more detail.
First, the proximity sensor 141 is a sensor that detects an object approaching a predetermined detection surface or whether an object is present in the vicinity thereof, without mechanical contact, by using a force of an electromagnetic field, infrared rays, or the like. Such a proximity sensor 141 may be disposed in an internal area of the mobile terminal surrounded by the touch screen described hereinbefore or in the vicinity of the touch screen.
Examples of the proximity sensor 141 include a transmission type photosensor, a direct reflection type photosensor, a mirror reflection type photosensor, a high-frequency oscillation proximity sensor, a capacitance type proximity sensor, a magnetic type proximity sensor, an infrared proximity sensor, and the like. When the touch screen is capacitive, the proximity sensor 141 may detect the proximity of a conductive object using a change in an electric field according to the proximity of the object. In this case, the touch screen (or touch sensor) itself may be classified as a proximity sensor.
On the other hand, for convenience of explanation, a behavior in which an object is not in contact with the touch screen but is enabled to be recognized as being located on the touch screen by approaching the touch screen is referred to as "proximity touch (proximity touch)", and a behavior in which an object is actually in contact with the touch screen is referred to as "contact touch (contact touch)". The position where an object is proximity-touched on the touch screen refers to a position where the object corresponds in a vertical direction with respect to the touch screen when the object is proximity-touched. The proximity sensor 141 may detect a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, etc.). On the other hand, the control part 180 processes data (or information) corresponding to the proximity touch action and the proximity touch pattern detected by the proximity sensor 141 as described above, and can output visual information corresponding to the processed data on the touch screen. Then, the control part 180 may control the mobile terminal 100 such that different actions or data (or information) are processed according to whether the touch to the same portion on the touch screen is a proximity touch or a contact touch.
The touch sensor detects a touch (or a touch input) applied to the touch panel (or the display unit 151) by using at least one of a plurality of touch systems such as a resistive film system, a capacitance system, an infrared system, an ultrasonic system, and a magnetic field system.
For example, the touch sensor may be configured to convert a pressure applied to a specific portion of the touch panel or a change in capacitance or the like generated at the specific portion into an electrical input signal. The touch sensor may be configured to be capable of detecting a position, an area, a pressure at the time of touch, a capacitance at the time of touch, and the like of a touch object that touches the touch panel, on the touch sensor. Here, the touch object is an object that touches the touch sensor, and may be, for example, a finger, a Stylus pen (Stylus pen), a pointer (pointer), or the like.
In this way, when a touch input is performed on the touch sensor, a signal corresponding to the touch input is transmitted to the touch controller. The touch controller processes the signal and then transmits corresponding data to the control unit 180. Thereby, the control part 180 knows which area of the display part 151 is touched, and the like. Here, the touch controller may be a component independent of the control unit 180, or may be the control unit 180 itself.
On the other hand, the control unit 180 may perform different controls or the same control depending on the type of the touch object touching the touch panel (or a touch key provided in a place other than the touch panel). Whether different controls are performed or the same controls are performed according to the type of the touch object may be determined according to the current operating state of the mobile terminal 100 or the running application program.
On the other hand, the touch sensor and the proximity sensor described above may be provided independently or in combination, and can detect various types of touches on the touch screen, such as a short (or tap) touch, a long touch, a multi touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, and a hovering touch.
The ultrasonic sensor can recognize position information of a detection object using ultrasonic waves. The control unit can calculate the position of the wave generation source from information detected by an optical sensor and a plurality of ultrasonic sensors. The position can be calculated by exploiting the fact that light is much faster than ultrasound, that is, the time for light to reach the optical sensor is much shorter than the time for the ultrasonic wave to reach the ultrasonic sensor. More specifically, the position of the wave generation source can be calculated from the arrival-time difference of the ultrasonic wave, using the light as a reference signal.
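In other words, because light arrives effectively instantaneously compared with sound, the light pulse can serve as the t = 0 reference and each ultrasonic delay yields a distance; intersecting the distances from several sensors (trilateration) then gives the position. A minimal sketch of the distance step, assuming the speed of sound in air:

```kotlin
// Hypothetical sketch: distance from the arrival-time difference between the
// light reference signal and the ultrasonic wave.
const val SPEED_OF_SOUND_M_PER_S = 343.0   // in air at about 20 °C (assumption)

fun distanceToSource(lightArrivalS: Double, ultrasoundArrivalS: Double): Double =
    SPEED_OF_SOUND_M_PER_S * (ultrasoundArrivalS - lightArrivalS)

// With three or more ultrasonic sensors, each distance constrains the source
// to a sphere around that sensor; intersecting the spheres gives its position.
```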
On the other hand, the camera 121 includes at least one of a camera sensor (e.g., a CCD, a CMOS, or the like), a photoelectric sensor (or an image sensor), and a laser sensor as a component of the input unit 120.
The camera 121 and the laser sensor may detect a touch of a detection object with respect to a three-dimensional stereoscopic image by being combined with each other. A photosensor configured to scan movement of a detection object approaching the touch panel may be stacked on the display element. In more detail, photodiodes (Photo diodes) and TRs (transistors) are disposed in rows and columns of a photosensor, and an object placed on the photosensor is scanned with an electric signal that varies according to the amount of light applied to the photodiodes. That is, the photoelectric sensor calculates the coordinates of the detection object from the amount of change in light, and thereby can obtain the position information of the detection object.
The display unit 151 displays (outputs) information processed in the mobile terminal 100. For example, the display unit 151 may display operation screen information of an application program driven by the mobile terminal 100, or ui (User interface) or gui (graphical User interface) information based on such operation screen information.
The display unit 151 may be configured as a stereoscopic display unit for displaying a stereoscopic image.
The stereoscopic display unit may adopt three-dimensional display modes such as a stereoscopic mode (glasses mode), an autostereoscopic mode (glasses-free mode), and a projection mode (hologram mode).
The sound output portion 152 may output audio data received from the wireless communication portion 110 in a call signal reception mode, a call mode or a recording mode, a voice recognition mode, a broadcast reception mode, or the like, or output audio data stored in the memory 170. The audio output unit 152 also outputs an audio signal related to a function (e.g., call signal reception sound, message reception sound, etc.) executed in the mobile terminal 100. Such a sound output section 152 may include a receiver (receiver), a speaker (speaker), a buzzer (buzzer), and the like.
The haptic module (haptic module)153 generates various haptic effects that can be felt by the user. Vibration may be a typical example of the haptic effect generated by the haptic module 153. The intensity, pattern, and the like of the vibration generated in the haptic module 153 may be controlled by the selection of the user or the setting of the control part. For example, the haptic module 153 may synthesize vibrations different from each other and output or sequentially output.
The haptic module 153 may generate various haptic effects in addition to vibration, such as an array of pins vertically moving with respect to a contact skin surface, an injection force or a suction force of air passing through an injection port or a suction port, a touch to a skin surface, a contact of an electrode (electrode), an effect generated by a stimulus such as an electrostatic force, an effect of reproducing a cold and hot feeling using an element capable of absorbing or generating heat, and the like.
The haptic module 153 may not only deliver a haptic effect through direct contact, but also be implemented to feel the haptic effect through the sense of muscles of a user's finger or arm, etc. The haptic module 153 may be provided in more than two according to the configuration of the mobile terminal 100.
The light output part 154 outputs a signal for notifying the generation of an event using light of a light source of the mobile terminal 100. As examples of the event generated at the mobile terminal 100, there are message reception, call signal reception, missed call, alarm clock, schedule notification, mail reception, application-based information reception, and the like.
The signal output from the light output part 154 is implemented by the mobile terminal emitting light of a single color or multiple colors to the front or rear surface. The signal output may end due to the mobile terminal detecting that the user has acknowledged the event.
The interface unit 160 functions as a path between the mobile terminal 100 and all external devices connected to the mobile terminal 100. Interface unit 160 receives data from an external device, supplies power to each component in mobile terminal 100, and transmits data in mobile terminal 100 to an external device. For example, the interface part 160 may include a wired/wireless headset port (port), an external charger port (port), a wired/wireless data port (port), a memory card (memory) port (port), a port (port) to which a device provided with an identification module is connected, an audio I/O (Input/Output) port (port), a video I/O (Input/Output) port (port), an earphone port (port), and the like.
On the other hand, the identification module is a chip storing various information for verifying the usage right of the mobile terminal 100, and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. The device provided with the identification module (hereinafter referred to as 'identification device') may be manufactured in the form of a smart card (smart card). Therefore, the identification device can be connected to the terminal 100 via the interface section 160.
When the mobile terminal 100 is connected to an external cradle (cradle), the interface unit 160 may be a path for supplying power from the cradle to the mobile terminal 100, or a path for transmitting various command signals input by a user at the cradle to the mobile terminal 100. Various command signals input from the cradle or the power source may be used as a signal for recognizing whether the mobile terminal 100 is accurately mounted to the cradle.
The memory 170 may store a program for the operation of the control section 180, and may temporarily store input/output data (for example, a telephone directory, a message, a still image, a video, and the like). The memory 170 may store data related to various patterns of vibration and sound output in response to a touch input on the touch screen.
The memory 170 may include at least one type of storage medium among flash memory, hard disk, SSD (Solid State Disk), SDD (Silicon Disk Drive), multimedia card micro, card-type memory (e.g., SD or XD memory), RAM (random access memory), SRAM (static random access memory), ROM (read-only memory), EEPROM (electrically erasable programmable read-only memory), PROM (programmable read-only memory), magnetic memory, magnetic disk, and optical disk. The mobile terminal 100 may also operate in association with network storage (web storage) that performs the storage function of the memory 170 over the internet.
On the other hand, as described above, the control section 180 generally controls the operation related to the application and the overall operation of the mobile terminal 100. For example, when the state of the mobile terminal satisfies a set condition, the control part 180 may operate or release a locked state for restricting a user from inputting a control command with respect to an application.
The control unit 180 may perform control and processing related to voice call, data communication, video call, and the like, or may perform pattern recognition processing capable of recognizing handwriting input or drawing input performed on the touch panel as characters or images, respectively. The control unit 180 may control one or a combination of the components described above to implement various embodiments described below in the mobile terminal 100 of the present invention.
The power supply unit 190 receives an external power supply and an internal power supply under the control of the control unit 180, and supplies power necessary for the operation of each component. The power supply unit 190 includes a battery, which may be a rechargeable built-in battery and may be detachably coupled to the terminal body to facilitate charging and the like.
In addition, the power supply unit 190 may be provided with a connection port, which may be an example of the interface 160 electrically connected to an external charger that supplies power for charging the battery.
As another example, the power supply unit 190 may be configured to wirelessly charge the battery without using the connection port. In this case, the power supply unit 190 may receive power from an external wireless power transmission device using one or more of an Inductive Coupling (Inductive Coupling) method based on an electromagnetic induction phenomenon and a resonant Coupling (Magnetic Resonance Coupling) method based on an electromagnetic Resonance phenomenon.
On the other hand, the various embodiments described below may be implemented in a recording medium readable by a computer or a similar device using, for example, software, hardware, or a combination thereof.
Referring to fig. 1b and 1c, the disclosed mobile terminal 100 has a terminal body having a bar shape. However, the present invention is not limited to this, and various configurations may be employed, such as a wristwatch type, a clip type, a glasses type, or a folding type, flip type, slide type, swing type, or rotation type in which two or more bodies are coupled to each other so as to be movable relative to each other. Although the following description relates to a particular type of mobile terminal, it is generally applicable to other types of mobile terminals as well.
Here, the terminal body may be understood as a concept that refers to the mobile terminal 100 as a single assembly.
The mobile terminal 100 includes a housing (e.g., a frame, a cover, etc.) forming an external appearance. As shown, the mobile terminal 100 may include a front case 101 and a rear case 102. Various electronic components are disposed in an internal space formed by the coupling of the front case 101 and the rear case 102. At least one intermediate case may be additionally disposed between the front case 101 and the rear case 102.
A display unit 151 may be disposed on the front surface of the terminal body to output information. As shown in the drawing, the window 151a of the display 151 may be mounted to the front case 101 to form the front surface of the terminal body together with the front case 101.
In some cases, electronic components may be mounted on the rear case 102. The electronic components that can be mounted on the rear case 102 include a removable battery, an identification module, a memory card, and the like. In this case, a back cover 103 for covering the mounted electronic components is detachably joined to the rear case 102. Therefore, when the back cover 103 is detached from the rear case 102, the electronic components mounted to the rear case 102 are exposed to the outside.
As shown in the drawing, when the back cover 103 is combined with the rear case 102, a part of the side surface of the rear case 102 is exposed. In some cases, the rear case 102 may be completely covered with the rear cover 103 when the above-described coupling is performed. On the other hand, the back cover 103 may be provided with an opening for exposing the camera 121b or the audio output unit 152b to the outside.
Such cases 101, 102, 103 may be formed by injection molding of synthetic resin, or may be formed of metal such as stainless steel (STS), aluminum (Al), titanium (Ti), or the like.
Unlike the example in which an internal space accommodating various electronic components is formed by a plurality of housings, the mobile terminal 100 may also form the internal space by one housing. In this case, a single body of the mobile terminal 100 in which synthetic resin or metal extends from a side surface to a rear surface may be implemented.
On the other hand, the mobile terminal 100 may be provided with a waterproof portion (not shown) so that water does not flow into the inside of the terminal body. For example, the waterproof portion may include a waterproof member that is provided between the window 151a and the front case 101, between the front case 101 and the rear case 102, or between the rear case 102 and the back cover 103, and seals the inner space when they are combined.
The mobile terminal 100 may be provided with a display 151, first and second sound output parts 152a and 152b, a proximity sensor 141, an illuminance sensor 142, a light output part 154, first and second cameras 121a and 121b, first and second operation units 123a and 123b, a microphone 122, an interface part 160, and the like.
Next, as shown in fig. 1b and 1c, the mobile terminal 100 in which the display unit 151, the first sound output unit 152a, the proximity sensor 141, the illuminance sensor 142, the light output unit 154, the first camera 121a, and the first operation unit 123a are disposed on the front surface of the terminal body, the second operation unit 123b, the microphone 122, and the interface unit 160 are disposed on the side surface of the terminal body, and the second sound output unit 152b and the second camera 121b are disposed on the rear surface of the terminal body will be described as an example.
However, these components are not limited to this arrangement. These constituent elements may be removed or replaced, or may be disposed on other faces, as necessary. For example, the first operation unit 123a may not be provided on the front surface of the terminal body, and the second sound output portion 152b may be provided on a side surface of the terminal body instead of the rear surface of the terminal body.
The display unit 151 displays (outputs) information processed by the mobile terminal 100. For example, the display unit 151 may display operation screen information of an application program driven in the mobile terminal 100, and UI (User Interface) and GUI (graphical User Interface) information based on the operation screen information.
The display portion 151 may include at least one of a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a three-dimensional display (3D display), and an electronic ink display (e-ink display).
The display unit 151 may have two or more display units according to the implementation form of the mobile terminal 100. In this case, the plurality of display portions may be disposed on one surface of the mobile terminal 100 at intervals or integrally, or may be disposed on different surfaces.
The display part 151 may include a touch sensor for detecting a touch to the display part 151 so that an input of a control instruction is received through a touch manner. Thus, when there is a touch to the display part 151, the touch sensor detects the touch, and the control part 180 may generate a control instruction corresponding to the touch accordingly. The contents input by the touch manner may be characters or numbers or instructions or specifiable menu items in various modes.
On the other hand, the touch sensor may be configured in a film form having a touch pattern, and disposed between the window 151a and a display (not shown) positioned on the rear surface of the window 151a, or disposed in a wire form directly patterned on the rear surface of the window 151 a. Alternatively, the touch sensor may be formed integrally with the display. For example, the touch sensor may be disposed on a substrate of the display or disposed inside the display.
In this way, the display unit 151 may form a touch screen together with the touch sensor, and in this case, the touch screen may function as the user input unit 123 (see fig. 1 a). The touch screen may replace at least a part of the functions of the first operation unit 123a according to circumstances.
The first sound output part 152a may be implemented by a receiver (receiver) that transmits a call sound to the ear of the user, and the second sound output part 152b may be implemented in the form of a loud speaker (loud speaker) that outputs various alarm sounds or multimedia broadcasting sounds.
A sound hole for emitting sound generated from the first sound output part 152a may be formed in the window 151a of the display part 151. However, the present invention is not limited thereto, and the sound may be emitted along an assembly gap between the structural bodies (for example, a gap between the window 151a and the front case 101). In this case, the hole formed independently for outputting sound is invisible or hidden in the external appearance, so that the external appearance of the mobile terminal 100 can be simplified.
The light output section 154 is formed to be able to output light for notifying an event when the event occurs. As examples of the event, there are message reception, call signal reception, missed call, alarm clock, schedule notification, mail reception, information reception by application, and the like. When it is detected that the user has confirmed the event, the control part 180 may control the light output part 154 to end the output of the light.
The first camera 121a processes image frames of still images or video obtained by an image sensor in a photographing mode or a video call mode. The processed image frames may be displayed on the display 151 and may be stored in the memory 170.
The first operation unit 123a and the second operation unit 123b are examples of the user input unit 123, and may be collectively referred to as an operation unit (manipulation unit) that is operated to receive an instruction for controlling the operation of the mobile terminal 100. The first and second operation units 123a and 123b may be operated in any tactile manner in which the user applies a touch, push, scroll, or the like. The first operation unit 123a and the second operation unit 123b may also be operated without a tactile sensation of the user, by a proximity touch (proximity touch), hovering (hovering), or the like.
In the present drawing, the first operation unit 123a is illustrated as a touch key (touch key), but the present invention is not limited thereto. For example, the first operation unit 123a may be constituted by a mechanical key (mechanical key), or by a combination of a touch key and a mechanical key.
The contents input through the first and second operation units 123a and 123b may be set variously. For example, the first operation unit 123a may receive instructions such as menu, home key, cancel, and search, and the second operation unit 123b may receive instructions such as adjusting the volume of the sound output from the first sound output unit 152a or the second sound output unit 152b and switching to a touch recognition mode of the display unit 151.
On the other hand, as another example of the user input unit 123, a rear surface input unit (not shown) may be provided on the rear surface of the terminal body. Such a rear input unit is operated to receive an instruction for controlling the operation of the mobile terminal 100, and the input content may be set in various ways. For example, it may receive instructions such as power on/off, start, end, and scroll, adjustment of the volume of the sound output from the first sound output unit 152a and the second sound output unit 152b, and switching to the touch recognition mode of the display unit 151. The rear input section may be configured to be capable of receiving a touch input, a push input, or an input through a combination thereof.
The rear input part may be configured to overlap the display part 151 of the front in the thickness direction of the terminal body. For example, the rear surface input part may be disposed at an upper end portion of the rear surface of the terminal body so that a user can easily operate with the thumb while holding the terminal body with one hand. However, the present invention is not necessarily limited thereto, and the position of the rear surface input portion may be changed.
In this way, when the rear surface input unit is provided on the rear surface of the terminal body, a new form of user interface using the rear surface input unit can be realized. In addition, when the touch screen or the rear input section described above is not disposed on the front surface of the terminal body in place of at least a part of the functions of the first operation unit 123a provided on the front surface of the terminal body, the display section 151 may be formed as a larger screen.
On the other hand, in the mobile terminal 100, a fingerprint recognition sensor may be provided to recognize a user's fingerprint, and the control part 180 may use fingerprint information detected by the fingerprint recognition sensor as an authentication means. The fingerprint recognition sensor may be built in the display 151 or the user input 123.
The microphone 122 is configured to receive voice of the user, other sounds, and the like. The microphone 122 may be disposed at a plurality of locations and configured to receive stereo sound.
The interface part 160 serves as a path for connecting the mobile terminal 100 to an external device. For example, the interface part 160 may be at least one of a connection terminal for connection with another device (e.g., an earphone, an external speaker), a port for short-range communication (e.g., an infrared port (IrDA Port), a Bluetooth port (Bluetooth Port), a wireless LAN port (Wireless LAN Port), etc.), and a power supply terminal for supplying power to the mobile terminal 100.
A second camera 121b may be disposed on the rear surface of the terminal body. In this case, the photographing direction of the second camera 121b may be substantially opposite to the photographing direction of the first camera 121 a.
The second camera 121b may include a plurality of lenses arranged along at least one line. The plurality of lenses may also be arranged in a matrix (matrix). Such a camera may be referred to as an 'array camera'. When the second camera 121b is an array camera, images may be photographed in various ways using a plurality of lenses, and higher quality images may be obtained.
The flash 124 may be disposed adjacent to the second camera 121 b. When the subject is photographed by the camera 121b, the flash 124 illuminates the subject.
A second sound output unit 152b may be additionally provided to the terminal body. The second sound output portion 152b can implement a stereo function together with the first sound output portion 152a, and can also be used to implement a handsfree mode at the time of call communication.
At least one antenna for wireless communication may be provided at the terminal body. The antenna may be built in the terminal body or formed in the housing. For example, an antenna constituting a part of the broadcast receiving module 111 (refer to fig. 1a) may be configured to be able to be drawn out from the terminal body. Alternatively, the antenna may be formed in a film form and attached to the inner surface of the back cover 103, or may be configured such that a case including a conductive material functions as an antenna.
The terminal body is provided with a power supply unit 190 (see fig. 1a) for supplying power to the mobile terminal 100. The power supply unit 190 may include a battery 191, and the battery 191 may be built in the terminal body or may be configured to be detachable from the terminal body.
Battery 191 may be configured to receive power via a power cable connected to interface 160. The battery 191 may be charged wirelessly by a wireless charger. The wireless charging may be realized by an electromagnetic induction method or a resonance method (magnetic resonance method).
On the other hand, in the present drawing, it is illustrated that the detachment of the battery 191 is restricted by the back cover 103 being combined with the rear case 102 to cover the battery 191, and the battery 191 is protected from external impact and foreign matter. When the battery 191 is configured to be detachable from the terminal body, the rear cover 103 may be detachably coupled to the rear case 102.
Accessories for protecting the external appearance or assisting or extending the functions of the mobile terminal 100 may be additionally provided at the mobile terminal 100. As an example of such an accessory, a cover or a pocket for covering or accommodating at least one surface of the mobile terminal 100 may be cited. The cover or pouch may cooperate with the display unit 151 to expand the functions of the mobile terminal 100. As another example of the accessory, a stylus pen for assisting or extending touch input to the touch screen may be cited.
Recently, grip sensors are increasingly installed on mobile terminals so that a user can apply user input by gripping (grip) the terminal. Next, the structure and operation of the grip sensor according to the grip detection method will be described with reference to fig. 2a to 3 c.
Fig. 2a to 2b are conceptual views for explaining a grip sensor mounted to a mobile terminal of the present invention.
The grip sensor according to an embodiment of the present invention may be disposed on a side surface portion of the terminal body. A plurality of areas A1, A2, A3, A4, A5, A6 may be respectively defined at the side surface portion of the terminal body, and a plurality of grip sensors may be disposed at the plurality of areas.
Each grip sensor detects a pressure applied to at least one of the plurality of regions, and the control section 180 executes a function corresponding to each region according to the pressure detected by each grip sensor. The areas detected by the plurality of grip sensors may be set to be different from each other, and the distances between the grip sensors may be different.
Fig. 2c is a conceptual diagram illustrating a grip sensor attached to a substrate.
The mobile terminal 100 of the present invention includes a grip sensor 10, and the grip sensor 10 is disposed at a region of a side surface of a front case 101 (refer to fig. 1a and 1b) constituting a main body, the front case forming an external appearance and an internal space. The front case 101 may have a deformed structure to enable external force to be better transmitted to the grip sensor 10.
The grip sensor 10 is attached to an inner surface of the front case 101, and a region of the front case 101 is deformed by pressure applied to the side surface of the main body. When a region of the front case 101 is pressed, the grip sensor 10 is deformed, and the applied pressure is detected through a change in the resistance value of the deforming members.
The substrate 30 of fig. 2c may be the front case 101 of the electronic device of the present invention. The grip sensor 10 may be fixed on the base plate 30 by an adhesive member 20. The grip sensor 10 includes a base substrate 11, and a first deforming member 12 and a second deforming member 13 formed on both surfaces of the base substrate 11, respectively. In the case where a plurality of the first deforming member 12 and the second deforming member 13 are provided, they may be arranged spaced apart from each other on the base substrate 11.
If a pressure F is applied to the substrate 30 to which the grip sensor 10 is attached, the substrate 30 is deformed. If the substrate 30 is deformed in the direction in which the pressure F is applied, the base substrate 11 is also bent in the same direction. The first deforming member 12 and the second deforming member 13 formed on both sides of the base substrate 11 are deformed in opposite manners: the first deforming member 12, disposed on the concave side of the deformed base substrate 11, is contracted, and the second deforming member 13, disposed on the convex side, is expanded. Therefore, the resistance value of the first deforming member 12 becomes smaller due to contraction, and the resistance value of the second deforming member 13 becomes larger due to expansion. From the output value, which changes according to the changes in the resistance values of the first deforming member 12 and the second deforming member 13, the control portion 180 can acquire information on whether pressure is applied, the degree of the applied pressure, and the direction in which the pressure is applied.
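The differential behavior of the two deforming members lends itself to a simple differential read-out. The following Kotlin sketch is a minimal illustration of that idea only; the nominal resistance, the threshold, and all type names are hypothetical and are not taken from the patent.

```kotlin
// Hypothetical differential read-out for the two deforming members of fig. 2c.
// R_NOMINAL and PRESS_THRESHOLD are illustrative values, not from the patent.
data class GripSample(val r1: Double, val r2: Double) // member resistances (ohm)

const val R_NOMINAL = 350.0      // unstrained resistance of each member
const val PRESS_THRESHOLD = 2.0  // minimum differential change treated as a press

/**
 * Returns a signed pressure estimate: the magnitude comes from the differential
 * resistance change, and the sign indicates the bending direction (which member
 * contracted and which expanded).
 */
fun estimatePressure(s: GripSample): Double {
    val d1 = s.r1 - R_NOMINAL   // negative when the first member contracts
    val d2 = s.r2 - R_NOMINAL   // positive when the second member expands
    val differential = d2 - d1  // grows with bending, rejects common-mode drift
    return if (kotlin.math.abs(differential) < PRESS_THRESHOLD) 0.0 else differential
}

fun main() {
    println(estimatePressure(GripSample(r1 = 347.5, r2 = 352.5))) // pressed: 5.0
    println(estimatePressure(GripSample(r1 = 350.2, r2 = 349.9))) // noise: 0.0
}
```

Reading the two members differentially also cancels changes that affect both members equally, such as temperature drift.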
Fig. 3a to 3c are conceptual views for explaining a grip sensor disposed on a side surface of a main body.
Referring to fig. 3a, the grip sensor 300 is disposed at a side connecting the upper portion TA and the lower portion BA of the body. The grip sensor 300 detects a touch input and pressure generated by the grip of the user, and the control unit 180 forms a control command according to the touch and/or pressure detected by the grip sensor 300. Although not shown, the grip sensors 300 are formed at both side surfaces opposite to each other.
The grip sensor 300 extends in a longitudinal direction of the body, and one end of the grip sensor 300 is electrically connected to the main circuit substrate through the flexible circuit substrate 181.
The grip sensor 300 is composed of a base 310 and a plurality of piezoelectric sensors 320. The base 310 may be a flexible circuit substrate extending in one direction. The plurality of piezoelectric sensors 320 are arranged along that direction. The flexible circuit substrate 181 includes an extension portion 181a extending from the base 310 and a connection pad portion 181b electrically connected to the main circuit substrate.
The plurality of piezoelectric sensors 320 are implemented as alternating Tx sections and Rx sections. If a high-frequency AC voltage (e.g., about 700 kHz, about 250 mA) is applied to a piezoelectric sensor 320, the piezoelectric sensor 320 vibrates. In addition, if pressure is applied to a piezoelectric sensor 320, an AC voltage proportional to the pressure is generated. The control part 180 may detect a touch input based on a change in the fine vibration pattern, and detect pressure based on the generated AC voltage.
When a finger touches a piezoelectric sensor, the ultrasonic pattern output from the Tx-section piezoelectric sensors changes, and the change is detected by the Rx-section piezoelectric sensors. When a change in the ultrasonic pattern is detected, it is determined that a touch input has been applied. A minute vibration occurs while the ultrasonic pattern is output.
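The two detection paths above, a changed ultrasonic pattern for touch and a generated AC voltage for pressure, can be summarized in a short sketch. This is an illustration under assumed thresholds and field names, not the patent's signal processing.

```kotlin
// Illustrative classification of grip-sensor events from the Tx/Rx piezoelectric
// arrangement. Thresholds and field names are assumptions made for this sketch.
data class PiezoFrame(
    val rxPatternDeviation: Double, // deviation of the received ultrasonic pattern from baseline
    val generatedVoltage: Double    // AC voltage produced by mechanical pressure (V)
)

sealed interface GripEvent
object NoEvent : GripEvent
object TouchDetected : GripEvent
data class PressureDetected(val magnitude: Double) : GripEvent

const val TOUCH_DEVIATION_MIN = 0.15 // hypothetical pattern-change threshold
const val PRESSURE_VOLTAGE_MIN = 0.5 // hypothetical voltage threshold (V)

fun classify(frame: PiezoFrame): GripEvent = when {
    // Pressure generates an AC voltage proportional to the applied force.
    frame.generatedVoltage >= PRESSURE_VOLTAGE_MIN ->
        PressureDetected(frame.generatedVoltage)
    // A finger resting on the sensor changes the ultrasonic pattern seen at Rx.
    frame.rxPatternDeviation >= TOUCH_DEVIATION_MIN -> TouchDetected
    else -> NoEvent
}
```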
Referring to fig. 3c, a plurality of piezoelectric sensors 320 are arranged on the base 310 at a certain distance apart. An adhesive member 301 is formed on an upper portion of the plurality of piezoelectric sensors 320, and the adhesive member 301 is attached to an inner surface of the front case 101 of the electronic device 100. Accordingly, the grip sensor 300 can detect a touch input and pressure generated based on the grip of the user holding the front case 101.
In the above, the grip sensor for detecting the grip of the user in the mobile terminal of the present invention has been described. Next, various embodiments to which the grip sensor based on pressure detection and the grip sensor based on ultrasonic detection described above can be applied will be described.
Next, a method of executing a function associated with a plurality of pieces of visual information displayed on the touch screen by using a user input detected by the grip sensor will be described. Fig. 4a and 4b are conceptual views showing a method of selecting a plurality of images at once while the album application is running. Fig. 5a to 5c are conceptual views illustrating a method of controlling directory information including a plurality of items by a grip input.
The control part 180 of the mobile terminal of the present invention may detect a user input through the grip sensor. The user input detected by the grip sensor may be generated by the user gripping the body. Such a user input detected via the grip sensor may be referred to by various terms, such as a grip input or a grip instruction.
The control part 180 may form control commands different from each other according to at least one of a detection position, a detection area, a detection time, the number of times of detection, and a detection pressure of the user input detected by the grip sensor. For example, the control part 180 may generate control instructions different from each other according to the user input detected at the first position and the user input detected at the second position. As another example, the control part 180 may generate control instructions different from each other by distinguishing a user input detected during the first time from a user input detected during the second time.
In addition, the control part 180 may determine the posture of the hand of the user currently holding the terminal, based on at least one of the detection position and the detection area of the user input detected by the grip sensor. For example, the control section 180 may determine whether the user is the right hand or the left hand, based on at least one of the detection position and the detection area of the user input detected by the grip sensor.
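As a rough sketch of how the attributes of one grip input might be mapped to distinct control commands and to a hand-posture estimate, consider the following. The thresholds, region rule, and command set are hypothetical; the text above only states that different attributes may yield different commands.

```kotlin
// Sketch: forming distinct control commands from the attributes of a grip input
// (position, area, duration, count, pressure) and inferring the gripping hand.
data class GripInput(
    val regionIndex: Int,     // which side region (A1..A6) detected the input
    val contactArea: Double,  // detected contact area (mm^2)
    val durationMs: Long,     // detection time
    val count: Int,           // number of detections in quick succession
    val pressure: Double      // detected pressure (arbitrary units)
)

enum class Hand { LEFT, RIGHT }
enum class Command { SELECT_ALL, CANCEL_ALL, SCREEN_CAPTURE, NONE }

fun inferHand(input: GripInput): Hand =
    // Hypothetical rule: a large contact area on the left-edge regions suggests
    // the palm of the left hand wraps that edge.
    if (input.regionIndex <= 3 && input.contactArea > 80.0) Hand.LEFT else Hand.RIGHT

fun toCommand(input: GripInput): Command = when {
    input.count >= 2 -> Command.SCREEN_CAPTURE      // e.g., a double squeeze
    input.durationMs >= 1_000 -> Command.CANCEL_ALL // e.g., a long squeeze
    input.pressure > 3.0 -> Command.SELECT_ALL      // e.g., a firm short squeeze
    else -> Command.NONE
}
```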
The control part 180 may control an action of the mobile terminal associated with the visual information displayed on the touch screen according to the detection of the user input by the grip sensor. The visual information displayed on the touch screen may include all information that can be visually displayed, such as still images, icons, widgets (widgets), messages, mails, and the like.
The action of the mobile terminal may be an action of performing a function associated with the visual information. Next, a method of executing a function associated with visual information displayed on the touch screen according to a user input detected by the grip sensor will be described. The functions associated with the visual information may include an all-select function, an all-cancel function, a screen capture function, and the like. A method for performing such functions will be described in detail with reference to the accompanying drawings.
First, when a plurality of images are displayed on the touch screen, the control part 180 may perform a function associated with the plurality of images according to a user input detected by the grip sensor.
For example, as shown in fig. 4a (a), if the album application is executed, the control part 180 may output a part 410 of the images 410, 420 stored in the memory 170 to the touch screen. The album application is an application program that executes a function of outputting the images stored in the memory 170 onto the touch screen.
The part of the images 410 are images contained in a specific folder among a plurality of folders accessible to the album application. The control section 180 may output only a predetermined number of the images included in the specific folder. The number of images output may be altered by user input.
On the other hand, the control part 180 may execute an edit mode for editing the part of the images 410 according to a user control instruction. In the edit mode, editing functions such as deletion, moving to another folder, copying, rotating, and GIF creation can be performed. In the edit mode, the control unit 180 can set an image selected by a user control command to an editable selected state. When an image is set in the selected state, the control part 180 performs an editing function on the selected image if a user input for executing that editing function is applied. For example, when a first image is set in the selected state, the control part 180 may delete the first image if a user input for executing the delete function is applied.
When the editing function is executed, the control unit 180 may display a check box in an area adjacent to an area where each image is displayed, so as to indicate that a plurality of images can be edited.
On the other hand, the control unit 180 may selectively edit one image, or may select a plurality of images at once and execute an editing function on them together. In a conventional mobile terminal, however, a user must apply a plurality of user inputs to set a plurality of images to the selected state one by one. Alternatively, when an all-select function is executed, all images included in the folder to which the images currently displayed on the touch screen belong are selected at once. It is therefore inconvenient to select only the part of the images that the user desires.
In contrast, the control section 180 of the mobile terminal of the present invention may select a part of images at once according to the user input detected by the grip sensor.
For example, as shown in (b) of fig. 4a, the control part 180 may perform an all-select function of setting the images 410 displayed on the current touch screen 151, among the images 410, 420 included in a specific folder, to the selected state according to a user input detected by the grip sensor. The control section 180 does not set the images 420 that are included in the specific folder but not displayed on the touch screen to the selected state. The all-select function may thus be understood as a function of setting only the visual information displayed on the current touch screen to the selected state. When the all-select function is executed, the control section 180 may display a selection mark in the check box displayed adjacent to each selected image 410.
Therefore, in the present invention, the user does not need to apply an input for directly selecting each image in order to edit all the currently displayed images, which improves user convenience. In addition, since only information that is currently visually confirmable is set to the selected state, while information that is not visually confirmable is not, the user can recognize in advance which information will be selected at once.
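A minimal sketch of this all-select behavior follows: a grip input in the edit mode sets only the currently visible items to the selected state, while items scrolled out of view are left untouched. The gallery model is an assumption made for the example; the visibleCount field stands in for the display count changed by the pinch inputs described next.

```kotlin
// Sketch of the all-select function: select only the items visible on screen.
data class Item(val id: Int, var selected: Boolean = false)

class EditableGallery(private val items: List<Item>) {
    var firstVisible = 0
    var visibleCount = 6 // adjusted by pinch-in / pinch-out inputs

    private fun visibleItems() = items.drop(firstVisible).take(visibleCount)

    /** Called when the grip sensor reports a user input during the edit mode. */
    fun onGripInput() {
        visibleItems().forEach { it.selected = true } // off-screen items untouched
    }

    fun selectedIds() = items.filter { it.selected }.map { it.id }
}

fun main() {
    val gallery = EditableGallery(List(20) { Item(it) })
    gallery.onGripInput()
    println(gallery.selectedIds()) // [0, 1, 2, 3, 4, 5]
}
```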
On the other hand, the control part 180 may change the amount of visual information currently displayed on the touch screen 151 according to a user input. For example, as shown in fig. 4b (a), in a state where a first number of images are displayed, the control section 180 may decrease the number of displayed images to a second number smaller than the first number when a pinch-out input is applied. In this case, as shown in (b) of fig. 4b, the control part 180 may display the second number of images, and may make the display size of each image larger than when the first number was displayed.
In addition, although not shown, in a state where the first number of images are displayed, the control part 180 may increase the number of displayed images to a third number greater than the first number when a pinch-in input is applied. In this case, the control part 180 may display the third number of images, and may make the display size of each image smaller than when the first number was displayed.
On the other hand, as shown in (b) and (c) of fig. 4b, in a state where the second number of images are displayed, if a user input is applied through the grip sensor, the control part 180 may perform the all-select function of setting the second number of images to the selected state. Therefore, the control part 180 can adjust the number of images selected at once by adjusting the number of images displayed on the touch screen.
On the other hand, although not illustrated, in a case where a plurality of images displayed on the touch screen 151 are set to the selected state, the control part 180 may perform an all-cancel function of setting the plurality of images to a cancelled state when a user input is detected by the grip sensor. The cancelled state is a state in which the editing function is not applied to the image.
Although not shown, the control unit 180 may perform a screen capture function when a user input is applied through the grip sensor in a state where visual information is displayed on the touch screen 151. The screen capture function is a function of capturing the visual information currently displayed on the touch screen 151 in the form of a picture, a GIF, or a video. By this, the present invention can perform the screen capture function in a state in which a view of the visual information to be captured is secured.
In the above, a method of executing the all-select function in the album application based on the user input detected by the grip sensor has been described.
Next, a method of executing the all-select function when directory information including a plurality of items is displayed on the touch screen will be described.
In a state where directory information including a plurality of items is displayed on the touch screen 151 and the edit mode is being executed, the control part 180 may set the plurality of items to the selected state all at once if a grip input is detected.
The directory information may be a message directory, a mail directory, a memo directory, an SNS article directory, a to-be-processed item directory, or the like. For example, as shown in (a) of fig. 5a, a message directory including a plurality of messages 510 received from an external terminal may be displayed on the touch screen 151.
In a state where the message list 510 is displayed, the control unit 180 may execute an edit mode capable of editing the messages included in the message list 510 according to a user control instruction. In the edit mode, functions such as deletion, movement, and storage of messages can be executed.
As shown in fig. 5a (a), when the edit mode is executed, the control section 180 may display a check box for each message to indicate that the message included in the message list is in an editable state.
As shown in (b) of fig. 5a, in a state where the edit mode is operating, the control part 180 may set the plurality of messages 510 displayed on the touch screen 151 to the selected state in response to a user input applied through the grip sensor. The control part 180 does not set messages that are included in the message list but not displayed on the current touch screen 151 to the selected state. Therefore, the user can intuitively and conveniently select the information that is in view.
On the other hand, the control part 180 may change the number of messages displayed on the touch screen according to a user control instruction. Therefore, the user can directly set the number of messages that can be selected according to the user input detected by the grip sensor.
As shown in (a) of fig. 5b, in a state where a first number of messages are displayed while the edit mode is operating, the control part 180 may display a second number of messages, greater than the first number, on the touch screen 151 when a pinch-in touch input is applied. The control part 180 may reduce the display size of each message in order to display the second number of messages on the touch screen 151.
In this case, as shown in (b) of fig. 5b, the plurality of messages 510 displayed before the pinch-in touch input was applied may be displayed on the touch screen 151 together with new messages 520 from the message list that were not displayed before the pinch-in touch input was applied. In this state, as shown in (b) and (c) of fig. 5b, the control part 180 may set the plurality of messages 510 and the new messages 520 to the selected state all at once according to a user input applied through the grip sensor. Therefore, in the present invention, the all-select function can be conveniently executed not only for image information but also for message information.
On the other hand, as shown in fig. 5c (a), the control unit 180 may execute a scroll function of scrolling (scroll) the message list in response to a flick touch input in the up-down direction applied to the message list. The scroll function is a function of causing at least a part of currently displayed information to disappear from the touch screen and outputting at least a part of currently undisplayed information on the touch screen.
As shown in (b) of fig. 5c, while the edit mode is operating, the control part 180 may display at least a portion 510a of the currently displayed messages 510 and a new message 530 according to a flick touch input in the up-down direction applied to the touch screen 151. In this state, if a user input is applied through the grip sensor, the control part 180 may set the at least a portion of the messages 510a and the new message 530 displayed on the current touch screen to the selected state.
In the above, a method of executing the all-select function, in which a plurality of pieces of information are set to the selected state at once based on the user input detected by the grip sensor, has been described.
Next, an example of an operation of turning on/off a plurality of functions at once in accordance with a user input detected by the grip sensor will be described. Fig. 6a to 6c are conceptual views showing an example of an action of performing a plurality of functions on/off at once according to a user input detected by a grip sensor.
The control part 180 may output a plurality of icons associated with functions different from each other on the touch screen 151. The functions associated with the plurality of icons may be functions that operate in the background and may be associated with setting information of the mobile terminal. Such a function may be executed in the background without immediately outputting an execution screen when a touch input on its icon is applied. It may also be set so that, when a specific condition is satisfied, an operation screen of the corresponding function is output or other functions are operated by the corresponding function. For example, the functions associated with the plurality of icons may be a notification function, a wireless fidelity (Wi-Fi) activation function, a Bluetooth (Bluetooth) activation function, a hot-spot (hot-spot) function, a flight mode function, and the like.
The control part 180 may activate (turn on) or deactivate (turn off) the function associated with each icon according to a touch input applied to that icon.
On the other hand, the control part 180 may simultaneously activate or deactivate the functions associated with the plurality of icons according to the user input detected by the grip sensor.
For example, as shown in (a) and (b) of fig. 6a, when a plurality of icons 610a, 610b, 610c, 610d, each associated with a function set to output a notification signal in a different time period, are displayed and those functions are in a deactivated state, the control section 180 may activate all of the functions according to a user input applied through the grip sensor.
In addition, although not illustrated, when the plurality of icons 610a, 610b, 610c, 610d are displayed and the functions set to output notification signals in time periods different from each other are in an activated state, the control part 180 may deactivate all of the functions according to a user input applied through the grip sensor. Therefore, the user can control activation and deactivation of notification information without applying a separate touch input for each notification time period.
In addition, in a state in which screen information for controlling a setting value of the mobile terminal is displayed, if a user input is applied through a grip sensor, the control part 180 may simultaneously activate or deactivate the setting value of the mobile terminal. For example, referring to fig. 6b (a) and (b), when a user input is applied by a grip sensor while screen information 620 related to the setting of the floating bar is displayed, control unit 180 may activate all functions (shortcut function, screen capture function, etc.) related to the floating bar. Here, the floating bar is a bar-shaped graphic object that is displayed in one area of the touch panel and includes a shortcut icon for a function set by a user.
In addition, in a state where icons indicating a plurality of setting functions related to NFC are displayed, when a user input is applied by a grip sensor, the control section 180 may activate or deactivate all of the plurality of setting functions related to NFC. For example, as shown in (a) and (b) of fig. 6c, in a state in which respective icons representing an NFC read mode function and an NFC communication function are displayed, if a user input is applied through a grip sensor, both the NFC read mode function and the NFC communication function may be activated.
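The group-toggle behavior in these examples can be illustrated with a short sketch. The Toggleable abstraction and the any-off-then-all-on rule are assumptions; the text only states that the functions are activated or deactivated together.

```kotlin
// Sketch: toggling a group of related setting functions at once on a grip input,
// as with the notification time-period icons of fig. 6a or the NFC functions of fig. 6c.
interface Toggleable {
    var enabled: Boolean
}

class SettingFunction(val name: String, override var enabled: Boolean = false) : Toggleable

/** If any function in the group is off, turn all on; otherwise turn all off. */
fun onGripInput(group: List<Toggleable>) {
    val turnOn = group.any { !it.enabled }
    group.forEach { it.enabled = turnOn }
}

fun main() {
    val periods = listOf(
        SettingFunction("morning"), SettingFunction("noon"),
        SettingFunction("evening"), SettingFunction("night")
    )
    onGripInput(periods)                 // all deactivated -> activate all
    println(periods.all { it.enabled })  // true
    onGripInput(periods)                 // all activated -> deactivate all
    println(periods.none { it.enabled }) // true
}
```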
In the above, a method of simultaneously activating or deactivating functions associated with settings of a mobile terminal using a user input detected by a grip sensor has been described.
Next, a method of selecting all text information by the user input detected by the grip sensor will be described. Fig. 7a and 7b are conceptual views for explaining a method of selecting the entire text information using the user input detected by the grip sensor.
The control part 180 may control the touch screen 151 to display text information on the touch screen 151. The text information may be memo information, text information, message information, etc.
In a state where text information is displayed on the touch screen 151, the control unit 180 may set the displayed text information to the selected state when a user input is detected by the grip sensor. In this case, unlike in the above description, the mobile terminal need not be in the edit mode.
For example, as shown in fig. 7a (a), the control part 180 may run a memo application providing functions for inputting, editing, and storing memo information. In this case, a running screen of the memo application may be displayed on the touch screen 151. The running screen of the memo application may include previously input memo information and icons for editing the memo information.
As shown in fig. 7a (b), when a user input is detected by the grip sensor in a state where the previously input memo information is displayed on the touch screen 151, the control part 180 may set the previously input memo information to the selected state. In this case, the control unit 180 may change the output form of the memo information to indicate that it is set in the selected state. The output form may include output color, output size, output contrast, and the like. For example, when the previously input memo information is set to the selected state, the control unit 180 may change its output color.
On the other hand, in a case where an input window for inputting text information is displayed on the touch screen 151, the control section 180 may set only the text information input to the input window to a selected state in response to a case where the user input is detected by the grip sensor. For example, as shown in fig. 7b (a), a message transmitted to and received from an external device and an input window 720 for inputting a message may be displayed on the touch screen 151. At this time, the text information may be a message that has been transmitted and received and information input to the input window.
As shown in fig. 7b (b), the control part 180 may detect the text information input to the input window according to the user input detected by the grip sensor, and set only the detected text information to the selected state. In this case, the transmitted and received messages are not set to the selected state. Accordingly, the user can select only the text information input to the input window.
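The input-window behavior can be sketched as follows; the screen model is an assumption made for the example.

```kotlin
// Sketch: on a grip input, select only the text composed in the input window,
// leaving the transmitted and received messages unselected.
data class MessageScreen(
    val conversation: List<String>,   // transmitted and received messages
    var inputWindowText: String,      // text being composed in the input window
    var inputWindowSelected: Boolean = false
)

fun onGripInput(screen: MessageScreen) {
    if (screen.inputWindowText.isNotEmpty()) {
        screen.inputWindowSelected = true // only the composed text is selected
    }
}
```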
In the above, a method of executing a function associated with visual information displayed on a touch panel according to a user input detected by a grip sensor has been described.
Next, a method of executing a function associated with state information of the mobile terminal according to a user input detected by the grip sensor will be described. Fig. 8a to 10b are conceptual views illustrating a method of executing a function associated with state information of a mobile terminal according to a user input detected by a grip sensor.
The mobile terminal of the present invention may further include sensors for detecting state information associated with the state of the mobile terminal. The sensors for detecting the state information may include the proximity sensor 141 (refer to fig. 1b), a gyro sensor (gyro sensor), the illuminance sensor 142 (refer to fig. 1b), an acceleration sensor, a fingerprint sensor, and the like. For a detailed description of the sensors, reference is made to the description of fig. 1a.
The control part 180 may generate the state information of the mobile terminal based on the detection information received from at least one of the plurality of sensors. For example, as shown in fig. 8a (a), the mobile terminal may be in a state in which the touch screen faces downward, that is, a state in which the mobile terminal is turned over so that the touch screen rests against the surface of a table. For convenience of explanation, the state in which the touch screen faces downward will be referred to as the flipped state.
The control part 180 may determine the flipped state based on the detection information received from the proximity sensor 141 and the gyro sensor. For example, the control part 180 may receive detection information indicating that an object is present near the touch screen from the proximity sensor 141, and detection information indicating that the terminal body is in a horizontal state from the gyro sensor. The control unit 180 may determine that the mobile terminal is in the flipped state using the detection information detected by the proximity sensor 141 and the detection information detected by the gyro sensor.
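Expressed as a sketch, the flipped-state decision combines the two readings described above; the field names are assumptions for the example.

```kotlin
// Sketch: determine the flipped (face-down) state from two sensor readings.
data class FlipSensors(
    val objectNearScreen: Boolean, // proximity sensor 141: object over the screen
    val bodyHorizontal: Boolean    // gyro sensor: terminal body lying flat
)

fun isFlipped(s: FlipSensors): Boolean =
    s.objectNearScreen && s.bodyHorizontal
```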
In the flipped state, if a user input is detected by the grip sensor, the control part 180 may control different functions according to whether a specific function is being executed in the mobile terminal.
First, a specific function may be executed in the flipped state. In this case, while the specific function is being executed in the flipped state, the control part 180 may suspend its execution according to the user input detected by the grip sensor. The specific function may be a call function, a music playing function, or the like.
For example, as shown in (a) of fig. 8a, the control part 180 may perform a music playing function in a flipped state. At this time, as shown in fig. 8a (b), the control part 180 may end the music playing function according to the user input detected by the grip sensor while the music playing function is being executed in the flipped state.
Although not shown, when the control unit 180 outputs notification information for notifying the reception of a call signal in the flipped state, the output of the notification information may be suspended according to the user input detected by the grip sensor.
Although not shown, the control unit 180 may interrupt the reception of the call signal in accordance with the user input detected by the grip sensor when outputting notification information for notifying the reception of the call signal. That is, the rejection function may be performed.
In addition, as shown in fig. 8b (a), the mobile terminal may be in a standby mode in the flipped state. The standby mode may indicate a state in which no function set on the mobile terminal is being performed. In this case, as shown in (b) of fig. 8b, the control part 180 may execute a silent mode set not to output notification information, according to the user input detected by the grip sensor. The silent mode is a function set so that notification information for notifying the occurrence of a specific event, such as a call signal notification or a message reception notification, is not output at all, or is output only in a visual or tactile manner. In the silent mode, the brightness of the touch screen may also be set to a minimum.
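The flipped-state behaviors above can be gathered into a single dispatch, sketched below for only the cases discussed in the text; the enum and the mapping are illustrative.

```kotlin
// Sketch: dispatch for a grip input received in the flipped state.
enum class TerminalActivity { MUSIC_PLAYING, INCOMING_CALL, STANDBY }

sealed interface Action
object StopMusic : Action
object RejectCall : Action     // or suspend the notification output
object EnterSilentMode : Action

fun onGripWhileFlipped(activity: TerminalActivity): Action = when (activity) {
    TerminalActivity.MUSIC_PLAYING -> StopMusic
    TerminalActivity.INCOMING_CALL -> RejectCall
    TerminalActivity.STANDBY -> EnterSilentMode
}
```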
That is, the mobile terminal of the present invention can conveniently perform operation control associated with at least one of the state information of the mobile terminal and the currently executed function in response to a user input applied through the grip sensor in the flipped state, in which a view of the touch screen cannot be secured. Therefore, the user can perform operation control suited to the current situation without confirming the visual information displayed on the terminal.
As shown in fig. 9a (a), the control unit 180 may detect that the mobile terminal is inside a bag based on detection information received from at least one of the plurality of sensors. Specifically, the control unit 180 may receive detection information indicating that an object is present near the touch screen from the proximity sensor, detection information indicating that the main body is in a skewed state from the gyro sensor, and illuminance information indicating the illuminance around the touch screen from the illuminance sensor. The control unit 180 may detect that the mobile terminal is inside a bag by using the detection information detected by the proximity sensor, the detection information detected by the gyro sensor, and the illuminance information detected by the illuminance sensor. For example, the control unit 180 may determine that the mobile terminal is inside a bag when an object exists near the touch screen, the terminal body is in a skewed state, and the illuminance around the terminal is equal to or less than a reference value. For convenience of description, the state in which the mobile terminal is inside a bag will be referred to as the in-bag state.
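A companion sketch for the in-bag decision follows; the reference illuminance is a hypothetical figure, since the text only speaks of a reference value.

```kotlin
// Sketch: determine the in-bag state from three sensor readings.
data class PocketSensors(
    val objectNearScreen: Boolean, // proximity sensor: object near the screen
    val bodySkewed: Boolean,       // gyro sensor: body in a skewed posture
    val ambientLux: Double         // illuminance sensor: light around the screen
)

const val DARK_REFERENCE_LUX = 5.0 // illustrative reference value

fun isInBag(s: PocketSensors): Boolean =
    s.objectNearScreen && s.bodySkewed && s.ambientLux <= DARK_REFERENCE_LUX
```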
In the in-bag state, if a user input is detected by the grip sensor, the control part 180 may perform different functions depending on whether a specific function is being performed in the mobile terminal.
For example, in the in-bag state, if a user input is detected by the grip sensor while a specific function is being executed, the control section 180 may suspend the execution of the specific function. For example, as shown in (a) and (b) of fig. 9a, in the in-bag state, if a user input is detected by the grip sensor while the music playing function is being executed, the control section 180 may suspend the execution of the music playing function.
As another example, as shown in (a) and (b) of fig. 9b, the control part 180 may execute the silent mode function if a user input is detected by the grip sensor while the standby mode is in operation in the in-bag state. Therefore, even in a state where a view of the visual information displayed on the touch screen cannot be secured because the terminal is located in a bag, the user can easily control the operation of the terminal without taking it out of the bag.
In addition, after the posture of the body is changed, when the user input is detected by the grip sensor within a preset time, the control part 180 may temporarily operate the landscape view mode.
Here, the landscape view mode relates to the display direction of visual information displayed on the touch screen. Next, the landscape view mode and the portrait view mode will be described in detail.
Referring to fig. 10a (a), the touch screen may be a rectangle having a first side larger than a second side. At this time, the first side may be referred to as a height h or a longitudinal direction, and the second side may be referred to as a width w or a lateral direction.
The main body may be in any one of a portrait posture in which a height h of the touch screen is parallel to the gravity direction a-a 'and a landscape posture in which the height h is perpendicular to the gravity direction a-a'.
The landscape view mode is a mode in which visual information displayed on the touch screen is displayed in the width w direction in a state in which the main body is in a landscape posture. The portrait view mode is a mode in which visual information displayed on the touch screen is displayed in the height h direction.
The control unit 180 may be set to display only in the portrait view mode even if the posture of the main body changes. In this case, as shown in fig. 10a (a) and (b), the control section 180 may display an image in the portrait view mode even if the body changes from the portrait posture to the landscape posture. That is, the display direction of the image may remain unchanged.
At this time, as shown in fig. 10a (c), after the posture of the main body is changed, if a user input is detected by the grip sensor within a preset time, the control part 180 may temporarily operate the landscape view mode. In this case, when the posture of the main body changes from the landscape posture back to the portrait posture, the control part 180 may suspend the landscape view mode and re-execute the portrait view mode.
In addition, when a user input is detected by the grip sensor within a preset time after fingerprint information is input through the fingerprint sensor, the control part 180 may temporarily execute a lock mode function in which the terminal does not operate according to touch input. For example, as shown in (a) and (b) of fig. 10b, in a state where card information is displayed on the touch screen 151 and fingerprint information has been input through the fingerprint sensor, the control part 180 may execute the lock mode function according to a user input detected by the grip sensor. By this, the present invention can prevent another person from arbitrarily operating the mobile terminal when it is handed over for card payment.
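Both features above follow the same pattern: a trigger event arms a time window, and a grip input inside that window enters a temporary mode. The sketch below illustrates that pattern; the window length and type names are assumptions.

```kotlin
// Sketch: "grip within a preset time after a trigger" pattern. A posture change
// arms a temporary landscape view mode; a fingerprint input arms a temporary
// lock mode.
const val GRIP_WINDOW_MS = 3_000L // hypothetical preset time

enum class Trigger { POSTURE_CHANGED, FINGERPRINT_INPUT }
enum class TemporaryMode { LANDSCAPE_VIEW, LOCK_MODE }

class GripWindow(private val now: () -> Long = System::currentTimeMillis) {
    private var armedAt = 0L
    private var armedTrigger: Trigger? = null

    fun onTrigger(trigger: Trigger) {
        armedTrigger = trigger
        armedAt = now()
    }

    /** Returns the temporary mode to enter, or null if the window has lapsed. */
    fun onGripInput(): TemporaryMode? {
        val trigger = armedTrigger ?: return null
        if (now() - armedAt > GRIP_WINDOW_MS) return null
        return when (trigger) {
            Trigger.POSTURE_CHANGED -> TemporaryMode.LANDSCAPE_VIEW
            Trigger.FINGERPRINT_INPUT -> TemporaryMode.LOCK_MODE
        }
    }
}
```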
In the above, a method of executing a function associated with state information of a mobile terminal according to a user input detected by a grip sensor has been described. By this, the user can more conveniently control the mobile terminal even when the touch screen is not directly controlled or the visual field is not secured.
Next, a method of performing multitasking using a user input detected by a grip sensor will be described. Fig. 11a and 11b are conceptual views illustrating a method of performing multitasking using a user input detected by a grip sensor.
The control part 180 may perform a multitasking function when rotational movement of the body is detected in a state where a user input is detected by the grip sensor. The multitasking function is a function of dividing the screen of the touch screen 151 into a plurality of regions and outputting the running screen of a different application in each of the divided regions. That is, not only are a plurality of applications run simultaneously, but the running screens of the plurality of applications are also provided simultaneously.
On the other hand, when the multitasking function is executed, the control part 180 may determine the arrangement region of the operation screen according to the rotation direction of the terminal.
For example, as shown in fig. 11a (a), while the running screen 1110 of a first application is displayed on the touch screen 151, the control unit 180 may execute the multitasking function when the main body is rotated in a first direction (counterclockwise) in a state where the user input is detected by the grip sensor.
As shown in (b) of fig. 11a, when the multitasking function is executed, the control unit 180 may divide the touch screen into two regions and display the execution screen 1110 of the first application in a first region (i.e., on the left side with respect to the front of the touch screen). An execution screen of a second application different from the first application may be displayed in the second region. The second application may be the most recently run application among the applications running in the background.
On the other hand, as shown in (b) of fig. 11a, when there is no application running in the background, the control unit 180 may display, in the second region, icons 1120 of frequently used applications. How frequently an application is used may be determined from the number of times it has been run on the mobile terminal.
As shown in (c) of fig. 11a, if a rotation of the main body in a second direction (clockwise) different from the first direction is detected while the multitasking function is being executed, the control unit 180 may re-divide the touch screen. The control unit 180 may display the execution screen 1110 of the first application in a third region located near the upper end of the main body, and display the icons 1120 of the frequently used applications in a fourth region located near the lower end of the main body.
In addition, (a) to (c) of fig. 11b show the arrangement of the execution screens of the multitasking function when the main body is rotated in the direction opposite to that of (a) to (c) of fig. 11a. In this case, unlike fig. 11a, when the main body is rotated in the second direction (clockwise), the execution screen 1110 of the first application may be displayed in the second region and the icons 1120 of the frequently used applications in the first region. When the main body is then rotated in the first direction, the control unit 180 may display the execution screen 1110 of the first application in the third region and the icons 1120 of the frequently used applications in the fourth region. That is, in the present invention, the arrangement position of an application's execution screen when the multitasking function is executed can be determined according to the rotation direction of the main body.
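Reduced to its logic, figs. 11a and 11b describe two decisions: the rotation that triggers the split selects the initial region for the foreground application's screen, and a subsequent rotation in the opposite direction moves that screen to the upper region. The Kotlin sketch below encodes only the left/right and upper/lower relationships the text fixes; the enum and pane names are assumptions.

```kotlin
// Illustrative only: maps the triggering rotation to the pane layout of
// figs. 11a/11b. FIRST_DIR = counterclockwise, SECOND_DIR = clockwise.

enum class Rotation { FIRST_DIR, SECOND_DIR }

data class Layout(val appPane: String, val iconPane: String)

// Initial split: the direction of the triggering rotation decides whether the
// foreground app's screen lands in the first (left) or second (right) region.
fun initialLayout(trigger: Rotation): Layout = when (trigger) {
    Rotation.FIRST_DIR  -> Layout(appPane = "first (left)",   iconPane = "second (right)")
    Rotation.SECOND_DIR -> Layout(appPane = "second (right)", iconPane = "first (left)")
}

// A subsequent rotation in the opposite direction re-divides the screen:
// the app screen moves to the third (upper) region, icons to the fourth (lower).
fun repartitionedLayout(): Layout =
    Layout(appPane = "third (upper)", iconPane = "fourth (lower)")

fun main() {
    println(initialLayout(Rotation.FIRST_DIR))   // fig. 11a (b)
    println(repartitionedLayout())               // fig. 11a (c) / fig. 11b (c)
}
```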
In the above, a method of executing the multitasking function using a user input detected by the grip sensor has been described. By this, the user can execute the multitasking function more conveniently and, before it is executed, can directly choose the region in which the currently displayed execution screen will be arranged.
The mobile terminal of the present invention can improve user convenience by executing a function associated with a plurality of pieces of visual information in response to a user input applied through the grip sensor while an edit mode for editing the plurality of pieces of visual information displayed on the touch screen is being executed.
In addition, the mobile terminal of the present invention can improve user convenience by performing operation control associated with the state of the terminal in response to a user input applied through the grip sensor.
The present invention described above can be implemented as computer-readable code on a medium in which a program is recorded. The computer-readable medium includes all kinds of recording devices that store data readable by a computer system. Examples of the computer-readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and the code may also be realized in the form of a carrier wave (e.g., transmission over the internet). The computer may include the control unit 180 of the terminal. The above detailed description is therefore not to be taken in a limiting sense, but is to be construed as illustrative. The scope of the invention should be determined by reasonable interpretation of the appended claims, and all changes that come within the equivalent scope of the invention fall within the scope of the invention.

Claims (20)

1. A mobile terminal, comprising:
a main body provided with a case that forms an exterior appearance;
a memory storing a plurality of visual information;
a touch screen disposed on a front surface of the main body and displaying at least one of the plurality of visual information;
a grip sensor disposed at a side of the main body and attached to an inner surface of the case, the grip sensor detecting a user input applied to the side; and
a control unit configured to execute a select-all function of setting the displayed at least one piece of visual information to an editable selected state in accordance with a user input detected by the grip sensor while an editing mode for editing the plurality of pieces of visual information is being executed,
wherein the control unit does not set the remaining visual information other than the displayed at least one piece of visual information to the selected state even when the select-all function is executed.
2. The mobile terminal of claim 1,
the control unit determines the number of pieces of visual information to be displayed on the touch screen based on a touch input applied in a preset manner, and
in a state where the visual information corresponding to the determined number is displayed on the touch screen, the control unit executes the select-all function such that, of the visual information stored in the memory, only the visual information corresponding to the determined number is set to the selected state in accordance with the user input detected by the grip sensor.
3. The mobile terminal of claim 2,
in a state where a first number of pieces of the visual information stored in the memory are displayed, the control unit reduces the number of pieces of visual information displayed on the touch screen to a second number smaller than the first number in accordance with a touch input applied in a preset manner, and
when a user input is applied via the grip sensor while the second number of pieces of visual information are displayed, the control unit executes the select-all function such that all of the second number of pieces of visual information are set to the selected state.
4. The mobile terminal of claim 1,
in a state where some of the plurality of items in list information including a plurality of items are displayed on the touch screen, the control unit executes a scroll function of scrolling the list information in accordance with a drag input applied to the touch screen, and when a user input is detected by the grip sensor after the scroll function is executed, the control unit selects the items of the list information displayed on the touch screen as a result of executing the scroll function.
5. The mobile terminal of claim 1,
in a selected state in which the plurality of pieces of visual information are all selected, the control unit executes a deselect-all function such that the plurality of pieces of visual information are set to an unselected state in accordance with a user input detected by the grip sensor.
6. The mobile terminal of claim 1,
the plurality of pieces of visual information are associated with functions different from each other, and
the control unit activates all of the different functions or deactivates all of the different functions according to the user input detected by the grip sensor.
7. The mobile terminal of claim 1,
the touch screen further includes an input window for displaying characters input by a user control instruction, and the control unit executes a select-all function in accordance with the user input detected by the grip sensor such that all of the characters displayed on the input window are selected.
8. The mobile terminal of claim 1,
the mobile terminal further includes:
a proximity sensor detecting an object located at a periphery of the touch screen; and
a gyro sensor detecting a tilt of the main body,
the control unit stops execution of a specific function if a user input is detected by the grip sensor while an object located in the periphery of the touch screen is detected and the specific function is being executed in a state where the main body is horizontal, and
the control unit executes a mute mode in which no audible alert sound is output when an object located in the periphery of the touch screen is detected and the mobile terminal is in a standby state while the main body is horizontal.
9. The mobile terminal of claim 1,
the mobile terminal further includes:
a proximity sensor detecting an object located at a periphery of the touch screen;
a gyro sensor detecting a tilt of the main body; and
an illuminance sensor that detects peripheral illuminance of the touch screen,
the control unit stops execution of a specific function when a user input is detected by the grip sensor while the specific function is being executed in a state where an object located in the periphery of the touch screen is detected, the main body is not horizontal, and the illuminance around the touch screen is equal to or lower than a reference value, and
the control unit executes a mute mode, in which notification information is controlled not to be output, when a user input is detected by the grip sensor in a standby state in which an object located in the periphery of the touch screen is detected, the main body is not horizontal, the illuminance around the touch screen is equal to or lower than the reference value, and no specific function is being executed.
10. The mobile terminal of claim 1,
the mobile terminal further includes an acceleration sensor detecting acceleration of the main body,
the touch screen is rectangular with a first side longer than a second side,
the body assumes either a portrait posture, in which the first side is arranged parallel to the direction of gravity, or a landscape posture, in which the first side is arranged perpendicular to the direction of gravity, and
when it is detected that the posture of the main body has changed from the portrait posture to the landscape posture and the grip sensor detects a user input in the landscape posture, the control unit changes the display direction of the visual information displayed on the touch screen.
11. The mobile terminal of claim 10,
when the acceleration sensor detects that the posture of the main body has changed from the landscape posture back to the portrait posture, the control unit returns the display direction of the visual information to the state before the change.
12. The mobile terminal of claim 1,
the mobile terminal further includes a fingerprint sensor that detects a fingerprint of a user's finger, and
when a user input is detected by the grip sensor after the fingerprint sensor detects the fingerprint, the control unit executes a lock function such that the screen information displayed on the touch screen is not changed.
13. The mobile terminal of claim 1,
when the body rotates in a state where first visual information is displayed on the touch screen and a user input is detected by the grip sensor, the control unit executes a multi-window function of dividing the touch screen into a plurality of regions and displaying different visual information in each region, and determines the display region of the first visual information according to the rotation direction of the body.
14. The mobile terminal of claim 13,
the touch screen includes a first area and a second area,
the control unit displays the first visual information in the first area when the rotation direction of the body is a first direction, and
the control unit displays the first visual information in the second area when the rotation direction of the body is a second direction.
15. The mobile terminal of claim 13,
the control unit displays second visual information different from the first visual information in one region of the touch screen when the multi-window function is executed, and
the second visual information is either an icon of a frequently used application or the execution screen of the application most recently run among the applications running in the background.
16. A control method of a mobile terminal provided with a grip sensor that detects a user input applied to a side of a main body, characterized by comprising:
a step of displaying a part of the visual information stored in a memory on the touch screen;
a step of detecting a user input by the grip sensor while an editing mode for editing a plurality of pieces of visual information is being executed; and
a step of executing a select-all function of setting the displayed at least one piece of visual information to an editable selected state in accordance with the user input,
wherein the select-all function does not set the remaining visual information other than the displayed at least one piece of visual information to the selected state.
17. The control method of a mobile terminal according to claim 16,
the step of displaying the part of the visual information on the touch screen further comprises:
a step of displaying the part of the visual information together with new visual information in accordance with a touch input applied to the part of the visual information in a preset manner; and
a step of setting the part of the visual information and the new visual information all to the selected state when the grip sensor detects a user input in a state where the part of the visual information and the new visual information are displayed.
18. The control method of a mobile terminal according to claim 16, further comprising:
a step of setting the selected state of the part of the visual information to a non-editable canceled state in accordance with a user input detected by the grip sensor in a state where the part of the visual information is displayed.
19. The control method of a mobile terminal according to claim 16,
some of the plurality of items in list information including a plurality of items are displayed on the touch screen, and
the step of executing the select-all function further comprises:
a step of executing a scroll function of scrolling the list information in accordance with a drag input applied to the touch screen in a state where the some items are displayed; and
a step of selecting the items of the list information displayed on the touch screen as a result of executing the scroll function when the user input is detected by the grip sensor after the scroll function is executed.
20. The control method of a mobile terminal according to claim 16,
first visual information is displayed on the touch screen,
the control method further comprises a step of executing a multi-window function of dividing the touch screen into a plurality of regions and displaying different visual information in each region when the body is rotated in a state where the user input is detected by the grip sensor while the first visual information is displayed, and
the first visual information is displayed in a specific region among the divided regions according to the rotation direction of the body.
CN201880076824.3A 2018-02-13 2018-02-13 Mobile terminal and control method thereof Pending CN111418197A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/KR2018/001897 WO2019160171A1 (en) 2018-02-13 2018-02-13 Mobile terminal and control method therefor

Publications (1)

Publication Number Publication Date
CN111418197A 2020-07-14

Family

ID=67618673

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880076824.3A Pending CN111418197A (en) 2018-02-13 2018-02-13 Mobile terminal and control method thereof

Country Status (5)

Country Link
US (1) US11102343B2 (en)
EP (1) EP3713202A4 (en)
KR (1) KR102434869B1 (en)
CN (1) CN111418197A (en)
WO (1) WO2019160171A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022070081A (en) * 2020-10-26 2022-05-12 レノボ・シンガポール・プライベート・リミテッド Information processing device and control method
US20220342517A1 (en) * 2021-04-21 2022-10-27 Sap Se Selecting all items or displayed items

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004177993A (en) 2002-11-22 2004-06-24 Panasonic Mobile Communications Co Ltd Mobile terminal with pressure sensor, and program executable by mobile terminal with pressure sensor
WO2010036050A2 (en) 2008-09-26 2010-04-01 Lg Electronics Inc. Mobile terminal and control method thereof
KR101975946B1 (en) * 2012-05-31 2019-05-09 엘지전자 주식회사 Terminal having user interface using grip and touch and control method thereof
US20140282253A1 (en) * 2013-03-14 2014-09-18 Tencent Technology (Shenzhen) Company Limited Method and apparatus for batch selection of multiple images
WO2014204022A1 (en) * 2013-06-17 2014-12-24 Lg Electronics Inc. Mobile terminal
KR102139110B1 (en) * 2013-06-20 2020-07-30 삼성전자주식회사 Electronic device and method for controlling using grip sensing in the electronic device
JP6454482B2 (en) * 2014-05-28 2019-01-16 京セラ株式会社 Mobile device
JP6050282B2 (en) * 2014-06-09 2016-12-21 富士フイルム株式会社 Electronics
KR101631966B1 (en) * 2014-06-19 2016-06-20 엘지전자 주식회사 Mobile terminal and method for controlling the same
JP6361579B2 (en) * 2015-05-22 2018-07-25 京セラドキュメントソリューションズ株式会社 Display device and image forming apparatus
CN107870686A (en) * 2016-09-27 2018-04-03 深圳富泰宏精密工业有限公司 Electronic installation and its control method

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017062712A (en) * 2015-09-25 2017-03-30 シャープ株式会社 Electronic apparatus
CN105511680A (en) * 2015-12-28 2016-04-20 努比亚技术有限公司 Terminal screen touch control device and method
CN107479810A (en) * 2016-06-07 2017-12-15 北京三星通信技术研究有限公司 Operating method and terminal device based on secondary display area
CN106502536A (en) * 2016-10-31 2017-03-15 珠海市魅族科技有限公司 A kind of page info choosing method and device
CN107153546A (en) * 2017-05-12 2017-09-12 厦门美图移动科技有限公司 A kind of video broadcasting method and mobile device
CN107273012A (en) * 2017-06-29 2017-10-20 努比亚技术有限公司 One kind grips object processing method, equipment and computer-readable recording medium

Also Published As

Publication number Publication date
WO2019160171A1 (en) 2019-08-22
EP3713202A1 (en) 2020-09-23
US20200389550A1 (en) 2020-12-10
KR20200110298A (en) 2020-09-23
US11102343B2 (en) 2021-08-24
EP3713202A4 (en) 2021-06-23
KR102434869B1 (en) 2022-08-22

Similar Documents

Publication Publication Date Title
US9846507B2 (en) Mobile terminal and control method thereof
US10666780B2 (en) Mobile terminal and control method therefor
CN106850938B (en) Mobile terminal and control method thereof
CN106067834B (en) Wearable device and control method thereof
EP3128725B1 (en) Mobile terminal and controlling method thereof
CN105404412B (en) Portable terminal and control method thereof
EP3130993B1 (en) Mobile terminal and method for controlling the same
EP3244288B1 (en) Mobile terminal and method for controlling the same
KR102228856B1 (en) Mobile terminal and method for controlling the same
KR102225945B1 (en) Mobile terminal and method for controlling the same
US9774360B2 (en) Mobile terminal and method for controlling the same
KR102208115B1 (en) Mobile terminal and method for controlling the same
KR20180106056A (en) Mobile terminal and method for controlling the same
KR20160099399A (en) Watch type terminal
KR102223728B1 (en) Mobile terminal and method for controlling the same
KR101642808B1 (en) Mobile terminal and method for controlling the same
KR20170021159A (en) Mobile terminal and method for controlling the same
KR20170079549A (en) Mobile terminal and method for controlling the same
KR20180067855A (en) Mobile terminal and method for controlling the same
KR20160057225A (en) Mobile terminal and method for controlling the same
JP2016062590A (en) Mobile terminal and method for controlling the same
KR20210055094A (en) Mobile terminal
KR102434869B1 (en) Mobile terminal and its control method
KR102135374B1 (en) Mobile terminal and method of controlling the same
KR20160134334A (en) Mobile terminal and method of controlling the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200714