KR20130060862A - Mobile terminal and control method thereof - Google Patents

Mobile terminal and control method thereof Download PDF

Info

Publication number
KR20130060862A
Authority
KR
South Korea
Prior art keywords
keys
group
mobile terminal
keys included
groups
Prior art date
Application number
KR1020110127150A
Other languages
Korean (ko)
Inventor
이종훈 (Lee Jong-hoon)
Original Assignee
엘지전자 주식회사 (LG Electronics Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Priority to KR1020110127150A priority Critical patent/KR20130060862A/en
Publication of KR20130060862A publication Critical patent/KR20130060862A/en

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0412Digitisers structurally integrated in a display
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2201/00Electronic components, circuits, software, systems or apparatus used in telephone systems
    • H04M2201/36Memories
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/22Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

PURPOSE: A mobile terminal and a control method thereof are provided that allow a user to conveniently input keys belonging to a Korean (Hangul) group, an English group, and a numeric group without extra manipulation, by grouping all keys included in a virtual keypad and displaying them together on one screen with different three-dimensional depth values. CONSTITUTION: A display unit (251) displays a virtual keypad including a plurality of keys. A memory stores attribute information for each key. A control unit groups the keys into a plurality of groups based on the attribute information and controls the three-dimensional depth values of the keys so that keys in different groups have different depth values. [Reference numerals] (AA) Drag

Description

MOBILE TERMINAL AND CONTROL METHOD THEREOF

The present invention relates to a mobile terminal, and more particularly, to a mobile terminal capable of displaying a virtual keypad on a screen and a control method thereof.

Terminals may be divided into mobile/portable terminals and stationary terminals depending on whether they are movable. Mobile terminals may be further divided into handheld terminals and vehicle-mounted terminals according to whether the user can carry them directly.

As functions have diversified, terminals have come to be implemented as multimedia players with complex functions such as taking pictures or videos, playing music or video files, playing games, and receiving broadcasts. To support and enhance these functions, improvements to both the structural and software parts of the terminal may be considered.

Thanks to these improvements, the terminal can display a virtual keypad on the screen. However, since the size of the terminal screen is limited, not all keys included in the virtual keypad can be displayed on one screen. In addition, the chat window or browser screen displayed on the screen is obscured by the virtual keypad.

An object of the present invention is to provide a mobile terminal and a control method thereof capable of displaying all keys included in a virtual keypad on a single screen.

According to an embodiment of the present invention, a mobile terminal includes a display unit that is configured to allow touch input and displays a virtual keypad including a plurality of keys; a memory that stores attribute information for each of the plurality of keys; and a control unit that groups the plurality of keys into a plurality of groups based on the attribute information and controls the three-dimensional depth values of the keys so that the keys included in one group and the keys included in another group have different three-dimensional depth values.
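The grouping and depth assignment described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the attribute names ("hangul", "english", "numeric") and all function names are assumptions chosen for clarity.

```python
# Hypothetical sketch: each key carries attribute information, keys are
# grouped by that attribute, and every group gets a distinct 3D depth value.
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class Key:
    label: str
    attribute: str   # e.g. "hangul", "english", "numeric" (illustrative)
    depth: int = 0   # 3D depth value used when rendering the key

def group_keys(keys):
    """Group keys by their attribute information."""
    groups = defaultdict(list)
    for key in keys:
        groups[key.attribute].append(key)
    return groups

def assign_depths(groups):
    """Give the keys of each group a distinct three-dimensional depth value."""
    for depth, (attr, members) in enumerate(sorted(groups.items())):
        for key in members:
            key.depth = depth
    return groups

keys = [Key("ㄱ", "hangul"), Key("a", "english"), Key("1", "numeric")]
groups = assign_depths(group_keys(keys))
```

Because each group ends up at a different depth, all keys can be rendered on one screen while remaining visually separable.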

In an embodiment, the control unit may generate priority information that determines the priority of the plurality of groups based on the attribute information of the keys included in each group, and may control the three-dimensional depth values of the keys included in each group based on that priority information.
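One plausible reading of this priority step is sketched below, assuming usage frequency serves as the attribute from which priority information is generated; the function name and the frequency numbers are hypothetical.

```python
# Illustrative sketch: priority information ranks the groups, and the
# highest-priority group is brought nearest to the user (smallest depth).
def prioritize_groups(groups, usage_count):
    """Rank groups so the most frequently used group gets depth 0 (nearest)."""
    ranked = sorted(groups, key=lambda g: usage_count.get(g, 0), reverse=True)
    return {group: rank for rank, group in enumerate(ranked)}

priority = prioritize_groups(
    ["hangul", "english", "numeric"],
    {"hangul": 120, "english": 45, "numeric": 10})
# priority maps each group to its 3D depth: {"hangul": 0, "english": 1, "numeric": 2}
```

Mapping rank directly to depth keeps the most-used character set in the visual foreground.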

In an embodiment, when a touch input for at least one of the plurality of groups is detected, the controller may control the three-dimensional depth values of the keys included in the touched group based on the touch input.

In example embodiments, when a touch input to the display unit is detected, the control unit may control the three-dimensional depth values of the keys so that the depth values of the keys included in the one group and the keys included in the other group are switched with each other.
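The depth-swap behavior can be sketched as exchanging depth values between the touched group and whichever group is currently in the foreground. All names here are assumptions; the patent does not prescribe this exact rule.

```python
# Hedged sketch: on touch input, swap the touched group's depth with the
# depth of the group currently nearest to the user (depth 0).
def swap_depths(group_depths, touched_group):
    """Exchange the touched group's depth with the nearest group's depth."""
    nearest = min(group_depths, key=group_depths.get)
    group_depths[nearest], group_depths[touched_group] = (
        group_depths[touched_group], group_depths[nearest])
    return group_depths

depths = {"hangul": 0, "english": 1, "numeric": 2}
swap_depths(depths, "numeric")
# "numeric" now has depth 0 (foreground) and "hangul" has depth 2
```

Swapping rather than reassigning all depths means a single touch brings any group forward without disturbing the ordering of the untouched groups.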

The control unit may display the plurality of keys on the display unit such that the keys included in the one group and the keys included in the other group overlap each other.

In an embodiment, the controller may control the colors of the plurality of keys such that the keys included in the one group and the keys included in the other group are distinguished from each other.

In an embodiment, when selection of any one of the plurality of keys is detected, the control unit may retrieve from the memory at least one word that starts with the letter corresponding to the selected key, detect the character that follows the selected key's letter in the retrieved word(s), and control the three-dimensional depth value of the key corresponding to that next character.
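This prediction step can be sketched as a dictionary lookup followed by a depth adjustment for the predicted keys. The word list, function names, and depth convention below are illustrative assumptions, not the stored dictionary the patent refers to.

```python
# Illustrative sketch: after a key is selected, words beginning with the
# typed prefix are looked up, and the key for each word's next character
# is brought to the front by lowering its 3D depth value.
def predict_next_keys(dictionary, typed):
    """Return the set of characters that follow the typed prefix in the dictionary."""
    return {word[len(typed)] for word in dictionary
            if word.startswith(typed) and len(word) > len(typed)}

def highlight(key_depths, next_chars, near_depth=0):
    """Bring the predicted keys nearer to the user by lowering their depth."""
    for ch in next_chars:
        if ch in key_depths:
            key_depths[ch] = near_depth
    return key_depths

words = ["key", "keyboard", "keypad"]
nxt = predict_next_keys(words, "ke")   # after typing "ke", "y" is predicted
```

Raising only the predicted keys gives the user a visual cue about the likely next input without hiding the rest of the keypad.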

In an embodiment, when a touch input is detected on any one of the plurality of keys, the controller may control at least one of a color and a shape of the touched key to distinguish the touched key from other keys.

The present invention also relates to a control method of a mobile terminal that includes a display unit configured to allow touch input and to display a virtual keypad. A method of controlling a mobile terminal according to an embodiment of the present disclosure includes extracting attribute information for each of a plurality of keys included in the virtual keypad; grouping the plurality of keys into a plurality of groups based on the attribute information; and controlling the three-dimensional depth values of the plurality of keys such that the keys included in one group and the keys included in another group have different three-dimensional depth values.

The controlling of the three-dimensional depth values of the plurality of keys may include generating priority information that determines the priority of the plurality of groups based on the attribute information of the keys included in each group, and controlling the three-dimensional depth values of the keys included in each group based on the priority information.

According to the present invention, all the keys included in the virtual keypad are grouped into a plurality of groups, and the keys included in one group and the keys included in another group are displayed together on one screen with different three-dimensional depth values. The user can therefore enter keys belonging to the Hangul group, the English group, and the numeric group more conveniently, without any additional manipulation.

In addition, according to the present invention, all the keys included in the virtual keypad are displayed on one screen, so that the user can more conveniently view and enter double consonants, double vowels, and other characters that were previously not displayed on the virtual keypad or were displayed only in combination with other keys.

In addition, according to the present invention, all the keys included in the virtual keypad are displayed on the screen with three-dimensional depth values, so that the user can fully view the chat window or browser screen otherwise obscured by the virtual keypad, without any complicated manipulation. As a result, user convenience is improved.

FIG. 1 is a block diagram illustrating a mobile terminal according to the present invention.
FIGS. 2A and 2B are perspective views showing the appearance of a mobile terminal related to the present invention.
FIG. 3 is a flowchart illustrating a method of controlling a mobile terminal according to an exemplary embodiment of the present invention.
FIGS. 4A, 4B, 5A, 5B, 6A, 6B, 7A, 7B, 8A, and 8B are conceptual views illustrating operation examples of the mobile terminal according to FIG. 3.

DETAILED DESCRIPTION Hereinafter, exemplary embodiments of the present invention will be described with reference to the accompanying drawings so that those skilled in the art may easily implement the technical idea of the present invention. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. In the drawings, parts irrelevant to the description are omitted in order to clearly describe the present invention, and like reference numerals designate like parts throughout the specification.

The mobile terminal described herein may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, and the like. However, those skilled in the art will readily appreciate that the configurations according to the embodiments described herein may also be applied to fixed terminals such as digital TVs and desktop computers, except where applicable only to mobile terminals.

FIG. 1 is a block diagram illustrating a mobile terminal 100 according to the present invention.

Referring to FIG. 1, the mobile terminal 100 includes a wireless communication unit 110, an A/V input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply unit 190. The components shown in FIG. 1 are not essential, so a mobile terminal having more or fewer components may be implemented.

Hereinafter, the components 110 to 190 of the mobile terminal 100 will be described in order.

The wireless communication unit 110 may include one or more modules for enabling wireless communication between the mobile terminal 100 and the wireless communication system or between the mobile terminal 100 and the network in which the mobile terminal 100 is located. For example, the wireless communication unit 110 may include a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short range communication unit 114, and a location information module 115.

The broadcast receiving module 111 receives broadcast signals and broadcast related information from an external broadcast management server through a broadcast channel. Here, the broadcast-related information means information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast-related information can also be provided through a mobile communication network. In this case, the broadcast-related information may be received by the mobile communication module 112. The broadcast signal and the broadcast related information received through the broadcast receiving module 111 may be stored in the memory 160.

The mobile communication module 112 transmits and receives radio signals to at least one of a base station, an external terminal, and a server on a mobile communication network. Such wireless signals may include various types of data depending on a voice call signal, a video call signal, a text message, or a multimedia message transmission / reception.

The wireless Internet module 113 is a module for wireless Internet access, and may be embedded in the mobile terminal 100 or externally. Wireless Internet technologies may include Wireless LAN (Wi-Fi), Wireless Broadband (Wibro), World Interoperability for Microwave Access (Wimax), High Speed Downlink Packet Access (HSDPA), and the like.

The short-range communication unit 114 is a module for short-range communication. As a short range communication technology, Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and the like can be used.

The position information module 115 is a module for acquiring the position of the mobile terminal 100, and a representative example thereof is a Global Position System (GPS) module.

Referring still to FIG. 1, the A/V (audio/video) input unit 120 is for inputting audio and video signals and may include a camera 121, a microphone 122, and the like. The camera 121 processes image frames, such as still images and moving images, obtained by its image sensor in the video call mode or the photographing mode. The processed image frames may be displayed on the display unit 151, stored in the memory 160, or transmitted to the outside through the wireless communication unit 110. Two or more cameras 121 may be provided depending on the use environment.

The microphone 122 processes an external sound signal into electrical voice data in the call mode, the recording mode, the voice recognition mode, and the like. In the call mode, the processed voice data may be converted into a form transmittable to a mobile communication base station through the mobile communication module 112 and output. The microphone 122 may implement various noise reduction algorithms for eliminating noise generated while an external sound signal is being input.

The user input unit 130 generates input data for controlling the operation of the mobile terminal 100 by a user. The user input unit 130 may include a key pad, a dome switch, a touch pad (static pressure and static electricity), a jog wheel, a jog switch, and the like.

The sensing unit 140 senses the current state of the mobile terminal 100, such as the open/closed state of the mobile terminal 100, its position and orientation, the presence or absence of user contact, and acceleration or deceleration, and generates a corresponding sensing signal for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is a slide phone, the sensing unit 140 may detect whether the slide phone is opened or closed. The sensing unit 140 may also sense whether the power supply unit 190 is supplying power, whether the interface unit 170 is connected to an external device, and the like.

The sensing unit 140 may include a proximity sensor 141. The sensing unit 140 may include a touch sensor (not shown) for sensing a touch operation with respect to the display unit 151.

The touch sensor may have the form of a touch film, a touch sheet, a touch pad, or the like. The touch sensor may be configured to convert a pressure applied to a specific portion of the display portion 151 or a change in capacitance generated at a specific portion of the display portion 151 into an electrical input signal. The touch sensor may be configured to detect a touch pressure as well as a touched position and area.

When the touch sensor and the display unit 151 have a mutual layer structure, the display unit 151 can be used as an input device in addition to the output device. The display unit 151 may be referred to as a 'touch screen'.

If there is a touch input via the touch screen, signals corresponding thereto are sent to a touch controller (not shown). The touch controller processes the signals transmitted from the touch sensor, and then transmits data corresponding to the processed signals to the controller 180. Thus, the control unit 180 can know which area of the display unit 151 is touched.

When the touch screen is electrostatic (capacitive), it may be configured to detect the proximity of a sensing object based on changes in the electric field caused by the object's approach. Such a touch screen may be classified as a proximity sensor 141.

The proximity sensor 141 refers to a sensor that detects the presence or absence of an object to be sensed without mechanical contact using an electromagnetic force or infrared rays. The proximity sensor 141 has a longer life than the contact type sensor and its utilization is also high. Examples of the proximity sensor 141 include a transmission type photoelectric sensor, a direct reflection type photoelectric sensor, a mirror reflection type photoelectric sensor, a high frequency oscillation type proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor.

Hereinafter, for convenience of explanation, the action in which a sensing object approaches the touch screen without contacting it is referred to as a "proximity touch", while the action of the sensing object actually contacting the touch screen is referred to as a "contact touch".

The proximity sensor 141 detects the presence or absence of a proximity touch and the proximity touch pattern (for example, the proximity touch distance, direction, speed, time, position, and movement state). Information corresponding to the detected proximity touch and proximity touch pattern may be output on the touch screen.

The output unit 150 generates an output related to visual, auditory, tactile, and the like. The output unit 150 may include a display unit 151, an audio output module 153, an alarm unit 154, and a haptic module 155.

The display unit 151 displays (outputs) information processed by the mobile terminal 100. For example, when the mobile terminal 100 operates in the call mode, the display unit 151 displays a UI (User Interface) or a GUI (Graphic User Interface) related to the call. When the mobile terminal 100 operates in the video communication mode or the photographing mode, the display unit 151 displays the photographed image, the received image, the UI, the GUI, and the like.

The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT-LCD), an organic light emitting diode (OLED) display, a flexible display, a 3D display, and an e-ink display.

At least one display (or display element) included in the display unit 151 may be configured to be transparent or light-transmissive so that the outside can be seen through it. Such a display may be referred to as a transparent display, a typical example of which is the transparent OLED (TOLED). The rear structure of the display unit 151 may also be light-transmissive. With this structure, the user can see an object located behind the terminal body through the area occupied by the display unit 151.

Two or more display units 151 may be provided depending on the implementation of the mobile terminal 100. For example, a plurality of display units may be arranged on one surface, spaced apart or integrated with one another, or may be arranged on different surfaces.

The sound output module 153 may output audio data received from the wireless communication unit 110 or stored in the memory 160 in the call signal reception mode, the call mode, the recording mode, the voice recognition mode, the broadcast reception mode, and the like. The sound output module 153 also outputs sound signals related to functions performed in the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The sound output module 153 may include a receiver, a speaker, a buzzer, and the like.

The alarm unit 154 outputs a signal for notifying the occurrence of an event in the mobile terminal 100. Examples of events generated in the mobile terminal 100 include call signal reception, message reception, key signal input, and touch input. The alarm unit 154 may also output a signal for notifying the occurrence of an event in a form other than a video or audio signal, for example, vibration. Since video or audio signals can be output through the display unit 151 or the audio output module 153, the display unit 151 and the audio output module 153 may be classified as part of the alarm unit 154.

The haptic module 155 generates various tactile effects that the user can feel. A typical example of a haptic effect generated by the haptic module 155 is vibration. The intensity, pattern, and the like of the vibration generated by the haptic module 155 are controllable. For example, different vibrations may be synthesized and output, or output sequentially.

In addition to vibration, the haptic module 155 may generate various other effects, such as a pin arrangement moving vertically against the contacted skin surface, a spraying or suction force of air through a jet or suction port, a brush against the skin surface, contact with an electrode, an electrostatic force, and the sensation of warmth or coolness reproduced using a heat-absorbing or heat-emitting element.

The haptic module 155 can be configured to not only transmit the tactile effect through the direct contact but also to allow the user to feel the tactile effect through the muscular sense of the finger or arm. At least two haptic modules 155 may be provided according to the configuration of the mobile terminal 100.

The memory 160 may store a program for the operation of the controller 180 and temporarily store input and output data (e.g., a phone book, a message, a still image, a moving picture, etc.). The memory 160 may store data related to vibrations and sounds of various patterns that are output upon touch input on the touch screen.

The memory 160 may include at least one type of storage medium among flash memory, a hard disk, a multimedia card micro type memory, a card type memory (e.g., SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), and programmable read-only memory (PROM). The mobile terminal 100 may also operate in association with web storage that performs the storage function of the memory 160 over the Internet.

The interface unit 170 serves as a path to all external devices connected to the mobile terminal 100. The interface unit 170 receives data or power from an external device and transfers it to each component inside the mobile terminal 100, or transmits data from inside the mobile terminal 100 to an external device. For example, the interface unit 170 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like.

The identification module is a chip that stores various information for authenticating the usage rights of the mobile terminal 100, and may include a user identity module (UIM), a subscriber identity module (SIM), and a universal subscriber identity module (USIM). A device equipped with an identification module (hereinafter, an "identification device") may be manufactured in a smart card format. Accordingly, the identification device can be connected to the terminal 100 through a port.

When the mobile terminal 100 is connected to an external cradle, the interface unit 170 may serve as a path through which power from the cradle is supplied to the mobile terminal 100, or through which various command signals input by the user at the cradle are transferred to the mobile terminal 100. Such command signals or power supplied from the cradle may also serve as a signal for recognizing that the mobile terminal 100 is correctly mounted in the cradle.

The controller 180 controls the overall operation of the mobile terminal 100, for example, control and processing related to voice calls, data communication, and video calls. The controller 180 may include a multimedia module 181 for multimedia playback. The multimedia module 181 may be implemented within the controller 180 or separately from it. The controller 180 may also perform pattern recognition processing to recognize handwriting input and drawing input on the touch screen as characters and images, respectively.

The power supply unit 190 receives an external power source and an internal power source under the control of the controller 180 to supply power for operation of each component.

The various embodiments described herein may be implemented in a recording medium readable by a computer or similar device using software, hardware, or a combination thereof.

According to a hardware implementation, the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electronic units designed to perform the functions described herein. In some cases, the embodiments may be implemented by the controller 180 itself.

According to the software implementation, embodiments such as the procedures and functions described herein may be implemented as separate software modules. Each of the software modules may perform one or more of the functions and operations described herein. Software code may be implemented in a software application written in a suitable programming language. Such software code may be stored in the memory 160 and executed by the controller 180.

Hereinafter, a method of processing user input to the mobile terminal 100 will be described.

The user input unit 130 is manipulated to receive commands for controlling the operation of the mobile terminal 100 and may include a plurality of manipulation units. The manipulation units may also be referred to as manipulating portions and may employ any mechanism that the user can operate by touch.

Various kinds of visual information can be displayed on the display unit 151. Such visual information can be displayed in the form of letters, numbers, symbols, graphics, icons, and the like, and can also be formed as a three-dimensional stereoscopic image. For input of such information, at least one of the letters, numbers, symbols, graphics, and icons may be displayed in a predetermined arrangement, thereby implementing a keypad. Such a keypad may be referred to as a so-called "soft key".

The display unit 151 may operate as an entire area or may be divided into a plurality of areas and operated. In the latter case, the plurality of areas can be configured to operate in association with one another. For example, an output window and an input window may be displayed on the upper and lower portions of the display unit 151, respectively. The output window and the input window are areas allocated for outputting or inputting information, respectively. In the input window, a soft key with a number for inputting a telephone number may be output. When the soft key is touched, the number corresponding to the touched soft key is displayed in the output window. When the operating unit is operated, a call connection to the telephone number displayed in the output window may be attempted, or the text displayed in the output window may be entered into the application.

The display unit 151 or the touch pad may be configured to detect touch scrolling. By scrolling the display unit 151 or the touch pad, the user can move a cursor or pointer positioned on an object displayed on the display unit 151, for example, an icon. Further, when a finger is moved across the display unit 151 or the touch pad, the path along which the finger moves may be visually displayed on the display unit 151. This is useful for editing an image displayed on the display unit 151.

One function of the mobile terminal 100 may be executed in response to a case where the display unit 151 and the touch pad are touched together within a predetermined time range. In the case of being touched together, there may be a case where the user clamps the main body of the mobile terminal 100 using the thumb and index finger. At this time, one function of the mobile terminal 100 to be executed may be activation or deactivation for the display unit 151 or the touch pad, for example.

FIGS. 2A and 2B are perspective views showing the appearance of the mobile terminal 100 related to the present invention.

FIG. 2A shows the front side and one side of the mobile terminal 100, and FIG. 2B shows the rear side and the other side of the mobile terminal 100.

Referring to FIG. 2A, the mobile terminal 100 includes a bar-shaped terminal body. However, the mobile terminal 100 is not limited thereto, and may be realized in various forms such as a slide type, a folder type, a swing type, and a swivel type in which two or more bodies are movably coupled to each other.

The terminal body includes a case (casing, housing, cover, and the like) that forms its appearance. In this embodiment, the case may be divided into a front case 101 and a rear case 102. Various electronic components are embedded in the space formed between the front case 101 and the rear case 102. At least one intermediate case may be additionally disposed between the front case 101 and the rear case 102.

The cases may be formed by injection molding of a synthetic resin, or may be formed of a metal material such as stainless steel (STS) or titanium (Ti).

The display unit 151, the sound output unit 152, the camera 121, the user input unit 130 (see FIG. 1), the microphone 122, the interface 170, and the like may be disposed in the terminal body, mainly in the front case 101.

The display unit 151 occupies most of the main surface of the front case 101. The sound output unit 152 and the camera 121 are located in a region adjacent to one end of the display unit 151, and the first user input unit 131 and the microphone 122 are located in a region adjacent to the other end. The second user input unit 132 and the interface 170 may be located on the sides of the front case 101 and the rear case 102.

The user input unit 130 is manipulated to receive commands for controlling the operation of the mobile terminal 100, and may include first and second user input units 131 and 132.

The first and second user input units 131 and 132 may receive various commands. For example, the first user input unit 131 may receive commands such as start, end, and scroll. The second user input unit 132 may receive commands such as adjusting the volume of the sound output from the sound output unit 152 or switching to the touch selection mode of the display unit 151.

Referring to FIG. 2B, a rear camera 121' may be additionally mounted on the rear surface of the terminal body, that is, on the rear case 102. The rear camera 121' has a photographing direction opposite to that of the front camera 121 (see FIG. 2A), and may be configured to have a pixel count different from that of the front camera 121.

For example, the front camera 121 may be configured with a low pixel count and the rear camera 121' with a high pixel count. Accordingly, when the front camera 121 is used during a video call to photograph the user's face and transmit it to the counterpart in real time, the size of the transmission data may be reduced. On the other hand, the rear camera 121' may be used for storing high-quality images.

Meanwhile, the cameras 121 and 121' may be installed in the terminal body so as to be rotatable or to pop up.

The flash 123 and the mirror 124 may be positioned adjacent to the rear camera 121'. The flash 123 emits light toward the subject when the user photographs the subject with the rear camera 121'. The mirror 124 allows the user to see his or her own face when photographing himself or herself (self-photographing) using the rear camera 121'.

A rear sound output unit 152 'may be additionally disposed on the rear surface of the terminal body. The rear sound output unit 152 ′ may perform a stereo function together with the front sound output unit 152 (see FIG. 2A), and may perform a speakerphone function during a call.

In addition to the antenna for communication, an antenna 116 for receiving broadcast signals may be additionally provided on the side surface of the terminal body. The antenna 116, constituting a part of the broadcast receiving module 111 (see FIG. 1), may be installed so as to be drawn out of the terminal body.

A power supply unit 190 for supplying power to the mobile terminal 100 is mounted on the terminal body. The power supply unit 190 may be built in the main body of the terminal or may be detachable from the outside of the main body of the terminal.

The rear case 102 may further include a touch pad 135 for sensing a touch. The touch pad 135 may be of a light transmission type like the display unit 151 (see FIG. 2A). In addition, a rear display unit for outputting visual information may be additionally mounted on the touch pad 135. In this case, information output from both the front display unit 151 and the rear display unit may be controlled by the touch pad 135.

The touch pad 135 operates in correlation with the display unit 151. The touch pad 135 may be located parallel to and behind the display unit 151, and may have a size equal to or smaller than that of the display unit 151.

The mobile terminal 100 is evolving to display a three-dimensional stereoscopic image that enables depth perception and stereovision beyond the level of two-dimensional image display. A 3D image is an image in which the depth and reality of an object positioned on the display unit 151 can be felt in the same manner as in real space. As a result, the user can enjoy a more realistic user interface or content through 3D images.

Meanwhile, the mobile terminal 100 may display a virtual keypad on the screen. However, since the size of the display unit 151 of the mobile terminal 100 is limited, there is a problem that not all of the keys included in the virtual keypad can be displayed on the display unit 151. In addition, there is a problem that a chat window or browser screen displayed on the display unit 151 is covered by the virtual keypad.

Accordingly, hereinafter, the mobile terminal 100 capable of displaying all the keys included in the virtual keypad on the display unit 151 in a three-dimensional form will be described with reference to the accompanying drawings.

In the present specification, the term "three-dimensional depth value" refers to an index indicating the distance difference between objects included in a three-dimensional image. Specifically, when the user looks at an object through the display unit 151 and the object appears in two-dimensional form, the three-dimensional depth value of the object is "0". However, when the object appears in three-dimensional form, for example, when the object appears to protrude out of the mobile terminal 100 with respect to the display unit 151, the three-dimensional depth value of the object is negative. Conversely, if the object appears to be inside the mobile terminal 100, the three-dimensional depth value of the object is positive.

That is, when objects included in a 3D image appear to protrude to the outside of the mobile terminal 100 with respect to the display unit 151, the closer an object appears, the larger the absolute value of its (negative) three-dimensional depth value.

Hereinafter, for convenience of description, when objects included in a 3D image appear to protrude to the outside of the mobile terminal 100 with respect to the display unit 151, the closer an object appears, the larger its "three-dimensional depth value" will be said to be, and the farther it appears, the smaller. This convention will be obvious to those of ordinary skill in the art.
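As a rough illustration of the sign convention above (a sketch for explanation only, not part of the disclosure; the function name and return labels are assumptions):

```python
def apparent_position(depth_value: float) -> str:
    """Classify where an object appears relative to the display surface.

    Follows the 'convenience' convention above: 0 means flat (2D),
    a larger positive value means the object appears to protrude
    further toward the user, and a negative value means it appears
    recessed inside the terminal.
    """
    if depth_value == 0:
        return "on-screen (2D)"
    return "protruding" if depth_value > 0 else "recessed"
```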

FIG. 3 is a flowchart illustrating a control method of the mobile terminal 100 (see FIG. 1) according to an exemplary embodiment.

The mobile terminal 100 may include a display unit 151 (see FIG. 1), a memory 160 (see FIG. 1), and a controller 180 (see FIG. 1).

Referring to FIG. 3, first, an operation S110 of extracting attribute information of each of a plurality of keys included in a virtual keypad is performed.

The display unit 151 may be configured to allow touch input and to display a virtual keypad including a plurality of keys. The virtual keypad may be a standard-array QWERTY keypad in which Q, W, E, R, T, and Y are located at the top left, or a 3x4 keypad in which at least one letter corresponds to each of 12 keys. Hereinafter, for convenience of description, a QWERTY-type virtual keypad will be described as an example.

The memory 160 may store attribute information of each of the plurality of keys. Here, attribute information refers to data provided about content; it may include at least one of the location and substance of the content, information about its author, rights conditions, use conditions, and use history.

Specifically, the attribute information of each of the plurality of keys may include property information of each key, as well as use frequency information, recent usage history information, and the like. The property information of each key may include, for example, whether the character corresponding to the key is Korean, English, or numeric; if Korean, whether it corresponds to a single consonant, a double consonant, a short vowel, or a double vowel; and if English, whether it corresponds to an uppercase or lowercase letter.
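As an illustration only, the attribute information described above could be modeled as a simple per-key record; every field name here is an assumption for the sketch, not part of the disclosure:

```python
from dataclasses import dataclass


@dataclass
class KeyAttributes:
    """Hypothetical per-key attribute record (field names are assumed)."""
    label: str              # the character shown on the key, e.g. 'ㄱ' or 'q'
    script: str             # 'korean', 'english', or 'number'
    subclass: str = ''      # e.g. 'single_consonant', 'double_vowel', 'uppercase'
    use_count: int = 0      # use frequency information
    last_used: float = 0.0  # recent usage history (timestamp)
```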

Next, based on the attribute information, step S120 of grouping the plurality of keys into a plurality of groups is performed.

In detail, the controller 180 may group the plurality of keys into a plurality of groups based on the properties of each of the plurality of keys. For example, the controller 180 may group the plurality of keys into a Korean group, an English group, and a number group according to whether each key corresponds to Korean, English, or a number. In addition, the controller 180 may group the keys corresponding to Korean into a single consonant group, a double consonant group, a short vowel group, and a double vowel group.
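The grouping of step S120 can be sketched as follows; the attribute table, its entries, and the function name are illustrative assumptions:

```python
# Hypothetical attribute table mapping each key label to its script class.
ATTRS = {
    'ㄱ': 'korean', 'ㅏ': 'korean',
    'q': 'english', 'w': 'english',
    '1': 'number', '2': 'number',
}


def group_keys(attrs):
    """Group keys into script-based groups, as in step S120."""
    groups = {}
    for key, script in attrs.items():
        groups.setdefault(script, []).append(key)
    return groups
```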

Meanwhile, the controller 180 may generate priority information that determines the priority of each of the plurality of keys based on the use frequency information of each key, and may group the plurality of keys into groups based on that priority information.

Thereafter, step S130 of controlling the three-dimensional depth values of the plurality of keys is performed, so that the keys included in one group and the keys included in another group have different three-dimensional depth values.

Specifically, the controller 180 may generate priority information that determines the priority among the plurality of groups based on the attribute information of the keys included in each group, and may control the 3D depth values of the keys included in each group based on that priority information.

For example, the controller 180 may control the three-dimensional depth values of the keys so that the keys included in the Korean group, the English group, and the numeric group have different three-dimensional depth values. In this case, the controller 180 may determine the priority of the groups in the order of the Korean group, the English group, and the numeric group, and accordingly control the depth values so that the keys included in the Korean group have the largest 3D depth value and the keys included in the numeric group have the smallest.
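The priority-to-depth mapping just described might be sketched as follows; the linear step size and function name are assumed for illustration:

```python
def assign_group_depths(groups_by_priority, step=1.0):
    """Give the highest-priority group the largest 3D depth value.

    groups_by_priority: group names ordered from highest to lowest
    priority (e.g. ['korean', 'english', 'number']).  The linear
    spacing `step` is an arbitrary illustrative choice.
    """
    n = len(groups_by_priority)
    return {g: (n - i) * step for i, g in enumerate(groups_by_priority)}
```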

In addition, the controller 180 may control the three-dimensional depth values of the keys included in each of the single consonant group, the double consonant group, the short vowel group, and the double vowel group, so that the keys in each group have different three-dimensional depth values.

As described above, according to the present invention, all the keys included in the virtual keypad are grouped into a plurality of groups, and the keys included in one group and those included in another group are displayed together on the screen with different three-dimensional depth values. The user can thus input keys corresponding to the Korean group, the English group, and the numeric group more conveniently, without any additional manipulation.

In addition, according to the present invention, since all the keys included in the virtual keypad are displayed on the screen with three-dimensional depth values, the user can view a chat window or browser screen that would otherwise be obscured by the virtual keypad, without any complicated manipulation. As a result, user convenience is improved.

FIGS. 4A and 4B are conceptual views illustrating an operation example of the mobile terminal 200 according to FIG. 3. The mobile terminal 200 may include a display unit 251, a memory 160 (see FIG. 1), and a controller 180 (see FIG. 1).

The display unit 251 may display a QWERTY-type virtual keypad. In this case, as illustrated in FIG. 4A, when a touch input to the virtual keypad (for example, a drag input from right to left) is detected, the controller 180 extracts attribute information of each of the plurality of keys from the memory 160 and, based on the attribute information, groups the plurality of keys into a plurality of groups. For example, the controller 180 may group the plurality of keys into a Korean group and an English group.

Subsequently, as illustrated in FIG. 4B, the controller 180 may control the 3D depth value of each of the plurality of groups based on the attribute information of the keys included in each group. For example, the controller 180 may control the 3D depth values of the keys so that the keys included in the Korean group and those included in the English group have different 3D depth values. As shown, the keys included in the Korean group may be displayed to protrude further out of the mobile terminal 200, with respect to the display unit 251, than the keys included in the English group. In this case, the keys included in the Korean group and those included in the English group may be displayed on the display unit 251 overlapping each other.

Accordingly, the keys included in the English group, which previously were either not displayed on the virtual keypad or were displayed together with the keys of the Korean group, may be displayed on the display unit 251 with a 3D depth value different from that of the keys in the Korean group. Although not shown, the keys included in the numeric group may likewise be displayed on the display unit 251 with a 3D depth value different from those of the Korean and English groups.

FIGS. 5A and 5B are conceptual views illustrating an operation example of the mobile terminal 200 according to FIG. 3. The mobile terminal 200 may include a display unit 251, a memory 160 (see FIG. 1), and a controller 180 (see FIG. 1).

The display unit 251 may display a QWERTY-type virtual keypad. As shown in FIG. 5A, due to space constraints, a typical QWERTY-type virtual keypad does not display the five double consonants (ㄲ, ㄸ, ㅃ, ㅆ, ㅉ) and two of the double vowels (ㅒ, ㅖ), or displays them on the same keys as the corresponding single consonants (ㄱ, ㄷ, ㅂ, ㅅ, ㅈ) and the corresponding vowels (ㅐ, ㅔ). As a result, to input one of the five double consonants or these two double vowels, the user had to repeatedly touch the key of the corresponding single consonant or vowel, which was inconvenient.

However, in this case, when a touch input to the virtual keypad (for example, a drag input from left to right) is detected, as shown in FIG. 5B, the controller 180 extracts attribute information of each of the plurality of keys from the memory 160 and, based on the attribute information, controls the three-dimensional depth value of each of the plurality of keys.

That is, the controller 180 may display the five double consonants, which previously were either not displayed on the virtual keypad or were displayed together with other keys, with a 3D depth value different from that of the other consonants. Likewise, the controller 180 may display the two double vowels with a 3D depth value different from that of the other vowels. As shown, the five double consonants and the two double vowels may be displayed to protrude further out of the mobile terminal 200, with respect to the display unit 251, than the other consonants and vowels, and may thereby be displayed in a manner distinguished from them.

Meanwhile, although not shown, the controller 180 may display compound consonants such as 'ㄳ' and compound vowels, which are not displayed on a common QWERTY-type virtual keypad, on the virtual keypad with 3D depth values different from those of the other consonants and vowels.

FIGS. 6A and 6B are conceptual views illustrating an operation example of the mobile terminal 200 according to FIG. 3. The mobile terminal 200 may include a display unit 251, a memory 160 (see FIG. 1), and a controller 180 (see FIG. 1).

The display unit 251 may display a virtual keypad including keys included in the Korean group and keys included in the English group having different three-dimensional depth values. In this case, when a touch input to the virtual keypad is detected, the controller 180 may control three-dimensional depth values of keys included in the touched group based on the degree of the touch applied to the virtual keypad.

In detail, when detecting a touch input on any one of the plurality of groups, the controller 180 may detect the degree to which the touch is applied. The controller 180 can determine this degree by detecting at least one of a change in the touch operation and the duration from the start of the touch input to its release.

Here, the degree to which the touch is applied may be the time the touch was held, the number of touches, or the length of a drag, and, upon detection of a multi-touch input, may be the distance between the first and second touch points.

For example, as shown in FIG. 6A, when the user drags at least one of the keys included in the Korean group in a first direction (left to right), a control command for increasing the 3D depth value of the keys included in the Korean group may be detected.

Although the drawings illustrate a case in which a drag gesture is detected as the touch input to the virtual keypad, the type of touch input is not limited thereto. For example, a control command for increasing the three-dimensional depth value of the keys included in a group may also be sensed when a single tap gesture, a flick gesture, a pinch-in gesture, or a pinch-out gesture is detected.

Accordingly, based on the control command, as shown in FIG. 6B, the keys included in the Korean group may be displayed to protrude out of the mobile terminal 200 with respect to the display unit 251. Although not shown, when the user drags at least one of the keys included in the Korean group further in the first direction, the keys included in the Korean group may be displayed to protrude even more.

Although not shown, when the user drags at least one of the keys included in the Korean group in a second direction (right to left), a control command for reducing the 3D depth value of the keys included in the Korean group may be detected.
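The two drag directions just described can be sketched together as a single adjustment rule; the sensitivity constant and function name are assumptions, not values from the disclosure:

```python
def adjust_group_depth(current_depth, drag_dx, sensitivity=0.01):
    """Increase the group's 3D depth on a left-to-right drag (positive
    drag_dx) and decrease it on a right-to-left drag (negative drag_dx).

    drag_dx is the horizontal drag distance in pixels; `sensitivity`
    is an assumed tuning constant for the illustration.
    """
    return current_depth + drag_dx * sensitivity
```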

Meanwhile, the controller 180 may increase or decrease the 3D depth value of all the keys included in any one group, or may increase or decrease the 3D depth value of only some of the keys included in that group.

As described above, according to the present invention, by changing the three-dimensional depth value of at least one of the plurality of keys based on a touch input, the user can adjust the stereoscopic effect of the keys displayed on the screen to match his or her viewing angle and viewing environment.

FIGS. 7A and 7B are conceptual views illustrating an operation example of the mobile terminal 200 according to FIG. 3. The mobile terminal 200 may include a display unit 251, a memory 160 (see FIG. 1), and a controller 180 (see FIG. 1).

The display unit 251 may display a virtual keypad in which the keys included in the Korean group and the keys included in the English group have different three-dimensional depth values. In this case, when a touch input to the virtual keypad is detected, the controller 180 may control the 3D depth values of the plurality of keys so that the 3D depth values of the keys included in one group and those of the keys included in another group are switched with each other.

For example, as shown in FIG. 7A, when the user touches the key corresponding to 'Korean/English conversion' among the keys included in the Korean group (hereinafter, the 'Korean/English key'), a control command for switching the three-dimensional depth values of the keys included in the Korean group and those included in the English group may be sensed. That is, a control command for reducing the 3D depth value of the keys included in the Korean group and increasing the 3D depth value of the keys included in the English group may be sensed.

Accordingly, based on the control command, as shown in FIG. 7B, the keys included in the English group may be displayed to protrude further out of the mobile terminal 200, with respect to the display unit 251, than the keys included in the Korean group.
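The switching behavior triggered by the 'Korean/English key' can be sketched as follows; the function and group names are assumptions for the illustration:

```python
def swap_group_depths(depths, group_a, group_b):
    """Switch the 3D depth values of two key groups, as when the
    'Korean/English' toggle key is touched."""
    swapped = dict(depths)  # leave the caller's mapping untouched
    swapped[group_a], swapped[group_b] = depths[group_b], depths[group_a]
    return swapped
```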

Meanwhile, the control command for switching the 3D depth values of the keys included in one group and those included in another group is not limited to the above-described touch input (a touch input on the 'Korean/English key'). Specifically, although not shown, such a control command may also be sensed when the user scrolls the virtual keypad, or when the user selects at least one of the keys included in one group and then drags it toward at least one of the keys included in the other group.

FIGS. 8A and 8B are conceptual views illustrating an operation example of the mobile terminal 200 according to FIG. 3. The mobile terminal 200 may include a display unit 251, a memory 160 (see FIG. 1), and a controller 180 (see FIG. 1).

As shown in FIG. 8A, when a selection of any one of the plurality of keys (e.g., 'ㅇ') is detected, the controller 180 may control at least one of the color and the shape of the selected key so that it is distinguished from the other keys.

In detail, the controller 180 may momentarily change at least one of the color, brightness, and saturation of the selected key. In addition, although not shown, the controller 180 may momentarily give the selected key a sparkling effect, or may temporarily increase or decrease the size of the selected key.

Meanwhile, the controller 180 may detect, in the memory 160, at least one word starting with the letter corresponding to the selected key (e.g., 'ㅇ').

Next, the controller 180 may detect, in the detected at least one word, the character that follows the character corresponding to the selected key (e.g., 'ㅇ'). Thereafter, as illustrated in FIG. 8B, the controller 180 may control the 3D depth value of the key (e.g., 'ㅏ') corresponding to the detected next character.

Specifically, the key (e.g., 'ㅏ') corresponding to the detected next character may be given a three-dimensional depth value different from that of the other keys included in the virtual keypad. The controller 180 may display this key so that it protrudes out of the mobile terminal 200 with respect to the display unit 251, so that the user can more easily recognize the key corresponding to the next character.
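The next-character lookup of FIGS. 8A and 8B can be sketched as follows; Latin letters stand in for the Korean example, and the dictionary and function name are assumptions:

```python
def next_characters(selected, dictionary):
    """From words starting with the selected letter, collect the character
    that follows it; the keys for these characters would then be given a
    distinct 3D depth value so they protrude toward the user."""
    return {w[1] for w in dictionary if w.startswith(selected) and len(w) > 1}
```

If the set comes back empty, no word starting with the selected letter was found, which corresponds to the notification-signal case described below.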

Although not shown, the controller 180 may output a notification signal if a word starting with a letter corresponding to the selected key is not detected in the memory 160. Here, the notification signal may include at least one of a voice signal output from the sound output module 153 (see FIG. 1) and a vibration signal generated from the haptic module 155 (see FIG. 1).

Further, according to the embodiments disclosed herein, the above-described methods can be implemented as processor-readable code on a medium on which a program is recorded. Examples of processor-readable media include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage; the methods may also be implemented in the form of a carrier wave (e.g., transmission over the Internet).

The mobile terminal described above is not limited to the configurations and methods of the embodiments described above; rather, all or some of the embodiments may be selectively combined so that various modifications may be made.

Claims (10)

A mobile terminal comprising:
A display unit configured to enable touch input and to display a virtual keypad including a plurality of keys;
A memory for storing attribute information of each of the plurality of keys; and
A control unit configured to group the plurality of keys into a plurality of groups based on the attribute information, and to control the three-dimensional depth values of the plurality of keys such that keys included in one group of the plurality of groups and keys included in another group have different three-dimensional depth values.
The mobile terminal of claim 1,
The control unit,
wherein the control unit generates priority information that determines the priority among the plurality of groups, based on the attribute information of the plurality of keys included in each of the plurality of groups, and controls the three-dimensional depth values of the plurality of keys included in each of the plurality of groups based on the priority information.
The mobile terminal of claim 2,
The control unit,
wherein, when a touch input for at least one of the plurality of groups is detected, the control unit controls the three-dimensional depth values of the plurality of keys included in the touched group based on the touch input.
The mobile terminal of claim 2,
The control unit,
wherein, when a touch input to the display unit is detected, the control unit controls the three-dimensional depth values of the plurality of keys so that the three-dimensional depth values of the keys included in the one group and those of the keys included in the other group are switched with each other.
The mobile terminal of claim 1,
The control unit,
wherein the control unit displays the plurality of keys on the display unit such that the keys included in the one group and the keys included in the other group overlap each other.
The mobile terminal of claim 1,
The control unit,
wherein the control unit controls the colors of the plurality of keys such that the keys included in the one group and the keys included in the other group are distinguished from each other.
The mobile terminal of claim 1,
The control unit,
wherein the control unit detects a selection of any one of the plurality of keys, detects in the memory at least one word starting with the letter corresponding to the selected key,
detects, in the detected at least one word, the character that follows the character corresponding to the selected key, and
controls the three-dimensional depth value of the key corresponding to the detected next character.
The mobile terminal of claim 1,
The control unit,
wherein, when a touch input on any one of the plurality of keys is detected, the control unit controls at least one of the color and the shape of the touched key so that the touched key is distinguished from the other keys.
A control method of a mobile terminal including a display unit formed to enable touch input and displaying a virtual keypad, the method comprising:
Extracting attribute information of each of the plurality of keys included in the virtual keypad;
Grouping the plurality of keys into a plurality of groups based on the attribute information; And
Controlling the three-dimensional depth values of the plurality of keys such that keys included in one group of the plurality of groups and keys included in another group have different three-dimensional depth values.
The method of claim 9,
Controlling the three-dimensional depth value of the plurality of keys,
Generating priority information that determines the priority among the plurality of groups, based on the attribute information of the plurality of keys included in each of the plurality of groups; and
Controlling the three-dimensional depth values of the plurality of keys included in each of the plurality of groups, based on the priority information.
KR1020110127150A 2011-11-30 2011-11-30 Mobile terminal and control method thereof KR20130060862A (en)


Publications (1)

Publication Number Publication Date
KR20130060862A true KR20130060862A (en) 2013-06-10


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015147549A1 (en) * 2014-03-25 2015-10-01 박인기 Device and method for inputting chinese characters, and chinese character search method using same
KR20150111329A (en) * 2014-03-25 2015-10-05 박인기 Device and method for inputting chinese characters, and method for searching the chinese characters


Legal Events

Date Code Title Description
WITN Withdrawal due to no request for examination