KR20170021616A - Mobile terminal and method for controlling the same - Google Patents

Mobile terminal and method for controlling the same

Info

Publication number
KR20170021616A
Authority
KR
South Korea
Prior art keywords
images
image
thumbnail
thumbnail image
mobile terminal
Prior art date
Application number
KR1020150116222A
Other languages
Korean (ko)
Inventor
Kim Su-jin
Yoon Sung-hye
Original Assignee
LG Electronics Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc.
Priority to KR1020150116222A
Publication of KR20170021616A

Classifications

    • H04M1/72522
    • G06F17/30244
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/22Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a mobile terminal capable of searching and displaying a plurality of stored images, and to a control method thereof. The mobile terminal includes a memory for storing a plurality of images, a display unit for displaying thumbnail images corresponding to the plurality of images, and a controller that, when a preset touch gesture input is applied, simultaneously groups the plurality of images into at least one group according to a preset grouping criterion, and that simultaneously releases the grouped state of the images when another touch gesture input corresponding to the preset touch gesture input is applied. The preset grouping criterion includes at least one of the time at which the plurality of images were captured, the composition in which they were captured, and the location at which they were captured.

Description

MOBILE TERMINAL AND METHOD FOR CONTROLLING THE SAME

The present invention relates to a mobile terminal capable of searching and displaying a plurality of stored images and a control method of the mobile terminal.

Terminals can be divided into mobile (portable) terminals and stationary terminals depending on whether they can be moved. Mobile terminals can be further divided into handheld terminals and vehicle-mounted terminals depending on whether the user can carry them directly.

The functions of mobile terminals are diversifying. Examples include data and voice communication, photo and video capture through a camera, voice recording, music playback through a speaker system, and output of images or video on a display unit. Some terminals additionally provide electronic game play or multimedia player functions. In particular, modern mobile terminals can receive multicast signals that provide visual content such as broadcasts and video or television programs.

As technology develops, such terminals take on more functions. For example, a terminal may be implemented as a multimedia device combining functions such as capturing photos and videos, playing music or video files, gaming, and receiving broadcasts. To support and enhance such functions, improvements to the structural and/or software aspects of the terminal may be considered.

Meanwhile, with the development of memory technology, such terminals can be equipped with larger-capacity memories. As a result, current mobile terminals can store at least hundreds to thousands of pictures. Accordingly, methods that allow a user to manage such a large number of stored photographs more easily and simply are being actively researched.

The present invention has been made to solve the above-mentioned problems and other problems. An object of the present invention is to provide a mobile terminal, and a control method thereof, that enable a user to manage images more easily and conveniently by grouping images based on a simple predetermined touch gesture.

Another object of the present invention is to provide a mobile terminal, and a control method thereof, that allow a user to check the number of images included in a specific group, and the images themselves, without changing the current browsing level.

According to an aspect of the present invention, there is provided a mobile terminal including: a memory for storing a plurality of images; a display unit for displaying thumbnail images corresponding to the plurality of images; and a control unit that, when a preset touch gesture input is applied, simultaneously groups the plurality of images into at least one group according to a preset grouping criterion, and that simultaneously releases the grouped state of the images when another touch gesture input corresponding to the preset touch gesture input is applied. The preset grouping criterion includes at least one of the time at which the plurality of images were captured, the composition in which they were captured, and the location at which they were captured.
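The simultaneous grouping step above can be illustrated with a small sketch. The `Image` record, the one-hour window, and the greedy time-based clustering below are all hypothetical choices made for illustration; the specification only requires that the criterion involve capture time, composition, or location.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Image:
    name: str
    captured_at: float          # capture timestamp, seconds
    location: Tuple[float, float]  # (latitude, longitude), unused in this sketch

def group_by_time(images: List[Image], window: float = 3600.0) -> List[List[Image]]:
    """Group images whose capture times fall within `window` seconds of the
    previous image in the group -- one plausible reading of the 'captured
    time' grouping criterion. All images are grouped in a single pass."""
    groups: List[List[Image]] = []
    for img in sorted(images, key=lambda i: i.captured_at):
        if groups and img.captured_at - groups[-1][-1].captured_at <= window:
            groups[-1].append(img)   # close enough in time: same group
        else:
            groups.append([img])     # gap exceeded: start a new group
    return groups
```

A grouping by location would follow the same shape, clustering by distance between `location` coordinates instead of by time gap.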

In one embodiment, the controller determines a representative image from among the images included in a generated group according to a predetermined criterion, displays a first thumbnail image corresponding to the representative image for the generated group, and displays a graphic object indicating the number of images included in the group in one area of the first thumbnail image.

In one embodiment, the control unit determines the representative image based on at least one of the capture time, saturation, brightness, and sharpness of the images included in the group.
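One way to combine these attributes is a weighted score, with the highest-scoring image chosen as the representative. The weights and the dictionary representation below are hypothetical; the specification names the attributes but not how they are combined.

```python
def pick_representative(images, weights=None):
    """Pick a representative image by a weighted sum of quality attributes.
    Each image is a dict with 'sharpness', 'brightness', and 'saturation'
    values normalized to 0..1; the weights are illustrative defaults."""
    weights = weights or {"sharpness": 0.5, "brightness": 0.3, "saturation": 0.2}
    return max(images, key=lambda img: sum(w * img[k] for k, w in weights.items()))
```

Capture time could be folded in the same way, e.g. as a recency term added to the score.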

In one embodiment, when any one of the first thumbnail images is selected, the controller generates and displays, from one of the images corresponding to the selected first thumbnail image, a second thumbnail image having a higher resolution and a larger size than the other first thumbnail images corresponding to the plurality of images or the representative images.

In one embodiment, the control unit displays at least one thumbnail image corresponding to the images of the group associated with the selected first thumbnail image, based on the user's touch input on the second thumbnail image.

In one embodiment, the controller displays a graphic object in the form of a scroll bar in a part of the display area where the second thumbnail image is displayed and, according to the length of the user's drag input detected through the scroll-bar graphic object, displays a second thumbnail image corresponding to one of the images in the group associated with the selected first thumbnail image.
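The scroll-bar behavior amounts to mapping a drag distance to an image index within the group. The linear mapping and clamping below are an assumed implementation, sketched for illustration only.

```python
def index_from_drag(drag_px: float, bar_px: float, n_images: int) -> int:
    """Map a drag distance along a scroll bar of length bar_px to an
    image index within a group of n_images, clamping to the valid range."""
    if n_images <= 0:
        raise ValueError("group must contain at least one image")
    frac = min(max(drag_px / bar_px, 0.0), 1.0)  # clamp drag to [0, 1]
    return min(int(frac * n_images), n_images - 1)
```

Each detected drag update would then refresh the second thumbnail to the image at the returned index.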

In one embodiment, the control unit displays, in a part of the display area where the second thumbnail image is displayed, a plurality of graphic objects corresponding to the number of images in the group associated with the selected first thumbnail image, and displays the second thumbnail image corresponding to the image that matches whichever graphic object the user selects.

In one embodiment, the control unit displays thumbnail images corresponding to the images of the group associated with the selected first thumbnail image based on the user's touch input on the second thumbnail image, and displays at least one graphic object corresponding to at least one preset function in each area where the thumbnail images are displayed.

In one embodiment, when the corresponding touch gesture input is applied while the grouping state is released, the control unit displays information related to at least one of the plurality of images stored in the mobile terminal around the thumbnail image corresponding to that image.

In one embodiment, the preset touch gesture input is a pinch-in gesture input, and the other touch gesture input corresponding to the preset touch gesture input is a pinch-out gesture input.
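The pinch-in/pinch-out pairing can be sketched as a simple toggle over the gallery's item list. The list-of-lists representation and the pairwise grouping below are hypothetical stand-ins; a real implementation would group by the criteria (time, composition, location) described above rather than by position.

```python
def apply_pinch(gesture, items):
    """Toggle grouping with pinch gestures. `items` is either a flat list
    of image names (ungrouped) or a list of lists (grouped). Pinch-in
    groups all images at once (here: consecutive pairs, as a stand-in for
    the real criterion); pinch-out flattens every group simultaneously."""
    if gesture == "pinch_in" and items and not isinstance(items[0], list):
        return [items[i:i + 2] for i in range(0, len(items), 2)]
    if gesture == "pinch_out" and items and isinstance(items[0], list):
        return [img for group in items for img in group]
    return items  # gesture does not apply in the current state: no-op
```

Note that a pinch-out while already ungrouped leaves the list unchanged, matching the embodiment in which the two gestures act as complementary operations.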

According to another aspect of the present invention, there is provided a method of controlling a mobile terminal, the method including: displaying thumbnail images corresponding to a plurality of previously stored images; simultaneously grouping the plurality of images according to a predetermined grouping criterion when a preset touch gesture input is applied; displaying thumbnail images corresponding to the groups generated by the grouping; and simultaneously releasing the grouping state of the grouped images when another touch gesture input corresponding to the preset touch gesture input is applied. The predetermined grouping criterion includes at least one of the time at which the plurality of images were captured and the location at which they were captured.

The effects of the mobile terminal and the control method according to the present invention are as follows.

According to at least one of the embodiments of the present invention, a plurality of images can be simultaneously grouped and ungrouped based on a simple touch gesture, so that a user can manage a large number of images easily and quickly.

According to at least one of the embodiments of the present invention, images are grouped according to the time at which they were taken or their composition, so that a user can find a desired image more quickly and easily.

According to at least one of the embodiments of the present invention, information about the images included in a group is displayed on a graphic object representing the grouped images, and when a specific group is selected, the images included in that group are displayed as thumbnail images; thus the user can check the images in each group without changing the current browsing level.

FIG. 1A is a block diagram illustrating a mobile terminal according to the present invention.
FIGS. 1B and 1C are conceptual diagrams illustrating an example of a mobile terminal according to the present invention, viewed from different directions.
FIG. 2 is a flowchart illustrating the operation of grouping images according to a touch gesture input in a mobile terminal according to an embodiment of the present invention.
FIG. 3 is a flowchart illustrating, in more detail, the operation of displaying the grouped images within the process of FIG. 2.
FIG. 4 is a flowchart illustrating the operation of releasing the grouping state of grouped images in a mobile terminal according to an embodiment of the present invention.
FIG. 5 is a flowchart illustrating the operation of displaying information related to a selected group when a specific group of images is selected in a mobile terminal according to an embodiment of the present invention.
FIG. 6 is a diagram illustrating an example in which a plurality of images are grouped simultaneously according to a user's touch input in a mobile terminal according to an embodiment of the present invention.
FIGS. 7 to 9 are diagrams illustrating examples in which the images included in a selected group are displayed when a specific group is selected in a mobile terminal according to an embodiment of the present invention.
FIG. 10 is a diagram illustrating an example in which information related to each image is displayed according to a user's touch input in a mobile terminal according to an embodiment of the present invention.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, in which like or similar elements are denoted by the same or similar reference numerals and redundant descriptions are omitted. The suffixes "module" and "unit" for components used in the following description are given or used interchangeably only for ease of drafting the specification, and do not by themselves have distinct meanings or roles. In describing the embodiments disclosed herein, detailed descriptions of related known technologies are omitted when they could obscure the gist of the embodiments. The accompanying drawings are intended only to facilitate understanding of the embodiments disclosed herein; the technical idea disclosed herein is not limited by the drawings and should be understood to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention.

The mobile terminal described in this specification may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, and a wearable device (e.g., a smart watch, smart glasses, or a head mounted display (HMD)).

However, it will be readily apparent to those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as digital TVs, desktop computers, and digital signage, except where a configuration applies only to mobile terminals.

1A is a block diagram for explaining a mobile terminal 100 related to the present invention.

The mobile terminal 100 may include a wireless communication unit 110, an input unit 120, a sensing unit 140, an output unit 150, an interface unit 160, a memory 170, a control unit 180, a power supply unit 190, and the like. The components shown in FIG. 1A are not essential for implementing the mobile terminal 100, so the mobile terminal 100 described herein may have more or fewer components than those listed above.

The wireless communication unit 110 may include one or more modules that enable wireless communication between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal, or between the mobile terminal 100 and an external server. In addition, the wireless communication unit 110 may include one or more modules for connecting the mobile terminal 100 to one or more networks.

The wireless communication unit 110 may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.

The input unit 120 may include a camera 121 or an image input unit for inputting a video signal, a microphone 122 or an audio input unit for inputting an audio signal, and a user input unit 123 (e.g., a touch key, a mechanical key, and the like) for receiving information from a user. The voice data or image data collected by the input unit 120 may be analyzed and processed according to a user's control command.

The sensing unit 140 may include at least one sensor for sensing at least one of information in the mobile terminal 100, surrounding environment information of the mobile terminal 100, and user information. For example, the sensing unit 140 may include a proximity sensor 141, an illumination sensor 142, a touch sensor, an acceleration sensor, a magnetic sensor, a gravity sensor (G-sensor), a gyroscope sensor, a motion sensor, an RGB sensor, an infrared (IR) sensor, a finger scan sensor, an ultrasonic sensor, an optical sensor (e.g., the camera 121), a microphone (see 122), a battery gauge, an environmental sensor (e.g., a barometer, a hygrometer, a thermometer, a radiation detection sensor, a thermal sensor, a gas sensor), and a chemical sensor (e.g., an electronic nose, a healthcare sensor, a biometric sensor). Meanwhile, the mobile terminal disclosed in this specification may combine and utilize information sensed by at least two of these sensors.

The output unit 150 may include at least one of a display unit 151, an audio output unit 152, a haptic module 153, and an optical output unit 154 to generate output related to sight, hearing, or touch. The display unit 151 may form a mutual layer structure with a touch sensor or be formed integrally with it, thereby implementing a touch screen. Such a touch screen may function as the user input unit 123 providing an input interface between the mobile terminal 100 and the user, and may also provide an output interface between the mobile terminal 100 and the user.

The interface unit 160 serves as a passage to the various types of external devices connected to the mobile terminal 100. The interface unit 160 may include at least one of a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio I/O port, a video I/O port, and an earphone port. In the mobile terminal 100, appropriate control related to a connected external device may be performed in response to the external device being connected to the interface unit 160.

In addition, the memory 170 stores data supporting the various functions of the mobile terminal 100. The memory 170 may store a plurality of application programs (applications) run on the mobile terminal 100, as well as data and commands for the operation of the mobile terminal 100. At least some of these application programs may be downloaded from an external server via wireless communication. At least some of them may also exist on the mobile terminal 100 from the time of shipment for the basic functions of the mobile terminal 100 (e.g., call receiving and placing functions, message receiving and sending functions). An application program may be stored in the memory 170, installed on the mobile terminal 100, and driven by the control unit 180 to perform an operation (or function) of the mobile terminal 100.

In addition to operations related to application programs, the control unit 180 typically controls the overall operation of the mobile terminal 100. The control unit 180 may provide or process information or functions appropriate for a user by processing signals, data, information, and the like that are input or output through the above-mentioned components, or by driving an application program stored in the memory 170.

In addition, the control unit 180 may control at least some of the components illustrated in FIG. 1A in order to drive an application program stored in the memory 170. Furthermore, the control unit 180 may operate at least two of the components included in the mobile terminal 100 in combination with each other to drive the application program.

The power supply unit 190 receives external power and internal power under the control of the controller 180 and supplies power to the components included in the mobile terminal 100. The power supply unit 190 includes a battery, which may be an internal battery or a replaceable battery.

At least some of these components may operate in cooperation with each other to implement the operation, control, or control method of the mobile terminal 100 according to the various embodiments described below. The operation, control, or control method of the mobile terminal 100 may be implemented on the mobile terminal 100 by driving at least one application program stored in the memory 170.

Hereinafter, the components listed above will be described in more detail with reference to FIG. 1A, before examining the various embodiments implemented through the mobile terminal 100 described above.

First, regarding the wireless communication unit 110: the broadcast receiving module 111 of the wireless communication unit 110 receives broadcast signals and/or broadcast-related information from an external broadcast management server through a broadcast channel. The broadcast channel may include a satellite channel and a terrestrial channel. Two or more broadcast receiving modules may be provided in the mobile terminal 100 for simultaneous reception of at least two broadcast channels or for broadcast channel switching.

The mobile communication module 112 transmits and receives wireless signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network established according to technical standards or communication methods for mobile communication (e.g., Global System for Mobile communication (GSM), Code Division Multi Access (CDMA), Code Division Multi Access 2000 (CDMA2000), Enhanced Voice-Data Optimized or Enhanced Voice-Data Only (EV-DO), Wideband CDMA (WCDMA), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), Long Term Evolution-Advanced (LTE-A), and the like).

The wireless signals may include voice call signals, video call signals, or various types of data according to text/multimedia message transmission and reception.

The wireless Internet module 113 is a module for wireless Internet access, and may be built in or externally attached to the mobile terminal 100. The wireless Internet module 113 is configured to transmit and receive a wireless signal in a communication network according to wireless Internet technologies.

Wireless Internet technologies include, for example, WLAN (Wireless LAN), Wi-Fi (Wireless Fidelity), Wi-Fi Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), and LTE-A (Long Term Evolution-Advanced). The wireless Internet module 113 transmits and receives data according to at least one wireless Internet technology, including Internet technologies not listed above.

From the viewpoint that wireless Internet access by WiBro, HSDPA, HSUPA, GSM, CDMA, WCDMA, LTE, or LTE-A is performed through a mobile communication network, the wireless Internet module 113 performing wireless Internet access through the mobile communication network may be understood as a kind of mobile communication module 112.

The short-range communication module 114 is for short-range communication and may support short-range communication using at least one of Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), Near Field Communication (NFC), Wi-Fi (Wireless Fidelity), Wi-Fi Direct, and Wireless USB (Wireless Universal Serial Bus) technologies. The short-range communication module 114 may support, through wireless area networks, wireless communication between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal, or between the mobile terminal 100 and a network in which another mobile terminal (or an external server) is located. The short-range wireless networks may be wireless personal area networks.

Here, the other mobile terminal may be a wearable device (e.g., a smart watch, smart glasses, or a head mounted display (HMD)) capable of exchanging data with the mobile terminal 100 according to the present invention. The short-range communication module 114 may detect (or recognize) a wearable device capable of communicating with the mobile terminal 100 in the vicinity of the mobile terminal 100. Furthermore, if the detected wearable device is a device authenticated to communicate with the mobile terminal 100 according to the present invention, the control unit 180 may transmit at least a part of the data processed by the mobile terminal 100 to the wearable device through the short-range communication module 114. Therefore, the user of the wearable device can use the data processed by the mobile terminal 100 through the wearable device. For example, when a call is received by the mobile terminal 100, the user can answer the call through the wearable device, and when a message is received by the mobile terminal 100, the user can check the received message through the wearable device.

The location information module 115 is a module for obtaining the position (or current position) of the mobile terminal 100; representative examples include a Global Positioning System (GPS) module and a Wireless Fidelity (Wi-Fi) module. For example, when the mobile terminal 100 utilizes the GPS module, it can acquire its position using signals transmitted from GPS satellites. As another example, when the mobile terminal 100 utilizes the Wi-Fi module, it can acquire its position based on information from the wireless access point (AP) transmitting or receiving wireless signals to or from the Wi-Fi module. If necessary, the location information module 115 may perform a function of another module of the wireless communication unit 110, in substitution or in addition, to obtain data on the position of the mobile terminal 100. The location information module 115 is used to obtain the position (or current position) of the mobile terminal 100 and is not limited to a module that directly calculates or acquires that position.

Next, the input unit 120 is for inputting image information (or an image signal), audio information (or an audio signal), data, or information input from a user; for inputting image information, the mobile terminal 100 may be provided with one or a plurality of cameras 121. The camera 121 processes image frames, such as still images or moving images, obtained by the image sensor in a video call mode or a photographing mode. The processed image frames may be displayed on the display unit 151 or stored in the memory 170. Meanwhile, the plurality of cameras 121 provided in the mobile terminal 100 may be arranged in a matrix structure, and through the cameras 121 forming the matrix structure, a plurality of pieces of image information having various angles or focal points may be input to the mobile terminal 100. The plurality of cameras 121 may also be arranged in a stereo structure so as to acquire left and right images for realizing a stereoscopic image.

The microphone 122 processes external sound signals into electrical voice data. The processed voice data can be utilized in various ways according to the function being executed (or the application program being run) in the mobile terminal 100. Meanwhile, various noise removal algorithms may be implemented in the microphone 122 to remove noise generated in the process of receiving external sound signals.

The user input unit 123 is for receiving information from a user; when information is input through the user input unit 123, the control unit 180 can control the operation of the mobile terminal 100 to correspond to the input information. The user input unit 123 may include a mechanical input means (or mechanical key, e.g., a button located on the front, rear, or side of the mobile terminal 100, a dome switch, a jog wheel, a jog switch, and the like) and a touch-type input means. As an example, the touch-type input means may consist of a virtual key, soft key, or visual key displayed on a touch screen through software processing, or of a touch key disposed on a portion other than the touch screen. The virtual key or visual key can be displayed on the touch screen in various forms, for example, graphics, text, icons, video, or a combination thereof.

Meanwhile, the sensing unit 140 senses at least one of information in the mobile terminal 100, surrounding environment information of the mobile terminal 100, and user information, and generates a corresponding sensing signal. Based on the sensing signal, the control unit 180 may control the driving or operation of the mobile terminal 100, or perform data processing, functions, or operations related to an application program installed in the mobile terminal 100. Representative sensors among the various sensors that may be included in the sensing unit 140 will be described in more detail.

First, the proximity sensor 141 refers to a sensor that detects, without mechanical contact, the presence of an object approaching a predetermined detection surface or an object in the vicinity, using the force of an electromagnetic field or infrared rays. The proximity sensor 141 may be disposed in the inner region of the mobile terminal 100 surrounded by the touch screen described above, or near the touch screen.

Examples of the proximity sensor 141 include a transmissive photoelectric sensor, a direct reflective photoelectric sensor, a mirror reflective photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, and an infrared proximity sensor. When the touch screen is capacitive, the proximity sensor 141 may be configured to detect the proximity of a conductive object by the change in the electric field caused by the object's approach. In this case, the touch screen (or touch sensor) itself may be classified as a proximity sensor.

On the other hand, for convenience of explanation, the act of recognizing that an object is positioned on the touch screen without contacting it is referred to as a "proximity touch," and the act of actually touching an object on the touch screen is referred to as a "contact touch." The position at which an object is proximity-touched on the touch screen is the position at which the object vertically corresponds to the touch screen during the proximity touch. The proximity sensor 141 can detect a proximity touch and a proximity touch pattern (e.g., a proximity touch distance, a proximity touch direction, a proximity touch speed, a proximity touch time, a proximity touch position, a proximity touch movement state, and the like). Meanwhile, the control unit 180 processes data (or information) corresponding to the proximity touch operation and the proximity touch pattern sensed through the proximity sensor 141 as described above, and can further output visual information corresponding to the processed data on the touch screen. Furthermore, the control unit 180 can control the mobile terminal 100 such that different operations or data (or information) are processed according to whether the touch applied to the same point on the touch screen is a proximity touch or a contact touch.

The touch sensor senses a touch (or touch input) applied to the touch screen (or the display unit 151) by using at least one of various touch methods such as a resistive film type, a capacitive type, an infrared type, an ultrasonic type, and a magnetic field type.

For example, the touch sensor may be configured to convert a change in pressure applied to a specific portion of the touch screen, or a change in capacitance generated at a specific portion, into an electrical input signal. The touch sensor may be configured to detect the position and area at which a touch object touching the touch screen is touched on the touch sensor, as well as the pressure and capacitance at the time of the touch. Here, the touch object is an object that applies a touch to the touch sensor, and may be, for example, a finger, a touch pen, a stylus pen, or a pointer.

Thus, when there is a touch input to the touch sensor, the corresponding signal(s) is sent to a touch controller. The touch controller processes the signal(s) and transmits the corresponding data to the control unit 180. Thus, the control unit 180 can know which area of the display unit 151 has been touched. Here, the touch controller may be a component separate from the control unit 180, or may be the control unit 180 itself.

On the other hand, the control unit 180 may perform different controls or perform the same control according to the type of the touch object touching the touch screen (or a touch key provided on the touch screen). Whether to perform different controls or to perform the same control according to the type of the touch object may be determined according to the current state of the mobile terminal 100 or an application program being executed.

On the other hand, the touch sensor and the proximity sensor discussed above can be used independently or in combination to sense various types of touches, such as a short touch (tap), a long touch, a multi touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, and a hovering touch.

The ultrasonic sensor can recognize position information of a sensing object by using ultrasonic waves. Meanwhile, the control unit 180 can calculate the position of a wave generating source from information sensed by an optical sensor and a plurality of ultrasonic sensors. The position of the wave generating source can be calculated using the fact that light is much faster than ultrasonic waves, that is, the time for light to reach the optical sensor is much shorter than the time for an ultrasonic wave to reach the ultrasonic sensor. More specifically, the position of the wave generating source can be calculated using the time difference between the arrival of the ultrasonic wave and the arrival of the light, with the light serving as a reference signal.
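The time-difference calculation described above can be illustrated with a minimal sketch (the function name and the speed-of-sound constant are illustrative assumptions, not part of this disclosure): since light arrives almost instantaneously, its arrival time approximates the emission time, and the ultrasonic wave's extra travel time multiplied by the speed of sound yields the distance to the source.

```python
SPEED_OF_SOUND = 343.0  # m/s in air (assumed constant); light is effectively instantaneous by comparison


def distance_to_source(t_light: float, t_ultrasound: float) -> float:
    """Estimate the distance to a wave source in meters.

    The light arrival time approximates the emission time, so the
    ultrasonic wave's extra travel time (t_ultrasound - t_light)
    multiplied by the speed of sound gives the distance.
    """
    dt = t_ultrasound - t_light
    if dt < 0:
        raise ValueError("ultrasound cannot arrive before light")
    return SPEED_OF_SOUND * dt
```

With several ultrasonic sensors, one such distance per sensor can then be combined (e.g., by trilateration) to obtain a position rather than just a range.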

The camera 121 includes at least one of a camera sensor (for example, a CCD, a CMOS, etc.), a photo sensor (or an image sensor), and a laser sensor.

The camera 121 and the laser sensor may be combined with each other to sense a touch of the sensing object with respect to a three-dimensional stereoscopic image. The photo sensor may be laminated on the display element and is configured to scan the movement of a sensing object in proximity to the touch screen. More specifically, the photo sensor mounts photo diodes and transistors (TRs) in rows and columns, and scans the content placed on the photo sensor using an electrical signal that varies according to the amount of light applied to the photo diodes. That is, the photo sensor calculates the coordinates of the sensing object according to the amount of change in light, and position information of the sensing object can be obtained through this calculation.

The display unit 151 displays (outputs) information processed by the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program driven by the mobile terminal 100, or UI (User Interface) and GUI (Graphic User Interface) information according to the execution screen information.

Also, the display unit 151 may be configured as a stereoscopic display unit for displaying a stereoscopic image.

In the stereoscopic display unit, a three-dimensional display system such as a stereoscopic system (glasses system), an autostereoscopic system (no-glasses system), and a projection system (holographic system) can be applied.

The sound output unit 152 may output audio data received from the wireless communication unit 110 or stored in the memory 170 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, and the like. The sound output unit 152 also outputs sound signals related to functions (e.g., a call signal reception sound, a message reception sound, etc.) performed in the mobile terminal 100. The sound output unit 152 may include a receiver, a speaker, a buzzer, and the like.

The haptic module 153 generates various tactile effects that the user can feel. A typical example of the haptic effect generated by the haptic module 153 may be vibration. The intensity and pattern of the vibration generated in the haptic module 153 can be controlled by the user's selection or the setting of the control unit. For example, the haptic module 153 may synthesize and output different vibrations or sequentially output the vibrations.

In addition to vibration, the haptic module 153 can generate various other tactile effects, such as a pin arrangement moving vertically with respect to the contacted skin surface, a spraying or suction force of air through an injection or suction port, a brush against the skin surface, contact of an electrode, an electrostatic force, and an effect of reproducing a sense of cold or warmth using an element capable of absorbing or emitting heat.

The haptic module 153 can not only transmit tactile effects through direct contact, but can also be implemented so that the user can feel tactile effects through the muscle sense of a finger or arm. Two or more haptic modules 153 may be provided according to the configuration of the mobile terminal 100.

The light output unit 154 outputs a signal for notifying the occurrence of an event using the light of the light source of the mobile terminal 100. Examples of events that occur in the mobile terminal 100 may include message reception, call signal reception, missed call, alarm, schedule notification, email reception, information reception through an application, and the like.

The signal output by the light output unit 154 is implemented as the mobile terminal 100 emitting light of a single color or a plurality of colors to the front or rear surface. The signal output may be terminated when the mobile terminal 100 detects the user's confirmation of the event.

The interface unit 160 serves as a path to all external devices connected to the mobile terminal 100. The interface unit 160 receives data from an external device, receives power and delivers it to each component in the mobile terminal 100, or transmits data in the mobile terminal 100 to an external device. For example, the interface unit 160 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device equipped with an identification module, an audio I/O port, a video I/O port, an earphone port, and the like.

The identification module is a chip storing various information for authenticating the usage right of the mobile terminal 100, and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. A device equipped with an identification module (hereinafter referred to as an "identification device") can be manufactured in a smart card format. Accordingly, the identification device can be connected to the mobile terminal 100 through the interface unit 160.

When the mobile terminal 100 is connected to an external cradle, the interface unit 160 may serve as a path through which power from the cradle is supplied to the mobile terminal 100, or as a path through which various command signals input by the user at the cradle are transmitted to the mobile terminal 100. The various command signals or the power input from the cradle may operate as signals for recognizing that the mobile terminal 100 is correctly mounted on the cradle.

The memory 170 may store a program for the operation of the controller 180 and temporarily store input / output data (e.g., a phone book, a message, a still image, a moving picture, etc.). The memory 170 may store data on vibration and sound of various patterns outputted when a touch is input on the touch screen.

The memory 170 may include at least one type of storage medium among a flash memory type, a hard disk type, a solid state disk (SSD) type, a silicon disk drive (SDD) type, a multimedia card micro type, a card-type memory (e.g., SD or XD memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, and an optical disk. The mobile terminal 100 may also operate in association with a web storage that performs the storage function of the memory 170 on the Internet.

Meanwhile, as described above, the control unit 180 controls the operations related to application programs and the overall operation of the mobile terminal 100. For example, when the state of the mobile terminal 100 satisfies a set condition, the control unit 180 may set or release a lock state that restricts the input of a user's control commands to applications.

In addition, the control unit 180 performs control and processing related to voice communication, data communication, video calls, and the like, or performs pattern recognition processing capable of recognizing handwriting input or drawing input performed on the touch screen as characters and images, respectively. Further, the control unit 180 may control any one or a combination of the above-described components in order to implement the various embodiments described below on the mobile terminal 100 according to the present invention.

The power supply unit 190 receives external power and internal power under the control of the control unit 180 and supplies the power necessary for the operation of the respective components. The power supply unit 190 includes a battery; the battery may be an internal battery configured to be rechargeable, and may be detachably coupled to the terminal body for charging and the like.

In addition, the power supply unit 190 may include a connection port, and the connection port may be configured as an example of the interface unit 160 to which an external charger supplying power for charging the battery is electrically connected.

As another example, the power supply unit 190 may be configured to charge the battery wirelessly without using the connection port. In this case, the power supply unit 190 may receive power from an external wireless power transmission apparatus using at least one of an inductive coupling method based on magnetic induction and a magnetic resonance coupling method based on electromagnetic resonance.

In the following, various embodiments may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.

Hereinafter, the structure of the mobile terminal 100 in which the above-described components are disposed will be described with reference to FIGS. 1B and 1C, according to the embodiment of the present invention shown in FIG. 1A.

Referring to FIGS. 1B and 1C, the disclosed mobile terminal 100 includes a bar-shaped terminal body. However, the present invention is not limited thereto and can be applied to various structures such as a watch type, a clip type, a glass type, a folder type, a flip type, a slide type, a swing type, and a swivel type, in which two or more bodies are coupled so as to be relatively movable. Although the following discussion relates to a particular type of mobile terminal, the description is generally applicable to other types of mobile terminals as well.

Here, the terminal body can be understood as a concept of referring to the mobile terminal 100 as at least one aggregate.

The mobile terminal 100 includes a case (for example, a frame, a housing, a cover, and the like) that forms an appearance. As shown, the mobile terminal 100 may include a front case 101 and a rear case 102. Various electronic components are disposed in the inner space formed by the combination of the front case 101 and the rear case 102. At least one middle case may be additionally disposed between the front case 101 and the rear case 102.

A display unit 151 is disposed on the front surface of the terminal body to output information. As shown, the window 151a of the display unit 151 may be mounted on the front case 101 to form the front surface of the terminal body together with the front case 101.

In some cases, electronic components may also be mounted on the rear case 102. Electronic parts that can be mounted on the rear case 102 include detachable batteries, an identification module, a memory card, and the like. In this case, a rear cover 103 for covering the mounted electronic components can be detachably coupled to the rear case 102. Therefore, when the rear cover 103 is separated from the rear case 102, the electronic parts mounted on the rear case 102 are exposed to the outside.

As shown, when the rear cover 103 is coupled to the rear case 102, a side portion of the rear case 102 can be exposed. In some cases, the rear case 102 may be completely covered by the rear cover 103 when they are coupled. Meanwhile, the rear cover 103 may be provided with an opening for exposing the camera 121b and the sound output unit 152b to the outside.

These cases 101, 102, and 103 may be formed by injection molding of synthetic resin or may be formed of metal such as stainless steel (STS), aluminum (Al), titanium (Ti), or the like.

Unlike the above example in which a plurality of cases provide the internal space accommodating various electronic components, the mobile terminal 100 may be configured such that one case provides the internal space. In this case, a unibody mobile terminal 100 in which synthetic resin or metal extends from the side surface to the rear surface can be realized.

Meanwhile, the mobile terminal 100 may include a waterproof unit (not shown) for preventing water from penetrating into the terminal body. For example, the waterproof unit may include a waterproof member provided between the window 151a and the front case 101, between the front case 101 and the rear case 102, or between the rear case 102 and the rear cover 103, to seal the internal space when those parts are coupled.

The mobile terminal 100 is provided with a display unit 151, first and second sound output units 152a and 152b, a proximity sensor 141, an illuminance sensor 142, a light output unit 154, first and second cameras 121a and 121b, first and second operation units 123a and 123b, a microphone 122, an interface unit 160, and the like.

As shown in FIGS. 1B and 1C, the display unit 151, the first sound output unit 152a, the proximity sensor 141, the illuminance sensor 142, the light output unit 154, the first camera 121a, and the first operation unit 123a are disposed on the front surface of the terminal body; the second operation unit 123b, the microphone 122, and the interface unit 160 are disposed on a side surface of the terminal body; and the second sound output unit 152b and the second camera 121b are disposed on the rear surface of the terminal body. The following description takes such a mobile terminal 100 as an example.

However, these components are not limited to this arrangement. They may be omitted or replaced as needed, or disposed on other surfaces. For example, the first operation unit 123a may not be provided on the front surface of the terminal body, and the second sound output unit 152b may be provided on the side surface of the terminal body rather than on its rear surface.

The display unit 151 displays (outputs) information processed by the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program driven by the mobile terminal 100, or UI (User Interface) and GUI (Graphic User Interface) information according to the execution screen information.

The display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED), a flexible display, a 3D display, and an e-ink display.

In addition, two or more display units 151 may be provided depending on the implementation of the mobile terminal 100. In this case, the plurality of display units may be spaced apart from each other or disposed integrally on one surface of the mobile terminal 100, or may be disposed on different surfaces, respectively.

The display unit 151 may include a touch sensor that senses a touch with respect to the display unit 151 so that a control command can be received by a touch method. When a touch is made on the display unit 151, the touch sensor senses the touch, and the control unit 180 generates a control command corresponding to the sensed touch. The content input by the touch method may be a letter or a number, an instruction in various modes, a menu item that can be designated, and the like.

The touch sensor may be configured in the form of a film having a touch pattern and disposed between the window 151a and a display (not shown) on the rear surface of the window 151a, or may be a metal wire directly patterned on the rear surface of the window 151a. Alternatively, the touch sensor may be formed integrally with the display. For example, the touch sensor may be disposed on a substrate of the display or inside the display.

In this way, the display unit 151 can form a touch screen together with the touch sensor. In this case, the touch screen can function as a user input unit 123 (see FIG. 1A). In some cases, the touch screen may replace at least some functions of the first operation unit 123a.

The first sound output unit 152a may be implemented as a receiver for transmitting a call sound to the user's ear, and the second sound output unit 152b may be implemented as a loud speaker for outputting various alarm sounds or multimedia playback sounds.

The window 151a of the display unit 151 may be provided with an acoustic hole for emitting the sound generated by the first sound output unit 152a. However, the present invention is not limited to this, and the sound may be configured to be emitted along an assembly gap between structures (for example, a gap between the window 151a and the front case 101). In this case, the appearance of the mobile terminal 100 can be made simpler because the hole independently formed for sound output is invisible or hidden.

The light output unit 154 is configured to output light for notifying the occurrence of an event. Examples of such events include message reception, call signal reception, a missed call, an alarm, a schedule notification, email reception, and reception of information through an application. The control unit 180 may control the light output unit 154 to terminate the light output when the user's confirmation of the event is detected.

The first camera 121a processes image frames of still images or moving images obtained by the image sensor in a photographing mode or a video call mode. The processed image frames can be displayed on the display unit 151 and can be stored in the memory 170.

The first and second operation units 123a and 123b may be collectively referred to as a manipulating portion, as an example of the user input unit 123 operated to receive commands for controlling the operation of the mobile terminal 100. The first and second operation units 123a and 123b can employ any tactile manner in which the user operates them while receiving a tactile feeling, such as touch, push, or scroll. In addition, the first and second operation units 123a and 123b may employ a manner in which the user operates them without a tactile feeling, through a proximity touch, a hovering touch, or the like.

In this figure, the first operation unit 123a is illustrated as a touch key, but the present invention is not limited thereto. For example, the first operation unit 123a may be a push key (mechanical key) or a combination of a touch key and a push key.

The contents input by the first and second operation units 123a and 123b can be variously set. For example, the first operation unit 123a may receive commands such as menu, home key, cancel, and search, and the second operation unit 123b may receive commands such as adjusting the volume of the sound output from the first or second sound output unit 152a or 152b, and switching the display unit 151 to a touch recognition mode.

On the other hand, a rear input unit (not shown) may be provided on the rear surface of the terminal body as another example of the user input unit 123. The rear input unit is operated to receive commands for controlling the operation of the mobile terminal 100, and the input contents may be variously set. For example, commands such as power on/off, start, end, and scrolling, adjustment of the volume of sound output from the first and second sound output units 152a and 152b, and switching of the display unit 151 to a touch recognition mode may be input. The rear input unit may be implemented in a form capable of touch input, push input, or a combination thereof.

The rear input unit may be disposed so as to overlap the front display unit 151 in the thickness direction of the terminal body. For example, the rear input unit may be disposed at the upper rear portion of the terminal body so that the user can easily operate it with an index finger while holding the terminal body with one hand. However, the present invention is not limited thereto, and the position of the rear input unit may be changed.

When a rear input unit is provided on the rear surface of the terminal body, a new type of user interface using the rear input unit can be realized. In addition, when the touch screen or the rear input unit described above replaces at least some functions of the first operation unit 123a provided on the front surface of the terminal body, so that the first operation unit 123a is not disposed on the front surface, the display unit 151 may be configured with a larger screen.

Meanwhile, the mobile terminal 100 may be provided with a fingerprint recognition sensor for recognizing the fingerprint of the user, and the controller 180 may use the fingerprint information sensed through the fingerprint recognition sensor as authentication means. The fingerprint recognition sensor may be embedded in the display unit 151 or the user input unit 123.

The microphone 122 is configured to receive the user's voice, other sounds, and the like. The microphone 122 may be provided at a plurality of locations to receive stereophonic sound.

The interface unit 160 is a path through which the mobile terminal 100 can be connected to an external device. For example, the interface unit 160 may be at least one of a connection terminal for connection with another device (for example, an earphone or an external speaker), a port for short-range communication (for example, an infrared (IrDA) port, a Bluetooth port, a wireless LAN port, or the like), and a power supply terminal for supplying power to the mobile terminal 100. The interface unit 160 may be implemented as a socket for receiving an external card such as a SIM (Subscriber Identification Module), a UIM (User Identity Module), or a memory card for storing information.

A second camera 121b may be disposed on the rear surface of the terminal body. In this case, the second camera 121b has a photographing direction substantially opposite to that of the first camera 121a.

The second camera 121b may include a plurality of lenses arranged along at least one line. The plurality of lenses may be arranged in a matrix form. Such a camera can be named an 'array camera'. When the second camera 121b is configured as an array camera, images can be taken in various ways using a plurality of lenses, and a better quality image can be obtained.

The flash 124 may be disposed adjacent to the second camera 121b. The flash 124 shines light toward the subject when the subject is photographed by the second camera 121b.

A second sound output unit 152b may be additionally disposed in the terminal body. The second sound output unit 152b may implement a stereo function together with the first sound output unit 152a and may be used to implement a speakerphone mode during a call.

The terminal body may be provided with at least one antenna for wireless communication. The antenna may be embedded in the terminal body or formed in the case. For example, an antenna constituting a part of the broadcast receiving module 111 (see FIG. 1A) may be configured to be able to be drawn out from the terminal body. Alternatively, the antenna may be formed in a film type and attached to the inner surface of the rear cover 103, or a case including a conductive material may be configured to function as an antenna.

The terminal body is provided with a power supply unit 190 (see FIG. 1A) for supplying power to the mobile terminal 100. The power supply unit 190 may include a battery 191 built in the terminal body or detachable from the outside of the terminal body.

The battery 191 may be configured to receive power through a power cable connected to the interface unit 160. In addition, the battery 191 may be configured to be wirelessly chargeable through a wireless charger. The wireless charging may be implemented by a magnetic induction method or a resonance method (magnetic resonance method).

The rear cover 103 is coupled to the rear case 102 so as to cover the battery 191, thereby restricting the release of the battery 191 and protecting the battery 191 from external impact and foreign matter. When the battery 191 is detachable from the terminal body, the rear cover 103 may be detachably coupled to the rear case 102.

The mobile terminal 100 may be provided with an accessory that protects its appearance or supports or expands its functions. One example of such an accessory is a cover or pouch that covers or accommodates at least one surface of the mobile terminal 100. The cover or pouch may be configured to interwork with the display unit 151 to expand the functions of the mobile terminal 100. Another example of an accessory is a touch pen for supplementing or extending touch input to the touch screen.

Hereinafter, embodiments related to a control method that can be implemented in the mobile terminal 100 configured as above will be described with reference to the accompanying drawings. It will be apparent to those skilled in the art that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof.

FIG. 2 illustrates an operation of grouping images according to a touch gesture input in a mobile terminal according to an exemplary embodiment of the present invention.

Referring to FIG. 2, the controller 180 of the mobile terminal 100 according to an exemplary embodiment of the present invention displays information related to the images currently stored in the mobile terminal 100 on the display unit 151 (S200).

In managing and displaying these images, the control unit 180 generally uses a software application such as an image browser to allow the user to search for and manage the images. The image browser may display information related to the currently stored images in the form of a grid, table, or list arranged in rows and columns, where the information associated with each stored image may be a thumbnail image generated from that image.

Here, the 'thumbnail image' may be an image formed by extracting one pixel per n pixels of the original image. Accordingly, the larger n is, the lower the resolution and the smaller the thumbnail image generated from the original image; the smaller n is (minimum = 1), the higher the resolution, and when n = 1 the generated image is the original image itself.
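The one-pixel-per-n-pixels scheme described above can be sketched as follows (the function name and the list-of-rows image representation are assumptions for illustration only):

```python
def make_thumbnail(pixels, n):
    """Generate a thumbnail by keeping one pixel per n pixels in each axis.

    `pixels` is a 2-D list (rows of pixel values). n = 1 returns a copy
    of the original image; larger n yields a smaller, lower-resolution
    thumbnail, as described in the text.
    """
    if n < 1:
        raise ValueError("n must be at least 1")
    # Slice with step n over both rows and columns.
    return [row[::n] for row in pixels[::n]]
```

For a 4x4 image and n = 2, this keeps every other row and column, producing a 2x2 thumbnail.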

Meanwhile, the user can select a specific thumbnail image and identify a specific image corresponding to the thumbnail image. Also, when the user selects a thumbnail image corresponding to a specific group, it is possible to identify images included in the group.

If the information related to the currently stored images is displayed on the display unit 151 through the image browser or the like, the control unit 180 may detect whether a predetermined touch gesture input is applied (S202). Here, the predetermined touch gesture input may take various forms. For example, it may be a plurality of touch inputs forming a predetermined pattern applied to the display unit 151 or to the rear surface of the mobile terminal 100, or a drag input applied along a preset length or trajectory. It may also be a touch gesture input in which at least two touch inputs applied on the display unit 151 move closer to each other (pinch-in) or farther from each other (pinch-out). Alternatively, the predetermined touch gesture input may be a plurality of touch inputs forming a pattern of different pressures or hold times applied on the display unit 151.
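The pinch-in/pinch-out distinction mentioned above reduces to comparing the distance between two touch points at the start and end of the gesture. A minimal sketch (the function name and threshold value are illustrative assumptions, not from this disclosure):

```python
import math


def classify_pinch(start_points, end_points, threshold=10.0):
    """Classify a two-finger gesture as pinch-in, pinch-out, or neither.

    start_points / end_points each hold two (x, y) touch coordinates at
    the beginning and end of the gesture. If the distance between the
    two touches shrinks beyond the threshold it is a pinch-in; if it
    grows, a pinch-out.
    """
    def dist(pts):
        (x1, y1), (x2, y2) = pts
        return math.hypot(x2 - x1, y2 - y1)

    delta = dist(end_points) - dist(start_points)
    if delta <= -threshold:
        return "pinch-in"
    if delta >= threshold:
        return "pinch-out"
    return "none"
```

The threshold keeps small, accidental finger movements from being misread as a gesture.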

If the predetermined touch gesture input is applied in step S202, the control unit 180 may analyze the images displayed on the display unit 151 according to a predetermined criterion (S204). The images may then be grouped simultaneously according to that criterion, based on the analysis result of step S204 (S206). Here, the predetermined criterion may be the time at which each image was photographed, or the position information recorded at the time each image was photographed.

For example, the control unit 180 may group consecutively taken photographs into the same group. In other words, when the times at which the images were photographed are compared with each other, if there are other photographs taken before a predetermined time has elapsed from the time at which a specific photograph was taken, those photographs can be formed into the same group. For each of those other photographs, if there are still other photographs taken within the predetermined time of them, all of those photographs may likewise be formed into one group. In this way, photographs taken at times adjacent to each other are formed into one group. Here, the predetermined time may be set by the user.
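The chained time-gap rule above can be sketched as follows (function and parameter names are illustrative assumptions): after sorting by capture time, each photograph joins the current group if it was taken within the predetermined gap of the previous one, which makes the grouping transitive exactly as described.

```python
def group_by_time(timestamps, max_gap):
    """Group photo timestamps (seconds) so that consecutive shots taken
    within `max_gap` of each other fall into the same group.

    Sorting first makes the chain rule transitive: a photo joins the
    current group whenever its gap to the previous photo is small enough.
    """
    groups = []
    for t in sorted(timestamps):
        if groups and t - groups[-1][-1] <= max_gap:
            groups[-1].append(t)  # within the gap: extend current group
        else:
            groups.append([t])    # gap too large: start a new group
    return groups
```

For instance, shots at 0 s, 5 s, and 8 s form one group under a 10-second gap, while shots at 100 s and 103 s form another.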

Alternatively, the control unit 180 may group photographs taken at adjacent positions into the same group. That is, as a result of comparing the position information of the mobile terminal 100 recorded at the time each image was photographed, the control unit 180 may form photographs taken at positions adjacent to each other into a group. Here, the user can set the criterion (for example, a distance) by which positions are determined to be 'adjacent', and photographs whose recorded positions lie within the predetermined distance of each other can be formed into one group.
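The position-based criterion can be sketched in the same spirit (names and the planar-coordinate simplification are assumptions; real terminals would use geographic coordinates): each photo merges with any existing group that already contains a shot within the user-set distance.

```python
import math


def group_by_position(coords, max_dist):
    """Group photos whose recorded positions lie within `max_dist` of
    each other, merging transitively.

    `coords` is a list of (x, y) positions; geographic coordinates are
    simplified to a plane for illustration.
    """
    groups = []  # list of lists of (x, y) points
    for p in coords:
        # Find every existing group with at least one nearby member.
        near = [g for g in groups
                if any(math.dist(p, q) <= max_dist for q in g)]
        merged = [p]
        for g in near:
            merged.extend(g)
            groups.remove(g)
        groups.append(merged)
    return groups
```

Two shots a meter apart and a third fifty meters away thus yield two groups under a 5-meter criterion.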

Meanwhile, the control unit 180 may group photographs not only by the time or position at which they were taken, but also by forming photographs having the same or similar photographic composition into one group. For example, the control unit 180 may analyze the similarity of each image to determine whether the images have the same or similar composition.

For example, the control unit 180 can distinguish the background and the subject based on the distance information (for example, the focal distance) included in the image, and then determine whether the photographic compositions are the same or similar based on the background or the subject. That is, the control unit 180 can determine that photographs have the same or similar composition when the number of subjects is the same, the shapes of the subjects are similar, or the backgrounds are the same or similar to each other. Here, the similarity between the shapes of the subjects or between the backgrounds may be determined according to whether, as a result of comparing the pixels constituting a subject or a background in each photograph, the number of pixels having the same color value is equal to or greater than a predetermined number.
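The pixel-counting similarity test in the last sentence can be sketched directly. This is an illustrative sketch only; `similar_composition` and the flat-list pixel representation are assumptions made here, and a practical implementation would compare segmented subject or background regions rather than raw lists:

```python
def similar_composition(pixels_a, pixels_b, min_matches):
    """Treat two photographs as having a similar composition when the
    number of positions holding the same colour value is equal to or
    greater than min_matches (the predetermined number)."""
    matches = sum(1 for a, b in zip(pixels_a, pixels_b) if a == b)
    return matches >= min_matches

# 3 of 4 pixel values match, meeting a threshold of 3.
print(similar_composition([1, 1, 1, 2], [1, 1, 1, 3], min_matches=3))  # True
```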

Meanwhile, when the grouping of the images is performed simultaneously in steps S204 and S206, a plurality of groups can be generated at the same time. In this case, the control unit 180 may display the generated groups according to a preset method (S208). For example, the control unit 180 may display the groups using the images included in each group. That is, the control unit 180 may determine one of the images included in each group as a representative image according to a predetermined reference, and may display, around the determined representative image or over at least a part of it, a graphic object indicating that the representative image represents a specific group. In this case, the graphic object may not only indicate that the images are in a grouped state, but may also display information related to the images of the group that includes the representative image.

FIG. 3 illustrates an operation of displaying the grouped images in step S208 of FIG. 2 in greater detail.

Referring to FIG. 3, the controller 180 of the mobile terminal 100 according to the embodiment of the present invention can determine a representative image of each group from the images included in each group (S300). For example, the control unit 180 may determine the representative image based on the time or position at which the images stored in each group are photographed, or the saturation, brightness, or sharpness of each image.

For example, the control unit 180 may determine as the representative image the photograph taken first, or the photograph taken last, among the photographs included in the group. Alternatively, the control unit 180 may determine a photograph taken at a specific position among the photographs included in the group as the representative image. Alternatively, by comparing the saturation, brightness, or sharpness of the images included in each group, the image having the highest saturation, brightness, or sharpness may be determined as the representative image.

Alternatively, the control unit 180 may determine the representative image based on different criteria according to the criterion by which the current images were grouped. For example, when the criterion by which the current images were grouped is 'subject', the controller 180 may compare the saturation, brightness, or sharpness of the subjects included in each of the images and determine the image containing the subject with the highest saturation, brightness, or sharpness as the representative image. Likewise, when the grouping criterion is 'background', the image whose background has the highest saturation, brightness, or sharpness may be determined as the representative image by comparing the saturation, brightness, or sharpness of the backgrounds included in each of the images.
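The criterion-dependent selection of a representative image described in the last two paragraphs can be sketched as follows. This is an illustrative sketch only; the function name, dictionary fields (`subject_sharpness`, `background_sharpness`, `captured_at`), and the sharpness metric are all hypothetical stand-ins for whatever per-image measurements the terminal computes:

```python
def pick_representative(images, grouping_criterion):
    """Pick a group's representative image. When the group was formed
    by 'subject', compare subject sharpness; when by 'background',
    compare background sharpness; otherwise fall back to the earliest
    captured photo."""
    if grouping_criterion == "subject":
        return max(images, key=lambda img: img["subject_sharpness"])
    if grouping_criterion == "background":
        return max(images, key=lambda img: img["background_sharpness"])
    return min(images, key=lambda img: img["captured_at"])

group = [
    {"name": "a.jpg", "subject_sharpness": 0.4, "background_sharpness": 0.9, "captured_at": 10},
    {"name": "b.jpg", "subject_sharpness": 0.8, "background_sharpness": 0.2, "captured_at": 5},
]
print(pick_representative(group, "subject")["name"])  # b.jpg
```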

If the representative image is determined in step S300, the controller 180 may generate a thumbnail image of the representative image (S302). Here, the controller 180 may extract one pixel for every N pixels (N being a positive integer) of the representative image determined in step S300, and combine the extracted pixels according to the order and positions from which they were extracted, thereby generating a thumbnail image corresponding to the representative image. The thumbnail image generated in step S302 may be displayed on the display unit 151 as the thumbnail image corresponding to the group including the representative image.
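The one-pixel-per-N subsampling described above can be sketched for a row-major pixel buffer. This is an illustrative sketch only; `make_thumbnail` and the flat-list image layout are assumptions made here (a real implementation would typically use the platform's bitmap decoder with a sample size):

```python
def make_thumbnail(pixels, width, height, n):
    """Build a thumbnail by keeping one pixel out of every n in both
    the horizontal and vertical directions; pixels is a row-major
    list of length width * height."""
    return [pixels[y * width + x]
            for y in range(0, height, n)
            for x in range(0, width, n)]

# A 4x4 image with pixel values 0..15, sampled every 2nd pixel,
# keeps the pixels at (0,0), (2,0), (0,2), (2,2).
print(make_thumbnail(list(range(16)), width=4, height=4, n=2))  # [0, 2, 8, 10]
```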

Meanwhile, the controller 180 of the mobile terminal 100 according to the exemplary embodiment of the present invention may display a predetermined graphic object so that the thumbnail image generated in step S302 is distinguished from the thumbnail images corresponding to the ungrouped images (S304). Here, the predetermined graphic object may be information related to the images of the group including the representative image. For example, the control unit 180 may display, over a part of or in the vicinity of the thumbnail image generated in step S302, a plurality of graphic objects corresponding to the number of images included in the group. Alternatively, the controller 180 may display a number corresponding to the number of images included in the group over a part of or in the vicinity of the thumbnail image generated in step S302, thereby indicating that the thumbnail image corresponds to a specific group. Hereinafter, an example in which grouping is performed according to an embodiment of the present invention, and an example of the thumbnail image corresponding to a generated group, will be described in detail with reference to the drawings.

It should be noted that, just as a plurality of images can be grouped simultaneously according to the preset touch gesture as described above, the grouped images can also be ungrouped simultaneously based on a preset touch gesture. FIG. 4 illustrates an operation of releasing the grouped state of grouped images in a mobile terminal according to an exemplary embodiment of the present invention.

Referring to FIG. 4, in a state where the grouped images are displayed in step S208 of FIG. 2, the controller 180 may detect whether another predetermined touch gesture is input (S400). Here, the other touch gesture input may be the same touch gesture input that was sensed to group the plurality of images simultaneously in FIG. 2.

For example, if the touch gesture input sensed for simultaneously grouping a plurality of images in step S202 of FIG. 2 is a plurality of taps in a predetermined pattern, the controller 180 may ungroup the current images when the plurality of taps is applied again (S402). In this case, the images may be restored to the state before the grouping of step S206 of FIG. 2, and the thumbnail images corresponding to the previously grouped images may be displayed on the display unit 151.

Alternatively, the touch gesture input for ungrouping may be another touch gesture input corresponding to the touch gesture sensed for grouping the images. That is, for example, if the touch gesture input sensed for grouping a plurality of images simultaneously in step S202 of FIG. 2 is a pinch-in gesture, the touch gesture input sensed in step S400 of FIG. 4 may be a pinch-out gesture, which is the touch gesture corresponding to the pinch-in gesture. Using different touch gestures that correspond to each other for grouping and ungrouping may form a more intuitive user interface for grouping or ungrouping images.
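The complementary pinch-in/pinch-out toggle described above can be sketched as a small state holder. This is an illustrative sketch only; `ImageBrowserState` and the gesture string labels are hypothetical, standing in for the platform's actual gesture callbacks:

```python
class ImageBrowserState:
    """Toggle grouping with complementary gestures: a pinch-in groups
    the displayed images, and the corresponding pinch-out ungroups
    them, restoring the pre-grouping state."""

    def __init__(self):
        self.grouped = False

    def on_gesture(self, gesture):
        if gesture == "pinch_in" and not self.grouped:
            self.grouped = True   # perform simultaneous grouping (S206)
        elif gesture == "pinch_out" and self.grouped:
            self.grouped = False  # release grouping (S402)
        return self.grouped

browser = ImageBrowserState()
print(browser.on_gesture("pinch_in"))   # True  (images now grouped)
print(browser.on_gesture("pinch_out"))  # False (grouping released)
```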

Meanwhile, as described above, in the mobile terminal 100 according to the embodiment of the present invention a user can search and manage images using a software application such as an image browser. The image browser may display the stored images based on a folder hierarchy in a tree format. That is, the controller 180 can display information related to the images stored in a specific folder on the display unit 151, and if the user selects a subfolder included in that folder, only the images included in the selected subfolder may be displayed in one area of the display unit 151.

Here, the state of displaying only the images included in the subfolder may correspond to a lower browsing level. That is, the browsing level may correspond to the level of each folder in the tree-format folder hierarchy, and accordingly, the state in which the images included in a lower-level folder are browsed may be regarded as browsing at a lower browsing level.

Meanwhile, when the images are grouped by the user as described above, the group may correspond to a subfolder formed within the folder in which the current images are stored. That is, when the images are grouped, the image browser may generate a group corresponding to the grouped images and set the grouped images to correspond to the generated group. When the group is selected, the images corresponding to the selected group, i.e., the grouped images, may be displayed in a manner similar to displaying images included in a lower-level folder. That is, when a thumbnail image corresponding to a specific group is selected by the user from among the thumbnail images currently displayed on the display unit 151, the image browser may display only the images included in that group on the display unit 151, i.e., change the browsing level to a lower level corresponding to the group, that is, to the subfolder.

In contrast, the present invention allows the user to confirm the images included in a specific group, when the specific group is selected, even without changing the browsing level. That is, even if a specific group is selected by the user, various information related to the images included in the specific group may be displayed on the display unit 151 without changing the browsing level. Accordingly, each time the user selects a group, information (e.g., thumbnail images) of the images included in the group is displayed without changing the browsing level, so that the user can manage the grouped images more quickly and easily.

FIG. 5 shows an operation flow of the mobile terminal 100 according to the embodiment of the present invention.

Referring to FIG. 5, when the grouped images are displayed in step S208 of FIG. 2, the controller 180 of the mobile terminal 100 according to an exemplary embodiment of the present invention may sense a selection of any one of the groups (S500). For example, when a touch input of the user is detected in an area of the display unit 151 in which a thumbnail image corresponding to one of the currently generated groups is displayed, the group corresponding to that thumbnail image may be selected. In this case, the control unit 180 may display a graphic object in the vicinity of the thumbnail image, or display the other thumbnail images differently from the thumbnail image corresponding to the selected group, so that the currently selected group is distinguished from the other groups (S502). For example, the control unit 180 may display the other thumbnail images in a blurred manner or darken them to a certain brightness or less.

Meanwhile, in this state, the control unit 180 may display information related to the currently selected group on the display unit 151 according to the user's selection. For example, the control unit 180 may display a thumbnail image corresponding to at least one of the images belonging to the currently selected group on the display unit 151 based on the touch input of the user (S504).

For example, when a specific group is selected in step S504, the controller 180 may display on the display unit 151 information related to the first image in the predetermined order among the images belonging to the group. In this case, the control unit 180 may display a thumbnail image generated from the image as information related to the first image of the corresponding group.

Meanwhile, the thumbnail image generated from the first image may have a resolution and a size different from those of the other thumbnail images currently displayed on the display unit 151. That is, when the thumbnail images currently displayed on the display unit 151 are the thumbnail images (first thumbnail images) generated from the representative image of each group and from the non-grouped images, the thumbnail image (second thumbnail image) generated from the first image may have a resolution and a size different from those of the first thumbnail images.

For example, the second thumbnail image may be generated as a thumbnail image having a higher resolution and/or a larger size than the first thumbnail images.

That is, as described above, when a 'thumbnail image' is an image formed from one pixel extracted per N pixels of the original image, the control unit 180 may extract more pixels from the original image to generate the second thumbnail image. For example, suppose the first thumbnail image is formed by extracting one pixel per four pixels in each of the horizontal and vertical directions of the original image, that is, one pixel per 16 (4X4) pixels. If the second thumbnail image is formed by extracting one pixel per two pixels in each of the horizontal and vertical directions, that is, one pixel per 4 (2X2) pixels, the second thumbnail image has twice the resolution of the first thumbnail image in each direction and four times its size.
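The resolution arithmetic above can be verified with a short worked example. This is an illustrative sketch only; `thumbnail_pixels` and the 640x480 original size are assumptions made here:

```python
def thumbnail_pixels(orig_w, orig_h, n):
    """Number of pixels in a thumbnail that keeps one pixel per
    n x n block of the original image."""
    return (orig_w // n) * (orig_h // n)

first = thumbnail_pixels(640, 480, 4)   # one pixel per 4x4 = 16 pixels
second = thumbnail_pixels(640, 480, 2)  # one pixel per 2x2 = 4 pixels
# The second thumbnail has 2x the resolution per axis, hence 4x the pixels.
print(second // first)  # 4
```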

Meanwhile, in step S504, the controller 180 may display information on the images belonging to the selected group on the display unit 151 according to the user's touch input. In this case, the controller 180 may generate thumbnail images corresponding to the images belonging to the selected group and display them on the display unit 151.

Meanwhile, when at least one thumbnail image of the images belonging to the selected group is displayed on the display unit 151 according to the touch input of the user, the controller 180 may perform an operation related to that thumbnail image (S506). For example, when a thumbnail image corresponding to one of the images belonging to a specific group is displayed at a higher resolution and a larger size, the control unit 180 may use it to display, on the display unit 151, a thumbnail image corresponding to another image included in the same group. An example of displaying information related to a plurality of images included in a selected group using a thumbnail image displayed at a higher resolution and a larger size corresponding to any one of the images will be described in detail below.

Alternatively, when thumbnail images of the images belonging to the currently selected group are displayed on the display unit 151, thumbnail images of a higher resolution and a larger size corresponding to those images may be displayed on the display unit 151. In this case, a graphic object related to a predetermined function (e.g., a sharing or deleting function) for the corresponding image may be displayed over a part of, or in the vicinity of, each thumbnail image. Such a graphic object may be displayed in the form of an icon or the like. An example in which the display unit 151 displays a plurality of thumbnail images corresponding to the plurality of images included in the selected group, together with graphic objects corresponding to at least one predetermined function, will be described below.

If there is a user's selection (for example, a touch input) of such a graphic object, the controller 180 may of course perform the function corresponding to the selected graphic object on the original image corresponding to the thumbnail image on which the selected graphic object is displayed. That is, for example, when the user selects the 'trash can' icon displayed on a specific thumbnail image, the control unit 180 may delete the image corresponding to the specific thumbnail image, or move it to a 'trash' folder in which data to be deleted is temporarily stored.

In the above description, the operation by which the mobile terminal 100 according to the exemplary embodiment of the present invention simultaneously groups a plurality of images based on preset conditions, displays the images grouped into a plurality of groups, and displays information related to the images included in each of the plurality of groups without changing the browsing level has been described with reference to the flowcharts.

In the following description, an example of displaying images grouped into the plurality of groups according to the operation procedure described above, and an example of displaying information related to the images included in each of the plurality of groups, will be examined in more detail with reference to exemplary diagrams.

FIG. 6 illustrates an example in which a plurality of images are simultaneously grouped according to a user's touch input in a mobile terminal according to an exemplary embodiment of the present invention.

First, the first drawing of FIG. 6 shows an example in which a plurality of previously stored images 600 are displayed in the mobile terminal 100 according to the embodiment of the present invention. For example, the control unit 180 may display, on the display unit 151, thumbnail images respectively corresponding to the pre-stored images using a program or an application such as the image browser described above.

In this state, the control unit 180 may simultaneously perform grouping on the plurality of images based on a predetermined grouping criterion when a predetermined touch gesture input of the user is applied on the display unit 151. For example, when the preset touch gesture input is a pinch-in gesture, the control unit 180 can detect that the preset touch gesture input has been applied if at least two touch inputs 610 applied at different points on the display unit 151 are dragged toward each other, as shown in the second drawing of FIG. 6.

When the touch gesture input is applied, the controller 180 may group the images corresponding to the thumbnail images currently displayed on the display unit 151 according to the predetermined grouping criterion, as described in steps S204 and S206 of FIG. 2. That is, the control unit 180 may form images captured within a predetermined time of each other into the same group according to the times at which the images were captured, or form images taken with a similar composition into the same group according to the composition of each image. Alternatively, if there is location information associated with the images, the controller 180 may form images having the same location information into the same group.

Meanwhile, the controller 180 can determine a representative image according to the predetermined criteria for each generated group, and display the thumbnail image corresponding to the determined representative image. In this case, the thumbnail image corresponding to the representative image of a group may be displayed separately from the thumbnail images corresponding to the non-grouped images to indicate that it corresponds to a specific group. For example, a graphic object may be displayed on a part of, or around, the area of the display unit 151 in which the thumbnail image corresponding to the representative image of each group is displayed. The third drawing of FIG. 6 shows this example.

Referring to the third drawing of FIG. 6, an example is shown in which three groups are generated from the plurality of images 600 before grouping. In this case, thumbnail images 630, 640, and 650 corresponding to the representative images determined from the respective groups (groups 1, 2, and 3) can be displayed on the display unit 151. The numbers of images 632, 642, and 652 included in each group may be displayed on a portion of the respective thumbnail images 630, 640, and 650 to indicate that each thumbnail image corresponds to a group. Accordingly, the thumbnail images 630, 640, and 650 corresponding to the respective groups and the thumbnail images 660 corresponding to the non-grouped images can be displayed separately.

Meanwhile, in the third drawing of FIG. 6, a numeric graphic object is displayed to indicate that each thumbnail image corresponds to a group, but other types of graphic objects may of course be displayed. For example, a plurality of graphic objects may be displayed according to the number of images included in the group. That is, when there are four images included in the first group corresponding to the first thumbnail image 630, the controller 180 may display four predetermined graphic objects 670 on a part of the first thumbnail image 630, thereby displaying the number of images included in the first group and indicating that the first thumbnail image 630 corresponds to the first group.

On the other hand, the control unit 180 may display a graphic object in the form of a scroll bar on a thumbnail image corresponding to a specific group. In this case, as shown in the fourth drawing of FIG. 6, such a scroll-bar-type graphic object is displayed on a part of the thumbnail image corresponding to a specific group (for example, the second group), so that this thumbnail image and the thumbnail images 660 corresponding to the non-grouped images may be distinguished from each other.

Meanwhile, the control unit 180 may allow the user to check the images included in each group without changing the current browsing level. FIGS. 7 and 8 show examples of such cases.

FIG. 7 illustrates an example of displaying the images included in a selected group when a specific group is selected in the mobile terminal according to an exemplary embodiment of the present invention.

First, the first drawing of FIG. 7 shows an example in which thumbnail images corresponding to the groups generated according to an embodiment of the present invention are displayed. As shown in the second drawing of FIG. 7, when the user selects one of the thumbnail images (the first thumbnail image 630), the control unit 180 may display, on the display unit 151, information related to the images of the group corresponding to the selected first thumbnail image. The second drawing of FIG. 7 shows this example.

Referring to the second drawing of FIG. 7, an example is shown in which information related to the first image, in a predetermined order, among the images included in the first group corresponding to the first thumbnail image 630 is displayed. As shown in the second drawing of FIG. 7, the information related to the first image of the selected group may be a thumbnail image 700 generated from that image, and the thumbnail image 700 may be generated with a higher resolution and a larger size than the thumbnail image 630 corresponding to the first group. Accordingly, if the first image among the images belonging to the first group is the representative image, the control unit 180 may display on the display unit 151 a thumbnail image of the representative image with a resolution and size different from those of the thumbnail image 630.

On the other hand, the control unit 180 may further display, automatically or according to the user's selection, a graphic object 710 for browsing the other images belonging to the first group on a part of the thumbnail image 700 corresponding to the first image. That is, the control unit 180 may further display the graphic object 710 when a touch input held for a predetermined time is detected in the area of the display unit 151 on which the thumbnail image 700 is displayed, or automatically when the thumbnail image 700 corresponding to the first image is displayed. For example, the graphic object 710 may have the form of a scroll bar, as shown in the second drawing of FIG. 7.

In this case, the control unit 180 may display information on the other images included in the first group, that is, their thumbnail images, based on the user's input to the graphic object 710. That is, when the user's touch input is detected in the area where the graphic object 710 is displayed, as shown in the third drawing of FIG. 7, and a drag input is applied, as shown in the fourth drawing of FIG. 7, the control unit 180 may display on the display unit 151 a thumbnail image of the specific image corresponding to the length of the drag input.

For example, the specific image may be whichever of the images included in the first group corresponds, in a predetermined order, to the length of the drag input. That is, if the first group includes four images and the length over which the drag input is applied is 3/4 (712) of the entire scroll bar, as shown in the third drawing of FIG. 7, the control unit 180 may display on the display unit 151 a thumbnail image 750 generated from the third image, in the predetermined order, among the images included in the first group. The fourth drawing of FIG. 7 shows an example in which the thumbnail image 750 corresponding to another image included in the first group is displayed according to the drag input 720 of the user.
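The mapping from drag length to image index described above can be sketched as follows. This is an illustrative sketch only; `image_for_scroll` and the ceiling-based mapping are assumptions chosen so that a drag covering 3/4 of the bar selects the third of four images, as in the example above:

```python
import math

def image_for_scroll(fraction, count):
    """Map a drag covering `fraction` (0.0 to 1.0) of the scroll bar
    to the corresponding image, 1-based, in a group of `count` images."""
    return max(1, math.ceil(fraction * count))

# A drag over 3/4 of the bar in a four-image group selects the third image.
print(image_for_scroll(0.75, 4))  # 3
```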

Meanwhile, FIG. 7 illustrates an example in which a graphic object in the form of a scroll bar is displayed; however, graphic objects corresponding in number to the images included in the group may be displayed instead. In this case, the controller 180 may of course display, on the display unit 151, a thumbnail image generated from the image corresponding to whichever of the graphic objects the user selects.

Meanwhile, according to the above description, the mobile terminal 100 according to the embodiment of the present invention may display on the display unit 151 not only any one of the images of the group selected by the user, but all of the images of the selected group. FIG. 8 is an exemplary view illustrating an example in which the images included in a selected group are displayed in the mobile terminal according to the embodiment of the present invention.

Referring to FIG. 8, the first drawing of FIG. 8 shows an example in which thumbnail images corresponding to the groups generated according to the embodiment of the present invention are displayed. In this case, as shown in the second drawing of FIG. 8, when the user selects (touch input 802) any one thumbnail image (the first thumbnail image 630), the controller 180 may display on the display unit 151 information related to the images of the group corresponding to the selected first thumbnail image. The information related to the images of the group corresponding to the first thumbnail image may be the thumbnail images 800 corresponding to the images included in the first group. The second drawing of FIG. 8 shows this example.

In this case, the controller 180 may display the thumbnail images 800 corresponding to each of the images included in the first group separately from the thumbnails corresponding to other images or other groups. That is, the controller 180 may display the thumbnails corresponding to the other images or the other groups in a darker or dimmer manner, or in a color of a predetermined brightness or lower, so that the thumbnail images 800 are more clearly distinguished.

Meanwhile, when the thumbnail images 800 corresponding to the images included in the first group are displayed, the controller 180 may display the thumbnail images 800 together with graphic objects associated with predetermined functions for the respective images. In other words, as shown in the third drawing of FIG. 8, when the controller 180 receives another touch input 804 and a drag input 810 moving away from the touch input 802 is applied on the display unit 151, the controller 180 may recognize this as an input for displaying the graphic objects related to the functions corresponding to the images included in the first group.

In this case, the control unit 180 may display the thumbnail images corresponding to each of the images included in the first group as images of a higher resolution and a larger size. That is, when the touch input 804 and the drag input 810 are applied, the control unit 180 may display each of the images included in the first group as a thumbnail of a higher resolution and a larger size, as shown in the third and fourth drawings of FIG. 8, and may display graphic objects related to the preset functions in the areas of the display unit 151 on which the respective thumbnail images are displayed. For example, if the preset functions are a 'sharing' function for uploading a specific image to a shared site or a web page, and a 'delete' or 'trash can' function for deleting the corresponding image, the control unit 180 may display the graphic objects 822 and 824 corresponding to the sharing function and the trash can function on a part of the areas where the thumbnail images corresponding to the images included in the first group are displayed.

If any one of the graphic objects 822 and 824 displayed on the thumbnail images is selected, the controller 180 may perform the function corresponding to the selected graphic object on the image corresponding to the thumbnail image on which the selected graphic object is displayed.

Meanwhile, when the thumbnail images 800 corresponding to each of the images included in the specific group are displayed as shown in FIG. 8, the control unit 180 may display a thumbnail image of a higher resolution and a larger size corresponding to any one of the thumbnail images, or may display the original image corresponding to one of the thumbnail images, on the display unit 151. FIG. 9 is an illustration showing this example.

Referring to FIG. 9, the control unit 180 of the mobile terminal 100 according to the embodiment of the present invention may receive the user's selection of any one of the groups in a state in which the thumbnail images corresponding to the groups generated according to the embodiment of the present invention are displayed. In this case, the control unit 180 may display the thumbnail images 800 corresponding to each of the images included in the group selected by the user, i.e., the first group, on the display unit 151.

In this case, as shown in the second drawing of FIG. 9, the control unit 180 can sense the user's input to an area of the display unit 151 in which one of the thumbnail images 820 is displayed. The controller 180 may then display that thumbnail image 820 in a different manner.

That is, as shown in the third drawing of FIG. 9, the control unit 180 may generate a thumbnail image 900 of a higher resolution and a larger size from the image corresponding to the thumbnail image 820, based on the user's selection of that thumbnail image, and display the generated thumbnail image 900 on the display unit 151. In this case, the other thumbnail images among the thumbnail images 800 corresponding to the respective images included in the first group may not be displayed, as shown in the third drawing of FIG. 9. Alternatively, the control unit 180 may display the other thumbnail images 800 corresponding to the images included in the first group as thumbnails of a lower resolution and a smaller size (not shown). In this case, when the user selects one of the other thumbnail images, the controller 180 may of course display a thumbnail image of a higher resolution and a larger size corresponding to the newly selected thumbnail image on the display unit 151.

On the other hand, in a state in which the thumbnail image 900 of a higher resolution and a larger size is displayed, the control unit 180 may display the image corresponding to the thumbnail image 900 on the display unit 151. That is, as shown in the fourth diagram of FIG. 9, the original image 950 corresponding to the thumbnail image 900 may be displayed on the display unit 151.

On the other hand, when one of the thumbnail images 820 is selected as shown in the second diagram of FIG. 9, the controller 180 may immediately display the original image 950 corresponding to the selected thumbnail image 820 on the display unit 151.
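The preview flow of FIG. 9 described above can be summarized as a small state machine. The following is a minimal, hedged sketch of that flow; all class and method names are hypothetical illustrations and do not correspond to the actual firmware interface of the mobile terminal 100.

```python
# Minimal sketch of the FIG. 9 preview flow: small thumbnail -> enlarged,
# higher-resolution preview -> original image. Names are illustrative only.

class ThumbnailViewer:
    """Walks an image from grid thumbnail to enlarged preview to original."""

    def __init__(self, images, immediate_original=False):
        self.images = images                  # image identifiers in the group
        self.immediate_original = immediate_original
        self.state = "grid"                   # grid | preview | original
        self.selected = None

    def tap(self, image_id):
        """Handle a tap on the area where a thumbnail is displayed."""
        if self.state == "grid":
            self.selected = image_id
            # The controller may either show an enlarged, higher-resolution
            # thumbnail first, or jump straight to the original image.
            self.state = "original" if self.immediate_original else "preview"
        elif self.state == "preview" and image_id == self.selected:
            self.state = "original"           # preview tapped -> original image
        return self.state


viewer = ThumbnailViewer(["820", "821", "822"])
assert viewer.tap("820") == "preview"   # enlarged thumbnail shown first
assert viewer.tap("820") == "original"  # then the original image
```

The `immediate_original` flag models the alternative behavior in which selecting a thumbnail displays the original image directly, without the intermediate enlarged preview.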

Meanwhile, the control unit 180 of the mobile terminal 100 according to the embodiment of the present invention can simultaneously group a plurality of images as described above when a preset touch input gesture, for example, a pinch-in gesture, is detected. In this state, when another touch input gesture corresponding to the pinch-in gesture, that is, a pinch-out gesture, is detected, the controller 180 can cancel the grouping of the plurality of images as described above. In this case, the images grouped into the respective groups may each be displayed again as individual thumbnail images on the display unit 151.
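The simultaneous group/ungroup behavior described above can be sketched as a pair of inverse operations keyed by a grouping criterion. This is an illustrative sketch only; the data model and function names are assumptions, not the terminal's actual implementation.

```python
# Hedged sketch of the gesture-driven grouping toggle: a pinch-in groups
# all images at once by a grouping key (e.g. the place where each image was
# photographed), and a matching pinch-out releases every group at once.

from collections import defaultdict

def group_images(images, key):
    """Group all images simultaneously by a grouping criterion."""
    groups = defaultdict(list)
    for image in images:
        groups[key(image)].append(image)
    return dict(groups)

def ungroup_images(groups):
    """Release every group simultaneously, restoring the flat image list."""
    return [image for members in groups.values() for image in members]

photos = [
    {"id": 1, "place": "Jeju"},
    {"id": 2, "place": "Seoul"},
    {"id": 3, "place": "Jeju"},
]

grouped = group_images(photos, key=lambda p: p["place"])   # pinch-in
flat = ungroup_images(grouped)                             # pinch-out
assert sorted(p["id"] for p in flat) == [1, 2, 3]
```

Because `ungroup_images` simply flattens every group in one pass, a single pinch-out releases all groups at the same time, mirroring the simultaneous release described in the text.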

Meanwhile, if a pinch-out gesture is input while no grouping has been performed, the controller 180 may perform various operations related to the plurality of currently stored images. FIG. 10 shows an example of such functions performed in the mobile terminal 100 according to the embodiment of the present invention.

For example, as shown in the first diagram of FIG. 10, the control unit 180 may display thumbnail images 1000 for each of a plurality of ungrouped images on the display unit 151. In this state, when the pinch-out gesture 1010 is input as shown in the first diagram of FIG. 10, the controller 180 may generate and display, for each of the plurality of images, thumbnail images of a higher resolution and a larger size than the thumbnail images 1000. The second diagram of FIG. 10 shows an example in which thumbnail images 1020 of higher resolution and larger size are displayed for each of the plurality of images.

On the other hand, the process shown in the first and second diagrams of FIG. 10 can be repeated. That is, each time a user's pinch-out input is applied, the control unit 180 can display the plurality of images with thumbnail images of a higher resolution and a larger size. However, once the plurality of images are displayed with thumbnail images corresponding to a predetermined maximum resolution and maximum size, the controller 180 may instead provide other information related to each of the plurality of images.
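The repeated pinch-out behavior can be sketched as stepping through resolution tiers until a preset maximum is reached, after which a further pinch-out switches to the related-information display. The tier values below are made-up examples, not values from the patent.

```python
# Sketch of the repeated pinch-out behavior: each pinch-out steps the
# thumbnails up one resolution tier; once the preset maximum is reached, a
# further pinch-out triggers the related-information display instead.

SIZES = [96, 192, 384]        # hypothetical thumbnail edge sizes, in pixels

class GalleryZoom:
    def __init__(self):
        self.tier = 0                      # start at the smallest thumbnails
        self.showing_info = False

    def pinch_out(self):
        if self.tier < len(SIZES) - 1:
            self.tier += 1                 # regenerate larger thumbnails
        else:
            self.showing_info = True       # at max size: show related info
        return SIZES[self.tier], self.showing_info

zoom = GalleryZoom()
assert zoom.pinch_out() == (192, False)
assert zoom.pinch_out() == (384, False)
assert zoom.pinch_out() == (384, True)   # extra pinch-out shows related info
```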

For example, the second diagram of FIG. 10 may be assumed to show a state in which the plurality of images are displayed with thumbnail images corresponding to the predetermined maximum resolution and maximum size. In this state, when the pinch-out gesture 1030 is applied once again by the user as shown in the third diagram of FIG. 10, the control unit 180 may perform another function related to each of the currently stored plurality of images.

For example, another function associated with each of the plurality of images may be the display of other information associated with each of the plurality of images. In this case, the control unit 180 may analyze each of the plurality of images and display information related to each image, among the information currently stored in the mobile terminal 100, around the periphery of each image.

For example, the controller 180 may analyze the position at which an image was captured and search the information stored in the mobile terminal 100 for information corresponding to the analyzed position. The retrieved information can be displayed around the periphery of the image. Here, the information stored in the mobile terminal 100 may be an image, a message transmitted or received via SMS or MMS, a message exchanged through an SNS, or pre-stored location information for a specific location.
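The location-based retrieval described above can be sketched as filtering the terminal's stored items (captured images, SMS/MMS or SNS messages, saved location entries) by whether they relate to the place where a photo was taken. The data model below is a hypothetical illustration, not the terminal's actual storage format.

```python
# Illustrative sketch of location-based search: stored items are matched if
# their location tag equals the photo's place, or if their text mentions it.

def find_related(place, stored_items):
    """Return stored items whose location tag or text mentions the place."""
    related = []
    for item in stored_items:
        if item.get("location") == place or place in item.get("text", ""):
            related.append(item)
    return related

stored = [
    {"kind": "image", "text": "airplane ticket", "location": "Jeju Island"},
    {"kind": "mms",   "text": "Hotel booking in Jeju Island confirmed"},
    {"kind": "note",  "text": "meeting", "location": "Seoul"},
]

hits = find_related("Jeju Island", stored)
assert [h["kind"] for h in hits] == ["image", "mms"]
```

Matching on the text field as well as the location tag mirrors the fallback described later in the text, where a place name recognized inside an image or message can substitute for a missing location record.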

For example, as shown in the fourth diagram of FIG. 10, when the image 1022 was photographed at 'Jeju Island', the control unit 180 may search the mobile terminal 100 for information related to the location 'Jeju Island'. In this case, an image 1060 of an airplane ticket to Jeju Island captured by the user, a message 1062 that the user transmitted or received to reserve a hotel in Jeju Island, and previously stored location information 1064 may be retrieved from the mobile terminal 100. The controller 180 may then display the airplane ticket image 1060, the message 1062, and the location information 1064 around the periphery of the image 1022.

Of course, while the information related to the specific image 1022 is displayed, the control unit 180 may also display information 1072 related to another image 1070. Such related information may be displayed on the display unit 151 for each of the plurality of images.

In the above description, the related information is retrieved based on the position where the image was captured. However, any other criterion may, of course, be set. For example, when the image 1022 includes text containing the name 'Jeju Island', the controller 180 may recognize the name 'Jeju Island' from the result of text recognition performed on the image 1022. In this case, even if the position where the image 1022 was captured is not recorded, the controller 180 can retrieve the information 1060, 1062, and 1064 shown in the fourth diagram of FIG. 10.

In the above description, the information related to each of the plurality of images is simultaneously searched and displayed according to the pinch-out gesture input 1030. However, depending on the user's selection, only the information related to one specific image may, of course, be searched.

For example, when the user's pinch-out gesture 1030 is sensed, the controller 180 may determine that the specific image 1022 corresponding to the point on the display unit 151 at which the touch input for the pinch-out gesture 1030 was applied has been selected. In this case, the control unit 180 may, of course, search only the information related to the selected specific image 1022 and display it on the display unit 151.
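Selecting a single target image under a pinch gesture amounts to hit-testing the gesture's position against the thumbnail layout. The following sketch uses the midpoint of the two touch points as the gesture position; the layout representation and names are illustrative assumptions.

```python
# Sketch of per-image selection under a pinch-out: the midpoint of the two
# touch points is hit-tested against the thumbnail layout; only the hit
# image (if any) has its related information searched.

def hit_test(touch_a, touch_b, layout):
    """Return the id of the thumbnail rectangle containing the pinch midpoint."""
    mx = (touch_a[0] + touch_b[0]) / 2
    my = (touch_a[1] + touch_b[1]) / 2
    for image_id, (x, y, w, h) in layout.items():
        if x <= mx < x + w and y <= my < y + h:
            return image_id
    return None   # no specific image hit: search info for all images

# Hypothetical layout: image id -> (x, y, width, height) on the display.
layout = {"1022": (0, 0, 100, 100), "1070": (100, 0, 100, 100)}
assert hit_test((10, 10), (50, 50), layout) == "1022"
assert hit_test((300, 300), (320, 320), layout) is None
```

A `None` result corresponds to the default behavior described earlier, where the pinch-out applies to every displayed image at once.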

The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer-readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and it may also be implemented in the form of a carrier wave (e.g., transmission over the Internet). The computer may also include the control unit 180 of the terminal. Accordingly, the above detailed description should not be construed as limiting in all respects and should be considered illustrative. The scope of the present invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the present invention are included in the scope of the present invention.

Claims (11)

A mobile terminal comprising:
A memory for storing a plurality of images;
A display unit for displaying image information related to the plurality of images; And
A controller configured to display thumbnail images corresponding to the plurality of images on the display unit, to simultaneously group the plurality of images according to a preset grouping criterion so as to generate at least one group when a preset touch gesture input is applied, and to simultaneously release the grouped state of the images when another touch gesture input corresponding to the preset touch gesture input is applied,
wherein the preset grouping criterion includes at least one of a time at which the plurality of images were photographed, a composition in which the plurality of images were photographed, and a position at which the plurality of images were photographed.
The apparatus of claim 1,
Wherein the controller determines a representative image according to a predetermined criterion from the at least one image included in each generated group, and displays a first thumbnail image corresponding to the representative image so as to correspond to each of the generated groups,
and wherein a graphic object related to the number of images included in the group is displayed in one region of the first thumbnail image corresponding to the representative image.
3. The apparatus of claim 2,
Wherein the controller determines the representative image based on at least one of a captured time, a saturation, a brightness, and a sharpness of the images included in the group.
3. The apparatus of claim 2,
Wherein, when any one of the first thumbnail images is selected, the controller generates a second thumbnail image of a higher resolution and a larger size than the first thumbnail image from any one of the images of the group corresponding to the selected first thumbnail image or from the representative image, and displays the generated second thumbnail image.
5. The apparatus of claim 4,
Wherein the controller displays at least one thumbnail image corresponding to the images of the group corresponding to the selected first thumbnail image, based on a touch input of the user applied to the second thumbnail image.
6. The apparatus of claim 5,
Wherein a graphic object in the form of a scroll bar is displayed in a part of the display area in which the second thumbnail image is displayed, and the controller displays, according to a length of a drag input of the user applied to the graphic object, a second thumbnail image corresponding to any one of the images of the group corresponding to the selected first thumbnail image.
6. The apparatus of claim 5,
Wherein a plurality of graphic objects corresponding to the number of images of the group corresponding to the selected first thumbnail image are displayed in a part of the display area in which the second thumbnail image is displayed, and the controller displays a second thumbnail image corresponding to the image of the group that corresponds to a selected one of the graphic objects.
6. The apparatus of claim 5,
Wherein the controller displays thumbnail images corresponding to each of the images of the group corresponding to the selected first thumbnail image, based on a touch input of the user applied to the second thumbnail image,
and wherein at least one graphic object corresponding to at least one preset function is displayed in each of the regions in which the thumbnail images corresponding to each of the images of the group are displayed.
The apparatus of claim 1,
Wherein, when the corresponding touch gesture input is applied while the grouping state is released, the controller displays information associated with at least one of the plurality of images, among information stored in the mobile terminal, around the periphery of the thumbnail image corresponding to the at least one image.
The apparatus of claim 1,
Wherein the preset touch gesture input is a pinch-in gesture input,
and the other touch gesture input corresponding to the preset touch gesture input is a pinch-out gesture input.
A method of controlling a mobile terminal, the method comprising:
Displaying thumbnail images respectively corresponding to a plurality of previously stored images;
Simultaneously grouping the plurality of images according to a preset grouping criterion when a preset touch gesture input is applied;
Displaying thumbnail images corresponding to a plurality of groups generated according to the grouping; and
Simultaneously releasing the grouped state of the grouped images when another touch gesture input corresponding to the preset touch gesture input is applied,
wherein the preset grouping criterion includes at least one of a time at which the plurality of images were photographed, a composition in which the plurality of images were photographed, and a position at which the plurality of images were photographed.
KR1020150116222A 2015-08-18 2015-08-18 Mobile terminal and method for controlling the same KR20170021616A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150116222A KR20170021616A (en) 2015-08-18 2015-08-18 Mobile terminal and method for controlling the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150116222A KR20170021616A (en) 2015-08-18 2015-08-18 Mobile terminal and method for controlling the same

Publications (1)

Publication Number Publication Date
KR20170021616A true KR20170021616A (en) 2017-02-28

Family

ID=58543303

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150116222A KR20170021616A (en) 2015-08-18 2015-08-18 Mobile terminal and method for controlling the same

Country Status (1)

Country Link
KR (1) KR20170021616A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10789473B2 (en) 2017-09-22 2020-09-29 Samsung Electronics Co., Ltd. Method and device for providing augmented reality service
US11765108B2 (en) 2021-06-03 2023-09-19 LINE Plus Corporation Method, computer device, and non-transitory computer readable recording medium to display grouped image message

