KR101684272B1 - Terminal device and controlling method thereof - Google Patents


Info

Publication number
KR101684272B1
Authority
KR
South Korea
Prior art keywords
user
main screen
skin
skin diagnosis
information
Prior art date
Application number
KR1020150109117A
Other languages
Korean (ko)
Inventor
윤정한
김성민
신현진
조세나
명고운
Original Assignee
주식회사 엘지유플러스
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 엘지유플러스
Priority to KR1020150109117A
Application granted granted Critical
Publication of KR101684272B1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Tourism & Hospitality (AREA)
  • Human Computer Interaction (AREA)
  • General Health & Medical Sciences (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Primary Health Care (AREA)
  • Marketing (AREA)
  • Human Resources & Organizations (AREA)
  • Economics (AREA)
  • Health & Medical Sciences (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

Disclosed are a terminal device capable of varying the information supplied to a user according to the time of day or the user's log-in state, and a controlling method thereof. To this end, the controlling method of the terminal device includes the steps of: turning on a display unit when the approach of a user is sensed while the display unit is off; and outputting a main screen through the display unit. At this time, the configuration of the main screen is automatically adjusted according to the current time or the user's log-in state.

Description

TECHNICAL FIELD [0001] The present invention relates to a terminal device and a control method thereof.

BACKGROUND OF THE INVENTION [0002]

With increasing interest in cosmetics, public interest in skin care is also increasing. Individuals try to maintain optimal skin condition, whether by receiving medical treatment or by using cosmetics, to prevent skin aging or skin troubles.

However, even good cosmetics yield a reduced beauty effect if they are not used properly. Recently, various cosmetic methods have been introduced through various media, for example, programs that guide a person through a cosmetic procedure.

For this purpose, it is necessary to provide the user with information such as a skin care method or a makeup method. However, since skin care and makeup methods differ depending on the time of day and on the user, it is more useful to provide the information the user needs in each situation than to provide the same information in every situation.

SUMMARY OF THE INVENTION The present invention has been made to solve the above problems, and it is an object of the present invention to provide a terminal device and a control method therefor which improve user convenience.

More specifically, it is an object of the present invention to provide a terminal apparatus, and a control method thereof, which can provide information useful to the user at each time of day by varying the screen configuration according to the time.

It is another object of the present invention to provide a terminal device which can output skin condition information of a user after the user logs in and thereby guide the user toward thorough skin care, and a control method thereof.

It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are not restrictive of the invention as defined by the appended claims.

According to an aspect of the present invention, there is provided a method of controlling a terminal device, the method comprising: turning on a display unit when the proximity of a user is detected in a state in which the display unit is off; and outputting a main screen through the display unit. At this time, the configuration of the main screen can be automatically adjusted according to the current time or whether the user is logged in.
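The claimed control flow can be sketched in pseudocode-style Python. All function names, screen names, and time-zone boundaries below are hypothetical illustrations, not from the patent:

```python
from datetime import datetime

# Hypothetical sketch of the claimed control method: wake the display on
# user proximity, then pick a main-screen layout from the current time and
# the log-in state. The hour thresholds are illustrative assumptions.
def select_main_screen(now: datetime, logged_in: bool) -> str:
    if logged_in:
        return "personal_skin_dashboard"   # per-user skin info after log-in
    if 5 <= now.hour < 12:
        return "morning_screen"            # e.g., weather plus makeup guide
    if 12 <= now.hour < 18:
        return "daytime_screen"
    return "evening_screen"                # e.g., skin-care-before-bed guide

def on_proximity_detected(display_on: bool, now: datetime, logged_in: bool):
    if not display_on:
        display_on = True                  # turn the display unit on
    return display_on, select_main_screen(now, logged_in)
```

For example, a proximity event at 14:00 with no logged-in user would turn the display on and select the daytime layout.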

According to another aspect of the present invention, there is provided a terminal apparatus including: a display unit; and a controller for turning on the display unit and controlling a main screen to be output through the display unit when the proximity of a user is detected in a state in which the display unit is off. At this time, the configuration of the main screen can be automatically adjusted according to the current time or whether the user is logged in.

The technical solutions obtained by the present invention are not limited to the above-mentioned solutions, and other solutions not mentioned can be clearly understood by those skilled in the art from the following description.

The present invention has been made in order to solve the above problems, and provides a terminal device and a control method thereof that improve user convenience.

Specifically, the present invention provides a terminal apparatus, and a control method thereof, which can provide information useful to the user at each time of day by changing the screen configuration for each time period.

In addition, the present invention provides a terminal device which can output skin condition information of a user after the user logs in and thereby guide the user toward thorough skin care, and a control method thereof.

The effects obtained by the present invention are not limited to the above-mentioned effects, and other effects not mentioned can be clearly understood by those skilled in the art from the following description.

FIGS. 1 and 2 are views showing a terminal device.
FIG. 3 is an exploded perspective view of a terminal device according to an embodiment of the present invention.
FIG. 4 is a diagram showing an example in which the first light source unit and the second light source unit are disposed.
FIG. 5 is a block diagram of a terminal device based on the electronic components that can be mounted in the terminal device.
FIGS. 6 and 7 are diagrams showing an example in which user log-on proceeds.
FIG. 8 is a diagram illustrating an example in which user information is input through a setting screen.
FIG. 9 is a diagram showing an example in which a photograph for skin diagnosis is taken.
FIG. 10 is a diagram showing an example in which a skin diagnosis result is output.
FIG. 11 is a diagram showing another example in which a skin diagnosis result is output.
FIG. 12 is a flowchart for explaining the operation of the terminal device according to the present invention.
FIG. 13 is a diagram showing an example in which an idle screen is output.
FIGS. 14 to 17 are diagrams showing an example in which a main screen is output.

Hereinafter, a terminal device related to the present invention will be described in detail with reference to the drawings.

The suffixes "module" and "part" for the components used in the following description are given or used interchangeably only for ease of describing the specification, and do not themselves have distinct meanings or roles.

The embodiments described later can be applied to various electronic apparatuses. That is, the makeup recommendation method according to the present invention can be applied not only to mobile terminals carried by a user, such as mobile phones, smart phones, digital cameras, PDAs, laptops, and MP3 players, but also to fixed terminals normally used in a fixed position. For convenience of explanation, in the following embodiments, the electronic device to which the cosmetic recommendation method according to the present invention is applied is exemplified by a terminal device capable of storing cosmetics.

FIGS. 1 and 2 are views showing a terminal device according to the present invention. Referring to the example shown in FIG. 1, a mirror 120 may be exposed on the front surface of the terminal device. At this time, at least a part of the mirror 120 can be utilized as a display unit 125 that outputs information. Accordingly, the user can check various information output through the display unit 125 while viewing his or her own image in the mirror 120.

The terminal device may include a light source unit 140. The light source unit 140 may be disposed to illuminate the user's face when the user is positioned in front of the terminal device. As the user's face brightens, the user's skin tone and makeup state reflected in the mirror become more apparent. In FIGS. 1 and 2, the light source unit 140 is located at the left and right edges of the terminal device; however, the position of the light source unit 140 is not limited to the illustrated example.

On the rear surface of the mirror 120, a case 110 on which electronic components can be mounted may be located. As will be described later, the case 110 may include a front case, a rear case, and a main body. In the case 110, electronic components such as a communication unit, a camera, a memory, and a control unit can be mounted. The camera 130 mounted in the case 110 may perform a function of photographing the outside of the display unit. To this end, the rear surface portion of the mirror 120 (i.e., the surface abutting the case 110) is preferably capable of transmitting light.

A part of the terminal device may be equipped with a cosmetic refrigerator for storing cosmetics. A mirror may be attached to the door 150 of the cosmetic refrigerator, but this is not necessarily so.

Hereinafter, for convenience of explanation, the portion of the terminal device corresponding to the cosmetic refrigerator will be referred to as a refrigerator area 20, and the remaining area excluding the area occupied by the cosmetic refrigerator will be referred to as a terminal device area 10.

FIG. 3 is an exploded perspective view of a terminal device according to an embodiment of the present invention. Referring to FIG. 3, a mirror 310 may be disposed on the front surface of the terminal device. Reflecting portions 312 for reflecting light generated in the light source unit may be coupled to both ends of the mirror 310. The reflecting portion 312 reflects a part of the light generated in the light source unit; by reflecting a part of that light, glare caused by the generated light can be reduced.

The transmittance and reflectance of the mirror 310 may be adjusted so that one side operates as a reflective surface through which the opposite side is invisible, and the other side operates as a transmissive surface through which the opposite side is visible. For example, when the transmittance of the mirror 310 is adjusted to 70% and the reflectance to 30%, the opposite side is not seen when the mirror 310 is viewed from one side, but the opposite side can be seen when the mirror 310 is viewed from the other side.

At this time, the reflective surface of the mirror 310 is preferably exposed to the outside of the terminal device. This is so that the user can see his or her face reflected in the terminal device.

It is preferable that the transmissive surface of the mirror 310 is disposed so as to face the inside of the terminal apparatus. In this case, when a camera is installed in the terminal device, a user facing the mirror 310 can be photographed through the camera.

The mirror 310 may be manufactured by optical deposition using a non-metallic material such as SiO2 or CIO2. When a touch screen panel is attached to the mirror 310 made of a non-metallic material, a part of the mirror 310 may function as a display unit that outputs information and receives touch input.

The mirror 310 may be partitioned into a first mirror part 310-1 forming the terminal device area 10 and a second mirror part 310-2 forming the refrigerator area 20.

A part of the back surface of the first mirror part 310-1 can engage with the front case 320. An opening 324 for coupling with the touch screen panel 350 may be formed in a part of the front case 320.

When the touch screen panel 350 is coupled to the opening 324 of the front case 320, the touch screen panel 350 and the mirror 310 come into direct contact with each other through the opening 324. Accordingly, the part of the first mirror part 310-1 that directly contacts the touch screen panel 350 can function as the display unit 125 that outputs information or receives touch input.

A groove 322 for seating the light source unit 330 may be formed at one end of the front case 320. When the front case 320 is coupled with the mirror 310, the reflecting portion 312 may be positioned over the groove 322. Accordingly, a part of the light generated by the light source unit 330 seated in the groove 322 will be reflected by the reflecting portion 312.

A door 340 for opening and closing a refrigerating chamber 364 may be coupled to the rear surface of the second mirror part 310-2. At this time, a heat insulating material (for example, styrofoam) 345 may be inserted between the second mirror part 310-2 and the door 340 to keep the refrigerating chamber 364 at a constant temperature. A groove 347 for seating the light source unit 330 may be formed at either end of the heat insulating material 345 or the door 340. When the mirror 310 is attached over the heat insulating material 345 and the door 340, the reflecting portion 312 may be positioned over the groove 347. Accordingly, a part of the light generated by the light source unit 330 seated in the groove 347 will be reflected by the reflecting portion 312. In FIG. 3, the groove 347 for seating the light source unit 330 is illustrated at one end of the heat insulating material 345.

The light source unit 330 may include a light emitting unit capable of emitting light, such as an LED (Light Emitting Diode), an incandescent lamp, or a fluorescent lamp. The turning on and off of the light source unit may be controlled by the control unit or by manual input (for example, operation of a switch for turning the light source unit on and off). The user will be able to apply makeup more easily with the aid of the light from the light source unit.

The light source unit 330 may include a first light source unit emitting light of a first color and a second light source unit emitting light of a second color. For example, the first light source unit may be turned on while the user applies makeup, and the second light source unit may be turned on after the user's makeup is finished. The first light source unit and the second light source unit may emit light of different colors, such as daylight, white, or warm white.

The light emitting units constituting the first light source unit and the light emitting units constituting the second light source unit may be arranged alternately, or the array constituting the first light source unit and the array constituting the second light source unit may be arranged side by side.

For example, FIG. 4 shows an example in which the first light source unit and the second light source unit are disposed. As in the example shown in FIG. 4(a), the light emitting units constituting the first light source unit and the light emitting units constituting the second light source unit may be arranged alternately one by one, or, as in the example shown in FIG. 4(b), the array constituting the first light source unit and the array constituting the second light source unit may be arranged side by side.

The first light source unit and the second light source unit may also be disposed in a manner different from the illustrated examples. For example, the light emitting units constituting the first light source unit and the light emitting units constituting the second light source unit may be arranged alternately at a ratio of N:M (where N and M are different natural numbers), or the first light source unit and the second light source unit may be disposed at different positions. For example, the first light source unit may be disposed in the front case, and the second light source unit may be disposed in the door.
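The alternating arrangements described above can be sketched as a simple list interleaving. The function name and list representation are hypothetical; each element stands for one light emitting unit:

```python
# Sketch of alternating two LED arrays at an N:M ratio (names hypothetical).
# interleave(first, second, 1, 1) reproduces the one-by-one alternation of
# FIG. 4(a); other ratios give the N:M variant described in the text.
def interleave(first, second, n, m):
    """Repeat: n units from the first light source, then m from the second."""
    out, i, j = [], 0, 0
    while i < len(first) or j < len(second):
        out += first[i:i + n]; i += n   # take up to n units of the first color
        out += second[j:j + m]; j += m  # then up to m units of the second
    return out
```

For instance, a 2:1 ratio of four first-color units and two second-color units yields the pattern A A B A A B.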

The front case 320 and the door 340 may be coupled to a body 360. The body 360 determines the overall shape of the terminal device and serves to mount electronic components. Specifically, the body 360 may be provided with a storage box 362 for housing the various electronic components that operate the terminal device, and the refrigerating chamber 364 for storing cosmetics. The storage box 362 may be protected by the front case 320, and the refrigerating chamber 364 may be protected by the door 340. At this time, the door 340 can be hinged so as to be rotatable with respect to the body 360.

Electronic components such as a touch screen, a communication unit, a memory, a control unit, and a camera can be mounted in the storage box 362. At this time, the part of the front case corresponding to the position of the camera may be perforated (326) so that the camera can photograph the outside of the mirror 310.

The refrigerating chamber 364 may be formed to protrude toward the back of the body 360, and a thermal management unit 380 can be coupled to the protruding portion of the refrigerating chamber 364. The thermal management unit 380 may supply cold air to the refrigerating chamber 364 and discharge the heat generated in the process. For example, the thermal management unit 380 may include a thermoelectric element. When a current is applied, one side of the thermoelectric element forms a low-temperature part and the other side forms a high-temperature part, and cold air can be supplied to the refrigerating chamber 364 through the low-temperature part of the thermoelectric element.

In order to dissipate the heat of the high-temperature part of the thermoelectric element, the thermal management unit 380 may include a heat sink. For this purpose, the heat sink is preferably made of a material that transfers heat easily, so that it can receive the heat generated in the high-temperature part by conduction. That is, the heat sink may be made of a metal material or the like.

FIG. 5 is a block diagram of a terminal device based on the electronic components that can be mounted in the terminal device. Referring to FIG. 5, a terminal device according to the present invention may include a communication unit 510, a display unit 520, a camera 530, a light source unit 540, a memory 550, a sensing unit 560, and a control unit 570.

The communication unit 510 plays a role of receiving data from outside or transmitting data to the outside. Through the communication unit 510, the terminal device can communicate with an external server or an external terminal.

For the communication unit 510, a short-range communication technology (e.g., Bluetooth, Zigbee, or NFC (Near Field Communication)), a mobile communication technology (e.g., LTE (Long Term Evolution) or HSDPA (High Speed Downlink Packet Access)), a wireless network technology (e.g., WLAN (Wireless Local Area Network) or Wi-Fi (Wireless Fidelity)), or a wired communication technology may be applied.

The display unit 520 outputs information under the control of the control unit 570. At this time, the display unit 520 may be formed on at least a part of the mirror. In this case, the mirror 120 reflects incident light so that the user can see himself or herself, and at the same time outputs information through at least a part of its area (i.e., the area combined with the display unit 520).

The display unit 520 functions as an output device for outputting information and can function as an input device for receiving a touch input.

The camera 530 can photograph the area in front of the mirror 120. For example, the camera 530 may photograph a user looking at the mirror 120. The front side of the mirror 120 (i.e., the side facing the user) reflects light, while the back side (i.e., the side facing the case 110) can have a translucent form that allows light to pass through. As the reflective surface is exposed to the outside, the user can see his or her appearance in the mirror, and the camera 530 can photograph the outside of the terminal device because the transmissive surface through which light passes is directed toward the camera 530.

The image data collected via the camera 530 may be analyzed and processed into user control commands. In addition, the control unit 570 may analyze the video signal input to the camera 530 to determine the user's face shape, lip shape, eye shape, skin tone, skin type, and the like. At this time, in order to reduce the power consumed by the camera, the control unit may keep the camera in an inactive state and control the camera to be activated only when the user is close to the terminal device.

The light source unit 540 can emit light. The light source unit 540 may include a light emitting unit such as an LED (Light Emitting Diode), an incandescent lamp, or a fluorescent lamp. The lighting of the light source unit 540 may be controlled by the control unit 570 or by manual input (for example, operation of a switch for turning the light source unit 540 on and off). For example, the control unit may keep the light source unit off and turn it on when the user is close to the terminal device, when the user is about to be photographed through the camera, or when it is determined that the user intends to start makeup.

The memory 550 stores data supporting the various functions of the terminal device. The memory 550 may store a plurality of application programs, for example an application for skin analysis, and various data used by the terminal device. In addition, the memory 550 may store moving images for guiding makeup. At this time, the moving images can be classified and stored according to predetermined criteria.

The sensing unit 560 may sense an object approaching the terminal device. When it is detected that a user is close to the terminal device, the control unit 570 activates the camera 530, analyzes the image input through the camera 530, and determines whether the nearby user is in a state of using the terminal device.

The control unit 570 controls the overall operation of the terminal device. The control unit 570 analyzes the user's face region in the image input through the camera 530, extracts moving images suited to the analysis result, and outputs a moving image list including the extracted moving images. The control unit 570 also processes signals, data, information, and the like input or output through the above-described components, or drives an application program stored in the memory 550, to provide or process information or functions appropriate for the user.

The operation of the terminal device according to the present invention will now be described in detail based on the above description.

In order to use the services provided by the terminal device according to the present invention, the user can first attempt to log on (or log in) through a registered account. Specifically, when account information such as an ID and a password for using the services provided by the terminal device is input, the control unit can attempt to log the user on based on the input account information. The control unit may transmit the user ID and password to a skin management server and receive the user's log-in result from the skin management server.

FIGS. 6 and 7 are diagrams showing an example in which user log-on proceeds.

When the terminal apparatus is started, the control unit can first control a plurality of guidance messages introducing the services provided by the terminal apparatus to be sequentially output, as in the example shown in FIG. 6.

FIGS. 6A to 6C illustrate messages, output sequentially, indicating that services such as skin diagnosis, skin condition management, skin care methods, and customized cosmetic recommendation can be provided.

While any one of the plurality of guidance messages is being output, when a touch input in which a pointer touching the display unit is dragged in a predetermined direction is received, or when a predetermined time has elapsed since the guidance message was output, the control unit can control the guidance message of the next order to be output.

For example, as in the example shown in FIG. 6(a), while a first guidance message indicating that a skin diagnosis service can be provided is being output, when a touch input in which a pointer touching the display unit is dragged in a predetermined direction is received, or when a predetermined time has elapsed since the first guidance message was output, the control unit stops outputting the first guidance message and, as in the example shown in FIG. 6(b), controls a second guidance message indicating that a skin condition management service can be provided to be output.

When all the guidance messages have been output, or when a user input for starting a service request (for example, a touch input touching the start button shown in FIG. 6) is received, the control unit stops outputting the guidance messages and can output a log-on screen as in the example shown in FIG. 7.
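The guidance-message sequence behaves like a small state machine: a drag or a timeout advances to the next message, and pressing Start or exhausting the messages leads to the log-on screen. A minimal sketch, with hypothetical event names and the four service messages from FIGS. 6A to 6C:

```python
# Hypothetical sketch of the guidance-message carousel. The state is either
# an index into MESSAGES or the string "logon_screen".
MESSAGES = ["skin diagnosis", "skin condition management",
            "skin care method", "customized cosmetic recommendation"]

def next_state(index, event):
    """event is one of 'drag', 'timeout', or 'start' (names assumed)."""
    if event == "start":
        return "logon_screen"          # Start button skips remaining messages
    if event in ("drag", "timeout"):
        index += 1
        if index >= len(MESSAGES):
            return "logon_screen"      # all messages shown: go to log-on
        return index                   # show the next guidance message
    return index                       # unknown event: stay on current message
```

A drag on the first message therefore advances to the second, while a timeout on the last message brings up the log-on screen.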

When account information such as an ID and a password is input, the control unit can request the skin diagnosis server to log the user on based on the input account information.

If the user is not yet a member, the user can press the member registration button 710 to proceed with the member registration procedure. Since the general membership registration procedure is well known, a detailed description thereof will be omitted.

When the user logs on to the terminal device for the first time, the control unit can control to output a setting screen for receiving user information.

For example, FIG. 8 shows an example in which user information is input through a setting screen. First, as in the example shown in FIG. 8(a), the control unit may output a screen requesting a profile image of the user. The face registration request screen may include a photographing button 810 for photographing the user through the camera and a gallery button 820 for selecting one of the photographs already taken.

When a photograph is taken through the photographing button 810, or a previously taken photograph is selected through the gallery button 820, the control unit can register the photographed or selected photograph as the user's profile image, as in the example shown in FIG. 8(b).

If a profile image associated with the user account is already registered, the profile image registration procedure shown in FIGS. 8A and 8B may be omitted.

Thereafter, the control unit may control to output a selection screen for acquiring skin information of the user.

FIGS. 8(c) to 8(e) illustrate a screen for selecting the user's skin type, a screen for selecting a skin tone, and a screen for selecting a skin anomaly. Based on the user input on each screen, the user's skin information can be obtained.

When the acquisition of the user information is completed, the control unit can provide skin diagnosis information to the logged-in user.

However, the log-on process is not a necessary procedure for receiving a service provided by the terminal device. For example, the control unit may provide a skin diagnosis service to a user who is not logged on through a guest account.

When the skin diagnosis service is activated by user input, the control unit can diagnose the skin condition of the user based on the user's face photographed through the camera. Specifically, when the user's face is recognized through the camera, the control unit can perform skin diagnosis on skin troubles, pores, dirt, wrinkles, and skin tones based on the recognized face.

For example, FIG. 9 is a view showing an example in which a photograph for skin diagnosis is taken.

When the skin diagnosis service is activated, the control unit can output the preview image input through the camera on the display unit, and control a guide line 910 for guiding the position of the user's face to be output on the preview image. In FIG. 9A, the guide line 910 is shown as a dotted line.

When the user's face is recognized within the guide line 910 and a predetermined time elapses without the face leaving the guide line, the control unit can take a photograph. For example, the control unit may take a photograph three seconds after the user's face is recognized within the guide line 910. At this time, if the user's face moves out of the guide line 910 while the three seconds are being counted, the control unit may stop counting and wait until the user's face re-enters the guide line 910.
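The countdown-and-reset behavior above can be sketched as follows. The function name and the per-second boolean input are illustrative assumptions; a real implementation would run face detection on each camera frame:

```python
# Sketch of the photographing countdown: start a 3-second count when the
# face is inside the guide line, reset and wait whenever it leaves, and
# shoot once the count completes. `frames` is a hypothetical sequence of
# booleans (face inside the guide line or not), one per second.
def capture_after_countdown(frames, hold_seconds=3):
    count = 0
    for t, face_inside in enumerate(frames):
        if face_inside:
            count += 1
            if count >= hold_seconds:
                return t              # second at which the photo is taken
        else:
            count = 0                 # face left the guide line: restart count
    return None                       # face was never held long enough
```

So a face that stays inside for three consecutive seconds triggers the shot, while briefly leaving the guide line restarts the count.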

When the photograph is photographed, the control unit can diagnose the user's skin condition based on the photographed photograph.

During skin analysis, the control unit can control a message 920 indicating that skin diagnosis is in progress to be output, as in the example shown in FIG. 9B. If a stop button 930 for stopping the skin diagnosis is touched, the control unit can stop the skin analysis.

When the skin diagnosis is completed, the control unit may quantify the results of at least one diagnosis item and provide the skin diagnosis result to the user.

For example, FIG. 10 is a diagram showing an example in which a skin diagnosis result is output.

When the skin analysis is completed, the control unit can control the evaluation value for each analysis item to be output, as in the example shown in FIG. 10. In FIG. 10, evaluation values for five analysis items, namely the pores 1001, the troubles 1003, the dirt 1005, the skin tone 1007, and the wrinkles 1009, are illustrated as being output. The control unit may calculate an overall score as the average of the skin diagnosis results for the plurality of items.
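The overall score computed as the average of the per-item values can be sketched as follows; the function name and the particular score scale are illustrative assumptions only.

```python
def overall_score(item_scores):
    """Overall skin score as the average of the per-item evaluation
    values (e.g. pores, troubles, dirt, skin tone, wrinkles)."""
    if not item_scores:
        raise ValueError("at least one diagnosis item is required")
    return sum(item_scores.values()) / len(item_scores)
```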

The control unit can output weather information 1010, time information 1020, and the like. Although not shown, the weather information may include UV index information and precipitation information along with the weather and temperature information. In addition, the control unit may also output information 1030 of the logged-in user (illustrated in FIG. 10 as a picture of the user).

The control unit can control figure objects 1041, 1043, 1045, 1047, and 1049 representing each analysis item to be additionally output. The name of the corresponding analysis item may be mapped to each figure object.

At this time, based on the evaluation value of each analysis item, the control unit can determine the size and the color of the figure object representing that analysis item.
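One way such a score-to-style mapping could look is sketched below. The linear size ramp, the red-to-green color ramp, and the pixel range are illustrative design choices assumed here, not values from the embodiment.

```python
def figure_style(score, max_score=100, base_px=40, extra_px=60):
    """Map an item's evaluation value to a figure-object size and color.

    The size grows linearly with the score, and the color shifts from
    red (low score) toward green (high score).
    """
    ratio = max(0.0, min(1.0, score / max_score))
    size = base_px + ratio * extra_px          # 40..100 px, illustrative
    red, green = int(255 * (1 - ratio)), int(255 * ratio)
    return size, f"#{red:02x}{green:02x}00"
```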

FIG. 11 is a diagram showing another example in which a skin diagnosis result is output.

The control unit can display a reference figure 1110 in which each skin diagnosis item is located at a vertex, and reflect the evaluation score for each skin diagnosis item as the distance from the center of the reference figure to the corresponding vertex. For example, in FIG. 11, a regular pentagon 1110 is displayed as the reference figure, and the distance from the center of the regular pentagon to each vertex reflects the evaluation score for each skin diagnosis item.
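The vertex geometry of such a pentagon-style (radar) chart can be computed as below; a minimal sketch, in which the function name, the top-starting clockwise placement, and the unit radius are assumptions.

```python
import math


def radar_points(scores, max_score=100, radius=1.0):
    """Vertex coordinates of the diagnosis polygon: one vertex per
    item, at a distance from the center proportional to its score.

    Items are placed around the center starting at the top, matching
    a regular-pentagon reference figure when five items are given.
    """
    n = len(scores)
    points = []
    for i, score in enumerate(scores):
        angle = math.pi / 2 - 2 * math.pi * i / n   # start at the top
        r = radius * score / max_score
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points
```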

When the skin diagnosis is performed while the user is logged on, the control unit can store the user's skin diagnosis result in association with the user account information. Accordingly, the user can look up at any time the skin diagnosis results accumulated under his or her account over a predetermined period of time.

Alternatively, if the skin diagnosis is performed while the user is not logged on, the control unit may output the result of the user's skin diagnosis and then immediately discard it. Accordingly, the user will not be able to look up at a later time the result of a skin diagnosis performed without being logged on.
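The store-or-discard behavior for logged-on versus guest sessions can be sketched as follows; the in-memory dict standing in for the terminal's storage, and the function name, are assumptions for illustration.

```python
def handle_diagnosis_result(result, account_id, store):
    """Persist the result under the logged-on account, or discard it
    for a guest session after it has been shown once.

    `store` is any mapping of account id -> list of results.
    Returns the account's result history, or None for a guest.
    """
    if account_id is None:            # guest (not logged on): show and drop
        return None
    store.setdefault(account_id, []).append(result)
    return store[account_id]
```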

The operation of the terminal device according to the present invention will be described in detail based on the above description.

FIG. 12 is a flowchart for explaining the operation of the terminal device according to the present invention.

First, it is assumed that the display unit of the terminal device is initially in an off state. Here, the off state means a state in which no power is supplied to the display unit, so that no graphic object is output on the display unit. However, even if the display unit is off, the touch sensor forming a layered structure with the display unit may wake up at predetermined intervals. Accordingly, even while the display unit is off, it can remain in a state capable of receiving touch input.

If a user approaching the terminal device is detected (S1210), the control unit switches the display unit from the off state to the on state (S1220) and controls the display unit to output the standby screen (S1230). At this time, the control unit may sense the user's approach through the image input through the camera, or through the sensing signal of the sensing unit.
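The wake-on-approach transition of steps S1210 to S1230 can be sketched as a single state update; the function name, the boolean inputs, and the `"standby_screen"` label are illustrative assumptions.

```python
def next_display_state(display_on, user_approaching):
    """S1210-S1230: while the display is off, a sensed user approach
    (from the camera image or the sensing-unit signal) turns the
    display on and brings up the standby screen; otherwise the
    current state is unchanged."""
    if not display_on and user_approaching:
        return True, "standby_screen"
    return display_on, None
```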

FIG. 13 is a diagram showing an example in which the standby screen is output. When the user's approach is detected, the control unit may control the display unit to output the standby screen, as in the example shown in FIG. 13. Through the standby screen, time information, weather information, communication state information, and the like can be output.

In the example shown in FIG. 13, time information 1310 including the date and current time, weather information 1320 including the weather and current temperature, and communication state information 1330 including the communication strength are illustrated as being output.

The control unit can receive the weather information from a weather information providing server and output it on the standby screen. If the communication state is poor and the weather information cannot be received from the weather information providing server, a message indicating that the weather information cannot be received due to the poor communication state may be output instead of the weather information.

When a user input touching the screen is received while the standby screen is being output (S1240), the control unit can stop the output of the standby screen and control the main screen to be output. At this time, the control unit may determine the information to be output through the main screen based on factors such as whether the user is logged on and the current time.

FIGS. 14 to 17 are diagrams showing examples in which the main screen is output.

First, FIG. 14 is a diagram showing the configuration of the main screen in the morning time zone before log-on.

If the user has not logged on to the terminal device and the current time corresponds to the morning time zone (S1250), the control unit controls detailed date and time information and the first type content to be output, as in the example shown in FIG. 14(a) (S1260). Here, the first type content may be content selected in association with today's weather, such as skin care method information or makeup method information according to the weather.

In the example shown in FIG. 14(a), assuming that the display unit is divided into a first area on the left side and a second area on the right side with a virtual line as a boundary, date and time information is output in the first area, and skin care method information, including weather information such as the precipitation probability, the humidity, and the skin index together with a message guiding a skin care method according to the weather, is output in the second area.

Control buttons 1410 to 1440 for controlling the terminal device and a start button 1450 for starting skin diagnosis may be output on the main screen.

For example, in FIG. 14(a), the control buttons are output through the first area and the start button is output through the second area. The control buttons output through the first area are illustrated as including a menu button 1410, a beauty content button 1420, a self-camera button 1430, and a lighting button 1440.

The menu button 1410 is for outputting a list of applications installed in the terminal device. When the menu button is touched, the control unit can control to output a list of applications installed in the terminal device.

The beauty content button 1420 is for outputting a list of contents related to the skin care method or cosmetic method and the like. When the beauty content button is touched, the control unit can output a content list including a web page, a document, a voice file, or a moving picture related to a skin care method, a makeup method, or the like.

The self-camera button 1430 is for activating the camera. When the self-camera button is touched, the control unit can control the camera to be activated.

The lighting button 1440 is for controlling the on/off state and the brightness of the light source unit. When the lighting button is touched, the control unit may output sub-buttons such as a button for turning the light source unit on and off and a control bar for adjusting the brightness of the light source unit.

For example, in FIG. 14(b), as the lighting button 1440 is touched, sub-buttons are displayed, including a button 1442 for turning the first light source unit on and off, a button 1444 for turning the second light source unit on and off, and a control bar 1446 for adjusting the brightness.

When the start button 1450 for skin diagnosis is touched, the control unit can activate the camera. When the user's face is detected through the camera under the predetermined condition, the control unit can perform the skin diagnosis based on the detected face. When the skin diagnosis is completed, the control unit may control the skin diagnosis result to be output as described above with reference to FIGS. 10 and 11.

If the user has not logged on to the terminal device and the current time corresponds to the afternoon time zone (S1250), the control unit may control the second type content to be output instead of the first type content (S1270).

For example, FIG. 15 is a diagram showing the configuration of the main screen in the afternoon time zone before log-on.

In the afternoon, the user is more likely to have already gone out than in the morning, so the skin care method information or makeup method information according to the weather is likely to be no longer useful to the user. Accordingly, in the afternoon, the control unit can output the second type content instead of the first type content.

For example, in FIG. 15, a skin care moving picture with a high number of views is output through the second area. By outputting content that is popular with many users (i.e., has a high number of views) instead of content associated with the weather, the terminal device can attract users and further increase user interest.

FIGS. 14 and 15 illustrate that the first type content is output in the morning time zone and the second type content is output in the afternoon time zone. However, the time zone in which each content is output may be set differently from the illustrated example.

While the main screen is being output, the user can log on to use the services provided by the terminal device. For example, when the login button shown in FIGS. 14 and 15 is touched, the control unit may output a login page for user log-on. When the account information is input through the login page, the control unit can perform the log-on based on the input account information (S1280).

When the user log-on is completed, the control unit may control the skin diagnosis result of the user to be output instead of the first or second type content.

For example, FIG. 16 is a diagram showing an example in which a skin diagnosis result of a user is output.

If the predetermined period has not elapsed since the most recent skin diagnosis was performed (S1290), the control unit may control the user's latest skin diagnosis result to be output, as in the example shown in FIG. 16 (S1292). In FIG. 16, the diagnosis results for the evaluation items such as troubles, pores, dullness, wrinkles, and skin tone, together with the time elapsed since the last skin diagnosis was performed, are illustrated as being output.

If the predetermined period has elapsed since the user performed the most recent skin diagnosis (S1290), the control unit may control a cumulative result accumulating the user's skin diagnosis results to be output (S1294).

For example, FIG. 17 is a diagram showing an example in which a cumulative skin diagnosis result of a user is output.

If the predetermined period has elapsed since the user performed the most recent skin diagnosis, the control unit may control the cumulative result accumulating the user's skin diagnosis results to be output, as in the example shown in FIG. 17. In FIG. 17, a graph accumulating the skin diagnosis results from February 10 to February 28 is shown.

When an arbitrary item on the graph is touched, the control unit can control the skin diagnosis result corresponding to the touched item to be output. For example, when the item corresponding to February 28 is touched in the graph shown in FIG. 17, the control unit can output the skin diagnosis result of the selected date.

By changing the information output through the display unit according to the time elapsed since the user last performed the skin diagnosis, the terminal device can induce the skin diagnosis to be performed periodically.

If the user has no record of performing the skin diagnosis, or if a long period (i.e., a second predetermined period) has elapsed since the user last performed the skin diagnosis, the control unit may control the first type content or the second type content to be output according to the current time, even though the user is logged on.
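The main-screen selection described across FIGS. 14 to 17 (steps S1250 to S1294) can be summarized as one dispatch function. This is an illustrative sketch only: the noon boundary between the morning and afternoon time zones and the 7-day and 60-day values for the two predetermined periods are assumed parameters, not values fixed by the embodiment.

```python
def main_screen_content(logged_on, hour, days_since_diagnosis,
                        recent_limit=7, stale_limit=60):
    """Pick what the main screen shows.

    - logged on with a recent diagnosis: latest result (S1292, FIG. 16)
    - logged on with an older diagnosis: cumulative results (S1294, FIG. 17)
    - not logged on, no diagnosis record, or record older than the
      second predetermined period: weather-linked first type content in
      the morning, popular second type content in the afternoon
      (S1260 / S1270, FIGS. 14-15)
    """
    if (logged_on and days_since_diagnosis is not None
            and days_since_diagnosis <= stale_limit):
        if days_since_diagnosis <= recent_limit:
            return "latest_result"
        return "cumulative_results"
    return "first_type" if hour < 12 else "second_type"
```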

When the user logs on to the terminal device, the control unit can control the log-on state to be maintained even after the display unit is turned off. If the proximity of a user is then detected and the user recognized through the camera is determined to be the same person as the user of the logged-on account, the control unit maintains the log-on state and can control the skin diagnosis result information to be output, as in the examples shown in FIGS. 16 and 17.

If it is determined that the recognized user is different from the user of the logged-on account, the control unit logs out the logged-on account and can then control the first or second type content to be output, as in the examples shown in FIGS. 14 and 15.
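The keep-or-drop session decision can be sketched as below; the function name, the string screen labels, and representing the recognition result as a matching identifier are illustrative assumptions (in practice the comparison would be a face-match against the account holder).

```python
def resolve_session(logged_on_account, recognized_user):
    """Decide whether the persisted log-on survives a new approach.

    The session is kept only if the recognized face matches the
    account holder; otherwise the account is logged out and the
    main screen falls back to the first/second type content.
    Returns (account, screen_kind).
    """
    if logged_on_account is not None and recognized_user == logged_on_account:
        return logged_on_account, "diagnosis_result"   # FIGS. 16-17
    return None, "typed_content"                       # FIGS. 14-15, by time zone
```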

According to an embodiment of the present invention, the above-described methods (operation flowcharts) can be implemented by a program such as a computer program or an application, or as processor-readable code on a medium on which the program is recorded. Examples of processor-readable media include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices; the code may also be implemented in the form of a carrier wave (e.g., transmission over the Internet).

The configurations and methods of the embodiments described above are not applied to the terminal device 10 in a limited manner; all or some of the embodiments may be selectively combined so that various modifications can be made.

510:
520:
530: Camera
540: Light source
550: memory
560:
570:

Claims (16)

Turning on the display unit when the proximity of the user is sensed while the display unit is off; And
And outputting a main screen through the display unit,
The configuration of the main screen is automatically adjusted according to the current time or whether the user is logged in,
Wherein the main screen includes information of the user's skin diagnosis result after the user logs in.
The method according to claim 1,
If the current time is the first time zone, the first type content is included in the main screen.
3. The method of claim 2,
Wherein the first type content includes at least one of skin management method information related to weather or makeup method information related to weather.
3. The method according to claim 1 or 2,
And the second type content is included in the main screen when the current time is the second time zone.
5. The method of claim 4,
Wherein the second type content includes a popular moving image selected by the server.
delete

The method according to claim 1,
Wherein the main screen includes the latest skin diagnosis result information if the predetermined time has not elapsed since the user performed the last skin diagnosis.
The method according to claim 1,
Wherein the main screen includes information on skin diagnosis cumulative results for the latest N times, if a predetermined time has elapsed since the last skin diagnosis was performed.
A display unit; And
A control unit for turning on the display unit and controlling the main screen to be output through the display unit when the proximity of the user is detected in a state in which the display unit is off,
wherein the configuration of the main screen is automatically adjusted according to the current time or whether the user is logged in,
Wherein the main screen includes information of the skin diagnosis result of the user after the user logs in.
10. The method of claim 9,
Wherein if the current time is a first time zone, the first type content is included in the main screen.
11. The method of claim 10,
Wherein the first type content includes at least one of skin management method information related to weather or makeup method information related to weather.
11. The method according to claim 9 or 10,
And the second type content is included in the main screen when the current time is the second time zone.
13. The method of claim 12,
Wherein the second type content includes a popular moving image selected by the server.
delete

10. The method of claim 9,
Wherein the main screen includes the latest skin diagnosis result information if the predetermined time has not elapsed since the user performed the last skin diagnosis.
10. The method of claim 9,
Wherein the main screen includes the skin diagnosis cumulative result information for the latest N times, if a predetermined time has elapsed since the user performed the last skin diagnosis.
KR1020150109117A 2015-07-31 2015-07-31 Terminal device and controlling method thereof KR101684272B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150109117A KR101684272B1 (en) 2015-07-31 2015-07-31 Terminal device and controlling method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150109117A KR101684272B1 (en) 2015-07-31 2015-07-31 Terminal device and controlling method thereof

Publications (1)

Publication Number Publication Date
KR101684272B1 true KR101684272B1 (en) 2016-12-08

Family

ID=57576650

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150109117A KR101684272B1 (en) 2015-07-31 2015-07-31 Terminal device and controlling method thereof

Country Status (1)

Country Link
KR (1) KR101684272B1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20050049327A (en) * 2003-11-21 2005-05-25 삼성전자주식회사 Method for displaying information in mobile phone
KR20080044655A (en) * 2006-11-17 2008-05-21 엘지전자 주식회사 Method of display in mobile terminal


Similar Documents

Publication Publication Date Title
KR101668348B1 (en) Method for analyzing skin surface and apparatus therefor
US10939076B2 (en) Streaming and storing video for audio/video recording and communication devices
US11873952B2 (en) Smart-home device light rings with tapered sections for uniform output
US11819108B2 (en) Smart mirror system and methods of use thereof
KR101661588B1 (en) Method for analyzing skin surface and apparatus therefor
CN208985034U (en) Electronic equipment with sensor and display equipment
JP2010004118A (en) Digital photograph frame, information processing system, control method, program, and information storage medium
KR100695053B1 (en) Mirror apparatus having display function
US20140375828A1 (en) Apparatus, systems, and methods for capturing and displaying an image
CN102577367A (en) Time shifted video communications
KR20180080140A (en) Personalized skin diagnosis and skincare
KR101715063B1 (en) Dressing table
JP2006330011A (en) Information presenting device
CN111109959A (en) Intelligent cosmetic mirror, control method thereof, controller and storage medium
CN208013970U (en) A kind of living creature characteristic recognition system
KR101753633B1 (en) Terminal device and method for comparing skin dignosis results using thereof
KR101701210B1 (en) Method for outputting an skin analysing result, apparatus and application therefor
KR101684272B1 (en) Terminal device and controlling method thereof
KR101648049B1 (en) Dressing table and controlling method thereof
JP2018072617A (en) Display control device, display system, display control method and computer program
EP3641319A1 (en) Displaying content on a display unit

Legal Events

Date Code Title Description
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20191115

Year of fee payment: 6