KR101684272B1 - Terminal device and controlling method thereof - Google Patents
Terminal device and controlling method thereof
- Publication number
- KR101684272B1 (Application: KR1020150109117A)
- Authority
- KR
- South Korea
- Prior art keywords
- user
- main screen
- skin
- skin diagnosis
- information
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Business, Economics & Management (AREA)
- General Physics & Mathematics (AREA)
- Tourism & Hospitality (AREA)
- Human Computer Interaction (AREA)
- General Health & Medical Sciences (AREA)
- Strategic Management (AREA)
- General Business, Economics & Management (AREA)
- Primary Health Care (AREA)
- Marketing (AREA)
- Human Resources & Organizations (AREA)
- Economics (AREA)
- Health & Medical Sciences (AREA)
- Measuring And Recording Apparatus For Diagnosis (AREA)
Abstract
Description
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to a terminal apparatus and a control method thereof.
With increasing interest in cosmetics, public interest in skin care is also increasing. Individuals try to maintain an optimal skin condition, whether by receiving medical treatment or by using cosmetics, in order to prevent skin aging or skin troubles.
However, even good cosmetics lose their beautifying effect if they are not used properly. Recently, various cosmetic methods have been introduced through various media, for example programs that guide a person through a cosmetic procedure.
For this purpose, it is necessary to provide the user with information such as a skin care method or a makeup method. However, since skin management methods and makeup methods differ depending on the time of day and the user, it is more useful to provide the information the user needs for each situation, rather than providing the same information in every situation.
SUMMARY OF THE INVENTION The present invention has been made to solve the above problems, and it is an object of the present invention to provide a terminal device and a control method therefor which improve user convenience.
More specifically, the present invention provides a terminal apparatus and a control method thereof which can provide information useful to the user in each time zone by varying the screen configuration according to the time of day.
It is another object of the present invention to provide a terminal device capable of outputting the skin condition information of a user after the user logs in, thereby guiding the user toward thorough skin care, and a control method thereof.
It is to be understood that the technical objects to be achieved by the present invention are not limited to the objects mentioned above, and that other objects not mentioned will be clearly understood by those skilled in the art from the following description.
According to one aspect of the present invention, there is provided a method of controlling a terminal device, the method comprising: turning on a display unit when the proximity of a user is detected in a state in which the display unit is off; and outputting a main screen through the display unit. At this time, the configuration of the main screen can be automatically adjusted according to the current time or whether the user is logged in.
According to another aspect of the present invention, there is provided a terminal apparatus including: a display unit; and a controller for turning on the display unit and controlling a main screen to be output through the display unit when the proximity of the user is detected in a state in which the display unit is off. At this time, the configuration of the main screen can be automatically adjusted according to the current time or whether the user is logged in.
The technical solutions obtained by the present invention are not limited to the above-mentioned solutions, and other solutions not mentioned will be clearly understood by those skilled in the art from the following description.
The effects obtainable by the present invention are not limited to the above-mentioned effects, and other effects not mentioned will be clearly understood by those skilled in the art from the following description.
FIGS. 1 and 2 are views showing a terminal device.
FIG. 3 is an exploded perspective view of a terminal device according to an embodiment of the present invention.
FIG. 4 is a diagram showing an example in which the first light source unit and the second light source unit are disposed.
FIG. 5 is a block diagram of a terminal device based on electronic components that can be inserted into a terminal device.
FIGS. 6 and 7 are diagrams showing an example in which user log-on proceeds.
FIG. 8 is a diagram illustrating an example in which user information is input through a setting screen.
FIG. 9 is a diagram showing an example in which a photograph for skin diagnosis is taken.
FIG. 10 is a diagram showing an example in which a skin diagnosis result is output.
FIG. 11 is a diagram showing another example in which a skin diagnosis result is output.
FIG. 12 is a flowchart for explaining the operation of the terminal device according to the present invention.
FIG. 13 is a diagram showing an example in which an idle screen is output.
FIGS. 14 to 17 are diagrams showing examples in which a main screen is output.
Hereinafter, a terminal device related to the present invention will be described in detail with reference to the drawings.
The suffixes "module" and "part" for the components used in the following description are given or used interchangeably in consideration of ease of description, and do not by themselves have distinct meanings or roles.
The embodiments described below can be applied to various electronic apparatuses. That is, the makeup recommendation method according to the present invention can be applied not only to mobile terminals carried by a user, such as mobile phones, smart phones, digital cameras, PDAs, laptops, and MP3 players, but also to fixed terminals normally used in a fixed position. For convenience of explanation, in the following embodiments, the electronic device to which the cosmetic recommendation method according to the present invention is applied is exemplified by a terminal device capable of storing cosmetics.
FIGS. 1 and 2 are views showing a terminal device according to the present invention.
The terminal device may include a display unit.
A part of the terminal device may be equipped with a cosmetic refrigerator for storing cosmetics, and a mirror may be attached to the front of the device.
Hereinafter, for convenience of explanation, the portion of the terminal device corresponding to the cosmetic refrigerator will be referred to as the refrigerating part.
FIG. 3 is an exploded perspective view of a terminal device according to an embodiment of the present invention.
At this time, the reflecting surface of the mirror may face the user, and it is preferable that the transmission surface of the mirror is placed over the display unit so that the output of the display unit can be seen through the mirror.
A part of the back surface of the first mirror part 310-1 can engage with the front case.
The light emitting units constituting the first light source unit and the light emitting units constituting the second light source unit may be alternately arranged, or the arrays constituting the first light source unit and the arrays constituting the second light source unit may be arranged side by side.
For example, FIG. 4 shows an example in which the first light source unit and the second light source unit are disposed. As in the example shown in FIG. 4 (a), the light emitting units constituting the first light source unit and the light emitting units constituting the second light source unit may be arranged alternately one by one, or, as in the example shown in FIG. 4 (b), the arrays constituting the first light source unit and the arrays constituting the second light source unit may be arranged side by side.
The first light source unit and the second light source unit may also be disposed in a manner different from the illustrated examples. For example, the light emitting units constituting the first light source unit and the light emitting units constituting the second light source unit may be alternately arranged at a ratio of N:M (where N and M are different natural numbers), and the two light source units may be disposed at different positions; for example, the first light source unit may be disposed in the front case and the second light source unit in the door.
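The N:M interleaving described above can be sketched as a small helper; this is an illustrative model (the function name and list representation are my own, not from the patent).

```python
def interleave_light_units(first, second, n=1, m=1):
    """Interleave the light-emitting units of two light sources at an N:M ratio.

    `first` and `second` are sequences of unit labels; `n` and `m` are the
    counts taken alternately from each source (both assumed >= 1).
    """
    out = []
    i = j = 0
    while i < len(first) or j < len(second):
        out += first[i:i + n]   # take up to N units from the first source
        i += n
        out += second[j:j + m]  # then up to M units from the second source
        j += m
    return out
```

With `n=m=1` this yields the one-by-one alternation of FIG. 4 (a); uneven ratios such as 2:1 produce the generalized arrangement described above.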
Electronic parts such as a touch screen, a communication unit, a memory, a control unit, and a camera can be mounted in the terminal device.
The refrigerating part may be cooled using a thermoelectric element. In order to dissipate the heat of the high-temperature portion of the thermoelectric element, a heat dissipation structure such as a heat sink may be provided.
FIG. 5 is a block diagram of a terminal device based on electronic components that can be inserted into a terminal device. Referring to FIG. 5, a terminal device according to the present invention may include a display unit, a communication unit, a camera, a light source, a memory, a sensing unit, and a control unit.
The communication unit performs communication with external devices such as the skin management server. For this, a short-range communication technology (e.g., Bluetooth, Zigbee, or NFC (Near Field Communication)), a mobile communication technology (e.g., LTE (Long Term Evolution) or HSDPA (High Speed Downlink Packet Access)), a wireless LAN technology (e.g., WLAN (Wireless Local Area Network) or Wi-Fi (Wireless Fidelity)), or a wired communication technology may be applied.
The image data collected via the camera can be used for the skin diagnosis described below.
The operation of the terminal according to the present invention will be described in detail with reference to the above description.
In order to use the service provided by the terminal device according to the present invention, the user can first try to log on (or log in) through the registered account. Specifically, when account information such as an ID and a password for using a service provided by the terminal device is input, the control unit can attempt to log on the user based on the inputted account information. The control unit may transmit the user ID and the password to the skin management server and receive the user login result from the skin management server.
6 and 7 are diagrams showing an example in which user log-on proceeds.
When the terminal apparatus is started, the control unit can first control a plurality of guidance messages introducing the services provided by the terminal apparatus to be output sequentially, as in the example shown in FIG. 6.
FIGS. 6A to 6C illustrate messages indicating that services such as skin diagnosis, skin condition management, skin care methods, and customized cosmetic recommendation can be provided being output sequentially.
While any one of the plurality of guidance messages is being output, if a touch input in which a pointer touching the display unit is dragged in a predetermined direction is received, or if a predetermined time has elapsed since the guidance message was output, the control unit can control the guidance message of the next order to be output.
For example, as in the example shown in FIG. 6 (a), while a first guidance message indicating that a skin diagnosis service can be provided is being output, if a touch input in which a pointer touching the display unit is dragged in a predetermined direction is received, or if a predetermined time has elapsed since the first guidance message was output, the control unit can stop outputting the first guidance message and, as in the example shown in FIG. 6 (b), control a second guidance message indicating that a skin condition management service can be provided to be output.
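The rotation just described (advance on a drag input or after a fixed display time) can be sketched as a small state holder. This is an illustrative sketch; the class name, the 5-second dwell time, and the stop-at-last-message behavior are assumptions, not details from the patent.

```python
import time

class GuideCarousel:
    """Rotate guidance messages on a drag input or after a dwell timeout."""

    def __init__(self, messages, dwell_seconds=5.0, clock=time.monotonic):
        self.messages = list(messages)
        self.dwell_seconds = dwell_seconds
        self.clock = clock          # injectable for testing
        self.index = 0
        self.shown_at = self.clock()

    def current(self):
        return self.messages[self.index]

    def _advance(self):
        # Move to the next message; the last message stays up until Start is pressed.
        if self.index < len(self.messages) - 1:
            self.index += 1
            self.shown_at = self.clock()
            return True
        return False

    def on_drag(self):
        # Pointer dragged in the predetermined direction.
        return self._advance()

    def tick(self):
        # Called periodically; advances once the dwell time has elapsed.
        if self.clock() - self.shown_at >= self.dwell_seconds:
            return self._advance()
        return False
```

Either trigger (drag or timeout) funnels into the same `_advance` step, mirroring the "or" condition in the paragraph above.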
When all of the guidance messages have been output, the output of the guidance messages is stopped, and when a user input for starting use of the service (for example, a touch input touching the start button shown in FIG. 6) is received, a log-on screen can be output as in the example shown in FIG. 7.
When the account information such as the ID and the password is inputted, the control unit can request the user to log on to the skin diagnosis server based on the inputted account information.
If the user is a non-member, the user can select the membership registration menu and sign up for a new account.
When the user logs on to the terminal device for the first time, the control unit can control to output a setting screen for receiving user information.
For example, FIG. 8 shows an example in which user information is input through a setting screen. First, as in the example shown in FIG. 8A, the control unit may output a screen requesting a profile image of the user. The face registration request screen may include a photographing button for taking a photograph.
When a photograph is taken through the photographing button, the control unit can register the photographed image as the user's profile image, as in the example shown in FIG. 8B.
If the profile image associated with the user account is already registered, the profile image registration procedure shown in Figs. 8A and 8B may be omitted.
Thereafter, the control unit may control to output a selection screen for acquiring skin information of the user.
In FIGS. 8 (c) to 8 (e), a screen for selecting the user's skin type, a screen for selecting a skin tone, and a screen for selecting skin troubles are illustrated. Based on the user input on each screen, the skin information of the user can be obtained.
When the acquisition of the user information is completed, the control unit can provide skin diagnosis information to the logged-in user.
However, the log-on process is not a necessary procedure for receiving a service provided by the terminal device. For example, the control unit may provide a skin diagnosis service to a user who is not logged on through a guest account.
When the skin diagnosis service is activated by user input, the control unit can diagnose the skin condition of the user based on the user's face photographed through the camera. Specifically, when the user's face is recognized through the camera, the control unit can perform a skin diagnosis covering skin troubles, pores, dullness, wrinkles, and skin tone based on the recognized face.
For example, FIG. 9 is a view showing an example in which a photograph for skin diagnosis is taken.
When the skin diagnosis service is activated, the control unit can output a preview image input through the camera on the display unit, and control a guide area indicating where the face should be positioned to be displayed over the preview image.
When it is recognized that the user's face is located within the guide area, the control unit can take a photograph for skin diagnosis.
When the photograph is photographed, the control unit can diagnose the user's skin condition based on the photographed photograph.
During the skin analysis, the control unit can control a screen indicating that the analysis is in progress to be output.
When the skin diagnosis is completed, the control unit may quantify the results of at least one diagnosis item and provide the skin diagnosis result to the user.
For example, FIG. 10 is a diagram showing an example in which a skin diagnosis result is output.
When the skin analysis is completed, the control unit can control the evaluation value for each analysis item to be output, as in the example shown in FIG. 10. In FIG. 10, evaluation values for five analysis items, such as trouble, pores, dullness, wrinkles, and skin tone, are illustrated.
The control unit can control figure objects 1041, 1043, 1045, 1047, and 1049, each representing one analysis item, to be additionally output. Each figure object may be labeled with the name of the analysis item mapped to it.
At this time, based on the evaluation value of each analysis item, the control unit can determine the size and the color of the graphic object representing that analysis item.
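One way to realize the size-and-color mapping above is to scale a score linearly into a pixel range and bucket it into a color. The pixel bounds, the 0.4/0.7 thresholds, and the direction of the mapping (higher score, larger and greener bubble) are illustrative assumptions, not values from the patent.

```python
def figure_style(score, max_score=100.0, min_px=24, max_px=96):
    """Map an evaluation score to a bubble diameter (px) and a traffic-light color."""
    frac = max(0.0, min(1.0, score / max_score))       # clamp to [0, 1]
    size = round(min_px + frac * (max_px - min_px))    # linear size scaling
    if frac >= 0.7:
        color = "green"      # good condition
    elif frac >= 0.4:
        color = "yellow"
    else:
        color = "red"        # poor condition
    return size, color
```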
11 is a diagram showing another example in which a skin diagnosis result is output.
A reference figure 1110 may be displayed with each skin diagnosis item located at one of its vertices, and the controller can represent the evaluation score of each skin diagnosis item as a distance from the center of the reference figure toward the corresponding vertex. For example, in FIG. 11, the closer a plotted point is to its vertex, the higher the evaluation score of that item.
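This radar-style presentation (one vertex per diagnosis item, score encoded as distance from the center) can be computed as below; the angular layout, normalization, and function name are assumptions for illustration.

```python
import math

def radar_vertices(scores, max_score=100.0, radius=1.0):
    """One vertex per diagnosis item; the distance from the center encodes the score."""
    n = len(scores)
    vertices = []
    for i, score in enumerate(scores):
        angle = math.pi / 2 - 2 * math.pi * i / n   # first item at the top, clockwise
        r = radius * max(0.0, min(1.0, score / max_score))
        vertices.append((r * math.cos(angle), r * math.sin(angle)))
    return vertices
```

Connecting the returned points in order draws the filled polygon inside the reference figure; a full score for every item reproduces the reference figure itself.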
When the skin diagnosis is performed while the user is logged on, the control unit can store the user's skin diagnosis result in association with the user account information. Accordingly, the user can inquire at any time the skin diagnosis result associated with his or her account accumulated for a predetermined period of time.
Alternatively, if a skin diagnosis is performed while the user is not logged on, the control unit may output the user's skin diagnosis result and then immediately discard it. Accordingly, the user will not be able to later view the result of a skin diagnosis performed without logging on.
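The store-or-discard rule of the two paragraphs above can be sketched as follows; the `session` and `store` shapes are hypothetical, chosen only to illustrate the account-keyed persistence.

```python
def handle_diagnosis_result(result, session, store):
    """Persist a diagnosis result only when a user is logged on.

    `session` is a dict with an optional "user_id"; `store` maps user ids to
    their accumulated results. Both shapes are hypothetical.
    """
    user_id = session.get("user_id")
    if user_id:
        store.setdefault(user_id, []).append(result)   # query-able later (FIG. 17)
        return "stored"
    return "discarded"   # guest diagnosis: shown once, never persisted
```

Keying the store by account id is what lets a logged-on user later browse the cumulative results shown in FIG. 17, while a guest's result leaves no trace.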
The operation of the terminal device according to the present invention will be described in detail based on the above description.
12 is a flowchart for explaining the operation of the terminal device according to the present invention.
First, it is assumed that the display unit of the terminal device is initially in an off state. Here, the off state means a state in which no power is supplied to the display unit and no graphic object is output on the display unit. However, even if the display unit is off, the touch sensor forming a mutual layer structure with the display unit may wake up at predetermined intervals. Accordingly, even if the display unit is off, the display unit can remain capable of receiving touch input.
If a user approaching the terminal device is detected (S1210), the control unit switches the display unit from the off state to the on state (S1220) and controls the display unit to output a standby screen (S1230). At this time, the control unit may sense the user's approach through an image input through the camera, or through a sensing signal of the sensing unit.
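Steps S1210 to S1240 amount to a small display state machine; a minimal sketch follows (the state names and class shape are my own, not from the patent).

```python
class DisplayController:
    """Minimal state machine for steps S1210 to S1240."""
    OFF, IDLE, MAIN = "off", "idle", "main"

    def __init__(self):
        self.state = self.OFF   # display starts powered down

    def on_proximity(self):
        # S1210/S1220/S1230: a user approaching (detected via the camera
        # image or the sensing unit) wakes the display and brings up the
        # idle (standby) screen.
        if self.state == self.OFF:
            self.state = self.IDLE

    def on_touch(self):
        # S1240: touching the idle screen switches to the main screen.
        if self.state == self.IDLE:
            self.state = self.MAIN
```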
13 is a diagram showing an example in which an idle screen is output. When the user's access is detected, the control unit may control the display unit to output an idle screen as in the example shown in FIG. Through the idle screen, time information, weather information, communication status information, and the like can be output.
In the example shown in FIG. 13, time information and weather information are output through the idle screen.
The control unit can receive weather information from a weather information providing server and output it on the standby screen. If the communication state is poor and weather information cannot be received from the server, a message indicating that weather information cannot be received due to the poor communication state may be output instead of the weather information.
When a user input touching the screen is received while the standby screen is being output (S1240), the control unit can stop the output of the standby screen and control the main screen to be output. At this time, the control unit may determine the information to be output through the main screen based on factors such as whether the user is logged on and the current time.
14 to 17 are diagrams showing an example in which a main screen is outputted.
First, FIG. 14 is a diagram showing a configuration of a main screen of morning time before log-on.
If the user has not logged on to the terminal device and the current time corresponds to the morning time zone (S1250), the control unit outputs detailed date and time information together with the first type content, as in the example shown in FIG. 14 (a) (S1260). Here, the first type content may be content selected in association with today's weather, such as skin care method information or makeup method information according to the weather.
In the example shown in FIG. 14 (a), assuming that the display unit is divided into a first area on the left side and a second area on the right side with a virtual line as the boundary, date and time information is output in the first area, and skin management method information, including weather information such as the precipitation probability, the humidity, and the skin index, together with a message guiding a skin management method according to the weather, is output in the second area.
For example, in FIG. 14 (a), control buttons are output through the first area and a start button is output through the second area. The control buttons output through the first area are illustrated as including a login button and a self-camera button, among others.
If the user does not log on to the terminal device and the current time corresponds to the afternoon time zone (S1250), the control unit may control to output the second type of content instead of the first type content (S1270).
For example, FIG. 15 is a diagram showing the configuration of the main screen of the afternoon time zone before logon.
In the afternoon, the user is likely to have already gone out, so weather-based skin management method information or makeup method information is likely to be of little further use. Accordingly, the control unit can output the second type content instead of the first type content in the afternoon.
For example, in FIG. 15, a skin management video with a high view count is output through the second area. By outputting content that is popular among many users (i.e., with a high number of views) instead of weather-related content, the terminal device can attract users and further increase user interest.
FIGS. 14 and 15 illustrate that the first type content is output in the morning time zone and the second type content is output in the afternoon time zone. However, the time zone in which each content is output may be set differently from the illustrated example.
In a state in which the main screen is being output, the user can proceed to log on to use the service provided by the terminal device. For example, when the login button shown in FIGS. 14 and 15 is touched, the control unit may output a login page for user logon. When the account information is inputted through the login page, the control unit can perform log-on based on the inputted account information (S1280).
When the user log-on is completed, the control unit may control the skin diagnosis result of the user to be output instead of the first or second type content.
For example, FIG. 16 is a diagram showing an example in which a skin diagnosis result of a user is output.
If it is determined that the predetermined period has not elapsed since the most recent skin diagnosis was performed (S1290), the control unit may control the latest skin diagnosis result of the user to be output, as in the example shown in FIG. 16 (S1292). In FIG. 16, the diagnosis results for evaluation items such as trouble, pores, dullness, wrinkles, and skin tone are output together with the time elapsed since the last skin diagnosis was performed.
If the predetermined period has elapsed since the user performed the most recent skin diagnosis (S1290), the controller may control the cumulative result of accumulating the user's skin diagnosis result to be outputted (S1294).
For example, FIG. 17 is a diagram showing an example in which a cumulative skin diagnosis result of a user is output.
If a predetermined period has elapsed since the user performed the most recent skin diagnosis, the controller may control the cumulative result of accumulating the skin diagnosis result of the user to be output as in the example shown in FIG. In FIG. 17, a graph is shown in which a skin diagnosis result from February 10 to February 28 is accumulated.
When an arbitrary item on the graph is touched, the control unit can control the skin diagnosis result corresponding to the touched item to be output. For example, when an item corresponding to Feb. 28 is touched in the graph shown in FIG. 17, the control unit can output the skin diagnosis result of the selected date.
It is possible to induce the user to perform skin diagnosis periodically by changing the information output through the display unit according to the time elapsed since the user last performed a skin diagnosis.
If the user has no record of performing a skin diagnosis, or if a long period of time (i.e., a second predetermined period) has elapsed since the user last performed a skin diagnosis, the control unit can control the first type content or the second type content to be output according to the current time, even though the user is logged on.
When the user logs on to the terminal device, the control unit can control the log-on state to be maintained even if the display unit is turned off. If the proximity of the user is detected and the user recognized through the camera is determined to be the same person as the user of the logged-on account, the control unit continuously maintains the log-on state and can control the skin diagnosis result information to be output, as in the examples shown in FIGS. 16 and 17.
If it is determined that the recognized user is different from the user of the logged-on account, the control unit logs out the logged-on account and can then control the first or second type content to be output, as in the examples shown in FIGS. 14 and 15.
According to an embodiment of the present invention, the above-described methods (operation flowcharts) can be implemented as a program such as a computer program or an application, or as processor-readable code on a medium on which the program is recorded. Examples of processor-readable media include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and the code may also be implemented in the form of a carrier wave (e.g., transmission over the Internet).
510:
520:
530: Camera
540: Light source
550: memory
560:
570:
Claims (16)
A method of controlling a terminal device, the method comprising: turning on a display unit when the proximity of a user is detected in a state in which the display unit is off; and outputting a main screen through the display unit,
The configuration of the main screen is automatically adjusted according to the current time or whether the user is logged in,
Wherein the main screen includes information of the user's skin diagnosis result after the user logs in.
If the current time is the first time zone, the first type content is included in the main screen.
Wherein the first type content includes at least one of skin management method information related to weather or makeup method information related to weather.
And the second type content is included in the main screen when the current time is the second time zone.
Wherein the second type content includes a popular moving image selected by the server.
Wherein the main screen includes the latest skin diagnosis result information if the predetermined time has not elapsed since the user performed the last skin diagnosis.
Wherein the main screen includes information on skin diagnosis cumulative results for the latest N times, if a predetermined time has elapsed since the last skin diagnosis was performed.
A terminal device comprising: a display unit; and a control unit for turning on the display unit and controlling a main screen to be output through the display unit when the proximity of a user is detected in a state in which the display unit is off,
The configuration of the main screen is automatically adjusted according to the current time or whether the user is logged in,
Wherein the main screen includes information of the skin diagnosis result of the user after the user logs in.
Wherein if the current time is a first time zone, the first type content is included in the main screen.
Wherein the first type content includes at least one of skin management method information related to weather or makeup method information related to weather.
And the second type content is included in the main screen when the current time is the second time zone.
Wherein the second type content includes a popular moving image selected by the server.
Wherein the main screen includes the latest skin diagnosis result information if the predetermined time has not elapsed since the user performed the last skin diagnosis.
Wherein the main screen includes the skin diagnosis cumulative result information for the latest N times, if a predetermined time has elapsed since the user performed the last skin diagnosis.
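The screen-composition behavior recited in the claims can be illustrated with a minimal sketch. All names, time-zone ranges, and thresholds below are hypothetical, since the claims deliberately leave the concrete "first time zone", "second time zone", "predetermined time", and N unspecified:

```python
from datetime import datetime, timedelta

# Hypothetical constants; the claims do not fix these values.
FIRST_TIME_ZONE = range(6, 12)    # e.g. morning hours -> first-type content
SECOND_TIME_ZONE = range(18, 24)  # e.g. evening hours -> second-type content
RESULT_TTL = timedelta(days=7)    # the "predetermined time" after the last diagnosis
N = 5                             # number of diagnoses in the cumulative result

def compose_main_screen(now, logged_in, last_diagnosis_at):
    """Return the list of content items for the main screen, per the claims."""
    screen = []
    if now.hour in FIRST_TIME_ZONE:
        # First-type content: weather-related skin care / makeup information.
        screen.append("weather_skin_care_and_makeup_info")
    elif now.hour in SECOND_TIME_ZONE:
        # Second-type content: popular video selected by the server.
        screen.append("server_selected_popular_video")
    if logged_in and last_diagnosis_at is not None:
        if now - last_diagnosis_at <= RESULT_TTL:
            # Recent diagnosis: show only the latest result.
            screen.append("latest_skin_diagnosis_result")
        else:
            # Stale diagnosis: show cumulative results for the latest N diagnoses.
            screen.append(f"cumulative_results_last_{N}_diagnoses")
    return screen
```

The sketch also makes the structure of the two claim sets visible: the time-zone branch is independent of the login branch, so the main screen can combine time-dependent content with diagnosis-result content.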
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150109117A KR101684272B1 (en) | 2015-07-31 | 2015-07-31 | Terminal device and controlling method thereof |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150109117A KR101684272B1 (en) | 2015-07-31 | 2015-07-31 | Terminal device and controlling method thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
KR101684272B1 (en) | 2016-12-08 |
Family
ID=57576650
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150109117A KR101684272B1 (en) | 2015-07-31 | 2015-07-31 | Terminal device and controlling method thereof |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101684272B1 (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20050049327A (en) * | 2003-11-21 | 2005-05-25 | 삼성전자주식회사 | Method for displaying information in mobile phone |
KR20080044655A (en) * | 2006-11-17 | 2008-05-21 | 엘지전자 주식회사 | Method of display in mobile terminal |
2015-07-31: KR application KR1020150109117A filed; patent KR101684272B1, status: active IP Right Grant
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20050049327A (en) * | 2003-11-21 | 2005-05-25 | 삼성전자주식회사 | Method for displaying information in mobile phone |
KR20080044655A (en) * | 2006-11-17 | 2008-05-21 | 엘지전자 주식회사 | Method of display in mobile terminal |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101668348B1 (en) | Method for analyzing skin surface and apparatus therefor | |
US10939076B2 (en) | Streaming and storing video for audio/video recording and communication devices | |
US11873952B2 (en) | Smart-home device light rings with tapered sections for uniform output | |
US11819108B2 (en) | Smart mirror system and methods of use thereof | |
KR101661588B1 (en) | Method for analyzing skin surface and apparatus therefor | |
CN208985034U (en) | Electronic equipment with sensor and display equipment | |
JP2010004118A (en) | Digital photograph frame, information processing system, control method, program, and information storage medium | |
KR100695053B1 (en) | Mirror apparatus having display function | |
US20140375828A1 (en) | Apparatus, systems, and methods for capturing and displaying an image | |
CN102577367A (en) | Time shifted video communications | |
KR20180080140A (en) | Personalized skin diagnosis and skincare | |
KR101715063B1 (en) | Dressing table | |
JP2006330011A (en) | Information presenting device | |
CN111109959A (en) | Intelligent cosmetic mirror, control method thereof, controller and storage medium | |
CN208013970U (en) | A kind of living creature characteristic recognition system | |
KR101753633B1 (en) | Terminal device and method for comparing skin dignosis results using thereof | |
KR101701210B1 (en) | Method for outputting an skin analysing result, apparatus and application therefor | |
KR101684272B1 (en) | Terminal device and controlling method thereof | |
KR101648049B1 (en) | Dressing table and controlling method thereof | |
JP2018072617A (en) | Display control device, display system, display control method and computer program | |
EP3641319A1 (en) | Displaying content on a display unit |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| GRNT | Written decision to grant | |
2019-11-15 | FPAY | Annual fee payment | Year of fee payment: 6 |