US20120223956A1 - Information processing apparatus, information processing method, and computer-readable storage medium - Google Patents

Information processing apparatus, information processing method, and computer-readable storage medium

Info

Publication number
US20120223956A1
Authority
US
United States
Prior art keywords
makeup
image
facial
face
data representing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/400,980
Inventor
Mari Saito
Tatsuki Kashitani
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Assigned to SONY CORPORATION reassignment SONY CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KASHITANI, TATSUKI, SAITO, MARI
Publication of US20120223956A1 publication Critical patent/US20120223956A1/en
Priority to US14/980,630 priority Critical patent/US10945514B2/en
Priority to US17/189,915 priority patent/US20210177124A1/en

Classifications

    • AHUMAN NECESSITIES
    • A45HAND OR TRAVELLING ARTICLES
    • A45DHAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
    • A45D44/00Other cosmetic or toiletry articles, e.g. for hairdressers' rooms
    • A45D44/005Other cosmetic or toiletry articles, e.g. for hairdressers' rooms for selecting or displaying personal cosmetic colours or hairstyle
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014Hand-worn input/output arrangements, e.g. data gloves
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/002D [Two Dimensional] image generation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • AHUMAN NECESSITIES
    • A45HAND OR TRAVELLING ARTICLES
    • A45DHAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
    • A45D44/00Other cosmetic or toiletry articles, e.g. for hairdressers' rooms
    • A45D2044/007Devices for determining the condition of hair or skin or for selecting the appropriate cosmetic or hair treatment

Definitions

  • the present disclosure relates to an information processing apparatus, an information processing method, an information processing system, and a program.
  • an apparatus for generating output image data comprising a receiving unit configured to receive image data representing an input image, the input image containing at least one facial image.
  • the apparatus further comprises a recognition unit configured to recognize the facial image in the image data, and recognize facial features of the facial image.
  • the apparatus further comprises a makeup image generation unit configured to generate data representing a makeup image based on the recognized facial features, the makeup image providing information assisting in the application of makeup.
  • the apparatus also comprises a display generation unit configured to generate output image data representing the makeup image superimposed on the facial image.
  • a method for generating output image data comprises receiving image data representing an input image, the input image containing at least one facial image.
  • the method further comprises recognizing the facial image in the image data, and recognizing facial features of the facial image.
  • the method further comprises generating data representing a makeup image based on the recognized facial features, the makeup image providing information assisting in the application of makeup.
  • the method also comprises generating output image data representing the makeup image superimposed on the facial image.
  • a tangibly-embodied non-transitory computer-readable storage medium storing instructions which, when executed by a processor, cause a computer to perform a method for generating output image data.
  • the method comprises receiving image data representing an input image, the input image containing at least one facial image.
  • the method further comprises recognizing the facial image in the image data, and recognizing facial features of the facial image.
  • the method further comprises generating data representing a makeup image based on the recognized facial features, the makeup image providing information assisting in the application of makeup.
  • the method also comprises generating output image data representing the makeup image superimposed on the facial image.
  • an apparatus for generating output image data comprises receiving means for receiving image data representing an input image, the input image containing at least one facial image.
  • the apparatus further comprises recognition means for recognizing the facial image in the image data, and recognizing facial features of the facial image.
  • the apparatus further comprises makeup image generation means for generating data representing a makeup image based on the recognized facial features, the makeup image providing information assisting in the application of makeup.
  • the apparatus also comprises display generation means for generating output image data representing the makeup image superimposed on the facial image
  • an information processing apparatus for improving a makeup support scheme.
  • FIG. 1 is a diagram showing an overview of a makeup support apparatus according to a first embodiment
  • FIG. 2 is a block diagram showing an example of a configuration of the makeup support apparatus according to the first embodiment
  • FIG. 3 is a flowchart showing a makeup support process performed by the makeup support apparatus according to the first embodiment
  • FIG. 4 is a flowchart illustrating a simulation presenting process shown in FIG. 3 ;
  • FIG. 5 is a diagram illustrating an example of display control of the makeup support apparatus according to the first embodiment
  • FIG. 6 is a diagram showing a positional relationship between the makeup support apparatus according to the first embodiment and a user
  • FIG. 7 is a diagram showing an example of makeup scheme information
  • FIG. 8 is a diagram illustrating an example of display control of the makeup support apparatus according to the first embodiment
  • FIG. 9 is a block diagram showing an example of a configuration of a makeup support apparatus in variant 1 ;
  • FIG. 10 is a diagram showing a display screen of a makeup support apparatus in variant 2 ;
  • FIG. 11 is a diagram showing an overview of a makeup support system according to a second embodiment
  • FIG. 12 is a diagram showing an example of a configuration of the makeup support apparatus according to the second embodiment.
  • FIG. 13 is a diagram showing an overview of a makeup support system according to a third embodiment
  • FIG. 14 is a diagram showing an example of a configuration of the makeup support apparatus according to the third embodiment.
  • FIG. 15 is a diagram showing an overview of a makeup support system according to a fourth embodiment.
  • FIG. 16 is a diagram showing an example of a configuration of the makeup support apparatus according to the fourth embodiment.
  • in recent years, technology called augmented reality (AR) for superimposing additional information onto the real world and presenting the information to a user has been attracting attention. The information presented to the user may be visualized using various forms of virtual objects such as text, icons or animation.
  • a primary application field of the AR technology is the support of user activities in the real world.
  • the AR technology is applied to a makeup support scheme. This can improve a makeup scheme of a user.
  • the makeup support scheme using the AR technology is applicable to a makeup support system for simulating a state after makeup completion and presenting a simulation result to a user. Further, in the system, the makeup support scheme using the AR technology displays a procedure during makeup as well as after makeup completion in consideration of the difficulty a general user has in applying actual makeup according to the simulation, thereby further improving the makeup support scheme.
  • the makeup support apparatus is a tablet-type terminal including a touch panel display.
  • a display unit 18 not only has a display function, but also a function of a manipulation unit 19 for receiving a manipulation input from a user.
  • a face image of the user is captured by a camera 10 provided in the makeup support apparatus 1 .
  • An AR image in which a virtual makeup image is superimposed on the face image is displayed on the display unit 18 .
  • the face image of the user is an image in a real space
  • the makeup image is a virtual object superimposed on a video of the real space.
  • the series of control processes carried out by the makeup support apparatus 1 described in the present specification may be realized using hardware, software, or a combination of hardware and software. Instructions for performing the series of control processes may be stored in advance on a tangibly embodied non-transitory computer-readable storage medium, such as a hard disk drive, provided inside or outside the respective apparatus. During execution, the instructions may be written into RAM (Random Access Memory) and executed by a processor such as a CPU (Central Processing Unit).
  • the makeup support apparatus 1 includes a camera 10 (i.e., a receiving unit), an image recognition unit 11 , a user face information analysis unit 12 , a recommended makeup pattern judgment unit 13 , a makeup scheme DB 14 , a makeup image generation unit 15 , a display control unit 17 (i.e., a display generation unit), a display unit 18 and a manipulation unit 19 .
  • the image recognition unit 11 includes a face and part recognition unit 110 and an SLAM processing unit 111 .
  • unit may be a software module, a hardware module, or a combination of a software module and a hardware module.
  • Such hardware and software modules may be embodied in discrete circuitry, an integrated circuit, or as instructions executed by a processor.
  • the camera 10 is an example of an imaging unit for acquiring an image (video) by imaging a real space.
  • the camera 10 captures a face image of the user.
  • the user faces the makeup support apparatus 1 when applying makeup using the makeup support apparatus 1 .
  • the camera 10 provided in the makeup support apparatus 1 captures a face of the user.
  • the camera 10 outputs the captured image to the image recognition unit 11 .
  • an image (video) 181 shown to the left in FIG. 6 is output to the image recognition unit 11 .
  • the image recognition unit 11 performs an image recognition process on the captured image acquired from the camera 10. Specifically, first, a face in the captured image is recognized by the face and part recognition unit 110 of the image recognition unit 11. Once the face is recognized, parts of the face are recognized. The face and part recognition unit 110 may recognize the face and the face parts using a known image recognition scheme such as pattern recognition. The face and part recognition unit 110 outputs recognition results (the face parts and face part position information) to the user face information analysis unit 12. Further, the face and part recognition unit 110 may judge that the face has been recognized in the captured image even when it has recognized only a portion associated with the face, such as the shoulder, neck, head, or hair of the user, in the captured image.
  • three-dimensional positions and postures of the face and the face parts or a three-dimensional position and posture of the camera 10 are recognized by the SLAM processing unit 111 of the image recognition unit 11 according to a principle of SLAM (Simultaneous Localization and Mapping) technology disclosed in, for example, Andrew J. Davison's “Real-Time Simultaneous Localization and Mapping with a Single Camera” (Proceedings of the 9th IEEE International Conference on Computer Vision, Volume 2, 2003, pp. 1403-1410).
  • the SLAM processing unit 111 can follow the motion of the face by tracking a position in a next image to which an initially or subsequently recognized face part, such as the eyes or mouth, has moved in order to recognize positions of the face parts.
  • the user face information analysis unit 12 analyzes face information based on the recognition result (face part information) output from the face and part recognition unit 110 . Specifically, information on texture such as skin quality, skin texture and hair texture or information on a shape such as contour, layout of parts and hairdo is analyzed from the face part position and the face part image contained in the face part information. The user face information analysis unit 12 outputs the analyzed face information to the recommended makeup pattern judgment unit 13 .
  • a makeup pattern may be recommended that suits not only a general trend but also a smaller community to which the user belongs, such as a region- or age-based community.
  • a region, an age or the like to which the user belongs is input as user information by the user.
  • the user's sex may also be input, so that, for example, a makeup pattern suitable for a man can be recommended.
  • FIG. 7 is a diagram showing an example of the makeup scheme information.
  • as shown in FIG. 7, the makeup scheme information contains, for each makeup pattern: a pattern ID (the ID of the makeup pattern); a context ID (the ID of a makeup objective or environment such as usual, party or outdoor); a process number (the order of the procedure); an item ID (the ID of the cosmetics product); a tool ID (the ID of a tool such as a puff or a brush); an action ID (the ID of an operation performed with the tool, such as paint, slap or push); a part ID (the ID of a face part such as the eyes or eyebrows); pressure (the pressure applied with the tool); direction (the tool operation direction); motion (the movement of the tool); distance (a deviation from a reference face part); and length (the length over which the cosmetics are applied).
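  • As a concrete illustration of how such scheme information could be held in software, the sketch below models one procedure step as a small record; the field names, types, and sample values are assumptions made for illustration, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class MakeupSchemeStep:
    """One procedure step of a makeup pattern (illustrative field names)."""
    pattern_id: str      # ID of the makeup pattern
    context_id: str      # objective/environment, e.g. "usual", "party", "outdoor"
    process: int         # order of this step within the procedure
    item_id: str         # ID of the cosmetics product used
    tool_id: str         # ID of the tool, e.g. a puff or a brush
    action_id: str       # operation, e.g. "paint", "slap", "push"
    part_id: str         # face part, e.g. "eyes", "eyebrows"
    pressure: float      # pressure applied with the tool
    direction: float     # tool operation direction (degrees, assumed unit)
    motion: str          # movement of the tool
    distance: float      # deviation from the reference face part
    length: float        # length over which the cosmetics are applied

# A makeup pattern is then simply its steps sorted by process number.
pattern = [
    MakeupSchemeStep("P001", "party", 2, "item-07", "puff-1", "slap",
                     "cheeks", 0.5, 0.0, "dab", 4.0, 30.0),
    MakeupSchemeStep("P001", "party", 1, "item-12", "brush-3", "paint",
                     "eyebrows", 0.3, 15.0, "stroke", 0.0, 42.0),
]
pattern.sort(key=lambda step: step.process)
```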
  • the context ID is information used when a user performs mode setting to set a makeup objective or environment in advance.
  • the recommended makeup pattern judgment unit 13 judges a recommended makeup pattern by referencing a set mode, in addition to the face information.
  • the judgment of the recommended makeup pattern may instead determine a recommendation order. In this case, for example, the top three makeup patterns are presented to the user as recommended makeup patterns, and if the user selects none of them, the next three makeup patterns are presented as recommended makeup patterns as well.
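  • A minimal sketch of how such a recommendation order could be produced and presented in batches of three is shown below; the scoring rule, the `score_fn` helper, and the pattern fields are hypothetical stand-ins for whatever criteria the judgment unit actually uses.

```python
def rank_makeup_patterns(face_info, patterns, score_fn, batch_size=3):
    """Order candidate patterns by a suitability score and yield them in
    batches, so the next three can be shown if the user declines the first."""
    ranked = sorted(patterns, key=lambda p: score_fn(face_info, p), reverse=True)
    for start in range(0, len(ranked), batch_size):
        yield ranked[start:start + batch_size]

def score_fn(face_info, pattern):
    """Hypothetical score: prefer patterns matching the analysed skin tone
    and face contour."""
    score = 0.0
    if pattern["skin_tone"] == face_info["skin_tone"]:
        score += 1.0
    if pattern["contour"] == face_info["contour"]:
        score += 0.5
    return score

face_info = {"skin_tone": "light", "contour": "oval"}
patterns = [{"id": i, "skin_tone": t, "contour": c}
            for i, (t, c) in enumerate([("light", "oval"), ("dark", "round"),
                                        ("light", "round"), ("dark", "oval")])]
batches = rank_makeup_patterns(face_info, patterns, score_fn)
print(next(batches))   # first three recommendations
print(next(batches))   # shown only if none of the first batch is selected
```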
  • The makeup scheme information shown in FIG. 7 indicates a makeup method (operation), but may also be used to generate a makeup completion image.
  • the recommended makeup pattern judgment unit 13 outputs the makeup scheme information for the recommended makeup pattern to the image generation unit 15 .
  • the recommended makeup pattern judgment unit 13 may output an ID of the recommended makeup pattern and the image generation unit 15 may acquire the makeup scheme information for the recommended makeup pattern from the makeup scheme DB based on the ID.
  • the image generation unit 15 draws a makeup image that is virtual information superimposed on the captured image based on the makeup scheme information for the recommended makeup pattern, and deforms the makeup image according to the position and posture of the face parts contained in the recognition result (three-dimensional position and posture) of the SLAM processing unit 111 .
  • the image generation unit 15 may deform the makeup image using the recognition result (face part information) output from the face and part recognition unit 110 .
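  • One plausible way to deform a makeup overlay so that it follows the recognized face parts is an affine warp driven by a few corresponding landmarks, sketched below with OpenCV; the landmark choice and the coordinates are illustrative assumptions, not the patent's method.

```python
import numpy as np
import cv2

def warp_makeup_overlay(overlay_rgba, template_pts, recognized_pts, frame_size):
    """Deform a makeup overlay so its reference landmarks (e.g. both eyes and
    the mouth in the template) line up with the landmarks recognized in the
    current frame."""
    src = np.float32(template_pts)           # 3 landmark points on the overlay
    dst = np.float32(recognized_pts)         # same landmarks found in the frame
    m = cv2.getAffineTransform(src, dst)     # 2x3 affine matrix
    h, w = frame_size
    return cv2.warpAffine(overlay_rgba, m, (w, h),
                          flags=cv2.INTER_LINEAR,
                          borderMode=cv2.BORDER_CONSTANT, borderValue=0)

# Synthetic example: a 200x200 RGBA overlay mapped into a 480x640 frame.
overlay = np.zeros((200, 200, 4), np.uint8)
cv2.circle(overlay, (100, 100), 40, (180, 120, 200, 255), -1)    # "blush" blob
template_pts = [(60, 80), (140, 80), (100, 160)]      # eyes and mouth on overlay
recognized_pts = [(250, 200), (390, 205), (320, 330)] # from the recognition unit
warped = warp_makeup_overlay(overlay, template_pts, recognized_pts, (480, 640))
```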
  • the makeup image generated by the image generation unit 15 will be described in detail in “(1-4) Example of AR Image.”
  • the image generation unit 15 outputs the generated makeup image to the display control unit 17 .
  • the display control unit 17 generates output image data and displays an AR image in which a virtual makeup image is superimposed on an image obtained by imaging the real space on the display unit 18 . Accordingly, the user can view the operation of makeup applied to the face image or the face after applying makeup, which is displayed on the display unit 18 .
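  • Generating the output AR image then amounts to alpha-blending the (already warped) makeup image over the captured frame, as in the minimal numpy sketch below; the image sizes and the semi-transparent patch are placeholder data.

```python
import numpy as np

def compose_ar_frame(frame_bgr, makeup_rgba):
    """Superimpose a virtual makeup image (RGBA, same size as the frame) onto
    the captured frame using its alpha channel, yielding the output AR image."""
    alpha = makeup_rgba[..., 3:4].astype(np.float32) / 255.0
    makeup = makeup_rgba[..., :3].astype(np.float32)
    frame = frame_bgr.astype(np.float32)
    out = alpha * makeup + (1.0 - alpha) * frame
    return out.astype(np.uint8)

frame = np.full((480, 640, 3), 128, np.uint8)      # stand-in for a camera frame
makeup = np.zeros((480, 640, 4), np.uint8)         # transparent except one patch
makeup[200:260, 300:380] = (30, 30, 180, 120)      # semi-transparent lip colour
ar_frame = compose_ar_frame(frame, makeup)
```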
  • an image (captured image) obtained by imaging the face of the user using the camera 10 is input in step S62.
  • the captured image is then displayed on the display unit 18 in step S64.
  • mode setting may be performed by the user in step S65.
  • information of a makeup objective or environment such as usual, party, outdoor, or date is input (step S66), so that a makeup pattern having a style suitable for the mode desired by the user is recommended.
  • in step S65 it is determined whether mode setting is performed by the user; if not (step S65: No), the process proceeds to step S67.
  • the face parts are recognized from the captured image by the image recognition unit 11 in step S67.
  • the user face information analysis unit 12 analyzes face information from the face parts recognized by the image recognition unit 11 in step S68.
  • a recommended makeup pattern is judged by the recommended makeup pattern judgment unit 13, superimposed on the face of the user, and then presented.
  • the recommended makeup pattern judgment unit 13 judges a makeup pattern suitable for the user from the makeup patterns stored in the makeup scheme DB 14, as the recommended makeup pattern, in consideration of the face information output from the user face information analysis unit 12 or a set mode when mode setting is performed by the user.
  • the recommended makeup pattern is displayed on the display unit 18 to be presented to the user.
  • the recommended makeup pattern presented on the display unit 18 may be an image obtained by applying makeup to a face of a model created in advance, like sample images 182a to 182c shown in FIG. 6.
  • if a plurality of recommended makeup patterns are presented in step S72 (step S72: Yes), one of the recommended makeup patterns is selected by the user in step S74.
  • the makeup support apparatus 1 performs a process of displaying an AR image in which a virtual makeup image created from the recommended makeup pattern is superimposed on an image obtained by imaging the face in the real space on the display unit 18 in step S76 (see the AR image 183 shown in FIG. 6).
  • This simulation presenting process (AR image display process) will be described later using the flowchart of FIG. 4.
  • a makeup action image for the determined makeup pattern is presented in step S80.
  • the makeup action image will be described later with reference to FIG. 8.
  • if there is no favorite makeup pattern among the presented recommended makeup patterns (step S78: No), subsequent recommended makeup patterns are presented (step S70). For example, if a recommendation list has been generated by the recommended makeup pattern judgment unit 13, the recommended makeup patterns are presented in order from the top of the list. Steps S70 to S78 are iteratively performed until a favorite makeup pattern is selected. Alternatively, the process is performed again from mode setting in step S66.
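  • The overall flow of steps S62 to S80 can be summarized by the control-flow sketch below; the `apparatus` object and all of its methods are a hypothetical interface standing in for the units of FIG. 2, and the mapping of statements to step numbers is approximate.

```python
def makeup_support_process(apparatus):
    """Control-flow sketch of the makeup support process (steps S62-S80).
    `apparatus` is a hypothetical object exposing the units of FIG. 2."""
    frame = apparatus.capture_image()                         # S62: input captured image
    apparatus.display(frame)                                  # S64: show it on the display unit
    mode = apparatus.ask_mode() if apparatus.wants_mode_setting() else None  # S65/S66

    parts = apparatus.recognize_face_parts(frame)             # S67: face and part recognition
    face_info = apparatus.analyze_face_info(parts)            # S68: user face information analysis

    # S70-S78: present recommended patterns in ranked batches until one is chosen.
    for batch in apparatus.recommend_patterns(face_info, mode):
        chosen = apparatus.let_user_select(batch)             # S72/S74: user picks a candidate
        if chosen is None:
            continue                                          # S78: No -> next batch (back to S70)
        apparatus.present_simulation(chosen)                  # S76: AR completion image
        if apparatus.user_accepts():                          # S78: Yes
            apparatus.present_action_image(chosen)            # S80: makeup action image
            return chosen
    return None
```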
  • a process of recognizing a face part position from the captured image is performed by the face and part recognition unit 110 in step S84. If the face part position is recognized (step S86: Yes), an attribute of a face feature point is updated in step S88.
  • the face feature point refers to a landmark on an object (in the present embodiment, the face and the face parts) that is a recognition target in an image tracked for three-dimensional recognition in the SLAM process.
  • in step S90, a process of recognizing the three-dimensional position and posture of a face part that is a recognition target in the captured image is performed by the SLAM processing unit 111.
  • a virtual makeup image is generated by the makeup image generation unit 15 in step S94.
  • An AR image in which the makeup image, which is virtual information, is superimposed on the face of the user in the captured image of the real space is then displayed in step S96.
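  • The per-frame simulation presenting process (steps S84 to S96) can likewise be sketched as a loop; again, `apparatus` and its methods are an assumed interface, and this is an outline rather than the patent's implementation.

```python
def simulation_presenting_loop(apparatus, pattern):
    """Per-frame sketch of the simulation presenting process (steps S84-S96).
    `apparatus` is the same hypothetical interface as in the previous sketch."""
    while apparatus.is_running():
        frame = apparatus.capture_image()
        parts = apparatus.recognize_face_parts(frame)           # S84: face part positions
        if parts is None:                                       # S86: No -> nothing to track
            apparatus.display(frame)
            continue
        apparatus.update_feature_points(parts)                  # S88: update feature-point attributes
        pose = apparatus.estimate_pose(frame, parts)            # S90: 3D position and posture
        makeup = apparatus.generate_makeup_image(pattern, pose)  # S94: draw virtual makeup
        apparatus.display(apparatus.compose(frame, makeup))     # S96: superimpose and show AR image
```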
  • the AR image according to the present embodiment is an image in which a virtual makeup image is superimposed on a captured image obtained by imaging a real space.
  • the superimposed makeup image may be a makeup completion image generated based on the makeup scheme information for the makeup pattern.
  • a virtual makeup completion image 185 is superimposed on the face image 181 of the user, as in the AR image 183 in FIG. 6 . Accordingly, the user can confirm a completion state when applying makeup using the recommended makeup pattern, in advance.
  • the user can also confirm the AR image from several angles.
  • this will be described with reference to FIGS. 6 and 7. When the face is turned to the side as shown in FIG. 7, the position and the posture of the face of the user in the image captured by the camera 10 change accordingly.
  • since the simulation presenting process shown in FIG. 4 is performed iteratively, a change in the position and the posture of the face of the user in the image can be tracked in real time, so that the position and shape of the superimposed virtual makeup image change according to the change of the user in the image.
  • a virtual makeup completion image is displayed with a changed shape by tracking the position of the recognized face part, as shown in the AR image 184 of FIG. 6 . Accordingly, the user can recognize the AR image from several angles in real time.
  • the superimposed virtual makeup image may be a makeup action image generated based on the makeup scheme information of the makeup pattern.
  • the superimposed and displayed makeup action image is displayed in order of the number of the process, but an example of a time at which a makeup action image for a next procedure is displayed will be described in a variant to be described later.
  • a superimposed position of the makeup action image is changed or transformed according to the change of the position and the posture of the face part of the user in the image (tracking display), similar to the makeup completion image described using FIG. 6 .
  • a makeup action image 187a in which a hand holding an eyebrow pencil moves along a makeup area indicated by a dotted line is changed into a makeup action image 187b according to the change of the position and posture of the face part of the user in the image. That is, the makeup action image tracks the face part. Accordingly, the user can confirm the makeup action image as a makeup model from several angles in real time and can apply makeup more accurately.
  • the virtual makeup image may be two-dimensional information or three-dimensional information.
  • the makeup progress degree judgment unit 16 compares a recognition result output from the image recognition unit 11 with a previously generated makeup completion image to judge a progress degree of real makeup being applied by the user.
  • the makeup image generation unit 15 modifies the superimposed and displayed makeup action image according to the progress degree output from the makeup progress degree judgment unit 16 .
  • the modification may also be based on the steps taken toward completion of the makeup procedure.
  • the makeup image generation unit 15 may modify the superimposed and displayed makeup action image to depict a next step in the makeup procedure to be performed, following a determination that a particular step has been performed. A detailed description of other configurations will be omitted since they are the same as those described above.
  • portions of the makeup action image can be removed or made transparent, where the removed or transparent portions may correspond to sections of the face where makeup has been applied according to the makeup procedure. Accordingly, the user can visually confirm to what extent the makeup currently being applied is approaching the completion state. Further, a time at which a makeup action image for a next procedure is displayed may be determined according to the progress degree of the makeup being applied by the user. Accordingly, if the makeup according to the shown procedure is completed, the makeup action image for the next procedure is automatically displayed.
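  • A simple way to estimate such a progress degree is to compare, region by region, the live frame against the previously generated makeup completion image; the sketch below uses a per-step mask and a colour-difference threshold, both of which are assumptions made for illustration.

```python
import numpy as np

def step_progress(frame_bgr, completion_bgr, part_mask, threshold=25.0):
    """Return True when the masked face region of the live frame is close
    enough in colour to the same region of the completion image, i.e. the
    makeup for that step appears to have been applied."""
    region = part_mask.astype(bool)
    diff = np.abs(frame_bgr[region].astype(np.float32) -
                  completion_bgr[region].astype(np.float32))
    return diff.mean() < threshold

def makeup_progress(frame_bgr, completion_bgr, step_masks):
    """Fraction of procedure steps judged complete, plus the index of the
    next step whose action image should be shown (None when finished)."""
    done = [step_progress(frame_bgr, completion_bgr, m) for m in step_masks]
    next_step = next((i for i, d in enumerate(done) if not d), None)
    return sum(done) / len(done), next_step

# Toy data: the lower half of the face already matches the completion image.
h, w = 64, 64
completion = np.full((h, w, 3), 120, np.uint8)
frame = completion.copy()
frame[:32] = 30                                    # top half not yet made up
masks = [np.zeros((h, w), np.uint8), np.zeros((h, w), np.uint8)]
masks[0][40:60, 10:50] = 1                         # region of step 1 (done)
masks[1][5:25, 10:50] = 1                          # region of step 2 (not done)
print(makeup_progress(frame, completion, masks))   # (0.5, 1)
```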
  • a manipulation by the user may also be used as a trigger. For example, if the user instructs that the next procedure be displayed, the makeup action image for the next procedure is forcibly displayed. A procedure may also be skipped by a manipulation of the user. Further, a makeup action image may be displayed for the face part that the user's hand approaches. If deviating from the prescribed order of the makeup procedure is not desirable, an alert indicating that fact may be displayed.
  • an image 191 on which the makeup completion image is superimposed may be displayed, in addition to the AR image 190 on which the makeup action image is superimposed. Accordingly, the user can apply makeup while confirming a makeup completion state.
  • a makeup support system according to a second embodiment of the present disclosure will be described with reference to FIGS. 11 and 12.
  • in the present embodiment, the makeup service of a desired makeup artist can be received.
  • FIG. 11 is a diagram showing an overview of the makeup support system according to the present embodiment.
  • the makeup support system according to the present embodiment includes a makeup support apparatus 1 , a server 30 , and a makeup scheme acquisition apparatus 4 which are connected via a network 6 .
  • the makeup scheme acquisition apparatus 4 includes various information acquisition units such as a camera 40 , a motion sensor 41 and a pressure sensor 42 .
  • a makeup scheme of a makeup artist is acquired by such a makeup scheme acquisition apparatus 4 and stored in a makeup scheme DB 302 of the server 30 .
  • various sensors are attached to the arm or hand of the makeup artist to acquire information such as the pressure when a makeup tool contacts the cosmetics, the pressure when makeup is applied to the face, and the locus of the tool. Further, the makeup procedure is imaged by the camera 40.
  • various sensors may be attached to cosmetics, makeup tools, or a mannequin to acquire makeup scheme information of makeup artists. Further, IDs of used cosmetics, or information indicating that a plurality of cosmetics are used together, if any, are input.
  • the server 30 includes a scheme analysis unit 301 and the makeup scheme DB 302 .
  • the scheme analysis unit 301 analyzes a makeup scheme from the various information acquired by the makeup scheme acquisition apparatus 4 and digitizes the makeup scheme with reference to a generalized face image.
  • the makeup scheme DB 302 stores the makeup scheme information analyzed by the scheme analysis unit 301 .
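  • As an illustration of what digitizing a makeup scheme "with reference to a generalized face image" might look like, the sketch below condenses time-stamped sensor samples for one stroke into a few scheme fields; the sample format and the normalisation callback are assumptions for illustration.

```python
import math

def digitize_stroke(samples, to_generalized):
    """Condense time-stamped samples for one makeup stroke into scheme fields.
    Each sample is (x, y, pressure) in camera coordinates; `to_generalized`
    maps a point onto the generalized (reference) face, so schemes recorded
    from different artists become comparable."""
    pts = [to_generalized(x, y) for x, y, _ in samples]
    pressures = [p for _, _, p in samples]
    length = sum(math.dist(a, b) for a, b in zip(pts, pts[1:]))
    dx = pts[-1][0] - pts[0][0]
    dy = pts[-1][1] - pts[0][1]
    return {
        "pressure": sum(pressures) / len(pressures),    # mean tool pressure
        "direction": math.degrees(math.atan2(dy, dx)),  # overall stroke direction
        "length": length,                               # path length on the face
    }

# Toy example: an identity mapping stands in for the real normalisation.
samples = [(10, 10, 0.2), (20, 12, 0.4), (30, 15, 0.3)]
print(digitize_stroke(samples, lambda x, y: (x, y)))
```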
  • the makeup support apparatus 1 includes a camera 10 , an image recognition unit 11 , a user face information analysis unit 12 , a recommended makeup pattern judgment unit 13 , a makeup image generation unit 15 , a display control unit 17 , a display unit 18 , a manipulation unit 19 , and a communication unit 20 .
  • the image recognition unit 11 includes a face and part recognition unit 110 and an SLAM processing unit 111 .
  • the communication unit 20 establishes a communication connection with the server 30 .
  • the communication in the communication unit 20 may be wired or wireless communication, but usually exchanges information with the server 30 using wireless communication such as a wireless LAN or Bluetooth (registered trademark).
  • This enables communication for receiving makeup service of a makeup artist desired by the user.
  • the communication unit 20 receives the makeup scheme information from the server 30 .
  • a description of other configurations will be omitted since they are the same as those of the makeup support apparatus 1 according to the first embodiment.
  • the makeup scheme information acquired from each makeup artist is stored in the makeup scheme DB 302 of the server 30 .
  • a list of selectable makeup artists is displayed on the display unit 18 , as shown in FIG. 11 , and makeup scheme information of the artist selected by the user is acquired from the server 30 . Accordingly, a recommended makeup pattern judgment unit of the makeup support apparatus 1 can judge a makeup pattern suitable for the user from makeup patterns based on the makeup scheme information of the makeup artist desired by the user.
  • the makeup scheme acquisition apparatus 4 shown in FIG. 11 is not necessarily an indispensable component.
  • a makeup support system according to a third embodiment of the present disclosure will be described with reference to FIGS. 13 and 14. In this embodiment, makeup scheme information can be exchanged among users.
  • FIG. 13 is a diagram showing an overview of a makeup support system according to the present embodiment.
  • the makeup support system according to the present embodiment includes makeup support apparatuses 1 A to 1 D for respective users and a server 31 , which are connected via a network 6 .
  • the makeup support apparatus 1 includes a camera 10 , an image recognition unit 11 , a user face information analysis unit 12 , a recommended makeup pattern judgment unit 13 , a makeup image generation unit 15 , a display control unit 17 , a display unit 18 , a manipulation unit 19 and a communication unit 20 .
  • the image recognition unit 11 includes a face and part recognition unit 110 and an SLAM processing unit 111 .
  • the camera 10 acquires an image of makeup done by a user and outputs the image to the communication unit 20 .
  • the communication unit 20 establishes a communication connection with the server 31 and transmits the captured image output from the camera 10 to the server 31. Further, the communication unit 20 receives makeup scheme information of other users from the makeup scheme DB 312 of the server 31. This makes it possible to exchange makeup scheme information with the other users.
  • the recommended makeup pattern judgment unit 13 judges a makeup pattern suitable for the user from the makeup patterns based on the makeup scheme information of the other users acquired by the communication unit 20 . A description of other configurations will be omitted since they are the same as those of the makeup support apparatus 1 according to the first embodiment.
  • the server 31 includes a scheme analysis unit 311 and a makeup scheme DB 312 .
  • the scheme analysis unit 311 analyzes a makeup scheme from the captured image obtained by imaging a makeup action of the user transmitted from each makeup support apparatus 1 , and digitizes the makeup scheme with reference to a generalized face image.
  • the scheme analysis unit 311, for example, compares a professional's makeup action image or a reference makeup action image with the makeup action image of the user to calculate a difference therebetween, and digitizes the makeup scheme.
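  • A toy version of such a comparison is shown below: the user's stroke and a reference stroke, both expressed on the generalized face, are compared point by point; the equal-length sampling and the toy coordinates are assumptions, not the patent's metric.

```python
import math

def stroke_difference(user_path, reference_path):
    """Mean point-to-point distance between the user's stroke and a reference
    stroke, assuming both are sampled at the same number of points on the
    generalized face; a smaller value means the user's action is closer to
    the reference."""
    assert len(user_path) == len(reference_path)
    return sum(math.dist(a, b)
               for a, b in zip(user_path, reference_path)) / len(user_path)

user      = [(0, 0), (5, 1), (10, 3)]   # user's eyebrow stroke (toy data)
reference = [(0, 0), (5, 0), (10, 0)]   # professional's stroke (toy data)
print(round(stroke_difference(user, reference), 2))   # 1.33
```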
  • the makeup scheme DB 312 stores the makeup scheme information analyzed by the scheme analysis unit 311 .
  • the makeup scheme information acquired from each user is stored in the makeup scheme DB 312 of the server 31 .
  • in the makeup support apparatus 1A, a list of other selectable users is displayed on the display unit 18, as shown in FIG. 13, and the makeup scheme information of the other user selected by the user is acquired from the server 31.
  • the recommended makeup pattern judgment unit of the makeup support apparatus 1 can judge a makeup pattern suitable for the user from the makeup patterns based on the makeup scheme information of the other user, such as a friend of the user.
  • the makeup support system can thus be used in further ways, such as friendly competition over makeup schemes with friends, imitating the makeup schemes of other users, or becoming well known among nonprofessional users.
  • a makeup support system according to a fourth embodiment of the present disclosure will be described with reference to FIGS. 15 and 16 .
  • information of appropriate cosmetics and cosmetics sale service can be provided.
  • FIG. 15 is a diagram showing an overview of the makeup support system according to the present embodiment.
  • the makeup support system according to the present embodiment includes a makeup support apparatus 1 , a server 32 and a sale management apparatus 5 for each cosmetics shop, which are connected via a network 6 .
  • the makeup support apparatus 1 includes a camera 10 , an image recognition unit 11 , a user face information analysis unit 12 , a recommended makeup pattern judgment unit 13 , a makeup image generation unit 15 , a display control unit 17 , a display unit 18 , a manipulation unit 19 , a communication unit 20 , and a cosmetics information providing unit 21 .
  • the image recognition unit 11 includes a face and part recognition unit 110 and an SLAM processing unit 111 .
  • the communication unit 20 establishes a communication connection with the server 32 and exchanges information with the server 32. Specifically, the communication unit 20 receives makeup scheme information from the server 32, transmits a cosmetics ID to the server 32, and receives cosmetics information from the server 32.
  • the recommended makeup pattern judgment unit 13 judges a recommended makeup pattern based on the makeup scheme information received by the communication unit 20 . Further, the recommended makeup pattern judgment unit 13 outputs a cosmetics ID (see FIG. 7 ) of a makeup pattern judged to be a recommended makeup pattern to the server 32 via the communication unit 20 .
  • the cosmetics information providing unit 21 provides the user with the cosmetics information transmitted from the server 32 according to the cosmetics ID output from the recommended makeup pattern judgment unit 13.
  • maker, brand name, product name, price, product explanation and the like are displayed as cosmetics information on the display unit 18 , as shown in FIG. 15 .
  • information of appropriate cosmetics to be used when the makeup based on the recommended makeup pattern is applied by the user is provided.
  • a buy button 192 is displayed together with the cosmetics information on the display unit 18 , such that the user can easily perform a procedure of purchasing appropriate cosmetics.
  • the server 32 includes a makeup scheme DB 322 and a cosmetics information DB 323 .
  • the makeup scheme DB 322 stores the makeup scheme information, similar to the makeup scheme DB in each embodiment described above.
  • the cosmetics information DB 323 stores cosmetics information corresponding to each cosmetics ID contained in the makeup scheme information stored in the makeup scheme DB 322.
  • the sale management apparatus 5 performs product sale management according to a cosmetics purchase request from the makeup support apparatus 1 .
  • the sale management apparatus 5 may be owned by each cosmetics shop. Alternatively, a normal net sale system (online shopping) may be used.
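  • A minimal sketch of this cosmetics-information flow is given below; the in-memory databases, field names, and purchase payload are illustrative stand-ins for the server-side makeup scheme DB 322, cosmetics information DB 323, and the sale management apparatus 5.

```python
# Hypothetical in-memory stand-ins for the server-side databases.
COSMETICS_DB = {
    "item-12": {"maker": "ExampleCo", "brand": "GlowLine",
                "product": "Eyebrow Pencil 02", "price": 1200,
                "explanation": "Soft pencil for natural brows."},
    "item-07": {"maker": "ExampleCo", "brand": "GlowLine",
                "product": "Cheek Powder 05", "price": 1800,
                "explanation": "Light powder blush."},
}

def cosmetics_for_pattern(scheme_steps):
    """Collect the cosmetics information to show next to the AR view for every
    item ID appearing in the recommended pattern's scheme."""
    ids = {step["item_id"] for step in scheme_steps}
    return {i: COSMETICS_DB[i] for i in ids if i in COSMETICS_DB}

def purchase_request(user_id, item_id):
    """Payload the apparatus could send to the sale management apparatus when
    the user presses the buy button (fields are illustrative)."""
    return {"user": user_id, "item": item_id, "quantity": 1}

steps = [{"item_id": "item-12"}, {"item_id": "item-07"}]
info = cosmetics_for_pattern(steps)
print(info["item-12"]["product"], info["item-12"]["price"])
print(purchase_request("user-1", "item-12"))
```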
  • in this way, the makeup support apparatus 1 performs makeup support by enabling the user to purchase the cosmetics necessary to realize the recommended makeup pattern.
  • in a fifth embodiment, the makeup support apparatus 1 judges a recommended makeup pattern in consideration of the effects of makeup on the skin.
  • information having an influence on the skin such as age, sex, race, and life pattern, may be input by the user.
  • a makeup support system according to a sixth embodiment of the present disclosure will be described.
  • in this embodiment, the user inputs a favorite face image, such as a photograph of an entertainer's face. A makeup pattern that brings the face of the user as close as possible to the favorite face is then judged to be the recommended makeup pattern, based on an analysis result of the input face image and the face information of the user.
  • the makeup support apparatus 1 performs simulation of a state after makeup completion using the AR technology, thereby improving the makeup support scheme. Further, a procedure during makeup application is displayed using the AR technology, thereby further improving the makeup support scheme.
  • a process with a heavy load among the processes performed in the makeup support apparatus 1 may be performed by the server connected via the network, or performed in a distributed manner across remote devices or servers in, for example, a cloud computing configuration.
  • the captured image captured by the camera 10 may be transmitted from the makeup support apparatus 1 to the server, and user face information of the captured image may be analyzed by the server to judge a recommended makeup pattern.
  • the server transmits the makeup scheme information of the makeup pattern judged to be the recommended makeup pattern from the captured image transmitted from the makeup support apparatus 1 , to the makeup support apparatus 1 .
  • the server may generate the makeup image from the recommended makeup pattern and transmit the makeup image to the makeup support apparatus 1 .
  • the makeup scheme DB 14 for storing the makeup scheme information may be disposed in the server.
  • if the process with a heavy load is performed by the server as described above, the power consumption of the makeup support apparatus 1 can be reduced and the hardware resources required by the makeup support apparatus 1 can be reduced. Further, if the makeup scheme DB 14 storing the makeup schemes is disposed in the server, the storage capacity of the makeup support apparatus 1 can be reduced and the same makeup scheme information can easily be used or managed across makeup support apparatuses.
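  • A client-side sketch of offloading the heavy analysis to a server might look like the following; the endpoint URL and the JSON response format are purely hypothetical, and the `requests` and OpenCV calls are just one possible transport.

```python
import cv2
import requests

SERVER_URL = "https://example.invalid/makeup"   # hypothetical endpoint

def request_recommendation(frame_bgr):
    """Send the captured frame to the server, which performs the heavy face
    analysis and pattern judgment, and return the makeup scheme information
    it responds with. The endpoint and response format are assumptions."""
    ok, jpeg = cv2.imencode(".jpg", frame_bgr)
    if not ok:
        raise RuntimeError("failed to encode frame")
    resp = requests.post(
        SERVER_URL,
        files={"frame": ("frame.jpg", jpeg.tobytes(), "image/jpeg")},
        timeout=5.0,
    )
    resp.raise_for_status()
    return resp.json()   # e.g. {"pattern_id": ..., "steps": [...]}
```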
  • the present technology can adopt the following configurations.
  • An information processing apparatus comprising:
  • an imaging unit for capturing an image
  • an image recognition unit for sequentially acquiring images from the imaging unit and recognizing parts of a face in the acquired image
  • a display control unit for displaying a virtual makeup image tracking the face in the image based on a position of the face parts contained in a recognition result output from the image recognition unit, the virtual makeup image being superimposed on the image.
  • an analysis unit for analyzing face information based on the recognition result output from the image recognition unit
  • a recommended makeup pattern judgment unit for judging a recommended makeup pattern according to an analysis result output from the analysis unit
  • the makeup pattern includes the makeup scheme information
  • the image generation unit generates the virtual makeup image based on makeup scheme information of a makeup pattern judged to be a recommended makeup pattern by the recommended makeup pattern judgment unit.
  • a makeup progress degree judgment unit for judging a makeup progress degree by comparing the image captured by the imaging unit with a previously generated makeup completion image
  • An information processing system comprising:
  • an information processing apparatus including
  • an imaging unit for capturing an image
  • a display control unit for displaying a virtual makeup image tracking the face in the image based on a position of the face parts contained in a recognition result output from the image recognition unit, the virtual makeup image being superimposed on the image;
  • an analysis unit for analyzing face information based on the recognition result output from the image recognition unit
  • a recommended makeup pattern judgment unit for judging a recommended makeup pattern according to an analysis result output from the analysis unit
  • a program for causing a computer to function as an information processing apparatus comprising:
  • an imaging unit for capturing an image
  • an image recognition unit for sequentially acquiring images from the imaging unit and recognizing parts of a face in the acquired image
  • a display control unit for displaying a virtual makeup image tracking the face in the image based on a position of the face parts contained in a recognition result output from the image recognition unit, the virtual makeup image being superimposed on the image.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Processing Or Creating Images (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

A method is provided for generating output image data. The method comprises receiving image data representing an input image, the input image containing at least one facial image. The method further comprises recognizing the facial image in the image data, and recognizing facial features of the facial image. The method further comprises generating data representing a makeup image based on the recognized facial features, the makeup image providing information assisting in the application of makeup. The method also comprises generating output image data representing the makeup image superimposed on the facial image.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • The present disclosure contains subject matter related to that disclosed in Japanese Priority Patent Application JP 2011-044274 filed in the Japan Patent Office on Mar. 1, 2011, the entire content of which is hereby incorporated by reference.
  • BACKGROUND
  • Description of the Related Art
  • The present disclosure relates to an information processing apparatus, an information processing method, an information processing system, and a program.
  • As schemes for supporting makeup, various makeup simulations for simulating a state after makeup completion have been proposed.
  • For example, a method of displaying a simulation image obtained by changing the state of skin roughness, such as wrinkles or pores, in a skin image of a user has been proposed in Japanese Patent Laid-open Publication No. 2006-133856. Accordingly, the degree of improvement in wrinkles or pores that can be expected from specific cosmetics can be presented. Further, an auxiliary apparatus for visualizing the impression of makeup in various lighting states, in consideration of the different impressions makeup gives under indoor artificial light and outdoor natural light, has been proposed in Japanese Patent Laid-open Publication No. 2001-186923. Accordingly, the user can select the makeup best suited to a given event.
  • Further, a makeup simulation apparatus for generating a state in which desired cosmetics have been applied by acquiring a two-dimensional face image of a subject, deforming a standard application shape, and synthesizing the deformed standard application shape with the face image of the subject has been proposed in Japanese Patent Laid-open Publication No. 2009-53981.
  • TECHNICAL PROBLEM
  • However, since a synthesized image generated by the above-described makeup simulations is a still image of the face viewed from the front, it is difficult to confirm the makeup simulation from several angles in real time. Further, although the state after makeup completion can be confirmed, it is difficult for a user to reproduce the simulated state when actually applying makeup. Accordingly, there is a need to further improve makeup support schemes such as makeup simulation.
  • Therefore, it is desirable to provide a novel and improved information processing apparatus, information processing method, and computer-readable storage medium capable of improving a makeup support scheme.
  • SUMMARY
  • Accordingly, there is provided an apparatus for generating output image data. The apparatus comprises a receiving unit configured to receive image data representing an input image, the input image containing at least one facial image. The apparatus further comprises a recognition unit configured to recognize the facial image in the image data, and recognize facial features of the facial image. The apparatus further comprises a makeup image generation unit configured to generate data representing a makeup image based on the recognized facial features, the makeup image providing information assisting in the application of makeup. The apparatus also comprises a display generation unit configured to generate output image data representing the makeup image superimposed on the facial image.
  • In another aspect, there is provided a method for generating output image data. The method comprises receiving image data representing an input image, the input image containing at least one facial image. The method further comprises recognizing the facial image in the image data, and recognizing facial features of the facial image. The method further comprises generating data representing a makeup image based on the recognized facial features, the makeup image providing information assisting in the application of makeup. The method also comprises generating output image data representing the makeup image superimposed on the facial image.
  • In another aspect, there is provided a tangibly-embodied non-transitory computer-readable storage medium storing instructions which, when executed by a processor, cause a computer to perform a method for generating output image data. The method comprises receiving image data representing an input image, the input image containing at least one facial image. The method further comprises recognizing the facial image in the image data, and recognizing facial features of the facial image. The method further comprises generating data representing a makeup image based on the recognized facial features, the makeup image providing information assisting in the application of makeup. The method also comprises generating output image data representing the makeup image superimposed on the facial image.
  • In yet another aspect, there is provided an apparatus for generating output image data. The apparatus comprises receiving means for receiving image data representing an input image, the input image containing at least one facial image. The apparatus further comprises recognition means for recognizing the facial image in the image data, and recognizing facial features of the facial image. The apparatus further comprises makeup image generation means for generating data representing a makeup image based on the recognized facial features, the makeup image providing information assisting in the application of makeup. The apparatus also comprises display generation means for generating output image data representing the makeup image superimposed on the facial image
  • According to the embodiments described above, there are provided an information processing apparatus, information processing method, and computer-readable storage medium, for improving a makeup support scheme.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a diagram showing an overview of a makeup support apparatus according to a first embodiment;
  • FIG. 2 is a block diagram showing an example of a configuration of the makeup support apparatus according to the first embodiment;
  • FIG. 3 is a flowchart showing a makeup support process performed by the makeup support apparatus according to the first embodiment;
  • FIG. 4 is a flowchart illustrating a simulation presenting process shown in FIG. 3;
  • FIG. 5 is a diagram illustrating an example of display control of the makeup support apparatus according to the first embodiment;
  • FIG. 6 is a diagram showing a positional relationship between the makeup support apparatus according to the first embodiment and a user;
  • FIG. 7 is a diagram showing an example of makeup scheme information;
  • FIG. 8 is a diagram illustrating an example of display control of the makeup support apparatus according to the first embodiment;
  • FIG. 9 is a block diagram showing an example of a configuration of a makeup support apparatus in variant 1;
  • FIG. 10 is a diagram showing a display screen of a makeup support apparatus in variant 2;
  • FIG. 11 is a diagram showing an overview of a makeup support system according to a second embodiment;
  • FIG. 12 is a diagram showing an example of a configuration of the makeup support apparatus according to the second embodiment;
  • FIG. 13 is a diagram showing an overview of a makeup support system according to a third embodiment;
  • FIG. 14 is a diagram showing an example of a configuration of the makeup support apparatus according to the third embodiment;
  • FIG. 15 is a diagram showing an overview of a makeup support system according to a fourth embodiment; and
  • FIG. 16 is a diagram showing an example of a configuration of the makeup support apparatus according to the fourth embodiment.
  • DETAILED DESCRIPTION OF THE EMBODIMENT(S)
  • In the following, embodiments of the present disclosure will be described in detail with reference to the appended drawings. Note that, in this specification and the appended drawings, structural elements that have substantially the same function and structure are denoted with the same reference numerals, and repeated explanation of these structural elements is omitted.
  • It is to be noted that the description is set forth below in accordance with the following order.
  • 1. First Embodiment
  • (1-1) Overview of Makeup Support Apparatus
  • (1-2) Example of Configuration of Makeup Support Apparatus
  • (1-3) Example of Flow of Process
  • (1-4) Example of AR Image
  • (1-5) Variant
  • 2. Second Embodiment
  • 3. Third Embodiment
  • 4. Fourth Embodiment
  • 5. Fifth Embodiment
  • 6. Sixth Embodiment
  • 7. Summary
  • In recent years, technology called augmented reality (AR) for superimposing additional information onto the real world and presenting the information to a user has been attracting attention. In the AR technology, the information presented to the user may be visualized using various forms of virtual objects such as text, icons or animation. A primary application field of the AR technology is the support of user activities in the real world. In the following, the AR technology is applied to a makeup support scheme. This can improve a makeup scheme of a user.
  • The makeup support scheme using the AR technology is applicable to a makeup support system for simulating a state after makeup completion and presenting a simulation result to a user. Further, in the system, the makeup support scheme using the AR technology displays a procedure during makeup as well as after makeup completion in consideration of the difficulty a general user has in applying actual makeup according to the simulation, thereby further improving the makeup support scheme.
  • A system for improving a makeup scheme using a makeup support apparatus that is an example of an information processing apparatus will be described in <1. First Embodiment> to <6. Sixth Embodiment>.
  • 1. First Embodiment (1-1) Overview of Makeup Support Apparatus
  • First, an overview of a makeup support apparatus 1 according to a first embodiment will be described with reference to FIG. 1. As shown in FIG. 1, the makeup support apparatus according to the present embodiment is a tablet-type terminal including a touch panel display. Accordingly, a display unit 18 not only has a display function, but also a function of a manipulation unit 19 for receiving a manipulation input from a user. In an example shown in FIG. 1, a face image of the user is captured by a camera 10 provided in the makeup support apparatus 1. An AR image in which a virtual makeup image is superimposed on the face image is displayed on the display unit 18. Here, the face image of the user is an image in a real space, and the makeup image is a virtual object superimposed on a video of the real space.
  • The series of control processes carried out by the makeup support apparatus 1 described in the present specification may be realized using hardware, software, or a combination of hardware and software. Instructions for performing the series of control processes may be stored in advance on a tangibly embodied non-transitory computer-readable storage medium, such as a hard disk drive, provided inside or outside the respective apparatus. During execution, the instructions may be written into RAM (Random Access Memory) and executed by a processor such as a CPU (Central Processing Unit).
  • The virtual makeup image includes, for example, a makeup completion image indicating a makeup completion state (i.e., a completed makeup operation performed according to a makeup procedure) or a makeup action image used to indicate a makeup procedure or method during makeup application. The user can confirm the state after makeup completion in advance by viewing the AR image in which a makeup completion image is superimposed on the face image of the user. The makeup action image, in turn, is displayed superimposed on the corresponding real face part of the user so that the makeup procedure or method is shown statically or dynamically during makeup application. As the makeup procedure is dynamically displayed superimposed on the face part of the user, the user can see the motion his or her hand should make when applying makeup. Accordingly, the user can easily apply makeup by simply moving his or her hand as indicated by the makeup action image while viewing the AR image in which the makeup action image is superimposed on the face image. The user's makeup technique therefore improves more than it would with verbal makeup advice alone. Here, the image captured by the camera 10 is displayed mirror-reversed (flipped left to right) on the display unit 18, so the user sees his or her face as in a makeup mirror. The AR image displayed on the display unit 18 will be described in detail with reference to FIGS. 6 to 8 in “(1-4) Example of AR Image,” which will be described later.
  • (1-2) Example of Configuration of Makeup Support Apparatus
  • Next, an example of a configuration of the makeup support apparatus 1 according to the present embodiment will be described with reference to FIG. 2. As shown in FIG. 2, the makeup support apparatus 1 includes a camera 10 (i.e., a receiving unit), an image recognition unit 11, a user face information analysis unit 12, a recommended makeup pattern judgment unit 13, a makeup scheme DB 14, a makeup image generation unit 15, a display control unit 17 (i.e., a display generation unit), a display unit 18 and a manipulation unit 19. The image recognition unit 11 includes a face and part recognition unit 110 and an SLAM processing unit 111. As used herein the term “unit” may be a software module, a hardware module, or a combination of a software module and a hardware module. Such hardware and software modules may be embodied in discrete circuitry, an integrated circuit, or as instructions executed by a processor.
• The camera 10 is an example of an imaging unit for acquiring an image (video) by imaging a real space. The camera 10 captures a face image of the user. The user faces the makeup support apparatus 1 when applying makeup using the makeup support apparatus 1. Accordingly, the camera 10 provided in the makeup support apparatus 1 captures the face of the user. The camera 10 outputs the captured image to the image recognition unit 11. As an example of the image obtained by imaging the face of the user, an image (video) 181 shown on the left in FIG. 6 is output to the image recognition unit 11.
• The image recognition unit 11 performs an image recognition process on the captured image acquired from the camera 10. Specifically, first, a face in the captured image is recognized by the face and part recognition unit 110 of the image recognition unit 11. Once the face is recognized, parts of the face are recognized. The face and part recognition unit 110 may recognize the face and the face parts using a known image recognition scheme such as pattern recognition. The face and part recognition unit 110 outputs recognition results (the face parts and face part position information) to the user face information analysis unit 12. Further, the face and part recognition unit 110 may judge that the face has been recognized in the captured image even when only a portion of the user, such as the shoulder, neck, head, or hair, is recognized in the captured image.
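• As a concrete illustration only (the disclosure does not prescribe a particular recognition scheme), face and face part recognition by pattern recognition could be realized with an off-the-shelf detector. The sketch below uses OpenCV's bundled Haar cascades; the cascade choices and thresholds are illustrative assumptions, not part of the disclosure.

```python
import cv2

# OpenCV's bundled Haar cascades, one known pattern-recognition scheme.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def recognize_face_and_parts(frame_bgr):
    """Return face rectangles and eye rectangles found in one captured frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    results = []
    for (x, y, w, h) in faces:
        roi = gray[y:y + h, x:x + w]          # search for parts inside the face
        eyes = eye_cascade.detectMultiScale(roi)
        # Report part positions in full-frame coordinates.
        results.append({
            "face": (x, y, w, h),
            "eyes": [(x + ex, y + ey, ew, eh) for (ex, ey, ew, eh) in eyes],
        })
    return results
```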
• Further, three-dimensional positions and postures of the face and the face parts, or a three-dimensional position and posture of the camera 10, are recognized by the SLAM processing unit 111 of the image recognition unit 11 according to the principle of SLAM (Simultaneous Localization and Mapping) technology disclosed in, for example, Andrew J. Davison's “Real-Time Simultaneous Localization and Mapping with a Single Camera” (Proceedings of the 9th IEEE International Conference on Computer Vision, Volume 2, 2003, pp. 1403-1410). Accordingly, even when face parts such as the eyes or mouth are hidden by the hand or the direction of the face changes while the user is applying makeup, the SLAM processing unit 111 can follow the motion of the face by tracking, in the next image, the position to which an initially or subsequently recognized face part, such as the eyes or mouth, has moved, and can thereby recognize the positions of the face parts.
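• The SLAM processing itself follows the cited literature; purely to illustrate the tracking aspect (following already-recognized face feature points into the next frame as the face moves), a minimal sketch using pyramidal Lucas-Kanade optical flow is shown below. This is an illustrative stand-in, not the SLAM algorithm of the cited paper, and the window size and pyramid depth are assumptions.

```python
import cv2

def track_face_feature_points(prev_gray, next_gray, prev_points):
    """Follow face feature points from the previous frame into the next frame.

    prev_points: float32 array of shape (N, 1, 2) with pixel coordinates of
    previously recognized face feature points (e.g. eye or mouth corners).
    Returns the successfully tracked (old, new) point pairs.
    """
    next_points, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_gray, next_gray, prev_points, None, winSize=(21, 21), maxLevel=3)
    ok = status.reshape(-1) == 1          # keep only points tracked successfully
    return prev_points[ok], next_points[ok]
```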
• The user face information analysis unit 12 analyzes face information based on the recognition result (face part information) output from the face and part recognition unit 110. Specifically, texture information such as skin quality, skin texture, and hair texture, and shape information such as contour, layout of the parts, and hairstyle, are analyzed from the face part positions and the face part images contained in the face part information. The user face information analysis unit 12 outputs the analyzed face information to the recommended makeup pattern judgment unit 13.
  • Makeup scheme information for each pattern is stored in the makeup scheme DB (database) 14. The recommended makeup pattern judgment unit 13 judges a makeup pattern to be recommended for the face information acquired from the user face information analysis unit 12 based on the makeup scheme information stored in the makeup scheme DB 14.
  • As the recommended makeup pattern, a makeup pattern suitable for a general trend as well as a small region- or age-based community to which a user belongs may be recommended. A region, an age or the like to which the user belongs is input as user information by the user. Alternatively, a sex may be input and a makeup pattern suitable for a man may be recommended.
• FIG. 3 is a diagram showing an example of the makeup scheme information. As shown in FIG. 3, the makeup scheme information contains the following data: pattern ID (an ID of a makeup pattern); context ID (an ID of a makeup objective or environment such as usual, party, or outdoor); process (the number of a procedure); item ID (an ID of cosmetics (products)); tool ID (an ID of a tool such as a puff or a brush); action ID (an ID of an operation performed with the tool, such as paint, slap, or push); part ID (an ID of a face part such as the eyes or eyebrows); pressure (pressure applied to the tool); direction (a tool operation direction); motion (movement of the tool); distance (a deviation from a reference face part); and length (a length (distance) over which the cosmetics are applied).
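• The fields listed above map naturally onto a simple record type. The sketch below shows one possible in-memory representation of a single procedure step of a makeup pattern; the field types and the example values not quoted in the text are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class MakeupStep:
    """One procedure step of a makeup pattern, mirroring the fields of FIG. 3."""
    pattern_id: str   # e.g. "P-00001"
    context_id: str   # makeup objective/environment such as usual, party, outdoor
    process: int      # procedure number (order of execution)
    item_id: str      # cosmetics (product) ID, e.g. "I-201"
    tool_id: str      # tool ID such as a puff or brush, e.g. "T-221"
    action_id: str    # operation such as paint, slap, or push
    part_id: str      # face part ID such as eyes or eyebrows, e.g. "P-002"
    pressure: int     # pressure applied to the tool
    direction: int    # tool operation direction
    motion: int       # movement of the tool (e.g. 2 = straight line)
    distance: int     # deviation from the reference face part
    length: int       # length over which the cosmetics are applied

# Example step loosely following process 002 of pattern "P-00001";
# values other than those quoted in the text are assumptions.
step = MakeupStep("P-00001", "usual", 2, "I-201", "T-221", "paint",
                  "P-002", 12, 0, 2, 0, 21)
```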
• The context ID is information used when the user performs mode setting to set a makeup objective or environment in advance. The recommended makeup pattern judgment unit 13 judges a recommended makeup pattern by referencing the set mode in addition to the face information. Alternatively, the judgment of the recommended makeup pattern may be a judgment that determines a recommendation order. In this case, for example, the top three makeup patterns are presented to the user as recommended makeup patterns, and if the user does not select any of them, the next three makeup patterns are presented as recommended makeup patterns.
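• As one way such a ranking could be realized (the scoring function below is an illustrative assumption, not taken from the disclosure), every stored makeup pattern can be scored against the analyzed face information and the set mode and then presented three at a time:

```python
def rank_patterns(patterns, face_info, context_id=None):
    """Order makeup patterns by an illustrative suitability score.

    patterns: list of dicts; an optional "suits" dict holds face-information
    keys/values the pattern is intended for, and "context_id" its context.
    """
    def score(pattern):
        s = sum(1 for k, v in pattern.get("suits", {}).items()
                if face_info.get(k) == v)
        if context_id is not None and pattern.get("context_id") == context_id:
            s += 1                      # bonus for matching the set mode
        return s
    return sorted(patterns, key=score, reverse=True)

def present_in_batches(ranked, batch_size=3):
    """Yield recommended patterns three at a time until the user picks one."""
    for i in range(0, len(ranked), batch_size):
        yield ranked[i:i + batch_size]
```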
• The makeup scheme information shown in FIG. 3 is information indicating a makeup method (operation), but may also be used to generate a makeup completion image. The recommended makeup pattern judgment unit 13 outputs the makeup scheme information for the recommended makeup pattern to the image generation unit 15. Alternatively, the recommended makeup pattern judgment unit 13 may output an ID of the recommended makeup pattern, and the image generation unit 15 may acquire the makeup scheme information for the recommended makeup pattern from the makeup scheme DB 14 based on the ID.
• The image generation unit 15 draws a makeup image, which is virtual information to be superimposed on the captured image, based on the makeup scheme information for the recommended makeup pattern, and deforms the makeup image according to the positions and postures of the face parts contained in the recognition result (three-dimensional position and posture) of the SLAM processing unit 111. Alternatively, the image generation unit 15 may deform the makeup image using the recognition result (face part information) output from the face and part recognition unit 110. The makeup image generated by the image generation unit 15 will be described in detail in “(1-4) Example of AR Image.” The image generation unit 15 outputs the generated makeup image to the display control unit 17.
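• The disclosure does not fix a particular deformation method; as a minimal sketch under that assumption, the drawn makeup image could be warped onto the current frame with an affine transform estimated from three tracked face part positions (for example, both eyes and the mouth):

```python
import cv2
import numpy as np

def deform_makeup_image(makeup_rgba, ref_points, cur_points, frame_shape):
    """Warp a makeup image drawn over reference face part positions onto the
    face part positions recognized in the current frame.

    ref_points, cur_points: exactly three (x, y) landmarks each,
    e.g. left eye, right eye, and mouth center.
    """
    matrix = cv2.getAffineTransform(
        np.float32(ref_points), np.float32(cur_points))
    h, w = frame_shape[:2]
    return cv2.warpAffine(makeup_rgba, matrix, (w, h))
```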
• The display control unit 17 generates output image data and displays, on the display unit 18, an AR image in which the virtual makeup image is superimposed on the image obtained by imaging the real space. Accordingly, on the display unit 18, the user can view the makeup operation applied to the face image, or the face after the makeup has been applied.
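• Superimposing the virtual makeup image on the captured image can then amount to a simple alpha blend; the sketch below assumes the makeup image carries an alpha channel, which is an implementation choice and not stated in the disclosure.

```python
import numpy as np

def superimpose(frame_bgr, makeup_rgba):
    """Alpha-blend a makeup image (with alpha channel) over the camera frame."""
    alpha = makeup_rgba[:, :, 3:4].astype(np.float32) / 255.0
    blended = (frame_bgr.astype(np.float32) * (1.0 - alpha)
               + makeup_rgba[:, :, :3].astype(np.float32) * alpha)
    return blended.astype(np.uint8)
```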
  • (1-3) Example of Flow of Process
  • Next, a flow of a process in the makeup support apparatus 1 according to the first embodiment will be described using flowcharts of FIGS. 4 and 5.
• First, an image (captured image) obtained by imaging the face of the user using the camera 10 is input in step S62. The captured image is then displayed on the display unit 18 in step S64. Then, mode setting may be performed by the user (step S65). If mode setting is performed, information on a makeup objective or environment such as usual, party, outdoor, or date is input (step S66), so that a makeup pattern having a style suitable for the mode desired by the user is recommended.
  • On the other hand, if mode setting is not performed by the user (step S65: No), the face parts are recognized from the captured image by the image recognition unit 11 in step S67. Subsequently, the user face information analysis unit 12 analyzes face information from the face parts recognized by the image recognition unit 11 in step S68.
• Next, in step S70, a recommended makeup pattern is judged by the recommended makeup pattern judgment unit 13, superimposed on the face of the user, and then presented. The recommended makeup pattern judgment unit 13 judges a makeup pattern suitable for the user from the makeup patterns stored in the makeup scheme DB 14, as the recommended makeup pattern, in consideration of the face information output from the user face information analysis unit 12 or, when mode setting has been performed by the user, the set mode. The recommended makeup pattern is displayed on the display unit 18 to be presented to the user. Here, the recommended makeup pattern presented on the display unit 18 may be an image obtained by applying makeup to a face of a model created in advance, like the sample images 182 a to 182 c shown in FIG. 6.
• If a plurality of recommended makeup patterns are presented (step S72: Yes), one of the recommended makeup patterns is selected by the user in step S74.
• Next, in step S76, the makeup support apparatus 1 performs a process of displaying, on the display unit 18, an AR image in which a virtual makeup image created from the recommended makeup pattern is superimposed on the image obtained by imaging the face in the real space (see the AR image 183 shown in FIG. 6). This simulation presenting process (AR image display process) will be described later using the flowchart of FIG. 5.
  • Next, if the recommended makeup pattern presented in the simulation is determined by the user in step S78, a makeup action image for the determined makeup pattern is presented in step S80. The makeup action image will be described later with reference to FIG. 8.
• On the other hand, if there is no favorite makeup pattern among the presented recommended makeup patterns (step S78: No), subsequent recommended makeup patterns are presented (step S70). For example, if a recommendation list has been generated by the recommended makeup pattern judgment unit 13, the recommended makeup patterns are presented in order from the top of the list. Steps S70 to S78 are iteratively performed until a favorite makeup pattern is selected. Alternatively, the process may be performed again from the mode setting in step S66.
  • The process in the makeup support apparatus 1 according to the present embodiment has been described above. Next, the process of presenting the simulation shown in the above-described step S76 will be described with reference to FIG. 5.
• First, as shown in FIG. 5, a process of recognizing a face part position from the captured image is performed by the face and part recognition unit 110 in step S84. If the face part position is recognized (step S86: Yes), an attribute of a face feature point is updated in step S88. The face feature point refers to a landmark on an object that is a recognition target (in the present embodiment, the face and the face parts) and is tracked in the image for three-dimensional recognition in the SLAM process.
  • In step S90, a process of recognizing a three-dimensional position and posture of a face part that is a recognition target in the captured image is performed by the SLAM processing unit 111.
• If the position of the face part in the captured image can be recognized through each recognition process (step S92: Yes), a virtual makeup image is generated by the makeup image generation unit 15 in step S94. An AR image in which the makeup image, which is virtual information, is superimposed on the face of the user in the captured image of the real space is then displayed in step S96.
  • The simulation presenting process in the makeup support apparatus 1 according to the present embodiment has been described above. Next, an example of the AR image displayed on the display unit 18 of the makeup support apparatus 1 according to the present embodiment will be described with reference to the accompanying drawings.
  • (1-4) Example of AR Image
• The AR image according to the present embodiment is an image in which a virtual makeup image is superimposed on a captured image obtained by imaging a real space. The superimposed makeup image may be a makeup completion image generated based on the makeup scheme information for the makeup pattern. For example, a virtual makeup completion image 185 is superimposed on the face image 181 of the user, as in the AR image 183 in FIG. 6. Accordingly, the user can confirm in advance the completion state obtained when applying makeup using the recommended makeup pattern.
  • The user can also confirm the AR image from several angles. Hereinafter, a description will be given with reference to FIGS. 6 and 7.
• If the user desires to confirm the makeup state of the face seen from the side after confirming the makeup state of the front face in the AR image 183 in FIG. 6, for example, the face is directed to the side as shown in FIG. 7. Accordingly, the position and the posture of the face of the user in the image captured by the camera 10 change. Here, since the simulation presenting process shown in FIG. 5 is iteratively performed, if the position and the posture of the face of the user in the image change, the change can be tracked in real time, such that the position and shape of the makeup image, which is superimposed virtual information, can be changed according to the change of the user in the image. Accordingly, the virtual makeup completion image is displayed with a changed shape by tracking the position of the recognized face part, as shown in the AR image 184 of FIG. 6. The user can thus recognize the AR image from several angles in real time.
• Further, the superimposed virtual makeup image may be a makeup action image generated based on the makeup scheme information of the makeup pattern. For example, in the case of the makeup pattern having the pattern ID “P-00001” in the makeup scheme information shown in FIG. 3, a makeup action image indicating the following procedure is superimposed and displayed: in process 001, cosmetics I-201 are applied to tool T-221 under a pressure of 20, and then, in process 002, cosmetics I-201 are applied to face part P-002 (e.g., a cheek) over a length of 21 along a straight line (motion=2) without deviation (distance=0) under a pressure of 12. The makeup action images are superimposed and displayed in order of process number; an example of the timing at which the makeup action image for the next procedure is displayed will be described in a variant later.
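• Presenting the action images in process order can then be as simple as sorting the steps of the selected pattern by their process number. The sketch below reuses the illustrative MakeupStep record from the earlier sketch; the caption format is an assumption.

```python
def action_image_sequence(steps):
    """Yield the steps of one makeup pattern in process-number order, together
    with a caption describing the action image to superimpose for that step."""
    for step in sorted(steps, key=lambda s: s.process):
        caption = (f"process {step.process:03d}: apply {step.item_id} to "
                   f"{step.part_id} with tool {step.tool_id} "
                   f"(pressure {step.pressure}, motion {step.motion})")
        yield step, caption
```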
  • A superimposed position of the makeup action image is changed or transformed according to the change of the position and the posture of the face part of the user in the image (tracking display), similar to the makeup completion image described using FIG. 6. Accordingly, for example, as shown in FIG. 8, a makeup action image 187 a in which a hand holding an eyebrow pencil moves along a makeup area indicated by a dotted line is changed into a makeup action image 187 b according to the change of the position and posture of the face part of the user in the image. That is, the makeup action image tracks the face part. Accordingly, the user can confirm the makeup action image as a makeup model from several angles in real time and can apply makeup more accurately.
  • The example of the AR image has been described above. Further, the virtual makeup image may be two-dimensional information or three-dimensional information.
  • (1-5) Variant
  • Next, a variant of the makeup support apparatus 1 according to the first embodiment described above will be described with reference to FIG. 9. FIG. 9 is a block diagram showing a configuration of a makeup support apparatus according to the present variant. The makeup support apparatus according to the present variant includes a camera 10, an image recognition unit 11, a user face information analysis unit 12, a recommended makeup pattern judgment unit 13, a makeup scheme DB 14, a makeup image generation unit 15, a makeup progress degree judgment unit 16, a display control unit 17, a display unit 18, and a manipulation unit 19, as shown in FIG. 9. The image recognition unit 11 includes a face and part recognition unit 110 and an SLAM processing unit 111.
  • The makeup progress degree judgment unit 16 compares a recognition result output from the image recognition unit 11 with a previously generated makeup completion image to judge a progress degree of real makeup being applied by the user. The makeup image generation unit 15 modifies the superimposed and displayed makeup action image according to the progress degree output from the makeup progress degree judgment unit 16. The modification may also be based on the steps taken toward completion of the makeup procedure. For example, the makeup image generation unit 15 may modify the superimposed and displayed makeup action image to depict a next step in the makeup procedure to be performed, following a determination that a particular step has been performed. A detailed description of other configurations will be omitted since they are the same as those described above.
  • According to an example configuration, as the makeup being applied by the user approaches a completion state, portions of the makeup action image can be removed or made transparent, where the removed or transparent portions may correspond to sections of the face where makeup has been applied according to the makeup procedure. Accordingly, the user can visually confirm to what extent the makeup currently being applied is approaching the completion state. Further, a time at which a makeup action image for a next procedure is displayed may be determined according to the progress degree of the makeup being applied by the user. Accordingly, if the makeup according to the shown procedure is completed, the makeup action image for the next procedure is automatically displayed.
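• One illustrative way to judge the progress degree (the colour-difference measure, threshold, and completion cutoff below are assumptions, not taken from the disclosure) is to compare each face region of the current frame with the corresponding region of the makeup completion image and fade the action image over regions that already match:

```python
import numpy as np

def region_progress(current_bgr, completion_bgr, region_mask, threshold=30.0):
    """Fraction of masked pixels whose colour is already close to the
    completion image (a crude, illustrative progress measure)."""
    diff = np.linalg.norm(
        current_bgr.astype(np.float32) - completion_bgr.astype(np.float32), axis=2)
    applied = (diff < threshold) & (region_mask > 0)
    total = np.count_nonzero(region_mask)
    return np.count_nonzero(applied) / total if total else 0.0

def fade_completed_regions(action_rgba, region_masks, progresses, done=0.9):
    """Make the action image transparent over regions judged complete."""
    out = action_rgba.copy()
    for mask, p in zip(region_masks, progresses):
        if p >= done:
            out[mask > 0, 3] = 0   # zero the alpha channel over the finished region
    return out
```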
• The manipulation of the user may also be used as a trigger for the timing at which the makeup action image for the next procedure is displayed as described above. For example, if display of the next procedure is instructed, the makeup action image for the next procedure is forcibly displayed. Further, a procedure may be skipped by the manipulation of the user. Further, a makeup action image for a face part approached by the hand of the user may be displayed. If changing the order prescribed by the makeup procedure is not desirable, an alert indicating that fact may be displayed.
  • Further, as shown in FIG. 10, an image 191 on which the makeup completion image is superimposed may be displayed, in addition to the AR image 190 on which the makeup action image is superimposed. Accordingly, the user can apply makeup while confirming a makeup completion state.
  • 2. Second Embodiment
  • Next, a makeup support system according to a second embodiment of the present disclosure will be described with reference to FIGS. 11 and 12. According to the present embodiment, makeup service of a desired makeup artist can be received.
  • FIG. 11 is a diagram showing an overview of the makeup support system according to the present embodiment. As shown in FIG. 11, the makeup support system according to the present embodiment includes a makeup support apparatus 1, a server 30, and a makeup scheme acquisition apparatus 4 which are connected via a network 6.
  • The makeup scheme acquisition apparatus 4 includes various information acquisition units such as a camera 40, a motion sensor 41 and a pressure sensor 42. A makeup scheme of a makeup artist is acquired by such a makeup scheme acquisition apparatus 4 and stored in a makeup scheme DB 302 of the server 30. Specifically, various sensors are attached to the arm or hand of the makeup artist to acquire information such as pressure when a makeup tool contacts cosmetics, pressure when makeup is applied to the face, and a locus. Further, a makeup procedure is imaged by the camera 40. Alternatively, various sensors may be attached to cosmetics, makeup tools, or a mannequin to acquire makeup scheme information of makeup artists. Further, IDs of used cosmetics, or information indicating that a plurality of cosmetics are used together, if any, are input.
  • The server 30 includes a scheme analysis unit 301 and the makeup scheme DB 302. The scheme analysis unit 301 analyzes a makeup scheme from the various information acquired by the makeup scheme acquisition apparatus 4 and digitizes the makeup scheme with reference to a generalized face image. The makeup scheme DB 302 stores the makeup scheme information analyzed by the scheme analysis unit 301.
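• As a sketch of how the scheme analysis unit 301 might reduce raw sensor samples to scheme-style numbers with reference to a generalized face image (the sampling format, the normalization, and the derived fields are assumptions for illustration):

```python
import numpy as np

def digitize_stroke(positions, pressures, face_scale):
    """Reduce one recorded stroke to scheme-style numbers.

    positions: (N, 2) array of tool positions on a generalized face, in pixels.
    pressures: (N,) array of pressure-sensor readings.
    face_scale: pixels per unit length on the generalized face image.
    """
    segs = np.diff(np.asarray(positions, dtype=np.float32), axis=0)
    path_length = float(np.linalg.norm(segs, axis=1).sum()) / face_scale
    dx, dy = segs.sum(axis=0)
    return {
        "pressure": float(np.mean(pressures)),   # average applied pressure
        "length": path_length,                   # normalized stroke length
        "direction": float(np.degrees(np.arctan2(dy, dx))),  # overall direction
    }
```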
  • Next, a configuration of a makeup support apparatus 1 according to the present embodiment is shown in FIG. 12. As shown in FIG. 12, the makeup support apparatus 1 according to the present embodiment includes a camera 10, an image recognition unit 11, a user face information analysis unit 12, a recommended makeup pattern judgment unit 13, a makeup image generation unit 15, a display control unit 17, a display unit 18, a manipulation unit 19, and a communication unit 20. The image recognition unit 11 includes a face and part recognition unit 110 and an SLAM processing unit 111.
• The communication unit 20 establishes a communication connection with the server 30. The communication unit 20 may use wired or wireless communication, but usually exchanges information with the server 30 using wireless communication such as a wireless LAN or Bluetooth (registered trademark). This enables communication for receiving the makeup service of a makeup artist desired by the user. Specifically, the communication unit 20 receives the makeup scheme information from the server 30. A description of the other configurations will be omitted since they are the same as those of the makeup support apparatus 1 according to the first embodiment.
  • As described above, the makeup scheme information acquired from each makeup artist is stored in the makeup scheme DB 302 of the server 30. On the other hand, in the makeup support apparatus 1, a list of selectable makeup artists is displayed on the display unit 18, as shown in FIG. 11, and makeup scheme information of the artist selected by the user is acquired from the server 30. Accordingly, a recommended makeup pattern judgment unit of the makeup support apparatus 1 can judge a makeup pattern suitable for the user from makeup patterns based on the makeup scheme information of the makeup artist desired by the user.
• Further, in the present embodiment, since it is sufficient that the makeup scheme information acquired from the makeup artist is stored in the makeup scheme DB 302, the makeup scheme acquisition apparatus 4 shown in FIG. 11 is not an indispensable component.
  • 3. Third Embodiment
  • Next, a makeup support system according to a third embodiment of the present disclosure will be described with reference to FIGS. 13 and 14. According to the present embodiment, makeup scheme information can be exchanged among users.
  • FIG. 13 is a diagram showing an overview of a makeup support system according to the present embodiment. As shown in FIG. 13, the makeup support system according to the present embodiment includes makeup support apparatuses 1A to 1D for respective users and a server 31, which are connected via a network 6.
  • A configuration of the makeup support apparatus 1 according to the present embodiment is shown in FIG. 14. As shown in FIG. 14, the makeup support apparatus 1 according to the present embodiment includes a camera 10, an image recognition unit 11, a user face information analysis unit 12, a recommended makeup pattern judgment unit 13, a makeup image generation unit 15, a display control unit 17, a display unit 18, a manipulation unit 19 and a communication unit 20. The image recognition unit 11 includes a face and part recognition unit 110 and an SLAM processing unit 111.
• The camera 10 acquires an image of makeup done by the user and outputs the image to the communication unit 20. The communication unit 20 establishes a communication connection with the server 31 and transmits the captured image output from the camera 10 to the server 31. Further, the communication unit 20 receives makeup scheme information of other users from the makeup scheme DB 312 of the server 31. This makes it possible to exchange makeup scheme information with the other users. The recommended makeup pattern judgment unit 13 judges a makeup pattern suitable for the user from the makeup patterns based on the makeup scheme information of the other users acquired by the communication unit 20. A description of the other configurations will be omitted since they are the same as those of the makeup support apparatus 1 according to the first embodiment.
  • The server 31 includes a scheme analysis unit 311 and a makeup scheme DB 312. The scheme analysis unit 311 analyzes a makeup scheme from the captured image obtained by imaging a makeup action of the user transmitted from each makeup support apparatus 1, and digitizes the makeup scheme with reference to a generalized face image. The scheme analysis unit 311, for example, compares a professional's makeup action image or a reference makeup action image with the makeup action image of the user to calculate a difference therebetween, and digitizes the makeup scheme. The makeup scheme DB 312 stores the makeup scheme information analyzed by the scheme analysis unit 311.
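• A minimal sketch of such a comparison could weight the per-field differences between a user's digitized step and a reference (professional) step; the field names follow the FIG. 3 scheme fields, and the weights are assumptions.

```python
def scheme_difference(user_step, reference_step, weights=None):
    """Weighted absolute difference between two digitized makeup steps,
    each given as a dict of numeric scheme fields such as pressure or length."""
    weights = weights or {"pressure": 1.0, "length": 1.0, "distance": 1.0}
    return sum(w * abs(user_step[k] - reference_step[k])
               for k, w in weights.items())
```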
  • Alternatively, the makeup support apparatus 1 may include various sensors such as motion sensors and pressure sensors, and transmit information acquired by the various sensors when the user applies makeup to the server 31. In this case, the scheme analysis unit 311 analyzes a makeup scheme from various information transmitted from the makeup support apparatus 1 and digitizes the makeup scheme with reference to a generalized face image.
• As described above, the makeup scheme information acquired from each user is stored in the makeup scheme DB 312 of the server 31. Meanwhile, in the makeup support apparatus 1A, a list of other selectable users is displayed on the display unit 18, as shown in FIG. 13, and the makeup scheme information of the other user selected by the user is acquired from the server 31. Accordingly, the recommended makeup pattern judgment unit of the makeup support apparatus 1 can judge a makeup pattern suitable for the user from the makeup patterns based on the makeup scheme information of another user, such as a friend of the user. Accordingly, the makeup support system can be further utilized, for example, for friendly competition over makeup schemes among friends, for imitating the makeup schemes of other users, or for becoming well known among nonprofessional amateurs.
  • 4. Fourth Embodiment
  • Next, a makeup support system according to a fourth embodiment of the present disclosure will be described with reference to FIGS. 15 and 16. According to the present embodiment, information of appropriate cosmetics and cosmetics sale service can be provided.
  • FIG. 15 is a diagram showing an overview of the makeup support system according to the present embodiment. As shown in FIG. 15, the makeup support system according to the present embodiment includes a makeup support apparatus 1, a server 32 and a sale management apparatus 5 for each cosmetics shop, which are connected via a network 6.
  • A configuration of the makeup support apparatus 1 according to the present embodiment is shown in FIG. 16. As shown in FIG. 16, the makeup support apparatus 1 according to the present embodiment includes a camera 10, an image recognition unit 11, a user face information analysis unit 12, a recommended makeup pattern judgment unit 13, a makeup image generation unit 15, a display control unit 17, a display unit 18, a manipulation unit 19, a communication unit 20, and a cosmetics information providing unit 21. The image recognition unit 11 includes a face and part recognition unit 110 and an SLAM processing unit 111.
  • The communication unit 20 establishes a communication connection with the server 32 and exchanges information with the server 32. Specifically, the communication unit 20 performs reception of makeup scheme information, transmission of a makeup ID, and reception of cosmetics information from/to the server 32.
• The recommended makeup pattern judgment unit 13 judges a recommended makeup pattern based on the makeup scheme information received by the communication unit 20. Further, the recommended makeup pattern judgment unit 13 outputs a cosmetics ID (see FIG. 3) of the makeup pattern judged to be the recommended makeup pattern to the server 32 via the communication unit 20.
• The cosmetics information providing unit 21 provides the user with the cosmetics information transmitted from the server 32 according to the cosmetics ID output from the recommended makeup pattern judgment unit 13. For example, the maker, brand name, product name, price, product explanation, and the like are displayed as cosmetics information on the display unit 18, as shown in FIG. 15. Accordingly, the user is provided with information on the appropriate cosmetics to be used when applying the makeup based on the recommended makeup pattern. Further, as shown in FIG. 15, a buy button 192 is displayed together with the cosmetics information on the display unit 18, so that the user can easily perform a procedure of purchasing the appropriate cosmetics.
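• A minimal sketch of the exchange with the server 32, assuming a purely hypothetical REST endpoint (the URL and the response fields are illustrative and not part of the disclosure):

```python
import requests

SERVER_URL = "https://example.com/api"  # hypothetical endpoint for server 32

def fetch_cosmetics_info(cosmetics_id):
    """Ask the server for the cosmetics record matching a cosmetics ID."""
    resp = requests.get(f"{SERVER_URL}/cosmetics/{cosmetics_id}", timeout=5)
    resp.raise_for_status()
    # Assumed fields: maker, brand name, product name, price, explanation.
    return resp.json()
```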
  • A description of other configurations of the makeup support apparatus 1 according to the present embodiment will be omitted since they are the same as those of the makeup support apparatus 1 according to the first embodiment.
• The server 32 includes a makeup scheme DB 322 and a cosmetics information DB 323. The makeup scheme DB 322 stores the makeup scheme information, similarly to the makeup scheme DB in each embodiment described above. The cosmetics information DB 323 stores cosmetics information corresponding to each cosmetics ID contained in the makeup scheme information stored in the makeup scheme DB 322.
  • The sale management apparatus 5 performs product sale management according to a cosmetics purchase request from the makeup support apparatus 1. The sale management apparatus 5 may be owned by each cosmetics shop. Alternatively, a normal net sale system (online shopping) may be used.
• As described above, the makeup support apparatus 1 according to the present embodiment performs makeup support by offering for sale the cosmetics necessary for the user to realize the recommended makeup pattern.
  • 5. Fifth Embodiment
  • Next, a makeup support system according to a fifth embodiment of the present disclosure will be described. A makeup support apparatus 1 according to the present embodiment judges a recommended makeup pattern in consideration of effects of makeup on skin.
• Specifically, ideal skin quality, skin texture, and the like after, for example, five or ten years are set (an ideal future face image), skin quality, skin texture, and the like after five or ten years are predicted from the current face information (a predicted future face image), and a makeup pattern that hides the differences between them is judged to be the recommended makeup pattern. In addition to the current face information, information having an influence on the skin, such as age, sex, race, and life pattern, may be input by the user.
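• The disclosure does not specify a prediction model. The sketch below only illustrates the idea of scoring patterns against the gap between an ideal and a predicted future face; the decay model, the ideal scores, and the "covers" field are entirely assumed.

```python
def recommend_long_term(patterns, current_face, years=5):
    """Pick the pattern that best hides the gap between the skin predicted
    after `years` years and an assumed ideal target (illustrative models only).

    current_face: dict of skin scores in [0, 1], e.g. {"skin_quality": 0.8}.
    patterns: list of dicts whose optional "covers" dict declares how much of
    each score's gap the pattern can hide.
    """
    predicted = {k: v * (0.98 ** years) for k, v in current_face.items()}  # assumed decay
    ideal = {k: 1.0 for k in current_face}                                 # assumed ideal
    gap = {k: ideal[k] - predicted[k] for k in current_face}

    def coverage(pattern):
        covers = pattern.get("covers", {})
        return sum(min(covers.get(k, 0.0), g) for k, g in gap.items())

    return max(patterns, key=coverage)
```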
• Accordingly, from a long-term point of view, a makeup pattern that takes temporal changes of the skin into consideration can be recommended.
  • 6. Sixth Embodiment
• Next, a makeup support system according to a sixth embodiment of the present disclosure will be described. In the makeup support apparatus 1 according to the present embodiment, a face image of the user's favorite face, such as a face photograph of an entertainer, is input by the user, and a makeup pattern that brings the face of the user as close as possible to the favorite face is judged to be the recommended makeup pattern based on an analysis result of the input face image and the face information of the user.
  • Accordingly, a makeup pattern close to the user's favorite face can be recommended.
  • 7. Summary
  • As described above, the makeup support apparatus 1 according to the embodiment of the present disclosure performs simulation of a state after makeup completion using the AR technology, thereby improving the makeup support scheme. Further, a procedure during makeup application is displayed using the AR technology, thereby further improving the makeup support scheme.
• In the respective embodiments described above, all the processes are performed in the makeup support apparatus 1, but the present disclosure is not limited to such examples. For example, a process with a heavy load among the processes performed in the makeup support apparatus 1 may be performed by a server connected via the network, or performed in a distributed manner by remote devices or servers in, for example, a cloud computing configuration. For example, the captured image captured by the camera 10 may be transmitted from the makeup support apparatus 1 to the server, and the user face information of the captured image may be analyzed by the server to judge a recommended makeup pattern. In this case, the server judges the recommended makeup pattern from the captured image transmitted from the makeup support apparatus 1 and transmits the makeup scheme information of that makeup pattern to the makeup support apparatus 1. Further, the server may generate the makeup image from the recommended makeup pattern and transmit the makeup image to the makeup support apparatus 1. Further, the makeup scheme DB 14 for storing the makeup scheme information may be disposed in the server.
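• As a sketch of such offloading (the endpoint URL and the response format are hypothetical assumptions), the apparatus could upload the captured frame and receive the recommended makeup scheme information in return:

```python
import cv2
import requests

ANALYSIS_URL = "https://example.com/api/analyze"  # hypothetical server endpoint

def request_recommendation(frame_bgr):
    """Offload face analysis and pattern judgment to a server: upload the
    captured frame, receive makeup scheme information for the recommendation."""
    ok, jpeg = cv2.imencode(".jpg", frame_bgr)
    if not ok:
        raise RuntimeError("could not encode the captured frame")
    resp = requests.post(
        ANALYSIS_URL,
        files={"image": ("frame.jpg", jpeg.tobytes(), "image/jpeg")},
        timeout=10)
    resp.raise_for_status()
    return resp.json()  # assumed: makeup scheme information for the recommended pattern
```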
  • As the process with a heavy load is performed by the server as described above, power consumption of the makeup support apparatus 1 can be reduced and hardware resources necessary for the makeup support apparatus 1 can be reduced. Further, if the makeup scheme DB 14 for storing the makeup schemes is disposed in the server, storage capacity of the makeup support apparatus 1 can be reduced and the same makeup scheme information can be easily used or managed between makeup support apparatuses.
  • The embodiments of the present disclosure have been described in detail with reference to the accompanying drawings, but the present disclosure is not limited to such examples. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof. Also, any reference in the claims to articles, such as “a” or “an,” is to be construed as meaning “one or more.”
  • For example, the present technology can adopt the following configurations.
  • (1) An information processing apparatus comprising:
  • an imaging unit for capturing an image;
  • an image recognition unit for sequentially acquiring images from the imaging unit and recognizing parts of a face in the acquired image; and
  • a display control unit for displaying a virtual makeup image tracking the face in the image based on a position of the face parts contained in a recognition result output from the image recognition unit, the virtual makeup image being superimposed on the image.
  • (2) The information processing apparatus according to the (1), wherein the display control unit displays a makeup completion image as the virtual makeup image.
    (3) The information processing apparatus according to the (1), wherein the display control unit displays a makeup action image as the virtual makeup image.
    (4) The information processing apparatus according to any one of the (1) to (3), further comprising an image generation unit for generating the virtual makeup image based on makeup scheme information.
    (5) The information processing apparatus according to the (4), further comprising:
  • an analysis unit for analyzing face information based on the recognition result output from the image recognition unit; and
  • a recommended makeup pattern judgment unit for judging a recommended makeup pattern according to an analysis result output from the analysis unit,
  • wherein the makeup pattern includes the makeup scheme information, and
  • wherein the image generation unit generates the virtual makeup image based on makeup scheme information of a makeup pattern judged to be a recommended makeup pattern by the recommended makeup pattern judgment unit.
  • (6) The information processing apparatus according to the (4) or (5), further comprising:
  • a makeup progress degree judgment unit for judging a makeup progress degree by comparing the image captured by the imaging unit with a previously generated makeup completion image,
  • wherein the display control unit sequentially displays makeup action images to be superimposed on the image, the makeup action images being changed according to a judgment result from the makeup progress degree judgment unit.
    (7) The information processing apparatus according to the (6), wherein order of the makeup action images sequentially displayed by the display control unit is changed according to a manipulation of the user.
    (8) An information processing method comprising:
  • capturing an image;
  • sequentially acquiring images through the capturing step and recognizing parts of a face in the acquired image; and
  • displaying a virtual makeup image tracking the face in the image based on a position of the face parts contained in a recognition result output in the recognition step, the virtual makeup image being superimposed on the image.
  • (9) An information processing system comprising:
  • an information processing apparatus including
  • an imaging unit for capturing an image,
  • an image recognition unit for sequentially acquiring images from the imaging unit and recognizing parts of a face in the acquired image, and
  • a display control unit for displaying a virtual makeup image tracking the face in the image based on a position of the face parts contained in a recognition result output from the image recognition unit, the virtual makeup image being superimposed on the image; and
  • a server including
  • an analysis unit for analyzing face information based on the recognition result output from the image recognition unit, and
  • a recommended makeup pattern judgment unit for judging a recommended makeup pattern according to an analysis result output from the analysis unit,
  • wherein the information processing apparatus generates the virtual makeup image based on makeup scheme information of the recommended makeup pattern acquired from the server.
    (10) A program for causing a computer to function as an information processing apparatus comprising:
  • an imaging unit for capturing an image;
  • an image recognition unit for sequentially acquiring images from the imaging unit and recognizing parts of a face in the acquired image; and
  • a display control unit for displaying a virtual makeup image tracking the face in the image based on a position of the face parts contained in a recognition result output from the image recognition unit, the virtual makeup image being superimposed on the image.

Claims (18)

1. An apparatus comprising:
a receiving unit configured to receive image data representing an input image, the input image containing at least one facial image;
a recognition unit configured to recognize the facial image in the image data, and recognize facial features of the facial image;
a makeup image generation unit configured to generate data representing a makeup image based on the recognized facial features, the makeup image providing information assisting in the application of makeup; and
a display generation unit configured to generate output image data representing the makeup image superimposed on the facial image.
2. The apparatus of claim 1, wherein the makeup image depicts a first step in a makeup procedure to be performed.
3. The apparatus of claim 1, wherein the makeup image depicts a final result of a makeup procedure, the makeup image being modified, after applications of makeup to a facial feature, by removing or making transparent a portion of the makeup image corresponding to the applied makeup on the facial feature.
4. The apparatus of claim 1, comprising a progress determining unit configured to:
compare the recognized facial image with a makeup completion image depicting a completed makeup operation performed according to a makeup procedure, and
determine a progress degree toward completion of the makeup procedure.
5. The apparatus of claim 4, wherein the makeup image generation unit is configured to modify the makeup image based on the determined progress degree.
6. The apparatus of claim 4, wherein the display generation unit is configured to generate the output image data representing the makeup completion image superimposed on the facial image.
7. The apparatus of claim 5, wherein the modified makeup image depicts a second step in the makeup procedure to be performed.
8. The apparatus of claim 1, comprising an analyzing unit configured to analyze information corresponding to the recognized facial features.
9. The apparatus of claim 8, wherein the information corresponding to the recognized facial features includes at least one of texture information of the facial features, or shape information of the facial features.
10. The apparatus of claim 1, comprising a recommendation unit configured to select a makeup pattern for use in generating the makeup image.
11. The apparatus of claim 10, wherein the makeup pattern is based on the recognized facial features.
12. The apparatus of claim 10, wherein the recommendation unit is configured to select the makeup pattern based on a mode indicating a style preference, the makeup pattern conforming to the style preference.
13. The apparatus of claim 10, wherein the recommendation unit is configured to select a plurality of recommended makeup patterns.
14. The apparatus of claim 1, wherein:
the recognition unit is configured to iteratively perform recognition processing on the image data to detect a change in position of the facial features; and
the display generation unit is configured to generate the output image data representing the makeup image transformed according to the position of the facial features.
15. The apparatus of claim 1, comprising a display unit configured to display the output image data.
16. A method comprising:
receiving image data representing an input image, the input image containing at least one facial image;
recognizing the facial image in the image data, and recognizing facial features of the facial image;
generating data representing a makeup image based on the recognized facial features, the makeup image providing information assisting in the application of makeup; and
generating output image data representing the makeup image superimposed on the facial image.
17. A tangibly embodied non-transitory computer-readable storage medium storing instructions which, when executed by a processor, cause a computer to perform a method, comprising:
receiving image data representing an input image, the input image containing at least one facial image;
recognizing the facial image in the image data, and recognizing facial features of the facial image;
generating data representing a makeup image based on the recognized facial features, the makeup image providing information assisting in the application of makeup; and
generating output image data representing the makeup image superimposed on the facial image.
18. An apparatus comprising:
receiving means for receiving image data representing an input image, the input image containing at least one facial image;
recognition means for recognizing the facial image in the image data, and recognizing facial features of the facial image;
makeup image generation means for generating data representing a makeup image based on the recognized facial features, the makeup image providing information assisting in the application of makeup; and
display generation means for generating output image data representing the makeup image superimposed on the facial image.
US13/400,980 2011-03-01 2012-02-21 Information processing apparatus, information processing method, and computer-readable storage medium Abandoned US20120223956A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US14/980,630 US10945514B2 (en) 2011-03-01 2015-12-28 Information processing apparatus, information processing method, and computer-readable storage medium
US17/189,915 US20210177124A1 (en) 2011-03-01 2021-03-02 Information processing apparatus, information processing method, and computer-readable storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011-044274 2011-03-01
JP2011044274A JP2012181688A (en) 2011-03-01 2011-03-01 Information processing device, information processing method, information processing system, and program

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/980,630 Continuation US10945514B2 (en) 2011-03-01 2015-12-28 Information processing apparatus, information processing method, and computer-readable storage medium

Publications (1)

Publication Number Publication Date
US20120223956A1 true US20120223956A1 (en) 2012-09-06

Family

ID=46753026

Family Applications (3)

Application Number Title Priority Date Filing Date
US13/400,980 Abandoned US20120223956A1 (en) 2011-03-01 2012-02-21 Information processing apparatus, information processing method, and computer-readable storage medium
US14/980,630 Active US10945514B2 (en) 2011-03-01 2015-12-28 Information processing apparatus, information processing method, and computer-readable storage medium
US17/189,915 Abandoned US20210177124A1 (en) 2011-03-01 2021-03-02 Information processing apparatus, information processing method, and computer-readable storage medium

Family Applications After (2)

Application Number Title Priority Date Filing Date
US14/980,630 Active US10945514B2 (en) 2011-03-01 2015-12-28 Information processing apparatus, information processing method, and computer-readable storage medium
US17/189,915 Abandoned US20210177124A1 (en) 2011-03-01 2021-03-02 Information processing apparatus, information processing method, and computer-readable storage medium

Country Status (2)

Country Link
US (3) US20120223956A1 (en)
JP (1) JP2012181688A (en)

Cited By (80)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130216295A1 (en) * 2012-02-20 2013-08-22 Charlene Hsueh-Ling Wong Eyes make-up application machine
US20130286036A1 (en) * 2012-04-26 2013-10-31 Myongji University Industry and Academia Corporation Foundation Apparatus and method for producing makeup avatar
KR20130121003A (en) * 2012-04-26 2013-11-05 한국전자통신연구원 Method and device for producing dressed avatar
US20140016823A1 (en) * 2012-07-12 2014-01-16 Cywee Group Limited Method of virtual makeup achieved by facial tracking
CN103870821A (en) * 2014-04-10 2014-06-18 上海影火智能科技有限公司 Virtual make-up trial method and system
CN103885461A (en) * 2012-12-21 2014-06-25 宗经投资股份有限公司 Movement method for makeup tool of automatic makeup machine
US20140210814A1 (en) * 2013-01-25 2014-07-31 Electronics & Telecommunications Research Institute Apparatus and method for virtual makeup
US20140314315A1 (en) * 2013-03-25 2014-10-23 Brightex Bio-Photonics Llc Systems and Methods for Recommending Cosmetic Products for Users with Mobile Devices
CN104205168A (en) * 2013-02-01 2014-12-10 松下电器产业株式会社 Makeup application assistance device, makeup application assistance method, and makeup application assistance program
CN104333566A (en) * 2013-12-23 2015-02-04 乐视网信息技术(北京)股份有限公司 Information acquiring and recommending method, device and system
US20150038225A1 (en) * 2012-03-13 2015-02-05 Neowiz Bless Studio Corporation Online game providing method for providing character makeup and system therefor
EP2820970A4 (en) * 2013-02-01 2015-06-10 Panasonic Ip Man Co Ltd Makeup application assistance device, makeup application assistance method, and makeup application assistance program
CN104797165A (en) * 2013-08-30 2015-07-22 松下知识产权经营株式会社 Makeup assistance device, makeup assistance method, and makeup assistance program
US20150234942A1 (en) * 2014-02-14 2015-08-20 Possibility Place, Llc Method of making a mask with customized facial features
US20150262403A1 (en) * 2014-03-13 2015-09-17 Panasonic Intellectual Property Management Co., Ltd. Makeup support apparatus and method for supporting makeup
CN105101836A (en) * 2013-02-28 2015-11-25 松下知识产权经营株式会社 Makeup assistance device, makeup assistance method, and makeup assistance program
EP2821959A4 (en) * 2013-02-01 2015-12-02 Panasonic Ip Man Co Ltd Makeup application assistance device, makeup application assistance method, and makeup application assistance program
CN105188467A (en) * 2013-03-22 2015-12-23 松下知识产权经营株式会社 Makeup support device, makeup support method, and makeup support program
US20160015152A1 (en) * 2013-03-22 2016-01-21 Panasonic Intellectual Property Management Co., Ltd. Makeup support device, makeup support method, and makeup support program
US20160042557A1 (en) * 2014-08-08 2016-02-11 Asustek Computer Inc. Method of applying virtual makeup, virtual makeup electronic system, and electronic device having virtual makeup electronic system
US20160110587A1 (en) * 2014-10-21 2016-04-21 Samsung Electronics Co., Ltd. Method and apparatus for facial recognition
US20160125229A1 (en) * 2014-11-03 2016-05-05 Anastasia Soare Facial structural shaping
US9412034B1 (en) * 2015-01-29 2016-08-09 Qualcomm Incorporated Occlusion handling for computer vision
US20160328632A1 (en) * 2015-05-05 2016-11-10 Myongsu Choe Makeup supporting methods for creating and applying a makeup guide content to makeup user's face on a real-time basis
US9504925B2 (en) 2014-02-14 2016-11-29 Right Foot Llc Doll or action figure with facial features customized to a particular individual
US20160357578A1 (en) * 2015-06-03 2016-12-08 Samsung Electronics Co., Ltd. Method and device for providing makeup mirror
US9576183B2 (en) 2012-11-02 2017-02-21 Qualcomm Incorporated Fast initialization for monocular visual SLAM
TWI573093B (en) * 2016-06-14 2017-03-01 Asustek Comp Inc Method of establishing virtual makeup data, electronic device having method of establishing virtual makeup data and non-transitory computer readable storage medium thereof
CN107003827A (en) * 2014-09-26 2017-08-01 三星电子株式会社 The method for displaying image and equipment performed by the equipment including changeable mirror
CN107111861A (en) * 2015-01-29 2017-08-29 松下知识产权经营株式会社 Image processing apparatus, stylus and image processing method
US20170256084A1 (en) * 2014-09-30 2017-09-07 Tcms Transparent Beauty, Llc Precise application of cosmetic looks from over a network environment
CN107153806A (en) * 2016-03-03 2017-09-12 炬芯(珠海)科技有限公司 A kind of method for detecting human face and device
US20180077347A1 (en) * 2015-03-26 2018-03-15 Panasonic Intellectual Property Management Co., Ltd. Image synthesis device and image synthesis method
CN107924577A (en) * 2015-10-26 2018-04-17 松下知识产权经营株式会社 Position generating means of making up and makeup position generation method
US20180151086A1 (en) * 2016-11-25 2018-05-31 Naomi Belhassen Semi-permanent makeup system and method
US9986812B2 (en) 2013-02-01 2018-06-05 Panasonic Intellectual Property Management Co., Ltd. Makeup application assistance device, makeup application assistance system, and makeup application assistance method
WO2018104356A1 (en) * 2016-12-06 2018-06-14 Koninklijke Philips N.V. Displaying a guidance indicator to a user
CN108292423A (en) * 2015-12-25 2018-07-17 松下知识产权经营株式会社 Local dressing producing device, local dressing utilize program using device, local dressing production method, local dressing using method, local dressing production process and local dressing
TWI630579B (en) * 2015-12-27 2018-07-21 華碩電腦股份有限公司 Electronic apparatus, computer readable recording medium storing program and facial image displaying method
US20180239954A1 (en) * 2015-09-08 2018-08-23 Nec Corporation Face recognition system, face recognition method, display control apparatus, display control method, and display control program
CN108600614A (en) * 2018-04-02 2018-09-28 珠海格力电器股份有限公司 Image processing method and device
EP3396585A1 (en) * 2017-04-27 2018-10-31 Cal-Comp Big Data, Inc. Lip gloss guide device and method thereof
CN108771316A (en) * 2018-05-30 2018-11-09 杭州任你说智能科技有限公司 A kind of artificial intelligence makeup mirror system
CN108932654A (en) * 2018-06-12 2018-12-04 苏州诚满信息技术有限公司 A kind of virtually examination adornment guidance method and device
US10162997B2 (en) 2015-12-27 2018-12-25 Asustek Computer Inc. Electronic device, computer readable storage medium and face image display method
CN109090808A (en) * 2018-08-08 2018-12-28 颜沿(上海)智能科技有限公司 A kind of intelligently examination adornment dressing glass and method
CN109246345A (en) * 2018-10-23 2019-01-18 Oppo广东移动通信有限公司 U.S. pupil image pickup method, device, storage medium and mobile terminal
US20190035126A1 (en) * 2017-07-25 2019-01-31 Cal-Comp Big Data, Inc. Body information analysis apparatus capable of indicating blush-areas
CN109508587A (en) * 2017-09-15 2019-03-22 丽宝大数据股份有限公司 Biological information analytical equipment and its bottom adornment analysis method
US10324739B2 (en) * 2016-03-03 2019-06-18 Perfect Corp. Systems and methods for simulated application of cosmetic effects
EP3522117A1 (en) * 2018-02-02 2019-08-07 Perfect Corp. Systems and methods for virtual application of cosmetic effects to photo albums and product promotion
CN110135930A (en) * 2018-02-02 2019-08-16 英属开曼群岛商玩美股份有限公司 Virtual application dressing effect and the method, system and storage media for promoting product
US20190297271A1 (en) * 2016-06-10 2019-09-26 Panasonic Intellectual Property Management Co., Ltd. Virtual makeup device, and virtual makeup method
EP3530142A4 (en) * 2016-10-24 2019-10-30 Panasonic Intellectual Property Management Co., Ltd. Image processing device, image processing method, and image processing program
CN110575001A (en) * 2018-06-11 2019-12-17 卡西欧计算机株式会社 display control device, display control method, and medium storing display control program
US20200089935A1 (en) * 2017-07-25 2020-03-19 Cal-Comp Big Data, Inc. Body information analysis apparatus capable of indicating shading-areas
US10740985B2 (en) * 2017-08-08 2020-08-11 Reald Spark, Llc Adjusting a digital representation of a head region
US10750160B2 (en) 2016-01-05 2020-08-18 Reald Spark, Llc Gaze correction of multi-view images
US10762665B2 (en) 2018-05-23 2020-09-01 Perfect Corp. Systems and methods for performing virtual application of makeup effects based on a source image
EP3708029A1 (en) * 2019-03-13 2020-09-16 Cal-Comp Big Data, Inc. Virtual make-up system and virtual make-up coloring method
US10891478B2 (en) 2015-03-20 2021-01-12 Skolkovo Institute Of Science And Technology Method for correction of the eyes image using machine learning and method for machine learning
US10912372B2 (en) 2017-05-16 2021-02-09 Anastasia Beverly Hills, Llc Facial stencils
US11017575B2 (en) 2018-02-26 2021-05-25 Reald Spark, Llc Method and system for generating data to provide an animated visual representation
US11017567B2 (en) * 2017-03-22 2021-05-25 Snow Corporation Dynamic content providing method and system for face recognition camera
CN113168896A (en) * 2019-01-04 2021-07-23 宝洁公司 Method and system for guiding a user to use an applicator
US11093749B2 (en) * 2018-12-20 2021-08-17 L'oreal Analysis and feedback system for personal care routines
JP2021530031A (en) * 2018-07-27 2021-11-04 北京微播視界科技有限公司Beijing Microlive Vision Technology Co., Ltd Face-based special effects generation methods, devices and electronics
CN113850096A (en) * 2018-04-24 2021-12-28 株式会社Lg生活健康 Mobile terminal
US20220007816A1 (en) * 2020-07-07 2022-01-13 Perfect Mobile Corp. System and method for navigating user interfaces using a hybrid touchless control mechanism
CN113993417A (en) * 2019-05-06 2022-01-28 凯尔Os公司 Intelligent mirror subsystem and using method thereof
US20220101566A1 (en) * 2020-09-28 2022-03-31 Snap Inc. Providing augmented reality-based makeup in a messaging system
US11321764B2 (en) * 2016-11-11 2022-05-03 Sony Corporation Information processing apparatus and information processing method
USD963681S1 (en) * 2019-09-05 2022-09-13 Hoffmann-La Roche Inc. Portion of a display screen with a graphical user interface
USD965004S1 (en) * 2021-01-11 2022-09-27 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
US11462028B2 (en) * 2013-12-17 2022-10-04 Sony Corporation Information processing device and information processing method to generate a virtual object image based on change in state of object in real space
US20230101374A1 (en) * 2021-09-30 2023-03-30 L'oreal Augmented reality cosmetic design filters
EP4131144A4 (en) * 2020-04-13 2023-10-11 Beijing Bytedance Network Technology Co., Ltd. Image processing method and apparatus, electronic device, and computer readable storage medium
US11837019B1 (en) * 2023-09-26 2023-12-05 Dauntless Labs, Llc Evaluating face recognition algorithms in view of image classification features affected by smart makeup
US11861255B1 (en) 2017-06-16 2024-01-02 Apple Inc. Wearable device for facilitating enhanced interaction
US12062078B2 (en) 2020-09-28 2024-08-13 Snap Inc. Selecting color values for augmented reality-based makeup

Families Citing this family (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8933994B2 (en) 2013-03-15 2015-01-13 Skin Republic, Inc. Systems and methods for specifying and formulating customized topical agents
JP5664755B1 (en) * 2013-12-20 2015-02-04 フリュー株式会社 Photo sticker creation apparatus and method, and program
JP6086112B2 (en) * 2014-12-08 2017-03-01 株式会社メイクソフトウェア Image processing apparatus, image processing method, and computer program
JP6275086B2 (en) * 2015-07-25 2018-02-07 株式会社オプティム Server, data providing method, and server program
JP6132248B2 (en) * 2016-01-22 2017-05-24 パナソニックIpマネジメント株式会社 Makeup support device
JP6132249B2 (en) * 2016-01-25 2017-05-24 パナソニックIpマネジメント株式会社 Makeup support device, makeup support method, and makeup support program
JP6296305B2 (en) * 2016-01-25 2018-03-20 パナソニックIpマネジメント株式会社 Makeup support device, makeup support method, and makeup support program
JP6078897B2 (en) * 2016-01-26 2017-02-15 パナソニックIpマネジメント株式会社 Makeup support device and makeup support method
JP6078896B2 (en) * 2016-01-26 2017-02-15 パナソニックIpマネジメント株式会社 Makeup support device and makeup support method
JP6650998B2 (en) * 2016-03-04 2020-02-19 株式会社オプティム Mirror, image display method and program
JP6670677B2 (en) * 2016-05-20 2020-03-25 日本電信電話株式会社 Technical support apparatus, method, program and system
TWI585711B (en) * 2016-05-24 2017-06-01 泰金寶電通股份有限公司 Method for obtaining care information, method for sharing care information, and electronic apparatus therefor
JP6314322B2 (en) * 2016-06-20 2018-04-25 株式会社メイクソフトウェア Image processing apparatus, image processing method, and computer program
KR102178566B1 (en) * 2016-06-30 2020-11-13 주식회사 엘지생활건강 Electronic mirror apparatus and method for controlling the same
US10607372B2 (en) * 2016-07-08 2020-03-31 Optim Corporation Cosmetic information providing system, cosmetic information providing apparatus, cosmetic information providing method, and program
CN109310196B (en) * 2016-07-14 2021-08-10 松下知识产权经营株式会社 Makeup assisting device and makeup assisting method
WO2018029963A1 (en) * 2016-08-08 2018-02-15 パナソニックIpマネジメント株式会社 Make-up assistance apparatus and make-up assistance method
CN106774879B (en) * 2016-12-12 2019-09-03 快创科技(大连)有限公司 Plastic surgery experience system based on AR virtual reality technology
US10135822B2 (en) 2017-03-21 2018-11-20 YouaretheID, LLC Biometric authentication of individuals utilizing characteristics of bone and blood vessel structures
US11374929B2 (en) * 2017-03-21 2022-06-28 Global E-Dentity, Inc. Biometric authentication for an augmented reality or a virtual reality device
JP2017201550A (en) * 2017-06-23 2017-11-09 株式会社メイクソフトウェア Image processing apparatus, image processing method, and computer program
TW201931179A (en) 2017-07-13 2019-08-01 美商美國資生堂公司 Systems and methods for virtual facial makeup removal and simulation, fast facial detection and landmark tracking, reduction in input video lag and shaking, and a method for recommending makeup
EP3735684A2 (en) 2018-01-06 2020-11-11 CareOS Smart mirror system and methods of use thereof
JP6601747B2 (en) * 2018-02-06 2019-11-06 パナソニックIpマネジメント株式会社 Makeup support system and makeup support method
US10431010B2 (en) 2018-02-09 2019-10-01 Perfect Corp. Systems and methods for virtual application of cosmetic effects to a remote user
CN110136272B (en) * 2018-02-09 2023-05-30 英属开曼群岛商玩美股份有限公司 System and method for virtually applying makeup effects to remote users
US10395436B1 (en) 2018-03-13 2019-08-27 Perfect Corp. Systems and methods for virtual application of makeup effects with adjustable orientation view
JP6583754B2 (en) * 2018-03-19 2019-10-02 株式会社Novera Information processing device, mirror device, program
CN108829233B (en) * 2018-04-26 2021-06-15 深圳市同维通信技术有限公司 Interaction method and device
JP2021518785A (en) * 2018-04-27 2021-08-05 ザ プロクター アンド ギャンブル カンパニーThe Procter & Gamble Company Methods and systems for improving user compliance of surface-applied products
CN108920490A (en) * 2018-05-14 2018-11-30 京东方科技集团股份有限公司 Makeup assistance implementation method, device, electronic equipment and storage medium
US11042726B2 (en) * 2018-11-05 2021-06-22 Panasonic Intellectual Property Management Co., Ltd. Skin analyzer, skin analysis method, and non-transitory computer-readable recording medium
US11253045B2 (en) 2019-07-18 2022-02-22 Perfect Mobile Corp. Systems and methods for recommendation of makeup effects based on makeup trends and facial analysis
CN112287744A (en) * 2019-07-18 2021-01-29 玩美移动股份有限公司 Method and system for implementing makeup effect suggestion and storage medium
WO2021102291A1 (en) * 2019-11-21 2021-05-27 General Vibration Corporation Systems and methods for producing a pure vibration force from a synchronized dual array of eccentric rotating masses
CN111091610B (en) * 2019-11-22 2023-04-11 北京市商汤科技开发有限公司 Image processing method and device, electronic equipment and storage medium
US11403788B2 (en) 2019-11-22 2022-08-02 Beijing Sensetime Technology Development Co., Ltd. Image processing method and apparatus, electronic device, and storage medium
JP6929979B2 (en) 2020-01-31 2021-09-01 ユニ・チャーム株式会社 Display control device, display control method and display control program
KR102391026B1 (en) * 2020-02-20 2022-04-26 주식회사 엘지생활건강 Mobile terminal and Automatic cosmetic recognition system
US11587358B2 (en) 2020-03-26 2023-02-21 Panasonic Avionics Corporation Managing content on in-flight entertainment platforms
KR102455058B1 (en) * 2021-02-08 2022-10-17 주식회사 엘지생활건강 Mobile terminal and Automatic cosmetic recognition system
CN116888634A (en) * 2021-03-11 2023-10-13 株式会社资生堂 Information processing device, information processing method, information processing program, information processing system, and cosmetic method
WO2024034537A1 (en) * 2022-08-10 2024-02-15 株式会社 資生堂 Information processing device, information processing method, and program

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5520203A (en) * 1993-04-26 1996-05-28 Segerstrom; E. Jane Cosmetics and their selection, application, and distribution
EP0638261B1 (en) * 1993-08-03 1996-01-03 Parfums Christian Dior Method for determining the colour of a make-up restoring sensitively the colour of the skin of a person and device for its application
FR2799022B1 (en) 1999-09-29 2002-02-01 Oreal MAKEUP ASSISTANCE DEVICE AND ASSEMBLY CONSISTING OF SUCH A DEVICE AND A DEVICE FOR DELIVERING A PRODUCT HAVING A PREDETERMINED BRDF, SELECTED BY THE MAKEUP ASSISTANCE DEVICE
JP2002056260A (en) * 2000-08-10 2002-02-20 Itsuo Kagami System and method for reserving beauty parlor
US7079158B2 (en) * 2000-08-31 2006-07-18 Beautyriot.Com, Inc. Virtual makeover system and method
US20020196333A1 (en) * 2001-06-21 2002-12-26 Gorischek Ignaz M. Mirror and image display system
JP2003153739A (en) * 2001-09-05 2003-05-27 Fuji Photo Film Co Ltd Makeup mirror device, and makeup method
US20030065578A1 (en) * 2001-10-01 2003-04-03 Jerome Peyrelevade Methods and systems involving simulated application of beauty products
TWI227444B (en) * 2003-12-19 2005-02-01 Inst Information Industry Simulation method for make-up trial and the device thereof
JPWO2005109246A1 (en) * 2004-05-12 2008-03-21 株式会社味香り戦略研究所 Sensory database
JP4385925B2 (en) 2004-11-02 2009-12-16 花王株式会社 Image forming method
FR2881858A1 (en) * 2005-02-04 2006-08-11 Oreal Interactive system for recommending cosmetic to person, transmits information related to desired appearance characteristic determined by data management unit and cosmetic for obtaining desired appearance characteristic, to person
US20060281053A1 (en) * 2005-06-13 2006-12-14 Medcalf Rochelle R Mannequin head sensor
KR101363691B1 (en) * 2006-01-17 2014-02-14 가부시키가이샤 시세이도 Makeup simulation system, makeup simulation device, makeup simulation method, and makeup simulation program
WO2008102440A1 (en) * 2007-02-21 2008-08-28 Tadashi Goino Makeup face image creating device and method
US7629757B2 (en) * 2007-03-19 2009-12-08 The Electric Lipstick Company, Llc Powered cosmetic dispenser
JP2009064423A (en) * 2007-08-10 2009-03-26 Shiseido Co Ltd Makeup simulation system, makeup simulation device, makeup simulation method, and makeup simulation program
JP2009053981A (en) 2007-08-28 2009-03-12 Kao Corp Makeup simulation device
US8562352B2 (en) * 2008-07-09 2013-10-22 Andrea B. Fairweather Systems, methods and apparatus involving Fairweather faces cosmetics brushes and face charts
JP5302793B2 (en) * 2009-06-24 2013-10-02 ソニーモバイルコミュニケーションズ株式会社 Cosmetic support device, cosmetic support method, cosmetic support program, and portable terminal device
KR101604846B1 (en) * 2009-11-27 2016-03-21 엘지전자 주식회사 Mobile terminal and operation control method thereof
TWI426450B (en) * 2010-10-27 2014-02-11 Hon Hai Prec Ind Co Ltd Electronic cosmetic case

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060132506A1 (en) * 1997-03-06 2006-06-22 Ryuichi Utsugi Method of modifying facial images, makeup simulation method, makeup method, makeup support apparatus and foundation transfer film
US20050175234A1 (en) * 2002-09-03 2005-08-11 Shizuo Sakamoto Head-mounted object image combining method, makeup image combining method, headmounted object image combining device, makeup image composition device, and program
US20040110113A1 (en) * 2002-12-10 2004-06-10 Alice Huang Tool and method of making a tool for use in applying a cosmetic
US20040257439A1 (en) * 2003-06-17 2004-12-23 Moritex Corporation Skin observing apparatus
US20060147119A1 (en) * 2003-06-30 2006-07-06 Shiseido Co., Ltd. Eye form classifying method, form classification map, and eye cosmetic treatment method
US20070019882A1 (en) * 2004-01-30 2007-01-25 Shoji Tanaka Makeup simulation program, makeup simulation device, and makeup simulation method
US20070050639A1 (en) * 2005-08-23 2007-03-01 Konica Minolta Holdings, Inc. Authentication apparatus and authentication method
US20080136895A1 (en) * 2006-12-12 2008-06-12 General Instrument Corporation Mute Function for Video Applications
US20080192980A1 (en) * 2007-02-14 2008-08-14 Samsung Electronics Co., Ltd. Liveness detection method and apparatus of video image
US20090309826A1 (en) * 2008-06-17 2009-12-17 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Systems and devices
US20100215599A1 (en) * 2009-02-23 2010-08-26 L'oreal Method of making up with light-sensitive makeup by applying a base layer and a kit for implementing such a method
US20110211047A1 (en) * 2009-03-27 2011-09-01 Rajeshwar Chhibber Methods and Systems for Imaging and Modeling Skin Using Polarized Lighting

Cited By (157)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130216295A1 (en) * 2012-02-20 2013-08-22 Charlene Hsueh-Ling Wong Eyes make-up application machine
US8899242B2 (en) * 2012-02-20 2014-12-02 Zong Jing Investment, Inc. Eyes make-up application machine
US20150038225A1 (en) * 2012-03-13 2015-02-05 Neowiz Bless Studio Corporation Online game providing method for providing character makeup and system therefor
US20130286036A1 (en) * 2012-04-26 2013-10-31 Myongji University Industry and Academia Corporation Foundation Apparatus and method for producing makeup avatar
KR20130121003A (en) * 2012-04-26 2013-11-05 한국전자통신연구원 Method and device for producing dressed avatar
KR102024903B1 (en) 2012-04-26 2019-09-25 한국전자통신연구원 Method and device for producing dressed avatar
US9378574B2 (en) * 2012-04-26 2016-06-28 Electronics And Telecommunications Research Institute Apparatus and method for producing makeup avatar
US20140016823A1 (en) * 2012-07-12 2014-01-16 Cywee Group Limited Method of virtual makeup achieved by facial tracking
US9224248B2 (en) * 2012-07-12 2015-12-29 Ulsee Inc. Method of virtual makeup achieved by facial tracking
US9576183B2 (en) 2012-11-02 2017-02-21 Qualcomm Incorporated Fast initialization for monocular visual SLAM
CN103885461A (en) * 2012-12-21 2014-06-25 宗经投资股份有限公司 Movement method for makeup tool of automatic makeup machine
CN103970525A (en) * 2013-01-25 2014-08-06 韩国电子通信研究院 Apparatus And Method For Virtual Makeup
US20140210814A1 (en) * 2013-01-25 2014-07-31 Electronics & Telecommunications Research Institute Apparatus and method for virtual makeup
US20160157587A1 (en) * 2013-02-01 2016-06-09 Panasonic Intellectual Property Management Co., Ltd. Makeup application assistance device, makeup application assistance method, and makeup application assistance program
US20160143423A1 (en) * 2013-02-01 2016-05-26 Panasonic Intellectual Property Management Co., Ltd. Makeup application assistance device, makeup application assistance method, and makeup application assistance program
US10242589B2 (en) * 2013-02-01 2019-03-26 Panasonic Intellectual Property Management Co., Ltd. Makeup application assistance device, makeup application assistance method, and makeup application assistance program
EP2820970A4 (en) * 2013-02-01 2015-06-10 Panasonic Ip Man Co Ltd Makeup application assistance device, makeup application assistance method, and makeup application assistance program
EP2821966A4 (en) * 2013-02-01 2015-10-14 Panasonic Ip Man Co Ltd Makeup application assistance device, makeup application assistance method, and makeup application assistance program
US9615647B2 (en) 2013-02-01 2017-04-11 Panasonic Intellectual Property Management Co., Ltd. Makeup application assistance device, makeup application assistance method, and makeup application assistance program
US10249211B2 (en) * 2013-02-01 2019-04-02 Panasonic Intellectual Property Management Co., Ltd. Makeup application assistance device, makeup application assistance method, and makeup application assistance program
EP2821959A4 (en) * 2013-02-01 2015-12-02 Panasonic Ip Man Co Ltd Makeup application assistance device, makeup application assistance method, and makeup application assistance program
US9681736B2 (en) * 2013-02-01 2017-06-20 Panasonic Intellectual Property Management Co., Ltd. Makeup application assistance device, makeup application assistance method, and makeup application assistance program
US9986812B2 (en) 2013-02-01 2018-06-05 Panasonic Intellectual Property Management Co., Ltd. Makeup application assistance device, makeup application assistance system, and makeup application assistance method
US10251463B2 (en) 2013-02-01 2019-04-09 Panasonic Intellectual Property Management Co., Ltd. Makeup application assistance device, makeup application assistance method, and makeup application assistance program
CN104205168A (en) * 2013-02-01 2014-12-10 松下电器产业株式会社 Makeup application assistance device, makeup application assistance method, and makeup application assistance program
US9812030B2 (en) 2013-02-01 2017-11-07 Panasonic Intellectual Property Management Co., Ltd. Makeup application assistance device, makeup application assistance method, and makeup application assistance program
US10028569B2 (en) 2013-02-01 2018-07-24 Panasonic Intellectual Property Management Co., Ltd. Makeup application assistance device, makeup application assistance system, and makeup application assistance method
US10299568B2 (en) * 2013-02-01 2019-05-28 Panasonic Intellectual Property Management Co., Ltd. Makeup application assistance device, makeup application assistance method, and makeup application assistance program
US10292481B2 (en) * 2013-02-01 2019-05-21 Panasonic Intellectual Property Management Co., Ltd. Makeup application assistance device, makeup application assistance method, and makeup application assistance program
US10264870B2 (en) 2013-02-01 2019-04-23 Panasonic Intellectual Property Management Co., Ltd. Makeup application assistance device, makeup application assistance system, and makeup application assistance method
US20160148533A1 (en) * 2013-02-01 2016-05-26 Panasonic Intellectual Property Management Co., Ltd. Makeup application assistance device, makeup application assistance method, and makeup application assistance program
US20160148532A1 (en) * 2013-02-01 2016-05-26 Panasonic Intellectual Property Management Co., Ltd. Makeup application assistance device, makeup application assistance method, and makeup application assistance program
US20160000209A1 (en) * 2013-02-28 2016-01-07 Panasonic Intellectual Property Management Co., Ltd. Makeup assistance device, makeup assistance method, and makeup assistance program
CN105101836A (en) * 2013-02-28 2015-11-25 松下知识产权经营株式会社 Makeup assistance device, makeup assistance method, and makeup assistance program
US10660425B2 (en) * 2013-02-28 2020-05-26 Panasonic Intellectual Property Management Co., Ltd. Makeup assistance device, makeup assistance method, and makeup assistance program
CN109875229A (en) * 2013-03-22 2019-06-14 松下知识产权经营株式会社 Makeup assistance device, skin-adhesive sheet, makeup assistance method, and recording medium
US10342316B2 (en) 2013-03-22 2019-07-09 Panasonic Intellectual Property Management Co., Ltd. Makeup support device, makeup support method, and makeup support program
US20160015152A1 (en) * 2013-03-22 2016-01-21 Panasonic Intellectual Property Management Co., Ltd. Makeup support device, makeup support method, and makeup support program
CN105188467A (en) * 2013-03-22 2015-12-23 松下知识产权经营株式会社 Makeup support device, makeup support method, and makeup support program
US10413042B2 (en) 2013-03-22 2019-09-17 Panasonic Intellectual Property Management Co., Ltd. Makeup support device, makeup support method, and makeup support program
US10010155B2 (en) * 2013-03-22 2018-07-03 Panasonic Intellectual Property Management Co., Ltd. Makeup support device, makeup support method, and makeup support program
US20140314315A1 (en) * 2013-03-25 2014-10-23 Brightex Bio-Photonics Llc Systems and Methods for Recommending Cosmetic Products for Users with Mobile Devices
US9542595B2 (en) * 2013-03-25 2017-01-10 Brightex Bio-Photonics Llc Systems and methods for recommending cosmetic products for users with mobile devices
CN104797165A (en) * 2013-08-30 2015-07-22 松下知识产权经营株式会社 Makeup assistance device, makeup assistance method, and makeup assistance program
EP3039991A4 (en) * 2013-08-30 2016-09-14 Panasonic Ip Man Co Ltd Makeup assistance device, makeup assistance method, and makeup assistance program
US11462028B2 (en) * 2013-12-17 2022-10-04 Sony Corporation Information processing device and information processing method to generate a virtual object image based on change in state of object in real space
CN104333566A (en) * 2013-12-23 2015-02-04 乐视网信息技术(北京)股份有限公司 Information acquiring and recommending method, device and system
US9504925B2 (en) 2014-02-14 2016-11-29 Right Foot Llc Doll or action figure with facial features customized to a particular individual
US20150234942A1 (en) * 2014-02-14 2015-08-20 Possibility Place, Llc Method of making a mask with customized facial features
WO2015123117A3 (en) * 2014-02-14 2015-11-19 Possibility Place, Llc Method of making a mask with customized facial features
US9563975B2 (en) * 2014-03-13 2017-02-07 Panasonic Intellectual Property Management Co., Ltd. Makeup support apparatus and method for supporting makeup
US20150262403A1 (en) * 2014-03-13 2015-09-17 Panasonic Intellectual Property Management Co., Ltd. Makeup support apparatus and method for supporting makeup
CN103870821A (en) * 2014-04-10 2014-06-18 上海影火智能科技有限公司 Virtual make-up trial method and system
US20160042557A1 (en) * 2014-08-08 2016-02-11 Asustek Computer Inc. Method of applying virtual makeup, virtual makeup electronic system, and electronic device having virtual makeup electronic system
CN107003827A (en) * 2014-09-26 2017-08-01 三星电子株式会社 Image display method performed by a device including a switchable mirror, and the device
EP3201834A4 (en) * 2014-09-30 2018-08-08 TCMS Transparent Beauty LLC Precise application of cosmetic looks from over a network environment
US10553006B2 (en) 2014-09-30 2020-02-04 Tcms Transparent Beauty, Llc Precise application of cosmetic looks from over a network environment
US20170256084A1 (en) * 2014-09-30 2017-09-07 Tcms Transparent Beauty, Llc Precise application of cosmetic looks from over a network environment
US20200167983A1 (en) * 2014-09-30 2020-05-28 Tcms Transparent Beauty, Llc Precise application of cosmetic looks from over a network environment
KR20160046560A (en) * 2014-10-21 2016-04-29 삼성전자주식회사 Method and apparatus for face recognition
KR102214918B1 (en) * 2014-10-21 2021-02-10 삼성전자주식회사 Method and apparatus for face recognition
US10248845B2 (en) * 2014-10-21 2019-04-02 Samsung Electronics Co., Ltd. Method and apparatus for facial recognition
US20160110587A1 (en) * 2014-10-21 2016-04-21 Samsung Electronics Co., Ltd. Method and apparatus for facial recognition
US9760762B2 (en) * 2014-11-03 2017-09-12 Anastasia Soare Facial structural shaping
US20160125229A1 (en) * 2014-11-03 2016-05-05 Anastasia Soare Facial structural shaping
US20160125227A1 (en) * 2014-11-03 2016-05-05 Anastasia Soare Facial structural shaping
CN107111861A (en) * 2015-01-29 2017-08-29 松下知识产权经营株式会社 Image processing apparatus, stylus and image processing method
US9412034B1 (en) * 2015-01-29 2016-08-09 Qualcomm Incorporated Occlusion handling for computer vision
US10891478B2 (en) 2015-03-20 2021-01-12 Skolkovo Institute Of Science And Technology Method for correction of the eyes image using machine learning and method for machine learning
US11908241B2 (en) 2015-03-20 2024-02-20 Skolkovo Institute Of Science And Technology Method for correction of the eyes image using machine learning and method for machine learning
US10623633B2 (en) * 2015-03-26 2020-04-14 Panasonic Intellectual Property Management Co., Ltd. Image synthesis device and image synthesis method
US20180077347A1 (en) * 2015-03-26 2018-03-15 Panasonic Intellectual Property Management Co., Ltd. Image synthesis device and image synthesis method
US10083345B2 (en) * 2015-05-05 2018-09-25 Myongsu Choe Makeup supporting methods for creating and applying a makeup guide content to makeup user's face on a real-time basis
US20160328632A1 (en) * 2015-05-05 2016-11-10 Myongsu Choe Makeup supporting methods for creating and applying a makeup guide content to makeup user's face on a real-time basis
US20160357578A1 (en) * 2015-06-03 2016-12-08 Samsung Electronics Co., Ltd. Method and device for providing makeup mirror
US10671837B2 (en) * 2015-09-08 2020-06-02 Nec Corporation Face recognition system, face recognition method, display control apparatus, display control method, and display control program
US10885312B2 (en) 2015-09-08 2021-01-05 Nec Corporation Face recognition system, face recognition method, display control apparatus, display control method, and display control program
US10970524B2 (en) 2015-09-08 2021-04-06 Nec Corporation Face recognition system, face recognition method, display control apparatus, display control method, and display control program
US10885311B2 (en) 2015-09-08 2021-01-05 Nec Corporation Face recognition system, face recognition method, display control apparatus, display control method, and display control program
US20180239954A1 (en) * 2015-09-08 2018-08-23 Nec Corporation Face recognition system, face recognition method, display control apparatus, display control method, and display control program
US11842566B2 (en) 2015-09-08 2023-12-12 Nec Corporation Face recognition system, face recognition method, display control apparatus, display control method, and display control program
CN107924577A (en) * 2015-10-26 2018-04-17 松下知识产权经营株式会社 Makeup part generating apparatus and makeup part generating method
US10874196B2 (en) * 2015-10-26 2020-12-29 Panasonic Intellectual Property Management Co., Ltd. Makeup part generating apparatus and makeup part generating method
EP3396619A4 (en) * 2015-12-25 2019-05-08 Panasonic Intellectual Property Management Co., Ltd. Makeup part creation device, makeup part usage device, makeup part creation method, makeup part usage method, makeup part creation program, and makeup part usage program
US20180268572A1 (en) * 2015-12-25 2018-09-20 Panasonic Intellectual Property Management Co., Ltd. Makeup part generating apparatus, makeup part utilizing apparatus, makeup part generating method, makeup part utilizing method, non-transitory computer-readable recording medium storing makeup part generating program, and non-transitory computer-readable recording medium storing makeup part utilizing program
US10783672B2 (en) * 2015-12-25 2020-09-22 Panasonic Intellectual Property Management Co., Ltd. Makeup part generating apparatus, makeup part utilizing apparatus, makeup part generating method, makeup part utilizing method, non-transitory computer-readable recording medium storing makeup part generating program, and non-transitory computer-readable recording medium storing makeup part utilizing program
CN108292423A (en) * 2015-12-25 2018-07-17 松下知识产权经营株式会社 Makeup part generating apparatus, makeup part utilizing apparatus, makeup part generating method, makeup part utilizing method, makeup part generating program, and makeup part utilizing program
US10162997B2 (en) 2015-12-27 2018-12-25 Asustek Computer Inc. Electronic device, computer readable storage medium and face image display method
TWI630579B (en) * 2015-12-27 2018-07-21 華碩電腦股份有限公司 Electronic apparatus, computer readable recording medium storing program and facial image displaying method
US11317081B2 (en) 2016-01-05 2022-04-26 Reald Spark, Llc Gaze correction of multi-view images
US10750160B2 (en) 2016-01-05 2020-08-18 Reald Spark, Llc Gaze correction of multi-view images
US11854243B2 (en) 2016-01-05 2023-12-26 Reald Spark, Llc Gaze correction of multi-view images
US10324739B2 (en) * 2016-03-03 2019-06-18 Perfect Corp. Systems and methods for simulated application of cosmetic effects
CN107153806A (en) * 2016-03-03 2017-09-12 炬芯(珠海)科技有限公司 Face detection method and device
US10666853B2 (en) * 2016-06-10 2020-05-26 Panasonic Intellectual Property Management Co., Ltd. Virtual makeup device, and virtual makeup method
US20190297271A1 (en) * 2016-06-10 2019-09-26 Panasonic Intellectual Property Management Co., Ltd. Virtual makeup device, and virtual makeup method
TWI573093B (en) * 2016-06-14 2017-03-01 Asustek Comp Inc Method of establishing virtual makeup data, electronic device having method of establishing virtual makeup data and non-transitory computer readable storage medium thereof
US10360710B2 (en) * 2016-06-14 2019-07-23 Asustek Computer Inc. Method of establishing virtual makeup data and electronic device using the same
EP3530142A4 (en) * 2016-10-24 2019-10-30 Panasonic Intellectual Property Management Co., Ltd. Image processing device, image processing method, and image processing program
US10789748B2 (en) 2016-10-24 2020-09-29 Panasonic Intellectual Property Management Co., Ltd. Image processing device, image processing method, and non-transitory computer-readable recording medium storing image processing program
US11321764B2 (en) * 2016-11-11 2022-05-03 Sony Corporation Information processing apparatus and information processing method
US20180151086A1 (en) * 2016-11-25 2018-05-31 Naomi Belhassen Semi-permanent makeup system and method
US10354546B2 (en) * 2016-11-25 2019-07-16 Naomi Belhassen Semi-permanent makeup system and method
RU2750596C2 (en) * 2016-12-06 2021-06-29 Конинклейке Филипс Н.В. Displaying a guide pointer to the user
US11116303B2 (en) * 2016-12-06 2021-09-14 Koninklijke Philips N.V. Displaying a guidance indicator to a user
CN110050251A (en) * 2016-12-06 2019-07-23 皇家飞利浦有限公司 Displaying a guidance indicator to a user
WO2018104356A1 (en) * 2016-12-06 2018-06-14 Koninklijke Philips N.V. Displaying a guidance indicator to a user
US11017567B2 (en) * 2017-03-22 2021-05-25 Snow Corporation Dynamic content providing method and system for face recognition camera
EP3396585A1 (en) * 2017-04-27 2018-10-31 Cal-Comp Big Data, Inc. Lip gloss guide device and method thereof
US10783802B2 (en) 2017-04-27 2020-09-22 Cal-Comp Big Data, Inc. Lip gloss guide device and method thereof
US10912372B2 (en) 2017-05-16 2021-02-09 Anastasia Beverly Hills, Llc Facial stencils
US11861255B1 (en) 2017-06-16 2024-01-02 Apple Inc. Wearable device for facilitating enhanced interaction
US20200089935A1 (en) * 2017-07-25 2020-03-19 Cal-Comp Big Data, Inc. Body information analysis apparatus capable of indicating shading-areas
US20190035126A1 (en) * 2017-07-25 2019-01-31 Cal-Comp Big Data, Inc. Body information analysis apparatus capable of indicating blush-areas
CN109299636A (en) * 2017-07-25 2019-02-01 丽宝大数据股份有限公司 Biological information analysis device capable of indicating blush areas
US11232647B2 (en) 2017-08-08 2022-01-25 Reald Spark, Llc Adjusting a digital representation of a head region
US10740985B2 (en) * 2017-08-08 2020-08-11 Reald Spark, Llc Adjusting a digital representation of a head region
US11836880B2 (en) 2017-08-08 2023-12-05 Reald Spark, Llc Adjusting a digital representation of a head region
CN109508587A (en) * 2017-09-15 2019-03-22 丽宝大数据股份有限公司 Biological information analysis device and foundation makeup analysis method thereof
EP3522117A1 (en) * 2018-02-02 2019-08-07 Perfect Corp. Systems and methods for virtual application of cosmetic effects to photo albums and product promotion
US10607264B2 (en) 2018-02-02 2020-03-31 Perfect Corp. Systems and methods for virtual application of cosmetic effects to photo albums and product promotion
CN110135930A (en) * 2018-02-02 2019-08-16 英属开曼群岛商玩美股份有限公司 Method, system and storage medium for virtual application of makeup effects and product promotion
US11657557B2 (en) 2018-02-26 2023-05-23 Reald Spark, Llc Method and system for generating data to provide an animated visual representation
US11017575B2 (en) 2018-02-26 2021-05-25 Reald Spark, Llc Method and system for generating data to provide an animated visual representation
CN108600614A (en) * 2018-04-02 2018-09-28 珠海格力电器股份有限公司 Image processing method and device
CN113850096A (en) * 2018-04-24 2021-12-28 株式会社Lg生活健康 Mobile terminal
US10762665B2 (en) 2018-05-23 2020-09-01 Perfect Corp. Systems and methods for performing virtual application of makeup effects based on a source image
CN108771316A (en) * 2018-05-30 2018-11-09 杭州任你说智能科技有限公司 Artificial intelligence makeup mirror system
US11029830B2 (en) * 2018-06-11 2021-06-08 Casio Computer Co., Ltd. Display control apparatus, display controlling method and display control program for providing guidance using a generated image
CN110575001A (en) * 2018-06-11 2019-12-17 卡西欧计算机株式会社 Display control device, display control method, and medium storing display control program
CN108932654A (en) * 2018-06-12 2018-12-04 苏州诚满信息技术有限公司 Virtual makeup try-on guidance method and device
JP7286684B2 (en) 2018-07-27 2023-06-05 北京微播視界科技有限公司 Face-based special effect generation method, apparatus and electronic device
JP2021530031A (en) * 2018-07-27 2021-11-04 北京微播視界科技有限公司 Beijing Microlive Vision Technology Co., Ltd Face-based special effect generation method, apparatus and electronic device
US11354825B2 (en) 2018-07-27 2022-06-07 Beijing Microlive Vision Technology Co., Ltd Method, apparatus for generating special effect based on face, and electronic device
CN109090808A (en) * 2018-08-08 2018-12-28 颜沿(上海)智能科技有限公司 Intelligent makeup try-on mirror and method
CN109246345A (en) * 2018-10-23 2019-01-18 Oppo广东移动通信有限公司 Beauty pupil image capture method, device, storage medium and mobile terminal
US11093749B2 (en) * 2018-12-20 2021-08-17 L'oreal Analysis and feedback system for personal care routines
US11756298B2 (en) * 2018-12-20 2023-09-12 L'oreal Analysis and feedback system for personal care routines
US20210374417A1 (en) * 2018-12-20 2021-12-02 L'oreal Analysis and feedback system for personal care routines
CN113168896A (en) * 2019-01-04 2021-07-23 宝洁公司 Method and system for guiding a user to use an applicator
EP3708029A1 (en) * 2019-03-13 2020-09-16 Cal-Comp Big Data, Inc. Virtual make-up system and virtual make-up coloring method
CN113993417A (en) * 2019-05-06 2022-01-28 凯尔Os公司 Intelligent mirror subsystem and using method thereof
USD963681S1 (en) * 2019-09-05 2022-09-13 Hoffmann-La Roche Inc. Portion of a display screen with a graphical user interface
USD984469S1 (en) * 2019-09-05 2023-04-25 Hoffmann-La Roche Inc. Portion of a display screen with a graphical user interface
EP4131144A4 (en) * 2020-04-13 2023-10-11 Beijing Bytedance Network Technology Co., Ltd. Image processing method and apparatus, electronic device, and computer readable storage medium
US11908237B2 (en) 2020-04-13 2024-02-20 Beijing Bytedance Network Technology Co., Ltd. Image processing method and apparatus, electronic device, and computer-readable storage medium
US20220007816A1 (en) * 2020-07-07 2022-01-13 Perfect Mobile Corp. System and method for navigating user interfaces using a hybrid touchless control mechanism
US11690435B2 (en) * 2020-07-07 2023-07-04 Perfect Mobile Corp. System and method for navigating user interfaces using a hybrid touchless control mechanism
US20220101566A1 (en) * 2020-09-28 2022-03-31 Snap Inc. Providing augmented reality-based makeup in a messaging system
US12062078B2 (en) 2020-09-28 2024-08-13 Snap Inc. Selecting color values for augmented reality-based makeup
US11798202B2 (en) * 2020-09-28 2023-10-24 Snap Inc. Providing augmented reality-based makeup in a messaging system
USD1012114S1 (en) 2021-01-11 2024-01-23 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
USD965004S1 (en) * 2021-01-11 2022-09-27 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
USD1017624S1 (en) 2021-01-11 2024-03-12 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
USD989800S1 (en) 2021-01-11 2023-06-20 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
US20230101374A1 (en) * 2021-09-30 2023-03-30 L'oreal Augmented reality cosmetic design filters
US11837019B1 (en) * 2023-09-26 2023-12-05 Dauntless Labs, Llc Evaluating face recognition algorithms in view of image classification features affected by smart makeup

Also Published As

Publication number Publication date
JP2012181688A (en) 2012-09-20
US10945514B2 (en) 2021-03-16
US20210177124A1 (en) 2021-06-17
US20160128450A1 (en) 2016-05-12

Similar Documents

Publication Publication Date Title
US20210177124A1 (en) Information processing apparatus, information processing method, and computer-readable storage medium
US11868515B2 (en) Generating textured polygon strip hair from strand-based hair for a virtual character
KR100722229B1 (en) Apparatus and method for immediately creating and controlling virtual reality interaction human model for user centric interface
CN110363867B (en) Virtual decorating system, method, device and medium
KR101911133B1 (en) Avatar construction using depth camera
US9646340B2 (en) Avatar-based virtual dressing room
JP2021534516A (en) Virtual fitting system and method for eyeglasses
KR20180100476A (en) Virtual reality-based apparatus and method to generate a three dimensional(3d) human face model using image and depth data
Rahman et al. Augmented rendering of makeup features in a smart interactive mirror system for decision support in cosmetic products selection
JP2004094917A (en) Virtual makeup device and method therefor
CN102201099A (en) Motion-based interactive shopping environment
JP2007213623A (en) Virtual makeup device and method therefor
Yang et al. A virtual try-on system in augmented reality using RGB-D cameras for footwear personalization
Vitali et al. Acquisition of customer’s tailor measurements for 3D clothing design using virtual reality devices
EP2880637A2 (en) Avatar-based virtual dressing room
Kim Dance motion capture and composition using multiple RGB and depth sensors
CN116523579A (en) Display equipment, virtual fitting system and method
Treepong et al. Makeup creativity enhancement with an augmented reality face makeup system
KR20220026186A (en) A Mixed Reality Telepresence System for Dissimilar Spaces Using Full-Body Avatar
CN116452745A (en) Hand modeling, hand model processing method, device and medium
KR101719927B1 (en) Real-time make up mirror simulation apparatus using leap motion
Raj et al. Augmented reality and deep learning based system for assisting assembly process
JP6710095B2 (en) Technical support device, method, program and system
Makled et al. Investigating user embodiment of inverse-kinematic avatars in smartphone augmented reality
CN108629824B (en) Image generation method and device, electronic equipment and computer readable medium

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SAITO, MARI;KASHITANI, TATSUKI;REEL/FRAME:027736/0223

Effective date: 20120216

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE