WO2023194466A1 - Method for recommending cosmetic products using a kNN algorithm

Method for recommending cosmetic products using a kNN algorithm

Info

Publication number
WO2023194466A1
WO2023194466A1, PCT/EP2023/058988, EP2023058988W
Authority
WO
WIPO (PCT)
Prior art keywords
product
cosmetic product
source
source image
characterizing parameters
Prior art date
Application number
PCT/EP2023/058988
Other languages
French (fr)
Inventor
Sidarth SINGLA
Ruowei JIANG
Matthieu PERROT
Sileye BA
Robin KIPS
Original Assignee
L'oreal
Priority date
Filing date
Publication date
Application filed by L'oreal
Publication of WO2023194466A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168 Feature extraction; Face representation
    • G06V40/171 Local features and components; Facial parts; Occluding parts, e.g. glasses; Geometrical relationships
    • A HUMAN NECESSITIES
    • A45 HAND OR TRAVELLING ARTICLES
    • A45D HAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
    • A45D44/00 Other cosmetic or toiletry articles, e.g. for hairdressers' rooms
    • A45D44/005 Other cosmetic or toiletry articles, e.g. for hairdressers' rooms for selecting or displaying personal cosmetic colours or hairstyle
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2413 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • G06F18/24147 Distances to closest patterns, e.g. nearest neighbour classification
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks

Abstract

The invention relates to a method for recommending a cosmetic product, comprising: obtaining a source image (Xref) of a source cosmetic product (PC) applied to a body area of a person, implementing an encoding artificial neural network (E) configured so as to determine characterizing parameters of the source cosmetic product (PC) from the source image (Xref), identifying, amongst reference cosmetic products of a database, a target cosmetic product having the characterizing parameters closest to those of the source cosmetic product (PC); and recommending the target cosmetic product.

Description

Method for recommending cosmetic products using a kNN algorithm
The present invention relates to a method for recommending a cosmetic product, which method comprises implementing an encoding artificial neural network.
Prior art
The choice of beauty products often depends on the personal characteristics of a wearer. These characteristics may include lifestyle, colour preferences, body chemistry, style preferences and/or physical attributes of the wearer. Thus, one shade of lipstick that works well on a young woman with blond hair and fair skin may not work well on a more mature woman with dark hair and dark skin. Beauty professionals, whether they be in beauty establishments, retail establishments or other places, are therefore often asked for assistance in selecting beauty products. Of course, the ability to obtain useful advice depends both on the level of personal attention available and on the level of competence of the beauty professional who is giving the advice. For example, if a beauty counter in a retail establishment does not have sufficient staff to manage the customer traffic or if the staff are poorly trained, the quality of the advice may suffer as a result.
Nowadays, virtual try-on technologies are widespread on retail platforms online and on social media. In particular, for makeup, consumers have the opportunity to try on cosmetic products virtually in augmented reality before buying. Although it is easy to create virtual makeup for entertainment purposes, the parametrization of a rendering engine for synthesizing realistic images of a given cosmetic product remains a tedious task and requires in-depth knowledge in computer graphics. Furthermore, consumers are often invited to choose from a selection of predefined shades of makeup but are not able to try on makeup from inspiring reference images on social networks.
In recent years, the computer vision sector has tried to provide a solution to this problem thanks to the progress made in the task of transferring make-up styles. This task consists in extracting a make-up style from a reference portrait image and in synthesizing it on the target image of a different person. The make-up attributes are extracted with the aid of a neural network and rendered on the source image with the aid of a generative model, such as a GAN or VAE.
Document US10936853B1 discloses a method implemented using a computer device, the method comprising:
receiving an input image to be modified based on the colour characteristics of a source image;
detecting faces in the input image and in the source image;
determining a skin tone value reflective of a skin tone of each of the faces;
grouping the faces into face groups based on the skin tone value of each of the faces, the face groups each comprising faces of a similar skin tone which are grouped from the input image and grouped from the source image based on an average skin tone value difference less than or equal to a grouping threshold of the faces with a similar skin tone;
matching a face group pair as input image face group paired with a source image face group; and
generating a modified image from the input image based on the colour characteristics of the source image, the colour characteristics comprising the skin tones of the respective faces in the face group pair as part of the colour characteristics applied to modify the input image.
WO2014168678A1 discloses a system comprising:
a user information module, wherein the user information module captures an image of the user’s skin;
a graphical user interface, wherein the graphical user interface allows the selection of a skin-related application from a plurality of skin-related applications;
a database of skin images;
a processor coupled to the user information module, to the graphical user interface and to the database of skin images, wherein the processor is configured to:
determine data about the image of the user’s skin from the image of the user’s skin;
identify one or more sets of skin image data in the database of skin images which correspond to the data about the image of the user’s skin based on one or more parameters specified by the skin-related application; and
apply at least one image processing filter, which corresponds to the identified sets of skin image data from the database of skin images, to the image of the user’s skin in order to generate a simulated image of the user’s skin; and
an output display, wherein the output display displays the simulated image of the user’s skin.
US2021209427A1 discloses an apparatus comprising:
a processing circuit for accepting at least one image representative of the skin of the face of a user;
a communication circuit for transmitting the accepted image to machine learning models and for receiving a regimen recommendation from the machine learning models;
a user interface circuit for displaying the regimen recommendation to the user,
wherein the user interface circuit is further configured to display images of human faces to the user;
the processing circuit being further configured to accept an input of the user which classifies the characteristics of the skin of the face based on images which are supplied to it via the user interface circuit; and
the communication interface circuit being further configured to transmit the input of the user to machine learning models as training data.
These documents do not disclose any recommendation of cosmetic products.
Disclosure of the invention
There is a need to provide product recommendations to makeup consumers directly from a reference portrait image, found for example on social networks, that is to say without consulting a person qualified in the art by appointment.
The problem posed by the invention is:
Extracting the values of the characteristics of a source cosmetic product (source), in particular of a lipstick, from an image containing the product applied to a body surface, in particular the lips.
Recommending a cosmetic product from a database of cosmetic products, in particular a lipstick, which has predicted characteristic values based on the source lipstick.
The product recommendations may thus be used to assist makeup consumers in making an aesthetic choice, or even potentially be used to obtain a product having this colour.
One subject of the invention is a method for extracting information about a source cosmetic product from a source image and recommending, from a database, a target cosmetic product close to the source cosmetic product. The method implements a convolutional neural network trained to retrieve the characteristics of the source image, offers two approaches for using the k-nearest neighbours method, and finds, in the database of reference products, the target product that is closest to the source cosmetic product.
According to one aspect, what is proposed is a method for recommending a cosmetic product, comprising:
obtaining a source image of a source cosmetic product applied to a first person,
implementing an encoding artificial neural network configured so as to determine characterizing parameters of the source cosmetic product from the source image,
identifying, amongst reference cosmetic products of a database, a target cosmetic product having the characterizing parameters closest to those of the source cosmetic product; and
recommending the target cosmetic product,
the recommendation comprising the implementation of a k-nearest neighbours algorithm (k-NN).
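Purely to illustrate this flow, the sketch below assumes that the characterizing-parameter vector has already been computed by the encoding network, and uses scikit-learn's NearestNeighbors as one possible k-NN implementation; none of these tooling choices come from the disclosure.

```python
# Illustrative sketch only (assumed tooling: NumPy + scikit-learn; neither is
# named in the disclosure). The query vector is the output of the encoding
# neural network; the reference database is assumed to be available as a
# matrix of characterizing-parameter vectors, one row per reference product.
import numpy as np
from sklearn.neighbors import NearestNeighbors


def recommend_products(query_vector: np.ndarray,
                       reference_vectors: np.ndarray,
                       reference_products: list,
                       k: int = 3) -> list:
    """Return the k reference products closest to the source product."""
    knn = NearestNeighbors(n_neighbors=k)
    knn.fit(reference_vectors)                      # shape (n_products, n_params)
    _, indices = knn.kneighbors(query_vector.reshape(1, -1))
    return [reference_products[i] for i in indices[0]]
```

With the default Euclidean metric, the reference product whose stored parameters lie closest to the encoded source parameters is ranked first.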
The method may be used to recommend cosmetic products of various types. For example, the method may be used for make-up products or haircare products. In particular, the make-up products may be blusher, foundation, mascara, lipstick, lip gloss, nail varnish or eyeliner. The haircare product may be a dye. In particular, the cosmetic product is a lipstick.
The invention has the advantage of implementing the nearest neighbours algorithm which obtains reliable and precise results.
The method thus implements an artificial neural network trained to automatically extract parameters of the source cosmetic product from the source image. These parameters make it possible to search the database for a target cosmetic product defined by close parameters.
The artificial neural network that is used has the advantage of needing few computing resources to be executed, in comparison in particular with conventional methods using inverse graphics networks in which a gradient descent is performed.
The proposed method may therefore be implemented by devices with low computational capabilities, in particular by portable devices such as smartphones or tablets.
Advantageously, said at least one source image is obtained using a photography device. Moreover, the recommended target product may be displayed on a screen.
In one advantageous mode of implementation, the cosmetic product is a make-up product for the lips. The parameters of the vector may then be an opacity, a colour, a reflection intensity, an amount and a texture of the make-up product for the lips.
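Such a parameter vector could, for instance, be represented by a simple record of this kind; the field names are illustrative assumptions, not terminology from the disclosure.

```python
from dataclasses import dataclass, astuple


@dataclass
class LipProductParameters:
    """Illustrative characterizing parameters of a lip make-up product."""
    opacity: float
    colour_r: float
    colour_g: float
    colour_b: float
    reflection_intensity: float
    amount: float
    texture: float

    def as_vector(self) -> tuple:
        # Flattened form usable as a query vector for the k-NN search.
        return astuple(self)
```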
According to another aspect, what is proposed is a method for training an encoding artificial neural network as described above, comprising:
extracting, from a database, predefined characterizing parameters associated with reference cosmetic products,
implementing an encoding artificial neural network, so as to determine the characterizing parameters of a source cosmetic product, taking a source image as input,
implementing a cosmetic product recommendation engine comprising a k-nearest neighbours algorithm (k-NN), the recommendation engine receiving, as input, the vector of characterizing parameters that was determined using the encoding artificial neural network and generating, as output, a recommendation of a target cosmetic product from the database.
According to another aspect, what is proposed is a computer program product comprising instructions that, when the program is executed by a computer, prompt said computer to:
- extract, from a database, predefined characterizing parameters associated with reference cosmetic products,
- implement an encoding artificial neural network, so as to determine the characterizing parameters of a source cosmetic product, taking a source image as input,
- implement a cosmetic product recommendation engine comprising a k-nearest neighbours algorithm (k-NN), the recommendation engine receiving, as input, the vector of characterizing parameters that was determined using the encoding artificial neural network and generating, as output, a recommendation of a target cosmetic product from the database.
According to another aspect, what is proposed is a system comprising:
- a memory which stores the computer program product defined above, the source image and the database,
- a processing unit configured so as to implement the computer program product,
- a photography device configured so as to acquire the source image,
- a screen configured so as to display the target cosmetic product.
Preferred embodiments
Preferably, the device according to the invention has one or more of the following features, taken alone or in combination:
Said at least one source image is obtained using a photography device, and the recommendation of the cosmetic product is displayed on a screen.
The cosmetic product is a make-up product for the lips, and the parameters of the vector are an opacity, a colour, a reflection intensity, an amount and a texture of the make-up product for the lips.
The source cosmetic product is entered into the database.
The k-nearest neighbours algorithm (k-NN) is implemented after extracting values using a machine learning model.
The characterizing parameters of the source cosmetic product are stored in the database.
The implementation of the encoding artificial neural network comprises detecting shimmering on the source image; classifying the source image depending on the detected shimmering; differentiating the algorithm for recommending the target cosmetic product depending on the classification of the source image.
If no shimmering is detected on the source image, the source image is preprocessed to crop the body area and the characterizing parameters of the source cosmetic product are shown in an E-type vector (product R, product G, product B, gloss, gloss detail, moisture, intensity), so as to supply the input of the recommendation algorithm.
If shimmering is detected on the source image, the characterizing parameters of the source cosmetic product are shown in an E-type vector (product R, product G, product B, gloss, gloss detail, moisture, intensity, shimmering R, shimmering G, shimmering B), so as to supply the input of the recommendation algorithm.
The invention may be better understood from reading the following detailed description of a nonlimiting exemplary implementation thereof, and from studying the appended schematic and partial drawing, in which:
Brief description of the drawings
FIG. 1 schematically illustrates one embodiment and mode of implementation of the invention.
Detailed description

FIG. 1 illustrates one mode of implementation of a method for recommending a cosmetic product.
The recommendation method comprises obtaining a source image (Xref). The source image is an image of a real cosmetic product PC applied to a person. For example, when the cosmetic product is a make-up product for the face or a haircare product, the source image may be a portrait of a person to whom the make-up product or the haircare product is applied. In the example illustrated here, the cosmetic product is a lipstick.
The method may comprise defining an area of interest of the source image (Xref), the area of interest corresponding to the area of the source image in which the cosmetic product PC is located, in this case the mouth of the person to whom the lipstick is applied.
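As a purely illustrative sketch of such a crop, the lip landmarks below are assumed to come from an external face-landmark detector, which the disclosure does not specify.

```python
import numpy as np


def crop_area_of_interest(image: np.ndarray,
                          lip_landmarks: np.ndarray,
                          margin: float = 0.15) -> np.ndarray:
    """Crop the mouth region around detected lip landmarks (illustrative).

    `lip_landmarks` is an (N, 2) array of (x, y) pixel coordinates returned
    by an external face-landmark detector (assumed available).
    """
    height, width = image.shape[:2]
    x_min, y_min = lip_landmarks.min(axis=0)
    x_max, y_max = lip_landmarks.max(axis=0)
    # Enlarge the box slightly so the whole made-up area is retained.
    dx = (x_max - x_min) * margin
    dy = (y_max - y_min) * margin
    x0, x1 = int(max(0, x_min - dx)), int(min(width, x_max + dx))
    y0, y1 = int(max(0, y_min - dy)), int(min(height, y_max + dy))
    return image[y0:y1, x0:x1]
```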
In particular, the method according to the invention comprises implementing an encoding artificial neural network. This encoding artificial neural network is trained to extract characterizing parameters of a cosmetic product from an image received as input of this encoding artificial neural network.
The characterizing parameters make it possible to define the characteristics of the cosmetic product present in the image received as input. For example, when the cosmetic product is a make-up product for the lips, in particular a lipstick or a lip gloss, the characterizing parameters may be chosen from a gloss, a gloss detail, a moisture, an intensity, a shimmering R, a shimmering G, a shimmering B of the make-up product for the lips.
The encoding artificial neural network E is for example a convolutional neural network.
More particularly, in the method according to the invention, the encoding artificial neural network is implemented by taking the source image or the defined area of interest of the source image as input. The encoding artificial neural network E then makes it possible to determine characterizing parameters associated with the cosmetic product PC shown in the source image or in the area of interest of the source image. These characterizing parameters are shown in a vector.
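The disclosure does not detail the architecture of this encoding network; purely as an assumption, a minimal convolutional encoder of this kind could be sketched in PyTorch as follows, with every layer size and design choice being illustrative rather than taken from the document.

```python
# Hypothetical PyTorch sketch of an encoding network; the real architecture,
# input resolution and training procedure are not given in the disclosure.
import torch
import torch.nn as nn


class ProductParamEncoder(nn.Module):
    """Toy convolutional encoder mapping an image crop to a parameter vector."""

    def __init__(self, n_params: int = 7):
        # n_params = 7 for <colour R, G, B, gloss, gloss detail, moisture,
        # intensity>; n_params = 10 when three shimmering channels are added.
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(128, n_params)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 3, H, W) crop of the area of interest.
        z = self.features(x).flatten(1)
        return self.head(z)
```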
The recommendation method then comprises implementing a product recommendation engine based on a k-nearest neighbours algorithm (k-NN).
The recommendation engine is configured to receive, as input, the vector of the characterizing parameters of the source cosmetic product. The recommendation engine is also configured to generate, as output, a recommendation of a cosmetic product based on the characterizing parameters provided as input.
The cosmetic product recommended by the engine then has the characteristics defined by the characterizing parameters provided as input of the recommendation engine (k-NN).
In particular, in the recommendation method, the recommendation engine is implemented by receiving, as input, at least said vector (E1(Xref) or E2(Xref)) of characterizing parameters that was determined by the encoding artificial neural network. The recommendation engine then makes it possible to generate at least one recommendation of a cosmetic product.
Such a recommendation method may be used by a user to choose a cosmetic product virtually, from an image of a person to whom is applied the cosmetic product that the user wants to try on virtually.
The recommendation method may also be used to generate, in a simple manner, vectors of characterizing parameters of a cosmetic product from a source image. These vectors of characterizing parameters may then be used as input of a recommendation engine when a user wishes to try on the cosmetic product virtually.
According to the invention, the source image (Xref) is classified depending on the shimmering detected on the source image. The shimmering may for example be detected with the aid of an artificial intelligence algorithm.
If no shimmering is detected, an encoding artificial neural network E1 makes it possible to determine characterizing parameters associated with the cosmetic product. These characterizing parameters are shown in a vector E1(Xref), namely <colour (R, G, B), gloss, gloss detail, moisture, intensity>.
The vector is then processed by a supervised learning algorithm implementing the k-nearest neighbours (k-NN) method, which generates a recommendation of one or more cosmetic products (brand a, brand b, brand c): the recommendation engine compares the characterizing parameters of the source image to those of the reference products in a database of products with no shimmering using a k-NN algorithm.
The implementation of the k-NN recommendation engine makes it possible to obtain a product recommendation based on the characterizing parameters shown in the vector E1(Xref).
In the example shown, the database of products with no shimmering comprises N products with no shimmering, each product ranging from 1 to N being defined by a brand with which a colour, a gloss, a gloss detail, a moisture and an intensity are associated.
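Such a reference database can be pictured, for illustration only, as a list of brands alongside a matrix of parameter vectors; the numeric values below are placeholders, not real product data.

```python
import numpy as np

# Hypothetical layout of the no-shimmering reference database.
# Each row: <colour R, colour G, colour B, gloss, gloss detail, moisture, intensity>.
reference_brands = ["brand a", "brand b", "brand c"]        # ... up to N products
reference_vectors = np.array([
    [0.82, 0.10, 0.21, 0.40, 0.30, 0.55, 0.90],             # brand a
    [0.75, 0.22, 0.30, 0.10, 0.15, 0.60, 0.70],             # brand b
    [0.61, 0.05, 0.12, 0.80, 0.65, 0.40, 0.95],             # brand c
])
```

The k-NN search of the recommendation engine then runs directly on this matrix of parameter vectors.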
If shimmering is detected, an encoding artificial neural network E2 makes it possible to determine characterizing parameters associated with the cosmetic product. These characterizing parameters are shown in a vector E2(Xref), namely <shimmering (R, G, B), colour (R, G, B), gloss, gloss detail, moisture, intensity>.
The vector is then processed by a supervised learning algorithm implementing the k-nearest neighbours (k-NN) method, which generates a recommendation of one or more cosmetic products (brand i, brand j, brand k): the recommendation engine compares the characterizing parameters of the source image to those of the reference products in a database of products with shimmering using a k-NN algorithm.
The implementation of the k-NN recommendation engine makes it possible to obtain a product recommendation based on the characterizing parameters shown in the vector E2(Xref).
In the example shown, the database of products with shimmering comprises M products with shimmering, each product ranging from 1 to M being defined by a brand with which a shimmering, a colour, a gloss, a gloss detail, a moisture and an intensity are associated.
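Combining the two branches, the mode of implementation shown could be sketched as follows; `detect_shimmering`, `encoder_e1` and `encoder_e2` are placeholders for the shimmering classifier and the encoding networks E1 and E2, none of which are specified here, and scikit-learn again stands in for the k-NN search.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors


def recommend_lipstick(roi, detect_shimmering, encoder_e1, encoder_e2,
                       plain_db, shimmer_db, k=3):
    """Branching k-NN recommendation (illustrative sketch).

    `plain_db` and `shimmer_db` are (brands, vectors) pairs for the databases
    of products without and with shimmering, respectively.
    """
    if detect_shimmering(roi):
        # 10-dimensional vector: shimmering R/G/B, colour R/G/B,
        # gloss, gloss detail, moisture, intensity.
        query, (brands, vectors) = encoder_e2(roi), shimmer_db
    else:
        # 7-dimensional vector: colour R/G/B, gloss, gloss detail,
        # moisture, intensity.
        query, (brands, vectors) = encoder_e1(roi), plain_db

    knn = NearestNeighbors(n_neighbors=k).fit(vectors)
    _, indices = knn.kneighbors(np.asarray(query).reshape(1, -1))
    return [brands[i] for i in indices[0]]
```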
The recommended cosmetic product then has the characteristics defined by the characterizing parameters provided as input of the recommendation engine.
The invention is not limited to the embodiment of the example shown.

Claims (12)

  1. Method for recommending a cosmetic product, comprising:
    obtaining a source image (Xref) of a source cosmetic product (PC) applied to a body area of a person,
    implementing an encoding artificial neural network configured so as to determine characterizing parameters of the source cosmetic product (PC) from the source image (Xref),
    identifying, amongst reference cosmetic products of a database, a target cosmetic product having the characterizing parameters closest to those of the source cosmetic product (PC); and
    recommending the target cosmetic product,
    the recommendation comprising the implementation of a k-nearest neighbours algorithm (k-NN).
  2. Method according to Claim 1, wherein said at least one source image is obtained using a photography device, and wherein the recommendation of the cosmetic product is displayed on a screen.
  3. Method according to either one of the preceding claims, wherein the cosmetic product (PC) is a make-up product for the lips, and wherein the parameters of the vector are an opacity, a colour, a reflection intensity, an amount and a texture of the make-up product for the lips.
  4. Method according to any one of the preceding claims, wherein the source cosmetic product (PC) is entered into the database.
  5. Method according to any one of the preceding claims, wherein the k-nearest neighbours algorithm (k-NN) is implemented after extracting values using a machine learning model.
  6. Method according to any one of the preceding claims, characterized in that the characterizing parameters of the source cosmetic product (PC) are stored in the database.
  7. Method according to any one of the preceding claims, wherein the implementation of the encoding artificial neural network (E1, E2) comprises detecting shimmering on the source image (Xref); classifying the source image (Xref) depending on the detected shimmering; differentiating the algorithm for recommending the target cosmetic product depending on the classification of the source image (Xref).
  8. Method according to the preceding claim, wherein no shimmering is detected on the source image, the source image is preprocessed to crop the body area and the characterizing parameters of the source cosmetic product are shown in an E-type vector (product R, product G, product B, gloss, gloss detail, moisture, intensity), so as to supply the input of the recommendation algorithm.
  9. Method according to Claim 8, wherein shimmering is detected on the source image and the characterizing parameters of the source cosmetic product are shown in an E-type vector (product R, product G, product B, gloss, gloss detail, moisture, intensity, shimmering R, shimmering G, shimmering B), so as to supply the input of the recommendation algorithm.
  10. Method for training an encoding artificial neural network of one of the preceding claims, comprising:
    extracting, from a database, predefined characterizing parameters associated with reference cosmetic products,
    implementing an encoding artificial neural network (E1, E2), so as to determine the characterizing parameters of a source cosmetic product (PC), taking a source image as input,
    implementing a cosmetic product recommendation engine comprising a k-nearest neighbours algorithm (k-NN), the recommendation engine receiving, as input, the vector (E1(Xref), E2(Xref)) of characterizing parameters that was determined using the encoding artificial neural network (E1, E2) and generating, as output, a recommendation of a target cosmetic product from the database.
  11. Computer program product comprising instructions that, when the program is executed by a computer, prompt said computer to:
    extract, from a database, predefined characterizing parameters associated with reference cosmetic products,
    implement an encoding artificial neural network (E1, E2), so as to determine the characterizing parameters of a source cosmetic product (PC), taking a source image as input,
    implement a cosmetic product recommendation engine comprising a k-nearest neighbours algorithm (k-NN), the recommendation engine receiving, as input, the vector (E1(Xref), E2(Xref)) of characterizing parameters that was determined using the encoding artificial neural network (E1, E2) and generating, as output, a recommendation of a target cosmetic product from the database.
  12. System comprising:
    a memory which stores the computer program product according to the preceding claim, the source image (Xref) and the database,
    a processing unit configured so as to implement the computer program product,
    a photography device configured so as to acquire the source image,
    a screen configured so as to display the target cosmetic product.
PCT/EP2023/058988 2022-04-07 2023-04-05 Method for recommending cosmetic products using a knn algorithm WO2023194466A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FRFR2203201 2022-04-07
FR2203201A FR3134469A1 (en) 2022-04-07 2022-04-07 Cosmetic product recommendation method using a kNN algorithm

Publications (1)

Publication Number Publication Date
WO2023194466A1 2023-10-12

Family

ID=82196774

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2023/058988 WO2023194466A1 (en) 2022-04-07 2023-04-05 Method for recommending cosmetic products using a knn algorithm

Country Status (2)

Country Link
FR (1) FR3134469A1 (en)
WO (1) WO2023194466A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014168678A1 (en) 2013-04-09 2014-10-16 Elc Management Llc Skin diagnostic and image processing systems, apparatus and articles
US20190272659A1 (en) * 2014-02-23 2019-09-05 Northeastern University System for Beauty, Cosmetic, and Fashion Analysis
US10936853B1 (en) 2019-10-04 2021-03-02 Adobe Inc. Skin tone assisted digital image color matching
US20210209427A1 (en) 2018-01-05 2021-07-08 L'oreal Machine-implemented facial health and beauty assistant

Also Published As

Publication number Publication date
FR3134469A1 (en) 2023-10-13

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23717507

Country of ref document: EP

Kind code of ref document: A1