WO2022114614A1 - Customized contact lens recommendation and virtual fitting system - Google Patents

Customized contact lens recommendation and virtual fitting system

Info

Publication number
WO2022114614A1
WO2022114614A1 PCT/KR2021/016366 KR2021016366W WO2022114614A1 WO 2022114614 A1 WO2022114614 A1 WO 2022114614A1 KR 2021016366 W KR2021016366 W KR 2021016366W WO 2022114614 A1 WO2022114614 A1 WO 2022114614A1
Authority
WO
WIPO (PCT)
Prior art keywords
contact lens
fitting
user
biometric information
information
Prior art date
Application number
PCT/KR2021/016366
Other languages
English (en)
Korean (ko)
Inventor
김재윤
Original Assignee
(주)인터비젼
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by (주)인터비젼 filed Critical (주)인터비젼
Publication of WO2022114614A1 publication Critical patent/WO2022114614A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • G06Q30/0631Item recommendations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • G06Q30/0641Shopping interfaces
    • G06Q30/0643Graphical representation of items or shoppers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/56Extraction of image or video features relating to colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168Feature extraction; Face representation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris

Definitions

  • The present invention relates to a customized contact lens recommendation and virtual fitting system, and more particularly to a system that can recommend and virtually fit a customized contact lens suited to a user in a short time through the user's biometric information and an artificial-neural-network-based contact lens recommendation model.
  • Contact lenses, which are worn directly on the front of the eyeball to provide good eyesight, are widely used.
  • Korean Patent Registration No. 10-1520778 discloses a contact lens virtual fitting method, an apparatus, and a computer program for executing the method, in which a contact lens image and a facial image taken without wearing the lens are synthesized through contact lens and eye-shape recognition, so that the user can see how they would look after putting the lens on.
  • Patent Document 1 KR10-1520778 B1
  • Patent Document 2 KR10-2019-0122955 A
  • Patent Document 3 KR10-2020-0075541 A
  • A contact lens fitting recommendation and virtual fitting system comprises: a biometric information discriminator 210 that, when an image or photo of a virtual-fitting user is input, recognizes and extracts the user's biometric information for contact lens recommendation through face recognition, hair recognition, and eye recognition; a fitting information recommender 310 that recommends a contact lens customized to the user through a plurality of weighted calculations on the biometric information; and an image synthesizer 510 that displays the contact lenses recommended by the fitting information recommender 310 so that the user can select one, and displays a virtual fitting image by synthesizing the selected contact lens with the user's image or photo.
  • The fitting information recommender 310 is trained to output a recommended contact lens suited to the biometric information recognized and extracted by the biometric information discriminator 210 of the present invention; the system further includes a learner 410 that trains the fitting information recommender 310 using the biometric information recognized and extracted by the discriminator 210 together with the contact lens information selected by the user in the image synthesizer 510.
  • The biometric information of the present invention includes one or more of facial skin tone, hair color, and eye color.
  • The biometric information discriminator 210 of the present invention includes a face recognition unit 212 that generates facial feature data through face recognition and measures the facial skin tone by separating the face from the hair, a hair recognition unit 214 that generates hair outline information with reference to the facial feature data and measures the hair color, and a pupil recognition unit 216 that measures the eye color with reference to the facial feature data.
  • The learner 410 of the present invention learns by matching the facial skin tone and the hair color as a pair, and matches and stores the learned data against the contact lens model data 40 and color tag information stored in the contact lens DB 320.
  • The present invention uses a recommendation model generated by learning, from contact lens fitting images, the eye color that goes with a given facial skin tone and hair color.
  • When the biometric information discriminator 210 recognizes the facial skin tone in the user's facial region 10 and biometric information such as the hair color in the hair region 20, the fitting information recommender 310 recommends contact lens model data 40 suited to the user's biometric information, so a customized contact lens suitable for the user can be recommended and fitted in a short time.
  • By continuously training the recommendation model of the learner 410 with biometric information such as the user's facial skin tone and hair color together with the contact lens selections made through the image synthesizer 510, the system has the effect of recommending contact lenses that increasingly raise user satisfaction.
  • FIG. 1 is a block diagram showing the configuration of a contact lens fitting recommendation and virtual fitting system according to an embodiment of the present invention.
  • FIG. 2 is a view for explaining the biometric information discriminator 210 according to an embodiment of the present invention.
  • FIG. 3 is a view for explaining the matching of the contact lens DB 320 through the biometric information recognition of the biometric information discriminator 210 and the learner 410 according to an embodiment of the present invention.
  • FIG. 4 is a view for explaining a user's contact lens virtual fitting according to an embodiment of the present invention.
  • FIG. 5 is a flowchart illustrating a learning method of the learner 410 according to an embodiment of the present invention.
  • FIG. 6 is a flowchart illustrating a method for virtual fitting a user's contact lenses according to an embodiment of the present invention.
  • FIG. 1 is a block diagram illustrating a contact lens fitting recommendation and virtual fitting system according to an embodiment of the present invention.
  • The contact lens fitting recommendation and virtual fitting system of the present invention (hereinafter the 'fitting system 100') includes a camera unit 110, an input unit 120, a storage unit 130, a display unit 140, and a control unit 150.
  • The fitting system 100 refers to a portable electronic device such as a smartphone, tablet, or notebook computer, or to a computer equipped with a camera.
  • the camera unit 110 is for capturing an image for virtual fitting of a contact lens based on the user's face.
  • the image captured by the camera unit 110 is output through the display unit 140 .
  • the camera unit 110 includes an image sensor, and the image sensor serves to receive light reflected from a subject and convert it into an electrical signal.
  • the image sensor may be implemented based on a Charged Coupled Device (CCD), a Complementary Metal-Oxide Semiconductor (CMOS), or the like.
  • The camera unit 110 may further include an analog-to-digital converter that converts the electrical signal output from the image sensor into a digital sequence, which is output to the display unit 140 through the control unit 150.
  • the input unit 120 receives a user's key manipulation for controlling various functions and operations of the fitting system, generates an input signal, and transmits the generated input signal to the control unit 150 .
  • the input unit 120 may include a touch screen, a touch pad, a keypad, a keyboard, a mouse, and a track ball.
  • The input unit 120 may include at least one of a power key for power on/off, a character key, a number key, and a direction key.
  • The functions of the input unit 120 may be performed by the display unit 140; when all input functions are handled by the display unit 140 alone, the input unit 120 may be omitted.
  • the display unit 140 may receive data for screen display from the control unit 150 and display the received data on the screen.
  • the display unit 140 may visually provide a menu, data, function setting information, and other various information of the fitting system to the user.
  • the display unit 140 may visually provide the user with information for manipulating and using the application through a separate application for driving the fitting system.
  • The display unit 140 may be formed of any one of a touch screen, a liquid crystal display (LCD), an organic light emitting diode (OLED) display, and an active matrix organic light emitting diode (AMOLED) display. When the display unit 140 is formed of a touch screen, it may take over some or all of the functions of the input unit 120 described above.
  • the storage unit 130 serves to store various data required for the operation of the fitting system, and various data generated according to the operation of an application or program.
  • The storage unit 130 may be a NAND flash memory, an SD or MicroSD card, an HDD, an SSD, or the like.
  • the storage unit 130 may store an operating system (OS) for booting and operation of the fitting system or an application for driving the fitting system.
  • Various data stored in the storage unit 130 may be modified, deleted, or added according to a user's manipulation.
  • The controller 150 controls the overall operation of the fitting system and the signal transmission/reception among the camera unit 110, the input unit 120, the display unit 140, and the storage unit 130, and performs a data processing function.
  • the control unit 150 may be a central processing unit (CPU), an application processor (AP), a graphic processing unit (GPU), or the like.
  • the control unit 150 includes a determining unit 200 , an artificial neural network 300 , a learning unit 400 , and an image generating unit 500 .
  • The determination unit 200, the artificial neural network 300, the learning unit 400, and the image generation unit 500 may each be implemented as part of the control unit 150 in hardware, or driven by the control unit 150 as software.
  • the determining unit 200 serves to recognize biometric information from input photo information or photo information captured by the camera unit 110 .
  • the determination unit 200 may determine information such as facial skin tone, hair color, and eye color by analyzing photo information.
  • the determination unit 200 includes a biometric information determiner 210 (refer to FIG. 2 ).
  • the artificial neural network 300 serves to recommend a contact lens suitable for a user based on biometric information determined from the user's photo information taken by the camera unit 110 .
  • the artificial neural network 300 includes a fitting information recommender 310 (see FIG. 4 ) for recommending contact lens fitting information to a user and a contact lens DB 320 (see FIG. 4 ).
  • The learning unit 400 serves to learn which contact lens matches given biometric information, using the biometric information determined by the determining unit 200, and includes the learner 410 (see FIG. 4).
  • the image generator 500 serves to virtual fit the contact lens recommended by the artificial neural network 300 to the user's photo, and includes an image synthesizer 510 (refer to FIG. 4 ).
  • The fitting system 100 may include a storage-medium insertion unit for inserting an external storage medium such as a memory card, a connection terminal for exchanging data with an external digital device or computer, and a charging terminal. The fitting system 100 may further include an audio processing unit that inputs and outputs audio signals through a microphone and a speaker or reproduces digital sound sources.
  • FIG. 2 is a view for explaining the biometric information discriminator 210 according to an embodiment of the present invention, and FIG. 3 is a view for explaining the matching of the contact lens DB 320 through the biometric information recognition of the biometric information discriminator 210 and the learner 410.
  • the biometric information discriminator 210 may be configured in the form of hardware in the discriminator 200 of the control unit 150 or may be driven in the discriminator 200 in the form of software.
  • the biometric information discriminator 210 is for discriminating biometric information from input photo information or from images or photo information captured by the camera unit 110 (refer to FIG. 1 ).
  • the biometric information discriminator 210 includes a face recognition unit 212 , a hair recognition unit 214 , and a pupil recognition unit 216 .
  • As shown in FIG. 3, the face recognition unit 212 recognizes the facial region 10 in the photo information, the hair recognition unit 214 recognizes the hair region 20, and the pupil recognition unit 216 recognizes the pupil region 30.
  • the face recognition unit 212 serves to recognize the boundary of the face serving as the face region 10 in the photo information and recognize the tone of the face skin.
  • The face recognition unit 212 extracts facial feature points from the photo: facial feature data consisting of points and lines is extracted for the boundary between hair and face and for the positions of the eyes, nose, mouth, and ears. Based on this feature data, the skin tone in the facial region is measured. The skin tone is measured with a color code such as RGB, HEX, CMYK, or HSL, and the average of the values measured over the entire facial region is used.
  • The hair recognition unit 214 serves to recognize the boundary of the hair region 20, located outside the facial boundary recognized by the face recognition unit 212, and to recognize the hair color.
  • Starting from the outside of the facial boundary, the hair recognition unit 214 extracts the outer boundary of the hair and measures the hair color within the hair region. The hair color is measured with a color code such as RGB, HEX, CMYK, or HSL, and the average of the values measured over the entire hair region is used.
  • The pupil recognition unit 216 recognizes the pupil region 30 based on the eye positions recognized by the face recognition unit 212, and measures the eye color within the pupil region 30, which comprises a dark region and a white region. Within the dark region, the color of the iris rather than of the pupil itself is measured. The eye color is likewise measured with a color code such as RGB, HEX, CMYK, or HSL, and the average of the values measured over the whole area is used.
  • The face recognition unit 212, the hair recognition unit 214, and the pupil recognition unit 216 may be linked to a separate vision engine to recognize the facial region 10, hair region 20, and pupil region 30 included in the photo information, and to recognize the facial skin tone, hair color, and eye color.
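The per-region colour averaging described above can be sketched in a few lines. This is a hypothetical illustration, not code from the patent: the function name, the list-of-rows pixel layout, and the 0/1 region mask are all assumptions.

```python
def average_region_color(pixels, mask):
    """pixels: rows of (R, G, B) tuples; mask: same shape, truthy inside the region.

    Returns the average RGB over the masked region, rounded per channel,
    or None if the mask selects no pixels.
    """
    total = [0, 0, 0]
    count = 0
    for row_px, row_mask in zip(pixels, mask):
        for (r, g, b), inside in zip(row_px, row_mask):
            if inside:
                total[0] += r
                total[1] += g
                total[2] += b
                count += 1
    if count == 0:
        return None
    return tuple(round(c / count) for c in total)

# Toy 2x2 image: left column is the "face" region, right column background.
pixels = [[(236, 219, 207), (0, 0, 0)],
          [(240, 221, 209), (0, 0, 0)]]
mask = [[1, 0],
        [1, 0]]
print(average_region_color(pixels, mask))  # (238, 220, 208)
```

The same routine would serve the face, hair, and iris regions alike; only the mask changes.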
  • FIG 3 is a view for explaining the matching of the contact lens DB 320 through the biometric information recognition and learner 410 of the biometric information discriminator 210 according to an embodiment of the present invention.
  • The learner 410 learns for virtual-fitting recommendation based on the color information measured by the biometric information discriminator 210, and matches the result with the contact lens DB 320 so that the learning result can be used for fitting.
  • the biometric information discriminator 210 recognizes the facial region 10, the hair region 20, and the pupil region 30 in each photo input as a plurality as shown in FIG. 3, and recognizes a color in each region.
  • Color recognition uses RGB codes. For example, in a first photo the facial skin tone (FT, Face Tone) may be measured as R(236), G(219), B(207), the hair color (HC, Hair Color) as R(64), G(33), B(37), and the eye color (EC, Eye Color) as R(68), G(42), B(51).
  • In a second photo, FT may be measured as R(245), G(207), B(181), HC as R(182), G(151), B(119), and EC as R(107), G(106), B(105).
  • In a third photo, FT may be measured as R(255), G(199), B(194), HC as R(32), G(13), B(19), and EC as R(85), G(43), B(44).
  • The plurality of photos forms the population, and FT, HC, and EC are the variables. That is, learning is performed with the variables extracted from the population, and a contact lens recommendation model is generated.
  • The contact lens recommendation model generated by this learning is later used to recommend a contact lens suitable for the user for virtual fitting.
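As a stand-in for the neural-network model trained on these (FT, HC, EC) variables, the following sketch substitutes a nearest-neighbour lookup over training triples of the same shape, so the input/output contract is visible. The training data reuses the example RGB values above, but the lookup itself is an illustrative assumption, not the patent's actual model.

```python
TRAINING = [
    # ((face tone FT, hair colour HC), worn eye colour EC), all as RGB tuples
    (((236, 219, 207), (64, 33, 37)), (68, 42, 51)),
    (((245, 207, 181), (182, 151, 119)), (107, 106, 105)),
    (((255, 199, 194), (32, 13, 19)), (85, 43, 44)),
]

def _dist(a, b):
    """Squared Euclidean distance between two RGB tuples."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def recommend_ec(ft, hc):
    """Return the eye colour learned for the closest (FT, HC) training pair."""
    best = min(TRAINING, key=lambda t: _dist(t[0][0], ft) + _dist(t[0][1], hc))
    return best[1]

# A user close to the first training sample gets that sample's eye colour.
print(recommend_ec((240, 215, 205), (60, 30, 35)))  # (68, 42, 51)
```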
  • the contact lens recommendation model learned by the learner 410 is used in the fitting information recommender 310 (refer to FIG. 4 ) to be described later.
  • The learner 410 also learns the data of the contact lens DB 320 that matches the eye color (EC). For example, if the RGB color of a learned EC is purple, it is matched with the contact lens model data 40 stored in the contact lens DB 320 that carries a purple tag.
  • contact lens model data 40 for a plurality of virtual fittings is stored in the contact lens DB 320 , and a plurality of color codes are designated as tags in each contact lens model data 40 .
  • The learner 410 also learns the contact lens DB 320 data matched to the learned eye color (EC) on the basis of these tags. Therefore, when the user later performs a virtual fitting, a contact lens suitable for the user can be recommended according to the recommendation model.
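The tag-based matching against the contact lens DB might look like the following sketch; the DB layout, model names, and tag colours are invented for illustration.

```python
# Hypothetical lens DB: each entry carries one or more colour-code tags.
LENS_DB = [
    {"model": "lens-A", "tags": [(120, 60, 150)]},   # purple-ish tag
    {"model": "lens-B", "tags": [(90, 60, 40)]},     # brown-ish tag
    {"model": "lens-C", "tags": [(70, 105, 110)]},   # grey-green tag
]

def match_lenses(ec, top_k=2):
    """Rank lens models by the distance from EC to their nearest colour tag."""
    def best_tag_dist(entry):
        return min(sum((x - y) ** 2 for x, y in zip(tag, ec))
                   for tag in entry["tags"])
    return [e["model"] for e in sorted(LENS_DB, key=best_tag_dist)[:top_k]]

# A brown eye colour matches the brown-tagged lens first.
print(match_lenses((85, 43, 44)))  # ['lens-B', 'lens-C']
```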
  • FIG. 4 is a diagram for explaining virtual fitting of a contact lens of a user according to an embodiment of the present invention.
  • the artificial neural network 300 includes a fitting information recommender 310 .
  • the fitting information recommender 310 is an independent artificial neural network and may consist of a plurality of layers.
  • Each of the plurality of layers includes a plurality of operations to which weights are applied.
  • the plurality of layers may include a convolution layer, a deconvolution layer, a pooling layer, a fully-connected layer, and the like.
  • the operation may include a convolution operation, a deconvolution operation, a maximum pooling operation, a minimum pooling operation, a soft-max operation, and the like.
  • Each of these operations includes a weight.
  • For example, a convolution operation or a pooling operation uses a filter; such a filter consists of a matrix, and the value of each element of the matrix may be a weight.
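The weight-matrix filter described above can be made concrete with a plain 2-D convolution, sketched here in pure Python with no padding or stride (an illustrative assumption, since the patent does not fix those details):

```python
def conv2d(image, kernel):
    """Valid (no-padding) 2-D convolution; every kernel element is a weight."""
    kh, kw = len(kernel), len(kernel[0])
    oh, ow = len(image) - kh + 1, len(image[0]) - kw + 1
    out = [[0] * ow for _ in range(oh)]
    for i in range(oh):
        for j in range(ow):
            # Each output value is the weighted sum of an input patch.
            out[i][j] = sum(image[i + di][j + dj] * kernel[di][dj]
                            for di in range(kh) for dj in range(kw))
    return out

image = [[1, 2, 3],
         [4, 5, 6],
         [7, 8, 9]]
kernel = [[1, 0],      # each element of this matrix is a learnable weight
          [0, -1]]
print(conv2d(image, kernel))  # [[-4, -4], [-4, -4]]
```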
  • The biometric information determiner 210 recognizes the facial region 10, the hair region 20, and the pupil region 30 from the user's face image or photo.
  • the display unit 140 may display a guideline so as to photograph the user's face while aligning it to a specific position.
  • the facial skin tone FT is measured through the face recognition unit 212
  • the hair color HC is measured through the hair recognition unit 214 .
  • the measured facial skin tone (FT) and hair color (HC) information is transmitted to the fitting information recommender 310 of the artificial neural network 300 .
  • the fitting information recommender 310 outputs recommended contact lens fitting information using facial skin tone (FT) and hair color (HC) as variables.
  • The recommended contact lens fitting information consists of the contact lens DB 320 entries matched according to the facial skin tone (FT) and hair color (HC); that is, a plurality of matched recommended contact lens model data 40 is output from the contact lens DB 320.
  • The fitting information recommender 310 outputs the optimal recommended contact lens model data 40 by performing a plurality of weighted calculations on the measured facial skin tone (FT) and hair color (HC).
  • The weights may be higher or lower depending, for example, on the color and area of the facial region 10 and the hair region 20.
  • The image synthesizer 510 outputs the plurality of recommended contact lens model data 40 on the screen of the display unit 140 that is currently photographing the user's face and, according to the user's selection, overlays the selected contact lens model data 40 on the user's pupil region 30.
  • The contact lens model data 40 may be displayed arranged in a row along the lower side of the display unit 140.
  • The contact lens model data 40 selected by the user is used, together with the user's facial skin tone (FT) and hair color (HC), to train the learner 410. Therefore, as many users use the fitting system 100 of the present invention, the recommendation model is continuously trained and improved, which raises user satisfaction with the contact lens recommendations.
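The accumulation of user selections into further training data could be sketched as below; the coarse skin-tone bucketing "model" is a deliberately crude placeholder for the patent's neural-network retraining, and every name here is hypothetical.

```python
from collections import Counter

training_samples = []

def record_selection(ft, hc, lens_model):
    """Store one confirmed fitting as a new (FT, HC) -> lens training sample."""
    training_samples.append({"ft": ft, "hc": hc, "lens": lens_model})

def most_popular_for(ft_bucket):
    """Toy stand-in for retraining: most frequently chosen lens for a
    coarse skin-tone bucket (red channel divided into bands of 32)."""
    picks = [s["lens"] for s in training_samples
             if s["ft"][0] // 32 == ft_bucket]
    return Counter(picks).most_common(1)[0][0] if picks else None

# Selections accumulate across users and shift later recommendations.
record_selection((236, 219, 207), (64, 33, 37), "lens-B")
record_selection((240, 221, 209), (60, 30, 35), "lens-B")
record_selection((120, 90, 80), (20, 20, 20), "lens-C")
print(most_popular_for(236 // 32))  # lens-B
```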
  • FIG. 5 is a flowchart illustrating a learning method of the learner 410 according to an embodiment of the present invention.
  • Step S11 is a step of collecting contact lens fitting images as shown in FIG. 3 and refining and pre-processing them: a plurality of upper-body images of people wearing contact lenses is collected, the images are cropped to the same aspect ratio, and their contrast and saturation are unified so that they can be recognized well.
  • Step S12 is a step of determining and extracting biometric information through the determining unit 200 (see FIG. 1 ).
  • The biometric information determined and extracted includes facial skin tone, hair color, and eye color.
  • Step S13 is a step of learning, through the artificial neural network 300 and the learning unit 400, the biometric information determined in step S12. Specifically, the worn eye color (contact lens color) is learned together with the facial skin tone and hair color.
  • Step S14 is a step of matching the pupil color from the data learned in step S13 to the contact lens model data (40, see FIG. 4) stored in the contact lens DB (320, see FIG. 4).
  • the learning unit 400 performs learning together with such matching information.
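Steps S11 through S14 can be strung together as a simple pipeline; every function below is a hypothetical stub standing in for the corresponding component (the discriminator 210, the network 300 and learning unit 400, and the contact lens DB 320), not the patent's implementation.

```python
def s11_preprocess(raw_images):
    # S11: collect and refine; here we just drop unusable entries (stub).
    return [img for img in raw_images if img is not None]

def s12_extract(images):
    # S12: stand-in for the biometric information discriminator (210).
    return [{"ft": img["ft"], "hc": img["hc"], "ec": img["ec"]} for img in images]

def s13_learn(samples):
    # S13: stand-in for neural-network training; here we just memorise samples.
    return {"samples": samples}

def s14_match(model, lens_db):
    # S14: tag each learned eye colour with the nearest lens model in the DB.
    def nearest(ec):
        return min(lens_db,
                   key=lambda l: sum((a - b) ** 2 for a, b in zip(l["tag"], ec)))
    return [(s["ec"], nearest(s["ec"])["name"]) for s in model["samples"]]

lens_db = [{"name": "lens-A", "tag": (120, 60, 150)},
           {"name": "lens-B", "tag": (90, 60, 40)}]
raw = [{"ft": (236, 219, 207), "hc": (64, 33, 37), "ec": (85, 43, 44)}, None]

matches = s14_match(s13_learn(s12_extract(s11_preprocess(raw))), lens_db)
print(matches)  # [((85, 43, 44), 'lens-B')]
```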
  • FIG. 6 is a flowchart illustrating a method for virtual fitting of a user's contact lens according to an embodiment of the present invention.
  • Step S21 is a step of photographing the user's upper body. That is, it is a step in which the user takes an image or photo before wearing the virtual contact lens.
  • Step S22 is a step of determining and extracting the user's biometric information from the image or photo taken in step S21.
  • As shown in FIG. 4, the facial region (10, see FIG. 4), the hair region (20, see FIG. 4), and the pupil region (30, see FIG. 4) are each extracted from the user's image or photo; the facial skin tone (FT) is measured in the facial region 10, and the hair color (HC) is measured in the hair region 20. In the pupil region 30, only the position of the pupil is extracted.
  • Step S23 is a step of inputting facial skin tone (FT) and hair color (HC) information measured in step S22 into the artificial neural network, and outputting the recommended contact lens image to the display unit 140 (refer to FIG. 1 ).
  • In step S23, recommended contact lens color information is output according to the facial skin tone (FT) and hair color (HC) information, and a plurality of recommended contact lens model data (40, see FIG. 4) matching this color in the contact lens DB 320 (see FIG. 4) is output on the display unit 140.
  • Step S24 is a step in which the user selects any one of the plurality of recommended contact lens model data 40 output on the display unit 140 .
  • the user selects any one of the contact lens model data 40 that he or she likes on the display unit 140 .
  • Step S25 is a step of synthesizing the contact lens model data 40 selected in step S24 on the pupil region 30 from the user's image or photo and displaying the synthesized data on the display unit 140 .
  • The contact lens model data 40 is synthesized into the image or photo in real time and displayed in the user's pupil region 30. Through this, the user can easily select, in a short time and without actually wearing a contact lens, a lens that suits them, with a result similar to actually wearing it.
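The real-time synthesis over the pupil region amounts to compositing the lens texture over the eye pixels. A single-pixel alpha blend, with made-up colour values, keeps the arithmetic visible; a real implementation would apply this per pixel across the detected pupil region.

```python
def blend(eye_px, lens_px, alpha):
    """Blend one lens pixel over one eye pixel; alpha in [0, 1],
    where 0 keeps the original eye and 1 shows only the lens."""
    return tuple(round(alpha * l + (1 - alpha) * e)
                 for e, l in zip(eye_px, lens_px))

eye = (85, 43, 44)      # user's measured iris colour (example value)
lens = (120, 60, 150)   # hypothetical purple lens texture pixel
print(blend(eye, lens, 0.75))  # (111, 56, 124)
```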
  • Step S26 is a step in which the learning unit 400 learns the user's facial skin tone (FT) and hair color (HC) together with the contact lens model data 40 selected in step S25.
  • The data selected by the user in step S25 is accumulated in the learning unit 400 and learned together with the facial skin tone (FT) and hair color (HC) information. As the learning data accumulates, the satisfaction with the contact lenses recommended to the user can gradually increase.
  • 310: fitting information recommender, 320: contact lens DB

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Accounting & Taxation (AREA)
  • Finance (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Economics (AREA)
  • General Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Marketing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Software Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Development Economics (AREA)
  • Evolutionary Computation (AREA)
  • Human Computer Interaction (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Artificial Intelligence (AREA)
  • Medical Informatics (AREA)
  • Computing Systems (AREA)
  • Ophthalmology & Optometry (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Databases & Information Systems (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The present invention relates to a customized contact lens recommendation and virtual fitting system that can recommend and fit a customized contact lens suitable for a user in a short time through the user's biometric information and a contact lens recommendation model. With a recommendation model created by learning eye color as a function of facial skin tone and hair color from contact lens fitting images, when a video or image of the user's face is input to a control unit through a camera unit, a biometric information discriminator recognizes the facial skin tone of the user's facial region and biometric information such as the hair color in the hair region, and a fitting information recommender recommends contact lens model data suited to the user's biometric information, so that the present invention has the effect of recommending and fitting a customized contact lens suitable for the user in a short time.
PCT/KR2021/016366 2020-11-30 2021-11-10 Customized contact lens recommendation and virtual fitting system WO2022114614A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2020-0164700 2020-11-30
KR1020200164700A KR20220075984A (ko) 2020-11-30 2020-11-30 Customized contact lens recommendation and virtual fitting system

Publications (1)

Publication Number Publication Date
WO2022114614A1 true WO2022114614A1 (fr) 2022-06-02

Family

ID=81756091

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/016366 WO2022114614A1 (fr) 2020-11-30 2021-11-10 Customized contact lens recommendation and virtual fitting system

Country Status (2)

Country Link
KR (1) KR20220075984A (fr)
WO (1) WO2022114614A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003057038A1 (fr) * 2002-01-04 2003-07-17 Vision Optic Co., Ltd. System and method for selecting eyeglasses and contact lenses
JP2004513389A (ja) * 2000-10-30 2004-04-30 Novartis AG Method and system for ordering customized cosmetic contact lenses
KR20040097200A (ko) * 2002-03-26 2004-11-17 김소운 Three-dimensional eyeglasses simulation system and method
KR101520778B1 (ko) * 2014-11-28 2015-05-18 (주) 뷰엠테크놀로지 Contact lens virtual fitting method and apparatus, and computer program for executing the method
KR101563200B1 (ko) * 2015-03-09 2015-10-26 김상무 System and method for simplifying color lens customization

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102091643B1 (ko) 2018-04-23 2020-03-20 (주)이스트소프트 Apparatus for generating an eyeglasses-wearing image using an artificial neural network, method therefor, and computer-readable recording medium storing a program for performing the method
KR102231239B1 (ko) 2018-12-18 2021-03-22 김재윤 Eyeglasses-wearing simulation method

Also Published As

Publication number Publication date
KR20220075984A (ko) 2022-06-08

Similar Documents

Publication Publication Date Title
CN109793498B (zh) A skin detection method and electronic device
WO2017188589A2 (fr) Hyperspectral mobile camera system and human skin monitoring using a hyperspectral mobile camera system
CN108076290B (zh) An image processing method and mobile terminal
WO2016085085A1 (fr) Contact lens virtual fitting device and method, and computer program for executing the contact lens virtual fitting method
CN105068646B (zh) Control method and *** of a terminal
CN111866483B (zh) Color restoration method and apparatus, computer-readable medium, and electronic device
WO2019125029A1 (fr) Electronic device for displaying an object in augmented reality and operating method therefor
WO2020130281A1 (fr) Electronic device and method for providing an avatar based on a user's emotional state
CN113297843B (zh) Coreference resolution method, apparatus, and electronic device
EP3691521A1 (fr) Electronic device and method for providing a stress index corresponding to a user's activity
CN109512384A (zh) Scalp detection device
WO2014088125A1 (fr) Image capturing device and method therefor
CN107231507A (zh) Imaging apparatus and imaging method
WO2019045385A1 (fr) Image alignment method and device therefor
JP2022551139A (ja) Live face detection method and apparatus, electronic device, and computer program
WO2019039698A1 (fr) External-light-based image processing method and electronic device supporting same
WO2021101073A1 (fr) Biometric data acquisition apparatus and method therefor
CN112434546A (zh) Face liveness detection method and apparatus, device, and storage medium
CN105095917A (zh) Image processing method, apparatus, and terminal
CN104751156A (zh) A dimmable finger-vein image acquisition ***
CN112115748B (zh) Certificate image recognition method, apparatus, terminal, and storage medium
WO2020141754A1 (fr) Method for recommending a facial wear product, and apparatus therefor
WO2022114614A1 (fr) Customized contact lens recommendation and virtual fitting system
CN107861244A (zh) Wearable device with thermal imaging function
WO2020230908A1 (fr) Strabismus diagnosis application and strabismus diagnosis apparatus comprising same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21898440

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21898440

Country of ref document: EP

Kind code of ref document: A1