CN113496140A - Iris positioning method and cosmetic pupil virtual try-on method and device - Google Patents

Iris positioning method and cosmetic pupil virtual try-on method and device

Info

Publication number
CN113496140A
CN113496140A CN202010191636.4A
Authority
CN
China
Prior art keywords
iris
edge
image
eyelid
eye
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010191636.4A
Other languages
Chinese (zh)
Inventor
杜峰
齐坤鹏
杨超
刘享军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jingdong Century Trading Co Ltd
Beijing Wodong Tianjun Information Technology Co Ltd
Original Assignee
Beijing Jingdong Century Trading Co Ltd
Beijing Wodong Tianjun Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jingdong Century Trading Co Ltd, Beijing Wodong Tianjun Information Technology Co Ltd filed Critical Beijing Jingdong Century Trading Co Ltd
Priority to CN202010191636.4A priority Critical patent/CN113496140A/en
Publication of CN113496140A publication Critical patent/CN113496140A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/06 Buying, selling or leasing transactions
    • G06Q30/0601 Electronic shopping [e-shopping]
    • G06Q30/0631 Item recommendations
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image

Landscapes

  • Business, Economics & Management (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Accounting & Taxation (AREA)
  • Finance (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Development Economics (AREA)
  • General Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Marketing (AREA)
  • Economics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses an iris positioning method, and a cosmetic pupil virtual try-on method and device based on the iris positioning method, relating to the technical field of computers. One specific embodiment of the cosmetic pupil virtual try-on method comprises the following steps: an edge detection step, performing edge detection on the eye image and retaining the edge amplitude information of each pixel point; an eyelid region removal step, setting the pixels of the eyelid region to a predetermined value to remove the edge amplitude information between the eyelid and the iris; a circle detection step, taking each pixel point in the eye region as a circle center, drawing circles of different radii within the eye image, accumulating and averaging the edge amplitudes on the circumference of each circle to obtain its average edge amplitude, and taking the circle with the largest average edge amplitude as the iris; and blending the cosmetic pupil map material with the iris image to realize the virtual try-on of the cosmetic pupil. The embodiment improves the accuracy of the iris position and the cosmetic pupil virtual try-on effect while reducing the amount of computation.

Description

Iris positioning method and cosmetic pupil virtual try-on method and device
Technical Field
The invention relates to the technical field of computers, in particular to an iris positioning method, a beautiful pupil virtual try-on method based on the iris positioning, and a corresponding application system and a corresponding device.
Background
With the development of internet technology in recent years, consumers can purchase more and more kinds of goods through the internet, among them many wearable goods, such as cosmetic contact lenses (cosmetic pupils), for which consumers want to preview the try-on effect before purchasing.
In the prior art, a real-time cosmetic pupil virtual try-on method is known. It first obtains eye key points through a face key point detection algorithm, yielding eyeball key points and periocular key points; it then constructs an eyeball mesh and blends the cosmetic pupil material with the iris image inside the mesh; finally it reconstructs a periocular mesh and overlays it on the previously blended image, removing the blended cosmetic pupil image in the eyelid region and keeping only the part over the eyeball region inside the eyelids. This prior-art method focuses mainly on how the cosmetic pupil material is blended with the original face picture. Its problems are that the iris position is not accurate and the blended image shows a cut-out feel, which degrades the wearing effect, and that the computation is heavy and inefficient.
One of the key points of the face key point detection algorithm used in the prior art is the iris detection algorithm. The current iris detection algorithm comprises a deep learning-based method, a Daugman iris detection algorithm and a round detection method based on Hough transformation. However, the existing iris detection algorithm has certain limitations in the aspects of accuracy, robustness, operation time and the like.
Disclosure of Invention
In view of this, embodiments of the present invention provide an iris positioning method, a cosmetic pupil virtual try-on method based on the iris positioning method, and a corresponding application system and apparatus. With the iris positioning method of the present invention, the iris position can be found accurately, so the cosmetic pupil virtual try-on is better achieved; when purchasing cosmetic pupils on the internet, a user can preview the try-on effect online to support the purchase decision, improving user experience.
To achieve the above object, according to an aspect of an embodiment of the present invention, there is provided an iris positioning method for positioning an iris of an eye image including an eyeball region and an eyelid region, the iris positioning method including:
an edge detection step, which is to perform edge detection on the eye image and retain the edge amplitude information of each pixel point in the eye image;
an eyelid region removing step of setting pixels of the eyelid region to a predetermined value to remove edge amplitude information between an eyelid and an iris; and
a circle detection step of taking each pixel point in the eye region as a circle center, drawing circles of different radii within the eye image, accumulating and averaging the edge amplitudes on the circumference of each circle to obtain its average edge amplitude, and taking the circle with the largest average edge amplitude as the iris.
Further, according to another aspect of the present invention, there is provided a cosmetic pupil virtual fitting method, including:
a face image acquisition step of acquiring a face image including an eye image;
detecting key points of the human face to obtain coordinates of key pixel points describing the eyes;
extracting the eye region, namely extracting the eye region based on the coordinates of the key pixel points and establishing eye region coordinates;
a coordinate conversion step of converting the screen coordinates of the eye region into the eye region coordinates;
an iris repositioning step of repositioning the iris of the eye region in the eye region coordinates by using the iris positioning method according to one aspect of the embodiment of the invention to determine an iris image;
a cosmetic pupil drawing step of making the pixel coordinates of the cosmetic pupil material map correspond to the pixel coordinates of the iris image in the eye region coordinates, so that the cosmetic pupil material map is fused with the iris image; and
an image output step of converting the fused image back to the screen coordinates and outputting it to the screen.
One embodiment of the above invention has the following advantage or beneficial effect: because the edge detection information is used as the input data of the circle detection, and the circle with the largest average pixel value, i.e. average edge amplitude, found by the circle detection is taken as the iris, the technical problems of large iris positioning deviation and heavy iris detection computation are solved, achieving the technical effect of obtaining an accurate iris region at a small computational cost.
In addition, the method draws the cosmetic pupil in a single blending pass, requires no complex processing, and can realize both a gradual shading effect at the cosmetic pupil edge and adjustment of the cosmetic pupil intensity.
Further effects of the above-mentioned non-conventional alternatives will be described below in connection with the embodiments.
Drawings
The drawings are included to provide a better understanding of the invention and are not to be construed as unduly limiting the invention. Wherein:
fig. 1 is a schematic diagram of a main flow of a cosmetic pupil virtual fitting method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a main flow of an iris localization method according to an embodiment of the present invention;
FIG. 3 is a schematic diagram showing eye keypoints;
FIG. 4 is a schematic diagram illustrating coordinate transformation;
FIG. 5 is a diagram showing before and after bilateral filtering, where FIG. 5A shows an image of an eye before bilateral filtering and FIG. 5B shows an image of an eye after bilateral filtering;
fig. 6 is a diagram showing an edge detection result;
FIG. 7 is a schematic diagram illustrating the effect of eyelid area removal;
fig. 8 is a schematic diagram showing a procedure of circle detection, in which fig. 8A shows a case of a circle centered at a point a, fig. 8B shows a case centered at a point B, and fig. 8C shows a case centered at a point C;
fig. 9 is a diagram illustrating an iris localization effect, in which fig. 9A illustrates a case where an iris is located in the middle of an eye region, and fig. 9B illustrates a case where an iris is located at the edge of an eye region;
fig. 10 is a diagram illustrating an aesthetic pupil virtual fitting effect of the aesthetic pupil virtual fitting method according to the embodiment of the present invention;
FIG. 11 is an exemplary system architecture diagram in which embodiments of the present invention may be employed;
fig. 12 is a schematic structural diagram of a computer system suitable for implementing a terminal device or a server according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present invention are described below with reference to the accompanying drawings, in which various details of embodiments of the invention are included to assist understanding, and which are to be considered as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the invention. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
Fig. 1 is a schematic diagram of the main flow of a cosmetic pupil virtual fitting method according to an embodiment of the present invention. Its cosmetic pupil drawing algorithm differs from existing algorithms in that the drawing is completed in a single blending pass, without removing the eyelid region by a second overlay of the periocular mesh, while also realizing gradual shading at the cosmetic pupil edge and adjustment of the cosmetic pupil intensity. The individual steps of the cosmetic pupil virtual fitting method according to the present embodiment are described below.
As shown in fig. 1, the main flow of the cosmetic pupil virtual fitting method according to the present embodiment includes the following main steps: image acquisition, face key point detection, eye region extraction, coordinate conversion, iris repositioning, cosmetic pupil material image blending, and image output.
< S1: image acquisition step >
That is, face data are collected through the camera.
< S2: face Key Point detection step >
After the face data is acquired, the eye key point coordinates are obtained through a known face key point detection algorithm. As shown in fig. 3, taking one eye as an example, the key point coordinates describing the eye obtained by the face key point detection algorithm are: a left eye corner key point 1, a right eye corner key point 2, an upper eyelid edge point 3, a lower eyelid edge point 4, and an iris center point 5. The cosmetic pupil virtual fitting method according to the present embodiment takes the coordinates of these five eye key points as its input data.
< S3: eye region extraction step >
After the above five key points are detected, as shown in fig. 3, a square is formed by taking the point 5 as the center and the distance from the point 1 to the point 2 as the side length, and a square coordinate system of the eye region is established. Wherein the square side length is parallel to the line connecting point 1 and point 2, and the square side length is specified to be 1. That is, as shown in fig. 4A, the square coordinate system has point a as the origin, and defines the coordinates of point B as (0,1) and the coordinates of point C as (1,0), and the coordinates of point D as (1, 1). This square area serves as a drawing area where the calculations in the following steps are all performed.
< S4: coordinate transformation step >
In order to align the coordinates of the beautiful pupil material map when drawing the beautiful pupil, the coordinates of the key points (original image coordinates) in the screen coordinate system need to be converted into the above-mentioned square coordinate system of the eye region.
As shown in fig. 4A and 4B, let the xOy coordinate system be the screen coordinate system and the x'O'y' coordinate system be the square coordinate system after the conversion. The specific conversion process is described below.
Let the coordinate transformation matrix be

$$W = \begin{pmatrix} a & b & 0 \\ c & d & 0 \\ e & f & 1 \end{pmatrix}.$$

If the coordinates of a point are (x, y) before the transformation and (x', y') after it, then

$$(x \;\; y \;\; 1)\, W = (x' \;\; y' \;\; 1).$$

To determine the coordinate transformation matrix W (six unknowns), the coordinates of three points before and after the transformation are needed. Points A, B and C are used for the calculation; their coordinates before the transformation are written $(x_A, y_A)$, $(x_B, y_B)$, $(x_C, y_C)$, and after it $(x_A', y_A')$, $(x_B', y_B')$, $(x_C', y_C')$. The coordinates of points 1, 2 and 5 before the transformation, obtained beforehand by the face key point detection algorithm, are written $(x_1, y_1)$, $(x_2, y_2)$, $(x_5, y_5)$. Writing $\mathbf{u} = (x_2 - x_1,\; y_2 - y_1)$ for the vector along the eye-corner line and $\mathbf{v}$ for its perpendicular of equal length, the corners of the square are

$$(x_A, y_A) = (x_5, y_5) - \tfrac{1}{2}\mathbf{u} - \tfrac{1}{2}\mathbf{v},$$

$$(x_B, y_B) = (x_A, y_A) + \mathbf{v},$$

$$(x_C, y_C) = (x_A, y_A) + \mathbf{u}.$$

The transformed coordinate axes are parallel to two sides of the square and the unit length is the side length of the square, so $(x_A', y_A') = (0, 0)$, $(x_B', y_B') = (0, 1)$, $(x_C', y_C') = (1, 0)$, and therefore

$$W = \begin{pmatrix} x_A & y_A & 1 \\ x_B & y_B & 1 \\ x_C & y_C & 1 \end{pmatrix}^{-1} \begin{pmatrix} 0 & 0 & 1 \\ 0 & 1 & 1 \\ 1 & 0 & 1 \end{pmatrix}.$$

From this the coordinate transformation matrix W is obtained, and with it the transformed coordinates of all points.
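The W = M^-1 M' solve above can be sketched in a few lines of NumPy. This is an illustrative sketch, not code from the patent: the corner coordinates are made-up values and the helper names are invented.

```python
import numpy as np

def solve_affine(src_pts, dst_pts):
    """Solve for the 3x3 matrix W satisfying (x  y  1) W = (x'  y'  1)
    given three point correspondences, i.e. W = M^(-1) M'."""
    M = np.hstack([np.asarray(src_pts, float), np.ones((3, 1))])
    Mp = np.hstack([np.asarray(dst_pts, float), np.ones((3, 1))])
    return np.linalg.solve(M, Mp)

# hypothetical screen coordinates of the corners A, B, C of the eye square
A, B, C = (100.0, 220.0), (100.0, 120.0), (200.0, 220.0)
W = solve_affine([A, B, C], [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0)])

def to_square(p):
    """Map a screen point into the unit-square coordinate system."""
    x, y, _ = np.array([p[0], p[1], 1.0]) @ W
    return (x, y)
```

With these example corners, the fourth corner D = (200, 120) maps to (1, 1), as the square coordinate system requires.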
< S5: iris repositioning step >
Cosmetic pupil virtual try-on blends the cosmetic pupil sheet with the iris region of the human eye. The iris region is approximately circular, so it can be matched to the coordinates of the cosmetic pupil material map through the center coordinates and radius of the iris. In practice, the iris center point returned by the face key point detection algorithm deviates somewhat from the true center of the iris, and this deviation makes the try-on position of the cosmetic pupil inaccurate, so the iris needs to be repositioned. With the iris positioning method described here, the center and the radius of the iris can be located accurately. The core idea of the iris positioning algorithm of this embodiment is to find the circle with the maximum probability in the image: the iris image is taken as input; for accuracy and robustness of the calculation, bilateral filtering and eyelid-edge key point fitting are used to remove noise and the interference of the eyelid region; the edge information is taken as the input data of circle detection; and finally an accurate iris region is obtained.
The iris positioning method according to the present embodiment will be described in detail below.
As shown in fig. 2, the iris positioning step specifically includes the following steps: bilateral filtering denoising, edge detection, eyelid area removal and iris circumference detection. The respective steps will be described in detail below.
[ S501: bilateral filtering denoising step
The eye picture has interference information such as eyelashes and highlight points. The present embodiment first removes the interference information by using bilateral filtering. The bilateral filtering method is a nonlinear filtering method, is a compromise treatment combining the spatial proximity and the pixel value similarity of an image, considers the spatial domain information and the gray level similarity at the same time, has the characteristics of simplicity, non-iteration and locality, and can reduce noise and smooth while keeping edges. In digital image processing, a place where the change in the gray level is severe is defined as an edge. Bilateral filtering represents the intensity of a current pixel using a weighted average of its neighborhood pixel values, where any open interval, for example, with a point as a center point, can be referred to as the neighborhood of the point. The weight used in the bilateral filtering considers the Euclidean distance between pixels, namely the Gaussian weight of a spatial domain, and considers the difference between the current pixel value and the neighborhood pixel value, namely the Gaussian weight of a value domain.
For example, let $I(i, j)$ be the pixel value at pixel $(i, j)$ before bilateral filtering and $I_{BF}(i, j)$ the value after it; then

$$I_{BF}(i, j) = \frac{\sum_{(k, l) \in S} I(k, l)\, w(i, j, k, l)}{\sum_{(k, l) \in S} w(i, j, k, l)},$$

where $S$ is the neighborhood of pixel $(i, j)$ and the weight of a neighborhood point $(k, l)$ is

$$w(i, j, k, l) = \exp\!\left(-\frac{(i - k)^2 + (j - l)^2}{2\sigma_d^2} - \frac{\bigl(I(i, j) - I(k, l)\bigr)^2}{2\sigma_r^2}\right),$$

with $\sigma_d$ and $\sigma_r$ the standard deviations of the spatial-domain and value-domain Gaussians, respectively.
as shown in fig. 5, fig. 5A shows the image of the eye before bilateral filtering, and fig. 5B shows the image of the eye after bilateral filtering, it can be seen that the image becomes smoother after denoising by bilateral filtering, while preserving edge information.
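To make the two Gaussian weights concrete, here is a brute-force bilateral filter for a grayscale image, written directly from the formulas above. It is an unoptimized sketch with arbitrary parameter values; practical code would call an optimized library routine instead.

```python
import numpy as np

def bilateral_filter(img, radius=2, sigma_d=2.0, sigma_r=25.0):
    """Brute-force bilateral filter on a 2-D float image."""
    h, w = img.shape
    pad = np.pad(img, radius, mode="edge")
    out = np.empty_like(img)
    # spatial-domain Gaussian weights, the same for every pixel
    ax = np.arange(-radius, radius + 1)
    dx, dy = np.meshgrid(ax, ax)
    w_d = np.exp(-(dx**2 + dy**2) / (2 * sigma_d**2))
    for i in range(h):
        for j in range(w):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # value-domain Gaussian weights, depending on the center pixel
            w_r = np.exp(-((patch - img[i, j]) ** 2) / (2 * sigma_r**2))
            weights = w_d * w_r
            out[i, j] = (patch * weights).sum() / weights.sum()
    return out
```

A flat region passes through unchanged, while a strong step edge keeps most of its contrast because pixels across the edge receive near-zero value-domain weight.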
[ S502: edge detection procedure
In the present embodiment, edge detection is performed using Sobel operator (Sobel operator). The sobel operator is a discrete difference operator used to calculate the approximate value of the gray scale of the image brightness function. Using this operator at any point in the image will produce the corresponding gray scale vector or its normal vector. In this embodiment, the horizontal and vertical gradients of the image are calculated by using the sobel operator to obtain the gradient amplitudes. In an image, the gradient at the edge is large, and the gradient at the non-edge portion is small, so the gradient magnitude can be used to characterize the image edge.
The Sobel operator comprises two basic 3x3 kernels, one for the horizontal and one for the vertical direction, each convolved with the image $A$ to obtain the horizontal and vertical edge-detection components:

$$G_x = \begin{pmatrix} -1 & 0 & +1 \\ -2 & 0 & +2 \\ -1 & 0 & +1 \end{pmatrix} * A, \qquad G_y = \begin{pmatrix} +1 & +2 & +1 \\ 0 & 0 & 0 \\ -1 & -2 & -1 \end{pmatrix} * A.$$

Edge detection preserves key information such as the iris edge, removes irrelevant information, and improves the robustness of the algorithm to brightness changes. In the present invention, only the amplitude information of the edge detection is needed, i.e.

$$G = \sqrt{G_x^2 + G_y^2},$$

which yields an edge distribution map in which the pixel value at each point is the corresponding edge amplitude $G$.
Fig. 6 shows the result of edge detection, and as shown in the figure, after edge detection, the gray value at the edge of the image is larger, and the gray value at the non-edge area is smaller.
[ S503: eyelid region removal procedure
When the iris is positioned, because the eyelid usually shields the iris, a large amount of irrelevant information in the eyelid area needs to be removed, and the accuracy of iris positioning can be greatly improved by removing the irrelevant information. To remove the eyelid area, the eyelid edge needs to be determined first, and then the pixels of the eyelid edge and the eyelid area need to be set to 0, or other predetermined values that can achieve the effect of removing the eyelid area. In practice, the eyelid margin may be fitted by 4 segments of parabolas, for example, in fig. 4, by fitting parabolas through points 1 and 3, points 2 and 3, points 1 and 4, points 2 and 4, respectively, the upper eyelid margin may be:
$$y = y_3 + \frac{y_1 - y_3}{(x_1 - x_3)^2}\,(x - x_3)^2 \quad (x \le x_3), \qquad y = y_3 + \frac{y_2 - y_3}{(x_2 - x_3)^2}\,(x - x_3)^2 \quad (x > x_3),$$

and the lower eyelid edge is:

$$y = y_4 + \frac{y_1 - y_4}{(x_1 - x_4)^2}\,(x - x_4)^2 \quad (x \le x_4), \qquad y = y_4 + \frac{y_2 - y_4}{(x_2 - x_4)^2}\,(x - x_4)^2 \quad (x > x_4).$$
Setting the pixel value of the eyelid area to 0 according to the fitted eyelid edge, an edge distribution map with the eyelid area removed can be obtained, as shown in fig. 7.
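A sketch of the masking idea follows. Note the simplifications relative to the patent: a single vertex-form parabola per lid through one eye corner is used instead of the four fitted arcs, and all names are invented for illustration.

```python
import numpy as np

def eyelid_mask(shape, corner, upper_apex, lower_apex):
    """Boolean mask that is True inside the eye opening bounded by two
    parabolic arcs; each arc has its vertex at a lid apex key point and
    passes through the given eye corner.  Points are (x, y) pixels."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)

    def lid_edge(apex):
        ax, ay = apex
        cx, cy = corner
        a = (cy - ay) / (cx - ax) ** 2  # coefficient forcing the arc through the corner
        return ay + a * (xs - ax) ** 2

    upper = lid_edge(upper_apex)  # upper-lid edge (smaller y means higher up)
    lower = lid_edge(lower_apex)  # lower-lid edge
    return (ys >= upper) & (ys <= lower)

# usage sketch: edge_map[~eyelid_mask(edge_map.shape, p1, p3, p4)] = 0
```

Multiplying (or zeroing) the edge distribution map with such a mask removes exactly the eyelid-edge amplitudes that would otherwise confuse the circle detection.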
[ S504: iris circumference detection procedure
After preserving the edge magnitude information and removing the eyelid area, the circumference of the iris can be detected and the iris located. Two parameters of the center and the radius need to be detected for detecting the circumference of the iris, so the process of detecting the circumference of the iris is actually the process of determining the center and the radius of the circumference of the iris. Specifically, for example, a circle may be determined by taking a point in the eye region as a center and taking a value as a radius, and then, for this circle, the edge amplitudes obtained by edge detection on the circumference thereof are accumulated and averaged as the probability value of the circle. As described above in step S502, since the gradient at the edge is large, the circle is traversed sequentially with each point in the eye region as the center and different values as the radius, and the circle with the maximum probability value is regarded as the circumference of the iris, so that the position of the iris is determined.
However, traversing each point in the eye area, i.e., the square area, as a center of the circle is very computationally intensive and unnecessary. In the actual operation, in order to reduce the traversal range and reduce the calculation amount, a detection strategy is set by starting from an iris center point obtained by detecting the key points of the human face. Although the central point of the iris returned by the key point detection is inaccurate, the central point always falls in the iris area, the traversal range can be properly reduced by using the limitation condition, and the circle center is traversed in a small square area with the side length of 1/4 at the center of the square area. During specific implementation, the process can be realized through parallel operation of a GPU (Graphics Processing Unit), so that the operation overhead is greatly reduced, and the algorithm speed is improved.
In addition, the traversal range of the radius can be reduced appropriately. For example, the iris radius is typically 0.2-0.3 of the eye width, so the radius traversal range can be defined as [0.2, 0.3], with a step size of 0.01.
As described above, each center and radius determines a circle; for each circle, the edge amplitudes on its circumference are accumulated and averaged, the resulting average edge amplitude is the probability value of the circle, and the circle with the largest probability value is taken as the iris circumference. More specifically, as shown in fig. 8, a circle is drawn with point Q as the center and r as the radius, the edge amplitudes of the pixels on the circumference are summed, and the average is then taken. Since the edge amplitude of white is greater than that of black, the average edge amplitude of circle Q is evidently the largest, so the probability that circle Q is the iris edge is greater than that of circles P and R, and also greater than that of circles with other radii. Here the accumulation is the sum of the pixel values, i.e. edge amplitudes, of the points on the circumference, and the average is that sum divided by the number of points on the circumference.
As shown in fig. 8, among the circles corresponding to points P, Q and R, the circle centered at point Q with radius r has the largest probability value of lying on the iris circumference, and is therefore taken as the iris circumference; the position of the iris is thus re-determined, more accurately.
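The traversal described above can be sketched as follows. Sampling the circumference at a fixed number of angles, and the particular ranges passed in, are implementation choices not specified by the patent.

```python
import numpy as np

def detect_iris(edge, cx_range, cy_range, r_range, n_samples=64):
    """Return the (cx, cy, r) whose circle has the largest mean edge
    amplitude on its circumference (the 'probability value')."""
    h, w = edge.shape
    theta = np.linspace(0.0, 2.0 * np.pi, n_samples, endpoint=False)
    best_score, best_circle = -1.0, None
    for cy in cy_range:
        for cx in cx_range:
            for r in r_range:
                xs = np.clip(np.rint(cx + r * np.cos(theta)).astype(int), 0, w - 1)
                ys = np.clip(np.rint(cy + r * np.sin(theta)).astype(int), 0, h - 1)
                score = edge[ys, xs].mean()  # average edge amplitude on the circle
                if score > best_score:
                    best_score, best_circle = score, (cx, cy, r)
    return best_circle
```

On a synthetic edge map containing a single bright ring, the search recovers the ring's center and radius; restricting `cx_range`/`cy_range` to a small square around the detected key point and `r_range` to roughly 0.2-0.3 of the eye width mirrors the pruning described above.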
As shown in fig. 9, point M is the pupil center returned by the face key point detection algorithm, point N is the center of the iris obtained by the algorithm, and the area enclosed by the circle in the figure is the iris detected by the algorithm of the present invention. A significant improvement in accuracy can be seen.
< S6: image blending step of pupil-beautifying material >
Cosmetic pupil material image blending, i.e. cosmetic pupil drawing, fuses the cosmetic pupil material map with the iris image. The position of the iris (the center and radius of its circle) has been obtained through the preceding steps, so the coordinates of the cosmetic pupil material map can be matched to the coordinates of the iris image, fusing the cosmetic pupil material with the iris image. The specific method is as follows.
If the blending weight at a point is α, the color of the cosmetic pupil map at that point is E and the color of the iris image is F, then the blended color $C_\alpha$ is: $C_\alpha = \alpha E + (1 - \alpha)F$.
The determination of the blending weight α is explained below. The alpha channel of the cosmetic pupil map (i.e. the transparency channel, which records the transparency information of the image) is 0 where the map is transparent and non-zero in the remaining area, so the value of the alpha channel can be used directly as the blending weight. However, since the eyelid region occludes the iris, the image in the eyelid region should still be the original image after blending. The eyelid edge has already been determined during iris repositioning, so the weight of the eyelid region can be set to 0, or to another predetermined value that eliminates the occlusion of the eyelid region, and the blended image there remains the original image. Further, to make the blended image look more natural, the weight near the eyelid edge is reduced appropriately on the eyeball side, producing a gradual shading effect. Let the width of the shading band be $w_s$ and the distance from a point in the eyeball region to the eyelid edge be $d_l$; the weight coefficient is then $d_l / w_s$, i.e. the closer a point is to the eyelid edge, the smaller its weight. Setting $d_l > 0$ in the eyeball region and $d_l < 0$ in the eyelid region, the resulting blending weight is:
$$\alpha = i_m \cdot \alpha_c \cdot \min\!\left(\max\!\left(\frac{d_l}{w_s},\, 0\right),\, 1\right),$$

where $\alpha_c$ is the value of the alpha channel of the cosmetic pupil map at the point, and $i_m$ denotes the blending intensity, with value range [0, 1]; the larger $i_m$ is, the more pronounced the cosmetic pupil effect.
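The full single-pass blend can be sketched as below; the clamp of d_l/w_s to [0, 1] reproduces the weight formula above, and the function and argument names are invented for illustration.

```python
import numpy as np

def blend_lens(iris_rgb, lens_rgb, lens_alpha, d_l, w_s=0.05, i_m=1.0):
    """Blend C = alpha*E + (1 - alpha)*F with
    alpha = i_m * alpha_channel * clamp(d_l / w_s, 0, 1).
    d_l is the signed distance to the eyelid edge (> 0 inside the
    eyeball region, < 0 in the eyelid region), so the lid keeps the
    original image and the edge fades over a band of width w_s."""
    fade = np.clip(d_l / w_s, 0.0, 1.0)
    alpha = (i_m * lens_alpha * fade)[..., None]  # broadcast over RGB
    return alpha * lens_rgb + (1.0 - alpha) * iris_rgb
```

Points in the eyelid region (d_l < 0) get weight 0 and keep the original color; deep inside the eyeball (d_l much larger than w_s) the weight is simply i_m times the map's alpha channel.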
< S7: image output step >
Rendering the mixed image on a screen can realize the virtual fitting effect of the beautiful pupil, as shown in fig. 10.
Having described the cosmetic pupil virtual fitting method and the iris positioning method according to the embodiments of the present invention in detail, an exemplary system architecture to which the cosmetic pupil virtual fitting method of the embodiments of the present invention can be applied will be described below.
Figure 11 shows an exemplary system architecture 1100 to which the cosmetic pupil virtual fitting method of an embodiment of the present invention may be applied.
As shown in fig. 11, the system architecture 1100 may include terminal devices 1101, 1102, 1103, a network 1104, and a server 1105. The network 1104 is a medium to provide communication links between the terminal devices 1101, 1102, 1103 and the server 1105. Network 1104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
A user may use terminal devices 1101, 1102, 1103 to interact with a server 1105 over a network 1104 to receive or send messages or the like. Various messaging client applications, such as shopping applications, web browser applications, search applications, instant messaging tools, mailbox clients, social platform software, etc. (examples only) may be installed on the terminal devices 1101, 1102, 1103.
The terminal devices 1101, 1102, 1103 may be various electronic devices having a display screen and a camera and supporting web browsing, including but not limited to smart phones, tablet computers, laptop portable computers, desktop computers, and the like.
The server 1105 may be a server that provides various services, such as a backend management server (for example only) that provides support for shopping-like websites browsed by users using the terminal devices 1101, 1102, 1103. The backend management server may analyze and otherwise process data such as the received product information query request, and feed back a processing result (for example, target push information and product information (only an example)) to the terminal device.
It should be noted that the cosmetic pupil virtual fitting method provided in the embodiment of the present invention is generally executed by the server 1105, and accordingly, the cosmetic pupil virtual fitting processing apparatus is generally provided in the server 1105.
It should be understood that the number of terminal devices, networks, and servers in fig. 11 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
Referring now to FIG. 12, shown is a block diagram of a computer system 1200 suitable for implementing a terminal device according to an embodiment of the present invention. The terminal device shown in fig. 12 is only an example, and should not impose any limitation on the functions and scope of use of the embodiments of the present invention.
As shown in fig. 12, the computer system 1200 includes a Central Processing Unit (CPU) 1201, which can perform various appropriate actions and processes in accordance with a program stored in a Read-Only Memory (ROM) 1202 or a program loaded from a storage section 1208 into a Random Access Memory (RAM) 1203. The RAM 1203 also stores various programs and data necessary for the operation of the system 1200. The CPU 1201, ROM 1202, and RAM 1203 are connected to each other by a bus 1204. An input/output (I/O) interface 1205 is also connected to the bus 1204.
The following components are connected to the I/O interface 1205: an input section 1206 including a keyboard, a mouse, and the like; an output portion 1207 including a display device such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker; a storage section 1208 including a hard disk and the like; and a communication section 1209 including a network interface card such as a LAN card, a modem, or the like. The communication section 1209 performs communication processing via a network such as the internet. A driver 1210 is also connected to the I/O interface 1205 as needed. A removable medium 1211, such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like, is mounted on the drive 1210 as necessary, so that a computer program read out therefrom is mounted into the storage section 1208 as necessary.
In particular, according to the embodiments of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 1209, and/or installed from the removable medium 1211. The computer program performs the above-described functions defined in the system of the present invention when executed by the Central Processing Unit (CPU) 1201.
It should be noted that the computer readable medium shown in the present invention can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present invention, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present invention, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules described in the embodiments of the present invention may be implemented by software or hardware. The described modules may also be provided in a processor, which may be described as: a processor including a sending module, an obtaining module, a determining module, and a first processing module. The names of these modules do not, in some cases, constitute a limitation of the modules themselves.
As another aspect, the present invention also provides a computer-readable medium, which may be contained in the apparatus described in the above embodiments, or may exist separately without being incorporated into the apparatus. The computer-readable medium carries one or more programs which, when executed by a device, cause the device to perform the following steps:
S1: an image acquisition step, in which face data is acquired through a camera;
S2: a face key point detection step, in which eye key point coordinates are obtained by a known face key point detection algorithm after the face data is acquired;
S3: an eye region extraction step, in which a square coordinate system of the eye is established using the eye key point coordinates and the eye region is extracted;
S4: a coordinate conversion step, in which the key point coordinates (original image coordinates) in the screen coordinate system are converted into the square coordinate system of the eye region, so as to align with the coordinates of the cosmetic pupil material map when the cosmetic pupil is drawn;
S5: an iris repositioning step, in which the center and radius of the iris are accurately located through iris repositioning to obtain an accurate iris region, ensuring that the fitting position of the cosmetic pupil is accurate in subsequent processing;
S6: a cosmetic pupil material image blending step, in which the map coordinates of the cosmetic pupil material are made to correspond to the coordinates of the iris image, so that the cosmetic pupil material is fused with the iris image; and
S7: an image output step, in which the blended image is rendered on the screen, finally realizing the cosmetic pupil virtual try-on effect.
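Steps S3 and S4 above — building the square eye coordinate system from the eye corner key points and mapping original-image coordinates into it — can be sketched as follows. This is a minimal illustration; the function names and the 1.4 margin factor are assumptions for the sketch, not taken from the specification.

```python
import numpy as np

def eye_square(left_corner, right_corner, scale=1.4):
    """Square eye coordinate system from the two eye-corner key points:
    a square centered between the corners whose side is `scale` times
    the corner distance (the margin factor is an assumption)."""
    lc = np.asarray(left_corner, dtype=float)
    rc = np.asarray(right_corner, dtype=float)
    center = (lc + rc) / 2.0
    side = scale * np.linalg.norm(rc - lc)
    origin = center - side / 2.0  # top-left corner of the square
    return origin, side

def to_eye_coords(points, origin, side):
    """Convert original-image (screen) coordinates into the [0, 1]^2
    eye-square frame, so they align with the material map coordinates."""
    return (np.asarray(points, dtype=float) - origin) / side
```

For eye corners at (100, 50) and (140, 50) this yields a 56-pixel square with origin (92, 22), and the midpoint of the two corners maps to (0.5, 0.5) in the eye frame.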
In the iris repositioning step, the following sub-steps are specifically executed:
S501: bilateral filtering is performed on the eye image to remove interference information; the pixel value of each pixel point in the bilaterally filtered image is obtained, and the edge information of the iris region in the eye image is preserved;
S502: edge detection is performed using the Sobel operator to obtain the edge amplitude information of each pixel point in the eye region;
S503: parabolic fitting is performed on the eyelid edge, and the pixel values of the eyelid region are set to 0 according to the fitted eyelid edge, yielding an edge distribution map with the eyelid region removed; and
S504: iris circumference detection is performed to determine the center and radius of the iris.
According to the technical scheme of the embodiments of the present invention, the following effects are achieved.

According to the iris positioning method, the eye image is taken as input and, for accuracy and robustness of the calculation, bilateral filtering and eyelid edge key point fitting are used to remove noise and interference from the eyelid region. The edge information then serves as the input data for circle detection, which finds the most probable circle in the image and finally obtains an accurate iris region. This greatly reduces the positional deviation of the iris and further improves the cosmetic pupil virtual try-on effect.
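The edge-then-circle-search process just described might be sketched in NumPy as below. Bilateral filtering and the parabolic eyelid fit are assumed to have already been applied, so the input edge map has the eyelid region zeroed out; the function names and the 72-point circumference sampling are assumptions of the sketch, not taken from the specification.

```python
import numpy as np

def sobel_magnitude(img):
    """Edge amplitude of a 2-D grayscale image via the 3x3 Sobel
    operator (border pixels are left at zero)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = img.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for dy in range(3):
        for dx in range(3):
            patch = img[dy:h - 2 + dy, dx:w - 2 + dx]
            gx[1:-1, 1:-1] += kx[dy, dx] * patch
            gy[1:-1, 1:-1] += ky[dy, dx] * patch
    return np.hypot(gx, gy)

def detect_iris(edge, centers, radii, n_samples=72):
    """Circumference detection: for every candidate center and radius,
    average the edge amplitude over points sampled on the circle and
    return the (cx, cy, r) with the largest average."""
    h, w = edge.shape
    thetas = np.linspace(0.0, 2.0 * np.pi, n_samples, endpoint=False)
    best_score, best = -1.0, None
    for cx, cy in centers:
        for r in radii:
            xs = np.clip((cx + r * np.cos(thetas)).astype(int), 0, w - 1)
            ys = np.clip((cy + r * np.sin(thetas)).astype(int), 0, h - 1)
            score = edge[ys, xs].mean()
            if score > best_score:
                best_score, best = score, (cx, cy, r)
    return best
```

On a synthetic 80x80 frame containing a dark disk of radius 15 at (40, 40), searching a small grid of candidate centers with radii 12-18 recovers the circle to within a pixel.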
In addition, according to the iris positioning method, accurate positioning of the iris can be achieved through fast and efficient post-processing, without relying on additional finely annotated data.

According to the cosmetic pupil virtual try-on method, based on the above iris positioning method, the cosmetic pupil blended into the eyelid region is removed simply and with a greatly reduced amount of computation, so that only the cosmetic pupil over the iris region of the eye is retained. Cosmetic pupil drawing is achieved in a single blending pass, without a second covering pass of a periocular mesh to remove the eyelid region, while shadow gradation at the cosmetic pupil edge and adjustment of the cosmetic pupil intensity are realized at the same time, reducing the cut-out feeling.
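The single-pass blending described above — zero weight on the eyelid, a shadow gradient at the eyelid edge, and an adjustable overall strength — might be sketched as follows. The function name `blend_lens`, the precomputed distance-to-eyelid map, and the default shadow width of 3 pixels are illustrative assumptions, not values from the specification.

```python
import numpy as np

def blend_lens(iris_rgb, lens_rgb, lens_alpha, dist_to_lid,
               strength=1.0, shadow_width=3.0):
    """One-pass blend of a cosmetic-lens map over the iris image.

    Per-pixel weight = strength * alpha * min(dist / shadow_width, 1):
    zero on the eyelid (dist == 0), ramping up to full weight across
    `shadow_width` pixels, giving a soft edge instead of a hard cut.
    `strength` in [0, 1] plays the role of the blending concentration."""
    coef = np.clip(np.asarray(dist_to_lid, dtype=float) / shadow_width, 0.0, 1.0)
    w = (strength * np.asarray(lens_alpha, dtype=float) * coef)[..., None]
    return w * np.asarray(lens_rgb, dtype=float) + (1.0 - w) * np.asarray(iris_rgb, dtype=float)
```

With `strength=0.5`, a pixel well inside the iris blends at weight 0.5, a pixel on the eyelid stays untouched, and pixels within the shadow band receive intermediate weights.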
The above-described embodiments should not be construed as limiting the scope of the invention. Those skilled in the art will appreciate that various modifications, combinations, sub-combinations, and substitutions can occur, depending on design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (19)

1. An iris positioning method for positioning an iris of an eye image, wherein the eye image includes an eyeball region and an eyelid region, comprising:
an edge detection step of performing edge detection on the eye image and retaining the edge amplitude information of each pixel point in the eye image;
an eyelid region removing step of setting the pixels of the eyelid region to a preset value to remove the edge amplitude information between the eyelid and the iris; and
a circumference detection step of taking each pixel point in the eye region as a circle center, drawing circles of different radii within the eye image range, accumulating the edge amplitudes on the circumference of each circle and averaging them to obtain an average edge amplitude of each circle, and taking the circle with the largest average edge amplitude as the iris.
2. An iris localization method according to claim 1,
further comprising a noise removing step of removing interference information in the eye image before the edge detecting step.
3. An iris localization method according to claim 2,
in the noise removing step, the interference information is removed through bilateral filtering processing.
4. An iris localization method according to claim 1 or 2,
wherein, in the edge detection step, the edge detection is performed on the eye image by Sobel edge detection.
5. An iris localization method according to claim 1 or 2,
in the eyelid region removing step, eyelid edge key pixel point information is obtained through face key point detection, the eyelid edge is fitted by using the eyelid edge key pixel point information to obtain a fitted upper eyelid edge and a fitted lower eyelid edge, and pixels of the eyelid region are set to be the preset value according to the upper eyelid edge and the lower eyelid edge.
6. An iris localization method according to claim 1 or 2,
wherein the circumference detection is performed by taking as the circle center each pixel point within a square that is centered at the center of the eyeball region and whose side length is 1/4 of the distance between the left eye corner pixel point and the right eye corner pixel point.
7. An iris localization method according to claim 6,
wherein the left eye corner pixel point information and the right eye corner pixel point information are obtained through face key point detection.
8. An iris localization method according to claim 1 or 2,
wherein, at each circle center point, the circumference detection is performed by taking as the radius values in the interval of 0.2 to 0.3 times the distance between the left eye corner pixel point and the right eye corner pixel point.
9. An iris localization method according to claim 8,
the search step for the radius is 0.01 times the distance.
10. An iris localization method according to claim 8,
wherein the left eye corner pixel point information and the right eye corner pixel point information are obtained through face key point detection.
11. A cosmetic pupil virtual fitting method is characterized by comprising the following steps:
a face image acquisition step of acquiring a face image including an eye image;
a face key point detection step of obtaining coordinates of key pixel points describing the eye;
an eye region extraction step of extracting the eye region based on the coordinates of the key pixel points and establishing eye region coordinates;
a coordinate conversion step of converting the screen coordinates of the eye region into the eye region coordinates;
an iris repositioning step of repositioning the iris in the eye region coordinates by using the iris positioning method according to any one of claims 1 to 10 to determine an iris image;
a cosmetic pupil drawing step of making the pixel point coordinates of a cosmetic pupil material map correspond to the pixel point coordinates of the iris image under the eye region coordinates, so that the cosmetic pupil material map is fused with the iris image; and
an image output step of converting the fused image back to the screen coordinates and outputting it onto the screen.
12. The cosmetic pupil virtual fitting method according to claim 11,
wherein, in the cosmetic pupil drawing step, the blended color of each pixel point is calculated using a blending weight.
13. The cosmetic pupil virtual fitting method according to claim 12,
wherein the value of the alpha (transparency) channel of the cosmetic pupil map material is used as the blending weight.
14. The cosmetic pupil virtual fitting method according to claim 13,
wherein the blending weight of the eyelid region is set to 0.
15. The cosmetic pupil virtual fitting method according to claim 13,
wherein the blending weight at the eyelid edge is set lower than the blending weight of the eyeball region.
16. The cosmetic pupil virtual fitting method according to claim 14 or 15,
wherein, for each point in the eyeball region outside the eyelid region, the ratio of the distance from the point to the eyelid edge to the shadow width of the desired gradual shadow effect at the eyelid edge is taken as the weight coefficient of the point, and the product of the blending concentration, the value of the alpha (transparency) channel of the cosmetic pupil map material, and the weight coefficient of the point is taken as the blending weight of the point, the blending concentration taking values in the range [0, 1].
17. An apparatus for performing cosmetic pupil virtual fitting processing, comprising:
a face image acquisition module that acquires a face image including an eye image;
a face key point detection module that obtains coordinates of key pixel points describing the eye;
an eye region extraction module that extracts an eye region based on the coordinates of the key pixel points and establishes eye region coordinates;
a coordinate conversion module that converts the screen coordinates of the eye region into the eye region coordinates;
an iris repositioning module that repositions the iris in the eye region coordinates by using the iris positioning method according to any one of claims 1 to 10 to determine an iris image;
a cosmetic pupil drawing module that makes the pixel point coordinates of a cosmetic pupil material map correspond to the pixel point coordinates of the iris image under the eye region coordinates, so as to fuse the cosmetic pupil material map with the iris image; and
an image output module that converts the fused image back to the screen coordinates and outputs it onto the screen.
18. An electronic device with a cosmetic pupil virtual try-on function, comprising:
one or more processors;
a storage device storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method according to any one of claims 1-9.
19. A computer-readable medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the method according to any one of claims 1-9.
CN202010191636.4A 2020-03-18 2020-03-18 Iris positioning method and cosmetic pupil virtual try-on method and device Pending CN113496140A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010191636.4A CN113496140A (en) 2020-03-18 2020-03-18 Iris positioning method and cosmetic pupil virtual try-on method and device

Publications (1)

Publication Number Publication Date
CN113496140A true CN113496140A (en) 2021-10-12

Family

ID=77993363

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010191636.4A Pending CN113496140A (en) 2020-03-18 2020-03-18 Iris positioning method and cosmetic pupil virtual try-on method and device

Country Status (1)

Country Link
CN (1) CN113496140A (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104091155A (en) * 2014-07-04 2014-10-08 武汉工程大学 Rapid iris positioning method with illumination robustness
CN107871322A (en) * 2016-09-27 2018-04-03 北京眼神科技有限公司 Iris segmentation method and apparatus
CN108288248A (en) * 2018-01-02 2018-07-17 腾讯数码(天津)有限公司 A kind of eyes image fusion method and its equipment, storage medium, terminal
CN109325421A (en) * 2018-08-28 2019-02-12 武汉真元生物数据有限公司 A kind of eyelashes minimizing technology and system based on edge detection
CN109785259A (en) * 2019-01-09 2019-05-21 成都品果科技有限公司 A kind of real-time U.S. pupil method and device


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
XU Yifan: "Research on an Iris Algorithm Based on Sobel Localization and Surface Recognition", China Master's Theses Full-text Database (Information Science and Technology), no. 1, 15 December 2013 (2013-12-15), pages 13-16 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116777940A (en) * 2023-08-18 2023-09-19 腾讯科技(深圳)有限公司 Data processing method, device, equipment and storage medium
CN116777940B (en) * 2023-08-18 2023-11-21 腾讯科技(深圳)有限公司 Data processing method, device, equipment and storage medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination