CN114005151B - Face angle sample collection and labeling method - Google Patents


Info

Publication number
CN114005151B
CN114005151B (application CN202010736655.0A)
Authority
CN
China
Prior art keywords
angle
point
face
eareye
eye
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010736655.0A
Other languages
Chinese (zh)
Other versions
CN114005151A (en)
Inventor
田凤彬
于晓静
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Ingenic Semiconductor Co Ltd
Original Assignee
Beijing Ingenic Semiconductor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Ingenic Semiconductor Co Ltd filed Critical Beijing Ingenic Semiconductor Co Ltd
Priority to CN202010736655.0A priority Critical patent/CN114005151B/en
Publication of CN114005151A publication Critical patent/CN114005151A/en
Application granted granted Critical
Publication of CN114005151B publication Critical patent/CN114005151B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention provides a method for collecting and labeling face angle samples, comprising the following steps. S1, collect face samples. S2, generate face sample marking points: seven points are marked manually, namely the left ear hole, set as point 1; the right ear hole, set as point 2; the center of the bottom of the chin, set as point 3; the left eye corner, set as point 4; the midpoint of the line segment connecting the two eye corners, set as point 5; the right eye corner, set as point 6; and the lower corner of the lower lip, set as point 7. S3, calculate the angles of the face angle sample from the marking points and generate the angle label information: determine the calculation parameters; traverse all pictures of a person's face to obtain the maximum eye-corner distance and the maximum distance from the midpoint of the eye-corner line to the chin, which are taken as the two true lengths of the person's three-dimensional face; and calculate the angles through the projection-triangle sine theorem to obtain the three angle values of the face. Because the method labels angles algorithmically, it reduces the error of manually labeled angles.

Description

Face angle sample collection and labeling method
Technical Field
The invention relates to the technical field of face recognition, and in particular to a method for collecting and labeling face angle samples.
Background
Neural network technology in the field of artificial intelligence is developing rapidly. In the field of face recognition, the prior art labels sample angles by manually observing the angle. Manual angle labeling introduces large errors; the result depends strongly on human factors.
In addition, the manual labeling of the prior art also includes the following commonly used technical terms:
1. face angle: refers to the angles formed by the human face in three directions in space.
2. Marking points: refers to the coordinate information of the point and other attributes of the point, such as whether the point is present, the presence of a flag of 1, the absence of a flag of 0, the point being the left eye corner or the bottom of the chin.
3. Volunteers: voluntary to attend the gathering activity, and those willing to contribute to the project.
Disclosure of Invention
In order to solve the above problems, an object of the present invention is to provide a method that labels face angles algorithmically, thereby reducing the error of manually labeled angles.
Specifically, the invention provides a method for collecting and labeling a face angle sample, which comprises the following steps:
S1, collecting a face sample;
S2, generating face sample marking points: seven points are marked manually. The seven points are: the left ear hole, set as point 1; the right ear hole, set as point 2; the center of the bottom of the chin, set as point 3; the left eye corner, set as point 4; the midpoint of the line segment connecting the two eye corners, set as point 5; the right eye corner, set as point 6; and the lower corner of the lower lip, set as point 7;
S3, calculating the angles of the face angle sample from the marking points and generating angle label information: determining the calculation parameters; traversing all pictures of the same person's face and calculating the eye-corner distance (the distance from point 4 to point 6) to obtain its maximum, and likewise traversing the distance from the midpoint of the eye-corner line to the chin (the distance from point 5 to point 3) to obtain its maximum; these two maxima over the face's various poses in the image plane are taken as the two true lengths of the person's three-dimensional face; and calculating the angles through the projection-triangle sine theorem to obtain the three angle values of the face.
The step S1 further includes:
S1.1, the person being collected wears no glasses; the camera is 3 meters from the person and level with the nose of the person's face; the person rotates the head about its center from left to right, tilts it up from bottom to top and down from top to bottom, rotates it left and right while looking up, rotates it left and right while looking down, and tilts it up from below while turned to one side; n pictures are collected per person, for 1000 persons;
S1.2, after being collected without glasses, the person randomly wears one of nearsighted glasses, sunglasses, a hat, a scarf and a mask and is collected again in the manner of S1.1; the illumination level and the background picture are random.
The step S2 further includes:
If the face is a side face, the occluded ear hole is written with the data value of the ear hole that can be seen, and the occluded ear is additionally marked as occluded; if the corner of one eye is occluded, the labeled data takes the value of the eye that can be seen, and the occluded eye is additionally marked as occluded; if the person wears sunglasses or long hair covers a feature, the point is labeled by estimation.
The determining the calculated parameter in step S3 further includes:
A) When the connecting line between the two eye corners rotates in the vertical plane, its projected length in the image plane does not change; only rotation about the vertical axis changes its length. This change is used to calculate the angle in that direction, which is named yaw; the line segment between point 4 and point 6 is disteye. When the face is turned so far to one side that only one eye can be seen, the change of the distance from that eye to the ear hole on the corresponding side reflects the change in the yaw direction, while the values of the other two directions are unchanged. The distance from point 6 to point 2, or from point 4 to point 1, is eareye, specifically divided into eareye62 and eareye41;
B) The line segment from the midpoint between the two eye corners to the chin changes length when the face rotates about the horizontal axis, and this property is used to calculate the angle in that direction, which is named pitch; the line segment from point 5 to point 3 is eyechin; when eyechin is at its maximum length, the pitch angle is 0;
C) The longer the line segment from the lower corner of the lower lip to the center of the bottom of the chin, the larger the upward-looking angle; the shorter the segment, the larger the downward-looking angle; this segment is lipchin, and the change of the lipchin value determines the sign of the pitch;
D) The horizontal-direction angle is obtained directly from the angle between the connecting line of the two eye corners and the horizontal; this direction is named roll.
The method for implementing the step S3 further includes:
S3.1, put the pictures collected from each person into one folder; faces of 1000 persons are collected, generating 1000 folders;
S3.2, the longest disteye in each folder is max_disteye, the longest eareye is max_eareye, the longest eyechin is max_eyechin, and the lipchin value of the picture whose eyechin equals max_eyechin is lipchin_0;
S3.3, for each folder, find the minimum non-zero disteye, min_disteye, and the corresponding eareye value eareye_60, then calculate the angles for these two cases:
A = arccos(min_disteye/max_disteye),
B = arcsin(eareye_60/max_eareye),
and correct the angle, with the correction value correct = (B - A)/(90 - B);
S3.4, calculating the space angle of each face.
The step S3.4 further comprises:
The face angle directions are represented in the range -90 to 90. The disteye, eareye, eyechin and lipchin values of each picture in each folder are calculated as dist_disteye, dist_eareye, dist_eyechin and dist_lipchin; their maxima are the folder's max_disteye, max_eareye, max_eyechin (with corresponding lipchin_0) respectively. The maxima differ from person to person.
In the step S3, calculating angles through the projection triangle sine theorem, and obtaining three angle values of a face further comprises:
Calculation of the yaw direction:
If eareye62 > eareye41, the yaw direction is positive, otherwise negative. The angle value is: if dist_disteye > 0, angle_yaw = arccos(dist_disteye/max_disteye); otherwise,
angle_yaw = arcsin(dist_eareye/max_eareye) - correct × (arcsin(dist_eareye/max_eareye) - B). The sign of the yaw direction is then applied to angle_yaw;
Calculation of the pitch direction:
If dist_lipchin < lipchin_0, the angle is negative, otherwise positive; angle_pitch = arccos(dist_eyechin/max_eyechin); the sign of the pitch direction is then applied to angle_pitch;
And (3) calculating the roll direction:
Calculate directly from the coordinates of marking point 4 and marking point 6, set as (x1, y1) and (x2, y2); the angle is
angle_roll = arctan((y2 - y1)/(x2 - x1))
and this value is signed, i.e. positive or negative;
the face angles are: angle_yaw, angle_pitch, angle_roll.
Calculation of the roll direction with occlusion: if one eye is occluded, the coordinates of marking point 5 and the corner of the other, non-occluded eye are used for the calculation.
Thus, the present application has the advantage that the method is simple, effective and easy to operate; it effectively reduces the face angle labeling error and eliminates the influence of human factors.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and constitute a part of this specification, illustrate the application and together with the description serve to explain it.
FIG. 1 is a schematic flow chart of the method of the present invention.
Fig. 2(a) is a schematic diagram of the angular directions of the face when looking straight ahead, in step S1 of the method of the present invention.
Fig. 2(b) is a schematic diagram of the angular directions of the face when looking up, in step S1 of the method of the present invention.
Fig. 2(c) is a schematic diagram of the angular directions of the face when looking down, in step S1 of the method of the present invention.
Fig. 3 is a schematic diagram of seven-point labeling of face sample labeling in the method of the present invention.
Fig. 4 is a schematic diagram of face direction naming in the method of the present invention.
Fig. 5 is a specific flow chart of the method for implementing step S3 in the method of the present invention.
Detailed Description
In order that the technical content and advantages of the present invention may be more clearly understood, a further detailed description of the present invention will now be made with reference to the accompanying drawings.
As shown in fig. 1, the invention relates to a method for collecting and labeling a face angle sample, which comprises the following steps:
S1, collecting a face sample;
S2, generating face sample marking points: seven points are marked manually. The seven points are: the left ear hole, set as point 1; the right ear hole, set as point 2; the center of the bottom of the chin, set as point 3; the left eye corner, set as point 4; the midpoint of the line segment connecting the two eye corners, set as point 5; the right eye corner, set as point 6; and the lower corner of the lower lip, set as point 7;
S3, calculating the angles of the face angle sample from the marking points and generating angle label information: determining the calculation parameters; traversing all pictures of the same person's face and calculating the eye-corner distance (the distance from point 4 to point 6) to obtain its maximum, and likewise traversing the distance from the midpoint of the eye-corner line to the chin (the distance from point 5 to point 3) to obtain its maximum; these two maxima over the face's various poses in the image plane are taken as the two true lengths of the person's three-dimensional face; and calculating the angles through the projection-triangle sine theorem to obtain the three angle values of the face.
The technical scheme of the invention can be further described as follows:
S1, collecting a face sample.
S1.1, the camera is 3 meters from the person and level with the nose of the person's face; the person keeps the waist and neck straight and rotates the head about its center in various poses (for example, rotating from left to right, rotating from right to left, tilting up from bottom to top, tilting down from top to bottom, rotating left and right while looking up, rotating left and right while looking down, tilting up from below while turned to one side, and so on); the specific angular directions of the face are shown in the schematic diagrams of Figs. 2(a)-(c). The collected persons are volunteers; n pictures are collected per person, for 1000 persons;
S1.2, after being collected without glasses, each volunteer randomly wears one of nearsighted glasses, sunglasses, a hat, a scarf and a mask and is collected again. The illumination level and the background picture are random.
S2, generating face sample marking points. The seven points are labeled manually. They are the two ear holes, the center of the bottom of the chin, the left eye corner, the midpoint of the line segment connecting the two eye corners, the right eye corner and the lower corner of the lower lip; the labeled positions are shown in Fig. 3: 1 is the left ear hole, 2 the right ear hole, 3 the center of the bottom of the chin, 4 the left eye corner, 5 the midpoint between the two eye corners, 6 the right eye corner, and 7 the lower corner of the lower lip. If the face is a side face, the occluded ear hole is written with the data value of the ear hole that can be seen and the occluded ear is additionally marked; if the corner of one eye is occluded, the labeled data takes the value of the eye that can be seen and the occluded eye is additionally marked. If the person wears sunglasses or long hair covers a feature, the point is labeled by estimation.
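The seven-point label record described above can be sketched as a small data structure. This is an illustrative assumption (the patent does not fix a file format); the presence flag follows the "marking point" definition given earlier (1 = present, 0 = absent), and the `LabelPoint` name and example coordinates are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class LabelPoint:
    # Coordinate information plus a presence flag, as in the patent's
    # "marking point" definition: 1 = visible, 0 = absent/occluded.
    x: float
    y: float
    present: int

# One labeled face sample, indexed as in the patent:
# 1 left ear hole, 2 right ear hole, 3 chin-bottom center, 4 left eye corner,
# 5 midpoint of the eye-corner line, 6 right eye corner, 7 lower-lip lower corner.
face_labels = {
    1: LabelPoint(102.0, 180.0, 1),
    2: LabelPoint(260.0, 182.0, 1),
    3: LabelPoint(181.0, 300.0, 1),
    4: LabelPoint(140.0, 176.0, 1),
    5: LabelPoint(181.0, 177.0, 1),
    6: LabelPoint(222.0, 178.0, 1),
    7: LabelPoint(181.0, 268.0, 1),
}
```

Note that point 5 is by definition the midpoint of points 4 and 6; an occluded point would keep the visible counterpart's value with `present = 0`.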
The principles of the present invention may be described as follows:
1) Determining a calculation parameter;
A) When the connecting line between the two eye corners rotates in the vertical plane, its projected length in the image plane does not change; only rotation about the vertical axis changes its length. This change is used to calculate the angle in that direction, which is named yaw, as shown in Fig. 4. The line segment between point 4 and point 6 is disteye.
When the face is turned so far to one side that only one eye can be seen, the change of the distance from that eye to the ear hole on the corresponding side reflects the change in the yaw direction, while the values of the other two directions are unchanged. The distance from point 6 to point 2, or from point 4 to point 1, is eareye, specifically eareye62 and eareye41.
B) The line segment from the midpoint between the two eye corners to the chin changes length when the face rotates about the horizontal axis, and this property is used to calculate the angle in that direction. This direction is named pitch, as shown in Fig. 4. Let the line segment from point 5 to point 3 be eyechin; when eyechin is at its maximum length, the pitch angle is 0.
C) Consider the line segment from the lower corner of the lower lip to the center of the bottom of the chin. The longer this segment, the larger the upward-looking angle; the shorter it is, the larger the downward-looking angle. This is caused by the nearly spherical shape of the human head. Let the segment be lipchin; its value determines the sign of the pitch.
D) The horizontal-direction angle is obtained directly from the angle between the connecting line of the two eye corners and the horizontal. This direction is named roll, as shown in Fig. 4.
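The four line segments that A)-D) rely on are plain Euclidean distances between labeled points. The following is a minimal sketch, assuming each sample is a dict mapping the patent's point index to (x, y) coordinates; the helper name `segment_lengths` is our own.

```python
import math

def segment_lengths(pts):
    """Lengths of the segments used by the method, from one labeled sample.
    pts maps point index -> (x, y), indexed 1..7 as in the patent."""
    d = lambda i, j: math.hypot(pts[i][0] - pts[j][0], pts[i][1] - pts[j][1])
    return {
        "disteye": d(4, 6),   # eye corner to eye corner (yaw)
        "eareye62": d(6, 2),  # right eye corner to right ear hole
        "eareye41": d(4, 1),  # left eye corner to left ear hole
        "eyechin": d(5, 3),   # eye-line midpoint to chin bottom (pitch)
        "lipchin": d(7, 3),   # lower-lip corner to chin bottom (pitch sign)
    }
```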
2) The true lengths need to be known. Traverse all pictures of the same face and calculate the eye-corner distance (the distance from point 4 to point 6) to obtain its maximum, and likewise traverse the distance from the midpoint of the eye-corner line to the chin (the distance from point 5 to point 3) to obtain its maximum; these two maxima over the face's various poses in the image plane are the two true lengths of the person's three-dimensional face. 3) Calculate the angles through the projection-triangle sine theorem to obtain the three angle values.
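The traversal for the two true lengths can be sketched as follows, assuming the same (x, y)-dict representation of a labeled sample; function and variable names are ours, not the patent's.

```python
import math

def true_lengths(samples):
    """Traverse one person's labeled pictures and return the two true 3D
    lengths: the maximum point4-point6 distance and the maximum
    point5-point3 distance over all poses."""
    d = lambda s, i, j: math.hypot(s[i][0] - s[j][0], s[i][1] - s[j][1])
    max_disteye = max(d(s, 4, 6) for s in samples)
    max_eyechin = max(d(s, 5, 3) for s in samples)
    return max_disteye, max_eyechin
```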
As shown in fig. 5, a specific method for implementing step S3 in the present invention may be described as follows:
S3.1, put the pictures collected from each volunteer into one folder; faces of 1000 volunteers are collected, generating 1000 folders.
S3.2, the longest disteye in each folder is max_disteye, the longest eareye is max_eareye, the longest eyechin is max_eyechin, and the lipchin value of the picture whose eyechin equals max_eyechin is lipchin_0.
S3.3, for each folder, find the minimum non-zero disteye, min_disteye, and the corresponding eareye value eareye_60, then calculate the angles for these two cases:
A = arccos(min_disteye/max_disteye),
B = arcsin(eareye_60/max_eareye),
and correct the angle, with the correction value correct = (B - A)/(90 - B).
S3.4, calculating the space angle of each face.
The face angle directions are represented in the range -90 to 90, as shown in Figs. 2(a)-2(c). The disteye, eareye, eyechin and lipchin values of each picture in each folder are calculated as dist_disteye, dist_eareye, dist_eyechin and dist_lipchin; their maxima are the folder's max_disteye, max_eareye, max_eyechin (with corresponding lipchin_0) respectively. The maxima differ from person to person.
Calculation of the yaw direction: if eareye62 > eareye41, the yaw direction is positive, otherwise negative. The angle value is: if dist_disteye > 0, angle_yaw = arccos(dist_disteye/max_disteye); otherwise,
angle_yaw = arcsin(dist_eareye/max_eareye) - correct × (arcsin(dist_eareye/max_eareye) - B). The sign of the yaw direction is then applied to angle_yaw;
Calculation of the pitch direction: if dist_lipchin < lipchin_0, the angle is negative, otherwise positive. The angle is
angle_pitch = arccos(dist_eyechin/max_eyechin). The sign of the pitch direction is then applied to angle_pitch;
Calculation of the roll direction: calculate directly from the coordinates of marking point 4 and marking point 6 (if one eye is occluded, use marking point 5 and the corner position of the other eye). Set the coordinate points as (x1, y1) and (x2, y2); the angle is
angle_roll = arctan((y2 - y1)/(x2 - x1))
and this value is signed, i.e. positive or negative.
The face angles are then: angle_yaw, angle_pitch, angle_roll.
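The three angle computations of S3.4 can be gathered into one sketch. This assumes the per-image distances and the per-folder maxima/calibration values are already available, and uses arctan as in the roll formula above (so it presumes x2 ≠ x1); all function and parameter names are ours.

```python
import math

def face_angles(dist_disteye, max_disteye, dist_eareye, max_eareye,
                B, correct, dist_eyechin, max_eyechin,
                dist_lipchin, lipchin_0, eareye62, eareye41, p4, p6):
    """Yaw, pitch and roll in degrees, following the sign rules above."""
    # yaw: from the eye-corner distance when both corners are seen,
    # otherwise from the ear-to-eye distance with the correction applied
    if dist_disteye > 0:
        yaw = math.degrees(math.acos(dist_disteye / max_disteye))
    else:
        raw = math.degrees(math.asin(dist_eareye / max_eareye))
        yaw = raw - correct * (raw - B)
    if eareye62 <= eareye41:  # positive only if eareye62 > eareye41
        yaw = -yaw
    # pitch: from the eyes-to-chin distance; lipchin decides the sign
    pitch = math.degrees(math.acos(dist_eyechin / max_eyechin))
    if dist_lipchin < lipchin_0:
        pitch = -pitch
    # roll: directly from the slope of the eye-corner line (points 4 and 6)
    roll = math.degrees(math.atan((p6[1] - p4[1]) / (p6[0] - p4[0])))
    return yaw, pitch, roll
```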
The above description covers only preferred embodiments of the present invention and is not intended to limit it; those skilled in the art can make various modifications and variations to the embodiments. Any modification, equivalent replacement or improvement made within the spirit and principle of the present invention shall fall within its protection scope.

Claims (5)

1. A method for collecting and labeling a face angle sample, the method comprising the steps of:
S1, collecting a face sample;
S2, generating face sample marking points: seven points are marked manually, namely the left ear hole, set as point 1; the right ear hole, set as point 2; the center of the bottom of the chin, set as point 3; the left eye corner, set as point 4; the midpoint of the line segment connecting the two eye corners, set as point 5; the right eye corner, set as point 6; and the lower corner of the lower lip, set as point 7;
S3, calculating the angle of the face angle sample through the marking points, and generating angle marking information:
Determining the calculation parameters; traversing all pictures of the same person's face and calculating the eye-corner distance (the distance from point 4 to point 6) to obtain its maximum, and likewise traversing the distance from the midpoint of the eye-corner line to the chin (the distance from point 5 to point 3) to obtain its maximum, these two maxima over the face's various poses in the image plane being taken as the two true lengths of the person's three-dimensional face; and calculating the angles through the projection-triangle sine theorem to obtain the three angle values of the face;
The determining the calculated parameter in step S3 further includes:
A) When the connecting line between the two eye corners rotates in the vertical plane, its projected length in the image plane does not change; only rotation about the vertical axis changes its length; this change is used to calculate the angle in that direction, which is named yaw, and the line segment between point 4 and point 6 is disteye; when the face is turned so far to one side that only one eye can be seen, the change of the distance from that eye to the ear hole on the corresponding side reflects the change in the yaw direction, while the values of the other two directions are unchanged; the distance from point 6 to point 2, or from point 4 to point 1, is eareye, specifically divided into eareye62 and eareye41;
B) The line segment from the midpoint between the two eye corners to the chin changes length when the face rotates about the horizontal axis, and this property is used to calculate the angle in that direction, which is named pitch; the line segment from point 5 to point 3 is eyechin; when eyechin is at its maximum length, the pitch angle is 0;
C) The longer the line segment from the lower corner of the lower lip to the center of the bottom of the chin, the larger the upward-looking angle; the shorter the segment, the larger the downward-looking angle; this segment is lipchin, and the change of the lipchin value determines the sign of the pitch;
D) The horizontal-direction angle is obtained directly from the angle between the connecting line of the two eye corners and the horizontal, and this direction is named roll;
the method for implementing the step S3 further includes:
S3.1, putting the pictures collected from each person into one folder, faces of 1000 persons being collected, generating 1000 folders;
S3.2, the longest disteye in each folder is max_disteye, the longest eareye is max_eareye, the longest eyechin is max_eyechin, and the lipchin value of the picture whose eyechin equals max_eyechin is lipchin_0;
S3.3, for each folder, finding the minimum non-zero disteye, min_disteye, and the corresponding eareye value eareye_60, then calculating the angles for these two cases:
A = arccos(min_disteye/max_disteye),
B = arcsin(eareye_60/max_eareye),
and correcting the angle, with the correction value correct = (B - A)/(90 - B);
S3.4, calculating the space angle of each face;
The step S3.4 further comprises:
The face angle directions are represented in the range -90 to 90; the disteye, eareye, eyechin and lipchin values of each picture in each folder are calculated as dist_disteye, dist_eareye, dist_eyechin and dist_lipchin, and their maxima are the folder's max_disteye, max_eareye, max_eyechin (with corresponding lipchin_0) respectively; the maxima differ from person to person.
2. The method for collecting and labeling a face angle sample according to claim 1, wherein the step S1 further comprises:
S1.1, the person being collected wears no glasses; the camera is 3 meters from the person and level with the nose of the person's face; the person rotates the head about its center from left to right, tilts it up from bottom to top and down from top to bottom, rotates it left and right while looking up, rotates it left and right while looking down, and tilts it up from below while turned to one side; n pictures are collected per person, for 1000 persons;
S1.2, after being collected without glasses, the person randomly wears one of nearsighted glasses, sunglasses, a hat, a scarf and a mask and is collected again in the manner of S1.1, with random illumination level and random background picture.
3. The method for collecting and labeling a face angle sample according to claim 1, wherein the step S2 further comprises:
If the face is a side face, the occluded ear hole is written with the data value of the ear hole that can be seen, and the occluded ear is additionally marked as occluded; if the corner of one eye is occluded, the labeled data takes the value of the eye that can be seen, and the occluded eye is additionally marked as occluded; if the person wears sunglasses or long hair covers a feature, the point is labeled by estimation.
4. The method for collecting and labeling human face angle samples according to claim 1, wherein in the step S3, the calculating angles by projecting triangular sine theorem, obtaining three angle values of a human face further comprises:
Calculation of the yaw direction:
If eareye62 > eareye41, the yaw direction is positive, otherwise negative; the angle value is: if dist_disteye > 0, angle_yaw = arccos(dist_disteye/max_disteye);
otherwise,
angle_yaw = arcsin(dist_eareye/max_eareye) - correct × (arcsin(dist_eareye/max_eareye) - B); the sign of the yaw direction is then applied to angle_yaw;
Calculation of the pitch direction:
if dist_lipchin < lipchin_0, the angle is negative, otherwise positive; angle_pitch = arccos(dist_eyechin/max_eyechin); the sign of the pitch direction is then applied to angle_pitch;
And (3) calculating the roll direction:
calculating directly from the coordinates of marking point 4 and marking point 6, set as (x1, y1) and (x2, y2); the angle is
angle_roll = arctan((y2 - y1)/(x2 - x1))
and this value is signed, i.e. positive or negative;
the face angles are: angle_yaw, angle_pitch, angle_roll.
5. The method for collecting and labeling a face angle sample according to claim 4, wherein the calculating of the roll direction further comprises: if one eye is occluded, the coordinates of marking point 5 and the corner of the other, non-occluded eye are used for the calculation.
CN202010736655.0A 2020-07-28 2020-07-28 Face angle sample collection and labeling method Active CN114005151B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010736655.0A CN114005151B (en) 2020-07-28 2020-07-28 Face angle sample collection and labeling method


Publications (2)

Publication Number Publication Date
CN114005151A CN114005151A (en) 2022-02-01
CN114005151B (en) 2024-05-03

Family

ID=79920479

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010736655.0A Active CN114005151B (en) 2020-07-28 2020-07-28 Face angle sample collection and labeling method

Country Status (1)

Country Link
CN (1) CN114005151B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009176005A (en) * 2008-01-24 2009-08-06 Toyota Motor Corp Characteristic point detection method for face image and its device
CN104951767A (en) * 2015-06-23 2015-09-30 安阳师范学院 Three-dimensional face recognition technology based on correlation degree
WO2019232866A1 (en) * 2018-06-08 2019-12-12 平安科技(深圳)有限公司 Human eye model training method, human eye recognition method, apparatus, device and medium
CN111259739A (en) * 2020-01-09 2020-06-09 浙江工业大学 Human face pose estimation method based on 3D human face key points and geometric projection


Also Published As

Publication number Publication date
CN114005151A (en) 2022-02-01

Similar Documents

Publication Publication Date Title
CN106168853B (en) A kind of free space wear-type gaze tracking system
WO2020125499A1 (en) Operation prompting method and glasses
CN110363116B (en) Irregular human face correction method, system and medium based on GLD-GAN
CN104766059B (en) Quick accurate human-eye positioning method and the gaze estimation method based on human eye positioning
Nishino et al. Corneal imaging system: Environment from eyes
CN1312555C (en) Mixed reality exhibiting method and apparatus
CN104615978B (en) Direction of visual lines tracking and device
US20030169907A1 (en) Facial image processing system
CN106909875A (en) Face shape of face sorting technique and system
CN108369653A (en) Use the eyes gesture recognition of eye feature
CN104978548A (en) Visual line estimation method and visual line estimation device based on three-dimensional active shape model
US20160127657A1 (en) Imaging system
WO2019062056A1 (en) Smart projection method and system, and smart terminal
CN110473221A (en) A kind of target object automatic scanning system and method
US11707191B2 (en) Calibration and image procession methods and systems for obtaining accurate pupillary distance measurements
CN108717704A (en) Method for tracking target, computer installation based on fish eye images and computer readable storage medium
CN102905136B (en) A kind of video coding-decoding method, system
CN106915303A (en) Automobile A-column blind area perspective method based on depth data and fish eye images
CN106937059A (en) Image synthesis method and system based on Kinect
Tian et al. Absolute head pose estimation from overhead wide-angle cameras
Kumar et al. A novel approach to video-based pupil tracking
CN114005151B (en) Face angle sample collection and labeling method
CN109255327 Acquisition methods, face's plastic operation evaluation method and the device of face characteristic information
CN109145865A (en) Face standard level calculating method and device
WO2021248564A1 (en) Panoramic big data application monitoring and control system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant