CN107316023A - A face recognition system for shared devices - Google Patents

A face recognition system for shared devices Download PDF

Info

Publication number
CN107316023A
CN107316023A CN201710501944.0A
Authority
CN
China
Prior art keywords
user
sample
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710501944.0A
Other languages
Chinese (zh)
Inventor
昝立民
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN201710501944.0A
Publication of CN107316023A
Legal status: Pending


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168Feature extraction; Face representation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/32User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00Payment architectures, schemes or protocols
    • G06Q20/08Payment architectures
    • G06Q20/14Payment architectures specially adapted for billing systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172Classification, e.g. identification
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07FCOIN-FREED OR LIKE APPARATUS
    • G07F17/00Coin-freed apparatus for hiring articles; Coin-freed facilities or services
    • G07F17/0042Coin-freed apparatus for hiring articles; Coin-freed facilities or services for hiring of objects
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07FCOIN-FREED OR LIKE APPARATUS
    • G07F17/00Coin-freed apparatus for hiring articles; Coin-freed facilities or services
    • G07F17/10Coin-freed apparatus for hiring articles; Coin-freed facilities or services for means for safe-keeping of property, left temporarily, e.g. by fastening the property

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Health & Medical Sciences (AREA)
  • Accounting & Taxation (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Multimedia (AREA)
  • Finance (AREA)
  • Human Computer Interaction (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Security & Cryptography (AREA)
  • Development Economics (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Hardware Design (AREA)
  • Strategic Management (AREA)
  • Economics (AREA)
  • General Business, Economics & Management (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a face recognition system for shared devices, belonging to the field of pattern recognition. The system acquires user images, builds positive and negative samples to train a user identity detection module, identifies the user's identity with that module, and, once the identity is determined, retrieves the user's stored settings in the vending system and executes them. By authenticating users through biometric recognition, the invention lets a user shop and pay directly on a shared device without using a mobile phone terminal, enables accurate identity authentication, reduces cost, and further improves the user experience.

Description

A face recognition system for shared devices
【Technical field】
The present invention relates to the field of pattern recognition, and in particular to a face recognition system for shared devices.
【Background technology】
In recent years, with the rapid development of the computer industry, computer technology has penetrated deeply into people's lives and gradually merged with our living environment, giving rise to the concept of shared devices. A shared device system uses computer, communication, sensor and home-appliance technologies to connect various shared devices together and control them from a single terminal, thereby providing people with an extremely convenient living environment.
In a shared device system, in order to better serve users, their usage behaviour is usually collected and analysed so that personalised services can be offered and the user experience enhanced. Recognising the user therefore becomes particularly important. In the prior art, user identity can be authenticated by having the user carry an RFID card or another electronic device, but if that hardware is obtained by someone other than the user it can still pass authentication, and the user must carry the corresponding hardware at all times, which degrades the user experience and raises the cost of the vending system. The prior art also includes pattern recognition systems that detect biometric features to identify the user, such as face recognition and fingerprint recognition. However, these systems require the user to stay still or to perform the authentication steps the recognition system demands: for face authentication the user must stand directly in front of the device, and for fingerprint recognition the user must place a finger on the acquisition device. Such systems restrict the user's actions to some extent and reduce the user experience.
【The content of the invention】
In order to solve the identity authentication problem of existing shared devices, the present invention proposes a face recognition system for shared devices. The system includes: a shared device body, a cargo scanner, an identity authentication device, a gateway, a goods cabinet and a billing unit;
The cargo scanner and the billing unit are both connected to a control unit. Goods are placed in the goods cabinet; the cargo scanner scans the goods, obtains the goods information and transmits it to the control unit. After the user's identity is confirmed by the identity authentication device, the shared device is opened; after the user takes goods and closes the cabinet door of the shared device, the cargo scanner scans the goods again, obtains the goods information and transmits it to the control unit. The billing unit then computes the charge through the control unit and deducts the fee from the user's registered account.
Preferably, the identity authentication device includes: a cloud server, an image acquisition unit, a detection unit and a result output unit. The cloud server includes a database for storing user training samples. The image acquisition unit is arranged on the shared device terminal, and the detection unit and the result output unit are arranged in the gateway. A feature classifier is provided in the detection unit and is trained on the training samples. The detection unit identifies the user images collected by the image acquisition unit and obtains an identity authentication result, which the result output unit transfers to the gateway. The gateway extracts the user's usage habits for the shared device terminals and sends them to each shared device terminal; each terminal then automatically performs operations matching the user's usage habits or provides corresponding options for the user to choose from.
Preferably, the feature classifier is trained with an improved Boost scheme, and the training process of the feature classifier in the detection unit comprises the following steps:
A1. The user training samples are whole-body images of the user collected in advance while the user moves continuously. The training set contains N images in which the user is present, i.e. N positive samples, and L images in which the user is absent, i.e. L negative samples.
A2. Obtain the feature vector $X$ of each training sample, i.e. $X = (f_1(x), f_2(x), \ldots, f_K(x))^T$, where $f(x)$ denotes an image sample feature. The sample label is denoted $y$, with $y=1$ for a positive sample and $y=0$ for a negative sample. The posterior probability that $X$ is a positive sample can be expressed as

$$P(y=1 \mid X) = \frac{P(X \mid y=1)\,P(y=1)}{\sum_{y \in \{0,1\}} P(X \mid y)\,P(y)} = \delta\!\left(\ln\frac{P(X \mid y=1)\,P(y=1)}{P(X \mid y=0)\,P(y=0)}\right) \quad (1)$$

where the function $\delta(z)$ is defined as

$$\delta(z) = \frac{1}{1 + e^{-z}} \quad (2)$$

so that the classifier model can be established as

$$H_k(X) = \ln\frac{P(X \mid y=1)\,P(y=1)}{P(X \mid y=0)\,P(y=0)} \quad (3)$$

Combining formula (1) and formula (2), the posterior probability can be expressed as

$$P(y=1 \mid X) = \delta(H_k(X)) \quad (4)$$

For the feature vector $X$, the classifier $H_k(X)$ can be represented as

$$H_k(X) = \ln\frac{P(X \mid y=1)\,P(y=1)}{P(X \mid y=0)\,P(y=0)} = \sum_{k=1}^{K} h_k(f_k(x)) \quad (5)$$

where $h_k(f_k(x))$ denotes a weak classifier; K weak classifiers together constitute the strong classifier $H_k(X)$.
The positive and negative samples are put into two sets: the positive sample set $\{X_{1j},\ j = 0, \ldots, N-1\}$ and the negative sample set $\{X_{0j},\ j = N, \ldots, N+L-1\}$. Weak classifiers are repeatedly selected using groups of samples drawn from the two sets, and the combined classifier with the highest recognition rate is constructed. The posterior probability of a single sample is expressed as

$$P_{ij} = \delta(H_k(X_{ij})) \quad (6)$$

where the value of $i$ denotes the sample set ($i=1$ the positive set, $i=0$ the negative set) and $j$ is the sample number. The conditional probabilities inside the classifier $h_k(f_k(x_{ij}))$ are set to be Gaussian, i.e.

$$p(f_k(x_{ij}) \mid y=1) \sim N(\mu_1, \sigma_1)$$
$$p(f_k(x_{ij}) \mid y=0) \sim N(\mu_0, \sigma_0) \quad (7)$$

where $\mu_1, \sigma_1, \mu_0, \sigma_0$ can be updated incrementally:

$$\mu_1 \leftarrow \eta\,\mu_1 + (1-\eta)\,\frac{1}{N}\sum_{j \mid y_i = 1} f_k(x_{ij})$$
$$\sigma_1 \leftarrow \eta\,\sigma_1 + (1-\eta)\sqrt{\frac{1}{N}\sum_{j \mid y_i = 1}\bigl(f_k(x_{ij}) - \mu_1\bigr)^2} \quad (8)$$

The updates of $\mu_0$ and $\sigma_0$ have the same form as above. $P_{ij}$ can be obtained from formula (7) and formula (8), so the posterior probability of sample set $i$ can be expressed as a function of the $P_{ij}$.
A3. The face region is marked by hand in the first positive sample frame; a Kalman filter then obtains the face region in each subsequent frame, and a manual correction is applied every 10 frames to reduce the accumulated error of the Kalman filter.
A4. For the face region output by the Kalman filter, extract the face region features, generate feature vectors and form a feature vector set; weights are then set to refine the posterior probability of this feature vector set, where $w_{j0}$ is a monotonically decreasing weighting function of $|d(X_{1j}) - d(X_{10})|$, the Euclidean distance from sample $x_{1j}$ to the manually calibrated region of the first frame, and $c$ is a constant.
A5. After the posterior probability of the sample set is determined, a classifier is selected. The selection criterion is based on the current strong classifier, which contains $k-1$ weak classifiers, and on $L$, the log-likelihood function of the set, defined as

$$L = \sum_i \bigl( y_i \log P_i + (1 - y_i)\log(1 - P_i) \bigr) \quad (11)$$
Preferably, the features extracted from the user images are Haar-like features.
Preferably, the detection unit identifies the user images collected by the image acquisition unit as follows: acquire a user image, generate its feature vector, and input it into the feature classifier generated in the detection unit for identification.

The identification process further includes nearest-neighbour matching between the feature vectors identified as the user and the feature vectors of the face region output by the Kalman filter. The matching is based on Euclidean distance; feature vectors whose distance exceeds a predetermined threshold are added to the feature vector set output by the Kalman filter.

The purpose of this step is as follows: a feature vector that passes the authentication of the feature classifier, i.e. is recognised as the user, is matched against the feature vectors of the face region. If the Euclidean distance between two samples is below the predetermined threshold, the two samples belong to the same class, i.e. the same category as the samples already in the feature vector set. If the Euclidean distance exceeds the threshold, a new sample is assumed to have appeared; it is added to the feature vector set and the feature classifier is trained again.
The beneficial effects of the present invention are as follows: the user identity authentication system used in the vending system requires no extra hardware to be carried by the user; the user only needs to appear in front of the vending system and can be identified without performing any specific authentication step, which is convenient and improves the user experience. Moreover, the Kalman filter output is used to weight the selection of weak classifiers, which improves the classification accuracy.
【Brief description of the drawings】
The accompanying drawings described here are provided for a further understanding of the present invention and constitute part of this application, but do not unduly limit the present invention. In the drawings:
Fig. 1 is the overall structure diagram of the face recognition system for shared devices of the present invention.
Fig. 2 is the flow chart of the system of the present invention.
【Embodiment】
Embodiments of the invention are described in detail below, and examples of the embodiments are shown in the drawings, in which the same or similar reference numbers throughout denote the same or similar elements or elements with the same or similar functions. The embodiments described below with reference to the drawings are exemplary, are only used to explain the present invention, and are not to be construed as limiting it. On the contrary, the embodiments of the invention include all changes, modifications and equivalents falling within the spirit and scope of the appended claims.

In the description of the invention, it should be understood that terms such as "first" and "second" are used only for descriptive purposes and are not to be understood as indicating or implying relative importance. It should also be noted that, unless otherwise expressly specified and limited, the terms "connected" and "connection" are to be interpreted broadly: the connection may be fixed, detachable or integral; mechanical or electrical; direct, or indirect through an intermediary. For those of ordinary skill in the art, the specific meanings of the above terms in the present invention can be understood according to the specific situation. In addition, in the description of the invention, unless otherwise indicated, "multiple" means two or more.

Any process or system description in a flow chart or otherwise described herein is to be understood as representing a module, fragment or portion of code that includes one or more executable instructions for implementing the steps of a specific logical function or process, and the scope of the preferred embodiments of the present invention includes other implementations, in which functions may be performed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order according to the functions involved, as should be understood by those skilled in the art to which the embodiments of the invention belong.
The system to which the present invention is applied includes: a shared device body, a cargo scanner, an identity authentication device, a gateway, a goods cabinet and a billing unit;
The cargo scanner and the billing unit are both connected to a control unit. Goods are placed in the goods cabinet; the cargo scanner scans the goods, obtains the goods information and transmits it to the control unit. After the user's identity is confirmed by the identity authentication device, the shared device is opened; after the user takes goods and closes the cabinet door of the shared device, the cargo scanner scans the goods again, obtains the goods information and transmits it to the control unit. The billing unit then computes the charge through the control unit and deducts the fee from the user's registered account.
In the embodiment provided by the present invention, the identity authentication device is arranged on the shared device body and controls the cabinet door and the billing unit of the shared device. The identity authentication device in this application is a face recognition device, which collects the consumer's face image to determine the consumer's identity. First, the consumer registers identity information, which the backend of the shared device stores. When the consumer uses the shared device, the identity authentication device arranged on it collects the consumer's face image information and matches it against the consumer identity information stored in the backend. Once the user's identity is determined, the cabinet door of the shared device is unlocked; after the consumer takes goods and closes the cabinet door, the cargo scanner of the shared device scans the goods, obtains the goods information and transmits it to the control unit. The billing unit computes the charge through the control unit and deducts the fee from the user's registered account.
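The purchase-and-billing flow described above (unlock on a face match, scan the cabinet before and after, charge the difference) can be sketched in a few lines. This is a minimal illustration only: the item catalogue, prices and function names are hypothetical and not part of the patent.

```python
# Minimal sketch of the purchase flow: the registered account is charged
# for whatever disappeared from the cabinet between the two scans.
# Catalogue, prices and names are hypothetical (not in the patent).

PRICES = {"water": 2.0, "cola": 3.5, "snack": 5.0}

def bill_for_taken_goods(before, after, balance):
    """Charge the registered account for items removed from the cabinet.

    before/after: dicts of item -> count reported by the cargo scanner.
    balance: current balance of the user's registered account.
    Returns (charge, new_balance).
    """
    charge = 0.0
    for item, n_before in before.items():
        taken = n_before - after.get(item, 0)
        if taken > 0:
            charge += taken * PRICES[item]
    return charge, balance - charge
```

For example, taking one bottle of water between the two scans would charge 2.0 and deduct it from the account balance.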
The user's registered account can be bound to a payment method such as a bank card, Alipay or WeChat, so that payment is made after the consumer uses the shared device.
Referring to Fig. 1, the identity authentication device is shown, including: a cloud server, an image acquisition unit, a detection unit and a result output unit. The cloud server includes a database for storing user training samples. The image acquisition unit is arranged on the shared device terminal, and the detection unit and the result output unit are arranged in the gateway. A feature classifier is provided in the detection unit and is trained on the training samples. The detection unit identifies the user images collected by the image acquisition unit and obtains an identity authentication result, which the result output unit transfers to the gateway. The gateway extracts the user's usage habits for the shared device terminals and sends them to each shared device terminal; each terminal then automatically performs operations matching the user's usage habits or provides corresponding options for the user to choose from.
Referring to Fig. 2, the system flow chart of the present invention is shown. The feature classifier is trained with an improved Boost scheme, and the training process of the feature classifier in the detection unit comprises the following steps:
A1. The user training samples are whole-body images of the user collected in advance while the user moves continuously. The training set contains N images in which the user is present, i.e. N positive samples, and L images in which the user is absent, i.e. L negative samples.
A2. Obtain the feature vector $X$ of each training sample, i.e. $X = (f_1(x), f_2(x), \ldots, f_K(x))^T$, where $f(x)$ denotes an image sample feature. The sample label is denoted $y$, with $y=1$ for a positive sample and $y=0$ for a negative sample. The posterior probability that $X$ is a positive sample can be expressed as

$$P(y=1 \mid X) = \frac{P(X \mid y=1)\,P(y=1)}{\sum_{y \in \{0,1\}} P(X \mid y)\,P(y)} = \delta\!\left(\ln\frac{P(X \mid y=1)\,P(y=1)}{P(X \mid y=0)\,P(y=0)}\right) \quad (1)$$

where the function $\delta(z)$ is defined as

$$\delta(z) = \frac{1}{1 + e^{-z}} \quad (2)$$

so that the classifier model can be established as

$$H_k(X) = \ln\frac{P(X \mid y=1)\,P(y=1)}{P(X \mid y=0)\,P(y=0)} \quad (3)$$

Combining formula (1) and formula (2), the posterior probability can be expressed as

$$P(y=1 \mid X) = \delta(H_k(X)) \quad (4)$$

For the feature vector $X$, the classifier $H_k(X)$ can be represented as

$$H_k(X) = \ln\frac{P(X \mid y=1)\,P(y=1)}{P(X \mid y=0)\,P(y=0)} = \sum_{k=1}^{K} h_k(f_k(x)) \quad (5)$$

where $h_k(f_k(x))$ denotes a weak classifier; K weak classifiers together constitute the strong classifier $H_k(X)$.
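Formulas (2), (4) and (5) reduce to a few lines of code: each weak classifier contributes a log-likelihood-ratio score, the scores add up into the strong classifier $H_K(X)$, and the sigmoid turns the sum into a posterior probability. A minimal sketch (the function interface is assumed, not specified in the patent):

```python
import math

def sigmoid(z):
    # delta(z) = 1 / (1 + e^(-z)), formula (2)
    return 1.0 / (1.0 + math.exp(-z))

def strong_classifier(weak_scores):
    # H_K(X) = sum_k h_k(f_k(x)), formula (5): the weak
    # log-likelihood-ratio scores simply add up.
    return sum(weak_scores)

def posterior_positive(weak_scores):
    # P(y=1 | X) = delta(H_K(X)), formula (4)
    return sigmoid(strong_classifier(weak_scores))
```

With no evidence either way (all weak scores zero) the posterior is 0.5; positive scores push it toward 1 and negative scores toward 0.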
The positive and negative samples are put into two sets: the positive sample set $\{X_{1j},\ j = 0, \ldots, N-1\}$ and the negative sample set $\{X_{0j},\ j = N, \ldots, N+L-1\}$. Weak classifiers are repeatedly selected using groups of samples drawn from the two sets, and the combined classifier with the highest recognition rate is constructed. The posterior probability of a single sample is expressed as

$$P_{ij} = \delta(H_k(X_{ij})) \quad (6)$$

where the value of $i$ denotes the sample set ($i=1$ the positive set, $i=0$ the negative set) and $j$ is the sample number. The conditional probabilities inside the classifier $h_k(f_k(x_{ij}))$ are set to be Gaussian, i.e.

$$p(f_k(x_{ij}) \mid y=1) \sim N(\mu_1, \sigma_1)$$
$$p(f_k(x_{ij}) \mid y=0) \sim N(\mu_0, \sigma_0) \quad (7)$$

where $\mu_1, \sigma_1, \mu_0, \sigma_0$ can be updated incrementally:

$$\mu_1 \leftarrow \eta\,\mu_1 + (1-\eta)\,\frac{1}{N}\sum_{j \mid y_i = 1} f_k(x_{ij})$$
$$\sigma_1 \leftarrow \eta\,\sigma_1 + (1-\eta)\sqrt{\frac{1}{N}\sum_{j \mid y_i = 1}\bigl(f_k(x_{ij}) - \mu_1\bigr)^2} \quad (8)$$

The updates of $\mu_0$ and $\sigma_0$ have the same form as above. $P_{ij}$ can be obtained from formula (7) and formula (8), so the posterior probability of sample set $i$ can be expressed as a function of the $P_{ij}$.
A3. The face region is marked by hand in the first positive sample frame; a Kalman filter then obtains the face region in each subsequent frame, and a manual correction is applied every 10 frames to reduce the accumulated error of the Kalman filter.
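Step A3 can be illustrated with a deliberately simplified scalar Kalman filter, one per face-box coordinate, re-seeded from a hand label every 10 frames. The noise parameters and the `manual` callback are assumptions for illustration; the patent does not specify the filter internals.

```python
class ScalarKalman:
    """Deliberately simplified 1-D Kalman filter; run one instance per
    face-box coordinate (x, y, w, h). Noise values are illustrative."""

    def __init__(self, x0, p0=1.0, q=0.01, r=0.1):
        self.x, self.p, self.q, self.r = x0, p0, q, r

    def step(self, z):
        # predict with an identity motion model, then correct with z
        p_pred = self.p + self.q
        k = p_pred / (p_pred + self.r)   # Kalman gain
        self.x = self.x + k * (z - self.x)
        self.p = (1.0 - k) * p_pred
        return self.x

def track(measurements, correction_every=10, manual=None):
    """Mirror step A3: filter every frame, but re-seed from a hand label
    every `correction_every` frames to curb accumulated error."""
    kf = ScalarKalman(measurements[0])
    estimates = []
    for i, z in enumerate(measurements):
        if manual is not None and i % correction_every == 0:
            kf.x = manual(i)             # hand-labelled ground truth
        estimates.append(kf.step(z))
    return estimates
```

Starting from 0 and feeding a constant measurement of 1.0, the estimate rises monotonically toward 1 without overshooting, which is the expected smoothing behaviour.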
A4. For the face region output by the Kalman filter, extract the face region features, generate feature vectors and form a feature vector set; weights are then set to refine the posterior probability of this feature vector set, where $w_{j0}$ is a monotonically decreasing weighting function of $|d(X_{1j}) - d(X_{10})|$, the Euclidean distance from sample $x_{1j}$ to the manually calibrated region of the first frame, and $c$ is a constant.
A5. After the posterior probability of the sample set is determined, a classifier is selected. The selection criterion is based on the current strong classifier, which contains $k-1$ weak classifiers, and on $L$, the log-likelihood function of the set, defined as

$$L = \sum_i \bigl( y_i \log P_i + (1 - y_i)\log(1 - P_i) \bigr) \quad (11)$$
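The selection criterion of step A5 can be sketched as a greedy search: compute the log-likelihood $L$ of formula (11) for each candidate weak classifier and keep the best. The candidate interface below is hypothetical; the patent only specifies the criterion itself.

```python
import math

def log_likelihood(posteriors, labels):
    # L = sum_i ( y_i * log P_i + (1 - y_i) * log(1 - P_i) ), formula (11)
    eps = 1e-12                       # guard against log(0)
    return sum(y * math.log(p + eps) + (1 - y) * math.log(1.0 - p + eps)
               for p, y in zip(posteriors, labels))

def select_weak(candidates, labels):
    """Greedy A5 step: pick the candidate weak classifier whose resulting
    posteriors maximise L. `candidates` maps a name to the posteriors P_i
    obtained by adding that weak classifier to the current strong
    classifier (a hypothetical interface)."""
    return max(candidates,
               key=lambda name: log_likelihood(candidates[name], labels))
```

A candidate that assigns high posteriors to positives and low posteriors to negatives dominates one that sits on the fence, so the greedy step always prefers it.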
Preferably, the features extracted from the user images are Haar-like features.
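Haar-like features are conventionally computed in constant time from an integral image (summed-area table). The sketch below shows a classic two-rectangle feature; the patent names the feature type but not the implementation, so this is a standard reconstruction, not the patent's own code.

```python
def integral_image(img):
    """Summed-area table: ii[y][x] = sum of img[0..y][0..x]."""
    h, w = len(img), len(img[0])
    ii = [[0] * w for _ in range(h)]
    for y in range(h):
        row = 0
        for x in range(w):
            row += img[y][x]
            ii[y][x] = row + (ii[y - 1][x] if y > 0 else 0)
    return ii

def rect_sum(ii, x, y, w, h):
    """Sum of the w x h rectangle with top-left (x, y), in O(1)."""
    a = ii[y + h - 1][x + w - 1]
    b = ii[y - 1][x + w - 1] if y > 0 else 0
    c = ii[y + h - 1][x - 1] if x > 0 else 0
    d = ii[y - 1][x - 1] if x > 0 and y > 0 else 0
    return a - b - c + d

def haar_two_rect_horizontal(ii, x, y, w, h):
    """Classic two-rectangle Haar-like feature: left half minus right half."""
    half = w // 2
    return rect_sum(ii, x, y, half, h) - rect_sum(ii, x + half, y, half, h)
```

Because each feature needs only a handful of table lookups, thousands of candidate features can be evaluated per window, which is what makes Haar-like features practical for boosted face detection.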
Preferably, the detection unit identifies the user images collected by the image acquisition unit as follows: acquire a user image, generate its feature vector, and input it into the feature classifier generated in the detection unit for identification.

The identification process further includes nearest-neighbour matching between the feature vectors identified as the user and the feature vectors of the face region output by the Kalman filter. The matching is based on Euclidean distance; feature vectors whose distance exceeds a predetermined threshold are added to the feature vector set output by the Kalman filter.

The purpose of this step is as follows: a feature vector that passes the authentication of the feature classifier, i.e. is recognised as the user, is matched against the feature vectors of the face region. If the Euclidean distance between two samples is below the predetermined threshold, the two samples belong to the same class, i.e. the same category as the samples already in the feature vector set. If the Euclidean distance exceeds the threshold, a new sample is assumed to have appeared; it is added to the feature vector set and the feature classifier is trained again.
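The nearest-neighbour matching and add-to-set rule described above can be sketched directly; the threshold value and the function names are illustrative assumptions.

```python
import math

def euclidean(u, v):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def match_or_add(vec, feature_set, threshold=0.5):
    """Nearest-neighbour rule described above: if the closest vector in the
    Kalman-output feature set lies within `threshold`, `vec` belongs to the
    same class; otherwise it is treated as a new sample and added to the
    set so the feature classifier can be retrained later.

    Returns (matched, feature_set). The threshold value is illustrative.
    """
    nearest = min((euclidean(vec, f) for f in feature_set),
                  default=float("inf"))
    if nearest <= threshold:
        return True, feature_set
    feature_set.append(vec)          # new sample: schedule retraining
    return False, feature_set
```

A vector close to an existing one is absorbed as the same user; a distant one grows the set, which is exactly the retraining trigger the description calls for.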
The above is only a preferred embodiment of the present invention; all equivalent changes or modifications made according to the constructions, features and principles described in the scope of this patent application are included within the scope of this patent application.

Claims (6)

1. A face recognition system for shared devices, characterised in that the system includes: a shared device body, a cargo scanner, an identity authentication device, a gateway, a goods cabinet and a billing unit;
the cargo scanner and the billing unit are both connected to a control unit; goods are placed in the goods cabinet; the cargo scanner scans the goods, obtains the goods information and transmits it to the control unit; after the user's identity is confirmed by the identity authentication device, the shared device is opened; after the user takes goods and closes the cabinet door of the shared device, the cargo scanner scans the goods, obtains the goods information and transmits it to the control unit; the billing unit computes the charge through the control unit and deducts the fee from the user's registered account.
2. The system according to claim 1, characterised in that the identity authentication device includes: a cloud server, an image acquisition unit, a detection unit and a result output unit; the cloud server includes a database for storing user training samples; the image acquisition unit is arranged on the shared device terminal, and the detection unit and the result output unit are arranged in the gateway; a feature classifier trained on the training samples is provided in the detection unit; the detection unit identifies the user images collected by the image acquisition unit and obtains an identity authentication result; the result output unit transfers the user identity authentication result to the gateway; the gateway extracts the user's usage habits for the shared device terminals and sends them to each shared device terminal, and each shared device terminal automatically performs operations matching the user's usage habits or provides corresponding options for the user to choose from.
3. The system according to claim 2, characterised in that the feature classifier is trained with an improved Boost scheme, and the training process of the feature classifier in the detection unit comprises the following steps:
A1. the user training samples are whole-body images of the user collected in advance while the user moves continuously; the training set contains N images in which the user is present, i.e. N positive samples, and L images in which the user is absent, i.e. L negative samples;
A2. obtain the feature vector $X$ of each training sample, i.e. $X = (f_1(x), f_2(x), \ldots, f_K(x))^T$, where $f(x)$ denotes an image sample feature; the sample label is denoted $y$, with $y=1$ for a positive sample and $y=0$ for a negative sample; the posterior probability that $X$ is a positive sample can be expressed as
<mrow> <mi>P</mi> <mrow> <mo>(</mo> <mi>y</mi> <mo>=</mo> <mn>1</mn> <mo>|</mo> <mi>X</mi> <mo>)</mo> </mrow> <mo>=</mo> <mfrac> <mrow> <mi>P</mi> <mrow> <mo>(</mo> <mi>X</mi> <mo>|</mo> <mi>y</mi> <mo>=</mo> <mn>1</mn> <mo>)</mo> </mrow> <mi>P</mi> <mrow> <mo>(</mo> <mi>Y</mi> <mo>=</mo> <mn>1</mn> <mo>)</mo> </mrow> </mrow> <mrow> <msub> <mi>&amp;Sigma;</mi> <mrow> <mi>y</mi> <mo>&amp;Element;</mo> <mo>{</mo> <mn>0</mn> <mo>,</mo> <mn>1</mn> <mo>}</mo> </mrow> </msub> <mi>P</mi> <mrow> <mo>(</mo> <mi>X</mi> <mo>|</mo> <mi>y</mi> <mo>)</mo> </mrow> <mi>P</mi> <mrow> <mo>(</mo> <mi>y</mi> <mo>)</mo> </mrow> </mrow> </mfrac> <mo>=</mo> <mi>&amp;delta;</mi> <mrow> <mo>(</mo> <mi>l</mi> <mi>n</mi> <mo>(</mo> <mfrac> <mrow> <mi>P</mi> <mrow> <mo>(</mo> <mi>X</mi> <mo>|</mo> <mi>y</mi> <mo>=</mo> <mn>1</mn> <mo>)</mo> </mrow> <mi>P</mi> <mrow> <mo>(</mo> <mi>y</mi> <mo>=</mo> <mn>1</mn> <mo>)</mo> </mrow> </mrow> <mrow> <mi>P</mi> <mrow> <mo>(</mo> <mi>X</mi> <mo>|</mo> <mi>y</mi> <mo>=</mo> <mn>0</mn> <mo>)</mo> </mrow> <mi>P</mi> <mrow> <mo>(</mo> <mi>y</mi> <mo>=</mo> <mn>0</mn> <mo>)</mo> </mrow> </mrow> </mfrac> <mo>)</mo> <mo>)</mo> </mrow> <mo>-</mo> <mo>-</mo> <mo>-</mo> <mrow> <mo>(</mo> <mn>1</mn> <mo>)</mo> </mrow> </mrow>
Wherein function δ (z) is defined as
<mrow> <mi>&amp;delta;</mi> <mrow> <mo>(</mo> <mi>z</mi> <mo>)</mo> </mrow> <mo>=</mo> <mfrac> <mn>1</mn> <mrow> <mn>1</mn> <mo>+</mo> <msup> <mi>e</mi> <mrow> <mo>-</mo> <mi>z</mi> </mrow> </msup> </mrow> </mfrac> <mo>-</mo> <mo>-</mo> <mo>-</mo> <mrow> <mo>(</mo> <mn>2</mn> <mo>)</mo> </mrow> </mrow>
So as to set up sorter model
<mrow> <msub> <mi>H</mi> <mi>k</mi> </msub> <mrow> <mo>(</mo> <mi>X</mi> <mo>)</mo> </mrow> <mo>=</mo> <mi>l</mi> <mi>n</mi> <mrow> <mo>(</mo> <mfrac> <mrow> <mi>P</mi> <mrow> <mo>(</mo> <mi>X</mi> <mo>|</mo> <mi>y</mi> <mo>=</mo> <mn>1</mn> <mo>)</mo> </mrow> <mi>P</mi> <mrow> <mo>(</mo> <mi>y</mi> <mo>=</mo> <mn>1</mn> <mo>)</mo> </mrow> </mrow> <mrow> <mi>P</mi> <mrow> <mo>(</mo> <mi>X</mi> <mo>|</mo> <mi>y</mi> <mo>=</mo> <mn>0</mn> <mo>)</mo> </mrow> <mi>P</mi> <mrow> <mo>(</mo> <mi>y</mi> <mo>=</mo> <mn>0</mn> <mo>)</mo> </mrow> </mrow> </mfrac> <mo>)</mo> </mrow> <mo>-</mo> <mo>-</mo> <mo>-</mo> <mrow> <mo>(</mo> <mn>3</mn> <mo>)</mo> </mrow> </mrow>
Convolution (1) and formula (2), posterior probability can be expressed as
P (y=1 | X)=δ (Hk(X)) (4)
For the grader H of feature vector, Xk(X) it is represented by
<mrow> <msub> <mi>H</mi> <mi>k</mi> </msub> <mrow> <mo>(</mo> <mi>X</mi> <mo>)</mo> </mrow> <mo>=</mo> <mi>l</mi> <mi>n</mi> <mrow> <mo>(</mo> <mfrac> <mrow> <mi>P</mi> <mrow> <mo>(</mo> <mi>X</mi> <mo>|</mo> <mi>y</mi> <mo>=</mo> <mn>1</mn> <mo>)</mo> </mrow> <mi>P</mi> <mrow> <mo>(</mo> <mi>y</mi> <mo>=</mo> <mn>1</mn> <mo>)</mo> </mrow> </mrow> <mrow> <mi>P</mi> <mrow> <mo>(</mo> <mi>X</mi> <mo>|</mo> <mi>y</mi> <mo>=</mo> <mn>0</mn> <mo>)</mo> </mrow> <mi>P</mi> <mrow> <mo>(</mo> <mi>y</mi> <mo>=</mo> <mn>0</mn> <mo>)</mo> </mrow> </mrow> </mfrac> <mo>)</mo> </mrow> <mo>=</mo> <munderover> <mo>&amp;Sigma;</mo> <mrow> <mi>k</mi> <mo>=</mo> <mn>1</mn> </mrow> <mi>K</mi> </munderover> <msub> <mi>h</mi> <mi>k</mi> </msub> <mrow> <mo>(</mo> <msub> <mi>f</mi> <mi>k</mi> </msub> <mo>(</mo> <mi>x</mi> <mo>)</mo> <mo>)</mo> </mrow> <mo>-</mo> <mo>-</mo> <mo>-</mo> <mrow> <mo>(</mo> <mn>5</mn> <mo>)</mo> </mrow> </mrow>
where h_k(f_k(x)) denotes a weak classifier; K weak classifiers together constitute the strong classifier H_k(X);
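As a minimal sketch, assuming each weak classifier h_k scores a scalar feature f_k(x), formula (5) amounts to summing K weak scores:

```python
def strong_classifier(weak_classifiers, feature_extractors, x):
    # H_k(X) = sum_{k=1}^{K} h_k(f_k(x)), formula (5)
    return sum(h(f(x)) for h, f in zip(weak_classifiers, feature_extractors))
```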
The positive and negative samples are placed into two sets: the positive sample set {X_1j, j = 0 ... N-1} and the negative sample set {X_0j, j = N ... N+L-1}. Weak classifiers are repeatedly selected from the positive and negative sample sets using multiple groups of samples, so as to construct the combined classifier with the highest recognition rate. The posterior probability of a single sample is expressed as
$$P_{ij} = \delta(H_k(X_{ij})) \qquad (6)$$
where i denotes the index of the sample set (i = 1 for the positive set, i = 0 for the negative set) and j is the sample index. The conditional probabilities in the classifier h_k(f_k(x_ij)) are set to be Gaussian, i.e. the conditional probabilities are
$$p(f_k(x_{ij}) \mid y=1) \sim N(\mu_1, \sigma_1)$$
$$p(f_k(x_{ij}) \mid y=0) \sim N(\mu_0, \sigma_0) \qquad (7)$$
where the parameters μ_1, σ_1, μ_0, σ_0 can be updated incrementally:
$$\mu_1 \leftarrow \eta\mu_1 + (1-\eta)\,\frac{1}{N}\sum_{j \mid y_i=1} f_k(x_{ij})$$
$$\sigma_1 \leftarrow \eta\sigma_1 + (1-\eta)\,\sqrt{\frac{1}{N}\sum_{j \mid y_i=1}\left(f_k(x_{ij}) - \mu_1\right)^2} \qquad (8)$$
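A sketch of the incremental update in formula (8). The learning rate η = 0.85 is an assumed value, and the deviations are taken from the just-updated mean, which is one common reading of the formula:

```python
def update_gaussian(mu, sigma, feats, eta=0.85):
    # mu_1 <- eta * mu_1 + (1 - eta) * mean of f_k(x_ij) over new samples
    n = len(feats)
    new_mu = eta * mu + (1.0 - eta) * sum(feats) / n
    # sigma_1 <- eta * sigma_1 + (1 - eta) * sqrt(mean squared deviation)
    msd = sum((f - new_mu) ** 2 for f in feats) / n
    new_sigma = eta * sigma + (1.0 - eta) * msd ** 0.5
    return new_mu, new_sigma
```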
The updates of μ_0 and σ_0 take the same form as formula (8). P_ij can be obtained from formulas (7) and (8); the posterior probability of sample set i can then be expressed as
$$P_i = 1 - \prod_j \left(1 - P_{ij}\right) \qquad (9)$$
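Formula (9) is a noisy-OR combination: a set is positive as soon as at least one of its samples is. A sketch:

```python
def set_posterior(instance_posteriors):
    # P_i = 1 - prod_j (1 - P_ij), formula (9)
    prod = 1.0
    for p in instance_posteriors:
        prod *= 1.0 - p
    return 1.0 - prod
```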
A3. The face region in the first frame of the positive-sample images is marked manually; a Kalman filter is then used to obtain the face region in each subsequent frame, and manual correction is performed every 10 frames to reduce the accumulated error of the Kalman filter;
A4. For the face region output by the Kalman filter, the features of that region are extracted to generate feature vectors, which form a feature vector set; weights are set to raise the posterior probability of this feature vector set:
$$P_{i=1} = P(y=1 \mid X^{+}) = \sum_{j=0}^{N-1} w_{j0} P_i \qquad (12)$$
where w_j0 is a monotonically decreasing weighting function of |d(X_1j) − d(X_10)|, the Euclidean distance from sample x_1j to the manually calibrated region of the first frame, and c is a constant;
A5. After the posterior probability of the sample sets is determined, a classifier is selected; the selection criterion is expressed as
$$h_k = \mathop{\arg\max}_{h \in \{h_1, \dots, h_M\}} l(H_{k-1} + h) \qquad (10)$$
where H_{k-1} is the strong classifier comprising the first k−1 weak classifiers, and l is the log-likelihood function over the sets, defined as
$$l = \sum_i \left( y_i \log P_i + (1 - y_i) \log(1 - P_i) \right) \qquad (11)$$
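Formulas (9)–(11) together give the greedy selection rule of formula (10): each candidate weak classifier is scored by the set log-likelihood it achieves when added to the current strong classifier. A sketch, under the assumption that classifiers are plain score functions:

```python
import math

def delta(z):
    # Logistic function of formula (2)
    return 1.0 / (1.0 + math.exp(-z))

def set_log_likelihood(score_fn, sets, labels):
    # l = sum_i ( y_i log P_i + (1 - y_i) log(1 - P_i) ), formula (11)
    eps = 1e-12  # guard against log(0)
    total = 0.0
    for samples, y in zip(sets, labels):
        prod = 1.0
        for x in samples:
            prod *= 1.0 - delta(score_fn(x))  # P_ij = delta(H(X_ij)), formula (6)
        p_i = 1.0 - prod                      # noisy-OR, formula (9)
        total += y * math.log(max(p_i, eps)) + (1 - y) * math.log(max(1.0 - p_i, eps))
    return total

def select_weak_classifier(candidates, strong_fn, sets, labels):
    # h_k = argmax_{h in {h_1 ... h_M}} l(H_{k-1} + h), formula (10)
    return max(candidates,
               key=lambda h: set_log_likelihood(lambda x: strong_fn(x) + h(x),
                                                sets, labels))
```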
4. The system according to claim 2, characterized in that the features extracted from the user images are Haar-like features.
5. The system according to claim 2 or 3, characterized in that the detection unit identifies the user images captured by the image acquisition unit as follows: the user images are obtained, feature vectors are generated from them, and the feature classifiers in the detection unit perform the identification;
The identification process further comprises nearest-neighbour matching of the feature vectors identified as the user against the feature vectors of the face region output by the Kalman filter; the nearest-neighbour matching is based on Euclidean distance, and feature vectors whose distance value exceeds a predetermined threshold are added to the feature vector set output by the Kalman filter.
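A sketch of the nearest-neighbour update described in claim 5: a candidate feature vector whose Euclidean distance to its nearest neighbour in the stored set exceeds a predetermined threshold is added to the set. The function names and the threshold value are illustrative, not from the patent:

```python
import math

def euclidean(a, b):
    # Euclidean distance between two feature vectors
    return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))

def update_feature_set(feature_set, candidates, threshold):
    # Add only candidates whose nearest-neighbour distance to the
    # stored set exceeds the threshold, per the claim's description.
    for c in candidates:
        if min(euclidean(c, f) for f in feature_set) > threshold:
            feature_set.append(c)
    return feature_set
```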
6. The system according to claim 2, characterized in that the detection unit identifies the user's face captured by the image acquisition unit.
CN201710501944.0A 2017-06-27 2017-06-27 A kind of face identification system for being used to share equipment Pending CN107316023A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710501944.0A CN107316023A (en) 2017-06-27 2017-06-27 A kind of face identification system for being used to share equipment


Publications (1)

Publication Number Publication Date
CN107316023A true CN107316023A (en) 2017-11-03

Family

ID=60180370

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710501944.0A Pending CN107316023A (en) 2017-06-27 2017-06-27 A kind of face identification system for being used to share equipment

Country Status (1)

Country Link
CN (1) CN107316023A (en)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102609678A (en) * 2011-01-24 2012-07-25 台湾色彩与影像科技股份有限公司 Intelligent self-service system for face recognition
CN103544775A (en) * 2013-10-25 2014-01-29 上海煦荣信息技术有限公司 Vending machines, vending system and vending method for same
CN203773629U (en) * 2013-11-06 2014-08-13 上海煦荣信息技术有限公司 Intelligent self-service vending system
CN105427459A (en) * 2014-09-17 2016-03-23 黄浩庭 Self-help vending method and vending machine
CN205722151U (en) * 2016-04-06 2016-11-23 上海英内物联网科技股份有限公司 A kind of self-selecting type Intelligent vending machine with face identification functions
CN106204918A (en) * 2016-03-08 2016-12-07 青岛海尔特种电冰柜有限公司 Intelligence selling cabinet and intelligence selling system
CN106469369A (en) * 2016-11-03 2017-03-01 林杰 A kind of automatic selling method based on Internet of Things and free supermarket
CN206097289U (en) * 2016-04-26 2017-04-12 上海英内物联网科技股份有限公司 Intelligent vending machine who has face identification function based on internet of things


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
MENG Fanjie et al.: "Research on a Panoramic 3D Tracking Algorithm Based on a Spherical Camera Model", China Master's Theses Full-text Database, Information Science and Technology *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109147199A (en) * 2018-07-09 2019-01-04 保山市质量技术监督综合检测中心 A kind of device management method and system based on fingerprint recognition
CN109711473A (en) * 2018-12-29 2019-05-03 北京沃东天骏信息技术有限公司 Item identification method, equipment and system
CN111080917A (en) * 2019-12-04 2020-04-28 万翼科技有限公司 Self-help article borrowing and returning method and related equipment
CN111080917B (en) * 2019-12-04 2021-12-17 万翼科技有限公司 Self-help article borrowing and returning method and related equipment
CN111209567A (en) * 2019-12-30 2020-05-29 北京邮电大学 Method and device for judging perceptibility of improving robustness of detection model
CN111209567B (en) * 2019-12-30 2022-05-03 北京邮电大学 Method and device for judging perceptibility of improving robustness of detection model

Similar Documents

Publication Publication Date Title
Ferrari et al. On the personalization of classification models for human activity recognition
US8320643B2 (en) Face authentication device
CN107122641B (en) Intelligent equipment owner identification method and intelligent equipment owner identification device based on use habit
CN107316023A (en) A kind of face identification system for being used to share equipment
CN106096576B (en) A kind of intelligent Service method of robot
Yang et al. Sieving regression forest votes for facial feature detection in the wild
Qin et al. A fuzzy authentication system based on neural network learning and extreme value statistics
CN105184304B (en) Image identification device and characteristic quantity data register method to image identification device
CN107220582A (en) Recognize the driver of vehicle
CN108875491A (en) Data-updating method, authenticating device and the system and non-volatile memory medium of face unlock certification
US9798942B2 (en) Biometric recognition method with speed and security feature suitable for POS/ATM applications
CN107644218B (en) The working method that crowded region behavior analyzes and determines is realized based on image collecting function
CN106303599A (en) A kind of information processing method, system and server
WO2017192719A1 (en) User specific classifiers for biometric liveness detection
CN102968645A (en) Method for improving face recognition accuracy rate and adaptability through updating of images
Kaur et al. Fusion in multimodal biometric system: A review
Sastry et al. A team of continuous-action learning automata for noise-tolerant learning of half-spaces
CN111831989A (en) Identity authentication method and device based on multi-mode PPG signal
Fu et al. A feature-based on-line detector to remove adversarial-backdoors by iterative demarcation
Yang et al. Retraining and dynamic privilege for implicit authentication systems
Takahashi et al. A review of off-line mode dataset shifts
CN107330398A (en) The method and device of iris recognition in a kind of shared selling apparatus
CN109308782B (en) Behavior detection method and system for target object
Fernandez Biometric sample quality and its application to multimodal authentication systems
CN114971628A (en) Digital RMB biological payment method, system, terminal equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20171103