EP3432790A1 - System and methods for authenticating vital sign measurements for biometrics detection using photoplethysmography via remote sensors - Google Patents

System and methods for authenticating vital sign measurements for biometrics detection using photoplethysmography via remote sensors

Info

Publication number
EP3432790A1
Authority
EP
European Patent Office
Prior art keywords
heart rate
rate value
user
fhrv
shrv
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP17769567.3A
Other languages
German (de)
French (fr)
Other versions
EP3432790A4 (en)
Inventor
Aviram SIBONI
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Multisense Bv
Original Assignee
Multisense Bv
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Multisense Bv filed Critical Multisense Bv
Publication of EP3432790A1
Publication of EP3432790A4

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/0002: Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B 5/0015: Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system
    • A61B 5/0022: Monitoring a patient using a global network, e.g. telephone networks, internet
    • A61B 5/02: Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B 5/024: Detecting, measuring or recording pulse rate or heart rate
    • A61B 5/02405: Determining heart rate variability
    • A61B 5/02416: Detecting, measuring or recording pulse rate or heart rate using photoplethysmograph signals, e.g. generated by infrared radiation
    • A61B 5/08: Detecting, measuring or recording devices for evaluating the respiratory organs
    • A61B 5/0816: Measuring devices for examining respiratory frequency
    • A61B 5/117: Identification of persons
    • A61B 5/1171: Identification of persons based on the shapes or appearances of their bodies or parts thereof
    • A61B 5/1176: Recognition of faces
    • A61B 5/68: Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6887: Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B 5/6898: Portable consumer electronic devices, e.g. music players, telephones, tablet computers
    • A61B 5/72: Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7235: Details of waveform analysis
    • A61B 5/7264: Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B 5/7271: Specific aspects of physiological measurement analysis
    • A61B 5/7278: Artificial waveform generation or derivation, e.g. synthesising signals from measured signals
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30: Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31: User authentication
    • G06F 21/32: User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/15: Biometric patterns based on physiological signals, e.g. heartbeat, blood flow
    • G06V 40/16: Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/161: Detection; Localisation; Normalisation
    • G06V 40/70: Multimodal biometrics, e.g. combining information from different biometric modalities
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/20: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 63/00: Network architectures or network communication protocols for network security
    • H04L 63/08: Network architectures or network communication protocols for network security for authentication of entities
    • H04L 63/0861: Network architectures or network communication protocols for network security for authentication of entities using biometrical features, e.g. fingerprint, retina-scan
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W 12/00: Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W 12/06: Authentication
    • H04W 12/065: Continuous authentication


Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Pathology (AREA)
  • Public Health (AREA)
  • Biophysics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Signal Processing (AREA)
  • Veterinary Medicine (AREA)
  • Computer Security & Cryptography (AREA)
  • Cardiology (AREA)
  • Physiology (AREA)
  • Theoretical Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Psychiatry (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

A method of authenticating vital sign measurements for biometrics detection of a user via a mobile device comprises the steps of: (a) obtaining video of a first body region and a second body region of the user via at least one front camera sensor and at least one back camera sensor, respectively; (b) extracting pulsatile signals of the user using photoplethysmography imaging (PPGi) analysis of the video obtained from the first and second body regions; and (c) calculating a first heart rate value (FHRV) and a second heart rate value (SHRV). The videos are obtained simultaneously, and a comparative output value between the first heart rate value (FHRV) and the second heart rate value (SHRV) is calculated.

Description

SYSTEM AND METHODS FOR AUTHENTICATING VITAL SIGN
MEASUREMENTS FOR BIOMETRICS DETECTION USING PHOTOPLETHYSMOGRAPHY VIA REMOTE SENSORS
FIELD OF THE INVENTION
The present invention relates generally to the field of object class detection and authentication. More specifically, the present invention relates to a biometrics recognition and authentication system and methods based on remote photoplethysmography measurement and analysis of skin color variations for observing human vital signs, including, but not limited to, average heart rate, heart rate variation and respiratory rate, using non-invasive, remote, passive sensors of a mobile device, i.e. a camera.
BACKGROUND OF THE INVENTION
[2] The following description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
[3] In the last decades, biometrics detection has been a rapidly growing research area. Biometrics detection has applications in many areas, such as intelligent human-computer interfaces, crowd surveillance, security authentication, video coding, video conferencing, and content-based image retrieval. Among others, the human face is a dynamic object with great variability in its appearance, which makes biometrics detection a difficult problem in computer vision and many other applications. In order to improve the accuracy of biometrics detection, some efficient methods have been developed. Traditionally, methods that focus on facial landmarks (such as eyes, nose, etc.), that detect face-like colors in circular regions, or that use standard feature templates were used to detect faces. However, these attempts do not fundamentally improve the accuracy of biometrics detection. Biometrics detection remains a challenge because of several difficulties, such as variable face orientations, different face sizes, partial occlusions of faces in an image, and changeable lighting conditions. With the purpose of locating and extracting the face region from the background, various systems and methods for indicating skin color characteristics exist in the art.
[4] Robust biometrics detection using the Hausdorff distance, by Jesorsky, O., Kirchberg, K.J., Frischholz, R.W., discloses a shape comparison approach to achieve fast, accurate biometrics detection that is robust to changes in illumination and background. The proposed method is edge-based and works on grayscale still images. The Hausdorff distance is used as a similarity measure between a general face model and possible instances of the object within the image.
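As an illustrative aside (not part of the patent disclosure; the point sets and variable names below are arbitrary), a minimal sketch of the Hausdorff distance as a similarity measure between two edge point sets, using SciPy:

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

# Two 2-D point sets, e.g. edge pixels of a face model and of an image region.
model_edges = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
image_edges = np.array([[0.1, 0.0], [0.0, 1.1], [1.2, 0.1], [0.9, 1.0]], dtype=float)

# The symmetric Hausdorff distance is the larger of the two directed distances:
# the worst-case distance from a point in one set to its nearest point in the other.
d_model_to_image = directed_hausdorff(model_edges, image_edges)[0]
d_image_to_model = directed_hausdorff(image_edges, model_edges)[0]
hausdorff = max(d_model_to_image, d_image_to_model)
print(f"Hausdorff distance: {hausdorff:.3f}")  # smaller means a better match
```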
[5] Robust real-time biometrics detection, by Viola, P., Jones, M.J., discloses an image representation called the "Integral Image", which allows the features used by the detector to be computed very quickly; a classifier built using the AdaBoost learning algorithm (Freund and Schapire, 1995) to select a small number of critical visual features from a very large set of potential features; and a method for combining classifiers in a "cascade", which allows background regions of the image to be quickly discarded while spending more computation on promising face-like regions.
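Purely for illustration (a minimal sketch of the integral-image idea, not code from the cited work): after one cumulative-sum pass over the image, the sum of any rectangle can be read off from four corner lookups.

```python
import numpy as np

def integral_image(img):
    """Zero-padded integral image: ii[r, c] = sum of img[:r, :c]."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.int64)
    ii[1:, 1:] = img.cumsum(axis=0).cumsum(axis=1)
    return ii

def rect_sum(ii, r0, c0, r1, c1):
    """Sum of img[r0:r1, c0:c1] (exclusive bounds) via four lookups."""
    return ii[r1, c1] - ii[r0, c1] - ii[r1, c0] + ii[r0, c0]

img = np.arange(16, dtype=np.int64).reshape(4, 4)
ii = integral_image(img)
assert rect_sum(ii, 1, 1, 3, 4) == img[1:3, 1:4].sum()  # any rectangle, O(1)
```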
[6] Feature-based biometrics detection against skin-color like backgrounds with varying illumination, by Hu, W.C., Yang, C.Y., Huang, D.Y., Huang, C.H., discloses a three-stage scheme for real-time, reliable biometrics detection. The proposed three-stage scheme is a feature-based method that is mainly based on skin color and facial features. Skin regions are obtained using a YCbCr skin-color model in the first stage. In the second stage, a face template measure is used to obtain face candidates and then a suitable face box is used to effectively remove non-face regions from the face candidates. Finally, facial features are measured to detect faces from face candidates in the third stage.
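As an illustration only of the kind of first-stage skin-color segmentation described above (the threshold values below are common rule-of-thumb chroma ranges, not values taken from the cited work, and the function name is an assumption):

```python
import cv2
import numpy as np

def skin_mask_ycbcr(frame_bgr):
    """First-stage skin segmentation sketch: convert to YCrCb and threshold
    the chroma channels with commonly used (illustrative) skin-tone ranges."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    lower = np.array([0, 133, 77], dtype=np.uint8)     # Y, Cr, Cb lower bounds
    upper = np.array([255, 173, 127], dtype=np.uint8)  # Y, Cr, Cb upper bounds
    mask = cv2.inRange(ycrcb, lower, upper)
    # Clean up the binary mask a little before looking for face candidates.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    return mask  # 255 where the pixel looks like skin, 0 elsewhere
```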
[7] Biometrics detection technology based on skin color segmentation and template matching, by Chen, A. P., Pan, L., Tong, Y.B., Ning, N., discloses a methodology for biometrics detection from video recordings of the human face and demonstrates an implementation using a digital camera, with ambient daylight providing illumination, based on obtaining an accurate binary image.
[1] Photoplethysmography imaging (PPGi) is a new research field that has been used in recent years to extract physiological information. Because the volume of blood in the blood vessels changes constantly over the cardiac cycle, and the human face is part of a living body, there is a constant blood flow in the subject's facial region. Photoplethysmography (PPG) corresponds to the variations in reflected light due to the cardiovascular blood volume pulse. It has now been shown that heart rate (HR) can be measured from the human face with a simple consumer-level digital camera under ambient light.
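To make this concrete, the following is a minimal, illustrative sketch (not taken from the patent; the function name and the 0.7-4 Hz cardiac band are assumptions) of estimating heart rate from the per-frame mean green-channel values of a face region:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def estimate_hr_bpm(green_means, fps):
    """Estimate heart rate (BPM) from per-frame mean green-channel values
    of a face region. Rough sketch: detrend, band-pass to the plausible
    cardiac band (0.7-4 Hz, roughly 42-240 BPM), take the dominant FFT peak."""
    x = np.asarray(green_means, dtype=float)
    x = x - x.mean()
    # Band-pass filter around the expected cardiac frequency range.
    b, a = butter(3, [0.7 / (fps / 2), 4.0 / (fps / 2)], btype="band")
    x = filtfilt(b, a, x)
    # Dominant spectral peak within the cardiac band gives the pulse rate.
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    spectrum = np.abs(np.fft.rfft(x))
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0  # beats per minute
```

On a typical 30 fps front-camera clip, ten to fifteen seconds of samples are usually enough for the spectral peak to be well defined.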
[2] Non-contact, automated cardiac pulse measurements using video imaging and blind source separation, by Ming-Zher Poh, Daniel J. McDuff, and Rosalind W. Picard, discloses a method based on color video recordings of the human face, using automatic face tracking along with blind source separation of the color channels into independent components. Using Bland-Altman and correlation analysis, the cardiac pulse rate extracted from videos recorded by a basic webcam was compared to an FDA-approved finger blood volume pulse (BVP) sensor and achieved high accuracy and correlation even in the presence of movement artifacts.
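A sketch of the kind of blind source separation step the cited work describes, offered only as an illustration under stated assumptions (FastICA stands in for the separation step, the periodicity score is a simple peak-prominence heuristic, and all names are illustrative):

```python
import numpy as np
from sklearn.decomposition import FastICA

def hr_from_rgb_traces(rgb_means, fps):
    """rgb_means: array of shape (n_frames, 3) holding the per-frame mean
    R, G, B values of the tracked face region. Separates the three traces
    into independent components and scores each component by the prominence
    of its dominant cardiac-band spectral peak."""
    X = np.asarray(rgb_means, dtype=float)
    X = (X - X.mean(axis=0)) / X.std(axis=0)          # normalize each channel
    sources = FastICA(n_components=3, random_state=0).fit_transform(X)
    best_bpm, best_score = None, -np.inf
    for s in sources.T:
        freqs = np.fft.rfftfreq(len(s), d=1.0 / fps)
        spectrum = np.abs(np.fft.rfft(s))
        band = (freqs >= 0.7) & (freqs <= 4.0)
        score = spectrum[band].max() / spectrum[band].mean()  # peak prominence
        if score > best_score:
            best_score = score
            best_bpm = freqs[band][np.argmax(spectrum[band])] * 60.0
    return best_bpm
```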
[3] As presented, non-contact methods to estimate HR can achieve very high accuracy.
However, none of the current technologies and prior art, taken alone or in combination, addresses or provides a solution for authenticating vital sign measurements for biometrics detection and identity authentication using PPGi technology via remote sensors of a mobile device. The pulsatile signal extracted from the face, as a unique feature, may be used to improve the robustness of non-contact biometrics detection by comparing and correlating it to the pulsatile signal extracted from additional regions of the user's body (such as a fingertip) using real-time, simultaneous PPGi technology via remote sensors of the same mobile device.
[4] Therefore, there is a long felt and unmet need for a system and method that overcomes the problems associated with the prior art.
[5] As used in the description herein and throughout the claims that follow, the meaning of "a," "an," and "the" includes plural reference unless the context clearly dictates otherwise. Also, as used in the description herein, the meaning of "in" includes "in" and "on" unless the context clearly dictates otherwise.
[6] All methods described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and all examples, or exemplary language (e.g. "such as") provided with respect to certain embodiments herein is intended merely to better illuminate the invention and does not pose a limitation on the scope of the invention otherwise claimed. No language in the specification should be construed as indicating any non-claimed element essential to the practice of the invention.
[7] Groupings of alternative elements or embodiments of the invention disclosed herein are not to be construed as limitations. Each group member can be referred to and claimed individually or in any combination with other members of the group or other elements found herein. One or more members of a group can be included in, or deleted from, a group for reasons of convenience and/or patentability. When any such inclusion or deletion occurs, the specification is herein deemed to contain the group as modified, thus fulfilling the written description of all Markush groups used in the appended claims.
SUMMARY OF THE INVENTION
[8] It is thus an object of the present invention to provide a method, using a computer processing system, for authenticating vital sign measurements for biometrics detection of a user using photoplethysmography (PPG) and photoplethysmography imaging (PPGi) analyses via remote sensors of a mobile device, the method comprising the steps of: simultaneously activating at least one first front camera sensor and at least one second back camera sensor of said mobile device; obtaining a video recording of a face of said user via said at least one first front camera sensor; extracting a first pulsatile signal of said user using photoplethysmography imaging (PPGi) analysis of the face of said user in said video recording; generating a first heart rate value (FHRV) comprising heart rate measurements over time; extracting a second pulsatile signal of said user using photoplethysmography (PPG) analysis of a finger of said user via said at least one second back camera sensor; generating a second heart rate value (SHRV) comprising heart rate measurements over time; comparing said first heart rate value (FHRV) and said second heart rate value (SHRV); and generating a comparison output value (COV) between said first heart rate value (FHRV) and said second heart rate value (SHRV).
[9] It is another object of the present invention to provide a system for authenticating vital sign measurements for biometrics detection of a user using photoplethysmography (PPG) and photoplethysmography imaging (PPGi) analyses via remote sensors of a mobile device, embodied in one or more non-transitory computer-readable media, said mobile device comprising: at least one processor; at least one display; at least one first front camera sensor; at least one second back camera sensor; and at least one data storage device storing a plurality of instructions and data, wherein, upon execution of said instructions by the at least one processor, said instructions cause: simultaneously activating at least one first front camera sensor and at least one second back camera sensor of said mobile device; obtaining a video recording of a face of said user via said at least one first front camera sensor; extracting a first pulsatile signal of said user using photoplethysmography imaging (PPGi) analysis of the face of said user in said video recording; generating a first heart rate value (FHRV) comprising heart rate measurements over time; extracting a second pulsatile signal of said user using photoplethysmography (PPG) analysis of a finger of said user via said at least one second back camera sensor; generating a second heart rate value (SHRV) comprising heart rate measurements over time; comparing said first heart rate value (FHRV) and said second heart rate value (SHRV); and generating a comparison output value (COV) between said first heart rate value (FHRV) and said second heart rate value (SHRV).
[10] It is another object of the present invention to provide a non-transitory computer-readable medium storing software comprising instructions executable by one or more computers which, upon such execution, cause the one or more computers to perform operations comprising: simultaneously activating at least one first front camera sensor and at least one second back camera sensor of a mobile device; obtaining a video recording of a face of a user via said at least one first front camera sensor; extracting a first pulsatile signal of said user using photoplethysmography imaging (PPGi) analysis of the face of said user in said video recording; generating a first heart rate value (FHRV) comprising heart rate measurements over time; extracting a second pulsatile signal of said user using photoplethysmography (PPG) analysis of a finger of said user via said at least one second back camera sensor; generating a second heart rate value (SHRV) comprising heart rate measurements over time; comparing said first heart rate value (FHRV) and said second heart rate value (SHRV); and generating a comparison output value (COV) between said first heart rate value (FHRV) and said second heart rate value (SHRV).
BRIEF DESCRIPTION OF THE PREFERRED EMBODIMENTS
The novel features believed to be characteristic of the invention are set forth in the appended claims. The invention itself, however, as well as the preferred mode of use, further objects and advantages thereof, will best be understood by reference to the following detailed description of an illustrative embodiment when read in conjunction with the accompanying drawings, wherein:
[12] Fig. 1 presents a high-level data flow diagram of the method disclosed by the present invention;
[13] Fig. 2 presents an example of a generalized presentation of the comparison between said first heart rate value (FHRV) and said second heart rate value (SHRV) according to the present invention; and
[14] Fig. 3 presents an embodiment of the system disclosed by the present invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[15] In the following detailed description of the preferred embodiments, reference is made to the accompanying drawings that form a part hereof, and in which are shown by way of illustration specific embodiments in which the invention may be practiced. It is understood that other embodiments may be utilized and structural changes may be made without departing from the scope of the present invention. The present invention may be practiced according to the claims without some or all of these specific details. For the purpose of clarity, technical material that is known in the technical fields related to the invention has not been described in detail so that the present invention is not unnecessarily obscured.
[16] Reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
[17] While the technology will be described in conjunction with various embodiment(s), it will be understood that they are not intended to limit the present technology to these embodiments. On the contrary, the present technology is intended to cover alternatives, modifications and equivalents, which may be included within the spirit and scope of the various embodiments as defined by the appended claims.
[18] Furthermore, in the following description of embodiments, numerous specific details are set forth in order to provide a thorough understanding of the present technology. However, the present technology may be practiced without these specific details. In other instances, well known methods, procedures, components, and circuits have not been described in detail as not to unnecessarily obscure aspects of the present embodiments.
[19] Unless specifically stated otherwise as apparent from the following discussions, it is appreciated that throughout the present description of embodiments, discussions utilizing terms such as "obtaining", "calculating", "processing", "performing," "extracting," "configuring" or the like, refer to the actions and processes of a computer system, or similar electronic computing device. The computer system or similar electronic computing device manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission, or display devices, including integrated circuits down to and including chip level firmware, assembler, and hardware based micro code.
[20] As will be explained in further detail below, the technology described herein relates to authenticating vital sign measurements for biometrics detection of a user by measuring and analyzing skin color variations using photoplethysmography via remote sensors of a mobile device.
[21] While the invention is susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawings and the above detailed description. It should be understood, however, that it is not intended to limit the invention to the particular forms disclosed, but on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention as defined by the appended claims.
[22] The term "user", used interchangeably in the present invention, refers hereinafter to any party that employs a biometrics recognition and authentication system based on remote photoplethysmography measurement and analysis of skin color variations for observing human vital signs, including, but not limited to, average heart rate, heart rate variation and respiratory rate, using non-invasive, remote, passive sensors of a mobile device, i.e. a camera.
[23] The term "mobile device" refers interchangeably, but is not limited, to a mobile phone, laptop, tablet, wearable computing device, cellular communicating device, digital camera (still and/or video), PDA, computer server, video camera, television, electronic visual dictionary, communication device, personal computer, and the like. The means and methods of the present invention are performed in a standalone electronic device comprising at least one screen. Additionally or alternatively, at least a portion of the processing, accessible memory and/or databases resides on a cloud-based and/or web-based platform. In some embodiments, the software components and/or image databases provided are stored in a local memory module and/or stored in a remote server.
[24] The term "photoplethysmography (PPG) analysis", used interchangeably in the present invention, refers hereinafter to the optical detection of blood volume changes in the microvascular bed of the tissue. The sensor system consists of a light source and a detector, with software to monitor changes in the light intensity via reflection from, or transmission through, the tissue. The changes in light intensity are associated with small variations in blood perfusion of the tissue and provide information on the cardiovascular system, in particular the pulse rate.
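As an illustrative sketch only (the function name, the assumption that the fingertip covers the back camera with the flash as light source, and the 0.3 s minimum peak spacing are not taken from the patent), the pulse rate can be derived from such an intensity trace by detecting beats and averaging the inter-beat intervals:

```python
import numpy as np
from scipy.signal import find_peaks

def fingertip_pulse_rate(intensity, fps):
    """intensity: per-frame mean brightness (e.g. of the red channel) of a
    fingertip covering the back camera lens. Each cardiac pulse shows up as
    a dip in transmitted light; the pulse rate is derived from the
    inter-beat intervals between successive beats."""
    x = np.asarray(intensity, dtype=float)
    x = -(x - x.mean())                       # invert so beats appear as peaks
    # Require peaks to be at least 0.3 s apart (roughly a 200 BPM ceiling).
    peaks, _ = find_peaks(x, distance=int(0.3 * fps))
    if len(peaks) < 2:
        return None
    ibi = np.diff(peaks) / fps                # inter-beat intervals in seconds
    return 60.0 / ibi.mean()                  # average pulse rate in BPM
```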
[25] The term "photoplethysmography imaging (PPGi) analysis" used interchangeably in the present invention, refers hereinafter to a contactless measurement for the functional registration of blood perfusion in the upper skin layers of facial regions. For PPGi a mobile phone camera is used that enables detecting the skin perfusion of an analyzed face.
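A minimal sketch of producing the per-frame mean channel values of a detected face region from a front-camera video, i.e. the raw PPGi trace assumed as input by the heart-rate sketches above; the use of a stock OpenCV Haar cascade is purely illustrative and not prescribed by the patent:

```python
import cv2
import numpy as np

def face_channel_means(video_path):
    """Return an (n_frames, 3) array of per-frame mean B, G, R values over a
    detected face region: the raw PPGi trace used by the heart-rate sketches
    above. Face localization via a stock Haar cascade is purely illustrative;
    the patent does not prescribe a particular detector."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(video_path)
    means = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, 1.3, 5)
        if len(faces) == 0:
            continue
        x, y, w, h = max(faces, key=lambda f: f[2] * f[3])   # largest face
        roi = frame[y:y + h, x:x + w]
        means.append(roi.reshape(-1, 3).mean(axis=0))        # mean B, G, R
    cap.release()
    return np.array(means)
```

The green channel of the returned trace (index 1 in OpenCV's BGR order) can then be passed to the heart-rate estimate sketched earlier.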
[26] As a non-limiting example, the implemented method of authenticating vital sign measurements for biometrics detection of a user, using photoplethysmography (PPG) and photoplethysmography imaging (PPGi) analyses via remote sensors of a mobile device, can be executed using a computerized process according to the example method 100 illustrated in FIG. 1. As illustrated in FIG. 1, the method 100 commences by simultaneously activating at least one front camera sensor and at least one back camera sensor of the mobile device at step 102. Video of a first body region of the user is obtained via the at least one front camera sensor, and video of a second body region of said user is obtained via the at least one back camera sensor, at steps 104 and 106, respectively. Extraction of a first pulsatile signal of the user, using photoplethysmography imaging (PPGi) analysis of the first body region of said user in said video, is carried out at step 108. Similarly, a second pulsatile signal corresponding to the second body region of the user is extracted at step 110. A first heart rate value (FHRV), based on photoplethysmography imaging analysis of the video of the first body region, is calculated at step 112. A similar procedure for the second body region, in order to obtain a second heart rate value (SHRV), is carried out at step 114. The purpose of the procedure is to obtain a comparative output value (COV) between the first heart rate value (FHRV) and the second heart rate value (SHRV). Vitality is authenticated if the first heart rate value (FHRV) and the second heart rate value (SHRV) coincide. Otherwise, the procedure is repeated.
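Tying the steps of FIG. 1 together, a minimal illustrative sketch (the 3 BPM tolerance, the function names, and the reuse of the helper sketches above are assumptions, not values or interfaces claimed by the patent):

```python
def authenticate_vitality(face_green_means, finger_intensity,
                          fps_front, fps_back, tolerance_bpm=3.0):
    """Sketch of method 100: compute FHRV from the front-camera face trace
    (PPGi) and SHRV from the back-camera fingertip trace (PPG), form the
    comparative output value (COV) as their difference, and authenticate
    vitality when the two heart rates coincide within a tolerance.
    The 3 BPM tolerance is an illustrative assumption, not a claimed value."""
    fhrv = estimate_hr_bpm(face_green_means, fps_front)      # step 112
    shrv = fingertip_pulse_rate(finger_intensity, fps_back)  # step 114
    if fhrv is None or shrv is None:
        return {"authenticated": False, "cov": None, "fhrv": fhrv, "shrv": shrv}
    cov = abs(fhrv - shrv)                    # comparative output value (COV)
    return {"authenticated": cov <= tolerance_bpm,
            "cov": cov, "fhrv": fhrv, "shrv": shrv}
```

In practice the COV could equally be a correlation between the two pulsatile waveforms; the absolute heart-rate difference is used here only to keep the sketch short.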
[27] As a non-limiting example, a generalized presentation of the comparison between the first heart rate value (FHRV) and the second heart rate value (SHRV) is given according to the example system 200 illustrated in FIG. 2. As illustrated in FIG. 2, the system is configured to present to the user a comparison output value (COV) between said first heart rate value (FHRV) and said second heart rate value (SHRV). The comparison output value (COV) between said first heart rate value (FHRV) and said second heart rate value (SHRV) is used to detect and authenticate the body regions of the user and/or to detect and authenticate the identity of the user.
[28] Reference is now made to FIG. 3, which graphically illustrates, according to another preferred embodiment of the present invention, an example of a computerized system 300 for implementing the invention. The systems and methods described herein can be implemented in software or hardware or any combination thereof. The systems and methods described herein can be implemented using one or more computing devices which may or may not be physically or logically separate from each other. Additionally, various aspects of the methods described herein may be combined or merged into other functions.
[29] In some embodiments, the illustrated system elements could be combined into a single hardware device or separated into multiple hardware devices. If multiple hardware devices are used, the hardware devices could be physically located proximate to or remotely from each other.
[30] The methods can be implemented in a computer program product accessible from a computer-usable or computer-readable storage medium that provides program code for use by or in connection with a computer or any instruction execution system. A computer-usable or computer-readable storage medium can be any apparatus that can contain or store the program for use by or in connection with the computer or instruction execution system, apparatus, or device.
[31] A data processing system suitable for storing and/or executing the corresponding program code can include at least one processor coupled directly or indirectly to computerized data storage devices such as memory elements. Input/output (I/O) devices (including but not limited to keyboards, displays, pointing devices, etc.) can be coupled to the system. Network adapters may also be coupled to the system to enable the data processing system to become coupled to other data processing systems or remote printers or storage devices through intervening private or public networks. To provide for interaction with a user, the features can be implemented on a computer with a display device, such as an LCD (liquid crystal display), virtual display, or another type of monitor for displaying information to the user, and a keyboard and an input device, such as a mouse or trackball by which the user can provide input to the computer.
[32] A computer program can be a set of instructions that can be used, directly or indirectly, in a computer. The systems and methods described herein can be implemented using programming languages such as Flash™, JAVA™, C++, C, C#, Visual Basic™, JavaScript™, PHP, XML, HTML, etc., or a combination of programming languages, including compiled or interpreted languages, and can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. The software can include, but is not limited to, firmware, resident software, microcode, etc. Protocols such as SOAP/HTTP may be used in implementing interfaces between programming modules. The components and functionality described herein may be implemented on any desktop operating system executing in a virtualized or non-virtualized environment, using any programming language suitable for software development, including, but not limited to, different versions of Microsoft Windows™, Apple™ Mac™, iOS™, Android™, Unix™/X-Windows™, Linux™, etc. The system could be implemented using a web application framework, such as Ruby on Rails.
[33] The processing system can be in communication with a computerized data storage system.
The data storage system can include a non-relational or relational data store, such as a MySQL™ or other relational database. Other physical and logical database types could be used. The data store may be a database server, such as Microsoft SQL Server™, Oracle™, IBM DB2™, SQLITE™, or any other database software, relational or otherwise. The data store may store the information identifying syntactical tags and any information required to operate on syntactical tags. In some embodiments, the processing system may use object- oriented programming and may store data in objects. In these embodiments, the processing system may use an object-relational mapper (ORM) to store the data objects in a relational database. The systems and methods described herein can be implemented using any number of physical data models. In one example embodiment, an RDBMS can be used. In those embodiments, tables in the RDBMS can include columns that represent coordinates. In the case of environment tracking systems, data representing user events, virtual elements, etc. can be stored in tables in the RDBMS. The tables can have pre-defined relationships between them. The tables can also have adjuncts associated with the coordinates.
[34] Suitable processors for the execution of a program of instructions include, but are not limited to, general and special purpose microprocessors, and the sole processor or one of multiple processors or cores, of any kind of computer. A processor may receive and store instructions and data from a computerized data storage device such as a read-only memory, a random access memory, both, or any combination of the data storage devices described herein. A processor may include any processing circuitry or control circuitry operative to control the operations and performance of an electronic device.
[35] The processor may also include, or be operatively coupled to communicate with, one or more data storage devices for storing data. Such data storage devices can include, as non-limiting examples, magnetic disks (including internal hard disks and removable disks), magneto- optical disks, optical disks, read-only memory, random access memory, and/or flash storage. Storage devices suitable for tangibly embodying computer program instructions and data can also include all forms of non-volatile memory, including, for example, semiconductor memory devices, such as EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, ASICs (application- specific integrated circuits).
[36] The systems, modules, and methods described herein can be implemented using any combination of software or hardware elements. The systems, modules, and methods described herein can be implemented using one or more virtual machines operating alone or in combination with each other. Any applicable virtualization solution can be used for encapsulating a physical computing machine platform into a virtual machine that is executed under the control of virtualization software running on a hardware computing platform or host. The virtual machine can have both virtual system hardware and guest operating system software.
[37] The systems and methods described herein can be implemented in a computer system that includes a back-end component, such as a data server, or that includes a middleware component, such as an application server or an Internet server, or that includes a front-end component, such as a client computer having a graphical user interface or an Internet browser, or any combination of them. The components of the system can be connected by any form or medium of digital data communication such as a communication network. Examples of communication networks include, e.g., a LAN, a WAN, and the computers and networks that form the Internet.
[38] One or more embodiments of the invention may be practiced with other computer system configurations, including hand-held devices, microprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, etc. The invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a network.
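The claims that follow recite obtaining video of two body regions simultaneously from the front and back camera sensors, extracting pulsatile signals by PPGi analysis, calculating a first and a second heart rate value, and calculating a comparative output value between them. The following is a minimal sketch of that comparison logic only; the frequency-domain heart rate estimate, the function names, and the acceptance threshold are assumptions for illustration, not the disclosed implementation.

```python
# Minimal sketch (illustrative only) of comparing two simultaneously measured
# heart rate values. The signal-extraction details and threshold are assumptions.
import numpy as np

def estimate_heart_rate_bpm(ppg_signal: np.ndarray, fps: float) -> float:
    """Estimate heart rate from a pulsatile (PPG/PPGi) signal via its dominant frequency."""
    signal = ppg_signal - np.mean(ppg_signal)
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)  # ~42-240 bpm physiological band
    dominant_hz = freqs[band][np.argmax(spectrum[band])]
    return float(dominant_hz * 60.0)

def comparative_output_value(fhrv: float, shrv: float) -> float:
    """Comparative output value (COV): here, the absolute difference between the two estimates."""
    return abs(fhrv - shrv)

def authenticate(fhrv: float, shrv: float, tolerance_bpm: float = 3.0) -> bool:
    """Accept the measurement only if both body regions yield consistent heart rates."""
    return comparative_output_value(fhrv, shrv) <= tolerance_bpm

if __name__ == "__main__":
    fps = 30.0
    t = np.arange(0, 10, 1.0 / fps)
    front = np.sin(2 * np.pi * 1.2 * t)  # simulated ~72 bpm front-camera (face) signal
    back = np.sin(2 * np.pi * 1.2 * t)   # simulated ~72 bpm back-camera (finger) signal
    fhrv, shrv = estimate_heart_rate_bpm(front, fps), estimate_heart_rate_bpm(back, fps)
    print(fhrv, shrv, authenticate(fhrv, shrv))
```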

Claims

1. A method of authenticating vital sign measurements of a user for biometrics detection via a mobile device having at least one front camera sensor and at least one back camera sensor, the method comprising the steps of:
a. obtaining video of a first body region and a second body region of said user via said at least one front camera sensor and said at least one back camera sensor, respectively;
b. extracting pulsatile signals of said user using photoplethysmography imaging (PPGi) analysis of said video obtained from said first and second body regions;
c. calculating a first heart rate value (FHRV) and a second heart rate value (SHRV);
wherein said videos are obtained simultaneously and a comparative output value between said first heart rate value (FHRV) and said second heart rate value (SHRV) is calculated.
2. The method of claim 1, wherein said pulsatile signals are selected from the group consisting of average heart rate, heart rate variation, respiratory rate, and any combination thereof.
3. The method of claim 1, wherein said first body region is a face.
4. The method of claim 1, wherein said second body region is a finger.
5. The method of claim 1, wherein the color values for the pulsatile signal analysis are obtained by using the actual color values from the adaptive body regions in real-time.
6. The method of claim 1, wherein the color values for the pulsatile signal analysis are obtained by using the actual color values from the body capillary vessels in real-time.
7. The method of claim 1, wherein inter-beat-interval changes in the heart rate are estimated.
8. The method of claim 1, wherein a single adaptive facial region or multiple adaptive facial regions are analyzed to control the signal acquisition step so that the PPGi-obtained metadata is preserved, wherein facial expressions and body muscle movements are taken into account.
9. The method of claim 1, wherein said method further comprises a step of presenting to the user a comparative output value (COV) between said first heart rate value (FHRV) and said second heart rate value (SHRV).
10. The method of claim 1, wherein said comparative output value (COV) between said first heart rate value (FHRV) and said second heart rate value (SHRV) is used to detect and authenticate the body regions of the user.
11. The method of claim 1, wherein said comparative output value (COV) between said first heart rate value (FHRV) and said second heart rate value (SHRV) is used to detect and authenticate the identity of the user.
12. A system for authenticating vital sign measurements for biometrics detection of a user using photoplethysmography (PPG) and photoplethysmography imaging (PPGi) analyses via remote sensors of a mobile device, embodied in one or more non-transitory computer-readable media, said mobile device comprising:
a. at least one processor;
b. at least one display;
c. at least one first front camera sensor;
d. at least one second back camera sensor; and
e. at least one data storage device storing a plurality of instructions and data wherein, upon execution of said instructions by the at least one processor, said instructions cause the mobile device to perform:
i. obtaining video of a first body region and a second body region of said user via said at least one front camera sensor and at least one back camera sensor, respectively;
ii. extracting pulsatile signals of said user using photoplethysmography imaging (PPGi) analysis of said video obtained from said first and second body regions;
iii. calculating a first heart rate value (FHRV) and a second heart rate value (SHRV);
wherein said videos are obtained simultaneously and a comparative output value between said first heart rate value (FHRV) and said second heart rate value (SHRV) is calculated.
13. The system of claim 12, wherein said pulsatile signals are selected from the group consisting of average heart rate, heart rate variation, respiratory rate, and any combination thereof.
14. The system of claim 12, wherein said first body region is a face.
15. The system of claim 12, wherein said second body region is a finger.
16. The system of claim 12, wherein the color values for the pulsatile signal analysis are obtained by using the actual color values from the adaptive body regions in real-time.
17. The system of claim 12, wherein the color values for the pulsatile signal analysis are obtained by using the actual color values from the body capillary vessels in real-time.
18. The system of claim 12, wherein inter-beat-interval changes in the heart rate are estimated.
19. The system of claim 12, wherein a single adaptive facial region or multiple adaptive facial regions are analyzed to control the signal acquisition step so that the PPGi-obtained metadata is preserved, wherein facial expressions and body muscle movements are taken into account.
20. The system of claim 12, wherein said instructions further cause presentation to the user of a comparative output value (COV) between said first heart rate value (FHRV) and said second heart rate value (SHRV).
21. The system of claim 12, wherein said comparative output value (COV) between said first heart rate value (FHRV) and said second heart rate value (SHRV) is used to detect and authenticate the body regions of the user.
22. The system of claim 12, wherein said comparative output value (COV) between said first heart rate value (FHRV) and said second heart rate value (SHRV) is used to detect and authenticate the identity of the user.
23. A non-transitory computer-readable medium storing software comprising instructions executable by one or more computers which, upon such execution, cause the one or more computers to perform operations comprising:
a. obtaining video of a first body region and a second body region of a user via at least one front camera sensor and at least one back camera sensor, respectively;
b. extracting pulsatile signals of said user using photoplethysmography imaging (PPGi) analysis of said video obtained from said first and second body regions;
c. calculating a first heart rate value (FHRV) and a second heart rate value (SHRV);
wherein said videos are obtained simultaneously and a comparative output value between said first heart rate value (FHRV) and said second heart rate value (SHRV) is calculated.
24. The non-transitory computer-readable medium of claim 23, wherein said pulsatile signals are selected from the group consisting of average heart rate, heart rate variation, respiratory rate, and any combination thereof.
25. The non-transitory computer-readable medium of claim 23, wherein said first body region is a face.
26. The non-transitory computer-readable medium of claim 23, wherein said second body region is a finger.
27. The non-transitory computer-readable medium of claim 23, wherein the color values for the pulsatile signal analysis are obtained by using the actual color values from the adaptive body regions in real-time.
28. The non-transitory computer-readable medium of claim 23, wherein the color values for the pulsatile signal analysis are obtained by using the actual color values from the body capillary vessels in real-time.
29. The non-transitory computer-readable medium of claim 23, wherein inter-beat-interval changes in the heart rate are estimated.
30. The non-transitory computer-readable medium of claim 23, wherein a single adaptive facial region or multiple adaptive facial regions are analyzed to control the signal acquisition step so that the PPGi-obtained metadata is preserved, wherein facial expressions and body muscle movements are taken into account.
31. The non-transitory computer-readable medium of claim 23, wherein said instructions further cause presentation to the user of a comparative output value (COV) between said first heart rate value (FHRV) and said second heart rate value (SHRV).
32. The non-transitory computer-readable medium of claim 23, wherein said comparative output value (COV) between said first heart rate value (FHRV) and said second heart rate value (SHRV) is used to detect and authenticate the body regions of the user.
33. The non-transitory computer-readable medium of claim 23, wherein said comparative output value (COV) between said first heart rate value (FHRV) and said second heart rate value (SHRV) is used to detect and authenticate the identity of the user.
EP17769567.3A 2016-03-22 2017-03-22 System and methods for authenticating vital sign measurements for biometrics detection using photoplethysmography via remote sensors Withdrawn EP3432790A4 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662311414P 2016-03-22 2016-03-22
PCT/IL2017/050361 WO2017163248A1 (en) 2016-03-22 2017-03-22 System and methods for authenticating vital sign measurements for biometrics detection using photoplethysmography via remote sensors

Publications (2)

Publication Number Publication Date
EP3432790A1 (en) 2019-01-30
EP3432790A4 EP3432790A4 (en) 2019-10-02

Family

ID=59900017

Family Applications (1)

Application Number Title Priority Date Filing Date
EP17769567.3A Withdrawn EP3432790A4 (en) 2016-03-22 2017-03-22 System and methods for authenticating vital sign measurements for biometrics detection using photoplethysmography via remote sensors

Country Status (2)

Country Link
EP (1) EP3432790A4 (en)
WO (1) WO2017163248A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4033972A4 (en) * 2019-12-02 2024-01-10 Binah.Ai Ltd System and method for physiological measurements from optical data
JP2023545426A (en) * 2020-10-09 2023-10-30 Binah.Ai Ltd System and method for blood alcohol determination by optical data

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140276104A1 (en) * 2013-03-14 2014-09-18 Nongjian Tao System and method for non-contact monitoring of physiological parameters
US10856747B2 (en) * 2014-01-07 2020-12-08 Samsung Electronics Co., Ltd. Method and system for measuring heart rate in electronic device using photoplethysmography
US20150302158A1 (en) * 2014-04-21 2015-10-22 Microsoft Corporation Video-based pulse measurement
US10028668B2 (en) * 2014-05-06 2018-07-24 Alivecor, Inc. Blood pressure monitor
US9465930B2 (en) * 2014-08-29 2016-10-11 Dropbox, Inc. Fingerprint gestures

Also Published As

Publication number Publication date
WO2017163248A1 (en) 2017-09-28
EP3432790A4 (en) 2019-10-02

Similar Documents

Publication Publication Date Title
Salama AbdELminaam et al. A deep facial recognition system using computational intelligent algorithms
Wang et al. Isolated sign language recognition with grassmann covariance matrices
Kapuscinski et al. Recognition of hand gestures observed by depth cameras
Jian et al. Facial-feature detection and localization based on a hierarchical scheme
US20160343135A1 (en) Determining a pulse signal from a video sequence
Xue et al. Robust visual tracking via multi-scale spatio-temporal context learning
Shyam et al. A taxonomy of 2D and 3D face recognition methods
Shehu et al. Remote eye gaze tracking research: a comparative evaluation on past and recent progress
Wang et al. Accurate face alignment and adaptive patch selection for heart rate estimation from videos under realistic scenarios
Sun et al. Contrast-phys+: Unsupervised and weakly-supervised video-based remote physiological measurement via spatiotemporal contrast
Wu et al. Anti-jamming heart rate estimation using a spatial–temporal fusion network
Singh et al. Detection of stress, anxiety and depression (SAD) in video surveillance using ResNet-101
EP3432790A1 (en) System and methods for authenticating vital sign measurements for biometrics detection using photoplethysmography via remote sensors
Ghazali et al. Novel automatic eye detection and tracking algorithm
Yang et al. Motion-tolerant heart rate estimation from face videos using derivative filter
Zhang et al. Hierarchical facial landmark localization via cascaded random binary patterns
Huang et al. Accurate and efficient pulse measurement from facial videos on smartphones
Jan Deep learning based facial expression recognition and its applications
Liu et al. Heart rate estimation by leveraging static and dynamic region weights
Liu Face detection and recognition on mobile devices
Florea et al. Recognition of the gaze direction: Anchoring with the eyebrows
Velusamy et al. Improved feature representation for robust facial action unit detection
Shyam et al. Automatic face recognition in digital world
Anwar Real time facial expression recognition and eye gaze estimation system
Gaur et al. Comparative studies for the human facial expressions recognition techniques

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20181022

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20190830

RIC1 Information provided on ipc code assigned before grant

Ipc: G06F 21/32 20130101ALI20190826BHEP

Ipc: A61B 5/024 20060101AFI20190826BHEP

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20200603