GB2595930A - Individualised computer-implemented security method and system - Google Patents


Info

Publication number
GB2595930A
Authority
GB
United Kingdom
Prior art keywords
user
mental state
request
typicality
functionality
Prior art date
Legal status
Granted
Application number
GB2008960.3A
Other versions
GB2595930B (en)
GB202008960D0 (en)
Inventor
Smith-Creasey Max
Ghassemian Mona
Current Assignee
British Telecommunications PLC
Original Assignee
British Telecommunications PLC
Priority date
Filing date
Publication date
Application filed by British Telecommunications PLC filed Critical British Telecommunications PLC
Priority to GB2008960.3A
Publication of GB202008960D0
Publication of GB2595930A
Application granted
Publication of GB2595930B
Legal status: Active
Anticipated expiration


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/32User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W12/00Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/60Context-dependent security
    • H04W12/67Risk-dependent, e.g. selecting a security level depending on risk profiles

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Collating Specific Patterns (AREA)

Abstract

The present disclosure relates to a computer-implemented security method comprising, in response to a request for functionality made by a user through a user interface: obtaining authentication of the user’s identity based on one or more authentication credentials obtained from the user in response to the request; obtaining an inference of the user’s state of mind based on one or more mental state biometrics obtained from the user in response to the request; obtaining an individualised mental state typicality score indicating a degree of typicality of that inference for the user based on a mental state profile established for the user prior to the request being made; obtaining a policy decision in dependence on the inference, the individualised mental state typicality score and an access policy for the requested functionality; and initiating provision or denial of the requested functionality in accordance with the policy decision.

Description

INDIVIDUALISED COMPUTER-IMPLEMENTED SECURITY METHOD AND
SYSTEM
Field
The present disclosure relates to security methods which authenticate users based on both their identity and their state of mind.
More specifically, aspects relate to computer-implemented security methods, data processing systems configured to perform such methods, computer programs comprising instructions which, when the program is executed by a computer, cause the computer to carry out such methods, computer-readable data carriers having such programs stored thereon and data carrier signals carrying such computer programs.
Background
In the interests of efficiency, and to reduce the risk of corruption, many security checks traditionally performed by a human guard are now automated. Such security checks may for example be to permit access to restricted areas such as homes, airport departure lounges, train station platforms, ticketed events, hospital wards and secure workplaces. They may alternatively be to permit provision of other functionality such as accessing personal data and/or cash at an automatic teller machine (ATM).
However, while automatic security systems have increased efficiency of security checks, they lack the emotional intelligence and intuition which human security guards can provide. A user's identity can be automatically authenticated by checking one or more credentials provided by the user such as a passport, personal identification number (PIN) or biometric credential e.g. fingerprint, but identity may not be the only factor relevant to a suitable security policy. It may for example be desirable to deny requests for functionality made by authorised users at times when they are engaging in dishonest behaviour, acting under coercion or impaired e.g. due to inebriation, fatigue or a health problem.
Some attempts have already been made to mitigate the types of problem described above by making use of user biometrics. For example, US 2017/0223017 A1 describes a process in which a user attempting a large mobile transaction triggers a prompt to the user to provide biometric data to assess whether they could be under coercion. For example, if an image of the user's face shows them frowning and a pulse measurement indicates an elevated heart rate then a coercion risk could be identified. In that case the transaction is denied, or additional security checks are required for the transaction to be processed. Such an approach may however unduly delay the authentication process or inconvenience the user in some circumstances, since the biometrics provided may not necessarily indicate coercion.
What is needed is an improved automatic authentication system and method.
Summary
According to a first aspect, there is provided a computer-implemented security method, the method comprising, in response to a request for functionality made by a user through a user interface: obtaining authentication of the user's identity based on one or more authentication credentials obtained from the user in response to the request; obtaining an inference of the user's state of mind based on one or more mental state biometrics obtained from the user in response to the request; obtaining an individualised mental state typicality score indicating a degree of typicality of that inference for the user based on a mental state profile established for the user prior to the request being made; obtaining a policy decision in dependence on the inference, the individualised mental state typicality score and an access policy for the requested functionality; and initiating provision or denial of the requested functionality in accordance with the policy decision.
The method of the first aspect can further comprise determining the individualised mental state typicality score in dependence on contextual data indicating a context in which the request for functionality was made.
The mental state profile can comprise data indicating typicality of the inference for the user in that context, the individualised mental state typicality score being determined further in dependence thereon.
The method of the first aspect can further comprise, in response to the request, obtaining the contextual data from one or more environmental sensors in the vicinity of the user interface.
The method of the first aspect can further comprise obtaining an environmental mental state typicality score indicating a degree of typicality of the inference for the context based on an environmental context profile established from a plurality of individuals prior to the request being made; wherein the policy decision is obtained further in dependence thereon.
According to a second aspect, there is provided a computer-implemented security method, the method comprising, in response to a request for functionality made by a user through a user interface: obtaining authentication of the user's identity based on one or more authentication credentials obtained from the user in response to the request; obtaining an inference of the user's state of mind based on one or more mental state biometrics obtained from the user in response to the request; obtaining contextual data indicating a context in which the request for functionality was made from one or more environmental sensors in the vicinity of the user interface; obtaining an environmental mental state typicality score indicating a degree of typicality of the inference for the context based on an environmental context profile established from a plurality of individuals prior to the request being made; obtaining a policy decision in dependence on the inference, the environmental mental state typicality score and an access policy for the requested functionality; and initiating provision or denial of the requested functionality in accordance with the policy decision.
The method of the second aspect can further comprise obtaining an individualised mental state typicality score indicating a degree of typicality of the inference for the user based on a mental state profile established for the user prior to the request being made; wherein the policy decision is obtained further in dependence thereon.
The method of either the first or second aspect can further comprise obtaining the policy decision by combining the individualised mental state typicality score with the environmental mental state typicality score by means of sum score fusion.
The method of either the first or second aspect can further comprise, prior to the request for functionality being made by the user, causing the mental state profile for the user to be stored remotely from a device comprising the user interface.
The method of either the first or second aspect can further comprise, prior to the request for functionality being made by the user, establishing the mental state profile for the user using an anomaly detection algorithm facilitated by a neural network.
The method of either the first or second aspect can further comprise determining the environmental mental state typicality score locally on a device comprising the one or more environmental sensors.
The one or more authentication credentials of either the first or second aspect can comprise at least one of the one or more mental state biometrics.
The method of either the first or second aspect can further comprise making the inference using a support vector machine classifier or an artificial neural network classifier.
According to a third aspect, there is provided a data processing system configured to perform the method of either of the first or second aspects.
According to a fourth aspect, there is provided a computer program comprising instructions which, when the program is executed by a computer, cause the computer to carry out the method of either of the first or second aspects.
According to a fifth aspect, there is provided a computer-readable data carrier having stored thereon the computer program of the fourth aspect.
According to a sixth aspect, there is provided a data carrier signal carrying the computer program of the fourth aspect.
Brief description of the figures
Aspects of the present disclosure will now be described by way of example with reference to the accompanying figures. In the figures:
Figure 1 illustrates an example system 100 in which the proposed methods can be employed;
Figure 2 is a flowchart illustrating an example computer-implemented security method; and
Figure 3 schematically illustrates an example data processing system capable of performing such a method.
Detailed description of the figures
The following description is presented to enable any person skilled in the art to make and use the system and is provided in the context of a particular application.
Various modifications to the disclosed embodiments will be readily apparent to those skilled in the art.
Where existing authentication systems take into account the mental state of the individual being authenticated, they do so with reference to generic data, without considering what is typical for that individual and/or for the context in which authentication is being requested. This can lead to authentication being inappropriately delayed and/or additional security checks being unnecessarily implemented, in turn leading to delays and/or wastage of resources such as electrical power, memory and bandwidth. For example, if the individual's skin conductance or electrodermal activity is measured by a galvanic skin resistance (GSR) sensor to indicate that they are perspiring, they may be inferred to be stressed and/or frightened, and authentication for actions such as entry to restricted locations may be accordingly denied to them. This may be an inappropriate policy decision however if that individual is prone to unusually high levels of perspiration and/or if it is an unusually hot day and/or if the individual is requesting access to their workplace following a routine lunchtime run.
The methods and systems proposed herein employ an individualised mental state typicality score and/or an environmental mental state typicality score to assess whether a biometric-based inference of the individual's mental state is normal for that individual and/or the context in which they are requesting authentication. (Mental state can refer here to emotional state/mood and/or any impairments and/or psychological disturbances the user may be experiencing e.g. as a result of fatigue, inebriation, a medical condition, medication or recreational drug use).
An authentication policy is then applied accordingly so that security is maintained while reducing occurrences of authentication being delayed and/or made more resource-intensive than necessary by the implementation of additional security checks in situations where a user's mental state biometrics correspond to a mental state which is usual (or even the "default" state) for that individual and/or a mental state which could be expected of any user in the environmental context the user is in and/or a mental state which is normal for that individual given the context. In this way, an automated security system with improved intuition-like features, such as emotional and/or situational/contextual awareness, is provided.
Figure 1 illustrates an example system 100 in which the proposed methods can be employed.
A user 110 requests functionality through a user interface 120. In this example, the functionality is access to a restricted area through a security door 130 and the user interface 120 is a proximity reader to which the user 110 presents an identity card 140. The reader 120 obtains an authentication credential from the identity card 140 by near field communication (NFC).
Authentication of the user 110's identity is then obtained. This could for example be solely on the basis of the authentication credential stored on the identity card 140 or may require one or more further authentication credentials to be collected. For example, the user 110 could be required to enter a passcode into a user interface such as a touchscreen 150 and/or one or more biometric credentials could be taken from them, e.g. a camera 160 could obtain one or more images of the user 110's face for a facial recognition process.
Processing to compare whatever authentication credential(s) is/are provided to stored values associated with the user 110's identity could be conducted in part or solely on a local computing device, or could involve communication with a remote computing device such as a cloud server 170 which could host such stored values, for example in a user profile database, and/or perform some or all of the authentication processing. Communication with any remote computing device could be via one or more wired and/or wireless connections.
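By way of illustration, comparison of a provided credential against a stored value might be implemented with a salted hash and a constant-time comparison. This sketch is an illustrative assumption rather than a prescribed scheme; the function names and iteration count are not taken from the disclosure:

```python
import hashlib
import hmac
import os


def hash_credential(credential: bytes, salt: bytes) -> bytes:
    """Salted hash of a credential, suitable for storing in a user
    profile database instead of the credential itself."""
    return hashlib.pbkdf2_hmac("sha256", credential, salt, 100_000)


def verify_credential(provided: bytes, salt: bytes, stored_hash: bytes) -> bool:
    """Constant-time comparison of the provided credential's hash
    against the stored hash, avoiding timing side channels."""
    return hmac.compare_digest(hash_credential(provided, salt), stored_hash)


# Enrolment: hash the credential and store the salt and hash.
salt = os.urandom(16)
stored = hash_credential(b"1234", salt)

# Authentication: the correct credential verifies, others do not.
verify_credential(b"1234", salt, stored)
verify_credential(b"9999", salt, stored)
```

The same comparison could run locally or on a remote device such as the cloud server 170, with only the salt and hash, never the raw credential, held in the profile database.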
One or more mental state biometrics are collected from the user 110 in order to obtain an inference of the user's state of mind. For example, the camera 160 could be used to capture images of the user 110's face for facial emotion detection (optionally images used for facial recognition as described above could be used for this purpose too). Alternatively or additionally, one or more other available biometric sensors could be used to collect mental state biometrics, such as may be incorporated for example in a smart watch 180 worn by the user 110. These could for example include a GSR sensor and/or a photoplethysmography (PPG) heart rate monitor which could report readings, for example via a cellular connection to the cloud server 170 or via a short-range communication protocol such as Bluetooth® to a local computing device. This could be automatic or triggered by an input from the user 110 to a user input device such as the touchscreen 150 or a button on the smartwatch 180. (The user 110 could optionally be prompted to provide such an input via a user output device such as the touchscreen 150 or a vibration notification device of the smartwatch 180.)

As with the authentication processing, processing to obtain the inference of the user 110's state of mind could be conducted in part or solely on a local computing device, or could involve communication with a remote computing device such as the cloud server 170 which could perform some or all of the processing.
As an example, the inference could be obtained based on one or more facial images and one or more heart rate measurements both taken over an interval of a few seconds. Analysis of the facial images could result in a finding of a 70% likelihood of the user 110 being stressed. Analysis of the heart rate measurements could result in a finding of a 90% likelihood of the user 110 being stressed. These two likelihoods can be fused to determine an 80% likelihood of the user 110 being stressed. (Fusing of inferences based on multiple biometrics can optionally involve weighting, e.g. according to the reliability of each biometric in judging mental states.)

Further processing can be conducted to obtain an individualised mental state typicality score indicating a degree of typicality of the inference for the user 110 based on a mental state profile established for the user in advance, which can for example be stored on the cloud server 170.
For example, the inference could be an 80% likelihood of the user being stressed as described above. The user 110's mental state profile could indicate that they are stressed 75% of the time. The individualised mental state typicality score could then be determined as 100% - |75% - 80%| = 95%. This high score indicates that the mental state inference is very usual for the user 110.
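By way of illustration, the score fusion and individualised typicality calculation in the worked example above might be sketched as follows. The function names and the equal-weight fusion are illustrative assumptions, not a prescribed implementation:

```python
def fuse_likelihoods(likelihoods, weights=None):
    """Fuse per-biometric likelihoods of a mental state into one estimate.

    With equal weights this is a simple mean, matching the worked example
    (70% from facial images, 90% from heart rate gives 80%)."""
    if weights is None:
        weights = [1.0] * len(likelihoods)
    return sum(l * w for l, w in zip(likelihoods, weights)) / sum(weights)


def individual_typicality(inferred_likelihood, profile_prevalence):
    """Individualised mental state typicality score:
    100% - |profile prevalence - inferred likelihood|."""
    return 1.0 - abs(profile_prevalence - inferred_likelihood)


stress_likelihood = fuse_likelihoods([0.70, 0.90])           # 0.80
typicality = individual_typicality(stress_likelihood, 0.75)  # 0.95
```

Unequal weights could be supplied to favour whichever biometric is the more reliable indicator of mental state.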
Contextual data indicating a context in which the request for functionality was made can also be collected from one or more environmental sensors in the vicinity of the user interface, such as a thermometer 190 which measures the ambient air temperature and/or the camera 160 which could capture images of the environment (optionally incorporated as background in the facial images described above) so that an image recognition process could e.g. recognise that the area is crowded.
Contextual data can alternatively or additionally be obtained in other ways, for example date and time from an internal clock of any computing device involved in the processing and/or indications of expected requests for functionality comprised in records of an organisation controlling the functionality access, such as staff rotas.
Further processing can be conducted to obtain an environmental mental state typicality score indicating a degree of typicality of the inference for the context based on an environmental context profile established from a plurality of individuals in advance, which can for example be stored on the cloud server 170.
For example, one or more images of the vicinity taken by the camera 160 over the same interval during which the mental state biometrics are collected could be determined to show a crowd 195. The environmental context profile could associate the presence of a crowd with an 80% likelihood of stress. One or more temperature measurements taken by the thermometer 190 over that same interval could indicate a temperature which the environmental context profile associates with a 70% likelihood of stress. These two likelihoods can be fused to arrive at an environmental mental state typicality score of 75%. (Fusing of likelihoods based on multiple contextual factors can optionally involve weighting, e.g. according to the reliability of each factor in judging mental states.)

A policy decision is then obtained in dependence on the inference, an access policy for the requested functionality and (where available) the individualised mental state typicality score and/or environmental mental state typicality score. The requested functionality can then be provided or denied in accordance with the policy decision. In this case, the touchscreen 150 could inform the user 110 that access has been denied, and optionally advise them of possible steps they can take to attempt to have the policy decision overruled, if the policy decision is to deny the requested functionality. Alternatively, the door 130 could be unlocked if the policy decision is to provide the requested functionality.
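By way of illustration, the environmental context profile and the fusion of per-factor likelihoods described above might be sketched as follows. The profile contents, factor names and function signature are illustrative assumptions:

```python
# Hypothetical environmental context profile: each contextual factor
# maps mental states to the proportion of a population observed to
# experience them under that factor.
ENV_PROFILE = {
    "crowd":            {"stressed": 0.80},
    "high_temperature": {"stressed": 0.70},
}


def environmental_typicality(factors, state, profile=ENV_PROFILE):
    """Fuse per-factor prevalence scores (equal-weight mean), as in the
    worked example: crowd 0.80 and temperature 0.70 give 0.75."""
    scores = [profile[f][state] for f in factors if f in profile]
    if not scores:
        return 0.0
    return sum(scores) / len(scores)


environmental_typicality(["crowd", "high_temperature"], "stressed")  # 0.75
```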
For example, the individualised mental state typicality score of 95% and environmental mental state typicality score of 75% determined above could be fused to produce an overall typicality score of 85%. The policy could permit access despite an inference that the user is stressed provided the overall typicality score is greater than a predetermined threshold value of 70%. In this case, the user 110 is therefore granted access.
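By way of illustration, the final fusion and threshold check of this example might be sketched as follows, assuming sum score fusion is implemented as an equal-weight weighted sum; the function names and the placement of the 70% threshold are illustrative:

```python
def overall_typicality(individual_score, environmental_score,
                       w_ind=0.5, w_env=0.5):
    """Sum score fusion of the two typicality scores as a weighted sum.

    Equal weights reproduce the worked example:
    0.5 * 0.95 + 0.5 * 0.75 = 0.85."""
    return w_ind * individual_score + w_env * environmental_score


def policy_decision(overall_score, threshold=0.70):
    """Grant the requested functionality when the overall typicality
    score exceeds the predetermined threshold, otherwise deny it."""
    return "grant" if overall_score > threshold else "deny"


overall = overall_typicality(0.95, 0.75)  # 0.85 in the worked example
decision = policy_decision(overall)       # access is granted
```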
Figure 2 is a flowchart illustrating an example computer-implemented security method 200.
The method 200 is initiated at step 210, where an indication of a request for functionality having been made by a user through a user interface is obtained. The request for functionality can for example be a request for access to a restricted location (e.g. through a security door, gate or barrier to a home, workplace, ticketed or members-only area in a transport hub or leisure venue), a request for access to restricted data (e.g. confidential business information accessible through a workstation or personal account data accessible through an ATM or ticket machine) or a request for provision of an item or service (e.g. cash at an ATM, a ticket at a ticket machine, a product at a vending machine or an automated carwash at a service station). The device performing the method 200 could comprise the user interface or could communicate with a device comprising the user interface through one or more wired or wireless connections.
At step 220, authentication of the user's identity is obtained based on one or more authentication credentials obtained from the user. This can for example be by comparing the authentication credential(s) to pre-stored authentication credentials, for example comprised in an authorised credentials list or user profile database. Such pre-stored credentials can be stored locally on a device performing the authentication, or on a remote device communicably coupled to it via one or more wired or wireless connections. The device performing the method 200 could perform the authentication itself or receive an indication that the user has been authenticated from a separate device via one or more wired or wireless connections.
Step 220 can comprise step 222, wherein the one or more authentication credentials are obtained from the user. The authentication credential(s) could for example comprise one or more knowledge factors, i.e. something the user knows (e.g. a partial or full password/passcode, personal identification number (PIN) or challenge response such as a security pattern or security question answer); ownership factors, i.e. something the user has (e.g. ID card, security token, implanted device, personal device with built-in hardware or software token); or inherence factors, i.e. something the user is or does (e.g. fingerprint, retinal pattern, DNA sequence, signature, face, voice or other biometric identifier). The credential(s) could be actively provided by the user, optionally in response to a prompt from a user interface, or could be passively collected from the user. The device performing the method 200 could comprise a user interface configured to obtain the authentication credential(s) from the user or could communicate with a device comprising such a user interface through one or more wired or wireless connections. Such a device could be the same device comprising the user interface through which the functionality request is made, or another. The user interface through which the authentication credential(s) are obtained could be the same user interface through which the functionality request is made, or another. If the user is prompted to provide the authentication credential(s) then this could be by means of one of the aforementioned user interfaces, or another.
In Figure 2, step 222 (obtain authentication credential(s)) is shown following step 210 (obtain indication of functionality request). Alternatively, step 210 could follow step 222, for example as may be appropriate if the user can select from multiple available functionalities. In some implementations, steps 210 and 222 could be merged, with active or passive provision of the authentication credential(s) being considered to imply the functionality request.
At step 230, an inference of the user's state of mind is obtained based on one or more mental state biometrics obtained from the user in response to the request. This can for example be done using a support vector machine classifier or an artificial neural network classifier. The device performing the method 200 could make the inference itself or receive an indication of the inference from a separate device via one or more wired or wireless connections.
Step 230 can comprise step 232, wherein the one or more mental state biometrics are obtained from the user. The mental state biometric(s) could for example comprise one or more facial images (for expression detection), video clips (for gesture detection), GSR skin conductance measurements, electrocardiography (ECG), blood volume pulse (BVP) or PPG heart rate measurements, vocal recordings (for tone/pitch detection) or pressure/speed/rhythm measurements relating to interaction with a user input device such as a keyboard, keypad, touchscreen or stylus. The mental state biometric(s) could be actively provided by the user, optionally in response to a prompt from a user interface, or could be passively collected from the user. The device performing the method 200 could comprise a user interface configured to obtain the mental state biometric(s) from the user or could communicate with a device comprising such a user interface through one or more wired or wireless connections. Such a device could be the same device comprising any of the aforementioned user interfaces, or another. The user interface through which the mental state biometric(s) are obtained could be any of the aforementioned user interfaces, or another. If the user is prompted to provide the mental state biometric(s) then this could be by means of one of the aforementioned user interfaces, or another.
In Figure 2, step 232 is shown following step 222. Alternatively, step 222 could follow step 232. In some implementations, steps 222 and 232 could be merged, with the same data serving both as an authentication credential and as a mental state biometric.
At step 240, one or both of an individualised mental state typicality score and an environmental mental state typicality score are obtained. The device performing the method 200 could determine one or both of these scores or could be communicably coupled to a device which does so via a wired or wireless connection.
An individualised mental state typicality score indicates a degree of typicality of the inference obtained at step 230 for the user based on a mental state profile established for the user prior to the functionality request being made, as shown at step 202.
The user's mental state profile could for example comprise a list of moods and/or other mental states the user has been determined to experience, each associated with a corresponding prevalence of that mental state for the user in question, generically and/or in one or more particular contexts. For example, a generic prevalence score for a particular mental state in a user mental state profile could correspond to the proportion of time for which the user has been determined to experience that mental state. Similarly, a contextual prevalence score for a particular mental state could correspond to the proportion of one or more periods, over which a particular contextual factor was present, for which the user has been determined to experience that mental state.
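By way of illustration, generic and contextual prevalence scores might be derived from a log of mental state observations as follows; the record structure and function names are illustrative assumptions:

```python
def generic_prevalence(observations, state):
    """Proportion of all observations in which the user was in `state`."""
    if not observations:
        return 0.0
    return sum(1 for o in observations if o["state"] == state) / len(observations)


def contextual_prevalence(observations, state, context_factor):
    """Proportion of observations made while `context_factor` was
    present in which the user was in `state`."""
    in_context = [o for o in observations if context_factor in o["context"]]
    if not in_context:
        return 0.0
    return sum(1 for o in in_context if o["state"] == state) / len(in_context)


# Hypothetical observation log: each entry records an inferred mental
# state and the contextual factors present at the time.
obs = [
    {"state": "stressed", "context": {"hot"}},
    {"state": "stressed", "context": set()},
    {"state": "calm",     "context": {"hot"}},
    {"state": "stressed", "context": set()},
]
generic_prevalence(obs, "stressed")            # 0.75
contextual_prevalence(obs, "stressed", "hot")  # 0.5
```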
The user's mental state profile could be established during an initial registration period and optionally updated on an ad hoc, periodic or continuous basis and/or in response to one or more event triggers.
Establishing and/or updating the user's mental state profile could require the user's active participation, for example to self-report their mental state according to a particular schedule (e.g. periodically, such as hourly or daily) or in response to one or more event triggers (e.g. whenever they request certain functionality, such as the functionality the method 200 is directed at securing access to, through a user interface). Alternatively or additionally, the mental state profile could be established and/or updated via active or passive collection of mental state biometrics from the user, for example from wearable devices and/or surveillance systems. Such collected mental state biometrics can for example be input to an anomaly detection algorithm facilitated by an artificial neural network trained on mental state biometrics collected from a population of individuals, so that discrepancies between responses of the user's mental state biometrics to certain stimuli and those of the population are identified. Prevalence scores for the user's mental state profile can then be derived by modifying average prevalence scores for the population according to the output of the anomaly detection algorithm.
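By way of illustration, deriving a user prevalence score by modifying a population average according to an anomaly detector's output might be sketched as follows. A simple bounded shift stands in here for the neural-network-facilitated anomaly detection; the score range and scaling factor are illustrative assumptions:

```python
def user_prevalence(population_prevalence, anomaly_score, max_shift=0.25):
    """Derive a user prevalence score by shifting the population average
    in proportion to the anomaly detector's output.

    anomaly_score lies in [-1, 1]: positive values indicate the user
    exhibits the mental state more readily than the population, negative
    values less readily. The result is clamped to a valid proportion."""
    shifted = population_prevalence + anomaly_score * max_shift
    return min(1.0, max(0.0, shifted))


user_prevalence(0.5, 0.8)  # population 50%, strong positive anomaly: 0.7
user_prevalence(0.9, 1.0)  # clamped at 1.0
```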
Data used to establish and/or update the mental state profile could be obtained from one or more user interfaces comprised in one or more devices with which the user interacts during the registration and/or update period, optionally including one or more of the aforementioned user interfaces.
Once the user's mental state profile has been established at step 202, it can be stored at step 204. It could for example be stored remotely from the device comprising the user interface through which the request for functionality is made, e.g. on a cloud server. In this way, the mental state profile could be accessible for use in determining whether to grant or deny functionality requests made through any device communicably coupled to such a remote storage device via one or more wired or wireless connections. Alternatively, storing the user mental state profile locally allows it to be kept more securely, for example if it is considered to comprise sensitive personal data.
An environmental mental state typicality score indicates a degree of typicality of the inference obtained at step 230 for the context in which the functionality request is made, based on an environmental context profile established from a plurality of individuals prior to the functionality request being made, as shown at step 206.
The environmental context profile could for example comprise a list of environmental context factors, for example ambient noise, light, temperature, pressure, humidity, precipitation or wind speed levels and/or local activity levels (e.g. pedestrian and/or vehicular traffic volume) and/or one or more combinations of such factors, each associated with a corresponding prevalence score for each of one or more mental states. The prevalence scores in the environmental context profile could correspond to the proportion of a population determined to experience a particular mental state when exposed to a particular environmental context factor or combination of environmental context factors.
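A minimal sketch of such an environmental context profile, with hypothetical factor names, mental states and prevalence values, might look like this:

```python
# Sketch: environmental context profile mapping context factors (or
# combinations of factors) to the proportion of the population
# observed to experience each mental state under them.

environmental_context_profile = {
    ("high_ambient_noise",): {"stressed": 0.42, "calm": 0.31},
    ("heavy_traffic", "low_light"): {"stressed": 0.55, "calm": 0.18},
    ("moderate_temperature",): {"stressed": 0.12, "calm": 0.71},
}

def environmental_typicality(profile, context_factors, inferred_state):
    """Prevalence of the inferred mental state for the sensed context;
    contexts with no profile entry fall back to a neutral 0.5."""
    entry = profile.get(tuple(sorted(context_factors)))
    if entry is None:
        return 0.5
    return entry.get(inferred_state, 0.0)

score = environmental_typicality(
    environmental_context_profile, ["low_light", "heavy_traffic"], "stressed"
)
```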
The environmental context profile could be established during an initial calibration period and optionally updated on an ad hoc, periodic or continuous basis, and/or in response to one or more event triggers, such as the process 200 being run. Establishing and/or updating the environmental context profile could be achieved using statistical modelling and/or machine learning techniques.
Once the environmental context profile has been established at step 206, it can be stored at step 208. It could for example be stored remotely from the device comprising the user interface through which the request for functionality is made, e.g. on a cloud server. In this way, the environmental context profile could be accessible for use in determining whether to grant or deny functionality requests made through any device communicably coupled to such a remote storage device via one or more wired or wireless connections.
Determining the environmental mental state typicality score comprises obtaining contextual data indicating a context in which the request for functionality was made from one or more environmental sensors in the vicinity of the user interface through which the functionality request was made, as shown at step 242. Such environmental sensors could for example comprise one or more sensors for determining natural conditions (e.g. light sensors, thermometers, barometers, humidity gauges, precipitation detectors or wind speed gauges) and/or one or more sensors for determining local activity such as pedestrian and/or vehicular traffic (e.g. cameras, microphones, vibration sensors, pressure pads or proximity sensors). All environmental sensors contributing to the contextual data could be collocated in a single device, or they could be distributed between a plurality of devices in the vicinity of the device comprising the user interface which receives the functionality request, said plurality of devices each being communicably coupled to the device which determines the environmental mental state typicality score via one or more wired or wireless connections.
The environmental mental state typicality score can be determined locally on a device comprising the one or more environmental sensors in order to reduce latency in said determination.
The individualised mental state typicality score could be determined in dependence on contextual data indicating a context in which the request for functionality was made. Such contextual data could comprise locally sensed environmental context data such as that described above in relation to the environmental mental state typicality score. Alternatively or additionally, the contextual data could comprise data obtained from other sources, such as time of day, day of the week and date (obtained from the clock/calendar of any device involved in the determination), whether the user is scheduled to be working that day or has an imminent appointment (obtained from an electronic calendar maintained by the user, their employer or another entity such as an authority responsible for securing access to a restricted area the user is attempting to enter), live traffic data (obtained from a remote traffic data monitoring system), and local activity schedules such as public transport, public event and waste collection timetables (obtained from remote databases). The contextual data could be used to adjust the individualised mental state typicality score in a generic way, e.g. with threshold adjustments to take into account context according to rules which are generic for all users (e.g. allow for increased stress on working days), or could be used to implement individualised adjustments if the user's mental state profile comprises data indicating typicality of the inference for the user in different contexts (e.g. allow for increased stress on a day when a dental appointment is scheduled).
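The generic, rule-based threshold adjustment described above might be sketched as follows; the rule set, threshold and adjustment values are purely hypothetical:

```python
# Sketch: relaxing the acceptance threshold for the individualised
# typicality score in contexts where atypical stress is expected,
# using rules that are generic for all users.

BASE_THRESHOLD = 0.6

def adjusted_threshold(context):
    threshold = BASE_THRESHOLD
    if context.get("working_day"):
        threshold -= 0.05   # allow for increased stress on working days
    if context.get("imminent_appointment"):
        threshold -= 0.10   # e.g. a scheduled dental appointment
    return max(threshold, 0.0)

def typicality_acceptable(score, context):
    """True if the individualised typicality score meets the
    context-adjusted threshold."""
    return score >= adjusted_threshold(context)

ok = typicality_acceptable(
    0.5, {"working_day": True, "imminent_appointment": True}
)
```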
Moving on to step 250, a policy decision is obtained in dependence on the inference obtained at step 230, the typicality score(s) obtained at step 240 and an access policy for the requested functionality. The access policy could be stored locally to the device making the policy decision to reduce the time taken to make that decision, or on a remote device to which it is communicably coupled via one or more wired or wireless connections to ensure the most up-to-date policy is always applied, enabling increased security.
If both an individualised mental state typicality score and an environmental mental state typicality score are available, then obtaining the policy decision can comprise combining them, e.g. by means of sum score fusion, at step 252.
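The sum score fusion of step 252 might be sketched as below; the weights, fused threshold and input scores are hypothetical, and a deployed system could choose them according to the access policy:

```python
# Sketch: sum score fusion of the individualised and environmental
# mental state typicality scores into a single score for the policy
# decision.

def sum_score_fusion(individual_score, environmental_score,
                     w_individual=0.5, w_environmental=0.5):
    """Weighted sum of the two typicality scores; with weights summing
    to 1 the result stays in the [0, 1] range."""
    return (w_individual * individual_score
            + w_environmental * environmental_score)

fused = sum_score_fusion(0.8, 0.4)            # approximately 0.6
decision = "grant" if fused >= 0.5 else "deny"
```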
Finally, at step 260, provision or denial of the requested functionality is initiated in accordance with the policy decision.
Figure 3 schematically illustrates an example data processing system 300 capable of performing the method 200. It comprises a processor 310 operably coupled to both a memory 320 and an interface 330.
The memory 320 can optionally comprise instructions which, when the program is executed by the processor 310, cause the data processing system 300 to carry out the method 200. Alternatively or additionally, the interface 330 can optionally comprise one or both of a physical interface 331 configured to receive a data carrier having such instructions stored thereon and a receiver 332 configured to receive a data carrier signal carrying such instructions.
The interface 330 can optionally comprise a user interface 333 configured to receive the request for functionality. Alternatively or additionally, the interface 330 can optionally comprise a receiver 332 configured to receive an indication of the request for functionality.
The interface 330 can optionally comprise a user interface 334 configured to receive one or more of the one or more authentication credentials. Alternatively or additionally, the interface 330 can optionally comprise a receiver 332 configured to receive one or more of the one or more authentication credentials and/or the authentication of the user's identity.
The interface 330 can optionally comprise one or more biometric devices 335 configured to directly measure one or more of the one or more mental state biometrics. Alternatively or additionally, the interface 330 can optionally comprise a receiver 332 configured to receive one or more of the one or more mental state biometrics and/or the inference of the user's state of mind.
The interface 330 can optionally comprise one or more environmental sensors 336 configured to directly measure contextual data indicating a context in which the request for functionality was made. Alternatively or additionally, the interface 330 can optionally comprise a receiver 332 configured to receive contextual data and/or a mental state profile for the user and/or an environmental context profile and/or an individualised mental state typicality score and/or an environmental mental state typicality score.
The interface 330 can optionally comprise a receiver 332 configured to receive the policy decision and/or the access policy.
The interface 330 can optionally comprise a user interface 337 configured to provide or deny the requested functionality and/or communicate the policy decision to the user. Alternatively or additionally, the interface 330 can optionally comprise a transmitter 338 configured to communicate the policy decision to a separate computing device to cause that separate computing device to provide or deny the requested functionality.
The receiver 332, when present, can comprise one or more wireless receiver modules and/or one or more wired receiver modules. Similarly, the transmitter 338, when present, can comprise one or more wireless transmitter modules and/or one or more wired transmitter modules.
Other embodiments will be apparent to those skilled in the art from consideration of the specification and practice of the embodiments disclosed herein. It is intended that the specification and examples be considered as exemplary only.
In addition, where this application has listed the steps of a method or procedure in a specific order, it could be possible, or even expedient in certain circumstances, to change the order in which some steps are performed, and it is intended that the particular steps of the method or procedure claims set forth herein not be construed as being order-specific unless such order specificity is expressly stated in the claim. That is, the operations/steps may be performed in any order, unless otherwise specified, and embodiments may include additional or fewer operations/steps than those disclosed herein. It is further contemplated that executing or performing a particular operation/step before, contemporaneously with, or after another operation is in accordance with the described embodiments.
The methods described herein may be encoded as executable instructions embodied in a computer readable medium, including, without limitation, non-transitory computer-readable storage, a storage device, and/or a memory device.
Such instructions, when executed by a processor (or one or more computers, processors, and/or other devices) cause the processor (the one or more computers, processors, and/or other devices) to perform at least a portion of the methods described herein. A non-transitory computer-readable storage medium includes, but is not limited to, volatile memory, non-volatile memory, magnetic and optical storage devices such as disk drives, magnetic tape, compact discs (CDs), digital versatile discs (DVDs), or other media that are capable of storing code and/or data.
Where a processor is referred to herein, this is to be understood to refer to a single processor or multiple processors operably connected to one another. Similarly, where a memory is referred to herein, this is to be understood to refer to a single memory or multiple memories operably connected to one another.
The methods and processes can also be partially or fully embodied in hardware modules or apparatuses or firmware, so that when the hardware modules or apparatuses are activated, they perform the associated methods and processes. The methods and processes can be embodied using a combination of code, data, and hardware modules or apparatuses.
Examples of processing systems, environments, and/or configurations that may be suitable for use with the embodiments described herein include, but are not limited to, embedded computer devices, personal computers, server computers (specific or cloud (virtual) servers), hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, mobile telephones, network personal computers (PCs), minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like. Hardware modules or apparatuses described in this disclosure include, but are not limited to, application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), dedicated or shared processors, and/or other hardware modules or apparatuses.
User devices can include, without limitation, static user devices such as PCs and mobile user devices such as smartphones, tablets, laptops and smartwatches. Receivers and transmitters as described herein may be standalone or may be comprised in transceivers. A communication link as described herein comprises at least one transmitter capable of transmitting data to at least one receiver over one or more wired or wireless communication channels. Wired communication channels can be arranged for electrical or optical transmission. Such a communication link can optionally further comprise one or more relaying transceivers.
User input devices can include, without limitation, microphones, buttons, keypads, touchscreens, touchpads, trackballs, joysticks, mice, gesture control devices and brain control (e.g. electroencephalography, EEG) devices. User output devices can include, without limitation, speakers, buzzers, display screens, projectors, indicator lights, haptic feedback devices and refreshable braille displays. User interface devices can comprise one or more user input devices, one or more user output devices, or both.

Claims (15)

  1. A computer-implemented security method, the method comprising, in response to a request for functionality made by a user through a user interface:
obtaining authentication of the user's identity based on one or more authentication credentials obtained from the user in response to the request;
obtaining an inference of the user's state of mind based on one or more mental state biometrics obtained from the user in response to the request;
obtaining an individualised mental state typicality score indicating a degree of typicality of that inference for the user based on a mental state profile established for the user prior to the request being made;
obtaining a policy decision in dependence on the inference, the individualised mental state typicality score and an access policy for the requested functionality; and
initiating provision or denial of the requested functionality in accordance with the policy decision.
  2. The method of claim 1, further comprising determining the individualised mental state typicality score in dependence on contextual data indicating a context in which the request for functionality was made.
  3. The method of claim 2, wherein the mental state profile comprises data indicating typicality of the inference for the user in that context, the individualised mental state typicality score being determined further in dependence thereon.
  4. The method of either of claims 2 or 3, further comprising, in response to the request, obtaining the contextual data from one or more environmental sensors in the vicinity of the user interface.
  5. The method of claim 4, further comprising obtaining an environmental mental state typicality score indicating a degree of typicality of the inference for the context based on an environmental context profile established from a plurality of individuals prior to the request being made; wherein the policy decision is obtained further in dependence thereon.
  6. The method of claim 5, further comprising obtaining the policy decision by combining the individualised mental state typicality score with the environmental mental state typicality score by means of sum score fusion.
  7. The method of either of claims 5 or 6, further comprising determining the environmental mental state typicality score locally on a device comprising the one or more environmental sensors.
  8. The method of any of claims 1 to 7, further comprising, prior to the request for functionality being made by the user, causing the mental state profile for the user to be stored remotely from a device comprising the user interface.
  9. The method of any preceding claim, wherein the one or more authentication credentials comprise at least one of the one or more mental state biometrics.
  10. The method of any preceding claim, further comprising, prior to the request for functionality being made by the user, establishing the mental state profile for the user using an anomaly detection algorithm facilitated by a neural network.
  11. The method of any preceding claim, further comprising making the inference using a support vector machine classifier or an artificial neural network classifier.
  12. A data processing system configured to perform the method of any preceding claim.
  13. A computer program comprising instructions which, when the program is executed by a computer, cause the computer to carry out the method of any of claims 1 to 11.
  14. A computer-readable data carrier having stored thereon the computer program of claim 13.
  15. A data carrier signal carrying the computer program of claim 13.
GB2008960.3A 2020-06-12 2020-06-12 Individualised computer-implemented security method and system Active GB2595930B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2008960.3A GB2595930B (en) 2020-06-12 2020-06-12 Individualised computer-implemented security method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2008960.3A GB2595930B (en) 2020-06-12 2020-06-12 Individualised computer-implemented security method and system

Publications (3)

Publication Number Publication Date
GB202008960D0 GB202008960D0 (en) 2020-07-29
GB2595930A true GB2595930A (en) 2021-12-15
GB2595930B GB2595930B (en) 2022-11-16

Family

ID=71835712

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2008960.3A Active GB2595930B (en) 2020-06-12 2020-06-12 Individualised computer-implemented security method and system

Country Status (1)

Country Link
GB (1) GB2595930B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120016827A1 (en) * 2010-07-19 2012-01-19 Lockheed Martin Corporation Biometrics with mental/ physical state determination methods and systems
US20170223017A1 (en) 2016-02-03 2017-08-03 Mastercard International Incorporated Interpreting user expression based on captured biometric data and providing services based thereon
WO2019048855A2 (en) * 2017-09-05 2019-03-14 B-Secur Limited Wearable authentication device
US10482698B2 (en) * 2015-05-01 2019-11-19 Assa Abloy Ab Invisible indication of duress via wearable
US20200167783A1 (en) * 2018-11-26 2020-05-28 Capital One Services, Llc Systems for detecting biometric response to attempts at coercion
