CN107122641A - Usage-habit-based smart device owner recognition method and owner recognition device - Google Patents

Usage-habit-based smart device owner recognition method and owner recognition device (Download PDF)

Info

Publication number
CN107122641A
Authority
CN
China
Prior art keywords
owner
smart device
identification
identification model
acceleration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710276970.8A
Other languages
Chinese (zh)
Other versions
CN107122641B (en)
Inventor
陈焰
朱添田
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Yidun Information Technology Co., Ltd.
Original Assignee
Hangzhou Anshi Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Anshi Information Technology Co Ltd
Priority to CN201710276970.8A
Publication of CN107122641A
Application granted
Publication of CN107122641B
Legal status: Active (granted)

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/316User authentication by observing the pattern of computer usage, e.g. typical user behaviour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00Payment architectures, schemes or protocols
    • G06Q20/38Payment protocols; Details thereof
    • G06Q20/40Authorisation, e.g. identification of payer or payee, verification of customer or shop credentials; Review and approval of payers, e.g. check credit lines or negative lists
    • G06Q20/401Transaction verification
    • G06Q20/4014Identity check for transactions

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Business, Economics & Management (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Accounting & Taxation (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Finance (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Social Psychology (AREA)
  • Software Systems (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention discloses a usage-habit-based smart device owner recognition method and owner recognition device. Recognition proceeds through the following steps: under a predetermined trigger condition, collect the motion parameters of the smart device at a preset sampling frequency within a sampling time, the motion parameters including acceleration and motion direction; segment the motion parameters in chronological order and extract a corresponding feature vector from each segment obtained by the segmentation; input the feature vectors of all segments into a pre-trained recognition model to obtain the probability that the smart device is being operated by its owner; if the probability is greater than a preset threshold, the recognition result is owner operation, otherwise the recognition result is non-owner operation. The invention overcomes limitations of existing risk-control systems such as privacy leakage, lack of data sharing and the need for specific training actions; the owner recognition method offers high recognition accuracy and fast speed, and supports real-time recognition.

Description

Usage-habit-based smart device owner recognition method and owner recognition device
Technical field
The present invention relates to the field of smart device security, and in particular to a usage-habit-based smart device owner recognition method and owner recognition device.
Background art
With the ever faster development of smart devices (including portable mobile communication devices such as smartphones and tablet computers), mobile payment has become a mainstream means of payment. Mobile payment vendors, such as Alipay and WeChat Pay, generally require the user to bind a bank card to the local application in order to complete quick payments. During a transaction, the merchant can let the user pay in close proximity through a POS system, or let the user complete the payment directly through the payment interface on the smartphone. Behind these convenient operations a series of risks is introduced: if a user's phone is stolen, an attacker has the opportunity to pay with the victim's phone. Traditional verification methods generally require the user to set up a personal credential, such as a password. However, such methods can only check whether the operator of the phone knows the corresponding credential; they lack a basic means of identifying the user as such, i.e. whether he or she is the owner of the smartphone. In view of these problems, risk-control strategies have been integrated into mobile phone payment, for example face recognition and location recognition. Such features can effectively characterize the smartphone owner and are hard for an attacker to bypass, thereby improving the security of mobile payment.
Although risk-control strategies are used in mobile payment applications, these verification methods have the following deficiencies, which prevent them from providing users with lasting protection:
(1) existing risk-control strategies more or less require the user to provide private information and to grant privacy-related permissions, and the extraction of this information poses a potential threat to smartphone users;
(2) each mobile application's service provider owns its own data-collection scheme and data cannot be shared between different applications, so user information is uploaded repeatedly and becomes redundant;
(3) the training and authentication mechanisms in risk-control strategies sometimes require the user to perform specific actions, such as placing the face within the camera range of the smartphone, or holding the smartphone at a fixed position on the body.
Summary of the invention
In view of the shortcomings of the prior art, the invention provides a usage-habit-based smart device owner recognition method and owner recognition device, which can effectively solve the privacy-leakage problem of existing recognition methods.
A usage-habit-based smart device owner recognition method comprises the following steps:
S1: under a predetermined trigger condition, collect the motion parameters of the smart device at a preset sampling frequency within a sampling time, the motion parameters including acceleration and motion direction;
S2: segment the motion parameters in chronological order, and extract a corresponding feature vector from each segment obtained by the segmentation;
S3: input the feature vectors of all segments into a pre-trained recognition model to obtain the probability that the smart device is being operated by its owner; if the probability is greater than a preset threshold, the recognition result is owner operation, otherwise the recognition result is non-owner operation.
To address the limitations of the prior art, the invention provides a usage-habit-based smart device owner recognition method, a real-time owner recognition method applicable to most smart devices. Features are extracted from the motion parameters produced while the owner uses the smart device, a behavioral model of each smart device's owner (i.e. the recognition model) is built, and the data collected when someone subsequently uses the phone are verified against the existing model. This provides an owner-authentication function and thereby enables effective risk control on the smart device, without requiring any additional privacy-related permissions or data.
The invention directly collects the acceleration and the motion direction of the smart device using, respectively, the acceleration sensor and the gyroscope sensor built into the smart device.
The predetermined trigger condition of step S1 is that the smart device switches from the dormant state to the active state. When a smart device is not in use it is usually in a dormant state: its screen is dark or it receives no operation. In the invention, the smart device is considered to switch from the dormant state to the active state when either of the following occurs:
(1) the screen of the smart device is lit;
(2) a new application is opened in the foreground of the smart device.
It should be noted that the sampling time and the sampling frequency are set according to the practical application requirements. Preferably, in the invention the sampling time is 3 s and the sampling frequency is 50 Hz, i.e. 50 samples are collected per second.
In step S2, when the motion parameters are segmented in chronological order, any two adjacent segments obtained have an overlapping portion.
Each collection of motion parameters corresponds to one sampling time and is segmented in chronological order. Preferably, in step S2 a sliding window is used to segment the motion parameters chronologically, and the window shift during segmentation is smaller than the window size. In a concrete implementation, the motion parameters collected at the different moments within the sampling time are first sorted chronologically, and the sliding window is then used for segmentation; each time the window is moved, its shift is smaller than the window size. Further preferably, the window size of the sliding window is 0.2 s and the window shift is 0.1 s.
Preferably, in step S2 the following feature parameters are extracted from each segment to form the corresponding feature vector: for both the acceleration and the motion direction, the mean, standard deviation, average, skewness, kurtosis, minimum, maximum, zero-crossing rate and RMS amplitude in each of the three directions X, Y and Z, as well as the root-sum-square of the acceleration and the root-sum-square of the motion direction.
In the invention, the recognition model is trained as follows:
obtain feature vectors of non-owners operating the smart device and feature vectors of the owner operating the smart device, and form a training set, where the ratio of the number of owner feature vectors to the number of non-owner feature vectors is 1:n, with n being 3 to 6;
train on the training set with a support vector machine to obtain the recognition model.
The training set of the invention contains 2000 to 5000 feature vectors. The feature vectors belonging to the owner are acquired during the initial motion-parameter collection phase; the feature vectors of non-owners are obtained from a third party (mostly internet companies that provide massive data and hold the motion-parameter information of a large number of users), using stratified sampling.
To improve recognition accuracy, the smart device owner recognition method of the invention further includes updating the recognition model until the recognition accuracy of the updated recognition model satisfies
α_new > A and Var(α) < V (both conditions satisfied simultaneously),
at which point updating stops, where A is the recognition-accuracy threshold, V is the variance threshold of the recognition accuracy, and Var(α) is the variance of the recognition accuracies of all recognition models obtained before this update together with the recognition model obtained by this update.
Each update of the recognition model proceeds as follows:
every time motion parameters are collected, the feature vectors extracted from one part of the motion parameters are added to the training set to form a new training set, and the feature vectors extracted from the other part form a test set;
a new recognition model is trained on the new training set, the recognition accuracies of the new and old recognition models are evaluated on the test set, and whether the recognition accuracies of the new and old recognition models satisfy the following condition is checked:
λ·α_new + (1 − λ)·α_old > α_old − β,
if the condition is satisfied, the old recognition model is replaced by the new recognition model to complete the update; otherwise no replacement is made. Here α_new and α_old are the recognition accuracies of the new and old recognition models respectively, λ is the weight of the new recognition model, with a value of 0 to 1, and β is a correction factor.
In the invention, updating of the recognition model only begins after the first training of the recognition model. During updating, the training set grows gradually: new feature vectors are added on top of the feature vectors already present since the first training. The test set, by contrast, is re-formed at every update (the feature vectors used in the test set of the previous update are not retained).
Unless otherwise specified, in the invention the old recognition model means the recognition model before the current update operation, and the new recognition model means the recognition model newly obtained during the current update operation.
In the invention, the values of λ, β, A and V are adjusted according to the specific application requirements. Preferably, λ = 0.8, β = 0.05, A = 0.8 and V = 0.1.
It should be noted that in the smart device recognition method of the invention, the recognition model is trained on motion parameters known to come from the owner's use; furthermore, with the recognition accuracy of the recognition model as its performance indicator, the recognition model is updated according to the recognition accuracy during preliminary use, so as to improve the recognition accuracy.
Preferably, the motion parameters further include gravitational acceleration; step S1 further uses the gravitational acceleration to screen the collected motion parameters so as to retain the valid ones, and steps S2 to S3 are performed on the screened acceleration and motion direction.
In the invention, when the gravitational acceleration is used to screen the collected motion parameters so as to retain valid acceleration and motion-direction values, if the gravitational acceleration satisfies the following condition, the corresponding acceleration and motion direction are considered invalid and deleted; otherwise they are considered valid and retained:
{X_min < X_gr(m) < X_max} ∪ {Y_min < Y_gr(m) < Y_max} ∪ {Z_min < |Z_gr(m)| < Z_max},
where X_gr(m), Y_gr(m) and Z_gr(m) are the components, in the X, Y and Z directions respectively, of the gravitational acceleration collected for the m-th time; X_min and X_max are the minimum and maximum thresholds in the X direction; Y_min and Y_max are the minimum and maximum thresholds in the Y direction; Z_min and Z_max are the minimum and maximum thresholds in the Z direction; m ranges from 1 to M, and M is the total number of motion-parameter collections within the sampling time.
In the invention X_min = −1.5, X_max = 1.5; Y_min = −1.5, Y_max = 1.5; Z_min = 9, Z_max = 10. It should be noted that the maximum and minimum thresholds of the gravitational acceleration in each direction can be adjusted according to the actual usage situation.
When using the smart device the user may not be holding it in hand but may have placed it on a flat surface, in which case the collected motion parameters are invalid. Screening the data with the gravitational acceleration effectively removes a large amount of invalid data and helps improve the accuracy of owner recognition.
It should be noted that in the invention the X, Y and Z directions are defined as follows: X represents the acceleration of the phone moving left and right, Y represents the acceleration of the phone moving forwards and backwards, and Z represents the acceleration in the phone's vertical direction.
Preferably, the motion parameters further include gravitational acceleration; step S1 further includes judging the current relative state of the smart device according to the Euclidean distance between the gravitational acceleration and the acceleration within the sampling time. Correspondingly, the recognition model includes a relatively-static recognition model and a relative-motion recognition model, and in step S3 the feature vectors of all segments are input, according to the relative state, into the corresponding recognition model for owner recognition.
The method for determining the relative state of the smart device from the relation between the gravitational acceleration and the acceleration in the motion parameters is as follows:
compute the mean values of the gravitational acceleration and of the acceleration in the motion parameters within the sampling time, and compute the Euclidean distance between these two mean values; if it exceeds a preset distance threshold, the smart device is considered to be in the relative-motion state, otherwise the smart device is considered to be in the relatively-static state.
The relatively-static recognition model and the relative-motion recognition model are trained in the same way as the recognition model described above, except that:
the feature vectors contained in the training set used to train the relatively-static recognition model come from the relatively-static state;
the feature vectors contained in the training set used to train the relative-motion recognition model come from the relative-motion state.
The invention also provides a usage-habit-based smart device owner recognition device, comprising:
a data acquisition module for collecting, under a predetermined trigger condition, the motion parameters of the smart device at a preset sampling frequency within a sampling time, the motion parameters including acceleration and motion direction;
a data processing module for segmenting the motion parameters in chronological order and extracting a corresponding feature vector from each segment obtained by the segmentation;
a recognition module for inputting the feature vectors of all segments into a pre-trained recognition model to obtain the probability that the smart device is being operated by its owner; if the probability is greater than a preset threshold, the recognition result is owner operation, otherwise the recognition result is non-owner operation.
Compared with the prior art, the usage-habit-based smart device owner recognition method and recognition device of the invention can effectively perform owner recognition of the smart device while effectively avoiding the risk of leaking user privacy.
Brief description of the drawings
Fig. 1 is a flow chart of the usage-habit-based smart device owner recognition method of this embodiment;
Figs. 2A, 2B and 2C show the fluctuation of the difference between the acceleration and the gravitational acceleration of the smart device in the X, Y and Z directions, respectively, in the relative-motion state;
Figs. 3A, 3B and 3C show the fluctuation of the difference between the acceleration and the gravitational acceleration of the smart device in the X, Y and Z directions, respectively, in the relatively-static state.
Detailed description of the embodiments
The invention is described in detail below with reference to the drawings and specific embodiments.
As shown in Fig. 1, the usage-habit-based smart device owner recognition method of this embodiment comprises the following steps:
S1: under a predetermined trigger condition, collect the motion parameters of the smart device at a preset sampling frequency within a sampling time, the motion parameters including acceleration and motion direction;
When a smart device is not in use it is usually in a dormant state: its screen is dark or no new application is opened. In this embodiment the predetermined trigger condition is that the smart device switches from the dormant state to the active state, and the smart device is considered to switch from the dormant state to the active state when either of the following occurs:
(1) the screen of the smart device is lit;
(2) a new application is opened in the foreground of the smart device.
The sampling frequency when collecting motion parameters is 50 Hz, i.e. 50 samples per second; each sample collects the values of the 3 sensors on the x, y and z axes, 9 values in total. Each collection corresponds to a sampling time of 3 seconds.
S2: segment the motion parameters in chronological order, and extract a corresponding feature vector from each segment obtained by the segmentation;
In this embodiment, when the motion parameters are segmented in chronological order, any two adjacent segments obtained have an overlapping portion.
The overlap ratio of the overlapping portion is generally between 30% and 80% and can be adjusted according to the practical application requirements. Each collection of motion parameters corresponds to one sampling time and is segmented in chronological order. As one implementation, step S2 uses a sliding window to segment the motion parameters chronologically, and the window shift during segmentation is smaller than the window size. In a concrete implementation, the motion parameters collected at the different moments within the sampling time are first sorted chronologically, and the sliding window is then used for segmentation; each time the window is moved, its shift is smaller than the window size.
In this embodiment, during segmentation the window size of the sliding window is 0.2 s (i.e. each segment contains 10 groups of motion parameters, where one group of motion parameters is the combination of the results of one sampling; in this embodiment 50 samples are taken per second, i.e. one every 0.02 s, so 0.2 s corresponds to exactly 10 samples), and the window shift is 0.1 s, i.e. the data overlap ratio between two adjacent segments is 50%. A minimal code sketch of this segmentation is given after the following note.
It should be noted that if the motion parameters consist of acceleration and motion direction, one group of motion parameters collected at one time is the combination of the collected acceleration and motion direction; similarly, if the motion parameters consist of acceleration, motion direction and gravitational acceleration, one group of motion parameters is the combination of the collected acceleration, motion direction and gravitational acceleration.
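The chronological sliding-window segmentation can be illustrated with a short sketch. The Python example below is illustrative only and not part of the patent text; the array layout (one row per sampling instant, 9 sensor values) and the function name are assumptions chosen for clarity.

```python
import numpy as np

def segment_motion_parameters(samples, fs=50, window_s=0.2, shift_s=0.1):
    """Split chronologically ordered samples into overlapping segments.

    samples : array of shape (N, D); in this sketch D = 9, one row per sampling
              instant holding the x/y/z values of the acceleration, motion-direction
              and gravity sensors (the function only slices rows, so any D works).
    Returns a list of arrays of shape (window, D); with the embodiment's values
    (0.2 s window, 0.1 s shift at 50 Hz) adjacent segments overlap by 50%.
    """
    window = int(window_s * fs)   # 10 samples per segment
    shift = int(shift_s * fs)     # 5 samples, i.e. 50% overlap
    return [samples[start:start + window]
            for start in range(0, len(samples) - window + 1, shift)]

# Example: one 3 s collection at 50 Hz gives 150 samples and 29 overlapping segments.
collection = np.random.randn(150, 9)
print(len(segment_motion_parameters(collection)))  # 29
```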
A feature vector is a characterization of the user's operation and represents the user's operating habits. In this embodiment the following feature parameters are extracted from each segment to form the corresponding feature vector: for both the acceleration and the motion direction, the mean, standard deviation, average, skewness, kurtosis, minimum, maximum, zero-crossing rate and RMS amplitude in each of the three directions X, Y and Z, as well as the root-sum-square of the acceleration and the root-sum-square of the motion direction. The feature values of the acceleration in the X direction are taken as an example:
1. Mean: μ = (1/K) Σ_{k=1..K} x(k)
2. Standard deviation: σ = sqrt((1/K) Σ_{k=1..K} (x(k) − μ)²)
3. Average:
4. Skewness: (1/K) Σ_{k=1..K} ((x(k) − μ)/σ)³
5. Kurtosis: (1/K) Σ_{k=1..K} ((x(k) − μ)/σ)⁴
6. Minimum: L = min{x(k), k = 1, ..., K}
7. Maximum: H = max{x(k), k = 1, ..., K}
8. Zero-crossing rate: (1/(K − 1)) Σ_{k=1..K−1} |sgn[x(k+1)] − sgn[x(k)]| / 2
9. RMS amplitude: sqrt((1/K) Σ_{k=1..K} x(k)²)
10. Root-sum-square: (1/K) Σ_{k=1..K} sqrt(x(k)² + y(k)² + z(k)²)
Here x(k), y(k) and z(k) are the components of the acceleration in the X, Y and Z directions in the k-th collected motion-parameter group, sgn[x(k)] is the sign (step) function of the acceleration in the k-th collected group, sgn[x(k+1)] is the sign function of the acceleration in the (k+1)-th collected group, and K is the total number of motion-parameter groups contained in the segment. The size of K depends on the sampling time, the preset sampling frequency and the window size; in this embodiment K = 10.
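A minimal sketch of the per-segment feature extraction follows. It is illustrative only: it assumes SciPy's conventions for skewness and kurtosis, and it omits feature 3 ("average"), whose exact formula is not reproduced in this text, so it yields 50 rather than the 56 features mentioned later in this embodiment.

```python
import numpy as np
from scipy.stats import skew, kurtosis

def axis_features(x):
    """Statistical features of one axis of one segment (x has shape (K,))."""
    sgn = np.sign(x)
    zero_crossing_rate = np.sum(np.abs(np.diff(sgn)) > 0) / (len(x) - 1)
    return [
        x.mean(),                   # 1. mean
        x.std(),                    # 2. standard deviation
        skew(x),                    # 4. skewness
        kurtosis(x),                # 5. kurtosis
        x.min(),                    # 6. minimum
        x.max(),                    # 7. maximum
        zero_crossing_rate,         # 8. zero-crossing rate
        np.sqrt(np.mean(x ** 2)),   # 9. RMS amplitude
    ]

def segment_feature_vector(acc, gyro):
    """Feature vector of one segment; acc and gyro have shape (K, 3)."""
    feats = []
    for sensor in (acc, gyro):
        for axis in range(3):
            feats.extend(axis_features(sensor[:, axis]))
        # 10. root-sum-square: combined magnitude averaged over the segment
        feats.append(np.mean(np.linalg.norm(sensor, axis=1)))
    return np.array(feats)
```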
S3: input the feature vectors of all segments into a pre-trained recognition model to obtain the probability that the smart device is being operated by its owner; if the probability is greater than a preset threshold, the recognition result is owner operation, otherwise the recognition result is non-owner operation.
In this embodiment the recognition model is trained as follows:
obtain feature vectors of non-owners operating the smart device and feature vectors of the owner operating the smart device, and form a training set, where the ratio of the number of owner feature vectors to the number of non-owner feature vectors is 1:n, with n being 3 to 6 (n = 5 in this embodiment);
train on the training set with a support vector machine to obtain the recognition model.
The RBF (radial basis function) kernel parameters are set during training:
i: the penalty coefficient C (value range 1 to 90000, preferably 100);
ii: γ, which controls the distribution of the data after mapping to the new feature space (value range 0 to 0.1, preferably 0.01).
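As a rough illustration of this training step, the sketch below uses scikit-learn's RBF-kernel SVC with the parameter values preferred above (C = 100, γ = 0.01) and a 1:5 owner/non-owner ratio. The use of scikit-learn and of probability calibration via `probability=True` are assumptions for the sake of the example, not requirements of the patent.

```python
import numpy as np
from sklearn.svm import SVC

def train_recognition_model(owner_vecs, non_owner_vecs, n=5, C=100.0, gamma=0.01):
    """Train the recognition model on a 1:n owner/non-owner training set.

    owner_vecs     : array (N_owner, D) of owner feature vectors (label 1)
    non_owner_vecs : array (>= n*N_owner, D) of non-owner feature vectors (label 0)
    """
    non_owner_vecs = non_owner_vecs[: n * len(owner_vecs)]   # keep the 1:n ratio
    X = np.vstack([owner_vecs, non_owner_vecs])
    y = np.concatenate([np.ones(len(owner_vecs)), np.zeros(len(non_owner_vecs))])
    model = SVC(kernel="rbf", C=C, gamma=gamma, probability=True)
    model.fit(X, y)
    return model

def owner_probability(model, segment_vecs):
    """Mean probability, over all segment feature vectors, that the owner is operating."""
    return model.predict_proba(segment_vecs)[:, 1].mean()
```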
As a preferred embodiment, the recognition model is further updated after it has been trained, until the recognition accuracy of the updated recognition model satisfies
α_new > A and Var(α) < V,
at which point updating stops, where A is the recognition-accuracy threshold, V is the variance threshold of the recognition accuracy, and Var(α) is the variance of the recognition accuracies of all recognition models obtained before this update together with the recognition model obtained by this update.
Each update of the recognition model proceeds as follows:
every time motion parameters are collected, the feature vectors extracted from one part of the motion parameters are added to the training set to form a new training set, and the feature vectors extracted from the other part form a test set;
a new recognition model is trained on the new training set, the recognition accuracies of the new and old recognition models are evaluated on the test set, and whether the recognition accuracies of the new and old recognition models satisfy the following condition is checked:
λ·α_new + (1 − λ)·α_old > α_old − β,
if the condition is satisfied, the old recognition model is replaced by the new recognition model to complete the update; otherwise no replacement is made. Here α_new and α_old are the recognition accuracies of the new and old recognition models respectively, λ is the weight of the new recognition model, with a value of 0 to 1, and β is a correction factor.
In this embodiment the training set for the first training preferably contains 4000 feature vectors; in subsequent updates of the recognition model, the newly obtained feature vectors are added to the training set each time, i.e. the training set keeps growing.
In this embodiment λ takes a value of 0 to 1, preferably 0.8; β takes a value of 0 to 0.1, preferably 0.05; A takes a value of 0.7 to 1, preferably 0.8; and V takes a value of 0 to 1, preferably 0.1.
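The update criterion can be expressed compactly as in the sketch below; it is a schematic rendering of the replacement test and the stopping condition with the preferred values λ = 0.8, β = 0.05, A = 0.8 and V = 0.1, and the accuracy-evaluation step is assumed to be supplied by the caller.

```python
import numpy as np

LAMBDA, BETA, A, V = 0.8, 0.05, 0.8, 0.1   # preferred values from this embodiment

def should_replace(acc_new, acc_old, lam=LAMBDA, beta=BETA):
    """Replace the old model if the weighted accuracy does not drop by more than beta."""
    return lam * acc_new + (1 - lam) * acc_old > acc_old - beta

def should_stop_updating(accuracy_history, acc_current, a=A, v=V):
    """Stop when the current accuracy exceeds A and the accuracy variance is below V."""
    accs = np.array(list(accuracy_history) + [acc_current])
    return acc_current > a and np.var(accs) < v

# One update round (evaluate() is assumed to return accuracy on the new test set):
# acc_new, acc_old = evaluate(new_model, test_set), evaluate(old_model, test_set)
# if should_replace(acc_new, acc_old):
#     old_model = new_model
```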
A new feature vector is judged by the trained classifier, whose output lies between 0 and 1 and indicates whether the corresponding feature vector is closer to the owner or to other people. What the recognition model actually outputs is the probability p of being the owner, which is then handled as follows:
if p > γ the feature vector is judged to be an owner operation, otherwise a non-owner operation, where γ is a decision threshold between 0 and 1. Finally, the decisions over the feature vectors of all inputs are counted to judge whether it is the owner who is using the smart device.
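A small sketch of the per-vector decision and the final count follows; the majority-vote aggregation is an assumption, since the text only states that the decisions over all input feature vectors are counted.

```python
import numpy as np

def recognize_owner(probabilities, gamma=0.5):
    """Threshold each per-segment owner probability at gamma and aggregate by counting.

    probabilities : iterable of classifier outputs p in [0, 1], one per feature vector.
    Returns True if the majority of feature vectors are judged to be owner operations
    (majority voting is an assumed aggregation rule).
    """
    decisions = np.asarray(list(probabilities)) > gamma   # per-vector owner / non-owner
    return decisions.mean() > 0.5
```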
The training method of this embodiment uses a support vector machine (SVM) for classification learning, together with semi-supervised online learning:
First, training is performed on the owner's motion parameters. Suppose n groups of feature vectors are obtained while one user uses the smart device and there are p users; then there are n·p groups of feature vectors in total to compose the training set. The training data composed of the owner's own motion parameters are labelled 1, and the data set composed of the motion parameters of others (i.e. non-owners) is labelled 0. When p is very large, the owner's data volume and the other users' data volume become severely imbalanced.
Further, in this embodiment stratified sampling is used to solve the above problem. A PCA analysis of the 56 features shows that the 10th feature, the average root sum square of acceleration (ARSSA) from the acceleration sensor, has the most obvious influence on the result. Taking it as the most important feature, the feature vectors of all other users are sorted by ARSSA value and divided from small to large into five equal intervals, and the same amount of data is then chosen from each interval. To ensure temporal continuity, the 99 consecutive samples following each selected sample (feature vector) are also added to the training set.
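The stratified sampling of non-owner data by ARSSA can be sketched as below; the column index of the ARSSA feature, the per-stratum quota and the helper names are illustrative assumptions, not values given by the patent.

```python
import numpy as np

def stratified_sample_non_owner(vectors, arssa_col, per_stratum, n_strata=5, run=99):
    """Pick an equal number of seed samples from each of n_strata ARSSA intervals and,
    to preserve temporal continuity, also take the 99 samples that follow each seed.

    vectors     : array (N, D) of non-owner feature vectors in chronological order
    arssa_col   : column index of the ARSSA feature (assumed)
    per_stratum : number of seed samples drawn from each interval
    """
    order = np.argsort(vectors[:, arssa_col])          # sort by ARSSA, small to large
    strata = np.array_split(order, n_strata)            # five equal intervals
    rng = np.random.default_rng(0)
    selected = set()
    for stratum in strata:
        for idx in rng.choice(stratum, size=per_stratum, replace=False):
            selected.update(range(int(idx), min(int(idx) + run + 1, len(vectors))))
    return vectors[sorted(selected)]
```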
The smart device may not be held in the hand during use but placed on a flat surface, in which case the collected motion parameters are invalid data. To avoid the influence of invalid data on the recognition accuracy, this embodiment further uses the gravitational acceleration to judge the validity of each collection of motion parameters and deletes the invalid data (motion parameters judged to be invalid). The details are as follows:
The motion parameters further include gravitational acceleration; step S1 further uses the gravitational acceleration to screen the collected motion parameters so as to retain the valid ones, and steps S2 to S3 are performed on the screened acceleration and motion direction. When the gravitational acceleration is used to screen the collected motion parameters so as to retain valid acceleration and motion-direction values, if the gravitational acceleration satisfies the following condition, the corresponding acceleration and motion direction are considered invalid and deleted; otherwise they are considered valid and retained:
{X_min < X_gr(m) < X_max} ∪ {Y_min < Y_gr(m) < Y_max} ∪ {Z_min < |Z_gr(m)| < Z_max},
where X_gr(m), Y_gr(m) and Z_gr(m) are the components, in the X, Y and Z directions respectively, of the gravitational acceleration collected for the m-th time; X_min and X_max are the minimum and maximum thresholds in the X direction; Y_min and Y_max are the minimum and maximum thresholds in the Y direction; Z_min and Z_max are the minimum and maximum thresholds in the Z direction; m ranges from 1 to M, and M is the total number of motion-parameter collections within the sampling time.
In this embodiment, X_min = −1.5, X_max = 1.5, Y_min = −1.5, Y_max = 1.5, Z_min = 9, Z_max = 10, and M = 150.
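A minimal sketch of the gravity-based validity screen follows, using the thresholds of this embodiment and implementing the condition exactly as it is written above (a union of the three per-axis range checks); the array layout is an assumption.

```python
import numpy as np

X_MIN, X_MAX = -1.5, 1.5
Y_MIN, Y_MAX = -1.5, 1.5
Z_MIN, Z_MAX = 9.0, 10.0

def is_invalid(gravity_sample):
    """True if the gravity reading satisfies the invalidity condition of this embodiment.

    gravity_sample : (gx, gy, gz) components of one gravitational-acceleration sample.
    The union below mirrors the condition given in the text.
    """
    gx, gy, gz = gravity_sample
    return ((X_MIN < gx < X_MAX) or (Y_MIN < gy < Y_MAX)
            or (Z_MIN < abs(gz) < Z_MAX))

def screen_collection(acc, gyro, gravity):
    """Keep only the acceleration / motion-direction samples whose gravity reading is valid."""
    keep = np.array([not is_invalid(g) for g in gravity])
    return acc[keep], gyro[keep]
```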
When the user is using the smart device, changes in the device's own state also have a great influence on the judgement. In this embodiment the motion state is divided into the relatively-static state and the relative-motion state. Compared with the relatively-static state, in the relative-motion state the difference between the acceleration of the smart device and the gravitational acceleration fluctuates much more strongly. Figs. 2A, 2B and 2C show the fluctuation of the difference between the acceleration and the gravitational acceleration of the smart device in the X, Y and Z directions, respectively, in the relative-motion state; Figs. 3A, 3B and 3C show the corresponding fluctuations in the relatively-static state.
Taking the fluctuation under different relative states into account so as to improve recognition accuracy, this embodiment provides a preferred implementation:
The motion parameters further include gravitational acceleration; step S1 further includes judging the current relative state of the smart device according to the Euclidean distance between the gravitational acceleration and the acceleration within the sampling time. Correspondingly, the recognition model includes a relatively-static recognition model and a relative-motion recognition model, and in step S3 the feature vectors of all segments are input, according to the relative state, into the corresponding recognition model for owner recognition. The method for determining the relative state of the smart device from the relation between the gravitational acceleration and the acceleration in the motion parameters is as follows:
compute the mean values of the gravitational acceleration and of the acceleration in the motion parameters within the sampling time, and compute the Euclidean distance between these two mean values; if it exceeds a preset distance threshold, the smart device is considered to be in the relative-motion state, otherwise the smart device is considered to be in the relatively-static state.
The relation between the gravitational acceleration and the acceleration in each collection of motion parameters is thus used to judge the relative state of the smart device at the time the motion parameters were collected; the different states fall into two large classes, the relative-motion state and the relatively-static state, and a corresponding classifier (i.e. recognition model) is used for each class.
When new motion parameters need to be recognized, the relative state of the smart device at the time the motion parameters were collected (relative-motion or relatively-static) is determined first, and the parameters are then input into the corresponding classifier for recognition.
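The relative-state decision can be sketched as follows; the distance threshold value is an assumption (the patent only specifies "a preset distance threshold"), and the routing comment mirrors the two recognition models described here.

```python
import numpy as np

def relative_state(acc, gravity, dist_threshold=1.0):
    """Classify one collection as relative-motion or relatively-static.

    acc, gravity   : arrays (M, 3) of acceleration and gravitational-acceleration samples
                     from one sampling time (M = 150 in this embodiment).
    dist_threshold : assumed value; the patent leaves it as a preset parameter.
    """
    distance = np.linalg.norm(acc.mean(axis=0) - gravity.mean(axis=0))
    return "motion" if distance > dist_threshold else "static"

# Routing to the two classifiers, e.g.:
# model = models[relative_state(acc, gravity)]   # models = {"motion": ..., "static": ...}
```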
The relatively-static recognition model and the relative-motion recognition model are trained in the same way, except that the feature vectors contained in the training set used to train the relatively-static recognition model come from the relatively-static state, while the feature vectors contained in the training set used to train the relative-motion recognition model come from the relative-motion state.
The smart device owner recognition method of the invention is implemented on a recognition system comprising the smart device and a server communicatively connected with the smart device.
The method can use either an offline recognition mode or an online recognition mode:
(a) In the offline recognition mode:
the user first downloads the pre-trained recognition model from the server via a network connection; while the smart device is in use, the motion parameters are collected locally on the smart device, and the data processing (including segmentation and feature-vector extraction) and the final recognition are also performed locally.
(b) In the online recognition mode:
the server obtains the motion parameters of the smart device and performs the data processing (including segmentation and feature-vector extraction) and the final recognition.
For convenience, as one implementation, in the online recognition mode the server obtains the motion parameters of the smart device by having the smart device collect the motion parameters locally and send them to the server.
The smart device owner recognition method of this embodiment is illustrated below taking a smartphone as the preferred implementation. Each smartphone has a unique device number as the identifier of the phone.
Before owner recognition is performed:
The smartphone needs to install and deploy the corresponding application. Under the predetermined trigger condition, the application deployed on the phone takes the values collected by the acceleration sensor, the gyroscope sensor and the gravity sensor as the motion parameters, and uploads them, together with the device number of the smartphone, to the remote server (i.e. the server) connected with the smartphone.
After obtaining the motion parameters, the server pre-processes them as follows:
screen the obtained motion parameters using the gravitational acceleration, then segment the screened parameters and extract the feature vector of each segment;
determine the relative state corresponding to each collection of motion parameters according to the relation between the gravitational acceleration and the acceleration.
After pre-processing, the server trains the recognition models from the feature vectors corresponding to the screened motion parameters; the recognition models here include a relatively-static recognition model and a relative-motion recognition model, where:
the relatively-static recognition model is trained as follows:
the feature vectors under the relatively-static state, together with the non-owner feature vectors obtained from the third party, form the training set used to train the recognition model, and the obtained recognition model is then updated;
the relative-motion recognition model is trained in the same way as the relatively-static recognition model, except that the feature vectors in the training set are changed to those corresponding to the relative-motion state.
After updating is finished, owner recognition can be performed. When offline recognition is used: the updated recognition model is first downloaded from the server; the collected motion parameters are pre-processed locally on the smart device to determine the relative state of the smart device at the time the motion parameters were collected and the feature vectors of the motion parameters, and the feature vectors are then input into the corresponding recognition model according to the determined relative state.
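Putting the pieces together, offline recognition on the device could look roughly like the sketch below. It simply chains the helpers sketched earlier in this description (gravity screening, sliding-window segmentation, feature extraction, state routing, thresholding) and is an illustration of the flow under those assumptions, not the patent's reference implementation.

```python
import numpy as np

def offline_recognize(acc, gyro, gravity, models, gamma=0.5):
    """Offline owner recognition for one collection, chaining the helpers sketched above.

    acc, gyro, gravity : arrays (M, 3) from one sampling time; the downloaded,
                         updated recognition models run locally on the device.
    models             : {"static": ..., "motion": ...} pre-trained recognition models.
    """
    state = relative_state(acc, gravity)                 # relatively static or in motion
    acc, gyro = screen_collection(acc, gyro, gravity)    # drop samples taken while lying flat
    feats = [segment_feature_vector(a, g)                # one feature vector per segment
             for a, g in zip(segment_motion_parameters(acc),
                             segment_motion_parameters(gyro))]
    probs = models[state].predict_proba(np.array(feats))[:, 1]
    return (probs > gamma).mean() > 0.5                  # owner if most segments say owner
```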
During online recognition, the collected motion parameters are sent directly to the server, and the pre-processing and recognition operations are performed by the server; this is not elaborated further here.
The owner recognition method of this embodiment can be connected directly to the mobile payment applications installed on the smart device. A mobile payment provider that receives a transaction request sent from the phone only needs to query the server as to whether it is the owner who is operating the smartphone, thereby achieving the purpose of security protection. This verification method allows the originally separate authentication schemes of the individual mobile payment providers to be unified; in addition, the verification data need only be uploaded once, rather than uploaded separately to the server of each verification provider.
The owner recognition method of this embodiment is mainly intended to remedy the deficiencies of existing smartphone risk-control mechanisms. A dedicated classifier is built from the behavioral habits of the user when using the smart device, and is then used to judge whether the person using the phone is the authenticated owner. Only data from motion sensors, which are insensitive to user privacy, need to be collected and handed to the third-party service, without any extra support. The sensor-based real-time smartphone owner recognition therefore improves the security protection of any mobile application that requires user authentication. For example, Alice and Bob are friends and Alice has left her phone at Bob's home; if the Facebook application on Alice's phone logs in automatically, Bob can view Alice's private Facebook information without Alice's consent. If the sensor-based real-time smartphone owner recognition method is deployed on Alice's phone, Bob's operating behavior as a non-owner user will be detected, and a protective action can then be taken, such as a mail notification or redirection to a blank page.
In practical applications, to improve security, the owner can be notified, for example by a mail reminder, whenever the operation is recognized as not being performed by the owner.
The above embodiments describe the technical solution and beneficial effects of the invention in detail. It should be understood that the above are only the most preferred embodiments of the invention and are not intended to limit the invention; any modification, supplement or equivalent substitution made within the scope of the principles of the invention shall fall within the scope of protection of the invention.

Claims (10)

1. A usage-habit-based smart device owner recognition method, characterized by comprising the following steps:
S1: under a predetermined trigger condition, collecting the motion parameters of the smart device at a preset sampling frequency within a sampling time, the motion parameters including acceleration and motion direction;
S2: segmenting the motion parameters in chronological order, and extracting a corresponding feature vector from each segment obtained by the segmentation;
S3: inputting the feature vectors of all segments into a pre-trained recognition model to obtain the probability that the smart device is being operated by its owner; if the probability is greater than a preset threshold, the recognition result is owner operation, otherwise the recognition result is non-owner operation.
2. The usage-habit-based smart device owner recognition method according to claim 1, characterized in that the predetermined trigger condition of step S1 is that the smart device switches from the dormant state to the active state.
3. The usage-habit-based smart device owner recognition method according to claim 1, characterized in that the recognition model is trained as follows:
obtaining feature vectors of non-owners operating the smart device and feature vectors of the owner operating the smart device, and forming a training set, wherein the ratio of the number of owner feature vectors to the number of non-owner feature vectors is 1:n, with n being 3 to 6;
training on the training set with a support vector machine to obtain the recognition model.
4. The usage-habit-based smart device owner recognition method according to claim 3, characterized by further comprising updating the recognition model until the recognition accuracy of the updated recognition model satisfies
α_new > A and Var(α) < V,
at which point updating stops, wherein A is the recognition-accuracy threshold, V is the variance threshold of the recognition accuracy, and Var(α) is the variance of the recognition accuracies of all recognition models obtained before this update together with the recognition model obtained by this update;
each update of the recognition model proceeds as follows:
every time motion parameters are collected, the feature vectors extracted from one part of the motion parameters are added to the training set to form a new training set, and the feature vectors extracted from the other part form a test set;
a new recognition model is trained on the new training set, the recognition accuracies of the new and old recognition models are evaluated on the test set, and whether the recognition accuracies of the new and old recognition models satisfy the following condition is judged:
λ·α_new + (1 − λ)·α_old > α_old − β,
if the condition is satisfied, the old recognition model is replaced by the new recognition model to complete the update, otherwise no replacement is made, wherein α_new and α_old are the recognition accuracies of the new and old recognition models respectively, λ is the weight of the new recognition model, with a value of 0 to 1, and β is a correction factor.
5. The usage-habit-based smart device owner recognition method according to any one of claims 1 to 4, characterized in that the motion parameters further include gravitational acceleration, step S1 further uses the gravitational acceleration to screen the collected motion parameters so as to retain the valid ones, and steps S2 to S3 are performed on the screened acceleration and motion direction.
6. The usage-habit-based smart device owner recognition method according to claim 5, characterized in that, when the gravitational acceleration is used to screen the collected motion parameters so as to retain valid acceleration and motion-direction values, if the gravitational acceleration satisfies the following condition, the corresponding acceleration and motion direction are considered invalid and deleted, otherwise they are considered valid and retained:
{X_min < X_gr(m) < X_max} ∪ {Y_min < Y_gr(m) < Y_max} ∪ {Z_min < |Z_gr(m)| < Z_max},
wherein X_gr(m), Y_gr(m) and Z_gr(m) are the components, in the X, Y and Z directions respectively, of the gravitational acceleration collected for the m-th time; X_min and X_max are the minimum and maximum thresholds in the X direction; Y_min and Y_max are the minimum and maximum thresholds in the Y direction; Z_min and Z_max are the minimum and maximum thresholds in the Z direction; m ranges from 1 to M, and M is the total number of motion-parameter collections within the sampling time.
7. The usage-habit-based smart device owner recognition method according to any one of claims 1 to 4, characterized in that the motion parameters further include gravitational acceleration, step S1 further includes judging the current relative state of the smart device according to the Euclidean distance between the gravitational acceleration and the acceleration within the sampling time, and correspondingly the recognition model includes a relatively-static recognition model and a relative-motion recognition model, and in step S3 the feature vectors of all segments are input, according to the relative state, into the corresponding recognition model for owner recognition.
8. The usage-habit-based smart device owner recognition method according to claim 7, characterized in that the method for determining the relative state of the smart device from the relation between the gravitational acceleration and the acceleration in the motion parameters is as follows:
computing the mean values of the gravitational acceleration and of the acceleration in the motion parameters within the sampling time, and computing the Euclidean distance between these two mean values; if it exceeds a preset distance threshold, the smart device is considered to be in the relative-motion state, otherwise the smart device is considered to be in the relatively-static state.
9. The usage-habit-based smart device owner recognition method according to claim 8, characterized in that the feature vectors contained in the training set used to train the relatively-static recognition model come from the relatively-static state;
the feature vectors contained in the training set used to train the relative-motion recognition model come from the relative-motion state.
10. A usage-habit-based smart device owner recognition device, characterized by comprising:
a data acquisition module for collecting, under a predetermined trigger condition, the motion parameters of the smart device at a preset sampling frequency within a sampling time, the motion parameters including acceleration and motion direction;
a data processing module for segmenting the motion parameters in chronological order and extracting a corresponding feature vector from each segment obtained by the segmentation;
a recognition module for inputting the feature vectors of all segments into a pre-trained recognition model to obtain the probability that the smart device is being operated by its owner; if the probability is greater than a preset threshold, the recognition result is owner operation, otherwise the recognition result is non-owner operation.
CN201710276970.8A 2017-04-25 2017-04-25 Intelligent equipment owner identification method and intelligent equipment owner identification device based on use habit Active CN107122641B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710276970.8A CN107122641B (en) 2017-04-25 2017-04-25 Intelligent equipment owner identification method and intelligent equipment owner identification device based on use habit

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710276970.8A CN107122641B (en) 2017-04-25 2017-04-25 Intelligent equipment owner identification method and intelligent equipment owner identification device based on use habit

Publications (2)

Publication Number Publication Date
CN107122641A true CN107122641A (en) 2017-09-01
CN107122641B CN107122641B (en) 2020-06-16

Family

ID=59725823

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710276970.8A Active CN107122641B (en) 2017-04-25 2017-04-25 Intelligent equipment owner identification method and intelligent equipment owner identification device based on use habit

Country Status (1)

Country Link
CN (1) CN107122641B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101365193A (en) * 2007-08-09 2009-02-11 财团法人Seoul大学校产学协力财团 System and method for customer authentication execution based on customer behavior mode
CN103077356A (en) * 2013-01-11 2013-05-01 中国地质大学(武汉) Protecting and tracking method for primary information of mobile terminal based on user behavior pattern
CN103530546A (en) * 2013-10-25 2014-01-22 东北大学 Identity authentication method based on mouse behaviors of user
CN103533546A (en) * 2013-10-29 2014-01-22 无锡赛思汇智科技有限公司 Implicit user verification and privacy protection method based on multi-dimensional behavior characteristics
CN103530543A (en) * 2013-10-30 2014-01-22 无锡赛思汇智科技有限公司 Behavior characteristic based user recognition method and system
CN103699822A (en) * 2013-12-31 2014-04-02 同济大学 Application system and detection method for users' abnormal behaviors in e-commerce based on mouse behaviors
CN103927471A (en) * 2014-04-18 2014-07-16 电子科技大学 Authentication method and device
CN104268481A (en) * 2014-10-10 2015-01-07 中国联合网络通信集团有限公司 Method and device for realizing early warning of smart phone
CN105389486A (en) * 2015-11-05 2016-03-09 同济大学 Authentication method based on mouse behavior
US20160080936A1 (en) * 2014-09-16 2016-03-17 Samsung Electronics Co., Ltd. Systems and methods for device based authentication
CN105528613A (en) * 2015-11-30 2016-04-27 南京邮电大学 Behavior identification method based on GPS speed and acceleration data of smart phone

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109214447A (en) * 2018-08-27 2019-01-15 郑州云海信息技术有限公司 Model training method and device, disk life-span prediction method and device
CN109214447B (en) * 2018-08-27 2021-10-29 郑州云海信息技术有限公司 Disk life prediction method and device
CN109977639A (en) * 2018-10-26 2019-07-05 招商银行股份有限公司 Identity identifying method, device and computer readable storage medium
CN110223672B (en) * 2019-05-16 2021-04-23 九牧厨卫股份有限公司 Offline multi-language voice recognition method
CN110223672A (en) * 2019-05-16 2019-09-10 九牧厨卫股份有限公司 A kind of multilingual audio recognition method of off-line type
CN110795722A (en) * 2019-10-25 2020-02-14 支付宝(杭州)信息技术有限公司 Deployment method and device of security authentication model and electronic equipment
CN112784224B (en) * 2019-11-08 2024-01-30 中国电信股份有限公司 Terminal safety protection method, device and system
CN112784224A (en) * 2019-11-08 2021-05-11 中国电信股份有限公司 Terminal safety protection method, device and system
CN111061376A (en) * 2019-12-25 2020-04-24 浙江每日互动网络科技股份有限公司 Method and server for identifying terminal user change machine based on mobile terminal data
CN111142688A (en) * 2019-12-25 2020-05-12 浙江每日互动网络科技股份有限公司 Method and server for identifying terminal user change machine based on mobile terminal data
CN111126294A (en) * 2019-12-25 2020-05-08 浙江每日互动网络科技股份有限公司 Method and server for recognizing gait of terminal user based on mobile terminal data
CN111062352A (en) * 2019-12-25 2020-04-24 浙江每日互动网络科技股份有限公司 Method and server for recognizing gait of terminal user based on mobile terminal data
CN111062353A (en) * 2019-12-25 2020-04-24 浙江每日互动网络科技股份有限公司 Method and server for acquiring gait feature data of terminal user based on mobile terminal data
CN110990819B (en) * 2019-12-25 2023-04-21 每日互动股份有限公司 Method and server for acquiring gait feature data of terminal user based on mobile terminal data
CN111062353B (en) * 2019-12-25 2023-04-28 每日互动股份有限公司 Method and server for acquiring gait feature data of terminal user based on mobile terminal data
CN110990819A (en) * 2019-12-25 2020-04-10 浙江每日互动网络科技股份有限公司 Method and server for acquiring gait feature data of terminal user based on mobile terminal data
CN111404941A (en) * 2020-03-17 2020-07-10 广东九联科技股份有限公司 Network security protection method and network security protection device
CN111626769A (en) * 2020-04-30 2020-09-04 北京芯盾时代科技有限公司 Man-machine recognition method and device and storage medium
CN111626769B (en) * 2020-04-30 2021-04-06 北京芯盾时代科技有限公司 Man-machine recognition method and device and storage medium
CN111490995A (en) * 2020-06-12 2020-08-04 支付宝(杭州)信息技术有限公司 Model training method and device for protecting privacy, data processing method and server
CN112989980A (en) * 2021-03-05 2021-06-18 华南理工大学 Target detection system and method based on web cloud platform

Also Published As

Publication number Publication date
CN107122641B (en) 2020-06-16

Similar Documents

Publication Publication Date Title
CN107122641A Usage-habit-based smart device owner recognition method and owner recognition device
CN103530540B (en) User identity attribute detection method based on man-machine interaction behavior characteristics
CN107016346A (en) gait identification method and system
CN109685647A (en) The training method of credit fraud detection method and its model, device and server
CN105975959A (en) Face characteristic extraction modeling method based on neural network, face identification method, face characteristic extraction modeling device and face identification device
CN110458687A (en) The automatic measures and procedures for the examination and approval of decision, device and computer readable storage medium
CN107392110A (en) Beautifying faces system based on internet
CN107222865A (en) The communication swindle real-time detection method and system recognized based on suspicious actions
CN107481019A (en) Order fraud recognition methods, system, storage medium and electronic equipment
CN109697416A (en) A kind of video data handling procedure and relevant apparatus
CN109949286A (en) Method and apparatus for output information
CN106022030A (en) Identity authentication system and method based on user habit behavior features
CN106600423A (en) Machine learning-based car insurance data processing method and device and car insurance fraud identification method and device
CN105830081A (en) Methods and systems of generating application-specific models for the targeted protection of vital applications
CN106096662A (en) Human motion state identification based on acceleration transducer
CN105931116A (en) Automated credit scoring system and method based on depth learning mechanism
CN107368971A (en) The methods of marking and device of a kind of personal credit
CN107194216A (en) A kind of mobile identity identifying method and system of the custom that swiped based on user
CN102567788A (en) Real-time identification system and real-time identification method for fraudulent practice in communication services
CN101827002A (en) Concept drift detection method of data flow classification
CN106845403A (en) A kind of method that its identity speciality is determined by user behavior track
CN109978870A (en) Method and apparatus for output information
CN106228133A (en) User authentication method and device
CN107682344A (en) A kind of ID collection of illustrative plates method for building up based on DPI data interconnection net identifications
CN108415653A (en) Screen locking method and device for terminal device

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20200211

Address after: Room 431, Building 7, No. 5, No. 3 Road, Genshan Branch, Jianggan District, Hangzhou City, Zhejiang Province, 310004

Applicant after: Hangzhou Yidun Information Technology Co., Ltd.

Address before: 311231, room 3, building 2, building 28, 301 poly Road, Hangzhou, Zhejiang, Binjiang District, -271

Applicant before: Hangzhou Anshi Information Technology Co. Ltd.

GR01 Patent grant
GR01 Patent grant