CN113570452B - Method, system, storage medium and terminal for solving fraud detection by quantum hidden Markov model - Google Patents

Method, system, storage medium and terminal for solving fraud detection by quantum hidden Markov model

Info

Publication number
CN113570452B
CN113570452B (application number CN202110960351.7A)
Authority
CN
China
Prior art keywords
quantum
hidden markov
markov model
hidden
fraud detection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110960351.7A
Other languages
Chinese (zh)
Other versions
CN113570452A (en)
Inventor
李晓瑜
李志明
蒋欣睿
朱钦圣
吴昊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sichuan Yuanjiang Technology Co ltd
Original Assignee
Sichuan Yuanjiang Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sichuan Yuanjiang Technology Co ltd filed Critical Sichuan Yuanjiang Technology Co ltd
Priority to CN202110960351.7A priority Critical patent/CN113570452B/en
Publication of CN113570452A publication Critical patent/CN113570452A/en
Application granted granted Critical
Publication of CN113570452B publication Critical patent/CN113570452B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 40/00 Finance; Insurance; Tax strategies; Processing of corporate or income taxes
    • G06Q 40/03 Credit; Loans; Processing thereof
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F 17/10 Complex mathematical operations
    • G06F 17/16 Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • G06F 17/17 Function evaluation by approximation methods, e.g. inter- or extrapolation, smoothing, least mean square method
    • G06F 17/18 Complex mathematical operations for evaluating statistical data, e.g. average values, frequency distributions, probability functions, regression analysis
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G06F 18/243 Classification techniques relating to the number of classes
    • G06F 18/24323 Tree-organised classifiers
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 10/00 Road transport of goods or passengers
    • Y02T 10/10 Internal combustion engine [ICE] based vehicles
    • Y02T 10/40 Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • General Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • Business, Economics & Management (AREA)
  • Algebra (AREA)
  • Accounting & Taxation (AREA)
  • Finance (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Development Economics (AREA)
  • Probability & Statistics with Applications (AREA)
  • Operations Research (AREA)
  • Computing Systems (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Strategic Management (AREA)
  • Technology Law (AREA)
  • General Business, Economics & Management (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Computation (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention discloses a method, a system, a storage medium and a terminal for solving fraud detection with a quantum hidden Markov model. The method comprises the following steps: establishing a corresponding quantum hidden Markov model according to the transaction time-series data type and the fraud detection type, wherein the likelihood output by the quantum hidden Markov model depends on the preceding transactions in the time series; the transaction time-series data type serves as the observed state of the quantum hidden Markov model, and the fraud detection type serves as its hidden state; and outputting all the likelihoods produced by the quantum hidden Markov models to a random forest classifier as additional features for fraud detection. By adding a quantum version of the hidden Markov model to the fraud detection system, the invention not only verifies the credibility of combining a quantum algorithm with a classical algorithm, but also achieves an effect similar to that of the traditional hidden Markov model while using fewer parameters.

Description

Method, system, storage medium and terminal for solving fraud detection by quantum hidden Markov model
Technical Field
The invention relates to the field of quantum computing, in particular to a method, a system, a storage medium and a terminal for solving fraud detection by using a quantum hidden Markov model.
Background
Credit card fraud detection suffers from concept drift (i.e., the model built for the fraud problem changes over time), skewed class distribution (i.e., in real life fraudulent transactions are isolated cases that make up only a small fraction of a user's many transactions, yet cause the user significant loss), and an excessive number of data features. However, no matter how the model changes, every transaction record contains a time and a transaction amount, which reflect the cardholder's transaction habits to a certain extent. That is, if a person who usually makes frequent small purchases suddenly buys goods for a very large amount one day, one may suspect that the credit card has been stolen. It is therefore very necessary to consider the detailed sequence information of past transactions. Accordingly, in order to capture each cardholder's "habits", a Hidden Markov Model (HMM) can be used to uncover such hidden habits. A Hidden Markov Model (HMM) is a statistical model used to describe a Markov process with hidden, unknown parameters; the difficulty lies in determining the hidden parameters of the process from the observable parameters, and these parameters are then used for further analysis. Yvan Lucas et al. used such a method.
With the rise of online shopping, transactions of various amounts occur many times every day. As a result, the computing power of today's classical computers falls short when faced with such a huge volume of transactions and the need to process them in a short time. With the development of society, quantum mechanics has gradually come into public view, and attention has therefore turned to quantum computers, whose computing power far exceeds that of classical computers. Various quantum algorithms solve problems faster than conventional general-purpose computers.
The superposition principle of quantum mechanics allows the state of a quantum information unit, the qubit, to be in a superposition of multiple possibilities at once, so that quantum information processing is far more efficient than classical information processing and has greater potential. Quantum coherence in quantum computing enables quantum states to interfere with one another, so that an operation on one qubit can affect the other qubits in the register.
The concept of the quantum computer arose from these two most essential features, but because of early hardware limitations, attention was at first focused on quantum algorithms that could run on quantum computers. Since Peter Shor proposed the quantum prime-factorization algorithm in 1994, quantum algorithms have been shown to offer far better running efficiency and time complexity than conventional algorithms, and quantum computers attracted even more attention once this algorithm threatened the RSA encryption scheme now widely used by banks and networks. Many classical algorithms have since been given corresponding quantum versions and shown to deliver superior performance. At present, however, quantum versions of algorithms are rarely used to solve practical problems; their advantages in running efficiency and time complexity over classical algorithms are demonstrated almost only in theory.
Therefore, a quantum version of the hidden Markov model can be added to the fraud detection system to verify the credibility of combining a quantum algorithm with a classical algorithm and to prepare for replacing the random forest with a more efficient quantum algorithm.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a method, a system, a storage medium and a terminal for solving fraud detection by using a quantum hidden Markov model.
The purpose of the invention is realized by the following technical scheme:
in a first aspect of the invention, there is provided a method for solving fraud detection using quantum hidden Markov models, comprising the following steps:
establishing a corresponding quantum hidden Markov model according to the transaction time-series data type and the fraud detection type, wherein the likelihood output by the quantum hidden Markov model depends on the preceding transactions in the time series; the transaction time-series data type serves as the observed state of the quantum hidden Markov model, and the fraud detection type serves as its hidden state;
and outputting all the likelihoods produced by the quantum hidden Markov models to a random forest classifier as additional features for fraud detection.
Further, the likelihood output by the quantum hidden Markov model is obtained as follows:
deriving the quantum hidden Markov model from the traditional hidden Markov model to obtain:

p(y_t | ρ_{t-1}) = tr( K_{y_t} ρ_{t-1} K_{y_t}^† )

wherein t denotes time t, y_t denotes the observed state at time t, ρ_{t-1} denotes the hidden state (a density matrix) at time t-1, p(y_t | ρ_{t-1}) denotes the conditional probability of the observed state at time t given the state at time t-1, tr(·) denotes the trace, and K_{y_t} is the Kraus operator corresponding to the observation y_t;
constructing a likelihood function from the transaction time series y_1, y_2, y_3, …, y_T:

L(κ) = tr( K_{y_T} ⋯ K_{y_1} ρ_0 K_{y_1}^† ⋯ K_{y_T}^† )

and then maximizing the value of the likelihood function by gradient descent over all admissible Kraus operators, thereby obtaining a matrix solution for the Kraus operators; wherein y_1, y_2, y_3, …, y_T are the observed values, K_{y_i} is the 2 × 2 Kraus operator matrix corresponding to observation y_i, each observed value corresponds to one operator matrix, all operator matrices are initialized identically, and they satisfy

Σ_m K_m^† K_m = I

with ρ_0 denoting the initial density matrix; the solution of the derived quantum hidden Markov model is converted into a constrained optimization problem:

max tr( K_{y_T} ⋯ K_{y_1} ρ_0 K_{y_1}^† ⋯ K_{y_T}^† )  subject to  Σ_{m=1}^{S} K_m^† K_m = I

wherein m denotes the m-th Kraus operator;
a new matrix κ is reconstructed by stacking all the 2 × 2 Kraus operator matrices, one per output symbol, into a 2S × 2 matrix κ according to the number of outputs S, and the newly constructed matrix κ must satisfy the condition

κ^† κ = I

the constrained optimization problem is then written as:

max_κ tr( K_{y_T} ⋯ K_{y_1} ρ_0 K_{y_1}^† ⋯ K_{y_T}^† )  subject to  κ^† κ = I

since κ lies on the Stiefel manifold, the constrained optimization problem can be transformed into the following unconstrained update, which can be solved by a gradient descent algorithm:

κ ← κ - τ U ( I + (τ/2) V^† U )^{-1} V^† κ

wherein G denotes the gradient of the likelihood function with respect to κ, U = [G | κ], τ denotes a real number in the interval [0,1], and V = [κ | -G];
the Kraus operators are recovered from κ, and the probability p(y_t | ρ_{t-1}) of the quantum hidden Markov model output sequence is then computed from the Kraus operators.
Further, the method further comprises the following step:
using DA to evaluate the quality of the generative model:

DA = f( 1 + (1/ℓ) · log_s P(Y | D) )

wherein ℓ is the length of the sequence, s is the number of hidden states, Y is the observation data, D is the quantum hidden Markov model, and f(·) is a nonlinear function mapping (-∞, 1] to (-1, 1]:

f(x) = x for x ≥ 0, and f(x) = (1 - e^{-0.25x}) / (1 + e^{-0.25x}) for x < 0;

when DA = 1, the model predicts the sequence perfectly; for each run, a curve of DA against the number of iterations of the gradient descent algorithm is plotted, so that the number of iterations and other parameters of the gradient descent algorithm can be adjusted to ensure that the final DA value converges to a value close to 1.
Further, there are eight quantum hidden Markov models, wherein:
the transaction time-series data types comprise two kinds, the transaction amount and the time difference between two transactions, and each kind serves as the observed state of four quantum hidden Markov models;
the hidden states comprise the all real cardholder data set, the fraudulent cardholder data set, the all real terminal data set and the fraudulent terminal data set, each serving as the hidden state of two quantum hidden Markov models whose observed states differ.
In a second aspect of the invention, there is provided a system for solving fraud detection using quantum hidden Markov models, comprising:
a plurality of quantum hidden Markov models, a plurality of counter units and a plurality of data processing units, wherein the quantum hidden Markov models are established according to the transaction time-series data type and the fraud detection type, and the likelihood output by each quantum hidden Markov model depends on the preceding transactions in the time series; the transaction time-series data type serves as the observed state of the quantum hidden Markov model, and the fraud detection type serves as its hidden state;
and a random forest classifier for receiving the likelihoods output by all the quantum hidden Markov models as additional features for fraud detection.
Further, the likelihood output by the quantum hidden Markov model is obtained as follows:
deriving the quantum hidden Markov model from the traditional hidden Markov model to obtain:

p(y_t | ρ_{t-1}) = tr( K_{y_t} ρ_{t-1} K_{y_t}^† )

wherein t denotes time t, y_t denotes the observed state at time t, ρ_{t-1} denotes the hidden state (a density matrix) at time t-1, p(y_t | ρ_{t-1}) denotes the conditional probability of the observed state at time t given the state at time t-1, tr(·) denotes the trace, and K_{y_t} is the Kraus operator corresponding to the observation y_t;
constructing a likelihood function from the transaction time series y_1, y_2, y_3, …, y_T:

L(κ) = tr( K_{y_T} ⋯ K_{y_1} ρ_0 K_{y_1}^† ⋯ K_{y_T}^† )

and then maximizing the value of the likelihood function by gradient descent over all admissible Kraus operators, thereby obtaining a matrix solution for the Kraus operators; wherein y_1, y_2, y_3, …, y_T are the observed values, K_{y_i} is the 2 × 2 Kraus operator matrix corresponding to observation y_i, each observed value corresponds to one operator matrix, all operator matrices are initialized identically, and they satisfy

Σ_m K_m^† K_m = I

with ρ_0 denoting the initial density matrix; the solution of the derived quantum hidden Markov model is converted into a constrained optimization problem:

max tr( K_{y_T} ⋯ K_{y_1} ρ_0 K_{y_1}^† ⋯ K_{y_T}^† )  subject to  Σ_{m=1}^{S} K_m^† K_m = I

wherein m denotes the m-th Kraus operator;
a new matrix κ is reconstructed by stacking all the 2 × 2 Kraus operator matrices, one per output symbol, into a 2S × 2 matrix κ according to the number of outputs S, and the newly constructed matrix κ must satisfy the condition

κ^† κ = I

the constrained optimization problem is then written as:

max_κ tr( K_{y_T} ⋯ K_{y_1} ρ_0 K_{y_1}^† ⋯ K_{y_T}^† )  subject to  κ^† κ = I

since κ lies on the Stiefel manifold, the constrained optimization problem can be transformed into the following unconstrained update, which can be solved by a gradient descent algorithm:

κ ← κ - τ U ( I + (τ/2) V^† U )^{-1} V^† κ

wherein G denotes the gradient of the likelihood function with respect to κ, U = [G | κ], τ denotes a real number in the interval [0,1], and V = [κ | -G];
the Kraus operators are recovered from κ, and the probability p(y_t | ρ_{t-1}) of the quantum hidden Markov model output sequence is then computed from the Kraus operators.
Further, the system further comprises:
a DA determination module for evaluating the quality of the generative model using DA:

DA = f( 1 + (1/ℓ) · log_s P(Y | D) )

wherein ℓ is the length of the sequence, s is the number of hidden states, Y is the observation data, D is the quantum hidden Markov model, and f(·) is a nonlinear function mapping (-∞, 1] to (-1, 1]:

f(x) = x for x ≥ 0, and f(x) = (1 - e^{-0.25x}) / (1 + e^{-0.25x}) for x < 0;

when DA = 1, the model predicts the sequence perfectly; for each run, a curve of DA against the number of iterations of the gradient descent algorithm is plotted, so that the number of iterations and other parameters of the gradient descent algorithm can be adjusted to ensure that the final DA value converges to a value close to 1.
Further, there are eight quantum hidden Markov models, wherein:
the transaction time-series data types comprise two kinds, the transaction amount and the time difference between two transactions, and each kind serves as the observed state of four quantum hidden Markov models;
the hidden states comprise the all real cardholder data set, the fraudulent cardholder data set, the all real terminal data set and the fraudulent terminal data set, each serving as the hidden state of two quantum hidden Markov models whose observed states differ.
In a third aspect of the invention, there is provided a storage medium having stored thereon computer instructions which, when executed, perform the steps of the method for solving fraud detection with a quantum hidden Markov model.
In a fourth aspect of the present invention, there is provided a terminal comprising a memory and a processor, the memory having stored thereon computer instructions executable on the processor, wherein the processor, when executing the computer instructions, performs the steps of the method for solving fraud detection with a quantum hidden Markov model.
The beneficial effects of the invention are:
(1) In an exemplary embodiment of the present invention, compared with the prior art that uses a hidden Markov model plus a random forest for fraud detection (where the computing power of current classical computers falls short of handling such huge transaction volumes within a short time), the present exemplary embodiment adds a quantum version of the hidden Markov model to the fraud detection system. This not only verifies the credibility of combining a quantum algorithm with a classical algorithm and prepares for replacing the random forest with a more efficient quantum algorithm, but also achieves an effect similar to that of the traditional hidden Markov model while using fewer parameters (the two probability matrices of the hidden Markov model, the transition matrix A and the emission matrix C, are merged into Kraus operators, thereby reducing the number of parameters).
(2) In yet another exemplary embodiment of the present invention, a specific manner of applying quantum hidden Markov models to fraud models is disclosed.
(3) In still another exemplary embodiment of the present invention, the input observation data are discretized by partitioning them, according to the maximum value of the input data in each run, into as many intervals as the configured number of hidden states. This reduces the number of parameters used.
(4) In yet another exemplary embodiment of the present invention, DA is used to evaluate the quality of the generative model: for each run, DA is plotted as a function of the number of iterations, so that the number of iterations and other parameters can be adjusted to make the final DA value converge to as high a value as possible.
(5) In yet another exemplary embodiment of the present invention, there are eight quantum hidden Markov models, wherein: the transaction time-series data types comprise the transaction amount and the time difference between two transactions, and each type is trained, as the observed state, on four data sets: the all real cardholder data set, the fraudulent cardholder data set, the all real terminal data set and the fraudulent terminal data set. In this way the user's habits can be summarized comprehensively, so that the learned behavior pattern is closer to the current behavior, rather than drawing a conclusion from a single aspect.
The system, the storage medium and the terminal of the present invention have the same advantages.
Drawings
FIG. 1 is a flow chart of a method provided by an exemplary embodiment of the present invention;
FIG. 2 is a schematic diagram of a conventional hidden Markov model;
FIG. 3 is a schematic diagram of a full quantum circuit implementing a hidden Markov model (HMM);
FIG. 4 is a schematic diagram of a simplified quantum hidden Markov model (HQMM) implementation;
FIG. 5 is a schematic diagram of the quantum hidden Markov model form;
fig. 6 is a block diagram of a system provided in an exemplary embodiment of the invention.
Detailed Description
The technical solutions of the present invention will be described clearly and completely with reference to the accompanying drawings, and it is to be understood that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the description of the present invention, it should be noted that directions or positional relationships indicated by "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", and the like are directions or positional relationships described based on the drawings, and are only for convenience of description and simplification of description, and do not indicate or imply that the device or element referred to must have a specific orientation, be configured and operated in a specific orientation, and thus, should not be construed as limiting the present invention.
In the description of the present invention, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected" and "connected" are to be construed broadly and may be, for example, fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; they may be connected directly or indirectly through intervening media, or they may be interconnected between two elements. The specific meanings of the above terms in the present invention can be understood in specific cases to those skilled in the art.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present application. The word "if," as used herein, may be interpreted as "when" or "upon" or "in response to determining," depending on the context. Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
In addition, the technical features involved in the different embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
In the prior art, credit card fraud detection suffers from concept drift (i.e., the model built for the fraud problem changes over time), skewed class distribution (i.e., in real life fraudulent transactions are isolated cases that make up only a small fraction of a user's many transactions, yet cause the user significant loss), and an excessive number of data features. However, no matter how the model changes, every transaction record contains a time and a transaction amount, which reflect the cardholder's transaction habits to a certain extent. That is, if a person who usually makes frequent small purchases suddenly buys goods for a very large amount one day, one may suspect that the credit card has been stolen. It is therefore very necessary to consider the detailed sequence information of past transactions. Accordingly, in order to capture each cardholder's "habits", a Hidden Markov Model (HMM) can be used to uncover such hidden habits. With the rise of online shopping, transactions of various amounts occur many times every day. As a result, the computing power of today's classical computers falls short when faced with such a huge volume of transactions and the need to process them in a short time. With the development of society, quantum mechanics has gradually come into public view, and attention has therefore turned to quantum computers, whose computing power far exceeds that of classical computers. Various quantum algorithms solve problems faster than conventional general-purpose computers. In the prior art, however, quantum versions of algorithms are rarely used to solve practical problems; their advantages in running efficiency and time complexity over classical algorithms are demonstrated almost only in theory.
Accordingly, the following exemplary embodiments add a quantum version of the hidden Markov model to a fraud detection system, in order to verify the credibility of combining the quantum algorithm with the classical algorithm and to prepare for subsequently replacing the random forest with a more efficient quantum algorithm.
Referring to FIG. 1, FIG. 1 illustrates a method for solving fraud detection with a quantum hidden Markov model provided in an exemplary embodiment of the invention, comprising the following steps:
establishing a corresponding quantum hidden Markov model according to the transaction time-series data type and the fraud detection type, wherein the likelihood output by the quantum hidden Markov model depends on the preceding transactions in the time series; the transaction time-series data type serves as the observed state of the quantum hidden Markov model, and the fraud detection type serves as its hidden state;
and outputting all the likelihoods produced by the quantum hidden Markov models to a random forest classifier as additional features for fraud detection.
Specifically, compared with the prior art that uses a hidden Markov model plus a random forest for fraud detection (where the computing power of current classical computers falls short of handling such huge transaction volumes within a short time), the present exemplary embodiment adds a quantum version of the hidden Markov model to the fraud detection system. This not only verifies the credibility of combining a quantum algorithm with a classical algorithm and prepares for later replacing the random forest with a more efficient quantum algorithm, but also achieves an effect similar to that of the traditional hidden Markov model while using fewer parameters (the two probability matrices of the hidden Markov model, the transition matrix A and the emission matrix C, are merged into Kraus operators, thereby reducing the number of parameters).
Using a random forest for fraud detection is a conventional technique in the field and is therefore not described in detail here.
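As an illustration of how these likelihoods can serve as additional features (a minimal sketch only: the helper qhmm_log_likelihood, the data layout and the classifier settings are assumptions made for this example, not details fixed by the invention), the output of the trained quantum hidden Markov models may be appended to the ordinary transaction features and passed to a random forest roughly as follows:

    # Minimal sketch: appending QHMM likelihoods as extra features for a random forest.
    # qhmm_log_likelihood(), the data layout and the classifier settings are illustrative.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def qhmm_log_likelihood(kraus_ops, rho0, observations):
        """Log-likelihood of an observation sequence under a trained QHMM,
        given its Kraus operators {K_y} and initial density matrix rho0."""
        rho, log_like = rho0, 0.0
        for y in observations:
            K = kraus_ops[y]
            unnorm = K @ rho @ K.conj().T          # K_y rho K_y^dagger
            p = max(np.real(np.trace(unnorm)), 1e-12)
            log_like += np.log(p)                  # accumulate log p(y_t | rho_{t-1})
            rho = unnorm / p                       # Bayesian update of the hidden state
        return log_like

    def build_features(X_base, windows, qhmm_models, rho0):
        """X_base: ordinary transaction features, one row per transaction window.
        windows: the matching discretized observation sequences.
        qhmm_models: the eight trained models, each a dict {symbol: K_y}."""
        extra = np.array([[qhmm_log_likelihood(m, rho0, w) for m in qhmm_models]
                          for w in windows])
        return np.hstack([X_base, extra])          # original features + 8 QHMM likelihoods

    # clf = RandomForestClassifier(n_estimators=200)
    # clf.fit(build_features(X_train, train_windows, qhmm_models, rho0), y_train)

In this sketch each transaction window contributes eight extra columns, one log-likelihood per trained quantum hidden Markov model, in addition to the ordinary features.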
Preferably, in an exemplary embodiment, the likelihood output by the quantum hidden Markov model is obtained as follows:
deriving the quantum hidden Markov model from the traditional hidden Markov model to obtain:

p(y_t | ρ_{t-1}) = tr( K_{y_t} ρ_{t-1} K_{y_t}^† )

wherein t denotes time t, y_t denotes the observed state at time t, ρ_{t-1} denotes the hidden state (a density matrix) at time t-1, p(y_t | ρ_{t-1}) denotes the conditional probability of the observed state at time t given the state at time t-1, tr(·) denotes the trace, and K_{y_t} is the Kraus operator corresponding to the observation y_t.
the following will describe the process of deriving quantum hidden markov models by using conventional hidden markov models:
FIG. 2 is a diagram of a conventional hidden Markov model, where Y is an observed value at a certain time, i.e., the time interval or amount in S1, and X is a hidden state corresponding to the time, and there is a probability relationship between them, called an emission matrix C, because a hidden state is represented externally as the observed state with a certain probability; there is also a probability relationship between two neighboring hidden states called transition matrix a, because this is a random process and is now a hidden state, and other states are possible at the next moment.
Table 1 gives a comparative illustration of the representation of classical probability and quantum simulation.
Figure BDA0003222055170000091
TABLE 1 comparison of electrostatic and quantum representations
In FIG. 3 (a full quantum circuit implementing the hidden Markov model HMM), U_1 and U_2 are unitary matrices (satisfying U^† U = U U^† = I) constructed from the transition matrix A and the emission matrix C. U_1 evolves the state ρ_{t-1}, together with an ancilla register prepared in |0⟩⟨0|, to perform the Markov transition, and U_2 updates the joint state so that it encodes the probability of measuring each observable output. In operation, measuring the output register changes the joint state and yields the updated conditional state ρ_t.
Because the ancilla register is always prepared in the same state, the joint distribution obtained after applying the projection operator P_y following U_2 can be written more succinctly as a Kraus operator acting only on ρ, so that it is only necessary to track how ρ evolves. A Kraus operator is set for each output y, giving a set of Kraus operators {K_y} satisfying

Σ_y K_y^† K_y = I

Simplifying the circuit in this way yields the circuit in FIG. 4 (the simplified quantum hidden Markov model HQMM implementation).
In FIG. 4, U_1 is still a quantum implementation of the transition matrix, and the Kraus operators {K_y} are a quantum implementation of the Bayesian update performed after an observation. This scheme for simulating a classical HMM can be written as:

ρ_t = K_{y_t} U_1 ρ_{t-1} U_1^† K_{y_t}^† / tr( K_{y_t} U_1 ρ_{t-1} U_1^† K_{y_t}^† )

Similarly, U_1 can itself be absorbed into the Kraus operators, resulting in FIG. 5 (the quantum hidden Markov model form), and the above formula simplifies to:

ρ_t = K_{y_t} ρ_{t-1} K_{y_t}^† / tr( K_{y_t} ρ_{t-1} K_{y_t}^† )
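As a minimal numerical sketch of this conditional-state update (the 2 × 2 matrices below are arbitrary example values chosen only to satisfy the completeness condition, not operators taken from the invention):

    # Sketch of the conditional-state update rho_t = K_y rho K_y^dagger / tr(...).
    import numpy as np

    rho = np.array([[0.6, 0.0], [0.0, 0.4]], dtype=complex)          # example density matrix
    K = {0: np.array([[0.8, 0.0], [0.0, 0.5]], dtype=complex),       # example Kraus operators
         1: np.array([[0.6, 0.0], [0.0, np.sqrt(0.75)]], dtype=complex)}

    # completeness condition: sum_y K_y^dagger K_y = I
    assert np.allclose(sum(k.conj().T @ k for k in K.values()), np.eye(2))

    y = 1                                    # observed output symbol
    unnorm = K[y] @ rho @ K[y].conj().T      # K_y rho K_y^dagger
    p_y = np.real(np.trace(unnorm))          # emission probability p(y | rho)
    rho_next = unnorm / p_y                  # updated conditional state rho_t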
In order to obtain the probability of emitting output y given the known state ρ_{t-1}, it suffices to take the trace of the above formula:

p(y_t | ρ_{t-1}) = tr( K_{y_t} ρ_{t-1} K_{y_t}^† )

To solve for the Kraus operators K_y, a likelihood-function estimation method is used: from a transaction time series y_1, y_2, y_3, …, y_T, a likelihood function

L(κ) = tr( K_{y_T} ⋯ K_{y_1} ρ_0 K_{y_1}^† ⋯ K_{y_T}^† )

is constructed (the likelihood function is a function of the parameters of the statistical model; for the manner of construction see Adhikary, S., Srinivasan, S., & Boots, B. (2019), Learning quantum graphical models using constrained gradient descent on the Stiefel manifold, or Srinivasan, S., Gordon, G., & Boots, B. (2017), Learning hidden quantum Markov models). The value of the likelihood function is then maximized by gradient descent over all admissible Kraus operators, giving a matrix solution for the Kraus operators. Here y_1, y_2, y_3, …, y_T are the observed values and K_{y_i} is the 2 × 2 Kraus operator matrix corresponding to observation y_i (strictly, K_{y_i} is a 2^n × 2^n Kraus operator matrix, where n is not the sequence length but merely the exponent of the power of 2; n is generally taken to be 1, hence the matrices are written as 2 × 2). Each observed value corresponds to one operator matrix, all operator matrices are initialized identically, and they satisfy

Σ_m K_m^† K_m = I

with ρ_0 denoting the initial density matrix. Such a problem can be converted into a constrained optimization problem:

max tr( K_{y_T} ⋯ K_{y_1} ρ_0 K_{y_1}^† ⋯ K_{y_T}^† )  subject to  Σ_{m=1}^{S} K_m^† K_m = I

wherein m denotes the m-th Kraus operator.
To solve such a problem, a new matrix κ is reconstructed by stacking all the 2 × 2 Kraus operator matrices, one per output symbol, into a 2S × 2 matrix κ according to the number of outputs S (a 10 × 2 matrix κ if S = 5); it can be shown that the newly constructed matrix κ must satisfy the condition

κ^† κ = I

The constrained optimization problem is then written as:

max_κ tr( K_{y_T} ⋯ K_{y_1} ρ_0 K_{y_1}^† ⋯ K_{y_T}^† )  subject to  κ^† κ = I

Since κ lies on the Stiefel manifold, the constrained optimization problem can be transformed into the following unconstrained update, which can be solved by a gradient descent algorithm:

κ ← κ - τ U ( I + (τ/2) V^† U )^{-1} V^† κ

wherein G denotes the gradient of the likelihood function with respect to κ, U = [G | κ], τ denotes a real number in the interval [0,1], and V = [κ | -G].
The Kraus operators are recovered from κ, and the probability p(y_t | ρ_{t-1}) of the quantum hidden Markov model output sequence is then computed from the Kraus operators.
Preferably, in an exemplary embodiment, the input observation data are discretized by partitioning them, according to the maximum value of the input data in each run, into as many intervals as the configured number of hidden states.
The partitioning serves only to reduce the number of distinct observations; partitioning by the maximum value is only one option, and the data could also be partitioned by range, frequency and so on. The hidden state is unknown and, in this problem, can be regarded as a series of consumption habits (elaborated in the following exemplary embodiments).
Regarding the partitioning: if, for example, the data range from 1 to 100, then 100 Kraus operators would be required, which defeats the purpose of reducing the number of parameters. The data are therefore partitioned, here into equal-width intervals, and every value within an interval is mapped to a single value, so that a given number of partitions yields only that many outputs. Partitioning means processing the given data by first defining the intervals, then reassigning the data according to the intervals, and finally feeding them into the quantum hidden Markov model for computation. In this way the input data are discretized, and the goal of reducing the number of parameters can be achieved.
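A minimal sketch of such an equal-width partition (the bin boundaries are taken from the maximum of the current batch, as described above; the concrete values and the number of intervals are illustrative assumptions):

    # Sketch: discretizing transaction amounts into a small number of symbols
    # by equal-width partitioning up to the maximum of the current input batch.
    import numpy as np

    def discretize(values, n_bins):
        """Map raw values to integer symbols 0 .. n_bins-1 using equal-width intervals."""
        values = np.asarray(values, dtype=float)
        edges = np.linspace(0.0, values.max(), n_bins + 1)
        return np.digitize(values, edges[1:-1])   # interior edges give symbols 0 .. n_bins-1

    amounts = [3.2, 7.5, 120.0, 45.9, 88.1, 5.0]
    symbols = discretize(amounts, n_bins=4)       # -> array([0, 0, 3, 1, 2, 0])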
More preferably, in an exemplary embodiment, the method further comprises the following step:
using DA to evaluate the quality of the generative model:

DA = f( 1 + (1/ℓ) · log_s P(Y | D) )

wherein ℓ is the length of the sequence, s is the number of hidden states, Y is the observation data, D is the quantum hidden Markov model, and f(·) is a nonlinear function mapping (-∞, 1] to (-1, 1]:

f(x) = x for x ≥ 0, and f(x) = (1 - e^{-0.25x}) / (1 + e^{-0.25x}) for x < 0;

when DA = 1, the model predicts the sequence perfectly. For each run, DA is plotted as a function of the number of iterations of the gradient descent algorithm, so that the number of iterations and other parameters of the gradient descent algorithm (for example, the step size) can be adjusted to make the final DA value converge to a value close to 1.
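A minimal sketch of the DA computation (the log-likelihood value, sequence length and state count below are placeholder inputs for illustration):

    # Sketch: the DA score and its nonlinearity f, which maps (-inf, 1] to (-1, 1].
    import numpy as np

    def f(x):
        """Identity for x >= 0; for x < 0 a squashing that is bounded below by -1."""
        return x if x >= 0 else (1 - np.exp(-0.25 * x)) / (1 + np.exp(-0.25 * x))

    def description_accuracy(log_p, length, n_states):
        """DA = f(1 + log_s P(Y|D) / l), with log_p given as the natural log of P(Y|D)."""
        log_s_p = log_p / np.log(n_states)        # change of base to log_s
        return f(1 + log_s_p / length)

    # Example: a 50-step sequence to which the model assigns ln P(Y|D) = -40
    da = description_accuracy(log_p=-40.0, length=50, n_states=2)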
More preferably, in an exemplary embodiment, there are eight quantum hidden Markov models, wherein:
the transaction time-series data types comprise two kinds, the transaction amount and the time difference between two transactions, and each kind serves as the observed state of four quantum hidden Markov models;
the hidden states comprise the all real cardholder data set, the fraudulent cardholder data set, the all real terminal data set and the fraudulent terminal data set, each serving as the hidden state of two quantum hidden Markov models whose observed states differ.
The observed state is the quantity Y above and the hidden state is the quantity X above, and the eight quantum hidden Markov models are: model 1 is trained with the amounts in the all real cardholder data set as observations, model 2 with the amounts in the fraudulent cardholder data set, model 3 with the amounts in the all real terminal data set, model 4 with the amounts in the fraudulent terminal data set, model 5 with the time differences in the all real cardholder data set, model 6 with the time differences in the fraudulent cardholder data set, model 7 with the time differences in the all real terminal data set, and model 8 with the time differences in the fraudulent terminal data set.
Specifically, the models are trained on these data sets, with the amounts in the all real cardholder data set as the observations for model 1, and so on. The benefit is that the user's habits can be summarized more comprehensively, so that the learned behavior pattern is closer to the current behavior, rather than drawing a conclusion from a single aspect.
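The eight model/data-set pairings can be enumerated as a small piece of bookkeeping (a sketch only; the data-set names and the train_qhmm/load_sequences helpers are placeholders, not part of the invention):

    # Sketch: enumerating the eight QHMM training configurations
    # (2 observation types x 4 data sets); train_qhmm/load_sequences are placeholders.
    from itertools import product

    observation_types = ["amount", "time_difference"]
    datasets = ["all_real_cardholder", "fraudulent_cardholder",
                "all_real_terminal", "fraudulent_terminal"]

    # models 1..8 in the order listed above: amounts first, then time differences
    configurations = list(product(observation_types, datasets))

    # models = {cfg: train_qhmm(load_sequences(*cfg)) for cfg in configurations}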
Sharing the same inventive concept as the exemplary embodiments described above, yet another exemplary embodiment of the present invention provides a system for solving fraud detection with a quantum hidden Markov model, as shown in FIG. 6, comprising:
a plurality of quantum hidden Markov models, a plurality of counter units and a plurality of data processing units, wherein the quantum hidden Markov models are established according to the transaction time-series data type and the fraud detection type, and the likelihood output by each quantum hidden Markov model depends on the preceding transactions in the time series; the transaction time-series data type serves as the observed state of the quantum hidden Markov model, and the fraud detection type serves as its hidden state;
and a random forest classifier for receiving the likelihoods output by all the quantum hidden Markov models as additional features for fraud detection.
Correspondingly, in an exemplary embodiment, the likelihood output by the quantum hidden Markov model is obtained as follows:
deriving the quantum hidden Markov model from the traditional hidden Markov model to obtain:

p(y_t | ρ_{t-1}) = tr( K_{y_t} ρ_{t-1} K_{y_t}^† )

wherein t denotes time t, y_t denotes the observed state at time t, ρ_{t-1} denotes the hidden state (a density matrix) at time t-1, p(y_t | ρ_{t-1}) denotes the conditional probability of the observed state at time t given the state at time t-1, tr(·) denotes the trace, and K_{y_t} is the Kraus operator corresponding to the observation y_t;
constructing a likelihood function from the transaction time series y_1, y_2, y_3, …, y_T:

L(κ) = tr( K_{y_T} ⋯ K_{y_1} ρ_0 K_{y_1}^† ⋯ K_{y_T}^† )

and then maximizing the value of the likelihood function by gradient descent over all admissible Kraus operators, thereby obtaining a matrix solution for the Kraus operators; wherein y_1, y_2, y_3, …, y_T are the observed values, K_{y_i} is the 2 × 2 Kraus operator matrix corresponding to observation y_i, each observed value corresponds to one operator matrix, all operator matrices are initialized identically, and they satisfy

Σ_m K_m^† K_m = I

with ρ_0 denoting the initial density matrix; the solution of the derived quantum hidden Markov model is converted into a constrained optimization problem:

max tr( K_{y_T} ⋯ K_{y_1} ρ_0 K_{y_1}^† ⋯ K_{y_T}^† )  subject to  Σ_{m=1}^{S} K_m^† K_m = I

wherein m denotes the m-th Kraus operator;
a new matrix κ is reconstructed by stacking all the 2 × 2 Kraus operator matrices, one per output symbol, into a 2S × 2 matrix κ according to the number of outputs S, and the newly constructed matrix κ must satisfy the condition

κ^† κ = I

the constrained optimization problem is then written as:

max_κ tr( K_{y_T} ⋯ K_{y_1} ρ_0 K_{y_1}^† ⋯ K_{y_T}^† )  subject to  κ^† κ = I

since κ lies on the Stiefel manifold, the constrained optimization problem can be transformed into the following unconstrained update, which can be solved by a gradient descent algorithm:

κ ← κ - τ U ( I + (τ/2) V^† U )^{-1} V^† κ

wherein G denotes the gradient of the likelihood function with respect to κ, U = [G | κ], τ denotes a real number in the interval [0,1], and V = [κ | -G];
the Kraus operators are recovered from κ, and the probability p(y_t | ρ_{t-1}) of the quantum hidden Markov model output sequence is then computed from the Kraus operators.
Correspondingly, in an exemplary embodiment, the system further comprises:
a DA determination module for evaluating the quality of the generative model using DA:

DA = f( 1 + (1/ℓ) · log_s P(Y | D) )

wherein ℓ is the length of the sequence, s is the number of hidden states, Y is the observation data, D is the quantum hidden Markov model, and f(·) is a nonlinear function mapping (-∞, 1] to (-1, 1]:

f(x) = x for x ≥ 0, and f(x) = (1 - e^{-0.25x}) / (1 + e^{-0.25x}) for x < 0;

when DA = 1, the model predicts the sequence perfectly; for each run, a curve of DA against the number of iterations of the gradient descent algorithm is plotted, so that the number of iterations and other parameters of the gradient descent algorithm can be adjusted to ensure that the final DA value converges to a value close to 1.
Correspondingly, in an exemplary embodiment, the input observation data are discretized by partitioning them, according to the maximum value of the input data in each run, into as many intervals as the configured number of hidden states.
The partitioning serves only to reduce the number of distinct observations; partitioning by the maximum value is only one option, and the data could also be partitioned by range, frequency and so on. The hidden state is unknown and, in this problem, can be regarded as a series of consumption habits.
Correspondingly, in an exemplary embodiment, there are eight quantum hidden Markov models, wherein:
the transaction time-series data types comprise two kinds, the transaction amount and the time difference between two transactions, and each kind serves as the observed state of four quantum hidden Markov models;
the hidden states comprise the all real cardholder data set, the fraudulent cardholder data set, the all real terminal data set and the fraudulent terminal data set, each serving as the hidden state of two quantum hidden Markov models whose observed states differ.
Sharing the same inventive concept as the exemplary embodiments described above, an exemplary embodiment of the present invention provides a storage medium having stored thereon computer instructions which, when executed, perform the steps of the method for solving fraud detection with a quantum hidden Markov model.
Sharing the same inventive concept as the exemplary embodiments described above, an exemplary embodiment of the present invention provides a terminal comprising a memory and a processor, the memory having stored thereon computer instructions executable on the processor, wherein the processor, when executing the computer instructions, performs the steps of the method for solving fraud detection with a quantum hidden Markov model.
Based on such understanding, the technical solution of the present embodiment or parts of the technical solution may be essentially implemented in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-only memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
It is to be understood that the above-described embodiments are illustrative only and do not limit the invention; various modifications and changes may be suggested to those skilled in the art in light of the above teachings. It is neither necessary nor possible to enumerate all embodiments exhaustively, and obvious variations or modifications derived therefrom remain within the protection scope of the present invention.

Claims (6)

1. A method for solving fraud detection with a quantum hidden Markov model, characterized by comprising the following steps:
establishing a corresponding quantum hidden Markov model according to the transaction time-series data type and the fraud detection type, wherein the likelihood output by the quantum hidden Markov model depends on the preceding transactions in the time series; the transaction time-series data type serves as the observed state of the quantum hidden Markov model, and the fraud detection type serves as its hidden state;
outputting the likelihoods output by all the quantum hidden Markov models to a random forest classifier as additional features for fraud detection;
there are eight quantum hidden Markov models, wherein: the transaction time-series data types comprise two kinds, the transaction amount and the time difference between two transactions, and each kind serves as the observed state of four quantum hidden Markov models; the hidden states comprise the all real cardholder data set, the fraudulent cardholder data set, the all real terminal data set and the fraudulent terminal data set, each serving as the hidden state of two quantum hidden Markov models whose observed states differ;
the possibility of the quantum hidden Markov model output comprises:
deriving a quantum hidden Markov model from the traditional hidden Markov model to obtain:
Figure FDA0004067997440000011
wherein t represents time t, y t Represents the observed state at time t, p t-1 Representing the hidden state at time t-1, p (y) tt-1 ) Represents the conditional probability of the observed state at time t when the state at time t-1 is known, tr () represents the trace-finding,
Figure FDA0004067997440000012
a Karus operator corresponding to the observation state at the time of t-1;
will trade the time sequence y 1 ,y 2 ,y 3 ,…,y T Constructing a likelihood function
Figure FDA0004067997440000013
Then, the likelihood function is used for carrying out gradient descent maximization on the values of the likelihood function on all possible Kraus operator derivative solving values, and therefore a matrix solution of the Karus operator is obtained; where y is 1 ,y 2 ,y 3 ,…,y T Is an observed value, is greater than or equal to>
Figure FDA0004067997440000014
Is the y i Each observation value of the 2 x 2 Kraus operation matrix corresponds to one operation matrix, and the observation values are the same initially; and->
Figure FDA0004067997440000015
ρ 0 Representing an initial density matrix; converting the solution of the derived quantum hidden Markov model into a constrained optimization problem:
Figure FDA0004067997440000016
m represents the mth Karus operator;
reconstructing a new matrix kappa, stacking all 2 x 2 Kraus operator matrixes into a 2S x 2 matrix kappa according to the output number S in columns, wherein the newly constructed matrix kappa needs to meet the condition
Figure FDA0004067997440000017
The constrained optimization problem is written as:
Figure FDA0004067997440000021
since κ is on the Stiefel manifold, its constrained optimization problem can be transformed into the unconstrained problem as follows, which can be solved by gradient descent algorithm:
Figure FDA0004067997440000022
/>
wherein G represents the gradient of the likelihood function for k, U = [ G | k ], τ represents a real number in the interval [0,1], and V = [ k | -G ];
the probability p (y) of the output sequence of the quantum hidden Markov model is obtained by reversely deducing the Kraus operator through kappa and solving through the Kraus operator tt-1 )。
2. The method for solving fraud detection with a quantum hidden Markov model according to claim 1, characterized in that the method further comprises the following step:
using DA to evaluate the quality of the generative model:

DA = f( 1 + (1/ℓ) · log_s P(Y | D) )

where ℓ is the length of the sequence, s is the number of hidden states, Y is the observation data, D is the quantum hidden Markov model, and f(·) is a nonlinear function mapping (-∞, 1] to (-1, 1]:

f(x) = x for x ≥ 0, and f(x) = (1 - e^{-0.25x}) / (1 + e^{-0.25x}) for x < 0;

when DA = 1, the model predicts the sequence perfectly; for each run, a curve of DA against the number of iterations of the gradient descent algorithm is plotted, so that the number of iterations and other parameters of the gradient descent algorithm can be adjusted to ensure that the final DA value converges to a value close to 1.
3. A system for solving fraud detection with a quantum hidden Markov model, characterized by comprising:
a plurality of quantum hidden Markov models, wherein the quantum hidden Markov models are established according to the transaction time-series data type and the fraud detection type, and the likelihood output by each quantum hidden Markov model depends on the preceding transactions in the time series; the transaction time-series data type serves as the observed state of the quantum hidden Markov model, and the fraud detection type serves as its hidden state;
a random forest classifier for receiving the likelihoods output by all the quantum hidden Markov models as additional features for fraud detection;
there are eight quantum hidden Markov models, wherein: the transaction time-series data types comprise two kinds, the transaction amount and the time difference between two transactions, and each kind serves as the observed state of four quantum hidden Markov models; the hidden states comprise the all real cardholder data set, the fraudulent cardholder data set, the all real terminal data set and the fraudulent terminal data set, each serving as the hidden state of two quantum hidden Markov models whose observed states differ;
the probability of the quantum hidden Markov model output comprises:
deriving a quantum hidden Markov model from the traditional hidden Markov model to obtain:
Figure FDA0004067997440000031
wherein t represents time t, y t Indicating the observed state, p t-1 Representing the hidden state at time t-1, p (y) tt-1 ) When the state at time t-1 is known, the conditional probability of the observed state at time t, tr () represents the trace-finding, K yt-1 A Karus operator corresponding to the observation state at the time of t-1;
sequence y of transaction sequences 1 ,y 2 ,y 3 ,…,y T Constructing a likelihood function
Figure FDA0004067997440000032
Then, carrying out gradient descent maximization of the value of the likelihood function on all possible derivatives of the Kraus operator by using the likelihood function, thereby obtaining a matrix solution of the Karus operator; where y is 1 ,y 2 ,y 3 ,…,y T Is an observed value, K yi Is the y i Each observation value of the 2 x 2 Kraus operation matrix corresponds to one operation matrix, and the observation values are the same initially; and->
Figure FDA0004067997440000033
ρ 0 Representing an initial density matrix; converting the solution of the derived quantum hidden Markov model into a constrained optimization problem:
Figure FDA0004067997440000034
m represents the mth Karus operator;
reconstructing a new matrix kappa, stacking all 2 x 2 Kraus operator matrixes into a 2S x 2 matrix kappa according to the output number S in columns, wherein the newly constructed matrix kappa needs to meet the condition
Figure FDA0004067997440000035
The constrained optimization problem is written as:
Figure FDA0004067997440000036
since κ is on the Stiefel manifold, its constrained optimization problem can be transformed into the unconstrained problem as follows, which can be solved by gradient descent algorithm:
Figure FDA0004067997440000041
wherein G represents the gradient of the likelihood function for k, U = [ G | k ], τ represents a real number in the interval [0,1], and V = [ k | -G ];
the probability p (y) of the quantum hidden Markov model output sequence is solved by the Kraus operator which is reversely deduced by the kappa tt-1 )。
4. The system for solving fraud detection with a quantum hidden Markov model according to claim 3, characterized in that the system further comprises:
a DA determination module for evaluating the quality of the generative model using DA:

DA = f( 1 + (1/ℓ) · log_s P(Y | D) )

where ℓ is the length of the sequence, s is the number of hidden states, Y is the observation data, D is the quantum hidden Markov model, and f(·) is a nonlinear function mapping (-∞, 1] to (-1, 1]:

f(x) = x for x ≥ 0, and f(x) = (1 - e^{-0.25x}) / (1 + e^{-0.25x}) for x < 0;

when DA = 1, the model predicts the sequence perfectly; for each run, a curve of DA against the number of iterations of the gradient descent algorithm is plotted, so that the number of iterations and other parameters of the gradient descent algorithm can be adjusted to ensure that the final DA value converges to a value close to 1.
5. A storage medium having stored thereon computer instructions, characterized in that the computer instructions, when executed, perform the steps of the method for solving fraud detection with a quantum hidden Markov model according to claim 1 or 2.
6. A terminal comprising a memory and a processor, the memory having stored thereon computer instructions executable on the processor, characterized in that the processor, when executing the computer instructions, performs the steps of the method for solving fraud detection with a quantum hidden Markov model according to claim 1 or 2.
CN202110960351.7A 2021-08-20 2021-08-20 Method, system, storage medium and terminal for solving fraud detection by quantum hidden Markov model Active CN113570452B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110960351.7A CN113570452B (en) 2021-08-20 2021-08-20 Method, system, storage medium and terminal for solving fraud detection by quantum hidden Markov model

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110960351.7A CN113570452B (en) 2021-08-20 2021-08-20 Method, system, storage medium and terminal for solving fraud detection by quantum hidden Markov model

Publications (2)

Publication Number Publication Date
CN113570452A CN113570452A (en) 2021-10-29
CN113570452B true CN113570452B (en) 2023-04-07

Family

ID=78172254

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110960351.7A Active CN113570452B (en) 2021-08-20 2021-08-20 Method, system, storage medium and terminal for solving fraud detection by quantum hidden Markov model

Country Status (1)

Country Link
CN (1) CN113570452B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117808006B (en) * 2024-01-03 2024-07-12 北京工商大学 Fraud recognition method and system based on emotion and behavior combined classification

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11288585B2 (en) * 2015-12-30 2022-03-29 Google Llc Quantum statistic machine
CN107990820B (en) * 2017-11-28 2019-09-24 四川元匠科技有限公司 A kind of plate thickness information detecting method based on impulse eddy current
US20220058175A1 (en) * 2018-09-13 2022-02-24 Nec Corporation Data analysis apparatus, data analysys method, and program
US11694103B2 (en) * 2018-09-19 2023-07-04 Microsoft Technology Licensing, Llc Quantum-walk-based algorithm for classical optimization problems

Also Published As

Publication number Publication date
CN113570452A (en) 2021-10-29

Similar Documents

Publication Publication Date Title
CN112529168B (en) GCN-based attribute multilayer network representation learning method
Rizwan et al. Bitcoin price prediction using deep learning algorithm
JP7476977B2 (en) Prediction method, prediction device, and program
CN111309788A (en) Community structure discovery method and system for bank customer transaction network
CN113570452B (en) Method, system, storage medium and terminal for solving fraud detection by quantum hidden Markov model
CN111178902B (en) Network payment fraud detection method based on automatic feature engineering
Zhang et al. Consumer credit risk assessment: A review from the state-of-the-art classification algorithms, data traits, and learning methods
Zarnaz et al. Credit card approval prediction by non-negative tensor factorization
Breskuvienė et al. Categorical feature encoding techniques for improved classifier performance when dealing with imbalanced data of fraudulent transactions
Zhang et al. Multimodel integrated enterprise credit evaluation method based on attention mechanism
Potluru et al. Synthetic data applications in finance
Zhu et al. Loan default prediction based on convolutional neural network and LightGBM
Ganji et al. Lagrangian constrained community detection
CN117273157A (en) Quantum core method, classification method, data coding method, related system and device
Li et al. An alternating nonmonotone projected Barzilai–Borwein algorithm of nonnegative factorization of big matrices
CN113570158A (en) Stock prediction method, system, storage medium and terminal of quantum hidden Markov
Islam et al. Discovering probabilistically weighted sequential patterns in uncertain databases
Karimnejad Esfahani et al. An RBF approach for oil futures pricing under the jump-diffusion model
CN113919862A (en) Marketing arbitrage black product identification method based on dynamic attention-drawing network
Bhuyan et al. The Forecasting About Bitcoin and Other Digital Currency Markets: The Effects of Data Mining and Other Emerging Technologies
CN114757723B (en) Data analysis model construction system and method for resource element trading platform
Becker Decomposition methods for large scale stochastic and robust optimization problems
Chaudhry et al. Artificial Intelligence with Streamlining Payments and Lending for a Simpler Financial Ecosystem
Ching et al. Modeling default data via an interactive hidden Markov model
Fitzpatrick et al. Lightcone Hamiltonian for Ising Field Theory I: T< T_c

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant