EP3655893A1 - Systeme d'apprentissage machine pour diverses applications informatiques (Machine learning system for various computer applications) - Google Patents
Machine learning system for various computer applications
- Publication number
- EP3655893A1 (application EP18755710.3A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- hardware
- transactions
- software arrangement
- neural network
- authentication
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/047—Probabilistic or stochastic networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q20/00—Payment architectures, schemes or protocols
- G06Q20/38—Payment protocols; Details thereof
- G06Q20/40—Authorisation, e.g. identification of payer or payee, verification of customer or shop credentials; Review and approval of payers, e.g. check credit lines or negative lists
- G06Q20/401—Transaction verification
- G06Q20/4016—Transaction verification involving fraud or risk level assessment in transaction processing
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/044—Recurrent networks, e.g. Hopfield networks
- G06N3/0442—Recurrent networks, e.g. Hopfield networks characterised by memory or gating, e.g. long short-term memory [LSTM] or gated recurrent units [GRU]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/06—Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
- G06N3/063—Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using electronic means
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/084—Backpropagation, e.g. using gradient descent
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/09—Supervised learning
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/0985—Hyperparameter optimisation; Meta-learning; Learning-to-learn
Definitions
- The invention relates to the field of fraud detection systems, in particular fraud detection during an authentication, an operation or a transaction.
- Such systems generally rely on statistical learning based on decision-tree forests (random forests), which analyze a sampling of non-sequential data.
- The object of the present invention is therefore to provide a system for detecting fraud during identification that overcomes at least some of the disadvantages of the prior art, by providing a machine learning system for various computer applications allowing a text search for the detection of defects or anomalies in an authentication, operation or transaction performed by the application, the system comprising:
- The neural network training the processing model is advantageously:
- a long short-term memory (LSTM) recurrent neural network.
- The LSTM-type recurrent neural network comprises at least two recurrent layers and a logistic regression classifier positioned on top of the last recurrent layer, taking into account the time elapsed between two authentications, operations or transactions.
- The hardware and software arrangement for validating the authentication, operation or transaction is parameterized with a Jaccard-index matrix so as to measure the degree of similarity between the output data of a first, LSTM-type neural network and the output data of a second neural network for statistical learning of the decision-tree type, and to validate the results of one of the two neural networks.
- The hardware and software arrangement forming the recurrent neural network that produces the LSTM-type model uses a graphics processing unit (GPU).
- The hardware and software arrangement forming the preprocessing system comprises:
- at least one first database containing at least one set of sequential schemas of raw data relating to said computer application;
- a hardware and software arrangement forming at least one second database containing at least one set of external data; a hardware and software arrangement for enriching the raw data with the external data.
- The preprocessing system operates in a multi-threaded mode.
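For illustration only, the following minimal sketch shows what such a multi-threaded enrichment step could look like. The record fields, the external lookup table and the thread-pool size are assumptions, not details taken from the application.

```python
from concurrent.futures import ThreadPoolExecutor

# Raw records standing in for the first database, and an external lookup table
# standing in for the second database (illustrative fields only).
raw_records = [{"id": i, "amount": 12.5 * i, "country": "FR"} for i in range(1000)]
external_data = {"FR": {"country_risk": 0.2}, "DE": {"country_risk": 0.1}}

def enrich(record):
    """Join one raw record with the external data (here, a country-level attribute)."""
    enriched = dict(record)
    enriched.update(external_data.get(record["country"], {}))
    return enriched

# Multi-threaded enrichment of the raw data, as suggested by the multi-threaded
# preprocessing mode described above.
with ThreadPoolExecutor(max_workers=8) as pool:
    enriched_records = list(pool.map(enrich, raw_records))
```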
- FIG. 1 is a schematic representation of a recurrent neural network unrolled in time by creating a copy of the model for each time step.
- Figure 2 shows averaged precision-recall curves on the test set (the figure shows LSTM results on long sequences).
- The horizontal dashed lines indicate the average AUCPR for each curve (the figure shows the LSTM results on long sequences).
- FIG. 4 shows a pairwise comparison of the true positive sets of two models measured with the Jaccard index and encoded in color in a density map
- FIG. 5 shows the training architecture of an LSTM model.
- Figure 6 shows a meta-classifier that combines the LSTM model and the random forest model.
- Figure 7 shows a fraud detection framework according to the invention.
- Fraudulent authentications, operations or transactions can be understood as anomalies in consumer buying behavior, or as a set of outliers with respect to the class of genuine authentications, operations or transactions, which themselves form a class opposed to the fraudulent ones.
- Frauds blend very well with genuine authentications, operations or transactions, for two reasons.
- First, the genuine buying actions of millions of consumers naturally cover a broad spectrum of variability.
- Second, fraudsters apply a variety of rational strategies for performing fraudulent acts that span multiple consumer accounts over different time periods; in the end, however, these acts appear only as individual authentications, operations or transactions in a dataset.
- Identical purchasing actions may therefore reflect either completely legitimate behavior in the context of some consumers, or obvious anomalies in the context of other consumers.
- The first method is a well-established practice in the field of credit card fraud detection and is based on manual feature engineering.
- In the second method, we focus on recovering the sequential structure of a user's history of authentications, operations or transactions by modeling the transition dynamics between authentications, operations or transactions by means of a recurrent neural network.
- A long short-term memory (LSTM) network is a special variant of a recurrent neural network (RNN).
- RNN: recurrent neural network.
- Recurrent neural networks were developed in the 1980s [Williams and Hinton, 1986, Werbos, 1988, Elman, 1990] for time series modeling.
- The structure of an RNN is similar to that of a standard multilayer perceptron, with the difference that it allows connections among hidden units associated with discrete time steps.
- the time steps index the individual elements in an input sequence.
- The model can retain information about past inputs, which allows it to discover temporal correlations between events that are possibly far apart from one another in the input sequence. This is a crucial property for the appropriate learning of time series in which the occurrence of an event is likely to depend on the presence of several other events even more distant in time.
- A generic recurrent neural network with an input $x_t$ and a state $s_t$ at time step $t$ is represented by Equation 1.
- The initial state $s_0$ is the zero vector and $\sigma$ is a nonlinear element-wise activation function (tanh in this case).
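Equation 1 is not reproduced in this text; the following is a standard form of the recurrence consistent with the description above (the weight matrices $W$, $U$ and the bias $b$ are assumed notation):

$$
s_t = \sigma\left(W s_{t-1} + U x_t + b\right), \qquad s_0 = \mathbf{0}, \qquad \sigma = \tanh
$$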
- A cost $\mathcal{E}$ measures network performance on a given task and is typically composed of the costs at all time steps.
- Such a composite cost is applicable, for example, to text-tagging tasks, in which a tag is assigned to each input word. In the present case, only the label of the last authentication, operation or transaction in a sequence is predicted.
- The model parameters $\theta$ are learned by minimizing the cost $\mathcal{E}$ with a gradient-based optimization method.
- One approach that can be used to calculate the required gradients is backpropagation through time (BPTT).
- BPTT works by deploying a recurrent network over time to represent it as a deep multilayer network with as many hidden layers as there are time steps (see Figure 1).
- the well-known backpropagation algorithm [Williams and Hinton, 1986] is applied to the deployed network.
- The parameter $\theta$ affects the error not only through the last state but also through all the previous states. Similarly, the error depends on $W$ across all states $s$. This dependence becomes problematic when calculating the gradient of the cost $\mathcal{E}$ with respect to $W$.
- The Jacobian matrix $\partial s_t / \partial s_k$ contains all the component-wise interactions between state $s_k$ and state $s_t$. It can be understood as a means of transporting the error from state $t$ back to state $k$, and it arises as the product of all pairwise interactions between consecutive states.
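As a worked illustration (standard BPTT notation, assumed rather than quoted from the application), the gradient of the cost at time step $t$ with respect to the recurrent weights decomposes over all earlier states, and the Jacobian factors into a product over consecutive states:

$$
\frac{\partial \mathcal{E}_t}{\partial W} = \sum_{k=1}^{t} \frac{\partial \mathcal{E}_t}{\partial s_t}\,\frac{\partial s_t}{\partial s_k}\,\frac{\partial s_k}{\partial W},
\qquad
\frac{\partial s_t}{\partial s_k} = \prod_{i=k+1}^{t} \frac{\partial s_i}{\partial s_{i-1}}
$$

This repeated product is what makes long-range dependencies hard to learn with plain RNNs (vanishing or exploding gradients), which motivates the LSTM variant used here.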
- One means of extracting information from a sequence of authentications, operations or transactions consists in aggregating the values of certain variables along the sequence. To build these feature aggregations, one follows the procedure recently proposed by [Brusen et al., 2016]. This simple but powerful procedure can be considered the state-of-the-art feature engineering technique in credit card fraud detection. It adds new features to each authentication, operation or transaction based on certain predefined rules. The value of a new feature is calculated with an aggregation function applied to a subset of previous transactions. The goal is to create a record of activity from the history of authentications, operations or transactions of a cardholder, which quantifies the degree to which the authentication, operation or transaction in progress conforms to the previous ones.
- $(x_t)_{t \in \mathbb{N}}$ is the temporally ordered sequence of authentications, operations or transactions of a given cardholder, where $t$ indexes the individual authentications, operations or transactions in that sequence.
- The value of a particular variable in an authentication, operation or transaction is indicated by a corresponding symbol;
- for example, the amount is the quantity spent in an authentication, operation or transaction $x_t$.
- A subset of past authentications, operations or transactions is selected up to a maximum time horizon $t_h$ and according to certain nominal variables $A$ and $B$:
- the set $S_k$ contains all the authentications, operations or transactions of the $t_h$ hours preceding $x_k$ in which the nominal variables $A$ and $B$ have taken the same values as for $x_k$;
- the pair $(\mathrm{sum}_k, \mathrm{count}_k)$ corresponds to a single constraint given by $A$, $B$ and $t_h$;
- these pairs are calculated for all combinations of the country, merchant-class and card-entry-mode variables, within a time horizon of 24 hours. Finally, all these pairs are added to the feature vector $x_k$ of the authentication, operation or transaction.
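For illustration, the sketch below computes such (sum, count) aggregation pairs over a 24-hour horizon with pandas; the column names (`card_id`, `timestamp`, `amount`, and the nominal variables passed in `group_cols`) are assumptions, not names from the application.

```python
import pandas as pd

def add_aggregated_features(df, group_cols, horizon_hours=24):
    """For each transaction, sum the amounts and count the cardholder's prior
    transactions within `horizon_hours` that share the same values of the
    nominal variables listed in `group_cols`."""
    df = df.sort_values("timestamp").reset_index(drop=True)
    horizon = pd.Timedelta(hours=horizon_hours)
    sums, counts = [], []
    for _, row in df.iterrows():
        past = df[
            (df["card_id"] == row["card_id"])
            & (df["timestamp"] < row["timestamp"])
            & (df["timestamp"] >= row["timestamp"] - horizon)
        ]
        # Keep only past transactions whose nominal variables match the current one.
        for col in group_cols:
            past = past[past[col] == row[col]]
        sums.append(past["amount"].sum())
        counts.append(len(past))
    suffix = "_".join(group_cols)
    df["sum_" + suffix] = sums
    df["count_" + suffix] = counts
    return df

# Example: aggregation pair constrained by country and card entry mode.
# df = add_aggregated_features(df, ["country", "card_entry_mode"])
```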
- The phenomenon of real interest is the genuine purchasing behavior of cardholders or, conversely, the malicious behavior of fraudsters. It is assumed that this object, loosely referred to as behavior, is governed by certain latent but coherent qualities. With its state variables, the LSTM is in principle able to identify these qualities from the sequence of observations.
- Sequence data set: on the basis of a set of labeled credit card authentications, operations or transactions recorded between March and May 2015, data sets were created as follows: all the authentications, operations or transactions of an identified cardholder are grouped together, and the authentications, operations or transactions of each cardholder are sorted by time. As a result, a temporally ordered sequence of authentications, operations or transactions is obtained for each cardholder. In the rest of this description, this sequence is called a cardholder's account, and the complete set of all accounts is called the sequence data set.
- The sequence data set is further divided into two mutually exclusive sets: one set of sequence data contains only the e-commerce authentications, operations or transactions (ECOM), and the other set contains only the authentications, operations or transactions made at points of sale (face-to-face, F2F).
- A typical characteristic of fraud detection problems is the strong imbalance between the minority class (fraudulent transactions) and the majority class (genuine transactions). The overall fraction of fraudulent authentications, operations or transactions is usually about 0.5% or less. In the F2F data set, frauds occur with an order of magnitude lower frequency than in the ECOM data set, further exacerbating the detection problem.
- Literature studies [Bhattacharyya et al., 2011] and previous experiments have shown that some form of under-sampling of the majority class in the training set improves learning.
- Such a transaction-level under-sampling strategy cannot be applied directly to a set of sequence data. Therefore, under-sampling is applied at the account level.
- An account is considered compromised if it contains at least one fraudulent authentication, operation or transaction, and is considered genuine if it contains only genuine transactions.
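A minimal sketch of account-level under-sampling, assuming a pandas DataFrame with hypothetical columns `card_id` and `is_fraud`; the fraction of genuine accounts kept is illustrative and not taken from the application.

```python
import pandas as pd

def undersample_accounts(df, genuine_keep_fraction=0.1, seed=0):
    """Keep every compromised account (at least one fraud) and only a random
    fraction of the purely genuine accounts."""
    # An account is compromised if any of its transactions is fraudulent.
    account_has_fraud = df.groupby("card_id")["is_fraud"].max()
    compromised = account_has_fraud[account_has_fraud == 1].index
    genuine = account_has_fraud[account_has_fraud == 0].index.to_series()
    kept_genuine = genuine.sample(frac=genuine_keep_fraction, random_state=seed)
    kept_accounts = set(compromised) | set(kept_genuine)
    return df[df["card_id"].isin(kept_accounts)]
```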
- Deferred ground truth: the present test period begins more than a week after the training period. The reason for this choice is twofold. First, in a production system, the labels of authentications, operations or transactions are only available after human investigators have verified them; as a result, the availability of reliable ground truth is always delayed by about a week. Second, classification is typically more accurate on recent authentications, operations or transactions that closely follow the training period; but this accuracy is likely to be an overly optimistic estimate of the classifier's performance in a production system, since in practice the true labels for that period are not yet available.
- The first set of features (BASE) contains all the raw features after the merchant-specific variables have been removed. Since frauds do not usually appear in isolation but rather as elements of complete fraud sequences that may span several hours or days, the identity of the cardholder has also been removed from the set of features. Otherwise, a classifier could simply memorize the identities of cardholders with compromised accounts and make decisions only within this much smaller set of transactions. In practice, however, one would rather first determine whether there is a fraudulent authentication, operation or transaction and only then mark the account as compromised.
- The second set of features (TDELTA) contains all the features of the BASE set plus the time-delta feature described in section 3.2.
- The third set of features (AGG) contains all the features of the TDELTA set plus 14 aggregated features, as described above.
- The authentications, operations or transactions of the preceding 24 hours were aggregated in terms of the amount and the number of authentications, operations or transactions, based on all combinations of the term-mcc, term-country and card-entry-mode nominal variables. See Table 2 for an overview of the features.
- Table 2 List of features in these datasets.
- Features marked with an asterisk (*) are composite features composed of several lower-level features.
- Nominal variables: in the case of the random forest, the nominal variables can be used just as they are; we only established a correspondence between each value and an integer. In the case of neural networks, we wanted to avoid one-hot encoded feature vectors of very high dimension. Therefore, an embedding mechanism that is very popular in natural language processing with neural networks was employed (Collobert et al. [2011], Socher et al. [2013], Tang et al. [2014]), which is applicable to arbitrary nominal variables other than words [Guo and Berkhahn, 2016].
- The feature values and their corresponding vectors are stored in a dictionary. To encode a particular value of a nominal variable, we look up that value in the dictionary and retrieve its vector.
- The embedding vectors are part of the model parameters and can be adjusted jointly during parameter estimation.
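The sketch below illustrates such a trainable embedding of a nominal variable with Keras (the library named further below); the variable, its cardinality and the embedding dimension are illustrative assumptions.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Hypothetical nominal variable (e.g. merchant country) with 200 distinct values,
# each value mapped to an integer index via a dictionary.
value_index = {"FR": 0, "DE": 1, "US": 2}   # ... up to 200 entries
n_values, embedding_dim = 200, 8

inp = layers.Input(shape=(1,), dtype="int32")                  # integer index of the value
emb = layers.Embedding(input_dim=n_values, output_dim=embedding_dim)(inp)
out = layers.Flatten()(emb)                                    # dense vector for the value
embedder = keras.Model(inp, out)

# Retrieve the (initially random, later trained) embedding vector for "FR".
vec = embedder.predict(np.array([[value_index["FR"]]]))
# When this Embedding layer is part of a larger classifier, its vectors are
# adjusted jointly with the other parameters during training.
```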
- Time feature: we consider the time feature as a composition of several nominal variables. For each temporal resolution of the timestamp, i.e. the year, the month, the day, the hour, the minute and the second, we define a nominal variable in the same way as described above.
- The long short-term memory network has two recurrent layers and a logistic regression classifier stacked on top of the last layer.
- The logistic regression classifier can be trained in conjunction with the LSTM state-transition model via error backpropagation.
- Dropout [Srivastava et al., 2014] is applied to the LSTM nodes to regularize the parameters, and the whole model is trained by minimizing the cross-entropy between the predicted class distribution and the true class distribution with the ADAM algorithm. This implementation is based on the Keras deep learning library.
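A minimal Keras sketch of the architecture just described (two recurrent LSTM layers, dropout, and a logistic regression classifier trained with cross-entropy and Adam); the layer sizes, dropout rate, sequence length and feature dimension are illustrative, since the hyper-parameters of Table 3 are not reproduced here.

```python
from tensorflow import keras
from tensorflow.keras import layers

seq_len, n_features = 10, 30        # illustrative sequence length and feature dimension

model = keras.Sequential([
    layers.Input(shape=(seq_len, n_features)),
    layers.LSTM(100, return_sequences=True, dropout=0.2),   # first recurrent layer
    layers.LSTM(100, dropout=0.2),                           # second recurrent layer
    layers.Dense(1, activation="sigmoid"),                   # logistic regression on top
])

model.compile(
    optimizer=keras.optimizers.Adam(learning_rate=1e-3),
    loss="binary_crossentropy",      # cross-entropy between predicted and true class distributions
)

# model.fit(X_train, y_train, validation_data=(X_val, y_val), epochs=10, batch_size=256)
```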
- Grid search: both the random forest (RF) and the LSTM must be parameterized with hyper-parameters.
- The space of possible hyper-parameter configurations was searched over a coarse grid spanning a subset of all hyper-parameters (see Table 3). The configuration with the maximum AUCPR@0.2 value on the validation set was then selected.
- Table 3 Hyper-parameters taken into consideration during the grid search
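For illustration, the sketch below runs such a grid search for the random forest, selecting the configuration with the highest AUCPR@0.2 on a validation set; the parameter grid and the synthetic data are assumptions, not the values of Table 3.

```python
import numpy as np
from itertools import product
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import precision_recall_curve, auc

def aucpr_at(y_true, scores, max_recall=0.2):
    """Area under the precision-recall curve restricted to recall <= max_recall."""
    precision, recall, _ = precision_recall_curve(y_true, scores)
    recall, precision = recall[::-1], precision[::-1]    # make recall increasing
    mask = recall <= max_recall
    return auc(recall[mask], precision[mask])

# Synthetic stand-ins for the training and validation sets.
rng = np.random.default_rng(0)
X_train, y_train = rng.normal(size=(2000, 10)), rng.integers(0, 2, 2000)
X_val, y_val = rng.normal(size=(500, 10)), rng.integers(0, 2, 500)

param_grid = {"n_estimators": [100, 300], "max_depth": [8, None]}   # illustrative grid
best_score, best_params = -1.0, None
for values in product(*param_grid.values()):
    params = dict(zip(param_grid, values))
    clf = RandomForestClassifier(random_state=0, **params).fit(X_train, y_train)
    score = aucpr_at(y_val, clf.predict_proba(X_val)[:, 1])
    if score > best_score:
        best_score, best_params = score, params
```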
- AUCPR: a precision-recall (PR) curve, and in particular the area under this curve, was used to quantify detection accuracy.
- PR: precision-recall curve.
- Each point on the PR curve corresponds to the precision of the classifier at a specific recall level.
- The entire curve gives a complete picture of a classifier's accuracy and of its robustness, even in unbalanced settings.
- The integral under this curve yields a single-valued summary of performance, called the AUCPR.
- AUCPR@0.2: from a business point of view, low recall with high precision is preferable to high recall with low precision. A typical choice is therefore to measure the precision on the first K elements of the ranked result list. This precision at K corresponds to an isolated point on the PR curve and is likely to vary with the value chosen for K. In order to reflect business interests and to avoid this variability, it is suggested to use the integral of the PR curve computed up to a certain recall level (0.2 in the present experiments). The maximum value of AUCPR@0.2 is 0.2.
- Jaccard index: to explore the qualitative differences between the two present approaches, the Jaccard index was used to measure the degree to which two classifiers are similar in terms of the frauds they detect. Given two sets of results (true positives) $A$ and $B$, the Jaccard index is defined by $J(A, B) = \dfrac{|A \cap B|}{|A \cup B|}$.
- The decision threshold is set to the value that corresponds to a recall of 0.2.
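A small sketch of this comparison, assuming each model provides a fraud score per transaction; the synthetic scores and labels below are purely illustrative.

```python
import numpy as np

def true_positive_set(y_true, scores, target_recall=0.2):
    """Indices of the frauds caught when the decision threshold is chosen so
    that the recall equals (approximately) `target_recall`."""
    order = np.argsort(-scores)                      # transactions ranked by decreasing score
    fraud_indices = order[y_true[order] == 1]        # frauds, best-scored first
    n_caught = max(1, int(round(target_recall * (y_true == 1).sum())))
    return set(fraud_indices[:n_caught])

def jaccard(a, b):
    return len(a & b) / len(a | b) if (a | b) else 1.0

# Hypothetical test labels and scores from the two models.
rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, 1000)
scores_lstm, scores_rf = rng.random(1000), rng.random(1000)

tp_lstm = true_positive_set(y_true, scores_lstm)
tp_rf = true_positive_set(y_true, scores_rf)
print(jaccard(tp_lstm, tp_rf))                       # similarity of the detected fraud sets
```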
- Savings are another metric that is often used in the field of credit card fraud detection. They measure the monetary benefit of a given algorithm over a trivial acceptor/rejector and are based on a predefined cost matrix.
- The individual entries are composed of a processing cost $C_p$, a chargeback cost $C_{cb}$ and a transaction-dependent cost $g(\cdot)$.
- $g$ represents the monetary loss due to frauds occurring while the investigation process is in progress. It is defined by:
- $F_i$ is the set of fraudulent authentications, operations or transactions that occur up to $T$ hours after the authentication, operation or transaction $x_i$.
- A model was trained for each combination of feature set, data set and sequence length, and its classification performance was tested on the held-out test set. In the case of random forests, the length of the input sequence has no influence on the model, since only the last authentication, operation or transaction of the input sequence is used. Trained models were evaluated on each of the 24 test days individually, and their average performance is reported in terms of the metric values defined above.
- Table 5 and Table 6 show a summary of the results for the face-to-face and e-commerce data sets.
- A first observation is that the overall detection accuracy is much higher on ECOM than on F2F, which can be explained by the higher proportion of frauds in the ECOM data set.
- longer input sequences seem to have no effect on the accuracy of detection, neither for F2F nor for ECOM.
- Table 5 Average AUC on all test days. Sequence lengths (SHORT, LONG) and sets of features (BASE, TDELTA, AGG)
- Tables 5 and 6 report the average statistics on all test days.
- When the AUCPRs of the RF and the LSTM are plotted for the individual test days, it can be seen in Figure 3 that the predictions of the two classifiers show strong variations from day to day.
- Since the curves are correlated, we can deduce that on some days the detection problem is more difficult than on others.
- Both classifiers reach their minimum AUCPR values in the periods 9/05 - 10/05 and 25/05 - 26/05.
- Model regularization: when dealing with a temporal process for which one aims at predicting certain properties of future events, no collection of historical data points can truly satisfy the requirements of a representative validation set. Prediction accuracy on the day immediately following the end of the training set is better than on days further in the future, suggesting a time dependence of the conditional distribution. When the days just after the training period are chosen as the validation set, the results on this set will suggest only light regularization of the model; but this choice has the opposite effect on performance for days further in the future. A model that fits today's data very closely will probably be poor in a few days, while a less closely fitted model of today will still be valid in a few days.
- The system can use only the long short-term memory (LSTM) recurrent neural network, or only the decision-tree-based statistical learning neural network, or a combination of both (see Figure 6).
- LSTM: long short-term memory.
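Figure 6 describes a meta-classifier combining the LSTM and random forest models; the exact combination scheme is not detailed in this text, so the following is only one plausible realization (stacking a logistic regression on the two base-model scores), with synthetic scores standing in for the real model outputs.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical validation-set labels and fraud scores from the two base models.
rng = np.random.default_rng(0)
y_val = rng.integers(0, 2, 1000)
scores_lstm = rng.random(1000)     # P(fraud) from the LSTM sequence model
scores_rf = rng.random(1000)       # P(fraud) from the random forest

# Meta-classifier: a logistic regression trained on the two base scores.
meta = LogisticRegression()
meta.fit(np.column_stack([scores_lstm, scores_rf]), y_val)

# Combined fraud probability for new transactions, given both base-model scores.
combined = meta.predict_proba(np.column_stack([scores_lstm, scores_rf]))[:, 1]
```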
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Software Systems (AREA)
- Biomedical Technology (AREA)
- Biophysics (AREA)
- Mathematical Physics (AREA)
- Molecular Biology (AREA)
- Computing Systems (AREA)
- Evolutionary Computation (AREA)
- Data Mining & Analysis (AREA)
- Computational Linguistics (AREA)
- Artificial Intelligence (AREA)
- General Health & Medical Sciences (AREA)
- Computer Security & Cryptography (AREA)
- Business, Economics & Management (AREA)
- Accounting & Taxation (AREA)
- Computer Hardware Design (AREA)
- Neurology (AREA)
- Probability & Statistics with Applications (AREA)
- Finance (AREA)
- Strategic Management (AREA)
- General Business, Economics & Management (AREA)
- Financial Or Insurance-Related Operations Such As Payment And Settlement (AREA)
- Collating Specific Patterns (AREA)
- Testing And Monitoring For Control Systems (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
- Machine Translation (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| FR1756823A FR3069357B1 (fr) | 2017-07-18 | 2017-07-18 | Systeme d'apprentissage machine pour diverses applications informatiques |
| PCT/EP2018/069176 WO2019016106A1 (fr) | 2017-07-18 | 2018-07-13 | Systeme d'apprentissage machine pour diverses applications informatiques |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| EP3655893A1 (fr) | 2020-05-27 |
Family
ID=60182698
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| EP18755710.3A Ceased EP3655893A1 (fr) | 2017-07-18 | 2018-07-13 | Systeme d'apprentissage machine pour diverses applications informatiques |
Country Status (5)
| Country | Link |
|---|---|
| US (1) | US11763137B2 (fr) |
| EP (1) | EP3655893A1 (fr) |
| CN (1) | CN110998608B (fr) |
| FR (1) | FR3069357B1 (fr) |
| WO (1) | WO2019016106A1 (fr) |
Families Citing this family (14)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11829866B1 (en) * | 2017-12-27 | 2023-11-28 | Intuit Inc. | System and method for hierarchical deep semi-supervised embeddings for dynamic targeted anomaly detection |
| CN108985920A (zh) | 2018-06-22 | 2018-12-11 | 阿里巴巴集团控股有限公司 | 套现识别方法和装置 |
| US12039458B2 (en) * | 2019-01-10 | 2024-07-16 | Visa International Service Association | System, method, and computer program product for incorporating knowledge from more complex models in simpler models |
| US20200387818A1 (en) * | 2019-06-07 | 2020-12-10 | Aspen Technology, Inc. | Asset Optimization Using Integrated Modeling, Optimization, and Artificial Intelligence |
| CN110362494B (zh) * | 2019-07-18 | 2021-06-15 | 腾讯科技(深圳)有限公司 | 微服务状态信息展示的方法、模型训练方法以及相关装置 |
| US11899765B2 (en) | 2019-12-23 | 2024-02-13 | Dts Inc. | Dual-factor identification system and method with adaptive enrollment |
| CN111123894B (zh) * | 2019-12-30 | 2021-09-07 | 杭州电子科技大学 | 一种基于lstm和mlp结合的化工过程故障诊断方法 |
| FR3109232B1 (fr) * | 2020-04-10 | 2024-08-16 | Advestis | Procede de prediction interpretable par apprentissage fonctionnant sous ressources memoires limitees |
| JP6926279B1 (ja) * | 2020-05-29 | 2021-08-25 | 楽天グループ株式会社 | 学習装置、認識装置、学習方法、認識方法、プログラム、及び再帰型ニューラルネットワーク |
| US11336507B2 (en) * | 2020-09-30 | 2022-05-17 | Cisco Technology, Inc. | Anomaly detection and filtering based on system logs |
| US20220188837A1 (en) * | 2020-12-10 | 2022-06-16 | Jpmorgan Chase Bank, N.A. | Systems and methods for multi-agent based fraud detection |
| CN112598118B (zh) * | 2021-03-03 | 2021-06-25 | 成都晓多科技有限公司 | 有监督学习的标注异常处理方法、装置、存储介质及设备 |
| CN113569993A (zh) * | 2021-08-27 | 2021-10-29 | 浙江工业大学 | 一种聚合反应过程质量预测模型构建方法 |
| US20240303650A1 (en) * | 2023-03-06 | 2024-09-12 | Mastercard International Incorporated | Systems and methods for multi-stage residual modeling approach for analysis and assessment |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11205103B2 (en) * | 2016-12-09 | 2021-12-21 | The Research Foundation for the State University | Semisupervised autoencoder for sentiment analysis |
| CN106600283A (zh) * | 2016-12-16 | 2017-04-26 | 携程旅游信息技术(上海)有限公司 | 识别姓名国籍的方法、系统及判断交易风险的方法、系统 |
| US10762423B2 (en) * | 2017-06-27 | 2020-09-01 | Asapp, Inc. | Using a neural network to optimize processing of user requests |
- 2017
  - 2017-07-18: FR FR1756823A patent/FR3069357B1/fr active Active
- 2018
  - 2018-07-13: EP EP18755710.3A patent/EP3655893A1/fr not_active Ceased
  - 2018-07-13: WO PCT/EP2018/069176 patent/WO2019016106A1/fr not_active Ceased
  - 2018-07-13: CN CN201880053753.5A patent/CN110998608B/zh active Active
  - 2018-07-13: US US16/632,267 patent/US11763137B2/en active Active
Also Published As
| Publication number | Publication date |
|---|---|
| FR3069357B1 (fr) | 2023-12-29 |
| US11763137B2 (en) | 2023-09-19 |
| WO2019016106A1 (fr) | 2019-01-24 |
| CN110998608B (zh) | 2024-02-20 |
| US20200257964A1 (en) | 2020-08-13 |
| CN110998608A (zh) | 2020-04-10 |
| FR3069357A1 (fr) | 2019-01-25 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| EP3655893A1 (fr) | Systeme d'apprentissage machine pour diverses applications informatiques | |
| CN108960833B (zh) | 一种基于异构金融特征的异常交易识别方法,设备及存储介质 | |
| CN113516511B (zh) | 一种金融产品购买预测方法、装置及电子设备 | |
| CN110084609B (zh) | 一种基于表征学习的交易欺诈行为深度检测方法 | |
| WO2010076260A1 (fr) | Procede et systeme pour classifier des donnees issues de base de donnees | |
| RU2723448C1 (ru) | Способ расчета кредитного рейтинга клиента | |
| Jonnalagadda et al. | Credit card fraud detection using Random Forest Algorithm | |
| Tang et al. | Stock movement prediction: A multi‐input LSTM approach | |
| Mizher et al. | Deep CNN approach for unbalanced credit card fraud detection data | |
| Jose et al. | Detection of credit card fraud using resampling and boosting technique | |
| FR3048840A1 (fr) | ||
| Hanae et al. | Analysis of Banking Fraud Detection Methods through Machine Learning Strategies in the Era of Digital Transactions | |
| Shi et al. | An attention-based balanced variational autoencoder method for credit card fraud detection | |
| WO2021110763A1 (fr) | Méthode mise en œuvre par ordinateur pour l'allocation d'une pièce comptable à un couple de comptes débiteur/créditeur et l'écriture comptable | |
| Aziz et al. | Fraudulent transactions detection in credit card by using data mining methods: A review | |
| CN116821759A (zh) | 类别标签的识别预测方法、装置和处理器及电子设备 | |
| CN117078369A (zh) | 金融类用户用卡升级推荐方法、装置、电子设备及存储介质 | |
| WO2023170303A1 (fr) | Methode pour la detection d'anomalie utilisant un modele global-local | |
| CN114757788A (zh) | 用户交易行为识别方法及装置 | |
| Mathew | An Ensemble Machine Learning Model for Classification of Credit Card Fradulent Transactions | |
| Frery | Ensemble Learning for Extremely Imbalced Data Flows | |
| Visalakshi et al. | Detecting credit card frauds using different machine learning algorithms | |
| Peng et al. | Credit scoring model in imbalanced data based on cnn-atcn | |
| Essien | A Synergistic Approach for Enhancing Credit Card Fraud Detection Using Random Forest and Naïve Bayes Models | |
| Kang | Fraud detection in mobile money transactions using machine learning |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: UNKNOWN |
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
| | PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
| | 17P | Request for examination filed | Effective date: 20200213 |
| | AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| | AX | Request for extension of the european patent | Extension state: BA ME |
| | DAV | Request for validation of the european patent (deleted) | |
| | DAX | Request for extension of the european patent (deleted) | |
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS |
| | 17Q | First examination report despatched | Effective date: 20211008 |
| | P01 | Opt-out of the competence of the unified patent court (upc) registered | Effective date: 20230527 |
| | REG | Reference to a national code | Ref country code: DE; Ref legal event code: R003 |
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED |
| | 18R | Application refused | Effective date: 20240906 |