Abstract
Fraud in financial services, especially account opening fraud, poses major operational and reputational risks. Static rules struggle to adapt to evolving tactics, missing novel patterns and generating excessive false positives. Machine learning promises adaptive detection, but deployment faces severe class imbalance: in the NeurIPS 2022 BAF Base benchmark used here, fraud prevalence is 1.10%. Standard metrics (accuracy, f1_weighted) can look strong while doing little for the minority class. We compare logistic regression, SVM (RBF), Random Forest, LightGBM, and a GRU model on N=1,000,000 accounts under a unified preprocessing pipeline. All models are trained to minimize their respective loss functions, and configurations are selected on a stratified development set using validation f1_weighted. For the four classical models, class weighting in the loss (class_weight in {None, 'balanced'}) is treated as a hyperparameter and tuned. The GRU, in turn, is trained with a fixed class-weighted cross-entropy loss that up-weights fraud cases. Both model families therefore leverage weighted training objectives, while their final hyperparameters are selected consistently by the f1_weighted metric. Despite similar AUCs and aligned feature importance across families, the classical models converge to high-precision, low-recall solutions (1-6% fraud recall), whereas the GRU recovers 78% recall at 5% precision (AUC = 0.8800). Under extreme imbalance, objective choice and operating point matter at least as much as architecture.
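A minimal sketch of the selection protocol described above for the classical models, not the authors' code: class_weight is treated as a tunable hyperparameter and configurations are ranked by validation f1_weighted on a stratified split. The synthetic data, the regularization grid, and the use of logistic regression as the example estimator are illustrative assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, StratifiedKFold

# Synthetic development set mimicking ~1.1% fraud prevalence (assumed stand-in for BAF Base).
X_dev, y_dev = make_classification(
    n_samples=5000, n_features=10, weights=[0.989, 0.011], random_state=0
)

param_grid = {
    "class_weight": [None, "balanced"],  # weighting in the loss, tuned as a hyperparameter
    "C": [0.1, 1.0, 10.0],               # assumed regularization grid
}

search = GridSearchCV(
    estimator=LogisticRegression(max_iter=1000),
    param_grid=param_grid,
    scoring="f1_weighted",               # selection metric used in the paper
    cv=StratifiedKFold(n_splits=5, shuffle=True, random_state=0),
    n_jobs=-1,
)
search.fit(X_dev, y_dev)
print(search.best_params_, search.best_score_)
```

For the GRU, the abstract states a fixed class-weighted cross-entropy that up-weights fraud; a hedged PyTorch sketch follows, where the weight value and tensor shapes are illustrative assumptions rather than the authors' settings.

```python
import torch
import torch.nn as nn

# Fraud (class 1) up-weighted roughly by inverse prevalence (~1/0.011); exact value is assumed.
class_weights = torch.tensor([1.0, 90.0])
criterion = nn.CrossEntropyLoss(weight=class_weights)

logits = torch.randn(8, 2)          # dummy batch of GRU output logits
labels = torch.randint(0, 2, (8,))  # dummy fraud labels
loss = criterion(logits, labels)
```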
Publication Info
- Year: 2025
- Type: article
- Citations: 0
- Access: Closed
Identifiers
- DOI: 10.21203/rs.3.rs-8303897/v1