AI & ML interests

A Financial AI Repository



Time Series Foundation Models for Finance

Links: SSRN · arXiv · ResearchGate · Website: FinText.ai · GitHub: FinText.ai

šŸ†• GitHub Model Loading Support

All models can now be loaded directly from GitHub. The repository includes utilities and setup instructions. šŸ”— https://github.com/DeepIntoStreams/TSFM_Finance
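The GitHub repository provides its own loaders; as an alternative route, a minimal sketch for the Chronos variants is shown below. It assumes the FinText checkpoints are compatible with the upstream chronos-forecasting package; the repo id is taken from this page and the input tensor is purely illustrative.

```python
# Hedged sketch: assumes `pip install chronos-forecasting` and that the
# FinText checkpoints follow the upstream Chronos format. The repo id is
# taken from this page; the context tensor is illustrative only.
import torch
from chronos import ChronosPipeline

pipeline = ChronosPipeline.from_pretrained(
    "FinText/Chronos_Tiny_2002_US",  # Chronos-Tiny, U.S. collection, year 2002
    device_map="cpu",                # or "cuda" if a GPU is available
    torch_dtype=torch.float32,
)

context = torch.randn(252) * 0.01    # e.g., 252 trading days of excess returns
forecast = pipeline.predict(context, prediction_length=5)
print(forecast.shape)                # (1, num_samples, 5) sample paths
```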

šŸš€ TSFMs Release

We are pleased to introduce FinText-TSFM, a suite of 613 time series foundation models (TSFMs) pre-trained for quantitative finance. This release accompanies the paper Re(Visiting) Time Series Foundation Models in Finance by Eghbal Rahimikia, Hao Ni, and Weiguan Wang (2025).

šŸ’” Key Highlights

  • Finance-Native Pre-training:
    Models are pre-trained from scratch on large-scale financial time series datasets — including daily excess returns across 89 markets and over 2 billion observations — to ensure full temporal and domain alignment.

  • Bias-Free Design:
    Pre-training strictly follows a chronological expanding-window setup, avoiding any look-ahead bias or information leakage.
    Each variant includes one separately pre-trained model per year from 2000 to 2023, with training data starting in 1990 (see the sketch after this list).

  • Model Families:
    This release includes variants of Chronos and TimesFM architectures adapted for financial time series:

    • Chronos-Tiny (8M) / Mini (20M) / Small (46M)
    • TimesFM-8M / 20M
  • Model Collections:

    • U.S.: Covers U.S. market-wide excess returns from 2000 to 2023, with one pre-trained model per year.
    • Global: Covers excess returns across 94 global markets from 2000 to 2023, with one pre-trained model per year.
    • Augmented: Extends the global data with augmented factors from 2000 to 2023, with one pre-trained model per year.
    • The remaining 253 pre-trained models, covering alternative hyperparameter configurations for extended experimentation and performance comparison, are available for download via the FinText.ai Portal.
  • Performance Insights:
    Our findings show that off-the-shelf TSFMs underperform in zero-shot forecasting, while finance-pre-trained models deliver large gains in both predictive accuracy and portfolio performance.

  • Evaluation Scope:
    Models are benchmarked across the U.S. and seven international markets, using rolling windows of 5, 21, 252, and 512 days, with over 18 million out-of-sample forecasts of daily excess returns spanning 2001 to 2023, evaluated at both the statistical and economic performance levels (see the sketch below).
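As a schematic illustration (not the authors' code) of the expanding-window pre-training and rolling-window evaluation described above, with synthetic data and illustrative dates:

```python
# Schematic sketch of the chronological expanding-window setup and the
# rolling evaluation windows; data, dates, and window handling are illustrative.
import numpy as np
import pandas as pd

idx = pd.bdate_range("1990-01-01", "2023-12-29")        # business days
rng = np.random.default_rng(0)
returns = pd.Series(rng.normal(0.0, 0.01, len(idx)), index=idx)

# Expanding-window pre-training: the model for year Y sees only data up to
# the end of year Y, so no future information leaks into training.
for year in range(2000, 2024):
    train = returns[: f"{year}-12-31"]                  # 1990 .. end of year Y
    # pretrain_model(train)  # hypothetical call, one model per year

# Rolling context windows for out-of-sample forecasting:
for window in (5, 21, 252, 512):
    test = returns["2001-01-01":].to_numpy()
    contexts = np.lib.stride_tricks.sliding_window_view(test, window)[:-1]
    targets = test[window:]                             # next-day excess return
    print(window, contexts.shape, targets.shape)
```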

🧠 Technical Overview

  • Architecture: Transformer-based TSFMs (Chronos & TimesFM)
  • Compute: 50,000 GPU hours on NVIDIA GH200 Grace Hopper clusters
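To fetch a TimesFM checkpoint locally, a minimal sketch using the standard huggingface_hub API is shown below; inference itself then goes through the utilities in the GitHub repository above (or the upstream timesfm package, if the checkpoint format matches).

```python
# Hedged sketch: downloads the raw checkpoint files only; the repo id is
# taken from this page. Running the model is handled by the repository's
# utilities (or the upstream `timesfm` package, if compatible).
from huggingface_hub import snapshot_download

local_dir = snapshot_download(repo_id="FinText/TimesFM_20M_2023_Augmented")
print(local_dir)  # local path containing the downloaded checkpoint
```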

šŸ“š Citation

Please cite the accompanying paper if you use these models:

Rahimikia, Eghbal; Ni, Hao; Wang, Weiguan (2025). Re(Visiting) Time Series Foundation Models in Finance.
SSRN: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=5770562

šŸ”‹ Acknowledgments

This project was made possible through computational and institutional support from:

  • UK Research and Innovation (UKRI)
  • Isambard-AI National AI Research Resource (AIRR)
  • Alliance Manchester Business School (AMBS), University of Manchester
  • N8 Centre of Excellence in Computationally Intensive Research (N8 CIR)
  • The University of Manchester (Research IT & Computational Shared Facility)
  • University College London (UCL)
  • The Alan Turing Institute
  • Shanghai University

Developed by:


Alliance Manchester Business School, University of Manchester
Department of Mathematics, University College London (UCL)

Powered by:


Isambard-AI, Bristol Centre for Supercomputing (BriCS)
The Bede Supercomputer
