Kronos: A Foundation Model for the Language of Financial Markets

Papers with Code Trending Papers

Summary

Kronos is a new foundation model for financial K-line data that uses a specialized tokenizer and autoregressive pre-training to outperform existing models in forecasting and synthetic data generation.
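The core idea of the specialized tokenizer is to turn continuous K-line values into symbols from a fixed vocabulary so that an autoregressive model can be trained on them like text. As a minimal illustration only (not the paper's actual tokenizer, which jointly preserves price dynamics and trade activity), here is a toy quantile-binning discretizer over per-bar returns; all names and data below are invented for the example:

```python
# Toy sketch: discretize a continuous K-line field into token IDs by
# quantile binning, so each bar maps to one symbol in a small vocabulary.
from bisect import bisect_right

def quantile_edges(values, n_bins):
    """Bin edges at evenly spaced quantiles of the fitted values."""
    ordered = sorted(values)
    return [ordered[int(k * (len(ordered) - 1) / n_bins)]
            for k in range(1, n_bins)]

def tokenize(values, edges):
    """Map each value to the index of the quantile bin it falls into."""
    return [bisect_right(edges, v) for v in values]

# Made-up close prices; tokens are per-bar close-to-close returns binned
# into a vocabulary of 4 symbols.
closes = [100.0, 101.5, 100.8, 102.0, 101.0, 103.5, 103.0, 104.2]
returns = [b / a - 1 for a, b in zip(closes, closes[1:])]

edges = quantile_edges(returns, n_bins=4)
tokens = tokenize(returns, edges)  # one token per bar, values in 0..3
```

A real tokenizer of this kind would fit its bins on the training corpus and encode several fields (open, high, low, close, volume) rather than a single return series.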

The success of the large-scale pre-training paradigm, exemplified by Large Language Models (LLMs), has inspired the development of Time Series Foundation Models (TSFMs). However, their application to financial candlestick (K-line) data remains limited, often underperforming non-pre-trained architectures. Moreover, existing TSFMs often overlook crucial downstream tasks such as volatility prediction and synthetic data generation. To address these limitations, we propose Kronos, a unified, scalable pre-training framework tailored to financial K-line modeling. Kronos introduces a specialized tokenizer that discretizes continuous market information into token sequences, preserving both price dynamics and trade activity patterns. We pre-train Kronos using an autoregressive objective on a massive, multi-market corpus of over 12 billion K-line records from 45 global exchanges, enabling it to learn nuanced temporal and cross-asset representations. Kronos excels in a zero-shot setting across a diverse set of financial tasks. On benchmark datasets, Kronos boosts price series forecasting RankIC by 93% over the leading TSFM and 87% over the best non-pre-trained baseline. It also achieves a 9% lower MAE in volatility forecasting and a 22% improvement in generative fidelity for synthetic K-line sequences. These results establish Kronos as a robust, versatile foundation model for end-to-end financial time series analysis. Our pre-trained model is publicly available at https://github.com/shiyu-coder/Kronos.
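The RankIC metric reported above is conventionally the Spearman rank correlation between predicted and realized values across a cross-section of assets. A self-contained sketch of that computation (the input numbers here are made up for illustration):

```python
# Spearman rank correlation, the usual definition behind "RankIC":
# rank both series, then take the Pearson correlation of the ranks.

def ranks(xs):
    """1-based ranks, averaging tied values."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average 1-based rank of the tie group
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def rank_ic(predicted, realized):
    """Pearson correlation of the two rank vectors (= Spearman rho)."""
    rp, rr = ranks(predicted), ranks(realized)
    n = len(rp)
    mp, mr = sum(rp) / n, sum(rr) / n
    cov = sum((a - mp) * (b - mr) for a, b in zip(rp, rr))
    sp = sum((a - mp) ** 2 for a in rp) ** 0.5
    sr = sum((b - mr) ** 2 for b in rr) ** 0.5
    return cov / (sp * sr)

preds = [0.02, -0.01, 0.005, 0.03, -0.02]   # hypothetical predicted returns
reals = [0.015, -0.02, 0.01, 0.02, -0.005]  # hypothetical realized returns
ic = rank_ic(preds, reals)                  # close to +1 = good ranking
```

In practice the IC is computed per period over many assets and then averaged; the 93% figure in the abstract refers to relative improvement of this averaged score over the baselines.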

Cached at: 05/08/26, 08:33 AM

Paper page - Kronos: A Foundation Model for the Language of Financial Markets

Source: https://huggingface.co/papers/2508.02739



Get this paper in your agent:

hf papers read 2508.02739

Don't have the latest CLI? Install it with: curl -LsSf https://hf.co/cli/install.sh | bash

Models citing this paper (33)

NeoQuasar/Kronos-base — Time Series Forecasting • Updated Sep 9, 2025 • 831k • 147
NeoQuasar/Kronos-Tokenizer-base — Time Series Forecasting • Updated Sep 9, 2025 • 2.6M • 51
NeoQuasar/Kronos-mini — Time Series Forecasting • Updated Sep 9, 2025 • 691k • 19
NeoQuasar/Kronos-small — Time Series Forecasting • Updated Sep 9, 2025 • 1.14M • 18

Browse 33 models citing this paper

Datasets citing this paper (0)

No datasets link this paper yet.

Cite arxiv.org/abs/2508.02739 in a dataset README.md to link it from this page.

Spaces citing this paper (42)

Collections including this paper (16)

Browse 16 collections that include this paper

Similar Articles

A decoder-only foundation model for time-series forecasting

Papers with Code Trending

This article presents TimesFM, a decoder-only time-series foundation model that achieves near-optimal zero-shot performance across diverse time-series datasets by adapting large language model techniques.

How finance teams use Codex

OpenAI Blog

OpenAI highlights how finance teams can leverage Codex to automate tasks like generating monthly business review narratives and cleaning up financial models, integrating with tools like Excel and Slack to improve efficiency.