NanoTDB is a small embedded append-only time series database for resource-constrained hosts, written in Go, with no runtime dependencies. It uses a WAL and partitioned data files, supports line protocol ingestion, and provides efficient time-range queries.
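NanoTDB's own ingestion API isn't shown in the summary; the sketch below uses a hypothetical `parse_line` helper and assumes "line protocol" means the InfluxDB-style record format, purely to illustrate what such a database would ingest.

```python
# Minimal sketch of parsing one InfluxDB-style line protocol record:
#   measurement,tag1=v1,tag2=v2 field1=1.0,field2=2 1700000000000000000
# This is not NanoTDB's API; it only illustrates the wire format the
# summary refers to, and it assumes values contain no escaped spaces.

def parse_line(line: str):
    head, fields_part, ts = line.rsplit(" ", 2)
    measurement, *tag_pairs = head.split(",")
    tags = dict(p.split("=", 1) for p in tag_pairs)
    fields = {}
    for pair in fields_part.split(","):
        k, v = pair.split("=", 1)
        fields[k] = float(v)
    return {"measurement": measurement, "tags": tags,
            "fields": fields, "timestamp_ns": int(ts)}

print(parse_line("cpu,host=edge-01 usage=0.42 1700000000000000000"))
```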
SurF is a generative model for multivariate irregularly sampled time series using the Time Rescaling Theorem to transform event sequences into i.i.d. exponential noise, achieving state-of-the-art results across multiple real-world benchmarks.
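As a refresher on the transform the summary refers to: for a point process with intensity λ(t) and compensator Λ(t), the increments Λ(t_i) − Λ(t_{i−1}) between successive event times are i.i.d. Exponential(1). The NumPy sketch below checks this on a toy inhomogeneous Poisson process; it illustrates the theorem only, not SurF's model or code.

```python
import numpy as np

# Time-rescaling illustration (not SurF's model): rescaled inter-event gaps,
# measured through the compensator, should look like i.i.d. Exp(1) samples.
rng = np.random.default_rng(0)

def intensity(t):
    return 2.0 + np.sin(t)

def compensator(t):                      # integral of the intensity from 0 to t
    return 2.0 * t + (1.0 - np.cos(t))

# Simulate an inhomogeneous Poisson process on [0, 200] by thinning.
lam_max, horizon = 3.0, 200.0
cand = np.cumsum(rng.exponential(1.0 / lam_max, size=2000))
cand = cand[cand < horizon]
events = cand[rng.uniform(size=cand.size) < intensity(cand) / lam_max]

# Rescaled gaps: mean should be close to 1 if the intensity is correct.
gaps = np.diff(np.concatenate(([0.0], compensator(events))))
print(f"{events.size} events, mean rescaled gap = {gaps.mean():.3f} (expect ~1)")
```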
Dywave is a dynamic tokenization framework for IoT sensing signals that uses wavelet-based hierarchical decomposition to align tokens with semantic events, achieving up to 12% higher accuracy and a 75% reduction in input token length on five real-world datasets.
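Dywave's tokenizer is not reproduced here; the PyWavelets snippet below only illustrates the kind of multilevel wavelet decomposition the summary mentions, where detail coefficients localize a short transient that token boundaries could be aligned to (the signal and event are synthetic).

```python
import numpy as np
import pywt  # PyWavelets

# Hierarchical wavelet decomposition of a toy signal with one short event.
# Coarse approximation coefficients capture slow structure; detail
# coefficients at each level localize the transient.
t = np.linspace(0, 1, 1024)
signal = np.sin(2 * np.pi * 5 * t)
signal[512:520] += 3.0                         # a short "semantic event"

coeffs = pywt.wavedec(signal, "db4", level=4)  # [cA4, cD4, cD3, cD2, cD1]
for i, c in enumerate(coeffs):
    name = "cA4" if i == 0 else f"cD{len(coeffs) - i}"
    print(f"{name}: {len(c)} coeffs, max |c| = {np.abs(c).max():.2f}")
```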
TabPFN-3 is a new foundation model for tabular data, pretrained on synthetic data, that scales to 1M training rows while reducing training and inference time, achieving state-of-the-art performance on tabular prediction, time series, and relational data.
This paper proposes a semantic feature segmentation framework for predictive maintenance that decomposes monitoring signals into canonical and residual components to improve interpretability while maintaining predictive performance.
Datadog releases Toto 2.0, a family of open-weights time series foundation models from 4M to 2.5B parameters, achieving state-of-the-art results on three benchmarks. The models demonstrate scaling laws for time series, improving predictably with parameter count.
FutureSim replays chronological world events to benchmark AI agents' long-term predictive abilities, finding that even the best agent achieves only 25% accuracy.
This paper introduces HEPA, a self-supervised architecture for predicting rare critical events in time series using a Joint-Embedding Predictive Architecture (JEPA) pretraining strategy. It demonstrates superior performance across multiple domains with significantly less labeled data and fewer tuned parameters than leading models.
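HEPA's architecture is not specified in this summary; the PyTorch sketch below shows a generic JEPA-style pretraining step under my own assumptions (encoder sizes, pooling, and EMA momentum are placeholders): a predictor maps the context window's embedding to the target window's embedding produced by a momentum copy of the encoder, and the loss is computed in latent space rather than on raw values.

```python
import torch
import torch.nn as nn

# Generic JEPA-style step for time series, not HEPA's published architecture.
class Encoder(nn.Module):
    def __init__(self, in_dim=1, hidden=64, out_dim=32):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, hidden), nn.GELU(),
                                 nn.Linear(hidden, out_dim))

    def forward(self, x):                    # x: (batch, window, in_dim)
        return self.net(x).mean(dim=1)       # pool over time -> (batch, out_dim)

context_enc = Encoder()
target_enc = Encoder()
target_enc.load_state_dict(context_enc.state_dict())   # momentum copy
for p in target_enc.parameters():
    p.requires_grad_(False)
predictor = nn.Sequential(nn.Linear(32, 64), nn.GELU(), nn.Linear(64, 32))
opt = torch.optim.AdamW(list(context_enc.parameters()) +
                        list(predictor.parameters()), lr=1e-3)

context = torch.randn(8, 96, 1)   # visible window
target = torch.randn(8, 24, 1)    # masked window to predict in latent space

opt.zero_grad()
pred = predictor(context_enc(context))
with torch.no_grad():
    tgt = target_enc(target)
loss = nn.functional.mse_loss(pred, tgt)     # loss lives in embedding space
loss.backward()
opt.step()

# EMA update of the target encoder (momentum 0.99).
with torch.no_grad():
    for p_t, p_c in zip(target_enc.parameters(), context_enc.parameters()):
        p_t.mul_(0.99).add_(0.01 * p_c)
```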
Kronos is presented as the first open-source foundation model for financial markets, trained from scratch on 12 billion real candlestick data points. It supports price prediction and volatility forecasting, far outperforms general-purpose models, and is completely free and open source.
This paper introduces C2L-Net, a data-driven model for efficient and accurate state-of-charge estimation of lithium-ion batteries using short historical windows.
This paper proposes Temporal-Decay Shapley (TDS), a data valuation framework for time-series data that incorporates temporal decay and multi-scale fusion to address the time-varying nature of sample values, outperforming traditional methods in noise detection and data selection.
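As a toy illustration of the decay idea only (this is not the paper's TDS estimator): given per-sample value estimates, an exponential decay in the gap between a sample's timestamp and the evaluation time down-weights stale samples.

```python
import numpy as np

# Toy temporal decay over per-sample value estimates; the raw values stand in
# for whatever estimator (leave-one-out, Shapley, ...) produced them.
rng = np.random.default_rng(1)

timestamps = np.arange(100)                  # sample times, oldest to newest
raw_value = rng.normal(1.0, 0.3, size=100)   # stand-in per-sample values
eval_time, decay = 100, 0.05
weights = np.exp(-decay * (eval_time - timestamps))
decayed_value = raw_value * weights

top = np.argsort(decayed_value)[-5:][::-1]
print("highest-valued samples after temporal decay:", top)
```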
The paper introduces TTCD, a novel framework for temporal causal discovery from non-stationary time series data using transformer-based feature learning and reconstruction-guided signal distillation.
The article introduces SAGE, a multi-agent LLM framework for time-series anomaly detection that uses specialized analyzers to improve interpretability and reliability. It demonstrates superior performance over baselines on three benchmarks and enhances diagnostic reporting through structured evidence consolidation.
This paper introduces MOSAIC, a method for module discovery in scientific time series that combines causal representation learning with sparse additive identifiable causal learning. It aims to recover interpretable latent variables and their associated observations without post-hoc alignment, validated on domains like molecular dynamics and climate data.
This paper proposes Online Localized Conformal Prediction (OLCP) to address covariate heterogeneity in online learning and time-series settings. It introduces OLCP-Hedge for bandwidth selection and demonstrates valid long-run coverage with narrower prediction sets compared to existing baselines.
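The OLCP and OLCP-Hedge procedures are not reproduced here; the snippet below is a generic online conformal baseline of the kind such methods improve on: keep a rolling window of absolute residuals and report an interval whose half-width is their empirical (1 − α) quantile.

```python
import numpy as np

# Generic rolling-window online conformal baseline (not the paper's OLCP).
rng = np.random.default_rng(2)
alpha, window = 0.1, 200
residuals, covered = [], []

x_prev = 0.0
for t in range(2000):
    noise_scale = 1.0 + 0.5 * np.sin(t / 100)          # slowly drifting noise
    y = 0.9 * x_prev + rng.normal(scale=noise_scale)
    y_hat = 0.9 * x_prev                                # point forecast
    if len(residuals) >= 20:
        q = np.quantile(residuals[-window:], 1 - alpha)  # interval half-width
        covered.append(abs(y - y_hat) <= q)
    residuals.append(abs(y - y_hat))
    x_prev = y

print(f"empirical coverage: {np.mean(covered):.3f} (target {1 - alpha})")
```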
Google open-sourced TimesFM 2.5, a 200M-parameter, 16k-context zero-shot time-series forecasting base model that works straight out of the box on historical data.
This paper introduces JuRe (Just Repair), a minimal denoising network for time series anomaly detection that matches or exceeds complex neural baselines on the TSB-AD and UCR benchmarks, demonstrating that a proper manifold-projection training objective is more important than architectural complexity.
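JuRe's exact objective and network are not given in the summary; the sketch below only illustrates the general denoise-and-score recipe: train a small network to map noise-corrupted windows back onto clean ones, then score test windows by how far they must be "repaired".

```python
import torch
import torch.nn as nn

# Denoising-based anomaly scoring sketch (not JuRe's objective or network).
torch.manual_seed(0)
win = 32
clean = torch.sin(torch.linspace(0, 50, 4096)).unfold(0, win, 1)   # (N, win)

net = nn.Sequential(nn.Linear(win, 64), nn.ReLU(), nn.Linear(64, win))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(300):
    noisy = clean + 0.3 * torch.randn_like(clean)
    loss = nn.functional.mse_loss(net(noisy), clean)   # denoising objective
    opt.zero_grad()
    loss.backward()
    opt.step()

normal = clean[0]
anomaly = normal.clone()
anomaly[10:14] += 3.0                                  # injected spike
for name, w in [("normal", normal), ("anomalous", anomaly)]:
    score = nn.functional.mse_loss(net(w), w).item()   # "repair" distance
    print(f"{name} window score: {score:.4f}")
```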
An academic study compares SARIMAX and Poisson regression for forecasting sparse, bursty vulnerability-sighting time series, finding count-based models more stable.
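For context on what such a comparison looks like in practice (synthetic data and hypothetical model specifications, not the study's setup): fit both models with statsmodels on a sparse, bursty count series and compare in-sample fit.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Synthetic sparse, bursty count series: a low baseline rate with rare bursts.
rng = np.random.default_rng(3)
n = 300
t = np.arange(n)
rate = np.exp(-2.0 + 1.5 * (np.sin(2 * np.pi * t / 50) > 0.8))
y = rng.poisson(rate)

# Gaussian state-space model vs. a count-based Poisson regression.
sarimax_fit = SARIMAX(y, order=(1, 0, 1)).fit(disp=False)
X = sm.add_constant(np.column_stack([np.sin(2 * np.pi * t / 50),
                                     np.cos(2 * np.pi * t / 50)]))
poisson_fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()

for name, pred in [("SARIMAX", sarimax_fit.fittedvalues),
                   ("Poisson GLM", poisson_fit.fittedvalues)]:
    print(f"{name}: in-sample MAE = {np.mean(np.abs(y - pred)):.3f}")
```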
This article presents TimeFM (Time-Series Foundation Model), a decoder-only model that achieves near-optimal zero-shot performance across diverse time-series datasets by adapting large language model techniques.