Tag: #time-series · Cards List

NanoTDB – Golang Append-Only Time Series DB

Hacker News Top · 6h ago

NanoTDB is a small embedded append-only time series database for resource-constrained hosts, written in Go, with no runtime dependencies. It uses a WAL and partitioned data files, supports line protocol ingestion, and provides efficient time-range queries.
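The core access pattern NanoTDB describes (strictly ordered appends plus time-range reads) can be sketched in a few lines. This is a hypothetical illustration of the idea in Python, not NanoTDB's API, and it omits the WAL and partitioned data files:

```python
import bisect

class TinySeries:
    """Toy append-only series: points arrive in timestamp order,
    so a plain sorted list plus binary search gives O(log n) range queries."""

    def __init__(self):
        self._ts = []    # timestamps, monotonically increasing
        self._vals = []  # values, parallel to _ts

    def append(self, ts, value):
        if self._ts and ts < self._ts[-1]:
            raise ValueError("append-only: timestamps must not go backwards")
        self._ts.append(ts)
        self._vals.append(value)

    def range(self, start, end):
        """Return (ts, value) pairs with start <= ts < end."""
        lo = bisect.bisect_left(self._ts, start)
        hi = bisect.bisect_left(self._ts, end)
        return list(zip(self._ts[lo:hi], self._vals[lo:hi]))

s = TinySeries()
for t, v in [(1, 10.0), (2, 11.5), (5, 9.8), (9, 12.1)]:
    s.append(t, v)
print(s.range(2, 9))  # -> [(2, 11.5), (5, 9.8)]
```

Because writes never go backwards in time, no index maintenance is needed; a real engine layers a WAL and file partitioning on top of the same invariant.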

SurF: A Generative Model for Multivariate Irregular Time Series Forecasting

arXiv cs.LG · 13h ago

SurF is a generative model for multivariate irregularly sampled time series using the Time Rescaling Theorem to transform event sequences into i.i.d. exponential noise, achieving state-of-the-art results across multiple real-world benchmarks.
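The Time Rescaling Theorem the summary leans on is classical point-process theory: mapping event times through the compensator Λ(t) turns the process into a unit-rate Poisson process, so rescaled inter-arrival times are i.i.d. Exp(1). A quick numerical check for the constant-intensity case (SurF itself handles far more general, learned intensities):

```python
import random

random.seed(0)
rate = 2.5    # constant intensity lambda(t) = 2.5
n = 20000

# Simulate a homogeneous Poisson process via exponential inter-arrival times.
t, times = 0.0, []
for _ in range(n):
    t += random.expovariate(rate)
    times.append(t)

# Compensator Lambda(t) = rate * t; rescaled inter-arrivals should be Exp(1).
rescaled = [rate * times[0]] + [
    rate * (times[i] - times[i - 1]) for i in range(1, n)
]
mean = sum(rescaled) / n
print(round(mean, 3))  # close to 1.0, the Exp(1) mean
```

The same transform, run with a model's estimated intensity, is what lets a generative model treat irregular event data as i.i.d. exponential noise.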

Dywave: Event-Aligned Dynamic Tokenization for Heterogeneous IoT Sensing Signal

arXiv cs.LG · 13h ago

Dywave is a dynamic tokenization framework for IoT sensing signals that uses wavelet-based hierarchical decomposition to align tokens with semantic events, achieving up to 12% higher accuracy and 75% reduction in input token length on five real-world datasets.
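Wavelet-based hierarchical decomposition, the ingredient Dywave builds its tokens on, can be illustrated with a plain Haar transform: each level splits the signal into pairwise averages (coarse) and differences (detail). This sketch is the generic transform, not Dywave's event-aligned tokenizer:

```python
def haar_level(x):
    """One Haar level: pairwise averages (coarse) and differences (detail)."""
    approx = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x), 2)]
    return approx, detail

def haar_decompose(x, levels):
    """Hierarchical decomposition: repeatedly coarsen, keeping each detail band."""
    details = []
    for _ in range(levels):
        x, d = haar_level(x)
        details.append(d)
    return x, details

def haar_reconstruct(approx, details):
    """Invert the decomposition exactly: x0 = a + d, x1 = a - d per pair."""
    for d in reversed(details):
        x = []
        for a, dd in zip(approx, d):
            x += [a + dd, a - dd]
        approx = x
    return approx

signal = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 3.0]
coarse, details = haar_decompose(signal, 2)
print(coarse)  # -> [8.0, 5.5]: an 8-sample signal summarized in 2 coefficients
print(haar_reconstruct(coarse, details) == signal)  # -> True (lossless)
```

Token-length reduction comes from keeping coarse coefficients everywhere and spending detail coefficients only where the signal (semantically) changes.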

TabPFN-3: Technical Report

arXiv cs.LG · 13h ago

TabPFN-3 is a new foundation model for tabular data, pretrained on synthetic data, that scales to 1M training rows while reducing training and inference time, achieving state-of-the-art performance on tabular prediction, time series, and relational data.

Semantic Feature Segmentation for Interpretable Predictive Maintenance in Complex Systems

arXiv cs.AI · 13h ago

This paper proposes a semantic feature segmentation framework for predictive maintenance that decomposes monitoring signals into canonical and residual components to improve interpretability while maintaining predictive performance.
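The canonical/residual split described here can be mimicked with any smoother: fit a canonical component and keep the remainder as the residual. A minimal stand-in using a centered moving average (the paper's actual segmentation is more sophisticated; the moving-average choice is purely illustrative):

```python
def moving_average(x, w):
    """Centered moving average (window shrinks at the edges) as the canonical part."""
    half = w // 2
    out = []
    for i in range(len(x)):
        lo, hi = max(0, i - half), min(len(x), i + half + 1)
        out.append(sum(x[lo:hi]) / (hi - lo))
    return out

signal = [1.0, 2.0, 4.0, 3.0, 5.0, 7.0, 6.0, 8.0]
canonical = moving_average(signal, 3)
residual = [s - c for s, c in zip(signal, canonical)]

# The decomposition is exact by construction: canonical + residual == signal,
# so no monitoring information is lost, only reorganized for interpretability.
recon = [c + r for c, r in zip(canonical, residual)]
print(all(abs(a - b) < 1e-9 for a, b in zip(recon, signal)))  # -> True
```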

@ClementDelangue: Are scaling laws finally working for time series foundation models? Today, @datadoghq is releasing Toto 2.0 weights in …

X AI KOLs Following · 22h ago

Datadog releases Toto 2.0, a family of open-weights time series foundation models from 4M to 2.5B parameters, achieving state-of-the-art results on three benchmarks. The models demonstrate scaling laws for time series, improving predictably with parameter count.

FutureSim: Replaying World Events to Evaluate Adaptive Agents

Hugging Face Daily Papers · yesterday

FutureSim replays chronological world events to benchmark AI agents' long-term predictive abilities, finding that even the best agent achieves only 25% accuracy.

HEPA: A Self-Supervised Horizon-Conditioned Event Predictive Architecture for Time Series

arXiv cs.LG · 2d ago

This paper introduces HEPA, a self-supervised architecture for predicting rare critical events in time series using a Joint-Embedding Predictive Architecture (JEPA) pretraining strategy. It demonstrates superior performance across multiple domains with significantly less labeled data and fewer tuned parameters than leading models.

@Huanusa: This is absolutely mind-blowing! Someone actually built an AI that can directly read candlestick trading, and its performance is through the roof! It's called Kronos — the world's first open-source foundational large model designed specifically for financial markets! Trained from scratch on 12 billion real candlestick data points from 45 exchanges, not a repurposed general AI. It can: price prediction + volatility prediction and more.

X AI KOLs Timeline · 2d ago

Kronos is billed as the first open-source foundation model for financial markets, trained from scratch on 12 billion real candlestick data points from 45 exchanges. It supports price prediction and volatility forecasting, reportedly far outperforming general-purpose models, and is free and open-source.

C2L-Net: A Data-Driven Model for State-of-Charge Estimation of Lithium-Ion Batteries During Discharge

arXiv cs.AI · 3d ago

This paper introduces C2L-Net, a data-driven model for efficient and accurate state-of-charge estimation of lithium-ion batteries using short historical windows.

Temporal-Decay Shapley: A Time-Aware Data Valuation Framework for Time-Series Data

arXiv cs.LG · 3d ago

This paper proposes Temporal-Decay Shapley (TDS), a data valuation framework for time-series data that incorporates temporal decay and multi-scale fusion to address the time-varying nature of sample values, outperforming traditional methods in noise detection and data selection.
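The temporal-decay idea is simple to state: discount a sample's contribution by its age. A hedged sketch of exponential decay weighting applied to raw per-sample scores (the actual TDS framework computes Shapley values and fuses multiple time scales, which this stand-in does not):

```python
import math

def decay_weight(age, half_life):
    """Exponential decay: a sample's weight halves every `half_life` time units."""
    return math.exp(-math.log(2) * age / half_life)

# (score, age) pairs: identical raw valuation scores at increasing ages.
samples = [(0.9, 0.0), (0.9, 10.0), (0.9, 20.0)]
half_life = 10.0
weighted = [s * decay_weight(a, half_life) for s, a in samples]
print([round(w, 3) for w in weighted])  # -> [0.9, 0.45, 0.225]
```

The point of the weighting: two equally "useful" samples are no longer equally valuable once one of them is stale, which is what makes the valuation time-aware.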

TTCD: Transformer Integrated Temporal Causal Discovery from Non-Stationary Time Series Data

arXiv cs.LG · 3d ago

The paper introduces TTCD, a novel framework for temporal causal discovery from non-stationary time series data using transformer-based feature learning and reconstruction-guided signal distillation.

Detecting Time Series Anomalies Like an Expert: A Multi-Agent LLM Framework with Specialized Analyzers

arXiv cs.AI · 2026-05-08

The article introduces SAGE, a multi-agent LLM framework for time-series anomaly detection that uses specialized analyzers to improve interpretability and reliability. It demonstrates superior performance over baselines on three benchmarks and enhances diagnostic reporting through structured evidence consolidation.

MOSAIC: Module Discovery via Sparse Additive Identifiable Causal Learning for Scientific Time Series

arXiv cs.LG · 2026-05-08

This paper introduces MOSAIC, a method for module discovery in scientific time series that combines causal representation learning with sparse additive identifiable causal learning. It aims to recover interpretable latent variables and their associated observations without post-hoc alignment, validated on domains like molecular dynamics and climate data.

Online Localized Conformal Prediction

arXiv cs.LG · 2026-05-08

This paper proposes Online Localized Conformal Prediction (OLCP) to address covariate heterogeneity in online learning and time-series settings. It introduces OLCP-Hedge for bandwidth selection and demonstrates valid long-run coverage with narrower prediction sets compared to existing baselines.
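Localized conformal prediction weights calibration residuals by proximity to the test point before taking the quantile. An offline sketch with a Gaussian kernel and a fixed, hand-picked bandwidth; OLCP's online updates and OLCP-Hedge bandwidth selection are deliberately omitted:

```python
import math

def localized_interval(x_test, calib, alpha, h):
    """calib: list of (x_i, residual_i) pairs. Returns the half-width of a
    prediction interval: the weighted (1 - alpha) quantile of calibration
    residuals, with Gaussian kernel weights centered at x_test (bandwidth h)."""
    weights = [math.exp(-((x - x_test) / h) ** 2 / 2) for x, _ in calib]
    total = sum(weights)
    pairs = sorted((r, w / total) for (_, r), w in zip(calib, weights))
    acc = 0.0
    for r, w in pairs:  # walk residuals upward until 1 - alpha mass is covered
        acc += w
        if acc >= 1 - alpha:
            return r
    return pairs[-1][0]

# Residuals are small near x = 0 and large near x = 10, so the localized
# quantile adapts: a narrow interval at 0, a wide one at 10.
calib = [(0.0, 0.1), (0.5, 0.2), (1.0, 0.15),
         (9.0, 1.0), (9.5, 1.2), (10.0, 0.9)]
narrow = localized_interval(0.0, calib, alpha=0.2, h=1.0)
wide = localized_interval(10.0, calib, alpha=0.2, h=1.0)
print(narrow < wide)  # -> True: interval width tracks the local residual scale
```

A single global quantile would give both regions the same (too-wide or too-narrow) interval; localization is what handles the covariate heterogeneity the paper targets.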

@oragnes: Google quietly open-sourced the time-series forecasting base model TimesFM 2.5—params down to 200 M, context up to 16 k. Feed it raw history and get instant zero-shot forecasts; perfect for crypto predictions, fam 😂

X AI KOLs Timeline · 2026-04-20

Google open-sourced TimesFM 2.5, a 200 M-parameter, 16 k-context zero-shot time-series forecasting base model that works straight out of the box on historical data.

Back to Repair: A Minimal Denoising Network for Time Series Anomaly Detection

Hugging Face Daily Papers · 2026-04-19

This paper introduces JuRe (Just Repair), a minimal denoising network for time series anomaly detection that matches or exceeds complex neural baselines on the TSB-AD and UCR benchmarks, demonstrating that a proper manifold-projection training objective is more important than architectural complexity.
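The repair-then-score recipe behind denoising-based detectors is easy to demonstrate with any denoiser: repair the signal, then score each point by how much repair it needed. A sketch with a 3-point median filter standing in for JuRe's learned network:

```python
def median3(x):
    """Tiny stand-in denoiser: 3-point running median (edges passed through)."""
    out = [x[0]]
    for i in range(1, len(x) - 1):
        out.append(sorted(x[i - 1:i + 2])[1])
    out.append(x[-1])
    return out

signal = [1.0, 1.1, 0.9, 1.0, 9.0, 1.1, 1.0, 0.9]  # injected spike at index 4
repaired = median3(signal)

# Anomaly score = how far each point had to move to get back onto the
# "normal" manifold; the spike needs by far the biggest repair.
scores = [abs(s - r) for s, r in zip(signal, repaired)]
print(scores.index(max(scores)))  # -> 4, the injected anomaly
```

The paper's claim, in these terms, is that the quality of the repair objective matters more than how elaborate the network computing `repaired` is.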

Modeling Sparse and Bursty Vulnerability Sightings: Forecasting Under Data Constraints

Hugging Face Daily Papers · 2026-04-17

Academic study compares SARIMAX and Poisson regression for forecasting sparse, bursty vulnerability-sighting time-series, finding count-based models more stable.

A decoder-only foundation model for time-series forecasting

Papers with Code Trending · 2023-10-14

This article presents TimesFM, a decoder-only time-series foundation model that achieves near-optimal zero-shot performance across diverse time-series datasets by adapting large-language-model techniques.
