Squeezing Time: How Dynamic Tokenization Could Reshape Time‑Series Foundation Models
Opening — Why this matters now

Foundation models have escaped the confines of language and images. Time‑series data — from electricity demand to financial markets — is the next frontier. And yet the architectures that dominate AI today were never designed for thousands of sequential measurements. Transformers, for instance, scale poorly with long sequences: self‑attention's cost grows quadratically with input length. Feed them enough historical context and they become computationally expensive — almost theatrically so. ...
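The quadratic blow‑up is easy to make concrete with a toy calculation. The sketch below is illustrative only (the function name is mine, not from any library): self‑attention builds an n × n score matrix, so doubling the context quadruples that matrix's size.

```python
def attention_score_entries(seq_len: int) -> int:
    """Entries in the n x n self-attention score matrix for a sequence of length n."""
    return seq_len * seq_len

# Each doubling of context length quadruples the score matrix.
for n in (512, 1024, 2048, 4096):
    print(f"context {n:>5}: {attention_score_entries(n):>10,} score entries")
```

For a 4,096‑step history the score matrix alone has over 16 million entries per head, which is why reducing the number of tokens fed to the model matters so much.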