Tiny-Time-Mixers R2 (TTM): More Accurate Predictions with Exogenous Feature Mixing - Complete Tutorial
IBM’s open-source foundation model for time-series just got even better!
Of all foundation time-series models, IBM’s TTM has a unique feature:
It is the first non-Transformer-based foundation time-series model.
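To make the contrast with attention concrete, here is a minimal, illustrative MLP-Mixer-style block in NumPy. This is a sketch of the general mixing idea, not TTM's actual implementation: a "time-mixing" MLP operates along the sequence axis and a "channel-mixing" MLP along the feature axis, replacing self-attention with plain linear layers. All shapes and the ReLU nonlinearity here are illustrative choices.

```python
import numpy as np

def mlp(x, w1, b1, w2, b2):
    # Two-layer MLP with a ReLU nonlinearity (illustrative choice).
    return np.maximum(x @ w1 + b1, 0) @ w2 + b2

rng = np.random.default_rng(0)
T, C, H = 8, 3, 16            # time steps, channels, hidden width

x = rng.standard_normal((T, C))

# Time mixing: MLP applied across the time axis, shared over channels.
w1, b1 = rng.standard_normal((T, H)), np.zeros(H)
w2, b2 = rng.standard_normal((H, T)), np.zeros(T)
x = x + mlp(x.T, w1, b1, w2, b2).T   # residual connection

# Channel (feature) mixing: MLP applied across channels, shared over time.
v1, c1 = rng.standard_normal((C, H)), np.zeros(H)
v2, c2 = rng.standard_normal((H, C)), np.zeros(C)
x = x + mlp(x, v1, c1, v2, c2)

print(x.shape)  # (8, 3)
```

Both mixing steps are simple matrix multiplications, which is why this family of models scales linearly with sequence length instead of quadratically as self-attention does.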
Tiny-Time-Mixers (TTM)[1] is a breakthrough in time-series forecasting, delivering high-accuracy predictions with zero-shot capability or minimal fine-tuning.
We previously explored its mechanics in depth here. Since then, the model has received major upgrades—including the R2.1 variant, which now handles daily and weekly seasonalities.
TTM is a lightweight, open-source model that continues to evolve. Its latest version (the R2 series) secured 1st place on the Gift-Eval Benchmark for point forecasting.
This article covers how Tiny-Time-Mixers works and how to leverage its advanced features, such as feature mixing and exogenous infusion, with a focus on the latest R2 version.
Let’s get started!
✅ Find the Tiny-Time-Mixers notebook in the AI Projects folder (Project 16)
🎁 Subscribe to AI Horizon Forecast by next week to receive a 20% discount. You'll gain access to all my AI Capstone Projects, including this interesting project on using Tiny-Time-Mixers!
Enter Tiny-Time-Mixers
The first key innovation of TTM is its lightweight, pretrained design, which eliminates the need for costly self-attention. Moreover, TTM offers: