
timesfm

A pretrained foundation model for time-series forecasting, developed by Google Research.

Highlights:
- 200M parameters (down from 500M in v2.0)
- Context length up to 16k points
- Continuous quantile forecasts up to a 1k-step horizon via an optional 30M-parameter quantile head
- No frequency indicator required
- New forecasting flags for controlling inference
- PyTorch and JAX/Flax backends
- Checkpoints available on Hugging Face
- Integrated with BigQuery as an official Google product
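The whole flow fits in a few lines: pull a checkpoint from Hugging Face, compile with the new forecasting flags, and call forecast without any frequency indicator. A minimal sketch following the pattern in the project's README; the class and flag names below track the v2.5 release and may differ across timesfm versions:

```python
import numpy as np
import timesfm

# Load the 200M PyTorch checkpoint directly from Hugging Face.
model = timesfm.TimesFM_2p5_200M_torch.from_pretrained(
    "google/timesfm-2.5-200m-pytorch"
)

# The new forecasting flags live on a compile-time config. Note there is
# no frequency indicator anywhere in this API.
model.compile(
    timesfm.ForecastConfig(
        max_context=1024,
        max_horizon=256,
        normalize_inputs=True,
        use_continuous_quantile_head=True,  # the optional 30M quantile head
        fix_quantile_crossing=True,
    )
)

# Inputs are plain arrays, and they may have different lengths.
point_forecast, quantile_forecast = model.forecast(
    horizon=24,
    inputs=[
        np.sin(np.linspace(0, 20, 120)),
        np.linspace(0.0, 1.0, 80),
    ],
)
print(point_forecast.shape)     # expected: (2, 24)
print(quantile_forecast.shape)  # expected: (2, 24, 10), mean plus deciles
```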

Our Take

{"problem_it_solves": "Provides a pretrained model for time-series forecasting tasks, eliminating the need to train models from scratch for each forecasting use case.", "target_customer": "Data scientists, ML engineers, and developers working on time-series forecasting tasks.", "use_cases": ["Time-series forecasting", "Point forecasting", "Quantile forecasting", "Anomaly detection in time series", "Demand forecasting", "Financial forecasting"], "pricing_details": "Open source under Apache-2.0 license", "differentiator": "First pretrained time-series foundation model from Google Research, tested at ICML 2024, with support for up to 16k context length and built-in quantile forecasting head.", "traction": {"notable_metrics": "15.8k GitHub stars, 1.4k forks, 109 watchers, cited in paper at ICML 2024", "press_mentions": ["Paper: A decoder-only foundation model for time-series forecasting, ICML 2024", "TimesFM Hugging Face Collection", "Google Research blog post"]}}

Key Facts

Category
AI/ML - Foundation Model
Location
Mountain View, United States
Founded
1998
Team Size
Thousands (Alphabet)
Stage
Corporate (owned by Alphabet)
Pricing
Free (open source, Apache-2.0)
Discovered via
github-trending

Links

Paper: "A decoder-only foundation model for time-series forecasting" (ICML 2024)
TimesFM Hugging Face Collection
Google Research blog post
