TIME-MOE: Billion-Scale Time Series Foundation Model with Mixture-of-Experts | by Nikos Kafritsas | Oct, 2024

And open-source as well!

A high-level overview of Time-MOE (Image Source)

The Mixture-of-Experts (MOE) architecture has surged in popularity with the rise of large language models (LLMs).

As time-series models adopt cutting-edge techniques, Mixture-of-Experts has naturally found its place in the time-series foundation-model space.

This article discusses Time-MOE, a time-series foundation model that uses MOE to improve forecasting accuracy while reducing computational costs. Key contributions include:

  1. Time-300B Dataset: The largest open time-series dataset, with 300 billion time points across 9 domains, and a scalable data-cleaning pipeline.
  2. Scaling Laws for Time Series: Insights into how scaling laws affect large time-series models.
  3. Time-MOE Architecture: A family of open-source time-series models leveraging MOE to enhance performance (see the sketch after this list).
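To make the efficiency argument concrete, here is a minimal sketch of a sparse Mixture-of-Experts feed-forward layer in PyTorch. It is an illustration of the general technique, not the exact Time-MOE implementation: a router selects the top-k experts per token, so only a fraction of the layer's parameters are active in any forward pass.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    """Illustrative sparse MoE feed-forward block: top-k routing over experts."""

    def __init__(self, d_model: int, d_ff: int, num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, num_experts)  # gating network
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        gate_logits = self.router(x)                           # (batch, seq, num_experts)
        weights, indices = gate_logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)                   # mixing weights over chosen experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[..., k] == e                    # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[..., k][mask].unsqueeze(-1) * expert(x[mask])
        return out

# Only 2 of 8 experts run per token, so active compute per token stays small
# even as the total parameter count grows with the number of experts.
layer = SparseMoELayer(d_model=64, d_ff=256)
y = layer(torch.randn(4, 32, 64))  # (4, 32, 64)
```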

Let's get started!

Find the hands-on project for Time-MOE in the AI Projects folder, along with other cool projects!

Time-MOE is a 2.4B-parameter open-source time-series foundation model that uses Mixture-of-Experts (MOE) for zero-shot forecasting.
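Since the models are open-source, zero-shot forecasting can be tried with just a few lines of code. The sketch below is a hedged example, assuming a released Time-MOE checkpoint on the Hugging Face Hub (here "Maple728/TimeMoE-50M" is used as an example name) that loads through transformers' AutoModelForCausalLM with trust_remote_code=True; the exact checkpoint names and interface are those of the authors' release, so check the official repository.

```python
import torch
from transformers import AutoModelForCausalLM

# Assumed checkpoint name; substitute the one from the official Time-MOE release.
model = AutoModelForCausalLM.from_pretrained(
    "Maple728/TimeMoE-50M",
    trust_remote_code=True,
)

# Two example series of length 12, normalized per series before forecasting.
context = torch.randn(2, 12)
mean = context.mean(dim=-1, keepdim=True)
std = context.std(dim=-1, keepdim=True)
normed_context = (context - mean) / std

# Autoregressively generate the next 6 points, then undo the normalization.
prediction_length = 6
output = model.generate(normed_context, max_new_tokens=prediction_length)
forecast = output[:, -prediction_length:] * std + mean
print(forecast.shape)  # torch.Size([2, 6])
```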