Innovation

IBM Improves Generative AI Forecasting Using Time, Not Just Attention

By admin | August 29, 2024 | 4 min read

According to IBM, attention is not all you need when forecasting certain outcomes with generative AI. You also need time. Earlier this year, IBM made its open-source TinyTimeMixer (TTM) model available on Hugging Face under an Apache license. Based on IBM's Granite foundation model, TTM is a lightweight pre-trained time series foundation model (TSFM) that uses a patch-mixer architecture to learn context and correlations across time and across multiple variables.
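
For readers who want to try the published checkpoint, the sketch below shows roughly what a zero-shot forecast call could look like. It assumes the tsfm_public package from IBM's open-source granite-tsfm repository and the ibm-granite/granite-timeseries-ttm-r1 model id on Hugging Face; the package name, model id, argument names and output fields are assumptions on my part and may differ from the exact release described here.

```python
# Hedged sketch: load a pretrained TinyTimeMixer checkpoint and run a zero-shot forecast.
# Assumes the tsfm_public package (from IBM's granite-tsfm repo) and the
# ibm-granite/granite-timeseries-ttm-r1 checkpoint name; both are assumptions.
import torch
from tsfm_public import TinyTimeMixerForPrediction  # top-level export assumed

model = TinyTimeMixerForPrediction.from_pretrained(
    "ibm-granite/granite-timeseries-ttm-r1"  # Hugging Face model id assumed
)
model.eval()

# TTM variants are trained for fixed context/forecast lengths (e.g. 512 -> 96).
context_length = 512
num_channels = 3  # a multivariate series with 3 variables
history = torch.randn(1, context_length, num_channels)  # (batch, time, channels)

with torch.no_grad():
    out = model(past_values=history)  # keyword name assumed from the HF-style API

# Output attribute name assumed; expected shape (1, forecast_length, num_channels).
print(out.prediction_outputs.shape)
```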

Unlike language- and vision-based foundation models such as ChatGPT and Llama, where each word or token carries semantic meaning, TSFMs work with values grouped into local temporal patches (contiguous sets of points in time) to learn temporal patterns. In addition, while language and vision foundation models derive associations best when trained on a single context, such as a given language or topic where the grammatical structures or vernacular are common to the dataset, TSFMs can derive further context and associations by looking at long historical time windows and at correlations with other multivariate time series. This data can vary by industry, time resolution, sampling rate, numerical scale and other characteristics typical of time series data.
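
To make the idea of a temporal patch concrete, here is a small standalone illustration (not IBM's implementation) of slicing a multivariate series into contiguous windows, which play the role that word tokens play in a language model.

```python
# Illustration only: turn a multivariate series into contiguous temporal patches,
# the "tokens" a patch-mixer model operates on. Not IBM's code.
import numpy as np

def make_patches(series: np.ndarray, patch_len: int, stride: int) -> np.ndarray:
    """series: (time, channels) -> patches: (num_patches, patch_len, channels)."""
    time_steps = series.shape[0]
    starts = range(0, time_steps - patch_len + 1, stride)
    return np.stack([series[s:s + patch_len] for s in starts])

# 512 observations of 3 variables, split into non-overlapping patches of 16 points.
series = np.random.randn(512, 3)
patches = make_patches(series, patch_len=16, stride=16)
print(patches.shape)  # (32, 16, 3): 32 patches, each a contiguous window of 16 points
```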

One factor the different types of models share is the need for massive amounts of training data. Language and vision foundation models have essentially the entirety of the internet at their disposal. TSFMs, however, require very specific time-stamped data that is typically not publicly available; some estimates suggest that as much as 95% of this kind of data remains proprietary. Fortunately, researchers from Monash University and the University of Sydney have compiled the Monash Time Series Forecasting Repository, which provides sufficient data across multiple domains and time resolutions to train TSFMs properly.
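
As a hedged sketch of getting at that data, the snippet below pulls one collection from the community mirror of the Monash repository on the Hugging Face Hub. The monash_tsf dataset id, the tourism_monthly config and the record field names are assumptions, and the canonical source remains forecastingdata.org.

```python
# Hedged sketch: fetch one Monash repository collection via the Hugging Face
# `datasets` library. The "monash_tsf" id and "tourism_monthly" config refer to a
# community mirror and are assumptions; the script-based loader may require
# trust_remote_code (or may be unsupported) depending on your datasets version.
from datasets import load_dataset

ds = load_dataset("monash_tsf", "tourism_monthly", trust_remote_code=True)
sample = ds["train"][0]
print(sample.keys())           # field names assumed; typically a start timestamp and a "target" series
print(len(sample["target"]))   # length of one series in the collection
```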

The ability of TSFMs to handle the multivariate nature of time series data is essential for accounting for the context of what the data represents during the training window (e.g., when analyzing stock prices, was there an earnings call or a critical announcement that created an inflection point in the data?). To take full advantage of this, rather than using a transformer architecture as language models do, IBM created a new architecture called Time Series Mixer, or TSMixer. According to IBM, the TSMixer architecture reduced model size by a factor of 10 compared with transformer-based models while maintaining similar accuracy.
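
One intuition for why a mixer can be an order of magnitude smaller than a transformer is that it replaces self-attention over patches with two small MLPs, one mixing information across the patch (time) axis and one across the feature axis. The block below is a generic MLP-mixer-style sketch in PyTorch for illustration; it is not IBM's TSMixer code.

```python
# Generic mixer block (illustrative, not IBM's TSMixer): two small MLPs replace
# self-attention, one mixing across patches (time) and one across features.
import torch
import torch.nn as nn

class MixerBlock(nn.Module):
    def __init__(self, num_patches: int, d_model: int, hidden: int = 64):
        super().__init__()
        self.norm1 = nn.LayerNorm(d_model)
        # Mixes information across the patch (time) axis.
        self.time_mlp = nn.Sequential(
            nn.Linear(num_patches, hidden), nn.GELU(), nn.Linear(hidden, num_patches)
        )
        self.norm2 = nn.LayerNorm(d_model)
        # Mixes information across the feature axis within each patch.
        self.feature_mlp = nn.Sequential(
            nn.Linear(d_model, hidden), nn.GELU(), nn.Linear(hidden, d_model)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_patches, d_model)
        y = self.norm1(x).transpose(1, 2)          # (batch, d_model, num_patches)
        x = x + self.time_mlp(y).transpose(1, 2)   # residual mixing across time
        x = x + self.feature_mlp(self.norm2(x))    # residual mixing across features
        return x

block = MixerBlock(num_patches=32, d_model=16)
print(block(torch.randn(1, 32, 16)).shape)  # torch.Size([1, 32, 16])
```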

Since its release in April 2024, TTM has been downloaded from Hugging Face more than one million times, which raises the question: what time series applications are developers using IBM's Granite TTM for? According to IBM, TTM is being used for a variety of value-added, multi-variable use cases. One is forecasting flash storage device performance across more than 350 key performance indicators. Another is providing directional forecasts for stock movements using both temporal patterns and the impact of other variables. It has also been used to produce a 28-day sales forecast (demonstrated against the M5 retail dataset) for inventory and revenue planning, with the added ability to factor in the effects of sale events and other variables that influence retail sales. TTM is likewise used for forecasting-based optimization (model predictive control), such as building temperature control or modeling complex manufacturing processes.
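
The model-predictive-control use case boils down to a receding-horizon loop: forecast the next window, pick a control action against that forecast, apply it, and repeat. The toy loop below illustrates the pattern with a stand-in persistence forecaster; it is not a real building controller, and in practice the forecaster would be a TSFM call.

```python
# Illustrative receding-horizon loop (toy example, not a real controller): a
# forecaster's multi-step prediction drives a control decision, then the horizon rolls.
import numpy as np

def forecast_temperature(history: np.ndarray, horizon: int) -> np.ndarray:
    # Stand-in for a TSFM call: persistence forecast (repeat the last value).
    return np.full(horizon, history[-1])

def choose_heating(predicted: np.ndarray, setpoint: float = 21.0) -> float:
    # Heat proportionally to the worst predicted shortfall over the horizon.
    return max(0.0, setpoint - predicted.min())

temps = list(20.0 + np.random.randn(48) * 0.3)   # 48 past readings
for step in range(4):                            # four control intervals
    pred = forecast_temperature(np.array(temps), horizon=12)
    u = choose_heating(pred)
    temps.append(temps[-1] + 0.1 * u - 0.05)     # toy plant: heating minus losses
    print(f"step {step}: heat input {u:.2f}, new temp {temps[-1]:.2f}")
```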

As we continue to see, there is no one-size-fits-all AI solution. As new AI technologies and models are introduced, selecting the best one for the application is important. Transformer-based large language models clearly deliver world-changing results when predicting outcomes based on language and vision. For forecasting time series-based outcomes, however, IBM has developed a new tool to put in our collective toolboxes. Its Granite TTM is not the only TSFM available, but hopefully, given the innovations IBM has introduced and the model's open-source availability, it will be the one that helps drive TSFMs to the same scale of development and utility as their language-based counterparts.

Read the full article here
