Startup Dreamers
Innovation

IBM Improves Generative AI Forecasting Using Time, Not Just Attention

By admin | August 29, 2024 | 4 Mins Read

According to IBM, attention is not all you need when forecasting certain outcomes with generative AI. You also need time. Earlier this year, IBM released its open-source TinyTimeMixer (TTM) model on Hugging Face under an Apache license. Part of IBM's Granite family of foundation models, TTM is a lightweight pre-trained time series foundation model (TSFM) that uses a patch-mixer architecture to learn context and correlations across time and across multiple variables.

Unlike language and vision foundation models such as ChatGPT and Llama, where each word or token carries semantic meaning, TSFMs work with values grouped into local temporal patches (contiguous sets of points in time) and learn the temporal patterns within and across them. Language and vision models also derive their associations best when trained on a single context, such as a given language or topic whose grammatical structures and vernacular are common to the dataset. TSFMs, by contrast, can derive further context and associations by looking at long historical time windows and at correlations with other multivariate time series. That data can vary by industry, time resolution, sampling rate, numerical scale, and other characteristics typical of time series.
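The patching idea above can be sketched in a few lines: a series is cut into fixed-length windows that play the role tokens play in a language model. This is an illustrative sketch only; the patch length and stride here are arbitrary choices, not TTM's actual configuration.

```python
# Turn a univariate series into fixed-length temporal patches, the "tokens"
# a time series foundation model consumes. Patch length and stride are
# hypothetical values chosen for illustration.
def make_patches(series, patch_len, stride):
    """Split a 1-D series into patches of patch_len points, stepping by stride."""
    patches = []
    for start in range(0, len(series) - patch_len + 1, stride):
        patches.append(series[start:start + patch_len])
    return patches

hourly_demand = [210, 215, 230, 228, 240, 255, 260, 250]  # toy data
patches = make_patches(hourly_demand, patch_len=4, stride=4)
print(patches)  # two non-overlapping patches of four points each
```

With `stride` smaller than `patch_len`, the patches overlap, which lets the model see each point in several local contexts.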

One factor the different types of models share is the need for massive amounts of training data. Language and vision foundation models have essentially the entire internet at their disposal. TSFMs, however, require very specific time-stamped data that is typically not public; by some estimates, as much as 95% of this type of data remains proprietary. Fortunately, researchers from Monash University and the University of Sydney have compiled the Monash Time Series Forecasting Repository, which provides sufficient data across multiple domains and time resolutions to train TSFMs properly.

The ability of TSFMs to handle the multivariate nature of time series data is essential for capturing the context of what the data represents during the training window (for example, when analyzing stock prices, was there an earnings call or a critical announcement at an inflection point in the data?). To take full advantage of this, rather than using a transformer architecture like language models, IBM created a new architecture called the Time Series Mixer, or TS Mixer. According to IBM, the TS Mixer architecture reduced model size by a factor of 10 compared with transformer-based models while maintaining similar accuracy.
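The mixer idea behind that size reduction can be sketched conceptually: instead of attention, the model alternates linear "mixing" steps across the temporal-patch axis and across the variable axis. The toy code below shows only the data flow; real models learn these weight matrices and add nonlinearities and residual connections, and none of this is IBM's actual implementation.

```python
# Conceptual sketch of the MLP-mixer idea behind a patch-mixer architecture:
# alternate linear mixing across temporal patches and across variables.
def matmul(a, b):
    """Multiply two matrices represented as nested lists."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
            for row in a]

def mixer_block(x, w_time, w_feat):
    """x has shape (patches x variables); mix across each axis in turn."""
    x = matmul(w_time, x)  # time mixing: blend information across patches
    x = matmul(x, w_feat)  # feature mixing: blend information across variables
    return x

x = [[1.0, 2.0],  # three temporal patches, two variables each
     [3.0, 4.0],
     [5.0, 6.0]]
eye3 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
eye2 = [[1.0, 0.0], [0.0, 1.0]]
print(mixer_block(x, eye3, eye2))  # identity weights leave x unchanged
```

Because each mixing step is a plain matrix multiply over one axis, the parameter count grows with the patch and variable counts rather than quadratically with sequence length, which is the intuition behind the smaller model size.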

Since its release in April 2024, TTM has had over one million downloads from Hugging Face, which raises the question: what time series applications are developers using IBM's Granite TTM for? According to IBM, TTM is being used in a variety of value-added, multivariable use cases. One is forecasting flash storage device performance across more than 350 key performance indicators. Another is providing directional forecasts for stock movements using both temporal patterns and the impact of other variables. It has also been used to produce a 28-day sales forecast (demonstrated against the M5 retail dataset) for inventory and revenue planning, with the added ability to factor in sale events and other variables that affect retail sales. TTM is also used for forecasting-based optimization (model predictive control), such as building temperature control or complex manufacturing process modeling.
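For the directional stock-forecast use case above, a common and generic way to score such forecasts is directional accuracy: the fraction of steps where the forecast moves in the same direction as the actual series. The metric and data below are illustrative assumptions, not something the article specifies.

```python
# Directional accuracy: how often does the forecast move the same way
# (up or down) as the actual series from one step to the next?
def directional_accuracy(actual, predicted):
    """Fraction of steps where forecast and actual move in the same direction."""
    hits = 0
    steps = 0
    for i in range(1, len(actual)):
        actual_up = actual[i] >= actual[i - 1]
        pred_up = predicted[i] >= predicted[i - 1]
        hits += int(actual_up == pred_up)
        steps += 1
    return hits / steps

prices = [100, 102, 101, 105]    # toy actual closing prices
forecast = [100, 103, 100, 104]  # toy model forecast
print(directional_accuracy(prices, forecast))  # 1.0: every move matched
```

A directional score is often more meaningful than raw error for trading-style decisions, where the sign of the move matters more than its exact magnitude.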

As we continue to see, there is no one-size-fits-all AI solution, and as new AI technologies and models are introduced, selecting the best one for the application matters. Transformer-based large language models clearly deliver world-changing results when predicting outcomes based on language and vision. For forecasting time series-based outcomes, however, IBM has developed a new tool for our collective toolboxes. Its Granite TTM is not the only TSFM available, but given the innovations IBM has introduced and the model's open-source availability, it may be the one that drives TSFMs to the same scale of development and utility as their language-based counterparts.
