Startup Dreamers
Innovation

IBM Improves Generative AI Forecasting Using Time, Not Just Attention

By admin · August 29, 2024 · 4 Mins Read

According to IBM, attention is not all you need when forecasting certain outcomes with generative AI. You also need time. Earlier this year, IBM made its open-source TinyTimeMixer (TTM) model available on Hugging Face under an Apache license. Based on IBM's Granite foundation model, TTM is a lightweight pre-trained time series foundation model (TSFM) that uses a patch-mixer architecture to learn context and correlations across time and across multiple variables.

Unlike language and vision foundation models such as ChatGPT and Llama, where each word or token carries semantic meaning, TSFMs work with values grouped into local temporal patches: contiguous sets of points in time from which the model learns temporal patterns. The two families also differ in how they build context. Language and vision models derive associations best when trained on a single context, such as a given language or topic where grammatical structures or vernacular are common across the dataset. TSFMs, by contrast, can derive further context and associations by looking at long historical time windows and at correlations with other multivariate time series. That data can vary by industry, time resolution, sampling rate, numerical scale, and other characteristics typical of time series data.
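The patch idea can be sketched in a few lines of NumPy: instead of feeding the model one value per time step, a (time, channels) series is cut into contiguous temporal patches, and each patch becomes one input token. The context length of 512 and patch length of 64 below are illustrative choices, not TTM's actual hyperparameters.

```python
import numpy as np

def to_patches(series: np.ndarray, patch_len: int) -> np.ndarray:
    """Cut a (time, channels) series into contiguous temporal patches.

    Returns an array of shape (num_patches, patch_len, channels):
    each patch, not each time step, becomes one model input token.
    """
    t, c = series.shape
    num_patches = t // patch_len              # drop any trailing remainder
    trimmed = series[: num_patches * patch_len]
    return trimmed.reshape(num_patches, patch_len, c)

# Example: 512 time steps of 3 correlated variables, patches of 64 points.
context = np.random.randn(512, 3)
patches = to_patches(context, patch_len=64)
print(patches.shape)  # (8, 64, 3)
```

Because each token now summarizes a window of points rather than a single value, the model sees far fewer tokens for the same history, which is part of what keeps patch-based TSFMs lightweight.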

One factor the different types of models share is the need for massive amounts of training data. Language and vision foundation models have essentially the entirety of the internet at their disposal. TSFMs, however, require very specific time-stamped data that is typically not publicly available; by some estimates, as much as 95% of this data remains proprietary. Fortunately, researchers from Monash University and the University of Sydney have compiled the Monash Time Series Forecasting Repository, which provides sufficient data across multiple domains and time units to train TSFMs properly.
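Whatever the data source, training a forecasting model means turning raw time-stamped records into supervised (context, horizon) pairs. A minimal sliding-window sketch, with window lengths chosen purely for illustration:

```python
import numpy as np

def make_windows(series: np.ndarray, context_len: int, horizon: int):
    """Slice a (time, channels) array into (context, target) training pairs.

    Each sample pairs `context_len` past steps with the `horizon`
    steps that immediately follow, stepping forward one point at a time.
    """
    n = series.shape[0] - context_len - horizon + 1
    contexts = np.stack([series[i : i + context_len] for i in range(n)])
    targets = np.stack(
        [series[i + context_len : i + context_len + horizon] for i in range(n)]
    )
    return contexts, targets

data = np.random.randn(1000, 2)        # e.g. hourly readings from 2 sensors
X, y = make_windows(data, context_len=96, horizon=24)
print(X.shape, y.shape)  # (881, 96, 2) (881, 24, 2)
```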

The ability of TSFMs to handle the multivariate nature of time series data is essential for accounting for the context of what the data represents during the training window (for example, when analyzing stock prices, was there an earnings call or a critical announcement that created an inflection point in the data?). To take full advantage of this, rather than using a transformer architecture like language models, IBM created a new architecture called the Time Series Mixer, or TSMixer. According to IBM, implementing the TSMixer architecture reduced model size by a factor of 10 compared to transformer-based models while maintaining similar accuracy.
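The intuition behind the size reduction can be shown with a toy mixing block: one small linear map blends information across time patches, another blends it across channels, so the per-block parameter count grows with the patch count and feature width separately rather than through the four dense projections of a self-attention layer. This is a deliberately simplified sketch (no nonlinearities, normalization, or IBM's actual layer sizes), not the real TSMixer implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def mixer_block(x, w_time, w_chan):
    """One simplified mixing block over x of shape (num_patches, d_model).

    Time mixing applies a linear map along the patch axis (every feature
    sees every time patch); channel mixing applies one along the feature
    axis. Real mixer blocks add activations, norms, and MLPs omitted here.
    """
    x = x + (w_time @ x)        # mix information across time patches
    x = x + (x @ w_chan)        # mix information across channels/features
    return x

num_patches, d_model = 8, 16
x = rng.standard_normal((num_patches, d_model))
w_time = rng.standard_normal((num_patches, num_patches)) * 0.01
w_chan = rng.standard_normal((d_model, d_model)) * 0.01
out = mixer_block(x, w_time, w_chan)
print(out.shape)  # (8, 16)

# Per-block weights: n*n + d*d for the mixer, versus roughly 4*d*d for
# one self-attention layer's Q/K/V/output projections at the same width.
mixer_params = num_patches**2 + d_model**2
attn_params = 4 * d_model**2
print(mixer_params, attn_params)  # 320 1024
```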

Since its release in April 2024, TTM has had over one million downloads from Hugging Face, which raises the question: what time series applications are developers using IBM's Granite TTM for? According to IBM, TTM is being used for a variety of value-added, multi-variable use cases. One is forecasting flash storage device performance across more than 350 key performance indicators. Another is providing directional forecasts of stock movements using both temporal patterns and the impact of other variables. TTM has also been used to produce a 28-day sales forecast (demonstrated against the M5 retail data set) for inventory and revenue planning, with the added ability to factor in sale events and other variables that affect retail sales. Finally, TTM is used for forecasting-based optimization (model predictive control), such as building temperature control and complex manufacturing process modeling.
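The retail example, factoring sale events into a sales forecast, comes down to supplying known exogenous variables alongside the history. A deliberately simple least-squares sketch on synthetic data (not TTM itself) showing how an event flag's effect can be learned:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic daily sales: weekly seasonality plus a lift on promo days.
days = np.arange(365)
promo = (days % 14 == 0).astype(float)          # a promo every 14 days
sales = (100 + 10 * np.sin(2 * np.pi * days / 7)
         + 30 * promo + rng.normal(0, 2, size=365))

# Fit a linear model on [sin, cos, promo flag, intercept]. The promo
# flag is the exogenous variable; its coefficient is the event's effect,
# which a forecaster can then apply to known future promo dates.
X = np.column_stack([
    np.sin(2 * np.pi * days / 7),
    np.cos(2 * np.pi * days / 7),
    promo,
    np.ones_like(days, dtype=float),
])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)
print(round(coef[2], 1))  # estimated promo lift, close to the true 30
```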

As we continue to see, there is no one-size-fits-all AI solution. As new AI technologies and models are introduced, selecting the best solution for the application matters. Transformer-based large language models clearly deliver world-changing results when predicting outcomes based on language and vision. For forecasting time series outcomes, however, IBM has developed a new tool for our collective toolboxes. Its Granite TTM is not the only TSFM available, but given the innovations IBM has introduced and the model's open-source availability, it may be the one that drives TSFMs to the same scale of development and utility as their language-based counterparts.



© 2026 Startup Dreamers. All Rights Reserved.