Startup Dreamers
Innovation

Sparse Models, The Math, And A New Theory For Ground-Breaking AI

By admin | August 18, 2023 | 4 Mins Read

Video: This intriguing theory from a master of conceptual science might end up being crucial to new AI advances.

Get ready for a lot of math!

We have an intuitive understanding of a big need in artificial intelligence and machine learning: making sure that systems converge well, that data is oriented the right way, and that we can look under the hood to understand what these tools are doing.

A lot of us have already heard the term “curse of dimensionality,” but Tomaso Armando Poggio invokes this frightening trope with a good bit of mathematics attached. (Poggio is the Eugene McDermott professor in the Department of Brain and Cognitive Sciences, a researcher at the McGovern Institute for Brain Research, and a member of the MIT Computer Science and Artificial Intelligence Laboratory, CSAIL.)

In talking about the contributions of Alessandro Volta in 1800 and his development of the first battery, Poggio draws an analogy to current technology and the frontier we are facing now.

We need, he says, a theory of machine learning to provide, in his words, “deep explainability,” and to enable other kinds of fundamental advancement.

“That’s the root of a lot of the problems,” Poggio says. “(A lack of) explainability: not knowing exactly the properties and limitations of those systems … and we need a theory because we need better systems.”

He also suggests we can find principles that human intelligence has in common with large language models, and use those for deeper exploration.

(Watch Poggio’s description of a process where someone can use a “powerful estimator” and parametric analysis to approximate an unknown function, and then, in principle, find the relevant parameters by optimizing the fit between different components, and how this process relates to thinking, in a broader way, about the use of an implicit function from input/output data.)
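The estimation process described there can be sketched in miniature. The following is an illustrative example of my own (not code from the talk): a tiny parametric family, a line y = a*x + b, fit to input/output samples of an "unknown" function by gradient descent on the mean squared error.

```python
# Illustrative sketch (mine, not from the talk): approximate an "unknown"
# function from input/output samples by picking a parametric family
# (here a line, y = a*x + b) and optimizing the fit by gradient descent
# on the mean squared error.

def fit_line(xs, ys, steps=5000, lr=0.01):
    """Return parameters (a, b) minimizing mean squared error on the data."""
    a, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # Gradients of the mean squared error with respect to a and b.
        grad_a = sum(2 * (a * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (a * x + b - y) for x, y in zip(xs, ys)) / n
        a -= lr * grad_a
        b -= lr * grad_b
    return a, b

# The "unknown" function is y = 3x + 1; the fitter only sees samples of it.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [3 * x + 1 for x in xs]
a, b = fit_line(xs, ys)
print(round(a, 2), round(b, 2))  # close to 3.0 and 1.0
```

The same recipe, with a bigger parametric family and a more careful optimizer, is the "powerful estimator" pattern in general.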

Later, in assessing an image whose parameters number no less than 10 to the power of 1,000, Poggio compares that number to the number of protons in the entire universe: 10 to the power of 80.

“This (dimensional volume) is a real curse,” he says.
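To see why this counts as a curse, consider the arithmetic of dense sampling (the setup is an illustrative assumption of mine; only the 10^1,000 and 10^80 figures come from the talk): a grid with k points per axis needs k^d points in d dimensions, which is exponential in the dimension.

```python
# Back-of-the-envelope arithmetic (illustrative, apart from the 10**1000
# and 10**80 figures quoted above): densely sampling a function on a grid
# takes k points per axis, so k**d points in d dimensions.

def grid_points(points_per_axis, dims):
    """Samples needed for a dense grid over `dims` dimensions."""
    return points_per_axis ** dims

print(grid_points(10, 3))                # 1000 points: easy
print(len(str(grid_points(10, 1000))))   # a 1001-digit number of points
print(grid_points(10, 1000) > 10 ** 80)  # far beyond the protons in the universe
```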

In describing the curse of dimensionality as it affects new systems, Poggio talks about the example of working with a “well-known and classical function,” and also describes the nature of a compositional function that would help with these sorts of problems.

Breaking down binary trees into collections of variables, he talks about dimensionality and the principle of sparse connectivity, again, with a detailed description that you’ll want to listen to, maybe more than once.
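The binary-tree idea can be sketched concretely. This is a hypothetical example of mine, not code from the talk: a function of eight variables built as a tree of two-variable pieces, so that no constituent ever sees more than two inputs.

```python
# Hypothetical example (not code from the talk): a compositionally sparse
# function of 8 variables built as a binary tree of 2-variable pieces.
# No constituent ever depends on more than 2 inputs -- the structural
# property that lets a deep network with matching sparse connectivity
# approximate it without the curse of dimensionality.

def h(a, b):
    # A simple 2-variable constituent; any smooth binary function works here.
    return a * b + 1.0

def compositional(x):
    """f(x1..x8) as a depth-3 binary tree of 2-variable functions."""
    layer = list(x)
    while len(layer) > 1:
        # Combine neighbours pairwise, halving the layer each time.
        layer = [h(layer[i], layer[i + 1]) for i in range(0, len(layer), 2)]
    return layer[0]

print(compositional([1.0] * 8))  # → 26.0
```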

“(This approach) will avoid the curse of dimensionality when approximation is done by a deep network with the same compositional structure, that same sparse connectivity at different layers. … the question was, then, are compositionally sparse functions very rare, something that happens, perhaps, with images? … this would explain why convolutional networks are good, and dense networks are bad.”
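A back-of-the-envelope weight count may help with the convolutional-versus-dense contrast (the numbers here are illustrative assumptions of mine, not from the talk): in a single layer mapping n inputs to n outputs, dense connectivity costs n² weights while sparse, local connectivity costs only n·k.

```python
# Back-of-the-envelope weight counts (illustrative assumptions, not from
# the talk): one layer mapping n inputs to n outputs.

def dense_params(n):
    # Dense layer: every output connects to every input.
    return n * n

def sparse_params(n, k):
    # Sparse layer: each output connects to only k inputs (think of a
    # size-k convolution window, with untied weights for simplicity).
    return n * k

n = 10_000
print(dense_params(n))      # 100,000,000 weights
print(sparse_params(n, 3))  # 30,000 weights
```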

Not to add more technicality, but the following statement by Poggio seems to sum up this part of his theory:

“It turns out (that) every practical function, every function that is Turing computable in polynomial, (or) non-exponential, time, is compositionally sparse, and can be approximated without curse of dimensionality by a deep network with the appropriate sparse connectivity at each layer.”

Watch this sentence closely.

Generally, using the example of a convolutional network, Poggio talks about how sparsity could help us to uncover key improvements in AI/ML systems. He explains what he calls a “conjecture” on sparse models this way:

“This may be what transformers can do, for at least a subset of functions: to find that sparse composition at each level of the hierarchy. And this is done by self-attention, which selects a small number, a sparse number of tokens, at each layer in the network.”
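That conjecture can be caricatured in code. The sketch below is a simplification of my own, not the actual transformer kernel: a toy dot-product attention step that keeps only the top-k scores per query and masks out the rest, "selecting a sparse number of tokens" in Poggio's phrase.

```python
# Speculative sketch (a simplification of mine, not the actual transformer
# kernel): a toy attention step that keeps only the top-k dot-product
# scores per query and masks out all other tokens.
import math

def topk_attention(query, keys, values, k=2):
    """Attend to only the k best-matching tokens."""
    scores = [sum(q * c for q, c in zip(query, key)) for key in keys]
    # Indices of the k largest scores; everything else is masked out.
    kept = set(sorted(range(len(scores)), key=lambda i: scores[i])[-k:])
    exp = [math.exp(s) if i in kept else 0.0 for i, s in enumerate(scores)]
    total = sum(exp)
    weights = [e / total for e in exp]
    dim = len(values[0])
    return [sum(w * v[j] for w, v in zip(weights, values)) for j in range(dim)]

keys = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.0, 0.0]]
values = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5], [9.0, 9.0]]
out = topk_attention([1.0, 1.0], keys, values, k=2)
print([round(x, 2) for x in out])  # the masked token's value [9, 9] has no influence
```

Only the two best-matching keys contribute to the output; the worst-matching token is cut out entirely, which is the "sparse selection" intuition in miniature.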

This is, to put it mildly, very interesting for engineers who are trying to break through the current limitations of what we can do with AI and ML. A lot of it, to be sure, has to do with black box models, and dimensionality, and fitting.

Take a look and see what you think of this approach. Poggio concludes with a summary:

“I think we need a theory-first approach to AI. This will provide true explainability, will allow us to improve on the systems … which we don’t understand why they work, which is kind of very ironic. And perhaps beyond that, to really discover principles of intelligence that apply also to our brain(s). … any testing conjecture to be explored (involves the idea that) what that (model) may be doing is really: to find at least for a subset of interesting function(s), the sparse variables that are needed at each layer in a network.”

Read the full article here




© 2026 Startup Dreamers. All Rights Reserved.