Innovation

Getting Serious About Handling Sparse Models! Re-Examining Dense Tensors In The AI Age

By admin · September 12, 2023 · 5 Mins Read

What can you do about data sparsity? What do you do when you have a matrix with a bunch of zeros in it, and you can’t get a good look at a complex system because so many of the nodes are empty?

This can be a major issue. If you’re wondering what the pros would do, you may be interested in solutions to sparse data problems. If you read the journals or follow tech media, you’ve probably seen the term pop up. But what does it mean? Digging through your lexicon of terms, looking at classifier charts, ruminating on the Boltzmann machine – all of that can help, but hearing from people in the field is a more direct way to connect with what’s on the front burner right now.

For some thoughts on this, Saman Amarasinghe goes all the way back to the 1950s to talk about FORTRAN and how it was designed.

“One thing interesting about Fortran was it had only one data structure; the data structure was tensors,” he says.

As he moves on, however, Amarasinghe suggests that new AI systems have to treat sparsity as a fundamental issue, and that dense tensors present a challenge there.

“The world is not dense,” he says, pointing to phenomena like replication and symmetry that can help us to conceive of data models differently.

Sparsity, by the way, refers to situations where there isn’t enough data, or where too many of the data points have a zero or null value.

The latter is often referred to as ‘controlled sparsity’.

Amarasinghe suggests that we can deal with these kinds of sparsity with new approaches that expand on what dense tensors have done for the past half-century.

Dense tensors, he notes, are flexible, but they waste memory.

The solution? Compress the data set, drop the zeros, and keep metadata that records where the remaining values sit.

“The problem is: I have all of these zeros I’m storing,” Amarasinghe says. “So what I want to do is: instead of doing that, how do I compress this thing, (and I) don’t store the zeros. But now how do you figure out what that value is? Because we don’t know. … we need to keep additional data called metadata. And the metadata will say, for that value, what the row and column number is: this is called the coordinate format.”
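
Here is a minimal sketch of that coordinate format in Python, using SciPy’s COO representation as one convenient stand-in (the matrix values below are made up for illustration): only the nonzero values are stored, along with row and column metadata for each one.

import numpy as np
from scipy import sparse

# A small matrix that is mostly zeros (values are illustrative).
dense = np.array([
    [0.0, 0.0, 3.0, 0.0],
    [0.0, 0.0, 0.0, 0.0],
    [5.0, 0.0, 0.0, 0.0],
    [0.0, 7.0, 0.0, 0.0],
])

# Coordinate (COO) format: keep only the nonzero values, plus metadata
# recording each stored value's row and column number.
coo = sparse.coo_matrix(dense)
print(coo.data)   # [3. 5. 7.]  <- the stored values
print(coo.row)    # [0 2 3]     <- row metadata
print(coo.col)    # [2 0 1]     <- column metadata

# The dense layout holds all 16 entries; COO holds 3 values plus
# 3 (row, column) pairs, and the gap widens as sparsity grows.
print(dense.size, coo.nnz)      # 16 3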

Amarasinghe shows a series of slides with the kind of code you would need to produce a multi-tensor result.

“This is hard,” he concludes, while also providing some caveats that may be handy in tomorrow’s engineering world.

Ignoring sparsity, he contends, is throwing away performance. Amarasinghe explains how the efficiencies work:

“At some point, you get better and better performance,” he says. “And if there’s a lot of sparsity, (you get) a huge amount of performance. Why? Because if you multiply by zero, you don’t have to do anything. You don’t even have to fetch items. So normally, you keep zeros multiplied (and) if you add, you just have to frame the data and copy it – you don’t (unintelligible) operation. So because of these two, I can do a lot less operation, a lot less data fetches, and you get good performance.”
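
Those two savings are easy to see in a small sketch (the sizes and sparsity below are made up for illustration): a matrix-vector multiply that walks only the stored nonzeros never fetches the zeros and never multiplies by them, so the operation count drops roughly in proportion to the sparsity.

import numpy as np

rng = np.random.default_rng(0)

# A 200x200 matrix in which roughly 2% of the entries are nonzero.
A = rng.random((200, 200)) * (rng.random((200, 200)) < 0.02)
x = rng.random(200)

# Dense matrix-vector multiply: one multiply-add per entry, zeros included.
dense_ops = A.size                              # 40,000 operations

# Sparse version: keep only the nonzeros (with their coordinates) and skip
# everything else -- no fetch, no multiply, no add for the zero entries.
rows, cols = np.nonzero(A)
vals = A[rows, cols]
y = np.zeros(200)
for r, c, v in zip(rows, cols, vals):
    y[r] += v * x[c]                            # one operation per nonzero
sparse_ops = len(vals)                          # roughly 800 operations

print(dense_ops, sparse_ops)                    # ~50x fewer operations
print(np.allclose(y, A @ x))                    # True: same answer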

Going deeper into ideas like vector multiplication, Amarasinghe illustrates more of the work the engineers have to do to deal with data sparsity at a fundamental level.

“If you look at where the data is, large amounts of data, things like sparse neural networks (are) right at the cusp of getting performance using matrix matrix multiply,” he says. “But there are many different other domains, we have sparsities in much large numbers. So you can get a huge amount of performance in here.”
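
For the sparse-neural-network case he mentions, the same logic carries over to matrix-matrix multiply. A hedged sketch, with invented sizes and sparsity, using SciPy’s CSR format as a stand-in for whatever storage a production system would use:

import numpy as np
from scipy import sparse

rng = np.random.default_rng(0)

# A pruned layer: a 1024x1024 weight matrix with ~5% nonzero weights,
# applied to a batch of 64 activation vectors (numbers are illustrative).
W = sparse.random(1024, 1024, density=0.05, format="csr", random_state=0)
X = rng.random((1024, 64))

# The dense product would take 1024 * 1024 * 64 multiply-adds; the sparse
# product does roughly nnz * 64 of them, because the zero weights are
# never stored, fetched, or multiplied.
Y = W @ X

W_dense = W.toarray()
print(np.allclose(Y, W_dense @ X))    # True: identical result
print(W_dense.size, W.nnz)            # 1048576 entries vs ~52,428 stored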

New approaches might help us figure out how to handle data irregularities in these systems.

Amarasinghe also presents a slide comparing relative sparsity across domains such as:

  • Internet and social media
  • Circuit simulation
  • Computational chemistry
  • Fluid dynamics
  • Statistics

This part of the presentation speaks to the idea of analyzing different kinds of data systems differently. It is also instructive about current trends in AI: you can find papers on data sparsity problems in statistics, for example, all over the Internet, and sparse data bias is widely viewed as a major problem for these systems.

To address this, Amarasinghe suggests that engineers can build a sparse tensor compiler to handle the optimization. Watch the part of the talk where he goes into lossless compression carefully – some of the visuals may help.

“What we have done is (we’ve) made it possible for programmers to write the code as they’re working on dense data,” he says. “But actually, in fact, the data is compressed. So we are going to operate on compressed data, and get great performance.”
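
His compiler is not reproduced here, but the programming-model point can be sketched with an assumption: any library that hides a compressed layout behind the same dense-looking expression gives you code that reads as if it operated on an ordinary array. SciPy’s csr_matrix is used below purely as an accessible analogy, not as the system described in the talk.

import numpy as np
from scipy import sparse

def layer(W, x, b):
    # Written as if W were an ordinary dense matrix...
    return W @ x + b

rng = np.random.default_rng(0)
x = rng.random(512)
b = rng.random(256)

W_dense = rng.random((256, 512)) * (rng.random((256, 512)) < 0.03)
W_compressed = sparse.csr_matrix(W_dense)   # ...but actually stored compressed

# Same source code, same result; the compressed path simply skips the zeros.
print(np.allclose(layer(W_dense, x, b), layer(W_compressed, x, b)))   # True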

Read the full article here
