Michael Cafarella
Video: applying AI to markets can teach us a few things
One of the eventual applications of new AI/ML models has to be interfacing with markets and the general study of economics.
When you watch Michael Cafarella talk about this type of analysis, you sort of get a picture of what it’s going to be like trying to fine-tune economic systems with new kinds of data, and new sophisticated neural networks.
Starting out with some basics of macroeconomics and price indexes, Cafarella introduces what’s called the ‘hedonic’ price index: an index that factors the quality of products into its formulas.
“If you’re paying twice as much for the product, but it’s twice as good, well, you’re not really paying twice as much, because you’re getting twice as much in return,” Cafarella explains. He refers to these metrics as “time series” calculations and asks: when we see a price change in a market, do we know whether it’s due to inflation or to a change in quality?
We have to answer these quality questions, he contends.
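The “twice as much, twice as good” idea can be sketched in a few lines. This is a minimal illustration of the quality-adjustment logic, not Cafarella’s actual method; the function name, prices, and quality scores are invented for the example.

```python
def quality_adjusted_price(price: float, quality_index: float) -> float:
    """Divide the observed price by a quality index (1.0 = baseline quality)."""
    return price / quality_index

# A product costs twice as much in the later period, but is twice as good,
# so its quality-adjusted price is unchanged.
baseline = quality_adjusted_price(500.0, 1.0)   # earlier period
later = quality_adjusted_price(1000.0, 2.0)     # later period
print(baseline == later)  # True: no real price increase once quality is counted
```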
“We want to … have something that reflects accurate quality adjustments, something that reflects how much people prefer one change versus another,” Cafarella says, estimating that the average half-life of a barcoded item is one year. That means roughly half of the products you see on the shelf will be replaced by something else a year from now.
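Cafarella’s one-year half-life estimate implies rapid turnover, which a quick calculation makes concrete. The function below is a hypothetical illustration of standard half-life decay, not a figure from the talk.

```python
def surviving_fraction(t_years: float, half_life: float = 1.0) -> float:
    """Fraction of today's barcoded items still on shelves after t years,
    assuming an average half-life of one year."""
    return 0.5 ** (t_years / half_life)

print(surviving_fraction(1))  # 0.5: half the shelf turns over within a year
print(surviving_fraction(2))  # 0.25: three quarters gone after two years
```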
He identifies more criteria for better economic analysis – frequency, for one, and more granular product categorization.
Those sound like things that AI was meant to do!
He points out that government offices have been assessing quality for some goods for a long time, but not for others. For example, there may be more rudimentary or non-AI analysis of microprocessors, especially given the massive chip shortage for which the current administration authorized $52 billion in domestic production support.
But what about all of the other items that we buy?
Cafarella uses the example of a piece of clothing that may now be cheaper on the market, but might be made with inferior materials.
With that in mind, Cafarella talks more about managing aggregated price adjustments, citing a project done with several partners where researchers took a lot of data from checkout scanners, including what he referred to as “textual product descriptions.” That’s where the quality data comes in.
(image caption: Get the real data!)
Noting that this “well-behaved” data set was useful in refining the business intelligence at hand, Cafarella describes a process of naïve comparison. It’s not predictions, exactly – but by understanding how people value goods, and making nuanced comparisons to products available at different times, what you come up with is an adjusted price index that’s a little smarter.
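The comparison process described above can be sketched as a matched-product index: pair up products that appear in both time periods (in practice, matched via their textual descriptions) and average their price ratios. This is a simplified sketch of that general idea, not the researchers’ actual pipeline; the product names and prices are made up for illustration.

```python
from statistics import mean

# Hypothetical scanner-data prices keyed by product description.
period_1 = {"organic apples 1lb": 2.00, "white bread loaf": 1.50}
period_2 = {"organic apples 1lb": 2.20, "white bread loaf": 1.65}

def matched_price_index(p1: dict, p2: dict) -> float:
    """Average price relative across products present in both periods."""
    common = p1.keys() & p2.keys()
    return mean(p2[k] / p1[k] for k in common)

print(round(matched_price_index(period_1, period_2), 3))  # 1.1 → ~10% rise
```

A quality-aware version would further divide each price relative by a quality ratio, as in the earlier “twice as good” example, yielding the adjusted index Cafarella describes.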
In Cafarella’s example, we see that while prices have gone up, quality has gone up, too.
One way to think about that is that organics in the food industry have blossomed and matured: if the organic items available today had been on shelves five years ago, people would’ve paid a hefty premium for them.
Anyway, this talk is interesting and yet another real-world application of these new technologies and data structures that will show us more about our own lives, and in this case, our markets.
“This (type of analysis) is really a meaningful difference in our understanding of how even a very boring kind of product category like food still reflects actual ability to deliver changes that consumers want, and really changes our understanding of the economy,” Cafarella says. “It’s just one taste, I think, of how you can use these computational data and machine learning methods to really improve our understanding of the economy overall.”
Read the full article here