Wednesday, July 10, 2024

Calling AI a Bubble

Ron Miller (via Hacker News):

[Rodney Brooks] knows what he’s talking about, and he thinks maybe it’s time to put the brakes on the screaming hype that is generative AI. Brooks thinks it’s impressive technology, but maybe not quite as capable as many are suggesting. “I’m not saying LLMs are not important, but we have to be careful [with] how we evaluate them,” he told TechCrunch.

He says the trouble with generative AI is that, while it’s perfectly capable of performing a certain set of tasks, it can’t do everything a human can, and humans tend to overestimate its capabilities. “When a human sees an AI system perform a task, they immediately generalize it to things that are similar and make an estimate of the competence of the AI system; not just the performance on that, but the competence around that,” Brooks said. “And they’re usually very over-optimistic, and that’s because they use a model of a person’s performance on a task.”

He added that the problem is that generative AI is not human or even human-like, and it’s flawed to try and assign human capabilities to it. He says people see it as so capable they even want to use it for applications that don’t make sense.

M.G. Siegler:

Seemingly every investor I talk to these days is struggling with the same basic thing: they believe AI is going to be one of the most transformative technologies of the past several decades – and perhaps ever – but they have almost no idea how to invest in the space. And yet they are investing in the space. At a pace that puts the crypto boom to shame. Because, well, that’s the job.

Katie Balevic (via Hacker News):

Tech companies are spending big on the AI craze, but it will be a while before they have much — if anything — to show for it.

As companies prepare to spend over $1 trillion on artificial intelligence, a Goldman Sachs report examined the big question at hand: “Will this large spend ever pay off?”

That sizable investment will go toward the data centers needed to run AI, the power grid, and AI chips. But shortages of those AI ingredients could lead to disappointing returns for companies.

The report is here.

Edward Zitron:

The report covers AI’s productivity benefits (which Goldman remarks are likely limited), AI’s returns (which are likely to be significantly more limited than anticipated), and AI’s power demands (which are likely so significant that utility companies will have to spend nearly 40% more in the next three years to keep up with the demand from hyperscalers like Google and Microsoft).

[…]

The report includes an interview with economist Daron Acemoglu of MIT (page 4), an Institute Professor who published a paper back in May called “The Simple Macroeconomics of AI” that argued that “the upside to US productivity and, consequently, GDP growth from generative AI will likely prove much more limited than many forecasters expect.” A month has only made Acemoglu more pessimistic, declaring that “truly transformative changes won’t happen quickly and few – if any – will likely occur within the next 10 years,” and that generative AI’s ability to affect global productivity is low because “many of the tasks that humans currently perform…are multi-faceted and require real-world interaction, which AI won’t be able to materially improve anytime soon.”

Dare Obasanjo:

This is a great article from Sequoia which argues the tech industry needs $600B in AI revenue to justify the money spent on GPUs and data centers.

OpenAI is the biggest AI pure play and is at $3.4B ARR. This feels like a bubble unless products worth buying show up.

There is no doubt that there will be a lot of money made from AI. The question is whether it will be enough to support a $3T valuation for Nvidia.

Hemant Mohapatra (Thread Reader, via Hacker News):

So now that Nvidia has far outstripped the market cap of AMD and Intel, I thought this would be a fun story to tell. I spent 6+yrs @ AMD engg in mid to late 2000s helping design the CPU/APU/GPUs that we see today. Back then it was unimaginable for AMD to beat Intel in market-cap (we did in 2020!) and for Nvidia to beat both! In fact, AMD almost bought Nvidia but Jensen wasn’t ready to sell unless he replaced Hector Ruiz of AMD as the CEO of the joint company. The world would have looked very different had that happened. Here’s the inside scoop of how & why AMD saw the GPU oppty, lost it, and then won it back in the backdrop of Nvidia’s far more insane trajectory, & lessons I still carry from those heady days[…]

Update (2024-07-15): See also: Hacker News.


Old Unix Geek

Yes, AMD bought ATI instead, and there was some work to integrate the ATI compute and the CPU compute, IIRC.

I'm far from convinced by the current generative LLM fad. But that doesn't mean there won't be cool stuff that has massive economic impact.

E.g., UK company Materials Nexus recently used a materials science AI to create permanent magnets without rare earths in only 3 months. Similar work on boosting solar panels / batteries / high temperature superconductors might prove essential for reducing carbon emissions.

There are some things where generative "ai" is rather useful. But it's always as a nice-to-have feature, not some big OMG solution.

Mayyyyybe I'd say that code producing organisations will have to subscribe to generative "ai" to stay competitive. But it won't make them any money, it'll be more like you have to pay for a bunch of staging servers etc to be in the game.

The trillions in revenue won't materialise, and it sickens me to see this play out again so soon after the Uber for X, crypto "currency", NFT, metaverse, and Web3 scam bubbles have popped.

The Industrial Revolution transitioned people from one sector to another.
AI is besieging, and may conquer, those sectors people had to transfer to in the aftermath of the Industrial Revolution.
What will be left? I am not sure. UBI? Empty lives with no purpose other than spending money on leisure?
Desperate people selling themselves online to the ravenous fat cats and psychopaths?
I don’t like this trend.

As an example, could Artificial Intelligence (AI) be trained to automatically count and score the number of nodes in mind maps available as PDFs or pictures (JPG, PNG, etc.)? Is there any application or company capable of doing that using AI or another approach? Thanks!

"Empty lives with no purpose"

I'll never understand that argument. If only your work provides your life purpose, don't you already lead an empty life devoid of purpose?

Having just come back from a four week trip with my family I can only say that life without work can be full of purpose and joy. I wish I was rich enough to retire.

On the other hand, it seems that mindset is what's keeping me from becoming rich enough.

Old Unix Geek

On the other hand LLMs can sometimes be funny.
