This has mirrored what I've seen in my company. People in the data science/ML part of the company are super excited about AI and are always giving presentations on it and evangelizing it. Most engineers in other areas, though, are generally underwhelmed every time they try using it. It's being heavily pushed by AI "experts" and senior leaders, but the enthusiasm on the ground is lacking as results rarely live up to the extremely rosy promises that the "experts" keep making. Meanwhile, everyone can read the news about layoffs attributed to AI and can see that hiring (especially of junior engineers) has slowed to a trickle. You can only fool people for so long.
taurath 8 minutes ago [-]
For mine it’s worse because we have new leadership who believe in it to a far greater extent than it can deliver. Now a massive amount of our workforce is building proofs of concept and spitting out tons of effectively useless output to look good, because of how strongly leadership has signaled that fully embracing it is good for careers here. It’s a massive mess with nobody to clean it up, and the voices advocating for rigor or good engineering practices are being sidelined.
It’s full out mania. As someone raised in and who escaped a cult, I am having to use every tool in my very large toolbox to stay sane while I wait for this to pass and die down or make my move towards a place that still cares whether their product works.
Wissenschafter 3 minutes ago [-]
"while I wait for this to pass and die down"
Lmao
baxtr 4 minutes ago [-]
What about Karpathy, though?
grebc 16 minutes ago [-]
Underwhelmed is the absolute correct word to use here.
Absolutely everyone raves about this, but other than a few basic computer-related tasks I’ve not seen compelling use cases that justify the billions being lit on fire pursuing it.
My cynical take is that the crypto bros needed something to do with their useless GPUs after the crash and found the perfect answer in LLMs.
Wissenschafter 5 minutes ago [-]
When did Hacker News become so luddite? Bad takes everywhere I look; it feels like everyone here has a '50-year-old burnt-out guy who has no idea what's going on' vibe.
I just got back from a SAIRS conference at UCLA and talked directly with some of the presenters and engineers at Google.
You won't be 'underwhelmed' long.
pesus 19 seconds ago [-]
Blindly dismissing everyone not impressed by the AI hype only serves to further delegitimize the AI hype.
belval 15 minutes ago [-]
This is poor reporting, almost needs a checklist:
[X] Tweets and Instagram comments presented as "what society is thinking"
[X] Ties Luigi Mangione and the California warehouse fire to Gen Z discontent (about AI?).
[X] Statistics used to support the title with little to no regard for continuity: "those respondents who said that AI makes them “nervous” grew from 50% to 52% during the same period" => the percentage was 52% in 2023, 50% in 2024, and 52% in 2025, which seems mostly flat to me; the real jump was 2022-2023, from 39%.
JumpCrisscross 7 minutes ago [-]
They cite a report and a Gallup poll. That’s not just tweets.
simonw 25 minutes ago [-]
I was talking recently to someone who teaches AI-adjacent courses at a US university (not in a computer science department) and they said that enrollment in their class is lower than expected, which they think is likely due to the severity of the AI backlash among students on campus.
mmargenot 16 minutes ago [-]
AI applications that would help normal people in a significant way are pretty lacking, so I'm not surprised. So much conversation about AI products is cycles of "this tech will change everything" without material backup outside of coding agents.
mitthrowaway2 18 minutes ago [-]
What kind of AI-adjacent?
If it's fundamentals of ML, I'm surprised to hear that.
If it's "how to use ChatGPT for creative writing," then I'm not surprised. Why would someone take a class from a teacher who has only as much experience with these tools as their students do?
19skitsch 30 seconds ago [-]
Agree… OP said “not CS” so doesn’t seem surprising. If we’re going by anecdotes, AI classes in the CS dept have risen in popularity in the past few years.
semiinfinitely 23 minutes ago [-]
> they think
is key
CobrastanJorji 7 minutes ago [-]
I think people are really underestimating how poorly today's tweens think of AI. "That looks like ChatGPT" is an insult. Kids avoid things because they heard somewhere that AI might have been involved, and they have a sense that this means it is bad or immoral or illegal or cheating in some nebulous way; it's reinforced by their teachers telling them that using AI for homework is cheating.
I think this next generation is going to come up fundamentally believing that AI is generally a bad thing, and it's going to surprise older people.
MrScruff 6 minutes ago [-]
I think it's not difficult to see why a technology that will likely trigger widespread unemployment during a cost-of-living crisis and an arms race with China, along with all the alignment concerns, might not be hugely popular with the public.
Maybe I'd be a bit more optimistic if someone could explain a realistic economic scenario for how we're going to transition into our utopian abundant future without a depression or a revolution.
JumpCrisscross 4 minutes ago [-]
> a realistic economic scenario for how we're going to transition into our utopian abundant future
One aspect almost certainly has to be data centers being run as utilities.
nacozarina 19 minutes ago [-]
a person can have full faith in the potential value of ai science and simultaneously have zero faith in the current crop of business stewards of that science.
no one is questioning the underlying model mathematics, they are questioning deceptive & reckless stewards.
joaohaas 16 minutes ago [-]
I think most people outside the area do not care and do not know who's on top, and the negative perception is much more related to how the tech will enable users to misuse it (replacing phone lines/support, AI art, things losing quality, etc.) than to the companies themselves.
JumpCrisscross 6 minutes ago [-]
The lack of federal permitting standards for AI data centers is really going to bite the industry in the ass. We also probably need something akin to the WARN Act for AI-related layoffs.
SunshineTheCat 14 minutes ago [-]
Giant leaps in innovation almost always have a reaction like this.
It's new, people fear it. Sometimes justified, usually not.
People greatly feared the car because of the number of horse-related jobs it would displace.
President Benjamin Harrison and First Lady Caroline Harrison feared electricity so much they refused to operate light switches to avoid being shocked. They had staff turn lights on/off for them.
Looking back at these we might laugh.
We're largely in the same boat now.
It's possible AI will destroy us all, but judging from history, irrational reactions to something new aren't exactly unprecedented.
marginalia_nu 9 minutes ago [-]
Many innovations are also on the refuse pile of history. Indoor gas lighting[1] is one. People were quite justifiably skeptical of electricity when its relatively short-lived predecessor frequently killed people in explosions, carbon monoxide poisoning, etc.
[1] https://en.wikipedia.org/wiki/Gas_lighting
They've been saying that since the Boomers were kids, look where that led us
rootusrootus 10 minutes ago [-]
I'm biased, but I think Gen X turned out okay ;-).
2 minutes ago [-]
The kids are alright.
Yokohiii 4 minutes ago [-]
My only surprise is that the AI "elite" is surprised.
hcmgr 21 minutes ago [-]
The tone-deafness of the tech community is so unbearable. Either too on the spectrum, too ambitious (the world is fine because I’m getting mine), or too isolated from non-tech people to realise most people despise what they’re creating.
There’s also a lack of willingness to ‘bring along’ the public. It’s just “make the god thing; ask for permission later”.
gcheong 31 minutes ago [-]
"Make something people want" seems so quaint now.
MaysonL 25 minutes ago [-]
“Make people want something, and sell it to them”.
cyanydeez 23 minutes ago [-]
Ignore all environmental, political and social problems, and invest everything in a purely antisocial technology.
Paraphrasing the classic, it's not AI that people are unhappy with, it's their life around AI. The world generally appears to have become a harsher and more dangerous place - even though it hasn't. But people and especially tabloid press like finding scapegoats and participating in mass hysteria. The anti-AI hysteria is going to go away soon while AI isn't. It's just another tool, like cars or factories. Granted, it brings some danger, but at the same time it brings overwhelmingly more good.
therobots927 22 minutes ago [-]
What the tech elite fail to understand is that we are at historic levels of wealth and income inequality. Access to healthcare is determined by one’s employment which makes what I’m about to explain a matter of life and death.
It doesn’t matter if you think it’s all going to work out and AI will bring an unprecedented era of abundance. That is not the current state.
The current state is: nearly all productivity growth since 1980 has gone to shareholders, not workers: https://www.epi.org/productivity-pay-gap/
Now what do you think happens when we dramatically expand productivity with AI? Well, we’re already seeing unprecedented layoffs in tech. And it’s easy to draw the conclusion that unless something structural changes all of the productivity gains from AI will go to investors not workers. Leaving said workers without access to healthcare or housing.
And of course let’s not forget that the tech elite in question supported Trump in the last election - someone who has done everything in his power to reduce healthcare access among the low income / unemployed population. This isn’t fucking rocket science guys.