NHacker Next
Good code will still win (greptile.com)
seamossfet 38 minutes ago [-]
I find most developers fall into one of two camps:

1. You treat your code as a means to an end to make a product for a user.

2. You treat the code itself as your craft, with the product being a vector for your craft.

The people who typically have the most negative things to say about AI fall into camp #2 where AI is automating a large part of what they considered their art while enabling people in group #1 to iterate on their product faster.

Personally, I fall into the first camp.

No one has ever made a purchasing decision based on how good your code is.

The general public does not care about anything other than the capabilities and limitations of your product. Sure, if you vibe code a massive bug into your product then that'll manifest as an outcome that impacts the user negatively.

With that said, I do have respect for people in the latter camp. But they're generally best fit for projects where that level of craftsmanship is actually useful (think: mission critical software, libraries the rest of us devs depend on, etc).

I just feel like it's hard to talk about this stuff if we're not clear on which types of projects we're talking about.

iamcalledrob 13 minutes ago [-]
Sloppy technical design ends up manifesting in bugs, experiential jank, and instability.

There are some types of software (e.g. websites especially) where a bit of jank is generally acceptable. Sessions are relatively short, and your users can reload the webpage if things stop working. The technical rigor of these codebases tends to be poor, but it's generally fine.

Then there's software which is very sensitive to issues (e.g. a multi-player game server, a driver, or anything that's highly concurrent). The technical rigor here needs to be very high, because a single mistake can be devastating. This type of software attracts people who want to take pride in their code, because the quality really does matter.
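The kind of single devastating mistake in highly concurrent code is often as small as an unguarded read-modify-write. A minimal sketch of the hazard (illustrative Python, not tied to any particular codebase):

```python
import threading

# `counter += 1` is really three steps: read, add, write. Two threads
# can interleave those steps and silently lose updates; the lock makes
# the sequence atomic. Remove the `with lock:` guard and the final
# total can come up short, nondeterministically.

counter = 0
lock = threading.Lock()

def increment(n: int) -> None:
    global counter
    for _ in range(n):
        with lock:
            counter += 1

threads = [threading.Thread(target=increment, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 400000
```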

I think these people are feeling threatened by LLMs. Not so much because an LLM is going to outperform them, but because an LLM will (currently) make poor technical design decisions that will eventually add up to the ruin of high-rigor software.

clan 22 minutes ago [-]
I respect your opinion and especially your honesty.

And at the same time I hope that you will some day be forced to maintain a project written by someone else with that mindset. Cruel, yes. But unfortunately schadenfreude is a real thing - I must be honest too.

I have gotten too old for "ship now, ask questions later" projects.

jstanley 19 minutes ago [-]
I'm in camp 1 too. I've maintained projects developed with that mindset. It's fine! Your job is to make the thing work, not take on its quality as part of your personal identity.

If it's harder to work with, it's harder to work with; it's not the end of the world. At least it exists, which it probably wouldn't have if developed with "camp 2" tendencies.

I think camp 2 would rather see one beautiful thing than ten useful things.

ambicapter 5 minutes ago [-]
> At least it exists, which it probably wouldn't have if developed with "camp 2" tendencies.

Ah yes, if you aren't shitting code out the door as fast as possible, you're probably not shipping anything at all.

ambicapter 6 minutes ago [-]
This is like when people decided that everyone was either "introvert" or "extrovert" and then everyone started making decisions about how to live their life based on this extremely reductive dichotomy.

There are products that are made better when the code itself is better. I would argue that the vast majority of products are expected to be reliable, so it would make sense that reliable code makes for better product. That's not being a code craftsman, it's being a good product designer and depending on your industry, sometimes even being a good businessman. Or, again, depending on your industry, not being callous about destroying people's lives in the various ways that bad code can.

renewiltord 4 minutes ago [-]
I’m an introvert. I make sure that all my “welcome to the company” presentations are in green. I am also an extrovert in that I add more green than required.
bloppe 13 minutes ago [-]
I mostly agree with this. Part of the confusion with the discourse around AI is the fact that "software engineering" can refer to tons of different things. A Next.js app is pretty different from a Kubernetes operator, which is pretty different from a compiler, etc.

I've worked on a project that went over the complexity cliff before LLM coding even existed. It can get pretty hairy when you already have well-established customers with long-term use-cases that absolutely cannot be broken, but their use-cases are supported by a Gordian Knot of tech debt that practically cannot be improved without breaking something. It's not about a single bug that an LLM (or human) might introduce. It's about a complete breakdown in velocity and/or reliability, but the product is very mature and still makes money; so abandoning it and starting over is not considered realistic. Eager uptake of tech debt helped fuel the product's rise to popularity, but ultimately turned it into a dead end. It's a tough balancing act. I think a lot of LLM-generated platforms will fall eventually into this trap, but it will take many years.

roland35 31 minutes ago [-]
That's true, but I think there is a gray area in between. As things scale up in one way or another, having high quality is important for both #1 and #2. It's hard to extend software that was designed poorly.

The question where experience comes in is knowing when quality is and isn't worth the time. I can create all sorts of cool software I couldn't before, because now I can quickly pump out "good enough" Android apps or React front ends! (Not trying to denigrate front end devs, it's just a skill I don't have)

logicchains 4 minutes ago [-]
It's perfectly possible to write very clean code with AI, it just takes a lot more time and prompting.
Swizec 27 minutes ago [-]
> The people who typically have the most negative things to say about AI fall into camp #2 where AI is automating a large part of what they considered their art while enabling people in group #1 to iterate on their product faster.

I am in both camps. Always have been.

Code janitors are about to be in high demand. We've always been pretty popular with leadership and it's gonna get even more important.

Treat code design and architecture as the thing that lets your slop cannons (90% of engineers, even pre-AI) move fast without breaking things.

My output is org velocity.

acedTrex 20 minutes ago [-]
> My output is org velocity.

Amen, slow and steady and the feature flywheel just keeps getting faster.

seamossfet 16 minutes ago [-]
>slop cannons

I am stealing that phrase haha

packetlost 29 minutes ago [-]
I agree on the software dev camps.

> The general public does not care about anything other than the capabilities and limitations of your product.

It's absolutely asinine to say the general public doesn't care about the quality and experience of using software. People care enough that Microsoft's Windows director sent out a very tail-between-legs apology letter due to the backlash.

It's as it always has been: balancing quality and features is... well, a balance, and it matters.

seamossfet 26 minutes ago [-]
The public doesn't care about the code itself, they absolutely care about the quality and experience of using the software.

But you can have an extremely well-designed product that functions flawlessly from the perspective of the user, while under the hood it's all spaghetti code.

My point was that consuming software as a user of the product can be quite different from the experience of writing that software.

Facebook is a great example of this, there's some gnarly old spaghetti code under the hood just from the years of legacy code but those are largely invisible to the user and their experience of the product.

I'd just be careful to separate code elegance from product experience, since they are different. Related? Yeah, sure. But they're not the same thing.

blackbear_ 19 minutes ago [-]
There are other players in the game: the business and the market.

Good code makes it easier for the business to move fast and stay ahead of the competition while reducing expenses for doing so.

packetlost 20 minutes ago [-]
That's fair!

> Facebook is a great example of this, there's some gnarly old spaghetti code under the hood just from the years of legacy code but those are largely invisible to the user and their experience of the product.

I'm sure that's the case in basically everything; it sorta doesn't matter (until it does) if it's cordoned off into a corner that doesn't change and nominally works from the outside perspective.

But those cases are usually isolated, if they aren't it usually quickly becomes noticeable to the user in one way or another, and I think that's where these new tools give the illusion of faster velocity.

If it's truly all spaghetti underneath, the ability to make changes nosedives.

slopinthebag 7 minutes ago [-]
Facebook.com is a monstrosity though, and their mobile apps are slow and often broken as well. And the younger generations are using other networks; Facebook is in trouble.
slopinthebag 6 minutes ago [-]
This is just cope to avoid feeling any shame for shipping slop to users.
ModernMech 28 minutes ago [-]
> You treat your code as a means to an end to make a product for a user.

It isn’t that though, the “end” here is making money not building products for users. Typically people who are making products for users cares about the craft.

If the means-to-end people could type words into a box and get money out the other side, they would prefer to deal with that than products or users.

That's why AI slop is so prevalent — the people putting it out there don't care about the quality of their output or how it's used by people, as long as it juices their favorite metrics: views, likes, subscribes, ad revenue, whatever. Products and users are not in scope.

seamossfet 19 minutes ago [-]
Yeah, I'm not trying to defend slop.

I don't think all means-to-end people are just in it for money; I'll use myself as an example. My team is working on a CAD tool for drug discovery, and the goal isn't to just siphon money from people; the goal is legitimately to improve computational modeling of drug interactions with targets.

With that in mind, I care about the quality of the code insofar as it lets me achieve that goal. If I vibe coded a bunch of incoherent garbage into the platform, it would help me ship faster but it would undermine my goal of building this tool since it wouldn't produce reliable or useful models.

I do think there's a huge problem with a subset of means-to-end people just cranking out slop, but it's not fair to categorize everyone in that camp this way ya'know?

Animats 35 minutes ago [-]
> Meanwhile, the complexity of the average piece of software is drastically increasing. ... The stats suggest that devs are shipping more code with coding agents. The consequences may already be visible: analysis of vendor status pages [3] shows outages have steadily increased since 2022, suggesting software is becoming more brittle.

We've already seen a large-scale AWS outage because of this. It could get much worse. In a few years, we could have major infrastructure outages that the AI can't fix, and no human left understands the code.

AI coders, as currently implemented, don't have a design-level representation of what they're doing other than the prompt history and the code itself. That inherently leads to complexity growth. This isn't fundamental to AI. It's just a property of the way AI-driven coding is done now.

Is anybody working on useful design representations as intermediate forms used in AI-driven coding projects?

"The mending apparatus is itself in need of mending" - "The Machine Stops", by E.M. Forster, 1909.

reese_john 50 minutes ago [-]

  Why build each new airplane with the care and precision of a Rolls-Royce? In the early 1970s, Kelly Johnson and I [Ben Rich] had dinner in Los Angeles with the great Soviet aerodynamicist Alexander Tupolev, designer of their backfire Bear bomber. 'You Americans build airplanes like a Rolex watch,' he told us. 'Knock it off the night table and it stops ticking. We build airplanes like a cheap alarm clock. But knock it off the table and still it wakes you up.'...The Soviets, he explained, built brute-force machines that could withstand awful weather and primitive landing fields. Everything was ruthlessly sacrificed to cut costs, including pilot safety.
  We don't need to be ruthless to save costs, but why build the luxury model when the Chevy would do just as well? Build it right the first time, but don't build it to last forever. - Ben Rich in Skunk Works
xnx 2 hours ago [-]
If "good code" == "useful code", then yes.

People forget that good engineering isn't "the strongest bridge", but the cheapest bridge that just barely won't fail under its design conditions.

gizmo686 2 minutes ago [-]
Engineers don't build the cheapest bridge that just barely won't fail. They build the cheapest bridge that satisfies thousands of pages of regulatory requirements maintained and enforced by dozens of different government entities. Those regulations range from safety, to aesthetic, to environmental, to economic, to arcane.

Left to their own devices, engineers would build the cheapest bridge they could sell that hopefully won't collapse. And no care for the impact on any stakeholder other than the one paying them.

siriusastrebe 1 hour ago [-]
What would happen if we made bridges to last as long as possible, to withstand natural disasters and require minimal maintenance?

What if we built things that are meant to last? Would the world be better for it?

GarnetFloride 1 hours ago [-]
Look up Roman concrete. There are 2000 year old bridges and aqueducts still in use.

We only recently figured out how to reproduce Roman concrete.

We’d have more but a lot were blown up during WWII.

bombela 35 minutes ago [-]
There is nothing special about Roman concrete compared to modern concrete. Modern concrete is much better.

The difference is that they didn't have rebar. And so they built gravity stable structures. Heavy and costly as fuck.

A modern steel and concrete structure is much lighter and much cheaper to produce.

It does mean a modern structure doesn't last as long, but also the Roman stuff we see is what survived the test of time, not what crumbled.

darkwater 22 minutes ago [-]
We have modern architecture crumbling already, less than 100 years after it was built. I know engineering is about tradeoffs, but we should also acknowledge that, as a society, we have become very used to putting direct economic cost as the main, and sometimes only, metric.
recursive 1 hour ago [-]
Devil's advocate here. Maybe we'd all forget how to build bridges in the next thousand years, after bridging all the bridgeable spans.
DeathArrow 1 hour ago [-]
What if instead of one bridge we build three, so more people can cross the river?
siriusastrebe 1 hour ago [-]
And if your one bridge survived as long as, or longer than three bridges?
pixl97 34 minutes ago [-]
Then you still have traffic issues and no one is happy.
fiedzia 1 hour ago [-]
> What if we built things that are meant to last? Would the world be better for it?

You'd have a better bridge, at the expense of other things, like hospitals or roads. If people choose good-enough bridges, that shows there is something else they value more.

siriusastrebe 1 hour ago [-]
Once the good-enough bridge deteriorates and we have to spend more money maintaining or replacing it, don't we end up just spending the same? Just now we're left with a crappy bridge.

cm11 28 minutes ago [-]
Certainly, "enough" is doing a lot of work and things get blurry, but I think "good enough" is meant to capture some of that. Over building is also a problem. It isn't strictly true that building longer lived things is cheaper over time either, it obviously depends on the specific things getting compared. And if you go 100 years rather than 25 years, you'll have fewer chances to adjust and optimize for changes to the context, new technology, changing goals, or more efficient (including cost saving) methods.

Obviously, there's a way to do both poorly too. We can make expensive things that don't last. I think a large chunk of gripes about things that don't last are really about (1) not getting the upside of the tradeoff, cheaper (in both senses) more flexible solutions, and (2) just getting bad quality period.

pixl97 33 minutes ago [-]
Depends how much the infrastructure and needs around it changes.
nisegami 51 minutes ago [-]
But we also got roads and hospitals.
blast 1 hour ago [-]
> the cheapest bridge that just barely won't fail

That can't be right? What about safety factors

StevenWaterman 1 hour ago [-]
Safety factors exist because without them, bridges fall down
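Concretely, a factor of safety sizes the structure for more load than it is nominally expected to carry, absorbing variance in materials and loading. A toy calculation (numbers are illustrative, not taken from any design code):

```python
# Hypothetical example: a member with a 100 kN nominal design load.
# Engineers don't size it for exactly 100 kN; they multiply by a
# factor of safety so material variance and load spikes don't cause
# failure right at the margin.

def required_capacity(design_load_kn: float, safety_factor: float) -> float:
    """Capacity the member must actually be built to withstand."""
    return design_load_kn * safety_factor

# Structural factors of safety are commonly in the 1.5-2.0 range,
# varying by material, load type, and governing code.
print(required_capacity(100.0, 1.67))  # 167.0
```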
pklausler 50 minutes ago [-]
The free market ensures that bridges stay up, because the bridge-makers don't want to get sued by people who have died in bridge collapses.
quentindanjou 37 minutes ago [-]
This only works when the barrier to entry to sue is low enough, and when the law is applied impartially, without corruption, with sanctions meaningful enough, potentially company-ending, to discourage them.

The moment you remove one of these factors, the free market becomes dangerous for the people living in it.

irishcoffee 57 minutes ago [-]
That isn't how safety factors work... The person you're responding to is correct. I encourage you to look it up!
pagecalm 33 minutes ago [-]
Agreed on the economics side. Clean code saves you time and money whether a human or AI wrote it. That part doesn't change.

But I don't think the models are going to get there on their own. AI will generate a working mess all day long if you let it. The pressure to write good code has to come from the developer actually reviewing what comes out and pushing back. The incentive is there but it only matters if someone acts on it.

simianwords 43 minutes ago [-]
People are not emotionally ready to accept that certain layers of abstraction don’t need as much care and effort if they can be automated.

We are at the point where a single class can be dirty but the API of the classes should be clean. There’s no point reviewing the internals of a class anymore. I’m more or less sure that they would work as intended.

The next step is the microservice itself. The API of that microservice should be clean, but the internals may be whatever. We are 10% of the way there.
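The "clean API, dirty internals" boundary can be made concrete: reviewers audit the public contract, while the body behind it could be regenerated wholesale without touching callers. A sketch (all names are illustrative):

```python
# Sketch of the "clean API, dirty internals" boundary. The public
# surface is what gets reviewed; the internals could be regenerated
# wholesale without callers ever noticing.

class RateLimiter:
    """Public contract: allow() returns True at most `limit` times
    per `window_s` seconds. This docstring and signature are what
    reviewers audit."""

    def __init__(self, limit: int, window_s: float):
        self._limit = limit
        self._window_s = window_s
        self._hits: list[float] = []  # internals: free to be ugly

    def allow(self, now: float) -> bool:
        # Internals: naive O(n) pruning on every call. A generated
        # implementation could be messy or inefficient; as long as
        # the contract above holds, callers are unaffected.
        self._hits = [t for t in self._hits if now - t < self._window_s]
        if len(self._hits) < self._limit:
            self._hits.append(now)
            return True
        return False

rl = RateLimiter(limit=2, window_s=1.0)
print(rl.allow(0.0), rl.allow(0.1), rl.allow(0.2))  # True True False
print(rl.allow(1.5))  # True (window rolled over)
```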

dcchambers 39 minutes ago [-]
Does performance not matter?

What if your AI uses an O(n) algorithm in a function when an O(log n) implementation exists? The output would still be "correct".
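Both versions of such a function would pass a correctness-only test suite; the difference only shows up at scale. A sketch of the O(n) vs O(log n) gap (illustrative Python):

```python
import bisect

# Both functions are "correct"; only one scales. This is the kind of
# difference a review catches but a passing test suite may not.

def contains_linear(sorted_xs: list[int], target: int) -> bool:
    for x in sorted_xs:        # O(n): may touch every element
        if x == target:
            return True
    return False

def contains_binary(sorted_xs: list[int], target: int) -> bool:
    i = bisect.bisect_left(sorted_xs, target)  # O(log n)
    return i < len(sorted_xs) and sorted_xs[i] == target

xs = list(range(0, 1_000_000, 2))
print(contains_linear(xs, 999_998), contains_binary(xs, 999_998))  # True True
print(contains_linear(xs, 3), contains_binary(xs, 3))              # False False
```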

NitpickLawyer 12 minutes ago [-]
> Does performance not matter?

Performance can be a direct target in a feedback loop and optimised away. That's the easy part. Taking an idea and poof-ing a working implementation is the hard part.

simianwords 38 minutes ago [-]
In most cases, no. The bottleneck is usually IO.
ezekg 22 minutes ago [-]
The background pattern really makes it hard to read, just fyi. I'd make the content have a white bg if you absolutely must use the pattern.
socalgal2 26 minutes ago [-]
When has this ever been true?

Did the best processor win? No, x86 is trash.

Did the best computer language win? No (not that you can pick a best).

The same is true pretty much everywhere else outside computers, with rare exception.

personality1 20 minutes ago [-]
I wish I could write beautiful good code, every part of me wants it, but I'm forced to deliver as fast as I can.
muskstinks 35 minutes ago [-]
... for now.

And just to be clear: AI continues to progress. There are already rumors about the next Anthropic model coming out, and we are now in the phase of the biggest centralized reinforcement loop that has ever existed: everyone using AI for writing and giving it feedback.

We are, thanks to LLMs, now able to codify humans, and while it's not clear how fast this is going, I do not believe anymore that my skills are unique.

A small hobby application cost me 11 dollars over the weekend and took me 3h to 'build', while I would probably have needed 2-3 days for it.

And we are still limited by resources and normal human progress. Like, Claude teams are still experimental. Things like gastown or orchestrator architectures/structures are not that established and consume quite a lot of tokens.

We have not even had time yet to build optimized models. Claude Code still understands A LOT of languages (human languages and programming languages).

I don't think anyone really cares about code quality. I do, but I'm a software engineer. Everyone around me doesn't. Business doesn't. Even fellow co-workers don't, or don't understand, good code.

Even stupid things like the GTA 5 Online (or was it RDR2?) startup bug weren't found for ages (there was some algorithmic complexity issue in loading a config file which made startup take ages, until someone outside Rockstar found it and Rockstar fixed it).

We also have plenty of code where it doesn't matter as long as it works. Offline apps, scripts, research scripts, etc.
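The GTA startup bug mentioned above is a nice concrete case: the widely reported analysis blamed accidentally quadratic parsing of a large config file, re-scanning the remaining input for every token. A sketch of that pattern (illustrative Python, not the actual game code):

```python
# The pattern behind that bug, as widely reported: re-scanning the
# whole remaining input for each token turns an O(n) parse into
# O(n^2). This is a sketch of the shape, not Rockstar's code.

def parse_quadratic(data: str) -> int:
    count, pos = 0, 0
    while pos < len(data):
        remaining = data[pos:]        # O(n) copy per token, like the
        end = remaining.find(",")     # repeated strlen() in the bug
        if end == -1:
            end = len(remaining)
        count += 1
        pos += end + 1
    return count

def parse_linear(data: str) -> int:
    return len(data.split(","))       # single pass over the input

payload = ",".join(["item"] * 2_000)
print(parse_quadratic(payload) == parse_linear(payload) == 2_000)  # True
```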

vb-8448 2 hours ago [-]
Good code wasn't winning even before the ai slop era!

The pattern was always: ship fast, fix/document later, but when "later" comes, "don't touch what is working".

To date nothing changed yet, I bet it won't change even in the future.

deathanatos 8 minutes ago [-]
& I have thus far made a large portion of my living off of fixing bad code "later".

… but lately, the rate at which some dev with an LLM can just churn out new bad code has just shot through the roof. I can still be struggling to pick apart the last piece of slop, trying to figure out "okay, if someone with a brain had written this, what would the inputs & outputs be?" and "what is it that production actually needs and relies on, and what causes problems, and how can we get the code from point A to point B without more outages"; but in the meantime, someone has spit out 8 more modules of the same "quality".

So sure, the basic tenets haven't changed, but these days I feel like I'm drowning in outages & bugs.

briantakita 2 hours ago [-]
I was told by an exec...once a company or technology implements something and gets mindshare, the community (including companies) moves on.

Competition is essentially dead for that segment given there is always outward growth.

With that being said, AI enables smaller players to implement their visions with enough completeness to be viable. And with a hands off approach to code, the underlying technology mindshare does not matter as much.

esafak 1 hour ago [-]
If that were true first movers would always win. Hotmail came before Gmail. Yahoo came before Google. Myspace came before Facebook. Et cetera. Of course it is best to avoid competition by creating a new (sub)category but category kings can change.
mettamage 56 minutes ago [-]
I disagree, Electron showed the world that good code can be magnetic

... I'll see myself out

rbbydotdev 33 minutes ago [-]
The wrinkle here is what exactly "win" means.
ahussain 22 minutes ago [-]
My prediction is that we'll start to see a whole new layer of abstraction to help us write high quality code with LLMs - meaning new programming languages, new toolchains, stricter typechecking, in-built feedback loops etc.

The slop we're seeing today comes primarily from the fact that LLMs are writing code with tools meant for human users.

sublinear 22 minutes ago [-]
> economic forces will drive AI models toward generating good, simpler, code because it will be cheaper overall

Economic forces are completely irrelevant to the code quality of AI.

> I believe that economic incentives will start to take effect and AI models will be forced to generate good code to stay competitive amongst software developers and companies

Wherever AI succeeds, it will be because a dev is spending time on a process that requires a lot of babysitting. That time is about the same as writing it by hand. Language models reduce the need to manually type something because that's what they are designed to do, but it doesn't mean faster or better code.

AI is a rubber duck that can talk back. It's also a natural language search tool. It's training wheels for devs to learn how to plan better and write half-decent code. What we have is an accessibility tool being sold as anything and everything else because investors completely misunderstand how software development works and are still in denial about it.

Code quality starts and ends with business needs being met, not technical capability. There is no way to provide that to AI as "context" or automate it away. AI is the wrong tool when those needs can be met by ideas already familiar to an experienced developer. They can write that stuff in their sleep (or while sitting in the meetings) and quickly move on.

yshamrei 1 hour ago [-]
good code does not earn money =)
RcouF1uZ4gsC 1 hour ago [-]
The existence and ubiquity of bash scripts make me doubt this.
seniorThrowaway 2 hours ago [-]
this submission is basically an ad
sloptile 2 hours ago [-]
[flagged]
dang 2 hours ago [-]
Please don't cross into personal attack, and especially please don't harass newcomers.

https://news.ycombinator.com/newsguidelines.html

7e 2 hours ago [-]
None of this is true. Models will soon scale to several million tokens of context. That, combined with the combined experience of millions of feedback cycles, will make software a solved problem for machines, even as humans remain dumb. Yes, even complex software. Complex software is actually better because it is, generally, faster with more features. It’s smarter. Like a jet fighter, the more complex it is, the more capable it is.