AWS CEO says replacing junior devs with AI is 'one of the dumbest ideas'
Posted by birdculture 16 hours ago
Comments
Comment by alexgotoi 14 hours ago
What AI does is remove a bunch of the humiliating, boring parts of being junior: hunting for the right API by cargo-culting Stack Overflow, grinding through boilerplate, getting stuck for hours on a missing import. If a half-decent model can collapse that search space for them, you get to spend more of their ramp time on “here’s how our system actually fits together” instead of “here’s how for-loops work in our house style”.
If you take that setup and then decide “cool, now we don’t need juniors at all”, you’re basically saying you want a company with no memory and no farm system – just an ever-shrinking ring of seniors arguing about strategy while no one actually grows into them.
Always love to include a good AI x work thread in my https://hackernewsai.com/ newsletter.
Comment by rsanek 13 hours ago
Comment by socketcluster 10 hours ago
My experience as a team lead working with a lot of juniors is that they are terrified of losing face and tend to talk a big game. As a team lead, I try to use language which expresses any doubts or knowledge gaps I have so that others in my team feel comfortable doing it as well. But a key aspect is that you have to really know your stuff in certain areas because you need to inspire others to mirror you... They won't try to mirror you if they don't respect you, based on your technical ability.
You need to demonstrate deep knowledge in some areas and need to demonstrate excellent reasoning abilities before you can safely ask dumb questions IMO. I try to find the specific strengths and weaknesses of my team members. I give constructive criticism for weaknesses but always try to identify and acknowledge each person's unique superpower; what makes them really stand out within the team. If people feel secure in their 'superpower', then they can be vulnerable in other areas and still feel confident. It's important to correctly identify the 'superpower' though, because you don't want a Junior honing a skill they don't naturally possess, or calling the shots when they should be asking for help.
Comment by tharkun__ 6 hours ago
> My experience as a team lead working with a lot of juniors is that they are terrified of losing face
So much this! Both from my experience as a Junior very many years ago and also with Juniors (and not so Juniors) today.
> tend to talk a big game
Very big game. Claude does too. The kind of BS it spews in very confident language is amazing.
> As a team lead, I try to use language which expresses any doubts or knowledge gaps I have so that others in my team feel comfortable doing it as well
Agree. I also often literally say "Dumb idea: X" to try and suss out areas that may have been left by the wayside and under-explored, or where assumptions have been made without verifying them. It's amazing how often even "Seniors"+ will spew assumptions as fact without verification. It's very annoying actually.
> superpower
How do you actually do this though? I would love to do this but it seems hard to find an actual "superpower". Like, where does a "super" power start vs. "yeah, they're better at this than others but definitely not as good as me or person X that definitely does have that superpower"? Like, when can you start encouraging, so to speak?
Comment by andrewmutz 13 hours ago
Comment by boston_clone 13 hours ago
But forced RTO and only 10 days off per year are enough to keep me away ;)
Comment by lvspiff 12 hours ago
Even as a lead, I ask the dumb question when no one else does, because when I can see the look on people's faces, or realize no one is chiming in, the dumb question is needed to make sure everyone drives the point home. I've never been met with any sort of looking down upon, nor do I discourage any of my staff - quite the opposite - I champion them for being willing to speak up.
Comment by tpoacher 10 hours ago
Some questions really are dumb and bring no value to the table.
The key is knowing which is which, and that is the part that comes with experience.
Comment by noisy_boy 8 hours ago
They do tell you that the person asking them either isn't getting it, which is valuable information, or that they are trying to ask questions for the sake of it, which is also valuable information.
Comment by darth_avocado 10 hours ago
But yes on a personal level, being senior enough in my career, I’d rather be thought of as less skilled by asking questions before the s hits the fan, than execute and mismanage a project that I didn’t ask enough questions on. The latter has more consequences tbh.
Comment by reactordev 13 hours ago
Comment by JohnFen 10 hours ago
Absolutely. If the company I work for happens to be one that's so crappy that I'd get fired for questioning things, it's better to find that out as soon as possible. That's not a company that's worth my time and attention.
Comment by Xelbair 13 hours ago
Comment by reactordev 12 hours ago
Comment by baxtr 12 hours ago
Whenever I don’t understand something I say something like: "Uh, I’m probably the only one here, but I don’t get it…"
Comment by Rodeoclash 12 hours ago
Comment by darth_avocado 10 hours ago
Comment by tracker1 10 hours ago
PFM - Pure Fucking Magic
I've only once ever had anyone actually ask what it means... essentially it's used as an abstraction for a complex process that would be excessive to explain in context.
I asked, after the meeting.
Comment by tracker1 10 hours ago
Asking stupid questions almost goes hand in glove with "it's easier to ask forgiveness than permission." A lot of times, you're better off just doing something. Asking a simple question or making something happen saves a lot of grief more often than not. Understanding is important.
Comment by rukuu001 12 hours ago
Comment by shakna 6 hours ago
I wouldn't say discouraging it will be the norm across most places in Australia.
Comment by awesome_dude 10 hours ago
Graduate, Junior, Senior, Team Lead - my title hasn't mattered to the response
Comment by vkou 10 hours ago
(I believe you when you say that most of yours are like this.)
Comment by kennyloginz 4 hours ago
Comment by globalnode 10 hours ago
Comment by awesome_dude 10 hours ago
Comment by xeromal 12 hours ago
Comment by tpoacher 10 hours ago
The seniors were very understanding, and more importantly it raised important questions about backups, dev vs prod pipelines, etc.
But you can bet my cousin was super embarrassed by it, and saving face was a big part of it.
Comment by ben_w 9 hours ago
I've worked with people from Korea who took me 100% seriously when I said the architecture was too complex and hard to work with, slowing down velocity, and although they did end up keeping it there was no outward indication of lost face and they did put in the effort to justify their thinking.
I've also worked with some British, Greek, and Russian people who were completely unwilling to listen to any feedback from coworkers, only order them around.
Even within a person: I know a self-identified communist, who is very open minded about anything except politics.
Comment by Spooky23 11 hours ago
The place I work at is in the middle of a new CEO’s process of breaking the company. The company can’t go out of business, but we’ll set stuff on fire for another 12-18 months.
Comment by simsla 12 hours ago
A product manager can definitely say things that would make me lose a bit of respect for a fellow senior engineer.
I can also see how juniors have more leeway to weigh in on things they absolutely don't understand. Crazy ideas and constructive criticism are welcome from all corners, but at some level I also start expecting some more basic competence.
Comment by lanstin 12 hours ago
Comment by mv4 10 hours ago
Meta? Ask questions anytime.
Amazon? Not so much.
Comment by doctaj 7 hours ago
Comment by aut0m8d 7 hours ago
Comment by marcus_holmes 7 hours ago
Comment by sharkweek 5 hours ago
It was anybody’s guess if they really didn’t understand the topic or if they were reading the room, but it was always appreciated.
Comment by awesome_dude 10 hours ago
"Why the f*ck are you asking, you should know this"
or
"Why the f*ck can you not look that up"
edit: There's an entire chapter of the internet with "LMGTFY" responses because people ask questions.
or
"Isn't it f*cking obvious"
or
"Do I have to f*cking spell it out for you"
There's a strong chance that I am autistic, which means, yes, I need people to be (more) explicit.
AI has done a hell of a good job making it easier for me to search for subtexts that I typically miss. And I receive less of the negative feedback when I have something to ask that does help.
Comment by versteegen 5 hours ago
> "Why the f*ck are you asking, you should know this"
Because you mentioned NZ: my father, a toolmaker, said there was a huge difference between Europe and NZ. In Germany/Netherlands, he'd be working under a more senior toolmaker. When he took a job in NZ and asked the boss something, as would have been the proper thing to do in Europe, he got a response just like that: because he was the expert, and his NZ boss was just a manager.
Comment by jamblewamble 9 hours ago
Comment by awesome_dude 9 hours ago
edit: well, except when you search the documentation and get (literally) 70+ results because you don't know the exact phrasing used in the self hosted wiki...
Or, when it's a question that is domain specific (meaning that the SME is supposed to know it, which you only know if you are... an SME...)
etc
Comment by mmh0000 5 hours ago
: “hey bob, I looked here and here and here and didn’t find the correct information. Can you show me where to look or tell me the answer so I can document it”
Because most people don’t bother doing the tiniest amount of their own research before asking dumb questions it becomes a huge headache to answer the same thing a million times.
However, if you can show that you did put in the effort to look up the answer first people will be much more willing to help.
Comment by awesome_dude 5 hours ago
Can you show why you assumed that what you are asking for wasn't provided?
Can you also explain why your response is to make rather harsh judgements rather than work out what was going on in the first place?
Comment by watwut 39 minutes ago
More like: do the quick Google search or check the doc before asking that question. If the quick search or a look into the doc does not turn up the answer, ask.
Comment by apercu 10 hours ago
Comment by kelipso 10 hours ago
Comment by lanstin 12 hours ago
Comment by giancarlostoro 13 hours ago
I strongly disagree, a Senior who cannot ask a "dumb question" is a useless developer to me. Ask away, we want to figure things out, we're not mind readers. Every good senior developer I know asks questions and admits when they don't know things. This is humility and in my eyes is a strong marker of a good Senior Developer. The entire point of our job is to ask questions (to ourselves typically the most) and figure out the answers (the code).
Juniors could or should ask more questions, and by the time you're a Senior, you're asking key questions no matter how dumb they sound.
Comment by cloudfudge 12 hours ago
Comment by johnfn 11 hours ago
This seems almost entirely wrong to me? That anyone, at any level of seniority, can ask "dumb questions" and give signal about "nonsense abstractions" seems a property of any healthy organization. That only juniors can do this doesn't just seem wrong, it seems backwards. I would expect seniors to have the clearest idea on whether abstractions make sense, not juniors.
Comment by tyre 14 hours ago
Junior devs do that naturally (if you have the culture) because they don’t know anything already. It’s great
Comment by 47928485 14 hours ago
Tell me you've never worked at FAANG without telling me you've never worked at FAANG...
Comment by lkjdsklf 11 hours ago
What isn’t viewed positively is when you refuse to accept a decision after it’s been made and your concerns have been heard. People get pissed if you keep relitigating the same points over and over again
Comment by kyralis 2 hours ago
Your job is to make sure that the decision makers, when they're not you, have the information needed to make competent decisions. You should keep arguing when (a) there is credible reason to believe that important information has not been heard or understood or (b) new information has come to light that you credibly believe might change the decision. In the absence of those two, you should accept that you have done your job and let your managers do theirs, even if you disagree with them. Bring it back up when (a) or (b) changes, and not until.
Comment by jauer 9 hours ago
I think this comes down to how you go about asking. You have to take the time to understand what's there and how it's seen by others by being curious, reading docs, etc., instead of rolling in making assertions disguised as questions to assert authority, like so many are wont to do.
I suppose it's possible that I'm the designated court jester and that's why I can get away with questioning, but I don't think that's the case :)
Comment by iyihz 8 hours ago
Comment by ebiester 13 hours ago
Comment by kyralis 2 hours ago
I've given talks on work/life balance -- and I stand by those talks enough to argue with directors and above when needed, though it rarely is -- and an important part of that talk is about how much better it can look when you can intelligently describe the limits of your knowledge, skills, and estimation.
If you get penalized for that, you're just in a shit role with a shit manager. Don't project that on the rest of us.
Comment by tyre 10 hours ago
I did work at Stripe, which in places did this pretty well. It still felt like a huge company (this was back in 2022) that had lost part of that spirit depending on org leadership. I had to pull that out of engineers who had been scared out of that level of vulnerability. But building that trust is part of leadership and great people tend to want to question and improve things.
Comment by MrChoke 5 hours ago
Comment by iwontberude 13 hours ago
Comment by wkat4242 13 hours ago
Some of the biggest accidents have happened directly due to this. Like Tenerife where the flight engineer had been listening to the radio and raised doubts about the runway being free but was ignored by the overconfident captain.
Comment by watwut 11 minutes ago
It takes months of dysfunction until the customer says "I do not want to work with you anymore", or until the "overtime and over budget" thing suddenly becomes too large and problems show up in the numbers. Or until a key team suddenly completely decomposes. Every single time I have seen that, multiple people tried to communicate about the issue and were shot down.
It is not like management was always "holistically" right and everyone down there just doesn't see the big picture or has bad arguments - they usually just do not know what is going on at lower levels. Failure to actually listen, whether because it feels bad or because it would take time, is quite common.
Comment by lanstin 6 hours ago
Comment by mikrl 13 hours ago
Is this a bad thing though? If some technical decision has downside risk, I’d reasonably expect:
- the affected stakeholder to bring it up
- the decision maker to assuage the stakeholder’s concern (happy path) or triage and escalate
Comment by iwontberude 13 hours ago
Comment by fn-mote 11 hours ago
As important as I think questioning is, there’s another side of it where people push their own agenda with questions on topics that were decided by other/more senior people hashing it out. At some point this does need to be dealt with. All I see is the yapping questions wasting meeting time, though.
Comment by AdrianB1 13 hours ago
Comment by ebiester 13 hours ago
One is a "one junior per team" model. I endorse this for exactly the reasons you speak.
Another, as I recently saw, was a 70/30 model of juniors to seniors. You make your seniors as task delegators and put all implementation on the junior developers. This puts an "up or out" pressure and gives very little mentorship opportunities. if 70% of your engineers are under 4 years of experience, it can be a rough go.
Comment by jorvi 13 hours ago
You have 1 veteran doctor overseeing 4 learning doctors. For example, operating rooms do this: they will have 4 operating rooms with 4 less experienced anesthetists and then 1 very experienced anesthetist who rotates between the 4 and is on call for when shit hits the fan.
Honestly I think everyone here is missing the forest for the trees. Juniors' main purpose isn't to "ask questions", it's to turn into capable seniors.
That's also why the whole "slash our junior headcount by 3/4ths" we are seeing across the industry is going to massively, massively backfire. AI / LLMs are going to hit a wall (well, they already hit it a while ago), and suddenly everyone is scrambling for seniors but there are none, because no one wanted to bear the 'burden' of training juniors to be seniors. You thought dev salaries are insane now? Wait until 4-5 years from now.
Comment by marcosdumay 12 hours ago
A hospital model may be a good idea. One where you have a senior programmer and many junior ones doing most tasks isn't. IMO, something closer to a real hospital team, where you have experts of different disciplines, and maybe a couple of juniors composing the team has much higher chances of success.
Comment by jorvi 12 hours ago
That is not how hospitals work. The surgery department won't have a crack team of different disciplines to teach budding surgeons everything. They'll only have veteran surgeons to guide less-experienced surgeons.
What you will have is interdepartmental cooperation / hand-off, and you'll have multi-discipline roles like surgical oncologist.
In the same way, you won't have devops seniors training front-end juniors.
Comment by marcosdumay 12 hours ago
Comment by lanstin 5 hours ago
Comment by tetha 9 hours ago
And practically, you can have one or two journeymen per master. However, these 2-3 people can in turn support 3-4 more juniors to supply useful work.
This also establishes a fairly natural growth of a person within the company. First you do standard things as told. Then you start doing projects that mostly follow a standard that has worked in the past. And then you start standardizing projects.
Comment by AdrianB1 13 hours ago
Comment by never_inline 3 hours ago
Juniors are usually given either grunt or low priority work while seniors get more "important" work.
OTOH, it takes a lot to get your questions to the RIGHT EARS when you're a junior, so I wouldn't agree with your characterization at all.
Comment by int_19h 1 hour ago
Comment by HeavyStorm 12 hours ago
Really, juniors are only important because they ask "dumb" questions that can help remove useless abstractions? That's your take?
Comment by sailfast 13 hours ago
It’s damned near impossible to figure out where to spend your time wisely to correct an assumption a human made vs. an AI on a blended pull request. All of the learning that happens during PR review is at risk in this way and I’m not sure where we will get it back yet. (Outside of an AI telling you - which, to be fair, there are some good review bots out there)
Comment by startupsfail 13 hours ago
The result is interesting. First, juniors are miserable. What used to be a good experience of coding and debugging in a state of flow is now anxiously waiting to see whether an AI can do it or not.
And senior devs are also miserable: getting apprentices used to be fun and profit, working with someone young is uplifting, and now it is gone.
The code quality is going down, Zen cycle interrupted, with the RL cost functions now at the top.
The only ones who are happy are hapless PhDs ;$
Comment by PunchyHamster 7 hours ago
Juniors are just... necessary in the balance. Have too few of them and the mid and senior devs will get more and more expensive, so you hire a bunch of juniors, they get trained on the job, and it balances out.
Hell, if the company does it right they might underpay a junior-turned-senior for a decade before they notice and look at what industry pay looks like!
Comment by TheGRS 9 hours ago
I still think the central issue is the economy. There are more seniors available to fill roles, so filling out junior roles is less desirable. And perhaps "replacing juniors with AI" is just the industry's way of clumsily saving face.
Comment by marcosdumay 12 hours ago
Or in other words, the people that get the most value from AI are junior devs, since they still don't know plenty of popular environments very well. It's also useful for seniors that are starting something in a new environment, but those only get 1 or 2 novel contexts at a time, while everything is new for a junior.
Or, in again another set of words, AI enables juniors to add more value more quickly. That makes them more valuable, not less.
Comment by Waterluvian 11 hours ago
Comment by epgui 10 hours ago
Comment by frtime2025 13 hours ago
I also don’t believe juniors, kids, seniors, staff, principals, distinguished/fellow should be replaced by AI. I think they WILL be, but they shouldn’t be. AI at the Gemini 3 Flash / Claude Opus 4.5 level is capable, with help and review, of doing a lot of what a lot of devs currently do. It can’t do everything and will fail, but if the business doesn’t care, they’ll cut jobs.
Don’t waste time trying to argue against AI to attempt to save your job. Just learn AI and do your job until you’re no longer needed to even manage it. Or, if you don’t want to, do something else.
Comment by marcosdumay 12 hours ago
That's not how things work in normal times.
But normal times require minimally capable managers, a somewhat competitive economy, and some meritocracy in hiring. I can believe that's how things will work this time, but it's still a stupid way to do it.
Comment by dejj 13 hours ago
> cargo-culting Stack Overflow
What do you mean by this? I understand “cargo-culting” as building false idols, e.g. wooden headphones and runways to attract airplanes that never come.
Comment by kjellsbells 13 hours ago
Example: you have a Windows problem. You search and read that "sfc /scannow" seems a popular answer to Windows problems. You run it without ever understanding what sfc does, whether the tool is relevant to your problem, etc. You are cargo-culting a solution.
Comment by PaulStatezny 12 hours ago
Comment by kunley 13 hours ago
http://www.catb.org/jargon/html/C/cargo-cult-programming.htm...
Comment by protocolture 3 hours ago
This is the only role of executives, sales people, account managers. They usually do it with complete and utter confidence too. Vibe-questioning and vibe-instructing other people without a care in the world.
Comment by citizenpaul 9 hours ago
AI is now just the scapegoat for an economy-wide problem. Execs found "one neat trick": piling junior work on seniors until they quit, while not hiring replacements, in order to goose short-term profits. Now every company is in the same position where hiring a senior really means hiring 5 seniors to replace the one that had 5 jobs layered on over a few years. This is of course impossible for any mortal to jump into. Now they also don't even have juniors to train up to senior levels.
Comment by flyinglizard 11 hours ago
Comment by never_inline 3 hours ago
They're also good at putting company code into ChatGPT.
/snark
Comment by throwaway894345 3 hours ago
Comment by jppope 13 hours ago
I was never formally trained so I just keep asking "why" until someone proves it all the way. Sales itself is also a lot about asking questions that won't come up to find the heart of the thing people actually want... which is just another side of the coin.
Comment by aerhardt 13 hours ago
Comment by lingrush4 12 hours ago
We hire juniors so that we can offload easy but time-consuming work on them while we focus on more important or more difficult problems. We also expect that juniors will eventually gain the skills to solve the more difficult problems as a result of the experience they gain performing the easy tasks.
If we stop hiring juniors now, then we won't have any good senior engineers in 5-10 years.
Comment by SecretDreams 4 hours ago
Comment by hiddencost 14 hours ago
Comment by iwontberude 13 hours ago
Comment by deepGem 13 hours ago
Now that their minds are free from routine and boilerplate work, they will start asking more 'whys' which will be very good for the organization overall.
Take any product - nearly 50% of the features are unused and it's a genuine engineering waste to maintain those features. A junior dev spending 3 months on the code base with Claude code will figure out these hidden unwanted features, cull them or ask them to be culled.
It'll take a while to navigate the hierarchy but they'll figure it out. The old guard will have no option but to move up or move out.
Comment by throwway120385 13 hours ago
Comment by deepGem 9 hours ago
Comment by alwa 13 hours ago
At some level, aren’t you describing the age-old process of maturing from junior to mid level to senior in most lines of work, and in most organizations? Isn’t that what advancing in responsibility boils down to: developing subtlety and wisdom and political credibility and organizational context? Learning where the rakes are?
I wish 3 months, or even 3 years, were long enough to fully understand the whys and wherefores and politics of the organizations I cross paths with, and the jungle of systems and code supporting all the kinds of work that happen inside…
Comment by simonw 16 hours ago
> The juniors working this way compress their ramp dramatically. Tasks that used to take days take hours. Not because the AI does the work, but because the AI collapses the search space. Instead of spending three hours figuring out which API to use, they spend twenty minutes evaluating options the AI surfaced. The time freed this way isn’t invested in another unprofitable feature, though, it’s invested in learning. [...]
> If you’re an engineering manager thinking about hiring: The junior bet has gotten better. Not because juniors have changed, but because the genie, used well, accelerates learning.
Comment by beAbU 15 hours ago
I would argue a machine that short circuits the process of getting stuck in obtuse documentation is actually harmful long term...
Comment by chaos_emergent 15 hours ago
I would argue a machine that short-circuits the process of getting stuck in obtuse books is actually harmful long term...
Comment by sfpotter 14 hours ago
Comment by calepayson 14 hours ago
Conversations like this are always well intentioned and friction truly is super useful for learning. But the '…' in these conversations always seems to be implying that we should inject friction.
There’s no need. I have peers who aren’t interested in learning at all. Adding friction to their process doesn’t force them to learn. Meanwhile adding friction to the process of my buddies who are avidly researching just sucks.
If your junior isn’t learning it likely has more to do with them just not being interested (which, hey, I get it) than some flaw in your process.
Start asking prospective hires what their favorite books are. It’s the easiest way to find folks who care.
Comment by weakfish 14 hours ago
Comment by sfpotter 14 hours ago
Please read it as: "who knows what you'll find if you take a stop by the library and just browse!"
Comment by alwa 12 hours ago
It’s not as if today’s juniors won’t have their own hairy situations to struggle through, and I bet those struggles will be where they learn too. The problem space will present struggles enough: where’s the virtue in imposing them artificially?
Comment by bee_rider 14 hours ago
Comment by sfpotter 14 hours ago
Comment by bee_rider 10 hours ago
Comment by usefulcat 14 hours ago
Comment by GeoAtreides 14 hours ago
AI, on the other hand...
Comment by deepsquirrelnet 6 hours ago
Eventually we will have to somehow convince AI of new and better ways of doing things. It’ll be propaganda campaigns waged by humans to convince God to deploy new instructions to her children.
Comment by 627467 5 hours ago
And this outcome will be obvious very quickly to most observers, won't it? So the magic will occur either by pushing AI beyond another limit, or by having people go back to specializing in what will eventually become boring and procedural until AI catches up.
Comment by mplewis 12 hours ago
Comment by ori_b 14 hours ago
Comment by thinkingemote 14 hours ago
Comment by Aurornis 14 hours ago
The arguments were similar, too: What will you do if Google goes down? What if Google gives the wrong answer? What if you become dependent on Google? Yet I'm willing to bet that everyone reading this uses search engines as a tool to find what they need quickly on a daily basis.
Comment by CharlieDigital 14 hours ago
Microsoft docs are a really good example of this where just looking through the ToC on the left usually exposes me to some capability or feature of the tooling that 1) I was not previously aware of and 2) I was not explicitly searching for.
The point is that the path to a singular answer can often include discovery of unrelated insight along the way. When you only get the answer to what you are asking, you lose that process of organic discovery of the broader surface area of the tooling or platform you are operating in.
I would liken AI search/summaries to visiting only the well-known, touristy spots. Sure, you can get shuttled to that restaurant or that spot that everyone visits and posts on socials, but in traveling that way, you will miss all of the other amazing food, shops, and sights along the way that you might encounter by walking instead. Reading the docs is more like exploring the random nooks and crannies and finding experiences you weren't expecting and ultimately knowing more about the place you visited than if you had only visited the major tourist destinations.
As a senior-dev, I have generally a good idea of what to ask for because I have built many systems and learned many things along the way. A junior dev? They may not know what to ask for and therefore, may never discover those "detours" that would yield additional insights to tuck into the manifolds of their brains for future reference. For the junior dev, it's like the only trip they will experience is one where they just go to the well known tourist traps instead of exploring and discovering.
Comment by raw_anon_1111 5 hours ago
Comment by Cpoll 13 hours ago
Of course no-one's stopping a junior from doing it the old way, but no-one's teaching them they can, either.
Comment by rafaelmn 14 hours ago
So if AI gets you iterating faster and testing your assumptions/hypothesis I would say that's a net win. If you're just begging it to solve the problem for you with different wording - then yeah you are reducing yourself to a shitty LLM proxy.
Comment by tencentshill 14 hours ago
Comment by macintux 14 hours ago
Maybe. The naturally curious will also typically be slower to arrive at a solution due to their curiosity and interest in making certain they have all the facts.
If everyone else is racing ahead, will the slowpokes be rewarded for their comprehension or punished for their poor metrics?
Comment by rgoulter 8 hours ago
It's always possible to go slower (with diminishing benefits).
Or I think putting it in terms of benefits and risks/costs: I think it's fair to have "fast with shallow understanding" and "slower but deeper understanding" as different ends of some continuum.
I think what's preferable somewhat depends on context & attitude of "what's the cost of making a mistake?". If making a mistake is expensive, surely it's better to take an approach which has more comprehensive understanding. If mistakes are cheap, surely faster iteration time is better.
The impact of LLM tools? LLM tools increase the impact of both cases. It's quicker to build a comprehensive understanding by making use of LLM tools, similar to how stuff like autocompletion or high-level programming languages can speed up development.
Comment by marcosdumay 12 hours ago
Yes. And now you can ask the AI where the docs are.
The struggling is not the goal. And rest assured there are plenty of other things to struggle with.
Comment by PaulKeeble 14 hours ago
Comment by fireflash38 11 hours ago
There's a lot of good documentation where you learn more about the context of how or why something is done a certain way.
Comment by supersour 14 hours ago
Comment by amrocha 13 hours ago
Comment by throwaway613745 14 hours ago
Comment by pizza234 15 hours ago
Comment by Terr_ 14 hours ago
Also the difference between using it to find information versus delegating executive-function.
I'm afraid there will be a portion of workers who crutch heavily on "Now what do I do next, Robot Soulmate?"
Comment by Ifkaluva 14 hours ago
Any task has “core difficulty” and “incidental difficulty”. Struggling with docs is incidental difficulty, it’s a tax on energy and focus.
Your argument is an argument against the use of Google or StackOverflow.
Comment by skydhash 14 hours ago
Complaining about docs is like complaining that a research article is not written like an elementary school textbook.
Comment by tikhonj 14 hours ago
Comment by amrocha 13 hours ago
And yet I wouldn’t trust a single word coming out of the mouth of someone who couldn’t understand Hegel so they read an AI summary instead.
There is value in struggling through difficult things.
Comment by jimbokun 14 hours ago
If you can just get to the answer immediately, what’s the value of the struggle?
Research isn’t time coding. So it’s not making the developer less familiar with the code base she’s responsible for. Which is the usual worry with AI.
Comment by schainks 14 hours ago
If you read great books all the time, you will find yourself more skilled at identifying good versus bad writing.
Comment by bigstrat2003 13 hours ago
Comment by jaapbadlands 14 hours ago
Comment by bigstrat2003 13 hours ago
Comment by twosdai 8 hours ago
It's no different now, just the level of effort required to get code to copy is lower.
Whenever I use AI I sit and read and understand every line before pushing. It's not hard. I learn more.
Comment by tayo42 14 hours ago
Comment by seanmcdirmid 14 hours ago
1995: struggling with docs and learning how and where to find the answers part of the learning process
2005: struggling with stackoverflow and learning how to find answers to questions that others have asked before quickly is part of the learning process
2015: using search to find answers is part of the learning process
2025: using AI to get answers is part of the learning process
...
Comment by lifeformed 13 hours ago
Comment by seanmcdirmid 12 hours ago
Comment by tester756 12 hours ago
XML oriented programming and other stuff was "invented" back then
Comment by wizzwizz4 13 hours ago
To the extent that learning to punch your own punch cards was useful, it was because you needed to understand the kinds of failures that would occur if the punch cards weren't punched properly. However, this was never really a big part of programming, and often it was off-loaded to people other than the programmers.
In 1995, most of the struggling with the docs was because the docs were of poor quality. Some people did publish decent documentation, either in books or digitally. The Microsoft KB articles were helpfully available on CD-ROM, for those without an internet connection, and were quite easy to reference.
Stack Overflow did not exist in 2005, and it was very much born from an environment in which search engines were in use. You could swap your 2005 and 2015 entries, and it would be more accurate.
No comment on your 2025 entry.
Comment by seanmcdirmid 13 hours ago
I thought all computer scientists heard about Dijkstra making this claim at one time in their careers. I guess I was wrong? Here is the context:
> A famous computer scientist, Edsger Dijkstra, did complain about interactive terminals, essentially favoring the disciplined approach required by punch cards and batch processing.
> While many programmers embraced the interactivity and immediate feedback of terminals, Dijkstra argued that the "trial and error" approach fostered by interactive systems led to sloppy thinking and poor program design. He believed that the batch processing environment, which necessitated careful, error-free coding before submission, instilled the discipline necessary for writing robust, well-thought-out code.
> "On the Cruelty of Really Teaching Computing Science" (EWD 1036) (1988 lecture/essay)
Seriously, the laments I hear now have been the same throughout my entire career as a computer scientist. Let's just look ahead to 2035, when someone on HN will complain that some old way of doing things is better than the new way because it's harder and wearing hair shirts is good for building character.
Comment by linksnapzz 14 hours ago
Now get back to work.
Comment by lokar 15 hours ago
But for many Jr engineers it’s the hard part. They are not (yet) expected to be responsible for the larger issues.
Comment by bdangubic 15 hours ago
Comment by dclowd9901 15 hours ago
But these are the things people learn through experience and exposure, and I still think AI can help by at least condensing the numerous books out there around technology leadership into some useful summaries.
Comment by lokar 10 hours ago
Doing backend and large distributed systems is, it seems to me, much deeper: types of consistency and their tradeoffs in practice, details like implementing and correctly using Lamport clocks, good API design, endless details about reworking, on and on.
And then for both, a learned sense of what approaches to system organization will work in the long run (how to avoid needing to stage a re-write every 5 years).
Comment by ehnto 5 hours ago
Comment by phist_mcgee 5 hours ago
Gatekeeping?
Why couldn't a backend team have all tasks be junior compatible, if uncoupled from deadlines and time constraints?
Comment by ekkeke 15 hours ago
There is such a thing as software engineering skill and it is not domain knowledge, nor knowledge of a specific codebase. It is good taste, an abstract ability to create/identify good solutions to a difficult problem.
Comment by ehnto 5 hours ago
In a long term enterprise the point is building a long term skillset into the community. Bolstering your team's hive mind on a smaller scale also.
But work has evolved and the economy has become increasingly hostile to long term building, making it difficult to get buy in for efforts that don't immediately get work done or make money.
Comment by lokar 10 hours ago
Comment by bdangubic 12 hours ago
Comment by ekkeke 10 hours ago
Comment by bdangubic 7 hours ago
how does one junior acquire engineering skills except through experience as you said?
Comment by Aperocky 14 hours ago
Good luck maintaining that.
Comment by bdangubic 12 hours ago
Comment by jason_oster 10 hours ago
Beware, your ego may steer you astray.
Comment by bdangubic 7 hours ago
Comment by butwhyth0oo 15 hours ago
Comment by atomicnumber3 12 hours ago
But d'you know what's really great at taking a bunch of tokens and then giving me a bunch of probabilistically adjacent tokens? Yeah, exactly! So often, even if the AI is giving me something totally bonkers semantically, just knowing all those tokens are adjacent enough gives me a big leg up in knowing how to phrase my next question, and of course sometimes the AI is also accidentally semantically correct too.
Comment by never_inline 2 hours ago
Comment by ChuckMcM 14 hours ago
How are we defining "learning" here? The example I like to use is that a student who "learns" what a square root is, can calculate the square root of a number on a simple 4 function calculator (x, ÷, +, -) if iteratively. Whereas the student who "learns" that the √ key gives them the square root, is "stuck" when presented with a 4 function calculator. So did they 'learn' faster when the "genie" surfaced a key that gave them the answer? Or did they just become more dependent on the "genie" to do the work required of them?
Comment by pests 11 hours ago
I graduated HS in mid 2000s and didn't start using a calculator for math classes until basically a junior in college. I would do every calculation by hand, on paper. I benefited from a great math teacher early on that taught me how to properly lay out my calculations and solutions on paper. I've had tests I've turned in where I spent more paper on a single question than others did on the entire test.
It really helped my understanding of numbers and how they interacted, and helped teachers/professors narrow down on my misunderstandings.
Comment by jacquesm 8 hours ago
This aspect is entirely missing when you use an oracle.
Comment by sailfast 13 hours ago
It’s diamond age and a half - you just need to continue to be curious and perhaps slow your shipping speed sometimes to make sure you budget time for learning as well.
Comment by simonw 13 hours ago
Comment by almosthere 15 hours ago
Comment by golly_ned 8 hours ago
This really isn't the case from what I've seen. It's that they use Cursor or other code generation tools integrated into their development environment to generate code, and if it's functional and looks from a fuzzy distance like 'good' code (in the 'code in the small' sense), they send an oversized PR, and it's up to the reviewer to actually do the thinking.
Comment by rustystump 8 hours ago
The crux is always that you don't know what you don't know. AI doesn't fix this.
Comment by sharemywin 14 hours ago
Comment by xp84 10 hours ago
Ehh... 'used well' is doing some very heavy lifting there. And the incentive structure at 90% of companies does not optimize for 'using it well.'
The incentive is to ship quickly, meaning aim the AI-gun at the codebase for a few hours and grind out a "technically working" solution, with zero large-scale architecture thought and zero built-up knowledge of how the different parts of the application are intended to work together (because there was no "intention"). There will be tests, but they may not be sensible and may be very brittle.
Anyway, deploying a bunch of fresh grads armed not with good mentorship but with the ability to generate thousands of LOC a day is a recipe for accelerating the collapse I usually see in startup codebases about 6-8 years old. This is the point where the list of exceptions to every supposed pattern is longer than the list of things that follow the patterns, and where each bug, when properly pursued, leads to a long chain of past bad decisions, each of which would take days of effort to properly unwind (and that unwinding will also have a branching effect on other things). Also, coincidentally, this is the point where an AI agent is the most useless, because they really don't expect all the bizarre quirks in the codebase.
Am I saying AI is useless? No, it's great for prototyping and getting to PMF, and great in the hands of someone who can read its output with a critical eye, but I wouldn't combine it with inexperienced users who haven't had the opportunity to learn from all the many mistakes I've made over the years.
Comment by zahlman 9 hours ago
Comment by SkyPuncher 14 hours ago
I hate to be so negative, but one of the biggest problems junior engineers face is that they don't know how to make sense of or prioritize the glut of new-to-them information to make decisions. It's not helpful to have an AI reduce the search space because they still can't narrow down the last step effectively (or possibly independently).
There are junior engineers who seem to inherently have this skill. They might still be poor in finding all necessary information, but when they do, they can make the final, critical decision. Now, with AI, they've largely eliminated the search problem so they can focus more on the decision making.
The problem is it's extremely hard to identify who is what type. It's also something that senior level devs have generally figured out.
Comment by GeoAtreides 14 hours ago
This is "the kids will use the AI to learn and understand" level of cope
no, the kids will copy and paste the solution then go back to their preferred dopamine dispenser
Comment by CuriouslyC 14 hours ago
There might be value in learning from failure, but my guess is that there's more value in learning from success, and if the LLM doesn't need me to succeed my time is better spent pushing into territory where it fails so I can add real value.
Comment by jplusequalt 13 hours ago
I would argue you're learning less than you might believe. Similarly to how people don't learn math by watching others solve problems, you're not going to learn to become a better engineer/problem solver by reading the output of ChatGPT.
Comment by CuriouslyC 10 hours ago
Regarding leveling up as an engineer, at this point in my career it's called management.
Comment by skydhash 14 hours ago
This is an example of a book on Common Lisp
https://gigamonkeys.com/book/practical-a-simple-database
What you usually do is follow the book's instructions and get some result, then go do some exploration on your own. There's no walking in the dark trying to figure out your own path.
Once you learn what works, and what does not, you'll have a solid foundation to tackle more complex subjects. That's the benefit of having a good book and/or a good teacher to guide you on the path to mastery. Using a slot machine is more tortuous than that.
Comment by CuriouslyC 14 hours ago
Also, for a lot of things, that is how people learn because there aren't good textbooks available.
Comment by skydhash 14 hours ago
I was helping a few people get started with an Android development bootcamp, and just being able to run the default example and get their bearings around the IDE was interesting to them. And I remember when I was first learning Python: just doing basic variable declarations and arithmetic was interesting. Same with learning C and being able to write tic-tac-toe.
I think a lot of harm is being done by giving beginners expectations that would befit people with years of experience. Like telling someone who doesn't even know Linux exists, or has never encountered the word POSIX, that they can learn Docker in 2 months.
Please do read the following article: https://www.norvig.com/21-days.html
Comment by switchbak 12 hours ago
Just as some might pull the answers from the back of the textbook, the interesting ones are the kids who want to find out why certain solutions are the way they are.
Then again I could be wrong, I try hard to stay away from the shithose that is the modern social media tech landscape (TikTok, Insta, and friends) so I'm probably WAY out of touch (and I prefer it that way).
Comment by simonw 13 hours ago
Comment by ares623 13 hours ago
Comment by lanfeust6 16 hours ago
Comment by alpha_squared 15 hours ago
Comment by sublinear 15 hours ago
Comment by gosub100 14 hours ago
Comment by simonw 13 hours ago
Comment by nothrabannosir 2 hours ago
Comment by simonw 2 hours ago
That's why "copy page" buttons are increasingly showing on manual pages eg. https://platform.claude.com/docs/en/get-started
Comment by gnerd00 15 hours ago
Since desktop computers became popular, there have been thousands of small to mid-size companies that could benefit from software systems.. A thousand thousand "consultants" marched off to their nearest accountant, retailer, small manufacturer or attorney office, to show off the new desktop software and claim ability to make new, custom solutions.
We know now, this did not work out for a lot of small to mid-size business and/or consultants. Few could build a custom database application that is "good enough" .. not for lack of trying.. but pace of platforms, competitive features, stupid attention getting features.. all of that, outpaced small consultants .. the result is giant consolidation of basic Office software, not thousands of small systems custom built for small companies.
What now, in 2025? "junior" devs do what? design and build? no. Cookie-cutter procedures at AWS lock-in services far, far outpace small and interesting designs of software.. Automation of AWS actions is going to be very much in demand.. is that a "junior dev" ? or what?
This is a niche insight and not claiming to be the whole story.. but.. ps- insert your own story with "phones" instead of desktop software for another angle
Comment by mlloyd 15 hours ago
Lotus Notes is an example of that custom software niche that took off and spawned a successful consulting ecosystem around it too.
Comment by dylan604 15 hours ago
TIL Notes is still a thing. I had thought it was dead and gone some time ago.
Comment by nateglims 15 hours ago
Comment by gnerd00 15 hours ago
I did not write "all software" or "enterprise software" but you are surprised I said that... hmmm
Comment by imiric 13 hours ago
"AI" tools are most useful in the hands of experienced developers, not juniors. It's seniors who have the knowledge and capability to review the generated output, and decide whether the code will cause more issues when it's merged, or if it's usable if they tweak and adapt it in certain ways.
A junior developer has no such skills. Their only approach will be to run the code, test whether it fulfills the requirements, and, if they're thorough, try to understand and test it to the best of their abilities. Chances are that because they're pressured to deliver as quickly as possible to impress their colleagues and managers, they'll just accept whatever working solution the tool produces the first time.
This makes "AI" in the hands of junior developers risky and counterproductive. Companies that allow this type of development will quickly grind to a halt under the weight of technical debt, and a minefield of bugs they won't know how to maneuver around.
The unfortunate reality is that with "AI" there is no pathway for junior developers to become senior. Most people will gravitate towards using these tools as a crutch for quickly generating software, and not as a learning tool to improve their own skills. This should concern everyone vested in the future of this industry.
Comment by versteegen 3 hours ago
This is also a supremely bad take... well, really it's mainly the way you worded it that's bad. Juniors have skills, natural aptitudes, as much intelligence on average as other programmers, and often even some experience, but what they lack is work history. They sure as hell are capable of understanding code rather than just running it. Yes, of course experience is immensely useful, most especially at understanding how to achieve a maintainable and reliable codebase in the long term, which is obviously of special importance, but long experience is not a hard requirement. You can reason about trade-offs, learn from advice, learn quickly, etc.
Comment by imiric 1 hour ago
Comment by amrocha 13 hours ago
Comment by snarf21 12 hours ago
Comment by Yodel0914 12 hours ago
Comment by thinkingtoilet 11 hours ago
Comment by amrocha 33 minutes ago
Comment by ivape 15 hours ago
Because that makes the most business sense.
Comment by irishcoffee 14 hours ago
Comment by bgwalter 15 hours ago
https://substack.com/@kentbeck
What software projects is he actively working on?
Comment by helsinkiandrew 15 hours ago
Comment by dlisboa 15 hours ago
Comment by umanwizard 15 hours ago
Comment by bgwalter 15 hours ago
Comment by repler 15 hours ago
In many cases he helped build the bandwagons you're implying he simply jumped onto.
Comment by exasperaited 15 hours ago
The fact that I cannot tell if you mean this satirically or not (though I want to believe you do!) is alarming to me.
Comment by psunavy03 15 hours ago
Comment by bigfishrunning 11 hours ago
Comment by imiric 14 hours ago
Comment by psunavy03 13 hours ago
Of course he can be wrong; he's human. That wasn't my point.
Comment by bgwalter 12 hours ago
Comment by switchbak 12 hours ago
The thrust of the issue is that, when used suitably, AI tools can increase the rate of learning enough to change the economics of investing in junior developers - in a good way, contrary to how these tools have been discussed in the mainstream. That is an interesting take, and worthy of discussion.
Your appeal to authority is out of place here and clearly uninformed, thus the downvotes.
Comment by bgwalter 12 hours ago
What I did not know and what the Wikipedia page revealed is that he worked for a YCombinator company. Thus the downvotes.
Comment by orliesaurus 15 hours ago
Comment by orliesaurus 15 hours ago
Comment by bluGill 15 hours ago
Comment by JKCalhoun 14 hours ago
Just an example. I've been in so many code bases over the years… I had a newer engineer come aboard who, when he saw some code I recently wrote with labels (!), kind of blanched. He thought "goto == BAD". (We're talking "C" here.)
But this was code that dealt with Apple's CoreFoundation. More or less every call to CF can fail (which means returning NULL in the CF API). And (relevant) passing NULL to a CF call, like when trying to append a CF object to a CF array, was a hard crash. CF does no param checking. (Why, that would slow it down—you, dear reader, are to do all the sanity checking.)
So you might have code similar to:
    CFMutableDictionaryRef dict = NULL;
    dict = CFDictionaryCreateMutable(kCFAllocatorDefault, 0,
                                     &kCFTypeDictionaryKeyCallBacks,
                                     &kCFTypeDictionaryValueCallBacks);
    if (!dict)
        goto bail;
You would likely continue to create arrays, etc., insert them into your dictionary, and maybe return the dictionary at the end. And again, you checked for NULL at every call to CF, and goto bail if needed. Down past 'bail' you could CFRelease() all the non-NULL instances that you do not return. This was how we collected our own garbage. :-)
In any event, goto labels made the code cleaner: your NULL-checking if-statements did not have to nest crazy deep.
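Filled out, the pattern looks roughly like this (an untested sketch; the "hosts" key and the array are made up purely for illustration, only the shape matters):

    #include <CoreFoundation/CoreFoundation.h>

    /* Sketch of the goto-bail cleanup pattern described above. */
    static CFDictionaryRef CopySettings(void) {
        CFMutableDictionaryRef dict  = NULL;
        CFMutableArrayRef      hosts = NULL;

        dict = CFDictionaryCreateMutable(kCFAllocatorDefault, 0,
                                         &kCFTypeDictionaryKeyCallBacks,
                                         &kCFTypeDictionaryValueCallBacks);
        if (!dict)
            goto bail;

        hosts = CFArrayCreateMutable(kCFAllocatorDefault, 0, &kCFTypeArrayCallBacks);
        if (!hosts)
            goto bail;

        /* Every create was checked, so nothing NULL ever reaches CF. */
        CFDictionarySetValue(dict, CFSTR("hosts"), hosts);
        CFRelease(hosts);                  /* the dictionary retains it */
        return dict;                       /* caller releases */

    bail:
        /* One cleanup point: release whatever actually got created. */
        if (hosts)
            CFRelease(hosts);
        if (dict)
            CFRelease(dict);
        return NULL;
    }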
The new engineer admitted surprise that there might be a place for labels. (Or, you know, CF could have been more NULL-tolerant and simply exited gracefully.)
Comment by al_borland 8 hours ago
A very basic example were the interns who constantly tried to use Google Docs for everything, their personal accounts no less. I had to stop them and point them back to MS Office at least a dozen times.
In other situations, people will try and use free tools that don’t scale well, because that’s what they used in college or as a hobby. It can take a lot of work to point them to the enterprise solution that is already approved and integrated with everything. A basic example of this would be someone using Ansible from their laptop when we have Ansible Automation Platform, which is better optimized for running jobs around the globe and automatically logs to Splunk to create an audit trail.
Comment by salawat 15 hours ago
Comment by SketchySeaBeast 14 hours ago
Comment by positr0n 13 hours ago
Comment by orliesaurus 13 hours ago
Comment by agoodusername63 12 hours ago
Comment by codegeek 15 hours ago
Sorry, what does that mean exactly ? Are you claiming that a junior dev knows how to ask the right prompts better than a Senior dev ?
Comment by __s 15 hours ago
Overall I don't quite agree. Personally this applies to me, I've been using vim for the last decade so any AI tooling that wants me to run some electron app is a non starter. But many of my senior peers coming from VS Code have no such barriers
Comment by citrin_ru 14 hours ago
Comment by YetAnotherNick 14 hours ago
Comment by sailfast 13 hours ago
You won’t need Vim except to review changes and tweak some things if you feel like it.
Comment by francisofascii 13 hours ago
Comment by perfmode 14 hours ago
Comment by bongodongobob 14 hours ago
Comment by whazor 15 hours ago
For medium or small companies, these guardrails or documentation can be missing. In that case you need experienced people to help out.
Comment by WestCoader 15 hours ago
Comment by Mountain_Skies 15 hours ago
Comment by raincole 15 hours ago
Comment by butwhyth0oo 15 hours ago
Comment by debo_ 15 hours ago
Comment by red-iron-pine 15 hours ago
"bespoke, hand generated content straight to your best readers"
Comment by DiscourseFan 15 hours ago
Comment by JKCalhoun 14 hours ago
Me too. Fire your senior devs. (Ha ha, not ha ha.)
Comment by Ancalagon 14 hours ago
Cannot wait for the 'Oh dear god everything is on fire, where is the senior dev?' return pay packages.
Comment by ch2026 15 hours ago
Comment by 9rx 10 hours ago
While anyone is free to define words as they so please, most people consider those with the most experience to be seniors. I am pretty sure that has been the message around this all along: Do not cut the seniors. The label you choose isn't significant. Whether you want to call them juniors or seniors, it has always been considered to make no sense to cut those with the most experience.
Comment by dragonwriter 10 hours ago
Comment by 9rx 10 hours ago
Comment by dragonwriter 10 hours ago
Comment by 9rx 8 hours ago
Which is nothing new. It has always been understood that it is valuable to have experienced people on board. The "cut the juniors" talk has never been about letting those who offer value go. Trying to frame it as being about those who offer experiential value, just not in the places you've arbitrarily chosen, is absurd.
Comment by zahlman 8 hours ago
Aside from the absurdity of this claim, consider how many years of experience a "senior" is typically expected to have, and then consider how long even ChatGPT has been available to the public, never mind SOTA coding agents.
Comment by lvl155 14 hours ago
I think LLMs are a reflection of human intelligence. If we humans become dumber as a result of LLMs, LLMs will also become dumber. I'd like to think that in some dystopian world, LLMs trained on pre-2023 data will be sought after.
Comment by thunky 7 hours ago
Ironic because the junior has much more to lose. The 50+ can probably coast across the finish line.
Comment by yieldcrv 15 hours ago
I would have agreed with you 100% one year ago. Basically, senior engineers were too complacent to look at AI tools, as well as ego-driven about them, all while corporate policy disincentivized them from using anything at all, with maybe a forced Copilot subscription. Junior engineers, meanwhile, would take the risk that the corporate monitoring of cloud AI tools isn't that robust.
But now, although many of those organizations are still the same (with more contrived Copilot subscriptions), I think senior engineers are skirting corporate policy too and becoming more familiar with the tools.
I'm also currently in an organization that is a total free for all with as many AI coding and usage tools as necessary to deliver faster. So I could be out of touch already.
Perhaps more complacent firms are the same as they were a year ago.
Comment by kakacik 15 hours ago
Coding in any sufficiently large organization is never the main part of a senior's time spent, unless it's some code sweatshop. Juniors can do little to none of all that remaining glue that makes projects go from a quick brainstorming meeting to a live, well-functioning, and supported product.
So as for worth: companies can, in non-ideal fashion obviously, work without juniors. I can't imagine them working without seniors, unless it's some sort of quick churn of CRUD apps or e-shops from templates.
Also, there is this little topic that resonates across various recent research: knowledge gained fast via LLMs is shallow, doesn't last that long, and doesn't go deeper. One example out of many: any time I had to do some more sophisticated regex-based processing, I dived deep into the specs, the implementation, etc., and a few times pushed it to the limits (or realized the task was beyond what regex can do), instead of just being given the result, copy-pasting it, and moving along because some basic test succeeded. Spread this approach across many other complex topics. That's also a view on the long-term future of companies.
I get what you say and I agree partially, but it's a double-edged sword.
Comment by neilv 10 hours ago
Would that experience be from cheating on their homework? Are you sure that's the skill you want to prioritize?
> “Number two, they're usually the least expensive because they're right out of college, and they generally make less. So if you're thinking about cost optimization, they're not the only people you would want to optimize around.”
Hahaha. Sounds like a threat. Additional context: Amazon has a history of stack ranking and per-manager culling quotas, and not as much of a reputation for caring about employees as Google had.
> “Three, at some point, that whole thing explodes on itself. If you have no talent pipeline that you're building and no junior people that you're mentoring and bringing up through the company, we often find that that's where we get some of the best ideas.”
I thought the tech industry had given up on training and investing in juniors for the long term, since (the thinking goes) most of them will job-hop within 18 months no matter how well you nurture them. Instead, most companies are hiring for the near-term productivity they can get, very transactionally.
Does AWS have good long-term retention of software engineers?
Comment by rossdavidh 10 hours ago
Comment by neilv 10 hours ago
Comment by simgt 21 minutes ago
Comment by byzantinegene 5 hours ago
Comment by pnathan 15 hours ago
But I don't learn. That's not what I'm trying to do; I'm trying to fix the bug. Hmm.
I'm pretty sure AI is going to lead us to a deskilling crash.
Food for thought.
Comment by omnimus 14 hours ago
Comment by thunky 6 hours ago
AI is an excellent teacher for someone that wants to learn.
Comment by zahlman 8 hours ago
Nothing is preventing you from studying how the bugfix works once it's in place.
Nor is there any reason this use of AI should cause you to lose skills you already have.
Comment by Karliss 14 minutes ago
Comment by golly_ned 8 hours ago
It's like reading the solution to a math proof instead of proving it yourself. Or writing a summary of a book compared to reading one. The effort towards seeing the design space and choosing a particular solution doesn't exist; you only see the result, not the other ways it could've been. You don't get a feedback loop to learn from either, since that'll be AI generated too.
It's true there's nothing stopping someone from going back and trying to solve it themselves to get the same kind of learning, but learning the bugfix (or whatever change) by studying it once in place just isn't the same.
And things don't work like that in practice, any more than things like "we'll add tests later" end up being followed through with any regularity. If you fix a bug, the next thing for you to do is to fix another bug, or build another feature, write another doc, etc., not dwell on work that was already 'done'.
Comment by hyperadvanced 6 hours ago
I'm not a super heavy AI user, but I've vibe coded a few things for the frontend with it. It has helped me understand a little better how you lay out React apps and how the Legos that React gives you work. Probably far less than if I had done it from scratch and read a book, but sometimes a working prototype is so much more valuable to a product initiative than learning a programming language that you would be absolutely burning time and value not to vibe code the prototype.
Comment by rudnevr 7 hours ago
Comment by deepspace 14 hours ago
That's my thought too. It's going to be a triple whammy
1. Most developers (Junior and Senior) will be drawn in by the temptation of "let the AI do the work", leading to less experience in the workforce in the long term.
2. Students will be tempted to use AI to do their homework, resulting in new grads who don't know anything. I have observed this happen first hand.
3. AI-generated (slop) code will start to pollute Github and other sources used for future LLM training, resulting in a quality collapse.
I'm hoping that we can avoid the collapse somehow, but I don't see a way to stop it.
Comment by JeremyNT 15 hours ago
The thing with juniors is: those who are interested in how stuff works now have tools to help them learn in ways we never did.
And then it's the same as before: some hires will care and improve, others won't. I'm sure that many juniors will be happy to just churn out slop, but the stars will be motivated on their own to build deeper understanding.
Comment by BeFlatXIII 13 hours ago
Comment by PaulStatezny 12 hours ago
The neural connections (or lack of them) have longer term comprehension-building implications.
Comment by pnathan 11 hours ago
Comment by pphysch 14 hours ago
It should probably be supplemented with some good old RTFM, but it does get us somewhat beyond the "blind leading the blind" StackOverflow paradigm of most software engineering.
Comment by frostiness 16 hours ago
Comment by mjr00 15 hours ago
Even more recently we had this with radiologists, a profession that was supposed to be crushed by deep learning and neural networks. A quick Google search says an average radiologist in the US currently makes between $340,000 and $500,000 per year.
This might be the professional/career version of "buy when there's blood in the streets."
Comment by avgDev 15 hours ago
I had a job lined up before graduating. Now make high salary for the area, work remotely 98% of the time and have flexible schedule. I'm so glad I didn't listen to that guy.
Comment by dylan604 15 hours ago
Comment by realityfactchex 12 hours ago
Comment by filoleg 13 hours ago
I was still in high school in 2010, and was told the same thing about outsourcing to India/SEA/etc. making a CS degree/career (in the US) a terrible choice. It wasn't just random people saying this either: I was reading about it in the news and online, and had some family acquaintances with allegedly former software dev careers, etc. I didn't listen, and I am glad I didn't.
As I was graduating from college, and deep learning was becoming a new hot thing, I heard the same thing about radiologists, and how they are all getting automated away in the next 5 years. I had no plans to go to med school, and I didn't know anyone at the time who went through it, so I didn't know much about the topic. On the surface, it seemed like a legitimate take, and I just stored it in my head as "sounds about right."
Cut to now: I know more than a few people who went through med school, and am in general more attuned to the market. Turns out, all of that was just more genpop hype; those news articles about "omg radiologists are all getting replaced by computers" stopped showing up in any of my news feeds, and not a single radiology-specialized med school graduate I know had any issues getting a job (one that paid significantly more than an entry-level position at a FAANG).
I have zero idea what point I was trying to make with this comment, but your examples mirror my personal experience with the topic really well.
Comment by codegeek 15 hours ago
So I think that a lot of juniors WILL get replaced by AI not because they are junior necessarily but because a lot of them won't be able to add great value compared to a default AI and companies care about getting the best value from their workers. A junior who understands this and does more than the bare minimum will stand out while the rest will get replaced.
Comment by mjr00 13 hours ago
Again this is what people said about outsourced developers. 2008 logic was, why would anyone hire a junior for $50k/year when you could hire a senior with 20 years experience for $10k/year from India?
Reality: for 5+ years you could change careers by taking a 3-6 month JavaScript bootcamp and coming out the other end with a $150k job lined up. That's just how in demand software development was.
Comment by hrimfaxi 15 hours ago
At the end of the day, radiologists are still doctors.
Comment by symlinkk 13 hours ago
Comment by sublinear 15 hours ago
You can either bet on the new unproven thing claiming to change things overnight, or just do the existing thing that's working right now. Even if the new thing succeeds, an overnight success is even more unrealistic. The insight you gain in the meantime is valuable for you to take advantage of what that change brings. You win either way.
Comment by bluGill 15 hours ago
There can sometimes be too much competition, but often there is only the illusion of too much if you don't look at quality. You can find a lot of cheap engineers in India, but if you want a good quality product you will have to pay a lot more.
Comment by ravenstine 15 hours ago
Hell, I should probably be studying how to be a carpenter given the level at which companies are pushing vibe coding on their engineers.
Comment by bonzini 12 hours ago
Comment by simonw 15 hours ago
That's such a terrible trend.
Reminds me of my peers back in ~2001 who opted not to take a computer science degree even though they loved programming because they thought all the software engineering jobs would be outsourced to countries like India and there wouldn't be any career opportunities for them. A very expensive mistake!
Comment by roncesvalles 15 hours ago
Comment by DiscourseFan 15 hours ago
Comment by omnimus 14 hours ago
Comment by platevoltage 1 hour ago
Comment by andrewl-hn 10 hours ago
Historically, these candidates have been the hiring sweet spot: less risky than brand-new engineers, still inexperienced enough to be efficiently molded into your bespoke tools and processes and turned into long-term employees, and still very cheap.
Comment by Nextgrid 15 hours ago
It's similar to all those people who were hyping up blockchain/crypto/NFTs/web3 as the future, and now that it all came to pass they adapted to the next grift (currently it's AI). He is now toning down his messaging in preparation of a cooldown of the AI hype to appear rational and relevant to whatever comes next.
Comment by seg_lol 15 hours ago
Comment by mattgreenrocks 14 hours ago
Pointing out that it wasn't always that way will make you seem "negative."
Comment by seg_lol 14 hours ago
Comment by ay 9 hours ago
https://www.business-standard.com/amp/world-news/amazon-euro...
Comment by fullshark 15 hours ago
Comment by burningChrome 15 hours ago
Considering the talk around junior devs lately on HN (that there are way too many of them), it would indeed be amusing.
Comment by raincole 15 hours ago
To what?
Comment by kachapopopow 3 hours ago
Comment by ok123456 15 hours ago
Comment by butwhyth0oo 15 hours ago
Comment by epolanski 15 hours ago
I fear that unless you heavily invest in them and follow them, they might be condemned to have decades of junior experience.
Comment by tayo42 14 hours ago
You can describe pre-AI developers like that too. It's probably my biggest complaint about some of my co-workers.
Comment by epolanski 11 hours ago
Comment by PartiallyTyped 15 hours ago
In my view there's two parts to learning, creation and taste, and both need to be balanced to make progress. Creation is, in essence, the process of forming pathways that enable you to do things, developing taste is the process of pruning and refining pathways to doing things better.
You can't become a chef without cooking, and you can't become a great one without cultivating a taste (pun intended) for what works and what it means for something to be good.
From interactions with our interns and new grads, they lack the taste and rely too much on the AI for generation. The consequence is that when you have conversations with them, they struggle to understand the concepts and tools they are using, because they lack the familiarity that comes with creation, and they lack the skills to refine the produced code into something good.
Comment by bespokedevelopr 12 hours ago
I switched over to consulting/contracting so I don't have the visibility they do, but my work is heavily dependent on LLMs. However, I don't see it wiping out the industry, but rather making people more efficient.
They do have much more robust tooling around their LLMs and internal products, though, which has automated much of their workflows, and that is, I believe, where the concern is coming from. They can see firsthand how much of their job has turned into reviewing outputs and feeding outputs into other tools. A shift in skills, but not a fully automated solution yet.
It’s hard to gauge where things are going and where we’ll be in 5 years. If we only get incremental improvements there’s still huge gains to be made in building out tooling ecosystems to make this all better.
What does that look like for new college grads though? How much of this is really computer science if you are only an llm consumer?
Comment by jakub_g 10 hours ago
It's not really the work that LLMs currently do. I mean sure, maybe if you plug an LLM to read all emails and slacks and zoom transcripts of the entire company, it could do it at some point in the future. But would it have the same amount of influence compared to an industry & company veteran who has the company specific knowledge and experience that is nowhere written down?
Comment by deadbabe 10 hours ago
Comment by user3939382 11 hours ago
Comment by benjismith 11 hours ago
The advent of agentic coding is probably punch #2 in the one-two punch against juniors, but it's an extension of a pattern that's been unfolding for probably 5+ years now.
Comment by siliconc0w 11 hours ago
"If your seniors are resisting AI and saying it doesn't work, replace them with AI-native engineers!"
"AI will replace all junior software developers"
"AI will be a tool to help junior software developers"
Eventually we will get to:
"AI requires and will likely to continue to require pretty heavy hand holding and is not a substitute for building and maintaining independent subject matter expertise"
Comment by alecco 15 hours ago
"Amazon announces $35 billion investment in India by 2030 to advance AI innovation, create jobs" https://www.aboutamazon.com/news/company-news/amazon-35-bill... (Dec 9 2025)
Comment by la64710 15 hours ago
Comment by reop2whiskey 13 hours ago
Comment by xp84 10 hours ago
Yes, killing your talent pipeline is a horrible idea. But that's Future CEO's problem. When we need new seniors to backfill natural attrition, we can poach them from competitors.
And juniors don't make that much less money, either. Sure, there are people who do light frontend work on Wordpress sites and stuff, who make a lot less. But at my place of work, when we had junior SWEs (we either developed them into seniors in the past 3 years or let them attrition), they were making about ¾ of what seniors make. So, you can pay 4 juniors or you can pay 2-3 seniors. Arguably 1 senior using AI will be a lot more sustainable than 4 juniors burning tokens all day trying to get Cursor to do things they don't really even understand and can't evaluate effectively.
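(Quick arithmetic on that ¾ figure: if a senior costs S per year, four juniors cost roughly 4 × 0.75S = 3S, so the same budget covers about four juniors or three seniors.)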
Anyway I completely agree that all of this, especially eliminating the bottom 2 steps of the career ladder for engineers, is horrible for our entire industry. But our incentive structure will richly reward companies for doing this. Stock price go up. Let Future CEO worry about it.
Comment by stillworks 1 hour ago
Just yesterday had a coding interview (not any FAANG) and the interviewer wanted a screen share and also checked my IDE settings to make sure "AI" was turned off.
Not that I intended to or even intend to use LLM based tooling for interviews.
Although, having said that, if I intended to, the interviewer wouldn't find out. Interviews should always be done in person. (That took a different tangent... sorry)
Comment by kunley 12 hours ago
Comment by mrcsharp 12 hours ago
Comment by cowsandmilk 16 hours ago
Comment by stargrazer 7 hours ago
Comment by harshaw 12 hours ago
Comment by aposm 15 hours ago
Comment by twostorytower 16 hours ago
Comment by jsheard 16 hours ago
Comment by steveBK123 15 hours ago
Comment by azemetre 16 hours ago
Comment by riskable 15 hours ago
If you want to complain about tech companies ruining the environment, look towards policies that force people to come into the office. Pointless commutes are far, far worse for the environment than all data centers combined.
Complaining about the environmental impact of AI is like plastic manufacturers putting recycling labels on plastic that is inherently not recyclable, making it seem like plastic pollution is everyday people's fault for not recycling enough.
AI's impact on the environment is so tiny it's comparable to a rounding error when held up against the output of, say, global shipping or air travel.
Why don't people get this upset at airport expansions? They're vastly worse.
Comment by cakealert 1 hour ago
It helps when you put yourself in the shoes of people like that and ask yourself, if I find out tomorrow that the evidence that AI is actually good for the environment is stronger, will I believe it? Will it even matter for my opposition to AI? The answer is no.
Comment by azemetre 12 hours ago
I'm glad people are grabbing the reins of power back from some of the most evil people on the planet.
Comment by jnd-cz 15 hours ago
Comment by jraph 15 hours ago
We do too, don't worry.
Comment by ghc 16 hours ago
Comment by marcosdumay 16 hours ago
Comment by phyzix5761 16 hours ago
Comment by tastyfreeze 16 hours ago
Comment by simonw 16 hours ago
I'm a big fan of the "staff engineer" track as a way to avoid this problem. Your 10-15 year engineers who don't vibe with management should be able to continue earning managerial salaries and having the biggest impact possible.
I'm also a fan of leadership without management. Those experienced engineers should absolutely be taking on leadership responsibilities - helping guide the organization, helping coach others, helping build better processes. But they shouldn't be stuck in management tasks like running 1-1s and looking after direct reports and spending a month every year on the annual review process.
Comment by ryandrake 15 hours ago
Comment by RevEng 15 hours ago
Comment by tastyfreeze 15 hours ago
Comment by grogenaut 15 hours ago
Comment by otikik 16 hours ago
Comment by nextworddev 15 hours ago
Comment by DiscourseFan 15 hours ago
Comment by philipwhiuk 16 hours ago
Comment by rippeltippel 13 hours ago
Such as (cough...) Amazon?
Comment by turtletontine 12 hours ago
Comment by reassess_blind 2 hours ago
Comment by geodel 15 hours ago
Comment by klipklop 14 hours ago
Comment by stockresearcher 14 hours ago
Comment by klipklop 12 hours ago
We can also assume once these coding models get good enough they will not be shared with the general public or competitors.
Comment by israrkhan 14 hours ago
2. Junior engineers' heavy reliance on AI tools is a problem in itself. AI tools learn from existing code that was written by senior engineers. Too much use of AI by junior engineers will result in a deterioration of engineering skills. It will eventually result in AI learning from AI-generated code. This is true for most other content as well, as more and more content on the internet is AI-generated.
Comment by welliebobs 9 hours ago
It's my prediction that we're going to see more specialised skill sets become more commonplace. We'll have developers who can effectively use AI to bootstrap PoCs, developers who use AI in well-established code bases to increase velocity (think asking Cursor to implement another set of REST endpoints for a new type), and developers who might choose to exclude AI from their workflows.
Eventually (I hope, at least) it'll be expected that it's just another tool that developers can use in their day-to-day work, and less of the Omnissiah that has come to replace us as developers.
Comment by jaredcwhite 14 hours ago
The only relevant point here is keeping a talent pipeline going, because well duh. That it even needs to be said like it's some sort of clever revelation is just another indication of the level of stupid our industry is grappling with.
The. Bubble. Cannot. Burst. Soon. Enough!!
Comment by jr-throw 15 hours ago
I realized that they are shockingly bad at the most basic things. Still, their PRs look really good (on the surface). I assume they use AI to write most of the code.
What they do excel in is a) cultural fit for the company and b) providing long-term context to the AIs for what needs to be done. They are essentially human filters between product/customers and the AI. They QA the AI models' output (to some extent).
Comment by KnuthIsGod 8 hours ago
Folks in Hyderabad can run LLMs too and data centre and infrastructure costs are lower in India.
Comment by focusgroup0 16 hours ago
Comment by ThrowawayTestr 16 hours ago
Comment by toast0 15 hours ago
Comment by platevoltage 13 hours ago
Comment by nxor 16 hours ago
Comment by HeavyStorm 12 hours ago
You should replace devs vertically, not horizontally; otherwise, who'll be your senior dev tomorrow?
Jokes aside, AI has the potential to reduce workforce across the board, but companies should strive to retain all levels staffed with humans. Also, an LLM can't fully replace even a junior, not yet at least.
Comment by twelvechess 16 hours ago
Comment by RevEng 15 hours ago
Comment by rented_mule 15 hours ago
The "over" deserves a lot of emphasis. To this day, I save my code at least once per line that I type because of the daily (sometimes hourly) full machine crashes I experienced in the 80s and 90s.
Comment by daedrdev 15 hours ago
Comment by XenophileJKO 15 hours ago
Comment by RussianCow 15 hours ago
Comment by XenophileJKO 13 hours ago
I have always cared a lot about quality and craftsmanship. Now when I am working and notice something wrong, I just fix it. I can code it entirely with AI in the time it would've taken me to put it on an eternal backlog somewhere.
Comment by joshribakoff 15 hours ago
Comment by otabdeveloper4 13 hours ago
Comment by nevir 13 hours ago
Pair them with a senior so they can learn engineering best practices:
And now you've also just given your senior engineers some extra experience/insights into how to more effectively leverage AI.
It accelerates the org to have juniors (really: a good mix of all experience levels)
Comment by goosejuice 7 hours ago
Why? That seems unlikely to me. That's like saying juniors are likely the most comfortable with jj, zed, or vscode.
Comment by par 14 hours ago
Comment by itissid 14 hours ago
Now Claude had access to this[2] link and it got the data into the research prompt using the web-searcher. But that's not the point. Any junior worth their salt (this is distributed systems 101) would know _what_ was obvious; the failure was in not paying attention to the _right_ thing. While there are ideas on prompt optimization out there [3][4], the issue is how many tokens it can burn thinking about these things, and coming up with an optimal prompt and corrections to it is a very hard problem to solve.
[1] https://github.com/humanlayer/humanlayer/blob/main/.claude/c... [2] https://litestream.io/guides/vfs/#when-to-use-the-vfs [3] https://docs.boundaryml.com/guide/baml-advanced/prompt-optim... [4] https://github.com/gepa-ai/gepa
Comment by NewJazz 14 hours ago
Comment by shevy-java 11 hours ago
Comment by danans 14 hours ago
It's like expecting someone to know how to use source control (which at some point wasn't table stakes like it is today).
Comment by gehsty 13 hours ago
But more seriously, are there CEOs out there who think they can replace the people starting off in their industry with AI? Who do they think will be the senior devs in 5-10 years?
Comment by jacquesm 8 hours ago
Comment by testing22321 8 hours ago
Comment by jacquesm 8 hours ago
Comment by zkmon 14 hours ago
Comment by Jerry2 14 hours ago
Comment by fire2dev 14 hours ago
I kind of agree with this point from the perspective of civilisation.
Comment by exabrial 10 hours ago
Comment by gaptoothclan 13 hours ago
Comment by zkmon 14 hours ago
Comment by enigma101 4 hours ago
Comment by echelon_musk 16 hours ago
https://news.ycombinator.com/item?id=44972151
Does this story add anything new?
Comment by geodel 15 hours ago
Comment by add2 8 hours ago
Comment by rudnevr 7 hours ago
Comment by fastball 9 hours ago
Comment by gchokov 12 hours ago
Comment by wolfi1 13 hours ago
Comment by tasqyn 14 hours ago
Comment by rvz 15 hours ago
We do not need to hire any more outside senior developers who need to be trained on the codebase with AI, given that the junior developers catch up so quickly that they have already replaced the need to hire a senior developer.
Therefore, replacing them with AI agents was quite premature, if not completely silly. In fact, it makes more sense to hire far fewer senior developers and to instead turn juniors directly into senior developers, saving lots of money and onboarding time.
Problem solved.
Comment by platevoltage 13 hours ago
Comment by 0xdeadbeefbabe 16 hours ago
Comment by nextworddev 15 hours ago
Comment by dwa3592 15 hours ago
Comment by testemailfordg2 8 hours ago
Comment by gorbachev 13 hours ago
Comment by aurizon 10 hours ago
Comment by d--b 11 hours ago
When I started working, I think I was fairly competent technically, and usually the people I hired were also pretty good straight out of uni.
Comment by AtNightWeCode 12 hours ago
Comment by mberning 14 hours ago
Comment by johnwheeler 15 hours ago
Comment by oulipo2 15 hours ago
Comment by MSKJ 11 hours ago
Comment by butterisgood 11 hours ago
Comment by Madmallard 6 hours ago
Does he not understand the people making millions or billions off AI literally do not care?
They are fully committed to seeing if they can do away with having to employ people altogether.
They want techno-feudalism.
Sam Altman and his ilk seem so anti-humanity in interviews that it's really disgusting we allow them to be in a position of power at all.
Comment by Nextgrid 16 hours ago
I do agree with him that AI is a boon to juniors and that pragmatic usage of AI is an improvement in productivity, but that's not news; it's been obvious since the very beginning of LLMs.
Comment by grogenaut 15 hours ago
Comment by Nextgrid 15 hours ago
(also I’m just talking out of my ass on a tech forum under a pseudonym instead of going to well-publicized interviews)
Comment by oytis 13 hours ago
Comment by tonyhart7 15 hours ago
Comment by ubicomp 15 hours ago
Comment by panny 8 hours ago
At some point, admit you have a problem maybe? Maybe that will only happen after you spent 20 years staying on top of the latest tools and tech only to be told you're out of style because your hair started to grey.
Comment by mocha_nate 8 hours ago
Comment by coolThingsFirst 10 hours ago
Comment by mxkopy 14 hours ago
Comment by dogemaster2032 15 hours ago
Comment by elmean 16 hours ago
Comment by msdrigg 16 hours ago
Comment by jjmarr 15 hours ago
Week 2: 32 interns
Week 3: 16 interns
Week 4: 8 interns
Week 5: 4 interns
Week 6: 2 interns
Week 7: 1 intern
Week 8: 0.5 interns
Is it possible to make it to the end of the summer without getting sliced in half?
Comment by linuxhansl 15 hours ago
Comment by glitchc 15 hours ago
Comment by elmean 12 hours ago
Comment by nerdsniper 16 hours ago
Comment by elmean 12 hours ago
Comment by happymellon 16 hours ago
Comment by smurda 16 hours ago
Comment by Nextgrid 16 hours ago
I see none of that happening - software quality is actually in freefall (but AI is not to blame here, this began even before the LLM era), delivery doesn't seem to be any faster (not a surprise - writing code has basically never been the bottleneck and the push to shove AI everywhere probably slows down delivery across the board) nor cheaper (all the money spent on misguided AI initiatives actually costs more).
It is a super easy bet to take with money: software development is still a big industry, and if you legitimately believe AI will do 90% of a senior engineer's work, you can start a consultancy, undercut everyone else, and pocket the difference. I haven't heard of any long-term success stories with this approach so far.
Comment by joshribakoff 15 hours ago
Comment by JackSlateur 16 hours ago
I'm yet to see that production-grade code written by these production-grade models;