• ch4s3 · 3 minutes ago
> junior developer employment drops by about 9-10% within six quarters, while senior employment barely budges. Big tech hired 50% fewer fresh graduates over the past three years.

This study showing a 9-10% drop is odd[1], and I'm not sure about their identification criteria.

> We identify GenAI adoption by detecting job postings that explicitly seek workers to implement or integrate GenAI technologies into firm workflows.

Based on that MIT study, it seems like 90+% of these projects fail. So we could easily be seeing an effect where firms posting these GenAI roles are burning money on those projects in a way that displaces investment in headcount.

The point about "BigTech" hiring 50% fewer grads is almost orthogonal. All of these companies are shifting hiring towards things where new grads are unlikely to add value: building data centers and frontier work.

Moreover, the TCJA of 2017 meant software developer salaries could no longer be immediately expensed as R&D tax write-offs (I'm oversimplifying). This surely has more of an effect than whatever "GenAI integrator role" postings correlate with.

[1] https://download.ssrn.com/2025/11/6/5425555.pdf

My experience hasn't been that LLMs automate coding, just that they speed it up. I know what I want the solution to be, I describe it to the LLM, usually one specific code block at a time, and then build it up block by block. When I read Hacker News, people talk like it's doing much more than that. It doesn't feel like an automation tool to me at all. It just helps me do what I was gonna do anyway, but without having to look up library function calls and language-specific syntax.
• jvans · 4 minutes ago
I notice a huge difference between working on large systems with lots of microservices and building small apps or tools for myself. The large-system work is what you describe, but for small apps or tools I resonate with the "automate coding" crowd.

I've built a few things end to end where I can verify the tool or app does what I want, and I haven't seen a single line of the code the LLM wrote. It was a creepy feeling the first time it happened, but it's not a workflow I can really use in a lot of my day-to-day work.

> My experience hasn't been LLMs automate coding, just speeds it up.

This is how basically everyone I know actually uses LLMs.

The whole story about vibecoding and LLMs replacing engineers has become a huge distraction from the really useful discussions to be had. It’s almost impossible to discuss LLMs on HN because everyone is busy attacking the vibecoding strawman all the time.

Half strawman -- a mudman, perhaps. Because we're seeing proper experts with credentials jump on the "shit, AI can do all of this for me" realization-blog-post train.

So another strawman?
It's a better Google for me. Instead of searching AWS or Stack Overflow, it hallucinates a good-enough output that I can refactor into a solution.

The bottom-up and top-down views don't seem to match.

Where is all the new and improved software output we’d expect to see?

• Eong · 10 minutes ago
Love the article. I struggled with my new identity and had to write https://edtw.in/high-agency-engineering/ for myself, but I also came to the realisation that the industry is shifting too, especially for junior engineers.

Curious how the Specialist vs. Generalist theme plays out: who is going to feel it *first* as AI gets better over time?

The author has a bizarre idea of what a computer science degree is about. Why would it teach cloud computing or devops? The idea is that you learn those on your own.
If that's "the idea", then clearly we need a more holistic, useful degree to replace CS as "the" software degree.
Despite what completely uninformed people may think, the field of "computer science" is not about software development. It's a branch of mathematics. If you want an education in software development, that's what trade schools offer.
What I want is for universities to offer a degree in Software Engineering. That's a different field from Computer Science.

You say that belongs in a trade school? I might agree, if you think trade schools and not universities should teach electrical engineering, mechanical engineering, and chemical engineering.

But if chemical engineering belongs at a university, so does software engineering.

Plenty of schools offer software engineering degrees alongside computer science, including mine ~20 years ago.

The bigger problem when I was there was undergrads (me very much included) not understanding the difference at all when signing up.

• mxkopy · 42 minutes ago
Last I checked ASU does, and I’m certain many other universities do too.
The degree is (should be) about CS fundamentals and not today's hotness. Maybe a "trades" diploma in CS could teach today's hotness.
• wrs · 1 hour ago
Cloud computing is not some new fundamental area of computer science. It’s just virtual CPUs with networks and storage. My CS degree from 1987 is still working just fine in the cloud, because we learned about CPUs, virtualization, networks, and storage. They’re all a lot bigger and faster, with different APIs, but so what?

Devops isn’t even a thing, it’s just a philosophy for doing ops. Ops is mostly state management, observability, and designing resilient systems, and we learned about those too in 1987. Admittedly there has been a lot of progress in distributed systems theory since then, but a CS degree is still where you’ll find it.

School is typically the only time in your life that you’ll have the luxury of focusing on learning the fundamentals full time. After that, it’s a lot slower and has to be fit into the gaps.

There has to be a balance of practical skills and theory in a useful degree, and most CS curricula are built that way. It should not be all about random hot tech, because that always changes. You can easily learn tech from tutorials, because the tech is simple compared to the theory. Theory is also what lets you judge the merits of different technologies and software designs.
Why is this necessarily true?
A CS degree is there to teach you concepts and fundamentals that are the foundation of everything computing related. It doesn't generally chase after the latest fads.
• tibbar · 21 minutes ago
Sure, but we need to update our definitions of concepts/fundamentals. A lot of this stuff has its own established theory and has been a core primitive for software engineering for many years.

For example, the primitives of cloud computing are largely explained by papers published by Amazon, Google, and others in the mid-2000s (Dynamo, Bigtable, etc.). If you want to explore massively parallel computation, container orchestration, etc., it would be natural to do that using a public cloud, although of course many of the platform-specific details are incidental.

Part of the story here is that the scale of computing has expanded enormously. The DB class I took in grad school was missing lots of interesting puzzle pieces around replication, consistency, storage formats, etc. There was a heavy focus on relational algebra and normal forms, which is just... far from a complete treatment of the necessary topics.

We need to extend our curricula beyond the theory required to execute binaries on individual desktops.
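To give one concrete flavor of those missing puzzle pieces, here is a minimal sketch in Python (my illustration, not anything from the thread; the function name and brute-force check are made up, though the R + W > N condition itself is standard) of the quorum rule from Dynamo-style replication: with N replicas, R-replica reads and W-replica writes are guaranteed to overlap whenever R + W > N.

    # Illustrative sketch: brute-force check of the R + W > N overlap rule.
    from itertools import combinations

    def quorums_overlap(n: int, r: int, w: int) -> bool:
        """True iff every R-subset of N replicas intersects every W-subset."""
        replicas = range(n)
        return all(set(read) & set(write)
                   for read in combinations(replicas, r)
                   for write in combinations(replicas, w))

    assert quorums_overlap(3, 2, 2)      # R + W = 4 > 3: reads see the latest write
    assert not quorums_overlap(3, 1, 2)  # R + W = 3: a read can miss a write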

I have been telling people that, titles aside, senior developers are the people not afraid to write original code. I don't see LLMs changing this. I only envision people wishing LLMs would change this.
I disagree.

1) Senior developers are more likely to know how to approach a variety of tasks, including complex ones, in ways that work, and are more likely to (maybe almost subconsciously) stick to these proven design patterns rather than reinvent the wheel in some novel way. Even if the task itself is somewhat novel, they will break it down in familiar ways into familiar subtasks/patterns. For sure, if a task does require some thinking outside the box, or a novel approach, then the senior developer might have better intuition on what to consider.

The major caveat to this is that I'm an old-school developer who started professionally in the early '80s, a time when you basically had to invent everything from scratch, so there's certainly no mental block against doing so. And I'm aware there is at least a generation of developers who grew up with Stack Overflow and have much more of a mindset of building stuff by cut and paste, and less of having to sit down and write much complex/novel code themselves.

2) I think the real distinction between senior and junior programmers, the one that will carry over into the AI era, is that senior developers have had enough experience, at increasing levels of complexity, to know how to architect and work on large complex projects where a more junior developer will flounder. In the AI coding world, at least for the time being, until something closer to AGI is achieved (which could be 10-20 years away), you still need to be able to plan and architect the project if you want a result that isn't just some random "I let the AI choose everything" experiment.

I completely agree with your second point. For your first point, my experience tells me the people least afraid to write original code are also the people least opposed to reinventing wheels.

The distinguishing behavior is not the quantity of effort involved but the total cost after accounting for dependency management, maintenance time, and execution time. The people who reinvent wheels do so because they want to learn, and because they want similar tasks to take less work in the future.

> in the early '80s, a time when you basically had to invent everything from scratch, so there's certainly no mental block against doing so. And I'm aware there is at least a generation of developers who grew up with Stack Overflow and have much more of a mindset of building stuff by cut and paste, and less of having to sit down and write much complex/novel code themselves.

I think this is really underappreciated and played a big part in driving how a lot of people felt about LLMs. I found it even more notable on a site named Hacker News. There is an older generation for whom computing was new, the '80s through the '90s probably being the prime of that era (for people still in the industry). There was constantly a new platform, language, technology, or concept to learn. Nobody knew any best practices; nobody knew how anything "should work". Nobody knew what anything was capable of. It was all trying things and figuring them out, way more trailblazing and exploring new territory. The birth of the internet was one of the last examples of this from that era.

The past 10-15 years of software development have been the opposite. Just about everything was evolutionary, rarely revolutionary: optimizing things for scale, improving libraries, or porting successful ideas from one domain to another. A lot of shuffling deck chairs on things that were fundamentally the same. Just about every new "advance" in front-end technology was this; something hailed as groundbreaking really took little exploration, mostly solution-space optimization. There was almost always a clear path. Someone always had an answer on Stack Overflow; you were never "on your own". A generation-plus grew up in that environment, and it felt normal to them.

LLMs came about and completely broke that. People who remembered when tech was new, had potential, and nobody knew how to use it loved that: here is a new alien technology, and I get to figure out what makes it tick and how to use it. And on the flip side, people who were used to there being a happy path, or a manual to tell you when you were doing it wrong, got really frustrated at there being no direction, feeling perpetually lost when it didn't work the way they wanted.

I found it especially ironic, being on Hacker News, how few people seemed to have a hacker mindset when it came to LLMs. So much was "I tried something, it didn't work, so I gave up", or "I just kept telling it to work and it didn't, so I gave up". Explore; pretend you're in a sci-fi movie. Does it work better on Wednesdays? Does it work better if you stand on your head? Does it work differently if you speak pig latin? Think sideways. What behavior can you find that makes you go "hmm, that's interesting..."?

Now I think there has been a shift very recently, with people getting more comfortable with the tech, but I was still surprised at how little of a hacker mindset I saw on Hacker News when it came to LLMs.

LLMs have reset the playing field from a well-manicured lawn to an unexplored wilderness. Figure out the new territory.

• Terr_ · 22 minutes ago
To me, the "hacker" distinction is not about novelty, but understanding.

Bashing kludgy things together until they work was always part of the job, but that wasn't the motivational payoff. Even if the result was crappy, knowing why it was crappy and how it could've been better was key.

LLMs promise an unremitting drudgery of "mess around until it works" with problems that are either too random to characterize, that you have no time to dig into, or that offer no consistent opportunity to apply what you've learned later.

• CSSer · 2 hours ago
I almost think what a lot of people are coming to grips with is how much code is unoriginal. The ones who've adjusted the fastest were humble to begin with. I don't want to claim the title, but I can certainly claim the imposter syndrome! If anything, LLMs validated something I always suspected: the amount of truly unique, success-relevant code in a given project is often very small. More often than not, it's not grouped together either; most of the time it's tailored to a given piece of functionality.

For example, a perfectly accurate Haversine distance is slower than an optimized one with tradeoffs. LLMs have not yet become adept at identifying the need for those tradeoffs in context well or consistently, so you end up with generic code that works, but not great. Can the LLM adjust if you explicitly instruct it to? Sure, sometimes! Sometimes it gets caught in a thought loop too. Other times you have to roll up your sleeves and do the work like you said, which often still involves traditional research or thinking.
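To make that Haversine example concrete, here is a minimal sketch of the tradeoff in Python (my illustration, not the commenter's code; the function names and the 6371 km mean-radius constant are assumptions): the exact great-circle formula next to a cheaper equirectangular approximation that is fine for nearby points.

    # Illustrative sketch: exact great-circle distance vs. a cheaper
    # flat-earth approximation that is accurate enough for close points.
    from math import radians, sin, cos, asin, sqrt

    EARTH_RADIUS_KM = 6371.0  # mean Earth radius, itself an approximation

    def haversine_km(lat1, lon1, lat2, lon2):
        """Exact great-circle distance, valid at any separation."""
        p1, p2 = radians(lat1), radians(lat2)
        dp, dl = p2 - p1, radians(lon2 - lon1)
        a = sin(dp / 2) ** 2 + cos(p1) * cos(p2) * sin(dl / 2) ** 2
        return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

    def equirect_km(lat1, lon1, lat2, lon2):
        """Equirectangular approximation: fewer trig calls, fine for
        nearby points, increasingly wrong at large separations."""
        p1, p2 = radians(lat1), radians(lat2)
        x = radians(lon2 - lon1) * cos((p1 + p2) / 2)
        return EARTH_RADIUS_KM * sqrt(x * x + (p2 - p1) ** 2)

    # Two points ~1 km apart: the results agree to well under a meter.
    print(haversine_km(52.5200, 13.4050, 52.5290, 13.4050))
    print(equirect_km(52.5200, 13.4050, 52.5290, 13.4050))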
On the junior developer question:

A humble way for devs to look at this is that in the new LLM era, we are all juniors now.

A new entrant with a good attitude, curiosity, and interest in learning the traditional "meta" of coding (version control, specs, testing, etc.), plus a cutting-edge, first-rate grasp of using LLMs to assist their craft (as recommended in the article), will likely be more useful in a couple of years than a "senior" dragging their heels or dismissing LLMs as hype.

We aren't in coding Kansas anymore, junior and senior will not be so easily mapped to legacy development roles.

• 3 hours ago
The points mentioned in the article, regarding the things to focus on, are spot on.
> Junior developers: Make yourself AI-proficient and versatile. Demonstrate that one junior plus AI can match a small team’s output. Use AI coding agents (Cursor/Antigravity/Claude Code/Gemini CLI) to build bigger features, but understand and explain every line if not most. Focus on skills AI can’t easily replace: communication, problem decomposition, domain knowledge. Look at adjacent roles (QA, DevRel, data analytics) as entry points. Build a portfolio, especially projects integrating AI APIs. Consider apprenticeships, internships, contracting, or open source. Don’t be “just another new grad who needs training”; be an immediately useful engineer who learns quickly.

If I were starting out today, this is basically the only advice I would listen to. There will indeed be a vacuum in the next few years because of the drastic drop in junior hiring today.

• gassi · 47 minutes ago
> Addy Osmani is a Software Engineer at Google working on Google Cloud and Gemini

Ah, there it is.

The outlook on CS credentials is wrong. You'll never be worse off than someone without those credentials, all other things being equal. Buried in this text is the assumption that the relatively studious people who get degrees are going to fall behind the non-degreed, because the ones who didn't go to school will out-study them. What will really happen, generally, is that the non-degreed will continue not to study, and they will lean on AI to avoid studying even the few things they might otherwise have needed to learn to squeak by in industry.