Was having a discussion the other day with someone, and we came to the same conclusion. You used to be able to make yourself useful by doing the easy / annoying tasks that had to be done, but more senior people didn't want to waste time dealing with. In exchange you got on-the-job experience, until you were able to handle more complex tasks and grow your skill set. AI means that those 'easy' tasks can be automated away, so there's less immediate value in hiring a new grad.
I suspect the effects of this will take a while to be felt (five years?); mid-level -> senior-level transitions will leave a hole behind that can't be filled internally. It's almost like the aftermath of a war killing off 18-30 year olds and leaving a demographic hole, or the effect of COVID on education for certain age ranges.
In the past, a junior would write bad code and you'd work with them to make it better. Now I just assume they're taking my feedback and feeding it right back to the LLM. Ends up taking more of my time than if I'd done it myself. The whole mentorship thing breaks down when you're basically collaborating with a model through a proxy.
I think highly motivated juniors who actually want to learn are still valuable. But it's hard to get past "why bother mentoring when I could just use AI directly?"
I don't have answers here. Just thinking maybe we're not seeing the end of software engineering for those of us already in it—but the door might be closing for anyone trying to come up behind us.
This is especially annoying when you get back a response in a PR "Yes, you're right. I have pushed the fixes you suggested."
Part of the challenge (and I don't have an answer either) is that there are some juniors who use AI to assist... and some who delegate all of their work to it.
It is especially frustrating that the second group doesn't become much more than a proxy for an LLM.
New juniors can progress in software engineering - but they have to take the road of disciplined use of AI and make sure they're learning the material rather than delegating all their work to it... and delegating is very tempting, especially if that's what they did in college.
Hiring well is hard, especially if compensation isn't competitive enough to attract talented individuals who have a choice. It's also hard to change institutional hiring practices. Nobody gets fired for buying IBM, and nobody gets fired for following the same hiring practices that were in place in 2016.
> What are all those rounds for if we're getting engineers who aren't as valued for the team's needs at the end of the pipeline?
Software development is a multidisciplinary field. It involves multiple non-overlapping skill sets, both hard skills and soft skills. Also, you need multiple people vetting a candidate to eliminate corruption and help weed out candidates who outright clash with company culture. You need to understand that hiring someone is a disruptive activity that impacts not only what skill sets are available in your organization but also the current team dynamics. If you read around, you'll stumble upon stories of people who switch roles in reaction to new arrivals. It's important to get this sort of stuff right.
Well, I'm still waiting. Your second paragraph seems to contradict the first, which perfectly encapsulates the issue with hiring: too afraid to try new things, so instead we add bureaucracy to lessen accountability.
I think you haven't spent much time thinking about the issue. Changing hiring practices does not mean they are improved; it only means they changed. You are still faced with the task of hiring adequate talent, but if you change processes then you no longer have baselines and past experiences to guide you. If you keep your hiring practices, you keep those baselines: you stick with something that has proven to work, albeit with debatable optimality, and you mitigate risk because your experience with the process helps you be aware of some red flags. The worst case scenario is that you repeat old errors, but those will be systematic errors which are downplayed by the fact that your whole organization is proof that your hiring practices are effective.
No, but I'd like to at least see conversation on how to improve the process. We aren't even at that point. We're just barely past acknowledging that it's even an issue.
>but if you change processes then you no longer have baselines and past experiences to guide you.
I argue we're already at this point. The reason we got past the above point of "acknowledging the problem" (a decade too late, arguably) is that the baselines are failing in the face of new technology, which is increasing false positives.
You have a point, but why does tech pick this moment to finally decide not to "move fast and break things"? Not when it comes to law and ethics, but for acquiring new talent (which meanwhile is already disrupting their teams with this AI slop)?
>those will be systematic errors which are downplayed by the fact that your whole organization is proof that your hiring practices are effective.
okay, so back to step zero then. Do we have a hiring problem? The thesis of this article says yes.
"it worked before" seems to be the antipattern the tech industry tried to fight back against for decades.
If you're still asking trivia, yes. Maybe it's time to shift away from the old filter and update the process?
If you can see on the job that a 30-minute PR is the problem, then maybe replace that third leetcode round with 30 minutes of pair programming. Hard to ChatGPT in real time without arousing suspicion.
But is the false negative for a nervous pair programmer worse than a false positive for a leetcode question? Ideally a good interviewer would be able to separate the anxiety from the actual thinking and see that this person can actually think, but that's another skill undervalued in the industry.
Given how much these orgs pay, both directly to headhunters and indirectly in interview time, they might as well probationally hire whoever passes the initial sniff test.
That also lets you evaluate longer term habits like punctuality, irritability, and overall not-being-a-jerkness.
Onboarding. Even with good employees, it can take a few months to get the flow of the organization, understand the code base, and understand the domain. Maybe a bit of a technology shift too. Firing a person who doesn't appear to be performing in the first week or two or three would be churning through that too fast.
Provisional hiring with "maybe we'll hire you after you move here and work for us for a month" is a non-starter for many candidates.
At my current job and the job previous, it took two or three weeks to get things fully set up, be it equipment, provisioning permissions, accounts, or training. The retail company I worked at from '10 to '14 sent every new hire out to a retail store to learn how the store runs, to get a better idea of how to build things for them and support their processes.
... and not every company pays Big Tech compensation. Sometimes it's "this is the only person who didn't say «I've got an offer with someone else that pays 50% more»". Sometimes a warm body that you can delegate QA testing and pager duty to (rather than software development tasks) is still a warm body.
"Bad" is vague, subjective moralist judgement. It's also easily manipulated and distorted to justify firing competent people who did no wrong.
> It’s pretty obvious when someone starts actually working if they’re going to be a net positive. On the order of weeks, not months.
I feel your opinion is rather simplistic and ungrounded. Only the most egregious cases become apparent within a few weeks of work. In software engineering positions, you don't get the chance to let your talents shine through in the span of a few weeks. The cases where incompetence does become obvious that quickly actually spell gross failure of the whole hiring process, which failed to catch that the candidate didn't even meet the hiring bar.
> (...) might as well probationally hire whoever passes the initial sniff test.
This is a colossal mistake, and one which disrupts a company's operations and the candidates' lives. Moreover, it has a chilling effect on the whole workforce, because no one wants to work for a company run by sociopaths who toy with people's lives and livelihoods as if it were nothing.
If you have that kind of office politics going on, that's the issue to be solved.
>toy with people's lives and livelihood as if it was nothing.
If the employee lies about their skills, it is on them.
You’re asking these rhetorical questions as if we haven’t had centuries of precedent here, both bad and good. How does the AMA balance between neurosurgeons and optometrists? Bar associations between corporate litigators and family estate lawyers? Professional engineering associations between civil engineers and chemical engineers?
One takes the FE exam ( https://ncees.org/exams/fe-exam/ ). You will note at the bottom of the page "FE Chemical" and "FE Civil" which are two different exams.
Then you have an apprenticeship for four years as an Engineer in Training (EIT).
Following that, you take the PE exam: https://ncees.org/exams/pe-exam/ You will note that the PE exams are even more specialized to the field.
Then there is the state you are licensed in (states tend to have reciprocal licensing - but not necessarily, and not necessarily for all fields). For example, if you were licensed in Washington, you would need to pass another exam specific to California to work for a California firm.
Furthermore, there are the continuing education requirements (which are different for each state). https://www.pdhengineer.com/pe-continuing-education-requirem...
You have to take 30 hours of certified study in your field across every two years. This isn't a lot, but people tend to fuss about "why do CS people keep being expected to learn on our own?" ... Well, if we were Professional Engineers it wouldn't just be an expectation - it would be a requirement to maintain the license. You will again note the domain of the professional development is different - so civil and mechanical engineers aren't necessarily taking the same types of classes.
These requirements are set by state licensure boards and are part of legislative processes.
But are you suggesting we have separate licenses for every different type of developer? We have new types coming up every few years.
The whole idea of guilds for developers is just stupid and impractical. It could never work on any long term or large scale basis.
And yet welcome to leetcode grind.
If you need to fizzbuzz me, fine. But why am I still making a word-search solver project in my free time as if I'm applying for a college internship?
And I’m being an accelerationist hoping the whole thing collapses under its own ridiculousness.
Recruitment is broken, even more than it was before ChatGPT.
This is not limited to junior devs. I had the displeasure of working with a guy who was hired as a senior dev and heavily delegated any work he did. He failed to do even the faintest review of what the coding agent produced, and of course did zero testing. At one point these stunts caused a major incident where one of these glorious PRs pushed code that completely inverted a key business rule and resulted in paying customers being denied access to a paid product.
Sometimes people are slackers with little to no ownership or pride in their craftsmanship, who just stumbled upon a career path they are not very good at. They start as juniors but can idle long enough to waddle their way into senior positions. This is not an LLM problem, or caused by it.
And then in the next PR, you have to request the exact same changes
Hmmm. Is there any way to distinguish between these two categories? Because I agree, if someone is delegating all their work to an LLM or similar tool, cut out the middleman. Same as if someone just copy/pasted from Stackoverflow 5 years ago.
I think it is also important to think about incentives. What incentive does the newer developer have to understand the LLM output? There's the long term incentive, but is there a short term one?
Unfortunately, the use of LLMs has brought about a lot of mistrust in the workplace. Earlier you’d simply assume that a junior making mistakes is simply part of being a junior and can be coached; whereas nowadays said junior may not be willing to take your advice as they see it as sermonizing when an “easy” process to get “acceptable” results exists.
I saw a situation like this many years ago. The newly hired midlevel engineer thought he was smarter than the supervisor. Kept on arguing about code style, system design etc. He was fired after 6 months.
But I was friendly with him, so we kept in touch. He ended up working at MSFT for 3 times the salary.
> Earlier you’d simply assume that a junior making mistakes is simply part of being a junior and can be coached; whereas nowadays said junior may not be willing to take your advice
Hot take: This reads like an old person looking down upon young people. Can you explain why it isn't? Else, this reads like: "When I was young, we worked hard and listened to our elders. These days, young people ignore our advice." Every time I see inter-generational commentary like this (which is inevitably from personal experience), I am immediately suspicious. I can assure you that when I was young, I did not listen to older people's advice and I tried to do everything my own way. Why would this be any different in the current generation? In my experience, it isn't.
On a positive note: I can remember mentoring some young people and watching them comb through blogs to learn about programming. I am so old that my shelf is/was full of O'Reilly books. By the time I was mentoring them, few people under 25 were reading O'Reilly books. It opened my eyes to the fact that how people learn changes more than what they learn. Example: someone is trying to learn about access control modifiers for classes/methods in a programming language. Old days: get the O'Reilly book for that language and look up access modifiers in the index. Ten years ago: Google for a blog with an intro to the language; there will be a tip about what access modifiers can do. Today: ask ChatGPT. In my (somewhat contrived) example, the how is changing, but not the what.
The answer to this (throughout the ages) should be the same: read the authoritative source of information. The official API docs, the official language specification, the man page, the textbook, the published paper, and so on.
Maybe I am showing my age, but one of the more frustrating parts of being a senior mentoring a junior is when they come with a question or problem, and when I ask: “what does the official documentation say?” I get a blank stare. We have moved from consulting the primary source of information to using secondary sources (like O’Reilly, blogs and tutorials), now to tertiary sources like LLMs.
I think this is undoubtedly true from my observations. Recently, I got together over drinks with a group of young devs (most around half my age) from another country I was visiting.
One of the things I said, very casually, was, "Hey, don't sleep on good programming books. O'Reilly. Wiley. Addison-Wesley. MIT Press. No Starch Press. Stuff like that."
Well, you should've seen the looks on their faces. It was obvious that advice went over very poorly. "Ha, read books? That's hard. We'd rather just watch a YouTube video about how to make a JS dropdown menu."
So yeah, I get that "showing my age" remark. Used to be the discipline in this industry is that you shouldn't ask a question of a senior before you'd read the documentation. If you had read the documentation, man pages, googled, etc., and still couldn't come up with an answer, then you could legitimately ask for a senior mentor's time. Otherwise, the answer from the greybeards would have been "Get out of my face, kid. Go RTFM."
That system that used to exist is totally broken now. When reading and understanding technical documentation is viewed as "old school", then you know we have a big problem.
If I have a problem with a USB datastream, the last place I'm going to look is the official USB spec. I'll be buried for weeks. The information may be there, but it will take me so long to find it that it might as well not be.
The first place to look is a high quality source that has digested the official spec and regurgitated it into something more comprehensible.
[shudder] the amount of life that I've wasted discussing the meaning of some random phrase in IEC-62304 is time I will never get back!
Hot take: This reads like a person who was difficult to work with.
Senior people have responsibility, therefore in a business situation they have authority. Junior people who think they know it all don't like this. If there's a disagreement between a senior person and a junior person about something, they should, of course, listen to each other respectfully. If that's not happening, then one of them is not being a good employee. But if they are, then the supervisor makes the final call.
The tangent to that is that it also changes how much one internalizes about the problem domain and is able to apply that knowledge later. Hard-fought knowledge from the old days is something that shapes how I design systems today.
However, the tendency of people today to reach for ChatGPT to solve a problem results in them making the same mistakes again the next time, since the information is so easy to access. It also makes the larger things more difficult: "how do you architect this larger system" is something you learn by building the smaller systems and learning about them, so that their advantages and disadvantages become an inherent part of how you conceive of the system as a whole. Being able to have ChatGPT do it means people often don't think about the larger problem or how it fits together.
I believe it is harder for a junior who is using ChatGPT to advance to being a mid-level or senior developer than it was for a junior from the old days, because of the lack of retention of the knowledge of the problems and solutions.
I have a friend who is a dev, a very senior one at that, who spins up four Claudes at once and does the whole enterprise's work. He's a "Senior AI Director" with nobody beneath him, not a single direct report, and no knowledge of AI or ML, to my chagrin.
So now I'm whining too...
Once you’re a senior you can exercise judgement on when/how to use LLMs.
When you’re a junior you haven’t developed that judgement yet. That judgement comes from consulting documentation, actually writing code by hand, seeing how you can write a small program just fine, but noticing that some things need to change when the code gets a lot bigger.
A junior without judgement isn’t very valuable unless he/she is working hard to develop that judgement. Passing assignments through to the LLM does not build judgement, so it’s not a winning strategy.
I don't mind if AI is used as a tool, but the output needs to be vetted.
For software, I can imagine a process where junior developers create a PR and then run through it with another engineer side by side. The short-term incentive would be that they have to be able to do it, or else they'd get exposed.
You might be specifically talking about people who copy/paste without understanding, but I think it's still OK-ish to do that, since you can't make an entire [whatever you're coding up] by copy/pasting snippets from SO like you're cutting words out of a magazine for a ransom note. There's still thought involved, so it's more like training wheels that you eventually outgrow as you get more understanding.
It at least forces you to tinker with whatever you copied over.
But I suppose my question is rhetorical. We're laying off hundreds of thousands of engineers and making existing ones do the work of 3-4 engineers. Not much time to help the juniors.
That is, at least for the people who don't understand what they're doing, the LLM tends to come out with something I can at least turn into something useful.
It might be reversed, though, for people who know what they're doing. If they know what they're doing, they might theoretically be able to put together some Stack Overflow results that make sense and build something up from that better than what gets generated from an LLM (I am not asserting this would happen, just thinking it might be the case).
However I don't know as I've never known anyone who knew what they were doing who also just copy/pasted some stackoverflow or delegated to LLM significantly.
Yes, it should be obvious. At least at the current state of LLMs.
> There's the long term incentive, but is there a short term one?
The short term incentive is keeping their job.
I've learnt that saying this exact phrase does wonders when it comes to advancing your career. I used to argue against stupid ideas but not only did I achieve nothing, but I was also labelled uncooperative and technically incompetent. Then I became a "yes-man" and all problems went away.
I have seen responses to PRs that appear to be a copy and paste of my feedback into the LLM and a copy and paste of its response and fixes back into the PR.
It may be that the developer is incorporating the mannerisms of Claude into their own speech... that would be something to delve into (that was intentional). However, more often than not in today's world of software development, such responses are more likely to indicate a copy and paste of LLM-generated content.
This is nothing new. People rarely have independent thoughts, usually they just parrot whatever they've been told to parrot. LLMs created common world-wide standard on this parroting, which makes the phenomenon more evident, but it doesn't change the fact that it existed before LLMs.
Have you ever had a conversation with an intelligent person and thought "wow that's refreshing"? Yeah. There's a reason why it feels so good.
In a previous role I was a principal IC trying to mentor someone who had somehow been promoted up to senior but was still regularly turning in code for review that I wouldn't have expected from an intern - it was an exhausting, mind-numbing process trying to develop some sense of engineering taste in this person, and all of this was before LLMs. This person was definitely not just there for the money; they really looked up to the top-level engineers at our org and aspired to be there, but everything just came across as extremely shallow, like engineering cosplay: every design review or bit of feedback was soundbites from a how-to-code TED talk or something. Lots of regurgitated phrases about writing code to be "maintainable" or "elegant" but no in-the-bones feeling about what any of that actually meant.
Anyway, I think a person like this is probably maximally susceptible to the fawning ego-strokes that an AI companion delivers alongside its suggestions; I think I ultimately fear that combination more than I fear a straight up mercenary for whom it's a clear transaction of money -> code.
Very odd. It was like he only had ever worked on school projects assigned to him, and had no actual interest in exploring the problems we were working on.
But it can be tricky to evaluate this in the kind of structured, disciplined way that big-company HR departments like to see, where all interviewees get a consistent set of questions and are "scored" on their responses according to a fixed rubric.
This guy claimed to want to get promoted to Senior, but didn't do anything Senior-shaped. If you're going to own a component of a system, I should be able to ask you intelligent questions about how you might evolve it, and you should be able to tell me why someone cares about it.
Just pick the two you like the most.
I suppose it depends on the team and industry. This would be unheard-of behavior in games, for example. Why are you taking a pay cut and likely working more hours just to say "I don't know, whatever works"? You'd ideally be working towards some sort of goal: management, domain knowledge, or just being able to solve hard problems.
Welp, to each their own I suppose.
>Strong desire to work on a specific piece of the code (or to not work on one) might even in some cases be a red flag.
I understand an engineer should compromise. But if you want to specialize in high performance computing and you're pigeonholed into six months of front end web, I can understand the frustration. They need to consider their career too. It's too easy for the manager to ignore you if you don't stand up for yourself. Some even count on it and plan around the turnover.
Of course, if they want nothing other than kernel programming as a junior and you simply need some easy but important work done for a month, it can be unreasonable. There needs to be a balance as a team.
It's worth considering how aggressively open the door has been for the last decade. Each new generation of engineers increasingly disappointed me with how much more motivated they were by a big paycheck than by anything remotely related to engineering. There's nothing wrong with choosing a career for money, but there's also nothing wrong with missing a time when most people chose it because they were interested in it.
However I have noticed a shift: while half the juniors I work with are just churning out AI slop, the other half are really interested in the craft of software engineering and understanding computer science better.
We'll need new senior engineers in a few years, and I suspect they will come from a smaller pool of truly engaged juniors today.
There are still junior engineers out there who have experiments on their githubs, who build weird little things because they can. Those people were the best engineers anyway. The last decade of "money falls from the sky and anyone can learn to code" brought in a bunch of people who were interested in it for the money, and those people were hard to work with anyway. I'd lump the sidehustle "ship 30 projects in 30 days" crowd in here too. I think AI will effectively eliminate junior engineers in the second camp, but absolutely will not eliminate those in the first camp. It will certainly make it harder for those junior engineers at the margins between those two extremes.
There's nothing more discouraging than trying to guide a junior engineer who is just typing what you say into cursor. Like clearly you don't want to absorb this, and I can also type stuff into an AI, so why are you here?
The best engineers I've worked with build things because they are truly interested in them, not because they're trying to get rich. This is true of literally all creative pursuits.
My last re-engineering project was mostly done when they fired me, as the probationary period was almost over and it seems they did not want me further - too expensive? - and anyone can finish it, right? Well...
So I am finishing it for them, one more month, without a contract, for my own sake. Maybe they pay, maybe they don't - this is reality. But I want to see this thing working live. I have been through maybe 20-30 projects/products of this size and bigger, and only 3-4 have flown. The rest did not - and never for technical reasons.
Then/now I'll be back to the job search. Ah. Long lists of crypto-or-adtech-or-AI dreams, mostly...
Mentoring, juniors? I have not seen anything even faintly smelling of that for a decade...
If software were "just" a job without any of the gratifying aspects, I wouldn't do nearly as good a job.
But it's hard to know if a candidate is one of those when hiring, which also means that if you are one of those juniors it is hard for you to prove it to a prospective employer.
Seems like something a work policy can fix quickly, if not something filtered in the interview pipeline. I wouldn't just let juniors go around copy-pasting non-compiling Stack Overflow code, so why would I allow it here?
I keep hearing this and find it utterly perplexing.
As a junior, desperate to prove that I could hang in this world, I'd comb over my PRs obsessively. I viewed each one as a showcase of my abilities. If a senior had ever pointed at a line of code and asked "what does this do?" and I'd answered "I don't know," I would've been mortified.
I don't want to shake my fist at a cloud, but I have to ask genuinely (not rhetorically): do these kids not have any shame at all? Are they not the slightest bit embarrassed to check in a pile of slop? I just want to understand.
I'm approaching 30 years of professional work and still feel this way. I've found some people are like this, and others aren't. Those who aren't tend to not progress as far.
> embarrassed to check in a pile of slop
Part of being a true junior, especially nowadays, is not being able to recognize the difference between a pile of slop and useful, elegant code. I feel that's the bare minimum a junior should be asking. The "this is useful" or "this is slop" judgment will come with experience, but you need to at least be able to explain what's going on.
The transition to mid and senior comes when you can start to quantify other aspects of the code: performance, how widely a change affects the codebase at large, the expected inputs/outputs, and the overall correctness based on the language. Balancing those parameters and using them to accurately estimate a project's scope is when you're really thinking like a senior.
early on when I was doing iOS development I learned that "m34" was the magic trick to make flipping a view around have a nice perspective effect, and I didn't know what "m34" actually meant but I definitely knew what the effect of the line of code that mutated it was...
Googling on it now seems like a common experience for early iOS developers :)
https://stackoverflow.com/questions/14261180/need-better-and...
https://stackoverflow.com/questions/3881446/meaning-of-m34-o...
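For anyone who never looked past the magic number, here's a minimal sketch in Swift of what that line was doing, assuming a UIKit/Core Animation context; the helper name and the 500-point eye distance are illustrative, not something from the comments or links above. m34 is simply the perspective entry of the layer's 4x4 CATransform3D matrix, and the conventional trick is to set it to -1/d, where d is roughly the viewer's distance in points.

    import UIKit        // iOS app context (assumed)
    import QuartzCore   // CATransform3D / Core Animation types

    // Hypothetical helper: builds a Y-axis rotation with the "m34 trick" so the
    // flip has visible depth instead of flattening into a squash.
    func flipTransform(angle: CGFloat, eyeDistance: CGFloat = 500) -> CATransform3D {
        var t = CATransform3DIdentity
        t.m34 = -1.0 / eyeDistance                      // smaller distance = stronger perspective
        return CATransform3DRotate(t, angle, 0, 1, 0)   // rotate around the Y axis
    }

    // someView.layer.transform = flipTransform(angle: .pi / 4)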
Everybody else through my 21-year career has almost universally either been helpful or neutral (mostly just busy). If you think code reviews are just for bikeshedding about style minutia, then you're really missing out. I personally have found it extremely rewarding to invest in junior SWEs and see them progress in their careers.
It is not.
AI provides a bar. You need to be at least better than AI at coding to become a professional. It'll take genuine interest in the technology to surpass AI and clear that bar. The next generation of software professionals will be smaller, but unencumbered by incompetents. Their smaller numbers will be compensated by AI that can take care of the mundane tasks, and with any luck its capabilities will only increase.
Surely I'm not the only one who's had colleagues with 10+years experience who can't manage to check out a new branch in git? We've been hiring people we shouldn't have hired.
It's clear why people do it (more pay) but it sets up bad incentives for the companies. Why would a company invest money in growing the technical skill set of an employee, just to have them leave as soon as they can get a better offer?
> culture of job-hopping
When using this phrase in this context, is your sentiment positive or negative? In my experience, each time I have a job offer for more money, I go and talk to my current line manager. I explain the new job offer, and ask if they would like to counteroffer. 100% (<-- imagine 48 point bold font!) of the time, my line manager has been simultaneously emotionally hurt ("oh, he's disloyal for leaving") and unsupportive of matching compensation. In almost all cases, an external recruiter found me online, reached out, and had a great new opportunity that paid well. Who am I to look away? I'm nothing special as a technologist, but please don't fault me for accepting great opportunities with higher pay.
> Why would a company invest money in growing the technical skill set of an employee
What exactly is meant by "invest" here? In my career, my employers haven't done shit for me about training. Yet, 100% of them expect me to be up-to-date all the time on whatever technology they fancy this week. Is tech training really a thing in 2025 with so many great online resources? In my career, I am 100% self-trained, usually through blogs, technical papers, mailing lists, and discussions with peers.
At Taos, there was a monthly training session / tech talk on some subject.
At Network Appliance ('98-'09), there was a moderate push to go to trainings, and they paid for the devs on the team I was on to go to the Perl conference (the year it was just down the road, everyone - even the tech writers - went).
At a retail company that I worked at ('10-'14), they'd occasionally bring in trainers on some topic for about half a dozen of the more senior developers, who would then spread the knowledge out (part of that was a formal "do a presentation on the material from the past two weeks for the rest of your team").
However, as time went on and as juniors would leave sooner the appetite for a company to spend money on training sessions has dissipated. It could be "Here is $1000 training budget if you ask your manager" becoming $500 now. It could be that there aren't any more conferences that the company is willing to spend $20k to send a team to.
If half of the junior devs are going to jump to the next tier of company and the other half aren't going to become much better... why offer that training opportunity at all?
Training absolutely used to be a thing that was much more common... but so too were tenures of half a decade or longer.
This is why I never do internal job transfers. The total comp doesn't change. If I do an external job change, I will get a pay rise. I say it to my peers in private: "Loyalty is for suckers; you get paid less."
It's no surprise the market adapts to the new terms and conditions. But companies simply don't care enough to focus on retention.
I'm pretty sure it just comes down to bean-counting: "we have a new fulltime permanent asset for $100k" vs "we have a new fulltime permanent asset for $120k" is effectively the same thing, and there's a clear "spend money, acquire person" transaction going on. Meanwhile, "we spent $20k on an asset we already have" is.. a hard sell. What are you buying with that $20k exactly? 20% more hours? 20% more output? No? Then why are we spending the money?
It's certainly possible to dance around it talking about reducing risk ("there's a risk this person leaves, which will cause...") but it's bogged down in hypotheticals and kinda a hard sell. Sometimes I wonder if it wouldn't be easier to just fire staff for a week then re-hire them at a new salary.
You keep a good thing going, you buy oil for the machinery, you keep your part of the bargain and do the maintenance. You pay the correct price for the stuff you are lucky enough to have been getting on the cheap.
I like the directness of the question: "Why should I pay more when it won't burn down right this instant if I don't?" This is a question asked all over, and it is dangerous; keeping anything going requires maintenance and knowledge of how to maintain it. That goes for cars and it goes for people.
This is not business, it is miserly behaviour, it is being cheap.
The miser will find himself in a harsh, transactional, brutal world. Because that is the only way for people to protect themselves against him.
This incentive is entirely backwards. It should be "what are we losing by not spending that 20k?" You lose someone used to the company workflow, you waste any training you invested in them, you create a hole that strains your other 3-4 100k engineers, and you add a time strain on your managers, who have to spend time interviewing a new member.
If you really believe you can buy all that back for 120k as if you ran short on milk, you're missing the forest for the trees.
>Sometimes I wonder if it wouldn't be easier to just fire staff for a week then re-hire them at a new salary.
If society conditions the workforce to understand the issue, sure. But psychologically, you'd create an even lower-morale workplace. Even for a week, people don't want to be dropped like a hot potato, even if you pick it up later as it cools. People want some form of stability, especially in an assumed full-time role.
Employers get straight-up lazy when they have soft-negotiating employees they can ignore. This laziness will bite them.
I guess that's how we got here to begin with. We take a workforce and treat it as expendable instead of as a proper team.
I suppose it will vary per industry, but I can't imagine any other kind of engineering being comfortable just letting go of people mid-project because "we can afford to lose them."
I've started viewing developers that have never maintained an existing piece of software for over 3 years with skepticism. Obviously, with allowances for people who have very good reasons to be in that situation (just entered the market, bad luck with employers, etc).
There's a subculture of adulation for developers that "get things done fast" which, more often than not, has meant that they wrote stuff that wasn't well thought out, threw it over the wall, and moved on to their next gig. They always had a knack of moving on before management could connect the dots that all the operational problems were related to the person who originally wrote it and not the very-competent people fixing the thing. Your average manager doesn't seem to have the capability to really understand tech debt and how it impacts ability to deliver over time; and in many cases they'll talk about the "rock star" developer that got away with a glimmer in their eye.
Saw a post of someone on Hacker News the other day talking about how they were creating things faster than n-person teams, and then letting the "normies" (their words, not mine) maintain it while moving on to the next thing. That's exactly the kind of person I'd like to weed out.
You're falling for the exact same fallacy experienced by failed salesmen. "Why would I bother investing time in this customer when they're just going to take my offer to another dealership for a better deal?"
Answer: you offer a good deal and work with people honestly, because if you don't, you'll never get a customer.
What you say only works if everyone is doing it. But if you're spending resources on juniors and raises, you can easily be outcompeted and outpoached by companies using that saved money to poach your best employees.
Give a big enough raise and they won't want to be poached. You won't retain everyone, but your goal probably isn't to compete with Google to begin with. So why worry about the scenario of boosting a good junior from 100k to 150k but losing them to a 250k job?
In some ways you will also need to read the room. I don't like the mentality of "I won't hire this person, they are only here for money", but to some extent you need to gauge how much of them is mission-focused and how much would leave the minute they get a 10k counter-offer. Adjust your investments accordingly and focus on making something that makes money off of that.
Some genius MBA determined that people feel more rewarded by recognition and autonomy than by pay, which is actually true. But even so, all the recognition and autonomy in the world won't make you stay if you can make 50% more somewhere else.
The power structure that makes up a typical owners-vs-employees company demands that every employee be replaceable. Denying raises and paying the cost of churn are vital to maintaining this rule. Ignoring this rule often results in, e.g., one longer-tenured engineer becoming irreplaceable enough to be able to act insubordinately with impunity.
A bit bleak, but that's capitalism for you. Unionization, working at smaller companies, or employee-owned cooperatives are all alternatives to this dynamic.
But as someone who originally wanted to be a specialist (or at the very least T-shaped), I see a lot more problem in fostering specialists than generalists under this model. Sometimes you do just need that one guru who breathes C++ to come in and dig deep into your stack. Not always, but the value is irreplaceable.
But that means there's no need for entry-level glassblowers, and everyone in the field with any significant experience is super old. The pipeline has been dead for a while now.
Tech companies are betting that in 5 years, AI should be good enough to replace mid-levels.
Rinse and repeat with seniors 5 years after that.
Hard to say if that bet will pay off, or what the endgame would be; just the CEO commanding a company of AIs?
Plenty of skilled work requires a master’s or PhD. CS, for those who want a safe, secure job, looks like it’s going that way.
Not disagreeing that this is happening in the industry but it still feels like a missed opportunity to not hire juniors. Not only do you have the upcoming skill gap as you mention, but someone needs to instruct AI to do these menial/easy tasks. Perhaps it's only my opinion but I think it would be prudent to instead see this as just having junior engineers who can get more menial tasks done, instead of expecting to add it to the senior dev workflow at zero cost to output.
That said, you hit on something I've been feeling, the thing these models are best at by far is stuff that wasn't worth doing before.
I've also written a lot of python 2 in my career, and writing python 3 still isn't quite native-level for me - and the AI tools let me make up for my lack of knowledge of modern Python.
Basically this type of maintenance work for any sufficiently complex codebase. (Over 20k LOC)
When I was a QA intern / software dev intern, I did all of that junk.
I mean, they just want to write code without testing it? Or fixing the bugs that come out of it?
Everything turned out fine. Turns out you don't really need to be able to perform long division by hand. Sure, you should still understand the algorithm at some level, esp. if you work in STEM, but otherwise, not so much.
There were losses. I recall my AP physics professor was one of the old school types (retired from industry to teach). He could find the answer to essentially any problem to about 1-2 digits of precision in his head nearly instantly. Sometimes he'd have to reach for his slide rule for harder things or to get a few more digits. Ain't no one that can do that now (for reasonable values of "no one"). And it is a loss, in that he could catch errors nearly instantly. Good skill to have. A better skill is to be able to set up a problem for finite element analysis, write kernels for operations, find an analytic solution using Mathematica (we don't need to do integrals by hand anymore for the most part), unleash R to validate your statistics, and so on. The latter are more valuable than the former, and so we willingly pay the cost. Our ability to crank out integrals isn't what it was, but our ability to crank out better jet engines, efficient cars, and computer vision models has exploded. Worth the trade-off.
Recently watched an Alan Guth interview, and he made a throwaway comment, paraphrased: "I proved X in this book, well, Mathematica proved...". The point being that the proof was multiple pages per step, and while he could keep track of all the sub/superscripts and perform the Einstein sums on all the tensors correctly, why??? I'd rather he use his brain to think up new solutions to problems, not manipulate GR equations by hand.
I'm ignoring AGI/singularity type events, just opining about the current tooling.
Yah, the transition will be bumpy. But we will learn the skills we need for the new tools, and the old skills just won't matter as much. When they do, yah, it'll be a bit more painful, but so what, we gained so much efficiency we can afford the losses.
So, there are two parts to this:
The first is that a lot of those tasks are non-trivial for someone who isn't a digital native (and occasionally trivial for people who are). That is to say that I often found myself doing tasks that my bosses couldn't do in a reasonable time span; they were tasks which they had ALWAYS delegated, which is another way of saying that they were tasks in which proficiency was not necessary at their level.
This leads into the second part, which is that performing these tasks did not help me advance in relevant experience at all. They were not related to higher-level duties, nor did they endear me to the people who could have introduced me to such duties. My seniors had no interest in our growth as workers; anyone who wanted to see that growth had to take it into their own hands, at which point "junior-level" jobs are only worth the paycheck.
I don't know if it's a senior problem generally, or something specific to this cohort of Boomer/Gen-X seniors. Gun-to-my-head, I would wager the latter. They give enough examples in other arenas of public life to lend credence to the notion that they simply don't care what happens to their juniors, or to their companies after they leave, particularly if there is added hassle in caring. This is an accusation often lobbed at my own generation, to which I say, it's one of the few things our forebears actually did teach us.
Yet again, AI is just a cover for mismanagement.
We had code school grads asking for $110-$130. Meanwhile, I can hire an actual senior engineer for $200 and he/she will be easily 4x as productive and useful, while also not taking a ton of mentorship time.
Since even that $110k costs $140k, it's tough to understand how companies aren't taking a bath on $700/day.
Bear in mind these types can explain things like why word-alignment matters and train themselves into being net productive within a few weeks.
You can't have rent at 3.5k a month and not expect six figures when requiring in-office work. The old wisdom of "30% of salary goes to rent" suggests that kind of housing should only be rented if you're making 140k. Anyone complaining about junior costs in these areas needs to join in bringing housing prices down.
It's even more frustrating knowing those people went through an overly long gauntlet and prevailed over hundreds of other qualified would-be engineers. It's so weird seeing an entire pipeline built around minimizing this situation utterly fail.
Who knows if we'll even need senior devs in 5 years. We'll see what happens. I think the role of software development will change so much those years of technical experience as a senior won't be so relevant but that's just my 5 cents.
While the work seems to take similar amounts of time, I spend drastically less time fixing bugs; bugs that used to take me days or God forbid weeks are solved in minutes usually, sometimes maybe an hour if it's obscure enough. You just have to feed the model enough context, full stack trace, every time.
Man, I wish this was true. I've given the same feedback on a colleague's clearly LLM-generated PRs. Initially I put effort into explaining why I was flagging the issues, now I just tag them with a sadface and my colleague replies "oh, cursor forgot." Clearly he isn't reading the PRs before they make it to me; so long as it's past lint and our test suite he just sends the PR.
I'd worry less if the LLMs weren't prone to modifying the preconditions of the test whenever they fail such that the tests get neutered, rather than correctly resolving the logic issues.
The bad product managers have become 10x worse because they just generate AI garbage to spray at the engineering team. We are now writing AI review process for our user stories to counter the AI generation of the product team. I'd much rather spend my time building things than having AI wars between teams.
Which means, it stands to reason, that you'll need fewer of them. I'm really hoping this somehow leads to an explosion of new companies being built and hiring workers; otherwise - not good for us.
Depends on how much demand there would be for somewhat-cheaper software. Human hours taken could well remain the same.
Also depends on whether this approach leads to a whole lot of badly-fucked projects that companies can’t do without and have to hire human teams to fix…
I've found Opus 4.5 to be a big upgrade compared to any of the other models. A big step up, without the minor issues that were annoying and that I needed to watch out for with Sonnet and GPT-5.1.
It's gotten to the point where, if the models are offline or I run out of tokens for the 5-hour window or the week (with what I'm paying now), there's kind of no use in doing work. I can use other models to do planning or some review, but then I wait until I'm back with Opus 4.5 to do the code.
It still absolutely requires review from me and planning before writing the code, and this is why there can be some slop that goes by, but it's the same as if you have a junior and they put in weak PRs. Difference is much quicker planning which the models help with, better implementation with basic conventions compared to juniors, and much easier to tell a model to make changes compared to a human.
I guess it depends on the project type, in some cases like you're saying way faster. I definitely recognize I've shaved weeks off a project, and I get really nuanced and Claude just updates and adjusts.
Which means either devs will take over architectural roles (which already exist and are filled) or architects will take over dev roles. The same goes for testing/QA - these are already positions within the industry, in addition to being hats that we sometimes put on out of necessity or personal interest.
This is mostly a good thing provided you have a clear separation between solution exploration and actually shipping software - as the extra work put into productionizing a solution may not be obvious or familiar to someone who can use AI to identify a bugfix candidate, but might not know how we go about doing pre-release verification.
Going to throw out another anecdote here. At a company that a number of my friends work for (a fortune 50), they are currently making record profits that they loudly brag about during employee townhalls. They also are in the process of gutting multiple departments as fast as possible with little regard for the long term consequences. This is not the only company that I know of acting in this way (acting like they're about to go bankrupt when in fact they are seeing record profits).
To me the societal risk is that an entire generation of employees becomes extremely jaded and unmotivated, and fairly so. We used to work under the assumption that if our company is successful, then the employees would be successful. Record profits == raises for all, bonuses for all. And while we know that that connection was never that strong, it was strong enough to let us at least pretend that it was a law of the universe.
That fundamental social contract is now at its breaking point for so many workers. Who can really blame people for putting in minimal effort when they have so much evidence that it will not be rewarded?
That's all over now; the growth spurt of a young software industry has given way to maturity. We'll be navigating an employment environment much like what the norm is in other technical professions with tougher standards and fiercer competition for good jobs.
dismissing technical talent as "warm bodies" is exactly how the old guard of IBM/AT&T/Oracle fell to the new scrappy talent. I'm sure history will repeat itself again in due time.
> We'll be navigating an employment environment much like what the norm is in other technical professions with tougher standards and fiercer competition for good jobs.
If every other sector except healthcare weren't experiencing the same thing, you might have a point. This clearly isn't a problem limited to tech, though.
Today, a CEO who can turn in a few quarters of really solid earnings growth can earn enough to retire to a life of private jets. Back when CxO pay was lower, the only way to make that kind of bank was to claw your way into the top job and stay there for a decade or more.
The current situation strongly incentivizes short-term thinking.
With today's very high, option-heavy compensation a CEO making long-term investments in the company rather than cutting staff and doing stock buybacks is taking money out of his own pocket.
It's a perverse incentive.
Lip-Bu Tan, for instance, has performance targets on a five-year timeline, which are all negated if the stock falls below a certain threshold in three years. [1]
Or the ever-controversial Elon Musk, who certainly has an (also egregious) $1 trillion pay package, but it comes with some pretty extreme goals over 10 years, such as shipping 1 million Optimus robots [2].
All in all, we can debate about the Goodharting of these metrics (as Musk is keen to do), but I feel boards of these public companies are trying to make more long-term plans, or at least moving away from tying goals to pure quarterly metrics. Perhaps we can argue about the execution of them.
Note: I own neither of these stocks and my only vested interest is buying the S&P.
[1] https://www.cnbc.com/2025/03/14/new-intel-ceo-lip-bu-tan-to-... [2] https://www.bbc.com/news/articles/cwyk6kvyxvzo
But we only care about short term metrics now, so no one cares. They don't even care to develop the tools to understand it. It might as well not exist. Blame the young people and move on.
At this point in the tech industry, it'd be easier to name companies not doing this. Maybe Apple? I think they got around it by not renewing contractors. But I might have missed something.
>To me the societal risk is that an entire generation of employees becomes extremely jaded and unmotivated, and fairly so.
I sure am jaded. But I'm more motivated than ever in my goals. They used to be about becoming this knowledgeable IC who can dig deep into a domain, but they've definitely been shifting toward being able to sustain myself off my own talents. I'll grab short-term contracts and let my own products be the steady income.
(yeah, a lot easier said than done. But I have time to prepare for that).
>Who can really blame people for putting in minimal effort when they have so much evidence that it will not be rewarded?
Worse than that. Why put in effort when your reward for providing all that value is still getting the axe?
My industry is finally starting to see real moves at unionizing, but I hope tech as a whole is starting to wake up to this fact?
If you want to avoid getting laid off, make sure the product of your work is more valuable than your salary.
You can't outwork corporate greed, unless you're working for peanuts in a 3rd world country. Then you're truly irreplaceable (and still broke).
This entire discussion sounds crazy to me. If you want socialism, vote for socialism. If you want raw unfiltered capitalism, vote for the billionaire. You can't vote for the billionaire and expect safety nets. That's madness.
You are not wrong, but the contract is/was metaphorical. For a long time people were able to make a living for themselves by studying hard (usually STEM) and ending up with a career which paid off. That was the invisible "contract". Hell, I went to university for things which seem like academic navel-gazing, but I still got a good tech job on the other side. That's not the reality for a lot of graduates nowadays who take more practical degrees at the master's and PhD levels.
Again even if the literal statement is clearly false, it is the sentiment which matters, and this sentiment does not just apply to graduates. I think many just feel like working hard does not work anymore, especially in the face of housing, cost of living, job competition and social media flaunting the wealth of others.
I get the idea from my younger siblings: "Why try if you are already a loser."
Recessions like the GFC, the Dot Bomb, the early 90s, the Asian Financial Crisis, the early 80s, Stagflation, and others show otherwise.
The extended bull run that SWEs had from the early 2010s to 2022 was an outlier, and the whiplash being felt today is comparable to what law and finance grads faced in the 2010s, accounting majors in the 2000s, and Aerospace/MechE majors in the 1990s.
“If you can convince the lowest white man he's better than the best colored man, he won't notice you're picking his pocket. Hell, give him somebody to look down on, and he'll empty his pockets for you.”
It doesn’t set a legal standard that profits must be maximized, which would be impossible.
Socialism has a specific meaning, it's not just a label we get to put on behaviors that we - or rather, specifically you in this case - don't like.
Or more to the point, productivity has consistently outpaced pay for most of the US workforce since the mid-1970s. That's ~50 years that companies have been ripping you off. It's only now you notice, because rent/mortgage/school/medical have finally become so much larger than pay.
Well now you get to live through the Great Depression and study it up close.
The alternate way of looking at it is that the 50s to mid 70s era saw a period of unprecedented prosperity and now we are just seeing a reversion to the mean.
http://web.archive.org/web/20200428221848/https://www.nytime...
A social contract is an implicit agreement that everyone more or less accepts without anything being necessarily legally binding.
For example, the courtesy of two weeks notice in the US is a social contract: there’s nothing legally requiring it, but there are _social_ consequences (ie: your reference might be less positive) if you don’t follow it.
Everything that’s kind of in an employee’s favor is not socialism. You don’t have to like the idea of “work hard, help the company do well, get rewarded,” but that isn’t socialism. It’s just a thing you don’t like.
The top 10% of income earners in the US account for 50% of consumer spending. LMK if you think that's part of the contract. https://www.marketplace.org/story/2025/02/24/higher-income-a...
The second part of your sentence is not necessarily true. It might be true in some or even many cases, but it's certainly not something you can just assert & move on, as if it's a physical law.
I also think though that individual experiences of this kind are more about specific companies maturing than a widespread culture shift. A lot of people on these forums worked in tech companies that are relatively young and have changed a lot over the past two decades.
Well, we can trace that back to the 1920s, for one example.
>Do you people have some kind of contract with Tesla that I don't know about?
Are you aware of what a "social contract" is? There's nothing wrong with seeking to fill in gaps of knowledge.
>This entire discussion sounds crazy to me. If you want socialism, vote for socialism.
I'd be down for it, but this is almost orthogonal to the main point of the discussion. Social contracts exist in all forms of governing. Even rampant capitalism has the bare-bones social contract of "don't make your customers TOO angry so you can maximize extraction".
When I was starting, you were checked for potential as a trainee. In my case, options trading. They checked that you could do some mental arithmetic, and that you had a superficial idea of what trading was about. Along with a degree from a fancy university, that was all that was needed. I didn't know much about coding, and I didn't know much about stochastic differential equations.
A couple of weeks ago, a young guy contacted me about his interview with an options trading firm. This guy had spent half a year learning every stat/prob trick question ever. All those game theory questions about monks with stickers on their foreheads, all the questions about which card do you need to turn over, the lot. The guy could code, and had learned a bunch of ML to go with it. He prepared for their trading game with some really great questions to me about bet sizing.
I was convinced he was simply overly nervous about his prospects, because I'd never met someone so well prepared.
Didn't get the job.
Now I can assure you, he could have done the job. But apparently, firms want to hire people who are nearly fully developed on their own dime.
When they get their analyst class, I guess there is going to be nobody who can't write async python. Everyone will know how to train an ML on a massive dataset, everyone will already know how to cut latency in the system.
All things that I managed to learn while being paid.
You gotta ask yourself whether we really want a society where people have to already know the job before they get their first job. Where everyone is like a doctor: already decided at age 16 that this was the path they wanted to follow, choosing classes towards that goal, and sticking with it until well into adulthood. And they have to essentially pay to get this job, because it comes at the cost of exploring other things (as well as actual money to live).
If you attend a well-known college that bigcos hire from frequently, there's a lot of knowledge floating around about interview prep, hiring schedules, which companies pay the best, etc. Clubs host "interview prep workshops" where they teach the subject matter of interviews and host events (hackathons, case competitions, etc.) to help you bolster your resume for applying to these bigcos. So just by attending a better/fancier school, you'd have pretty decent odds of eventually getting a job at one of these prestigious places.
If you were to attend a less prestigious school, regardless of your aptitude or capability, the information asymmetry is so bad that you'll never learn of the prerequisites for even being considered for some of these roles. Not many upperclassmen will have interned at fancy employers, so they won't be there to help you drill dynamic programming/black-scholes/lbo models, and won't tell you that you need to have your applications prepped by a certain date, and won't tell you that you should be working on side projects/clubs, etc.
I suppose that the apprenticeship model biases towards people that already have connections, so perhaps inequality was already bad, whereas now we just have an information asymmetry that's more easily solvable.
With the way higher-ed works in the US, and the way certain schools opportunity hoard to an insane degree, that is effectively already the case for whole industries and has been so for decades at this point. It's practically an open secret that getting into some schools is the golden ticket rather than the grades you earn while there. Many top schools are just networking and finishing schools for whole "elite" industries.
Built most of the software of a company where I worked for 7y from humble beginnings to >80 people. Still gotta line up for a 4h on-site assessment! Built tons of free time projects, some more complex than anything one would usually build on the job. Still gotta have live coding interviews and no one can be arsed to even check my publicly available repos...
That'd be fine... meanwhile, the new loop we come into:
- okay, so what does your company need and do
Company: "that's under NDA/trade secrets, we can't tell you"
- okay. we can't see what you want, so you'll have to train them
Company: "we don't want to train people, they just need to hit the ground running"
- okay. we'll just let colleges train the fundamentals and have others figure it out
Company: "no one's training anyone anymore. Where did the juniors go?"
Even doctors have apprenticeship programs. An industry where no one wants to train the next generation is a doomed one. If the US doesn't do it, some other country will gladly take it up.
A smaller size company, perhaps in a lower COL city, might have a more "human" side to them, simply because they can't afford all the nonsense.
You don't need a fancy school to get into a top firm anymore. You have to master the hell out of the interview.
I'm sure that's true in some areas, but on our last hire I was shocked at the ridiculous lengths applicants would go to to avoid putting in even a minimum of effort. Like the Van Halen brown M&M test, we put a line in the middle of the job advert saying "If you've read this, put your favorite color at the top of your application message." Only a low double-digit percentage of people did that.
Honestly, on our next hiring round, I think I'm going to make people fill out a Google form to apply, have all of our job posts say "Apply at <URL>", and completely ignore any apps we get through Indeed or the like. We had a team of 3 people reviewing applications for an hour or two a day for a month, and most of the responses were just human slop.
We're stuck in a stalemate where the sheer volume of applications for employers to handle and applicants to send makes both sides take shortcuts, leaving both wondering why people aren't trying.
If somebody has to send in 300-500 applications (which is not unheard of) and answer the same questions till they go blind, it's not surprising that certain things are missing or people don't care. Applicants don't have any reason to believe their info isn't thrown in the trash by an LLM as soon as it is sent.
Lazy people will always be a problem but until there is transparency or trust developed I doubt we will see meaningful change.
That's leading to an escalation where because applicants believe their apps are just getting fed to the LLMs, employers have to use an LLM. ;-/
Let's not blame the people with no power in this situation.
TBH I can't blame them. You're submitting hundreds of applications, repetitively, with qualifications that barely matter because you're encouraged to apply anyway. You can only spend so many hours reading HR drivel (that at this point may or may not be AI-generated) before you focus on just finding the job title, salary, and location, and then slamming apply. It's just not worth editing my resume to add some weird qualifier if I don't even think I'm going to get a reply. It's another hoop.
It's the complete inverse of hosting Van Halen at your show. It'd be more like trying to make a cashier recite their company motto. They aren't that dedicated to any one role. They can't afford to be.
---
I don't know if it's feasible for your situation, but smaller teams tend to have candidates email their resume. It can still be LLM'd, but I will tend to pay more attention if I feel like I have a direct communication channel. Not yet another greenhouse application form. It leaves room to be more free form with my pitch as well.
ChatGPT was pretty useless when it first released. It was neat that you could talk to it but I don't think it actually became a tool you could depend on (and even then, in a very limited way) until sometime in 2024.
Basically:
- the junior hiring slowdown started in 2022.
- but LLMs have only really been useful in a work context starting around 2024.
As for this point:
> According to very recent research from Stanford’s Digital Economy Lab, published in August of this year, companies that adopt AI at higher rates are hiring juniors 13% less
The same point stands. The junior hiring slowdown existed before the AI spend.
But yeah, it's bad in general. Seniors are struggling too. This was cooking for even longer, but more mess got added to the stack.
The AI wave hasn't hit yet. It will in '26/'27.
Let's say you hire your great new engineer. Ok, great! Now their value is going to escalate RAPIDLY over the next 2-3 years. And by rapidly, it could be 50-100%. Because someone else will pay that to NOT train a person fresh out of college!
What company hands out raises aggressively enough to stay ahead of that truth? None of them, maybe a MANGA or some other thing. But most don't.
So, managers figure out fresh out of college == training employees for other people, so why bother? The company may not even break even!
That is the REAL catch 22. Not AI. It is how the value of people changes early in their career.
If salaries reflected productivity, you'd probably start out at near minimum wage and rapidly get raises of 100% every half year.
On top of that, if the junior is successful he'll probably leave soon after he's up-and-running b/c the culture encourages changing jobs every 1-2 years. So then you need to lock people down with vesting stock or something..
It doesn't seem easy at all. Even if you give aggressive raises, at the next interview they can fake/inflate their experience and jump to a higher salary bracket.
Hiring and training junior developers seems incredibly difficult and like a total waste of energy. The only time I've seen it work is when you get a timid autistic-savant type who is too intimidated to interview and change jobs. These people end up pumping out tons of code for small salaries and stay on for years and years. This is hitting the jackpot for a company.
In the current economic situation you can offer a junior 2x, maybe even 3x, less and still get candidates to choose from.
Also, there are juniors who are ready to compensate for lack of experience by working longer hours (though that's not something you would learn during hiring).
> The first few months at your first job you're probably a net loss in productivity.
It's true for a senior too; each company is different, and it takes time to learn the company's specific stuff.
I don't think the kinds of people who see a 50% raise and complain that it's not 100% are the kinds of candidates you want to hire anyway. I'd like to see more of that before deciding we tried nothing and ran out of ideas.
I didn't leave my first job because I was non-autistic. I left because I was paid 50k and the next job literally tripled my total comp. Oh, and because I was laid off. But to be fair, I was already mentally out the door by then, after 2 years of nothing but chastising, and was looking at the next opportunity.
I would have (outside of said chastising) gladly stayed if I got boosted to 75k. I was still living within my means on 50k.
>Hiring and training junior developers seems incredibly difficult and like a total waste of energy
If that's the attitude at large, we're all falling into a tragedy of the commons.
Sadly this is not as common as it should be - but I've also mentored folks at FAANGs who got promoted after 1y at the new-hire level because they were so clearly excelling. The first promotion is usually not very hard to attain if you're in the top quartile.
No biggie, just be the best in the interview stage and continue to be the best for years after that. It's that simple.
If colleges stayed up to date and taught valuable skills, the jump wouldn't be so steep!
If industry doesn't want to pay for training, they better pay bootcamps to overhaul themselves and teach what they actually need. I don't think universities will bend much more since they have their own bubble on their hands.
The boom-bust recession cycle is roughly every 10 years. You can't say that AI is impacting hiring when your data just looks like the typical 10 year cycle. Your data needs to go back further.
That being said, what's more likely going on:
1: There are always periods where it's hard for recent college grads to get jobs. I graduated into one. Ignoring AI, how different is it now from 10, 20, and 30 years ago?
2: There are a lot of recent college grads who, to be quite frank, don't work out and end up leaving the field. (Many comments in this thread point out how many junior developers just shouldn't be hired.) Perhaps we're just seeing many companies realize it's easier to be stricter about who they hire?
Ignoring AI, there is simply more competition and less human interface in the process to begin with. 10 years ago, you'd throw out maybe dozens of apps and study interview trivia (this was right before the "leetcode era", so not even that). 20 years ago you'd probably just wander around a career fair and stumble into your career. 30 years ago you were as close to shaking your manager's hand for a job as you'd ever be in the modern tech industry.
10 years ago, a reference from nearly anyone in the pipeline to the hiring manager guaranteed at least a look-see at you. Now it's 50/50 at best; "who you know" may not be enough anymore. And career fairs are now 90% advertising firms instead of actual talent acquisition.
>Perhaps we're just seeing many companies realize it's easier to be stricter about who they hire?
If you look at the hiring numbers, you see that hiring globally is in fact not slowing down. That's a bit of a tangent, but it may give a clue to the whole situation here.
Today you may not even get a human to see your resume after 100 job apps. It's not just brutal but a solitary experience. No feedback to improve upon, no advice to take.
Default "people have value because human attention solves problems", has become default "existing org structure has value because existing revenue streams are stable."
The idea of a company used to contain an implied optimism. "If we get capable people together, we can accomplish great things!" Now that optimism has been offloaded to the individual, to prove their worth before they can take part.
Amidst this influx of applicants, junior and intermediate staff began getting Senior titles to justify pay raises. Soon those exact same people were moving from job to job as a "Senior", but without the relevant criteria that would've qualified for that title a decade before. You can still see people get promotions without having accomplished anything, much less learned anything, but they did keep the lights on. Today there's a sea of "Senior" engineers that can basically write code (and not especially well), but lack all the other "non-coding" skills that Seniors should have.
Even if you hired 100K new Juniors tomorrow, there's nobody to train them, because most of the people working today are practically Juniors themselves. Each "generation" is getting worse than the one before, because they're learning less from the generation before, and not being required to improve. There's still good engineers around, but finding them is like playing Where's Waldo? - and you have to know what Waldo looks like, which you won't if you're not experienced!
The fix isn't going to be learning to network ("relational intelligence") and mentoring more. The fix is for us to stop letting the industry devolve. Treat it like the real engineering professions, with real school requirements, real qualifications, real apprenticeships, real achievements (and titles that aren't meaningless). Otherwise it'll continue to get worse.
Sadly not in "our" hands. At best, some director/product owner brings it up. Executives have a nice chuckle, and they continue to outsource to anywhere else. This US industry barely wants to hire Americans to begin with at this point.
We're gonna have to divorce from big tech and push more businesses that reflect our desires if we want true change. Or collectively bargain while we have the chance. I don't know what is more likely in this community.
I saw the title inflation happen in real time. When the boot camp floodgates opened, that was the beginning of the end of my faith in this field. I saw people with three months of create-react-app tutorials churning out garbage, while I was called upon to put out fires and fix things when they broke. I "did devops", and rapidly became shadow developer IT, helping incapable programmers fix bugs in codebases I wasn't even familiar with, better than they could. And I am truly not that great of a programmer! I just know how to read, reason, and use grep a lot. These aren't superpowers, but finding someone who can even reason through how to debug something is impossible these days.
I would love some sort of licensure or guild or standards, but I have no idea how we even begin to change that. Part of the problem is that companies don't want to change. It's cheaper to pay a few people nothing than it is to pay a lot of people a lot, and that shows no sign of changing. Maybe more planes have to fall out of the sky, I don't know. Maybe Windows has to become so buggy and unusable that multiple hospitals shut down for months on end. We don't just need a reckoning, we need a reckoning where we all wind up better on the other side.
I am squinting at the horizon, but still, all I see is darkness.
1. The industry cannot define the terms junior or senior.
2. Most seniors today are the prior generation’s juniors with almost no increase of capabilities, just more years on a resume.
The article asks about what happens when today’s seniors retire in the future. I would argue we are at that critical juncture now.
I highly doubt throwing even a 3-YOE "senior" of 2012 at a modern junior interview would turn out as well as you'd expect. The standards have gotten sky high. That doesn't mean they can't do the job; it means the industry created more hoops to jump through.
I agree to an extent about title inflation (and where the hell is the mid-level?), but I don't think people are confusing "juniors" here. It's new grads to, at best, 2 years of experience; not much controversy there. I also don't think the idea that the 2014 graduating CS class is smarter than the 2024 class would pass the sniff test.
I am an older gen-z and launching my career has felt nigh on impossible. At my first job, the allergy toward mentorship this article mentions was incredibly palpable. None of my several managers had management experience, and one of them openly told me they didn't want to be managing me. The one annual review I got was from someone who worked alongside me for a week.
Follow that experience up with a layoff and a literally futile job search, and it's hard to be optimistic about building much of a career.
It is insane how screwed over we are. I am about to turn 30 with 5 YoE and a PhD in ML, which is supposedly the cutting-edge stuff. Yet I have no prospects to even buy a tiny flat and start a “normal life”. AI eats its own tail; I have no idea what I should do or what to learn to have any sensible prospects in life.
And although it hasn't discouraged me, I have to admit that I've been burned by juniors when caught in the middle between them and senior leadership on output expectations or strategy because frankly it's much more challenging to mentor how to navigate company politics than it is to mentor professional coding acumen. I want to be humble here. I don't think that's the junior's fault.
It feels like these problems go a lot deeper than AI. Most shops want software teams that are either silently embedded black boxes that you insert rough instructions into and get working software as output or an outsourced team. We've all experienced this. It seems silly to deny that it's directly related to why it's so hard to mentor or hire juniors.
I think you succeeded overall at your goal! Thanks for replying. You encouraged me to go back and read your article more closely.
Well, that explains why AI exacerbates this. It's all they ever wished for, and they don't need to make do with that facsimile of "human interaction" anymore. It's not perfect, but that's a sacrifice they are willing to make.
Or you know, they just really want to be as cheap as possible in production (hence, outsourcing).
>It seems silly to deny that it's directly related to why it's so hard to mentor or hire juniors.
I'll give a slight BOTD here after my disdain above and admit that a small team probably isn't the best environment to train a junior. Not unless you either
a) truly believe that the skill set you need isn't out there, and you are willing to train it yourself to alleviate your workload, or
b) you are thinking long term efficiency and are willing to lose early productivity to power the future prosperity. Which, to be frank, is not how modern businesses operate.
And yes. Any teacher in any field (but especially education) will tell you that the star players make their day, week, and year. But the worst cases make you question your career. Our natural negativity bias makes the latter stick out more. Those in industry won't get star players as they are either filtered out by these stupid hoops or gobbled up for 100k above your budget by the big players. It's rough.
I have a friend of a friend in his mid-20s who finished a master's degree in data science focused on AI. There isn't a job for him and I think he's given up.
In Letters to a Young Poet Rilke responded to a young aspiring poet who asked how a person knows whether the artistic path is truly their calling:
> “There is only one thing you should do. Go into yourself. Find out the reason that commands you to write; see whether it has spread its roots into the very depths of your heart; confess to yourself whether you would have to die if you were forbidden to write. This most of all: ask yourself in the most silent hour of your night: must I write? Dig into yourself for a deep answer. And if this answer rings out in assent, if you meet this solemn question with a strong, simple "I must," then build your life in accordance with this necessity; your whole life, even into its humblest and most indifferent hour, must become a sign and witness to this impulse.”
How do I respond to this friend of a friend? Is data science or coding in general the path for you only if you would rather die than stop merging pull requests into main every day even when nobody is paying you?
Is coding the new poetry?
What do I tell this guy?
I don't know what your friend's disposition is, but I don't think many of us are ready to die cold in the streets scraping toward our goal. Survive first, and then figure out how to climb from there. Don't see setbacks as a sign of weakness, but as a part of life.
And no, coding is not the new poetry. I wish people would stop spamming this website with doomer nonsense like this.
The other place you will meet struggling artists is sports. Train several times a week, neglect your social life, your studies, just learn how to chase after a ball.
Only people who are crazy driven will actually do this. The ones who don't make it, they try to climb up from lower league clubs. They go on and on, carving out a career.
But most kids do not have a burning passion for anything. They are curious, they're smart, they want to explore the world. But they haven't found a calling. If they try to go through the eye of the needle, they find it's quite hard, because those paths are taken by guys with a mental lock on a certain career.
What to tell the guy? He's picked the subject that is the most useful for learning about the world. Go around and look at things. There's so much that a person who can code and can deal with statistics can apply himself to.
Can you give a few examples please?
There has been a cultural shift too. I don't know when it got started, but at least employees in the tech companies started to get more and more obsessed with promotions. The so-called career development is nothing but a coded phrase for getting promoted. Managers use promotion as a tool to retain talent and to expand their territories. Companies adapted to this culture too. As a result, people development increasingly became lip service.
Has anyone ever seen a manager mentoring ICs? I haven't. This is a senior/staff/principal responsibility.
lots of "seniors" via title inflation dont have fundamentals anyways - hence a lot of broken software in the wild & also perverse incentives like Resume driven development. A.I is built on badly written open source code.
because once you have the fundamentals, built a few things - you would've battle scars which makes someone a senior
not the 'senior' we see in big corps or places cosplaying where promos are based on playing politics.
Interesting observation. I have personally tried to avoid getting into people manager positions (as I believed I'd be Peter Principled) but always took it as my duty to share knowledge and mentor the curious and the hungry (and even the ones that are not so). It's actually a very rewarding feeling when I hear good things about people who learned with me.
This seems like a deeply flawed take on the dual-track IC-management ladder. Senior ICs don't keep plugging away by themselves because they're not managers; they just don't get people-management tasks. I think the leadership and mentorship they provide is harder than for me (a manager) because they don't have the hammer of a "manager" job title and need to earn all their credibility. I have not had a senior IC and above in more than 10 years that didn't have a significant amount of junior and intermediate development explicitly defined in their role, and the easiest way to get promoted is with leverage. Try to be 20% better than your peers with your contribution (hard). Or make 10 people 3% better (much easier).
This is because "management" includes a bunch of BS that few engineers want to actually deal with. Performance discussions, 1:1s, being hauled into mandatory upper-level meetings, not actually building things anymore, etc. If it was simply pairing with juniors from time to time to hack on things and show them cool stuff, it would be wonderful.
It's a double-edged sword too. I see it in my biz -- it's easier to spend 40 hours training a model to do things the way we like than to hire someone junior and spend a month+ on onboarding. We are hitting a wall at a certain point, with clients still wanting to talk to a real person, but I can see that changing in the next ~5 years. Zero idea what happens to those junior folks that used to get trained (me being one that sat through a 3mo onboarding program!).
I don't know. If we simply defer client conversations to LLMs, then companies will cut out the middlemen, which means fewer clients. We'll have our own little filter bubble of tech where everyone is talking to their black box to try and push out their ideas instead of working within the industry.
Not exactly an industry I want to be in. But I don't think it'll get to that point.
There is a fair bit of anecdotal evidence that junior hiring--at least in the software space--is fairly difficult currently. Going via internships at good schools etc. may be better, but I have to believe that coming off the street from bootcamps and the like is pretty tough.
https://metr.org/blog/2025-03-19-measuring-ai-ability-to-com...
You're totally right. 10 minutes on /r/cscareerquestions (without even sorting by `top`, though it's more brutal if you do) is enough to confirm it.
I normally wouldn't cite Reddit as a source, but this same subreddit was overflowing with posts on fending off recruiters and negotiating already-sky-high comp packages just two years ago. Seeing how quickly the tables turned is sobering.
Single-Payer health care would help our industry immensely if it came to pass.
Imagine having no fear any more.
It actually might help.
This is the model used in Eastern Europe and India - the vast majority of new grads are hired by mass recruiters like EPAM, WITCH, Deloitte, and Accenture at low base salaries but also the expectation that they self train and learn how to become productive SWEs, or they just stagnate at the low rungs. Japan, Korea, and China use a similar model as well.
But honestly, even FTE isn't much of a headache if I can hire a junior SWE for $60k-80k, invest in training them, and then bump salaries to market rate after they have matured. This is what a number of traditional F500s like Danaher [0], AbbVie [1], and Capital One [2] do via Leadership and Trainee Development Programs, and honestly, it's much easier to make a case to hire someone if they have a couple of years of real world work experience.
[0] - https://jobsblog.danaher.com/blog/leadership-development-pro...
[1] - https://www.abbvie.com/join-us/student-programs.html
[2] - https://www.capitalonecareers.com/get-ahead-with-early-caree...
> The most common answer from students when asked what they needed was a mentor who had just been in their shoes a few years ago, a surprising and heartening answer.
Mentoring is difficult; especially in today's world, where we are taught to despise older folks, and encouraged to treat everyone that we work with, as competitors.
For myself, I'm happily retired from the Rodent Rally, and find that LLMs have been a huge help, when learning new stuff.
This kid would not accept seniority, would constantly and publicly try to divert from the stack we worked with, would not take any input on his work without actively fighting the process, and would crowd the conversation at team meetings with never-ending Reddit-tier takes that contributed nothing other than filling his ego.
In the end I managed to convince my boss to get him out, and he now works in Cyber, which will probably cause even more damage in the long run, but at least I can now say "not my problem".
You should have stopped to think about why such a person was hired in the first place, while there are an endless supply of very talented, hard working, and honest young people who would never be given a chance at all.
But if I guess right, hiring is not seen as the responsibility of your company. And that's the core of the problem.
The hiring process is probably barely better than random, and, probably even closer to random for a junior hire.
Junior hires mostly don't know anything. So, you're pretty much hiring on "seems smart, curious, and enthusiastic" and praying a lot that you can train them. You're simply going to get misses.
This is one of the advantages that you get running "cooperative engineering" programs. You get to vet juniors before they get welded into your pipelines.
What world is this? This does not match my experience at all. Is this a common sentiment among your peers?
The people who will give you credit where it's due and lift you up are, in my experience, rarer than not, and almost always an older member, perhaps because they don't feel the need to prove themselves as much anymore.
Despising older folks has been a thing a long time, made famous by Zuck starting out. Now that he's older, I wonder if he still feels the same way...
and before that is was hippies with "Don't trust anyone over 30" which became deeply ingrained in at least American culture.
The difference, this time, is the CEO is now a younger person, when they used to always be someone in at least their forties (more often fifties or sixties).
9 times out of 10 it goes the other way around. Most young people have only had very negative interactions with their seniors, which has been wholly on the part of the senior. The current young generation is very respectful towards older people.
This has not been my experience.
I worked for a company that prized seniority, and I regularly dealt with folks older than me, more experienced than me, more capable than me, and willing to help me out. I worked there for almost 27 years, and it was awesome.
In my experience, I'm usually written off as an "OK Boomer," before I've even had a chance to open my mouth to prove it (or not).
My fave, is when we have a really promising text-only relationship, then, the minute they see me, it goes south.
This is kind of like saying “Get your flight hours in on Microsoft Flight Simulator and then Delta Airlines will hire you.”
All unpaid and in your spare time between your two minimum wage jobs, of course.
It's sad that the great opportunities have become more out of reach. But that seems almost expected for a job that's relatively easy, safe, and pays well.
I would much rather have that junior take some hacks at building some features with AI along with my guidance than context-switch over to AI myself just to walk it through a task, which means having to explain the business and our business rules over and over again.
To me, cutting out a junior developer adds more work for senior developers rather than making their work lighter.
That opportunity is now lost. In a few years we will lack senior engineers because right now we lack junior engineers.
All is not lost however. Some companies are hiring junior engineers and giving them AI, and telling them to learn how to use AI to do their job. These will be our seniors of the future.
But my bigger concern is that every year the AI models become more capable, so as the "lost ladder" moves up, the AI models will keep filling in the gaps, until they can do the work of a Senior supervised by a Staff, then the work of a Staff supervised by a Principal, and so on.
The good news is that this is a good antidote to the other problem in our industry -- a lot of people got into software engineering for the money in the last few decades, not for the joy of programming. These are the folks that will be replaced first, leaving only those who truly love solving the hardest problems.
If I were to graduate today, I'd be royally screwed.
But looking back on my 30 years of working (including in high school), every job I've ever had I got through personal referrals or recruiter reach-outs. I've gotten to interviews before but never actually taken a job without a personal connection.
Will say what's gotten me hired are my projects eg. robotics or getting published online for hardware stuff, I work in the web-cloud space primarily though, hardware would be cool but hard to make that jump
I feel that too. I am a self-taught dev. Got a degree, but not in CS. I don't know if I could get hired today.
Not sure how to fix it; feels like the entire industry is eating the seed corn.
They do not. Mentoring is rewarding work, but it is work.
I also find it objectionable that if you're simply not interested in mentoring, you're a jerk. Some people just aren't good at it, some people are genuinely swamped with existing responsibilities, and some people might just want to focus on their goals... and that's fine. There is no but.
Some folks <gasp> just don't like other people that much, and prefer working alone. Also fine, and kudos for being self-aware enough to not inflict yourself on people who probably wouldn't enjoy your oversight either. This should be celebrated as a communications success.
All of which brings me to the truth: if a company wants to mentor junior developers - and there are many, many excellent reasons to develop talent long-term - then they should make sure that they have suitably experienced people who have opted-in to mentorship, and make sure that their success metrics and remuneration reflect the fact that a significant portion of their time is acknowledged to be dedicated to mentorship. Otherwise, you're describing a recipe for legitimate resentment.
Likewise, if you're a junior developer desperate for mentorship... I understand that your instinct is to take any offer that will have you. But if you're able to have an honest conversation with the recruiter about what kind of mentorship culture exists in a company, you might be saving everyone a lot of pain and frustration.
We have an intern who is finishing a four-year computer science degree, has no clue what git is, has never used a log, and all he presents is AI garbage.
I find it profoundly depressing to try and teach someone who has no interest in the craft.
80% of the candidates I interview pass (a leetcode-style coding interview, as mandated by the company). This is actually annoying, because I'll probably have to raise the bar and start rejecting very good candidates.
I'm sorry but to me this part reads like a humorous phrase that's popular in some circles in my region which goes:
"Maybe <list of negative things, usually correct characterizations of the speaker>, but at least <something even worse>"
The companies I worked for used automated coding quizzes like Codility to weed out the worst applicants, but I suspect you're already doing that.
How is them knowing when binary search is useful relevant to what they'll be doing at work should they get hired?
Because our work is changing, faster than ever - not day to day, but over time. You need a foundation to handle that change. My 2X years of experience showed me that the people who have a strong foundation handle the transition well. If I'm going to hire and invest and mentor, I want that person to be successful.
Because it goes directly to their understanding rather than whatever rote memorization they’ve done. Anything that involves rote memorization can be done, better, by LLMs. What’s in short supply are people with good critical thinking skills and the ability to deal effectively with new problems.
They don't have to hire in any given country.
Given the current state of affairs in the US, I'd be moving the balance elsewhere too.
The US has rules to play by, the corporations are playing by them. Recently some of the rules seem to involve specific donations but it is the current set of rules.
In this case, the corporations have global reach, so they may decide that other countries people can be more productive per $. Whether importing or offshoring. Are they correct? That's up for debate.
If you are a US citizen, then this is the result of the policies of your country.
Go be upset at the US system, not a random outsider who is suffering the effects of it all as much as anyone else. :-)
I started in tech in the late 70s. I can say this break happened during the Reagan Years with a bit of help from the Nixon Years.
And then we have others claiming that AI is already having such a significant impact on hiring that the effects are clearly visible in the statistics.
AI companies could never make any money (statement about the future, and about AI companies, and finances). And AI could be having a visible effect on hiring today (statement about now, and about non-AI companies, and about employment).
They don't have to both be true, but they do not inherently contradict each other.
Wages for your typical engineer stopped going up 5+ years ago. The joke of senior FAANG engineers making $400k has been a meme for over 5 years. Yet inflation has run over 20% in those 5 years. Look at new offers for the majority of positions available at public tech companies. You're not seeing $500k offers regularly. Maybe at Jane Street or Anthropic or some other companies that are barely hiring - all of which barely employ anyone compared to FAANG. You're mostly seeing the same $350-400k/yr meme.
The reason we're not employing new grads is the same reason as the standards getting much more aggressive. Oversupply and senior talent has always been valued more.
Not true for Western Europe. Getting more than 60k euros yearly as a software engineer was hard in 2019, it's now basically impossible to get less than that.
It is also something which is likely to be quite harmful, since it selects for people who are great at networking over people who have good technical skills. Obviously interpersonal communication is important, but how well a 20 year old in University performs at it should not doom or make their career.
And even people with bad social skills deserve to exist and should be allowed into their chosen career. Being someone who does good work and is respectful, but not overly social, should be good enough.
It's the bloated junior salaries that have killed their market. I never like hiring juniors, I never like working with juniors, and I'd rather pay the extra 20-30% and get someone more experienced. I'm sorry, but if you don't get into FANG, you should basically be working for nothing until you have some experience. It's cruel, it's not fair, but it's just not worth it for the employer. Especially in today's world where there is no company loyalty.
All this BS about AI taking away the stuff that juniors did, in my field, software development, that was never the case. I never worked in a place where the juniors had different work than the seniors. We all did the same things, except the juniors sucked at it, and required handholding, and it would have been faster and better if they weren't there.
The real trick is finding companies that do very simple work, simple enough that juniors can thrive on day one. It won't be the best experience, but it is experience, and the rest is what you make of it.
They forgot to add in "Aging billionaires spend a trillion dollars on longevity research" which results in "110 year old Senior engineers still working"
Because those senior people will NOT be around forever. And they have killed their talent development and knowledge transfer pipelines.
Either direction you take it, this feels like a lose-lose situation for everyone.
People don't think in terms of shared commons and that if all companies are doing the same thing then there won't be much of a "senior" market left to hire.
You say you can just hire from outside the company - but what do you do when there is no one left to hire because the talent pool is completely drained?
Abandoning the junior employee will slowly drain that talent pool until there are no seniors available to hire, the "just hire from outside the company" plan doesn't work any more.
The economics of providing every new grad a $150k TC offer just doesn't work in a world with the dual pressures of AI and async induced offshoring.
Heck, once you factor in YoE, salaries and TCs outside the new grad range have largely risen because having experienced developers really does matter and provides positive business outcomes.
State and local governments needs to play the same white collar subsidy game that the rest of the world is playing in order to help fix the economics of junior hiring for white collar roles. This is why Hollywood shifted to the UK, VFX shifted to Vancouver, Pharma shifted to Switzerland, and Software to India.
It was always a weird US thing driven by huge companies and VCs. In other western, developed countries ~$50k equivalent would be normal. Even adjusting for other provided social benefits, there's still a long way down...
The majority of tech jobs are consolidated in the 3 primary tech hubs - the Bay, Seattle, and NYC.
A $110k new grad position in the Bay would end up becoming around a $130k-$150k TC offer, which lands at the median [0] for entry level SWE roles in the US.
Basically, median TC would need to shift to where the 25th percentile sits in the US today [0], or to where the 75th percentile sits in Canada [1] and the United Kingdom [2], both of which have taken advantage of the differential to a certain extent as well as offering subsidies to attract FDI from American tech companies.
When the 25th percentile of American entry-level SWE salaries ends up being the equivalent of the 75th percentile of both Canadian and British entry-level SWE salaries, something is very wrong, given that both countries have similar CoL to the US.
But sadly, in your specific case, based on your resume I think it would be difficult for someone like me to justify hiring you without references or a personal connection (which a lot of people are leveraging, which truly sucks for most new grads). My two cents to you is you may need to consider relocating to a tech hub, even if you are taking a cut compared to where you live or commuting to one even if you have to take a hellish multi-hour commute to the office 2-3 days a week.
[0] - https://www.levels.fyi/t/software-engineer/levels/entry-leve...
[1] - https://www.levels.fyi/t/software-engineer/levels/entry-leve...
[2] - https://www.levels.fyi/t/software-engineer/levels/entry-leve...
Building a GCC ends up costing around $60k-$100k per head in operating costs without subsidies, and deploying vibe coding tools to fully replace an entire dev team ends up in a similar price range (but conversely they could arguably enhance productivity for new grads and hires, e.g. Glean Search).
I have been unable to get a tech job for months, so I've looked into retraining in a new field, and every single one has some large up-front cost, via either paying for schooling or situations like mechanics needing to bring their own tools.
The standard US company has completely shed all training costs and put the expectations on laborers to train themselves. And you’re shit out of luck if their requirements change during your training as so many college graduates who picked comp sci are currently learning
If that were to actually happen, we'd wind up excluding many of our greatest technical performers while drowning in a sea of would-be middle managers. People skills matter, but so do many other strengths that don't always overlap with being naturally good at navigating interpersonal dynamics.
But some of the best "people" people that I've seen in my career have been the most technical, also. They were really good at being able to communicate the value of their solution, the problems it solves, and risks and rewards. They could get buy-in from stakeholders and other teams. They could listen empathetically when faced with issues and blockers. And they did so with authenticity and genuine care because they were passionate about software engineering.
I believe those are skills that can be learned and practiced and that you don't have to be necessarily "social" to grow in that area.
The continued reliance on say, COBOL, and the complete lack of those developers comes to mind.
Even before LLMs, there were periods recently where multiple companies had "senior only" hiring policies. That just inflated what "senior" was until it was basically 5 years of experience.
This time seems a bit different, however. There are both supply-side and demand-side problems. The supply of students is tainted with AI "learning" now. Colleges haven't realized that they absolutely have to crack down effectively on AI, or the signal of their degrees will wither to nothing. The demand side is also low, of course, since the candidates aren't good, and AI seems to be a good substitute for a newly graduated hire, especially if that hire is just going to use the AI badly.
So the irony here is that LLMs are actually going to be decent at COBOL by default - and at other uncommon/esoteric codebases. For example, I vibe-ported some Apple II assembly to modern C/SDL and... it works. It's stuff that I just wouldn't even attempt at manual development speed. It may actually be an easier path than training someone to do it, as long as you have a large enough test suite or detailed enough requirements.
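For anyone wondering what "a large enough test suite" looks like in practice, here's a minimal sketch of the golden-value style harness I mean. Everything in it is hypothetical (the routine name, the placeholder logic, the fixture values); the real fixtures would be input/output pairs captured by running the original code in an emulator, which the LLM-ported C function then has to reproduce:

    /* Minimal golden-test harness sketch for an LLM-ported routine.
       All names and values here are illustrative, not from a real port. */
    #include <stdio.h>
    #include <stdint.h>

    /* Stand-in for a routine the port produced, e.g. a checksum the
       original 6502 code computed over a block of bytes. */
    static uint8_t ported_checksum(const uint8_t *buf, int len) {
        uint8_t sum = 0;
        for (int i = 0; i < len; i++)
            sum ^= (uint8_t)(buf[i] + i);   /* placeholder logic */
        return sum;
    }

    /* Golden fixtures: input/output pairs you'd capture from the
       original running under emulation. */
    struct fixture { uint8_t input[4]; int len; uint8_t expected; };

    int main(void) {
        struct fixture cases[] = {
            { {0x00, 0x01, 0x02, 0x03}, 4, 0x00 },
            { {0xff, 0x10, 0x00, 0x00}, 2, 0xee },
        };
        int failures = 0;
        for (size_t i = 0; i < sizeof cases / sizeof cases[0]; i++) {
            uint8_t got = ported_checksum(cases[i].input, cases[i].len);
            if (got != cases[i].expected) {
                printf("case %zu: expected %02x, got %02x\n",
                       i, cases[i].expected, got);
                failures++;
            }
        }
        printf("%d failure(s)\n", failures);
        return failures ? 1 : 0;
    }

The point isn't this particular C; it's that once enough observable behavior is pinned down like this, you can let the model iterate on the port and trust the harness to catch regressions.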
apologise for inflicting this era on them and teach them to be entrepreneurial, teach them how to build, teach them rust on the backend, teach them postgres, teach them about assets maintaining value while money loses its value
tell them to never under any circumstances take on a mortgage, especially not the 50 year variety. tell them to stay at home for as long as possible and save as much as possible and put it into assets: gold, silver, bitcoin, monero
they must escape the permanent underclass, nothing else matters
That is some hard stereotyping being generalised on a platform with worldwide reach. You may wish to rethink what led you to that statement.
Although, with that statement and others from you recently, I'm guessing you have "lost your fucks and don't have any more to give", i.e. burnt out on it all.
Good luck. I hope you get to a place where you don't have to rely on shortcuts like stereotypes so much and have more energy to give to yourself and your life.
Despite everything, I like it that humanity exists. I want humanity to continue to exist. I reject any notion or attitude that would, taken to its logical conclusion, result in the extinction of humanity. And, even more so, that would result in the extinction of my family and lineage. For your sake, I hope that this is just edgy horseshit that you will soon grow out of.
The world is fundamentally different than it was 50 years ago and the same boomer platitudes no longer make sense. We are going to suffer a global economic collapse in the near future (conveniently when the generations to blame are retired or dead) and it's going to reshape our world and what labor looks like.
I just hope that my generation will be kinder to future generations than the last.
Firstly, we've been here before, specifically in 2008. This was the real impact of the GFC. The junior hiring pipeline got decimated in many industries and never returned. This has created problems for an entire generation (ie the millennials) who went to college and accumulated massive amounts of debt for careers that never eventuated. Many of those careers existed before 2008.
The long-term consequences of this are still playing out. It's delaying life milestones like finding a partner, buying a house, having a family and generally just having security of any kind.
Secondly, there is a whole host of other industries this has affected that the author could have pointed to. The most obvious is the entertainment industry.
You may have asked "why do we need to wait 3 years between seasons of 8 episodes now when we used to put out 22 episodes a year?" It's a good question and the answer is this exact same kind of cost-cutting. Writers rooms got smaller and typically now the entire season is written and then it's produced when the writers are no longer there with the exception of the showrunner, who is the head writer.
So writers are rarely on set now. This was the training ground for future showrunners. Also, writers were employed for 9 months or more for the 22 episode run and now they're employed for maybe 3 months so need multiple jobs a year. Getting jobs in this industry is hard and time-consuming and the timing just may not work out.
Plus, the real cost of streaming is how it destroyed residuals, because Netflix (etc.) pay far fewer residuals (because they're showing their own original content), and those residuals sustained workers in the entertainment industry so they could have long-term careers and that experience wouldn't be lost. The LA entertainment industry is in a dire state for these reasons, and also because a lot of it is being offshored to further reduce costs.
Bear in mind that the old system produced cultural touchstones and absolute cash cows eg Seinfeld, Friends, ER.
Circling back, the entire goal of AI is to displace workers and cut costs. That's it. It's no more complicated than that. And yes, junior workers and less-skilled workers will suffer first and the most. But those junior engineers would otherwise be future senior engineers.
What I would like people to understand is that all of this is about short-term decisions to cut costs. It's no more complicated than that.
For example, the death of optical media has had a massive impact on the entertainment industry, particularly movies. Matt Damon has spoken about this, on Hot Wings of all places [1].
Streaming began as an alternate path for monetizing old content other than cable TV syndication, and it was excellent for this in the early years. At that time it was bonus income.
But streaming also ushered in a golden age for watching serialized content so it's a mixed bag.
Loss of writers is just one factor. Filming fewer episodes, moving production out of the US, and the loss of residuals all contribute to killing this ecosystem.
Instead of only funding universities, provide lower risk curves for hiring juniors where the jobs are.
The big issue is the game theory of first mover disadvantage at play.
Whoever trains the junior loses all the investment when the junior jumps ship. This creates a natural situation of every company holding until the ‘foolish ones’ (in their eyes) waste resources on training.
Second mover advantage is real. This is what the government can fix.
There is an unbounded amount of opportunity available for those who want to grab hold of it.
If you want to rely on school and get the approval of the corporate machine, you are subject to the whims of their circumstance.
Or, you can go home, put in the work, learn the tech, become the expert, and punch your own ticket. The information is freely available. Your time is your own.
Put. In. The. Work.
New grads will be fed to the meat grinder with no regard; it's a closed shop unless you know someone.
We're not hiring a lot of rotary phone makers these days.
Who is hiring their own shoe-smith? It's been 30-ish years since my carpenter father last had work boots resoled.
It's almost as if... technology and economy evolve over time.
For all the arguments software people make about freedom to use their property as they see fit, they ignore that non-programmers' use of personal technology is coupled to the opinions of programmers. Programmers ignore how they are middlemen of a sort they often deride as taking away the programmer's freedom! A very hypocritical group, them programmers.
What's so high tech about configuration of machines with lexical constructs as was the norm 60+ years ago? Seems a bit old fashioned.
Programmers are biology and biology has a tendency to be nostalgic, clingy, and self selecting. Which is all programmers are engaged in when they complain others won't need their skills.
Furthermore, this is why the humanities matter: because human relationships matter.
where do you network? what do you network with these other humans on?
I do think I could get a job from my network, because I've worked in the industry for years and done good work; I'm a little skeptical of networking advice aimed at juniors/new grads. I at least ignore those LinkedIn requests.
Unfortunately, if you network to get a job, you're already months behind.
As I talk to college kids, I try to get them to find opportunities to network while they're in school, before they're desperate to get that first internship or job. They want to come at their search from a place of confidence, not anxiety.
There are so many meetups at universities (at least at the one near me) that they can mingle with the working world, and they stand out because they're there when it's mostly professionals.
Student or not, networking works best in-person when possible (conferences, meetups, professional events) where you get to know people and get truly curious about them. But after that, it involves following up and keeping the relationships warm, showing that you are interested in people professionally and can possibly help them with their problems, and that's no trivial investment.
If you do that enough, then you will build trust and rapport to create some opportunities, but it's admittedly a long game. It also has to be genuine or else people end up feeling used.
I think that there is a blocker that a lot of people have against networking in general because it feels gross and insincere. We've all seen people do it poorly, and so we avoid it, but it can be really fulfilling if done well.
I have had so many people reach out to me out of the blue when they're looking for a job, after literally leaving me on read in LinkedIn DMs. And giving them the benefit of the doubt, I meet with them and try to help them out, and then I never hear from them again after they find a job. It doesn't feel great, which is why I always suggest being intentional about nurturing your close professional relationships. It doesn't have to be anything grand; just being kind and courteous goes a long way.
This is terrible advice. Applying, cold calling, creating projects, job fairs, co-op opportunities, and ambushing are better ways. Hackathons, GitHub projects, or small businesses can help. 9/10 CEOs will ignore your cold outreach, but some won't.
Getting too busy making friends at the Greek houses will land you a marketing role if you are lucky. People need to associate you with your craft. If they know you as a social guy, you will get social roles. Any developer who is too social is suspect to many and ends up, at best, a PM.
When I was coming up people went into hardware/certifications to bridge the gap but moving from hardware to software was a gap too big for many as they became typecast.
New grads (myself included, back then) tend to discount Tier 2, because in their head the hiring process is looking for the single applicant with the best technical skills. In reality, it's a lot more of a "who can we get quickly, who won't have a negative impact on team output or morale". Parents, parents' friends, friends, and friends' parents all can fall into Tier 2, and absolutely should be asked about whether their workplaces are hiring, and if so, whether they could provide a recommendation.
Tier 3 is mostly useful for finding out about positions that don't necessarily get publicized, but depending on mutual connection to the shared acquaintance, might be willing to offer a recommendation.
With regard to where to network, that comes down to engaging with social gatherings that bring together a spread of people who aren't exclusively your direct peers. That's the stumbling block a lot of new grads run into: all their social time is spent with other new grads (or worse still, nobody at all). Clubs, parties thrown by friends' parents, university alumni events, hell, join the Oddfellows (YMMV, some lodges stopped recruiting after Vietnam). Conferences, whether technical or not. Hell, a step I recommend for everyone is going to bars and talking to strangers. Not the highest-density networking opportunity (except some gay bars in SF), but it's a pretty good environment to practice casual communication with people you have approximately nothing in common with, with very low stakes.
- go to events/conventions/join clubs related to programming (need to be located near a large city for this)
- talk to other students/self-learners and wait for them to get to the next step
I’ve been unemployed a long time and have been thinking of improving at networking. These are what I came up with.
However, it takes time.
If you need a job right now, it won't happen via ordinary networking, by which I mean networking with people whose job isn't recruitment.
If you think of networking as a pleasant way to keep some interesting ideas flowing and making some friends, circulation will get you things that you never even thought of.
(The best professional recruiters actually stir the pot for years and years before getting a return. Constantly keeping up with what various people are doing, just in case the time is right for someone to move on.)
I'm actually a bit surprised, because as a young guy I didn't do any networking beyond connecting with colleagues, which certainly helped. But I'm finding lots of young guys will reach out to me for advice. It's a good habit, but one I suspect more than half the population doesn't practice.
If you're a senior, maintain relations with last year's graduating class (and with your placement services people).
If you get an internship, keep in touch with people there.
If you are a new grad: go to alumni events. Go to alumni events! GO TO ALUMNI EVENTS.
If you are still in school: talk to your alumni and career office; they will be able to connect you better.
If you are in High School: consider a university with a co-op program.
The value of face-to-face connection should not be underestimated.
Again: this may be uncomfortable for some people, but it is the way of the world.
The article is self-serving in identifying the solutions ("do things related to the service we offer, and if that doesn't work, buy our service to help you do them better"), but it is a subject worth talking about, so I will offer my refutation of their analysis and solution.
The first point I'd like to make is that while the hiring market is shrinking, I believe it was long overdue and that the root cause is not "LLMs are takin' our jerbs", but rather the fact that for probably the better part of two decades, the software development field has been plagued by especially unproductive workers. There are a great deal of college graduates who entered the field because they were promised it was the easiest path to a highly lucrative career, who never once wrote a line of code outside of their coursework, who then entered a workforce that values credentialism over merit, who then dragged their teams down by knowing virtually nothing about programming. Productive software engineers are typically compensated within a range of at most a few hundred thousand dollars, but productive software engineers generally create millions in value for their companies, leading to a lot of excess income, some of which can be wasted on inefficient hiring practices without being felt. This was bound for a correction eventually, and LLMs just happened to be the excuse needed for layoffs and reduced hiring of unproductive employees[1].
Therefore, I believe the premise that you need to focus entirely on doing things an LLM can't -- networking with humans -- is deeply faulty. This implies that it is no longer possible to compete with LLMs on engineering merit, and I could not possibly disagree more. Rather than following their path forward, which emphasises only networking, my actual suggestion to prospective junior engineers is: build things. Gain experience on your own. Make a portfolio that will wow someone. Programming is a field that doesn't require apprenticeship. There is not a single other discipline that has as much learning material available as software development, and you can learn by doing, seeing the pain points that crop up in your own code and then finding solutions for them.
Yes, this entails programming as a hobby, doing countless hours of unpaid programming for neither school nor job. If you can't do that much, you will never develop the skills to be a genuinely good programmer -- that applied just as much before this supposed crisis, because the kind of junior engineer who never codes on their own time was not being given the mentorship to turn into a good engineer, but rather was given the guidance to turn them into a gear that was minimally useful and only capable of following rote instructions, often poorly. It is true that the path of the career-only programmer who goes through life without spending their own time doing coding is being closed off. But it was never sustainable anyways. If you don't love programming for its own sake, this field is not likely to reward you going forward. University courses do not teach nearly effectively enough to make even a hireable junior engineer, so you must take your education into your own hands.
[1] Of course, layoff processes are often handled just as incompetently as hiring processes, leading to some productive engineers getting caught in the crossfire of decisions that should mostly hurt unproductive engineers. I'm sympathetic to people who have struggled with this, but I do believe productive engineers still have a huge edge over unproductive engineers and are highly likely to find success despite the flaws in human resource management.
I have been seeing an uptick of articles on HN where someone identifies a problem, then amps it up a bit more and then tells you that they are the right ones to solve it for a fee.
These things should not be taken seriously and upvoted.
It's just an app, not a service, that my husband and I built (and quit our jobs for) that has a generous free trial. (Technically, right now it's completely free because it's in early access, so if you never upgrade, you could use it for free forever.)
The CTA at the end was just in an effort to talk to more people (for free) and see how we can help and make our software better. I come from the DevOps world, and they always say you have to first know how to do something really well manually before you can automate it, and that's what we're trying to do by talking to people (for free).
Thanks for giving it some thought and for your perspectives, they really help.
So, from an individual's perspective, figuring out how to meet people who will help you sidestep the "unwashed masses" pile of applications is probably the next most important thing after technical competence (and yeah, ranking above technical excellence).
That's exactly what the portfolio is for. Having an actual body of work people can look at and within a couple of minutes of looking think "wow, this person will definitely be able to contribute something valuable to our project" will immediately set you apart from every applicant who has vague, unreliable credentials that are only extremely loosely correlated with competence, like university trivia. You do need to get as far as a human looking at your portfolio, which isn't a guarantee on any given application, but once you get that far your odds will skyrocket next to University Graduate #130128154 who may have happened to get human eyes on their application but has nothing else to set them apart.
Good luck with causation vs. correlation given the rise of LLMs.
Not to mention I'm the only white person on my team other than the owner/operator. They already brought in bots of sorts from overseas. The constant drive to cheaper labor and gutting of the American middle class has been vast compared to the suffering the industry will have under junior developers using AI. It's definitely made my job easier. And I really don't care. No one cared about me. I have relatively low pay, no health insurance, and no 401K. When the last person left, management replied to his goodbye email saying he'd be replaced in a week. And then they proceeded to try to hire someone in Mexico City. Maintain the same time zone, but pay 3rd world wages and likely to have coercive control over them through desperation. Never found anyone.
I have no love for this industry or any of the "woes" it'll have with AI. Overall it's going to lead to lower wages and fewer jobs. For those out there producing "AI slop", I support you. It's hardly what they deserve, but they've earned it.
The general population is being rapidly sacked as a 'necessary' expense of criminal elites.
No one should be happy about this.
we’re so cooked
It wasn't too long ago that it was common to read threads on HN and other tech fora about universities graduating software engineers seriously lacking coding skills. This was evidenced by often-torturous interview processes that would herd dozens to hundreds of applicants through filters to, among other things, rank them based on their ability to, well, understand and write software.
This process is inefficient, slow, and expensive. Companies would much rather be able to trust that a CS degree carries with it a level of competence commensurate with what the degree implies. Sadly, they cannot. Even today, they still cannot.
And so, the root cause of the issue isn't AI or LLMs; it's universities churning people through programs and granting degrees that oftentimes mean very little other than "spent at least four years pretending to learn something".
If you are thinking that certain CS-degree-granting universities could be classified as scams, you might be right.
And so, anyone with half a braincell will, today, look at the availability of LLM tools for coding as a way to stop (or reduce) the insanity and get on with business without having to deal with as much of the nonsense.
Nobody here makes a product or offers a service (hardware, software, anything) for the love of the art. We make things to solve problems for people and companies. That's why you exist. Not to look after a social contract (as a comment suggested). Sorry, that's nonsense. The company making spark plugs makes spark plugs; they are not on this planet to support some imaginary public good. Solving the problem is how they contribute.
And, in order to solve problems, you need people who are capable of deploying the skills necessary to do so. If universities are graduating people who can barely make a contribution to the mission at hand, companies are going to always look for ways to mitigate that blocking element. Today, LLM's are starting to provide that solution.
So it isn't about greed or some other nonsense idealistic view of the universe. If I can't hire capable people, I will gladly give senior engineers more tools to support the work they have to do.
As is often the case, the solution to so many problems today --including this one-- is found in education. Our universities need to be set up to succeed or fail based on the quality of the education they deliver. This has almost never been the case. Which means you have large-scale degree-farming operations granting degrees whose holders can easily be dwarfed by an LLM.
And don't think that this is only a problem at the entry level. I recently worked with a CTO who, to someone with experience, was so utterly unqualified for the job that it was astounding he had been given the position in the first place. It was clearly a case of him not knowing just how much he didn't know. It didn't take much to make the case for replacing him with a qualified individual or risk damage to the company's products and reputation going forward.
A knowledgeable entry-level professional who also has solid AI-as-a-tool skills is invaluable. Note that first they have to come out of university with real skills. They cannot acquire those after the fact. Not any more.
NOTE: To the inevitable naive socialist/communist-leaning folks in our mix. Love your enthusiasm and innocence, but, no, companies do not exist to make a profit. Try starting one for once in your naive life with that specific mission as your guiding principle and see how far you'll get.
Companies succeed by solving problems for people and other companies. Their clients and customers exchange currency for the value they deliver. The amount they are willing to pay is proportionate to the value of the problem being solved as perceived by the customer --and only the customer.
Company management has to charge more than the mere raw cost of the product or service for a massive range of reasons that I cannot possibly list here. A simple case might be having to spend millions of dollars and devote years (=cost) to creating such solutions. And, responsible companies, will charge enough to be able to support ongoing work, R&D, operations, etc. and have enough funds on hand to survive the inevitable market downturns. Without this, they would have to let half the employees go every M.N years just because of natural business cycles.
So, yeah, before you go off talking about businesses like you've never started or run a non-trivial anything (believe me, it is blatantly obvious when reading your comments), you might want to make an attempt to understand that your Marxist professors or sources had absolutely no clue, were talking out of their asses, never started or ran a business, and everything they pounded into your brains fails the most basic tests against objective, on-the-ground, skin-in-the-game reality.
I went to a very bottom-tier school with a piss-poor reputation (but no debt!).
That didn’t stop me from getting employed, because employers were looking for workers when I started my employment journey.
OK, you're one of those people. Good grief, get a grip.