Ask HN: Would you hire a "vibe coder"?
1. For the team leaders / engineering managers out there, would you hire someone that was a self-proclaimed "vibe coder"?

2. What, if any, amount of education in programming-related topics (cs, history, languages, architecture, etc) should be required before entering the workforce?

2b. Do you think a potential rise in "vibe coding" is going to make it more difficult for less experienced workers to actually learn on the job?

Absolutely not. Vibe code is barely sufficient for an unskilled would-be founder to throw together a trash proof of concept in hopes of attracting VC money. The results are not production-quality for an ongoing business; the code is a sloppy average of code hoovered indiscriminately from places like Stack Overflow, Reddit, and even Quora. The tools have no concept of performance, security, consistency, maintainability, or fitness for purpose.

And before you say, "OK but they will improve, right?", let me give you an analogy.

We can build ladders to climb to the moon, we just need better ladder technology. Today, we can build very tall self-supporting ladders. We cannot yet build one tall enough to reach the moon, but by next year, or 2028 at the latest, we'll have ladders to the moon, or failing that, ladder technology that, in principle, will be sufficient to reach the moon.

Now replace "ladders to the moon" with "LLMs to AGI".

https://cendyne.dev/posts/2025-03-19-vibe-coding-vs-reality....

From my limited experience with AI coding, I wouldn't hire a "vibe coder". I'd be less concerned about specific language experience, though, as long as the team already had a strong foundation in the language they were using and the new hire had experience with similar languages and domains. I have a strong background in Java, a decent background in Go, and virtually no background in Python. I've been using AI to help me write Python, and I feel my prior programming experience has given me a pretty good understanding of what the generated code does and where it's incorrect, even without being an expert in Python. If I were hiring and we made use of AI tools during development, I'd have no concerns about hiring a .NET developer for a Java position if their fundamentals were solid. I wouldn't want to hire a Java developer for embedded systems work in C, since even if the languages have syntactic similarities, the domains are incredibly different.
No, I would not hire a vibe coder. I think vibe coding will make cowboy coding look rigorous and make debugging next to impossible.

I already interview less experienced candidates who ultimately have no ability to code. Vibe coding will make that problem worse. It will not make it harder for someone who actually can code to get a job, IMHO.

Relatedly, there is a YC company right now that states on its website that it will ONLY hire "vibe" coders.

And you're expected to work 16 hour days. And weekends. And for about half the pay of a regular coder.

I don't know how they wrote that job listing with a straight face.

I see worse and worse software being produced in the future. Vibe coders won't understand half the code an LLM spits out.
I would, if they're a great dev when they aren't "vibe coding". That they called themselves that would certainly make me more cautious about them, though.
1. I would see that as a red flag.

2. Any standard education in CS or SWE, AI hasn't changed anything about it IMHO. Only people hyping ChatGPT believe that the world has changed.

2b. It doesn't change anything since "vibe coding" (aka "code vomit") has no value to me. I want to work with juniors eager to learn, and vibe coders are not part of that group.

I already had bad experiences with "vibe coders" who can't explain or debug their code. They are like "pigeon CEOs" and only add problems without bringing solutions.

I would, under my understanding of what vibe coding is, provided that the prompt and vibe check processes aren’t overly limited.

My understanding of vibe coding is that you prompt, the LLM does something, and if it passes the vibe check you go with it, and if not, you do something to revert it, and afterward you try another prompt. Reverting after it doesn’t pass a vibe check could be actual reverting or a forward fix.

If you read stuff and do research and spend time thinking when prompting or doing a vibe check, I think this is a valuable activity.

Define "vibe check" in a way that communicates some kind of rigor, an understanding of the code, a repeatable process, a reliable path to good software. Does it just mean "looks OK to me right now"?
A vibe is intuition, so it can't involve too much deliberation, or else it isn't vibe coding. However, the intuition that comes after taking some time is different from the one that follows a quick glance. For instance, deciding on a pull request after reading it at a measured pace is different from deciding after skimming it. So yes, "looks OK to me right now", or LGTM, as in a pull request review, is what I have in mind.
Intuition comes from deep understanding, a set of learned heuristics. That implies experience that lets you skim and trust your quick judgment. And it informs the probability you assign to the correctness of your judgment.

As I understand it, vibe coding and "vibe check" refer merely to a "looks good to me" feeling with no deep understanding and no awareness of what you don't know, which ignores security holes and edge cases.

If my kids “vibe code” a fort with cardboard boxes and blankets they have a different intuition about stability, security, longevity, and the purpose of a fort than I do.

I guess a full time vibe coder would develop an intuition based on experiences with vibe coding and launching stuff and getting feedback. This process would increase their level of understanding and intuition.
We will all form an intuition about "vibe coding" before too long, and probably not a positive one. I'd suggest that people "launching stuff" in hopes of feedback label their work as "vibe coding" to warn prospective users, but for now at least it will be obvious, just as a child's cardboard fort obviously can't pass for a real fort.

Someone could "vibe bake" cakes by trial and error with the help of AI, then try to sell them and get feedback. That might work, or they might poison someone. Alternatively, they could study baking to develop actual skills and understand what they're doing.