I'm deeply skeptical of CEOs being "built different" like some people are arguing here. If Elon can be CEO of three companies and the founder of a couple more while also finding time to tweet 50+ times a day, having a failed and embarrassing stint trying to optimize the federal government, and getting K-holed at parties, then the demands of the job can't be that rigorous.

If anything, I would argue that strategic decisions can actually be automated or made by broader consensus. With that handled, all that's left is the cartel that CEOs have invented to justify their exorbitant pay packages.

> With that handled, all that's left is the cartel that CEOs have invented to justify their exorbitant pay packages.

CEO compensation is determined by board committees mostly made up of other CxOs. They write letters to each other's shareholders about how valuable CEOs are to build up the edifice.

I wish my compensation were determined by fellow engineers who "truly know my worth". I'd pay it forward if I were on a committee determining colleagues' pay packets.

pan69 · 19 hours ago
Works kinda like "industry awards" where a bunch of companies give awards to each other.
Well, be the change you want to see. CEOs know that part of the justification for their existence relies on solidarity with each other - and that they can collectively bargain with institutions to remind them of that justification.
duxup · 17 hours ago
I suspect Elon's value proposition now isn't really anything he does; it's stockholders' fear that their companies will be valued like any other company, and that the stock will tank if Elon isn't around.

Sort of a stock suicide pact ...

I'd advise you take a look at some of Musk's companies:

- Tesla is the top seller of EVs in the US, beating century-old companies.
- SpaceX has left public institutions like NASA and ESA in the dust despite their vastly bigger budgets.
- Although it joined late, xAI is now firmly in the top 4 of AI companies worldwide (OpenAI, Anthropic, Google, xAI).

What's the common element between these successes?

> What's the common element between these successes?

Financial engineering.

> xAI is now firmly in the top 4 of AI companies worldwide

Lolol literally no one thinks this

That Elon Musk let the competent people do their job and didn't meddle too much? Did you know that Cybertruck, Starship and Twitter are the projects where Musk has let his competence "shine" the most?
It's been long-since established that the only reason early Musk companies survived was employees learning how to manage up and manipulate Musk into doing the right things early on...
> If Elon can be CEO of three companies and the founder of a couple more while also finding time to tweet 50+ times a day, have a failed and embarrassing stint in trying to optimize the federal government, and get K-holed at parties then the demands of the job can't be that rigorous.

Except, Elon has been largely successful at being the "CEO" of these companies because he attracts talent to them. So...either:

(1) Businesses still will require human talent and if so, I don't see how an AI bot will replace a CEO who is necessary to attract human talent.

OR

(2) Businesses don't require any human talent beneath the CEO and thus a CEO is still necessary, or at least some type of "orchestrator" to direct the AI bots beneath them.

From contact with former SpaceX and Tesla engineers, these companies attract talent in spite of the CEO.
Keep in mind, SpaceX and Tesla are both in industries where people don’t actually have that many other options. If you really want to see what talent Elon is able to attract, you need to look at his other companies like X.
My experience would lead me to disagree: Elon's companies tend to be highly competitive to apply to, and working for one is still quite prestigious among people hiring. If young-engineer support means anything, their merch is still quite popular among college engineering students.
I think it just means he’s not the real CEO in any of them and has to have someone below actually running things
That someone actually running things is usually called COO, or Chief Operating Officer. That person operates the company.
>That person operates the company.

s/operates/operates on/

So they are a surgeon? Wouldn't be surprised at the damage they cause, considering the business results of so many companies.

What are you saying? Seems like you created your own strawman with that sed phrase.
I mean, you're not wrong. SpaceX's COO, Gwynne Shotwell, is known to handle a lot of the day-to-day stuff, and I feel that further reinforces the point. If she can handle all of that in her role as COO, then what's the point of a CEO?
A CEO will set the grand vision, long term goals, and direction for the company - typically approved by the board (which the CEO has to convince). The COO literally operates the company, while the CEO will nudge them in certain directions to accomplish broader goals.
> A CEO will set the grand vision, long term goals, and direction for the company

I thought determining vision/goals/direction was the responsibility of the board. The Chief Executive Officer is supposed to execute the board's wishes.

Alupis · 54 minutes ago
A board is often a consulting group. They're there to see the 30,000ft view of what's going on, areas that need focus, and suggest grand strategy/guidance. A board is usually composed of executives from other companies. The CEO is usually the one selling the board on their vision and execution. The board acts like guard rails for that vision.
Hot take: Musk is a great CEO. He's a horrible person, but I feel it's undeniable that his weight behind a project greatly increases the chance of interesting and profitable things happening (despite the over-optimistic claims and missed deadlines). I think he achieves this in large part _because_ he is an asshole, tweeting all the time to drum up publicity, being notorious for doing K, being too optimistic about what can be achieved, etc. I think somebody can be a good CEO without being such a jerk, it's just that Musk doesn't take the good-person strategy. And the bad-person strategy works well for him.

A CEO's job is (roughly) to maximize a company's valuation. It is not to run the company themselves, not to be nice, not to improve the world. I'm not claiming this is what _should_ be, just how it _is_. By this metric, I think Musk has done really well in his role.

Edit: Tangentially related -- at the end of the musical "Hadestown", the cast raise their glasses to the audience and toast "to the world we dream about, and the one we live in today." I think about that a lot. It's so beautiful, helps enforce some realism on me, and makes me think about what I want to change with my life.

> being too optimistic about what can be achieved

It's called "lying to customers and investors".

> And the bad-person strategy works well for him.

Worked. Tesla is not doing that well recently. Others are a bit better.

The first part works because otherwise reusable rockets wouldn't have been invented (or maybe they'd have been invented 20 years later). It's the same as with Steve Jobs: the Android guys were still making prototypes with keyboards until they saw the all-screen interface of the iPhone. Sometimes it requires a single individual pushing their will through an organization to get things done, and sometimes that requires lying.
> The first part works because otherwise reusable rockets wouldn't have been invented…

Maybe, maybe not. We often see technology reach a threshold that allows for sudden progress, like Newton and Leibniz both coming up with calculus at around the same time (https://en.wikipedia.org/wiki/Leibniz%E2%80%93Newton_calculu...), or Darwin rushing to publish On The Origin of Species because someone else had figured out the same thing (https://en.wikipedia.org/wiki/Alfred_Russel_Wallace).

SpaceX benefited immensely from massive improvements in computing power, sensors, etc.

It did, and it needed a direction to be pushed in. Are you familiar with the great man theory of history [0]? It is no different here (well, historians these days use a blend of great man theory and historical materialism, as in your examples, since no one theory explains the majority of historical changes).

[0] https://en.wikipedia.org/wiki/Great_man_theory

How does one distinguish between a great man and a lucky one?
Historians don't care about such a distinction, of course. Was Genghis Khan lucky when he conquered half the world?
I mean, read his Wiki entry, and you’ll discover he got a really nice coat as a wedding present, which he regifted to a powerful patron.

You can decide if that’s a touch of luck. I’m sure he had a few near misses in combat with an element of luck, too.

That's exactly my point: it's not all luck; of course it was his skill in uniting the tribes, which no one had done before.
It is not all luck, correct.

Some of it is luck. Often quite a bit.

> The first part works because otherwise reusable rockets wouldn't have been invented (or maybe they'd have been invented 20 years later).

I do not want to take credit away from SpaceX for what they achieved. It sure is complex. But it's also possible to give someone excess credit by denying others their due. I don't know which part of 'reusable rockets' you are talking about, whether it's the reusable engines and hardware or the VTOL technology, but none of that was 'invented' by SpaceX. NASA had been doing it for decades before that, but never had enough funding to put it all together.

On reusable hardware and engines, the Space Shuttle Orbiter is the obvious example: a crewed upper stage that entered orbit and was reused multiple times over decades. SpaceX doesn't yet have an upper stage that has done that; the only Starship among the nine to even survive reentry never entered orbit in the first place. As for a 'reusable engine', do you need a better example than the RS-25/SSME of that same orbiter?

Now let's talk about VTOL rockets. Weren't the Apollo LMs landing and taking off vertically back in the 1960s? NASA also had the 'Delta Clipper' experiment in the 1990s that did more or less the same thing as SpaceX's Grasshopper and Starship SN15: 'propulsive hops', multiple times. Another innovation at SpaceX is the full-flow staged combustion cycle used in the Raptor engine. To date, it is the only FFSC engine to have operated in space, but both NASA and the USSR had tested such engines on the ground. Similarly, Starship's silica heat tiles are entirely of NASA heritage, something they never seem to mention in their live telecasts.

I see people berating NASA while comparing them with SpaceX. How much of a coincidence is it that the technologies used by SpaceX fall squarely under NASA's expertise? The real engineers at SpaceX wouldn't deny those links. Many of them were veterans who worked with NASA to develop them. And that's fine. But it's very uncharitable not to credit NASA at all. The real important question right now is: how many of those veterans are left at SpaceX, improving these things? Meanwhile, unlike SpaceX, NASA didn't keep getting government contracts no matter how many times they failed; NASA would find their funding cut every time they looked like they had achieved something.

> It's the same as Steve Jobs, the Android guys were still making prototypes with keyboards until they saw the all screen interface of the iPhone.

Two things that cannot be denied about Steve Jobs are that he had an impeccable aesthetic sense and a larger-than-life image needed to market his products. But nothing seen in the iPhone was new even in 2007. Full capacitive touch screens, multi-touch technology, etc. were already in the market in some niche devices like PDAs. The technology wasn't advanced enough back then to bring it all together; Steve Jobs had the team and the resources needed to do it for the first time. But he didn't invent any of those. Again, this is not to take away credit from Jobs for his leadership.

> Sometimes it requires a single individual pushing their will through an organization to get things done, and sometimes that requires lying.

This is the part I have a problem with. All the work done by the others is just neglected. All the damage done by these people is also neglected. You have no idea how many new ideas from their rivals they drive into oblivion so as to retain their image. Leaders are a cog in the machine, just like everyone else working with them to generate the value. But this sort of hero worship, neglecting everyone else and their transgressions, is a net negative for the human race. They aren't some sort of divine magical beings.

mc32 · 19 hours ago
In many companies, probably being the main override. Trust but verify.
prawn · 17 hours ago
Maybe a bit like an ultimate funnel directing the broader effort of the company. That, plus brand/figurehead.
qq66 · 13 hours ago
Elon doesn't bring huge amounts of time to his companies; he brings some sort of skill which I don't know how to characterize but which empirically must exist, given the level of repeatable success he's had.

If there were a job description to "throw this football 50 yards into a trash can, a couple of times per week" I wouldn't be able to do the job at all, but an NFL quarterback might be able to do the job for 5 different companies while also Tweeting 50 times a day.

Maybe he's a brand. The skill is that investors will put money into a company that has his name on it, and companies with money succeed.
qq66 · 10 hours ago
No, it's clearly much more than that. He was able to get Grok to a frontier-level LLM while Apple and Microsoft, with far more money to throw at the problem, and more existentially threatened by not succeeding, have not.
Have Microsoft and Apple even tried? I haven’t seen any evidence that it’s their goal.
Microsoft didn’t but Apple did.

So did Meta.

Is there any secret sauce behind LLMs other than big money? I'm under the impression that it's a known recipe at its core, and for many of the enhancements around it.
Maybe, but there is also the potential for survivorship bias being a factor here too. The chance that a specific person with no football skills can throw a football 50 yards into a trash can is pretty low. But if you gather a stadium full of unskilled random people, chances are good that one of them will be able to do so, even multiple times. But you'd be wasting your time trying to discern what special football skill that person has.

I'm not saying this means successful CEOs don't have any relevant skills contributing to their success, but it's worth considering that for the most part we're only seeing the successful ones. It's hard to say how many would-be billionaire CEOs are out there with similar skills to someone like Elon Musk who just happened to get unlucky.
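The stadium thought experiment is easy to make concrete. A quick sketch (the per-throw success probability, crowd size, and attempt count are made-up numbers, purely for illustration):

```python
import random

random.seed(42)

P_SUCCESS = 1 / 1_000  # assumed chance an unskilled thrower lands any one throw
CROWD = 80_000         # assumed stadium-sized crowd
THROWS = 50            # attempts per person

def successes(p: float, n: int) -> int:
    """Count lucky throws for one skill-free person."""
    return sum(random.random() < p for _ in range(n))

results = [successes(P_SUCCESS, THROWS) for _ in range(CROWD)]
repeaters = sum(1 for s in results if s >= 2)  # "repeatable" success

print(f"{repeaters} of {CROWD:,} skill-free throwers succeeded multiple times")
```

With these assumed numbers, on the order of a hundred people in the crowd "repeatably" land the throw with zero skill involved, which is exactly the survivorship-bias point: studying those individuals for their special football technique would be a waste of time.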

The MBA curriculum is stunningly ridiculously easy.

The entire point of an MBA is networking for executive roles.

whstl · 18 hours ago
Funny story: I'm friends with a political scientist who sustained themselves through college by writing thesis papers for MBA students. They would research, then buy a two-liter bottle of energy drink and write it all in one go over the weekend.
It is easy, yes. About the equivalent of two or three A levels for anyone in the UK. However the point is not networking, but understanding large areas of business operation that you don't already know. For people like us, that's generally things like strategy, finance, marketing (which isn't the same thing as advertising), organisational behaviour (effectively applied sociology), HR (the weakest area of the course I took). It's not particularly useful for networking, since the people you meet are at your own level.
> understanding large areas of business operation that you don't already know

Library card, Google search, LLM, Anna's Archive; not even joking. I've seen the curriculum; it's the kind of stuff you read a book about on a weekend.

> not particularly useful for networking, since the people you meet are at your own level.

I think you may have missed the point of the MBA.

If it's so easy, why is SpaceX 90% of Earth's launch volume? lol

Something is different at Elon's companies. My guess is he has autism superpowers of not caring about your feelings and just operating on the facts. Nothing done for show.

I'm roughly paraphrasing Andrej

https://www.youtube.com/watch?v=YfFA9KCsKZ8

A CEO can be valuable while still doing nothing with a simple explanation: they are cult figures whose purpose is to increase the stock value. This is obvious in the case of a drug addict like Elon like you describe, but others are increasingly copying the playbook.
9 hours ago
I want to see this experiment run, lol
If it's so easy, why haven't you done it?
Because luck plays a very significant role.
That and having enough millions in the first place to meet with the right people and get/buy a position helps.
As the Rick & Morty quote goes, "that just sounds like luck with extra steps".
okay, thanks reddit
lijok · 18 hours ago
Attributing something to luck sounds like a lazy cop out, sorry. We just had an article on the front page yesterday about “increasing your luck”.

If you need to be lucky in meeting the right people, you can increase your chances by spending your evenings in your nearest financial district watering hole. We’ve easily established luck can be controlled for, which puts us back into skill territory.

What specifically must one luck out on? Have you tried?

Exactly, as a multimillion lottery winner, it upsets me so much when people say I won because of luck.

I played every single day, and I played at different locations. I also made sure I performed my pre-ticket rituals which I learned from other lottery winners. Other people could have done the same. It’s absolutely a skill issue.

lijok · 9 hours ago
You picked the lottery you played, on which day, with what buyin, where you bought the tickets from. Did you not?
> Attributing something to luck sounds like a lazy cop out, sorry.

Every one of us here has an unbroken line of lucky (lucky enough!) ancestors stretching back a billion years or so. Pretending it's not a thing is silly.

When you're born matters. Where you're born matters. Who you encounter matters. etc. etc. etc.

> What specifically must one luck out on? Have you tried?

I think perhaps we have different definitions of luck.

lijok · 18 hours ago
No, I think we have a similar definition of luck, but I think you’ve succumbed to a defeatist attitude. You have to be pretty unlucky to be permanently locked out of becoming a CEO, and if you’re dealt those cards, moaning about it on an online forum would be way down in your list of priorities.
> You have to be pretty unlucky to be permanently locked out of becoming a CEO…

Sure, but that's not what's being asserted. I am not "permanently locked out" of megacorp CEO roles; I'm just vanishingly unlikely to get one.

There are lots of people who have enough singing/dancing skill to be a mega popstar like Taylor Swift. There just aren't enough slots.

Could I become the next Steve Jobs? Maybe! I'd have to get really lucky.

lijok · 18 hours ago
Then why were you bringing up conditions of ones birth?

Vanishingly unlikely to get one if you try, or vanishingly unlikely to get one if you sit on your ass all day?

I assume you’re talking about the former and yet I don’t think you’ve thought this through. I think you’ve blindly attributed to luck what actually requires time, perseverance, grit, lack of morality. The only way to figure that out is for you to offer up your understanding of what one must luck out on?

> Then why were you bringing up conditions of ones birth?

Because they're a form of luck?

If you're born in the developed world, that's luck. If you're born to supportive parents, that's luck. If you're Steve Jobs and you wind up high school buddies with Woz in Mountain View, CA, that's luck. White? Luck. Male? Luck. Healthy? Luck. A light touching of psychopathy? Luck!

> Vanishingly unlikely to get one if you try, or vanishingly unlikely to get one if you sit on your ass all day?

Both.

> I think you’ve blindly attributed to luck what actually requires time, perseverance, grit, lack of morality.

There are many, many people who devote time, perseverance, and grit to their endeavours without becoming a "hugely expensive" CEO. Hence, luck. Is it the only thing? No. Is it a thing? Yes, absolutely.

lijok · 17 hours ago
None of what you’ve mentioned is a requirement to become a “hugely expensive” CEO. If you’re born into conditions which stop you from becoming self reliant, that’s a different story but we covered that.

Those people who devote time - do they devote time to becoming a hugely expensive CEO or just some “endeavours”?

I think we’re fundamentally disagreeing on whether or not lack of luck can be adequately compensated for by exerting more effort. I have not yet heard of a compelling argument for why that’s not the case.

> None of what you’ve mentioned is a requirement to become a “hugely expensive” CEO.

Again, no one said they're requirements. Just significant factors. You don't have to be white, you don't have to be male, you don't have to be from the developed world… but you do have to have some substantially lucky breaks somewhere.

A quadriplegic orphan of the Gaza War might become the next Elon Musk. But the odds are stacked heavily against them.

God save us from grindset influencers who peddle all this ‘if you didn’t succeed it was down to you not trying hard enough’ malarkey. In some respects I appreciate the call to take agency, but the fact that it results in people being unable to acknowledge the sheer extent of external factors in the world is crazy.
No one said anything about megacorps though, just CEOs.
No one except the article we're all (theoretically) discussing, titled "CEOs are hugely expensive", citing "the boards of BAE Systems, AstraZeneca, Glencore, Flutter Entertainment and the London Stock Exchange" as examples in the introductory paragraph.
Now read the rest of the article. It talks about CEOs in general, not just megacorp ones, even if it does use megacorp CEOs in the intro. It is asking a general question of whether the role of a CEO should be automated. Articles often start with a hook that is related but does not wholly encompass the entirety of the point of the article.
> Now read the rest of the article.

I did.

> It talks about CEOs in general, not just megacorp ones, even if it does use megacorp CEOs in the intro.

This does not accurately describe the article.

Well if we're deriving different conclusions from the same article, then there is probably not much else to talk about.
But the dice get rolled for everyone, and clearly success isn’t randomly distributed.

So what does that tell you?

It must be luck plus something else.

> It must be luck plus something else.

That is why I said “significant role”, not “the only requirement”, yes.

In science we have the idea of background noise: a random signal that is always there, in random fashion.

And what is typically done is you ignore it. It’s always there, it’s random, and it applies to all samples.

Same with luck and success. You can’t control luck, so you focus on what’s left.

This is the battle cry of the loser. Luck plays a part in everything, so it's moot.

You aren't the CEO of anything, like most, because you aren't good enough.

croon · 17 hours ago
Just because luck plays a part in everything does not make it moot.

Set up two identical agents in a game with rules guaranteeing a winner, and you will end up with a loser who is exactly as skilled as the winner.

I agree that CEO positions in aggregate are likely generally filled by people better at "CEOing", but there is nothing ruling out "losers" who were equally skilled or even better that just didn't make it due to luck or any of the innumerable factors playing into life.
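The identical-agents point can be sketched in a few lines: give every agent literally the same (zero) skill edge, run a tournament, and you still get a "best" and a "worst" record. The agent and round counts here are arbitrary:

```python
import random

random.seed(7)

N_AGENTS = 100
N_ROUNDS = 1_000

# Every round has exactly one winner, chosen uniformly at random.
# All agents are identical by construction, so any gap in win
# records is pure luck, not a skill difference.
wins = [0] * N_AGENTS
for _ in range(N_ROUNDS):
    wins[random.randrange(N_AGENTS)] += 1

print(f"best record: {max(wins)} wins, worst record: {min(wins)} wins")
```

To an outside observer the spread between the best and worst records looks like a skill gradient, even though none exists.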

Aww, you did the meme! https://lol.i.trollyou.com/
Because most people don't want to. Additionally, there is a limit on positions; only a few people will get there. But it doesn't mean that there was a competition based on ability, that some extraordinary skills are needed, or that many other people would not be as good.
lijok · 18 hours ago
> then the demands of the job can't be that rigorous.

Maybe he’s just that good at what he does?

What makes CEOs different is they have a wild idea, with enough naivety and passion to pursue it, and hopefully the partners that will get them there.

The secret sauce is execution.

Hired CEOs are there to execute a board vision.

Made CEOs are there to execute their vision.

I get some wild ideas out of LLMs too; is that all CEOs are bringing to the table?
Can LLMs execute on their ideas?
It can tell other people what to do, just like a CEO. The LLM has the vision and employees will execute. Now where is the multibillion package?
Yes, tool calling.
gos9 · 17 hours ago
If you can figure out a way to collect and parse information needed to make executive decisions via LLM+tool calls, you would be a billionaire overnight. There’s a reason that it takes a human in these roles and people w/ 0 organizational/executive experience fail to understand just how complex they are.
So how do they expect them to accomplish the even harder task of programming?

Looks like a CEO's job nowadays is to find out what the latest hype train is, and instruct the company to ride it.

Depends on what tool is being called; not everything has an API, especially anything that relates to other humans.
lol obviously no.

Their level of expertise, access, relationships, etc. all scale with the business. If it’s big, you need someone well connected who can manage an organization of that size. IANAE, but I would imagine having access to top schools would be a big factor as well.

The people in this thread coming to the defense of their CEOs sound like Tom Smykowski in Office Space desperately trying to save his job: “I already told you: I deal with the god damn customers so the engineers don't have to. I have people skills; I am good at dealing with people.”

https://youtube.com/watch?v=hNuu9CpdjIo

I don't get what you mean. There are people who are great at bridging the customer-engineering gap (although we don't know what Tom was really like with customers); there's skill to that kind of position. The Bobs were the stereotypical consultants brought in to change things and cut costs without understanding the actual work. What does this have to do with defending CEOs?
We do know that Tom actually didn’t really do anything; all the real work was done by his underlings. Similar to what most CEOs do. Of course it’s not always true, but like Christine Carrillo in the article, I think it’s not a stretch to say that most CEOs don’t do that much; certainly not enough to warrant being paid 1000 times what their menials make.
whstl · 18 hours ago
> Although we don't know what Tom was really like with customers

The movie makes it quite clear, actually.

The Bobs were actually way better than the stereotypical layoff consultants. They even caught on to the crazy management chain and the busywork generated by TPS reports. Sure, they wanted to lay off good engineers, but that doesn't invalidate the actual good findings.

> The movie makes it quite clear, actually.

Did we ever see him interacting with a customer? I don't remember that part and I can't find any clip of it. We see him in many other situations. We know he was not respected and was a weirdo in many ways, but that doesn't say anything about the quality of his customer communication.

He admits himself he isn’t actually the one communicating with the customer; it’s his secretary.
I completely forgot that part, my bad!
I used to scoff at people skills too. I don’t any more.

Getting thousands of employees to all work towards a common goal is EXTREMELY difficult. Not to mention selling it to customers, investors, etc.

It doesn’t matter how technically proficient you are - you will fail if you don’t have people skills.

And people skills are far harder to measure, so we basically filter by success (which everyone knows is imperfect).

And there are far, far fewer people with the kind of people skills needed than people who can program a computer. Hence, pay is far higher.

If you’ve ever taken a sales or business or networking course, and seen the people that take the advice literally and act like creepy wooden weirdos, you know why. Business “skill” is intangible, partially luck obviously but partially “game” that you can’t just systematize. It’s the same reason you can’t automate sales. An AI ceo would just be the composite of all the lame business advice you find on the internet - like a lot of wannabe CEOs that also don’t succeed.

You might as well ask why people don’t use AI pickup coaches.

I like that this idea is brought up though, despite it being currently ridiculous, to hold a mirror in front of CEOs' faces. Just like we won't replace capable (!) CEOs with AI any time soon, we will not replace capable (!) developers with AI any time soon.

It is good that CEOs also get some of this "You will be replaced by AI!" flak that we hear from the CEOs of big tech, directed at developers. Do those CEOs think their job is more complex than a software developer's job, which they are so eager to replace? How many times more urgently do we want to replace the CEO, considering salaries? How about we put correspondingly more money into that than we are putting into trying to replace developers?

In the end, neither will work out any time soon, judging by the actual intelligence level of current "AI". I think we still need some 2-3 architectural leaps forward for that. And by that I don't mean simply building bigger ANNs and ingesting more data; it already seems like the returns on that are rapidly diminishing.

This is not about the job being complex; they’re at the top of the food chain. They also have a network that an automated system could not replicate, because it’s not a technical problem but more of a people problem.
> Do those CEOs think their job is more complex than a software developer job, which they are so eager to replace?

You can estimate the difficulty of a job by what fraction of the population can successfully do it and how much special training that takes. Both of which are reflected in the supply curve for labor for that job.

> How many times more urgently do we want to replace the CEO, considering salaries? How about we put as many times the amount of money into that, as we are putting into trying to replace developers?

Pretty sure that (avg developer pay * number of developers) is a lot more than (avg CEO pay * number of CEOs).

To play the devil's advocate for a moment:

Since businesses need to start somewhere/when and most startups fail, I think most people who even get into the role of CEO, are doing it successfully. However, this is a lot due to circumstances and many factors outside of their control. There are also many CEOs ruining their businesses with bad decisions. It is not certain, that an "AI" wouldn't do at least as good as those failing CEOs. Similarly, many developers ruin things they touch, introducing tons of complexity, dependencies and breaking user workflows or making workflows cumbersome without listening to user feedback and so on.

In short, many people do a bad job and businesses are carried by others who do a good enough job to make a net positive for the final product. Or the consequences of messing up unfold slowly: a gradual user drain, or users being replaced with bad actors until the good actors start to leave, or any other possibility.

About the pay argument: well, these days you still need a good crew of developers to make the shiny AI toys do what you want them to do, so you are not replacing all of the developers, and you can't calculate like that. If we take some Silicon Valley CEO making 2 million and a developer making 100k-200k, then we are still at a ratio of 10x-20x. If we manage to make only one CEO obsolete, or 2 out of 3 CEOs 1.5x as efficient, we have achieved a cost saving of 10-20 developers! Yay!...

So Reiki masters and fortune tellers are safe from being automated, too?
I actually know a non-zero number of people who are using ChatGPT to interpret their tarot card readings.
  • whstl · 18 hours ago
I think even AI-skeptics can get behind ChatGPT replacing those jobs.
Unironically, they are indeed somewhat safer. However, people seem willing to accept AI-based fortune telling as a substitute good... which I have seen lately.
we automated fortune telling a long time ago

https://en.wikipedia.org/wiki/Fortune_teller_machine

Well, but they do.
To catastrophic effects, invariably.
You meant that totally unironically?
How could AI coaches be successful?
  • whstl · 18 hours ago
> An AI ceo would just be the composite of all the lame business advice you find on the internet

I thought you meant "AI-startup CEO" for a moment and was going to agree.

I do think it's a lot about personality, though I gotta say that I don't really think it should be like that.

My dad had a manager (a VP) that he privately nicknamed "VPGPT": despite being a very polite and personable guy, he knew pretty much nothing about the engineering he was ostensibly managing, and basically just spoke in truisms that sounded kind of meaningful until you did any kind of analysis on them.

I'm not saying that AI would necessarily be "better", but I do kind of hate how people who are utterly incapable of anything even approaching "technical" end up being the ones making technical decisions.

Let's say the CEO is some milquetoast average CEO. Is there some evidence that this is in fact a bad thing?
It means you're never going to be the next Nvidia, and that your business strategy will be predictable
For the vast majority of companies, employees, and investors, that is enough.
Depending on how the next few quarters go, being "the next Nvidia" might not be the flex that's implied here. "Take big swings, maybe get a home run once in a while, maybe bankrupt the company" might be a model that makes stocks fun to trade, but it's arguable whether it's a good model for capitalism as a whole.
> You might as well ask why people don’t use AI pickup coaches.

You'll easily find people preaching or selling that sort of thing on Twitter, and the sort of people who are still on Twitter are probably buying it.

  • klipt · 19 hours ago
Haven't a bunch of people fallen in love with ChatGPT? There's a whole subreddit called MyBoyfriendIsAI.

(Probably mentally unhealthy people, but still it happens!)

I don't even think we need ChatGPT or anything for this. Instead, just create an n8n job that runs nightly and sends a company-wide email that says "we are continuing to strive to implement AI into our application". Maybe add a bit about how the share price going down is actually a blessing in disguise, depending on how the market is doing, obviously.

Don't steal this idea, it's mine; I'm going to sell it for a million dollars.
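A minimal sketch of that nightly job in plain Python rather than n8n. The addresses, SMTP host, and wording are all placeholders made up for the example:

```python
import smtplib
from email.message import EmailMessage

def build_update(share_price_down: bool) -> str:
    """Compose the daily all-hands 'AI progress' boilerplate."""
    body = "We are continuing to strive to implement AI into our application."
    if share_price_down:
        body += (
            "\n\nWhile the share price dipped today, we see this as a "
            "blessing in disguise and an opportunity for true believers."
        )
    return body

def send_update(share_price_down: bool) -> None:
    """Email the update company-wide. All endpoints are placeholders."""
    msg = EmailMessage()
    msg["Subject"] = "Daily AI Strategy Update"
    msg["From"] = "ceo-bot@example.com"
    msg["To"] = "all-staff@example.com"
    msg.set_content(build_update(share_price_down))
    with smtplib.SMTP("mail.example.com") as smtp:
        smtp.send_message(msg)

if __name__ == "__main__":
    send_update(share_price_down=True)
```

Point it at a real SMTP server, schedule it with cron, and the executive communication function is fully automated.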

I must be a weird CEO because I’ve lost count of the number of times I’ve had to explain to people why shoving AI into our application will only make it worse not better.
Some CEOs are better than others, but I think a lot of CEOs, especially at BigCos, don't really know what's actually happening in their company, so instead of actually contributing to anything, they just fall back on buzzwords that they think the shareholders want to hear.
I can't think of a job that is less automatable.

The entire job is almost entirely human to human tasks: the salesmanship of selling a vision, networking within and without the company, leading the first and second line executives, collaborating with the board, etc.

What are people thinking CEOs do all day? The "work" work is done by their subordinates. Their job is basically nothing but social finesse.

> The entire job is almost entirely human to human tasks: sales, networking, leading, etc.

So, writing emails?

"Hey, ChatGPT. Write a business strategy for our widget company. Then, draft emails to each department with instructions for implementing that strategy."

There, I just saved you $20 million.
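To be fair, the two-step prompt above is trivially scriptable. A sketch using the OpenAI Python client; the company name, department list, and model name are assumptions for illustration, and the actual API call sits behind the main guard since it needs a key and a network:

```python
def build_ceo_prompt(company: str, departments: list[str]) -> str:
    """Compose the $20-million-saving prompt from the comment above."""
    return (
        f"Write a one-page business strategy for {company}. "
        "Then draft a short email to each of these departments with "
        "instructions for implementing that strategy: "
        + ", ".join(departments) + "."
    )

if __name__ == "__main__":
    # Requires the `openai` package and an OPENAI_API_KEY in the environment.
    from openai import OpenAI
    client = OpenAI()
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name; substitute your own
        messages=[{"role": "user", "content": build_ceo_prompt(
            "Acme Widgets", ["Sales", "Engineering", "Support"])}],
    )
    print(reply.choices[0].message.content)
```

Whether the resulting strategy is worth anything is, of course, exactly the question this thread is debating.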

I get your point, but if you think that list of critical functions (or the unlisted "good ol' boys"-style tasks) boils down to some emails, then I think you don't have an appreciation for the work or finesse or charisma required.
> I think you don't have an appreciation for the work or finesse or charisma required.

I think that you don't appreciate that charismatic emails are one of the few things that modern AI can do better than humans.

I wouldn't trust ChatGPT to do my math homework, but I would trust it to write a great op-ed piece.

For some reason the AI prompt "make me 20 million" hasn't been working for me. What am I doing wrong?
Have you had that plan reviewed by your analysts and handed over to your employees to implement? You may be missing those steps...
Automation depends on first getting paid to do something.
We could solve that by replacing all CEOs, removing the issue of finesse and charisma entirely. The LLMs can then discuss the actual proposals with each other. (not entirely kidding)

It would actually be nicely self-reinforcing and resistant to being changed back, because it is now in the board's interest to use an LLM, which cannot be smooth-talked into bad deals. Charisma becomes a negative signal and excludes more and more people.

Why are there "good ol boys" tasks in the first place? Instead, automate the C-suite with AI, get rid of these backroom dealings and exclusive private networks, and participate in a purer free market based on data. Isn't this what all the tech libertarians who are pushing AI are aiming for anyways? Complete automation of their workforces, free markets, etc etc? Makes more sense to cut the fat from the top first, as it's orders of magnitude larger than the fat on the bottom.
  • lurk2 · 20 hours ago
> There, I just saved you $20 million.

If it were this easy, you could have done it by now. Have you?

> If it were this easy, you could have done it by now. Have you?

In order to save $20 million with this technique, the first step is to hire a CEO who gets paid $20 million. The second step is to replace the CEO with a bot.

I confess that I have not yet completed the first step.

  • lurk2 · 19 hours ago
Have you replaced the executive function in any one of your enterprises with ChatGPT?
I have completely replaced management of every company that I own with ChatGPT.
  • nubg · 19 hours ago
0 x 0 = 0 I guess?
  • lurk2 · 16 hours ago
How have they scaled?
People seem to have a poor model of what management and many knowledge workers actually do. Much of it isn't completing tasks, but identifying and creating them.
"ChatGPT, please identify the tasks that a CEO of this company must do."
This is literally a caricature of what the average HN engineer thinks a businessperson or CEO does all day; you couldn't make up better satire if you tried.
Do you think CEOs have an accurate idea of what engineers do?
Neither side can truly know, that is the nature of a diffuse organization.
That won't stop them from replacing us.
Even if the AI gets infinitely good, the task of guiding it to create software for the use of other humans is called...software engineering. Therefore, SWEs will never go away, because humans do not know what they want, and they never will until they do.
It's mind-boggling. I get riffing on the hyped superiority of CEOs. I've heard inane things said by them. But, being a human being with some experience observing other humans and power structures, I can assuredly say that the tight-knit group of wealthy power-brokers who operate on gut and bullshitting each other (and everyone) will not cede their power to AI, but use it as a tool.

Or maybe the person you're describing is right, and CEOs are just like a psych-rock band with a MacBook, trying out some tunes and hoping they make it big on Spotify.

I am sympathetic to your point, but reducing a complex social exchange like that down to 'writing emails' is wildly underestimating the problem. In any negotiation, it's essential to have an internal model of the other party. If you can't predict reactions you don't know which actions to take. I am not at all convinced any modern AI would be up to that task. Once one exists that is I think we stop being in charge of our little corner of the galaxy.
  • krapp · 19 hours ago
Artists, musicians, scientists, lawyers and programmers have all argued that the irreducible complexity of their jobs makes automation by AI impossible and all have been proven wrong to some degree. I see no reason why CEOs should be the exception.

Although I think it's more likely that we're going to enter an era of fully autonomous corporations, and the position of "CEO" will simply no longer exist except as a machine-to-machine protocol.

The one big reason why CEOs exist is trust. Trust from the shareholders that someone at the company is trying to achieve gains for them. Trust from vendors/customers that someone at the company is trying to make a good product. Trust from the employees that someone is trying to bring in the money to the company (even if it doesn't come to them eventually).

And that trust can only be placed in a person, someone innately human, because an AI will make decisions that are holistically good rather than specifically directed towards the above goals. And if some of the above goals are in conflict, the CEO will make decisions that benefit the more powerful group because of an innately uncontrollable reward function, which is not true of AI by design.

> The one big reason why CEOs exist is trust.

This sounds a lot like the specious argument that only humans can create "art", despite copious evidence to the contrary.

You know what builds trust? A history of positive results. If AIs perform well in a certain task, then people will trust them to complete it.

> Trust from vendors/customers that someone at the company is trying to make a good product.

I can assure you that I, as a consumer, have absolutely no trust in any CEO that they are trying to make a good product. Their job is to make money, and making a good product is merely a potential side effect.

They've been proven wrong? I'm not sure I've seen an LLM that does anything beyond the most basic rote boilerplate for any of these. I don't think any of these professions have been replaced at all?
I feel like the people who can't comprehend the difficulties of an AI CEO are people who have never been in business sales or high level strategy and negotiating.

You can't think of a single difference in the nature of the job of artist/musician vs. lawyer vs. business executive?

> I feel like the people who can't comprehend the difficulties of an AI <thing doer> are people who have never <tried to do that thing really well>.

That applies to every call to replace jobs with current-gen AI.

But I can't think of a difference between CEOs and other professions that works out in favor of keeping the CEOs over the rest.

You sound like a CEO desperately trying not to get fired.

Everyone is indispensable until they aren't.

This whole thread is delightful. Well done.
  • krapp · 18 hours ago
>You can't think of a single difference in the nature of the job of artist/musician vs. lawyer vs. business executive?

I can think of plenty, but none that matter.

As the AI stans say, there is nothing special about being human. What is a "CEO?" Just a closed system of inputs and outputs, stimulus and response, encased in wetware. A physical system that like all physical systems can be automated and will be automated in time.

My assertion is that it's a small club of incredibly powerful people operating in a system of very human rules - not well defined structures like programming, or to a lesser extent, law.

The market they serve is themselves and powerful shareholders. They don't serve finicky consumers that have dozens of low-friction alternatives, like they do in AI slop Youtube videos, or logo generation for their new business.

A human at some point is at the top of the pyramid. Will CEOs be finding the best way to use AI to serve their agenda? They'd be foolish not to. But if you "replace the CEO", then the person below that is effectively the CEO.

Hooker is probably harder to automate.

They both need social finesse, and CEOs don't need a body.

> Hooker is probably harder to automate.

I'm pretty sure I've seen news articles about attempts to do exactly that.

You can’t automate joining an ivy league fraternity.
Seeing as there are people that believe that they are dating a chat bot and others that believe that chat bots contain divinity, there are probably some people that would respond positively to slop emails about Business Insight Synergy Powered By Data-ScAIence and buy some SaaS product for their Meritocrat-Nepo sneaker collab drops company
It is easy to make the mistake of believing CEOs are automatable based on their public speaking: interviews, earnings calls, conference talks. With rare exceptions (cough, Musk), CEOs communicate in very sterilized PR-speak, coached and vetted by PR, media relations, and legal counsel, and usually stick to facts or uncontroversial opinions. That part of the job is pretty replaceable with a well-trained LLM.

The real job is done behind the curtain. Picking key people based on their reputation, knowledge, agency, and loyalty. Firing and laying off people. Organizational design. Cutting losses. Making morally ambiguous decisions. Decisions based on conversations that are unlikely to ever be put into bytes.

Yeah, that's because business leadership is largely a cult. The way you prove your loyalty to the cult is by overseeing larger and larger layoffs ordered by those above you until you're the one putting people on the street.
CEOs, at least in the USA, have multiple legal obligations under federal law. Can legal obligations be legally delegated to automation? Has this been tested yet, specifically for obligations related to the board? Any corporate lawyers wish to chime in?
> CEO's at least in the USA have multiple legal obligations under federal law.

Lots of people have legal obligations.

In this case, I assume you're referring to fiduciary duty (i.e. the duty to act in the best interests of the company), which is typically held not by the CEO but by the directors.

Ultimately the responsibility to assign daily operation of the company rests with the board, both legally and practically, as does the decision to use a human or AI CEO.

Under Delaware law (where most U.S. public companies incorporate), directors' fiduciary duties are non-delegable. The actual exercise of judgment must be performed by the director, who must be a "natural person". Other jurisdictions might offer grey areas, but good luck finding one that meaningfully changes this.

More practically, legal accountability would be placed on the individuals approving the LLM's actions and/or the entity providing the LLM service. The latter is why many AI vendor deals fall through: everything is awesome until the contract comes and the vendor wants to take no responsibility for anything that results from their product.

  • pabe · 5 hours ago
There are studies of the CEO effect. The CEO effect on company performance seems to be about 11.5%; see https://www.sciencedirect.com/science/article/pii/S104898432...

In my opinion, a lot of strategic work can be automated - and results would actually be better. Why? Because the strategic management agent would most likely follow a scientific process and make small bets instead of investing millions in initiatives without having a clear signal.

What you cannot automate is human contact. Trust between humans. Igniting in others the fire that burns inside you. CEOs aren't necessarily the best strategic thinkers, but they're very good at dealing with people and they are high-agency people.

  • mvkel · 20 hours ago
I was a CEO for thirteen years. For the hard CEO skills, AI is perfectly suited for the job; even more than it would be for specialist roles.

For the soft CEO skills, not so much.

Not that that's a deal-breaker. I have a vision of an AI CEO couched as a "strategic thought partner," which the wet-CEO just puppets to grease the skids of acceptance among the employees.

I'd fully trust an AI CEO's decision making, for a predictable business, at least. But some CEOs get paid a lot (deservedly so) because they can make the right decisions in the thick fog of war. Hard to get an AI to make the right decision on something that wasn't in the training corpus.

Still, business strategy isn't as complex as picking winners in the stock market.

I'd suspect that an AI CEO would have to compensate for its weaknesses, just like any CEO. In this case, it would rely on subordinates for glad-handing and vision pitches, while itself focusing on coordinating them, staging the right meetings, coordinating between departments, etc.

I think an AI could be strong at a few skills, if appropriately chosen:

- being gaslightingly polite while firmly telling others no;

- doing a good job of compressing company wide news into short, layperson summaries for investors and the public;

- making PR statements, shareholder calls, etc; and,

- dealing with the deluge of meetings and emails to keep its subordinates rowing in the same direction.

Would it require that we have staff support some of the traditional soft skills? Absolutely. But there’s nothing fundamentally stopping an AI CEO from running the company.

CEOs who build great, long-lasting companies would be very hard to replicate. But CEOs who make money for stockholders at the expense of everything else seem like the type of thing that would be much easier to replicate. As others have commented, replacing the people skills of a CEO might be difficult. But if the CEO's job is to strip the company of assets and cash out for the owners, people skills are kind of a hindrance at that point.
You don't automate jobs, you automate tasks. In the very near future any CEO worth even a fraction of their compensation will automate most of their day to day tasks, and will use AI to better accomplish things that are currently hard. Certainly to some degree the optimal skillset for a good CEO will transform, but the role will remain.
Some CEOs are obviously extremely skilled at marshalling large organisations in the successful pursuit of business goals, while the rest derive most of their value from sharing a job title with the first group.
Because the people supplying the automation would extract the surplus that otherwise goes to the ceo?
  • gruez · 20 hours ago
Since when did people support CEOs as some sort of jobs or redistribution program? It's at least somewhat plausible to mandate bullshit jobs like gas station attendants or whatever to keep teenagers employed, but nobody is clamoring for CEOs to exist to screw over shareholders.
I didn't mean it as a moral statement, but as a practical observation that a person with effective control over a company has a lot of leverage when it comes to negotiating resource splits. Also, you want them to be aligned.
If there's competition they'll race each other to costs+overhead.
What is the difference that explains why this doesn't seem to happen for humans in the ceo role, but would happen if the role were fulfilled by automation?
Mostly the massive supply increase that automation always brings.
I'm not sure you can get around the principal-agent problem that easily. Who sets the policy levers on the automation and governs it? They inherit the ceo's negotiating leverage with shareholders.

It seems like you'd need some sort of fairly radical control structure (say, no board, just ai interacting directly with shareholders) to get around this. But even this ignores that the automation is not neutral, it is provided by actors with incentives.

This. The enshittification doesn’t happen until later, at which point it’s too late to fix.
  • Zak · 19 hours ago
My impression is that most of the value a CEO provides comes from a combination of their existing social network and their ability to interface socially with other companies. The day to day work described in the article is not where the value comes from.

It would be an interesting experiment to promote an executive assistant to CEO though.

A hundred thousand posts on HN about how you can, without hesitation, replace your network security team with AI, and you can see the flood of CEOs (or CEO wannabes) nodding and murmuring about saving costs. A single post about CEOs being automated and all of a sudden it’s all about “intangible human relation skills” and “AI couldn’t possibly” and “but my network of angel investors and other CEOs”.

I swear there’s a joke or cautionary tale here somewhere about “first they came for..” or something along those lines. The phrasing escapes me.

Maybe the problem isn't that you can't automate a CEO; it's that the actual tangible work just isn't worth as much as some companies pay for it, and this thread is touching a few too many raw nerves.

Well, either way it’s hilarious.

It really is funny. I'm sympathetic to the idea that you can't automate the intangibles of company leadership, but that same idea also applies to other things like producing art or software, which seems to be eagerly ignored by the same cohorts who take offence to this idea.
I too laughed when Stable Diffusion came out and artists said their jobs couldn't possibly be automated due to the "intangible human creativity that machines can't replicate." I mean, it's the same thing, right? At the end of the day, everyone has economic anxiety and will continue to fight to make money to survive.
  • Havoc · 7 hours ago
How expensive something is makes a very poor starting point for deciding what we should attempt to automate with nascent technology that still has a pretty high miss rate.

It’s not entirely irrelevant to the commercial case but you really need to start with technical feasibility and how resilient the job is to mistakes.

And something that is high risk, high leverage and very softskills/experience driven is a bad place for AI imo

Anthropic's shenanigans with their AI-run vending machine should illustrate how far we are from credible AI CEOs.

Maybe first get the automated vending machines to work properly, and then revisit this question to find what tasks that allows to be peeled away from the role?
Automated vending machines have been a thing since their inception, definitionally.
  • xigoi · 11 hours ago
Vending machines run by an LLM have not been a thing until recently.
Vending machines are static experiences. They sit there and wait until they are paid and told to dispense an item. Why would you need an LLM for that? There's nothing to solve there.
You know those puff pieces are there to assuage our fears of job loss and lend a folksy aw-shucks patina to the plucky LLM that can't run a vending machine.
Or maybe they just show that LLMs aren't that good.
CEOs usually follow the worst forms of human collaboration known to our species: totalitarianism, dictatorship, monarchy, and centrally planned messes.

Anything that removes the power of CEOs and gives it to the worker should be highly encouraged. Economic democracy is the final frontier of human empowerment, and giving workers the means to have democratic control over the economy can only unlock more human potential, not less.

Perhaps we could also get rid of ship captains, and allow the ship to steer itself.
Not the same at all; you're talking about a highly specialized skill held by a small number of people who are mostly compelled by either military order or international law. Deciding who should lead the in-group doesn't require being ordained by the priesthood (the boardroom); it should be decided and voted on by the workers.

To make the actual analogy you wanted, maybe you should discuss fragging:

https://en.wikipedia.org/wiki/Fragging

And what it means when your underlings hate your leadership so much they would rather kill you than follow their "superior's" orders.

While we're at it let's get rid of coaches for sports teams and just allow the players to coach themselves
Just want to call out that these are both not great examples?

High performance sports teams have a captain that is often elected in some form from the team.

Likewise the crew of a pirate ship used to elect their captain.

Both examples run contrary to your point, and there's no reason you couldn't have something similar in business: a cooperative that elects a CEO, rather than one appointed by a board of other CEOs.

When is the last time you were someone who did real work at a Fortune 500 company and were coached by a CEO?
  • lijok · 18 hours ago
> Anything that removes the power of CEOs and gives it to the worker should be highly encouraged.

Pretty sure the moment you do this, the workers liquidate the company and distribute the assets among themselves, as evidenced by the acceptance rate of voluntary severance offers in many past downsizings, such as the Twitter one.

Accepting severance isn’t “liquidating the company,” it’s individuals minimizing risk when leadership is downsizing. Explicitly, in the case of Twitter.
  • lijok · 18 hours ago
And why would that play out differently?
“Accept this money and go or be fired” isn’t remotely comparable to the situation and claiming it is reveals a fundamental misunderstanding about the nature of the offer?
> Anything that removes the power of CEOs and gives it to the worker should be highly encouraged.

The only thing that will do this is if workers are the resource bottleneck.

> Economic democracy is the final frontier of human empowerment and giving workers the means to have democratic control over the economy can only unlock more human potential, not less.

This already exists. It's called free enterprise and freedom of association.

Unless of course you mean that nobody can own or expend resources without (nominally) everybody agreeing... which has also been tried, and failed horribly.

For one such example, see the years long fights in city halls over resource usage or utilization, such as building new developments for example. A corporation trying to get something done moving at that pace would, well, not get anything done. That is why worker owned co-ops, which you can create today even in this capitalist system we have, do not outcompete capitalist structures generally speaking.
Plato said it best when talking about the benevolent dictator; in those cases, it's not the "worst" form of human collaboration. Not everyone follows the labor theory of value.
No, we know it's the worst form of collaboration because dictatorships have never been good systems of government. Does this really need to be stated? That dictatorships are always bad and never good, that people deserve autonomy and freedom, and that we should be molding society to fit the needs of those who actually serve it (the workers)?
> because dictatorships have never been good systems of government.

This is literally historically wrong; there are many examples to the contrary. If you think any form of government is "always" wrong, I can point you to failed democracies that then prospered under dictatorial rule, such as Singapore or South Korea, or even China if you want to count it, all of which have been basically one-party rule to great effect.

  • avaer · 19 hours ago
> Anything that removes the power of CEOs and gives it to the worker should be highly encouraged.

Except replacing CEOs with AIs will not do this.

Why not? Let's not act like the average CEO isn't an actively hostile member of the company toward its workforce. Why should people be forced to work under such hostile regimes? People should be empowered to vote for their leaders where they work. Boards already have this authority; there's zero reason workers shouldn't be granted the same privilege.

It won't make the companies worse run; why would workers want to destroy their means of living? CEOs do this with no skin in the game; the workers should take that stake, as they will always be better stewards than a single tyrant.

No one is forcing people; they can switch jobs. That is why we vote for politicians, because we cannot switch countries (as easily), but we do not vote for CEOs. By working at a certain company, one is automatically casting a vote that one supports that company and wants to continue working there.

Where is your evidence that companies won't be worse run? Workers could just vote to give themselves massive raises and hemorrhage the company, ironically much like how some private equity firms operate, but en masse. No one would start companies in such a scenario, causing the economy to falter, especially compared to economies whose companies don't have this sort of voting system.

My evidence is the fact that companies lay off people in general due to the ineptitude of their executives. My evidence is that people do not like it when others have dominion over them.

The power to deny someone healthcare, food, how they dress, who they can speak to, and even whether they can subsist in our society (notice I didn't say thrive): these are powers no human should have over another.

Companies have the ability to destroy lives and workers have ZERO recourse.

My solution is simple: workers should be able to vote for their bosses. If these executives are as good as you say, they should win the election easily. After all, boards vote on who becomes an executive; why shouldn't workers be allowed this right as well? Boards and executives decide company strategy; why can't workers be allowed to do this too? Why can't there be consensus building where we actually give workers the freedom to dictate their own success? Boards and executives vote on their own salaries too (hint: they never go down); workers should have this right as well.

I'm sorry, but the more I think about it, the more farcical it all seems. Workplace democracy should be the next pursuit of human rights, as democratizing the economy can only lead to more human flourishing, just as democratic government has provided thus far.

If it's good enough for the state, it's good enough for the F500.

If you want socialism, you can make your own co-op right now. In fact, I too was in a software engineering co-op where we all made decisions together and chose an external-facing CEO. It works, but only at a small scale. There's a reason co-ops are not the dominant force in the market; if socialism really worked, they would be, but it turns out capitalist corporate structures simply perform better.

And as for how you dress yourself or whether you have healthcare: do you think co-ops like REI don't have a work uniform or don't give their workers healthcare? Sorry, but your takes sound extremely naive; where have you worked before? As I said, I have literally worked in a software co-op, and it's not all it's cracked up to be based on how you're describing them.

Monarchy is the best form of government if it's the right monarch and the worst if it isn't. Sounds about the same as a CEO to me.
History has yet to produce a “good” monarch (i.e. one that yields better results than the alternative, which is liberal democracy).
better is doing an insane amount of lifting here. In the last 30 years China has lifted more of its own people out of poverty than all democracies summed together over the same time period.
What?? History has produced a huge amount of good monarchs. There are some around even today, for example Oman.
We just posted a memorial for Louis Gerstner, the CEO that turned around IBM. I think reading about Louis Gerstner gives a pretty good example of why this would be a bad idea.

My business experience is that company culture is very important to a company’s success and I’m just doubtful that this can be created through AI.

That could be survivorship bias, and it smells of a multiple-testing problem. Surely there would also be a few automated systems that would sometimes be able to turn around a company.
LLM CEOs would be middle of the road, I suspect. Yes, they could be much more in touch with the goings on within the business and make fewer baseless decisions. Though you would not get an innovative or culture-building leader that is necessary in some circumstances.
Because despite appearances, being a CEO is an actual, complex job that AI is not remotely competitive at.
Considering that these CEOs are talking about replacing all skilled and unskilled labor under them with LLMs, I don't see why they can't be replaced too. In reality, LLMs are overhyped. Even Grok says it straight: LLMs are probability models over condensed human knowledge that decide what the next word or letter should be. Original thought isn't their forte.
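To make that "decides the next word" description concrete, here is a toy sketch of my own (not from the thread, and nothing like a real LLM's architecture): a hypothetical bigram table stands in for the "condensed human knowledge", and the next word is sampled from a conditional probability distribution. Real models compute these conditional probabilities with a neural network over tokens, at vastly larger scale.

```python
import random

# Hypothetical stand-in for learned knowledge: P(next word | current word).
bigram_probs = {
    "the": {"market": 0.5, "board": 0.3, "ceo": 0.2},
    "market": {"crashed": 0.6, "rallied": 0.4},
}

def next_word(current: str, rng: random.Random) -> str:
    """Sample the next word from the conditional distribution for `current`."""
    dist = bigram_probs[current]
    words = list(dist)
    return rng.choices(words, weights=[dist[w] for w in words], k=1)[0]

rng = random.Random(42)
print(next_word("the", rng))  # one of: market, board, ceo
```

That's the whole trick, repeated one token at a time: no plan, no intent, just sampling from a distribution conditioned on what came before.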

(Surprisingly though, that's enough for them to recognize that you're a human. Their models can identify your complex thought progression in your prompts - no matter how robotic your language is.)

The REAL problem here is the hideous narrative some of these CEOs spin. They swing the LLMs around to convince everyone that they are replaceable, thereby crashing the value of the job market and increasing their own profits. At the same time, they project themselves as some sort of super-intelligent divine beings with special abilities without which the world will not progress, while in reality they maintain an exclusive club of wealthy connections that they guard jealously by ruining the opportunities for the others (the proverbial 'burning the ladder behind them'.) They use their PR resources to paint a larger-than-life image that hides the extreme destruction they leave behind in the pursuit of wealth - like hiding a hideous odor with bucketfuls of perfume. These two problems are the two sides of a coin that expose their duplicity and deception.

PS: I have to say that this doesn't apply to all CEOs. There are plenty of skilled CEOs, especially founders, who play a huge role in setting the company up. Here I'm talking about the stereotypical cosmopolitan bunch that comes to our mind when we hear that word. The ones who have no qualms in destroying the world for their enjoyment and look down upon normal people as if you're just fodder for them.

Who gets the final signing authority and can be held liable for specific mistakes or crimes commissioned by the company if there is no CEO?
Because being CEO in huge part means having connections with the right people.
  • ·
  • 20 hours ago
  • ·
  • [ - ]
As if AI doesn't have connections with people.
Build an AI that can play golf and bribe Government officials. Problem solved.
If it can engage in criminal activity, then it can skip the golf training.

Every time the LLM CEO gets caught doing a crime and goes to 'jail', the LLMs on the exec board can vote to replace it with another instance of the same LLM model.

Forget 'limited liability', this is 'no liability'.

  • ·
  • 20 hours ago
  • ·
  • [ - ]
I'm not sure you understood what "connections" means in this context.
Enough of a connection to get into people's minds.
Other than people hallucinating relationships with a stream of text, not really.
Do AIs have a rolodex of other executives, board members, alumni and frat buddies they've worked with or spent years exchanging with?
You're Absolutely Right!
This should be marked as being from 2023.
If CEOs are automated, i.e. eliminated, then all decisions, popular and unpopular, will be attributed to boards. The same goes for the other CxOs. Whom would you blame for insulin prices rising over the years if the CFO were a script parameterized by the board?
You could replace all of the arguments against automating CEOs with the same arguments against automating the allocation of capital. In both cases, people imagine their work to be more subtle and ineluctably human than, for example, the work of doctors and lawyers. Sure, buddy. Keep telling yourself that.
You say that like we've already done away with human doctors and lawyers. Lawyers can't even use LLMs as an aid without them making up fake citations. The technology isn't close to being able to be used unsupervised, and humans are proving too irresponsible to supervise it.
Even if this did happen there would no doubt be some sweet-talking schmuck with half a college degree and connections to all the other rich and powerful (likely through their dad) claiming credit for everything the LLM did.
  • duxup
  • ·
  • 17 hours ago
  • ·
  • [ - ]
I've seen some truly great CEOs from time to time. I knew what they did.

I have zero idea what most CEOs that I've worked for do ... and they seem to want it that way.

If CEOs can be replaced, shouldn't founders be also (who are often CEOs)? What about all levels of leadership? Why can't AIs just run full companies autonomously?
Love the idea, but whence the training data? Not as readily available as billions of jpegs, lines of code, and audio files. Any clever ideas to source it?
CEO is the most over valued position in human history aside from literal royalty.
  • sfc32
  • ·
  • 17 hours ago
  • ·
  • [ - ]
  • avaer
  • ·
  • 19 hours ago
  • ·
  • [ - ]
Last year's joke is next year's reality.

Every year I feel a bit less crazy in my silly armchair speculation that the Second Renaissance from the Animatrix is a good documentary. If AI "takes over" it will be via economic means and people will go willingly until they have gradually relinquished control of the world to something alien. (Landian philosophers make the case that hyperstitional capitalism has already done this)

I would take the over that this will happen sooner rather than later -- when it's proven that an AI CEO makes a lot of money, suddenly everyone will change their tune and jump on the bandwagon with dollar signs in their eyes, completely ignoring what they are giving up.

Except unlike e.g. the metaverse/cryptocurrency bandwagon of yesteryear, there's no getting off.

  • pdyc
  • ·
  • 12 hours ago
  • ·
  • [ - ]
if something goes wrong with that "automation" who takes the responsibility? as a shareholder are you comfortable with it? i am not.
  • jasfi
  • ·
  • 15 hours ago
  • ·
  • [ - ]
That's a role that would require AGI, we're not there yet.
As a founder/CEO who started as a programmer, I have been running my second company for 15 years. I am not great, but I got the company to be sizable and profitable.

1. I will take five automated CEOs. If I can split my company into five distinct companies (one per product), it would be amazing. We are splitting the company into two to streamline focus on different/incompatible industries, and I am dreading the process of finding another CEO. It is very, very hard.

2. I know a lot of CEOs. It helps. I didn't know a single one when I started. It is no more a cult than my programmer's peer group was.

3. Did I tell you how hard it is to find a good CEO? It is VERY, VERY hard. Think of hiring a great product guy with agency to do whatever needs to be done, with people skills to attract talent, a sales drive, and a willingness to deal with finance & legal. Oh, and I am in the tech field, so I need him to be very hardcore technical. Your mileage might vary, but this is who I need. Anyone who has that is running their own companies. Oh, and the person has to have a proven track record. I cannot let someone unproven ruin the company and well-being of hundreds of employees and tens of thousands of customers.

4. I don't believe CEOs are special in any way beyond the way most other professionals are special. There are probably some underlying qualities, but they're all so different.

5. Some CEOs got there because they were lucky, but they didn't stay there for long because of luck. It is very, very simple to screw up as a CEO.

6. Growing someone within an organization to become a CEO is very hard. We are trying - giving some people more and more responsibilities, trying to involve them in more and more aspects of the organization. The filter is repeatable success. You don't have to succeed all the time, but you have to succeed most of the time. Most people don't want the pressure, aren't interested in certain aspects, or are unsuccessful more often than they should be.

7. Boards are not a cult either; they don't have the CEO's back. Boards represent investors (pension funds, wealthy individuals, etc.), and they will oust the CEO if the company's performance suffers. They are willing to pay a lot to the CEO because ... it is so hard to find a good CEO.

The CEOs of the biggest corporations could absolutely be replaced... You don't even need an LLM, just a cron job that runs once a day and executes a script:

if (marketCrash) then sendEmailToGovernmentAskingForBailout();

AI can’t even clean toilets.
Neither can CEOs. What's your point?
They can. AI can’t automate anything.
  • Vaslo
  • ·
  • 20 hours ago
  • ·
  • [ - ]
Should say, "Soft skills are hugely expensive. Why not automate them?" Then you don't need to read the article.
  • ·
  • 20 hours ago
  • ·
  • [ - ]
The article might be a modest proposal [1], but sooner or later we're going to have to answer questions like these.

An even more interesting one is: What will we reward?

We've been rewarding labor quantity, as well as quality via higher wages - as motivation and as incentives for more education. This reflected the productivity primacy of knowledge work in modern economies, but that might not be the case down the road.

We've also been rewarding capital. Originally this was a way for the elites to keep themselves in place (a.k.a. economic rents), but in modern times it's been more of an entrepreneurial incentive (a.k.a. economic profits.)

Without the economic profit rationale, there's no reason to reward capital accumulation. Only pro-profit decisions are good for society, pro-rent decisions are awful. If there's no profit to incentivize, capitalism is just bad all around.

If AI becomes a better profit decision-maker than an entrepreneur, any humans left in the loop are nothing but high-rollers gambling with everyone else's money.

[1] https://en.wikipedia.org/wiki/A_Modest_Proposal

And replace them with government bureaucrats?

It’s been tried before, it didn’t work out well.

For context, New Statesman is the premier British socialist magazine.

Whatever the merits of the argument here (and my bolshie side has also flippantly pushed it in the past) the motivation and thrust of the essay needs to be considered in that ideological grounding.

A CEO is just a mascot without the goofy outfit.
This is precisely the same dilemma as self-driving cars. Who will be held accountable if something goes wrong? Who will be dragged into court by shareholders?
This article fundamentally misunderstands the role of a CEO.

The main job of CEOs is not decision making. 99% of company decisions are made below the level of CEO. For the ones that make it to CEO, the board tends to have final say.

It’s a leadership role where people interactions are the most important. The CEO sets the tone, gets people on the same page, and is the external face of the company.

It’s silly to think a robot can replace that.

There is probably a point when there is enough capital and enough investors that you don't need a CEO or executives. The investors already take care of all the politics and business networking side of the company for free.

The investors can organize the government bailouts themselves. You don't need a CEO.

Misunderstanding why the CEO is paid so much.

If you've ever worked at a company that's a chaotic shitshow, you'll know how strong the effect of the CEO is - it always comes down to the guy at the top not being up to it.

The leverage of the role is enormous, and the strength of someone who can carry out this role well for a large company is sky high - not many such people in the world, and they only need one.

So the math all comes out very straightforward: even at obscene looking salaries, they're still a bargain.

Christine Carrillo's most recent company helps CEOs find executive assistants. So she had her executive assistant find executive assistants... sigh...
Should have (2023) in the title. This is an old post.

Also, I think it misses the critical point. C-suite executives operate under immense pressure to deliver abstract business outcomes, but the lack of clear, immediate feedback loops and well-defined success metrics makes their roles resistant to automation. AI needs concrete reward functions that executive decision-making simply doesn't provide.

  • lurk2
  • ·
  • 20 hours ago
  • ·
  • [ - ]
Inane Reddit crosspost.
Imagine an AI agent that ingests a ton of data from business analytics and then makes decisions, which are signed off by the company legal team.

Could be good, but could also be bad if it turns out the AI is able to be even more ruthless in how it treats its workforce.

The good news is that it doesn't need to be very accurate in order to beat the performance of most execs anyways.
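One way to picture that agent (a hypothetical sketch of my own, not any real product): an analytics-driven policy proposes an action, and nothing takes effect without the legal team's sign-off gate. The `monthly_churn` rule and the blocked-action list are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Decision:
    action: str
    rationale: str
    approved: bool = False

def propose(metrics: dict) -> Decision:
    # Hypothetical policy standing in for the AI agent: react to one KPI.
    if metrics.get("monthly_churn", 0.0) > 0.10:
        return Decision("launch retention discount", "churn above 10%")
    return Decision("hold course", "metrics within normal range")

def legal_review(d: Decision,
                 blocked: frozenset = frozenset({"mass layoff"})) -> Decision:
    # Sign-off gate: the agent's decision only executes if approved.
    d.approved = d.action not in blocked
    return d

decision = legal_review(propose({"monthly_churn": 0.15}))
print(decision.action, decision.approved)
```

The gate is the interesting design choice: it keeps a human-controlled veto between the model's output and anything irreversible, which is exactly where the "more ruthless" worry would otherwise bite.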

Legal team? Why not another AI.
Well, somebody has to go to jail if catastrophic decisions are made and you can't jail AI. We very often see CEOs being jailed in the real world, so the pay is actually a very fair compensation for the risk.
> We very often see CEOs being jailed in the real world

Where "very often" means "almost never?"

  • mslt
  • ·
  • 20 hours ago
  • ·
  • [ - ]
This is the actual answer to the whole question - accountability
You hire a bum for 6 figures to rubberstamp things with a slight risk of going to jail.
  • lijok
  • ·
  • 18 hours ago
  • ·
  • [ - ]
In violation of a comical number of laws
I'm constantly reminded about the 2000 election. Part of Al Gore's communications strategy was to constantly talk about the "top 1%". Now he wasn't wrong about this but interestingly there was a survey in 2000 that showed that 19% of people thought they were the top 1% and another 20% thought they would be someday.

This is how successful American propaganda is. 39% of people believed something that definitionally could never be true.

So you will find people who make average salaries defending the stratospheric salaries of CEOs because they believe they'll one day be the one benefitting or they've fallen for some sort of propaganda such as the myth of meritocracy or prosperity gospel.

Our entire economy is designed around exploiting working people and extracting all of their wealth to a tiny portion of the population. And we're reaching the point where the bottom 50% (if not more) have nothing left to exploit.

AI and automation could be used to improve all of our lives. It isn't and it won't be. It'll be used to suppress wages and displace workers so this massive wealth transfer can be accelerated.

I get the point of the article. But those with the wealth won't let themselves be replaced by AI, and seemingly the populace will never ask the question of why they can't be replaced until economic conditions deteriorate even further.

I agree with the other comment. It's easy to get into the top 1%. This is actually true about most things. Most people are lazy, don't have the drive to work hard to achieve their goals, or are not willing to make the sacrifices it takes to do so.
> This is how successful American propaganda is. 39% of people believed something that definitionally could never be true.

It's not that difficult to get into the top 1%. Most Americans earn a global top 1% income. Even the top 1% of America is only a salary of around $500k. It's possible 19% of survey takers were in the top 1%, or were on a path to make that in the future.

I don't see how it's definitionally untrue to believe you could make $500k a year at some point...Let alone $34,000 a year...

Try "must have a net worth north of $11M and/or earn $800k/yr" for either 1% net worth or 1% top annual earnings in the US.
> Most Americans earn a top 1% income.

1% of Americans earn a top 1% income. They weren't being asked "do you make more than an amputee kid in Gaza?"

> It's possible 19% of survey takers were in the top 1%…

There's a whole field of math devoted to preventing this. Polling works quite well, all things considered.

Global top 1% net income is about $34k; it's likely that more than 19% of those polled make that, since the median net annual income in the US is ~$50k.
Again:

> They weren't being asked "do you make more than an amputee kid in Gaza?"

Context matters.

What were they asked then?
They will have been asked something like "what income level do you believe places someone in the top 1% of American earners?" or "what percentile of American earners do you believe yourself to be in?"

Often posed as a multiple choice question.

So you're just guessing? I'd say if 19% said they believed they were in the top 1%, it was probably not specified as specifically American.
Sounds like you're guessing.

I'm not; this sort of thing is quite well documented.

https://phys.org/news/2024-09-people-underestimate-income.ht...

> Barnabas Szaszi and colleagues conducted four studies to explore how well people understand the wealth held by others. In one study, 990 US residents recruited online were asked to estimate the minimum annual household income thresholds of various percentiles of American earners.

That study shows that people are worse at estimating income brackets as the brackets increasingly become outliers. Look at how the variance in estimations increases sharply at higher brackets. That they tend to underestimate the outlier brackets doesn't indicate that 19% of Americans think they're in the top 1%...
So the top 1% of income is currently ~$570k. The vast majority of people will never see that.

But more relevant is the top 1% of net worth is currently ~$11.6M [1], which is vastly more unattainable.

Also, the net worth of the bottom 99% is skewed by house prices. You might be sitting on a house worth $1M but when every other house also costs $1M and you have to live somewhere, you don't really have a net worth of $1M.

[1]: https://finance.yahoo.com/news/among-wealthiest-heres-net-wo...

In the states, yes. But globally it's like ~$35k.

I don't know how that particular poll was worded, but in general, if you're a politician who rails against the top 1%, you may suffer from the fact that people have widely varying conceptions of who the 1% are.

It was about the American presidential election. The audience was Americans. About America.
> Why not automate them?

Because building psychopathic AI's is - at the moment - still frowned upon.

not everywhere :)
It’s cute that you think major AIs aren’t psychopathic. I wish I had your optimism.
Can't read the paywalled article, so I'm judging from just the headline (not sure if it's sarcasm or an actual opinion), but as someone who has the honor of working with a really good CEO, I can definitely say that you cannot automate them. Maybe in some streamlined corporate machine like IBM or something, but not in a living, growing company.

Another article that starts from the false premise that AI is intelligent.