I'm deeply skeptical of CEOs being "built different" like some people are arguing here. If Elon can be CEO of three companies and the founder of a couple more while also finding time to tweet 50+ times a day, have a failed and embarrassing stint trying to optimize the federal government, and get K-holed at parties, then the demands of the job can't be that rigorous.
If anything, I would argue that the strategic decisions can actually be automated/performed via broader consensus. With that handled, all that's left is the cartel that CEOs have invented to justify their exorbitant pay packages.
> With that handled, all that's left is the cartel that CEOs have invented to justify their exorbitant pay packages.
CEO compensation is determined by board committees mostly made up of other CxOs. They write letters to each other's shareholders about how valuable CEOs are to build up the edifice.
I wish my compensation were determined by fellow engineers who "truly know my worth". I'd pay it forward if I were on a committee determining colleagues' pay packets.
> If Elon can be CEO of three companies and the founder of a couple more while also finding time to tweet 50+ times a day, have a failed and embarrassing stint trying to optimize the federal government, and get K-holed at parties, then the demands of the job can't be that rigorous.
Except, Elon has been largely successful at being the "CEO" of these companies because he attracts talent to them. So...either:
(1) Businesses still will require human talent and if so, I don't see how an AI bot will replace a CEO who is necessary to attract human talent.
OR
(2) Businesses don't require any human talent beneath the CEO and thus a CEO is still necessary, or at least some type of "orchestrator" to direct the AI bots beneath them.
Keep in mind, SpaceX and Tesla are both in industries where people don’t actually have that many other options. If you really want to see what talent Elon is able to attract, you need to look at his other companies like X.
I mean, you're not wrong. The COO, Gwynne Shotwell, at SpaceX, is known to handle a lot of the day to day stuff, and I feel that further reinforces the point. If she can handle all of that in her role as COO then what's the point of a CEO?
Hot take: Musk is a great CEO. He's a horrible person, but I feel it's undeniable that his weight behind a project greatly increases the chance of interesting and profitable things happening (despite the over-optimistic claims and missed deadlines). I think he achieves this in large part _because_ he is an asshole, tweeting all the time to drum up publicity, being notorious for doing K, being too optimistic about what can be achieved, etc. I think somebody can be a good CEO without being such a jerk, it's just that Musk doesn't take the good-person strategy. And the bad-person strategy works well for him.
A CEO's job is (roughly) to maximize a company's valuation. It is not to run the company themselves, not to be nice, not to improve the world. I'm not claiming this is what _should_ be, just how it _is_. By this metric, I think Musk has done really well in his role.
Edit: Tangentially related -- at the end of the musical "Hadestown", the cast raise their glasses to the audience and toast "to the world we dream about, and the one we live in today." I think about that a lot. It's so beautiful, helps enforce some realism on me, and makes me think about what I want to change with my life.
The first part works because otherwise reusable rockets wouldn't have been invented (or maybe they'd have been invented 20 years later). It's the same as Steve Jobs, the Android guys were still making prototypes with keyboards until they saw the all screen interface of the iPhone. Sometimes it requires a single individual pushing their will through an organization to get things done, and sometimes that requires lying.
Their level of expertise, access, relationships, etc all scale with the business. If it’s big, you need someone well connected who can manage an organization of that size. IANAE but I would imagine having access to top schools would be a big factor as well.
Funny story: I'm friends with a political scientist that sustained themselves through college by writing thesis papers for MBA students. They would research, then buy a two liter energy drink bottle and write it all in one go over the weekend.
Because most people don't want to. Additionally, there is a limit on positions; only a few people will get there. But that doesn't mean there was a competition based on ability, that extraordinary skills are needed, or that many other people wouldn't be just as good.
Attributing something to luck sounds like a lazy cop out, sorry.
We just had an article on the front page yesterday about “increasing your luck”.
If you need to be lucky in meeting the right people, you can increase your chances by spending your evenings in your nearest financial district watering hole. We’ve easily established luck can be controlled for, which puts us back into skill territory.
What specifically must one luck out on? Have you tried?
> Attributing something to luck sounds like a lazy cop out, sorry.
Every one of us here has an unbroken line of lucky (lucky enough!) ancestors stretching back a billion years or so. Pretending it's not a thing is silly.
When you're born matters. Where you're born matters. Who you encounter matters. etc. etc. etc.
> What specifically must one luck out on? Have you tried?
I think perhaps we have different definitions of luck.
No, I think we have a similar definition of luck, but I think you’ve succumbed to a defeatist attitude. You have to be pretty unlucky to be permanently locked out of becoming a CEO, and if you’re dealt those cards, moaning about it on an online forum would be way down in your list of priorities.
Then why were you bringing up conditions of one's birth?
Vanishingly unlikely to get one if you try, or vanishingly unlikely to get one if you sit on your ass all day?
I assume you’re talking about the former and yet I don’t think you’ve thought this through. I think you’ve blindly attributed to luck what actually requires time, perseverance, grit, lack of morality. The only way to figure that out is for you to offer up your understanding of what one must luck out on?
> Then why were you bringing up conditions of one's birth?
Because they're a form of luck?
If you're born in the developed world, that's luck. If you're born to supportive parents, that's luck. If you're Steve Jobs and you wind up high school buddies with Woz in Mountain View, CA, that's luck. White? Luck. Male? Luck. Healthy? Luck. A light touching of psychopathy? Luck!
> Vanishingly unlikely to get one if you try, or vanishingly unlikely to get one if you sit on your ass all day?
Both.
> I think you’ve blindly attributed to luck what actually requires time, perseverance, grit, lack of morality.
There are many, many people who devote time, perseverance, and grit to their endeavours without becoming a "hugely expensive" CEO. Hence, luck. Is it the only thing? No. Is it a thing? Yes, absolutely.
No one except the article we're all (theoretically) discussing, titled "CEOs are hugely expensive", citing "the boards of BAE Systems, AstraZeneca, Glencore, Flutter Entertainment and the London Stock Exchange" as examples in the introductory paragraph.
The people in this thread coming to the defense of their CEOs sound like Tom Smykowski in Office Space desperately trying to save his job:
“I already told you: I deal with the god damn customers so the engineers don't have to. I have people skills; I am good at dealing with people.”
I don't get what you mean. There are people who are great at bridging the customer-engineering gap. (Although we don't know what Tom was really like with customers) There's skill to that kind of position. Bobs were the stereotypical consultants brought in to change things and cut costs without understanding the actual work. What does this have to do with defending CEOs?
We do know that Tom actually didn’t really do anything, all the real work was done by his underlings.
Similar to what most CEOs do.
Of course it’s not always true, but like Christine Carrillo in the article, I think it’s not a stretch to say that most CEOs don’t do that much; certainly not enough to warrant being paid 1000 times what their menials make
> Although we don't know what Tom was really like with customers
The movie makes it quite clear, actually.
The Bobs were actually way better than the stereotypical layoff consultants. They even caught on to the crazy management chain and the busywork generated by TPS reports. Sure, they wanted to lay off good engineers, but that doesn't invalidate their actual good findings.
Did we ever see him interacting with a customer? I don't remember that part and I can't find any clip of it. We see him in many other situations. We know he was not respected and was a weirdo in many ways, but that doesn't say anything about the quality of his customer communication.
If you’ve ever taken a sales or business or networking course, and seen the people that take the advice literally and act like creepy wooden weirdos, you know why. Business “skill” is intangible, partially luck obviously but partially “game” that you can’t just systematize. It’s the same reason you can’t automate sales. An AI ceo would just be the composite of all the lame business advice you find on the internet - like a lot of wannabe CEOs that also don’t succeed.
You might as well ask why people don’t use AI pickup coaches.
I like that this idea is brought up though, despite it being ridiculous currently, to hold a mirror in front of CEO faces. Just like we won't replace capable (!) CEOs with AI any time soon, we will not replace capable (!) developers with AI any time soon.
It is good, that CEOs also get some of this "You will be replaced by AI!" flak, that we hear from CEOs of big tech directed at developers. Do those CEOs think their job is more complex than a software developer job, which they are so eager to replace? How many times more urgently do we want to replace the CEO, considering salaries? How about we put as many times the amount of money into that, as we are putting into trying to replace developers?
In the end neither will work out any time soon, judging current "AI"'s actual AI level. I think for that we still need some 2-3 architectural leaps forward. And by that I don't mean simply building bigger ANNs and ingesting more data. It already seems like the returns for that are rapidly diminishing.
> Do those CEOs think their job is more complex than a software developer job, which they are so eager to replace?
You can estimate the difficulty of a job by what fraction of the population can successfully do it and how much special training it takes. Both of which are reflected in the supply curve for labor for that job.
> How many times more urgently do we want to replace the CEO, considering salaries? How about we put as many times the amount of money into that, as we are putting into trying to replace developers?
Pretty sure that (avg developer pay * number of developers) is a lot more than (avg ceo pay * number of ceos).
This is not about the job being complex, they’re the top of the food chain. Also they have a network that an automated system could not replicate because it’s not a technical problem but more of a people problem
Unironically, they are indeed somewhat safer -- however if people are willing to accept a substitute good of AI-based fortune telling... which I have seen lately ...
I do think it's a lot about personality, though I gotta say that I don't really think it should be like that.
My dad had a manager (who was a VP) that he privately nicknamed "VPGPT", because despite being a very polite and personable guy he pretty much knew nothing about the engineering he was ostensibly managing, and basically just spoke in truisms that sounded kind of meaningful unless you did any kind of analysis on them.
I'm not saying that AI would necessarily be "better", but I do kind of hate how people who are utterly incapable of anything even approaching "technical" end up being the ones making technical decisions.
I don't even think we need ChatGPT or anything for this. Instead, just create an n8n job that runs nightly that sends a company-wide email that says "we are continuing to strive to implement AI into our application". Maybe add a thing talking about how share price going down is actually a blessing in disguise, depending on how the market is doing, obviously.
Don't steal this idea it's mine I'm going to sell it for a million dollars.
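To drive home how little machinery the joke needs, here's a minimal sketch of that nightly job in plain stdlib Python (the message wording and the share-price spin are invented for the bit; a real deployment would hand this to a scheduler like cron or n8n and actually send the email):

```python
# Canned CEO-speak for the nightly company-wide email; wording is made up.
BASE_UPDATE = "We are continuing to strive to implement AI into our application."

def compose_update(share_price_change: float) -> str:
    """Build tonight's email body, spinning a falling share price as good news."""
    lines = [BASE_UPDATE]
    if share_price_change < 0:
        # A falling share price is, obviously, a blessing in disguise.
        lines.append(
            "Today's share price movement is a rare buying opportunity "
            "that reflects our long-term conviction."
        )
    return "\n\n".join(lines)

print(compose_update(-1.5))
```

On a green day the email is just the one boilerplate sentence; on a red day it grows the silver-lining paragraph.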
I must be a weird CEO because I’ve lost count of the number of times I’ve had to explain to people why shoving AI into our application will only make it worse not better.
Some CEOs are better than others, but I think a lot of CEOs, especially for BigCos, don't really know what's actually happening in their company so instead of actually contributing to anything, they just defer to buzzwords that they think the shareholders want to hear.
CEOs usually follow the worst forms of human collaboration known to our species: totalitarianism, dictatorships, monarchies, and centrally planned messes.
Anything that removes the power of CEOs and gives it to the worker should be highly encouraged. Economic democracy is the final frontier of human empowerment and giving workers then means to have democratic control over the economy can only unlock more human potential, not less.
> Anything that removes the power of CEOs and gives it to the worker should be highly encouraged.
Pretty sure the moment you do this, the workers liquidate the company and distribute the assets among themselves, as evidenced by the acceptance rate of voluntary severance offers in many past downsizings, such as the Twitter one.
Accepting severance isn’t “liquidating the company,” it’s individuals minimizing risk when leadership is downsizing. Explicitly, in the case of Twitter.
Just want to call out that these are both not great examples?
High performance sports teams have a captain that is often elected in some form from the team.
Likewise the crew of a pirate ship used to elect their captain.
Both examples serve contrary to your point, and there's no reason you couldn't have something similar in business: a cooperative that elects a CEO, rather than it being done by a board of other CEO's.
Plato said it best when talking about the benevolent dictator, so in those cases, it's not the "worse" form of human collaboration. Not everyone follows the labor theory of value.
Why not? Let's not act like the average CEO isn't also an actively hostile member of the company toward its workforce. Why should people be forced to work under such hostile regimes? People should be empowered to vote for their leaders where they work. Boards have this authority already; there's zero reason why workers shouldn't be granted the same privilege.
It won't make the companies worse run, why would workers want to destroy their means to live? CEOs do this with no skin in the game, the workers should take that skin as they will always be better stewards than the single tyrant.
No one is forcing people; they can switch jobs. That's why we vote for politicians (we cannot switch countries as easily) but do not vote for CEOs. By working at a certain company, one is effectively casting a vote that one supports that company and wants to continue working there.
Where is your evidence that companies won't be worse run? Workers could just vote to give themselves massive raises and bleed the company dry, ironically like how some private equity firms operate, but en masse. No one would start companies in this sort of scenario, dragging the economy down, especially in comparison to companies that don't have this sort of voting system.
The entire job is almost entirely human to human tasks: the salesmanship of selling a vision, networking within and without the company, leading the first and second line executives, collaborating with the board, etc.
What are people thinking CEOs do all day? The "work" work is done by their subordinates. Their job is basically nothing but social finesse.
> The entire job is almost entirely human to human tasks: sales, networking, leading, etc.
So, writing emails?
"Hey, ChatGPT. Write a business strategy for our widget company. Then, draft emails to each department with instructions for implementing that strategy."
I get your point but if you think that list of critical functions (or the unlisted "good ol boys" style tasks) boils down to some emails then I think you don't have an appreciation for the work or finesse or charisma required.
We could solve that by replacing all CEOs to remove the issue of finesse and charisma. LLMs can then discuss the actual proposals. (not entirely kidding)
Why are there "good ol boys" tasks in the first place? Instead, automate the C-suite with AI, get rid of these backroom dealings and exclusive private networks, and participate in a purer free market based on data. Isn't this what all the tech libertarians who are pushing AI are aiming for anyways? Complete automation of their workforces, free markets, etc etc? Makes more sense to cut the fat from the top first, as it's orders of magnitude larger than the fat on the bottom.
> If it were this easy, you could have done it by now. Have you?
In order to save $20 million with this technique, the first step is to hire a CEO who gets paid $20 million. The second step is to replace the CEO with a bot.
I confess that I have not yet completed the first step.
I am sympathetic to your point, but reducing a complex social exchange like that down to 'writing emails' is wildly underestimating the problem. In any negotiation, it's essential to have an internal model of the other party. If you can't predict reactions you don't know which actions to take. I am not at all convinced any modern AI would be up to that task. Once one exists that is I think we stop being in charge of our little corner of the galaxy.
Artists, musicians, scientists, lawyers and programmers have all argued that the irreducible complexity of their jobs makes automation by AI impossible and all have been proven wrong to some degree. I see no reason why CEOs should be the exception.
Although I think it's more likely that we're going to enter an era of fully autonomous corporations, and the position of "CEO" will simply no longer exist except as a machine-to-machine protocol.
I feel like the people who can't comprehend the difficulties of an AI CEO are people who have never been in business sales or high level strategy and negotiating.
You can't think of a single difference in the nature of the job of artist/musician vs. lawyer vs. business executive?
>You can't think of a single difference in the nature of the job of artist/musician vs. lawyer vs. business executive?
I can think of plenty, but none that matter.
As the AI stans say, there is nothing special about being human. What is a "CEO?" Just a closed system of inputs and outputs, stimulus and response, encased in wetware. A physical system that like all physical systems can be automated and will be automated in time.
This is literally a caricature of what the average HN engineer thinks a businessperson or CEO does all day, like you can't make satire like this up better if you tried.
Even if the AI gets infinitely good, the task of guiding it to create software for the use of other humans is called...software engineering. Therefore, SWEs will never go away, because humans do not know what they want, and they never will until they do.
Seeing as there are people that believe that they are dating a chat bot and others that believe that chat bots contain divinity, there are probably some people that would respond positively to slop emails about Business Insight Synergy Powered By Data-ScAIence and buy some SaaS product for their Meritocrat-Nepo sneaker collab drops company
CEOs who build great, long-lasting companies would be very hard to replicate. But CEOs who make money for stockholders at the expense of everything else seem like the type of thing that would be much easier to replicate. As others have commented, replacing the people skills of a CEO might be difficult. But if the CEO's job is to strip the company of assets and cash out for the owners, people skills are kind of a hindrance at that point.
It is easy to make the mistake of believing CEOs are automatable based on their public speaking: interviews, earning calls, conference talks. With a rare exception (cough Musk) CEOs communicate in a very sterilized PR-speak, coached and vetted by PR, media relations, and legal counsels, and usually stick to facts or not very controversial opinions. That part of the job is pretty replaceable with a well-trained LLM.
The real job is done behind the curtain. Picking up key people based on their reputation, knowledge, agency, and loyalty. Firing and laying off people. Organizational design. Cutting the losses. Making morally ambiguous decisions. Decisions based on conversations that are unlikely to ever be put into bytes.
Yeah, that's because business leadership is largely a cult. The way you prove your loyalty to the cult is by overseeing larger and larger layoffs ordered by those above you until you're the one putting people on the street.
CEOs, at least in the USA, have multiple legal obligations under federal law. Can legal obligations be delegated legally to automation? Has this been tested yet, specifically for legal obligations to the board? Any corporate lawyers wish to chime in?
Under Delaware law (where most U.S. public companies incorporate), directors' fiduciary duties are non-delegable. The actual exercise of judgment must be performed by the director, who must be a "natural person". Other jurisdictions might offer grey areas, but good luck finding one that meaningfully changes this.
More practically, legal accountability would be placed on the individuals approving an LLM's actions and/or the entity providing the LLM service. The latter aspect is why many AI vendor deals fall through: everything is awesome until the contract comes and the vendor wants to take no responsibility for anything that results from their product.
> CEOs, at least in the USA, have multiple legal obligations under federal law.
Lots of people have legal obligations.
In this case, I assume you're referring to a fiduciary duty (i.e. to act in the best interests of the company), which is typically held not by the CEO, but by the directors.
Ultimately the responsibility to assign daily operation of the company rests with the board, both legally and practically, as does the decision to use a human or AI CEO.
My impression is that most of the value a CEO provides comes from a combination of their existing social network and their ability to interface socially with other companies. The day to day work described in the article is not where the value comes from.
It would be an interesting experiment to promote an executive assistant to CEO though.
Some CEOs are obviously extremely skilled at marshalling large organisations in the successful pursuit of business goals, while the rest derive most of their value from sharing a job title with the first group.
If CEOs are automated, i.e. eliminated, then all decisions, popular and unpopular, will be attributed to boards. The same goes for other CxOs. Who would you blame for insulin price increases over the years if the CFO were a script parametrized by the board?
Since when did people support CEOs as some sort of jobs or redistribution program? It's at least somewhat plausible to mandate bullshit jobs like gas station attendants or whatever to keep teenagers employed, but nobody is clamoring for CEOs to exist to screw over shareholders.
I didn't mean it as a moral statement, but as a practical observation that a person with effective control over a company has a lot of leverage when it comes to negotiating resource splits. Also, you want them to be aligned.
What is the difference that explains why this doesn't seem to happen for humans in the ceo role, but would happen if the role were fulfilled by automation?
I'm not sure you can get around the principal-agent problem that easily. Who sets the policy levers on the automation and governs it? They inherit the ceo's negotiating leverage with shareholders.
It seems like you'd need some sort of fairly radical control structure (say, no board, just ai interacting directly with shareholders) to get around this. But even this ignores that the automation is not neutral, it is provided by actors with incentives.
We just posted a memorial for Louis Gerstner, the CEO that turned around IBM. I think reading about Louis Gerstner gives a pretty good example of why this would be a bad idea.
My business experience is that company culture is very important to a company’s success and I’m just doubtful that this can be created through AI.
LLM CEOs would be middle of the road, I suspect. Yes, they could be much more in touch with the goings on within the business and make fewer baseless decisions. Though you would not get an innovative or culture-building leader that is necessary in some circumstances.
That could be survivorship bias and smells of a multiple-testing problem. Surely there would also be a few automated systems that would sometimes be able to turn around a company.
Even if this did happen there would no doubt be some sweet-talking schmuck with half a college degree and connections to all the other rich and powerful (likely through their dad) claiming credit for everything the LLM did.
I was a CEO for thirteen years. For the hard CEO skills, AI is perfectly suited for the job; even more than it would be for specialist roles.
For the soft CEO skills, not so much.
Not that that's a deal-breaker. I have a vision of an AI CEO couched as a "strategic thought partner," which the wet-CEO just puppets to grease the skids of acceptance among the employees.
I'd fully trust an AI CEO's decision making, for a predictable business, at least. But some CEOs get paid a lot (deservedly so) because they can make the right decisions in the thick fog of war. Hard to get an AI to make the right decision on something that wasn't in the training corpus.
Still, business strategy isn't as complex as picking winners in the stock market.
I’d suspect that an AI CEO would have to complement its weaknesses — just like any CEO. And in this case, rely on subordinates for glad-handing and vision pitches, while itself focusing on coordinating them, staging them to the right meetings, coordinating between departments, etc.
I think an AI could be strong at a few skills, if appropriately chosen:
- being gaslightingly polite while firmly telling others no;
- doing a good job of compressing company wide news into short, layperson summaries for investors and the public;
- making PR statements, shareholder calls, etc; and,
- dealing with the deluge of meetings and emails to keep its subordinates rowing in the same direction.
Would it require that we have staff support some of the traditional soft skills? Absolutely. But there’s nothing fundamentally stopping an AI CEO from running the company.
If CEOs can be replaced, shouldn't founders be also (who are often CEOs)? What about all levels of leadership? Why can't AIs just run full companies autonomously?
A hundred thousand posts on HN about how you can, without hesitation, replace your network security team with AI, and you can see the flood of CEOs (or CEO wannabes) nodding and murmuring about saving costs. A single post about CEOs being automated and all of a sudden it’s all about “intangible human relation skills” and “AI couldn’t possibly” and “but my network of angel investors and other CEOs”.
I swear there’s a joke or cautionary tale here somewhere about “first they came for..” or something along those lines. The phrasing escapes me.
Maybe the problem isn’t that you can’t automate a CEO, it’s that the actual tangible work just isn’t worth as much as some companies pay for it, and this thread is touching a few too many raw nerves.
It really is funny. I'm sympathetic to the idea that you can't automate the intangibles of company leadership, but that same idea also applies to other things like producing art or software, which seems to be eagerly ignored by the same cohorts who take offence to this idea.
I too laughed when Stable Diffusion came out and artists said their jobs couldn't possibly be automated due to the "intangible human creativity that machines can't replicate." I mean, it's the same thing right? At the end of the day, everyone has economic anxiety and will continue to fight to make money to survive.
Maybe first get the automated vending machines to work properly, and then revisit this question to see which tasks that allows us to peel away from the role?
You know those puff pieces are there to assuage our fears of job loss and provide a folksy aw-shucks patina to the plucky LLM who can't run a vending machine.
If it can engage in criminal activity, then it can skip the golf training.
Every time the LLM CEO gets caught doing a crime and goes to 'jail', the LLMs on the exec board can vote to replace it with another instance of the same LLM model.
Forget 'limited liability', this is 'no liability'.
Love the idea, but whence the training data? Not as readily available as billions of jpegs, lines of code, and audio files. Any clever ideas to source it?
Every year I feel a bit less crazy in my silly armchair speculation that the Second Renaissance from the Animatrix is a good documentary. If AI "takes over" it will be via economic means and people will go willingly until they have gradually relinquished control of the world to something alien. (Landian philosophers make the case that hyperstitional capitalism has already done this)
I would take the over that this will happen sooner than later -- when it's proven to make a lot of money to have an AI CEO, suddenly everyone will change their tune and jump on the bandwagon with dollar signs in their eyes, completely ignoring what they are giving up.
Except unlike e.g. the metaverse/cryptocurrency bandwagon of yesteryear, there's no getting off.
You could replace all of the arguments against automating CEOs with the same arguments against automating allocation of capital. In both these cases people are imagining their work to be more subtle and ineluctably human than, for example, the work of doctors and lawyers. Sure, buddy. Keep telling yourself.
You say that like we've already done away with human doctors and lawyers. Lawyers can't even use LLMs as an aid without them making up fake citations. The technology isn't close to being able to be used unsupervised, and humans are proving too irresponsible to supervise it.
Should have (2023) in the title. This is an old post.
Also, I think it misses the critical point. C-suite executives operate under immense pressure to deliver abstract business outcomes, but the lack of clear, immediate feedback loops and well-defined success metrics makes their roles resistant to automation. AI needs concrete reward functions that executive decision-making simply doesn't provide.
The CEOs of the biggest corporations could absolutely be replaced... You don't even need an LLM, just a cron job that runs once a day and executes a script:
if (marketCrash) then sendEmailToGovernmentAskingForBailout();
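In the same spirit, a runnable version of that daily script (the crash threshold and the email text are obviously invented, and sending is stubbed out as a return value):

```python
def market_crashed(index_change_pct: float) -> bool:
    # Extremely sophisticated executive judgment: a big enough drop is a crash.
    return index_change_pct < -20.0

def daily_ceo_duties(index_change_pct: float) -> str:
    """The entire job, once a day, as a pure function of the market."""
    if market_crashed(index_change_pct):
        return "Dear Government, please send bailout. Regards, The Algorithm."
    return "No action required today."

print(daily_ceo_duties(-25.0))
```

Hook it up to cron once a day and the compensation committee can go home.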
If you've ever worked at a company that's a chaotic shitshow, you'll know how strong the effect of the CEO is - it always comes down to the guy at the top not being up to it.
The leverage of the role is enormous, and the strength of someone who can carry out this role well for a large company is sky high - not many such people in the world, and they only need one.
So the math all comes out very straightforward: even at obscene looking salaries, they're still a bargain.
The article might be a modest proposal [1], but sooner or later we're going to have to answer questions like these.
An even more interesting one is: What will we reward?
We've been rewarding labor quantity, as well as quality via higher wages - as motivation and as incentives for more education. This reflected the productivity primacy of knowledge work in modern economies, but that might not be the case down the road.
We've also been rewarding capital. Originally this was a way for the elites to keep themselves in place (a.k.a. economic rents), but in modern times it's been more of an entrepreneurial incentive (a.k.a. economic profits.)
Without the economic profit rationale, there's no reason to reward capital accumulation. Only pro-profit decisions are good for society, pro-rent decisions are awful. If there's no profit to incentivize, capitalism is just bad all around.
If AI becomes a better profit decision-maker than an entrepreneur, any humans left in the loop are nothing but high-rollers gambling with everyone else's money.
For context, New Statesman is the premier British socialist magazine.
Whatever the merits of the argument here (and my bolshie side has also flippantly pushed it in the past) the motivation and thrust of the essay needs to be considered in that ideological grounding.
There is probably a point when there is enough capital and enough investors that you don't need a CEO or executives. The investors already take care of all the politics and business networking side of the company for free.
The investors can organize the government bailouts themselves. You don't need a CEO.
I'm constantly reminded of the 2000 election. Part of Al Gore's communications strategy was to constantly talk about the "top 1%". He wasn't wrong about this, but interestingly there was a survey in 2000 showing that 19% of people thought they were in the top 1% and another 20% thought they would be someday.
This is how successful American propaganda is. 39% of people believed something that definitionally could never be true.
So you will find people who make average salaries defending the stratospheric salaries of CEOs because they believe they'll one day be the one benefitting or they've fallen for some sort of propaganda such as the myth of meritocracy or prosperity gospel.
Our entire economy is designed around exploiting working people and extracting all of their wealth to a tiny portion of the population. And we're reaching the point where the bottom 50% (if not more) have nothing left to exploit.
AI and automation could be used to improve all of our lives. It isn't and it won't be. It'll be used to suppress wages and displace workers so this massive wealth transfer can be accelerated.
I get the point of the article. But those with the wealth won't let themselves be replaced by AI, and seemingly the populace will never ask the question of why they can't be replaced until economic conditions deteriorate even further.
I agree with the other comment. It's easy to get into the top 1%. This is actually true about most things. Most people are lazy, don't have the drive to work hard to achieve their goals, or are not willing to make the sacrifices it takes to do so.
> This is how successful American propaganda is. 39% of people believed something that definitionally could never be true.
It's not that difficult to get into the top 1%. Most Americans earn a top 1% income. Even the top 1% of America is only a salary of around $500k. It's possible 19% of survey takers were in the top 1%, or were on a path to make that in the future.
I don't see how it's definitionally untrue to believe you could make $500k a year at some point...Let alone $34,000 a year...
They will have been asked something like "what income level do you believe places someone in the top 1% of American earners?" or "what percentile of American earners do you believe yourself to be in?"
So the top 1% of income is currently ~$570k. The vast majority of people will never see that.
But more relevant is the top 1% of net worth is currently ~$11.6M [1], which is vastly more unattainable.
Also, the net worth of the bottom 99% is skewed by house prices. You might be sitting on a house worth $1M but when every other house also costs $1M and you have to live somewhere, you don't really have a net worth of $1M.
I don't know how that particular poll was worded, but in general if you're a politician who rails against the top 1%, you might suffer from the fact that people have widely varying conceptions of who the 1% are.
Well, somebody has to go to jail if catastrophic decisions are made and you can't jail AI. We very often see CEOs being jailed in the real world, so the pay is actually a very fair compensation for the risk.
Can't read the paywalled article so i'm judging from just the headline (not sure if it's sarcasm or actual opinion),
but as someone who has the honor of working with a really good ceo i can definitely say that you cannot automate them. maybe in some streamlined corporate machine like ibm or something, but not in a living growing company
s/operates/operates on/
So they are a surgeon? Wouldn't be surprised at the damage they cause, considering the business results of so many companies.
A CEO's job is (roughly) to maximize a company's valuation. It is not to run the company themselves, not to be nice, not to improve the world. I'm not claiming this is what _should_ be, just how it _is_. By this metric, I think Musk has done really well in his role.
Edit: Tangentially related -- at the end of the musical "Hadestown", the cast raise their glasses to the audience and toast "to the world we dream about, and the one we live in today." I think about that a lot. It's so beautiful, helps enforce some realism on me, and makes me think about what I want to change with my life.
It's called "lying to customers and investors".
> And the bad-person strategy works well for him.
Worked. Tesla is not doing that well recently. Others are a bit better.
Maybe, maybe not. We often see technology reach a threshold that allows for sudden progress, like Newton and Leibniz both coming up with calculus at around the same time (https://en.wikipedia.org/wiki/Leibniz%E2%80%93Newton_calculu...), or Darwin rushing to publish On The Origin of Species because someone else had figured out the same thing (https://en.wikipedia.org/wiki/Alfred_Russel_Wallace).
SpaceX benefited immensely from massive improvements in computing power, sensors, etc.
The secret sauce is execution.
Hired CEOs are there to execute a board vision.
Made CEOs are there to execute their vision.
Their level of expertise, access, relationships, etc all scale with the business. If it’s big, you need someone well connected who can manage an organization of that size. IANAE but I would imagine having access to top schools would be a big factor as well.
Maybe he’s just that good at what he does?
The entire point of an MBA is networking for executive roles.
If you need to be lucky in meeting the right people, you can increase your chances by spending your evenings in your nearest financial district watering hole. We’ve easily established luck can be controlled for, which puts us back into skill territory.
What specifically must one luck out on? Have you tried?
Every one of us here has an unbroken line of lucky (lucky enough!) ancestors stretching back a billion years or so. Pretending it's not a thing is silly.
When you're born matters. Where you're born matters. Who you encounter matters. etc. etc. etc.
> What specifically must one luck out on? Have you tried?
I think perhaps we have different definitions of luck.
Sure, but that's not what's being asserted. I am not "permanently locked out" of megacorp CEO roles; I'm just vanishingly unlikely to get one.
There are lots of people who have enough singing/dancing skill to be a mega popstar like Taylor Swift. There just aren't enough slots.
Could I become the next Steve Jobs? Maybe! I'd have to get really lucky.
Vanishingly unlikely to get one if you try, or vanishingly unlikely to get one if you sit on your ass all day?
I assume you’re talking about the former and yet I don’t think you’ve thought this through. I think you’ve blindly attributed to luck what actually requires time, perseverance, grit, lack of morality. The only way to figure that out is for you to offer up your understanding of what one must luck out on?
Because they're a form of luck?
If you're born in the developed world, that's luck. If you're born to supportive parents, that's luck. If you're Steve Jobs and you wind up high school buddies with Woz in Mountain View, CA, that's luck. White? Luck. Male? Luck. Healthy? Luck. A light touching of psychopathy? Luck!
> Vanishingly unlikely to get one if you try, or vanishingly unlikely to get one if you sit on your ass all day?
Both.
> I think you’ve blindly attributed to luck what actually requires time, perseverance, grit, lack of morality.
There are many, many people who devote time, perseverance, and grit to their endeavours without becoming a "hugely expensive" CEO. Hence, luck. Is it the only thing? No. Is it a thing? Yes, absolutely.
https://youtube.com/watch?v=hNuu9CpdjIo
The movie makes it quite clear, actually.
The Bobs were actually way better than the stereotypical layoff consultants. They even caught on to the crazy management chain and the busywork generated by TPS reports. Sure, they wanted to lay off good engineers, but that doesn't invalidate the actual good findings.
Did we ever see him interacting with a customer? I don't remember that part and I can't find any clip of it. We see him in many other situations. We know he was not respected and was a weirdo in many ways, but that doesn't say anything about the quality of his customer communication.
You might as well ask why people don’t use AI pickup coaches.
It is good that CEOs also get some of this "You will be replaced by AI!" flak that we hear from CEOs of big tech directed at developers. Do those CEOs think their job is more complex than a software developer job, which they are so eager to replace? How many times more urgently do we want to replace the CEO, considering salaries? How about we put as many times the amount of money into that as we are putting into trying to replace developers?
In the end neither will work out any time soon, judging by the actual AI level of current "AI". I think for that we still need some 2-3 architectural leaps forward. And by that I don't mean simply building bigger ANNs and ingesting more data. It already seems like the returns for that are rapidly diminishing.
You can estimate the difficulty of a job by what fraction of the population can successfully do it and how much special training this takes. Both are reflected in the supply curve for labor for that job.
> How many times more urgently do we want to replace the CEO, considering salaries? How about we put as many times the amount of money into that, as we are putting into trying to replace developers?
Pretty sure that (avg developer pay * number of developers) is a lot more than (avg CEO pay * number of CEOs).
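Back of the envelope, with very rough figures (every number below is an unsourced assumption for illustration, not data): even a generous mean salary across all US chief executives gives a wage bill a few times smaller than the developers'.

```python
# Every figure below is a rough assumption for illustration only.
developers = 1_600_000        # ballpark count of US software developers
avg_dev_pay = 120_000         # ballpark average developer compensation

ceos = 200_000                # ballpark count of US chief executives
avg_ceo_pay = 300_000         # mean skewed upward by a few huge packages

dev_wage_bill = developers * avg_dev_pay   # $192B under these assumptions
ceo_wage_bill = ceos * avg_ceo_pay         # $60B under these assumptions
print(dev_wage_bill / ceo_wage_bill)       # ratio of the two wage bills
```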
I thought you meant "AI-startup CEO" for a moment and was going to agree.
https://en.wikipedia.org/wiki/Fortune_teller_machine
My dad had a manager (who was a VP) that he privately nicknamed "VPGPT", because despite being a very polite and personable guy he pretty much knew nothing about the engineering he was ostensibly managing, and basically just spoke in truisms that sounded kind of meaningful unless you do any kind of analysis on them.
I'm not saying that AI would necessarily be "better", but I do kind of hate how people who are utterly incapable of anything even approaching "technical" end up being the ones making technical decisions.
You'll easily find people preaching or selling that sort of thing on Twitter, and the sort of people who are still on Twitter are probably buying it.
(Probably mentally unhealthy people, but still it happens!)
Don't steal this idea it's mine I'm going to sell it for a million dollars.
Anything that removes the power of CEOs and gives it to the worker should be highly encouraged. Economic democracy is the final frontier of human empowerment and giving workers then means to have democratic control over the economy can only unlock more human potential, not less.
Pretty sure the moment you do this, the workers liquidate the company and distribute the assets among themselves, as evidenced by the acceptance rate of voluntary severance offers in many past downsizings, such as the Twitter one.
High performance sports teams have a captain that is often elected in some form from the team.
Likewise the crew of a pirate ship used to elect their captain.
Both examples run contrary to your point, and there's no reason you couldn't have something similar in business: a cooperative that elects a CEO, rather than it being done by a board of other CEOs.
Except replacing CEOs with AIs will not do this.
It won't make the companies worse run, why would workers want to destroy their means to live? CEOs do this with no skin in the game, the workers should take that skin as they will always be better stewards than the single tyrant.
Where is your evidence that companies won't be worse run? Workers could just vote to give themselves massive raises and hemorrhage the company, ironically like how some private equity firms operate, but en masse. No one would start companies in this sort of scenario, thereby causing the economy to falter, especially in comparison to companies that don't have this sort of voting system.
The entire job is almost entirely human to human tasks: the salesmanship of selling a vision, networking within and without the company, leading the first and second line executives, collaborating with the board, etc.
What are people thinking CEOs do all day? The "work" work is done by their subordinates. Their job is basically nothing but social finesse.
So, writing emails?
"Hey, ChatGPT. Write a business strategy for our widget company. Then, draft emails to each department with instructions for implementing that strategy."
There, I just saved you $20 million.
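Taking the joke literally, that's about twenty lines with just the standard library. The endpoint and payload shape follow OpenAI's chat-completions API, but the model name, company, and departments are placeholders; treat the whole thing as a hypothetical sketch, not a recommendation:

```python
import json
import os
import urllib.request

def ceo_prompt(company: str, departments: list[str]) -> str:
    # Assemble the $20 million question as a single prompt.
    dept_list = ", ".join(departments)
    return (
        f"Write a one-page business strategy for {company}, a widget company. "
        f"Then draft a short email to each of these departments ({dept_list}) "
        "with instructions for implementing that strategy."
    )

def run_ai_ceo(company: str, departments: list[str]) -> str:
    # POSTs to OpenAI's chat-completions endpoint; needs OPENAI_API_KEY set.
    payload = {
        "model": "gpt-4o",  # placeholder; substitute whatever model is current
        "messages": [
            {"role": "user", "content": ceo_prompt(company, departments)}
        ],
    }
    req = urllib.request.Request(
        "https://api.openai.com/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```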
I think that you don't appreciate that charismatic emails are one of the few things that modern AI can do better than humans.
I wouldn't trust ChatGPT to do my math homework, but I would trust it to write a great op-ed piece.
If it were this easy, you could have done it by now. Have you?
In order to save $20 million with this technique, the first step is to hire a CEO who gets paid $20 million. The second step is to replace the CEO with a bot.
I confess that I have not yet completed the first step.
Although I think it's more likely that we're going to enter an era of fully autonomous corporations, and the position of "CEO" will simply no longer exist except as a machine-to-machine protocol.
You can't think of a single difference in the nature of the job of artist/musician vs. lawyer vs. business executive?
That applies to every call to replace jobs with current-gen AI.
But I can't think of a difference between CEOs and other professions that works out in favor of keeping the CEOs over the rest.
Everyone is indispensable until they aren't.
I can think of plenty, but none that matter.
As the AI stans say, there is nothing special about being human. What is a "CEO?" Just a closed system of inputs and outputs, stimulus and response, encased in wetware. A physical system that like all physical systems can be automated and will be automated in time.
They both need social finesse and CEO’s don’t need a body.
The real job is done behind the curtain. Picking up key people based on their reputation, knowledge, agency, and loyalty. Firing and laying off people. Organizational design. Cutting the losses. Making morally ambiguous decisions. Decisions based on conversations that are unlikely to ever be put into bytes.
More practically, legal accountability would be placed on the individuals approving the LLM's actions and/or the entity providing the LLM service. The latter aspect is why many AI vendor deals fall through: everything is awesome until the contract comes and the vendor wants to take no responsibility for anything that results from their product.
Lots of people have legal obligations.
In this case, I assume you're referring to a fiduciary duty (i.e. to act in the best interests of the company), which is typically held not by the CEO but by the directors.
Ultimately the responsibility to assign daily operation of the company rests with the board, both legally and practically, as does the decision to use a human or AI CEO.
It would be an interesting experiment to promote an executive assistant to CEO though.
It seems like you'd need some sort of fairly radical control structure (say, no board, just ai interacting directly with shareholders) to get around this. But even this ignores that the automation is not neutral, it is provided by actors with incentives.
My business experience is that company culture is very important to a company’s success and I’m just doubtful that this can be created through AI.
For the soft CEO skills, not so much.
Not that that's a deal-breaker. I have a vision of an AI CEO couched as a "strategic thought partner," which the wet-CEO just puppets to grease the skids of acceptance among the employees.
I'd fully trust an AI CEO's decision making, for a predictable business, at least. But some CEOs get paid a lot (deservedly so) because they can make the right decisions in the thick fog of war. Hard to get an AI to make the right decision on something that wasn't in the training corpus.
Still, business strategy isn't as complex as picking winners in the stock market.
I think an AI could be strong at a few skills, if appropriately chosen:
- being gaslightingly polite while firmly telling others no;
- doing a good job of compressing company wide news into short, layperson summaries for investors and the public;
- making PR statements, shareholder calls, etc; and,
- dealing with the deluge of meetings and emails to keep its subordinates rowing in the same direction.
Would it require that we have staff support some of the traditional soft skills? Absolutely. But there’s nothing fundamentally stopping an AI CEO from running the company.
I swear there’s a joke or cautionary tale here somewhere about “first they came for..” or something along those lines. The phrasing escapes me.
Maybe the problem isn’t that you can’t automate a CEO, it’s that the actual tangible work just isn’t worth as much as some companies pay for it, and this thread is touching a few too many raw nerves.
Well, either way it’s hilarious.
Every time the LLM CEO gets caught doing a crime and goes to 'jail', the LLMs on the exec board can vote to replace it with another instance of the same LLM model.
Forget 'limited liability', this is 'no liability'.
Every year I feel a bit less crazy in my silly armchair speculation that the Second Renaissance from the Animatrix is a good documentary. If AI "takes over" it will be via economic means and people will go willingly until they have gradually relinquished control of the world to something alien. (Landian philosophers make the case that hyperstitional capitalism has already done this)
[1] https://en.wikipedia.org/wiki/A_Modest_Proposal
It’s been tried before, it didn’t work out well.
1% of Americans earn a top 1% income. They weren't being asked "do you make more than an amputee kid in Gaza?"
> It's possible 19% of survey takers were in the top 1%…
There's a whole field of math devoted to preventing this. Polling works quite well, all things considered.
> They weren't being asked "do you make more than an amputee kid in Gaza?"
Context matters.
Often posed as a multiple choice question.
[1]: https://finance.yahoo.com/news/among-wealthiest-heres-net-wo...
Could be good, but could also be bad if it turns out the AI is able to be even more ruthless in how it treats its workforce.
The good news is that it doesn't need to be very accurate in order to beat the performance of most execs anyways.
Where "very often" means "almost never?"
> as someone who has the honor of working with a really good ceo i can definitely say that you cannot automate them
Because building psychopathic AI's is - at the moment - still frowned upon.