This makes sense.
Initially, the focus was consumer use of AI. People needed to feel safe; they needed to feel part of something better.
Now the focus is enterprises, and they need to know that their tokens aren't going to spike in price from taxes.
He wouldn't want to be accused of actually believing in anything
Fickle Clown Economics
talk is cheap, so what is he doing today to make that happen? Is he supporting NGOs or politicians focused on making this happen, or is he just musing that it would be nice? (I honestly don't know, but I've only seen the non-committal musing.)
There is this recurring theme in our "billionaire-said-a-thing" news cycle where billionaires imply that their tech will result in huge benefits, to be delivered by someone else once they've amassed their fortunes.
Musk is the most obvious. He publicly proclaims his mission to create technology that ensures we all live better, healthier lives, while routinely violating labor, safety, and environmental laws and insisting his employees work 60-hour weeks with minimal vacation in order to give him a shot at becoming the world’s first trillionaire.
I wish the press would push back once in a while.
Yep. Classical cheap talk. No skin in the game. Put your money where your mouth is.
Sam Altman is willing to roll the dice on human extinction in order to make a buck, even though he's already outrageously rich.
I feel that I cannot properly voice my feelings about him without being banned from hacker news.
I'll say it: he's going to end up inciting action from the people with nothing left to lose. It's happened once already, and frankly I'm surprised it didn't happen sooner. He's regularly talking about how we should be trying to adapt to and cope with an extinction-level problem of his own design, whose benefits apply only to an extremely small minority of the population.
I say this as someone who is an avid AI enthusiast. Sam Altman is a monster.
I don’t feel bad for believers in UBI; there are adults who still believe in the Easter Bunny too.
people believe in an afterlife for the same reason. sometimes people really need something to make it all worth it. and for some people, knowing there is a light at the end of the tunnel economically is exactly that
> the average person in a post-work future needs to have a genuine ownership stake in the AI compute that's making things happen, not just welfare funded from the profits of the billionaires who own it
So nationalise the AI companies? Isn’t that exactly what that would be? I am not opposed to the idea of public ownership, but I think some of the existing investors aren’t going to be happy with that option.
> some of the existing investors aren’t going to be happy with that option.
they will be happier than having no customers because everyone is out of a job
Yeah billionaires don't want to pay any money to the society that enabled them.
The headline is clearly crafted to make you think that so that you'll click on it. The contents make it clear that he's really saying something different and much more aligned with your thoughts. He thinks that in a world where labor is less valuable, UBI won't be enough; the average person in a post-work future needs to have a genuine ownership stake in the AI compute that's making things happen, not just welfare funded from the profits of the billionaires who own it.
A "genuine ownership stake in the AI compute that's making things happen" sounds to me like corpo-speak for "taxpayer-funded bailout of my unprofitable company". After all, if everyone has a stake in AI, and AI crashes, then everyone (not just OpenAI) loses their money.
Now that you mention it! He's been big on bailouts recently! Socialize the losses ...
> the average person in a post-work future needs to have a genuine ownership stake in the AI compute that's making things happen, not just welfare funded from the profits of the billionaires who own it
I've heard this said and can only imagine babies being born with stock options in OpenAI; in which case, there is not really much difference between your two scenarios.
Otherwise, how are you going to distribute ownership of AI compute, if no one has jobs to earn it?
There was a study a while back in Germany (https://cepr.org/voxeu/columns/identity-and-wellbeing-how-re..., h/t Matt Yglesias) which showed that people in retirement are happier than people in economically equivalent unemployment. People's understanding of the system and how they relate to it is an important factor.
If 50% of the population understand themselves to belong to a permanent leisure class, who are entitled to simply go through life hosting dinners and grabbing drinks and taking walks in the park whenever they'd like, that's probably an OK future. The pathway from here to there is scary, and you'd have to think about how to manage the other 50% outbidding them for positional goods, but you could imagine it working out.
If 50% of the population understand themselves to belong to a permanent underclass, dependent on the largesse of the other 50% to keep them alive, they're going to be extraordinarily motivated to burn things down. Even if the other 50% establish a generous welfare system, perhaps so generous that you can obtain all the same goods and services that you could in the first scenario, it wouldn't solve the problem.
And UBI said by billionaires is like lube wielded by a rapist.
The only people who love basic income are the people who want to receive it.
But if you're the one who has to pay for it? It's an entirely different picture.
Nah. Tax the shit out of me (and my peers).
sam, i’m sorry, but is that really the safest thing for you to be saying after the recent attempts on your life? doesn’t seem very wise. you’re just going to pour gasoline on the fire
anyone remember: pushing back the "veil of ignorance" Q-Star Q*