Joe Allen
So right now that's how pretty much all of this is done.
So one example of that would be after OpenAI got all the flak for children committing suicide and people experiencing AI psychosis and misdiagnoses, they assured everyone that OpenAI's GPT can't be used for medicine.
So it's an internal regulation with no one to stand over their shoulder and enforce it.
That's right.
And then you would have standards that would be official standards of the United States government.
Yeah, I think one of the big problems is that these companies are just simply acting like they're putting out apps.
This is just another tool, another technology.
Typically, when you purchase even a pack of cigarettes, it's not going to tell you to shoot yourself in the head, right?
Anything that's even remotely dangerous is regulated, labeled, so on and so forth.
With AI, as you say, it's completely unregulated in any meaningful way other than these new state laws that are being put in.
We'll see what happens there.
What this EO and what the potential moratorium injected into the NDAA would do is to block the ability of states to govern themselves, to determine whether they want their children using these apps, whether they want them in the schools, whether they want them in companies.
Surveillance is a huge issue.
The copyright issue, copyright infringement, is a huge, huge issue.
It's pointed out in here that they're dangling California's law, which would demand transparency from companies, as some sort of egregious infraction, right?
Some sort of intrusion onto the freedom of corporations.
Or Colorado, which, how do they put it?
He's telling me he's up.
Okay.