Bari Weiss
And what they were aiming to do was to make it so closely regulated by the government that in his words, there would only be sort of two or three big companies that they would work with and that they were trying to ultimately protect them from competition. Is that true? Do you know what he's referencing? Was OpenAI one of those companies?
So OpenAI was not one of those companies?
You weren't, like, in a room with OpenAI and a number of — you weren't ever in a room with the Biden administration and other AI companies?
What was your feeling in general about the Biden administration's posture toward AI and tech more generally? You just said, like, you didn't think they'd have the competence to –
OK, that's like a perfect analogy to get us to the comparison that's often made, which is the comparison between AI and nuclear weapons. When Mark was on, I asked him to kind of steel man the Biden administration's perspective or steel man the perspective that this should be heavily regulated.
And he basically drew the analogy to the Manhattan Project and the development of the atomic bomb, when the government felt that it needed to make sure that this new science and innovation remained classified. First of all, do you think that that's a good analogy?
And if so, if it is as powerful as nuclear weapons, wouldn't it make sense for this to be not OpenAI and Gemini and Claude, but rather a project of the federal government?
At the beginning of the nuclear age, we had people in this country who functioned almost like chief science officers, right? I'm thinking about people like Vannevar Bush, who helped launch the Manhattan Project, came up with the National Science Foundation, and kind of guided American policy for those first few, like, very crucial years of nuclear energy. Does that person exist?