Eric Schmidt
I'll give you a simple example. The social media algorithms select the most inflammatory statements, which are often from the most deranged people. And that's because the algorithm works. And because the algorithm says, oh, this is interesting, and a lot of people are listening to it and so forth, that's not a good way to run a democracy.
Maybe we should have a rule that if you make a claim, you have to make a paragraph, right? And actually justify your argument as opposed to, oh my God, the following thing is about to kill and we're all going to die. But that's an example where humans have control, but we've chosen to allow inflammatory speech without the benefit of wisdom. And that's not good.
We don't know the definition of consciousness. My own opinion is that this will not occur in my lifetime. I think that what will be true is that we will coexist with these systems and they'll take on more and more of the drudgery. They'll make the systems more efficient. Efficiency is generally a good thing in economic systems. People will be wealthier. People will be more productive.
My own view is that in my lifetime, everyone's productivity will double. You can do twice as many podcasts. I can do twice as many speeches. Whatever it is that each of us is doing, because the tools make us more efficient. And that's the nature of technology invention. It's been true for 200 years. The car made us more efficient. Google made us more efficient and so forth.
I think that will continue. Because we can't define consciousness, we can imagine that the system could itself claim consciousness. But it's highly unclear, one, whether we could detect it. And second, how would we know? Because it could have just decided to fool us.
We do not. A simple answer is that the systems will automate a more and more complex world. So if you look at a young person, at the moment I'm at Harvard surrounded by students, they are so comfortable with the world of clicking and moving around. They're in this infinite information space and they're comfortable. Whereas people in my generation find it overwhelming.
So people adapt to this explosion of information. But the right system is to have the equivalent of an assistant that sort of organizes your digital world in a way that is net positive for you. Now that has a lot of negative implications, but I don't think that humans will be able to be very productive without their own AI assistant telling them what's most important, reading things.
We have this huge problem around misinformation right now. I just want an AI system to say, this is likely to be true, and this is probably somewhat true, and then give me the analysis. And then I can form my own opinions. The point, going back to your point earlier about agency, which I really liked, is that when you give agency to the computer, you're giving up something very important.
Don't lose your critical thinking. Don't just believe it, even if it's Google.
Well, the biggest one would be things like access to weapons. What I mentioned, recursive self-improvement, where the system can actually learn on its own and we don't know what it's doing. I worry about those, and about the misuse in biology. There are plenty of people working on understanding the capabilities of these models and making sure that they can't produce pathogens.