Nicole Perlroth
And because these companies are more beholden to their shareholders than they are to the security of their customers, it's really on government to force it upon them: to mandate that they sell software and hardware that's secure right out of the box. Like automakers, if there's a defect, they should be forced to fix it and bear the cost of the recall.
All of this is what's called secure by design. And under Jen Easterly, this became a major priority at CISA.
Secure by design is perhaps most urgent in one particular burgeoning field: AI. Artificial intelligence is rapidly embedding itself in how we communicate, how we diagnose illness, in surveillance and national defense. It promises incredible advancements and efficiency, freeing us to focus on higher-order tasks. But behind the scenes, it's unleashed a Pandora's box of complexity.
And complexity is security's greatest enemy. It allows for entirely new points of entry and an entirely new range of dependencies, many we don't and won't understand until someone exploits them. Every time we engage Gen AI, we're not just asking a question. We're handing over the keys to our private lives, our medical histories, our business secrets, even our unspoken thoughts.
I find the whole exercise to be a quiet, compounding surrender of trust. And soon that trust will be granted to AI agents, not just to answer our questions, but to manage business operations on our behalf. As a society, it appears we're determined to dive headfirst into AI, without a second thought as to how this might one day be used against us.
On this, I want to play you an interview that Paul Tudor Jones, the hedge fund manager, recently gave to Andrew Ross Sorkin this May.
What he just told you is that behind closed doors, the leaders of every major AI model are deeply afraid that the very systems they're building could one day be used to kill off millions. Not necessarily because AI becomes sentient and suddenly takes over everything, but because it could be used to automate what we have discussed here.
It could be used to do what hackers currently are doing manually: hacking into our critical systems, like food and water, at scale. And yet no one is hitting pause. Why?
Because the AI arms race, especially with China and very recently with DeepSeek, is so intense that there is simply no incentive at the national or industry level to pause and do what is necessary to mitigate against these harms in the build.
Trump already gutted Biden's AI executive order, which, among other things, required AI developers to test for potential harms before they released these tools into millions of hands. And buried in Trump's new big, beautiful bill, the one that just passed the House,