Bogdan Botezatu
Like there are a couple of Instagram accounts that have millions of subscribers and the person does not exist.
The only thing that exists is an AI algorithm that's building content to order.
Well, unfortunately, there is no defense against that.
And would we need a defense to that?
Or would we need a defense to probably some nefarious goals that the AI content will attempt to lead us to?
And here's what we're trying to do here.
We're trying to help people understand the red flags in communication, understand disinformation, understand the likelihood of something that they're exposed to being real.
And probably that will be the future of technology.
Not necessarily detecting that some content is created by AI, but rather detecting that the content, whoever created it, is malicious and will have an impact on you and your security.
You're asking all the hard questions.
This is a very important topic for me because these nefarious interactions that you described can be used by a commercial actor, for instance, to, I don't know, make you behave in a specific way that will result in loss of money.
But they're also used as hybrid warfare now.
Disinformation is a big part of that.
And it doesn't have that kind of structure that makes it obvious.
It doesn't have that call to action that would let me know that the message is wrong, false, or leading to unintended consequences.
The fact that we have deepfakes talking about, I would say, political stuff, impersonations, hidden agendas, and so on, will help an adversary dilute our amount of trust.
They will cause uncertainty.
They will reach their goal by making us question everything and ultimately stop caring about the message, because we can no longer distinguish right from wrong, true from false, and so on.
So, not sure if this answers the question, but that's probably the best answer I can give at this point.
Cybersecurity is a fundamental part of the way we're interacting with technology right now.