Gili Raanan
Podcast Appearances
I've been in cybersecurity for over 35 years. I'm not happy to admit it, but that's reality. So I believe I do have a perspective on it. We live in days where it's actually the perfect storm in cybersecurity, for various reasons. And when I look back at the days I founded CyberStarts, just short of seven years ago, in 2018, it wasn't the case. In 2018, cybersecurity was considered a boring portion of IT.
I think that what we see today is that there's a global powerhouse conflict that drives major forces into cybersecurity. It's the Ukrainian conflict, the Gazan conflict; it's all over the world. Offensive cybersecurity became a lethal weapon, or even a strategic weapon, for states and powerhouses involved in those conflicts.
So that's one element that's making the cybersecurity threat vector so real, so dangerous, so available. And there's a constant drift from state-level cybersecurity weapons to criminal organizations and then to just script kiddies. So we live in a very dangerous world, way more dangerous than it was just a few years ago. So that's one. The second element is technology.
I think that we've seen the iPhone moment for AI in 2023, or maybe even 2024. AI is not going to improve cybersecurity; it's going to redefine it. The same technology, the same LLMs, the same AI agents that you'd use to predict attacks and prevent them would be used against you to deliver attacks at a scale and level of sophistication we haven't seen before.
And the methods that we developed over maybe the past 80 years, starting in World War II, would be useless, because the speed, scale, and sophistication of those attacks would be something that we haven't seen before. The world is a way more dangerous place today.
Think about modern warfare and the use of technology. It started with better analysis, a better decision-making process. Then it moved to human augmentation. Think about the introduction of the first tanks in the First World War, or the introduction of the Kalashnikov. It made humans more lethal. But with AI, there are additional steps. It's workflow automation.
And then the final step, where AI takes control. And if you think about modern warfare, you think about LLMs that quickly put together attack plans that you didn't practice for, you didn't plan for, you don't have a plan B and a plan C for. They're already using AI agents to take control and launch those attacks. On the defensive side, that means that human augmentation would be an improvement.
For the time being, it might be enough, but it won't be a great solution for the long term. For the long term, you'd have to use LLMs to predict attack vectors in the very same way and give AI agents the control, so that you have a chance to respond to those attacks in a timely manner. That's the future we're heading to. There's no question about it.
If you've been a fan of the Arnold Schwarzenegger Terminator movies, that used to be a science fiction future.
The best offensive tactics are always ROI driven. How do I inflict the highest damage on you with the smallest effort, at the lowest cost, in the fastest way? And what counts as damage is always a contextual question; it depends who you are. But we've seen examples of so many different ways to inflict pain on a target.