Tim Davis:
So we were scaling this out to mobile phones, penetrating Android devices.
So, you know, most folks may not understand that when you take a photo, you're using AI.
You know, when you search for something, obviously you're using AI.
Even when you pull down and look at applications on your phone, you're using AI.
Increasingly now, even when you're typing on your keyboard, you are using AI.
There is pattern recognition behind the predictive text you see on your keyboard.
100%.
I mean, humans are similar creatures, and I think recommendation systems have proven that over time.
So all of that needed to be powered by infrastructure.
And when you're a company as large as Google, standardizing on a piece of infrastructure is really important.
But one of the things that happened, and I think what Chris and I started to realize when we were at Google, and it's probably interesting for your audience, was that they actually had three stacks.
And these were full software stacks.
They had a software stack for TensorFlow that was sort of optimized for TPUs, which is their proprietary silicon.
They had a software stack that was optimized for CPUs and GPUs, which is what the world would now know as, for example, NVIDIA or AMD machines.
And then another software stack that was optimized really for edge devices.
So think of Android phones or iPhones, or even smaller things like microcontrollers that run in all sorts of different robots and systems like that.
And so what we quickly began to realize was that the world is actually quite hard for an AI engineer who believes in deploying not only in large-scale data centers, but all the way out to edge-based environments.