Zack Kass
And the parallelization of data they borrowed
from the study of the human brain.
They basically argued that we should be building artificial intelligence systems much more like the human brain, and that the neural network, which, again, was a theory that had been around since, I think, the 50s or 60s, was in fact the right way to do it.
And they were right.
And it catapulted the industry forward.
And it was at that time that OpenAI launched GPT, the original GPT, which stands for Generative Pre-trained Transformer.
It was built on the transformer architecture that the Transformer Eight pioneered.
And GPT was very good by any measure, relatively, and then GPT-2 and then GPT-3.
And it was around the time OpenAI was planning for GPT-3 that they were like, you know, we should get a salesperson to sell this thing.
They looked around, they're like, it's all researchers.
No one here can sell this thing.
We should get someone who can help the customers.
It turns out there was basically one other person in the world who was selling large language models at the time: me. And I was selling them for the purposes of translation.
I was selling large language models that Lilt was building to help companies translate their content.
So I got introduced and met the team and spent some time with
Sam and got hired as basically the first business hire.
And for a long time, well, not for a long time, I guess for four months, if you emailed support@, sales@, or info@ OpenAI, you arrived at my inbox, which was fun.
And it gave me a pretty good view of how everything should work.
And then that gave way to building a team.