Jacob Kimmel
So that's another version of what an all-encompassing model would look like, where you actually have compounding returns in drug discovery.
Yeah, absolutely.
So I think we actually do both today.
So we can train these models where basically the inputs are a representation of what that cell looked like at the start.
Here's what a generic old cell looked like.
And then representations of the transcription factors themselves.
We derive those from protein foundation models.
They're basically language models trained on protein sequences.
Turns out that gives you a really good base level understanding of biology.
So the model is kind of starting from a pretty smart place.
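One way to derive that kind of representation is to embed each factor's amino acid sequence with a public protein language model. A sketch using ESM-2 via the fair-esm package; the specific model, the placeholder sequences, and the mean-pooling are all assumptions for illustration, not necessarily what's used in practice:

```python
import torch
import esm

# Load a public protein language model (ESM-2, 650M parameters).
model, alphabet = esm.pretrained.esm2_t33_650M_UR50D()
batch_converter = alphabet.get_batch_converter()
model.eval()

# Hypothetical transcription factor sequences (placeholder amino acids).
tfs = [("TF_A", "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"),
       ("TF_B", "MQIFVKTLTGKTITLEVEPSDTIENVKAKIQDK")]
_, _, tokens = batch_converter(tfs)

with torch.no_grad():
    out = model(tokens, repr_layers=[33])
# Crudely mean-pool the final-layer residue embeddings into one
# fixed-size vector per transcription factor.
tf_vectors = out["representations"][33].mean(dim=1)  # shape: (2, 1280)
```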
And then you can predict a number of different targets from some learned embedding, the same way you could have multiple heads on a language model.
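As a concrete sketch of that multi-head setup, in PyTorch; every layer size, the pooling over factors, and the toy data are illustrative assumptions, not a description of the production model:

```python
import torch
import torch.nn as nn

class MultiHeadTFModel(nn.Module):
    """Sketch: predict the effect of overexpressing a set of transcription
    factors (TFs) on a cell, from (starting state, TF embeddings)."""

    def __init__(self, n_genes: int, tf_dim: int, hidden: int = 512):
        super().__init__()
        self.cell_encoder = nn.Linear(n_genes, hidden)   # starting transcriptome
        self.tf_encoder = nn.Linear(tf_dim, hidden)      # protein-LM embeddings
        self.trunk = nn.Sequential(nn.Linear(2 * hidden, hidden), nn.ReLU())
        # Multiple heads off one learned embedding, like multi-task LM heads:
        self.transcriptome_head = nn.Linear(hidden, n_genes)  # every gene
        self.age_head = nn.Linear(hidden, 1)                  # young/old score

    def forward(self, cell_state, tf_embeddings):
        # tf_embeddings: (batch, n_tfs, tf_dim); pool the TF set into one vector.
        tf_summary = self.tf_encoder(tf_embeddings).mean(dim=1)
        z = self.trunk(torch.cat([self.cell_encoder(cell_state), tf_summary], dim=-1))
        return self.transcriptome_head(z), self.age_head(z)

# Toy shapes: 8 cells, 20k genes, 3 TFs, 1280-dim embeddings (ESM-2 size).
model = MultiHeadTFModel(n_genes=20_000, tf_dim=1280)
cell_state = torch.randn(8, 20_000)        # "generic old cell" profiles
tf_embeddings = torch.randn(8, 3, 1280)    # stand-in for protein-LM vectors
pred_expr, pred_age_logit = model(cell_state, tf_embeddings)
```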
And so one of those for us is actually just predicting every gene the cell is expressing.
Can I just recapitulate the entire state and guess what effect these transcription factors will have on every given gene?
And you can think about that as like an objective rather than a value judgment on the cell.
I'm not asking whether or not I want this particular transcriptome.
I'm just asking what it will look like.
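In loss terms, that first head is plain regression across the full expression vector, with no preference about what the transcriptome should be. Continuing the sketch above, with placeholder measurements:

```python
import torch.nn.functional as F

observed_expression = torch.randn(8, 20_000)  # placeholder measured profiles
# Pure prediction objective: how wrong are we, gene by gene?
expr_loss = F.mse_loss(pred_expr, observed_expression)
```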
And then we also have something more like, you know, value judgments.
I believe that that transcriptome looks like a younger cell, and I'm going to select on that and train a head to predict it, so I can denoise across genes and then select for younger cells.
But you could do that for arbitrary numbers of additional heads.
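Continuing the same sketch, a value-judgment head just adds another loss term, and "selecting for younger cells" becomes ranking candidates by that head's output. The binary young/old label here is one assumed framing among several possible:

```python
is_young = torch.randint(0, 2, (8,)).float()  # placeholder young/old labels
age_loss = F.binary_cross_entropy_with_logits(pred_age_logit.squeeze(-1), is_young)
loss = expr_loss + age_loss  # each additional head contributes one more term

# In silico screen: score candidate TF sets by predicted "youthfulness".
with torch.no_grad():
    youth_score = torch.sigmoid(model(cell_state, tf_embeddings)[1]).squeeze(-1)
```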
What are some other states you might want?