
Yoshua Bengio

👤 Speaker
1179 total appearances
Voice ID

Voice Profile Active

This person's voice can be automatically recognized across podcast episodes using AI voice matching.

Voice samples: 2
Confidence: High

Appearances Over Time

Podcast Appearances

Lex Fridman Podcast
Yoshua Bengio: Deep Learning

I mean, this is like too much of an assumption.

They're going to have some interesting relationships that allow us to predict things in the future and to explain what happened in the past.

The kind of knowledge about those relationships in a classical AI system is encoded in the rules.

Like a rule is just like a little piece of knowledge that says, oh, I have these two, three, four variables that are linked in this interesting way.

Then I can say something about one or two of them given a couple of others, right?
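As a toy illustration (my own hypothetical example, not from the episode): a classical-AI rule is a small, self-contained piece of knowledge linking a few variables, and knowing some of those variables lets you say something about the others.

```python
# Toy rule linking three variables: wet = raining AND outside.
# The rule is a standalone unit of knowledge; given values for some of the
# variables, it constrains (or determines) the remaining one.

def infer_wet(raining, outside):
    """Forward use of the rule: derive 'wet' from the other two variables."""
    return raining and outside

def infer_outside(raining, wet):
    """Backward use of the same rule: derive 'outside' when possible.

    Returns True/False when the rule pins the value down, or None when
    this rule alone cannot determine it.
    """
    if wet:
        # Being wet requires both raining and outside under this rule.
        return True if raining else None
    if raining:
        # Raining but dry: must have stayed inside.
        return False
    # Not raining and not wet: 'outside' is unconstrained by this rule.
    return None
```

Because the rule lives on its own, replacing it (say, adding an umbrella variable) would not disturb any other rule in the system, which is the modularity being described here.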

In addition to disentangling the elements of the representation, which are like the variables in a rule-based system, you also need to disentangle the mechanisms that relate those variables to each other.

So like the rules.

So the rules are neatly separated.

Each rule is living on its own.

And when I change a rule because I'm learning, it doesn't need to break other rules.

Whereas current neural nets, for example, are very sensitive to what's called catastrophic forgetting, where after I've learned some things and then I learn new things, they can destroy the old things that I had learned, right?

If the knowledge was better factorized and separated, disentangled, then you would avoid a lot of that.
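The forgetting effect mentioned above can be shown in a deliberately tiny sketch (my own illustrative setup, not anything from the episode): a single shared parameter is trained by gradient descent on one task, then on a conflicting second task. Because all the knowledge is entangled in one parameter, learning task B overwrites task A.

```python
# Minimal sketch of catastrophic forgetting with a one-parameter model y = w*x,
# trained sequentially on two incompatible tasks by plain SGD (pure Python).

def train(w, data, lr=0.1, steps=200):
    """SGD on squared error for the model y = w * x."""
    for _ in range(steps):
        for x, y in data:
            grad = 2 * (w * x - y) * x  # d/dw of (w*x - y)^2
            w -= lr * grad
    return w

def loss(w, data):
    """Mean squared error of y = w * x on a dataset."""
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

task_a = [(1.0, 2.0), (2.0, 4.0)]    # consistent with w = 2
task_b = [(1.0, -1.0), (2.0, -2.0)]  # consistent with w = -1

w = train(0.0, task_a)
loss_a_before = loss(w, task_a)      # near zero: task A learned

w = train(w, task_b)                 # now train on task B only
loss_a_after = loss(w, task_a)       # large: task A has been "forgotten"
```

If the two tasks were instead handled by separate, disentangled parameters (one per rule, in the spirit of the passage above), updating one would leave the other intact.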

Now, you can't do this in the sensory domain, but

What do you mean by sensory?

Like in pixel space.

But my idea is that when you project the data into the right semantic space, it becomes possible to represent this extra knowledge beyond the transformation from input to representations, namely how representations act on each other and predict the future and so on, in a way that can be neatly disentangled.

So now it's the rules that are disentangled from each other and not just the variables that are disentangled from each other.