
Yoshua Bengio

Speaker
1179 total appearances
Voice ID

Voice Profile Active

This person's voice can be automatically recognized across podcast episodes using AI voice matching.

Voice samples: 2
Confidence: High

Appearances Over Time

Podcast Appearances

Lex Fridman Podcast
Yoshua Bengio: Deep Learning

And there are other issues, of course, with the old AI, like not really good ways of handling uncertainty.

I would say something more subtle, which we understand better now, but I think still isn't enough in the minds of people.

There's something really powerful that comes from distributed representations, the thing that really makes neural nets work so well.

And it's hard to replicate that kind of power in a symbolic world.

The knowledge in expert systems and so on is nicely decomposed into like a bunch of rules.

Whereas if you think about a neural net, it's the opposite.

You have this big blob of parameters which work intensely together to represent everything the network knows.

And it's not sufficiently factorized.

And so I think this is one of the weaknesses of current neural nets.

We have to take lessons from classical AI in order to bring in another kind of compositionality, which is common in language, for example, and in these rules, but which isn't so native to neural nets.

So let me connect this with disentangled representations, if you don't mind.

So for many years, I've thought, and I still believe, that it's really important that we come up with learning algorithms, either unsupervised or supervised, or reinforcement, whatever, that build representations in which the important factors, hopefully causal factors, are nicely separated and easy to pick up from the representation.

So that's the idea of disentangled representations.

It says: transform the data into a space where everything becomes easy.

We can maybe just learn with linear models about the things we care about.
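The idea in the two quotes above can be made concrete with a toy sketch (not from the episode, and with all names and numbers invented for illustration): if a representation recovers the underlying factors from entangled observations, an ordinary linear least-squares fit on that representation suffices to predict a target that depends on the factors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent "causal" factors, e.g. an object's size and position.
factors = rng.normal(size=(200, 2))

# Raw observations entangle the factors through a fixed mixing matrix.
mixing = np.array([[1.0, 0.7],
                   [0.3, 1.0]])
observations = factors @ mixing.T

# The target depends linearly on the factors themselves.
target = 2.0 * factors[:, 0] - 1.0 * factors[:, 1]

# Stand-in for a disentangled representation: here we simply invert the
# known mixing. In practice, producing this is the hard learning problem.
disentangled = observations @ np.linalg.inv(mixing.T)

# On the disentangled representation, plain linear regression recovers
# the true coefficients [2.0, -1.0].
coef, *_ = np.linalg.lstsq(disentangled, target, rcond=None)
print(np.round(coef, 3))
```

The point of the sketch is only the division of labor: once the hard nonlinear work of separating the factors is done, "the things we care about" become learnable with the simplest model class.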

And I still think this is important, but I think this is missing out on a very important ingredient, which classical AI systems can remind us of.

So let's say we have these disentangled representations.

You still need to learn about the relationships between the variables, those high-level semantic variables.

They're not going to be independent.