
BlueDot Narrated

The Easy Goal Inference Problem Is Still Hard

04 Jan 2025

Description

One approach to the AI control problem goes like this:

1. Observe what the user of the system says and does.
2. Infer the user's preferences.
3. Try to make the world better according to the user's preferences, perhaps while working alongside the user and asking clarifying questions.

This approach has the major advantage that we can begin empirical work today — we can actually build systems which observe user behavior, try to figure out what the user wants, and then help with that. There are many applications that people care about already, and we can set to work on making rich toy models.

It seems great to develop these capabilities in parallel with other AI progress, and to address whatever difficulties actually arise, as they arise. That is, in each domain where AI can act effectively, we'd like to ensure that AI can also act effectively in the service of goals inferred from users (and that this inference is good enough to support foreseeable applications).

This approach gives us a nice, concrete model of each difficulty we are trying to address. It also provides a relatively clear indicator of whether our ability to control AI lags behind our ability to build it. And by being technically interesting and economically meaningful now, it can help actually integrate AI control with AI practice.

Overall I think that this is a particularly promising angle on the AI safety problem.

Original article: https://www.alignmentforum.org/posts/h9DesGT3WT9u2k7Hr/the-easy-goal-inference-problem-is-still-hard

Author: Paul Christiano

A podcast by BlueDot Impact. Learn more on the AI Safety Fundamentals website.
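The observe-then-infer loop described above can be sketched as a toy Bayesian goal-inference routine. This is a minimal illustration, not anything from the article itself: the `infer_goal` function, the candidate goals, and the Boltzmann-rational likelihood model (a standard simplification in which the user picks actions with probability proportional to `exp(beta * reward)`) are all assumptions made for the example.

```python
import math

def infer_goal(observed, goals, actions, reward, beta=1.0):
    """Toy Bayesian goal inference from observed user actions.

    observed: actions the user was seen taking.
    goals:    candidate goals the user might have.
    actions:  the full set of actions available to the user.
    reward:   reward(goal, action) -> float, how good `action` is
              if the user's true goal is `goal` (assumed known here,
              which is itself a big simplification).
    beta:     rationality parameter; higher = more reliably optimal user.
    Returns a posterior distribution {goal: probability}, starting
    from a uniform prior over `goals`.
    """
    log_post = {g: 0.0 for g in goals}  # log of uniform prior (up to a constant)
    for a in observed:
        for g in goals:
            # Boltzmann-rational likelihood: P(a | g) ∝ exp(beta * reward(g, a))
            log_z = math.log(sum(math.exp(beta * reward(g, b)) for b in actions))
            log_post[g] += beta * reward(g, a) - log_z
    # Normalize in log space for numerical stability.
    m = max(log_post.values())
    weights = {g: math.exp(v - m) for g, v in log_post.items()}
    total = sum(weights.values())
    return {g: w / total for g, w in weights.items()}

# Hypothetical example: did the user want coffee or tea?
goals = ["coffee", "tea"]
actions = ["grind_beans", "steep_leaves", "boil_water"]
reward = lambda g, a: 1.0 if (g, a) in {("coffee", "grind_beans"),
                                        ("tea", "steep_leaves")} else 0.0

posterior = infer_goal(["grind_beans", "grind_beans"], goals, actions, reward)
```

Even this toy version surfaces the article's point: the inference is only as good as the assumed model of how preferences produce behavior, and real users are not Boltzmann-rational with a known reward table.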
