BlueDot Narrated

The Need for Work on Technical AI Alignment

13 May 2023

Description

This page gives an overview of the alignment problem. It describes our motivation for running courses about technical AI alignment. The terminology should be relatively broadly accessible (not assuming any previous knowledge of AI alignment or much knowledge of AI/computer science).

This piece describes the basic case for AI alignment research, which is research that aims to ensure that advanced AI systems can be controlled or guided towards the intended goals of their designers. Without such work, advanced AI systems could potentially act in ways that are severely at odds with their designers' intended goals. Such a situation could have serious consequences, plausibly even causing an existential catastrophe.

In this piece, I elaborate on five key points to make the case for AI alignment research.

Source: https://aisafetyfundamentals.com/alignment-introduction

Narrated for AI Safety Fundamentals by Perrin Walker of TYPE III AUDIO.

---

A podcast by BlueDot Impact. Learn more on the AI Safety Fundamentals website.

