Two Voice Devs

Episode 199 - Is the Future of AI Local?

22 Jul 2024

Description

Join Allen Firstenberg and Roger Kibbe as they delve into the exciting world of local, embedded LLMs. We navigate some technical gremlins along the way, but that doesn't stop us from exploring the reasons behind this shift, the potential benefits for consumers and vendors, and the challenges developers will face in this new landscape. We discuss the "killer features" needed to drive adoption, the role of fine-tuning and LoRA adapters, and the potential impact on autonomous agents and an appless future.

Resources:
* https://developer.android.com/ai/aicore
* https://machinelearning.apple.com/research/introducing-apple-foundation-models

Timestamps:
00:20: Why are vendors embedding LLMs into operating systems?
04:40: What are the benefits for consumers?
09:40: What opportunities will this open up for app developers?
14:10: The power of LoRA adapters and fine-tuning for smaller models.
17:40: A discussion of Apple, Microsoft, and Google's approaches to local LLMs.
20:10: The challenge of multiple LLM models in a single browser.
23:40: How might developers handle browser compatibility with local LLMs?
24:10: The "three-tiered" system for local, cloud, and third-party LLMs.
27:10: The potential for an "appless" future dominated by browsers and local AI.
28:50: The implications of local LLMs for autonomous agents.
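To make the "three-tiered" idea from the episode concrete, here is a minimal sketch of how a web developer might prefer an on-device model when the browser exposes one, fall back to a vendor's cloud endpoint, and finally to a third-party provider. The `window.ai.createTextSession` shape and both endpoint URLs are assumptions for illustration only, not a confirmed API surface; browser-embedded LLM APIs are still experimental and vary by vendor.

```typescript
type Completion = { text: string };

interface LocalModelSession {
  prompt(input: string): Promise<string>;
}

// Assumed shape of a browser-provided local model (loosely inspired by
// experimental on-device prompt APIs); treat as illustrative only.
declare global {
  interface Window {
    ai?: { createTextSession?: () => Promise<LocalModelSession> };
  }
}

async function complete(prompt: string): Promise<Completion> {
  // Tier 1: local, embedded model (no network round-trip, data stays on device).
  if (window.ai?.createTextSession) {
    try {
      const session = await window.ai.createTextSession();
      return { text: await session.prompt(prompt) };
    } catch {
      // Model unavailable or still downloading; fall through to tier 2.
    }
  }

  // Tier 2: the app vendor's own cloud endpoint (placeholder URL).
  const res = await fetch("https://example.com/api/llm", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
  });
  if (res.ok) {
    return { text: (await res.json()).text };
  }

  // Tier 3: a third-party hosted model as the last resort (placeholder URL).
  const fallback = await fetch("https://third-party.example/v1/complete", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
  });
  return { text: (await fallback.json()).text };
}
```

The point of the sketch is the feature-detection-plus-fallback structure, which is also how developers might handle the browser-compatibility question raised at 23:40: the same application code can run against whichever tier is actually available on the user's device.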
