
Chain of Thought

AI, Open Source & Developer Safety | Block’s Rizel Scarlett

29 Jan 2025

Description

As DeepSeek so aptly demonstrated, AI doesn't need to be closed source to be successful. This week, Rizel Scarlett, a Staff Developer Advocate at Block, joins Conor Bronsdon to discuss the intersections between AI, open source, and developer advocacy. Rizel shares her journey into the world of AI, her passion for empowering developers, and her work on Block's new AI initiative, Goose, an on-machine developer agent designed to automate engineering tasks and enhance productivity.

Conor and Rizel also explore how AI can enable psychological safety, especially for junior developers. Building on this theme of community, they also dive into topics such as responsible AI development, ethical considerations in AI, and the impact of community involvement when building open source developer tools.

Chapters:
00:00 Rizel's Role at Block
02:41 Introducing Goose: Block's AI Agent
06:30 Psychological Safety and AI for Developers
11:24 AI Tools and Team Dynamics
17:28 Open Source AI and Community Involvement
25:29 Future of AI in Developer Communities
27:47 Responsible and Ethical Use of AI
31:34 Conclusion

Follow Conor Bronsdon: https://www.linkedin.com/in/conorbronsdon/
Follow Rizel Scarlett: https://www.linkedin.com/in/rizel-bobb-semple/
Rizel's website: https://blackgirlbytes.dev/

Show Notes
Learn more about Goose: https://block.github.io/goose/


