Images of child sexual abuse generated by artificial intelligence are on the rise. Australia’s eSafety Commissioner, Julie Inman Grant, says 100,000 Australians a month have accessed an app that allows users to upload images of other people – including minors – to receive a depiction of what they would look like naked. Predators are known to share know-how to produce and spread these images – and in Australia, the AI tools used to create this material are not illegal. All the while, Julie Inman Grant says not a single major tech company has expressed shame or regret for its role in enabling it.

Today, advocate for survivors of child sexual assault and director of The Grace Tame Foundation, Grace Tame, on how governments and law enforcement should be thinking about AI and child abuse – and whether tech companies will cooperate.

If you enjoy 7am, the best way you can support us is by making a contribution at 7ampodcast.com.au/support.

Socials: Stay in touch with us on Instagram

Guest: Advocate for survivors of child sexual assault and director of The Grace Tame Foundation, Grace Tame

See omnystudio.com/listener for privacy information.