
Austrian Artificial Intelligence Podcast

44. Andreas Stephan - University of Vienna - Weak Supervision in NLP

27 Dec 2023

Description

# Summary

I am sure that most of you are familiar with the training paradigms of supervised and unsupervised learning: in supervised learning one has a label for each training data point, while in the unsupervised setting there are no labels. Although there are exceptions, everyone is well advised to perform supervised training whenever possible. But where do you get those labels for your training data if traditional labeling strategies, like manual annotation, are not possible? Often you might not have perfect labels for your data, but you have some idea of what those labels might be. And this, my dear listener, is exactly the area of weak supervision (a minimal code sketch of the idea follows after the references below).

Today on the show I am talking to Andreas Stephan, who is doing his PhD in Natural Language Processing at the University of Vienna in the Digital Text Sciences group led by Professor Benjamin Roth. Andreas explains his recent research in the area of weak supervision, as well as how Large Language Models can be used as weak supervision sources for image classification tasks.

# TOC

00:00:00 Beginning
00:01:38 Weak supervision: a short introduction (by me)
00:04:17 Guest introduction
00:08:48 What is weak supervision?
00:16:02 Paper: SepLL: Separating Latent Class Labels from Weak Supervision Noise
00:26:28 Benefits of priors to guide model training
00:29:38 Data quality & data quantity in training foundation models
00:36:10 Using LLMs for weak supervision
00:46:51 Future of weak supervision research

# Sponsors

- Quantics: Supply Chain Planning for the new normal - the never normal - https://quantics.io/
- Belichberg GmbH: We do digital transformations as your innovation partner - https://belichberg.com/

# References

- Andreas Stephan - https://andst.github.io/
- Stephan et al. "SepLL: Separating Latent Class Labels from Weak Supervision Noise" (2022) - https://arxiv.org/pdf/2210.13898.pdf
- Gunasekar et al. "Textbooks Are All You Need" (2023) - https://arxiv.org/abs/2306.11644
- Introduction to weak supervision: https://dawn.cs.stanford.edu/2017/07/16/weak-supervision/
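
To make the "imperfect but informative labels" idea concrete, here is a minimal, hypothetical Python sketch. It is not taken from the episode or from the SepLL paper: a few keyword-based labeling functions cast noisy votes on each unlabeled text, the votes are aggregated by majority, and examples where every function abstains stay unlabeled. The class names and keywords are invented for illustration.

```python
# Illustrative sketch of weak supervision via "labeling functions".
# Each function returns a noisy label guess or ABSTAIN; a simple
# majority vote turns their guesses into weak training labels.
from collections import Counter

ABSTAIN = None  # a labeling function may decline to vote

def lf_positive_keywords(text: str):
    return "positive" if any(w in text.lower() for w in ("great", "excellent", "love")) else ABSTAIN

def lf_negative_keywords(text: str):
    return "negative" if any(w in text.lower() for w in ("terrible", "awful", "hate")) else ABSTAIN

def lf_exclamation(text: str):
    # A weaker heuristic: enthusiastic punctuation hints at positive sentiment.
    return "positive" if text.count("!") >= 2 else ABSTAIN

LABELING_FUNCTIONS = [lf_positive_keywords, lf_negative_keywords, lf_exclamation]

def weak_label(text: str):
    """Aggregate the labeling functions' votes; return None if all abstain."""
    votes = [v for v in (lf(text) for lf in LABELING_FUNCTIONS) if v is not ABSTAIN]
    if not votes:
        return ABSTAIN
    return Counter(votes).most_common(1)[0][0]

if __name__ == "__main__":
    unlabeled = [
        "I love this phone, excellent battery!",
        "Terrible service, I hate waiting.",
        "It arrived on Tuesday.",  # no function fires, so it stays unlabeled
    ]
    for text in unlabeled:
        print(f"{weak_label(text)!r:12} <- {text}")
```

In practice the heuristics could just as well be prompts sent to a Large Language Model, which is the direction discussed in the episode; the aggregation step would look the same, only the source of the noisy votes changes.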
