In 2016, Microsoft pulled the plug on Tay (short for "Thinking About You"), a chatbot designed to mimic the language patterns of a 19-year-old American girl and to learn from interacting with human users on Twitter. According to Microsoft CEO Satya Nadella, Tay was "an important influence on how Microsoft is approaching AI," and has taught the company the importance of taking accountability.

It can be argued that as we come to depend on data and on technology to make decisions, we also need to consider the implications such dependence has on the outcomes.

With us today is Brandon Purcell, VP, Principal Analyst with Forrester.

1. One of the attributes of machines is that they are "supposedly" unbiased, executing based on a pre-defined set of "rules". And yet, studies from the World Economic Forum and commentaries from Harvard Business Review suggest AI is biased. Where does the fault (if any) lie? In the code? In the algorithms?
2. Would you consider these concerns about AI bias as having a significant impact on how AI will be adopted in commercial environments?
3. What should leadership ask of their data science/AI research teams to mitigate the risks that may come from perceived AI bias?
4. In your view, how far away are we from achieving ethical AI?
5. You contributed to the Forrester report, "How to Measure AI Fairness." What was the conclusion of the report?