Esther Perel
What is the purpose of my life?
How do I feel about death?
I mean, this is extraordinary.
We're no longer turning to faith healers, but we are turning to these machines to answer us.
But they have no moral culpability.
They have no responsibility for their answer.
If I'm a teacher and you ask me a question, I have a responsibility in what you do with the answer to your question.
I'm implicated.
AI is not implicated.
And from that moment on, it eliminates the ethical dimension of a relationship.
You know, when people talk relationships these days, they emphasize empathy, courage, vulnerability probably more than anything else.
They rarely use the words accountability, responsibility, and ethics.
That adds a whole other dimension to relationships, one that is a lot more mature than the more regressive stance of "what do you offer me?"
I think that dimension is the last thing to be programmed.
I think that if you make this machine speak with people in other parts of the world, you will begin to see how biased they are.
I think it's one thing we should really remember.
This is a business product.
When you say you're falling in love with AI, you fall in love with a business product.
That business product is not here just to teach you how to fall in love, how to develop deeper feelings of love, and then how to transfer them onto other people, acting as a mediator, as a transitional object.
You know, children play with their little stuffed animal, and then they move on, bringing what they learned from that relationship to humans.