Nataliya Kosmyna
Kind of, hey, Siri, I have problems with relationships.
It's Alexa, right?
It's a joke for a very heavy topic, so I need to preface it immediately: we have even less data and even fewer scientific papers, preprints or peer-reviewed, about this.
So most of what we have right now is anecdotal. After our paper, we personally received around 300 emails from husbands and wives telling us that their partners now have multiple agents they're talking to in bed.
And I immediately thought about the South Park episode from a couple of years ago. But we have much less scientific information about this. What we do know, also coming from our group's research, is that there is definitely an amplification of loneliness. That's what we know as researchers, and some other papers are showing this now as well.
There is potential, and again, a lot of people who are pro AI therapy point to the advantage that it is cheap. It's $20 a month, compared to therapy hours that can cost up to hundreds of dollars a month, right?
But there are definitely, you know, a lot of drawbacks here. And the drawback we see is that, because this is not a regulated space, it can still give you suggestions that are not good.
So you saw that earlier, a couple of months ago, for example, with ChatGPT. I'm going to give you an example from ChatGPT because, again, we are focused on ChatGPT, and those are the cases that are actively publicized, at least.
It actually suggested, you know, the heights of different bridges in New York if you said that you lost your job, right? So it was not smart enough to make the connection that maybe that is not the response you need to give. And apparently, from this awful recent case, a teenager, 16, so, so young, unfortunately died by suicide. And now OpenAI and Sam Altman are being sued over ChatGPT.
Apparently, what happened is that there was a statement from a spokesperson of OpenAI pointing out that they initially thought, when a person is talking about suicide, the model should not engage at all: just say, here are the numbers, this is what you need to do, and stop talking. But then experts told them that, hey, it might be a good idea to try to dig people out a bit.
But it looks like in this case it still failed, because from the conversations that have been reported, and we don't know how authentic they are, it looks like it suggested keeping it away from the parents.