Uri Simonsohn
Podcast Appearances
Uri Simonsohn. I'm a professor of behavioral science at the Esade Business School in Barcelona.
What motivated our whole journey into methodology was that we would go to conferences or read papers and not believe them. And we would find that whenever a finding didn't align with our intuition, we would trust our intuition over the finding. And that sort of defeated the whole purpose. Like, if you're only believing things you already believe, then why bother?
So the idea was like, how do we show people that you can really very easily produce evidence of anything? So we thought, let's start with something that's obviously false. We said, OK, something that's quite hard to do is to make people younger. We've been trying forever. We never succeeded. So let's show that we can do that in a silly way.
So we decided to show that we can make people younger by listening to a song by the Beatles. The song was When I'm Sixty-Four, correct? That's right. And so the idea is, if we can make anything significant, one way to prove it is to say, I'm going to show you with statistically significant evidence that people got younger after they listened to When I'm Sixty-Four.
A control song was, it's called, I believe, Kalimba by Mr. Scruff. And then we had another song that was meant to go in the other direction, and it didn't work, so we just didn't report it, which was Hot Potato.
Yeah, we thought it was very bad.
There's a few approaches. Some of them that we've done, like, just do statistics and say, this is statistically impossible. The other is you see associations in the data, or a lack of associations in the data, that aren't mathematical properties, but anybody who's familiar with looking at data would realize this is not right.
Imagine that you have data on weight and height and you correlate it and you find zero correlation. That cannot be right. People who are taller are heavier. And so if you found zero correlation or a negative correlation, you think maybe these are not real weight measures. Another one is you see rounding or precision that is suspicious.
You see rounded values where there shouldn't be any rounding, or an absence of rounding where there should be. So for example, in one case that we worked on, there was data supposedly from people being asked, how much would you pay for this T-shirt? And the very curious thing is there was no rounding. People were equally likely to say $7, $8, or $10.
But if you've ever collected data like that, you know that people round. People say 10 or 20. They don't say 17.
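As a rough illustration of the kinds of checks described here, below is a minimal Python sketch, assuming a hypothetical pandas DataFrame with height, weight, and stated-price columns. The column names, toy data, and cutoffs are illustrative only, not from any actual case.

```python
import numpy as np
import pandas as pd

# Toy data standing in for a suspicious dataset; names and values are hypothetical.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "height_cm": rng.normal(170, 10, 500),
    "weight_kg": rng.normal(70, 12, 500),          # deliberately independent of height
    "stated_price": rng.integers(1, 31, 500).astype(float),
})

# Check 1: height and weight should correlate positively in real human data.
# A correlation near zero (as in this toy data) is a red flag.
r = df["height_cm"].corr(df["weight_kg"])
print(f"height-weight correlation: {r:.2f}")

# Check 2: self-reported prices normally pile up on round numbers (5, 10, 20...).
# If 7, 8, and 17 are about as common as 10 or 20, the responses may not be real.
round_share = (df["stated_price"] % 5 == 0).mean()
print(f"share of responses on multiples of 5: {round_share:.0%}")
```

Neither check is proof of fraud on its own; they are screening heuristics of the sort described above, flagging data that a person familiar with how such measurements usually look would find implausible.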
I asked Uri Simonsohn. I would estimate the share of fraud in the order of, say, 5% of articles.
Sometimes I will, but if I come across fraud there, I will ignore it, because the cost of pursuing a case of fraud is so high that it's just not worth it. If it's a paper that has seven citations after three years and it's published in a journal that nobody knows, I just let it be. And I'm sure a lot of people do that too.
So I would say I have, on the falsity of the findings, I don't have reasonable doubt.
So we were sued together with Harvard for $25 million. We were sued for defamation.
Within 24 hours, they had $200,000. We found a First Amendment expert lawyer who's representing us. We've learned a lot of the boring stuff that happens with lawyers that you don't get from TV shows, like the timelines and the language and how long things take to be judged. Like, things take forever. I mean, it makes academia seem expedient in comparison.
It's not just nice to have money, but it's also nice to know that thousands of people are willing to, at least somewhat publicly, support what you're doing. So that was a big boost.