
The Last Show with David Cooper

Can Science Get Political?

30 Jan 2026

Transcription

Chapter 1: What does the study reveal about political beliefs affecting scientific results?

3.338 - 23.2 David Cooper

We're here because your heightened awareness deserves heightened entertainment. The Last Show with David Cooper. We interview a lot of scientists on this show about their research. But what if we gave two different researchers the same data set? It turns out their political beliefs might nudge their results in opposite directions.


23.56 - 42.741 David Cooper

That's according to a new study, which to me raises some unsettling questions. Some pretty human questions about bias and how science gets made. I'm here with someone who worked on that research. He is a social and behavioral science researcher at the German Institute for Adult Education. And his name is Nate Breznau. Nate, welcome in. Hey, David.


42.761 - 43.983 Nate Breznau

Thanks so much for having me on the show.


44.724 - 51.775 David Cooper

I think this is like the biggest worry in science, right? Like someone's personal beliefs can affect their results.


Chapter 2: How was the research study structured and what were its findings?

51.855 - 56.262 David Cooper

Do you want to walk me through the study, how you set it up and maybe what you found?


57.423 - 84.628 Nate Breznau

Sure. The original study got together 73 different research teams with 158 different researchers in those teams. And we gave them the same data and same hypothesis. And we wanted them to test the same research question. And the research question was whether immigration reduces support for social welfare programs, social security programs. And they came to all different results.


85.233 - 111.358 Nate Breznau

And, you know, from really strong positive effect, really strong negative effect and everything in between. And that was pretty surprising in some ways, but in some ways not. And that probably has to do with what I might call like two categories of variation. And one of them is more like noise, stuff we don't know about. And one of them is more like bias, stuff where we can look at


112.452 - 117.099 Nate Breznau

attributes of the researchers that might have caused those different results.


118.18 - 123.287 David Cooper

You say on the one hand, it was like kind of surprising, maybe shocking. On the other hand, maybe kind of obvious.

Chapter 3: What are the implications of bias in scientific research?

123.388 - 138.409 David Cooper

But I think in both cases, these results are a little bit uncomfortable. I guess, where do we go from here? Where do we go knowing that like the same researchers who may be giving governments data about how to create programs would come to these different results?


139.874 - 163.924 Nate Breznau

There's probably two good suggestions. And one of them is that we as scientists could be more humble about what we're doing. And if we have some findings we want to share with the public or with the government, we should be cautious about what we're recommending they do with that until many other researchers have also


164.258 - 176.071 Nate Breznau

looked at a similar, you know, done similar studies or also looked at the data. So if we can see a consensus where a lot of researchers are coming to similar results, that's when we should start to think, okay, here we really have something.


176.551 - 189.625 Nate Breznau

Rather than relying on any one, because if you pulled any one result from our study, it's not very representative of the universe of results that a scientist might come to, any random scientist, if they were to study the topic.


190.651 - 208.896 David Cooper

There's two things. One, I'm thinking, like, when you actually choose who works on a study, should that be part of the methodology? Like, should we be looking at what kind of scientists they are, what their beliefs are, before actually jumping in, and assembling a team that has a wide array of beliefs? Is that maybe one solution to this problem?

209.5 - 231.019 Nate Breznau

Yeah, you kind of hit the nail on the head there. There's starting to be a movement, if you want to call it that, to do what's called an adversarial collaboration, where scientists who have maybe different opinions or different theories or even different ideological positions get together and design studies together.

Chapter 4: How can researchers mitigate bias in their studies?

230.999 - 250.102 Nate Breznau

and agree on the criteria in advance. They say, okay, like we're going to do the study like this. We all agree this is the way to do it and, you know, and agree on what the results mean in advance. So they say like, if we get results A, that means there's a positive effect. If we get results B, there's a negative effect, and C, there's no effect. And so they already, like, clarify all that in advance.


250.663 - 257.091 Nate Breznau

And so that would be a good way to avoid some of this bias that might creep into the process.


257.543 - 272.059 David Cooper

Going into the details here, for someone who's never done like a research data analysis before, how can two smart people who have a decent background in their field look at the same data set and walk away with different conclusions?


272.832 - 293.283 Nate Breznau

Yeah, that's a great question. So one simple answer has to do with their knowledge of science. So if you imagine someone who only knows how to use a hammer, they're going to run around trying to hammer things. They're going to try to solve problems with a hammer, whether or not the hammer is appropriate for that problem.


293.764 - 312.022 Nate Breznau

And so some scientists have different training and backgrounds, and their methodological knowledge can limit or affect what they come to. But other aspects of the process can be, like, people have a profit motive, you know?

Chapter 5: What are questionable research practices and why are they problematic?

312.122 - 335.674 Nate Breznau

Like, one of the... the anti-vax movement cites a paper by Wakefield. And his was one of the only papers that we know of to show negative consequences of vaccines. And it turns out he had a bunch of money to make if he could show negative consequences of vaccines. And so it's like, oh, wait a minute. But this happens.


335.694 - 354.417 Nate Breznau

If you think about the pharmaceutical industry, there's a lot of industries where you can make a lot of money as a scientist. And so you would be incentivized to show certain results. And, you know, there are other things like in science, it's a competition. We're competing for jobs and grant money and things like that.


354.497 - 378.257 Nate Breznau

And in this competition, people could be motivated to do things that are not so scientific. Like they come to some findings and then they do some little nudging, or, in really bad cases, bad actors might do a little fudging where they even, like, fake some data. We do have scandals, you know, in science where people have just, like, straight up faked data.


378.838 - 388.65 Nate Breznau

But the more common thing is, like, the nudging, where, you know, there's tricks people can use to kind of game the scientific system to make their results look a little bigger, a little shinier.


Chapter 6: How can transparency and open science improve research integrity?

388.69 - 392.935 Nate Breznau

And those are called questionable research practices. And they're a big problem.


393.218 - 409.496 David Cooper

I want to talk about what should be done about questionable research practices. But before that, in your study, you studied like immigration policy. That's a kind of politicized field, I suppose. Could you see this possibly popping up in like biology or physics or in mathematics?


410.137 - 416.003 David Cooper

Things that would on their face at least not have more of like a political motive or be tied to people's political views.


416.506 - 435.472 Nate Breznau

Probably not. I would think not, unless there was something about that study that could shape policy. Because that's how ideology creeps into the process. It turns out scientists are human beings too. And they have preferences about how they want the world to be, and maybe favor a certain political party or certain policies.


435.992 - 452.535 Nate Breznau

And they may then consciously or even subconsciously be shaping their results. Or they may even design studies in the first place to try to get certain results. And so, yeah, this can creep into the process.

Chapter 7: What steps can be taken to ensure science remains trustworthy?

452.955 - 464.25 Nate Breznau

And if there was a study in philosophy or maths or biology that could have political implications, then we would expect to see it there.


465.451 - 472.18 David Cooper

Back to what I was saying before about questionable research practices, what can be done here so people listening to this don't think science is broken?


473.729 - 492.284 Nate Breznau

Yeah, I mean, there are a lot of steps, and there's been this kind of crisis in science, and then there's been this response, what many people call the open science movement. And in the past, a lot of people did science where they would just say, trust me, these are my results. And it turns out we shouldn't do that.


493.826 - 516.296 Nate Breznau

And so this movement is to make all of the underlying data and all of the workflow and every decision point, every decision that was made and the rationale for that decision, to make that all transparent and open and to record it. And ideally to write down as much stuff in advance about what a scientist is going to do in a particular study.


516.716 - 537.642 Nate Breznau

And so this transparency, it makes it so people can check the data. And in particular, the original study that I helped lead, that found all these different results, we made all of our data open. And that's how the follow-up study came in and found the ideology effect. We didn't notice that in the first study. And that was only possible because of sharing the data.

537.662 - 550.193 Nate Breznau

So not only does it help insulate against the questionable research practices, it opens up new avenues for research. So it's good all around if we make these changes in the way we do science to be open and transparent.

550.662 - 559.777 David Cooper

Well, Nate Breznau is a social and behavioral science researcher at the German Institute for Adult Education. Nate, this has been a really interesting chat. Thank you so much for sharing the results of your research on the show.

560.478 - 565.426 Nate Breznau

Yeah, thanks so much, David. And I've really enjoyed listening to your show this week. It's very refreshing.

566.067 - 569.792 David Cooper

Oh, well, thank you. I hope your research wasn't biased because of your political beliefs, Nate.
