Sandra Matz
It's a double-edged sword, right? The moment that I have the ability to peek into your mental health and potentially change it, that means I can use it the way I just described to help you. But it can also mean, going back to the previous example, that Facebook was trying to use it to say, well, here's
a teenager who's clearly struggling and who might be the most susceptible to the ads that you're showing them. So it very much depends on how you use it. And for me, the challenge we have right now is that it all rests on the assumption that users can essentially make their own decisions.
The idea behind a lot of the data protection regulations is: just explain to users what's happening with their data, and then give them the control to decide whether they want to share it or not. But it's an incredibly complicated space, right? If I really wanted to manage my data all by myself, that would be a 24/7 full-time job.
And it would mean that I would have to continuously keep up with the latest technology. So for me, shifting towards this question of how we amplify some of the positive use cases while also trying to protect people from abuses just means that we have to do a much better job of protecting consumers. And that could take different forms.
In an hour, it's six gigabytes. At some point I calculated that this is something like half a million times more than the capacity of the computer we used to launch the Challenger space shuttle. In just one hour, we generate that much data. So it's absolutely insane.
And for me, that's actually the interesting part, because most of the data that we generate, we don't intentionally create, right? In psychology, there's a distinction between two kinds of data. The first is identity claims: all of the data that you know about. This is you putting something on social media because you want to send a signal that you're maybe open-minded, extroverted, and so on.
There's a second category of data, and that's what we call behavioral residue. Those are all of the traces that you leave and create without really thinking about it. Again, take your smartphone, for example: there are so many sensors embedded in it. Every second, I get a snapshot of where you are based on your GPS record. Again, it might not seem super