Dr. Jigar Patel
Now, going back even further: before college, into high school and earlier. It needs to be baked in there too, right? We do have this thing in America where people don't like STEM and STEAM. STEM is core to this. You have to have a basic understanding of it way back when, so we have to push it all the way back to the very early years.
And I was in an airport last night, and invariably you're walking through the airport and people are stuck on their phones, consumed by them. Many of them are quite sophisticated and understand the technology; many don't. So it goes all the way back to that. It's a societal problem. It's not just professional, it's not just educational.
It's the whole thing that we need to keep front and center.
Yeah. It's got to be a societal goal to inform more holistically on it, I think. And that gets back to educating on STEM, right? Understanding the technology and not just taking it at face value. When that knowledge is lost, a blindness comes with it to what the technology is doing to you individually.
And when we start to accept the inputs without any questions, that's when we may have lost. And "lost" is probably a strong word here, but it is something we have to be very, very cognizant of. I read an article that said at some point, 50 percent or more of the Internet may have been generated by AI, not even from a human query.
And so the information we get is AI generated, and that scares the bejesus out of me, frankly. What becomes truth then? Is it some human behind the scenes, potentially manipulating the truth in an adverse way? Truth becomes a non-concept, and it gets back to "I saw it here, so that's the truth."

Well, yeah. So let me ask this. One of the other themes that you...
Yeah. The concept behind a large language model is that it depends on how you've trained it, on what corpus of text you've loaded into it.
From that corpus of text it learns the relationship of words to one another. And when you tell it that you want it to act more like, say, a specific author, or a specific somebody who does a good job of conveying empathy through words, then it can take on that characteristic.
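The "act like a specific author" idea is usually implemented as an instruction prepended to the conversation. A minimal sketch, assuming a generic chat-style message format; the function name and persona text here are illustrative, not from the conversation:

```python
# Sketch: steering a model's style with a persona instruction.
# The message structure (role/content dicts) mirrors common chat APIs,
# but no particular provider is assumed; content strings are made up.
def build_persona_prompt(persona: str, user_message: str) -> list[dict]:
    """Assemble a chat-style message list with a persona instruction."""
    return [
        {"role": "system",
         "content": f"Respond in the voice of {persona}, "
                    "conveying warmth and empathy in your word choice."},
        {"role": "user", "content": user_message},
    ]

messages = build_persona_prompt("a compassionate physician",
                                "Explain my test results to me.")
```

The system message doesn't add knowledge; it shifts which word choices the model considers likely, which is exactly the "take on that characteristic" effect described above.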
So it's all about language: the use of language, the right language, and how words relate to one another. And it can do that better than a human, because it has billions, trillions of words, the relationships of those words to one another, examples of different things, and the probabilities of those things, right? So we can understand how...
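The "relationships of words and their probabilities" can be illustrated at toy scale with bigram counts, a vastly simplified stand-in for what an LLM actually learns. The miniature corpus below is made up for illustration:

```python
from collections import Counter, defaultdict

# Toy illustration (not a real LLM): estimate next-word probabilities
# from bigram counts in a tiny corpus -- the same statistical idea,
# "which word tends to follow which," scaled down enormously.
corpus = "the cat sat on the mat the cat ran on the grass".split()

# Count how often each word follows each preceding word.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_word_probs(word):
    """Return P(next | word) estimated from the toy corpus."""
    counts = follows[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

print(next_word_probs("the"))  # {'cat': 0.5, 'mat': 0.25, 'grass': 0.25}
```

Real models replace these counts with learned representations over trillions of tokens and much longer contexts, but the output is still a probability distribution over possible next words.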