Matt Frankel
I personally don't think we'll see a wave of successful mental health litigation against these social media giants, but we'll have to wait and see here.
Yeah, so on the Google side of things, they said that their recent research showed that a new memory compression method that they call TurboQuant could potentially reduce the memory requirements of large language models, the AI models, by a factor of six.
And understandably, shareholders of memory-focused chip makers like Micron and SanDisk were panicking about this.
You know, memory efficiency, it's a big focus of AI development recently.
And the demands are so high that these memory companies literally can't make chips fast enough.
And if Google's right, that might not be the case anymore.
Yeah, I would add to that, you know, take this with a grain of salt and zoom out.
Memory stocks like Micron are still up over 300% over the past year, even after the recent pullback.
And if LLMs evolve like most analysts believe they will,
even one sixth of today's memory usage will still translate into a lot of memory chips that they need.
And it will still keep these companies very busy for the foreseeable future.
Every new technology gets more efficient over time: it consumes less power, the hardware becomes smaller, et cetera.
Think of like, you know, flat screen TVs.
This is a natural evolution of AI technology and it's a good thing.
Yeah, so thanks for that, Tyler.
And thanks for the question.
I do a combination of the two, just like the email said, but I'll build it out a little bit.
So I like to build stock positions and ETFs, if that's what you're asking about, automatically over time. It really takes the emotion out of the equation, and it mathematically forces you to buy more shares when stocks are cheaper.
So think of it this way.
Let's say that I want to invest $5,000 in a certain stock.
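The arithmetic behind that point, that a fixed dollar amount automatically buys more shares when prices dip, can be sketched in a few lines. The prices below are purely hypothetical, just to illustrate how the average cost per share ends up below the average of the prices:

```python
# Dollar-cost averaging sketch: invest a fixed dollar amount each period.
# Prices are hypothetical, chosen only to show the arithmetic.
prices = [50.0, 40.0, 25.0, 50.0]  # share price at each purchase
amount_per_buy = 1250.0            # $5,000 split across 4 purchases

shares = [amount_per_buy / p for p in prices]
total_shares = sum(shares)
avg_cost = (amount_per_buy * len(prices)) / total_shares

# The cheapest price ($25) buys the most shares (50), which pulls the
# average cost per share below the simple average of the prices.
print(round(total_shares, 2))      # → 131.25 shares bought in total
print(round(avg_cost, 2))          # → 38.1  average cost per share
print(sum(prices) / len(prices))   # → 41.25 simple average of the prices
```

Because each purchase is the same dollar amount rather than the same share count, the low-price buys are weighted more heavily, which is the "mathematically forces you to buy more when it's cheaper" effect described above.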