Luke Posted Monday at 12:24 PM Now the question is, how will protectionist Silicon Valley react? Ban foreigners from accessing these language models by implementing ID verification and strong AI user limits? That would stop the fast scaling and adoption...
dealraker Posted Monday at 12:42 PM (edited) 18 minutes ago, Luke said: Now the question is, how will protectionist Silicon Valley react? Ban foreigners from accessing these language models by implementing ID verification and strong AI user limits? That would stop the fast scaling and adoption... In medieval times when being chased knowing you'd be soon caught - you scampered into the chapel. Elon, not the inventor meme, but the positioning elitist...he's got this down being guv-boy. Look for sanctions against competition everywhere. The new world of the superior US of A! Buy-buy-buy. Edited Monday at 12:43 PM by dealraker
Milu Posted Monday at 12:44 PM I wouldn't get too far ahead of yourself with too much talk of a bubble burst. Obviously today it looks like there will be a bit of a pullback, but you have to remember that Google and Meta are trading in the 25-30 PE range, hardly 'bubble' levels. Microsoft and Apple are a little higher, in the 30-34 PE range. This is not the 50+ PE range of the 2000 tech bubble, no matter how much people would like this to be the case. As I said above, Nvidia will likely be hit hard, maybe a few others in that space, but I don't think the growth of big tech is going to be stopping anytime soon.
Kizion Posted Monday at 12:47 PM (edited) 2 hours ago, Paarslaars said: Yeah going to be a red day today. I don't mind red days, but then they should be bloody red. Not this. Either it's a real threat and AI-related stocks (incl. MAG7 ones) should go down 10%+, or it's not a real threat, nothing has changed, and we can continue upwards. Please, give me Meta at 450 or Google at 150 Edited Monday at 12:53 PM by Kizion
Gregmal Posted Monday at 12:52 PM This is the same behavior from the TOP thread that closed. Every few percent down everyone wants to talk about it. Because it's warranted after all those bullshit gains on the way up.
widenthemoat Posted Monday at 12:56 PM 2 hours ago, Milu said: To take the other side of the argument, couldn't this DeepSeek development be a positive for many of the big tech companies? One of the main concerns I have with Meta, Google etc. is the massive amount of capex they are planning to spend to stay at the cutting edge with these frontier models. If it's now the case (as demonstrated by DeepSeek) that this is not necessarily required, shouldn't this result in a downward adjustment of future capex? Nvidia would strike me as being the main loser if this type of situation plays out. Here's how I think about it in a very simple sense: Pre-AI: these companies had capex in their main business at high ROI given the massive network effect and scale they enjoyed. Post-AI: these companies have to spend massive capex on an AI-related business that someone in China can then replicate for pennies on the dollar, creating a low if not negative ROI. In other words, they are forced to spend money on a no-moat business going forward just to stay in the game. That would suggest no moat over the long term for these once high-flying darlings.
Luke Posted Monday at 01:04 PM 16 minutes ago, Kizion said: I don't mind red days, but then they should be bloody red. Not this. Either it's a real threat and AI-related stocks (incl. MAG7 ones) should go down 10%+, or it's not a real threat, nothing has changed, and we can continue upwards. Please, give me Meta at 450 or Google at 150 TSMC is down -10%, SuperMicro is down -10% and that's only the premarket. Let's see what happens! 12 minutes ago, Gregmal said: This is the same behavior from the TOP thread that closed. Every few percent down everyone wants to talk about it. Because it's warranted after all those bullshit gains on the way up. Okay, but these are significant moves in the semi space that are worthy of discussion, no? Perhaps not a "bubble pop" so far, but at least it looks like a start...I think COBF should have a thread that discusses "why the market is down or up", no? 10 minutes ago, widenthemoat said: Here's how I think about it in a very simple sense: Pre-AI: these companies had capex in their main business at high ROI given the massive network effect and scale they enjoyed. Post-AI: these companies have to spend massive capex on an AI-related business that someone in China can then replicate for pennies on the dollar, creating a low if not negative ROI. In other words, they are forced to spend money on a no-moat business going forward just to stay in the game. That would suggest no moat over the long term for these once high-flying darlings. Thanks for sharing.
Blake Hampton Posted Monday at 01:59 PM (edited) Just something of interest: Chatbot Arena LLM Leaderboard There's only one open-source model on that list… Edited Monday at 02:04 PM by Blake Hampton
Gregmal Posted Monday at 02:18 PM 1 hour ago, Luke said: Okay, but these are significant moves in the semi space that are worthy of discussion, no? Perhaps not a "bubble pop" so far, but at least it looks like a start...I think COBF should have a thread that discusses "why the market is down or up", no? I guess maybe if there's a dedicated macro day trading thread or something, but otherwise it seems dumb. This sort of volatility for semis isn't that wild. If nothing else, maybe what would pique my interest is the relationship between tech and rates. Definitely been a correlation of late.
Luke Posted Monday at 03:07 PM 52 minutes ago, Gregmal said: I guess maybe if there's a dedicated macro day trading thread or something, but otherwise it seems dumb. This sort of volatility for semis isn't that wild. If nothing else, maybe what would pique my interest is the relationship between tech and rates. Definitely been a correlation of late. I'll open one.
Spekulatius Posted Monday at 03:17 PM 2 hours ago, Luke said: Now the question is, how will protectionist Silicon Valley react? Ban foreigners from accessing these language models by implementing ID verification and strong AI user limits? That would stop the fast scaling and adoption... The correct solution of course is to look at what the Chinese have done, learn from it and then improve on it with their next iterations. Tech progress should be deflationary, not inflationary. The same happened with telecom, where WDM and other technologies made data transfer dirt cheap. The beneficiaries were the companies that built upon that foundation, not the Global Crossings, Worldcoms, JDS Uniphases, Lucents or Nortels. NVDA will be fine, but maybe AI chips aren't an 80% gross margin business but a 60% gross margin business going forward. It's not the end of the world.
rogermunibond Posted Monday at 03:25 PM (edited) Unregulated utilities/electricity providers and data center builders are down much more than semiconductors. Still need chips. Data center buildout, maybe not so much. Short Stargate. Edited Monday at 03:26 PM by rogermunibond
John Hjorth Posted Monday at 03:26 PM 2 hours ago, dealraker said: In medieval times when being chased knowing you'd be soon caught - you scampered into the chapel. Elon, not the inventor meme, but the positioning elitist...he's got this down being guv-boy. Look for sanctions against competition everywhere. The new world of the superior US of A! Buy-buy-buy. Certainly an interesting angle and take on things, not least for being thought-provoking. [And now I'll enter shut-up mode again on this.] - - - o 0 o - - - I hope you're OK, Charlie [ @dealraker ] - Santé!
dealraker Posted Monday at 03:38 PM 4 minutes ago, John Hjorth said: Certainly an interesting angle and take on things, not least for being thought-provoking. [And now I'll enter shut-up mode again on this.] - - - o 0 o - - - I hope you're OK, Charlie [ @dealraker ] - Santé! Remember not to take me too seriously. But seriously, Elon is a positioning genius - best bar none. He's way-way-way ahead of his competition. I'm an outlier, evidently basking in the fake news that we went to the moon without an explosion over 55 years ago.
Blake Hampton Posted Monday at 05:18 PM Just btw, if any of you haven't happened to use AI yet, you should really give it a go. I'm just trying to figure out whether I should shift away from ChatGPT to Gemini. I honestly respect Google more than OpenAI, as I don't really like Altman.
Sweet Posted Monday at 06:11 PM Why is a cheap LLM a problem? Seems to me there are far more bread-and-butter companies that will benefit from cheap AI than not. So semis get whacked… from a 10,000 ft view, who cares? Isn't it a good thing?
Pelagic Posted Monday at 06:14 PM Maybe I'm missing more discussion on it, but doesn't the ability to run DeepSeek locally have massive implications? For $4k or so you can build a very capable workstation to run it, and this would seem to have major applications for users who have so far hesitated to utilize AI, primarily for privacy reasons (think doctors' offices, researchers, government contracts, etc.). Running DS locally on an offline machine would seem to be a major advantage in productivity for these potential users at a relatively low cost. And for users paying $200/month for ChatGPT Pro, the ability to run a comparable model locally, with a breakeven of less than two years (even less if the ChatGPT Pro price increases), makes replacing that subscription look appealing.
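A quick back-of-the-envelope check on that breakeven claim, as a minimal sketch assuming the round figures from the post ($4k of hardware, $200/month for ChatGPT Pro) rather than actual quoted prices:

```python
# Rough breakeven sketch; the prices are the round numbers assumed above, not quotes.
hardware_cost = 4_000      # one-time workstation build, USD
subscription = 200         # ChatGPT Pro, USD per month

breakeven_months = hardware_cost / subscription
print(f"Breakeven after {breakeven_months:.0f} months (~{breakeven_months / 12:.1f} years)")
# -> Breakeven after 20 months (~1.7 years), consistent with "less than two years" above.
```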
lnofeisone Posted Monday at 06:43 PM 27 minutes ago, Pelagic said: Maybe I'm missing more discussion on it, but doesn't the ability to run DeepSeek locally have massive implications? For $4k or so you can build a very capable workstation to run it, and this would seem to have major applications for users who have so far hesitated to utilize AI, primarily for privacy reasons (think doctors' offices, researchers, government contracts, etc.). Running DS locally on an offline machine would seem to be a major advantage in productivity for these potential users at a relatively low cost. And for users paying $200/month for ChatGPT Pro, the ability to run a comparable model locally, with a breakeven of less than two years (even less if the ChatGPT Pro price increases), makes replacing that subscription look appealing. You can run it for $4k but you can't train it for that amount. Only the inference side of the business is impacted, which is roughly 10% of the hardware usage for most large organizations today. It'll shift more over time, but right now training is what eats up the $s.
MungerWunger Posted Monday at 07:45 PM article from Stratechery today: https://stratechery.com/2025/deepseek-faq/
rogermunibond Posted Monday at 08:07 PM @MungerWunger thanks. Great write-up with incisive points from Ben Thompson.
Luke Posted 15 hours ago 18 hours ago, Blake Hampton said: Just btw, if any of you haven't happened to use AI yet, you should really give it a go. I'm just trying to figure out whether I should shift away from ChatGPT to Gemini. I honestly respect Google more than OpenAI, as I don't really like Altman. I have tried them all: Gemini, ChatGPT 4o, o1 and Perplexity. Perplexity is still my favourite because of the clean interface, but DeepSeek seriously offers a lot of value since it's free and can do similar things...
james22 Posted 13 hours ago 1 hour ago, Luke said: DeepSeek seriously offers a lot of value since it's free . . . All it costs is your privacy.
frommi Posted 13 hours ago 19 hours ago, lnofeisone said: You can run it for $4k but you can't train it for that amount. Only the inference side of the business is impacted, which is roughly 10% of the hardware usage for most large organizations today. It'll shift more over time, but right now training is what eats up the $s. Yeah, but that is what most companies were buying these pricey chips for. Training is a one-time thing, especially if at some point you have compressed all available quality data (books, papers, etc.) into the system. What should additional training lead to? If the inference side runs on a smartphone (which now looks possible in the future), then all these data center investments were for the bin. And even if you don't trust the DeepSeek model, I am very sure that OpenAI and Meta will produce a model in a relatively short timeframe that will be as efficient as the DeepSeek model, because it is open source and all of it can be copied.
frommi Posted 13 hours ago In the endgame I think we will get thousands of robots that replace humans in every task possible. The true winners are the companies that sell stuff to consumers and that currently use a lot of labour that can be replaced. All the robot stuff and AI software will get commoditized. Heck, where is even the moat in something like software when you can replace it in the blink of an eye with some AI-generated code?
Pelagic Posted 12 hours ago 19 hours ago, lnofeisone said: You can run it for $4k but you can't train it for that amount. Only the inference side of the business is impacted, which is roughly 10% of the hardware usage for most large organizations today. It'll shift more over time, but right now training is what eats up the $s. Isn't inference, though, where the rubber meets the road, so to speak, when it comes to end users? The majority of users aren't training custom models but rather using existing models for inference to enhance productivity or build things with them. Companies developing AI models are trying to sell them to users as SaaS: get them hooked on the utility of a limited free tier and slowly let it worm its way into their life until they need the pro plan. It just seems to me there's an appeal to "owning" an open-source model to use locally vs. paying a subscription fee for life to one of the big AI players. And back to my original point about this allowing organizations that are hesitant to utilize AI because they fear the data they're providing it might end up elsewhere: having it run on local hardware is a solution at a fairly reasonable price.
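For anyone curious what "running it locally" looks like in practice, here's a minimal sketch, assuming a local runner (something like Ollama or a llama.cpp server) exposing an OpenAI-compatible endpoint on the workstation; the port, path and model tag below are illustrative assumptions, not details from this thread:

```python
import requests

# Send a prompt to a locally hosted model over an OpenAI-compatible HTTP API.
# Nothing leaves the machine, which is the privacy angle discussed above.
resp = requests.post(
    "http://localhost:11434/v1/chat/completions",  # assumed local endpoint
    json={
        "model": "deepseek-r1",  # hypothetical tag for whichever local distillation was pulled
        "messages": [{"role": "user", "content": "Summarize the key risks in this contract..."}],
    },
    timeout=300,
)
print(resp.json()["choices"][0]["message"]["content"])
```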