r/wallstreetbets Apr 27, 11:34 PM
Last year's "DeepSeek moment" was an overreaction, yet the actual "DeepSeek moment" that just happened is being completely ignored. We all remember how U.S. tech stocks took a dump last year when DeepSeek R1 was released. Looking back, we all realized it was a gross overreaction, and the fear that the AI tech stack would move away from U.S. companies seemed unfounded. DeepSeek was trained on Nvidia GPUs and all inference was run on Nvidia GPUs. That's been true for all the subsequent Chinese frontier models.
Until now.
Fast forward a year. Early last week Jensen Huang went on a podcast and said the day DeepSeek comes out on Huawei first would be a terrible day for this nation. That interview went largely unnoticed because Jensen's argument has been mostly dismissed by many, from politicians to tech analysts, and most people did not believe what he said would come true.
Well, less than 3 days later, DeepSeek V4 Pro was released. Its benchmarks put it within striking distance of U.S. frontier models while costing 1/6th as much for inference.
But the biggest story isn't the model capability, it's that the scenario Jensen predicted 3 days prior came true: DeepSeek V4 Pro is fully powered by Huawei Ascend 950 GPUs for inference in production. It was designed and optimized for the Huawei chip, making it the world's first frontier model designed for non-American hardware (i.e., neither Nvidia GPUs nor Google TPUs).
And DeepSeek said their cost is "high" at the moment because Huawei is still scaling production, and they expect prices to drop significantly later in the year.
The implication of an entirely viable Chinese AI tech stack (most people's workflows do not need that last 3-5% of performance), from hardware to a fully open-sourced model layer, that approaches the American AI stack while costing 20% as much, is being entirely ignored by Wall Street this time around.
Sure, the biggest and richest companies in the West will still be paying for Claude/ChatGPT/Gemini, but the same won't necessarily be true for everyone else, from less wealthy countries to non-FT500 organizations in Western nations, let alone individual consumers.
At the end of the day, LLMs are all about large-scale parallel computing and data, which is why people, from the leaked Google memo to Jensen Huang himself, have said the moat is very shallow for model providers (Anthropic/Google/OAI), no matter how much hype and momentum they are getting now. But if the hardware moat also gets eroded, then the entire U.S. tech industry's valuation will need a second look.
Disclaimer: I'm still long Google and Nvidia https://imgur.com/a/PoRg2fK
Edit: Longer interview from Jensen where he schooled the interviewer: https://youtu.be/Hrbq66XqtCo?t=3514 I think he's one of the greatest CEOs in tech history, and he has always had amazing insights from the perspective of an actual engineer, instead of being driven by emotions. But then again, I may just be biased due to my own engineering background.
submitted by /u/cookingboy