We're like >this< close to They Live glasses being a thing, aren't we?
> We're like >this< close to They Live glasses being a thing, aren't we?
Well, the mass-market spectacles will definitely be showing whatever they want you to see.
> If they were to launch a part that China cannot buy, this may be perceived as a slight against China.
China can feel slighted all they want; that doesn't mean they have any alternatives. China has always known what they need to do to lift the sanctions, they just don't want to. It will probably take a major economic collapse that the West has to bail them out from to finally break the protectionism, just as happened in Japan and Korea.
> The other reason is that the 5080 will be close enough to the 4090 such that nVidia can kill off the 4090 for consumers and send those dies to more lucrative AI-centric products (like the ones being smuggled into China).
They will definitely release a 5090 for consumers; every hardware and game reviewer and content creator uses 4090s for 90% of their content, and Nvidia won't want to give that up. They can always order more dies if demand is high. AI demand will fall off at some point, but it's not going to crash like crypto mining did; there are too many varied, already-released products that use it. If anything, the massive demand for AI GPUs will even out the supply of gaming GPUs, because with so many dies being made they can easily shift over however many they need to meet gaming demand. That's how things work when manufacturing volume scales up. They will probably even start looking for other markets they can stick high-end GPU dies into.
> They will definitely release a 5090 for consumers; every hardware and game reviewer and content creator uses 4090s for 90% of their content, and Nvidia won't want to give that up.
No one said they wouldn't, just that it was being delayed slightly so the 5080 can launch alongside it.
> They can always order more dies if demand is high.
Ehhh... that depends on how the 5090 is being put together. If it's relying on CoWoS to glue two dies together, then it's not about the dies, it's about the advanced packaging. Packaging is the bottleneck for anything multi-chip.
> AI demand will fall off at some point, but it's not going to crash like crypto mining did
Capital Economics is saying the AI bubble will burst in 2026, deflating the S&P.
Existing "AI" isn't intelligent, it's just more advanced text prediction. It serves no real purpose other than to delay actual tasks.there are too many varied and already released products that use it
> Existing "AI" isn't intelligent; it's just more advanced text prediction. It serves no real purpose other than to delay actual tasks.
It's not sentient, but if you can't tell the difference for a specific task, is there actually a practical difference? If you don't think current AI has a massive number of real-world uses, then you aren't paying attention at all. Stock photos for random "stuff" are basically dead; it will be all AI from here on. Commodity voice acting is similarly dead: why pay someone when the computer can say it and takes direction better? Tools like Generative Fill in Photoshop are already critical to people's workflows and basically impossible to detect, so people don't realize it's already everywhere. Go look at some of the ChatGPT-4o videos that released yesterday and tell me that's not something dozens of industries are going to jump on immediately. AI isn't great at coding whole applications, but it is great at giving answers to specific programming tasks tailored to your existing code; it's already in a lot of programmers' workflows.
> AI is not a bubble and it's not going to crash.
Analysts say otherwise. For society's sake, I hope they're correct.
AI is a bubble in the same sense that dotcom was a bubble in 2000: it's overhyped, and money is pouring in at a torrent not justified by any remotely reasonable return on investment, but the underlying product is very much real. Right now we're seeing the equivalent of Pets.com paying to ship cat food to everybody's door before logistics supported it; but eventually logistics very much did, and lots of people buy cat food from Amazon every month with Subscribe and Save. Amazon.com opened its doors in 1994 but wasn't profitable until 2003.
Nvidia is more infrastructure; they're like Intel supplying all the CPUs Amazon's servers run on to enable its innovation. The analogy doesn't really transfer to Nvidia, as they aren't themselves a player in gen AI, rather an enabler.
> They are both. They train their own models for gaming, and for self-driving cars, and probably silicon design and who knows what else.
None of that is generative AI, though. Very different thing. Nvidia doesn't heavily develop in gen AI itself; just stuff like ChatRTX for home users to mess around with, and some non-competitive image generation. At least as far as has been publicly disclosed, of course. They prefer to assist developers by providing proprietary (but very, very good) libraries like TensorRT so their hardware is always the best choice for whatever you're looking to do. Same philosophy as in gaming, really.
Anyway, in the analogy Nvidia would be Pets.com.
> The derail began from Hobold saying AI was a bubble
Ah yes, the "it's hobold's fault" meme. The labelling of AI as a "bubble" happened before, by a forum user who has moderation privileges. My take is that there is currently an AI bubble happening, not that AI itself is a bubble. I mentioned two examples where machine learning has been disruptively successful, and neither of those is a large language model.
> My take is that there is currently an AI bubble happening, not that AI itself is a bubble.
We all agree, then. Certainly overvalued, but not by any means without value.
> Anyway, back to GPU news: Nvidia+ARM handhelds coming.
I wonder what software ecosystem they have in mind. Nintendo has one, and has demonstrated with the Switch that they can bootstrap from scratch. The Steam Deck and Windows portables have a gamut of Windows games. What'll Nvidia do; x86 emulation? Cooperation with Valve?
> I wonder what software ecosystem they have in mind. Nintendo has one, and has demonstrated with the Switch that they can bootstrap from scratch. The Steam Deck and Windows portables have a gamut of Windows games. What'll Nvidia do; x86 emulation? Cooperation with Valve?
There are mature Linux and Android ecosystems for cheap emulation devices, and if Nvidia started selling chips capable of later-gen 3D console emulation they would own that market. Mobile gaming is big enough that I'm sure a bunch of people would prefer a Steam Deck-like device instead of a phone. Proton also works on ARM, so theoretically an ARM device running SteamOS is a possibility. That would require a bunch of work on Valve's or the community's part, but most of the hard work is done already.
> There are mature Linux and Android ecosystems for cheap emulation devices
It's a legal grey area; emulation means obtaining ROMs and firmware from unauthorized sources, and less than 0.1% of users actually have the knowledge and hardware needed to dump anything themselves. I doubt any big business is crazy enough to market a chip to that user base.
> Arm, well, surely nobody needs more FPS in Candy Crush.
My point is (I wasn't very clear about it) that Nvidia must have a business model in mind. They would not spend any resources if they didn't believe in a return. So where will the games come from?
> It's a legal grey area; emulation means obtaining ROMs and firmware from unauthorized sources, and less than 0.1% of users actually have the knowledge and hardware needed to dump anything themselves. I doubt any big business is crazy enough to market a chip to that user base.
They just need to market it to legal emulation devices and the community usually takes it from there. When you round up the market as a whole, N64-and-below emulation devices are a big market, both legal and the dodgy Chinese ones, and people are willing to pay a premium for solid devices. If an Nvidia ARM chip means those types of devices can push into N64/GC/PS1/PS2/Xbox emulation, that's a lot of potential money on the table.
> SteamOS is sponsored by their store, just like the Switch with its software; sure, you can run third-party software on it, but it has a legit main source of games and software. Arm, well, surely nobody needs more FPS in Candy Crush.
That was my point: Proton runs on ARM now, so theoretically so could SteamOS, and therefore most Steam games. ARM chips have decent CPU performance these days, but the GPUs still suck pretty hard compared to Nvidia and even AMD GPUs, so if Nvidia enters the market they could probably sell a lot of chips/devices if they're cheaper and more power efficient than x86 parts.
> I expect it'll run Windows with a shell on top, like the Ally.
And with x86 emulation, including hardware assist like Apple built into their flavour of ARM CPU? In the latter case, Nvidia might get polite mails from Apple's lawyers. Without the hardware assist, the Steam Deck's Zen 2 cores will compete much better than most people would expect.
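For context on what that hardware assist buys you: Apple Silicon can switch a thread into x86-style total store ordering (TSO), which spares Rosetta 2 from inserting memory barriers into translated code. Below is a rough, hypothetical sketch (function names invented, not from any real emulator) of the underlying problem a software-only emulator faces on weakly ordered ARM:

```cuda
// Rough illustration (invented names) of why x86 emulation on ARM wants
// hardware help. x86 is strongly ordered (TSO); ARM is weakly ordered, so a
// naive translation of x86 loads/stores into plain ARM loads/stores can
// expose reorderings the guest program never had to tolerate.
#include <atomic>
#include <cstdio>

std::atomic<int> data{0}, flag{0};

// Guest x86 code being emulated, conceptually:
//   thread A: mov [data], 1 ; mov [flag], 1
//   thread B: mov eax, [flag] ; mov ebx, [data]
// On x86, if thread B sees flag == 1 it must also see data == 1.

void emulated_store_side() {
    // Without a hardware TSO mode, the emulator must strengthen every
    // guest store (release semantics -> extra barriers -> slower code).
    data.store(1, std::memory_order_release);
    flag.store(1, std::memory_order_release);
}

void emulated_load_side() {
    // ...and every guest load, to preserve x86's load-load ordering.
    if (flag.load(std::memory_order_acquire) == 1) {
        int d = data.load(std::memory_order_acquire);
        printf("data = %d (must be 1 under x86 rules)\n", d);
    }
}

int main() {
    emulated_store_side();
    emulated_load_side();
    return 0;
}
```

With a per-thread TSO mode, the translated code can use plain loads and stores and let the hardware guarantee the ordering, which is a large part of why Rosetta 2 performs as well as it does.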
> Xe has to be an incredibly hard sell to Intel's board - a VERY expensive product that's just a money pit and will continue to be for likely several generations. I guess these days so many people are frothing for anything you can tack the word 'AI' on, so maybe it can be sold as a stepping stone to global AI domination or something.
I find it immensely amusing that the stated reason for Intel getting back into GPUs was that they needed to fill their plants - and then they fab it at TSMC.
> Disappointed that we didn't see any consumer GPU announcements from Computex - at least some people were expecting Nvidia to announce Blackwell-based RTX cards, but Nvidia basically just did a victory lap talking about previously announced products.
Nobody has anything ready to ship. AMD didn't paper-launch because they wanted people to pay attention to Zen 5. Intel didn't paper-launch because they wanted people to pay attention to Lunar Lake. Nvidia didn't paper-launch because they didn't want to reduce demand when the next gen is still 6 months out and they just did a facelift.
> This wasn't rocket science, Intel. AMD was on SIMD16, and the engineers they poached knew this...
The funny thing is that from a purely technical point of view, shader programs should neither know nor care about the SIMD width of the specific hardware they are running on. If this is really such a big compatibility issue, then it means a surprisingly large fraction of real-world shader programs are technically incorrect. Of course the programmers never knew, because there were no test cases exposing such bugs.
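To make that failure mode concrete: shader languages expose wave/subgroup operations much like CUDA's warp intrinsics, and the classic mistake is hardcoding the lane count. The sketch below is a hypothetical CUDA analogue (where the width happens to always be 32), not code from any shipping game:

```cuda
// Hypothetical illustration: a reduction that bakes in a 32-wide execution
// group. Correct on hardware where groups really are 32 lanes, silently
// wrong anywhere else -- the same class of bug described above for shaders
// that implicitly assume a SIMD width.
#include <cstdio>

__global__ void sumReduce(const float* in, float* out) {
    float v = in[threadIdx.x];

    // BUG-BY-ASSUMPTION: the loop hardcodes 32 lanes. On hypothetical
    // hardware whose native width is 16 (or 64), half the lanes would
    // never get combined (or lanes from a neighboring group would leak in).
    for (int offset = 16; offset > 0; offset >>= 1)
        v += __shfl_down_sync(0xffffffffu, v, offset);

    // Portable version: derive the span from the actual width instead.
    // for (int offset = warpSize / 2; offset > 0; offset >>= 1) ...

    if (threadIdx.x == 0) *out = v;
}

int main() {
    float h_in[32], *d_in, *d_out, h_out = 0.f;
    for (int i = 0; i < 32; ++i) h_in[i] = 1.f;   // expected sum: 32
    cudaMalloc(&d_in, sizeof h_in);
    cudaMalloc(&d_out, sizeof h_out);
    cudaMemcpy(d_in, h_in, sizeof h_in, cudaMemcpyHostToDevice);
    sumReduce<<<1, 32>>>(d_in, d_out);
    cudaMemcpy(&h_out, d_out, sizeof h_out, cudaMemcpyDeviceToHost);
    printf("sum = %f\n", h_out);
    return 0;
}
```

The shader-side equivalent would be a wave intrinsic used with a hardcoded 32 where something like HLSL's WaveGetLaneCount() should have been queried.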
> The funny thing is that from a purely technical point of view, shader programs should neither know nor care about the SIMD width of the specific hardware they are running on. If this is really such a big compatibility issue, then it means a surprisingly large fraction of real-world shader programs are technically incorrect. Of course the programmers never knew, because there were no test cases exposing such bugs.
Shaders are a problem kind of like JavaScript: they are doing a job they were never designed to do, at a scale nobody imagined when they were writing the specs. The majority of shaders in a game are going to be cobbled together automatically from UI buttons and sliders, then recursively stacked and bolted together to get the effect the designer wanted. They are way too dynamic, which is what causes all the shader compilation issues, since you don't actually know what combinations of shaders need to run on each object until runtime, because everything is so conditional.

To combat the performance mess that flexibility creates, the game engines and drivers try to do just-in-time optimization to make it all faster. All of that means there's a whole lot of room for bugs, especially when there are only two platforms to build to. You don't need code that works to spec; you need code that runs as fast as possible on Nvidia and AMD cards.
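A hypothetical host-side sketch of that variant pattern (all names invented): feature toggles get packed into a key, and the matching pipeline is only built the first time that key shows up at draw time, which is exactly where the compile hitch comes from:

```cuda
// Hypothetical sketch (all names invented) of the shader-variant pattern
// described above: material options are packed into a bitmask key, and the
// corresponding pipeline is compiled lazily on first use -- on the render
// thread, which is the stutter players notice.
#include <cstdint>
#include <cstdio>
#include <string>
#include <unordered_map>

enum MaterialFlags : uint32_t {
    NORMAL_MAP = 1u << 0,
    ALPHA_TEST = 1u << 1,
    SKINNING   = 1u << 2,
    FOG        = 1u << 3,
    // ...dozens more in a real engine, giving 2^N possible variants,
    // which is why you can't precompile them all up front.
};

struct Pipeline { std::string blob; };  // stand-in for a compiled shader

static std::unordered_map<uint32_t, Pipeline> g_cache;

// Called per draw. A cache miss triggers a compile *now*.
const Pipeline& getPipeline(uint32_t key) {
    auto it = g_cache.find(key);
    if (it == g_cache.end()) {
        printf("compiling variant 0x%x at draw time (hitch!)\n", key);
        it = g_cache.emplace(key, Pipeline{"<compiled code>"}).first;
    }
    return it->second;
}

int main() {
    getPipeline(NORMAL_MAP | FOG);               // miss: compile stall
    getPipeline(NORMAL_MAP | FOG);               // hit: cheap
    getPipeline(NORMAL_MAP | ALPHA_TEST | FOG);  // new combination: hitch again
    return 0;
}
```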
And then indeed the pragmatically best way to deal with the issue is to become "bug compatible" with whoever is setting the de facto standard (probably Nvidia; AMD would have converged on SIMD16 earlier for the same reason).
> The funny thing is that from a purely technical point of view, shader programs should neither know nor care about the SIMD width of the specific hardware they are running on.
It stopped being funny when people who knew how de facto standards in the industry worked decided to ignore those standards and fuck things up.
> Generative AI is not the only AI, and arguably not the most useful AI.
NVidia is more akin to the Sun Microsystems of the AI bubble. They're generating massive profits, but their profits and stock value are being inflated by the bubble. They may still survive and be profitable after the bubble bursts, but their sales and stock value will likely decline considerably.
Calling NVidia the Pets.com of AI is just absurd. Pets.com is a punch line for failure, whereas NVidia is the exemplar of success.
It's not just over-exuberant shareholders overvaluing NVidia; NVidia is generating massive profits from selling real products.
> it's miraculous
There is no miracle involved here.
> they released something even vaguely usable on the first try
Not their first try. We're on attempt three or four.
> It's an unfathomably challenging task.
No, it's not. Qualcomm, Apple, nVidia, AMD, and many others pull the task off every year.
> Then they dedicated massive resources to catching up to twenty years of driver development
No, they didn't. They used their IGP driver experience, and the IGP has been playing games for over a decade.
> optimizing for hundreds of games released over the decades
You're off by two orders of magnitude.