Zen 3 build thread

malor

Ars Legatus Legionis
16,093
Unless you're a working chip designer, it strikes me that you're using language of certainty in a very inappropriate way here.
I don't need to be a chip designer to know what the frametime graphs mean for in-game smoothness. Until that's fixed, the APUs AMD occasionally throws into the market won't be replacing dGPUs.

Yes, because computers never get better. Once a type of chip is introduced, all chips of the same type will have exactly the same flaws, forever.
 

IceStorm

Ars Legatus Legionis
24,871
Moderator
Yes, because computers never get better. Once a type of chip is introduced, all chips of the same type will have exactly the same flaws, forever.
The flaw isn't in the chip, it's trying to have the CPU and GPU sip through a straw at the same time. This is why all the technically successful implementations use slabs of very fast local memory on package (Iris Pro, RX Vega). They are, unfortunately, expensive niche parts.

Assuming similar performance to the 128MB eDRAM cache from Iris Pro, an APU with 3D vcache might work. Making that happen requires building an APU part that doesn't sacrifice on the CPU, adding vcache to it, and keeping the cost below a CPU+GPU combination that outclasses it.

The 5800X3D is $409 at the moment. The 5800X is $260. That's $150 for vcache. The RX 6400 is available for $150, and will be a better option than any APU today. Thing is, you don't have to be wed to the 5800X. The 5600 is $150, leaving $260 for a GPU. That gets you a RX 6600.
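A back-of-envelope sketch of that budget math, in Python, using the street prices quoted above (illustrative only; these numbers drift week to week):

```python
# Back-of-envelope on the street prices quoted above (mid-2022, USD).
# The figures are illustrative only; they shift week to week.
price_5800x3d = 409
price_5800x   = 260
price_5600    = 150

vcache_premium = price_5800x3d - price_5800x    # ~$150, i.e. roughly an RX 6400
budget         = price_5800x3d                  # hold total spend at the X3D price

print("vcache premium:           $", vcache_premium)
print("GPU budget next to 5800X: $", budget - price_5800x)   # ~$150 -> RX 6400 territory
print("GPU budget next to 5600:  $", budget - price_5600)    # ~$260 -> RX 6600 territory
```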
 

hobold

Ars Tribunus Militum
2,657
I remember when discrete video cards made the move from special-purpose, dual-ported VRAM to ordinary DRAM. Until that switch, it had seemed ludicrous to not have a dedicated read port for the absolutely timing critical video signal scan-out.

Yes, discrete GPUs with local memory will have "a small integer factor" more memory bandwidth. They also push approximately the same "small integer factor" more pixels per second.

The question is not if (or when) discrete GPUs get overtaken by iGPUs. The question is when (not if) iGPUs get good enough to serve a majority of the market. From that point on, development of discrete GPUs gets much harder to finance continually into the indefinite future.

The history is heavily in favor of integrating more and more stuff into the CPU. First we went from multi-chip processors to single-chip "microprocessors". Then we integrated the floating point co-processor onto the same die. Then we integrated L1 cache, later L2 cache controllers (with a dedicated "back side bus" to an external SRAM cache). Then we integrated the "huge" L2 cache itself. And then the memory controller, which had traditionally been on the other side of some "front side bus".

As long as transistors can be shrunk at all (be it exponentially like in the "Moore's Law" past, or more slowly now and into the future), there will be benefits to higher integration. Nowadays with chiplets/tiles, and upcoming standards across silicon vendors, integration will jump ahead even further, while becoming more modular at the same time.

The question isn't if iGPUs ever get good enough for enthusiasts. The question is how long enthusiast money can sustain development of discrete overkill GPUs, when the money of the ordinary mass market consumer is re-directed towards iGPUs.
 

continuum

Ars Legatus Legionis
94,897
Moderator
The question is when (not if) iGPUs get good enough to serve a majority of the market.
In the recent context of the last few posts in this conversation, I take it we all mean this is for gaming. 'cause in the context of not-gaming we've found the integrated GPU good enough since Haswell, if not even Sandy Bridge.
 

mpat

Ars Praefectus
5,951
Subscriptor
The question is when (not if) iGPUs get good enough to serve a majority of the market.
In the recent context of the last few posts in this conversation, I take it we all mean this is for gaming. 'cause in the context of not-gaming we've found the integrated GPU good enough since Haswell, if not even Sandy Bridge.

For non-gaming tasks, integrated GPUs were good enough with the GMA 950 (the GMA 900 had issues with the Aero desktop and the Vista-generation graphics driver model). And it isn't impossible to make an integrated GPU that works for gaming - just look at the consoles. They have what is for all intents and purposes an integrated GPU, and have for several generations now. The question is, for lack of a better wording: why bother? If you put a console SoC in a PC, it will need a console cooling design, and it will lose all of that PC expandability. The only thing you get back is that it can be made smaller.

If you listened to Apple’s WWDC this year, it is clear that they think their integrated graphics are good enough for a gaming renaissance on the Mac. I have reservations about their gaming strategy, but they’re not related to the quality of their GPUs. For Apple, it made sense to make their design like that, because they sell about 85% laptops so space is important to them when designing.

So I'm not going to say never on the "integrated GPUs for gaming" front. If AMD can put six separate memory controller dies in a GPU package, they can do the same for a CPU, if it makes business sense for them. Big GPUs need specialized cases today, so it isn't much of a stretch to say that the new super-APU needs a case with space for a 240mm radiator (or more; there are 180x180mm radiators, and I'm sure someone has made a 240x240mm one). It just doesn't make sense to do so today, and I'm not sure business conditions are going to change to make it worthwhile any time soon.
 
Integrated GPUs will continue improving until the cheap laptop you buy your kid for college can play any game at a reasonably high resolution and consistent framerate. They will be mainstream, and discrete GPUs will only be for enthusiasts. That doesn't mean discrete GPUs are going away; the enthusiast market is large and valuable.

Think about the type of people that put fancy rims on their souped-up cars, or audiophile home audio systems, or amateur photographers. They're all enthusiasts, and they're willing to pay what seems to an outsider as extravagant prices for at best marginal improvements. But it's their hobby, and they like it. We're enthusiasts too.

I have no doubt that people will still be building their own computers with gigantic power-hungry GPUs in 15 years, and it'll take those people a couple of minutes to explain the marginal benefits over a commodity device. It'll still be around, but much less common than today. And that's good.

I used to think streaming games would take over the mainstream, but I bet that soon enough, local compute will get so cheap and fast that your plasticky laptop or Roku will play games just as well as the cloud.
 
Integrated GPUs will continue improving until the cheap laptop you buy your kid for college can play any game at a reasonably high resolution and consistent framerate. They will be mainstream, and discrete GPUs will only be for enthusiasts. That doesn't mean discrete GPUs are going away; the enthusiast market is large and valuable.
So does this mean that iGPUs will see order of magnitude leaps in performance, or will games stop gaining in resolution and graphical features?
 

IceStorm

Ars Legatus Legionis
24,871
Moderator
I set up my 2200G again, on its ASRock B450 ITX/ac board using a beta AGESA 1.2.0.7 BIOS. 32GB of DDR4-3600 CL18 doesn't work, so I ran it at its second XMP profile, DDR4-3200 CL16.

Warframe and No Man's Sky can be made to run, if you don't mind the screen looking like someone smeared vaseline across it.

Destiny 2 stutters no matter what, because the CPU can't keep up.

I popped in a slot-powered RX 560. That helped No Man's Sky quite a bit. Now I can get 1080p FSR Quality, which takes the image from vaseline to a film of soap scum. Destiny 2? Not at all, and that's because of the other failing that APUs suffer from: the CPU can't be upgraded independently of the GPU. Destiny 2 may be from 2017, but the CPU requirements are too much for a 4C/4T CPU from 2018.

Those who say APUs will take over at the low end are wrong. The x86/x64 platform will have to fundamentally change for that to happen. No local VRAM, no sale. Even if VRAM's added, all it takes is a low end CPU component to make games unplayable.

It's also pretty insulting to the masses to say an APU's "good enough". It's clearly not, and if your idea of PC gaming is relegating the masses to something which costs more than an entire Xbox Series S, but games far worse, then maybe you should consider how entitled you sound saying it.
 

hobold

Ars Tribunus Militum
2,657
Destiny 2 may be from 2017, but the CPU requirements are too much for a 4C/4T CPU from 2018.

Those who say APUs will take over at the low end are wrong.
First and foremost: bonus points for going through the trouble of dusting off that AMD Ryzen 3 2200G. However, as you wrote, that processor is from 2018. It was fabbed in GlobalFoundries' 14nm silicon, and used CPU and GPU cores that were not the latest generation at the time.

We are now in mid-2022. A contemporary chip might be the AMD Ryzen 7 6800H ... four years later and three times the pixel-pushing power. That is what you get without even using the bleeding edge: TSMC 6nm, Zen 3+ cores, RDNA2 GPU. The Ryzen mobile 6000 series is not AMD's mighty attack. It is meant for thin-and-light laptops, where you don't want to spend space and energy on a discrete GPU. And it is already impressive ... but not cheap; this is not a low-end APU.

Still, a four-year-old example is a weak foundation for an argument about the future, when things are moving as quickly as they are. AMD's serious attempt at a "big" iGPU isn't what's in Ryzen 2000G, or what's in Ryzen 6000G. The rumor mill says that the serious attack on entry-level gaming laptops will come with a project called "Phoenix".

https://www.techpowerup.com/296217/...-phoenix-mobile-processor-specifications-leak

Phoenix will still suck compared to an RTX 3090 or a Radeon 6900. But it has a real chance of killing the lower half of discrete laptop GPU chips, as it is actually meant and designed to do. Phoenix might lose that battle and be forgotten. Or it might win, and then fight its way further up.

I don't know what will happen; the future is not set in stone. But these are interesting times, both in utopian and apocalyptic terms.
 

Made in Hurry

Ars Praefectus
4,553
Subscriptor
I set up my 2200G again, on its ASRock B450 ITX/ac board using a beta AGESA 1.2.0.7 BIOS. 32GB of DDR4-3600 CL18 doesn't work, so I ran it at its second XMP profile, DDR4-3200 CL16.

Warframe and No Man's Sky can be made to run, if you don't mind the screen looking like someone smeared vaseline across it.

Destiny 2 stutters no matter what, because the CPU can't keep up.

I popped in a slot-powered RX 560. That helped No Man's Sky quite a bit. Now I can get 1080p FSR Quality, which takes the image from vaseline to a film of soap scum. Destiny 2? Not at all, and that's because of the other failing that APUs suffer from: the CPU can't be upgraded independently of the GPU. Destiny 2 may be from 2017, but the CPU requirements are too much for a 4C/4T CPU from 2018.

Those who say APUs will take over at the low end are wrong. The x86/x64 platform will have to fundamentally change for that to happen. No local VRAM, no sale. Even if VRAM's added, all it takes is a low end CPU component to make games unplayable.

It's also pretty insulting to the masses to say an APU's "good enough". It's clearly not, and if your idea of PC gaming is relegating the masses to something which costs more than an entire Xbox Series S, but games far worse, then maybe you should consider how entitled you sound saying it.

Insulting? There are legions out there (especially happy parents) who are happy with their G-series APUs, which allow gaming on an ultra-budget, and I am one of them. I am quite surprised how well that 2200G box has been doing, as I bought it not expecting it to perform as well as it has :). The games it runs pretty decently include Minecraft (with shaders), Rocket League, Cities: Skylines, The Sims 4 (with all the DLCs), Red Dead Redemption 2 (35 fps), Mafia II, GTA V, and so forth.
Isn't that gaming? The desktop 5600G has been a pretty popular chip for AMD even if Vega now is getting outdated, although the MSRP should have been far lower.

It will never of course replace the experience of a good dGPU, but to say it's insulting just speaks of your own expectations of what PC gaming is.

PC gaming is more than running the latest titles, and AMD hit a home run with these, especially the 2200G/2400G and the 3200G/3400G.
 

w00key

Ars Praefectus
5,907
Subscriptor
Those who say APUs will take over at the low end are wrong. The x86/x64 platform will have to fundamentally change for that to happen. No local VRAM, no sale. Even if VRAM's added, all it takes is a low end CPU component to make games unplayable.

It's also pretty insulting to the masses to say an APU's "good enough". It's clearly not, and if your idea of PC gaming is relegating the masses to something which costs more than an entire Xbox Series S, but games far worse, then maybe you should consider how entitled you sound saying it.
Quoted for truth. APUs have their place as entry-level machines, and they are fine, really.

But "bigger APU" loses hard against cards <= $200. The 1650 (192 GB/s) and RX 6400/6500 (128/144 GB/s + infinity cache). There is zero way for a bigger GPU core plus embedded cache to be better at the same price as a discrete card once you leave the current APU territory, which has a budget of between $0 and $100 for additional silicon. The 5600G / 5700G barely cost more than the plain 5600X / 5700X, you pay for it in reduced CPU performance; the RDNA2, Zen4 versions may be more competitive and go for a full $100 premium like laptop chips with Iris Xe G7 96EUs (yes, that one is indeed faster than Vega 8).


This is about the most you can get out of an APU; in laptops, AMD already made a big APU with 45W graphical power: https://www.notebookcheck.net/AMD-Radeo ... 860.0.html

So around GTX 1650 Mobile performance, and around 60% faster than Iris Xe 96EUs. But actually trying to find one for sale results in a list of €2500+ laptops with an RTX 3070/3080 dGPU included. It makes sense: a top-tier laptop CPU part won't go in a cheap build, it's expensive. The smaller 660M is 40% slower and an even match for Iris Xe => good, but nothing special.
 

IceStorm

Ars Legatus Legionis
24,871
Moderator
First and foremost: bonus points for going through the trouble of dusting off that AMD Ryzen 3 2200G. However, as you wrote, that processor is from 2018. It was fabbed in GlobalFoundries' 14nm silicon, and used CPU and GPU cores that were not the latest generation at the time.
All the games I chose pre-date the 2200G. Destiny 2 was the "newest", at about 9 months before the 2200G launched. Warframe's from 2013. No Man's Sky is from 2016. This illustrates the problem with saying, "but newer APUs will be fine!" They won't. Even excluding the one or two "Can it run Crysis?" titles (Cyberpunk 2077), APUs should be able to run the vast majority of titles if they're going to stand in for the $150-$250 dGPU market.

Insulting? There are legions out there (especially happy parents) who are happy with their G-series APUs, which allow gaming on an ultra-budget, and I am one of them.
Please keep in mind that AMD doesn't have legions of anything on the user front. They may have a few rabid users, but sales volume for APUs is basically zero compared to Intel's vast number of iGPU-equipped CPUs. APUs are laptop reject parts. There can't be large volumes. All the evidence you need is the spotty availability of APU parts.

Rocket League
You should probably tell people HOW you got Rocket League to work, as when it was tested on the 5600G, it had a regular framerate dip where the GPU just stops working. They did test at 1080p. If you're using 720p, then that may be why yours doesn't hitch.

Isn't that gaming?
As you are an owner, you can make that judgement (though your kids should probably chime in), but if we're talking replacing the $150-$250 GPU segment, the answer is a resounding no.

The desktop 5600G has been a pretty popular chip for AMD even if Vega now is getting outdated, although the MSRP should have been far lower.
I don't recall the 5600G ever being sold out. Its timing was poor, and it wasn't deemed a great value due to its sky-high MSRP.

It's currently $150 at Best Buy, but since GPU prices have fallen and Intel has ~$100 i3 Alder Lake parts, there's no real point in buying one. If you were willing to spend $260 before, you can get a new 12100F and a used RX 580 off eBay for $250 today.

This speaks to why APUs aren't successful for gaming when crypto isn't gobbling up every GPU in existence.

It will never of course replace the experience of a good dGPU, but to say it's insulting just speaks of your own expectations of what PC gaming is.
I was referring to people being insulting to the masses in deciding, without using one, that APUs are "good enough" to replace the low end/mainstream GPU market.

AMD's APUs being insulting? AMD's been pulling the football out from under their buyers for over a decade. Buyer beware is pretty much baked into the product line at this point.

PC gaming is more than running the latest titles
As I noted, nothing I tried was a "latest title". Ironically, "latest titles" are going to run better on the APU because of FSR. No Man's Sky has FSR 1.0 support, and it looks somewhat better than just dialing down the render resolution.
 

IceStorm

Ars Legatus Legionis
24,871
Moderator
You just keep repeating that fundamental argument in different forms: my old APU sucks, so therefore all new APUs must also suck.
It's not "my" old APU, it's ALL APUs to-date since 2006.

It's a bad argument.
You have zero evidence that AMD can ship a product which will replace the $150-$250 GPU market. You ignore the fundamental reason why there's only ever been one successful socketed iGPU for gaming (Iris Pro, no longer usable because of lack of API support), and why that solution isn't one AMD can use.

What's worse, you don't appear to have ever tried using an iGPU gaming solution, yet you make blanket assertions that they'll replace the $150-$250 market. I own both an Iris Pro system and the 2200G. I at least know how bad these "solutions" really are.
 

hobold

Ars Tribunus Militum
2,657
Oh, I have a mini-ITX 2200G build, too. And Warframe just so happens to have been the target for that machine. But I never really had to rely on that little emergency box to be my daily driver.

IceStorm is certainly right that games are a moving target ... if you restrict your view to AAA titles. Quite a few of the other games have very different thresholds of "enough" pixel pushing power.

(BTW, Warframe is an interesting example in so many ways. The engine is technologically quite interesting, even if they never chased the latest GPU buzzwords. Render quality options and eye candy scale all the way from Intel iGPU to, say, Radeon 6600XT mid-range; and GPUs above that effortlessly run higher res / high frame rates. No game recommendation from me, though, because it is the kind of endless occupation that can suck up as much of your life as you want to sacrifice to it. :) )
 
So does this mean that iGPUs will see order of magnitude leaps in performance, or will games stop gaining in resolution and graphical features?
That's an insightful question. I don't see resolution past ~4K (as a shortcut for describing pixel density) having any value at all on computer monitors, laptops, handhelds, or TVs. For VR it will matter to some extent, but it remains to be seen how mainstream VR actually becomes. My guess is it'll never go mainstream outside of porno, and AR will reign supreme, but I digress, so moving on.

Graphical fidelity will of course continue to improve, but those improvements are increasingly marginal too. Think about even Xbox 360-era games today. They don't look modern, for sure, but they really aren't bad. Compare Oblivion to Skyrim to Red Dead 2 to the UE5 Matrix demo. There's a clear progression in presentation, but Oblivion remains completely playable today; it isn't like early 3D games such as Daggerfall, where the graphics are truly tough to swallow.

So-- my answer is "both".
 

w00key

Ars Praefectus
5,907
Subscriptor
You just keep repeating that fundamental argument in different forms: my old APU sucks, so therefore all new APUs must also suck.

It's a bad argument.
DRAM is the hardest thing to improve; even DDR4 to DDR5 added negligible real-world benefits. Only GDDR6(X)'s brute-force approach worked: a stupidly wide memory bus times a high clock rate, and with the -X variants, more bits per transfer via PAM4 encoding.

How on earth is an iGPU going to keep up with ever-increasing demands using just the main memory bus? We've already written off eDRAM, and AMD's stacked implementation costs a ton and isn't very thermally friendly. Threadripper/EPYC style, quad-channel memory per socket? I doubt there is any demand for that; the extra cost for the board alone pays for a low-end card that has more than that in GB/s of bandwidth.

The GTX 1650 has 6 channels' worth of DDR5 bandwidth. And that's a pretty ancient card.
 

w00key

Ars Praefectus
5,907
Subscriptor
Well, it's still a factor of 3 off from a GTX 1650, and a factor of 10 off an RTX 3080. I don't see an iGPU getting to GTX 1650 level soon; maybe when DDR6 or DDR7 is released in, well, another 5 to 10 years? By then, a GTX 1650 is probably only good for rendering at the lowest quality. Even a hugely overclocked DDR5-6400 DIMM is only ~50 GB/s, two of them for ~100; you'd still need another 2 DIMMs to match an entry-level card from 2019 that didn't even deserve the RTX label.
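As a rough sketch of that bandwidth gap, in Python. The bus widths and per-pin data rates are the standard published figures for a 64-bit DDR5 DIMM, the GDDR6 GTX 1650, and the RTX 3080; treat the output as illustrative ratios, not a benchmark:

```python
# Peak bandwidth in GB/s = transfer rate (MT/s, i.e. Mb/s per pin) * bus width (bits) / 8 / 1000.
def bw_gbs(mtps: float, bus_bits: int) -> float:
    return mtps * bus_bits / 8 / 1000

ddr5_6400_dimm = bw_gbs(6400, 64)      # one 64-bit DDR5-6400 DIMM: ~51 GB/s
gtx_1650_g6    = bw_gbs(12_000, 128)   # GTX 1650, GDDR6 variant: ~192 GB/s
rtx_3080       = bw_gbs(19_000, 320)   # RTX 3080, GDDR6X: ~760 GB/s

print(f"DDR5-6400 DIMM: {ddr5_6400_dimm:.1f} GB/s")
print(f"GTX 1650 (G6):  {gtx_1650_g6:.1f} GB/s -> {gtx_1650_g6 / ddr5_6400_dimm:.1f} DIMMs to match")
print(f"RTX 3080:       {rtx_3080:.1f} GB/s -> {rtx_3080 / ddr5_6400_dimm:.1f} DIMMs to match")
```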
 

Xavin

Ars Legatus Legionis
30,167
Subscriptor++
Integrated GPUs will continue improving until the cheap laptop you buy your kid for college can play any game at a reasonably high resolution and consistent framerate. They will be mainstream, and discrete GPUs will only be for enthusiasts. That doesn't mean discrete GPUs are going away; the enthusiast market is large and valuable.
It's also not just the gaming enthusiast market, it's the gaming enthusiast + 3D workstation + AI/Machine learning market. If you are already making discrete GPUs for the enterprise and data center, which they will be for the foreseeable future, and making gaming drivers, which they have to do GPU or APU, then making discrete gaming GPUs makes sense because it's very little extra work.
 
Very true, workstation, deep learning, modeling, media rendering, etc, all use powerful GPUs too. That will never go away.

When my dad was a little kid, his hobby was building hi-fi stereos from parts he sourced in local electronics stores. This was by far the cheapest and best way to listen to high-quality stereo music at home. He read hi-fi magazines, devoured that stuff. Then in the 60s transistor radios became popular, and in the 70s they got cheap and sounded good.

Now today in 2022 you can still buy incredibly expensive hi-fi equipment. That market still exists, and it does sound better than your AirPods or Alexa speaker, even your Sonos or Bose if you're fancy. And yet most people don't care, because wait a sec, just how much better? Speaker systems, and amps, and streaming sound sources, all that stuff up and down the chain has become commoditized. Sure you can spend $15k on a system, if you want. But only enthusiasts care.

That's basically where computers are going. Your phone can already play great-looking games comparable to two-generation-old gaming consoles like the Xbox 360 while consuming only 2 watts of power. The path ahead seems pretty clear to me. Just don't hold your breath, it won't happen immediately, like hobold noted.
 

Made in Hurry

Ars Praefectus
4,553
Subscriptor
I haven't experienced any problems with Rocket League, neither on the 2200G nor on the 5600H laptop which is my daily driver. Both my daughter and I play at 1080p (me at 1920x1200) in performance mode; it's a regular weekly event we have together :)

The 5600G is currently the 4th top seller in Norway according to our own price-match service, and the 3 others at the top also belong to AMD. Actually, the top 7 belong to AMD before Intel's 12900K and 12700K follow in the next spots. https://classic.prisjakt.no/category.php?k=500

At least here, even if it's a small market, AMD's CPUs have regularly been out of stock.

A recent dGPU will always be better of course (which is what I am going to buy myself in the next month or so), but to say that APUs are useless for casual gaming is something I just do not comprehend. The Intel equivalents did not make that kind of gaming both widely available and affordable before the 2200G/2400G and their successors came along.
 

w00key

Ars Praefectus
5,907
Subscriptor
Huh. This discussion is more pointless than I thought. There's already the Radeon 680M, built into the Ryzen 7 6800U and up: RDNA2 with 12 CU, 768 pipelines. And its little brother, the Radeon 660M, with half the cores.


That's your dream of a big APU, right? It's definitely bigger than the 5600G's iGPU.

Performance, compared with a $200 card: https://www.notebookcheck.net/Radeon-RX ... 598.0.html
And vs 5600G's Vega 7: https://www.notebookcheck.net/Vega-7-vs ... 598.0.html


It's not bad really: good enough for gaming and better than the Steam Deck's 8 CU APU. But scaling between 6 CU and 12 CU isn't linear; it's only 44% faster in Time Spy, so making it even bigger is a waste of silicon without getting better DRAM. Compare it with a $200 card, though, like the 6500 XT / Navi 24, and it is very badly beaten. The biggest iGPU is half as fast as a $200 card, so it's worth maybe $100 to a desktop user.
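A tiny bit of arithmetic, in Python, on how far that 44% figure falls short of linear CU scaling (illustrative only, using the Time Spy number above):

```python
# How far the 660M -> 680M step falls short of linear scaling, using the
# Time Spy delta quoted above. Purely illustrative arithmetic.
cu_small, cu_big = 6, 12
speedup_measured = 1.44               # 680M is ~44% faster than the 660M in Time Spy
speedup_linear   = cu_big / cu_small  # 2.0x if the extra CUs scaled perfectly

efficiency = speedup_measured / speedup_linear
print(f"Scaling efficiency of the extra CUs: {efficiency:.0%}")  # ~72%: bandwidth-starved
```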


The question is, does AMD think this is what users want on a desktop? If it costs $100 more than a part with no/small iGPU, then it's pointless; you can grab basically any cheap dGPU right now and match the Radeon 660M/680M, and $200 buys something twice as fast as the 680M.

And by the time the next desktop APU is released, the $200 RDNA3 "Navi 34" card will kick its Navi 24 little brother's ass. It's a moving target, and there is very little space between a budget APU, a slightly more premium APU (like those 12 CU laptop chips), and the cheapest discrete cards.

~~~

New series for Zen 4 confirmed so far:

Ryzen 7000 / Raphael, the next desktop chip with Zen 4, is rumored to come with 4 CU of RDNA2. So that's even slower than the R660M.

Ryzen 7000H / Dragon Range, the next high-end laptop chip: Zen 4 + RDNA 3, low CU count. Makes more sense than putting the biggest iGPU in the 6900H; those laptops always come with an RTX 3070/3080 right now.

Ryzen 7000H/U / Phoenix (Point), thin-and-light laptop chip: Zen 4 + RDNA 3, high CU count, for laptops without a dGPU.


I doubt they will add a desktop APU to the lineup. For laptops it makes sense. For things with a power cord, not so much. Can they put a Phoenix Point APU on a socket and sell it for $100 more than Raphael? Maybe. Is the market big enough to be worth it? Roadmap (from last Q's financial report presentation) says no.
 

mpat

Ars Praefectus
5,951
Subscriptor
I brought up the 6000G series already, but nobody took the bait. Yes, a 12 CU APU with a 45W TDP loses to a 16 CU card with a 107W TDP. Not exactly a shock, even if we ignore for a second that the 16 CU card has that power budget all to itself, while the APU has to share its budget with 8 Zen 3 cores. It is always about the power budget in these things. It is also why the scaling from the 660M is so poor.

Also, your link states that the 680M uses DDR4 up to 3200MHz. That must be a typo, as the 6800U and 6850H don’t support DDR4, only DDR5 up to 4800MHz and LPDDR5 up to 6400MHz, but it makes me doubt the rest of the specs.
 

w00key

Ars Praefectus
5,907
Subscriptor
The 6800U is already released and sold in laptops. That's DDR4.

Zen 4 / DDR5 is going to use 7000 (H/U) numbering. And no big APUs are planned for desktops.


First AMD Ryzen 7 6800U benchmarks: Closer to Core i7-1260P in multi-core, Radeon 680M impresses in synthetics but lags in gaming

Radeon 680M iGPU: Beats the MX450 in synthetics but not in gaming, requires FSR
One of the main USPs of the Ryzen 6000 generation is the introduction of an RDNA 2-based iGPU that should finally negate the need for an entry-level dGPU like the GeForce MX450.

In Geekbench 5.3 compute tests, the Radeon 680M is able to handily beat an average MX450 in both OpenCL and Vulkan tests with comparable performance to that of an average GTX 1650 Mobile. In 3DMark, the Radeon 680M is able to lead an average MX450 by as much as 37% in Fire Strike Graphics and 25% in Time Spy Graphics. However, it loses out to the GTX 1650 Mobile and the GTX 1650 Max-Q in this test.

Gaming presents a strange conundrum. We have scores only from Dota 2: Reborn for the moment, but the Radeon 680M trails the average MX450 and the average GTX 1650 Max-Q by 21% and 28%, respectively in 1080p High. This performance gap between the GPUs further increases by up to 7% at 1080p Ultra. The deltas are somewhat lower at lower resolutions and settings.

In a bigger comparison, it compares favorably against the "better than nothing" MX450. https://www.notebookcheck.net/GeForce-M ... 598.0.html
 

hobold

Ars Tribunus Militum
2,657
And no big APUs are planned for desktops.

(Speaking about Ryzen 7000 / Zen4) Quoted for emphasis. The small iGPU included in all desktop Ryzen 7000 models removes AMD's motivation to rebrand any mobile silicon as 7000G desktop models. Intel does not pose a big enough threat that AMD might need a beefy iGPU as a unique selling point.

Furthermore, the GPUpocalypse is still looming ... i.e. the upcoming flood of used mining GPUs is expected to pretty much kill the low end discrete GPU market. There is neither a need nor any realistic hope that AMD could capture that market with a strong iGPU in this current situation.

In a bigger comparison, it compares favorably against the "better than nothing" MX450. https://www.notebookcheck.net/GeForce-M ... 598.0.html
Where "favourably" means ~80% faster averaging over a few dozen games. So the 680M in the Ryzen 6000 strongly encourages laptop OEMs to choose AMD over an Intel + Nvidia combo. Here the iGPU is indeed a unique selling point.

(There might be a future where an Intel + Intel combo is an option. But Arc Alchemist doesn't deliver that.)
 

w00key

Ars Praefectus
5,907
Subscriptor
Well, the MX450 is the crappiest of them all; the 1650 is the one to beat, and it's not there yet.

All the laptops with an AMD 6800U or better pair it with an RTX 3050-3080 or the AMD dGPU equivalent. You can't pair a top-model (read: expensive) CPU with no dGPU and expect it to sell well; at least, that's what the market has decided so far. Except for that one ZenBook: at 14.9mm / 1kg it's too small to stuff in a gaming card.


On our pricewatch, selected CPUs with R680M iGPU, sort by popularity, spoilered for size:

[screenshot: pricewatch listing of CPUs with the R680M iGPU, sorted by popularity]



And don't just trust that average; not every device is tested in every game, and with laptops, implementation, TDP limit and thermals are king. Take a game that's been in the test suite for a long time, like DOTA 2 or GTA V, and the MX450 beats the Radeon 680M:

[screenshot: per-laptop DOTA 2 benchmark scores, MX450 vs Radeon 680M]


Every score is a single tested laptop. And the MX450 does quite well there, faster than the average 680M implementation.

Witcher 3 shows them at about the same speed, with 680M up to 15% faster @ 768p low and 3% faster at 1080p high.

GTA V shows the MX450 5 to 15% faster.


In games with 1x MX450 laptop vs 1x R 680M laptop, it's a toss-up, depending on how chunky they are.
 

IceStorm

Ars Legatus Legionis
24,871
Moderator
He doesn't have to. The pricing argument already wins. People aren't going to pay that much for a laptop if it doesn't have a dGPU.

Dave2D did a decent video on AMD Advantage:
https://www.youtube.com/watch?v=fKoDe6kw_ZE

Mobile is where the iGPU shines, keeping power levels lower when at the Windows desktop. You have the option to run titles that can run on the iGPU (Don't Starve) to save power, but you have the dGPU for titles that will never run well on the iGPU. All of that goes away on the modular PC desktop.

This reinforces the point I've been making - AMD does not make desktop parts. They make server parts, and they make mobile parts. Desktop parts are the rejects from both lines. You're not going to see a "big IGP" desktop part unless there's a reason for it to exist in mobile or server.
 

Made in Hurry

Ars Praefectus
4,553
Subscriptor
He doesn't have to. The pricing argument already wins. People aren't going to pay that much for a laptop if it doesn't have a dGPU.

Dave2D did a decent video on AMD Advantage:
https://www.youtube.com/watch?v=fKoDe6kw_ZE

Mobile is where the iGPU shines, keeping power levels lower when at the Windows desktop. You have the option to run titles that can run on the iGPU (Don't Starve) to save power, but you have the dGPU for titles that will never run well on the iGPU. All of that goes away on the modular PC desktop.

This reinforces the point I've been making - AMD does not make desktop parts. They make server parts, and they make mobile parts. Desktop parts are the rejects from both lines. You're not going to see a "big IGP" desktop part unless there's a reason for it to exist in mobile or server.

That's probably true, since I guess AMD wants to sell their garbage RX 6400 and RX 6500 and their future equivalents, so the iGPU won't, for the foreseeable future, compete or risk competing with their low-end dGPUs.
 

IceStorm

Ars Legatus Legionis
24,871
Moderator
That's probably true, since I guess AMD wants to sell their garbage RX 6400 and RX 6500 and their future equivalents, so the iGPU won't, for the foreseeable future, compete or risk competing with their low-end dGPUs.
There will be a need for lower end GPUs for a long time. APUs do not interfere with that market. They cost too much to fabricate, they're too low volume on the desktop as they're laptop rejects, and they aren't drop-in replacements for systems that need a basic video card.

APU pricing goes like this: Initial launch pricing is too high. APU parts don't sell. APU parts go on fire sale (how I got my $60 2200G, how you can get a $150 5600G today). APU parts disappear, but demand returns and the aforementioned fire-saled parts go back up on scalper sites at 2-4x original price.

AMD's desktop APU availability is truly awful.
 

hobold

Ars Tribunus Militum
2,657
Gentlemen, please make up your minds whether you want to argue technology, economy, or market acceptance. Because if either of you keeps switching gears whenever it seems convenient, I have to keep jogging after those goalposts while I try to honor the complexity of each individual line of reasoning with more refinements of each respective argument.

So if you could stick to one particular gun each, I'll happily continue our respective duels for the entertainment of the audience. Otherwise I'll only be running around without getting anywhere. Thank you.


Technology:
A single data point still does not suffice to prove or disprove overall technical performance.

Economy:
AMD can currently sell every piece of silicon they are making. So thanks to the weakness of Intel and the threat of the GPUpocalypse hitting Nvidia much, much harder, AMD is in no hurry to declare the "war on mid-range GPUs" which I fantasized about above. (The absolute bottom of the barrel low end is already gone. It vanished when Intel upped their iGPU to answer the Ryzen 2200G/2400G generation.)

Market Acceptance:
Disrupting a low end market requires low end prices. AMD was the underdog for over 15 years, and they were lean and mean enough that this "beefy APU" strategy looked like a good plan. Today's AMD has already managed to become a premium brand for desktop CPUs, and they made serious progress in mobile and server markets. The market might well accept beefy APUs for a good price. But AMD does not currently need low end pricing to find customers for their products. Their stuff is just too good right now.


Nonetheless, despite all those many valid counterpoints, AMD keeps integrating bigger and bigger GPUs. Nvidia doesn't have any 'x86 CPU cores to do the same. And Intel ... well, they don't have good GPU cores, as the Arc Alchemist drama is making painfully obvious in these last few days (botched launch, earnings call, third parties jumping ship).

So there isn't much competition driving AMD forward on the path of integration. And yet they continue, allegedly picking up more speed with their upcoming Phoenix project.
 

w00key

Ars Praefectus
5,907
Subscriptor
Also you get to a weird point where 4 CU of RDNA 2 in the next mainstream gen is good enough for basically everyone.

6 CU of RDNA 2 on mobile (Radeon 660M) is like 30% faster than the 7 CU of Vega 7 (5600G). So a cut-down Zen 4 part with 4 CU is maybe just as fast, especially if paired with some decent RAM instead of laptop-grade energy-saving DDR4.
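A crude per-CU sketch of that guess, in Python, using the ~30% figure above (purely illustrative; it ignores clocks, power limits, and memory differences):

```python
# Crude per-CU arithmetic behind the "4 CU of RDNA 2 ~ Vega 7" guess above.
# Normalise Vega 7 (7 CU, 5600G) to 1.0 and apply the ~30% figure quoted for the 660M.
vega7_perf, vega7_cu = 1.0, 7
r660m_perf, r660m_cu = 1.3, 6          # Radeon 660M: ~30% faster despite fewer CUs

rdna2_per_cu = r660m_perf / r660m_cu   # ~0.22
vega_per_cu  = vega7_perf / vega7_cu   # ~0.14

print(f"RDNA 2 CU vs Vega CU:            {rdna2_per_cu / vega_per_cu:.2f}x")  # ~1.5x per CU
print(f"Estimated 4 CU RDNA 2 vs Vega 7: {4 * rdna2_per_cu:.2f}x")            # ~0.87x, before RAM/clock gains
```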


Every 7000-series CPU is already a decent APU, on par with the best of them available now. The market between that and the RX 6400/6500/GTX 1650 is vanishingly small.

For Steam Deck clones, the 7X00U series will be great, but that's a mobile chip of course.

~~

This is getting really off topic now. We're done building Zen 3 machines right? Maybe start a new desktop APU topic if you want to continue down that road. This thread is EOL.
 

Made in Hurry

Ars Praefectus
4,553
Subscriptor
~~

This is getting really off topic now. We're done building Zen 3 machines right?

Not quite done yet. There has been a fire sale on AMD CPUs lately, and I am piecing together a regular 5600/B550 system that won't impress anyone, but it's going to be a cheap ($175 for the CPU and $80 for the motherboard) and pretty decent build. Found a used 5500 XT as well for $100 that will be enough for my needs. Over here I find S1700 motherboards to be too expensive, and AM5 is out in a minute, so I'm going to wait on building anything else until that sorts itself out.

But I do wonder: how cheap can you build a decent AM4 setup over there? Seems you have much lower prices than we do currently, and with A320 support for most of the Zen 3s, I am guessing now might be the ideal time to build on a tight budget.
 

DaveB

Ars Tribunus Angusticlavius
7,274
But I do wonder: how cheap can you build a decent AM4 setup over there? Seems you have much lower prices than we do currently, and with A320 support for most of the Zen 3s, I am guessing now might be the ideal time to build on a tight budget.
Pretty much a push here in the US between AM4 vs. LGA 1700 at my local Microcenter.

Ryzen 5 5600 (6C/12T) + MSI B550M = $230

Intel i5-12400 (6C/12T) + MSI B660M = $230
 

Made in Hurry

Ars Praefectus
4,553
Subscriptor
But I do wonder: how cheap can you build a decent AM4 setup over there? Seems you have much lower prices than we do currently, and with A320 support for most of the Zen 3s, I am guessing now might be the ideal time to build on a tight budget.
Pretty much a push here in the US between AM4 vs. LGA 1700 at my local Microcenter.

Ryzen 5 5600 (6C/12T) + MSI B550M = $230

Intel i5-12400 (6C/12T) + MSI B660M = $230

Checking our pricewatches here, i find this:

MSI B550M Pro ($75) + 5600 ($188) = $263

MSI B660M-E ($134) + 12400 ($234) = $368 (There are several B660M models, but this was the cheapest one)

Split between 3 different shops to get the lowest price.
 
Helped my son build his first system as a 5600G-based setup. It was a budget box for his gaming and school needs (he's young but saved his money and bought the stuff). It works just fine for the stuff he does. Games like Minecraft and Fortnite play very well on it. I like my dGPU on my gaming system, but I'm doing gaming at 4K where I can. At some point I'm sure he'll upgrade, but that'll be a few years I think.