Past GPU releases: fillrate is king and your drivers are cheating

malor

Ars Legatus Legionis
16,093
I remember running Quake at 1024x768, and marveling at the detail, even though it was only a slideshow at that resolution. "Someday," I thought, "all gaming will be that good!"

Years later, I remember being absolutely blown away by the Mechwarrior 2 pre-made video intro. "Someday," I thought, "we'll be able to play games like that, live!"

But I thought it would take decades. If someone had shown me a 2023 game back then, I would have thought it was a hoax. I mean, I just exited Horizon Zero Dawn, and Quake-era me simply would not have believed graphics like that were possible.
 

IceStorm

Ars Legatus Legionis
24,871
Moderator
It's been a decade since I last had an AMD card, and I'm considering getting a 7800XT.
My new SWFT RX 6700 XT has had driver crashes thrice in two weeks. At first, I thought it was because the system went into Standby while idle in a game. It crashed yesterday while in Starfield, of all things, then crashed again while filling out the crash report (via RDP because the crash killed the video output). But hey, at least the second event allowed me to gracefully reboot the system remotely. The first time, the crash report finished and then everything display related froze, including the RDP session.

I've put in the reference 6700 XT I have to see if the problem is the GPU, or the drivers. I suspect it's the drivers.

On nVidia, no crashes in Starfield across a mobile 2070, 3060 Ti, 3080 Ti, and 4090. Unless you're really hankering for disappointment, I would just spend the extra and get a 4070 or 4070 Ti.
 

grommit!

Ars Legatus Legionis
19,295
Subscriptor++
My new SWFT RX 6700 XT has had driver crashes thrice in two weeks. At first, I thought it was because the system went into Standby while idle in a game. It crashed yesterday while in Starfield, of all things, then crashed again while filling out the crash report (via RDP because the crash killed the video output). But hey, at least the second event allowed me to gracefully reboot the system remotely. The first time, the crash report finished and then everything display related froze, including the RDP session.

I've put in the reference 6700 XT I have to see if the problem is the GPU, or the drivers. I suspect it's the drivers.

On nVidia, no crashes in Starfield across a mobile 2070, 3060 Ti, 3080 Ti, and 4090. Unless you're really hankering for disappointment, I would just spend the extra and get a 4070 or 4070 Ti.
I figured I could avoid using the astigmatism-hating UI, but driver crashes are a different matter. I'll be interested to hear if they continue with the reference card.
 

Baenwort

Ars Tribunus Militum
2,470
Subscriptor++
My first card I actually bought was an ATi X800 Pro that I put a Peltier cooler and Danger Den block on.

It was the generation where ATi surpassed nVidia, and with the Peltier I was able to really OC it.

Outside:
Total.jpg
Inside:
P1050166.JPG
I think I still have the card packed away, as I was so proud of the modding I did (I even overvolted it by changing out some resistors).

All my GPUs before that came from my Dad's hand-me-down computers, so I don't count those, even though he was buying the newest thing every other year.
 
  • Like
Reactions: owdi

SuinusLatinus

Ars Tribunus Militum
2,291
Ohh, I had an X1800XT, the first and only time I bought a halo card, mostly to play Half-Life 2 at 1600x1200 maxed, with no compromises!

The realization of how brief the period was when nothing was faster than what you had (at least in HL2), plus the fact that, by the next generation, you could buy the same performance for a fraction of the price you paid*, made me swear off halo parts forever.


* I already knew that, but feeling it at a practical level, aka with my wallet, was somewhat different.
 
  • Like
Reactions: Sharps97

Ulf

Ars Legatus Legionis
12,551
Subscriptor++
My first "discrete" GPU was an integrated ATi Rage Pro 6 MB in a ... Beige G3 Mac desktop. Only came with 2 MB, I had to upgrade it to 6 MB. I played through the RAVE version of Quake.

Then I bought Unreal (also for the Mac) and my onboard graphcis simply could not handle it, so I bought ... a 12 MB VooDoo 2. Night and day difference.

Unfortunately 3dfx died out so I replaced that card with a ATI Radeon 32 MB, which was slower than the prior card until I upgraded my CPU from 266 Mhz to 500 Mhz.
 
At the university, my parents bought me a no-name Pentium 100 box. At the time, my knowledge of PC parts only went so far as to know of Intel and their generations of CPUs, mostly. I think the fastest CPU available when I got that machine was the Pentium 133. It was a sweet machine; it left my roommate's 486 eating dust. It really felt blazing fast!

I had no idea what kind of GPU it had; my only benchmark for the GPU was how much RAM it had, and I think it had 1 MB. Some time later I got my hands on a, eh... borrowed Starcraft disc, and it stuttered* in some places. I knew enough to know it was probably the GPU causing it. So I went to a store and asked them for a better graphics card. They sold me a 2 MB S3 ViRGE... So, that was my first dGPU, excluding the one already in the box (I have no idea what it was).

My next upgrade was for a GeForce 2. I hated that card because it was always hard-locking my machine (looking back, it could have been a crappy PSU). I sold it and bought an ATI 9600 Pro.

Then came the X1800XT.

After that was a 6950, followed by a 7850. It died while playing the old Desperados game! I sold it for parts and the new owner revived it with the oven trick.

After that came my current 1660 Super, luckily bought before the last COVID and crypto price madness.


*Plot twist: the stuttering continued, even with the upgrade. Further research led me to also swap the ECS sound card for a Creative Sound Blaster, and that was what finally made the stutter go away!
 
  • Like
Reactions: continuum
That's it, damn those lens flares looked impressive back then in all their 32 bit glory. :p
I had also forgotten the game's name and had been wondering for months which game I was faintly remembering. Your comment reminded me of it and thankfully allowed me to narrow the age of the game down, enabling me to find it. Thank you for that!

I was playing it (or its demo, that is) on a P3 667 MHz with TNT 2 Pro. I was a total hardware noob at the time so I don't really remember how well it ran. The first game I had played on the system was Drakan, where I could check all the options and became interested in what those actually affected and meant. Before that system I was on a P2 300 MHz IIRC but I cannot remember the GPU, if it even had a dedicated 3D one. I remember playing Starcraft on that. I think that P2 was the machine I added a Voodoo 2 to at some point. I also put that Voodoo 2 into the P3 machine later on because I wanted to play FF7 with Glide.

This was the first PC I started tinkering with beyond the previous Voodoo 2: some mild software overclocking, and later I upgraded the RAM and installed a GeForce 2 MX 400. At the time it was hard for me to pick a suitable upgrade, and I was torn between the MX 400 and a Kyro 2. I was playing Hard Truck 2: King of the Road at the time and there were reports of massive issues with that game on a Kyro 2. So the Kyro 2 had an interesting approach to rendering and was pretty powerful but, to me, failed because of usability/driver issues. Sounds familiar.

I purchased the Geforce 2 MX 400 on 9/11, being unaware of what was happening until I returned home in the late evening (what a time that was before mobile phones...) in a timezone ~6 hours ahead of NY. Big oof.

The PC after that one was selected and built entirely on my own, thanks to having access to internet guidance. Athlon 2700+ Thoroughbred (an easy pick at the time) with a Radeon 9500 Pro (here I was torn between that and a Geforce 4 Ti 4200. I cannot remember what made me pull the trigger on the Radeon). It was a great system for all my needs and I never had issues that I could trace back to it being an ATI card.

I think I built another PC in 2005/2006 when I was in university, but I cannot find any emails to trace back the components. I vaguely think it would have featured an X1000-series Radeon. Or I wanted to do that, but never followed through. I need to investigate further.

In early 2009 I built a system with a Core 2 Duo E8500 and a Radeon HD 4870. Never any issues there either, besides the PSU exploding spectacularly (it got replaced within 24 or 48 hours; be quiet! support works).

That was followed by a Xeon 1231 v3 with a GTX 970 (now with a 1070, as my wife's PC) and then a Ryzen 7 3700X with a 2060 Super (now with a 2070, as my PC) after the 2060 Super did not work in the previous rig, presumably due to PSU issues.

I am growing old. I also never grew to value contemporary state of the art graphics in games and even today I play old stuff a lot and reduce image fidelity on current titles because it "looks wrong". And because I want the GPU to STFU and remain silent.
 
  • Like
Reactions: owdi
My first "discrete" GPU was an integrated ATi Rage Pro 6 MB in a ... Beige G3 Mac desktop. Only came with 2 MB, I had to upgrade it to 6 MB. I played through the RAVE version of Quake.

<snip>
Sigh; guessing I am much older. My first discrete GPU came with my 286/12, and could do extended EGA, at 640x480 (stock EGA was only 640x350 as best I recall). That machine, with 640KB memory, 40MB hard drive, 1200bps modem, and one each of 5.25" and 3.5" floppy drives, and a no-name multi-sync 13" (?) monitor was US$2200 in 1988. This link says that would be almost US$5700 now...
 

malor

Ars Legatus Legionis
16,093
I tried one of the 9700 Pros, but IIRC, I ended up stuck with it, unable to use it properly, because it had a terrible RAMDAC that didn't drive my expensive widescreen CRT properly. I was really pissed off that they'd put such a shitty part in such an expensive card. It was $300ish, just wasted. It put me off pretty hard on AMD until around 2007ish, maybe 2008, when I tried an, um, maybe an x1600 in my Mac Pro, because that was one of the only options to upgrade it. One thing that I really noticed there was that the digital output was far superior to the competing NVidia card, I think the 680. The colors on my Dell 2407 were much richer, and the pixels were sharper with the AMD card.

I remember seeing a Youtube video about this. The NVidia cards at the time were putting out a really sloppy signal on their DVI ports, with rise and fall times on the signal that weren't nearly steep enough. The net effect was that the ATI card looked much better on a digital signal, which, until then, I didn't even think was possible. I thought digital signals either worked correctly or failed catastrophically, and that was my first lesson otherwise.

But I stayed with the 680 anyway, because it was so. much. faster, and the drivers were wildly better. I haven't bought an ATI/AMD card since.
 
Sigh; guessing I am much older. My first discrete GPU came with my 286/12, and could do extended EGA, at 640x480 (stock EGA was only 640x350 as best I recall). That machine, with 640KB memory, 40MB hard drive, 1200bps modem, and one each of 5.25" and 3.5" floppy drives, and a no-name multi-sync 13" (?) monitor was US$2200 in 1988. This link says that would be almost US$5700 now...
Before the Pentium 100 I mentioned above, I had a 286 with almost the same specs. Until now I hadn't really thought about it, but did those early machines have a dGPU? Wasn't the VGA signal processed by the CPU and/or the motherboard?
 

malor

Ars Legatus Legionis
16,093
Before the Pentium 100 I mentioned above, I had a 286 with almost the same specs. Until now I hadn't really thought about it, but did those early machines have a dGPU? Wasn't the VGA signal processed by the CPU and/or the motherboard?
As far as I know, all PCs have always had discrete video circuitry that operates without needing the CPU's involvement. On the original ISA bus, the CPU could not have been driving the video output, no matter what, because there wasn't enough bandwidth. Even a basic CGA or MDA card has onboard RAM, which is modified by the CPU, and then scanned out by the card at 60 or 70Hz, respectively. That sort of thing can be built into a motherboard, but it looks like a discrete card to the CPU, and operates independently, just like a video card would.

The Atari 2600 did use that 'racing the beam' technique, but that was because it was designed in the mid-70s, and they were trying to save every penny. Even the Apple II, which famously used the 6502 to do nearly everything, had a dedicated video raster output chip of some kind.
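To make that "onboard RAM, scanned out by the card" split concrete, here's a minimal sketch, assuming a 16-bit real-mode DOS compiler like Turbo C or Borland C (the far keyword and the MK_FP helper come from that toolchain): the CPU's only job is to poke bytes into the CGA card's memory-mapped text buffer at B800:0000, and the card turns that RAM into a video signal entirely on its own.

Code:
/* Minimal sketch, assuming a 16-bit real-mode DOS compiler (Turbo C / Borland C). */
#include <dos.h>

int main(void)
{
    /* Far pointer to the CGA 80x25 text framebuffer: 2 bytes per character cell. */
    unsigned char far *cga = (unsigned char far *) MK_FP(0xB800, 0x0000);
    int col;

    for (col = 0; col < 16; col++) {
        cga[col * 2]     = (unsigned char)('A' + col);  /* character byte            */
        cga[col * 2 + 1] = 0x1E;                        /* attribute: yellow on blue */
    }
    /* The card keeps refreshing the monitor from this RAM; the CPU is already done. */
    return 0;
}

The same division of labor holds whether the circuitry sits on a card or on the motherboard: the framebuffer lives with the video hardware, and the display refresh never involves the CPU.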
 
Meanwhile, I went looking for an answer. Per Wikipedia:

The color palette random access memory (RAM) and its corresponding digital-to-analog converter (DAC) were integrated into one chip (the RAMDAC) and the cathode-ray tube controller (CRTC) was integrated into a main VGA chip, which eliminated several other chips in previous graphics adapters, so VGA only additionally required external video RAM and timing crystals.[10][11]

This small part count allowed IBM to include VGA directly on the PS/2 motherboard, in contrast to prior IBM PC models [...]



@malor's post above adds to it :eng101:
 
  • Like
Reactions: continuum

IceStorm

Ars Legatus Legionis
24,871
Moderator
I figured I could avoid using the astigmatism-hating UI, but driver crashes are a different matter. I'll be interested to hear if they continue with the reference card.
They did, but they were slightly different and I didn't lose the display output.

The problem turned out to be RAM, yet again. People swear AMD's CPUs can run DDR4-3600 just fine, but they usually can't, and that was the case again. Dropping the RAM to DDR4-2400 (default for the sticks) solved the issue, as did putting in a pair of DDR4-3200 sticks. I put the XFX back in and it's been fine all day.

I managed to find the only reasonably priced and still-available DDR4-3600 2x16GB kit that's also double-sided on the QVL for the board. It's from Klevv, so we'll see how this goes once it arrives.
 
  • Like
Reactions: grstanford
I still have a couple of GTX 680s, and I also have a monitor capable of taking VGA, DVI, DP, and HDMI inputs. The output of the GTX 680 is identical on DVI and HDMI and just barely worse on VGA. I don't know what malor was seeing with his, but I doubt the 680 was causing it - probably the cable or the monitor used with it was the cause.
 
  • Like
Reactions: U-99

malor

Ars Legatus Legionis
16,093
They did, but they were slightly different and I didn't lose the display output.

The problem turned out to be RAM, yet again. People swear AMD's CPUs can run DDR4-3600 just fine, but they usually can't, and that was the case again. Dropping the RAM to DDR4-2400 (default for the sticks) solved the issue, as did putting in a pair of DDR4-3200 sticks. I put the XFX back in and it's been fine all day.

I managed to find the only reasonably priced and still-available DDR4-3600 2x16GB kit that's also double-sided on the QVL for the board. It's from Klevv, so we'll see how this goes once it arrives.
You do have to buy quality RAM, you can't just shove any old thing in there. I carefully bought off this motherboard's RAM list, and it's never dropped a byte that I've been able to determine. It's DDR4-3600 16-16-16-36, and it's been flawless.

Note that you don't want more than four total DIMM ranks, either four single-rank or two double-rank. If you try to drive four double-rank, you're gonna have trouble.
 
  • Like
Reactions: Kaiser Sosei
Meanwhile, I went looking for an answer. Per Wikipedia:

<snip>

@malor's post above adds to it :eng101:
Tons of early PCs had their video circuitry integrated onto the motherboard: the IBM PCjr with its enhanced CGA graphics, Tandy did the same, and lots of Commodore, Atari, and Amstrad PCs integrated CGA and/or EGA onto the motherboard well before VGA was a thing.


Plenty of VGA cards had external RAMDACs on them back in the day, and even in modern times DACs could be split out of the main chip - which was done with Nvidia's original G80 GPU.
 
Tons of early PCs had their video circuitry integrated onto the motherboard: the IBM PCjr with its enhanced CGA graphics, Tandy did the same, and lots of Commodore, Atari, and Amstrad PCs integrated CGA and/or EGA onto the motherboard well before VGA was a thing.
Yes, those other early machines had integrated video chips (before the 286 I had an 8-bit PC, an amazing little machine, so limited in its graphics capabilities but at the same time so awesome, at least to little-kid me). I was thinking specifically of the "modern" x86 machines, 286 and later, from VGA onwards, but wasn't clear when writing it.

ETA: Basically, my curiosity was about when the dGPU became a thing. From what I've been reading, it was when different companies started trying to expand the VGA standard, kicking off the arms race that took us here.
 

grommit!

Ars Legatus Legionis
19,295
Subscriptor++
They did, but they were slightly different and I didn't lose the display output.

The problem turned out to be RAM, yet again. People swear AMD's CPUs can run DDR4-3600 just fine, but they usually can't, and that was the case again. Dropping the RAM to DDR4-2400 (default for the sticks) solved the issue, as did putting in a pair of DDR4-3200 sticks. I put the XFX back in and it's been fine all day.

I managed to find the only reasonably priced and still-available DDR4-3600 2x16GB kit that's also double-sided on the QVL for the board. It's from Klevv, so we'll see how this goes once it arrives.
Yeah, troubleshooting can be a headache, thanks for the update. Hopefully this actually stops the crashes for you.

Amusingly, my last AMD card was an HD 7870, so a 7800XT would almost be full-circle on their naming cycle.
 
On nVidia, no crashes in Starfield across a mobile 2070, 3060 Ti, 3080 Ti, and 4090. Unless you're really hankering for disappointment, I would just spend the extra and get a 4070 or 4070 Ti.

Ah, so here's where I should have been looking for anecdotes against a 7800XT purchase.

My 7900XTX crashes like a college student after an all-nighter. Usually in a new game.

What about for older-ish (2yr+) games?

(I'm sure there's a good joke you could make with a bird crashing into a window on this one, but I can't find any good phrasing)
 

IceStorm

Ars Legatus Legionis
24,871
Moderator
You do have to buy quality RAM, you can't just shove any old thing in there.
The RAM I used was Team Group T-Create Expert, which uses Samsung B-die chips. It's not garbage, but it's also not on the QVL. Most people would consider the kit I ordered absolute garbage (Klevv? Really?), but Gigabyte put it on the QVL. I'm not confident it will function.

Yeah, troubleshooting can be a headache, thanks for the update. Hopefully this actually stops the crashes for you.
On the temporary Ballistix DDR4-3200 kit, the game crashed to the desktop once. The driver didn't crash, and the display output didn't stop working, so I'm going to chalk that up to Starfield and not the system itself.

So far, when the game crashes, it's immediately after a transition from one location to another. The game autosaves any time you do that sort of transition, so you don't lose progress.
 

Xavin

Ars Legatus Legionis
30,167
Subscriptor++
ETA: Basically, my curiosity was about when the dGPU became a thing. From what I've been reading, it was when different companies started trying to expand the VGA standard, kicking off the arms race that took us here.
I mean, the first IBM PCs (well before VGA) had video as a separate card, but everything was a separate card in those. Everyone figured out pretty early on that having separate video circuitry was the only way to get decent (at the time) performance out of a machine, so they all had some form of it, even though there was very little going on with early video chips other than them being able to do their thing at the same time the CPU was doing something else. Even expanding to non-PCs (even though it's a generic term now, before the late 90s it explicitly meant IBM x86 machines or compatibles), they all had discrete graphics hardware of one kind or another, sometimes integrated onto one board and sometimes replaceable.

Nothing really built up a big ecosystem of replacement video cards before the PC, though; computer platforms just didn't last long enough back then. Even though the popular ones like the C64 sold for many years, the state of the art changed fast, and if you wanted the best performance and graphics you could buy into a completely new platform basically every year until the late 80s, when the PC, Mac, and Amiga started crowding everyone else out. You have to remember, there was no such thing as video drivers or even standards until CGA/EGA/VGA, so if you got a new video card for your random 80s microcomputer (which did exist for some of them), it was probably pretty useless outside a few specific games or pieces of software that explicitly supported it.

IBM wasn't interested at all in graphics or games, so third parties had free rein to innovate on video cards without worrying about IBM stepping on them (IBM stuff was also stupidly overpriced, so it wasn't really an option for most consumers). Once IBM lost control of "the PC", it was just a free-for-all, with hardware that mostly, sometimes, worked together with everything else, if you were lucky and had the right software. It's a little annoying with the current Intel/AMD/Nvidia/MS control over everything PC, but it sure is a hell of a lot easier to just buy things and have them work as advertised.
 
  • Like
Reactions: SuinusLatinus

cogwheel

Ars Tribunus Angusticlavius
6,691
Subscriptor
The Millennium series used special RAM (maybe WRAM?) that made it faster, so you got better framerates on 2D games; it could push the PCI bus a little harder.
Old stuff trivia time: it was called "WRAM", which was a minor tweak on what was then called "VRAM". Back then "VRAM" meant a specific type of RAM optimized for video card usage, not just "the RAM on your video card". Specifically, VRAM was DRAM that added a second, read-only port to the chips. This allowed the graphics chip to exclusively use the main read/write port at full speed all the time, while the RAMDAC read from the VRAM via that separate read-only port without interrupting the graphics chip. Sure, you got tearing when the graphics chip wrote to RAM while the RAMDAC was reading it, but tearing was just viewed as normal back then.

Cheaper cards, like the Mystique, used the exact same graphics chip as the higher-end versions (e.g. the Millennium), but paired it with relatively generic DRAM, so the graphics chip had to stop doing stuff while the RAMDAC read the screen buffer to send to the monitor.
 

malor

Ars Legatus Legionis
16,093
Old stuff trivia time: it was called "WRAM", which was a minor tweak on what was then called "VRAM". Back then "VRAM" meant a specific type of RAM optimized for video card usage, not just "the RAM on your video card". Specifically, VRAM was DRAM that added a second, read-only port to the chips. This allowed the graphics chip to exclusively use the main read/write port at full speed all the time, while the RAMDAC read from the VRAM via that separate read-only port without interrupting the graphics chip. Sure, you got tearing when the graphics chip wrote to RAM while the RAMDAC was reading it, but tearing was just viewed as normal back then.

Cheaper cards, like the Mystique, used the exact same graphics chip as the higher-end versions (e.g. the Millennium), but paired it with relatively generic DRAM, so the graphics chip had to stop doing stuff while the RAMDAC read the screen buffer to send to the monitor.
Aha, thanks. I think I knew that at some time, but lost the details long ago. Well, about WRAM, anyway. Didn't know about the Mystique having DRAM.

I also have a vague memory of different Millennium cards, even in the same model line, being built to different standards, so buying one 'on sale' was maybe not such a good idea. Matrox marketing, IIRC, was a little shady.
 

Ulf

Ars Legatus Legionis
12,551
Subscriptor++
Sigh; guessing I am much older. My first discrete GPU came with my 286/12, and could do extended EGA, at 640x480 (stock EGA was only 640x350 as best I recall). That machine, with 640KB memory, 40MB hard drive, 1200bps modem, and one each of 5.25" and 3.5" floppy drives, and a no-name multi-sync 13" (?) monitor was US$2200 in 1988. This link says that would be almost US$5700 now...
No, I just had a Commodore Amiga 500 before the Mac and a Commodore 64 prior. I skipped the early PC/Mac era almost completely.
 

Ulf

Ars Legatus Legionis
12,551
Subscriptor++
Atari ST and Spectrum 48k+ here. We'd have never got along as kids! :D
I wanted an Atari 8-bit. Unfortunately, the first Atari 130 XE I got failed to boot (black screen). I returned it, and the second Atari 130 XE I got had a ROM error and, again, failed to boot. I returned that one and got ... a Commodore 64. I saw the Atari ST line in stores, but being wary after those prior issues, I went with an Amiga instead.
 

malor

Ars Legatus Legionis
16,093
I talked my parents into buying an Amiga 1000, and then I later owned an A500 and an A2000 myself. Damn, those were great computers. But, in retrospect, there isn't a lot there worth revisiting; despite how amazing the OS was for the time, nearly all of the memorable software titles also ran on the PC, sometimes better. There's almost nothing unique to that OS that hasn't been brought forward and improved on, so revisiting it in emulation is interesting, but doesn't typically hold my interest that long. And the last real use for the Amiga died when SD video did.

Weirdly, the relatively simplistic Atari ST is still somewhat useful; they make great MIDI controllers, and it's pretty easy to do bare-metal bitbanging on one, where that was always awkward on the Amiga. You could expand and improve Amiga OS and keep it relevant to current computing standards much longer than you could with an ST, but the ultimate long tail usages ended up in the ST's favor.
 
What is so special about built-in MIDI ports?! You could strap them onto any Amiga via the parallel port for next to no money, and any app that could be built for the ST could also be built for the Amiga; they shared the same CPU, but the Amiga had the superior chipset.

Looking at PCs, for a while they had both MIDI and digital waveform audio, but for the personal market MIDI is all but dead nowadays and digital waveform has totally taken over. MIDI only gets used by music professionals.

Modern-day Atari fans clinging desperately to the MIDI ports truly amuses me.
 

Peldor

Ars Legatus Legionis
10,646
Oh we're going way back. I give you the Timex Sinclair ZX81 hooked to a 19" TV. 1 whole KB of RAM.

 
  • Hug
Reactions: SuinusLatinus