Past GPU releases: fillrate is king and your drivers are cheating

pauli

Ars Legatus Legionis
37,643
Moderator
This thread is for discussing (that is, arguing about) past video card releases. It's a counterpart to the NEXT GPU releases thread - if you aren't talking about something cutting edge, bring it here.

Past vs future. Very simple. If you get confused, I will yell at you until you understand.
 

richleader

Ars Legatus Legionis
20,635
When I got my Geforce 2, everyone was talking about how awesome 24 bit color was but I didn't buy it: I figured that the speed advantage of running in 16 bit color would let me run many games in 1600x1200 (vs. 800x600) and the dithering from the higher rez would approximate the missing colors to some degree while giving me a sharper image overall. Was I crazy?

OTOH, I also thought id engine games had better dithering than Unreal engine games at the time so that came into play as well.
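
(For the curious, here's a rough sketch of that trade-off in code: quantizing an 8-bit-per-channel colour down to 16-bit-style R5G6B5 with and without a simple ordered dither. The Bayer matrix and the numbers are purely illustrative -- not anything a real driver or game engine of the era actually did.)

    # Illustration only: truncating an 8-bit channel to 5 bits produces hard
    # bands; an ordered (Bayer) dither spreads neighbouring pixels across the
    # two nearest bands, which averages out better at higher resolution.
    BAYER_4X4 = [
        [ 0,  8,  2, 10],
        [12,  4, 14,  6],
        [ 3, 11,  1,  9],
        [15,  7, 13,  5],
    ]

    def quantize(value, bits):
        """Truncate an 8-bit channel to `bits` bits (kept on the 0-255 scale)."""
        step = 256 // (1 << bits)
        return (value // step) * step

    def dithered(value, bits, x, y):
        """Add a position-dependent offset before quantizing (ordered dither)."""
        step = 256 // (1 << bits)
        offset = (BAYER_4X4[y % 4][x % 4] / 16.0 - 0.5) * step
        return quantize(min(255, max(0, int(value + offset))), bits)

    # A smooth red ramp: straight 5-bit truncation collapses it to bands of 8.
    ramp = list(range(64, 80))
    print([quantize(v, 5) for v in ramp])                      # hard bands
    print([dithered(v, 5, x, 0) for x, v in enumerate(ramp)])  # bands broken up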
 

MadMac_5

Ars Praefectus
3,700
Subscriptor
When I got my Geforce 2, everyone was talking about how awesome 24 bit color was but I didn't buy it: I figured that the speed advantage of running in 16 bit color would let me run many games in 1600x1200 (vs. 800x600) and the dithering from the higher rez would approximate the missing colors to some degree while giving me a sharper image overall. Was I crazy?

OTOH, I also thought id engine games had better dithering than Unreal engine games at the time so that came into play as well.
I would have said you were crazy; I was convinced of 32-bit colour's superiority when I saw Freespace 2's nebula effects rendered so much more smoothly in 32-bit colour versus 16-bit. I could run Freespace 2 at either 640x480x32 or 1024x768x16 on my S3 Savage4 at the time, and I ALWAYS chose 32-bit colour since it was pretty much no contest.
 

pauli

Ars Legatus Legionis
37,643
Moderator
When I got my Geforce 2, everyone was talking about how awesome 24 bit color was but I didn't buy it: I figured that the speed advantage of running in 16 bit color would let me run many games in 1600x1200 (vs. 800x600) and the dithering from the higher rez would approximate the missing colors to some degree while giving me a sharper image overall. Was I crazy?

OTOH, I also thought id engine games had better dithering than Unreal engine games at the time so that came into play as well.
Given your predilection for gaming via powerpoint slideshows, if you were getting an actual performance improvement from 16bit color, it was the right choice.

On the other hand - with the tools available at the time, it was just about always obvious if you were in 16bit or 24bit. It wasn't a barrier to playing, but man was 24bit a step up.
 

etr

Ars Scholae Palatinae
741
I no longer have my Verite 1000, but I still have my Verite 2100. ;)

I'm still chewing on options for my next build. After pricing my preferences, I have just about decided to go current/last gen. Based on the pricing I see now, there's a good chance I'll wind up with an RX 6700 or RX 6700 XT.

* I know AMD is going to be weaker on ray tracing, but ray tracing is still newish enough that I don't expect the equivalent nVidia cards to age all that well in that respect over 4 or 5 years, either.
* I typically value stability, so one would think that this would sway me. On balance, however, going with a line that's been out a while means that the worst bugs are likely to have been worked out, and if a new one arises I could theoretically go back to an older driver. That last point may give the XT a leg up, since it has been out a good while longer and is likely to be recognized in more driver versions.

* The performance per dollar metric looks to favor the RX 6700 over the XT. Ignoring upscaling, nVidia equivalents still seem to come in behind both. That said, spot sales could change the equation.

* I tend to run Linux on my aged hardware, so the high quality open source Linux drivers are a point in AMD's favor.

* I've mostly focused on x16 cards so far in the event I found a deal I wanted to throw in my current machine. It's old enough to have PCIe v2, so the bus bandwidth at x8 is equal to x4 PCIe v3...which is low enough to constrain the maligned RX 6500. (Rough lane math at the end of this post.)

That's my current thinking at least. I'd give the RTX 3060 12 GB more consideration if it actually got down to its $330 MSRP.
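
In case anyone wants to sanity-check the PCIe comparison in the list above, here's the rough arithmetic. The per-lane figures are the usual headline numbers after encoding overhead (8b/10b for 2.0, 128b/130b for 3.0), not measured values:

    # Approximate usable bandwidth per PCIe lane, one direction, in GB/s.
    PER_LANE_GBPS = {"PCIe 2.0": 0.500, "PCIe 3.0": 0.985}

    def link_gbps(gen, lanes):
        """Rough one-direction link bandwidth in GB/s."""
        return PER_LANE_GBPS[gen] * lanes

    print(f"PCIe 2.0 x16: {link_gbps('PCIe 2.0', 16):.1f} GB/s")  # ~8.0
    print(f"PCIe 2.0 x8:  {link_gbps('PCIe 2.0', 8):.1f} GB/s")   # ~4.0
    print(f"PCIe 3.0 x4:  {link_gbps('PCIe 3.0', 4):.1f} GB/s")   # ~3.9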
 

owdi

Ars Praefectus
3,972
Ah yes, I remember dropping a ridiculous $3000 on a custom prebuilt P2 300mhz rig with a Matrox Millenium card. The CPU alone was $1000, which was not easy to save up as a cook at Pizza Hut. I had no idea this card was no good for games. Quake sucked on it. I hated it, compiled a list of everything that was wrong with it, and made CompUSA take it back for a full refund.

Replaced it with my first custom build, using a P2 233mhz (overclocked to 300mhz), a much better Nvidia Riva 128, and about a dozen 80mm fans. Quake was much better.

Then came the 3dfx Voodoo2, making 800x600 possible. Yum.

Which brings me to my point. 16-bit is all you need, and 32-bit support on the Riva TNT was a marketing gimmick. Completely unusable. Enjoy your slide show suckers.
 

Xavin

Ars Legatus Legionis
30,167
Subscriptor++
* I typically value stability, so one would think that this would sway me. On balance, however, going with a line that's been out a while means that the worst bugs are likely to have been worked out, and if a new one arises I could theoretically go back to an older driver.
While people swear up and down that AMD has gotten better about drivers, and maybe they have, your logic is flawed. Most of the issues with video card drivers are in new games, it's a constantly moving target where the driver is updated individually for damn near every game that shows up on the Steam top sales charts. Nvidia is on the ball with those updates, basically always having them out before the release of the game, AMD has a much spottier track record and updates much less frequently.
 
While people swear up and down that AMD has gotten better about drivers, and maybe they have, your logic is flawed. Most of the issues with video card drivers are in new games, it's a constantly moving target where the driver is updated individually for damn near every game that shows up on the Steam top sales charts. Nvidia is on the ball with those updates, basically always having them out before the release of the game, AMD has a much spottier track record and updates much less frequently.
I recently got back into Nvidia, and they had to release another driver, because the previous one wasn't very good.

But then again, I'm not playing the latest games generally, as those have issues of their own (2077 was/is a POS), and I value stability.

If both sides are releasing buggy SW at the same time, it helps no one.

Basically, like the BIOS, at a certain point, I do not really update the video drivers, as I'm not going to squeeze much more performance out of it.
 

malor

Ars Legatus Legionis
16,093
While people swear up and down that AMD has gotten better about drivers, and maybe they have, your logic is flawed. Most of the issues with video card drivers are in new games, it's a constantly moving target where the driver is updated individually for damn near every game that shows up on the Steam top sales charts. Nvidia is on the ball with those updates, basically always having them out before the release of the game, AMD has a much spottier track record and updates much less frequently.
I haven't owned an AMD card for quite a while, but as a lifelong fan of retrogaming, I found AMD's drivers terrible. It was obvious that they didn't have the same kind of testing depth that NVidia had, and their driver bitrot was constant. You'd think old games would just keep working, but over and over I found that old games bombed horribly on AMD hardware of the time. You could count on them to run all the current titles, but if you dipped into your backlog, good freaking luck.

NVidia has always done well there, which is, I believe, part of the reason their driver install packages are so huge. They're hundreds of megabytes, and a great deal of that seems to be accumulated wisdom and fixes for old games.

On another forum, I kept getting static from someone I considered an AMD fanboy that this was all nonsense and that everything was all fixed, but I didn't believe him at the time (5 or 6 years ago.) Whether it's still a problem today, I have no idea, because I've been NVidia-only for probably a decade.
 

wireframed

Ars Legatus Legionis
16,733
Subscriptor
Ah yes, I remember dropping a ridiculous $3000 on a custom prebuilt P2 300mhz rig with a Matrox Millenium card. The CPU alone was $1000, which was not easy to save up as a cook at Pizza Hut. I had no idea this card was no good for games. Quake sucked on it. I hated it, compiled a list of everything that was wrong with it, and made CompUSA take it back for a full refund.

Replaced it with my first custom build, using a P2 233mhz (overclocked to 300mhz), a much better Nvidia Riva 128, and about a dozen 80mm fans. Quake was much better.

Then came the 3dfx Voodoo2, making 800x600 possible. Yum.

Which brings me to my point. 16-bit is all you need, and 32-bit support on the Riva TNT was a marketing gimmick. Completely unusable. Enjoy your slide show suckers.
I still have my Righteous3D Voodoo card in the parts pile. Eventually I’ll build a retro-gaming PC with a P200 MMX like the one it originally went into. That was how I played Unreal the first time, and MAN that blew my mind.

Never could afford the Voodoo2, so I eventually got a Matrox G400, because I studied graphics design and it sorta made sense and also was a decent gaming card. :p
 
  • Like
Reactions: owdi

Mister E. Meat

Ars Tribunus Angusticlavius
7,241
Subscriptor
The Rendition Verite was so far ahead of its time. It had a programmable core when everything else, including Voodoo, was stuck using fixed functions. Voodoo was obviously much faster though. I owned a 1000 and later on a 2200. I thought it was going to take over the 3d industry but the delays on the 2x00 series killed it.
 

Paladin

Ars Legatus Legionis
32,552
Subscriptor
Diamond (Multimedia) Stealth 3D II S220, I believe mine was; also thought it was a V2100
Yup, that one was, I think, the only 2100 card. Most were 2200s, which were just the same thing with a faster clock speed. The S220 card later got a new driver or firmware or something to unlock the same clock speed. Silly that they tried to paywall it off but never really sold an upgrade, so what was the point? :confused:
 
  • Like
Reactions: Axl

etr

Ars Scholae Palatinae
741
While people swear up and down that AMD has gotten better about drivers, and maybe they have, your logic is flawed. Most of the issues with video card drivers are in new games, it's a constantly moving target where the driver is updated individually for damn near every game that shows up on the Steam top sales charts. Nvidia is on the ball with those updates, basically always having them out before the release of the game, AMD has a much spottier track record and updates much less frequently.

The thing is, I've had nVidia cards for a couple of decades now and only rarely update the drivers. Stuff just generally works, and my impression of nVidia drivers would be far less favorable if I were going back to upgrade them each time I played a new game. (To be fair, it's been a while since I've been bleeding edge. I tend to play a smaller number of titles than most, even when I wind up putting in the same number of hours.)
 

Nevarre

Ars Legatus Legionis
24,110
The thing is, I've had nVidia cards for a couple of decades now and only rarely update the drivers. Stuff just generally works, and my impression of nVidia drivers would be far less favorable if I were going back to upgrade them each time I played a new game. (To be fair, it's been a while since I've been bleeding edge. I tend to play a smaller number of titles than most, even when I wind up putting in the same number of hours.)

In nVidia's defense, the GeForce Experience all but automates driver updates. You basically just have to say "yes" whenever a new one is released. In practice it's not always that simple, and GFE is a very poorly behaved program if you're on a network that can't reach nVidia's servers, but they have built out the infrastructure and software around those driver updates.

Other than the fact that all software can have security issues, driver updates are mostly about performance rather than compatibility, and they occasionally enable new features. Not upgrading drivers is leaving performance you paid for on the table.
 

Paladin

Ars Legatus Legionis
32,552
Subscriptor
Unfortunately my current machine at home has GeForce Experience telling me that it wants to update the driver but when I click the button to tell it to do so, it does nothing. Reboot, restart the software, tell it to check for updates... no change.

Oh well, it's a minor rev update so I'll wait a bit and see what happens.

But yeah, no one is immune to weird driver/software issues.
 

malor

Ars Legatus Legionis
16,093
I use the utility NVCleanstall, which downloads the driver, breaks it apart, and lets you choose which components to install. I go with the basic driver, HDMI audio, and PhysX drivers only, and leave all the other crapware behind. It's a very nice utility.

Oh, you do still get the normal screen control panel when choosing just those three things. You just don't get all the other spy- and crapware.
 
  • Like
Reactions: hansmuff

Ulf

Ars Legatus Legionis
12,551
Subscriptor++
I only wait when doing an NVidia driver update because it utterly screws up all my open windows, forcing me to close everything beforehand. Precision X1 gets really messed up if I leave it open. It also occasionally crashes the start menu, so I just reboot when that happens.
Occasionally a new version has a bad bug, but that is found out in the first week after an update.

My first GPU, technically, was a Mac 12 MB VooDoo2. I got that because the internal 6 MB ATi Rage Pro couldn't play Unreal at all.
My first PC GPU was a 1 GB ATi HD 5770. I did get the "Gray Screen of Death" bug. :rolleyes:
 

MadMac_5

Ars Praefectus
3,700
Subscriptor
I just re-installed my ATI Rage Pro AGP as part of my refresh of my trusty Windows 98 Pentium II box, and I decided to try it out in Need For Speed III since that's how I first experienced that game. The performance is actually surprisingly good, but one of the biggest weaknesses is that there's no way to disable Vsync in the drivers that I can find. This leads to pretty substantial framerate drops to 15 FPS, which is EXTREMELY annoying. If it weren't for that issue, the card would probably have performed a lot better compared to the Voodoo 1!
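
(That 15 FPS figure is roughly what you'd expect from vsync with double buffering: any frame that misses a refresh waits for the next one, so the effective rate snaps to whole-number divisors of the refresh rate. A minimal sketch, assuming a 60 Hz refresh:)

    import math

    def vsynced_fps(render_fps, refresh_hz=60):
        """Effective frame rate with vsync + double buffering: each frame's
        time gets rounded up to a whole number of refresh intervals."""
        intervals = math.ceil(refresh_hz / render_fps)
        return refresh_hz / intervals

    for raw in (65, 55, 35, 25, 16):
        print(raw, "->", vsynced_fps(raw))
    # 65 -> 60.0, 55 -> 30.0, 35 -> 30.0, 25 -> 20.0, 16 -> 15.0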
 

etr

Ars Scholae Palatinae
741
Not upgrading drivers is leaving performance you paid for on the table.

If I thought that was primarily general performance, I'd be persuaded.

At the end of the day, I suspect that most of the improvements are "game-specific optimizations", which some might label "driver cheats".

Whatever one calls them, they might help in the near term, and folks who want them are welcome to them.

Personally, I'm not sold. I'm not a pro, so my priority tends to be decent performance and low maintenance. If I buy a good card, then I can generally live without a few FPS, even if I need to turn down a setting or two. By the time a good card is old enough for driver optimizations to make the difference between tolerably playable and insufficient, it probably isn't getting too many of them because it's not for sale anymore.

That said, I tend to run things into the ground. I ran a GTX 570 until it died, and I replaced it with a GTX 1050--the cheapest thing that wouldn't underperform its predecessor--pending a complete rebuild.

In retrospect, I wish I'd gone higher than the GTX 1050 at some point, but I held off the rebuild longer than expected (life happens). If the RTX 30x0 were routinely available at MSRP, the ATI options would be less interesting. For all the "glut" talk, I haven't seen that yet. (To be fair, some MSI's are getting close at the moment.)
 

grommit!

Ars Legatus Legionis
19,295
Subscriptor++
It's been a decade since I last had an AMD card, and I'm considering getting a 7800XT. However, the driver package looks very different now.

I have questions:
  • Is there a way to control vsync/freesync on a per-game basis, the same way you can in the nvidia control panel (i.e. no GFE)? Does this require the full install?
  • What options are available under the tabs for the "minimal" install?
  • The nvidia control panel allows you to control the size of the driver shader cache. Is there an equivalent on the AMD driver, and what installation option is required to utilize this?
(edit) this guide seems accurate. And as there is only a "dark theme", it's a no-no for me due to astigmatism.

(edit 2) it appears there is a hardcoded limit of 4GB for the shader cache🤦‍♀️
 
Last edited:

mpat

Ars Praefectus
5,951
Subscriptor
I could understand if there was only ever one theme, but that is even more annoying. I'm guessing the driver-only install removes the ability to change settings on a per-game basis?
Probably - I have never done a driver-only install of the AMD driver, but I imagine that it will be minimal. AMD doesn't have any log-in requirements or similar, so there was never any reason for me to stick to a minimal install.
 

Cool Modine

Ars Tribunus Angusticlavius
8,539
Subscriptor
Ah yes, I remember dropping a ridiculous $3000 on a custom prebuilt P2 300mhz rig with a Matrox Millenium card. The CPU alone was $1000, which was not easy to save up as a cook at Pizza Hut. I had no idea this card was no good for games. Quake sucked on it. I hated it, compiled a list of everything that was wrong with it, and made CompUSA take it back for a full refund.
What?!? The Millenium was one of the fastest gaming cards of its time!!!

And then 3D happened. Matrox faded away into the sunset. For a while, you could still find their chips as the basic video output on server motherboards.

I think I had an ET6000-based video card in that era, with a Cyrix CPU. I don’t really remember what my first 3D card was. But after thinking about it, the phrase “Riva TNT 2” comes to mind. I believe I had a P3 500E, which ran at 750 from day 1.
 

owdi

Ars Praefectus
3,972
What?!? The Millenium was one of the fastest gaming cards of its time!!!

And then 3D happened. Matrox faded away into the sunset. For a while, you could still find their chips as the basic video output on server motherboards.

I think I had an ET6000-based video card in that era, with a Cyrix CPU. I don’t really remember what my first 3D card was. But after thinking about it, the phrase “Riva TNT 2” comes to mind. I believe I had a P3 500E, which ran at 750 from day 1.
I replaced the Matrox card with a Riva 128

For the cpu, instead of paying $1000 for a p2 300, I paid about $400 for a p2 233, then promptly overclocked it to 300 :)
 

Kaiser Sosei

Ars Praefectus
3,613
Subscriptor++
I bought a Voodoo5 5500 PCI for Mac right before the company collapsed. Not my best use of funds, though it still slapped the snot out of my previous Rage128 card.

As someone that did not have a PC with an AGP slot, I disagree. I loved mine. I was mostly playing Diablo then and glide looked so much better than OpenGL. Pretty sure I used it until I upgraded computers and got a 9700 Pro.

[attached image]

Maybe it still works.
 

malor

Ars Legatus Legionis
16,093
Matrox Mystique was my card back then; Millenium was too expensive as best I recall... MechWarrior 2-3D was wild.
The Millennium series used special RAM (maybe WRAM?) that made it faster, so you got better framerates on 2D games; it could push the PCI bus a little harder. But then the world went 3D, and Matrox fell behind.

What's kinda funny about that whole thing is we had VESA Local Bus, AGP, and PCIe, which were more or less designed around improving the 2D experience, fixing one of the biggest faults in the PC (limited video bandwidth.) But then the whole game market went 3D, and you hardly needed any bandwidth at all for 3D graphics back then. The normal routine was for games to load all the textures for a level into the VRAM, and then send only geometry during play, which hardly took any bandwidth. A regular old PCI card could offer 3D framerates just as good as PCIe cards, because the slot wasn't the limiting factor. At best, a PCIe card would save you a few seconds on level loads, while textures were being sent.
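
As a rough sanity check on that geometry-bandwidth point, here's some back-of-envelope arithmetic. The triangle count, vertex size, and frame rate are made-up but plausible late-90s figures, assuming indexed vertices, so treat it as an illustration rather than a measurement:

    # Estimate in-play geometry traffic and compare it to plain PCI's peak.
    BYTES_PER_VERTEX = 32          # position + colour + texture coords, roughly
    BYTES_PER_INDEX = 2
    TRIS_PER_FRAME = 10_000
    FPS = 60
    PCI_PEAK_MBPS = 133            # 32-bit / 33 MHz PCI, theoretical peak

    # With indexing, the vertex count is on the order of the triangle count.
    bytes_per_frame = TRIS_PER_FRAME * (BYTES_PER_VERTEX + 3 * BYTES_PER_INDEX)
    mb_per_s = bytes_per_frame * FPS / 1_000_000
    print(f"~{mb_per_s:.0f} MB/s of geometry vs {PCI_PEAK_MBPS} MB/s PCI peak")
    # ~23 MB/s -- a fraction of even plain PCI once the textures are resident.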

That didn't stop GPU sellers from harping on the latest PCIe standards, even though PCIe was useless for 3D for quite a few years. What actually benefited was 2D, but nobody was making those games anymore.

It's been good to see the major renaissance in 2D titles of late. Modern PCs are just devastatingly good at pushing pixels, so designers can go pretty nuts.

(edit, a couple days later: added a missing N to "Millennium".)
 
Last edited:

U-99

Guest
As someone that did not have a PC with an AGP slot, I disagree. I loved mine. I was mostly playing Diablo then and glide looked so much better than OpenGL. Pretty sure I used it until I upgraded computers and got a 9700 Pro.

View attachment 62715

Maybe it still works.
My quibble was more with 3dfx ceasing driver support for the fickle Mac platform and its smaller homebrew driver community than with the card itself.
 

Sunner

Ars Praefectus
4,330
Subscriptor++
I would have said you were crazy; I was convinced of 32-bit colour's superiority when I saw Freespace 2's nebula effects rendered so much more smoothly in 32-bit colour versus 16-bit. I could run Freespace 2 at either 640x480x32 or 1024x768x16 on my S3 Savage4 at the time, and I ALWAYS chose 32-bit colour since it was pretty much no contest.
While I absolutely loved the shit out of FS2, I think the game that convinced me was some vampire game whose name escapes me. In it you spent a significant amount of time walking around with a flashlight, and the light cone was absolutely gorgeous for the time, and it just looked so much better in 32 bit.

Also my first real 3D card (i.e. not Mystiques or S3 Virges) was a 3DLabs Permedia 2. Nowhere near as performant as the Rivas and Voodoos of its day, but a pretty good all-around card nevertheless. Could even stand a bit of overclocking.

Speaking of overclocking, I also had a Creative Labs TNT2 which was an absolute beast of an overclocker. I seem to remember the standard clocks were 150/175 MHz core/mem, and even without modding I got that to something silly like 200/230. I put a salvaged PPro heat sink on it as well, but as far as I recall the core didn't wanna go much higher anyway.
 

Kaiser Sosei

Ars Praefectus
3,613
Subscriptor++
Yeah, there was a dedicated group making sure the drivers stayed updated for some time. I stopped following when I upgraded but a quick search shows this site still has drivers all the way up to Windows 7.

 
  • Like
Reactions: Baenwort

owdi

Ars Praefectus
3,972
I remember being so jealous of Glide back when I had my Riva 128, gaming at 640x480

Then I bought a Voodoo2, omfg 800x600 was so crisp, and Quake II was f'n beautiful. But the hot stuff was running two cards in SLI for 1024x768 goodness.

I held on to that setup for way too long and missed all the TNT vs 3dfx drama. My next card was the insane GeForce 256 SDR. Crazy huge upgrade. And to this day I regret not buying the DDR version.

I remember dreaming about one day having a top of the line God box, but here I am rocking a 5600x and 2070 Super, lol. I mean, I can certainly afford a ridiculous system, but my kids want gaming computers like daddy. The more RGB the more their faces light up, so this Xmas they will get a pile of parts, with more LEDs than I can stomach, and we will do our first build. Can't wait!
 
Speaking of Voodoo 2s in SLI:


View: https://youtu.be/AA4axv0LZjA?si=54nL5sCDQ2i8yg1I


My first ever PC graphics card was some slow-arsed Cirrus Logic POS, then a generic S3 Trio, then the Permedia 2 with a 12 MB Voodoo 2 accompanying it, then the Riva TNT, and the rest is history. Did tinker a little bit with an S3 Savage 3D around the same time as the TNT in another system, because nvidia wasn't a sure thing until a bit after the TNT launched.