Has anyone built a PC exclusively for Folding@Home?

JimboPalmer

Ars Tribunus Angusticlavius
9,402
Subscriptor
I am more of a 'trailing edge' user, so none of my computers were new when I got them.

You need at least one CPU thread per GPU.

GPUs will slow down if the PCIe slot is electrically narrower than x8.
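
(If you want to see what a card is actually negotiating, lspci -vv will show the link width, and on Linux the same info is in sysfs. A rough sketch, assuming the standard /sys/bus/pci layout:)

```python
#!/usr/bin/env python3
"""Print the negotiated PCIe link width of each display adapter (Linux sysfs)."""
import glob
import os

def read_attr(dev, attr):
    with open(os.path.join(dev, attr)) as f:
        return f.read().strip()

for dev in sorted(glob.glob("/sys/bus/pci/devices/*")):
    try:
        if not read_attr(dev, "class").startswith("0x03"):  # 0x03xxxx = display controller
            continue
        current = read_attr(dev, "current_link_width")
        maximum = read_attr(dev, "max_link_width")
        print(f"{os.path.basename(dev)}: link is x{current} (device supports up to x{maximum})")
    except OSError:
        continue  # attribute not present (e.g. integrated graphics); skip it
```

Anything reporting x4 or x1 on a discrete GPU is a sign the slot wiring (or a riser) is the bottleneck.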

So, bang for your buck: figure out how many top-end GPUs you can afford, find a motherboard that supports that many x16 PCIe 3.0 slots, get a PSU that can feed all that power, get a CPU that fits the motherboard and has at least as many threads as you have GPUs, and install Linux.

(Windows has 'issues' with PCIe bus utilization.)
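
If you want to rough out that shopping list before spending anything, a back-of-the-envelope check along these lines helps. All the wattages and counts below are placeholders, not measurements, so plug in your own parts:

```python
#!/usr/bin/env python3
"""Sanity-check a multi-GPU folding build: PSU sizing and threads-per-GPU.
Every number here is a placeholder estimate -- substitute your actual parts."""

gpus          = 3      # how many top-end GPUs you can afford
watts_per_gpu = 300    # board power of the chosen card (placeholder)
cpu_watts     = 95     # CPU TDP (placeholder)
cpu_threads   = 8      # threads on the chosen CPU
other_watts   = 75     # motherboard, RAM, drives, fans (guess)
headroom      = 1.25   # keep the PSU well below its rating

load = gpus * watts_per_gpu + cpu_watts + other_watts
print(f"Estimated load: {load} W -> look for a PSU of at least {load * headroom:.0f} W")

if cpu_threads < gpus:
    print("Warning: fewer CPU threads than GPUs -- the GPUs will sit idle waiting to be fed.")
```

The ~25% headroom is a habit rather than a rule; running a PSU well under its rating also tends to keep it near its efficiency sweet spot.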

Alternatively, buy a GPU, eBay a PC with a strong PSU, and repeat. (If you are wedded to Windows, this may be your best bet.)
 

continuum

Ars Legatus Legionis
94,897
Moderator
Tons of people have-- JimboPalmer's advice is pretty good.

Some motherboards, especially mining boards, have a ton of PCI-e x1 slots; if your app isn't PCI-e bandwidth intensive, that can help you scale out. Or, depending on your budget, Intel's high-end desktop (HEDT) line of desktop/workstation processors with up to ~44 PCI-e lanes is now (relatively) affordable if you have the $$$ and just need a lot of PCI-e lanes.
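
For the lane math, a trivial sketch like this shows what those HEDT lane counts actually buy you. The reserved-lane figure is a guess for an NVMe drive or similar, not a spec:

```python
#!/usr/bin/env python3
"""Rough PCIe lane budget: how many GPUs can a CPU host at a given link width?
Illustrative numbers only; check the actual board's slot wiring."""

def gpus_per_cpu(cpu_lanes: int, lanes_reserved: int, width_per_gpu: int) -> int:
    """lanes_reserved covers things like an NVMe drive hanging off CPU lanes."""
    return max(0, (cpu_lanes - lanes_reserved) // width_per_gpu)

for width in (16, 8, 4):
    n = gpus_per_cpu(cpu_lanes=44, lanes_reserved=4, width_per_gpu=width)
    print(f"44-lane HEDT CPU, 4 lanes reserved: up to {n} GPUs at x{width}")
```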
 

continuum

Ars Legatus Legionis
94,897
Moderator
I'm really not sure what the best deals are on hardware, but given that most of F@H's bang for the buck now is GPU-driven, the way to go is probably: find a motherboard/CPU/memory combo that does the job and ideally has two PCI-e x16 slots (which will probably end up as x8/x8 electrical on most desktop CPUs), get enough PSU for two GPUs, and then maximize your spend on bang-for-buck GPUs.

Looks like people are starting to retire Haswell-era stuff for reasonable prices:
viewtopic.php?f=12&t=1465808

That's higher-end than you need to go, but it's a start. Ivy Bridge or Sandy Bridge era stuff would also work, but those are one and two generations (and likely at least one and two years) older, respectively, and at that point you're getting pretty well beyond reasonable service life.

I think the biggest key might be bang for the buck GPUs. I don't know what does best in F@H off the top of my head, but there's some discussion here for nVidia:
viewtopic.php?f=6&t=1466097
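
If you end up comparing cards, points-per-day per dollar is the number to rank on. A trivial sketch -- the PPD and price figures below are placeholders, so pull real ones from that thread or your own logs before trusting the ordering:

```python
#!/usr/bin/env python3
"""Rank candidate GPUs by points-per-day per dollar. Placeholder data only."""

candidates = {
    # name: (estimated PPD, price in USD) -- fill in real figures
    "GPU A": (1_000_000, 400),
    "GPU B": (600_000, 200),
    "GPU C": (300_000, 80),
}

ranked = sorted(candidates.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True)
for name, (ppd, price) in ranked:
    print(f"{name}: {ppd / price:,.0f} PPD per dollar")
```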
 

Deleted member 32907

Guest
I've not built high end F@H rigs, but it's pretty straightforward!

Couple sockets worth of Xeon, a bunch of ECC RAM, 8 high end GPUs, triple power supplies, chassis fans to match, maybe some creative re-engineering of cooling solutions, a 240V circuit to improve power supply efficiency, and...

...but I was hoping for ~$500.

Ah. Well then. Never mind.

======

I have a PC that's more or less dedicated to F@H/BOINC, but it's just an old scrap system I had years ago that I used for some GPU dev. Currently I think it's got an i7-3820 and a pair of 980s, and I literally just use it for compute and office heat. Thinking of tossing it outside or building a vented corner of my office to get all that heat out, though. Very welcome in the spring/fall/winter, unwelcome in the summer.

Poke around your local Craigslist for someone's "gaming PC" that they're desperate to sell, lowball them, and you've got a halfway tolerable F@H rig for cheap. Throw the CPUs at Rosetta, just leave a core for feeding the GPU.
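
If you do split the box like that, the only fiddly bit is making sure BOINC doesn't eat the threads the F@H GPU slots need. A small sketch of the arithmetic -- the GPU count is assumed from the pair-of-980s box above, so change it to match your machine; the result goes into BOINC's "use at most N% of CPUs" preference:

```python
#!/usr/bin/env python3
"""Work out the BOINC 'use at most N% of CPUs' setting that leaves one
thread free per GPU for Folding@Home (a rough rule of thumb, not gospel)."""
import os

logical_cpus = os.cpu_count() or 1
gpus = 2  # assumed from the example box above; set to your GPU count
reserved = min(gpus, logical_cpus - 1)
pct = 100 * (logical_cpus - reserved) / logical_cpus
print(f"{logical_cpus} threads, reserving {reserved} for GPU feeding -> "
      f"set BOINC to use at most {pct:.0f}% of CPUs")
```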