ShuggyCoUk's 3090 receptacle build (if you can call it that) thread

ShuggyCoUk

Ars Tribunus Angusticlavius
9,975
Subscriptor++
IMG_7015.jpg
Found it. And this box can be cut up to use as PDM in a Lego box, so that's a plus.
IMG_7016.jpg
Pretty; I really like the top-mounted connections in person.
IMG_7017.jpg
It too is packed
IMG_7018.jpg
Loving the cable management in there. Looks like they put in the right modular connectors for the 3090 too
IMG_7019.jpg
Let's boot this up "as is" and see how it goes.
 

ShuggyCoUk

Ars Tribunus Angusticlavius
9,975
Subscriptor++
Let's use the age-old mounting tech that's done me proud since spinning rust stopped being something I cared about: Velcro!

IMG_7022.jpg
All that lovely cable management is slightly less nice now, but not too bad (again, so glad I asked them to include these cables!)

IMG_7023.jpg

All that for an extra 1.5 TiB of local NAND. Woo hoo! (I am seriously missing having access to a distributed NAS maxing out 100G Ethernet with petabytes available, so every little helps.)

That worked. I'm wondering about booting Ubuntu off the other M.2 drive and using the UEFI boot menu to switch operating systems, since Windows tends to stomp on things. But that's for another time (rough sketch at the end of this post). For now, the main event: the 3090.

Sadly that needs to happen after bedtime (for my son!)
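
On the Ubuntu-off-the-other-M.2 idea, this is roughly what I have in mind for the Linux side. A minimal sketch, untested; it assumes the usual efibootmgr tool, and the boot entry number is purely illustrative:

```python
# Minimal sketch: list the UEFI boot entries, then arm a one-shot "BootNext"
# so the next reboot goes into Ubuntu without letting Windows reorder anything.
import subprocess

# Show current entries, e.g. "Boot0000* Windows Boot Manager", "Boot0003* ubuntu"
subprocess.run(["efibootmgr"], check=True)

# "0003" is illustrative; use whatever number efibootmgr actually lists for ubuntu.
subprocess.run(["sudo", "efibootmgr", "--bootnext", "0003"], check=True)
```

The nice part of BootNext is that it only applies to the next boot, so the default order stays whatever the firmware says.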
 
  • Like
Reactions: Paladin

ShuggyCoUk

Ars Tribunus Angusticlavius
9,975
Subscriptor++
My GPU (mini) saga

IMG_7024.jpg

It has pretty lights...
IMG_7026.jpg
but really poor clearance for the DisplayPort connectors :/
IMG_7028.jpg
I couldn't get that to work. So HDMI for now.


What am I losing by using HDMI? Anything? It's not a G-Sync etc. monitor and I've no plans to get one. Old 2560x1600 Dell that I love for the pixel count vs. size trade-off.

If need be I can borrow a decent Dremel and carve a little extra space.
 

DaveB

Ars Tribunus Angusticlavius
7,274
Just for the hell of it, here's my one and only run of Time Spy with an i7-13700K/RTX 4070 Ti Super combo.

The Graphics score difference makes sense, as TechPowerUp shows the 3090 performing at 86% of the 4070 Ti Super (25,236 * 0.86 = 21,702). Yours being a bit lower is likely due to the slightly higher 2560 x 1600 resolution vs. 2560 x 1440. But the CPU score of the 7800X3D is just 13,162 vs. 19,314 for the i7-13700K. That works out to the 7800X3D running at 68% of the performance of the i7-13700K. This doesn't seem right; I would think these scores would be much closer. Is it something in the Time Spy benchmark that causes this?

3DMark-Time Spy-24126-25236-19314-13700K-DDR5-6000-RTX4070TiSuper-Win 11.jpg
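
Spelling that arithmetic out (nothing here beyond the scores already quoted):

```python
# Graphics: TechPowerUp's relative-performance figure applied to my 4070 Ti Super run
rtx4070tis_graphics = 25_236
expected_3090_graphics = rtx4070tis_graphics * 0.86   # ~21,702

# CPU: ratio of the two Time Spy CPU scores
cpu_7800x3d = 13_162
cpu_13700k = 19_314
cpu_ratio = cpu_7800x3d / cpu_13700k                  # ~0.68

print(f"expected 3090 graphics: {expected_3090_graphics:,.0f}")
print(f"7800X3D CPU score vs. 13700K: {cpu_ratio:.0%}")
```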
 
  • Like
Reactions: ShuggyCoUk

ShuggyCoUk

Ars Tribunus Angusticlavius
9,975
Subscriptor++
Thanks for the comparison point.

According to this I'm about where I should expect - which was the purpose of the benchmark for me.
If it was just games I'd likely have gone with the RTX 4070 Ti Super (about the same money but better FPS, less power, and new!), but I wanted the extra VRAM for playing with ML locally; it's annoying that Nvidia still gates that behind the very top-end card each generation.
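
Since the 24 GB was the whole point, here's the quick sanity check I'll run to confirm it's all visible to the ML side. A minimal sketch assuming a PyTorch + CUDA install; nothing in it is specific to this box:

```python
import torch

# Report the CUDA device PyTorch sees and how much VRAM it exposes.
assert torch.cuda.is_available(), "no CUDA device visible"
props = torch.cuda.get_device_properties(0)
print(props.name)                                  # e.g. "NVIDIA GeForce RTX 3090"
print(f"{props.total_memory / 1024**3:.1f} GiB")   # should be roughly 24 GiB on a 3090
```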
 

DaveB

Ars Tribunus Angusticlavius
7,274
Thanks for the comparison point.

According to this I'm about where I should expect - which was the purpose of the benchmark for me.
If it was just games I'd likely have gone with the RTX 4070 Ti Super (about the same money but better FPS, less power, and new!), but I wanted the extra VRAM for playing with ML locally; it's annoying that Nvidia still gates that behind the very top-end card each generation.
Yeah, as I wrote, the RTX 3090 scored as expected. Plus, I fully understand your need for more VRAM. It's the CPU score I can't understand.
 

malor

Ars Legatus Legionis
16,093
I've seen someone post online that it might be a good idea to loosen all the screws on the motherboard to try to re-align the case and GPU. Seems reasonable, since the system builder wouldn't have had any GPU in there at all when setting it up.
That makes sense. There are usually either six or nine motherboard screws, depending on how big it is. They're very roughly in a grid layout, although they don't line up exactly. There should be three on the back plate under the card slots and I/O shield area, three spaced across the middle of the board, and then if it's big enough, three more on the edge opposite the I/O shield. (edit: I'm pretty sure I see one of the far-edge screws, so your board is probably a nine-screw type.)

Don't take them out, just loosen them all, along with the screws holding down any expansion cards (like the GPU), and then wiggle a bit to get a good square fit. You might need to slide the whole board toward the GPU slot a tiny bit, to get that DisplayPort socket clear. Then tighten them all down again. You don't need to crank them hard, just get them snug.

It might be easiest to actually take out the GPU screw, and then replace it after you've got things settled.
 
  • Like
Reactions: ShuggyCoUk

hobold

Ars Tribunus Militum
2,657
CPU score guesswork.

1.
The 13700K has 8P + 8E cores, while the 7800X3D has 8 cores total.
If the benchmark in question is heavily multithreaded, the 13700K would be expected to have a strong advantage.

2.
The 7800X3D clocks notably lower than its non-X3D sibling without stacked cache, the 7700X. This is true even in all-core loads because of the decidedly lower power limit of the X3D model variants.
If the benchmark in question is running on a small data set, then the score would scale mainly with clock speed.

Hypothesis: the 3DMark CPU score is derived from an all-core workload with a small data footprint. This would explain the relative ranking quite well.
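
For what it's worth, raw thread counts alone already land close to the observed gap. A back-of-the-envelope sketch that deliberately ignores clocks and IPC (thread counts are the published ones, scores are the ones quoted above):

```python
# Does thread count alone roughly explain the CPU-score gap?
threads_13700k = 8 * 2 + 8      # 8 P-cores with hyperthreading + 8 E-cores = 24 threads
threads_7800x3d = 8 * 2         # 8 cores with SMT = 16 threads

thread_ratio = threads_7800x3d / threads_13700k   # ~0.67
score_ratio = 13_162 / 19_314                     # ~0.68, the Time Spy CPU scores above

print(f"thread ratio: {thread_ratio:.2f}")
print(f"score ratio:  {score_ratio:.2f}")
```

That the two ratios nearly coincide is consistent with hypothesis 1, though it could also be coincidence.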
 
  • Like
Reactions: ShuggyCoUk

DaveB

Ars Tribunus Angusticlavius
7,274
CPU score guesswork.

1.
The 13700K has 8P + 8E cores, while the 7800X3D has 8 cores total.
If the benchmark in question is heavily multithreaded, the 13700K would be expected to have a strong advantage.

2.
The 7800X3D clocks notably lower than its non-X3D sibling without stacked cache, the 7700X. This is true even in all-core loads because of the decidedly lower power limit of the X3D model variants.
If the benchmark in question is running on a small data set, then the score would scale mainly with clock speed.

Hypothesis: the 3DMark CPU score is derived from an all-core workload with a small data footprint. This would explain the relative ranking quite well.
So, the E-cores have some value. I remember all the AMD fans mocking those little and slower cores as a useless gimmick.

It seems odd that 3DMark would use a small data set for a CPU score for a GPU benchmark.
 

ShuggyCoUk

Ars Tribunus Angusticlavius
9,975
Subscriptor++
I'll benchmark in a real game some time, once I decide which to play :)

The motherboard shift was interesting.
  • I have an 8-screw motherboard
  • I actually only have 7 1/2 screws in it.
    • The middle one is missing its head. I wondered if I'd bashed it myself getting the GPU out, but nope: you can see it in the prior photos.
  • I think I got about 0.5 mm of shift, and that was it.
That's not enough to get the DisplayPort in cleanly. If I remove the third slot cover and tilt the card just a little I can seat the DisplayPort cables I have, but that's not a great idea, I reckon.
I have HDMI, and I only need the one screen for the foreseeable future, so I'll try a cheap DisplayPort cable with a narrower plug at some point (Amazon's basic ones look pretty tiny, actually) and, if that fails, Dremel as and when I need it.
 

hobold

Ars Tribunus Militum
2,657
I remember all the AMD fans mocking those little and slower cores as a useless gimmick.
There exists at least one AMD fan who recognizes the E cores as what they are: a Cinebench accelerator that solves Intel's problem of having a non-competitive transistor density. At the cost of yield and power, and with scheduler woes, and causing AVX-512 backpedalling.

But Intel's immediate problems were solved: Cinebench is no longer an uncontested win for Ryzen. :)

(I actually think the E-cores are a heroic somersault out of the hole Intel had dug for themselves. A second rate solution, in my not so humble opinion, but very creative and very effective. Once Intel has their new fabbing tech up and running, I am all for keeping some E-cores or ZenN-"C" cores in addition to a healthy number of all-out-performance cores. But please with 100% compatible ISA between core tiers.)
 
  • Like
Reactions: diabol1k

Doomlord_uk

Ars Legatus Legionis
24,892
Subscriptor++
Is that an actual 3XS system, or just 3XS packaging?

HDMI has lower peak bandwidth than DP, but this only matters if you want high frame rates on a big monitor. For reference, my 38" ultrawide Alienware (3840 x 1600) is limited to 85 Hz on its HDMI input but can do 144 Hz on DP. That's quite a difference :)
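
For a rough sense of why, here's the uncompressed-bandwidth arithmetic for that panel. A back-of-the-envelope sketch: 8-bit RGB, ignoring blanking overhead, with the commonly quoted usable link rates for HDMI 2.0 and DP 1.4:

```python
def gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw uncompressed video data rate in Gbit/s, ignoring blanking."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 3840 x 1600 ultrawide
print(f"{gbps(3840, 1600, 85):.1f} Gbps at 85 Hz")    # ~12.5 Gbps
print(f"{gbps(3840, 1600, 144):.1f} Gbps at 144 Hz")  # ~21.2 Gbps

# Usable link rates, roughly: HDMI 2.0 ~14.4 Gbps, DP 1.4 (HBR3) ~25.9 Gbps,
# so 144 Hz only fits down the DP link on this monitor.
```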

Looking at the pic of your graphics card's fit, I'd say you could move that a millimeter downwards (in the pic) at least. Looks like you've just tightened the screws up too 'high' (relative to the pic).
 

ShuggyCoUk

Ars Tribunus Angusticlavius
9,975
Subscriptor++
Actual 3XS system, but I put the graphics card in separately as it was second hand.

I can wiggle it a little if I loosen the screws, but it's definitely putting lateral load on the card if I do that. My DP cables are fairly old and very chunky, so it might just be that that's the issue.

Since my monitor is IIRC limited to 60 Hz it’s not an issue for now :)
 

ShuggyCoUk

Ars Tribunus Angusticlavius
9,975
Subscriptor++
Thanks for posting your build! I’m eyeing the Fractal cases for an upcoming build that I’m still mulling.
Apart from the alignment with the GPU, I think the Fractal North is lovely.

With hindsight I'd have specced a few more Type-C connectors; just one on the top isn't ideal. I can fix that with a daughter card if I care (unless there are more headers on the motherboard, in which case it's just a backing plate in a free slot).
 

ShuggyCoUk

Ars Tribunus Angusticlavius
9,975
Subscriptor++
It’s probably a locking standoff? Some cases come with one preinstalled in the very middle position.
I'm not certain. The metal I can see has that "bright, slightly uneven level, but still 'clean'" look I associate with a shear. But it's a long time since my (abortive) materials science / metallurgy 101 stuff... It appears to have no ill effects so far.
 

Knightmare

Ars Scholae Palatinae
1,075
Another thanks for posting this. Excellent choice of CPU. Good to see a Noctua tower cooler in use instead of the 120mm AIO liquid cooler in your previous build. Those have basically no benefit over air cooling.

Also very nice to see a Dell U3014 in use. I could never afford one when they were available. The closest I ever came was an "open box" U2713H I bought off eBay 6 years ago, and promptly returned due to it having 6 pressure bruises.

As far as the SSDs in your build go, I have to give a thumbs-down to the use of a Samsung QVO (QLC) drive. Those have poor endurance, and I would personally rather use spinning rust.
 
  • Like
Reactions: continuum

Jeff3F

Ars Tribunus Angusticlavius
6,825
Subscriptor++
I will say that after I built my box I upgraded to 1 Gb/s internet service, and Steam downloads can sometimes be impacted by not-fast-enough storage (i.e. it pauses downloading to catch up with file processing). Lots and lots of disk I/O going on.

My SSD (Samsung 960 EVO from 2017) was well regarded back in the day, but Samsung makes a wide range of devices and I would scrutinize for potential bottlenecks.
 

Knightmare

Ars Scholae Palatinae
1,075
re: the Samsung. It was one I already had; it's fine for storing photos and media AFAICT. Also, I now just have all my photos on the new machine without copying a TB. All the SMART parameters on it are happy.
If you have anything of real value on that QLC drive, I would definitely keep a backup on an external hard drive, or maybe even BD-R M-Discs. SSDs, especially QLC drives, are not good for long term archival. Here's why:

QLC drives use 16 distinct charge levels in their flash cells to store 4 bits of data (2^4 = 16). For reference, TLC drives use 8 distinct charge levels, MLC drives use 4 charge levels, and SLC drives use just 2 charge levels. Because of physics, electrons will gradually leak out of those flash cells over time (a period of years). At some point, enough electrons will have been lost that those 16 charge levels in a cell cannot be maintained, and the data becomes corrupt. This bit rot will obviously happen faster with a QLC drive than with other types of SSDs that pack fewer bits into a flash cell.
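
The charge-level arithmetic above, as a quick table (nothing assumed beyond the bits-per-cell figures already quoted):

```python
# The number of distinct charge levels a cell must resolve grows as 2^bits,
# so the margin between adjacent levels shrinks accordingly.
cell_types = {"SLC": 1, "MLC": 2, "TLC": 3, "QLC": 4}

for name, bits in cell_types.items():
    levels = 2 ** bits
    margin = 1 / (levels - 1)   # spacing between adjacent levels, normalized to SLC = 1.0
    print(f"{name}: {bits} bit(s)/cell, {levels:2d} levels, relative margin {margin:.2f}")
```

Less margin per level means less charge loss is needed before a cell reads back as the wrong value.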
 

malor

Ars Legatus Legionis
16,093
If you have anything of real value on that QLC drive, I would definitely keep a backup on an external hard drive, or maybe even BD-R M-Discs. SSDs, especially QLC drives, are not good for long term archival. Here's why:

QLC drives use 16 distinct charge levels in their flash cells to store 4 bits of data (2^4 = 16). For reference, TLC drives use 8 distinct charge levels, MLC drives use 4 charge levels, and SLC drives use just 2 charge levels. Because of physics, electrons will gradually leak out of those flash cells over time (a period of years). At some point, enough electrons will have been lost that those 16 charge levels in a cell cannot be maintained, and the data becomes corrupt. This bit rot will obviously happen faster with a QLC drive than with other types of SSDs that pack fewer bits into a flash cell.
SLC cells can last a very long time, particularly at low density. There's a lot of PS1 memory cards out there with perfectly readable data on them, for example.

You shouldn't generally trust an SSD to hold data longer than a year if it's disconnected from power. The cells get rewritten occasionally when the drive has power, so it'll last probably for at least a decade, but remove power and the data starts to decay.

The hotter the flash cells were when written, the longer the data will last (so you want the flash cells running hot), but the hotter the drive is when unpowered, the faster the charge dissipates. At least as of a few years ago, a drive that wrote data at room temperature, and was then stored at room temperature, would generally retain it for about two years, but I think that stat may have been for TLC drives. QLC drives may not last that long.

If the drive is hot enough in storage (I think circa 50C, so that wouldn't normally happen except in direct sunlight), you can lose data within a couple of weeks.
 