How often for a new Build?

Lord Evermore

Ars Scholae Palatinae
1,490
Subscriptor++
The period from the late 90's on (Socket 7/Super Socket 7 to about Skylake), through the first mining crisis, was pretty much the good times.
In 1998 I moved out of my hometown to live with my girlfriend, and her brother and I bonded (as much as I can bond) over technology. For several years we went to computer shows at convention centers and racetracks and things like that, drooling over all the components available that we couldn't afford and didn't really need. Before buying things online became cheaper and you could do it anytime instead of every few months, when there were quality parts from many brands, no throwaway cheap shit from fly-by-night Chinese brands on marketplaces (and you got cards and phone numbers from the people you bought from, and many of them had local stores), when there was a real difference between getting the lowest-cost types of things and the higher-end parts (16X CD-ROM vs 24X, IDE vs SCSI), and there were still people coding software by hand to make it optimized to fit in minimal RAM space and not waste hard drive space. Those few years were the best time of my life, looking back.
 

Lord Evermore

Ars Scholae Palatinae
1,490
Subscriptor++
@Lord Evermore My SCSI Plextor UltraPlex 40max was one of those things I (at the time) had to really save up for and it was King of the Hill for Years for ripping etc. That sort of differentiator is gone now. And I mourn it.
I still have the box from my Plextor 12Plex, which I think I bought in 1998 to connect to my AMD K6 system (or possibly the Cyrix 6x86 system). It holds collectible stuff like Star Trek Christmas ornaments, a birth year coin collection, and other little things.

I just looked up a picture and I'd forgotten that CD-ROM drives back then had audio jacks and even volume wheels on them, because sound cards also had a connector for plugging the CD-ROM into them because that's how they played audio CDs early on.
 
  • Like
Reactions: hansmuff

Made in Hurry

Ars Praefectus
4,553
Subscriptor
I never really dabbled with SCSI anything; I remember drooling over the Cheetahs that people owned back then. Finding the 7200 RPM drives with the maximum amount of cache on them was my thing instead, since I never could afford top-of-the-line stuff like SCSI for storage, but I was always lucky in some way. Never bitten by the IBM Deathstars, or the Iomega Zip drive clicks of death. My K6-2 350 I killed accidentally by feeding it 2.7V through a wrong jumper configuration while it was overclocked to 412 MHz, I think, and when I swapped in a K6-III 500 for a short while, the chip just disintegrated in my hands when I removed it.
What I do not miss is the IRQ fudging you sometimes had to deal with to get everything working.

I hope the coming CAMM2 memory modules will keep things upgradable; I assume this standard will eventually come to the desktop.
 

Nevarre

Ars Legatus Legionis
24,110
In 1998 I moved out of my hometown to live with my girlfriend, and her brother and I bonded (as much as I can bond) over technology. For several years we went to computer shows at convention centers and racetracks and things like that, drooling over all the components available that we couldn't afford and didn't really need. Before buying things online became cheaper and you could do it anytime instead of every few months, when there were quality parts from many brands, no throwaway cheap shit from fly-by-night Chinese brands on marketplaces (and you got cards and phone numbers from the people you bought from, and many of them had local stores), when there was a real difference between getting the lowest-cost types of things and the higher-end parts (16X CD-ROM vs 24X, IDE vs SCSI), and there were still people coding software by hand to make it optimized to fit in minimal RAM space and not waste hard drive space. Those few years were the best time of my life, looking back.

I miss the Computer Shopper magazines that weighed a ton, and the traveling hardware shows that sometimes had great stuff or great bargains and sometimes the oddball rejects. I know at least two friends who bought computer cases from a guy who had a bunch of dirt cheap (like $20) cases from China, with the 'gotcha' being that all of the power buttons were made to look like golf balls. It was unaesthetic, but it worked. Another key ingredient was that it was relatively easy to get most Microsoft OSes, oftentimes legally through the MSDE program. You could cheaply mess around with stuff at home and learn, with the main cost being hardware, in a pre-virtualization era.

I just did lots of interesting things back then. Now a lot of what I'm doing is a veneer of civility on top of 30+ year old technology or processes that I've been doing one way or the other for the last 30 years with incremental small changes as things have progressed.
 
  • Like
Reactions: Lord Evermore

Nevarre

Ars Legatus Legionis
24,110
I hope the coming CAMM2 memory modules will keep things upgradable; I assume this standard will eventually come to the desktop.

As long as there are x86 desktops, there will be upgradeable RAM. Apple seems determined to push the RAM-as-part-of-the-SoC mission, and hopefully, if desktops migrate to ARM, other ARM vendors won't follow suit. RAM will probably always be slightly less reliable than a CPU anyway, and it's a primary component that you'd want to replace when it fails even if you never want to upgrade.

At work I just had to retire an otherwise perfectly good-but-just-out-of-warranty laptop because the soldered in RAM had a sudden failure. Consumers may not see systems in enough numbers to notice failures like that, but hopefully those of us who deal with enough volume keep pushing for RAM to be replaceable if at all possible.

CAMM2 specifically, though, has a form factor problem for desktops. It might be worth it at some point to get the RAM profile lower and bring latency down by bringing it physically closer to the CPU, but because it lies totally flat against the board, it takes up PCB real estate on the motherboard that can't be used for other components. Since the cost of a PCB is tied to its area, you want to use that space efficiently to keep costs down (and CAMM2 is probably going to be slightly more expensive than DIMM slots forever, since the wiring is more complex). For now you can get much more RAM density for the same real estate using DIMMs. For laptops, the problems revolve around heat and device thickness, and that's where it makes a lot of sense.

I should be getting my first CAMM (one, Dell proprietary) system in the next week or so, for what it's worth. It's a design that allows high RAM density on mobile, and that upgradeability and replaceability is very much appreciated. It'll be a 64 GB mobile workstation, with a design that allows up to 128 GB in a regular laptop form factor. The 128 GB configuration ain't cheap, but it exists and the parts are serviceable.
 

redleader

Ars Legatus Legionis
35,019
CAMM2 specifically, though, has a form factor problem for desktops. It might be worth it at some point to get the RAM profile lower and bring latency down by bringing it physically closer to the CPU
FWIW with the speed of light in a PCB trace being about 6 picoseconds per millimeter, the extra few centimeters you save with CAMM don't add up to anything at all compared to the 50-100 nanosecond memory latency. CAMM on desktop makes more sense if you want to use LPDDR to save power and/or have more bandwidth, but usually that doesn't even matter much. I think normal desktop will just keep using DIMMs, although we'll probably be stuck with 2 DIMMs max at the DDR6 generation, or maybe one after it.
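For what it's worth, here's a rough sanity check of that claim in Python, using the ~6 ps/mm figure above and assuming (hypothetically) that CAMM saves about 5 cm of trace:

```python
# Back-of-envelope check: trace delay saved vs. total DRAM latency.
# The 6 ps/mm figure is from the post above; the 50 mm saving and the
# 75 ns latency midpoint are illustrative assumptions, not measurements.

PS_PER_MM = 6          # approximate propagation delay in a PCB trace
TRACE_SAVED_MM = 50    # assume ~5 cm of trace saved by moving RAM closer
DRAM_LATENCY_NS = 75   # middle of the 50-100 ns range cited above

saved_ns = PS_PER_MM * TRACE_SAVED_MM / 1000.0   # picoseconds -> nanoseconds
fraction = saved_ns / DRAM_LATENCY_NS

print(f"Delay saved: {saved_ns:.2f} ns ({fraction:.2%} of total DRAM latency)")
# -> Delay saved: 0.30 ns (0.40% of total DRAM latency)
```

Even doubling that for a round trip, it's well under 1% of the access time.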
 

Nevarre

Ars Legatus Legionis
24,110
And no desktop memory controllers, AFAIK, support LPDDR. At least this is a theoretical way to allow LPDDR in a non-soldered-to-the-motherboard way.

As long as capacity keeps up, the lower slot count is manageable if not ideal for leaving slots open for upgrades. We've been dealing with limitations since ~Zen2 and most of those have turned out to be very manageable issues.
 

redleader

Ars Legatus Legionis
35,019
It's hard to see the point of CAMM on the desktop since it's the answer to the question of how to put LPDDR on a stick, and we can't use LPDDR on desktop CPUs (except Apple Silicon). Maybe some day soon someone will build a non-Apple desktop ARM around the Snapdragon X Elite.
Intel puts LPDDR controllers on their mobile CPUs each generation, so any OEM can buy the BGA part and stick it on a desktop system with CAMM if they really want. Similarly, if Intel were interested, it wouldn't be that hard for them to make a socketed version of the same CPU dies they're already making millions of. The issue is that LPDDR doesn't serve any particularly compelling purpose on a desktop CPU as compared to DDR, so there is little reason for these products to exist.
 
  • Like
Reactions: jwbaker
Intel puts LPDDR controllers on their mobile CPUs each generation, so any OEM can buy the BGA part and stick it on a desktop system with CAMM if they really want. Similarly, if Intel were interested, it wouldn't be that hard for them to make a socketed version of the same CPU dies they're already making millions of. The issue is that LPDDR doesn't serve any particularly compelling purpose on a desktop CPU as compared to DDR, so there is little reason for these products to exist.
Right, exactly. Maybe you could save 100mW, but at low power the overwhelming efficiency problem faced by desktops is their AC/DC conversion loss. Would you redesign the NUC with CAMM instead of SO-DIMM? Their CPUs already support LPDDR...
 
  • Like
Reactions: redleader

molo

Ars Legatus Legionis
14,786
We are there already, technically, if you don't care for video games. I have a Lenovo C930 laptop I got back in 2019, it still works fine, outside of a mostly dead battery. I kind of want to replace it, but it's not really necessary.

Yup. I also have a Lenovo notebook from 2019 that meets all my needs. It has an Nvidia 1050ti graphics chip that has almost never been used to run any kind of modern 3D game. I haven't really played computer games for...nearly 20 years. But I can't bring myself to give up the ability to play them, if I want to. But I never do. I think the next computer I buy, though, I'll finally just accept it and use whatever built-in Intel/AMD GPU comes with it.

Like you said, if you don't play modern 3D games, upgrades happen when your computer fails. Cheap SSDs and RAM have solved the performance problems for the applications that most people use.
 
  • Like
Reactions: teleos

Lord Evermore

Ars Scholae Palatinae
1,490
Subscriptor++
I should be getting my first CAMM (one, Dell proprietary) system in next week or so, for what it's worth. It's in a design that allows high RAM density on mobile and that upgradeability and replaceability is very much appreciated. It'll be a 64 GB mobile workstation, with a design that allows up to 128 GB in a regular laptop form factor. The 128 GB configuration ain't cheap, but it exists and the parts are serviceable.
The big downside I see with CAMM(2) is that it's the same module for both single- and dual-channel configurations, and you know that there will still be systems sold with single-channel RAM (one side of the module empty, I guess) to save even a tiny bit of money, so if you want to add RAM, you have to replace all of your memory rather than just adding another stick. Aside from being bad for laptop users, that would be horrible for desktop users, who EXPECT upgradeability to some degree. With laptops, until recently, people could still assume they'd have two slots, even if they were less likely to ever perform an upgrade; more recently they've gotten used to having no upgrade path at all, or one slot plus a soldered amount. Being returned to full upgradeability, only to find out it's more expensive to use it, will be an unwelcome surprise to some.

The cost savings of having RAM soldered onto the board will also still apply compared to CAMM(2), it will just be slightly less than compared to SO-DIMM. But with many laptops already only having soldered RAM with no slot, the manufacturers won't have any incentive to switch to CAMM(2) in those product lines. Those few that have soldered and a SO-DIMM slot may switch to CAMM(2) for the upgrade slot, but that will mean paying for that large CAMM(2) module but wiring it up only for single-channel connectivity, if that's even a thing that's possible. (Would be nice if it could allow adding a dual-channel module and disable the on-board, so both channels could be full and the same size.)

No matter what, manufacturers will find a way to screw the consumer and limit choice without paying through the nose.
 

Lord Evermore

Ars Scholae Palatinae
1,490
Subscriptor++
FWIW with the speed of light in a PCB trace being about 6 picoseconds per millimeter, the extra few centimeters you save with CAMM don't add up to anything at all compared to the 50-100 nanosecond memory latency. CAMM on desktop makes more sense if you want to use LPDDR to save power and/or have more bandwidth, but usually that doesn't even matter much. I think normal desktop will just keep using DIMMs, although we'll probably be stuck with 2 DIMMs max at the DDR6 generation, or maybe one after it.
What does the speed of light have to do with copper traces?

It's not just about the frequency; it's also about power (which is why CAMM is aimed at LPDDR RAM). But it does make a difference, and will make a greater and greater difference as time goes on and they try to reduce mobile device power usage further and further. And when you're talking about traces that are only several centimeters to start with, cutting off a few cm is significant for anything that is helped by shorter traces.
 

Lord Evermore

Ars Scholae Palatinae
1,490
Subscriptor++
I also have a Lenovo notebook from 2019 that meets all my needs.
I'm repairing a 9-year-old 4C/4T Lenovo laptop right now to give to my niece. My stepmother was using it for years, even after the mechanical HDD started to fail. (With 4GB RAM, and upgraded to Windows 10! My father bought cheap stuff for her but high-end parts for himself.) She finally accepted getting a new laptop and gave this one to me. I swapped in a low-end used SATA SSD and doubled the RAM, and used it for my rare out-of-the-house needs for about a year. It just barely did the job for me, but it was painful. Now I have some money and have a new i7-1255U laptop in a box, waiting for me to move the old piece of shit out of the way (it needed a keyboard, and the SSD was slow and losing life). The Lenovo will work fine for my niece's mobile needs; she is currently using an even older 2C/2T laptop that is failing.
 

Nevarre

Ars Legatus Legionis
24,110
The big downside I see with CAMM(2) is that it's the same module for both single- and dual-channel configurations, and you know that there will still be systems sold with single-channel RAM (one side of the module empty, I guess) to save even a tiny bit of money, so if you want to add RAM, you have to replace all of your memory rather than just adding another stick. Aside from being bad for laptop users, that would be horrible for desktop users, who EXPECT upgradeability to some degree. With laptops, until recently, people could still assume they'd have two slots, even if they were less likely to ever perform an upgrade; more recently they've gotten used to having no upgrade path at all, or one slot plus a soldered amount. Being returned to full upgradeability, only to find out it's more expensive to use it, will be an unwelcome surprise to some.

Ultimately it's a simple choice if your laptop's RAM fails out of warranty and that's 99% of what I care about with CAMM2 and SODIMMs:

  • If it has SODIMMs or CAMM2 modules, figure out what those cost to replace and make the decision to replace them or retire the hardware. Replacement should be easily user-serviceable if you're handy with tools.
  • If the RAM is soldered on (either all of the RAM or in a soldered + SODIMM slot configuration) then either figure out how much it costs to replace the entire board including the CPU, possibly GPU if there's a dGPU, and painstakingly verify that you are going to get the right board and right revision and it comes with some kind of warranty. The replacement can be a huge PITA as you have to fully disassemble the laptop and probably re-paste all the heat sinks. This kind of repair is almost never cost effective unless you have spare laptops ready to be cannibalized and lots of time on your hands.
  • Or, if its soldered RAM fails, just yeet the broken laptop into the E-waste pile and buy a new one.

You're also hoping that the laptop isn't shipping with an amount of RAM that effectively neuters the computer, like buying an absolute low end laptop in 2024 with 4-8 GB of RAM. That's the point at which an upgrade becomes more than a nice-to-have but it's also the point where saving $2 by soldering the RAM becomes tempting for OEMs. Buying and upgrading laptops has never been as cost-effective a choice as desktops and you ideally should buy laptops for roughly the specs you intend to need at the time of purchase-- all upgrades in the future are nice-to-haves that largely coincide with repairability. Desktop users shouldn't expect laptops to be highly upgradeable anymore-- there have been quite a few years of "hey check out all these gotchas" with all sorts of laptops. It's a pain to verify now that the laptop will have the upgrade options you need and the days of things like "RAM doors" in the chassis for ease of upgrades are loooooong gone in the quest for thin and light.

The cost savings of having RAM soldered onto the board will also still apply compared to CAMM(2), it will just be slightly less than compared to SO-DIMM. But with many laptops already only having soldered RAM with no slot, the manufacturers won't have any incentive to switch to CAMM(2) in those product lines. Those few that have soldered and a SO-DIMM slot may switch to CAMM(2) for the upgrade slot, but that will mean paying for that large CAMM(2) module but wiring it up only for single-channel connectivity, if that's even a thing that's possible. (Would be nice if it could allow adding a dual-channel module and disable the on-board, so both channels could be full and the same size.)

For now, CAMM2 modules are only really going into higher-end laptops, so single-channel configurations are less of a consideration. That may someday change, but they're not intended to compete with the kind of laptops that use soldered RAM because it's cheaper. They're intended to compete with laptops that use soldered RAM because it's smaller and lower profile, or to go up against the really chonky mobile workstations and gaming-style laptops and hopefully let those fit into smaller, lighter form factors.

For the future, who knows. The standard may not stick around. It all comes down to...

No matter what, manufacturers will find a way to screw the consumer and limit choice without paying through the nose.

... consumers needing to vote with their wallet. If consumers and corporate buyers refuse to buy laptops with fully soldered RAM for example then they will eventually stop being offered.
 

MikePellegrini

Smack-Fu Master, in training
8
Subscriptor++
I like to play games, and I'm a sucker for good graphics. So mainly it's framerate that drives my upgrades. Typically, when I build something, I'll upgrade the video card 1-2 times before I see a CPU bottleneck - although not always.

In the early 2000's, I built mainly Athlon rigs, because they were easily overclockable.

In 2009, I switched over to Intel:

Asus P6T Deluxe - a Socket 1366 X58 board with triple-channel RAM - 3x6GB of Corsair Dominator DDR3-1600 (which in 2009 was a huge amount).
The CPU was a Core i7-920, a Bloomfield stock at 2.66 GHz. I was able to get a 1.3GHz overclock out of it! And on air - a Thermalright Ultra-120 eXtreme. Them were the days! The vid card was a GTX 295. I had a 32" 1440p monitor.

That rig is still running! I used it for gaming until 2018! A record for me. The hard drive (a WD 150 GB Ultra Raptor) crashed maybe a year ago. It'd been running Win 7. I tried installing Win 10 but it wasn't quite up to it so I ended up with Lubuntu. I just use it for listening to music in my exercise room.

In 2018, I built a new rig - an Asus ROG Rampage VI Apex, a Socket 2066 X299 board with quad-channel RAM.
The CPU was a Core i7-7820X. It was an eATX board, so I bought a new case, a Phanteks Enthoo Evolv.
For the first time, I went with watercooling - an NZXT Kraken X62.
For the OS drive, for the first time I went with an M.2 drive - a 512 GB Samsung 960 Pro.
RAM - 32 GB (4x8) of 3000 MHz Corsair Vengeance LPX

In the summer of 2020, anticipating the release of the RTX 3000s, I upgraded to an Acer Predator 43" 4K monitor. And got screwed, because it took me about 9 months to get a new video card. So while I was waiting, I had a great 4K monitor but was still using the old GTX 295. That'd been great at 1440p but wasn't even up to the task of 4K.

My X299 rig died unceremoniously in August 2021, so I built a new Z590 rig.

Asus TUF Gaming Z590-Plus WiFi
Core i7-11700K CPU
EVGA RTX 3090 Hybrid (watercooled)
32 GB Corsair Vengeance LPX RAM
OS on a Samsung 980 Pro M.2 drive
I would have used the Kraken X62 for cooling, but NZXT didn't have a kit for the LGA 1200 socket.
So I got an Arctic Liquid Freezer II 240 mm AIO.

I just recently re-did that computer for my granddaughter. So I built myself a new 14th-gen Intel rig.

Gigabyte Aorus Elite X WiFi7
Core i9-14900K CPU (I rationalized that by the fact that it was a lot cheaper than previous Intel top-of-the-line CPUs.)
OS on a Crucial T700 Gen 5 1 TB NVMe M.2 drive
Corsair Dominator Titanium RAM - 32 GB DDR5-7200
Arctic Liquid Freezer III 360 mm AIO
Fractal North XL case

I'm still able to get good framerates on all the games I play with the RTX 3090 at max settings (The Last of Us Part 1, Fortnite, Fallout 76, Cyberpunk 2077, Hogwarts Legacy), so I'm hoping this card will last until the RTX 5000s are out (although the GPU is frequently running at 100% now).

I have a couple of 2TB storage drives I use for music/videos/photos, with everything backed up on a Synology DiskStation 211+, which has two 12 TB Seagate IronWolf drives.

I've been using 1000W PSUs for the last several builds.
 
What does the speed of light have to do with copper traces?
The ones and zeros on the data lines are EM fields, so they travel at the speed of light in the trace material. This works out to about 6 ps per mm for typical PCB materials, although it depends on the trace type; surface traces, for example, have a faster velocity since more of their field is in air and less in fiberglass.
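If it helps, the ~6 ps/mm figure falls straight out of the effective dielectric constant; the Er values below are typical rule-of-thumb assumptions for FR-4, not measurements of any particular board:

```python
# Propagation delay of a PCB trace: delay = sqrt(Er_eff) / c.
# Er_eff values here are common rule-of-thumb figures for FR-4 stackups
# (assumptions for illustration, not measured data).
from math import sqrt

C_MM_PER_PS = 0.2998  # speed of light in vacuum, ~0.3 mm per picosecond

def delay_ps_per_mm(er_eff: float) -> float:
    """Delay per mm for a trace with effective dielectric constant er_eff."""
    return sqrt(er_eff) / C_MM_PER_PS

print(f"Stripline (inner layer, Er ~4.3): {delay_ps_per_mm(4.3):.1f} ps/mm")
print(f"Microstrip (surface, Er ~3.0):    {delay_ps_per_mm(3.0):.1f} ps/mm")
# Inner traces land near ~6.9 ps/mm and surface traces nearer ~5.8 ps/mm,
# which is why surface traces are a bit faster, as noted above.
```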

It's not just about the frequency; it's also about power (which is why CAMM is aimed at LPDDR RAM). But it does make a difference, and will make a greater and greater difference as time goes on and they try to reduce mobile device power usage further and further. And when you're talking about traces that are only several centimeters to start with, cutting off a few cm is significant for anything that is helped by shorter traces.
CAMM is not specifically aimed at LPDDR; there's a DDR5 variant too. The actual power savings of CAMM are probably not significantly different from DIMM for the same format (e.g. DDR5 DIMM vs. DDR5 CAMM), since the dissipation along the line will be small compared to the dissipation in the tens-of-ohms termination resistors at the end of it.
 
Last edited:

Lord Evermore

Ars Scholae Palatinae
1,490
Subscriptor++
CAMM is not specifically aimed at LPDDR, there's a DDR5 variant too.
DDR5 and LPDDR are not separate technologies. CAMM may not be designed specifically for LPDDR versus regular DDR but it is primarily aimed for use with LPDDR5, because that's what manufacturers want to use in laptops and other small forms that still need modular memory due to the power reduction (no matter how small and how insignificant you try to make it seem).
 

Lord Evermore

Ars Scholae Palatinae
1,490
Subscriptor++
If consumers and corporate buyers refuse to buy laptops with fully soldered RAM for example then they will eventually stop being offered.
Sure, if there are any laptops being made easily available that don't have soldered RAM and meet all the other needs of buyers. The people that are fine with buying laptops with soldered RAM aren't the ones who even know there is soldered RAM. It's not an important factor to them because they don't understand it, and they just buy based on the performance specs and appearance and size (and don't really understand the specs, either). It's only when it fails that they learn there's a difference. Corporate buyers don't care much, because they buy in volume with warranty coverage, and replace the fleet every few years. People that do pay attention to the RAM are probably a little more particular about other features and specs, and will have to compromise in order to get it. I spent several days trying to find a laptop, and while there were plenty in the $230 to $300 range that would have made me happy in terms of performance, size, resolution, I had to eliminate them all and go up to $500 to get one important item, which was a power button that wasn't integrated into the keyboard. Every brand has gone to that in nearly all of their models.
 
DDR5 and LPDDR are not separate technologies. CAMM may not be designed specifically for LPDDR versus regular DDR but it is primarily aimed for use with LPDDR5, because that's what manufacturers want to use in laptops and other small forms that still need modular memory due to the power reduction (no matter how small and how insignificant you try to make it seem).
CAMM2 is a physical socket type for memory that comes in different formats for different memory types, like DIMM or SODIMM. You know how there are DIMMs that work with DDR4 and other types of DIMMs that are DDR5? CAMM2 works like that. You can have a DDR5 CAMM or an LPDDR5 CAMM, but they're not electrically compatible, since those are different memory technologies, the same as you cannot shove a DDR4 DIMM into a DDR5 slot even though they're both "DIMMs". Different types of CAMM are aimed at different applications. For example, an LPDDR CAMM would be for mobile, while an ECC DDR5 module would be for data center/server applications.

As for DDR5 and LPDDR5, they're unrelated memory formats.
 

hobold

Ars Tribunus Militum
2,657
You're wrong about that. Technically speaking, LPDDR5 runs the data rate at 8 times the clock rate and 16 times the actual DRAM array frequency. The DDR in LPDDR is essentially historical at this point since the technology is now a lot different than the DDR4/5 standards.
Well, data is still being transferred one bit (per pin) on both the rising and the falling edges of the data clock, so I guess the "double" is still justified. Or did we progress to modulation schemes with more than one bit per symbol?

The slowness of the actual DRAM cells is unlikely to be specific to LPDDR, I would guess, because DDR is likewise limited in access latency, right?
 
Well, data is still being transferred one bit (per pin) on both the rising and the falling edges of the data clock, so I guess the "double" is still justified.
They changed the clocking scheme a lot from traditional DDR SDRAM and instead send a 1/8th data rate clock, which the chip then multiplies by 8 to get the data rate. This saves power, since you're generating the fast clock closer to where it is used and so can run more of the system at a lower speed. It is also more complicated than just sending on the rising and falling edges like DDR, but LPDDR cares a lot about power and less about cost.

Or did we progress to modulation schemes with more than one bit per symbol?
No, it's still 1 bit per edge; it just sends multiple edges in between the normal rising/falling edges of DDR. Clock multipliers exist and they're more complicated, but you don't actually have to stick to whatever rate the main clock gives you.

Longer answer is that "DDR" as most people understand it only applies to the original DDR1, circa 2000. That worked by putting two SDRAM arrays in parallel, one on the rising edge, one on the falling edge. If they were 100 MHz arrays, then you got a 200 MT/s data rate. This is what most people think of as DDR, where there is one clock and it's double pumped. That ended with DDR2, where the chips on the DIMM divided the clock by 2 so that 4 SDRAM arrays could still run at 100 MHz but now transmit at 400 MT/s. DDR3 then changed that to divide by 4, so that 8 could run at 100 MHz and hit 800 MT/s. It is still technically DDR since you are sending on the rising and falling edge of the clock pin, but since the clock pin is immediately divided down to a much lower frequency or doubled to the DDR rate, not much is actually using that clock speed. If they had sent the lower clock directly, they could have called DDR2 QDR, DDR3 ODR, and so on, which would have made it clearer how the system actually works (sending data at multiples of the DRAM speed), but they chose to send full speed and then divide down, so the DDR name stuck. With LPDDR5, they finally did this and send the lower clock, which is multiplied up.
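The arithmetic is easier to see laid out; this little sketch just restates the numbers above (the 100 MHz array clock is the illustrative figure from this post, and the LPDDR5 ratio comes from the quote earlier in the thread, not from any datasheet):

```python
# Each generation keeps the DRAM arrays at roughly the same speed and
# fetches from more of them in parallel, time-multiplexing the results
# onto the bus. Figures are illustrative, mirroring the post above.

ARRAY_MHZ = 100  # illustrative internal array clock

# generation -> arrays feeding the bus per access (prefetch depth)
PREFETCH = {
    "DDR1": 2,     # one array on the rising edge, one on the falling edge
    "DDR2": 4,     # clock divided by 2 inside the chip
    "DDR3": 8,     # clock divided by 4
    "LPDDR5": 16,  # data rate at 16x the array frequency, per the quote above
}

for gen, n in PREFETCH.items():
    print(f"{gen:6s}: {ARRAY_MHZ} MHz arrays x {n:2d} = {ARRAY_MHZ * n} MT/s on the bus")
```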

The slowness of the actual DRAM cells is unlikely to be specific to LPDDR, I would guess, because DDR is likewise limited in access latency, right?
Yes, they're still limited to a couple hundred MHz, and faster rates are obtained by putting more and more in parallel and then time-sharing the bus, sort of like RAID. This is why the absolute latency stays about the same between DRAM generations even as the frequencies double.
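A concrete way to see the "latency stays about the same" point is to convert CAS latency from cycles to nanoseconds for a few typical retail kit ratings (the kits below are common examples I picked, not anything from this thread):

```python
# First-word (CAS) latency in nanoseconds: CL cycles divided by the clock,
# where the clock is half the transfer rate (two transfers per clock).
# The kits below are typical retail ratings chosen as examples.

def cas_ns(transfer_mt_s: int, cl_cycles: int) -> float:
    clock_mhz = transfer_mt_s / 2          # MT/s -> MHz of the I/O clock
    return cl_cycles / clock_mhz * 1000.0  # cycles -> nanoseconds

for name, mt_s, cl in [("DDR3-1600", 1600, 9),
                       ("DDR4-3200", 3200, 16),
                       ("DDR5-6000", 6000, 30)]:
    print(f"{name}: CL{cl} -> {cas_ns(mt_s, cl):.2f} ns")
# Transfer rate nearly quadruples, but first-word latency barely moves.
```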
 

Lord Evermore

Ars Scholae Palatinae
1,490
Subscriptor++
They changed the clocking scheme a lot from traditional DDR SDRAM and instead send a 1/8th data rate clock which the chip then multiples by 8 to get the data rate.
So it's actually octal data rate? Sure wish they'd at least try again with quad data rate on desktops. Pure frequency increases are getting harder and harder for them to do anything with. DDR5 is so much more complex than DDR4, from what little I have gotten around to learning about it. I don't know if QDR would have the same complexity increase or not, but it seems like it would be a little easier. Eventually the frequency ceiling will be getting hit, just like it is with CPUs.