
Saturday, July 13, 2024

Uh? (Userbenchmark)

I found a quick benchmark.
My rig: running sixteen actual cores with the virtual cores disabled.
What does Userbenchmark compare my PC to?
A 5800X!

And what does that mean, exactly?


Apparently as a desktop PC it's only average (for going here and Google News).
But if I ever had to do "workstation" stuff, it'd be impressive.
I'm an old Porsche at 65 MPH, apparently.

So the nice dress on this PC is my graphics.
(attractive shoes?)

In a way different world, I could have
A Wi-Fi 7 motherboard,



Says Corsair, a while back.
But then they put out
64GB CAS 30 (which is better than "CL34")
(except it's "optimized for Intel")
TBA, "AMD EXPO" (must be another f'ing standard)

(Uh) Checking the list twice: 6000-something, CAS 30, AMD EXPO-yada-whatever.
Because anything faster and the CPU'd freak.
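For the curious, the CAS-30-beats-CL34 claim checks out (this math is mine, not Corsair's): CAS is counted in clock cycles, so the actual wait in nanoseconds depends on the transfer rate too. A quick sketch, assuming a DDR5-6000 kit like the ones above:

```python
def true_latency_ns(cas_cycles: int, transfer_rate_mts: int) -> float:
    """Convert a CAS rating into nanoseconds.

    DDR transfers twice per clock, so one cycle lasts
    2000 / transfer_rate_mts nanoseconds.
    """
    return cas_cycles * 2000 / transfer_rate_mts

# At the same DDR5-6000 speed, the lower CAS kit really is quicker:
print(f"CL30: {true_latency_ns(30, 6000):.2f} ns")  # 10.00 ns
print(f"CL34: {true_latency_ns(34, 6000):.2f} ns")  # 11.33 ns
```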


And a CPU to go with the motherboard. 7950X(3D) ($500)(etc, etc)
___________
($1000 + "etc", plus $300 for the actual motherboard, hmm)
Uh, $1500.
(No-problem people could get by on $899.99, but I'm not window-shopping for the cheapest crap out there.)


The video has the strangest math I've ever seen.
"16 cores is better than 8 on the 7950X, don't disable SMT." Waitasec, SMT pumps the cores up to 32 threads, not 16. And yes, 16 actual cores is better than 8 (why are we talking "8"???)
I did my own research (so I don't appreciate his digs) and 16 is better than 32,
but (of course, wtf?) 16 is better than 8.
WHY are we talking "8"?
ACTUAL VS VIRTUAL (sorry, was I shouting?)
You'd never disable actual cores, unless maybe you'd want some unreal overclock, IDK.
Yeah, OK, NVM.
(Mumbling)
Games rarely use more than 4 cores, so 4 actual + 4 virtual might help a core-impaired (slightly brain-dead) game....
And we ARE talking about a game (right?), not some overall benchmark.
See, I can squint and see his POV
(but WHY are we talking "8"?)
(and why aren't we buying a cheaper CPU then? 8 cores with SMT would be a 7800X3D)
NVM
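My best guess at where the "8" comes from (mine, not the video's): an 8-core chip with SMT on exposes the same 16 threads to the OS as a 16-core chip with SMT off. The arithmetic, sketched:

```python
def visible_threads(physical_cores: int, smt_on: bool) -> int:
    """Threads the OS sees: SMT doubles each physical core."""
    return physical_cores * 2 if smt_on else physical_cores

# A 7950X (16 actual cores) with SMT disabled...
print(visible_threads(16, False))  # 16
# ...shows the same thread count as a 7800X3D (8 cores) with SMT on:
print(visible_threads(8, True))    # 16
# SMT left on with all 16 cores:
print(visible_threads(16, True))   # 32
```

Same thread count, very different silicon underneath — which is why the benchmark talk gets muddled.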



But (ya know) Mine hasn't died yet. It's like me, old but still going.

A teeny tiny insignificant addendum:
If a person were ever wishing they could change over from AMD to Intel,
uh,
Now would be the time (dontcha think??)
Motherboard, CPU, memory requirements, and brand loyalty are totally f'ing tossed out the window.
But I'm just, ya know, ranting.
Consider a "Bundle" of some motherboard and a CPU:
Intel (so far, this gets edited tons so wait for it)

And AMD (watch this space)
TBA: Wi-Fi 7. As I wrote this, I seriously looked, and most motherboards are still selling "6E". Gigabyte was the exception, and there are prolly others, I just never found them. (At 2AM on 7/14/24.)
TBA: 7950X3D (the wifi-7 bundle)
Dated articles touting the upcoming 8950 don't show whether it's here yet or still in the future, and the 7950/8950 are not on the AI list.
Wait, so real-smart CPUs won't have AI? Or is this just another graphics thing?
Marketing has me confused.



P.S. Being serious about bundles would mean they include RAM, but none did, that I saw. My gaping lack of investigative skills is prolly giving some readers an aneurysm....
uh.........
--------
weird math (Intel)
uhm, 8+8+16, Oh. o....k (I guess)
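A guess at that math (mine, assuming they mean a hybrid part like an i9-14900K): 8 P-cores that hyper-thread, plus 16 E-cores that don't, so 8 + 8 + 16 = 32 threads out of 24 physical cores. Sketched:

```python
def intel_hybrid_threads(p_cores: int, e_cores: int) -> int:
    """Hybrid Intel thread count: P-cores hyper-thread (x2),
    E-cores run one thread each."""
    return p_cores * 2 + e_cores

# An i9-14900K-style layout: 8 P-cores + 16 E-cores
print(intel_hybrid_threads(8, 16))  # 32 threads from 24 physical cores
```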
 it might be too late for me, but save yourselves!! 
With enough reddits and Youtube-videos, U 2 cud have Intel (and better bragging rights)
https://www.anandtech.com/show/21374/intel-issues-request-to-mobo-vendors-to-use-stock-power-settings-for-stability (Overclocking is dangerous, they're releasing a patch, uh?)
If I had to translate the algebra in the link, um, I'd posit that allowing a CPU to have too much power is bad (m'kay?) because the CPU (according to the article) will actually use the power and die.
Plus (This is all me) it's hot, and what worked in the dead of winter might be fatal now.



