OK I'll state what I know:
The combined total of power across ALL the output voltages is the advertised wattage.
So the 3.3V, 5V, yada-newfangled-volts, and 12V rails *combined* equal 1000 watts (in a "1000 watt" PC supply).
That's the *output*.
But what is the *input*????
Cosmic question!!!
Beeeeecause Reddit and Stack Exchange start talking coulombs and phases.
So here's my idea (OK, this is totally wrong, it'll void your warranty, etc.):
calculate the amps at 12V for the advertised wattage; at 120 volts input that's 1/10 of those amps, then add a fudge figure (20%).
So.
1000 ÷ 12 (watts divided by output volts) is amps; divide those amps by 10, then add a fudge figure of 20%.
83.33 amps at 12V (let's say "85"),
divided by 10 is 8.5 amps, and that's if your PC were on fire, calculating pi to the last digit (like in an episode of "Star Trek") and running FurMark.
http://www.numberworld.org/y-cruncher/ (which would be useless without some reliable way to measure current at your outlet). Plus, I am not aware of any study or list that tells you *which* benchmark(s) use the most power, for testing purposes.
8.5 amps times 120 is (uhm) 1020 watts.
Go figure.
Add 20% of 1020 and get 1224 (call it 1200), and that's a lotta watts!
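If you want to fiddle with the numbers yourself, here's a minimal Python sketch of that back-of-envelope math. The 120V mains and the flat 20% fudge are my assumptions, obviously, not anything the PSU maker promises:

```python
# Back-of-envelope guess at wall-side (input) power for a "1000 watt" supply.
# Assumptions: 120 V mains, the worst case where the full rated wattage is
# actually drawn, and a flat 20% "fudge figure" standing in for losses.

def estimate_input(rated_watts, mains_volts=120.0, fudge=0.20):
    """Return (input_watts, input_amps) for a PSU's rated output."""
    amps_at_12v = rated_watts / 12.0                    # 1000 / 12 = 83.33 A
    amps_at_mains = amps_at_12v * 12.0 / mains_volts    # the "divide by 10"
    input_watts = amps_at_mains * mains_volts * (1 + fudge)
    return input_watts, input_watts / mains_volts

watts, amps = estimate_input(1000)
print(f"~{watts:.0f} W at the wall, ~{amps:.1f} A on a 120 V circuit")
# prints: ~1200 W at the wall, ~10.0 A on a 120 V circuit
```

The exact numbers land on 1200 W and 10 A because the code doesn't round 83.33 up to 85 like I did above.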
But your PC is prolly reading this and on twitter and looking at kitties, so....
I was curious what gauge wire electrical contractors and engineers like for a 15A (fifteen-amp) circuit.
They said "12 gauge," except for Home Depot, which likes selling more wire (f*ck 'em).
Could you sleep soundly in this house??
Scarier but same concept:
I don't trust my own math, plus I'm thinking "No WAY that's right," so I'll find a site that agrees with me and writes more prettily.
This is relevant beeecause:
You wanna rent out your college kid's bedroom or the attic in your garage to some poor unfortunate attending your church, and they have several appliances and a big PC.
How much rent should you charge, and what's the average electricity bill gonna go up by?
YOU don't reveal your calculations to the tenant: charge some reasonable amount plus a fudge figure. If they use less, you're happier; if they use more, you calculated wrongly.
The little old lady using (oh, let's say) a 500-watt supply from before her husband died, who loves to play feature-rich games and stream soaps, might appear to use less than the kid running 1000W because his video card says to. So you'd judiciously charge the kid way more, even though he might (conceivably) use way less, because his power supply is more efficient and his games less demanding.
(well it gets complicated, f*ckin nevermind)
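For what it's worth, here's the landlord math as a sketch, with made-up numbers: the average draw (NOT the number on the PSU label), hours per day, and whatever $/kWh rate is on your own utility bill:

```python
# Rough monthly electricity cost for a tenant's PC, fudge included.
# All three inputs are guesses you'd have to make yourself.

def monthly_cost(avg_watts, hours_per_day, rate_per_kwh=0.15, fudge=0.20):
    kwh_per_month = avg_watts / 1000.0 * hours_per_day * 30
    return kwh_per_month * rate_per_kwh * (1 + fudge)

# The kid's "1000 W" rig might really average 300 W, 8 hours a day:
print(f"${monthly_cost(300, 8):.2f}")   # $12.96
# Grandma's 500 W box, gaming hard at 400 W for 10 hours a day:
print(f"${monthly_cost(400, 10):.2f}")  # $21.60
```

Which is the whole point: the label tells you almost nothing. The average draw and the hours are what move the bill.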
Prettier, much more bombastic: https://www.cgdirector.com/how-to-check-pc-power-consumption/
The problem is, sites emphasize the output power and sort of gloss over the input.
But if my math is anywhere *close* to being accurate, the input and output are roughly equal, plus a small amount of fudge for fudging (or dew yew want a lecture on efficiency?).
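If yew DO want the one-line lecture: input = output ÷ efficiency. A sketch, where the 80% and 90% figures are assumptions (real supplies publish efficiency curves; look up the 80 PLUS ratings):

```python
# The textbook relation: what the wall supplies vs. what the PC gets.
def input_watts(output_watts, efficiency=0.80):
    return output_watts / efficiency

print(input_watts(1000))        # 1250.0 -- close to my fudged 1224 above
print(input_watts(1000, 0.90))  # 1111.1 -- a fancier, more efficient unit
```

So my 20% fudge is basically a stand-in for an 80%-efficient supply.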
Suppose you had a store card tied to some bank.
I doubt (now) that they charge interest separately on different items.
$20 on a $50 purchase sounds high (that'd be 40%), so I am thinking the interest applies to the whole balance of $800, where $20 is a believable 2.5% a month, even though all I charged was a little thing.
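A quick sanity check on that hunch (the 2.5% monthly rate is my guess; it works out to roughly a 30% APR, which is sadly believable for a store card):

```python
# Interest on the whole balance vs. interest on just the new purchase.
balance, purchase, monthly_rate = 800.0, 50.0, 0.025
print(balance * monthly_rate)   # 20.0  -- matches the $20 charge
print(purchase * monthly_rate)  # 1.25  -- what per-item interest would be
```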
Love it and despair!! The article almost never has the photo.
Clickbait (magnified), something about AI.
The actual picture (or whatever) has nothing to do with AI, and more with marketing something.