This is great for the price. My main complaint is power consumption. Intel is still delivering wonderful performance, but the performance per watt isn’t great at all.
Is there a reason why people are so stressed over high power consumption in new CPUs? The majority of the time it won't use anywhere near that, and I just accept it as a “stock OC”.
I think the concern is scalability. Very poor performance per watt is a signal that the processor design will be difficult to scale practically.
Beyond that, at scale, is it good to have high-consumption computers all over the world before we have widespread renewable energy? Probably not ideal.
My point is that they're usually sipping power. Being able to boost that high is a bonus if you need it and won't hurt if you don't. It's the best of both worlds.
As far as energy use goes, it's such a tiny amount of energy compared to any given appliance.
For me it's about how much heat they put out. I have to crack a window in winter when gaming.
I think others are concerned with a trend that extrapolates to insane power consumption: power use growing exponentially toward megawatts while clock speed grows only asymptotically toward, say, 10 GHz.
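As a rough sanity check on why that extrapolation blows up, the textbook first-order relation for CMOS dynamic power (a general rule of thumb, not a claim about any specific chip) is:

```latex
% alpha = activity factor, C = switched capacitance,
% V = supply voltage, f = clock frequency
P_{\text{dyn}} \approx \alpha C V^{2} f
```

Since sustaining a higher f generally requires a higher V, power grows much faster than linearly with clock speed, which is roughly why the raw GHz race stalled in the first place.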
> For me it's about how much heat they put out. I have to crack a window in winter when gaming.
Back in the Amiga era, I inverted the PSU fan of a very loaded 2000 (lots of memory, SCSI, Video Toaster, etc.) and piped the output of an air conditioner directly into it (with a little bit of epoxy sculpting). It stopped crashing and worked happily for a long time until being replaced by a boring Windows box running NT.
I had, but only in front of the computer, where the cold air hit the room air. The air from that AC unit was very dry, to the point of similar ones being replaced due to complaints.
When I’m gaming my cpu (11700k) puts out about 50 watts (some cores moderately busy). The gpu (3080) puts out 320 watts!!!
Tbh I like it in winter, warms my living room up nicely. In summer I just run my aircons more, but I also usually game less because the weather is nice.
I can't agree with you. The CPU will try to boost any time you are doing work, and these models are overclocked to the max at the expense of an insane power overhead. My previous Intel laptop was hitting spikes of 50+ watts for things like rendering a website or opening files, and I can only imagine how bad it is with current desktop models. If the CPU entered its maximum turbo only once the system detected sustained, demanding work, you would be right, but all of these dynamic overclocking systems are opportunistic. As someone mentioned above, they are designed to win benchmarks.
> is it good to have high-consumption computers all over the world before we have widespread renewable energy?
I doubt too many computers based on these parts will be running that hot for any significant time. A typical machine is much more likely to experience bursty loads (compiling something big, crunching a massive amount of data) than sustained ones, the way a build server or something similar would.
If the CPU is managing GPU-based number crunching, keeping it busy will require a lot of GPUs, which will make the CPU's thermal output a rounding error next to the rest of the data furnace.
Yes but either you need your computer to work and therefore you are willing to accept the fact that energy=work, or you don't need the computer to do anything in which case it will just sit there silently drawing ~1W.
Perf per watt isn't interesting to users who want the job finished. Work done per joule is extremely interesting, but Intel is near the top of the game on that metric. Many CPUs that are viewed as low power are not energy efficient.
On an energy basis the “performance” cores in an Intel CPU are more efficient than the “efficiency” cores.
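To make that concrete, here's a toy calculation with made-up numbers (hypothetical chips, not measurements): energy for a fixed job is power × time, so a "low power" part can still burn more joules.

```python
# Toy numbers showing that "low power" and "energy efficient" aren't
# the same thing: energy for a fixed job is power * time, so the
# faster, hotter chip can finish having used less energy overall.
jobs = {
    "fast chip":      {"watts": 150, "seconds": 10},  # races to idle
    "low-power chip": {"watts": 35,  "seconds": 60},  # sips, but slowly
}
for name, j in jobs.items():
    print(f"{name}: {j['watts'] * j['seconds']} J for the same job")
# fast chip: 1500 J; low-power chip: 2100 J.
```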
TDP and actual power draw are two VERY different things. Intel and AMD will both plaster 95 W, 125 W, or 160 W TDPs on chips that will draw well over 250 W when boosting, as long as your cooling allows for it.
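You can see both numbers on a Linux box. Here's a minimal sketch, assuming the intel_rapl driver is loaded (may need root); constraint 0 is typically the long-term limit PL1, roughly the advertised TDP, and constraint 1 the short-term boost limit PL2:

```python
# Sketch: read the firmware-programmed package power limits via the
# Linux powercap interface. Constraint 0 is usually "long_term"
# (PL1, ~TDP) and constraint 1 "short_term" (PL2, the boost limit).
BASE = "/sys/class/powercap/intel-rapl:0"

for n in (0, 1):
    with open(f"{BASE}/constraint_{n}_name") as f:
        name = f.read().strip()
    with open(f"{BASE}/constraint_{n}_power_limit_uw") as f:
        watts = int(f.read()) / 1e6        # file is in microwatts
    print(f"{name}: {watts:.0f} W")
```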
Your second case is rather...optimistic given the amount of background work any modern OS does.
My i9 desktop will start to spin the fans up with light web browsing.
To be precise, even at true zero load, an i9-10850k (what I've got) draws 25-30w. Any sort of load at all and it gets to 60+w. A gaming type load is well over 125w.
If you are measuring that correctly it means that your operating system or something about your platform is utterly broken. `turbostat` and `powertop` on my i7-13700K indicate that it is currently, at "true zero load", drawing 1.1W
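For anyone who wants to reproduce this without `turbostat`, here's a minimal sketch that samples the RAPL package energy counter twice, assuming Linux with the intel_rapl driver (recent kernels restrict the counter to root):

```python
# Minimal sketch: estimate package power by sampling the RAPL energy
# counter twice over a fixed interval. Assumes Linux with the
# intel_rapl driver; reading energy_uj typically requires root.
import time

RAPL = "/sys/class/powercap/intel-rapl:0"    # package-0 domain

def read(name: str) -> int:
    with open(f"{RAPL}/{name}") as f:
        return int(f.read())

interval = 1.0
e0 = read("energy_uj")
time.sleep(interval)
delta = read("energy_uj") - e0
if delta < 0:                                # counter wrapped around
    delta += read("max_energy_range_uj")
print(f"package power ~ {delta / interval / 1e6:.2f} W")
```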
Not my measurements… measurements by people who run hardware websites.
The i9 runs much hotter than the i7, especially the high-end ones. When I bought the CPU, the only thing higher (without getting into server-grade stuff) was the 10900K, which was basically unobtainium.
No. That's just the modern web. If a person needs to use the modern web, he/she needs to use a browser. If he/she needs to use a browser, it is almost always going to be Chrome (or something built on Chrome). Chrome is heavy on Linux, BSD, and macOS as are many of the pages it loads. The web can spin up fans on Intel, AMD, and ARM when I am running NetBSD, Slackware, or Intel Clear Linux.
In a closed room the heat problem isn't down to the fans; that heat may leave the case, but it's not getting pumped out of my room quickly. I'm basically running a 900-watt heater whenever I'm doing Hangouts or gaming.
But the new chips are already shipping with an “ECO mode” for small-form-factor computers. That just makes it easy, though; undervolting was already popular.
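For a rough idea of what such a mode amounts to, here's a sketch of a DIY cap via the same Linux powercap interface (needs root; the 65 W figure is just an illustrative value, not anything vendor-recommended):

```python
# Sketch of a DIY "eco mode": cap the long-term package power limit
# (PL1) through the Linux powercap interface. Needs root, and 65 W is
# just an example cap; firmware may clamp out-of-range values.
LIMIT_W = 65
PATH = "/sys/class/powercap/intel-rapl:0/constraint_0_power_limit_uw"

with open(PATH, "w") as f:
    f.write(str(LIMIT_W * 1_000_000))       # file expects microwatts
print(f"PL1 capped at {LIMIT_W} W")
```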
The alternative is shipping a power-constrained chip for… no gain? Letting the CPU boost this high is only a boon for people who want it, and the negative is imagined.
I'm generally most interested in performance at peak, and consumption at idle, since my usage typically bounces between those two extremes and spends little time in-between.
After owning an i9 for a while... I care about power usage because all the power gets turned into heat. i9s like to run HOT, even under fairly mild loads.