CPUs: WHY SO POWER HUNGRY?

  • Added 28. 06. 2024
  • We seem to be going through one of those stages where everybody goes mad with their power consumption. But there's no need! Here, let me show you how you can NOT extract all the performance you can out of that expensive new processor you just bought.
    0:00 - History of power consumption
    3:22 - Why is power consumption increasing?
    5:33 - When they went too far
    6:20 - The solution
    Impact Prelude Kevin MacLeod (incompetech.com)
    Licensed under Creative Commons: By Attribution 3.0 License
    creativecommons.org/licenses/b...
  • Science & Technology

Comments • 514

  • @worldkat1393 • A year ago • +720

    Overclocking is no longer a trend. Nowadays I see many more threads on how to undervolt with minimal performance loss. I'm currently running my 7950X in 65W eco mode; I lose 4 fps in my favourite game.

    • @lycanthoss • A year ago • +26

      I mean, to be fair, in games even the 13900K doesn't use that much power. My 12600K generally uses 40-60 watts in games even though its power limit is 150.

    • @MLWJ1993 • A year ago • +37

      @@lycanthoss Intel usually uses less power during gaming & idling than AMD does. It's just all-core workloads that kind of blow the power budget.

    • @henrym5908 • A year ago • +23

      @@MLWJ1993 The 12600K always uses more power than the AMD 5600X and 5800X in the comparisons I could find; like 10% more at idle and up to almost double under load.

    • @MLWJ1993 • A year ago • +8

      @@henrym5908 Interesting, the charts from Igor's Lab suggest the 12600K equals the 5600X's power consumption (on average), but the 12700K & 12900K use less power than the 5600X in their tested games (at all resolutions, actually).
      At idle I see power consumption of ~5W on the 12600K vs 9W on the 5600X on Tweakers.
      Although if we compare the two, it doesn't make a huge difference until we look at all-core workloads, where Alder Lake pushes for much more power.

    • @l5ismi • A year ago • +4

      Don't forget that AMD has an advantage over Intel due to their smaller lithography process, so less power consumption, but you can still make it power hungry.

  • @MrRoko91 • A year ago • +136

    Fortunately other popular gaming devices, like handhelds, are pushing in the other direction. I know they don't have a choice, but it's still mind-blowing to me what level of performance I'm getting from my Steam Deck for only 15W TOTAL CPU + GPU (topping out at around 25W when we account for the display and other internals).

    • @LutraLovegood • A year ago • +11

      25W? That's less than my CPU + GPU at idle.

    • @MrRoko91 • A year ago • +7

      @@LutraLovegood I know, right 😄 My desktop also uses more than that at idle.

    • @sihamhamda47 • A year ago • +4

      And some laptops dedicated to long battery life do that too (although on some vendors' laptop models, the U-series APUs are still allowed to boost up to 30W if the cooling system is good enough)

    • @dylon4906 • A year ago • +3

      @@LutraLovegood I have a thin-and-light laptop with an R7 6800U and the level of performance it gets for the power draw is insane. The CPU performs between a desktop 3700X and 5700X, and it has an extremely competent iGPU capable of comfortably playing new AAA games at 1080p medium-to-low settings, all in a 25W package. CPUs really don't need to be drawing 300W.

  • @visual9ine • A year ago • +450

    I wish the tech press would focus on power efficiency more. Gamers Nexus are already starting to test this sort of thing, which is great, but it's not really done as a standard test, which sucks.
    edit: realised Hardware Unboxed also seem to be starting to test power efficiency too :D

    • @koko-fv2zr • A year ago • +3

      Yeah, ARM chips are the future and gamers aren't going to let them win

    • @SavageArms357 • A year ago • +11

      @@koko-fv2zr I'm hoping RISC-V overtakes x86 in the next decade or two

    • @ollydix • A year ago

      Apple is already doing this.

    • @kekero540 • A year ago • +3

      Tbf power consumption is more of an overheating problem than anything else. The more power it consumes, the hotter it gets.

    • @Rachit0904 • A year ago • +13

      @@koko-fv2zr There's nothing inherently more power-efficient about ARM chips. They just happened to find that niche at the moment.

  • @Rachit0904 • A year ago • +353

    Yeah it has gotten a little out of hand.
    Also for GPUs; the RTX 4090 really should have been a 350 watt card but Nvidia wanted that last 2% of performance.
    Before 12th gen, Intel’s official power draw was reasonable. But they allowed Z-series motherboard vendors to set unlimited power limits by default, which got progressively worse as they added more cores.

    • @suoyidl2654 • A year ago • +8

      A little? I've been annoyed by this development since the release of (so far) useless tech like ray tracing.

    • @lynackhilou4865 • A year ago • +31

      @@suoyidl2654 Ray tracing isn't useless tech, especially on high-end cards where you can max it out and still have good performance. I would say ray tracing was useless when it was first introduced with the 20 series, because only the 2080 Ti could run it reasonably, it didn't make a noticeable difference in visual quality in games like Battlefield 5 and Shadow of the Tomb Raider, and it still halved the frame rate (plus DLSS 1 was terrible).
      But now, ray tracing is a step up in visual fidelity in many games such as Cyberpunk, Control, Metro Exodus EE, Dying Light 2, Spider-Man, etc.

    • @simptrix007 • A year ago • +3

      Well, the 4090 doesn't draw that much in usual gaming scenarios in my experience.

    • @suoyidl2654 • A year ago • +1

      @@simptrix007 Do you cap the fps? Because most people don't.

    • @circedge • A year ago • +10

      @@lynackhilou4865 When users in a blind test are unable to tell ray tracing from rasterisation unless they're trained to do so and concentrating, I'd call it useless.

  • @CyberneticArgumentCreator

    Power envelope for desktop CPUs was a kind of prisoner's dilemma. As soon as either brand moved up, the other brand had to move up or lose. I'm impressed it stayed almost the same from 2010 to 2020, honestly. The "good" news is that they can only really do this once, and then they have to improve IPC/efficiency/process/etc to once again realize gains each generation. And they've now done it. This means the next chipset generations will have to have architecture or process improvements to raise performance.

    • @cube2fox • A year ago • +3

      Good point about the prisoner's dilemma. But I think there is no good news here: after they both "defected", desktop CPU power consumption will probably stay high forever. It will perhaps never go back to a more sensible 100W.
      It would probably only work if they all signed a mutual agreement.

    • @Azzy_Mazzy • A year ago

      @@cube2fox You can always limit how much power the CPU uses; it's just that now there is more headroom for people who don't care. Personally, both my CPU and GPU are undervolted, and they perform better than stock and run quieter.

    • @cube2fox • A year ago

      @@Azzy_Mazzy Yeah, that's good. Although few will know how, or choose to do this, given that it isn't the default.

    • @Azzy_Mazzy • A year ago • +1

      @@cube2fox I know on Ryzen 7000 CPUs you can just enable eco mode and it runs way cooler; idk if Intel has something like that

    • @NightKev • 10 months ago

      "I'm impressed it stayed almost the same from 2010 to 2020, honestly."
      It's because AMD couldn't compete with Intel until after they got Zen going, so it was basically a decade+ of Intel being able to do whatever they wanted with no risk. Once AMD had good CPUs again, Intel had to get off their ass and start pushing their CPUs to compete again. There's nothing to be impressed by, except maybe Intel's anti-competitive tactics that held AMD back I suppose.

  • @klad2860 • A year ago • +6

    You've seen a British man try to chow down on a Steam Deck; now you see him trying to bite through a CPU.

  • @badass6300 • A year ago • +20

    I've been undervolting gpus since 2016 and cpus since 2017 and getting 20-30% lower power and temperatures for a loss of 0-4% performance.

  • @amirmirzaei3940 • A year ago • +64

    I'm just glad that power consumption is getting more and more attention.

    • @XZ-III • A year ago • +1

      Only like a solid half decade late but yea ig

  • @electricindigoball1244 • A year ago • +3

    While you have alluded to this I think it's important to remember that power consumption on its own doesn't tell you if a given CPU is more efficient than another CPU.
    For example if you were to compare a CPU with a 60W power limit and a CPU with a 100W power limit but the 100W CPU is able to complete its tasks twice as fast then it is actually more efficient. Efficiency is measured in the amount of Wh required to perform a given amount of work, power consumption is only a part of this metric.
    It's also worth considering that a faster CPU can allow you to get more work done in the same amount of time and if you're using your PC to make money then it's possible that those time savings will end up paying for the inefficiency of the CPU.
    Regarding the low amount of power pulled by a single core: it has less to do with diminishing returns in terms of performance and more to do with the fact that there's a limited amount of power you can push into a single core before you can't cool it due to thermal density. This was less of an issue in the past because the cores weren't as dense as they are today due to how much the transistors have shrunk.
    Finally you have to consider how big of an impact the difference in CPU efficiency will really have on the overall amount of power all of your devices consume (this also heavily depends on your usage as you have to have a workload that will actually stress the CPU before you start seeing those very high power consumption numbers, gaming in particular is known to not stress CPUs all that much). 100W sounds like a lot especially when it comes to CPUs but its impact on the number of Wh you end up consuming could be so small that it's within the margin of error.
    Of course I'm not saying that it's not worth it to consider reducing power limits and ideally also undervolting CPUs/GPUs especially since drawing less power also makes cooling a lot easier and could save you a significant amount of money in cooler costs.
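    The energy-per-task point above can be sketched with a tiny calculation. This is a minimal illustration using only the hypothetical 60W/100W figures from the comment; none of these are measured numbers:

    ```python
    # Power draw alone doesn't tell you which CPU is more efficient --
    # efficiency is energy per unit of work, i.e. watt-hours to finish a task.

    def energy_wh(power_watts: float, hours: float) -> float:
        """Energy consumed in watt-hours."""
        return power_watts * hours

    # Hypothetical CPUs from the comment: the 100W part finishes the
    # same task twice as fast as the 60W part.
    slow = energy_wh(60, 2.0)   # 60W CPU, 2 hours -> 120 Wh
    fast = energy_wh(100, 1.0)  # 100W CPU, 1 hour -> 100 Wh

    print(slow, fast)  # the "hungrier" 100W CPU actually used less energy
    ```

    So the higher-wattage chip can still be the more efficient one, exactly as the comment argues.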

  • @Mulakulu • A year ago • +32

    As someone who has a 5L case and is very limited on cooling, when I looked at Intel's slides, what piqued my interest was 65W with the same performance as the previous gen at 240W, because in my case, power efficiency is king. Things aren't as ridiculous as they seem with the high power draws on display, because while they technically allow you to suck 400W or whatever, you don't need that. You still get much better performance than the previous generation on the same small cooler. Performance per watt is still increasing.

    • @QuantumLeclerc • A year ago • +1

      @@2kliksphilip Both at 65W is definitely good, but so is showing what power level the older gen needs to match the new one. Both make sense, especially where the gain per extra watt keeps changing: if 65W gave X multicore performance, 66W might give X+Y, but 67W might give only X+1.5Y rather than X+2Y.

    • @Mulakulu • A year ago • +1

      @@2kliksphilip Very true, and this should be normalised. Maybe the standard should be testing performance every 5 watts (10, 15, 20...), at least for the high-power CPUs, and plotting that on a graph. If we just test at 65W, many low-power ones won't even show up, so maybe test per watt.

    • @quantum5661 • A year ago

      I've been looking at SFF cases and been pondering going with Intel next time I build; I'm fucking shocked how high wattage has gotten.
      Like, I went from a 4790K which idles at like 3 watts, to a 5800X which idles at 45 watts. This can't continue.

    • @Mulakulu • A year ago • +1

      @@quantum5661 I agree. Efficiency should be more important, although 45W idle is very high. I bet you could drop that with undervolting and one of the auto frequency modes like PBO

    • @Mulakulu • A year ago

      @@quantum5661 And if you want efficiency, don't go with Intel. They used to be the efficiency kings, but that changed like 2 years ago. Intel has gotten desperate trying to keep up with AMD's performance, so they pump in lots of watts to get good performance. Also, AMD's socket is more likely to last much longer, so you have an upgrade path later on with an AMD motherboard

  • @gJonii • A year ago • +9

    I bought a 5900X as my CPU and underclocked it to use about 40W. It's still about 10x faster than my old processor, and I now have a CPU that barely needs cooling. With my (older; I'll upgrade when GPU prices come down to a sane level) GTX 980 frame-limited during gaming, I'm genuinely unsure if my gaming computer even uses its fans. The quality-of-life upgrade was massive, and turning the computer on with no sound besides the monitors turning on still makes me smile.

  • @bbear • A year ago • +3

    Just yesterday I actually put my 3700X in Eco-Mode, and noticed it was still clocking itself to 4.3 GHz on single cores in games, 100 MHz lower than normal. The only time it really noticeably throttled itself was at full core loads, where the processor capped all its cores to 3.8 GHz. But as a result, my thermals never reached higher than 58 degrees, as compared to PBO mode where my CPU would eventually reach 86 degrees under synthetic load, or even standard mode where it would reach 71 degrees.
    I admittedly didn't check the power consumption stats, but I imagine they were also significantly lowered. And then you uploaded this video, letting the world know that it's okay to power limit your CPU if your power draw and thermals are significantly lowered, with a near-single-digit impact on performance. You really laid out why it's more beneficial than keeping it stock. I'd add that if you purposefully play eSports games at 480 Hz and high resolutions, then maybe it can be argued you'll need that extra performance for some games. But that's a small percentage within the enthusiast PC gaming community who mainly use 120-165 Hz, which is already a small portion of the PC community as a whole. The vast majority of people will pretty much only benefit from power limiting/underclocking their processors.

  • @RealRogerFK • A year ago • +56

    AMD or Intel could just say "Imagine this scenario: a hot summer, in your room, all windows closed, you're boiling", or something more PR-friendly. Also say "Spend less on bills, and more on games" or something, idk. That would sell more than "hurr durr spend 300€ on motherboards"

    • @utfigyii5987 • A year ago • +13

      That would sell a lot of AMD GPUs. Don't die from a stroke while gaming, get a Radeon instead.

    • @LutraLovegood • A year ago • +9

      And less heat = lower fan speed = less noise. It's a win-win-win.

    • @Azzy_Mazzy • A year ago • +2

      No it wouldn’t

  • @sdHansy • A year ago • +5

    A good tip is to undervolt your CPU. You'll maybe see a 5-10 percent performance loss, but a massive drop in power consumption.
    The CPUs are very efficient; it's just that the manufacturers crank them to the max (more or less)

  • @bumi5892 • A year ago • +8

    In summer it will be impossible to run a computer without air conditioning in the room, especially here in South America.

    • @xponen • A year ago • +1

      The heat accumulates in the room. It is cheaper to cool the room with an exhaust fan (50 watts) than with an air conditioner (1 horsepower, ~1000 watts). A person emits 100 watts of body heat, but a PC can emit 5x that, so it's as though the room were crowded with people.

    • @geleiou • A year ago

      I'm feeling it here in Brazil... 30°C every day. Without AC I don't even open games, I just use the PC normally because I can't stand such heat, and I can't use the AC most of the time because energy bills are at an all-time high right now

  • @grimtermite191 • A year ago • +7

    When tech reviewers compare CPUs on price-to-performance, they should include the cost of the average power draw, for average usage, over the average lifetime of the CPU. Just to help people understand what they will really pay to buy and run that CPU. And of course it would give hardware manufacturers a big incentive to prioritise power consumption.
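    As a rough sketch of what such a lifetime-cost figure could look like. The wattages, usage hours, and tariff below are all assumed for illustration, not measurements:

    ```python
    # Back-of-the-envelope lifetime running cost: average draw x usage x tariff.

    def lifetime_cost(avg_watts: float, hours_per_day: float,
                      years: float, price_per_kwh: float) -> float:
        """Total electricity cost over the CPU's assumed lifetime."""
        kwh = avg_watts / 1000 * hours_per_day * 365 * years
        return kwh * price_per_kwh

    # Hypothetical comparison: 150W stock vs 65W eco mode,
    # 4 hours/day for 5 years at 0.49 GBP/kWh.
    stock = lifetime_cost(150, 4, 5, 0.49)
    eco = lifetime_cost(65, 4, 5, 0.49)
    print(f"stock: {stock:.2f} GBP, eco: {eco:.2f} GBP")
    ```

    Even with made-up numbers, the gap between power limits can be a meaningful fraction of the CPU's purchase price, which is the commenter's point.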

  • @gjits5307 • A year ago • +4

    Probably the most insidious part of this race-to-the-bottom is all the hidden ecosystem costs. Spend more on a cooler, spend more on case/ fans, spend more on power supply... the current dearth of affordable motherboards is partly due to VRM costs. All taken into account you may be spending $100 or $200 more on a build because team purple decided to jump off the performance per watt cliff for 4% higher performance.

  • @nauscakes1868 • A year ago • +10

    I'm glad to see a video like this. I just upgraded my computer of 6 years, and honestly, for playing the games I do, like FF14 or Age of Empires 2, I wouldn't mind more energy-efficient parts.
    I'm at a weird point where I want to upgrade more, because building computers is fun! But I don't need the performance; I could easily see myself going down the micro-build, more efficient route. It'd be really neat to build a computer that has the same performance as my 6-year-old rig but is physically half the size or puts out way less heat.

  • @kajori1899 • A year ago • +1

    Congrats on being featured in the newest LTT video 👍 Been following you for years now and I was very surprised to see the LTT staff mention your videos

  • @terry2295 • A year ago • +2

    I feel Philip is good at predicting trends in the hardware market, be it about the companies or the users.

    • @terry2295 • A year ago • +3

      @@2kliksphilip The rough side of being able to analyze data.
      Don't let people without that ability get in your way or ruin the fun you have with it.

  • @Correct_Opinion • A year ago

    I noticed the upscaling on the graphs, thank you Mr Klik

  • @colbyboucher6391 • A year ago

    Glad to see you talking about this.

  • @snowballeffect7812 • A year ago • +1

    Excellent perspective and points.

    • @snowballeffect7812 • A year ago

      @@2kliksphilip Of course! This is especially useful for me as I'm looking to upgrade after a long time out of the market. I would not have easily come across the idea of intentionally throttling performance to increase efficiency. I'll need to take this into account while doing cost-effectiveness assessments as well (unless you're going to do that anyway 😅, in which case I can wait). It seems like there's already enough data out there from benchmarks to see which power level gives the most GHz per watt, and then we can just divide price by that value.

  • @Bebeu4300 • A year ago • +2

    I love the little small print in the thumbnail

  • @OussaMeb • A year ago • +9

    What about an experimental episode where you port old BSP maps (Quake 3 / GoldSrc) to Unreal's Lumen, or even to Blender Cycles (offline)? 🙏
    I've already tried this, and the results were just fantastic 🤩

    • @OussaMeb • A year ago • +1

      It would be a follow-up to your great comparison between CS:GO and the new Half-Life: Alyx engine

  • @TennessseTimmy • A year ago

    Very good overview

  • @kyoyeou5899 • A year ago • +2

    I completely agree with this. One of my favourite PC builds was the LTT energy-saving PC. I wish someone would do one like that at different price points every year, the way the Jean-Baptiste Show does a 500€ PC build every year

  • @xenotiic8356 • A year ago • +1

    This is very important to me as someone living in a hot climate in a thermally constrained room! Before I lived here, I had more consistent A/C so I didn't need to worry as much about thermals. Now I almost only ever use laptops and game consoles because, despite enjoying the customizability of pc towers and pc gaming, heat output is almost always higher than the other options, which just doesn't make sense in my situation. The recent trends in desktop computing haven't been reassuring either, with the ever higher TDPs. Glad this is getting more attention so someday I can return to computer towers!

  • @gideonunger7284 • A year ago • +3

    One of the reasons I like the new M1/M2 Apple devices: they have really good performance but are also very efficient. Maybe it's "only" 70% of the performance of the top-of-the-line Intel CPU, but at 10% of the electricity

  • @marsovac • A year ago • +3

    I was running my Ryzen 3600 at 4500MHz all-core, up from the base clock of 3.6GHz. So there was overclocking headroom in the low-to-mid range; the high end was saturated. I think this distinction should be pointed out. Now we are seeing even the lower-end parts close to maxed out.

  • @Rubafix989 • A year ago • +1

    Undervolting can yield impressive results on that front with no or very little loss in performance

  • @Povilaz • A year ago

    Very interesting.

  • @xponen • A year ago • +46

    You can just leave the default settings and instead tweak "Processor Power Management" in Windows 10/11. Set "Processor performance increase threshold" to 95%, "Processor performance decrease threshold" to 94%, and "Processor performance core parking min cores" to 100%. You will get the lowest power usage without limiting peak-load performance. (Notice that I disabled core parking; that's not a mistake. It's better to have all cores on standby and throttled down than to let them get parked.)

    • @earthling_parth • A year ago

      This seems very interesting. I don't know enough about this, so I'm boosting this comment up so people more knowledgeable than me can comment on it. Thank you

    • @CornRecords972 • A year ago

      clever

    • @xponen • A year ago • +4

      @@earthling_parth There's a thread titled "Windows power plan settings explorer utility" on Guru3D with a lot of discussion about these settings.

    • @puccionicolas7763 • A year ago • +1

      This basically disables turbo boost, which makes it run at base frequency; you're basically running the 13900K at 3.0GHz

    • @xponen • A year ago • +1

      @@puccionicolas7763 These 3 settings aren't turbo settings.

  • @giantnoah • A year ago • +5

    Perhaps benchmarks should be done at standard power levels, showing how efficient each chip is when given the same amount of power to work with.

    • @DesertCookie • A year ago • +1

      The issue here would be finding a common testing ground, I imagine. Different generations can have wildly different efficiency points. A processor might be 20% faster while consuming only 10% more power, which would make it more efficient overall, but how do you test that efficiently? One could run dozens of time-consuming tests at every possible power level, each with dozens of benchmarks.
      While I agree this would be great, and the manufacturers are definitely already doing it, the non-existence of automated testing systems makes this impractical for consumers and reviewers at the moment, I fear.
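      If reviewers did publish per-power-level scores, finding the sweet spot would be simple. A minimal sketch; the sample points below are invented for illustration:

      ```python
      # Given (power limit, benchmark score) pairs from a hypothetical sweep,
      # pick the operating point with the best performance per watt.

      samples = [
          (65, 100),   # watts, score -- invented numbers
          (105, 125),
          (142, 138),
          (230, 150),
      ]

      best = max(samples, key=lambda s: s[1] / s[0])
      print("most efficient point:", best)
      ```

      With diminishing returns like these, the lowest power limit tends to win on score-per-watt, even though the highest limit wins on raw score.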

  • @GUTZ420 • A year ago • +3

    I haven't owned a good computer in my entire life, but I still find these videos extremely interesting!

  • @IAmMike2K • A year ago • +1

    Thanks Philip, useful video. Also, why the hell is your energy kWh price 49.25p? I thought the price cap put it to an average of 34p?

  • @HypoCT • A year ago

    Steve's thumbnail face at the end is just *chef's kiss*

  • @FTRnRhome • A year ago • +2

    My NH-D15S cools my 13900K just fine; I also put a contact frame on it to shave off a few more degrees.

  • @margesimpson805 • A year ago

    great shit!

  • @Raudnen • A year ago • +8

    I undervolted my R5 3600 and still got a fair bit of performance lift from overclocking. Processors are weird sometimes.

    • @Rachit0904 • A year ago • +1

      Nothing weird about this. Most processors have some headroom for pushing the voltage-frequency curve. So you have the option to just decrease the voltage, just increase the frequency, or do a bit of both.

    • @rdmz135 • A year ago • +2

      That is expected behaviour. Your chip wasn't power limited, so by undervolting you caused it to run cooler, which in turn makes it boost higher.

  • @crispx19 • A year ago • +1

    The most I ever REALLY cared about "power consumption" was "Will I have to buy a bigger power supply?", so yeah... I think we're kind of part of the problem

  • @TerraWare • A year ago • +71

    The tinkering hasn't necessarily gone away, per se. It used to be that we'd overclock our CPUs; now we undervolt them. Same goes for GPUs. Personally, though, I do feel this is a good thing if you look at the bigger picture. Most people don't overclock or really tweak their CPUs and GPUs, so they benefit from the companies that design these chips doing it from the factory, with safety parameters in place to make sure nothing is damaged.

    • @timeTegus • A year ago

      Undercoating can Aldo describe stabilu

    • @hi_its_jerry • A year ago • +7

      @@timeTegus are you okay mate? seriously i'm genuinely concerned, i hope you're not having a stroke

    • @JessicaFEREM • A year ago • +6

      CPUs are pre-overclocked these days. It's cool, though, that consumers don't have to overclock their CPUs anymore if they just want to eke out the most juice.
      If you're good with your power plans, you can underclock your CPU in seconds.

    • @timeTegus • A year ago

      @@JessicaFEREM that can make the system unstable

    • @ChucksSEADnDEAD • A year ago • +2

      @@timeTegus Overclocking also can make the system unstable. People used to ride the line of stability and then dial it back a little for performance.

  • @michakrzyzanowski8554 • A year ago • +1

    With AMD CPUs you can adjust the frequency-to-voltage curve by an offset, which allows better clocks at the same temperatures. And if you set a power limit with PBO, you get stock performance at massively reduced thermals.

  • @luizarthurbrito • A year ago • +2

    100% agreed. I even underclock and undervolt my phone's SOC and memory for better battery life (ROG phone 5).

  • @Exnem • A year ago • +1

    I am so clueless about PC stuff, and every year it just seems to get more complicated lol. Don't know if I can ever catch up. I do recognise that buying pre-built is paying too much for less, but dang if I'll ever get my head around all of this stuff.

    • @Azzy_Mazzy • A year ago

      Just be patient and don't overthink things. Set a budget and a goal, then start looking at reviews of products that meet your budget

  • @Ph42oN • A year ago

    I like undervolting to keep the same performance at lower power draw, or even a small overclock plus undervolt to gain performance without more power.
    The original Ryzen 1000 series boost was actually bad. I think something went wrong and they released it with a boost that doesn't work properly. It was fixed in the Zen 1 laptop chips.

  • @mamamia5668 • A year ago • +1

    I'm perfectly happy with more performance even at greater power consumption, since we still have the option of power limiting any CPU

  • @picblick • A year ago • +2

    Eco mode makes this a bit better, I think, especially since it boosts to a configurable maximum.
    But anything higher than 80°C is a no-go in my book, because efficiency drops hard at some point.
    Boosting for a short time is fine, but boost should not be the norm.

    • @MLWJ1993 • A year ago

      It's not even the temperature that's the problem. Obviously TSMC's process node doesn't scale as favourably as Intel's when given more power, which is evident from just dropping down to 105W eco mode and losing only a few % of performance and at least 20°C in temperature.

  • @TehF0cus • A year ago • +12

    I have to say my new 13900K uses a bit less wattage than my old 10900K in daily usage/gaming.
    If I stream/record and play GTA at 80-99% GPU usage, my CPU only runs at around 100-120W, where my old 10900K with the same settings/GPU sucked 150-200W.
    The 13900K only uses 1-2 threads for this workload. Same results in many different daily scenarios, as the CPU hardly ever gets fully utilised. When doing productivity-based workloads, though... oh well... let's just say I'm glad I have a 1000W Platinum PSU

  • @bodaciouschad • A year ago

    My new PC is a seasonal computer. It raises the temperature in my room by about 5°F. That's lovely in the winter, but during the summer I could really only leave it on for an hour or two at a time before the room got uncomfortably hot. Between my 240Hz 2K monitor, my RTX 3070 Ti and my i7-12700K, my PC has begun to function more like a space heater than a gaming rig. When I build a new system in 4-5 generations, power/performance is going to be top of my mind. Most of the games I play have retro graphics anyway...

  • @nBasterd • A year ago • +1

    my guy, u make good videos

  • @YannMetalhead • A year ago

    Good video.

  • @MotoCat91 • A year ago • +3

    I undervolted my RTX 3080 so that it only draws 250W flat out now, with an imperceptible difference in performance - great for small form factor cases.
    But I discovered that Intel locked down the 12400F even further than just locking the multiplier... because I can't undervolt it at all.
    A -30mV offset does absolutely nothing, and -50mV cuts the performance by like 40%, the same as setting the power limit to 45W with stock voltage.

  • @pikaboi4408 • A year ago • +2

    Nah, I don't see the power trend dying out.
    People were always willing to spend extra money for little performance gains.
    It's not gonna stop just because their power bill has an additional 0.

  • @akiraishu4453 • A year ago

    dude, the music for this video is too good. What's the name of it?

  • @Malkovith2 • A year ago • +1

    i like the music in your videos

  • @asphere8 • A year ago • +1

    I prefer to undervolt rather than underclock; it reduces my power consumption and heat output while simultaneously providing more performance!

  • @doltBmB • A year ago

    I'm happy with my 65W. The *900-branded stuff has always been the "extreme enthusiast" tier, with zero regard for efficiency

  • @ReneHoffmann194
    @ReneHoffmann194 Před rokem +4

    You somehow always manage to amaze me, great new video 👍
    I'm also of the opinion that consumption has got out of hand. That's why in future upgrades (currently running a 5800X and RTX 3090) I'm no longer buying the fastest available CPUs/GPUs but also taking a close look at efficiency. It looks like I'll skip the new gen and wait for the next one instead; that way it should be possible to upgrade to a faster setup and even lower the power consumption.
    Btw. I reduced the power consumption of my 3090 in summer because otherwise it would get too warm in my working room, but in autumn/winter I reset it to its default power target to heat my room while I play instead of using the gas heating 😉

  • @MemeMan69
    @MemeMan69 Před rokem

    i see what you did there with the benchmark graphs

  • @alistermunro7090
    @alistermunro7090 Před rokem

    I both undervolted and power-limited my 12900K to 125W when I got it at the start of the year. You've got me thinking about a 13900KS now.

  • @Sigma1
    @Sigma1 Před rokem +8

    I don't mind this, but I wish there were more benchmarks with limited power, at 40W, 65W, 115W etc.
    (or at least at the CPU's rated TDP)
    The fact that we can lower the power usage drastically without losing much performance is, imo, great.
    Being able to bump power limits in a few years to regain some of that performance when it will matter is also nice.
    It's basically the benefits of overclocking, but without the hassle, and it works on locked Intel CPUs too.
    I love that Intel and AMD are finally competing and that we're no longer limited to weak 4c/4t $400 outdated products.
    CPUs are finally at the point where we can trade performance for efficiency,
    as there is next to no downside to this approach.
    I have been waiting to upgrade for years, and now there is an actual reason to do so.
    I can't wait for the release of the 13400 and 13500, as the 13500's E-cores are basically my current CPU x2, not to mention the 6 P-cores and a ridiculous number of threads for the price.

  • @magcs6233
    @magcs6233 Před rokem

    When playing a game like BF4, my Ryzen 5 3550H uses 7W, highest I’ve seen it hit was 15W, the problem is that the performance gain from a higher TDP nowadays is so negligible, yet vendors put their TDPs so high just for those clocks

  • @popcorny007
    @popcorny007 Před rokem +1

    To be super pedantic, power-limiting is not under-clocking.
    Under-clocking actually changes the boost behaviour, while power-limiting just truncates the top end of the curve.
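
To make the pedantry concrete, here is a toy V/F-curve model (all numbers are hypothetical, not real silicon data) showing the difference: a power limit keeps the full curve but truncates wherever predicted power (roughly C·V²·f) exceeds the cap, while an under-clock lowers the boost ceiling itself, regardless of available power headroom.

```python
# Toy voltage/frequency curve: each step is (frequency_ghz, voltage_v).
# Numbers are invented for illustration only.
VF_CURVE = [(3.0, 0.90), (4.0, 1.05), (5.0, 1.25), (5.5, 1.40)]
CAPACITANCE = 10.0  # arbitrary scale factor for P ~ C * V^2 * f

def power(freq_ghz, volts):
    """Dynamic power model: P ~ C * V^2 * f."""
    return CAPACITANCE * volts**2 * freq_ghz

def max_freq_power_limited(limit_w):
    """Power limiting: keep the whole curve, but stop at the highest
    point whose predicted power still fits under the cap."""
    fits = [f for f, v in VF_CURVE if power(f, v) <= limit_w]
    return max(fits) if fits else None

def max_freq_underclocked(max_boost_ghz):
    """Under-clocking: the boost ceiling itself moves down, even if
    there is plenty of power headroom left."""
    fits = [f for f, v in VF_CURVE if f <= max_boost_ghz]
    return max(fits) if fits else None

# A power limit only trims the top of the curve; light loads that stay
# below the cap still boost fully. An under-clock caps frequency always.
print(max_freq_power_limited(80))
print(max_freq_underclocked(4.0))
```

With this toy curve, a generous cap still allows the upper boost steps, while an under-clock removes them unconditionally — which is the distinction the comment is drawing.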

  • @Teh-Penguin
    @Teh-Penguin Před rokem

    Yup. This is the reason why I have not upgraded and hope for the next generation to be more power sensible. Both for GPUs and CPUs.

  • @117johnpar
    @117johnpar Před rokem +6

    Systems are much much less efficient now as well. I remember back in 2010. I had a laptop I would use at school all day. 8 hours going from class to class. Taking notes and browsing the internet. It would sip about 15% CPU usage on that cheap processor with those loads for 8 hours and I would get home usually with at least 30% battery life left. Now, I would love to find a laptop that would last half that time and load without completely draining its battery. It really feels like technology is regressing.

    • @sweetdreams5454
      @sweetdreams5454 Před rokem +9

      tbh, current software is also super bloated.
      I think newer CPUs are much more power efficient, they just have a lot more to do in the background constantly.
      My laptop, an Acer Swift X, lasts about 5 hours on battery on Windows when idle, while on Linux that goes up to 24 hours idle.
      The entire computer uses 2.5-3 watts on a 54Wh battery when idle, so I guess you just gotta look for more efficient laptops.

  • @anepicotter4595
    @anepicotter4595 Před rokem +1

    As someone who runs an overclocked i5-9600K (default: 4.3GHz all-core boost, 4.6 single-core), I can say they aren't really pushed very close to their limits. When I tried an overclock with an undervolt, I was able to get a stable all-core boost frequency of 4.7GHz. With a relatively heavy but mostly harmless overvolt, I found 1 core that failed to get past 4.9GHz, but the average frequency is 5.1GHz. This is an overclock of roughly 18%, and while mine is good, it's not extraordinary. If Intel was really pushing the 9000 series, a 20% overclock with an air cooler and a $150 motherboard would be unheard of.
    .
    I think for a few generations, casual overclocking is going to be a matter of undervolting. How much performance can I keep when I reduce the voltage by 10, 20, 50, or even 150 millivolts? Can I reduce the power draw by 10% and keep 95% of the performance? 96%? 97%? Where is the tipping point?
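
The tipping-point search described above can be sketched as a simple sweep over a toy model (the stability threshold and the power/performance formulas here are invented for illustration, not measured on real hardware):

```python
# Toy undervolt sweep. The stability limit (-120 mV) and the linear
# power/performance models are made up; real silicon varies per chip
# and each step must be validated with a stress test.
STABLE_LIMIT_MV = -120   # hypothetical: chip crashes below this offset
BASE_POWER_W = 150.0
BASE_SCORE = 100.0

def try_offset(offset_mv):
    """Pretend measurement: returns (stable, power_w, perf_score)."""
    stable = offset_mv >= STABLE_LIMIT_MV
    power_w = BASE_POWER_W * (1 + offset_mv / 1000.0)  # ~0.1% power per mV
    score = BASE_SCORE * (1 + offset_mv / 5000.0)      # tiny perf cost
    return stable, power_w, score

def sweep(step_mv=10, floor_mv=-150):
    """Walk the offset down until instability; keep the last stable point."""
    best = (0, BASE_POWER_W, BASE_SCORE)
    offset = 0
    while offset - step_mv >= floor_mv:
        offset -= step_mv
        stable, power_w, score = try_offset(offset)
        if not stable:
            break
        best = (offset, power_w, score)
    return best

offset, power_w, score = sweep()
print(f"best offset {offset} mV: {power_w:.1f} W, {score:.1f}% perf")
```

In practice each step would be a stress-test run rather than a formula, but the shape of the search — step down, verify, keep the last stable point — is the same.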

  • @Bauxitedev
    @Bauxitedev Před rokem +2

    I just bought a 750 watt PSU. I think I might've gone a little too low on that one.

  • @koko-fv2zr
    @koko-fv2zr Před rokem +2

    i don't think you'll see my comment, but you should check the M1 Pro/M1 Max chip FPS specs in RE Village. They draw 30-50 watts (CPU and GPU) and output something like 4K 120fps HDR gaming when optimized

  • @CobaltSpace
    @CobaltSpace Před rokem

    If we don't stop, we will eventually hit the limit of a typical AC circuit, around 1500W. And unless it becomes common for computers to be given their own dedicated circuit, that circuit also has to handle monitors and lights. It may also have to share with multiple office or bedroom appliances.

  • @ads1035
    @ads1035 Před rokem +2

    Idea: bring back the turbo button! Back in the day, the turbo button disabled the CPU's onboard cache... Let's bring it back in the form of a button that unlocks the CPU's higher power limits on demand. If I'm just surfing the net, I may want my CPU limited to 65 watts. But if I fire up a game, let me punch my Turbo button, thus unlocking my 170 watt limit and letting the CPU clock up.

    • @puccionicolas7763
      @puccionicolas7763 Před rokem

      just change the Windows power plan; you can even add a little shortcut to your taskbar and run it using WINDOWS KEY + 1, 2, 3, etc.
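
That switch is a single `powercfg` call on Windows; a minimal sketch (the GUIDs below are the stock plan GUIDs — confirm the ones on your machine with `powercfg /list`):

```python
# Switch Windows power plans from a script (Windows-only).
# GUIDs are the default built-in plans; verify with `powercfg /list`.
import subprocess

PLANS = {
    "balanced": "381b4222-f694-41f0-9685-ff5bb260df2e",
    "high performance": "8c5e7fda-e8bf-4a96-9a85-a6e23a8c635c",
    "power saver": "a1841308-3541-4fae-a0fa-7f6f26ca756c",
}

def powercfg_command(plan_name):
    """Build the powercfg invocation for a named plan."""
    return ["powercfg", "/setactive", PLANS[plan_name]]

def set_plan(plan_name):
    """Actually switch plans (only meaningful on Windows)."""
    subprocess.run(powercfg_command(plan_name), check=True)

# Example (on Windows): set_plan("high performance")  # before a game
# Example (on Windows): set_plan("power saver")       # back to browsing
```

A taskbar shortcut pointing at a one-line script like this gives exactly the "turbo button" behaviour the comment above asks for.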

  • @nhansgoofyvideos7581
    @nhansgoofyvideos7581 Před rokem

    I suggest you try undervolting. Basically squeezing every last bit out of the processor while reducing power draw.

  • @hedgeearthridge6807
    @hedgeearthridge6807 Před rokem +2

    Moving into my first house, I am practically paranoid about energy consumption, simply because I live in a very HOT and HUMID climate, like 8 months out of the year. Things that use power make heat, which my A/C uses more electricity to remove. Rather than building my own router out of an old PC, I'm getting an off-the-shelf router that's ARM-based and uses barely any energy. A space heater on full power (in the US) uses 1800 watts, so even just 18 appliances using 100 watts is like having a space heater running in your house while it's burning hot outside and your A/C struggles to keep up. But a computer using over 1000 watts on top of everything else? Nope, I will pass.

  • @abbcc5996
    @abbcc5996 Před rokem +1

    gem

  • @Rachit0904
    @Rachit0904 Před rokem +31

    Intel’s 8th to 10th gen weren’t repurposed server CPUs. They were new consumer dies that were just lengthened to add the extra cores. Intel’s server CPUs used a different core layout, different memory controllers, no integrated graphics, etc.

    • @popcorny007
      @popcorny007 Před rokem +7

      He's talking about LGA2011 CPUs, even showing a 6950x during that segment.
      HEDT 2011 CPUs absolutely were repurposed server chips, same as the w3175x was a repurposed Xeon 8280.

    • @CyberneticArgumentCreator
      @CyberneticArgumentCreator Před rokem +4

      @@popcorny007 ...which is before the 8th generation, like Rachit said.

    • @GewelReal
      @GewelReal Před rokem

      ring-bus moment

    • @wanderingwobb6300
      @wanderingwobb6300 Před rokem

      Welcome to 2kliksphilip, where we act like AMD is all high and mighty for creating Zen, which was super scalable, while ignoring the practically impossible scalability Intel managed with Skylake, consistently staying ahead of AMD in performance for its entire run. Then once performance became even, Intel consistently managed better price-to-performance than AMD did. Because they've stayed that way for 3 generations now, Philip needs to make a video saying price/performance isn't important anymore since AMD has staggered behind badly in that regard, and that efficiency is key since that's the only metric they've managed to stay ahead in (although even then, undervolting Intel shows exceedingly competitive efficiency regardless).
      I don't hate AMD; AM4 will probably go down as one of the most legendary sockets of all time, but the AMD bias/shilling on this channel is unreal. I'm completely baffled how he ever managed to get review samples from Nvidia with how shockingly blatant the AMD bias is. If Intel makes good CPUs they're only mentioned in passing, only to then return to the main subject at hand, which is how holy and beloved our saviour AMD is.

  • @sjneow
    @sjneow Před rokem +2

    And people say Bulldozer was too hot when it was 125w

  • @rashoff
    @rashoff Před rokem +1

    This is why people need to start tuning their CPUs and set a power limit
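
On Linux, one software-side way to set such a limit is the powercap (RAPL) sysfs interface; a sketch, assuming an Intel CPU that exposes `/sys/class/powercap/intel-rapl:0` and root access (the interface works in microwatts):

```python
# Set a CPU package power limit through the Linux powercap (RAPL)
# sysfs interface. Requires root and an Intel CPU exposing intel-rapl.
from pathlib import Path

RAPL_DIR = Path("/sys/class/powercap/intel-rapl:0")

def set_power_limit(watts, rapl_dir=RAPL_DIR):
    """Write constraint 0 (the long-term limit) in microwatts."""
    limit_file = Path(rapl_dir) / "constraint_0_power_limit_uw"
    limit_file.write_text(str(int(watts * 1_000_000)))

def get_power_limit(rapl_dir=RAPL_DIR):
    """Read the current long-term limit back, in watts."""
    limit_file = Path(rapl_dir) / "constraint_0_power_limit_uw"
    return int(limit_file.read_text()) / 1_000_000

# Example (as root): set_power_limit(125)   # cap the package at 125 W
# Example:           print(get_power_limit(), "W")
```

BIOS power-limit settings achieve the same thing persistently; this route is handy for scripting different caps for different workloads.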

  • @lemmonsinmyeyes
    @lemmonsinmyeyes Před rokem +1

    the power issues with the new Ryzen 7xxx are because they made the heatspreader too damn thick. They did that to keep the mounting height the same as AM4 so people could keep their coolers, but they really should have made it as thin as possible

  • @BigCheeseEvie
    @BigCheeseEvie Před rokem

    Great video! My condolences about you being british though hope you get better 🙏

  • @kamikazilucas
    @kamikazilucas Před rokem +1

    my biggest problem is how much power my CPU uses when idle. I have a 3600X and it uses over 30 watts at idle, which is insane; my laptop only uses like 1 watt when idle. When I undervolt it, it uses less, but then it's always at a locked frequency. I wish there was an option to change the frequency and voltage only when doing heavier tasks; I don't want my CPU to use 30 watts when I'm on Twitter
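
On Linux, behaviour like that can be approximated by switching cpufreq governors per situation; a sketch, assuming the usual `/sys/devices/system/cpu/cpu*/cpufreq` layout and root access (which governors are available depends on the cpufreq driver in use):

```python
# Switch cpufreq governors, e.g. "powersave" while browsing and
# "performance" (or "schedutil") for heavy work. Requires root; the
# available governors depend on the cpufreq driver.
from pathlib import Path

CPU_BASE = Path("/sys/devices/system/cpu")

def set_governor(governor, base=CPU_BASE):
    """Write the governor for every CPU policy found under `base`;
    returns how many CPUs were switched."""
    changed = 0
    for gov_file in base.glob("cpu[0-9]*/cpufreq/scaling_governor"):
        gov_file.write_text(governor)
        changed += 1
    return changed

# Example (as root): print(set_governor("powersave"), "CPUs switched")
```

Modern CPUs already downclock at idle on their own, but a governor switch like this lets you bias the frequency/voltage trade-off per task, which is roughly what the comment is asking for.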

  • @TheGrizz485
    @TheGrizz485 Před rokem +7

    I predict that we'll see 550+watt 4-slot AIB 7900 XTX cards that perform close to the 4090.
    I bet AMD just doesn't want to deal with this power-PR nonsense ATM. For god's sake, they clocked the card to only 2.3GHz when everyone was expecting them to hit at least 3GHz.

    • @Ali-dp6xv
      @Ali-dp6xv Před rokem +1

      if not better

    • @KonradVarga
      @KonradVarga Před rokem +2

      it's slightly bigger than the RX 6900 XT,
      and where did you get the 550+W TDP from?

    • @TheGrizz485
      @TheGrizz485 Před rokem +1

      @@KonradVarga just speculation. A 3rd 8-pin connector would give it a theoretical maximum of an extra 150 watts. So maybe I went overboard; 500W is more like it

  • @jekanyika
    @jekanyika Před rokem

    I've started running my 5950x in power saver mode and only put it up to high power mode if required.
    It saves about 25W at idle.

  • @croftoc
    @croftoc Před rokem

    Good vid

  • @mushieslushie
    @mushieslushie Před rokem

    I just upgraded to a 5600 (65W TDP) from an older 85W TDP i5 and I was pretty disappointed that it actually draws more power, especially at idle. I do get double the performance, which is nice, but it also runs way hotter even though the rated TDP is so much lower. I did undervolt it as well, which dropped my temps by 8C and increased my multi-core Cinebench score by over 500 points.

  • @js1zzle
    @js1zzle Před rokem

    i wouldn't say i underclock but i do undervolt my gpu and cpu in order to lower my temperatures while not sacrificing any performance

  • @TheHalfmanofOz
    @TheHalfmanofOz Před rokem

    Top vid.

  • @katnax3059
    @katnax3059 Před rokem

    When I noticed that I'd bought a weaker cooler than I should've and my CPU was almost always above 60C, I tried undervolting with PBO, but I didn't choose appropriate settings. I will try again with the 3700X; anyway, I won't use this PC much for gaming, maybe Minecraft at FHD 60Hz, but that's all

  • @tinleo333
    @tinleo333 Před rokem

    Performance per watt is the future in CPUs/GPUs

  • @nimushbimush2103
    @nimushbimush2103 Před rokem

    Undervolting is good too!

  • @Bill-lt5qf
    @Bill-lt5qf Před rokem

    am overclocked & undervolted on an R5 3600. Really makes me wonder why it doesn't tune itself this way from the start.

  • @bigcazza5260
    @bigcazza5260 Před rokem

    Intel HEDT was always based on the server chips.
    Apparently there will be a separate monolithic "Fish Hawk Falls" HEDT with 24 Golden Cove cores soon.

    • @bigcazza5260
      @bigcazza5260 Před rokem

      amds current consumer is just server leftovers

  • @Tychosvideos
    @Tychosvideos Před rokem +1

    The power consumption of the new parts makes me wholly uninterested in buying them, since my previous computer (which had a Bulldozer CPU) would overheat in the summer and turn off...
    And then there's also the ever increasing electricity bill.

  • @cloudownage666
    @cloudownage666 Před rokem

    me: regularly watch new phillips videos
    also me: only now realized i never subbed
    thanks for your informative vids phil

  • @earthling_parth
    @earthling_parth Před rokem

    I agree, power efficiency should be given equal importance to the price/performance ratio.

  • @herre3147
    @herre3147 Před 10 měsíci

    Processor coolers are so weird. They were making computers in the 80s or something and were bottlenecked by chip temperature. Then someone said we should make a tiny little fan that pulls heat away from the chip and put it inside the case. First computers didn't have moving parts, and then they did: a little mechanical machine inside your electronics. Also, it's literally a free way to increase your processor speed, in a sense

  • @L_U-K_E
    @L_U-K_E Před rokem +1

    Interesting

  • @ETILHK54
    @ETILHK54 Před rokem +1

    Underclocking as the new overclocking.

  • @Hyper_1989
    @Hyper_1989 Před rokem

    Anyone know what cpu cooler Phillip is using and if it's any good?

  • @blu_pi
    @blu_pi Před rokem +4

    I just wanna mention that the way you've set up your GPU power adapter, it is more likely to fail according to tests done by some tech youtubers. They almost never fail when bent up or down, but bending them left/right seems to make them melt. Generally this may only happen to people who get bad batches, but it could happen to you. I would recommend rerouting the adapter to bend down instead of right!

  • @TrueBlueGamer
    @TrueBlueGamer Před rokem

    Could the code in which the OS is written contribute to that same waste in energy we see nowadays?
    I mean we use smartphones daily that aren't based on x86 and have much smaller batteries than most laptops and yet still last longer, per charge, with regular use in comparison. Maybe Apple's M1 products are good because of both the SoC architecture and the software that complements the hardware.