The Intel Problem: CPU Efficiency & Power Consumption
- Uploaded 15 Jun 2024
- GRAB A BRAND NEW GN15 MOUSE MAT! store.gamersnexus.net/product... (black/blue/yellow and available in signed or unsigned!) or consider buying one of our GN15 metal emblem glasses to support our work! store.gamersnexus.net/product...
A lot of you have requested that we run power consumption tests for gaming on CPUs, so we've finally done that! And alongside fulfilling that request, we also wanted to tackle a comment from Greg, our recurring antagonist commenter, who "requested" CPU efficiency testing. These benchmarks look at the efficiency and raw power consumption of Intel vs. AMD CPUs. There's a particular focus on the AMD Ryzen 7 7800X3D and the Intel Core i7-14700K and i9-14900K, as these are the most recent, most directly comparable, and best gaming CPUs.
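If you want to reproduce the efficiency math from the charts at home, it boils down to simple ratios. A minimal Python sketch (the numbers below are placeholders for illustration, not results from the video):

# Efficiency metrics used in the video: FPS per watt for gaming,
# and watt-hours for fixed workloads like Blender renders.
# Placeholder numbers for illustration only -- not benchmark results.
avg_fps = 144.0          # average framerate in a given game
package_power_w = 86.0   # average CPU package power during the run
print(f"Gaming efficiency: {avg_fps / package_power_w:.2f} FPS/W")

render_hours = 0.5       # time to finish the Blender render
avg_power_w = 180.0      # average CPU power during the render
print(f"Render energy: {render_hours * avg_power_w:.0f} Wh")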
Watch the CPU Sample Size video! • "NOT ENOUGH SAMPLES" -...
Learn about Intel's Snake Oil: • Intel's Snake Oil & Co...
Read our CPU power consumption mega charts (these ones aren't there yet!): gamersnexus.net/megacharts/cp...
Like our content? Please consider becoming our Patron to support us: / gamersnexus
You can also find Wendell from Level1 Techs here: / level1techs
RELATED PRODUCTS [Affiliate Links]
AMD Ryzen 7 7800X3D on Amazon: geni.us/bah9
AMD Ryzen 7 7800X3D on Newegg: howl.me/cle8h3CtNL9
Intel Core i9-14900K CPU on Amazon: geni.us/Etbz3
AMD Ryzen Threadripper 7980X CPU on Amazon: geni.us/6HfTHBO
AMD Ryzen Threadripper 7980X CPU on Newegg: howl.me/ck3kepz4Noi
AMD R9 7950X on Newegg: howl.me/ck2HGLczYCA
TIMESTAMPS
00:00 - TwitterBall X
00:25 - BET
01:23 - Tests Included
03:22 - The Plan & Testing Methodology
06:54 - F1 2023 CPU Gaming Efficiency
08:12 - Locking the Framerate (F1)
10:25 - AMD's Floor - I/O Die Requirements
10:44 - Cyberpunk: Phantom Liberty Heavy Gaming
12:15 - GPU-Bound CPU Efficiency in Gaming
13:28 - Starfield Efficiency
14:19 - GPU Utilization vs. CPU Power Discussion
15:28 - 60FPS Locked
16:21 - Baldur's Gate 3
17:03 - Blender Power Efficiency (Wh)
17:27 - Compression Efficiency (MIPS/W)
17:47 - Photoshop Locked to 86W
18:10 - Highest Power Consumption Spikes
21:04 - Conclusions & Actual Electricity COST
25:59 - Future Plans for This Testing
** Please like, comment, and subscribe for more! **
Links to Amazon and Newegg are typically monetized on our channel (affiliate links) and may return a commission of sales to us from the retailer. This is unrelated to the product manufacturer. Any advertisements or sponsorships are disclosed within the video ("this video is brought to you by") and above the fold in the description. We do not ever produce paid content or "sponsored content" (meaning that the content is our idea and is not funded externally aside from whatever ad placement is in the beginning) and we do not ever charge manufacturers for coverage.
Follow us in these locations for more gaming and hardware updates:
t: / gamersnexus
f: / gamersnexus
w: www.gamersnexus.net/
Host, Testing, Test Lead, Video Editing: Steve Burke
Video Editing, Testing: Mike Gaglione
Video Editing: Tim Phetdara
Guest: Wendell of Level1 Techs
Thanks, Greg, for this idea! We're funding this one ourselves, so grab a brand new GN15 all-over print component mouse mat! store.gamersnexus.net/products/15-yr-mouse-mat or a GN15 metal emblem glass to support the testing! store.gamersnexus.net/products/gn-3d-emblem-glasses
Watch the CPU Sample Size video! czcams.com/video/PUeZQ3pky-w/video.html
Learn about Intel's Snake Oil: czcams.com/video/xUT4d5IVY0A/video.html
You can also find Wendell from Level1 Techs here: czcams.com/users/Level1Techs
And find our mega charts for CPU power here -- new charts not added yet: gamersnexus.net/megacharts/cpu-power
Grabbed me one of those 15 yr anniversary pads, ol' son! 😂
Thanks, Greg!
@GamersNexus can I have one of those chips :( uwu
When did Intel lose in efficiency vs. AMD? 1st or 3rd gen Ryzen? (Don't think it was Bulldozer)
@weltsiebenhundert378 I think during the 3000 series, though the 2000 series might be close
I've done extensive testing and I've found that if you power limit both the 14900K and 7800X3D at 0W they perform identically with identical performance per watt.
Of course they would! 😂
Don't say it to Greg, he would still bet that even at 0W the Intel CPUs were more efficient and faster
Wow! This is a huge break-through! We'll immediately test the power consumption of the systems when they're off. hahaha
You are wrong... the Intel CPU will actually give out energy from the ether, while the AMD CPU just turns off at 0W
@@GamersNexus Actually, that's not as mad as it seems; who actually switches their systems off at the wall...?
It's hilarious to me that the 7800X3D without FPS limit still pulls less power than the 14900K with an FPS limit
Edit: the amount of fanboys defending Intel with all they have, for no good reason at all, is just ridiculous. Stop saying they should have undervolted it. Stop saying they should have turned off the E-cores. Stop saying they should have limited the power draw (which they did for the 14700K Photoshop test, actually). Stop trying to make this literal waste of energy look better than it is. You get nothing from doing this. You only expose yourself as siding with a company that rips you off, and that makes you look like morons.
And besides, this video was supposed to be a fair comparison between Intel and AMD. If you tweak Intel's chip to the point of being advantaged, you surely can do the same for AMD.
Looking at the power consumption of the 7800X3D, I wonder why AMD hasn't released it on laptop platforms yet. It would have no match at a 50-75W TDP in games.
Edit: yes, I know about the R9 7945HX3D, but it still has to share some of its power between two CCDs instead of focusing all the power on the 3D V-Cache one (and laptops usually have 55-75W limits). There's the 13980HX on laptops, and it is basically identical (in terms of core count, frequency, and even cache) to a power-limited 13900K. But they wouldn't need to limit the 7800X3D that much; even 55W would be enough.
@@ReclusiveXL It might be a temperature constraint; that and the higher power floor to run them
@@WWAVES1 nah, in laptops the CPU doesn't have a thicc boi IHS, and 50-75 watts is relatively normal for laptop CPUs
Thanks to the 3D cache, performance is higher AND they have to reduce the maximum voltage to avoid damaging that extra cache. Those two factors make it incredibly efficient for its performance; that, and the fact that Zen cores are more efficient than Intel cores by default
@@ReclusiveXL It needs desktop levels of cooling to remain stable at 50-75W. There's a reason it doesn't boost as high as its non-X3D counterpart, which has to do with voltage and temperature limitations.
"We figure that some people might not like this scenario because it's completely valid, fair, realistic and out of the box as you use the processor" - sassy Steve is by far my favorite Steve
Pretty sure the cat wrote that line 😜
😂
Even though Greg took a fat L, he gave us all a big W for introducing this new set of power efficiency tests! Thanks Greg and Thanks Steve!
Thanks Steve
I don't see it that way. It is a win-win. Scientists learn from their mistakes, empirically.
Now i see Greg as my personal hero!
Back to you Steve.
Thank you for saying it
There's something admirable about a team whose idea of having fun is piling up weeks of benchmarks over a troll comment. Kudos guys!
troll comment OR might have been someone from the userbenchmark team 🤣
Some folks enjoy doing what they do!
@@noba4696 team? i thought it was a single person
@@noba4696 Ain't that the same? :p
@@lucyfer7748 no, trolls intentionally say things, no matter how ludicrous, with the express intent of provoking a response; this dumbfuck's genuine conviction was that Intel is currently better than AMD, and GN humiliated him
The low-key tragic detail of this video is that the 14900k is LESS efficient than the 13900k in literally every single test. Ouch.
That was to be expected. They're basically milking the same CPU for a few extra cycles, at the cost of even higher power.
That just proves that BIG NUMBER BETTER
Underrated comment.
It is normal, it is a refresh. Same trash CPU overclocked to look better on benchmarks. They literally do on desktop what they say AMD is doing on laptops. But Intel is doing it on high-end parts.
because it's literally the same chip, just overclocked a little bit more than it already is
Greg: you are not testing CPU’s scientifically enough!
GN: challenge accepted.
Subscribers: Thanks Steve 👍
@@amrmoustafa328 Er, they are comparing based on claims by either company about how their CURRENT chips perform. It doesn't matter what "technology" those chips are built on. It is FAIRLY COMPARED based on the claims the producers of those chips are making about them NOW. Why is it that someone like you can't seem to understand that difference? Are you being shady here and claiming something stupid by saying that companies like AMD and INTEL are LYING about their chips in the first place? Your assumption is what is truly not fair, as the baseline is comparing based on claims MADE BY THE MANUFACTURERS about their own chips in their current state, no matter what technology is in them.
not really fair considering Gamers Nexus has a billion sponsors and has the money to just throw at research, while the normal person can't.
back to you steve
Can we all just take a moment to thank Greg for threatening us with a good time? This was an EXCELLENT presentation and more than a little eye opening. I knew 14th gen i7 and i9 CPUs were power hungry, but Intel got absolutely abused. I should not have started this at bed time. 30 minutes gone in no time.
even 8th, 9th, 10th, 11th, 12th, and 13th gen were/are power hungry... and I don't know why Intel calls these CPUs 14th gen; in fact they are 13th gen with a bit of an overclock
"Ah yes, let me limit this 550$ CPU that I bought to 40% of its operating power, so it can get half the efficiency of its rival instead of 1/3rd"
- Greg, 2023
*of its $360 rival
Fact is, if you just limit power draw (not FPS like this silly video did), then it's quite darn efficient for gaming. These chips are just stupid out of the box. And don't forget the 7950X also draws near 200 watts, which is way past its efficiency curve. Limit that to 150 watts and it's also way more efficient.
@@pino_de_vogel Why would you limit power draw if you're trying to measure efficiency for a typical use case? That's why they limited fps instead of draw. That's like arguing you should drive with your motor locked to 2000 RPM everywhere because it's more fuel efficient to do so. People are buying these products to just use them, any other points to be made are either worthless or blatantly made to try and discredit the fact one product is CLEARLY under-performing comparatively in the efficiency department. edit: switched MPH to RPM, because that's actually closer to what's being spoken about here.
@@pino_de_vogel Next time watch the video before commenting. They did limit the power and it did not make it more efficient than AMD.
@@pino_de_vogel Hi Greg
As a computer engineer, it baffles me how many have such strong and vocal opinions and push those opinions when they have not a single clue about the subject
Twitter is the epitome of Dunning-Kruger at its finest
@@cmac3530 *the internet
There's a word for that type of person. Ultracrepidarian :)
Don't forget that each user comment you see might be written by a 9-12 year old, who basically just parrots things without understanding them.
Right there with ya 😂 if there's one thing I learned from studying computer engineering, it's how the most vocal people on the Internet are often the most wrong in terms of computers lol
The important question remains tho... How will userbenchmark spin this one??
They were bitching in the 7950X review about how it uses 160W (it has a 160W TDP, actually), and in the 14900K review about how it's SUPERIOR. I emailed them a screenshot of the 10 FPS/watt and 3 FPS/watt and called them shills and liars.
Can we just collectively stop visiting that site? It doesn't deserve any attention. It misrepresents its contents as benchmarks when they are just random numbers pulled out of his a**, inconsistent with every other reviewer.
Never understood the hate for UserBenchmark: Intel wins, and it makes diehard AMD fans butthurt. But in terms of efficiency, that's a whole different story.
@stacksofslaps7442 why buy a chip that consumes 3 times as much power to get a 10% gain? Ever heard of the law of diminishing returns? 14900k for gaming is a waste, 7800x3d is better for power consumption and price. Easy choice.
@@guywhitey2028 they totally win tho, just ignore the absurd power requirements and all the other data from non-biased sources that clearly show a very different picture.
Anyone else enjoying the ridiculously good quality of stuff that the GN team has been putting out lately? Mfers are heading into 2024 on FIRE!
Damn Gregs out here makin' us look bad! From one Greg to another, DAMN IT, GREG!
I guess Dave (if you're familiar with the radio commercials) has handed the title off to Greg.
It’s okay. Let us be the best of Greg’s and hopefully we’ll inspire the others to do better.
Gregs do ask the stupid questions nobody asks but, once asked, everyone wants to know the answer to... :)
Results confirm bias, though bias had no quantification.. this is nice
Yeah that other Greg really Greg'd it up for everyone. sigh
My name Greg
I don't think I've ever seen someone get blown out of the water as hard as Greg just was
I love it when Steve pulls out these "I didn't need to go this hard but..." videos
Greg owes somebody his paycheck, lol.
Can you believe the guy is doubling down on Twitter?
@@rmvdhaak It's impossible to argue with stupid, even with facts.
@@rmvdhaak What's the @?
If I can make a humble suggestion: you're referring to the voltage/frequency curve a few times in this video. It would be interesting to actually plot the curve, then find the sweet spot for the CPUs (7800X3D vs. 14700K would probably be most interesting) on that curve and compare the performance at that particular point. This would put a figure on the maximum achievable efficiency.
This seems like a cool idea!
Problem is, that optimal efficiency would be different for every single application. Even in hardware, different CPU/GPU/memory combos could have different optimums.
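If GN ever runs the sweet-spot test suggested a couple of comments up, the analysis step is simple once you have (power limit, performance) pairs for a workload. A rough Python sketch with made-up placeholder data, not measured values:

# Sweep power limits, record performance, then look at both raw FPS/W and
# the marginal gain per extra watt between adjacent power limits.
# The (watts, fps) pairs below are made-up placeholders, not measurements.
points = [(65, 95.0), (88, 110.0), (125, 121.0), (180, 128.0), (253, 131.0)]

prev = None
for watts, fps in points:
    marginal = "" if prev is None else f", marginal {(fps - prev[1]) / (watts - prev[0]):.2f} FPS/W"
    print(f"{watts:>3} W -> {fps:6.1f} FPS, {fps / watts:.2f} FPS/W{marginal}")
    prev = (watts, fps)

Note that raw FPS/W always favors the lowest power limit, so the interesting number is the marginal FPS per extra watt: the "sweet spot" is the knee where that marginal gain drops below what you care about. And as the reply above says, it would shift per application, so the sweep would need repeating per workload.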
It's not just the power cost: for a 14th gen high-end part you're pretty much forced to use an expensive water cooler, while a 7800X3D can be cooled by a cheap tower cooler.
you aren't - you can even overclock a little bit with some of Noctua's several-year-old models, and there are better air coolers out there now
@@malcomreynolds4103 Noctua or any other high-end air coolers aren't exactly cheap in a lot of places. I'm from a very hot part of Brazil, for example, and for me the 14th gen Intel CPUs would need an expensive water cooler or a top-end air cooler. These aren't cheap at all here; some of these high-end air coolers go for around 180 USD, money that could go toward an RX 7900 XTX vs. an RX 7800 XT. Also, electricity is around 0.23 USD/kWh. A CPU that draws 200W in Cyberpunk would make a significant impact on the energy bill for me, even though I only play around 3-4h a day.
And for the Ryzen you're stuck with absurd mobo prices :D I know, I got a 7800X3D earlier this month :D
Also, all cores on the 7800 are real cores. Meanwhile Intel's 14th gen only has about 30% physical cores; the rest are emulated cores made from some top-secret Hyper-V of sorts
@@superspies32 No.
GN team, the intros are something else. I may come here for the info you provide but I get entertained by the intros.
hahaha, gotta keep it interesting!
@@GamersNexus I think you should make some kind of "Cringers Nexus" channel for shots like that intro and memes from GN. Would be lovely. Thanks Steve!
I absolutely love the intros.
@@GamersNexus I applaud your effort by purposely going out to get a brand new pitchfork 0:08 just to film this video
@@GamersNexus So, when is the first Gamers Nexus short film coming out?
I can only imagine it's difficult to come up with creative and unique shots for a processor; it doesn't go unnoticed by me. Some creativity here using the lens, with it being the subject of investigation.
The video team will appreciate that you noticed! I'll share the comment!
Agreed! The video team has really been upping their game over the past couple of months. Kudos to the GN video and editing crew!
Power efficiency is a critical metric for me these days; thanks for this in-depth testing!
Why is that?
Yeah, one of my heaters broke, and this 65W gen1 Ryzen is not doing it, even with a malfunctioning fan. Came here to get some blood to my head and consider buying Intel for the extra heat generation.
@@thomasschraubt7497 electricity bill for me 😢
@@thomasschraubt7497 In some countries, energy bills have doubled and in some cases tripled since COVID.
@@annekedebruyn7797 We pay $0.50 per kWh, when your average net income is around 1750 and half of that is the price of your rent...
This power efficiency testing is really good information, not just for cost but for thermal generation: it tells us realistic heat loads in games. This is super important for SFF builders who might need to cram everything onto a single rad, or use undersized coolers. Your case can end up limiting your hardware choices due to thermal constraints.
I got an i5-13600k about a year ago and it's really difficult to keep cool here in Australia where it's usually quite hot and houses have zero insulation, along with evaporative air conditioning that is absolute garbage.
I knew it ran hotter than its competitor but didn't think it would be this hard to keep cool.
I'm legit eyeing the 7800X3D for my first SFF build. Given the current situation, I guess I'll go for the 8800X3D when it arrives. To be honest, my current PC is just fine; it's just too bad that it's literally idling at like 70W (150-200 at full load). It's quite hot in summer and power bills are also going up
True! Especially high end CPUs can be very difficult to cool under load.
I use evaporative and have asbestos insulation. My state is dry and not tropical I guess, so maybe that's your problem.
Describing CPUs in terms of their fps per watt is brilliant.
Intel has left the chat.
I wish the same would also be done with GPUs, I have the suspicion that AMD will also be ahead there when the workload is pure rasterization without any DLSS/RT.
Well, it's the same as describing EVs by their miles per kWh, because that exposes the truth of inefficient and expensive usage.
@@n3rdy11 the real performance saver is FSR 2. It is really good.
The 5800X3D will need to be in a CPU Hall of Fame. What a champ of a CPU. I feel like this CPU alone will keep AM4 around for quite a few more years.
Yea, I went to AM5 and got the 7800X3D, but for my brother's build I stayed AM4, and for price to performance, the 5800X3D and a B550 Velocita were $175 cheaper than my combo... Let's say I have a 7900 XT in my AM5 build: for the same price you can stay AM4 with the 5800X3D and get a 7900 XTX, and have better performance for the same money as the AM5 system
I bought one for a Dual Vega 64 Crossfire build as I was trying to get around the poor GPU scaling due to CPU bottleneck. Just for a fun project. It worked. Both Vega 64 GPUs get 100% utilized and you get a doubling of performance. Very strong CPU.
If they continued to make it.
Currently on a Ryzen 2700. I don't know when I'm going to upgrade, but when I do, it will be to a 5800X3D. It's the CPU a lot of people with old Ryzens will upgrade to, just because the performance you get from only swapping your CPU is insane
For gaming, yes, but for anything non-gaming, not really.
That shot of all the cpu boxes stacked is amazing. Whoever did that deserves a raise.
I live in Brazil and own an AMD Ryzen 5600G, and I love its efficiency. There is an extra argument you can make in favor of lower consumption too: you need a smaller power supply for your PC, and it is quieter because it needs less fan speed to dissipate the heat generated by the extra power.
I "upgraded" from a 5600G to a 12900K. The noise issue alone makes me want to go back to the 5600G. That and the fact that the idle power of the 12900K system is what the 5600G pulled at 80% load.
@@MartyAckerman310 you've learned your lesson. NEVER BETRAY LISA SU OUR LORD AND SAVIOUR
Fan speed doesn't depend solely on the heat generated. The X3D CPUs have a significant disadvantage compared to Intel in terms of how efficiently the cooler can dissipate their heat. They have lower temperature limits, which means lower difference in temperature compared to the air. The 7000 series also have that oddly thick AM5 IHS for AM4 cooler compatibility, which makes them less efficient at transferring heat to the cooler, so the cooler has to work harder to keep the chip at the same temperature. The heat pipe layout of a cooler can also make a sizeable difference between AMD and Intel because the positions and sizes of the heat sources within the CPU are different.
That's why I like 65W CPUs... I bought a 7700 instead of a 7700X (the 7700 is cheaper in Brazil, btw). I ran tests on my CPU, setting it to the same parameters as a 7700X and measuring the power drawn on the power supply cable; it simply isn't worth it, too much heat and too much power for just a little bit more performance. I'm currently running my CPU at 75W TDP, +250MHz on the boost clock, and undervolted. Stress testing the CPU at 100% takes only 150W from the power supply (entire system consumption) and CPU temperature is around 80ºC. When I tested it at 105W TDP, +100MHz, plus the undervolt (same specs as a 7700X), total system power was around 200-215W and CPU temperature went straight to 90ºC.
Me personally, the efficiency is necessary as my PC is hooked into a UPS. I have a _hard_ power limit; anything to drop the power consumption can be critical in an emergency.
"We figured that some people might not like this scenario... because it's completely valid, fair, realistic and ootb as you use the processor"
LOL, you didn't have to roast them this hard, Steve !
Steve didn't do the roasting - he just put Greg#2 next to the 14900k heatsink (ok, let's be real, radiator) and let all the stupid amounts of heat do the work.
Yes he did🤪
“Thanks Steve!”
Greg must be a new userbenchmark for hire 😂
LOL, for that matter... They should call this test _The Greg._
I thought the same. Sadly there are still a whole lot of people who fall for the bullshit Bullshitbenchmark is telling them.
with how ridiculous these types are getting, I'm starting to think they're just stockholders
with all their life savings on the line
@@j5ylim396 I'd honestly buy Unity stock over INTC right now. Even with the Unity drama.
@@j5ylim396 I wish I could believe that. I've seen enough fanboys over the years that this just seems like normal denial of reality to me.
Steve, don't forget that higher power consumption means spending more money for a bigger PSU and Heatsink.
Not to mention motherboards that can deliver the required amount of power
not to mention more case fans to keep it cool...
Whatever encoder you guys at GN are using to make videos is a BEAST. Your YouTube videos look orders of magnitude better than 99% of other YouTube channels. It's insane how good these videos look on my G9 monitor and 4090 lol.
Insane the amount of power Intel chips need just to keep up. 3x more power in some instances and almost always 2x+ more. I knew it was bad, but this was a bloodbath
It's wild. I think the x3d chips in particular kinda highlighted this, because before that most reviewers/consumers just defaulted to "performance is more important than efficiency." And that's still true, imo, but now that AMD has both (in gaming, at least) the disparity being this wide is jarring.
I also think the people saying "oh well you can undervolt it or cap your fps" are either very niche users or coping. Yes, a small percentage of users undervolt, but when they talk about it they advertise *not* losing performance as one of the main draws. Multiple tests in this video show that you do lose a significant amount of performance if you cap the power draw, That Blender example where the render time nearly doubled was extreme, I'm sure, but the point is if you're buying a 14700k because it's a better multithreaded workload performer.... why are you kneecapping it for the sake of power draw. At that point one of the only advantages it has is nullified.
Most people just want the best performance you can get out of the box. I really doubt more than 1 percent of people are gonna stomach losing 20 fps in cyberpunk for the sake of a few bucks off their power bill. You would have bought cheaper more efficient equipment if you cared that much more about power draw than performance.
10nm vs 5nm. Intel sux
@@Skeames1214 It is insane the gap. I have a 13900K. I have always and will likely still just keep buying whatever has the highest FPS numbers when I buy, but damn would this make my cooling so much easier. The heat is such an issue.
It's a reversal of 2013. Bulldozer time
@@Skeames1214 capping the power draw and undervolting are two different things.
I love how Steve will see a tweet with like 5 likes and take it as a personal challenge necessitating tens of man hours of work. What a mad lad
This one was about a month of work! And it required 3 of us! And give me some more credit -- it had 2 likes! hahaha
"Someone is wrong on the internet!"
@@GamersNexus
It’s the sort of work that keeps us coming back. Great stuff.
He wants the paycheck
@@GamersNexus Astounding work gentlemen
Excellent review and interesting perspective. One thing to note about cpu power consumption: power correlates to heat, which can be a meaningful factor when considering how ambient room temperatures and noise levels are changed over time, especially in hot climates or during the summer months.
I'm excited to hear idle power draw testing is coming. I feel like as live service models get worse, there will be more interest in running local servers; I know I am. I don't pay a ton for electricity, but I'm still interested in running as efficiently as possible where I can.
I love how much the tables have turned in terms of performance per Watt. The kind of issue we see now used to be in AMD’s camp with the FX series.
I had a phenom and skipped the FX generation. Didn't seem worth the upgrade
@@davidellis1355I had one and did the upgrade. You were right lol
@@davidellis1355 Phenom II x6 was a screamer w 2x msi 7770 oc but the power bill suffered... lol.
@@frizzlefry1921 1090T gang
And before that Intel had issues with the Pentium 4 where they once again tried to brute force themselves ahead through clockspeed at the cost of power/heat. It's one of those things that will probably keep swinging back and forth over the course of history.
The fact that you get biased comments like this and still take the time to go through and methodically test them speaks volumes about you and your team's character. THIS is the content I'm subscribed for! Love to see it.
i like that you used the word biased, instead of calling Greg a fanboy, which is what he is, lol
I don't think it matters how many benchmarks or tests Steve runs on these CPUs. Intel fanboys will not accept any result, even if it proves them wrong.
@@anorax001 it doesn't matter what company; the FanBois are all as STOOPID as each other. They are the Flat Earthers of the Tech Community.
@@anorax001Look, if AMD was really better, how come they can't afford a five-letter name, HUH?
That's right, you can't answer that, because there is no answer!
That's the difference maturity makes: having the intelligence to create out of hate and to make lemonade out of lemons, something that only comes with maturity, something that many Millennials and the under-30 crowd seriously lack, and their parents are to blame
I have been loving my new 7800X3D! Coming from a 3700X, which was a great CPU, I've seen a strong boost in 1% lows in my games with the same RTX 3070.
Same scenario for me... next is a GPU upgrade in 2024 or early 2025
Great data! Totally not surprised a smaller manufacturing process pulled ahead in efficiency, lol. The kWh charts were especially great. I'd be curious about idle or near idle power consumption between systems. Like whether either brand has better technology to keep computers efficient when they are not doing intense work. Such as browsing and maybe watching videos. Maybe the differences are insignificant but that could be charted out over their expected lifetime make it significant (say 5 years). EDIT, I saw idle was mentioned toward the end, nice!
Sometimes I can sit down with my desktop for half a day only browsing or watching videos. Sometimes I use the PC daily but don't game for a week. The power consumption of having a few base programs up is just as important as the heavy load consumption since I only game for a few hours each week.
Some kind of baseline would be nice to see, however it is defined.
It's not all about the fab process, but is more about the focus. Intel is taking the attitude, "just throw more power at the chip to let it clock higher", and the result is that efficiency goes right out the window. Alder Lake to Raptor Lake to Raptor Lake Refresh, yea, add some E-cores going from 12th gen to 13th gen, add cache, but then boost the power draw for higher clock speeds. Intel has been using this approach since the 7th generation when Intel had locked its upgrade path to the fab process. Oh, Skylake...yea, 10nm wasn't ready yet, so Skylake with more power and higher clock speeds was what we saw from 6th to 7th gen. Then 8th gen, still Skylake, but with 6 cores at the top end. 9th gen, 8 cores at the top end, but still Skylake. 10th gen, yep, for the desktop, Intel still couldn't improve the design, so again, Skylake drawing more power for higher clock speeds. 11th gen, still no 10nm on the desktop, all a joke.
That was the big, "same design, throw more power at it for higher clock speeds" approach. Intel is still doing it that way, being so obsessed with higher clock speeds, they don't bother with thinking about efficiency at all.
@@Targonis1 Not all about it, sure. I'll agree there. Though this video clearly shows differences when the CPUs are given the same power limits. Design can certainly be a factor as well. All that said, I'd consider the manufacturing process the key attribute.
So as someone with a 7800X3D living in Germany, where electricity is about 32 cents / kWh... WOW I'M SO BUSY SAVING MONEY HERE.
You do realize that even if you used 400W 24/7 for a month @ a 40c price, it would only cost you like 100€?
E: You do realize it scales down? E.g. 150W gaming 2-4h per day @ 40c.
Monthly cost: ~3-6€
@@slipknot1943 I don't think you made the point you meant to make there.
Positive side: Saving power. Negative side: It's that expensive because it's all coal, instead of GN's nuclear...
@@slipknot1943 I think they would rather have the extra 100 euro
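For anyone checking the €/month arithmetic in this thread, here it is spelled out in Python; the rate and wattages are the commenter's example figures, not measurements:

# Monthly electricity cost: watts -> kWh -> currency.
def monthly_cost(watts, hours_per_day, price_per_kwh=0.40, days=30):
    kwh = watts / 1000 * hours_per_day * days
    return kwh * price_per_kwh

print(f"{monthly_cost(400, 24):.0f} EUR")  # 400 W, 24/7: ~115 EUR/month
print(f"{monthly_cost(150, 3):.1f} EUR")   # 150 W, 3 h/day gaming: ~5.4 EUR/month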
On the other hand, I'm probably wasting money. I have a 5800X3D and it takes about ~60W at idle, which is a lot, especially when I watch videos: the system spikes to 110W watching YouTube. My old i7-6700 system does about ~33W while watching videos. I know it didn't have a GPU in it, but that's still a heck of a difference, especially at idle, where the 6700 would draw like 15W at most. That's a huge difference, and it's really stupid how little AMD's CPUs save power at idle because of their IO die or whatever, or that my GPU apparently needs at least 40W just to watch a damn video at 1080p.
Greg: places bet
Intel: Be careful of what you wish for
Greg is Homer. Intel is Santa's Little Helper
Best reviews on the net! Keep it up and have a Merry Christmas guys!!
The snake oil bottle being an isopropyl alcohol bottle is so fitting for Wendell 😂
Idle power consumption for CPU and GPU reviews would be extremely cool to see. I don't think anyone is doing this kind of test, but it would be nice to know
They're tough to get right. Working on that separately!
my 7950x3d pulls around 30ish watts at idle and my 4090 pulls 15w.
AMD would be hard because the 7000 series still has high idle power under certain circumstances like different monitor resolutions and refresh rates amongst other things
one problem with idle power testing is the *crazy* swings based on motherboard model as well as software configuration.
@@tommihommi1 Second this. It's very hard to have a "control" when there is no "control": everyone's software is different, everyone's hardware is different, even people's monitors are different. It's hard to control all that when no one does that 100% of the time
A more efficient CPU also means lower requirements for the CPU fan, case fans, and the power supply, or even the case itself. So there is more merit than just power cost.
Not just lower requirements, but more options as well. CPU trying to pull 300W is going to need a lot different cooler than one chilling around 150W, but run the 300W cooler on a 150W CPU and you can run the fans a lot slower and thus quieter.
More efficient isn't necessarily less total power: take a 100W CPU that can do X work/hour vs. a 200W CPU that can do 4X work/hour. Yes, a total energy saving, but higher peak power.
@@fluffy_unicorn859 Not sure on exact numbers, but when dealing with heat a 4:1 ratio of heat to power needed to deal with the heat is probably on the good side.
also the motherboard, you don't have to worry about VRMs as much
My i7-12700K is the only one that can be air cooled. Any i7 above it needs an AIO.
@@saricubra2867 What nonsense. I have a Noctua NH-U14S and it cools my 13700K just fine at 5.4GHz. Granted, I set the power limit to 200W and it sits at 90C in Cinebench. In gaming and everything else it sits at a stable 600 RPM.
It's so nice to see how tech reviewers always come up with new ways to approach things, and you really nailed it again with this one. I've always thought that power is an underrated aspect of CPU/GPU reviews.
This is even more relevant after the recent microcode update that Intel sent out, as it completely disables undervolting in non-K CPUs. Only PL1/PL2 adjustments still work, and those bring performance down significantly.
Damn, poor choice by Intel, locking up another great feature.
Wow that sucks not being able to undervolt locked CPUs. Probably not a good idea to update the BIOS then... ever.
"Greg" is clearly confusing Intel v AMD now with Intel v AMD in 2013. Big, big, big difference.
The "avalanche moment" was July 7th 2019 when Ryzen 3000 launched. Since then the efficiency disparity has only grown larger with every generation.
From the 1st gen i7 all the way up to when AMD released Ryzen is the reason why I'll never buy Intel ever again.
5 to 10% increases per generation, their tick-tock business plan, and extremely overpriced CPUs for a good 7-plus years. Every generation had a new chipset requiring a new motherboard. Nah, I'm good; I'll stick with Ryzen even if it is slower.
@@shortyorc121 When they made the 9700K the same chip as the 9900K and then artificially restricted hyperthreading so only the 9900K had it, for hundreds of dollars more, that was the moment Intel blew their reputation for me. Total gross money-grab behavior.
@@shortyorc121 Please don't do this. Who knows, maybe in a couple years the roles will be reversed again and Intel will be the better option.
@@andersjjensen The disparity was already there at that point. People just didn't pay any attention yet.
Intel chips have been toasting systems since the P4 days. Back then it was even worse, with Intel's choice of RAM being extremely power hungry on top of that.
And in the server market it was worse still, peaking when the Intel 5000 series Xeon platforms used 300+W just sitting idle.
Intel changed their tune because datacenters started demanding better power efficiency from chipsets and the CPUs themselves, which allowed AMD to sell FX-era Opteron platforms to datacenters despite being far less performant, simply because they were so much more power efficient compared to Intel.
Could you also focus on idle CPU usage? I think that's also extremely important when a lot of PCs run maybe 30% of the time at idle. Some business PCs are on 24/7 and are only used ~8 hours a day, 5 days a week.
Yes, we'll have some data for that in the next few months. We're planning to add that to the mega charts pages per popular request as well!
@@GamersNexus Any plans to do the same for graphics cards? Maybe idle, watching videos, that kinda stuff
@@commanderbly009 I think that would be a good idea if Nvidia releases a low-power GPU.
@@GamersNexus Also, regarding the IO die idle power consumption: could you test with some different RAM speeds if this ever gets a dedicated piece? Sleep states may not be active for the IO die at some of the higher RAM speeds. This might disproportionately affect Threadrippers, of course.
Idle power usage is where Intel wins compared to AMD desktop CPUs.
As a Greg, I appreciate the shout outs. I have plenty of ideas that I like to test on my own because I have no life, however I'm not bitter like these other Gregs. Thanks Steve, you're living my dream job and I appreciate your work and being the gold standard in journalistic tech enthusiasm. Merry Christmas!
I can't believe you guys made such an outstanding video, in terms of testing methodology, just because some dude posted a relatively stupid comment. I think it's a very interesting video, and quite important to not just look at pure peak performance but also at these metrics. Thank you for the work you guys put into these videos.
There is a certain level of professional spite here, And I absolutely appreciate it.
That's the perfect description for it!
The exact video I wanted to see. Thank you for this testing.
Love the frames per Watt metrics. Think you should select 2 games as baselines and the current top selling game so that the data is meaningful at the time of benchmarking for gamers who are building a system for the current top seller.
Also add the short support lifespan of their sockets to the problem.
That's a different problem!
that's a dealbreaker for me. If I can get a motherboard with an Intel socket for cheap or free I'll take it, but if I'm buying new, AMD desktop motherboards hold value longer because they don't get obsoleted as fast. Anyone who bought an AM4 board in the last 6 years can attest to that
For the record: when I built my home server last year, I opted for an ASRock IMB-1711 motherboard and an i9-10900KF.
When I bought my gaming PC in 2017, I got a Ryzen 5.
Constant crashing, mobo problems, can't OC RAM on 4 sticks, etc. etc. Now on my 7800X3D with NO issues, sipping on coffee ☕️
That was the main reason I went with a 7800X3D over Intel when upgrading from my 8700K this month, after I read that their next gen of CPUs will need a new socket. Not buying a new mobo that's already an upgrade dead end. AMD claims they will support AM5 for "years to come," and with the longevity of AM4, I believe them.
@@sean8102 AMD only gave 5000 support on AM4 because Intel would’ve been more popular if they didn’t
The coverage you all have been doing, on virtually everything, has been outstanding.
I know it's not much, but here's a token of appreciation.
I debated whether or not I should give you this, considering you got someone's whole paycheck as a result of this video. I'm obviously in the wrong line of work.
I don't think they're actually getting anything from Greg lol. He's all talk. But if you want to support them, it's better to buy from their store; they get a similar cut to what YouTube gives them and you get a thing!
$20 is huge! Thank you so much. We really appreciate that. That's a large donation for us and they all go a long way!
@@Brockzillagaming Yeah, I know they didn’t actually get a paycheck, it was a joke. :)
Thank you, Greg, for this amazing video I was waiting for so long.
Sincerely, thank you GN, and thank you Greg. This is the kind of data I really need to see. The synthetic benchmarks are fine, but some real world data is especially useful for my situation.
I'd really enjoy seeing this data set expanded to include lower TDP CPUs like the 65W Ryzen 7 5700X and 7600, as well as a GPU equivalent set.
My needs are probably not so unique in the USA, as many renters suffer from landlords that are restrictive with what we can do ourselves while unwilling to fix things when presented with them. I personally am living in an apartment with shoddy electrical wiring and share it with roommates in order to be able to afford a roof over my head. The circuit that my computer is on is also shared with computers from other roommates and we don't have a way to work around it. As a result, power efficiency is extremely important partly to keep the electricity bill down, but more importantly, so that we don't blow the circuit breaker whenever we're all using our computers.
I picked out my system specifically for its overall power consumption. I chose a Ryzen 7 5700G with a Radeon RX 6700 XT. The former because I had a B450 motherboard already and wouldn't benefit from the 5700X's PCIe 4.0 support, and it was cheaper anyway. Besides that, the 5700X/G are the best AM4 CPUs whose TDP is only 65W. Going any higher would bump the max power draw and bring me closer to the apartment circuit's limit, and that limit is the primary factor for me.
The 6700 XT was chosen because it was a solid and affordable mid-range card at the time of purchase. My specific card has its limit set to 194W, but my testing showed that limiting it to 120W gave the best power vs performance ratio. The FPS gain past that was only 1-3 FPS per 10W on average, while going lower than that saw closer to a 5 FPS per 10W decrease in the same test scenario.
Add in peripherals and dual monitors and this results in a very capable gaming machine capable of achieving 120 FPS at 1080p with max details in almost all my games while still only pulling 300W max from the wall, as measured by my UPS.
Incidentally, my previous system also has very good power efficiency. (The B450 motherboard had to come from somewhere.) A Ryzen 5 2600 and a GTX 1050 that I was using as a secondary GPU because the R9 290x doesn't have hardware HEVC support. The 1050 is lower power consumption anyway and I won't need it for intensive tasks, so it would work perfectly for the file server's display.
With a few more parts, I can reassemble it to replace my current file server and vastly expand the capabilities while keeping overall idle power consumption under 30W.
Beyond that, I can add some extra features such as router capabilities. I'm aware of the security concerns that come with running a router on the same system as a file server, but adding a 4-port PCIe NIC would let me create a much faster and more customizable router for my local network than anything I can get store-bought, while only increasing idle power consumption by maybe 10W more at most. It'd also save on the amount of cables, making for a cleaner setup.
I'd even have extra processing power to let me host dedicated servers for playing private games with my friends without affecting performance on the other parts.
I'd be able to do all of that with a single system that draws less than 40W at idle and probably around 210W under full load.
AMD really has knocked power efficiency out of the park, and thanks to them, it's possible for a poor, disabled, lifelong tech nerd like myself to have an extremely powerful setup that Intel and Nvidia simply can't provide.
I want to say good job to the audio team at Gamers Nexus. I'm using flat-response headphones, the Sony MDR-7506, and the audio is fantastic
Thank you!
GN production quality is peak
Agreed, GN's audio quality makes them one of the few channels I can watch at 3x without having to use text to get all the information.
@@cajampa How do you even get 3x, with an extension or something? YT doesn't offer that natively.
Also that's LUDICROUSLY FAST considering how quickly Steve already talks. :D
@@jubuttib I use ReVanced Extended to get 3x and many more settings missing from the regular YouTube app.
In PC browsers there are addons for this too, but I'm not on my PC right now and I don't remember what they're called. There are a bunch of them in the Chrome addon store.
One thing that really helps with being able to pick up and follow the information at faster speeds is great neutral speakers; listening through lower-quality mobile speakers makes it harder.
And of course great recording and mixing like GN is doing.
Also, at least some basic treatment of the room helps too: if you are listening in a room with lots of reflections, your mind will struggle to follow and pick up everything from the noise of all the reflections when it goes really fast.
When it works great, it feels more like speed reading; you ingest the information almost more in chunks than word for word.
And when it does not work so great, you might need to "cheat" by having the text on.
Regarding power I just want to point out that for users in hot climates who game in summer with air conditioning, the power consumption difference is much greater than just the CPU consumption as you also have to factor in the added AC costs.
Same could be said for GPUs
Exactly. I'm so glad people are pointing this out. So sick of dismissive reviewers who exclaim "nobody cares about power draw". Not true.
Yup, I have a window unit in my room and it's on the same circuit as my PC because I have no other option than running an extension cord half way across my house so this is very useful info for me.
There's also the PSU requirement. While not always a significant cost difference, you'd typically want more power headroom on Intel. Granted, for AM5 the generally higher motherboard cost likely makes a slightly larger PSU irrelevant for new builds but it's another point nonetheless.
scandinavians be like: 💙
Once again top-shelf content, well done Gamers Nexus!
Thanks for the awesome video! That's exactly what I asked for, and I just got this video :D The only little addition I would've liked is a graph of idle power consumption. Sure, it's awesome that my room stays cool while gaming with the 7800X3D, but I'm 80% doing office stuff, and my 10600K only used about 12W compared to the 7800X3D taking ~30W. Doesn't matter that much, but I need to let fresh air in a bit more often now while working from home.
As an Australian with relatively cheap electricity, this still matters. Heat can be a major factor in summer, especially when you don't want to run AC. If you need to run it and it works harder, or you run it when you otherwise wouldn't due to heat load, that extra power can start to add up.
Same here in Texas. I don't use my oven during the summer while the sun is out. My AC can't catch up if I do. I turn off lights and keep the windows covered.
I run a 13900K in Australia, no problem cooling it
Heat can be a big factor if you live in a warm climate. My apartment gets sun all day and is not well insulated. Nice in winter, not nice in summer. I work from home, with the computer on all day and half the night, so those extra watts dumping heat into the room do make a difference.
@@Pushing_Pixels Even in a cold or normal climate it can matter a lot. As an example, in Europe most houses are built to preserve heat as much as possible. Running your gaming PC for a few hours already heats up your room pretty fast. And if it's not deep winter, you can't just cool your room down fast without using an AC.
@@Pushing_Pixels very much this. I used to live in an apartment which had sunlight all day just like you do, and upgrading from a 970 to a 1080Ti absolutely made a difference in the summer. Couldn't really get by without an AC after the upgrade.
Thank you for this! Funny, I have the 13900k and a 4090. I took a month plus away from gaming, and my energy bill lowered by 50%. I've never seen my electricity bill this cheap. When bg3 came out, I had never seen my electricity bill that high before.
Bg3 is amazing lol
Bill it to Greg. he seems like somebody who likes to throw money away. win-win
A simple underclock changes a lot. Also, if your bill lowered by 50%, you must have been gaming 12 hours a day, plus your 4090 draws way more power. BG3 is not really the game that taxes the most, either. So yeah... today on what never happened. Or you're getting screwed by your power provider.
@@Xfade81 bg3 does draw power if you max out
@@Xfade81 hi Greg 2
This is really great content! Thank you GN!
love the 3dfx box on top at the back
I think I remember from your video on Californian power supply regulation that idle draw plays a huge role in overall power consumption. At least for non-gaming use; I could imagine most systems run hot for 0.5s to load a program/webpage and then idle 99% of the time.
So I am glad you mentioned idle draw at the end and am looking forward to your tests on this.
Great memory! I bet not a lot of people remember that one. Will be doing idle over the next couple months!
@@GamersNexus Great to hear.
A fun aspect on the side: since I do a lot of normal office work on a daily basis (self-employed), for the last 5-6 years I have used a notebook with a dock instead of the big gaming machine for daily activity; only for games and heavier workloads (Blender, compiling, ...) do I turn on the big machine.
Living in Germany with higher prices for power, it does make a difference, since notebooks are extremely power efficient. In my case it made especially good sense, since I need a mobile device anyway for visits to clients. So a win-win.
@@GamersNexus I think Intel got this one; I have done extensive searching on this topic on the Internet. Intel 12th gen+ wins every time.
@@pradeepchinna5545Are you willing to bet your paycheck on that?
Man, Greg's really on a tear of headassery lately.
Someone needs to stop this absolute menace to society!
I love the new, snarkier Steve lol. The energy is just entertaining and informative... keep it up GN.
Superbly helpful information again, thanks :)
The issue with power is also the costs to COOL the chip. In addition to the cost to power the chip, you need a bigger cooler. Aaaand the heat goes into the house, which means in summer you also have to power an AC more to move it out of the house too.
But you get a good heater in the winter, lol
It's also a big deal for SFF builders like me, where cooling is already constrained
@@MrQuay03 use heat pumps. >100% efficiency.
@Postman00 COP is different from efficiency
@@MrQuay03 Not an equal trade-off with global warming, where winter is at best only 3 months long for the majority of people.
Nice work on this one. Power consumption and power efficiency are two of the most important yet overlooked aspects of this technology.
Sadly the optimal power target/limit was never shown, as they capped FPS for gaming, not power, which is not the correct way to go about it.
intel bulldozer moment lol
Some Intel fanboys state that the 14900K smokes AMD just because of raw performance, while it draws three times or more the wattage. That's what they never tell users, and also that it gets hotter while doing so.
@@MultiOlimac Well TBF some people don't really care about it if they have the money for a proper cooling solution. Most people just aren't going to figure power efficiency into a Gaming build. A server or workstation that's going to be running 24/7 and there's going to be dozens of them? yeah people are going to care then. But most gamers are just going to buy the CPU that gets them the best frames that's within their budget or other build constraints.
@@pino_de_vogel no. Why would you limit power for gaming? You want a smooth experience first and power efficiency second. Limiting power low enough to destroy performance or have a fluctuating framerate kinda defeats the whole point. No sane person buys a top-end CPU to lock themselves out of 60+ FPS gaming.
Limiting power for non-gaming makes way more sense, because you don't need a specific speed/framerate to get an enjoyable experience.
The new mouse mats look awesome. They do look a little too small height/depth-wise though; seems closer to a bar mat size, but I could be wrong. I'll have to check the sizes on your website
Great piece of testing and analysis
Steve, wow, the GN team delivers some of the best, most objective content on YouTube, full stop. And while the comment leading to this video was wild, the testing and analysis of power consumption and efficiency in real-world applications is incredible and eye opening. Thank you for this, and I hope to see this work in future videos for future CPUs and GPUs
A minute of silence for Greg's paycheck.
Thank you for giving us this awesome video Greg
Great comparison, well done.
This is the best CPU test ever. This should be presented to all people who want to build their new PC
I’m pretty sure when Amit had his 7 leftover X3D CCDs he never could’ve imagined how much of a game changer (pun unintended) a server part would become in performance and efficiency
Long term, instructions per watt would be pretty interesting. It helps highlight which company is actually improving their efficiency (and heat).
That was some very cool experimentation and data!
The money you waste on electricity can be deducted when you realize the Intel CPUs can be used as a dual-purpose heater for your home.
On the other hand, your power bill will double from the air conditioner during summer.
P.S. There is centralized heating during winter where I live, so even there an Intel CPU will make things worse by overheating the room to the point where the temperature forces me to open a window to cool it down
This just gave me a flashback to ancient times, when we used to make that joke about AMD CPUs xD
The pre-Ryzen era really was dark times all around.
We just need to run it at negative GHz so it sucks heat out of your room in summer
Hurry up and figure it out already, Science
@@greebj Just go to the BIOS and set a negative offset of 10GHz on your CPU and boom, your CPU is running backwards now! ggez
@@R4d1o4ct1v3_ dark days indeed, but now we are back to the Athlon days :D competition rocks!
Great analysis and comparison story!
Really loving the creative intros!
Well, thanks Greg for spurring some new testing from GN. Really interesting to see the efficiency between them for real world use cases.
You guys do outstanding testing for consumer products, with data and knowledge of how it applies. All while making it easy to understand. You guys rock.
Considering my basic, general understanding of computers, a shorts series or videos showcasing your testing methods and what they aim to achieve would be welcome. A lot of things go over my head or lead me into rabbit holes. Maybe that's why Google recommended the channel lmao
I think he's on to something here. There is far too much BS flying around in the industry and we need an independent, nonprofit organization to test PC components for safety, efficiency and performance. UL is a joke with their 3D Farce benchmark.
If anyone could get Intel off their butts and designing a modern architecture, I bet Steve can. The problem is they've been huffing their own farts for too long over there at Intel and they've gotten complacent. Now the entire world can see what BS artists they are.
This troll post that prompted all this new testing was undoubtedly the direct result of intel propaganda. Why put the effort into R&D when troll bots are cheaper and more effective?
It's not just the extra power used to run the processor; in places like here in Florida, you need to spend that much more power cooling the living area.
Glad to see different electricity prices included. I don't know our exact rate, but the average in the Netherlands is €0.40 / kWh. I might not game 8 hours per day, but we've got 2 gaming PCs that will probably add up to close to that 8 hours. We've currently got a 5950X and 5600 so I feel the difference won't be that noticeable, but it's good to know.
Did that test of 8 hours of gaming include idle usage? HWiNFO says that Intel uses less power at idle, but I have seen some from-the-wall figures that suggest this isn't the full story. So I'd like to see the numbers on, say, 12-14 hours of usage, with only 2 of those hours being gaming and the rest being browsing, video watching, and idling while you're away from the PC, and see which one wins there. I have heard that Intel uses less power at idle, so I always wonder at what point Intel would win overall for power usage if you leave your PC idling a lot, or do low-energy tasks for long periods, like watching lots of YouTube videos.
Nop
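The break-even question above is easy to model once idle and load power are known. A hedged Python sketch with invented wattages (not GN measurements; substitute real idle/load figures when they publish them):

# Daily energy for a mixed idle/gaming day: which system wins depends on
# how many of the powered-on hours are spent gaming.
# All wattages below are invented placeholders, not GN data.
def daily_wh(idle_w, gaming_w, gaming_h, on_hours=14):
    return idle_w * (on_hours - gaming_h) + gaming_w * gaming_h

for g in (0, 1, 2, 4, 8):
    low_idle = daily_wh(idle_w=20, gaming_w=250, gaming_h=g)  # low idle, high load
    low_load = daily_wh(idle_w=35, gaming_w=80, gaming_h=g)   # higher idle, low load
    print(f"{g} h gaming: {low_idle:5.0f} Wh vs {low_load:5.0f} Wh")

With these placeholder numbers, the low-idle system only wins below roughly an hour of gaming per day; real idle and load figures would move that crossover point.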
My favorite part of using efficient CPUs is that I can use air cooling without having the fans spin to audible levels while gaming. The electricity saving is a bonus.
Buy a good case? I haven't heard a fan spinning since 2015
@@meekrab9027 no spinning? then u need to visit a server room, then u get up to speed very quickly
Thank you very much for this test. One side note I want to mention: what would happen on a 14700K/14900K if you disabled the E-cores in the gaming tests? You showed us a test with the Intel APO tool; you can achieve the same using Process Lasso, for example. I personally don't think that would elevate those CPUs to another level, but it may be possible to achieve higher FPS values and/or reduce power consumption.
Just fascinating how efficient nearly all AMD processors are.
Keep up the good work ❤
I love the absolute sass that Steve has, makes these super informative videos very entertaining.
Thanks for owing them a paycheck, Greg! Not only was this video entertaining, it's also incredibly useful. Context: my eyes can't notice a difference on anything above 90 FPS, so I tend to look for more power efficient/cheaper CPUs, especially now that I live in a place where power costs fluctuate wildly and I also spend 9 hours a day at a minimum on my PC for work and use the same for gaming. I haven't had a top of market gaming setup since the first gen i7 because it hasn't been worth the investment to me.
Exactly this... I planned to build a 5800X3D + 4060 system for the low TDP and good performance.
@@matejboras9279 maybe you should re-evaluate other GPUs for that CPU choice; the only reason to get a 4060 is if the price is insanely cheap, otherwise you're wasting that CPU's power or your money
@@matejboras9279 FYI, the 4070 is a muuuuuuch better choice than the 4060 Ti 16GB, and it's probably the most competitive of all 4000-series cards, especially in the US, where it dropped as low as $515, while the 7800 XT is still $500
@@matejboras9279 according to TechPowerUp, the 4080 is actually the most power efficient (frames per watt); it's just too bad it's too expensive
@@LeLe-pm2pr A $350 4060 @ 115W vs. a $600 6800 XT @ 300W: that's a lot of money in your pocket at purchase and every month on the electricity bill. Both can run new games, and both won't be relevant in a few years... And if you're doing gaming only, then a $500 PS5 is probably a better option for 4K@60 gaming than a $2000 PC, and that PS5 will still be optimized for 4K until the PS6 comes out in 5+ years
Given he chose to carry on fighting a hopeless battle long after it was lost, Greg's Frieza icon fits like a glove.
Would love to see XMRig benchmarks and hashrates/efficiency across these same CPUs, to see if it matches these gaming tests
Good Job, Thank You and keep up the good work!
Thanks Greg's paycheck for sponsoring this test :)
Greg out there looking like the CEO of User Benchmark
I feel like more should have been brought up about AMD selling the 7950X3D as "the fastest gaming CPU"... then putting out a faster CPU for $300 less not long after
That was fun! Love the work you're doing, great nerd factor! Greg should thank his lucky stars...
It's crazy; 10 years ago AMD was in the exact same spot
Even about 7 years ago, pre-Ryzen. The fake core era. Intel was so far ahead, kinda disappointing to see where they're at now.
@@Qalibrated Things could have been different if they didn't sit on their butts, milking their customers' money for a few extra MHz for years. It's quite funny
Came here to write the same comment, and I also wonder... is Intel doing the exact same thing AMD did back then? Maybe they've got something up their sleeves and in the meantime they're pushing the old stuff harder and harder to buy time.
@@shanesgettinghandy Intel's big ace in the hole is... copying AMD. They're moving to chiplets (or "tiles" as they're calling them) and working on their own stacked cache solution. It's all quite amusing really, given Intel's previous malicious marketing about "glued together" CPUs.
@@Qalibrated welcome to the trap that is complacency