Ryzen 7 7800X3D vs. Core i7-13700K: The Best ~$400 CPU For Gaming?
- Published 30. 06. 2024
- Get the Insta360 Link here: store.insta360.com/product/li...
Amazon: www.amazon.com/dp/B0B826KS4B/...
Support us on Patreon: / hardwareunboxed
Join us on Floatplane: www.floatplane.com/channel/Ha...
Buy relevant products from Amazon, Newegg and others below:
AMD Ryzen 7 7800X3D: geni.us/FxBWD
AMD Ryzen 9 7950X3D: geni.us/BV9I9nH
AMD Ryzen 5 7600: geni.us/qGJVbZ7
AMD Ryzen 5 7600X: geni.us/Hn7m9
AMD Ryzen 7 7700: geni.us/6VP8wD
AMD Ryzen 7 7700X: geni.us/fAfH
AMD Ryzen 9 7900: geni.us/RRL3Fa
AMD Ryzen 9 7900X: geni.us/Rjlq
AMD Ryzen 9 7950X: geni.us/lTmCsO
Intel Core i3-13100: geni.us/DfDsan
Intel Core i5-13400: geni.us/lQZFpt
Intel Core i5-13500: geni.us/goRuq2
Intel Core i5-13600K: geni.us/lJgCa
Intel Core i7-13700K: geni.us/Gpuqh3
Intel Core i9-13900K: geni.us/AieGiL
Video Index
00:00 - Welcome back to Hardware Unboxed
00:32 - Ad Spot
01:30 - Test System Specs
02:07 - Hogwarts Legacy
02:35 - Hogwarts Legacy [RT]
02:49 - Marvel’s Spider-Man Remastered
03:07 - Marvel’s Spider-Man Remastered [RT]
03:24 - Fortnite [DX11]
03:36 - Fortnite [DX12, RT]
03:45 - The Last of Us Part I
04:03 - Star Wars Jedi: Survivor
04:28 - Halo Infinite
04:56 - Total War: Warhammer III
05:13 - F1 22
05:34 - Cyberpunk 2077
05:49 - Cyberpunk 2077 [RT]
06:05 - The Callisto Protocol
06:21 - Hitman 3
06:43 - Far Cry 6
07:00 - Shadow of the Tomb Raider
07:17 - Tom Clancy's Rainbow Six Siege
07:37 - Horizon Zero Dawn
07:53 - Assetto Corsa Competizione
08:17 - Resident Evil 4
08:34 - A Plague Tale: Requiem
08:49 - Watch Dogs: Legion
09:15 - The Riftbreaker
09:32 - Call of Duty: Modern Warfare II
09:49 - PERFORMANCE SUMMARY
10:00 - 1080p Average
10:49 - 1440p Average
11:13 - 4K Average
11:29 - Power Consumption
12:21 - Conclusion and Final Thoughts
Read this review on TechSpot: www.techspot.com/review/2700-...
Ryzen 7 7800X3D vs. Core i7-13700K: The Best ~$400 CPU For Gaming?
Disclaimer: Any pricing information shown or mentioned in this video was accurate at the time of video production, and may have since changed
Disclosure: As an Amazon Associate we earn from qualifying purchases. We may also earn a commission on some sales made through other store links
FOLLOW US IN THESE PLACES FOR UPDATES
Twitter - / hardwareunboxed
Facebook - / hardwareunboxed
Instagram - / hardwareunboxed
Outro music by David Vonk/DaJaVo - Science & Technology
Highly appreciate the inclusion of 7700 and 5800x3d and other resolutions
Yeah, me and my 6700 also say thank you! :-) lololololol #JustProcessorNamingIssues
@@AndrewTJackson Can't wait to upgrade my 7900 GTX to a 7900 XTX
@@Eidolon2003 *chuckle *
Very interesting how it's only 1 FPS off the 7800X3D. At about $200 cheaper, it's obvious the 7700 is just as good.
But I see a problem: he includes info that kinda shows what a massive rip-off the 7800X3D is while the narrative is jerking off to it. He's trying to laser-focus on the 13700K to warrant its price without even _mentioning_ that the i7 completely fucking destroys it the second you _aren't_ playing games. It's a one-trick pony that the 5800X3D does just as well for far less money.
Both thumbs up for including the 5800X3D in the charts. Much appreciated.
How was it? Is it worth upgrading from Ryzen 5 3600 to 5800x3D?
@@mongkolchaik5865 Without a doubt, yes. They're several generations apart.
@@mongkolchaik5865 What? Why is that even a question? A 5600X would have been a decent upgrade from a 3600, lmfao. The 5800X3D is now the only chip people should be buying on AM4.
You only have two thumbs?!
Yes it is, it was for me even a good last upgrade, from a Ryzen 5 5600X to the Ryzen 7 5800X3D.
im more impressed with the power usage of the 3d chip
that thing is crazy efficient
On load? I have it (7800X3D) and with 1.2 VSOC it uses 30W at idle. I hate that! So the damn chip on average uses 30W idle and 60W load (games)... I just can't swallow the idle draw...
@@Markisha64 Yup, same here. I had to turn that down.
It's also not worth it at all to OC it. It has a hard cap of 5.0GHz too, so no matter what you do (as a gaming user), most performance tricks are basically pointless. I'd rather have it at 2W when downloading updates overnight than 30.
@@Markisha64 lol turn off the igpu
Mine idles at 7 watts.
What you see is the CPU package power at idle.
The CPU itself doesn't draw much; the iGPU sucks like 20 watts at idle.
@@Deathscythe91 But can you turn off the iGPU? Or undervolt/Underclock?
@@Markisha64 My 7800X3D idles around 7-12W; as per the comment above, disable the iGPU.
Can't stress enough how much work you put into this, and you're still going with such strong passion. Thx for another treat of a video 👍
Long-time subscriber here. Am really liking your change-up of the filming setting, moving around the studio, keeping the scenes interesting, & such! 😊
I wish more reviewers included MSFS in their test suite. That is a game where you are still CPU bound, even with a 4090 at 4K. It is also a game where the 7800X3D holds a marked improvement over a 13900k. The cache is really beneficial in flight simulators, be it DCS, IL2, MSFS.
Also, Battlefield 2042 hammers the CPU harder than any other game I've played. Though I understand this one is hard to test.
@@mmadevgame BF2042 is pretty good in CPU optimization nowadays; they did magic a few months ago. I went from 70-80fps at low settings and 30% GPU utilization to 150-170fps with 100% GPU utilization on an RTX 2070 and 3700X after they did lots of patches.
Steve said before that he had a lot of issues benchmarking the game where sometimes it wouldn't even launch and he would spend days trying to fix it. Probably due to Microsoft store since they also had issues with Forza
Yep, old GTA V is also interesting with regard to CPU-bound situations.
I think another major one would be the Arma 3 crowd. I've seen massive gains from the 5800X3D in lesser-known reviews, and I'd love to see the results these guys could bring to the table.
Love the fact that my Ryzen 7 5800x3D is trading blows with the most recent CPU’s
Edit: I love that a majority of us in the comments agree the Ryzen 7 5800X3D is a great CPU for its price. I'm gonna bet that next year the Ryzen 7 7800X3D will take its rightful place as the next bang-for-the-buck CPU coming 2024.
same here, was thinking about a 7800x3d but it seems pointless at 1440p
5800x3d IS a recent cpu.
@@gimmy2294 You're right, but it's considered last gen. It was released in April of last year, so not as recent as the Ryzen 7 7800X3D. Plus it doesn't have the ability to use DDR5 RAM. So basically the fact that it's trading blows with more advanced CPUs makes me happy. It's definitely a bang-for-the-buck CPU in my opinion.
I put the 5800X3D in my cousin's new build as an Xmas gift last year.
He's a long-time Intel fanboy and was second-guessing my build choice. After some benchmarks and his user experience, I think it's safe to say his opinion on AMD has changed.
Personally I don't mind either AMD or Intel; I've owned both. The platform longevity with AMD does win me over in most cases.
@@alexs.2069 Agreed. I'm gonna help build my friend's computer and was gonna send him a list of CPUs he should get, but I just might make him get a better CPU than the 7800X3D if it's in his budget, tbh.
Your style of videos, showing the differences with REALLY comparable parts AND at all the important resolutions per game, is by far the best way to do this.
Absolutely great videos from you.
I really love it ^^
I really value the page where you presented the average results for each resolution (1080, 1440 and 4k). Many of the other benchmark reviews will only do individual games which makes it very difficult as a consumer to understand the relative overall performance. Expanding further and speaking to the total cost of the board + ram, architecture longevity, etc was also excellent.
Really appreciate that you test so many games. Lets the viewers decide what to buy based on the games they actually play
Loving how the studio is turning out! It looks amazing!
Omg, a gaming hardware YouTuber acknowledged the benefits of Quick Sync. Good shit guys, genuinely. Sets a good precedent for the quality of your coverage.
I just upgraded from a 1700X to a 5800X3D for £268.00 and am very happy with it. Probably gonna be on it for 4 more years till I move to AM6.
If you wait 4 years. Go to AM6!
@@haukikannel Yep, you're right, that's what I meant to say :)
You got a deal! However, you stuck with that old 1700 too long. The value of AM4 is you can upgrade the CPU every 6 months and sell your old one on eBay. You missed years of much better performance by not installing a 3700. PS, YouTube is such an arse putting a 4-digit number after your name when it's already unique. Are there really 7150 Waylands on YT?
Gotta love that power efficiency coupled with performance
Thank you for the 4k results much appreciated!
SUPERB comparison with details!
as usual ;)
Great overview!
I like the new inclusion of 1440p and 4K, as it kinda confirms what you have been saying, and it makes the data feel more complete.
But just curious why you decided to include them now? Before, you were excluding them due to the extra work.
Was it a case of "extra testing > keep responding to the same comment"?
Maybe it was because, even though Steve has explained multiple times in Q&As why 1440p & 4K are not needed for CPU reviews, ID10TS keep saying they don't believe him,
or they are Intel fanbois upset that AMD has the better gaming CPU, so if the results are GPU-limited they can say "See, there is no difference".
This is not a CPU review though. I might be wrong here, but didn't they always test 1080p, 1440p and 4K in CPU comparisons?
Very much agree. I get that most games at 1440p and 4K are just GPU-bound so CPU tests show no difference, but we could actually see noticeable changes at 1440p here, and it is good to be reassured that 4K is evenly taxing on all CPUs and there are no anomalies.
I use 7800X3D, and with tuned 6000 memory and curve optimizer, this CPU is crazy for gaming!
Nice! I'm building (like literally now) my new PC with the 7800X3D. After updating the BIOS, I want to try and mess with the settings a bit. Did you only touch CO or use some other manual setting? Also, do I need to update the mobo chipset driver as well to enable the 3D cache? Or is there some CPU-specific firmware to download and update?
Sorry for bothering, but I really don't want to fuck up the mobo or the CPU, given how finicky their interactions seem to be based on reviews.
TY
I agree, it's a very good choice for gaming, but if you do any, and I repeat any, productivity tasks, the 7800X3D is ass. Intel is still the best all-arounder.
This comparison came right on time for me, thanks!
just want to thank you all for doing all of this great testing / benchmarking!
God knows where you find the time for round ups this extensive, but we're ever so glad you do. You guys are earning that million subscriber mark that's just around the corner!
Thanks mate, basically having no life is the key ;) Little time for fishing though :(
@@Hardwareunboxed I think you'll find extra inspiration to hit the water real-soon-now(tm). Shame about the weather though! It'll be South-West-Rocks-o'Clock before you know it!
Hope so, can't wait to get back out there :)
@@Hardwareunboxed I was talking up SWR to Fynn on the weekend. He got pretty keen on it. We might have to tag along next time if there's room for 2 more
Sounds good :)
Great data, as always! By the looks of it, my Ryzen 7 5800X3D will be a beast for many years (and I only paid 320 euros for it, on Black Friday, 8 months ago).
Not really, depends on the GPU
Same here
@@chriswright8074 Not sure what you mean by that. I'm guessing you are referring to situations where you are GPU bound and having a more powerful CPU would be overkill. Well, yeah, of course, you obviously also need a pretty beefy GPU for a good balance. Furthermore, it will also depend on the games you play. In my case, I paired the Ryzen 7 5800X3D with an RTX 4070Ti (don't judge me, I also need it for work 😛), and they are a match made in heaven.
for 2 years maybe
not exactly a beast, but it'll be more than adequate for a long time as long as games are optimised reasonably well
Great stuff, Steve. Suggestion - on your comparison charts, can you add the difference between the lows?
Highly informative, as always.
Really like these useful side by side comparisons for stuff that you would actually consider buying.
Assuming you're looking for an upgrade. It's very useful.
Except it completely omits the massive difference between the X3D and the i7 the second you _aren't_ gaming. At least it shows how close the 7700X and 5800X3D are in gaming performance, but it also doesn't at all mention that they're stupidly cheaper. Every bit of info he focuses on is solely to put the 7800X3D in the best light possible. In reality it's a freaking terrible comparison that just looks like an ad for the 7800X3D.
@@zodwraith5745 I mean the title of the video specifically states best CPU for gaming, not best overall. The 5800X3D was insanely popular, and I just checked Mindfactory and the 7800X3D is also very popular by a huge margin (fastest selling CPU). Most people don't really care about multi-core performance, they just want the fastest gaming CPU. I'll admit I'm one of the people who do care about the multi-core stuff since I need it for workstation programs, but for most people it's irrelevant and 8 cores is fine.
Would have been great if you included a few productivity benchmarks for those of us who use our systems for both work and play.
That's hugely dependent on your workflow, though. I work a lot in Revit and Unreal Engine, so the 7800X3D is a better option for me. However, other programs would favour the stronger multithreaded performance of a higher-end Intel or non-X3D AMD chip.
Liking this new format Steve!
I like the studio now ~ Also appreciate the info.
I really like how you had a regular Zen 4 CPU in the list, so you can tell which titles favor AMD over Intel or vice versa. This sort of thing makes your comparisons second to none in the techtube space.
I do feel like it's always worth at least a mention of the non-gaming workload advantage, even in a gaming CPU test.
If you want gaming plus productivity performance, you'll go with the 7950X3D. It's 2% faster than the 7800X3D, and it has much better productivity performance.
I'm surprised how well the 5800x3d still holds up against these new chips
Isn't the 5800X3D only a year old?
It's AMD's i5-2500k and much like it, will probably be competitive for 6-8 years.
@@JGStonedRaider Nah, the Ryzen 7 5700X takes that spot.
The i7-12700K is currently faster than the 5800X3D when both are paired with an RTX 4090. The latest benchmark I watched with the 12700K vs 5800X3D was LTT's 13th-gen coverage, where the i5-13600K beats both as well.
I mean it's an 8 core chip just like the 7800X3D, with almost the same amount of cache
@@pixels_per_inch The 7800X3D is the way faster 8-core chip.
Rarely will you see a 5800X3D at 4.6GHz all-core in games, more like between 4.2GHz and 4.5GHz.
The 7800X3D has Zen 4 IPC and clocks.
Great video as always and really loving the set
i was waiting for this, thank you.
I'm currently planning on upgrading my system (5600X). I was originally leaning towards the 13700K, however in the end I opted for the 7800X3D for the same reasons you touched on. Primarily it was the power draw. In Europe electricity prices are quite steep, and it's most certainly an important factor to consider over the life of a system.
It's not just power draw. It's just unpleasant to have unnecessary heat dumped into your room, especially during summer. Gaming GPUs and CPUs should be optimized for low power use, not be 500W firehoses that set the room on fire.
That's why 7800X3D is a banger. It's relatively low power with really high performance for its niche. 4090 and 4070 are good for similar reason (4090 is certainly not low wattage), power efficient yet also powerful.
See my post above then... Given a few basic steps, the 7800X3D can perform exceptionally while sipping power, and it's extremely easy to set up.
@@noway8662 How is the 13700K if you underclock it? I have seen some people claiming they can reduce power consumption by at least 100 watts while only losing about 10% or less of the performance. Never tried stuff like that personally, but it really looks interesting seeing people make these claims.
did u go with the 7800x3d? if so how is it?
A fire hose doesn't set houses on fire. Please avoid becoming a firefighter if at all possible 👍
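The underclocking claim above is easy to sanity-check with a perf-per-watt ratio. A minimal sketch, where the fps and wattage figures are illustrative assumptions rather than measured values:

```python
# Hypothetical perf-per-watt check for the "-100 W for -10% performance" claim.
# The fps and wattage figures below are made-up illustrative values.

def perf_per_watt(fps: float, watts: float) -> float:
    """Frames per second delivered per watt of CPU package power."""
    return fps / watts

stock = perf_per_watt(fps=100.0, watts=250.0)    # assumed stock figures
limited = perf_per_watt(fps=90.0, watts=150.0)   # -10% fps, -100 W

print(f"stock:   {stock:.2f} fps/W")          # -> 0.40 fps/W
print(f"limited: {limited:.2f} fps/W")        # -> 0.60 fps/W
print(f"gain:    {limited / stock - 1:.0%}")  # -> 50% better efficiency
```

Under those assumed numbers, the power-limited chip delivers 50% more frames per watt, which is why a small performance loss can still be a big efficiency win.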
That 3D V-cache is really doing work. I'm extremely pleased with my 5800X3D. While I don't see bonkers FPS like with a 4090 my 1% lows really get pulled up and the play is smooth.
My 7700X and my 7900XT have my gaming performance bouncing off the top of my 3440x1440 QD-OLED from Alienware. I recently used Precision Boost Overdrive and a couple of other BIOS options to undervolt my CPU. It now has a higher top end and runs 15 degrees cooler, not to mention using 20 fewer watts at max. My GPU is OC'd to 2850MHz, which is nothing special, but putting it all together makes me one happy PC gamer. You guys at Hardware Unboxed, keep up the good work!
Best review as always!
Really appreciate the gaming power consumption graphs in this comparison, and I hope they'll be there in future CPU reviews. Sometimes it isn't clear how power consumption during gaming compares to the 100% load tests.
In a stress test, the Intel chip would consume more and more power.
It's better to have idle power consumption imo. AM5 CPUs pull 30-40W constantly while idle.
@@storyanaksekolah2 Well, it also has almost twice the multithreaded performance, you gotta remember that as well.
It's nice knowing considering that I run my desktop off a portable power station.
@mrjohnhung412000 Idle isn't that important these days, at least for me.
I only turn my PC on to game or run a heavy load; for the rest I use my laptop or phone and leave the PC off.
Can't beat "off" in my book, and my OCD goes crazy if something is just on for no reason.
My 5800x3D+3080Ti PC idles at around 20 to 28 watts for the CPU alone and 22 watts for GPU.
Would love to see a roundup of good choices of motherboards for the 7800x3D. Never felt this hard to choose a good MB with the right features, without wasting money on pointless fluff.
Well, they have a video on B650 boards. I bought a Gigabyte B650M Gaming X AX for my 7800X3D. Updated the BIOS before installing the RAM or CPU, and good to go. Works without issues.
I've just bought an MSI MAG X670E Tomahawk WiFi to go with my second-hand 7800X3D. Lots of features, mid-range price, and it will accept all AMD CPUs for at least the next 2 years.
Just bought the 7800X3D... Sooo excited... Thx mate for all the reviews and the time you take to make them.
Just ordered a 7800X3D to upgrade from my old i7-6700K. So keen for it to arrive! Thanks for your reviews Steve!
Awesome work. Will you guys one day get your hands on the elusive 7900X3D for a full 3D showdown across the stack?
They have it, it's just not very good for gaming. It only has 6 cores it can use when running games since it has to park the other six.
If you're picking a 3D chip, really the only choices are the 7800X3D or 7950X3D. And the 5800X3D as well.
@@TheMightyMuch That shows how much you know. I have one and it's fast as f**k in capable hands. It spanks my 5800X3D. With PBO2 enabled, +200MHz sees the V-Cache cores boost up to 5150MHz.
The scheduler does a great job. All 12 cores are always accessible. Depending on the game, only the V-Cache cores may be in use; other times the normal, faster cores will be used, and sometimes all 12 cores, for example in F1 23.
@@Bang4BuckPCGamer Frame Chasers was not impressed, said you have to park half the cores.
It's impressive when the 5800X3D beats the latest Intel.
@@Bang4BuckPCGamer I was under the impression that the remaining cores are parked. If they've changed that since the initial reviews, that's very good then.
Something I would love to see is a video about the 7800x3d vs the 7900x since they are competitive in a few areas I think.
Same price but the 7900 clocks faster and has more cores. I'd expect it to be the better all rounder.
GN called it a fraud, but you're probably right.
The new studio is so cool. Love all the new backdrops in recent videos!
Love how simple and neat this comparison is.
I agree - the 7800X3D gets my nod. But for anyone on AM4, esp. on a 300 series board and esp if they've had the system for a good while, I say check for BIOS support and likely the best bang for buck is to go with the 5800X3D - it's awfully hard to knock that CPU's value as an upgrade option.
It also has the benefit of being a stable platform. AM5 needed more time in the oven.
@@Aaron-ui9tj I really don't consider AM4 to be a stable platform. I've never had so many issues running RAM at its rated speeds, or had to fuck around with voltages because my motherboard kept setting insanely high SOC voltages, which has been discussed on Gamers Nexus as a real problem slowly frying some CPUs on some chipsets if you aren't careful. I went through 3 different AM4 motherboards, the last one being fairly expensive, then switched over to Intel a month ago, and boom, not a single problem any more.
@@Yes-sw8gh that's case by case, I did not have any problem on AM4 (B450) but had problems on Z170 with hynix memory that would not work on XMP without tweaking, which is why I went with AMD after the 6700K. Perhaps I made the mistake with the ASRock board on Intel. I0ve heard that B350/X370 had a lot of memory problems though. The safe bet is always to skip the first generation of boards and cpus.
@@marsovac Amen! And definitely research your motherboard before you buy it; that's a lesson I've had to learn as well. Back in the day motherboards were easy: you just bought a reasonably priced one and everything would run how it was supposed to. Nowadays, cheap VRMs on a motherboard can be your doom, for example.
I have an ASRock board for Intel now, and it actually works pretty well so far. Let's hope I don't have to change my mind on this board down the line haha.
@@Yes-sw8gh Ok, well I wasn't aware of that. I skipped AM4, but hopped on AM5 and it has been a less than desirable experience.
When it comes to power consumption, IDLE / DESKTOP numbers would be very interesting as well; most systems sit idle for many more hours per day than they spend gaming.
if you care about idle power you should turn off your PC and buy a netbook or a tablet
bruh just put the PC on sleep or hibernate...
That is honestly lacking in a lot of these videos. I recently built a machine with a 7900X and I was blown away that it used ~40W idle, quickly jumping to 70W on medium "office" loads. I wasn't exactly happy with that, and since I was also having stability issues (system BSODing), I refunded the CPU and mobo and went with a 13700KF, which uses ~10W at idle and barely breaks 50 even in non-demanding games.
So maybe my CPU was extremely broken (which is entirely possible with BSODs and all), but if not, then I don't really consider the 7000-series Ryzens more power efficient for general use. At full load, sure, but in "normal work" I would give that prize to Intel (comparing the 13700KF and 7900X).
@@GewelReal Some people run VMs/file servers as well. It matters, just not to you.
Well, for that AMD is again killing it. The efficiency is staggeringly good, and it's even more killer if the X3D is running in Eco mode.
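The idle-versus-load debate in this thread comes down to where the daily watt-hours actually go. A minimal sketch, using made-up wattages and hours (the 30W idle figure echoes the complaint earlier in the comments, not a measurement):

```python
# Daily CPU energy split between idle and gaming hours.
# Wattages and hours are illustrative assumptions, not measured data.

IDLE_W, GAME_W = 30.0, 60.0   # assumed package power at idle / while gaming
IDLE_H, GAME_H = 10.0, 3.0    # assumed hours per day spent in each state

idle_wh = IDLE_W * IDLE_H     # Wh per day spent idling
game_wh = GAME_W * GAME_H     # Wh per day spent gaming

total_wh = idle_wh + game_wh
print(f"idle energy share: {idle_wh / total_wh:.1%}")  # -> 62.5%
```

With those assumptions, idle accounts for the majority of the daily energy even though its wattage is half the gaming draw, which is why some commenters care about idle numbers and others (who power off) don't.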
Damn good review, needed to see it
First Ad ever I watched the entire part. Great job presenting the product. Also thanks for this great comparison of the CPU's and their performance. Lots of work put into it, and it shows.
Another great video. Can you please compare the i9-13900K to the Ryzen 9 7950X3D for gaming with all the latest software? Have AMD worked with Microsoft to improve how the appropriate cores are used for gaming or productivity? Like, will the 7950X3D now automatically use its 3D V-Cache cores for gaming and let you run video capture and other apps on the non-3D cores in the background at the same time? Thanks.
AMD has worked with Microsoft on that. But for gaming as your primary use, the 7800X3D rules over its big brother and is far more efficient.
Now, that said, if it were a gaming AND work system where pure raw clocks matter more than latency, then the 7950X3D is the better option, and that's why AMD used the 2 different chiplet types in it.
Short answer: no, AMD hasn't done anything to improve thread scheduling on Ryzen 9 7900X3D and 7950X3D. You still need Game Mode enabled or fiddle with Process Lasso to set core affinity.
@@offspringfan89 that's awful - you would think that AMD didn't want to sell them!
Yeah, it's obviously a crap processor.
Those power figures are even more relevant now, considering that electricity prices have increased over in the EU, and here in Australia. The cost to operate a PC over the course of a year could differ significantly depending on what platform and CPU you are running; it could range from several dollars to hundreds of dollars yearly.
Why did the prices increase? Oh yes, lets sanction Russia and support corrupt hellhole that is Ukraine, slava ukraini guys
Oddly, in some niche circumstances like a file server, the Intel CPU might be more efficient, as its idle watt draw is much less, like 3-9 watts. My CPUs get passed down to my file server stack when their desktop duty is done.
@@jeremytine I would just take an i7-12700K over a 13700K.
The 13700K uses 57% more power in Cinebench R23 for 25% more performance.
@@jeremytine My i7-12700K already idles at 5 watts, and all-core power is way, WAY lower than the 13700K.
@@saricubra2867 I'm considering it for a new build. Microcenter has motherboard + RAM + 12700K for $350. It meets my current needs, but I'm also doing a custom loop, so I'm debating longevity vs price.
Epic ! Thank you sir!
amazing video as always :D thanks a lot
Hats off to AMD for coming this far. They really are taking the cake. I mean, Ryzen 5000 shook the world, and the X3D just snatched it. Even now the 5000-series X3D still performs like a generational leap. Amazing!
The 5800X3D looks the perfect (value) option for an RTX 4090 and a 4K monitor. Someone with an AM4 platform doesn't really need to upgrade to build such a high end setup for 4K gaming.
And even if they are new to the AM4 platform, it's cheaper than going with AM5. DDR4 and mobos are so affordable now.
I find it comical that someone who can afford a 4090 is penny-pinching about saving money on a modern CPU. That's like buying a Lamborghini and complaining about the price of fuel. To each his own, I guess.
@@JailBr3ak3r77 Of course someone who is penny pinching might be able to buy a 4090 instead of a 4080 or a 7900xtx instead of 7900xt.
People who are bad with money don't tend to have nice things, shocker.
@@JailBr3ak3r77 Well, the GPU market is pretty boring and we all know the 4090 is the best, but that doesn't mean you need the best or newest or most expensive CPU to pair with it; 5800X3D/4090 is a combo people use.
@@puffyips Fair enough.
Wow, the new Studio looks amazing!
Love the new background in the intros.
As someone looking to "upgrade" to 5800X3D, thank you Harbor Boxed!
@@tilapiadave3234 It's $280 USD and delivers similar performance to the latest Intel CPUs. It's absolutely not overhyped lol.
Just upgraded from a 5800X to a 7800X3D and I'm blown away by how much of an upgrade it is: 50%+ more FPS in simulation games like MSFS 2020 and Assetto Corsa, and it uses less than half the power. Truly an epic gaming chip.
I'm upgrading from a 10700F, very exciting times
Same situation for me, but I'm stuck between the 7800X3D and the 13700K.
@@RainDrpz what did you end up going with, I’m trying to decide between the two
Same. I am now leaning towards 7800x3d because of am5, L3 cache and power efficiency. @@ethaneaves_04
@@ethaneaves_04 I used to have a 13700K and now I've changed to AMD. Intel has had the same IPC for three generations and the new CPUs just aren't exciting except for multithreaded workloads. And trust me, I use a lot of programs that utilize a lot of threads, but my 13700K acted strange with the E-cores and I got better performance locking them away and boosting the clocks on the P-cores for music production.
Video editing times took a little longer, but we're talking 3 vs 4 minutes; yes, it's a sizeable amount, but that extra minute doesn't matter to me tbh.
And the 7800X3D is about the same speed as a 13700K with the E-cores locked away, so that's no loss.
My DAW is happy though; for some reason I had weird latency issues with the 13700K, even with the E-cores locked. Dunno what that's about.
I'm actually in the market for a CPU upgrade, since my 9400F is getting beat up in the latest game titles. So this was valuable information! Thanks!
Thanks for all the reviews/benchmarks. Hopefully you have time to do the 7950X3D vs 13900K. I'm curious if AMD has managed any improvements with the scheduler to prevent those negative outlier cases. Either way, it seems the 7950X3D is quite impressive, being at the top in gaming and close to the top in applications while being extremely power efficient.
Great W for the 7800X3D: so good that not even UserBenchmark has written anything about it.
Lol 😂
It would be fun to see them write something about the product since it is very much the best gaming cpu you can buy atm
You're telling me they didn't die out yet?
What good is information when it's THAT biased...
Honestly, the 7700 actually looks really good. For about USD 150 less, it has about 10% less gaming performance than the 7800X3D, and I'm sure it can be tweaked to get even closer. That money can go to a better board, RAM, or even the difference between a 7900 XT and an XTX.
This is the kind of content we need! Blessings
Great work
The power consumption difference shouldn't be underestimated. Not sure what electricity costs in the US and Australia, but in Europe it is expensive: 1kWh costs around 40 cents. Depending on your usage and the games you like, you are looking at an 8-10€ difference a month, which makes 120€ a year. Given that you don't change your CPU every year but keep it maybe 2-4 years, and electricity is not getting any cheaper, you are looking at a lifetime extra cost of almost 500€; if you use it longer, it will be even more.
@@Matias-wu6xp it is not a lot for me, but why waste all that money, you could buy games from it, a better monitor...etc
@@Matias-wu6xp 120€ a year; if you keep your PC for 5 years, that's 600€, from which you could buy a new CPU + mainboard down the line... There is no point in getting an Intel chip right now if all you do is game.
@@Matias-wu6xp That doesn't make much sense, if you got money why waste them? I don't get it
@@Matias-wu6xp it adds up. 120 here, a 100 there, 150 this, 80 that, etc.
😂yea, cause extra $5 a year is huge difference.. 😂😂
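The disagreement in this thread is just arithmetic over different assumptions. A minimal sketch for checking either side's numbers (the 100W gap, hours, and tariff below are placeholder assumptions, not measured figures):

```python
# Yearly electricity cost of a given extra power draw.
# All inputs are assumptions; swap in your own wattage gap, hours and tariff.

def yearly_cost_eur(extra_watts: float, hours_per_day: float,
                    eur_per_kwh: float) -> float:
    """Cost per year of drawing `extra_watts` for `hours_per_day` daily."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * eur_per_kwh

# e.g. a 100 W gaming power gap, 4 h/day, 0.40 EUR/kWh
print(f"{yearly_cost_eur(100, 4, 0.40):.2f} EUR/year")  # -> 58.40 EUR/year
```

Heavy daily use at EU tariffs lands in the tens-of-euros-per-year range per 100W; light use at cheap tariffs really is only a few dollars, so both commenters can be right for their own situations.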
I thought about just swapping in a 5800x3D but decided to go with the 7800x3d. After selling my old parts (cpu,mb,ram) the final upgrade cost was down to $275 USD which was well worth it.😊
why?
@@orkhepaj Why upgrade? Isn't that the age old PC dilemma? 🤓
@@JCTiggs1 5800 is not old enough to upgrade
@@orkhepaj I didn't upgrade from a 5800 🤔
As a current 5800X3D owner, thank you for including it in the matchup. Seeing it on some of the charts beating 13th-gen hardware, and at times matching it, shows me it was a good choice. I don't have mine paired with a 4090, so I am losing out, but for what I have, a 2070 Super, it is performing at its max. At full tilt, with my Eco adjustments, I hit about 90W CPU package power, idle sips at 25W (would love to see lower, meh), and my top two cores hit 4.5GHz with all-core between high 4.3s and 4.4GHz. Temps reach almost 75C, but the PC needs cleaning, so, yeah. I can leave it on 24/7 because it costs about as much as a high-tier streaming subscription a month, about $20, at its highest-cost kWh.
Really loving this new look, guys; it looks a lot more dynamic than just sitting at the table.
It's a hard choice: an Intel i7-13700K with a Z690 board and DDR5 costs around $825, while an AMD R7 7800X3D with an X670 board and DDR5 costs around $1030.
People that pick intel over AMD 3d chips are nutcases.
The AMD circlejerk on the internet is crazy. Intel is still the preferred choice for many and is used in most laptops. AMD was always the janky choice for enthusiasts or overclockers.
Sadly, Intel isn't competitive right now, and ARL is still 18 months away.
I wish there was more competition.
@@tilapiadave3234 Sadly Intel fails to make competitive CPUs.
AMD has the expertise on that.
@@tilapiadave3234 Tell this to all the 4090 owners.
@@tilapiadave3234 some people want the best, I can't blame them
Fantastic video, thank you.
Small production value note: I'd reduce the brightness on the TV to get the levels more normal looking for your opening shot. Otherwise I love how production values have continued to be elevated on this channel. Things look better every year.
The 7700 is looking like the best overall
I'd love it if you can also test the newly released 7940HS. I'm hearing it's quite good.
great comparison!
Great video as always. If it's not much to ask, Steve, would you please include ddr4 results for Intel in the next videos? Just to learn how much performance on the table there is for people who opted for ddr4.
As I'm building right now, I did my research. DDR5 is absolutely unnecessary for gaming and not worth it as of today. Even in production workloads, you'd need very specific use cases to make it worthwhile. And no idea if DDR5 is future-proofing anything; it's just way ahead of anything right now. With my mixed usage of gaming/editing, I'll go with DDR4 even in a new system to save some money.
Every time I see a CPU comparison, I just sit and think WOW, what a brilliant move it was to buy my 5800X3D at $300 US. I could not be in a better situation at this moment in time.
How can you get that for 300? In my country it's at least 510.
Got it for $289, USA bro
I was contemplating these two when I built my rig a few months ago. I went with the i7 and have no regrets. At the time the motherboards were much more expensive and then came the x3d chips melting motherboards. I did however get an AMD GPU and have no regrets there either.
There should be no regrets with any modern CPU. YouTubers really exaggerate the differences between them when they're so similar; it's just personal preference at the end of the day.
@@Matias-wu6xp I wouldn't call an 11% (sometimes as high as 20+%) difference in CPUs exaggerated results; that's pretty much an entire new generation. It's a bigger difference than between the 12700K and 13700K.
@@ThunderingRoar You make a valid point, but those rather large differences were at 1080p. Who is buying pretty high-end chips in 2023 and playing at anything less than 1440p?
@@theripper121 When using DLSS/FSR/XeSS, your GPU renders at lower internal resolutions like 1080p (for the Quality preset at 1440p and Balanced at 4K) before upscaling, so these lower-resolution results are still very relevant.
@@ThunderingRoar Maybe a fair point
That Room looks awesome !
Hardware Unboxed.... the go to place to work out what to buy. Always come to you guys first. Thanks for your stellar work :)
The 13700K is really more about productivity. I might have gotten it had it not been the last for its platform (got the 7800X3D instead).
Technically not last most likely but everyone has parroted media outlets so much it doesn't matter.
@@Daisudori You're probably still better off buying a 7900/7900X instead of the 13700K for productivity: lower power, same speed in production, more upgrades possible :)
@@SpoonHurler I agree... because if you're on a hard budget you'd probably get a 13600K, and if you don't have a budget you just get a 13900K; the 13700K is in no man's land.
@@PineyJustice Funny enough, the 7950X is actually faster than a 13900K in a lot of real-world workloads; Cinebench isn't a real workload except for a very, very tiny percentage of people. In stuff like code compiles, video transcodes, and compression/decompression, the 7950X does very favorably with only two-thirds the power. Lastly, compression/decompression is particularly important, even for gamers, as that's what you're waiting for every time you install a game, a game is loading, or assets are being streamed in, causing stutters in UE4.
@@PineyJustice There are a bunch of professional programs that run better on Intel, and it can utilize higher-clocked RAM as well.
I guess if you game at 1440p to 4K it really doesn't matter. But if productivity is my profession, I think I'd go with a different Ryzen CPU or the i7.
Another great video
That practical ad was kinda neat.
Good video. Based on these results the 7800X3D looks like the better option. The power efficiency of the 7800X3D is really impressive compared to the i7-13700K. I'm into SFF builds, so power consumption is an important consideration for me and 100W is a big difference. It would be nice to see a similar comparison with the i5-13600K. That is probably a better option if you're building a gaming PC and want to use an Intel CPU. I think the i7-13700K is overkill for gaming.
Notice how most of the games are made for AMD consoles 🤔 Hogwarts, Spider-Man, Watch Dogs, Halo, Plague Tale. That doesn't seem very fair 🫣
Intel is always king, haters will hate
Intel hasn't been king in the CPU space for some time now; AMD is!
@@Gay-is-_-trash Yep, Intel is king in power consumption and getting smashed by AMD.
But I hope they will become competitive again with Arrow Lake late next year.
@@mimon6706 Intel seems better in the productivity space for most of their CPUs compared to their AMD counterparts.
@@test_account939 I have to disagree. The 13900K and 7950X offer the same performance in productivity. The 13700K and 7900X are also equal.
But AMD is always far better in efficiency and therefore in cost/value.
200 W more power usage seems insane. That could require a more expensive PSU and add about $40-50.
1 game lol.
Although 100 W is also big. For me it doesn't matter for the PSU, but rather for needing a bigger cooler and more noise.
also more expensive electricity bills, more cooling needed, and more noise
@@user-wq9mw2xz3j 1 game, but we don't know if or when the next popular game will come out with a similar power usage disparity.
@@user-wq9mw2xz3j You need a better cooler for all that power; 100 W more means more heat. In the long run it will wear out faster than AMD. This isn't even their best CPU, and it already consumes that much power just playing games.
@@nguyenhanh9479 You also have to consider that having more cores, even if games don't use the extras, means they'll always consume some power just by being enabled. This was just a gaming benchmark; you have to consider that the 13700K is also almost twice as fast in multi-tasking benchmarks.
I like the standing-up Steve!
Thanks Steve!
For pure gaming only, the 7800X3D is the obvious choice, but as soon as you do some work with the machine, the 7900 or 13700K seems like a better buy: the 7900 for more number crunching, the 13700K for video and photo work.
AMD all day, every day. A cool system in the summer is a quiet system in the summer. AMD is just more for the money spent.
Awesome. Love seeing the man cave! It looks great!!
I feel more impressed by the 5800X3D's performance. I mean, it's the CPU I have so I'm a bit biased, but the fact that it does so well against the new gen CPUs is really amazing.
Really interested in how you managed to get 200 W more compared to the 7800X3D. That would mean the 13700K alone was drawing ~250 W while gaming? :)
Not even my overclocked 13900KS draws that much in TLOU at 1080p.
It's not just the CPU; the entire system needs 200 W more.
But Steve mentioned in his review that the 13900K and 13700K thermal throttle even in gaming when using a 360 AIO, so there's definitely a lot to it.
@@mimon6706 Did you even read what I said? I'll explain. The 250 W comes from the fact that if a 7800X3D is drawing around 50 W and Steve says the system with the 13700K is drawing 200 W more, then the 13700K ALONE must be drawing 250 W while gaming, which is why it's extremely questionable.
@@Yahfz No. It's the power draw of the entire system. Not just the CPU itself, as Steve explained in the video.
@@tilapiadave3234 No worries! Fortunately, there's no 4090 in my system.
@@mimon6706 OK, so you can't read even after I explained it to you. Have fun, mate!
Thank you, much appreciated, the only channel I trust.
Great analysis as always. The 7700 is just insane value if you're looking for high end gaming on a budget. Thanks for including it!
Can you explain to me how exactly the 13700K pulled 200 W more power in TLOU? I know the X3D draws around 70 W in that game; are you telling me the 13700K alone was pulling 270 W to end up with that big a gap?
TLOU is actually one of the more threaded recent games and will heavily utilize your CPU. Considering the 13700K will hit big numbers like 300 W and the 7800X3D is almost always under 100 W, it's certainly possible.
@@PineyJustice I know; I've tested TLOU extensively on my 12900K and never seen it above 110 W. There's no way it's hitting 270 W on a 13700K, lol.
Here it is, 1080p on a 4090
czcams.com/video/9kzWMKL30TM/video.html
@@tilapiadave3234 Just got a power meter; measured 475-480 W average in TLOU at 1080p.
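One source of confusion in this thread is wall power versus CPU package power: a meter at the wall includes PSU conversion losses plus GPU, VRM, and fan draw, so a 200 W system-level gap does not mean the CPU package alone differs by 200 W. A rough sketch of the conversion (the 90% PSU efficiency here is an illustrative assumption, not a measured value):

```python
def dc_delta_from_wall(wall_delta_watts: float, psu_efficiency: float = 0.90) -> float:
    """Convert a wall-measured power difference into the DC-side difference.

    Wall readings include PSU losses, so the components themselves only
    account for roughly psu_efficiency * wall_delta of the measured gap.
    """
    return wall_delta_watts * psu_efficiency

# A 200 W wall delta at an assumed 90% PSU efficiency is ~180 W of DC-side
# draw, split between CPU package power, VRM losses, and extra cooling load.
print(dc_delta_from_wall(200))
```

This is why a 200 W whole-system gap is plausible even if the CPU package itself never reads 250-270 W in software monitoring: part of the gap lives in PSU and VRM losses that package-power sensors don't see.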
The 7800X3D rocks; got it a week ago. But the 5800X3D is also an excellent piece of hardware!
Great video! You've probably answered this question before, but why did you switch to higher graphical settings that become GPU-limited? Wouldn't testing at low settings be a better comparison for CPUs, as the games become CPU-dependent?
Amazing work as usual!
I wanted to ask if you could include Diablo 4 in your benchmarks.
I think it would be a great addition, considering it's not only a popular game but an open-world MMO-ARPG that might benefit from the extra V-Cache.
I haven't done official benchmarks for that yet, but the above-mentioned system I built for family is running D4 with zero issues as far as performance goes.
I'll also be swapping him to a new monitor soon, since this major upgrade is laughably silly on his old 1080p 60 Hz screen. (Don't get me wrong, it was a very good monitor when new, but that was many years ago, and there was no suitable new 1440p or 4K monitor on sale for a decent price at the time his new system was built, so he can live with it for a few weeks until a sale comes up. Meanwhile, for tests I'll probably just take over one of our lab displays if I need to in the near future.)
PCGH (very reputable German publication) tested D4 across a *very* wide range of CPUs, going back to even include Haswell, although it is a game that is notorious for being extremely hard to benchmark consistently. The outcome was predictable. I'll summarize it for everyone who doesn't understand German:
Cache rules, both Zen3 and Zen4 X3D perform exceptionally well. Alder Lake/Raptor Lake can tread water, as in not being completely outclassed like e.g. in ACC, and can make use of their frequency advantage when compared to non-X3D Ryzen. Unsurprisingly, this comes at the cost of a severe disadvantage in energy consumption. Who would've thought.
Lastly, anything below 6 cores really struggles in 1% lows, D4 can generally make good use of 8c/16t CPUs. Anything beyond that is wasted and provides no measurable benefit. Again, a very predictable result.
With all that being said, D4 is typical for Blizzard as it doesn't really require state of the art hardware to run comfortably. Out of curiosity, I've tested it on a 3770K (Ivy Bridge, no OC) with a GTX 970. Runs perfectly playable in 1080p with medium to high settings and can push past 60 fps with FSR enabled. Yes, you're giving up some visual fidelity, you're entirely skipping the high-res textures and to have it play really smooth I had to go from 8 GB to 16 GB RAM. But still, that's mighty impressive optimization for hardware you could call ancient.
Sidenote: I switched to a 3570K (OC'd to match the i7's clocks) just to see the difference: losing 4 threads (no HT on Ivy Bridge i5!) completely ruins the 1% lows! The game runs measurably worse, but the difference feels even bigger than the numbers suggest. I'd strongly recommend not using anything below 8 threads.
@@danieloberhofer9035 Thank you for the info! I've turned down my texture quality from ultra to high, since you don't really notice the finer details when you explode a pack of enemies in a fraction of a second and move on :)
Good to see you're having no issues on older hardware. When I first got into Diablo 3 in 8th grade, I was playing on some Dell business laptop getting 25 FPS!
Crazy how AMD went from being the cheaper multicore-performance CPU, the exact opposite of Intel, to the better gaming-focused CPU. Crazy how times change.
We witnessed it. An interesting turn of the tables.
How did that change?
7950X and 13900K are tied in multicore performance, but AMD is still a lot cheaper.