Bought AMD? You got played - Ryzen 7000 non-X Review
- Date added: May 9, 2024
- Visit www.squarespace.com/LTT and use offer code LTT for 10% off
Create your build at www.buildredux.com/linus
AMD’s Ryzen 7000 series cranked up the power draw, but had performance to match. Now that the non-X chips are out with limited power draw, did AMD accidentally play themselves - And you?
Discuss on the forum: linustechtips.com/topic/14801...
Purchases made through some store links may provide some compensation to Linus Media Group.
► GET MERCH: lttstore.com
► SUPPORT US ON FLOATPLANE: www.floatplane.com/ltt
► AFFILIATES, SPONSORS & REFERRALS: lmg.gg/sponsors
► PODCAST GEAR: lmg.gg/podcastgear
FOLLOW US
---------------------------------------------------
Twitter: / linustech
Facebook: / linustech
Instagram: / linustech
TikTok: / linustech
Twitch: / linustech
MUSIC CREDIT
---------------------------------------------------
Intro: Laszlo - Supernova
Video Link: • [Electro] - Laszlo - S...
iTunes Download Link: itunes.apple.com/us/album/sup...
Artist Link: / laszlomusic
Outro: Approaching Nirvana - Sugar High
Video Link: • Sugar High - Approachi...
Listen on Spotify: spoti.fi/UxWkUw
Artist Link: / approachingnirvana
Intro animation by MBarek Abdelwassaa / mbarek_abdel
Monitor And Keyboard by vadimmihalkevich / CC BY 4.0 geni.us/PgGWp
Mechanical RGB Keyboard by BigBrotherECE / CC BY 4.0 geni.us/mj6pHk4
Mouse Gamer free Model By Oscar Creativo / CC BY 4.0 geni.us/Ps3XfE
CHAPTERS
---------------------------------------------------
0:00 Intro
1:18 Specs, or: What does an X mean?
2:14 Intel's trying not to compete too hard
3:36 Some notes about our test setup
4:34 Gaming benchmarks
8:04 Productivity benchmarks
11:03 Thermals
12:08 Clock stability
13:04 Power consumption
13:41 Performance per Watt
14:41 Overclocking
15:44 Pricing, platform cost, and upgrade path
17:03 Conclusion - An excellent value
Linus, we all know we're gonna buy this CPU like 3 years after it arrives, at 50% of the initial price
People don't want to buy a 3 yo cpu
3 years? You mean 5 years?
Lol, like with the 5600...
@@txmits507 lmao speak for yourself
@@txmits507 what's the point eh haha
X = Xspensive
Watch this be top comment
Linus better pin this comment
Top comment
That's a killer one
X sounds like a badass letter
That thermals chart is insane! I'm so happy to see a return to awesome midrange CPUs, and am very excited for the budget ones.
I so fucking agree =)
@@fridaycaliforniaa236 TRUE
Mid range lol.
@@fynkozari9271 this. Midrange CPUs should cost no more than $170, but whatever.
The 7600 is going to become a legendary budget CPU like the 2600k.
I've had an AMD 3700X since 2018 and I'm considering upgrading to an Intel 13700K and DDR5. I work as a video editor. Now I don't know where to go: Ryzen 7900 or Intel 13700K?
Fun fact: the Factorio devs mentioned in one of their blogs that they optimized the game code so much that it's mostly bottlenecked by CPU cache rather than clock speed. RAM speed boosts performance too, and by a lot!
Feeling good about the 3DVCache CPU I just bought then
These suckers bought 7700x instead of 5800X3D with 96mb cache!!! MUAAHAHAH
@@petercollins797 i literally bought the 5800x3d for factorio lmfao
@@jaxrammus9165 7800x3d looking juicy as well for factorio, tarkov and other games .
@@jaxrammus9165 I did the same thing but for rust lmao
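The cache bottleneck described in this thread is easy to see in a toy benchmark: touching a large array in a shuffled order defeats the prefetcher, so the same arithmetic runs noticeably slower. A rough sketch in Python (array size and the magnitude of the gap are illustrative, and in CPython interpreter overhead mutes the effect compared to native code):

```python
import random
import time

N = 1 << 20  # 1M elements, far larger than a typical L2 cache
data = list(range(N))

seq_order = list(range(N))       # cache-friendly: sequential walk
rand_order = seq_order[:]
random.shuffle(rand_order)       # cache-hostile: random walk

def walk(order):
    total = 0
    for i in order:
        total += data[i]
    return total

t0 = time.perf_counter(); s_seq = walk(seq_order); t1 = time.perf_counter()
s_rand = walk(rand_order); t2 = time.perf_counter()
assert s_seq == s_rand  # identical work, only the access pattern differs
print(f"sequential: {t1 - t0:.3f}s   shuffled: {t2 - t1:.3f}s")
```

This is also why extra L3 (and 3D V-Cache) helps games like Factorio more than a few hundred MHz of clock: the update loop spends its time waiting on memory, not on the ALUs.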
Labs are definitely showing their value, great review.
I think you should also include idle power draw. It can be a big deal for some use cases and sometimes people may find that their systems are drawing more at idle than they should (had that happen recently and it turned out to be a driver issue), so it's useful to know where approximately they should be.
People are noticing high idle power draws because they install crap like animated wallpapers and 10 game launchers, or they have multimonitor setups with high refresh rate. Or because they enable the high performance power profile in Windows.
Tldr: too many factors for idle power draw today, since there is never a real idle. What you see drawn from the CPU is not what you measure from the cord.
I use a 15 min sleep with wake up via usb (key/mouse). So It uses a few watts to power the memory and that's it. And for the commodity to not have to push the button manually, by just pressing the mouse to power on.
Usually idle power draw is around 5W whatever platform you're on
At least, mine draws the same with 3600 and 5800x3d. Note that the 3600 never puts to sleep its cores, while the 5800x3d will run just 2 cores/4 threads if the workload is light enough and the others are almost literally turned off
You can't measure the whole system 's powerdraw for a review tho: different ANY component will massively change your results, so the possible combinations are too many to test or even just list.
i'm interested in the idle power draw of the non X SKUs. I've seen a few people note 60-80w idle power draws on 7000 series compared to their intel systems.
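For readers who want to sanity-check their own idle draw on Linux, the kernel's RAPL powercap interface exposes a cumulative package energy counter. This sketch assumes the common `/sys/class/powercap/intel-rapl:0` path (present on many Intel and recent AMD systems; the exact path and the counter's wrap point vary per machine), and it reports CPU package power, not what the whole system pulls from the wall, which is the distinction the thread above is making:

```python
import time

RAPL_ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"  # package 0; path varies
DEFAULT_MAX_UJ = 2**32  # wrap point; read max_energy_range_uj on a real system

def avg_power_watts(e0_uj, t0, e1_uj, t1, max_uj=DEFAULT_MAX_UJ):
    """Average power between two energy_uj samples, handling counter wraparound."""
    delta_uj = (e1_uj - e0_uj) % max_uj
    return delta_uj / 1e6 / (t1 - t0)

def sample_idle_power(seconds=5):
    """Leave the machine alone while this runs to approximate idle package draw."""
    with open(RAPL_ENERGY) as f:
        e0 = int(f.read())
    t0 = time.time()
    time.sleep(seconds)
    with open(RAPL_ENERGY) as f:
        e1 = int(f.read())
    return avg_power_watts(e0, t0, e1, time.time())
```

Reading the counter usually requires root, and a few samples averaged over 30+ seconds give a much steadier number than one short window.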
Love the frames per watt charts. Would love a real deep dive on this with eco modes on cpus and maybe even power limits / fps limits on gpus too!
The performance per watt is insanely impressive. I can imagine it being useful for say a university environment, where you might have 100's of machines working on students engineering projects or similar. You want the performance but also dont' want to use too much energy.
once you turn on PBO they consume the same exact wattage almost.
Short feedback on the new performance graphs: they are very nice and clean, but it's a little confusing how they are ordered. You seem to randomly order them either by average or by 5% lows, but never consistently. You have graphs where some CPU is in the middle by its average; directly behind it is one with a lower average but higher 5% lows, after which comes one with a higher average but lower 5% lows, which makes it hard to quickly see which CPU is actually faster. Both ordering methods, average or 5% lows, are totally viable, but you should choose one and stick to it to make the graphs more easily readable, especially since you show them for a limited amount of time. As it is, pausing the video constantly is a necessity to take in the information accurately, which makes it harder to follow the script while those graphs are on screen. Other than that, the lab seems to be doing a really great job here!
Felt similarly confused as well
The graphs are completely unreadable blurs on mobile at 360p.
Plus the tests seem irrelevant for anyone not playing those particular games. It was easier back when everyone used synthetic tests designed to give realistic results across all CPU generations since the 1981 launch of the PC platform. Famous tests that would still be relevant include LINPACK, Whetstone, and 3DMark, though each may need updates to deal with new inventions by the 3 chip giants.
I was going to comment on the same thing. Maybe get some inspiration from Hardware Unboxed. Their graphs are also often packed with information, but theirs is a bit easier to digest.
Looks to me like a calculated score is assigned to each CPU based on all 3 categories in which they're measured
@@johndododoe1411 if they made their graphs readable at 360p it would look like granny zoom to the majority of their audience. Change the quality and wait for the frame to buffer if the graphs are that important to you.
Only using synthetic tests that were relevant on the first PCs would be more useless to the average viewer than testing even one modern indie game that nobody plays.
Get some perspective
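The consistency the top comment asks for amounts to picking one sort key and reusing it for every chart. A trivial sketch with made-up numbers, sorting by average FPS and using 5% lows only as a tiebreaker:

```python
results = [
    {"cpu": "Ryzen 5 7600", "avg_fps": 142, "low_5pct": 101},
    {"cpu": "Ryzen 7 7700", "avg_fps": 139, "low_5pct": 108},
    {"cpu": "Ryzen 9 7900", "avg_fps": 146, "low_5pct": 99},
]

# One primary key (average) for every chart; 5% lows only break ties,
# so bar order never appears shuffled between adjacent graphs.
ordered = sorted(results, key=lambda r: (r["avg_fps"], r["low_5pct"]), reverse=True)
print([r["cpu"] for r in ordered])  # highest average first
```

Either key works; the point is that whichever one is primary stays primary across every graph in the video.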
THANK YOU for finally including Factorio benchmarks on your CPU reviews.
I think CPU benchmarking in industry reviews is way too focused on AAA GPU-bound action games instead of processing-intensive strategy/sim games. Titles like Factorio and Civilization 6 are very relevant for people shopping for the optimal gaming CPU.
We need more benchmarks for productivity as well. Everything focuses on AAA gaming at 8K with full RTX, paired with the best DDR5 and a 4090 overclocked to the point of blowing up.
That's not my use case, and the data is unhelpful.
@@halfbakedproductions7887 they did the same number of productivity benchmarks as they did gaming....
I play games like Factorio and Timberborn, and I dream about reaching 60 UPS with a really large base. Triple-A games are GPU-bound, that's well known.
Also games like Victoria 3 (seeing how far it can get as observer from start after x amount of time) or Dyson Sphere Program
Phoronix has a great benchmarking suite
Great to hear. Still planning on waiting 2 years for my next upgrade but LOVE hearing there is a much cheaper option that is still quality. Here's hoping we get the same on gpu end as well
Wow, those are impressive thermals for the 7900! I think it might be going into my next build (which will be SFF)! I want to see what the X3D line does this time around compared to the 7900 and the 5800X3D.
AMD being able to pull off these power draws just gives me a lot of faith in their engineers.
Actually, what they did is CPU binning 101. They just stockpiled "golden" 7000x CPUs for months and are now selling them for the 65w lineup. Genius move.
@@sovo1212 Not actually golden. That 0.2GHz drop is enough for parts that were subpar at higher clocks to work a lot better.
If you're amazed at this performance and power draws you should check the work they did with the Steam Deck's APU. I got a lot of hope for their future mobile end chips.
@@EuclidesGBM Grab any 7000x CPU, try to underclock that 0.2ghz and undervolt it to get to 65w TDP, it won't be stable.
@@sovo1212 Considering I managed to overclock my 3600X to 4.3GHz all-core whilst undervolting it to 58W... I think it is quite possible.
Waiting for the 3D prices, but that 7900 is looking amazingly efficient for a nice price
It's not a nice price at all. AM5 motherboards are very expensive, so is the DDR5 memory you have to buy.
@@sophieedel6324 it’s better than buying a motherboard and CPU every 2 years with Intel
@@TheEpxMaster not really I upgrade after 5 years and by then every component is worth upgrading.
Ryzen 7900 76 mb cache. I would kill for that cache. 65 watt tdp is the best.
@@TheEpxMaster It's realistically the same, since the AMD motherboards are twice the price of Intel's. The 7900 sounds like a pretty mediocre deal; when you're spending that much on AM5 boards and DDR5, you're probably better off with a 7900X3D when that comes out, because chances are if you want a board for a 7900, you're not an average user and shouldn't be cheaping out.
Looks like they made the 7600 with the same idea that the 3600 played out to be. I really like it!
I'm still using an R5 3600 and don't have any intention of upgrading soon. I'm still running at stock, but when it starts to wear down, I'll simply need to OC a bit to match whatever GPU I have (currently an RX 6650 XT).
Due to ass prices in a small market, I still often buy Ryzen 5 3600s for my office.
@@Postman00 5700x is like only 200€
@@Tupsuu He mentioned "a small market". That makes me think he's not in US, nor UK.
@@Winnetou17 Im in Finland
@@Tupsuu Ok, fair enough.
I'm actually super interested in the non-X chips after watching this, purely because of that incredible thermal performance. I live in Australia, so almost the same performance with incredible thermals means I'm genuinely going to consider a new build.
Upgraded from 2700X to the 5900X recently and even with its eco mode capped at 65W it's a massive improvement to me. So I will probably wait for the next huge efficiency jump and another increase in thread count.
I'm curious. I also have a 2700x but with a rtx 2070. What gpu are you using and what is your workload?
@@Barrysteezy You should ponder these questions for a while, if you can confidently answer them then you’ll have an easier time making a choice:
1. What chipset is ur MOBO, and is it one that supports 5000 series?
2. Do you plan on upgrading to an entirely new system in the next 5 years
3. Do you plan on doing any creative work or CPU-intensive work such as live-streaming to twitch while gaming?
These are all things you should consider before spending any moneys on upgrades
@@Barrysteezy I recently got an RX 6800 but had an RX 5700 before that. One thing helping a lot with the GPU was enabling resizable bar in the BIOS.
Honestly though my setup is pretty screwed by PCIe limitations since I'm using 8 lanes for some SSD adapter card. Therefore my GPU is limited in bandwidth (only using 8 lanes) and I only have PCIe 3.0 with my motherboard. However it's still fine but I assume with PCIe 4.0/5.0 I could easily get 5~10% more performance in games. The reason for upgrading was mostly to get a newer feature set with mesh shaders and ray tracing pipelines (I wanted to do some graphics development with that).
However the CPU upgrade still made a lot of sense to me because I'm compiling a lot of code during development. So this adds up easily when I need to draw less power now while it takes less time as well.
The main reason I didn't go for AM5 is the pricing, while my old mainboard (X370 chipset) got a BIOS update which enabled Ryzen 5000 support (even with more than 8 cores). So the upgrade was much cheaper, and the benefit of Ryzen 7000 over 5000 is not that huge to be honest.
But if you consider it, I would wait for the reviews of the 3D cache chips, especially if you care most about gaming. There's still a chance that they won't perform as perfect during launch though since they rely on software optimizations to utilize their stacked cache most efficiently. So hopefully reviews will cover that part.
Where exactly do you see improvement, and did you really need it? The 2700X is still capable of any task; you're just wasting money for higher numbers in benchmarks.
Why would you buy a 16-core chip just to cap it at 65W? 😂 I can almost guarantee you'd lose barely any performance with a 65W 7900X.
If you are still on an AM4 mobo, the 5800X3D is a great gap filler without needing a new mobo/RAM/CPU, and it seems to stay on par with the next-gen CPUs.
I'm not gonna upgrade from a 5800X to a 5800X3D xD
Well, I don't wanna be the Karen today, but the 5800X3D is better for content creators or AI people; more cores could mean better performance even though a game only runs on 8.
yeah, i found recently that my b450 mobo has a bios update for 5xxx series... and i'm running 1600 so i'll probably save a buck and just buy brand new r9, and i don't even have to buy other parts, yay
@@Shadow-zo8xj I thought it was the other way around, the 5800X3D for gaming mostly, but I'm no expert
For games yes.
Great video as always! What AutoOC are you talking about here for the 7900 chip? More details on the OC please: what about the increase in power draw, temps, and consumption?
Very excited to see what the 3D V-Cache chips offer. Amazing how competitive the 5800X3D is a generation later
5800X3D: WAY WAY overpriced, EXTREMELY overhyped
@@tilapiadave3234 wrong :)
@@TheNicePIXEL Indeed You are wrong ,, very wrong
@@tilapiadave3234 keep using your Intel CPU and let us use our OVERPRICED and OVERHYPED X3D's 😁
@@TheNicePIXEL I most certainly will continue to use the vastly superior cpu's.
My task is to INFORM all the newbies of the LIES spread regarding the 5800X3D , I hate when newbies waste hard earned dollars
As an SFF user, those temps are impressive. I would definitely go this route once I move to am5.
how small? i want to go sff but i’m afraid to make the jump
@@halocubed6788 I built a 11 liter Dan A4 H2O case with a 3070ti and a 5800x3D about 2 weeks ago. It has 60 degree temps for the GPU and the CPU never goes higher than 70 degrees when gaming. It has been running great and I overall love it.
@@halocubed6788 try tecware fusion. its itx friendly
as a sff fan myself, i gotta keep the 5700x until am5 reaches its last gen
@@cuerex8580 nice, eol is going to pretty awesome on am5!
Hey just wanted to say thank you for the channel. I used to be a system builder back at the turn of the century and into the sli Era of gaming. I'm slowly catching up with the new tech. So glad to see some other enthusiasts make these videos.
Same situation here =)
Yeah me too. Did my first modern build (first one in 12 years) late last year. Amazing how much faster everything has gotten for about the same money.
I am so glad to hear about this. This makes me less reluctant to upgrade, because I don't just want my system to draw more power in order to be faster.
Nice, but I would have loved a more in-depth section on the overclocking part. How does it affect the amazing efficiency?
All of the X version chips were available for $10 -$20 above these new chips between Thanksgiving and Christmas. I picked up a 7700X for $339 from Newegg. Currently, the new chips are a great value but early adopters of the X series did just fine.
I went with the 7700 and have no regrets. I got 32GB of free DDR5 from Microcenter. I did a direct comparison against a free mobo deal they were running on Intel 12 series, and made the decision to go with the new platform instead of the tail end of an old one. I don't feel bad about it. And I'm happy with the 6800xt I paired it with for $500 after seeing the latest gen video cards.
I got $20 off the CPU and mobo combo, in addition to the discount on the CPU, so I kinda got it cheaper than the non-X version. 😂
I don’t believe you
I paid $30 more for a 13700k, z690 for $150, and carried over ddr4 b-die ram I bought two generations ago. A more normie 32gb ram kit would've only been $70 anyway
@@SlimedogNumbaSixty9 please tell me total price?
So the 7600 is the new budget king, the 7700 is DOA, and the 7900 is a power efficient mid/high-end monster? If only Radeon could deliver like Ryzen
Edit: changed the 7900 from mid-range to mid/high end
7700 isn't exactly DOA because you can just turn on PBO in the BIOS and get pretty close to X performance for less $
Budget king will be the 13400F, with BCLK OCing becoming more common on boards at good prices. A 13100F overclocked will even give the 7600 a run for its money.
No chance of it being a budget king with those board prices.
@@MiGujack3 I was thinking that too, AM4 is dead end but will still be relevant. Z690's are very affordable. B660's are plenty enough for a 13600k or better. Mortar Max+12400F is where it's at right now. 5.3GHZ and 13600k performance in gaming.
@@PDXCustomPCS The 13400F is still based on Alder Lake cores, so gaming performance will be far behind the 7600 and 13600K, and only about the same as the Zen 3 5600X, which you can get for $135. You can see at 7:10 the 7600 is a bit faster than even the 13600K in gaming; it's pretty much impossible for the 13400F and 13100F to match the 13600K and 7600, and if you OC them you can also OC the 13600K or 7600.
Can someone help give me a better idea of where a good landing point would be? I’m building a new PC for both gaming and to video edit on for an eventual channel. I like the idea of AMD and planning on probably under-volting a cpu to get much lower thermals for a minimal decrease (assumption) in performance.
I don’t care so much for the cost of the hardware. Is it worth waiting for the new X3D processors from AMD? Or Instead an X or non X cpu? I was going to pair this with the x670 Asus Pro Art mobo.
Any help is much appreciated.
7600 is ice cold, which is lovely. Can't wait for it to age and see more cheap options and small form factor options!
i got mine to 93c building shaders on the last of us lol
@@teamfishbowl1076 mine is going above 95c with an Aerocool Verkho 2 (2-pipe cooler), and an R5 1600X (95W) never went above 60c with the same cooler. I don't understand that.
@@kliibapz Yeah, it's a bit odd. Prime95 on an AM5 7600 for an hour with the Prism cooler will easily hit 90c+.
My Ryzen 2600 with the Dark Rock Pro 4 can do 3-4 hours of torture tests in Prime95 and never go above 65-70c.
I want to test the Dark Rock Pro on AM5 and see what happens, but either way the 7600 runs NOWHERE NEAR ice cold.
Ice cold? They run around 93c with the stock cooler; that's 200F, enough to cook meat easily. Not sure I would use ice to describe something that can cook steaks.
@@Physics072 Is it much worse than the 7600X?
I was planning on skipping the 7000 series, but price and TDP of the non-X chips definitely has me reconsidering. If only the 600 series boards and ddr5 ram weren't so overpriced, then it would be a really easy choice to upgrade.
DDR5 ram isn't that overpriced in my opinion, however those 600 series boards are extremely expensive.
@@officialteaincorporated243 Easy, just buy a B650 mobo. How much is DDR5 currently? DDR4 3200MHz is $58 in my country before discounts.
@@fynkozari9271 i believe zen4 doesnt support ddr4 ram
@@reformierende_person b550 am4 b650 am5.
I'm hoping we see some Zen 3+ AM5 chips for the budget 6000 series. Imagine something like the 6980HS becoming the 6700G or something like that.
It's nice seeing a product being able to get a good review from time to time, not what we are getting with gpus
When iGPU integration starts shitting on a dedicated GPU.
@@NinjaForHire Have you seen the AMD laptop chips announced at CES? iGPUs are going to DEMOLISH low end dedicated GPUs.
@@Gabu_ lmao yeeaa suuuurre
@@NinjaForHire bad troll. Wasn't funny. Nobody but you laughed. Cus an integrated GPU beating my dedicated GPU is more of a 1 in 9 billion chance. Get back to me when integrated steps up their game *mic drop*
@@cooleyYT who hurt you...
Hey linus when watching the graphs of the data, i feel it would be helpful when doing this style of comparison to color code the CPU names. For example with this one, the 7900 and 7900x could be blue and then the 7700 and 7700x could be green and then so on so forth. Just a suggestion :)
Btw guys, does anyone know a QVL-ready B650 motherboard that can support 64GB of RAM in two sticks at 5600+ speed? I'm trying to find a motherboard to go with the 7700, but all the QVL lists I've found only officially support 32GB RAM setups. I don't want to buy expensive parts and find out they're incompatible.
Happily surprised to see Factorio there as a game to compare at; no idea how that got on your radar but I look forward to hearing more benchmarks with that game, maybe with the UPS/FPS on a 500+ hour mega base since that's where it can really matter :D
Factorio devs are magicians of some sort. My mid-size base would run happily on a 12-year-old i5-2410M. Just amazing.
Factorio is great for cache benching, because it just beats the living shit out of CPU and GPU cache.
People kept spam requesting it during the intel arc gaming stream so I guess it stuck in their heads afterwards lmao
Destiny no lifing the game maybe.
@@jammo7370 Probably Destiny streamer fans.
7900 will be great for building a quiet entry-level workstation at a very reasonable budget. This is pretty exciting. I think productivity-wise for devs, this is a great chip.
What about for gaming? Good as well?
@@bigdoublet CPUs haven’t mattered too much in gaming for a while. A well chosen, cheap, older CPU is usually the way to go.
@@youtubewatcher4603 Every recent CPU released has increased IPC to the point where something like a second-hand i7-8700K is actually a bad purchase right now. You really don't wanna go below 10th gen even if you gotta go entry level to stay on the newer platform, unless you wanna deal with microstutters on older CPUs.
DDR5 128 GB ram (4400 Mhz is the highest stable config) costs like $800
DDR4 3600 128 GB ram costs $450 and has the same performance
7900 sucks for virtualisation workstation builds or CAD simulation workstations
@@Haskellerz sucks for virtualisation? does that mean VMware machines? that's my use case. please respond with what I should go for. I know I don't upgrade a lot coming from a 4690k but I like the idea of having the latest and greatest. was almost sure to buy a 7900 but why would it suck for vmware?
Actually the RAM speed thing changed on Zen 4: there's no longer a penalty for desyncing Infinity Fabric and RAM. That said, Zen 4's memory controller generally can't handle much faster than 6000, and the 6800 used on the Intel bench is way more expensive.
Can someone explain why the 7700 (8/16), which has more cores, is performing lower than the 7600 (6/12) in 'Cyberpunk 1080p ultra, RT/scaling off'?
This is after he said that the game is more multi-core and clock speed dependent, and they both have the same base clock while the 7700 has a slightly higher boost and more cache. Is this purely down to the silicon lottery, or is there some other reason? It seems to do only marginally better than the 7600 in other games' FPS tests, but not Cyberpunk.
Sure, a game could be programmed a certain way, but I can't see that downgrading the potential of the 7700 and not the 7600.
I would appreciate some guidance, as I was looking into getting the 7700 because it comes with a decent box cooler I could use as a spare, and I need a much-needed upgrade.
That's amazing. Love the lower TDPs and temps with an almost equal performance. It's definitely worth the small performance loss.
I'm with this guy. You save so much power relative to performance, it's crazy. Makes Intel look like they don't care about your wallet in the long run.
@@NinjaForHire LOL.
AMD is a corporation too. They don't give a flying F about your damn wallet.
They only released these non X chips because their X sales and new platform sales were low.
It's almost as if the average user didn't like the idea of spending all that money on a new everything for a chip that stupidly ran itself factory over clocked to run at 95c.
Funny how as soon as AMD became the better choice for CPUs. The X series was all there was. And the prices went up!
But in a fanboys mind. AMD AND IT'S SHAREHOLDERS ARE LOOKING OUT FOR MY WALLET!
@@NinjaForHire if AMD cared the 7900 XT wouldn’t cost so astronomically high. Don’t skimp on a company. Any company, let alone a billion dollar company
It would be great if you could include idle power consumption (e.g. on the Windows desktop) in future tests as well. Like you said in the video, energy costs are higher than ever. When I'm using my PC I'm usually browsing the web or listening to music for the same amount of time as I'm actually gaming, or even more.
Besides, I believe that it was AMD graphics cards that had unusually high numbers on idle. Including that in the tests puts pressure on the manufacturers to fix whatever issue they might have.
Very good point, I always wish for the same thing.
Intel have lower power usage on idle. The CCD chiplet design is inherently more power hungry on idle/very low loads
About the AMD GPU: the memory doesn't downclock properly at idle when using multiple monitors with differing refresh rates (e.g. a 165Hz main and 60Hz second, like I have), and it consumes a constant 30-40W at idle (in my case on an RX 6800).
@@Daisudori Your GPU only consumes 30W at idle? My 2080 Ti consumes 60 - 70W at idle, and I only have 1 1080p monitor.
@@joshjlmgproductions3313 If your GPU pulls 60W at "idle" then your GPU never actually goes to idle.
You prob have your Windows misconfigured; set the power setting to balanced instead of high performance. It doesn't have any real impact on game performance anyway, it's just a waste of energy.
A 2080Ti should pull around 10-15W at idle.
With current CPUs, for the Factorio benchmark, it might be good to use the Flame 30K save as opposed to the 10K. Regardless, very happy to see Factorio being used as a benchmark; it is great for single-core benchmarking.
3:09 Whenever I buy a CPU, one of the first things I do is check the amount of cache (L2 and L3) available on the CPU. Having a little extra space for your computer to store stuff in the CPU cache can go a long way. The moment your PC needs to reach out to memory is the moment you start to see less performance, granted it's not usually noticeable.
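Checking a CPU's cache hierarchy can be scripted; on Linux the per-level sizes are exposed under sysfs. A sketch assuming the standard `/sys/devices/system/cpu/cpu0/cache/index*` layout (the paths are standard, but the values differ per machine):

```python
import glob

def parse_size(s):
    """Convert sysfs cache size strings like '512K' or '32M' into bytes."""
    s = s.strip()
    units = {"K": 1024, "M": 1024**2}
    if s and s[-1] in units:
        return int(s[:-1]) * units[s[-1]]
    return int(s)

def cpu_caches(cpu=0):
    """Map e.g. 'L2 Unified' -> size in bytes for one core's cache hierarchy."""
    caches = {}
    for d in sorted(glob.glob(f"/sys/devices/system/cpu/cpu{cpu}/cache/index*")):
        with open(f"{d}/level") as f:
            level = f.read().strip()
        with open(f"{d}/type") as f:
            ctype = f.read().strip()
        with open(f"{d}/size") as f:
            size = parse_size(f.read())
        caches[f"L{level} {ctype}"] = size
    return caches
```

Note this reports per-core figures; shared L3 appears once per core even though it is shared across a CCD, so spec sheets remain the easier way to compare total cache between SKUs.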
6:24 This is one of the moments when having some more clear documentation and dedicated explanation regarding the Labs SOPs and testing methodology, not necessarily directly in the video but either in a stand-alone video or a link to a forums thread, would be really helpful. It’s no doubt hard to get a high quality testing lab put together, especially when there is a well known focus on developing testing automation. However, when there’s been issues with previous tests that don’t get caught before the video is published and then get at most a pinned comment, it makes me cautious when I start hearing about “strange data” that was discovered which for all I know could be human error but is immediately passed over without much explanation regarding how this result was validated to be accurate.
Reputation is so important, and LTT as a brand has often made transparency a prominent feature, so please continue to apply that to the labs as they develop so that detail oriented viewers, the primary target of labs level content, won’t view the labs as having a reputation for letting errors slip by or having a murky methodology more focused on obtaining results quickly over accurately.
To be clear, I love the content and don’t believe that there is any issue per se in this video, but hearing that line and no follow-up immediately made me remember other errors and think “well, they’ve been wrong in the past so I’ll just check GN instead” which is not the feeling I want to get out of labs related content because I’m really excited about what the labs have the potential to do. I am looking forward to seeing more content making use of the Labs capabilities and how your testing evolves in the future!
This is quality reporting. Coverage like this helps real people filter the marketing hype and holds manufacturers to account. Nice job LMG 👌 And nice job AMD!
Hey, I was wondering: are these CPUs overclocked? My 13700K can get no better than 26,000 in Cinebench R23, even though I have a triple-radiator water cooling setup and a very well-ventilated case.
Sir Linus, good evening. I'm Thomas E. Berdon from the Philippines... my rig is already complete except for the CPU... if you ever have any spares to give away that you never use anymore, an AMD Athlon 3000G or AMD Ryzen 3 3200G would be awesome... this is my rig, I built it on my own: an old micro ATX computer case, an A320M-S2H Gigabyte motherboard, 2x4GB Kingston Beast 2666MHz CL16 RAM, a Walram 128GB M.2 SSD for the OS, a Ramsta 128GB 2.5" SSD for storage, a WD Blue 500GB HDD for storage, and a DVD-ROM drive... that's it... I always watch your videos on YouTube as you assemble builds one part at a time until you're done... I learned a lot from you sir... thank you very much for taking the time to read this...
Man, really glad to see how much information the lab is churning out in these vids now. Makes for a perfect trifecta of information for buyers when combined with Jayz and Tech Jesus.
That r9 7900 was pretty much what I was waiting for, I'll grab one in the summer when supply/demand settles on the entire platform
But x670 price. Are u gonna get b560?
@@fynkozari9271 why? I just care about the power consumption. 1kwh is 1€ here so
If I go for a lower-end B650, I cannot safely assume it'll run the next gen of chips, meaning the 50-70€ I save by not going X670 means I gotta redo half my build and spend another 150-200 bucks
I'll go x670, the 50€ saved on a 700€ investment is not worth the compromise
@@roqeyt3566 wow, your power is about 10x what I pay right now. The margin drops to 4x what I pay in summer though.
@@roqeyt3566 Even A320 motherboards can run the 5800X3D. I don't think future compatibility is really an issue on B650. There's a reason B650 is so expensive. They come with good VRMs and I/O. Buy only what you need and want in a motherboard. No point having a X670E if it doesn't have the features you want.
@@dalzy25 you technically can, but how many mobos from the first gen got BIOSes for the 5000 series?
On top, many mobo's show weird behavior (sluggishness, microstutters, performance loss) if they're too low end or old, as they were designed for their generation of chips and not the future ones. Not only that, power delivery, especially with a 200w+ socket, becomes much more important compared to the old limited AM4 socket.
So while you could, it's a question of should you. It'll be a pain in the behind if history is anything to go by
How did you manage to cap temps at ~40°C for the 7900? Mine can't be cooled down even with a Silent Loop 2 280mm...
Was thinking of getting a 4070 Ti with an i7-13700K. Any suggestions on motherboards, RAM, and AIO cooler? I found the Z790 DDR5 to be the best pair, but for some reason when I'm on Micro Center I get a compatibility issue and it won't let me pick out a proper 4070 Ti, it gives me one option. I've looked all over and I'm not sure why it's happening
The fact that 5800X3D is still here for comparison still surprises me 👀
Why? As an owner I'm more than happy tho haha.
@@TiPeteux I love how 5800X3D still performs, would've gotten one if I didn't have 3950X right now :P
@@leungoscar4126 I have a 5600, worth an upgrade?
@@blunderingfool that's basically the upgrade I made. Depending on what you play and the price you can find the 3D chip at, it's a bit equivocal. If you are happy with performance right now, it's not worth it. If you play things like flight sim in VR or really logic-heavy games like Civ a lot, though, it'll be a great upgrade without needing a whole new PC, especially if you sell the 5600 and get a good deal
It's a last-gen chip. It's not that surprising.
I'm still using my first CPU, R5 2600. Love the direction AMD have taken with value, which means I'll be looking at potentially going AM5 when I upgrade probably next year. (Gotta let those prices drop a bit more). I can only hope GPUs will follow at some point
you will see that in 10 years, when other companies come out.
Even if you could find a decent priced 5600 you'd still see a decent performance jump over the 2600. Then when it comes time to upgrading you would have a decent base system to sell as the 5000 is the last series of the AM4 platform. I had a 3700x and didn't need to upgrade but still did (5800x) and honestly did not regret doing so one bit.
What's the issue with the 2600? I have a 3700, I know it's zen2 and 8core but at least for many home workloads it seems pretty good. Games will probably not run much better unless you have a really great GPU like at least a 3070 or so, and the same goes for blender renders or whatever else you might be doing for fun.
I would only upgrade if there's a clear advantage in your particular system combination and your use case, and then also only if you need it. So many people play on 1080p 60fps but want 4k120 for no reason. If you play battlefield, will you really be better at 4k120 or notice the visual difference enough to warrant spending $1500 on all that stuff?
I upgraded my whole system a while ago (3700, rtx2070, 32gb RAM) and my secondary PC with a 1050ti & i7 4770k runs most things well enough to not really care during gameplay.
@@LuLeBe It's not an issue but it's just a less mature processor. AMD made huge improvements between the 2000 and 3000 series and even bigger ones up to the 5000 series. Memory optimisation was one of the biggest changes, and in some scenarios you can see a massive uplift in performance, as the way the Infinity Fabric works relies heavily on memory speeds and latency. I upgraded my 3700X to a 5800X to take full advantage of my RTX 3070 and triple-wide 5760x1080 setup. I get a solid 60 fps now in most games, whereas before I had quite a few frame time issues as well as 1% lows dipping. You have to remember that everyone has different uses for their computers, and what may seem like a small upgrade can turn out to be a big one for some people. That, and AMD have made huge improvements between each generation of Ryzen. It's not like the old days where Intel only improved by a few percent each generation.
I always wanted an R5 2600! Of course, my upgrade got delayed substantially, and I got an R7 7700X for christmas. Best thing ever. Good luck with the upgrade! I'm still using a mid-tier GPU...
I just noticed that the graphs all have the LTT logo in a fading dot matrix. Really cool form of content protection. I feel like a lot of YouTubers could benefit from this kind of tech.
@Linus Love your charts, but could you add a second axis to the CPU charts showing retail price? So you get not only performance ticks but also price ticks. Thanks.
Those temps and power consumption 👀 Very exciting to see this for small form factor pcs
I’m probably still gonna hold out for the 3D CPUs, but I’m glad for the better deal here.
🤑
Really nice review, I really appreciate the deeper look at the power consumption and performance per watt. Since these new chips are looking this interesting it leaves me wondering if maybe non-X Ryzens could benefit from the extra cache the 3D variants have. Perhaps even more than the Xs?
X3D is already a non X. It has lower clocks compared to X variant.
I'm still using my late-2019 3700X 1080p build & it is still chugging along great over 4 years later. I do want to build a whole new system this year (2024) though just for the fun of it, & go for a targeted 1440p lower-draw, no-overclock build using the Ryzen 9 7900 CPU as the heart of the system.
Loving the new layouts for your data! It made it very clear which data was what and made it easier to take a look at at a glance without pausing. Fantastic work by your graphics team!
it's nice to see optimization to reduce power consumption and thermals instead of just throwing more watts at their chip like Intel and Nvidia have been doing for the past few years
behold i9 14900K 500W
@@damara2268 more like 600+ xDD
To be fair, a lot of people forget that Intel is still on a 10 nm process node while AMD is on a 5 nm process node. This is what accounts for the current difference in power efficiency.
We can look at the Intel roadmap to see that they plan to release 14th gen processors in the second half of this year. These are on the 7 nm node, which should greatly reduce power consumption. Similarly in 2024, they plan to release the 15th gen, which should use a 2 nm process. This is slated to compete with the AM5 release in 2024 which should be using a 3 nm process.
So, we should expect the Intel 14th gen to be a large jump in efficiency even if the designs are just scaled, though there are also likely going to be some other improvements. This should help them compete with the non-X AMD cpus.
Then, we are set up for another head-to-head in 2024: Intel 15th gen vs Zen5. It is great to see Intel potentially back in the game and helping to spur more competition in the CPU space.
In recent times, Intel has been stagnant, but now, they are showing signs of moving back into the competition, which is great for us consumers.
@@withjoe1880 intel pulled a sneaky on ya and renamed their 10 nm ESF to "intel 7" , they are the same process, just a different marketing number
@@YounesLayachi I know the current gen is 10 nm (Intel 7), I said 14th gen is 7 nm (Intel 4) and 15th gen is 2 nm (Intel 20A).
Thanks for watching out, though.
LTT, I got a question: my PC hasn't been letting me get into Windows at all, it just says "please restart" if I try to wipe it or enter Safe Mode. What do I do?
I wasn't planning on upgrading until I watched this video. Thank you for always being excited about value in the PC space. Helps a lot of us make informed decisions! -Wage Slave
i'm very happy to see efficiency being targeted for once. now if only the GPU market would go this route.
AMD have always cared about efficiency though. It's intel who show no interest in it
@CompilationHUB Since 2017, yes they have. The RX 7000 series isn't as efficient because it's not as powerful as they expected, but it's not a hardware flaw; they need to fix the drivers
@@defnotatroll here we go, fangirls spreading misinformation just to shill for a corporate company
@CompilationHUB R9 290X... only the PSUs can tell the horror stories from that card haha
@@defnotatroll Not until Zen.
I'm loving the multiple approaches to computing , feels like the companies are specializing for different needs. It will be very difficult to tell who is moving in the correct direction. I commend AMD for squeezing raw cores as much as possible and Intel for going through a period of trying new things with cpu gpu and storage.
I got the 7700 non-X because not only is there very little performance difference that will be noticeable, but that 65W TDP in an SFF build is a huge bonus for thermals
It has been 7 months. Have you enjoyed it so far?
What kind of board would pair well with a 13700k, w/ 3600 cl16?
Z690s are still expensive, though lower than z790, and I don't quite know the B boards or the difference.
And I can get a wifi adapter if the board doesn't have.
Meanwhile I'm satisfied with my 5900X, but it does set good expectations for the next gen (8000), when I'm more likely to upgrade. I eagerly await seeing benchmarks for the X3D variants of these chips.
Same with my 5900x. I'll probably upgrade in 5 years right as AM5 socket support ends so I can get a mature and cheap CPU.
I'm happy having my 5950X running perfectly on an original x370 board! Longevity!
Same. I went 2700x then 5900x. Gonna stay a while
wouldn't next gen be 9000?
Got a 5900x too but I can't seem to get the temps under control, 85C when gaming no matter what the hell I do. I'm not used to having temps that high, feels uncomfortable.
Thank you for including those Watts per FPS graphs, after all! That's where we can clearly see efficiency at gaming. It's very interesting.
Watts per FPS is kind of pointless though. The only thing that should matter is FPS. Would you really want to lose 20% of your FPS to save a few watts? I mean it makes sense for laptops or anything else that runs on batteries but it's useless for desktops.
@@rubiconnn It shows efficiency and helps people who aren't building super gaming computers make an informed decision. It's not the be-all and end-all of CPU choice, but very helpful. My computer idles between 150 and 180 watts, so I turn it off when I'm not using it because that's a lot of power to just throw away. If I were to do a PC build right now, that 7600 would be on my shortlist.
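Since a few people in the thread are debating the metric, here's what watts per FPS actually computes; the FPS and wattage figures below are invented placeholders, not numbers from the video's charts:

```python
# Watts-per-FPS efficiency metric, as discussed in the comments above.
# Lower is better: fewer watts burned per frame delivered.
# All figures below are hypothetical examples, not LTT's measured data.

def watts_per_fps(package_watts, avg_fps):
    """Energy efficiency at gaming: package power divided by average FPS."""
    return package_watts / avg_fps

chips = {
    "Chip A (X-style)":  (142, 210),  # (watts, avg fps) - made up
    "Chip B (non-X)":    (88, 195),
}

# Rank chips from most to least efficient.
for name, (w, fps) in sorted(chips.items(),
                             key=lambda kv: watts_per_fps(*kv[1])):
    print(f"{name}: {watts_per_fps(w, fps):.3f} W/FPS")
```

This is why a chip can lose a little raw FPS but still come out well ahead on the efficiency chart: the denominator barely moves while the numerator drops a lot.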
Always love seeing competition bringing better, cost effective, and power efficient parts for people to enjoy doing the things they do on their computers!
hey I'm gonna Build my pc in the next 1-2 months, any idea when these are gonna be available on Amazon or offline Stores?
I'm glad to see these well-priced options showing up, but do keep in mind that the X CPUs are selling way below their MSRP right now. For the past month, the 7950X has been $550-$570 on both Amazon and Newegg.
Great price to performance here. Looking forward to what the x3d really offers to round out the lineup.
Good video. My only complaint is that the charts starting at 4:56 are not sorted by chip or average FPS. This makes it difficult to discern and compare between all the CPUs.
Useful video! I like that AMD has started to work on their power efficiency. I want to see some APUs with more CUs! A power-efficient AMD APU with 8 or 12 RDNA2 CUs would be a true competitor to NVIDIA GPUs for some gamers.
Btw, the part from 13:41 sounds like it's recorded with a phone. 😀
First, I like how your review and especially the benchmarking have improved! Thanks for including Factorio as an example! In the past I missed the coverage of games where the CPU has more to do than just feeding frames to the GPU to render. I don't care if I can play some game at 220 FPS instead of 210 FPS, but I do care if I can get a better tick rate in my Dyson Sphere Program save game with huge factories. Not relevant for everybody, but for some.
I also like this move from AMD, and that you are not just comparing the plain performance numbers but the power aspect too. This is the way to go! Sure, this means nothing to an enthusiast who wants the fastest 13900K(S) and doesn't care about the massive power draw, but for the mainstream the non-X CPUs should be the better choice.
I hope AMD will base the 3D V-Cache chips on these lower-powered non-X chips. (The TDP numbers given so far might suggest this.)
Were the tests on these non-X CPUs done with PBO turned on or off? I didn't see it mentioned in the video, and apparently it gives a decent bump in gaming.
I'm interested in buying a Ryzen 7 7700 non-X. Is it a genuinely quiet CPU to cool?
(I previously had an i7-8700 non-K and it was really quiet with a Noctua NH-U12A.)
Thanks for your answers!
I love that factorio is now a benchmark game. Such a nice introduction
Considering I bought a 7600X just last week, I should probably feel some buyer's remorse, but since it was on sale for 230€/$ I'm actually very happy that I got it a day before AMD announced the new chips, since it's now back to MSRP
I bought a 7900X, but for the same price as the 7900 currently. So I don't have buyer's remorse (yet).
I got the 7600X last week. It's 10 euros more expensive than the 7600. So a pretty negligible difference, just as negligible as the performance difference.
@@00vaag Correct me if I'm wrong, but I think the 7600X doesn't come with a cooler, and the 7600 does. So depending on what cooler you got, it could be a somewhat bigger difference.
Would it make sense to start looking at instructions per second as a metric with breakdowns of what ips are being used for different tasks?
Hey Linus, anyway you could do a little review on the Minisforum UM690? It is a very interesting mini pc with a very powerful APU that can handle a lot of games at 1080p
I am excited to upgrade to an R7 7700 for a hopefully cheap price in 2 years.
My 3700X with its low power consumption is great, and I'm hoping to get more performance for the same power consumption.
Love the Factorio benchmarks, the factory must grow and so must the UPS
I love the fact that it's a fairer test with matched memory timings. I'm interested to see how well Infinity Cache works too, but these tests are always done with Nvidia cards.
I'm a little confused. I built approximately 5 systems each with R9 and i9 CPUs, and my Cinebench scores on the 13900K were around 40000, but on the AMD 7950X I only managed to get 36000. I need to investigate why your results are a bit different on the AMD side
the graphs in this video are amazing, props to whoever made them, the highlight on the new CPUs bars is very helpful. Also, Linus's explanation for what each result means specifically is very very helpful in conveying the differences. great video, and great review, and I'm excited for the 7000x3d review when that comes out!
I liked how the non x cpu’s were highlighted in the chart, good job
It would be interesting to know if there's any difference in performance between these new non-X chips and the old X chips running in 65W Eco Mode.
That also interests me. You can boost a 7900 up to a 7900X, but I'm sure you can also turn down your 7900X.
Kinda looks like those are the same chips lol.
The 7900 is actually more expensive currently than the 7900x in Germany.
Thanks for the review! Does anyone know what the idle power consumption of the Ryzen 7900 is?
I just got the 5800X3D with a way more affordable motherboard + RAM. Got now a solid system thx to the cheap price of the AM4 platform :P
It's actually a better CPU than the 7600 or the 7600X lol, and with a much cheaper mobo and RAM, so yeah, great deal. I got the same CPU. It's a no-brainer right now if you need gaming performance
@@timcesnovar978 how does 5800x3d beat 7600 or x variant? the new am5 platform supports ddr5 ram which massive increases the performance?
@@sumedhtech1526 3D Vcache. You do know how to use google right? And no the ram doesn't make a big difference at all.
@@timcesnovar978 if i want to build a pc purely for gaming is there a more affordable cpu than 5800x3d which has a good price to performance ratio?
@@sumedhtech1526 Price to performance, no, but there are more affordable options like the 5600X and 5700X. Although if you're building a new PC, I'd go for the 5800X3D and some 3600MHz RAM
Does this not happen on every release? There's always a price cut on last gen, and non x skus have refined power usage. I've gotten the 1800x, 2600, 3700x all on good price cuts. I always buy when the price cuts happen. The 5800x dropped to like 300 and the 5800x3d to 350, have both, the 5800x3d does wonders for my mmos.
Ok I'm confused. At 11:14 you show that the 7900 runs at only 40C when running Prime95, and you say the cores were running at 100%, and yet in the other graph (12:17) you show that the core speed was around its base speed. So why didn't the cores ramp up closer to their Max Boost speed?
My expectation was that they would run as max boost speed and only start to thermally throttle when they reached about 95C.
Or to put it another way, why didn't the cores run faster given that they had so much spare thermal headroom?
ProArt X670E Combined power phases 16 + 2 are rated for only 70 amps. Comparing with MSI x670e carbon 18 + 2 90 amps, does it make a difference in processor speed?
Picked up the 7600x for $240 while it was still available at that price when AMD announced the non x and x3d cpus. I'm hoping AM5 will be just as good as AM4 and adopting now also means ready for new features and getting the most life out of the socket 🤘🏼
Same, I got my 7600x for 245 and I’m happy
Same here, in the process of building, just waiting for my mobo to arrive.
I got mine on black Friday, I get the same performance for only 10 dollars more, 2 months earlier.
Got my 7600x for 229 last week, saw the release of the non X's and laughed
Do you actually upgrade during a socket's life tho?
I bought Ryzen first gen and actually ended up swapping Motherboards, not CPUs, as I went from ATX to ITX.
having a 5950x I wasn't looking to upgrade anytime soon, however as someone really into Star Citizen right now, I've heard that the x3D cache is really helpful for it, so the 7950x3D might be my next upgrade.
Same with tarkov. I just upgraded from 3700x to 5900x and everyone who has the 5800x3d gets better frames than me. So I either kinda downgrade but get better game performance or wait for next gen 3d chips
Fortunately I stopped worrying about my PC not being able to run Scam Citizen decently a long, long time ago. My latest build doesn't even reach 60 fps at 1080p on it. And now it's a 10 year old "game", it doesn't even look that good anymore.
@@sovo1212 Well, that would be because it's still in alpha and is pretty unoptimized currently. I get 1440p 60fps+ so I'm not too worried about it really :) but to each their own.
I know the charts are still being worked out as you deal with more lab data coming in, but you've gotta make a change SOON. Every time a chart switches I have to spend 5 or more seconds trying to figure out where everything moved on the y-axis. There are definitely situations in which ordering based on performance is most helpful, but when you're shooting off several charts in a row comparing the same set of components, you've *gotta* keep those components in the same spot every time. It's way easier for us to get meaningful information *between* charts that way because we'll be able to see the bars move on the cards we're most interested in, and it cuts way down on pause-and-scan time
Perks of being a very late adopter. I just got the 5000 series recently, and in a few years once the MoBo and RAM prices go down, I'm gonna grab one of these 65W beauties
Did you test the x variants on eco mode compared to the non X variants? I wonder what the temps would do if you limit the X versions to 65W.
My 7700X in Eco Mode runs at the same temperatures as the 7700 in this video, but faster thanks to higher clocks.
I Appreciate the Factorio Benchmark :)
How fast does the 7000 series go to sleep? For mixed use, which is most users, the power draw during idle and low-load conditions matters too.
Love the intro song and I hope it sticks around.
The 13600K's score is wrong by a lot, and the 13900K is probably hitting thermal throttling. I'm not an Intel fanboy, but the testing lately is very light on gaming and in general not accurate or consistent