Budget CPU Battle Royale: Intel i3-13100F CPU Review & Benchmarks

  • Published 9. 09. 2024

Comments • 1.4K

  • @GamersNexus
    @GamersNexus  Před rokem +161

    If you missed it, check out our mini-documentary covering the final days of EVGA's GPU division -- they really deserve the watch: czcams.com/video/Gc0YlQS3Rx4/video.html
    WE WILL RUN OUT OF DISAPPOINTMENT T-SHIRT ALLOCATION BY NEXT WEEK! Grab a 100% cotton shirt here: store.gamersnexus.net/products/disappointment-pc-t-shirt-2022-100-cotton-black
    And lightweight tri-blend here: store.gamersnexus.net/products/disappointment-pc-t-shirt-2022-triblend-black

    • @Ketobbey
      @Ketobbey Před rokem +3

      Thanks for the updated version. Any chance DaVinci Resolve will show up on the charts soon?
      It has a massive market share.

    • @wawaweewa9159
      @wawaweewa9159 Před rokem

      Add a quick few-game test with an RX 5600 XT or RX 6600 as a real-world test where budget GPUs are paired with these.

    • @rawdez_
      @rawdez_ Před rokem +2

      Psst, the 5600 is 120 bucks with free delivery from China.
      P.S. AM4 boards are generally a bit cheaper than comparable LGA1700 boards, or a bit better for the same money. AM4 CPUs also eat less power, so they go easier on VRMs and have a better upgrade path if you choose from really cheap boards: practically anything (e.g. a B450M S2H) can run a 5800X3D, but to run a 13600K you need a far pricier board than a B450M S2H. The 5800X3D and 13600K are comparable in gaming.

    • @blacksama_
      @blacksama_ Před rokem

      This "opps we screw and had to reupload it up but we were right anyway and wont change our opinion" is really what you want to tell to your audience? this is by far your most confusing and apologetic review ever. And your graphs are utterly confusing, seems like you just twisted the graph to prove your point, wth are those Performance graph dude?

    • @rawdez_
      @rawdez_ Před rokem +1

      @@blacksama_ The 5500 sucks as a gaming CPU, but the 5600 exists and costs only 120 bucks, so the 5500 is irrelevant anyway and should be ignored, just like the 12100(F) and 13100(F) should be ignored. Just get a 5600 from China and an MSI B550M PRO or ASUS TUF GAMING B550M-PLUS depending on your budget, and be happy. Or even a GIGABYTE B450M S2H, or an MSI A520M PRO if it's noticeably cheaper for you.

  • @GamersNexus
    @GamersNexus  Před rokem +206

    Just got done going through comments and scrubbing a bunch of the spambots... remember that they're getting tricky! They copy/paste comments from actual users, somehow insta bot-upvote themselves, and then hope you'll click on the profile picture so they can phish you. The profiles are all of a very obvious type. Downvote or report them if you feel like contributing, but we'll keep nuking them for now!

    • @sigy4ever
      @sigy4ever Před rokem +10

      Thanks Steve, now back to you Steve.

    • @-opus
      @-opus Před rokem +4

      I generally look through the comments and report them as spam, they are ridiculously obvious and it doesn't take long to do. Have to wonder if it achieves anything with youtube though...

    • @rickysargulesh1053
      @rickysargulesh1053 Před rokem +1

      How can it phish us by just clicking on a YT profile?

    • @rube9169
      @rube9169 Před rokem +5

      @@rickysargulesh1053 They always have links in their profile that are either obviously malicious or are a redirect from a legit-looking site to a malicious site in an attempt to get your personal information. Not 100% sure how it works, but pretty sure it goes something like that.

    • @H31MU7
      @H31MU7 Před rokem +3

      Thank you for being one of the very few YouTubers I watch who actually helps get rid of these.

  • @sulphurous2656
    @sulphurous2656 Před rokem +1297

    Man, I wish the fight for budget GPUs (remember when those existed?) was as interesting as it is for CPUs nowadays.

    • @GamersNexus
      @GamersNexus  Před rokem +459

      100% agreed! GPUs need to get a similar shakeup to CPUs!

    • @riven4121
      @riven4121 Před rokem +248

      RTX 4050: 5% more performance than the 3050 but for 20% more in cost.

    • @DrakkarCalethiel
      @DrakkarCalethiel Před rokem +158

      @@GamersNexus (pokes Intel) Come on, do something!

    • @odizzido
      @odizzido Před rokem +50

      Yeah. The last good value GPU, the 5700XT, launched in 2019. Since then it's been nothing but turd after turd.
      edit--------
      due to the number of replies I have gotten, I wish to clarify things. I am talking about what AMD/nvidia have been trying to sell their cards for. Excluding companies trying to sell off the last of their stock before new cards come in, or fake MSRPs, everything after the 5700XT has been crap. The 5700XT was good on day one and remained good until crypto went nuts long after launch. Cards after the 5700XT have been worse value or in one or two best case scenarios were able to at least match the 5700XT. But matching your old generation isn't an improvement, it's pointless and a waste of time.
      So yes, you can find a better card than the 5700XT as the last of the 6600s are trying to be sold off before they're replaced by the 7600 or 7500, but it's the blip at the end of the product's life and not what it was for about a year after launch.

    • @crookim
      @crookim Před rokem +28

      What is a budget GPU?

  • @Meat88
    @Meat88 Před rokem +538

    One major problem with price per FPS is that prices are always changing.

    • @ArtisChronicles
      @ArtisChronicles Před rokem +26

      True. The price of the 5700x went down about a month after I got it lol

    • @shanenokes6535
      @shanenokes6535 Před rokem +41

      It would be ideal to see the price they are using for the FPS per dollar comparison.

    • @ChrisLightbulb
      @ChrisLightbulb Před rokem +15

      Yeah, this is why Hardware Unboxed has to do a monthly price/performance investigation and video...

    • @michaelmonstar4276
      @michaelmonstar4276 Před rokem +2

      @@ArtisChronicles - Hate when that happens... But then, when you buy something, it better be worth the monetary investment to you, so that you can't complain. - Still, if you might be on a tighter budget (as I often am), it must sting. So I really have to jump on like sales or price-drops, which I did with the 5600 and the promotional "Uncharted 4" key. - That's a good way of throwing in an incentive and adding some value, besides that the price was good for the CPU alone anyway.

    • @Masoch1st
      @Masoch1st Před rokem +8

      It still provides valuable context. You know that if things got cheaper, then so did your cost per frame. It's not rocket science.
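
      The arithmetic this thread is pointing at is just a division, so it's easy to redo whenever prices move. A minimal sketch in Python, with placeholder prices and FPS figures rather than GN's measured data:

        # FPS-per-dollar is trivially recomputable when street prices change.
        # All numbers below are placeholders, not GN's test results.
        cpus = {
            # name: (assumed street price in USD, assumed average FPS)
            "i3-13100F": (110, 126),
            "R5 5600":   (140, 140),
        }

        def fps_per_dollar(price, fps):
            return fps / price

        for name, (price, fps) in cpus.items():
            print(f"{name}: {fps_per_dollar(price, fps):.2f} FPS/$ at ${price}")

        # A sale price is enough to flip the ranking, which is the thread's point:
        # the chart is only valid for the prices used on the day it was made.
        print(f"R5 5600 on sale: {fps_per_dollar(119, 140):.2f} FPS/$ at $119")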

  • @Meansoduck
    @Meansoduck Před rokem +464

    Class act to take community feedback, add more CPUs to the charts for comparison, and redo the video with new testing. Can't imagine the work that required. This is why you guys are awesome.

    • @GamersNexus
      @GamersNexus  Před rokem +234

      Thanks, uh, "pooplord." It means a lot that a Lord of your demeanor says something so kind!

    • @volvo09
      @volvo09 Před rokem +1

      The pumper truck challenges your claim of lord.

    • @BigDonkMongo
      @BigDonkMongo Před rokem +15

      ALL HAIL POOPLORD

    • @GrabLifeByDaPoosy
      @GrabLifeByDaPoosy Před rokem +2

      @@BigDonkMongo Hear hear 🍻

    • @RandomPerson-vf3ld
      @RandomPerson-vf3ld Před rokem +1

      @@GamersNexus Feels like the internet of old

  • @Jistarii
    @Jistarii Před rokem +659

    We love these videos but aren't you supposed to be taking a break Steve? *uses concerned parent voice* Come on man take a rest you've been killing it and we want you around for many years.

    • @GamersNexus
      @GamersNexus  Před rokem +600

      Just one more video. I can stop anytime I want! I swear!

    • @aftermax01
      @aftermax01 Před rokem +20

      @@GamersNexus meanwhile I'm watching the video at 00:30
      I woke at 04:00

    • @bjn714
      @bjn714 Před rokem +7

      @@GamersNexus please take the break! Maybe it's the overwork, but the cringy Eminem "Forgot about Cezanne" joke was a definite departure from the absolutely fantastic joke writing of the last few months of videos.

    • @Distrusted_Citizen
      @Distrusted_Citizen Před rokem +1

      @@GamersNexus YOU BETTER❤

    • @trousersnake1486
      @trousersnake1486 Před rokem +23

      @@GamersNexus no, stop it. You and the staff need rest. I want full energy for 7000x3d chips

  • @Darxide23
    @Darxide23 Před rokem +153

    A large YouTube channel taking suggestions from the comment section and incorporating them into an updated video? Preposterous!
    Seriously, this is one of the reasons I love you guys. The content here isn't backed up by pride or bias. It's cold, hard facts.

    • @PotatMasterRace
      @PotatMasterRace Před rokem +7

      Ain't it what all monetarily incentivised channels should do to keep their core following intact and... following? This video made me want to help this channel monetarily, and I'm a humongous cheapskate.

    • @Freestyle80
      @Freestyle80 Před rokem

      yeah we should always listen to amd fangirls

  • @tarfeef_4268
    @tarfeef_4268 Před rokem +303

    I agree with the fps/$ reservations because there are just so many ways people buy CPUs:
    - drop in upgrades
    - platform upgrades
    - new builds
    For all of these, and especially for a huge amount of varied price tiers in the latter two, the actual cost and fps increase by changing CPUs is hugely different, and it's difficult to show this in one or even 2 or 3 charts
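
    To make that concrete, here is a rough sketch (hypothetical prices only, not GN's figures) of how differently "per dollar" works out across those three purchase types, since the money actually spent to get the same CPU running is not the same:

      # Hypothetical street prices in USD (illustrative, not GN's figures).
      CPU, BOARD, RAM = 110, 120, 60
      AVG_FPS = 130  # assumed average FPS with this CPU in some test suite

      spend = {
          "drop-in upgrade (reuse board/RAM)": CPU,
          "platform upgrade (new board too)":  CPU + BOARD,
          "new build (board and RAM too)":     CPU + BOARD + RAM,
      }

      # Same CPU, same FPS, but three very different value numbers.
      for path, cost in spend.items():
          print(f"{path}: ${cost} spent -> {AVG_FPS / cost:.2f} FPS per dollar")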

    • @GamersNexus
      @GamersNexus  Před rokem +100

      Great input & agreed on these points! We'll keep tuning/tweaking it! Thanks for commenting. Good thoughts.

    • @instantstupor
      @instantstupor Před rokem +8

      Definitely good points, especially since it is - effectively - theoretical. What GPU you get & what resolution you game at can have a MASSIVE impact on your personal, real-life FPS per dollar rating. When, like in this video, you're looking at budget options, you aren't going to see real-life FPS increase per dollar on a matching lower end GPU compared to something like the 3090Ti they use in their chart.
      You can get all kinds of differing bottlenecks between different hardware combinations & resolutions. For example, I upgraded from a 6700k to a 5600x the same time I upgraded my GPU to an RTX 3060Ti. The 5600X would have a meaningful FPS impact on that GPU when gaming at 1080p to help justify its value, whereas at 1440p and up the bottleneck shifts pretty heavily to GPU so would be comparably poorer. And even RAM rank, speed, etc can impact the outcome. Feels like FPS/$ charts are simply too broad & theoretical to have much applicable value.

    • @HappyBeezerStudios
      @HappyBeezerStudios Před rokem +7

      Exactly. The graph basically compares how these CPUs today (as in on this very one day) on the US market do as a drop-in upgrade for people who play those 7 games.
      If you need to upgrade the whole platform, watch the video on the weekend, don't live in the US or play any other game, the graph is rather worthless for price comparisons.

    • @saricubra2867
      @saricubra2867 Před rokem +1

      I bought an i7-12700K for $330; it's a 2021 batch with AVX-512. The MSI Z690 Pro A DDR5 was cheap too, 150 bucks. Then I added two 1TB M.2 NVMe PCIe Gen 4 SSDs.
      It's a music production and console emulation PC, no graphics card needed. It's air cooled (Deepcool Assassin 3 and a Corsair 5000D Airflow). I bought an overkill power supply (wasn't cheap).
      Waiting for a good GPU. The 3080 Ti is a good card, but it uses way too much power and makes a lot of noise; Radeon is even worse.
      All of the 3D cache nonsense and the overheating CPU problem could be solved if both AMD and Intel simply added quad-channel memory support and focused on increasing IPC.
      My alternative was an overpriced gaming laptop with overheating chips, less RAM and storage, and slow processors. At least a 3060 mobile is an amazing 1080p beast at low wattage, and the i5-13600K is a no-no.

    • @saricubra2867
      @saricubra2867 Před rokem

      I almost chose AM5 because I read somewhere that it would have quad-channel memory support, then I remembered that AMD wants to sell gimmicky 3D cache chips, so that never happened.
      Quad-channel memory support is so overlooked. It's crazy what a performance upgrade it was for Intel HEDT in games.
      With DDR5, imagine the absurd performance increase vs. dual-channel DDR4.

  • @festusian9089
    @festusian9089 Před rokem +123

    I feel sorry that you're faced with an impossible task. There seem to be a lot of viewers who can't, or are unwilling to, look at a graph and analyze the results, and who simply look for the highest number. Your disclaimers about percentage charts were great, but many won't see the difference. Still, keep doing what you're doing, because it's vital info for thousands.

    • @lurick
      @lurick Před rokem +13

      You mean I have to READ and THINK!!!
      Get out of here!!!

    • @aqulex84
      @aqulex84 Před rokem +12

      Thanks man. 🙏 Amen for that statement. I don't get why people are no longer able to use their own freaking CPU, aka brain, to understand reviews.
      Feels like most of those guys are still on a budget CPU from the mid-2000s.

    • @BBWahoo
      @BBWahoo Před rokem +5

      @@aqulex84
      They need to overclock with adderall

  • @Hardwareunboxed
    @Hardwareunboxed Před rokem +13

    ANNDDD... THERE'S THE RE-ROLL

    • @rednammoc
      @rednammoc Před rokem

      There's a deep well of jokes somewhere

  • @deathb1ossom
    @deathb1ossom Před rokem +66

    Gamers Nexus killing it with the hard work yet again

  • @khaaaaaaaaaannn
    @khaaaaaaaaaannn Před rokem +22

    I came for the unbiased review and in depth testing results, I stayed for the unexpected Dr Dre reference.

    • @Blafard666
      @Blafard666 Před rokem +8

      I had to mute the video for a second to be sure the music was indeed coming from it XDD

    • @GamersNexus
      @GamersNexus  Před rokem +6

      @@Blafard666 hahahaha

  • @CloneMalone
    @CloneMalone Před rokem +34

    Meanwhile at NVIDIA:
    what does budget mean

    • @ChrispyNut
      @ChrispyNut Před rokem

      It means increase the available budget for Leather jackets to make "all of the leather jackets" viable.

    • @jrsomethingnumbers9704
      @jrsomethingnumbers9704 Před rokem

      A $499 4060, which is technically under $500 lmao

  • @shaddow_24
    @shaddow_24 Před rokem +37

    For budget/value CPU reviews, I feel like the included cooler should be part of the assessment as well. For example, does it even come with a cooler, and does the cooler let it run as intended? I know the coolers are generally the same ones they've had for years, but it is part of what lets budget builders save a bit, especially as it affects a higher percentage of the build cost at the low end, as you said. Also, I know your charts are done with an AIO, which is sometimes unattainable for budget builders, but I understand it makes the chart more consistent. If there were an i3-12100/13100 vs. 5500/5600 air-cooled numbers chart, that would make a good additional data point to evaluate. Thanks for the review.

  • @Bulldawzer
    @Bulldawzer Před rokem +36

    Respect for the integrity to pull the review and include the 5500.

  • @JohnCarter04
    @JohnCarter04 Před rokem +37

    Steve and everyone involved,
    I greatly appreciate your effort to not only show the performance of whatever new things come into existence from our corporate overlords, but to really dig into these new products and find new/better ways to show how they compare to the competition. I know you won't, but NEVER STOP FIGHTING FOR THE CONSUMER! As a Raleigh native who is just as enthusiastic about new tech as you, nothing will ever compare.

    • @GamersNexus
      @GamersNexus  Před rokem +17

      Thank you for the support! Keep that tech enthusiasm up! There's a lot to learn at all levels.

  • @saltee_crcker2363
    @saltee_crcker2363 Před rokem +143

    you remain the hardest working tech tuber imo and the effort, care, and thoroughness you and your team put into GN content is amazing and appreciated. you are the standard for me when it comes to tech information and news.

    • @GamersNexus
      @GamersNexus  Před rokem +37

      Thank you for your kind words!

    • @mkendallpk4321
      @mkendallpk4321 Před rokem +8

      @@GamersNexus No Steve, thank you to you and the team for providing unbiased information that is both useful and relevant.

    • @pcmasterracetechgod5660
      @pcmasterracetechgod5660 Před rokem

      Yup, love his videos. I wonder if he'd ever consider a website similar to what LTT is trying to do, where GN could post lengthier charts with more components. Like I was saying on the previous video, I'd love to see older Intel CPUs included in these charts,
      because he has stuff like the 1700, 2600, 3600, etc., but no competing Intel CPUs from those eras such as the almighty 8700K, plus others like the 9700K and 10600K.

  • @SpiritHealer1
    @SpiritHealer1 Před rokem +52

    Glad to see you guys back and with new charts

  • @Bezray
    @Bezray Před rokem +82

    The CPU market hasn't been this exciting since Ryzen was introduced. Great video as usual, GN!

    • @shadow7037932
      @shadow7037932 Před rokem +40

      Now if only this could happen to the GPU market ...

    • @Noah-lj2sg
      @Noah-lj2sg Před rokem +21

      @@shadow7037932 IF intel stays in the GPU market, and IF they can leech some nvidia engineers, they'll be amazing in 5-10 years

    • @tilapiadave3234
      @tilapiadave3234 Před rokem +2

      Yes, after AMD finally... how many DECADES to catch up?

    • @ChrispyNut
      @ChrispyNut Před rokem +8

      Which is ironic, given that AMD has abandoned the lower-end CPUs for years now. The 3300 was the last one, and that had really low availability before disappearing pretty quickly.
      E2A: I might be a bit harsh; on second thought, they did have the G series, which was kinda down there, but only "kinda".

    • @GamerBoy705_yt
      @GamerBoy705_yt Před rokem +1

      @@ChrispyNut The G series is still just rebranded mobile processors. 5600G is a rebranded 5600H, 5700G is a rebranded 5800H

  • @Valtekken
    @Valtekken Před rokem +26

    Talking about feedback, I'd love to see some sort of "emulator benchmark", as in taking a set of stable games on RPCS3 and Xenia and using them as a way to compare performance between CPUs. All the cores and threads in the newer CPUs really shine when it comes to emulation of 7th gen consoles.

    • @Woodzta
      @Woodzta Před rokem +2

      I'd like to see that. I know there's an RPCS3 CPU tier list written up on Reddit. The i3-12100 is rated decently on it, and rated high if you manage to get AVX-512 enabled. I haven't found much for Xenia yet, but it honestly runs even better for me in the two games I tested, when it works. It's just that its compatibility and stability aren't nearly at RPCS3's level.

    • @blank3478
      @blank3478 Před rokem +1

      They could add the Dolphin 5.0 benchmark; AnandTech has included it for years, although having several emulators would be pretty nice.

    • @itsTyrion
      @itsTyrion Před rokem +1

      @@Woodzta Xenia is less CPU intensive because the Xbox 360's CPU is more straightforward.
      It's a 3-core PowerPC CPU with SMT. That's it. The main difficulties there are the L4 cache, the stronger GPU, and the unified RAM+VRAM.
      The PS3's CPU has 1 PowerPC core and 8 128-bit co-processors with their own cache and insane-for-the-time throughput.
      Look up the schematic on Wikipedia ("Cell Broadband Engine"); it's infamous for being very hard to develop for, let alone emulate.

  • @stuartlunsford7556
    @stuartlunsford7556 Před rokem +12

    GN and Dre, didn't see that one coming.

  • @ionamygdalon2263
    @ionamygdalon2263 Před rokem +25

    The "How Things Work Here" was the best bit! Keep up the AMAZING work 😉

  • @t5kcannon1
    @t5kcannon1 Před rokem +17

    Thank you Gamers Nexus for the revised video! Excellent work, great job.

  • @joannaatkins822
    @joannaatkins822 Před rokem +16

    Honestly you've continually pushed your quality higher and higher, and I hope we all agree on that. Thank you for your dedication to clear and informational communication with an emphasis on contextual data.

  • @TomSp88
    @TomSp88 Před rokem +2

    I like the Dr. Dre "Forgot About Dre" tune insertion with the rap from Steve.

  • @wrestlevania
    @wrestlevania Před rokem +21

    Properly belly-laughing at the sudden "Forgot About" homage, well played GN.

  • @thisuraD
    @thisuraD Před rokem +2

    That 'Forgot About Dre' music at 5:10 when you explained you forgot about the 5500 was a nice touch.
    All in all, another quality GN vid. Keep it up!

  • @-feonix48-47
    @-feonix48-47 Před rokem +3

    TWO Steve’s in the thumbnail????? You spoil us GN

  • @jackmiller8851
    @jackmiller8851 Před rokem +12

    I've had the 12100F for almost a year now, just upgraded to 1440p. Still does the business. Using G-sync at 165hz I'm perfectly happy... I'd have happily picked an AMD cpu but for the NZD$175 i paid there was simply no equivalent in terms of value/performance. It's great to see for those that are due an upgrade, there is another new budget option.

    • @PDXCustomPCS
      @PDXCustomPCS Před rokem +1

      Good choice. I've done a lot of builds with the 12100F. If you get the Mortar Max, you can hit 5.1GHz very easily. Worthy upgrade if you get a deal on it 😁. The 12400F hits 5.3GHz with ease on a 240mm AIO too.

    • @kubotite9168
      @kubotite9168 Před rokem +1

      @@PDXCustomPCS The ASRock B660M PG Riptide is cheaper and can do the job too.

    • @PDXCustomPCS
      @PDXCustomPCS Před rokem +1

      @@kubotite9168 No it can't. I've had all of them.

    • @PDXCustomPCS
      @PDXCustomPCS Před rokem

      @@kubotite9168 It doesn't have a clock generator. FYI.

  • @kaseyboles30
    @kaseyboles30 Před rokem +20

    In chart 1 the 4090 is paired with a 12100F, a massive CPU bottleneck, for the highest FPS/dollar. I think this aptly demonstrates Steve's point about the number not being the whole, or even a significant portion of, the picture. And perhaps why he made sure to make that point so thoroughly up front.

    • @filippetrovic845
      @filippetrovic845 Před rokem +3

      I have exactly the same CPU + GPU combo. It's not nearly as massive a bottleneck as you would think. There is no lack of graphical settings in games that can tame a 4090 down to performance that is reachable by this tiny beast.

    • @derbigpr500
      @derbigpr500 Před rokem +2

      The whole idea of bottlenecks is so overrated it's hilarious at this point.

  • @jaysmudger
    @jaysmudger Před rokem +2

    Steve, you should not have pulled the first video. There is more than enough information out there for the viewers of your channel to make an informed opinion for themselves. This channel is one of the best, if not the best, for honest, no-shill hardware reviews, and I can't see a good reason that you unfortunately had to waste resources making this video when they could have been used for something else. Thanks for the excellent content.

    • @GamersNexus
      @GamersNexus  Před rokem +1

      Thanks for that. The response was strong enough at the outset that we decided it was worth doing just to be sure.

  • @tarfeef_4268
    @tarfeef_4268 Před rokem +8

    Re: removal of near-identical CPUs
    I agree with the space+time savings by removing one, but I think you should default to the cheaper part, to represent the choice you actually recommend. That way discussions will be clearer and easier to understand. If you're going to say "maybe consider the 5600", then show the 5600, not the X.
    Props for taking feedback and reuploading.

    • @GamersNexus
      @GamersNexus  Před rokem +9

      We normally do, but then people get mad that we aren't showing the 'best' part. There's no winning.

    • @tarfeef_4268
      @tarfeef_4268 Před rokem

      @@GamersNexus 🙃 of course, I should have expected as much. I still think going with the part you (at least have a better chance of) recommend(ing) is the way to go, though.
      May as well get some extra utility if there's gonna be haters either way

    • @XX-_-XX420
      @XX-_-XX420 Před rokem

      I personally think the 5600X is better to show, since the 5600 with PBO ties the 5600X with PBO. I wouldn't mind too much either way since it's so close, but IMO showing the 5600X is better.

    • @tarfeef_4268
      @tarfeef_4268 Před rokem

      @@XX-_-XX420 Why not just show the 5600 with PBO, then? Since that's what they're actually recommending people get and use, they should show that instead.

    • @XX-_-XX420
      @XX-_-XX420 Před rokem

      @@tarfeef_4268 That's also fair, but I think just running the 5600X probably saves time while showing roughly max performance. (PBO probably gains the 5600X like 5-10% in gaming or something, but that doesn't matter too much.)

  • @cldpt
    @cldpt Před rokem +2

    OMG you guys even addressed my "percentage is tricky without a baseline" comment! THANK YOU!

  • @monojira
    @monojira Před rokem +5

    watching this again, commenting for the algo and upvoting because you guys deserve it. The amount of work you put into this is insane

  • @bricktronics
    @bricktronics Před rokem +4

    I love how you listened to the comments and revisited this review. That is one of many reasons why this channel is AWESOME! Thanks for what you do Steve! 👊

  • @RetroWarhead
    @RetroWarhead Před rokem +7

    Your knowledge, detailed testing, and integrity to the game makes your videos my favorite on YT.

  • @wewillrockyou1986
    @wewillrockyou1986 Před rokem +11

    It's kinda important to consider the price of the platform in FPS/$. It's not really a big problem for the 12100, which you can run on the lowest of low-end LGA1700 boards, but especially with the likes of AM5, motherboard costs inflate the overall price well beyond just the CPU.

  • @ChrisBeardSAP
    @ChrisBeardSAP Před rokem +3

    Great, now I've got 'so you forgot about dre' going round and round in my head.
    Great job, Steve!

  • @jordi95
    @jordi95 Před rokem +1

    All that extra work to get the same answer: that is what a good scientific method is about! Great job!

  • @PaulsTechSpace
    @PaulsTechSpace Před rokem +46

    I really hope they make a significant update to the i3 this year with core counts like 4P cores + 2 E cores or 4+4 cores.

    • @mtunayucer
      @mtunayucer Před rokem +15

      4P+8E would be a legit Ryzen 5 killer. And let's leave the quad-core stuff to Pentiums and Celerons.

    • @MafiaboysWorld
      @MafiaboysWorld Před rokem +9

      @@mtunayucer Except that's 16 threads, which makes it an R7 competitor, and the R7 will slap it silly. 4P+4E is 12 threads. 👍

    • @mtunayucer
      @mtunayucer Před rokem +45

      @@MafiaboysWorld Bruhhhhhh, who compares CPUs by thread count?? You compare CPUs by PRICE. It doesn't make it an R7 competitor. That's the whole point!

    • @MafiaboysWorld
      @MafiaboysWorld Před rokem +1

      @@mtunayucer The companies themselves do. People do. Have for decades. That's why Intel, with only 8p+16e, so 32 threads total, calls itself the i9 13900K and competes with the R9 7950X which is guess what genius, 16 cores and 32 threads!
      But "wHo cOmPaReS tHrEaD cOuNtS?!" right? 🙄🤦 Quit while you're behind already dimwit. 👎

    • @ezer0923
      @ezer0923 Před rokem +2

      Hoping for 4p+8e as well
      Maybe for the i3 1x300?

  • @vailpcs4040
    @vailpcs4040 Před rokem +3

    Loved the Dre reference! Thanks so much for adding the 12100F, I really wanted to see this!

  • @depression_daily
    @depression_daily Před rokem +5

    Love the new charts, especially the FPS/$. Puts things in a better perspective for gaming.

  • @brucepreston3927
    @brucepreston3927 Před rokem +7

    This was great! I never shop in budget class, but I still enjoy watching the charts...It's amazing how good a $100 CPU is these days! I appreciate all the hard work you and your team put into these videos, and I'm excited for more fan and PSU content also!

  • @KmanKarl1
    @KmanKarl1 Před rokem +9

    I got my 5600 for $119, and after increasing the boost frequency by 200MHz it has identical performance to a 5600X! It's a powerful and inexpensive CPU that I've paired with my RTX 3080.

    • @Jsteeeez
      @Jsteeeez Před rokem

      Yeah, but if you take a 5600X and do the same overclock, it is again going to be much faster than the 5500, which literally has half the L3 cache at 16MB versus 32MB on the 5600(X).

    • @rookiebeotch
      @rookiebeotch Před rokem

      I got my 5600 at the same price point. Instead of adding the 200MHz boost, I fine-tuned Curve Optimizer per core. Cinebench increased by several percent, not as much as the 200MHz boost would give, but the temps and power usage are much, much lower.

    • @tilapiadave3234
      @tilapiadave3234 Před rokem

      The R5 5600 (non-X) is an EXCELLENT choice at the $120-$125 mark, FAR better value than the OVERpriced 5800X3D.

    • @jamesm568
      @jamesm568 Před rokem

      @@tilapiadave3234 But the performance of the 5800X3D supersedes the value of your little budget chip. Besides, you can get a 5800X3D for less than $300.

    • @tilapiadave3234
      @tilapiadave3234 Před rokem +1

      @@jamesm568 Yes, the WAY overpriced, EXTREMELY overrated 5800X3D can beat a 12100F... but is that an i5, YES a mere i5, I see way ahead of it in many of the charts? WOW... so embarrassing for the 5800X3D shills.

  • @kinomora-gaming
    @kinomora-gaming Před rokem +2

    OMG I see you did the Blue = Intel; Orange = AMD thing for the FPS per dollar chart. Not sure if that was on purpose but I recommended you do this a few times and I'm SOOO happy to see it!! Wow, that looks so good :)

  • @moose6667
    @moose6667 Před rokem +10

    Sorry Guys, the internet has spoken, every graph has to include every processor from the last 5 years. We look forward to your new once-yearly video upload schedule. 😂 But seriously, thanks as always to Gamers Nexus for all the hard work and dedication, you guys rock.

    • @ArtisChronicles
      @ArtisChronicles Před rokem

      That's a scary amount of work. A monstrous undertaking...

    • @HappyBeezerStudios
      @HappyBeezerStudios Před rokem

      5 years? We'd have to go back at least to the Q6600 to include all possibly relevant computers of everyone interested in getting a new CPU :D

    • @XX-_-XX420
      @XX-_-XX420 Před rokem

      I mean, every once in a while that would be super cool to see, like first-gen i7 parts overclocked, all the way up to modern parts.
      Or like a 7GHz FX-8350 beating stuff like the 5500; that would be hilarious, and cool. (Or I think it was the 8350 @ 7GHz that was as fast as an R5 3600.)

  • @Shryquill
    @Shryquill Před rokem +1

    Hey Steve, if you're having trouble incorporating all the relevant information into the Cost-per-Frame graphs then I'd recommend checking out a channel called "Hardware Unboxed", they have some really good graphs you could use as inspiration.
    They manage to include maximum performance per CPU and a cost breakdown of all cost of CPU/Motherboard/RAM per test, making it much easier for the viewer to check Cost-per-Frame for themselves later when the prices change.
    It's important to remember the point of a Cost-per-Frame graph is to illustrate value, while a maximum performance graph is better suited to showing maximum performance. 👌

  • @rishabh5669
    @rishabh5669 Před rokem +3

    What's the point of comparing games that are already running at more than 120 FPS? Compare the games that struggle to push 60 FPS on high settings. Not everyone has a 144Hz monitor, but pretty much everyone has a 60Hz monitor. Test these CPUs on games like Cyberpunk, Valhalla, and Dying Light 2. These games may not be well optimised, or may be more CPU demanding, but they can keep your build ready for any title if you can run them at High with 60+ FPS.

  • @TheYuppiejr
    @TheYuppiejr Před rokem +2

    I feel like the distinction @8:23 is gray with the Intel stuff, as the 13100F and other "Raptor Lake" CPUs using Golden Cove P-cores really are just remixed/rebadged Alder Lake CPUs. Odds are really good the rebadged Alder Lake 13xxx parts are going to be paired with an Intel B660 series board and last-gen DDR4, which makes it a complete "last gen" solution in line with an AM4 processor and 5xx-chipset mainboard.
    I wonder how much this skews the relative uptake numbers for new-generation products from both Intel and AMD... AMD has more clearly distinct platform and processor divisions for AM4 vs. AM5, while Intel is much more mix and match between Alder and Raptor Lake generation CPUs and chipsets/boards.

  • @furiosarana1857
    @furiosarana1857 Před rokem +8

    Have you considered price-vs-performance scatter plots (possibly with iso-price/perf lines)? With the entire value discussion being comparative, the difficulty of reading accurate values in scatter charts does not matter.

    • @GamersNexus
      @GamersNexus  Před rokem +5

      We can try it. Sounds maybe difficult to understand, but I'll play around with it!

    • @Jsteeeez
      @Jsteeeez Před rokem

      @@GamersNexus If you do add new charts, I would suggest explaining how the chart works. Even if it seems simple, it's always good to take 1-2 minutes explaining the charts and making the content more accessible to more people.

    • @HappyBeezerStudios
      @HappyBeezerStudios Před rokem

      I've seen a couple scatter plots for that kind of data. Really useful as a visual tool.

    • @HappyBeezerStudios
      @HappyBeezerStudios Před rokem

      @@ArdgalAlkeides Obviously, as cards stop being sold, they leave the graph. Nobody would be interested in seeing how well an 8800 GT or an HD 4870 or a GTX 570 do nowadays.
      And as OP said, it's more about having a broad overview of everything available.
      That the plot would be obsolete a week later is also obvious, but the same can be said for any price/performance graph.
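
      For reference, the kind of chart being suggested is straightforward to mock up. A minimal matplotlib sketch, using made-up prices and FPS numbers purely to show the iso-value lines (every point on one dashed line has the same FPS per dollar):

        import matplotlib.pyplot as plt

        # Made-up street prices (USD) and average FPS; placeholders, not GN data.
        cpus = {
            "i3-13100F": (110, 130),
            "i3-12100F": (95, 124),
            "R5 5500":   (100, 105),
            "R5 5600":   (130, 140),
        }

        fig, ax = plt.subplots()
        for name, (price, fps) in cpus.items():
            ax.scatter(price, fps)
            ax.annotate(name, (price, fps), textcoords="offset points", xytext=(5, 3))

        # Iso-value lines: FPS = value * price, one line per FPS-per-dollar level.
        prices = list(range(80, 165, 5))
        for value in (0.8, 1.0, 1.2, 1.4):
            ax.plot(prices, [value * p for p in prices], linestyle="--", alpha=0.4)

        ax.set_xlabel("Price (USD)")
        ax.set_ylabel("Average FPS")
        ax.set_title("Price vs. performance with iso FPS/$ lines (illustrative)")
        plt.show()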

  • @popcorny007
    @popcorny007 Před rokem +1

    Removing CPUs from the chart:
    - 100% agree, saves time, looks better, zero complaints.
    Removing the 5600, keeping the 5600X:
    - Don't quite understand
    - In a conversation about value, it would make sense to show the better value choice of two similar-performing products.
    Industry-leading work in terms of dedication to clarity though, love everything you do.

  • @carlosteixeira3966
    @carlosteixeira3966 Před rokem +11

    GamersNexus again setting the bar even higher for tech industry reviews! Thank you for your dedication.

  • @willfancher9775
    @willfancher9775 Před rokem +1

    Thank you for the commentary on FPS per dollar. A lot of reviewers have been using that metric lately, but without the disclaimer about its questionable utility. (Plus a lot of them call it "dollars per frame", which is just factually an incorrect term for the metric since it removes the time component of FPS.)

  • @jamesblackmore2598
    @jamesblackmore2598 Před rokem +3

    The R5 5500 overclocks to 4.8GHz very easily. I have mine at 4.422GHz just because I'm using a stock cooler and I want the temps under 80 degrees. Intel chips don't overclock as well, and I got my 4.4GHz 5500 with a free game I wanted too, so all in all I'm happy with my new UFO RIG. I love you guys, but you're wrong to hate the R5 5500... it's a beast in hiding!

    • @jamesblackmore2598
      @jamesblackmore2598 Před rokem +1

      Please do a chart of overclocked Intel vs. AMD budget CPUs; it would be great to see.

  • @Dyjsong
    @Dyjsong Před rokem +1

    Never did I think that I'd be watching Tech Jesus make a "Forgot About Dre" reference.

  • @saddamalhasi802
    @saddamalhasi802 Před rokem +3

    Love watching your content, man. You're on another level. We promise we will keep supporting this amazing effort. Keep it up 🌹

  • @cnhtol1586
    @cnhtol1586 Před rokem +1

    I am glad you pointed out that the i3-12100F also goes on discount sale. Some big tech YouTubers only point out that the Ryzen 5600 goes on discount sale. It's so easy to be fair, but some just choose not to be. I am glad you are not shaping your narrative to favor any brand.

  • @Sunlight91
    @Sunlight91 Před rokem +8

    FPS/$ is much better suited for graphics cards. For a CPU you have to include motherboard prices and RAM because it doesn't do anything without these components. Choosing good options for these parts is impossible considering the diverse audience, but even the cheapest options will noticeably change the value for these budget CPUs.

  • @benjaminlynch9958
    @benjaminlynch9958 Před rokem +2

    Steve, great video (THANKS STEVE!) LOL.
    When you guys are reviewing really low-end CPUs in the future, maybe it would be useful or helpful to pair them with midrange or budget GPUs. I understand the testing methodology and wanting to remove bottlenecks, hence using a top-tier GPU, but in the real world nobody is pairing a $1,500 GPU with a $150 CPU. What's really relevant for anyone considering buying a CPU this far down the stack is whether the bottleneck is the CPU or the GPU it will be paired with in real life. Testing with something like an RTX 3060 is highly relevant at this price point, and it would be really interesting to see if all of these CPUs are functionally equivalent with that big of a drop in GPU performance.

  • @WSS_the_OG
    @WSS_the_OG Před rokem +3

    Nice job as always. This must have been a lot of extra work, but it served as a good reminder of your testing philosophies and methods. Well done, Steve and team.

  • @PotatMasterRace
    @PotatMasterRace Před rokem +1

    We all have different approaches to "budget gaming", I guess. As a person on a tight budget I want the biggest bang for my little buck, and I don't care how much new parts cost in my country, so I saved $50 by ordering a 5600 from Ali (it is around $165 locally), and bought a used mobo, RAM, and GPU on a local online flea market.

  • @kaiseralbrecht6099
    @kaiseralbrecht6099 Před rokem +3

    Thanks Steve

  • @FORCEITDOWN
    @FORCEITDOWN Před rokem

    Love the subtle nod to "Forgot About Dre"; thought I had Spotify playing in the background lol

  • @tomcatsdb4755
    @tomcatsdb4755 Před rokem +3

    Something that I think would also benefit budget CPU reviews like this is a chart or two where the CPUs are paired with one or two budget-class / mid-tier GPUs as well. Maybe something like the 6600, as kind of a reality check against what an actual budget build with the CPU would perform like. The point being that gaming performance between these CPUs may well flatten out with a lower-end GPU, which could change the value equation. If a CPU could technically run a game 30% faster, but only when paired w/ a 4090, then you'd be paying a premium for a performance difference you're never going to see. Just a thought.

    • @jeffjungers2034
      @jeffjungers2034 Před rokem

      That is not the purpose of these videos. The whole premise is to remove the GPU as a variable and push the CPU to its absolute limits. There are far, far too many GPUs available to do testing like this; it was 10 hours for the addition of a single CPU. Adding an arbitrary "realistic budget GPU" to the stack means doubling the testing time on the gaming side. You might just have to use the excellent and extensive data GN has provided and draw your own conclusions.

    • @tomcatsdb4755
      @tomcatsdb4755 Před rokem

      @@jeffjungers2034 I know full well how benchmarking works and why GN does what they do; however, this is the first time they've done a "value" metric in terms of FPS per dollar, because it's important to a significant portion of the target market for these CPUs (someone building a budget gaming rig). It's that audience that I had in mind when making the suggestion (and IIRC, GN did ask for feedback about this addition).
      I disagree with your understanding of "the purpose of these videos". GN's really the only one w/ authority to speak on that, but my take is they want to give their audience enough information to make informed purchasing decisions.
      IMO an important aspect of budget builds, probably more so than higher-end rigs, is balancing out component choices. If processors in this range are functionally equivalent when paired with a mid/low-tier GPU, that's useful information, as the price difference between CPUs and their platforms can be put towards a different component that will impact the user's experience. Put another way, one data point that's largely missing for potential buyers of these CPUs is what class of GPU will be bottlenecked by them. I'd argue that's far more important than how fast they can push a 4090.

  • @justhitreset858
    @justhitreset858 Před rokem +1

    Speaking of being around here for a bit, I think the first video I watched of yours was the GTX 960 4GB vs 2 GB way back when it came out. To see not just the set change so much but the quality of content come so far is a testament to you and your team's hard work. Keep it up.

  • @dennisp.2147
    @dennisp.2147 Před rokem +6

    Ryzen 5 5500 has a very specific use case, cheaply upgrading older boards that only have PCI-E 3.0 anyway. It was useful to give my son's circa 2018 Gigabyte AB350 and Ryzen 3 2200G a cheap boost. Gigabyte's been very good about keeping the microcode updated for new BIOS/UEFI releases.

    • @_sneer_
      @_sneer_ Před rokem

      You do realise that usually GPU PCIE lanes are provided by the CPU, not the motherboard? So a PCIE 4.0 GPU with 5600 on a PCIE 3.0 board will still run in 4.0 mode

    • @HappyBeezerStudios
      @HappyBeezerStudios Před rokem

      Basically a competitor to the 3600. Or for people who got a cheap 1600AF back in the day and want a decent upgrade for a decent price.
      Would be interesting to see how the 5500 compares to the 3600. Maybe throw a 5600 on an older board into the mix as well to see how much one really saves.

    • @fleurdewin7958
      @fleurdewin7958 Před rokem +3

      Nah, I think a used R5 3600 makes much more sense than the R5 5500. Generally speaking, CPUs last much longer than motherboards: if your B350 died, you could upgrade straight to B550, but if you are stuck with the PCIe 3.0 of the R5 5500, you can't fully make use of the newer B550 motherboard, which supports PCIe 4.0. The only good thing about the R5 5500 is the monolithic die, because it originates from an APU, so you can overclock the Infinity Fabric much higher compared to chiplet-based AMD CPUs. Those APUs have a better memory controller than the chiplet-based CPUs.

  • @curtishoke
    @curtishoke Před rokem +1

    It's refreshing to see a candid explanation of what happened with the other video, and community comments taken into serious consideration.

  • @Jessie-bl3rm
    @Jessie-bl3rm Před rokem +3

    thanks steve

  • @Stickyxo
    @Stickyxo Před rokem

    I lost it when I heard the Dre reference, keep up the great videos GN!

  • @SpookySkeletonGang
    @SpookySkeletonGang Před rokem +4

    Good vid! I just wanna say, I think the primary concern (at least for me) around the whole 5600x/5600 thing is that the 5600 is almost always better value (cheaper for nearly the same performance), so it makes sense to use the better value chip as the baseline instead of the worse value chip. And just for an example why, I was helping a friend upgrade from a 3600 to a 5600 and I sent him one of the gamersnexus charts and he was confused about the 5600x being there and asked me questions about it, are they the same, should he spend more for the x, and I had to walk him through it. It's minor issue but I'd prefer the better value chips being on the chart if we get more these x and non-x chip situations. Like, I'd rather the 7600 be on these charts over the 7600x, for example.

    • @vinylSummer
      @vinylSummer Před rokem

      One of the options is to just add a non-x counterpart even if it wasn't actually tested with 2-3% less performance. Of course, as long as the X version is better only by 2-3%

    • @HappyBeezerStudios
      @HappyBeezerStudios Před rokem

      Reminds me of the difference between 5700X and 5800X, the latter clocks higher but also has a higher power target. How fast would a 5800X be under 95W and how fast would a 5700X be under 125W....

  • @JosepsGSX
    @JosepsGSX Před rokem

    I feel bad for the people who left an opinion on the pulled video. That's the single issue I find with releasing an improved video. I wish there were an archive of previous versions, so their contributions don't get lost.
    Nothing to add. Another superb review, as always.

  • @StefanEtienneTheVerrgeRep

    The human eye can't even see the first comment.

  • @bubber4444
    @bubber4444 Před rokem +1

    Thank you for teaching me so much about computers. I have not built one yet but I think I could do it. Maybe I will with my next computer. In the mean time I will keep watching your videos and learning from you.

  • @jeremyleemartens801
    @jeremyleemartens801 Před rokem +5

    Tbh the 10100f is also insane value. I think you save 50-100 bucks depending on your config but lose barely any performance

    • @tilapiadave3234
      @tilapiadave3234 Před rokem +1

      The 10100F gets STOMPED on by the 12100, etc.

    • @jeremyleemartens801
      @jeremyleemartens801 Před rokem

      @@tilapiadave3234 Yes, if you buy a 3080; not with a low-end card like a 3050 or 3060.

    • @tilapiadave3234
      @tilapiadave3234 Před rokem +1

      @@jeremyleemartens801 LAUGHING... I always plan at least one, often 2, GPU upgrades ahead.

    • @jeremyleemartens801
      @jeremyleemartens801 Před rokem

      @@tilapiadave3234 well, then you shouldn't consider budget CPUs anyways

    • @tilapiadave3234
      @tilapiadave3234 Před rokem

      @@jeremyleemartens801 You should consider what has caused your stupidity. You're pathetic if you cannot see that spending $15 more INITIALLY, in order to have at least one extra GPU upgrade free from bottlenecking, is a fantastic investment. The 10100 WAS, PAST TENSE, a good value option; it NO LONGER is.

  • @bryanthartwig6994
    @bryanthartwig6994 Před rokem +1

    I am very happy with my Ryzen 5600 and ASRock Steel Legend X570, which I got on sale back in December for $270 for both items. Very low CPU utilization in tasks that maxed out my old i5-4570 computer.

  • @mapesdhs597
    @mapesdhs597 Před rokem +5

    12:15 - That's why HUB includes value and fps in the same chart, where they sum CPU+RAM+mbd, and I think it works very well (except of course they use $ per fps rather than fps per $; same idea really, one can debate which is easier to comprehend). It allows one to see the value difference, but also see where a more expensive option provides much greater performance.
    15:05 - HUB's version of this kind of chart does not leave out which combination has the performance edge, and it really does help that various combinations of mbd and RAM are included too, to show for example DDR4 vs. DDR5 value/speed variance.
    Overall conclusion still correct I'd say, but of course price volatility (and how it varies between regions) can shove all this around like bumper cars. Strange where I am (UK), the 5600 is now largely unavailable, or places that do have it are charging far more than makes sense vs. Intel competition (for gaming that is). By contrast, the 5500 is easy to find but is exactly the same price as the 12100F, quite unusual for UK pricing of this kind. What would be nice of course is a proper 7500 or 7500X, but AMD isn't going to do that, not for some time; they learned from the 3300X that too strong a value part just negates most of the good-margin SKUs.
    The Zen4 stack will only receive entry parts when Intel once again kicks AMD in the shins. AMD waited far too long to release the 5600 and meanwhile released a mess of other older-tech parts with irritating caveats & compromises (PCIe, cache, etc.); wouldn't surprise me if they do the same thing again, leave competitive entry Zen4 so late that by the time it comes out Intel has already moved on.
    Also interesting from the value perspective, the 10100/F is still available to buy and it's 20% cheaper than the 12100F; sure it's slower, but with realistic GPUs and display tech in mind this likely doesn't much matter, most people in this price class are still fine with 60 to 90Hz displays.
    Btw, RandomGamingInHD recently uploaded a video showing the 13100 with the Arc A750 at 1440p, some interesting numbers.

  • @NoizyCr1cket
    @NoizyCr1cket Před rokem +1

    Really impressed with the Dre line. Looking forward to Steve's mixtape.

  • @sultanofsick
    @sultanofsick Před rokem +4

    I ABSOLUTELY look for and buy based on FPS per dollar. You're right, that's not the only thing that matters and can mask other factors, but give your audience some credit. I believe anyone who has their own personal "minimum" metric, or needs production performance also, is smart enough to also watch that part of the review. You can also make a two-level chart like you do for avg, 1% and .1% fps, and show frames per dollar along with avg. I use HW unboxed charts like this a lot.

    • @azethegreat958
      @azethegreat958 Před rokem +1

      Yeah, I don't know why they made such a big deal of this "problem" when it's been solved perfectly by Hardware Unboxed. The only real criticism against performance/$ charts (in my opinion) is that costs can change. But even then, when you display the raw performance, users can easily update for current costs by just dividing.

    • @_sneer_
      @_sneer_ Před rokem +2

      I think FPS per dollar charts are stupid: prices change constantly, they greatly depend on the rest of the system, resolution, etc., and they don't show performance at all. CPU FPS per dollar is more dependent on the GPU than the CPU in most cases. A 13900K might get exactly the same FPS with an RX 6500 XT as a Ryzen 5500 in CP2077. If you add the whole system cost, two systems with a 100% price and FPS difference will have exactly the same FPS/$. Like I said, those charts are stupid, too subjective, and too easy to manipulate to add any real value to the review.

    • @HappyBeezerStudios
      @HappyBeezerStudios Před rokem +2

      Remember that it also depends on the games you want to play with it. For example I play none of the games in the test and don't live in the US, so the numbers in the FPS/$ chart are completely meaningless to me. Neither the FPS part nor the $ part. But what I know is that in my most demanding game the 5800X3D beats the 12900K by about 40%

    • @azethegreat958
      @azethegreat958 Před rokem +1

      @@HappyBeezerStudios So I'm sure you go around saying that all FPS charts are worthless as well because you don't play those games, right? And you do realize that the relative % differences are literally just calculated based off the FPS numbers?

    • @HappyBeezerStudios
      @HappyBeezerStudios Před rokem +1

      @@azethegreat958 All I'm saying is that the chart displays one situation that is limited in time, location and software used.
      A rough first start, but to know what is the optimal choice for a person, they have to do the math themselves.

  • @Pratalax
    @Pratalax Před rokem

    That lil' "forgot about cezanne" bit was gold xD

  • @kaisersolo76
    @kaisersolo76 Před rokem +4

    No stress, just remember to include good coverage of CPUs that are available in and around the same price bracket. Low end is as important as high end, and it's correct to include CPUs in that price range. It's a shocker you didn't include the 4500 that can be picked up for £65, lol. Also, don't always take the new-build approach; remember the number of AM4 boards out there. This will happen with AM5 going forward.

    • @GamersNexus
      @GamersNexus  Před rokem +2

      big no to the 4500, sorry. We already said what we think about that in the original review. We're not going back for it.

    • @kaisersolo76
      @kaisersolo76 Před rokem

      @@GamersNexus 'Twas a joke, hence the lol, Steve!

  • @djbadandy1
    @djbadandy1 Před rokem +1

    I think that you guys are showing what a class act you are here. You have strong opinions, but you do keep yourselves grounded. The first video I saw of GN, I was very critical about in the comments (and I may have been drinking), but since then I have become a big fan of the channel and really appreciate your content. Keep up the awesome work!

    • @jarnovilen5259
      @jarnovilen5259 Před rokem +1

      Well, to be honest, GN has not always been this humble. I have given some criticism too in times gone by, for a reason I think. But they've improved all the time.

  • @parthkachhara9824
    @parthkachhara9824 Před rokem +1

    The "Steve reviews too many things" and Dr Dre reference got me good.😂😂😂

  • @conza1989
    @conza1989 Před rokem

    Great video, as always. Perhaps surprisingly, it reminds me most of when the CLC mounting topic was raised: the second video with that awesome thumbnail *looks it up to confirm wording* "AAAAAH! IT'LL EXPLODE ANY SECOND!", where you addressed a vocal part of the community as a PSA to say 'Bro, chill a bit', which was really good.

  • @rockking1379
    @rockking1379 Před rokem +1

    One case for buying the 5500 over the 12100F is my scenario. I had a couple AM4 boards laying around so I snagged 2 of the 5500’s for kid gaming towers. They are 10 and 6 so absolute maximum performance isn’t needed. And they were much cheaper than the 5600.

    • @HappyBeezerStudios
      @HappyBeezerStudios Před rokem

      Exactly. Half the system was already there. It's also a different question for people who need to buy a board for either platform.

  • @Jsteeeez
    @Jsteeeez Před rokem

    5:08 no snowmobiles, no ski’s, can finally afford to feed my family with groceries.

  • @omaral-maitah181
    @omaral-maitah181 Před rokem +2

    I'm finally gonna build my first PC, coming from a laptop 🤩
    So excited! I will get a 6650 XT with an i3-12100F.

    • @vinylSummer
      @vinylSummer Před rokem +1

      Good luck with your build! 6650xt is a beast for 1080p, nice components choice

  • @sandrahiltz
    @sandrahiltz Před rokem +2

    I really wish the review was done with reasonable GPUs for the segment; on a budget gaming build no one is going to have an RTX 3090 Ti. Now, I understand it's to show which CPU is technically better, but in reality it's going to make people building budget systems think they are going to see the (in some cases large) increase in performance, when with a GPU that would actually be paired with those CPUs the difference is going to be almost nothing.

  • @Ganiscol
    @Ganiscol Před rokem

    Glad you explained why you de-clutter the charts. Nothing worse than having redundant information on a benchmark chart, making it a PITA to read.

  • @AtomikMenace
    @AtomikMenace Před rokem

    That forgot about Dre fade in was perfect ❤️

  • @dirkslipperypeanutdiggler9211

    Thank you guys for all that you do! Great comparison and information. Yall are a class above when it comes to reliable, unbiased and data driven information. I truly appreciate that you are here to provide that. Ignore the jerks/fanboys. It is a testament to the state of our society when you have to make disclaimers about things when common sense should suffice. Cheers.

  • @stephenmeinhold5452
    @stephenmeinhold5452 Před rokem +1

    I have an Intel Core i5-4570 @ 3.20GHz CPU and I use my PC mostly for gaming. So far I have not come across any limitations, even with The Callisto Protocol, a game that does not use all the cores. The thing that holds me back is that I am still on an old-school hard drive, so when a cut scene comes in suddenly or a section loads, it takes a second or two to catch up. It's not a deal breaker, but it does break the immersion, and some games can take a ridiculously long time to load.

  • @kevinwise5128
    @kevinwise5128 Před rokem

    Impressive rapping skills! "It was a little bit low key, was hated on by most reviewers with no IGPs, and no deals and no ECC's, so everyone forgot about Cezanne."

  • @deagt3388
    @deagt3388 Před rokem +1

    Stop explaining, thorough and clear video! ;-)

  • @Akkelos
    @Akkelos Před rokem

    Hey, your transparency and community engagement is great!
    Just some feedback from a semi-regular viewer:
    - FPS/dollar in CPU section isn't important to me personally, I rly appreciate them in GPU reviews though. I still appreciate you taking feedback from the community.
    - The disadvantage of not being able to show absolute performance and fps/dollar simultaneously was solved by other reviewers by showing both on the same graph

  • @TCGView
    @TCGView Před rokem +1

    5:07 "...and mothu-truckas act like they forgot about Steve!"

  • @gameboyterrorysta6307
    @gameboyterrorysta6307 Před rokem +1

    About "fps per dollar" argument...
    I literally do "the unthinkable" when choosing components. I create excel table in which I write down prices and both "avg" and "1% low" fps values from benchmarks for games I play. Then I rule out parts that don't meet desired performance goals. In the end I tend to go for the best value option out of parts within the budget.

    • @HappyBeezerStudios
      @HappyBeezerStudios Před rokem

      hehe, being big brain and doing research about what parts work best for you
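
      The selection method described above (a hard performance floor first, then best value among what clears it) is simple to express. A small sketch with invented entries standing in for the benchmark numbers someone would copy out of reviews:

        # Each entry: (price in USD, average FPS, 1% low FPS) -- invented figures.
        parts = {
            "CPU A": (110, 126, 95),
            "CPU B": (100, 105, 70),
            "CPU C": (140, 140, 108),
        }

        MIN_ONE_PERCENT_LOW = 90  # the personal performance floor

        # Rule out anything under the floor, then rank the rest by FPS per dollar.
        candidates = {
            name: (price, avg, low)
            for name, (price, avg, low) in parts.items()
            if low >= MIN_ONE_PERCENT_LOW
        }
        best = max(candidates, key=lambda n: candidates[n][1] / candidates[n][0])
        print(f"Best value that meets the floor: {best}")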

  • @POkegreis
    @POkegreis Před rokem

    Sometimes, even though I really CHERISH your hard work, you're just background noise to me when I'm writing/reading something. But the "Forgot About Dre" part really got me listening. :D

  • @Br0adCastYourS3lf
    @Br0adCastYourS3lf Před rokem +2

    Could you add "AV1 video decoding performance" of the iGPU in the future?
    Motivation: future proofing a HTPC build.

  • @Hagop64
    @Hagop64 Před rokem +1

    FPS/dollar will always make the high end look bad, sure, but people looking at a per-dollar metric are more than likely more worried about cost/budget. When looking at these budget CPUs, I think showing the bang/buck is very helpful. Looking at just FPS or just FPS/dollar won't give you the complete picture, but seeing both is good.

    • @emlyndewar
      @emlyndewar Před rokem

      People should just watch the reviews of CPUs, and work it out based on what they’ve watched…

  • @BuzzKiller23
    @BuzzKiller23 Před rokem

    I can't see a linode ad without hearing Dawid's voice saying Linoooooooooooooode