NVIDIA RTX 4090 PCIe 3.0 vs. 4.0 x16 & '5.0' Scaling Benchmarks

  • Published 21 Jun 2024
  • Sponsor: Montech Sky One Lite on Amazon geni.us/sux2
    This testing is for PCIe generations on the NVIDIA RTX 4090 GPU. We're benchmarking PCIe 3.0 x16 vs. PCIe 4.0 x16 vs. PCIe "5.0" x16 (the 4090 doesn't support 5.0, so it's just 4.0 -- still worth showing for educational purposes). Remember that PCIe 3.0 x16 is equal in bandwidth to PCIe 4.0 x8, so if you're wondering whether the RTX 4090 behaves differently on PCIe 4.0 x8 vs. x16, these two tests answer that. We're testing on an Intel i7-12700KF, which is Gen5-ready, but again, if the device isn't Gen5, that doesn't matter. If you're wondering about AMD vs. Intel for the RTX 4090, strictly from a PCIe standpoint, it doesn't matter at this point -- they're the same in that regard. Other factors are more important, like CPU performance in the applications you run (see our CPU reviews for that).
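    To make those equivalences concrete, here's a minimal sketch of the theoretical bandwidth math (our illustration, assuming the usual ~1 GB/s per lane per direction for Gen3, doubling with each generation; the helper name is just one we picked):

        # Approximate one-direction PCIe bandwidth per lane, in GB/s.
        PER_LANE_GBPS = {1: 0.25, 2: 0.5, 3: 0.985, 4: 1.969, 5: 3.938}

        def pcie_bandwidth(gen: int, lanes: int) -> float:
            """Theoretical one-direction bandwidth of a PCIe link."""
            return PER_LANE_GBPS[gen] * lanes

        print(pcie_bandwidth(3, 16))  # ~15.8 GB/s
        print(pcie_bandwidth(4, 8))   # ~15.8 GB/s -- same as Gen3 x16
        print(pcie_bandwidth(4, 16))  # ~31.5 GB/s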
    One correction of an off-hand remark at 4 minutes 25 seconds - the RX 6500 XT has 4 PCIe lanes (Gen4), not 8. We forgot how much of a piece of trash that card was, sorry. Our mistake for not double-checking.
    Watch our video about the melting RTX 4090 12VHPWR cables here: • Testing Burning NVIDIA...
    See our RTX 3080 PCIe scaling video here: • NVIDIA RTX 3080 PCIe 3...
    The best way to support our work is through our store: store.gamersnexus.net/
    Like our content? Please consider becoming our Patron to support us: / gamersnexus
    TIMESTAMPS
    00:00 - PCIe Bandwidth Testing
    01:15 - Explaining PCIe Support
    04:25 - Correction: 6500 XT has 4 lanes, not 8
    04:58 - Maximum Theoretical Bandwidth
    05:50 - Benchmarking PCIe Bandwidth (Only)
    07:23 - Total Warhammer Gaming Benchmarks (4K)
    08:08 - Major Outlier with Interesting Results
    09:28 - Tomb Raider - 4K & 1440p PCIe Bandwidth Results
    10:05 - FFXIV, Rainbow Six Siege (4K, 1440p, 1080p), & Strange Brigade
    11:14 - Conclusion
    ** Please like, comment, and subscribe for more! **
    Links to Amazon and Newegg are typically monetized on our channel (affiliate links) and may return a commission of sales to us from the retailer. This is unrelated to the product manufacturer. Any advertisements or sponsorships are disclosed within the video ("this video is brought to you by") and above the fold in the description. We do not ever produce paid content or "sponsored content" (meaning that the content is our idea and is not funded externally aside from whatever ad placement is in the beginning) and we do not ever charge manufacturers for coverage.
    Follow us in these locations for more gaming and hardware updates:
    t: / gamersnexus
    f: / gamersnexus
    w: www.gamersnexus.net/
    Host, Testing: Steve Burke
    Video Editing: Mike Gaglione, Andrew Coleman
  • Games

Comments • 1.3K

  • @GamersNexus
    @GamersNexus  1 year ago +757

    One correction of an off-hand remark at 4:25 - the RX 6500 XT has 4 PCIe lanes (Gen4), not 8. We forgot how much of a piece of trash that card was, sorry. Our mistake for not double-checking.
    Watch our video about the melting RTX 4090 12VHPWR cables here: czcams.com/video/EIKjZ1djp8c/video.html
    The best way to support our work is through our store: store.gamersnexus.net/
    Like our content? Please consider becoming our Patron to support us: www.patreon.com/gamersnexus

    • @legostarwarsrulez
      @legostarwarsrulez 1 year ago +6

      If it had 16 lanes I would’ve bought a couple and used them in some low end budget builds for friends.

    • @wnxdafriz
      @wnxdafriz 1 year ago +9

      @@legostarwarsrulez It doesn't have hardware decoding support... or was it encoding? Or was it both? Sigh, it was so bad I can't even remember how bad it was, or if I'm making it even worse (if that's possible).

    • @Flooberjobby
      @Flooberjobby 1 year ago +3

      I know this sounds like it'd be trivial and pointless, but does PCIe performance differ by platform, such as AMD vs. Intel? I know it seems like a no-brainer that they shouldn't differ, but you never know unless it's tested.

    • @markjacobs1086
      @markjacobs1086 1 year ago +5

      @@wnxdafriz It can do hardware decoding, but not encoding, so things like ReLive don't actually work (even though the "feature" was still selectable in the software, at least when it released)...

    • @Ketobbey
      @Ketobbey 1 year ago +1

      THANK YOU! I was waiting for this test to be done by you guys! Thank you, Thank you!

  • @user-tn6tb2qh8d
    @user-tn6tb2qh8d 1 year ago +895

    4:24 As a proud 6500 XT owner I must point out that it has 4 PCIe lanes, not 8.

    • @GamersNexus
      @GamersNexus  1 year ago +425

      Thanks for that. That's an error, yes.

    • @budgetking2591
      @budgetking2591 1 year ago +83

      Exactly, that's its biggest flaw. It would have been a decent GPU if it had 8 lanes, so there would be no bottleneck for people on PCIe 3.0.

    • @Lord_Reset
      @Lord_Reset 1 year ago +154

      It's like a raccoon defending its pile of garbage /s

    • @jaronmarles941
      @jaronmarles941 1 year ago +19

      @@budgetking2591 The 5500 XT had 8 lanes and still got bottlenecked by its 4 gigs, on double the bus width. That card was doomed from the start, and 4 more lanes wouldn't have saved it.

    • @youkofoxy
      @youkofoxy 1 year ago +14

      As an owner of an RX 6600 (because I was really unhappy with buying an RX 6500 XT or RX 6400), I can confirm: it's 4 lanes of PCIe Gen4, with a 64-bit memory bus.

  • @hquest
    @hquest 1 year ago +1329

    Bottom line: Vaseline does not make bits flow smoother, and PCIe 4.0 is more than adequate for today’s applications.

    • @ChrisM541
      @ChrisM541 1 year ago +143

      So is PCIe 3 !!!!
      - You'll see, for the vast majority of games, little/no difference from PCIe 3 all the way up to PCIe 5

    • @Senarak
      @Senarak 1 year ago +69

      @@ChrisM541 Very true, I am honestly surprised at how well Gen3 keeps up with high framerates at 4K. It just goes to show how little bandwidth graphics cards need

    • @IRefuseToUseThisStupidFeature
      @IRefuseToUseThisStupidFeature 1 year ago +24

      I am staggered that something claiming twice the speed (3.0 to 4.0) makes almost no difference, even on the 4090.
      I feel like current hardware tests don't accurately show performance scaling (and each GPU generation seems better than it is).

    • @GamesFromSpace
      @GamesFromSpace 1 year ago +60

      Good devs minimize the amount of data actually transferring to the video card, because we're not idiots. It's *always* a bottleneck; making it less bad doesn't make it good.

    • @TheSkiddywinks
      @TheSkiddywinks 1 year ago +23

      > "...PCIe 4.0 is more than adequate for today’s applications."
      Relative to PCIe 3.0 for sure. But since the card does not support any higher (i.e. PCIe 5.0) it is impossible to say it wouldn't benefit. For sure I would argue it likely doesn't, but in the interests of accuracy, I thought it worth mentioning that no conclusion can be drawn there.

  • @krakentortoise7531
    @krakentortoise7531 1 year ago +74

    Wow this came at a good time! I was just searching about the effects between these. Thanks Steve!

    • @TwistedEyes12
      @TwistedEyes12 1 year ago +3

      @Gamers Nexus shorts 🅥 This spammer is actually the FieRcE channel on YouTube, going by "thanks" or Gamers Nexus shorts. Don't click it unless you want to give them a free view; a very sad way to try to do this. I wouldn't have even cared if you just didn't try to copy and pretend to be Gamers Nexus, including using their logo to trick people.

    • @IRefuseToUseThisStupidFeature
      @IRefuseToUseThisStupidFeature 1 year ago +1

      FFS spam

  • @juansarmiento6754
    @juansarmiento6754 1 year ago +199

    This was really useful for the guys on AM4 with older motherboards who upgraded to Ryzen 5000. Making a newer GPU purchase will still extend the life of these systems even more.

    • @christophermullins7163
      @christophermullins7163 1 year ago +12

      I just bought a B450 and 5600 6 months ago. It's plenty for most any GPU in 2022.

    • @mastersmurfify
      @mastersmurfify 1 year ago +13

      I'm in that boat - just popped in a 5600X and now waiting for a 4070 or RDNA3 GPU.

    • @TheNerd
      @TheNerd 1 year ago

      Wait, isn't the minimum spec for Ryzen 5000 an X470 chipset? X370 seems a little old, since those boards also weren't made for the amount of watts that a 5900X or 5950X can pull.

    • @anon7631
      @anon7631 1 year ago +10

      @@TheNerd Some X370 boards did get BIOS updates to support 5000 series. In fact, even as low as A320 have some support, but I definitely wouldn't run an R9 on those.

    • @juansarmiento6754
      @juansarmiento6754 1 year ago +17

      @@TheNerd I have an Aorus x370 with a 5800X3D. Most major motherboard manufacturers updated bios to support 5000.

  • @Velocirabbit53
    @Velocirabbit53 1 year ago +20

    This is incredibly useful information to determine whether or not a platform upgrade is necessary if I want a new GPU. The 9900 is still running like a charm and it looks like it will continue to do so. Thanks y’all!

  • @bencunningham2611
    @bencunningham2611 1 year ago +31

    Thank you so much for this testing. I am someone who has a 4090 FE in a PCIe 3.0 slot, mostly as I was GPU-bound running 2K ultrawide. Now I can get my monitor's 175 Hz again, even with DLDSR thrown into the mix, and my 9900K at 5.2 GHz can get another generation's worth of use before I upgrade it. :)

    • @T.K.Wellington1996
      @T.K.Wellington1996 1 year ago +7

      I have a 4090 and a 5800X3D on an X370 AM4 board with PCIe 3.0, and 16GB of DDR4-3200 RAM. I could not care less about PCIe 5.0 and DDR5. It does not matter.

    • @De-M-oN
      @De-M-oN 4 months ago +2

      @@T.K.Wellington1996 It does matter. DDR5 improves performance

    • @T.K.Wellington1996
      @T.K.Wellington1996 4 months ago +2

      @@De-M-oN Therefore I have the 96MB of L3 3D V-Cache on the 5800X3D as compensation.

  • @masterww8
    @masterww8 1 year ago +11

    Thank you for doing this. It really helps make better informed decisions!

  • @CheesusSweet
    @CheesusSweet 1 year ago +3

    I'm fairly new to your vids but I love how thorough and articulate you are. Thanks for all the work you guys do!

  • @Devilman666
    @Devilman666 1 year ago +500

    I think the benefit of PCIe Gen5 is more about being able to use fewer lanes for equal bandwidth. GPUs that use x16 slots shouldn't see a real difference.

    • @GamersNexus
      @GamersNexus  1 year ago +177

      Agreed

    • @ffwast
      @ffwast 1 year ago +38

      The question is "why build a GPU that doesn't use all 16 lanes to begin with?" What's the point? Hobbling the performance with older boards?

    • @reaperreaper5098
      @reaperreaper5098 1 year ago +34

      That was true of PCIe 3.0 when it first came out, and is still true of 4.0.
      But mainboard manufacturers have their heads so far up their asses that they refuse to offer anything beyond a pathetic x8/x8/x4 allocation, and even that's not overly common and often requires forgoing some other feature or port (or set of ports) when it's not really necessary.

    • @dominikobora5385
      @dominikobora5385 1 year ago +11

      @@ffwast Cost, plus some things such as NVMe SSDs take up PCIe lanes, so if the GPU has fewer lanes it has the same bandwidth and other devices can use those lanes.

    • @budgetking2591
      @budgetking2591 1 year ago +7

      @@ffwast Because there are many GPUs that don't need more than 8 lanes on PCIe 4.0. Also, that way you don't lose performance when you use 2 NVMe drives, because only 8 PCIe lanes will be left for the GPU on a lot of boards. I'm glad my RX 6600 XT only needs 8 lanes for its max potential, or else I would have lost even more performance; 8 PCIe lanes is all that's left for the GPU on B450.

  • @KyleKunze
    @KyleKunze 1 year ago +7

    Thank you. This helps me better plan to use some older hardware more effectively and for longer. Appreciate the straightforward and informative piece.

  • @alsiniz
    @alsiniz 26 days ago +1

    Thank you Steve and GN team! I just remembered that my b350 motherboard from 2017 is still on PCIe 3.0 with a 4070 Super on the way. This video alleviated my anxiety around ordering a GPU a bit too early.

  • @Netsuko
    @Netsuko 1 year ago +350

    I literally just asked myself if my PCIe 3.0 board would be a bottleneck since I also have two NVMe SSDs. And here's GN's video. Perfect timing, as always!

    • @PolskiJaszczomb
      @PolskiJaszczomb 1 year ago +27

      Well, it's not the bandwidth that's gonna be a bottleneck :D

    • @ffwast
      @ffwast 1 year ago +12

      If direct storage starts making an actual difference it might come up.

    • @GamersNexus
      @GamersNexus  1 year ago +176

      The CPU on the board might be a limit, though. Keep in mind also that lane count may get thin with multiple devices, depending on how the board assigns lanes and on which CPU you have.

    • @smiIingman
      @smiIingman 1 year ago +11

      For real, I've been wondering as well.
      I have a 10700K system, so PCIe 3.0, with both NVMe SSD slots occupied, AND I got a 4090 recently. I can definitely feel the massive bottleneck the 10700K puts on the 4090.
      Planning on upgrading when Black Friday hits, though.

    • @dcan4
      @dcan4 1 year ago +14

      If you're on an X470 or B450 with a Ryzen 5000 processor, you'll be fine.

  • @WolvenSwiftwind
    @WolvenSwiftwind 1 year ago +3

    I'm so happy you made this. I am temporarily using my 4090 Suprim on a PCIe 3.0 board.

  • @emike09
    @emike09 1 year ago

    I was super curious about this, thanks for covering it!

  • @celcius_87
    @celcius_87 1 year ago

    Thank you for testing this because I’ve been wondering about this

  • @Isamu27298
    @Isamu27298 1 year ago +11

    You guys keep on pumping out content. Really grateful for everything you do. So much to learn from your videos!

  • @rbtoj
    @rbtoj 1 year ago +36

    6:18 In my experience, playing GTA V with the grass setting at the absolute maximum creates a lot of PCIe traffic, especially in the area near the "Vinewood" hills. Back when PCIe 3.0 was still recent-ish, I remember taking a huge hit in performance when using PCIe 2.0 around this area.
    If I'm not mistaken, HWMonitor has a sensor for measuring PCIe traffic; I believe it's called GPU IO or something.
    Any other game or game area with a high number of draw calls should spike PCIe bandwidth usage too.
    EDIT: From memory, 3DMark used to have a draw call benchmark and it showed very clear performance scaling between the PCIe generations.
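    For anyone wanting to watch this live, a minimal sketch of sampling those PCIe counters on an NVIDIA card via NVML (assumes the nvidia-ml-py package; GPU-Z and HWiNFO expose similar sensors):

        # Sample GPU PCIe traffic once per second via NVML.
        # nvmlDeviceGetPcieThroughput reports KB/s over a ~20 ms window.
        import time
        import pynvml

        pynvml.nvmlInit()
        gpu = pynvml.nvmlDeviceGetHandleByIndex(0)
        for _ in range(10):
            tx = pynvml.nvmlDeviceGetPcieThroughput(gpu, pynvml.NVML_PCIE_UTIL_TX_BYTES)
            rx = pynvml.nvmlDeviceGetPcieThroughput(gpu, pynvml.NVML_PCIE_UTIL_RX_BYTES)
            print(f"TX {tx / 1024:.1f} MB/s  RX {rx / 1024:.1f} MB/s")
            time.sleep(1)
        pynvml.nvmlShutdown()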

    • @Focus_Fearless
      @Focus_Fearless 1 year ago +4

      There is another person in the comments who experienced similar results with grass density, except in a different game.

    • @beninua
      @beninua 1 year ago

      Maybe I'm asking out of ignorance, but why DX11 and not DX12 or Vulkan? I run all games that can on Vulkan, or at least DX12, on my rig (ASUS Z790-E, Intel Core i9-13900K, 128GB DDR5-6000 RAM, NVIDIA RTX 4090 24GB), and I can run all games with all settings at ULTRA with FPS never dropping below 100 on my 49" Samsung G9 NEO SuperUltraWide 5120x1440px (aka 32:9) G-Sync (240 Hz) monitor. I love your mentioning of the final death of SLI. Man, the headaches I had finding motherboards with enough room for 3 NVIDIA GTX 1080s in triple SLI, or just running two 2080 Ti OCs in SLI on a motherboard with two genuine PCIe x16 slots, not to mention the extreme cooling needed with water pipes running everywhere. It was a pest! I love plugging in just one mf.....ing brick of a 440mm-long RTX 4090 and getting ten times the performance. And literally no noise at all from any fan.
      I do have one question though. I have 8 SSDs in my rig. Four of them are normal SATA 8TB Samsung 870 QVOs, but the other four are identical Corsair MP600 PRO NH PCIe 4.0 NVMe M.2 8TB drives. Should it be a problem that I've used all four NVMe slots on the motherboard when they are all Gen4-capable?
      I actually thought the RTX 4090 was the first card to take advantage of PCIe Gen5, so I set the PCIe x16 slot to Gen5 in BIOS. Could that be the reason I constantly get a startup BIOS error saying that the x16 slot has downgraded to PCIe Gen3 (which is the default if I load BIOS defaults) and that I should press F1 if I want to change BIOS settings manually? Should I simply set my PCIe x16 slot to Gen4 in BIOS to avoid the fallback to Gen3?

  • @Kiyuja
    @Kiyuja 1 year ago +2

    oh wow I thought this would have mattered more, especially in the times of ReBar and whatnot. Thanks for testing it!

  • @madarasoun3018
    @madarasoun3018 1 year ago

    Thank you for this video. I have been waiting for it.

  • @legendaryx590motherboard6
    @legendaryx590motherboard6 1 year ago +199

    When things get extreme, Vaseline.

  • @MagnumMatt09
    @MagnumMatt09 1 year ago +6

    Dang, that PCIe test footage nukes YouTube's bandwidth.

  • @HYBECTIVE
    @HYBECTIVE 1 year ago

    Was looking for this exact video! Thank you :D

  • @EgaoKage
    @EgaoKage 9 months ago +1

    You guys rock! This was near enough exactly the info I was looking for. I have an older PC which I use as a secondary machine (formerly my main PC) that's all PCIe Gen 3.0, which currently has a pair (SLI) of GeForce 950s in it. I don't remember why I went with that build specifically, but I remember there were reasons. Anyway, now that prices on last-gen GPUs have come down a bit, I've been thinking about giving it an upgrade. The options I've been considering are between a 2080 Ti (Gen 3.0) and a 6900 XT (Gen 4.0). But I wasn't sure how much the PCIe Gen of the MB would negate any advantages the 6900 XT had over the 2080 Ti. Thanks for this info!

  • @anon7631
    @anon7631 1 year ago +26

    One thing I think is interesting about the Warhammer 1080p results is that the lows are basically identical despite the average being lower.

    • @jacobparis6576
      @jacobparis6576 1 year ago +1

      It certainly makes it look like a driver or software issue, since that's the behaviour I'd expect from a frame limit being applied.

    • @6St6Jimmy6
      @6St6Jimmy6 1 year ago +2

      @@jacobparis6576 That's how a PCIe bottleneck can also start to show. Lows don't get affected too much; highs get much lower, so the average drops. It's more about when in the benchmark the actual PCIe bottleneck happens.

  • @marekkovac7058
    @marekkovac7058 1 year ago +7

    This is good news for 2x 4090 in a Ryzen 7000-series motherboard that supports PCIe 5.0 x8/x8. I was hoping you'd test some rendering/simulations. Thanks for the gaming benchmarks; they tell a lot too! :)

    • @rdiricco
      @rdiricco 1 year ago

      Are you sure that's what you are going to get? I have a 12900K in an Asus motherboard, and even though I can do Gen5 @ x8, the card only supports Gen4 @ x8. So if you cut the lanes to x8 (I'm using the other x8 for M.2 drives), you only get the Gen4 speeds, so only 16 GB/s, not 32 GB/s.
      If you're seeing something else, let me know; I'd be interested in how you got there.

    • @mikeramos91
      @mikeramos91 1 year ago

      @@rdiricco I recently upgraded & see my card running at 4.0x8. It’s technically the same as 3.0x16. Will there be a difference?

  • @t0mn8r35
    @t0mn8r35 1 year ago

    Very interesting test and very interesting test results. Well done as always.

  • @TybJim
    @TybJim 1 year ago +1

    Thanks for making this video. I had a suspicion that PCI-Express 3.0 would be enough for modern cards and your video and my other research leads me to believe it will be fine. I have an older PCI-Express motherboard (Asus Rampage V Extreme) in my PC that I built back in 2015. After recently upgrading the CPU to a second-hand 6950X, adding more RAM and M2 drive I'm now looking at new graphics cards. I've been out of the loop for a while, so your comment about SLI being dead was helpful too.

  • @191desperado
    @191desperado 1 year ago +3

    THIS is what I’ve been waiting for! THANKS!

  • @largote
    @largote 1 year ago +37

    PCIe 2.0 (or 3.0 x8) tests would also be interesting here since a lot of motherboards go down to 8x when multiple M.2 drives are installed. (X470, in particular, does this and plenty of people slotted 5800X3D chips onto those and X370 boards).

    • @MegaMoonse
      @MegaMoonse 1 year ago +5

      Watched the video for this answer. Steve, if you are reading please do one test.

    • @kaanozkuscu5079
      @kaanozkuscu5079 1 year ago +3

      @@MegaMoonse People who get 4090s have the money to buy a proper pro mainboard.
      Why care for "benchmarks" if you won't ever have the hardware?

    • @MegaMoonse
      @MegaMoonse 1 year ago +8

      @@kaanozkuscu5079 The main question is how much PCIe speed matters. The GPU is not that important; it could be a 4090 or a future 4050. If Gen3 x8 is enough, or the performance hit is relatively small, I would certainly keep my motherboard.

    • @cristiansaucedo7893
      @cristiansaucedo7893 1 year ago +1

      @@MegaMoonse They've already done a video on this topic, testing a 3080 on PCIe Gen 2, 3, and 4. Short answer: yes, you lose a decent chunk of performance on Gen3 x8 (equivalent to Gen2 x16). If you've got a 30-series it's worth it to give it the most possible bandwidth.

    • @mytech6779
      @mytech6779 6 months ago

      Gen 2.0 chipsets provided many more lanes. My AM3+ has 40+4 PCIe lanes, and the board provides x16/x16, or x8/x8/x16, or four x8 slots plus two x4. (They are all from the chipset; that generation of CPU did not manage the PCIe on-die.)
      Storage drives and all onboard I/O were handled by the southbridge (which is connected with the "+4" lanes).
      There was also a critical change in the protocol between v2 and v3, and there is another major change between v5 and v6, because with 5 and 6 they are getting into frequencies that cause major circuit-engineering problems and high error rates in transmission.
      The original proposed v5 standard was 1.3x v4, not double v4, because there was no proof that the full speed increase could be feasibly obtained in consumer commodity products.

  • @cars291
    @cars291 1 year ago

    Awesome and thorough testing 👏

  • @riba2233
    @riba2233 1 year ago

    Was waiting for this, thanks!

  • @xHighPotencyx
    @xHighPotencyx 1 year ago +5

    I feel like this is as good a place as any to ask: on motherboards where the PCIe 5.0 x16 slot goes to x8 with the PCIe 5.0 M.2 occupied, do you only get an effective 8 lanes of PCIe 4.0 with a 4.0 device? That is to say, are you limited by both interface compatibility and lane bifurcation simultaneously?
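    For what it's worth: a PCIe link trains to the lower of the two ends' generations and the narrower of the two widths, and both limits apply at once, so that case lands at Gen4 x8. A toy model of the negotiation (helper names are ours):

        # PCIe links negotiate the lowest common generation and narrowest width.
        PER_LANE_GBPS = {3: 0.985, 4: 1.969, 5: 3.938}  # GB/s per lane, one direction

        def negotiated_link(slot_gen, slot_lanes, dev_gen, dev_lanes):
            gen = min(slot_gen, dev_gen)
            lanes = min(slot_lanes, dev_lanes)
            return gen, lanes, PER_LANE_GBPS[gen] * lanes

        # Gen5 slot bifurcated to x8, holding a Gen4 x16 card:
        print(negotiated_link(5, 8, 4, 16))  # (4, 8, ~15.8 GB/s)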

  • @AndroidPoetry
    @AndroidPoetry 1 year ago +10

    Thank you for including Warhammer 3; lots of places don't do RTS/4X titles, which are one of the main reasons to do PC gaming.

  • @studlyruddly16
    @studlyruddly16 1 year ago +1

    This test 100% eased my concern about buying a PCIe Gen4 board for my new R7 7700X PC.

  • @Siegdrifa
    @Siegdrifa 1 year ago

    Exactly what I was looking for! Thanks!

  • @wedgoku
    @wedgoku 1 year ago +7

    Thank you for testing. I game at 4K resolution and I'm currently using an Intel Core i5-10600K, which is limited to PCIe 3.0 speeds; for gaming it looks like there is no reason for me to upgrade my CPU then.

  • @EVPointMaster
    @EVPointMaster 1 year ago +34

    I noticed that the game "Flower" is actually extremely bandwidth sensitive, and also that the grass density setting impacts this a lot.
    I was curious so I tested with different pcie bandwidths. I used an RTX 3080.
    The frame rate scales almost linearly with the bandwidth!
    1.0 = 36.4fps
    2.0 = 69.9fps
    3.0 = 127.7fps
    4.0 = 222.4fps
    It's possible that the bottleneck shifts at 5.0, but even then it could be interesting to test for cards that are only x8.
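    Taking those numbers at face value, the gain per bandwidth doubling is close to linear but tapers slightly, which a quick check shows:

        # FPS per PCIe generation as reported above; ratio per bandwidth doubling.
        fps = {1: 36.4, 2: 69.9, 3: 127.7, 4: 222.4}
        for gen in (2, 3, 4):
            print(f"{gen - 1}.0 -> {gen}.0: {fps[gen] / fps[gen - 1]:.2f}x")
        # 1.92x, 1.83x, 1.74x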

    • @adjoho1
      @adjoho1 1 year ago +2

      Ffs more spam.

    • @EVPointMaster
      @EVPointMaster 1 year ago +3

      @@adjoho1 I always report these comments, but I'm not sure if YouTube actually does anything about it.

    • @GraveUypo
      @GraveUypo 10 months ago

      @@EVPointMaster they don't

  • @bleedinggumsmurphy537
    @bleedinggumsmurphy537 1 year ago +2

    I was just debating upgrading from Gen3 to Gen4/5 with a full mobo and CPU upgrade. Thanks for saving me some money for the time being.

  • @erfa10
    @erfa10 1 year ago +2

    Thanks for the continued barrage of extensive testing GN!
    I'm curious to know whether DLSS 3 with its higher framerates might be more impacted by the bandwidth... likely not considering that the CPU is only required for every other frame but might be interesting to confirm as upscaling becomes more common.

    • @sharathvasudev
      @sharathvasudev 1 year ago +1

      It shouldn't. The bandwidth is mostly needed to communicate with the CPU, and DLSS 3 frame generation happens entirely on the GPU; it's in fact CPU-independent. Higher VRAM speeds could help with accessing the on-GPU data from previous frames.

  • @CyberJedi1
    @CyberJedi1 1 year ago +232

    I wonder if Resizable Bar is making a bigger difference now with the 4090 than with 3000 series. Could be an interesting video Steve.

    • @GamersNexus
      @GamersNexus  1 year ago +225

      Banned the spam account. As for ReBAR - good question. We test with ReBAR on, so it might.

    • @TwistedEyes12
      @TwistedEyes12 1 year ago +19

      @@GamersNexus Glad you caught them quick, appreciate everything you do! Cheers

    • @ExoKiller4
      @ExoKiller4 1 year ago +5

      Strikes me as a good question relevant to this one: if you're on an older platform limited to PCIe 3, there's a good chance you won't have ReBAR either.

    • @Holyspiritrecieved
      @Holyspiritrecieved 1 year ago

      Nvidia's a cinderblock GPU... runs at much higher temps at the cost of longevity.

    • @ScottGrammer
      @ScottGrammer 1 year ago +2

      @@GamersNexus Do you suppose that NVIDIA will eventually release a driver that turns on Resizable BAR on older GPUs, like my 2080 Ti? I am willing to bet that the hardware supports it; it's just turned off.

  • @kirby0louise
    @kirby0louise 1 year ago +3

    Hybrid graphics is definitely something you could explore as a PCIe heavy workload. Even on desktop it's somewhat relevant as decoding videos on your iGPU is more power efficient than the dGPU + less risk of getting into a memory bandwidth fight with a game

    • @niks660097
      @niks660097 7 months ago

      x16 PCIe 5.0 has more bandwidth than most dual-channel DDR4/LPDDR4 or single-channel LPDDR5 laptops or iGPU desktops. Nowadays PCIe bandwidth is never the bottleneck, with things like CXL; in encoding/decoding your SSD will always be the bottleneck, not PCIe bandwidth. And dGPU encoders and decoders like NVIDIA's NVENC can do multiple 4K60 decode/encode streams, even in AV1, so your comment is not valid in 2023, maybe 2015? Nowadays AMD and NVIDIA are way ahead of Intel "Quick Sync".

  • @techclub8528
    @techclub8528 6 months ago

    Thanks for the video. I'm upgrading my board this month to a Gen5 AM5 one. My 4090 is currently using a modified vertical GPU riser that's Gen3; I can't really swap this out, as I would need to redo all my hard tube runs. Glad to know there won't really be a difference and I can use the same riser.

  • @corr2143
    @corr2143 1 year ago

    I love your videos. I'm just a little confused that all this work was done to show the performance of 3 games; it would be nice to see some encoding software and rendering (DaVinci Resolve and Blender) as well. Nonetheless, I appreciate the free resources.
    This channel taught a valuable thing, like many others: benchmark and research your use case online before purchases. No more buyer's remorse and less e-waste. Thanks for the work over the years.

  • @BlueBoy0
    @BlueBoy0 1 year ago +3

    Really glad you did this one. There's a bug in Asus 500-series motherboards with 11th-gen CPUs and the 4090 where it only runs the card at PCIe 3.0 (despite running the 3090 at 4.0). I've been wondering how much performance I'm losing.

    • @NaanStop96
      @NaanStop96 1 year ago +1

      I'm having a similar issue with a B550-I ITX mobo from Asus. I'm using a 5800X3D, and with Gen4 I can't get a display; only when it's set to Gen3.

    • @BlueBoy0
      @BlueBoy0 1 year ago +1

      @@NaanStop96 Definitely a BIOS issue. I hope Asus still updates these motherboards...

    • @NaanStop96
      @NaanStop96 1 year ago +1

      @@BlueBoy0 Likewise. No other issues, however, other than Gen4 causing no display, so that leads me to believe everything else is working correctly and not defective. For now, I'm glad the performance difference is negligible.

  • @aidanpjpost
    @aidanpjpost 1 year ago +26

    Far out Steve, you and the team have been spitting out great in-depth pieces CONSTANTLY for over a month now. Please take a break for your own sanity!

  • @Stop_Pre-Ordering_Video_Games

    Sincerely, thank you so much for this. I'm currently building an exploded-view wall-mounted build and I need a 60 cm PCIe riser cable, which I can only get in PCIe Gen3, so it's really good to know that on my 3090 it will be more than enough.

  • @dennisnicholson2466
    @dennisnicholson2466 1 year ago

    Also perfect timing, as I was contemplating whether or not I had a bandwidth issue, since I have PCIe 3.0: I pulled my two 1080 Founders cards in SLI and replaced them with a 6800 XT on a 5800X3D. At first the results were atrociously bad, until I discovered specific toggles in the GPU software and drivers. Not too terribly worried about upgrading the motherboard now, thank you.

  • @Pyrocumulous
    @Pyrocumulous 1 year ago +3

    Would be interesting to see eGPU with Thunderbolt / USB4 in the comparison

  • @edwardallenthree
    @edwardallenthree 1 year ago +17

    Always interesting. I still run my GPU, a 3070, at 3.0 x8 because I use the other x8 for an InfiniBand card. The performance hit is about 5%.

    • @user-tn6tb2qh8d
      @user-tn6tb2qh8d 1 year ago +1

      Very interested in why you need an InfiniBand card. Isn't that a server thing?

    • @edwardallenthree
      @edwardallenthree 1 year ago +10

      @@user-tn6tb2qh8d It is indeed. While the Windows drivers require a special version of Windows Pro and Windows Server, the Linux drivers are free and old InfiniBand gear is dirt cheap. I use it as the backbone for my NAS. This leaves me in the ridiculous position of running my Windows machine in a virtual machine, where it's basically the only thing on the device, so that I can use Linux drivers for my storage. Realistically the speeds are about equivalent to what you get with 10-gigabit Ethernet, but with far lower CPU overhead if you do it right. And when you can get an 8-port switch, nominally 40 Gb/s per port, for $89? It's a great way to get performance cheap.

    • @budgetking2591
      @budgetking2591 1 year ago +1

      I also have to run my GPU at x8 PCIe, because both NVMe slots are occupied, so I'm actually glad the RX 6600 XT only uses 8 lanes.

    • @marekkovac7058
      @marekkovac7058 1 year ago

      @@edwardallenthree Interesting... what kind of file system are you using? Is the CPU offload due to RDMA?

    • @edwardallenthree
      @edwardallenthree 1 year ago

      @@marekkovac7058 ZFS on the server, but NFS over RDMA to share it. Network performance exceeds performance of the array, significantly, even with NVME caching.

  • @declangallagher1448
    @declangallagher1448 1 year ago

    THIS IS THE EXACT VIDEO I NEEDED THANK YOU

  • @MrPontus
    @MrPontus 1 year ago +1

    Thank you for this video. I have a 10980XE and am not planning to change platform for a few years, so I will run my next graphics card at Gen3 x16.

  • @scarletspidernz
    @scarletspidernz 1 year ago +3

    I want PCIe Gen6 motherboards to change the GPU slots to using/reserving x8 instead of x16 (maybe leaving x16 on the extreme high end only, for the professionals and whatever cards need it) and put those extra 8 lanes into more discrete I/O rather than "shared with" arrangements.

  • @seanpasquarella4149
    @seanpasquarella4149 1 year ago +3

    So would it be preferable to plot an upgrade path through the 40 series if I'm going to stay at 4th-gen PCIe for a little while? Or would it be smarter to look into the high-end 30 series?

  • @Xeonzs
    @Xeonzs 1 year ago

    Very useful video. I was going to run into the issue today when I considered pairing my 4090 with an Aorus 8TB SSD AIC. After this video and some other research, I found running at x8/x8 wouldn't be an issue for the 4090, but the drives would be an issue, because 1-2 of the NVMe SSDs on the AIC wouldn't be detected, while other models would run at lower speeds due to the bottlenecked bandwidth.
    Decided not to go with the AIC and instead just re-use my current NVMe SSDs and plop them onto the motherboard directly.

  • @maxabillon
    @maxabillon 1 year ago +1

    Nice one Steve 👍 I have a Gigabyte Z690I ITX Lite motherboard on order, and it only supports up to a PCIe 3.0 x16 slot.

  • @dgraves14
    @dgraves14 1 year ago +8

    PCIe 3.0 x8 would have been interesting to see. While most people buying 4090s are unlikely to still be using PCIe 3.0 motherboards, it'd be interesting and useful to know just in case.

    • @Mom19
      @Mom19 1 year ago +1

      Well, it's the same as Gen2 x16, so if someone tests that scenario at some point, you can go off that.

    • @12Burton24
      @12Burton24 1 year ago

      PCIe 3.0 x8 can already be overloaded by, I think it was, the 1080 Ti.

    • @mrsuperselenio5694
      @mrsuperselenio5694 1 year ago +1

      @@12Burton24 Not really; there are tests done that show PCIe 2.0 x16 / 3.0 x8 has a loss of 3 to 10 fps tops, which honestly is within margin of error.

  • @Operational117
    @Operational117 1 year ago +4

    I believe the reason 1080p might benefit from upgrading from PCIe 3.0 to PCIe 4.0 is because 1080p nowadays is associated with higher framerates, and higher framerates means the CPU needs to send more commands to the GPU per second (and perhaps data as well).
    It's just a theory, though.
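    The arithmetic behind that theory: at a fixed link speed, the per-frame transfer budget shrinks as framerate rises, so high-FPS 1080p is where bandwidth pressure would show first (a rough sketch using Gen3 x16's ~15.8 GB/s):

        LINK_GBPS = 15.8  # ~PCIe 3.0 x16, one direction

        for fps in (60, 144, 240):
            per_frame_mb = LINK_GBPS * 1000 / fps
            print(f"{fps} fps -> ~{per_frame_mb:.0f} MB of bus transfer per frame")
        # 60 fps -> ~263 MB, 144 fps -> ~110 MB, 240 fps -> ~66 MB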

  • @mandalorian43
    @mandalorian43 1 year ago +2

    I'm happy to see these results. I'm on a z590 and was going to be pretty bummed if the performance was drastically different.

  • @terakahn
    @terakahn 1 year ago

    This is a really helpful video. I'm planning on upgrading to a 5080 when it releases, and have Gen3 hardware currently. I feel like that will hold the card back, which means a lot more money is going to be required for this upgrade, lol.
    Though I haven't used a card at x16 in years. Capture cards + SSDs kind of force me into x8 unless I get some crazy motherboard/CPU combo with more lanes.

  • @Co-opSource
    @Co-opSource 1 year ago +5

    Please cover VR gaming performance too. 🙏🏽

  • @thrilhouse007
    @thrilhouse007 1 year ago +4

    "Joe Rules, Steve Drools."
    How long has that sign been there in the background?

  • @ChizuruP
    @ChizuruP 1 year ago +1

    On 10th-gen X299 and doing my best to wait for Fishhawk Falls in Q1 to jump up (in the event it IS HEDT). Despite the small gains from 3.0 -> 4.0, it's still exciting to see; I love the way things are accelerating lately.

  • @hamzasamar8823
    @hamzasamar8823 1 year ago +1

    Thank you so much, this is so informative.

  • @croakingembryo
    @croakingembryo 1 year ago +4

    Wouldn't the main difference be in loading the game? Like how fast the memory gets populated?

  • @ZiddyN
    @ZiddyN 1 year ago +6

    So it's about a 1% to 3% difference. Is it worth not slotting an M.2 into the shared slot to keep your 4090 on x16 Gen4 lanes, then?

    • @csguak
      @csguak 1 year ago

      The difference is minimal; you can just use an M.2 on the shared lanes.

  • @Nate-lw2gu
    @Nate-lw2gu 2 months ago

    I appreciate y'all doing this test. I'm still on a ryzen 2700X with pcie3 and will be buying a 4090. Good to know that I'm not gonna be bottlenecked by that and don't need to upgrade the whole PC immediately

  • @lennieb7367
    @lennieb7367 1 year ago +1

    Thank you very much! Thanks to your testing, I bought a PCIe 4.0 motherboard for ~$200 instead of paying almost ~$600 for a PCIe 5.0 motherboard.

  • @magottyk
    @magottyk 1 year ago +3

    PCIe generation is more relevant to boards that can bifurcate the GPU slot.
    If there's not much difference between Gen3 and Gen4 on a 4090 at x16, it would be useful information to have how the card does when limited to x8 lanes in these scaling tests.
    One major use for bifurcated slots is direct-to-CPU M.2 SSD add-in boards, so the x8 bandwidth results would be useful.

    • @TryllHDTv
      @TryllHDTv 1 year ago

      This is exactly my curiosity!!

  • @jgorres
    @jgorres 1 year ago +5

    Since this testing shows that the bandwidth difference between PCIe Gen 3.0 & 4.0 is negligible at x16, I'd be interested in seeing how PCIe 4.0 x8 would do... Will fewer lanes have latency issues, or expose driver problems?

    • @12Burton24
      @12Burton24 1 year ago

      PCIe 4.0 x8 will have little advantage over PCIe 3.0 x16, because CPU lanes and M.2 drive lanes are also PCIe 4.0; the bandwidth itself is the same for 3.0 x16 vs. 4.0 x8.

    • @jgorres
      @jgorres 1 year ago

      @@12Burton24 Yes, I said the bandwidth was essentially the same, but testing would show any latency or driver issues with just 8 lanes, and that's what I'm curious about.

    • @12Burton24
      @12Burton24 1 year ago

      @@jgorres And I said that everything is faster, so even if drivers are the same you should see a difference 😉

    • @jgorres
      @jgorres 1 year ago

      @@12Burton24 ??? I'm not understanding what you're saying.

  • @VGSoniTech
    @VGSoniTech 1 year ago

    Thanks! I have an i9-9900K on Z390 and was really planning to upgrade my GPU to a 4090. This video was really helpful!

  • @budgetking2591
    @budgetking2591 1 year ago +6

    The real problem with PCIe 3.0 occurs when you have a graphics card that only has a x4 or x8 bus, like many AMD GPUs. I have some performance loss with my RX 6600 XT on PCIe 3.0.

  • @jolness1
    @jolness1 1 year ago +10

    I'm surprised that 3.0 still isn't bandwidth-limiting, since this is years into cards with 4.0. Makes me feel better about holding on to my X370 and running a 5800X3D with a 4090.
    I do think this will continue for a while; PCIe 4, 5, etc. are useful in data centers (and empirically, but maybe not practically, for NVMe).

    • @BrickBazooka
      @BrickBazooka 1 year ago +1

      I also have an X370 with a 5800X3D, and a 4090 is coming. Is this really only a 1-2% loss compared to a PCIe Gen4 board?

    • @jolness1
      @jolness1 1 year ago

      @@BrickBazooka GN does really good and thorough testing; I trust it. I didn't want to buy a new board for an EOL socket, and my board has plenty strong a VRM. How are you liking the 4090? I'm very impressed with the FE.

    • @BrickBazooka
      @BrickBazooka 1 year ago

      @@jolness1 Yeah, I'll also keep my board for my 4090 FE. The card is great, but the rest should also be latest-gen... later next year I should upgrade to Raptor Lake or AMD 7000.

    • @kecimalah
      @kecimalah 1 year ago

      I have an X370 with a 5800X3D and 4090 and am going to change the board to a B550 because of 4.0, as it costs nothing compared to a 4090; it is more about the work involved in switching it. I will also upgrade my main SSD to a 2TB 4.0 NVMe (from a 500GB 3.0). I use that PC mainly for VR, and in Flight Simulator 2020 in dense city areas I get quite a lot of stuttering; this upgrade should help with bottlenecks and hopefully solve those stutters. I don't think I will notice improvements much anywhere other than FS2020. My other, mainly racing, sims run fine and are mainly GPU-limited, but a few more FPS could help in some places to be completely without stutters. VR should be about good immersion, and stutters quite ruin it, so this fairly cheap upgrade should make it better; it would be a shame not to use 100% of the potential of that expensive 4090. For non-VR gaming I don't think the upgrade makes sense; I also don't see much benefit using a 4090 for playing games on a 4K 60 Hz LCD, as it looks and runs almost the same as with a 3080.

    • @BrickBazooka
      @BrickBazooka 1 year ago

      @@kecimalah Why B550? Sell the 5800X3D and just upgrade everything. That 1-3% from upgrading your PCIe standard isn't worth your nerves.

  • @jonathonsuggs1782
    @jonathonsuggs1782 1 year ago

    Already knew that from your previous testing of PCIe generations, no? Looking forward to seeing the different PCIe generations and lane allocations stacked up against one another!

  • @Flank.Sinatra.
    @Flank.Sinatra. 1 year ago +1

    When will you guys post the 2022 best cases and coolers? Maybe also best fans, if you have time? Looking forward to it. Love what you do.

  • @cdurkinz
    @cdurkinz 1 year ago +4

    I feel like this will matter more if you have ANY other PCIe card in your system and are running x8. Would love to see those tests.

    • @davidcobra1735
      @davidcobra1735 1 year ago +1

      It doesn't matter. You either have enough lanes total for all the cards or something doesn't work.

    • @ytmaxxammo8591
      @ytmaxxammo8591 10 months ago

      Period….Well said. ✅

  • @HiHi-eq2tn
    @HiHi-eq2tn 1 year ago +5

    Larger margin than I expected; I was expecting a 1-2% difference.

    • @TwistedEyes12
      @TwistedEyes12 1 year ago

      @Gamers Nexus shorts 🅥 @thanks 🅥 This spammer is actually the FieRcE channel on YouTube, going by "thanks" or Gamers Nexus shorts or Bully Maguire. Don't click it unless you want to give them a free view; a very sad way to try to do this. I wouldn't have even cared if you just didn't try to copy and pretend to be Gamers Nexus, including using their logo to trick people.

    • @GamersNexus
      @GamersNexus  1 year ago +4

      1% is pretty much error/variance in most instances. 2% is real, but still has +/- a bit of range.

  • @johnelectric933
    @johnelectric933 1 year ago +2

    Thank you for your consistently great scientific method. I shop the back of the wave and always try to keep up on what features matter when I buy last year's model.
    As a lifelong R&D lab tech, I think you have great lab skills.
    After around half a century in the lab, the only advice I can add to your toolkit is NOTES, lots and lots of NOTES.
    I promise you, in 2 years you will not remember accurately. It is very frustrating to have to recreate an experiment because you forgot to write something down about the setup. It also keeps people from picking your results apart.

  • @jayjayspcbuilds
    @jayjayspcbuilds 1 year ago +1

    awesome vid 💪

  • @andrewvirtue5048
    @andrewvirtue5048 1 year ago +3

    Explain why the quantity of PCIe lanes a device uses is important, which devices use them, and how many can be used at once.

    • @Vile-Flesh
      @Vile-Flesh 1 year ago +1

      I would like to know this as well. I don't understand PCIe lane allocation, or why they compared 3.0 x16 to 4.0 x8, and now I am further confused since he mentioned lanes from the CPU and lanes from the chipset -- pulling lanes from the chipset was possible but not often done. Where are the options for that?

  • @JDHitchman
    @JDHitchman 1 year ago +3

    Steve, just curious if you have ever considered building and selling a "GN Mark" hardware benchmark test suite along the lines of the upcoming LTT "Mark Bench"?

    • @GamersNexus
      @GamersNexus  1 year ago +9

      No, we use all our stuff internally only. It's dangerous to build and distribute that type of thing without an insane amount of controls (that we won't have over user setups) just because it can easily produce bad data that runs rampant for comparisons online - e.g. user benchmark. It's possible. FutureMark has done a good job. But we're not going to try and do that as we aren't set up to do it in a way that I think is responsible (e.g. would be too easy to generate bad data that feels like good data to the users)

  • @chincemagnet
    @chincemagnet 1 year ago +2

    Sweet! I was hoping to see this from a reputable channel 👍🏼😁

  • @Tainted-Soul
    @Tainted-Soul 1 year ago

    Thanks for this. It is good that board speed is ahead of the needs of the graphics cards, showing that this will not be the bottleneck for years to come :)

  • @dorfkind9825
    @dorfkind9825 1 year ago +3

    The difference between PCIe 4.0 x16, x8, and x4 would be interesting.

  • @RoughRunnerAce98
    @RoughRunnerAce98 1 year ago +4

    In the real world this basically confirms for me that you can run close to the best CPU and GPU on an old B350 board that has PCIe 3.0 and only lose single-digit-percent performance from it. Hardware Unboxed showed that B350 boards can run the 5800X3D with little performance loss, within reason of course; you should pay close attention to your VRM temperatures and not take it too far. I personally won't be going that far; the 5700X is the best 65W CPU on AM4 to my knowledge, so that's what I will be upgrading to from my 1700, and now I know that PCIe 3.0 won't hurt at 1440p. Now I will wait to see how RDNA3 and the rest of the 40 series scale with the 5700X before upgrading my 1070. I wish GN did that testing, but HU usually does great scaling videos as well; they did an excellent one for the 5000-series CPUs with 30-series and 6000-series GPUs. AM4 was truly a great platform for longevity. Thank you for doing tests that answer important questions.

    • @Littleandr0idman
      @Littleandr0idman 1 year ago

      Can confirm. Upgraded my 1600 for a 5700X on a B350 board, and upgraded my GTX 1070 to a 6700 XT. I actually had worse performance with the new setup until I realized that the BIOS update had reset the RAM's clock speed. Once I got it back up to 3200 MHz, it's been smooth sailing.

    • @mapesdhs597
      @mapesdhs597 1 year ago

      Indeed. The same can apply all the way back to IB/IB-E CPUs on P67/Z68/X79; in fact it was possible to force 3.0 even with SB-E on X79 (such as the 3930K, because it was a Xeon in disguise). All one is then limited by on such older setups is overall CPU strength, but that's a whole other thing.
      I just find it interesting how Gen 3.0 ended up spanning so many years of different boards and sockets (well, mainly Intel), whereas now the industry seems to be skipping through 4.0, 5.0 and beyond a lot faster. People perhaps forget the utility of what once was; e.g. my lowly old 4c/8t 4820K (so easy to OC) has 40 lanes of 3.0 from the CPU, so it can actually do multi-device/GPU things which much later SKUs like the 7820X could not (that only had 28). Plus, relevant motherboards had lots of lanes off the chipset as well; e.g. my P9X79-E WS supports x16/x16/x16/x16 (half from the CPU, half from the chipset via two PLX chips IIRC), or it can even run x16/x8/x8/x8/x16/x8/x8 using all seven slots. See:
      www.anandtech.com/show/7613/asus-p9x79e-ws-review
      It all became horribly complicated when, for a time, the number of lanes from the CPU depended on the SKU (really horrible product segmentation; it meant slot device support, and even whether some slots could be used at all, depended on which CPU was fitted); e.g. the original 4c CPU for X79 (i7-3820) had 40, yet the much later 5820K and 6800K only had 28. So glad when all that malarkey came to an end.

  • @daveuerk4030
    @daveuerk4030 1 year ago

    Thanks for revisiting this

  • @SkateClipsAndTips
    @SkateClipsAndTips 1 year ago +5

    I have a Z490 motherboard and read that M.2 speeds don't improve gaming much; they're mainly for transferring files. I also learned that PCIe 3.0 vs. 4.0 isn't a big enough difference to bottleneck the GPU. It's mainly bandwidth-related.

    • @paullasky6865
      @paullasky6865 1 year ago

      And I have the Z390. Using 2 NVMe drives doesn't affect the 16 lanes on the first PCIe port.

    • @Skippernomnomnom
      @Skippernomnomnom 1 year ago

      @@paullasky6865 what?

    • @paullasky6865
      @paullasky6865 1 year ago

      @@Skippernomnomnom People keep saying that if you use both M.2 drives you lose PCIe lanes on the main slot. It isn't true.

    • @Skippernomnomnom
      @Skippernomnomnom 1 year ago

      @@paullasky6865 Yeah. I have two NVMe drives and 4 HDDs on PCIe 3.0 and have no issues.

  • @dfwruss2392
    @dfwruss2392 1 year ago +17

    I'm not sure how many are gaming with a 4090 at 1080p. More would be gaming at 1440p, and the target is 4K, I would imagine. I wonder what the difference would be when 4K gaming is factored in.

    • @randomyoutubeuser8509
      @randomyoutubeuser8509 1 year ago +4

      Nobody who spends that kind of money on a 4090 should game on anything less than 4K, especially when the 4090 will barely be utilized at 1080p and 1440p in a lot of games.

    • @amkyutube
      @amkyutube 1 year ago

      Still, the 3.0 vs. 4.0 question is clearly an optimization issue rather than a bottleneck one.

    • @dustingarder7409
      @dustingarder7409 1 year ago

      @@randomyoutubeuser8509 I have an RX 7900 XTX, and I know that it is slower, but not by a lot. I am playing Escape from Tarkov and Satisfactory (with mods). Escape from Tarkov runs on almost maxed settings at 1440p at about 55 fps (on the newest map). Satisfactory runs stable at 144 fps most of the time but has drops to 100. The maximum benefit of the 4090 that I can imagine is 15% more fps, because I am not using RT or DLSS/FSR. What I want to say is that many games don't even run at 4K. Escape from Tarkov could work at 4K with lower settings and DLSS 3.0, but this game is so poorly optimized that DLSS looks like garbage in it and makes it literally unplayable.

    • @HappyHubris
      @HappyHubris 1 year ago

      @@dustingarder7409 The 4090 is generally 30-65% faster (depending on settings and features), not 15%.
      czcams.com/video/f1femKu-9BI/video.html
      I use a 7900 XTX at 3440x1440 and definitely wouldn't mind a 4090, but that's just too much for a GPU!

  • @nicholasgiobbi279
    @nicholasgiobbi279 1 year ago

    Love how you guys left steponz's blue shop towel sign up.

  • @damien8540
    @damien8540 1 year ago

    Love the channel; been watching forever. I've usually found myself to be pretty much in the know on PC hardware, but for once I was stuck trying to decide if 13th-gen Intel on a Z690 was worth it, or whether to just go AMD to get Gen5... If I end up with Gen4, I'll be happy. Thank you.

  • @5371W
    @5371W 1 year ago +3

    I would drop the coin on a 4090, but at this point my entire computer is a bottleneck.

  • @BlackJesus8463
    @BlackJesus8463 1 year ago +3

    Gen 3 is legendary!

    • @m8x425
      @m8x425 1 year ago

      At least you watched the video.

  • @jeffsaffron5647
    @jeffsaffron5647 1 year ago +2

    Great video; always nice to test this when new cards come out.
    I personally use a DeckLink 8K (x8 PCIe 3.0), and it does not work reliably over shared chipset lanes at all. It needs to be on CPU lanes. So unless one is on Threadripper or something with a bunch of CPU lanes, it is good to know we can run in x8/x8 split mode on cheaper platforms with no real performance hit.

    • @inkprod
      @inkprod 1 year ago

      I don't think it's necessarily the chipset lanes, but more the amount of them. Asus has an X570 x8/x8/x8 board with the 3rd x8 connected to the chipset. The chipset itself is only connected PCIe 4.0 x4 to the CPU, but that still leaves enough bandwidth to throw 4 12G feeds at the DeckLink.
      We have a bunch of them, with a mixture of Quad 2s and 8Ks in the chipset slot, without any issues.

    • @jeffsaffron5647
      @jeffsaffron5647 1 year ago

      @@inkprod Yeah, it may work on chipset lanes just fine; the thing is, dedicated CPU lanes are a certainty. Back in the PCIe 2.0 days I had a DeckLink with a single HD-SDI struggle on chipset lanes. It might work on chipset lanes or it might not.

  • @Shirocco7
    @Shirocco7 1 year ago

    Informative, thanks

  • @Nathanael_Forlorn
    @Nathanael_Forlorn 1 year ago +8

    Would have loved to see what happens to fps when limited to x8 lanes.
    With M.2 becoming more widespread, many boards limit from x16 to x8.

    • @LuisGarcia1992_
      @LuisGarcia1992_ 1 year ago +3

      PCIe 4.0 x8 is equal to PCIe 3.0 x16, so according to this video's charts, nothing changes.

    • @Nathanael_Forlorn
      @Nathanael_Forlorn 1 year ago

      @@LuisGarcia1992_ Oh, is it? Kinda makes sense, but I didn't know for certain whether bandwidth precisely doubled, or even if it correlates linearly at all. Thanks for letting me know!

    • @DasFlank
      @DasFlank 1 year ago

      @@LuisGarcia1992_ I'm looking at a 4080 as an upgrade from my 2080 Ti. I have an NVMe boot drive which is forcing my top slot into x8, and it's an X470 board, so Gen3 to boot. I feel a bit silly asking this, but is it a good idea to upgrade to an X570 board with PCIe Gen4? Or am I worried about nothing?

    • @flipdry
      @flipdry 1 year ago

      ​@@DasFlank It may have some impact but how much I can't say. I would either get a b550 board or move the m.2 SSD to a PCIe X4 adapter bracket to allow the top slot to operate at x16 speeds.

    • @Justifier
      @Justifier 1 year ago +1

      This comment actually has more relevance than you'd think.
      Lots of Z790 boards, such as the Z790 Aorus Master, run their PCIe lanes at x8 if you have an M.2 in the top CPU slot. There are going to be a ton of people who want to run Gen5 M.2 storage when it becomes more widely available; how much is this going to impact GPU performance, if at all, on these boards?

  • @raulitrump460
    @raulitrump460 1 year ago +4

    You get a CPU bottleneck with older CPUs before you hit the PCIe limitation.

  • @Ryan-1337
    @Ryan-1337 1 year ago +2

    My B350 board not only got support for Ryzen 5000 but it got a bios update for Re-Bar as well. I'm glad I can stretch out pcie gen 3 for a bit longer and not lose any meaningful performance.

  • @Born2War20
    @Born2War20 1 year ago

    I've been waiting for this video.

    • @TwistedEyes12
      @TwistedEyes12 1 year ago

      @Gamers Nexus shorts 🅥 @Gamers Nexus shorts 🅥 @thanks 🅥 This spammer is actually the FieRcE channel on YouTube, going by "thanks" or "Gamers Nexus shorts". Don't click any link they give unless you want to give them a free view; a very sad way to try to do this. I wouldn't have even cared if you just didn't try to copy and pretend to be Gamers Nexus, including using their logo to trick people.

  • @PolskiJaszczomb
    @PolskiJaszczomb 1 year ago +6

    Do the same now for the 3070; differences should be much higher in contrast to the 4090, due to the much smaller VRAM, similar to what happens with the 6500 XT.

    • @Cruor34
      @Cruor34 1 year ago +2

      3070? Did you typo? I'm pretty sure they covered the 3080 PCIe 3.0 vs. 4.0 like 2 years ago and there was no meaningful difference, so there wouldn't be on the 3070.

    • @Secunder
      @Secunder 1 year ago +1

      The 6500 XT is 4 lanes; that's what tanks it.

    • @PolskiJaszczomb
      @PolskiJaszczomb 1 year ago +1

      @@Cruor34 Oh wait, they totally did; completely forgot. But the 3080 has 10 GB of VRAM, whereas the 3070 has only 8, so there'd be much higher usage of PCIe bandwidth due to constant VRAM data shuffling once they fill 8 GB of it.

    • @GamersNexus
      @GamersNexus  1 year ago +5

      We already did that on the 6500 XT. And we did that on the 3080.

    • @bertracoon1884
      @bertracoon1884 1 year ago

      Nah, I don't think so. The 3070 is simply too weak to make a difference.

  • @alouisschafer7212
    @alouisschafer7212 1 year ago +3

    So even with a 4090, the most powerful GPU on the market, the difference between 3.0 and 4.0 is marginal.
    Now that is interesting, because it shows that PCIe is very future-proof.