Why Did SLI Die?

  • Published Jun 6, 2024
  • The Truth About NVIDIA SLI! In this in-depth video, we explore the rise and fall of NVIDIA SLI and, to a lesser extent, AMD Crossfire. These once-revolutionary technologies promised unparalleled gaming performance but faced numerous challenges that led to their decline. From compatibility issues to diminishing returns, we uncover the reasons behind the struggles and ultimate downfall of these multi-GPU setups.
    Join us as we delve into the history, impact, and future implications of SLI and Crossfire in the world of graphics cards. Whether you're a gaming enthusiast or a tech aficionado, this video provides valuable insights into the evolution of GPU technology. Don't miss out on this comprehensive analysis that highlights the lessons learned and what to expect next in the ever-changing landscape of PC gaming hardware.
    #nvidia #videocard #gpu
  • Science & Technology

Comments • 274

  • @FlamingTX
    @FlamingTX 16 days ago +83

    To sum it all up
    1. AI
    2. The more you buy, the more you save

    • @ultrapredatorr
      @ultrapredatorr 15 days ago

      Preach brotha

    • @Aerobrake
      @Aerobrake 15 days ago

      The more you buy, the more you save

    • @jovee6155
      @jovee6155 3 days ago

      3. The market isn't there

    • @PantsCranks
      @PantsCranks 1 day ago +2

      0:09 into the video, let me stop you there: SLI died because of money and greed, first and foremost. That dictated the chain of events that sidelined and ultimately killed "SLI"...
      SLI and similar features gave the common man another option when a GPU upgrade would normally be warranted: instead of buying a shiny new mid-tier $800 GPU, a PC gamer could add a second card for half to a quarter of the price of that shiny new GPU, provided the card they already owned cost about as much as whatever Nvidia wanted to force, throw, maybe slam down our throats next.
      SLI was that option for us mere mortals. That little connector (never used by most) that came with your GPU back in the day let you run SLI on compatible motherboards (most had the feature, or AMD's CrossFire). Nvidia's subsequent move was to ice the feature: first reserving it for higher-end cards only, then replacing it with their "new and improved" NVLink, an overpriced monstrosity that in most cases had to be bought separately. The plan then moved the feature to only the highest-end, top-tier consumer card available at the time (by that I mean one card only): the RTX 3090.
      Nvidia was also racing the clock, as AMD and PCIe Gen 5 were about to establish themselves in the mainstream. PCIe Gen 3 and earlier slots (where you plug in the GPU) simply weren't fast enough to use SLI effectively, which is why the SLI or NVLink bridge was needed to sync the cards and make the feature possible (basic explanation).
      But with the cat out of the bag and Intel soon to follow, PCIe Gen 5 very easily gives all current GPUs (including the 4090, which needs at most Gen 4 lanes for full functionality) the necessary speed between GPUs to enable those linked states, and subsequently ALL that POWER for mere mortals to ponder and acquire, while less moolah in turn would be heading for Nvidia's wallet...
      Also, the narrative was that games ("not much supports it anymore"...) no longer developed for or allowed the SLI feature set (for whose actual benefit and at whose influence, I leave you to ponder)... People? Please.

      So now, 0:09 seconds into this video, I apologize for what I have inflicted on all of you, but if you got this far into my comment, then now, kiddies, you know the truth!
      Shamelessly, after reflecting on my own actions, consider going down the rabbit hole with us and check out my new tech channel, linked to my handle here (shamelessly, of course!): @phantomtec. Just another tech phantom myself. Peace.
      P.S. I hope, for the love of all things tech, that "META PCs" did this truth justice and is not another one of those shills we at phantomtec ironically call "just phantom tech here" (you know, more BS...). No hard feelings...

  • @luckythewolf2856
    @luckythewolf2856 12 days ago +14

    The GTX 1060 never supported SLI. The GTX 1070 and up supported it.

  • @nikke2404
    @nikke2404 14 days ago +30

    Actually, RDR2 was released in the fall of 2018, not 2017...

    • @Pasi123
      @Pasi123 4 days ago +8

      October 2018 for PS4 and Xbone. It didn't come out for PC until November 2019

  • @arenzricodexd4409
    @arenzricodexd4409 15 days ago +28

    Former SLI user here. The thing that killed SLI is definitely the software, more specifically game engines. Nvidia can improve SLI with things like NVLink, but it's useless when game engine developers keep introducing tech that works against the techniques multi-GPU relies on, like AFR. Honestly, I don't think Nvidia removed SLI from consumer cards in favor of things like AI. After more than a decade of pushing multi-GPU, game developers have no interest in supporting it in their engines (they do the opposite when it comes to CPUs). The effort to support SLI in games is not cheap: an ex-Nvidia engineer said that in the past almost 50% of the driver team's work was making sure SLI worked in all existing games, even though SLI users accounted for only a small fraction of Nvidia's user base.

    • @notaras1985
      @notaras1985 14 days ago +2

      This. But AI loads still need it

    • @adamtajhassam9188
      @adamtajhassam9188 14 days ago +1

      @@notaras1985 100% agreed, these companies just don't care

    • @kevinerbs2778
      @kevinerbs2778 5 days ago +3

      No, Nvidia did take it away on purpose. There's an extreme overclocker (venturi) who found that Nvidia was specifically blocking it in software for games where it could and should have been working (RTX 2060 in SLI, because of mGPU). The game is Deus Ex: Mankind Divided. It has mGPU support, but the way Nvidia wanted mGPU to work with their cards was only with SLI-specific cards.
      The absolute worst thing that ever happened to SLI was the complete and utter lie that multiple cards caused "micro-stutter". We've gotten plenty of games that only ever use a single card and still stutter or micro-stutter on DX12. In 2023 there were at least 14 triple-A games that all released with bad stuttering or micro-stuttering. It's time reviewers actually ate crow, because they're the ones who pushed for this single-card crap that now has consumer cards costing $1,000 to $2,500.
      That ex-Nvidia engineer is a liar; all they did was make sure the flags from DX9, DX10, and DX11 through DX12 were in the driver, and it's not even that hard to do. AMD is doing it right now with mGPU for their whole RDNA line, from the RX 5000 series up through the RX 7000 series: you can still use two cards on the entire AMD RDNA lineup, you only lose ReBAR because it's pointless for dual GPU.
      Complete bull crap about it costing too much: RTX cards all cost twice as much as the GTX cards of the same tier that supported SLI used to cost. Don't even say DLSS is superior to SLI either.
      DLSS is not run on the GPU in your computer first, and DLSS isn't free either: it costs money to use the AI machines Nvidia runs to process it correctly. That happens at Nvidia first, then it's implemented in the drivers by a driver team, along with implementing DLSS in the game itself. Which is far more than SLI ever needed; at a basic level it's somewhat the same, while being even more time-consuming than SLI was on the driver side.
      People need to understand that and stop praising DLSS as some kind of savior. It isn't; in fact, it's also made developers lazy, to the point where it's become a crutch for running poorly written/coded games.

    • @arenzricodexd4409
      @arenzricodexd4409 4 days ago

      @@kevinerbs2778 1) DX12 mGPU is implemented by the game developer, not in the graphics driver like CF or SLI. It's up to the developer which cards get mGPU enabled; Nvidia has nothing to do with it when DX12 mGPU doesn't work in a game, because the support is done completely inside the game. For example, Ashes of the Singularity allows an AMD Fury X to work with an Nvidia 980 Ti in DX12 mGPU because the game developer implemented it to let both cards work together; there's nothing Nvidia can do to stop that. Nvidia can block SLI from working, but not DX12 mGPU, since the implementation is done natively inside the game.
      2) You're talking about a different issue. The stutter we see in the post-multi-GPU era is a game engine issue. In UE's case, for example, it became more common with UE4; when UE3 was still very popular and used in many games, we didn't see this kind of issue.
      3) Just enabling SLI is easy. In the past, if a game didn't support SLI or official support was late, people could enable it themselves using tools like Nvidia Inspector. The hard part is making sure it scales as high as possible and stays stable. What's the point of going multi-GPU if you only get 40% more performance, or the game keeps crashing, or it has visual artifacts? I still remember Dragon Age: Inquisition showing artifacts when running on my 660 SLI; they were completely gone when I ran the game on a single GPU. As for AMD's so-called mGPU support: did you see any gamers end up buying two Radeon cards and making them work in recent games?
      4) The effort to support all the issues caused by multi-GPU isn't worth it when the user base is extremely small. Initially the idea behind multi-GPU was to encourage gamers to buy more than one GPU per system; after more than a decade of pushing SLI/CF, not even close to half of gamers worldwide entertained that idea. In the end it comes down to this: multi-GPU needs gamers to spend money on more GPUs, while upscaling like DLSS lets gamers keep their single GPU a while longer.
      5) DLSS might be more expensive cost-wise for Nvidia, but it became a marketing point for people to buy their GPUs over the competition. With DLSS supported from low end to high end, every gamer who buys an Nvidia GPU can benefit from it. Multi-GPU? Only people with such a setup benefit.
      6) This kind of thing isn't specific to DLSS. When 8th-gen consoles got a massive memory upgrade over 7th gen, developers also used that as a reason not to optimize their VRAM/RAM usage in games, as John Carmack pointed out a few years ago.
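The distinction in point 1 above — with explicit DX12-style mGPU the application, not the driver, decides how work is divided, even across mismatched cards — can be sketched as a toy load balancer. This is illustrative Python only: the device names and relative-speed numbers are invented for the example, and no real graphics API is involved.

```python
# Toy sketch of explicit multi-GPU: the *application* splits a frame into
# horizontal slices sized by each device's relative speed, the way an engine
# doing application-managed split-frame rendering might load-balance.

def split_frame(width, devices):
    """devices: list of (name, relative_speed). Returns (name, x0, x1) slices."""
    total = sum(speed for _, speed in devices)
    slices, start = [], 0
    for i, (name, speed) in enumerate(devices):
        # The last device takes whatever remains, so the slices tile the frame.
        end = width if i == len(devices) - 1 else start + round(width * speed / total)
        slices.append((name, start, end))
        start = end
    return slices

# Mixing two different vendors' cards, as Ashes of the Singularity allowed:
print(split_frame(1920, [("Fury X", 1.0), ("980 Ti", 1.0)]))
# An unequal pairing simply gets unequal slices:
print(split_frame(1920, [("fast card", 3.0), ("slow card", 1.0)]))
```

Because the split lives in game code rather than the driver, there is no SLI-style driver profile for a vendor to gate, which is the point the reply is making.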

    • @kevinerbs2778
      @kevinerbs2778 4 days ago

      @@arenzricodexd4409 Your whole comment is nothing but hot air & misinformation.

  • @stefensmith9522
    @stefensmith9522 15 days ago +49

    I miss SLI. I was running SLI in every setup I've had since the 8800gtx era.

    • @METAPCs
      @METAPCs  15 days ago +11

      Wow I bet you have a lot of experience to share! Very rad 🫡

    • @regisegek4675
      @regisegek4675 8 days ago

      @@METAPCs PX 8600 GT

    • @billchildress9756
      @billchildress9756 8 days ago

      I have 2 8800 GTS cards in SLI on a XFX 680i Board. What you had was the ultimate setup!

    • @kevinerbs2778
      @kevinerbs2778 5 days ago +1

      I love my SLI system; there are over 1,000 games that support it.

    • @JerryNeddler
      @JerryNeddler 15 hours ago +1

      I love SLI

  • @SlyNine
    @SlyNine 15 days ago +13

    They do explain why. It's deferred rendering techniques that rely on the previous frame to render the next frame; with SLI, the previous frame's buffer is on the other card, because the rendering technique used was AFR. That's how implicit multi-GPU modes worked. Devs had the option of using explicit multi-GPU modes, but only one or two games did.
    So Nvidia gave up on the feature.
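The failure mode described above — AFR alternating frames across cards while temporal techniques need the previous frame's buffer — can be modeled with a small toy simulation. Illustrative Python only; `ToyGPU` and the copy-counting are invented for the sketch and do not reflect any real driver behavior.

```python
# Toy model of Alternate Frame Rendering (AFR): frames round-robin across
# GPUs, but deferred/temporal techniques need the *previous* frame's buffer,
# which with two cards always lives on the other GPU.

class ToyGPU:
    """Hypothetical GPU that keeps the frame buffers it rendered."""
    def __init__(self, name):
        self.name = name
        self.buffers = {}                      # frame index -> frame data

    def render(self, frame, prev_data):
        # A temporal effect (TAA, reprojection) reads the previous frame.
        self.buffers[frame] = f"f{frame}<-({prev_data})"
        return self.buffers[frame]

def afr_render(gpus, n_frames):
    """Render n_frames in AFR order; count cross-GPU copies forced whenever
    the previous frame was produced on a different card."""
    copies, prev = 0, "none"
    for frame in range(n_frames):
        gpu = gpus[frame % len(gpus)]          # AFR: alternate frames
        if frame > 0 and (frame - 1) % len(gpus) != frame % len(gpus):
            copies += 1                        # prev buffer sits on the other GPU
        prev = gpu.render(frame, prev)
    return copies

# With two cards, every frame after the first depends on the other GPU:
print(afr_render([ToyGPU("GPU0"), ToyGPU("GPU1")], 8))   # -> 7
# A single card never needs a cross-GPU copy:
print(afr_render([ToyGPU("solo")], 8))                   # -> 0
```

With one GPU the previous-frame dependency is free; with two, every temporal read crosses the bridge, which is why engines built around previous-frame data scaled so poorly under implicit AFR.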

    • @exoticspeedefy7916
      @exoticspeedefy7916 13 days ago

      And so now we will have MCM GPUs, which are sort of the same thing: two GPU dies in one package.

  • @e2-woah939
    @e2-woah939 15 days ago +16

    Lots of people were buying gtx 970s and getting great performance. Nvidia didn’t want others buying used gpus cheap and not buying newer cards.

    • @FantomMisfit
      @FantomMisfit 14 days ago +5

      Yes I think that's part of it too.

    • @Trikipum
      @Trikipum 2 days ago

      Native multi-GPU support has been a thing since DX12; you can even mix different brands and models. They simply phased SLI out when it made no sense anymore.

    • @KraziMutherFuckingAgent
      @KraziMutherFuckingAgent 13 hours ago

      well, don't buy them anyway

  • @Idk-Anymore4
    @Idk-Anymore4 17 days ago +21

    If NVIDIA keeps on pulling these actually good features from their gpus AMD and Intel will be sure to catch up.

    • @jai_the_guy.
      @jai_the_guy. 17 days ago +9

      AMD already has, but Crypto miners and AI companies have increased Nvidia's market share

    • @arenzricodexd4409
      @arenzricodexd4409 15 days ago +4

      Catch up? AMD killed CF support on their cards much earlier than Nvidia, which still supported SLI until the 20 series. AMD's last card with CF support was Vega, and even back then AMD no longer prioritized it, to the point that Polaris actually had better CF support than Vega, AMD's high-end GPU at the time.

    • @MrBlackdragon1230
      @MrBlackdragon1230 15 days ago

      @@jai_the_guy. LMAO!

    • @SlyNine
      @SlyNine 15 days ago

      Nvidia killed SLI for the same reason ATI killed Crossfire. AFR rendering was not compatible with deferred rendering techniques.
      The previous frame's buffer is on the other card. The game uses that frame to help with rendering.
      Devs had the option to use explicit multiple GPU modes for both and didn't do so.

    • @ZackSNetwork
      @ZackSNetwork 12 days ago

      @@jai_the_guy. AMD has not caught up with anything on the GPU side; they suck.

  • @stefanwalsh8034
    @stefanwalsh8034 13 days ago +6

    I had two HD 5850s in CrossFire for Battlefield 3 back in the day. What a time to be alive.

  • @FantomMisfit
    @FantomMisfit 14 days ago +21

    I think SLI hurt their sales, so that's why they ended it. You could buy two 4080 Supers for what a 4090 costs.

    • @ZackSNetwork
      @ZackSNetwork 12 days ago +4

      Exactly, I still remember back in 2016 I bought a Titan Xp for 4k gaming. A person I knew said why didn’t you just do a 1080 SLI. More money, more power, less consistency in performance.

    • @betag24cn
      @betag24cn 12 days ago +7

      It was killed because the drivers never worked as expected, and two GPUs together only gave you about a 50% extra performance boost, yet you paid for two GPUs, an expensive motherboard, the SLI bridge, and an expensive PSU. It was not the best idea.

    • @AndyU96
      @AndyU96 5 days ago

      Agreed. If the product stack had prices scaling linearly with the performance of the cards, then they wouldn't be losing any money through SLI. But since they scale the prices of their cards exponentially with respect to performance, what you just said ends up being the case.

    • @AndyU96
      @AndyU96 5 days ago

      @@betag24cn Surely if enough research had been thrown at it, it could have been improved to the point where the performance boost was near 90%.

    • @betag24cn
      @betag24cn 5 days ago

      @@AndyU96 I doubt it; engines don't split loads, and programs barely split work between CPUs. It was not meant to be.

  • @gamerstewart1660
    @gamerstewart1660 14 days ago +6

    AMD and Nvidia should have just made a special driver for multi-GPU use, turning all the GPUs in the setup into one virtual GPU so games could treat them that way while the driver splits the workload between them 😮‍💨

    • @arenzricodexd4409
      @arenzricodexd4409 14 days ago +1

      That is what SLI and CrossFire are. Games most often weren't aware there were two GPUs inside the system; most of the "magic" is done by the drivers.

  • @kizunadragon9
    @kizunadragon9 10 days ago +4

    running two 1080 ti's in SLI was godly

  • @stevens1041
    @stevens1041 14 days ago +4

    I always heard about it, but could never justify the cost to myself. Never got to experience it, sadly. Thanks for covering it.

    • @METAPCs
      @METAPCs  14 days ago +1

      It was a difficult cost to justify for sure! Thanks for watching. Appreciate it

    • @Brodda-Syd
      @Brodda-Syd 4 days ago

      It doubled the frame-rate in the Crysis series, Battlefield series and Sniper series. I loved it.

  • @andrewszombie
    @andrewszombie 13 days ago +3

    3:45 imagine being this forum poster & seeing your comment from a decade ago in this video 😂

  • @doriansanchez1961
    @doriansanchez1961 15 days ago +6

    Cutting the BS: Nvidia wants you to buy a high-end GPU, not an old one that they won't profit from.

    • @arenzricodexd4409
      @arenzricodexd4409 15 days ago

      The idea behind multi-GPU is to encourage people to buy more GPUs.

    • @doriansanchez1961
      @doriansanchez1961 14 days ago +4

      @@arenzricodexd4409 Nope, that's what they were hoping for, so toward the end only high-end GPUs had SLI. But most people, instead of buying a new GPU, would buy a used one to add to their PC, so no new GPUs were being sold. With the release of the Titan, Nvidia learned that people would pay twice as much for a GPU, so the 90 series was invented and the end of SLI was born.

    • @arenzricodexd4409
      @arenzricodexd4409 14 days ago

      @@doriansanchez1961 When I was still active on hardware forums years ago, the talk of getting a used second card for cheap later on didn't really fly in reality. In the end most of us concluded that if you want to go multi-GPU, do it from the get-go; even if you want to do it later, don't do it much later than mid-cycle of the series. Back then it was well known that people would be hunting for a second card later, so brand-new GPUs ended up retaining their original MSRP, while on the used market people knew the value of that second card and sold it at a higher price. In some cases, rather than going SLI/CF and dealing with the quirks of such a system, it was better to get a next-gen GPU.

    • @kevinerbs2778
      @kevinerbs2778 5 days ago +2

      @@doriansanchez1961 Even worse is that people can't see that DLSS needs more than SLI ever needed.
      DLSS has to be implemented directly into the game,
      then it has to be updated from Nvidia's AI machines,
      next the driver implementation has to ship with the new drivers that cover the game,
      and last, the card has to support that version of DLSS.
      Meanwhile, SLI only needed a driver implementation and two SLI-enabled cards.
      =_=

  • @ronhaworth5808
    @ronhaworth5808 7 hours ago +1

    The problem with SLI for me was that every time I considered adding a second card later to boost performance, a new-generation GPU came out that could do it with a single card.

  • @Bludgeta9001
    @Bludgeta9001 14 days ago +5

    Could you imagine 2 rtx 4090s working together?!

    • @user-tq4rt2fj9l
      @user-tq4rt2fj9l 14 days ago +4

      💣

    • @arenzricodexd4409
      @arenzricodexd4409 14 days ago +3

      CPU bottleneck will be an issue even at 4K.

    • @BarnOwl-dx8vg
      @BarnOwl-dx8vg 14 days ago +4

      The plugs would melt

    • @Brodda-Syd
      @Brodda-Syd 4 days ago +1

      RASSSS-FIRE

    • @Pasi123
      @Pasi123 4 days ago

      ​@@arenzricodexd4409 Probably not that much at native 4K in graphically intensive games. In TPU review the RTX 4090 + i9-13900K got 45 FPS in Alan Wake 2 at native 4K with RT on, 67 FPS with DLSS Quality, 77 FPS DLSS Balanced, 87 FPS DLSS Performance, 106 FPS DLSS Ultra Performance.

  • @GregOver
    @GregOver 17 days ago +4

    Yung Gravy and Shania Twain just dropped a song about White Claw. 2024, wtf is going on.

  • @Ivan-pr7ku
    @Ivan-pr7ku 15 days ago +3

    Frame generation tech pretty much supplanted SLI (and CrossFire) functionality: it produces more frames with the same or worse input lag, plus a few image artifacts for a change, but at the low cost of one GPU in the system. Multi-GPU is still an option in DX12 and other APIs, but now it's entirely up to game developers to implement and use it.
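The frame-generation idea above — extra frames synthesized between rendered ones rather than rendered by a second card — can be reduced to a toy sketch. Real implementations (DLSS Frame Generation and the like) use motion vectors and neural networks; the plain linear blend below is only an illustration, with made-up pixel values.

```python
# Toy sketch of frame generation: synthesize an in-between frame from two
# rendered ones. A linear blend is the simplest possible stand-in for the
# optical-flow/neural approach real products use.

def interpolate_frame(frame_a, frame_b, t=0.5):
    """Blend two frames (flat lists of pixel intensities) at parameter t."""
    return [(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)]

rendered_0 = [0, 10, 20]      # frame the GPU actually rendered
rendered_1 = [10, 30, 40]     # next rendered frame
generated = interpolate_frame(rendered_0, rendered_1)
print(generated)              # -> [5.0, 20.0, 30.0]
```

The generated frame is displayed between the two rendered ones, which is why frame generation raises the frame rate without improving input latency — the trade-off the comment points out.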

    • @TheMaztercom
      @TheMaztercom 12 days ago +2

      You forget that SLI didn't downgrade your visual fidelity; remember that DLSS is just a lower resolution upscaled.

    • @max.racing
      @max.racing 9 days ago

      but what if we combine Frame Generation with the power of a SLI system?
      Unlimited fps hack

    • @kevinerbs2778
      @kevinerbs2778 5 days ago

      Frame generation can't even touch SLI.
      The Witcher 3 supports both, and I get 170 fps in DX11 with two RTX 2080 Tis in SLI.
      Meanwhile, the RTX 4090 in DX12 can't even pass 110 fps with DLSS and frame generation enabled.
      The game is actually terrible in DX12 with RT because it lacks a lot of proper code paths; the engine was ported over from their DX11 engine rather than built from the ground up for DX12. A single RTX 2080 Ti at 4K with everything maxed out, including RT, gets 32 fps.

  • @WhiskeyInspekta
    @WhiskeyInspekta 15 days ago +2

    I had two 780 Tis in SLI for years, I want to say until 2020. But like you said, many new games I tried always had some issue with SLI. I ended up upgrading to a 5700 XT, then a 7900 XT.

  • @rangersmith4652
    @rangersmith4652 14 days ago +2

    Years ago I ran a dual 980Ti SLI setup, and before that I ran a dual R9-290 Crossfire setup. Both added a ton of power draw and heat. Neither scaled worth a hoot in FFXIV, which was the only game I cared about. Still is, pretty much. Now that any decent mid-range GPU is more than enough for FFXIV, I have zero interest in dealing with the added complexity of a dual-GPU setup.

    • @kevinerbs2778
      @kevinerbs2778 5 days ago

      That game has a terrible engine built around it.

    • @rangersmith4652
      @rangersmith4652 4 days ago

      @@kevinerbs2778 It's old, yes. Terrible is a very subjective thing.

  • @The_Twisted_Monk
    @The_Twisted_Monk 12 days ago +2

    SLI truly died because of money... You said it "It was becoming more a budget option". They wanted people to get off the older GPUs and spend more money for the newer stuff. The longer they kept SLI alive, the longer people would want to stay on their older GPUs.
    I think AMD threw a wrench into their plans when they made FSR and Upscaling for everyone. hahaha

  • @Tribudo
    @Tribudo 14 days ago +1

    I remember seeing a video where a guy used crossfire with a custom driver to do combined graphics between an A series apu and a low end graphics card.

  • @mraltoid19
    @mraltoid19 14 days ago +2

    I think AMD could do well if they marketed "Crossfire" with RDNA4, since those are mid-level cards anyway. AMD could market two (RDNA4) Radeon RX 9700 XTs at $550 each as better-than-RTX 5090 performance for $500 less. Then they could do the same with the Radeon RX 9600 XT: sell those cards (which have 7700 XT-class performance) for $299 each and let you CrossFire them later for RX 7900 XTX-class performance.

    • @arenzricodexd4409
      @arenzricodexd4409 14 days ago

      AMD has more than a decade of CrossFire experience with past products. They ditched it because it wasn't worth pursuing.

    • @AsmodeusExx
      @AsmodeusExx 12 days ago

      Will be back...

    • @kevinerbs2778
      @kevinerbs2778 5 days ago +1

      You can already use two cards with all of AMD's RDNA-type lineup, from the RX 5000 series through the RX 7000 series: you can enable mGPU in the control panel of their software. You just lose ReBAR when you do.

  • @Anima_moxhya
    @Anima_moxhya 14 days ago +2

    SLI evolved into the linking chip now used in cloud GPUs: two 48 GB GPUs become 96 GB of VRAM, with high-speed data transfer between the servers.

  • @TewaAya
    @TewaAya 15 days ago +3

    Still useful for 3D rendering; the 3090 is the last NVLink/multi-GPU compatible SKU, though.

    • @arenzricodexd4409
      @arenzricodexd4409 15 days ago

      For non-gaming tasks you can always go multi-GPU, even mixing architectures. The benefit of NVLink on consumer-grade cards was that it allowed combined VRAM; it's the reason Nvidia took NVLink away from the 4090.

    • @kevinerbs2778
      @kevinerbs2778 5 days ago

      @@arenzricodexd4409 There were issues with the RTX 4090 in CUDA programs that supported multiple cards; I read a lot of complaints about them from independent developers in forums.

  • @andreabriganti1226
    @andreabriganti1226 15 days ago +7

    Now more than ever we need SLI/Crossfire. Or something that could help with ray tracing, especially when it comes to AMD GPU.

    • @DenverStarkey
      @DenverStarkey 15 days ago +1

      Yeah, imagine using one card to render the polys and textures while the other card handles just the RT. You'd have 4K 60+ FPS with path tracing easy and cheap then, instead of having to buy a $2,000 4090.

    • @MelioUmbraBelmont
      @MelioUmbraBelmont 13 days ago +1

      That's my point in many discussions: like PhysX, just buy a card to accelerate it.

    • @DenverStarkey
      @DenverStarkey 12 days ago

      @@MelioUmbraBelmont Well, that was the original idea. Like PhysX, Nvidia acquired RTX by buying out a company that was developing it for a standalone card, except that company never got its card to market before getting bought, while PhysX did see market first.
      This is one instance where Nvidia really hurt the market with their buyout, as the standalone RT card was aiming for 1440p RT at 60 FPS just in its first gen. Instead the tech got bolted onto Nvidia's video cards, and as we know, the first RTX video cards couldn't achieve 1440p @ 60 FPS with RT on; hell, most of the second- and third-gen RTX video cards still can't. And forget 4K unless you've got a 4080 or 4090 at $1,000 or $1,700.

    • @kizunadragon9
      @kizunadragon9 10 days ago +2

      An SLI like setup where one card renders the game and the other handles just ray tracing would be lit

    • @MelioUmbraBelmont
      @MelioUmbraBelmont 10 days ago

      @@kizunadragon9 It's possible to do; a Brazilian channel made a UE4 demo using AMD cards back when RT launched and Nvidia said it was only possible with their hardware.
      I remember a guy who made RTGI in ReShade for the Resident Evil remake and BioShock; it's possible to render different layers/chunks/shaders based on normal maps and Fresnel in parallel, then combine them all in the final frame, like in Zelda BotW.

  • @xBINARYGODx
    @xBINARYGODx 3 days ago +1

    You could set AFR for any "unsupported" game and get AT LEAST a ~50% uplift from having the second card.

  • @Snakkers
    @Snakkers 15 days ago +3

    We need this for vr.

  • @shadowr2d2
    @shadowr2d2 12 days ago +1

    Thank you for posting this video 🎥🍿🍺🇺🇸. You have a new subscriber 🎉. Keep up the great 👍 work 🎉.

  • @stephenmolinski6837
    @stephenmolinski6837 6 days ago +1

    I just want to drop down here in the comments and say that I think this technology would be very useful for VR graphics handling. Wish it would be adopted.

  • @amdintelxsniperx
    @amdintelxsniperx 9 days ago

    NVLink is still going in the server space; it basically doubles VRAM and turns two GPUs into one.

  • @pstm53
    @pstm53 15 days ago +2

    this video is criminally underrated

  • @rossmclaughlin7158
    @rossmclaughlin7158 9 days ago +1

    I had two HD 7970s, and the move/upgrade to the 980 was rather lackluster at the time. Given the four or five years I had between upgrades, noticing that games largely played the way they had before was a bit of a disappointment, to say the least. That's like someone today building a PC to the highest spec they can afford, upgrading to the second-highest-end GPU available in 2029, and seeing little to no improvement. But this honestly happened.

  • @dannyyristenpatt614
    @dannyyristenpatt614 13 days ago

    I built my last SLI rig back in 2012 using two GTX 660 Tis, and it was awesome until mid-2014, when titles like Watch Dogs, Wolfenstein: TNO, and The Evil Within started releasing with non-existent or broken SLI support. Things didn't improve the following year, with games like Assassin's Creed: Unity and Batman: Arkham Knight both forcing me to physically remove my second card from the system because simply disabling SLI wasn't cutting it. Things didn't get much better until GTA V surprisingly launched with excellent SLI support, and my 660 Tis were useful for another six months until I eventually switched to a GTX 980 Ti.

  • @dragonsystems5973
    @dragonsystems5973 14 days ago +1

    I think you are a bit off on your timescale... actual usability of SLI, specifically game support, was dying as of late Kepler. Yes, it's true there were a few cherry-picked examples that thrived on SLI, but I've tried it on newer games that state they support it, and generally it does nothing, or even less than nothing. I love me some Crysis as much as the next guy, but that's almost all it was good for; Crysis was the Cyberpunk of its day. I have a GTX 690 on hand right now, and even in games from its era it is very unwieldy.

  • @mtbwithjure7809
    @mtbwithjure7809 17 days ago +56

    AMD for the win!

    • @kaishedan37
      @kaishedan37 16 days ago +7

      crossfire is also dead though?

    • @CallOFDutyMVP666
      @CallOFDutyMVP666 16 days ago +6

      15% of the steam hardware, 90% of youtube comments.

    • @ghostlyinterceptor7756
      @ghostlyinterceptor7756 16 days ago +10

      @@CallOFDutyMVP666 The fact is that the reputation of AMD's software and drivers is what pushed people into this "avoid AMD at all costs, no matter what, even if they've improved a lot" mindset. Right now AMD did a massive flop with the 7000 series, but a huge W with the 6000 series, rivaling the best, the 3090 Ti, in raster and making Nvidia actually improve instead of going the Intel route of "MORE POWER".

    • @arenzricodexd4409
      @arenzricodexd4409 15 days ago +4

      @@ghostlyinterceptor7756 RDNA 2 was winning because it used TSMC 7nm while Nvidia was on Samsung 8nm. When they reached node parity, Nvidia came out on top unchallenged, like with the 4090; a similar thing happened during the Pascal generation as well.
      On the software side, AMD tends to have major issues when they move to a new architecture or do something drastic with it. This ended up giving AMD the stigma of "always" having software/driver issues.

    • @WolfChen
      @WolfChen 15 days ago

      for the price*

  • @sargonsblackgrandfather2072

    Shame about SLI. I used to buy a top-tier card like an 80-series, then after a few years, when it was showing its age, I'd buy another one second-hand and be back on top again.

  • @billchildress9756
    @billchildress9756 8 days ago

    How about 3 Asus 1080 8gb cards in SLI?

  • @charcoalmoth
    @charcoalmoth 13 days ago +1

    Honestly, this is AMD's and Intel's opportunity to implement a new and improved version of this software on their GPUs and actually push its use. Since they aren't going blow for blow with Nvidia in the single-GPU market, why not try to hit them in the multi-GPU market, especially with their more budget-friendly cards?

    • @AsmodeusExx
      @AsmodeusExx 12 days ago

      It would be a good idea for Intel to invest in SLI; it would boost their GPUs at a nice budget. I think we'll return to SLI before long, because it's cheaper than investing in ultra-expensive tech like Nvidia's.

  • @ReRy09
    @ReRy09 Před 14 dny

    Just imagine dual or even triple 4090 builds combined as one

  • @kizzmiaz
    @kizzmiaz Před 11 dny

I ran 2 Kepler-era (GTX 7xx, I think) Titan Xs in SLI to run games at 4K, and they struggled. That cheap 4K monitor wasn't that cheap after all lol.

  • @leeebbrell9
    @leeebbrell9 Před 10 dny

I remember the Voodoo cards, and yes, I dreamed of SLI all the way up to the 20 series. Some games ran slower with it enabled

  • @BodySculptTV
    @BodySculptTV Před 14 dny

Just subscribed to your channel because this video was just that good. Thank you for bringing this to the limelight. I agree with you 110% and can add a few things to your research.
Take DLSS technology and how Microsoft is now incorporating it into Windows 11. Do you really think Microsoft couldn't have embedded some sort of SLI support into Windows if they had wanted to? They could have, and it would have given SLI the much-needed support it deserved. The fact is that SLI backfired on Nvidia by giving customers the opportunity to not have to buy the latest and greatest, but instead just buy another card, even used from eBay, and get the performance they needed. As a matter of fact, I ran two 1080 Tis in SLI for many years without needing an upgrade. Nvidia's 20 series was no match for the 1080 Ti SLI rig, and only the RTX 3090 outperformed it. So in Nvidia's eyes SLI was causing sales disruption, and that cannot be allowed.
But I believe some sort of SLI/NVLink is coming back due to all the AI-intensive products headed our way. Although the RTX 4090 can do up to 1300 TOPS, that's without having to render a game, so I figure a game combining high-end graphics and embedded A.I. will require multiple cards to perform all these processes at once. The "supposed" 45-100 TOPS that these new Intel and AMD processors will yield will not be enough to have multiple characters in a game be A.I. driven. In the not-too-distant future, I see a little checkbox in the Nvidia control panel, similar to how you enable PhysX, that lets you allocate a certain portion of your cards to A.I. inference in games. Thank you again for such great content.

  • @ccramit
    @ccramit Před 2 dny

I was looking to build a PC in 2019 for the first time since 2010. I was in the Army and went with a console for simplicity's sake, so I did not keep up with anything PC for almost a decade.
So it was a bit of a surprise when I was looking to build again and looked up SLI, only to find out that it wasn't really a thing anymore. I suppose it's a good thing: no need to pay for another expensive GPU. And then I found out SLI was never really the status quo anyway. Apparently it wasn't fully optimized for your average gamer and was only adopted by enthusiast gamers.
And Crossfire was also the name of a children's game where you shot metal balls at plastic toys, kind of in the realm of Rocket League/foosball. Spent many an hour on that as a kid lol.

  • @SpoonyOrbit
    @SpoonyOrbit Před 8 dny +1

    AI upscaling and frame generation makes more sense now

  • @Personalinfo404
    @Personalinfo404 Před 15 dny +2

My first gaming PC was an Intel 2600K and 2 GTX 660 Tis in SLI

    • @METAPCs
      @METAPCs  Před 15 dny

      Nice! That’s a pretty rad first gaming PC.

    • @arenzricodexd4409
      @arenzricodexd4409 Před 15 dny

      Wow almost similar to mine back then. 2500K with 660SLI.

  • @EveryGameGuru
    @EveryGameGuru Před 10 dny

    😭 SLi (and crossfire) was amazing. Sadly my 3090s are the last SLi rig, for now 😭 and it's not even compatible with DX11 games. For that, I still have my dual 2080Ti system

  • @DenverStarkey
    @DenverStarkey Před 15 dny +4

What do you mean by "bought by an up and comer"? Nvidia was already an 800 lb gorilla in the industry when they bought 3dfx. They were already on the GeForce 2 and working on pushing out the GeForce 3 when they bought 3dfx, and they already had 80%+ market share at that time... they had already got "up".
That said, I think SLI died because Nvidia doesn't want consumers to spend just $300-400 per card ($600-800 total) and be able to beat a $2,000 card doing so, or even come in just under the $2,000 mark. They want you to buy the $2,000 card, or the single card that comes in under it at $1,000-1,200. That's why SLI died and why they won't be bringing it back. It was only getting better, and they sure as shit didn't want someone to pass up buying the $1,500 3090 and just get two 3070s for $500 less.

    • @METAPCs
      @METAPCs  Před 15 dny

      Sometimes my dry sarcasm doesn’t play well 😅

    • @arenzricodexd4409
      @arenzricodexd4409 Před 15 dny

AFAIK Nvidia's 80% market share only happened recently. Even when AMD released the 5700 XT back in 2019, Nvidia's share was still at 70%.

    • @kevinerbs2778
      @kevinerbs2778 Před 5 dny

That lower cards beating higher cards also happened with AMD. The HD 5000 series is when it happened. There were benchmarks of "Trifire" HD 5670s (three in CrossFireX) beating two HD 5870s in Crossfire.

    • @arenzricodexd4409
      @arenzricodexd4409 Před 5 dny

@@kevinerbs2778 Maybe in benchmarks like 3DMark, but many games rarely ever work beyond 2 cards. And even when they work, the gain over two cards is very small.

    • @kevinerbs2778
      @kevinerbs2778 Před 4 dny

@@arenzricodexd4409 No, they were tested in games, like GTA IV and a few others. I can find the article because it was a well-known tech website. They weren't actually faster in the 3DMark bench for some reason, yet all the games showed a 10-40% increase over the two HD 5870s

  • @tristman8413
    @tristman8413 Před 10 dny

I remember having 2 GTS 320s in SLI years ago and one of the first Core 2 Duo CPUs 😂 man, how things have improved since then

  • @settlece
    @settlece Před 8 dny

    Even now when the snow's outside, I switch on my SLI computer just to keep the house warm.

    • @Brodda-Syd
      @Brodda-Syd Před 4 dny +1

      Yep. I remember gaming on Boxing Day with both windows wide open when it was 2C outside yet my two 2080 ti's kept me nice and warm.

  • @joonasvahasalo6635
    @joonasvahasalo6635 Před 13 dny

Because you can't fit 2 graphics cards in the same case anymore?

    • @Brodda-Syd
      @Brodda-Syd Před 4 dny

      Depends on the case.
      My Silverstone Raven RV02 can only fit a Founders 4090 with about 2mm to spare.

  • @Oomlie
    @Oomlie Před 10 dny

NVLink wasn't introduced to consumer cards until the 20 series, not sure why you said 10 series. And NVLink on the consumer side was basically SLI; it didn't operate like the workstation version and pool resources

  • @AsmodeusExx
    @AsmodeusExx Před 12 dny

Building an SLI PC atm... so excited. Hope SLI will come back again...😊

  • @n8spL8
    @n8spL8 Před 14 dny

    i had dual 6800 ultras w/ my fx-55.
    quadish 4870x2's w an fx-60.
    it's nice to just run one card nowadays.

    • @nocturnal0072
      @nocturnal0072 Před 14 dny

      Crossfire with variable refresh was amazing, but came too late. IMO.

  • @chengong388
    @chengong388 Před 2 dny +1

Nobody is getting to the real reason, which is economic. You see, back then most people did not buy the top-end graphics card, let alone two of them; there simply didn't exist this economic class of people who could afford ultra-high-end graphics cards, only a few outliers. So what you do is simply ship some shoddy software that allows the use of two cards, so you can sell two cards to the whales. It just didn't make economic sense to fab a huge GPU for the 1% of PC gamers.
Whereas today, any forum or Discord server has a bunch of guys with a 4090. The gamer kids of the past are now working and have money to throw at GPUs, plus there's a market for ultra-high-end professional graphics for AI and other compute, so now it makes economic sense to just take one of those huge compute cards you're making anyway and turn it into a super-high-end gaming card.
Technology is also a factor, of course. Most people don't know SLI does not pool your VRAM: with two cards at 1GB each in SLI, you still have just 1GB, which was a big problem during the transition to 4K. There's no point having so much GPU power when you are getting huge stutters because you don't have enough VRAM to render the resolution you want. SLI also works very similarly to frame gen: it doubles your frame rate but increases input delay, because you aren't rendering frames faster, you're rendering two frames on two cards. So today the only difference between SLI and frame gen is that the alternated frames are perfectly rendered instead of AI-interpolated. Good luck seeing the difference.

  • @JerryNeddler
    @JerryNeddler Před 15 hodinami

Subscribed. I'm very happy that you told the truth about SLI and the hypocrisy of claiming frame generation and upscaling are better, when in fact both had problems and both were good. There was no need to eliminate SLI. One of the other reasons is that Nvidia doesn't want you buying older GPUs; they don't want you putting 2 4080s together and beating a 5090 Ti

    • @METAPCs
      @METAPCs  Před 13 hodinami +1

      Thanks for the sub and for watching. Appreciate it!

  • @eliadbu
    @eliadbu Před 15 dny

My last SLI setup was 2 GTX 980 Tis back in 2015-2017. Even back then it was badly optimized for, with few new releases supporting it, and you had to tweak a lot with Nvidia Inspector to make it work. Nobody really cared for it, not Nvidia nor devs, so eventually it met its demise. The RTX 3090 was the last card to support SLI over NVLink, but even then it was primarily for benchmarks.

  • @RareFXChrisC
    @RareFXChrisC Před 4 dny

SLI started with the 3dfx video accelerator cards.

  • @leeebbrell9
    @leeebbrell9 Před 10 dny

One of the GPUs would overheat and then throttle due to heat. Airflow was poor

  • @mainsource8030
    @mainsource8030 Před 9 dny

It's criminal that the moment they started charging 80 dollars for an NVLink bridge and made SLI bridges unusable is the same time they stopped supporting SLI. I bought 2x 2080 Ti and the 80 dollar bridge only to find out the setup was useless

  • @xlr555usa
    @xlr555usa Před 13 dny

    Homey having enough power is never an issue for a playa, just jack in and go. Plus with alt energy and battery tech you can really push the limit and survive a blackout. Get in the game.

  • @felipenachmanowicz9393

I had a GTX 690 and an SLI of 970s. I can say it was very underwhelming. Games always released without SLI support (when you most wanna play them) and only after a good while would get SLI support, of very varying quality.

  • @phantomtec
    @phantomtec Před dnem +2

0:09 into the video, let me stop you there, as SLI died because of:
Money and greed, first and only first. This dictated the turn of events that subdued and pretty much killed "SLI"...
SLI and similar features gave the common man another option when a GPU upgrade would normally be warranted. Instead of buying a shiny new mid-tier $800 GPU, what about doubling your VRAM? Doing so would also cost half to a quarter the price of that shiny new GPU, assuming the card the gamer already owned cost about the same as the new shiny shiny one Nvidia wants to force, or throw, maybe slam, down our throats.
SLI was this option for us mere mortals. That little connector (never used by most) that came with your GPU back in the day let you use SLI on compatible motherboards (most had this feature, or Crossfire for AMD). Nvidia's subsequent move was to ice this feature, first by reserving it for higher-end cards only, then by replacing it with their "new, improved" NVLink. That was also an overpriced monstrosity you now needed to acquire separately (in most cases). The plan then was to move this feature to only the highest-end top-tier consumer card (by that I mean 1 card only), the RTX 3090.
Nvidia was already racing the clock, as AMD and PCIe Gen 5 were about to establish themselves in the mainstream. PCIe Gen 3 and earlier (where you slot in the GPU) were, in the past, not fast enough to effectively use SLI on their own, so the SLI adapter or NVLink adapter was needed to sync the cards and make this feature possible (basic explanation).
But with the cat out of the bag and Intel soon to follow, PCIe Gen 5 very easily gives all current GPUs (including the 4090, which needs at most Gen 4 lanes for full functionality) the necessary speed between GPUs to enable these linked states, and consequently ALL that POWER for mere mortals to ponder and acquire, while Nvidia would be counting less moolah going into their wallets...
Also, the narrative was that those (past-title) games just didn't support SLI anymore ("not much supports it anymore"...), as if devs simply stopped building for the feature set (for whose actual benefit and under whose influence, I'll let you ponder). People? Please.
So now, 0:09 seconds into this video, I apologize for what I have unloaded on all of you, but with that said, if you got this far into my comment, then now, kiddies, you know the truth!
Shamelessly, after reflecting on my own actions, consider going down a rabbit hole with us and check out my new tech channel, linked to my handle here (shamelessly of course!) @phantomtec, just another tech phantom myself. Peace.
P.S. I hope for the love of all things tech that "META PCs" did this truth justice and is not another one of those shills we at phantomtec ironically call "just phantom tech here" (you know, more BS...). No hard feelings...

  • @AdraserPL
    @AdraserPL Před 11 dny +1

Doesn't matter now, when even 1 is so expensive I can't afford it 😂

  • @artmanrom
    @artmanrom Před 4 dny

The owners of gaming rigs with an SLI or Crossfire setup were constantly bickering about games not supporting it, panning the devs over this silly complaint, like when GTA IV launched without the feature: "Bad optimization, it doesn't support SLI or Crossfire!"

    • @kevinerbs2778
      @kevinerbs2778 Před 4 dny

GTA IV was just released in poor form anyway. The game still doesn't run well even on modern hardware; it still drops to like 45 fps randomly.

  • @MelioUmbraBelmont
    @MelioUmbraBelmont Před 12 dny +1

It's easy to blame devs when YOUR documentation SUCKS.
"Killing" the tech is just an excuse to force people to go high end. Think I'm wrong?
Now we've got:
- PCIe Gen 5 (Crossfire needed Gen 3 x4 without any bridge)
- M$oft DirectStorage, so the card can decompress textures straight from the storage device without passing through the CPU
- HAGS (Hardware-Accelerated GPU Scheduling)
- Resizable BAR (read/access more than 256MB)

  • @gamesushi
    @gamesushi Před 4 dny

    I had two Radeon 7970s and it only worked well in like 2 games lol.

  • @aisolutionsindia7138
    @aisolutionsindia7138 Před 2 dny

NVLink was deprecated simply because AMD gave up on Crossfire first; for Nvidia, SLI was just a checkbox to be ticked against the competitor. This is something they would have liked to do earlier but were forced to keep up the charade

  • @pavelstoikov3780
    @pavelstoikov3780 Před 5 dny

Oh boy, in the old days my friend had 2x 980 Ti; that PC was a monster

  • @Lujantony
    @Lujantony Před 9 dny

Imagine the power if today's graphics cards were SLI compatible

  • @nghialekim3539
    @nghialekim3539 Před 12 dny

SLI for budget builds is not an option anymore, because you can get double the performance for double the price in recent years. For example, you can get a 2nd hand GTX 1660 Super for $125-$150, the same price as 2x 2nd hand GTX 1050 Tis at $50-$75 each. 2x 1050 Ti in SLI would never give you 2x the performance, while the 1660 Super can. So why would anyone use SLI for a budget build?

  • @OrientalStories
    @OrientalStories Před 12 dny

It died in consumer products; in enterprise it was rebranded as NVLink

    • @betag24cn
      @betag24cn Před 12 dny

NVLink doesn't do the same thing; it's the same idea but doesn't really work for games

  • @leeebbrell9
    @leeebbrell9 Před 10 dny

VR and SLI had a cool option if devs utilized it, basically one GPU per eye, but lazy devs never used it or did it correctly

  • @taylormarshall3982
    @taylormarshall3982 Před 10 dny

The NVLink pinout is on the 4090, just not the actual hardware; it was legit cut at the last second

  • @iokwong1871
    @iokwong1871 Před 3 dny

I don't miss SLI at all. Like, not even a single bit. LOL. As a Linux user back when SLI was a thing, we got maybe a 5-10 FPS drop when SLI was enabled. You heard me right, a DROP!!! For years I had a second graphics card sitting in my PC doing nothing.

  • @EinSwitzer
    @EinSwitzer Před 4 dny

It took me 2 power supplies to get 2 cards to run at 90-97%

  • @walterchi2655
    @walterchi2655 Před 10 dny

NVLink didn't die. Now it's only used in enterprise servers, like with the A100 or H200

  • @MikeIce4SG
    @MikeIce4SG Před 15 dny +2

    6:26 NAH YOU DID NOT JUST USE YANDEV XDDDDDDDDD

    • @METAPCs
      @METAPCs  Před 15 dny +1

      🤣🤣 Ayyyyyy

    • @MelioUmbraBelmont
      @MelioUmbraBelmont Před 13 dny

Dude, this guy must have blood in his eyes to make that joke.
(I think I will figure out how to enable Crossfire in Unity before Yandere Simulator's release)

  • @cptnibbles
    @cptnibbles Před 11 dny

    I bet path tracing would work much better with 2 4090s in SLI.

  • @Trikipum
    @Trikipum Před 2 dny

NVLink simply makes no sense on consumer cards anymore with the new PCIe 5.0 specifications. The point of NVLink is quick communication; in the old days PCIe was not fast enough, but nowadays it is. NVLink is still used in data centers; it is not just a thing to link 2 graphics cards, it is a communication protocol. The thing is, DX12 comes ready to use multiple GPUs without any linking required, the PCIe bus is enough... the problem is that developers have to actually program for it, and nowadays that's a waste of their time...

  • @AvroBellow
    @AvroBellow Před 2 dny

    What's your problem with Crossfire? I used it for almost a decade and was very happy with it!

    • @METAPCs
      @METAPCs  Před 2 dny

      Nothing, in fact I said that it’s pretty underrated and doesn’t get the credit it deserves.

  • @mostwanted2000
    @mostwanted2000 Před 2 dny

Games should be agnostic of dual video cards. So it means the implementation of SLI/NVLink was bad.

  • @sanji663
    @sanji663 Před 3 dny

Bull.
SLI *AND* Crossfire died due to bad frame times.
Also:
Nearly every generation jump before the 600 series was as big as or bigger than the 1000 series leap.
Where did you research all of this?

  • @terrabyteonetb1628
    @terrabyteonetb1628 Před 14 dny +1

Game devs killed support for it... if a game doesn't use SLI, it's wasted money

  • @truthseeker6532
    @truthseeker6532 Před dnem

    Getting game makers to code for it =They were like ........WHATEVER!!!!!!!
    GET THAT NIGHTMARE MESS............ OUT R FACE!!!!!!!!!!
    -Game Devs from around the world.

  • @gsestream
    @gsestream Před 10 dny

    it did not, it was phased out of consumer products

  • @YAAMW
    @YAAMW Před 2 dny

    SLI or NVLink are done for desktop. That ship has sailed

  • @novafox7091
    @novafox7091 Před 3 dny

Could not imagine my motherboard staying together with not 1 but 2 obese 4080s snapped in. Let alone my wallet.

  • @post-leftluddite
    @post-leftluddite Před 9 dny

    "....dirty, yucky step-child..." explain your reasoning....

  • @NoSpamForYou
    @NoSpamForYou Před 15 dny

    If you have headroom above the target frame rate, micro-stuttering could probably be eliminated by setting a fixed frame rate based on 1% lows or .1% lows. If you can give up several/tens of frames per second for a rock-solid FPS it could hide any asynchronicity.

  • @andrewstewrt2141
    @andrewstewrt2141 Před 15 dny

NVLink is going to be the new SLI; it's being tested in AI, then it'll be used in gaming

    • @nadtz
      @nadtz Před 15 dny

      Not going to be, already is. The 3090 used nvlink and the datacenter cards use it as well. Way past the testing stage with nvlink but I doubt it will return to consumer cards.

    • @andrewstewrt2141
      @andrewstewrt2141 Před 15 dny

@@nadtz There are new cards for datacenters that have 2 chips and use the newest NVLink to function as 1; it's similar to some old cards that were 2 GPUs in SLI on one board

    • @nadtz
      @nadtz Před 15 dny

      @@andrewstewrt2141 Yes, the H200 superchip.

    • @SlyNine
      @SlyNine Před 15 dny

      Nvlink isn't new at all.

    • @SlyNine
      @SlyNine Před 15 dny

​@@andrewstewrt2141 It's very different. SLI made two cards look like one but only recognized the VRAM of one card; the data has to be mirrored between both.
NVLink allows the GPU memory to be used as one huge frame buffer, but the cards still appear as separate cards.

  • @Ultralightaman
    @Ultralightaman Před 13 dny

    Why SLI when you can charge 5x for 1 card.

  • @Mr.QuantumC0re
    @Mr.QuantumC0re Před 12 dny

SLI is good for rendering though.

  • @NKO_8188
    @NKO_8188 Před 12 dny +1

Nvidia wants you to upgrade every gen they release; that's why SLI is dead

  • @xlr555usa
    @xlr555usa Před 13 dny

    Because strapped leather boy wants to control gpu supply, we are getting down to 1nm, we need to find ways to kick ass raw dog style. Im working with Intel using their Deep Link tech, go deep for 6 or go home.

  • @KN-592
    @KN-592 Před 15 dny +1

Would two 4060s be better than one 4070-4080?

    • @DenverStarkey
      @DenverStarkey Před 15 dny +2

Definitely better than the 4070 or the 4070 Super, but the 4070 Ti and Ti Super, as well as the 4080, would still stomp it.

    • @arenzricodexd4409
      @arenzricodexd4409 Před 15 dny +1

Maybe it could reach the 4070 with good scaling, but it would not reach the 4080. And even if SLI were still supported today, many games would not work with it. That's another issue with SLI: people need their games to work on it to even consider it. And that's not counting the issues that can arise with multi-GPU that otherwise wouldn't with a single GPU.

    • @kevinerbs2778
      @kevinerbs2778 Před 5 dny

@@arenzricodexd4409 DLSS has even more requirements than SLI did.
Lower cards scaled better because the CPU can handle more split-up work than feeding one large single workload.

  • @skivvywaver
    @skivvywaver Před 8 dny

Micro-stutters were the reason I was happy to see multi-card rigs go extinct. SLI and Crossfire both suffered from them. You'd be playing along at 90 FPS and get a hiccup that freeze-frames for even a quarter second, and you feel like you got slapped upside the head. I don't care what they do with multi-card setups; I have already played that game. It was cool that you could buy 2 cheap cards to equal one good card. Sometimes it worked, sometimes it didn't.

    • @kevinerbs2778
      @kevinerbs2778 Před 5 dny +1

Micro-stuttering is still happening with single cards. There were like 14 triple-A games in 2023 that released for DX12, only using a single card, and all of those games had micro-stuttering/stuttering.

  • @cy4n145
    @cy4n145 Před 3 dny

    The GTX 1060 doesn't support SLI

    • @MauriBT
      @MauriBT Před 2 dny

      YES with modded drivers

  • @RetiefBoshoff
    @RetiefBoshoff Před 3 dny

OK, 90% through your video, and you and Nvidia missed a sales pitch. Hold my beer: imagine NVLink being viable, not for games but for generative AI on PCs. Meaning you use NVLink and run, for example, 2x RTX 4080s and get double the VRAM and double the tensor cores for generative AI on PCs. I would love that; it would double the performance in, for example, SDXL