Milking Buyers' Wallets: Nvidia GeForce RTX 3090 Ti Review

  • Uploaded 12 Sep 2024

Comments • 1.7K

  • @Hardwareunboxed • 2 years ago • +625

    1:38 - I meant 24GB! It was a rush to get this done so get off my back :D

    • @zackfair1868 • 2 years ago • +41

      Glad this correction is here, I was like, really, 24 MB?

    • @meyertechrants3589 • 2 years ago • +32

      I came to the comments to poke fun at that, so you're lucky I saw this right away :P

    • @Hardwareunboxed • 2 years ago • +58

      @ZACK FAIR yeah you would have never ever worked it out otherwise :P

    • @Hardwareunboxed • 2 years ago • +54

      @Meyer Tech Rants major bullet dodged ;)

    • @jrherita • 2 years ago • +9

      Lol, 24 megabytes would be whopping

  • @Jeffmorgan83 • 2 years ago • +871

    "I'm not sure you'll notice the difference going from 123 fps to 138 fps at 4K. But you'll probably notice the $500+ missing from your wallet" -Hardware Unboxed 2022. Haha, love you guys!

    • @goukenslay7555 • 2 years ago • +4

      Me who doesn't carry cash - as if.

    • @GreasyFox • 2 years ago • +12

      Those 500 dollars can get you a nice cooler to tame the hot GPU beast, instead of getting a meager 10 percent frame rate increase and torturing your PSU.

    • @DRAG0N1012 • 2 years ago

      Problem is it's gonna get worse: more price, less performance. Next gen is supposed to be 600W a card, which is a lot. They've hit a wall, not only with their design but with power limits, because they aren't updating the design to something that's both cheaper and delivers more performance for less power draw.

    • @TheRobstar1983 • 2 years ago • +4

      People who buy a 3090 shouldn't be worried about $500, and if they are, they're buying the wrong card.

    • @IslamGhunym • 2 years ago • +2

      It's never been a matter of a couple of frames. A pro FPS player knows that stable input latency is what matters most in competitive FPS games, and stable latency comes from a stable frame rate 99% of the time. I don't think a 3090 Ti would increase that count much over a 3090.
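The fps figures quoted from the review translate to a surprisingly small frame-time difference, which is what latency-sensitive players actually feel. A quick sketch (the 123/138 fps numbers come from the quote; everything else is basic arithmetic):

```python
# Frame-time gap between the two frame rates quoted from the review
# (123 fps vs 138 fps at 4K): a ~12% fps uplift is under 1 ms per frame.
fps_3090, fps_3090ti = 123, 138
ft_3090 = 1000 / fps_3090        # ms per frame, ~8.13
ft_3090ti = 1000 / fps_3090ti    # ms per frame, ~7.25

print(f"Uplift: {fps_3090ti / fps_3090 - 1:.1%}")          # ~12.2%
print(f"Frame time saved: {ft_3090 - ft_3090ti:.2f} ms")   # ~0.88 ms
```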

  • @MinosML • 2 years ago • +222

    "I'm not sure you'll really notice going from 123 fps to 138 fps at 4k, but you'll probably notice the $500+ missing from your wallet"
    Lmao masterful writing there Steve, thanks as always for the consumer oriented review, these really aren't worth a damn to 99.9% of the market.

    • @Delver639 • 2 years ago • +17

      Forgot to mention how much you're going to lose month after month in energy bills

    • @Koozwad • 2 years ago • +7

      You forgot the last 0.1%.

    • @post-leftluddite • 2 years ago • +1

      I wish Hardware Unboxed would actually test how big a difference in FPS has to be for the average human to even notice, so it could finally be shown that one product beating another by 5% is pointless and you should buy on other criteria.

    • @Koozwad • 2 years ago • +1

      @@post-leftluddite Not sure how useful that would be, seeing as cards are meant to be future-proofed. Heck, I bought a 3090 near launch and lo and behold it can barely run Cyberpunk (maxed out, but still). Sure, the optimisation there ain't the best, but the game came out within months of the top consumer gaming card. Now imagine a couple of years down the road.

  • @SleepyFen • 2 years ago • +103

    The fact that this thing draws upwards of 3 times the power of a 1080 is absolutely insane to me.
    If you can't compete on architecture you can always draw more power, I guess. Just insane.

    • @ToadstedCroaks • 2 years ago • +14

      Imagine selling someone something that just uses more fuel as an upgrade.
      "You want Ferrari, yes? Sell you Honda Civic with Honda Civic pushing it. Just as good."

    • @JZF629 • 2 years ago • +9

      Intel school of thought

  • @Dan-uf2vh • 2 years ago • +428

    3090 Ti 6 months away from the 4000 series is REALLY, really not worth it

    • @FilmGamerHUN • 2 years ago • +28

      RTX 4000 leaks say they'll eat a power plant, so that will be really bad... AMD and Intel may be able to do much better this time! AMD will have the first multi-chiplet design.

    • @hongockimquang1994 • 2 years ago • +9

      I always thought the 3090 and 3090 Ti were never meant for us normies, but for those using the GPU for work who still want to game. For them, even a 3090 Ti is still cheaper than a comparable RTX A-series card.

    • @Dan-uf2vh • 2 years ago

      @@FilmGamerHUN I got an 850W Gold from Seasonic even though I only have a 12700KF with a 3060 Ti, with the idea that I'll upgrade to maybe a 4070 or 4080. I think the power limit will make that decision for me ^^

    • @Zero11s • 2 years ago • +2

      it is cooler/power-draw preparation

    • @promc2890 • 2 years ago

      @@silverwerewolf975 it most probably can't, since it's gonna be like a 300-watt TDP

  • @nipa5961 • 2 years ago • +81

    4-7% faster for a 33% higher price...
    while consuming more than 100W more...
    and coming more than 18 months later...
    Nvidia surely knows how to disappoint even the lowest expectations.

    • @H0don • 2 years ago • +6

      They will sell these cards anyway so nothing more to say about it.

    • @Safetytrousers • 2 years ago

      Were you hoping to buy a 3090ti?

    • @andersjjensen • 2 years ago • +7

      They didn't disappoint my expectations AT ALL. At this point I expect them to lie through their teeth, manipulate benchmark scores through their drivers, coerce/threaten/bribe reviewers, employ every known anti-consumer practice and come up with a few new ones, stab their business partners in the back publicly, and milk the market drier than the Mojave Desert, all while pretending to be gamers' biggest friends. They have not disappointed me ONCE in the past two decades....
      TL;DR: Your expectations are NOWHERE NEAR low enough for Nvidia, and you'd best soon learn the old saying "Fool me once, shame on you. Fool me twice, shame on me!" :P

    • @fwef7445 • 2 years ago

      Don't forget this thing is going to raise your electricity bill from the power consumption, so it'll work out even more expensive in the end

    • @nipa5961 • 2 years ago

      @@2buckgeo843 Quite the opposite, it was visionary.
      Every single card they released this year was a huge disappointment.
      The GTX 1630 is even worse than a 6500 XT and an A380 put together, while being more expensive...
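The value math in the comment that opened this thread ("4-7% faster for a 33% higher price") can be sanity-checked with a quick sketch. The MSRPs below ($1,499 for the 3090, $1,999 for the 3090 Ti) are assumed launch prices, and the 6% uplift is the midpoint of the quoted range:

```python
# Rough perf-per-dollar comparison, RTX 3090 vs 3090 Ti,
# using the figures quoted in the comment (illustrative only).
msrp_3090, msrp_3090ti = 1499, 1999      # USD, assumed launch MSRPs
perf_uplift = 0.06                        # midpoint of the quoted 4-7%
price_uplift = msrp_3090ti / msrp_3090 - 1

print(f"Price increase: {price_uplift:.0%}")                            # ~33%
print(f"Perf per dollar vs 3090: {(1 + perf_uplift) / (1 + price_uplift):.2f}x")
```

Under these assumptions the Ti delivers roughly 0.79x the performance per dollar of the plain 3090, which is the comment's point in one number.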

  • @ludvik1378 • 2 years ago • +487

    I can't imagine how NVIDIA expects people to react in any way other than negatively to this launch. After 18 months this is laughable.

    • @MMK86 • 2 years ago • +68

      "hey guys, NVIDIA here, our feet still haven't touched the ground from all the record profits, we know there's been a global shortage and we fucked over most of our customer base for 2 years but check out this sick new card that doesn't fix any of those problems!"

    • @solocamo3654 • 2 years ago • +19

      Yet it will still sell out, sadly. Both camps will be increasing MSRP next round, just wait.

    • @kamiladamski5715 • 2 years ago • +10

      It's mostly for bragging rights. They know AMD got pretty competitive performance with RDNA2 and RDNA3 looks promising, hence they release this monstrosity to claim the "performance crown". No one reasonable would buy this card, but someone who is not interested in technology and has unlimited money can buy it anyway. Moreover, customers get mentally prepared for the power draw and price of the 4000 series. This is the logic behind it.

    • @ployth9000 • 2 years ago

      lol, sell the current 3090 to make up for the price, all is good

    • @FIVESTRZ • 2 years ago • +1

      Think they may end up as the new 6900 XT of Newegg: readily available whenever you're ready, lol. Unless mining goes crazy again; then the memory-temps fix is going to be nice for those folks, and even for some gamers looking to push more performance.

  • @MechE11B • 2 years ago • +8

    I upgrade graphics cards every 6-8 years or so. Still rocking a 980ti. I pulled the trigger on the 3090ti after the price cut to $1400. I'll upgrade again in another 6-8 years. =P

    • @parisarnett87 • 2 years ago

      1099

    • @xeonmainframerx7 • 2 years ago

      999, and of the same mind as OP

    • @bigdaddyschin5692 • a year ago

      I have a 3060 now. But I wanna do a 5900x and 3090ti LC build before everything new drops. Play games at 4K for the next 4-5 years then do it again.

    • @alarson1799 • a year ago

      Which version of the 3090ti did you get, and how has it been working for you?

    • @xeonmainframerx7 • a year ago

      @@alarson1799 EVGA FTW3 Ultra Gaming, with an EK Quantum Vector2 block equipped, and it has been flawless, like all of my past EVGA products.
      It's a shame they have now stopped doing graphics cards; they were my exclusive source for Nvidia cards.

  • @theigpugamer • 2 years ago • +305

    Ti went from being the enthusiast card, like the 1080 Ti, to a price-milked card where the Ti stands for "tiny improvement", as in the 2080 Ti and now the 3090 Ti.

    • @Krazie-Ivan • 2 years ago • +93

      props for "Ti" = "tiny improvement"
      gold.

    • @RobBCactive • 2 years ago • +23

      Call them "Pi" for Price Increase

    • @theigpugamer • 2 years ago • +13

      @@RobBCactive the 3090 Pi, and imagine if they announced it as the 3090 Pie;
      that would be hilarious xD

    • @smudlicko • 2 years ago • +18

      The only true Ti card this gen is the 3060 Ti

    • @MLWJ1993 • 2 years ago • +4

      If being the best GPU money can buy suits your fancy, this card is definitely "enthusiast" grade. The 1080 Ti was the only real outlier, since it held up very well for a long period without costing that much. It was still worse value than the 1080 at the time, though.

  • @starshepherd11 • 2 years ago • +7

    I simply don't understand how anyone can live with a 700W+ PC next to them. When I ran my overclocked FX 8350 and RX 570 build (450W), I couldn't go for more than an hour without my room getting uncomfortably hot in Melbourne 25°C weather, and I couldn't even use the PC on the hotter summer days... and I'm in a south-facing room with zero direct sun.

    • @joelr1096 • 2 years ago • +2

      oh shit bruh, same here in Perth: OC'd X99 5960X build with a 2080 Ti running flat out, and it's a literal space heater in the middle of a hot Australian summer lmfao

    • @matthewleon4378 • 2 years ago • +1

      @@joelr1096 Do you guys not do liquid cooling?

    • @fwef7445 • 2 years ago • +1

      Imagine the electricity bill also

    • @starshepherd11 • 2 years ago • +2

      @@matthewleon4378 I don't bother with it, but with respect to total heat output, water cooling doesn't change anything. 700W is still 700W; you can look up the specific heat formula Q = mcΔT.
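The Q = mcΔT point above can be made concrete: water cooling only relocates heat within the case, and every watt the PC draws still ends up warming the room. A rough sketch of the idealized heating rate (the room dimensions and the no-heat-loss assumption are illustrative; real rooms leak heat through walls and ventilation, so this is an upper bound):

```python
# Idealized sealed-room heating rate from a 700 W PC (no losses),
# using Q = m * c * dT, i.e. dT/dt = P / (m * c).
# Assumed room: 4 m x 4 m x 2.5 m of air (illustrative numbers).
P = 700.0               # W, total system draw -> heat dumped into the room
volume = 4 * 4 * 2.5    # m^3 of air
m = volume * 1.2        # kg (air density ~1.2 kg/m^3)
c = 1005.0              # J/(kg*K), specific heat capacity of air

rate = P / (m * c)      # K per second
print(f"{rate * 3600:.0f} K per hour (upper bound; real rooms leak heat)")
```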

  • @biggranny000 • 2 years ago • +651

    Graphics cards nowadays are pulling way too much power by themselves. I was surprised to see how much more efficient AMD is.

    • @AmdUlkoinen • 2 years ago • +95

      This is the main reason Pascal was so great: very good performance per watt (Zotac made a GTX 1080 Ti Mini, only 2 slots thick and only 211mm long). Nowadays video cards are like 320mm long, 3 slots thick, and use 400W+ of power. But hey, at least they're so large they can fit a load of RGB and LCD screens!

    • @0101_root • 2 years ago • +2

      @@shreyasdharashivkar8027 Slow? Where?

    • @amashaziz2212 • 2 years ago • +5

      r u a bot? serious question.

    • @PuppetierMaster • 2 years ago • +9

      AMD isn't any better; they have a capped number of fabbed GPUs and forget people might want to buy them. I blame the console rush of 2020. Those playschool consoles did not need anything close to what they got; people who have consoles don't care about RTX even in the slightest, so it was a big waste of time that oversaturated the market with _"next gen"_ graphics. I mean, it would be nice if the PS5 or Xbox could play Crysis 1. That would be cool.

    • @TheFather_ • 2 years ago • +12

      As far as I know, the 3000 series is based on Samsung's 8nm node, which is not that efficient compared to TSMC. The next 4000 series will be based on TSMC's 5nm process node; this should give a huge boost in performance while consuming similar or less power than the current 3000 series. That's why everyone is waiting for the new series.

  • @ItsReaxxion • 2 years ago • +17

    The 3090 Ti is like the beta-test version of the ever more power-hungry next-gen GPUs. Tbh I'm more interested to see how different AIBs will handle and deal with 500W or even 600W next-gen GPUs than in the 3090 Ti itself.

    • @haukionkannel • 2 years ago • +4

      So true. I am really waiting for those 600W monsters, so that I can avoid buying them at $3500+
      :)

    • @ToadstedCroaks • 2 years ago

      I've never been so open to buying the mid/entry-level xx60 cards until the last few years. Every series since the 10xx has been "Well... I could buy an xx80 or higher card... oooor just buy the new xx60 coming out this year and save myself 75% of the cost for the same performance."
      I still get the xx70 anyway... but the premise still stands. The consumer value of high-end cards has been lost for years. The last Ti card I got was back in the 700 series, when they weren't stupid expensive and you could actually skip a generation or two before getting another one.

    • @Sucosama • 2 years ago

      @@haukionkannel plus $500 for a shiny psu

    • @bobsort • 2 years ago

      Well, I think it is mostly beta testing to see how much gamers are willing to pay for just a 5~12% increase in performance. Not counting the 30% increase in power usage!

    • @ItsReaxxion • 2 years ago

      @@bobsort I meant the power consumption, not the barely increased framerate

  • @DrearierSpider1 • 2 years ago • +305

    These cards have 2 purposes: milking more of your cash, and serving as the 450W lube so you're ready for the 600W+ violations to your PC that will be Hopper cards.

    • @sherlocksinha2435 • 2 years ago • +27

      Man this card draws as much power as dual gpu cards like the R9 295X2

    • @Soutar3DG • 2 years ago • +33

      Considering there have been 2kW PSUs released over the past couple of weeks as well. It'd be funny if a consumer law came down from the EU stating that an individual's PC, be it the components themselves or the PC as a whole, shouldn't use more than a certain amount of power, ya know, for environmental reasons.

    • @DrearierSpider1 • 2 years ago • +16

      @@Soutar3DG Oh, you'll get a carbon tax while your "leaders" fly to Brussels on private jets on a weekly basis. Biden would do the same if he had enough votes in Congress too.

    • @ThunderingRoar • 2 years ago • +29

      @@Soutar3DG if they cared about that they would have banned mining

    • @concinnus • 2 years ago • +12

      Hopper is the compute card, probably up to 800W. GPUs are Lovelace, maybe up to 600W.

  • @captante9889 • 2 years ago • +64

    Never thought I'd find myself rooting for Intel.... I really hope their GPUs are good.

    • @YourSkyliner • 2 years ago • +2

      I figure if they were decent they would have launched them by now.

    • @ryanwallace983 • 2 years ago • +3

      @@YourSkyliner it's the drivers they're working on; I expect the hardware is reasonably solid

    • @odizzido • 2 years ago

      Same here. I hope Intel comes in swinging... but of course they'll probably release a 500W card that performs like a 3070 for 1200 dollars. It's Intel. I fully expect Intel's GPUs to be bad, but a small part of me foolishly hopes for something good.

    • @Atlanticmantic • 2 years ago

      Intel will rape you with the pricing also for a card that doesn't perform as well... IT'S INTEL!

    • @sean8102 • 2 years ago

      I def hope they're good. The more competition the better. Plus, with Intel actually owning some of its own fabs, unlike AMD and Nvidia, I'm curious to see whether that helps them meet demand (assuming their first gen of cards is good enough to become popular).

  • @riven4121 • 2 years ago • +318

    Remember when Ti cards gave a significant uplift over the base models?

    • @DrearierSpider1 • 2 years ago • +16

      Gotta look at the GPU die. 1080 was GP104, and the 2080 was TU104, while the 80 Ti models were 102 dies that generation. This gen, only the 3060 has a smaller die than its Ti counterpart.

    • @zfacersane272 • 2 years ago • +20

      Like the 3060Ti

    • @Dangerman-zg3ui • 2 years ago • +1

      TBF the 980 & 1080 Tis were basically different dies. Nvidia started naming 80-series cards after more mid/performance-range dies starting with the 600 series (Kepler).

    • @idontwatchtv • 2 years ago • +4

      Yep I remember 780 owners figuratively getting slapped in the face by Jensen when the 780 Ti came out to compete against the 290X and it was still not able to fully beat it.

    • @TheRealXXDarknezz • 2 years ago • +8

      Well, my RTX 3060 Ti is worth it compared to the 3060 ;)
      But as far as I know it's probably the only one in the current Nvidia gen.

  • @jtmiddelkoop • 2 years ago • +5

    Well done, HW Unboxed. This review was brutal for all the right reasons! Keep it up!

  • @roccociccone597 • 2 years ago • +149

    This gives me 11900k vibes, considering rtx 4000 is not too far away.

    • @ChrisSS1LE • 2 years ago • +7

      Except the 11900K had literally the exact same gaming performance as the 10900K. At least the 3090 Ti is better than the 3090.

    • @maegnificant • 2 years ago • +2

      @@ChrisSS1LE 3090 my man

    • @solocamo3654 • 2 years ago • +6

      @@ChrisSS1LE Yeah, they traded blows. But it was embarrassing how many times the 10900K outperformed it. Also, 10c/20t vs 8c/16t and less L3 cache was a major downgrade. If the 11900K had been 10c/20t with at least 20MB of cache it would have fared a bit better.

    • @ChrisSS1LE • 2 years ago • +1

      @@maegnificant edited it, I was stuck in the future

    • @roccociccone597 • 2 years ago

      @@ChrisSS1LE I mean technically it’s still the same generation

  • @JWooden271 • 2 years ago

    22:14 That Spaceball One outro had me expecting a sticky note on the end saying, "WE BRAKE FOR NOBODY"

  • @technetium_tech • 2 years ago • +30

    Get ready for 5 slot, $3000 MSRP Lovelace GPUs!

    • @Z3t487 • 2 years ago • +1

      F*ck, I thought that price was just supposed to be a meme!

    • @haukionkannel • 2 years ago • +1

      You mean $3000 at release and $3500+ in real life ;)

  • @MrMcGreed • 2 years ago

    Loving the spatulas @18:40 - nice touch

  • @opbush5272 • 2 years ago • +53

    Woah! A new PSU destroyer, a house-fire generator, and a money stealer all in one purchase!

    • @masternoel123 • 2 years ago • +4

      You'd love to see it

    • @notchipotle • 2 years ago • +1

      you can get a decent amount of cocaine for $2000

    • @Ragnarok540 • 2 years ago

      Maybe they are designed for crypto bros.

  • @lixlax1494 • 2 years ago • +6

    I feel like this was only launched to make RTX 4000 power consumption and efficiency look less bad (whenever those things launch).
    With the next gen rumoured to be ~600W at the top end and AMD also going for 400W+, GPUs are reaching stupid levels of power draw. It wasn't so long ago that I felt 275-300W was already too much. Sadly, the lower end will go up in power as well.

    • @mapesdhs597 • 2 years ago

      It would be hilarious if people buy these things, especially the upper 4000 series, and then find they can't afford to run them because of skyrocketing energy costs (upfront power draw, but also AC for those in warmer locales).

    • @fwef7445 • 2 years ago • +1

      Stick to mid-tier cards: more affordable, less power consumption, and they still outperform the new consoles. These top-tier cards aren't worth it.

  • @ARCHarrison • 2 years ago • +41

    I almost bought a 3090 last week when I saw one for under $2,000 USD for the first time in forever, but I resisted and decided to wait for next gen. Seeing the 3090 Ti and the absolute cash grab it represents just makes me want to wait harder.

    • @thegreatotaku118 • 2 years ago • +6

      Unless you don't have any working GPU, just wait for a more reasonable price.

    • @Pand0rasAct0r_ • 2 years ago • +3

      Yeah, the 4070 is likely to be better than this, cost less, and draw less power. So who would be crazy enough to buy this?

    • @damara2268 • 2 years ago

      Do you know the 3090 Ti is only a preparation for what's coming?
      The 4090 will be 600W, with an MSRP probably also at $2000.
      Still a better buy than this, of course, because the performance will be at least 50% better.

    • @spookyskellyskeleton609 • 2 years ago • +1

      @@damara2268 You think it will be near MSRP?

    • @s.muller8688 • 2 years ago

      It is absolutely astonishing how incredibly numb you people are. The very reason the prices are as high as they are is that people like you keep buying the crap they put out. So if there's anything dumb about this card and the situation in the video card business, it is the consumer.

  • @TheMasterOfSafari • 2 years ago • +4

    To be honest, in my opinion Nvidia is just testing the waters in preparation for the RTX 40 series, as it's rumored to use 600W on the 4090 SKU.

    • @JZF629 • 2 years ago

      I've heard this. It's probably true, though I'm still hoping it won't be. I just got a new PSU thinking it would last me for years to come... I WAS WRONG ABOUT THAT...

  • @obeliskt1024 • 2 years ago • +23

    I guess "Winter is coming" has never been more real, because you're damn well gonna need a chilly environment to not sweat your arse off with the heat this thing puts out.
    Good luck to the next-gen 40 series high end.

    • @sean8102 • 2 years ago

      Rumors should always be taken with a grain of salt, of course, but all the rumors/leaks are indicating the 4000 series is going to be a power hog.

  • @billybobjoeftw • 2 years ago • +5

    How can a GPU that's about 1.5x faster than the 2080ti at roughly 1.5x the price (at MSRP) released 3.5 years later not be a great deal Steve?

    • @andersjjensen • 2 years ago • +3

      "Hello? I'd like to report a murder... Yes, I know his name... billybobjoeftw... yes, Sir, it happened right in front of my eyes... on YouTube... yes, Sir, in the comment section in broad daylight... no, Sir, I'm not acquainted with him... The victim?... Well, I'm not 100% sure, because there's not much left of him, but the bloody leather jacket suggests to me that it's Jensen Huang"

  • @AndreasRavnestad • 2 years ago • +163

    With power usage in mind it looks like 6900 XT has really carved out a space for itself. Would be interesting to see how a 6900XT would fare if pushed to the same power levels as a 3090 Ti.

    • @geerstyresoil3136 • 2 years ago • +34

      Very well; there are significant overclocking gains between 350-400 watts when using MorePowerTool on my water-cooling setup. The 6900 XT is very conservative at stock clocks, and it's breaking 3DMark records under liquid nitrogen.

    • @Theafter. • 2 years ago • +13

      I have become a bit of an AMD fanboy recently, and I'd love to see how powerful a 450-watt 6900 XT would be 😂😂 It would be so overkill

    • @GewelReal • 2 years ago • +10

      Let's wait for 6950 XT

    • @stabhappy6296 • 2 years ago • +5

      @@Theafter. Maybe look at the 6900 XTX, a prebuilt-exclusive 6900 XT with a higher TDP and an AIO.

    • @sonicboy678 • 2 years ago • +4

      Not to mention that it costs far less...

  • @anthonywong7004 • 2 years ago • +19

    At this point, what the AIBs have done is even more impressive.
    I mean, sure, the cards are huge, but you're talking about air-cooling 450-500W; making that actually possible is quite an engineering feat.

    • @ToadstedCroaks • 2 years ago

      Most of that card's bulk is the cooling element, just like always. People who don't regularly take their cards apart don't notice that. Most video card boards are pretty small/thin, with a massive heat plate covering them and a bulky heat pipe/fin design with 1-3 small fans.
      There are some elements of design that can make a radiator more efficient, which is how 3rd-party custom coolers make their money. But generally speaking, if you want more cooling, you just attach more metal and more airflow. It's not really an engineering marvel; it's 100-year-old tech.

    • @DarkReturns1 • 2 years ago

      @@ToadstedCroaks did you know the grass is green? Amazing

  • @generalgrevous1984 • 2 years ago • +90

    Sure feels like Intel hubris: charging an arm and a leg for struggling tech. Intel learned its lesson but Nvidia looks to be picking up the torch where Intel left off: overcharging and underdelivering.

    • @dagarath • 2 years ago • +12

      Not at all; it's a company that internally has no interest in gamers, bordering on disgust, and is simply using them as a free form of advertising for its real market: miners and machine learning/AI.

    • @TheXev • 2 years ago • +5

      The perception of "AMD has bad drivers because I heard..." is still something AMD is going to struggle with for years; it's been going on since the late 90's. I don't know how AMD can overcome that perception... AMD needs to get serious about using streaming as part of its marketing, as many gamers look at their favorite streamers' hardware.

    • @ELXatrix • 2 years ago • +1

      @@TheXev And I don't know where that comes from. I've only had ATI/AMD cards since I switched from my Voodoo3 and never really had any problems, apart from some issues trying out CrossFire just for shits and giggles, which admittedly wasn't a great experience.

    • @qwkimball • 2 years ago • +1

      It's a halo product. The sole purpose of its existence is to back up the claim that Nvidia has the fastest gaming card on the planet. They don't give a damn if it makes a dollar.

    • @arurueruru9177 • 2 years ago • +1

      @@qwkimball I have heard some people say that Jensen is quite competitive in that regard. Leather-jacket man really doesn't like it when their performance crown is taken away. And considering their history and their behaviour whenever AMD managed to be competitive, that lends credibility to the rumour.

  • @napperneo4516 • 2 years ago • +27

    I was able to get a Red Devil Ultimate 6900 XT for $1,200. Overkill for gaming, but it cost me less than some 3080s and keeps up with or surpasses the 3090. Happy to finally not worry about a GPU for years to come.

    • @israelpereda8507 • 2 years ago • +5

      Same. I got a Gaming OC 6900 XT from someone who didn't know how to properly power it and said it was dead. No need for an upgrade for many years.

    • @FrazzleCat • 2 years ago • +5

      Mhm, same, and the same card as well. I don't even want to think about the money I spent on the thing; it's actually literally stupid. But I have a GPU for several years to come, especially with RSR and FSR, so whatever. One and done, and for hundreds less than the 3X cards.

    • @sean8102 • 2 years ago • +3

      I was more than down to give AMD a try this gen, but I couldn't find any AMD board partners that had a waiting list system like EVGA. I was tired of getting in stock alerts over and over only to have the site crash or prevent me from checking out in some way until the cards were gone. I finally got a RTX 3080 from EVGA when they emailed me 1 year later after getting on the waiting list. I was upgrading from a GTX 1080 which was just starting to show its age a bit at 1440p in new releases so I feel set for at least 2 generations.

    • @JZF629 • 2 years ago • +1

      That’s the price it SHOULD be at finally. Glad to hear it!

  • @Gamingtechgg • 2 years ago • +29

    I love the smoke effect, well done Hardware Unboxed :) sick work

    • @nonholographic • 2 years ago

      That's just the gpu overheating

    • @Mandrag0ras • 2 years ago

      @@nonholographic And because it's an Nvidia card, it releases green smoke.

  • @crookim • 2 years ago • +4

    This is the reason why I haven't replaced my old RX 570 GPU: these prices are ridiculous, and the amount of energy this thing consumes is insane!

  • @CESAR_H_ARIAS • 2 years ago • +18

    I bet you this GPU alone draws more power than my 5800X & RX 6800 combined 🤣🤣🤣
    Edit: 9:49 I was *not* wrong. Lord Christ have mercy

    • @Exomious • 2 years ago • +1

      Not to mention it's like 25 percent faster at 1440p for like 3x the price of the 6800 lmao 🤣

    • @CESAR_H_ARIAS • 2 years ago • +1

      @@Exomious Fun fact: I can max out my 144Hz panel with the games I'm playing using Radeon Chill and FPS locks (no need to waste power and dump more heat into my ITX case/radiators)

    • @hendaben • 2 years ago

      LOL

  • @muhammad_haider_pervaiz • 2 years ago • +1

    7:40 This roast almost killed me 😂

  • @xlinnaeus • 2 years ago • +63

    This is really just a great opportunity to hit a smash at what the fuck gamers have had to endure over the last 2 years. Thanks guys. Really appreciate all the exhaustive work you've both put in over the last couple of years. Cheers. I approve of this editorial direction!

  • @TheGeorkef • 2 years ago

    10:02 The smoke in the background right after talking about energy consumption is kind of symbolic :D

  • @thomasni123 • 2 years ago • +61

    I wonder how close a 3090 would get to the Ti if they were given the same power limits.

    • @lordofthemfl9899 • 2 years ago • +19

      Almost the same, tbh

    • @mavericksyao708
      @mavericksyao708 Před 2 lety +11

      2.5% more cores so 2.5%

    • @lupintheiii3055
      @lupintheiii3055 Před 2 lety +23

      Exactly the same performance.
      The GA102 chip is already a meme; there is nothing much else you can squeeze from that lemon.

    • @xlinnaeus
      @xlinnaeus Před 2 lety +2

      @@lupintheiii3055 So true.

    • @georgevul3
      @georgevul3 Před 2 lety

      As we can see, Samsung's 8nm node has trash perf/power scaling

  • @tonyg8471
    @tonyg8471 Před 2 lety +18

    Steve, thanks for your typically thorough and insightful product review. This 3090-Ti is simply a cash grab by Nvidia, and I have no problem with that. There are always a small handful of people with more money than brains, and if they want to spread that cash around...go ahead. I applaud Nvidia for giving those dim bulbs the avenue they seek. If you hold it up against the light at just the right angle...maybe some content creator can justify the purchase of a 3090-Ti? Meanwhile, back on Earth, this is not a real product. What the market needs is a huge influx of Intel desktop GPUs, at prices enticingly below the existing market. This would bring correction to the GPU market and that would actually be worth reporting. I'm an AMD fan since the 90's and I've never built an Intel PC. But if their GPUs are competitive, I'm going to buy one as a middle finger salute to Nvidia and AMD.

    • @razorphone77
      @razorphone77 Před 2 lety +1

      As much as it's a cash grab, I really can't see them grabbing anyone's cash. Like, who would buy this? It's utterly ridiculous

    • @Pesticide7G
      @Pesticide7G Před 2 lety +1

      @@razorphone77 I have 2x 3090 Ti in SLI. I'll give the PC to my 11-year-old nephew when the 40 series comes out, so I really don't care. I'm just hyped for the 40 series 😂

  • @sirjonsmithiii992
    @sirjonsmithiii992 Před 2 lety +10

    I am very grateful that I was able to get a 3080 TUF at MSRP (took me two months of super clicking and getting lucky on Amazon for $759). I have enjoyed the heck out of it and always thought I was set. That was a long time ago and it is still crazy. I have little hope that the 4000s will be any different.

    • @MrDoctorBryce
      @MrDoctorBryce Před 2 lety

      @NICULL I have a 6800 XT from week 2 of the AMD queue and it's served me well thus far; I probably won't need to upgrade for a lot of years, it runs everything I throw at it

    • @MLWJ1993
      @MLWJ1993 Před 2 lety

      @NICULL Even a 2060 is going to be fine as long as DLSS is integrated in games (or anything that can be on par visually for the same performance gains).
      At the same time you'll be able to look at how raytracing improves over time in games, even if that's not what you're using it for in actual gameplay.
      By the time it's underperforming, I'd hope the market has somewhat recovered to sane levels 😅

    • @ToadstedCroaks
      @ToadstedCroaks Před 2 lety

      @NICULL I'm pretty sure upgrading to a 4060 would be a huge improvement over a 2060, considering it's rumored to be as good as a 3090/Ti. That's like ... $300 or less. Way reasonable for something that will use the newer "tech" more efficiently, or at all, compared to a 2000 series card. I have a 2070 myself, and had no interest in jumping on the 3000 series carnival wagon. The 4000s though look more appealing, if they hold up to the rumors and the prices stay at reasonable msrp. I'm even thinking of jumping ship and going with amd's new cards because of power consumption concerns and pricing. Plus it seems all the monitors I buy never have Gsync, lol.

    • @sean8102
      @sean8102 Před 2 lety +1

      Same. I was able to get a EVGA 3080 Hybrid for MSRP, but it took a year on their waiting list. But I stopped upgrading my GPU every gen a while ago. Heck, I upgraded from a GTX 1080. So it was a night and day difference for me (plus getting all the RTX features for the first time). I expect my 3080 to last me at least as long as my 1080 did.

  • @lebigmac74
    @lebigmac74 Před 2 lety

    That smoke at @10:05 - looks a bit like these cards came directly from hell!😂😂😂

  • @octoman_games
    @octoman_games Před 2 lety +13

    I'm STILL on my 1080Ti ! I love this card!

    • @ezpzjayzaar
      @ezpzjayzaar Před 2 lety +1

      As long as you stay at 1080p you're good for another 5 years.

    • @masternoel123
      @masternoel123 Před 2 lety +1

      Pascal Gpus are legendary

    • @octoman_games
      @octoman_games Před 2 lety

      @@ezpzjayzaar it plays everything maxed out @ 1440p

    • @ezpzjayzaar
      @ezpzjayzaar Před 2 lety +3

      @@octoman_games I have 1080Ti and 1440p monitor and I can confirm that it can't run all games maxed out at 1440p, you have to lower some graphics settings. Assassin's Creed Valhalla, Cyberpunk 2077 and God of War 2018 are just a few that I had to run with custom settings instead of all maxed out.

    • @giglioflex
      @giglioflex Před 2 lety +2

      Same, still more VRAM than a 3080 and uses less power.

  • @silverarm6130
    @silverarm6130 Před 2 lety +6

    We wouldn't have such a silicon shortage if they'd stop making stupid-ass models like this. That's a ridiculous power draw!

  • @Mordiaken
    @Mordiaken Před 2 lety +10

    A 2-20fps improvement for the low, low price of only $500 more than our last overpriced card!

    • @Safetytrousers
      @Safetytrousers Před 2 lety

      So what? Were you in the market for a 3090?

    • @ToadstedCroaks
      @ToadstedCroaks Před 2 lety

      Imagine being a poor sob that paid $2000+ for their 3090, wants to upgrade to a 3090 Ti... and realizes that selling their 3090 for $500 less than they paid (and falling) is already going to be painful.

    • @Safetytrousers
      @Safetytrousers Před 2 lety

      @@ToadstedCroaks Nobody with a 3090 will be upgrading to a 3090ti unless they have loads of spare cash, in which case they could probably buy one outright.

  • @prem3548
    @prem3548 Před 2 lety +2

    I am so sick of the "it's a Titan RTX replacement" argument. NO, it's not. The Titan RTX has access to professional drivers; this card doesn't. This card is advertised and meant for gaming first and content creation on the side, whereas the Titan RTX and its equivalents are always advertised for professional use cases. They don't even talk about gaming on their product pages, though they are perfectly capable of running games on the side. If it has the GeForce branding, it's not a professional card.

  • @Sherudons
    @Sherudons Před 2 lety +8

    They made a mistake with the 1080 Ti; that thing was godlike and remained the best of the best for years. What they release now is incremental bumps and markups that aren't worth barely 10% more performance.
    Still, people will go "it's my money to spend" and miss the point completely.

    • @tourmaline07
      @tourmaline07 Před 2 lety +1

      You only got the 1080 Ti because Nvidia were scared of Vega and the boastful Radeon Technologies Group. Not a mistake from them.
      All these incremental gains from Nvidia are just because AMD haven't competed properly at the top end. I think you'll see that change quickly, especially next generation, even if Nvidia need to get into the HVAC market to do so ;)

  • @neonix01
    @neonix01 Před 2 lety +1

    After owning a factory-OC RTX 3080 that has consumed about 340W since launch day, I can wholeheartedly say that my next upgrade will be from the manufacturer that offers the best performance PER WATT at around 300W max. I went from a GPU that used about 200W to one at 340W: a 140W increase. In my very well ventilated case, this meant that overall temps went up significantly. Noise went up just as much, and my CPU, chipset and memory temps rose noticeably as well. This is in a case with 2x 140mm, 3x 120mm and 1x 160mm case fans, plus 2x 140mm CPU fans, all BeQuiet! Silent Wings 3. I also noticed the much warmer overall temperature in the room; PC gaming all of a sudden started to get uncomfortably warm and sweaty on warm days. No way will I purchase a card that generates even more heat. It's just not an option. Energy prices are also rapidly increasing all over Europe thanks to the ongoing war, several hundred percent in some countries just in the last month. At that point I would rather play games on my PS5, which consumes about 215W MAX in total for the entire box! Do I appreciate 120+fps? Yes. Is it worth consuming upwards of 1000W in total versus a console at 215W for 60fps? NOPE.

    • @ToadstedCroaks
      @ToadstedCroaks Před 2 lety

      Me hoping my AIO will fit the new graphics cards, or im investing in HVAC piping out my window.

  • @cptwhite
    @cptwhite Před 2 lety +4

    Just switched from a 3060 Ti to a 6800; glad to see my 'new' card is still worth being included in the uber top-tier list, even if it is propping up the bottom of the charts. £369 for the 3060 Ti on launch day and £100 for the switch over, so I effectively got the 6800 for £469, a minor miracle to beat MSRP! (Which was £529 here in the UK, for reference)

    • @lupintheiii3055
      @lupintheiii3055 Před 2 lety

      RX 6800 is just the best card this gen

    • @cptwhite
      @cptwhite Před 2 lety

      @@lupintheiii3055 performance per watt is very nice, and the 220W power consumption is the sweet spot

    • @lupintheiii3055
      @lupintheiii3055 Před 2 lety

      @@cptwhite Love mine from day one :-)
      Found the best setting at 2450MHz core, 2100MHz memory, 930mV and -5% power limit; I can max out everything at 1440p for 170-190W :-)

    • @cptwhite
      @cptwhite Před 2 lety +1

      @@lupintheiii3055 only got it the other day, so thanks for the tip I'll have a play

  • @NOSfusion
    @NOSfusion Před 2 lety +4

    700+ watts!? I would love to see a 450w shootout with like 3 different builds. Best performance for under 450w consumption. Electricity cost to run a 3090ti every day is over $3, or damn near $100 a month. 450w however is roughly $1.27 per 24 hours of use, or $38 per month.

    • @ToadstedCroaks
      @ToadstedCroaks Před 2 lety +1

      Yeah, something like 30 cents per kWh where I'm at. Going from a 200-watt card to a 400-watt card over 5 hours of playtime is 30 cents, or $9 a month. If you're a streamer / power gamer with a 600-watt card you'll easily be doing an extra 4kWh a day (compared to earlier cards), or around $450 a year more than previously. That's just for "upgrading", which doesn't include the extra power you were already using, like RGB, cooling, extra monitors, etc. That cost could be minor, or double your power bill.
      And good lord, the heat it'll produce inside your room / home. That'll make things way more uncomfortable and increase your bill from the extra home cooling required.
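The back-of-envelope numbers in this thread can be checked with a few lines. The tariffs are assumptions inferred from the comments (the "over $3/day" figure implies roughly $0.18/kWh, the $1.27 figure roughly $0.12/kWh), so plug in your own rate:

```python
# Daily electricity cost for a PC drawing a constant load.
# Wattages and rates are illustrative, taken from the thread above.
def daily_cost(watts: float, usd_per_kwh: float, hours: float = 24.0) -> float:
    """Cost in USD of running `watts` for `hours` at the given tariff."""
    return watts / 1000.0 * hours * usd_per_kwh

# ~700 W total draw at $0.18/kWh -> just over $3/day (~$90/month)
print(f"700 W: ${daily_cost(700, 0.18):.2f}/day")
# 450 W at ~$0.118/kWh -> about $1.27/day (~$38/month), as the comment says
print(f"450 W: ${daily_cost(450, 0.118):.2f}/day")
```

Real-world cost will be lower than the 24-hour figures unless the machine runs flat out around the clock.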

  • @sinnexit7450
    @sinnexit7450 Před 2 lety +10

    Glad to see prices are coming down for GPUs

  • @The117th
    @The117th Před 2 lety +6

    So, seeing your results against my shunt-modded 3090 FE on my 3950X, 64GB 3000MHz rig, Nvidia should be embarrassed: consuming roughly the same amount of power, I beat it by 1-2%. There is literally zero reason to get this card. I called the 3090 a pointless product when I got it, and I didn't think it could get worse.

    • @benjaminoechsli1941
      @benjaminoechsli1941 Před 2 lety +1

      How are your temps? The change in memory modules means the Ti's VRAM isn't roasting itself like the vanilla 3090's has been known to.

    • @The117th
      @The117th Před 2 lety +1

      @@benjaminoechsli1941 Honestly, for VRAM, not that bad: the highest I've seen under extreme prolonged loads is 86C; gaming is usually mid-60s to 70s. Keep in mind this is watercooled though; before that I had the air cooler and saw similar results, if around 5C higher temps on memory

  • @wcg66
    @wcg66 Před 2 lety +8

    I bet the availability of these will be fine since no sane person would buy one. Even miners will avoid it since any investment in hardware at this point is going to burn them in a few months.

  • @zedorda1337
    @zedorda1337 Před rokem +3

    I just upgraded from a GTX 1080 to the MSI Suprim X 3090 Ti for $1,149.95. I feel it was an OK cost-to-performance upgrade, imo. I was not impressed with the 4000 series specs and feel I can skip a few generations with this new MSI Suprim X.

  • @taz1999
    @taz1999 Před 2 lety +3

    Brutally honest review like always. Great video.

  • @TAKOpowdercoating
    @TAKOpowdercoating Před 2 lety

    Thanks for reviewing these cards, guys.
    Just bought a Gigabyte RTX 3090 Ti GAMING OC 24G in Taiwan (09/11/2022); it cost me NT$29,990, about 1,000 USD / 1,428 AUD.

  • @mapesdhs597
    @mapesdhs597 Před 2 lety +40

    7:45 - "... you'll probably notice the 500+ dollars missing from your wallet."
    That was such a good line. :D
    GPU technology has utterly stagnated. Obtaining more performance just by throwing vastly more power at the problem is not progress. These products are not *gaming* cards, they are table scraps from the Enterprise/compute market. We need genuinely new gaming-focused designs, not this nonsense which for many years now has evolved primarily for Enterprise, with gaming barely an afterthought. Hype tech like RT and DLSS is just an excuse to do something with silicon that was going to be present anyway (which by definition wastes die area that could be used for something more directly relevant, such as a beefier raster subsystem). People can sing the praises of DLSS if they like, but ultimately it's allowed NVIDIA to stall useful gains in raster performance and get away with not improving baseline native resolution rendering techniques, even though they were the company largely responsible for the initial surge in 4K gaming popularity. Quite the irony when now more than ever people are trying to push super high fps, alongside the demands of VR. Steve @ TGOG discussed this some years ago:
    czcams.com/video/PkeKx-L_E-o/video.html
    I recommend reading old reviews for products like the X1950 Pro, GTX 200 series, 8800 GT, 4870, etc. Back then there was a strong focus on introducing new rendering techniques to improve baseline realism, and reviews often did image quality comparisons to check that any performance gains were not being obtained via degraded visuals (something rarely done these days), especially for changes in AA methods.

    • @ToadstedCroaks
      @ToadstedCroaks Před 2 lety +1

      God, I miss when there was a meta treasure hunt during each video card series where consumers tried to figure out which card was the one that would last you several years (future-proof). When people figured out the 8800 was that card... it was an amazing moment. There were always people going "Yeah, well I'm still using my 8800 and everything is still on max settings. These games still can't even touch it."
      Now companies like Nvidia create "technologies" that make players have to upgrade their cards because of hardware tie-ins, like ray tracing. "Yeah, your older card is still more than enough to run this game at high settings... but ray tracing." It's pretty sickening when computer improvements are just proprietary gimmicks to make them look way more advanced than they are. Like you said, if instead of doing that they spent that hardware on more meaningful improvements, it would lift everything up as a whole, rather than just enabling that gimmick. (Like 3D TVs and still needing to wear glasses for most of them.)

    • @mapesdhs597
      @mapesdhs597 Před 2 lety

      @@ToadstedCroaks I remember that era well. I replaced my X1950 Pro with 8800 GT SLI and, oh my, did it serve nicely for a good long while. Eventually upgraded to GTX 460 SLI, and so on.
      Irony is, just for benching purposes, in later years I was able to obtain triple GTX 280, dual 8800 GTX and numerous other cards on the cheap as each series hit rock bottom at relevant times in the used market. At one point though the normal price for a new GTX 460 was just 115 UKP, and that was a model which could easily handle 1GHz core; here's my later benching system with two of them (easily beats a 580 and not far off a 680 or 7970 GHz, sans VRAM capacity of course):
      www.3dmark.com/3dm11/6616515
      Things began to change with the 500 series. The 560 was just a rebranded 460, then there was a weird 570 or somesuch with a different VRAM config (remember when 1280MB was a thing?), though the 560 Ti actually had some oomph to it and wasn't too bad (I had two of those for a while, bought used of course), but the 580 really appealed to solo pros and I reckon that's when light bulbs went off at NVIDIA (I bought quite a few for pro builds, plus my own gaming upgrade, all on the used market), hey this CUDA thing has legs, but we don't want serfs buying these things. So they killed FP64 in normal cards, the TItan/Black being the last which had it to any degree, and pricing really began to climb with the 600s onwards, though the marketing was a mess for gamers because the shaders being used were totally different, a GPU needed a lot more of them to match the CUDA power of the 500s (the 580 is faster than all 600 cards for CUDA, and the only 700s that can beat it are the Titan and 780 Ti). This is when the better VRAM capacities were squeezed into the higher tiers, which was really annoying, though weirdly the 760 had a 4GB version even though for relevant games it really wasn't fast enough anyway, whereas the 780 Ti would still be pretty useful today if it had 6GB like the TItan (it runs similar to a GTX 970/980 or RX 570/580).
      AMD cards had their own ups and downs, but tbh I had too many driver issues, especially with CF (I obtained two GHz 7970s which were certainly quick, though loud, but DX9 CF support was broken), in particular weird desktop glitches when turning it on & off. Still, for benching I also obtained two 4870s, two 5850s and some others.
      Anyway, it's all a different market now. Gamers have been sucked into a black hole of functionless visual bells and whistles (by that I mean, for example, water which can look amazing but is not actually *wet*, or game worlds in which most of the landscapes one can see can never be reached, too on rails) and a ridiculous obsession with fps rates, seemingly oblivious to the fact that using higher refresh displays literally retrains their brains' vision systems so that they can no longer tolerate ordinary 60Hz monitors. They move their own tolerance goal posts (ref research published in New Scientist long ago). IMO DLSS wasn't about providing 'free' performance, it was about keeping those gamers happy who had bought high(er) refresh displays but suddenly found the next gen cards (Turing) had no useful raster improvement, a double whammy when it coincided with the higher spending market beginning to adopt 4K aswell, a fad that NVIDIA itself promoted. It blew my mind when Turing launched and NVIDIA actually framed its RT PR around 1080p/60. :D Suddenly now we have DLSS to "bring it back up again" - ha! What a psych and a half. Beats me why we still don't have basic full scene AA that's as good as IR gfx from 25 years ago, it's not like available VRAM is still an issue. People also talk about game stutter and refresh sync now as if back before Freesync/Gsync peoples' heads exploded whenever they saw screen tearing. Infact, back then, people simply didn't care, the focus was much more on just getting basic performance up while improving visuals at the same time. I played Stalker, Oblivion and other games with various SLI configs, the screen tearing never bothered me, though I was using a good CRT (22" Trinitron @ 2048x1536).
      In other words, no matter how one might regard DLSS (or any upscaling tech), it's allowed these companies to offer lower quality output at higher fps rates, the very thing article reviews from long ago sought to warn us about, but a form of testing which isn't done anymore. The focus ought to be on native res quality, but all this upscaling stuff means comparing A with B for any game is rapidly becoming impossible. It's a dream come true for marketing depts., they can claim pretty much whatever they like, it's all just more acronyms they can encourage people to believe are worth paying for. It also makes it very hard to know whether other visual tech such as AOC is actually working as intended, because visual quirks could be due to whatever upscaling is active.
      What really gets me is when someone says a DLSS or whatever image is "better than native"; that makes no sense at all, such a thing shouldn't be possible. If it's happening then it merely proves that the baseline tech itself is being allowed to degrade, with what used to be new *hardware* features instead being rolled out as software addons (via APIs or drivers) in an attempt to lock gamers into one particular product line.
      Frankly, it all kinda stinks. I remember being so impressed when I bought the X1950 Pro, and for some years what was happening with tech for gaming was a fascinating scene to observe, but now it's a complete joke.
      Btw, for more on what I mean by functional immersion, see this old article: www.sgidepot.co.uk/reflections.txt

  • @failomas1443
    @failomas1443 Před 2 lety +1

    The joke about "you'll notice the $500+ missing from your wallet" killed me on the spot from laughing

  • @craigdavies8866
    @craigdavies8866 Před 2 lety +7

    Imagine paying over $1k for 5 to 10 frames... well done Nvidia, you're drunk, go home.

    • @H0don
      @H0don Před 2 lety

      There are people willing to buy this product anyway; that's why we see releases like this one. The pandemic showed us how much underpriced GPUs were before. I don't expect much improvement in price/performance in the next gen of GPUs. I'm not considering global price-affecting factors like crypto, energy cost, etc.

    • @craigdavies8866
      @craigdavies8866 Před 2 lety

      @@H0don Very true, people buy it, which means Nvidia know they can keep going higher. I mean, look at Pascal to Turing; that started the trend really. Ampere just continued it higher, and now Lovelace will do the same. I do not believe GPUs were underpriced, however. Fabrication costs have of course gone up, but that was not the case last generation. Shipping, manufacturing etc. were all stable, yet prices still went to $1200 for a Ti. Nvidia saw it worked and so just continued the trend.

  • @geerstyresoil3136
    @geerstyresoil3136 Před 2 lety +1

    the smoke coming out of the cards was a nice touch. 😆

  • @Sherudons
    @Sherudons Před 2 lety +5

    Soon GPUs will need a computer case just to support their weight, and a separate case for the (smaller) motherboard.
    All plugged into a nuclear reactor to power the damn thing.

    • @haukionkannel
      @haukionkannel Před 2 lety

      You can replace your central heating system with this... so, pure savings ;)

  • @gwynbleidd9921
    @gwynbleidd9921 Před 2 lety

    This is a good channel. You guys deserve more subs

  • @wertywerrtyson5529
    @wertywerrtyson5529 Před 2 lety +5

    I am fairly certain that the power draw would trip my circuit breaker, as my home movie setup did. I am sticking with less powerful cards, not just because I don't want to buy a 1000+ watt PSU and pay the increased power bills, but also because with my monitors and sound and everything else it would probably, as I said, trip the breaker and I wouldn't be able to use it. Just like I had to downscale my home theatre.

    • @i_grok_u2902
      @i_grok_u2902 Před 2 lety

      I keep thinking that come next gen, I'm going to need to get an electrician to put in a new circuit just for my PC. I keep waiting for my circuits to pop... lucky so far, but I don't think there is any more headroom.

    • @tablettablete186
      @tablettablete186 Před 2 lety

      What are your thoughts on using a power limit on the GPU? Example: limit a 3090 to say 250W.

    • @Tallnerdyguy
      @Tallnerdyguy Před 2 lety

      Wah

    • @ToadstedCroaks
      @ToadstedCroaks Před 2 lety

      @@tablettablete186 Then why even buy the card? Get one that normally does 250W.

    • @tablettablete186
      @tablettablete186 Před 2 lety

      @Sherman Herritt More performance per watt. A 3090, for example, performs better than a 3070 at the same wattage.
      And there are other benefits: beefier heatsink, more VRAM...
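The power-limit experiment floated above can actually be tried with nvidia-smi's `-pl` flag (requires root; the limit resets on reboot unless persistence mode is enabled with `-pm 1`). The helper below is just a sketch; the 250 W figure is the commenter's example, not a recommendation:

```python
import shutil
import subprocess

def power_limit_cmd(gpu_index: int, watts: int) -> list[str]:
    """Build the nvidia-smi invocation that caps board power for one GPU."""
    return ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)]

cmd = power_limit_cmd(0, 250)  # e.g. cap GPU 0 to 250 W
print("would run:", " ".join(cmd))

# Only attempt it if the driver tools are actually installed;
# sudo -n fails instead of prompting when no credentials are cached.
if shutil.which("nvidia-smi"):
    subprocess.run(["sudo", "-n"] + cmd, check=False)
```

You can verify the applied limit with `nvidia-smi -q -d POWER`, which reports the current, default, and min/max enforceable power limits; the driver will reject values outside the board's allowed range.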

  • @Alchemist37
    @Alchemist37 Před 2 lety

    Loving the smoke machine effect in the background really set the mood hahaha whose brainstorming idea was that??? 😉

  • @jdbb3gotskills
    @jdbb3gotskills Před rokem +5

    I totally understand that the 3090 Ti is ridiculous. But now that prices are a lot lower, I had to go for it. I've never bought a "halo" product before when it comes to PCs, and I wanted a close-to-ridiculous PC. I still only have 16GB of RAM but I'll be upgrading that later haha

    • @nicekeyboardalan6972
      @nicekeyboardalan6972 Před rokem

      You must feel like a fool now that the 4070 Ti ($899) beats it in a lot of stuff. I do love my EVGA 3090 Ti; I traded it for an Amiga 1000

  • @ZeroHourProductions407
    @ZeroHourProductions407 Před 2 lety +2

    For this kind of asking price and the power consumption, _if I were_ in a position to actually afford to drop this much on a card, I'd really expect a super slimmed edition with a water block from the factory to deal with that kind of thermal load.

  • @Z1nka
    @Z1nka Před 2 lety +5

    I would rather wait for the 4070 coming later this year. At least I might get roughly the same performance at a lower price point.

    • @ThunderingRoar
      @ThunderingRoar Před 2 lety +2

      @@silverwerewolf975
      1070 was faster than 980ti
      2070s was on par with 1080ti
      3070 was on par with 2080ti
      4070 will be on par/faster than 3090

    • @ThunderingRoar
      @ThunderingRoar Před 2 lety

      @@silverwerewolf975 sure 3090 and 3080ti are very close in performance anyway

    • @Pand0rasAct0r_
      @Pand0rasAct0r_ Před 2 lety

      @@ThunderingRoar According to Moore's Law Is Dead, the 4070 should be around 30% better than the 3090, which means around 20% better than the 3090 Ti:
      czcams.com/video/c--1_0HqB3w/video.html

    • @SteelGrayRider
      @SteelGrayRider Před 2 lety +1

      @@silverwerewolf975 I think you left out the 4th digit

  • @youtubevanced4900
    @youtubevanced4900 Před 2 lety +1

    19:37 Linus get down, shots being fired.

  • @escescus
    @escescus Před 2 lety +5

    As always, no RT nor DLSS. Nice job

  • @SilverWolf-VR
    @SilverWolf-VR Před rokem +1

    Rendering 4k games in VR is basically the same as rendering 8k games on a flat screen.
    VR is always ignored, but VR can and will take full advantage of the 24GB offered by these cards.

  • @anthonytech
    @anthonytech Před 2 lety +6

    Damn, I'd feel bad putting that in my PCIe slot! Might rip the damn thing off because it's so heavy!
    Edit: Nvidia doesn't care about reviews. They know the people who want the very top end are going to buy the 3090Ti.

  • @james2042
    @james2042 Před 2 lety +1

    All this proves is just how good my 1200 dollar liquid cooled 6900xt (from August 2021) was lol. I game at 1440p and I couldn't be happier

  • @mcguiremnc
    @mcguiremnc Před 2 lety +3

    Who in their right mind would buy 2-year-old tech at these prices, for so little price/performance, and then tack on a 30% power uplift? For what people paid for 3090s there should have been a recall, given there was no proper thermal design a year ago.

  • @capitalwun
    @capitalwun Před rokem +2

    I don't regret buying this. Nvidia had it at like $1000 or $1100, I don't remember; a good deal I think. Idk what it's at now, but almost 50% off, wow.

  • @iulian2548
    @iulian2548 Před 2 lety +10

    It is just as expected.
    Nice to see Steve raising his rear leg over this useless product that will be obsolete in a few months.
    Riding the gravy train!

  • @-pcmr-108
    @-pcmr-108 Před 2 lety

    Your B-roll footage for CP2077 towards the end is actually Watch Dogs.

  • @theigpugamer
    @theigpugamer Před 2 lety +10

    Really proud of your review Steve
    I am so glad you didn't shill out like Jay does in his "reviews" (aka paid advertisements), and instead shared the frustrations of gamers rather than parroting this fucking cash grab and punch in the gut to gamers

    • @andersjjensen
      @andersjjensen Před 2 lety +5

      Jay is becoming as much of a corporate thug as Nvidia themselves. He recently ran a "here's the best computer you can buy for $1000" video where he LEGIT POSTULATED that a 1660 Super for nearly $600 was the best way to go... while posting affiliate links to his overpriced recommendations. Needless to say, the comment section went apeshit with links to 6600 XTs for less than $550....

    • @DarkReturns1
      @DarkReturns1 Před 2 lety

      It's funny how much hate people can get for their opinions lol. I don't agree with him all the time but Jay is a good dude.

  • @devackroyd
    @devackroyd Před 2 lety

    Man, I haven't been paying much attention to PC hardware stuff for the past 6 months, and I come back to take a look to find this? Back to exile I go.

  • @houzeroze
    @houzeroze Před 2 lety +7

    If you ask me, the 6900XT is an awesome card, efficient, and provides great performance for the power that it consumes and the MSRP price. The 6950XT hopefully won't consume that much more power but I'm sure it will be awesome.

    • @FzNd
      @FzNd Před 2 lety +2

      The only silly thing about the 6900XT is the MSRP, specifically how much higher it is than the 6800XT. However, MSRP hasn't mattered EVER since the release of this generation's GPUs. I bought a 6900XT half a year ago, when it was at pretty much the same real world money as the 6800XT and cheaper than a 3080 10GB.
      Still ludicrously overpriced by any reasonable standards, but so was EVERY OTHER GPU available. In the general state of the market, it was a decent buy, I think.

    • @andersjjensen
      @andersjjensen Před 2 lety

      @@FzNd The 2080 Ti was $1200 for all of its life. Which, granted, was revolting already back then. But from that perspective the 6900 XT isn't even that bad. As long as AMD provides a "close to the top, but reasonably priced" card, I actually don't care that they make a slightly better version for those people who think the obtuse price in and of itself makes the product attractive. Some people are shallow enough to define themselves by having better things than others. It's worth something to them. So someone should sell it to them....

  • @billcarson1966
    @billcarson1966 Před 2 lety +1

    4:27 love the smoke effect in the background. Not necessary. It doesn't make the cards any faster. But I like it! As long as that doesn't happen to my card when I'm gaming! LOL

  • @holobolo1661
    @holobolo1661 Před 2 lety +9

    Great unbiased review that is so much more in touch with the average PC user than Nvidia could ever hope to be.

    • @aternialaffsalot
      @aternialaffsalot Před 2 lety

      Their review isn't wrong but I mean, they're obviously biased against nvidia

  • @Mattribute
    @Mattribute Před 2 lety +2

    Power consumption matters. The heat is a problem for performance since it's going to heat up the whole case, and then there's the noise. If you have a wide-open case with fewer fans, there's still nothing to stop the noise. But with an enclosure you need more fans, and the noise interferes with enjoyment.
    So these cards are a problem at full power, and one must look for the best cooler and VRM, and then downclock a bit.

  • @coldvaper
    @coldvaper Před 2 lety +9

    This last 18 months I've been thinking to myself that we needed more expensive graphics cards and Nvidia delivered, thanks Nvidia...

    • @andersjjensen
      @andersjjensen Před 2 lety

      Yeah, I know right? Now we can finally put those upper midrange PEASANTS back in their PLACE!

  • @mr_fearless
    @mr_fearless Před 2 lety +2

    Good stuff Steve. I've no desire to ever pay crazy money for a huge, loud, power-guzzling graphics card. I want a graphics card that is slim, power efficient, reasonably priced and good value for money that can do 1440p gaming at 144Hz. I don't need ray tracing, and I don't need image scaling/up-sampling - just a card that can natively do the job.

    • @mapesdhs597
      @mapesdhs597 Před 2 lety

      Alas I think that's a market segment which is effectively dead, these companies don't care anymore. Their new MO is to suggest one can achieve the desired fps rates via the upscaling methods, which is a con really but it's gained a lot of ground very quickly.

    • @fwef7445
      @fwef7445 Před 2 lety

      Still rocking my 1080ti. I'd buy a PS5 if I could find one; also, this 'upper elite' bracket of gaming isn't worth it.

  • @eliotcole
    @eliotcole Před 2 lety +3

    A top of the range GPU used to be around £800-900 ... so I was thankful for getting my RX 6900 XT at ~£1000 ... but this is just getting insane.

    • @MrPreet23
      @MrPreet23 Před 2 lety +1

      Same here, very happy with the 6900 XT and a 1000-watt power supply rather than spending $2000 for a 10-15% performance difference; the AMD 6900 draws half the wattage the 3090 does.

  • @PokèMyBalls
    @PokèMyBalls Před 2 lety +1

    The Gainward model was steadily holding 2GHz at 40 watts less. I'd say that's a decent difference.

  • @AleksAngel82
    @AleksAngel82 Před 2 lety +8

    When it comes to graphics cards, I have to say anything over 500 dollars is stupid in my opinion, especially if you just do gaming.

    • @H0don
      @H0don Před 2 lety +1

      Well, mid/high-end gaming has become a really expensive hobby.

    • @reinhardtwilhelm5415
      @reinhardtwilhelm5415 Před 2 lety

      Idk, I consider the 3080 and 6800 XT to be some of the best cards of this generation. They’re merely quite good, and not great.

    • @giglioflex
      @giglioflex Před 2 lety

      @@reinhardtwilhelm5415 The 6800 XT maybe, but the 3080 is too power hungry. It's the reason I decided to stick with my 1080 Ti. I live up north and even then my 1080 Ti heats up the room quickly. Anyone with these 350-450W video cards is certainly using AC during the summer, and even then I can imagine the power you have to draw just to keep your room from getting too hot.

  • @bobapearl2193
    @bobapearl2193 Před 2 lety +1

    A use case that you might not have considered for the 3090 (not the 3090 Ti) is that programs like Daz Render use huge amounts of RAM, and if you don't have enough VRAM on your GPU it can't render on the GPU and has to fall back to the PC's CPU. That 24GB makes a huge difference in rendering times.

  • @WybremGaming
    @WybremGaming Před 2 lety +9

    This is what PC gamers brought upon themselves during this pandemic. You kept on buying these marked-up cards, and now Nvidia is here to jack up prices. Let's be real here: the 3090ti replaces the 1080ti and 2080ti, not their Titan, as this card again has handicapped FP64 performance. Pathetic. I expect PC gaming to become so expensive that it just implodes sooner or later. I myself moved to consoles. They're not the best, sure, but it saves a LOT of money. Speaking of money, that electric bill with 600-watt GPUs will be crazy.

    • @twiztidsidfreak13
      @twiztidsidfreak13 Před 2 lety +2

      Why don't you just say "I'm a console peasant"? It would save time. I'm still seeing PS5s going for $800-1000 in my local area, so how are consoles cheaper?

    • @WybremGaming
      @WybremGaming Před 2 lety +3

      @@twiztidsidfreak13 The fact that you use that wording tells me you're a typical 1080p sRGB PC gamer on a shit 1650ti gaming in what I'd say is close to black/white. I used to be a PC gamer, a real one, you know, 580, 780, 980, 1080, 2080, all high-end, but there is no reason for it today. Games are shit, launchers are shit and fragmented, and hardware is overpriced and slow. The 3090ti can't even maintain 60fps in Valhalla at Ultra settings lmfao. No OLED, no Retina, no P3 gamut or barely. What a joke. The PS5 is 500eur, not a cent more. Same for Xbox. It's not that hard to find.

    • @tourmaline07
      @tourmaline07 Před 2 lety

      We really need some cards next generation that offer 3060 performance at £250 or less (1080ti raster) and 2060 performance at £175 or less.
      It's well past time the RX 580s and 1060s died off now. They've had their time in the sun, but they will hold everything back on PC if devs need to target them this gen.

    • @tonysoprano655
      @tonysoprano655 Před 2 lety

      @@WybremGaming lol bro you said it very well. I have a 3060ti PC but these prices are outrageous; I'd rather have a PS5. PC gaming is starting to feel like a rip-off.

    • @WybremGaming
      @WybremGaming Před 2 lety +3

      @@tonysoprano655 I sold my RTX 2080 PC because someone was willing to pay 1250 for it. PC prices need to drop, hard, before I consider going back. I always want the 2nd-best card for 800eur, so a 3090 for 800.

  • @Shannon-ul5re
    @Shannon-ul5re Před 2 lety +1

    The sad thing is, people will pay the high price if it's what they want, and when you are at this level, money isn't an issue!

  • @orenm1454
    @orenm1454 Před 2 lety +4

    Since they saw customers are willing to pay absurd prices for high-end gaming, this is a legit move from Nvidia; there are always some people willing to pay those prices.

    • @Zero11s
      @Zero11s Před 2 lety

      they are like Apple

  • @hughmungous6417
    @hughmungous6417 Před rokem +1

    Now that it's October 2022, and prices have come down significantly, which card now shows the best value for gamers? The MSI Gaming Trio 24G is available in Canada for less than $1200 on sale and can also be found periodically on eBay for that price. That slots it in at approximately $900 US. Is this the expected drop before the 4000 series?

  • @nibelungvalesti
    @nibelungvalesti Před 2 lety

    The one and only fair comparison on YouTube - thank you. Everyone else is comparing it (the Ti is factory overclocked) against the 3090 FE, or other stock-ish variants. Idiots.

  • @caoticnda8465
    @caoticnda8465 Před 2 lety

    Thx 4 the review. I do mostly agree. My personal opinion, however: 1,700 RPM is far from silent. If the room has a base environmental noise level and the card, or rather the PC, can't be heard, then we could talk about silent; everything else is "more or less quiet" at best.

  • @just_some_guy3687
    @just_some_guy3687 Před 2 lety

    Please review the HP Omen 27i; it hasn't really been noticed, and the amount of information you guys give makes me confident in what to buy.

  • @VanilloBumbleboo1981
    @VanilloBumbleboo1981 Před 2 lety

    10:07 An RTX 3090 Ti in the background caught fire!! 😉

  • @dushanbebbddhdhdjsjs715

    It is cool that you guys have a different 3090ti from all the other reviewers, who got EVGA ones.

  • @MSJChem
    @MSJChem Před 2 lety

    The Gainward 3090Ti is selling for 2700 USD here in Japan (327,000 yen).

  • @garydurn7983
    @garydurn7983 Před 2 lety

    This product is only a pricing study Steve, it's being used to assess market acceptance ahead of the next generation launch.

  • @James-rv3yi
    @James-rv3yi Před 2 lety

    3:07 24GB of VRAM is really useful for machine learning. If your data exceeds the VRAM size, your CPU has to fetch the data from system memory, which can cause major performance losses. Even with 20+ thread CPUs and the fastest I/O NVMes you can find, you will be bottlenecked by how much data you can send to the GPU versus how fast the GPU can process it. Really, 2x 3080s will train faster than 1x 3090, but it doesn't scale as well. To most people 24GB seems like a lot, but for my at-home models 24GB isn't enough anymore. Compared to Quadros, 2x 3090s for $3,000 is a deal if you can find them for a personal rig. Personally I want to see the next XX90 or Titan have 48GB of VRAM. Some of even my at-home projects have hit 80GB of VRAM.
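    [Editor's note] The VRAM arithmetic in this comment can be sketched with a rough back-of-envelope estimate. This is not a profiler: the 3x optimizer multiplier (Adam-style state) and fp32 weights are assumptions, and real usage also depends on activations, framework overhead, and mixed precision.

    ```python
    def training_vram_bytes(num_params, bytes_per_param=4,
                            optimizer_multiplier=3.0, activation_bytes=0):
        """Rough training-memory estimate: weights + gradients + optimizer
        state (Adam keeps extra moment tensors per weight, hence ~3x),
        plus whatever one batch of activations needs."""
        return int(num_params * bytes_per_param * optimizer_multiplier) + activation_bytes

    def fits_in_vram(num_params, vram_gb, **kwargs):
        """True if the rough estimate fits within the card's VRAM."""
        return training_vram_bytes(num_params, **kwargs) <= vram_gb * 1024**3

    # A 1B-parameter fp32 model needs roughly 12 GB of weight/gradient/
    # optimizer state, comfortable on a 24 GB RTX 3090.
    print(fits_in_vram(1_000_000_000, 24))   # True

    # A 7B-parameter model needs roughly 84 GB, well past 24 GB -- in the
    # ballpark of the "hit 80GB of VRAM" projects mentioned above.
    print(fits_in_vram(7_000_000_000, 24))   # False
    ```

    Under these assumptions, the estimate shows why a card's headline VRAM figure, not its shader throughput, is often the hard ceiling for training workloads.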

  • @Runningr0se
    @Runningr0se Před 2 lety

    One nice thing for NVIDIA is that now the AIBs will already have 450W coolers ready for the RTX4000 series later this year.

  • @willi4242
    @willi4242 Před 2 lety

    Hey Steve, thanks for dialing in the 3090 Ti for us! I got my 3080 over a year ago and still think I hit the sweet spot then. So thanks for not singing the hype song of the card companies and for giving some real-world perspective! Keep going, and greetings from good ol' Germany.