I’m Dreading this Review - RTX 4070

  • Published May 5, 2024
  • Save 15% with code LinusTechTips at vessi.com/LinusTechTips
    Check out VideoCom for free at lmg.gg/VideoCom and use code LINUS for 50% off any paid subscription!
    Nvidia's RTX 4070 is a complicated card. On one hand it's the best value we've seen in years; on the other, everyone is still mad.
    Discuss on the forum: linustechtips.com/topic/15002...
    Buy a NVIDIA RTX 4090 GPU: geni.us/1BwMkoC
    Buy a NVIDIA RTX 4080 GPU: geni.us/HGfZ0
    Buy an XFX AMD RX 7900 XT GPU: geni.us/FiwjABC
    Buy a Gigabyte GeForce RTX 4070 Ti OC GPU: geni.us/zPhA1cl
    Buy an XFX AMD RX 6800 XT GPU: geni.us/6qWA
    Buy a NVIDIA RTX 3080 GPU: geni.us/yGFs
    Buy a NVIDIA RTX 3070 GPU: geni.us/a47uSbC
    Buy a NVIDIA RTX 2080 GPU: geni.us/6kPA
    Buy a NVIDIA GTX 1080 GPU: geni.us/lmfy
    Buy an Intel Arc A770 GPU: geni.us/tHZb5
    Check out RTX 3080 GPUs on eBay: ebay.us/cKqlbc
    Check out RTX 3070 GPUs on eBay: ebay.us/Iap1tx
    Buy an Intel Core i9-13900K CPU: geni.us/uK3UAK
    Buy a Gigabyte Z790 Aorus Master Motherboard: geni.us/SMzYt2
    Buy an AMD Ryzen 7 7800X3D CPU: geni.us/BzdOS
    Buy a Gigabyte X670E Aorus Master Motherboard: geni.us/GAj33
    ► GET MERCH: lttstore.com
    ► LTX 2023 TICKETS AVAILABLE NOW: lmg.gg/ltx23
    ► GET EXCLUSIVE CONTENT ON FLOATPLANE: lmg.gg/lttfloatplane
    ► SPONSORS, AFFILIATES, AND PARTNERS: lmg.gg/partners
    ► OUR WAN PODCAST GEAR: lmg.gg/wanset
    FOLLOW US
    ---------------------------------------------------
    Twitter: / linustech
    Facebook: / linustech
    Instagram: / linustech
    TikTok: / linustech
    Twitch: / linustech
    MUSIC CREDIT
    ---------------------------------------------------
    Intro: Laszlo - Supernova
    Video Link: • [Electro] - Laszlo - S...
    iTunes Download Link: itunes.apple.com/us/album/sup...
    Artist Link: / laszlomusic
    Outro: Approaching Nirvana - Sugar High
    Video Link: • Sugar High - Approachi...
    Listen on Spotify: spoti.fi/UxWkUw
    Artist Link: / approachingnirvana
    Intro animation by MBarek Abdelwassaa / mbarek_abdel
    Monitor And Keyboard by vadimmihalkevich / CC BY 4.0 geni.us/PgGWp
    Mechanical RGB Keyboard by BigBrotherECE / CC BY 4.0 geni.us/mj6pHk4
    Mouse Gamer free Model By Oscar Creativo / CC BY 4.0 geni.us/Ps3XfE
    CHAPTERS
    ---------------------------------------------------
    0:00 - Intro
    0:48 - VideoCom!
    1:00 - Specs
    2:00 - Gaming Performance
    4:00 - Productivity
    5:53 - So why aren't we stoked?
    6:44 - Efficiency, Overclocking
    8:18 - DLSS 3
    9:57 - Non-gaming Nice Things
    10:46 - Vessi!
    11:30 - Outro
  • Science & Technology

Comments • 7K

  • @UncleSheoTV
    @UncleSheoTV 1 year ago +12826

    Who would've thought the 40 series was going to make the 30 series look a lot more desirable.

    • @DragonOfTheMortalKombat
      @DragonOfTheMortalKombat 1 year ago +746

      I still remember all those videos about how RTX 40 would be the new Pascal LOL. More like the new Turing

    • @Saver310
      @Saver310 1 year ago +181

      ​@@DragonOfTheMortalKombat MORTAL KOMBAAAAAAAAT!

    • @RZ_2K
      @RZ_2K 1 year ago +192

      The problem is releasing new GPU series too soon. If the 40 series had released 3-4 years after the 30 series, it would be much, much better at a similar or slightly higher price.

    • @indiexanna
      @indiexanna 1 year ago +361

      "Who would've thought"??? Saw this coming miles away after that report from last year claiming that Nvidia was actually part of the shortage problem, stockpiling the 30 series.
      They have to sell the 30 series fast while keeping the 40 series appealing. How? Sell the 40 series at high prices so the 30 series looks appealing, but keep some features like Frame Generation exclusive to the 40 series.

    • @DragonOfTheMortalKombat
      @DragonOfTheMortalKombat 1 year ago +29

      @@Saver310 FATALITY!!!!!!!!!!!!!!

  • @GGdeals
    @GGdeals 1 year ago +612

    9:25 We're so glad you're using our website, Linus! Thanks for featuring us!

    • @DomikDomik
      @DomikDomik 1 year ago +11

      Congrats!

    • @MyVecter
      @MyVecter 1 year ago +5

      So happy for u guys! 🥰

    • @nairocamilo
      @nairocamilo 1 year ago

      The best website I've found. Recommend to all my friends.
      _So easy to find out free games out in the wild, hehehe_

    • @YannMetalhead
      @YannMetalhead 1 year ago +13

      Your site is awesome, thank you for the great job!

    • @Hoogles0
      @Hoogles0 1 year ago +3

      This is the way

  • @thebricknomads
    @thebricknomads 1 year ago +2363

    The way that the 1080 is still able to show up on graphs just shows how much of a monster it was

    • @Physik_
      @Physik_ 1 year ago +116

      Mine just died and I am so sad about it.

    • @javierxavier11
      @javierxavier11 1 year ago +45

      I have a 1080 lol

    • @Vsevolod3788
      @Vsevolod3788 1 year ago +165

      It shows how stagnant computer graphics and today's GPU market are.

    • @Thewaterspirit57
      @Thewaterspirit57 1 year ago +57

      It’s literally a high end gpu, that’s the only reason why it’s still good even more than 5 years later

    • @-TNT47-
      @-TNT47- 1 year ago +64

      I've been using a 1080 Ti since the day it came out; I can safely say I've been able to keep up with games flawlessly to this day.

  • @Eneeki
    @Eneeki 8 months ago +57

    I have found the RTX 4070 is really good for 3D rendering in programs like Bryce, Daz Studio, and the Iray render engine. (It also rendered really fast in 3ds Max, but I only had an upgrade trial copy, because my licensed copy of 3ds Max is a decade old and is only used for modeling.) My personal benchmark was a scene I created specifically to punish GPUs: I rendered the same scene in Daz Studio using the Iray engine at 4K resolution, with 4K textures on every object, set for a total of 3500 iterations. My results for the 3 cards I have: the RTX 2070 Super's render time was 6 hours and 43 min, the RTX 3070 took 3 hours and 22 min, and the RTX 4070 cut it down to 44 min total!!! (I was shocked at how fast the 4070 rendered the scene.) I made no changes to the file before rendering, so all three cards were rendering the exact same scene. I know this was not a formal bench test, but I thought I would share the results for the 3D modelers out there, since most card reviews focus on games rather than 3D programs. Hope the info helps.
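Those times imply roughly a 2x jump per generation and a 9x overall speedup. A quick sketch to check the ratios, using only the minutes quoted in the comment above (Python just for illustration):

```python
# Render times for the same Daz Studio / Iray scene, as reported in the comment above.
times_min = {
    "RTX 2070 Super": 6 * 60 + 43,  # 403 minutes
    "RTX 3070": 3 * 60 + 22,        # 202 minutes
    "RTX 4070": 44,                 # 44 minutes
}

baseline = times_min["RTX 2070 Super"]
for card, minutes in times_min.items():
    # Speedup relative to the slowest card in the informal test
    print(f"{card}: {minutes} min ({baseline / minutes:.1f}x vs 2070 Super)")
```

So the 4070 comes out around 9.2x faster than the 2070 Super and about 4.6x faster than the 3070 on this one informal scene; a single scene is not a formal benchmark, as the commenter notes.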

    • @4536647674
      @4536647674 2 months ago +1

      Nice, the benchmark I'm looking for. I'm using an AMD card and wanted to upgrade to AMD again (for gaming and rendering/designing), but noticed that Nvidia has major advantages for 3D rendering.

  • @the_netherqueen
    @the_netherqueen 1 year ago +851

    I love the small changes to the graphs you guys made! Last review there were a bunch of people saying they couldn't make out which product was the one being reviewed in the graphs, and just highlighting the product being reviewed is very nice!

    • @mohamedtechnology5105
      @mohamedtechnology5105 1 year ago +13

      Exactly!

    • @robertp457
      @robertp457 1 year ago +7

      Thank you for pointing this out. Totally agree it's so much better!

    • @Ben.N
      @Ben.N 1 year ago +14

      I still think it would have been clearer if they had one section for each manufacturer.
      Edit: also it would probably be nice if they had an icon or a photo for each graphics card

    • @grigri750
      @grigri750 1 year ago +4

      @@Ben.N Totally agree! It's also kinda confusing that they are sorted by speed/score instead of keeping the order the same - if I want to compare the 4070 to e.g. a 7900 XT, I keep having to pause and search for them with each new slide.

    • @NotTheCIA1961
      @NotTheCIA1961 1 year ago +1

      I'm in agreement with this; I didn't realize it while I was watching, but in hindsight the graphs were wildly easy to understand.

  • @Ikol3751
    @Ikol3751 1 year ago +217

    04:14 "This test is very CPU bound, but it is still funny how well Arc did." - shout out to the editors for this one.

    • @djkiIIag
      @djkiIIag 1 year ago +2

      Backhanded compliment for sure! 🤣

  • @thaddeusk
    @thaddeusk 1 year ago +492

    You can get a 6950 XT for slightly more than a 6800 XT these days. That would have been interesting to see on the benchmarks.

    • @sleepspacee
      @sleepspacee 1 year ago +78

      @@lennybarentine6425 Linus isn't either; Nvidia does not like him

    • @FerralVideo
      @FerralVideo 1 year ago +36

      GN also mentioned this in their review of the 4070.
      I just finished a "last generation weapon" build, 6950 XT + 5800X3D, targeting high-refresh 1440p.
      No regrets.

    • @Joseph-zh7fm
      @Joseph-zh7fm 1 year ago +36

      @@lennybarentine6425 Linus has shit on Nvidia countless times before; what are you talking about?

    • @David-bh1rn
      @David-bh1rn 1 year ago +23

      @@lennybarentine6425 he literally uses a 7900 XTX

    • @PolarisWZ
      @PolarisWZ 1 year ago +1

      @@FerralVideo I'm doing the same thing for 1080p 240hz

  • @trstnu
    @trstnu 1 year ago +854

    As someone who built their first PC just weeks ago, I personally went with the 6700 XT and I've been absolutely blown away by the performance; plus, the 12GB of VRAM is always a plus for less than 400 dollars.

    • @EricFixalot
      @EricFixalot 1 year ago +46

      I'm also on a 6700XT, I got mine used for $250. Absolutely love it

    • @stumm3186
      @stumm3186 1 year ago +28

      I am also a 6700 XT owner and absolutely happy; I can play every game at 2K with the highest settings (yes, Hogwarts Legacy and Cyberpunk included).

    • @WithMay
      @WithMay 1 year ago +23

      Glad to hear so many people are happy with this card, because I just bought it!

    • @EricFixalot
      @EricFixalot 1 year ago +18

      @@WithMay I highly recommend going 1440p at the same time if you haven't already. Mine needed so much CPU power at 1080p that it stuttered a bit. 1440p is buttery smooth. I went with a Koorui 1440p 144hz monitor from Amazon and I've been super happy with that as well.

    • @WithMay
      @WithMay 1 year ago +8

      @@EricFixalot I got a 1440P 165 Hz monitor from ASUS on sale for 329 euros 🥰

  • @dumboy886
    @dumboy886 1 year ago +684

    Thanks a lot for including the Arc A770 in this. Got one a few months back, and all the data out there is out of date, so being able to compare it to other, newer GPUs is nice.
    Really glad to be able to use this review as an updated set of Arc A770 numbers.
    (The different typography for Intel Arc in the graphs makes it stand out. It's a feature, not a bug.)

    • @superashish006
      @superashish006 1 year ago +15

      Arc is looking good with these graphs

    • @sonnyboy2340
      @sonnyboy2340 1 year ago +50

      So excited for battlemage and celestial. We might have cheap GPUs again after all

    • @justinliechty7745
      @justinliechty7745 1 year ago +17

      My Arc has impressed me when gaming at 1440p and 4K

    • @MeteorMetro
      @MeteorMetro 1 year ago +2

      who buys arc???

    • @sadiporter2966
      @sadiporter2966 1 year ago +3

      Intel will be launching some lower-end GPUs soon, so we will see updated benchmarks in those reviews.

  • @xpreame3406
    @xpreame3406 1 year ago +2772

    What's even more crazy is how the A770 can hang with these cards.

    • @markmanderson
      @markmanderson 1 year ago +225

      fingers crossed for Battlemage!

    • @neckbeardpig279
      @neckbeardpig279 1 year ago +172

      True, if it wasn't for the Resizable BAR thing, the A770 would be an unbeatable budget king.

    • @Elkarlo77
      @Elkarlo77 1 year ago +2

      @@neckbeardpig279 I have an A770 on a B450 with ReBAR. Great card; the drivers are... okay. I quit Nvidia a long time ago and have run AMD cards since the first 5000 series, so they're okay-ish for me, but I won't recommend them to others. The card improves with every update, and it's finally somewhat stable since the last one. Back to the topic: the A770 would be a great budget card for an older PC, but performance tanks with ReBAR disabled and the drivers are still not good. Battlemage will show the potential, for two reasons: by then, "5-year-old PCs" will include ReBAR-capable machines, and the drivers will have matured. Interestingly, the combination of an old AMD board + Intel GPU is currently the most feasible home for the A770: you can get an AMD Ryzen 5 5600 for 110-120 bucks, and if you have a B450 board (released in 2018), you get ReBAR and full capability with an Intel GPU. So an upgrade to a Ryzen 5 5600 and an Intel Arc A750 for less than 400 USD/€ is a great deal with great performance. You also need ReBAR for good performance with an RX 6600/6650 XT, and at comparable performance the Intel card is 50 bucks cheaper. (The A770 is in my workstation, not my gaming PC.)

    • @steveleadbeater8662
      @steveleadbeater8662 1 year ago +160

      Intel's next-gen cards could cause real market movement - rumoured 3080 performance at $250-$300 (although I'll believe it when I see it). Maybe Intel can do what AMD has refused to do.

    • @anotherkakarot5578
      @anotherkakarot5578 1 year ago +25

      Team BLUE!!!

  • @ValiantNoob
    @ValiantNoob 1 year ago +213

    I was a bit taken aback when I saw the price of the 4090 was around 1600-2000 dollars. Then I remembered the days when people were buying 2 or 3 $1200 Titans to SLI in their enthusiast builds. The enthusiast budget hasn't really moved from that $2000-ish range. However, seeing 70-series cards cost $600 minimum is insane.

    • @kahlernygard809
      @kahlernygard809 1 year ago +2

      Already dropped to 500; by Black Friday it will be sub-400

    • @pandemicneetbux2110
      @pandemicneetbux2110 1 year ago +11

      What infuriates me so damn much about this is that, first of all, the tech industry takes everybody's grandma for a ride, getting ignorant, naive parents to overpay for crap (I really don't care if smug hipsters get ripped off; iPhones and MacBooks are sometimes a victimless crime), but also because if you get enough of the consumers to be stupid and accept being fucked in the ass, it fucks everybody over equally. So I really don't have the free-market option of just switching to AMD if Nvidia gets simps to pay $600 for what's basically a 4060, because AMD is just going to do the same. Case in point: 80-level performance is now $900. No, I don't care that they called it "the 7900 XTX". It's like Steve at Gamers Nexus said: if you just move the entire gauge, you didn't improve performance; the gauge turned but the reality stays the same. And in this case, 80/800 XT-level performance that cost about $650/700 two years ago is now literally $900. It's not a 7900 XT; it's an RX 7800. They gave their RX 7800 a different name. So because Nvidia buyers are all such cucks but they outnumber everyone else, you get this wave of people not even smart enough to make a Dunning-Kruger remark tipping the scale by sheer mass into market stupidity. And it isn't like this is healthcare or heroin or military gear, where there's some broader reason the consumer literally needs it to survive; it just exemplifies this notion companies have that gamers are the market least deserving of being taken seriously or respected, and then gamers do stupid shit like this that makes those execs seem like they're right. I mean, why would you respect someone who pays that kind of money for a graphics card and is too dumb to notice what they're doing?
      So I can't even do anything about that beyond buying used at this point, which is mainly a problem because I didn't want to replace my PSU, and something like the 6900 XT uses too much power (not that it matters; Lovelace efficiency sucks and so does the 7900 XTX). I'm so mad about this because in 2020, when Nvidia was being insultingly overpriced (imo; I had no idea the average gamer is this dumb), I at least had the option to switch brands, and it was fairly competitive. I still like my 5700 XT, and it still runs everything but Cyberpunk at 1440p (the game itself is just kind of a mess; I think they finally fixed the bugs and mostly optimized it, but you're still talking like 35-45 fps on what's basically a 1080 Ti), so it isn't like I'm suffering, but soon enough this may be a problem for my system. If the corpos are just renaming everything while scamming me, I'm basically just going to keep doing what I did all pandemic long and sit things out until my graphics card breaks.
      But the other thing about it is, it's not just the price being worse. 8gb on a 4070 is so bad I literally keep forgetting it; it's like my brain does some trim function, as if 8gb had to be in error, but no, it really is an 8gb card somehow. That's not even budget at this point, dude. That's like if they put 2gb on the GTX 1060. It's clearly not enough by modern standards. I play at 1440p and it already maxes out; in fact, I sometimes question whether some of the texture pop-in I got in Cyberpunk is due to exceeding 8gb. There are games right now that use over 8gb on ultra settings at 1080p. So it's not just that it costs substantially more; you're getting a substantially worse product too, with worse bus width ON TOP OF the inferior VRAM. So the new xx70, the slot that last gen was practically intro-to-4K, high-refresh 1440p, midrange territory, is two years later basically a budget 1080p card, and even then it's questionable how long it will last at 1080p. Bear in mind this was the memory config on the RX 480; this VRAM standard is nearing a decade old now. It also pisses me off that literally my number one pet peeve about games, smudgy bad texture work, still stagnates solely because of Nvidia's self-defeating greed, which also makes their RT somewhat useless when cranking up RT just hits VRAM limits anyway. Textures are literally the most noticeable thing to me, so at least I have AMD to thank for pushing that all these years so devs can keep raising the standard. I just hope enough devs finally stop caring about Nvidia users' whining, and those users learn to deal with the fact that they're paying premium prices for budget hardware, and therefore should either stop buying that hardware or accept that they need to turn their graphics settings down, because the GPU they chose to buy is pretty crappy and they don't get that part of the premium experience. Like, I'm not gonna buy an AMD GPU and then review-bomb some hard-working dev team just because I can't understand why my fps is worse with RT ultra on; these kids need to learn the same about their bad textures and stuttering and crashing with 8gb.

    • @user-kf2wc6ly4j
      @user-kf2wc6ly4j 11 months ago +6

      ​@@pandemicneetbux2110 damn, dude

    • @philbertius
      @philbertius 11 months ago +1

      @@pandemicneetbux2110 Enjoyable rant; you should edit it and post it somewhere for internet bux.

    • @whoknows8864
      @whoknows8864 11 months ago +2

      @@pandemicneetbux2110 You seem genuinely knowledgeable, but the 4070 has 12GB of VRAM; it was even stated in this video.

  • @rshaart4810
    @rshaart4810 1 year ago +24

    Honestly, I'd love to see you use DCS as a comparison for some of these cards, simply because it's a free-to-play, full-fidelity flight sim and it would be good to see some benchmarks.

    • @pandemicneetbux2110
      @pandemicneetbux2110 1 year ago +2

      What I really like about DCS is how well it proves the point about system RAM. That game is a monster when it comes to memory; I think it was the first game to recommend 32GB. Not only that, you can find benchmarks comparing different RAM speeds and timings, and the difference between bottom-of-the-barrel 2133MHz and 4400MHz premium RAM kits was literally like the difference between a 6600 XT and a 6800 XT. That game is apparently super sensitive to RAM, and intensive af. I do like it when studios really push it to the limit. Did you know Age of Empires actually has a 2560x1440 display resolution? Microsoft left a resolution in it back in the 1990s, when every screen was 640x480; pure wizardry. So I don't think it's a bad thing at all to make a game that the most high-end hardware barely takes advantage of, and that's why I like the Overdrive fully path-traced tech demo of Cyberpunk. I just don't like it when manipulative, lying corpos go and bully tech journalists for treating it like a tech demo, acting like it's the only performance that counts. I think I would've received RT in general much better if they'd introduced it the way they're introducing fully path-traced scenes, and emphasized performance otherwise. Or it's like how I was cooler on receiving 3D V-Cache, because it's a one-trick pony, so I may get a 5950X anyway. The average gamer doesn't seem to understand that not all games use system resources the same way, and DCS World vs Civ 6 vs Cyberpunk vs GTA V is a really good example: engine limits and resource utilization vary so much that no one game is really indicative of the fuller experience. RAM speed might not matter in lots of games, but in some it does, and it's easier to make an informed choice when the answer to "does it matter" is "sometimes, but when it counts, it might count a lot." It's like why the 5800X3D is literally superior to the 12900KS in certain titles but loses pretty badly in others. It isn't always that one is just "faster" or "better"; it performs differently depending on engine and title.
      Honestly, I'm a bit shocked how much compatibility there is at all sometimes. Maybe for all my bitching, it would be so much worse in tech if we had a dozen different companies. I guess it's really nice for GPU coolers, but not having standards is such a nightmare; that's why USB existed in the first place, I guess. Just crazy the variety we have, and how much performance can vary sometimes. It's part of why the specs listed by brick-and-mortar companies, and sadly even Steam system requirements, are so useless; "i5, 8GB of RAM" is practically meaningless to me at this point, and RAM is definitely an area where SIs like to cheap out. DCS is a great example of why that matters.

  • @penguindrummaster
    @penguindrummaster 1 year ago +419

    I really wish more outlets would remind audiences of how expensive GPUs have become. Nvidia -60 cards used to be budget-friendly at $199-249 USD. These days, a 3060 goes for $300-350, which is the price point of past -70 cards. It used to be that only novelty cards like the TITAN or the ARES dual-GPU cards cost $1,000 USD or more, but now that has become normal for flagship-level cards.
    It just feels really unfair how far the market has shifted from what was expected just 10 years ago

    • @sophieedel6324
      @sophieedel6324 1 year ago +41

      All these outlets are bought off; they won't give bad reviews because they won't get review samples anymore. PC gaming is just a complete ripoff nowadays. If Nvidia or AMD think I am going to pay $600 for a midrange GPU, they are out of their minds. My next "upgrade" will be the next Nintendo Switch.

    • @JeffEdmondson
      @JeffEdmondson 1 year ago +28

      The advantage of being a gamer since the 1990s is that I don't need a reminder; I remember very well.

    • @aj0413_
      @aj0413_ 1 year ago +50

      Inflation, economic depression, supply chain issues, etc... I personally think it's crazy how people keep going "Back in the day..."

    • @DUD3H0WD4R3U
      @DUD3H0WD4R3U 1 year ago

      @@aj0413_ the only crazy thing is how gullible you are

    • @jorgeramirez7761
      @jorgeramirez7761 1 year ago +16

      ​@@aj0413_ I was just thinking this

  • @alejorag
    @alejorag 1 year ago +700

    Seeing how well the 6800 XT is doing right now, I'd love a video on how last-gen cards from both AMD and Nvidia are doing now vs at launch!

    • @TheSahil713
      @TheSahil713 1 year ago +27

      My 6800xt is really great rn.

    • @cardassa
      @cardassa 1 year ago +12

      I am still using a 5700 XT. It's not bad, and I actually benefit from AMD's FSR feature, which allowed me to play the Diablo 4 beta relatively smoothly.

    • @danieldaniel1210
      @danieldaniel1210 1 year ago +1

      You forgot that the 6800 XT draws more power than a 4070

    • @t1aoshibby
      @t1aoshibby 1 year ago

      Strix LC 6800xt here and it's a killer card.

    • @t1aoshibby
      @t1aoshibby 1 year ago +2

      @@danieldaniel1210 more powah means more better?

  • @Ragmon1
    @Ragmon1 11 months ago +104

    Would love to see an energy-bill comparison; the RTX 4070's 185 W peak power draw is very compelling compared to the 6800 XT's ~290 W.
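The ~105 W gap between the two peak figures translates to a modest but real difference on the bill. A back-of-the-envelope sketch (the hours per day and electricity price are illustrative assumptions, not figures from the video; adjust both for your own usage and local rate):

```python
# Rough annual energy-cost gap between a 185 W RTX 4070 and a ~290 W RX 6800 XT
# under gaming load. HOURS_PER_DAY and PRICE_PER_KWH are assumed values.
HOURS_PER_DAY = 4
PRICE_PER_KWH = 0.30  # e.g. USD per kWh

def annual_cost(watts: float) -> float:
    # watts -> kWh per year -> cost per year
    kwh_per_year = watts / 1000 * HOURS_PER_DAY * 365
    return kwh_per_year * PRICE_PER_KWH

gap = annual_cost(290) - annual_cost(185)
print(f"~${gap:.0f} per year")  # roughly $46/year under these assumptions
```

So under these assumptions the efficiency gap is worth tens of dollars per year, not hundreds; whether that offsets a price difference depends on how long you keep the card.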

    • @f.ferenc88
      @f.ferenc88 11 months ago +15

      Imagine undervolting the 4070... I've seen people with the 4070 drawing 130 W under load.

    • @TheSelfAwakening
      @TheSelfAwakening 10 months ago +2

      I mean, if I had a 6800 XT I would definitely undervolt; as long as it's at max 250 W, I'd be fine with it...

    • @lachlanokeefe8020
      @lachlanokeefe8020 10 months ago +2

      @@TheSelfAwakening you can also undervolt the 4070

    • @f.ferenc88
      @f.ferenc88 10 months ago +12

      @@lachlanokeefe8020 I've had the 4070 for a while now, and sure enough, consumption under load is 130 W... It draws as much as most CPUs.

    • @bradhaines3142
      @bradhaines3142 10 months ago

      I just bought a 6950 XT, planning on cranking it down to around 250 W

  • @eddeh0772
    @eddeh0772 1 year ago +13

    Great review! Clearly a lot to be a bit grumpy about, especially the cost increase relative to the performance boost since last gen, but in the right scenario this still feels like a good option in niche circumstances. My brother is on the now-ancient 4790K with a 1070 Ti, holding out for DDR5 to be more accessible and standard before upgrading his CPU, mobo, etc., but he is desperate for a better GPU to carry him through. He's obviously going to be CPU-bottlenecked a lot, but even so, a 3080 for $600 that sounds like it will work with his 750 W supply feels like a great quality-of-life improvement until he can sort out the rest of his rig. His rig is actually a hand-me-down from me; I upgraded everything after I was lucky enough to get a 3080 FE back in 2020, but there was a short window where I just slotted the 3080 into that 4790K setup, and it definitely ran things a WHOLE lot nicer. This feels like a perfect bridge GPU in that (very particular) case.

  • @hex_off
    @hex_off 1 year ago +756

    Dang, the RX 6800 XT's performance is now REALLY competitive. AMD is doing a great job updating drivers, turning a good product into an incredible one for free!

    • @JasonEllingsworth
      @JasonEllingsworth 1 year ago +71

      Yeah, I have people saying I'm crazy for thinking the 6800 XT performs on par with a 3080 Ti now, but it does.

    • @Freestyle80
      @Freestyle80 1 year ago +19

      So from now on, when a game releases as crap, full of issues, and the devs upgrade it for "free", I expect you to say this too, right?
      Unless of course it's not AMD, then it doesn't matter 😂😂

    • @teapouter6109
      @teapouter6109 1 year ago +144

      @@Freestyle80 The 6800 XT didn't launch as crap. It launched as a 3080 competitor, and now it's a 3080 Ti competitor.
      If a game launches as a 9/10 and then 2 years later becomes a 10/10 for free, then yeah, we're all going to celebrate that.
      Look at Terraria: over a decade of free support adding tons of new features and content.

    • @kcgunesq
      @kcgunesq 1 year ago +3

      I picked one up a few weeks before Thanksgiving for $540, and have no regrets picking the 6800 XT.

    • @MarcABrown-tt1fp
      @MarcABrown-tt1fp 1 year ago +8

      Snagged a factory-OC'ed ASRock RX 6800 XT Taichi running at 2.49 GHz for $520; so far it's running better than many RX 6900 XTs. I feel lucky!

  • @SinsOfScience
    @SinsOfScience 1 year ago +181

    I've only been watching LTT videos for a few years, but I have to thank you for finally highlighting the product featured in the video, instead of just tossing it in a list. Even just that one detail makes it so much easier for me to compare what's being featured to what it was tested against, instead of having to pause the video every few seconds to read each graph in its entirety. So, thank you.

  • @drewwhited2088
    @drewwhited2088 9 months ago +70

    Being a fan of this card feels like being a Chad, but here goes... IT'S A GOOD CARD. When he said "shockingly stable", he's not kidding. It runs like a sports car: all the fat has been trimmed off, and its efficiency is almost beautiful to watch under load. For it to contend with a 3080 at $600 MSRP, especially after last season's crypto fail and the chip shortage, is also pretty astounding. The AI and performance-boost patches are also worth considering. I mean, let's be honest here: what does your average gamer/worker really want in a card? Raw POWER, or sleek performance within a reasonable price range? I think this is a win. Check back on this in a few years; I won't edit this OP lol

    • @waltert9
      @waltert9 8 months ago +17

      Just bought a 4070; I'll also be giving an update in 5 years

    • @GizyBeMizy
      @GizyBeMizy 7 months ago

      how's it working so far?
      @@waltert9

    • @plain-bagel
      @plain-bagel 6 months ago +9

      As someone with a 1080 just looking into the available upgrade options, I think you’re right. It’s hilarious to wade through all the negativity and entitlement online, but that’s largely par for the course for social media these days. Sure, I’d prefer it to be cheaper, but ultimately the power and efficiency of the 4070 will give a huge performance boost over my current card.

    • @afscrew123
      @afscrew123 5 months ago +1

      @@plain-bagel I'm in the same boat, 1080 here. Was looking at a 4070 Ti for $1000, but realized I don't need that extra performance; however, I can't go under $700 for a 4070 in Sweden atm, even on Black Friday... oh well

    • @Haitch_Kay
      @Haitch_Kay 5 months ago +3

      The 4070 really is in a bad spot, because the card itself *IS* very good, but the starting price was fucked and now everyone just blanket-dislikes the 40 series. As a $500-$550 upgrade from a previous-gen, lower-spec card, it's *fine*.

  • @kennethd4958
    @kennethd4958 1 year ago +12

    With the launch of every new GPU I just get more and more happy I bought a Sapphire Pulse 6800 in 2020 for MSRP.

  • @0mnilegacy
    @0mnilegacy 1 year ago +818

    I thought I was supposed to be excited by the 4070 benchmarks, but I was shocked to see how well the RX 6800 XT performs now.

    • @zackmandarino1021
      @zackmandarino1021 1 year ago +62

      It always performed well. On release it was within plus or minus 5% of the 3080 in regular rasterization performance; now it competes near the 3080 Ti thanks to its fine-wine drivers. People were just so focused on the 30 series that even when we finally had on-par competition in the GPU market, they still bought 30 series.

    • @androidx99
      @androidx99 1 year ago +21

      @@zackmandarino1021 The mind share is strong. I'm trying very hard to go AMD, and I love the fine wine that is historically proven, but that fucking little voice in my head always nudges me towards Nvidia, and I hate it. I really wanted AMD to knock the RX 7000 series out of the park and take the performance crown with their MCM design, but they came up short of the hypothetical performance targets that were possible for the 7900 XTX, which would have had a 5-10% performance ADVANTAGE over the 4090. They ended up coming in at 80-90% of it in rasterization performance, so nearly 15-25% below their own targeted goals. They are at least less expensive, but not by a large enough discount that gamers are happy and the cards are flying off the shelves like they had the potential to do. I get it: oversupply, market demand, and all that. But it really was a missed opportunity to capture the high-end market and mind share.

    • @JasonHendry22
      @JasonHendry22 1 year ago +17

      That's probably because the 4070 is a 60-series or 50-series card with old 80-series pricing.

    • @kius5033
      @kius5033 1 year ago +8

      @@androidx99 This is exactly why I personally see Intel as the future. Why compete for the best card when you can compete with last generations cards for half the price? I understand that many people want the best card, but for almost everything the best card is way overkill.

    • @JasonHendry22
      @JasonHendry22 1 year ago

      @@kius5033 Sadly I'm like one good BF game away from spending 1200 on a 4080. But with no BF and no GTA 6, no need

  • @StopBlurryGames
    @StopBlurryGames 1 year ago +262

    Thanks for including the A770! We need to keep including them in tests.

  • @DeviniteHD
    @DeviniteHD 1 year ago +73

    I always switched GPUs when I noticed my old GPU wasn't holding up anymore. I still remember how I tried running Assassin's Creed Unity on my GTX 760 and it ran like crap at 1080p. I then switched to the 970, which was a terrific card for its price and served me for many years. Unfortunately, it broke and I then bought the 1080, because the price dropped massively when the RTX 20 series got announced. I'm using that 1080 to this day and of course, you have to do some tweaks. But so far, there hasn't been a game where I felt the urge to change my GPU because it's not holding up anymore. I'm currently playing Borderlands 3 at 1440p Medium to High settings and it runs at about 80-100 fps on my 120Hz TV. Maybe things will change once devs leave the old consoles behind, as PS5s and Xbox Series X are now widely available. But for now, the 1080 is still good. Which is a shame, because I actually wanted to wait for the 4070. But when I saw the price, I immediately lost interest. Thanks NVIDIA.

    • @ISirSmoke
      @ISirSmoke 1 year ago

      Your GPU would not hold up to Metro Exodus Enhanced Edition

    • @rugbyf0rlife
      @rugbyf0rlife 9 months ago

      I've been running my trusty 970 since that card launched, upgrading from a 660 TI.
      Every GPU generation I've been looking to upgrade, but I've always held back.
      2080 due to RTX being experimental and overpriced.
      3080 due to supply issues with covid
      4080 due to the price and the 4090 being too tempting in comparison... though I never bit the bullet.
      I just started doing UE5 game development and my 3.5GB VRam is crippling me on my 970, so I'm finally pulling the trigger.
      Honestly the 970 is such a legendary card and has kept up at 1080p and even some games at 1440p for so long. But now AAA games and technical programs just destroy it. Gonna be a hell of a leap for me, and will finally get to use my 144hz 1440p monitor to its full potential.

    • @NSHorseheadSD70
      @NSHorseheadSD70 5 months ago +1

      ​@rugbyf0rlife I've been using my 970 for 9 years. It's been an absolute machine. I'm kinda sad to replace it actually but it's time for a total overhaul. I'm hoping to still find a use for it and the entire old machine in some capacity because it all still works

    • @cavemann_
      @cavemann_ 4 months ago

      @@rugbyf0rlife I'm still running a 2GB GTX 960 and you basically perfectly described what I've been going through as well. The jump is going to be insane, but hey, at least the 960 can still handle most AAA games... aside from Cyberpunk 2077 and similarly demanding titles 😅

  • @M3dicayne
    @M3dicayne 1 year ago +1

    For some strange reason I noticed the driver development over time with my RX 6900 XT. It's like it got quicker the older it became...
    I reapplied thermal paste, as the card got hot very fast due to the manufacturer not applying it correctly. That was the first big jump, with the card no longer thermal throttling all the time.
    Then, the game performance increases... I can play Far Cry 6 with everything maxed, no FSR but with raytracing, at around 100 fps at 1440p. That's damn nice.
    Same with The Callisto Protocol. With raytracing and FSR in Quality mode, I had around 60 fps.

  • @Larry_Owlmann
    @Larry_Owlmann 1 year ago +853

    Thank you NVIDIA for convincing me to pick up a 6800 XT for my next build.

    • @SFearox
      @SFearox 1 year ago +17

      What card do you have now? The 6800 XT seems like a nice upgrade for a 1080ti.

    • @Sam-xg2jn
      @Sam-xg2jn 1 year ago +31

      Yeah, you can get it for a lot cheaper than the 4070 (at least in Europe)

    • @nightfr09
      @nightfr09 1 year ago +38

      I've got a 6900xt and it's been great.

    • @photlam9769
      @photlam9769 1 year ago +28

      I just got myself a 6700xt up from a 2060 and it feels like every game runs like butter

    • @RiddleTime
      @RiddleTime 1 year ago +20

      @@nightfr09 6950 XT here, picked it up for about $700. Thing is flying even when undervolted.

  • @theGamer93
    @theGamer93 1 year ago +24

    5:17 - Linus Kink Tips

  • @adriendecroy7254
    @adriendecroy7254 9 months ago +4

    One thing that deserves more focus is power dissipation. The 40 series looks a lot more efficient in terms of power consumption than the 30 series. So while the 3080 may have a small edge in performance over the 4070, it uses 50-100% more power to do it.
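The comment above can be sanity-checked with a quick sketch. This uses Nvidia's published total-board-power specs (320 W for the RTX 3080, 200 W for the RTX 4070) rather than measured draw, and assumes roughly equal average performance between the two cards, per the video's framing:

```python
# Rough perf-per-watt comparison using Nvidia's official board-power
# specs; actual in-game draw will differ, so treat this as an estimate.
tgp = {"RTX 3080": 320, "RTX 4070": 200}  # watts, total graphics power

extra_power = tgp["RTX 3080"] / tgp["RTX 4070"] - 1
print(f"The 3080 draws ~{extra_power:.0%} more board power")  # ~60%
```

So by spec sheet the 3080 sits at the low end of the commenter's "50-100% more power" range; transient spikes and partner-card power limits are what push it higher in practice.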

  • @suspectedcrab
    @suspectedcrab 1 year ago +24

    I upgraded from a 1080 to the 4070. I didn't mind lowering settings over the years, but I wanted to go back to playing modern games in 4K since the release of the 3080 which was never in stock. Regardless, I saved $100 waiting, and the 8 pin requirement saves time.

    • @mjtomsky6387
      @mjtomsky6387 11 months ago +2

      Are you happy with the 4070? Looking forward to buying one too

    • @LutherLaPlace
      @LutherLaPlace 11 months ago +2

      Bro same I wanted to buy a used 3080 to replace my old 1080 but I wasn't feeling like changing my PSU

    • @suspectedcrab
      @suspectedcrab 11 months ago +1

      @@mjtomsky6387 Yes, specifically the Asus Dual OC model. I think it has one of the best coolers of all time. It stays at 33°C at idle without the fans even spinning. Great energy efficiency for a card that is 3080-level.

    • @mjtomsky6387
      @mjtomsky6387 11 months ago +1

      @@suspectedcrab Aight thank you, do you have multiple monitors?
      If you do, may I know the resolution and if your gpu is struggling with these?

    • @adilhachedmoreno7852
      @adilhachedmoreno7852 9 months ago

      could you tell me what CPU you use with the 4070 pls ?

  • @Jamie-cz2xu
    @Jamie-cz2xu 1 year ago +489

    A month ago I bought the 6800 XT and am quite pleased with it. Especially after seeing this

    • @Lucromis
      @Lucromis 1 year ago +18

      I’ve had my 6800 xt for a while now and it’s been a great buy. No issues, plenty of performance, and vram. I don’t feel like it’s too much money or going to be obsolete soon.

    • @RichLifeStories
      @RichLifeStories 1 year ago +14

      I'm thinking of upgrading from my RTX 3060 Ti and getting the RX 6800 XT. I can see 8GB of VRAM isn't going to cut it. It's fine if I can still play at 1080p, but I'm already seeing that games are going to require probably 10 to 12GB of VRAM. Also, the RX 6800 XT is a more powerful GPU, and sometimes I play games at 1440p or 4K.
      What I'll probably do is swap out the RX 580 4GB in my wife's computer and put the RTX 3060 Ti in that, and then my main system will get the RX 6800 XT.

    • @Jamie-cz2xu
      @Jamie-cz2xu 1 year ago +2

      @@Lucromis My issues were coil whine, drivers and micro stutter. But I found that undervolting the card got rid of the coil whine, switching to DisplayPort got rid of the micro stutter, and saving my overclock/undervolt settings sorted out the drivers. So in the beginning I was semi-regretting my purchase, thinking I should've got the 3080, but now I am experiencing perfect gaming with no issues at all.

    • @kimjongpoontv69
      @kimjongpoontv69 1 year ago +7

      6800XT was my favorite GPU I’ve owned, moved on to the XTX but 6800xt still holds up really well 😊

    • @Flat_6
      @Flat_6 1 year ago +1

      Same. And after seeing the issues the 3070/3080 were having with their VRAM in modern games, I knew I made a good choice.
      I was also laughing at the 1080 performance comparison because that's what I upgraded from.
      The only gripe I have is that I can't get consistent updates on AMD Adrenaline. Not sure what the issue is, but considering most of the driver updates now are focused on the 7000 series, it's not a big deal. I do miss GeForce Experience though. But overall I couldn't be happier.

  • @mihalis1010
    @mihalis1010 1 year ago +132

    I miss the days when the 70 tier was just the 80 tier from the last generation at a lower price, possibly with a different RAM amount. Really kind of makes me feel better about the 2080 I bought off a friend for $150 a few weeks ago.

    • @jacobturk7543
      @jacobturk7543 1 year ago +13

      Good snag. I got my 2080 new and it still performs like a champ.

    • @Selsato
      @Selsato 1 year ago +8

      Damn that is a good deal

    • @Aerobrake
      @Aerobrake 1 year ago +4

      3080 Ti for 75% of MSRP, factory sealed. Pretty happy about it.

    • @iurigrang
      @iurigrang 1 year ago +8

      Uh? Isn’t the point of the video that this is mostly 3080 at a lower price?

    • @Jacob-hl6sn
      @Jacob-hl6sn 1 year ago +3

      Nope, the 70 usually equaled last gen's 80 Ti performance

  • @BrendaEM
    @BrendaEM 1 year ago +5

    You were spot-on about the GTX 1080 people holding on to their cards. I am still using mine for gaming, but I am looking at GPUs for rendering in Rhino3D/Cycles. I find the new card prices a bit much.

  • @ExampleOfPlay
    @ExampleOfPlay 8 months ago +39

    Kinda crazy how I’m in the same boat with so many others. Got a 1080 when it came out and it’s still going strong, but the 4070 finally seems like the time to upgrade. Planning to hang on to this one for quite a while as well!

    • @gamingsfinest3356
      @gamingsfinest3356 7 months ago +3

      I just bought a 4070, upgrading from a 1080 as well lmao, gonna set my dad up with my 1080 so we can play some games together.

    • @michaelyoutube3722
      @michaelyoutube3722 5 months ago +1

      @@gamingsfinest3356how’d it go?

    • @johnhawthorne4716
      @johnhawthorne4716 5 months ago +6

      His dad owned him on Fortnite

    • @JonathanGonzalez-gh3xr
      @JonathanGonzalez-gh3xr 4 months ago +1

      My 1080 Ti was a fucking beast. Sadly I came across a nasty deal for the 4070 and bought it for $500 flat, no tax, so I just had to bite the bullet

  • @0NicoLoco0
    @0NicoLoco0 1 year ago +1059

    I like how Linus and the guys over at LTT have taken benchmarking to such a level that we only need 1 to be able to compare. Including all the way back to the 10-series is brilliant, guys. Keep this up.

    • @Donnerwamp
      @Donnerwamp 1 year ago +80

      You should still look at numbers from other people. Everyone makes mistakes, all people have a price, everybody lies. Linus himself says that you shouldn't trust them blindly, and he's more than right with that.

    • @oaw117
      @oaw117 1 year ago +43

      @@Donnerwamp if he himself lies then it's possible he is lying about lying. Makes you think.

    • @bigbay1159
      @bigbay1159 1 year ago +24

      ​@@Donnerwamp yeah the point being blindly believing anyone is completely foolish

    • @BoSaGuy
      @BoSaGuy 1 year ago +7

      @@oaw117 no, no NO… I will not waste 2 hours thinking about this now

    • @Skozerny
      @Skozerny 1 year ago +19

      Except other reviewers have higher quality benchmarking. Just flat out.

  • @japeshmarwaha3406
    @japeshmarwaha3406 1 year ago +1607

    Feels like a 3080 renamed 😂

    • @jaydeep-p
      @jaydeep-p 1 year ago +180

      With almost half the power draw

    • @Madblaster6
      @Madblaster6 1 year ago +90

      @@jaydeep-p and frame generation

    • @EonNShadow
      @EonNShadow 1 year ago +67

      Honestly with my psu that might be the selling point for me personally.

    • @YoukoZuki
      @YoukoZuki 1 year ago +56

      People who are going to buy a 3080 now have a better "3080". Meanwhile, people who bought a 3080 last month now suffer from buyer's remorse.

    • @DragonOfTheMortalKombat
      @DragonOfTheMortalKombat 1 year ago +25

      @@jaydeep-p And a higher price and less bandwidth🤦🤦

  • @Nogha12
    @Nogha12 1 year ago +20

    For me, my 2070 still runs above 60 FPS in every normal game I want to play, but the area I would really want to have better performance in is VR. It would be nice if VR stats were included in the videos, since that's really the only thing that makes me want a better graphics card.

    • @potvinsuks8730
      @potvinsuks8730 11 months ago +2

      It's definitely not the most accurate, but I use the 4K stats for VR. Unfortunately, VR GPU testing should have its own production, as the variables in resolution differ considerably between headsets (even for high-end models), as I'm sure you know. But couldn't agree more! Need more VR data, especially nowadays!

  • @jayloopgamingchannel4926

    Bought a GTX 1070 back in 2016. The card has performed beautifully for my needs: with some settings sorcery, I managed to run Assetto Corsa Competizione in native 4K at around 50~60fps with medium/low settings on that card. Also used to play VR with it (Valve Index) and it ran fairly decently in most cases. Recently bought a Zotac 3070 Ti brand new for €560, which is a nice deal I think (the 1070 was €440 in 2016, so the price is almost bang on considering recent inflation).

  • @Youtube_Heros_Police_Force
    @Youtube_Heros_Police_Force 1 year ago +415

    This makes me feel even better about upgrading to a 7900 XT (at $800) instead of waiting to see how the 4070 performs.

    • @redpilljesus
      @redpilljesus 1 year ago +31

      It just isn't fast enough for the cost - especially with the efficiency of the 40 series.
      I'm skipping both, however. Until they make the 4080 $800, nothing is worth upgrading my 6900 XT over.

    • @josephjoestar515
      @josephjoestar515 1 year ago +99

      @@redpilljesus Why would you even want to upgrade your 6900XT? That card is fast as hell and I don't see a reason why you would need to upgrade it so soon.

    • @kani-
      @kani- 1 year ago +6

      Same, i got a founders 3080 for $700 during peak rona era, and i really don't feel like the 40 series is making any compelling arguments for an upgrade (which I'm happy about, no FOMO)

    • @LetsRocka
      @LetsRocka 1 year ago +2

      Yeah, just wait until nvidia crash and burn with these prices...

    • @ARCbirdman
      @ARCbirdman 1 year ago +1

      @@josephjoestar515 idk man. 1st world things. I have a 3090, but for content creation I've thought about upgrading because AV1 encoding is 😩

  • @user-xu6sx8ix6p
    @user-xu6sx8ix6p 10 months ago +1

    I have a GTX 950. I upgraded the rest of my setup in January 2022, and I bought the 4070 Founders about a week ago. Haven't been home yet, but from what I'm playing and what I'm used to, I think I'm gonna love it

  • @johndododoe1411
    @johndododoe1411 11 months ago

    Still building my systems around single-slot cards and integrated GPUs. The only dual-width cards I've seriously considered are cards with many rear connectors and the old hardcard that fit a 3½" HDD on the card itself.

  • @killzone866
    @killzone866 1 year ago +694

    This is basically what a 4060 should be.

    • @DragonOfTheMortalKombat
      @DragonOfTheMortalKombat 1 year ago +119

      It even has a 192 bit bus just like every other 60 series.

    • @Maxrepfitgm
      @Maxrepfitgm 1 year ago +136

      The 4070 Ti was supposed to be the 4070, and the 4070 was supposed to be a 4060. Skipping this generation. Not paying 600 for a 4060.

    • @matteominuto230
      @matteominuto230 1 year ago +64

      In a world where nvidia isn't as greedy as they are today this card would be the 4060 and it'd be priced appropriately. Not here tho

    • @Alex-wg1mb
      @Alex-wg1mb 1 year ago +3

      I am glad that I got a 4070 Ti for 830 euro. Not counting the VRAM, it is faster than a 3090

    • @Victicat
      @Victicat 1 year ago +53

      cant wait for an rtx 4040 with a 64 bit bus which will cost 369.99 and will be on par with a 2060, truly impeccable value

  • @Lexingtonomega
    @Lexingtonomega 1 year ago +178

    As someone who is still running a GTX 1080, I loved seeing how the other hardware compares XD, gives me a good baseline for upgrading

    • @phaolo6
      @phaolo6 1 year ago +14

      GTX 1070 here. It's still good for my current PC & monitor, but I can't upgrade when the 3070 is still 600-1100 € in my country 😔

    • @bartolomeothesatyr
      @bartolomeothesatyr 1 year ago +12

      My 980Ti is old enough to be in second grade and I haven't yet felt particularly compelled to upgrade it. 45-55 fps in Cyberpunk 2077 at 2560x1080 with all the non-RT eye candy cranked to ultra is still good enough for me, for now.

    • @ntzt2150
      @ntzt2150 1 year ago +1

      @@phaolo6 Just get a used one. Used Radeon 6000 series cards are way cheaper than Nvidia 3000 series where I live. Just gotta be smart about it.

    • @sedawk
      @sedawk 1 year ago +12

      My fellow 1080 brothers! Stand strong!

    • @phaolo6
      @phaolo6 1 year ago +2

      @@ntzt2150 I kinda dislike buying used tech, because it's a gamble. You never know what crap they did with it. Maybe they stressed it to the limit or dropped it _(wink)_
      But well, maybe I'll have to adapt..

  • @Felix_Schmitz
    @Felix_Schmitz 8 months ago +2

    THE GTX 970 WASN'T A 300W CARD. DATA ERROR.
    It was less than 200W.
    780 Ti performance at that efficiency was a big leap

  • @alexspata
    @alexspata 1 year ago +1

    Awesome review, guys!
    Given the fact that I own an Intel i5 11400F + 1660 Super build, I was hoping that prices would drop so I could upgrade to a 3060 Ti.
    But a good version like Asus or MSI is still insanely priced at over 500 euros, and to be honest, that is way too much.

  • @hongphucvu8599
    @hongphucvu8599 1 year ago +819

    I am impressed how much performance AMD could squeeze out of their cards through software updates.

    • @gokublack8342
      @gokublack8342 1 year ago +123

      I keep telling people Nvidia is only better at the very top end, i.e. the 4090. If you don't need the very tippy top in terms of performance, AMD is better (also, RT is very overrated; I've never noticed much benefit from using it). If you're on a budget, AMD demolishes Nvidia. Edit: Also, to you Linux users out there, AMD is generally better than Nvidia due to better driver compatibility

    • @u2popmart
      @u2popmart 1 year ago +44

      I’m still happy with my 6700 XT it’s not even overclocked but meets all my needs

    • @jameysummers1577
      @jameysummers1577 1 year ago +36

      @@u2popmart And with AMD finewine you may have it for a while. My brother is STILL using the 8GB RX 480 Red Devil that he bought at its launch and it's still going strong. Now, he is cheap and it is very old, but his card is a great example of how AMD cards age. His card is impressive. I am the idiot who went out and bought an RTX 4080.

    • @dycedargselderbrother5353
      @dycedargselderbrother5353 1 year ago +19

      and VRAM
      I mean, it's still shame on NVIDIA but the fact of the matter is that there is a lot of performance left on the table with the 8 GB framebuffer on 30-series cards.

    • @samuelswenson1505
      @samuelswenson1505 1 year ago

      @@gokublack8342 Well, they are also better at the $499 price point with the 4070 if you are playing raytracing and frame generation supported games. Otherwise just get the RX 6650 or RX 6800 and be happy, or if you want to be really cheap, the RX 6500 and turn on and use custom interlaced resolutions.

  • @kozinc
    @kozinc 1 year ago +467

    Considering I have had Nvidia's 1080 for about 5 or 6 years, this was a fun video to see it in comparison with a new GPU :)

    • @MisterFoxton
      @MisterFoxton 1 year ago +74

      It's shocking after all this time it's only a 60% difference.

    • @MsPreciouslittleones
      @MsPreciouslittleones 1 year ago +7

      it is shocking.

    • @datgreekafroguy3082
      @datgreekafroguy3082 1 year ago +18

      I mean, even the comparison itself is weird. That just goes to show how amazing the 1000 series was all around..!! Well, seems like I'll keep rocking my 1080 too, unless I find an amazing used 3070 deal.

    • @SoulTouchMusic93
      @SoulTouchMusic93 1 year ago +4

      those 1080 boards man, those were gold!

    • @stanislavkimov2779
      @stanislavkimov2779 1 year ago +7

      I'm still on 1070, with a 1440p monitor. But considering a new AMD GPU soon.

  • @dragonsaphira2
    @dragonsaphira2 1 year ago

    Managed to pick up a new 3080 towards the end of last year for MSRP (though I did get a $175 gift card back that I used to add a bit more RAM and nab a solid 2-arm monitor stand) and have been pretty happy with it considering I was running a 1070 before.

  • @angelo50gt
    @angelo50gt 7 months ago +17

    I'm watching after buying a 4070, and for me it's a win since I was using a 1080 Ti for a while now. I was going to go with a used 3080, but knowing used might be broken or have issues, it wasn't worth taking the chance. Also, the lower wattage was great for me since I knew I wouldn't have to upgrade my PSU. So far I'm loving it for the price, since I paid around the same for the 1080 Ti back in the day

  • @reubenzilla8219
    @reubenzilla8219 1 year ago +333

    What I'm actually impressed with is the Arc A770 and the AMD cards' performance

    • @Gajaczek93
      @Gajaczek93 1 year ago +37

      I am laughing my ass off with my A770. 37% less performance at 4K while it costs ~$289 vs $600.

    • @sophieedel6324
      @sophieedel6324 1 year ago +15

      You're impressed about a duopoly where AMD is not competing, always diving 1% under the price of Nvidia so they can both keep fleecing consumers. Good for you I guess.

    • @doniscoming
      @doniscoming 1 year ago +9

      Yeah I have switched from 3070 to 6950XT and I had a great deal to switch to 7900xtx and I'm super happy - dont care about dlss I just want raw 4K gaming 😊

    • @supersuede91
      @supersuede91 1 year ago +6

      Battlemage will be fun

    • @lathrin
      @lathrin 1 year ago

      ​@@doniscoming 4k is only good for the desktop space ngl. Anything past 1800p isn't worth it unless you refuse to enable antialiasing

  • @MikeyDAngelo
    @MikeyDAngelo 1 year ago +192

    this makes me feel fine about the 7900XT I just bought a few weeks ago. was really torn between it and the 4070/4070TI but ultimately decided I wanted more vram for the long haul. have been really enjoying it so far.

    • @linkus7918
      @linkus7918 1 year ago +20

      It's been a lot more attractive since they lowered the price, for sure.

    • @JulienChabanon
      @JulienChabanon 1 year ago +17

      That was definitely the right move. 12 GB of VRAM is not enough even for 1440p these days, as I am consistently seeing more than 12 GB used in The Last of Us, FH5, RE4 or Hogwarts

    • @Blissy1175
      @Blissy1175 1 year ago +8

      the 7900xt and 7900xtx outperform it anyway, even in some raytracing scenarios, so you should feel fine/good with your choice.

    • @lucidlucifer8605
      @lucidlucifer8605 1 year ago +1

      What CPU are you using, if I can ask? I'm currently thinking about upgrading from my GTX 1080 to a 7900 XT and maybe wanna pair it with a 5900X

    • @MikeyDAngelo
      @MikeyDAngelo 1 year ago +1

      @@lucidlucifer8605 Ryzen 7 3700X

  • @jkrejci.
    @jkrejci. 9 months ago

    I enjoyed the video. On pricing though, could a cumulative 2021-today inflation rate of almost 20% have something to do with the price increase? Just wondering 🤔
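A quick back-of-envelope check of that inflation question. The ~20% cumulative figure is the commenter's own; the sketch applies it to the RTX 3070's $499 launch MSRP as the previous-gen reference point:

```python
# Inflation-adjust the previous-gen x070 launch price and compare
# against the 4070's $599 MSRP. The 20% cumulative-inflation figure
# is taken from the comment above, not independently verified.
msrp_3070 = 499            # USD, RTX 3070 launch MSRP (2020)
cumulative_inflation = 0.20

adjusted = msrp_3070 * (1 + cumulative_inflation)
print(f"${adjusted:.0f}")  # prints "$599"
```

On those assumptions, inflation alone would carry a $499 card to roughly the 4070's $599 sticker, so the question is a fair one, though it doesn't explain the larger jumps elsewhere in the stack.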

  • @zachdenney
    @zachdenney 1 year ago +142

    You know one of my favorite things about LTT videos? Clear audio at a proper volume level.

    • @polarizedpotstone
      @polarizedpotstone 1 year ago +7

      underrated fact 💯

    • @Thomas_Needs_Coffee
      @Thomas_Needs_Coffee 1 year ago +4

      This! I appreciate it so much lol. Maybe I'm just picky or maybe I'm holding channels to higher standards.

    • @purpleneons
      @purpleneons 1 year ago +1

      i just got new Bluetooth earphone modules and i found myself testing them on LTT videos. i second that.

    • @Zer0--Zer0
      @Zer0--Zer0 1 year ago

      For like 90% of tech, absolutely, just don't trust his advice on monitors, man has been misleading in that department more than once.

    • @polarizedpotstone
      @polarizedpotstone 1 year ago

      @@Zer0--Zer0 in what way do you mean?

  • @bigjoe7051
    @bigjoe7051 1 year ago +248

    This review of an Nvidia product made me a bigger Intel and AMD fan, as I am very surprised by the performance of older AMD cards after all the updates, and I can't believe that Arc can actually be somewhere on these charts with pretty good fps

    • @TalesOfWar
      @TalesOfWar 1 year ago +25

      The hardware in the Arc cards is solid, it's the drivers that (mostly) suck. That's the secret sauce when it comes to real world performance these days. We're well beyond the days of just throwing more transistors at the problem (at least to the degree it used to be). Writing good optimised code is the real magic.

    • @chronometer9931
      @chronometer9931 1 year ago +12

      Arc a770 16GB is actually really good, most games can run with the settings on highest and still achieve 60fps even if that is with XeSS or FSR. I'm maxing out Hogwarts with ray tracing at 4k and still getting playable frame rates. Lots of Nvidia and AMD cards can't even do that due to a lack of VRAM on Nvidia's side and a lack of ray tracing on AMDs. Intel is the sweet spot right now and I only expect future launches to be even better

    • @temporaltomato3021
      @temporaltomato3021 1 year ago +3

      ​@Chronometer I hope that's not mostly ownership bias, but it does make me hopeful for the mainstream market. It's no problem in my eyes if Nvidia screws off and creates some kind of "luxury gaming" tier of the market, as long as there remain compelling offers from other entities.

    • @EbonySaints
      @EbonySaints 1 year ago +1

      The thing with Arc is that it's effectively Vega 30, at least according to Chips and Cheese. It's really good when it gets loaded, hence why you'll see how the A770 competes with a 6700XT or a 3070 in TimeSpy, or how when running something stupid like Hogwarts Legacy with RT at native 4K it loses so comparatively little performance that it slumps its way to 3090 tier in some way at 23 FPS to the latter's 25 (going off of day one TPU review). The problem is that games aren't synthetic benchmarks, and most people tune their settings to match their card. So, while an A380 beats a 1060 and almost matches a 580 on Cyberpunk at 1080p Medium, most PC gamers don't want to play it at 36 FPS. So the settings go down to low, and the A380 gets around 50 FPS while the latter two get 70+.
      Arc is like a random grab bag of candy you get on Halloween, and you hit up every non-normal neighborhood in town. You'll get great candy in small amounts from the rich. Someone will home make some surprisingly good chocolate. A person will slip in Tootsie Rolls with razor blades and get you in the hospital.

    • @jemiebridges3197
      @jemiebridges3197 1 year ago +1

      Honestly the only reason I'm not buying an arc is the lack of support for legacy games, of which I have 500+. I might just make a legacy computer and use an arc or AMD build for new games post 2020.

  • @Kristian.H.Nielsen
    @Kristian.H.Nielsen 1 year ago +68

    I just upgraded to a 4070 from a 2060 Super in my small form factor build, and I am very happy. It only needs a single 8-pin, runs cool, and it's quiet. The performance uplift from my previous GPU is also quite good: I went from 1689 to 2619 in the Heaven Benchmark. I use Blender, so I had no choice but to go with Nvidia, so I am glad that there finally was an option that didn't require a new case and power supply.

    • @codeman8113
      @codeman8113 1 year ago +6

      You made me feel better about myself. In my country the 4070 Ti goes up to 1600 dollars, and yes, I just spent 1600 dollars on a 4070 Ti

    • @Kristian.H.Nielsen
      @Kristian.H.Nielsen 1 year ago +4

      @@codeman8113 I'm not quite sure how I helped, but I am happy to be of service 😄

    • @codeman8113
      @codeman8113 1 year ago +8

      @@Kristian.H.Nielsen you being happy with a 4070 made me feel better haha

    • @Sid-Cannon
      @Sid-Cannon 1 year ago +4

      I'm in a similar boat, my GTX 1080 is starting to show its age and the only GPU that is at least twice as good as my 1080 that will fit in my case is a 4070. So now the question is do I buy now or wait and see what AMD has to offer with their mid range RDNA 3 ?

    • @Kristian.H.Nielsen
      @Kristian.H.Nielsen 1 year ago +12

      @@Sid-Cannon Just an opinion from a stranger on the internet, so take it for what it's worth: if you actually need the extra performance today, go for it. Assess your needs and base your choice on what meets your criteria. It's hard to make a decision based on a GPU which doesn't exist.

  • @jackrametta
    @jackrametta 1 year ago +37

    The data in these vids is getting more and more impressive. I'll say though, another upgrade y'all could make for the figures is incorporating uncertainty bands. That one note about +/- 400 for Arc made me wonder if there were any significant differences between any of those cards, or if the results were just an arbitrary ordering based on a single draw. I know doing repeated draws of the same tests takes much more time, but for really important reviews the return might be worthwhile!

    • @jackrametta
      @jackrametta 1 year ago +2

      Plus, then you could switch from bar charts to the much more pleasing dot and whisker style plots.

    • @jonathanodude6660
      @jonathanodude6660 1 year ago +1

      @@jackrametta I suggested using distribution curves for some of their frame data as well, rather than multiple bars for different values. They seem to respond to criticism of their data presentation with the defence that it needs to be widely understood and there are simply too many normies who won't understand high-quality data for it to be worthwhile.

    • @kanjakan
      @kanjakan 1 year ago

      @@jonathanodude6660 Lol that sounds about right. I have no idea what a distribution curve or whisker plot is.

    • @jonathanodude6660
      @jonathanodude6660 1 year ago

      @@kanjakan just a normal distribution/bell curve (google it). It’s pretty self explanatory when you look at it, but it’s hard to describe. It’s essentially a plot of frequency vs value. You’d have your frame rates as the x axis, same as their current bar charts, then the y axis will be what percentage of the total time spent testing was spent at each frame rate. If you had each test rig colour coded and semi transparent, then overlaid them, it would provide heaps of immediate visual information about what to expect for each setup. It would look like upside down “U” shapes with varying curvatures where the further to the right the peak is and the less spread the curve is, the better. I don’t think it requires collecting any more information than they do already.
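The uncertainty-band idea this thread is asking for can be sketched in a few lines of Python; the FPS numbers below are made up for illustration, not taken from LTT's data:

```python
import statistics

# Hypothetical repeated benchmark runs (average FPS per run) for two
# cards. With several runs you can report a mean plus an approximate
# 95% confidence band instead of a single bar.
runs = {
    "RTX 4070": [101, 98, 104, 99, 102],
    "RX 6800 XT": [100, 103, 97, 105, 99],
}

for card, fps in runs.items():
    mean = statistics.mean(fps)
    # Half-width of a ~95% band, assuming roughly normal run-to-run noise
    half_width = 1.96 * statistics.stdev(fps) / len(fps) ** 0.5
    print(f"{card}: {mean:.1f} FPS +/- {half_width:.1f}")
```

If two cards' bands overlap heavily, their ordering on a bar chart may indeed be down to run-to-run noise, which is exactly the "+/- 400 for Arc" concern raised above.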

  • @yagesz
    @yagesz 1 year ago +209

    Props to the writers. The script for the slap-in-the-face transition to LTT Store genuinely had me bursting out in laughter.

  • @MerolLord
    @MerolLord 1 year ago +236

    This video finally convinced me to upgrade my GTX1070, but for the first time in my life to a team red GPU, the RX 6800 XT looks like such a great deal now

    • @JSenior0
      @JSenior0 1 year ago +15

      Not sure where you are based, but in the UK there is only a small difference in price across the 6800 and 6900 (used), and if you want new, I've seen some good deals on the 6950 XT. Worth a look!

    • @blankblank224
      @blankblank224 1 year ago +2

      I'm in the same boat

    • @adamcowood7553
      @adamcowood7553 1 year ago +8

      Still rocking a 1070! Watercool it and tune the balls off it. It's like a Fiesta XR2i (hopefully without the stolen-and-crashed-in-a-ditch bit).

    • @precisionxe8701
      @precisionxe8701 1 year ago +4

      Just upgraded to the 6800 XT a month ago (from a 2070 Super) and it was well worth it. I'd be lying if I said I didn't have to do some tuning with software to get it up to standard, but that was mainly my own fault for not keeping my PC well optimized over the past 3 years. I bought this rather than the 6950 XT because the thermals and power consumption are infinitely better, and imo that outweighs the raw "value per frame" advantage the 6950 has. The 6800 XT is plenty to run all my games at 1440p with max graphics settings; would recommend.

    • @vasopel
      @vasopel 1 year ago +8

      there are no "teams" ..... you just buy what makes more sense ;-)

  • @SkiDaBird
    @SkiDaBird 7 months ago +1

    It's time to retire the old RX 580, and I'm strongly leaning toward getting a 4070. I'm going to do my next build in phases: the GPU and power supply now, then upgrading the rest in a few years.

  • @Xnate13X
    @Xnate13X 1 year ago

    I bit the bullet and upgraded from SLI 980 Tis to a single 4070 Ti at release in January. Did a whole 11th-gen rebuild in 2021 but waited until a fairly priced, high-performance card was available for purchase.

  • @RiddleTime
    @RiddleTime 1 year ago +306

    It would be nice to have the 6950 xt included in your graphs, although the 6800 xt is already a very good comparison.

    • @Aznrob869
      @Aznrob869 1 year ago +7

      They need to add the 6900 XT

    • @SirWolf2018
      @SirWolf2018 1 year ago +13

      @@Aznrob869 They need to add all cards dating back to 1999 (GeForce 256's release date).

    • @Mexidorf
      @Mexidorf 1 year ago +1

      They should also add the 3080 Ti; that would make the 4070 look even worse. Anyone who took advantage of back-to-school free upgrades from a base 3080 to a 3080 Ti on pre-built sites made off like bandits, and RT/DLSS can't hide that comparison. Wait a few months: when Nvidia ends the rebate program to AIB partners for initial 4070 sales, the price jumps will make this card a hard pass.

    • @amistrophy
      @amistrophy 1 year ago +1

      ​@@SirWolf2018 cards? What about iGPU and SOCs???!?!?!?

    • @SirWolf2018
      @SirWolf2018 1 year ago

      @@amistrophy :D

  • @HeisenbergFam
    @HeisenbergFam 1 year ago +109

    The intro has more tension and emotion than the entirety of Twilight

  • @JasonB808
    @JasonB808 1 year ago

    Got an RX 6800 XT for $540 last year, and it's an OC version, so it has a higher out-of-the-box clock speed than reference. I figured it would be about as fast as an RTX 4070 but with 8 GB more video memory. I have to try Hitman 3 with ray tracing again, but Resident Evil 4 with ray tracing set to normal runs really well. Hitman 3 is unplayable with RT shadows and lighting on unless the game is set to medium 1080p with FSR on balanced, and even then it's not hitting more than 40 fps. With ray tracing off, Hitman 3 runs maxed out at 4K. I play on a 4K TV, but it's limited to 60Hz, so I just cap fps to 60 and it runs really well.

  • @joebob2738
    @joebob2738 1 year ago +26

    Would be nice to see a price per frame comparison (adjusted for inflation) of the last few GPU generations to see if we're actually getting more bang for our buck.

    • @redfoottttt
      @redfoottttt 1 year ago

      so according to this logic, if a card could pull 2,000 fps it should cost $2k, right? What about 4,000 fps then, I wonder.

    • @nyregrus
      @nyregrus 1 year ago +4

      @@redfoottttt you take the price of any given GPU and divide it by the average number of frames per second it generates in a given game at a given set of video settings; that gives you the different price-per-frame values

    • @jamesbyrd3740
      @jamesbyrd3740 1 year ago +1

      Relative to the flagship (90-series) cards, the 4070 is MUCH worse perf/$ than the 3070 was.

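The price-per-frame metric discussed in this sub-thread is just price divided by average fps. A minimal sketch, with all prices and fps figures invented for illustration (not measured results):

```python
# Hypothetical launch prices and average 1440p fps -- illustrative only.
cards = {
    "RTX 4070":   {"price": 599, "avg_fps": 100},
    "RTX 3080":   {"price": 699, "avg_fps": 98},
    "RX 6800 XT": {"price": 529, "avg_fps": 105},
}

def price_per_frame(price: float, avg_fps: float) -> float:
    """Dollars paid per average frame per second; lower is better value."""
    return price / avg_fps

# Rank cards from best to worst value at these assumed numbers.
for name, c in sorted(cards.items(),
                      key=lambda kv: price_per_frame(kv[1]["price"], kv[1]["avg_fps"])):
    print(f"{name}: ${price_per_frame(c['price'], c['avg_fps']):.2f}/fps")
```

Note this is a relative-value metric, not a pricing rule: a card that doubled fps at the same price would simply score twice as well, which addresses the 2,000 fps objection raised above. Adjusting each generation's launch price for inflation before dividing would make the cross-generation comparison fair.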
  • @danialonderstal3564
    @danialonderstal3564 1 year ago +260

    AMD's 'fine wine' tech is a force in its own right. I still remember when AMD released the 7970; it was amazing how long that card held up

    • @jordanwardle11
      @jordanwardle11 1 year ago +14

      Fine wine was because they were using the same base tech for years, so they could bring the improvements from later GPUs to older ones

    • @spicymeatball3297
      @spicymeatball3297 1 year ago +3

      I'm just now upgrading from my R9 390. Outside of driver conflicts, I've been able to run pretty much anything at 1080p on it. Have a 3060 coming in tomorrow tho

    • @rolasleyva8602
      @rolasleyva8602 1 year ago +13

      yeah, I remember the RX 580 8GB being equal to a GTX 1060 6GB when it was released. But now it performs like a 1070, if not slightly better. Same story with the RX 570

    • @xeigen2
      @xeigen2 1 year ago +4

      It also helped that both the PS4 and Xbox One were GCN-based, and since almost every game is designed primarily for console, almost every game engine ended up optimized for that architecture

    • @TheOmarhaff
      @TheOmarhaff 1 year ago +5

      Ran those in Crossfire for years, loved it!

  • @ComputerProfessor
    @ComputerProfessor 1 year ago +13

    Getting 6800 XT performance... for $10 more, you can get a 6950 XT.
    Why are people buying Nvidia? I mean, I bought the 4090 coming from a Radeon R9 295X, but at this budget I wouldn't even consider Nvidia

  • @personguy8839
    @personguy8839 1 year ago

    Kinda reminds me of what you talked about with the Windows comparisons: people just getting the newest OSes for compatibility reasons

  • @jamesm568
    @jamesm568 1 year ago

    Hell, I have the RX 6800 XT and the RX 6900 XT collecting dust on the shelf, since the driver support and software suite for those cards were horrific when they first came out, which made me go to the 3090 FE. I might have to pop the RX 6900 XT back in and see what improvements have been made.

  • @klugginfresh
    @klugginfresh 1 year ago +269

    Feels good being a 6800 XT user; saving several hundred dollars over an NVIDIA card while seeing better or similar non-RT performance is a great feeling :)

    • @markmanderson
      @markmanderson 1 year ago +64

      You ain't missing much on RT; it's wow for 10 minutes, then unnoticeable.

    • @diegofernandomunozzuleta7734
      @diegofernandomunozzuleta7734 1 year ago +42

      Even better when you see how well the 16 GB of VRAM on AMD cards like my 6800 XT has aged compared with the 8 or 10 on Nvidia's 30 series 🥰

    • @Tovek
      @Tovek 1 year ago +10

      Lol, keep believing that.

    • @augustoalvarez6766
      @augustoalvarez6766 1 year ago +7

      @@markmanderson with the launch of RTX Remix, a lot of titles are going to need that RT performance though

    • @fkim8449
      @fkim8449 1 year ago +49

      ​@@Tovek believing what?? Nvidia is nothing more than money hungry, and it's all thanks to you guys... the consumers who keep putting up with their crap 👍

  • @Rick020
    @Rick020 1 year ago +203

    Similar performance to the 3080, just a tiny bit better... The difference is that the 4070 probably costs €800, while I paid €500 for my used 3080 (:

    • @antonhelsgaun
      @antonhelsgaun 1 year ago +34

      Or you can get a used AMD card

    • @whoathatcombo
      @whoathatcombo 1 year ago +10

      @@antonhelsgaun nope lol

    • @TylerRolloGuitar
      @TylerRolloGuitar 1 year ago +3

      Well, DLSS 3.0 is kinda neat, but idk if that's worth €300

    • @Hyperus
      @Hyperus 1 year ago +4

      Why would it cost €800? The 4070 Ti can be had for 880-900€; I'm expecting this to cost 650-700€.
      Also note that comparing new vs used cards is a bit unfair in the price department.
      Edit: it can be had for 659€, officially and on third-party sites such as Caseking, if you are in Germany; of course regional variances apply.

    • @alaskanhybrid1845
      @alaskanhybrid1845 1 year ago +1

      Can we truly say it's similar performance when DLSS 3 interpolates fake frames between GPU-rendered ones to make the numbers look better?

  • @brentbillow6559
    @brentbillow6559 1 year ago

    I was lucky to get a 3070 Ti, and I'm making use of Nvidia's Broadcast software. If I wanted to upgrade my GPU, it might not happen till the 5070 series... that, and maybe a new CPU.

  • @TumTuned
    @TumTuned 9 months ago +6

    Truth be told, the RTX 4070 is a pretty fantastic card. It offers performance similar to a 3080 while drawing FAR less power, having more VRAM (most 3080s were 10GB models), and coming in a smaller form factor. And, price-wise (new, anyway), it costs less. Like Linus said, all big "W's" for nVidia. BUT... the price is asinine, and shows that nVidia is perfectly fine with its position of "slight gains in performance are the new norm, get used to it".
    To make matters worse... while AMD has GPUs that go toe-to-toe with nVidia (while costing less and having more VRAM), they simply can't compete with nVidia's RT performance. And, of course, AMD has an inferior form of upscaling tech (it's great that FSR doesn't require special hardware, but let's be honest, it's nowhere near as good as DLSS 2.0), which also makes it difficult for some people to pick AMD over nVidia.
    I'm hoping we'll see the same thing happen with the 5000 series that we saw with the 3000 series. The 3000 series was a large leap forward in performance compared to the 2000 series, while prices stayed about the same; the 3070, for example, offered better performance than a 2080 while costing less. Hopefully, nVidia will backpedal on its strategy and we'll get a 5070 that performs better than a 4080 while costing less.
    Let's not get our hopes up, though...

    • @juliusjacob1416
      @juliusjacob1416 8 months ago

      But the thing is, a 4070 is significantly outgunned by a 6950 XT, while the latter costs about the same. I'm looking at a build and I simply want the most raw performance. The CPU will be a 7600X, but it's really hard to decide on a GPU.

  • @kurvin2719
    @kurvin2719 1 year ago +242

    been planning on getting a 7900XT to replace my 1070, and this just solidifies my decision.

    • @PalakPreman08
      @PalakPreman08 1 year ago +10

      Why not XTX bro?

    • @Toutvids
      @Toutvids 1 year ago +25

      If you can afford the extra money, buy the 7900 XTX instead.

    • @jette24
      @jette24 1 year ago +3

      if you turn on DLSS 3 and ray tracing it gets clobbered, but to each their own

    • @o_sagui6583
      @o_sagui6583 1 year ago +51

      @@jette24 what also gets clobbered is your wallet, spending almost double the price just for an AI crutch

    • @DragonOfTheMortalKombat
      @DragonOfTheMortalKombat 1 year ago +25

      @@jette24 Pretty sure the 7900 XT is faster than the 4070 even with RT. And DLSS 3 simply isn't mature right now.

  • @heywatchyourmouth3048
    @heywatchyourmouth3048 1 year ago +10

    Dang Linus 0:22 you're huge

  • @chirantannath6593
    @chirantannath6593 1 year ago +2

    5:25 that lttstore ad feels too illegal bruh

  • @peoplez129
    @peoplez129 1 year ago +10

    What they don't tell you about DLSS 3 on the 4070 is that it uses CUDA cores instead of dedicated hardware. That's their dirty little secret. You can tell by the 4090 giving 2-4x frame generation, while the 4070 only gives ~1.5x. That means you're actually losing real frames to gain fake frames. The 4090, however, does not lose real frames; it simply adds more frames on top of them. This is quite a deceptive change, because it means frame generation does not increase frame rate equally across different cards, making it far less effective on the 4070. If you look at the graph, the 4070 went from 36 to 58 fps with frame generation. Since an equal number of frames are generated as rendered, that means with frame generation on, the 4070 is actually producing only 29 real frames instead of 36.
    That means your real-frame latency actually drops below even 30fps levels with frame generation on. This can only be explained by them quietly using CUDA cores to do the frame generation, rather than the dedicated frame-generation hardware of the 4090. This is a big freakin' deal, because it seriously undermines the benefit of even having DLSS 3. I would consider this kind of deception not just borderline illegal but actually illegal, as they have fundamentally altered the promised nature of how a core feature functions: real hardware in the higher-tier cards, and what amounts to emulation in the lower-tier cards, all without disclosing it. Like a child being fed and having the spoon switched from delicious cake to vegetables at the last minute, this is exactly what Nvidia has done with the 4070 and consumers, hoping you don't notice.
    This is class-action-lawsuit levels of deception, like back when Nvidia sold the GTX 970 as a 4GB card when 512MB of that VRAM was segmented from the main pool and thus virtually useless. The 4070 doesn't have real frame generation; it has EMULATED frame generation that comes at the expense of core performance.

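The arithmetic in the comment above can be checked directly. It rests on the commenter's assumption of a strict 1:1 interleave of generated and rendered frames (the CUDA-core claim itself is the commenter's speculation, not something confirmed by Nvidia); under that assumption alone:

```python
def rendered_fps(displayed_fps: float) -> float:
    """Assuming a strict 1:1 interleave of generated and rendered frames,
    half of the displayed frames were actually rendered by the GPU."""
    return displayed_fps / 2

fps_fg_off = 36   # figure quoted in the comment, frame generation off
fps_fg_on = 58    # figure quoted in the comment, frame generation on

real = rendered_fps(fps_fg_on)
print(f"rendered fps with FG on: {real:.0f}")  # 29, below the 36 measured with FG off
```

So the "29 real frames" figure follows from the 58 fps number only if the 1:1 interleave assumption holds; a different generation ratio would change the conclusion.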
    • @eddiepark9069
      @eddiepark9069 9 months ago +1

      Just because Ngreedia didn't meet your expectations with this card doesn't mean they did anything illegal lol. Don't know if you were being serious with that or not.

  • @Alucard-gt1zf
    @Alucard-gt1zf 1 year ago +166

    I love how the PSU standard got changed to accommodate power spikes, and then the next generation of GPUs stopped having power spikes

  • @milohdd
    @milohdd 1 year ago +70

    Well, Nvidia sure is making the 1080 I bought 7 years ago look like an excellent investment

    • @chickenpotpie79
      @chickenpotpie79 1 year ago +12

      My 1080 is still goin strong too lol

    • @RJ-vy4yd
      @RJ-vy4yd 1 year ago +1

      COPIUM

    • @wnxdafriz
      @wnxdafriz 1 year ago +4

      my brother-in-law is using my brother's old 1080, has no complaints about Hogwarts Legacy and is enjoying it with his son / he's a big board game guy, but his friends and their kids (now that the kids are older) are starting to dip into video games again

    • @WV-HillBilly
      @WV-HillBilly 1 year ago +2

      my 1080 Ti is starting to really struggle these days lol
      this 4070 looks juicy tbh

    • @codethecreator
      @codethecreator 1 year ago +2

      My 1080 Ti is still going strong. I'll most likely upgrade my 9th-gen/1080 Ti system when 14th/15th gen comes out; it will pair nicely with a greatly reduced-in-price 4090 :D

  • @user-pw8ks8mq1d
    @user-pw8ks8mq1d 1 year ago +1

    I got a 1660 Super brand new for only 210€; the last such great investment in the GPU department. Right after, the pricing went nuts. Anyway, now I'm aiming for a 4060. It's going to be a cute upgrade, although going from 6 to 8GB of VRAM I have some reservations, but anyway.

  • @Lucius_Light
    @Lucius_Light 1 year ago +16

    Based on the spec sheet, Nvidia should have called this GPU the 4060. If you told me this was a 4060 or 4060 Ti for about $350-450, it would be an awesome value buy. Looks like my 1070 will live to see another day (fingers crossed for the 5000 series, I guess).

  • @Alirezarz62
    @Alirezarz62 1 year ago +100

    I thought the 4070 was gonna beat the 6900 XT I got for $550 this generation, but damn, AMD is killing it! Every day I'm becoming more pleased with going red

    • @paulelderson934
      @paulelderson934 1 year ago +11

      Second-hand 6800 XT and 6900/6950 XT cards are amazing value.

    • @Alirezarz62
      @Alirezarz62 1 year ago +9

      @@paulelderson934 I didn't get it second-hand though; it was a refurbished model with a 2-year warranty. But yeah, the second-hand market is crazy rn in my region. I can see that refurbished model is still available for like $400 now

    • @gokublack8342
      @gokublack8342 1 year ago +2

      It should get more powerful as they continue to improve the drivers too

    • @SriramVenkatesan
      @SriramVenkatesan 1 year ago

      and more VRAM

    • @puneetsharma1437
      @puneetsharma1437 1 year ago +1

      ​@@gokublack8342 the 6900 XT seems mature enough

  • @macus23
    @macus23 1 year ago +80

    This 4070 launch has cemented AMD graphics in my build this year. I'm upgrading from an i5-7600 with a 1070 to an i5-13600K with either a 6800 XT or one of the 7800/7700 series cards that are probably coming this summer

    • @drew3045
      @drew3045 1 year ago +4

      I highly recommend the 6800 XT if you build before the new-gen AMD cards come out!! I love mine to death. Got it a few weeks ago and it rips at 1440p

    • @TheJackiMonster
      @TheJackiMonster 1 year ago +4

      What I find amazing is that the 6800 XT is about 14% ahead in 1080p performance. That's similar to the difference between the 6800 and the 6800 XT. So given that the 6800 has always been faster than a 3070... it's like Nvidia just closed the gap to their competitor's last gen with a whole new generation. It's terrible to see, especially considering that you can get a 6800 for $500 or even less.
      Even if someone wants an Nvidia GPU, it might make more sense to wait for the next generation; it's unlikely they can offer a worse deal than this time. Otherwise, AMD simply offers the better deal currently.
      I'm definitely happy with my 6800. Having 16GB of VRAM can also be pretty handy as ray tracing gets used more, because RT actually requires more VRAM in practice, which is something Nvidia won't tell you before selling you 12GB or less.

    • @macus23
      @macus23 1 year ago

      @@drew3045 thanks for the advice! I'm upgrading to transition from 1080p to 1440p, so glad to hear it performs!

    • @lathrin
      @lathrin 1 year ago +1

      Might as well get a Ryzen to go with it

    • @macus23
      @macus23 1 year ago

      @@lathrin I just might! But the bang for buck on the 13600K is pretty enticing, so we will see.

  • @aopoczko
    @aopoczko 1 year ago +3

    The cheapest 4070 in Poland is 730 USD.
    The 6800 XT I got in February for 640 USD looks like a win, especially with those new drivers that were showing in the charts, yay!

  • @smitesfan
    @smitesfan 1 year ago +1

    Going after Pascal owners isn't necessarily a bad idea. I bought a 1080 right before the 1080 Ti released (tragic, considering the performance increase). Regardless, that 1080 served me well through the years, even after my upgrade from a 1080p 240Hz monitor to the Odyssey G9 at 5120x1440. I didn't dump that card until this winter, November specifically, when I bought an Acer Predator Bifrost A770 because I wanted to tinker with it. I then replaced that with a 4090 once those were in stock at MSRP. The Pascal generation was excellent.

    • @nyregrus
      @nyregrus 1 year ago

      yeah, this is an ideal upgrade for me, considering I was on a 2060 (my broken 1080 was still under warranty, but there were no more in stock in 2018, so I had to buy something with the money I got back, and it was all I could afford) and I'm buying a 1440p 240Hz monitor.

  • @ender-gaming
    @ender-gaming 1 year ago +229

    I'm really glad they focused on how Nvidia is raising prices in line with performance gains for the second time in the last 3 generations. What I'm sad about is them not pointing out NVIDIA's record profits last quarter (a new record, even with the crypto bubble over). They're taking advantage of the lack of competition to simply raise prices, knowing they have a near monopoly; record profit after record profit has been posted. They learned the market can be squeezed for more, and they're squeezing hard.

    • @tuanseattle
      @tuanseattle 1 year ago +10

      what can we say... we voted with our wallets. Look at Intel... thanks, AMD... and now look at Nvidia. Nvidia stock is trading too high compared to its P/E because everyone thinks they're unbeatable now.

    • @e5211
      @e5211 1 year ago +9

      It's capitalism. They're simply doing what all companies do. They are actually obligated to by their shareholders.
      The prices are high because people are willing to buy them at that price. Be angry at the buyers. They're the ones causing the price rise. Not the company.

    • @PhysicsGamer
      @PhysicsGamer 1 year ago +16

      @@e5211 Bad press is part of capitalism too, and covering for a company like you're doing doesn't help anything.

    • @Rubmaster
      @Rubmaster 1 year ago +1

      Do you mind sharing the source for this record?

    • @lainiwaku
      @lainiwaku 1 year ago +3

      yeah, actually, if nobody bought the cards, they would have no choice but to lower the price; the fact is there are still people who buy at that price
      personally, I'll keep my 2070S until there's a better deal

  • @deus_nsf
    @deus_nsf 1 year ago +51

    I would have loved to see the 7900 XTX included in the bench....

    • @xruud24
      @xruud24 1 year ago

      15% slower on average while costing 40% less

  • @Limburg92
    @Limburg92 1 year ago

    I really do hope for a high-tier Arc with good drivers.
    Would be awesome to have an Intel-only build.

  • @FordFalcon54
    @FordFalcon54 1 year ago +3

    I had buyer's remorse for paying over double for a scalped reference 6900 XT two years ago... as this progresses, I have very little regret, especially because I was on a 5700 XT when I upgraded. I love my 6900 XT, and it's really lifted over the last year imo. Seems I wasn't wrong, based on the results of the 6800 XT.

    • @gildg7806
      @gildg7806 1 year ago +1

      Compared to people paying double or more for a 3060 Ti/3070 or 3070 Ti/3080, you're way better off; the limited VRAM of those cards will severely limit their longevity in modern AAA games compared to the much beefier 16GB, and its rasterization performance is nothing to scoff at; it was AMD's second-strongest GPU at the time, after all.

  • @bidoofus7921
    @bidoofus7921 1 year ago +110

    I’ve been loving my RX 6800 XT! Didn’t realize the software updates had been so influential; kinda crazy that it outperforms a 3080 nowadays, according to the data here.

    • @eaman11
      @eaman11 1 year ago +17

      AMD has been great with software updates. I have an RX 580, and over the years I got Freesync/VR, sharpening for upscaling, and FSR. This GPU still rocks at the resolutions it was made for and even stretches to 1440p with old games, or FSR for new ones.

    • @TheXev
      @TheXev 1 year ago

      Please stop buying this Kool-Aid... It isn't the software updates, it's the larger VRAM sizes. Please go watch Hardware Unboxed's 3070 vs RX 6800 2023 revisit to see some eye-opening stuff!

    • @AvroBellow
      @AvroBellow 1 year ago +5

      I knew that it would happen, especially considering the RTX 3080's tiny 10GB frame buffer.

    • @stoneymahoney9106
      @stoneymahoney9106 1 year ago +1

      @@eaman11 My 980 Ti died recently and I'm temporarily running an RX 5500 XT. When its replacement arrives, it'll be an Nvidia card, but I will absolutely miss those AMD drivers; they're so good! Chill is witchcraft, I love it!

    • @vasopel
      @vasopel 1 year ago +1

      RX 6800 XT: about 550 dollars ?????
      RTX 3080: about 810 dollars ?!?!?!?
      what the hell? :-O
      personally I was kinda pissed that my GTX 750Ti couldn't play "Doom Eternal"...so I installed "Binary Domain"...had fun...and didn't spend a penny ;-)

  • @Struders
    @Struders 1 year ago +27

    AMD just got my money for a 7900 XT. I was on a 3070, but that 8GB of VRAM really held the card back.

  • @alpacabloom1545
    @alpacabloom1545 1 year ago +1

    Really happy I was able to snag a 6900 XT near its launch price. It has held up ridiculously well.

  • @michaelmanning9028
    @michaelmanning9028 1 year ago

    I'm still learning and researching. I'm running a Ryzen 7 5700, 16GB of RAM, and a GeForce RTX 3060 V2, but I want to run new games at 4K 60fps and wasn't sure where to begin. I know more RAM is a cheaper place to start, but where should I go from here, graphics-card-wise?

  • @codecthelios
    @codecthelios 1 year ago +105

    Pro tip: don't buy anything new with less than 12GB of VRAM at this point. That is really the important thing to note. It doesn't matter how many CUDA cores you have if you just don't have the texture memory.

    • @TheRogueWolf
      @TheRogueWolf 1 year ago +17

      Hardware Unboxed did a good demonstration of what happens when you hit VRAM limits: stuttery, unplayable messes where all those cores are meaningless.

    • @Jacob-hl6sn
      @Jacob-hl6sn 1 year ago +15

      some games need 16GB right now, not 12
      12 is not enough for an expensive GPU

    • @TRC_WA
      @TRC_WA 1 year ago +1

      Yep… just upgraded my 11900K/3090 rig to a 4090, and games like Hogwarts and The Last of Us are performing beautifully at 4K Ultra, using 14 of 24GB of VRAM. In addition to the generational leap over the 3090, I didn't consider any other card because none of them but the 4090 have 24GB.

    • @thestorminmormin1442
      @thestorminmormin1442 1 year ago +23

      Games that need more than 8GB of VRAM are just badly optimized, if you ask me

    • @markjacobs1086
      @markjacobs1086 1 year ago +4

      ​@@thestorminmormin1442 It depends on the target resolution; needing more than 12GB at 1440p is a >50% increase for, as far as I can tell, no perceptible increase in graphical fidelity... That's definitely worryingly bad from a technical point of view.

  • @oguzhanakdogan9787
    @oguzhanakdogan9787 1 year ago +160

    Great review, LTT. You guys have convinced me to buy a second-hand RX 6800 XT!

    • @Torso6131
      @Torso6131 1 year ago +11

      Hell, you can get a new one for under $550 if you look in the right places.
      For a card with 15% greater performance and more VRAM, as long as you don't crank up RT to get sub-60 frame rates anyway.

    • @bradhaines3142
      @bradhaines3142 1 year ago +12

      i need a 6900 for the memes

    • @baconwizard
      @baconwizard 1 year ago +1

      @@Torso6131 even the 30 series wasn't ready for RT

    • @seasunt
      @seasunt 1 year ago +1

      Yeah, I grabbed my Red Devil 6800 XT new for $513 back in January, and I can't believe how good of a decision that was lol. Micro Center is the GOAT for that deal.

    • @reapuscreepus8257
      @reapuscreepus8257 1 year ago

      I bought mine used for $450; incredibly worth it! Do it now!

  • @oatbear8243
    @oatbear8243 1 year ago +7

    Good review! I actually got a 4070 FE, but not without some hesitation. For my relatively modest needs, it'll probably be okay for a couple of years to come, but yeah, it could've been cheaper. The main reason for me was the performance per watt of this card. Honestly, the 300W++ power draw of modern flagships, or even the higher end of mid-range, is ridiculous IMO. That's not innovation, that's just torturing the silicon to get a few more frames than the other team.

    • @6reen6uy
      @6reen6uy 1 year ago +1

      Considering this as an upgrade from a 2080 Ti. 5800X3D, 650W gold PSU, and this build will stay on DDR4. I just don't know if it's worth it.

    • @oatbear8243
      @oatbear8243 1 year ago +1

      @6reen6uy I can understand that. I'm on a 1060, so for me the choice is a lot easier :)

    • @sybilusmusicandart7196
      @sybilusmusicandart7196 1 year ago

      @@oatbear8243 also sitting on a 1060, but I upgraded the rest of my PC (12700K) and now I'm torn between the 4070/4070 Ti and the 6800 XT

    • @ravendarklawgaming197
      @ravendarklawgaming197 1 year ago

      That low wattage is a huge factor in me considering this as an upgrade. ;)

  • @MannsWoodlandPerspective
    @MannsWoodlandPerspective 10 months ago +1

    Appreciate the 1080 being shown in the benchmark graphs for comparison lol.
    Hoping to find a GPU to upgrade my 1080 SC to for gaming and editing. Intel knows.

  • @BruceKarrde
    @BruceKarrde 1 year ago +95

    Having an RTX 2070 still seems like a solid option for my workload: Factorio, an MMO every once in a while, video editing, and basic programming. But damn, holding a freshly unboxed GPU is one of the few joys in life :D

    • @tinkerspammer1
      @tinkerspammer1 1 year ago +11

      Same here. I currently have an RTX 2060 and it still does the job haha.

    • @dislexicpotato641
      @dislexicpotato641 1 year ago +7

      For me the 2070 can't cut it anymore; I upgraded to a 1440p 144Hz monitor and it's a struggle trying to get high refresh rates in any modern AAA games

    • @Thisandthat8908
      @Thisandthat8908 1 year ago +2

      My factories never get that big before I get lost. But my gaming laptop doesn't even spin its fans up for it.

    • @Stasiek_Zabojca
      @Stasiek_Zabojca 1 year ago +2

      And here I am playing on a GTX 980 at 1440p, which unfortunately isn't good enough anymore for lots of games, especially with just 4GB of VRAM.

    • @Donnerwamp
      @Donnerwamp 1 year ago +2

      2080 Ti here, and I have to say, I'd love an upgrade for VR stuff. Aside from that, I'd be happy with a 2070. DLSS is amazing for prolonging a GPU's life.

  • @jhosioja
    @jhosioja 1 year ago +119

    I'm really feeling good about that 6900XT I bought 5 months ago, especially now that I've brought my CPU and mobo up to date too, with a 7600X and SAM enabled. It feels like it would've taken the combined cost of the GPU, CPU, mobo and RAM just to get an Nvidia GPU that could compete, and even then it would take a couple of decades for any power-cost savings to make up the difference.

    • @sexyfishproductions
      @sexyfishproductions 1 year ago +1

      Interested in upgrading my 2080 Ti to a 7900 XT, and very interested in SAM now that I'd be on a 7900. Have you done any testing with and without SAM enabled? Should I go for it? I want the upgrade to make the most of a 4K 120Hz LG OLED.

    • @primadhy
      @primadhy 1 year ago +1

      same! I initially wanted to upgrade from my 590 to a 7900, but the early price was so high in my country compared to actual MSRP. Luckily I then found a 6900 XT at a very good price; so happy with it now

    • @hoverbike
      @hoverbike 1 year ago

      @@sexyfishproductions eh, I could only find a 6900 XT at $700.
      I found a 3080 at $400, so I think that was better value.

    • @YardonNim
      @YardonNim 1 year ago +1

      @@sexyfishproductions I know you didn't ask me, but it's really up to you if you want to turn it on. Usually a 3% performance increase in most games at 4K, but some can be up to 15%.

    • @TotallyCreativeNameBtw
      @TotallyCreativeNameBtw 1 year ago

      @@sexyfishproductions My 6900 XT has a lot less stuttering with SAM; as for direct fps boosts, idk, I think it did give me a bunch of fps depending on the game, especially in Forza.
      I have a 5800X btw.
      But hey, Resizable BAR for Nvidia also exists, so at the end of the day it depends on what you wanna pay for a card lol

  • @DarkChely
    @DarkChely 1 year ago

    I'm currently running an RX 580 8GB with a 2700X CPU (missing a pin... but that is another story) on a 750W power supply. Initially I wanted to go for a 6800 XT, which seems to trade blow for blow with the 4070 in most cases, but from what I saw they recommend an 850W PSU for the 6800 XT. The 4070 would be a huge improvement over my 580, and its recommended PSU is 600W.
    From my point of view, going for the 4070 makes logical economic sense, especially since I don't plan on moving above 1920x1080 any time soon.
    Is my train of thought on the right track, or is there another alternative, budget-wise?

  • @captainquain
    @captainquain 5 months ago +2

    I upgraded from my ASUS 1080 to an ASUS 4070 Ti OC. I bought it used at $150 below market price, and so far it has been a great purchase. The fact that I was able to wait this long to upgrade tells you how good the 1080 was.

  • @redetermine
    @redetermine 1 year ago +53

    I would've liked to see more of AMD's last gen in the benchmarks. For example, the 6700 XT has the same amount of VRAM but is a lot cheaper.

  • @garrettrinquest1605
    @garrettrinquest1605 1 year ago +73

    I wish you'd included the 6700 XT in your graphs. With 12GB of VRAM, it's actually been edging out the 3070 in some newer games. It would be a nice point of reference at approximately the same tier.