16GB vs. 8GB VRAM: Radeon RX 6800 vs. GeForce RTX 3070, 2023 Revisit

  • Published 1 Jun 2024
  • Hetzner: hetzner.com/hub/ex-line
    Support us on Patreon: / hardwareunboxed
    Join us on Floatplane: www.floatplane.com/channel/Ha...
    Buy relevant products from Amazon, Newegg and others below:
    Radeon RX 7900 XTX - geni.us/OKTo
    Radeon RX 7900 XT - geni.us/iMi32
    GeForce RTX 4090 - geni.us/puJry
    GeForce RTX 4080 - geni.us/wpg4zl
    GeForce RTX 4070 Ti - geni.us/AVijBg
    GeForce RTX 3050 - geni.us/fF9YeC
    GeForce RTX 3060 - geni.us/MQT2VG
    GeForce RTX 3060 Ti - geni.us/yqtTGn3
    GeForce RTX 3070 - geni.us/Kfso1
    GeForce RTX 3080 - geni.us/7xgj
    GeForce RTX 3090 - geni.us/R8gg
    Radeon RX 6500 XT - geni.us/dym2r
    Radeon RX 6600 - geni.us/cCrY
    Radeon RX 6600 XT - geni.us/aPMwG
    Radeon RX 6700 XT - geni.us/3b7PJub
    Radeon RX 6800 - geni.us/Ps1fpex
    Radeon RX 6800 XT - geni.us/yxrJUJm
    Radeon RX 6900 XT - geni.us/5baeGU
    00:00 - Welcome back to Hardware Unboxed
    00:35 - Ad-spot
    01:25 - Backstory
    04:33 - Test System Specs
    04:48 - The Last of Us Part 1
    08:01 - Hogwarts Legacy
    12:55 - Resident Evil 4
    14:15 - Forspoken
    16:25 - A Plague Tale: Requiem
    18:49 - The Callisto Protocol
    20:21 - Warhammer 40,000: Darktide
    21:07 - Call of Duty Modern Warfare II
    21:34 - Dying Light 2
    22:03 - Dead Space
    22:29 - Fortnite
    22:53 - Halo Infinite
    23:22 - Returnal
    23:58 - Marvel’s Spider-Man: Miles Morales
    24:30 - Final Thoughts
    Read this review on TechSpot: www.techspot.com/article/2661...
    16GB vs. 8GB VRAM: Radeon RX 6800 vs. GeForce RTX 3070, 2023 Revisit
    Disclaimer: Any pricing information shown or mentioned in this video was accurate at the time of video production, and may have since changed
    Disclosure: As an Amazon Associate we earn from qualifying purchases. We may also earn a commission on some sales made through other store links
    FOLLOW US IN THESE PLACES FOR UPDATES
    Twitter - / hardwareunboxed
    Facebook - / hardwareunboxed
    Instagram - / hardwareunboxed
    Outro music by David Vonk/DaJaVo
  • Science & Technology

Comments • 6K

  • @DrearierSpider1 · a year ago · +5117

    The first mainstream GPU with 8GB of VRAM was the R9 390 in 2015, which cost $330. By 2020, we should've only been seeing 8GB on the most entry level cards.

    • @oOZellzimaOo · a year ago · +96

      @@istealpopularnamesforlikes3340 Ahem* RX 6300 2GB

    • @singular9 · a year ago · +520

      @@oOZellzimaOo that's a sub-$100 GPU... 8GB of VRAM alone is like $50+...

    • @_Winfried_ · a year ago · +231

      There were variants of the R9 290X with 8GB as well

    • @BleedForTheWorld · a year ago · +75

      @@oOZellzimaOo nuance is that everyone SHOULD know NOT to get a gfx card like that in the modern era. Essentially, it's a waste of money over something that's actually useful. Sorry to be pedantic about it but that's the nuance of it.

    • @LuisC7 · a year ago · +138

      In 2015, 8GB of VRAM was $330? Man, prices have come a long and sad way...

  • @kougar7 · a year ago · +681

    The 1070 had the same 8GB of VRAM as the 3070, and that was seven years ago...

    • @Crimsongz · a year ago · +82

      My GTX 1080 Ti has more VRAM 😂

    • @Accuracy158 · a year ago · +18

      Yeah, I bought a 1080 on launch day, and that was probably the first generation where I stopped looking at VRAM as a big concern. ...But as you point out, that was kind of a while ago at this point.

    • @Magjee · a year ago · +41

      @@Crimsongz 11GB, 1GB more than the 3080. They are so stingy with VRAM ;(

    • @His_Noodly_Appendages · a year ago · +14

      Loved my 1070.

    • @AvroBellow · a year ago · +7

      And the 1080 Ti had 11GB.

  • @TheTardis157 · a year ago · +451

    My ancient RX580 has as much VRAM as that 3070. Now 8GB is just enough for 1080p gaming. Glad to see the Radeon card doing so well.

  • @ThreaT650 · a year ago · +547

    I really appreciate that on this channel you make the pressure you're putting on companies through your views and influence extremely clear, with very specific target points, so that they can't weasel out of criticism with a few extra gigabytes of VRAM. Quite frankly, you're using your power in this market as efficiently as you possibly can without getting anti-consumer pushback.

  • @claytonmurray11 · a year ago · +481

    The "I told you so" angle is completely justified and I loved every second of it. However, the content is also extremely important, because a lot of RTX 3070 Tis, 3070s and 3060 Tis are still being sold new, and are going to change hands on the second-hand market. People need to know the value of these cards and the performance they can expect. Well done Steve!

    • @chrissoclone · a year ago · +37

      And even worse, nVidia will continue to throw out 8GB cards in the 4000 series, while even new 12GB models should already get the warning the 8GB cards got back then. If people run out to buy 8GB 4050 Tis etc. at these highly inflated prices, a "told you so" will be more than justified next time.

    • @claytonmurray11 · a year ago · +28

      @@chrissoclone exactly. A 6950 XT seems like a much better deal than the 4070 Ti and the upcoming 4070.

    • @logirex · a year ago · +9

      For me it reeks of confirmation bias. You would not run a 30-month-old mid-range GPU at ultra settings these days. I've got a 6900 XT and certainly don't run most games on ultra, but on high or very high settings for good fps. Test the 3070 on high settings against the 6800 and it will do well.

    • @claytonmurray11 · a year ago · +8

      @logirex I see what you're saying, but it's still great information for people looking at buying these cards, which sell in considerable numbers new and used today. The old reviews are no longer valid.

    • @logirex · a year ago · +4

      @@claytonmurray11 Going forward you certainly want cards with more memory, but it looks like they are getting it, with the RTX 4070 getting 12GB for instance.
      That said, even buying an older card I would still take an RTX 3070 over the 6700 XT any day of the week. This channel tested those across over 50 games and the 3070 was almost 15% faster. The only thing is, if you buy older or mid-range cards, DO NOT RUN THEM AT ULTRA SETTINGS.
      Both this channel and others such as LTT have pointed out multiple times that running at ultra settings is pointless. LTT has one video where they had people look at games using ultra or very high settings, and most preferred very high, as they clearly could not notice the difference.

  • @FixCongress · a year ago · +750

    Again, NVIDIA 100% knew this would be an issue within a year of release. They talk to developers just like AMD does. They knew that devs were itching to start using more and more of that console memory and completely ignored what that would mean for PC gamers.

    • @zewmxsr · a year ago · +243

      They didn't ignore it at all. In fact they created cards that would become obsolete, to make a few extra bucks from us.

    • @GFClocked · a year ago · +215

      Planned obsolescence

    • @filipealves6602 · a year ago · +91

      "[Nvidia] completely ignored".
      WOW! Even after watching a whole video on the subject describing what the problem is, you *still* don't understand that this was completely intentional and *NOT ignored* at all!!! 🙄

    • @roadrash2005 · a year ago · +34

      This is the reason I'm skipping the 4000 series. Based on VRAM, they are basically not an improvement over my 3060 Ti.

    • @erlienfrommars · a year ago · +30

      That's because Nvidia wants maximum profits, so they make lower-tier products as unattractive as possible to push people towards the flagships with their significantly larger VRAM.

  • @selohcin · a year ago · +443

    Excellent work, Steve. I had no idea new games were using this much VRAM. I'll be sure to get something with no less than 16GB of VRAM next time I buy a GPU.

    • @Karibanu · a year ago · +39

      Some of the stuff I've been running has been using all 16GB of VRAM and *still* spilling into system memory...
      The emergency 6800 XT (thank you, card death one month out of warranty!) was a good purchase; thankfully I got one of the lower-binned versions on sale and saved about £300 at the time. I'd never get anything below 16GB now, and frankly still want more. Less bothered about clocks.

    • @tre-ou5gt · a year ago · +1

      ​@@Karibanu what resolution do you play at?

    • @Karibanu · a year ago · +5

      @@tre-ou5gt 1440p, occasionally 1440p triple-screen.

    • @orcusdei · a year ago · +2

      I've just bought a 4070 with 12GB... the issue is that I can't fit any AMD card into my computer.

    • @kontoname · a year ago · +29

      @@orcusdei Did the thought occur to you that a decent PC case is way, way cheaper than a GPU?
      I'd never buy a crappy GPU just because the better one doesn't fit, especially not if that GPU is way more pricey than the alternative card. You could literally have bought a decent future-proof case, had a nice meal with the family at a restaurant AND bought a GPU instead of a 4070 🤣🤣🤣🤣💀

  • @stianh.4587 · a year ago · +28

    My choice a year ago was between the RX 6800 and the GeForce 3070. I'm happy to say I made the right choice.

  • @philosophyetc · a year ago · +2328

    RIP in peace to anyone who paid $1400 for a GPU with 8GB of vram two years ago

    • @tpmnrcks · a year ago · +112

      Desperate times..

    • @RifterDask · a year ago · +100

      I paid $360 for one six months ago. In my defense, it was an upgrade from a 6GB 2060

    • @znoozaros · a year ago · +69

      Paid €400 for a 3070 Strix a month ago. Now waiting for my €729 6950 XT Red Devil to arrive.

    • @breadkrum6860 · a year ago · +176

      Rest in peace in peace lol

    • @King-O-Hell · a year ago · +75

      I did pay like $1100 for a 3070 Ti because I didn't really think things would get better, as far as price goes. I should have stuck with my 1070 and waited :)

  • @whiteglovepc · a year ago · +739

    What about the planned-obsolescence angle, with NVIDIA purposefully limiting VRAM in order to encourage consumers to upgrade sooner? Seems a clear strategy at this point.

    • @Hardwareunboxed · a year ago · +647

      They're only going to give gamers as little as they can get away with. This is obviously an excellent strategy on their part.

    • @TheHighborn · a year ago · +5

      My second-ever notebook was a Lenovo Y510P with 2x 755M in SLI (Nvidia). At the time I didn't know anything about PC hardware, so I figured 2 GPUs are obviously better, right?
      Anyway, Nvidia came out with ShadowPlay and I was so happy I could use it and record stuff, as FRAPS really sucked. I was recording my League of Legends gameplay left and right.
      A few weeks passed, and I couldn't use ShadowPlay anymore. I googled it and found out Nvidia had disabled it, but there was a workaround, so I used said workaround. A few weeks passed again, and that workaround was disabled too. That was the moment I stopped wanting Nvidia products. When I built the PC I use today, I bought a Vega 64, despite almost buying a 1080 Ti. Now I'm buying a new computer, and looking at the things Nvidia does, the obvious choice was the 7900 XTX. (The card is in the mail, and it's a Liquid Devil, so I can't use it yet.)
      Fuck Nvidia.

    • @BleedForTheWorld · a year ago · +68

      Gotta love that profit driven economy

    • @kaisersolo76 · a year ago · +11

      @@Hardwareunboxed So why did they come out with the 3060 12GB? It was a weird card when it came out, in terms of the VRAM it got. Did Nvidia finally see the problem coming?

    • @1armbiker · a year ago · +132

      @@kaisersolo76 they saw 6GB as too little at the time, as it would have had issues with some titles at launch. Remember you can only change VRAM amounts by doubling or halving because of the bus width.
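A minimal sketch of the doubling constraint described above (an illustration, not anything stated in the video): each GDDR6 chip occupies its own 32-bit slice of the memory bus, so the chip count is fixed by bus width, and total capacity can only scale with per-chip density, which at the time came in 1GB and 2GB options (clamshell boards that pair two chips per channel are ignored here):

```python
# Why VRAM options come in doubling steps: chip count is fixed by the bus,
# capacity scales only with the density of each chip.
GDDR6_DENSITIES_GB = [1, 2]  # 8Gb and 16Gb chips, the common options in 2020

def vram_options(bus_width_bits: int) -> list[int]:
    chips = bus_width_bits // 32  # one chip per 32-bit channel slice
    return [chips * d for d in GDDR6_DENSITIES_GB]

for card, bus in [("RTX 3060", 192), ("RTX 3070", 256), ("RTX 3080", 320)]:
    print(f"{card}: {bus}-bit bus -> {vram_options(bus)} GB")
# RTX 3060: 192-bit bus -> [6, 12] GB   (Nvidia shipped 12)
# RTX 3070: 256-bit bus -> [8, 16] GB   (Nvidia shipped 8)
# RTX 3080: 320-bit bus -> [10, 20] GB  (Nvidia shipped 10)
```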

  • @vor78 · 11 months ago · +54

    I was in the market for a GPU upgrade at the start of the year and it came down to these two as the primary contenders. I had been a longtime user of Nvidia products, but this time around, I went with the RX 6800, with the VRAM issue being one of the major deciding factors. Watching this video, I'm very thankful that I did.

    • @briefingspoon380 · 5 months ago

      How's she been treating you so far?

    • @vor78 · 5 months ago · +5

      @@briefingspoon380 I'm very satisfied with the 6800 and have had no issues with it at all.

    • @briefingspoon380 · 5 months ago

      @@vor78 What kind of fps are you getting, and in which games?

  • @Puremindgames · a year ago · +59

    It happened with 128MB, 256MB, 512MB, 1GB, 2GB and 4GB; it was only a matter of time until 8GB joined the "not enough" graveyard. I think some are just shocked at how fast it seemed to happen.

    • @AFourEyedGeek · 11 months ago · +20

      And yet NVIDIA is selling a 4060 Ti with 8GB of VRAM in 2023.

    • @drek9k2 · 5 months ago · +3

      It really didn't though. 2, 3, 4GB came in pretty fast succession; it took a while for 8GB to no longer be enough, because the RX 480 had 8GB a long, LONG time ago and it was still fine until a few years ago. And at the time of release it was already not enough: Doom Eternal was taking up 11GB, and at 1440p Cyberpunk took up enough VRAM that it had texture issues.

    • @Bassjunkie_1 · 5 months ago · +1

      @@drek9k2 Agreed. I think the people who are shocked are the ones who recently bought a 3070 on sale, or maybe even a 3080, while the rest of us from the R9 290X 8GB and RX 480 8GB era are not.

    • @drek9k2 · 5 months ago · +3

      @@Bassjunkie_1 Oh true, that's a good point. You know, I totally forget that for a lot of people this may be their first actual video card. I'm not saying you needed to be from the 8800 GT days, just that I keep forgetting many of these kids weren't even around for the RX 580 8GB yet. I guess a lot of people are really, really truly clueless.

    • @alexarose2996 · 5 months ago

      Go play the new version of Uncharted on PC. I have no complaint about needing a new card with 16GB of VRAM when the game looks like that. RIP 2070 Super and 8GB cards, you won't be missed.

  • @extragooey813 · a year ago · +461

    nVidia's planned obsolescence has worked... I'm gonna buy a new GPU to replace my 3070. Grats nVidia, you've driven me into getting an AMD RX 7900 XTX.

    • @Dempig · a year ago · +57

      I paid $850 for a 3070 Ti on launch day... what a mistake lol, but I'm now switching to a 7900 XTX as well soon.

    • @macdonald2k · a year ago · +11

      Nothing wrong with 4090 too, but I get that $1600+ isn't necessary for a good experience.

    • @pcdog8903 · a year ago · +38

      I’m enjoying my 7900 XTX. I think you will like it.

    • @extragooey813 · a year ago · +34

      @@macdonald2k Yup, the best card right now. But I choose not to give nVidia any more of my money.

    • @extragooey813 · a year ago

      @@Dempig I have a 3070 FE. Got it a few weeks after release when the only way to find one was to get lucky on Best Buy with constant website refreshes on stock days. Should have resold it when it was going for >$1000 lol.

  • @MrLimeyLime · a year ago · +251

    I was using an RX 580 for many years, and I feel what made that card age so well was its 8GB of VRAM. At launch that was a pretty high amount, given NVIDIA's equivalents were using 3 and 6GB. So when I finally moved away from that card, it was obvious the RX 6800 would be the right pick for me. It's been good to see the card already aging well, for the same reasons.

    • @kaapuuu · a year ago · +18

      I switched from a 1060 to an RX 6800. I considered the 3070 at release, as the price was very similar, but I'm an enthusiast and I knew 8GB wouldn't cut it at 1440p within a couple of months. It happened, yet Nvidia claimed it's not a problem.
      A 10GB 3080? What a joke. NVIDIA made fools of consumers.

    • @chrissoclone · a year ago · +9

      The RX 580 was the best bang-for-buck card I ever had, and yes, 8GB of VRAM played a large part in that. Before that my favorite bang-for-buck card was a 660 Ti; it was quite excellent, and the ONLY reason I had to replace it was that even back then nVidia was just too stingy with their VRAM. It could've lasted me even longer if not for that.

    • @MetalGeek464 · a year ago · +3

      I made this same upgrade recently. The additional VRAM and Nvidia's bonkers pricing were key drivers in my decision.

    • @sacamentobob · a year ago

      you must have avoided the lemons quite well!!

    • @MrLimeyLime · a year ago · +1

      @Varity Topic The 6750 XT is still a pretty solid choice!

  • @petarracic6740 · a year ago · +44

    Just to add to this, it was a real kick in the nuts when I bought the 10GB 3080 and then the 12GB variant came out after... I wanted the extra VRAM too; it should have been the first product they released.

    • @eugenijusdolgovas9278 · 11 months ago · +1

      They did, and it was called the 3080 Ti. The 3080 was never meant to last longer than 2 years for 1440p ultra.

    • @CheGames497 · 11 months ago · +4

      Never buy the first version. While it's hip and cool to buy it first, you're usually just a test subject for something better that drops right after, be it Apple, Samsung, monitors, GPUs, CPUs, whatever. Don't get caught up in FOMO, fam.

    • @CheGames497 · 11 months ago

      @@eugenijusdolgovas9278 Yeah, we are a wasteful, throwaway society. We think we need to replace things because corporations have trained us that way, so the profits stay nice. We are just modern-day serfs.

    • @eugenijusdolgovas9278 · 11 months ago

      @@CheGames497 Whilst some are wasteful, I tend to buy GPUs on the used market and rock a 6800 XT right now, which will keep working until I decide to switch to 4K competitive FPS gaming.

  • @bravestbullfighter · a year ago · +99

    My 16GB RX6800 bought two years ago at $579 MSRP aged really well.

    • @elmSTREETnasty · a year ago · +4

      That's what I paid for my Gigabyte Gaming OC 6800 XT a few months ago. About $585 after shipping.

    • @amirhasamtemori1843 · 3 months ago

      Bro, I got mine for only 430 euros from XFX.

    • @Karimftw0.0 · 3 months ago · +1

      lol I'm buying mine for 300 right now

    • @stefannita3439 · 3 months ago · +1

      Got a 3070 for 600 euros in 2020 (GPUs cost more here than in the US). This week I found a one-year-old 6800 XT for 360 euros on the used market and was able to sell my 3070 for about 300. Can't wait to enjoy the next few years with this beast of a card, it's still so powerful at 1440p!

    • @NadeemAhmed-nv2br · 2 months ago

      @@Karimftw0.0 Yeah, but you're getting that performance 4 years later, while I've been using my RX 6900 XT for 4 years now. Getting that performance half a decade later isn't as big of a flex when we'll just move on to something with 2x the performance.

  • @scotty28653 · a year ago · +348

    AMD have really improved their Radeon cards over recent years. I used to use Nvidia cards all the time, but I've been using Radeon for the last 3 years now and love these cards.

    • @SMILECLINIC-2004 · a year ago

      How is the driver support for amd cards?

    • @nicane-9966 · a year ago · +33

      @@SMILECLINIC-2004 It's fine, no worse than Nvidia, which recently had issues as well. They are just slower at releasing new drivers, but that's it. However, I'm not one of those guys who updates their drivers the moment new ones come out. Why would you? It's not like you're gonna get much better performance in newer games; you need to buy a faster GPU for that. If it works well, you don't need to touch it.

    • @mikem2253 · a year ago · +36

      @@SMILECLINIC-2004 No issues as a former 6700xt and a current 7900xt owner.

    • @MisterDirge · a year ago · +4

      I'm using an RX 5700, and I haven't had any driver issues throughout my time with AMD cards (RX 580, Vega 56). I'm happy that the RX 5000 series cards are still getting support and FSR/RSR updates, as it's allowed me to play most games at 2560x1080 and higher than 60fps. I don't really like the metrics overlay that AMD uses, but I can't use RivaTuner without MSI Afterburner, which messes up my undervolting settings.

    • @bumperxx1 · a year ago

      It's been said since 2020: go AMD if you want regular features. They have the best pure rasterization frame rates. But at the low end, with these 3070, 3060, 6800 XT, 6700 XT etc., you won't get much benefit from ray-traced lighting effects, so you need to go higher up the chain. So obviously this test proves that.

  • @PotatMasterRace · a year ago · +724

    It would be very nice to see a 12GB model like the 6700 XT in the comparison as well. Or even an older 11GB card like a 1080 Ti. It would be hilarious if frametimes were more stable on the 1080 Ti than on a 3070 :D

    • @TRONiX404 · a year ago · +64

      Actually, I recently ran God of War on my 1080 Ti / i7-7700K at 4K Ultra High with FSR 2 Balanced or Quality.
      A solid 60fps.

    • @singular9 · a year ago · +50

      Blyat-Vidia gimping its own cards? Who would have thought.

    • @legi0naire · a year ago · +29

      I am having a much better gaming experience with my 1080 Ti compared to some other people with cards twice as fast but only 8 gigs.
      I still get 60+ fps in almost all new games at 1440p, no stuttering.

    • @razoo911 · a year ago · +1

      GTX cards run The Last of Us like shit; zero optimization for that GPU generation, like in every game.

    • @The_Cokimoto · a year ago · +6

      ^This please Steve, you need to do this!!

  • @thefumexxl · a year ago · +95

    Literally just got my 6800 a week ago, and it's been blowing my mind how powerful it is, even with ray tracing.

    • @MrBones105 · a year ago · +12

      I’ve had one for a year now and I honestly haven’t run into a situation where I’m concerned about performance. It’s a really good card!

    • @UKGBManny · 10 months ago · +4

      Interesting. GTX 1080 user here looking to upgrade! The top card my 9700K can handle is an RTX 3070 or a 6800. The 16GB of VRAM is definitely pulling me to team red for sure.

    • @thefumexxl · 10 months ago · +4

      @@UKGBManny For the money right now it's one of the best options. Its closest competition is over $100 more. If you can spend another $100 you can get the XT, whose biggest competition is WAY more expensive than that. Unless you can get an AMD 79xx or Nvidia 3080, the rest of either lineup that might compare is double to quadruple the price. I paid $370 for mine and am still amazed at how much it can handle.

    • @user-mx4gm6pc2o · 9 months ago · +5

      Going to get my 6800 for $350 tomorrow and selling my 6700 XT; looking forward to getting that thing after seeing all those videos.

  • @ademkin · a year ago · +74

    Still rocking the 1070 I bought in 2016. Absolutely love this card; it's been enough to run recent games, granted at medium quality, but they still run and I can play in good conditions. I'm in no hurry to upgrade my GPU and am not a compulsive buyer, and thank god for that, seeing the state the GPU market is in. Hardware Unboxed is one of the channels whose advice I'll be sure to follow when deciding which GPU to upgrade to.

    • @ZeroG · a year ago · +8

      Whatever makes you happy, that's the important thing in life. This man has his head screwed on straight!

    • @cameronblazowich2671 · a year ago · +3

      Idk, I agree the 1000 series may have cost more than previous generations at first, but boy did they have longevity. I picked up a GTX 1070 Ti at the beginning of 2022, and man do I wish I had gotten it sooner! Nvidia, and now AMD, have lost their way, at least when it comes to the gaming side of things, Nvidia more so.

    • @SquillagusNiggle · a year ago · +2

      Same here, EVGA 1070 FTW, an absolute diamond of a card. Lasted longer than any GPU I've ever bought, lasted through COVID and crypto, and plays almost everything I care to throw at it in near-silence. Just replaced it with a cheap second-hand 6800XT to last me until the next 'good' GPU release cycle whenever that will be, but that's only because I'm building an entire new system from scratch.

    • @ArtisChronicles · a year ago · +1

      @@cameronblazowich2671 if we take AMD's perspective into consideration, then the only way they continue to increase profit margins is to charge their 5 customers even more money for their products.

    • @korana6308 · a year ago · +3

      The 1070 is an absolute beast of a card.

  • @instantstupor · a year ago · +159

    Would have also been interesting to see the RTX 3060 12GB vs a higher-end 8GB RTX card, to see if there were cases where the former played better than its more expensive counterpart.

    • @sinephase · a year ago · +3

      The 12GB was too much for that GPU though, so it's not as good as you might expect. At least it won't crash in RE4 as much though, LOL.

    • @ole7736 · a year ago · +2

      The RX 6750 XT might be interesting to throw in the mix for those titles that report ~15GB of total VRAM usage already.

    • @opachki8325 · a year ago

      I'd love to see a 3070Ti with its G6X in this comparison.

  • @corriban · a year ago · +572

    An RTX 4060 with 8GB of VRAM will be pretty much dead on arrival. I can't believe Nvidia's social media team is unable to clearly communicate this to the higher-ups.

    • @Ddofik · a year ago · +89

      And does it really matter? It will still sell really well.

    • @apostolos8734 · a year ago · +80

      @@Ddofik as well as the 4080 did? 😂

    • @BlueMax109 · a year ago · +124

      Imagine thinking Nvidia care about the consumer lol

    • @MoltenPie · a year ago · +36

      They may actually give it 16GB in the end, just like they did with the 3060. I am almost certain the 3060 was planned to be a 6GB card (just like the 2060 was), but they changed their mind at the last moment. No way they planned for the 3060 to have more VRAM than the 3080! But the only thing they could do at that point was double the buffer (same number of channels, double the density of the memory chips).

    • @fepethepenguin8287 · a year ago · +15

      You think the higher-ups are not fully informed?
      They have NO incentive to offer anything more.
      This would be a different story if they were facing the AMD of 8 years ago, which would do anything to gain market share.
      As soon as a competitor comes in offering the same performance with triple the VRAM at half the price, and they truly lose market share,
      then the next gen will be different.

  • @HuebietheGuru · a year ago · +13

    Glad I sold the 3070 two months after purchasing it. I was very disappointed with it and bought a 6800 XT; still in use and happy 😅

    • @HuebietheGuru · a year ago

      @@siwexwot8994 yes. That's what we saw in the video. Dunno why you mention it..? 🤔

    • @HuebietheGuru · a year ago

      @@siwexwot8994 you can still use it in combination with DLSS...

    • @HuebietheGuru · a year ago

      @@siwexwot8994 and that's the point. The 3070 is only a 1080p card today due to its lack of memory... The takeaway is: the 6800 XT was, and still is, the better buy. 😉👋🏼

    • @HuebietheGuru · a year ago

      @@siwexwot8994 yeah, you may be right, but lowering texture quality in 2023 is not future-proof. I'm still glad that I sold my 3070 two months after the purchase and got a 6800 XT in my hands. FSR 2.x and 3.x will keep it usable even longer than the 3070 will ever be. That's my personal opinion 😉

  • @nhand42 · a year ago · +60

    You are accurate as always. 8GB is now budget tier, 16GB is mid-tier, and 24GB is top-end. When Nvidia launched (and then quickly unlaunched) the 4080 12GB, it was such an obvious mistake. Even 16GB gives me pause, because 4K gaming and SS need more VRAM than ever.

    • @Dell-ol6hb · 7 months ago · +1

      especially with ray tracing too

    • @yonislav_9821 · 7 months ago

      Like, really, I don't know how people trust them while they keep cheaping out so shamelessly and setting higher prices.

    • @CSTITAN576 · 6 months ago · +2

      I think 10-12GB is mid-tier. 16GB and above is still pretty high-end.

    • @shinefake48 · 6 months ago

      16GB may not technically be mid-range considering the current generation of games. I would say 8GB is entry level like 4GB was back then, 10GB somewhat standard like 6GB was, 12GB the mainstream for GPUs, and 16GB should be standard for cards that can run 4K, unlike the 3080 and 3080 Ti with that 12GB.

  • @amineabdz · a year ago · +205

    Retesting the 6700 XT would be a great idea, considering it offers 12GB of VRAM and is already very close to the 3070 in compute power but cheaper. It would be interesting to see how it fares against the 3070 in the current VRAM-limitations meta.

    • @KrisDee1981 · a year ago · +21

      Hopefully Steve can do 2080 Ti vs 3070 vs 6750 XT.
      Turing would win this because of much higher RT performance, and 11GB of VRAM is actually the sweet spot for that class of performance.

    • @DragonOfTheMortalKombat · a year ago · +17

      @@KrisDee1981 Turing uses first-gen RT cores, just like the 6750 XT, so I doubt that. But hey, no problem in comparing.

    • @HankBaxter · a year ago

      Actually, that would be awesome.

    • @ComputerProfessor · a year ago

      Yes, very good idea. Hardware Unboxed pleasssseeee

    • @cooper23231 · a year ago · +4

      @@KrisDee1981 🤣🤣 yeah, let's do it until Nvidia wins. Not the consumers; Nvidia must win.

  • @Magmafire · a year ago · +224

    You know what would be an interesting follow-up: including benchmark results from the 16GB Intel A770 in the titles where the 3070 struggled. Just for fun, you could even include data from the Radeon VII for those same titles. That additional data would really drive your point home for owners of 8GB cards.

    • @ERZAY2 · a year ago · +18

      I’d also like to see a 3060 vs 3070 video to see if it can handle these better as well

    • @UTFapollomarine7409 · a year ago · +8

      Hey man, even a Vega Frontier with 16GB would dominate the 3070 lmao, the 3070 is trash.

    • @StevenDolbey · a year ago · +7

      Or a 2080 Ti. Similar GPU performance to the 3070, but more VRAM. Steve mentioned it, but didn't test it this time.

    • @tompayne4945 · a year ago · +3

      Yeah, even going all the way back to a 1080 Ti would be a good shout. I wonder if they even still have a Radeon VII?! 😮

    • @piercewiederecht5135 · a year ago

      Nah, they sold it for $2500 in the mining boom.

  • @WilliamOwyong · a year ago · +46

    This reminds me of the issues we used to have [way back] when we'd run out of system RAM and Windows started using its swap file, causing massive slowdowns. The main difference being we can upgrade system RAM. If only we could do the same with GPUs.
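The swap-file analogy holds up numerically. A rough back-of-the-envelope comparison (using the 3070's published 256-bit / 14Gbps memory spec and the nominal PCIe 4.0 x16 rate; an illustration, not a measured result) shows why spilling into system RAM hurts so much:

```python
# Rough numbers for why VRAM overflow tanks frame times: spilled data is
# fetched over PCIe, roughly an order of magnitude slower than on-card GDDR6.
def gddr6_bandwidth(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: bus width (bits) x data rate (Gbps) / 8."""
    return bus_width_bits * data_rate_gbps / 8

vram_bw = gddr6_bandwidth(256, 14.0)  # RTX 3070: 256-bit @ 14 Gbps -> 448 GB/s
pcie_bw = 31.5                        # PCIe 4.0 x16, approx. GB/s per direction

print(f"on-card VRAM: {vram_bw:.0f} GB/s, PCIe 4.0 x16: {pcie_bw} GB/s")
print(f"every spilled byte is ~{vram_bw / pcie_bw:.0f}x slower to reach")
```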

    • @ArtisChronicles · a year ago · +2

      Maybe I should relive those times by running 4GB today

    • @archthearchvile · a year ago

      ​@@ArtisChronicles and then you can fix it by using an operating system that doesn't suck

    • @moonstomper68 · a year ago · +5

      You might be on to something there. I wonder if some bright spark at Nvidia, AMD or Intel could come up with a way to offer upgradable graphics cards. Buy a card and add compatible RAM, or even swap in same-generation GPUs. It can be done on a motherboard, so why not? It would increase the life of the cards and save on e-waste too.

    • @youfreaker · 10 months ago

      @Moonstomper68 Swapping out RAM works much better on a motherboard, because the motherboard PCB isn't completely covered with heatsinks blocking off the memory chips. And there is not much need to upgrade VRAM when the GPU's performance is already fixed. It's much cheaper to just fit enough VRAM that it won't run out in situations where the card's raw performance could still hold up.

    • @GMBazillion · 10 months ago

      @@archthearchvile And what OS would that be, hmmmm?

  • @grey4980 · 11 months ago · +19

    I wish I could like this video twice! It was so helpful and instrumental in my understanding of how a GPU works, what the heck VRAM is and what it does. It also clearly demonstrated why the graphs can sometimes be misleading, like when the GPU compensates for a lack of memory and avoids crashing or buffering by simply not loading in the textures, which isn't shown any way other than watching the side-by-side.
    Thank you so much! Wonderful video.

  • @falcone5287 · a year ago · +420

    I'm glad I went with an RX 6700 XT. I remember people saying the RX 6700 XT was no match for the RTX 3070, and the AMD card actually performed a little worse initially, but now it is a completely different story thanks to the bigger VRAM buffer and AMD's FineWine technology :)

    • @tilburg8683 · a year ago · +29

      And in fairness, the 6700 XT has always been better, especially at 1080p and above, due to the VRAM.

    • @LowMS3 · a year ago · +16

      My reference 6900 XT beats my buddy's 3080 Ti in most games and synthetic benchmarks. His 80 Ti will outclass my card in some games though!

    • @hardergamer · a year ago · +22

      @@LowMS3 I've been having VRAM problems with my 10GB 3080 for some time, mainly in newer games, and after borrowing a friend's 6800 XT just last week I'm shocked how much better so many games looked and played. Even some 2+ year old games look and play much better; with older games like GoW at 1440p, the texture pack crashes the 3080, but the AMD card runs fine. Anyway, I've just bought a new 6800 XT for nearly half what the 3080 12GB card cost.

    • @dontmatter4423 · a year ago · +11

      The 3070 has enough GPU power to drive 1440p at decent frame rates, but struggles at 1080p purely for lack of VRAM.
      What a shame.

    • @philipgibbon4473 · a year ago · +5

      Same here. I got an RX 6700 XT for what I play; I wanted more VRAM without breaking the bank, but the games I play need more than 16GB of VRAM (Transport Fever 2, CityBus Manager, Cities: Skylines, etc.). I had an 8GB card, but I had unbearable frametimes and a lot of lag due to VRAM overflowing into system memory, and I had people saying you need Nvidia for gaming, it's the best; but my 8GB card was an Nvidia GTX 1070 Ti. I'm like, no, I need more than the 8GB Nvidia was offering in the mid-high end. I get what feels best for me, not for others. It's like being back at high school choosing your college and your friends saying come where I'm going. Sorry, getting off topic, haha.

  • @recordatron · a year ago · +234

    I'm actually pretty gobsmacked by how well the 6800 performs here. The 16GB buffer is an absolute lifesaver, and I'm surprised how well it's doing with RT as well, considering how questionable it was at launch. One GPU will probably be usable at higher settings for a few years yet, and the other is relegated to lower settings and resolution for completely arbitrary reasons. Very frustrating.

    • @angrysocialjusticewarrior · a year ago · +13

      To be fair, the 3070 has never competed with the 6800. The 6800 has always been one tier above the 3070 from the very beginning (even before this new VRAM scandal).
      The 3070 was built to play with the likes of the RX 6700 XT and RX 6750 XT (which the 3070 beats most of the time).

    • @sorinpopa1442 · a year ago · +17

      @@angrysocialjusticewarrior and what about the 3070 Ti? It suffers from the same problem as the non-Ti 3070.

    • @RydarGames · a year ago · +37

      @@angrysocialjusticewarrior no no no. Even the $400 digital PS5 console has over 12GB of memory dedicated to games. Launching the 3070 with only 8GB of VRAM while knowing devs would soon target the consoles as a baseline is treacherous 😢

    • @hansolo631 · a year ago · +20

      @@angrysocialjusticewarrior I agree that the 6800 and 3070 weren't direct competitors, but they were only separated by 70 bucks, and that 70 bucks gets you a MASSIVE advantage now. Like an absurd one.

    • @AvroBellow · a year ago · +11

      What's frustrating is the fact that so many people bought these pieces of (as Steve would say) GAHBAGE just because they were in a green box. Anyone who spent that kind of money on 8GB deserves their fate.

  • @stefannita3439 · 2 months ago · +2

    This video inspired me to start looking at the used market as a way to jump ship from my 3070. I was able to find a one-year-old, still-in-warranty 6800 XT, and after also selling the 3070 I'm only out about 60 or 70 euros! The performance gains are massive, and the extra VRAM is sooo nice for maxing out games, but also for keeping stuff like Discord and YouTube open on my second monitor. Super happy I made the switch, and thanks again for spurring that!

  •  a year ago · +73

    I agree with this. The tiers are now:
    - 24 GB -- High End
    - 16 GB -- Mid Range
    - 12 GB -- Entry Level
    - 8 GB -- "barebones", sub-$200 cards
    This should be the standard in 2023.

    • @FouTarnished · a year ago · +25

      I would agree, but this also highlights the laziness in game development now. There are still more complex games that look better than the likes of The Last of Us while using half the VRAM. Devs need to take responsibility as well, not just GPU manufacturers.

    • @cairnex4473 · 11 months ago · +1

      I'm sure that makes people who bought 10GB 3080s feel good about their purchasing decision ;-)

    • @FouTarnished · 11 months ago · +2

      @@cairnex4473 lol, that card was always a trap. I doubt anyone that bought it feels good about it now. I went for the 3080 Ti and the extra VRAM has been a godsend. Still, I think devs need to seriously re-evaluate their priorities when making games; if a game looks amazing but plays like absolute garbage, it's a bad game.

    • @jalma9643 · 11 months ago · +1

      @@FouTarnished Not every developer became lazy; it's just that technology has advanced a lot. I don't include bad PC ports like The Last of Us or beta-release games like LotR: Gollum. What I mean is, with something like Resident Evil 4 Remake, Capcom really improved the game's graphics, and we can't blame a developer like Capcom for increasing VRAM usage.

    • @thierrybo6304 · 11 months ago

      Agreed. Me, playing with a 1060 3GB ;-)

  • @itmegibby2459 · a year ago · +669

    It would have been interesting to see how the 2080 Ti, with its 11GB of VRAM, compared to the 3070.

    • @KrisDee1981 · a year ago · +78

      I wish Steve would do this comparison next. It's more interesting to see Nvidia vs Nvidia: almost identical performance, but an extra 3GB of VRAM.
      I do have a 2080 Ti, and TLOU runs great at 4K with DLSS Balanced and high settings.

    • @Evie0133 · a year ago · +14

      Now that would be interesting to see :D

    • @deamondeathstone1 · a year ago · +127

      Add the RTX 3060 with 12GB for shits and giggles.

    • @YH-lj9gy · a year ago · +41

      The 2080 Ti is a beast; unfortunately, those 2018 Micron GDDR6 chips are ticking time bombs.

    • @ShinyHelmet · a year ago · +24

      @@deamondeathstone1 Yeah, it would have been great to see how the 12GB 3060 fared in these games compared with the 3070. I imagine a lot smoother gameplay in many instances, especially with DLSS.

  • @bierrollerful · a year ago · +101

    I must admit, I didn't think 8GB would be an issue this soon. Up until recently I thought that, all things considered, the 3060 Ti was *the* card of this GPU generation.
    In the end, I bought a 6700 XT. Less so because of the additional VRAM, but mainly because it was cheaper; and now I'm really glad I did.

    • @PeakKissShot · a year ago · +20

      As soon as you give a dev more ram to work with they will consume it as fast as possible 😂

    • @madtech5153 · a year ago · +15

      The rise in VRAM usage is due to consoles. Consoles used to hold PC gaming back, since companies were still focused on making their games for the PS4/Xbox of the time. Now that the current-gen consoles are quite good and have 16GB of RAM (albeit unified), expect most games at ultra to use around 10-12GB, leaving 4GB for the OS. The safe play would be to buy a 12GB VRAM GPU, but you're guaranteed to be safe with a 16GB VRAM GPU unless you want 4K ultra + ray tracing.
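A quick sketch of that budget arithmetic (the OS reservation figure is an assumption for illustration; actual platform reservations aren't public):

```python
# Back-of-the-envelope console memory budget vs. an 8GB PC card.
console_unified_gb = 16.0  # PS5 / Series X unified GDDR6 pool
os_reserved_gb = 4.0       # assumed OS/system reservation (illustrative)
game_budget_gb = console_unified_gb - os_reserved_gb  # ~12 GB for the game

pc_vram_gb = 8.0           # e.g. RTX 3070
shortfall_gb = game_budget_gb - pc_vram_gb

print(f"console game budget: ~{game_budget_gb:.0f} GB (shared 'VRAM' + 'RAM')")
print(f"an 8GB card is ~{shortfall_gb:.0f} GB short if a port keeps "
      f"console-sized assets resident")
```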

    • @chronology556 · a year ago · +1

      @@madtech5153 VRAM and teraflops on the XBSS compared to the XBSX?

    • @qweasd-xf5pv · a year ago · +4

      @@madtech5153 The PS5 and Xbox Series X have 16GB of shared memory. That means the 16GB is shared by both VRAM and regular RAM. So how come 8GB of VRAM plus 16GB of DDR RAM has become obsolete?
      If you consider texture sizes, the PS5 might only have 6GB of VRAM and 10GB of RAM to work with.
      Dude, devs are f**king up.

    • @dragonl4d216 · a year ago · +4

      The 6700 XT is the card of this generation, not the 3060 Ti. It is cheaper, faster, and has more VRAM than the 3060 Ti.

  • @Lurker-dk8jk · a year ago · +2

    Thanks, Steve. Bought a 2080 Super with 8GB in early 2020, before the lockdowns. Noticed it chugging in the new Dead Space at Ultra settings at 1440p. With RivaTuner running, I knew right away it was VRAM-limited. Drop settings to High, reflections to Medium and ambient occlusion to SSAO, and it's right as rain: a solid 60fps the whole game, except in two spots. The bench on the bridge was the first; can't remember the other one.
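For anyone wanting to reproduce that kind of check without RivaTuner, here's a small monitoring sketch (assuming an NVIDIA card and the pynvml bindings installed via `pip install nvidia-ml-py`; an illustration, not the commenter's exact setup):

```python
# Poll dedicated VRAM usage once per second while a game runs; usage pinned
# at (or near) the card's total alongside stutter is the classic sign of a
# VRAM limit.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    for _ in range(60):  # sample for one minute
        mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)
        print(f"VRAM used: {mem.used / 2**30:.2f} / {mem.total / 2**30:.2f} GB")
        time.sleep(1.0)
finally:
    pynvml.nvmlShutdown()
```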

  • @megumi_0 · a year ago · +6

    I'm coming back to this video 2 weeks later to say thank you. You single-handedly changed the consensus on the minimum VRAM requirement from 2023 onwards.

  • @gusgyn · a year ago · +102

    I'm glad that late last year, when upgrading, I went with a 6800 XT instead of the 3070 Ti. They were similarly priced, but I also thought 8GB was a bit too low, since my GTX 1070 also had 8GB and I wanted an upgrade in VRAM as well. Thanks for your analysis, HW Unboxed!

    • @aHumanCookiee · a year ago · +13

      These two cards are not even in the same league overall, except for RT. You made the right choice, sir!

    • @highmm2696 · a year ago · +5

      Very good choice. The 6800 XT is more in line with the 3080, often beating it at 1440p in a lot of titles (non-RT, of course).

    • @3lDuK371 · a year ago · +1

      I think it was HW Unboxed that sold me on my RX 6800, due to the VRAM and 1440p performance.

    • @gusgyn · a year ago · +3

      @@highmm2696 Thanks. Yes, I know the 6800 XT is often better than the 3070 Ti, but where I live they were practically the same price. Like Apple, Nvidia carries a "tax": their cards are usually more expensive, or drop in price less, due to higher demand. So AMD usually tends to be cheaper.

    • @Kamikatze_Gaming · a year ago

      That's why I bought the 6800 XT instead :)

  • @kennethd4958 · a year ago · +158

    Been waiting for this video... in 2020 I had the opportunity to get either a Sapphire Pulse 6800 or an EVGA 3070, and I'm so glad I went with the 6800.

    • @MrDoBerek · a year ago · +2

      Came from a 5700 XT to a 6800; worth it!

    • @gaav87 · a year ago

      You must have friends at Nvidia, coz the RTX 3070 released in July 2021, 1 year 8 months ago, and true availability came months later in late 2021.

    • @kennethd4958 · a year ago · +2

      @@gaav87 No it didn't. Maybe the 3070 Ti released in 2021, but the 3070 definitely released in 2020, September or October I believe.

    • @kennethd4958 · a year ago

      @@YouTubeTookMyNickname.WhyNot Nice! I had a 5500 XT, 5600 XT and then a 5700 XT before my 6800 lol... I kept wanting more power lmao

    • @kimjiniinii · a year ago

      Yeah, you got lucky; with all the scalpers and miners, they were being sold in bulk before even arriving.

  • @Enivoke · a year ago · +38

    You know what would be fun to run through these exact tests?
    RTX 3060 12GB
    RTX 3080 10GB
    RTX 3080 12GB
    And see if the 3060 will pass the 3080 10GB in some games while comparing what performance had been left on the table had the 3080 originally come with 12GB!

    • @kanbin4189 · a year ago · +3

      Yes, please.

    • @alenko4763 · a year ago · +11

      Hate to burst your bubble, but the 3060 wouldn't get higher fps than a 3080 in ANY game... 12GB or not lol, it wouldn't even be close. Just google performance stats for both and compare.

    • @ArtisChronicles · a year ago

      @@alenko4763 yeah, unfortunately I think 10GB is just enough to actually avoid the issues the 3070 has.

    • @valenrn8657 · a year ago

      At 1080p, the Windows 11 desktop alone consumes about 0.5GB of VRAM.

    • @valenrn8657 · a year ago · +2

      @@alenko4763 The 3060 12GB has superior 0.1% lows compared to the RTX 3070 Ti 8GB.

  • @tosvus · 11 months ago · +7

    I'm very happy using a 6800 with a Ryzen 7600 in my secondary gaming PC. I've never owned an AMD CPU or GPU before and have to say I'm impressed! It will mostly be used for emulation, as I put it all in a compact case.

  • @MonkeyMan125431 · a year ago · +66

    I'd love to see a 3070 vs 2080 Ti comparison, since most recommended specs specifically mention a 2080 Ti instead of a 3070 for 1440p 60fps at ultra settings, even though they're comparable in rasterization; most likely due to VRAM.

    • @Keivz · a year ago · +6

      That would be the most sensible comparison.

    • @knuttella · a year ago · +10

      and maybe add the 1080 Ti to the mix just for the lulz

  • @PKDeviluke25 · a year ago · +221

    Once I ran into VRAM issues, I ditched my 3080 12GB for a 7900 XTX 24GB. What an amazing card; it just lets me max out anything at 4K with loads of VRAM remaining. I'm on the Team Red bandwagon for now.
    EDIT: When I mention maxing out everything, know that I do not use ray tracing. If you want good RT performance, go with the newer NVIDIA cards.

    • @alexts4920 · a year ago · +1

      Can it play at 4K 120fps with maxed-out settings?

    • @dybrado1133 · a year ago · +12

      @@alexts4920 For which game are you asking?

    • @AdiiS · a year ago · +3

      I go green; 4090 > everything.

    • @megamanx1291 · a year ago · +62

      @@AdiiS 4090 is too expensive man

    • @chriswright8074 · a year ago · +1

      @@alexts4920 not exactly

  • @leonavis · a year ago · +3

    lol. Back in 2021 the saying was "If you just want rasterization, buy AMD; if you want RT, buy Nvidia." Funny how that turned out.
    For me it was never a question, since Nvidia doesn't give us Linux nerds open-source drivers.

  • @katherinesilens2994 · a year ago

    This is why I watch you guys (and the other Steve). Good insight into potential future issues, but also willing to talk from the data even if it's slightly contrary to expectations.

  • @kolle128 · a year ago · +62

    It's funny how NVidia is pushing ray tracing, yet turning it on is exactly when their card drops to its knees because of VRAM, despite probably having superior ray-tracing accelerators.

    • @Kuriketto · a year ago · +16

      It's also bewildering how people blame developers and poor optimization, justified or not, for low performance on 8GB cards, yet there doesn't seem to be much outrage about Nvidia pushing RT so heavily to sell 20-series cards back when support for it was practically non-existent and enabling it now on virtually any card from that generation will bring it to its knees. Is that also poor "optimization" on the part of developers or Nvidia overpromising and underdelivering?

    • @chronicalcultivation · a year ago · +5

      It's not even "probably"; the RT cores make a huge difference and the Nvidia cards should be vastly better at it, so the lack of VRAM is without a doubt a big issue.

    • @sammiller6631 · a year ago · +2

      @@chronicalcultivation Don't exaggerate the difference RT cores make. Sucking less does not mean RT cores don't suck. And that's only the usual limited RT.
      The RTX 4090 still only manages 17 FPS in Cyberpunk 2077's RT Overdrive mode.

    • @chronicalcultivation · a year ago · +2

      @@sammiller6631 As someone who owns cards from both AMD and Nvidia, and also a PS5: no, it is not an exaggeration. The performance hit from enabling RT is significantly more noticeable on an RDNA2 card or console than on an Ampere GPU.
      It's not totally unusable on AMD, but it's a lot worse.
      I was really hoping the 7000 series would catch up and be equal to Ada in that department, but they seem to have only reached Ampere-level RT instead.
      I've always been more of a Radeon guy for almost 20 years, so the last few generations have been a bit disappointing.
      It was better when it was still ATi.

    • @AvroBellow · a year ago

      It's funny how, despite this, there were enough people stupid enough to buy them.

  • @GewelReal · a year ago · +418

    I can't believe the mental gymnastics people go through to justify buying Nvidia when they don't absolutely require it.

    • @schwalmy8227 · a year ago · +26

      One of my buddies got a 6700 XT recently and had horrible driver issues, and some games performed really poorly compared to others. Could've been his fault, but I think AMD still needs to work on their drivers more.

    • @EarthIsFlat456 · a year ago · +16

      Well they're better products (albeit more expensive) so that's really the only justification needed.

    • @remasteredzero4076 · a year ago · +58

      @@schwalmy8227 if there was a problem with the driver, he could just use an older one.

    • @NostradAlex · a year ago · +32

      It's actually normal behavior. They bought something, and they want people to assure them they made the right choice.
      It's human nature. Even when we know we're wrong, we'll keep telling ourselves we're right in hopes that we'll eventually believe it.
      It's just lying to ourselves. And we do it not just with things we buy, but with all sorts of things; even relationships, to give an example.

    • @GewelReal · a year ago · +68

      @@schwalmy8227 You're saying that as if Nvidia drivers were perfect.
      I've had so many issues with Nvidia's that, for me, it's not an argument for going team green. Their only advantage right now is the professional space, IMO. For someone who just plays video games, AMD is a clear choice.
      Hell, even Intel has a lot to say here.

  • @TheDukeofPretzels · a year ago · +47

    I really appreciate the revisit. I've been watching used GPUs and you saved me from a costly mistake.

  • @DownwithEA1 · a year ago · +3

    This is very useful information. Thank you so much for doing the work. If the mid-to-lower end of the 40 series is still 8GB, then it is very probably a hard pass. Looks like I'm coming back to Radeon for my next build.

  • @waynesabourin4825 · a year ago · +90

    NVidia planned this, knowing what was coming, as they have insight into what upcoming devs are working on. All their products had gimped VRAM amounts so they could justify everyone buying new cards sooner.

    • @DebasedAnon · a year ago · +22

      The worst part is that the customers who got burned by Nvidia's BS will just go "oh, I guess my GPU is getting old" and buy a newer Nvidia one without a second thought.

    • @cin2110 · a year ago · +1

      Planned obsolescence doing its work.

    • @Thysamithan · a year ago · +3

      @@DebasedAnon I have a 3070 and my next card will be an AMD card.
      I've had enough of Nvidia's increasingly annoying nonsense for good, and AMD cards are now more than a viable alternative.

    • @DebasedAnon · a year ago · +5

      @@Thysamithan I'd say AMD outright won the previous generation before the entire GPU market went up in flames; if cards had stayed around MSRP, Nvidia was trounced across the board.
      If people cared a tiny bit more, the market wouldn't be as bad as it is now. It took the absolute farce that is the 4080 at $1200, plus a recession, to make people NOT buy Nvidia's products.
      With that being said, AMD is being way too passive for my tastes. They could afford to go for the jugular right now, due to having vastly lower costs, but they're content to undercut Nvidia by small amounts (relative to the price).

    • @thestig026 · a year ago · +1

      @@DebasedAnon My issue is that I know Nvidia is screwing me over, but I'm split between gaming and doing AI projects on the side in my free time, so I have no choice but Nvidia, since the workflows rely on CUDA. If AMD bothered to push ROCm and make it as viable as CUDA, and made FSR better than DLSS, I wouldn't even consider an Nvidia card.

  • @nomustacheguy5249 · a year ago · +201

    I couldn't be happier getting my 6800 last year for $480. Apart from the odd driver issue (nothing serious), I'm a happy AMD customer so far. Nvidia became too rich for my taste with the 3000 series. Now both AMD's and Nvidia's newer-gen products are luxury goods.

    • @valtonen77 · a year ago · +2

      I tried an AMD CPU, and after how shit it was, how it didn't work and I couldn't even update the drivers, I'm just never trying AMD again.

    • @PotatMasterRace · a year ago · +63

      @@valtonen77 Something's terribly wrong with your basic tech/soft skills.

    • @thefurmidablecatlucky3 · a year ago · +8

      Nvidia with their 4000 series GPUs: "you're too poor to buy our products, so we don't need you anymore"

    • @madtech5153 · a year ago · +1

      @@PotatMasterRace ?? There is nothing to do; it's just plug and play. Dunno why he had a problem, but if Linus had no problem, it more than likely applies to the general population as well (yes, I'm dissing Linus).

    • @sven957
      @sven957 Před rokem

      What scares me off buying AMD is all of these driver issues I read about. For example, they had extremely high idle power usage for a really long time, and I don't even know if it's fixed now. And then there is FSR, which is just not nearly as good as DLSS, ray tracing, etc.

  • @tylerkidby8991
    @tylerkidby8991 Před rokem +20

    Feel a bit bad about buying my 3070 now; haven't used RT once, and the RX 6800 is smashing it in these benchmarks.

  • @josephmuniz9959
    @josephmuniz9959 Před rokem

    I’ve never watched one of your episodes but I am glad I did tonight. Well presented and well narrated, nice work dude. I’ll be watching more of your content.

  • @rustler08
    @rustler08 Před rokem +194

    The 6700 XT, the 6800, and the 2080 Ti are the only cards you should be looking at in the $350-450 range. Maybe sometime in the future we can add the A770 to that list.

    • @themaxster_
      @themaxster_ Před rokem +9

      @Cm Not to mention the problems that arise with early GDDR6. If your 20 series card has a certain type of early memory it could fail at any point now!

    • @steelkinq3708
      @steelkinq3708 Před rokem

      @Cm You've been able to find the 6700 XT new for €399 for weeks in Germany. The price bump to the 6750 XT is not worth it.
      Example: Mindfactory currently has the Sapphire Pulse 6700 XT for €383.05

    • @steelkinq3708
      @steelkinq3708 Před rokem +9

      Steve revisited the A770 after a major driver update, and if I remember correctly the A770 and 6700 XT were around on par

    • @amcdonald7479
      @amcdonald7479 Před rokem +1

      My Aorus 6800 Master still runs anything I throw at it. As long as my online multiplayer games can pull 144 fps on the 5% lows and no lower than 120 on the 1% lows, I'm happy. As for offline games like Resident Evil or Atomic, still all good. Ray tracing at mid though.

    • @karma9007
      @karma9007 Před rokem

      @Cm €348 second hand here in Denmark, or a bit cheaper I've seen.

  • @jacobvriesema6633
    @jacobvriesema6633 Před rokem +194

    I waffled between the 3070 and 6800 back in 2021, and probably would have bought a 3070 if I could have gotten it anywhere near MSRP. Instead, I bought a $900 Gigabyte RX 6800 Aorus Master. While I wasn’t thrilled with the price, I’m glad to hear I made the right choice. HUB’s concerns about VRAM were a major deciding factor for me. Thank you, HUB!

    • @TBRANN
      @TBRANN Před rokem +3

      Best Buy is still selling that for a cool $1,400+ 😂😂😂

    • @jacobvriesema6633
      @jacobvriesema6633 Před rokem +1

      @@TBRANN oof! Newegg still has it listed for $1,000. I probably bought the most expensive RX6800 out there 😂.

    • @TBRANN
      @TBRANN Před rokem +4

      @jacobvriesema6633 hell $900 is a steal apparently 😳

    • @KaHzAdaNe
      @KaHzAdaNe Před rokem +2

      I feel your pain at that price too... I had to buy a 6700 XT for €700 in that same period... If only my GPU hadn't died on me back then, I'd gladly get a 6800 XT for €650 right now...

    • @DELTA9XTC
      @DELTA9XTC Před rokem

      I can buy a 6950 XT for €650 atm lol

  • @BonusCrook
    @BonusCrook Před rokem +3

    Nice to see that someone fixed the dip in Callisto Protocol from team red. If you check the 3080 and 6800 XT results in this game from prior test videos, the Radeon card has huge 1% low issues where fps would just tank in certain areas

  • @stealthshock1
    @stealthshock1 Před rokem +2

    Some people are missing the point in the comments. Limited VRAM is planned obsolescence; most people don't upgrade their GPU often. It intentionally keeps a high-end graphics card from its full potential.

  • @ziyaceliksu
    @ziyaceliksu Před rokem +43

    You showed very well that not everything is about fps. Most people were looking at the fps results and passing by. We understand again how complicated the user experience is. Congratulations

  • @pb602
    @pb602 Před rokem +94

    I saw the 8GB of VRAM being an issue back in 2020, because Shadow of War (a 2017 game) requires more than 8GB of VRAM if you want to use the highest texture pack with all the settings at max. That's why I ended up going with the 6800 XT.

    • @aaz1992
      @aaz1992 Před rokem +1

      Aging like fine wine

    • @jayclarke777
      @jayclarke777 Před rokem +1

      @Steven Turner Shadow of Mordor wasn't a grindfest

    • @Mark13376
      @Mark13376 Před rokem +3

      Far Cry 5, too. I’ve seen it use almost 10GB when you supersample it

    • @tkanal1
      @tkanal1 Před rokem +3

      I had a 6700 XT with 12GB and some games could easily fill 9-11GB on the highest preset at 1080p!!! 8GB is not enough... now I also have a 6800 XT, and there it is normal for games to use almost all of the 16GB at 4K...

    • @jkahgdkjhafgsd
      @jkahgdkjhafgsd Před rokem +3

      Modded XCOM 2 used almost 8GB on my 1070 at 1440p in 2016. I upgraded this year and anything less than 12GB was a non-option to me, which meant NVIDIA auto-excluded itself

  • @kozad86
    @kozad86 Před rokem +27

    I woulda loved to see the 16GB A770 on these charts too, even just as a wildcard.

    • @josh2482
      @josh2482 Před rokem

      Intel is really providing a lot of value for entry level PC builders with the A770. If I was just starting out building PCs I would buy that one.

    • @georgepanathas2009
      @georgepanathas2009 Před 10 měsíci +3

      An unstable card, for this kind of results?!?! No?

    • @theplayerofus319
      @theplayerofus319 Před 10 měsíci

      @josh2482 No? It still has a lot of crashes and is nothing for a PC beginner tbh

  • @niko5514
    @niko5514 Před rokem

    Excellent review. The long graphs with FPS numbers do not show smoothness. Maybe you can adapt this to your GPU reviews in general. However, I believe it is much work to show in-game footage. Maybe you can use some metric, like Igor's Lab does, to measure and evaluate “smoothness” via a statistical approach on frame-to-frame times.

  • @Alcatraz760
    @Alcatraz760 Před rokem +20

    Imagine making such a good core architecture and software like DLSS, only for it to be thrown away because of not enough VRAM. It really is the epitome of making an F1 car only to fit it with tyres from a Prius.

    • @od13166
      @od13166 Před rokem

      Outdated ShadowPlay that doesn't have a capture preview

  • @AxTROUSRxMISSLE
    @AxTROUSRxMISSLE Před rokem +275

    This makes me really feel good about my recent upgrade to the 6800 XT; hopefully it'll do great at 1440p for years to come

    • @Doomrider47
      @Doomrider47 Před rokem +39

      On the future side of things, team red has always seemed to have an eye for longevity. Just nabbed a 7900 XTX myself after I saw my venerable RX 580 8GB was minimum spec; after 5 years of use it was a no-brainer to go red again.

    • @AxTROUSRxMISSLE
      @AxTROUSRxMISSLE Před rokem +8

      @@Doomrider47 Agreed, I never really dipped into AMD until I got a Ryzen 2700 a while ago. I don't think I will go back to an Nvidia card for the foreseeable future unless AMD starts sticking its fists in consumers' asses more than normal. I'm honestly hoping Intel really brings a big update to its next GPUs; maybe both AMD and Intel putting pressure on Nvidia will knock them down a peg a bit.

    • @pyronite3323
      @pyronite3323 Před rokem +3

      @@Doomrider47 How big was the jump? I have a GTX 1070 Ti (which has similar performance to the RX 580) and I was also planning to buy a 7900 XTX, or maybe wait 2 years and buy a 5080.

    • @nishanthadda8824
      @nishanthadda8824 Před rokem +7

      @@pyronite3323 Wait a bit more. Or if u can get a 6800 XT for less than 500 bucks, that's a steal.

    • @musouisshin
      @musouisshin Před rokem

      @@pyronite3323 If u can grab the XTX for anywhere close to $700-850, just go for it, it's worth it. Tbh I have a 3-year-old 2070S 8GB but I don't regret it for $170

  • @Crankshaft_NL
    @Crankshaft_NL Před rokem

    Nice video! I would like to see a video where we can see you testing and talking us through your testing process. Like if you have to change hardware a lot, like in the B650 test. How do you make your time efficient? Do you use a USB stick with all drivers for every brand? You test a lot, and the way you do it has a massive effect on time consumption and quality of analysis. It would be nice to see a video like "behind the smoke" 😂

  • @silasmayes7954
    @silasmayes7954 Před rokem +3

    This perfectly encapsulates my recent thoughts. I really wanted to get a 3070 back around November of last year, but the price was a lot more than the 6700 XT that I got for around $380. Really happy with that purchase, as I now know I'll get a good 3-5 years at least with it.

    • @moox556
      @moox556 Před rokem

      In my country the price difference is like nothing. So is it worth buying the RX 6700 XT over the RTX 3070 just for the higher VRAM?

    • @silasmayes7954
      @silasmayes7954 Před rokem +1

      @@moox556 Yeah, I'd do it again in a heartbeat even at the same price. Modern games use a lot more VRAM. 8GB isn't enough anymore like it used to be.

  • @Moshenokoji
    @Moshenokoji Před rokem +47

    I had a 3070 for a little while, it was a fine card. Ended up with a Radeon 6800XT not long afterwards and sold the 3070. I don't regret it.

    • @Vis117
      @Vis117 Před rokem +1

      I’m curious what the value of a 3070 will be in a couple more years as this continues to happen.

    • @toxicavenger6172
      @toxicavenger6172 Před rokem +3

      @@Vis117 it'll sell very poorly I think. The market will be flooded with them in due time.

    • @davidyang9902
      @davidyang9902 Před rokem +1

      I went from a 3060 Ti to a 6800 XT, happy about the decision, especially since I play Call of Duty

    • @musouisshin
      @musouisshin Před rokem

      Smart guy

    • @LikeAGroove
      @LikeAGroove Před rokem

      I did exactly the same thing

  • @JamesSomersetActor
    @JamesSomersetActor Před rokem +16

    It's funny to me that you got comments complaining about your VRAM criticisms. Part of the reason I went with the 6800XT (aside from the fact I was able to get one for not much more than a 3070, and I would have been able to get a 6800 for a bit less than a 3070 if I'd gone that way) was because of your VRAM comments.
    I am, therefore, very grateful for your VRAM criticisms personally.

    • @madtech5153
      @madtech5153 Před rokem +1

      A lot of people who criticize HUB's VRAM criticism say that speed is all that matters, size doesn't. It was already quite clear back in The Last of Us Part 1 benchmark that speed doesn't help when there isn't enough VRAM.

    • @BitZapple
      @BitZapple Před rokem +1

      The 3070 Ti with 8GB is faster than the 6800, but he chose to compare two different tiers of performance and keeps riding on bottlenecking issues when trying to run Ultra on badly optimized titles with a mid-range card from years ago

    • @mmis1000
      @mmis1000 Před rokem

      I actually encountered VRAM problems when using my old RX 580 8G. I sometimes needed to stop the browser / kill dwm.exe so the game would run smoothly. That was in 2021, the time I decided to upgrade my graphics card. And then I saw the 3080... 10GB. I was like "what the hell? How on earth is 10GB enough in 2021?" Ended up going RX 6900 because Nvidia cheaped out on the VRAM for the whole 30 series.
      Nvidia plays a good game here.

  • @joskamps4711
    @joskamps4711 Před rokem +40

    I know this comparison would be a little outdated, but could you do a comparison of an older title with an RTX 3060 (12GB) vs RTX 3070 (8GB) to see how resolution and settings scaling do? This should also highlight the limitations of the 3070 compared to Nvidia's own cards.

    • @unholydonuts
      @unholydonuts Před 6 měsíci

      That would be an interesting comparison: how many more frames the 3070 would get at 1080p for $200 more.

  • @nfsm654
    @nfsm654 Před rokem +17

    Thanks for the content Steve and team. This is a really important subject IMO and I'm very grateful you guys have covered it.
    The amount of VRAM really came into my consideration when choosing between a 3080 Ti and a 6950 XT ($1400 vs $1000 AU at the time). This information really would've been handy then, but anyone who had taken the blinders off saw this coming a long time ago. Now I also feel more at peace recommending AMD's higher-VRAM offers to my friends who have asked for advice. Cheers guys

    • @tanthokg
      @tanthokg Před rokem +1

      Same here. Months ago I was considering a 3080 Ti 12G for my new build; it costs around 1000 USD. Until 3 days ago, I saw on a site that I could buy either a 7900 XT at 900 or a 7900 XTX for 1000. Not only is the performance greater (for non-RT), but the VRAM capacity is also significantly bigger

  • @murdergang420
    @murdergang420 Před rokem +337

    As an owner of both GPUs, I approve of this video

    • @murdergang420
      @murdergang420 Před rokem +9

      I noticed it in CoD, Forza and other games on the 3070
      Any questions, feel free to ask 🙏

    • @fabiusmaximuscunctator7390
      @fabiusmaximuscunctator7390 Před rokem +2

      Does OCing the memory help to mitigate the issue?

    • @jinx20001
      @jinx20001 Před rokem +7

      @@murdergang420 Have you noticed who sponsors the vast majority of these problem games though? I mean seriously, I feel like I'm the only one that notices this coincidence. Every game in this video with serious problems is an AMD sponsored title; that means they work closely with AMD when developing the game... I'm sure it's nothing though, right? No way AMD would be turning some knobs here to make games run badly on certain Nvidia cards while taking advantage of the one huge advantage AMD have... nahhh, couldn't possibly be, it's just Nvidia BAD and we need to remember that.
      Not like anything like this has happened before, with Nvidia tessellation which crippled the opposition's cards. Honestly I'm very surprised outlets like HWU don't at least acknowledge this and question it even a little, that something underhanded and fishy is going on here when every game with a problem has AMD's hands on it. Always those games that come out with terrible RT as well; just a coincidence, I'm sure they have no say at all lol.

    • @slylockfox85
      @slylockfox85 Před rokem +33

      @@jinx20001 So do you have the same issues with video editing software like Adobe running better on Nvidia hardware? Because it's been that way for the longest time, and I've never seen conspiracy theories mentioned about performance. You should be more upset with Nvidia for putting out a crippled card with 8GB of VRAM. If they had at least put out a competent product, it would be easier to entertain accusations of bad optimization.
      That said, when AMD got the console contracts from Sony and MS, it was predicted that game optimizations might start to favor AMD.

    • @crib8007
      @crib8007 Před rokem

      @@murdergang420 For some games, have you experienced stutter due to shader cache with the 6800? Destiny, for example, experiences high stutter for a bit, then over time gets better. I just find it annoying that this happens every GPU driver update. It makes me not want to update the driver.

  • @HWMonster
    @HWMonster Před rokem +112

    Thanks for the revisit, Steve! As I said over 2 years ago: the RX 6800 was probably the best graphics card released in recent years. It was faster, more efficient and more future-proof than the 3070 from the start, and yet the 8GB VRAM cripple had to be hyped because it can do the oh-so-great DLSS and has much better ray tracing performance. Good job, press and influencers.

    • @noanyobiseniss7462
      @noanyobiseniss7462 Před rokem +1

      czcams.com/video/kKSBeuVlp0Q/video.html

    • @thodorisi.2889
      @thodorisi.2889 Před rokem +5

      Yes, but in most places in Europe it was out of stock, or when it was in stock it was more expensive than a 3080 10GB

    • @Nimbus3000
      @Nimbus3000 Před rokem +8

      They can only review what they know, whether games will end up using more VRAM imminently is speculation and the reality is VRAM usage of games *in general* has been pretty stagnant for years. (Largely due to consoles, imo) I agree that Nvidia shouldn't be skimping on memory size or memory bus, but you can't really blame press for reviewing the cards they got at the time they got them with the information they had.

    • @mircomputers1221
      @mircomputers1221 Před rokem +1

      Yes, the efficiency in every metric is incredible, but that got it very close to the XT version in real-world prices

    • @DlAlYlVlilD
      @DlAlYlVlilD Před rokem +3

      Comparing the 3070 to the 6800 while skipping the 6700 XT is like comparing the 6800 XT to the 3090 while skipping the 3080... depends if u are a red or green team fanboy... why not compare the 3070 to what AMD was promising as its direct competitor at that time, the 12GB 6700 XT? Both of those companies crapped on gamers equally in recent years.

  • @akhmadarifluthfi6713
    @akhmadarifluthfi6713 Před rokem +26

    Great video as always! I would like to see how 12GB GPUs like the 3060 and RX 6700 fare in these newer games.

    • @nguyenhanh9479
      @nguyenhanh9479 Před rokem +3

      With the price of the 3060 you can just buy a 6700 XT, which will shit on the 3060 any day.

  • @odetas9597
    @odetas9597 Před rokem

    This is a great video, great job showing the effects on screen

  • @ksevinamaniego
    @ksevinamaniego Před rokem +65

    I'm impressed by the RT performance of the 6800.

  • @belema22
    @belema22 Před rokem +29

    Makes a whole video to say I TOLD YOU SO! No but seriously, great work. I bought a 6800 2.5 years ago based on your original videos!

  • @ZiggyMeister
    @ZiggyMeister Před rokem

    Great work Steve! This looks like important information!

  • @cronos4267
    @cronos4267 Před rokem

    Great video. Glad I watched, as I was considering the RTX 3070, but not now. I game at 1440p on a GTX 1070 atm, whose days are numbered. 👍

  • @TheLingo56
    @TheLingo56 Před rokem +122

    I remember feeling a bit burnt when my 1GB 560ti couldn't play PS4 games once the generation swapped over. Funny to see this happening all over again.
    Nvidia has *always* been stingy with VRAM. Even back in the day it was easy to get Radeon GPUs with over double the VRAM for the same price. In 2015 the GTX 960 only had 2GB, and the 970 famously only had 3.5GB of full speed VRAM!

    • @Stratos1988
      @Stratos1988 Před rokem +16

      Let's not forget about the 1030 with DDR4. VRAM so crap it shouldn't even be called a 1030.

    • @AvroBellow
      @AvroBellow Před rokem

      The question is, did you learn from the 560 Ti or did you bend over for Jensen again?

    • @user-zl8sm6ok2r
      @user-zl8sm6ok2r Před rokem +16

      Same! I remember asking on a forum whether I should get the 560 Ti 1G or 2G, and everyone recommended the 1G. I mean, people took you for an idiot if you dared to recommend the 2G version. But within just 1 year, Battlefield 3 came out and destroyed that 1GB of VRAM. I was so mad. Yet it seems that 12 years later, history has repeated itself.

    • @vega7481
      @vega7481 Před rokem

      You Western cuckold clowns have worn me out with this eternal mention of the 970's 3.5GB, as if each of you only learned it yesterday and is rushing to tell everyone else who "doesn't know" about it

    • @supabass4003
      @supabass4003 Před rokem +4

      Goes all the way back to the GeForce 3/4 days.

  • @VredesbyrdNoir
    @VredesbyrdNoir Před rokem +70

    Far out, I was tossing up between a 3070 Ti and a 6800 XT late last year and I'm really glad I went with the Radeon GPU now! Great work as always Steve.

    • @jays8287
      @jays8287 Před rokem +4

      I had the same dilemma earlier this year and went with the 6800xt. Very happy with my choice

    • @Magaswamy716
      @Magaswamy716 Před rokem +3

      no DLSS...

    • @ck275
      @ck275 Před rokem +15

      @@Magaswamy716 FSR is pretty decent

    • @sirab3ee198
      @sirab3ee198 Před rokem +1

      Unfortunately people still buy an RTX 3070 Ti over the RX 6950 XT in some countries where prices are similar 😢😮

    • @sirab3ee198
      @sirab3ee198 Před rokem +8

      ​@@Magaswamy716 man you guys never give up........

  • @leif-erichansens2635
    @leif-erichansens2635 Před rokem +1

    I play a heavily modded simulation game that uses more than 32GB of RAM and more than the 11GB of my 2080 Ti at the time (I use RivaTuner to monitor in game). I first upgraded the RAM to 64GB of 3600 CL14 DDR4; still not good enough. I then pulled the trigger on a 3090, for way too much money. But it made the game smooth again. I ended up monitoring 39GB RAM usage and 17GB VRAM. It stung a lot at the beginning, but now I feel my 3090 will serve me well for the years to come. All my decision making was supported by your great content as well, thank you for that :)

  • @MAzK001
    @MAzK001 Před rokem +3

    I remember getting the 3070 2 years ago and running into VRAM issues in CP77 and Metro Exodus Enhanced Edition. I was much happier switching to the 6800 XT.

  • @NashvilleUK
    @NashvilleUK Před rokem +61

    I managed to get an RX 6800 for RRP on launch day, very savvy purchase in hindsight 😀
    Keep up the good work HW UB

  • @Finalterra1396
    @Finalterra1396 Před rokem +83

    If the rumored 8GB of VRAM on the 4060 and 4060 Ti turns out to be true, I am very much looking forward to HUB's review, as there is no way any of those products will be priced accordingly.

    • @hugobalbino2041
      @hugobalbino2041 Před rokem

      The RTX 4060 will be bought by miners... cryptocurrency...

    • @Bokator2
      @Bokator2 Před rokem

      It's joever

    • @josh2482
      @josh2482 Před rokem +17

      @@hugobalbino2041 Do you live under a rock? Crypto is dead. Nvidia even came out and said that cryptocurrency provides no value to society. Lmao.

    • @hugobalbino2041
      @hugobalbino2041 Před rokem

      @@josh2482 8GB cards are fine for crypto... not so much for gamers... even 12GB cards... For me, I don't pay electric bills because of my solar power... so yeah, I don't live under a rock... and there are still GPU miners out there...

    • @danieloberhofer9035
      @danieloberhofer9035 Před rokem +6

      The sad part is that yes, the 4060 (Ti) will get scathing reviews for only having 8GB of VRAM, and yes, they'll sure as hell be overpriced for that, but people will still buy them.
      And, from what it looks like, AMD won't go ahead and do the smartest thing they could do and launch both the 7800XT *and* the 7700XT with 4 MCDs and 16GB. It's such a pity, and I really hope they reconsider. Already having the chiplet design implemented for RDNA3 would certainly give them the chance, unless the binning for the GCD also includes defects in the fabric interconnect to the MCDs.

  • @ChinchillaBONK
    @ChinchillaBONK Před rokem +4

    Jensen Huang: It's just a built-in feature called Motion Blur.

  • @theviewer1423
    @theviewer1423 Před rokem +7

    I knew that more VRAM was better back on the Radeon 6800's release day and that FineWine would kick in sooner or later. It was good that the reviewers mentioned that the extra VRAM and the Infinity Cache were useful features.

  • @Alirezarz62
    @Alirezarz62 Před rokem +49

    This is one of the reasons I went from a 3070 to a 6900 XT: the extra VRAM. Also, shockingly, I had to spend no additional money on the 6900 XT! The performance has been great for the last 6 months and I'm quite pleased with the AMD card. Also, knowing that AMD cards age much better than Nvidia cards makes me think maybe I should stick with AMD from now on

    • @ChiplockRock
      @ChiplockRock Před rokem

      How are the drivers working for you? I'm very tempted to go red, but I don't know if AMD has resolved most of their driver issues.

    • @Alirezarz62
      @Alirezarz62 Před rokem +2

      @@ChiplockRock To my surprise, I have yet to encounter any problem with the drivers; every game that I have played and tested worked great. I don't know about productivity applications tho, as I only use Photoshop and Premiere Pro, and anything else is CPU related

    • @m8x425
      @m8x425 Před rokem +1

      @@ChiplockRock I've had a good number of Radeon cards... most of the driver issues I've had were when I tried obscure stuff like connecting up multiple monitors of odd sizes, or Windows automatic updates borking the drivers. My Radeon VII is also picky about what driver it wants to work with, and my X570 Crosshair VIII board does not like my Radeon VII.
      AMD has always had a shaky history with drivers... and things like taking months to get RDNA 1 drivers right have only hurt them.
      I bought a 7900 XTX Red Devil on launch day and no problems so far. My only complaint is there is a 25°C difference between hotspot and regular GPU temps. Otherwise the card doesn't get too hot.

    • @hobmaniac
      @hobmaniac Před rokem +1

      @@ChiplockRock Just got a 7900 XTX (new build, from a 2080 Ti) and have no issues; also the AMD software is way better than Nvidia's. I think the driver thing isn't really as applicable nowadays as it was in the past.

    • @ItsFaber
      @ItsFaber Před rokem

      What size PSU do you have? And what CPU do you have paired up with it? I've been looking at maybe going a different route from my 3070

  • @flashmutex
    @flashmutex Před rokem +112

    I paid a shitload of money for my 6800 XT during the shortage (twice what it is now). Still extremely happy with it and still glad I did. Happy to see it still performs so well today.

    • @conchobar
      @conchobar Před rokem +4

      Same here. Felt lucky at the time to ONLY pay $900 for an AIB 6800 in 2020. I think I got my money's worth.

    • @joaopescusa
      @joaopescusa Před rokem +2

      Same. Actually, at that time in my country the 6800 XT and 6900 XT were roughly the same price. I ended up getting the 6900. Glad 6800 users are still in good shape

    • @Kamikatze_Gaming
      @Kamikatze_Gaming Před rokem

      Same here :)

    • @SpeedRebirth
      @SpeedRebirth Před rokem

      Also got the 6800 XT, at launch price. 8GB was a big no-no. It was very obvious that it was too little.
      Just like the GTX 580 that came in 1.5 and 3GB variants. The 3GB lasted years longer. Same with the 3/6GB 1060

    • @riba2233
      @riba2233 Před rokem +2

      Paid

  • @pieterrossouw8596
    @pieterrossouw8596 Před rokem +4

    This video combined with LTT's 4070 review finally got me to upgrade my old GTX 1080. Picked up an XFX 6800 XT for less than a 3070. Looking forward to an AMD GPU; haven't used one since the ATI Radeon HD 3870.

  • @TabularConferta
    @TabularConferta Před rokem

    Thanks for the review. I got a 3070 at release for RRP and have been really happy with it, but it's good to know for future purchases (I likely won't even worry till the 5000 series)

  • @MenTal9R
    @MenTal9R Před rokem +160

    Nothing quite like some fine wine 🍷😂
    Great job Steve! I'd love to see this same type of comparison with the 3080 10GB & 12GB against the 6800 XT.

    • @odi8359
      @odi8359 Před rokem +16

      Exactly! This would be a nice follow up video, probably including 4K resolutions.

    • @discipleofdagon8195
      @discipleofdagon8195 Před rokem +20

      Only difference is that the 3080, price-wise, is more comparable to a 6950 XT

    • @taipeitaiwan10
      @taipeitaiwan10 Před rokem +4

      So glad I got the 12GB version when prices became low. Getting the 3070 would have caused buyer's remorse.

    • @declde
      @declde Před rokem +2

      The 6800 XT also has 16GB; the cheaper 6700 XT and 6750 XT came with 12GB, and they were more comparable to the 3060 (Ti).

    • @zaidlacksalastname4905
      @zaidlacksalastname4905 Před rokem +1

      Wish got granted.

  • @KorbiCS2
    @KorbiCS2 Před rokem +32

    Great video Hardware Unboxed.
    As a 3070 owner, I feel ripped off; I spent a decent amount on this card thinking it'd be good for years to come. I'm so extremely frustrated now, since in most new games, even at 1080p, I'm running out of VRAM. Selling the used GPU is going to get me nothing in terms of price, and the new RTX cards are outrageous; frankly, I'll never buy an Nvidia product again. The 6900 XT is available and cheaper now, but I'm not sure if I want to buy a new card that's already last gen, even if it's better than the 3070. Better yet, 7900 XTs and such seem to be quite expensive.
    I'm torn now between waiting and dealing with this absolute hot garbage product, or emptying my wallet into an AMD GPU. Or waiting until next-gen AMD. I have no idea.
    Nvidia is terrible and extremely anti-consumer. My next GPU will be AMD and I cannot wait to be able to play games normally on a card that wasn't made to just be obsolete in a few years.

    • @Justathought81
      @Justathought81 Před rokem +5

      Totally agree. I was shocked to see these issues at 1080p, crazy. I feel for you, bro. At this point I'm done with Nvidia.

    • @Sparksman23
      @Sparksman23 Před rokem +6

      Give the Intel A770 a try then. It's cheap, has 16GB of VRAM, supports ray tracing, and Intel has been making a lot of improvements to their drivers. Buy it from somewhere with a good GPU return policy in case it doesn't work out for you.

    • @Vis117
      @Vis117 Před rokem +3

      I bet you could sell a 3070 and buy a 6800 or 6800XT for a minimal difference. We will also have 7800XT and 7700XT sometime this year.

    • @bramborovyPuding
      @bramborovyPuding Před rokem

      I had a lot of issues with drivers on the RX 6700 XT, especially in old games, where it performs almost like my old GTX 1060; in new games everything was OK. See internet forums, a lot of people have the same issues. I'm going back to Nvidia. AMD drivers are not good.

    • @KorbiCS2
      @KorbiCS2 Před rokem

      7800 XT sounds promising!

  • @matthewsmith22
    @matthewsmith22 Před 10 měsíci

    Great video as always. I've been looking at the RX 6800 as they've dropped a decent amount in price here in Blighty recently, down to the £440 mark. I'd be upgrading from a 2070 Super. Is this a decent upgrade?

  • @ethanculp
    @ethanculp Před rokem +5

    I know it'd be hardly applicable, but seeing the difference between the older HBM/HBM2 memory cards vs the GDDR6 cards would be an interesting comparison.

  • @user-kq7fe1ny7n
    @user-kq7fe1ny7n Před rokem +40

    That video perfectly reflects what it feels like to have bought an 8GB GPU some time ago and suffer with it now. I would be glad to see a comparison between the RTX 3060 Ti and RX 6700 XT, as these are also direct competitors. Good job!

    • @RealElevenTimes
      @RealElevenTimes Před rokem +7

      They are not, though. The RX 6700 XT crushes the 3060 Ti everywhere except ray tracing.

    • @cccpredarmy
      @cccpredarmy Před rokem +1

      I've been gaming for 1.5 years now on a 6700 XT at all ultra settings in 1440p (with minor tweaks here and there). If I EVER experienced ANY stutters or something I didn't like, I'd instantly switch to a better card. But I don't have to, because EVERYTHING TODAY RUNS BUTTER SMOOTH!

    • @Aggnog
      @Aggnog Před rokem

      Are you really suffering? Just drop texture quality one step and this won't even affect you. But if you swapped the situation around and high texture quality didn't exist, everyone would blast AMD for wasting money on RAM they don't need. Funny how that works.

    • @pokemongo-py6yq
      @pokemongo-py6yq Před rokem

      Similar situation here. On the other hand, I tend to not use ray tracing and use a mix of low/medium settings, so I don't think 8GB VRAM matters too much for my use cases. The testing shows the 3070 is kinda bad, but my 3070 will do just fine until it dies, I'd say.

    • @alexworm1707
      @alexworm1707 Před rokem

      I'm okay with my 5700 XT 8GB; its performance ain't nothing to write home about to begin with lmao

  • @PeerensClement
    @PeerensClement Před rokem +38

    Great video Steve, thanks. Another thing to consider when calling these cards "aging": for a lot of people, these cards have only recently become available to buy, due to shortages and pricing. And they are currently still the newest 'midrange' cards available! So these are actually current-gen cards struggling!

    • @bendavis3545
      @bendavis3545 Před rokem +3

      Makes me wonder where the "value" is...

  • @destebangm11216
    @destebangm11216 Před rokem

    Thanks a lot for the video. I had no idea this was a problem, and with these results it's a HUGE problem. I have a 3060 Ti at 1440p, and this explains why I was able to play Metro Exodus with RT maxed out at good performance, but Cyberpunk with RT became really unstable after some game time.
    Thanks a lot, my next card will be a 16GB card at minimum.

  • @mohit9206
    @mohit9206 Před rokem

    Hi, love your videos, when can we expect to see a review of the mobile parts like 4050 and 4060 mobile?

  • @highmm2696
    @highmm2696 Před rokem +152

    Would be awesome to compare the 3080 10GB to a 6800XT 16GB in these games.
    They are pretty closely matched in rasterisation performance, I wonder if the AMD card pulls away massively with more RAM.

    • @JamesSomersetActor
      @JamesSomersetActor Před rokem +6

      I would also like to see this comparison!

    • @yasunakaikumi
      @yasunakaikumi Před rokem +25

      Not with VRAM, though. As a 3080 and 3090 user, I can tell you that 10GB isn't even that much of a difference from 8GB of VRAM, since games that need more than 8GB usually just go to 12-14GB

    • @OC.TINYYY
      @OC.TINYYY Před rokem +26

      The 6800 XT is closer to 3080 Ti/3090 performance in rasterization in newer games, outside of 1 or 2 random titles. Sometimes it exceeds that.
      The 3080 would get shit on.

    • @lhv2k
      @lhv2k Před rokem +3

      @Kevin P. Lmao NO, wtf are you watching to say bullshit like this? Just check the recent game benchmarks from HUB: the 3080 10GB still beats the 6800 XT in EVERY GAME, even at 4K; the latest ones are TLOU, Hogwarts and Spider-Man. In fact the 3080 10GB even beats the 6900 XT and 6950 XT in Hogwarts and Spider-Man respectively at 4K ultra, and the 6800 XT by like 10-15%. And if you check other reviewers, the 3080 is also better in other new games like Dead Space or A Plague Tale. The only recent game where the 6800 XT could outperform the 3080 10GB was CoD MW2, and everyone knows this game is just an outlier that heavily favors AMD. Imagine saying "the 6800 XT is on the 3090's level" while in the very recent TLOU video the 3090 was faster by like 30% at 4K. Stop coping.

    • @madscientist6843
      @madscientist6843 Před rokem +13

      The first 6 games tested used 12-14GB of VRAM, so the 3080 10GB is still going to struggle

  • @Nate_the_Nobody
    @Nate_the_Nobody Před rokem +10

    I just recently switched from team green (GTX 1070 8GB) to team red (RX 6900 XT 16GB), and I gotta say, that extra VRAM is nice. The Resident Evil 4 remake uses a whopping 13.8GB of VRAM at 1440p if you set everything to max (with ray tracing and hair), so 16GB does seem like the way to go moving forward. Which isn't shocking, considering there was a time when we were jumping to double, triple, or quadruple the "at the time" acceptable VRAM amounts back in 2007-2010. I remember having a 128MB card and moving to a 768MB one; I remember having a 1.5GB card and moving to 4GB, then 4GB to 8GB, etc. As the years go by, we need more VRAM; it's that simple.

  • @hopesea
    @hopesea Před rokem

    Your video was great! I was wondering if you could make a video discussing the best DDR5 RAM to use with an i5 processor. I've noticed that most videos on CZcams focus on either AMD or the Intel i9, and I think it would be really helpful to have a video specifically dedicated to the i5. According to the Intel website, the processor supports up to 4800 MT/s, but I'm curious if using a different or higher speed could provide any benefits or cause potential issues. Would it be possible for you to create a video on this topic? I would highly appreciate it.

    • @NoMasters.
      @NoMasters. Před rokem

      "I'm curious if using a different or higher speed could provide any benefits or cause potential issues. "
      The answer to this is the same as it was for DRR4, "Yes, and yes." there are many variables that could make it be one or the other and it can also be both. generally though the system will be perfectly stable using 5800MT/s sticks paired with any intel processor and running XMP to get the full speed of the sticks. You will see some performance gains in some applications. In other applications the bandwidth is important, in others having tighter timings is more important, so it depends on what you're doing... in gaming latency is the most important so you want those timings as low and as close together as you can.
      This also might be relevant to you:
      czcams.com/video/b-WFetQjifc/video.html

  • @TheSleppy
    @TheSleppy Před rokem +2

    The closest way to simulate a 16GB 3070 is to use an RTX A4000. I think it would be 10-15% slower in non-VRAM-limited scenarios but beat the RX 6800 in ray tracing scenarios. It's also around 500 USD, used though, not new