16GB vs. 8GB VRAM: Radeon RX 6800 vs. GeForce RTX 3070, 2023 Revisit
- Added June 1, 2024
- Hetzner: hetzner.com/hub/ex-line
Support us on Patreon: / hardwareunboxed
Join us on Floatplane: www.floatplane.com/channel/Ha...
Buy relevant products from Amazon, Newegg and others below:
Radeon RX 7900 XTX - geni.us/OKTo
Radeon RX 7900 XT - geni.us/iMi32
GeForce RTX 4090 - geni.us/puJry
GeForce RTX 4080 - geni.us/wpg4zl
GeForce RTX 4070 Ti - geni.us/AVijBg
GeForce RTX 3050 - geni.us/fF9YeC
GeForce RTX 3060 - geni.us/MQT2VG
GeForce RTX 3060 Ti - geni.us/yqtTGn3
GeForce RTX 3070 - geni.us/Kfso1
GeForce RTX 3080 - geni.us/7xgj
GeForce RTX 3090 - geni.us/R8gg
Radeon RX 6500 XT - geni.us/dym2r
Radeon RX 6600 - geni.us/cCrY
Radeon RX 6600 XT - geni.us/aPMwG
Radeon RX 6700 XT - geni.us/3b7PJub
Radeon RX 6800 - geni.us/Ps1fpex
Radeon RX 6800 XT - geni.us/yxrJUJm
Radeon RX 6900 XT - geni.us/5baeGU
00:00 - Welcome back to Hardware Unboxed
00:35 - Ad-spot
01:25 - Backstory
04:33 - Test System Specs
04:48 - The Last of Us Part 1
08:01 - Hogwarts Legacy
12:55 - Resident Evil 4
14:15 - Forspoken
16:25 - A Plague Tale: Requiem
18:49 - The Callisto Protocol
20:21 - Warhammer 40,000: Darktide
21:07 - Call of Duty Modern Warfare II
21:34 - Dying Light 2
22:03 - Dead Space
22:29 - Fortnite
22:53 - Halo Infinite
23:22 - Returnal
23:58 - Marvel’s Spider-Man: Miles Morales
24:30 - Final Thoughts
Read this review on TechSpot: www.techspot.com/article/2661...
Disclaimer: Any pricing information shown or mentioned in this video was accurate at the time of video production, and may have since changed
Disclosure: As an Amazon Associate we earn from qualifying purchases. We may also earn a commission on some sales made through other store links
FOLLOW US IN THESE PLACES FOR UPDATES
Twitter - / hardwareunboxed
Facebook - / hardwareunboxed
Instagram - / hardwareunboxed
Outro music by David Vonk/DaJaVo - Science & Technology
The first mainstream GPU with 8GB of VRAM was the R9 390 in 2015, which cost $330. By 2020, we should've only been seeing 8GB on the most entry level cards.
@@istealpopularnamesforlikes3340 Ahem* RX 6300 2GB
@@oOZellzimaOo that's a sub-$100 GPU... 8GB of VRAM alone is like $50+...
There were variants of the R9 290X with 8GB as well
@@oOZellzimaOo nuance is that everyone SHOULD know NOT to get a gfx card like that in the modern era. Essentially, it's a waste of money over something that's actually useful. Sorry to be pedantic about it but that's the nuance of it.
In 2015, 8GB of VRAM was $330? Man, prices have come a long and sad way...
The 1070 had the same 8GB of VRAM as the 3070, and that was seven years ago...
My gtx 1080 TI got more VRAM 😂
Yeah I bought a 1080 on launch day and that was probably the first generation for me that I stopped looking at vram as a large concern. ...But as you point out that was kind of a while ago at this point.
@@Crimsongz 11GB, 1 more than the 3080. They are so stingy with VRAM ;(
Loved my 1070.
And the 1080 Ti had 11GB.
My ancient RX580 has as much VRAM as that 3070. Now 8GB is just enough for 1080p gaming. Glad to see the Radeon card doing so well.
u_u
❤️
Gddr5 vs gddr6
@@jpesicka999 AMD's RX 6600 8 GB GDDR6 is reaching $199
And to think that even the 470 had 8GB variants back in 2016.
I really appreciate that this channel uses its views and influence to put pressure on companies in a way that's clear to understand and act on, with specific targets so they can't weasel out of criticism with a few extra gigabytes of VRAM. Quite frankly, you guys are using your power in this market as efficiently as you possibly can without getting anti-consumer pushback.
theyre still very, very far from done, although im very tempted to swap them
💯💯💯
yet, idiots still buy 3070ti in december 2023 "i need raytracing"
The "I told you so angle" - is completely justified and I loved every second of it. However, the content is also extremely important because a lot of RTX 3070ti's, 3070's and 3060ti's are still being sold new, and are going to be exchanged on the second hand market. People need to know the value of these cards and the performance they can expect. Well done Steve!
And even worse, nVidia will continue to throw out 8GB cards in the 4xxx series while even new 12GB models should already get that warning the 8GB cards got back then. If people run out to buy 8GB 4050 Tis etc. at these highly inflated prices a "told you so" will be more than justified next time.
@@chrissoclone exactly. A 6950XT seems like a much better deal than the 4070ti and the upcoming 4070.
for me it reeks of confirmation bias. you would not run a 30 month old mid range gpu at ultra settings these days. I got a 6900XT and certainly don't run most games on ultra but high or very high settings for good fps. test the 3070 on high settings against the 6800 and it will do well.
@logirex I see what you're saying - but it's still great information for people looking at buying these cards - which sell a considerable amount new and used today. The old reviews are no longer valid.
@@claytonmurray11 Going forward you certainly want cards with more memory but it looks like they are getting it also with the RTX 4070 getting 12Gb for instance.
That said, even getting an older card I would still take an RTX 3070 over the 6700 XT any day of the week. This channel tested those across over 50 games and the 3070 was almost 15% faster. The only thing is, if you buy older cards or mid-range cards, DO NOT RUN THEM AT ULTRA SETTINGS.
Both this channel and others such as LTT have pointed out multiple times that running at ultra settings is pointless. LTT has one video where they had people compare games at ultra and very high settings, and most preferred very high as they clearly could not notice the difference.
Again, NVIDIA 100% knew this would be an issue within a year of release. They talk to developers just like AMD does. They knew that devs were itching to start using more and more of that console memory and completely ignored what that would mean for PC gamers.
They didn't ignore it at all. In fact, they created cards that would go obsolete to make a few extra bucks from us.
Planned obsolescence
"[Nvidia] completely ignored".
WOW! Even after watching a whole video on the subject and describing what the problem is, you *still* don't understand that this was completely intentional and *NOT ignored* at all!!! 🙄
This is the reason I’m skipping the 4000 series. They are basically not an improvement over my 3060ti based on vram
That's because Nvidia wants maximum profits, so they make lower tier products as unattractive as possible to push people towards the flagships with their significantly sizable VRAM.
Excellent work, Steve. I had no idea new games were using this much VRAM. I'll be sure to get something with no less than 16GB of VRAM next time I buy a GPU.
Some of the stuff I've been running has been using all 16GB VRAM and *still* going into system memory...
Emergency 6800XT ( thankyou card death 1 month out of warranty! ) was a good purchase, thankfully got one of the lower binned versions on sale & saved about £300 at the time. I'd never get anything below 16GB now & frankly still want more. Less bothered about clocks.
@@Karibanu what resolution do you play at?
@@tre-ou5gt 1440p, occasionally 1440p triple-screen.
I've just bought 4070 with 12... the issue is that I can't fit any AMD card into my computer.
@@orcusdei Did the thought occur to you that a decent PC case is way, way cheaper than a GPU?
I'd never buy a crappy GPU because it doesn't fit - especially not if that GPU is way more pricy than the alternative card. You could literally have bought a decent future proof case, had a nice meal with the family at a restaurant AND bought a GPU instead of a 4070 🤣🤣🤣🤣💀
My choice a year ago stood between the RX 6800 and GeForce 3070. I'm happy to say I made the right choice.
RIP in peace to anyone who paid $1400 for a GPU with 8GB of vram two years ago
Desperate times..
I paid $360 for one six months ago. In my defense, it was an upgrade from a 6GB 2060
Paid 400€ for 3070 strix a month ago. Now waiting for my 729€ 6950 xt red devil to arrive
Rest in peace in peace lol
I did pay like 1100$ for a 3070ti because I didn't really think things would get better, as far as price. I should have stuck to my 1070 and waited:)
What about the planned obsolescence angle in terms of NVIDIA purposefully limiting the VRAM in order to encourage consumers to upgrade sooner? Seems a clear strategy at this point.
They're only going to give gamers as little as they can get away with. This is obviously an excellent strategy on their behalf.
my second ever notebook was a Lenovo y510P and i had 2x755M in SLI (nvidia). At that time i didn't know anything about PC hardware, so i thought 2 GPUs are better obviously, right.
Anyways, nvidia came out with shadowplay and i was so happy, i could use it, and record stuff as FRAPS really sucked ass. I was recording my league of legends gameplays left and right.
A few weeks passed, and I couldn't use Shadowplay anymore. I googled it and found out Nvidia had disabled it, but there was a workaround. So I used said workaround. A few weeks later, that workaround was disabled too. That was the moment I didn't want Nvidia products anymore. When I built the PC I use today, I bought a Vega 64, despite almost buying a 1080 Ti. Now I'm buying a new computer, and looking at the things Nvidia does, the obvious choice was the 7900 XTX. (It's in the mail, and it's a Liquid Devil, so I can't use it yet.)
Fuck nvidia.
Gotta love that profit driven economy
@@Hardwareunboxed So why did they come out with the 3060 12GB? It was a weird card when it came out in terms of the VRAM it got. Did Nvidia finally see the problem coming?
@@kaisersolo76 they saw 6 as too little at the time, as it would have had issues with some titles on launch. Remember you can only change VRAM amounts by doubling or halving because of the bus width.
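The doubling constraint mentioned above comes straight from the memory layout: each 32-bit channel on the bus gets its own chip, and GDDR6 chips only come in power-of-two densities. A quick sketch (the chip sizes are just the common densities, not an official spec):

```python
# Rough sketch: total VRAM = (bus width / 32-bit channels) * per-chip
# capacity. Since chips come in 1 GB or 2 GB densities, capacity can
# only double or halve without redesigning the bus.
def vram_options(bus_width_bits, chip_sizes_gb=(1, 2)):
    channels = bus_width_bits // 32
    return [channels * size for size in chip_sizes_gb]

# RTX 3060's 192-bit bus: 6 channels -> 6 GB or 12 GB, nothing between.
print(vram_options(192))  # [6, 12]
# RTX 3070's 256-bit bus: 8 channels -> 8 GB or 16 GB.
print(vram_options(256))  # [8, 16]
```

This is why Nvidia's realistic options for the 3060 were 6GB or 12GB and nothing in between, and why a 16GB 3070 would have required doubling every chip on its 256-bit bus.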
I was in the market for a GPU upgrade at the start of the year and it came down to these two as the primary contenders. I had been a longtime user of Nvidia products, but this time around, I went with the RX 6800, with the VRAM issue being one of the major deciding factors. Watching this video, I'm very thankful that I did.
how she been treating you so far?
@@briefingspoon380 I'm very satisfied with the 6800 and have had no issues with it at all.
@@vor78 Whats type of fps you getting and which games?
It happened with 128MB, 256MB, 512MB, 1GB, 2GB and 4GB; it was only a matter of time until 8GB joined the 'not enough' graveyard. I think some are just shocked at how fast it seemed to happen.
And that NVIDIA is selling a 4060Ti with 8GB of VRAM in 2023.
It really didn't though; 2, 3 and 4GB fell in pretty fast succession, but it took a while for 8GB to no longer be enough, because the RX 480 had 8GB a long, LONG time ago and it was still fine until a few years back. And by the 3070's release it was already not enough: Doom Eternal was taking up 11GB, and at 1440p Cyberpunk took up enough VRAM that it had texture issues.
@@drek9k2 Agreed. I think the people that are shocked are the ones who recently bought a 3070 on sale or maybe even a 3080, while the rest of us from the r9 290x 8gb and rx 480 8gb era are not.
@@Bassjunkie_1 Oh true, that's a good point. I totally forget that for a lot of people this may be their first actual video card. I'm not saying you needed to be from the 8800 GT days, just that I keep forgetting many of these kids weren't even around for the 8GB RX 580 yet. I guess a lot of people are really, truly clueless.
Go play the new Uncharted version on PC. I have no complaints about getting a new card with 16GB of VRAM when the game looks like that. RIP 2070 Super and other 8GB cards, you won't be missed.
nVidia's planned obsolescence has worked... I'm gonna buy a new GPU to replace my 3070. Grats nVidia, you've driven me into getting an AMD RX 7900 XTX.
I paid $850 for a 3070 ti on launch day.....what a mistake lol but im now switching to a 7900 xtx as well soon
Nothing wrong with 4090 too, but I get that $1600+ isn't necessary for a good experience.
I’m enjoying my 7900 XTX. I think you will like it.
@@macdonald2k Yup, the best card right now. But I choose not to give nVidia any more of my money.
@@Dempig I have a 3070 FE. Got it a few weeks after release when the only way to find one was to get lucky on Best Buy with constant website refreshes on stock days. Should have resold it when it was going for >$1000 lol.
I was using a RX 580 for many years, and I feel what made that card age so well was its 8GB of VRAM. At its launch it was pretty high knowing NVIDIA's equivalent was using 3 and 6GB. So when I finally moved away from that card it was obvious that the RX 6800 would be the right pick for me. Its been good to see the card has been aging well already for the same reasons.
I switched from a 1060 to an RX 6800. I considered the 3070 at release as the price was very similar, but I'm an enthusiast and I knew 8GB wouldn't cut it at 1440p in a couple of months. It happened, yet Nvidia claimed it's not a problem.
A 3080 with 10GB? What a joke. Nvidia made fools of consumers.
RX580 was the best bang for buck card I ever had, and yes, 8GB VRAM played a large part in that. Before that I had a 660 Ti as my favorite bang for buck card, it was quite excellent and the ONLY reason I had to replace it was because already then nVidia was just too stingy with their VRAM, it could've lasted me even longer if not for that.
I made this same upgrade recently. The additional VRAM and Nvidia's bonkers pricing were key drivers in my decision.
you must have avoided the lemons quite well!!
@Varity Topic 6750 XT is still a pretty solid choice!
Just to add to this, it was a real kick in the nuts when I bought the 10GB 3080 and then the 12GB variant came out after... I wanted the extra VRAM too; it should have been the first product they released.
They did and it was called 3080 Ti. 3080 was never meant to last longer than 2 years for 1440p ultra.
Never buy the first version. While it's hip and cool to buy it first, you're usually just a test subject before a better version drops right after, be it Apple, Samsung, monitors, GPUs, CPUs, whatever. Don't get caught up in FOMO, fam.
@@eugenijusdolgovas9278 ya we are a wasteful throw away society, we think we need to replace things because corporations have trained us that way, so we keep profits nice. we are just modern day serfs.
@@CheGames497 Whilst some are wasteful, I tend to buy GPUs from used market and rock 6800 XT rn, which will be working till I decide to switch to 4k competitive FPS gaming.
My 16GB RX6800 bought two years ago at $579 MSRP aged really well.
That's what I paid for my Gigabyte Gaming OC 6800XT a few months ago. Bout $585 after shipping.
bro i got mine for only 430 euros from xfx
lol I'm buying mine for 300 rn
Got a 3070 for 600 euros in 2020 (GPUs cost more here than in the US). This week I found a one-year-old 6800 XT for 360 euros on the used market and was able to sell my 3070 for about 300. Can't wait to enjoy the next few years with this beast of a card, it's still so powerful at 1440p!
@@Karimftw0.0 yeah, but you got that performance 4 years later, while I've been using my RX 6900 XT for 4 years now. Getting that performance half a decade later isn't as big of a flex when we'll just move on to something with 2x the performance.
AMD have really improved their Radeon cards over the recent years. I used to use Nvidia cards all the time but have been using Radeon for the last 3 years now and love these cards.
How is the driver support for amd cards?
@@SMILECLINIC-2004 It's fine, no worse than Nvidia, which recently had issues as well. They are just slower at releasing newer drivers, but that's it. However, I'm not one of those guys who updates their drivers immediately when new ones come out. Why would you? It's not like you're going to get much better performance in newer games; you need to buy a faster GPU for that. If it works well, you don't need to touch it.
@@SMILECLINIC-2004 No issues as a former 6700xt and a current 7900xt owner.
I'm using a RX 5700, and I haven't had any driver issues throughout my time with AMD cards (rx 580, vega 56). I'm happy that the RX 5xxx series cards are still getting support and FSR/RSR updates as its allowed me to play most games at 2560x1080 and higher than 60fps. I don't really like the metrics overlay that AMD uses but I can't use Rivatuner without MSI afterburner which messes up my undervolting settings.
It's been said since 2020: go AMD if you want the standard feature set. They have the best regular rasterization performance. But at the lower end, with cards like the 3070, 3060, 6800 XT, 6700 XT, etc., you won't get any real benefit from ray-traced lighting effects, so you need to go higher up the chain for that. This test obviously proves it.
It would be very nice to see a 12GB model like a 6700 XT in the comparison as well. Or even an older 11GB card like a 1080 Ti. It would be hilarious if frametimes were more stable on the 1080 Ti than on a 3070 :D
Actually, I recently ran God of War on my 1080 Ti / i7-7700K at 4K ultra-high with FSR 2 balanced or quality.
60fps solid
Blyat-Vidia gimping its own cards? Who would have thought.
I am having a much better gaming experience with my 1080 Ti compared to some other people with cards twice as fast but only 8 gigs.
I still get 60+ fps in almost all new games at 1440p, no stuttering.
GTX cards run The Last of Us like shit, zero optimization for that GPU generation, like in every new game.
^This please Steve, you need to do this!!
Literally just got my 6800 a week ago and it's been blowing my mind how powerful it is, even with ray tracing.
I’ve had one for a year now and I honestly haven’t run into a situation where I’m concerned about performance. It’s a really good card!
Interesting. GTX 1080 user here, looking to upgrade! The top card my 9700K can handle is an RTX 3070 or a 6800. The 16GB of VRAM is definitely pulling me to team red for sure.
@@UKGBManny For the money right now it's one of the best options. Its closest competition is over $100 more. If you can spend another $100 you can get the XT, whose biggest competition is WAY more expensive than that. Unless you can get an AMD 79xx or Nvidia 3080, the rest of the cards that might compare are double to quadruple the price. I paid $370 for mine and am still amazed at how much it can handle.
Going to get my 6800 for $350 tomorrow and selling my 6700 XT, looking forward to getting that thing after seeing all those videos.
Still rocking the 1070 I bought in 2016. Absolutely love this card; it's been enough to run recent games, granted at medium quality, but they still run and I can play in good conditions. I'm in no hurry to upgrade my GPU and I'm not a compulsive buyer, and thank god for that, seeing the state the GPU market is in. Hardware Unboxed is one of the channels whose advice I'll be sure to follow when deciding which GPU to upgrade to.
Whatever makes you happy, that's the important thing in life. This man has his head screwed on straight!
Idk, I agree the 1000 series may have cost more than the previous generations at first, but boy did they have longevity. I picked up a GTX 1070 Ti at the beginning of 2022, and man do I wish I had gotten it sooner! Nvidia, and now AMD, have lost their way, at least when it comes to the gaming side of things. More so Nvidia.
Same here, EVGA 1070 FTW, an absolute diamond of a card. Lasted longer than any GPU I've ever bought, lasted through COVID and crypto, and plays almost everything I care to throw at it in near-silence. Just replaced it with a cheap second-hand 6800XT to last me until the next 'good' GPU release cycle whenever that will be, but that's only because I'm building an entire new system from scratch.
@@cameronblazowich2671 if we take AMDs perspective into consideration, then the only way they continue to increase profit margins is to charge their 5 customers even more money for their products.
1070 is absolute beast of a card
Would have also been interesting to see the RTX 3060 12GB vs a higher-end 8GB RTX card, to see if there were cases where the former played better than its more expensive counterpart.
it was too much for the GPU though so it's not as good as you might expect. At least it won't crash in RE4 as much though LOL
The RX 6750 XT might be interesting to throw in the mix for those titles that report ~15GB of total VRAM usage already.
I'd love to see a 3070Ti with its G6X in this comparison.
An RTX 4060 with 8GB of VRAM will be pretty much dead on arrival. I can't believe Nvidia's social media team is unable to clearly communicate this to the higher-ups.
and does it really matter? It will still sell really well
@@Ddofik as well as the 4080 did? 😂
Imagine thinking Nvidia care about the consumer lol
They may give it 16GB in the end, actually, just like they did with the 3060. I'm almost certain the 3060 was planned as a 6GB card (just like the 2060 was) but they changed their mind at the last moment. No way they planned for the 3060 to have more VRAM than the 3080! But the only thing they could do at that point was double the buffer (same number of channels, double the capacity of the memory banks).
You think the higher ups are not fully informed?
They have NO incentive to offer anything more.
This would be a different story if they were facing the AMD of 8 years ago, that would do anything to gain market share.
As soon as a competitor comes in offering the same performance with triple the VRAM at half the cost, they will truly lose market share.
Then next gen will be different.
Glad I sold the 3070 two months after purchasing it. I was very disappointed with it and bought a 6800XT - still in use and happy 😅
@@siwexwot8994 yes. That's what we saw in the video. Dunno why you mention it..? 🤔
@@siwexwot8994 you can still use it in combination with DLSS...
@@siwexwot8994 and that's the point. The 3070 is only a 1080p card today due to its lack of memory... The takeaway is: the 6800 XT was and still is the better buy. 😉👋🏼
@@siwexwot8994 yeah, you may be right, but lowering the texture quality in 2023 is not future proof. I'm still glad that I sold my 3070 two months after the purchase and got a 6800 XT in my hands. FSR 2.x and 3.x will keep it usable even longer than the 3070 will ever be. That's my personal opinion 😉
You are accurate as always. 8GB is now budget tier. 16GB is mid-tier and 24GB for top-end. When Nvidia launched (and then quickly unlaunched) the 4080 12GB it was such an obvious mistake. Even 16GB gives me pause because 4K gaming and SS needs more VRAM than ever.
especially with raytracing too
Like really i dont know how people trust them while they try to skip and move on cheapest shameless set higher price scenario.
I think 10-12 gb is mid tier. 16 and above is still pretty high end
16GB may not technically be mid range considering the current generation of games. I would say 8GB is entry level like 4GB was back then, 10GB somewhat standard like 6GB was, 12GB the mainstream for GPUs, and 16GB should be standard for cards that can run 4K, unlike the 3080 and 3080 Ti with only 10-12GB.
Retesting the 6700 XT would be a great idea, considering it offers 12GB of VRAM and is already very close to the 3070 in compute power but cheaper. It would be interesting to see how it fares against the 3070 in the current VRAM-limitations meta.
Hopefully Steve can do 2080ti vs 3070 vs 6750xt.
Turing would win this because of much higher RT performance, and 11GB of VRAM is actually the sweet spot for that class of performance.
@@KrisDee1981 Turing uses first gen RT cores just like 6750XT, So I doubt that. But hey, no problem in comparing.
Actually, that would be awesome.
Yes, very good idea. Hardware Unboxed pleasssseeee
@@KrisDee1981 🤣🤣 yeah let's do it until nvidia wins, not the consumers, Nvidia must win.
You know what would be an interesting follow up, including benchmark results from the 16gb Intel A770 in the titles the 3070 struggled. Just for fun, you could even include data from the Radeon VII for those same titles. That additional data would really drive your point home for owners of 8gb cards.
I’d also like to see a 3060 vs 3070 video to see if it can handle these better as well
hey man even a vega frontier at 16gb would dominate the 3070 lmao 3070 trash
Or a 2080 Ti. Similar GPU performance to the 3070, but more VRAM. Steve mentioned it, but didn't test it this time.
Yeah even going all the way to a 1080ti would be a good shout. I wonder if they even still have a Radeon vii?!😮
Nahh, they sold it for $2500 in the mining boom.
This reminds me of the issues we used to have [way back] when we'd run out of system RAM and Windows started using its swap file, causing massive slow-downs. The main difference being we can upgrade system RAM. If only we could do the same with GPUs.
Maybe I should relive those times by running 4GB today
@@ArtisChronicles and then you can fix it by using an operating system that doesn't suck
You might be on to something there. I wonder if some bright spark at Nvidia/AMD or Intel could come up with a way to offer upgradable graphics cards. Buy a card and add compatible RAM or even swappable same generation GPUs. Can be done on a motherboard so why not? Would increase the life of the cards and saves on e-waste too.
@Moonstomper68 Swapping out RAM works much better on a motherboard because the motherboard PCB isn't completely covered with heatsinks blocking off the memory chips. There is not much need to upgrade VRAM when the GPU's performance is already fixed. It's much cheaper to just put on enough VRAM so it won't run out in situations where the card's raw performance could still hold up.
@@archthearchvile And what OS would that be Hmmmm?
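The swap-file analogy above holds up numerically. A back-of-envelope sketch (the bandwidth figures are approximate public specs: roughly 448 GB/s for the 3070's GDDR6, roughly 32 GB/s for PCIe 4.0 x16; the 512 MB working set is a made-up illustration):

```python
# Rough illustration of why VRAM overflow causes stutter: assets that
# spill into system RAM must come back over PCIe, which is an order of
# magnitude slower than on-card GDDR6.
def fetch_time_ms(megabytes, bandwidth_gb_per_s):
    # transfer time in milliseconds for a given asset size and link speed
    return megabytes / 1024 / bandwidth_gb_per_s * 1000

asset_mb = 512  # hypothetical texture working set that didn't fit in VRAM
print(f"from VRAM: {fetch_time_ms(asset_mb, 448):.2f} ms")  # ~1.1 ms
print(f"over PCIe: {fetch_time_ms(asset_mb, 32):.2f} ms")   # ~15.6 ms
```

A 60 fps frame budget is 16.7 ms, so a single spilled fetch like this eats nearly the whole frame, which is exactly the hitching the video's side-by-sides show.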
I wish I could like this video twice! It was so helpful and instrumental in my understanding of how a GPU works, what the heck VRAM is, and what it does. It also clearly demonstrated why the graphs can sometimes be misleading, like when the GPU compensates for a lack of memory and avoids crashing or stuttering by simply not loading in the textures, which isn't shown any other way but watching the side by side.
Thank you so much! Wonderful video
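The point about averages hiding what side-by-sides reveal can be made concrete: the "1% low" figures reviewers quote are derived from raw frame times, and a single VRAM-induced hitch barely moves the average while cratering the lows. A minimal sketch of one common way to compute it (the frame-time numbers are invented for illustration):

```python
# Compute average FPS and "1% low" FPS from a list of frame times (ms).
# The 1% low averages the worst 1% of frames -- the metric where
# VRAM-starved cards fall apart even when their average looks fine.
def fps_stats(frametimes_ms):
    fps = sorted(1000 / t for t in frametimes_ms)   # per-frame FPS, ascending
    worst_1pct = fps[: max(1, len(fps) // 100)]     # slowest 1% of frames
    return sum(fps) / len(fps), sum(worst_1pct) / len(worst_1pct)

# 99 smooth 10 ms frames plus one 100 ms hitch (a texture streamed late):
avg, low1 = fps_stats([10.0] * 99 + [100.0])
print(f"average: {avg:.1f} fps, 1% low: {low1:.1f} fps")
```

Here one hitch leaves the average at about 99 fps while the 1% low collapses to 10 fps, which is why the frametime graphs in the video matter more than the average bars.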
I'm glad I went with an RX 6700 XT. I remember people telling me that the RX 6700 XT was no match for the RTX 3070, and the AMD card actually performed a little bit worse initially, but now it is a completely different story thanks to a bigger VRAM buffer and AMD's FineWine technology :)
And in fairness, the 6700 XT has always been better, especially at 1080p and above, due to the VRAM.
My 6900xt reference beats my buddy's 3080ti in most games and synthetic benchmarks. His 80ti will out class my card in some games though!
@@LowMS3 I have been having VRAM problems with my 10GB 3080 for some time, mainly with newer games, and after borrowing a friend's 6800 XT just last week I'm shocked how much better so many games looked and played. Even some 2+ year old games look and play much better; with older games like GoW at 1440p, the texture pack crashes the 3080 but the AMD card runs fine. Anyway, I have just bought a 6800 XT new for nearly half what the 12GB 3080 cost.
The 3070 has enough GPU power to drive 1440p at decent frame rates, but now struggles even at 1080p just for lack of VRAM.
What a shame
Same here, I got an RX 6700 XT for what I play. I also wanted more VRAM without breaking the bank, but the games I play need a lot of memory, more than 16GB (Transport Fever 2, CityBus Manager, Cities Skylines, etc.). I had an 8GB card and got unbearable frametimes and a lot of lag due to VRAM overflowing into system memory, and I had people saying you need Nvidia for gaming, it's the best, but my 8GB card was an Nvidia GTX 1070 Ti. I'm like, no, I need more VRAM than the 8GB Nvidia was offering in the mid-high end. I get what feels best for me, not for others. It's like being back in high school choosing your college and your friends say come where I'm going. Sorry, getting off topic haha.
I'm actually pretty gobsmacked by how well the 6800 performs here. The 16gb buffer is an absolute life saver and I'm surprised how well it's doing with RT as well considering how questionable it was at launch. One GPU will be usable at higher settings for a few years yet probably and the other is relegated to lower settings and resolution for completely arbitrary reasons. Very frustrating.
To be fair, the 3070 has never competed with the 6800. The 6800 has always been 1 tier above the 3070 from the very beginning (even before this new vram scandal).
The 3070 was built to compete with the likes of the RX 6700 XT and RX 6750 XT (which the 3070 beats most of the time).
@@angrysocialjusticewarrior and what about the 3070 Ti? It suffers from the same problem as the non-Ti 3070.
@@angrysocialjusticewarrior no no no. Even the $400 digital PS5 console has over 12GB of memory available to games. Launching the 3070 with only 8GB of VRAM while knowing devs in the near future would target the consoles as a baseline is treacherous 😢
@@angrysocialjusticewarrior I agree that the 6800 and 3070 weren't direct competitors, but they were only seperated by 70 bucks and that 70 bucks gets you a MASSIVE advantage now. Like an absurd one
What's frustrating is the fact that so many people bought these pieces of (as Steve would say) GAHBAGE just because they were in a green box. Anyone who spent that kind of money on 8GB deserves their fate.
This video inspired me to start looking at the used market as a way to jump ship from my 3070. I was able to find a one-year-old still-in-warranty 6800 XT, and after also selling the 3070 I'm only out about 60 or 70 euros! The performance gains are massive, and the extra VRAM is sooo nice for maxing out games but also for keeping stuff like Discord and YouTube open on my second monitor. Super happy I made the switch, and thanks again for spurring that!
I agree with this. The tiers are now:
- 24 GB -- High End
- 16 GB -- Mid Range
- 12 GB -- Entry Level
- 8 GB -- "barebones", sub-$200 cards
This should be the standard in 2023.
I would agree, but this also highlights the laziness in game development now. There are still more complex games that look better than The Last of Us while using half the VRAM. Devs need to take responsibility as well, not just GPU manufacturers.
I'm sure that makes people who bought 10GB 3080's feel good about their purchasing decision ;-)
@@cairnex4473 lol, that card was always a trap. I doubt anyone that bought it feels good about it now. I went for the 3080 Ti and the extra VRAM has been a godsend. Still, I think devs need to seriously re-evaluate their priorities when making games; if a game looks amazing but plays like absolute garbage, it's a bad game.
@@FouTarnished Not every developer became lazy; it's just that technology has advanced a lot. I'm not including bad PC ports like The Last of Us or beta-state releases like LotR: Gollum. What I mean is games like Resident Evil 4 Remake: Capcom really improved the graphics, and we can't blame a developer like Capcom for increasing VRAM usage.
Agree. Me playing with 1060 3gb ;-)
Would have been interesting to see how the 2080ti with its 11gb vram compared to the 3070.
I wish Steve can do this comparison next. More interesting to see Nvidia vs Nvidia. Almost identical performance, but extra 3GB of Vram.
I do have a 2080 Ti, and TLOU runs great at 4K, DLSS balanced, high settings.
Now that would be interesting to see :D
Add the RTX 3060 with 12GB for shits and giggles.
The 2080 Ti is a beast; unfortunately those 2018 Micron GDDR6 chips are ticking time bombs.
@@deamondeathstone1 Yeah, it would have been great to see how the 12GB 3060 fared in these games compared with the 3070. I imagine a lot smoother gameplay in many instances, especially with DLSS.
I must admit, I didn't think 8GB would be an issue this soon. Up until recently I thought that, all things considered, the 3060 Ti was *_the_* card of this GPU generation.
In the end, I bought a 6700 XT. Less so because of the additional VRAM, but mainly because it was cheaper - and now I'm really glad I did.
As soon as you give a dev more ram to work with they will consume it as fast as possible 😂
The rise in VRAM usage is due to consoles. Consoles used to hold PC gaming back, since companies were still focused on making their games for the PS4/Xbox of that era. Now that the current-gen consoles are quite good and have 16GB of RAM (albeit unified), expect most games at ultra to use around 10-12GB, leaving ~4GB for the OS. The safe play would be a 12GB VRAM GPU, but a 16GB VRAM GPU is guaranteed to be safe unless you want 4K ultra + ray tracing.
@@madtech5153 How do the VRAM and teraflops on the XBSS compare to the XBSX?
@@madtech5153 The PS5 and Xbox Series X have 16GB of shared memory. That means those 16GB are shared between VRAM and system RAM. So how come 8GB of VRAM plus 16GB of system RAM becomes obsolete?
If you consider texture sizes, the PS5 might only have ~6GB of VRAM and 10GB of system RAM to work with.
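The unified-memory budgeting in this thread can be sketched in a couple of lines. All figures below are the commenters' rough guesses, not published specs:

```python
# Rough unified-memory budget for a current-gen console.
# All numbers are illustrative assumptions from this thread, not official specs.

def graphics_budget_gb(unified_gb: float, os_reserve_gb: float, cpu_side_gb: float) -> float:
    """Memory left for textures/render targets after the OS reserve and CPU-side game data."""
    return unified_gb - os_reserve_gb - cpu_side_gb

# 16GB unified, ~4GB for the OS, ~6GB of CPU-side game data leaves roughly
# 6GB for graphics, in line with the "6GB VRAM" guess above.
print(graphics_budget_gb(16.0, 4.0, 6.0))  # 6.0
```

With a leaner ~2.5GB OS reserve the graphics share grows accordingly, which is why estimates in the thread range from 6GB up to 10-12GB.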
Dude devs are f**king up
The 6700XT is the card of this generation, not the 3060Ti. It is cheaper, faster, and has more VRAM than the 3060Ti.
Thanks, Steve. Bought a 2080 Super with 8GB in early 2020, before the lockdowns. Noticed it chugging in the new Dead Space at Ultra settings at 1440p. Knew right away it was VRAM-limited with RivaTuner running. Dropped settings to High, reflections to Medium and ambient occlusion to SSAO, and it's right as rain. Solid 60fps the whole game, except in two spots. The bench on the bridge was the first; can't remember the other one.
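For anyone without RivaTuner, a lightweight way to spot a VRAM limit on an Nvidia card is polling `nvidia-smi`. A minimal sketch; the CSV query flags are standard `nvidia-smi` options, while the helper names are my own:

```python
import subprocess

def parse_vram_line(line: str) -> tuple[int, int]:
    """Parse one 'used, total' CSV line from nvidia-smi into (used, total) MiB."""
    used, total = (int(field.strip()) for field in line.split(","))
    return used, total

def query_vram() -> tuple[int, int]:
    """Query current VRAM usage; requires an Nvidia GPU and driver to be installed."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return parse_vram_line(out.splitlines()[0])

# Example output line from nvidia-smi: "7680, 8192" -> 7680 MiB used of 8192 MiB.
```

If usage sits pinned at or near the total while the game stutters, you are most likely VRAM-limited rather than compute-limited.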
I'm coming back to this video 2 weeks later to say thank you; you single-handedly changed the consensus on the minimum VRAM requirement from 2023 onwards.
I'm glad that late last year, when upgrading, I went with a 6800 XT instead of the 3070 Ti. They had similar prices, but I also thought 8GB was a bit too low, since my GTX 1070 also had 8GB and I wanted an upgrade in VRAM as well. Thanks for your analysis, HW Unboxed!
These two cards are not even in the same league except for RT overall, you made the right choice sir!
Very good choice. The 6800 XT is more in line with the 3080, often beating it at 1440p in a lot of titles (non-RT, of course).
I think it was HW Unboxed that sold me on my RX6800 due to vram and 1440p performance
@@highmm2696 Thanks, yes, I know the 6800 XT is often better than the 3070 Ti, but here where I live they were practically the same price. Like Apple, Nvidia carries a "tax": their cards are usually more expensive and drop in price less due to higher demand. So AMD usually tends to be cheaper.
Thats why I bought the 6800XT instead :)
Been waiting for this video... in 2020 I had the opportunity of getting either a Sapphire Pulse 6800 or EVGA 3070... and I'm so glad I went with the 6800.
Came from a 5700XT to a 6800, worth it!
You must have friends at Nvidia, because the RTX 3070 released in July 2021, 1 year 8 months ago, and real availability came months later in late 2021.
@@gaav87 No it didn't. Maybe the 3070 Ti released in 2021, but the 3070 definitely released in 2020, September or October I believe.
@@CZcamsTookMyNickname.WhyNot Nice! I had a 5500XT, 5600XT and then a 5700XT before my 6800 lol… I kept wanting more power lmao
yeah you got lucky with all the scalpers and miners they were selling by bulk before arriving
You know what would be fun doing this exact tests?
RTX 3060 12GB
RTX 3080 10GB
RTX 3080 12GB
And see if the 3060 will pass the 3080 10GB in some games while comparing what performance had been left on the table had the 3080 originally come with 12GB!
Yes, please.
Hate to burst your bubble but the 3060 wouldn’t get higher fps in ANY game compared to a 3080… 12GB or not lol, it wouldn’t even be close. Just google performance stats for both and compare.
@@alenko4763 yeah unfortunately I think 10GB is just enough to actually avoid the issues the 3070 has.
At 1080p Windows 11 desktop, about 0.5 GB VRAM is consumed.
@@alenko4763 The 3060 12GB has superior 0.1% lows compared to the RTX 3070 Ti 8GB.
I'm very happy using a 6800 with a Ryzen 7600 as my secondary gaming PC. Never owned an AMD CPU or GPU before, and I have to say I'm impressed! It will mostly be used for emulation, as I put it all in a compact case.
I'd love to see a 3070 vs. 2080 Ti comparison, since most recommended specs specifically mention a 2080 Ti instead of a 3070 for 1440p 60fps Ultra settings even though they're comparable in rasterization, most likely due to VRAM.
That would be the most sensible comparison.
and maybe add the 1080 ti in the mix just for the lulz
Once I had VRAM issues, I ditched my 3080 12GB for a 7900 XTX 24GB. What an amazing card; it just lets me max out anything at 4K with loads of VRAM remaining. I'm on the Team Red bandwagon for now.
EDIT: When I mention maxing everything out, know that I do not use ray tracing. If you want good RT performance, go with the newer Nvidia cards.
Can it play 4k 120 FPS with maxed out settings?
@@alexts4920 For which game are you asking??
I go green, 4090 > everything
@@AdiiS 4090 is too expensive man
@@alexts4920 not exactly
lol. Back in 2021 the saying was "If you just wanna use rasterization, buy AMD, if you wanna use RT, buy nvidia". Funny how that turned out.
For me it was never a question, since nvidia doesn't give us Linux-nerds open source drivers.
This is why I watch you guys (and the other Steve). Good insight into potential future issues, but also willing to talk from the data even if it's slightly contrary to expectations.
It's funny how Nvidia is pushing ray tracing, yet when you turn it on, that's when their card drops to its knees because of VRAM, despite probably having superior ray-tracing accelerators.
It's also bewildering how people blame developers and poor optimization, justified or not, for low performance on 8GB cards, yet there doesn't seem to be much outrage about Nvidia pushing RT so heavily to sell 20-series cards back when support for it was practically non-existent and enabling it now on virtually any card from that generation will bring it to its knees. Is that also poor "optimization" on the part of developers or Nvidia overpromising and underdelivering?
It's not even "probably", the RT cores make a huge difference and the Nvidia cards should be vastly better at it, so the lack of VRAM is without a doubt a big issue.
@@chronicalcultivation Don't exaggerate the difference RT cores make. Sucking less does not mean RT cores don't suck. And that's only usual limited RT.
The RTX 4090 still only manages 17 FPS on Cyberpunk 2077 RT Overdrive mode.
@@sammiller6631 as someone who owns cards from both AMD and Nvidia, and also a PS5, no, it is not an exaggeration. The performance hit taken on an RDNA2 card or console for enabling RT is significantly more noticeable vs an Ampere GPU.
It's not totally unusable on AMD, but it's a lot worse.
I was really hoping the 7000 series would catch up and be equal to Ada in that department, but they seem to have only reached Ampere level RT instead.
I've always been more of a Radeon guy for almost 20 years, so the last few generations have been a bit disappointing..
It was better when it was still ATi
It's funny how, despite this, there were enough people stupid enough to buy them.
I can't believe the mental gymnastics people go through to justify buying Nvidia when they don't absolutely require it.
One of my buddies got a 6700 XT recently and had horrible driver issues, and some games performed really poorly in comparison to others. Could've been his fault, but I think AMD still needs to work on their drivers more.
Well they're better products (albeit more expensive) so that's really the only justification needed.
@@schwalmy8227 If there was a problem with the driver, he could just use an older one.
It's actually normal behavior. They bought something and they want people to assure them they made the right choice.
It's human nature. Even though we know we're wrong, we'll keep telling ourselves we're right in the hope that we'll eventually believe it.
It's just lying to ourselves. And we do it not just with things we buy, but with all sorts of things; even relationships, to give an example.
@@schwalmy8227 You're saying that as if Nvidia drivers were perfect.
I had so many issues with Nvidia's that for me it's not an argument for going team green. Their only advantage right now is the professional space, IMO. For someone who just plays video games, AMD is a clear choice.
Hell, even Intel has a lot to say here
I really appreciate the revisit. I've been watching used GPUs and you saved me from a costly mistake.
This is very useful information. Thank you so much for doing the work. If the mid-lower end of the 40 series is still 8GB then it is very probably a hard pass. Looks like I'm coming back to Radeon next build.
Nvidia planned this knowing what was coming, since they work closely with upcoming devs. All their products had gimped VRAM amounts so they could justify everyone buying new cards sooner.
The worst part is that the customers who got burned by Nvidia's BS will just go "oh, I guess my GPU is getting old" and buy a newer Nvidia one without a second thought.
Planned obsolescence doing its work.
@@DebasedAnon I have a 3070 and my next card will be an AMD card.
I have had enough of Nvidia's increasingly annoying nonsense for good, and AMD cards are now more than a viable alternative.
@@Thysamithan I'd say AMD outright won the previous generation before the entire GPU market went up in flames; where they stayed at around MSRP, Nvidia was trounced across the board.
If people cared a tiny bit more, the market wouldn't be as bad as it is now. It took the absolute farce that is the $1200 4080, plus a recession, to make people NOT buy Nvidia's products.
With that being said, AMD is being way too passive for my tastes. They could afford to go for the jugular right now due to having vastly cheaper costs, but they're content to undercut Nvidia by small amounts (relative to the price).
@@DebasedAnon My issue is that I know Nvidia is screwing me over, but I'm split between gaming and doing AI projects on the side in my free time, so I have no choice but Nvidia, since my workflows rely on CUDA. If AMD bothered to push ROCm and make it as viable as CUDA, and made FSR better than DLSS, I wouldn't even consider an Nvidia card.
I couldn't be happier getting my 6800 last year for $480. Apart from the odd driver issue (nothing serious), I'm a happy AMD customer so far. Nvidia became too rich for my taste with the 3000 series. Now both AMD and Nvidia are luxury goods with the newer-gen products.
I tried an AMD CPU, and after how shit it was (it didn't work and I couldn't even update the drivers), I'm just never trying AMD again.
@@valtonen77 Something's terribly wrong with your basic tech/soft skills.
Nvidia with their 4000 series GPU: you're too poor to buy our products so we don't need you anymore
@@PotatMasterRace ?? There's nothing to do, it's just plug and play. Dunno why he had a problem, but if Linus had no problem, it more likely applies to the general population as well (yes, I'm dissing Linus).
What scares me off buying AMD is all these driver issues I read about. For example, they had extremely high idle power usage for a really long time; I don't even know if it's fixed now. And then there is FSR, which is just not nearly as good as DLSS, ray tracing, etc.
Feel a bit bad buying my 3070 now, haven't used RT once and the RX 6800 is smashing it in these benchmarks.
I've never watched one of your episodes, but I'm glad I did tonight. Well presented and well narrated; nice work dude, I'll be watching more of your content.
The 6700 XT, the 6800, and the 2080 Ti are the only cards you should be looking at in the $350-450 range. Maybe sometime in the future we can add the A770 to that list.
@Cm Not to mention the problems that arise with early GDDR6. If your 20-series card has a certain type of early memory, it could fail at any point now!
@Cm You've been able to find the 6700 XT new for €399 for weeks in Germany. The price bump to the 6750 XT is not worth it.
Example: Mindfactory currently has the Sapphire Pulse 6700 XT for €383.05.
Steve revisited the A770 after a major driver update, and if I remember correctly the A770 and 6700 XT were around on par.
My Aorus 6800 Master still runs anything I throw at it. As long as my online multiplayer games can pull 144fps on the 5% lows and no lower than 120 on the 1% lows, I'm happy. As for offline games like Resident Evil or Atomic, still all good. Ray tracing at medium, though.
@Cm 348euro second hand here in Denmark, or a bit cheaper Ive seen.
I waffled between the 3070 and 6800 back in 2021, and probably would have bought a 3070 if I could have gotten one anywhere near MSRP. Instead, I bought a $900 Gigabyte RX 6800 Aorus Master. While I wasn't thrilled with the price, I'm glad to hear I made the right choice. HUB's concerns about VRAM were a major deciding factor for me. Thank you, HUB!
Best buy is selling that still for a cool $1400 + dollars 😂😂😂
@@TBRANN oof! Newegg still has it listed for $1,000. I probably bought the most expensive RX6800 out there 😂.
@jacobvriesema6633 hell $900 is a steal apparently 😳
I feel your pain at that price too... I had to buy a 6700 XT for €700 in that same period... If only my GPU hadn't died on me back then, I'd gladly get a 6800 XT for €650 right now...
i can buy a 6950xt for 650€ atm lol
Nice to see that someone on team red fixed the dip in The Callisto Protocol. If you check the 3080 and 6800 XT results in this game from prior test videos, the Radeon card had huge 1% low issues where fps would just tank in certain areas.
Some people in the comments are missing the point. Limited VRAM is planned obsolescence; most people don't upgrade their GPU often, and it intentionally keeps a high-end graphics card from its full potential.
You showed very well that it's not all about fps. Most people were looking at the fps results and passing by. We understand again how complicated the user experience is. Congratulations.
I saw 8GB of VRAM being an issue back in 2020, because Shadow of War (a 2017 game) requires more than 8GB of VRAM if you want to use the highest texture pack with all settings at max. That's why I ended up going with the 6800 XT.
Aging like fine wine
@Steven Turner Shadow of Mordor wasn't a grindfest
Far Cry 5, too. I’ve seen it use almost 10GB when you supersample it
I had a 6700 XT with 12GB, and some games could easily fill 9-11GB on the highest preset at 1080p!!! 8GB is not enough... now I also have a 6800 XT, and there it is normal for games to use almost all of the 16GB at 4K...
Modded XCOM 2 used almost 8GB on my 1070 at 1440p in 2016. I upgraded this year, and anything less than 12GB was a non-option for me, which meant Nvidia auto-excluded itself.
I woulda loved to see the 16GB A770 on these charts too, even just as a wildcard.
Intel is really providing a lot of value for entry level PC builders with the A770. If I was just starting out building PCs I would buy that one.
An unstable card for these kinds of results?!?! No?
@@josh2482 No? It still has a lot of crashes and is not for a PC beginner tbh.
Excellent review. The long graphs with FPS numbers do not show smoothness. Maybe you can adapt this approach to your GPU reviews in general. However, I believe showing in-game footage is a lot of work. Maybe you could use a metric like Igor's Lab does, measuring and evaluating "smoothness" via a statistical approach on frame-to-frame times.
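A minimal sketch of such a statistical smoothness metric, computed directly from raw frametimes. This is the general idea (percentile lows plus frame-to-frame variation), not Igor's Lab's exact method:

```python
def one_percent_low_fps(frametimes_ms: list[float]) -> float:
    """Average FPS over the worst 1% of frames (higher = smoother)."""
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, len(worst) // 100)          # at least one frame in the tail
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

def mean_frame_to_frame_delta(frametimes_ms: list[float]) -> float:
    """Average absolute change between consecutive frametimes; big deltas read as stutter."""
    deltas = [abs(b - a) for a, b in zip(frametimes_ms, frametimes_ms[1:])]
    return sum(deltas) / len(deltas)

# 99 smooth frames at 10ms plus one 50ms hitch: the average FPS barely moves,
# but the 1% low collapses to 20fps, which matches how the hitch feels.
times = [10.0] * 99 + [50.0]
print(one_percent_low_fps(times))  # 20.0
```

This is why a single hitchy second shows up clearly in 1% and 0.1% lows while being invisible in an average-FPS bar.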
Imagine making such a good core architecture and software like DLSS, only for it to be thrown away because of not enough VRAM. It really is the epitome of building an F1 car only to fit it with tyres from a Prius.
Outdated ShadowPlay that doesn't have a capture preview.
This makes me feel really good about my recent upgrade to the 6800 XT; hopefully it'll do great at 1440p for years to come.
On the future side of things, team red has always seemed to have an eye for longevity. Just nabbed a 7900 XTX myself after I saw my venerable RX 580 8GB was minimum spec; after 5 years of use it was a no-brainer to go red again.
@@Doomrider47 Agreed, I never really dipped into AMD until I got a Ryzen 2700 a while ago. I don't think I'll go back to an Nvidia card for the foreseeable future unless AMD starts sticking its fists up consumers' asses more than usual. I'm honestly hoping Intel brings a big update with its next GPUs; maybe both AMD and Intel putting pressure on Nvidia will knock them down a peg a bit.
@@Doomrider47 how big was the jump? I have a gtx 1070ti (which has similar performance to the rx 580) and I was also planning to buy a 7900 xtx or maybe wait 2 years and buy a 5080.
@@pyronite3323 Wait a bit more. Or if you can get a 6800 XT for less than 500 bucks, that's a steal.
@@pyronite3323 If you can grab the XTX anywhere close to $700-850, just go for it; it's worth it. Tbh I have a 3-year-old 2070S 8GB, but I don't regret it for $170.
Nice video! I would like to see a video where you test and talk us through your testing process. Like, if you have to change hardware a lot, as in the B650 test, how do you make your time efficient? Do you use a USB stick with all the drivers for every brand? You test a lot, and the way you do it has a massive effect on time consumption and quality of analysis. It would be nice to see a video like "behind the smoke" 😂
This perfectly encapsulates my recent thoughts. I really wanted to get a 3070 back around November of last year, but the price was a lot more than the 6700 XT that I got for around $380. Really happy with that purchase, as I now know I'll get a good 3-5 years at least out of it.
In my country the price difference is basically nothing. So is it worth buying the RX 6700 XT over the RTX 3070 just for the higher VRAM?
@@moox556 Yeah, I'd do it again in a heartbeat even at the same price. Modern games use a lot more VRAM; 8GB isn't enough anymore like it used to be.
I had a 3070 for a little while, it was a fine card. Ended up with a Radeon 6800XT not long afterwards and sold the 3070. I don't regret it.
I’m curious what the value of a 3070 will be in a couple more years as this continues to happen.
@@Vis117 it'll sell very poorly I think. The market will be flooded with them in due time.
I went from 3060ti to 6800xt, happy about the decision especially since I play call of duty
Smart guy
i did exactly the same thing
It's funny to me that you got comments complaining about your VRAM criticisms. Part of the reason I went with the 6800XT (aside from the fact I was able to get one for not much more than a 3070, and I would have been able to get a 6800 for a bit less than a 3070 if I'd gone that way) was because of your VRAM comments.
I am, therefore, very grateful for your VRAM criticisms personally.
A lot of people who criticize HUB's VRAM criticism say that speed is all that matters, not size. It was already quite clear back in The Last of Us Part 1 benchmark that speed doesn't help when there isn't enough VRAM.
The 3070 Ti with 8GB is faster than the 6800, but he chose to compare two different tiers of performance, and keeps harping on bottlenecking issues when trying to run Ultra on badly optimized titles with a mid-range card from years ago.
I actually ran into VRAM problems with my old RX 580 8GB. Sometimes I needed to close the browser or kill dwm.exe so a game would run smoothly. That was in 2021, when I decided to upgrade my graphics card. And then I saw the 3080... 10GB. I was like, "What the hell? How on earth is 10GB enough in 2021?" I ended up going with an RX 6900 because Nvidia cheaped out on VRAM for the whole 30 series.
Nvidia plays a good game here.
I know this comparison would be a little outdated, but could you do a comparison of an older title with an RTX 3060 (12GB) vs. RTX 3070 (8GB) to see how resolution and settings scaling do? This would also highlight the limitations of the 3070 compared to Nvidia's own cards.
That would be an interesting comparison: how many more frames the 3070 gets at 1080p for $200 more.
Thanks for the content Steve and team. This is a really important subject IMO and I'm very grateful you guys have covered it.
The amount of VRAM really came into my consideration of getting a 3080 Ti or 6950 XT ($1400 vs. $1000 AUD at the time). This information would really have been handy then, but anyone who had taken the blinders off saw this coming a long time ago. Now I also feel more at peace recommending AMD's higher-VRAM offerings to friends who ask for advice. Cheers guys.
Same here. Months ago I was considering a 3080 Ti 12GB for my new build; it cost around 1000 USD. Then 3 days ago I saw a site where I could buy either a 7900 XT at 900 or a 7900 XTX for 1000. Not only is the performance greater (for non-RT), the VRAM capacity is also significantly bigger.
As an owner of both gpus i approve of this video
I noticed it in CoD, Forza and other games on the 3070.
Any questions, feel free to ask 🙏
Does OCing the memory help mitigate the issue?
@@murdergang420 Have you noticed who sponsors the vast majority of these problem games, though? I mean, seriously, I feel like I'm the only one who notices this coincidence. Every game in this video with serious problems is an AMD-sponsored title, which means they worked closely with AMD when developing the game. I'm sure it's nothing though, right? No way AMD would be turning knobs here to make games run badly on certain Nvidia cards while taking advantage of the one huge advantage AMD has... nahhh, couldn't possibly be; it's just "Nvidia BAD" and we need to remember that.
It's not like anything like this has happened before, with Nvidia tessellation crippling the opposition's cards. Honestly, I'm very surprised outlets like HUB don't at least acknowledge this and question, even a little, whether something underhanded and fishy is going on here when every game with a problem has AMD's hands on it; always the games that come out with terrible RT as well. Just a coincidence, I'm sure they have no say at all lol.
@@jinx20001 So do you have the same issues with video editing software like Adobe running better on Nvidia hardware? Because it's been that way for the longest time, and I've never seen conspiracy theories mentioned about that performance. You should be more upset with Nvidia for putting out a crippled card with 8GB of VRAM. If they had at least put out a competent product, it would be easier to entertain accusations of bad optimization.
That said, when AMD got the console contracts from Sony and MS, it was predicted that game optimizations might start to favor AMD.
@@murdergang420 For some games, have you experienced stutter due to shader caching with the 6800? Destiny, for example, experiences heavy stutter for a bit, then gets better over time. I just find it annoying that this happens with every GPU driver update; it makes me not want to update the driver.
Thanks for the revisit, Steve! As I said over 2 years ago: the RX 6800 was probably the best graphics card released in recent years. It was faster, more efficient and more future-proof than the 3070 from the start, and yet the 8GB VRAM cripple had to be hyped because it can do the oh-so-great DLSS and has much better ray-tracing performance. Good job, press and influencers.
czcams.com/video/kKSBeuVlp0Q/video.html
Yes, but in most places in Europe it was out of stock, or when it was in stock it was more expensive than a 3080 10GB.
They can only review what they know; whether games will imminently use more VRAM is speculation, and the reality is that VRAM usage of games *in general* had been pretty stagnant for years (largely due to consoles, imo). I agree that Nvidia shouldn't be skimping on memory size or memory bus, but you can't really blame the press for reviewing the cards they got, at the time they got them, with the information they had.
Yes, the efficiency in every metric is incredible, but that put it very close to the XT version in real-world prices.
Comparing the 3070 to the 6800 while skipping the 6700 XT is like comparing the 6800 XT to the 3090 while skipping the 3080... it depends on whether you're a red or green team fanboy. Why not compare the 3070 to what AMD was promising as its direct competitor at the time, the 12GB 6700 XT? Both of those companies crapped on gamers equally in recent years.
Great video as always! I would like to see how the 12gb GPUs like the 3060 and RX6700 fare against these newer games.
With the price of the 3060 you can just buy a 6700 XT, which will shit on the 3060 any day.
This is a great video, great job showing the effects on screen
I'm impressed by the RT performance of the 6800.
Makes a whole video to say I TOLD YOU SO! No, but seriously, great work. I bought a 6800 2.5 years ago based on your original videos!
Great work Steve! This looks like important information!
Great video. Glad I watched, as I was considering the RTX 3070, but not now. I game at 1440p on a GTX 1070 atm, whose days are numbered. 👍
I remember feeling a bit burnt when my 1GB 560ti couldn't play PS4 games once the generation swapped over. Funny to see this happening all over again.
Nvidia has *always* been stingy with VRAM. Even back in the day it was easy to get Radeon GPUs with over double the VRAM for the same price. In 2015 the GTX 960 only had 2GB, and the 970 famously only had 3.5GB of full speed VRAM!
Let's not forget about the 1030 with DDR4. VRAM so crap it shouldn't even be called a 1030.
The question is, did you learn from the 560 Ti or did you bend over for Jensen again?
Same! I remember asking on a forum whether I should get the 560 Ti 1GB or 2GB, and everyone recommended the 1GB. I mean, people took you for an idiot if you dared to recommend the 2GB version. But within just a year, Battlefield 3 came out and destroyed that 1GB of VRAM. I was so mad. Yet 12 years later, history has repeated itself.
How tiresome you Western clowns are with this eternal mention of the 970's 3.5GB, as if each of you learned about it yesterday and is rushing to tell everyone else who "doesn't know".
Goes all the way back to the Geforce 3/4 days.
Far out, I was tossing up between a 3070 Ti and a 6800 XT late last year and I'm really glad I went with the Radeon GPU now! Great work as always Steve.
I had the same dilemma earlier this year and went with the 6800xt. Very happy with my choice
no DLSS...
@@Magaswamy716 fsr is pretty decent
Unfortunately, people still buy an RTX 3070 Ti over the RX 6950 XT in some countries where prices are similar 😢😮
@@Magaswamy716 man you guys never give up........
I play a heavily modded simulation game that uses more than 32GB of RAM and more than the 11GB of VRAM on my 2080 Ti at the time (I use RivaTuner to monitor in-game). I first upgraded the RAM to 64GB of 3600 CL14 DDR4; still not good enough. I then pulled the trigger on a 3090, for way too much money, but it made the game smooth again. I ended up monitoring 39GB of RAM usage and 17GB of VRAM. It stung a lot at the beginning, but now I feel my 3090 will serve me well for years to come. All my decision-making was supported by your great content as well; thank you for that :)
I remember getting the 3070 2 years ago and running into VRAM issues in CP77 and Metro Exodus Enhanced Edition. I was much happier after switching to the 6800 XT.
I managed to get an RX 6800 for RRP on launch day, very savvy purchase in hindsight 😀
Keep up the good work HW UB
If the rumored 8GB of VRAM on the 4060 and 4060 Ti turns out to be true, I am very much looking forward to seeing HUB's review, as there is no way any of those products will be priced accordingly.
The RTX 4060 will be bought by miners... cryptocurrency...
It's joever
@@hugobalbino2041 Do you live under a rock? Crypto is dead. Nvidia even came out and said that cryptocurrency provides no value to society. Lmao.
@@josh2482 8GB cards are fine for crypto... not so much for gamers... even 12GB cards... As for me, I don't pay electric bills because of my solar power... so yeah, I don't live under a rock... and there are still GPU miners out there...
The sad part is that, yes, the 4060 (Ti) will get scathing reviews for only having 8GB of VRAM, and yes, they'll sure as hell be overpriced for that, but people will still buy them.
And, from what it looks like, AMD won't go ahead and do the smartest thing they could do: launch both the 7800 XT *and* the 7700 XT with 4 MCDs and 16GB. It's such a pity, and I really hope they reconsider. Already having the chiplet design implemented for RDNA3 would certainly give them the chance, unless binning for the GCD also covers defects in the fabric interconnect to the MCDs.
Jensen Huang : It's just an in-built feature called Motion Blur.
I knew more VRAM was better back on the Radeon 6800's release day, and that FineWine would kick in sooner or later. It was good that reviewers mentioned the extra VRAM and the Infinity Cache as useful features.
Yeah, AMD is always very memory sensitive; even a GPU memory bump raises fps.
This is one of the reasons I went from a 3070 to a 6900 XT: the extra VRAM. Shockingly, I had to spend no additional money on the 6900 XT! The performance has been great for the last 6 months, and I'm quite pleased with the AMD card. Also, knowing that AMD cards age much better than Nvidia cards makes me think maybe I should stick with AMD from now on.
How are the drivers working for you? I'm very tempted to go red, but I don't know if AMD has resolved most of their driver issues.
@@ChiplockRock To my surprise, I have yet to encounter any problem with the drivers; every game I have played and tested worked great. I don't know about productivity applications, though, as I only use Photoshop and Premiere Pro; anything else is CPU-related.
@@ChiplockRock I've had a good number of Radeon cards... most of the driver issues I've had came when I tried obscure stuff like connecting multiple monitors of odd sizes, or when Windows automatic updates borked the drivers. My Radeon VII is also picky about which driver it wants to work with, and my X570 Crosshair VIII board does not like my Radeon VII.
AMD has always had a shaky history with drivers, and things like taking months to get the RDNA 1 drivers right have only hurt them.
I bought a 7900 XTX Red Devil on launch day and have had no problems so far. My only complaint is that there is a 25°C difference between the hotspot and the regular GPU temperature. Otherwise the card doesn't get too hot.
@@ChiplockRock I just got a 7900 XTX (new build, coming from a 2080 Ti) and have no issues; also, the AMD software is way better than Nvidia's. I think the driver thing isn't really as applicable nowadays as it was in the past.
What size PSU do you have? And what CPU do you have paired with it? I've been looking at maybe going a different route from my 3070.
I paid a shitload of money for my 6800 XT during the shortage (twice what it costs now). Still extremely happy with it, and still glad I did it. Happy to see it still performs so well today.
Same here. I felt lucky at the time to ONLY pay $900 for an AIB 6800 in 2020. I think I got my money's worth.
Same. Actually, at that time in my country the 6800 XT and 6900 XT were roughly the same price. I ended up getting the 6900. Glad 6800 users are still in good shape.
Same here :)
Also got the 6800 XT, at launch price. 8GB was a big no-no; it was very obvious that it was too little.
Just like the GTX 580 that came in 1.5GB and 3GB variants: the 3GB lasted years longer. Same with the 3GB/6GB 1060.
Paid
This video combined with LTT’s 4070 review finally got me to upgrade my old GTX 1080. Picked up an XFX 6800XT for less than a 3070. Looking forward to an AMD GPU, haven’t used one since the ATI Radeon HD 3870.
Thanks for the review. I got a 3070 at release for RRP and have been really happy with it, but it's good to know for future purchases (I'll likely not even worry till the 5000 series)
Same here, 3070, happy with it; I will never buy AMD.
Nothing quite like some fine wine 🍷😂
Great job Steve! I'd love to see this same type of comparison with the 3080 10GB & 12GB against the 6800 XT.
Exactly! This would be a nice follow-up video, probably including 4K resolutions.
The only difference is that, price-wise, the 3080 is more comparable to a 6950 XT.
So glad I got the 12gb version when prices became low. Getting the 3070 would have caused buyers remorse.
The 6800 XT also has 16GB, the cheaper 6700 XT and 6750 XT came with 12GB, they were more comparable to 3060 (Ti).
Wish got granted.
Great video Hardware Unboxed.
As a 3070 owner, I feel ripped off; I spent a decent amount on this card thinking it'd be good for years to come. I'm so extremely frustrated now, since in most new games, even at 1080p, I'm running out of VRAM. Selling the used GPU is going to get me next to nothing, and the new RTX cards are outrageous; frankly, I'll never buy an Nvidia product again. The 6900 XT is available and cheaper now, but I'm not sure I want to buy a card that's already last gen, even if it's better than the 3070. Meanwhile, the 7900 XTs and such seem to be quite expensive.
I'm torn now: keep dealing with this absolute hot garbage product, or empty my wallet into an AMD GPU. Or wait until next gen AMD. I have no idea.
Nvidia is terrible and extremely anti-consumer. My next GPU will be AMD, and I cannot wait to be able to play games normally on a card that wasn't made to be obsolete in a few years.
Totally agree, I was shocked to see these issues at 1080p. Crazy, feel for you bro. At this point I'm done with Nvidia.
Give the Intel A770 a try then. It's cheap, has 16GB of VRAM, supports ray tracing, and Intel has been making a lot of improvements to their drivers. Buy it from somewhere with a good GPU return policy in case it doesn't work out for you.
I bet you could sell a 3070 and buy a 6800 or 6800 XT for a minimal difference. We will also have the 7800 XT and 7700 XT sometime this year.
I had a lot of issues with drivers on the RX 6700 XT, especially in old games, where it performs almost like my old GTX 1060; in new games everything was OK. Looking at internet forums, lots of people have the same issues. I'm going back to Nvidia. AMD drivers are not good.
7800 XT sounds promising!
Great video as always. I've been looking at the RX 6800 as they've dropped a decent amount in price here in Blighty recently, down to the £440 mark. I'd be upgrading from a 2070 Super. Is this a decent upgrade?
I know it wouldn't be super applicable, but seeing the difference between the older HBM/HBM2 memory cards and the GDDR6 cards would be an interesting comparison.
This video perfectly reflects what it feels like to have bought an 8GB GPU some time ago and to suffer with it now. I would be glad to see a comparison between the RTX 3060 Ti and RX 6700 XT, as these are also direct competitors. Good job!
They are not, though. The RX 6700 XT crushes the 3060 Ti everywhere except ray tracing.
I've been gaming on a 6700 XT for 1.5 years now, all ultra settings at 1440p (with minor tweaks here and there). If I EVER experienced ANY stutters or anything I didn't like, I'd instantly switch to a better card. But I don't have to, because EVERYTHING TODAY RUNS BUTTER SMOOTH!
Are you really suffering? Just drop texture quality one step and this won't even affect you. But if you swap the situation around and high texture quality didn't exist, everyone would blast AMD for wasting money on RAM when they don't need it. Funny how that works.
Similar situation here. On the other hand I tend to not use raytracing and use a mix of low/medium settings so I don't think 8GB VRAM matters too much for my use cases. The testing shows 3070 is kinda bad but my 3070 will do just fine until it dies I'd say.
I'm okay with my 5700 XT 8GB; its performance is nothing to write home about to begin with lmao
Great video Steve, thanks. Another thing to consider is that calling these cards "aging"... for a lot of people, these cards have only recently become available to buy, due to shortages and pricing. And they are currently still the newest 'midrange' cards available! So these are actually current gen cards struggling!
Makes me wonder where the "value" is...
Thanks a lot for the video. I had no idea this was a problem, and with these results it's a HUGE one. I have a 3060 Ti at 1440p, and this explains why I was able to play Metro Exodus with RT maxed out at good performance, but Cyberpunk with RT became really unstable after some time in game. This totally explains it.
Thanks a lot, my next card will be a 16GB card at minimum.
Hi, love your videos, when can we expect to see a review of the mobile parts like 4050 and 4060 mobile?
Would be awesome to compare the 3080 10GB to a 6800XT 16GB in these games.
They are pretty closely matched in rasterisation performance, I wonder if the AMD card pulls away massively with more RAM.
I would also like to see this comparison!
Not with VRAM though. As a 3080 and 3090 user, I can tell you that 10GB isn't even that much of a difference from 8GB of VRAM, since games that need more than 8GB usually just go to 12-14GB.
The 6800 XT is closer to 3080 Ti/3090 performance in rasterization in newer games, outside of one or two random titles. Sometimes it even exceeds that.
The 3080 would get shit on.
@Kevin P. Lmao NO, wtf are you watching to say bullshit like this? Just check the recent game benchmarks from HUB: the 3080 10GB still beats the 6800 XT in EVERY GAME, even at 4K; the latest ones are TLOU, Hogwarts, and Spider-Man. In fact, the 3080 10GB even beats the 6900 XT and 6950 XT in Hogwarts and Spider-Man respectively at 4K ultra, and the 6800 XT by like 10-15%. And if you check other reviewers, the 3080 is also better in other new games like Dead Space or A Plague Tale. The only recent game where the 6800 XT outperforms the 3080 10GB is CoD MW2, and everyone knows that game is just an outlier that heavily favors AMD. Imagine saying "the 6800 XT is on 3090 level" when in the very recent TLOU video the 3090 was faster by like 30% at 4K. Stop coping.
The first 6 games tested used 12-14GB of VRAM, so the 3080 10GB is still going to struggle
I just recently switched from team green (GTX 1070 8GB) to team red (RX 6900 XT 16GB) and I gotta say, that extra VRAM is nice. The Resident Evil 4 remake uses a whopping 13.8GB of VRAM at 1440p if you set everything to max (with ray tracing and hair), so 16GB does seem like the way to go moving forward. That isn't shocking, considering there was a time back in 2007-2010 when we were jumping to double, triple, or quadruple the then-acceptable VRAM amounts. I remember having a 128MB card and moving to 768MB, then a 1.5GB card and moving to 4GB, then 4GB to 8GB, etc. As the years go by, we need more VRAM; it's that simple.
Your video was great! I was wondering if you could make a video discussing the best DDR5 RAM to use with an i5 processor. I've noticed that most videos on YouTube focus on either AMD or the Intel i9, and I think it would be really helpful to have a video specifically dedicated to the i5. According to the Intel website, the processor supports up to 4800 MT/s, but I'm curious if using a different or higher speed could provide any benefits or cause potential issues. Would it be possible for you to create a video on this topic? I would highly appreciate it.
"I'm curious if using a different or higher speed could provide any benefits or cause potential issues. "
The answer to this is the same as it was for DDR4: "Yes, and yes." There are many variables that could make it one or the other, and it can also be both. Generally, though, the system will be perfectly stable using 5800 MT/s sticks paired with any Intel processor and running XMP to get the full speed of the sticks. You will see some performance gains in some applications. In some applications the bandwidth is more important, in others tighter timings matter more, so it depends on what you're doing. In gaming, latency is the most important, so you want those timings as low and as tight as you can get them.
This also might be relevant to you:
czcams.com/video/b-WFetQjifc/video.html
The closest way to simulate a 16GB 3070 is to use an RTX A4000. I think it would be 10-15% slower in non-VRAM-limited scenarios but would beat the RX 6800 in ray tracing scenarios. It's also around $500 USD, used though, not new.