AMD vs Nvidia
- uploaded 13 Jul 2024
- AMD vs Nvidia: which GPU makes the most sense to buy? In this video I specifically compare the RTX 3080 and the RX 6800 XT. All testing was done on my system described below at default out-of-the-box settings and with Smart Access Memory and/or Resizable BAR enabled.
Specific GPU models:
Asus TUF Gaming OC RX 6800 XT amzn.to/2YeMT63
Gigabyte Eagle RTX 3080 12GB amzn.to/3speCxi
My system:
CPU: AMD Ryzen 9 5950X amzn.to/3a6NZTO
Cooler: Arctic Liquid Freezer II 360 amzn.to/3Ddrqts
Mobo: Gigabyte X570 AORUS Elite amzn.to/3laFVIh
RAM: Crucial Ballistix 3600 CL 16 amzn.to/3iyZdoM
SSD: Samsung 980 Pro amzn.to/3BfkKds
SSD: Samsung 970 Evo Plus amzn.to/3aFYIVO
Case: Phanteks Eclipse P600S amzn.to/3A3fLeE
PSU: Super Flower Leadex III Gold 850W amzn.to/3Bo9HyZ
GPU: AMD RX 6800 XT amzn.to/2YeMT63
Monitor: LG C1 48 inch OLED amzn.to/3nhgEMr
What equipment do I use to make my videos?
Camera: Sony a6100 amzn.to/3wmDtR9
Camera Lens: Sigma 16mm f/1.4 amzn.to/36i0t9t
Capture Card: Elgato CamLink 4K amzn.to/3AEAPcH
Mic: My actual mic (AT 3035) is out of production but this is a similar mic (AT 2020) amzn.to/3jS6LEB
Portable Mic attached to camera: Rode Video Micro amzn.to/3yrT0R4
Audio Interface: Focusrite Scarlett 2i2 3rd Gen: amzn.to/3wjhlad
Greenscreen: Emart Collapsable amzn.to/3AGjQXx
Lights: Neewar Dimmable USB LED amzn.to/3yw4frD
RGB Strip Backlight on desk: amzn.to/2ZceAwC
Sponsor my channel monthly by clicking the "Join" button:
/ @danielowentech
Donate directly to the channel via PayPal:
www.paypal.com/donate?hosted_...
Disclaimer: I may earn money on qualifying purchases through affiliate links above.
Chapters:
0:00 Why I bought an AMD GPU in the first place
1:20 Why did I buy a 3080 when I already had a 6800XT?
2:34 RTX 3080 vs RX 6800 XT first impressions (Elden Ring)
4:34 Some full benchmark comparison graphs at all resolutions
5:58 Forza Horizon 5 is an outlier (SAM ON and ReBAR forced on through Nvidia Profile Inspector - the 3080 does worse without it, and there's no official ReBAR support in FH5)
6:58 RTX 3080 4K ray tracing and DLSS benchmarked in Cyberpunk 2077 (and thoughts on RT and DLSS)
13:18 DLSS vs FSR thoughts
15:03 What about AMD driver issues?
16:02 What about Nvidia driver issues?
17:42 Productivity and software comparisons (gameplay capture, NVENC, video editing, etc.)
23:15 Resizable BAR support vs Smart Access Memory support
23:56 Final thoughts on RTX 3080 vs RX 6800 XT - Science & Technology
What a great comparison. Surprise ending: both AMD and Nvidia make good graphics cards. So your point that which card to buy comes down to price is correct.
Correct
I'd also add DLSS or VRAM. At least one game uses 11GB+ (FC6), and that one is dangerously close to 12GB. I go for AMD always, as I dislike Nvidia a lot for their shitty behaviour. That said, Nvidia in most cases is probably better than AMD (but not by a lot).
Don't forget AMD can black screen. That should be very important. 15 years with Nvidia now, and problems with drivers or games are non-existent, to be honest.
@@Giedrius_Savickas I've had Nvidia & AMD cards. Some Nvidia drivers were an instant bluescreen. They both have their bad sides.
@@genrabbit9995 well maybe I was lucky... Peace!
In the Elden Ring 4K footage, did you notice that the 3080 is running 10C hotter and drawing 100W more than the 6800 XT for the same performance? That's shocking!
The 3080 has FSR and DLSS, while the 6800 XT doesn't get DLSS.
Elden Ring is in rough shape on desktop right now, so that may actually be a game problem.
Well, RTX cards are known for very unstable temperatures - it's either VRAM or core. On AMD you get higher junction temps, and that dictates fan RPM. Just saying that I finally sold my RTX 3080 and a 6800 XT is coming to me today.
@@vlad_8011 How did this work out? Also, was one of the reasons you switched that you weren't a big user of ray tracing and DLSS?
@@xaimond12 After a month of returns and exchanges I got a 6800 XT Gaming Z Trio.
I learned one lesson: keep away from XFX. Three cards in a row (I made a video about it), a bad batch, and lots of people are getting exactly the same coil whine (100% a fault of the GPU). I was mad at ASUS, don't trust Gigabyte, and lost trust in "AMD-only manufacturers", so the Sapphire Pulse (not a single review!) was also off the table, and I chose MSI. Fantastic PCB, great coils, 3 x 8-pin (card rated for only 300W max!), but poor fans and cooler. And the BIOS caps the power limit at +9% max. I think it's my last GPU; the quality of these gets worse with every gen. I deshrouded mine and put on 2 x 120mm Noctua S12A PWM Black fans, plus made a silver cover for them. Temps got a little better, but watching a video on YouTube pushes the junction temp to 60, which turns the fans on. That's poor cooler design. On the XFX I could watch 10 videos and the junction temp stayed at 45.
Wow, I will admit I had the exact same experience you had. Anyone not agreeing either didn't test both cards thoroughly or is supporting one team more than the other. In the end, I don't care which name my money is on; I care about the value I'm getting for the money I spent. I agree with you exactly, and I'm finally happy to see someone unbiased who can speak from both worlds. Good job!
I'm the same, but I also look at business practices, and Nvidia has been pretty shady as of late. Intel, AMD, and Nvidia have all done shady things, but some react better than others when called out on it. IMHO. Of course I'm weird and look at things like power consumption and cooling as well as raw performance and price. At least it's nice we have some choices. It'll be even better if Intel starts selling GPUs that work well with modern titles.
AMD claims it will not be possible to bring ray tracing to fairly recent Ryzen generations of chips (only the newest ones), meanwhile Nvidia provides driver updates that bring it to cards as far back as the 1060/1070/1080 (from 2016!!!!). Awful... My footage on AMD: czcams.com/video/6P8Fc9mNQhQ/video.html
Nvidia backports capabilities into cards going back as far as 2016 models... AMD says they can't provide support for chips released in 2019 (Ryzen 3000 gen). Pick your side...
For me it is only a bit of a worry about drivers. I haven't tried any AMD cards yet, but I'm worried they might have shitty drivers and require more troubleshooting than Nvidia cards.
@@nightdark2616 I haven't had a problem with AMD drivers in quite a while. I got a 6800 XT Liquid Devil in March of 2021, when you couldn't get a 3080 anyhow, and even that was only available because it required a custom loop. Luckily I got it before the huge price bumps, and I haven't regretted it one bit. My son runs a 3080 10GB, and either card works nicely, but I do like my 16GB of VRAM.
The 6800 XT was the best buy I've made in a long time. No regrets here. I play at 1440p and I couldn't ask for more in terms of FPS.
Ma brotha 🙌
Same, it's been great with my 32" LG 165Hz screen, 5600X, and 32GB of 3200 RAM.
You overpaid, unless having 100+ fps at ultra quality in Cyberpunk is worth $500 extra.
@@user-wq9mw2xz3j Overpaid? How do you know how much I paid? I paid 900€ for a Sapphire Nitro 6800 XT. The reference card is 650€, so it's a reasonable price considering the current market.
@@izidor I meant overpaid as in you bought a card more expensive than needed for 1440p gaming at quality visual settings and high enough fps.
But for 900 that sounds fine; 6700 XTs cost that price lol.
Lol, for those of you saying "misleading/clickbait title": did you read the thumbnail? They go together lol. This entire video is designed as a response to the "Buying AMD was a HUGE MISTAKE" comments on my previous "I switched to AMD from Nvidia" video. Edit: I've added the "or was it?" to the actual title now so people don't miss it.
bruh
btw, the title gets indexed by Google... while the thumbnail does not.
@@sharktooh76 Yeah, that's what got me. I didn't watch a second time, so I never saw the thumbnail till now!
Go team red.
Don't you worry Daniel.. I hate this cliché but haters will always hate
So glad to see your channel growing! I found it probably a month or two ago and it’s nice to see you still gaining traction (while also teaching). Keep on going man!
Yeah, it's just great to see the channel growing so much, right?! I think he had around 200 subs when I first landed here, and since then it's just been skyrocketing.
@@rodrigoferreiramaciel4815 ikr? I haven’t seen growth like this in a while
Yea, you're crushing it, bruh! Keep it up! Maybe it's time to reach out to some companies and ask them to sponsor your channel? idk how it works, but I would just call up their marketing dept and see what they do... Sorry about the necro, if that bothered anyone.
I think this is the best video you’ve done so far! I’ve been having these exact same questions for the exact same reasons as you for about a year. Thank you so much for going into more of the software side, those of us who can’t get our hands on either card yet obviously can’t look at the software so this video is super helpful.
While AMD's h.264 encoder is "meh...", the h.265 encoder is just amazing. Make sure to record videos using that.
Sadly, h.265 is not supported by streaming platforms yet.
Oh... I forgot about the audio. There are several programs that do a way better job than RTX Voice. For example, you can use VB-Audio's Voicemeeter and set up a real-time noise gate, compressor, and equalization on the microphone to remove any background noise or unwanted sounds and GREATLY improve your audio quality.
On Linux you can use EasyEffects with PipeWire or PulseEffects with PulseAudio. It is much more versatile than any solution on Windows.
What is h.265 on RDNA 2?
@@THEpicND h.265 is also known as HEVC, or High Efficiency Video Coding. It's the evolution of h.264. If you want more info, look on Wikipedia.
Too bad you can't stream with x265, so it's useless for streaming. Kind of pointless having a feature that isn't supported at all, but AMD just doesn't seem to give a rat's ass about it, as they haven't even acknowledged the piss-poor streaming capability of their last few cards.
@@pino_de_vogel Video streaming services have to implement it. It's way more efficient than h.264 and has significantly better picture quality at lower bitrates.
It's not AMD's fault if YouTube and Twitch don't care to implement it.
Even DVB/DVB-T uses h.265. The streaming platforms have no excuse...
I ended up with a 6900 XT about 5 months ago. I was shopping for a 3080 myself, but the prices were insane; at the time even an RTX 3070 would have cost more than the card I bought. Elden Ring runs great, my GPU temps hover around 58C, and it draws right around 100-200 watts of power while playing. It's not bad. One last thing for Elden Ring: if you have a G-Sync/FreeSync monitor, enable it on both your monitor and in the GPU drivers. Those micro-stutters pretty much disappear after about 5 minutes of gameplay.
I was extremely fortunate and was able to get a 6800xt for MSRP from the AMD store a few months back. This video makes me feel even luckier since I paid half the price for the exact same performance. Thanks for the in depth comparison 👍
I'm so happy to see you grow, thank you for all the amazing content man!
Really liking your content! Glad to see your channel growing too. You're one of my favorite video makers now :)
I'm a Linux user of over 20 years and I used Nvidia cards for years. I've now been on AMD for a few years, and boy is it better! Now with more Steam games working in Proton, I hardly need to switch to Windows at all for games. Life is good.
AMD needs to up its game on improving its performance against DLSS. I bought a 6600 XT based on the hope that their ability to upscale well would improve. I had a 2060 before, and DLSS impressed me, but the price chasm between Nvidia and AMD pushed me to team red.
I bought a 6600 XT and a 12400F for 1440p.
DLSS and FSR are both hot garbage. What matters is Nvidia's CUDA cores (AMD's high-end RDNA can use HIP), because adults want to use the cards for more than "teh vida gaiyms" and want to get work done.
Performance against DLSS isn't the problem. FSR does that just fine. It's quality that needs to be improved.
I've been waiting for the RSR update since January haha.
Seriously, why is it so hard for AMD to update the driver?
The irony is that FSR works best in the higher end cards (higher resolutions) where you need it least. DLSS is the better performing and looking of the two technologies, but it requires proper implementation from the developers. AMD has a lot to work on. It is interesting to me that the sound processing for Daniel's mic was better on AMD.
Nice video, glad your channel's growing and helping you test more hardware. I was in the same boat: I tried for 7 months to get a 3090 to pair with my 5950X but couldn't find one. I tried the Newegg Shuffle so many damn times and never won a chance to buy an overpriced GPU. But a month ago 6900 XTs started becoming available, so I bought a Nitro 6900 XT SE. I play at 1440p and it's probably the fastest 1440p card, but like you said, at 4K it's a little slower.
Great video! I liked that you added the personal experiences in the comparison.
The best comparison I've ever seen on YouTube. I've never seen anyone lay out the strengths of the 6800 XT and 3080 so well.
I'm with you on this take. Having owned both myself, it's kind of a "whichever you can get for a good price, go for it". One big advantage you don't touch on here is selling your used GPU on the second-hand market: those Nvidia cards are 5,000% easier to find a buyer for, which shouldn't matter, but if you're upgrading every 12-24 months, this may be a small factor to consider.
Not so sure. I had big trouble selling an RTX 3080 10GB for 2 months. It was a real pain in the a... and I lost the most money on that card. Before the RTX, I had 2 x 6800 XT, which I sold without any loss.
@@vlad_8011 Let me guess, you were trying to sell it for the price you bought it for? Even though it was used?
Or perhaps even trying to get more?
For example, right now? I'd be able to sell this 3080 12GB for maybe 600 bucks. 700 tops.
If I went on eBay, perhaps more. Hard to say. But I'd rather get less and get cash in person, hand it over, done deal.
I just got the 3080 12GB. No need to upgrade any time soon. And if for some reason I felt I needed to? I'd sell it and make up the rest quickly, in like two weeks, because I have a job (lol), and buy one of the new ones scalped. Which is how you'll have to do it.
@@duckilythelovely3040 Well, let me give you exact numbers:
Red Devil limited Edition 6800XT - bought for 6200PLN, sold for 6200PLN
XFX Merc 6800XT - bought for 6200PLN, Sold for 6200PLN
ASUS 3080 TUF 10GB - bought for 6400PLN, sold for 4000PLN
That's the most costly card for me so far. I couldn't sell it for 4100, 4200, 4300 - used cards started at 4600PLN when I was making my post.
@@vlad_8011
I've returned the 3080 12gb today. I've decided none of these cards are worth the money right now. The new ones will be out in a few months and they will be vastly superior. I'll wait.
I've found that none of these are worthy of their price tags.
@@duckilythelovely3040 I have a second 6800 XT from XFX right now. I can still return it, but my reason is different.
The previous one had terrible coil whine, and the new one (came today) also has terrible coil whine - exactly the same as the previous one. How the hell do these cards pass QC? I'm seriously mad, as I need to return probably the best card I've had.
In the past, on exactly the same rig, I had a used 6800 XT, from XFX Merc too, and it wasn't whining; also a 6800 XT from PowerColor with small coil whine, and an RTX 3080 TUF with no coil whine. WTF is going on with $1200 hardware?
The whole battle between the companies definitely has me interested to see future releases. I really appreciate your honest and unbiased videos on your hardware as I will be shopping for a new build around July and am really after gear that offers good price to performance and have no allegiance to any company.
Did you test how the driver-side upscaling features (Radeon Super Resolution and Nvidia Image Scaling) compare in terms of picture quality? For games like Elden Ring that don't support any in-game upscaling, these seem to be the best solution if you can't run 4K native.
I guess I should feel pretty good about my 6900 XT I picked up for $999 a year ago then. For the price, especially in today's market, it's looking like it was the right call. Nvidia cards were never in stock when I looked anyways.
Same, I was able to pick up the 6900xt for msrp from AMD last year as well and I couldn't be happier
I paid MSRP for my 6900XT Nitro+ back in December of 2020. I had my heart set on a 6800XT, but the 6900XT came in stock and I knew if I didn't pull the trigger on it I might not get one at all. So I ended up paying more than I wanted, but I did pay MSRP for a top of the line card.
6900xt is better anyway.
Another thing about the future of gaming: it's not just GPUs but newer game engines as well. One in particular claims it will handle ray-traced lighting differently and more efficiently.
I'm of course referring to the next Unreal Engine. Can't wait to see more and to see how newer GPUs handle games.
Personally, I've made it my plan to stay with 1440p high-refresh monitors for now, just to sidestep these technical shortcomings (like Cyberpunk's optimization).
This will ensure I get good performance at max settings at 2K. 4K is still a bit of a difficult (and expensive) hurdle for me to take, just for gaming. I can wait.
I really appreciate this video. I’ve been debating between these 2.
Very happy to be one of the 270k people who started watching your channel from the 6800 XT video. Keep up the good videos :)
For me personally there doesn't exist a GPU that's good enough for 4K gaming. I bought the 6800 XT for 2K gaming which it does very well. I feel next gen top end is good enough for 4K but I don't want the power consumption. So once again for me 4K gaming is 2 generations away when both AMD and Nvidia will have products in the range of 325-350W, something like an 8800 XT or 5080 where from both companies RT performance will be worked out to where the performance drop will be minimal, unless a game has too much demand on RT cores. On top of that they should be capable of true HDMI 2.1 (4K HDR at 120 fps). THAT is when it's worth it to me to game on a 65" OLED and enjoy the visuals along with stutter free gaming.
@@leeloodog First, I don't mean to be critical of you personally, so don't be offended by this. I have NO IDEA what software engineers learn. I got about 2/3 of the way through Software Engineering and there was never an emphasis on multithreading, which I find criminal knowing what I do now.
So when I hear "Software Engineer", I put no weight on that. I should be able to but my education says otherwise, and I went to UTD which at the time was supposed to be good for computer science and software engineering.
This is not a reflection on any particular software engineer. I just know the variability in what that means can be great. Today I see some software companies have done a great job in developing software which utilizes many cores and others, Autodesk, Adobe looking at you, suck.
For a long time software engineers did indeed count on processing power on a single core to keep improving via faster clocks and better IPC and Intel was backing this message. And then you had the stagnation in the 2010s.
I understand in the world of graphics compute it's very different and I have zero knowledge in this area. However I'm just as skeptical in that realm as I am in CPU compute. I think you're seeing a bigger push in software solutions because of the realization that a GPU can't consume 1000W to put it bluntly, meaning that in both worlds of CPU AND GPU compute there WAS a reliance on brute force.
At some point the companies with better software solutions balanced with effective hardware solutions when they sell a hardware product will win out. I feel the same about software companies. At some point Adobe and Autodesk will be replaced if they can't get an 8 core CPU to give smooth and fast operation in ALL operations.
To be fair to Nvidia, using an AI assisted solution for DLSS means they started working on the solution quite a few years ago so I think I have to give them the most credit. AMD is playing catch up in EVERY regard to Nvidia other than chiplet technology and part of that came from TSMC.
I hope that answers your question.
I agree there are no 4k cards yet
Honestly, for me 4K gaming is just weird at a time when 1080p is still the standard. Why is it taking so long for 2K to become the new 1080p?
LOL, if you have been around and used the tech for as many years as I have, 4K at ultra settings with framerate lows in the 60s is TRUE 4K, as far as the whole history from where it started to where it is now. By the way, HDR is a PANEL technology, and it's only true HDR if the panel can hit 1000 nits of brightness or more. 4K at 60 became a reality with the 3090, 3090 Ti, 6900 XT, and finally the 6950 XT, as far as pure rasterization power. To say that 4K at 120 is the only true 4K gaming, that to me is someone who is uneducated in this area, reading labels and buying into console marketing, especially when talking about HDMI 2.1 and HDR, LMAO!
@@octapusxft this lol
Very informative video! I'm considering switching from a 2060 to a 6600 XT. I haven't been happy with Nvidia's marketing tactics the past couple of years. That, and team red uses less power, and I like AMD's overclocking/undervolting much better.
That's a good card; it's the minimum I would go, but the 6600 XT is able to handle everything at 1080p and some 1440p.
@@SwingArmCity Ended up getting the 6650 XT because it ended up being cheaper and I couldn't be happier.
@@thymelord8353 that's a great card too. Enjoy!
Hello there, another nice, detailed video. 18:02 I had the exact same problem here as well. My microphone makes an unbearably noticeable noise when recording with the Nvidia software. Not only do you not face that problem with AMD's, you can also boost your microphone volume with less risk of trading away microphone quality, and you get more options, like many more bitrate choices, and I love the quality of the recorded videos. Not to mention that the Nvidia software sometimes doesn't respond to hotkeys, for example when taking a screenshot; it happens a lot and is just so annoying. There were huge benefits for me in using the Radeon software for replay saving, because it starts with Windows and you don't need to worry about forgetting to open OBS and hitting "Start Replay" before you start playing. I know you can just put OBS in the startup apps list, but the app may crash (only once in a while, though) and you might forget to hit the "Replay Saving" button. I used to use the AMD software for recording with no complaints, but after getting an Nvidia card I had to switch to OBS!
I think you went a bit too far with the video title, but good video; love to see that you're gaining more subscribers! Keep going, I love these videos, especially the $1000 builds.
Nice objective comparison. Thanks a lot for such good content. I had the chance to trade my 3070 for a 6800 XT almost a year ago. The guy was an Nvidia fanboy and I wanted the best bang for the buck. I'm so happy with the 6800 XT. My 165Hz 1440p monitor is often maxed out. Everything looks gorgeous and buttery smooth. I understand that some people only want Nvidia, but for now I'll stay with team red.
Lol what a deal, you scammed the guy xD
@@angrydlc 😁 well, I think that he was also happy with the trade. He wanted an nVidia card. He has one now 😂
@@angrydlc Well... as I see it, the "guy" scammed himself ;) Personally I own an RTX 3060, an RX 6800 XT, and an RTX 3070 (laptop). I do care about ray tracing and use it whenever I can... Comparing the RX 6800 XT to the RTX 3070 in my laptop, I get higher framerates on the RX 6800 XT with FSR at ultra quality than I do on the RTX 3070 (laptop) running DLSS balanced. DLSS, though, is far superior to FSR at preserving fine detail in hair, fern leaves, and things like that, but when it comes to more "blocky" things like buildings, cars, and such, I think FSR is better in some scenes (but from a higher upscaling res). Another thing with the laptop is that I get some heavy stuttering when going from the map to gameplay and back in Cyberpunk. Due to less VRAM, perhaps? I know it's not exactly apples to apples, but it's the only kind of comparison I can make.
I think during the RTX 3080 launch even the scalpers got scalped, and you had like 1.5 seconds to click and buy one :D
Happy to hear about the channel growth!
Great info, thanks!
Cool comparison. I've always leaned NVIDIA, but it doesn't seem like you can go wrong with either.
I found DLSS and DLDSR to be of much more use on my 3080 10GB than RT, to be honest. But for me, 90fps average and 75 minimum is the target in many AAA games.
Very informative. Thank you.
The mic problem sounds like a sample rate mismatch. I had that issue with Shadowplay too. If you go to Windows settings and set the sample rate of your mic input to what the Shadowplay videos use (48k?) it should sound fine again.
Also, can't ReLive use H.265? :) I heard somewhere that ReLive videos take up LESS space at similar quality because of that, but I don't know a lot about it.
Future-proofing with Nvidia and DLSS looks kind of sad, since Nvidia wants to limit the new version to their new GPUs. That's a bold and bad move from them, like what Apple is doing with the new M1 iPads. On the other hand, FSR updates come to older-gen cards. Something to keep in mind! Great video! Thanks for all the thoughts and insights.
I moved from a 3080 Ti to a 6900 XT because I just hated the huge 350W power draw of the 3080 Ti. I like DLSS and RT, but I never really used them in games, especially RT. CP2077 was the only game I cared about, and once I was done with it there was nothing else to keep the 3080 Ti for.
BRUH you downgraded for 50 watts.
@@100500daniel bruh 50 wat is hugee
Facepalm
@@100500daniel To be fair to him, though, the power spikes from the 3090 make it about 100 watts more than the 6900 XT (comparing the highest power spikes of both, as per Linus Tech Tips testing), and the raw raster performance is very close between the two, better at 1080p and 1440p for the 6900 XT, so it's a sideways move, not a downgrade.
It's not only 50 watts; it's most likely 100 watts with proper tuning.
Very informative as always, Daniel. I am a long-time follower, and this time I have a question: how much performance difference do you think there is between 6800 XT variants? I bought a Gigabyte Gaming OC; that was the only one available. Should I regret it? It never crashes, and I can get around 6% extra performance by UV/OC. Best regards.
I'm not a big channel with all the different models to compare. But my overall opinion is that the various AIB models only make single-digit percentage differences, so it doesn't really matter unless you prefer the look of one, or one has particularly noisy fans or something.
Best video to date, man. I have a 3080 10GB and I really wanted a 3080 Ti, but it's 2k for one of those. I was almost tempted to get a 6900 XT, as they're always in stock. I am happy with my EVGA FTW 10GB GPU for now and might upgrade to a 3080 Ti or 3090, maybe a 6900 XT, once prices go down. I'd like to see you compare GPUs with different CPUs too, to see if Intel runs best with an AMD card or an Nvidia card.
For the majority, you won't notice the difference unless you do AutoCAD, etc.
RT is just eye candy right now. The hardware isn't ready. Once it doesn't hamper the FPS, then it will be ready, especially at higher resolutions.
Awesome summary, Daniel! Honestly, I'm an old-timer who has enjoyed 3D acceleration since its inception, and I feel like progress is actually quite slow today. Generally, as long as you are weighing the same class of cards, you really don't need to fight over which is better. Price is generally the only thing that matters, as the "general gameplay experience" is about the same. I bought a 6600 XT because the 3060 and 3060 Ti were 250-400EUR more expensive, and looking at performance levels today, I can say I am extremely satisfied that I went with AMD... The only game I want to play with RT today is Metro Exodus Enhanced Edition, and it runs brilliantly with "Normal RT" on the 6600 XT (65-90fps at 1440p in the Volga mission with 0.8x shading rate, 2x VRS). DLSS is awesome technology, but right now I wouldn't pay even 50EUR extra for it, because there aren't any games I play where I would turn it on. Cyberpunk I locked to 50fps due to a CPU bottleneck with the Ryzen 2700 and high "crowd density", and instead I use dynamic resolution, which hovers around 1500p-1800p FSR on the 6600 XT and is then supersampled to 1440p. Image quality is absolutely brilliant. And like Daniel says, RT isn't really feasible today without a massive hit to image quality.
Also, I wouldn't bet that DLSS will be around forever either. It might be, but there is a very high risk that more open standards will replace it in the future (which has happened so many times with proprietary Nvidia tech). So people need to ask themselves, "do I need it right now, in the currently promised games?" Because that's where the value is, not in some unforeseen future. It's great tech, but it might well be superseded by Intel's XeSS or something else.
Made the same choice and I am really happy too, considering I only play at 1080p. I guess the "Nvidia good / AMD bad" thing is really more of a general perception among people who have no real clue what they are talking about, especially in the midrange section of the market, where since the R9 280 at least you've had it way better on AMD for performance/cost and arguably longevity.
@@oimazzo2537 Yeah. And we really need to stop with brand favoritism, because in the end they don't care about consumers... they care about their shareholders and need to do everything they can to maximize profits... by law. So they aren't friends of consumers, especially not when they are "on top". Going with underdog brands is usually (but not always) where the value is at. Look at AMD and their Ryzen lineup: they completely shook the market in the first couple of Ryzen generations, until the brand reached market recognition and became a household name in the industry. After that they jacked up prices, while Intel (now the underdog) had to lower their prices and emphasize "value" to win market share back.
But yeah, people will continue with their fanboyism, and I don't really know what kind of primitive drive is behind it all. xD
And by the way, the 6600 XT at 1080p is insane hahaha. In many games I can't even max out the GPU, since my Ryzen 2700 often becomes a bottleneck at 1080p (depending on the game of course, but at least in the large-scale, CPU-intensive AAA titles).
The 3060 Ti is faster than the 6600 XT by a pretty high margin, and with DLSS on it's not even close; that's why it costs more. And as far as DLSS going away, that's not going to happen anytime soon. More and more games are using it, actually all the new ones.
It won't last long. It's like the FreeSync/G-Sync situation. The tech is nice, and not only the GPU vendors want it as a standard. Eventually it will be a normal thing to have on any platform.
@@anthonylong5870 well, I do believe DLSS will become more or less obsolete with FSR 2.0. It's an R&D sales pitch by Nvidia; they're known to do this. Same with PhysX, tessellation, G-Sync. But eventually the market will just opt for open standards. No developer is interested in proprietary tech, especially since the consoles (Steam Deck included) won't use DLSS.
I actually have no games in my library which I can't run satisfactorily at 1440p (heck, even Cyberpunk I run supersampled at ~1600p). For me personally, the 3060 Ti would've been a waste of money. I'd much rather save that extra money and invest it in a future GPU a couple of generations down the line.
Good content! I can see why your channel is growing. :)
Very useful info, thanks
I don't think either of these cards is a bad purchase. I got a 6800 XT for 650 and I don't regret it at all; it's a 1440p monster (the resolution I personally play everything at). And I feel like it even ray traces well enough, though even if I had the 3080, ray tracing is too expensive for me to consider personally. Follow your heart and get the card you feel most comfortable with. There really isn't a wrong answer here.
I've been using AMD GPUs for the last 8 years and it's been a solid experience. I love the software and how all the functionality is built into one app. They're cheaper than Nvidia and generally have similar if not better pure rasterization performance.
Of course people complain about any detail they can find, when there are no games out there that need more than these good, powerful cards... AMD, of course; it isn't worth it to buy Nvidia. For what, the software?
Thanks for the info Daniel.
Did you use DDU (Display Driver Uninstaller) on the AMD driver when you switched to Nvidia?
So, a couple of other folks and I were using the exact same AMD GPU (ASUS TUF Gaming Radeon RX 6800 XT), and a few months in we started having the black screen issues you were talking about. It started as one black screen per week or even rarer, but quickly developed into a much bigger issue: now we are not able to play any AAA titles at all without a black screen every 10 minutes. This is really bad.
Do you have any ideas on how to fix this?
What monitor cord are you using
@@mervin3x DisplayPort, why?
First, to troubleshoot, make sure Enhanced Sync is disabled. Second, are you in a carpeted room? If so, make sure your PC is well grounded and that all of your DisplayPort cables are certified DisplayPort. Only certified DisplayPort cables have the recommended amount of shielding, which can help prevent black screens due to static buildup.
Check if Asus made an updated BIOS for that card. I had that issue with an MSI 6800 XT after a few months and a BIOS update solved it.
@@treefiddy727 no, they didn't.
Well, they do update drivers every now and then, but this problem still stays
Went from an MSI RTX 2060 Gaming X to an MSI RX 6900 XT Gaming Z.
I paid a bit over $1300 for it. But that was from Newegg when they first came out. Still feels like a lot, but is worth it if you have the money.
Same, I paid about €1280 myself for my ASUS 6800 XT and I'm happy with it, ngl.
@@BruceCarbonLakeriver paying 800 for a red devil 6800xt
@@corpmobb4134 Paid 800 for a 6900 XT Phantom Gaming.
Which FPS program is he using, and if it is FPS Monitor, how does he get the units for the various metrics to be superscript instead of subscript? (such as at 16:46)
The "...I couldn't tell which game I'm using..." part was funny, but it got even funnier when the video reached the Forza Horizon footage and the on-screen text said Horizon Zero Dawn.
Having a great graphics card is so amazing you lose your grip on reality. :D
Thanks for also showing what the cards can do with pros and cons for all the enhancements from both sides (DLSS, FSR, ...)!
That GeForce Experience is what makes me averse to Nvidia. Why do I need to log in to fully use the hardware I purchased? I'm honestly shocked that the EU hasn't fined them for this.
Edit to add:
I've had/have AMD cards, but I'm currently using a GTX 1070 without GeForce Experience.
Why, you ask? Because not only does Nvidia want to sell you overpriced cards, they also want to make extra cash by selling your personal data to the highest bidder. Leather jackets don't pay for themselves.
Why would you even want that bloatware? Just install the drivers and use the control panel thing (the one that looks like it's from 1995 XD).
I played CP2077 with both the RTX 2070 Super and my new EVGA FTW3 Ultra RTX 3080 Ti and found that the ray tracing was way better with the 3080 Ti, but without looking at benchmark numbers in real time, I doubt I'd notice much of a difference at 1440p. Now, I don't own a 4K monitor other than an OLED TV that I'm way too scared to game on due to the high chance of burn-in. Besides, the good high-refresh 4K 144Hz+, 1ms-response, IPS monitors with 500-800+ nits HDR and G-Sync Ultimate certification are going to cost almost as much if not more than my 3080 Ti did. So IMHO 1440p high refresh rate is where it's at right now, because sadly we're just not ready for full-fledged 4K high-refresh gaming quite yet :/
I am a simple guy: when Daniel releases a new video, I watch and I like, lol. Like your presentation style, keep it up!
I have used both AMD & Nvidia for gaming. I totally agree with you! Thank you for an honest assessment.
You have the 12GB one, right? I would consider the 12GB one a 3080 Super and a 6900 XT competitor (the 6900 XT is about 1300 to 1400 on Newegg now).
Also, the AMD 7000 series is looking like a beast. It's using MCM, so it should have 2-2.5 times the performance of the 6900 XT. Nvidia is trying to match it with higher power (some rumors suggest 800W, though I think that's too much; 500 watts is what I would expect). Oof, can't wait for the new gen.
Also, I think at 4K, VRAM will be more of a longevity concern than DLSS. Like the RX 580 vs the 1060.
Anyways, a great video as always.
In price, yes, but it's literally the same card as the 10GB version; frames are about the same.
I just want to know why 6800xt prices don't go down
@@whereswaldo9089 no it has more cuda cores
@@user-db4lk7yg3o That doesn't mean it gets better performance in games. On paper it looks good, but in real-world tests it doesn't mean much.
czcams.com/video/oBDFtJ4lR2c/video.html
Traded my FHR 3080 10GB for a 6900 XT last week and I do not regret anything. I play at 4K, and in all the games I play and tested I get more FPS than with the 3080 at the same settings: Elden Ring, Forza Horizon 5, Assassin's Creed Valhalla...
3080 is comparable to 6800 tho, so of course 6900xt is gonna be in a league of its own
@@Dekkerlion Yeah, the standard 6800 is only as fast as the 3080 in some specific games, but the 6900 XT is on another level. The 6800 XT is what's completely comparable to the 3080, other than ray tracing... the 6800 XT even beats the 3080 in some games.
Man, I'm in a situation where I have to choose between the RTX 3080 12GB and the RX 6900 XT, and both are around the same price. I would be playing at 3440×1440, and from what I know, the higher the resolution, the more of an advantage Nvidia has, so it isn't even a matter of price for me. Should I go with DLSS and ray tracing, or 4GB more VRAM? It's a very hard decision.
@@callmeveler5695 The 3080 is more stable and has DLSS and better ray tracing. The 6900 XT is a bit stronger but lacks DLSS and ray-tracing performance and is less stable in some games (stuttering). I traded my 3080 for a 6900 XT like I said, but traded the 6900 XT back for a 3080, lol.
@@gunzorkgaming7847 Oh damn, that's what attracted me to Nvidia: DLSS and ray tracing. But what concerns me is VRAM usage; I've seen some games use almost 12 GB of VRAM, which wouldn't be an issue for the RX 6900 XT with its 16 GB. Did you ever have any problems with VRAM playing AAA games?
Have you played around with scaling in the Nvidia driver control panel settings ?
If you change the thermal pads (and void your warranty, but the warranty doesn't matter since you have a Gigabyte card) you can gain a few FPS: even with the 3080's core at 67°C, the memory was probably over 100°C and transferring a little heat to the core. Also, buying a 12GB card is much better than the 10GB version, especially for 4K gaming. Both are good cards!
Great video btw!!
The 6800 XT was the bargain of the century at $650, performing between the 12GB 3080 and the 3080 Ti. Too bad no one got them for that price. I got one for $1000 that I used for a while, but now I'm using a $430 6600 XT. I also had a 3070 I got for $900, but DLSS is just distracting; I never used it, and the performance isn't anywhere near the 6800 XT... my 6600 XT is better in some games. I just can't stomach the depreciation these are going to see as mining eases. But I cannot wait for all these to flood the used market!
So often in these benchmarks, the 3080 12GB is pulling about 100 watts more than the 6800 XT. Was it worth it? That's the price you pay in cost and power consumption for identical raster performance, plus ray-tracing performance you will barely use, if at all, because it's not a ready technology.
I’ll be going AMD in the 7000 series, assuming RT performance is on par (which it’s expected to be), due to the much lower power draw. However, you sound like an AMD fanboy trying to downplay ray tracing. It’s really awesome stuff, and tons of games support it. Games like Control and Cyberpunk (finally playing it now that it’s gotten enough patches) have amazing RT visuals. You also get DLDSR on Nvidia, which, if you have a beefy GPU, is the best way to play older games. I played the Bioshock series super-sampled down from 8K using DLDSR and it was incredible.
Came from a 5700XT which was amazing price-performance when I got it a few years ago, to a 3080 now. Also, regular DLSS is awesome as well. If I didn’t get a 3080 in the launch window (paid 1200), I would have gone with the 6800 XT no doubt, which is a BEAST.
Let’s be fair to these cards and their features though. I’m super excited for the 7000 series to win in every single category, as it really could with their efficiency advantage.
If you spent $1400+ on a GPU, I don't think the $7 per month in electricity to run it 24/7 is any sort of consideration.
Also the GPU could be used for mining and will make around $80-90 a month after electrical costs, possibly over $110-120 if you dual mine. So... Yeah, probably worth it.
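For what it's worth, the ~$7/month figure roughly checks out. Here's a minimal sketch of the arithmetic, assuming ~100 W of extra draw (the delta mentioned earlier in the thread) and a $0.10/kWh electricity rate — both numbers are assumptions, not from the comments:

```python
# Sketch: monthly electricity cost of extra GPU power draw.
# Assumed inputs (not from the thread): 100 W extra, $0.10 per kWh.
def monthly_cost_usd(extra_watts, usd_per_kwh, hours=24 * 30):
    kwh = extra_watts / 1000 * hours  # energy used over ~one month
    return kwh * usd_per_kwh

print(f"${monthly_cost_usd(100, 0.10):.2f}/month")  # 72 kWh -> $7.20
```

At a higher rate like Europe's ~$0.30/kWh, the same 100 W delta is closer to $22/month, so the "doesn't matter" argument is region-dependent.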
I use RT all the time if DLSS is an option, which is supported by tonnes of games now. DLSS alone is a big bonus over AMD.
BTW ampere cards undervolt really well, so you can easily lower power draw with a 2 minute tweak in afterburner.
@Andre Wee I wouldn't trust the onboard readings too much when it comes to AMD's cards. We know from reviewers using the correct tools that AMD is pulling about the same as Nvidia (300W for the 6800 XT and 320W for the 3080 FE, just as an example), with AMD having the bigger power spikes: around 600W for the 6900 XT vs 400-450W for the 3090.
@@wixxzblu I just checked results from Igorslab. Afaik he does actual power measurements (first is 10-20ms, second is
What is the name of that frame rate counter overlay software that we can see in the video?
Thank you for your comparison between the RTX 3080 12GB and the 6800 XT from various perspectives, different from the usual game benchmark videos.
Man I didn't know Aloy could drive a race car that well!
Did I mess up a caption somewhere? lol
@@danielowentech Yeah, at 5:58 the title of the game in the video is "Horizon Zero Dawn", lol 😂
@@danielowentech Yeah the previous poster pointed it out. It was funny, but I can see why it happened. It's all good, just having a bit of fun! :)
The whole video summed up:
Nvidia advantages
Slightly better at 4K
DLSS
NVENC encoder
Better software integration for productivity
Better RT
Better brand and mindshare
AMD advantages
Slightly lower power consumption
6GB/4GB more VRAM
Better control panel, and no login required
Better SAM support
Better voice quality
Linux open-source drivers
Cheaper in today's market (key difference IMO)
Yes, the 3050 is equivalent to the 6600 in price here, and the 3060 is equal to the 6600 XT price-wise. It's obvious RTX cards are overpriced.
I would disagree that nvidia has a better control panel, not including geforce experience.
What does the whole "better productivity" for NV cards mean?
@@philmehrart A lot more editing software works with Nvidia cards, or works better, since Nvidia is a bigger company than AMD.
@@philmehrart Take Blender, for example: it has an OptiX backend that speeds up rendering and uses the RT cores, though only on Nvidia.
Another example would be better performance in SolidWorks and even Puget Bench.
I suggest watching the RX 6800 XT review from LTT; you'll understand once you see the graphs.
Are you using curve optimizer and PBO on the 5950X?
Just copped a 6600 for $460 thanks to your videos ! I'm upgrading from an RX 570. Would SAM work with my 3300x and B450 Mortar Max? And would this build be good for 1080p gaming?
I've owned a 3090, 3080, and 3070, but always wanted to try a 6800 XT. I finally got an MSI Gaming X 6800 XT and it was not what I expected. The card had a composite back plate and junction temps of 100°C plus. It drove all my temps up, and the only way to control it was to increase fan speeds, which was just too noisy, so I got rid of it. I almost gave up, but was able to try a reference AMD 6800 XT and was impressed. Using a custom fan curve and a mild OC, the fans are barely audible with junction temps only at 75°C, performance is good, and the card isn't huge like all the others. Ray tracing is still a gimmick, and DLSS is not necessary for the games I play, so that was never a factor for me. I'm actually not a fan of AMD's control panel and prefer the simple Nvidia panel. I don't install GeForce Experience either; I do my own game-setting tweaks. I do like the fact that I can run overlays without crashing CoD, unlike on Nvidia...
Yeah, IMO you shouldn't get MSI, Asus, Gigabyte, etc. — the brands that also make Nvidia cards — because they usually cut corners or cost a lot more. Brands like PowerColor, Sapphire, and so on are way better.
Yeah, I got stupid lucky: I got the XFX Merc 319 6800 XT, traded a 3060 Ti plus cash for it, and haven't looked back. Such a good card: excellent cooling, really good styling, overall a monster at 1440p/4K.
@@opbush5272 Exactly, the GPU fans on my Sapphire Nitro + are quieter than my Noctua system fans.
Always had very nice experiences with AMD on Sapphire cards. The only sort of disappointment was a model from Gigabyte, so I guess sticking to AMD-exclusive OEMs might make sense, since for other companies AMD cards can sometimes be kind of an afterthought.
That might be true for newer cards, but AMD's official software won't support older chips anymore, like my Ryzen 3000 chip released in 2019... a fairly recent chip, and I can't use AMD's software anymore. I'm left unable to tweak fan speeds, perform undervolting, etc. Crappy customer treatment.
I feel like part of the issue with 4k is that it is just so taxing even for high end gpus and every time it feels like gpus that can finally handle it come out, games also become more taxing and we're back to square one. Tbh, for heavy RT titles I've just been knocking the res down to 1440p with DLSS. I'd generally rather take the minor hit to resolution and be able to have more RT features active with smoother fps and I play on a 55 inch QLED. With a bit of sharpening thrown in the mix too, I think you can get a pretty clean image from 1440p.
Saying 4K is too taxing in this situation is like blaming the water for getting electrocuted in it with a toaster. It's the RT that's too taxing.
@@xSkittlesxNewbx Horrible analogy... He's right, nerd: 4K is definitely way too taxing. 1440p with some image scaling is all you need; it barely looks any different from 4K, and you don't have to sacrifice visual quality or FPS. Not to mention, ray tracing is actually usable at 1440p. I'm not knocking 4K, but it's unnecessary, and trading visual quality for FPS (and vice versa) just to enjoy a game isn't worth it, especially for the price you paid for the GPU.
@@jonesaundre3 I mean, when you can play games in 4K without RT, but can't play games in 4K with RT on, is 4K the issue, or is RT the issue?
Let's make it simpler: the RT part is down to perspective. I'm happy with 4K gaming on my 1080 Ti without RT and have no issues. So for me, RT is as pointless as 4K is for you.
And don't get me wrong, I understand: having a few extra rays of light and more realistic scenery is nice. But when the massive performance drop of RT can mean the difference between running higher textures or having more FPS, even on a 4K screen, I'll pick RT off.
And I know DLSS got a bit better, but again, you're sacrificing fluidity to keep those rays up, whether through losing some texture quality here and there, micro stutters, etc., depending on how heavily you run DLSS. Eventually that makes it counterproductive, sacrificing overall performance and visuals for a few rays.
So again, in my eyes RT is the issue, as it's more taxing on your GPU than 4K, and using the tools provided to mitigate the performance loss eventually becomes counterproductive.
This is quite an exciting honest opinion. Love to see it. Very informative.
Great video, thanks.
Currently, I'm using an RX 5700 and I'm very satisfied with it. But if there's anything that will push me toward Nvidia in the future, it's gotta be DLSS. DLSS is the secret weapon Nvidia has, which even Intel has realized; hence they made XeSS even before releasing a proper gaming dGPU to the market. If AMD doesn't come out with a proper DLSS competitor in the coming years, my next GPU will definitely be an Nvidia/Intel one.
Nah, DLSS is only for RT.
@@chriswright8074 And I'll simply turn RT off and use DLSS only. In fact, most RTX GPU owners do exactly that.
That's true, but I guess that if XeSS is good enough, general-purpose AI upscaling might end up being just another non-proprietary industry standard.
I think the video was pretty good; it's just the Cyberpunk comparison that I felt missed some points.
- Why run everything at Ultra? You could probably get 4K native 60fps by turning down things like screen-space reflections.
- Why run Ultra ray tracing as well!? Even a 3090 would struggle.
- Maybe test with saner settings instead of lowering the resolution (DLSS Performance)?
Other things that I would like to share and see:
- A comparison with Metro Exodus Enhanced Edition. The best implementation of RT as of now.
- Reducing the resolution also reduces the number of rays in ray tracing. Yeah! Crazy.
I get your point on Cyberpunk. What I was trying to do is respond to the marketing idea that Nvidia cards can do RT 4K Ultra and AMD can't. In reality, neither one can. Actually playing on Nvidia, I'd go High with RT reflections and DLSS Quality; on AMD, I'd go High with FSR Ultra Quality, no RT. The framerates and image quality are similar at 4K, with a small advantage to Nvidia for the RT reflections.
@Daniel Owen Oh, I get it now! Yeah, the comparison makes 100% sense then.
I didn't get it at first. Great video in general!
@@danielowentech At the price tags on the market today, you would expect both of them to play anything at 4K 60... yet since the 1080 we are still struggling :) However, the games also look way better.
Awesome video, as always!
This is a topic which has plagued me for a while, and I've watched so many benchmarks and reviews I've lost count. I now have an XFX 6900 XT and also a 3080 Suprim X 10GB. Which of these cards would you say is the better one to keep? The 6900 XT has more VRAM and a faster core clock, but the 3080 has faster memory, a wider bus, and better ray tracing. I only play at 1440p, so I have no requirement for 4K. Any advice or comments would be very welcome, TIA.
The only AMD driver issues I’ve experienced with a 6800 were with the Halo Infinite beta and with Icarus early access.
Strange, since it's supposed to be the preferred system for Halo/Xbox 🥲
@@exeverrr Xbox Drivers are not AMD drivers. They're BASH low level drivers. Completely different, just same architecture.
The 3080 will age like shit just wait
Facts. People are saying they're getting 30 series when the next-gen cards drop; I'm just reading and laughing about it, because obviously the 6900 XT and 6950 XT are a far better choice than any 30 series in price and performance.
Are you using FreeSync or G-Sync at all? I'm using an R9 380, and FreeSync makes those sub-60fps, mid-50fps games feel really nice.
Has DLSS 3 changed the situation now? I might go for a 3080 Ti for that, but I don't know if it's still worth it in this case.
In general desktop use, I like AMD cards because of their software (I know it's weird). It has a feature that's great for night-owl users.
AMD lets you set separate display profiles for the applications/games you use: I can quickly change brightness, contrast, white point, and color saturation just by having a specific app/game open, then have the colors reset to default once I close it.
I personally use this feature to shift from my "night mode" to my "day mode" settings just by opening my VLC player and keeping it in the background, and my color saturation gets upped a bit when I'm playing a match of League.
I'm switching to AMD only because of the Linux drivers. While going from a 2080S to a 6700 XT is a side grade, or a downgrade in some areas, it's well worth it to support a company that supports open-source communities. Nvidia has been ignoring and providing lackluster support for critical new Linux technologies like Wayland and Vulkan. (The proprietary Nvidia driver situation on Linux is truly dire; Nvidia is literally holding back years of progress.) I want the card that works with the most operating systems. I've also been annoyed with Nvidia in general over the past few years. Their awful GeForce Experience software and Nvidia Control Panel have hardly changed, despite me expecting a refresh at some point. AMD has continued to improve their all-in-one software, as mentioned in one of your previous videos. Nvidia's anti-consumer pricing and practices have completely frustrated me. (They purposely lower supply to keep prices high; see Moore's Law Is Dead.) Or just notice the various controversies with Gamers Nexus, Linus Tech Tips, and Hardware Unboxed. They regularly come across as tone-deaf and slimy, and throw around proprietary technology like DLSS and RTX to keep people locked in, while AMD invents open standards like FSR 1.0 and FSR 2.0 (a true DLSS competitor). Well, guess what: I've hardly used either RTX or DLSS on my 2080S, because it wrecks visual quality and performance for some barely noticeable rendering gimmick. I'm done with Nvidia. It's time I stop sacrificing versatility. Oh, and by the way, 6700 XT prices have plummeted to $600 while the 3070 sticks sky-high near $1000. I saw my path out from Nvidia and I'm taking it. 🖕Nvidia.
Thanks for this FAIR comparison by the way. Nearly everyone dismisses AMD cards when performance really isn't that different.
Bought a ROG 1080 Ti at launch, and it took almost 2 years before it was a relatively fully functional card in Linux. On top of that, I bought it specifically to be a dedicated virtual-machine gaming card, only to find out that Nvidia artificially killed that. It has left me wanting to kick this pile of trash into a lake.
6950 XT cards are running 800 bucks or less. The next-gen cards may be pretty pricey, since leaked info shows they will likely be ahead of the 4090, and without the firewire connector.
What is your CPU? I'm getting better results with my 3080 10GB with a +150 overclock on memory and core; yours was at stock, though. I'm going to test it. I have a 16-core though, a 12900K.
Thanks for the review! I just picked up a 6800 XT at MSRP (from a gaming convention) and was wondering if I made the right choice, rather than waiting for an opportunity to grab a 3080 at a good price. Glad to know it wouldn't be that big of a difference.
I love my 6800xt
The question is, did you enjoy your AMD card when you did not have a 3080? If so, it was not a mistake. I've been tweeting about your channel, not that people read anything I write. I love that you are giving honest opinions.
Agreed, I feckin love the 6800 XT. Way more affordable than the NV cards, and it's solid for everything I want to do: 3D, gaming, etc. Just no RT :)
Well put video. Thank you
I'm loving this video on this topic. I have a 6700 XT; I didn't care if I got it or the 3070 Ti. I wanted a white GPU, and that's what I found at Micro Center the day I went. Both cards were about the same price when I looked around, give or take about 100 bucks. I'm still very happy with my 6700 XT.
As a person who currently owns both AMD and NVidia cards I can honestly say that hardware wise they are pretty much equal when it comes to gaming, might be a different story for mining. The major difference I notice is in drivers for the two companies, I still find that NVidia drivers are better than AMD drivers although AMD has made significant improvements over the years. When it comes to 4K gaming neither company has cards that are powerful enough to have a good gaming experience compared to 1440p gaming on max settings. I believe right now 1440p is the sweet spot for gaming on max settings, but that may change in the next few years.
I went from Nvidia to AMD and now back to Nvidia, thank god. There were just too many older games that didn't play well or had weird annoying issues, and driver crashes every now and then, with the AMD 6700 XT card. I got a 3070 Ti, and not only is the performance a bit better, I have zero issues. Really glad I went back.
It's highly doubtful that it gets better performance than a 6700 XT, unless you play with DLSS all the time. Or unless you meant the 6500 XT instead.
The 3060 Ti is about 5-10% worse than a 6700 XT, and for older games you could have just used DXVK to fix the issues; it's as simple as dropping the DXVK files inside the game directory.
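For anyone curious, the DXVK "fix" really is just file placement: the translation layer's DLLs get dropped next to the game's executable, where they shadow the system Direct3D libraries. A minimal sketch below, with temp directories standing in for a real game folder and an extracted DXVK release (real DXVK builds ship DLLs such as d3d11.dll and dxgi.dll):

```shell
# Simulate "installing" DXVK by copying its DLLs into the game folder.
GAME_DIR=$(mktemp -d)   # stand-in for the game's install directory
DXVK_DIR=$(mktemp -d)   # stand-in for an extracted DXVK release
touch "$DXVK_DIR/d3d11.dll" "$DXVK_DIR/dxgi.dll"

# DXVK works by shadowing the system D3D DLLs next to the game's .exe:
cp "$DXVK_DIR"/*.dll "$GAME_DIR"/
ls "$GAME_DIR"
```

Undoing it is just as simple: delete the copied DLLs and the game falls back to the system Direct3D libraries.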
There is no such thing as an RX 6700, so I'm assuming you mean the RX 6700 XT? There's no way a 3060 Ti performs better consistently.
@@whereswaldo9089 Oops I meant 3070ti
@@promc2890 Looks like you use UserMemeMark for reference lmao
My 3070 broke, I sold the replacement and got a 6900xt - I don’t miss dlss or rtx.
Great comparison video. With prices as they are now in Europe, an RX 6900 XT (€1350) is closer to the price of an RTX 3080 (€1280)... You can get both GPUs cheaper than that by another €100, but these prices are from a website/store I trust. Now, the question is: is it better to wait for the RTX 40 series and RX 7000 series GPUs, or buy the RX 6900 XT/RTX 3080 this year? 😕
Try it with VR: Nvidia has a huge advantage, especially at 4K, and for Quest 2 wireless PCVR, Nvidia's video encoding is way better. For flatscreen gaming, there are Nvidia titles and AMD titles.
I would always take DLSS over native in Horizon Zero Dawn (at least at 1440p), it eliminates all the shimmering in grass.
A great and honest comparison, thank you.
Did the higher power consumption affect the ambient room temperature much, of course with the AC off?
AMD has their own DLSS equivalent, but it’s up to developers to implement it.
He mentions that; it's called FSR. While the results are similar, it works a lot differently, and in some cases it looks a fair bit different.
Sometimes FSR looks worse with jaggies. I think the next version of FSR will look very good.
@@theyoungjawn are they coming out in September or December
@@arkplaysgame3826 No timeline for now. In fact, nobody knows if FSR will get another version.
@@kishaloyb.7937 ok
By the way, Elden Ring just isn't optimized well. My 6700xt is doing better at 4K than some people's 3070's when that shouldn't be the case. Hope to see it improved
What kinda CPU?
@@MikeyMike87 5600x
Have you considered getting a 6900xt to compare? Especially since they are available at bestbuy for about $300 more than a 6800xt?
Woah your channel has grown a lot since the cyberpunk 2077 release days! Congrats!!