Are AMD falling too far behind NVIDIA?

  • Uploaded 13 Oct 2023
  • Today Steve and Tim discuss whether AMD are being outpaced by NVIDIA's feature set and could see a further reduction in sales.
    Join us on Patreon: / hardwareunboxed
    Join us on Floatplane: www.floatplane.com/channel/Ha...
    Buy relevant products from Amazon, Newegg and others below:
    Radeon RX 7900 XTX - geni.us/OKTo
    Radeon RX 7900 XT - geni.us/iMi32
    GeForce RTX 4090 - geni.us/puJry
    GeForce RTX 4080 - geni.us/wpg4zl
    GeForce RTX 4070 Ti - geni.us/AVijBg
    GeForce RTX 3050 - geni.us/fF9YeC
    GeForce RTX 3060 - geni.us/MQT2VG
    GeForce RTX 3060 Ti - geni.us/yqtTGn3
    GeForce RTX 3070 - geni.us/Kfso1
    GeForce RTX 3080 - geni.us/7xgj
    GeForce RTX 3090 - geni.us/R8gg
    Radeon RX 6500 XT - geni.us/dym2r
    Radeon RX 6600 - geni.us/cCrY
    Radeon RX 6600 XT - geni.us/aPMwG
    Radeon RX 6700 XT - geni.us/3b7PJub
    Radeon RX 6800 - geni.us/Ps1fpex
    Radeon RX 6800 XT - geni.us/yxrJUJm
    Radeon RX 6900 XT - geni.us/5baeGU
    Disclaimer: Any pricing information shown or mentioned in this video was accurate at the time of video production, and may have since changed
    Disclosure: As an Amazon Associate we earn from qualifying purchases. We may also earn a commission on some sales made through other store links
    FOLLOW US IN THESE PLACES FOR UPDATES
    Twitter - / hardwareunboxed
    Facebook - / hardwareunboxed
    Instagram - / hardwareunb. .
    Music By: / lakeyinspiredg. .

Comments • 377

  • @VTOLfreak
    @VTOLfreak 8 months ago +126

    I noticed AMD is introducing more driver-level features that work in all games without requiring developer support, like AFMF and RSR. If you are in the underdog position this makes sense, because you won't be able to convince as many game developers to integrate proprietary technologies that only run on AMD hardware. This might also be why they made FSR3 vendor-agnostic; otherwise no game developer would implement it. I have a lot of older games that will never be updated, so AMD makes more sense for me than Nvidia. IMHO they are not behind, they are heading in a different direction.

    • @notjustforhackers4252
      @notjustforhackers4252 8 months ago +8

      Now factor in Linux, MESA and AMD.

    • @LeMAD22
      @LeMAD22 8 months ago +1

      But then again, is there any AMD driver-level feature that isn't utter crap? I'm using Radeon Chill as an fps limiter, but there are plenty of other options. The rest (Anti-Lag, Boost, AFMF, RSR, etc.) I wouldn't touch with a ten-foot pole. Imo, aside from the price, the only reason to go with AMD is the amount of VRAM. I bought my 6900 XT last year, and even though it came with the usual AMD driver and software problems, it will probably prove the right decision in the long run vs the 3080 10GB, which was more expensive at the time, and the 3070, which was priced similarly.

    • @OneAngrehCat
      @OneAngrehCat 8 months ago +4

      You're half correct about the vendor-agnostic part.
      One of the biggest reasons people don't consider enough is that AMD is the power behind both the PS5 and the Xbox Series X.
      It is in the interest of their continued partnership that they keep compatibility with these consoles at all costs.
      HOWEVER, and that's why next year will be interesting, this requirement will weaken greatly when the PS5 Pro comes out, probably with RDNA 4 hardware.
      It will tell us a lot about both Sony's and AMD's long-term strategy.

    • @plasmahvh
      @plasmahvh 8 months ago +2

      @@notjustforhackers4252 I swear to god, people who comment this either a) never used a modern Nvidia GPU (Maxwell+) or b) never used an Nvidia GPU to begin with and are just repeating what others say. I've had way more issues on Arch with my old RX 590 and RX 5700 XT than with my 4060 and 4070 combined. The 5700 XT was unusable the first few days and buggy after that, on Wayland especially. The 4060 and 4070 have been flawless so far with Hyprland on the proprietary drivers. I don't understand what the hassle is with installing Nvidia drivers: you just install the nvidia-dkms package and it just works. Literally 0 issues on day 0, 0 issues since. Nvidia's proprietary DX11 call conversion is still superior to Mesa's, so if you play a lot of DX11 games Nvidia is going to be faster. When is your AFMF going to work under Linux? It's closed source, bud. It's the FIRST time AMD released a feature Nvidia doesn't have, and they closed its source. And then you say Nvidia is the devil for not open-sourcing DLSS. Why the fuck would they when it's better than its competition?

    • @unknown14191
      @unknown14191 8 months ago +1

      I wish AMD could become a rival to Nvidia in the academic and commercial sectors, but for now most of the world uses Nvidia GPUs for anything AI-related or anything that requires parallel compute (CUDA, for example). I wouldn't be surprised if Nvidia stripped functionality such as CUDA from their next-gen consumer GPUs and marked up their industrial cards 100x (a deliberately demoralising figure; realistically at most 5x), and people would still fork out the money to buy them.
      In basic parallel compute CUDA is still the best, thanks to the effort Nvidia has put in over the years; alternatives from AMD don't even get wide adoption.

  • @mikem2253
    @mikem2253 8 months ago +40

    Have a 7900 XT and man does it fly. Having 20GB of VRAM is a nice cherry on top.

    • @goldengamer8838
      @goldengamer8838 8 months ago

      Love having a 6800 XT with 16GB. I wish there were games with crazy high-res textures that could use 16 or 20GB.

    • @RpJ672
      @RpJ672 8 months ago +5

      Just picked up a 7900 XT too. Swapped out a 1080.

    • @nico3064
      @nico3064 8 months ago

      I loved my 6800 XT, until an update in April. Ever since, I've had horrible performance problems. Nothing could fix it. Swapped to a 4070 yesterday and all my problems are gone. It's a shame.

    • @VaginaDestroyer69
      @VaginaDestroyer69 8 months ago

      @@nico3064
      Should’ve just gotten a 7800 XT then. It shits all over the 4070, costs a bit less, and has more VRAM.

    • @VaginaDestroyer69
      @VaginaDestroyer69 8 months ago

      @@goldengamer8838
      Don’t worry, those games are coming in the future. We’re getting to the point where 12GB is becoming the minimum for anyone that wants to play at 1440p High settings.

  • @Thor_Asgard_
    @Thor_Asgard_ 8 months ago +56

    Sometimes people make it seem like you need Nvidia's shenanigan tools all the time. Truth is you don't. If you are no "special person", it makes absolutely no difference for normal gamers whether you go team green or team red. Sometimes people are hypocrites.

    • @Pand0rasAct0r_
      @Pand0rasAct0r_ 8 months ago +16

      well it does make a difference: with team red you're more likely to spend less for similar or more performance lol

    • @Aggnog
      @Aggnog 8 months ago +1

      Even for just gaming not having DLSS is a kick in the nuts. On my GPU-P VM with a 4090 I can't use DLSS and have to use FSR2 in cyberpunk, it runs and looks worse in all cases.

    • @visevise9520
      @visevise9520 8 months ago

      More perf if you don't take efficiency into account

    • @Thor_Asgard_
      @Thor_Asgard_ 8 months ago +5

      @@Aggnog Unless I am standing still, looking at a scene with a magnifying glass, I can literally see no difference. Besides that, Cyberpunk 2077 has such bad ghosting with DLSS 3 that I turned on FSR. But as you own a 4090, you most likely count among the special people, so there you go. It's ok that you have an Nvidia card, but it's not necessary. And yeah, I normally prefer AMD's price/performance too. Still, I have Nvidia this time because I got the card for cheap.

    • @Ghostlynotme445
      @Ghostlynotme445 8 months ago +8

      @@Aggnog dude, why do you worry about upscaling on a $1600 gpu? You should be fine running it at raw performance rather than with upscaling

  • @GloriousKev
    @GloriousKev 8 months ago +10

    Odd, because I do VR, video editing, playing with RT, and live streaming with AMD. Zero issues. I don't do Blender or AI. FSR and Anti-Lag are a thing on AMD. I'm not even sure this guy has ever used an AMD gpu before.

    • @perlichtman1562
      @perlichtman1562 8 months ago

      Yeah, I was thinking something similar. My 7800 XT outperforms my 4070 in most video editing tasks in both Premiere Pro and DaVinci Resolve Studio. In Topaz Video AI there are many scenarios where the 4070 wins but with 4x 4K upscaling the 7800 XT usually does.
      I’ve done lots of Blender work with the 7800 XT and 7900 XTX but the 4070 is definitely faster. Still, we are talking about a 150% Nvidia lead previous gen, before HIP RT, down to an 88% lead current gen with HIP RT. There’s a lot of room for improvement but that’s narrowing the gap a ton.

  • @shortstaccck7965
    @shortstaccck7965 8 months ago +4

    Reading the discourse around AMD vs Nvidia really has to be the most circular shit. Everyone says the same thing every time and still here we are.

    • @m0ssyk1m10
      @m0ssyk1m10 8 months ago

      agreed
      tfw reading this comment section is making me lose braincells, on god

  • @perlichtman1562
    @perlichtman1562 8 months ago +12

    Some of those areas are no longer ones where Nvidia is stronger than AMD; the question is based on a partially incorrect premise. The 7800 XT is better than even a 4090 in many Blackmagic Design Fusion tasks, for example.
    For video editing, the 7800 XT saw big improvements over the 6800 XT in several areas, including longGOP performance (going from 109 fps on the 6800 XT to 192.19 fps in the PugetBench Resolve h.264-to-h.264 test), and it outperforms the 4070 in lots of Premiere Pro 23 and DaVinci Resolve benchmarks, especially Fusion. In my testing it scores 38% higher in Fusion in PugetBench.
    The top 3 4090 PugetBench overall scores online only have Fusion scores of 650 to 681. My 7800 XT got 705 on its own and a massive 781 when paired with the 13900K's iGPU.
    In recent Topaz Video AI 3 versions it depends on the card, resolution and AI model, with the 7800 XT usually outperforming the 4070 at 4x upscaling of 4K and the 4070 often outperforming the 7800 XT at lower resolutions. It's not a slam dunk for Nvidia unless you are talking about performance at any price as opposed to performance at the same price tier.
    In regards to Blender, Nvidia still has an advantage in both performance and compatibility. But using the built-in Cycles engine, the recent HIP-RT implementation has boosted AMD performance by 11-31%. There is still a significant advantage for Nvidia, but the gap is narrowing.
    Without HIP-RT, the 3060 was around 2.5x faster than the RX 6600 in the benchmark scenes.
    With HIP-RT, the 4070 is 1.88x faster than the 7800 XT in the benchmarks most favorable to AMD from the same set.
    So between Blender software changes and AMD hardware changes, in one generation AMD narrowed the gap by 41%.
    Nvidia's frame generation is further along in driver development, VRR support etc., but AMD's implementation is supported on more hardware.
    In other words, these are examples of how AMD has been catching up, not falling further behind, in around half the areas discussed in the original question.
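The gap-narrowing arithmetic in the comment above can be sanity-checked in a few lines (the 2.5x and 1.88x speed multipliers are the commenter's own benchmark figures, not independently verified numbers):

```python
# Sanity check of the Blender gap-narrowing arithmetic in the comment above.
# The 2.5x and 1.88x speed multipliers are the commenter's own benchmark
# figures, not independently verified numbers.

def lead_over_parity(multiplier: float) -> float:
    """Nvidia's lead expressed as a fraction above parity (1.0x)."""
    return multiplier - 1.0

prev_lead = lead_over_parity(2.5)   # 3060 vs RX 6600, no HIP-RT -> 150% lead
curr_lead = lead_over_parity(1.88)  # 4070 vs 7800 XT, HIP-RT -> 88% lead

narrowed = 1.0 - curr_lead / prev_lead
print(f"previous-gen lead: {prev_lead:.0%}")  # 150%
print(f"current-gen lead:  {curr_lead:.0%}")  # 88%
print(f"gap narrowed by:   {narrowed:.0%}")   # 41%
```

Measuring the lead above parity (rather than the raw multiplier) is what makes 2.5x to 1.88x come out as a roughly 41% reduction.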

    • @DimkaTsv
      @DimkaTsv 6 months ago

      Can I also add something from the video-editing side?
      Comparing RDNA 2 to RDNA 3, there was a huge gain, plus one HW bug that the AMD AMF team is making an effort to mitigate globally, but cooperation takes time.
      RDNA 2 had a playback limit of 8k30 (based on my 6750 XT experience), and it was quite a hard limit: 8k60 couldn't be played with the HW decoder, and if you paused a high-res (say 8k30) video for 5 seconds and unpaused it, it would lag, slow down or freeze for a second or two until the decoder buffer filled, depending on the video renderer you used.
      RDNA 3 (based on my 7800 XT) can easily do 8k60 with headroom to spare, and there is no lag after pausing and resuming. Huge progress. But RDNA 3 has a HW flaw in the AV1 encoder, which will be fixed in the next VCN generation, but still... Basically, because it uses 64x16 blocks, you cannot encode 1080p properly (for some reason 1080p is one of the very few mainstream resolutions whose height is not a multiple of 16). This issue doesn't affect 1440p, 4k or other resolutions that divide evenly by 16 vertically. Originally the result was 1088p, with 8 lines of black filler at the bottom of the frame. And the issue is rooted not in the HW block size, but in the fact that the AV1 spec doesn't support cropping metadata. This leads to videos being rendered back as cropped to 1906x1080 (or 1916 if from 1082p). You can force players to play them at the proper 1920x1080 by adding a forced aspect ratio to the container data (16:9; for .mkv, use MKVToolNix) without any need for re-encoding.
      To mitigate the severity of the issue, the AMF team added a software trick: a compatibility flag that forces 1080p AV1 to encode as 1082p instead of 1088p, making the issue significantly less noticeable (unless you specify 1088p, like the ReLive team for some reason did). Again, you can still add aspect-ratio data to the container and most players will follow it, including YouTube. Currently they are working with the ffmpeg team to merge a cropping feature that will detect and crop these added lines for AV1 video on playback, after which, they said, they will cooperate with ISVs to make the fix widely accepted.
      I know AMD drivers aren't perfect. I personally submitted a few AMF bugs to AMD teams and most of them were fixed (usually within 2 months from a quick GitHub commit to a silent addition in a new driver). I also submitted a DX9 render bug which also got silently fixed within 2 months. So the AMD team definitely doesn't slack.
      RDNA 3 also fixed idle compatibility with my monitor, which uses non-standard timings (causing max VRAM frequency and high power consumption at idle on RDNA 2, or a 0 MHz VRAM stuck state if FreeSync was enabled as a SW attempt to fix the HW issue), so I no longer need a custom resolution (with a monitor timing fix and FreeSync range compatibility for the new timings) at all. Also great progress.
      Overall my experience with the 7800 XT so far is extremely smooth. Even with the 6750 XT I didn't have many issues that weren't caused by me (granted, using CRU was forced on me), but the 7800 XT has been an even better experience so far.
      Granted, AMD's MPO implementation is still in need of work. Because they link the cursor layer to the top layer, the text-input cursor becomes white on a white background in Microsoft Word for some reason. It can be worked around by disabling MPO with one registry edit, though.
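The block-alignment arithmetic behind the 1080p/1088p problem described above can be sketched in a few lines (the 16-pixel block height is taken from the comment, not from AMD documentation I can verify):

```python
# Sketch of the AV1 block-alignment issue described in the comment above:
# the encoder works on blocks 16 pixels tall (64x16, per the comment), and
# the AV1 spec carries no crop metadata, so heights that are not multiples
# of 16 come back padded with filler lines.

BLOCK_H = 16  # encoder block height, taken from the comment above

def padded_height(h: int) -> int:
    """Round a frame height up to the next multiple of the block height."""
    return -(-h // BLOCK_H) * BLOCK_H  # ceiling division

for h in (1080, 1440, 2160):
    filler = padded_height(h) - h
    print(f"{h} lines -> encoded as {padded_height(h)} lines ({filler} filler lines)")
# 1080 pads to 1088 (8 filler lines); 1440 and 2160 divide evenly and are unaffected.
```

This is why 1440p and 4K are untouched while 1080p picks up the 8 black filler lines the comment describes.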

  • @MasterCorvus
    @MasterCorvus 8 months ago +12

    Running my 1080 Ti, I'm looking for a decent 1440p/4K card, and in the end I will settle on either the 7800 XT or the 7900 XT, depending on whether I can find a 7900 XT for less than a 200 euro price difference (the 7800 XT costs around 600 EUR while the 7900 XT is still around 800 EUR). I also have a huge backlog of older and old-school games, so I'm taking that into consideration before overspending on a higher-tier card that would only give me 10-20% more performance on Ultra settings.

    • @perlichtman1562
      @perlichtman1562 8 months ago +3

      I have both (as well as an Arc A770, 3060, 4070, 6800 XT and 7900 XTX) and the 7800 XT is easily the best value card on the market right now, especially if you do any video editing (where in some Resolve Studio Fusion tasks it even outperforms a 4090 and outperforms my 4070 by over 30%).
      The 7900 XTX is also a good card, especially for gaming, but I’m probably returning mine because it’s overpriced relative to the performance it provides, even if it has been a significantly better value than the 4080 at their respective pricing up until very recently.

  • @ViperCannabis
    @ViperCannabis 8 months ago +26

    Personally, I don't think FG will ever replace native frames, especially with the added latency it comes with.
    All these features aren't technically necessary; they're just nice-to-haves (like any other features).
    If you do find them necessary, then you should look closely at the game itself. Is it optimized? Is it just too hard to run?

    • @vladimirdosen6677
      @vladimirdosen6677 8 months ago

      It's an AI...of course it will. You're comparing what a human being can do with a computer.

    • @ViperCannabis
      @ViperCannabis 8 months ago +2

      @@vladimirdosen6677 And you overestimate the capabilities of AI... Even Nvidia has to release iterations of denoisers just to clean up a raytraced image.

    • @vladimirdosen6677
      @vladimirdosen6677 8 months ago

      @@ViperCannabis Of course, but to think that you can stop progress, that's a fight we won't win. Anything that makes our lives easier and better will inevitably be used. If devs have an easier time using AI, you can damn well be sure they'll use them.

    • @ViperCannabis
      @ViperCannabis 8 months ago

      @@vladimirdosen6677 The more a game is made by AI rather than by humans, the more people will look for an alternative. Big corporations just see this as a business opportunity, similar to microtransactions.
      I agree it is the future, but without regulation it would just be a greed machine that gamers will see through. And they'll find devs and publishers that develop games "by hand".

    • @vladimirdosen6677
      @vladimirdosen6677 8 months ago

      Oh absolutely. It's just terrible how there is no accountability. They are scamming people with games left and right. If they regulate it well, AI platforms would actually be more trustworthy than the actual devs, sadly. So it could be a curse, but also a blessing. @@ViperCannabis

  • @roccococo2498
    @roccococo2498 8 months ago +43

    I've just built my new PC with a 7900 XTX + 7800X3D and I love it. My old rig had a 3080 + 5800X. I've been disgusted with Nvidia since I bought the 3080, as the 10GB of VRAM became insufficient in less than 2 years. The 7900 XTX is really impressive and it costs almost half the price of the 4090 in my country. It plays almost any game at 4K ultra native resolution at well over 60 fps, no upscaling needed. I've had no issues with AMD drivers. I do recommend the 7900 XTX over the 4080, since the 4080's 16GB of VRAM will be merely the norm in 2 years. I feel sorry for the 4070 Ti buyers, though, as I feel it will share the same fate as the 3080.

    • @advancedmomuser-kb2qn
      @advancedmomuser-kb2qn 8 months ago

      enjoy ur amd driver crashes...

    • @roccococo2498
      @roccococo2498 8 months ago +16

      I've had AMD GPUs before without any major driver issues, so I don't really get your point. You're probably just buying into the AMD-bad/Nvidia-good narrative. I've been running this system for about 2 weeks now and haven't had a single crash or issue with it. You should be glad there is competition in the GPU market... otherwise you'd be selling your organs to purchase Nvidia GPUs...

    • @Ghostlynotme445
      @Ghostlynotme445 8 months ago +11

      @@advancedmomuser-kb2qn don't be mad because he's joining the red team; your Nvidia driver ain't better when you gotta install 3 apps just to make it work

    • @WyattOShea
      @WyattOShea 8 months ago +12

      lol that's such bs and you know it. I've had an almost equal number of Nvidia and AMD GPUs over the years and have had more major issues on Nvidia's side (not that either is bad these days). Both Nvidia and AMD are more or less on par with their drivers now, and any major issue is likely on the user's side, not the GPU vendor's. Idk what it is with people like you who always spout bs about AMD just because they heard someone say something bad about them once, and then preach it like it's completely true and relevant today when it isn't and hasn't been for years. Btw I have a 6900 XT, and the only time I ever get driver crashes is when I'm overclocking/undervolting and haven't fully dialed in the settings yet; my previous card was an Nvidia 1080 Ti, so you can't call me a fanboy lol. @@advancedmomuser-kb2qn

    • @notjustforhackers4252
      @notjustforhackers4252 8 months ago

      @@advancedmomuser-kb2qn enjoy ur amd driver crashes... on Windows.

  • @TutonicMonkey
    @TutonicMonkey 8 months ago +26

    I have never had any issues on my all-AMD living room Steam VR PC with Win 10, a 5800 + XFX 6700 12GB (24 different games). Not sure where all the negative feelings are coming from.

    • @sup3rnov400
      @sup3rnov400 8 months ago +4

      Yeah, I was a little doubtful about buying an AMD GPU, but I'm happy that I purchased the RX 6600 instead of the RTX 3050 or 3060.

    • @xpodx
      @xpodx 8 months ago

      That's previous-gen AMD; I've still heard of issues with the current gen too.

    • @xpodx
      @xpodx 8 months ago

      @@sup3rnov400 didn't wanna get a used 1080 Ti for 200 or less?

    • @Greenalex89
      @Greenalex89 8 months ago +2

      5600G + 5800X3D, 6700 XT, used B450 + B550 boards, 16GB OC'd to 3600 MT/s C18 and 32GB OC'd to 3600 CL16, dual-screen setup, gaming at 1440p. Never had any issues with anything whatsoever, and all upgrades worked right off the bat.

    • @xpodx
      @xpodx 8 months ago

      @Greenalex89 that's good for you, mate. Previous gens have had issues, and used AMD cards are a coin flip. From time to time I hear of people having issues with their AMD CPUs and GPUs: memory compatibility issues with CPUs, board issues, etc.

  • @frostoise
    @frostoise 8 months ago +9

    One thing where AMD has really fallen behind NVIDIA is 3D rendering (Blender, V-Ray, etc.). Almost any RTX GPU will massively outperform a similar-class Radeon GPU; I'm talking about almost double the performance for the same price. Even AMD's own ProRender works much better with RTX GPUs. OptiX currently dominates the rendering scene, not to mention most render engines like V-Ray don't even support HIP RT.
    With GPU rendering becoming much more relevant as time passes, I hope AMD works on making HIP RT more relevant. With Blender 3.6 we got experimental HIP RT support, and while it's a nice improvement, there is still much to be done if they want to catch up to OptiX.

    • @Ghastly10
      @Ghastly10 8 months ago +2

      There are also Octane and Iray: Octane, made by Otoy, is Nvidia-only, and the same goes for Iray, which is an Nvidia-made render engine. And this is the problem for folks who have an AMD GPU: if they want to do 3D rendering, there are few to no options available, as what is out there is mainly for Nvidia GPUs.

    • @frostoise
      @frostoise 8 months ago

      @@Ghastly10 Probably the only reason I haven't jumped to AMD is this. Yes, for gaming AMD offers better performance for a lower price in pretty much every GPU tier, besides maybe the 4090. But people who work in rendering will give up 5-15% gaming price/performance (which is pretty significant, I'm not downplaying AMD here) if it means close to double the rendering speed plus better support.

    • @perlichtman1562
      @perlichtman1562 8 months ago +1

      Current Nvidia cards from the 4070 on up are definitely better than the AMD cards for Blender. Blender is one of the only applications where I prefer my 4070 to my 7800 XT. But in Cycles that gap is narrowing.
      Using the built-in Cycles engine, the recent HIP-RT implementation has boosted AMD performance by 11-31%. There is still a significant advantage for Nvidia but the gap is narrowing. I agree with you that there is still a ton of work to be done - this remains one of AMD’s weakest areas, even if we ignore the lack of compatibility for many other render engines.
      The 3060 vs the RX 6600 without HIP-RT was around 2.5x faster in the benchmark scenes.
      The 4070 vs. the 7800 XT with HIP-RT is 1.88x faster in the benchmarks most favorable to AMD from the same set.
      So between Blender software changes and AMD hardware changes, in one generation AMD narrowed the gap by 41%.
      AMD has been getting closer, not falling further behind.

    • @leucome
      @leucome 8 months ago +1

      @@perlichtman1562 I render everything on two AMD computers. I tweak the settings as far as I can and it doesn't take too long. So far nobody has complained about the quality. Seriously, I am not sure rendering is actually a good reason to pick Nvidia unless you're literally forced to work with a CUDA-only render engine by a specific project/contract or something.

    • @perlichtman1562
      @perlichtman1562 8 months ago

      @@leucome Oh, I agree about the quality and compatibility of Cycles vs. some other render engines. I recently did a project using the 7900 XTX and it turned out fine in terms of quality. The only issue is that it takes longer, but like you said, using multiple cards can help mitigate that. My 7800 XT typically takes at least 88% longer to render the same scene as my 4070, so that's the sort of thing I'm talking about: just how long it takes.
      Also, outside Blender, I wish I could use my 7900 XTX to process Gaussian splats, but I had to get a used 3090 for that because I needed a 24GB-VRAM CUDA card to do it.

  • @hwi7114
    @hwi7114 8 months ago +3

    I have an RTX 4070 and I don't use any of the Nvidia-specific tech. The only reason I went with it is that my 4.9L case could not fit an RX 7800 XT. I mostly play FPS games, so DLSS and RT are useless to me.

  • @zeitgeistx5239
    @zeitgeistx5239 8 months ago +11

    LMAO, a salty Nvidia fanboy's question.

    • @tomtemple69
      @tomtemple69 8 months ago +1

      that guy probably uses userbenchmark lol

    • @born2serve92
      @born2serve92 6 months ago

      Sadly, I did too for ages. Their writing is so absurd, but I really like seeing the stats. Is there a good replacement site?? @@tomtemple69

  • @Tracenji
    @Tracenji 8 months ago +7

    VR works great on AMD in my experience; I haven't had any issues with SteamVR on a 7900 XTX.
    I also haven't heard any complaints about VR (or any flat-screen games) from my friend who went from a GTX 1650 to a 6700 XT (very different performance class, I know). They have an Oculus Quest 2 and use Virtual Desktop to play PCVR games.
    And no, it isn't because they don't tell me about issues: I helped them pick out the GPU and I am also their tech support.

    • @tex69420
      @tex69420 8 months ago

      6950 xt, using vr, no issues.

    • @Joric78
      @Joric78 8 months ago

      Haven't had any issues? 7000-series performance was crap until they finally patched it a month or so ago, and now it depends on what headset you're using. The DPVR E4 officially doesn't support AMD cards. The Varjo Aero officially doesn't support AMD cards. I wonder if your friend just has no point of comparison, or thinks having to use Virtual Desktop is normal. Link cable and Airlink on Quest headsets have godawful antialiasing on 6000-series cards. Video playback in VRChat still doesn't work with multiple players using AMD cards. I'm overdue for an upgrade and making do with a 6600 XT for my Quest 2, but it'd take an amazing turnaround for my next card to be AMD when I grab a Quest 3.

  • @CasparMinning
    @CasparMinning 8 months ago +6

    I just game, and I always go for the bang-for-your-buck deal. So for the last 10+ years that meant no Nvidia or Intel parts.
    I'll pass on the needless fluff features.

    • @gerardotejada2531
      @gerardotejada2531 8 months ago

      Hard to find a better bang-for-your-buck combo than a 12100F and a 3060. And before Ryzen, AMD was the worst company on earth, and with the great value of the original GTX 1000 series it's hard for me to believe your lies.

    • @gerardotejada2531
      @gerardotejada2531 8 months ago

      The best moment for an all-AMD PC was after the Ryzen 3000 and RX 5000 series launched. But that was just 4 years ago.

    • @Greenalex89
      @Greenalex89 8 months ago +1

      Same. I want efficiency, multi-generational support and good price-to-performance. Since Nvidia's weak 3060 is the only card you can recommend without being price-scammed, and Intel's CPUs don't last more than 2 generations and consume absurd amounts of watts under load, AMD it is for me. I didn't have any issues whatsoever, and Adrenalin is such a nice, easy-to-use all-in-one monitoring/OC/capture/update tool.

  • @evergaolbird
    @evergaolbird 8 months ago +6

    RX 6700 XT: I play games on it, render humanoid models and texture them with Unreal Engine 5 and Adobe Substance, and edit with Adobe Premiere. It gets the job done.

    • @perlichtman1562
      @perlichtman1562 8 months ago +1

      I feel you. Also, Adobe Premiere Pro and DaVinci Resolve studio performance improved with the 7000 series, especially in h.264. In Resolve Studio I went from under 110 fps with h.264 to h.264 in PugetBench with the 6800 XT to over 190 fps with the 7800 XT and Fusion module scores that not only beat my 4070 by over 30% but also beat the 4090 Fusion scores I’ve seen online.
      In Premiere Pro 23.6 h.264 encoding went from 109 fps on 6800 XT to 180 fps on the 7800 XT.
      The 4070 has a big RTXDI advantage in Unreal Engine 5, where it is generally a better performer than 7800 XT even without it, but the AMD cards are still viable.
      Blender Cycles gap has narrowed from Nvidia being 150% faster previous gen than AMD at the same price point previous gen with AMD HIP to 88% faster current gen in best cases with HIP RT. I have been editing a lot with AMD cards in Blender and it works fine - but my 4070 is significantly faster than even my 7900 XTX unless VRAM becomes an issue.

  • @boerbenlp8659
    @boerbenlp8659 8 months ago +1

    I just wanted the best rasterization for my money from the new generation and grabbed a 7800 XT, coming from NVIDIA. Pretty happy with that. But I was really surprised how intuitive and cool the driver is. You can toggle anything easily. Overclocking and undervolting work fine; super easy, no other tool needed. With NVIDIA you have this 20-year-old control panel you have to deal with just to enable G-Sync or change some graphics settings, and for overclocking you have to find the hidden feature in GeForce Experience, which you also need an account for. So on that front AMD has made things a lot easier and more convenient, from my perspective. I also like the idea of having those new technologies at the driver level rather than hoping the games I want to play support them. Hope this approach works out for AMD.

  • @walllec
    @walllec 8 months ago

    Good question.

  • @Gissersj
    @Gissersj 8 months ago +3

    Glad I changed from Intel/Nvidia to an all-AMD rig; saved money, which I spent on a better MB and RAM. Beast of a rig!

    • @ResidentWeevil2077
      @ResidentWeevil2077 8 months ago

      I have two all-AMD rigs (one an AM5/RDNA 3 rig, the other a WIP AM4/RDNA 2 build) and a third all-Intel rig (an i5-13400F paired with an Arc A770 16GB LE card). All three are mini-ITX form factor and some of the best rigs I've built. I don't think I'd ever consider Nvidia again after the BS they keep pulling; DLSS is nearly a key selling point, but FSR and XeSS are comparable, and FSR especially is not AMD-bound.

    • @tomtemple69
      @tomtemple69 8 months ago

      same, I went all-AMD back in 2018. AMD has only improved every single generation; this gen was kinda disappointing, but the last 4 generations of CPUs and GPUs were great value for money

  • @rajagam5325
    @rajagam5325 8 months ago +5

    I do editing with a 6800 XT, and I have to say it's way better than my old 2060!

    • @kenkoyka9
      @kenkoyka9 8 months ago +1

      lol obviously :)

    • @lasantabibliaparanino3795
      @lasantabibliaparanino3795 8 months ago +1

      @@kenkoyka9 lol who cares, it's still better

    • @detroid89
      @detroid89 Před 8 měsíci

Hi friend, I also had the 2060 and am looking for an upgrade, but I only have a 600W PSU. If I undervolt the 6800 XT, will the PSU be enough? And have you undervolted your GPU yet?

  • @Yahya-X
    @Yahya-X Před 8 měsíci +2

    As the king of benchmarking, I think you guys should go do some VR benchmarks here and there at least.

  • @mihapuzelj
    @mihapuzelj Před 8 měsíci +1

    you should do a poll, what do people use GPUs for...
    I bet it's 95% gaming only

  • @ZombieTechie
    @ZombieTechie Před 8 měsíci +2

So I've heard that with the latest drivers, AMD is actually faster in video editing in Resolve with a 7900 XTX than a 4090. Ordered a 7800 XT myself over a 4070, as I hear that's better in DaVinci as well. I do game. I have a G-Sync monitor, but truth be told, I'm more likely to play my PC on my OLED, which doesn't have VRR support. The monitor is good for video editing.

    • @perlichtman1562
      @perlichtman1562 Před 8 měsíci +1

Overall the 7800 XT is outperforming my 4070 in both Resolve Studio 18.5.1 and Premiere Pro 23.6, especially in several Resolve Studio Fusion tasks (a PugetBench score of 705 for my 7800 XT in that category vs. 510 for the 4070). The 7800 XT saw big improvements to h.264 encoding over my 6800 XT (around 110 fps in both apps on the 6800 XT vs. 180 in Premiere Pro and 192 in Resolve Studio). Really great card for video editing; currently the best value on the market.
The only thing I will mention is that if you have a ton of LongGOP video layers running at once in Resolve Studio alongside a lot of GPU FX, you may eventually hit a point (like I did) where you get errors. In my case, once I got that error, I temporarily disabled AMD video decoding in the Resolve preferences so the card could fully focus on the GPU FX and my 13900K could handle the decoding. That solved the issue.

    • @ZombieTechie
      @ZombieTechie Před 8 měsíci +1

@@perlichtman1562 good to know. My channel is garbage right now; restarting the kind of channel I want soon. I will be using multiple layers, guaranteed. How do AMD cards handle Magic Mask?

    • @perlichtman1562
      @perlichtman1562 Před 8 měsíci +1

@@ZombieTechie Pretty decently. In 18.6 at the lower quality mode, the 7900 XTX is faster than the 4080 with Magic Mask, but when using the Better quality setting the 4080 is faster.

    • @ZombieTechie
      @ZombieTechie Před 8 měsíci +1

@@perlichtman1562 thanks, was curious. Planning on upgrading to Studio when the new PC is built. I wasn't sure if Radeon was the way to go because of Magic Mask. I used to use a green screen a lot. MM seems like an amazing tool.

    • @perlichtman1562
      @perlichtman1562 Před 8 měsíci

@@ZombieTechie It's a great tool and renders very quickly in the newest updates. A number of my peers say that it isn't as easy to correct things like stray hairs as with RotoBrush 3 in After Effects, but it renders a lot faster.
Green screen is still simpler when you have the option, but Magic Mask is great when you don't, especially if the talent doesn't have a lot of stray hairs. :)
The 7800 XT has the best price-to-performance ratio for that kind of work in my testing so far, but I haven't gone back to retest Magic Mask against the 4070 after the 18.6 performance improvements.

  • @PhillipLemmon
    @PhillipLemmon Před 8 měsíci +1

    I plan on building an all AMD desktop.
    So....
    5700xt, 6600xt, 6800xt or 7600xt gpu?
    (Don't plan on using raytracing or playing at 1080P)

    • @kpep588
      @kpep588 Před 8 měsíci

6600 or 6650 XT for budget, 6700 XT for best bang for your buck and entry-level 1440p, 7800 XT for a good 1440p card. If the 6800 XT is significantly cheaper in your country than the 7800 XT, then get that one, because it's just a little slower.

  • @randomzocker8956
    @randomzocker8956 Před 8 měsíci

I would be interested in VR on AMD.
I plan to use a 7900 XTX in my next build and also use it for VR. Is that a problem?

    • @OneAngrehCat
      @OneAngrehCat Před 8 měsíci

      It isn't. The RDNA 3 problems with stuttering in VR have been fixed.
You can however get a worse-looking experience with their rather poor h264 encoder. If you're gonna do VR, either get a cabled headset or get a Quest 3, which takes in AV1.

  • @SoundwaveZabuza
    @SoundwaveZabuza Před 8 měsíci +2

I've got a 3080 Ti and haven't used DLSS or RT. Mind you, I don't play the latest and greatest games; I mainly play FPS games.
The only thing I do like with Nvidia over AMD is their ShadowPlay. I like to record plays I make while gaming, and I find the quality and ease of use better with Nvidia than AMD.

    • @DimkaTsv
      @DimkaTsv Před 6 měsíci

Quality of recordings is not that much different between AMD and Nvidia visually, because you can set a high bitrate for the source so encoder issues aren't as apparent (it's low bitrate that becomes an issue).
But for streaming, and especially on Twitch, which for who knows what reason supports ONLY H.264 (aka AVC) as input and is limited to an extremely low bitrate as well: that is where AMD lagged behind.
It became better with B-frames and RDNA3, but there is still a gap. YouTube on the other hand supports ingesting HEVC and AV1 streams; at that point Nvidia stops having a clear advantage.

  • @michaelragai3065
    @michaelragai3065 Před 8 měsíci

I have an RX 6750 XT now, so should I upgrade to an RX 6800 XT or RX 7800 XT... or is it not worth the upgrade and better to wait for the 8000 series?

  • @Djoki1
    @Djoki1 Před 8 měsíci +4

Yeah, I've tried Blender on an AMD card and it was essentially how Intel cards ran most games before the optimization patches.
I know I'm in the 0.000000000001% of gamers who don't exclusively game, but it was still sad to see a simple render take like an hour.
At least that has to improve.

    • @eldritchtoilets210
      @eldritchtoilets210 Před 8 měsíci

      AMD needs to work on their CUDA equivalent (ROCm? I ain't sure that's the name) REALLY badly.

    • @perlichtman1562
      @perlichtman1562 Před 8 měsíci

I agree that there is still a big Blender gap in favor of Nvidia, but it narrowed a ton over the last generation. Looking at the 3060 vs. a 6600 with HIP, and then comparing that to a 4070 vs. a 7800 XT with the new HIP-RT, the benchmarks most favorable to AMD reduced the gap by 41%. To be clear: the Nvidia card is still 88% faster, but it was 155% faster a generation ago, and that is a pretty big reduction over the course of a single generation.

  • @connorWithTheReallyLongHandle
    @connorWithTheReallyLongHandle Před 8 měsíci +3

Most people aren't full-time streamers (that is, people for whom the bulk of their income comes from streaming; same goes for AI, video editing, etc). Unless that is the case, the worse streaming, video editing, and AI performance is a non-issue. Yes it's worse, but unless your GPU is how u make money, those differences aren't enough to be a factor.

    • @perlichtman1562
      @perlichtman1562 Před 8 měsíci

      Actually video editing is no longer consistently worse on AMD than Nvidia. In fact in my testing the 7800 XT not only outperformed the 4070 in the extended score for PugetBench for DaVinci Resolve Studio (and increased h.264 to h.264 from under 110 fps on a 6800 XT to over 190 on the 7800 XT) but the Fusion module scores were a lot higher than a 4070 (705 vs 510) and even outperformed all the Fusion scores I could find online for the 4090.
      The 7800 XT is currently the best value for video editing on the market out of the 2 Nvidia, 3 AMD and 1 Intel Arc GPUs I have tested in person - and my online research is showing the same.

    • @OneAngrehCat
      @OneAngrehCat Před 8 měsíci

That's not even really the problem... the accusation is plain old bullshit, and the response, if Tim wasn't a tool, would've been "AMD's fine for 90% of what you talked about".
AMD is functional for the vast majority of what the BS accusation covers, without any real problems. The big exceptions are Blender, where they're just bad, and AI, where it's an ongoing work. AI will probably be mostly in shape by summer next year. Blender... I wouldn't count on any competitiveness with Nvidia.

    • @connorWithTheReallyLongHandle
      @connorWithTheReallyLongHandle Před 8 měsíci

@@OneAngrehCat That's basically what I'm saying. If u do one or two of those things a week as a hobby, what's an extra minute or two? But if it's ur job and ur doing it several times a day, that adds up, and that extra time is costing u money.

    • @perlichtman1562
      @perlichtman1562 Před 8 měsíci

@@connorWithTheReallyLongHandle For 3D animation in software like Blender, you are right on the money in terms of the performance benefit to professionals making a potential difference on their income. I was just saying that for video editing AMD is, for the first time in years, often faster than Nvidia. So even for professionals, AMD is often a good choice for video editing now.

  • @monsG165
    @monsG165 Před 8 měsíci +1

My use case is literally playing city-building games or car games like Horizon and NFS after a long day of work.
No need for DLSS, RT or VR. I'm buying parts for a 7900 XTX build but am not sure which CPU is best for Anno 1800 or Cities: Skylines, the 7800X3D or 13900K.
My current setup is a GTX 1070 with an 8700K and it still manages to run the games I have just fine at 1080p, but I would like to upgrade to 1440p. Also not sure if my current setup could handle Cities: Skylines 2 with the pop density I like to play with.

    • @onemorescout
      @onemorescout Před 8 měsíci +1

Your best bet is waiting for Cities: Skylines 2's release; it's looking like another unoptimized disaster, and we don't 100% know whether you need that 13900k or whether you can save money buying a 13700k and waiting for a few patches.

    • @marcinkarpiuk7797
      @marcinkarpiuk7797 Před 8 měsíci +3

I think that u should go AM5 with the 7800X3D, because later u can upgrade the processor to a better one without changing everything else (Zen 5 for sure, probably Zen 6 too). Just make sure u get a motherboard with an E suffix (for future cards with PCIe 5). ATM Intel is a dead platform: Raptor Lake is just a shhty refresh and Arrow Lake will be on a new socket. I replaced my old 3600 processor with a 5800X3D half a year ago and this will be OK for a few years ;)

    • @mikem2253
      @mikem2253 Před 8 měsíci +3

      7800x3d. Those games seem to like cache. Check out benchmarks online.

    • @monsG165
      @monsG165 Před 8 měsíci

Only found one German benchmark for Anno 1800 with the 7800X3D and it was at 720p lol. However, you are the first one to say these games prefer cache over the single-core performance of Intel CPUs @@mikem2253

    • @monsG165
      @monsG165 Před 8 měsíci

      So a motherboard from the B650E line?@@marcinkarpiuk7797

  • @Visleaf
    @Visleaf Před 8 měsíci +2

    Intel Arc is the value option for AI/raytracing. People aren't having that conversation because they care more about gaming (except when it's AMD, then suddenly the gimmicks matter again).

    • @dylf14
      @dylf14 Před 8 měsíci

Intel came in and did everything well that Nvidia fans demanded. Yet you'll never hear of anyone switching from Nvidia to Intel. And once again, when Battlemage comes out, we will see the same ratio of Nvidia to non-Nvidia marketshare. It's the same as the Apple cult: the brand is now more important than the customer's real experience.

    • @ishiddddd4783
      @ishiddddd4783 Před 8 měsíci

The thing that a lot of people forget when recommending Arc is that for new users it's still a hard no, which is why it gets a lot of flak despite having excellent encoders and good RT. Most people buying GPUs below the 300 mark are mostly gamers, so those functions are quite gimmicky while Arc still has a plethora of troubles. It's gotten better, but there's still a long way to go.

  • @stevenszmek7007
    @stevenszmek7007 Před 8 měsíci +13

Interesting how so many people still fall for Nvidia's marketing schemes and perceive AMD as inferior. Both have their ups and downs, and those features will be mainstream in 2 generations; nobody will talk about ray tracing anymore since it's gonna be a regular thing. I still remember when shadows were introduced in games and what a "big deal" it was. RT was introduced with Nvidia's 20 series and it is still barely usable without fake frames and upscaling. 2 more generations (maybe 3, depending on how small the steps are that they want to take to tease customers) and both companies will be at the point where today's "buzzwords" to cash in will be common tech and will not justify 1000-euro-and-above GPUs. After my trusty 1070 I went with the 6800 (because the XT wasn't much better for the price it had) and I am truly happy that I didn't choose Nvidia. What the future will bring nobody knows, but I'd rather stick to the raw performance numbers of hardware that can be improved upon by drivers than go for buzzwords and software features that break every other driver update, on both companies' cards.

    • @yeahitsme4427
      @yeahitsme4427 Před 8 měsíci +2

Tech youtubers really help Nvidia's marketing. Should they start to ignore or put less perceived value on Nvidia features, everything would start to change. They continue to praise DLSS like a godly feature, when in reality (real gaming, not stills or 4x-zoomed slides) the difference from FSR Quality at 4K or 1440p is like 5~10%. Nvidia's strength is perception, and tech youtubers could really change that easily.

    • @kenkoyka9
      @kenkoyka9 Před 8 měsíci

      lol yeah rtx 3080 with cp2077 1440p rt overdrive = 8 fps w/o dlss

    • @plasmahvh
      @plasmahvh Před 8 měsíci +1

@@kenkoyka9 it's obvious u have 0 clue what path tracing is. Try rendering a path-traced image with a 3060: it would take 20 seconds for a single frame. Don't expect to run PT native, especially not with a 3080; even the 4080 struggles with it at native.

    • @Zenzuu
      @Zenzuu Před 8 měsíci

Same could be said about the PS5 & Series X. Sure, they're using 4K checkerboarding and dynamic resolution with a few graphical settings turned down, but in reality it's quite close to PC at max settings, excluding ray tracing. In almost every single Digital Foundry video they had to zoom in 200% just to see the difference side by side. @@yeahitsme4427

    • @speedforce8970
      @speedforce8970 Před 8 měsíci

      I don't think people understand this but the whole world isn't the US. For example where I live, the RTX 3060 is cheaper than the 6600xt with more VRAM and features so it's the obvious choice. The price to performance improvements for AMD doesn't exist where I live so I have no reason to buy AMD cards. That's probably why AMD cards are so much less popular than Nvidia cards on the steam hardware survey.

  • @tohloc
    @tohloc Před 8 měsíci

    Reupload?

  • @shaunhall6834
    @shaunhall6834 Před 8 měsíci +2

    This proves the more choice we have the better.

  • @Whoisthis911
    @Whoisthis911 Před 8 měsíci +7

    That list of features sounds more like a bad collection of jokes & gimmicks.
    Give me raw rasterization performance, i dont care about anything else.

    • @injest3r574
      @injest3r574 Před 8 měsíci +1

      That's what I used to say until I actually used the features.

    • @perlichtman1562
      @perlichtman1562 Před 8 měsíci

@@injest3r574 But for content creation it's an outdated list. In fact some of the areas where the question said that Nvidia had a lead (like video editing) now show a performance advantage for AMD.
That's because cards like the 7800 XT massively improved h.264 encoding over the 6800 XT (going from under 110 fps in both Premiere Pro and Resolve Studio to 180 fps and over 190 fps respectively in my testing).
There's still a big Blender advantage for Nvidia, and there are some tools that only run on Nvidia, but for most video editing (including even some AI video upscaling tasks) the AMD cards offer better performance for the money.

    • @injest3r574
      @injest3r574 Před 8 měsíci +1

@perlichtman1562 That's good to hear. I had a 6700 XT and it was pretty bad when trying to stream anything. That card died, then I just got a 4070. I will say DLSS and frame gen are very good features in a lot of games, mostly single-player.

    • @perlichtman1562
      @perlichtman1562 Před 8 měsíci

@@injest3r574 Yeah, DLAA specifically is the one that I miss most often in games on AMD. Ray reconstruction is one that could become more important in the future. I have been raytracing since I had a 486 laptop in the 90s, so I appreciate a good denoiser. :)

  • @tex69420
    @tex69420 Před 8 měsíci +1

I do VR, stream, create content and use ray tracing, and I recently switched from a 2080 Ti to a 6950 XT; everything is fine or as expected. People just need to unbias themselves.

  • @EthelbertCoyote
    @EthelbertCoyote Před 8 měsíci +3

If you are a multi-use gamer (aka you work with GPUs as well as play), you use Nvidia. This is a gaming channel I like and I have watched HUB for a long time, but it's very important to get a range of views if you are a graphics professional. That being said, most game devs and 3D artists I know are Nvidia users, so they often exclude AMD when making games till the end of development. On a side note, most audio people I know who game are Mac users and endlessly grumble about having to buy a separate, lesser gaming PC to play games.

    • @perlichtman1562
      @perlichtman1562 Před 8 měsíci +1

      I chuckled a bit about this because as someone that has worked with a Grammy winning artist as a professional audio engineer, I can relate to hearing Mac users grumble about having to buy PCs. :)
      With that said, having been benchmarking AMD, Intel and Nvidia GPUs extensively the last couple months, the situation has changed drastically regarding content creation over the last several years. The 7800 XT is currently the best value GPU on the market for video editing in both Premiere Pro and DaVinci Resolve Studio, with massive increases in h.264 performance over the 6800 XT and PugetBench extended overall wins over the 4070 in both Premiere Pro and DaVinci Resolve Studio with Resolve Studio Fusion performance that outperforms even a 4090 in some scenarios (and is within the margin of error for a 7900 XTX).
      There is still a significant Blender advantage in favor of Nvidia, but we’re talking about going from 150% advantage previous generation with basic HIP to 88% in best case scenarios with HIP RT in current gen. The gap is slowly narrowing there. However there are many other render engines that remain Nvidia only.
      There is also a big advantage in Nvidia’s favor if you want to use RTXDI in Unreal Engine 5.

    • @EthelbertCoyote
      @EthelbertCoyote Před 8 měsíci +1

@@perlichtman1562 I hear you (I am an animator). I chuckle a bit at the gamer zealotry over manufacturers. AMD has pretty much always made great compute cards; as number crunchers I hear they usually do better than Nvidia, but man, they just cannot seem to get a win in on the software side. Intel's cards, if they keep at it, will catch up and maybe overtake them, as hard as it is for me to say this. What is weird is that on the pro side Nvidia has been going software-nuts with the Omniverse stuff but has done damn near nothing on the Quadro hardware side to justify the price difference between the pro and "gamer/consumer" cards. I just don't get it.

    • @perlichtman1562
      @perlichtman1562 Před 8 měsíci

      @@EthelbertCoyoteYou really nailed it in on all counts. As far as justifying the Quadro cards etc. it really is bizarre how little Nvidia is doing to justify the price tags to 3D animators. I remember back in the 90s when I was raytracing on a Pentium 90 and 486 laptop and eventually a K6-3 450 MHz, looking longingly at the high price tag workstation cards offered in prebuilts from vendors like Intergraph, wishing I could take advantage of those features which were on a totally different level from gaming cards. But now? I honestly don’t even consider the Quadro cards when I shop GPUs because of the lack of compelling differences for my intended applications.

  • @dekngpasig
    @dekngpasig Před 8 měsíci +1

Spending 1000 USD for playable ray tracing is not worth it.

  • @TheArakan94
    @TheArakan94 Před 8 měsíci +4

Nvidia has issues with multiple monitors? Good to know! I've been using AMD for the last 12 years or so (currently have a 6800 XT) but was actually considering Nvidia for my next GPU (in 2-4 years) because of all the extra features. But I simply cannot live without multiple displays (my work PC has 2, home PC 2-3), so it's crucial that it works well (though I gotta say, AMD ain't perfect: I get pretty high idle power when using more monitors, 50 W instead of 20-something W, because the memory doesn't downclock). Combining that with me using Linux a lot and AMD having great open-source drivers, it seems I am still going to be vendor-locked to AMD :D

    • @plasmahvh
      @plasmahvh Před 8 měsíci

      been running triple monitors on my 4070 and 0 issues on linux xorg/wayland and windows 10

    • @TheArakan94
      @TheArakan94 Před 8 měsíci

@@plasmahvh hmm, that's good to hear. Perhaps the HUB guys could elaborate on their issues? But honestly, only a proper survey can really tell us.
Oh, are all your displays the same resolution / refresh rate?

    • @perlichtman1562
      @perlichtman1562 Před 8 měsíci +1

      @@TheArakan94It’s not just HUB guys. I’ve had several Discord discussions with another streamer that has been talking to Nvidia for weeks about multi-display issues and Nvidia has basically told him that it doesn’t look like there’s a fix on the way.
      As for myself, most of the time my 4070 runs fine in multi-monitor mode (including F1 2022 in triple monitor mode) but occasionally there are some issues that require toggling Nvidia control panel settings and restarting the game to get things working properly again. It does seem like my 6800 XT, 7800 XT and 7900 XTX haven’t had similar issues but I haven’t tested it extensively so I can’t be overly confident about that.

    • @TheArakan94
      @TheArakan94 Před 8 měsíci +1

      @@perlichtman1562 Hmm.. Any idea on how prevalent these issues are?

    • @tomtemple69
      @tomtemple69 Před 8 měsíci

      ill buy your 6800xt for $375

  • @dontsupportrats4089
    @dontsupportrats4089 Před 8 měsíci +3

I care about pure rasterization first, then the actual SIZE of the GPU. Who cares about the features if every model is 1" too long to fit in my small case? I never consider streaming or production of any type as a selling point for my use case.

    • @TuskForce
      @TuskForce Před 8 měsíci +1

      surpass your boundaries - cut a hole in the case :P

    • @dontsupportrats4089
      @dontsupportrats4089 Před 8 měsíci

      @@TuskForce I can just hear my wife since my case is in the living room and she's super picky.

  • @Joric78
    @Joric78 Před 8 měsíci

At the best prices I can currently find here in Oz, the 4070 is $9 AUD cheaper than the RX 7800 XT. That, and the fact that VR (using Meta headsets) is my most demanding use case, makes AMD a clearly inferior option. Shame really; I'd like actual competition. I wish there were reviewers that included VR performance reviews, because inferring performance and value from flat-screen gaming often misses critical issues.

  • @adams4075
    @adams4075 Před 4 měsíci

An online friend of my nephew's (they play online; the friend hosts the game they play) bought a 4090 for just under 2000 USD. He sent my nephew some throwaway parts: a 10400F, a GTX 1650 4GB (with a Super BIOS flash, at least GPU-Z said so, even though ASUS did not make that model in a Super version) and an 8GB stick of RAM. I put in two 8GB sticks, boosting his performance. Then I changed it to an AMD 3600 with an RTX 3060 12GB. I showed him that for 1000 to 1500 you can put together an AM5 system with an RX 7800 XT or 7900 XT. All that aside, you need to ask yourself what games you want to play and what else you want to do. Just buy a 12GB or bigger GPU.

  • @Earthishome
    @Earthishome Před 8 měsíci

They can keep the features; none of them come close to what I need. We are still at least 3 or more generations behind for that in my personal use, and PC gaming is not car racing...

  • @TheMacedoniaGR
    @TheMacedoniaGR Před 8 měsíci +1

AMD first must look at the drivers. I had a Vega 56 and now have an RX 6750 XT, and I still get crashes in games, FreeSync not working properly, and many other issues... It's not stable at all, and it seems they don't care about making their graphics cards stable. I can't imagine a company like AMD, with billions, can't make proper drivers and support their products. Many people know that, and they'd rather give more money to go with Nvidia cards, even when the prices are sometimes 2 times higher; that's why Nvidia will never lower prices.

  • @jeofthevirtuoussand
    @jeofthevirtuoussand Před 8 měsíci +1

To summarize the video: Nvidia is the better all-rounder, which is a fact.

  • @CS_Mango
    @CS_Mango Před 8 měsíci +3

I feel like this Nvidia vs AMD thing is getting old. Yes, AMD is behind Nvidia. Yes, they are cheaper and have more raw performance for the money. Yes, FSR 3 is not as good as DLSS.
But I feel like this matters less than the loudest voices make it seem, because we don't need the latest and greatest hardware to run games, especially if we avoid the horribly optimized ones. And plenty of people are overly obsessed with graphical fidelity; it often doesn't even mean those games are good-looking. For me personally, some of the best-looking games are not even that demanding: Elden Ring, Ratchet and Clank: Rift Apart, Sea of Thieves, BioShock, No Man's Sky, especially Subnautica, Doom Eternal and Mass Effect, to name a few.

  • @Jackson-bh1jw
    @Jackson-bh1jw Před 8 měsíci

Maybe some editing like huntl3o does in his videos would be nice.

  • @stephen8132
    @stephen8132 Před 8 měsíci

    VR on AMD is fine. I use a 7900xtx with no issues.

  • @stuartthurstan
    @stuartthurstan Před 8 měsíci

If you're just buying a GPU for gaming, you may be better off abandoning the PC altogether and getting a console. The advantage of a PC over a console is the ability to do so many other things with it apart from just gaming. For me this means buying the GPU with the best capabilities across a wide range of tasks. Personally I use 3D CAD software, which means AMD is really not an option. I wish they were a bit more competitive in productivity and feature set.

  • @lashyndragon
    @lashyndragon Před 8 měsíci

I don't understand the VR thing; I haven't had any issues with my 6900 XT. Same with streaming. Even if its h264 is *technically worse*, it's not like it's leagues behind. And I feel like you shouldn't be streaming at such a low bitrate that it matters anyways.

  • @spiritualantiseptic
    @spiritualantiseptic Před 8 měsíci +4

    In reality Ray Tracing is not viable until at least the next generation of GPUs.

    • @VaginaDestroyer69
      @VaginaDestroyer69 Před 8 měsíci

      Yep, until the consoles can feasibly achieve 4K 60FPS with RT on, it’s going to be a feature relegated to the upper midrange and top end halo cards. I’d say that by the time the next generation of consoles is out, RT will have been standardised enough that the next great challenge will be path tracing.

    • @born2serve92
      @born2serve92 Před 6 měsíci

RT works fine now on the 3090 and higher; path tracing is where it becomes an issue. Reflections and shadows are fine. I can play Control at 1440p with full RT at 80-100 fps on a 4070 Ti. Metro Exodus Enhanced, the Spider-Man titles, Alan Wake 2... I could go on, but RT works fine, and some of these games don't even have DLSS in them; it's native 1440p. Even yesterday I got some game called Bright Memory: Infinite, maxed RT and everything, and played through it. The PS5 pulls off RT well in Spider-Man 2, so idk why you would think the higher-end GPUs cannot? The Cyberpunk path tracing mode in the DLC is really the only one that hits hard for me and needs frame gen or AI to make it work.

  • @sketchy0078
    @sketchy0078 Před 8 měsíci +6

Gamer here, and definitely F*@£ Nvidia. I've run AMD for the last two generations, and the price-to-performance, including AMD's fine-wine aging, makes the choice simple. Ray tracing at the moment is just too much of a performance hit on both sides for it to be a mainstream option.

    • @xpodx
      @xpodx Před 8 měsíci

      Nvidia is faster

    • @sketchy0078
      @sketchy0078 Před 8 měsíci +2

@@xpodx yeah, but gamers don't care about faster, just that it runs their games for a decent period of time. My 5700 XT lasted me a good three years and still runs a lot of games well now. I replaced it with a 7900 XTX that I got for a crazy good price and it will run everything I need for the next three years at least. I'm not knocking Nvidia, but everything has gotten expensive, and for rasterization AMD still does what I need it to for significantly less money.

    • @xpodx
      @xpodx Před 8 měsíci

@sketchy0078 A lot of gamers care about how fast their PC is. Yeah, AMD has better price-to-performance most of the time, but DLSS is better, along with other features. Nvidia is also known to be more reliable, no matter if you get a 10-year-old Nvidia card or a new one. An old AMD card is a coin flip; it will probably give you issues in one way or another. I still hear people talk about their AMD CPUs having issues, as well as issues on their GPUs.

    • @mikem2253
      @mikem2253 Před 8 měsíci +3

@@xpodx DLSS isn't in every game though. Until it is, raster per dollar is the primary driver for me.

    • @sketchy0078
      @sketchy0078 Před 8 měsíci +2

@@xpodx have you personally owned both sides? Because I have, and they both have their issues. Nvidia at the moment is a greedy corporation that really is cashing in on the money they made during the pandemic. The fact that they are releasing the 4070 with just 8gb of memory is terrible business practice. They must know these cards will be struggling in 18 months with new releases.

  • @RonOverdrive
    @RonOverdrive Před 8 měsíci

Honestly I don't get the comment about VR, as there's nothing that Nvidia does differently or better than AMD when it comes to VR support. I know plenty of folks who use AMD cards for VR, myself included, and there have been zero issues. It's not like upscaling or frame gen is prevalent in VR, for example. Even if you're using a standalone VR headset in PCVR mode like a Quest, there isn't much difference, as you'd use H265, or now AV1 with the Quest 3, for video encoding, and AMD and Nvidia are basically equals on both.

  • @fabrb26
    @fabrb26 Před 8 měsíci +1

    AMD is for players 🐭, Nvidia is for true GAMERS 👑 ! 😎

  • @kevinkershaw1435
    @kevinkershaw1435 Před 8 měsíci +3

This generation of GPUs is a shitshow, and consoles too. Going to wait myself; PS5 Pro maybe?

  • @stebo5562
    @stebo5562 Před 4 měsíci

Nvidia is just really good at marketing. Even when AMD is faster, Nvidia fans will just say Nvidia is better at Blender, or say the AMD encoder is bad even though they've never used it. It's similar to the console wars. I have both AMD and Nvidia cards.

  • @williamregal8503
    @williamregal8503 Před 8 měsíci +1

The average Joe just wants to game; whatever is cheaper, just grab it and be happy.

  • @raven80wolfx2
    @raven80wolfx2 Před 8 měsíci

Honestly, I finally went full AMD this time. Tired of the new connector. Tired of the rise in price, and I appreciated FSR when I had a GTX 1080. I'm now enjoying team AMD. There were driver issues when it first came out; they were quickly fixed within a couple of months, and it's only gotten better month after month, adding Hyper-RX, Anti-Lag+, and Fluid Motion. Can't wait for FSR 3. Never had any issues with the Quest 2 doing VR; if you have a Valve Index, maybe Nvidia is better for that. Everything I do is better. I get a smoother game in Control and Dead Space than a 4090 with Fluid Motion, and I saved a 600-dollar difference vs the 4090. I could have gone 4090 and I am glad I gave AMD a chance. They aren't falling behind; they are doing things differently. If they don't show up next gen, fine. They need to blow us away 3 years from now: revolutionize the way GPUs are made and figure out how to make them cheaper. If it worked for Ryzen, why can't they do it again? They make just as much money as Nvidia, but spread out more. They just need to manage the R&D efforts towards efficiency. Sometimes you can't get it working your first time. I believe in this company's vision, and if Nvidia had their way, prices would be just as high as they were: 2k or 3k a GPU.

  • @MechAdv
    @MechAdv Před 8 měsíci

    I’m trying to build a portable PC right now, and I’m really disappointed that I’m pretty much limited to RTX 4060 and 4060 TI. I don’t know why AMD doesn’t have more options under 200 W so that AIBs can make single fan cards or double fan two slot cards, shorter than 200 mm. I was really hoping that RDNA three would have a card that was equivalent to a 6750 XT, but in a single fan configuration for less than $400. But it’s looking like I can either get a 4060 at $280 or a 4060 TI for $360, and that’s it. Those cards will be fine for the next year or two at 1440p with upscaling, but UE5 games are crushing GPU performance left and right.

    • @skywalker1991
      @skywalker1991 Před 8 měsíci

I'm running my RX 7800 XT undervolted; it uses around 210 watts. So limiting yourself to an RTX 4060 is stupid, they're bad cards. The power saving is not that much, but the RX 7800 XT's performance is over 50% higher.

    • @MechAdv
      @MechAdv Před 8 měsíci

      @@skywalker1991 What part of portable PC did you not understand? 2 slots, less than 200mm. AMD makes nothing other than the 6600 that fits.

    • @DimkaTsv
      @DimkaTsv Před 6 měsíci

​@@MechAdv AMD makes dies and reference models; other models are made by AIBs. And for AMD cards there is generally a MUCH smaller variety of options than for Nvidia cards. You can just open TechPowerUp and check the GPU retail model lists for AMD and Nvidia GPUs... the Nvidia lists are multiple times longer.
And there is the 7600 XT, which should be able to run as a portable card. But no one cares. Maybe there will be an option when the 7600 non-XT comes out, as it should potentially have even lower power consumption?

  • @yuboka49
    @yuboka49 Před 8 měsíci

The number of features is just a cover-up for the lack of progress in performance.

  • @Nasty_Pixels
    @Nasty_Pixels Před 8 měsíci

I just upgraded from a 3070 Ti to a 7700 XT (someone wanted the 3070 Ti badly for a custom loop, so we traded). One thing I noticed: rasterization on Nvidia and AMD is different, and I think I prefer AMD. Nvidia looks like it applies some kind of anti-aliasing even when I turn everything off in the drivers and the game. Games look nice, with soft edges, but at longer range it's hard to spot things or track targets because it looks a bit blurred. On the AMD card I see rougher edges, but the image is way sharper and I can see stuff way better. It's probably different for other people, but I like the AMD end result more. Sure, it has its own problems, like drivers occasionally breaking features in some versions, but if you tweak a bit you won't see a problem: grab an older driver version from 3DGuru and wait for a fixed version.

    • @DimkaTsv
      @DimkaTsv Před 6 měsíci

      Small hint. On AMD driver download page for your GPU, if you scroll down there is "Previous Drivers" link. Which will lead you to driver history, up to about... 6-8 months?

    • @Nasty_Pixels
      @Nasty_Pixels Před 6 měsíci

@@DimkaTsv Well, on 3DGuru you can get even older drivers if you need them for some reason 😀

    • @DimkaTsv
      @DimkaTsv Před 6 měsíci

@@Nasty_Pixels Oh, don't mind me... I'm an experienced guy, and I have a collection of every driver release for the 6750 XT and now the 7800 XT from the last... almost 1.5 years already.
Saved on a local drive and NAS. I do sit regularly on 3DGuru though.
I don't actually remember how much space the current Nvidia driver release takes as a package... 700? 800 MB? AMD ones at the moment are around 630-650 MB.

  • @jermainesabajo7225
    @jermainesabajo7225 Před 8 měsíci

Y'all don't know Save the World??

  • @christofernystrom2840
    @christofernystrom2840 Před 8 měsíci

Yeah, maybe talk more about who should buy it and who should not.

  • @RCcars2023
    @RCcars2023 Před 8 měsíci

It's actually Nvidia that will find itself behind if it keeps going in the direction of AI.

  • @ruisilva4317
    @ruisilva4317 Před 8 měsíci +13

After 20 years of Nvidia cards I have switched for the past 2 generations; my only regret about going AMD is not having done it sooner.
All the upsides of Nvidia will eventually become irrelevant; AMD will catch up and probably surpass them once those technologies mature:
* AMD just debuted its first implementation of FG
* Unreal Engine 5 shows that RT performance becomes a more even playing field when game devs are not depending on implementing RT using NVIDIA's SDK and libraries
* ROCm is getting buffed up to compete with CUDA
* AMD's new cards with ROCm will challenge NVIDIA in AI workloads
* Streaming will get even when platforms decide to address AV1 encoding (don't know why they are taking this long to implement it)
I'm addressing things people cite as NVIDIA being vastly superior, but there are things no one talks about where AMD just blows NVIDIA out of the water, and contrary to AMD I have no hope of NVIDIA ever addressing them:
* UI and user experience
* Multi-monitor experience
* Linux experience
* System snappiness (like alt+tab)
* Driver overhead
* Continuous delivery of improvements to older hardware
For the average user, AMD has been the better choice for a long time. IMO the real difference between NVIDIA and AMD is that NVIDIA is like Apple: a marketing company very focused on selling you a product and making you want to buy it (even if you don't need it). For that they are usually first to the race, but after the initial thrill they leave you stranded by stopping improvement/support to focus on the next shiny unfinished feature. AMD is a product company: the guy who comes late but does it better after a while.

    • @perlichtman1562
      @perlichtman1562 Před 8 měsíci +2

I'm in a somewhat similar boat, though I would mention that the Unreal comment doesn't account for Nvidia's big recent RTXDI advantage. And you're right about the driver overhead advantage: in the Particle Illusion standalone particle system for video editors, in 3D mode I saw shorter export times for both the 7800 XT and Arc A770 over the 4070, due to decreased CPU overhead helping in CPU-limited scenarios.
Other than that, overall the AMD cards have made huge strides for content creation recently, and outside of applications like Blender, the 7800 XT is probably the best value in content creation on the market today.
Here are some things I mentioned in another comment in the thread earlier.
"Some of those areas are no longer stronger for AMD than Nvidia - the question is based on a partially incorrect premise. The 7800 XT is better than even a 4090 in many Blackmagic Design Fusion tasks, for example.
For video editing, the 7800 XT saw big improvements over the 6800 XT in several areas (including big improvements to longGOP performance, going from 109 fps on the 6800 XT to 192.19 fps in the PugetBench Resolve h.264 to h.264 test) and is outperforming the 4070 in lots of Premiere Pro 23 and DaVinci Resolve benchmarks - especially Fusion. In my testing it scores 38% higher in Fusion in PugetBench.
The top 3 4090 PugetBench overall scores online only have Fusion scores of 650 to 681. My 7800 XT got 705 on its own and a massive 781 when paired with the 13900K's iGPU.
In recent Topaz Video AI 3 versions, it depends on which card, resolution and AI model, with the 7800 XT usually outperforming the 4070 in 4X upscaling of 4K and the 4070 often outperforming the 7800 XT at lower resolutions. It's not a slam dunk for Nvidia unless you are talking about performance at any price as opposed to performance at the same price tier.
In regards to Blender, Nvidia still has an advantage in both performance and compatibility. But using the built-in Cycles engine, the recent HIP-RT implementation has boosted AMD performance by 11-31%. There is still a significant advantage for Nvidia, but the gap is narrowing.
The 3060 vs. the RX 6600, without HIP-RT, was around 2.5x faster in the benchmark scenes.
The 4070 vs. the 7800 XT, with HIP-RT, is 1.88x faster in the benchmarks most favorable to AMD from the same set.
So between Blender software changes and AMD hardware changes in one generation, AMD narrowed the gap by 41%.
Nvidia frame generation is further along in regards to driver development and VRR support etc., but AMD's implementation is supported on more hardware.
In other words, these are examples of how AMD has been catching up, not falling further behind, in around half the areas discussed in the original question."
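The gap-narrowing figures quoted in this comment (a 150% lead shrinking to 88%, i.e. the gap narrowed by about 41%) follow directly from the 2.5x and 1.88x speedup ratios it cites; a quick sketch of the arithmetic, using only the numbers from the comment:

```python
# Speedup ratios (Nvidia render-speed advantage) quoted in the comment:
before = 2.50  # 3060 vs RX 6600, without HIP-RT
after = 1.88   # 4070 vs 7800 XT, with HIP-RT

# A ratio of 2.5x means a lead of 150% (ratio minus 1), and so on.
lead_before = before - 1  # 1.50 -> "150% advantage"
lead_after = after - 1    # 0.88 -> "88% advantage"

# Fraction of the old lead that was eliminated in one generation.
narrowed = (lead_before - lead_after) / lead_before

print(f"lead before: {lead_before:.0%}")   # 150%
print(f"lead after:  {lead_after:.0%}")    # 88%
print(f"gap narrowed by: {narrowed:.0%}")  # 41%
```

This only checks that the commenter's percentages are internally consistent; the underlying benchmark ratios are theirs.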

    • @OneAngrehCat
      @OneAngrehCat Před 8 měsíci

      If you listen to the marketing, buying Nvidia is giving you the best stuff ever.
      If you look at the numbers, buying the 4090 is giving you the best stuff ever.
      How many people actually buy the 4090? 5% of the market.
      How many people would be better off not listening to PR? 95% of the market.

  • @sippi18fd
    @sippi18fd Před 8 měsíci +1

    AMD:
    - FidelityFX Super Resolution
    - Radeon Super Resolution
    - Fluid Motion Frames
    - Radeon Anti Lag
    - Radeon Anti Lag+ (yea its a bit different)
    - Radeon Boost
    - Radeon Chill
I mean, if you want a long list, there you go. Most of the time it feels like youtubers don't talk about AMD features as much as Nvidia features.
Like, DLSS is so good, omg, best thing ever, AMD can stop making GPUs, and such. Oh wait, DLSS 3 frame gen only supports 4000-series GPUs and only works in a few games. If you only play Cyberpunk and Microsoft Flight Simulator, then buy Nvidia without question. If you play other games, Nvidia is just not better.
I think AFMF is so good that if you play a lot of games, and not just the AAA titles with DLSS or FSR, AMD will dominate those titles. Oh, and if it ever supports the Steam Deck, I'm gonna touch myself.
(I switched from a GTX 1080 to a 7900 XTX this year. AMD Adrenalin is sooo much better.)

  • @cristian9993
    @cristian9993 Před 8 měsíci

Honestly, I don't know why AMD gets so much hate. I've been using countless ATI/AMD graphics cards, starting with the X1800 XTX, HD 3870, HD 4870, HD 6850... up to my current RX 6700 XT. They all performed flawlessly; I never had any issues, not even when the drivers were shit.
I've also used a couple of Nvidia cards, and they were all great too. The only thing I noticed was that dual-fan cooler designs are barely enough to cool GPUs like the 3060 Ti, 3070 Ti and 2060 Super, so my only regret was not getting a triple-fan card.
Honestly, if you care about value, just get the better deal and leave your brand loyalty behind. They don't care about us, so don't get baited by their marketing and gimmick features (yes, I mean ray tracing, which is unusable for the average customer even after 3 generations).
Also, Nvidia Control Panel and GeForce Experience are a joke compared to the Adrenalin suite.

  • @duxnihilo
    @duxnihilo Před 8 měsíci

    Because of AMD, my 1070 can run MW3 on high with FSR LOL.

  • @FullFledged2010
    @FullFledged2010 Před 8 měsíci

Went from a 1070 Ti to a 6700 XT and now a 7900 XT, and I have 0 reasons to go back to Nvidia, as I don't give a d%mn about RT or DLSS. Rasterization performance and enough VRAM is all I want 😑

  • @rogierstoel7898
    @rogierstoel7898 Před 8 měsíci

First of all, I've liked your content over the last years! But to be honest, I'm getting a bit of a "too much AMD" vibe (don't want to say fanboy, haha). If you want a graphics card for high-end gaming, you just need an Nvidia (40 series). Ray tracing is real, and new games really shine with it. I'm playing on an (almost) 8K monitor, so I need a 4090. But to be fair, if you want to play the newest single-player AAA games, AMD is really a couple of steps behind.

    • @born2serve92
      @born2serve92 Před 6 měsíci

      where did you get your 8k monitor?

    • @rogierstoel7898
      @rogierstoel7898 Před 6 měsíci

@@born2serve92 I've got the new Samsung Neo G9 57-inch. It's a beast and beautiful to behold.

  • @tomtemple69
    @tomtemple69 Před 8 měsíci +1

Nvidia's bragging points are features that like 5% of people actually use.

    • @DimkaTsv
      @DimkaTsv Před 6 měsíci

Well, it is still a user base.
Like professional rendering performance, where Nvidia is still better. I saw transistor counts for the RTX 4000 series, and the number of transistors is a lot higher than for the AMD card that gets similar or worse performance. What do these additional transistors do? Tensor cores (and the RX 7000 series is actually not bad, if not better than Nvidia, for some types of AI load), RT cores and... CUDA.
The RTX 4000 series gained an impressive performance jump in Blender, presumably because of CUDA improvements. It has its user base; it is smaller than gamers, but it is willing to pay for such a heavy performance increase (not to mention some professional renderers don't even support AMD HIP properly yet, so CUDA becomes the only option). That is one of the reasons the 4090 is THAT much more expensive than other cards now: professionals do not have an alternative. On the other hand, to gain this user base Nvidia throws gamers under the bus, giving them second-rate treatment.

  • @tk9839
    @tk9839 Před 8 měsíci

I hope everyone realizes that GPU MSRPs are a total scam... they should be at most about 3/4 of current prices.

  • @latlanticcityphil
    @latlanticcityphil Před 8 měsíci +1

    I just play games, and AMD works better for me "6900Xt".

  • @Bluelagoonstudios
    @Bluelagoonstudios Před 8 měsíci

VR is only good for watching porn 😆 I used it on a Ratchet and Clank game, and it disappointed me. So that was it for me; even watching a movie gave me headaches after a film of about 3 hours. A bad experience all around.

  • @notjustforhackers4252
    @notjustforhackers4252 Před 8 měsíci +4

Are AMD falling too far behind NVIDIA? NO. AMD absolutely own NVIDIA and Intel... on Linux, where they work better than they do on Windows. Facts told to you by a multi-booter.

    • @xpodx
      @xpodx Před 8 měsíci

AMD is slower. Nvidia has the fastest card.

    • @notjustforhackers4252
      @notjustforhackers4252 Před 8 měsíci

      @@xpodx But not the support.

    • @xpodx
      @xpodx Před 8 měsíci

      @@notjustforhackers4252 support what?

    • @notjustforhackers4252
      @notjustforhackers4252 Před 8 měsíci

      @@xpodx MESA, gamescope, open source.

    • @perlichtman1562
      @perlichtman1562 Před 8 měsíci

It depends a lot on what specifically you are comparing. In many areas AMD is either catching up to or surpassing Nvidia, especially in video editing, where a 7800 XT is overall outperforming a 4070 in both Premiere Pro and Resolve Studio, and in Fusion it even outperforms a 4090 in some tasks. In Blender and Unreal Engine 5 creation there's a big Nvidia advantage. But even in Blender we are talking about an 88% advantage for the 4070 vs. the 7800 XT with HIP-RT, versus a 150% advantage the previous generation with cards at comparable price points, before HIP-RT was an option.
The gap is narrowing, not getting wider.

  • @darreno1450
    @darreno1450 Před 8 měsíci

UE5 will pretty much nullify the need for RT or any other proprietary image quality tech from Nvidia, and many big games are going in that direction. Lumen IQ is excellent and runs great on AMD cards.
Nvidia is riding the Cyberpunk RT/PT fad for now, but that's going to fade away.

    • @plasmahvh
      @plasmahvh Před 8 měsíci

They are closer because the only reason Lumen exists is console "ray tracing", since consoles are too weak to run actual ray tracing. And guess what? Consoles are AMD's domain. In actual ray tracing, Nvidia is still much faster than AMD. Don't confuse Lumen with ray tracing, because it isn't.

    • @darreno1450
      @darreno1450 Před 8 měsíci

@@plasmahvh Whether RT is done in software or hardware hardly matters if the IQ is there. Lumen's IQ is amazing and it doesn't require proprietary hardware, which is my point. It is also faster and is the direction devs are going in.
I mean, the ONLY game people are hyped about RT and PT in is CP. In most other games, turning on RT is pointless. Even CDPR is jumping on the UE5 bandwagon for their upcoming titles.
I see hardware-based RT taking a back seat for a while to come.

    • @perlichtman1562
      @perlichtman1562 Před 8 měsíci

My AMD cards are great for content creation, and in many applications better than my 4070, but when it comes to Blender and UE5 it's not at all like you described. Not only can Lumen be hardware-RT accelerated (to Nvidia's benefit), but there are currently Unreal Engine 5 forks with Nvidia's RTXDI that show insane performance increases when using many light sources (as in over doubling the performance, and sometimes significantly more than that) in native frames without frame generation, using code that won't even run on AMD.
To be clear, the 7800 XT is overall the best-value GPU on the market if you do a lot of video editing, but creating with Unreal Engine 5 and/or Blender is going to run a lot faster on Nvidia.

    • @darreno1450
      @darreno1450 Před 8 měsíci

      @@perlichtman1562 Lumen RT hardware acceleration is slower than software from what I read recently and is not he preferred route right now. Not sure about any UE 5 forks, but at the end of the day, software based RT would always be an option. As long as that's the case I don't see.the point of purchasing Nvidia hardware for RT.

    • @perlichtman1562
      @perlichtman1562 Před 8 měsíci

      @@darreno1450 Epic has been doing great work to make software Lumen get faster when working with lots of lights but if you look in some of the developer and artist (as opposed to gamer oriented) forums you’ll see that especially in the early Lumen days, people were getting better performance when working with a high number of lights (as in up to 750) with Nvidia’s RTXDI enabled in the UE5 editor than with software Lumen alone.
      However, another user recently gave me an important correction that I apologize for getting wrong: while RTXDI is an Nvidia developed technology, Nvidia made it compatible with AMD RX 6000 and up. So it will run on AMD too, though I haven’t seen a performance comparison. My mistake.

  • @andregomes3317
    @andregomes3317 Před 8 měsíci +1

The problem with AV1 is that it's not supported on many streaming platforms, including Twitch. NVENC is still a valuable asset.

    • @Rood67
      @Rood67 Před 8 měsíci

      Excellent point. However, with av1 gaining in popularity, more and more *will likely* adapt to support the technology.
      I stress will likely, because for some, it may never fit their business model or hardware or … whatever they would need to change.

    • @plasmahvh
      @plasmahvh Před 8 měsíci

@@Rood67 I mean, you need AV1-decode-capable hardware, so that will take some time. There are people with ancient hardware, so either (a) the streaming site needs to transcode from AV1 to H.264, or (b) they wait until that hardware is obsolete and nobody uses it. If you want a GPU to stream with, get Nvidia. NVENC is amazing.

    • @OneAngrehCat
      @OneAngrehCat Před 8 měsíci

      TBH Twitch is the problem here. They need to move their asses already. AV1 is the upcoming standard, it's for them to move and start transcoding.

  • @XeqtrM1
    @XeqtrM1 Před 8 měsíci +1

To be honest, no matter how much better AMD got, people would still choose Nvidia.

    • @born2serve92
      @born2serve92 Před 6 měsíci

      I don't believe that's true. Most of us buy based on what fits our need and income level. I have AMD in a laptop, Steam deck, Ps5 and an old 580. I also have a 4070ti in my current main PC. I like 1440p and DLDSR, DLSS, FG etc. I don't care what company or pricing is, but I care if it takes care of my needs at working / gaming equally. Maybe I am in denial and people are cult like about this stuff? I just think it's a loud minority of peeps

  • @offseteraofficialband
    @offseteraofficialband Před 8 měsíci +1

As much as I love a good value product, AMD cards have given me the most problems. I won an IGN sweepstakes last year and got a sweet ITX system with an AM5 CPU and an RX 6900 XT. In less than a year, the GPU completely died. Just gaming, no overclocking. Before that, I had another AMD card, an RX 590, that black-screened after a few minutes of playing a game. Nvidia has been more stable, power efficient, and runs cooler. I will say, though, as a content creator myself, I have never seen a difference between video editing with an AMD graphics card versus an Nvidia one.

  • @stefensmith9522
    @stefensmith9522 Před 8 měsíci

AMD has always been seen as the "underdog", so people look at NVIDIA as the standard and at AMD as this startup company trying to do something close, much like Intel is doing now. Intel is seen as a cute novelty graphics card, which is warranted right now, but AMD has ray tracing performance on par with the 30 series, so... is the 30 series bad at ray tracing? If you say no, then that means AMD isn't bad at it either, lol. And in raw performance AMD is on par with 4080 Ti-ish performance... is that bad performance for $500 less than NVIDIA? The gimmicks admired in NVIDIA are going to slow down GPU advancement. Take frame generation: now these companies can simply toss in a fake frame here and there just to get that fps number up and call it "improved", BUT they will charge us the same price as if they had physically improved the card, which, to me, will slow down innovation on the GPU side and just push better software tricks.

  • @roo79x
    @roo79x Před 8 měsíci

Also, if you use Linux, then use AMD, unless you want to do AI/machine learning, and to some extent transcoding. Jellyfin (on Linux) rates it as Intel → Nvidia → AMD. Plex prefers Nvidia over AMD as well. If you want to say there is no gaming on Linux, then you are not looking at all the facts.

  • @Azureskies01
    @Azureskies01 Před 8 měsíci

If AMD are too far behind, Intel should just give up right now.
These questions are sometimes... dumb.

  • @RCcars2023
    @RCcars2023 Před 8 měsíci

AI is not for gaming; keep gaming legit.

  • @giolrz8237
    @giolrz8237 Před 8 měsíci +1

Absolutely not. AMD is doing the best they can with the little they have. But yeah, I forgot this is NUB.

  • @stevensmith6445
    @stevensmith6445 Před 8 měsíci +3

I went with a 4090 this round because I wanted "The Best". I assure you I will not waste the extra money next time, and will be going back to AMD as SOON as they surpass the 4090 even marginally.

  • @nimaarg3066
    @nimaarg3066 Před 8 měsíci +1

    NVIDIA fanboys are the funniest bunch.

  • @ninjabastard
    @ninjabastard Před 8 měsíci +1

I wanted an AMD GPU for gaming, but I also do AI workloads at home. When all the AI/ML documentation, tutorials and GitHub code are based on using Nvidia CUDA, it's hard to justify getting an AMD card. AMD supposedly works better on Linux due to the drivers shipping with the Linux kernel, but after getting the Nvidia Linux drivers working, I haven't had many problems running games or AI workloads on Linux. I ended up with an Nvidia GPU (3060 Ti), since the AI/ML features were worth the 10% price premium over the equivalent AMD card.

  • @kravenfoxbodies2479
    @kravenfoxbodies2479 Před 8 měsíci +1

Nvidia has priced itself out of the market with the overpriced RTX 3000 and RTX 4000 cards, charging Bitcoin- and AI-boom prices.

  • @Belgsken001
    @Belgsken001 Před 8 měsíci +4

    100%

    • @WyattOShea
      @WyattOShea Před 8 měsíci +1

      Agreed even though I own a full amd system it's undeniable that nvidia just has more and better features than we do.

  • @tldrinfographics5769
    @tldrinfographics5769 Před 8 měsíci

Nvidia has too many uses for content creators to ignore.

    • @perlichtman1562
      @perlichtman1562 Před 8 měsíci

That was definitely the case a year or two ago, but that changed drastically over the last year, especially for video editing. I've been using a 3060, 4070, 6800 XT, 7800 XT and 7900 XTX a ton for work in Premiere Pro, Resolve Studio and Blender, and for video editing the 7800 XT is the best value of the group, hands down. It outperforms the 4070 in the PugetBench extended overall score for both Premiere Pro and Resolve, with an even bigger win in Fusion. In fact, some of the 7800 XT Fusion scores in my testing are higher than you'll find for the 4090 online, though recent updates are likely to improve RTX performance significantly in AI-based tasks like Magic Mask in Better Quality mode.
In Blender, the 4070 is a much better-performing card than even the 7900 XTX, but we are talking about a much narrower gap with the 7800 XT with HIP-RT vs. the 4070 than we had with RX 6000 cards with HIP vs. the RTX 3000 series. Nvidia went from a 150% lead to an 88% lead in best-case scenarios. Although if you want to use many 3rd-party render engines instead of Cycles, then Nvidia has the compatibility advantage.
Still, overall the 7800 XT is one of the best content creator cards unless you primarily do 3D animation. Even some AI video upscaling tasks are faster than on the 4070.

  • @ronchum5178
    @ronchum5178 Před 8 měsíci +1

I only care about DLSS upscaling being better than FSR, but even then FSR is adequate. People might like the idea of DLSS 3 with its fake frames, but it's really more of a selling point than a useful feature.

  • @JayzBeerz
    @JayzBeerz Před 8 měsíci +5

    Yes

  • @striderwhiston9897
    @striderwhiston9897 Před 8 měsíci +1

    The comment at the beginning is like that one "Leave the multi-billion dollar company alone!!!" meme defending the most anti-consumer company in the GPU market

  • @theviewer1423
    @theviewer1423 Před 8 měsíci

lol nah, I don't play Overwatch and Fortnite, yucky mainstream kiddy cartoon games. I play hard-man games like Insurgency: Sandstorm and Darktide.

  • @neofitosmihail4272
    @neofitosmihail4272 Před 8 měsíci

Yes, AMD is falling far behind... oh wait, with FSR 3 frame gen and AFMF, AMD now has more features than Nvidia!!! Even in streaming tools or editing you can use ROCm.

  • @leovanlierop4580
    @leovanlierop4580 Před 8 měsíci

    Energy consumption again not a topic here. Sigh.

    • @urtoodumb7335
      @urtoodumb7335 Před 8 měsíci

So you are buying a GPU based on energy consumption? Wtf. Work harder and pay your electric bill before buying a better GPU.

    • @leovanlierop4580
      @leovanlierop4580 Před 8 měsíci

      @@urtoodumb7335 all you can think of is money?

  • @gejamugamlatsoomanam7716
    @gejamugamlatsoomanam7716 Před 8 měsíci

If AMD keep being complacent with RT, Intel will take second place from them. Let's be realistic: no one is taking No. 1 from Nvidia in the GPU market.

    • @maxwell2288pspgo
      @maxwell2288pspgo Před 8 měsíci +1

      Bro RT is a pipe dream at the moment. One day it will be good, but right now it’s not worth using in most cases. The performance hit even on NVIDIA cards is still too high to be warranted.

    • @gejamugamlatsoomanam7716
      @gejamugamlatsoomanam7716 Před 8 měsíci

@@maxwell2288pspgo Eventually it will be as common as AA. When that occurs, Nvidia will be the front runner, as they have a significant lead in that feature.
AMD is always late to the party: Nvidia came out with DLSS first, then later AMD made FSR; Nvidia made RT cores first, and it took AMD until the 6000 series to have working RT; Nvidia had AV1 for a while, and AMD only got it in the 7000 series; Nvidia launched the 40 series with frame gen, and it took AMD 8+ months of RDNA 3 to get FSR 3 frame gen... on two of the worst-selling PC games.

  • @PlanetSorow
    @PlanetSorow Před 8 měsíci

Nvidia ray tracing this, Nvidia DLSS that.
Start thinking rationally.
AMD has all the features Nvidia has except DLSS 3.5 Ray Reconstruction.
Make a list of UE5 games and benchmark them.
You can see that the 7900 XTX is equal in ray tracing performance to the 4080.
You missed these important points.
You can see clearly in Lords of the Fallen that the 4090 has little advantage over the 7900 XTX.
Clearly the new games are going to be based on UE5, so the real question for the gamer is which graphics card works really well with that engine.
We clearly see how VRAM requirements are going up.
Gamers with a 3060, 3060 Ti or 3070 are now at the bare minimum to play at 1080p, while gamers with a 6700 can still play games.

    • @perlichtman1562
      @perlichtman1562 Před 8 měsíci

      As a content creator that loves the way my AMD cards perform in video editing but prefers creating with Nvidia in Blender and UE5, I think it’s a lot more nuanced than that.
      First of all, Nvidia’s RTXDI performance in UE5 is pretty amazing and on a performance level that AMD isn’t even coming close to yet. That is currently one of the few areas where the gap has been widening as opposed to narrowing.
Second, there are areas where AMD is catching up (like ray tracing in Blender with Cycles), with huge strides across the last generation when combining hardware changes with Blender's expanding HIP-RT support, but we are still seeing an 88% advantage in Nvidia's favor at the same price point.
      Now, for overall mixed use (for instance lots of gaming and video editing with a much smaller amount of 3D modeling and rendering) the 7800 XT is the best value on the market. But if we are mainly talking Blender and UE5, Nvidia still has a big lead.

    • @c523jw7
      @c523jw7 Před 8 měsíci

      7900xtx is not equal to 4080 in ray tracing. Not true.

    • @perlichtman1562
      @perlichtman1562 Před 8 měsíci

@@c523jw7 You're right. There are some light ray tracing scenarios where the 7900 XTX comes out on top, but as the ray tracing complexity increases sufficiently, the 4080 consistently takes the lead, unless VRAM limitations hit, in which case the 7900 XTX sometimes narrows the gap. It's hard to argue that the 7900 XTX is superior in regards to the ray tracing itself: the 4080 is.

    • @c523jw7
      @c523jw7 Před 8 měsíci +1

@@perlichtman1562 Yeah, heavy ray tracing is not that common in games; it's just that when it happens, it really plays to the NVIDIA card.
VRAM-wise, we are not over the 16 GB buffer and won't be for a while; even Hardware Unboxed say it's like 4 to 5 years away, plus the consoles being 16 GB was the telltale sign. Games can allocate more, but usage isn't happening. In the event it goes over 16 GB, it depends on what's pushing it over: if it's heavy RT, the XTX will still struggle because the cores aren't there. But it's struggling for the right reason, not because the manufacturer skimped on VRAM. I actually got a 4080 and was heavily weighing up an XTX, so I thought about this stuff a lot. I do still think 20 GB would never have hurt.
I also think about the 3090, though, and how we didn't even scratch half the VRAM on that card in 3 years from launch... unless you mod. I feel like NVIDIA did the 3080 dirty, but 24 GB is overkill: by the time 24 GB is being utilized, that card is basically obsolete in terms of 4K RT maxed etc.
I just try to see things objectively: if you want to maintain performance, you're upgrading every 3 years or so. I think future-proofing is a lost battle in this PC game... maybe a 4090 does well, but everything else just doesn't.

    • @perlichtman1562
      @perlichtman1562 Před 8 měsíci

@@c523jw7 Yeah, there aren't a lot of game scenarios yet where you need over 16 GB of VRAM, and you're right that developers are likely to optimize for less for a while when the consoles have 16 GB of unified RAM.
My higher-VRAM-usage scenarios are essentially all content creation. I unfortunately had to pick up a 3090 recently because my 4070 doesn't have enough VRAM to run offline Gaussian splat processing (you need at least 24 GB) and my 7900 XTX isn't compatible with the software (it needs CUDA). I've usually been able to optimize my Blender work to fit in 16 GB of VRAM, but unfortunately this was a bit more of a hard limit. 😂

  • @BrianCroweAcolyte
    @BrianCroweAcolyte Před 8 měsíci

The problem with AMD streaming is their H.264 encoder. Even on RDNA 3 it's worse than, like, the Intel HD Graphics 530 found in the i7-6700K. AMD basically just gave up on H.264 and H.265 and is waiting for the clock to run out and everyone to be using AV1 (which is probably going to take another 4-6 years). I'd say the biggest argument for buying Nvidia over Radeon is that they usually hold their value better on the used and PC-flipping market, so the piddly little ~10-20% discount you get by going for a Radeon card usually gets diminished a fair bit if you ever plan on selling it.
Other areas might be different, but when I'm flipping PCs on Facebook Marketplace, the ones with Nvidia GPUs sell within days, while the Radeon ones usually take 1-2 weeks and a lot of negotiating. Not worth it just to save $50-100 AND have fewer/worse features.

  • @mrshinobigaming8447
    @mrshinobigaming8447 Před 8 měsíci +2

Simple answer: YES.
Hard answer: yes, but not in performance.