Why Isn't AMD Innovating With Features Like Nvidia?

  • Added 2 March 2024
  • ► Watch the FULL video here: • DF Direct Weekly #149:...
    ► Support us on Patreon! bit.ly/3jEGjvx
    ► Digital Foundry YouTube: / digitalfoundry
    ► Digital Foundry Merch: store.digitalfoundry.net
    ► Digital Foundry at Eurogamer: eurogamer.net/digitalfoundry
    ► Follow on Twitter: / digitalfoundry

Comments • 672

  • @cbrunhaver
    @cbrunhaver Před 4 měsíci +38

    Nvidia has literally 5X the number of software engineers

    • @Thejacketof-huang
      @Thejacketof-huang Před 4 měsíci

      Ahahahahahhahahahaha in your mind AMBU, in your mind AMDumb 😂😂😂😂😂😂😂😂😂 a clown company -> AMDick

    • @Koeras16
      @Koeras16 Před 4 měsíci

      I've always found it rather surprising that the gap isn't much bigger between those two companies. NVIDIA holding back?

    • @Henrik_Holst
      @Henrik_Holst Před 4 měsíci +1

      @@Koeras16 They will only do exactly as much as they have to; there is no reason to invest even more resources to be, say, 10x better and still only sell the same number of cards.

    • @o_sagui6583
      @o_sagui6583 Před 4 měsíci +1

      ​@@Koeras16we all know what happens when they don't hold back. I'm looking at you GTX 1080

  • @tdrm
    @tdrm Před 4 měsíci +154

    One big AMD GPU innovation was Mantle, which AMD developed in partnership with DICE for Battlefield 4. Later on Mantle was donated to Khronos, tweaked a bit and rebranded as Vulkan.

    • @guts2787
      @guts2787 Před 4 měsíci +23

      And now vulkan is basically dead.

    • @PrefoX
      @PrefoX Před 4 měsíci +7

      And AMD didn't give a fuck about it, let's be real... and it wasn't necessary because DX11 and 12 were on their way. And Mantle is no innovation at all, it has nothing to do with the GPU's software or hardware... it's just an API.

    • @brkbtjunkie
      @brkbtjunkie Před 4 měsíci +35

      @@PrefoX You sound mad about it, what happened between you and Mantle???

    • @pixeljauntvr7774
      @pixeljauntvr7774 Před 4 měsíci

      Is it an innovation?

    • @Beany2007FTW
      @Beany2007FTW Před 4 měsíci +37

      @@guts2787 If you ignore the Steam Deck, perhaps. Proton's translation layer converts DX commands into Vulkan for every game that doesn't have a native Linux version.

  • @FatNorthernBigot
    @FatNorthernBigot Před 4 měsíci +185

    Has anyone else noticed just how tribal the YouTube comments section is? Even picking sides over things such as "GPU manufacturer". Both AMD and Nvidia will gladly take people's money. Why defend one over the other?🤔 People get really angry over this.

    • @Wobbothe3rd
      @Wobbothe3rd Před 4 měsíci +37

      85% of PC Gamers buy Nvidia. Nvidia has clearly won. The overwhelming AMD bias on social media is a reflection of the DISCONNECT between actual gamers in real life and people who comment about games. It's the job of journalists to report REALITY, but social media reflects what the users (ie non-gamers, broke losers, people bitter about their recent purchases) WANT TO HEAR. This is why the entire business model of making free content for ads fails - the people who are clickbaited are usually NOT the real world subjects of the content being discussed.

    • @ratchet2266
      @ratchet2266 Před 4 měsíci +34

      Same with anything (Apple vs Android, PlayStation vs Xbox). People want to justify their purchases and/or feel superior.

    • @FatNorthernBigot
      @FatNorthernBigot Před 4 měsíci +13

      @@ratchet2266 I once criticised the PS5's aesthetic on this very channel. I might as well have recommended the punching of babies.

    • @somegeezer4058
      @somegeezer4058 Před 4 měsíci +12

      ​@@Wobbothe3rd Trap sprung!

    • @outsideredge
      @outsideredge Před 4 měsíci +7

      That's why the best DF videos are the retro ones. You don't get this sort of tribalism looking at retro games.

  • @cbrunhaver
    @cbrunhaver Před 4 měsíci +173

    Mantle/Vulkan was a big AMD contribution

    • @DirranProductions
      @DirranProductions Před 4 měsíci +3

      I love Vulkan titles. Doesn't seem to happen often anymore though.

    • @josephdias5859
      @josephdias5859 Před 4 měsíci +7

      @@DirranProductions But pretty much every game can be modded to use Vulkan, which fixes poor porting for sure.

    • @DirranProductions
      @DirranProductions Před 4 měsíci +3

      @@josephdias5859 yeah, still would be nice to have it come as an option out of the box, monster frame rates on modest hardware.

    • @dpptd30
      @dpptd30 Před 4 měsíci +12

      Yeah sure, over a decade ago, and they have never been ahead of Nvidia since. Nvidia already has AI-powered RTX HDR, VSR, frame generation, denoising (DLSS 3.5 ray reconstruction), and temporal upscaling and anti-aliasing (DLSS/DLAA) well implemented, while AMD has only just announced that they are working on a video upscaling solution that is not even AI-powered. Their FidelityFX anti-aliasing is not even comparable to TAA, let alone DLAA, in terms of quality; Anti-Lag+ won't even be implemented anymore after CS2 players got banned; and FSR 3 still has horrible frame pacing issues, and now, without Anti-Lag+, worse latency as well. They haven't even started to develop a DLSS 3.5-like solution or an RTX HDR solution, and their FreeSync "Premium Pro" isn't really premium or pro considering the pipeline still causes tone mapping issues that result in inaccurate EOTF tracking on HDR monitors. Now that Nvidia's new app is in beta, AMD's Adrenalin will soon no longer be a selling point, leaving slightly cheaper rasterization performance per dollar as their only selling point, which won't be relevant for long considering Nvidia Blackwell is already rumored to be 2.5 times faster in RT, which will make RT a no-brainer to turn on due to very low performance impact, while Radeon has still only sort of caught up with last-gen Ampere's RT performance. Plus AMD recently pulled driver support for Vega, a generation that is not even 7 years old, while Nvidia is still supporting Kepler, a generation that is 12 years old, so the AMD fanboys' "ages like fine wine" argument ages like spoiled milk, especially now that Pascal just received a performance boost in Alan Wake 2 in the latest update even though Pascal doesn't support mesh shaders. On top of all that, Nvidia is now worth more than Google, Amazon, and Facebook, which means they have even more money than before to invest in R&D, not that they need it to dominate the competition in the first place. So GOOD LUCK AMD, you guys need to try much harder than "OuR's Is OpEn SoUrCe!".

    • @Peytonwatson14
      @Peytonwatson14 Před 4 měsíci +2

      @@dpptd30 Even as an AMD GPU user (not a fanboy), you make great points and I really can't argue with them even if I tried.

  • @SvDKILLSWITCH
    @SvDKILLSWITCH Před 4 měsíci +28

    Mantle is probably the largest of AMD's recent contributions to the graphics industry that's affected consumers in a fairly large way, and was also something that they 'pioneered'. From a consumer perspective, Mantle is what laid the groundwork for what would become Vulkan, and Mantle came almost a full two years before Microsoft released DirectX 12 with similar goals (lower-level APIs, reduced driver overhead, better multi-threaded support). Vulkan itself hasn't seen incredibly high adoption on Windows, but it is used extensively as part of Proton (or rather DXVK), which is a critical linchpin in games working as well as they now do on Steam Deck and Linux.
    FreeSync is also a big one, though it could definitely be seen as a response to the original version of G-SYNC; still, VRR being so commonplace now is a huge win for gamers in general IMO.
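
    A rough way to see why the "lower-level API" point above mattered: in the older implicit model the driver re-validates render state on every draw call, while Mantle-style APIs bake state into pipeline objects once up front. The sketch below is only a toy model in Python, not real D3D11/Vulkan code; all the names in it are made up for illustration.

      # Toy model of implicit vs. explicit graphics APIs: state validated per
      # draw (old DX11-style drivers) vs. validated once into a pipeline object
      # (the Mantle/Vulkan/DX12 approach). Illustrative only.
      import time

      STATE_FIELDS = ["blend", "depth", "raster", "shaders", "vertex_layout"]

      def validate(state):
          # Stand-in for the per-draw bookkeeping an implicit driver repeats.
          return sum(hash(state[f]) % 97 for f in STATE_FIELDS)

      def draw_implicit(state, n_draws):
          total = 0
          for _ in range(n_draws):
              total += validate(state)      # re-validated on every draw
          return total

      def draw_explicit(state, n_draws):
          pipeline = validate(state)        # validated once, up front
          total = 0
          for _ in range(n_draws):
              total += pipeline             # draws just reference the baked object
          return total

      state = dict(blend="alpha", depth="less", raster="cull_back",
                   shaders="pbr", vertex_layout="pos_norm_uv")
      for fn in (draw_implicit, draw_explicit):
          t0 = time.perf_counter()
          fn(state, 200_000)
          print(f"{fn.__name__}: {time.perf_counter() - t0:.3f}s")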

  • @Marhan93
    @Marhan93 Před 4 měsíci +45

    AMD has been playing catch-up for a while. I suppose maybe the "largest" step AMD took for the industry recently was the Mantle API with Battlefield 3 or 4 (don't remember which one), which I kinda consider the catalyst that made DX12, Vulkan and Metal end up like they did.

    • @mitchjames9350
      @mitchjames9350 Před 4 měsíci +3

      @@ICeyCeR3Al Don't forget Infinity Cache and Infinity Fabric.

    • @Thejacketof-huang
      @Thejacketof-huang Před 4 měsíci

      ​​@@ICeyCeR3Alwow really? 👾 all copy (bad) and past of technologies from other companys. Nice try AMBug. It's over for trashdeon 💩🤡🧠🚫💀👉🏻🚽

  • @tomsimmons7673
    @tomsimmons7673 Před 4 měsíci +48

    I used to watch Alvin and the Chiplets as a kid.

    • @Dezzyyx
      @Dezzyyx Před 4 měsíci +11

      alvin and the chipsets

  • @rob4222
    @rob4222 Před 4 měsíci +9

    TruForm was game-changing. It opened the door for tessellation in games.

  • @LucidStrike
    @LucidStrike Před 4 měsíci +9

    1:09 CIG confirmed their GI solution is largely based on AMD's approach in that paper you're talking about, so at least Star Citizen / SQ42 will eventually count as games with that implemented.

  • @AntonioAugustoVianaS
    @AntonioAugustoVianaS Před 4 měsíci +3

    Guys, you failed to talk about Mantle!!! AMD gave birth to DX12!

  • @EspHack
    @EspHack Před 4 měsíci +37

    That HD 5970 made Nvidia look like a joke, and they shot back with that barely competitive Fermi furnace of a lineup. Good times.

    • @ETin6666
      @ETin6666 Před 4 měsíci +4

      Good ol' ATI days. AMD should have stuck with the ATI branding; it's been kinda all downhill since the rebrand.

    • @christophervanzetta
      @christophervanzetta Před 4 měsíci +2

      And yet ATI went bankrupt 😂

    • @HunterTracks
      @HunterTracks Před 4 měsíci +10

      ​@@christophervanzettaATI didn't go bankrupt, they were bought out by AMD for $6.8 billion in a corporate merger.

    • @ResidentWeevil2077
      @ResidentWeevil2077 Před 4 měsíci +1

      @@christophervanzetta ATI Technologies was purchased by AMD in 2006; they didn't retire the ATI brand until 2010.

    • @HosakaBlood
      @HosakaBlood Před 4 měsíci

      The 5970 was a dual-GPU card, and not a lot of people liked dual GPU back in the day; enthusiasts still preferred SLI 480s or CrossFire 5870s over a 5970.

  • @Lucromis
    @Lucromis Před 4 měsíci +12

    I remember the CUDA features in the late 2000s made me pay attention to Nvidia features like PhysX. Nvidia just has so much more money, and they invested it more than a decade ago; it will be a while before anyone can catch up, if they ever can. The only AMD one I can think of is being early to DirectX 11.

    • @lePoMo
      @lePoMo Před 4 měsíci

      I do not remember how CUDA and PhysX correlate, so this is not a rebuttal.
      I just want to add that Nvidia bought PhysX, so that part isn't an innovation of theirs. (Not that you said it was.)
      Oh, and AMD gave us:
      * Resizable BAR. Had that been Nvidia, they would have made it proprietary somehow.
      * AMD laid the stepping stones for DirectX 12 and Vulkan with their Mantle API.

    • @darudesandstorm7002
      @darudesandstorm7002 Před 4 měsíci +5

      Resizable BAR wasn't an AMD innovation. It was already in the PCIE spec but no company had made use of it. Sure, they were the first to implement it but they didn't actually develop it.

    • @lePoMo
      @lePoMo Před 4 měsíci

      @@darudesandstorm7002 Agreed. Where do you draw the line, though? G-Sync came after VESA Adaptive-Sync. You could say "it was already in the spec but no one made use of it".

  • @crypto1300
    @crypto1300 Před 4 měsíci +46

    AMD was first with a mainstream HBM implementation, FreeSync was huge for me because G-Sync was way too expensive, chiplets are innovative, and Xenos unified shaders were a first too.
    Edit: To be fair, DLSS and FG are amazing tech. I run a 7950X3D/4090 and a 7950X/7800 XT, and to be fair both systems are great! I usually can't even tell the difference between them while I'm gaming.

    • @Jaaxfo
      @Jaaxfo Před 4 měsíci +3

      Not just first with HBM, they spearheaded the development of the technology

    • @mitchjames9350
      @mitchjames9350 Před 4 měsíci +1

      They developed HBM, and also Infinity Cache, Infinity Fabric, async compute, and the general direction of GPU architecture that Nvidia borrowed from GCN for Turing onwards.

    • @Thejacketof-huang
      @Thejacketof-huang Před 4 měsíci

      It's over AMDumbs, it's over 💀

  • @mitchjames9350
    @mitchjames9350 Před 4 měsíci +2

    People forget that HBM, Infinity Cache and Infinity Fabric, async compute, and the general direction of today's GPU architectures came from AMD.
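
    For anyone curious about the HBM point, the peak-bandwidth arithmetic is simple to check. The figures below are the commonly quoted specs for the first HBM consumer card (Fury X-class HBM1) and a contemporary 384-bit GDDR5 card, used purely as an illustration.

      # Rough peak-bandwidth math behind HBM's wide-but-slow approach.
      # Illustrative figures: Fury X-class HBM1 vs. a 384-bit GDDR5 card.
      def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps_per_pin):
          return bus_width_bits * data_rate_gbps_per_pin / 8  # bits -> bytes

      print("HBM1,  4096-bit @ 1 Gbps/pin:", peak_bandwidth_gbs(4096, 1.0), "GB/s")
      print("GDDR5,  384-bit @ 7 Gbps/pin:", peak_bandwidth_gbs(384, 7.0), "GB/s")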

  • @andremalerba5281
    @andremalerba5281 Před 4 měsíci +2

    How about Mantle/Vulkan and TressFX?

  • @Belfoxy
    @Belfoxy Před 4 měsíci +3

    AFMF + anti-lag would probably be amazing if they could get it working on Steam Deck. Fast mouse movements can break it on desktop, but the slower gamepad controls would mix much better with it.

    • @MajesticBowler
      @MajesticBowler Před 4 měsíci

      You can use AFMF + Anti-Lag on other handhelds like the Asus ROG Ally. The latest Windows drivers from AMD support that option. The driver for the Z1 Extreme is not published yet, but you can use the driver for the 780M (it is a rebranded Z1 Extreme).

    • @Thejacketof-huang
      @Thejacketof-huang Před 4 měsíci

      AFMF + antilag are trash technologies and won't work on low end hardware like steam deck. Stop thinking AMD can save peoples or magic exists

    • @mojojojo6292
      @mojojojo6292 Před 4 měsíci +1

      Afmf is crap. The movement doesn't have to be that fast for it to break.

    • @Thejacketof-huang
      @Thejacketof-huang Před 4 měsíci

      LoL AFMF + antilag ahahahhahahahahshsha what gimmick useless copy paste technologies ahahahahahah
      Cope

  • @sergiu6650
    @sergiu6650 Před 4 měsíci +4

    Thank you AMD for giving me frame generation technology on my RTX 3080, because Nvidia is innovating 😂😂😂

    • @vasilije94
      @vasilije94 Před 4 měsíci

      Frame generation that barely works and is not even worth the image degradation. Be for real. Same as when people brag that they have FSR on their RX 580. Like someone is going to upscale 720p to 1080p, with FSR no less...

  • @planckstudios
    @planckstudios Před 4 měsíci +24

    AMD's market is consoles. They provide products in 5-7 year intervals. There is no room to fail here. They are married to each partner - working within their boundaries to perform on closed systems. Nvidia can take chances because they are not dependent on these restraints

    • @delaorden
      @delaorden Před 4 měsíci

      That's only because Nvidia doesn't produce x386 CPUs and Intel GPUs have been trash. Even in the past generation AMD didn't have the best GPU or the best CPU, but they had both in an APU.
      NVIDIA doesn't seem much interested in the ARM standard since they were blocked from buying Arm, but they have made money with the Switch and are very much present.

    • @coffee7180
      @coffee7180 Před 4 měsíci

      @@delaorden Intel CPUs have been good since the beginning of the 8th gen; the thing is that AMD has just been so much better since the release of Zen+.

    • @BigHomieTyreese
      @BigHomieTyreese Před 4 měsíci +1

      There's a lot more to the GPU market than the relative few that end up going into gaming consoles.

    • @wamba2097
      @wamba2097 Před 4 měsíci

      ​@@delaorden To be fair nobody is producing 40 year old 386 CPUs 😂

    • @redclaw72666
      @redclaw72666 Před 4 měsíci

      You do realize Xbox is ditching AMD for their next console lol

  • @badmoose01
    @badmoose01 Před 4 měsíci +4

    This is kind of a noob question, but could AMD put 3D V-Cache in a GPU?

    • @veda9151
      @veda9151 Před 4 měsíci

      If rumours are true, they can. Leaks suggested there was a version of Navi 31 with V-Cache, but for whatever reason it didn't go into production. Maybe it's yield, or maybe it's not as beneficial as it costs. But it shouldn't be surprising that AMD did explore that route.

    • @Thejacketof-huang
      @Thejacketof-huang Před 4 měsíci

      It would be an useless think cuz rdna are useless peace of garbage arc. Need to re-invent all arc

    • @falcone5287
      @falcone5287 Před 4 měsíci +2

      Well, AMD introduced Infinity Cache with their RX 6000 series GPUs; top-of-the-line models got 128 MB of Infinity Cache. I think it's not beneficial enough to double that amount with 3D V-Cache, because 128 MB is already a lot. In fact, in the RX 7000 series GPUs AMD even reduced the amount of cache. Adding V-Cache would also increase GPU temperatures under load.
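
      A back-of-envelope sketch of why a big last-level cache changes the math, and why doubling it has diminishing returns once hit rates are already high. The hit rates and bandwidth numbers below are illustrative assumptions, not AMD figures.

        # Effective bandwidth with a large on-die cache in front of VRAM.
        # Hit rates and bandwidths are illustrative assumptions only.
        def effective_bandwidth_gbs(hit_rate, cache_gbs, vram_gbs):
            return hit_rate * cache_gbs + (1 - hit_rate) * vram_gbs

        VRAM_GBS, CACHE_GBS = 512, 2000   # hypothetical GDDR6 and cache rates
        for hit_rate in (0.0, 0.4, 0.6, 0.8):
            eff = effective_bandwidth_gbs(hit_rate, CACHE_GBS, VRAM_GBS)
            print(f"hit rate {hit_rate:.0%}: ~{eff:.0f} GB/s effective")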

  • @LockeLeon
    @LockeLeon Před 4 měsíci +32

    They should focus on stuff like APUs, Ryzen-Radeon synergy, or making GPUs with tons of GDDR memory for local AI.

    • @Fusion05
      @Fusion05 Před 4 měsíci +3

      I wonder if it'd be viable to load a card with tons of older GDDR for AI vs newer, more expensive GDDR? I'm not quite sure how intensive AI is, so not sure if this idea would work (a rough VRAM estimate is sketched after this thread).

    • @christophervanzetta
      @christophervanzetta Před 4 měsíci +1

      You mean copy NVIDIA like they have been copying Intel?
      BOOOOORRRRRRING

    • @PrefoX
      @PrefoX Před 4 měsíci +1

      local AI.. for those 1000 people who need it xD ya you are no good business person, u would ruin AMD in a day lol

    • @LockeLeon
      @LockeLeon Před 4 měsíci +1

      @@PrefoX localllama subreddit has 127k users

    • @deeplerg7913
      @deeplerg7913 Před 4 měsíci

      @@LockeLeon they won't do the VRAM stuff. Not happening, period. They're selling their own datacenter GPUs at a big markup, just like Nvidia.
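
      On the VRAM question raised earlier in the thread: a very rough sizing rule is weights ≈ parameter count × bytes per parameter, before KV cache and other overhead. The parameter counts and precisions below are just examples.

        # Rough VRAM needed just for model weights at different precisions.
        # Real usage adds KV cache, activations and framework overhead.
        def weights_gb(params_billion, bits_per_weight):
            return params_billion * 1e9 * bits_per_weight / 8 / 1e9

        for params in (7, 13, 70):
            for bits in (16, 8, 4):
                print(f"{params}B model @ {bits}-bit: ~{weights_gb(params, bits):.1f} GB")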

  • @Aetohatir
    @Aetohatir Před 4 měsíci

    Wasn't FreeSync earlier than g-sync?

  • @-szega
    @-szega Před 4 měsíci

    2:20 And cuDNN was released almost ten years ago (2014).

  • @skywalker1991
    @skywalker1991 Před 4 měsíci +23

    Realistically AMD only makes about 3 billion from CPU and GPU stuff and 2.5 billion from FPGAs, while Nvidia made close to 22 billion last quarter from GPUs alone.
    So AMD has a lot less money for R&D, and it is also shared between the CPU team and the GPU team.
    Unless their AI stuff takes off, but it's gonna be slow; AMD will make maybe 8 billion extra over the next 2 years from AI, and maybe then R&D gets bumped up. But AMD can't chase Nvidia, it's too big now.

    • @Wobbothe3rd
      @Wobbothe3rd Před 4 měsíci +7

      This is just backwards rationalization and begging the question. Obviously Nvidia didn't START with that much market share, the question is WHY did Nvidia innovate while AMD stagnated!? Nvidia has been investing in R&D for decades before it made them more profitable.

    • @ktcool4660
      @ktcool4660 Před 4 měsíci +1

      Nvidia made close to 22 billion last quarter from generative AI accelerators.

    • @skywalker1991
      @skywalker1991 Před 4 měsíci +3

      @@Wobbothe3rd Nvidia GPUs happened to be better at running generative AI; it was not planned by Nvidia years ago. Nvidia got lucky with OpenAI, while AMD and Intel were focusing on server CPUs. This generative AI stuff was lucky, honestly.
      I wouldn't give too much credit to Nvidia when they accidentally got lucky. Now, though, every other company is focusing on custom chip designs to run their own version of generative AI, so Nvidia's expensive GPUs won't be required as much; I can see a decline in revenue for Nvidia in a few years, even Wall Street is saying it.
      I own Nvidia shares and AMD as well; soon I will sell the Nvidia shares because they have hit their peak. AMD is just getting started; I bought AMD shares at $52 a share, now at $200+, and AMD should hit $400 to $500 in a year with AI. I'm just shy of a million, but with AMD I see a few million made.
      Love both companies because they made me tons of money.

    • @pheria
      @pheria Před 4 měsíci +8

      @@Wobbothe3rdYou sort of answered your own question when you said it can take decades to pay off. Throughout most of the 2010's AMD was struggling financially much worse than Nvidia and also much worse than Intel, they simply couldn't afford to do as much (and yet they still did some - Mantle, TressFX, etc) The times mentioned in the video around remembering DX11 launch innovation were just before that. Remember that Radeon is a smaller part of AMD too. It's a miracle Radeon have been able to remain competitive on the budget and headcount they have compared to Nvidia, but I do hope more investments for Radeon by AMD overall (and thus innovations) are in the pipe now that the company overall has been doing much better.

    • @christophervanzetta
      @christophervanzetta Před 4 měsíci +1

      @@skywalker1991 By your own statement AMD isn't going to get anywhere in AI because they will already have been beaten to the punch…
      Not sure how you can't see the contradiction in your own statement.

  • @shremk9100
    @shremk9100 Před 4 měsíci +10

    They've had a few smaller things they've done first recently, of course nothing that I can remember to the scale of DLSS or any of that ML stuff from Nvidia. IIRC, they were first with Radeon Anti-lag in 2019 before Nvidia took it a step further with Reflex. They also initially popularized the usage of resizable bar in 2020, and they have some cool driver features like Radeon Boost (though I'm pretty sure this was originally developed by a smaller company whom they acquired) and of course more recently AFMF.
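
    For context on the Anti-Lag/Reflex mention above: both work by limiting how far the CPU is allowed to run ahead of a GPU-bound renderer, since each queued frame adds roughly one GPU frame time of input latency. The sketch below is a deliberately crude model with made-up numbers, not either vendor's actual algorithm.

      # Crude model of render-queue latency: when GPU-bound, each frame the
      # CPU may queue ahead adds ~1 GPU frame time between input and display.
      def input_latency_ms(gpu_frame_ms, queued_frames):
          return (queued_frames + 1) * gpu_frame_ms  # +1 for the frame in flight

      GPU_FRAME_MS = 16.7  # ~60 fps, GPU-bound
      for queued in (3, 2, 1, 0):
          ms = input_latency_ms(GPU_FRAME_MS, queued)
          print(f"{queued} queued frame(s): ~{ms:.0f} ms input-to-display")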

  • @sanmattew7221
    @sanmattew7221 Před 4 měsíci +20

    It still baffles me how, despite YEARS passing by, they still have not come up with a DLSS-level upscaler. Even Intel managed to do it, and they have only just (re-)started their dGPU business....

    • @whatistruth_1
      @whatistruth_1 Před 4 měsíci +3

      They are determined and believe that a hardware agnostic upscaler is possible.

    • @Peytonwatson14
      @Peytonwatson14 Před 4 měsíci

      @@whatistruth_1What does this mean exactly?

    • @demrasnawla
      @demrasnawla Před 4 měsíci +4

      XeSS definitely isn't DLSS-level, it's more or less tied with FSR 2.1

    • @DeepteshLovesTECH
      @DeepteshLovesTECH Před 4 měsíci +10

      @@demrasnawla That's a lie. XeSS on Intel Arc is on par with DLSS in several games. FSR 2.1 is not even close; it can only somewhat match the DP4a fallback of XeSS.

    • @gavinderulo12
      @gavinderulo12 Před 4 měsíci +11

      @@demrasnawla You are talking about the DP4a path. If you actually run it on Arc GPUs, which have XMX cores, it's definitely closer to DLSS than FSR.
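
      Whichever upscaler looks better, the performance argument is the same for all of them; the arithmetic below uses the per-axis scale factors commonly quoted for the DLSS/FSR/XeSS quality presets to show how many pixels are actually shaded at a 4K output.

        # Internal render resolution and pixel savings for the per-axis scale
        # factors commonly quoted for upscaler presets, at a 3840x2160 output.
        OUT_W, OUT_H = 3840, 2160
        presets = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0, "Ultra Perf.": 3.0}
        for name, scale in presets.items():
            w, h = round(OUT_W / scale), round(OUT_H / scale)
            saving = 1 - (w * h) / (OUT_W * OUT_H)
            print(f"{name:12s}: {w}x{h} internal, ~{saving:.0%} fewer pixels shaded")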

  • @sapphyrus
    @sapphyrus Před 4 měsíci +27

    As much as I dislike Jensen's profit-maximizing approach, it's pretty much a given that Nvidia has had the best engineers and long-term innovation R&D department for over 2 decades. They literally created what we know today as the GPU. Then they moved on to GPGPU and CUDA, and invested in universities using their systems. They tried physics simulation etc. that didn't always stick, but if you go through the history of 3D graphics, it's pretty much their lineup that carried it after 3dfx's initial 3D accelerators. And they started doing that when they didn't have expendable cash like today. They don't rest on their laurels like Intel did, so AMD is always playing a catch-up game where the gap has widened recently.

    • @Wobbothe3rd
      @Wobbothe3rd Před 4 měsíci +3

      Investing in R&D isn't "profit maximizing"; Jensen pissed off Wall Street for DECADES precisely by LOWERING MARGINS to invest in CUDA and other technologies that eventually enabled AI. That's the EXACT OPPOSITE of "profit-maximizing", at least in the short term.

    • @kalaichelvan
      @kalaichelvan Před 4 měsíci

      The only, but major, problem that I have with Nvidia GPUs is their cost.

    • @adamdunne6645
      @adamdunne6645 Před 4 měsíci +7

      @@Wobbothe3rd Overpricing GPUs and shifting SKU naming to get more money for less of a performance bump is absolutely profit maximizing.

    • @ktcool4660
      @ktcool4660 Před 4 měsíci +5

      @@Wobbothe3rd "CUDA and other technologies that eventually enabled AI": no, it did not. Jensen was chasing blockchain. AI was enabled by independent researchers using CPUs. Dude just got lucky.

    • @Wobbothe3rd
      @Wobbothe3rd Před 4 měsíci

      @@kalaichelvan The 4060 is $300. Cheaper than the 3060.

  • @Jaaxfo
    @Jaaxfo Před 4 měsíci +10

    AMD put out a whitepaper on how to do AI ray reconstruction about 1-2 months before Nvidia announced the feature. I'm not saying that Nvidia stole the tech, it's far more likely that they were parallel developments, but it's interesting how AMD can figure out a feature, and then Nvidia beat them to the market with it. It also doesn't help that Nvidia likes to develop proprietary tech and then make it mandatory, forcing AMD to play catch-up. But such is the nature of the beast, AMD has to step up and do the same, and its partnerships with PlayStation and Xbox should be where they put out these types of new features first

    • @JimBob1937
      @JimBob1937 Před 4 měsíci +3

      Oh yes, you can bet that stuff was years in the making at both companies. Nvidia was likely just a bit secretive since they were near market release. With that said, the idea of AI ray reconstruction isn't new at all, even before Nvidia released it. When Nvidia first released ray tracing hardware, they hinted at AI-based solutions like that, since it coincided with DLSS tech. So, I get you, but that's not a good example at all, haha.

    • @JimBob1937
      @JimBob1937 Před 4 měsíci

      "develop proprietary tech and then make it mandatory"
      Which tech is this? Don't confuse a proper engineering integration of software and hardware as proprietary. If your hardware team can lower the computation threshold of your software solutions, that's just a no brainer. Ray tracing wasn't proprietary either, they were just first to market with proper ASIC hardware.

    • @Jaaxfo
      @Jaaxfo Před 4 měsíci +1

      @@JimBob1937 I had a long writeup, but YouTube ate it because I clicked ever so slightly outside the notification box. I'll summarize what I had, as it was too long anyway.
      Nvidia makes implementations of technologies without making them open or any sort of official standard. They then push these technologies into popular use to create a seemingly huge advantage - so long as the specific feature is enabled. Two of the biggest examples are RT and tessellation. Neither are conceptually theirs, but their implementation was heavily tied to the hardware it was developed for (because why wouldn't it be). Nvidia has also put a load of effort into working with developers to use their versions of the tech so that even if their competitors do create a viable implementation of their own, it still won't be able to use the Nvidia optimized code as well as Nvidia's hardware can.
      So mixing the popularity of the feature that they created with the closed implementation, they create an environment where their competitors have to play catch-up, and make it hard for them to ever catch-up. This doesn't just affect AMD, it's also affecting Intel. Arc drivers are going through so much per-game tuning in part because of all the Nvidia (and to some extent) AMD optimizations and assumptions in every game's code.
      And like I said in my original post, it's really up to AMD (or Intel) to figure out some feature that they can implement well before Nvidia, and that they can likewise make into a "must have" feature that Nvidia struggles to catch up on. The last time AMD had anything like that would have to be Vulkan, where for quite some time we could see Vulkan and DX12 games run better on AMD GPUs than on Nvidia GPUs, but Nvidia closed that gap fast enough as AMD didn't spend the time making these APIs "must haves"

    • @gavinderulo12
      @gavinderulo12 Před 4 měsíci +3

      The original ray reconstruction paper is called "Interactive Reconstruction of Monte Carlo Image Sequences using a Recurrent Denoising Autoencoder" and was published in 2017 by Nvidia. They have been working on it for a while.

    • @Jaaxfo
      @Jaaxfo Před 4 měsíci

      @@gavinderulo12 that's good to know, thank you. I knew that they would have been working on it well before, but now I have an actual timeline
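
      For anyone wondering what these denoisers actually do: the classical baseline they build on is temporal accumulation, i.e. blending each new noisy frame into a running history. The toy below shows that idea on a single pixel, with random noise standing in for Monte Carlo samples; it is not the recurrent-autoencoder method from the 2017 paper.

        # Temporal accumulation (exponential moving average), the classical
        # baseline that learned denoisers improve on. Toy single-pixel example.
        import random

        def accumulate(noisy_frames, alpha=0.2):
            history = noisy_frames[0]
            for frame in noisy_frames[1:]:
                history = alpha * frame + (1 - alpha) * history  # blend new sample in
            return history

        TRUE_VALUE = 0.5
        frames = [TRUE_VALUE + random.gauss(0, 0.3) for _ in range(64)]
        print("last single noisy frame:", round(frames[-1], 3))
        print("temporally accumulated :", round(accumulate(frames), 3))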

  • @bigyundol3598
    @bigyundol3598 Před 4 měsíci

    AFMF is an interesting new feature.
    But yes, they have a way slower pace. They work mostly to establish standards, which takes longer than sticking some proprietary things together.

  • @nickhowatson4745
    @nickhowatson4745 Před 4 měsíci +1

    Microsoft, AMD and Nvidia have come together in a partnership to create a universal standardized Upscaling API called "DirectX Super Resolution" or "DirectSR" which will supersede DLSS, FSR and XeSS and finally end the upscaling wars.

  • @eugkra33
    @eugkra33 Před 4 měsíci

    Is there a reason why they can't leverage the integrated graphics on the Ryzen 7000 series CPUs for something? Maybe let it do some more image enhancement when FSR3 is enabled. If FSR3 like DLSS3 keeps an extra frame in the buffer, why not let the APU enhance that image if there is an entire extra frame worth of frame time to do so? Let it do the entire FSR2.2 upscaling even.

    • @WeeGeeX
      @WeeGeeX Před 4 měsíci

      The reason is that software developers won't actually take advantage of the hardware when only a subset of PC users have Ryzen 7000.

  • @a-qy4cq
    @a-qy4cq Před 4 měsíci +240

    Nvidia gets all the hate but if AMD was the only player on the block we'd be a decade behind.

    • @Wobbothe3rd
      @Wobbothe3rd Před 4 měsíci +65

      Nvidia only gets hate in the social media echo chamber. In REALITY, 85% of PC gamers buy Nvidia GPUs.

    • @notenjoying666
      @notenjoying666 Před 4 měsíci +7

      ​@@Wobbothe3rdYou mean 76%?

    • @ReaperCet
      @ReaperCet Před 4 měsíci +31

      If we had a lower ceiling and games were designed within that ceiling, I'm sure we'd have a better gaming experience overall.
      Over the last decade the ceiling has been raised in ways that aren't transformative but drastically increase costs for everyone.

    • @WendelTC
      @WendelTC Před 4 měsíci +52

      If AMD was the only player on the block they would have much more money to invest in RnD

    • @rla9889
      @rla9889 Před 4 měsíci +22

      Absolutely. Every time I’ve bought an AMD graphics card I’ve regretted it and ended up replacing with NVIDIA. I’ll never buy another AMD card again, they just don’t have the features to make it worth while.

  • @HendrixRipoff
    @HendrixRipoff Před 4 měsíci +6

    Not gonna lie, I kind of miss 3dfx right now

  • @Johnny_Ohio
    @Johnny_Ohio Před 4 měsíci

    Can someone tell me if a Lenovo RTX 4090 gaming laptop is as good as an Alienware 4090? Or is an Alienware a better laptop brand in general?

    • @christophervanzetta
      @christophervanzetta Před 4 měsíci

      If you are going to spend that much you might as well buy a desktop ….
      GPU’s and CPU’s in laptops have been and will always be severely power limited and overpriced for their weak performance

    • @Johnny_Ohio
      @Johnny_Ohio Před 4 měsíci

      @@christophervanzetta Well, I am new to PC gaming and I went to Best Buy and bought a Lenovo RTX 4050; the normal price was $1099 and it was on sale for $780 or something. My computer buddy also put 32 GB of RAM in it for me. Do you think this RTX 4050 is fast enough to play most AAA games that I download and buy from Steam? At least until I get a more powerful desktop in the future.

    • @alexzandermanning6094
      @alexzandermanning6094 Před 4 měsíci +4

      Lenovo is better than Alienware

    • @Johnny_Ohio
      @Johnny_Ohio Před 4 měsíci

      @@alexzandermanning6094 OK, that's what I was trying to figure out. Is the Lenovo 4050 gaming laptop that I bought fast enough to play most AAA games that I buy on Steam? I bought a Lenovo RTX 4050 and my buddy put 32 GB of RAM in it. It was on sale for $780, normally $1099, from Best Buy.

    • @HunterTracks
      @HunterTracks Před 4 měsíci

      Lenovo on average tend to be slightly better than the price competition in terms of cooling, build quality and layout imo, whereas Alienware tends to be quite a bit worse. That said, I'd advise you to look at reviews before you make a call either way, that's a lot of money to throw away on any purchase.

  • @Mottbox
    @Mottbox Před 4 měsíci +2

    AMD seems far more passionate about the data centre than consumer gaming these days. I think Lisa enjoys the CPU side more than GPU.

    • @Thejacketof-huang
      @Thejacketof-huang Před 4 měsíci

      Yes of course and are better for AMD

    • @HapPawhere
      @HapPawhere Před 4 měsíci

      Competing with NVIDIA in GPUs is not healthy because NVIDIA takes most of the market most of the time. Meanwhile, the CPU market is healthier for both Intel & AMD.

  • @DeanClean42069
    @DeanClean42069 Před 4 měsíci

    I just find it crazy that our modern desktop PCs nowadays are performing 40 trillion FP16 FLOPS and people are acting totally unaware that their modern systems have similar processing power to the NEC Earth Simulator supercomputer that used to take an entire warehouse to hold. Now it fits under your desk.
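
    The comparison roughly holds, with one caveat worth spelling out: the Earth Simulator figure (roughly 36-41 TFLOPS in 2002) was double precision, while the 40 TFLOPS desktop figure is half precision, so it is not an apples-to-apples ratio. Approximate numbers below, for illustration only.

      # Approximate figures only; note the precision mismatch (FP64 vs FP16).
      EARTH_SIMULATOR_FP64_TFLOPS = 36   # 2002 Linpack result, roughly
      DESKTOP_GPU_FP16_TFLOPS = 40       # the figure quoted in the comment
      ratio = DESKTOP_GPU_FP16_TFLOPS / EARTH_SIMULATOR_FP64_TFLOPS
      print(f"desktop FP16 vs Earth Simulator FP64: ~{ratio:.1f}x (mixed precisions)")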

  • @DSRT888
    @DSRT888 Před 4 měsíci +1

    Anyone wonder what happened to PhysX?

    • @Melsharpe95
      @Melsharpe95 Před 4 měsíci

      Nvidia made it open source years ago and it's bundled in with all major gaming engines such as Unreal and Unity.
      Basically any Nvidia gpu can run it in hardware natively whereas other cards can use a software implementation.
      PhysX hasn't needed a separate card to run it for decades, which is why it basically fizzled out as a concept.

    • @DSRT888
      @DSRT888 Před 4 měsíci +5

      @@Melsharpe95 The current set of games today have mediocre physics quality in comparison to PhysX games 10 years ago.

    • @Melsharpe95
      @Melsharpe95 Před 4 měsíci +2

      @@DSRT888 I fully agree.

    • @RedEverything
      @RedEverything Před 4 měsíci

      ⁠@@DSRT888Someone made a video running PhysX (Batman Arkham) on a 4090 and even with all that power it still butchered the framerate.

    • @DSRT888
      @DSRT888 Před 4 měsíci

      @@RedEverything Awesome I'll check it out.

  • @Jeremy-WC
    @Jeremy-WC Před 4 měsíci

    I think AMD's path is to figure out how to leverage their console partnerships for better default game optimization. I heard on a podcast that the PC port of God of War had a ton of AMD optimizations stripped out to optimize for NVIDIA. While this is understandable, AMD needs to figure out how to have an easy port option for their graphics. That said, with the NVIDIA dominance it's hard to get your exclusive features adopted. I don't know if FreeSync would have taken off if NVIDIA had not first had G-Sync.

  • @ultrahd3388
    @ultrahd3388 Před 4 měsíci +1

    They need to invest more, and even then it looks like it's too late to catch up to Nvidia.

  • @brkbtjunkie
    @brkbtjunkie Před 4 měsíci

    “Good artists copy, great artists steal” -Picasso

  • @garytallowin6623
    @garytallowin6623 Před 4 měsíci +1

    Vulkan and DX12 are ALL because of AMD... both share a lot of their roots in Mantle.

  • @eugkra33
    @eugkra33 Před 4 měsíci +8

    The thing is that FSR 3 upscaling and frame generation work on the RX 5000 series. AMD actually could have built all this tech 4 years ago, but didn't.

    • @josephdias5859
      @josephdias5859 Před 4 měsíci

      it works on the rx 480 even

    • @GameslordXY
      @GameslordXY Před 4 měsíci +1

      Perhaps they did, but it wasn't ready for the general public.
      Kept it very tight-lipped under many, many NDAs.

    • @phattjohnson
      @phattjohnson Před 4 měsíci +1

      @@GameslordXY NDA's? What are you even talking about? AMD potentially loses money and brand confidence by not putting new features out and making their cards more attractive to enthusiasts.. that's the point of this.

    • @ishiddddd4783
      @ishiddddd4783 Před 4 měsíci +5

      @@phattjohnson DLSS was garbage back when it was released for Turing, let alone FSR 1 when it came out in 2021. This stuff isn't developed in a day; it takes a while to get software working properly, especially FSR, which needs tweaking by the developers to get the best visual quality possible. Thinking FSR 3 could've come out in 2019 is just dumb when it wasn't even ready in 2022.
      Just because something works on older tech doesn't mean it was even close to being ready, or even existed, back then. Ray tracing can run on normal shaders, but the results are horrible, so it's disabled by default; async compute has been a thing for a while, but the fact that it exists doesn't mean stuff that uses it existed back then.

  • @bes12000
    @bes12000 Před 4 měsíci +20

    Driver-level frame generation (AFMF) is pretty innovative AMD tech, and it works well if you set it up correctly. Also, AMD is focused on console and server tech, and there are lots of innovations there; I feel their desktop GPU side of things is more of an afterthought, or else they would actually TRY to beat Nvidia on the high end. (A rough frame-pacing sketch follows this thread.)

    • @HunterTracks
      @HunterTracks Před 4 měsíci +7

      @@RicochetForce ...After FSR 3 and AFMF came out.

    • @bes12000
      @bes12000 Před 4 měsíci

      @@RicochetForce You don't know what driver-level frame generation even is, do you? (It's actually called AMD Fluid Motion Frames.) You have to have the new driver installed and run a DX11/12/Vulkan game with the specific settings configured, and there is a slider IN THE DRIVER to enable frame generation; I was a beta tester for it. It's specifically for AMD users of 6000 and 7000 series GPUs and whatever the APU is in the new handhelds, 700 series I think.

    • @eugkra33
      @eugkra33 Před 4 měsíci +2

      Yeah, and it turns itself off when you need it most, and turns itself back on when you don't need it. Off during fast motion, on for still images.

    • @bes12000
      @bes12000 Před 4 měsíci

      @@eugkra33 That's why AMD recommends you have a 60 FPS base frame rate for it to work correctly (45 is fine too).

    • @bes12000
      @bes12000 Před 4 měsíci

      @@RicochetForce Yeah ok, keep drinking that Kool Aid
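
      The frame-pacing sketch mentioned above: frame interpolation roughly doubles the presented frame rate, but input response still tracks the base rate, which is why a 45-60 fps base is usually recommended. Purely illustrative arithmetic, not AMD's published figures.

        # Why the base frame rate matters for interpolation (AFMF/FSR 3 FG):
        # presented rate ~doubles, but real frames (and input response) still
        # arrive at the base rate. Illustrative arithmetic only.
        def frame_gen_summary(base_fps):
            base_ms = 1000 / base_fps
            return base_fps * 2, base_ms, base_ms / 2  # presented fps, real gap, presented gap

        for base in (30, 45, 60):
            presented, real_ms, shown_ms = frame_gen_summary(base)
            print(f"{base} fps base -> ~{presented} fps presented, real frames every "
                  f"{real_ms:.1f} ms, a frame shown every {shown_ms:.1f} ms")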

  • @MajesticBowler
    @MajesticBowler Před 4 měsíci +1

    Biggest innovation from AMD? The Ryzen Z1 Extreme, the first APU designed for PC handhelds. Most PC users play games on really old 60Hz monitors without VRR. Handhelds like the ROG Ally show how a modern PC should look: a fast and very silent device with a 120Hz VRR display, FSR 3 frame generation, full support for Anti-Lag, and perfect colors (eRGB 100%).
    I get 70-80 fps in Starfield at 1080p at medium details (FSR 3 performance with frame generation active). Ghosting is not visible. Pixel density on screen is about 3x higher than on a standard 24" 1440p PC monitor, and you can't see ghosting on a "retina class" display (300+ PPI); see the PPI math after this thread.

    • @Thejacketof-huang
      @Thejacketof-huang Před 4 měsíci +1

      Okok but now stop cry with your trashdeon and get a better card like RTX. AMD copy bad and paste what nvidia invent
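
      The PPI math referenced above, assuming a 7-inch 1080p handheld panel (ROG Ally-class) against a 24-inch 1440p desktop monitor; the ratio comes out to roughly 2.5-3x.

        # Pixels-per-inch for a 7" 1080p handheld vs. a 24" 1440p monitor.
        def ppi(width_px, height_px, diagonal_in):
            return (width_px ** 2 + height_px ** 2) ** 0.5 / diagonal_in

        handheld = ppi(1920, 1080, 7.0)
        desktop = ppi(2560, 1440, 24.0)
        print(f"handheld ~{handheld:.0f} PPI, desktop ~{desktop:.0f} PPI, "
              f"ratio ~{handheld / desktop:.1f}x")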

  • @EthelbertCoyote
    @EthelbertCoyote Před 4 měsíci +1

    AMD's "lagging" on features is likely strategy. Being new is a tax that you don't have to pay when you do it second, and going second you have a chance not to make the same mistakes. As a smaller company, now more than usual with Nvidia making wins in AI, you have to look to your customers and then to your competitors' customers for gains. Add that two of AMD's biggest partners, Microsoft and Sony, have their own likely mandatory demands on AMD's graphics R&D outside of the PC space, and I can see how they get tapped out quickly. Finally, keep in mind the pecking order at AMD: the GPU division exists in a real sense to support and supply the CPU division. I'd assume the SoC side of AMD, CPUs with attached GPUs, dwarfs the gaming market, so innovations in CPUs are prioritized as long as the CPU's graphics needs are covered, since CPU margins are a much larger part of AMD's survival against Nvidia's priorities.

  • @stephenwakeman3074
    @stephenwakeman3074 Před 4 měsíci

    Wasn't AMD first to market with stream processors and moving away from fixed pipelines? Seen first in the Xbox 360, whereas the PS3 had the fixed raster and shader pipes à la the GeForce 7000 series?

  • @karehaqt
    @karehaqt Před 4 měsíci +1

    It's pretty simple to understand why AMD can't do it from an R&D perspective: they have to split their budget between the CPU and GPU divisions, with the CPU team clearly getting the lion's share. Nvidia's R&D is pretty much all GPU-focused, with a little bit going to networking for the team from Mellanox.

    • @psobbtutorials6792
      @psobbtutorials6792 Před 4 měsíci +1

      1) AMD's main focus was the console market, and they sold about 80 million GPUs there with the PS5 and Xbox combined.
      2) Nvidia is heavily focused on AI for B2B. GPU sales to private consumers for gaming are their side project by now.
      AMD wants to jump on the AI ship now too, since there are trillions in it.

  • @iamthegamepro
    @iamthegamepro Před 4 měsíci +5

    Because they can't.

  • @Diablokiller999
    @Diablokiller999 Před 4 měsíci

    Think you forgot Mantle or HDM, without AMD there would be no low level DX12 and Vulkan.
    Also AMD was one of the HBM fathers.
    They bring open source drivers forward as well.
    AMD is innovating on other fronts I would say.

  • @DuncDog
    @DuncDog Před 4 měsíci +1

    Nvidia also had a head start while AMD was in the middle of trying to bankrupt themselves with bulldozer.

  • @georgeindestructible
    @georgeindestructible Před 4 měsíci

    AMD could use the AI accelerators in RX 7000 GPUs to emulate ray traced light over normal rasterization without actually having to do any ray tracing, right?

    • @mojojojo6292
      @mojojojo6292 Před 4 měsíci

      There are no ai accelerators in AMD GPUs. They have no equivalent to tensor cores yet.

  • @advancetotabletop5328
    @advancetotabletop5328 Před 4 měsíci

    At least for business, business is more interested in buying a company’s worth of laptops or small desktops that “get the job done”. You don’t need high-end gaming specs to run a spreadsheet.

  • @WeeGeeX
    @WeeGeeX Před 4 měsíci +1

    AMD has been playing catchup in graphics for a long long time.

  • @romxxii
    @romxxii Před 4 měsíci +1

    To be fair to AMD, they did innovate _some_ technologies --- resizable BAR comes to mind. It's likely not as often as Nvidia because they don't have as much budget to spare for R&D.

  • @alexm8071
    @alexm8071 Před 4 měsíci +4

    The current AMD cards support DisplayPort 2.1 but the Nvidia cards don’t. Kinda nullified by the quality of DSC though.

    • @denisruskin348
      @denisruskin348 Před 4 měsíci +5

      Yeah, what's the point of DP 2.1 when not even a 4090 can drive those resolutions and frame rates unless we are talking older games? Let alone AMD cards, which can't compete with a 4090, and as you say, DSC is good enough.

    • @Chuck15
      @Chuck15 Před 4 měsíci +2

      It's already leaked that RTX 5000 will have DP 2.0+.

    • @Thelango99
      @Thelango99 Před 4 měsíci

      @@denisruskin348 If you need to daisy-chain monitors, the extra bandwidth is nice.

    • @christophervanzetta
      @christophervanzetta Před 4 měsíci

      @@Thelango99 You should never do that as it causes signal degradation…

  • @cocobos
    @cocobos Před 4 měsíci +1

    Well, their software is far ahead of Nvidia's, though... I was overwhelmed when I first switched from a GTX 980 to an RX 6600 😅. Also, Steam Deck!

  • @OrbZero
    @OrbZero Před 4 měsíci

    They just started with chiplet design for GPUs. When they perfect this it will result in massive cost reductions.

  • @fcukugimmeausername
    @fcukugimmeausername Před 4 měsíci +1

    I wonder what kind of price point AMD could sell a 16gb 4070 competitor at if they completely disregarded RT features?

    • @tmsphere
      @tmsphere Před 4 měsíci

      The 7900 XT is superior to the 4070 Ti Super and costs the same. A solid 1440p card that doesn't need upscaling and can do RT; it's weaker at RT than RTX, but a lot of games can run at 60+ fps on a 7900 XT with RT on.

  • @damianabregba7476
    @damianabregba7476 Před 4 měsíci +1

    Even if they do, it has to be supported by Nvidia cards. Otherwise it will be too niche, with a huge gamer backlash, which will make it dead anyway.

  • @NeonPixels81
    @NeonPixels81 Před 4 měsíci +4

    I mean, the fact that I have FSR on my RX580 is pretty solid.

    • @vasilije94
      @vasilije94 Před 4 měsíci +1

      And what do you do with FSR? Like lets get real here, trully real. What does FSR do for your rx580 or my GTX 1060? Literally nothing. Our GPUs barely play at 1080p new games. What are you going to do with FSR? Upscale 720p to 1080p? With FSR? It looks like someone smeared shit all over your screen. And that is a real take on FSR and old GPUs. It makes no sense at all. If anything FSR is only good for high end AMD GPUs. Because then you can play 4k and run FSR quality and get at least decent picture. Even at 1440p FSR is bad. And that is the truth. Dont get me wrong, i was just as excited for FSR on my 1060, but once you try It, its complete shit and it is not worth it at all. Its completely unplayable. I would rather play 720p native if needs be than 1080p upscaled with FSR. So be real, what does FSR do for your GPU? Its doing the same as Ray tracing did for rtx2060. Literally nothing. It was just there. It is not usable in any realistic scenario.

    • @NeonPixels81
      @NeonPixels81 Před 4 měsíci

      @@vasilije94 Speak for yourself, my desktop HoloISO build gets 1080p@60 in Hogwarts Legacy...? Maybe calm down a little? If you want to spend a mortgage payment on an RTX 4090 for Super Ultra Deep AI Perfect Path Scale Tracing EX SSS Extra Secret Rare Mega Graphics™ then go right ahead.

  • @GatheredMyst
    @GatheredMyst Před 4 měsíci +7

    One big thing that AMD has done is take their APU development and move it into the mobile space. Outside of the Switch, all PC handhelds run AMD processors, to the point that NVidia is having to respond and work on their own mobile APU technology with hopes of releasing their own handheld years from now.

    • @a36538
      @a36538 Před 4 měsíci +4

      Wow such recency bias. Have you heard of the Nintendo switch? It 💩 all over the steam deck

    • @GeneralS1mba
      @GeneralS1mba Před 4 měsíci +7

      ​@@a36538Only because a bunch of people love to play Zelda and mario in their middle ages and refuse to expand to other genres.

    • @456MrPeople
      @456MrPeople Před 4 měsíci +2

      People want to play good games? No way.

    • @HunterTracks
      @HunterTracks Před 4 měsíci

      @@a36538 Nintendo Switch uses a mobile processor (Tegra X1) Nvidia originally developed for their Shield handheld. It's not a PC handheld and packs far less power than even the cheapest laptop chip from 10 years ago. Switch's popularity has nothing to do with its hardware, hell, I'd argue that it's popular in spite of said hardware.

    • @HunterTracks
      @HunterTracks Před 4 měsíci +2

      @@a36538 Switch uses an old ARM chip, Steam Deck and the like use new x64 chips, there's no comparison. Switch succeeded for a myriad reasons, but Nvidia's hardware isn't one of them.

  • @Halvkyrie
    @Halvkyrie Před 4 měsíci +5

    I think part of the "thing" here is also that Nvidia seems to spend a lot more on marketing the things they create, such as with DLSS

    • @shaolin95
      @shaolin95 Před 4 měsíci

      😂😂😂😂 AMD would do the same if they created anything, but they don't. Stop being a fanboy.

  • @TonkarzOfSolSystem
    @TonkarzOfSolSystem Před 4 měsíci +1

    Vision doesn’t matter if you don’t have the money. For all we know AMD has plenty of vision. But without R&D money AMD can never show it.

  • @adamhero459
    @adamhero459 Před 4 měsíci +1

    AMD does innovate. The issue is that programmers don't take advantage of it when only 10% or so of GPUs are AMD. Usually what happens is that you only hear about it once Nvidia finally implements what AMD started.

  • @xWris3
    @xWris3 Před 4 měsíci +1

    I see everyone saying AMD is lagging behind, but... why would both manufacturers be expected to offer the same features? I mean, I couldn't care less about the shiny new software stunts Nvidia pulls, and I'm fine with that. I've been using Nvidia cards since forever and recently switched to AMD because Nvidia was a bad cost-to-performance deal, and honestly I don't see why there is that much debate, or some kind of "proud" feeling or whatever, about being on the Nvidia or AMD side. They're not teams. They're GPU manufacturers and you pick the one that suits your needs.
    If you're fine with not having great RT (not X) performance then imo AMD is fine. And let's be real with an example: everyone I know is not enabling RT in The Finals, for instance, because it tanks performance like hell. And it's not just competitive games. I've gotten so used to high framerates that I prefer CP2077 without RT to get those extra frames than with RT.
    That's my opinion, but in the end: what's the point? I mean, unless a game only offers RT and nothing else I would not enable it (and then it wouldn't be an option), simply because the graphical fidelity gain is very often just not worth it. I prefer smoother gameplay and pushing those frames (165Hz 21:9 on my side) over 60 fps gameplay, which I don't perceive as fluid anymore.

  • @KevynBGrams
    @KevynBGrams Před 4 měsíci

    If Sony would return to making PlayStation a competitor to PC like they were doing with PS2/3, and they could build an enthusiast level product (in addition to the casual $500 base model), then I think we could see a beautiful competitive landscape, filled with innovation. AMD & Sony could co-develop massive advancements, and have them in 10’s of millions of devices.

    • @christophervanzetta
      @christophervanzetta Před 4 měsíci

      A $750 console is asinine when a $1000 PC would be over double the performance and more usability LOL.
      Consoles are pretty much towards the end of their life cycle with cloud gaming. Even the developers can’t wait to move to cloud gaming for the ease of programming and the ability to release a game without hardware specific bugs…

    • @pixeljauntvr7774
      @pixeljauntvr7774 Před 4 měsíci

      ​@@christophervanzetta Cloud gaming is absolutely horrific. It wont become mainstream.

    • @DirranProductions
      @DirranProductions Před 4 měsíci

      ​​@@christophervanzettathat is a horrifying thought.
      Also while I wouldn't splash 750 on a console, I know many who would.
      Sure the corporations would love cloud gaming to take off though.

    • @KevynBGrams
      @KevynBGrams Před 4 měsíci

      @@christophervanzetta There are plenty of people who would like to have the opportunity to take full advantage of their TV/Sound System and PSVR2, but don’t want to deal with the issues PC gaming introduces. It would be nice to have an ‘option’ in this space.
      If Sony would open up PlayStation to a more Mac/iOS style walled garden, where it can achieve the majority of average users needs, it would be a fantastic competitor in the Home Computing space, and potentially the Mobile space if they expand.
      Local hardware is not going anywhere. People are not going to give up reliable local processing to rely on the cloud. Even with Film/Music, the content is still downloadable and playable on the local hardware.
      Also, $750 would be too cheap for an enthusiast product, pushing the limits of innovation, and being the center of home entertainment and computing. It should be around $2500 - $3000 like their top end Receivers.
      Basically:
      TV - $7,000
      Receiver - $3,000
      PlayStation - $3,000
      13.2 Speakers - $3,000
      PSVR - $550

  • @abdullahzafar4401
    @abdullahzafar4401 Před 4 měsíci

    Nvidia is more of a software company now. AMD doesn't have that much software engineering prowess, but things are changing; we could see them on the same level as Nvidia in software in a few years.

  • @andremables5990
    @andremables5990 Před 4 měsíci +2

    I go way back: AMD Fury.

    • @ledheavy26
      @ledheavy26 Před 4 měsíci

      Lol, I'm still there baby!

  • @DragonOfTheMortalKombat
    @DragonOfTheMortalKombat Před 4 měsíci

    AMD does innovate: Fluid Motion Frames, Radeon Boost, but I feel like these are flawed. While I love the idea of Radeon Boost, that it gives an extra boost in fps and restores image quality as soon as you stop moving, it causes a lot of shimmering. And FMF, well, it turns off as soon as you move too fast. LSFG did this better.

  • @kidsythe
    @kidsythe Před 4 měsíci

    They aren't nearly the size of literally any of their competitors, off by like a trillion dollars. AMD for the past while has just been aiming for the masses. Their chiplet thing made a huge splash; they now have the low end and, benchmark-wise, the high end.

  • @qlum
    @qlum Před 4 měsíci

    It is undeniable that AMD is behind on the software side and is also lagging on RT/ML hardware.
    It is not like they are only following Nvidia in that regard. With RT the timeline just wouldn't add up if they only started development after Nvidia's 2000 series came to be, those things take quite some time, and RDNA 2 was already well under way, and probably close to undergoing testing for consoles at that point. It also would not make sense that their implementation is completely different.
    For AI acceleration the story may be a bit different, but even then AMD has had at least one patent for AI upscaling back in 2020.
    None of that means they are not behind, but they are not just following Nvidia either.
    I really do hope, for plurality of choice, that they have made and are making the right decisions and keep pushing. As for market share, it's not as skewed as you may think; yes, it heavily favors Nvidia, but AMD still sells their chips for consoles and now PC handhelds, and these benefit from the work done as well. Another thing is that in terms of market share they have practically 0 in dedicated laptop GPUs, and OEMs also do not sell a lot of AMD GPUs; in the DIY market they are still definitely behind, but a lot closer.
    The console market, and now to a lesser extent the PC handheld market, also changes the type of tech AMD may push, namely tech that the lower end can benefit from.

  • @orewaerojijidaa
    @orewaerojijidaa Před 4 měsíci

    AMD needs to crank up their prices to match Nvidia.

  • @DSDSDS1235
    @DSDSDS1235 Před 4 měsíci

    You can really see the lack of AMD-driven new stuff when you see Mantle mentioned so frequently around here. That was 2013, y'all; the last notable AMD contribution came a decade and a bit ago. And people wonder why Nvidia gets all the mindshare.

  • @ytmB4HyU4kUq
    @ytmB4HyU4kUq Před 4 měsíci

    Ryzen, 'nuff fken said

  • @fabrb26
    @fabrb26 Před 4 měsíci +3

    AMD introduce the WDGAF a long time ago

  • @alumlovescake
    @alumlovescake Před 4 měsíci

    AMD doesn't have to; all they need to do is copy NVIDIA, and it doesn't even have to be better. Then they release it, make it "more open" than Nvidia's version, and they are done.

  • @jattdachannel6652
    @jattdachannel6652 Před 4 měsíci +1

    Frame gen and DLSS are things that shouldn't be a thing in gaming; this is how we are worsening PC ports. People with 10-series cards are screwed because of this.

  • @saialexander1873
    @saialexander1873 Před 4 měsíci

    The same question applies to why Apple doesn't innovate anymore: just copy the feature and make it better.

  • @100500daniel
    @100500daniel Před 4 měsíci

    AMD did innovate with chiplets and HBM, but yeah they're not catching up to Nvidia's software features any time soon.

  • @tashriquekarriem8865
    @tashriquekarriem8865 Před 4 měsíci

    Because AMD has been taking on both Intel and Nvidia at the same time for years now.

  • @Ivan-pr7ku
    @Ivan-pr7ku Před 4 měsíci

    AMD has trapped itself in the low-margin strata of the GPU market, both consoles and PC. The console situation is particularly straining, where a $500 box doesn't pay enough to have fancy features that blow the SoC die budget. And since one GPU architecture has to serve all market segments, these restrictions spill into the PC platform too: while Nvidia can't produce enough 4090s, which are flying off the shelves even at $2K, AMD is forced to discount the Radeon 7000 series to keep its market share afloat. In other words, AMD's situation doesn't justify the R&D investment to get even close to Nvidia in forward-looking features.

  • @shadowpeople89
    @shadowpeople89 Před 4 měsíci

    TressFX was the last technical innovation of theirs that I recall. Nvidia Hairworks came out a couple of years later, I think.

  • @reav3rtm
    @reav3rtm Před 4 měsíci

    One thing to factor in is that Nvidia was not the target of anti-competitive practices by Intel for years the way AMD was.

  • @TheIndulgers
    @TheIndulgers Před 4 měsíci +2

    Vulkan, Mantle, open source FSR/FreeSync, ReBAR, Adrenalin software with built-in overclocking and performance monitoring, 120fps recording, driver-level frame generation, DP 2.1, and about a dozen other features.

  • @virtualmonk2072
    @virtualmonk2072 Před 4 měsíci

    A day later, AI is coming to FSR ❤

  • @KimTe63
    @KimTe63 Před 4 měsíci

    NVIDIA is usually there first, and AMD then does the same for everyone with an open-source version of it 😁 which is great tbh, but usually Nvidia's implementation is better.

  • @doc7000
    @doc7000 Před 4 měsíci

    I would argue that AMD has done more for "game changing" innovation than Nvidia has. Ray tracing existed in games even before Nvidia launched their RTX cards, though they were the first to release a hardware ray tracing implementation. I wouldn't call ray tracing game changing anyway; yes, it makes games look great and it will eventually become the standard, but at the end of the day Cyberpunk looks good even without ray tracing.
    Upscalers aren't a new thing or game-changing tech either... on the other hand, Mantle had an impact on graphics APIs such as DirectX 12 and Vulkan, which has a much bigger impact on game development than upscalers or ray tracing.
    What Nvidia is good at is marketing. They knew that being first to release ray tracing hardware, even though it was fairly pointless at first and still carries a big performance hit, would gain them mindshare. With something like DLSS they gain an exclusive feature that developers ideally want to put on their "box", and Nvidia is the only route to get it. Nvidia, which has deeper pockets than AMD, uses money to maintain a market advantage over AMD; while I like the 4090, as it gives great performance for the price, lower down their stack things got really dumb really fast.
    On the other hand, AMD does a lot of things in an open-source manner, which is better for the PC gaming community.

  • @notjulesatall
    @notjulesatall Před 4 měsíci

    It's just a matter of spending. NVIDIA puts a ton of cash into making sure their GPUs aren't just left ready for developers to build the tools for professionals in fields like AI, simulation, or computer vision: they make the tools themselves. AMD's software stack lacks in so many ways: it doesn't support pre-RDNA2 GPUs, it is 2~3x slower for 3D rendering in Blender, mainline PyTorch doesn't support ROCm, mainline Clang doesn't support HIP, ROCm barely documents which GPUs are effectively supported, and so on...
    AMD just needs to hire more people to take care of their software. Period.
    In comparison, even Intel's GPU software offering seems to be in a *much* better shape only 2 years after the commercialization of their discrete GPUs (although they had iGPUs and Xeon Phi accelerators before that).
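
    For what it's worth, whether a given PyTorch build actually picked up ROCm support can be checked in a few lines. This is only a minimal sketch, assuming nothing beyond a stock PyTorch install (ROCm wheels expose the HIP runtime version through torch.version.hip, while CUDA builds leave it unset):

        import torch

        # ROCm builds of PyTorch report a HIP version; CUDA-only builds report None here.
        print("PyTorch:", torch.__version__)
        print("HIP runtime:", torch.version.hip)    # a version string on ROCm builds, None otherwise
        print("CUDA runtime:", torch.version.cuda)  # set on CUDA builds, None on ROCm builds

        # On ROCm, AMD GPUs are exposed through the regular torch.cuda device API.
        if torch.cuda.is_available():
            print("Device 0:", torch.cuda.get_device_name(0))
        else:
            print("No supported GPU visible to this build.")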

    • @mojojojo6292
      @mojojojo6292 Před 4 měsíci

      AMD's software support always was and always will be a joke. They don't take the prosumer market seriously at all.

  • @BenderBendingRodriguezOFFICIAL
    @BenderBendingRodriguezOFFICIAL Před 4 měsíci +16

    Because their board of directors won't let them.
    Literally. They don't have the cash flow to create new technology when they have a competitor that is willing to blow billions of dollars to do it for them.
    AMD is always playing catchup with Nvidia in terms of features and that has been their strategy from the very beginning as a cost saving measure.

    • @eugkra33
      @eugkra33 Před 4 měsíci

      But are they doing it for them? DLSS doesn't work on AMD. XeSS's performance loss isn't worth it, and they made FSR anyways so they have the budget.

    • @korinogaro
      @korinogaro Před 4 měsíci +3

      @eugkra33 True, but NVidia publishes a lot of research papers about upscaling, RT and so on. So NVidia spends money on theory and publishes a paper (AMD starts to allocate some resources to look into it); in the meantime NVidia spends fat stacks on turning models into features working on GPUs in real time. They give a presentation (AMD sees that it can be done and allocates more resources). NVidia releases a product with the feature. AMD closely studies how it works (you can find out a lot about how something works even without direct access to the code) and allocates even more resources, while spinning BS that they also worked on that feature and it's coming soon. The cycle repeats.

    • @HunterTracks
      @HunterTracks Před 4 měsíci +2

      @korinogaro You can argue this about upscaling and RT (hell, it's pretty obvious AMD was caught with their pants down when it comes to RT), but they've clearly had frame generation in the works for a while. Going from zero to a full game and driver implementation in a year, in a corporate setting? They'd need to invent a time machine for that.

    • @korinogaro
      @korinogaro Před 4 měsíci +1

      @HunterTracks NVidia's ODA presentation was in 2018, some time after the paper about using the OFA for frame generation. NVidia then finally used the 2nd-gen OFA in the 40 series for FG. So the paper was available for at least two years before the first presentation of FG; it wasn't like AMD had only one year to implement it. They knew it was coming, I just don't think they expected there to be extra hardware for it and/or that it would be a launch feature.

    • @HunterTracks
      @HunterTracks Před 4 měsíci

      @korinogaro Unless I looked up the wrong paper, it had no information regarding potential gaming usage. And the thing is, frame interpolation itself isn't really new; hell, AMD themselves had it for videos way back in the Vega era. The innovative thing was the gaming usage.
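
      To illustrate how old the basic idea is, here is a rough sketch of classic video-style frame interpolation using OpenCV's Farneback optical flow. The filenames are placeholders, and the halfway warp is a crude approximation that ignores occlusion and flow errors, which is exactly the hard part real-time game frame generation has to solve:

        import cv2
        import numpy as np

        # Two consecutive frames of a video (placeholder filenames).
        f0 = cv2.imread("frame0.png")
        f1 = cv2.imread("frame1.png")
        g0 = cv2.cvtColor(f0, cv2.COLOR_BGR2GRAY)
        g1 = cv2.cvtColor(f1, cv2.COLOR_BGR2GRAY)

        # Dense optical flow from frame 0 to frame 1.
        flow = cv2.calcOpticalFlowFarneback(g0, g1, None, 0.5, 3, 15, 3, 5, 1.2, 0)

        # Crude midpoint frame: sample frame 1 halfway back along the flow.
        # No occlusion handling, no flow validation; illustration only.
        h, w = g0.shape
        grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
        map_x = (grid_x + 0.5 * flow[..., 0]).astype(np.float32)
        map_y = (grid_y + 0.5 * flow[..., 1]).astype(np.float32)
        mid = cv2.remap(f1, map_x, map_y, interpolation=cv2.INTER_LINEAR)
        cv2.imwrite("frame_mid.png", mid)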

  • @sokol5805
    @sokol5805 Před 4 měsíci +1

    AMD is able to supply MS and Sony with chips to move gaming forward because no one cares about PC. The last game that bothered to care was Crysis 1, and even that was a CPU disaster.

  • @kenhew4641
    @kenhew4641 Před 4 měsíci

    Well, AMD doesn't seem to have a problem with playing catch-up, so why do people think it's a problem?

  • @fredsorre6605
    @fredsorre6605 Před 4 měsíci

    AMD did develop TressFX, which was hair and fur physics tech, only for Nvidia to copy it and produce Hairworks as part of the AMD-gimping Nvidia GameWorks tech. But it has indeed been a long while since then; AMD is just following Nvidia's innovations at this point, which is sad to see.

  • @Pyroteq
    @Pyroteq Před 4 měsíci +1

    Do we REALLY want underpowered GPUs that rely on upscaling and other rubbish just to play a game at a reasonable framerate? I'd rather they DIDN'T "innovate".
    I don't use FSR, I don't use DLSS, I don't use frame generation.
    I don't want my graphics distorted by "AI", I don't want added input latency, I don't want my card to GUESS what is supposed to appear in an image, and I don't want certain scenes flickering constantly. I want my graphics card to render the graphics as the artist intended them. If the graphics card can't handle the details, I'll do what I've done for the past 30 years and turn down the graphics settings.
    All these "features" are just a way of selling you an underpowered card for the same amount of money as previous generations and pocketing the difference. (See the pathetic amounts of VRAM on recent Nvidia cards.)

  • @Darkrod99
    @Darkrod99 Před 4 měsíci

    It simply comes down to not having enough budget.
    AMD's graphics division is too small vs NVIDIA's.
    AMD should stick with the open-source stuff as well as improving the quality of the drivers, and also try to find the most efficient ways to implement these graphics techniques.

  • @GameslordXY
    @GameslordXY Před 4 měsíci +2

    Well, FSR3 is doing quite well, is it not?
    Even compared to DLSS 3.
    Not much worse, if at all.
    Though I do wonder about an AMD alternative to DLSS 3.5 Ray Reconstruction.
    When, if at all, can we expect it?

    • @Eleganttf2
      @Eleganttf2 Před 4 měsíci

      That's literally the thing: they only play catch-up and offer a second-rate counterpart to each feature.

  • @InternetListener
    @InternetListener Před 4 měsíci

    We forget that x64 is based on AMD64. Intel's own 64-bit effort was a totally new architecture rather than an x86 evolution, and it was a total failure... 3DNow! had a similar convergence with Intel's SSE instructions....

  • @OrjonZ
    @OrjonZ Před 4 měsíci

    Nvidia had to innovate to differentiate themselves and control the market. AMD does not have the market share for stuff to get adopted. They have been first with quite a lot of features that later came to DX, but the adoption was not there. Even for Nvidia, with 80% share, it sometimes takes years for stuff to move.

  • @htoomyatlin123
    @htoomyatlin123 Před 4 měsíci

    AMD can innovate, but it is more innovative in terms of hardware. Infinity Cache and the 3D V-Cache on its CPUs are game changing.

  • @NACLGames
    @NACLGames Před 4 měsíci

    I think not enough has been said about vision. AMD has innovated before, sometimes on very critical stuff for both the CPU and GPU market. It's not like Nvidia has led on every advancement. But Nvidia's very, VERY long efforts to push CUDA, ML and AI applications, over a decade now, while many tech observers have slammed and heckled them for it, is something rather absent in AMD. They don't look that far and don't take that wide a leap away from their core products. Whether a 'vision' is fulfilled is partially luck yes, but you have to have the goods when the opportunity finally comes knocking. And it's indisputably paid off, Nvidia has absolute supremacy not only in their existing market, but in the entire tech world, at least for now.
    That being said, anti-red guys should recognize that this isn't a failing (unless you really only want an absolute measure of how large either company is). AMD simply takes safer bets. It's not just about how much money they have for R&D, but also the nature of their business. AMD is a go-to partner for console makers, who need chips with reliable and mature features and performance. AMD makes CPUs and has to invest in R&D for both hardware and software on that side too. It's even undesirable for AMD to innovate in some cases due to how they need to control their market. For example, while AMD has talked about developing ARM-based CPUs, it's really only to placate stakeholders. Why should AMD advance ARM into the CPU market, which would simply spawn countless competitors into the space they currently hold as a duopoly with Intel, and specifically compete with their niche of cost- and power-efficient CPUs?
    Some of that might sound like a lot of negative connotations to some, but there's more than one way to run a business. Say what you want about these two companies, whether you're anti-red or anti-green, but compared to everyone else who came before, they dominate the market today, together.

  • @spacebarwasd9782
    @spacebarwasd9782 Před 4 měsíci

    Nvidia had the vision that at some point cards get fast enough that you need to find new reasons to upgrade and innovate on new advancements, while AMD's vision was just to be faster than Nvidia.

  • @matheusnave8220
    @matheusnave8220 Před 4 měsíci

    AMD brought FreeSync to everyone.
    Anti-Lag came first on AMD too.