
People Keep Getting THIS Wrong... Nvidia vs AMD vs Intel

  • Added 13 Aug 2024
  • Nvidia vs AMD vs Intel GPUs. Which manufacturer makes the graphics card that's right for you? We will explore: GPU value, rasterization performance, ray tracing, upscaling (DLSS vs FSR vs XeSS), memory capacity, streaming, VR, software and stability, power efficiency, and professional applications. Hope you enjoy!
    👉👉👉👉👉👉👉 SPONSOR 👈👈👈👈👈👈👈
    25% Coupon code for software: TL20
    Windows 10 Pro Oem Key: biitt.ly/XQT7V
    Windows 11 Pro Oem Key: biitt.ly/anjq8
    Office2019 Pro Plus Key: biitt.ly/4Twqt
    👉👉👉👉👉 PRODUCTS USED 👈👈👈👈👈
    (Links are Affiliate Links when possible)
    As a retail affiliate, I am paid a percentage of qualifying purchases at the expense of the retailer.
    💎 Nvidia's Best Value GPU Recently 💎 amzn.to/3IJAUl5
    💎 AMD's Best Value GPU Recently 💎 amzn.to/3IJ2kaG
    💎 Intel's Best Value GPU Recently 💎 amzn.to/3BY72xD
    👉👉 VIDEO MENTIONS & RELATED CONTENT 👈👈
    Here is where the additional content / references I mentioned in the video will be, as well as some other content I think you would enjoy!
    🌟 VIDEO MENTION ❯ The GPU Value Playlist: • TL: GPU Value Playlist
    🌟 VIDEO MENTION ❯ (HUB) Nvidia's DLSS 2 vs. AMD's FSR 2 in 26 Games [...] : • Nvidia's DLSS 2 vs. AM...
    ⭐ RECOMMENDATION ❯ Best Budget GPUs: • Best Budget GPUs 2023!
    ⭐ RECOMMENDATION ❯ The "Cheap" 7900 XTX Battle: • The "Cheap" 7900 XTX B...
    👉👉👉👉👉👉 LETS GET SOCIAL 👈👈👈👈👈👈
    ► CZcams ❯ TechLens: Behind The Lens: / @techlensrgb
    ► Business Email ❯ techlens.hello@outlook.com
    ► Discord ❯ / discord
    ► Instagram ❯TechLens.Official
    ► Facebook ❯ TechLens.Official
    ► Twitter ❯ @TechLens_Real
    ► TechLens Website ❯ www.TechLens.net/
    #Intel #Nvidia #AMD
    👉👉👉👉👉👉👉 CHAPTERS 👈👈👈👈👈👈👈
    0:00 Intro
    0:53 Sponsor: Whokeys
    1:31 What the Community and ChatGPT Think
    4:24 Rasterization / Pure Gaming Performance
    5:47 Raytracing
    8:43 Upscaling and Frame Generation
    10:36 Results: Gaming
    11:51 Other Important Considerations
    13:59 Results: All

Comments • 104

  • @TechLens
    @TechLens  A year ago +3

    Watch THIS Next ⭐czcams.com/play/PLLpHqtMBNe-unx0IzobN5F1INwwhyvCnh.html ⭐
    25% Coupon code for software: TL20: Windows 10 Pro Oem Key: biitt.ly/XQT7V ⭐ Windows 11 Pro Oem Key: biitt.ly/anjq8 ⭐

  • @VelcroSnake93
    @VelcroSnake93 A year ago +24

    5:37 - Man, those non-RT water reflections looked intentionally bad. I've seen tons of games without RT with water reflections leagues beyond that. In fact, I think that was CP2077, which I'm pretty sure has better non-RT water reflections than shown there.

    • @kmyre
      @kmyre A year ago +1

      AMD apologetics at its peak

    • @dgillies5420
      @dgillies5420 A year ago

      Yeah that's sure gonna help a lot in that shooter because we all know that HE WHO SHOOTS THE MOST PUDDLES WINS!

  • @Technerd207
    @Technerd207 A year ago +26

    Nvidia if you need CUDA. AMD if you want price to performance. Intel if you want to be unique. I can't go without CUDA; let's hope ROCm is good.

    • @analogicparadox
      @analogicparadox A year ago +4

      Also Nvidia if you need Optix

    • @miyagiryota9238
      @miyagiryota9238 A year ago +13

      Nvidia if you want overpriced gpus

    • @Humanaut.
      @Humanaut. A year ago

      Price to performance can be better on Intel if you play modern games and the drivers actually work, or you put in the time to make them work.
      Generally I'd still go with AMD though.

    • @TrueMathSquare
      @TrueMathSquare A year ago +2

      AMD if on Linux

    • @frosken8889
      @frosken8889 3 months ago +1

      Nvidia for good fps and latest games

  • @tonydalimata
    @tonydalimata A year ago +5

    I'm a gamer and a gamer only.
    Price to performance is king.
    Something to mention is matching your GPU to your monitor, or the monitor you plan on upgrading to in the near future.

  • @Cynical_Finch
    @Cynical_Finch A year ago +13

    I would love to see you do a companion video to this one on how to choose the right monitor once you've chosen the right GPU for your needs and budget.

    • @MadmanLink
      @MadmanLink A year ago +2

      I think it should be the other way around. Pick your monitor, then pick the GPU you need for what you are going to do.

    • @TechLens
      @TechLens  A year ago +7

      Interesting ideas! I'll see what I can do 😊

    • @ColdestRage303
      @ColdestRage303 A year ago

      @@TechLens If I had to choose between an NVIDIA RTX 3060 12GB and an AMD Radeon RX 6700 XT 12GB for a gaming PC, what would be the better option?

    • @angeldust1732
      @angeldust1732 A month ago

      Go for the 6700 XT or 6750 XT OC ​@@ColdestRage303

    • @angeldust1732
      @angeldust1732 A month ago

      6750 XT oc​@@ColdestRage303

  • @AndreyStevn
    @AndreyStevn A year ago +3

    A few days ago I asked ChatGPT about Nvidia's 40 series and AMD's 7900; it turns out ChatGPT's training data only goes up to 2021, so anything after that isn't in its system.

  • @culo9999
    @culo9999 A year ago +4

    The first thing to consider is not performance imo, it's budget. Determining the most performant card, either in absolute terms or relative to price, is entirely irrelevant if you can only afford up to a certain point. Therefore I think one should first consider how much they can/are willing to spend, THEN any type of performance metrics, including raw rasterization, features, etc.

  • @unrelentingstrength
    @unrelentingstrength A year ago +2

    I want to upgrade from an AMD RX 6400 to an NVIDIA RTX 40 series, but the only thing that's confusing me is my monitor. It does say it has G-Sync compatibility, but it's a FreeSync Premium model (Lenovo Legion Y27H-30). Thank you for your tech tips. Really appreciate your efforts @TechLens.

  • @dianaalyssa8726
    @dianaalyssa8726 9 months ago +1

    Free ChatGPT's data doesn't include the current gen, unfortunately. I have a 1060, a 3060 12GB, and a 7900 XTX.

  • @sula2315
    @sula2315 A year ago +5

    AMD
    *good price to performance
    *usually ages well
    *good Linux support, open source friendly
    *forget about ray tracing in most games; maybe there are custom solutions in some games, but they are probably hacky beta workarounds and tank performance
    Nvidia
    *higher price, usually, compared to AMD or Intel
    *better for ray tracing
    *better for content creators and video recording. Not that AMD can't do it, Nvidia just has better support and features
    *no open source drivers nor presence
    *if you want the absolute best raw performance, Nvidia often still seems to win, but with a colossal price tag
    Intel
    *it's new, so be ready for more bugs
    *price is quite good
    *no ray tracing
    *no open source friendliness at all, more so anti behavior
    *for experimenting, I guess?
    Finally, company loyalty is a pointless thing; any company just wants your money. Buy what you need and want.

    • @Greenalex89
      @Greenalex89 9 months ago +1

      I agree with all points, but you put the best raw performance under Nvidia and not AMD, by accident I guess. If not: yes, the 4090 has the best performance, but pretty much all relevant AMD cards deliver by far the better rasterization performance. It's not even a debate.

    • @Hunterek41
      @Hunterek41 6 months ago

      Intel has very good ray tracing, but because of the initial bad raw performance people don't buy it.

  • @mbgmadbull1141
    @mbgmadbull1141 3 months ago

    Cool but what if VR performance is a deal maker?

  • @nathangamble125
    @nathangamble125 A year ago +2

    "no-one scoffed at 3090 Ti Ray Tracing performance"
    False. I do. The 3090 Ti's performance was awful for its price, and the loss of performance relative to the graphical improvement meant enabling ray tracing was almost never worthwhile on the 3090 Ti, except in the RTX mods of old games like Minecraft RTX. IMO Ray tracing isn't worth using until the performance loss from enabling it is less than around 30%, or until ray/path tracing settings that revolutionise graphics (rather than only changing a few reflections and shadows that you don't even notice while playing) are available in most of the games that people play and can run at playable frame rates on most GPUs.
    The RTX 4090 is the _only_ GPU that can run Cyberpunk 2077 with path tracing at 1440p (or 4K with DLSS) at playable frame rates. As far as I'm concerned, ray traced graphics in games will mostly be a gimmick and a waste of money until at least that level of performance and graphical detail is available in multiple games on midrange GPUs. This will likely happen through a combination of hardware and software improvements. I expect that we'll get there around 2026 with the RTX 5070 and RX 8700 XT. But right now, the 4070 Ti doesn't have enough VRAM for RT in next-gen games, the 4080 is overpriced for its overall performance, the 7900 XT and 7900 XTX aren't properly optimised for path tracing, and the 4090 is only for people with money to burn.

  • @Definesleepalt
    @Definesleepalt A year ago +2

    Also one thing to consider: Nvidia has CUDA cores, which just outperform everything else in terms of AI; the non-CUDA cards that outperform them are mostly just for data centers.

  • @PhonkMachine220YT
    @PhonkMachine220YT 2 months ago +1

    Funnily enough, Intel gets better pure rasterization performance than AMD in low-end cards with their newest drivers. The A380 single-handedly defeats the 6400 in pure rasterization, and would defeat the 6500 XT due to the latter's horribly cut-down specs.

  • @cetriyasArtnComicsChannel

    Not a gamer at all, I just want to work with very large files in Adobe while getting a good price. Is AMD enough? I currently have an old i7 and an Nvidia card that seems to not have an updated driver I can download, and the laptop runs kind of warm. Currently on Windows 10 (not compatible with Windows 11).
    Thoughts?

    • @ScoopDogg
      @ScoopDogg 4 months ago

      Keep it; anything else now is a downgrade. People are falling for all this upscaled-FPS nonsense and don't understand the fastest way to send data is 8 bits to a byte, so sending all at once is how we code (more keys on a typewriter isn't faster, the same as writing a book with more letters in the alphabet is slower; we need what we need, and at the optimum). x2: 8 for 000000 and 8 for 1111, all at once, i.e. PCIe x16, 8 and 8; the 16 CL, 2x 16GB XMP DDR4, say on M.2 NVMe, going to say a quad core with 16MB cache; the GFX card will never be bottlenecked because it can't receive data any faster than all in real time at ONCE, no render or waiting. Yet we have a GFX card GPU capable of 64-bit, 4x that, but we only code in ASCII, so we can't go 32-bit, as computers think in binary, 8 bits to a byte, all at once. Now they use 4 x 4 going to slower VRAM, but the VRAM numbers are not right, as the VRAM number is now divided by cores on the GPU, so to beat an 80 Ti 11GB you would need 11GB x 4 at least, and it still wouldn't be mono native real-time; it would have to read it and convert everything back to binary from 4 locations, hence it needs 4 sticks of memory using loads of PCIe lanes, then extra to send and receive, so double that again, when AAA high end would be all on the GPU chip and not lagging due to sending back down to the CPU or VRAM.

  • @wayne69wayne69
    @wayne69wayne69 A year ago

    @TechLens Can u help me build a gaming pc and video editing all round pc build please

  • @numb6686
    @numb6686 3 months ago

    I like how you explain everything, maybe a bit slowly, but what about buying a GPU for something other than games? Blender, Maya, Unreal, 3ds Max, 3D scanners and apps. That kind of comparison content is rare.

  • @teardowndan5364
    @teardowndan5364 A year ago +1

    Intel could be great value per dollar, but you have to bet on future drivers bringing performance closer to where it should be, more uniformly. Right now it does great in some games, beating the 3060 Ti, and horrible in others, where it cannot even keep up with the 3050 or fails to run altogether. A 2x+ relative performance spread from best to worst is ridiculous when GPUs usually fall within a fairly consistent +/-10% of where you'd expect them to be based on averages, excluding comparison-breaking features.

  • @mikebruzzone9570
    @mikebruzzone9570 A year ago

    James, apply change in quantity to the 204-SKU WW channel dGPU inventory sales and trade-in trend table and find out what people are really buying (outside AD and RDNA 3), and you will be surprised [?] and will likely ignore it [?], because who in tech tube makes sales referrals from what people are actually buying in terms of the last 5 months of GPU sales? mb

  • @N7Spades
    @N7Spades A year ago +3

    ChatGPT got its data from years ago.

  • @shremk9100
    @shremk9100 A year ago +2

    In terms of ray tracing performance, there are a few discrepancies in some games that make me question how much of Nvidia's lead is because of their hardware advantage compared to just optimization. For example, in Metro Exodus Enhanced, while AMD is still slower by a good margin, it is a lot closer to Nvidia than in many other ray tracing games, despite that game having arguably some of the best ray tracing around.

    • @GodKitty677
      @GodKitty677 A year ago

      All of Nvidia's RT lead is down to hardware. Ray tracing is all about hardware processing power. AMD's hardware design can process fewer rays and thus peaks in performance lower than Nvidia cards, so an Nvidia GPU can process more rays before performance tanks.
      So if the game limits rays for AMD hardware, then performance is great on AMD GPUs and they match Nvidia GPUs. If the number of rays is not balanced for AMD cards, Nvidia cards will win. If you use path tracing, Nvidia will win, because it's all down to hardware performance.

    • @UKKNGaming
      @UKKNGaming 9 months ago

      ​@@GodKitty677 They're two vastly different architectures; that's like comparing Intel and AMD on a per-core basis. It comes down to optimization, because Nvidia-sponsored titles have the worst RT impact compared to anything else. In most games they're literally neck and neck, except for Control, Cyberpunk, Ratchet & Clank, and Metro, which are all Nvidia-sponsored titles. It's very clear they do not care about optimizing for AMD, as shown with Cyberpunk, which was plagued with a bunch of issues on Cyberpunk's end when it came to optimizing the game. This is why I tell people that if you're watching RT benchmarks and it's all Nvidia titles, it's not a good comparison.

  • @mattirealm
    @mattirealm A year ago +3

    Ray tracing looks great. It is, however, a pig. Until these companies figure out a way to make RT work A LOT more efficiently, it is irrelevant. There has to be some way to do it better. Not sure if faster VRAM (GDDR7) is the answer, or better RT cores, or better software. I do believe it can be done more efficiently; we just aren't there yet. And as the video states, there are numerous plusses and minuses to either AMD or Nvidia cards. I guess it all depends on what you value.

    • @analogicparadox
      @analogicparadox A year ago

      The issue isn't even RT itself; it's the fact that game devs have been pushing a lot in terms of graphic fidelity, but not optimization.

    • @Humanaut.
      @Humanaut. A year ago

      @@analogicparadox How do you prove your point?
      Sure, console ports are mostly poorly optimized, and indie titles as well, since they don't have the budget to optimize.
      But regular AAA PC-first games unoptimized?
      Please provide proof, and don't use exceptions to the rule as a generalization.

    • @analogicparadox
      @analogicparadox A year ago

      @@Humanaut. Properly optimized AAA games are not "exceptions to the rule"; they're proof that the work isn't being done for most other titles. The performance of titles like the Forza series, or Doom (and Eternal), shows that graphic fidelity and framerate don't have to be opposite ends of the spectrum. Hardware capability has been getting better and better, yet performance on mid-tier hardware hasn't, even though technology development has been outpacing graphic uplift in the past few years.
      Demands on game devs have been going way up, and a ton of titles launch in poorly optimized states, and this doesn't exclusively concern performance. 90% of AAA titles have a day-one patch to fix the issues that couldn't be ironed out during development, and a ton more hotfixes are released during a game's lifetime. The same often happens with performance, but usually only when a game runs too poorly (Cyberpunk) or is a competitive game that requires higher framerates.
      As someone who uses GPUs for both gaming and work, and has been upgrading every couple of years, I can guarantee that the difference in card performance between generations has been astounding (4060 not included), and the skyrocketing of game hardware requirements is not really justified (talking specifically about standard raster performance, not ray tracing and similar).

    • @Humanaut.
      @Humanaut. A year ago

      ​@@analogicparadox I appreciate the lengthy and genuine response.
      I genuinely don't know where the fault lies, but I see so much unsubstantiated sentiment on a range of topics within the GPU hardware / gaming space; especially when it comes to pushing blame around, people's preconceived and often ill-informed notions get activated very quickly.
      I listened to a long-format interview with a game dev on Moore's Law Is Dead, and he said they had to optimize really hard over the past years just to get games running on 8GB VRAM, and we have had 8GB around for 7 years (my RX 480 from 2016 has 8GB VRAM).
      So people often get stuck in time and don't realize that hardware has partially stagnated or "not increased at the rate it should", while game devs are planning 5-7 years in advance and expect an uplift in hardware in advance.
      Jensen recently claimed a 500x (or 50,000x? some crazy high number; "Accelerated computing, at data center scale and combined with machine learning, has sped up computing by a million X." was one quote, though there was another) GPU speed improvement over the past 5 years in his last talk, but the 4060 only beats the 3060 by 2-15%, and in some specific benchmarks it even lost. So basically, if the generational improvement is small enough to be indistinguishable from the margin of error in testing, fluctuation of sample quality, etc., then I don't call that generational uplift.
      It's just too easy to pin it on the devs.
      And even in those games that are poorly optimized, like CP2077, it's not specifically the devs' fault; the devs aren't being lazy, they get pushed by upper management, who are motivated by shareholders to push a game out early even though the devs need more time.
      Blame should be placed appropriately, with care, and not nonchalantly.

    • @analogicparadox
      @analogicparadox A year ago

      @@Humanaut. Not blaming the devs specifically, just the gaming industry for being so content-centric. Most new big titles are some sort of live service, and even those that aren't still face unreasonable demands in terms of new content, such as DLC, or even release dates.
      And in terms of generational uplift, as I said, the 4060 is a sort of edge case: it has less memory than the stock 3060, and a PCIe x8 link. When I talk about generational uplift I mean cards like the 4080, which has a whopping +55% average performance compared to the 3080.
      And outside of gaming specifically, the whole high-end 40 series has made incredible leaps. The *4070 Ti* beats the *3090 Ti* in terms of OptiX rendering, and the 4090 is almost *twice* as fast as the 3090 Ti. Obviously gaming isn't a 1:1 comparison, since it leverages different tech in different ways, but my point still stands.

  • @Haydenh127
    @Haydenh127 7 months ago

    ChatGPT was trained on data from 2020, so for new stuff it's still really bad.

  • @JosephIrvin-pm4cr
    @JosephIrvin-pm4cr 26 days ago

    Some games have absolutely mind-blowing ray tracing (RE Village, Cyberpunk) while others aren't so impressive. And I'm not even sure any console (Xbox, PlayStation, Nintendo lol) can render real ray tracing.

  • @eliceomartinezmejias867
    @eliceomartinezmejias867 11 months ago

    Excellent video. Congratulations

  • @AetherXIV
    @AetherXIV A year ago

    excellent content. thank you

  • @MooKyTig
    @MooKyTig A year ago +1

    If you're buying a $200-$400 gpu then you are NOT turning on ray tracing. RT cores are a waste of sand in an AMD X600 or an Nvidia X060 card. Period. And always have been.

  • @MrArcanjoGabriel
    @MrArcanjoGabriel A year ago +1

    A 4090 in Europe is 2,400 euros, roughly $2,400.

  • @fang5843
    @fang5843 A year ago

    Can someone please help? I'm building my first PC and I want about a $1,800 to $2,000 price range, so kinda high end. I can't decide whether to get a 7900 XT or a 4070 Ti, since I've never really used Nvidia or AMD graphics cards before; this is my first build, so can someone help me?

  • @skandababy
    @skandababy 4 months ago

    Which game is being played at 9:40?

  • @sussita7
    @sussita7 2 months ago +1

    intel family hd 600!!!

  • @mikemoore5270
    @mikemoore5270 A year ago

    Best part of the weekend, TechLens 🥰

  • @normaniglesias8446
    @normaniglesias8446 A year ago +1

    I have no preference; whatever is at the right price. I don't really care about RT.

  • @freefall_910
    @freefall_910 A year ago

    I'm a gamer, and for me VRAM and raster matter most, and I'm an entry to mid-tier gamer, so there is no question about RT.

  • @LexDreyer
    @LexDreyer A year ago

    Man, I cannot decide between the 4070 Ti and the 7900 XT... I really can't decide.

    • @Greenalex89
      @Greenalex89 9 months ago

      The 4070 Ti seems to have such horrible value just because of the tiny VRAM for the insane price. You need a very, very niche reason to logically justify that purchase, like very specific programs you use with it, paired with a micro-ATX build, and you mostly work and don't play much, and you are fine with playing at low settings in 2 years, since games will be unplayable if you don't lower settings and resolution.
      It's just very hard to justify a 4070 Ti. What an amazing card it would have been with 16GB VRAM, but Nvidia decided to scam, even though extra VRAM doesn't cost much at all. I just don't get it.

  • @jarenpocopio6033
    @jarenpocopio6033 A year ago

    I bet that most people won't notice when an already good-looking game has RT on.

  • @jamesmetz5147
    @jamesmetz5147 A year ago

    Good Job.

  • @Rooachie
    @Rooachie A year ago +1

    I choose the one that gives me the most bang for my budget. Everything else is just plain stupid.

    • @Humanaut.
      @Humanaut. A year ago +4

      Exactly. And if you need one for streaming and rendering work then even an overpriced Nvidia GPU could be best "bang for your buck" because it all depends on what you need the product for.

  • @magnusnilsson9792
    @magnusnilsson9792 A year ago +1

    As long as the elephant in the room is ignored, people will keep getting this wrong.
    -XeSS is better than DLSS unless you desperately need fake frames.
    -Intel beats Nvidia in many productivity tasks such as streaming.
    -Intel Arc A770 is the cheapest 16GB+ VRAM GPU by far.
    Sure, the drivers have some kinks, but anyone savvy enough to watch these kinds of clips should be able to wield it.

  • @miraclemanoj
    @miraclemanoj A year ago

    Living in the Matrix is clear at the start 😂

  • @AvroBellow
    @AvroBellow A year ago

    Looks more like BingChatGPT. 😁

  • @raphaelnyamai7180
    @raphaelnyamai7180 A year ago

    What about esports?

  • @willtews955
    @willtews955 9 months ago

    I wonder if AMD and Nvidia would want to buy out Intel so they can use any advancements combined in the next next-gen GPU. If Nvidia bought out AMD they would corner the market.

  • @user-uk8ti8mf4o
    @user-uk8ti8mf4o 7 months ago

    I can definitely say both AMD and Nvidia graphics cards are good; in my opinion Nvidia offers slightly better performance, but for much more money.
    Intel Arc graphics cards are good, but DO NOT USE AN INTEL ARC AS A LAPTOP GPU! I personally have experience with Intel Arc.

  • @dd22koopaesverde58
    @dd22koopaesverde58 A year ago

    For me it's AMD > Nvidia > Intel. This is just for me, and in my country Intel is expensive.

  • @justhitreset858
    @justhitreset858 A year ago

    The reason Nvidia wins isn't ray tracing, it's DLSS. As far as AV1 goes... AMD supported H.264 with their RX 6000 series, yet Nvidia with H.264 still looked better. Until we see a side-by-side comparison, chances are the Nvidia encoder will still be better.

  • @GrendelMTL
    @GrendelMTL A year ago

    Why would ANYONE care what ChatGPT says about video cards?

  • @Cordialsun29
    @Cordialsun29 4 months ago

    Cheap AMD cards have insane power

  • @Eurypsi
    @Eurypsi 3 months ago

    INTEL

  • @KurNorock
    @KurNorock A year ago +1

    Nvidia if you are doing any work such as video editing, rendering, or other intensive work, or if you are a gamer who really wants to use ray tracing. Also, the 4090, while not as powerful as it should be, is far and away the most powerful card available right now. Honestly, the entire rest of the 40 series lineup is pretty pointless for gamers. The 4080 is FAR too expensive for its performance, and the story just gets worse as you move down the lineup.
    AMD if you are a gamer only and don't want to spend an arm and a leg for performance. You won't get the absolute best performance, but you'll save a bunch of money, and most of the cards actually have enough VRAM to be useful for a few years.
    Intel if you are broke but don't want to buy a used card and only want to game at 1080p.

  • @matthiasschuster9505
    @matthiasschuster9505 A year ago +1

    Linux shifts this clearly toward AMD, and slightly so toward Intel.
    The typical Nvidia features are not present there.
    And AMD drivers are actually top notch on Linux, also thanks to the Steam Deck, PS5, and Xbox Series X and S. :)

  • @digitalplayland
    @digitalplayland A year ago

    I can tell you what GPU I will NEVER buy again.

  • @obi0914
    @obi0914 A year ago

    If you've got money and want performance, Nvidia; if you are on a budget, AMD. DLSS and ray tracing are just extras. Arc... well, I like an underdog story as well.

  • @Stormpriest
    @Stormpriest A year ago +1

    April last year I dropped $2,000 on a 3080 Ti.
    Granted, that was also at the apex of the Bitcoin mining issue. But for what I do, it's always been about Nvidia. Never mind their shady business practices and their questionable releases as of late.
    It's not that I actually had $2,000 to spend, necessarily, but it's what I was willing to spend to get the quality I was after.

    • @kravenfoxbodies2479
      @kravenfoxbodies2479 A year ago +1

      My 3070 cost me $879 in Oct 2021, the last Nvidia card that I ever plan to buy, because 8GB of overpriced VRAM is the limit on that card going forward. Feels like the GTX 970 all over again.

  • @waynesworldofsci-tech

    I disagree. My M2 Mac blows Nvidia out of the water for streaming and video editing. The other issue being that Windows isn’t stable, and you’d be insane to use it in production.
    But for gaming, Windows provides a sweet experience. Mostly. I use Mac OS and Linux for everything but gaming because of how crappy Windows is for anything else.
    Yeah, I’m a cantankerous old bugger. Learned computer programming using punch cards, and have run 10-15 different home computer operating systems (I count all Linux distros as one OS).

  • @kmyre
    @kmyre A year ago +1

    Also, Nvidia is capable of developing functional and up-to-date drivers, which AMD has historically been incapable of. So if you want to use your GPU as anything but a paperweight, go green. I hope Intel catches up one day. I am afraid team red is beyond redemption.

    • @jakubkazmierczak1837
      @jakubkazmierczak1837 A year ago +1

      I never had any issues while using a 560X for three years, and now a 7600S. Nvidia, on the other hand, once messed up Apex shaders, resulting in crawling FPS on an RTX 3050.

  • @JayzBeerz
    @JayzBeerz A year ago +5

    Nvidia all day.

    • @KikiCeg
      @KikiCeg A year ago +11

      Keep overpaying and supporting bad business practices, congrats

    • @JayzBeerz
      @JayzBeerz A year ago +2

      @@KikiCeg ok poor boy

    • @KikiCeg
      @KikiCeg A year ago +7

      ​@@JayzBeerz lmao

    • @callofmetals24
      @callofmetals24 A year ago +1

      "A good card that works" all day in my case... no fangirly shit here

    • @miyagiryota9238
      @miyagiryota9238 A year ago +1

      Nvidia overpriced all day!

  • @mc_sim
    @mc_sim A year ago +2

    For people who are not much into hardware, this video is absolutely useless.

    • @josephboustany4852
      @josephboustany4852 A year ago +1

      The video is literally about hardware, though.

    • @mc_sim
      @mc_sim A year ago +1

      @@josephboustany4852 Not really. Not a single useful tip. Just tons of filler about some tech gimmicks that are absolutely irrelevant to most users. The common Joe doesn't care about DLSS, RT, or whatever. He needs to know what to buy, which specs to look at and when, and which GPUs are still relevant and which are definitely obsolete.