AMD Says You’re Doing it Wrong. - Best settings for AMD GPUs

  • Date added: 10. 05. 2024
  • Thanks to AMD for sponsoring this video! Learn more at lmg.gg/WAdzj
    Buy an AMD Ryzen 7 5800X3D: geni.us/tnn0GBY
    Buy an AMD Radeon RX 6950 XT: lmg.gg/brtnL
    Buy an AMD Ryzen 7 5700X: geni.us/iIZeLJB
    Buy an AMD Radeon RX 6750 XT: geni.us/DG0UoTU
    Buy an AMD Ryzen 5 5600: geni.us/B6xE
    Buy an AMD Radeon RX 6650 XT: geni.us/5Gz0n7
    Buy an ROG Ryujin 240: geni.us/iIoj
    Buy an ROG Thor 850W: geni.us/hm8kxQc
    Buy a Corsair MP600: geni.us/hTIiyh
    Buy a kit of Vengeance RGB 3600 16GB RAM: geni.us/epcOV
    Buy an ROG Strix X570-E Gaming Wifi: geni.us/Z6oTluT
    Buy a HYTE Y60 Mid-Tower: geni.us/8pV13t
    Purchases made through some store links may provide some compensation to Linus Media Group.
    Discuss on the forum: linustechtips.com/topic/14338...
    ► GET MERCH: lttstore.com
    ► AFFILIATES, SPONSORS & REFERRALS: lmg.gg/sponsors
    ► PODCAST GEAR: lmg.gg/podcastgear
    ► SUPPORT US ON FLOATPLANE: www.floatplane.com/
    FOLLOW US
    ---------------------------------------------------
    Twitter: / linustech
    Facebook: / linustech
    Instagram: / linustech
    TikTok: / linustech
    Twitch: / linustech
    MUSIC CREDIT
    ---------------------------------------------------
    Intro: Laszlo - Supernova
    Video Link: • [Electro] - Laszlo - S...
    iTunes Download Link: itunes.apple.com/us/album/sup...
    Artist Link: / laszlomusic
    Outro: Approaching Nirvana - Sugar High
    Video Link: • Sugar High - Approachi...
    Listen on Spotify: spoti.fi/UxWkUw
    Artist Link: / approachingnirvana
    Intro animation by MBarek Abdelwassaa / mbarek_abdel
    Monitor And Keyboard by vadimmihalkevich / CC BY 4.0 geni.us/PgGWp
    Mechanical RGB Keyboard by BigBrotherECE / CC BY 4.0 geni.us/mj6pHk4
    Mouse Gamer free Model By Oscar Creativo / CC BY 4.0 geni.us/Ps3XfE
    CHAPTERS
    ---------------------------------------------------
    0:00 Intro
    1:08 Undervolt
    3:28 Smart Access Memory
    4:16 Resize BAR
    4:50 RSR
    5:50 Benchmarking
    8:41 Conclusion
  • Science & Technology

Comments • 3.5K

  • @koizumiizumi5426
    @koizumiizumi5426 Před rokem +10597

    kinda nice seeing a company push the knowledge to get better performance with their equipment instead of just telling us to upgrade to better hardware so they get more money

    • @chessprince1164
      @chessprince1164 Před rokem +223

      It's simply smarter

    • @theunkownviolinist
      @theunkownviolinist Před rokem +1

      let's be real, those of us who can afford to upgrade to better tech will do so when the new gen drops because we're sluts for fresh new tech

    • @MinkSquared
      @MinkSquared Před rokem +323

      well, a majority of people won't buy their products if they say "buy this for gooder" because *no one* wants to mess with Vulkan.
      they're getting much more marketing value if they just seem like the cool older brother company, as opposed to Nvidia giving a big L to Linux

    • @m_sedziwoj
      @m_sedziwoj Před rokem +31

      yes, but they don't mention that most of the video is FSR 1.0, which is bad for an educational video, because 2.0 will be different.

    • @gabrieldhondt6432
      @gabrieldhondt6432 Před rokem +96

      That was exactly the same thing I was thinking when I saw the title. That's the problem with a lot of new tech: people just don't understand it. People buy by names. One time I had someone say his GPU needed to be an RTX, and when I asked why, he couldn't respond.

  • @LonelySandwich
    @LonelySandwich Před rokem +3361

    I don't remember the last time AMD sponsored an LTT video. It's been so long

    • @RossColborne
      @RossColborne Před rokem +118

      Guess that's probably due to LTT being owned by nvidia, so they didn't want to directly hand over cash to their rivals (more or less)

    • @opvask
      @opvask Před rokem +194

      @@RossColborne yea because in this space you’re gonna make it big by putting all your eggs in one basket.. or not.
      LMG is taking everybody’s money, they don’t discriminate like that.

    • @keldon1137
      @keldon1137 Před rokem +123

      @@RossColborne I hate the internet; without facial expressions and stuff I can't really tell if you believe that or are just repeating an old joke

    • @paris6817
      @paris6817 Před rokem +89

      @@RossColborne that was an April fools joke 6 years ago… 😂

    • @riztiz
      @riztiz Před rokem +31

      @@opvask They're referencing a joke made by linus a few years ago

  • @semibreve
    @semibreve Před rokem +526

    this is actually so cool to see AMD give this advice to end users: optimising and tweaking has always been the most slept-on and most confusing part of PC building, so it's awesome to see Linus covering this

    • @madangel71
      @madangel71 Před rokem +7

      Just activating SAM on bios, I got 20% more FPS on Benchmark of Division 2. On my RX 580 8Gb, it went from 90 FPS Average, to 108 FPS :D. On my Ryzen 9 5900X system, this GPU is a huge bottleneck and activating SAM really helped.

    • @umamifan
      @umamifan Před rokem +9

      It’s because “optimizing” is seen as a low spec gamer kind of thing, and is thus something a person who buys a mid range card doesn’t want to do. They expect out of the box settings, and don’t want any of the “low spec inconveniences” just because they think they’re too good for it, or because they “paid too much” to have to optimize!

    • @neildiamond610
      @neildiamond610 Před 11 měsíci

      @@madangel71 how did you manage to get SAM on? I also have an RX 580 8GB. Setting Resizable BAR to On doesn't seem to work for me, because when I save the settings and restart, it says the VGA card is not supported.

    • @relax9086
      @relax9086 11 months ago

      @@madangel71 SAM doesn't work on the RX 580, it's not compatible; I have a 580 card myself!

    • @madangel71
      @madangel71 11 months ago

      @@neildiamond610 Hi. I have an ASUS TUF GAMING B550-PLUS motherboard bought in February 2022, and when I activated SAM with the RX 580 it worked fine on the first attempt without any issue. I replaced the RX 580 with an RX 6700 XT in February 2023, and with the more recent GPU I got a new issue where one of the SSDs stopped being recognized. So it seems it depends on the mainboard/GPU combo.

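For the exchange above: whether the BIOS toggle actually took effect can be checked from the OS instead of guessed at. Below is a minimal editorial sketch (not something from the video) that reads the PCI BAR sizes Linux exposes under sysfs; with Resizable BAR / SAM active, the GPU's largest BAR should cover roughly the whole VRAM (e.g. ~8 GiB on an RX 580 8 GB) instead of the legacy 256 MiB window. It assumes a Linux system with the standard PCI sysfs layout; on Windows, tools like GPU-Z report the same "Resizable BAR" status.

```python
import glob
import os

def bar_sizes(dev_path):
    """Parse sysfs 'resource': one 'start end flags' line per PCI BAR/region."""
    sizes = []
    with open(os.path.join(dev_path, "resource")) as f:
        for line in f:
            start, end, _flags = (int(field, 16) for field in line.split())
            sizes.append(end - start + 1 if end else 0)  # unused BARs are all zeros
    return sizes

for dev in glob.glob("/sys/bus/pci/devices/*"):
    with open(os.path.join(dev, "class")) as f:
        if not f.read().startswith("0x03"):  # PCI class 0x03xxxx = display controller
            continue
    with open(os.path.join(dev, "vendor")) as f:
        vendor = f.read().strip()            # 0x1002 = AMD/ATI, 0x10de = NVIDIA
    largest_gib = max(bar_sizes(dev)) / 2**30
    print(f"{os.path.basename(dev)}  vendor={vendor}  largest BAR ≈ {largest_gib:.2f} GiB")
```
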
  • @dnakatomiuk
    @dnakatomiuk 1 year ago +106

    "If GPUs get any more power hungry" - Nvidia's 4000 series launches a few months later

    • @attmrcmailik9653
      @attmrcmailik9653 Před rokem +1

      Such a TDP isn't really high. Check out AMD's R9 top models (450 W stock in 2013)

    • @OsGere
      @OsGere Před rokem +4

      @@attmrcmailik9653 yeah, that was in 2013 so it's irrelevant now😂 it's 2023 now buddy

    • @10siWhiz
      @10siWhiz Před rokem

      Electric suppliers wringing their hands

    • @frostedflakes3168
      @frostedflakes3168 Před 2 měsíci +3

      40 series uses a lot less power than the previous gen of GPUs

  • @jakedill1304
    @jakedill1304 Před rokem +1263

    I love how Gearbox has managed to turn every single advantage that you should be getting from cel shading into a disadvantage...

    • @coffee7180
      @coffee7180 Před rokem +214

      It's just insane. I always thought more devs should use cel shading for great art style, fast development, great performance, great aging, and easy-to-build assets and content, but Gearbox showed me how incompetence can destroy everything.

    • @ishiddddd4783
      @ishiddddd4783 Před rokem +160

      indeed lmao, next to zero graphical improvement, yet the game is not only terribly optimized and stupidly demanding, but it also looks like an early 2010s game

    • @Just_RJ
      @Just_RJ Před rokem +31

      They should take some lessons from Arc System Works.

    • @silverhawkroman
      @silverhawkroman Před rokem +15

      ​@@ishiddddd4783 probably randy pitchfords middle finger to gamers

    • @sammi7971
      @sammi7971 Před rokem +41

      @@silverhawkroman Oh he already did that when he accepted that huge check from Epic Games to make the Borderlands franchise an Epic Games store exclusive.

  • @TheMx5Channel
    @TheMx5Channel Před rokem +1726

    It would help a lot if developers started optimizing programs instead of relying on increased raw system power.

    • @burkino7046
      @burkino7046 Před rokem +212

      especially looking at you, Windows; there's basically no reason why you still need code that's 20 years old

    • @clipsedrag13
      @clipsedrag13 Před rokem +79

      why optimize when you can just keep releasing new hardware

    • @CarbonPanther
      @CarbonPanther Před rokem +262

      Imagine if all games were as optimized as Doom Eternal, we wouldn't even need ANY of the tech shown here... What a dream to have.

    • @SiniSael
      @SiniSael Před rokem +22

      U cant do that - thats common sense...
      Like its obvious so OFC they wont do that - Its all about profit! ;)

    • @moistjohn
      @moistjohn Před rokem +100

      The time honored tradition of throwing hardware at a software problem

  • @sythex92
    @sythex92 Před rokem +434

    3:00 - Quantum tunneling has been an issue for 5nm for a long time; it's only recently that TSMC has been able to overcome it. 3nm is where it really starts to get difficult though - we've made almost no progress on a 3nm node, and 4nm is a while away. I feel 5nm will be here to stay for quite a while.

    • @Jane-qh2yd
      @Jane-qh2yd Před rokem +7

      I'm pretty sure Nvidia's upcoming Lovelace will be built on TSMC's 4nm platform

    • @IschChef
      @IschChef Před rokem +49

      @@Jane-qh2yd TSMC "4nm" is not actually truly 4nm, just like "Intel 7" is not 7nm. The numbers are only marketing and completely decoupled from real transistor size.
      On top of that quantifying three dimensional transistor structures like we have since the start of the FinFET era with a single number is a bit questionable anyway.

    • @Jane-qh2yd
      @Jane-qh2yd Před rokem +9

      @@IschChef The comment above is specifically talking about TSMC chips, so TSMC 4nm is relevant here

    • @yacquubfarah6038
      @yacquubfarah6038 Před rokem +11

      @@IschChef Nah they are 4nm lmao.

    • @Ignacio.Romero
      @Ignacio.Romero Před rokem +8

      @@IschChef TSMC nm sizes are real, unlike Intel, but that doesn't mean it's inherently better than Intel's process just because the number is smaller.
      That being said, mobile chips are already using TSMC's 4nm process (Snapdragon 8 gen 1, upcoming Apple A16), so your point about 5nm sticking for a while isn't true, or at least only true for computer CPUs

  • @reyalPRON
    @reyalPRON Před rokem +28

    This is also a great way to save power. And heat in your system surely will be less of a problem and you might get away with less fan noise too :)

  • @ThatNerdChris
    @ThatNerdChris Před rokem +377

    Thanks for the FPS Papa Linus & AMD!

  • @thatsgottahurt
    @thatsgottahurt Před rokem +521

    Interested to see what Smart Access Storage or Direct Storage will bring to the table once we finally see it in games.

    • @DailyCorvid
      @DailyCorvid Před rokem +44

      Probably a marginal gain for most people, I expect 2-3% at most.
      Software isn't really advanced enough to take advantage of it yet. Developers need to write code in a different way to make use of it.

    • @vith4553
      @vith4553 Před rokem +1

      check out forspoken's result

    • @LiveType
      @LiveType Před rokem +26

      Probably just stutter free high speed open world exploration. So performance in that aspect will see a massive boost. Otherwise, negligible difference as that doesn't touch the render pipeline.

    • @Archmage1809
      @Archmage1809 Před rokem +17

      Direct storage helps asset streaming. But most likely it's gonna reduce the game size (compression implementation, use GPU horsepower to decompress), improve load time, and maybe improve 0.1% and 1% low.

    • @ThejusRao
      @ThejusRao Před rokem +7

      @@DailyCorvid It doesn't offer any performance increase, but it's supposed to improve loading times by at least 30% if implemented right.

  • @goolash1000
    @goolash1000 Před rokem +8

    FYI: most US outlet boxes are fed by two phases. Which means in many cases a two-outlet fixture can be changed by a qualified electrician into a single-outlet 220 V fixture for half the amp draw, and most PC power supplies are compatible with 220 V out of the box.
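
The "half the amp draw" point is just Ohm's-law arithmetic, shown below with a hypothetical 900 W load (the wattage is an assumption for illustration, not a figure from the video):

```python
# Same power delivered at a higher voltage means proportionally less current (I = P / V).
power_draw_w = 900  # hypothetical gaming PC under load
for volts in (120, 240):
    amps = power_draw_w / volts
    print(f"{volts} V: {amps:.2f} A")
# 120 V: 7.50 A
# 240 V: 3.75 A  -> half the current for the same power, as noted above
```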

  • @diederickvermeulen8927
    @diederickvermeulen8927 Před rokem +5

    This really helped. Thank you so much. In Assassin's Creed Valhalla I was getting only 40-50fps average at 1440p. When enabling SAM in bios and RSR it has increased to over 70fps. Really cool

  • @AdistiantoYuwono
    @AdistiantoYuwono Před rokem +521

    Would be nice if there’s an “efficiency war” for desktop component also, as fierce as seen in mobile devices. New Intel CPUs and Nvidia GPUs are no joke in sucking those electrons. It almost looks like instead of improving processing technique or architecture, adding more power is the easier way out to gain a marketable performance. However in reality, electricity is very rarely free (in terms of money and environmental impact). Apple and other ARM vendor, as well as AMD in laptop market already shows us that this is actually possible in consumer market.

    • @83hjf
      @83hjf Před rokem +27

      the problem with Apples M1 performance is that its only good for specific things. because the efficiency comes from having dedicated hardware for common tasks, like for example, compressing one specific video codec, or encrypting one specific algorithm. GPUs are general-purpose devices, because what they do changes with every game. Apple (and ARM) are no gods with secret knowledge, they just optimize for 99% of use cases. A M1 machine, as powerful as it is right now, it's probably not going to fare that well in 10 years when we have moved away from H.265, for example, and it has to do software decoding in the CPU.

    • @Hybris51129
      @Hybris51129 Před rokem +8

      @SomeoneOnlyWeKnow Honestly I think this is more a sign of years of "You don't need a 600, 700, 800, 1000W PSU" advice coming back to bite gamers hard. Bad advice has now driven people to have to add another $150-300 to their upgrades because they didn't build their rigs to handle the march of technology.
      While power efficiency is nice, just looking at how desktops have gone from a 250W PSU in the late 90s/early 2000s to realistically only 1000W for exponentially more processing power, we have achieved a lot with relatively little increase in power usage.

    • @custard131
      @custard131 Před rokem +12

      i think if you look beyond the headline power ratings there have also been huge steps forward in efficiency too, its just that when looking at the top end chips that increased efficiency is always used to get more performance because there isnt really any reason to reduce the power budget for high end desktop components just for the sake of drawing less power like there is in laptops where battery life is a big selling point
      if you wanted to you could look at the product list and see how much performance you could get for a given power budget, and im fairly confident you would still see performance improvements from generation to generation.
      kind of related to that, in past gpu generations it wasnt that uncommon to have multi-gpu sli setups with up to 4 gpus, with ampere only the 3090 even supports that and even then only 2-way but it can still out perform even the most overkill of past gen setups
      another thing to keep in mind is that manufacturers have also been making improvements to turbo boost / variable clock rates so while the headline figure for a shiny new gpu might say it can draw 400w that doesnt mean it draws that much at the time, that means if you choose to give it a workload that can push it to its max thats how much it has to give.

    • @poipoi300
      @poipoi300 Před rokem +4

      @@custard131 Thank you for that comment I was about to write something similar haha. People focus on the power right now and act like it's always more power for more performance while you could probably take any one generation over the last at the same wattage and it would outperform the last gen. An example of that is the 1080ti vs 2080ti. The 2080ti is significantly faster all the while consuming the exact same amount of power out of the box. Using more power is entirely a matter of choice at that point, but I don't know many who would give up part of their performance for not even a few cents an hour. Personally if all it takes to get more performance is increased power consumption, I'll do it. After all I'm not constrained by a battery like I would be on a laptop or phone. Those are extremely different markets and sets of considerations. People who complain about wattage like that should seriously calculate the amounts they're complaining about. Of course it depends where you live, but electricity is cheap. You're not using your gaming rig all year round either. Where I live, if you run something like a 2080 all year long (mining) it costs you about 40$ CAD if I recall correctly.

    • @joshjlmgproductions3313
      @joshjlmgproductions3313 Před rokem +8

      @@custard131 Many reviewers state average and peak power draw separately.
      Also, comparing the 2060 Super to the 3060, it looks bad.
      RTX 2060 Super: 180W
      3DMark score: 8739
      MSRP: $399
      RTX 3060: 210W+
      3DMark score: 8766
      MSRP: $329
      That's why reviewers almost never compare cards to the previous generation.

  • @NoLongo
    @NoLongo Před rokem +756

    Game developers doing everything they can to make the most unoptimized games imaginable because resolution upscaling and other features give them more headroom.

    • @ShiroCh_ID
      @ShiroCh_ID Před rokem +57

      lol I thought the same thing
      we made a solution to a problem that didn't actually exist, or shouldn't have
      and looking at ages-old Nintendo 64 games or PSone games or even some PS2 games, I can see that they sure worked around the hardware limitations nicely; heck, even C64 and MS-DOS games have the same clever workarounds for those limitations. Nowadays they just care about graphics and sometimes forget about space

    • @nogoat
      @nogoat Před rokem +117

      The best example for this is the GTA Trilogy.
      Like, how can a remaster of a 20 year old game have higher minimum requirements compared to your own title from 2018 (RDR2)

    • @DailyCorvid
      @DailyCorvid Před rokem +41

      If you ignore shitty triple A's then it is the opposite! Indie devs rocking currently.

    • @EnigmaticGentleman
      @EnigmaticGentleman Před rokem +54

      Yep, I miss the 90s when game developers actually tried to optimize their games so that as many people as possible could game. With computers having gotten as powerful as they have, there is simply no excuse for games to have trouble on cards like the 1060 (at medium settings).

    • @CaptPatrick01
      @CaptPatrick01 Před rokem +13

      @@ShiroCh_ID Squeezing water from stone as someone put it. It is a lost art.

  • @JUMPJUMP2TIMES
    @JUMPJUMP2TIMES Před rokem +1

    The resize bar helped tremendously. Thank you so much!

  • @Eugomphodus
    @Eugomphodus Před rokem +8

    There's also a toggle (once enabled in the BIOS) to switch SAM on/off in the Radeon software. Some games, I've found out, REALLY don't like this feature and perform really badly with it on, but I'm sure it'll be more useful as time goes on.
    The most extreme example I've found so far is "Element TD 2". I've set a cap of 144 fps and it's normally around 90-144 fps @1440p max settings during intense moments with a 5700 XT. But with SAM toggled on, the fps drops as low as 20 and it stutters a lot, including the audio.

  • @minerskills
    @minerskills Před rokem +213

    I‘d love to see a comparison of FSR 2.0 to DLSS and NIS.

    • @cromefire_
      @cromefire_ Před rokem +72

      Hardware Unboxed has some of that; they tested NIS too. It's not too far off, but notably worse than FSR 1. FSR 2, though, seems to be almost at DLSS level and on rare occasions even better (in the one game it's in, Deathloop).

    • @awetstone56
      @awetstone56 Před rokem +3

      Digital foundry probably has something

    • @minerskills
      @minerskills Před rokem

      @@cromefire_ Ah thanks I'll check that out!

    • @ravenclawgamer6367
      @ravenclawgamer6367 Před rokem +2

      According to the testing so far, FSR 2.0 seems neck and neck with DLSS 2.3. However, does anybody know about Unreal Engine's TSR? I'd like to know if it performs better or worse than FSR 2.0.

    • @brothatwasepic
      @brothatwasepic Před rokem

      Gamers Nexus is awesome at these comparisons

  • @Steve30x
    @Steve30x Před rokem +135

    2:01 here in Ireland my electric bill due in two weeks is €133. The same bill last year was €80. That's a €53 increase despite the fact I've used less electricity this time around.

    • @MalcolmCrabbe
      @MalcolmCrabbe Před rokem +10

      You got off lightly. My £100 direct debit for Electric has jumped to £265pm from July 1st !!!

    • @emanuelmayer
      @emanuelmayer Před rokem +2

      My new reading is in June 2022. And a week later, the bill arrives. I plan not to buy anything until I have paid the increase -.-

    • @welkfor1753
      @welkfor1753 Před rokem +3

      Jesus, why is it so expensive over there? I've never seen a bill higher than 30 bucks for a month

    • @emanuelmayer
      @emanuelmayer Před rokem +6

      @@welkfor1753 a large portion of electric costs are taxes and stuff like "we have to pay nuclear plant owners because we forced plants to be shut down but we have contracts to let them run until ... lets say 2050". Also ecological taxes (which should/have been invested in renewable energies).
      I am usually at 1500-1700 Kilowatt-hour a year.

    • @roqeyt3566
      @roqeyt3566 Před rokem +7

      i switched to solar, steam deck, and eating more cold foods. Showering at the gym too. even so my bill went up...

  • @Tobser
    @Tobser Před rokem +19

    Thanks for the tip. It really improved the vr performance of 6600xt from unplayable to playable.

  • @hunn20004
    @hunn20004 Před rokem +7

    Fury X is perhaps my favourite GPU I wish I owned.
    The Blower style coolers of Radeon have always appealed to me, wish they actually paired them with low Watt components.

    • @iikatinggangsengii2471
      @iikatinggangsengii2471 Před rokem +1

      yeah, Fury and Vega are interesting mainly because they were probably only available in limited numbers

  • @sirspamalot4014
    @sirspamalot4014 Před rokem +1020

    We're finally coming back to a point where developers have to optimize their games to run on hardware for longer. Remember, the Xbox 360 can run GTA V, even though it has no right being able to, that's the kind of optimization I want to see, where 10 plus year old hardware can still run a game REALLY well.

    •  Před rokem +46

      And to this day it is the only version of GTA V that I have played, I really liked it but not enough to give it another go when the other versions came out, also I think it was the last game I played in my 360 so it had a good send off.

    • @SimonBauer7
      @SimonBauer7 Před rokem +16

      true. when the Ryzen laptop APUs launched we were able to see just how slow the Xbox One and PS4 were. I mean, a Ryzen 3 APU can run GTA V at framerates the PS4 couldn't even dream of.

    • @jordanwardle11
      @jordanwardle11 Před rokem +8

      @@SimonBauer7 yes... 4 years later

    • @chriswright8074
      @chriswright8074 Před rokem +1

      @@SimonBauer7 not really

    • @bolland83
      @bolland83 Před rokem +6

      @ I was in the same boat, spent the $99 on the special edition bundle when it came out on 360. I ended up getting it again on steam though, it was on sale for $10 since I never play on consoles anymore. Anything I want now just goes in my wishlist and I get a notification if it goes on sale, if it's a good deal I get it.

  • @thestig007
    @thestig007 Před rokem +1072

    I think we also desperately need game developers to optimize their game engines and also we need to come up with better performing game engines. Some games just run like crap even with top tier hardware!

    • @sphygo
      @sphygo Před rokem +109

      Like Minecraft, running one core at max and the rest barely at all...

    • @HMSNeptun
      @HMSNeptun Před rokem +35

      @@sphygo fabric+sodium/iris will fix that

    • @sphygo
      @sphygo Před rokem +100

      @@HMSNeptun Yes but that doesn’t change the fact that it’s poorly optimized for modern processors. I wish the devs would focus on updating the base code to be more in line with the times. Movable tile entities are also something that could be fixed with a code rework, rather than relying on mods for all the performance optimizations.

    • @SirDragonClaw
      @SirDragonClaw Před rokem +66

      Most modern game engines are very optimised. The real issue is game developers not making proper use of engines and leaning too hard on scripting languages.

    • @CanIHasThisName
      @CanIHasThisName Před rokem +32

      The real problem here is people playing games on the absolute highest settings. Those aren't meant for top tier hardware of today, those settings are there for several generations later when the game runs great even on modern low end. And don't get me started on people expecting high FPS with RT enabled.
      Most games that end up hitting the market are well optimised and run well, you just gotta take your time to set the details right for your HW. Nowadays you often can't tell the visual difference between Ultra and High except for some absolutely minor deatils, and even Medium often doesn't bring any notable graphical downgrade. The problem is that people still think that they're just gonna put everything on Ultra and call it a day. Or worse, they're just using the presets without going through the individual settings.

  • @ilovelamplongtime
    @ilovelamplongtime Před rokem +2

    This worked well for me. 20-25ish frame increase in MWII and getting slightly cooler GPU temps

  • @JesusOfTheJungle
    @JesusOfTheJungle Před 9 měsíci +2

    I run a 2080 ti and have moved to the blue mountains (Australia) in the last year. Through winter house temp wouldn't get much more than 8 degrees celsius but man, my gaming/entertainment room was, I mean, I wouldn't say warm but by comparison to the rest of the house, it was comfortable!

  • @digiscream
    @digiscream Před rokem +89

    Yeah...here in the UK, I had a noticeable reduction in my power bill by downgrading my daily driver to a Quadro K2200 (60W) a couple of months ago. Madness.

    • @tuckerhiggins4336
      @tuckerhiggins4336 Před rokem +9

      That's why I use a laptop, gives me the most efficient hardware

    • @silverhawkroman
      @silverhawkroman Před rokem +3

      I'd rather pay 80 bucks over MSRP to get a 6600 that draws 70 watts less for 30-70% more performance than stick with my RX 580, even though it's seen as an entry-tier card. Plus selling the old card will net me 150, so win-win!!!

    • @digiscream
      @digiscream Před rokem +4

      @@tuckerhiggins4336 - sadly, I can't do that. Fan whine is a sensory nightmare for me, and quite a lot of what I do is fairly CPU-intensive stuff.

    • @tuckerhiggins4336
      @tuckerhiggins4336 Před rokem

      @@digiscream ah, the headphones must not do it for you then

  • @seank4148
    @seank4148 Před rokem +166

    Just FYI: if your boot drive uses an MBR partition table, enabling Resizable BAR disables CSM support, which means you can't boot off your MBR drive. MBR boot needs CSM enabled, but CSM can't be enabled at the same time as Resizable BAR. You'll have to convert your MBR boot drive to GPT first to enable Resizable BAR (a sketch of the conversion is at the end of this thread).

    • @mikebrownphotography2784
      @mikebrownphotography2784 Před rokem +7

      This might have been my issue as I had to reset my bios after attempting to enable SAM. My PC wouldn’t boot

    • @jonasedvardsson7612
      @jonasedvardsson7612 Před rokem +1

      @@mikebrownphotography2784 converting the system disk to GPT isn't hard though🙂

    • @5avvproductions41
      @5avvproductions41 Před 11 měsíci +2

      Great now I’m stuck in bios fml 😂

    • @joshua7015
      @joshua7015 Před 9 měsíci

      What if my boot drive is GPT partition? Are there any precautions I should know beforehand?

    • @seank4148
      @seank4148 Před 9 měsíci +1

      @@joshua7015 only issue is with MBR, which is a much older way to setup a boot drive. GPT has no issues I'm aware of. 👍
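
For the conversion this thread describes: Windows 10/11 ship a command-line tool, mbr2gpt.exe, that can convert a system disk from MBR to GPT in place (validate first, back up regardless, and run it from an elevated prompt). A minimal sketch follows, wrapped in Python purely for illustration; the tool itself and its /validate, /convert and /allowFullOS switches are Microsoft's, everything else here is assumption-level scaffolding.

```python
import subprocess

def run(cmd):
    """Echo and run a command; check=True stops the script if mbr2gpt reports failure."""
    print(">", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Dry run: confirm the system disk layout can be converted without touching data.
run(["mbr2gpt", "/validate", "/allowFullOS"])

# Actual conversion. Afterwards, switch the firmware from CSM/Legacy to UEFI boot;
# only then can Resizable BAR / SAM be enabled without the "no bootable device" issue.
run(["mbr2gpt", "/convert", "/allowFullOS"])
```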

  • @fraxesz1598
    @fraxesz1598 Před rokem +16

    This video is indeed very helpful for most people who aren't into PC hardware to get the most out of AMD hardware :D Keep it up guys!

    • @iikatinggangsengii2471
      @iikatinggangsengii2471 Před rokem

      they just need to spend more time tinkering with and testing graphics settings; some settings are just not meant to be enabled/maxed on Radeons, and OC etc. isn't exactly necessary (although more fps is always welcome, try to find the correct settings first before OCing)

  • @Tantium
    @Tantium Před rokem +2

    i have this working on a B350 board with a 5800X3D and 5700 XT, sweet compatibility!

  • @natec1
    @natec1 Před rokem +269

    I love how computers are getting so small that modern physics is an actual issue

    • @citizenkane2349
      @citizenkane2349 Před rokem +21

      I got to admit it is pretty cool.

    • @kz03jd
      @kz03jd Před rokem +57

      But isn't physics just plain.....physics? I mean it's the same 10,000 years ago as it is today. Physics doesn't change, our understanding of it does.

    • @Boringpenguin
      @Boringpenguin Před rokem +9

      Even cooler, harnessing those "issues" properly will lead to the next breakthrough in computing. I hope I could see the day when general quantum computers enter the consumer market.

    • @BixbyConsequence
      @BixbyConsequence Před rokem +27

      @@kz03jd "modern" as in quantum physics being a modern field of study.

    • @natec1
      @natec1 Před rokem +26

      @@kz03jd That’s true but the whole principle of modern physics is that we don’t understand it fully yet. Physics at our scale behaves differently than it does at the extremes.

  • @faizmumtazramadhan2212
    @faizmumtazramadhan2212 Před rokem +38

    1:08 Now we know why Alex has been obsessed with so many crazy cooling solutions lately 🤣🤣

  • @Davidpp-ix6zu
    @Davidpp-ix6zu Před rokem +1

    Great video! You explain things very well and clearly 👍 💪

  • @thoraero
    @thoraero Před 6 měsíci

    Ok that smart mem access really helps. Thank you.

  • @JordonAM
    @JordonAM Před rokem +185

    FSR at 1080p can always be counteracted using Radeon Image Sharpening, and RSR just added a sharpening slider so you can also use that for games that don't support FSR. RIS isn't magic though, but it'll help some definitely

    • @johndoh5182
      @johndoh5182 Před rokem +5

      AND AMD will improve RSR over time.
      Frankly I think AMD is doing a great job right now, better than other companies for trying to get consumers through a period where money is tight. I don't like it when some reviewers spend an entire video bashing them for a specific product.

    • @AntonioNoack
      @AntonioNoack Před rokem

      @@johndoh5182 RSR will never have FSR 2.0 though. They can't, because they don't have the motion vector information on all engines.
      They might be able to do some driver magic, but I doubt it.

    • @zodwraith5745
      @zodwraith5745 Před rokem +1

      DSR has literally been available on Nvidia since 2014 as well as a sharpener. Why do you think Nvidia hasn't said anything about RSR?

    • @JordonAM
      @JordonAM Před rokem

      @@zodwraith5745 No one mentioned NVIDIA at all in this thread.

    • @zodwraith5745
      @zodwraith5745 Před rokem

      @@JordonAM My point was Nvidia did the exact same tech long ago. They don't promote it _because_ it sucks.
      You can't create data that's simply not there without AI having information to fill in blanks. It's like asking a robot to drive you to the store without explaining what a car is or how to drive. It often ends up looking even worse than just dropping the resolution.
      The only way you could effectively have high quality upscaling on the driver side is building it into a frame buffer to analyze multiple frames. Works great for video where adding 100ms of lag goes unnoticed as long as sound is synched, but completely unusable for gaming where adding just 20ms is highly noticeable.
      It's a boondoggle for marketing. Nothing more. Nvidia got their marketing in back in 2014, AMD's using it for marketing now.

  • @jackbenimblejack1
    @jackbenimblejack1 Před rokem +33

    the best way to get more performance is by purchasing a $69.99 screw driver from LTT

    • @Ghost-hs6qq
      @Ghost-hs6qq Před rokem +3

      Linus? is that you?

    • @SaucerX
      @SaucerX Před rokem +1

      It comes with rgb, or is that a separate purchase?

    • @DailyCorvid
      @DailyCorvid Před rokem +3

      @@Ghost-hs6qq Is that you? Find out after this short sponsor message...

    • @frozenturbo8623
      @frozenturbo8623 Před rokem +2

      @@DailyCorvid Today's sponsor is Linus Tech Tips, Buy our $69.99 screwdriver for better performance from the RGB and 78.28% less kernel crashes.

    • @jackbenimblejack1
      @jackbenimblejack1 Před rokem +1

      @@Ghost-hs6qq yes it is me, Linus ... buy the LTT Screw driver and get your friends to buy it too.... I have to pay for a really really really expensive house

  • @onetunepauly1194
    @onetunepauly1194 Před rokem

    Thanks so much for this video. Recently upgraded my GPU to a 6700 XT and have a 3600X. Downloaded a BIOS update and turned these settings on

  • @ChandlerUSMC
    @ChandlerUSMC Před rokem +71

    Thank you, LTT and AMD. Someone finally said what I've been thinking since the 3xxx/69xx series launches. My wife and I play on two different computers in the same room- frequently together -- it's a pastime we share. In the same room means we're on the same 15amp circuit (U.S.). With the way things are going, how are we supposed to have 2 gaming machines on the same circuit? Are we both going to have 4xxxTi or Radeon 7xxxx with modern CPUs? How, exactly? Are we going to be stringing wires from the study to the game room? What about smaller homes that don't have another room to extend from? Eventually, I'm going to need to install custom electrical circuits (20+ amp) and treat a game room like a kitchen? That just buys some time but how much time before we blow past that?
    I was also thinking, what do apartment dwellers do? Is the gaming market going to "power their way" past the people living in apartments because they can't power their graphics cards and CPUs? Then there's the cost in Europe.
    In other words, won't the idea of "more power!" winning over "more efficient" eventually shrink the available market for these devices?

    • @TheSliderW
      @TheSliderW Před rokem +1

      That's my concern as well and My wife and I do the same. I don't know about America or Canada but In France (ok, we're on 230V but still), we were always able to have LAN parties with 5+ players running from extensions out of the same room's breaker including file servers and other additional devices. My take is that your PC's are not going to be pushed to their maximum consumption during gameplay. Only if you stress test them using specialised benchmarks. So you and your wife should be fine anyway.

    • @Tn5421Me
      @Tn5421Me Před rokem +1

      Yes.

    • @Rushtallica
      @Rushtallica Před rokem

      A good quality extension cable from another room.

    • @juliendesrosiers3177
      @juliendesrosiers3177 Před rokem

      Im happy to live in Québec, where energy costs nothing.

  • @GABN0MADormarcous
    @GABN0MADormarcous Před rokem +96

    They actually gave us help so that we can enjoy our stuff; after this I have a lot more respect for them

    • @cozza819
      @cozza819 Před rokem +3

      Absolutely 👍

    • @zodwraith5745
      @zodwraith5745 Před rokem +2

      Well, technically they simply copied Nvidia that did it years earlier. Nvidia even has an RSR type feature that they never talk about that doesn't require RTX cores. This is literally just an AMD infomercial. I'd rather have them spend the money actually getting it implemented into any real freaking games instead of just crying "me too!"

    • @GABN0MADormarcous
      @GABN0MADormarcous Před rokem +1

      @@zodwraith5745 i don't disagree, but we do have to give them credit as they didn't need to do it; it's still cool they did it, and if anything it helps create a new standard

  • @CaptKornDog
    @CaptKornDog Před rokem +6

    Wish there were more tips like this in general out there.

    • @elbowsout6301
      @elbowsout6301 Před rokem

      check out the lowspecgamer channels for how to get more from low to mid tier gear.

  • @katerhamnorris3936
    @katerhamnorris3936 Před rokem +4

    here in Germany we get a maximum of 3650 watts out of the socket, but we have 230-volt outlets at 50 Hz

  • @joshualopez9045
    @joshualopez9045 Před rokem +14

    For those who can't find DOCP: it's going to be called EOCP or XMP depending on the motherboard you have. These are overclocking profiles for your RAM.

    • @antebellum1776
      @antebellum1776 Před rokem

      I still can't find it... does it depend on your motherboard/ram/cpu?

    • @joshualopez9045
      @joshualopez9045 Před rokem

      What motherboard do you have?

    • @antebellum1776
      @antebellum1776 Před rokem

      @@joshualopez9045 GA-AB350M-DS3H V2 (rev. 1.1)

    • @joshualopez9045
      @joshualopez9045 Před rokem +1

      In BIOS go to the M.I.T. tab. You'll move down to the Advanced Memory Settings then to Extreme Memory Profile (X.M.P.). Enable by selecting profile 1. Look on page 23 and 24 of your user manual that came with your mobo.

    • @antebellum1776
      @antebellum1776 Před rokem

      @@joshualopez9045 I'll double check this tomorrow when I get home but I'm pretty sure I don't have that setting anywhere.

  • @BReal-10EC
    @BReal-10EC Před rokem +65

    I think these temporal upscaling techs will become more and more important in the future as part of normal video compression. A 4k movie is 100 GB now.

    • @Sabbra
      @Sabbra Před rokem +1

      It is 25 GB, c'mon

    • @Lubinetsm
      @Lubinetsm Před rokem +14

      @@Sabbra if it's 25G, you got scammed and compression is a bit too lossy.

    • @sermerlin1
      @sermerlin1 Před rokem +8

      A 4K movie is about 50-60 GB; you'll rarely see 70-80 GB at this time, and only extremely rarely (like Lord of the Rings) will it exceed 100 GB.

    • @formdoggie5
      @formdoggie5 Před rokem

      @@sermerlin1 thats not at 60 to 120 frames though.
      Getting 24-30 frames kinda defeats the whole point of all the tech you buy.

    • @Dave5gun
      @Dave5gun Před rokem +8

      @@formdoggie5 for movies 24 frames are enough, actually even better than 60 frames. HDTVTest once explained that 24 frames look more dreamlike, not like a cheap smartphone camera. However, there has to be a certain amount of motionblur, which is not really the case with OLED anymore, so 24 frames can look pretty choppy sometimes. i still think it's better than 60 frames.

  • @josuad6890
    @josuad6890 Před rokem +21

    GPU power draw has been insane lately, so much so that I decided to just buy the 3070 series even though I can easily afford the 3080. Going 3080 would also require me to ditch my PSU, and I just don't feel like doing any more cable work ever again.

  • @kessilrun6754
    @kessilrun6754 Před rokem

    I’ve done all of this but was curious as to what DOCP is. That was interesting. Never noticed that one myself a I use an AMD CPU and Steel Legend mainboard board. I’ll have to go check it out.

  • @zezba9000
    @zezba9000 Před rokem +1

    HZD has foveated rendering (if not mistaken) which is the best solution.
    However this probably confuses AI upscales as that should happen per foveated region not the final image.

  • @SeabooUsMultimedia
    @SeabooUsMultimedia Před rokem +18

    Some AMD 300 series motherboards actually got smart access memory support via a bios update. My x370 board got it via a bios update. It came with the update that introduced compatibility for 5000 series CPUs.

    • @ertai222
      @ertai222 Před rokem +1

      Yeah MSI gaming pro carbon x370 got it

    • @ansaranduin
      @ansaranduin Před rokem

      @@ertai222 mine didn't let me update, I opened M-Flash, then it said 'Entering Flash Mode' and then was a blank screen. Did yours update fine?

    • @ertai222
      @ertai222 Před rokem

      @@ansaranduin yeah I've been updated since the BIOS came out.

    • @ansaranduin
      @ansaranduin Před rokem

      @@ertai222 Just curious but does your system have an M.2 SSD, I've read that it could be causing my issue.

    • @ertai222
      @ertai222 Před rokem

      @@ansaranduin yes this board has 2. But if you use the second slot you get half speeds or something.

  • @Melchirobin
    @Melchirobin Před rokem +35

    This is a great way to do a sponsored video. I just expected weekend videos to be huge and usually the ones you have to watch so getting used to this is a change

    • @vincent67239
      @vincent67239 Před rokem

      I didn’t skip the sponsor this time because I didn’t notice until the opening song started playing lmao

  • @israelurbano2515
    @israelurbano2515 Před rokem

    i like that return talking about ecosystem and electricity over cost of the util in ampere to watt, money and around repercussions on climate

  • @busterthemutt8224
    @busterthemutt8224 Před 8 měsíci

    I tried FSR when testing out my new card and it didn't do well, but I also can only run 1080p on a 60 Hz monitor with a 10th gen Intel i7 CPU. I just got the 6550 XT since it was on sale, and I go through spurts where I don't play anything for 5 months at a time due to work running me ragged, so I couldn't justify spending $700-900 on the newer models to myself. But I'm noticing better quality overall on the same settings, with the added bonus of being able to tweak it a lot more easily, and the GPU is staying 10°C cooler. I also only just got into computers after not playing video games for like 5-6 years, so I'm trying to catch up on knowledge and power xD If work stays steady I'll probably build my own slowly while using this one. But with the talk of another shutdown I'll get screwed on work, as I contract for a property management company. LOL

  • @ravenclawgamer6367
    @ravenclawgamer6367 Před rokem +38

    Warning to Linus : Huge number of $5000 cheques from intel coming at a high speed.

  • @bluehatbrit
    @bluehatbrit Před rokem +18

    My AMD graphics card just hit 10 years old and it's still my main driver with no tweaks to the settings while running along side a Ryzen 7. Perhaps it'll make it to 15 years when the prices have eventually come down!

    • @ShadowDragoon004
      @ShadowDragoon004 Před rokem +3

      Not sure if you've been out of the loop, but GPU prices are getting much more reasonable. I've seen some 3060s damn near MSRP very recently. Definitely much more affordable than it was at the beginning of the year. And with the crypto crash going on, there's a very decent chance stock on the 40 series will be reasonable enough to make a purchase then as well.

    • @10unnecessary01
      @10unnecessary01 Před rokem +1

      What card do you have, and what games do you play that you're able to still use it? I upgraded from my 290X in 2019 and even then the difference was night and day

    • @vanilla4064
      @vanilla4064 Před rokem

      unless there is too much supply and not enough demand, those prices aren't coming down.

    • @alishabab3
      @alishabab3 Před rokem

      Covid killed the pricing of cards..that and the conflict between China and Taiwan

    • @kidthebilly7766
      @kidthebilly7766 Před rokem

      if your card is 10 years old then getting a 1080 ti for $200 is probably worth it lol

  • @johnrehak
    @johnrehak Před rokem +7

    Free AMD performance gains are so simple, even without standard overclocking. The highest jump in my case came from simply enabling Smart Access Memory (SAM). Another free performance gain, which turned my Gigabyte Aorus RX 6800 XT into a reference RX 6900 XT and beyond, came from undervolting the GPU from 1150mV to 1100mV, increasing the power limit to +15% and switching memory timings to fast timings. I haven't touched the GPU clock or memory clock. The card boosts up to 2474MHz from factory settings and, as long as temps allow, it stays at that speed. FPS gains are: in CP 2077 at 1440p ultra, from 84fps avg. 54fps min. to 92fps avg. 60fps min.; in Horizon Zero Dawn, from 135fps avg. 60fps min. to 159fps avg. 78fps min. (GPU FPS - 168, 95% - 152fps, 99% - 132fps); in The Division 2, from 128fps avg. to 137fps avg.

    • @salxncy5998
      @salxncy5998 Před 2 měsíci

      Can you help me over clock mine, I have a Ryzen 5 5600g with rx 6600 eagle, 32 gb ram

  • @argon6520
    @argon6520 Před rokem +4

    I think it should also be added that such tricks can make wonders for VR, especially with high-end headsets that put 4K monitors to shame with their resolution.

  • @Cheesebread42
    @Cheesebread42 Před rokem +13

    I bought the 6950 XT last week for my RDNA2 system so this was a nice reminder.
    I tried to enable it as described though but the Compatibility Support Module must still be enabled to boot - apparently my boot drive was not automatically set up as UEFI when I built it the other year. Linus missed an opportunity to catch this hiccup. I now need to convert the Master Boot Record/Legacy drive to UEFI.... >_>

    • @AdaaDK
      @AdaaDK Před rokem +1

      Your issue is not linked to SAM (or anything related to AMD, really). From what you're saying, it's a Windows security thing (required for Win11) that you're having issues with.

  • @kiefgringo
    @kiefgringo Před rokem +6

    Thanks for the SAM tip. I was already using Sapphire's version of -RSR- Radeon Boost, Trixxx Boost (works better and allows more customization in my opinion, but limited to Sapphire cards I assume), but wasn't aware my 3600x and 5700xt could use SAM by simply tweaking the BIOS settings.

    • @sbrader97
      @sbrader97 Před rokem +2

      Trixx Boost isn't the same as RSR afaik; it's more like their version of Radeon Boost and only uses standard GPU upscaling from the lowered render resolution

    • @sbrader97
      @sbrader97 Před rokem +1

      @John-Paul Hunt idk if they could do FSR 2.0 at a driver level; it needs deeper integration into a game to have access to the colour and depth buffers, and it has to be done earlier in the render pipeline. RSR 1.0 works more like a post-processing filter, stretching and sharpening the final image

    • @kiefgringo
      @kiefgringo Před rokem +1

      @@sbrader97 Yeah, I messed that up. It's Radeon Boost that I dislike. It makes the image look terrible during movement in FPS games I play, whereas it seems Trixxx Boost just decreases the resolution and upscales it the entire time, so it's at least consistent.

  • @teamgaming1294
    @teamgaming1294 Před rokem

    I just enabled Resizable Bar on a Z170 Motherboard with the Bios Mod and yes it does give a good performance boost!!!

  • @arrezo
    @arrezo Před 9 měsíci

    thank you so much, I wish I'd seen this video sooner. I had some games like MW2 2009 and Payday 2, just older games like that, running at a max of 60fps and even dropping to 15-ish fps, but I turned on Smart Access Memory and it's a whole new world - I was getting upwards of like 300 fps in both of the games I mentioned, just because I turned on SAM.

  • @omgawesomeomg
    @omgawesomeomg Před rokem +90

    Props to AMD for allowing Linus to talk about the shortcomings of FSR 1.0

    • @ShadowGirl-
      @ShadowGirl- Před rokem +8

      I believe AMD sponsored this to force them to not talk about Nvidia's use of ResizeBar, which does exactly the same thing and they gave it to us over a year ago. Also it was figured out that not all titles gain advantages from accessing more of a GPU's memory, so not all titles even make use of the technology. Normally these things would be mentioned by Linus, but this is the "cost" of sponsorship.

    • @chriswright8074
      @chriswright8074 Před rokem +25

      @@ShadowGirl- but AMD actually put in the work, unlike Nvidia, where it either doesn't work or requires more work than needed

    • @lauchkillah
      @lauchkillah Před rokem +13

      @@ShadowGirl- Nvidia did not support Res. bar at launch, AMD was the first to drop this tech. Early 3000 series GPUs require a vbios update to even support it

    • @ValentineC137
      @ValentineC137 Před rokem +18

      @@ShadowGirl- Nvidia's Res. Bar literally just doesn't give the same performance improvement as AMD's SAM
      Comparing Hardware Unboxed videos on the two systems on average
      Nvidia is the same while AMD has a 3% gain 1080p
      Nvidia gets 1% while AMD gets 3% at 1440p and 4K
      the peak gains are
      11% 9% and 6% at 1080, 1440 and 4K for Nvidia
      20% 19% and 17% at 1080, 1440 and 4K for AMD

    • @santmc12
      @santmc12 Před rokem

      @@ShadowGirl- NIS? Is trash

  • @piotrj333
    @piotrj333 Před rokem +44

    5:10 FSR doesn't use AI. It uses Lanczos upscaling with a sharpening filter (a rough sketch of what that means follows this thread). In FSR 2.0 it gets a bit more complicated, but it is still not AI.

    • @sh0dn
      @sh0dn Před rokem +7

      Even DLSS doesn't use AI directly. DLSS was made with AI, and the resulting algorithm needs some hardware acceleration to be fast enough.

    • @m_sedziwoj
      @m_sedziwoj Před rokem +5

      as above, DLSS is a closed system and we don't know what it does. If you listen to Nvidia, they claimed it uses AI, but _only_ for 1.0; for 2.x+ they only say the algorithm is optimized by AI, so there's no AI inside. So when AMD uses AI to optimize the parameters of FSR 2.0 (and they should, to automate it), they can claim the same things as Nvidia.

    • @grn1
      @grn1 Před rokem

      AI is used as a bit of a catch all term for anything that uses a complex algorithm. In most cases machine learning, what you could call true AI, is used to create an algorithm that runs on lighter hardware but then that algorithm itself is often labeled as AI.

    • @piotrj333
      @piotrj333 Před rokem

      @@m_sedziwoj Not really. DLSS actually does use a convolutional autoencoder, and that's not just simple parameters but actually applying a weight network. FSR 2.0 still doesn't reconstruct detail in textures, but DLSS does, thanks to that AI step FSR lacks. Also, Tensor cores are good at both training and applying machine learning.
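
To make the "Lanczos upscaling" mentioned in this thread concrete, here is a minimal 1-D Lanczos resampler in Python/NumPy. It only illustrates the windowed-sinc idea behind FSR 1.0's spatial upscaling pass (the real EASU shader is a tuned 2-D variant, and the separate RCAS pass adds the sharpening); it is an editorial sketch, not AMD's code, and the example signal at the bottom is made up.

```python
import numpy as np

def lanczos_kernel(x, a=3):
    """Windowed sinc: sinc(x) * sinc(x/a) for |x| < a, zero outside the window."""
    x = np.asarray(x, dtype=np.float64)
    w = np.sinc(x) * np.sinc(x / a)   # np.sinc is the normalized sinc sin(pi x)/(pi x)
    w[np.abs(x) >= a] = 0.0
    return w

def lanczos_upscale_1d(samples, factor, a=3):
    """Resample a 1-D signal to `factor` times its length using Lanczos weights."""
    samples = np.asarray(samples, dtype=np.float64)
    n = len(samples)
    out = np.empty(int(n * factor))
    for i in range(len(out)):
        src = i / factor                          # output position in source coordinates
        taps = np.arange(int(np.floor(src)) - a + 1, int(np.floor(src)) + a + 1)
        weights = lanczos_kernel(src - taps, a)
        taps = np.clip(taps, 0, n - 1)            # clamp taps at the signal edges
        out[i] = weights @ samples[taps] / weights.sum()
    return out

# Example: upscale a tiny ramp by 2x
print(lanczos_upscale_1d([0.0, 1.0, 2.0, 3.0], 2.0).round(3))
```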

  • @dapz44
    @dapz44 Před rokem

    Great video looked like u had fun making it aswell good job guys 😁

  • @nintendowiids12
    @nintendowiids12 Před rokem +2

    Played God of War with FSR 2.0 on a 6700XT. HUUGE visual / performance uplift over 1.0. I want FSR 2.0 in all the games!!

  • @LucasDaRonco
    @LucasDaRonco Před rokem +12

    Btw, this works exactly the same for Nvidia. SAM is called Resizable BAR, so go ahead and activate that too, and you also have DLSS for AI super sampling (or upscaling) with antialiasing included. You also get image sharpening if you want it too.

    • @YaBoiBigNutz
      @YaBoiBigNutz Před rokem +2

      Works for Nvidia 30 series only

    • @small_pc_gaming
      @small_pc_gaming Před rokem

      do you know why I don't see an "Above 4G Decoding" option?
      the one I have instead is called "Above 4G memory/Cryptocurrency mining"

    • @iikatinggangsengii2471
      @iikatinggangsengii2471 Před rokem

      no i can enable rebar too, try updating bios

  • @djsnowpdx
    @djsnowpdx Před rokem +6

    You’re not the only one, Alex! I haven’t attempted undervolting yet but I slash my CPU and GPU power limits to their minimum values (98 watt 5700XT and 87 watt 5950X) when the outside temperature gets warmer than inside.

    • @joel3399
      @joel3399 Před rokem

      I just have an RX 570, but the heat bothers me so much that in summer I usually play on my laptop's integrated graphics

    • @rrekki9320
      @rrekki9320 Před rokem

      @@joel3399 Tried repasting and freeing it of dust? Otherwise I can recommend takin off the shroud, puttin two 120mm Arctic P/F12 on it with Zipties and ya done :D

    • @CarbonPanther
      @CarbonPanther Před rokem +1

      A smart tip for the 5700XT is to reduce the clock slider instead of the power limits.
      You'll save about as much on power consumption, but you will have much more of that performance back that you'll lose by hampering the cards power limit!
      Try running the card at 1600Mhz, look at the FPS before and after and please tell me the results!

    • @-eMpTy-
      @-eMpTy- Před rokem

      @@rrekki9320
      lowering the temps by repasting and/or adding fans doesn't decrease the heat output to your room

    • @CarbonPanther
      @CarbonPanther Před rokem

      @@-eMpTy- I know this will be heavily debated but i firmly believe that: The cooler the Chip runs and the lower the surface temperature of the heatsink is, the colder the air will be that is expelled from the case into the ambient room.

  • @Roger_Diz
    @Roger_Diz Před rokem

    On Assetto Corsa Competizione fsr does a great job getting me great framerates on my old vega64 at 1440p with unnoticeable image loss.

  • @kingdon9690
    @kingdon9690 Před rokem +1

    I'm running a 1080p Sceptre 165 Hz 24-inch monitor, with RSR on (not bad) and AMD FreeSync Premium enabled.

  • @DDRWakaLaka
    @DDRWakaLaka Před rokem +3

    0:40 I'm still running an R9 Fury today with nimez drivers -- still handles everything at 1080p like a champ (provided I'm not going over 4GB VRAM)

    • @rrekki9320
      @rrekki9320 Před rokem +1

      Aged better than the 960/70 lol

    • @DDRWakaLaka
      @DDRWakaLaka Před rokem +1

      @@rrekki9320 wish I didn't have to rely on 3rd party drivers, but that's *definitely* true

  • @Ray13star
    @Ray13star Před rokem +5

    RSR on my xfx rx 5700xt thicc iii is great at 1080p. The image quality that comes out is similar to 2k native in most games that I play. While there is shimmering that occurs, it isn't much of a distraction (since I don't play competitive esport games).

    • @gotworc
      @gotworc Před 8 měsíci

      @@elcactuar3354 it's not; 1080p is HD, 1440p is 2K and 3840 is 4K

  • @Neversettle0o
    @Neversettle0o Před 2 měsíci

    Thank you! Great info!

  • @JustMartha07
    @JustMartha07 Před rokem

    Yeah, I've been doing undervolts for years. First I OC to check how much I can boost at lower power consumption, and then I do the undervolting

  • @kristiancoleman3200
    @kristiancoleman3200 Před rokem +3

    Would be really helpful to show where in the messy GUI from AMD i could activate what.... i feel quite lost in their settings where everything feels like it is there at least two times 😝

    • @noleftturnunstoned
      @noleftturnunstoned Před rokem

      no. it is super simplified and non technical. These modern GUIs make tweaks a cakewalk compared to an oldschool bios.

  • @IncognitoX8
    @IncognitoX8 Před rokem +26

    1:47 It's actually 230V and not 240V; that is the standard in Europe. It can usually fluctuate between 225V-235V. So for a standard 10A fuse, you can expect an output of 2300W. If you have a newer (or electrically renovated) house with cabling confirmed and approved for this, in my country you can install 13A fuses instead, which will take the wattage up to 2990W for a single-phase group.

    • @GreenCinco12Official
      @GreenCinco12Official Před rokem +7

      It's actually both and more.
      There is everything between 220v and 240v..
      There is also everything in between 110v and 127v..

    • @GrimK77
      @GrimK77 Před rokem +3

      Standard load for TN-C single phase circuit in EU household is rather 16A breakers (fuses are going out of fashion), so 3.6kW. And RCCB's, too for additional protection. Single phase household electric stoves are usually 2.9-3kW.

    • @iaadsi
      @iaadsi Před rokem +3

      @@GreenCinco12Official Up to 253 V in Czech Republic (230 V ±10%). If you're close to the branch transformer, you'll be seeing over 250 V even when the street behind you is pulling heavily.
      Plus we have 400V three-phase nearly everywhere, even in old commie block apartments, for really heavy loads like induction cooking or car charging.

    • @jort93z
      @jort93z Před rokem +3

      Here in Germany we usually use 16A fuses for outlets, and 6A-10A fuses for lights (normal light switches are 10A, so using any higher value would be risky).

    • @p_mouse8676
      @p_mouse8676 Před rokem +3

      Since P = I² × R, it also goes to show how much bigger the power losses are in 115/120 V countries, or how much bigger the wires need to be.
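
      To see why, here is a minimal sketch (Python) comparing I² × R wiring losses for the same delivered power at the two mains voltages; the wire resistance value is arbitrary, only the ratio matters:

        # I^2 * R losses for the same delivered load power at two mains voltages.
        def wire_loss(power_w, voltage_v, wire_ohms=0.1):
            current = power_w / voltage_v      # I = P / V
            return current ** 2 * wire_ohms    # P_loss = I^2 * R

        loss_120 = wire_loss(1800, 120)        # ~22.5 W lost in the wiring
        loss_230 = wire_loss(1800, 230)        # ~6.1 W lost in the wiring
        print(f"ratio ~ {loss_120 / loss_230:.1f}x")   # ~(230/120)^2 ≈ 3.7x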

  • @rez2oox645
    @rez2oox645 Před 14 dny

    Resizable BAR leads to higher average fps, but can also mean more stutters (lower minimum fps)

  • @IggyBang
    @IggyBang Před rokem

    Yoooo that S.A.M. Trick is killer. Thanks for the tech tips.

  • @sudl5346
    @sudl5346 Před rokem +11

    For those having problems with "No bootable device found" after activating ReBAR:
    Make sure that Secure Boot is disabled (restart after disabling it), then enable CSM and it should be fine again.

    • @mattgrabowski702
      @mattgrabowski702 Před rokem

      I just straight up activated it, got a black screen, and then my monitor detected no input. Had to clear the BIOS to get the picture back. Gonna give this method a try next. Thanks!

    • @sudl5346
      @sudl5346 Před rokem +1

      @@mattgrabowski702 Tbh, I'm not sure if it will fix the problem you encountered.
      The problem I mentioned occurs when the system drive is in the wrong "format". ReBAR and/or Above 4G Decoding (not sure if both or just one of them) needs the drive to be GPT instead of MBR to be detected as bootable.
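
      For what it's worth, on Windows 10/11 the built-in mbr2gpt tool can usually do that MBR-to-GPT conversion in place so the drive boots via UEFI without CSM; back up first, and note that /disk:0 is just an assumption for a single-drive system:

        mbr2gpt /validate /disk:0 /allowFullOS
        mbr2gpt /convert /disk:0 /allowFullOS

      After converting, switch the firmware from CSM/Legacy boot to UEFI.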

  • @syndicate7929
    @syndicate7929 Před rokem +174

    I tried this out the other day after seeing it in the bios for the first time (recently updated it). Went from averaging 210-230 fps with a 1660ti, to over 270. A great improvement :)

    • @Ali.Yousif
      @Ali.Yousif Před rokem +5

      So SAM works even if you have an AMD processor and an Nvidia GPU?

    • @doggo531
      @doggo531 Před rokem +2

      @@Ali.Yousif I want to know too

    • @bl1nd_ness664
      @bl1nd_ness664 Před rokem

      @@Ali.Yousif I don't think so

    • @david._.jessup2604
      @david._.jessup2604 Před rokem

      How?

    • @donovan9783
      @donovan9783 Před rokem +9

      @@Ali.Yousif No, SAM only works with an AMD GPU and CPU. FSR, which is like Nvidia's DLSS, works with both Nvidia and AMD GPUs

  • @RIOTStyx
    @RIOTStyx Před 4 měsíci

    Years later and that intro animation still leaves me amazed

  • @hiimkxiv
    @hiimkxiv Před 4 měsíci

    Thanks for this information Linus! Waiting for a second-hand 6700 XT to be in stock so I can buy one. Sucks having to wait for so long lol, I've been using just my phone since December 23rd I think

  • @LegendaryGodKing
    @LegendaryGodKing Před rokem +13

    I've been with AMD since I was 12. I'm 26 now and I have to say, AMD, your price to performance is what I've always needed: FM2+ Athlon X2, AM3+ FX-8350 Black, AM4 Ryzen 3700X, and now I'm eyeing your new chips. AMD is the only thing pushing Intel to compete. AMD started from a clone, but they have worked hard to compete in the market. I'm a huge fan; they will ALWAYS be my first option for CPU choice, and they forever have a spot in my heart from all the memories of gaming and overclocking. The GPUs are great too, just not for mining!

  • @GasolineNl
    @GasolineNl Před rokem +13

    With my new 6700 XT on 1440p ultrawide I gain 2 to 5 fps in Assetto Corsa. Playing at around 110-120 FPS.

  • @joannecunliffe4874
    @joannecunliffe4874 Před rokem

    You might find the ARM-based Firefly ITX-3588J board quite interesting. It's NEARLY a game changer. Any chance of a review? If only they had put a full ATX power connector on the board, like the Axzez Interceptor board (which takes the Pi CM4).

  • @jaredrobinson7488
    @jaredrobinson7488 Před rokem

    Can you do a video like this but for Nvidia? I feel like there are some settings in the control panel that would optimize use of an RTX 4080 but I'm not sure what I should change.

  • @-_-_-_-_
    @-_-_-_-_ Před rokem +10

    It's really amazing that we've developed technology so far that the universe itself is posing hard limitations on us. Imagine telling someone in the 1930s that by 2022 we would have developed technology so far that the laws of physics start to break down (well, not really, but it sure feels like it) and we can't really progress much further.

    • @philmccracken2012
      @philmccracken2012 Před rokem +3

      Mina... What are you babbling on about? What are you even referring to?

    • @coolghoul9
      @coolghoul9 Před rokem +4

      @@philmccracken2012 I think they are referring to the 120v outlet the universe gave to us, if only there was a bigger outlet to be discovered

  • @imopapa6680
    @imopapa6680 Před rokem +58

    Undervolted my GPU because it was getting too hot. It also increased the fps in some of my games. I feel like undervolting is really underrated.

    • @OmniUni
      @OmniUni Před rokem +15

      Yup. I actually wish they'd delve more into the various options that Adrenalin gives you. AMD's control center is REALLY nice, and makes it very simple to both undervolt AND overclock if you so choose.

    • @RetroPlus
      @RetroPlus Před rokem +2

      Yeah it's fantastic

    • @thetruthisoutthere5173
      @thetruthisoutthere5173 Před rokem +4

      Extremely easy and underrated unfortunately

    • @ffwast
      @ffwast Před rokem +6

      The manufacturers really configure these things for e-peen instead of reasonable power-to-performance.
      The top 150 watts of the 3090 Ti's power spec only buys about the top 10% of the performance: roughly 2/3 of the power for 9/10 of the fps in benchmarks, using the power settings from the equivalent professional Quadro-type card.

    • @jjcdrumplay
      @jjcdrumplay Před rokem +1

      It works better for me too to only use a bit of MSI overclock profile 1, because maybe the MSI profile 2 system RAM overclock wasn't stable, or maybe it was the AMD drivers that got better in the last two years. With this PC I'm always learning something.

  • @kloakovalimonada
    @kloakovalimonada Před rokem

    I remember reading about impending issues of quantum tunneling in 1997 with micrometer nodes

  • @maxdamage4919
    @maxdamage4919 Před 9 měsíci

    I remember AMD having a similar feature in the AM3 and AM3+ Dragon platform

  • @waynemiller3187
    @waynemiller3187 Před rokem +8

    This video really needs a warning about MBR partitioning and disabling CSM compatibility mode, because if you're not careful you can easily render your boot disk unbootable, and the only way to recover is to reset your CMOS

  • @Fredjikrang
    @Fredjikrang Před rokem +60

    A quick FYI: most breakers are only actually rated to run at 80% of their rated capacity continuously, so a 15 A circuit is actually only rated to power 1,440 W for longer periods of time.
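
    The arithmetic behind that figure, as a minimal sketch (Python), assuming 120 V mains and the usual 80% continuous-load derating:

        # Continuous load limit for a 15 A North American branch circuit.
        breaker_amps = 15
        voltage = 120                       # nominal mains voltage (V)
        derating = 0.8                      # 80% rule for continuous loads
        print(breaker_amps * voltage * derating)   # 1440.0 W, matching the comment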

    • @Phynellius
      @Phynellius Před rokem

      That, and if it's your own house, a 15 or 20 amp 240 volt circuit is easy to set up; you just should get it done by someone qualified or have it inspected. Unfortunately many consumer UPS systems are 120 V only

    • @m0r73n
      @m0r73n Před rokem +12

      Laughs in 230V

    • @Jaker788
      @Jaker788 Před rokem +5

      That's why every device you buy that's intended for more continuous use maxes out at 13 amps. Corded lawn mowers and space heaters are 13 A max, but blenders are 15 A or 1875 W max.

    • @igobyoz
      @igobyoz Před rokem +3

      But a 20 amp circuit is fine at 16 amps, or approx. 1920 W, making the 1800 W range just fine all day long.

    • @serdarcam99
      @serdarcam99 Před rokem +4

      Imagine you can't open your computer cuz of 110 freedom volts

  • @Mandaeus
    @Mandaeus Před rokem

    If you look at the top right of the top-line BIOS menu bar at 4:13 you will see the option "Resize BAR". This is the option you need. It was there on mine ;)

  • @Aleksaa944
    @Aleksaa944 Před rokem

    Thanks Linus, couldn't find it in the BIOS until I saw your video. Best YouTuber 💪💪🤠

  • @PixelShade
    @PixelShade Před rokem +13

    I actually thought you guys would do more of a deep dive into all the tech AMD has available in their software. Sure, you did talk about SAM, FSR/RSR, But there are other great technologies which are almost never mentioned by tech media and people often don't have a clue about. I am thinking about:
    - Radeon Chill (something for Alex hot room)
    - Radeon Boost
    - Radeon Image Sharpening.
    Radeon Chill & Radeon Boost are both movement-based framerate compensations. Chill basically lowers the framerate when you are standing still (to your lowest set value), instantly raises it when you move forward (to an in-between value of your min. & max.), and when you are turning, when you need maximum framerate, you get just that. This greatly improves power consumption, and with a FreeSync monitor you barely notice a difference from running full tilt (unless it's a highly competitive title like CS:GO where split-millisecond reaction is king).
    Radeon Boost, on the other hand, reads mouse movement and, depending on how fast you are turning, downscales the resolution accordingly. Which is awesome in theory, as you don't have the same perception of detail in motion. This, however, is not available for all games. It does work in Cyberpunk and Metro Exodus, to name a few. Unfortunately Radeon Boost is a bit harsh with nearest-neighbor scaling (I wish it were tightly integrated with FSR), and you can't use it together with Radeon Chill, which seems like a missed opportunity.
    As for Image Sharpening: although it is part of FSR/RSR, it can be activated individually for a next-to-nonexistent performance impact. It almost acts as a resolution bump, especially in soft TAA games (textures at 1080p with Radeon Sharpening can almost appear more detailed than 1440p without sharpening), yet it doesn't produce ringing artifacts. This is great when you want to increase perceived sharpness without affecting performance much.
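
    To make the Chill behaviour described above concrete, here is a minimal conceptual sketch in Python of an input-driven framerate cap; it is only an illustration of the idea, not AMD's actual implementation, and the threshold values are made up:

        # Toy Chill-style frame cap: idle -> min fps, moving -> mid fps,
        # fast camera turns -> max fps.
        def chill_fps_cap(mouse_speed, is_moving, fps_min=48, fps_max=96):
            fps_mid = (fps_min + fps_max) // 2
            if mouse_speed > 200:            # arbitrary "turning fast" threshold
                return fps_max
            if is_moving or mouse_speed > 0:
                return fps_mid
            return fps_min                   # standing still, nothing happening

        print(chill_fps_cap(mouse_speed=0, is_moving=False))    # 48
        print(chill_fps_cap(mouse_speed=50, is_moving=True))    # 72
        print(chill_fps_cap(mouse_speed=400, is_moving=True))   # 96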

    • @lake5044
      @lake5044 Před rokem +1

      Quite interesting! Especially for people who don't own an AMD gpu and don't know such things even exist. Thanks!

    • @Dark88Dragon
      @Dark88Dragon Před rokem +2

      Yeah, you are right... from my observations, people with AMD hardware are a little more into the tech than the average Intel, Nvidia or Apple user

    • @PixelShade
      @PixelShade Před rokem +2

      @@lake5044 It's actually kind of crazy that AMD isn't promoting these unique features more, and investing more time developing them. Like I mentioned, Radeon Chill alongside Radeon Boost are fantastic features, but unfortunately Radeon Boost still feels a bit half-baked. Right now it can't be activated together with Radeon Chill, and it uses nearest-neighbor scaling, which is a bit jarring. If this were connected to the in-game FSR pipeline (away from HUD and post-processing) it would be such a killer feature: you would basically get the FPS where you absolutely need them (when things get chaotic and you are turning a lot), and you would save on power/heat + gain fidelity when watching cutscenes, standing still watching scenery, or just moving forward (where you don't have a lot of motion). :)
      A fully baked, smart solution where you can combine Radeon Chill, Radeon Boost and FSR 2.0 would honestly be a REAL killer feature.

    • @CameraObscure
      @CameraObscure Před rokem +2

      I use Chill in most games I play and have done since it came out. It saves power, keeps my GPU cooler, and doesn't affect gameplay for the games I play; even better, it's not tied to any specific game/engine. There's also the fact that you can set features on or off for each game in your library, separate from the GPU's global settings, meaning you have no need to reset them each time you play that game, as it's stored in the driver already.

    • @PixelShade
      @PixelShade Před rokem +1

      @@CameraObscure Me too, awesome to hear that more people are using it! It saves me around 60 W on a Ryzen 2700 and a 6600 XT (which isn't a very power-hungry combo in the first place). Taking Witcher 3 as an example, I have opted for a range of 48 fps to 96 fps. When standing still (if I watch a cutscene, get a phone call, fetch a drink or snacks, or make some lunch) the computer only draws 63 W (48 fps). When walking forward (which you do for most of the game) the computer draws 91 W (65 fps), while when I turn the camera it draws 144 W (96 fps). I wouldn't actually "feel" any difference on a FreeSync display using this framerate range compared to a locked 96 fps, and the power saving is around 60 W. I mean, I could go full tilt at 120+ fps, in which case the computer would draw 177 W... so really the power savings are about 90-100 W, and the gaming experience itself is still really awesome. Saving around 100 W is actually substantial for room heat, and given the European electricity market it's also a substantial cost saving.

  • @landoishisname
    @landoishisname Před rokem +4

    Important to note that SAM can actually reduce performance in some cases

  • @o______-
    @o______- Před 7 měsíci +1

    I'm definitely getting an AMD CPU for my next rig.

  • @frando2479
    @frando2479 Před rokem +1

    Great... I cannot even boot anything anymore. Thanks for not mentioning any problems one could run into, like having to have MBR changed to GPT, or still not seeing that Resizable BAR option at all and what that means, etc.
    From "oh, just a few clicks" to hours wasted for nothing real quick.

  • @filleswe91
    @filleswe91 Před rokem +3

    "while you were sneakily watching Newgrounds on the school computers."
    You got me Linus. You got me good. This was way back in 2001 for me, 10 years old at the time. 👍

    • @Marco-pf3te
      @Marco-pf3te Před rokem

      Same, resonated for me! Was also 10 at that point in time :)

  • @zkatt3238
    @zkatt3238 Před rokem +4

    FSR works on Nvidia GPUs as well, since it isn't tied to specialized hardware like Nvidia's DLSS

    • @Ishan.khanna
      @Ishan.khanna Před rokem

      Yep
      Sadly both can't be used together lol
      At least I haven't seen it in action

    • @ABaumstumpf
      @ABaumstumpf Před rokem

      Yeah, because it is just upscaling and sharpening, not image reconstruction.

    • @janisir4529
      @janisir4529 Před rokem

      @@Ishan.khanna Just use a bigger upscaling for DLSS, duh.

  • @fabrb26
    @fabrb26 Před rokem +1

    Should note that if you enable FSR you keep the higher in-game resolution, and so do the textures etc...
    RSR, on the other hand, wants you to set the lower in-game resolution that the algorithm will stretch from, and so the textures and other assets follow it.
    My point is, if you are limited in GPU VRAM, upscaled 1440p or even 4K output may be achievable while native 4K is a complete no-go because of the maxed-out VRAM.
    For me, RSR makes using my 60 Hz 4K screen a reality on my R5 5600/6500 XT (4 GB) setup
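
    As a rough illustration of that trade-off, here is a minimal sketch (Python) of the render resolutions involved when upscaling to a 4K output; the per-axis scale factors are the commonly cited FSR quality-mode values and are an assumption, not something taken from the video:

        # Internal render resolution for a 3840x2160 output at common FSR scale factors.
        # The lower internal resolution is what keeps VRAM and GPU load down.
        output_w, output_h = 3840, 2160
        modes = {"Ultra Quality": 1.3, "Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}
        for name, factor in modes.items():
            print(f"{name:13s}: {round(output_w / factor)} x {round(output_h / factor)}")
        # e.g. Quality renders 2560 x 1440 internally and upscales to 3840 x 2160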

  • @gluteusmaximus7608
    @gluteusmaximus7608 Před rokem

    Good vid! Gonna try SAM now

  • @HumbertoHernandez
    @HumbertoHernandez Před rokem +11

    It's a shame only Deathloop supports FSR 2.0 as of now. I'd like to try it on my GTX 1070 as I cannot buy a new GPU. I mean, I'm not even in the USA or other countries with scalpers, and the stores still sell them way above MSRP, wth?

    • @cromefire_
      @cromefire_ Před rokem +1

      There are probably some games that will join Deathloop once it's properly released.

    • @Slayer666th
      @Slayer666th Před rokem

      Another proof that the "scalpers are at fault" narrative so many influencers push gets to people.
      Scalpers haven't been the problem since early 2021; it is all corporate greed…

    • @Lyajka
      @Lyajka Před rokem +1

      Farming Sim 2022 also supports FSR 2.0 now

    • @randybobandy9828
      @randybobandy9828 Před rokem +1

      The 1070 is still a decent GPU.

    • @K.R.X
      @K.R.X Před rokem +1

      Hitman 3 supports FSR 2.0?

  • @shadowsandfire
    @shadowsandfire Před rokem

    We get 2400 W per socket here in Aus: 230-240-ish V @ 10 A per socket outlet :)

  • @DivergentDroid
    @DivergentDroid Před 7 měsíci

    I'm running Starfield on a Ryzen 5 1600 CPU with 16 gigs of RAM and an RX 580 with 8 gigs of dedicated video RAM, on a Gen 3 NAND NVMe M.2 SSD. Those are way below the minimum system specs. I get between 35 and 63 FPS depending on how busy the scene is. I am very happy, but I want my low end to be higher, so I'm watching this video!

  • @Ferrari255GTO
    @Ferrari255GTO Před rokem +4

    I had been wondering what the hell that "Above 4G Decoding" option was, but I was too lazy to look it up. Good to know I can get an advantage out of it.