AMD just Fixed Frame Generation for Everyone- FSR 3.1 Update
- Added 17 May 2024
- Ugreen NASync Private Cloud Storage Solution: bit.ly/499eELY
40% off from Mar 26th 8:00 am PDT to Mar 28th 8:00 am PDT (48 hours)
40% off from Mar 26th 11:00 am EDT to Mar 28th 11:00 am EDT (48 hours)
Up to 35% off during 3/28-5/9
People who pay a $5 deposit on the pre-sale will receive 40% off from 3/26-5/9
AMD just announced FSR 3.1! It's obviously an improvement on FSR 3.0, which was already extremely exciting because it brought frame generation to FSR and, in turn, to almost EVERY GPU. Nvidia shipped DLSS 3.0 Frame Generation before this, but its main disadvantage is that it is ONLY available on Nvidia 40-series GPUs. FSR 3 solved that problem for a lot of people, although it has had its own drawbacks. FSR 3.1 aims to eliminate many of those issues, making FSR as good as it can be!
==JOIN THE DISCORD!==
/ discord
Nvidia: • What is NVIDIA DLSS 3?...
community.amd.com/t5/gaming/a...
0:00- Frame Generation
2:00- Sponsor Spot
3:15- Mixing & Matching Upscalers and Frame Gen!
6:40- FSR Upscaling Looks Mid Right Now
10:30- Huge Image Quality Gains for FSR 3.1
13:20- IT'S NOT MAGIC, buuuut...
15:23- When is this happening?
16:25- Other ways FSR 3.1 could be great
#Ugreen #UgreenNASync #UgreenNAS - Science & Technology
Credits to everyone who stayed on AMD's ass about fixing FSR, you are the real heroes in this
They would have to improve it anyway
@@diamonshade7484 as much as I prefer AMD, they probably wouldn't of fixed it.
These companies seem to only move when there's an uproar and they're losing money.
Wrong, they were working on it; people just can't wait for shit. An AI version is coming too @@patrickstar3820
@@patrickstar3820 FFS y'all stop saying "of" instead of "have", what is wrong with your grammar?
I only stuck with AMD because they are the cheaper option and their driver software doesn’t look like it’s from the early 2000s
Nvidia "Screw our previous customers!".
AMD "Here you go boys, we won't kick you to the curb.".
Realest shit
AMD does that to get more customers and make more money. They would screw customers if they were dominant.
Nailed it!
If they want to surpass Nvidia, sacrifices should be made lol
@@itachi7976 Said the scorpion to the frog.
It's a race between AMD's software and Nvidia's hardware. AMD is doing very well considering it requires no specific hardware
Lol no. It's a race between Nvidia's software and AMD's hardware. Generally, even though AMD hardware is better value, they fall behind in terms of software. Frame pacing issues are still rampant with FSR frame gen. DLSS upscaling remains superior, and so does DLSS frame gen on the newer GPUs.
And of course there's the high end, where AMD hardware can't come close to Nvidia's.
He meant that AMD upscaling doesn't require specific hardware; it's trained by AMD developers, but DLSS works because of AI cores and CUDA cores @RadialSeeker113
They're both software technically, FSR 3 is just software accelerated rather than CUDA accelerated. Technically, DLSS 3.5 doesn't require specific hardware either, as both the 20 series and 30 series have the same cores as the 40 series, just of different generations and amounts.
That's why I don't buy Jensen's bs that "Ampere cannot handle frame gen". Nvidia could both create software accelerated frame gen for older cards or just allow cards with the needed hardware to run them. If FSR ever overtakes DLSS, Nvidia will have no choice but to capitulate and stop locking DLSS features.
@@aouyiu Of course you're not buying Jensen's "BS"; after all, he knows how frame gen works, while you assume it works the exact same way upscaling does.
Nice way to put it
FSR 3.0: "He does exactly what I do" FSR 3.1: "But better"
DLSS AND frame gen. As a 3000 series user, this pleases me immensely.
This is NOT the first time AMD has saved old Nvidia GPUs. GTX users can attest to this.
They could also support older gens like AMD does.
@@HotFreshBox The support for older cards is limited even with FSR 3 upscaling. FSR is designed for AMD cards only. No other GPU manufacturer uses FSR or optimizes their drivers for FSR. FSR is 100% optimized for AMD hardware only. The issue with frame generation is that bad upscaling makes it look worse.
@@GodKitty677 FSR will run on Nvidia and Intel as well, and Intel XeSS runs on AMD and Nvidia too. DLSS is the only one that's hardware locked.
Because DLSS is hardware optimized it makes sense to use it over FSR on Nvidia so cards supporting DLSS should prefer it
@@Rohndogg1 AMD has a tiny market share; FSR is dead on arrival. Soon FSR will use AI as well, once AMD cards support the feature in hardware.
@@GodKitty677 FSR is not dying any time soon 😊
I feel like AMD giving FSR 3 access to all GPUs is their way of flipping off NVIDIA for restricting DLSS 3 to 40 series as well as prompting people with older GPUs not to upgrade to 40 series and I'm here for it.
The fact that Nvidia wouldn't even make it available to RTX 30 series lol
I own a 3060 and this is amazing for me. I can already play any AAA game at 70+ fps in 1080p, but this is gonna be the cherry on top for VR lmao
Yup, and Nvidia fanboys still shit on AMD while using AMD's open-source tech on their old GPUs. "AmD has nO TeCh"
They don't have to. It's not in their company's best interest as they're more focused on AI and making 40, 50, and 60 series cards relevant.
@@Blue-op6qv Except it won't; flatscreen frame gen simply wouldn't work in VR
and VR in fact already has an established frame-gen tech called asynchronous spacewarp (the name differs across implementations, but the concept is the same: halve your fps and interpolate up to the target framerate)
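The "halve your fps and interpolate to the target framerate" idea described above can be sketched in a few lines. This is a hypothetical, pure-Python illustration (frames as flat lists of pixel values; the helper names `blend_frames` and `double_framerate` are my own); real frame generation like FSR 3 or asynchronous spacewarp uses motion vectors and reprojection, not a plain blend.

```python
# Naive frame-interpolation sketch: synthesize an intermediate frame by
# blending corresponding pixels of two rendered frames, then insert one
# generated frame between each pair of real frames to double the stream.
# Real techniques use motion vectors and occlusion handling; this only
# illustrates the concept of interpolated ("fake") frames.

def blend_frames(frame_a, frame_b, t=0.5):
    """Linearly interpolate pixel values between two frames at time t."""
    return [round((1 - t) * a + t * b) for a, b in zip(frame_a, frame_b)]

def double_framerate(frames):
    """Insert one interpolated frame between each pair of real frames."""
    out = []
    for prev, nxt in zip(frames, frames[1:]):
        out.append(prev)
        out.append(blend_frames(prev, nxt))  # the generated in-between frame
    out.append(frames[-1])
    return out

# n real frames become 2n-1 frames, i.e. roughly double the framerate.
```

Note the last real frame has to arrive before the in-between frame can be computed, which is where frame generation's extra input latency comes from.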
Future of gaming is lower resolutions and fake frames. Nice.
Who cares if it looks nice and smooth
@@tuna1867 Just that it doesn't, but yeah ok 😅
@@Yoshi92 In time it will. Right now it doesn't
@@steadyclouds4614 I use it sometimes, but dude, he has a point. I don't want low res and fake frames; I want optimized games I can play at 2K native. FSR 3 can look nice, but by definition it can never look as good as a native 2K image. Some people think FSR 3 looks better, but they're usually the type who like bloom, blur, and all the fake GFX effects that are dog water; it's fake GFX effects used with FSR 3. I paid $500 for a 2K Asus monitor with HDR and perfect color correction. If I didn't mind how my image looked, I would have gone with a $70 1080p "China number one" monitor.....
Higher input lag, lower resolution, and blurry TAA, all with zero large graphical leaps in a decade (ray tracing is not a gamechanger lol) and Moore's law no longer being valid. Hell yeah, the hecking future dude, time to spend 800 euro on a new card 😎
I want AMD to really make FSR be 1 to 1 with DLSS. That way Nvidia can maybe realize that paying hundreds of dollars extra for better software isn't gonna cut it anymore.
And as gamers, I don't think paying like 25% more for a GPU to get better ray tracing performance is gonna cut it anymore, once the upscalers are even.
Better software is controversial; for example, I like AMD's software more. And FSR can't be as good as DLSS simply because Nvidia uses AI, and to be more effective they use CUDA cores.
Do you think that Nvidia which just became the third biggest company on the planet cares about that handful of moaning gamers?
@@shagohodds They aren't even a gaming company anymore.
@shagohodds maybe not, but they should, because the "moaning gamers" got them to that point
I don't understand why people think developing DLSS was easy or free. Somebody had to say "hey, AI upscaling might be the next big thing", and probably assign a team of people and paid them for a year IN HOPES that they produced something worth selling. I'm sure Nvidia has many projects like this that didn't pan out, and the premium from products like DLSS has to cover those costs associated with creating an entirely new product. AMD on the other hand gets to play the role of white knight, criticizing Nvidia and offering similar tech at a lower cost. Why yes, it's much cheaper to copy something that works because you just have to pay one team to make this one thing, instead of maybe 10 teams to figure out what actually works as a viable product.
Another improvement you didn't mention: FSR 3.1 will be available for Vulkan and DX12. That eases usage on Linux-based operating systems a lot if you don't want to rely on Proton or Wine.
FINALLY! SOMEONE THINKS OF THE PENGUIN!
@@Takintomori I'm someone who was working on compiling FSR 2 for Linux easily once AMD provided Vulkan support, but I didn't check whether GCC or Clang on Linux have any issues with their build process.
So I'm already waiting on the Vulkan implementation to arrive, so I can test and adjust the code to make it build on Linux.
I think Godot developers are also eagerly waiting for FSR 3.1. The engine already supports FSR 2.2 across all platforms but the plan is to support FSR frame generation as well.
@@thejackimonster9689god bless you and your work
Finally!
I don't mind the fake frames AMD is doing. They are enabling fake frames for everyone, providing even some extra life for old cards. What I do mind is what Nvidia did, making fake frames the main exclusive feature for a new generation of GPUs.
Not just a new generation of GPUs, but also games, as DLSS doesn't work in as many games as FSR does
Even in this video he says why though.
"Put shit in, get shit out."
It makes sense that they wouldn't want their new expensive technology to get a bad reputation from gamers who think it is shit, because they are trying to framegen up from 15fps.
Gatekeeping frame gen was the saving grace of the 40 series because, without it, it's not a substantial enough upgrade from the 30 series. It's the only reason the 4070 Ti "beat" the 3090 in non-4K gaming.
If you don't know how the tech Nvidia developed works, you should probably stop talking, or in this case writing. Seriously, every time AMD does something good, you mentally challenged RTX 2000 and 3000 owners come out of the woodwork complaining about how Nvidia kept innovating both software- and hardware-wise.
@@Lem_On_Lime The entire point of using frame gen at 15 fps is to hit near 60 fps for a better gaming experience. No matter how good the damn game is, playing at 15 fps is unbearable. People trying to get more FPS from a starting point of 15 fps don't give a shit about how the graphics look. People with high-end PCs aren't going to use frame gen and sacrifice the look of the game if they're already running it at high frame rates natively (which is the best quality graphics you can get, besides modding it).
This was one of the most sane, most clear, and most honest presentations of FSR and DLSS that I have encountered, and I thank you for it.
I just went through, and honestly am still going through, an internal battle about upgrading my gaming rig. I consider myself fortunate to have a 6800xt gpu with 16gb of vram. It has served me well. But game studios are really pushing the limit of graphics in new games, and it can be very easy to feel like you are being left behind. I do understand that a lot of this is market strategy to get consumers to constantly be needing to purchase new and "better" parts, but, DAMN, is it ever convincing!
I am an older gamer (74!) and my main focus is not so much on a lot of the new games as it is on being able to play the games I already love to the best of my ability. One of my BIG favorites is to heavily mod Skyrim with about 2000 or more mods and see how smoothly I can get it to work well! It's a real challenge, and a lot of fun, and is always teaching me a lot about optimizing my graphics!
I also love a lot of older rpg games like past Baldur's Gate, Neverwinter Nights, etc., and these games are usually not that heavy on gpus.
When i do play more recent games my pc can sometimes struggle, but I have always put a good story before graphics, and usually find a way to make them playable with decent graphics if I really want to.
What AMD is exploring right now is very exciting to me, and especially if people can find ways to take the new ideas and learn how to adapt them to lower spec pcs. This has actually happened in the Skyrim modding world. Modders have found ways to make use of DLSS, upscaling, and some FSR techniques in heavily modded and very graphics-intense versions of the game.
I really think that if this new FSR from AMD actually works well, Skyrim modders will find a way to use it in such an old game, and will help to make it possible for it to be used in other past games!
So, I will try to wisely refrain from upgrading right now, and see what happens in the next couple of years.
Apologies for such a lengthy post. Old people do tend to go on . . . . and on . . . and . . . . !!!
I know the feeling haha. I’m on a 5700 XT and since I mostly play cs and some Baldurs gate, civilization and other games that aren’t all that demanding, I got along pretty well so far. Hell Divers 2 got me thinking though 🙃 definitely want to update now but also kind of want to wait a little bit longer to see what’s next!
Respect to the veteran gamer, wishing you good health 😊
I literally just put together a rig with the 6800XT. I mainly play at 1080/1440 and it works flawlessly. Not sure why yours is struggling; possibly a CPU bottleneck? Driver issue? Regardless of my curiosity, I just stopped by to say it is definitely a terrible time to upgrade.
1. GPU prices are outrageous (hence why I bought an older GPU, the 6800 XT).
2. In every benchmark I can find, the 6800XT actually beats (in rasterized, non-RT performance) the 7600/7700 and the 4060, and holds its own against the 7800 (and XT)/4070 (and Ti), losing out in some games (it varies) but by what I consider negligible amounts (10-20 FPS) for the cost. That leaves you with a reasonable upgrade path to a 7900 or a 4080, both of which are, IMO, overpriced for the performance gain.
I would look into possible performance issues as I don't honestly think your 6800XT is performing at optimal levels given the information at hand. Best of luck with whichever course you decide on!
@@ryanwilliams5705 I think you may be reading too much into my comment, or possibly just assuming too much.
The ONLY time I mentioned my GPU struggling is this comment:
"When i do play more recent games my pc can sometimes struggle"
If you read the whole sentence, I do mention being able to find a way to make them playable also, and with decent graphics and fps. This usually comes down to a few very easy tweaks either within game settings or GPU settings.
You make it sound like I am experiencing serious problems that are making my games unplayable, when I NEVER SAID THAT!!!
What happens when we assume?????
What you said about 6800xt benchmarks compared to newer GPUs and the prices of new GPUs for the slight increase is spot on, and is exactly why I am NOT upgrading at this time.
What you didn't point out is that these new GPUs also need more power and run much hotter. So the hidden cost of moving up to these new cards can also mean upgrading your power supply and cooling system, and possibly a new cpu and motherboard as well. Add to that higher electric bills, and the few extra fps suddenly don't look that good any more!
If my response to your assumptions sounds too harsh, I apologize. But people are going to read your comment and get an impression that simply is NOT true! I DO keep up with new drivers and I DO optimize my settings for my pc in general, my GPU, and my games. What "struggles" I may experience are very brief and are usually resolved with a few easy tweaks.
The 6800xt is an amazing GPU, and there are only 2 or 3 new GPUs that match or surpass its 16gb of vram or its performance!
@@denniswade4998 "If you read the whole sentence, I do mention being able to find a way to make them playable also, and with decent graphics and fps. This usually comes down to a few very easy tweaks either within game settings or GPU settings.
You make it sound like I am experiencing serious problems that are making my games unplayable, when I NEVER SAID THAT!!!
What happens when we assume?????"
I read the sentence, rather paragraph, in its entirety. While I do agree my curiosity sparked some assumptions, context is also important. Per your comment:
“I just went through, and honestly am still going through, an internal battle about upgrading my gaming rig. I consider myself fortunate to have a 6800xt gpu with 16gb of vram. It has served me well,. But game studios are really pushing the limit to what graphic levels are in new games and it can be very easy to feel like you are being left behind.”
You stated you were and are currently considering upgrading your rig and specifically mention your GPU and the demand on GPU’s in newer games today. Which leads to my next points.
“I do understand that a lot of this is market strategy to get consumers to constantly be needing to purchase new and "better" parts, but, DAMN, is it ever convincing!”
“When i do play more recent games my pc can sometimes struggle, but I have always put a good story before graphics, and usually find a way to make them playable with decent graphics if I really want to.
“Modders have found ways to make use of DLSS, upscaling, and some FSR techniques for use in heavily modded and very graphic intense veersions of the game.”
After your statements about the conflict you were having about upgrading your rig, you state that purchasing new and "better" parts is convincing. You go on to state that, yes, your PC sometimes struggles, but again, context is important. You then state you put a good story before graphics (implying your card is underperforming). You also state you can find a way to make games playable with decent graphics if you really want to, again implying your GPU is underperforming, and implying you have to heavily change your settings, given the "if you really want to" bit. Yes, I may have made some assumptions, but the context you gave led me to them.
"What you said about 6800xt benchmarks compared to newer GPUs and the prices of new GPUs for the slight increase is spot on, and is exactly why I am NOT upgrading at this time."
That is great to hear. I honestly just can’t stand the business practices at hand here with all GPU manufacturers and hate to see people give their hard-earned money to greedy corporations at such an outrageous markup. It seems to determine how they price at launch as demonstrated with the GPU shortage when the mining boom occurred. I believe the only way to rectify the current pricing gouge is to stop supporting it all together (not upgrading), until the price becomes more reasonable.
"What you didn't point out is that these new GPUs also need more power and run much hotter. So the hidden cost of moving up to these new cards can also mean upgrading your power supply and cooling system, and possibly a new cpu and motherboard as well Add to that higher electric bills and the few fps rates suddenly don't look that good any more!"
You are correct! Not only that but, in your case specifically, unless you went to say a 6900 or 6950 XT, you would indeed need not only a new MOBO but also that new fancy DDR5 ram. Another hidden cost, and another reason I went with the previous generation of cards.
"If my response to your assumptions sounds too harsh, I apologize. But people are going to read your comment and get an impression that simply is NOT true! I DO keep up with new drivers and I DO optimize my settings for my pc in general, my GPU, and my games. What 'struggles" i may experience are very brief and are usually resolved with a few easy tweaks."
Not harsh but a bit condescending. It is ironic though, the condescending bit about making assumptions and now the shoe is on the other foot. I never said your drivers were not up to date. I said driver issues. Which could mean a corrupted driver from download or something less than ideal AMD put into the software causing drops in performance for certain games.
So what happens when we assume????? (sorry couldn’t help myself).
"The 6800xt is an amazing GPU, and there are only 2 or 3 new GPUs that match or surpass its 16gb of vram or its performance!"
I agree wholeheartedly! The price to performance was in a league of its own and the main reason I went with it. I still think it was too expensive though 😂😅 ($500).
Got a 7800XT this year and I'm pretty excited about this news. Will have my eyes on this as it develops; can't wait to see, for example, how Cyberpunk will look with FSR 3.1!
You can always use XeSS till FSR 3/3.1 releases :) should work just fine at 1080p.
I think even ray tracing would work well with FSR 3, making Nvidia's selling gimmick obsolete
@@DL-iy7ob Probably still not, tbh, but it's a win for AMD users for sure
@@DL-iy7ob Only people who can't play ray tracing or path tracing say this...
I ran around screaming the same shit until I went out and got a 4090, and it's a night and day difference at 4K.
I wanted to thank you, because I ended up buying an AMD GPU thanks to you and Owen's videos. I ended up liking my AMD card so much that I built an entirely new PC with an AMD CPU to go along with it. I previously had Intel/Nvidia. I don't intend on going back to Nvidia or Intel, assuming AMD continues to do right by their consumers. They have done right by me.
I don't understand why amd keeps helping the nvidia users. They are clearly not grateful.
Nvidia and its fans are the weights tied onto the legs of gaming right now. Bunch of pricks who think they are shareholders 😂
The input delay is so bad you can keep it
I'm grateful af and this is a big reason why my next gpu will probably be from amd. It's excellent marketing.
Because AMD understands gamers: the more free stuff they give everyone, the more people will like them and be open to buying an AMD GPU. At the same time, Nvidia is being anti-consumer and competing with their own old cards, so fewer people overall will need a GPU upgrade. In the long run AMD will win, because we're now seeing that Moore's law is dead and each generation is not that big of an improvement. When everyone has 4K 120 fps, nobody will care about new GPUs. Go to any Best Buy and look at OLED 4K vs OLED 8K; nobody is gonna upgrade to 8K, it's like a 5% difference for triple the price
@@Iisakkiik Check the comment above you. You can understand where I'm coming from.
How Cyberpunk still doesn't have 3.0 is wild. It was announced to get it when FSR 3 was announced
They announced that they will stop updating this game; the recent patches were just bug fixes without adding anything. Maybe they will finally add it soon, but it also took them a few months to implement FSR 2 in an update, even though modders did it shortly after the technology launched.
Cyberpunk is basically NVIDIA tech demo. Of course they are not supporting AMD. Besides, I don't understand why people praise it so much in terms of graphics. IMO the game is fugly and looks "flat". Even with path tracing enabled.
@@TheMahtimursu They don't have an exclusivity deal; it still supports XeSS, FSR 2.1, and AFMF. So I don't see how you came to the conclusion that it's not supported. And it's one of the most detailed games out there, from the reflections to the shadows, lighting, and particles. There's a reason it's used as a heavy GPU benchmarking game. It's one of the best looking out there.
@@TheMahtimursu Stop huffing that copium bro
@@kineticbongos No, it is definitely not one of the best looking games. I thought I was crazy when I started playing it and it just looked bad. I googled a bit and it seems many others agree. It just looks flat; that is the best way of describing it.
Don't forget that NVIDIA is not a gaming company anymore; they are (they call themselves) an AI Factory.
You can look at their latest GDC for more details.
They are going the good old Apple route: less hardware and more software focus
This channel is so underrated fr. Your videos are so detailed, informative, to the point and understandable
This is why I prefer AMD over NVIDIA. NVIDIA always feels entitled, restricting stuff that should be usable freely by everyone; that has been proven
Nvidia uses hardware for frame gen, AMD uses software. That's why Nvidia looks better and runs better. Nvidia is more expensive for a reason, they don't use dollar store software tricks like AMD
@@EveryGameGuru Haha that's a great way of putting it over 😂😂
Thank god there is Nvidia.
They're the ones who R&D all this stuff that AMD is trying to copy.
@@EveryGameGuru Nvidia fans are so dumb smh...
It wouldn't be so bad if Nvidia decided not to be a scummy company, but here we are. They expect their bells and whistles to sell the product, not its performance. They still sell absolute garbage-value cards and get away with it because they are "the standard" when it comes to graphics cards.
Can you test it in Path of Exile (with its custom game engine)? How (bad) is the image quality in fast moving game with lots of stuff moving on the screen?
Would this work on existing FSR 3 games, or would they need another update?
The fact that AMD is willing to even do this is awesome, and they're still trying to fine-tune it.
I have a question, and I know nothing about this stuff, but if FSR 3.1 is good, will the 7900 XTX be better than the 4080 Super?
Would this work with old laptop GPUs
Such as the AMD hd 5470
I was all excited to use frame gen on Forbidden West today, and it was only giving me an extra 15% fps. It's odd how some games give you a massive uplift while in others it falls on its face.
Probably goes to whether you were limited by gpu or cpu.
I don't see the point in endless fake frames and lag. Surely a small boost with no lag makes more sense: a boost if you like, rather than a doubling or tripling. So 45 to 60, 60 to 75, etc. makes far more sense to me
@@audie-cashstack-uk4881 Well, I'm using it to go from 85-90 to 120-140 depending on the area so latency isn't exactly an issue.
@@audie-cashstack-uk4881 Nvidia reflex makes the lag problem a non issue
Now they need to invent fake friends
I'm sure there's some AI friends out there
Well, Nvidia is gonna release an RTX chatbot, so....
so, xess with FSR 3.1 is the superior option now? (For non RTX users)
Is DLSS with AI learning really worth more (now and in the future) than AMD FSR?
Just got a Sapphire Pure 7900 GRE, but the alternative was a 4070 (as my old PSU doesn't support 4070s).
YES.
Man, I need to get some FSR in Minecraft
I'm pretty sure you can make it work on bedrock and java with some mods
@@Hash-6624 Or just use some optimization mods. Minecraft is hilariously broken in so many ways, that some mods can double or triple your performance, while using higher settings.
@@Hash-6624 Not yet; the biggest issue is that Minecraft uses OpenGL to render, which can't support FSR. However, there's a guy making a mod so that Minecraft uses Vulkan instead. Vulkan does support FSR, but the mod is still far from perfect
Use lossless scaling with LSFG enabled
@@MicahDaRhuler There are already 3 "fps boost" mods.
Once FSR is up to par I'm switching immediately. Best case scenario would of course be if these GPU suppliers stopped acting like 100% inflation compared to 6 years ago is normal.
So it's possible to use the DLSS upscaler along with AMD frame gen?
~8:07 Can't you fix angled textures with higher texture filtering? Maybe in the control panel
Are you secretly a huge Hollow Knight fan? I seem to hear a lot of music from that game in your videos :D Keep up the great work.
The one game I've used FSR in on my GTX 1650 GDDR6 is Deep Rock Galactic, and maybe Enlisted depending on the map. It looks really good, and on medium(ish) settings I'm getting a good 90 to 107 fps on a 100Hz monitor
I used FSR 2 on my 1050 Ti in RDR2, but it looked too bad and didn't even have much impact on my FPS, maybe 3-5 FPS at most.
@@domseyboi You're on a 1050 Ti. Frame gen is great, but it can't revive a corpse
but he wasn't talking about frame gen...@@RadialSeeker113
@@RadialSeeker113 Yeah I know, but the point is that it makes the graphics muddy.
@@domseyboi it really doesn't with dlss frame gen
Hey, I play minecraft on my pentium g640. Will fsr 3.1 work on my integrated graphics?
Yo bro, can you tell me an AMD alternative to the 4060 Ti 16 GB? And also, will I get a bottleneck with the i5 13400F?
FSR is finally worth using under 1440p base resolution...... it was time, dang it!!!!!!!
Seriously, an Nvidia GPU was just worth it because DLSS basically dominates FSR under 1440p. Glad to finally see major improvements to FSR; it was much needed
Does this mean that older cards will run at more fps? And how old can a card be and still support FSR?
Do I need this kind of stuff if I'm playing on a 1080p monitor with an RX 7700 XT? If yes, can you please explain why.
imagine buying an nvidia gpu just to get more support from the rival company
Imagine buying something, not understanding how it works, and then complaining that your now technically obsolete product doesn't have the latest bells and whistles that come with the newer product.
@@AlucardNoir I have honestly never used a product that wasn't "obsolete". I have only ever bought 2nd hand graphics cards, currently using an rx 570
😂
@@jacobe5398🤣 🤣🤣
It's funny that they say getting 60fps should be the minimum before using frame generation, cuz if I'm already getting a steady 60+fps, I'm gonna be LESS LIKELY to use frame generation.
Use it with something on low fps and you will see why...
the latency can't be undone
For me, anything above 80-90 fps and I will upgrade resolution. 4K Balanced or Quality looks so good with FSR with the frame gain. Other than that I will choose native.
60 fps feels like crap on high refresh rate monitors using a mouse and keyboard, so it really makes games feel smoother at the cost of latency
> if I'm already getting a steady 60+fps, I'm gonna be LESS LIKELY to use frame generation
Why though? 100fps is so much more enjoyable than 60fps. Why would you even hesitate to use FG?
@@intelligentdesignmorty8112 I've tried this on a PC in my living room. I maxed out graphics in and was at 33FPS, turned on FG, and stayed at 60. On a controller sat far enough away it really doesn't feel terrible at all. However, I think that improvements to anti-lag+ would be ultra beneficial to ALL users who turn on FG, from low framerates to high. It would allow for a much larger userbase to get that 60FPS experience with far fewer drawbacks. After that, it just needs to be added to as many games as possible, so hopefully modders can find a feasible solution similar to the way AFMF and RSR work.
Guys, I don't know much about graphics cards, but I have a question: will FSR 3.1 come to the RX 5700 XT, and the frame generation too?
When did you get all those gpus?
Everyone with nvidia gpus "FSR looks so bad why would you use it? im just gonna upgrade."
Everyone else "How about now?!"
So 80%+ of computer gamers on the Nvidia side vs the rest... minus the Apple and Intel gamers. Yeah...
They showed you 4 images of FSR 3.1 and now your dickriding is crazy
I can turn on my 4090 and play my games (it just works). When I had AMD (7900 XTX and 6950 XT) I had driver timeouts and random shutdowns...
There's a lot more into these companies than just upscalers or RT...
I had AMD for 15 years until 6 months ago, and I'll never touch another AMD GPU. I will only buy Nvidia's flagships from now on after seeing the difference.
@@alkine3011 You are an Nvidia fanboy and a liar 😅
@77gaza I've probably spent more money on AMD products than you, considering both the 6950 and 7900 XTX were over $1,000 a piece, and that's not including CPUs... You making that comment shows who's the fanboy. When something is said about AMD, people get butthurt, but you're probably running around with a mid-tier card because you're broke, talking about how "good" something is 🤣😂
Ray tracing and Path Tracing are a gimmick also right? 🤣
My next card is gonna be an AMD, but only once my 3090 is irrelevant
That would be a looong wait, as even 2060 is still a good pick.
@@alexturnbackthearmy1907 True, and those are still selling for too much on the used market, sellers need to drop prices since even used 5700XTs are going for a hundred less than them, but yeah 3090 isn't going anywhere.
Depends for what you use 3090. RTX 4090 is already struggling in many AAA titles on 4K. Once the new 5000 series GPUs come, a 4090 will be a "high mid-range" card from Nvidia.
I game at high resolutions, 5K-7K at 45-70 fps, on my overclocked RTX 3090 Trinity, so with frame gen I can switch to AM5 and the new 9900X3D from my 5900X, free up a 15% bottleneck, and get a DDR5 upgrade, so I will wait for the RTX 5080 Ti no problem.
@@unknownorigin8446 The only game my 4090 has struggled with is Alan Wake 2, and that's at max settings, native 4K.
Great to see AMD improve FSR.
I wonder how common frame generation would be nowadays if it had been released with Force Unleashed 2 back in the day.
It had quite interesting, working frame doubling.
I love that you used Control as the control in the DLSS comparison LOL
Have you tried Lossless Scaling? It's amazing; you can even scale YouTube videos / any window to double the frames
I'm using it for Enshrouded to good effect. I'm aware of some minor artifacts around my character, but it really doesn't bother me at all. The smoothness feels good.
@@jeremyengel7332 I agree
Even movies on VLC, at 48 fps @@hupe5836
I will switch to AMD just out of principle!
Great video and very exciting news, but one of the biggest issues still remains: no FreeSync support when FG is enabled :(
Where can I use it though? Or is my GPU, the Sapphire Radeon 580, too old?
Try the uniscaler preview mod
*Decoupling FSR upscaling and FG is a massive win for NVIDIA 30 & 20 series holders*
Facts, DLSS and FSR3 FG will be huge for those people
@@iitzfizz Yep free upgrade thanks to AMD lol
It's even good for AMD owners. You could use XeSS if you prefer it, or native + frame gen
My next GPU is going to be AMD just to support AMD.
FSR FG really smooths out my Cyberpunk gameplay at 1440p. It's like way better motion blur.
So it has to be opted into the game by the dev?
Can't I just download it and run it?
what do you think about lossless scaling ( steam app ) ?
That was the only way for me to run lords of the fallen on stable 1080p 75fps with the rx 5700 xt
Weird, 75 fps @@TheInfectedVoice
The problem is that Nvidia uses the tech exclusiveness to sell cards. FSR 3.1 is fantastic but it won't sell AMD cards, in fact it will probably sell Nvidia cards, as you will be able to mix the quality of dlss with the FSR frame generation.
This may be true, but Nvidia overprices everything to an insane level. AMD is taking a step back and making itself the choice for budget/normal people, meaning by and large they will have the higher demand and supply, vs. Nvidia with high demand but not many people able or willing to buy. If you're content with what you have, and it costs $1,000, $500, $300, heck, even $100 less, and you get the better future-proofing (VRAM etc.), then why go big unless it's needed for work?
@@Koki-qe7vz You missed the point. AMD really needs to stop being the "nice guy" and stop helping nvidia cards with extra features. Why would anyone buy an AMD card to use AMD features when they can buy an Nvidia card and use both Nvidia and AMD features? AMD needs to start being exclusive and make their features only work on AMD cards.
@@brutlern their features are not good enough for that, you and I both know that.
It's quite amusing how, until recently, all the AMD fans were bashing Nvidia for their upscaling and frame generation capabilities. But now, they're the ones who seem most excited about these features. LOL, how the tables have turned!
Nope, they're 30-series, 20-series, and GTX users in disguise (me), cause Nvidia's policy is stupid
Great, now is there an easy, free way to install it? How? Running an Intel A750; I tried to use it and couldn't get it to work.
Hopefully they can make it actually work now. I've seen multiple games lately drop FSR support right before launch because it was causing stability issues. And not just while it's in use, but even when it's just integrated into the game and turned off.
Wake up, Babe! Vex uploaded a new video
Still after 3 months can't use steam vr with AMD drivers, gotta love it.
May I know which game? I have an RX 570 + Pico 4. Just playing Beat Saber at the moment. Planning to upgrade the GPU to play more demanding titles.
I'd like to know which game as well
Running 6700xt + Quest 3, had no issues with No Man's Sky or VRchat so far.
On any recent driver past 23.12, launching SteamVR gives an error 309, which indicates that part of SteamVR isn't loading correctly. Every recent driver, even the most recent one released on 3/20/24, still gives this error for many users, and we have to use outdated drivers just to run SteamVR. This, for obvious reasons, is a bad situation, as we have to use outdated drivers from December, which heavily affects all games in terms of artifacts, driver crashes, incorrect/missing textures, and so on.
And effing up their drivers? No, say it ain't so. Who would have thought that the company with a track record of effed up drivers would make effed up drivers? They must be some sort of psychic.
@sweetroll_riley Interesting, I'm on version 24.2.1, which is the latest rn. What's your GPU?
The issue seems to be affecting 7000 series, which isn't too surprising because it's a new architecture. Still a shame it has all the issues to begin with.
I sincerely hope that Nukem will implement the FSR 3.1 tech in upcoming versions of his FSR2FSR3 mod. One thing I don't understand is how one will officially choose FG in games when selecting DLSS for upscaling and not FSR: will it autodetect that there's an RTX card in the system, so that selecting FSR 3.1 shows the option to enable FG but uses DLSS for upscaling in the background? I mean for the RTX 3xxx series, for example.
Your vids are so good man
Fake frame nice😂
Everything is fake
fake frames, fake resolution, thank you nvidia for moving the focus away from real performance.
You can thank Nvidia for that; AMD just gives it to you for free. Learn.
Games themselves are fake "reality". If the tech is good and for everyone, I don't see why you'd be so negative.
Frame generation, money generation😂
Fake frames and fake thumbnails. Stop with the clickbait garbage.
Can I use FSR 3 with an MX130 GPU?
*_Any GPU?_* 🙄
*_GCN GPU's:_* ☠️
Nvidia is a scam nowadays
THEIR gaming revenue is less than their new AI business.
@@vmafarah9473Gamers don't care about that.
@@morgueblack Nvidia don't care about gamers.
@@morgueblack True, Nvidia doesn't care about gamers, but they do care about their new AI business
@@MrHacking7 That's why more gamers should buy AMD or Intel. They have the price-to-performance advantage.
Does FSR work better on AMD gpus or is it about the same?
Finally... Is it applied to all games or only new releases?
The ones that give you the option
I don't think the GTX 1080 Ti will ever die, with AMD supporting it better than Nvidia themselves
The ghosting and shimmering and temporal stability is even worse in Minecraft.
I'm a bum. I have an AMD 6700 non-XT, and I can't get my games to run decently... Is it my Acer Predator 1080p monitor without FreeSync? I've tried the performance mode, but if it's not on default, I get screen tearing x.x
CPU?
With Blue Sky you can use frame generation on movies, and it's INSANE.
It's very interesting that the generated frames are mostly a mess of pixels, but in motion you won't see this.
so which one is it, FSR 3.1 or RSR?
As someone who plays on a Steam Deck when not on my PC, the idea that I could potentially see BG3 at 60 FPS on it is insane.
Am I missing something? What's the point of using DLSS/FSR to downgrade quality if you can just use 1080p for the FPS boost?
Smart video, thanks. New sub
Can somebody explain to me why there are striped lines from 1:32 to 1:40???
Does anyone know how I can get FSR 3.0/3.1 for Cyberpunk on a 7900 XTX?
Can you share the RTSS Config?
These upscalers aren't just about performance; they also make the image quality in these games more stable. One game I really noticed actually looked bad when you turned upscaling off was MK1.
Will this work on my gtx 1060 3gb?
Dude, I love your videos and explanations. ❤ I like your commentary.
40 gen wasn't much of an uplift over 30 gen nvidia. So they made their software trickery part of the deal. It's understandable. Why shoot yourself in the foot by giving your previous gen extra longevity. That's no way to milk your customers!
Thanks AMD. I'll be sticking it out with my 3080 just a little while longer.
Never got that glitch backpack with FSR on my 7900 XTX, maybe just bad on Nvidia gpus?
I remember finding a ton of awful bugs with DLSS in Spiderman Remastered. They were really bad at night time. I noticed that you had what almost looked like smoke trails from your hands when falling from a web swing and when using suits with lights and moving the camera fast his head or places that light up would multiply across the entire screen.
Why are there several striped lines from 1:33 to 1:38????
I need a proper review of this advertised NAS, looking through all the software capabilities
When will 3.1 come out, and how do you use DLSS/DLAA with AMD frame generation, if anyone knows?
You usually just turn it on the same way/place as dlss if the game supports it. But the game needs to support both DLAA and the new FSR frame generation, or you can use the fsr 3 mods that replaces dlss frame generation. There are videos on it online you can look up. You can also force DLAA if the game supports DLSS with software.
I end up having to turn down the sharpness when using FSR, or turn it off outright, but I default to going without FSR or DLSS
Tried and tested this today on Rebirth Island in Warzone on a GTX 1080; it's sick tbf
Just finished playing Cyberpunk with an FSR 3.0 mod created by Lukefz: from 65-75 fps on high to 120-150 without any additional upscaling, just using the frame gen (RX 5700 XT)
Is the FSR 3 frame gen enjoyable? Any mouse input issues?
@@rizkijunir23 No input delay, just a little bit of ghosting on car bumpers that could be fixed easily
"AMD nailed it from the jump, FSR 3 always looked very good"
Well now I smell something and it ain't good
As a 4090 user and future 5090 buyer..
This is an A-level update with a potential of it being an A+ update.
Buyers from the Big three need this competition to keep going. And you never know, one day Radeon could have a Ryzen moment if NVIDIA falls asleep at the wheel for any length of time.
Keep it up 🎉
Great optimism BUT.... LOL. Nvidia can't possibly fall asleep. They are super woke lol. They'd only lose marketshare if they chose to focus more on other AI applications besides gaming and AMD went down that route. But that IS NOT going to happen, bc Nvidia is not THAT stupid. They could say they just dont care as much about consumer gpu's but that's really silly. If AMD innovates more than Nvidia in the GPU space, it will hurt Nvidias mental marketshare as well, which would interfere with their plans to literally take over the world with accidentally nefarious or subversive AI that exists in everybody's home. I'm only half kidding, of course! 🍻
@@TheModeRed well cool! NVIDIA should focus on doing the right thing. Good for them 👏
@@ehenningsen What is your complete setup, by the way? I have one gaming PC with a 3080, a 5900X, and a Gigabyte 1440p monitor, and another with a 7900 XT, a 5900X, and a 3440x1440 monitor. Hard-fought 5900X purchases during COVID, and I can't even begin to fathom spending so much on a 4090. My semi-famous, multi-talented, frequently traveling family could possibly afford these things; we just aren't THAT into it to make it worth the difference right now. But we may start streaming...
@@TheModeRed nice! My PC is an Intel 13900kf, 96GB 6000Mts ram, 16TB SSD with an MSI Gaming Trio RTX4090.
My monitor is a TV, a 65" QD-OLED Samsung S90C that can game at 144hz
@@ehenningsen Lol, is that a PC or a beast?
So happy you showed the RTX 3060 in the video, because I'm just about to build my first PC with an RTX 3060, and after seeing this video I'm sure that this card will last me a good 4 years
Might come in handy with Dragon's Dogma 2's technical issues.
The game doesn't have fsr. And you can't mod it in either because of their DRM. 😂 Good luck.
Modern gaming sucks
CPU limited game
This was definitely needed if AMD is to keep pace with Nvidia and Intel; good to see. I kinda went on a buying spree of UE5 titles since my RX 7900 GRE arrived, and I seriously hope FSR 3.1 gets backported to a lot of those for better image quality, stuff like Remnant 2, Immortals of Aveum, Lords of the Fallen, etc. I figure Sony will hopefully update their PC ports given the relationship between AMD and Sony.
Vex, it went without saying that the temporal stability images are in 1080p Ultra Performance
The truth about frame gen and Nvidia: the literal "NV Tensor Optical Flow API" has existed since the first-gen RTX cards, and all they did with the 4000 series is dedicate a separate bunch of cores to it so performance would be constant, and lock the algo down so it won't run on other tensor generations; using it should still be 100% possible and feasible on the 3000 series at least.
Don't believe me? Check out the 3 Ninjas film I interpolated to 60 fps years ago using that technique and DLSS 3's "Optical Flow" component with my 2070.