Nvidia RTX 4060 Ti 16GB vs 8GB, Surprising..
- Published 28. 08. 2023
Remember to Subscribe and smash that like button if you enjoy it!
vip-cdkdeals Back to School Sale 30% Coupon code:CC20
▬Windows 10 Pro($15):biitt.ly/SjaKX
▬Windows 11 Pro($21):biitt.ly/H2sdR
▬Windows 10 Home($12):biitt.ly/HDd1O
▬Office 2021($50):biitt.ly/Muliv
▬2019 Standard 2 Core($85):biitt.ly/QHdqY
Amazon Affiliate Links:
Nvidia RTX 3060 amzn.to/3gBfMTm
Nvidia RTX 3070 amzn.to/3GLpFbB
Nvidia RTX 3070 Ti amzn.to/3GGnNkC
AMD 6900 XT amzn.to/3ALaSKf
AMD 6700 XT amzn.to/3V7WSm6
AMD 7900x Ryzen 9 amzn.to/3AKy7US
Check out my articles over at PC World Magazine: www.pcworld.com/author/thiago...
Follow Classical Technology on Social Media:
Facebook: classicaltech
Instagram: / classicaltech
Twitter: / classicaltech3 - Science & Technology
Fix: It seems the Cyberpunk 4060 Ti numbers were too high. DLSS 3 was becoming active even when turned off in the settings; it seems to be a bug. So disregard those numbers: RT Ultra 1440p is a lot lower with a 4060 Ti. I'll have to redo it. Cyberpunk is not a good game to benchmark like this, lol.
The Ratchet and Clank VRAM numbers are correct though, it uses more with the 16GB 4060 Ti and gets bottlenecked with the 8GB one, that was my main point.
I like this kind of presentation, where we see the gameplay with your voice explaining at the same time, so in certain scenes we can see the limitations of 8GB, for example. Thanks! 👍
Glad you like it!
I just bought a new RTX 3060 12GB card for $299 and ratchet and clank rift’s apart runs great on my LG 27 inch 1440p ultra gear monitor.
16GB also has way better 1% lows and high-res textures are being loaded on time.
This should be the only version of the 4060 Ti, and it should cost no more than $400.
Buy a used 3080 for $400 every day of the week. Upgrading to the 12gb 3080 would be about $100 more.
In KSA cost 620$ 🫠
I didn't immediately appreciate the RTX 40 Series until I gave each of their GPUs a chance. It's insane how great this one is!
What happened to the days when the graphics card would use your system's RAM where needed?
Good video, without a doubt. You confirm my suspicion that not even the 12GB of VRAM on the 4070 Ti is enough for the long term. I think I'll buy the RX 7900 XT.
Well, the 4070 Ti can run this game at native 1440p maxed out, but it dips below 60 fps from time to time. You have to use DLSS to stay above 60 all the time. And it also reduces VRAM consumption.
There is a pretty linear correlation between the amount of VRAM and the raw GPU power a game needs: the more VRAM it consumes, the more powerful a GPU it demands. You may need 16-20 GB of VRAM at 1440p only if you are using a lot of mods; otherwise 12GB is fine for 1440p. And the latest DLSS version looks amazing, with less shimmering and a more stable image overall. So now it truly looks better than native TAA!
Any kind of upscaling is only decent if you are playing single-player games, since it adds latency; for competitive games native will always be king. Please stop excusing big corpos' BS: adding the extra VRAM was never a problem for them, because they get it for less than retail, and even if you were to buy the VRAM at retail it would cost like $10 for the extra 8GB, yet they charge people $100.
@SG-tx1fz
He isn't excusing anybody.
You wouldn't notice those things without onscreen stats.
The higher the fps, the lower the latency; isn't that how it works?
Regardless, what's better: 144 fps that via upscaling feels like 180 fps, but has the same latency as 144 fps without upscaling?
So both have the same latency, but one of them feels smoother.
I vote for the one that feels smoother.
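The back-and-forth above can be made concrete with frame times. A minimal Python sketch (the 144/180 fps numbers are just the commenter's example, and the exact latency overhead of frame generation varies per game):

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds spent on one frame at a given frame rate."""
    return 1000.0 / fps

# Natively rendered frames: higher fps means lower frame time,
# and with it lower input latency.
native_144 = frame_time_ms(144)  # ~6.94 ms
native_180 = frame_time_ms(180)  # ~5.56 ms

# Frame generation shows 180 fps on screen, but input is still sampled
# at the underlying 144 fps render rate, so the latency floor stays at
# the 144 fps frame time instead of dropping to the 180 fps one.
generated_180_latency_floor = frame_time_ms(144)

print(native_144, native_180, generated_180_latency_floor)
```

So both paths really can have similar latency while one looks smoother, which is the trade-off the commenter is voting on.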
Shoot, my 3080 Ti with 12GB is struggling with VRAM at this point. Wish it had more, as it has the GPU power, but that VRAM gets filled and I can get some noticeable hitching, even at 1440p.
I found that specifically Ratchet & Clank runs pretty fine on my RTX 4070 Ti. But I am using Win10. I tried Win11 and found that it uses more VRAM, so it limits the amount of VRAM available for games in comparison with Win10. In the same scenario, native 1440p maxed out, this game runs better on Win10 (~10% higher fps and fewer stutters). So it doesn't only depend on how much VRAM your GPU has or how much VRAM a game needs; there are other important factors.
With my 3080 Ti I play at 1080p low settings so I can hit 240fps.
It's called VRAM.
Funny thing is CPUs are leagues ahead of GPUs as far as limiting performance and yet we’re getting new Intel this Fall and no high end GPUs for the foreseeable future.
That's why 1080p/240 is still superior. No VRAM issues with my 12GB 4070 at 1080p/240. I think 1440p is best with a 16GB+ card.
System RAM usage is also exceeding 16GB, on both cards!
It's 2024, guys: 16GB of VRAM and 32GB of system RAM is the new standard!
I paid £420 for the 16gb version from CEX which was £50 more expensive than the 8GB version. So I made the choice to get the 16GB version because I did not think that the 8GB version had enough VRAM. And £50 difference between 8gb and 16gb seemed a good deal. And I am glad that I did.
I would like to add one more detail to your awesome presentation. It's not just the amount of VRAM that limits the gaming experience; the small memory bus in all of the 4000-gen cards (minus the 4090) will have a significant negative impact in the near future.
That's a good point: the relative performance of lower-memory-bandwidth cards decreases as resolution increases, and that is the major reason Nvidia calls this a 1080p card.
It wouldn't surprise me if a 4060TI with a 256 bit bus and 16GB of VRAM frequently came close to a 12GB 4070 at 1440p and beat it in some cases.
It might even be decent at 4k gaming at medium or high settings if you used DLSS quality.
NVidia certainly wouldn't want that to happen.
Great point!
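The bus-width argument in this thread is easy to sanity-check with a back-of-envelope formula: peak bandwidth = (bus width in bits / 8) times the effective data rate. A small Python sketch using the published specs (128-bit/18 Gbps for the 4060 Ti, 192-bit/21 Gbps for the 4070); the 256-bit variant is the hypothetical card the comment wishes for:

```python
def memory_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per pin times effective data rate."""
    return bus_width_bits / 8 * data_rate_gbps

print(memory_bandwidth_gb_s(128, 18))  # RTX 4060 Ti: 288.0 GB/s
print(memory_bandwidth_gb_s(192, 21))  # RTX 4070:    504.0 GB/s
print(memory_bandwidth_gb_s(256, 18))  # hypothetical 256-bit 4060 Ti: 576.0 GB/s
```

(Ada's larger L2 cache offsets some of this in practice, but the gap still grows with resolution, as the thread notes.)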
New video? Woo
CP2077 at 1440p with RT is very scene-dependent; in a few places my old 8GB GPU just couldn't keep up without massive stutters.
So glad I got a great deal on my RTX 3090 24GB and won't have to worry about it at all
The power consumption on those high-end 30-series cards is crazy, but if you got it at a discount it's worth it, I guess.
@@Shockload I undervolt and OC on my RTX3090. It gets 4070ti performance and the same power consumption at full load. But without a VRAM bottleneck. I do 3D rendering and Physics simulations so I really needed that VRAM.
You are wrong... not only because of the 8GB of VRAM, but also because of the memory bandwidth.
3060 Ti 8GB = 40 fps at 2K ultra with RT on; with DLSS 2, 60-70 fps.
You will get 15-20 fps if you raise the resolution to 4K native.
I didn't find any problem with the image quality at 2K.
It would be better if you put the benchmark tests side by side while adding your commentary on top, with your face in the corner or something. This would save a lot of time, or you could fit multiple games into a 10-minute video...
My personal thought: the 4060 series only targets 1080p gaming/monitors; it is only barely 1440p-ready. Both 8GB 4060s are already more than enough for 1080p gaming, so if you're mainly a casual 1080p gamer with a 1080p monitor, there's no reason to spend more on the 4060 Ti 16GB just to play games on a 1080p monitor.
Yes, but if you want to 1440p game now or in the future, it may not be a bad idea to go for the 16gb version of the 4060 ti.
You can get a new RTX 3080 or 6800 XT for the price of the RTX 4060 Ti 16GB, which destroys this card.
I got my 16GB 4060 Ti for $290 before Christmas 2023. It is amazing... and was for a while the preferred card for SDXL generation. I'm very happy with mine, and it doesn't cost much to run it all day long.
Where did you buy it so cheap?
It’s like comparing a turd against a turd with wings. They’re both turds but the 16gb one is the turd with wings and some people like it just because of the wings.
Great summary! And I have to say the gameplay added a lot; I didn't expect to notice any difference, but I really did ^^
Because of the similar memory bus I skipped the 4060 Ti and got a 3060 Ti 8GB for just 300€; it completed the build for my friend very well.
300€ for a 3060 Ti??!? In the States used 3080s are going for around $400 (370€). Horrible value for a 3060 Ti.
@@Felale EU prices are very different; a 3080 is still over 500€ new here.
@@Felale I just checked again: a new 3080 is 630€... the 3060 Ti cost 500€ last year...
And of course used is cheaper; how can you compare used and new prices and call it "bad value" to someone buying new?
...maybe next time turn your brain on before trying to call someone out, in case you've got one.
3060 is almost 4 years old in January lol
What if it is used to run AI?
For example, running Stable Diffusion.
Please make a video on RTX 4060 Ti 16GB vs 8GB running Stable Diffusion ~
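For a rough sense of why the 16GB card matters for Stable Diffusion, you can estimate the VRAM the model weights alone occupy: parameter count times bytes per parameter. A hedged Python sketch (the ~2.6B figure for the SDXL UNet is a commonly reported number, not an official spec; the VAE, text encoders, and activations come on top):

```python
def weights_vram_gb(n_params_billions: float, bytes_per_param: int) -> float:
    """Approximate VRAM occupied by model weights alone."""
    return n_params_billions * 1e9 * bytes_per_param / 1024**3

# SDXL UNet at fp16 (2 bytes per parameter):
print(round(weights_vram_gb(2.6, 2), 2))  # ~4.84 GB before activations
```

With activations, higher resolutions, and batch sizes, real usage easily exceeds 8GB, which is where the 16GB 4060 Ti pulls ahead.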
Is it possible that upcoming games optimized for Unreal Engine 5 will take advantage of 16GB of VRAM?
Anything is possible.
Every benchmark I've seen with the 4070 Super at 1440p gives me confidence it's perfect for 1440p and will still be for several years to come.
And similarly at 4K.
When a problem arises, there is DLSS.
DLSS Performance may not look nearly as good as native 4K, but it does look better than native 1440p.
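The "DLSS Performance vs native 1440p" comparison comes down to internal render resolution. A small Python sketch using the commonly reported per-axis DLSS scale factors (these are widely cited community numbers, not constants pulled from the SDK):

```python
# Commonly cited per-axis DLSS scale factors.
DLSS_SCALES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Resolution the GPU actually renders before DLSS upscales it."""
    s = DLSS_SCALES[mode]
    return round(out_w * s), round(out_h * s)

# 4K output with DLSS Performance renders internally at 1080p...
print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
# ...while DLSS Quality at 1440p renders at roughly 1707x960.
print(internal_resolution(2560, 1440, "Quality"))      # (1707, 960)
```

So 4K DLSS Performance starts from a 1080p image, yet after reconstruction it can beat native 1440p, which is the comparison being made above.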
I guess the leaks of all that VRAM on RTX 5000 aren't overkill after all.
Why what are the leaks saying?
You have to remember that Blackwell is scheduled to launch in 2025.
@@littlebear4543 There's this czcams.com/video/RDjNv3UIfTA/video.htmlsi=w-e_rOVGalRXo4gG&t=193, but while I was searching I found an article on Neowin about a rumored 512-bit 32GB 5090. It would be the smartest thing Nvidia could do, IMHO.
I misremembered! I thought it was double the VRAM at every tier, but it's GDDR7, so I don't know.
@@blegi1245 Some rumors are pointing to 2024, so hopefully it's early 2025 at the latest.
@@BlackJesus8463 According to Nvidia's presentation to partners this summer, the next consumer GPUs will launch in 2025. The only products scheduled for 2024 are for data centers.
The ProArt is a cool-looking card at least.
The ProArt costs 600€; for this money you can get a used Strix 3090.
Beware exploding capacitors. The 3090 is so hot and powerful it's borderline abusive just to run it normally.
@@BlackJesus8463 I've got a be quiet! Dark Base Pro 901 and it's at 70°C when I play at 4K/165Hz. I think it's OK, but thanks for the warning. 😆
The 4060 Ti should have had a full x16 slot with a 256-bit bus; then it would definitely have been a complete GPU for everything.
Nvidia should market the 16 GB 4060 Ti as the "Sweet 16." Pun intended. 😆
I'm keeping an eye on the release of the 40-series SUPER for a 16GB GPU. I will evaluate either the 4070 Ti SUPER or the 4060 Ti 16GB, if they are reasonably priced. Currently I am running a 2070 SUPER, and 8GB of GDDR6 is starting to feel a bit tiny. I am choosing to stick with Nvidia because DLSS is frankly a spectacular technology, so I can't imagine switching to AMD right now.
Wait for flash sales after the SUPER cards launch, or wait until RDNA 4 for massive price cuts and possibly a sub-$500 ~4070 Ti class card. AMD is really behind in upscaling :/ Open source and working on all video cards is epic, but we need them to utilize AI stuff by 2026 at the latest.
@@GeneralS1mba Yeah, I'm waiting for a 400-430€ 40-series 16GB card. In the past months the 4060 Ti 16GB fluctuated from 490 to 570€, which in my opinion is still too high.
If you can net a good-priced used 3090 you are set for all of this gen, and maybe even until RTX 6000. If FSR 3 is good, even more so.
Since the release of the 4090, the 2080 has also run out of VRAM. Just as it will be with the 3090 when the 5090 comes. But until then we've got enough time ;)
@@darmwinder Bruh, it has 24GB. Even 16GB is not yet common. 16GB should become common first, then maybe 20, then 24. You'll get faster performance from a lower-end GPU with less VRAM before that 24GB becomes not enough.
@@yuvanraj2271 Yeah, on Steam the 1660 Super is the king... yes, 16GB should first become standard, or 4K gaming. But with better games the hunger for hardware also increases. Who knows what Nvidia is thinking when they release the 5090... 48GB like the RTX 8000?
@@darmwinder I would be very, and I mean very, surprised if the 5090 had that amount of VRAM. I don't even want to know the price Nvidia would charge if that's the case. I just hope the 5080 performs on par with the 4090 and comes in around $800.
@@littlebear4543 The 50 series will only come in 2025 and will have GDDR7. I assume the 5090 will have 32GB. We'll see; let's talk again in 2 years 😁
The 7800 XT might be the real winner at this price range, if the lower number of CUs is not a problem.
Not only will you need more than 8GB of VRAM to play Ratchet & Clank on PC, but also more than 16GB of system RAM. The 8GB 4060 Ti was averaging around 20GB of system RAM, the 16GB 4060 Ti around 17GB. It seems 16GB of VRAM and 32GB of system RAM will soon be a recommended PC spec for AAA gaming.
Yeah, definitely going to 32gb very soon, if not already there in many games
32 GB has been a sweet spot for a few years already
16GB with only a 128-bit memory bus is not much better; this card should have a 256-bit bus.
Yeah, that is definitely an issue too -but 16gb vs 8gb is at least an improvement
Not bottlenecked by VRAM but bottlenecked by the memory bus 😂
Well, that's Nvidia in a nutshell: you get screwed.
Building a rig and need suggestions for a $400 card. AMD?
@@cg4658 There are no $400 AMD cards.
The 4060 is a "50-class" card.
As usual upscaling saves this card for consumers consideration.
Until low-end GPUs can run 1440p ultra at 300 fps in all titles, these companies should not be pushing upscaling tools as substitutes for proper hardware development. Nvidia has become drunk on giving low VRAM and low bus width because of DLSS. Who needs 16GB when you can just downscale the input to 1080p and upscale the image to 4K? Who needs 8GB when you can downscale the input to 480p? We've got AI magic to save the day.
I'll get the 7800 XT 16GB instead.
Does the RTX 3070 Ti 8GB, with its higher bandwidth, run smoother and get higher 1% lows than the RTX 4060 Ti 16GB at 1440p and 1080p?
Marginally better, but still unplayable at the same settings. 8GB just ain't it.
@@BlackJesus8463 I just want to see whether bandwidth and memory bus width are as important as VRAM for drawing frames faster and at higher quality while gaming.
If you buy anything with 8 gigs of VRAM in 2023, you got ripped off; 8 gigs of VRAM was a thing six years ago. 😂
Nah, the majority of people still use 8GB of VRAM, so developers would be stupid to make games that can't run on that. 8GB will be fine for the next 5 years at least.
The textures in the video appear less consistent with DLSS on. They appear sharp in some instances and then mushy in others. The problem, from my perspective, is that Nvidia's texture compression results in lower-quality textures than AMD's, and DLSS can make them look better than stock, but it's patchy and inconsistent. This reduces the consistency of the image overall. That's what I'm seeing in the scene.
This makes me think my RTX 3070 is being held back hard by VRAM... I like playing at 4K max settings even if I'm getting 30-40 fps...
Thanks for telling me that 8GB VRAM GPUs shouldn't even be considered in 2023 and that Nvidia needs to bring the RTX 4060 and RTX 4060 Ti a tier down to RTX 4050 and RTX 4050 Ti. Going forward I believe 16GB of VRAM on a 256-bit memory bus or better will be the new 8GB. So as a consumer I will look for at least 16GB of VRAM on a 256-bit memory bus.
Considering that an 8GB RAM chip costs a few dollars, the 8GB cards are a pure SCAM.
I’m surprised 8gb cards still exist when you can buy last gen at least 10gb 3080 for like $400.
I’ll never buy a card with the downgraded xx70 and below memory bus either.
A 128-bit bus. Good luck with that.
And that, my son, is what's called planned obsolescence: if Nvidia didn't need the revenue from low-end cards they would just give you what you need.
I have $400 for a video card for my new system. Any suggestions? AMD?
Save another $100 and get the 7800 XT...
The 6750 XT and 6800 XT are good options around $400. The 7800 XT and 6800 XT are going to be about the same in performance, so it comes down to whether you want old-new or new-new. Just F Nvidia with this gen.
A little patience to see how the 7700 XT and 7800 XT perform. If they are like the 7900 XT then they are easily tunable cards and you will get some easy performance increases with both. The 7700 XT will quickly drop toward $400.
Also, whatever Out of the Box shows in benchmarks, you will get more. My scores on my 7900 XT were always 10 fps better than his. That is why he got the 7900 XT so wrong. Ancient Gameplays has much more trustworthy numbers and is good at tuning the cards. He doesn't have the resources, so it can take him some time to get the cards.
Game devs are using the new hardware as a crutch to not optimize their games properly.
Something is totally wrong with this video. At Cyberpunk 1440p RT Ultra with no DLSS, a 4060 Ti might be able to hit 20 fps, but certainly NOT 63 fps.
Yes, I believe DLSS 3 became active without showing it in the settings; I will have to fix that.
Why TF is there a 16GB 4060 Ti when the 4070 only has 12GB... make it make sense, Nvidia.
My 7900 XTX just keeps getting better with time!
should have made it a 10gb card
How do I prevent VRAM usage? Does it regenerate?
nice
game name? 😊
I have an 8GB 4060 laptop and I have had zero issues with 8GB, even with RT turned on.
The RX 7800 XT is coming.
Very surprising, yeah yeah... in 5 months the cooler will be dead.
What's your point?
That the 4060 Ti 16GB would beat the RTX 4070 Ti at 1440p?
If so, don't be ridiculous.
VRAM isn't everything.
Especially when you play with onscreen stats turned OFF.
No onscreen stats and just focus on gameplay instead.
You know, enjoy the f-ing game.
That game needs some work; the 4060 should for sure be enough to run it.
You use this channel as a covert op to give all your money to Nvidia.
lol
and some to AMD and Intel too, bad news for me lol
Nvidia's stock price keeps rising even though it looks like they can't sell gaming GPUs.
first
first
First
Last ✊👍🥳🥳🥳
If it had 128GB you would still have made a mistake buying that piece of overpriced garbage.
Far-fetched. In Ratchet & Clank, those 20-frame and 40-frame figures are nonsense. That is, if you need to play without DLSS and FG at 2K, then you definitely need to get a 4070 or even a 4070 Ti, and the problem is not the memory but the weak chip.
The 4060 Ti is not positioned as a 2K card, but since you ran 2K on the 4060 Ti, you needed to enable DLSS at least on Balanced; this reduces memory consumption and the bottleneck.
Your test does not show in any way that 8GB is not enough; it shows that you are not playing a completely fair game.
Similarly, you could turn on 8K on a 4080 and say that 16GB is not enough.
The last two games on UE5 showed that memory consumption at 1080p is within 7-7.5GB.
If you had written in the title "8GB is not enough for 2K on ultra with RT", then it would be fair.
We get it, you have low standards.
CT likes to stir the pot for viewers
Then don't charge 1440p prices for 1080p cards. When people realize what a con RT is, they might actually get smart.
@@robertmyers6488 The 4060 Ti costs $400, but at the same time it is more powerful than the 6700 XT, and with a frame generator it is comparable to the 6800.
Ray tracing ("RT"), as of 2023, is an industry standard supported by all game engines, DirectX, AMD, Intel and Nvidia, used for rendering lighting, shadows and reflections. To date, it is used in more than 200 games.
Lumen in UE5 has its own software RT; however, even Lumen can work with hardware RT support to get better results.
I think starting in 2024, RT support will be a prerequisite for graphics cards.