The RTX 4060 TI 16 GB IS THE WORST GPU EVER MADE!!
- added 15 Jun 2024
- THE GREAT SACRIFICE (HUB): • GeForce RTX 4060 Ti 16...
The RTX 4060 ti 16 GB just dropped and Nvidia wants to act like it never happened... BUT WE GOTTA CALL THIS OUT!
It ONLY solves the VRAM capacity problem of the 8 GB version and asks another $100 (a $500 GPU!) just to do that, when the RTX 4060 Ti 8 GB was already garbage to begin with. All of this makes the RTX 4060 Ti 16 GB the most embarrassing and (maybe) worst GPU ever made.
JOIN THE DISCORD!
/ discord
My Spotify:
open.spotify.com/artist/3Xulq...
GN: • Do Not Buy: NVIDIA GeF...
/ 1678548233780617218
/ 1681447119935406087
videocardz.com/newz/nvidia-la...
www.nvidia.com/en-us/geforce/...
www.techpowerup.com/gpu-specs...
store.steampowered.com/hwsurv...
wccftech.com/xbox-series-x-sp...
WINDOWS AT A HUGE DISCOUNT!
Windows 10 Pro ($15): biitt.ly/TFR1G
Windows 11 Pro ($21): biitt.ly/U9jCZ
Code: "vex" for 25% off!
0:00- Bracing Impact
1:01- 4060 ti 8 GB RECAP
3:23- THANKS NVIDIA! :))
5:04- $500 straight disrespectful
6:50- 16GB 4060 ti should've been the ONLY MODEL
7:30- RTX 4050 ti in disguise
8:25- RTX 4060 ti 8gb & 16gb HAVE NO EXCUSES
10:09- About the 128 bit memory bus...
13:25- The RTX 4060 ti has no place
15:24- These products are trash and this is trash
My sacrifice wasn't for nothing! Your video Vex made it all worth it, thank you sir.
Also just to be clear, viewers have said in no uncertain terms that the then upcoming 16GB version of the RTX 4060 Ti is a waste because of the 128-bit wide memory bus, as in there won't be enough bandwidth to service the larger capacity.
My man
@@transistorjump919 .. for most of the current titles. That's important to note, because future titles may assume and be designed to utilize larger amounts of VRAM and then the limited bus width may come to bite you hard.
@@transistorjump919 Because of Resizable BAR (Smart Access Memory at AMD, Clever Access Memory at Intel), memory bus bandwidth doesn't matter that much anymore, unless you have a very old motherboard or you are not smart/clever enough to enable it ;-). Nevertheless, the card is much overpriced and might only be worth it for people who use it for AI and make their money back quickly. Maybe that is why NVIDIA set this high price and does no promotion for it: AI needs lots of memory, not that much compute power. This card could kill sales of their higher-priced cards, which sell because Nvidia offers nothing else with enough memory at a decent price (and AMD or Intel cards are harder to use for AI at the moment).
@@transistorjump919 indeed
If one also looks at the numbers and which games are popular and actually being played, 8 GB is plenty
No one is playing Apex Legends or Fortnite or whatever somewhat seriously and running max textures
Either a noob or happy with a disadvantage.
Pretty games are distracting games.
Most games being played are multiplayer games
It's almost as if this loud, vocal tech minority is just whinging for the sake of whinging
They don’t vote with their wallets
They’re full of 💩
I like how NVIDIA keeps persistently disappointing their customers.
And they still do not give a fuck about gamers
I think they don't really care about gamers; they are more concerned with their AI business
yeah, Nvidia has been disappointing me: the 3060 12 GB is only 192-bit, and now the 4060 Ti is down to 128-bit, the same as the stinky 1650
@@collt3035 yeah, also the drivers. My friend has an Nvidia video card; he gets OK frame rates, and then it snaps down and stays there... like what the hell
?
That's what consistent means 😂.
NVIDIA has been lowering the ram on all their cards since the 30 series in order to force you to upgrade sooner.
It's not stupidity, it's deliberate obsolescence.
30 series?? 💀Damn you must not remember the 970 or many other 70 / 80 class cards back before that.
@@transistorjump919 No, you idiot. They used a scam to give people less than the 4 GB the card was supposed to have. The cards were sold as 4 GB, but only 3.5 GB ran at full speed.
3.5, the issue was with the last 0.5
@@transistorjump919 Still successfully got sued, so it doesn't count. You could technically argue AMD FX had 8 cores, but the state of California says otherwise.
I don't know if you need a lot of VRAM when using frame generation. Is frame generation VRAM-dependent, or does it lessen the importance of the VRAM amount?
If AMD grows a brain and launches a 192-bit, 12 GB card for the same $330 MSRP as the 12 GB RTX 3060, then Nvidia will be in some deep shit.
AMD Probably gonna miss their shot at this
YEH, Who’s gonna buy it ? who ? Y’all talk but y’all don’t walk
“Vote with your wallets”
- Checks the Hardware Surveys
Well, that was a Lie
@@SupraBagels this is the result of AMDs lack of competition for F^ck sake
Nvidia owns the market
Why would they be incentivised to border on monopoly grounds which the community would then complain about ? Like why do they NEED to do anything
They aren’t a public service
If you don’t like it go buy else where
What’s that, hardware surveys says the community is full of 💩
AMD already missed it with the RX 7600, which was released massively overpriced.
You will buy a 7600 and you will be happy....
at 500 dollars you can buy an RX 6800XT which is nearly 30 percent faster
The TITAN RTX ($2,499) was 2.5 times as expensive as the RTX 2080 Ti ($999).
@@yuan.pingchen3056 That doesn't make it a good argument to buy a 4060 for $500.
The 6800 XT is the best card at this time. Got mine a week ago!
@@3dge433 The title of the video uses the emotional word 'worst', but it's clear the 4060 Ti is not the worst; the TITAN RTX is the worst.
@@yuan.pingchen3056 The Titan RTX is not for gaming, you gonorrhea-stained pants.
Nvidia's next marketing move is that the 4060 Ti is meant to be used in SLI mode.
🤣 please not.
hahahahaha dead SLI 🤣🤣🤣🤣🤣🤣😂😂😂😂
Triple sli mode so you get 48gb on 128 bits 😅
@@NGreedia lol, imagine someone spending two thousand dollars on four useless cards. 😋
NVIDIA will release an RTX 4060 Ti 32 GB in December with a 2% fps boost HAHA
I love how amd and intel don't have to do anything, nvidia is doing the marketing for them.
AMD just has to actually release something for once (at more reasonable prices), been waiting for their budget offerings for way too long.
@@X_irtz Nah, minus the RX 6700 XT, Intel is your only choice; otherwise AMD and NVIDIA are forcing you to buy the high-end stuff.
nobody cares about AMD
@@saricubra2867 They don't. Well, technically yes if you want 4K Ultra 2023 gaming, which some consider pure luxury. Most developers will not fine tune for every GPU architecture. Right now I like 4060 Ti FE best because of its efficiency.
@@mytestbrandkonto3040 4060Ti FE is worse than a 3070 mobile laptop GPU 🤦♂️
I thought the mining community lived in their own reality, selling heavily used cards at ridiculous prices, but it's actually Nvidia, getting high on their bottled-up mining-boom farts, who lost touch with time and reality.
I got an ex mining RX 6800 for 400 euros last december and was lucky the guy was active on forums here so I could see exactly how long he mined with it and how seriously.
I got an awesome deal so miners are better than Nvidia in my case. :)
Actually, the same can be said about consumers.
@@how2pick4name AMD and Nvidia were not amused that their GPUs got scalped. There seems to be an empty gap between what miners and gamers are willing to spend, despite the improvements made this generation.
Got an ex-mining 6800 XT Sapphire Nitro+ OC for £330 and a 3070 for £230
The 3070 was in almost perfect condition
Mining always gives the best deals after causing prices to skyrocket
Got an RX 5700 XT for $124 from a miner; you can't get a cheaper GPU that can run anything at 1080p atm.
Nvidia should've unlaunched this too
They should’ve unlaunched the 8 GB and dropped the 16 GB price by at least $150. Or more realistically, just unlaunch the whole generation.
Nope. 4080 12GB was fine. Also the 16GB 4060 Ti seems to run very cool which is nice. So either a 4060 Ti FE it is or some decent partner model 16 GB version. Just ignore the upsell stuff if you think this is not the last GPU generation and you are not yet sure whether or not this will be your last graphics card. Still waiting for more power consumption charts (idle, video playback, multi-monitor).
It’s fine at the end of the month Nvidia will say
“Jk”
No, it's just an extremely expensive way to solve the 8 GB VRAM problem (= new games ported from consoles).
Even if the last 4 GB is never really used, the old RTX 3060 was just more cost-effective.
@@mytestbrandkonto3040 Nvidia fanboy 💀
Nvidia made a 4050 Ti with more VRAM than 4070 Tis that cost more than double the price, when this card doesn't have enough power to even push settings that use a full 16 GB, and said: “🗣️🔥This is some heat! Drop it in July!”
@@transistorjump919 Nvidia themselves market this card as a 1080p GPU. Sure games push over 8 GB, but it’s not hitting 16 at 1080p. Plus, 128-bit bus bottlenecks it so hard that frame rates gaming at native 1440p at high/ultra settings are barely maintaining 60fps.
@@transistorjump919 which Nvidia themselves cut down from the 3060 Ti. Stop coping fan boy.
@@transistorjump919 he listens to the mainstream Tech tubers and treats their word as Gospel
Most of the community is full of entitled Sheep
@@sooubic lmfao, if you call transistorjump an Nvidia fanboy then you're clearly clueless about the usual people in this comment section; your blind hate is showing too much. While I agree the 4060 Ti 8 GB and 16 GB are bad, at least I'm not as much of an AMD fanboy as you are
@@sooubic “This guy disagrees with my opinion! Time to break out the personal insults and disrespectful behavior!”
Grow the hell up.
GREAT RTX 4050Ti 16GB
Update 1-10-24 - Today, the 4060ti 16GB priced at $450 is the best bang for the buck for Generative AI purposes that strictly require an Nvidia GPU, second closest is an additional $500. I didn't buy them for gaming so not concerned about that usage.
THIS. Yes, this card is not great for gaming, but it has enough VRAM for SDXL and local LLMs. 128-bit is slower, but for AI, a model running 30% slower out of VRAM is MUCH better than not running the model at all, or having to run it from system RAM.
This
I remember hearing someone say, "There's no such thing as a bad video card, only bad prices."
Nvidia and AMD's only excuse this generation is GREED. Let their GPU's rot on the shelves.
Regardless of radeon 7000's disappointment, they are a better buy than Ngreedia, not to mention radeon 6000 series.
AMD could have been the good guy, but the 7600 was a disappointment
@@SupraBagels yep, had they given it 10 gigs of vram and it would have destroyed the market at $249.
@@pranavmohite8544 And yet nobody buys AMD
@@GewelReal sheep mentality is strong on that one.
I personally think the 4060 Ti 16 GB card was created not for gamers... but for those trying to get into AI. A lot of AI-related things require more than 8 GB of VRAM and perform best on Nvidia cards. I kinda wish Nvidia had marketed this card for AI acceleration instead of for gamers. It might have gone over a little better... maybe... maybe not
@@transistorjump919 What if you're on a budget?
@@transistorjump919 An entry level card for AI matters. Education, AI drawing, etc. A lot of applications. Having enough ram to run a large model is more important than the processing speed, if you have to pick one.
bruh u can get arc a770 with 16gigs and bigger bus for around half the price
@@kriss4882 Yes, but how well would it perform in Stable Diffusion right now in comparison to an Nvidia card.
Trust me, 16 GB is not enough for AI at all
I remember buying a 980ti day one for £549. A flagship GPU..... It was such a good deal and lasted me years.
28nm wafer was like 1/5 the price of 5nm
@@GewelReal Both were the smallest transistor sizes possible at the time, so not really
To be fair, £549 in 2013 equates to £730 in 2023 money, adjusted for inflation.
@@GewelReal I never argued that point; I was arguing how ridiculously poor value this GPU is. I have a 4080, but that's because I had the money and my 2070 died. The 4080 is still poor value despite being an incredible card. Its power usage is INSANE, like 4K games at 230 watts.
@@WSS_the_OG Exactly my point. Hell, even the 3000 series was actually very good value if you look at their retail prices. The 3080 was £699 in the UK, which I thought was a bit much at the time, but looking back now it was an insanely good deal if you got one.
Why not just make the 4060 192-bit/12 GB and the 4070 256-bit/16 GB? How much more expensive is the wider bus? The GPUs have increased in price anyway...
You can buy a 192-bit-bus Nvidia GPU; the name is 4070. You can also buy a 256-bit Nvidia GPU; the name is 4080...
You are barking up the wrong tree. You can say that GPUs are expensive, and that I agree with wholeheartedly! But asking why Nvidia does not make 256-bit GPUs, when they do make 256-bit GPUs, is not valid. Asking why prices are bad... is, IMHO
@@haukikannel But the 3070 was 256. So why not make the 4070 256 as well so it can fit 16GB?
What difference would bus width make?
The bandwidth of the 4060 Ti is 288 GB/s (not including the cache); a 128-bit bus at 18 Gbps works out to 128 × 18 / 8 = 288. That's a massive amount of data per second. Even a 4K scene does not require that much data per second to render. Bus width is NOT A LIMITING FACTOR when it comes to games. It can cause a bottleneck, but only if the game is heavily data-streamed, maybe a flight-sim kind of game at 4K with everything on ultra. There, bus width may be a limiting factor.
Most games are rendered and processed on the fly, so they rely heavily on GPU processing at higher resolutions. Bus width does not bottleneck games. VRAM can, because rendering a scene at 4K sometimes requires over 12 GB in memory, but bus width doesn't affect squat in 99.99% of games. The adage that more bus width = more performance is from the stone ages.
Also, the 4060 Ti is NOT A 4K CARD. 128-bit is way more than enough for 1440p; no flight sim streams anywhere near that much data at that resolution. There is NO game on the mainstream market that will ever be bottlenecked by 128-bit at 1440p or below, so bus width isn't the problem with the 4060 Ti.
The MASSIVE reduction in processing units is the main problem with the 4060 Ti. Games that utilize many GPU cores well will show negligible uplift over the 3060 Ti, or may even perform worse. Bus width has almost zero effect at 1440p or lower resolutions, so stop whining about wider bus widths.
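The arithmetic in the comment above is easy to check: peak bandwidth is bus width times per-pin data rate, divided by 8 bits per byte. A minimal sketch (the 18 Gbps and 14 Gbps figures are the published GDDR6 data rates for these two cards; treat the comparison as illustrative):

```python
# Peak memory bandwidth = bus width (bits) x per-pin data rate (Gbit/s) / 8.
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak bandwidth in GB/s."""
    return bus_width_bits * data_rate_gbps / 8

rtx_4060_ti = peak_bandwidth_gb_s(128, 18.0)  # 128-bit bus, 18 Gbps GDDR6
rtx_3060_ti = peak_bandwidth_gb_s(256, 14.0)  # 256-bit bus, 14 Gbps GDDR6

print(f"4060 Ti: {rtx_4060_ti:.0f} GB/s")  # 288 GB/s, matching the comment
print(f"3060 Ti: {rtx_3060_ti:.0f} GB/s")  # 448 GB/s on the older, wider card
```

So the 288 GB/s figure checks out, and the raw number really is lower than the 3060 Ti's; the open question is how much the larger L2 cache compensates.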
@@jal.ajeera So you can fit more vram. I am mostly talking about the 4070/Ti.
@@ioannispoulakas8730 A 192-bit 4060 with 12 GB doesn't make sense; 128-bit with 16 GB is perfect. They should have increased, or at least matched, the processing units. It's literally barebones CUDA, texture-mapping, and render-output units. The only reason there are any performance gains at all is the gen-on-gen architectural improvement per processing unit.
What Nvidia should do next:
RTX 5050: 10 GB
RTX 5060: 12 GB
RTX 5070: 16 GB
RTX 5080: 20 GB
RTX 5090: 24 GB
The RTX 5090 needs 32 GB of VRAM
They won't. The 4050 has 6 GB on a 96-bit bus. They never listen to users.
They gonna milk this and release 4080 ti 20GB a year later.
Hopefully Nvidia won't mess up the naming scheme of their own products again. I guess the intern dropped the name shields and mixed them up or something.
@@__-fi6xg nah, they knew what they were doing. It just backfired.
The 2000 series will no longer be the worst generation; I can finally enjoy my 2070 Super
You are clearly young and do not remember burning Fermi, the self-desoldering and overpriced 8th generation, or the infamous FX 5th gen.
Got an RX 6800 XT for just over $450 on Prime Day as the centerpiece of my first build. Never even considered Nvidia due to such poor price-to-performance.
I’d much rather use a 40 series card at 100-200w than a 30 series card at 240,000,000w
Idk why, but I found this channel and it's now my favorite PC channel. Good videos!
🫡
Yeah, Vex is fun.
Hi Vex,
I love the content and I want to thank you for your help regarding which GPU I should buy. I went ahead with the Asus Dual RX 6750 XT. I upgraded from a GTX 1060 3 GB and I am amazed by the performance of this card! All settings Ultra at 1080p and I have been getting more than 90 fps on average; I couldn't be happier. However, I am getting a bit of coil whine. I have researched and found it might happen due to a weak PSU. I have a Cooler Master MWE v2 650 W 80+ Bronze PSU. I thought it would be enough, and I don't encounter any spikes or stutters in my games or daily use. I'm not sure why it might be happening. Could you provide some info on this too? Thanks again
You'd think that 16 GB would be more suitable for the 3070 or even the 4070, but nope! Let's put it on a card that won't utilize it that much!
Maybe next year you get what you want. 😉
This is what people cried about, being sheep following Hardware Unboxed. The RTX 4070 was probably already set in stone when people made the insane number of videos crying about VRAM, so Nvidia put 16 GB on a card to shut people up. I'd agree it's pointless on the 4060 & 4060 Ti; 12 would be enough. Those cards aren't strong enough to utilize that much VRAM, yet if Hardware Unboxed says they can, ALL OF A SUDDEN PEOPLE BELIEVE THEY CAN... ALL SHEEP FOLLOWING youtubers
The difference in VRAM-limited situations between the 8 GB and 16 GB 4060 Ti is enormous (no stutters and up to 4x the fps in, for example, Callisto Protocol with RT), so the extra $100 is well worth it. But at a $500 price point, something like the 6800 XT makes more sense if you don't care about power draw.
There is approximately 13 GB of RAM available to developers on the PS5, BUT they can't allocate all of that to the GPU alone. Realistically, around 8-10 GB ends up used as VRAM on the PS5 (the TLOU1 remake was using 10 GB of VRAM for sure). In my opinion, GPUs with 12 GB of memory will be sufficient for a long time for 1080p gaming. The most VRAM-demanding games will require some tweaking at higher resolutions, for sure. For example, R&C Rift Apart can easily fill 12 GB of VRAM at 4K; I saw 3080 Ti gameplay at 4K DLSSQ where fps went from 25 to 100 once the game was no longer using over 12 GB of VRAM.
This comment helped me more than the video
I just got a 4060ti 16GB for $330 for Black Friday. Very impressed with the performance
Yep, I paid about the same for mine. The thing rocks. Very impressive.
Hey, so I've been thinking: if the 4060 costs 300 for 8 GB of VRAM and whatnot, what if I get two? Then it'll be 16 GB of VRAM and the latest Nvidia technologies while not spending more than 600 or 650. Any thoughts or information is appreciated
I think the move to lower VRAM is due to NGreedier wanting to kill the amateur AI creators out there and move them to more expensive professional options, just like how you can download a productivity GPU driver set from NGreedier, or game-ready drivers. If performance there is significantly limited by its frame buffer, that will be the proof.
Good point about wider buses scaling better to 1440p; I think that's likely correct for a significant number of titles. Trouble is, this is not a GPU worth putting a 256-bit bus on, or even 192-bit. Adding bus width costs in many ways: space on the GPU, traces on the board, it all adds up. Who knows what the BOM on this was, but the 4N node and the die space for the massive cache were a big part. Cache genuinely does mitigate slower memory, and because of that, faster memory then makes little difference to performance. You can see the same in the CPU space: put an X3D CPU on your board and suddenly RAM speed adds much less to performance than it did before.
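The cache point above can be sketched with a toy hit-rate model: if a fraction of memory traffic is served from a fast on-die cache, DRAM bandwidth matters less as the hit rate rises. All numbers here are made up for illustration (real hit rates vary per game and resolution):

```python
# Toy model: effective bandwidth when some accesses hit a fast on-die cache.
# Time to move 1 GB = hit_rate / cache_bw + (1 - hit_rate) / dram_bw.
def effective_bandwidth(hit_rate: float, cache_gb_s: float, dram_gb_s: float) -> float:
    time_per_gb = hit_rate / cache_gb_s + (1 - hit_rate) / dram_gb_s
    return 1 / time_per_gb

CACHE = 2000.0  # assumed on-die cache bandwidth in GB/s (illustrative only)

for hit in (0.3, 0.6, 0.9):
    narrow = effective_bandwidth(hit, CACHE, 288.0)  # 128-bit-class DRAM
    wide = effective_bandwidth(hit, CACHE, 448.0)    # 256-bit-class DRAM
    # The higher the hit rate, the smaller the advantage of the wider bus.
    print(f"hit rate {hit:.0%}: wider bus is {wide / narrow:.2f}x faster")
```

The gap from a wider bus shrinks as the hit rate climbs, which is the claimed effect; whether a given game's working set actually hits that often is the empirical question.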
Hey, if I have $200-$300 to spend, which is the best GPU to buy today for gaming and normal web browsing? I'm so confused about my PC build.
Man, the prices here in Australia are even crazier. I was looking at the 4060 Ti 16 GB and found one for $910 AUD, then I noticed I can get a 4070 for $950 AUD from the same store. I don't understand how that is justified or how the pricing works here, but it's ridiculous
For machine learning, Nvidia has a coding monopoly. All AI is coded for Cuda. No one takes AMD seriously. Nvidia's cards don't change much, generation to generation, architecturally. They're easier to code for. But the prices are absolutely ridiculous. For me, the sweet spot for the last 2 years has been the older Titan RTX. Can work in a pair for 48GB VRAM, potentially saving thousands over an A6000.
This goes for 3D modelling as well. Not sure about content creation such as VFX and video processing, but yeah, no one likes modelling on an AMD card; geez, though, the prices at Nvidia are outrageous
This should cost $300, be named RTX 4060, and have 10/12 GB, not 16 GB.
The other issue is unsold stock from the previous generation (3000 series and amd's 6000 series cards). Both companies overproduced for the crypto mining boom and when it crashed had a lot of unsold cards. They deliberately gimped their new generation in order to make the last generation cards still attractive so people would buy them.
Nope, the problem is that the assembly lines switched to the new architecture, and miners and gamers are now selling their 30x0 cards for half the price on eBay, really hurting the low- and mid-tier 40xx cards. What comes after a big boom? A crash!
The 16 GB version wasn't made just for gamers lol...
The extra VRAM is also for people who do 3D work or any content creation: rendering, video editing, etc. Duh. And you can still enjoy games on ultra at 1080p
Well hey, if you're a data science major wishing to build a 'budget' PC with decent hardware for deep learning and AI workloads, that 16 GB RTX 4060 Ti is a steal! Before, our only option was the 12 GB RTX 3060, which was an okay card, but 16 GB will surely open up possibilities for bigger and more complex models, with enough CUDA cores.
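A back-of-the-envelope check of the AI angle above: a model's weights alone need roughly parameter count times bytes per parameter of VRAM, before activations and KV cache. A minimal sketch (model sizes and precisions are illustrative, not a statement about any specific library or model):

```python
# Rough VRAM needed just to hold model weights, by numeric precision.
# Real inference needs extra headroom for activations and KV cache.
BYTES_PER_PARAM = {"fp32": 4.0, "fp16": 2.0, "int8": 1.0, "int4": 0.5}

def weights_vram_gb(params_billion: float, precision: str) -> float:
    """Approximate GB of VRAM for the weights alone (1e9 params x bytes each)."""
    return params_billion * BYTES_PER_PARAM[precision]

for size_b in (7, 13):
    for prec in ("fp16", "int4"):
        need = weights_vram_gb(size_b, prec)
        verdict = "fits" if need <= 16 else "does not fit"
        print(f"{size_b}B @ {prec}: ~{need:.1f} GB -> {verdict} in 16 GB")
```

On this rough math, a 7B-parameter model in fp16 (~14 GB) squeezes into 16 GB where it could not into 8 GB, which is exactly the use case these comments describe.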
The 4070 in comparison isn't so bad after all. The cheapest 4070 have dropped to 579. And the cheapest 4060 Ti 16GB is 549...
How can anyone justify spending this much for a low-tier GPU?
Nobody can, cause the 4060 Ti is essentially a 4050 Ti. At most this should cost $250.
The 4070 is just meh in general, but the 4060 Ti makes it look good. It's obvious Nvidia wants to clear the shelves of 3000-series cards. Either way, the lack of 4000-series sales and the consistent 3000-series sales are both in their favor
@@alrizo1115 The 4070 isn't meh compared to the 4060 Ti and other lower-tier cards. The 4070 and up are the only good parts of the 40 series... I own a 4070 and love it. Lower power consumption and faster than a 1080 Ti.
@@Matruchus Ampere and Ada are different architectures; Nvidia made them that way this generation, and they get to name their GPUs whatever they want. If you're just gaming and don't care much about frame-generation/ray-tracing performance, you can get 30-series or AMD GPUs at good deals. 😊
They can justify spending this much because they have humor.
I just came here to say I love the thumbnail and I love how you ended the video length at 16:16, lmaooooooooo
Help. I am upgrading from a 1070ti, and want to have a modern GPU with similar TDP (180), to avoid the need to upgrade the PSU. Everything points to the 4060ti 16GB with 165TDP being a good match... Does a better card exist?
At that price, you could get a new psu($100 ish), and a 6700xt/6750xt/7700xt. It won’t be as good, but it’s the matter of percentages, and it will be more power efficient.
Go check the tier list (Google it, click the cultist network link (can't link it, since YouTube will delete this), and go for a >C tier). A good PSU is the Thermaltake TT Premium.
If they only fixed the memory bus and the price...
If only Nvidia could know that 🧐
@@TedBoltcutter It's baffling to me how everyone just keeps on harping on one of the least significant issues with the 60 series.
Fewer processing units, less VRAM, and absurd pricing are the issues.
Nvidia fixed the VRAM, but pushed the already absurd pricing beyond comprehension.
You missed the part where you explain that the 4060 Ti is great for people coming from a 10-series card or WAY older systems. But I guess your target audience is people who waste money on graphics cards every time they come out
If you have an old system, you likely have pcie gen 3. Which makes it worse. Also, you could save $50 and get the exact same performance(or more if pcie gen3) with a 3060ti or 6700xt. Which at this price range is a lot of money.
The price to performance is the same whether or not you upgrade from a 3060 or 1060 lol. It's still objectively garbage and downright insulting to even offer such a terrible product. At this price range, you have way better options. And the 16 gigs of vram is just a mockery from nvidia at this point. They created the vram problem, and now they are selling the solution but with slower memory. What a joke.
Preach. I came from a 960 and the 4060ti is great in my energy efficient 5.5 liter build. I can run whatever I want at 1080p
What Steve at HUB was saying was that the 128 bit bus doesn't make things worse to have 16gb vs 8gb. He's saying that even a 64 bit bus would work the same with 8gb or 16gb ram. He wasn't saying in the statement that a 128 bit bus was fine.
Guess we'll see how it goes for me - I'm coming from a 1660 Ti and wanted a more modern upgrade with 16 GB while going as low as possible price-wise, so timing it with Cyber Monday meant I could nab a 4060 Ti 16 GB for $450. Even if it's subpar compared to others, the alternatives cost more than I wanted to spend or didn't have 16 GB. In my niche situation of just wanting *any* valid upgrade that satisfies that, I might be in the target audience for this (whatever that is)
You missed virtual reality. VRAM also matters for VR because you are using 2 individual displays on top of whatever you have going on with your desktop displays. So, if they priced it properly, it would have assisted some VR users
I guess Nvidia made it pretty clear: the 16 GB 4050 Ti is not for gaming (at least not primarily) but for AI work. A large part of the gaming community just could not accept that they are no longer the VIPs to be served (or to have their "feelings" considered). Nvidia is obviously aware of the mass of decommissioned mining cards in the used market; if they had to compete on price with those used mining cards, they wouldn't see a profit, so they decided to prioritize the AI demand for which 40-series exclusive features are a necessity. A rational business decision, IMHO. Of course it's unpopular with the gaming community, but Nvidia knows the company's "reputation" means nearly nothing to this market segment; people will still swallow the 50/60-series pill once the stockpile of 30-series mining cards eventually dries up.
At the end of the day, even the "picky" folks will buy the new cards when Nvidia releases some decent products again, so why should Nvidia care about your feelings? This was already proven in the mining era: we saw no boycott when the 4090 was released, even though gamers were mistreated in the past years. Gamers won't punish AMD/Nvidia for their past actions, so why should they care?
Would buying MSI's GDDR6X version of 3060 Ti be actually better for the average 1080p gamer?
I don't have a 4K monitor, so I don't need stuff like upscaling or frame generation
I don't train AI models on it for a living, but I want to be able to
I do plan on dabbling in some video making/editing, so there is a slight edge in buying Nvidia for me
Featured you and this video in my new video releasing tomorrow. Good work!
Happy with my 6800 xt, just got it for $325 second hand.
That’s huge. Pop off
This is the biggest comedy that i have seen for a while.
Can’t wait for Nvidia to say
“Jk”
How many Plex transcodes will it do with 16gb?
Nice content bro, your channel will grow a lot. Keep up the good work.
Nvidia has their shady business practices, and yet people have the nerve to call out AMD because they "think" DLSS is being blocked from the biggest game launching in 2023
Yup, that is what Tensor Cores are made for. DLSS is great if most developers will not fine tune for different GPU architectures.
Yet they have their own proprietary stuff like DLSS and CUDA :-DDD kinda ironic to cry for being left out.
@@Keullo-eFIN In the future AMD might become stronger at executing all sorts of stuff in their cores, but for now DLSS looks brilliant.
Both have actually been doing shady things for a very long time now.
@@arenzricodexd4409 Yeah, so while some of the things they do I can tolerate, the tendency is to keep a GPU for longer if quality is good enough. If not you can switch to integrated GPUs, which I did for 10 years or become an expert in repairing GPUs. Options are limited. They will accept a shrinking PC market because gamers' expectations are too high or they will fix things and everyone is happy without any exception.
To me it looks like Nvidia is focusing on renaming stuff; as I remember, the 4070 Ti was a 4080 with less VRAM.
In 2011 Nvidia CEO "Jensen" said to ignore customers if you wanna stay in business. I wonder why they did listen when people complained about number of CUDA cores, less bus width and a smaller chip. It is efficient and offers a better price/performance ratio in most use cases. For some content creation it is even faster because of higher clock speeds plus less percentage is disabled compared to the bigger AD103 chip.
Remember, the 4080 12 GB didn't just have less VRAM; it also had fewer cores and a narrower memory bus.
@@PURENT Sure thing, but it is faster at 1080p than a 3090 Ti on average, +13% faster in Forza Horizon. So I liked it a lot from the start, but not everyone did.
@@mytestbrandkonto3040 That's just cope. They tried selling an inferior card using the same branding as the superior one, under the guise that all you were losing was 4gb VRAM.
@@PURENT The fact that it uses a different chip was not hard to find in the data sheet, so I don't think Nvidia tried to hide it. I would have found it better if they had just stuck with their 4080 12 GB, price discussion aside. The RTX 2070 was a TU106 die and the GTX 680 a GK104, so I had no problem with an 80-class card using a mid-size "4" chip. They are more efficient under light-to-medium loads while delivering more than enough performance for my needs.
People should not claim they were not intelligent enough to find the spec differences between the RTX 4080 with 12 or 16 GB. Sounds like trash talk to me from people who were not interested in buying the product anyway. I think Nvidia made a mistake there by listening. I saw a huge performance difference between the two versions and was immediately intrigued by how freaking fast the small version is compared to last-generation graphics cards. But what else to expect if I could not buy the Ampere GPU I wanted? It is not cope, it's affection.
Great video. Love your work, dude.
Coming from a 1080p gamer here: we still exist. I bought an RX 7600 OC and it handles 1080p no problem, and in most cases it handles 1440p pretty well too. Not everyone is switching to 1440p YET.
I bought an Nvidia 4060 Ti with 16GB, and it's a very good card; I am fully satisfied with its performance! It is miles ahead of my previous GeForce 970, with the extra benefit of lower power consumption. My friend still owns his 3060 Ti, and my 4060 Ti outperforms it in every game we tested. The hate towards this card comes only from its price; otherwise the performance is fully justified.
Every card has good performance; not every card has good PRICE TO PERFORMANCE. You could have bought a 7700 XT or a 7800 XT for roughly the same price!
The ideal config for these cards, at around the $400 US price point, would have been a 12GB version with a 192-bit-wide memory bus. That's a card people (besides Nvidia fanboys) could not only defend credibly, but actually feel they're getting their money's worth from. Nvidia could put 32GB of VRAM on this silicon and it would still suck due to that 128-bit-wide memory bus.
It also makes me wonder what Nvidia thinks is the right use case for the 4050 series; 720p? Or maybe 480p? Progress is dead, long live progress.
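The 12GB/192-bit pairing suggested above isn't arbitrary: GDDR6 chips are 32-bit devices, so bus width fixes how many chips (and therefore which capacities) are even possible. A rough sketch, assuming the common 2023 chip densities of 1 GB and 2 GB, with clamshell mode optionally doubling the chips per channel:

```python
# Rough sketch: feasible GDDR6 VRAM capacities for a given bus width.
# Assumes 32-bit-wide chips in 1 GB or 2 GB densities (the common
# GDDR6 options in 2023); "clamshell" puts 2 chips on one channel.

def capacities(bus_width_bits):
    channels = bus_width_bits // 32          # one 32-bit channel per chip
    options = set()
    for density_gb in (1, 2):                # common chip densities
        for chips_per_channel in (1, 2):     # 2 = clamshell mode
            options.add(channels * density_gb * chips_per_channel)
    return sorted(options)

for bus in (128, 192, 256):
    print(f"{bus}-bit bus -> {capacities(bus)} GB possible")
# 128-bit -> [4, 8, 16]; 192-bit -> [6, 12, 24]; 256-bit -> [8, 16, 32]
```

This is why the 128-bit 4060 Ti could only jump from 8 GB straight to 16 GB; a 12 GB card would have required the 192-bit bus the comment asks for.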
Actually, now that the AI boom is here, you may see some new manufacturers jump into the picture. Governments everywhere are pushing for semiconductor manufacturing, so in a few years even that may help. Really thankful for the US-China cold war; it will really push the elites to actually progress rather than simply tread water.
The "Clampshell" way just means that you have 2 32Bit Chips on ONE 32Bit Bus - And that means they work like a 2 Drives RAID-0 SATA Connection with a bigger cache - Means for small texture files it is okay but for big texture files that even exceed the cache size you will lose a lot of transfer speed compared to a bus that allows all of the texture being transfered with 2x32bit. So that makes it clear that you will suffer as soon as you go hi res with high texture settings. Will it be better than 8GB on 128bit combined thoug - For sure, because the cache is faster than using the RAM to cache those files. For 32GB and 128bit you would have 4 people that need to use a single toilet at the same time...
To be fair, NVIDIA is jacking up prices because they know they will get what they're asking, whether it's now or eventually, once elevated GPU prices become the norm down the road. The real solution is a competent rival pushing prices down, which seems to be hit or miss lately.
Nvidia used to pull this crap before; remember the GT 640 with 4GB of VRAM? That thing was a potato, and that amount of VRAM was completely useless on it.
Another banger 720p card woooooo
😂
16gb will age amazingly.
Really enjoying your cat having a great time in the background
Fr
I think they both have their place. If all you are looking for is rasterization, go AMD; that's their primary focus. You don't have to pay for tensor cores or RT cores, just basic rasterization. Personally I think consoles are great for that, and I have a PS5 myself.

If you want more than rasterization, and are willing to pay for tensor cores for DLSS/DLAA plus frame generation (which is spectacular), or need CUDA cores for apps, go with Nvidia. Nvidia has the extras to artificially double your frame rate, and it's actually quite good. I was playing path-traced Cyberpunk the other day and decided to shut it off to see what the game looked like with default lighting in a busy area, and that's when I saw how big the difference actually was; it's one of the few games where I keep DLSS + frame gen on rather than native 4K. Not even a top-of-the-line $1000 AMD card will get you there, and once UE5 sees wider adoption with its feature set, we will find out how well AMD really does with a heavy focus on rasterization alone.

Not everyone wants to pay extra for tensor/RT/CUDA cores, and that's understandable; AMD not having them means you pay for rasterization only, and there's a market for basic gaming. Nvidia with the extras is always going to cost more, and until they convince people their extras are worth it (Steam stats seem to show they have), you have the option to go with the minority player instead. There's a GPU for everyone here, and luckily there are two companies targeting different areas of the market. Maybe they should have eliminated tensor and RT cores altogether on the 4060 and put that silicon into rasterization instead, but they are heavily focused on RT/DLSS at this point, and now it's a bundle package. If it isn't for you, it isn't for you.
It's sad because Nvidia is now a trillion-dollar company and doesn't care if we buy the 40 series or not, and will probably continue to disappoint.
I have to respectfully disagree.
Care to elaborate
If I remember correctly, and I do
The TITAN V/Z cards, before they were discontinued, in essence received their own gaming drivers in the NVIDIA drop-down on their website. Keep in mind these cards cost $3000 MSRP, which would be $3718 if they launched in 2023!
As such
I dare anyone to look up the performance numbers of the 4060 Ti 8GB vs the TITAN V 12GB or the Titan Z; doesn't matter which. The 4060 Ti slaps them both, whilst having many disadvantages that the community is having an absolute laugh about:
8GB VRAM
128-bit bus
8 PCIe lanes
Fewer CUDA cores
Less available power overhead
Oh, it does have ‘IPC’ on its side though
And that's what counts at the end of the day.
If we're talking about releases, we must talk about their respective MSRPs, the prices at the time, and the performance given.
As such
The 4060 Ti takes the Titan series cards out back and beats them into oblivion.
We can add this to plenty of other releases too
Anything 2080 Super and down, in MSRP price-to-performance terms, it beats so brutally that showing the footage would get an entire channel banned instantly.
It should have had a 256-bit bus like the 3060 Ti, and more cores; it has far fewer than the 3060 Ti. Or they could have made it a 196-bit bus with 12GB of memory; that would have been a good compromise.
196b or 192b?
Xx60 should be 192b, and xx60ti should be 256b.
Consider AMD or 4070 instead..
True that. But think how much easier it was for Nvidia to switch their 4060 Ti assembly to a clamshell VRAM configuration and sell the card for $100 more; it was probably genius in Nvidia's eyes! A 192-bit bus would have meant a total redesign that would have made the card as expensive as the 4070!
I just want to play my heavily modded Skyrim. Lots of graphics mods. My current GTX 1080 Ti SC 11GB struggles down to 15-20 fps. Pretty rough... I'm looking at this card. It must be an upgrade, surely? I'm hoping it will give me upwards of 30 fps in those heavy areas of the game, and I'll be happy.
Sad that I will have to wait until the 5000 series to upgrade from my GTX1060 6GB.
One thing I will give the 4060 and 4060 Ti is that their power consumption is really low. Might be useful in some sort of low-power portable ITX build or something.
If power consumption is your target in Gaming you may try Card/Board Games or Mobile Games :) Or try a PS5? A laptop with a 30 or 40 series chip would also be a great alternative? Or try a Book - I heard the biggest adventures are created within your mind! And that costs you no energy at all. Just messing with you :)
You're just cheap lol. A 1060 in 2023 is just dumb; a 6700 XT goes for a bit over 300 bucks, a 6650 XT even less, all faster and with more VRAM.
@@WanderfalkeAT What about playing games at 1440p with those graphics and a 550W PSU?
The RTX 4060 Ti chip is called AD106, which means it was intended as the 4060, not the 4050 Ti. The 4060 uses AD107, which means it was intended as a 4050/4050 Ti.
Nvidia is not dumb enough to make their lies THAT transparent. Of course they will name the chips in a way that suggests everything is cool. But the 4060 Ti cannot be a 3060 Ti replacement because it lacks the typical generational performance gain of about 20-25% over the old card. If you consider this, it becomes obvious that the 4060 Ti is, performance-wise, in 4050 Ti territory, no matter what Nvidia is trying to tell you.
I thought I would regret buying a 3060 Ti in July of last year because of the 40 series coming out, but now that they're out and pretty bad, I guess my choice wasn't bad lol
I don't think your choice was bad at all! Especially in terms of dollar-to-performance ratio.
Hear me out. I got a 4060 Ti (8GB) for $80 on Marketplace; should I replace my base 3060 with it? It's not a 3060 Ti.
I love how you use parts of the Hollow Knight soundtrack in your videos; it always goes hard. I would love to see a Hollow Knight video, maybe comparing it to how modern AAA games are today.
Somewhere, somebody is buying a 4060 ti 16gb and I wonder what thoughts are going through their head.
Probably: Yes, a new GPU! Because they are completely oblivious to where the product actually stands in the market, they will simply be happy to get something new.
"An AI accelerator for my amateur project for a bargaining price"
Don't worry, I won't buy it.
I bet it's better than my RTX 2060 for sure 😂
With the right prices (they are much lower now in some countries), and upgrading from e.g. a GTX 1070 while wanting DLSS 3 and FG, it's a good deal ;)
In the years to come I could see the Nvidia 4060 Ti 16GB staying on top in terms of value and technical edge vs AMD, if upcoming games rely extensively on DLSS 3 to guarantee high FPS at 1440p.
Unless/until FSR catches up 😅
I wouldn't like the frametime impact compared to faster native FPS! DLSS is already great even without tricks. And 3080s/3090s get cheaper on the used market day by day. Not to mention how far Intel has come with their Arc drivers and how interesting their Battlemage will be.
Is a 70-class Ti version normally 12GB of VRAM?????
Nah. Also at $800 it really stings. 1080 ti launched at $700 in 2016 with 11 gb
@@transistorjump919 easy mistake. basically all the other 1000 cards were 2016.
My b
So what's the alternative GPU at this price?
They add VRAM to the one card that needed it least, overprice it, and then they will say: "Why would we increase VRAM on the others? You didn't buy the 4060 Ti 16GB."
It's just ridiculous.
4060 Ti 8GB was the one that needed more VRAM the most tho?
While some say it is good to have more options, the 3060 12GB was unpopular at first too, so we might see something surprising with this 4060 Ti 16GB.
Like Linus talked about on the WAN Show: even if they charged DOUBLE the price of the DRAM/VRAM, it would've been about 50 dollars, not 100. This is just greedy and shameless.
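A back-of-envelope check of the WAN Show figure. The ~$3.50/GB GDDR6 spot price below is an assumption based on widely reported mid-2023 figures, not an official BOM number:

```python
# Back-of-envelope check of the "even double the DRAM price would be
# ~$50" claim. The spot price per GB is an ASSUMPTION (roughly in line
# with reported mid-2023 GDDR6 pricing), not an official figure.
extra_vram_gb = 8                 # 16 GB model adds 8 GB over the 8 GB model
spot_price_per_gb = 3.50          # assumed GDDR6 spot price, USD/GB
bom_delta = extra_vram_gb * spot_price_per_gb
print(f"extra DRAM cost: ~${bom_delta:.0f}, doubled: ~${2 * bom_delta:.0f}")
# vs the $100 premium Nvidia actually charges for the 16 GB model
```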
They are trying to keep the profit stream from the crypto boom going. But they're two different customer bases, and gamers aren't backed by billion-dollar investment funds looking for a quick buck. Shareholders don't care, though, and demand the same profits quarter over quarter.
Funny thing is, after driver updates and some time, just as I told everyone back then, the 2080 Ti is actually faster than the 3070 by a decent margin now. Recent comparisons of these cards (within the last 6 months) illustrate this perfectly.
It's like slapping 4GB on a GT 740 back in the day... but at least back then price wasn't an issue lol
Amd squad represent 💯💯
It's great for Stable Diffusion or any AI; I copped one since it has more VRAM than the 4070. Decently good for gaming too since I'm coming from a 1070, and it's not like you have good alternatives from AMD.
Guys, I'm getting a gaming PC. I went to a shop with a budget of 1500€, and they gave me two options: a PC with an i9-13900K and a 4060 Ti 16GB, or an i7-12700KF with a 4070. Which one should I choose? (P.S.: I've never had a gaming PC, only an Alienware laptop.)
I have a Ryzen 7 3800X. Can you advise me on what GPU to pair with it?
Unless you need NVIDIA, get a 6700 XT/6750 XT. If you need NVIDIA, get a 3060 Ti. If you want more performance, get a 6800 XT, 6950 XT, 7800 XT, 4070, or 3080. If gaming, I would then upgrade the CPU to a 5800X3D.
The 4060ti should've always just had 12GB. That's what they should've done. No 8GB model, no 16GB model, just a happy combo of the two at 12GB. EDIT: Maybe even just a 10GB model would've been better, since the 4070 is already at 12GB.
What difference would bus width make?
Bandwidth of the 4060 Ti is 288 GB/s (not including the cache), which works out to roughly 4.8 GB of data per frame at 60 fps. That's a massive data budget. Bus width is NOT A LIMITING FACTOR when it comes to games. It can cause a bottleneck, but only if the game streams data heavily; maybe a flight-sim kind of game at 4K with everything on ultra. There, bus width may be a limiting factor.
Most games are rendered and processed on the fly, so they rely heavily on GPU processing at higher resolutions. Bus width does not bottleneck games. VRAM can, because rendering a scene at 4K sometimes requires over 12 GB in memory, but bus width doesn't affect squat in 99.99% of games. This "more bus width = more performance" adage is from the stone ages.
Also, the 4060 Ti is NOT A 4K CARD. 128-bit is way more than enough for 1440p. A flight sim doesn't stream anywhere near that much data at that resolution either. There is NO game on the mainstream market that will get bottlenecked by a 128-bit bus at 1440p or below, so bus width isn't the problem with the 4060 Ti.
The MASSIVE reduction in processing units is the main problem with the 4060 Ti. Games that utilize many GPU cores well will show negligible uplift over the 3060 Ti, or may even perform worse. Bus width has almost zero effect at 1440p or lower resolutions, so stop whining about higher bus widths.
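The bandwidth math in the comment above can be sketched directly from the published specs (128-bit bus, 18 Gbps GDDR6 per pin):

```python
# Peak memory bandwidth from bus width and per-pin data rate, then the
# per-frame data budget that bandwidth allows at common frame rates.
bus_bits = 128
gbps_per_pin = 18                        # GDDR6 data rate on the 4060 Ti
peak_gb_s = bus_bits * gbps_per_pin / 8  # bits -> bytes
print(f"peak bandwidth: {peak_gb_s:.0f} GB/s")   # 288 GB/s

for fps in (60, 144):
    per_frame_mb = peak_gb_s / fps * 1024
    print(f"at {fps} fps: {per_frame_mb:.0f} MB of memory traffic per frame")
# at 60 fps: ~4915 MB/frame; at 144 fps: ~2048 MB/frame
```

Whether that budget suffices depends on how much of each frame's traffic the L2 cache absorbs, which is exactly the caveat the comment flags with "not including the cache".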
Finally someone with a brain
@@Eleganttf2 It's baffling to me how everyone keeps harping on one of the least significant issues with the 60 series. 99% of the community yaps about bus width.
Fewer processing units, less VRAM, and absurd pricing are the real issues.
Nvidia fixed the VRAM, but took the already absurd pricing beyond comprehension.
A 16GB 4060 Ti with this level of silicon should not be more than $350. If it had silicon equal to or greater than the 3060 Ti's, then $400 would've been acceptable for 16 GB. $500 for less processing power than the 3060 Ti is just pure greed.
I had a choice between this and the 6800xt and you can guess which one I went with.
3:21 Wrong sand though! It hits so much harder if you consider all the beaches that have to die for that silicon.
I hated the price but that's the GPU I wanted LOL
A 3070-like card with 16GB of VRAM, lower power consumption, and small enough to fit nicely in an ITX case.
And no, AMD is not an option when you need the GPU for VR.
Preach. Sometimes there are situations where the card makes a lot of sense. Yes, it's pricey, but that's how it is. I'm still not paying insane amounts, so I'm good.
Just buy the 4090 I don't know what is wrong with you people 😅
The price tag of that gpu
I don’t have spare kidneys
Not everyone is a spoiled rich kid who can't even stand up from his couch.
The thing is that you see it all just from a gamer's point of view... That memory may be great, e.g., for Substance Painter with tons of layers.
Great vid bro !
It’s crazy how the top tier card from Nvidia, is the best value card as well 😂
Some say it isn't. 😉
@@mytestbrandkonto3040 Price to performance, it is the best value card. It's like buying food from a bulk store: technically a better deal, but that doesn't mean you can afford the upfront cost.
@@Shadowangel09 Yeah, I see your point. Just looking at graphs from people testing those GPUs suggests the 4090 is the worst in price/performance. Sometimes GPUs like these still get a buying recommendation in 2023, like the 4090, and sometimes they don't, like the 3070 Ti. People can still buy whatever fits their budget and needs best, though.
And it doesn't even matter if you buy it, since Nvidia will just release another exclusive tech with the next generation.
It is funny that people think the 4060 Ti is a 4050 Ti; Nvidia wouldn't sell you a 4050 Ti that was this close to a 3070. TechPowerUp says it is 5% slower than the 3070. It is too bad and too expensive to be a 4060 Ti, but too good to be a 4050 Ti. It is also funny to care about the name of a GPU or its die; they don't really matter, to be honest. The price is the problem. If the 4070 Ti were still called a 4080 12GB but cost $600, everyone would cheer for it; price-to-performance is the problem, not name-to-performance. Even if they used the AD102 die that is usually reserved for the best GPUs and it still performed like this, it would still be trash. If the 4090 had a low-tier AD107 die and kept its performance and power usage, nobody would care. The problem is the performance. We don't care how they do it; we just want the performance we paid for. Don't teach Nvidia that they can change the die naming scheme and trick us.
NVIDIA owns the market
There is only so much market share they can have before it becomes a problem
Look at the competition's offerings.
What happens if Nvidia released a 3070/Ti as a 4060/Ti at $299? (As you said, naming schemes aren't important now, are they; just performance, RIGHT?)
What HAPPENS if Nvidia released a $299 3070 Ti with 12-16GB of VRAM and a 256-bit bus? What happens?
Ask yourself: there is a reason why Nvidia's product stack under the 4090 is gimped to some degree.
If y'all really practiced what you preach, we'd see the market share swing the other way massively. But it isn't. Y'all want more from AMD and Intel without supporting them, but also want more for less from the company that owns the majority of the market?
Does ANYONE SEE THE PROBLEM HERE?
It is a 4050. Proven over and over again and again.
It's funny how people can't even see the facts.
@@flimermithrandir If you think Nvidia would make you a 4050-price-class GPU that is 5% away from the 3070 in performance when there are so many 3070s still on the market, I can't help you. 😂 A 50% generational perf uplift from an XX50-class card, ha ha ha. It's like you people have never met Nvidia before; remember how the GTX 1050 was only 6% faster than the GTX 950 🤣. I knew then that Nvidia sees the XX50 class as a dumping ground. They created shit 4060- and 4060 Ti-class cards; the name means nothing, I want the price down. They can call them 4050s or 4030s for all I care; if the price doesn't go down, it means nothing. The price is what I want down.
@@Ober1kenobi I'm not sure I understand your point. If Nvidia launched the 3070 Ti as a 4060 Ti for $299, it would be a way better product than what we got; the 3060's MSRP was $299 and the 3060 Ti's was $399. I don't even care if Nvidia called the 4060 Ti a 4090; charging $299 for it would be better than the trash we got. It would be a 33% increase in price to performance. Don't let Nvidia trick you with names; they'll change the die naming SKUs in 2024 to trick us. I have used both AMD and Nvidia GPUs for gaming, but I need Nvidia for my job; I use CUDA for machine learning. Edit: I'm waiting on Battlemage from Intel for my main gaming rig. I'm not biased against Intel or AMD; I use a 13700K CPU and a 7900 XTX in my main gaming rig. My Nvidia rig is an employment necessity, even if it's convenient for me.
Great stuff, thanks! Your mention of the niche case was spot on, and it's rare to find this aspect in recent videos, as everybody is always focused on gaming. But that's not everything you can do with these cards. Creative people and those who use generative AI are longing for this, and right now the 16GB version is the best Nvidia option unless you're ready to go above the $500 threshold. Nvidia knows that too and now wants to milk customers before AMD and others catch up. But yes, aside from the narrow bandwidth, the price is the main problem from all perspectives.
I honestly expected the 3060 Ti, when it launched, to at least pack 12GB of VRAM; disappointed about the 8 gigs.
IMO it's a great GPU that will last people YEARS
It's simply $150-200 too expensive. But if I were to get one rn, I definitely wouldn't regret it.
Lmao it can barely play 1080p games...today.
@@DanielCruz-qr6qu The 8GB variant yeah. But the 16GB is fine
If AMD wasn't such a joke when it comes to Blender and UE, I would consider it, but it's just that, a joke.
Thanks for the amazing explanation!
Is it bad I bought a 4060 Ti 8GB? I was upgrading from a Radeon RX 570 and the card was really appealing in my eyes. Should I have bought the 3060 Ti?
The only advantages the 4060 Ti has over the 3060 Ti are power efficiency and DLSS 3.0. That is not worth it. Also, the RX 6700 XT/6750 XT is great.