I'm going to show you RTX 4060 benchmarks BEFORE embargo!
- added 25 Jun 2023
- I was asked if I wanted to show some performance figures with you guys early... WELL OF COURSE! More time to make a purchasing decision is ALWAYS a good thing!
Sponsored Links
Learn more about the Montech Titan Gold PSU lineup here - www.montechpc.com/en/products...
amzn.to/3qTbdZn
See all specs and features on the Viewsonic XG340C-2k here - www.viewsonic.com/global/prod...
Get your JayzTwoCents Merch Here! - www.jayztwocents.com
○○○○○○ Items featured in this video available at Amazon ○○○○○○
► Amazon US - bit.ly/1meybOF
► Amazon UK - amzn.to/Zx813L
► Amazon Canada - amzn.to/1tl6vc6
••• Follow me on your favorite Social Media! •••
Facebook: / jayztwocents
Twitter: / jayztwocents
Instagram: / jayztwocents
SUBSCRIBE! bit.ly/sub2JayzTwoCents - Science & Technology
Apparently showing some early performance in a live demo makes you a shill these days. Salty channels aside, I of course jumped at the opportunity to let people get an idea of how it's scaling compared to the Ti and older cards. This video, as stated, is an early preview; it's not my job to tell you to buy it or not, it's yours to decide based on gathering info across all of these outlets providing info in their own testing style. As stated, I didn't choose ASUS; in fact I expressed my displeasure with being sent an ASUS card.
If you came here because of other salty channels, cool. I don't mind being their platform to boast their own validity while bashing. I don't do drama, that's them, not me.
As stated, this card should be cheaper, the VRAM should be higher, and the gap should be larger. But go ahead and parrot whatever you like; it honestly doesn't bother me.
Have a nice day
Jay! I can't believe you would make something I enjoyed! Appreciate ya ;)
It's not a revolt. It's a march for justice!!
It’s not that you showed early performance, it’s that you showed exactly what Nvidia told you to. That aside, I’m still subscribed because I trust your content overall and you don’t act all high and mighty like Gamers Nexus.
Alright Jay, I see you. The easiest thing would've been to take the video down and apologize, but you did the man thing and stood by your word. Respect. I will take somebody who follows their own word over someone who goes back on it to save face.
@@jediii86 I could have picked any title and the response would be the same. I know just how much cards fall on their face in this title when the features are off, which is why I use this title already. People forget we already show CDPR's CP77 in our testing, but some salty channels love to use me as they wield their swords of righteousness. I do apparently love to piss off the Aussies.
The fact that they forbid you from bumping this up to 1440p is hilarious to me lol. With the 4050 you're only going to be able to talk about 720p 🤣
@@96kylar Haha, yes. And it's just a coincidence that they picked the one single game that benefits Ada cores/DLSS3/FG the absolute most. Pinky promise. Scout's honor. Nvidia (and AMD as well) just can't stop being shady, they just can't help themselves.
RTX 4050 reviews will all have to be run at 1080p but only with DLSS on. 😂
Either 720p or even 480p
@@ShinyHelmet:(
This card is already a 4050 in terms of performance and die size.
A 128-bit bus and 8 PCI Express lanes were used for the RTX 3050. This is a 4050, with the 4060 Ti being a 4050 Ti. They just uptiered an extremely low-end die to xx60 tier.
They did that on every model below the 4090. The "4080" is like a 4070 Ti.
Same trend for the whole stack.
tech reviewers became a joke. They are basically shills now, making ads for ngreedia/ayyymd crap that's overpriced 3-4x.
corporations set prices for freaking stupid GAMING devices like they're money-printing machines... and reviewers stay silent about that. Either reviewers are shills now or stupid AF if they don't notice what's going on.
tech reviewers DEFINITELY SHOULD CRY MORE, to the point of skipping product reviews and making videos about HOW OVERPRICED IT ALL IS instead of doing tests... ahem, I mean ADS for overpriced AF GPUs.
talk about the fact that all these cards are irrelevant as they are overpriced 3-4x, and that e.g. the 4070 is actually an xx50 Ti class card and should be priced accordingly, below $200, as xx50 Ti cards were priced pre-mining.
tech reviewers must talk about the fact that a true xx70 card is the 4080 16GB; 30% less performance than the top card (the 4090) is what xx70 cards were all about. Talk about the fact that xx70 GPUs cost around 300 bucks in 2016, pre-mining.
remind people that no one in 2016 considered ngreedia GPUs cheap, so even if they got back to pre-mining prices they still wouldn't actually be cheap; it'd just be far more justified pricing than what we have now.
and don't forget to mention that 1080p is extreme budget in 2023. We aren't in 2016 anymore; with current monitor prices, 1440p is the new norm.
talk about the fact that any 1080p GPU must be considered extreme budget and priced around 100-150 bucks, just like the 1080p monitors they are only suitable for. E.g. the 4070 is DEFINITELY an entry-level, below-$200, xx50 Ti class card.
talk about the fact that a 4090 die (608 mm²) costs 300 bucks MAX to make according to wafer calcs and 2-year-old TSMC prices per wafer; it's more like 200-250 bucks now.
a 4080 die (379 mm²) costs 150 bucks MAX. And ngreedia is trying to sell cut-down 60-class silicon (295 mm²) that's worth around 50 bucks as a 4070 Ti for 800 bucks with 12 GB VRAM; they even tried to sell it as 80-class silicon for 900 bucks at first.
and it's on OVERPRICED AF TOP TSMC NODES. You could easily release the 4070/Ti/4060/Ti on WAY CHEAPER NODES. Yep, it would eat a bit more power, but those dies would cost not 50 bucks from TSMC (which by itself should allow selling a 4070 Ti for 200 bucks) but more like 10 bucks to make. So a 200-buck 4070 Ti is literally possible and would be VERY profitable. But we are dealing with ngreedia here, milking the market. Same as ayyymd and intool.
ALL TECH TUBERS DEFINITELY SHOULD CRY MORE. As is, "tech reviewers" ACTUALLY ARE NGREEDIA/AMD SHILLS who normalize 3-4x overpriced GPUs by ignoring HOW MUCH it is all overpriced and not crying about it all the time. YES. YOU - TECH TUBERS. ARE. SHILLS. Whether you realize that or not. Period.
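The wafer arithmetic in that rant can be sketched with the standard dies-per-wafer estimate. The wafer price and yield below are placeholder assumptions for illustration, not TSMC's actual numbers:

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Standard gross dies-per-wafer estimate for square-ish dies on a round wafer."""
    area_term = math.pi * (wafer_diameter_mm / 2) ** 2 / die_area_mm2
    edge_term = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(area_term - edge_term)

def cost_per_good_die(wafer_price_usd: float, die_area_mm2: float,
                      yield_rate: float = 0.6,
                      wafer_diameter_mm: float = 300) -> float:
    """Cost per yielded die under an ASSUMED wafer price and yield."""
    good_dies = dies_per_wafer(wafer_diameter_mm, die_area_mm2) * yield_rate
    return wafer_price_usd / good_dies

# With a placeholder $17,000 wafer, a 608 mm² die (the claimed 4090 size)
# lands in the low hundreds of dollars, and a 295 mm² die well under that.
for die in (608, 379, 295):
    print(die, "mm² ->", dies_per_wafer(300, die), "gross dies,",
          round(cost_per_good_die(17000, die)), "USD per good die")
```

This only sizes the raw die cost; packaging, memory, board, cooler, and margins are all on top, so take it as a rough floor, not a retail-price argument.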
@@WaterZer0 Yeah basically. They went over the top with the 4090 and everything else has seemingly taken a step back. All you have to do is look at the memory bus and the power consumption and line them up with previous gen cards in the ballpark.
4050 in disguise 😂😂😂😂
That is probably to justify the price.
This made me curious, so I went back and checked day-one reviews for the 1060, and it's funny how those reviews back then focused on 1440p, some even on 2160p, and now, six years and three generations later, we get the same tier of card focusing on 1080p. Weird time to be alive.
Games are much harder to run now, but I do see your point
@@ItouchIphoneInfoyeah but the hardware should also be much stronger to match it, right?
@@ItouchIphoneInfo I wonder if you will need to run games at 720p or 480p in the future with a 4060, since it's marketed as barely a 1080p card in the present.
@@majinbuu1337 Bruh, if this trend continues the 5060 will be a 720p gpu 😂
@@Zxanonblade It should be, but nvidia is too busy being greedy assholes releasing weak gpus for twice their worth. Remember when the GTX 1080 smoked the 980 and 780? Back when gpus were actually overpowered.
Just in case you're wondering why every other big tech channel hasn't uploaded a sneak preview of the 4060.
This is all part of a deal given out by Nvidia under certain conditions.
1. You must highlight the advantages and fps gains with DLSS 3 and Frame Generation
2. You're only allowed to show specific games
3. You have to use the settings Nvidia chose beforehand
In exchange for that you get a headstart and Nvidia gets an ad from "trusted" YouTube channels.
That's why you see mostly smaller tech channels taking this opportunity in order to gain an advantage over their peers.
Seems like the 4060 Ti review wasn't an accident after all and only got removed because of the backlash.
Thanks for the great first impression of the 4060!
Yeah I just unsubbed Jay after so many years. Can't trust the man anymore.
Embarrassing, isn't it. "I'm not allowed to talk about 1440p," etc. My takeaway was that it tries to generate some orders prior to more balanced reviews going up.
Let’s see what GN make of it.
I just unsubbed as well. GN & HU are more trustworthy brands in my opinion these days.
@@JLC1987 I treat all of them with some suspicion as they’ve all done the mock outrage and then pushed certain brands over time but GN at least gives the fullest dataset so you can get an informed view.
Cause they are whiney babies. Go ask if they are okay with $10K markups on pickup trucks....
I have a feeling Nvidia doesn't want to sell GPUs to consumers, like at all
To be fair, the consumer market is extremely unpredictable and volatile. If you can sell to other businesses, it's a lot better.
Obviously that doesn't excuse their behavior with this generation or their behavior in general.
No, they really do think people are stupid. Sadly they are right most of the time, or they wouldn't be where they are.
The whole industry is now targeting data centers and AI; us gamers just don't matter right now.
I think that if they had made it a 12 GB card for $299, or even $329, it'd be a very desirable card. The $20 difference between $279 and $299 just isn't enough to make a difference to me between buying or not.
Yeah still haven't found a reason to upgrade my RTX 3060
It's a better deal than the rx7600 imo
i think it's gonna sell well for people who can't find any other card around that price with higher vram
I'm waiting until next tax time to nab myself an RX 6750 XT.
I think it would help them sell, but just because it's 12 GB doesn't mean it would be worth it. I think the new AMD cards are gonna smash these. Oh, and that card is clearly like a Dell OEM card; small and black like that, I can see it in their systems. You would expect other companies to get more out of it, but they're gonna charge more for sure.
With the billions that Nvidia is making, they could certainly have blown a few million dollars to knock the price down on the 4060 to make it their loss leader card that brings back the buyers.
A 300$ card running a 2.5 year old game at 1080p, what a time to be alive.
I'm gonna steal that for my comments about overpriced AF GPUs, thank you
lol good comment
You'd think he discovered fire, so excited that it plays one game. Now do Hogwarts using ultra textures.
$300*
You know they optimized the hell out of it for the one title when that's all you're allowed to show pre release
@@96kylar Hardware Unboxed refused for exactly this reason.
@@96kylar He will clearly do anything to make a buck.
It's called paid advertisement.
@@Terepin64 He tries to sit on both sides of the fence.
If Jay refuses to follow the instructions, he won't get sent future cards anymore, or his income from them will be obliterated, so to speak.
Everyone just needs to stick with what they have for a year so Nvidia, AMD and Intel get the hint we are not buying video cards without the performance increase we expect!
Totally agree
I'm trying to get something for myself for Christmas, my RX480 is on life support nowadays, downclocked, undervolted, custom fan curve... if it resets to the default settings it will crash after a few minutes of gameplay.
@@Cinkodacs Used market
I don’t know how much longer my 1070ti can hang on. And to be honest, I don’t think Nvidia will miss the $ from consumers one bit. They just made over $1,000,000,000 (yes billion) from the Chinese government for selling enterprise GPUs for AI. They don’t care about people not buying overpriced 4000 series cards. I don’t know what the answer is 🤷♂️
It's not always about performance increase. It's about power efficiency as well. That will save money on your power bill
27% uplift is great and all, but I can’t help but remember the 3060 being about a 2070/super in performance, so seeing the 4060 being worse than a 3060 Ti when you can get one for like $250 on eBay is rough
It's because it's a hyper inflated 4050
You realize the 3060 was only 20% better than 2060, right? The only good thing about it is that it had 12GB, otherwise it was a complete turd, much slower than the 3060 Ti.
I felt horrible when my friend bragged about his new PC and its 3060; he got so ripped off and he doesn't even know it... I didn't have the stomach to tell him.
Not that the 3060 is a bad card, just the price he paid.
Here in the EU you can snipe some 3070 models for 250-275€. Maybe this price will rise when people realize how bad the 4060 is.
@@totallyrealcat4800 the 3060 Ti has 8GB of VRAM, not 12GB; the 3060 has 12GB
The issue with this is that you're doing exactly what Nvidia wants you to do, and you notice how no other major channels have releases on this, there's a reason why.
Begun, the youtube tech reviewer war has.
@@alargecorgi2199😂 good one
The issue I have with FrameGen is that it boosts your fps but not your input delay. So the discrepancy between your input and the game's reaction is higher, and therefore the perceived input lag is worse.
Fully agree; I feel the same on my 4060 mobile. Frame gen is a smoothing filter, not an fps booster. When he shows 95 fps with RT and FrameGen ON, it is a nonsense number, as he is really playing with 40-something-fps lag. You must be well above 100 fake fps for it to be acceptable. I don't understand why they bought this NVIDIA marketing BS.
100% agree with you both! I've tried frame gen a little bit now & it feels horrible unless you've got enough base fps to not need frame gen anyway! I would rather lower settings than use frame gen for sure. Input lag is more important than visual smoothness for most games imho.
It should be considered a visual effect nothing more.
And actually, after seeing today's HWUnboxed Q&A where they mentioned the list of NVIDIA's PR test limitations to get early reviewers to pre-shill the 4060 with predefined test cases (they basically created a video with their face for NVIDIA), I would actually blame Jay for falling twice into NVIDIA's trap for some more views. The fact that it was an ASUS is now even funnier.
Yeah, it does, but in my experience with my 4080 you don't notice that, whereas the FPS you really do. So it's like five leaps forward and one leap back.
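The latency complaint above can be put into rough numbers with a toy model (all figures illustrative; this ignores Reflex, render queues, and display latency): frame generation roughly doubles the displayed frame rate, but input is still only sampled on rendered frames, and interpolation means holding one rendered frame back.

```python
def perceived_latency_ms(rendered_fps: float, frame_gen: bool) -> float:
    """Toy model: latency tracks the RENDERED frame rate, and frame
    generation adds roughly one extra rendered-frame time of delay
    because it must hold a frame to interpolate between two."""
    frametime = 1000.0 / rendered_fps
    return frametime * (2.0 if frame_gen else 1.0)

# ~48 rendered fps with frame gen shows ~95 fps on the counter,
# but feels laggier than plain 48 fps, and nothing like a real 95 fps.
print(perceived_latency_ms(48, True))    # ~41.7 ms
print(perceived_latency_ms(48, False))   # ~20.8 ms
print(perceived_latency_ms(95, False))   # ~10.5 ms
```

Under this (simplified) model, the on-screen fps counter and the felt responsiveness diverge exactly as the comments above describe.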
AMD recently dropped prices in response to this being announced so maybe the rush to lift embargo is retaliation to this
The competition, the RX 7600, hasn't dropped in price here, and there's no sign of the promised 50-dollar rebate either. AMD marketing is always such BS.
@@pc_modder8156 i don't know if it is the difference between Manufactures Suggested Retail Price and Recommended Retail Price confusing them or if they are choosing to just ignore the fact they are only "suggestions" and "recommendations"
@@pc_modder8156 So they should have set the price lower at the start and analyzed the market better. That has everything to do with marketing. And I think AMD didn't lower their prices that much, but just told the retailers to lower their profit margin, like they did before already. Again, marketing. That rosy view of AMD gets really old. They are just limping behind Nvidia, whether you like it or not.
@@erikkarsies4851 why shouldn't they? just so you can get a better deal on the nvidia card you want?
@@erikkarsies4851 i think it was about 10% drop, i can't remember exactly where i saw it i think it may have been jays first video announcing this 4060 but could have been another tech channel.
Remember, this is Nvidia's cherry-picked game for performance. It's probably going to be their best showing, but how does it do in other games?
I was just gonna say that 😂. The card's performance in other games is where we will see if it's a decent card or another complete flop like the 4060 Ti.
I want to see this crash and burn on AMD geared titles. I love seeing tests in games like Borderlands 3 because it's straight up non RT raw performance! No DLSS, no FSR, no gimmicks to increase performance, just raw power
@@pedrolantyer welp thats great 😂
I played Borderlands 3 at 1440p at ultra settings on a 3060 12gb and got over 80fps all day on it
Can't wait to see a 4060 on it.
Also, this video wasn't even a benchmark video. It was rambling nonsense for 15 minutes. One benchmarked game, and that's it.
@@constantin-adrianprisecari5379 But people still fall for all the marketing bullshit; that's why marketing is an important tool throughout business. It's about targeting your audience with stuff that sounds great enough to pull the trigger, and only too late do you realise it never hit those high notes lol
This really needs to be prefaced with 'This is an ad'.
Jay says that generationally it's a great jump from the 3060 to the 4060, and then proceeds to show the previous gen 3060ti beating it.
When the 3060 launched, it beat the 2070 and before that the 2060 beat even the 1070 ti.
Meaning that two generations ago a new 60 class card was close to the performance of a previous gen 80 class card.
Nvidia is giving less and less performance every generation and this "reviewer" doesn't seem to notice it.
This reviewer noticed $ on his bank account from Nvidia
@@Mr11ESSE111lol
@@ProcessedDigitally Only him and Daniel Owen show these stupid reviews; others refused it, like Linus and GN and similar! Maybe they want more $, or they don't want to play with losing subscribers.
But DLSS 2 and DLSS 3! 😂
This comment, and the many others like it, have aged as well as a corpse cooking on a summer street.
You would do best to keep your biases at the door instead of bringing them in and making judgements which are aligned to prove those biases true.
Jay - Asus, you're fired!
Nvidia - Lets troll Jay with an Asus card!
😆
yeah like what the fuck jay
I'd tape over the Asus logos and not talk about who made the card, or say something like "this card is from a brand I don't associate with".
@@RealGengarTV Yet he didn't do that and is making money thanks to Asus. It's incredible how powerful these brands are.
@@RealGengarTV He may have been told, he had to tell the name.
Jay: Let's get revenge on NVIDIA for their embargo.
If nVidia is talking about replacing a 2060 with this could you do the testing on a PCI-E Gen 3 motherboard as well ? See if there's any difference between the pci-e generations.
There will be for sure because these cards are only using 8 lanes.
Yup. Not everyone is on gen 4, which includes me.
If it were to use 16 lanes, there would be no difference. Even on a 4090, the difference is marginal at best. When it only uses 8 lanes though, the story might be different.
Hey... I am using my 2060 FE 6G on a Z790 i7-13700K... :( I was really hoping the 4070 Ti was not priced stupidly.
Yes, PCIe 3 would cut bandwidth in half; only 8 lanes instead of the full 16.
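The halving being discussed can be checked with simple link-bandwidth arithmetic; the per-lane figures below are the usual effective rates after encoding overhead:

```python
# Effective PCIe throughput per lane in GB/s (after 128b/130b encoding, gen 3+).
PER_LANE_GBPS = {3: 0.985, 4: 1.969}

def link_bandwidth_gbps(gen: int, lanes: int) -> float:
    """Total one-direction link bandwidth for a given PCIe generation and width."""
    return PER_LANE_GBPS[gen] * lanes

# An x8 card on a gen 3 slot gets half the bandwidth it would on gen 4,
# and half of what a gen 3 x16 card gets.
print(link_bandwidth_gbps(3, 8))   # ~7.9 GB/s
print(link_bandwidth_gbps(4, 8))   # ~15.8 GB/s
print(link_bandwidth_gbps(3, 16))  # ~15.8 GB/s
```

Whether that ~7.9 GB/s ceiling actually costs frames depends on the game; it mostly bites when the 8 GB VRAM buffer overflows and the card starts streaming over the bus, which is why testing on a gen 3 board matters here.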
NVidia: the way Jayz2Cents was meant to be played.
Picked up a 6700 XT 12GB for $319 recently... After watching this, I'm still pleased with my purchase. I feel like I'm precisely the type of customer Nvidia is after: I've only ever bought Nvidia until this 6700 XT, I was after a new card with a budget of $250-350, and I have a 1440p panel (increasingly the norm now), so 12GB was a no-brainer. That made the obvious choice the 6700 XT. I tried to find a competing Nvidia card for several weeks; there was nothing out there. Their pricing strategy is FUBAR.
I did the same at the end of last year. Nothing in the offerings from either red or green has made me regret it. Literally the only thing that's made me think twice was seeing how far Arc has come. Seeing the hardware AV1 encoding and then seeing the driver updates bring performance pretty close to this stuff for around the same price is the only temptation and potential regret. But then I remember that by the time places implement AV1 streaming, the Arc A380 will probably be something like $150 and I can just pick up one of those and plop it in the second PCIe x16 slot if I run into any issues with software encoding.
Best deal on today's market: 1440p card, good performance, 12 GB VRAM. It's crazy to me that nvidia starts with 12 GB VRAM at double the price of the 6700 XT.
I agree 100%. MSI 6700xt 12gb is the way to go
Yep... I got the TUF 6700 XT, and to be honest this card runs better than the 4070 in many games... the 4060 is not even close for comparison.
@@akamdapapash yeah agreed, at this point Nvidia seem to just be relying on their brand strength / name... I'm hoping they'll learn, but with their pivot towards AI, do they actually need consumer GPU sales nowadays I wonder?!
It should have been 20% better than a 3070 at this price to be worth it, and for that it should have been a 12GB card. At this level of performance with 8GB, I agree it should have been $250-260.
That used to be the industry standard. Each generational leap, the new card would match and slightly beat the card a tier above it from the previous gen, add new software features, and do it for the price of the lower-end card it replaced.
Since mining, they have not only raised prices and shrunk gen-to-gen performance gains, but seemingly shifted the whole product stack so that the lower-end cards become the mainstream.
This isn't the first time they have done this, though. The xx70 series cards used to be the $300-350 cards; I think I paid $329 for an Asus Strix GTX 970 4GB. The next gen they upped the price of the 1070 to be more in line with what the 980 had sold for, and then tried to push the 1060 as the more mainstream card. I'm thinking by the time we hit the 5000 series in the next year or two, the 5050 is going to cost like $350, they will call it the great 1080p card with a 64-bit memory bus, and some YouTubers will still be shilling them as great cards.
8GB is enough memory to play current games at 1080p or 1440p, and I doubt that texture sizes will massively increase in the next couple of years. When you use features like DLAA, it has the effect of making textures look sharper than they actually are anyway.
This card is aimed at the mainstream, not the high-end user. The drop in power consumption alone is a big practical improvement over previous cards.
I would even say it’s not a good deal at $250 either. Both frame gen and RT are vram hogs and while this patch of CP2077 can run fine, cdpr already announced that the game will get a lot more demanding for 1.7/2.0 or whatever they decide to call it. 8 gigs simply won’t be enough if you actually plan on using the nvidia features and this is the trend going forward. 8 gigs is the new entry level that should only be on xx50 tier cards
@@SteveDonev Yes, it would be decent at 250$, not a shut up and take my money video card.
@@Magnulus76 So you pay 300$ in 2023 for a nvidia card to play at medium with no rt?
Pretty sure the 4060 is x8, not x16. On PCIe 4, it doesn't really matter, but if you're upgrading a 2060 PCIe 3 system, you may be leaving a bit of performance on the table, especially if you use up the 8GB vram. I see Nvidia is doing their best to sell Frame Gen as actual FPS boost. Thanks for the early look, although marketing it as a 1080p card is just silly
Multiple good points in a single short post. Well said.
maybe I could see some use of frame gen AI for going beyond 60 on a cheap card, but then when are decent 120Hz adaptive sync displays gonna be the norm?
If they marketed it as what it actually is - a retro/legacy gaming GPU for older games - and slapped a $199 price tag on it, nobody would be upset.
I can't really think of a good reason to limit the number of lanes. Is it a cost-saving measure, or are they really petty enough to cripple the bandwidth on gen 3 motherboards to push consumers to upgrade? Both AMD and Nvidia seem to be doing this in this price segment now. At least Intel hasn't yet; let's hope they don't.
Exactly. This level card is aimed squarely at budget builders and upgraders. Those are both platforms likely to be on Gen 3 pcie devices, and this card should be tested more thoroughly on those systems. Anything else is misleading.
A large list of creators received this offer but most had more integrity than you and declined.... oof
This is not a 4060 review. No, this is just a tribute.
Hey Jay! About the 2060 51Risc model: it is a Chinese brand that can sometimes include mining or reconditioned cards. It's mostly sold through AliExpress and it's fairly popular here in Brazil, although it is a bit controversial too.
$300 for a RTX4050 is insane
People were calling the 4060 ti a 4050, now you lot are calling the 4060 a 4050? Make your minds up, ffs.
@@JynxedKoma 3050 was an overpriced failure of a card, so we use the term 4050 (next gen 3050) as a blanket term that describes all these cards as failures
@@JynxedKoma I guess this dude is the other people? Oh wait, there are a bunch of different people online with different opinions; who could have thought? :O
@@JynxedKoma The 4060 Ti is a 4060 and the 4060 is a 4050, if we account for the die and the cut of cores.
@@JynxedKoma You aren't helping anyone, including yourself with your jokes Jensen minion.
I'm going to stick with my 3060 12GB. Got it for a decent price right after the pandemic and when the mining craze died. I won't need to upgrade until probably 50 series.
And you get to use ultra textures that make the game look great, while people with the 4060/4060ti will be playing on medium/low in recent titles.
upgrading every gen never made sense anyway, unless you're a 90 tier buyer with more money than brains
@@tyre1337 No kidding! I ain't got that kinda splurge money lol I'm an upgrade every 2 generations. I got myself a Ryzen 7 5700X and I saw no need for the new Ryzen 7000 series CPU's. Maybe the second generation of the new AM5 I may pull the trigger for a new MB and CPU combo.
You would have stuck with it either way; it's not like you are some early adopter who likes to be on the cutting edge. If you were, you wouldn't have bought a 60-series card in the first place, so this comment makes no sense. People like you and me just wait for 2-3 generations, sometimes even more, before upgrading, and nvidia knows that, so there's nothing in it for them to try to sell you a new sub-60-tier card every generation.
@@drinkwoter I hear you. My recent upgrade path went from 1050 Ti to 1660 SUPER to 3060 12 GB. But after that upgrade path I have had enough and no way am I going to even bother until at least 5000 series.
JayzTwoCents further tarnishing what little credibility he had. He's nowhere close to the likes of GamersNexus or Linus and friends when it comes to tech knowledge, and now he sells out. (DonMigs85, Guru3d)
Ahh, Jay fell for the classic trick of the 8GB vs 12GB 3060; the white Asus card was 8GB for $279. If Jay (someone who's exposed to GPU SKUs and trickery) fell for it, how many unsuspecting buyers have fallen for it since that card came out?
What's the classic trick?
That the 12gb 3060 is the way to go?
@@ericj6636 Yes, the 8GB version not only has less VRAM but is further crippled and a lot slower than the 12GB even within the VRAM limits.
Timestamp is 11:20
@@ericj6636 The 8GB 3060 (the one the percentage improvements are compared to) has much less core and memory performance; it's not just the VRAM amount.
I recently bought the white Asus 8GB 3060, and I also have an EVGA 12GB 3060; they are not the same. Even in games where VRAM isn't the big difference, the 8GB 3060 gets spanked by the 12GB 3060, my RX 6600, and my Arc 770SE. The 8GB 3060 does seem to be a step up from my 3050, but not a huge step.
My two cents. With RX 6600 XTs around for nearly $100 less, the 4060 seems like it is just a way to scrape money out of Nvidia fanboys who will tout Ray tracing, and ignore rasterization.
Minor correction. All 4000-series GPUs are PCIe 4.0 devices. The 4070 and up have the 16-pin power cable, but that doesn't make them PCIe 5.0 devices.
@@RocketRenton This doesn't make sense though; the 40 series isn't even going to fully saturate a 3.0 bus?
@@AarPlays The 4060 Ti uses 8 PCIe lanes, so on gen 3 it only uses half the bandwidth available. It's very likely the same with this one.
@@RocketRenton I have a 4090 on PCIe 3.0. Works flawlessly if you explicitly set the 4.0 Slot to 3.0. (I have a 3.0 riser cable)
@@thorcall You realise a gen 3 slot has 16 lanes... Your statement literally makes no sense; even a 3090 Ti doesn't saturate gen 3 with 16 lanes, and guess what, the 3090 Ti and 4090 have identical memory/bandwidth/speed configurations, so in raw memory-intensive workloads they won't be saturated.
However, the risk comes more from the future of Nvidia's RTX IO accelerated storage (very similar to DirectX DirectStorage and what is currently on consoles).
@@RocketRenton The 4060 Ti only has 8 physical PCIe lanes and the remaining 8 are DUDS, so gen 3 mobos will run the card at gen 3 x8. An actual yikes move by nvidia to cut cost.
I just bought a used 3080 ti for 400$ this makes me feel great about my purchase 😊
Great deal!
Jensen's two cents.
6:15 - I believe it was the Palit 4080 PCB which also had blank spots for the good old 8 Pin connectors on it. Wonder if Nvidia said "No, you need to use the new connector"
$299 is too much for sure... especially knowing it will be sold higher
DLSS wasn't on Quality when you enabled FG; it was automatically set to Auto, which I think is DLSS Performance.
I was sure of what this would be the moment they allowed you to put it up before embargo. 😂😂😂
It's gonna be no more than 15% on average over the 3060, people, and that's at 1080p.
At 1440p it will look silly compared to the 3060.
And that’s comparing to 8gig crippled 3060s, not the real ones :D
4060 DOA, like all recent Nvidia products…
It's exactly 15% after all, and 3% faster than the 7600. And as always I'm 100% spot on and my comment underrated...
Looks like Nvidia is pulling out all the stops trying to make this look as good as they can prior to launch, and is pushing YouTubers hard to aid in this. I just saw RedGamingTech try to argue that a 20% uplift vs the previous gen is good enough, and that all past uplifts, from the 960 to the 1060, were about 20% or less anyway.
REDGaming tech was a real tryhard
This is funny. I am already an infrequent viewer, but this seals the deal; I'm muting this channel.
Off topic. But where do you buy all your water cooling parts, Jay? You mentioned it in a video in recent weeks but I can’t seem to find or remember. Or if anyone else know, let me know. Thanks.
The frame generation and efficiency of the card are nice, but the difference in raw performance is still not enough for one whole generation. A 3060 Ti is faster in most situations, and while that card doesn't have the same low TDP and features, I still think it's a better buy, especially at 1440p, because of the higher memory bandwidth.
It does feel a bit like "look at what my 80hp engine can do with 4 gallons of nitrous oxide that I'm only allowed to use on the track"
I don’t understand why no content creator is mentioning that frame generation is unusable in multiplayer games; it's for single-player games only.
Especially with lower-tier cards, people tend to play less demanding, esports-like games. And the raw power of every 4060/4060 Ti is just comical.
Frame gen and DLSS just feel like cheating performance compared to Pascal, which did it all with pure performance. Still rocking my 1080 Mini here, and I have yet to see a GPU come out that makes me want to spend money upgrading for the form factor.
If it's ~25% in most games then it's exactly a generational performance improvement. It's always in that 25-30% area.
And a 3060 Ti will have more memory bandwidth on systems this level card is intended for. 8x Gen4 pcie is fine, but aren't budget builders still using a lot of hardware on Gen3?
Are you going to add memory intensive games like last of us to your testing suite for the full set of tests?
Dude i wanna see a ram test with star citizen. none of this shit that uses 6gigs of ram like they always test. SC uses like 25 gigs on my pc
@@squidwardo7074 I agree
Jayz, is the 3060 you used in the comparison benchmark a 12GB variant or an 8GB variant? 'Cause I didn't expect the 2060 Super to outperform the 3060 12GB 🤔
Dude went from jayztwocents to shillzforcents
Have you tried the 3060 recently? CP2077 has changed a lot and got a lot of performance patches, and I bet that applies to the benchmark too.
BADASS
I got 51 fps in cyberpunk benchmark with ultra, RT on and DLSS Qual on my 3060Ti in 1440p.
Anyone know how the 4060ti 8gb compares?
Ya, sounds like they are not re-testing old cards for these comparisons. Or at least not often enough to really be useful. If it's not testing the cards on the same patch and drivers it's fairly meaningless.
@@bockchow maybe, maybe not. Point is, I do not know, that's why I'm asking :)
Oh man, a slot-powered 4050/Ti would be awesome. The 1050 Ti was amazing back in the day for just dropping into off-lease/second-hand business PCs without worrying about upgrading PSUs or anything else.
The down side is it's unlikely we'd see low-profile models like we did with the 1050 Ti which is what made those Pascal SKUs so good for upgrading older PCs, HTPCs and biz desktops. Plus, NVIDIA has effectively given up on the HTPC market in terms of relevant feature enhancement (indeed after Pascal the entire stack was deprecated for NVEnc performance).
For all intents and purposes this should have been the 4050
That's exactly what I did lol but the case wouldn't fit it so I had to attach it with a riser cable and leave the gpu on the floor
Aren't you supposed to put "AD" in the title for these paid tech demos? I'm pretty sure there's a law about it.
I would like to see this in a pcie 3.0 system, as I'm pretty confident that it is x8 and not x16.
They cut corners everywhere..gpus are basically evolving backwards at this point
@@enricod.7198Yeah they are fucking pathetic, the only pleasure they have given us this generation is comedy. Their shitty ass GPUs deserve to rot on shelves. I just feel bad for good AIBs who are losing money now.
I expect at least a 40% increase in performance for the same money after two years. Anything less is not good enough. This card should be priced at $259.99 at most! It's 2023; this card should be targeting 1440p, and 12GB is the absolute minimum. Honestly, people should just buy used and get a much better deal.
LOL >40% increase, every two years, for the same money... Keep dreaming, buddy.
@@-Devy- it used to be even better than that. It's only the last couple gens Nvidia has increased the price and given us less improvement....
Totally agree; at $199 this would be good, at $299 it's a joke. What happened to gains like the 1060 over the 960? Hell, even the 2060 had a larger performance increase over the 1060 than this, albeit with a price hike.
Intel could have a great opportunity right now, because both AMD and Nvidia somehow think their **60 and *600 cards being made for "1080p gaming" is enough.
But Intel's Arc A750 at the same price is actually quite a nice 1440p card.
If they adjust their production and go all in on driver improvements, they could have a chance to really shake up the market.
I have an A770 16GB in my daughter's 1080p system. It is a really nice card. I tested it in my 1440p system against my 2060 and a 3060 Ti I am borrowing. The A770 did well but won't compete with the 4060. I am hoping for high-tier card pricing to crash in the next 3 months.
@@shuntao3475 it will do well someday, when they really sharpen the drivers.
@@shuntao3475 I hope you're right with the crashing of the high tier cards. I bought a 4k monitor for work, but now I can't game on it cause I only have a 3060. If the 4080 crashes $100-200 then I could kinda' afford it.
@@kaimojepaslt I honestly hope you are right. Each driver update, though, has brought very little performance to DX12 titles. I am trying to hold out for Battlemage; the question is whether my 2060 can last that long. I really want to play Cyberpunk with ray tracing on, as I find it to be the best-looking game in history. I have loved the cyberpunk genre since first seeing Akira in the mid '80s.
@@blindguardian8599 Can you not downscale the game? If you play older titles, most aren't built for 4K anyway. At a 1K price I would still hold out for 4K. I can't believe the 50xx series will be priced as stupidly. I would pay $599 for the 4070 Ti, no more, and that price would have to hit by October. After that, we're too close to the Battlemage announcement and it's worth the wait.
Best thing about this opening is the EVGA boxes in the background!
Well done :D
I liked how in the early "pandemic" stages he'd have nothing but rows of RTX boxes behind him when he'd do a half-assed AMD review... Seriously, people need to wake up.
Quick question from a rusty noob: what sort of PSU wattage would be the minimum to run this card? (I'm sure people are screaming at me, but I'm here to learn.)
I went ahead and got myself an Intel ARC770 (Acer Bifrost). Sure, there are better GPUs in that $$$ range. But I wanted to play the long game by supporting a third option. Coming from a GTX 1060, I can use this on my system (an AM4 MB) without regret until I get my new system built that will be my dedicated gaming rig. So far the ARC 770 has been awesome, as I only have a 1080p monitor.
What's the performance like? Similar to a 3060?
@@fridaycaliforniaa236 Except for outliers due to being new to the market (drivers, etc), the A770 is basically a 1080 Ti in performance and power consumption.
Huh
Hey chatgpt4 generate youtube bot comments supporting intel pls boss we need more ppl buying our stuff we cant falsify our stock for that long
@@EffYouMan oh hey. Turns out Intel makes other stuff as well and stock prices aren’t entirely dependent on their gpu line. Still… hard to tell where these little start up companies will end up. Fingers crossed they figure it out and I don’t lose money on them since my abacus stocks are already tanking.
Impressive for a 50-series card, when you ignore the price tag at least lol. Would like to see how it handles a game that isn't 3 years old though, like Hogwarts or something.
Yes, do a modern game with ultra textures like Hogwarts or Jedi. 8GB is entry level and should be $199 at most.
@@The__Comedian__ you can want things to be cheaper but you need to be realistic dude
@@The__Comedian__ RAM is expensive right now, or haven't you noticed? I'd rather companies spend the money on features that are actually future-oriented. Frame gen helps make raytracing usable on mainstream cards, without having to have massive amounts of memory and clock speed.
@@Magnulus76 whatever you say jensen huang
@@Magnulus76 Like 20-35 more dollars for doubling this card's memory to 16GB? Sure bub, very expensive.
you should not have made this video with these restrictions. period.
Jayz were you allowed to use the restroom while testing?
Finally! I was waiting for the 4060 benchmarks. I got a used Gigabyte RTX 3060 12GB Aorus Elite, which has an extra 6-pin and is a 240W model, and it OCs super well. Now I am very glad I didn't wait for the 40 series. I could have gotten an RTX 3070 for what I paid, but I wanted that 12GB of VRAM and was set on getting the 3060 12GB for a while. Loving it so far.
Your 12gb of vram will outlive this 4060
let yourself down here Jay, played right into their hands
Just curious: is it possible for you to lower the wattage limit, unplug the PCIe power cable, and have this card powered completely by the PCIe slot?
Definitely appreciate the update on hardware!
This isn't an update, it's a paid ad.
@@azaz20244 it's an update letting me know I won't be getting one fam lol.
In this day and age, power consumption is something I'm looking at more than just raw power. I'm using an RX 6600 XT, so I'm interested in how the Nvidia card compares against it. I don't need (nor want) the best top-end card, but I do want a card that plays games well and sips juice as opposed to chugging it down like someone at a keg party!
In that case a gaming laptop is typically more efficient than any desktop GPU.
And quiet. I like my system to be silent. If I hear a fan spin up I'm giving my tower the evil eye...
Installed a 6700 non-XT for a friend yesterday and ran some stress tests with a slight undervolt; it never broke 120W, which puts it about on par with this 4060, while trading blows with the 3060 Ti. My friend bought it new on sale for €280, so cheaper than the 4060 as well.
@@c99kfm Red cards age like fine wine. Green just leaves a bitter taste in the mouth in the long run. Been computer gaming for over 20 years and nothing has changed except the new team blue is now a thing. Nvidia has game ready drivers that they never update and if they do it's only to make their new cards look a bit better because value for money has never been a thing for team mean green lol. The only thing great from Nvidia is their marketing propaganda and gimmicks. This is how it always has been. Ditto to Intel CPUs.
@@leanlifter1 I had a 1060 6gb and it aged well honestly
Cross-referencing the amazing AMD 7600 review Phil put together, the 7600 is faster in Cyberpunk non-RT 1080p (86FPS compared to 80), and you can regularly find those $10-$25 cheaper than the 4060 MSRP. It'll be interesting to see how that comparison shakes out with other games, and with RT on, but i have enough hours in Cyberpunk to say that difference alone made my decision for me.
I'd like to be able to play at 1440. How do these games look on a good TV when the computer puts out just 1080 or 1440?
I wonder how much they will push this into the Hydro series. It would be nice to see a newer pump/res combo in the Hydro line so those of us planning a refresh or new build can update to it. Would absolutely buy these for my next build.
I am glad I got my 3060 a year ago, upgraded from a 1060 6GB, and glad I got one from EVGA as well. I could have gotten the 3060 Ti, but I wanted more VRAM. It makes me sad to see they released the 4060 with only 8GB of VRAM; it feels like a step back as opposed to a step forward.
In July there will be a 16GB version, but on a 128-bit bus, vs the 3060's 192-bit bus with 12GB of VRAM.
@@johnhpalmer6098 That will be the 4060 Ti, and it will have an abysmal price of $500.
I'd suggest trying ZOTAC, their 1050 lasted me over 5 years and a 3070 has had zero issues so far with decent cooling. Might be a more cheapish brand but they do make some pretty nice products.
I'm genuinely looking forward to the complete benchmark comparison between different brands. Also I'm curious to see the performance against the Intel Arcs, especially the A750, since it realistically sells between 260 and 280 here in Europe (I think a Limited Edition on Newegg may cost even less), which would put it right around this MSRP.
A $300 card running a 2.5-year-old game at 1080p. What a time to be alive.
tech reviewers became a joke. they are basically shills now making ads for overpriced x3-4 times ngreedia/ayyymd crap.
corporations set prices for freaking stupid GAMING devices like its money printing machines... and reviewers stay silent about that. either reviewers are shills now or stupid AF if they don't notice whats going on.
tech reviewers DEFINITELY SHOULD CRY MORE, to the point of ignoring products reviews and making videos about HOW MUCH OVERPRICED IT IS instead of doing tests.... khhm, I mean ADS for overpriced AF GPUs.
talk about the fact that all these cards are irrelevant as they are overpriced x3-4 times, and e.g. the 4070 is actually a xx50ti-class card and should be priced accordingly, below $200, as xx50ti cards were priced pre-mining.
tech reviewers must talk about the fact that a true xx70 card is the 4080 16GB, 30% less performance than a top card (the 4090) is what xx70 cards were all about. talk about the fact that xx70 GPUs cost around 300 bucks in 2016 pre-mining.
remind people that no one in 2016 considered ngreedia GPUs cheap, so even if they get to pre-mining prices they still actually would not be cheap. it'd just be way more justified pricing than what we have now.
and don't forget to mention that 1080p is extreme budget in 2023, we aren't in 2016 anymore, with current monitor prices 1440p is the new norm.
talk about the fact that any 1080p GPUs must be considered extreme budget and be priced around 100-150 bucks. just like 1080p monitors they are only suitable for. e.g. the 4070 is DEFINITELY an entry-level below $200 xx50Ti class card.
talk about the fact that a 4090 die (608mm2) costs 300 bucks MAX to make according to wafer calcs and 2-year-old TSMC prices per wafer; it's more like 200-250 bucks now.
a 4080 die 379mm2 costs 150 bucks MAX. and ngreedia is trying to sell 60-class cut down silicon 295mm2 that worth around 50 bucks as 4070ti for 800 bucks with 12gb VRAM, they even tried to sell it like 80-class silicon for 900 bucks at first.
and it's on OVERPRICED AF TOP TSMC NODES. you easily could release the 4070/ti/4060/ti tech on WAY CHEAPER NODES. yep, it would eat a bit more power, but those dies would cost not 50 bucks from TSMC (which by itself should allow them to sell the 4070ti for 200 bucks) but like 10 bucks to make. so a 4070ti literally is possible and would be VERY profitable at 200 bucks. but we are dealing with ngreedia here milking the market. same as ayyyymd and intool.
ALL TECH TUBERS DEFINITELY SHOULD CRY MORE. as is "tech reviewers" ACTUALLY ARE NGREEDIA/AMD SHILLS who normalize overpriced x3-4 time GPUs by ignoring HOW MUCH it all overpriced and not crying about it all the time. YES. YOU - TECH TUBERS. ARE. SHILLS. whether you realize that or not. period.
Intel GPUs only work well in AAA games; try to play an indie or AA game and you'll get stutters, bad performance, or it just won't work at all.
@@squidwardo7074 Is it? I remember seeing benchmarks of games like Counter Strike on DX9 and they performed pretty good with the latest drivers
Then you realize the 6700 XT goes for $319, has 12 gigs of VRAM, and is faster than the 3060 Ti (which is like 14% faster than the 4060).
So EVERYTHING is CUT DOWN on the card except the PRICE? Sounds like a wonderful purchase. I would love to spend 300 dollars on something that is already dead and won't play games the way I want. This is going to be insane.
Still happy with my A770. Yeah, the extra VRAM may not have been worth it, but it runs everything I play at 1080p full tilt; 100-ish FPS in D4, for example. Considering the 'placeholder' 1650 Super was starting to get its ass kicked, I was going to have to buy something if I wanted to play new titles at decent quality and FPS. Plus the AV1 encoder is amazing, both for file size and render time. Just hoping Battlemage lives up to the hype.
I just hate how Nvidia is leaning more on the "software" side for performance when it isn't supported in all games, or at all.
They make more $$$ that way.
I want to add: before upgrading to my PC, I gamed on my laptop with an RTX 2060 Max-Q at 1080p with ray tracing, high-to-max settings, at a little over 60fps. That ended, though, when Hogwarts Legacy and Sons of the Forest released. Wanted to add this to what Jay said; wondering if I have some overpowered 2060 or something.
@Jay - can you explain how you believe this video, created under extraordinarily specific conditions dictated by nvidia, is intended to help consumers more so than Nvidia?
For PC building and general ‘we’re having fun’ videos you guys are great. But you’ve completely lost your way when it comes to reviews and trying to help consumers. It’s difficult to take your channels opinion seriously when it’s so evident you’re mostly a PR mouthpiece for various companies.
But I get it, you do this to make money and you’ve got many views here. I’m not hating on you for this, I’d likely do the same. I enjoy most of your content and like both you and your team. So as a fan of yours, and speaking only for myself, the cost of making these videos is loss of credibility. As a result, your opinion of if something should be purchased is meaningless because your opinion itself is too easily purchased.
I look forward to seeing you finish various builds though, the mega man one looks awesome!
Jay2cent just sold his ass to Nvidia for (around) 2million cents
Jay is just falling off a cliff. This is hard to watch for a veteran viewer, so goodbye.
Jay don't clickbait like this man. Further, Jay don't shill for Nvidia like this man. I know the prospect of "views" is tempting, but you're one of the big boys in this space. You should be as above this type of shit as GN and HU are.
I mean, hell, even _Linus_ hasn't done one of these videos!
Jay, please test this on a PCIe gen 3 motherboard. I’m wondering if the gen 4x8 bus is going to hamper performance when the bandwidth is halved on a gen 3 motherboard. The “cheaper” cards are going to be picked up by people with existing systems looking to upgrade.
Out of curiosity, what would the MSRP of a 2060 be today adjusted for inflation, about $340ish?
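For what it's worth, a back-of-envelope answer. The CPI values here are approximations I'm assuming (check BLS data for exact figures); the $349 launch MSRP for the RTX 2060 is from its January 2019 announcement:

```python
# Inflation-adjust the RTX 2060's launch MSRP from early 2019 to mid-2023.
# CPI index values are approximate assumptions, not official figures.
CPI_2019 = 252.0   # roughly Jan 2019
CPI_2023 = 304.0   # roughly mid-2023

msrp_2019 = 349  # RTX 2060 launch MSRP in USD
adjusted = msrp_2019 * CPI_2023 / CPI_2019
print(f"${adjusted:.0f}")  # lands a bit over $400 with these assumed CPI values
```

So with these assumptions it comes out closer to $420 than $340, which makes the 4060's $299 look a little better in real terms, even if the product-tier complaints still stand.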
Im happy with my Rx6600. Really impressed the fps i get on most games
Just built my desktop last Dec. I went 6700 XT, ordered in Nov for $369 with a $20 rebate, so $349. I'm thrilled with its 1080p performance and entirely happy with its 1440p framerates, all of which exceed 75 FPS. I don't see any changes coming in the way of the GPU until mid-2025 at the earliest.
@@grumpyolddude439 Good choice. It has quite a bit of power consumption, but the 6700 XT is the best so far.
An RX 6600 with 16GB for $250. They make that and I'm set for 10 years.
@@fabrb26 Vram is so important right? hahaha
It's actually one of the only cards that is currently priced right for the performance it gives. That makes it the best card available today.
I think next year gate-all-around transistors will be in production, with angstrom-class sub-2nm due in '25/'26. I'm going to upgrade once more before 2025, but this gen is disappointing ❤
Is there a small amount of input delay on the controllers with FG? I feel like I can see that when you press A on the keyboard, the vehicle turns just slightly slower than it normally would at that FPS.
Hey Jay, I am so confused about buying the right parts for a new build this year. The video cards are melting, the motherboards are burning up processors, and the memory compatibility is iffy, what is the RIGHT STUFF this year? Please do a video about what you would build for yourself for $2500.00.
2013 called, they want their resolution (1080p) back.
More like 2007 and the Nvidia 8800 GT.
We have dlss3 now, we can finally game at 1080p.
To answer a question in the video, it is absolutely legitimate and important to include new technologies like frame generation and better ray tracing when comparing video cards. I would put them on a separate line or separate chart, so you can look at rasterization and the impact of technologies both, but they should certainly be included. If I am playing a game that benefits from frame gen and is not sensitive to lag, I'm going to use it, and I will want to know those capabilities before I buy a card. So, for a game that supports the tech, please include the numbers, somehow, somewhere. The push to NOT include that information is just bias -- it will favor AMD as long as Nvidia is the one innovating new technologies. I am highly skeptical any time I hear a techtuber stumble through their rationalizations for leaving out key capabilities of a graphics card. You can hear their self-doubt in their voices; they know something isn't right.
good point
I'm wondering what spec power connector nGreedier will install on the RTX 5000 series. Hope they're not ignorant enough to keep using the 12-pin, which melts at any given random moment.
What is the GPU FPS display overlay that you use? Linus and Dawid use it too. I need it for testing. TIA
I almost wonder if the 4060 non-Ti is going to fill the role of the 50-class model from past generations; it almost has that feel to it.
It is a 4050, and it will fill that role, at a premium price.
It won't because there IS a desktop 4050 coming.
96 bit bus and 6 gigs of VRAM is going to be funny to behold when it does come.
not almost, the specs ARE of a 50 class card:
128 bit bus
8x pcie lanes
8gb vram
meanwhile the 3060 had these specs:
192 bit bus
16x pcie lanes
12gb vram
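Those bus widths translate directly into the memory-bandwidth gap people keep pointing out. A quick sketch, with data rates assumed from the public spec sheets (17 Gbps GDDR6 on the 4060, 15 Gbps on the 3060 12GB), so treat the exact figures as approximate:

```python
# Memory bandwidth follows from bus width and per-pin data rate:
#   bandwidth (GB/s) = (bus_width_bits / 8) * data_rate_gbps
def mem_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(mem_bandwidth_gbs(128, 17))  # 4060 (128-bit): 272 GB/s
print(mem_bandwidth_gbs(192, 15))  # 3060 12GB (192-bit): 360 GB/s
```

So even with faster memory chips, the narrower bus leaves the newer card with roughly 25% less raw memory bandwidth than the 3060, before accounting for the 4060's larger L2 cache, which Nvidia argues compensates for it.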
Jay's the kind of guy who's happy to spend more on worse tyres because they're slightly better on the 3 roads they actually work on.
Pretending you didn't know why they paid you to publish their figures early was hilarious.
I have a 2070 I've been wanting to upgrade, but I feel like in my price range it's not really worth it to get an nvidia. For 1440p, if I wanted to spend only about $350, what's my best option? 6700 XT maybe 6750 XT?
4060 ti. I upgraded from a 2060 and i am happy
Yes, get the 6750 XT if it fits your budget; step down to the 6700 XT if it doesn't. These two cards also have the benefit of the same CU count as the PS5, and some channels have shown that PlayStation exclusives ported to PC run unusually well on the 6700 XT, like beating the 3080 Ti in some cases.
The moral of the story from Jay's full review: buy the AMD 6700 XT, or the 6750 XT if you can. AMD really knocked it out of the park with the 6x00-series cards. I have to say the Intel A750 Limited has matured a lot since launch too; quite impressive for sure.
Hey i might actually buy intel, oh boy I hope my purchase makes a huge difference in your endeavors
Maybe not the 6500xt or 6400, but other than that totally agree!
@@spicyboy_17 yeah, those low-end cards are an awful value
Those extra features are the only thing making them worth it nowadays. But... the game has to support it. And if it doesn't... well... good luck. I still think AMD's solutions will win out in the long run, seeing as consoles use AMD, so games will support them more easily. Just like how G-Sync is essentially never talked about anymore. Proprietary solutions bound to only their own hardware are not the way to go.
Hey Jay, excellent video as per usual. Can you let me know where you get the GPU wattage meter...???
I love the black with blue/red mat! I'm saving up for one.
AMD is definitely getting more desirable especially when it comes to having enough vram. I’m not sure why we can’t get a little more these days considering most games are using more. I’ve used both nvidia and AMD, loved both tbh. Best thing to do when searching for a card is just get what works for what you do
They couldn't give you "a little more" as in 10GB or 12GB; it's either 8GB or 16GB. I believe they currently use 2GB RAM modules across all 40-series cards: the 4090 has 12x2GB, for example, while this card has 4x2GB.
Adding another 8GB to make a 16GB card may seem like a small cost to the consumer, but for Nvidia it creates a big logistical problem: the GPU board doesn't have the slots for 8x2GB of RAM, so they'd need to stock 4GB RAM modules to make a 4x4GB version of the 4060 Ti, and they can't use those on anything else.
@@sjneow Can't see how that's true, considering I own a card with only half of the RAM slots populated, and I've seen some budget office-grade cards with even fewer in use.
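A sketch of the capacity math being argued over here. GDDR6 chips each present a 32-bit interface, so chip count follows from bus width; "clamshell" mode (two chips sharing one 32-bit channel) is how prior designs doubled capacity without wider buses or denser chips. The specific configurations in the comments are assumptions based on published board shots, not something I can verify:

```python
# GDDR6 capacity from bus width and chip density.
# Each GDDR6 chip has a 32-bit interface; clamshell doubles chips per channel.
def vram_gb(bus_width_bits, gb_per_chip, clamshell=False):
    chips = bus_width_bits // 32
    if clamshell:
        chips *= 2
    return chips * gb_per_chip

print(vram_gb(128, 2))                  # 128-bit, 2GB chips: 4 x 2GB = 8 GB (4060)
print(vram_gb(192, 2))                  # 192-bit: 6 x 2GB = 12 GB (3060 12GB)
print(vram_gb(128, 2, clamshell=True))  # 128-bit clamshell: 16 GB (4060 Ti 16GB style)
print(vram_gb(384, 2))                  # 384-bit: 12 x 2GB = 24 GB (4090)
```

This is why the realistic options on a 128-bit bus with 2GB chips really are 8GB or 16GB, but the 16GB variant can be built with clamshell mounting rather than needing 4GB modules.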
It's currently a real struggle to find what to "upgrade" to from the 2060S 🤨
I just upgraded from a 2060S to a 4070.
It was the only available option that kinda made sense imo (if you want an Nvidia card), and it's still sort of semi-affordable.
It's only 25 watts more power usage than the 2060S and uses a single 8-pin connector, so it's an easy swap without needing a better PSU, and no messing around with other cables/adapters etc.
You're probably too late now to still get a good price for your 2060S though; you should have sold it a couple of weeks ago at the latest.
(Sold mine for nearly €200 a couple of weeks ago.)
I thought about buying a second-hand 3080, but a new 4070 was only like €100-150 more and has 2GB more VRAM, which you definitely want. (A second-hand 3070 was no option because you wouldn't get a VRAM size upgrade.)
Performance of the 4070 is a little bit worse than the 3080 at 1440p/4K, which kinda sucks and shouldn't be the case really, but what are you gonna do? (I guess you could see DLSS 3 frame generation as compensation for that, although I'm not a fan of frame generation myself.)
12GB of VRAM on the 4070 should have been at least 16GB, but again, what are you gonna do...?
The 4070 Ti wasn't an option in my book.
Waaaaay too expensive for having only 12GB, and then you also have to deal with that 12-pin power adapter cable mess.
And the 4080 is waaay too expensive, for me at least.
I also got €40 of Steam credit with it, which is a 1:1 discount for me really, because I would have spent that anyway in the coming months/Steam summer sale.
After selling my 2060S and a leftover 16GB DDR4 RAM kit I still had lying around, plus the €40 Steam credit, I paid €419 in the end.
That made it an acceptable price to me.
You'll be perfectly fine waiting for the 5000 series.
Honestly, if you could snag a 3060 Ti for $250 or less, or a 3080 for under $349, those would be good buys on the used/new market. I say don't buy anything 40-series until NV cuts $100+ off the MSRP of all 40 cards.
@@UniversalPwner21 Unfortunately I'm in Australia, so I have to bend over on prices. I'd keep my 2060 since I'm building a new system.
The 6700 XT is the best price-to-performance right now if you can get one of those $320/$330 USD ones.
I have a Ryzen 5 5600G; is the 4060 okay with my processor?
Interesting performance/price on the 4060 for 1080p. Please do a comparison with the last 2 generations of AMD cards in the same tier.
The great Jay 👑
- We part ways with asus
month or so later
- Here is a new asus product
Some greedy asshats never change
I am very, very curious about this. Say you're using a 650W PSU on a fairly high-end gaming build: an MSI Z590 Gaming Plus, an 11th-gen Intel 11600K, a single-fan AIO, 6 RGB fans, an RTX 3060, and 2x 8-pin CPU power. Will the 650W ATX power supply be enough? And if not, can you take an older 600W PSU and use it just to power the GPU (and maybe all of the SATA devices), since you don't have the money for a bigger PSU? Can this work?
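A rough power-budget sanity check for a build like the one described above. The per-component draw figures here are assumptions (approximate vendor TDPs; the i5-11600K can spike well past its 125W rating under load), so this is a ballpark, not a measurement:

```python
# Approximate peak power draw per component, in watts (assumed figures).
components_w = {
    "i5-11600K (loaded, past TDP)": 180,
    "RTX 3060": 170,
    "motherboard + RAM + SSDs": 60,
    "AIO pump + 6 RGB fans": 30,
}

total = sum(components_w.values())
headroom = 650 - total
print(f"Estimated peak draw: {total} W")
print(f"Headroom on a 650 W PSU: {headroom} W")
```

By this rough estimate a quality 650W unit has plenty of headroom for that build, so the two-PSU trick shouldn't be needed; running a GPU from a second supply with its own ground is generally discouraged anyway.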