Jay FORGOT to tell you the RTX 4060 vs 3060's Cheesecake Analogy.
- added 25 Jun 2023
- The RTX 4060 vs 3060 debate: well, Jayztwocents let the cat out of the bag! Everyone can now talk about the RTX 4060 8GB in Cyberpunk 2077, since that is publicly available information. So let's compare it in a graph against the ARC A750, RX 7600, RTX 3060 12GB (very important), RTX 2060 Super and GTX 1060 6GB. There are of course some good and bad points to go over. The main thing, though: just wait if you are thinking about buying a new GPU, as explained with the Tech YES 'cheesecake' analogy.
Sponsor Get a Windows 10/11 Pro Key for under $15 (Use 30% Coupon BFTYC) - bit.ly/BFTYC
Windows 11 ($21): bit.ly/BFTYC11
Office 2021 ($49): bit.ly/BFTYC2021
EDIT: I made a mistake on the calculation; it's more like an 18% performance uplift on the numbers. I dialled in the wrong number on the calculator.
✅Shop Aliexpress WorldWide: s.click.aliexpress.com/e/_etUbxJ
✅Shop on Ebay Worldwide: rover.ebay.com/rover/1/711-53...
❤️Become a Tech YES City member and get access to perks
/ @techyescity
⭐Consider Subscribing here bit.ly/3G20vC1
🔐Get a Windows 10/11 Pro Key for CHEAP (Use Coupon BFTYC) - www.bit.ly/BFTYC
💯Merch - www.redbubble.com/shop/techyes...
❤️Support Directly - / techyescity
💻Discord Access - / discord
-------------------------------------------------------------------------------------------------
DISCLOSURES: Generally, all links tied to products are Amazon, AliExpress or eBay affiliate links, which means that if you purchase a product we earn a small sales commission, at no extra cost to you. All sponsored content will contain the word "SPONSOR" if directly sponsored, or "AD." Any additional revenue stream will be disclosed with a similar disclosure.
Music Provided by either: epidemicsound, audio library or royaltyfreeplanet.
#PCGaming #GamingPC #Winning - Howto & Style
I like cheesecake
Costco cheesecake
Ikea cheesecake 🧀
Woolworths cheesecake is the best. Waiting for it to defrost is a pain though 😅
Real late at night, cheesecake. Sensational.
@@cybersamiches4028 To be fair costco cheesecake is actually really good, especially the amount you get, the only mainstream one I have tried that tops it in Japan... is the ikea cheesecake lol. However you don't get much for the money at Ikea.
Reviews weren't just limited to Cyberpunk at 1080p, but Cyberpunk at 1080p with RT Ultra and DLSS 3 Balanced with Frame Gen. Nvidia totally controlled the narrative to paint the 4060 in the best light possible
I’m still trying to figure out if this video is satire or, tech yes just didn’t know about the 4060 “preview”.
Literally every company tries to paint their product in the best light possible...kinda silly to expect otherwise
They probably got better results than they expected...
What's the price difference in cheesecakes?
@@samgoff5289 That's like saying a new car in a review can only be driven down certain roads, at a certain speed, and with no other people in the car.
Nvidia has claimed similar increases of 4060 Ti over 3060 Ti, and we all remember what has happened in actual game benchmarks...
I just bought a 3060 Ti for 200, so I'm not going to upgrade, but I'm hoping there's not going to be that much of an increase, maybe under 10%.
If a 4060ti was a waste of sand, buckle up for the 4060 and 4050
Search for 3070 vs 4060 Ti comparisons at $399 each, across (I think) around 15 games. The 3070 beat the tar out of it in every game, by no less than 5% and as much as 15%. And the textures in some games did not render right. The next day also saw the 3070 drop $100 in price. Ask why they don't want you benching their cards any more: it's because this entire 40 series is one big fucking scam.
@@mainfish88 you did well with that 3060ti. It’s substantially faster in every way vs the 3060. There might be a couple of choice scenarios like cyberpunk RT dlss3 standing on your head squinting into the sun where the 4060 has a slight win but I think you will find in the majority of games the 3060ti beats the 4060
@@captain_morgan Yep. I've been saying this for a while now and no one really seems to think about it. Not that it's an excuse for Nvidia doing what they did. With that said, I think the 4060 Ti should be stomping the 3070 into the ground at launch and then just have the gap widen over the next 12-18 months. Sad that it will likely only catch up to or match the 3070 by the end of the 4060 Ti's run.
How do you get a 32% uplift dividing 82 by 69? It's 1.188, meaning the uplift is 18.8%, which is far from 32...
A $300 card running a 2.5 year old game at 1080p. What a time to be alive.
Sure, if we ignore a lot about said game and the modes it's running under, or its updates. Look, there are easy, valid reasons to crap on these cards; we don't need stupid BS.
It's true that $300 is a hard sell if you happen to have a 3060, BUT JTC actually talked about the cheesecake when he showed us that the 4060 offers valid ray-tracing performance, which the 3060 does not.
And J talked about the efficiency, which is part of the cheesecake's taste argument we get here.
If JTC's video somehow wasn't informative then I don't know what this video is, sorry.
Less cheesecake, and it's been reduced to sucking it through a straw instead of being delivered on a wide fork; that isn't the way I like my cheesecake. Also, a single slice of cheesecake costs as much as a whole cheesecake used to. I'll wait until cheesecake prices come down.
You make tasty points.
If Ray Tracing or power efficiency is important to you then the 4060 is superior to the 3060.
If RT or efficiency are not important to you then you better buy an AMD card.
the only place the 4060 really wins is efficiency. In sheer performance the RX6700XT wins. In content creation, the 3060 12GB wins. In price, both options currently slap the 4060 senseless.
Pretty sure Jay couldn't do much with the preview and it was controlled heavily by Nvidia, similar to the Digital Foundry early 4090 showing. The early preview stuff is not the full picture and pretty much a waste of time.
The point of that is to temper expectations by preemptively setting one. Used properly, this business tactic is able to deflate hype when needed, or start one when the product needs it to sell. As with all things, it's a balancing tactic.
well, daniel owen compared some nvidia cards to the 4060 despite those limitations nvidia set
@@ae-qw5xi Yeah I think he mentioned that he used the exact settings Nvidia said to use for the 4060 on the comparison cards. His video seems like the most up front presentation about the limits placed on the preview from what I have seen so far.
@@kamikaze00007 It is basically a paid-for ad.
Any tech "journalist" who allowed Nvidia to control them in this way is not a journalist. They're a shill. Allowing Nvidia to control every single aspect of the "preview" is no different than just doing a paid commercial and reading a script written by Nvidia. Unsubscribe from any channel who accepted Nvidia's deal.
That's 19% uplift, not 30%. How is everyone missing this?
Exactly, that's a major f88kup.
It's more like 3-9% if you check Dracarys Gaming's videos, which makes the 4060 a competitor to the 6650 XT 😂. Why would the uplift be higher than 3060 Ti to 4060 Ti? It gets worse the lower tier you go. The 4050 will be at 4030 level. This should have been a 4050 for $200.
I don't know what calculator he was using, but indeed 82/69 = 1.188 i.e. 18.8% uplift, not 32% !
Noticed this immediately lol
Was thinking the same thing...
Maybe my math is off, but 69 is 16% less than 82.
82 is 19% more than 69.
Anyway, just can't wait to see benchmarks with less favorable settings and games for 8GB VRAM.
And the state of 4060 8GB owners in a couple years :D
Also, who games at 1080p in this day and age? Decent 2K 144Hz monitors are going for $200. WTH are people using?
This 4060 with 8GB will age like a cheesecake.
😅
Jay's earned two cents.
Two Cents from Jensen
Nvidia just sent him the script to read
@@kasmidjan Jay, Cents, Jensen, Jencents.
I smell a conspiracy. lmfao
Or thirty pieces of silver
@@how2pick4name 🤣🤣🤣👍👍👍
It wasn’t just jay. This was a “preview”. Daniel Owen also did a video on it. It was approved by nvidia.
Any tech "journalist" who allowed Nvidia to control them in this way is not a journalist. They're a shill. Allowing Nvidia to control every single aspect of the "preview" is no different than just doing a paid commercial and reading a script written by Nvidia. Unsubscribe from any channel who accepted Nvidia's deal.
Yeah but that doesn't change anything
"GDDR6 VRAM Prices Plummet: 8GB of Memory Now Costs $27" — Tom's Hardware. These should be well south of $300 in all regions.
1:44 Not "just" less VRAM, but also a narrower memory bus (128-bit instead of 192-bit), fewer CUDA cores, and fewer PCI-Express lanes (8x instead of 16x). It's overall just less; everything was downgraded. This should have been a 4030 and sold for under $99. At $300, Nvidia can go duck themselves.
It uses a 107-series GPU, which is what has normally been used in 50-class cards. GeForce 60-class cards have used 106-series GPUs since the GTX 660 (except the GTX 760, which used a bigger 104 GPU).
If the card was 200$ it would be fantastic.
100$ would be great 😊😮
If it's free it would be great
If everything is free then it's great
if they pay me to get the card it would be fantastic
You’re living in your own reality then 🤦♂️
It's worse than the 7600 in a very Nvidia-optimized title; that's DOA af.
You did forget the fake-frame DLSS 3.0 that makes it "great"
😂
@@haukikannel Frame gen is mediocre at best, but it works best if you're already extrapolating from a high framerate (70+). Frame gen from 30fps feels absolutely awful. It's the same with DLSS: it works great at higher resolutions, while at 1080p it's dog poop (like every other upscaler). Measuring performance with these features is like measuring your cock and incorporating the length of your spine in the measurement...
The most relevant Nvidia card he could have compared it against would have been the 3070, because 20% to 30% is nothing to talk about when comparing it to the card it's meant to replace. Previous gens would see roughly a 20% uplift with a new card versus the previous gen's card from the tier up. Not to mention it was just one game; not enough information to call this good or bad. Granted, this was obviously an Nvidia cherry-pick to make the 4060 not look like a turd, like what happened with the 4060 Ti.
Would love to see 3070 comparison myself as well.
@@nomenicuss2091 3070 will beat this card!
If Nvidia only allows 1080p and Cyberpunk, then there's a reason for that!
100%...
i guess most other games will not perform like this
I'm sure Jay wasn't allowed to show it against an RX 7600 like Bryan did, showing a $260 (and falling) AMD card beating it.
@@Knebebelmeyer Exactly.
@@jurgengalke8127 Yes, he was 100% restricted to CP2077. Nvidia was cherry-picking something that would show the 4060 in the best light. I just feel that Jay could have pointed that out in the video, but he seemed rather... prematurely positive.
HUB did reply in their Q&A video that this 'early preview' was essentially a curated ad using tech tubers to run it. It's fairly obviously a PR move, or perhaps some early damage control, because everyone expects the card to be underwhelming vs the 3060 and any other card you can buy for $300 USD right now. Like the rest of the 4000-series stack: good power consumption, nice-to-have features such as frame gen, and overpriced for the VRAM it has.
Nvidia should have made the RTX 4060 with 12GB of VRAM, a 192-bit bus and PCIe 4.0 x16, and it would have been great at $300. I have a strong feeling that with its current specs the 4060 is going to be bandwidth-starved, and that will cut down on the longevity and life of the card. Personally I would buy a 3060 or even a 3060 Ti (which beats the 4060) before I would buy the gimped 4060. To each their own, and great video Bryan.
The 4060 you have mentioned was renamed by Nvidia to the 4070 12GB.
The 4050 was renamed by Nvidia to 4060 to jack up the price 😄
They did, they called it the 4070 and 4070ti.
I like your opinion...just bought an MSI 3060 OC 3-fan.
@@levlevinski596 please stop spreading stupid misinformation like this
@@FreddyRHernandez
So in your opinion, would it be fine for newer cards to just drop x16 altogether and use x8 like the 4060? Not a rhetorical question; I'm genuinely curious. If there are no performance differences, then PCIe 4.0 x8, which is basically PCIe 3.0 x16, would be enough even for high-end cards, no? If that's the case, paying for a GPU with PCIe 4.0 x16 would be a waste of money, because you'd be paying for something you're only ever going to use half of.
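The "PCIe 4.0 x8 is basically PCIe 3.0 x16" claim above checks out on paper: each PCIe generation doubles the per-lane raw rate, and both 3.0 and 4.0 use 128b/130b line encoding. A minimal back-of-envelope sketch (function name is ours, not from any library):

```python
def pcie_bandwidth_gbs(raw_gts: float, lanes: int) -> float:
    """Theoretical one-direction link bandwidth in GB/s.

    raw_gts: raw per-lane rate in GT/s (PCIe 3.0 = 8, 4.0 = 16).
    128/130 accounts for the 128b/130b line encoding used by PCIe 3.0+.
    Divide by 8 to convert gigabits to gigabytes.
    """
    return raw_gts * lanes * (128 / 130) / 8

# PCIe 4.0 x8 and PCIe 3.0 x16 land on the same theoretical bandwidth
print(pcie_bandwidth_gbs(16, 8))   # 4.0 x8  ≈ 15.75 GB/s
print(pcie_bandwidth_gbs(8, 16))   # 3.0 x16 ≈ 15.75 GB/s
```

Real-world throughput is a little lower due to protocol overhead, but the equivalence the comment relies on holds.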
82/69 = 1.19, so the 4060 is 19% faster than the 3060. Or am I missing something? Where does that 32% come from?
Another awesome video, always appreciate the financials in the mix. Keep banging out the content
Reviewers like Jay forget most of us are not interested in how it fares against a card from 6 months ago. Only a tiny number of people swap cards every gen now. We want to know how it fares against cards from 3 or more years ago. Especially important when dealing with the low/mid-range cards that most people buy. I rarely bother looking at reviews for the $1000 cards. No point.
Alongside the graph of benchmark results with the 7800X3D, there should be a graph of results using a more typical CPU.
I really like your finance with tech analogy. Would also be nice if you provided the links to the charts in the description. :D
But he seemed to talk about deflation - I think he's just confusing a reducing rate of inflation with deflation. So prices are still going up, just not as quickly as they were before.
A used RX 6700 XT is around $300; pretty sure it would destroy the 4060, and Cyberpunk is probably the best result they showed.
Well, I got a 6800 for $215, and I'm pretty sure it's gonna beat the 4060 in every aspect.
@@ghysambath2815 Great deal. And it works, even better.
@@ghysambath2815 Dang, where'd you get that?!
@@rcvillapando In my country, but now the seller has raised it back to $300.
In the USA try to get a 6800 if your budget is $350 or less. I got mine at $325 and it should last me 3-4 yrs with the 16GB.
Nvidia is getting desperate, trying to manipulate public opinion with these restrictive early reviews that show the card in the best light.
The limitations Nvidia placed on Jay, only letting him talk to their advantage and not tell the entire story, were BS.
At this stage I'm still running an RTX 2080 Super OC and an i9-10900K, and looking for an appropriate GPU upgrade (the 2080S will go to my old 4790K build to replace my Sapphire R9 290X). I like to play on higher settings in general while still requiring at least 130fps in Rust, for example. Can I ask for some GPU recommendations? I would ideally like to avoid 4090s due to price.
Would Be great if you could configure GPU memory like you do on a mobo. Add/swap etc. Not sure if it would add a speed bottleneck but man that would be great.
Pretty sure there were some restrictions applied, like no comparative testing etc.; similarly, he was only allowed to talk about 1080p performance. It was also not a review, it was a "preview/teaser", as stated by J2C.
Apparently it also had to use RTX and DLSS 3 with frame generation, which the 3060 can't do. Just the frame gen and forced high RT settings probably account for nearly all of its improvement. The 3060 isn't an RT Ultra-capable card, so they kneecapped it as hard as they possibly could to make the 4060 look as good as possible against it.
Thx for sharing this ^^ I thought you could only publish numbers with those ray-tracing settings and weren't allowed to compare... that's what I thought XD, so nice to see this graph. Especially the A750 looks awesome in that chart, and the RX 7600 too. I think this will shift a bit once the DLSS, ray-tracing and frame-generation stuff is factored in; with that, I guess it will place directly under the 4060 Ti, followed by the A750. Thx man, and cheers from Germany.
Geeze. 6700 XT beats this up and costs less.
You scared the shit out of me with the filter on Jay's part; it looked like artifacting 😱
Not sure if you didn't watch his entire video but, he said a couple of times that this was just an introduction and would have a full video with comparisons and more numbers.
Hmm, in his video Jay said he was "given permission" by Nvidia to give an "exclusive first look", I don't think that negates any embargo any other YT'er has for the 4060...
Good video. Test idea: Max out VRAM and see if faster mainboard RAM helps against stutters while data is transferring between VRAM and RAM (installed on mainboard). Thanks for the video.
How big is the performance hit if the game requires more vram than the video card has? Will the game even run at all?
Waiting for the comparison of the 4060 Ti 16GB model vs the 3060 Ti 8GB. Is there a big difference between GDDR6X and GDDR6?
I feel like more memory won't help with this narrow memory bus (128-bit for both models). It will be excellent card for 3D artists though, for OptiX/CUDA rendering...
How does one get an *RTX 2060 Super* to pull 206W in a game? Its TDP is 180W. My RTX 2070 Super pulls 205W *in Kombustor* @ 0.95V / 1950MHz (probably c. 225W at stock settings), and peaks at c. 170W (averaging less than that) in games.
Great video ❤
Frame generation in its current state is something i see as useless. If they can somehow make it use your inputs in generated frames instead of delaying frames, i can see it becoming useful.
I've been rocking an EVGA FTW3 3060 Ti at 1080p 280Hz for almost a couple of years now; it still blows my mind every day and I'm not switching it up anytime soon.
Had a 3070 and had a great year with it... really a great card. Finished a lot of titles, but in Hogwarts Legacy it made me mad that at 1440p ultra the card gave me stutters and blurry textures.
So I exchanged that GPU and added 50 for an RX 6800, a beast for the price. Still miss my 3070 though...
I'm building a white budget-oriented AM5 PC. I have an ASRock A620M PRO RS. I'm wondering if I will lose more and more FPS in future running the cheapest Ryzen 5 7600 versus an R7 7700. I already have a tower cooler + fan for the CPU - in white.
RTX 4070 is the real RTX 4060. This is RTX 4050 renamed as RTX 4060.
I'll keep happily waiting with my undervolted GTX1080Ti. Runs like a champ. Granted, I am not breaking any benchmark records with this card coupled to an X99 rig (w/turbo-unlocked E5-2696v3). However, it does everything I need it to do and it does it with aplomb.
Haha, I still use a GTX980. Lol
I wasn't gonna but criminal scalping prices so I'm patient... or really stingy!😂👍
1070 Ti for me. Definitely not buying these overpriced crappy cards. The 10 series was by far the best generation since its release. So many 20- and 30-series cards are dying, and the 10-series cards are still running like champs. People like us are the only ones helping keep GPU pricing under control.
That's funny because I use both a 1080 Ti and a 980. :D Likewise undervolted (well actually power limited, achieves the same result).
jb, it's interesting you mention X99 because I have a different 1080 Ti on X99 with an oc'd 6850K (4.4GHz, using a Rampage V Extreme). It's mainly an editing rig; testing it for games I found the CPU was indeed holding it back, but it was perhaps very specific to the games I play (a lot of Subnautica, which is single-thread-physics heavy, also GTA V and a few others). The gaming PC with the other 1080 Ti had been a 5GHz i7 2700K, which showed even more of a CPU bottleneck; it's now a 5600X/B450 (drives a 48" 1080p TV, i.e. sofa gaming), a change which doubled the performance I was getting before with Subnautica, with large speedups elsewhere as well. As always though, it depends so much on the game(s), settings, and definitely the resolution. The only game I play which does actually strain the 1080 Ti is RDR2; it's the first time since buying the 1080 Ti used back in 2018 that any title has made me contemplate something newer, but given current absurd pricing and product gimping, I can wait, and if I did get something newer that was much faster then I'd want the power draw to be lower by default, for which there's no viable option atm. I have CP2077 and various other newer games on the 5600X setup, not tried them yet.
The other gaming PC drives a 24" 1920x1200 IPS, it's used more for ED and various older games (Oblivion, Stalker COP and FC2). It has a GTX 980 on a Rampage IV Extreme and oc'd 4930K, still runs fine.
I'm still on X58 with a X5670 @ 4.4GHz and GTX 1080
@@DonzLockz 6700 XT second-hand is seeing some excellent prices, Rin. I got mine for 240 for a living-room PC :D
How is 69 to 82 a 32% increase?? Am I completely wrong here?
To calculate the percentage increase from 69 to 82, you can use the following formula:
Percentage Increase = [(New Value - Original Value) / Original Value] * 100
Plugging in the values:
Percentage Increase = [(82 - 69) / 69] * 100
Calculating:
Percentage Increase = (13 / 69) * 100 ≈ 18.84%
Therefore, the percentage increase from 69 to 82 is approximately 18.84%.
@4:50
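The arithmetic above can be sketched in a couple of lines; the function name is ours, purely for illustration:

```python
def pct_uplift(new_fps: float, old_fps: float) -> float:
    """Percentage increase going from old_fps to new_fps."""
    return (new_fps - old_fps) / old_fps * 100

# The disputed figure from the video: 69 fps -> 82 fps
print(round(pct_uplift(82, 69), 2))  # 18.84, i.e. ~19%, not 32%
```

This matches the video's pinned EDIT, which corrects the uplift to roughly 18%.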
@@edgarzakarian1649 he tries to teach you about economics and inflation but can't do 5th grade math
@@tyre1337 Anyone can have a brainfart. It's ok. Just mentioning it so it's corrected.
Nvidia is only allowing the early Cyberpunk data. Wonder how the other games stack up. Nvidia is going to need a lot more to show to make me part ways with my 1060 6GB. I'm seriously leaning more towards the Radeon 7600 when the price is right.
A much better use of your money, bro. The 7600 will outperform the Nvidia card in 8 out of 10 games and will do it at a much lower price point.
Not 32%, a 19% difference. Very different. 69fps vs 82fps.
The 4060 doesn't only have less VRAM but also a narrower bus. It's a highly controlled scenario for Nvidiapunk 2077.
I will say this: the 4060 is not a bad card if you are looking to upgrade from a very old card. That said, I am happy with my 3060 and I do not support anything less than 12gb Vram.
Damn right.
In a hunger crisis, take the larger cake; in a VRAM crisis, take the 12GB card
8:16
You are not entirely correct that the RTX 4060 has "faster" memory. Technically the 4060 uses faster-clocked memory chips, but...
The RTX 3060 has a total memory bandwidth of 360GB/s (192-bit bus) vs the 4060's 272GB/s (128-bit bus).
The faster chips in this case don't make up for the difference in bus size. With the 3060 you get not only more VRAM, but also more bandwidth.
You don't have to take my word for it; you can compare the memory specs of both cards on TechPowerUp.
DLSS 2/3 is nice, but few games support it, so my prediction is the 4060 is DOA like the rest of the 4000 series.
BRUH.... they DECREASED the bus size???? WTF Nvidia?? 🤦🤦🤦
Are you blind? The 4060 still handily beats the 3060 even with everything being smaller. Your argument is absolutely pointless
@@iequalsnoob I haven't mentioned and care little about the pre-release benchmarks. I was talking specifically about memory specs and included a timestamp in my comment.
The real benchmarks remain to be seen once NDA is lifted.
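The bandwidth figures quoted in the thread above follow directly from bus width and per-pin data rate. A minimal sketch, assuming the commonly listed effective rates of 15 Gbps (RTX 3060) and 17 Gbps (RTX 4060); the function name is ours:

```python
def mem_bandwidth_gbs(effective_gbps: float, bus_bits: int) -> float:
    """Peak memory bandwidth in GB/s.

    effective_gbps: per-pin effective data rate in Gbps.
    bus_bits: memory bus width in bits (one pin per bit).
    Divide by 8 to convert gigabits to gigabytes.
    """
    return effective_gbps * bus_bits / 8

print(mem_bandwidth_gbs(15, 192))  # RTX 3060: 360.0 GB/s
print(mem_bandwidth_gbs(17, 128))  # RTX 4060: 272.0 GB/s
```

The calculation shows why the narrower 128-bit bus outweighs the faster chips: a ~13% higher data rate can't compensate for losing a third of the bus width.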
Out of curiosity, why 10th gen instead of 11th gen i9? Are more cores better than PCIe 4.0?
The question remains... is the 32% uplift going to translate when Cyberpunk releases its Phantom Liberty expansion/overhaul?
It's 32% now... but will it be more or less come the update?
I unironically think about grabbing 3060 12GBs now that they go 200 second hand and using it in OptiX/CUDA rendering in Blender Cycles.
Waiting for more reviews. Is this GPU strong enough to handle Unreal Engine 5 at least above 60 fps on ultra or high settings?
Since you had the newest GPU with 8GB of VRAM, I would have liked to see the RX 6600 in there as a good budget option for those looking to game and spend a bit less money. But as always, I love your vids. 🤔
Math is not your strength.
82/69 is 1.188 = ~19%
Emulation would be interesting to see, PS2 and up. Apparently the Yuzu team chewed out Nvidia for the bandwidth cuts, and older cards with higher bandwidth would be better, at least for Switch emulation. I think a 25% uplift should be the bare minimum for a 60-class card, considering the 3060 was not an upgrade over a 2060, let alone on par with a 2060 Super... I imagine this will age like butter with the bus width, though.
The 3060, while still not worth the price, performs a little better than a 2070.
I've been waiting for the 4060 since the first 40-series launch. Can you please test War Thunder in 4K native without DLSS? Currently I'm on a 2060 6GB and textures go missing because of the 6GB of VRAM.
Out of curiosity why are you trying to run it in 4k on a 60 series? Would you not prefer to run it at 1440p with more frames? Or simply get a card more capable of 4k?
@@TheSwayzeTrain Why not? War Thunder is an easy game to run; why go out of your way to buy a more expensive GPU when you're planning to mostly play older but still popular titles?
One more reason why I don't watch Jay2cents anymore; he's just making a commercial for Nvidia to sell their under-specced card. People with a 3060 have no reason to upgrade to a 4060.
Frame generation will actually make you lose in games like Diablo 4 in PvP, at least for those I know who have tried it (not with a 4060, but a 4090). It's good for single-player and horror games, but that's it; for competitive use it's useless, imo. Thank you for making this review so people can see how much of a scam Jay is; he's not making decent reviews anymore, but rather taking the money and making a commercial, borderline illegal and misleading, imo.
Steak is my cheesecake, and Ngreedia is not getting a penny from me.
The Cyberpunk preview is the best-case performance scenario. It is a highly regarded game that is tuned perfectly to run on Nvidia cards and has all the bells and whistles the card can use; of course it will run fast with DLSS 3, but I did not hear how it feels to play. From what I've heard, frame generation increases input lag, and while the game looks smooth, it doesn't feel like you are playing at those high frame rates.
Another thing to point out is that the 4060 will probably not do as well at 1440p, being on par with the 3060 12GB.
I feel Nvidia started cutting corners because they thought they could get away with it. But it feels like they left a giant gap between their top-tier cards (4090, 4080, 4070 Ti) and their more mainstream cards (4070, 4060 Ti, 4060).
We need competition, not only on price but also on performance; this new gen of cards feels like a refresh, at least in raster performance. DLSS is great and all, but to me it's a non-factor unless it's implemented at a hardware/driver level rather than game by game. AMD needs to step up their game; they have completely dropped the ball this gen so far. Intel is the only hope we have, but they will release their newest cards next year, so we still have at least half a year until we get decent mainstream cards.
To be honest, the better option is either to wait or, if you need an upgrade right now, get last-gen AMD or Intel. The 6700 XT and A750 feel like the best cards for mid-range and entry-level gaming respectively, and nothing released recently feels even close to their price/performance. While I understand that reviewers generally need to play ball with the companies, I hope this gets bashed once again so Nvidia lowers the price of the 4060, the 4060 Ti and the 4070. Man, it sure feels bad to be a PC gamer right now, at least if you need a new graphics card...
good review
They want Jay and others to make us forget that 1440p exists, that these cards should easily be able to run it by now, and that they are specifically hobbled to be shit at 1440p. And Jay did it. Which does leave a bit of a foul taste...
Why 32%? It's 19%
It would be interesting to see comparisons of the cards using PCIe 3.0.
But generally, besides DLSS 3.0 and power efficiency, I'm underwhelmed. With the last few generations you could say that the 60 class of the new gen had approximately comparable performance to the 70 class of the previous gen. That doesn't really add up this time...
Think I'll hang on to my 3060 12 GB for now - At least until I can get an idea of how both cards handle the upcoming Unreal Engine 5 titles. I play at 1080p 60fps, so maybe a CPU or RAM upgrade might be a better option.
Mine just arrived today, bro. 12GB rev 2.0 Gaming OC. So excited, but all I can do for now is look at it, since my mobo is yet to arrive. Gonna Stable Diffuse the heck out of it once it arrives.
@@Feelix420 I like the idea of the 4060 drawing less power, especially as UK power prices are ridiculously high atm, but some games I play now use more than 8GB of VRAM. At 1080p, with 16GB RAM and a Ryzen 5500, most games play flawlessly at extreme settings. With Unreal Engine 5 games in development, I'll wait to see if it's worth an upgrade when the embargo lifts.
@@bigj1454 bro uk power bills are absolutely whacked atm its not even funny, all thanks to green energy. maybe gaming aside its time to overthrow some government who knows... btw typing this from my brand new keyboard and pc setup feels nice to finally have a decent setup.
we need more frequent discussions of cheesecakes, I like how civil and organised everyone becomes
i prefer a bigger cheesecake
@@tyre1337 I prefer milk cake 😂
At this point, Cyberpunk has already become Nvidia's playground... Oh, and I like cheesecake too.
My PC keeps crashing with 8GB of VRAM on a 6600 XT in D4. High settings on a 21:9 3440x1440 160Hz display. 90fps, but the VRAM is a big problem. It fills up from 7.2GB of usage to 8GB+ and then boom, hard crash. Is it poor game design or is 8GB of VRAM the problem? Make your own decision...
find someone to swap with 3060
@@d9zirable This problem is just in D4 for me. I would rather sell my Blizzard account than go to team greed again...
Where are the reviews on this card at 1440 and 4k? As I understand it nVidia has put an embargo on reviews at anything except 1080.
I say skip this gen if you are able to. If you can't or won't, then I think RX 7600 is a better deal at 250 (ideally at 200) than the 300-329 RTX 405... Sorry, I mean RTX 4060 (custom AIB models).
Unless you like RT; then AMD is out of the question. Then again, it's not like the 60 series runs RT well at all, but it's still way better with DLSS 2 and 3.
@@Jinny-Wa Yes, we haven't reached the stage where the 60 class can get 60+ fps in RT games, sadly... And I personally don't like fake frames because of the input lag. It's gotta be more noticeable with the 60-class GPUs...
@@Jinny-Wa No
You should skip all the generations you can skip!
If you can skip the 4000 series -> skip; if you can skip the 5000 series -> skip, and so on.
Some people just cannot skip, and they need these 4060 and 7600 GPUs we have now!
How fast does the 4060 run out of VRAM and cause stutters?
2:00 You forgot to mention the memory bus and bandwidth.
Will it play Command and Conquer Generals?
If you have a 2070 Super can you throw in results plz?
Can it run Crysis though? 🤔
There was a 4GB GTX 960 and a 3GB GTX 1060, technically, but it would be more fair to say it went 2GB to 3GB as well as 4GB to 6GB.
IIRC the 1060 launched as a 6GB card, then Jensen did his voodoo and made a nerfed version with 3GB (it should have been the 1050 Ti).
The power efficiency gains from the 40 series are really impressive. I wonder if the 4050 laptop GPUs currently available are full-sized versus what the desktop 4050 will look like.
Efficiency is not impressive, because for example the 4060 chip is in reality a 4050 GPU rebranded as a 4060; that's why the performance uplift is so bad vs the previous gen. Nvidia still needs to sell the remaining stock of 3060 and 3060 Ti (yes, we are still suffering from the high demand created in 2021), and Nvidia intentionally made the 4060 and 4060 Ti "bad"-value cards.
I mean, yeah, we all know the cards are way less powerful than they should be at these prices, but regardless, the 4090 is proof of the power efficiency gains: for the same wattage as a 3090 Ti you get 60% more performance. That is really impressive regardless of how lacking in performance the cards are.
I have a 1660 Ti and a laptop 3060 6GB. Damn, what a leap that was in both gaming and video/photo editing.
*Q:* Bryan, what's going on with the YEN?
What about a RTX 4070 TUF OC 12GB Graphics card ?
These results are going to change with the next big Cyberpunk update.
NVidia: the way Jayz2Cents was meant to be played.
Did you forget that Nvidia only ALLOWED comparisons with the 2060 and 3060?!
I never fully understood why motherboard manufacturers won't or can't put a VRAM slot on the mobo.
If you play at 1080p, go ahead and get an 8GB VRAM card. If you play 1440p or 4K and get an 8GB card, save money over the next 2 years, because you will have to buy another card to replace that 8GB one. As the guy said, it's good for "a couple of years". 2 years max and it's dead.
I am interested in 1440p. Will the 4060 still have that big of a lead, if any, over the 3060?
Please compare the 4060 to the 6700 XT. The 6700 XT can often be had for $310 or $320, only $10 over the 4060.
What is there to compare? The 4060 will stutter in games like The Last of Us, Star Wars Jedi and Hogwarts... because it doesn't have enough VRAM, unlike the 6700 XT.
@@Ladiozahh yes *badly optimized games*
@@naipigidi when console have 12gb vram, yet pc master race blame the game using 8gb vram gpu
@@saialexander1873 Exactly. People need to start building their PC using the same or similar specs as the PS5 to enjoy games better
@@Ladioz they already do
Daniel Owen already did what JayZ didn't.
I got a 3060 12GB from a buddy two months ago for $280. I am not disappointed, especially since I'm coming from a GTX 970.
Brian, isn't the RTX 3060 Ti a better buy than the 4060?
I'm super happy with my RTX 3080; it's pretty beastly, and I'm shocked at how much they still sell for. I'm kinda tempted to go out and buy a 4080 or 7900 XTX and make about half or more of my money back by selling the 3080. I don't play a lot of RTX titles, so the 7900 XTX for $200 less is looking really, really good; it would only cost about $400 in the end. What do you guys think I should do?
You can sell it and swap for a new 4070.
In this cheesecake analogy, I imagine DLSS would be represented by substituting non-dairy creme for the real thing, and frame generation would be air whipped into the mix.
@5:00 82 fps versus 69 fps is a "32%" uplift? In what universe is that true? Try an 18% uplift... Yikes. That thumbnail with the imaginary "30%" written on it is an abomination. Even Nvidia themselves couldn't have hoped for such favorable propaganda, with them only (misleadingly) claiming a "20%" uplift in their cherry-picked results (when in reality the uplift is below 15%, which we have known for months based on the effectively identical mobile 4060 and how underwhelming it is). Anyone with half a brain knows that Nvidia "allowed" the "preview" of the 4060 in Cyberpunk a couple of days early since it shows the card in the most favorable light (as VideoCardz wrote in an article yesterday, Cyberpunk is a "Nvidia tech demo"). Even then it lost by 9% against the middling RX 7600 according to Tech Yes City's benchmarks.
He tries to teach you about economics and inflation but can't do 5th-grade math.
Why didn't you look at the VRAM usage difference between the 30 and 40 series?
If you check, the 30 series uses only VRAM (GPU memory),
while the 40 series splits between VRAM and system RAM.
Just wait a few months for the RTX 4060 12GB.
If Nvidia had named the 40 series like they did the previous series and priced them accordingly, everyone would be singing their praises. But spec-wise this is a 4050, not a 4060, and it should have been priced in the $200 to $250 range.
82/69 works out to an 18.8% uplift, not 32%? @ 4:50
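The uplift arithmetic these comments are disputing (and which the video's EDIT note concedes was wrong) can be checked in a couple of lines. A minimal sketch, using the 82 fps and 69 fps figures quoted in the comments above:

```python
# FPS figures quoted in the comments (from the video's Cyberpunk 2077 chart)
fps_4060 = 82.0
fps_3060 = 69.0

# Relative uplift: (new / old - 1) * 100
uplift_pct = (fps_4060 / fps_3060 - 1) * 100
print(f"Uplift: {uplift_pct:.1f}%")  # prints "Uplift: 18.8%" -- not 32%
```

The "32%" figure would require roughly 91 fps over a 69 fps baseline; 82 over 69 is about 18.8%, matching the video's correction.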
PLEASE DO A 4060 VS 7600 VIDEO, I SEE NONE. People keep comparing the 7600 to the 4060 Ti and keep saying it's a bad value. When the 3060 Ti came out it was $400, same as the 4060 Ti. The 7600 launched at $280 and is $250 right now, and you get the performance of a slightly better 6650 XT, which cost $400, while the 6600 XT was $380. Please make it make sense. The 7600 XT/7650 XT aren't even out yet, and the 7600 is so close in performance to the 4060 Ti. DLSS 3 is the only thing saving the 4000 series right now; a lot of games don't have it, and maybe one or two of them are multiplayer games. In the end AMD always wins. FSR 3 is going to carry all the cards, like FSR 1/2 did for GTX users.