How NVIDIA missed the AMD KRAKEN
- added 9 Aug 2022
- 20% coupon code: C30
Win11 Professional 1PC: www.ucdkeys.com/product/windo...
Office 2021 Professional Plus CD Key: www.ucdkeys.com/product/offic...
Windows 11 Pro + Office 2021 Pro Global Bundle: www.ucdkeys.com/product/windo...
Windows 11 Home CD Key Global: www.ucdkeys.com/product/windo...
Office 2019 Professional Plus: www.ucdkeys.com/office-2019-P...
Win10 Pro: www.ucdkeys.com/Windows-10-Pr...
Office 365 account: www.ucdkeys.com/Office-365-Pr...
Support me on Patreon: / coreteks
Buy a mug: teespring.com/stores/coreteks
Bookmark: coreteks.tech
My channel on Odysee: odysee.com/@coreteks
I now stream at:
/ coreteks_youtube
Follow me on Twitter: / coreteks
And Instagram: / hellocoreteks
Footage from various sources, including official YouTube channels from AMD, Intel, NVidia, Samsung, etc., as well as other creators, is used for educational purposes in a transformative manner. If you'd like to be credited, please contact me
#nvidia #rtx4090 #amd - Science & Technology
Finally!
A youtuber not bought by GPU makers/vendors.
Prices are crashing down.
All we have to do is sit and wait.
🕐🕑🕒🕚🕓🕔🕕🕖🕗🕘🕙
nVidia 2021: "Gamers? What gamers?"
nVidia 2022: "Gamers, please buy our GPUs."
Nvidia doesn't care who buys their gpus....
@@samgoff5289 damn the joke flew right over your head
@@samgoff5289 yeah dude, they don't care, that's why the prices go down every week
in a nutshell
To be honest... even if a 3090 drops 80% in price, I don't want my home heated and my PC noisy whenever I use it. FPS per watt matters as much as quietness.
That is why new APU will be interesting to follow
That's why I'm looking to AMD for my next GPU and CPU upgrades. Nvidia has power hogs for their 3000 series. AMD's stuff uses far less at default.
@@Winnetou17 the RTX 3090 is one of the least efficient in FPS per watt. If you plan on undervolting and power-limiting by 30%, do it with the more efficient cards instead, like a 3060 or 6600 XT
Undervolting is key
@@Winnetou17 Thanks for your comment, and I agree. Though, some cards are just the overclocked version of the others, in order to simply charge more. Do you actually have data or website about the best gpus to undervolt?
Nvidia set the defaults on their 30-series cards too high; supposedly they were somewhat paranoid about RDNA2. It takes like 5 minutes to watch a YouTube video about undervolting. My 3080, which runs around 320 watts at stock, only loses about 10% performance running at 220 watts, and at 140 watts (it seems to crash while gaming if it goes any lower) it's still more than enough for esports. 140 watts was also about the minimum my 1080 Ti could run at without crashing.
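The commenter's figures can be turned into a quick back-of-the-envelope efficiency check. This is a sketch using the rough numbers quoted above (320 W stock, ~90% performance at 220 W), which are the commenter's own estimates, not measured benchmark data:

```python
# Rough perf-per-watt comparison for a power-limited RTX 3080,
# using the estimates from the comment above (not real benchmarks).

def perf_per_watt(relative_perf: float, watts: float) -> float:
    """Relative performance (1.0 = stock) divided by board power."""
    return relative_perf / watts

stock = perf_per_watt(1.00, 320)    # stock: 100% perf at 320 W
limited = perf_per_watt(0.90, 220)  # power-limited: ~90% perf at 220 W

# The power-limited card does roughly 31% more work per watt.
gain = limited / stock - 1
print(f"Efficiency gain at 220 W: {gain:.0%}")
```

This is why losing "only 10%" matters: the performance cost is far smaller than the 31% power saving, so efficiency improves substantially.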
4:00 "cough" jayztwocents "cough"
Okay it wasn't just me who thought that.
Simple, he is deep into Nvidia pockets, like LTT and Tim from hardware unboxed.
Can't trust those sellouts.
@@grtitann7425 lol Hardware unboxed? Really? 😄
@@50shadesofbeige88 yes, Tim, from HU. Not Steve.
Tim will always twist his videos to be as pro-Nvidia as he can without blowing his cover.
There's no way Jay was that naive to believe prices were never going lower.
Nvidia is paying for their greed now, it's karma for choosing to crank out high-end GPUs for miners in a time when the actual demand was in the low-midrange. Good luck selling anything when you'll be able to pick up a 3090 for $600 in december.
Yep, empires come with the seeds of their destruction.
@@samlebon9884 Honestly, right now,I don't see a difference between nVidia and AMD. They both went greedy.
@@adnan4688
When it comes to greed, nobody beats Nvidia and Intel. AMD is somewhere in the middle.
Remember, if there were no AMD, the market would be a duopoly split between green and blue.
@@samlebon9884 With the last gen, AMD proved they are the same. The moment they get a chance, they will set the same premium prices on their products. I have been an AMD supporter for 3 decades, but those are facts, my friend.
The last time ATI (now AMD) partnered with a game developer and the game was delayed, it was Half-Life 2, as I recall. It worked massively in their favour, building up hype and selling more cards
i think kraken is a hardware compression / decompression of assets on PS5 SoC. so there is already an AMD kraken ;-)
Yep, Nvidia is getting compressed without reversal.
Kraken was already known in the industry before the PS5 came out. The PS5 is simply the first standard hardware that has an ASIC for that.
Ideally DirectStorage will do something similar, though I don't really know which compression technique they chose. If it's similar to the Xbox's, then it's another technique optimised more for textures than for general data, which makes sense, because the GPU's main workload during loading is compiling shaders (which faster storage won't help much) and loading textures. Geometry is usually less data, or with things like tessellation it's derived from yet another texture.
Amd fanboi channel.
4:02 you mean Jayztwocents? lol
Obviously
I am so thankful to AMD. If not for AMD, we would still have 4-core i9 CPUs
4:01 i believe it's JayzTwoCents
Soon to upgrade my main machine, I'll go all-AMD this time with Zen 4 and RDNA 3.
Go AMD! 👍👍👍
Me 2😗
do you own a motherboard that supports AMD?
Yeah baby ! ME 2 ! Just F Nvidia & Intel ! AMD all they way !
@@DamnDarryn
He said he's going all AMD, which means he's switching to a completely new platform.
Many people are switching to AMD because the choice is obvious.
You all will regret your decision for sure. Just come back to this comment when that happens
4:01 By “influencers”, do you mean Jayztwocents? I don’t think the use of “idiot” shortly afterwards was a coincidence.
The $500 RTX 2080 Ti of 2020 was a lesson, so I doubt anyone would want to risk another possible chip shortage
But most miners I know own 3-5 cards. With electricity bills going up and mining going down, I don't see them keeping their cards for much longer.
There will probably be a flood of cards.
@Joe Campbell yes and no. The bigger cards are on separate, larger dies, and more of the smaller chips fit on the same wafer diameter. The market stack just comes from overall silicon performance: the best chips are nearer the middle of the wafer and less so toward the edges. The outer dies can have defects or underperform, and they become the lower-tier chips. We basically get 2-4 types of die every generation, and each of those wafers ends up making different GPUs.
Nvidia also cranked the power to the max to get gains that the architecture and Samsung's node didn't provide. They can't offer the same performance increase this time because the overclocking budget was already eaten up by the last gen. They need above-average architecture improvements to claw back the extra power consumption they spent last gen.
I wonder if the advantages of these integrated systems (cpu + gpu + direct storage/smart access) also come into play with AMD's APUs
I think so. I think this slew of techs are coming from the consoles.
Honestly, if there is one thing I beg AMD to look into doing in the future, it's creating a console for the PC market. RIP PC building, but I honestly think moving away from motherboards, CPUs, RAM and GPUs coming from different vendors would be a good thing. Or at least have the CPU, GPU and motherboard come from the same vendor. I'm so tired of motherboards having flaws while the only one who gives a damn is the CPU vendor, since the motherboard vendors give zero shits about stuff being broken, unstable, etc. I'd rather get the hardware as if it were a laptop IF there's a hope in hell that drivers, BIOS, etc. will be as close to flawless as possible.
Let PC building be about fans, cases and cooling! We'd still pick the components to put in there, just without this mess. Even if the CPU or APU can't be removed from the motherboard, making the CPU vendor responsible for designing and delivering that crucial motherboard should mean less need to swap out some shoddy board. There is not a single reliable motherboard vendor out there. NONE. There are claims, there is marketing, but the weak link is always the motherboard, whatever your budget is!
Got an AMD CPU? Well, too bad, your network NIC is from Intel and they don't give a fuck that it's broken. WTF? Oh, you got one of the first-generation M.2 motherboards? Oops, BIOS updates make the port useless, and you can kiss goodbye any hope of them EVER fixing it. "IT IS FIXED ALREADY!"
You got a Chinese-produced CPU? Well, the motherboard didn't like that; apparently it needs to come from the right country of origin to be compatible. That's how bad their tolerances are at the motherboard level of barely making things work. And these are the same companies building motherboards to the CPU vendors' specs. AMD had to send a new CPU plus proof to a motherboard manufacturer that the CPU sent in for troubleshooting was genuine, and the board maker still gave zero fucks that their boards refuse to run products they are sold to work with. And where do you go? There is NO vendor making good motherboards. Lenovo, maybe? And they basically make only laptops and laptop-like desktop machines.
Make personal computers like consoles, and make them reliable. Almost like Apple Macs, but hopefully repairable and costing less, while still having the good, reliable stuff in there. Have one vendor to face when things don't work, not the blame game. A CPU needs a motherboard well made for that CPU; there is no way around it. You can spend premium money on motherboards and still get burnt every SINGLE time. The only computers that have worked almost flawlessly for me over the years are laptops!
@@TheDiner50 I agree with you, AMD should just build a line of "Console-like" gaming PC's using APU's
This cycle was ridiculous anyway. There were no cards at MSRP the entire cycle, and now that we're on the cusp of a new one, no one wants to overspend. Getting a new, unused card at or near MSRP is now possible, and there are still those waiting until prices drop to MSRP before they consider the upgrade they originally planned.
The used market will still be around even after the 40 series launches, and it will be scalped. Ticketmaster took control of its own scalping site, so it's a lot harder to scalp concert/venue tickets now, while GPU scalping is as easy as can be.
Breaks my heart to see Nvidia get their comeuppance.
Oh wait, no it doesn't.
Go AMD!
They supported the open source world even though it's not profitable.
This is the way
You mean they supported open source cuz they had no choice as nobody would bother with their products otherwise
@@beachslap7359 The reasoning is irrelevant. Consumers need to reward that behavior.
@@imadecoy. I totally agree
Oh YES ! RDNA-3 is going to beat the RTX 4000 series ! And RDNA-4 will completly CRUSH Nvidia !
finally. payback time!
I can never unsee Nagreedia..
Ngreedia
Excellent video as always, Celso. Greetings.
Time for Nvidia to be toned down imo... Go AMD!
Radeon DNA-4 (2024) will CRUSH Nvidia into the ground ! RDNA-4 will be a game changer !
The game needs to be written to take advantage of DirectStorage though, doesn't it? So sadly almost no benchmarks at release will show off any of this 'synergy'.
If you're talking about Forspoken, it's not out yet, and it has been built around DirectStorage for a long time, even back in alpha. AMD doesn't need every benchmark to look like Forspoken; one example of what is possible will be enough. If they can improve RT, integrate FSR 3.0 + SAM + SmartAccess Storage, and be on par with Nvidia's DLSS 3, it will be enough to claim a lot of the market.
Get enough RAM and DirectStorage is just dead... DDR5 already does 4.8 Gb/s per pin and goes up to 6.4 Gb/s per pin, while even a fully bandwidth-limited SSD only gets you about 14 GB/s. So you're looking at up to triple-digit GB/s of memory bandwidth versus 14 GB/s of storage. DirectStorage is a complete waste of time unless you're restricted by volatile memory, i.e. you have to share only 16 GB of RAM between GPU and CPU (consoles). Unless AMD pays developers an arm and a leg to optimize ONLY for this feature set, it's DOA on PC beyond some eventual level load-time reductions...
@@proxis9980 You would think 64GB would be enough for most games and it's not that expensive these days.
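The bandwidth gap behind that argument is easy to put in numbers. The sketch below uses theoretical spec-sheet peaks (DDR5 channel = transfer rate × 64-bit bus width; the ~14 GB/s SSD figure is an approximation for a top PCIe 5.0 x4 drive), not sustained real-world rates:

```python
# Theoretical peak bandwidth comparison behind the RAM-vs-DirectStorage
# argument above. Spec-sheet peaks, not sustained real-world throughput.

def ddr5_channel_gbs(mt_per_s: int, bus_bits: int = 64) -> float:
    """Peak GB/s for one DDR5 channel: transfers/s * bus width in bytes."""
    return mt_per_s * (bus_bits / 8) / 1000  # MT/s * bytes per transfer

dual_4800 = 2 * ddr5_channel_gbs(4800)  # dual-channel DDR5-4800
dual_6400 = 2 * ddr5_channel_gbs(6400)  # dual-channel DDR5-6400
pcie5_ssd = 14.0                        # ~14 GB/s, approximate PCIe 5.0 x4 peak

print(f"Dual-channel DDR5-4800: {dual_4800:.1f} GB/s")
print(f"Dual-channel DDR5-6400: {dual_6400:.1f} GB/s")
print(f"PCIe 5.0 x4 SSD (approx): {pcie5_ssd:.1f} GB/s")
```

So dual-channel DDR5 lands at roughly 77-102 GB/s, which is where the "triple digits versus 14" comparison comes from; whether that kills DirectStorage on PC depends on how much RAM you can afford to use as a cache.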
Synergy can only really be achieved by harmonizing the software and the hardware, not just the CPU and GPU. Look at Apple: although they've been designing their own CPUs since the A4 chip, they still used PowerVR GPUs until recently, yet they achieved great "synergy" with their proprietary Metal API. AMD's issue isn't that their CPUs and GPUs don't work together; it's their mediocre software optimization and their poor ray tracing performance on RDNA 2, which can't even beat Turing. Anyone who says ray tracing is a scam is tech illiterate, simple as that: just look at how realistic Pixar animation or movie CGI is, and imagine applying that to video games in real time above 60 FPS. Ray tracing brings photorealistic, cinematic graphics to games in real time, turning games into interactive movies instead of just "a game". Half-assing it is a big mistake for AMD, as is completely ignoring hardware-accelerated AI, considering how much AI is revolutionizing the industry right now.
@@dpptd30 to be fair to AMD I think we need to judge them in 5 years or so to see where they are as they simply haven't had the money to do anything other than the basics for a long time. But equally, 'Apple like synergy' on a Windows PC would be almost impossible simply because there are too many different players.
I think it's going to be really hard to overcome Nvidia's mindshare in the short term. Even when AMD had a better product, it hasn't helped that much in the past.
I think that fuels AMD. Roman comfort creates weak men, etc.; Nvidia is getting way too comfortable. The mindshare thing is terrifying: it's more like people hate AMD than even like Nvidia, and negative influence is stronger than positive
I'm curious if AMD will catch up with Nvidia in real time rendering in DCC applications
They are honest to god trying but that's a huge cliff to climb.
Not going to happen any time soon, at least not in the next decade. RDNA 2 can't even beat Turing in real-time rendering; seriously, an RTX 2060 beats AMD's 6900 XT on Blender Open Data.
@@dpptd30 🥴
Such a great video as always!
As an RTX 2060 owner, looking to get into higher tier resolution and refresh rates. I'll probably be skipping the 3000 series, simply because their price and power draw is far too high (also lack of VRAM). I may consider an RTX 4000 series. But, I'm more interested in AMD's offerings from both current gen and the upcoming RX 7000 series.
Price and power draw will be worse next gen.
As a Plex Server owner, I look forward to cheap NVENC gen 7 devices like your RTX 2060 flooding the market in the next 6 months.
Although the RX 7000 series will use more power than the RX 6000 series, it will be nothing compared to the Nvidia RTX 4090 Ti's rumored 800W.
Try to hold out until Nvidia is begging you to take them!
@@tringuyen7519 NVIDIA recently announced the actual power draw of the 4080 to be around 320W; 800W may be the maximum spike. Still, AMD is better at performance per price.
What was the Samsung OLED monitor he mentions half way through the video? That one is new to me. Would not mind getting that. I typed in G-Series and didn't find anything.
He most likely means the Samsung Odyssey G7/G8 - but they are QLED, not OLED. What he says in that regard doesn't seem correct.
@@Jmundala Ya I thought as much. Damnit lol. Thanks for the reply.
there was AMD demo of physics effects working on 16 cores cpu - is it part of api stack used in forspoken?
Game developers in general don't want to spend excessive resources on physics calculation, so they make it utilize one or two CPU cores at best, and even then they usually don't push those cores very hard.
@@arenzricodexd4409 yes, but still, there were reports of a tech demo presented about a year ago, and no game has used it since. Forspoken, with its many particle physics effects, would be good software to show that off and to gain additional performance on consoles.
@@michahojwa8132 excessive physics processing has been pushed since the mid-2000s. Game developers aren't interested in it. This kind of thing is most often a hardware maker's ploy to sell stronger hardware, but developers usually want their games to run even on potatoes so they can reach a wider audience and broaden their sales target.
Well ... the price cuts are unprecedented only because the price hikes were unprecedented. The markets allowed one and required the other. I do think it is interesting that Nvidia's attempt to flood the market with multiple types and quantity of cards at high or no MSRP has caused their announcement of the 4000 series to backfire and force a rescheduling. I also am curious why the 1000 series seemed to have unknown performance until the announcement by Nvidia with their "performance" charts that wowed everyone. Now we all know pretty much what to expect prior to any announcement of the 4000 series. It's a new world. The company reveals seem to always be "yeah, yeah, we know that ..."
"the price cuts are unprecedented only because the price hikes were unprecedented"
Perfectly said. Prices became insane. As a result, I got completely out of the mindset of buying into a PC upgrade at all. My Skylake/Pascal rig is still doing fine at 1080p, and I'm just not going to bother with a new PC for a while, even though now they're getting more reasonable again. It's not even that I want to wait for the next gen of GPUs. I simply have no appetite for buying anything right now. I would have, when the 30 series launched, had it been reasonable and available enough.
They were not "unprecedented" they were inevitable! These "price cuts" are a joke anyways. Priced at original MSRP 3 years later is overpriced. I hope the market crushes them in the proverbial wallet.
I think managed "leaks" have become a double-edged sword. While they maintain hype over a longer period, when it comes time to purchase, people have had a longer time to consider the relative value, which cuts down on impulse buying. Evidence of this is in the video, when Coreteks says something to the effect of, "buyers have become more savvy", and this savviness is due partially to potential buyers absorbing and managing this extra information. I think people are also much more aware of product cycle behavior these days and how to identify where in the product cycle we are relative to the desire to buy.
Ah, neural graphics. That will be a holy grail, that will be a big buzzword used in next-gen consoles. Personally, I can't wait for it, as it will push the virtual world generation to whole new levels.
4:00 I'm disappointed in jayztwocents
Why? He is well known NVIDIA paid shill.
Unless China invades Taiwan. Then Jay would become a visionary.
Nvidia made so much money during the boom that they're not worried about this downturn. The only people freaking out are investors who didn't understand that this growth and revenue was temporary. Nvidia would do it all again if they could: 3 birds in one year and none in the second is still better than 1 bird in each of the two years.
You're a bird
Exactly. Calling this "losses" by the entire tech press and YT is so pathetic, these are not losses, it's a return to normal market conditions of a 2 year old GPU generation that is about to be replaced in 2 months. This is normal, last 2 years were abnormal.
It's the reason Jensen said they are raising wages at nvidia now, because they made a shit ton on $ until now.
Tech press is so ignorant and can't or won't call it what it is, they just like click bait titles and statements.
Coreteks is guilty of this too; he said "losses" instead of what it is. Even when calling out Jay for his BS "BUY NOW" stunt 3 weeks ago, Coreteks was afraid to say his name. I don't understand this cowardice.
@@ItsJustVV It is a loss if they need to writedown the assets sitting in the warehouse: CR Inventory WriteOff, CR Inventory
@@jimiscott Technically yes, but I don't care about correct terminology when they made so much profits in the last 2 years that they can fail miserably in 2022 (which won't happen) and still be on the plus side, so no losses from my POV.
@@ItsJustVV "Calling this "losses" by the entire tech press and YT is so pathetic"
It is literally a loss. They have spent money that they cannot make back. The reason for it doesn't change that.
"I don't care about correct terminology"
Then shut up. If you don't use words properly, you can't say anything useful or meaningful, so why bother saying anything at all?
The Samsung G series (if we're thinking of the curved Neo G7 and G8) are LCD panels, not OLED panels? This confuses me.
I think he was mistaken, but there is a planned OLED monitor from Samsung in production, but not yet on the market
@@ImSumGuy Yes, that's right. There are OLED monitors coming from Samsung in the near future, probably near-identical to Dell's Alienware QD-OLED monitor, but as of right now Samsung doesn't have OLED monitors, and no "G series OLED" monitors.
I think he confused it with VA mini LED displays which are G series (G7 and G8).
but does this mean Jensen is going to cut back on leather jackets?
he'll be wearing a black turtleneck sweater from now on.
Note: Fidelity FX is open source. Direct storage might become so if not already.
its a microsoft API.
@@oxfordsparky so it can be replaced and/or is already used by Nvidia.
Fidelity FX & DLSS is used by Nvidia.
@@partnersincraft1889 AMD would make a driver that is compatible with the Direct storage API, I don't think they would replace it?
@@SerBallister I'd say it depends on if the Linux driver gets a better version of it. The Linux driver is very good.
Any thoughts on the ssd market and prices in the future? Next ram and connectivity standards such as computer eXpress Link CXL? And even wireless peripherals market?
Honestly, the AMD cards look like a relative bargain at present in my region. You can get a 6900 XT for pretty much 1/3 of what I paid for my 3080 Ti (I did buy at the worst time). I wouldn't even consider ray tracing if I were shopping now; even with a 3080 Ti I just turn RT off most of the time, as the more intensive implementations hit my frames so hard it doesn't feel worth it. I won't consider ray tracing a factor next time I upgrade, so I may end up trying AMD for my next card, as the value is much better
At my place, Sapphire Nitro RX 6900 XT still cost $1500. What an expensive dust collector I'm seeing right now...
Prices still are 200% of what they should be.. still waiting! A high end card (like 3090ti) is 600$ max, and nothing beyond!
Exactly, PC gamers shouldn't pay more for a mid range PC than a current gen console. And high end should be within a few hundred of a current gen console.
@@JustAnotherAccount8 Plus look at the insane amounts of wattage a GPU needs these days. I got a 2070 MaxQ in a shitty HP Omen laptop. Works fine on 1440p, keeping that until they release something that is truly 60fps with RT. Otherwise, screw it. Nothing much that I play supports RT anyway.
@@dazealex Yeah don't bother with RT, not worth paying a premium, and most games that support it look amazing without it, in some cases even better.
Basic supply and demand: there are only so many silicon fabs out there, and they can sell GPUs for way more to server and data-center customers anyway, so it makes perfect sense for them to raise prices like they do. I don't like it, but capitalism does deem that things get priced at whatever the market will bear
3090ti is a prosumer card and its not for peasant gamers, so paying 1600 for it is worth it💥🤣
Kraken is your post-processed voice in every single video you upload nowadays.
@xidex why not turn the sound off and turn captions on if it bothers you so much?
@@Coreteks I usually watch your videos in a miniaturized window mainly listening what you are saying and doing other stuff alongside (e.g. coding), so what you suggest would not be possible. I just cannot comprehend what led you to heavily postprocess your voice when it was completely fine in the first videos on this channel.
@@xidex8 if people could accept who they are, there would not be cosmetic surgery either.😉
@@diaman_d true, he probably thought his voice was not manly enough or something.
@@Coreteks "Never argue with stupid people. They drag you down to their level, then beat you with experience". -Mark Twain 🇺🇸 😎👍☕
Great as always.
someone please make bass boosted version of this, i miss that bass!😅😂
His vocal cords are coated in butter
Imo greatest reduction is a farce when they cost 2k. Just my opinion tho
All the "Fidelity FX" features you mention are just DX12 Ultimate features; they're readily available and just as optimised for Nvidia hardware too. Square Enix is not going to alienate the company that has, by far, the largest market share when it comes to PC graphics hardware. Intel and AMD are also not going to purposely hobble the performance and compatibility of their CPUs with Nvidia hardware; all those features attributed to an "AMD synergistic" system will still be available on an AMD/Intel/Nvidia one.
It's good to have these improvements open to GPUs of different brands, but I think the new AMD generation will be a beast when you pair a 7000-series CPU with an AMD GPU
@Chade Fallstar to some extent yes, but just like we've seen with ReBar the perf gains will be smaller outside of AMD's ecosystem. The feature might be available to NV users, but that doesn't mean it will be fully optimized
@@Coreteks No, that's just not the case. Resizable BAR is part of the PCIe spec, and AMD just had a head start by utilising it (they called it SAM) first back in 2020, but both vendors gained performance in some games while performance was actually reduced in others, and now they're pretty much neck and neck. This is a different case entirely: AMD marketing is, quite successfully by the sound of it, trying to make out that these features are in some way special to their ecosystem. They are not; they are standard DX12 Ultimate features, already being used in other games. AMD may have a performance advantage immediately after a game's release, but that will be eaten up by the time Nvidia releases their first Game Ready driver for it.
These aren't special or optimised for AMD in any way. This has been going on since Ati V Nvidia days, it's nothing new to have games that initially optimised for one architecture over the other.
Lol, you're the only one I know who says to "clink" on a link. Excellent video! I can't wait until the next AMD GPUs come out! About the 7000X series CPUs, did you see that AMD hurt their own performance by putting on such a thick heat spreader? If not, check out der8auer's video on the subject! Maybe they should come out with a 7000Z series with a much thinner pure-silver heat spreader!
Kraken as in PS5's SSD tech. Get it?
As graphics card performance outgrows what games actually need and what monitor refresh rates can show, there will be fewer and fewer reasons to upgrade. Unless there's an actually good reason to transition to 8K.
Muh 4k Ultra @ 240Hz
I think 8k on an ordinary size monitor may be higher resolution than the human eye can see
@@UserName-eb9oy 8k 100" monitor. Yeah, get ready for lots of neck pain.
@@spicynoodle7419 They'll probably just raise 8K to like 40 inches.
The smallest LG OLEDs used as a monitor still can fill up your whole field of view. I can't imagine anything sharper would help.
what's that demo at 10:38?
@LeChuck czcams.com/video/_R-tN9l6onQ/video.html
A $250 7600 with 3070ti performance could devastate NVIDIA.
No it wouldn't, but it would force their hand to launch for $300 the 4060 that they wanted to launch as a 4060 Ti for $500. That's what competition is good for: better products at lower prices.
If what you said were realistic, NVIDIA's reputation would already be devastated by selling xx60-class GPUs for $500
I don't care how much you sell your flagship for, but making 1080p gaming unaffordable isn't good for anyone except their greed.
I think its a bet on software that's gonna be extra for very few games like physx was
loved the video thumbnail hahaha
3:57 Jayz two cents
Yes. It seemed painfully obvious at the time.
Thank you for mentioning Jayztwocents. He really is a sellout; I can't believe how shameless he is.
Never forget this is the guy who told everyone we should be thankful for the price increases of the 2000 series because the 2080 Ti was "the new Titan" - then they released the Super cards and the RTX Titan shortly before the 3000 series.
Jay is an EVGA shill !
What did he do?
@@anonymoususer3561 upset the fanboys.
@@anonymoususer3561 claimed prices wouldn't go any lower during a sale before prices went lower and have kept going lower, among other things
The biggest "competitor" Nvidia faces now is the used market of their own products. Ray tracing turned out to be a slow fizzle, and FSR 2.0 is breathing down the neck of DLSS. Machine learning is the only remaining island of security for now.
I literally bought a RTX card for the ray tracing and nothing else.
FSR 2.0 is very bad in most games. Ray tracing is off to a slow start because the console GPUs aren't really able to render a lot of rays. Thanks for nothing, AMD
@@SnivTen By the time we get fully ray-traced graphics engines (and games), the current crop of RTX hardware will be outmoded anyway. The only 100% ray-traced games that run at sufficient FPS are the old RTX-flavored Quake 2 and Minecraft. Not a very compelling reason to dump a bag of money for the RTX feature alone.
No worries .......... RDNA-3 is going to beat the RTX 4000 series ! And RDNA-4 will completly CRUSH Nvidia !
Just F Nvidia !
@@gertjanvandermeij4265 why are you spamming
maybe this will be the year I replace my 1070.
Your so lucky . . . I'm still on a 1030-2gb but hoping to get a deal on a 3080 for BF so I can finally play at max settings 🤪
@@carrickdubya4765 NGL I still game on a 1080P monitor so.... graphics card is not on the top of my priority list, even in just my rig.
Its not about if they make MCD GPUs but when and how many.
"gaming revenue" = selling to miners revenue
Just bought an HP Omen 25L during their back-to-school sale. It was only 20 dollars to upgrade to a 3060 Ti. Granted, that isn't the fastest GPU, but for 1100 USD I was able to get an i5-12400, 32GB of RAM, a 512GB NVMe, a 2TB HDD and a 3060 Ti. I feel like that's a hell of a deal.
Not really... I bought an HP as well: a gaming laptop with a 3060 Ti, an AMD 6800U (8c/16t) and 16GB DDR5, from Micro Center for 799+tax. You overpaid.
@@ko7305 well damn. I needed a desktop though not a laptop.
@@samcs06 especially for a desktop as those are cheaper. Desktops, best build yourself.
If the 4090 really is 2X of a 3090, I might be tempted. 4K with ultra settings is tough.
800W for that GPU is madness. The new CPU generation is almost 300W, so you'll practically need a 2000W power supply. Sometimes the GPU spikes above 1000W and the system can shut down, like at the 3000 series release
@@Azurefanger good point. I'm curious about that too.
No ...... NEVER 2x ! Maybe 60% better TOPS ! RDNA-3 is going to beat RTX 4000 series ! And RDNA-4 will completly CRUSH Nvidia !
@@gertjanvandermeij4265 time will tell.
it's 1.8x a 3090, at 2x the power consumption
so perf/watt has gone backwards
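The arithmetic behind that claim is simple, assuming the rumored figures above (1.8x the performance of a 3090 at 2x its power) hold:

```python
# Perf/watt check for the rumored 4090 figures quoted above
# (1.8x a 3090's performance at 2x its power draw - rumors, not specs).
relative_perf = 1.8
relative_power = 2.0
relative_efficiency = relative_perf / relative_power  # perf per watt vs 3090
print(f"Relative perf/watt vs 3090: {relative_efficiency:.2f}")  # 0.90
```

A ratio below 1.0 means each watt does less work than on a 3090, i.e. efficiency regresses about 10% if the rumor is accurate.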
LOL. only spoiled rich kids will buy 4000 series.
Jensen jensen, my boi... where is my fucking new gen card?
All the black and leather in the world can't hide Jensen's big old belly.
7950xtx with 69 traversal coprocessors.
11:00 looks sooo dope
So the coprocessor is finally seeing the light of day?!
Interesting!
xD
Apparently 5800x3d works even better with all this
If Nvidia indeed leased (not sold) GPUs in bulk to miners and has now taken them back, they might as well expunge the paper trail and repackage used GPUs as brand new.
Memory wear from mining is worse than the effects of processor wear from overclocking/undervolting. The latter will reduce performance a little, but the former will just brick the GPU abruptly one day.
@@Winnetou17 yes, miners commonly undervolt the processor and overclock the memory frequencies
@@Winnetou17 Several reports of 30x0 GPUs with Hynix memory modules failing have appeared since the crypto crash this year. The other memory makers have slightly better reliability, but even those are expected to fail. Such failures were not reported on the same series of GPUs before this year.
Lisa, TSMC and Jensen are from Taiwan.
Karma to Nvidia for focusing on miners instead of gamers
Nvidia tried to lock GPUs out of mining, right? It's just that for some cryptos it was easy to bypass? But then again, maybe they didn't really try that hard.
Nvidia took chances. Everyone is always chasing opportunities. Nvidia saw profits in miners and made a deal with them, and they probably didn't expect crypto to crash sooner than expected. Sad for Nvidia; I'm pretty sure their shareholders are taking serious emotional damage and probably plan to pull their stock out of Nvidia LMAO
@@Pipit123ify Nvidia introduced their LHR gaming GPUs, but at the same time worked with their AIBs to divert huge sections of their output directly to miners…
It was a real kind of fuck-you to gamers. Their LHR GPUs were basically all for show.
They’re a business, their goal is to maximize profits, not be your friend. Grow up you naive toddler.
They're crying right now, and wiping their tears with money all the way to the bank!😁
This. Even at 4K, my RTX 3090 is having no issues playing games at ultra settings with DLSS and FSR 1/2.0 and anyone who says 'you can't hit 4k 120hz in every game' is wrong. I'm having zero issues.
As long as FPS per watt efficiency does not improve I have little interest in the RTX 4000 series.
I prefer cards with good performance and reasonable power consumption. Guideline: 1x 8pin PCI-e power connector needed.
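For context, that single-8-pin guideline implies a concrete board-power ceiling (the 75 W slot and 150 W connector limits come from the PCIe specifications):

```python
# Power budget implied by the "1x 8-pin" guideline above.
# Per the PCIe specs: the x16 slot supplies up to 75 W,
# and one 8-pin PCIe auxiliary connector supplies up to 150 W.
slot_power_w = 75
eight_pin_w = 150

max_board_power_w = slot_power_w + eight_pin_w
print(max_board_power_w)  # 225 W ceiling for a one-connector card
```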
GTX 1080 > RTX 3070 > ????
Could it be a RTX accelerator 🤔
RX 7000 is going to have dedicated RT cores rather than just accelerators
I'm still extremely happy with 6800xt 1440p performance. No need to upgrade when already at 80-250 fps ultra all.
Well, I just bought a Sapphire Toxic 6900XT LE with a 360 rad for $999.99 + $60 tax. As a hybrid-cooled high-end GPU it performs just like a 3090 Ti in my 1440p gaming rig. I've had a 5600X since day one; I haven't seen the need to upgrade, as I was always GPU-bound with my 5700XT LC. I hope the new FidelityFX features will still work with this gen of GPUs. I bought it because, after waiting 2 years, the price for this GPU dropped 50%+. It was $2049.99 6 months ago! I hope DirectStorage works with PCIe gen 4 NVMe drives and does NOT require PCIe 5 OR Windows 11! Thanks Coreteks for all your hard work!
Good find. When I checked in on the Toxic with the rad it was still around 2k. I think that was at launch in May, but it might have been June.
I had a 5600X with my 3080 Ti at 1440p. My new 5800X3D was a huge performance increase, give it a try.
Nice, yeah I got a 6900XT Toxic air-cooled for $847. It's a beast of a card, and it undervolts like a champ; I have it running better than stock performance at 200W or less depending on the game played. The fans never go above 30% speed lol, can't even hear it.
@@theplayerofus319 Yeah, I might upgrade/downgrade from my 5950X to the 3D once it gets a clearance discount in 2 months when Zen 4 hits shelves. $250 is a solid price for it, I'd say. Hopefully stock lasts till then.
4:03-4:18 the shots against JayzTwoCents here lol
Well-deserved karma, Ngreedia!
nvidia is by far the biggest bubble in the history of the stock market
Nvidia missed nothing. Lisa Su and Jensen Huang are related by blood. They break holiday bread. A beautiful dance they do.
In Sweden the lowest price on a 3090 is $1578
About $160-180 less here in Norway for the most affordable cards; the Suprim X 3090 is just about $1500
good for prices
I hope for AMD that this kraken is not of the Giuliani's kind...
No worries ...... RDNA-3 is going to beat the RTX 4000 series! And RDNA-4 will completely CRUSH Nvidia!
Nvidia deserves to suffer for how shitty they’ve treated gamers with their pricing and lack of supply when everyone needed it.
And let me guess!! AMD never had a lack of supply during that entire period, the shelves were always full of RDNA2 cards priced at a reasonable MSRP, and they and their AIB partners didn't sell a single card to miners?...
Get real, dude!!! Both were equally as bad as each other. Some people just like bashing down hard on Nvidia while giving AMD a free pass, just because they're the underdog.
Always love seeing this channel putting out videos! Probably my favorite leaker. You aren't afraid to stick your neck out there even if you're wrong! I appreciate it
Zen 4 8-core CPU + RDNA3 GPU!
Interesting. But I see no reason to upgrade for quite some time. Anticipating future tech/prices can leave you holding the bag, no matter what your entry point is. Zero money spent is the best discount. Plus, unless you are a professional/non-gamer user, there are very few games on the horizon that warrant upgrading. The latest Bethesda entry has been pushed into next year, and most other games currently on the market don't push the boundaries in any meaningful way (especially on PC). It's not like the good ol' days of the 2000s, where each year pushed what your system could handle.
I hope Nvidia won't force even higher price on the 4080 because I want one tbh.
I actually disagree with you saying a 3090 is that good at 4K if you plan to use ray tracing too. Even the 3090 needs DLSS for 4K RT.
Does the 4080 Ti usually come 6 months after the 4080?
It always depends on the game. Even a 3070 or an RX 6700 XT can comfortably play most games at high settings in 4K60, no upscaling needed. That only changes if you throw in ultra modes and ray tracing, and honestly ultra vs high mostly ranges from "I barely spot a difference" to "I need a magnifying glass to find a difference", for a heavy impact on fps and frame times.
@@Warsheep2k6 I'm surprised you can even spot a difference between high and ultra in any game released after 2015, let alone while you're actually gaming and not comparing screenshots.
Nvidia is actually preparing the Karen architecture, be prepared
Next gen will be full of gimmicks from both teams in order to justify higher MSRPS than ever before.
Yes indeed, just like GAY TRACING! Not interesting for gamers, only for movie studios!
Well, nVidia can also just try to push for ARM desktops with their DC-originating architectures to disturb the Windows gamer market, or they could push extra features for integration with their cloud gaming solutions... after all, nVidia still holds the crown with their workstation/compute GPGPU solutions. I mean, nVidia could very well offer a mobile/laptop Grace-Hopper-like platform while offering a cloud-based option to run software that's not natively supported on ARM, a somewhat bundled offering, nearly like what Microsoft did with the Xbox S and Xbox Ultimate... sell hardware at a loss even, but recoup costs on a monthly co-sold streaming service etc.
Sounds like my stock is about to go up!
And your capital consists of what: cows, goats or sheep? ))
@@phillgizmo8934 😂 silly man, you literally have no idea 🤣
Good, let the greed catch up to Nvidia
I could see the RTX 4060 or RTX 4050, or the AMD equivalent, possibly catching on akin to how the GTX 1060 did, if only because of the 5-to-7-year cycle of PC upgrades.
Edit: I can also see it meshing with a possible switch towards pure AMD and maybe even pure Intel systems. Assuming Intel can ever pull their heads from their arses and clean the shite from their ears to actually get some proper communication and work done.
great content
1000 series , buy .
2000 series , skip .
3000 series , buy .
4000 series , you know what to do .
Considering miners won't be buying a bunch of cards, and both AMD and Nvidia can't cater to them, I wouldn't buy anything until prices for the top end are less than $800 and the lowest end is $200 or less. Prices need to normalize. Miners buying at any cost is not a variable now; it's a buyer's market. Proving that the "chip shortage" was not the reason for costs, but rather Nvidia and AMD throttling actual production to keep prices up, should provide enough reason to hold out and hit THEM in the pocket for a change.
If people buy used 3000 series GPUs they must be insanely dumb. Those cards have been run 24/7 by miners ever since they bought them.
You must be insanely dumb to think that all RTX 30 GPUs are owned by miners.
gives me a Mantle API vibe
Always appreciate the deep dives, Thanks Coreteks !
Poor NGreedia.. I feel sooo bad for them.
Let's all buy their latest card, the GTX 1630 (even slower than a GTX 1050 Ti), to support NVidia.
Fanboys act that way🤣
@@miyagiryota9238 c'mon it's only US$200..
4:00 calling out jayztwocents i see
I wonder if Apple has some synergy oriented plans in the future, given that they also cover all components of the trifecta that you mention.
Nvidia, thank you for the high prices and fire risks. That's all I needed to make my next GPU decision much easier 😂