5090 could be 70% FASTER than the 4090
- Added: 20 Mar 2024
- Urcdkeys.Com 25% code: C25 【March Madness Sale】
Win10 pro key($15):biitt.ly/pP7RN
Win10 home key($14):biitt.ly/nOmyP
Win11 pro key($21):biitt.ly/f3ojw
office2019 pro key($50):biitt.ly/7lzGn
office2021 pro key($84):biitt.ly/DToFr
MS SQL Server 2019 Standard 2 Core CD Key Global($93):biitt.ly/oUjiR
Support me on Patreon: / coreteks
Buy a mug: teespring.com/stores/coreteks
My channel on Odysee: odysee.com/@coreteks
I now stream at:
/ coreteks_youtube
Follow me on Twitter: / coreteks
And Instagram: / hellocoreteks
Footage from various sources including official youtube channels from AMD, Intel, NVidia, Samsung, etc, as well as other creators are used for educational purposes, in a transformative manner. If you'd like to be credited please contact me
#ai #nvidia #amd - Science & Technology
Maffs hard, and I'm a dummy: Should be 2.7X not 1.7X if it's 2 dies.
Of course there's no perfect scaling though, just like with the B100/200
Was going to say your math doesn't make sense. I know training is different from gaming, but didn't he say it could do the same training at the same speed with 1/4 of the GPUs? 2,000 vs 8,000 GPUs. Now you could say those 2,000 GPUs are really 4,000 since each is two dies connected, but it was also 1/4 the power consumption, so there must be some efficiency improvements, no?
@Coreteks Not only that, but since when does +X% transistors translate into the same X% performance increase? I'd say never, but I'm too lazy to do a thorough check.
Edit: unless we're talking about 70% perf increase from the +170% more transistors, with that I don't have any problem.
You're just trying to cover your behind with your previous 35% uplift prediction by also predicting 70%. So you can always claim "you're right" as a "leaker". GB102 will not be multi-chiplet. Most of the ~70% performance uplift will come from the programmable L1 cache, better clockspeed and ~50% more SMs.
@@louisfriend9323 you caught me man
It depends on keeping the added resources busy - en.wikipedia.org/wiki/Amdahl%27s_law .
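The Amdahl's law point that link makes can be sketched in a few lines of Python (the 90% parallel fraction below is a made-up illustrative number, not an estimate for any real GPU):

```python
def amdahl_speedup(parallel_fraction: float, n: int) -> float:
    """Overall speedup when only part of the workload scales across n units."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n)

# If, say, 90% of the frame time scales across 2 dies:
# 1 / (0.1 + 0.9/2) = 1 / 0.55 ≈ 1.82x, well short of a perfect 2x
print(round(amdahl_speedup(0.9, 2), 2))  # 1.82
```

Which is exactly why "2 dies" never means "2x the frames": the serial leftover caps the gain.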
I have heard this before. I won't believe it until I see benchmarks.
The only accurate prediction will be the price.
When did it ever not come true???
The 4090 is around 75% faster than the 3090, so I don't see this being 70% faster as unreasonable.
The sad part is that the 5080 will only match the 4090, meaning the 5090 will be 70% faster than the 5080, aka it'll be worse than the 40 series with the gap between tiers being 2x wider.
@@NadeemAhmed-nv2br How is that sad?
70% faster and it will only cost your remaining kidney and pinky finger.
You mean you have sold your car and children to get this gpu.
Doesn't it already cost a kidney to purchase Nvidia GPU
A kidney cost $2000 ? Just get a part time job and you’ll be able to save enough money before release
Buy more save more!
@@weyo14 lil bro part time job doesn’t even give you $1000
I don't need it to be 70% faster, i need it to be 70% bigger
What we need is 70 percent cheaper lol
Crazy that TSMC/Samsung/Intel don't make bigger wafers. It seems they are stuck at 300mm.
@@lolmao500 Making a silicon ingot bigger causes a lot of issues and slows down production; it's hard to make it that big.
@@wawaweewa9159 yep...makes more sense with the yields of a smaller chip and it's cheaper to run it hard while slapping on some extra aluminum and copper instead.
That's what she said!
So they'll be $2800 a piece then. Might as well be 500% faster at that price.
It could be 10,000x better i still wont care at a 3k price point lol 😂😂😂
That's not how Nvidia premium product pricing works. 70% faster than the 4090 would be like $4k minimum. Because they're still going to sell the 4090 for $1.8-2k and somehow gimp the 5070 to be slower than the 4090, so they can still charge 2k for the 4090 while the 5070 is out and "only" costs probably 1k. Hey, you can get almost a 4090 for "just" 1k, what a deal!
@@Andytlp What a market we are at now...
That or we can have two 5070 editions one with a lower vram that is just a slightly better 4070 and that "normal" 5070, all with the Nvidia ridiculous pricing
@@ever611 Thats likely too. But unlike nvidia a year or two ago they have infinite money. So for them to be greedy on cents on the dollar that their gaming gpu division is making compared to a.i accelerators, i'd be surprised if they still pull this crap. It happened because they had lots of ampere gpus to sell so they jacked the 4000 series prices for every tier to sell ampere. It worked and people still paid the premium. So theres a high chance theyll jack blackwell prices even higher than ada. We're never getting more performance for same price ever again if so.
@@Andytlp I know I'm in the vast minority, and this is wishful thinking more than anything, but I do believe that the prices of the 50-series will not be as ridiculous this time. If 5080 costs $750-$800, then it's not as bad, depending on performance, that is.
"5090 could be 70% FASTER than the 4090 "
Isn't that what we hear every time?
No, we always hear 2 times, 2.5 times or even 3 times as fast haha. A 60-70% increase is quite a realistic expectation, and it did end up true more often than not.
@@arenzricodexd4409 you're right. 4090 is 70% faster than the 3090. A 40-50% increase is more realistic for the 5090 though
When did it ever not come true?
What he meant is the 5090 makes money 70% faster.
70% faster at heating my apartment maybe.
If Dragon’s Dogma 2 says anything about the future, we need these sorts of leaps because devs can’t optimize their games 😅
To say the game is poorly optimized is incorrect imo. The game has incredibly good graphics. There comes a point where "better optimization" isn't good enough and people need to upgrade their hardware. On the other hand, there comes a point where a better looking game has incredibly diminished returns...it doesn't look that much better to the average person for how much harder it is to run.
What the game does need though, is a low settings performance mode that runs the CPU and GPU a lot less hard maybe with extra downscaled textures, NPCs or whatever.
@@ofon2000 clearly you don’t play many games because there are significantly better looking games that run way better than DD2. Alan wake 2, forbidden west that just released today, cyberpunk, all these look leagues better and more importantly, run better too. So yes, it is badly optimized and that’s not up for debate.
@@ofon2000 you have no idea what you're talking about
I most appreciate the aesthetics of games from the turn of the century - 2D renders and sharp, early 3d. The new aesthetics and the pursuit of realism do not suit me at all. That's why I'm glad that I haven't had to replace my hardware for 10 years hahaha.
Dragon's Dogma 2 is struggling HARD even on the best CPUs today. The game needs much more work under the hood on the CPU side of the render queues; it performs like trash.
If a 5090 were two 5080s fused together, I would expect NVIDIA to price the 5090 at twice the 5080, so I expect them to cost $1500 and $3000.
What new DLSS feature will they lock to the 50 series? Neural texture compression or whatever that's called?
DLSS 4.0 on 50 series only
There must be something for sure, and it would be implemented either in the next AAA game or in Cyberpunk.
@@Ghost-pb4ts Cyberpunk won't be receiving any more major updates, so that's off the table. Maybe Alan Wake 2, though
@@Vorexia nahhh alan wake 2 player base is already dead
And alan wake 2 looks like generic unreal engine horror game with no style
Cyberpunk's open world presents all kinds of environments for graphics to shine
@@Vorexia so if not cyberpunk then probably gta6 will be nvidia new plaything
they are not in the gpu business, they are an ai company
The price? Second mortgage and an internal organ?
I can't wait to have 2 towers on my desk. One for my pc and one for my future 6090!
Nope, the 6090 will be the desk.
70% more performance for 700% more money and 7000 times the power consumption. This gaming thing looks like some exotic hobby nowadays.
I don't know. It's never really been any different, and it's always the "enthusiasts" who get shafted. Your average PC gamer doesn't need path tracing at 4k resolution and 120+ fps using fake frames.
Just in case you forgot: OG Doom required a ~$2k PC in order to run at 320x200 pixels at a capped 35(!!!) fps :) People simply forget that "Ultra" settings were meant for future hardware, "Medium" settings should be the norm, and "High" is for, well, current high-end hardware. These days, the completely bonkers expectation is that sub-$300 GPUs can run every game at high settings, 1440p (the new 1080p), with 60+ fps...
@@totalermist I tend to make future proof purchases. Today is the machine is capable of Ultra settings, next year High, Medium ... and so on.
@@DarkGT Fair enough. But looking at the dev cycles of major games, I still think it's safe to assume that skipping one or two generations is fine even for more mainstream cards (currently that'd be your 4060/7600 up to maybe 4070/7800XT). It's just that expectations have to be managed somewhat, that's certainly true.
@@DarkGT I just use 720p monitor and refuse to upgrade. My gpus last a very long time...
I don't understand how you did the math here, but I will at least provide an actual logical argument.
You're working from the premise that you need a 5090 to game on your computer?
That would be true only if you're gaming at 16M pixels and above, where 16M is dual 4K monitors.
And even then you would be talking about 16M pixels at a 240Hz refresh rate.
Unless you're doing that sort of gaming, you can buy a 7900 XTX and a 7000-series X3D processor and call it a day with a 3-3.5K PC that can wreck 4K 120FPS gaming.
You don't need to exaggerate or make shit up, because a 5090/Ti is a use case for gaming on the G95NC at max output on Ultra graphics, perhaps with ray tracing too, and not for your average gamer's setup.
One would probably need to install 3 phase power to power that new card :D
and an industrial size HVAC unit ;)
This video was 2 DIE FOR amirite?
Ba dum tsss
hahahaha Two videos in a few days, great work Celso.
Dab dab
I wouldn't go that far.. death? For a GPU? Nah.. not really mate!
there is something VERY DIFFERENT about your voice in this video...... it just doesn't sound right.... did you use some AI that you fed a few samples of you talking, so you can type what you want and it'll speak it in your voice??? I can't tell if that's the case, but there's something SERIOUSLY DIFFERENT with your voice, and it's NOT a good thing
4 grand? 😂 They ain't getting shit from me. Bwahahaha!
Don't think they care.
they would say it is actually 170% faster
They said the 4070ti is 3 times faster than the 3090ti, when it was actually slower 🤣
@@coladict Source?
I feel this is more an AMD thing, they hyped up that 7900 XTX and then it was suddenly just a 4080 competitor :p
also cost 70% more than a 4090.
There was no mention of power requirements... What's up with that? Will the GPU require 1000W for itself alone or what???
I don't give a fuck how much faster it is, I care about PRICE and not being ripped off!
How can you be ripped off?? Do they force you to buy?? If you think it's too expensive, just don't buy it. Why do people cry so much? It's not basic food or water, you can live without it, especially since consoles have years to catch up. Just buy a slower card.
You don't have to buy Nvidia products or believe their hyped up lies.
Remember that!
@@ErraticPT I somewhat agree, but what's your opinion of Monopolies?
@@cccalhoun It is not a monopoly; a monopoly is when there is no other good option. People just choose not to buy AMD graphics. That is people's free choice.
@@juriscervenaks8953 So it's a duopoly? What if they collude, is that different than what a monopoly does?
I don't think Jensen is crazy enough to launch a $3000-$4000 consumer card YET. Nvidia will never live down the ngreedia reputation coming from their OWN FANS at that point.
Well, since AMD is out of the high-end market, they'll charge more: 70% more power and 70% more money, and y'all will still buy it regardless.
3-4K consumer GPUs will still sell like hotcakes, it just won't be gamers buying them
You'd need a 1000w PSU just for the dual-die GPU, 1300w if overclocking it.
Why would you need to overclock it? Lol. At what point is there enough frames? Monitors can't keep up.
@@Squidgy55 Monitors can't keep up? We have 240Hz 4k 0.1ms OLED panels now. The 4090 can't produce 240 frames per second at 4k. I doubt the 5090 could either.
@@metallboy25 Like any regular gamer can afford all that shit lol. Don't come back at me with prices coming down in the future because I've seen it all before. Something will replace OLED before it's affordable for the average consumer. It's just a repeat of Plasma displays.
I would not spend over a thousand dollars on a GPU, period. Even a thousand is absurd in my mind.
There are people that make a lot of money, so to them it's not really a big deal to spend a few grand every few years. Then there are people like me who will soon sell the 4090 and make most of the money back, then add 100-300 to get the 5090. I've been doing this every generation.
Brokey
@@Eskoxo Now it won't work anymore ($4k)
There are those who can buy them every day and not feel it, but most of us have a limited budget and, above all, common sense.
@@tristankordek based on what evidence that it will be 4k lol
What would you consider to be the ideal build for a YouTube content creator?
Thank you!
It will take about 9-10months before 5090 comes out.
You can get a kid by that time, sell him/her and get a 5090
I wonder how far we are from being able to use photonics for the chip to chip interconnects.
probably gets almost 60fps at 4k with max settings on the latest Unreal Engine
Unreal Engine has been ruining a lot for me lately 😂
80fps max in the Harry Potter game because of a CPU bottleneck
Unreal Engine has a CPU problem, not a GPU one
Monitor & other peripherals aside, the max I can justify on a complete new pc build is around £1500 (~$1900).
Ignoring that, I'd still hate to see what kind of PSU power and cooling would be required for a dual-chip 5090.
The more you pay the faster the tesselation 😅
im here in my garage with 1060 and i give 0fs about new expensive gpus
Are you looking forward to Battlemage at all?
Thoughts on Intel GPUs?
Salt of the earth
In that case, why don't you save yourself the time and not click on videos about new Nvidia GPUs?
More like 0fps
5700xt still going so stronk
It could be, maybe. But it won't be.
10k$ GPUs lets gooooooooo💰
As the advanced packaging capacity is currently a production bottleneck I do not expect it will be used for consumer GPUs.
I would expect a ~50% performance increase, for the 5090, 30% from additional transistors/SMs and another 20% from clock uplift.
No need for NV to go all out, as RDNA4 is rumored to not have a high-end variant. Maybe NV could launch a dual-die -50 (Titan?) card as part of a mid-gen refresh in 2026, if RDNA5 turns out to be good.
I don't think the 5090 is targeted at gamers, except maybe a small % of rich gamers. The majority of buyers will be content creators and AI content producers. I believe the majority of 4090 buyers were also content creators; I heard rumors from salesmen. PRO Nvidia cards cost $10K and the 4090 was $2K, so Nvidia is filling a niche at $4K for consumers that don't have $10K but still want the best they can afford, and for whom the 4090 is too slow. If true, then it is smart: filling a market gap.
I don't mind if they sell a 5k or 10k USD GPU, as long as there is improvement of 30%-35% throughout the whole stack.
Improvement in pricing upwards perhaps
In blurry DLSS? Or in RTX, which most people turn off?
I'm not really trying to guess at pricing or how many dies and whatnot. I will wait until I see it and then decide where to go from there, but normally the new 80 series is around 25-35% faster than the outgoing 90 series, so 70% better is feasible at least.
Well, I guess my math is wrong, I know you can't get perfect scaling, but double transistors for just 35% more performance seems wrong to me.
I suppose it depends on what they use them for, some stuff on the die will do stuff that won't really translate to benchmarking numbers.
My bad, I pinned a comment about that. Me dumb.
2x die size for less than 2x performance?
Interesting as always
Lisa Su and Jensen Huang are both of Asian heritage. They are both the same age. They both work in the same field.
There are *_zero_* doubts that they are indeed intercoursing.
They are cousins …
@@marcs2960 🤫 and once removed.
@@marcs2960anddddd
More like interconnecting!!!!
Joining chips together is cheaper, not more expensive.
no competition, high price.
Looks like Jensen found footage in the thumbnail lol
No surprise here with the die-to-die interconnects, as these and other advanced packaging features form the wave of the future. Still, the most important transition in gaming and all computing will eventually be the move to SoCs incorporating everything with a large pool of very fast unified memory, similar to what Apple has done with the Mx chips or what AMD is doing with the MI300X. DDR will still be modular and upgradeable; however, it will be treated as a slower tier below the unified memory pool.
If only AMD were as advanced as Nvidia, it would really be an important step forward.
Ada GA100? GA100 was *Ampere* but the process and transistor count is correct.
Well said!
For 70% more money
Fact 😂
The more you buy, the more you save.
It's not a lie, if you believe it.
Give Nvidia some credit, it might just be 50%
80% doesn't sound too bad
12:00 I would like the next generation graphics card to be a significant improvement over the 4000 series in terms of power efficiency. Ideally, it would consume around 30-50% less power while maintaining moderate performance, within a 10-20% increase of the 4000's capabilities of the corresponding sku.
The 4000 series is already crazy power efficient though: you have 4090s running at ~95% performance at 360W/80% PL, and my 4090 uses less power than my 3080 Ti most of the time. It would be impressive if they manage 30-50% less power, but I doubt it. I just hope they at least keep going in the right direction.
Maybe the 5090 might be some dual-chip special, but all the real consumer stuff will be nothing more than 30% more expensive, with the same amount of memory and the same memory bus. But if we see 50% performance improvements on each SKU compared to the 40x0 series, then maybe it will be slightly more swallowable. I can see the 5090 costing nearly $3000, especially if it's some kind of dual-chip thing.
While consuming "only" 50W more
Seems like I have to sell body parts to get what I want.
70% more expensive. FIFY
No way in hell they will make 5090 a 2-die.
Hmmm... guess it's time to update my CV and start applying for a second job...
And 100% more expensive knowing Nvidia. So no need to get hyped...
A serious question: why does nVidia seem to have stopped actual GPU development? Take away all the AI and cache-based changes, and what have they done?
Simple answer: lack of competition on their back.
Yes, there is AMD, but only in lower-mid range GPUs, and Intel is on the horizon for low-end GPUs. So we can expect somewhat reasonable prices in those segments, but not at the highest segment. Every company will squeeze the market as much as the market will allow.
What improvement should be done on the GPU anyway? The only thing most gamers really care about is faster FPS. When Nvidia pushed RT, most gamers shunned it as useless because it tanks performance. Take RT out of the equation and a GPU like the 4090 can give you a bajillion FPS even at 4k. The majority of gamers out there are still on 1080p monitors, so most of them are already doing very well if they can get something like an RX6600 level of GPU.
@@arenzricodexd4409 AI is and will be a thing.
I went from a 980 Ti to a 5700 XT and then to a 3080, so I have been planning to go for the 5090 when it comes, but only if it costs $2500 or less. Over that and I wait.
Exciting. I'm 100% sure I'll get this 10 years from now in order to replace my 15 year old one, for a very good price.
By the way, why aren't they just stacking them on top of each other? What happened to that architecture?
Sad that it will probably be 70% more expensive as well!
70% scaling for dual GPU is about in line with the best-case scenario back when SLI and Crossfire were a thing. I ran both, and had to do tricks like forcing triple buffering to eliminate micro stuttering, but was hit with a ~20ms latency penalty. I call B.S. when he says it won't need programming to work; all GPUs have tuned profiles in the drivers to make them work best. It might not need software changes beyond the normal, but I'm certain there will still be a penalty, in frame times and very slight latency.
I might be going nuts in my old age (38), but GPUs used to run dead solid if it was pegged at 60Hz or 100Hz etc, and now they somehow feel inconsistent. Like frame times are off from the readings. And now Intel engineers are confirming my paranoia lol
I do not have 4k or 2k
NVIDIA needs to work on converting their CUDA library to quantum computing compatible and they also need to make quantum computers as well...
I used to be excited for GPU releases, not anymore.
7:40 GA100 is Ampere not Ada; the H100 is the Hopper-based accelerator.
I've been waiting for their MCM GPUs for years now. I got a 3060 laptop in 2021 or 2022 to tide me over until then, and before that I'd had an Intel 6700HQ laptop to play games on since like 2016.
Is the Coreteks voiceover AI generated in this video? It sounds like it in the first half more so than the second half. Anyway, I reckon the pricing will not justify the upgrade for gamers when the 5090/5080 is released; only productivity/VR developers can justify the cost.
Why would they need 2 dies for the 5090? Wouldn't a 30% increase in density, paired with 30% faster GDDR7 and a higher clock speed, get you to the 70% figure on its own? Or is performance only tied to density?
As far as I have seen, that seems accurate. I would expect 50 percent at least, but a lot of that will be GDDR7 and density, with clocks making up the rest. I can't imagine the GPUs clocking THAT much higher personally. kopite7kimi, whom he mentioned in the video, has been a leaker for a long time and usually has pretty reliable information. I don't see a double-chip design making it to us gamers. In fact I'm almost certain the 5090 won't be that. No way they give that to us with the INSANE profit margins they get from enterprise cards and hardware.
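The back-of-envelope reasoning in the two comments above can be sketched like this (the individual factors are purely illustrative guesses, not leaked specs):

```python
from math import prod

def combined_speedup(factors):
    """Rough model: independent speedup factors multiply."""
    return prod(factors)

# Illustrative: +30% density and +15% clocks alone give
# 1.30 * 1.15 ≈ 1.5x, so memory bandwidth and architecture
# would have to supply the rest of any ~1.7x uplift.
print(round(combined_speedup([1.30, 1.15]), 3))  # 1.495
```

The model assumes the factors are independent and fully utilized, which real workloads rarely are, so treat the product as an upper bound.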
I would consider myself a value purchaser. In the Netherlands I paid 1200 euro for the 4080 last year; the 4090 is about 1900-2000 euro. The only reason I needed a GPU that powerful is because I upgraded to 4k monitors. A 5090 sure would be powerful, but it's not that necessary and so far from being considered optimal value that I would fail to understand a purchase. Guess if you have the money, more power to you, but I don't really see a real reason to upgrade. Maybe 8k?
Why is every single Nvidia product cycle preceded by this exact prognostication?
Is it actually 70 times better, or is it just because of DLSS?
70%**** which I highly doubt. A 70% higher spec doesn't necessarily equate to a 70% gaming performance uplift.
good news for 4k gamers
I doubt they'll release such a powerful GPU for simple gaming, since they're so hellbent on AI. After all, they are pitching this to AI companies, not gamers, they'll probably scrape the bottom of the barrel of the silicon lottery to "give" to gamers, and sell the good stuff to the AI giants. In simple terms, I'd still expect the 5090 to be the 35% or even 40% faster than the 4090, not the professional level 70% faster GPU promised to companies
Same old - same old...
I would honestly prefer to buy a $4K 5090 that is 70% faster than the 4090, rather than a 4090 Ti disguised as a 5090 with 30% more performance at $2.5K.
It's not like the product goes from cheap to expensive; it goes from expensive to expensive. You're paying the price to get the most performance, and in that sense only the dual-die chip version or a 5090 Ti version deserves that kind of hype or price, to me at least as a consumer.
The real question is: if they release a dual-die gaming GPU... wouldn't that finish off the Quadro line? Also, what about export restrictions to China? As much as I like tech and gaming, decisions are still made based on business needs.
AMD expected more performance out of the chiplet design, right? Hopefully they were able to solve the problems they didn't anticipate and get some performance back without too much die-size increase, plus the new node; maybe they can go from 6 to 8 chiplets? Just dreaming here. So no comparison to the 5090 monster, but hopefully a still-fast card that's a lot cheaper.
Q4 is going to be soo interesting!!!
No one has mentioned power draw? 400W + 280W (70%) = 680W
That's mental.
But is the NVIDIA 90 series (RTX 3090, 4090, 5090) even a card designed for gaming? It is monstrous to use it only for that, if that was really the destiny intended for it. The gaming cards are the cheaper ones: the 60, 70 (Ti) and 80 series at most. Moreover, at some point when an RTX 5050 (Ti) comes out, it could be at the price ordinary people like me budget to build a gaming PC.
Brute force and ignorance will always win lol
They are saying that just because they will make something like DLSS 4.0 exclusive to the 50 series and call it a day, and that's a fact. It can be like 15-20% faster at best.
There will be no dual die gaming gpu, as there was no ray tracing coprocessor.
probably it can be done but i don't think nvidia want to go down that route.
Clearly you didn't watch their keynotes lately. The bridge between dies is so high bandwidth that the software can address them as one singular die.
4090 cost 1 kidney, 5090 will cost 2 kidneys and a cornea 😢
@Coreteks - Are you using AI generated narrative? It sounds like you, but kind of generative.
@Druze_Tito I've always been an AI
@@Coreteks I've been following you since the beginnings and I can negate what you say, dear "AI" :)
When one can merge two dies, what's stopping them from merging more dies in a 3D stack? Or am I talking weird nonsense now?
Heat removal is probably the largest factor, followed by signal integrity.
@yurriaanvanduyn check my last video
It makes no economic or business sense for NVIDIA to allocate their precious dies to a consumer 2-die GPU tier instead of selling them at huge margins as corporate enterprise GPUs in the current AI boom.
The entire GTC focus was all about AI instead of gaming
No sane consumer will ever buy a GPU for $4k, or even $2k; with that budget you can easily build an entire desktop PC. At a $4k GPU price even enthusiasts will have a harder decision, but for the professional market that money will not be an issue if the GPU can save time and generate more money or profit.
You haven't heard of simracing, right? We are buying $3k wheels, $2k pedals, triple 4k monitors... Or flight sim enthusiasts. So yeah, I can see people buying a $4k GPU just for gaming. edit: but you are probably right, the dual-die config is for enterprise customers.
You're most likely right, but let's not forget Nvidia is still charging the same amount for Blackwell data center parts despite doubling the die size.
Supply may not be an issue too since they are sticking with the 5nm family instead of moving to 3nm.
@@illumi3604 But isn't packaging currently the bottleneck at TSMC?
i m gonna spend 10k dollars for a 5000 TITAN ! LESS GOOO!!
If cost per frame stays about the same I don't doubt that they could sell $4000 or even $8000 GPUs. Especially if 360-480 Hz 4k OLED displays get released soon.
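Cost per frame, the metric the comment above leans on, is just price divided by sustained fps; a tiny sketch with made-up numbers:

```python
def cost_per_fps(price_usd: float, avg_fps: float) -> float:
    """Dollars paid per frame-per-second delivered."""
    return price_usd / avg_fps

# Hypothetical: a $1600 card at 100 fps and a $4000 card at 250 fps
# both work out to $16 per fps, i.e. the same cost per frame.
print(cost_per_fps(1600, 100), cost_per_fps(4000, 250))  # 16.0 16.0
```

On that accounting, a much more expensive card can still be "the same value" if the fps scales with the price, which is the comment's point.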
Even among my acquaintances I knew a guy who always tried to go for SLI setups when it was almost never worth it.
$4k cost and a 2kW power supply...
Imagine the size of this GPU 😳
They can take their 5090 and stick it 100% not 70% where the sun don’t shine because that’s how it’s going to be priced. F GreedVidia
5080 for sure
If the 5090 is really 70% faster while being on the N4P node, it's gonna be huge, expensive and gobble up tons of wattage.
NVIDIA GeForce RTX 5090 or the 5000 series graphics cards are currently beyond what I'm willing to invest in. My existing PC setup is quite robust, equipped with PCIe 4.0 support, and I don't see the necessity to upgrade to PCIe 5.0 solely for a new graphics card. My system's core components, which include an AMD Ryzen 9 5950X processor, 64GB of DDR4 RAM at 3600MHz, and an NVIDIA GeForce RTX 4080 Super GPU, are more than capable of handling my computing needs until at least 2027. When the time comes to purchase a new PC, I'll ensure it includes a high-performance graphics card that provides a significant, truly noticeable performance leap.
the more you buy the more you save
Haha thanks for the good laugh dude haha
wouldn't adding another chip make it 135% faster and the total gains should be 170%, right?
Maybe a titan version (and other professional cards) of the GPU would cost $2000+,
but I don't think the 5090 would cost much more than the current 4090 (maybe 5-10% more, eg: $1699 or so)
Don't forget that's for gaming (and RTX 4090 owners are probably happy with what they've got; besides, I don't see AAA games on the horizon that will drive the promotion of that GPU... and The Witcher 4 is for much later...)
If the 4090 uses 400 watts of electricity, would the 5090 then use 800 watts? This blows my mind, and with electricity prices always on the up, running this card for 1 hour at today's prices will cost $4. If I use my rig 4 hours a day, my annual electricity goes up by $1,460. Americans might have cheaper electricity, but where I live it is prudent to constantly monitor for lights left on and air conditioning that could be turned off. I'll pass on this portable heater.
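The electricity arithmetic above can be checked with a small helper (the wattage, hours, and tariff below are placeholders; the comment's own figures depend on a local price per kWh it doesn't state):

```python
def annual_cost(watts: float, hours_per_day: float, price_per_kwh: float) -> float:
    """Yearly electricity cost of a device at a steady power draw."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# Hypothetical 800 W card at 4 h/day draws 1168 kWh/year.
# At an assumed $0.40/kWh that is about $467/year.
print(round(annual_cost(800, 4, 0.40), 2))  # 467.2
```

Plugging in your own tariff is the way to sanity-check any such claim, since per-kWh prices vary several-fold by country.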
I cannot even see why this card would be needed if you only did gaming. I'm running an RTX 3080 24GB and not having any issues with any game on max settings; maybe in another 4 years I'll need to upgrade. I wonder if the 5090 could be used for local AI?
I don't think there are any 24GB 3080s? Also 2028 is a LONG way away. 4 years ago there wasn't even an Xbox Series or PS5. Just because you settle for a card like that doesn't mean others will. It's proven that people will buy Nvidia's highest-end cards every generation, basically regardless of the price.