The 4090 is HERE! But Is It Worth It?
- Published 14 Nov 2022
Check us out online at the following places!
bio.link/level1techs
IMPORTANT: Any email lacking “level1techs.com” should be ignored and immediately reported to Queries@level1techs.com.
-------------------------------------------------------------------------------------------------------------
Intro and Outro Music By: Kevin MacLeod (incompetech.com)
Licensed under Creative Commons: By Attribution 3.0 License
creativecommons.org/licenses/b... - Science & Technology
I love my Zotac Extreme Airo 4090, zero coil whine, quiet fans and I honestly love the look. Makes a nice change from the blocky designs.
Thanks for mentioning the driver fix, I hadn't noticed, going to update for my 3060.
I've tested my 4090 FE a little. My sweet spot is around a 70% power limit with some undervolting. The undervolt makes older/easier-to-run games consume less power than a power limit alone.
I have it running in a system with a 5800X3D and my total power consumption is below 450W. My 4K OLED and the second monitor add ~120-150W to that.
With my settings it runs faster, cooler and uses less power than my last two GPUs, a 3080 and a 3090.
BTW I also use a +1000 setting on the memory; the mem temps sit just below the GPU hotspot and never reach 80°C.
But there are three disadvantages: the price, it's louder even at the lowest RPM setting, and coil whine. That said, I can't hear the coil whine unless I stand directly in front of the PC.
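The power-limit percentages tossed around in this thread are easier to reason about as watts. A minimal sketch, assuming the 4090's stock 450W limit (a figure mentioned elsewhere in the thread); the helper name `limit_in_watts` and the list of percentages are just for illustration:

```python
# Power-limit percentages expressed as absolute wattage,
# assuming the 4090's stock 450 W limit.
DEFAULT_LIMIT_W = 450

def limit_in_watts(percent: float) -> float:
    """Convert a power-limit percentage into watts."""
    return DEFAULT_LIMIT_W * percent / 100

for pct in (100, 70, 60, 35):
    print(f"{pct:3.0f}% power limit -> {limit_in_watts(pct):.0f} W")
```

The 70% figure works out to about 315W for the GPU alone, which lines up with the sub-450W total-system draw reported in the comment above.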
I spotted a couple of Zotac RTX 4090 cards in my local Micro Center outlet during one of my recent visits. As cool as Ada Lovelace may be, I'm still looking for RTX 3090 Ti cards for the NVLink functionality to go in my first Threadripper Pro build. Until PCIe Gen 5 properly takes the mantle from NVLink, I will have to settle for the 30-Series Titan-like cards or consider Nvidia's professional cards.
Wendell, great intro! You had me in stitches 😂. I'm still getting bot slapped trying to pick up a 4090.
A major thing that people forget when they talk about Nvidia as a company is that they are not exclusively a video game hardware company. Nvidia is also a data center/artificial intelligence company, and this year was the first year that their data center business actually made more money than their gaming business. Video game hardware is a little less than half their revenue now, so it's still extremely important, but there is so much more to these conversations than just the frame rates for Call of Duty at 4K.
They were a video game card company. Data center and AI came later by accident.
Looking like the melting is a design flaw in the adapter. It does not give the installer/human a positive snap indication at full insertion. This along with OCD cable management is causing heat issues due to bad pin contact. Listen to Wendell people. Always fully insert your dongle.
Looks like 60% power limit is the sweet spot for performance per watt on the 40's. No need for these massive coolers and huge power draw to get an extra 5-10%....imo anyway.
And yeah, the FE 4080 is £1270 ($1500 US) and the 4090 is £1700 ($2020 US) in the UK... The AIBs will be way more... crazy money.
that 35% power target is amazing.
Where can I buy the FSP lifestyle Hydro PTM X? A 130mm 1000W power supply with native 12VHPWR sounds pretty great to me.
Is there a rule to coil whine on the 4090, or is it distributed at random among board partners? Looking at the cheapest Gainward Phantom, but worried about quality control.
I wonder if silicone spray might help with inserting the connector, but the idea of using lube to help a connector go all the way in sounds insane.
Sounds exactly like my conversation at my local MC: "we have 4090s..." "YEA?" ... "Ya... meet us at the register" .. ::gets to the cashier:: Me: "Oh... ok" LOL
I'm building a new PC for 3D rendering 🖼 Do you recommend getting a Threadripper 3960X or a Ryzen 9 7950X?
The tests show the 7950X is faster by 20%... so core count or clock speed?
The UHF reference! I am officially a subscriber😂
Appreciate you Wendell! We shall see how AMD does, but like your pragmatic review :)
Very interested in the 4080 ti.
A UHF and Spaceballs reference in the same video... classy!
What is the excuse now for not having available cards...? As a retired PCB designer: the reason for all the different sizes of caps is to address the entire Hz spectrum range to maintain a specific impedance for signal stabilization/timing. Each size of cap covers a different part of that range.
lol, "let's ramble about capacitors" classic Wendell
Rocking review.
I love Papa Bears review of the RTX4090 Zotac Airo model 🥰😇👍. Wendell always comes across level headed and relaxed in his explanations which is a really great promo for the product/s being shown ✊💪🥰. Zotac do make some amazing products and tend to be played down as a C or D tier product but it’s good to see the build quality etc of this GPU being expressed in a good light. Cheers 🍻 buddy ✊👍
I had a Zotac 1080 for years and no issues at all.
Best power/performance is at 350W, including AI workloads.
Does the thumbnail need a brightness/exposure edit or is every other thumbnail on youtube just hella bright??
4090 scores under 9000 in Time Spy Extreme, what do you think went wrong? (Windows 11, Threadripper 2990). Thanks
Power is a big issue in Europe at the moment. Here in Germany current pricing is hitting 50+ cents/kWh. So having a GPU with that kind of power envelope... welp. It's an alternative to heating your home in case we run out of gas.
It uses less power than a 3090 Ti, and it doesn't lose much performance underclocked. I'm currently not gaming and it's using 23 watts, with fan RPM at 0 and temp at 36°C. Overclocking can get you to 600 watts, and that is 100 watts over an electric heater on low. Just make sure you have a good CPU for heating, or it will never hit its max in most games... or just play in 8K with DSR if you are out of wood.
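To put the European electricity prices mentioned above into perspective, a quick back-of-the-envelope sketch. The 0.50 €/kWh rate comes from the comment about Germany; the wattages and daily gaming hours below are illustrative assumptions:

```python
# Monthly running cost of a GPU at a given electricity price.
# 0.50 EUR/kWh is the German figure quoted in the thread;
# the wattages and hours per day are illustrative.
PRICE_EUR_PER_KWH = 0.50

def monthly_cost_eur(watts: float, hours_per_day: float, days: int = 30) -> float:
    """Energy used over a month, priced in euros."""
    kwh = watts / 1000 * hours_per_day * days
    return kwh * PRICE_EUR_PER_KWH

for watts in (450, 600):
    print(f"{watts} W for 4 h/day: {monthly_cost_eur(watts, 4):.2f} EUR/month")
```

At 450W and four hours a day that is 54 kWh a month, so the GPU's power budget alone is a noticeable line item at those rates.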
I have like 700 watts available to run my PC, or else I will trip the breaker that powers my room lol. I remember the days where 800 watt PSU was overkill even for HEDT dual GPU systems.
Melted cable has also been observed on native 12vhpwr cable (atx 3.0 psu), so yea.. the whole thing is just a mess
Where do you wanna go when you want a name brand spatula at a fraction of retail price?
Pure comedy gold
"Zotac - I guess"
could be their new motto
Zotac is like the store brand of graphics card.
@@JeffGeerling Fr though, plus those cards are chunky af.
@@JeffGeerling mom I want a 4090!
Mom: "we have 4090 at home"
Zotac: 4090 at home
Is something wrong with Zotac cards? Other than being like the store brand? 🤣
@@anemac9 Not really, in fact I have two of their older cards. They're just not 'cool'. But that's a bonus since they're still reliable and easy to find :)
Wendell let’s see 3090 vs 4090 ML benchmarks!
Any completely unfounded guesses at an RTX 4060 ETA? I'm guessing Q2/23
900 watts for a PC to game 😱 the future's not looking so bright 😎
I think companies like ASUS and Gigabyte have their hands in many parts of the market which helps their bottom line with Monitors, Motherboards, Mice, Keyboards etc.
Can you do a round up of currently available GPUs as AI accelerators? Maybe once AMD launches the 7900 series? I'm particularly interested in seeing both what AMD's claimed improvements in hardware and software deliver and whether Intel's claims of amazing tensor performance actually play out. Most machine learning benchmarks I've seen either continue to assume NVIDIA is the only player or are comparisons of hardware that isn't in direct competition (eg comparing Apple M2 against the 3080).
The 4090 has 8x the CUDA performance of a 1070 Ti in AI training. Take a look at the numbers on Geekbench - it is a beast. Highly recommended if you are going to use it. (You can also enable ECC from settings at the cost of 1.5GB less memory)
P.S. The 4090 is a cut-down L40 - a bit fewer cores and a higher power target
@@brutester That doesn't tell me anything about how performant AMD or Intel cards are though, particularly since both are claiming at least some sort of improvement with upcoming or recent cards (Intel claims XMX is way faster than Nvidia's tensor cores on a like for like basis for instance, and AMD is claiming that RDNA3 is going to massively improve their AI game). The 1070ti was never built for AI acceleration so it being way slower than a card with dedicated tensor hardware is unsurprising.
i'd love to see how nvidia and amd compare in say, stable diffusion
@@bosstowndynamics5488 AI acceleration was built on Nvidia cards first. I guess Nvidia saw the opportunity and started working with the AI industry to make "tensor" cores. Nvidia has the CUDA toolkit to help developers interface with the GPU. I don't think AMD has an equivalent. Maybe AMD cards can be used for AI tasks too, but until AMD makes them easy to interface with, they're useless.
Offtopic; looking good Wendell, keep going m8!
It's like the "Cheese Shop" skit from Monty Python. Ask and ye shall... Not today Sir.
900W+ lol....
I'm not sure an extended gaming session should be comparable to cooking thanksgiving dinner.
who is filming at the beginning and how drunk are they
I got lucky and got a founder's edition from my GeForce Experience App notification. If I hadn't gotten that price, no way I'd pay what a 4080 will be going for. The lucky get them early.
I got the PNY 4090 and I was nervous for the same reasons, but this card has blown me away. They did a fantastic job on the cooling. The hottest I could get the GPU has been 65c so far. That’s under heavy load and default fan curve. Not a giant fan of the look of the card, but it’s growing on me.
What an upgrade from a 980ti! Great video!
PNY makes Nvidia's Quadros. They're decent and often cheaper.
Nobody wanted a Zotac so I got one and now I have a 4090 while all the other guys in the drop watch channel are still waiting weeks later.
At the moment it's Tesla-type hype that's setting the market price, which is going to be a problem for them in the long run when the competition finds its feet. After all, they have a large price disparity to defend if other manufacturers can get within, say, 20% of the 4090's performance at a much lower cost to the consumer.
I won't be upgrading my 3090 Ti until two or three new cards are released.
The 4090 Ti that's coming is going to be insane with power... if those rumors are true that it melted boards, PSUs and the connector, it should be interesting
Is that a Vega sign on the I/O bracket, back plate and outer fans? Vega lives on in Zotac
Great video.
4080 would have been a great card at.. $799. You would have to be out of your mind to pay $1200+ for it.
RTX 3080 launch buyers didn't know how good they had it at $799. Who is the demographic for this $1200+ rtx 4080 card? 🤔
Crazy thing is I bought my 3080 during the height of the shortage for $1,200 from Newegg's Shuffle, but it was negated because I turned around and sold my 2080 Ti on eBay for $1,300. I agree with you though, I would never actually buy an 80-class card for $1,200+ during normal times. It's insane they priced it that much.
I had an RTX 3090 Ti; it exploded, killing my 1550W PSU with it
Needing so many caps on a GPU with so many spikes means the card is not well thought out. Zotac looks good, but Nvidia doesn't know what they're doing. Imagine not having enough power to reach all the transistors: you put in caps in case it crashes, and when they all empty they need to refill quickly - spike - lack of power from the VRMs, or bad power design. I'd rather have a smooth electrical design, especially when keeping precious data safe from fire.
HDMI 2.1 KVMs coming ?
You're not a Z-tier YouTuber! Just so ya know. Glad to have graduated to being a regular viewer of your channel.
Zotac's perfectly fine for build quality and durability and performance.
Where they fall down is for people who want to retrofit water cooling.
They have a habit of playing around with their board layouts so, simply because you have a Zotac XYZ card doesn't mean a cooling solution for an XYZ card is going to fit properly.
It's so bad that many third party GPU solutions refuse to make a sku available for a Zotac card. Because they can't guarantee their device will FIT.
1 minute in and I'm wondering whether Wendell will set his power connector on fire before the video ends. This has to be a meme, right?
The power draw for you Yanks may not be important.
But for us Europeans, it is REALLY important, even if I have a bunch of solar panels...
Besides, I don't want a room heater on my desk in the middle of the summer.
If you can’t afford the electric bill then buy a game boy
@@thepopeofkeke
Oh, here we have the Nvidia fangirl who's gonna defend an overpriced hotbox of a card.
Little boy, I was already gaming while you were a dirty thought in your father's mind.
I bought expensive video cards before that was even a mainstream thing.
However, I've grown out of that pubescent fixation on who has the biggest e-peen, and I actually have a life and a family.
Not everyone lives in their mother's basement sleeping with their waifu pillow...
Price/performance is certainly something neckbeards haven't heard of. 🤣
And if you were gonna call me boomer, nope, I'm Generation X, been gaming for 31 years...
If your electric bill is still a problem you should probably stay away from ultra luxury items such as this. Grab a used 3070 and undervolt.
I simply want the pricing of GPUs to go back to that of the era of 2010-2013 where ~$500 would get you best consumer-grade GPU money could buy, and ~$1000 would get you prosumer/enterprise-level GPUs.
However, I know that is unlikely to ever happen.
I got the Gainward Phantom 4090; it was 2400€ (the cheapest 4090 after the initial launch day in Scandinavia). It's the Nvidia reference design like many other cards such as PNY, Zotac, Palit and the like, and it is pretty fast anyway: 2.7GHz boost 24/7, 3GHz OC'd, and the RAM is also a pretty good overclocker. So the reference PCB design with the 450W power limit doesn't seem to hold it back. And yeah, I run the 13900KF/4090 on a Corsair SFX 600W Platinum PSU, no problem, but I don't do any Blender work at the same time as I max the GPU so :)
Why wouldn't you grab a PCIe 5.0 PSU? Like Wendell said, if you can afford a 4090 you can afford a proper PSU.
"Zotac? I mean, I guess." sums up Zotac perfectly lmao
For you maybe, it could also be said about any brand 4090 given the green leather jackets have dictated exactly what the AIBs can do as long as they are all peas in a pod that all struggle to go faster than a Unicorn Founders edition. So your point missed the mark considerably imo.
@@zorbakaput8537 I mean, I guess
The 4090 is... there! But the 1060 is... HERE! BTW Zotac is the only brand in stock for 4090s in Finland. 2500€ holy hell never stop the madness nvidia.
3000$ in Israel...
@@blackjew6827 That's more-or-less the price in Brazil too. Should get worse in the next weeks with shortage of GPUs and changes in the exchange rates.
All this in a country with a 225USD minimum wage.
@@GabrielRodrigues-fg6ex WTF?!
That's dumb. The only reason I'm thinking of buying is Blender and game making, So it's a business expense.
Except NVidia is a monopoly and can just manipulate the market as they like. 3000 series prices are nowhere near the price/performance of 4090 (unless you buy a 3060), even when 4090 price is highly above MSRP... This is nuts.
Spatula City, we got spatulas and that's all
Living with my 1070 and an Intel 4790K and trying to be good for at least another year
Tbh. I'm waiting on reviewers to do a 30 vs 40 vs 60 comparison. Then I'll probably buy from there
Sadly, we don't have a release of Skyrim new enough to push the 40xx cards to their limit yet.
At this time, it's pure overkill. Toddy Howardson might release a suitably demanding Skyrim at some point, but we're not there yet.
You jest, but I'd be willing to bet the Elysium modpack would make a 4090 cry at 4k.
Very entertaining. I hereby engage.
who's holding the camera?
Well, the gamers getting shafted today are the managers and purchasing influencers of the enterprises and data centers of the future. I don't think I would put in a purchase agreement after being shafted, just saying.
It is an impressive but expensive science experiment.
Nothing for daily use though.
why you gotta do zotac like that 😅
Dude! It’s the 4080 that’s here, not the 4090!
“Send it”
Case?
ZOTAC!
Slagle IA no longer a thing with the EVGA fiction that jayztwocents recommended
No one really wants a zotac but they're always cheap lol.
I am still unable to buy a 4090 and it makes me sad.
Damn right its worth it. The most insane 4k gaming GPU.... how can you top this? 8k...
I'd like to contact you about my card's extended warranty .meme
Spatula City FTW!
Seriously, though, it seems that Nvidia is STILL considering gamers as bottom-tier customers.
I love Wendell and Level1Techs - so many pseudo tech-tubers just say something about the 4080, and here we get clever words about the 4090, incl. warranty etc.
Igor'sLAB and Wendell are experts who publish for all to understand; well, TechTechPotato too. Steve and GN are also getting more technical - very nice.
Sadly many others are on YT, and even websites or magazines, just for entertainment. Btw, Anthony should open his own little channel and do AnthonysTechTips hehe.
Cheers, and Wendell + team at Level1, stay healthy and keep up your excellent work!
Pepe
That single GPU is more expensive than my entire PC... I like fancy graphics, but I don't like them that much.
If you make a bigger die, there would be room for more caps. 🤔
$600 6900xt looks like a hell of a deal.
Still not quiiiite cyberpunk 4k 60fps native no AI
Zotac has really come a long way!! Highly underrated nowadays!
Indigo Renderer or it didn't happen! xD
IDK, I've been away from GPUs and 3D gaming in general for a while. But it just seems to me that there should be some kind of holistic thinking here. If every *other* piece of hardware in your computer can do so many amazing things with such low power requirements, why are GPUs alone requiring 1000W? That's data center territory. Do we need that to game?
I think it is done for stability reasons, since no one knows what parts you have. Same with CPUs. Undervolting can bring power down because you are tuning for your specific parts and setup.
I have been running my 3090 at 110 mem for over 1 year :o
The electrical wiring in a house being a problem for this kind of all-out draw from the wall - is that a consequence of the 110V thing? (Yes, I am aware you have 220/230V just like us Euro people, just connected differently)
Not American myself, but basically yes. US single-phase circuits are only rated to carry 1500W continuously even in ideal circumstances, and older homes there frequently have inadequate wiring to hit even that safely for long periods. 220V split-phase outlets can fix that but are generally only wired where high-current appliances are expected - the oven and dryer generally, and more recently the garage for the odd house wired to charge an EV or two. That means that if power consumption keeps rising, US gamers will soon need to spend thousands on home electrical upgrades in addition to their 5090s.
@@bosstowndynamics5488 It's 1800 watts for a 15-amp dedicated line and 2400 watts for a 20-amp multi-outlet line. However, many outlets are only rated for 1500 watts, as that is the outlet rating for a non-dedicated line. Most competent electricians use 1800-watt receptacles in kitchens, but most computer rooms will only have 1500W outlets unless the installer did them all as commercial outlets.
You can also run into the issue of having a window air conditioner and a PC on the same line.
@@abram730 My understanding is that the US also requires that continuous loads not exceed 80% of the circuit rating (i.e. 1440 watts on an 1800-watt circuit), which would include a PC. The competence of modern electricians doesn't help older homes with bad wiring either.
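The US circuit figures in this exchange can be checked with simple arithmetic. A sketch assuming 120V nominal and the 80% continuous-load derating the replies describe (the helper names are just for illustration):

```python
# US branch-circuit capacity, using the figures discussed above:
# 120 V nominal, 80% derating for continuous loads
# (e.g. a long gaming session).
VOLTS = 120

def circuit_watts(amps: float) -> float:
    """Full rating of a branch circuit in watts."""
    return VOLTS * amps

def continuous_watts(amps: float) -> float:
    """80% continuous-load limit for the same circuit."""
    return circuit_watts(amps) * 0.8

for amps in (15, 20):
    print(f"{amps} A circuit: {circuit_watts(amps):.0f} W rated, "
          f"{continuous_watts(amps):.0f} W continuous")
```

This reproduces the 1800W and 2400W ratings and the 1440W continuous figure quoted in the replies.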
It's bigger than a nuc
Less silly good
Actually the 4090 FE consistently uses less power than a 3090 ti.
Wendell you seem to really prefer AMD products. I like whatever has best performance when it is time for me to upgrade. Currently debating on 7950X or 13900K or wait for Intel HEDT rumored for January 2023. Do you prefer AMD? If so why? Any thoughts on those 3 options although one is not out yet? Thank you in Advance.
I just got a 7900X, previously an 11900K. So far I'm loving it. With a quick overclock based on Wendell's guide I'm boosting to 6GHz on multiple cores. The computer feels insanely quick, and I didn't realize how much performance my 3080 was leaving on the table on the 11th Gen Intel system. I guess the big reason I went to AMD was the extra PCIe slots on the E-series motherboards, so I can add a few additional M.2 drives.
AMD makes more sensible and competitive products generally.
Nvidia has developed a STRONG bias towards gouging gamers, which started when the crypto craze happened and miners were paying 2-5x the price for GPUs!
Also, people have gotten used to mostly using Nvidia GPUs for gaming. (Just look at Steam's discrete GPU data.)
Wendell reviewing a monster - this is going to be fun.
I won't buy this card because of DP1.4. What a huge oversight.
nVidia is jealous of Apple.
6900XT for $500-$600 is a STEAL IMO
I've only seen 6800xt closing to $500
6900xt is still between the $600-$700 mark. I think it might stop at the $600 mark. AMD really wants to get rid of rx6000 before their rx7000 series.
@@crazybeatrice4555 even @ 700 its a steal!
No!
4090?
Yeah I was also thinking it is a mistake ;) By now I would expect 4080.
Wendell was not part of the Nvidia launch, so he got his hands on a 4090 via retail and is reviewing the card now. 🤷 It is what it is.
Weird color banding on this video??