One kidney, please! - NVIDIA RTX 4090
- Added 16. 05. 2024
- Get $25 off all pairs of Vessi Footwear with offer code shortcircuit at www.vessi.com/shortcircuit
The 4090 is here! We can't turn it on or talk about performance at all, but we stole the Zotac 4090 Extreme Airo GPU from Labs so we can unbox it for you and talk about what we should expect from 4090 performance.
Buy a Zotac Gaming GEFORCE RTX 4090 AMP Extreme AIRO: geni.us/SBp2
Purchases made through some store links may provide some compensation to Linus Media Group.
Want us to unbox something? Make a suggestion at lmg.gg/7s34e
► SUBSCRIBE ON FLOATPLANE: floatplane.com/ltt
► GET MERCH: lttstore.com
► AFFILIATES, SPONSORS & REFERRALS: lmg.gg/scsponsors
► PODCAST GEAR: lmg.gg/podcastgear
► SUPPORT US ON FLOATPLANE: www.floatplane.com/
FOLLOW US ELSEWHERE
---------------------------------------------------
Twitter: / shrtcrctyt
Instagram: / shortcircuityt
TikTok: / linustech
Facebook: / shortcircuityt
CHAPTERS
---------------------------------------------------
0:00 Our 4090 FE was stolen
0:53 Unboxing and basic specs
2:36 Initial design impressions
3:12 What we know about performance
3:51 Sponsor - Vessi
4:32 DLSS 3, RTX Remix, AV1 Encoding
8:31 More on design and current 3090 pricing
10:15 Outro
Stay tuned to LTT for the full 4090 review coming soon! Would you consider buying a 4090? What GPU do you have right now?
AMD Radeon 6600... 😷
Ryzen 5600x
3070. I would like to upgrade to a 40 series so I can 4k game with less stress on my GPU.
I have a Razer Blade 14 with a 3070. Bought it in the summer of ‘21 for a solid PCVR experience to couple with my Quest 2 and I have no regrets :) everything is between 100-120fps in VR.
Sticking with AMD.
3070 3700x enough for now
With the size of these things it's starting to feel like installing a PC into your PC.
funnily enough pcie "PCs" (DPUs, SmartNICs, etc) are much, much smaller
Well, for all intents and purposes, a GPU IS a PC inside your PC. It has a processor, "chipset", memory, I/O, and power delivery circuitry.
@@Renuclous Nooooooo..... A PC is a PC, not because of only hardware... software is part of the equation.. Linux, Windows, macOS.
@@Todd_Manus firmware
yo dawg i heard you like PC's
Just in time to function as a space heater in winters.
😂
Yes but will anyone in Europe be able to afford to run it?
@@davidlazarus67 good thing not everyone lives in Europe then.
@@chitorunya Yes but more than double North America. We are a much bigger market than the USA.
Any 110/120V users running an actual space heater in their gaming room will have their breaker trip.
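A rough sketch of the arithmetic behind that breaker claim; the wattage figures and the 80% continuous-load rule of thumb are illustrative assumptions, not measurements:

```python
# Rough check: can a space heater plus a high-end gaming rig share one
# 15 A / 120 V North American circuit? (Illustrative numbers only.)

def amps(watts: float, volts: float = 120.0) -> float:
    """Current drawn by a load at the given voltage."""
    return watts / volts

heater_w = 1500       # common max for a plug-in space heater
rig_w = 450 + 250     # ~450 W GPU plus a rough guess for CPU/board/fans

total_a = amps(heater_w + rig_w)
breaker_a = 15 * 0.8  # 80% rule of thumb for continuous loads

print(f"Combined draw: {total_a:.1f} A vs. {breaker_a:.1f} A continuous limit")
```

With these numbers the combined load lands well above the continuous limit, so the breaker tripping is entirely plausible.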
I'm so glad he brought realism to the durability of power cables. I can tell from when they mentioned the lifespan that most people have never heard of the rated lifespan of cables being plugged in and out, and freaked out over something they had no idea about.
10 times is nothing
Size-wise it reminds me of the Sound Blaster AWE32, which fit in an ISA slot.
It was so long that in our PC case we had it in the bottom slot, resting on a small rubber block to prevent it from hitting the bottom of the case.
Only 1,600 before tax. Wow, that’s more than a lot of people’s whole build
Could very well get up to 1800 depending on which manufacturer you get .
It will be around 2200-2300 in my country which is like three times more than the average monthly salary 😬.
I really wonder how many people will buy even the 4080 12 GB also known as the new 4070. I bet even the true 4070 will be around 800-900 here which is awful.
RDNA 3 please save us with more reasonable pricing.
My house costs less than that
Yeah tell me about it. Mine was $1,250 and I still felt like I was over paying.
Brooooooooooo.... that would be about as much as my whole PC! I have a 2080 Super, but I am pretty sure that is only a fraction of my total rig's cost... I did get my PC on sale though!
Every time he slams it on the table I get a sharp pain up my spine. This is no way to treat your kidney!
Why not? I treat my liver like that every other weekend
Nvidia is a piece of turd, which makes their products turds. Not kidneys. He is slamming a turd on the table which should make you feel repulsed
@@nickevison391 By slamming it on the table?
@@stevenjoummaa4640 metaphorically with alcohol, yes
The sizes of these things are getting to the point where Cloud Streaming can really become a more appealing option with a bit more advancement of the tech. I saw a video with the 4090 ROG Strix and it requires a full ATX tower due to how large it is. That's insane. The heat output could probably take care of an entire 3 bedroom house in winter!
Imagine one of those in the summer without AC / bad air circulation in the room... It reminds me of a guy who got brain damage from a heat stroke caused by his crypto mining hardware.
A typical single room space heater is upwards of 750 Watts, some even up to 1500 Watts. So no, this card could not provide the same heating as 3 of them.
400W < 2250-4500W (Incase the math confused you)
@@Adaephonable wow no need to be condescending! Obviously my comment was hyperbole...yeesh!
Could someone explain to me what the hell this is ? I’m lost what is this device used for?
@@Lucas_Simoni Chubbyemu fan?
We went from GPU Sag to GPU Gansta Lean
Just imagine, the 5090 will use 2x PCIe x16 slots and it will come with an additional 600W PSU in the box😂
In the box? Not a chance.
The next one will ship with an ATX wallpower plug at the back. Just so it can take a whole breaker for itself. \sarcasm
Nah it will come with a dolly so you can move your appliances hooked up to the 240V outlets
Only 600W?
@@nielskoolstra that actually makes sense and id be fine w it
With how big GPUs are getting, I can't wait to see someone make an SFF PC out of the shell of one.
Right, I am taking my Star Wars Titan XP enclosure into a RPi build, but it is pretty small inside .
😂
At this point you're modding the GPU to put a PC inside it.
@@john-paulhunt6943 It gets to the point where the majority of people just don't need a better one. Games are made with the majority of people in mind; medium will just be treated as the ultra setting, and the newest GPUs will play games above that.
what does any of this mean
I think these new cards are a prime example of why Peltier TECs should start being implemented. A solid copper plate >>TEC>> with a thin-fin heatsink designed to have air forced through as fast as possible, or even water cooling.
You are aware of how inefficient Peltier coolers are? Instead of a 400W card you could have a 600W card that does the exact same thing...
@@Adaephonable I'm more concerned with getting the heat off the card and into another more efficient medium as fast as possible. If we do that the wind tunnel might be able to do some real work. Hahahahaaa
70 games from 2010-2023, a few per year, a good mix of styles I plan on sitting down and gaming through for the first time.
2 games only do 90-120 fps maxed out.
2 can max out, but I like to lower settings for 120 fps.
The rest of the 70 can still do 120-140 fps, or 130-150s, or stay at 150+/200+ fps.
I'm on a 1440p 165Hz monitor:
ASUS Z490-E GAMING
i9 10900K, 10 cores @ 5.0GHz
32GB DDR4-3000
RTX 3090 OC'd to 2100+9900
2x Samsung 970 1TB
1000 watt PSU
Sitting easy and gaming.
How does a modern GPU not have DisplayPort 2.0?
It's new, but it's not modern
smh my head
Arc a770 has display port 2.0
buy intel gpu . 😂😂👌
New AMD CPUs support DP2.0
Can't wait for the day a manufacturer comes up with the idea of using 2 PCIe slots with one card. One for data, and the other for structural integrity.
By structural integrity you mean to prevent sagging? JayzTwoCents made a video on how to do it.
...what do you think the massive heatsink does?
Yeah that would be smart. Second PCI-E x16 slot is unused by the vast majority of gamers anyway since SLI is a thing of the past so this would be a good way to make use of it.
@@st0nedpenguin To transfer heat and make the card super heavy? I'm not sure what you're implying here.
@@nabawi7 The heatsink provides much of the structural rigidity.
NVIDIA GeForce RTX 4090 Installation: czcams.com/video/JqxL1y8n16Q/video.html
Benchmarks? Where is the video for it? I also wanted to know about the fan noise people were complaining about. Thanks !!!
The 40 Series feels like connecting a PC to the GPU
connecting pc to the heatsink
That's always been true as the GPU has always been the top consumer of power. Now it's more than ever... but that's never changed
@@ejkk9513 its more in reference to the size
Wow so original and SOOO funny I can't stop laughing
Lol
This GPU looks like it needs its own case to be held properly without ripping the PCIe connector off the motherboard
Seriously, how do they make sure the card's PCB doesn't just snap? I'm less concerned about the motherboard; it's loaded along its length and not perpendicular(ly?), and the PCIe connector is soldered in with so many pins that it should hold up fine.
That's what I said on another guy's video, you need a construction licence to build a pillar for this card
@@LuLeBe First off, it's "perpendicular to", since you seem to be wondering (just trying to be helpful, not sarcastic, I don't know how my comment comes off).
Now, if you've ever worked on a PCB, you'd know these things are tough as shit. They're just layers of metal on top of each other, really. I tried to drill through a CPU once (I killed it by mistake and chose to make a keychain out of it instead of making e-waste) and it broke my drill bit. I thought the PCB can't be that tough. Wrong. I was worried about the IHS, as I didn't have a proper drill bit on hand to drill through metal, but it chewed through the IHS easily and quickly. Then, I thought the PCB would be a piece of cake aaaaaaand it wouldn't drill through and I snapped my bit in half. I had to borrow another bit to finish the job (while keeping the same hole size).
450w, that's more than my entire PC. 65w ryzen 1700, 175w rtx2070. 65w monitor 1440p 144hz LG32GK650f.
Depends. If the stiffness is good it should stay straight, and the weight distribution should be okay.
Fascinating how excited we all get about a massive fan and some RGB lights
2:26 yeah dude, I also love the Radeon 5900x, my favorite cpu by far, the arc 12900hk isn't bad too
LMFAOOOO
I had to replay that part myself just to make sure I heard Linus right
ryzen, radeon, rtx, ratatouille... a lot of r's.
Lmfao - I was like, did I miss a new CPU release or something? There was no * to correct it as well.
I'm just entertained nVidia is recommending you pair it with an AMD CPU, though the lower-than-intel power consumption was probably more important than Intel not competing with their GPUs
The first ever computer was over 300kg, looks like we are moving in the wrong direction
299kg of that was not for cooling though
The chips keep getting smaller.
It's the cooling that is becoming bigger again.
@@RadioactiveBlueberry True but shhhhh
@@moortu are these chips actually smaller though? the problem is that transistor count keeps increasing while die size stays the same or shrinks
You're just considering weight, what about the sheer computational power 🤔
The card has mesh at the back to blow out hot air, but the fins are covering it. No problem, 3.5-slot height is the solution, just throw more heatpipes at it
Man, that's HUGE! I would think that this graphics card needs a special extra large case.
You would be correct my friend! Did an upgrade and the card is so massive it's almost touching my front fans. Mid tower
That’s insane that a card that expensive doesn’t have DP 2.0
That's Nvidia for you, they know people with more money than sense will buy their bs GPUs now matter how much they hobble them...
If AMD 7000 series gets dp 2.0 then Nvidia will become peasant status!😜
Nvidia is long overdue for an ego correction!
Wouldn't that be on Zotac in this case? Pretty sure the manufacturer chooses which ports to use.
@@verakoo6187 no.
@@verakoo6187 no, Nvidia locked it down
Not sure about other people, but I'd be really interested to see a "we made a computer more efficient than a console" revisit, where you try to take a 4090, and undervolt it to be about equal with a 3090 and see where it ends up power wise. I'm actually super curious to see what happens when you don't go completely overkill with this generation's cards
You can undervolt a 3080 to draw 100W less power and it only loses around 1 to 2% performance.
@@SolarisUK now this is awesome. Any video on that? Would love to watch something like that
@@gamepadlad czcams.com/video/FqpfYTi43TE/video.html
@@gamepadlad silicon lottery. Usually you can undervolt so it draws 80-100w less in afterburner without even touching the core or VRAM.
And don't underestimate the power of 2K resolution. I much prefer 2K and higher settings to make the environment more epic 🙂👍
Wow, Zotac does it again with the coolest design ever!
Yo Linus thanks man, when you smashed the gpu on the table and bent the cables aggressively, I was reminded why I still watch you after all these years.
Yea…Jesus Christ but thank god it’s only a zotac
Lmao! I was thinking about how his handling of tech, and GPUs specifically, hardly fazes me anymore 😂 it used to make me so uncomfortable lmao!
I like the part at 6:08 where he says that being able to change the game could be a game changer.
I'd hope it's a game changer.
Every 60 seconds in Africa, a minute passes.
I'm still on vega 64.. bought 2 for $500 a couple of years ago, still great in all my games and still good in blender rendering (hence 2 of them).
Hopefully Display Port 2.0 appears in the RTX 4090 TI. If Intel can do it on their ARC GPUs surely NVidia can.
I love the EVGA model. The RGB is sick, and those fans are… man they are quiet!
hah
Since EVGA won't be putting it into production it will be very quiet.
@@jondonnelly4831 insert capt america meme
I've heard that the power consumption is incredibly low
It does have some insane technology, in addition to RGB channels it even has an Alpha channel!
I highly support EVGA's move to drop Nvidia.
And more people need to do that if they're going to listen and make changes.
It’s been over 10 years…and Nvidia still isn’t listening. They’re like the ‘Putin’ of the Graphics cards market.
lol lmao
Intel's Arc isn't quite good enough to drop Nvidia yet, especially since the main game I play is DX11 only...
@@edragyz8596 I mean I don’t even want arc to drop nvidia. What would be really interesting is if Intel basically matches older generation GPU performance and offers it as alternative to the latest, greatest and expensive GPU’s.
I mean, in 3 years the 3070 is still gonna be great for most games, and Intel could simply aim lower and offer entry-level 1080p/1440p cards that don't compete with the best but offer solid performance for a lower price.
Like I am not looking forward to my GPU dying in 2-5 years and having to upgrade to a brand new brick that rips my wallet's contents to pieces. I just want an instant replacement priced in the £250-£429 bracket
It’s definitely going to be something that develops further as prices continue to go up at the top end
@@TheArsenalgunner28 That's all fair and all, but it seems Intel is the only one who cares about beating Nvidia in the feature game. If Intel doesn't release an xx90 killer at some point, I'll still be stuck buying an xx90, because AMD just keeps MISSING.
I will be waiting for the mini itx build inside a 3D printed rtx 4000 series case (or something along these lines).
going to get a box that houses 2 of these, Need them for Cinema 4D + Redshift.
Linus carefully handling the Arc A770.
Also Linus desk slamming the RTX 4090.
Aluminium brick is pretty impact resistant
I’m waiting for the width of these new high end GPUs to need two PCIe slots to mount in a case. The first slot is the actual data slot while the second one is for support to avoid GPU sag.
Still have a Corsair 800d case with a soft tubing build. Never wanted to dive into hard tubing but now it seems I won’t have a choice! My case won’t fit a new 4090 card because of all the water cooling stuff! I’ll stick with my rtx 3080ti for now since I game at 1440p and I’ll build a new system in a couple years so I may try hard tubing! I just wonder how big these cards continue to get lol!!!😅
I remember when a GPU needing an 8 pin was insane, but 12 is ridiculous
It has 32 pins for your power supply. I'm not sure what that other dongle is for. Linus says it's for communication to the power supply. Probably not needed. My 800 watt PSU has four 8 pins. It may run it but not with a hot CPU.
@@XGalaxy4U Soon you'll need your own wind turbine to be eco-friendly, or a wind turbine subscription where a few people borrow one and split the bill
I remember when 500€ for a GPU was insane
I remember top-tier GPUs costing $400. That's how much a brand new 2900XT cost that I bought with my university scholarship savings.
I mean, I remember when we didn't even need extra PSU connectors for GPUs, or better yet, when we didn't have GPUs to begin with. Times change, technology moves forward.
They're only charging a kidney for these? Amazing, I thought they'd cost an arm and a leg.
(Studio audience laughter)
Bazinga type humor
The title is a joke in bad taste, for me
Next gen pricing will be your first born.
@@darylallen2485 🤣🤣🤣🤣🤣🤣🤣
That thing looks like a small skateboard, just add the wheels!
Could you guys show the machine learning performance since that is what I am interested in, possibly using a CNN or GAN and timing the training period?
The PCI express connector needs to be redesigned as a cable and you just mount the module inside your case like a PSU.
I think we could use a generation where the focus is on making things smaller and more power efficient with only minimal performance increases.
No we dont
@@scotthadley92 We don't what?
Yes, but no one would buy it. Do you really want to pay more for efficiency when, with the cost of electricity, no typical user will ever see the difference on their electricity bill?
@HackerMode If you want a cheaper, more efficient card, just go with a lower-tier card of the new generation (the 4070 is a bit like a more efficient and cheaper 3080-3090). I get the point of stopping the power-hungry cards, but I think with the current market you can already do that if you go for a smaller graphics card
I think what you're trying to say is we should focus on making these cards draw less power overall, instead of just being more efficient per watt. Ultimately, when this card is pulling 400+ watts, that's a shit tonne more than the majority of other cards on the market, but the performance gains don't necessarily reflect the power draw. I totally understand!
6:20 "could be an absolute game changer"
I see what you did there........
I like how Linus seems to know how much a kidney goes for underground.
A heatsink _that size_ is not just ridiculous, but bordering on impractical. Not just in the sense of space, but the weight of the card as well, which is getting into requiring a support bracket to keep it from coming out of the socket.
It would make a hell of a lot more sense to turn it into a kind of AIO with the radiator/fan attached elsewhere, or redesigning it to fit on the card, if possible.
Achtually a ton of them will ship with support brackets. Go watch the Gamers Nexus video about AIB 4090 announcements. The marketing they came up with for their GPU support brackets is hilarious.
Right, when you implement a solution to fix a thermal problem that introduces a new issue. I want to design a new GPU Bra. Make it all lacey colorful. Or a Sweedish model pump style novelty gpu riser.
They're making ones with aio radiators like the Msi Suprim 4090
@@gamingmarcus Yeah I saw. It's stupid to require a bracket for this.
@@BrentJohn My next card will be getting some support no matter what.
I have a 10 series card in my system, which is around 5 years old, and despite being a relatively small 2-slot card it has developed some noticeable sag over the years because it has no support at the back. So this isn't just an issue with chunky cards.
I don't know if this would ever create real problems down the line but there's hardly any effort in putting a 10cm piece of plastic under your GPU so I'll just do it.
That cooling system is freakin huge. I bet you can heat your entire house with that card.
Probably not the houses that people who get these live in
This thing consumes 450 watts.. the only card that uses this much power..
@@bstaznkid4lyfe392 3090ti also 450watt.
Space heaters are obsolete, just buy a modern computer.
Available at your favorite online scalping outlet stores and auction sites!
We're going to need deeper pockets to run this card and also a few extra reactors for good measure. Stunning card though.
😅😅😅
I hope AMD will be laughing all the way to the bank in november...for our sake
Definitely skipping at least the 40 series card because this shits crazy
And Intel, too, with prices 5 times lower than this absolute spit in the face.
🤣
@@512TheWolf512 and like 7x weaker performance tbf
@@ntdspades5971 I reckon at this rate we will be waiting for 60 series cards. I find it hard to trust AMD/ATI, as every GPU I've had from them has either failed or had driver issues. Yes, that may have been fixed, but I don't trust them in my work PC after so much lost data from random shutoffs/dead GPUs. I would never pay this much for a GPU either, though, so the waiting game begins.
Looking at those coolers, I am actually expecting orienting the fans sideways at some point, if you're already 4 slots wide, you might as well exhaust out of the back of the case.
I was gunna say the module idea is a very likely outcome. I guess we will see
You've got almost 100 members on your team and one still f'd up the color grading on this video, especially on your hands.. :D The info, btw, is much respected! :)
Anybody else getting old LTT nostalgia with this unboxing? Love the current main channel stuff but it's nice to have that feeling of old comfort once in a while with Linus
From what I remember that was the whole intention of creating this channel, to have a place where their old style could continue since they want deeper dives and pieces that tell a story in the main channel.
I'm still amazed that Zotac is still around. Amazed and grateful, because they're one of the few companies that still get weird with their product designs. Need an overpowered and very short GPU? Check Zotac, they probably have it. Need a low-profile GPU? Check Zotac.
I probably just got unlucky but a Zotac GTX 260 card was the only GPU that ever randomly died on me and the warranty process was a nightmare. I don't think I even got a replacement in the end. I haven't trusted Zotac since then, but to be fair, that happened like 14 years ago.
I wasn't grateful when they cranked their msrp to scalper levels, and were selling $700 3060s, $1200 3070s, and $2500 3090s on their own website during the shortage. Showed how much they cared about their customers then.
@@tibor29 Yeah, DoA products and other faulty products happen to every manufacturer. It's a real shame that EVGA got burnt out on the graphics card market because they legitimately had the best customer support for their stuff. I more appreciate Zotac for offering unusual products than for their overall quality.
I have a Zotac OC Mini RTX 2070 and it works great; haven't had any problems with it. Zotac has become a better company than they used to be. Their customer support has gotten better after a lot of complaints, and the quality of their GPUs has gotten massively better over the years because consumers complained about it. Zotac has clean-looking cards as well as funky-looking ones, such as the one in the video
Zotac has a history of doing low profile cards, fanless cards and all kinds of stuff but look at reviews before you buy there have also been some really bad quality stuff by them from time to time.
a real test of player skill is playing through the hardest levels dropping to 20 fps when bombarded by enemies at all sides
Still rockin’ the 2080Ti but this is looking interesting, been saving for a new graphics card and a few other parts for a few years waiting for this bad boy… will hopefully impress on the next vid you make!
I've never been less excited for a gpu launch...
Because you can’t afford the 4080 and “4070”
Has nothing to do with this particular card
It’s pricing is in line with last gen
@@eliashabash7591 mate. That's not the issue. It's just a stupid buy in all the ways you look at it. And Nvidia is literally worse than apple so it hurts to support them.
@@eliashabash7591 Yeah the 4090 is in line with the 3090 pricing, but the 3000 series promised much bigger improvements over 2000 than this does over 3000. So it made sense that 3000 was really expensive, because it was such a big step. That launch was 2 years ago, by now the prices would usually (without miners) sit at like $550-600 for a 3080 (MSRP was 699). And compared to that, the price has really gone up. And you know that actual prices will be even higher than their MSRP.
I remember when a friend got a 580, it cost like $470 or so new. My 970 was 299. My 2070 was around $420. And the current 4070 costs $900 now???
@@LuLeBe Its ludacris..
@@SpartanArmy117 Should have told Germany that when they relied on Russians for their Energy.
All hail the 7 slot card, imagine the cooling fans that you could fit and the amount of cooling that you could produce...
whole case cooling xD
They could use the "Blowzooka" fans from Delta Electronics that they tested on LTT :D 7400 CFM or so!
@1:03 Linus gives me hope. Linus is living proof that technical details do not matter.
3080ti here, I’m going to wait until the 40 series ti cards come out and need to see benchmark comparisons with the 30 series.
With the introduction of these 40 series cards, I think I'll get a 20 or 30 series
Most people don't even need a 30 series gpu let alone a 40 series monstrosity like that.
A 2060 Super is highly recommended. It runs most games at 70+ fps at 1440p ultra (I wouldn't raytrace though), and worst case you drop the resolution to 1080p or reduce quality on distant shadows, because those are pretty taxing for something you will never notice unless you go Batman detective mode on your monitor over something as trivial as far-away shadows. Also, after doing a little research I found that the 2060 Super is about on par with the 3060: the 3060 handles lighting better, but the 2060 Super runs calculations and physics and everything else slightly better, and they're around the same price. If you get a 20 series, get the 2060 Super; if you want better than that, you need to go for 3070 tier or better, otherwise you are wasting money
I got so confused with the Radeon 5900X CPU (2:26), but it was just a mistake. I started questioning everything I knew.
Lol same! I was looking for this.
might have to start playing seasonally.. this 4090 will turn my room into the sahara
Thanks for mentioning the legacy DisplayPort disaster!
Having a Zotac 3080, this made me happy to see Linus considers that the 3rd best thing ;-)
Scored one a week after the launch of the 3080 at pretty much MSRP. It's been serving me very well since then.
As much as I kinda like the new power connector for the RTX 4000 Series, I kinda wish they had placed it on the back side of the card (Similar to the RTX 2060 FE and some Quadro cards) for a cleaner look on the inside in terms of cable management
Why can't they just add it to the mobo? Not like this is a new issue, why not just add compatibility to the motherboard?
@@bthatguy1181 Because the PCI-E slot can’t handle the power needed either.
@@HVDynamo They could come up with a standard connector that fits right after it to handle the watts needed.
Its not like PCIE hasn't been changed again and again either.
I think the issue there is the extra wiring required to extend the power connection to the board from the end, instead of surface-mounting the connector directly to the board.
I think GN had complaints about how the 2060 FE's connector made teardown a lot more difficult, as well.
@@bthatguy1181 Then you need to get PSU, MB, graphics cards, and likely case designers in line for it.
And you probably can't be backwards compatible. You can adapt old PCIe power connectors to the new one. It may not be ideal, but it is viable.
Then you have to consider managing it for vertical-mount etc.
And MB makers need to include traces capable of carrying all that power from where it enters the MB to the PCIe slot.
All in all, it would be a LOT more complicated to implement for a small aesthetic benefit that a huge portion of their customers wouldn't care about.
I mean the thing with RTX remix is that you can use it as much as you want to, right? If you want the most ‘authentic’ experience, you can just use it to run the game at high res & FPS and just be done with it. It doesn’t have to be as deep as adding RTX, upscaling, etc unless YOU want to. That’s the nice thing about modding in general
The thumbnail is so goofy , I love it 😂😂😂
Looking at the size of this, I think it's the perfect time for case manufacturers to start experimenting with different mounting positions for GPUs. I know Lian Li has 2.
I don't know why, but I like the design of the Zotac cooler. Still wouldn't even consider getting an RTX 4090 card though (my electric bill is bad enough as-is).
What do you get charged per kWh? Here in California it averages 21c per kWh, which at 750W for 4 hours a day is about 63c a day (who plays that much every day though, and 63c a day is still not that bad).
That is hypothetical though, as you are not going to be consistently drawing that much power over your 4 hours of base screen time, even while gaming. You will be hitting much lower.
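The back-of-the-envelope math in that exchange can be sketched out; the 21c/kWh rate and 4 hours/day are the commenter's illustrative figures, and the 450 W "typical" draw is a hypothetical:

```python
# Daily electricity cost for a gaming session, using the commenter's
# illustrative figures: ~$0.21/kWh and 4 hours a day at the wall.

def daily_cost(watts: float, hours: float, rate_per_kwh: float) -> float:
    """Energy cost in dollars for one day of use."""
    kwh = watts / 1000 * hours
    return kwh * rate_per_kwh

worst = daily_cost(750, 4, 0.21)    # 750 W sustained is a worst case
typical = daily_cost(450, 4, 0.21)  # hypothetical average draw while gaming

print(f"Worst case: ${worst:.2f}/day, more typical: ${typical:.2f}/day")
```

Even the worst case works out to well under a dollar a day at that rate, which supports the "not that bad" conclusion.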
Did you ever do the full review?
Linus: *smacks a 1600 USD graphics card on the table*
My heart: *skips a beat*
I laughed too hard when he whipped out the Zotac Gaming box after explaining the cool ideas he had. I needed a good laugh today. 😂
I think a new pc motherboard layout will be needed in the future and internal pc case design too. Gpu's are the dominant thing inside a case now!
I love that the RTX 4090 consumes more power than my whole setup with an 11th gen i5 and RTX 3070 (monitor included) :D
I hope there will be more efficient options in a few years
Pretty soon we are going to be installing motherboards onto the GPU instead of the other way around.
RTX 6090: Requires a separate 1000 watt power supply with a 96 pin pcie cable and a separate case that has 20 fans installed (For average performance of course. Full performance requires more extensive measures)
No like straight up this stupid shit is gonna happen if they keep going in this direction, it’s so lame for consumers
Reminds me of the Ghostbusters, Ghost storage device, that thing is a Behemoth!
10:12 It could then also plug into more than one PCIe slot and not only draw less power over the flawed connector, but also be better secured.
My MSI 4090 runs like the entire length of my case. Couple mms of room past the front intake but that's it. IT'S MASSIVE but it's also a fucking beast.
2:32. Radeon 5900x is my favorite cpu! Much better than Ryzen!
Man, the zotac amp holo 4090 is insane looking
It's the same card but with RGB bling
it looks like a giant soap bar
When he revealed the price my first thought was: 'Well, that's actually not too bad.' The last year was rough.
I like how they release new GPU's when I cant even get my hands on a 1660 from the stores....
NVIDIA Remix, in the hands of environment artists who know what they are doing and understand the level design of games, can really improve a game's quality and playability without the downsides you mentioned in the video, like strategically placed lights or the vibe of an environment. That is what I'm looking forward to, not haphazardly upgrading everything for the sake of upgrading it.
I think that the artist intention argument isn't valid for players who've already experienced the game at least once as it was intended. That's why modding exists. We want to try the game again in a different way, and the modder is presenting his or her interpretation.
Will be waiting for the second gen of ARC or a drop in the 30 series before I upgrade from my 20 series GPU. I like having kidneys!
Love all the zotac shade here.
"zotac recommends a 1000w power supply" , ahh finally i was looking for a nice space heater
I've suggested before that graphics cards be redesigned as a module.
Just a block like the PSU somewhere in the PC where it has dedicated air in and out.
Perhaps a dedicated connector as well instead of PCI-E
Back in the day it was an expansion card, nowadays most Personal Desktops have a graphics card. (PC's not workstations)
There is a case and a patent for that already. It even got reviewed on LTT.
PCs*
What is that awesome house light thingy to your right in the video? I want one!
You're so honest Linus 😂 , most of us waited for 4090 just to buy 3060 after the price drop😅
3060?
3:16 Why are they still using air? With this much power shouldn't we move to water as a base?
Linus made my anxiety levels skyrocket every time he dropped the card on purpose 😢😂
He just wants to demonstrate FPS drops.
When I eventually do a major overhaul on my PC, I'm probably going to get a top-of-the-line GPU if possible. Basically going to be rebuilding my PC all over again when I do that.
ok boomer
In the distant future, the RTX 9090 will be solid state and will be the size of an iphone 4. You heard it first from SupplyNinja. Don't forget it.
Linus may have mentioned the 7 slot wide card. But how about a card that plugs into 2 PCIE slots?
My biggest concern is power draw. I recently upgraded from a 500W to an 850W PSU, so theoretically I can run basically any 30 series.
“Upgraded recently” — you made a poor choice knowing the 4000 series would come out
The 4070 will be ok, I don't know about the others though.
If you're considering the 40 series, you should've sprung for a 1000w PSU
I bought a 1050 watt psu in 2018 for my 2080ti build because I didn’t know if the upgrades coming down the road would need more power. Looks like the extra $40 was well spent
@@ryanespinoza7297 nope, don’t get carried away. Your old 1000w psu doesn’t have the new connector which is recommended