Get the news on the NVIDIA RTX 40 launch here: czcams.com/video/3tZ01ymHZEs/video.html
Learn about EVGA leaving the video card industry here: czcams.com/video/cV9QES-FUAM/video.html
Grab the BRAND NEW GN 'Amp' Medium Anti-Static Modmat for PC building, in stock and shipping now: store.gamersnexus.net/products/medium-modmat-v2
I don't even know what graphics card I put in my computer. It was $250 and I play Spelunky.
You don't say jolf, you say golf. You don't say jold, you say gold. You don't say jif, you say gif.
@Bihruz A they know that. It's a running joke that they don't want to say it
Great video! These are some nice giraffics cards. I bet they process giraffics in a jiffy.
DLSS is amazing; it can even upscale prices.
Dollar leaping super stocking
And upscale sizes.
Your reviews are truly a jift to the gaming world.
Jaming world lets jooo!
This is my reaction when people call it a "jiff". Say the word "gift" then drop the "T".
You're right Craig Jriffin
LOL, marketing for these brands feels like they're targeting 13 year olds on Xbox Live, circa 2005. It's absolutely hilarious.
In fairness, they're nVidia fanboys. And yes, fanboys. Because I was calling it a year ago when I said Lovelace was probably going to be a shitshow, and it's so much worse than I expected. Literally the only people left willing to put up with this shit are either amateur professionals who have to use some obscure nVidia feature for rendering, or the fanboys who'll buy a brick of elephant shit as long as it has the logo. Everyone else has been drifting toward AMD for a while now. Last time I even saw a truly great nVidia generation it was still called GTX. Every single RTX lineup has been bad and gotten progressively worse, from DLSS 1.0 and "price uplift" to the absolutely insane instability and power issues and nVidia's bad drivers on Ampere, with nutso price gouging only overshadowed by their MSRP looking "reasonable" next to scalping. I could already tell just from the TDPs that AMD had a threatening lineup coming up. What I didn't expect was for EVGA to be driven away from nVidia's abusive relationship, and for the AIBs to go nuts also, but well, at least it's entertaining.
13 year olds with 2 grand, too, somehow.
Palit's "Absolute dark power" had me in tears for minutes
I'm def going with Palit for my next GPU. I Gotta get me some of that Dark Power...... I mean if it can make Steve say WTF it's gotta be strong right?
@Heckulous I keep coming back for that part myself :D
Rewatched that part a couple of times - pure gold :D
I said WTF the same time as the video lol
I now want to see a Frigidaire GPU. It will be 6 slots long and plug into 3 PCI Express ports just for support. It will be a mix of white, grey, and beige color themes and look like an oversized box with some fans on it. Its website landing page will only have the specs and a user manual on it. It'll also have a remote that you can use to heat up your house in the winter; as such, the card will also have a built-in PSU and require a separate power cable to be plugged into the "card's" back I/O. As for branding, it'll only have a part number and be priced just a bit more than the rest of the GPUs out there.
lol I thought you said frigate at first. Reminded me of mandalore's BFGA2 video describing a 40k ship as "a flying gun-brick the size of a freeway" and I thought, well yeah might as well just introduce the frigate class of nVidia GPUs. Then they can also introduce the corvette, cruiser, battleship. At least it'd be consistent!
Nvidia is actually very considerate of their buyers by providing them with video card, space heater, dumbbell and building brick all in one.
@MISTER SIR water cooling isn't magic, it's just better heat distribution. It actually takes way more space and introduces another breakable active moving part - the pump. It's honestly not the greatest solution.
@ArtisChronicles i can't see it coming in the next few years for sure.
Well done for getting through that video Steve. Do you think any of these manufacturers will realise that a lot of their customers are actually older than 14? Some of those cards are seriously hideous and they come in packaging that make me thankful for mail order so no-one has to actually see me carrying one of those! I think the Zotac cards are the only ones that I looked at and thought "yeah that looks ok!"
Agreed on the zotac, but it says 'live to game'
But but, we need to show off our waifu GPU !!! 🤣
If you’ve been toying with the idea of moving to water cooling, I think now’s the time if you are considering 40-series. It definitely looks like the way to go both for thermals and just to keep the size reasonable. Or if you want to stick with air cooled, I guess you could try converting a dumpster into a PC case to hold these behemoths. Just make sure you budget for a house addition construction costs in your build.
Full tower cases will have to make a comeback to fit these massive cards
right where these cards belong, perfect.
you will be fine, you may need pretty chunky radiators but at least you can put them someplace else. Water has the highest heat capacity of basically any substance, if you can passively get the heat to the fans you can easily get it into the water
@person thing not really how the physics of water-cooling works, but I did include a link to an article about the iChill Frostbite card in my last comment. You can see it is thinner than 2 slots, and only extends the length of the PCB. If you look at the air-cooled cards closely, you can see they extend well past the PCB…I’d guess at least a third of the overall card is overhang past the PCB. I will say if (and its a big if, since I’m waiting to see how AMD’s top offering stacks up head-to-head, which I committed to do once I heard 40-series would not get DisplayPort 2.0) I get a 4090, I will put it on a dedicated loop. Probably not necessary, but it wont hurt, and will make for a visually interesting build. But the iChill, and EK’s custom water block www.ekwb.com/shop/ek-quantum-vector2-fe-rtx-4090-d-rgb-nickel-plexi both look similar in size to 30-series water blocks.
@MrOffTrail It's more that the heat produced by these things (as seen from the size of the coolers) is gonna need some chunky water blocks to run enough water to catch the heat from these cards.
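For what it's worth, the heat-capacity argument above can be put in rough numbers. A quick sketch; the 450 W card and the 10 °C allowed water temperature rise are assumptions for illustration, not figures from the thread:

```python
# Back-of-the-envelope: coolant flow needed to carry away a GPU's heat.
SPECIFIC_HEAT_WATER = 4186  # J/(kg*degC)

def flow_lpm(heat_watts: float, delta_t_c: float) -> float:
    """Flow (approx. litres/min, since 1 kg of water is ~1 litre) needed
    to absorb heat_watts with a delta_t_c rise across the water block."""
    kg_per_s = heat_watts / (SPECIFIC_HEAT_WATER * delta_t_c)
    return kg_per_s * 60

print(round(flow_lpm(450, 10), 2))  # roughly 0.65 L/min
```

Even a hypothetical 450 W card needs under a litre per minute at that temperature rise, which is why the water side is easy; the hard part is the radiator area needed to dump that heat back into the air.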
it's kind of weird how GPUs are getting bigger. We live in a time where everything gets smaller and more powerful.
@Friedrich Quecksilber nVidia's overall stupidity and incapability isn't an accurate reflection of what's going on with that, though; they're just the more inefficient GPU maker. R9 Fury and Fermi, for example, got there before. It's not the first time we've needed 3x8-pins to power a GPU. AMD's cards last gen still looked largely sane, for example, while nVidia clearly was struggling and started building bricks with Ampere. Honestly, anyone paying attention saw which direction the company was going all the way back with the RTX 2000 series: pushing dumb gimmicks and superficial BS with massive power draw and impaired frametimes/power spikes/drivers (yes, the RTX 2080 Ti literally had 5700 XT-tier driver problems on a $1200 GPU), including emphasizing cooler designs (ripped off from Sapphire's AMD Fury/Nitro design) instead of talking about pure performance. It's like nVidia basically decided they couldn't compete anymore and so made everything about running HairWorks better. What I'm curious about now is whether nVidia's software still sucks.
Well the GPUs are getting smaller and thats why they get hot and need these crazy big coolers
We've come a long way. Most of those coolers are truly colossal. I remember not too long ago, my ROG Strix 1080ti with its 2,5 slot cooler was considered enormous (and also extremely power hungry with its 280W 😂😂😂)
Dude, I am still running a 1080 Ti Strix Gaming OC, watercooled. I still have the air cooler around. To think that the 4090s are even bigger is like... when are we going to build separate PCs just to hold the GPU and connect them via riser cables?
EVGA pulled out just in time
its a warning for everyone!
I don't know what she said... but it's definitely what I said.
@Barry Wallace - No you're definitely objectively wrong. You didn't even refute any of my reasons you just said "nah bruh evga sucks"
@FNVBlaze2027 I am NOT objectively wrong, because the build quality of EVGA's products is crap and EVGA customer service is NOT as good as people make it out to be. At least with Asus and Galax you're guaranteed a good product, and both companies' customer support is very good.
30 series is so in demand, it still makes you want to get it even after the 40 series has been released.
It's half the price of the 40 series. It's also a waste of time and money.
Don't fall for it. Thats the point of these high prices. To make the other overpriced garbage look less garbage. Used shit for new prices and new shit for scalped prices. Fucks sake.
Remember, this happened almost 8 years ago when AMD rocked the scene with the Radeon Fury Nano. Hopefully Intel or AMD do that again, because at this point Nvidia has basically gone off the rails, requiring a full-sized PC for a card that is basically a rebranded 60 or 70 series selling at Titan prices.
8:13 is final proof that the marketing departments of every major graphics card partner have been overrun by chuunis. I can see why NVidia would want to quarantine them during their presentation.
This must have been so incredibly embarrassing to cover. I am convinced that these board manufacturers are convinced that all gamers are idiots.
In fairness, they are. Have you ever tried interacting with "people" on Steam forums? You can easily forget just how F'ing stupid average gamers are until you actually go and interact with the ones not playing your niche city builders or whatever, and realizing these are literally Qanon follower tiers. Plus it's nVidia anyway, so they tend to get a much bigger portion of the bottom end of the bell curve regardless.
I get the feeling that the reason many manufacturers have omitted the RTX 4080 12GB pictures is because they originally rendered them as "RTX 4070" and Nvidia blindsided them.
@DAT Wagonator according to the die number, it's even a 4060
@razorgxp The new 4070 line is most likely just binned so-called 4080 chips. It's pretty obvious to most of us, including many in the tech industry, that Nvidia rebranded the 4070 to "4080 lesser" so they can charge more for it. There's a reason they didn't initially announce the 4070 until the media started clamouring about it missing.
Gotta pad them profiteering
Pretty disappointed in the latest gen of PC hardware so far, feels like we're going backwards. Everything is getting bigger, hotter and more power hungry.
I tried warning all you guys a year ago what kind of a total shitshow nVidia had become, but you didn't listen. You could've prevented this. I even called, hell, not just this, but Ampere, when I warned you all that Ampere was looking pretty sketchy right before release, and it turned out to be one of the most meme-worthy nVidia generations since Fermi. They even started using the Sapphire Nitro Fury X passthrough cooler design, which was an actual meme on AMD's old R9 300/Fury series cards for how hot and unstable they were, and suddenly nobody bats an eye when nVidia has to adopt it and make it mandatory. Then when I saw the TDPs I just knew it was over, because Jensen will do everything in his power to make sure they have the ostensibly "best" performing halo card regardless of how much the rest of the lineup is total shite. This signalled to me that they already knew AMD was gonna beat them with RDNA3, so all this is is some truly desperate moves on nVidia's part to maintain their image, because optics is all that really matters to nVidia, not the performance. Meanwhile they even managed to drive EVGA away, after having successfully ensured that all Macs (shitty as they are) and all consoles now have AMD hardware, thus ensuring much better stability, support, and optimization for productivity software and games running on AMD. I mean, what do they even have left, the Switch and Tesla or something? EVGA clearly bailed at the right time. Sadly, the average nVidia buyer now looks like a beaten housewife trying to rationalize how the corpo really loves them deep down inside, shelling out 1080 Ti levels of cash for a shitty 4070 that's going to draw more power and perform worse than a 7700 XT, and most likely still require replacing your 850/750W PSU, so now you get to add that into the cost of upgrading your monitor and GPU.
It absolutely astounds me that even after EVGA themselves finally had enough with nVidia, that there's still honest to God thousands of people out there who insist on buying nVidia cards for no good reason at all but branding and memes.
I love BFGPU concept. Finally we are getting real performance gains. You are stupid if you think a halo flagship discrete graphics card for a desktop computer that plugs into a wall, should be small and "energy efficient". If you don't want a flagship don't buy one.
Yeah, I think we’ve reached peak absurdity when it comes to card price/operating cost, size, thermals and naming conventions. I sincerely hope nobody buys into this, but I know e-peepers,and it will happen, unfortunately.
I like how you started with ASUS as they seem to be the only one following their naming model seriously. It's like you didn't want people to confuse them with the other nonsense they are surrounded by.
That was the best laugh I have had in a solid week! Honestly, these GPU companies are just ridiculous these days. 4 slots is crazy enough, but when they cover their card in sparkly dark matter and fall in love with three-fan designs supported by dark obelisks, I have to wonder if they even know what kind of products they make?
These cards need to start being sold in the home appliance sections of big box stores. The power and heat management are gaining ground on small cooking appliances.
Plumb the water cooling loop into your HWS for free hot showers.
This had me laughing
@Javi Lopex "yellow stickers denoting the energy usage per year." ^This. Although there is a lot of wiggle room, so it should be based on, say, yearly cost per 1 hour of gaming per day, or per hours of gaming per week, and just multiply by how much we game. Along with hours per week web browsing and such. They would use unrealistically low gaming time to make the number look small if they could.
@Zag Zagzag the US has enough natural gas for 98 years. & it is a renewable resource. There should not be any shortage.
True Ninja Foodi and off grid, do not go together.
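The yearly-cost-sticker idea in the thread above boils down to simple arithmetic. A sketch; the 450 W draw, the 1 hour/day of gaming, and the $0.15/kWh electricity price are illustrative assumptions, not figures from the video:

```python
# Rough yearly running cost for a GPU, per the sticker idea above.
def yearly_cost(watts: float, hours_per_day: float, usd_per_kwh: float) -> float:
    """Annual electricity cost for a component drawing `watts` while in use."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

# e.g. an assumed 450 W card, 1 hour of gaming per day, at $0.15/kWh:
print(round(yearly_cost(450, 1, 0.15), 2))  # roughly $24.64 per year
```

The per-hour-per-day framing makes the sticker scale linearly, so a reader who games 4 hours a day can just multiply the printed number by 4.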
I hope these aftermarket cards keep being ridiculous because this video was a hilarious reaction to them. 😂
I would love to see an electricity drain test with a 4090 overclocked and AMD's new 95W+ CPUs. I hope it fries a PSU or 2
These silly cards just make me more excited for Intel and to see what AMD does. I've always stayed a gen or two behind for cost reasons so I know I'm not the market for these, but c'mon, it can't keep going in this direction. Eventually they're going to be too big, hot, and power hungry for the average person to use.
40 series is already there, most people will need a new PSU even if their current PSU provides enough amperage.
I predict a strong trend in watercooling this generation. It's starting to make sense now with these oddball slot multipliers.
either that or a strong trend in people skipping 40-series
I'm impressed at how little i care about the 4000 series. Well done nvidia.
@TrajanRomeo I judge by power consumption thanks. Shit's DOA to me.
@Animalyze71 1080Ti? more like 1080p high lole
@Animalyze71 I mean, in the specific case I was referencing in my reply, hardware would certainly affect frame rates in a game like RDR2. Not to say games have perfect optimization.
@Zaiquiri Is it the games functions or the programming errors in those games that cause the issues? Small minds always blame the hardware when most of the loss and stutter comes from the code itself, lay folks call it optimization but it's a small deal more than just that.
@Rodiculous Yeah only the VC causes frame rate loss, too bad that's false. Ask those people what cpu and ram they are using and how much ram is installed. Also what Motherboard are they using? Do they have all PCIE lanes choked? 1080ti should push 60 no issue for that game. I'll have to install that game if I can get it free, hacker/cheater player base shoot em up games suck these last 20 years.
By the time we get to the 5000 series, we'll need AIO cards or custom water loops to keep the cards cool. I just upgraded from my 1080 Ti, which was a beast of a card from Gigabyte, to the 3080 Strix, and the size difference is actually comical!
Same story here: had a 1080 Ti Strix for a while, sold it to a friend a year+ ago. Got my 3080 Strix recently; this friend came over to clean his computer and was overwhelmed by the size and design. The 1080 looks waaay cheaper now in materials and build quality.
Asetek can only wet dream that this will happen. Last time a GPU was factory-designed like this, it got sued and shut down by Asetek (Radeon Fury X)
Had to laugh so much when hearing about the brilliance, dark powers, and anti-gravity technology. Yikes, Albert Einstein was nothing compared to these geniuses.
With how they decided to tune these cards, do you think the longevity of the 40 series will be drastically shortened?
Went for cases with a horizontal mobo layout a while ago... in the near future we seem to be sticking something small (MoBo, CPU and such) on the GPU instead of putting the GPU on something. So is the ATX standard still a thing? ;-)
The more I hear of the 4000 series the more I'm excited for the AMD cards
@Cacodemon345 My GPU paying for itself twice over (just through mining, not the other professional work I use it for) is a problem in your eyes? I'm confused. Oh, are you a "noooo you can't just heckin use electricity" guy? My PC's in the basement, it's about 10 degrees Celsius down there in spring and fall and the pipes freeze in the winter if I don't have a heater running. So yes, my 3080 Ti not only 200% paid for itself (not counting a $250K job contract it recently helped me win), it cut my heating bill, I forgot to mention that. Damn you Nvidia! So evil!
@A Fistful of 4K "mine 2x its value in crypto"There's your problem.
ewwwwwww AMD is trash
@Tesseract Orion Ripped off? I have the second-fastest GPU you can currently buy and I got it 43% below MSRP thanks to EVGA. I paid less for it than I sold my 1080 Ti for, lol. It's been integral to my home business that kept me afloat during COVID and I used it to mine 2x its value in crypto while I slept. And when I do have a chance to play games with it, it produces great framerates at 4K, with or without DLSS. It's a beast, and if I could go back in time and change my decision the only thing I'd do is buy two and set up a secondary rendering/compute/encode PC. If Nvidia charges $2000 for something that costs them $500 to make, AMD will charge $1900 for something that costs them $200 to make thanks to using chiplets. They're not going to cut you a deal when you submit your list of pro-AMD social media posts. This doesn't make either one of them your friend or your enemy, they are corporations and this is business. Wake up. Buy the best product for your needs, not based on politics that exist only inside your head.
@4GB MEANS 4GB That's super helpful, yeah. Hopefully this makes it down to the 4080 and lower, I'm not sure I can justify the 4090 even as a business expense. Man, I wish someone would just put out a dedicated video encoding card.
Heh, decades ago before it had the name "gpu support stick" I was using a cassette tape case (remember those?), wedged between the bottom of the PC case and the gpu. It was the perfect fit, non-conductive (in case it got dislodged and fell against any motherboard contact points), and depending on your PC case, mostly invisible.
The massive heatsinks and fans actually show how energy inefficient these chips and cards are! Also, who is going to be running these as the global energy crisis gets worse?
6:35 That dev literally went back ~10 years and found an ancient Stack Overflow technique to hijack mouse wheel events with jQuery and control video playback via parallax scrolling jank. Incredible. Counting my blessings as a front-end developer. We are a special people.
If something is taking up 1/2 slot, isn't it effectively taking a whole slot? What cards fit into 1/2 slot?
NVidia: "We need you to rebrand your RTX 4070s as RTX 4080 12GB"_EVGA has left the chat._
@Usama Rasheed more the other way around: Nvidia tried to "do" EVGA and they said "not with us, idiots"
@insuna they'll use stickers just like they did with the super release of 20 series.
@thesinaclwon There'll be no worries on that unless crypto mining becomes profitable again.
@Ernismeister Ultra underrated. Made me chuckle.
Lol you can imagine them just putting a tacky sticker over the RGB name on the side
Honestly at this point if AMD's RDNA3 cards are at a reasonable price with competitive performance, and have "normal" looking air coolers without the Shadow the Hedgehog level edginess they could knock it out of the park this generation.
So of course, AMD is going to be doing none of that, whatsoever.
When these cards drop and you do your reviews, could you do a breakdown of which cards would be the best option if you're planning to watercool them in a custom loop?
With my favorite GPU manufacturer leaving the business, the only other company I would even consider is Asus. Considering how good their 780 DirectCU II OC was, I'm willing to give them the benefit of the doubt on their latest design for the 4000 series. Or I might just skip this gen since I have a 3080, and this is looking to be the worst price/performance generation ever, and that's saying a lot considering the past two years. Also, it's not pronounced like peanut butter.
EVGA looking smarter and smarter every day since the 40 series announcement.
@Cacodemon345 did you just eat a sticky grenade?😄 I meant that all iPhones look the same so you need a sleeve to make them look different. So if all AIBs leave and only Nvidia makes the cards, all cards will look the same just like all iPhones look the same
@Cacodemon345 I suppose you entirely missed the point. It's not what Founders Edition Nvidia GPUs or iPhones do, can do or can't do (obviously you can't call someone with a GPU....), it's about them looking the same if there are no AIBs left for Nvidia's GPUs. Can't deny it: if you place 5 iPhones next to each other they look the same. No widgets, no custom grid size or spacing, etc. So what's left is a different wallpaper, and of course attachments and cases/sleeves. And yeah, calling it "gaming" if you play on a phone is still ridiculous in 2022. That's like calling yourself a gardener if you have one potted plant in your kitchen, or calling yourself a professional baseball player if you played catch as a kid. #shotsfired (ok, not really, but I hope you get my point)
@XuryFromCanada GPUs aren't general-purpose computers, or even if they are, they aren't designed to work like one. iPhones are useful for more than just gaming, with an entire ecosystem built around them. Not GPUs.
@XuryFromCanada I don't really care too much about what my next graphics card will LOOK like physically as long as it offers good specs at a good price and good drivers ... but I kind of agree with you ... the day nvidia manages to drive away all of their AIB partners will be a sad day ... for consumers and nvidia alike. There once was a popular graphics chip manufacturer who at some point decided to exclusively produce graphics cards "in house", thereby pissing off any AIBs who previously worked with this graphics chip manufacturer ... the AIBs moved on to partner with nvidia instead, and that other graphics chip manufacturer went bankrupt and got bought by nvidia a while later ... you know who I'm talking about ... 3dfx of course ... I think the AIB cards played an important role in nvidia's rise from being just one relatively new and unknown graphics chip manufacturer amongst many (Ati, Avance Logic, Cirrus Logic, Rendition, S3, Tseng Labs ... they were all there long before nvidia) to becoming the biggest manufacturer of (PC) graphics chips. It appears that nvidia has forgotten what happens when you piss off your AIB partners ...
@kyle hubner They do look great! But I love the comical variety of the AIB cards and all that crazy marketing talk that just hypes you up to rip & tear 😄
Was wondering if GN would consider making a video about warranties / customer support from current GPU makers. There seem to be a lot of new companies from China that offer GPUs; EVGA, the king of customer service, is gone, and I am not sure what kind of BS level to expect from the new offerings. From my current understanding, Asus and MSI are still decent options and Gigabyte is the bottom of the barrel; for the others I have never had a card from them, so a professional evaluation would be great! Those new cards look so heavy that I would imagine PCB bend would be inevitable if there is no "GPU stick" in place, so many users will have to check in with the manufacturer regarding a warranty.
I have really enjoyed the news style content on your channel, your production quality is looking really good these days.
As a casual gamer who is happy with 1080p and 60 FPS, these days I am more excited to see what the low to mid range cards have to offer. I don't think I would need any of the higher end 40 series for at least another 10 years.
It's getting to the point where these GPUs might as well just be integrated into a motherboard and optimized. Or at least the mobo manufacturers should start putting the GPU slot on the bottom (for support).
I really think the GTX 1000 series was the peak of GPU design.
@Dingickso The funny thing is that the man who coined that term (Yahtzee) meant it in a tongue-in-cheek way, and regrets saying it now.
The 3000 series was actually really good value if you got them at MSRP, especially the 3060 Ti Founders at $399, faster than a 2080 Super.
I guess they really meant it when they made the slogan "Gaming Perfected". If I wasn't playing at 4K I would probably keep running my Titan XP a little longer tbh.
@Susan doesn't understand fair use That's admirable but you should use all that money you saved from not upgrading your GPU to get a 120hz+ 1440P monitor and stop playing shit at 1080p
In hindsight, the 1080ti was the smartest buy ever. Unless you go 4k, that thing is still absolutely capable in 1080p/1440p and will be okay for another few years depending on what you use it for and want out of it. Stunning value!
With this series of GPUs, watercooling seems to be the option to go for; the Inno3D iChill Black is the only one good-looking enough for me
Now we need case manufacturers to include radiators in BOTH side panels and we are good.
Thanks for the video, saves a lot of time. I think the days of air cooling are coming to an end. 4 slot cards blowing heat everywhere just seems cringe, and any high end cpu that's air cooled is probably being held back. Surely anti gravity shark fans will solve it.
DLSS now upscales the sizes of GPUs, the images, and the prices. What amazing work.
Loved the writing
I think cases will need 2 PSUs moving forward... 1 for the graphics card, 1 for the other components
I was expecting that as technology went on, graphics card designs and CPU heatsink-fan designs would shrink, because engineers would find a way to make faster components that consume less electricity even at higher workloads. 4 SLOTS OR EVEN 3 1/2 SLOTS, even 2 slots, is a BIG NO TO ME! I miss the days when AGP video cards were small and fit perfectly in that AGP 3.3v or 1.5v slot, not taking up the space of the next PCI slot, because the graphics card's fan was just FU*&^*CKING SMALL! Graphics card designs should be like this again! What is the use of adding more PCI-E slots if you cannot use them because they're blocked by that OVERSIZED VIDEO CARD COOLER?!
I think I know what really happened to EVGA. They heard the rumors of marketing on these cards from their competition and they were like "I'm too old for maximum dark power obelisk bullshit" and just left the industry. Seriously tho, imagine how hard this market is on these manufacturers: you have to design something that will be functional in cooling down what is essentially a furnace on a chip, make it look "special", market it right, and sell it with a laughable margin because of the insane competition, all the while Jensen goes around asking how you dare want partner information on a product you're going to sell and provide support for, when he doesn't do anything at all and rakes in a lot of money for doing nothing. This whole market is insane and it's unlike anything else really.
these marketing summaries remind me of the good ol' days when board partners would put graphics on the outside plastic of the GPU shroud. feels like gaming is taking a step backward.
Can't wait for an RTX 7090 that will finally use all 7 of my PCIE slots
@DagobertX2 When the swat helicopter flies over your house. The pilot will call into HQ. Pilot: On the thermal sensor it seems we just found another major marijuana grow op! Or maybe another current gen PC Gamer....
@Misty Kathrine 😆
@Zamaric I mean, I'm actually expecting a future GPU to have a power cable coming out the back of that you plug directly into your wall.
@MrArrmageddon 3 phase
"We'll see how long they last as an NVIDIA partner with those AMD colours" Well said!!!!
This 40xx series reminds me of the 900 series a few years back. They were getting BIG and way too power hungry. Competition came up and forced actual innovation for the 10xx series.
I remember the time when a top-level GPU had a 1-slot design, and it was awesome.
I laughed so hard I started choking at "absolute dark power". I'm happy with my 3080. Y'all come back when you can design sensible products.
Someone has to tell ASUS that with physical products, you can't round down. If a card is 3.1 or 3.2 slots thick, it blocks 4 slots, period.
@Sensible Its jooookkkkeee
It's not just Asus; they all use that nonsensical jargon...
Previously, 3-slot coolers still made the 4th slot unusable in practice, while also making it not a bad idea to leave the 5th open. The 3.1, and maybe the 3.2, are pretty much the same deal, but they're starting to push the boundary on wanting to leave that 5th slot open.
it does make sense though: if it's 3.15 slots thick then you know it blocks 4 slots, BUT you also know that you have enough space before the 5th slot for ventilation. If it just said it blocks 4 slots, you wouldn't have that information and might think you also have to keep the 5th slot free for ventilation.
@Ali Shaikh Useful would be telling the number of slots it takes up, and then a separate entry, for the minority of system builders that actually need it, with the exact mm of clearance left in the final slot. None of this fucking imprecise measurement shit they have going on.
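The fractional-slot debate above is just ceiling-and-remainder arithmetic. A minimal sketch of what a "3.15-slot" label actually tells you (the function name and interpretation are mine, not from any vendor's spec sheet):

```python
import math

def slot_info(thickness_slots: float):
    """Slots physically blocked by a card, plus the fraction of a slot
    left open before the next usable slot (the gap the fractional
    naming is trying to convey)."""
    blocked = math.ceil(thickness_slots)      # you can't round down in hardware
    clearance = blocked - thickness_slots     # gap before the next slot
    return blocked, round(clearance, 2)

print(slot_info(3.15))  # (4, 0.85): blocks 4 slots, 0.85-slot gap before slot 5
```

So both sides of the argument are right: the card blocks ceil(3.15) = 4 slots, and the ".15" only matters as the 0.85-slot ventilation gap it leaves.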
At some point, it would make more sense to just make the super hardcore GPU's a separate module. One box with a GPU (or 2-3 if you're insane), power supply and dedicated cooling, jumper to a PCIe connector on a single slot expansion plate, small cable to the GPU module. If you're going that hard, I doubt anyone is going to care about an extra box as long as you have ALL THE POWER!!!!
I feel like the 40 series is totally unnecessary. Hardly anyone could get the 30 series until recently, the improvements aren't that big, and it seems like the only way they got them is by making the cards draw way more power than usual.
Maybe we should start considering having the GPU be the main board and CPU/etc as addon cards.
I literally do not know why these companies don't just make AIO hybrid coolers standard, like EVGA does. Hopefully, with how big these things are getting, it will become the standard.
At this stage they should just build the motherboard into the card itself.
This is kind of what Nvidia's Grace architecture is.
Or the GPU could have its own separate case and its own power supply.
It's called a console.
Nvidia tried for a while with nForce... it didn't last that long in the grand scheme.
@NoiseBomb GPUs have been taking up two slots for years. Usually an x1 which most people don't use anyway.
I'm going to stick with the 3080 Ti for the time being. The 3090 and the 40xx do not seem worth the complications due to heat and power consumption...
Honestly, with these 4090 cards being SO powerful and needing massive heatsinks/fans, they should all just use water blocks and attach the cooling elsewhere in the case. If every card is so heavy it needs reinforcement, then yeah, it's too heavy and will likely break the PCIe slot it's sitting in eventually without the extra support. I'm going to try to get a water-cooled, non-custom loop (I'd rather not deal with all of that BS) and see what my case options are. I'm hoping for a mini-ITX build that's about the size of a shoebox. I have little hope, though, as good mini-ITX boards haven't been a thing for years now (looking at the boards being sold for the latest CPUs). Almost every single mini-ITX board has been a disaster that needed a BIOS update right out of the box just to support the latest CPUs. So I needed an old CPU just to update the BIOS before I could use the new one. Like, "Fk no, you companies should do that for me."
Oh man... I remember how sick cards looked back then. The red MSI cards from the 9xx generation, or the EVGA cards with full-copper coolers, or the Jetstream cards and the Aorus Xtreme cards from the 1xxx series. And so on... Or my very first card, the Sapphire AMD R9 390X Nitro. Beautiful.
I swapped out two 1080s for a 3080. I was impressed that the 3080 was lighter and smaller than a single 1080, and thought the next series of cards would surely be more efficient and powerful. I thank Nvidia for releasing such cringe. My friend is a scalper who buys cards and sells them on; even he said no one would buy the 4000 series, so he's not going to bother. Honestly, the pricing is just ridiculous. I'm fortunate enough that I could buy one of the new cards, but I didn't get that money by making poor financial decisions, and I'm not going to waste it on this garbage. Also, there are no games worth playing anymore that would warrant such a card, and mining is pointless now. What is Nvidia doing?
Also, being so power hungry, you'd be stupid to try and mine with 40 series, lol
It's ironic how detached the manufacturers are from the customer base. This generation really deserves a hard flop, and we can only hope the next one is better
@Animalyze71 which won't ever happen, because this stuff gets clicks. Hell, even when mocking it you get as many views, if not more.
@RTF I think you mean the 4080 part three right?
Until we can make people understand they don't need these cards yet and stop wasting money on them, the cycle will only continue. Once content creators stop drooling over these new releases, making people want them more than they need them, things will calm down.
@Pirojf Mifhghek Could you imagine if we were still pushing multi-GPU SLI/CrossFire setups at this point? Imagine needing an entire power circuit just to power your GPUs.
At this point the GPU makers just need to build a case with the card already integrated.
Yup. I'm hopping onto the AMD train from now on. I've usually gone with their CPUs, Linux now supports most/all of my games, and Windows is shitting the bed. Nvidia is getting too expensive now too. I'm just going to hop onto AMD for GPUs as well; the support for them is awesome too, so I'm done stressing over this shit. Cheaper, better, faster. EVGA saw what was happening.
If you're gaming on Linux, AMD is the way to go.
NVidia: "We need you to rebrand your RTX 4070s as RTX 4080 12GB" EVGA has left the chat.
If I install an anti-gravity plate, would I have to put a 20kg rock on top of my computer to keep it from floating into space? Would a gravity-related event be covered by the extended warranty?
Inno3D isn't just "brutal", they're "201% committed". That's impressive.
Since they are in Hong Kong, maybe BRUTAL is their tongue-in-cheek homage to Hong Kong's brutal government events from 2019 through now.
@Mike Watts Being an Aldi customer and an Inno3D card owner, I approve
You laugh, but I'm sure it's just a conversion anomaly from metric to imperial units. With enough precision, it would've been 200% even.
@Handyman I need my GPU maker to be at least 500% committed
I'm just gonna wait and hope for AMD to make good cards
@Chris Proost I see
What made you think they don't already? Being slightly behind in some metrics last gen, and closing the gap before the gen was over, is nothing like as bad as... before.
I'm not feeling any regrets over my purchase of a 3080 10gb on prime day. These 40 series cards are not even a little bit compelling.
Man, we definitely need Noctua aftermarket GPU coolers to cope with this insanity....
Between the astronomically increasing power draw and the equally gigantic air coolers, when are these companies going to realize they can't keep pushing performance this way year over year? These cards draw more power than an average office computer and are getting to be the size of small mid-towers. INSANITY!!!!
I've not been this underwhelmed by a GPU launch in many years.
@You Are Talking To Yourself is it really fps if the GPU makes it up?
Even though the performance improvement is decent, it's all so underwhelming and a parody of itself.
Just wait for the performance gains, boy.
@realbadTech I find the frame interpolation suspect. How usable is it really going to be? Are we talking about a frame of latency? 5 or 10? I want to see it demonstrated thoroughly before I get hyped up.
The new cards are so big and heavy that under normal operation, two smaller graphic cards are orbiting them in an elliptical orbit.
I can't imagine a situation where cards of this size and power requirement appeal to me. I mean, I'm playing Cyberpunk 2077 on my 6700 XT and it's perfectly fine! What games are coming out in the next year that necessitate this total lack of self-awareness and shame?
I love the BFGPU concept. Finally we are getting real performance gains. You are stupid if you think a halo flagship discrete graphics card for a desktop computer that plugs into a wall should be small and "energy efficient". If you don't want a flagship, don't buy one. Also, not even a 3090 can max Cyberpunk 2077 with the ray tracing settings turned up.
We are going to need a server room for this new hardware. The size, power, and heat?!? New AMD chips at 95 degrees and 4-slot beasts.
I think I'm pretty happy with my EVGA FTW3 3080 10GB. I'll probably skip the 4000 series and may move to AMD this generation or next. Waiting to see RDNA 3.
I can honestly say I'm super excited to finally buy a 30 series card. 😆
Its almost like nvidia made a subliminal 30 series commercial here.
just go with AMD.
@AQW andrew What do you play on it? Do you need it? Are you in some kind of competition, like esports, where you need every last fps, or is there a game your GPU can't run right now? If the answers are all no, then keep enjoying your GPU. I have an AMD 580 with a 1440p display and I can run every game I like just fine; I'm sure the 1070 is still really powerful to this day.
@A C In the spring maybe.
A 2 year old card for over MSRP is what you'll be buying. They're both dumb buys right now. Wait until spring and watch the prices plummet.
To be fair, GALAX is partially a Japanese company; localization is tough, lol. The cards are pretty well binned in my experience, and they've got gamers at the helm in general (I've talked with a couple of them thanks to them advertising with us once)! I wish I could get their stuff in Canada.
It must almost be getting to the point where an AIO cooler might be a cheaper, lighter, and easier-to-manage option on these cards.
Doubt you'd see the savings though. You'd still have to pay these prices for it.
Remember how the 900 series had overheating issues and would droop down, warping the PCB? I get the feeling the 4090 will be the same.
I feel bad for all the engineers who spent months figuring out solutions to the insane power and thermal issues only for marketing departments to come in last minute with the cringiest names possible.
We're officially at the point where the ATX standard now needs to apply to graphics cards: we should screw the graphics card into the chassis and then have the motherboard plug into the graphics card's PCIe slot.
I remember old cases with guide rails at the front to support long ISA cards. It's like fashion: everyone who kept their old bell-bottoms from the seventies was back in style a few years ago...
Watching you laugh was a highlight of my day. I am guessing most people will wait for AMD or just go and buy a 3080 or 3090.
You're mistaken in saying that AORUS claims, "the sheer size of its heatsink is affecting gravity itself". No, it clearly says, "so does the *effect of* gravity grow"--which isn't wrong. An effect of gravity is force. That force simply increases in proportion to the mass of these heatsinks (more mass means more force). Their wording was perfectly fine, technical, and clearly understandable.
Whoa, those cards are huge. It seems like overkill for most of us. If it's beyond the mainstream, it's probably better for the electrical grid anyway.
The SG, or Serious Gaming, series was already a thing with the 30-series cards; there's a 3080 SG from KFA2 (basically Galax, as far as I know) in my PC right now.
You can really feel Steve's will to live fading over the course of this video
It's also the first time I've seen Steve visibly cringing at the wannabe edgy/cool marketing terminology lol
He's being drained by the dark obelisk.
His reactions are priceless and somewhat concerning. Don't lose the will to live, Steve!
it gets sucked up by the Dark Power
@Benjamin Oechsli No True Nerdsman. If he wanted it pronounced that way, he should have written it that way (blame USA copyright maximalism, since Jif the food brand could sue GIF the file format even though the risk of confusion is non-existent). Though I am sad that this decades-long dispute is irrelevant now that WebMs are better quality at lower file sizes and hardware-accelerated on practically every mobile device.
At this point even if the new AMD cards are 10% less in performance I’d go with AMD.
Somehow I have a feeling that game requirements won't scale with the availability of this latest generation. Too many people who used to go for -80 or -90 series cards will settle for a -70 instead, or even simply wait longer than normal. If and when that happens, I am sure Nvidia will blame demand, but it will more likely be supply's fault for once. What are they thinking? That everyone and their granny will upgrade to a 1200-watt power supply, a new case, and a custom cooling solution, and double their energy costs, all without games that actually require it? 4K@240Hz is nice, but it's not a quantum leap in experience, is it?
Yeah, the size of them is crazy. Pretty soon you'll need one case to carry your graphics card and another to house everything else lol. As far as how they look? Not bad. I mean, even the smaller cards years ago had wacky sharp edges that would slice you while you tried to install them lol. Plus, with everyone wanting clear cases to see everything work, or fall apart, what else would they do lol? If we all still used closed or solid metal cases, they wouldn't care. Oh well, I bet in a year they'll have them down to 3060-3080 sizes.