SERIOUSLY?? MORE bad news for NVIDIA's 40 Series...

  • Date added: 22. 09. 2022
  • **Links to articles on this issue**
    WCCF article PCI-SIG Warns of Potential Overcurrent/Overpower Risk With 12VHPWR Connectors Using Non-ATX 3.0 PSU & Gen 5 Adapter Plugs (wccftech.com)
    wccftech.com/atx-3-0-12vhpwr-...
    Zotac Adapter Spec ZOTAC GAMING GeForce RTX 4090 AMP Extreme AIRO | ZOTAC
    www.zotac.com/us/product/grap...
    Article about Zotac Adapter specifically Zotac Reveals Surprising Fragility Of 12VHPWR Cable Adapters For GeForce RTX 40 Cards (hothardware.com)
    hothardware.com/news/zotac-hi...
    Steve vid on NVIDIA's report to PCI-SIG about the 12VHPWR cable (not adapters) after 40 cycles • HW News - Melting GPU ...
    Check out the new H7 series of cases from NZXT at nzxt.co/JayzH7
    Get your JayzTwoCents Merch Here! - www.jayztwocents.com
    ○○○○○○ Items featured in this video available at Amazon ○○○○○○
    ► Amazon US - bit.ly/1meybOF
    ► Amazon UK - amzn.to/Zx813L
    ► Amazon Canada - amzn.to/1tl6vc6
    ••• Follow me on your favorite Social Media! •••
    Facebook: / jayztwocents
    Twitter: / jayztwocents
    Instagram: / jayztwocents
    SUBSCRIBE! bit.ly/sub2JayzTwoCents
  • Science & Technology

Comments • 7K

  • @NathanTyph
    @NathanTyph Před rokem +10160

    AMD seriously could not have a better launch window than they do right now

    • @KacKLaPPeN23
      @KacKLaPPeN23 Před rokem +289

      This problem will also affect AMD...

    • @666Necropsy
      @666Necropsy Před rokem +360

      AMD's last cards were kinda gimped. I think they can blow it out of the water this launch. The bus width on the 40 series is a joke.

    • @DagobertX2
      @DagobertX2 Před rokem +140

      We don't know anything about their new GPUs; they could be in the same situation even if their GPUs are more efficient, as they've hinted.

    • @thecomicguy2785
      @thecomicguy2785 Před rokem +200

      @@KacKLaPPeN23 Pretty sure AMD isn't switching to the new standard yet. By the time they do, this may very well be fixed.

    • @registeredblindgamer4350
      @registeredblindgamer4350 Před rokem +150

      @@KacKLaPPeN23 That would only apply to AMD if they use the same connector or if the AIBs use them. So you could easily be wrong.

  • @krishnateja4011
    @krishnateja4011 Před rokem +2269

    The more news that comes out about Nvidia, the more excited I am about AMD GPUs.

    • @hamzasamar8823
      @hamzasamar8823 Před rokem +26

      same

    • @jsteezus
      @jsteezus Před rokem +153

      I've never seen as many former Nvidia enthusiasts (not fanboys, but fans of their high-end products) saying they are excited for the AMD launch, or are waiting for AMD to launch before making their purchasing decision. I really think they are going to be competitive. And performance is already so good that even if ray tracing isn't as strong, if it's just as good as Ampere that's plenty for like 90% of people.

    • @wahahah
      @wahahah Před rokem +61

      Just makes me more anxious. If AMD fails to deliver, then what?

    • @alexanderschulte8248
      @alexanderschulte8248 Před rokem +16

      At some point AMD is going to move to this spec too, so I don't know why it makes you more excited.

    • @austint.9943
      @austint.9943 Před rokem +3

      If only they didn't break Destiny 2 every new gen.

  • @davidcsokasy5451
    @davidcsokasy5451 Před rokem +220

    He's totally spot on with his assessment of the connectors. It's not paranoia. I bought a custom PC during the height of the GPU shortages at the end of last year (it was actually cheaper, not to mention less frustrating, than spending all of my free time hunting for prices that weren't eye-watering) with an RTX 3080. The builder, who I won't name (cough, cough, CyberPowerPC), not only used a single split PCIe power cable, but wasn't very gentle when closing the case and crammed the excess power supply cabling inside with a bit too much gusto.
    Everything was fine for a couple of weeks, but then I started noticing GPU performance degrading. Also, my PC started crashing sporadically during gaming sessions after about 30-60 minutes. Eventually, after a long gaming session one day, my GPU died. At first I thought it was solely a faulty GPU, because when I installed an older GPU (GTX 1080 Ti) it worked fine, so I started an RMA and shipped it off.
    When I received my RMA replacement I decided to install some braided cable extensions to class things up a bit. During the process of installing the new GPU and extensions I noticed the real problem. When the case was closed at the factory and the cables were crammed in, the single PCIe power cable on the power supply had been bent hard to one side. This had caused two of the conductors to pull partway out of the connector. The additional resistance from the less-than-ideal connections caused a hot spot and not only melted the cable connector, but melted the power supply connector as well. My RTX 3080 is an overclocked model and pulls like 350W under heavy load. The higher-end RTX 4000 series cards will pull even more. Using high-quality cables with no defects is a must.

    • @camdavis9362
      @camdavis9362 Před rokem +26

      @WolfieMel are you trying to flex or something?

    • @justcommenting4981
      @justcommenting4981 Před rokem +17

      @WolfieMel well no, we can see now you are indeed attempting to flex. Quite silly.

    • @CaptainWafflos
      @CaptainWafflos Před rokem +20

      @WolfieMel cringe king

    • @citizensnips3850
      @citizensnips3850 Před rokem +14

      @WolfieMel Haha imagine being proud of saving money on a gaming PC. Too poor to afford a big-boy rig? :(.
      Hey bud, do you realize how dumb this flexing thing is yet?

    • @citizensnips3850
      @citizensnips3850 Před rokem

      @WolfieMel Why are you poor? Stop being poor. Just make more money so you don't have to use vouchers. Isn't flexing fun?

  • @Jet-ij9zc
    @Jet-ij9zc Před rokem +201

    I really hope we'll get a GPU gen where manufacturers heavily focus on power efficiency instead of going all in on performance and giving up on everything else.

    • @Alte.Kameraden
      @Alte.Kameraden Před rokem +15

      I know. I was very happy with my 970 and 1070 Mini GPUs. I got so excited seeing a 970 and 1070 reduced down to about the size of a 950/1050 GPU. It was such a wonderful push in the right direction. All I can say now is: WTF is going on? I really think ray tracing is murdering the GPU market; the amount of oomph needed to make ray tracing work requires beefier GPUs than would have been necessary if RT didn't exist. So ever since the RTX series started getting introduced, bigger and bigger, more power-hungry GPUs just blew up.
      I really wish they'd abandon RT entirely. Of course Nvidia's solution is straight out of Cloudy with a Chance of Meatballs: "Bla bla bla bla, all I heard is bigger... and bigger is BETTER!"

    • @catcollision8371
      @catcollision8371 Před rokem +8

      I only purchase video cards that do not require any additional power connectors. :p
      1050ti was a great card, still is..

    • @korassssssss
      @korassssssss Před rokem +2

      @@catcollision8371 I'm still using 1050ti in my current rig, really good card and only 70 watts.

    • @narmale
      @narmale Před rokem +1

      LOL... cuz you never push the envelope by being cute and tiny

    • @callofduty611
      @callofduty611 Před rokem

      It'd be lovely, though I think we're reaching a certain limit with some more extreme innovations.
      And as always, people will complain and never be satisfied. Just to use the GTX 680 from a long time ago as an example, with a smaller, less power-hungry die: performance was the same as the AMD 7970, or better/worse depending on certain titles. People went crazy over the codename of the die being GK104, though, despite it delivering performance-wise.

  • @Scarhwk
    @Scarhwk Před rokem +482

    I think the lesson here is "always be skeptical when a new generation draws more power." Drawing more power just screams "we're hoping nobody notices how meager a performance improvement this really is."

    • @davidlee1646
      @davidlee1646 Před rokem +79

      Yeah this trend of hulking video cards is getting out of hand. It's like shoving SUV sized boxes into PC cases. I can see the climate police going after gamers if this continues.

    • @zakkart
      @zakkart Před rokem

      @@davidlee1646 There is no climate police

    • @timhorton8085
      @timhorton8085 Před rokem +17

      Welcome to the altar of maximum performance. The market demands higher performance, but engineers have not had the time or incentive to improve performance per watt. If you don't want a maxed-out, giant card, don't get one.

    • @abby8043
      @abby8043 Před rokem +8

      We really should have just gotten 30 series “super”

    • @timhorton8085
      @timhorton8085 Před rokem +4

      @@abby8043 A rose by any other name.

  • @mikeyhope7577
    @mikeyhope7577 Před rokem +1506

    AMD can seriously take back a major part of the gpu market as long as they aren’t greedy with their pricing

    • @NavySeal2k
      @NavySeal2k Před rokem +75

      It already has, with two 5700 XTs and a 6800 XT in my home. I saw the opportunity to send a message to Nvidia, especially as a Linux and enterprise-feature user…

    • @RicochetForce
      @RicochetForce Před rokem +97

      They need to seize on Nvidia's abandonment of the low and mid-range market and utterly claim it. By the time Nvidia comes back to its senses, they'll have to fight for that market again.

    • @liquidsunshine697
      @liquidsunshine697 Před rokem +1

      @@NavySeal2k I'm RMAing my 5700 XT, it's been a nightmare, but I might just stick with them, cause fuck Nvidia.

    • @reinhardtwilhelm5415
      @reinhardtwilhelm5415 Před rokem +96

      @@RicochetForce It's more than that - with these cards, Nvidia has abandoned the high end too. The "4080 12GB" that's really a 4070 is a perf/$ sidegrade from the 3080. That's never happened with a GPU this expensive before.

    • @Bambeakz
      @Bambeakz Před rokem +1

      And if it is true that the 40 series is really expensive to produce (and a lot were produced), then they can't even get the price much lower 😅

  • @ncauhfsuohd
    @ncauhfsuohd Před rokem +106

    Just sharing my recent experience to add to this: I recently decided to add two new HDDs in RAID 1 to my computer. To get to the SATA ports on the motherboard, I had to uninstall and reinstall the graphics card (that's 1 cycle). Then the computer would not turn on; the power supply had died on me. So I replaced it with a new power supply and cables (that's 2 cycles now). When I turned on the computer with the new power supply, my AIO pump had died. In troubleshooting that issue, I double-checked all my power cables (3 cycles). Solved the AIO issue and then turned back to setting up my new HDDs. One of the HDDs was not registering in BIOS, so I had to get back to the SATA ports (4 cycles) to double-check that they were properly connected. Checked again in BIOS and the same HDD still wasn't registering. So I replaced the SATA cable to that HDD (5 cycles).
    So, even in a pretty normal use case, you can go through a lot of cycles.

    • @danr.1299
      @danr.1299 Před 15 dny

      Why not just unplug the 12VHPWR at the PSU side? That's what I do when I have to remove my 4090, so the cable is always plugged into the GPU.
      I know this is a year old, but I'm just curious: if you are worried about it, why would you unplug it at the card and not just leave the cable in the card and remove it along with the whole cable?

  • @thomaswieland9284
    @thomaswieland9284 Před rokem +202

    As, for example, Steve has pointed out, old-style PCIe connectors have the same 30-cycle insertion limit, so that's not necessarily something new; what's new is the insane amount of power that can flow through the new connector.
    Regarding the 4-pin sense input, the table from the PCIe spec (e.g. at 3:20) shows that with both Sense0 and Sense1 inputs open, the card is not supposed to pull more than 100W on startup and 150W sustained. If there is nothing plugged into the 4-pin control input, those sense lines should be "open" and cards should be limited to that power. So either you'll seriously underpower your 40x0 GPU, or those cards don't adhere to the spec and ignore the sense inputs (see the sketch at the end of this thread).

    • @dguisinger
      @dguisinger Před rokem +4

      I was going to reply with the same thing; it doesn't seem to do what he says it's going to do.

    • @jetseverschuren
      @jetseverschuren Před rokem +1

      Yep, noticed that too

    • @AliasAlias-nm9df
      @AliasAlias-nm9df Před rokem +9

      Since all of these cards are shipping with adapters, and the 3090 Ti was sold with an adapter prior to widespread availability of ATX 3.0 supplies, I would conclude that the cards do not adhere to the spec. That said, we already operate in an environment where cards can draw more than we can supply. Just be cautious when selecting a power supply and don't play with the cables.

    • @DimitriosChannel
      @DimitriosChannel Před rokem +1

      Gonna need 1.21 Gigawatts!

    • @VikingDudee
      @VikingDudee Před rokem +1

      They do, but yesterday's cards also didn't draw as much power as a 3090 Ti or 4090, so past the 30 cycles you would most likely get away with it, or at worst see a crash and not notice a single thing. With these cards drawing so much power and current through such small pins, you have a higher risk of a loose connection wanting to arc and heat up over time. If they made the plug bigger, the pins and sleeves would be made with thicker metal and a bigger contact area.
      If you've ever installed outlets over the years: there are the push-in ones where you can just shove a wire into the back and it grips it, and the ones where you use the screw. The push-in design has a very small contact area and in my opinion is a fire hazard on anything on a 15-amp or larger circuit; I've seen them melt with a belt sander years ago. Always use the screws on outlets. If this connector just had a bigger contact area, the melting of the plug probably wouldn't be much of a concern.
      I feel this was just an oversight, and I honestly don't think the person designing this plug had any clue how current works when rating and testing it. But leave it to the educated dummies.
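
    A minimal sketch of the sense-pin behaviour discussed in this thread, based on the power-limit table shown in the video (around 3:20 / 5:50). The tier values follow that table, but which asymmetric pin combination maps to which middle tier is an assumption here and should be checked against the spec itself; the both-open row is the one the top comment refers to.

    ```python
    # Power limits a compliant card should respect, keyed by the state of the two
    # sideband sense pins. Which pin combination selects 450 W vs 300 W should be
    # verified against the actual ATX 3.0 / PCIe CEM table.
    POWER_LIMITS_W = {
        # (sense0_grounded, sense1_grounded): (initial at power-up, sustained)
        (True,  True):  (375, 600),
        (False, True):  (225, 450),
        (True,  False): (150, 300),
        (False, False): (100, 150),   # both open, e.g. no sideband connector at all
    }

    def allowed_power(sense0_grounded: bool, sense1_grounded: bool) -> tuple[int, int]:
        """Return (initial_watts, sustained_watts) a spec-compliant card may draw."""
        return POWER_LIMITS_W[(sense0_grounded, sense1_grounded)]

    # With the 4-pin sideband missing entirely, both sense lines float open:
    print(allowed_power(False, False))   # (100, 150) -> the limit discussed above
    ```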

  • @Varvarin22
    @Varvarin22 Před rokem +266

    *edit* the point of the comment is to talk about the connector itself breaking, not voltage/etc.
    I guarantee EVGA knew about the power cable issue; hence the release of the PowerLink product for the 3090 TI and 3090 TI Kingpin cards.
    I am glad I purchased the power link, as it keeps the main 16 pin cable fixed in place with screws, and has additional capacitors to help with spikes.
    EVGA states in the marketing of the PowerLink that it helps to "keep temperatures lower", and it makes me wonder if this was a statement that was sarcastically aimed at melting cables.

    • @Dave-kh6tx
      @Dave-kh6tx Před rokem +5

      I can see extra capacitors helping slightly with temps and spikes, if properly designed, but I don't see it helping with constant power draw or high temps at all.

    • @allanwilmath8226
      @allanwilmath8226 Před rokem +6

      It's not. When the voltage dips, the VRMs compensate by drawing even more current, which makes the heating worse wherever the current path is restricted, and it also makes the VRMs less efficient: the longer the VRM drivers stay on, the more heat is generated across the gates as they conduct current.
      What they should have done, if they wanted to reduce melting cables while keeping small connectors with fewer pins, is increase the voltage from 12 to 24 or even 48 volts, thus reducing current by 2 to 4 times and reducing the effect of voltage drops across connectors and wires in the process (see the quick numbers below).
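
      A quick back-of-the-envelope check of that point; the 5 mΩ loop resistance is an assumed figure purely for illustration:

      ```python
      # For a fixed power target, current scales as P / V and resistive heating as I^2 * R.
      power_w = 600.0
      loop_resistance_ohm = 0.005   # assumed 5 mohm of connector/cable resistance, illustrative only

      for volts in (12, 24, 48):
          amps = power_w / volts
          heat_w = amps ** 2 * loop_resistance_ohm
          print(f"{volts:>2} V -> {amps:5.1f} A, ~{heat_w:4.1f} W wasted in 5 mohm")
      # 12 V -> 50 A (~12.5 W), 24 V -> 25 A (heat / 4), 48 V -> 12.5 A (heat / 16)
      ```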

    • @kevingaukel4950
      @kevingaukel4950 Před rokem +6

      I know EVGA broke relations with NVIDIA because of disrespect, but I really wonder if EVGA saw this coming and was told they couldn't modify the power plugs.

    • @BKDarius27
      @BKDarius27 Před rokem +2

      Can you tell me what this powerlink thing is? Because i may buy a 3090 so i would like to know

    • @kuma8030
      @kuma8030 Před rokem

      @@BKDarius27 just search it up bro

  • @tspencer227
    @tspencer227 Před rokem +786

    As an electrical engineer, there's a reason the NEC has very, very specific requirements for bend radii for conductors, cables, and conduit. Good on EVGA to consider that for this application (especially considering how much current power cables for modern GPUs are carrying, and therefore heat from I^2R losses; rough numbers after this thread), and I'm just waiting now for IEEE to create a more stringent standard. Because yeah, shit's gonna start catching on fire.

    • @enntense
      @enntense Před rokem +52

      Someone's got to make a hard 90-degree connector... right?...

    • @MrQuequito
      @MrQuequito Před rokem +92

      These puny cables are drawing more amps than my wall outlets. That's scary to be honest, and yeah, that's a fire waiting to happen.

    • @kanlu5199
      @kanlu5199 Před rokem +17

      That would require introducing a 36 V rail in the power supply.

    • @Flor-ian
      @Flor-ian Před rokem +15

      Mechanical here, god if only the IEEE was efficient

    • @Alex-bi8ob
      @Alex-bi8ob Před rokem +10

      Maybe I am missing something, but don't these PSUs have overcurrent protection for each rail, which would avoid overloading in general? It wouldn't necessarily avoid this cable-bending issue, though.
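
    Rough per-contact numbers behind the I^2R point made at the top of this thread. The pin count, supply voltage, and total power follow the 12VHPWR figures discussed in the video; the contact resistances are assumed values for illustration only.

    ```python
    # 600 W at 12 V shared across the six 12 V pins of the connector.
    total_power_w = 600.0
    supply_v = 12.0
    current_pins = 6

    amps_per_pin = total_power_w / supply_v / current_pins    # ~8.3 A per pin

    # A clean contact vs. one that is worn or partially backed out (assumed ~10x worse).
    for label, r_contact_ohm in (("good contact", 0.003), ("degraded contact", 0.030)):
        heat_w = amps_per_pin ** 2 * r_contact_ohm
        print(f"{label}: {amps_per_pin:.1f} A through {r_contact_ohm * 1000:.0f} mohm -> {heat_w:.2f} W at one tiny pin")
    ```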

  • @jokerproduction
    @jokerproduction Před rokem +6

    eTeknix just covered this with JonnyGURU and there is absolutely zero cause for concern. Literally zero.

  • @emilybjoerk
    @emilybjoerk Před rokem +2

    Based on the diagram shown, the "sideband channel" for the 12VHPWR cable is just a few voltage sense pins; there's no actual data communication like USB does. The device end would just sense this with some pull-ups. This means that an adapter cable can wire up the sense pins appropriately to "communicate" the maximum power allowed, assuming all the PCIe cables are plugged into the adapter and rated at the ATX 2.0 power limits for those plug sizes. I'm an electrical engineer.

  • @-EchoesIntoEternity-
    @-EchoesIntoEternity- Před rokem +317

    never interrupt your enemies from making mistakes....
    AMD: *nodding silently*

    • @tablettablete186
      @tablettablete186 Před rokem +18

      -Sun tzu (for real this time)

    • @paulustrucenus
      @paulustrucenus Před rokem +22

      @@tablettablete186 I think it was Napoleon, not Sun Tzu.
      Edit : after a quick Google search, I can confirm Sun Tzu never said that. The quote is attributed to Napoleon.

    • @tablettablete186
      @tablettablete186 Před rokem +3

      @@paulustrucenus Wait, really!? I could swear that I read this quote in Art of War.
      Either way, thanks for the heads-up!

    • @paulustrucenus
      @paulustrucenus Před rokem +2

      @@tablettablete186 Napoleon is another famous war guy so the confusion is easy.

    • @deplorablerach
      @deplorablerach Před rokem

      never ever

  • @lord_bozo
    @lord_bozo Před rokem +821

    Jay, you should stress test it with a range of "worst case scenarios": an ATX 2.0 PSU, a 3-hour gaming session, and a fire extinguisher. Take the hit so others don't have to, and it will make for an epic thumbnail and good content too. Come on man, you know Linus is going to try to beat ya to it.

    • @killerhawks
      @killerhawks Před rokem +62

      While I'd love to see Jay "blow up" a PSU, let's leave that to the "experts" @gamersnexus, k..

    • @Top3Entertainment
      @Top3Entertainment Před rokem +76

      Throw in a Gigabyte PSU, an NZXT case, and an Asus Z690 motherboard and we will really see if it's fireproof.

    • @CommanderJPS
      @CommanderJPS Před rokem +38

      @@Top3Entertainment fireproof or a new bomb recipe for the anarchists cookbook? 🤣💥

    • @Top3Entertainment
      @Top3Entertainment Před rokem +7

      @@CommanderJPS judgement day

    • @killerhawks
      @killerhawks Před rokem +3

      @@Top3Entertainment i don't think that motherboard will fit in the H1 CASE...LOL

  • @Warsign01
    @Warsign01 Před rokem +40

    Great to know, thanks Jay. Starting to see more reasons why EVGA may have backed out.

  • @coach357
    @coach357 Před rokem +29

    If you go see the GeForce RTX 40 Series & Power Specifications thread on the Nvidia forum, you will see that they are using an active circuit inside the adapter that translates the 8-pin plug status into the correct sideband signals according to the PCIe Gen 5 (ATX 3.0) spec. It is not a passive adapter, so there is nothing to worry about if you have a good ATX 2.0 PSU with the minimum power required for the 4090 card (see the sketch at the end of this thread).

    • @MegaTheJohny
      @MegaTheJohny Před rokem

      Thanks for this comment. I have a 1300 W Seasonic Platinum PSU, but it is a few years old. Should I be safe to use it with an RTX 4090?

    • @thisisashan
      @thisisashan Před rokem +1

      That has nothing to do with his concern here.
      His concern here is there are reports, from both nvidia and users, of the smaller pins melting after being unplugged and replugged several times.
      This has nothing to do with what you are talking about here.

    • @coach357
      @coach357 Před rokem +4

      @@MegaTheJohny Your PSU will work fine with the NVIDIA adapter.

    • @coach357
      @coach357 Před rokem +2

      @@thisisashan You should rewatch the video and also check the Nvidia forum about the melting issue with the adapter. There is no more risk than with all the PCIe connectors we used before; the 30-connection limit has always applied. You really think NVIDIA would release a new video card with a stupid passive adapter that can catch fire? Seriously? Do your homework and buy a new pricey ATX 3.0 PSU if you are afraid of burning your PC. And by the way, listen at the 7-minute mark and you will understand (I hope) why I made my comment: what Jay says is absolutely false, as the active adapter balances power between the PCIe connectors and prevents overcurrent on one cable that could result in cable melting, regardless of the 30 connections.
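
    A hedged sketch of what this thread describes: an adapter that advertises a power limit based on how many 8-pin PCIe plugs are actually populated. The wattage tiers below are assumptions for illustration (roughly one tier per populated 8-pin), not NVIDIA's published behaviour.

    ```python
    # Hypothetical adapter-side logic: map populated 8-pin inputs to the sideband tier
    # it would advertise to the card. Tier values are assumed, not from NVIDIA documentation.
    def advertised_limit_watts(populated_8pins: int) -> int:
        tiers = {2: 300, 3: 450, 4: 600}
        return tiers.get(populated_8pins, 150)   # fewer than two plugs -> lowest tier

    for n in range(1, 5):
        print(f"{n} x 8-pin populated -> adapter advertises {advertised_limit_watts(n)} W")
    ```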

  • @johnnyvvlog
    @johnnyvvlog Před rokem +531

    A single graphics card consuming more power than my entire server stack. That's totally insane, especially given the current power prices.
    And then you see that tiny connector; it is doomed to cause massive problems and fires...

    • @WizeguyGaming
      @WizeguyGaming Před rokem +15

      LOL. Like complaining about gas prices while driving a Bentley. Go get yourself a Kia. No shame in staying within the parameters of what you can afford.

    • @gggiiia147
      @gggiiia147 Před rokem +74

      @@WizeguyGaming bro still 600w is lot of power for a single component no cap

    • @C3l3bi1
      @C3l3bi1 Před rokem +38

      @@WizeguyGaming yeah if that kia consumed as much fuel as a bentley i would be pretty pissed.

    • @robertbolzicco9995
      @robertbolzicco9995 Před rokem +36

      @@gggiiia147 maybe gene doesn't pay his own hydro/electric bill lol. 600w is insane, don't let anyone tell you different. Soon you will need a dedicated electrical circuit just for the PC lol. Call an electrician!

    • @craigbrown6628
      @craigbrown6628 Před rokem +40

      Totally this, and in the context of a world where meeting power demands at a cost effective price is becoming harder and harder. The whole 40 series launch feels a bit tone deaf.
      You know at this point I would actually have been more impressed if they had gone for ZERO gain in performance but a significant reduction in power.

  • @jayhill2193
    @jayhill2193 Před rokem +603

    Looks like Gamers Nexus and LTT couldn't have chosen a better time to go all in on PSU testing. Their results and reviews are going to be interesting.

    • @MatthewSemones
      @MatthewSemones Před rokem +5

      you think that was just a happy accident? :P

    • @mroutcast8515
      @mroutcast8515 Před rokem +18

      LTT 🤣🤣🤣🤣🤣🤣🤣🤣

    • @TribeXFire
      @TribeXFire Před rokem +23

      @@mroutcast8515 I do believe that LTT will do good work. It would be interesting to see how their lab department will develop.

    • @mroutcast8515
      @mroutcast8515 Před rokem +16

      @@TribeXFire mhm, just like their current product reviews - 5% benchmarks and 95% talking crap. It's utter cringe channel.

    • @eng3d
      @eng3d Před rokem +6

      @@mroutcast8515 it's funny, but they also do the job

  • @glitch9211
    @glitch9211 Před rokem +22

    Reminds me of the good old days, where every new game seemed to require a video card and RAM upgrade. Within two or three cycles, it was a motherboard upgrade to support the RAM. Of course, this is when RAM was about $100 a MB.

    • @JorgetePanete
      @JorgetePanete Před rokem +1

      And the reason is similar: unoptimized games (ignoring ridiculous hardware)

  • @fayathon
    @fayathon Před rokem +7

    This, this is one of the big reasons I sub to you, Jay; this is damned good info to have, and I will link this to anyone I hear is getting a 40 series, as a precaution, so that we have fewer fires going forward. Thanks for the heads up.

  • @ffwast
    @ffwast Před rokem +938

    The good news about these power supply problems with the 40 series is that you already shouldn't buy a 40 series anyways.

    • @justjami9619
      @justjami9619 Před rokem +11

      lol fair enough

    • @DarthCuda
      @DarthCuda Před rokem +14

      Why not? I am probably going to get a 4090.

    • @CrimsonFader
      @CrimsonFader Před rokem +84

      @@DarthCuda Very expensive, power efficiency nonexistent, apparently a fire issue? Not sure, haven't watched the full video yet. DisplayPort 2.0 is missing; it's DisplayPort 1.4a I believe, which is not good for future VR headsets.
      Edit: I would wait for AMD just in case it is better at all/any of these at a better price. They might not be, but the AMD launch is only November 3rd.

    • @pixiesworld6367
      @pixiesworld6367 Před rokem +58

      @@DarthCuda I hope you enjoy those fake fps 😄

    • @terribleatgames-rippedoff
      @terribleatgames-rippedoff Před rokem +49

      @@DarthCuda There are cheaper options around for room heating, you know...

  • @mistirion4929
    @mistirion4929 Před rokem +151

    Electrical engineer here.
    Seriously, if I knew that my product (in this case a simple connector) would be a serious safety hazard and I had to warn my customers that they "should not plug it in and out too often" I would be ashamed of selling it.
    Honestly, before I even dare to announce it, I would redesign it completely if I knew that it would be this fragile and not durable at all. The amount of current that's flowing through these tiny connectors and wires is insane. If you make an adapter do it right and don't indirectly force your customers to buy the ideal solution so they don't have to be afraid of a fire hazard.

    • @B20C0
      @B20C0 Před rokem +19

      This will be even more interesting in Europe, 4090s could actually be banned from the European market if this isn't resolved before the EU launch. They would simply refuse CE certification.

    • @brandontrish86
      @brandontrish86 Před rokem +12

      As a fellow EE, I was thinking something quite similar. The moment I saw that plug I immediately knew that it was not safely capable of delivering 50A of current. There's just not enough surface area. This is going to be a massive failure point until the day it's superseded.

    • @CommanderJPS
      @CommanderJPS Před rokem +1

      This is unfortunately how things are going with such disposable tat: we have to keep manufacturing rubbish to earn more...
      What happened to making something which will last?

    • @monad_tcp
      @monad_tcp Před rokem +2

      I don't get why they don't use a standard XT60 connector if they are going to pull 40A over that bit.

    • @monad_tcp
      @monad_tcp Před rokem +3

      I'm not a professional EE, I'm only an amateur; I know a bit, and now I'm worried about the ounces of copper they used and how big the power rails/traces are on the power section of the graphics card, where the VRMs (voltage regulator modules) are. 50A is no joke. I've seen cheap motherboards get burned silkscreen from all that heat.
      One thing is for sure: I'm not buying the stock model of that GPU before some of the better manufacturers, the ones that usually do overclocking and know their shit, get their hands on it and fix the design.

  • @zodak9999b
    @zodak9999b Před rokem +7

    On the table at 5:50, the bottom row shows that if the sense pins are both open circuit, then the max power allowed at startup is 100 W, and 150 W after software configuration. So on a cable that doesn't have those pins, there shouldn't be a problem if the card is designed for ATX 3.0.

    • @EmptyZoo393
      @EmptyZoo393 Před rokem +1

      They communicate via simple grounding, though. I doubt the graphics card or PSU manufacturers are going to mess with it too much, but those adapters will be copied by third parties that may just ground all of those.

    • @ano_nym
      @ano_nym Před rokem

      Saw that too. But the card makers will probably just ignore that, because otherwise a lot of people with older PSUs will complain and want refunds because their cards aren't running up to spec. Or perhaps they'll include some workaround, like the adapter grounding the pins, and say to use the adapter at your own risk.

  • @zeonos
    @zeonos Před rokem +2

    I would love it if they put the connector facing down (same side as the fans), so the cable goes down into the bottom cable management.
    Having them face the side or front of the case requires you to bend the cables and causes issues in smaller cases.

  • @foglebr
    @foglebr Před rokem +490

    Here’s what sucks, it’s going to take a fire or two before this issue is truly dealt with. Which is seriously sad. It’s great that you and others are highlighting this issue, I just hope consumers actually get and implement the knowledge to prevent a dangerous condition because of this plug design.

    • @jwar375
      @jwar375 Před rokem +14

      I hate the fact that you're probably right.

    • @Jon-pc6ch
      @Jon-pc6ch Před rokem +14

      The problem will be when the average consumer buys a shoddy pre-built machine.

    • @tomglover98
      @tomglover98 Před rokem +11

      @@Jon-pc6ch Not necessarily. Most people buying pre-builts don't touch the insides and rarely disassemble.
      It's more of an issue for clean freaks like me. I take my GPU out of the case so as not to recirculate dust inside; welp..... now I can't anymore, or only 30 times before potentially damaging hardware.
      However, if it's the case that it can be unplugged at the modular power supply end instead without damage, then that would be an OK compromise.

    • @deViant14
      @deViant14 Před rokem +3

      @@tomglover98 we salute you for taking one for the team

    • @aitorbleda8267
      @aitorbleda8267 Před rokem +5

      More like tens of fires at least.
      Seems unsafe.. and they could have added a thermistor and two very fine cables to know the connector/cable temp.

  • @mromutt
    @mromutt Před rokem +477

    Man, I had already decided I don't want a 4080/90, and most likely not a 4000 series card at all, but this sold me on wanting nothing to do with these. I really hope Intel actually keeps going and improves, and AMD doesn't let this opportunity to stomp on Nvidia slip through their fingers this go-around.

    • @exploranator
      @exploranator Před rokem

      AMD IS CALLING YOUUUUU?

    • @RobertNES816
      @RobertNES816 Před rokem +23

      NVIDIA is going to be a thing of the past like 3DFX. They're too cocky and too full of themselves. At this point I want NVIDIA to go out of business, this is coming from a former NVIDIA Fanboy.

    • @Lucrativecris
      @Lucrativecris Před rokem +5

      Im sticking with the 3060

    • @ZackSNetwork
      @ZackSNetwork Před rokem +10

      @@RobertNES816 So you want AMD to be the new king of the market and Intel takes AMD’s current place?

    • @Cyromantik
      @Cyromantik Před rokem +5

      Intel already said they're committed and won't quit their push to make and improve upon their Arc cards, and considering they can source their own silicon, they should be able to back that up!

  • @coreymasson6591
    @coreymasson6591 Před rokem

    Thank you for supplying (no pun intended) an excellent article on this issue. Also major respect for including Steve's Link at the end. Shows that you care about your viewers and will do anything to keep them informed and safe.

  • @gwaites6329
    @gwaites6329 Před rokem +9

    Dangggggg hah. Thanks a ton for always doing the vid coverage and getting us the info in your entertaining way and easily understandable. Implications for me are hard to say, haven't considered buying a new PC for 2+ years now buuuuuut kept up on the market. Now I have so much to learn to potentially grab a 3080 to upgrade from my 1080 before it's too late and decide if I just give up on 40 series or wait another year for things to stabilize. Eitherway sooooo glad I'm starting. Anew job this year. If I keep myself from buying a sports/muscle car as a first car like a dumbass, I can save a good bit on my daily commuter and finalyyyyyyy push for a PC refresh. But damnnnnnnmnn. Still can't believe power supplies gonna get the shaft. Buuuuuut glad I still have my evga g3. Fucking love them and glad they'll atleast be making psu for me yayyyy. Cheeeeeeeeeeeeeeeers gogogogogo. 3080 evga for me hah

  • @kaimeraprime
    @kaimeraprime Před rokem +376

    used nvidia for decades but the 40 series is making me want to see what team red has to offer.

    • @Schwing27
      @Schwing27 Před rokem

      Same. This is the first time I’m not buying a new GPU every 2 years. Nvidia can F off.

    • @nando03012009
      @nando03012009 Před rokem +11

      I hear ya.. I haven't used an AMD (ATI) card since the R9 Nano. I may look into an AMD card going forward. My 3090 should last 2 more years.

    • @MemTMCR
      @MemTMCR Před rokem +20

      Having used AMD cards for as long as I can remember,
      let me warn you: weird game and PC crashes have been a legitimate problem on some of my systems.

    • @shawnd7
      @shawnd7 Před rokem +2

      @@MemTMCR Have you owned an RX 5700 XT? That's where I had my problems, but glad RDNA 2 fixed those.

    • @Dlo_BestLife
      @Dlo_BestLife Před rokem +18

      Everyone should always look at all the products available before making a purchase. Just my opinion.

  • @nonaurbizniz7440
    @nonaurbizniz7440 Před rokem +376

    I think I see EVGA's strategy for this change. They will be making a fortune producing the new ATX 3.0 PSUs because of the standard change and the flood of upgrades, so there's no need to bother with GPU sales.

    • @ZappyOh
      @ZappyOh Před rokem +113

      EVGA bailed from the GPU market just before the actual deadly fires start, warranty claims skyrocket, and potential lawsuits or even criminal investigations begin.
      Smart move!

    • @subodai85
      @subodai85 Před rokem +7

      I'd said this too; it makes complete business sense.

    • @nabieladrian
      @nabieladrian Před rokem +53

      Every single day since the news, EVGA's move makes more sense.

    • @earthtaurus5515
      @earthtaurus5515 Před rokem

      @@ZappyOh EVGA didn't bail, Nvidia forced them out. Massive difference.

    • @steamroller9945
      @steamroller9945 Před rokem +29

      @@earthtaurus5515 ??? You got any evidence bud?

  • @ujiltromm7358
    @ujiltromm7358 Před rokem +2

    I don't know who specced the connector, but Molex (turns out Molex is a company and makes the PCIe plug) validated the specs. That's at least one culprit.
    One thing I'm wondering about is connector contact area. The pins are tiny rods that make contact within the pin wells. Now, we all know how we use thermal paste on CPUs to fill all the asperities of the IHS and the cooler coldplate to reduce thermal resistance and ensure higher thermal transfer. Well, those pins have asperities too, so I expect the pins not to make full contact with the wells, increasing electrical resistance there and thus waste heat, leading to thermal runaway and ultimately melting the plug. I wonder if there exists a way to make a "contact paste" (one that's not solid like, well, solder) to reduce the resistance. I guess making sure the customer causes no short would be a major issue though...

  • @mikkokuorttinen3113
    @mikkokuorttinen3113 Před rokem +6

    Great thanks to you, Jay, for the attention on this topic! In addition to the facts you are already pointing out here, there is additional stress on the GPU power delivery due to the actual length of the cables: the longer the cable, the more resistance and the more heat created (rough numbers below). People doing their own home builds may not always remember to consider this.
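
    A small sketch of the cable-length point. The per-metre resistances are typical copper figures for 18 and 16 AWG, and the per-conductor current assumes a 600 W / 12 V load spread over six conductors; both are illustrative assumptions, not measurements of any specific cable.

    ```python
    # Resistive loss grows linearly with cable length (out on 12 V, back on ground).
    amps_per_conductor = 600.0 / 12.0 / 6          # ~8.3 A per conductor pair

    for gauge, ohm_per_m in (("18 AWG", 0.021), ("16 AWG", 0.013)):
        for length_m in (0.3, 0.6, 1.0):
            loop_r = ohm_per_m * length_m * 2      # supply conductor + return conductor
            loss_w = amps_per_conductor ** 2 * loop_r
            print(f"{gauge}, {length_m:.1f} m run: ~{loss_w:.2f} W dissipated per conductor pair")
    ```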

  • @buddymac1
    @buddymac1 Před rokem +260

    30 cycles? That is HORRIFYING. I have a friend who's had his 2080 for 4 whole years now. He got one at launch and hasn't needed to spend to upgrade his GPU since then, but he has swapped/upgraded his motherboard, case, NVMe, PSU, CPU cooler and probably some more stuff. Each time, he's unplugged that thing at least once, maybe 2 or 3 times for a test boot outside the PC. When you add that plus when we built it, he's probably pushing 30 uses, and the SCARIEST PART is that there are thousands of people out there, maybe even more, who have definitely passed that 30-cycle safe rating. Imagine in 5 years, when they're 4080s instead of 2080s, and these things start popping off because they're running SIX-HUNDRED FUCKING WATTS through a CONNECTOR that is NOT SAFETY rated for that.

    • @Hungerperson
      @Hungerperson Před rokem +23

      As someone who works in a computer repair shop, if a computer has some issues or takes a lot of troubleshooting, then just one visit to our shop to fix an issue that could be unrelated to the gpu can take a third or even more of those cycles. It’s terrible

    • @link1565V2
      @link1565V2 Před rokem +1

      I still run my gtx 680 in an XP machine lol
      It would have to be pushing 30 plug cycles.

    • @James-wd9ib
      @James-wd9ib Před rokem +25

      If you're a manufacturer and you have to make consumers count BS "plug cycles", you can take a step back and look at your designs- I'm sure you'll find that you've been doing something very, VERY, V-E-R-Y stupid

    • @an4rchy_yt
      @an4rchy_yt Před rokem +4

      hence why evga left?

    • @an4rchy_yt
      @an4rchy_yt Před rokem +4

      @@James-wd9ib cost cutting and material limitations. p l a s t i c s .

  • @Griffolion0
    @Griffolion0 Před rokem +327

    It seems this issue is definitely more universal, but Nvidia just became the perfect demonstration case given the insane power requirements of the new line of cards. We had almost 2 decades of not having to worry about this, and now those of us old enough to remember are going to relive the nightmares of cable adapters, fear of fires, fear of blowing your cards, etc.

    • @Ghastly10
      @Ghastly10 Před rokem +31

      Exactly, unfortunately reading some of the other comments here, some folks are going the AMD vs Nvidia route and not really understanding that this problem is with the new ATX 3.0 standard. With Nvidia using the new ATX 3.0 standard, they have demonstrated one of the weaknesses of ATX 3.0.

    • @theKCMOPAR
      @theKCMOPAR Před rokem +7

      I agree with this. I'm just bummed cause I don't think nvidia and amd realize how expensive psu are. At this point you might as well do a completely new build.

    • @Lordpickleboy
      @Lordpickleboy Před rokem +2

      We will see many, many house fires, I think.

    • @spankbuda7466
      @spankbuda7466 Před rokem +19

      And since you're old enough, you should know never to buy the first generation of anything, because these companies use consumers as their real-world test subjects.

    • @Nemesis1ism
      @Nemesis1ism Před rokem

      No, only foolish people will have those sorts of problems. If you put a 450-watt GPU in a PC, you're an idiot.

  • @TayschrennSedai
    @TayschrennSedai Před rokem +2

    It's worth noting that we use these adapters in the server world today. However, they're the opposite: you pull from one or two motherboard ports that provide the power. And those are 1200 W Platinum or higher PSUs in the servers.

  • @____________________________.x

    Engineer here. We already have multiple types of power connector rated at >500 mating cycles, built especially for this type of application. But no, they had to dream up this new Frankenstein connector, which is easily the most badly designed implementation I've seen since the Molex.
    There are well-known problems with insertion cycles on high-current connectors, BUT WE HAVE ALREADY SOLVED THIS PROBLEM. Until now, when they decided to UNSOLVE everything we've done.
    It's infuriating and so anti-consumer.

  • @clintono
    @clintono Před rokem +206

    Jay, you have missed the fact that if the card does not receive a signal from the PSU it MUST (by the standard) default to the lowest setting. So people buying a 4090 are going to end up with 150W cards if they don't use an ATX 3.0 PSU.

    • @Secunder
      @Secunder Před rokem +15

      Wait.. you're joking, right?

    • @nanoflower1
      @nanoflower1 Před rokem +9

      @@Secunder Hopefully that's only at bootup because you know most people who are likely to buy 4090s already have high end PSUs and so will not be buying a new 3.0 PSU to go with their 4090. So they will be quite upset if they aren't getting the expected performance (because the card limited the power draw to 150 watts.)

    • @tehf00n
      @tehf00n Před rokem +15

      @@Secunder he's right.

    • @clintono
      @clintono Před rokem +10

      @@nanoflower1 At boot the card can only draw 100W, MAX power draw 150W.

    • @louiscorrigan3865
      @louiscorrigan3865 Před rokem +12

      Yeah, this comment is important. The spec is designed to prevent over-draw (at least judging by the table), so with no data pins going to ground the card will (at least by spec) only draw 150W at peak sustained power.

  • @ErikCPianoman
    @ErikCPianoman Před rokem +67

    Rated for only 30 cycles? And this is a full launch consumer product, not some sort of alpha/beta build/prototype? This is a fire hazard and unacceptable. Period. Here's hoping this trend of brute-forcing generational improvement stops and efficiency takes precedence. 600W power draw is just too much.

    • @marcust478
      @marcust478 Před rokem +3

      Agree. This is actually scary.
      Imagine fucking up your PSU, your card, and even setting fire to your own house because of Nvidia and this madness.

    • @golfgrouch
      @golfgrouch Před rokem +5

      The 30 cycle spec is the same as it’s been for the last 20+ years on existing PCIe/ATX 8pin connectors, and has not changed with the PCIe Gen5 connectors or power adapters.

    • @ertai222
      @ertai222 Před rokem +1

      For sure. No GPU should be needing 600+ watts ever.

  • @virtualxip
    @virtualxip Před rokem +3

    @Jay: I don't think these are data lines per se. The PSU and the graphics card don't communicate back and forth. They're more like jumpers, at least that's what the displayed specs suggest.
    So basically there could be specific adapter cables for different PSU ratings. Or you could have cables that include jumpers or DIP switches.
    (Of course that's risky if you don't know what you're doing, no doubt about that.)

  • @AndrewFrink
    @AndrewFrink Před rokem +14

    Maybe I missed something here, but the chart at 6:02 shows that with Sense1 and Sense0 both open (not connected to ground, as if the plug were missing) the cards should self-limit to 100W on boot and 150W sustained after that. So a compliant ATX 3.0-based 4090 plugged into those adapters should just limit to 350W. My guess is that some of these PSU cables will have the side connectors and that they will be connected or switchable.

    • @LavitosExodius
      @LavitosExodius Před rokem

      The part you're missing is that you're relying on this to work correctly. This is a new thing altogether; there will be glitches in the system, and some cards may not work correctly. Sure, you can RMA them, but it's a little late at that point if it blows your PSU.

    • @AndrewFrink
      @AndrewFrink Před rokem

      @@LavitosExodius well then the card isn't pcie3.0 compliant and shouldn't have the sticker on the box. We have f#$king standards for a reason. The card vendor could just ship adapter cables for use with non 3.0 PSUs with a great big warning, but if I was engineering at an OEM, I'd be pretty against that. Let a third party do that.

    • @LavitosExodius
      @LavitosExodius Před rokem

      @@AndrewFrink No, simply put, with all new products there will be bugs and glitches that need to be worked out. It would be wise to remember that all new products have these issues. I'm sure the GPU makers will do their best to make sure each card is compliant, but you know, Murphy's Law and all that.

    • @AndrewFrink
      @AndrewFrink Před rokem

      @@LavitosExodius i think we're just going to have to disagree. A card that doesn't respect the power config is not a "bug". Especially in a $1000+ GPU.

    • @LavitosExodius
      @LavitosExodius Před rokem

      @@AndrewFrink You missed the point: this whole system was designed by humans, and there is a chance, however small, that the system can fail. If it's every card that fails like that, sure, not a bug. But only a handful? That would be a glitch/bug somewhere in the detection method for the sensors. Or, put more simply: don't roll the dice with these cards and hope your 650-watt PSU will power them safely unless it is a Gen 3 PSU. Even then I wouldn't roll the dice. Granted, there is an argument to be made that if you're buying a 4080 or higher, you probably shouldn't have skimped on the PSU to begin with.

  • @Quendk
    @Quendk Před rokem +68

    This is what we need, the Brutal Truth.
    Seems like this gen for Nvidia should rather be skipped until all the kinks are worked out. Don't think Intel is going to have much of a better response to power draw either.
    AMD really has the edge. I'm glad that we have the competition to drive innovation.

  • @danishprince2760
    @danishprince2760 Před rokem +252

    I was really planning on getting a 4000 series card and finally upgrading after 8 years, but between the EVGA stuff, the 4070/4080 naming bullshit, the insane price increase over the 3000 series, and now this.. my faith in Nvidia is just gone, and creating a good product can only go so far in keeping consumers.
    AMD is truly in a good position to grab market share depending on what they show off in November!

    • @damunzy
      @damunzy Před rokem +10

      Get the EVGA 3090ti for $1400. It'll last until the 7000 series comes out

    • @nk-dw2hm
      @nk-dw2hm Před rokem +13

      @@damunzy or literally any amd card and have better support that will improve over time, unlike nvidia

    • @MRBURNTTOAST111
      @MRBURNTTOAST111 Před rokem +7

      @@damunzy if you don’t use ray tracing there is literally no reason to not get a amd 6000 card it’s so much more efficient. My rx 6800 undervolted only uses 200 watts at max and gets similar fps compared to a 3080

    • @thenonexistinghero
      @thenonexistinghero Před rokem +8

      Insane price increase? Sounds like you have no common sense. Inflation's been crazy. The 4090 is only $100 more expensive than the 3090... looking at inflation that took place between the releases... you're actually paying a bit less. Similar story for the 4080 16 GB. Only the 4080 12 GB got an unreasonable price hike compared to the regular 4080... especially when you consider the fact that it's actually a 4070.

    • @danishprince2760
      @danishprince2760 Před rokem +23

      @@thenonexistinghero The 4080 is $500 more than the 3080 and the 4070 is $400 more than the 3070. Those are not reasonable increases due to inflation lol

  • @andrewsuryali8540
    @andrewsuryali8540 Před rokem +68

    Also, if you play around with ITX it's pretty easy to get to 30 mating cycles because usually you'll need to plug and unplug several times during build just to get all the components to fit and tidy up the cabling, then every time you need to clean up you'll likely have to unplug the cables again to get to some components.

    • @lucianooscarmasciulli6200
      @lucianooscarmasciulli6200 Před rokem +15

      "It's pretty easy to get to 30 mating cycles"
      ..nice

    • @GoldenTiger01
      @GoldenTiger01 Před rokem +3

      Then stop being a noob and needing to plug/unplug so many times. They make digital calipers so you can measure the connectors and any other space constraints.

    • @filthyrando3632
      @filthyrando3632 Před rokem +1

      @@lucianooscarmasciulli6200 nice

    • @MoonOvIce
      @MoonOvIce Před rokem

      @@filthyrando3632 Filthy Rando 🤣

    • @JayOhm
      @JayOhm Před rokem

      @Erick Nava Yes, that. Also, they make some nice 3D CAD software, so you can triple-check and actually see your final PC design, with cable routing and all that, before you even start assembling anything! What kind of idiot would use trial and error instead of the proper engineering tools?
      Edit: Oh, and don't forget thermal simulations, to ensure your planned cooling solution can stay quiet without getting blisteringly hot, and to fix it if it can't, before it gets a chance to become a problem.
      Edit2: And mechanical stress simulations, don't forget about them! These new GPUs aren't exactly lightweight.

  • @FreakyOptics
    @FreakyOptics Před rokem +12

    This video further solidified my decision not to get a 40 series. I knew deep down, after it was announced that the adapters have a connect/disconnect life, that it sounded like a fire hazard already.

    • @doghous3
      @doghous3 Před rokem +1

      After seeing this, I'm thinking the same. To me it sounds like the connector/wiring is underspec. Melted wiring - come on! I may change my mind if a rev 2.0 comes out with appropriate connectors. I guess time will tell.

  • @silentq15
    @silentq15 Před rokem +216

    After watching this, I am starting to feel like a discussion needs to be had as to why these GPUs are not getting more power efficient at all. It seems like they are chasing that "2x greater than the previous generation" buzz by just increasing the power. Something just seems off about that.

    • @MichaelWerneburg
      @MichaelWerneburg Před rokem +33

      Especially while the planet lurches through energy and climate crises.

    • @reappermen
      @reappermen Před rokem +33

      Oh, they absolutely are getting more power efficient. It's just that Nvidia (and to a far lesser extent AMD) currently think that it is not enough to have 15-30% better performance for the same power, so they also up the power draw to gain more total performance.
      Plus, the GDDR6X VRAM that Nvidia uses is VERY power hungry, so upping the amount of VRAM pretty much linearly increases power draw.
      (I don't have exact numbers in memory, but higher-end 3000 cards pulled high into the double digits of watts just for the VRAM under heavy load.)

    • @BillyAltDel
      @BillyAltDel Před rokem +21

      People have been able to undervolt 30-series, drop like 100W, and maybe lose like 3% performance. Ridiculous.

    • @JHattsy
      @JHattsy Před rokem +2

      Yah I'm starting to feel like they're trying to turn GPUs into console release timings, they just want to put one out every 2-3 years and make the previous one useless.

    • @MartinKrol
      @MartinKrol Před rokem

      @@MichaelWerneburg theres no climate crisis. change, sure, crisis, not even remotely close. not even close in the next 100 years.

  • @Rick_Makes
    @Rick_Makes Před rokem +79

    I'm no engineer but surely a bank of pcie connectors and a separate data cable would be the smart idea for something with such a high draw. I'm wondering if that's part of the reason EVGA have dumped Nvidia. If Nvidia were making them use a connector that's barely fit for purpose it would really hold back EVGA from making the high end overclocking cards and that's one of the things EVGA are known for.

    • @2Fast4Mellow
      @2Fast4Mellow Před rokem +6

      That data-pin combo table should have been reversed. Only when there are two powered data pins should the GPU be able to draw 600W; otherwise it is restricted to 100W (no communication). But I guess for backwards compatibility they (PCI-SIG) have turned it around...

    • @mahuk.
      @mahuk. Před rokem +2

      Jayz already made a video about the EVGA situation. Long story short, EVGA would have been seeing negative sales numbers due to NVIDIA undercutting their sales. What purpose does a partner deal serve when the person you're partnering with is screwing you over?

    • @tomglover98
      @tomglover98 Před rokem

      @@mahuk. he also stated nvidias lack of clear communication, timeliness and access to products on release, which does reflect what OP is saying.

    • @MikrySoft
      @MikrySoft Před rokem +3

      @@2Fast4Mellow The table is fine; it's the cards that must break spec to work with old PSUs. If you look at 5:50, both pins in the "open" state (as would happen if the connector wasn't there) is the lowest power state. To get the full 600W, the connector should be present, with connections from those two sense pins to ground.
      The correct solution would be to make the card respect the ATX 3.0 spec and supply people with an adapter with two DIP switches so they can declare the max power themselves. Heck, make the data connector a separate part and supply 4 of them, hardwired to fake different power levels; it would just need ground from the main plug. Or supply/sell 4 different whole adapters (possibly 1/2/3/4-plug versions for different power levels).
      What's really missing is a way for the card to detect whether it is running from an ATX 3.0 PSU on a 12VHPWR cable or from an older one through an adapter, and change its behavior to reduce transient spikes.

    • @MyrKnof
      @MyrKnof Před rokem +7

      EVGA could already see the amount of warranty cases they'd have to deal with and noped out

  • @jimmay7736
    @jimmay7736 Před rokem +4

    It's amazing to me how much power PCs are moving around at 12V on those wires and connectors; then go look at the 12V wiring and connectors used in cars.
    Other than starting the car or huge aftermarket audio, incandescent high beams are the biggest draw, and they are 60W @ 12V, only 5 amps (quick comparison below), and their connectors are almost as big as 120V household 15A plugs.
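
    The contrast being drawn here, in plain numbers (assuming the 60 W high beam mentioned above and a 600 W GPU, both at 12 V):

    ```python
    volts = 12.0
    for label, watts in (("Incandescent high beam", 60.0), ("600 W GPU load", 600.0)):
        print(f"{label}: {watts / volts:4.0f} A at 12 V")
    # ~5 A through a chunky automotive connector vs ~50 A through one small 12-pin header.
    ```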

  • @jakasrinaga
    @jakasrinaga Před rokem

    I agree with you about changing the position of the connector. Along with changing its position, I suggest using a connector that can withstand heavy loads, such as an Anderson connector; it may be more suitable for a heavy load.
    And power supply designers should look further into connectors suitable for heavy-duty loads delivering full power.

  • @samthemultimediaman
    @samthemultimediaman Před rokem +50

    The way they designed the new ATX plugs, I'm wondering if they had an actual electrical engineer working on it: for the current needed, the gauge of wire and the size of the plug are very undersized, and it seems very slapped together. They should have used 8-gauge wire and some heavy-duty connectors. The power draw of new GPUs has now moved beyond conventional computer wiring.

    • @mycosys
      @mycosys Před rokem

      many

    • @kall399
      @kall399 Před rokem +5

      No, they probably had some designer who doesn't know shit about electricity design something "sleek".

    • @qwesx
      @qwesx Před rokem +14

      Nah, there were definitely actual engineers developing this thing. But then the pointy-haired boss came into the room with "That's nice and all, but can't we just re-use the old plug format and cables? That'll make it so much cheaper for the manufacturers!" Then the engineers did the best they could to figure out a way that could work, by reducing the longevity and cycle count.

    • @ilenastarbreeze4978
      @ilenastarbreeze4978 Před rokem +4

      Honestly, I'm not an electrician, but I do like electrical stuff and have researched it some for my own projects, and yeah, power connectors should not melt after a couple of plug-ins. The amount of power moving through that tiny connector is INSANE. I'd rather something be safe than pretty, and I'm sure some gamers may disagree, but I like not having fires in my electronics.

    • @michelvanbriemen3459
      @michelvanbriemen3459 Před rokem +1

      They probably did, and then overruled the engineer's recommendations because "progress needs to look slimmer and more efficient, not thicker and bulkier" in the minds of the higher-ups.

  • @dibs3615
    @dibs3615 Před rokem +289

    I've noticed a trend ever since I bought my GTX 460 768MB: cards were getting way too power hungry, to the point of causing meltdowns, fires, and other issues. Then Nvidia started working more on power efficiency. Now they're neglecting power efficiency again, and it's starting to show through other issues like this.

    • @DenyBlackburn
      @DenyBlackburn Před rokem +10

      They implemented a pretty nice built-in undervolt tool though, in the GeForce Experience overlay: press Alt+Z, then go to Performance. I could drop my power usage from 220W at full load down to 125W on my RTX 2080s with not much performance loss, and it saves the setting so you don't need to set it every time, unlike third-party programs like MSI Afterburner. Idk why not everyone is doing that tbh.
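
      For anyone who wants to sanity-check their own card's draw before and after an undervolt, a small read-only sketch using NVIDIA's NVML Python bindings; it assumes the nvidia-ml-py package (imported as pynvml) is installed, and it only reads values, changing nothing.

      ```python
      # Read current power draw and the power limit of the first GPU via NVML.
      # NVML reports power in milliwatts.
      import pynvml

      pynvml.nvmlInit()
      handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

      draw_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0
      limit_w = pynvml.nvmlDeviceGetPowerManagementLimit(handle) / 1000.0

      print(f"current draw: {draw_w:.0f} W of a {limit_w:.0f} W limit")
      pynvml.nvmlShutdown()
      ```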

    • @Mizra-dq3lj
      @Mizra-dq3lj Před rokem +18

      But muh fps

    • @techkilledme
      @techkilledme Před rokem +4

      RTX 4080 more like GTX 480 am I right? But funnily enough I think the 480 was like 250 watts at most

    • @dutchdykefinger
      @dutchdykefinger Před rokem +4

      ​@@techkilledme it was, that was considered pretty high at the time though,
      I think the Radeon 6970 was 250 watts too, about half a year later.
      Even the Radeon Fury X and the R9 390X, both Radeons notorious for running hot, were only 275W TDP years later;
      both of them didn't scale for shit with overclocking, so it wasn't worthwhile to even put more into them.
      It was mostly performance per watt that AMD got seriously behind on around Nvidia's Maxwell; in terms of power use the flagships were never all that far apart, at least not the single-chip ones. Usually higher IPC tells a story about overclocking potential and scaling with clock speed, though.

    • @austinbriggle3961
      @austinbriggle3961 Před rokem +9

      Honestly I believe this is just the natural life cycle of specific CPU/GPU manufacturing technologies.
      When they introduce a new manufacturing process for the "next gen" they always start off with less power-efficient designs. There are two reasons for this. The first is so they can perfect the manufacturing process and iron out any kinks before moving on to more sophisticated designs. The second, and I believe the main reason, is economics. If they go straight for the maximum performance they can achieve when they first introduce the manufacturing process, then they miss out on future sales from potential upgrades.

  • @KH-cs7sj
    @KH-cs7sj Před rokem +4

    I had never thought that there could be so many problems with just connecting a cable back when it was ATX 2.0 or USB Type-A. Now these fancy new ports (ATX 3.0 and USB Type-C) are tinier but come with all sorts of weird issues. The Thunderbolt port on my Lenovo laptop just fried itself for no reason.

  • @CoreyKearney
    @CoreyKearney Před rokem +2

    The spec is 600W @ 12V; in ideal conditions that's 50+ amps. Sure, that's split across 6 pairs of wires, but that's still a bump in amps over a single 8-pin line. Add the compound issue of possible uneven resistance across the pairs pushing more or less current through certain pairs, and that resistance heating up the dinky connectors. Those pins are tiny, and the wire gauge is what, 18? 16? This whole spec is a fire starter. It's too many watts over too small a connector and too thin a set of wires. Before I buy my next PSU I'm going to be looking into custom 8 gauge PCIe connectors. This is a basic electronics fail to save a buck.
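
    To put rough numbers on the "dinky connector" worry, a short sketch; the contact resistance figures are illustrative assumptions, since real values depend on how worn or deformed the terminals are.

    ```python
    # Per-pin current and I^2*R heating for the scenario described above.
    TOTAL_W, VOLTS, PINS = 600.0, 12.0, 6

    i_total = TOTAL_W / VOLTS  # 50 A overall
    i_pin = i_total / PINS     # ~8.3 A per 12 V pin if shared perfectly evenly

    for r_contact in (0.005, 0.020):  # 5 mOhm (healthy) vs 20 mOhm (worn/loose) per contact
        p_heat = i_pin ** 2 * r_contact
        print(f"{r_contact * 1000:.0f} mOhm contact -> {p_heat:.2f} W of heat in one tiny pin")
    ```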

  • @Vladek16
    @Vladek16 Před rokem +57

    5:59: No, you misunderstood the spec. If the sense pins are absent (if your PSU is an old one, for example) the card defaults back to the lowest state of 150W max, not the highest.
    The guy who made the ATX 3.0 spec said it in an interview with PCWorld. Look for the video "Intel Talks New ATX 3.0 And ATX12VO 2.0 Power Specifications | The Full Nerd Special Edition"; it's at 14min.
    It's also literally written under the table: "If the Add-in Card does not monitor these signals, it must default to the LOWEST value in this table"

    • @Maax1200
      @Maax1200 Před rokem +6

      So you get a 1060 card for the price of a 4090, awesome.👍👍

    • @oebeloeber
      @oebeloeber Před rokem

      @@Maax1200 stonks

    • @Melech
      @Melech Před rokem +2

      This. The problem with using the adapter is not too much power draw. A "dumb" PSU limits the card to 150W. Prepare for a lot of people complaining about their 4090 running worse than their old card.

    • @Melech
      @Melech Před rokem +1

      Btw, it looks like there isn't any actual communication, so it should be possible to make adapters with different power limits for "dumb" power supplies.

    • @Vladek16
      @Vladek16 Před rokem

      @@Melech The table addresses 2 of the 4 pins. Those two pins are used to specify the max power and they are hardwired, yes. But the other two pins are used for smart communication between the PSU and the GPU.
      And modding those two hardwired pins is useless if your power supply is not strong enough to deliver the power, so don't do risky mods.

  • @RayeKinezono
    @RayeKinezono Před rokem +108

    I have no intention of getting any form of 40 series card. I'm still running an EVGA 2060 KO Super, which has performed admirably well. I had originally intended to get a 30 series, but now, with EVGA's departure, and nVidia's pricing and 'in their own world' mentality, I'm looking more and more into Team Red for my next potential GPU upgrade. (Which is starting to make more sense, as time goes on, especially since I've been pretty much exclusively on AMD CPUs for a very long time.)

    • @MADBADBRAD
      @MADBADBRAD Před rokem +1

      I'm in the same boat as you. I also got a 2060 KO from EVGA and it hasn't given me any issues at all. I'm definitely going with AMD when I get ready to build a new PC.

    • @Teku175
      @Teku175 Před rokem

      Honestly I'm thinking the same as you. I currently have an EVGA RTX 3080 10GB FTW3 Ultra (and used to have a 2060 KO Super which wasn't enough for 1440p155 but I digress), and my next GPU is gonna be AMD in maybe 4-5 years. Seeing all the news about how nvidia treated EVGA (and other board partners) along with them constantly making tech that doesn't work with other GPUs (AMD/Intel) makes me dislike nvidia, despite the product itself being good. 40 series pricing was the final nail in the coffin too.

    • @xfrostyresonance8614
      @xfrostyresonance8614 Před rokem

      Same over here. Been with my 5600XT for 2 years now, it still works and still is getting AMD's feature back-trickle to the point of it still being entirely relevant for even next-gen gaming. I'm just waiting to see if AMD has a worthy mid-range card this gen for me to upgrade.

    • @VirtualAustin
      @VirtualAustin Před rokem

      Yeah, I managed to get an EVGA 3080 12GB FTW3 Ultra with an EVGA 1000-watt power supply, and now with EVGA out I don't think I will be getting a GPU for at least 2-4 years, because even though AMD is a viable option I like having RTX with DLSS + ray tracing, and Reflex for competitive gaming.

    • @halfbelieving
      @halfbelieving Před rokem +2

      I'm still on my 1660ti and Ryzen 2600. I don't really feel the need to upgrade, especially when i have a Series X. Sure i would be able to take advantage of my 144hz 1440p screen with more games but lately with the ridiculous pricing i'm content with waiting for longer than i set out to.

  • @riccardo1796
    @riccardo1796 Před rokem +2

    The connectors are minifit for the old standard and likely microfit for the new adaptors, with some 2.54mm pins for the serial on the bottom

  • @pollopesca5130
    @pollopesca5130 Před rokem +7

    I feel like we've gone back in time to 2000-2005, back when Intel's only solution to getting more performance was throwing more electricity at it until they hit a breaking point and had to come up with a new, more efficient architecture (the Core 2). ATX 3.0 just allows manufacturers to drag this out even longer before efficiency is a consideration again. My electricity bill tripled this summer, so I think I'll wait this gen out (and possibly others). My microwave shouldn't be more power efficient than my PCs (x.x)

    • @wag-on
      @wag-on Před rokem +1

      I agree, it's the wrong path of ever increasing power demands. I want something fast but efficient too.

  • @willwunsche6940
    @willwunsche6940 Před rokem +473

    AMD has the perfect opportunity right now to win a lot of market share, positive press, and long-term mind share if their new GPUs are priced well and perform well.

    • @honkhonkler7732
      @honkhonkler7732 Před rokem +10

      RDNA2 should've done that had crypto miners not ruined the market for the entire past generation.

    • @sevroaubarca5209
      @sevroaubarca5209 Před rokem +3

      They just need to be better.

    • @paullucci
      @paullucci Před rokem +21

      @@sevroaubarca5209 they just need to be better value

    • @pfnieks
      @pfnieks Před rokem +10

      They won't, because they make more money now that they have 20% market share than when they had 40 or even 50%; you can expect them to give you a $100 discount at most.
      Price wars are over; they ended the moment AMD saw that people bought Nvidia products even when they were objectively worse and pricier.

    • @AlexanderDiviFilius
      @AlexanderDiviFilius Před rokem +7

      With all of the information available to them, I see no reason why AMD wouldn’t be able to undercut Nvidia with their new GPUs. Even if they sell at a loss initially, they’ll make it up before long, and gain a lot of market share in the process.

  • @JohnSmith-ws7fq
    @JohnSmith-ws7fq Před rokem +79

    If they were gonna revolutionize the power interface, they should have gone with 24V DC. Halves the amps (these things are pulling more than cooker circuits now in terms of amperage), yet is still extra low voltage and safe to handle.

    • @wayland7150
      @wayland7150 Před rokem +4

      As long as the plugs don't fit the old socket that would have been fine. 600w is 50A through that tiny plug.

    • @glassman3333
      @glassman3333 Před rokem +10

      It would make sense. That's why electrical infrastructure over long distances is high voltage, and only transformed down where it's needed. This crap seems crazy to me. A PC component needs like 50 amps?! Why, because we're stuck on 12V rails? How short-sighted.

    • @joshuahulce5630
      @joshuahulce5630 Před rokem

      @@wayland7150 what about CPUs? they can easily have 100-200A flowing through them.

    • @XIIchiron78
      @XIIchiron78 Před rokem

      There might be implications for VRM and PSU design that make this undesirable? Other than the obvious break in a many decades long chain of compatibility, I'm not sure if there is really much of a market for power stages and other components that could be used either, so it'd have to be planned way in advance.

    • @reezlaw
      @reezlaw Před rokem

      That would make a lot of sense, even phone manufacturers have long realised that it's better to charge at a higher voltage to reduce amps and heat

  • @ralphcarter3261
    @ralphcarter3261 Před rokem +13

    I know nothing about how these cards and PSUs are made. But I always thought that as time goes on these cards would get more power efficient. It feels like in that aspect Nvidia is going backwards

  • @puddleduckist
    @puddleduckist Před rokem +8

    Pure crazy power hungry gpu's! 😳 I'm sure the melting n fires will start popping up once they start getting into serious use! As always thank you for keeping us all informed Jay!

  • @pixelpusher220
    @pixelpusher220 Před rokem +135

    the adapter issue is one thing, but clearly they've skimped heavily in the power connector innards on the cards. As noted by Jay, serious bending/pulling is simply a part of building. And it could cause melting and fire. just scary

    • @fynkozari9271
      @fynkozari9271 Před rokem +8

      Where's the technology advancement? Performance increases, but power consumption stays high? Where's the efficiency?

    • @simbad3311
      @simbad3311 Před rokem +2

      @@fynkozari9271 For efficiency you must wait for the AMD cards; if I'm not wrong, a month or so....

    • @WizeguyGaming
      @WizeguyGaming Před rokem

      Considering he hasn't seen the new card yeah.

    • @earthtaurus5515
      @earthtaurus5515 Před rokem

      Effectively, all custom cables will need to be redone for ATX 3.0, and what about ITX / SFF builds? There is no way to even fit the PSU-provided cables without scrunching them up and shoving them into a corner somewhere. So this poses a serious problem for any ITX / SFF build using a top-end 40 series card.

    • @janknoblich4129
      @janknoblich4129 Před rokem

      @@simbad3311 How the turns have tabled

  • @nightshademilkshake1
    @nightshademilkshake1 Před rokem +167

    Jay, I really appreciate your clear and concise explanations - they're educational without being overly technical, and conversely not dumbed down and chock-full of silly gimmicks and sarcasm. I also appreciate the bravery you've shown recently when calling out manufacturers for their bad business practices. Way to go! You're striking a very nice balance among the tech reviewer options.

    • @jahmed525
      @jahmed525 Před rokem +2

      Ditto

    • @StarFury2
      @StarFury2 Před rokem +3

      Companies should start to employ Jay to design power supplies and electrical conduits. Obviously so called engineers and scientists don't know sht.

    • @adamtajhassam9188
      @adamtajhassam9188 Před rokem

      I hope the HX 1200-watt PSU is fine too; he mentions ATX a lot.

    • @12Burton24
      @12Burton24 Před rokem +1

      Sadly, what he says about 600 watts is just wrong. The 4090 is rated at 450 watts; if you keep strictly to 450 watts, the 3x 150 watts are fine for the card.

  • @trailingrails9953
    @trailingrails9953 Před rokem +10

    Just seeing how small the connector is compared to the current standard was raising red flags; now my suspicions are confirmed. Of all the components to over-engineer, the PSU connections should be at the top of the list, especially when they knew damn well that GPU draw was only going to increase.

    • @Squall4Rinoa
      @Squall4Rinoa Před rokem +1

      The connector is engineered just fine; the conditions that can cause failure would cause (and have caused) the same failures with the Mini-Fit 8-pin (crushed or split female contact).
      Melting failure is only possible with resistance resulting from poor contact and mishandling.

    • @degru4130
      @degru4130 Před rokem +8

      @@Squall4Rinoa Good engineering is supposed to account for "mishandling" (in this case normal things a majority of people do when building a PC). Especially when you're carrying this much current and *risk of melting* is even part of the conversation. Normally when you want to carry *more* power through a single connector you make it *bigger*, not smaller and more fragile. Just because there are other under-built connectors out there doesn't excuse this one.

    • @1337GameDev
      @1337GameDev Před rokem +2

      100%.
      Who cares about size, when the CURRENT thickness was BARELY sufficient?
      They should have had a 16-pin + 4-pin data connector, and offered straight and right-angle variants. But no.... now we get fire hazards....

  • @TybJim
    @TybJim Před rokem

    Thanks for explaining this Jayz. I've been out of the loop and missed out on the increasing power requirements in the top end modern cards. Specifically the ATX 3.0 data cable standard I wasn't aware of until now.

  • @Somtaaw7
    @Somtaaw7 Před rokem +70

    AMD and Intel's got a massive opportunity now. Hopefully the next gen is good and Intel sorts out their driver issues.

    • @achaosg
      @achaosg Před rokem

      With tensions over in China with Taiwan, and being an Intel fanboi since like 2003 due to AMD's overheating issues, I am a big investor in Intel personally... I own Intel stock, and with their new fabs and the reinvestment they have into their business I am a strong believer Intel will be the future GOAT. It's also going to be made right here in America, even if they do a little outsourcing.

  • @billwiley7216
    @billwiley7216 Před rokem +11

    I actually did my latest build at the end of last year when Alder Lake was released with the expectations of a 4090/4080 gpu.
    I bought a quality 1300w psu thinking at the time this would cover up to the 4090 card with no issues.
    Fortunately a few weeks ago when the bottom dropped out of the pricing on the 3090ti and news was first leaking about two 4080 models and price predictions were floating the 4090 would be $2000 I pulled the trigger and bought a 3090ti.
    Honestly, with this information coming to light, I would be really upset that the top-tier Platinum PSU I bought some months ago for over $300 would not be compatible with a 4090, had I decided to go that direction.
    This is information that should have been made very plain to consumers a year ago, when these PSU specs and GPU cards were being designed with requirements that actually NEED these sensing capabilities on the PSU.
    So much for an upgrade path from a 30 series card with an older PSU, even if it is a 1600W model!
    Thanks for the information Jay!

  • @Gettingadicted
    @Gettingadicted Před rokem +2

    Hi Jay, the issue with the connector doesn't end with the adapter. The "mating limit" is for the two connectors (male and female), so now we will probably see some savers in order to keep the card's connector safe; otherwise you would need to replace the connector after 30 mating cycles (to stay within spec). This is totally nuts for the computer market.

    • @1337GameDev
      @1337GameDev Před rokem

      OHHHHHH. The 30 cycles applies to the FEMALE end of the card? Holy fuck that's bad.

    • @Gettingadicted
      @Gettingadicted Před rokem

      @@1337GameDev Yes, usually when you specify a mating/demating limit, it is for both the male and female connectors. That is why you will see "savers" on microwave equipment with SMA connectors (SMA is not the best when you consider mating/demating limits).
      In my opinion, the connector on the card should hold up a little better, but by spec you should replace it to make sure the contact is good (bad contact can be the start of a molten connector).

  • @patrickmallory8273
    @patrickmallory8273 Před rokem +1

    They knew about this, yes. If you remember the L-shaped power clip found on the 2060, it had this issue. The adapter would short inside the plastic housing, causing no POST on the motherboard. It was acting as a resistor, limiting power flow. In fact, it was so bad you could smell the plastic burning. When I stopped using the inline L adapter, it booted right up.

  • @Powerman293
    @Powerman293 Před rokem +149

    ATX 3.0 makes me wonder if that was what pushed EVGA over the edge and made them decide to stop making graphics cards now. Having all these issues means they'd have way more warranty crap to deal with, cutting how much profit they can make. And I'm sure Nvidia tightened the leash even more so they can't make as much.

    • @RarestAce
      @RarestAce Před rokem +12

      EVGA will have to start making ATX 3.0 power supplies; they can't make ATX 2.0 supplies forever.

    • @geerstyresoil3136
      @geerstyresoil3136 Před rokem +11

      Yea, talk about fixing a "problem" that doesn't exist. I have no problem with adding more tried and true 8 pins.

    • @jMewsi
      @jMewsi Před rokem +1

      sure has to be something like that

    • @MemTMCR
      @MemTMCR Před rokem +7

      @@geerstyresoil3136 letting your gpu and psu cooperate is a pretty good thing. he's talking about how many people are just gonna keep using their old supplies because they think they probably don't need to upgrade it.

    • @MCasterAnd
      @MCasterAnd Před rokem +4

      I doubt ATX 3.0 was the problem, given that everything JayZ mentioned about the new ATX spec is wrong. The graphics card doesn't default to full power, it defaults to the minimum (150W). The new ATX standard is perfectly fine.

  • @theshijin
    @theshijin Před rokem +334

    Love that AMD's new socket being exclusively DDR5 had people worrying about extreme prices, but some DDR5 RAM is available for decent enough prices and the motherboards aren't that far off either; meanwhile NVIDIA announces their 4000 series and issues immediately show up lmao

    • @_Kaurus
      @_Kaurus Před rokem +22

      No one was worried about that other than YouTubers and sensationalists

    • @Chrisp707-
      @Chrisp707- Před rokem +15

      @@_Kaurus the average person actually kinda was. DDR5 boards and such for most are still quite expensive.

    • @lirycline6646
      @lirycline6646 Před rokem

      @@Chrisp707- not really tho, MSI ATX PRO Z690-A pretty good price

    • @TheMadYetti
      @TheMadYetti Před rokem +2

      Enjoy your DDR with a CPU with a Pluton chip from Microsoft. AMD's betrayal is complete.

    • @Claude-Vanlalhruaia
      @Claude-Vanlalhruaia Před rokem +8

      @@lirycline6646 What you are saying is that a private jet that costs $5M is a good price compared to another private jet that costs $10M. Compared to DDR4 it is still expensive.

  • @1337GameDev
    @1337GameDev Před rokem +2

    What they should have done with the cable is attach a METAL plug at the end AND offer 2 variants: one straight out, and one at an L angle (going down towards the card's PCIe pins).
    Then they should have had the cable include a switch that can communicate intended power draw (250W, 450W, 600W+) and default it to an empty state to force people to set it.
    Then cards could sense the switch's power setting and key off of that too, and same for the PSU (and the PSU could refuse to power the card if it's set too high).
    A BASIC switch and an IC is like 30¢ per cable, and would have handled a lot of issues. The switch could just as easily have been a set of resistors on the data lines that can be sensed when there are no active comms - passive components to indicate the power config. We do this with HDMI, DP, and USB.
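
    A purely hypothetical sketch of that resistor-coded idea, modelled loosely on how USB-C uses Rp/Rd values to advertise current; nothing like this exists in the 12VHPWR sideband beyond the two open/ground sense pins, so every threshold below is invented for illustration.

    ```python
    # Hypothetical: decode a cable-side resistor to ground into an advertised
    # power tier. The resistance thresholds are made up for this sketch.
    HYPOTHETICAL_TIERS = [  # (maximum resistance in ohms, advertised watts)
        (1_200, 600),
        (5_600, 450),
        (22_000, 300),
        (float("inf"), 150),  # no resistor fitted -> safest default
    ]

    def advertised_watts(cable_resistance_ohms: float) -> int:
        for max_r, watts in HYPOTHETICAL_TIERS:
            if cable_resistance_ohms <= max_r:
                return watts
        return 150

    print(advertised_watts(1_000))         # 600 W cable
    print(advertised_watts(float("inf")))  # unkeyed cable falls back to 150 W
    ```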

  • @JimFeig
    @JimFeig Před rokem +8

    I think it's time for power supplies to communicate directly with the system. And the GPU should be able to access that power info via the system.

    • @justcommenting4981
      @justcommenting4981 Před rokem +1

      They'll make it shut down if it doesn't show the power they want. Given the wattage calculations from the manufacturers seem to always want bigger and bigger wattage I'm skeptical.

  • @Firecul
    @Firecul Před rokem +24

    For any GPU (or add-in card) that is designed for ATX 3.0, if those data pins are not connected or the card "does not monitor these signals, it must default to the lowest value in this table" 3:13
    So the GPU should limit itself to 150W unless it communicates with the PSU (or someone decides to do it in the cable for some stupid reason) and the PSU confirms that it can provide the appropriate power to go higher.

    • @bills6093
      @bills6093 Před rokem

      Yes, it almost certainly works that way.

    • @robertr.1879
      @robertr.1879 Před rokem +2

      That was also my thought; without the 4 pins plug, you get 2 x open signal = 150W max.

    • @dark4yoyo
      @dark4yoyo Před rokem

      They're just going to sell jumpers with the adapters to fix that lol

    • @dastardly740
      @dastardly740 Před rokem

      That is what I am wondering. If these adapters leave the signal lines open, then the GPU either ignores them or is limited to 150W. And ignoring both sense lines being open seems like a very bad idea.

  • @Durbanite2010
    @Durbanite2010 Před rokem +143

    Even more reason to not even consider a 40 Series GPU. That will be expensive enough (probably starting at £1200 or so for a 4080) but then to have to replace the PSU, which will easily be £150+ on top for an ATX 3.0 Unit (which, like Jay said, won't be low wattage and likely at least 1000W) is just going to price people out.
    Hopefully AMD will bring out their 7000 Series with rDNA 3 soon. If that doesn't require an ATX 3.0 power supply, I could see AMD dominating the GPU market for average users with this upcoming generation of GPUs.

    • @timotervola2734
      @timotervola2734 Před rokem +8

      Plus you need to turn the PC into a room heating element with these power levels :)

    • @None-lx8kj
      @None-lx8kj Před rokem +10

      I've gone from a 1080 to a 2080ti to a 3090. That's the last Nvidia card I will buy for a long time. Maybe ever.

    • @bambix1982
      @bambix1982 Před rokem +4

      I just bought a new 850w PSU when I bought my 3080. Not going to buy another PSU anytime soon so yeah, just another reason to pass on that overpriced power guzzler.

    • @djfirestormx
      @djfirestormx Před rokem +4

      @@bambix1982 i just bought a 1200w rog thor when mine went last march, how do u think I feel right now? it has power sensing capability, but nope not supported

    • @Chrisp707-
      @Chrisp707- Před rokem +4

      I need to update my PSU anyway but I won’t be going nvidia because the regular 4080 12GB is basically a next gen 3070(4070) labeled as a 4080.

  • @Digitalbath570
    @Digitalbath570 Před rokem +1

    Do we need to look at the ATX standard as being old and come up with a new standard instead of just iterating? Maybe allow for different PC shapes and power delivery designs for key components.

  • @cottontails
    @cottontails Před rokem +2

    I contacted EVGA to see if I would need to upgrade to an ATX 3.0 power supply; I currently have the 1300W SuperNova G2. I also shared this video with them and they responded with this: "Hello, We will not be making an adapter for any 40 series card as the specific manufacturer will come with its own adapter. We are not aware of any fire issues in regards to this. We believe a lot of people are a bit confused about how the 12 + 4 sensing pins will work. Any standard PSU will work with a 4 to 16 pin as it will evenly distribute the power over the provided adapter."

    • @ertai222
      @ertai222 Před rokem +1

      Wonder if they can actually confirm that or there's going to be a class action lawsuit in their future when people lose their houses in fires.

  • @matthewmcclure8294
    @matthewmcclure8294 Před rokem +131

    I wasn’t planning on getting a 40 series anyway, but I might be hopping the fence after everything that happened with the last launch and now with all of the things coming out recently with the 40 series AND the split off on EVGA’s end. Thanks for the info!

  • @arcticwarfare1000
    @arcticwarfare1000 Před rokem +14

    Don't worry about max wattage; worry about the maximum current the wires can handle. When the voltage sags under a high load on the card, the current draw goes up while the card still pulls its max wattage of 600 watts. If the voltage dips to 10V, you are potentially pushing an extra 10 amps on top of the 50 amps already being pulled. It's entirely possible that the pins are rated for a total of 60-ish amps, but unplugging and reconnecting the delicate pins will over time mangle and deform the female ends (more likely), meaning a hot joint with high resistance will form - a spot where heat accumulates and causes smoke/fire and so on.
    This is one of the most overlooked aspects of the PC building space, IMO.
    Just like how your rig couldn't overclock in Scrapyard Wars: the supply voltage to your PSU was affected by the extension lead's length adding resistance and reducing the supply voltage by a few volts.
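
    The "voltage sags, current rises" point is easy to check with constant-power arithmetic; the cable resistance at the end is an illustrative assumption.

    ```python
    # Current drawn by a constant 600 W load as the 12 V rail sags.
    WATTS = 600.0
    for volts in (12.0, 11.0, 10.0):
        print(f"{volts:.0f} V rail -> {WATTS / volts:.1f} A")

    # The sag itself can come from the wiring: 50 A through an assumed 20 mOhm of
    # cable and connector resistance drops a full volt before reaching the card.
    print(50 * 0.020, "V lost in the run")
    ```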

    • @mk72v2oq
      @mk72v2oq Před rokem +3

      If the voltage drops to 10V your power supply is garbage in the first place. Max allowed voltage deviation is ±5%. Higher deviation by itself can damage and kill hardware.

  • @pmonet31
    @pmonet31 Před rokem +7

    Might be beneficial with these connectors to utilize dielectric grease considering their thin sidewalls and questionable connection strength over time with use.

    • @timrc666
      @timrc666 Před rokem

      That's not what dielectric grease is used for.

  • @perwestermark8920
    @perwestermark8920 Před rokem

    If there is such a huge difference in current draw between tails of the adapter cable, then that indicates some big oops.
    The new 12VHPWR connector has 6 pins of +12V3/V4 and 6 pins of COM.
    But each of the 4x2 connectors on the tails has 3x 12V and 5x GND.
    So with 4 tails, there are 12x 12V wires supplying power to just 6 pins, which means the adapter needs to merge two wires into each high-current pin.
    And an ATX PSU is allowed to have multiple 12V rails, especially since earlier ATX requirements allowed max 240W per 12V rail.
    The easiest way to get a big difference in current is if the PSU has individual 12V rails for the PCIe connectors which (because of component tolerances) have slightly different voltages, and there is a low enough resistance in the cable and board input that a PSU with multiple rails has to overload and sag one of its 12V rails just to get the voltages at the graphics card end to equalize.
    That would on one hand make one 12V rail on the PSU side suffer much harder, and that PCIe power plug would carry a much higher current. And depending on how the adapter wires are spliced, some of the 6 high-current power pins at the graphics card would see a higher current.
    Current gets drawn where the highest voltage is. It takes some form of series resistance big enough to produce a voltage drop before the currents equalize. That's why LEDs (which are current-driven) are either powered from constant-current supplies or have a series resistor to take up the variation in Vf of the LED. A graphics card drawing up to 600W can't afford such resistors, because they only work if the resistance is big enough that quite a few watts are wasted in the series resistors: if one 12V rail has a 0.2V higher voltage and the draw is 20A, that would be 4W of heat that must be burned off in the resistor.
    My guess is that Intel, when originally specifying this new connector, did not specify/recommend/consider any adapter cable, so the ATX 3.0 standard never needed to care about this potential problem. Then the board manufacturers ended up having to consider the practical aspect - people with expensive 1000W pre-3.0 PSUs facing a significant extra upgrade cost if the PSU also needs to be replaced.
    Maybe NVIDIA should have specified that the new graphics cards need multiple DC/DC converters to actively regulate the current from each adapter tail. Then the splicing of multiple power sources happens inside the graphics card, where the designer has a known problem to solve.
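
    A small sketch of the current-sharing problem described above: two 12 V rails with slightly different set points feeding the same 12VHPWR input through their own cable resistance. The voltages and resistances are illustrative assumptions, but they show how a 0.2 V rail mismatch can skew the split badly.

    ```python
    # Two rails, each with its own cable/connector resistance, feeding one node.
    V1, V2 = 12.10, 11.90   # per-rail set points (volts), 0.2 V apart
    R1 = R2 = 0.015         # assumed cable + connector resistance per tail (ohms)
    I_LOAD = 40.0           # total current the card is pulling (amps)

    # Solve (V1 - Vn)/R1 + (V2 - Vn)/R2 = I_LOAD for the node voltage Vn at the card.
    Vn = (V1 / R1 + V2 / R2 - I_LOAD) / (1 / R1 + 1 / R2)

    i1 = (V1 - Vn) / R1
    i2 = (V2 - Vn) / R2
    print(f"rail 1 supplies {i1:.1f} A, rail 2 supplies {i2:.1f} A of {I_LOAD:.0f} A total")
    # -> roughly 26.7 A vs 13.3 A: the higher rail carries twice the current.
    ```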

  • @Macho_Man_Randy_Savage
    @Macho_Man_Randy_Savage Před rokem +157

    Damn, if getting a 4080/4090 in this climate isn't bad enough, you now have to consider an ATX 3.0 PSU to go with it 😅
    I'd feel really anxious knowing my PCIe plug has a finite life, even if I set it and forget it 😬

    • @SoundwaveSinus9
      @SoundwaveSinus9 Před rokem +15

      Not only an ATX 3.0 one, but a high-wattage one. The 3000 series already had huge spikes which fried PSUs.

    • @sebastians1511
      @sebastians1511 Před rokem +3

      I would buy an ATX 3.0 PSU - none available :( I'll wait till they come out and then decide if I buy Nvidia or AMD.

    • @TheMeragrin
      @TheMeragrin Před rokem +1

      Well, the fact is your PCIe plug has the same 30 cycle limit.

    • @xe-wf5iv
      @xe-wf5iv Před rokem

      @@SoundwaveSinus9 Those spikes are one of the reasons for the ATX 3.0 standard. With the new standard it will be possible to run a GPU with less wattage overhead. In other words, you won't need a 1000 watt supply.

    • @DefianceOrDishonor
      @DefianceOrDishonor Před rokem +4

      The only way you could get a 3080 for the first like year they were available was to buy a bundle of hardware. I bought mine at MSRP but had to also buy a PSU / CPU / Mobo. Even many of those bundles would sellout pretty fast, but yeah.
      Chances are all the 4000 series will be sold out and scalped like we've seen with the 3000 series and you'll basically need to go the bundle route-- but the real icing on the cake here? I doubt any of those bundle deals will include ATX 3.0 PSUs, they'll likely offload older PSUs lol.

  • @JaenEngineering
    @JaenEngineering Před rokem +10

    5:50 it absolutely will not default to pulling as much power as it can. Look at the chart. If the sense pins are NOT connected (i.e. they are OPEN) the card should default to the lowest power draw. In order to pull the maximum, both sense pins have to be pulled down to ground. The problem will come if manufacturers start shipping adapters with the sense pins hardwired, allowing the cards to pull maximum power.

    • @exscape
      @exscape Před rokem +2

      Yep. The data shown completely undermines the ENTIRE POINT. The standard SHOWN IN THE VIDEO says the cards are not allowed to draw more than 150 W from such an adapter!!

    • @grud8495
      @grud8495 Před rokem

      @@exscape The cards need more than 150 W to function. Manufacturers will not ship an adapter that won't match the card's rated power draw. So, by inference, they will ground SENSE1 and maybe leave SENSE0 open. Then the max power draw becomes 450 W, which is the rated power draw of a 4090.

    • @JaenEngineering
      @JaenEngineering Před rokem

      @@grud8495 If the manufacturers are smart, they'll wire the adapter's sense pins to match the number of PCIe connectors on the adapter. So if it only has 2 PCIe connectors, they'll ground SENSE0 and leave SENSE1 open, telling the card it can only draw 300W (2x150W), and so on.
      But the point I was trying to make is that the cards will not, as Jay said, default to pulling the max power if connected to a "dumb" supply.

  • @atharagha6795
    @atharagha6795 Před rokem +6

    If you're going to put out some videos on this, it would be good to see what happens when wiring a new 4090 GPU into legacy kit such as an X470 board, and how that would look PSU-wise. A lot of people will want to supplement their rigs with a new GPU rather than upgrade the entire thing to a PCIe 5 setup.

    • @nagyzoli
      @nagyzoli Před rokem +1

      That would be a waste of money (you are choking your GPU if you do not use PCIe 5). The 40XX is honestly meant for new rigs: AM5 socket, DDR5, new PSU, new cooling, etc.

    • @atharagha6795
      @atharagha6795 Před rokem +3

      @@nagyzoli We're barely saturating PCIe 3.0 when it comes to graphics, let alone 4.0. Maybe that's true for storage throughput - and when we finally get DirectStorage in games then maybe again. But right now it's maybe a few fps of difference between the two. IIRC GN or maybe J2C did a video about it.
      Regardless, it'll be an upgrade path a lot of people are considering.

  • @1Dshot
    @1Dshot Před rokem

    Good info. Even if I'm not planning on using them, I know others that potentially will. Will definitely be passing this video on.

  • @yumeN0dengon
    @yumeN0dengon Před rokem +3

    11:08 - Unless I'm missing something here, the document doesn't state "melting observed @2.5hrs" but "hot spots observed @2.5hrs, melting 10-30hrs." That's at least a 4-fold difference and, while that's still concerning, I doubt any gamer will ever have their card running at even 80% TDP for 10 hours straight. Gotta see how these hot spots and load/unload cycles impact the adapter or card's lifespan, but so far it looks like only very long and intensive compute tasks such as CGI rendering or mining are posing a risk.

  • @Ryan.Lohman
    @Ryan.Lohman Před rokem +6

    Kind of glad I took your advice on getting a 30 series graphics card when I did. The same could be said about motherboard and video card extension cables going bad after a certain amount of cycles.

  • @jonathanabbott3097
    @jonathanabbott3097 Před rokem +9

    Hi Jay, with the cost of electricity going up here in the UK and worldwide, a good idea for a video would be performance per watt for the current graphics cards on the market. Thanks for your informative videos 👍

  • @ruikazane5123
    @ruikazane5123 Před rokem +1

    The SENSE pins aren't exactly active communication. They are simply GND or Open similar to CPU VIDs of the old FSB era. One may just short pins to get around that. A serious, overachieving, quality adapter harness can come with those pins and a switch to set according to your power supply rating.
    And what a throwback to the old days...Molex connectors? GeForce FX!?

  • @thesilverwang
    @thesilverwang Před rokem +14

    Damn Jay, you really don’t want an Nvidia review sample 🤣

  • @MrDrTheJniac
    @MrDrTheJniac Před rokem +26

    This honestly feels like a serious engineering goof. However, it looks like the card will actually default to the lowest power bracket unless told it can go higher; note that having both pairs of pins "open" (meaning no connection) sets the card to the low power mode.

    • @VikingDudee
      @VikingDudee Před rokem

      I wouldn't worry with anything lower than a 4080, honestly; I don't think the 30 cycles would be too much of a concern on them. But for something as power hungry as a 3090 Ti or a 4090, yeah, the plug in my opinion is too small for the amount of current it could draw. If it were a bit bigger, the pins would have to be bigger: more contact area, less chance of melting. Even the standard PCIe power connectors are also rated for 30 cycles, but nothing really draws as much power as those higher-end cards yet. Guess we will see if someone's system catches fire, or we see melted used cards on eBay lol.

  • @randomguitardude267
    @randomguitardude267 Před rokem +1

    Thanks to this video I realized my mistake and sent my 2.0 1200W PSU back to replace it with the same model I wanted in 3.0 standard.
    Thank you kind sir.

  • @gradocchio
    @gradocchio Před rokem

    Just built a system with a Corsair RM1000X (ATX 2.4) around my current 3090 and they seem to have a cable made especially for the 4090. Of course it does not have the communication pins but as long as it works and is safe, that's the way I will go when I move up to a 4090. Corsair CP-8920284 PCIe 5.0 Power Cable.

  • @teardowndan5364
    @teardowndan5364 Před rokem +56

    If you want to send lots of power down a small connector, use a small plug rated for high current like XT60 or EC5, then you don't have to worry about 30A going down one #16 wire and nothing on the others. 4mm solid brass pins and barrels should be quite a bit more wear-tolerant than MiniFit pins made of thin folded sheet metal.

    • @colt45strickland
      @colt45strickland Před rokem +7

      A modified XT60 with the data pins would be great imo

    • @MaddJakd
      @MaddJakd Před rokem +7

      Tell that to the PSU manufacturers and/or the guys making these cables. Seems they just threw the most random intern at making SOME conversion cable. No experience required.

    • @MrQuequito
      @MrQuequito Před rokem +13

      They are not even 16; these are 18 at best. These cables are tiny, they have more insulation than copper, and yeah, 23+ amps going through these cables will melt them. It's like whoever designed the cards never accounted for the limitations of PCIe connectors.

  • @naufalrifki3130
    @naufalrifki3130 Před rokem +73

    I have an RX 6600, and the biggest power draw I've ever seen from the GPU was only 100W when I was playing RDR2 on high settings, a very visually demanding game. Now, seeing the new RTX 4000 series requiring an even beefier power supply to run makes me appreciate my RX 6600 even more.

    • @yearofthegarden
      @yearofthegarden Před rokem +8

      Same, I had an RX 6600 XT before I "downgraded" to an AMD 5700G APU, and the 6600 XT was a few bloom options away from running my whole selection of games at max. I like the APU more though, for the 40-watt power draw and 8 cores, for rendering/design with casual gameplay.

    • @josephjocson1385
      @josephjocson1385 Před rokem +1

      I have a 6600 XT; the max power I saw was 102W.

    • @Guldfisken90
      @Guldfisken90 Před rokem +2

      @@josephjocson1385 And here i am sitting with my effing 3060 ti, that can pull upwards to 250w sometimes -.-

    • @Redrider008
      @Redrider008 Před rokem +6

      @@Guldfisken90 tbh I'm running a 6900XT(XH) and on the most demanding setting I can draw near 300W. The High end was already drawing way too much and heating as much. Can't see the need for more power draw, they'd better refine the software and optimisation than going for power every time, it's getting out of control.

    • @xalanx07
      @xalanx07 Před rokem

      Imagine people already broke from buying a 40 series card needing to fork out top dollar for a beefier PSU. Well played, Nvidia 🤑🤑

  • @Gottaculat
    @Gottaculat Před rokem +1

    This makes me feel a lot better that I don't care about cable management, and prefer to let wires rest naturally than force them to conform with a group of cables. Not like I'm gonna see the cables anyway, or even the interior of my PC, because I'm looking at my beautiful games, not the tower. My cables are also not visible anyway, because the case cover isn't transparent on that side, so why potentially damage the cables for aesthetics nobody's gonna see?
    I mean, I keep it a little tidy, so I can actually see which cables go where, but beyond that, I just go by the golden rule of mechanical engineering I was taught: "Never force parts to fit."
    Doesn't matter if it's threads on a screw, the bolt on my rifle not closing properly, slotting components into a motherboard, or even servicing a fishing reel, you don't want to force things to fit, because you'll likely do more damage than good.

  • @KuKoKaNuKo
    @KuKoKaNuKo Před rokem +3

    Lol, high-end power delivery coupled with low-end cable/wire design.... GENIUS!

  • @russ4533
    @russ4533 Před rokem +73

    Can't wait for AMD, will be interesting to see their power draw and power connector choice.

    • @Alpine_flo92002
      @Alpine_flo92002 Před rokem +4

      They will have to use the same connector or hold back the ATX spec by using the old connectors

    • @ledoynier3694
      @ledoynier3694 Před rokem +5

      @@Alpine_flo92002 And AMD has a habit of giving the GPU and memory power draw as the official power spec of their cards, "forgetting" all the VRM and other power draw to appear more power efficient. But power draw will very likely be very close to Nvidia's 40xx. (Nvidia's spec is the total card power draw.)

    • @Fearzzy
      @Fearzzy Před rokem +2

      the top spec AMD cards are expected to draw around 300-350w, with lots of headroom for overclocking^ You are safe without a fancy new PSU

    • @Alpine_flo92002
      @Alpine_flo92002 Před rokem

      @@Fearzzy "Lots of headroom for overclocking" Always just reminds me of badly binned chips where either you get a GPU you can overclock to hell and back or a GPU that draws twice the power at 5% OC

    • @Fearzzy
      @Fearzzy Před rokem +1

      @@Alpine_flo92002 Well, this headroom might be very "reliable" or consistent and exploited by AIBs, so it might be higher clocked / higher power straight out of the box. But the stock clocks / power will be a lot more reasonable than what we see from Nvidia right now.

  • @brucethen
    @brucethen Před rokem +8

    Looking at the sense line diagram, as an electronics engineer, it would appear that if those pins are not fitted, the 2 sense lines are open and the power is limited to 150W. However, if the extra pins are fitted and shorted to ground in a specific combination, that could be a problem. I would suspect that a single 8-pin header would be configured for 150W, a double for 300W, and so on. My main concern for older power supplies, though, would be the power-on surge of the graphics card. I would be more worried about the 3090 with its disconnected sense pins.
    As an example, here in the UK we can support up to 3kW of mains power on a single socket, that is 230V at 13A. I had a power supply to test; this unit could provide 2kW DC, that is 100V at 20A, or 22V at 90A, or any combination in between. This is obviously far less than the 3kW maximum that our mains can supply. The power supply in question had been fitted with a standard mains plug, so I plugged it in and switched on. The mains trip flipped, and on further investigation I discovered that the power supply had a 31A power-on surge and should never have been fitted with a standard plug.
    Yes, the number of connections and strain on the pins would be a problem. The plug has connectors that are basically springy cylinders; each time they connect they are forced slightly more open, and when disconnected they have to spring back. Eventually this spring effect gets weaker, and bending the cable puts extra strain on the connection, shortening its lifespan. Once the connector loses its spring, the connection becomes weak and causes resistance, creating heat and melting the connector.

    • @BERSERKERDEMON1
      @BERSERKERDEMON1 Před rokem

      I would rather think of it as a Peltier-effect problem...
      An adapter remains a piece of a different alloy (not the same amount of copper, or not as pure) which creates heating points at the connectors... especially with such power draws...
      It's a shame, I was really interested in the new RTX improving-games-with-AI thing (and was even considering selling my RX 6900), but changing my graphics card + my PSU (and maybe my motherboard and CPU) is going to give my banker a heart attack... so I'm going to skip Nvidia I guess (to save my banker and myself).

  • @Kupidon14
    @Kupidon14 Před rokem

    Nice information! My whole life I've used power supplies with no data lines between the graphics card and the power supply; why is it needed now? And why don't they put a thermal cutout near the power connector?

  • @fuxseb
    @fuxseb Před rokem +1

    Ye olde .093" standard Molex connectors were rated for... 25 mating cycles. A SATA plug is good for 50 cycles, and its receptacle (as in mobos and disk drives) just 500. Yet any proper enthusiast has surely abused some of them for more than ten times that and they're still serviceable. They are no longer guaranteed to keep their parameters within tolerances, but that hardly ever becomes a problem. You're right about the fire hazard, though. We may be pushed to actually respect the number of cycles on these connectors after all, as never before were these little connectors supposed to withstand currents normally seen in welding and car starters.