Is this NVIDIA's Greatest GPU?

  • Published 28. 07. 2024
  • On June 18th, NVIDIA became the world's most valuable company. But it has been a long road for Nvidia: since 1995, Nvidia has made a plethora of different GPUs. Of all of those GPUs, there is a strong case to be made that this is the most important of them all.
    Patreon: Patreon.com/FullyBuffered
    Twitter: FullyBuffered
    0:00 - Intro
    0:37 - GPUs, but not for games
    1:17 - Finding new applications for GPUs
    3:00 - GeForce 8800 GTX
    4:24 - DX10 & Unified Shaders
    6:06 - 8800 GTX & EVGA SR-2
    7:16 - Gaming performance
    9:13 - CUDA
    13:16 - Competition
    14:00 - Conclusion
  • Science & Technology

Comments • 145

  • @JamesSmith-sw3nk
    @JamesSmith-sw3nk 1 month ago +49

    What a lot of today's reviewers forget is that 1080p wasn't always the most popular resolution. When this card was released, 1024x768 was the most popular resolution.
    Many LCDs had that resolution. Expensive, larger LCD monitors (22" used to be considered a large LCD, remember that?) were often only 1680x1050 or 1600x900.

    • @coccoborg
      @coccoborg 1 month ago +5

      Yes!!! I had a 1440*900 panel :D

    • @JohnSmith-iu8cj
      @JohnSmith-iu8cj 1 month ago +3

      1680x1050 was the standard for new TFTs at the time; older TFTs mostly had 1280x1024.

    • @JamesSmith-sw3nk
      @JamesSmith-sw3nk 1 month ago

      @@JohnSmith-iu8cj Resolution mostly varied by LCD panel size. In 2008, according to the Steam survey, over 70% of PC gamers ran at either 1024x768 (I ran that resolution on a 17" LCD) or 1280x960 (I never ran that resolution).

    • @manitoba-op4jx
      @manitoba-op4jx 1 month ago +2

      @@JohnSmith-iu8cj I still have my 1280x1024 Dell monitor. It's been my primary monitor for 20 years now.

    • @FullyBuffered
      @FullyBuffered 1 month ago +1

      True! These tests were meant to show what performance is like at a modern resolution, but 1080p did not become mainstream until around 2010 or so. However, higher resolutions were already available on larger CRTs.

  • @Sam-K
    @Sam-K 1 month ago +46

    In a lot of ways, the 8800GTX was the 3090Ti of its time.
    Like the 8800 GTX, the 3090 Ti considerably raised the power envelope - from the generally accepted 250W to a staggering 450W - and as a result was about 1.8x faster than the previous-gen 2080 Ti.
    And now, the 220W 4070 Super and 245W 7700XT are considered "power efficient" cards. I'm pretty sure things will eventually go up to 500-600W, now that we have finally hit the silicon barrier and it's getting harder and harder to shrink transistors and/or run them more efficiently.

    • @harryshuman9637
      @harryshuman9637 1 month ago +10

      Also, don't forget: at $600+ it was 2-3 times more expensive than the GPUs that came before it, compared to the $250 of the mainstream 8800 GT. Kinda like the RTX 3090/4090 vs the 3070/4070.
      Also keep in mind that people would buy 2, 3 and sometimes 4 of these cards to use in SLI, so top-tier GPU setups were around $2000 - again, similar to the 3090s and 4090s of today.
      Just to highlight that nothing has really changed in the past 18 years, and top-tier PCs of today aren't really that much more expensive than the top-tier PCs of 2006.

    • @Sam-K
      @Sam-K 1 month ago +13

      @@harryshuman9637 Pretty much. That's exactly why I consider the 4070Ti to be Nvidia's flagship. Cards beyond the 4070Ti are Titan-class behemoths with ridiculous TDPs and terrible price-to-performance ratios. You're paying a lot for not a whole lot of performance.
      For example, the 4090 offers ~60% better performance than the 4070Ti, yet costs twice as much as far as MSRP is concerned.

    • @StaelTek
      @StaelTek 1 month ago +2

      more like 8800 Ultra, imo

    • @Alpine_flo92002
      @Alpine_flo92002 1 month ago

      A big "Issue" is also how well those cards scale with additional power. You dont lose a lot of performance with limiting power yet you gain a good bit with just increasing the power budget. Unlike older gens where you would put in tons of power and get only slight gains.

    • @harryshuman9637
      @harryshuman9637 1 month ago

      @@Sam-K You are comparing gaming performance, though; in 2024 you aren't buying a top-tier GPU for gaming. You are buying it for ML applications, and the 4090 is around twice as fast as a 4070.

  • @TheBasedSociety
    @TheBasedSociety 1 month ago +7

    I have the Quadro version of the 8800 GTX. Same deal, but it has 1.5 GB of VRAM instead. Absolutely massive for the era.

  • @PixelPipes
    @PixelPipes 1 month ago +14

    A VERY excellent video, and exactly the point of view I would have taken. It was clear some years ago, even before the AI craze, that CUDA was the distant leader of the supercomputing-on-a-GPU revolution, and that was all because, as you said, they made the inroads early on and basically formed the industry from the ground up around their products. It was hard for AMD or other competitors to break in, because it wasn't just the GPGPU industry they pursued, it was _NVIDIA's_ GPGPU industry. The 8800 GTX and its G80 GPU were described by multiple publications at the time as an "inflection point", but they couldn't have known back then how right they'd be.

    • @FullyBuffered
      @FullyBuffered 1 month ago +1

      Very true - thanks for the kind words man :)

  • @piked86
    @piked86 1 month ago +3

    Your videos are so well produced and researched. I wish you made them more frequently.

  • @Ivan-pr7ku
    @Ivan-pr7ku 1 month ago +16

    The foundation for CUDA was a set of programming and debugging tools derived from the Open64 project in the early 2000s -- an open source compiler for Itanium and x86-64.

  • @JamesSmith-sw3nk
    @JamesSmith-sw3nk 1 month ago +36

    If you own an 8800 GTX, use MSI Afterburner and set it to the 8800 Ultra's clocks. They are the same GPU; the Ultra just has a better cooler.

    • @TheBasedSociety
      @TheBasedSociety 1 month ago +7

      The Ultra has a newer revision of the G80 and lower-latency memory.

    • @Mr371312
      @Mr371312 1 month ago +6

      The same silicon doesn't equal the same "specs" - learned that the hard way. Power delivery, the number of phases, and the cooling for components outside the core all differ.

    • @JamesSmith-sw3nk
      @JamesSmith-sw3nk 1 month ago +1

      @@Mr371312 It often can. When it comes to overclocking, it's ALWAYS:
      "Your mileage will vary."

    • @TheBasedSociety
      @TheBasedSociety 1 month ago +3

      @@JamesSmith-sw3nk I have a Quadro FX 5600, the pro 8800 GTX with 1.5 GB. It achieves exactly the same clocks as a stock 8800 Ultra and not 1 MHz more.

    • @Different_Name_
      @Different_Name_ 1 month ago

      Binning

  • @TimmyJoePCTech
    @TimmyJoePCTech 1 month ago +14

    Extremely informative and interesting :)

  • @Geekzmo
    @Geekzmo 17 days ago +1

    Great video! It's nice to see you back!

  • @laz7354
    @laz7354 1 month ago +3

    Hard work took a decade to pay off, but it paid off big.

  • @veilside62
    @veilside62 1 month ago +2

    I used to do distributed computing (SETI@home) back in the day, and the first time I saw my 8800 working with CUDA instead of the CPU, it was so fast I thought there was a problem with it, because there was no way it could be that much faster than a CPU.
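
    A minimal CUDA sketch (not from the video or from SETI@home/BOINC) of the kind of data-parallel work the comment above describes: one GPU thread per array element instead of one CPU loop iteration per element. The kernel name `saxpy` and the sizes are illustrative, and `cudaMallocManaged` is a convenience from later CUDA releases; G80-era code would use explicit cudaMalloc/cudaMemcpy.

    #include <cstdio>
    #include <cuda_runtime.h>

    // Each thread computes one element of y = a*x + y. The GPU launches thousands
    // of these threads at once, which is why distributed-computing workloads like
    // signal analysis sped up so dramatically compared to a single CPU core.
    __global__ void saxpy(int n, float a, const float *x, float *y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) y[i] = a * x[i] + y[i];
    }

    int main() {
        const int n = 1 << 20;                        // ~1 million elements
        float *x, *y;
        cudaMallocManaged(&x, n * sizeof(float));     // managed memory keeps the sketch short
        cudaMallocManaged(&y, n * sizeof(float));
        for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

        saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);   // grid of 256-thread blocks
        cudaDeviceSynchronize();

        printf("y[0] = %f\n", y[0]);                  // expect 4.0
        cudaFree(x);
        cudaFree(y);
        return 0;
    }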

  • @Obie327
    @Obie327 1 month ago +1

    Been using Nvidia since the beginning and enjoyed the ride. My first G80 was the 8800 GTS, and then the XFX Ultra XXX edition in July of 2007 to play Crysis. But my favorite G-series card was the final iteration of the G92 on a 55nm die, my EVGA GTS 250 1 GB. That GPU had a refined process and higher clock speeds. Thanks, Fully Buffered, for the look back and the informative video.

  • @MadeleineTakam
    @MadeleineTakam 1 month ago +1

    Excellently presented as always.

  • @fr8shaker289
    @fr8shaker289 1 month ago +1

    Informative. Thank you.

  • @_odaxelagnia
    @_odaxelagnia 1 month ago +2

    Your videos are like a treat

  • @wertywerrtyson5529
    @wertywerrtyson5529 1 month ago +7

    From 5% of revenue from datacenters in 2015 to 87% - that is a massive change. They barely even need gamers anymore. I remember when the 8800 GTX came out, but I didn't pay much attention to CUDA, as the gaming performance was so impressive.

    • @FullyBuffered
      @FullyBuffered 1 month ago

      Indeed! At the moment gaming has become a side hustle for them...

  • @SinaFarhat
    @SinaFarhat 1 month ago +2

    Thanks for an informative video!

  • @Olibelus
    @Olibelus 1 month ago +2

    Great video!!

  • @knightfall71
    @knightfall71 15 days ago +1

    I remember the 8800 GTX - I wanted one bad, lol, but I could only afford the 7600 GS :(. That card was still 225 bucks, and I was earning 90 bucks a week as a first-year apprentice mechanic, lol.

  • @MrHav1k
    @MrHav1k 1 month ago +2

    It definitely was the turning point, no doubt about it. The thing is, it really took over a decade for it to take off, and 15 years for CUDA to become the staple it is today.

  • @legomaniac601
    @legomaniac601 1 month ago +1

    I recently got a pair of 8800 GTs in an XPS, so it shows how far we have come - from a pair of 8800 GTs in SLI to where I am now with my 3080 12GB.

  • @geort45
    @geort45 1 month ago +5

    EVGA SR-2 FTW!

  • @exaltedb
    @exaltedb 1 month ago +2

    I own an 8800 GT, which is basically a shrunk-down, slightly reconfigured G80 - the G92 - where the memory interface was cut down but counteracted with twice the TMUs. It was probably too good of a card for where NVIDIA placed it.

  • @Micecheese
    @Micecheese 1 month ago +1

    I loved the AMD Opteron quad-CPU setup, but that case is really expensive.
    This old 8800 GTX was a huge step up in Nvidia's game; CUDA helped me in more ways than just FPS improvements in older games.

  • @roki977
    @roki977 1 month ago +1

    I never owned an 8800 GTX, but I did own a BFG 8800 GTS 320 and 2x EVGA 8800 GTs in SLI; they were way too hot. Those were super exciting times for gaming, with great games.

  • @eightyd2554
    @eightyd2554 1 month ago +1

    Casual SR-2

  • @jmtradbr
    @jmtradbr 1 month ago +1

    CUDA is the market default. It's like Adobe: even if a program better than Photoshop or Premiere exists, employers will still ask for Adobe products on your CV.

  • @ccleorina
    @ccleorina 1 month ago +1

    I have collected three 8800 GTS cards and two 8800 Ultras; sadly, they died for unknown reasons. But my four 8800 GTX cards are still working great in 2024 inside my retro gaming PC.

  • @VladislavKusmin
    @VladislavKusmin 1 month ago

    In 2007 I was looking at the 8800, but I chose to get an ATI 19xx instead. Unfortunately, it broke after about 4-5 years, but until then I didn't particularly suffer from the lack of unified shaders.

  • @lwwells
    @lwwells 1 month ago +1

    This card excited me so much when it came out that I bought 3 of them for 3-way SLI. THAT WAS A MISTAKE.

    • @FullyBuffered
      @FullyBuffered 1 month ago

      Haha that would have been an awesome setup back in the day!

    • @lwwells
      @lwwells 1 month ago

      @@FullyBuffered I felt like a king for two generations. 😂

  • @cheedam8738
    @cheedam8738 1 month ago

    I never really understood what that long plastic piece that extends the GPU is - what is it actually? Some older AMD Radeon HD blower-type cards had those too.

    • @Mrproud696
      @Mrproud696 1 month ago +5

      It lets you put the card into, or remove it from, a server easily.

    • @TheBackyardChemist
      @TheBackyardChemist 1 month ago +3

      Stiffening bracket for mechanical integrity in transport. It is intended to be used in a case that has slots to support the weight from both sides.

    • @lemagreengreen
      @lemagreengreen 1 month ago +1

      I think it was just like in the very old days when "full length" ISA cards sometimes had a little extension to lock into support brackets at the front of cases. Maybe Nvidia anticipated the sag problems of the future even if it wasn't strictly necessary with these cards?

    • @JohnSmith-iu8cj
      @JohnSmith-iu8cj 1 month ago

      I have the first dual-slot 7800 GTX with that extender, from Dell. It has no CUDA.

    • @arenzricodexd4409
      @arenzricodexd4409 1 month ago

      @@JohnSmith-iu8cj Those were an older architecture. Nvidia only supports CUDA on unified-shader architectures.

  • @JamesSmith-sw3nk
    @JamesSmith-sw3nk 1 month ago +2

    Great video. I owned three 8800 GTXs, and I remember that for about a week I was 14th on 3DMark's scoreboard. Tri-SLI stuttered a lot in games, so I eventually just ran 2-way SLI and used the third card for PhysX.

  • @ButtaDawg6969
    @ButtaDawg6969 6 days ago

    Old GPUs are so funky and fun.

  • @razorsz195
    @razorsz195 1 month ago +1

    One day I'll get around to setting up my dream beast: two G92 8800 GT Gold cards in SLI with an Athlon II X2 running over 4 GHz - a golden theme for a golden era. Back then I was using an FX 5200 and an ol' Sempron, and the only thing it was good for was binging forums and reviews of the hardware I could never afford :')

    • @FullyBuffered
      @FullyBuffered 1 month ago

      Do it!

    • @razorsz195
      @razorsz195 1 month ago

      @@FullyBuffered Got a 7950 GT SLI system in mind too, with an E6500K. Still hunting for a second Sparkle 7950 GT though; I need to sell some PCs first before building any more :P

  • @ProjectPhysX
    @ProjectPhysX 1 month ago +4

    More important than CUDA, the 8800 GTX was also Nvidia's first GPU to support OpenCL, the open GPGPU language that is equally fast/efficient as CUDA, yet works on all hardware from Nvidia, AMD, Intel, Apple, ARM, ... OpenCL is increasingly important especially in science, where new supercomputers regularly have GPUs from different vendors - the same OpenCL code works everywhere and developers don't have to waste years on code porting.
    13:29 AMD's counterpart to CUDA is not ROCm, but HIP, which is their proprietary nonsense language that doesn't even work on most of their own GPUs.

    • @Slavolko
      @Slavolko 1 month ago

      Both CUDA and HIP are dialects of C++, so I don't know why you're singling out HIP specifically, especially when HIP is able to run on more than just AMD GPUs.

    • @FullyBuffered
      @FullyBuffered 1 month ago

      Interesting - thanks for the comment!
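
    A minimal sketch (not from the video) of the point made in this thread: CUDA and HIP are very close C++ dialects. The kernel below is written once, and porting it to HIP is largely a matter of swapping the header and the cuda*/hip* prefixes on the host API calls; hipcc also accepts the <<<...>>> launch syntax. The kernel name `scale` and the sizes are made up for illustration.

    #include <cstdio>
    #include <cuda_runtime.h>   // HIP equivalent: <hip/hip_runtime.h>

    // The kernel is identical in CUDA and HIP: same __global__ qualifier,
    // same blockIdx/blockDim/threadIdx built-ins, same C++ body.
    __global__ void scale(float *data, float factor, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) data[i] *= factor;
    }

    int main() {
        const int n = 1024;
        float host[n];
        for (int i = 0; i < n; ++i) host[i] = (float)i;

        float *dev;
        // Host calls differ only by prefix: cudaMalloc -> hipMalloc, cudaMemcpy -> hipMemcpy, ...
        cudaMalloc((void **)&dev, n * sizeof(float));
        cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);

        scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);

        cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);
        cudaFree(dev);
        printf("host[10] = %f\n", host[10]);  // expect 20.0
        return 0;
    }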

  • @lemagreengreen
    @lemagreengreen 1 month ago +3

    I have sort of wondered if we'll move on from the "GPU" naming convention, given that it's just a general-purpose vector processor that we've been running for a long time now.

  • @inkysteve
    @inkysteve 1 month ago +1

    Can AI explain why one man needs so many spatulas?

  • @SedatedByLife
    @SedatedByLife 1 month ago

    Loving the new haircut. I always thought you were gorgeous but now... woof. 😊

  • @Snufflegrunt
    @Snufflegrunt 1 month ago +4

    The 8800GTX was the most important thing to happen to computers since the IBM 5150.

  • @JohnSmith-iu8cj
    @JohnSmith-iu8cj 1 month ago +1

    Gamers and nerds, unite!

  • @dubment
    @dubment 1 month ago

    The GTX 10 series, coming from the 900 series, brought the kind of performance leap you mentioned from the 7950 to the 8800 GTX. Funny that when they released the 1070 Ti, the only difference between it and the GTX 1080 was GDDR5X and 128 more CUDA cores - basically an 8800 GTX's worth of cores...

  • @tonycrabtree3416
    @tonycrabtree3416 1 month ago

    Pretty sure it’s “Ai, Ai, Ai” 😂😂😂 or at least that’s what so many CEOs keep saying at every shareholder meeting.

  • @TCBOT
    @TCBOT 1 month ago

    Just showing off the EVGA SR-2 lol

  • @DanielCardei
    @DanielCardei 1 month ago +2

    The 8800 GTX was a groundbreaking product that pushed the boundaries of what was possible in GPUs.

  • @Moderna_
    @Moderna_ 1 month ago +2

    Thanks cuda core, really cool

  • @wewillrockyou1986
    @wewillrockyou1986 1 month ago

    Nvidia became a trillion dollar company not because they made good hardware, but because they made useful software... See: Microsoft

  • @Sitharii
    @Sitharii 1 month ago +1

    Definitely the two data-center/AI landmarks for NVIDIA are the *Tesla* (8xxx lineup) and *Fermi* (GTX 5xx lineup) architectures!!
    Both of them are heavily compute-focused architectures that paved the way for the AI era.
    Coincidentally, these two have always been my most beloved architectures, although I'm (or was) just a videogamer...

    • @FullyBuffered
      @FullyBuffered 1 month ago +1

      True!

    • @GrainGrown
      @GrainGrown 1 month ago

      It's written NVIDIA, not "nVIDIA".

    • @Sitharii
      @Sitharii 1 month ago

      @@GrainGrown I've been an nVIDIA customer since 1999 (*Riva TNT2 Ultra*), so I know very well how it's written. Search Google for the nVIDIA logo and check the letters: the "n" is lowercase and the others are capitals, but they are all exactly the same size, which of course I can't reproduce here, since lowercase and capital letters render at different sizes in normal text... 😉
      (*It can also be written the way you say, but that doesn't mean the way I write it isn't correct as well.)

  • @DragonBane299
    @DragonBane299 1 month ago +2

    WOOWW, your hair is so nice!
    It's always nice when you post a video - the highlight of my day every time!

    • @FullyBuffered
      @FullyBuffered 1 month ago +1

      Many thanks for the kind words! I'm glad to hear that! :D

  • @WhoCaresGamingIsDead
    @WhoCaresGamingIsDead 13 days ago

    It's hilarious that the 8800 GTX can now get spanked and DESTROYED by an AMD Radeon 780M, lmao! The 780M (in the 8700G) delivers Radeon R9 280X-level performance today - but from the CPU! It's very close to the desktop R9 290 and obliterates the flagship (at the time) HD 6990 6GB dual-GPU CrossFire card, a $1500 GPU in 2010.

  • @LawrenceTimme
    @LawrenceTimme 1 month ago

    Aipu

  • @WhoCaresGamingIsDead
    @WhoCaresGamingIsDead 13 days ago

    2032

  • @DuneRunnerEnterprises
    @DuneRunnerEnterprises 1 month ago +1

    Also, it became the basis for the 9800, which later became the 250! 😊😊😊

    • @Warbob11
      @Warbob11 1 month ago

      I pulled a 9800+ from someone's old build years ago, and it still works. I keep it in my collection of older GPUs that I use for my 32-bit Windows XP builds, where I play everything I can get working.

    • @DuneRunnerEnterprises
      @DuneRunnerEnterprises 1 month ago

      @@Warbob11
      Got 3 of 'em now.
      8)

    • @classic_jam
      @classic_jam 1 month ago +2

      Indirectly, those all use G92 and not G80. But G92 is a more efficient refresh with upsides and downsides.

    • @DuneRunnerEnterprises
      @DuneRunnerEnterprises 1 month ago

      @@classic_jam
      Maybe.
      I might make a video going from the 8800 to the 250.
      Just got to set up a platform.

  • @MrFreeman1981
    @MrFreeman1981 1 month ago

    Your GPU test is being bottlenecked by the CPU, and your input lag is also horrible, which is clearly visible at 9:02 when moving the mouse. In my opinion it is unplayable like this. Yes, there are some casual players who don't know the feel of actual ~5ms-frametime, 144Hz-and-up gameplay, who are fine with it and even play on a controller, but I could not game like this.

  • @WhoCaresGamingIsDead
    @WhoCaresGamingIsDead 13 days ago

    Yeah, Nvidia, the anti-consumer, anti-competitive POS GPU company.

  • @х0хлы-підоры
    @х0хлы-підоры 1 month ago

    "That way" ("туда" - a pun on CUDA, which sounds like the Russian for "where to?")

  • @VITAS874
    @VITAS874 1 month ago

    Now Nvidia is a greedy company.

    • @bp-it3ve
      @bp-it3ve 1 month ago

      more like always

    • @VITAS874
      @VITAS874 1 month ago

      @@bp-it3ve Earlier they held their pricing appetites in check. Now they make excuses and lies like "we will lower prices later".

    • @LawrenceTimme
      @LawrenceTimme 1 month ago

      Sounds like something a poor would say.

    • @VITAS874
      @VITAS874 1 month ago

      @@LawrenceTimme Sounds like something a rich man would say. And if high prices are normal for you, then I feel sorry for you.

    • @VITAS874
      @VITAS874 1 month ago

      @@bp-it3ve They used to keep their appetites in check. Now they are lying by saying "we will lower prices after Covid".

  • @TheRealEtaoinShrdlu
    @TheRealEtaoinShrdlu 1 month ago +1

    It's "H100", not "Haayitch100".

  • @stuartthurstan
    @stuartthurstan 1 month ago

    Please, let's try to forget about those cringe, disgusting "gamer" GPUs. Everyone knows that nvidia is really an AI server company.

  • @homelessEh
    @homelessEh 1 month ago +1

    I think all this AI crap should be a crime against humanity... I do not approve of the AI crap happening these days.

    • @homelessEh
      @homelessEh 1 month ago

      It'll get to the point where it'll be a being whose blood is electric, and it'll be lobbying for rights that supersede human rights so it can become an electric god... At that point our blood will become a mere color to be used in a twisted palette when it subverts us, in a Matrix-like future, under the guise of ART... Sounds crazy, I know.

    • @FullyBuffered
      @FullyBuffered 1 month ago

      You can certainly make some strong arguments against it...

  • @supabass4003
    @supabass4003 1 month ago

    We all helped create the world's most valuable company. Even though Nvidia has basically told gamers to get rich or touch grass, I'm happy I threw my hat into the ring 24 years ago.
    7h3 m0r3 y0u buy 7h3 m0r3 y0u 54v3