The $199 Radeon X800 & GeForce 6800 Quick 'n Dirty Comparison // Pixel Fragment

  • Published 8 Sep 2024
  • In this Pixel Fragment video, I spent a day talking about and doing some quick tests comparing NVIDIA's GeForce 6800 (PCI Express version) versus ATI's Radeon X800 (also PCI Express). Like other Fragment videos this is a one-day production video with no script and simplistic tests (no 1% lows here). Apologies for the clipping audio, as I changed up my recording device and it still needs some tweaks.
    Platform specs:
    Phenom II X4 965BE @ 4.09GHz
    2GB OCZ Platinum Ed. PC2-8500 @ 936MHz
    ASRock ALiveDual-eSATA2 (ULI M1695 + NVIDIA nForce3 250)
    Creative Labs Audigy 2 ZS
    Samsung 830 Series 128GB SSD
    Windows XP Professional SP3
    Drivers:
    ATI Radeon X800:
    -Catalyst 10.2 Legacy
    NVIDIA GeForce 6800:
    -Forceware 163.71
    ○PixelPipes Store: shop.spreadshi...
    ○Join the Pixel Talk Discord: / discord
    ○Instagram: / pixelpipes
    ○Follow me on Twitter @pixelpipes
    ○Donate via Paypal: @pixelpipes
    Music from epidemicsound.com - Get a 30 day trial by using this link: epidemicsound....
    Intro Animation By Ken Gruca Jr - Inquire at kjgruca@gmail.com

Komments • 143

  • @hotgore
    @hotgore 1 year ago +13

    The X800/6800 generation was the most exciting time. Two mature platforms, neck and neck performance, PCI-E, and different hardware features made it a fun time to be into PCs.

    • @PixelPipes
      @PixelPipes 1 year ago +2

      It's my favorite era for sure!

    • @stevef6392
      @stevef6392 1 year ago +3

      I remember it as the era of 128MB X800s and 6600 GTs, and 256MB Radeon 9250s and FX5200s.

    • @olnnn
      @olnnn 1 year ago

      @@stevef6392 Yeah, that was the time when the cheap cards had just normal DDR while the high-end ones came with faster GDDR memory, so they put a lot of cheap VRAM on some of the budget ones for big numbers or something. The 256 MB FX 5200 was the PCI one, while the AGP ones mostly had 64 or 128 MB, so maybe it made some sense due to the limited bus bandwidth, but I don't know.

    • @badass6300
      @badass6300 7 months ago +1

      2004-2012 was the most fun time for hardware IMO. 2012-2015 was the second most fun for GPUs, but not CPUs. Since 2016 it's been terrible for GPUs and since 2022 for CPUs. 2017-2021 was great for CPUs.

    • @badass6300
      @badass6300 7 months ago

      @@stevef6392 You sacrificed capacity to increase the bandwidth. The FX 5200 had 6 GB/s of memory bandwidth; the X800 has 32 GB/s.
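      (Those figures follow from bus width times effective memory clock. Below is a minimal sketch of that arithmetic, assuming typical retail configurations (128-bit DDR at 400 MT/s on the FX 5200, 256-bit GDDR3 at roughly 1000 MT/s on an X800 XT-class card) rather than anything stated in the video.)

```python
# Peak memory bandwidth = (bus width in bytes) * (effective transfer rate).
# The clocks and bus widths below are typical retail configs, assumed here
# for illustration; they are not figures taken from the video.

def mem_bandwidth_gbs(bus_width_bits: int, effective_mts: float) -> float:
    """Peak memory bandwidth in GB/s (decimal GB)."""
    return (bus_width_bits / 8) * effective_mts * 1e6 / 1e9

print(mem_bandwidth_gbs(128, 400))    # GeForce FX 5200, 128-bit DDR-400 -> 6.4 GB/s
print(mem_bandwidth_gbs(256, 1000))   # Radeon X800 XT, 256-bit GDDR3    -> 32.0 GB/s
```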

  • @RuruFIN
    @RuruFIN 1 year ago +34

    Probably my favourite GPU generation. The Radeon X800 was faster in general, but the GF 6800 had newer features (SM3.0).

    • @kimnice
      @kimnice 1 year ago +10

      I remember how everyone was buying the GeForce 6800 LE and then unlocking processing units through BIOS mods. It had 8 pixel pipelines enabled as standard, but some people managed to get 12 units, some even got 16, and through overclocking made it practically a 6800 GT with slightly slower memory.

    • @sgdude1337
      @sgdude1337 1 year ago +10

      This actually mattered a lot. My brother was able to game on one of these until upgrading to a GTX 260 in early 2009, thanks to SM3 support. An X800 would have become obsolete much earlier, since many games from as early as 2007 required SM3. That 260 lasted him until 2015, so the GeForce 6 could have set you up for a path of minimal upgrades. An X800 purchase would certainly have had him upgrading to an 8800 GT instead, over a year earlier, and that certainly wouldn't have lasted him as long.

    • @PixelPipes
      @PixelPipes 1 year ago

      Mine too

    • @sgdude1337
      @sgdude1337 1 year ago +7

      If he had the foresight to hold out until the 5870, he could realistically have gone 6800 -> 5870 -> 980 and still be gaming on his third GPU since 2004. Crazy to think about.

    • @yorgle11
      @yorgle11 1 year ago +2

      @@sgdude1337 Sometimes I wonder how well I could optimize my PC upgrade history, knowing what I know now about the hardware and what software/games I ended up actually playing.

  • @mtunayucer
    @mtunayucer 1 year ago +6

    Finally frequent uploads. I genuinely missed your videos and style

  • @TechGamesAU
    @TechGamesAU 1 year ago +13

    I have the 256 MB X800 Pro (AGP 8x version). I've owned it since 2004 and finished both Doom 3 and HL2 on it. Love that card.

    • @retropcscotland4645
      @retropcscotland4645 1 year ago +1

      I own that card's AGP sibling, the X850 Pro, with a stinking huge Molex power connector.

    • @beirch
      @beirch 10 months ago +1

      My first GPU was a Sapphire Radeon X800 and I sort of wish I'd kept it. Got a Sapphire Radeon HD 5850 Toxic after that, which I also regret not keeping. Probably the sickest looking GPU ever.

  • @yansproductions
    @yansproductions 1 year ago +8

    Good to see you back. I like the way you edit your videos, and it's fun to watch these GPUs in retrospect.

  • @kady5991
    @kady5991 1 year ago +3

    omg, pixelpipes! It's been forever since your vids popped up in my feed - so glad to see your vids again!

  • @IronicTonic8
    @IronicTonic8 1 year ago +6

    I have the AGP versions of both these cards, but in a twist of fate the memory configurations are reversed (the 6800 has 128 MB and the X800 has 256 MB). I was able to unlock all the pipes on both cards for extra performance.

    • @classic_jam
      @classic_jam 1 year ago +1

      For me, all my X800 cards leave the 6800 AGP I have in the dust. They all have GDDR3 or 256 MB+. The 6800 Ultra is a bit faster than some, but the two Platinum Editions leave everything in the dust. Great cards.

    • @IronicTonic8
      @IronicTonic8 1 year ago +2

      @jam8076 Yeah I upgraded from the 6800 AGP to the X800 256 and after unlocking the cores it was significantly faster than the 6800.

  • @KristopherJohnsonawesomepossum

    Holy crap! You're back. I love your channel man. Glad to see new videos :)

  • @50shadesofbeige88
    @50shadesofbeige88 1 year ago +10

    Good idea! I like the format. Not everything has to be fully scripted and produced. Keep up the good work. I'm glad to see you're back.

  • @ToniHiltunen1980
    @ToniHiltunen1980 1 year ago +3

    Just bought a Club3D Radeon X800 RX/GT from a flea market last week; a good catch, it cost me only 5€ and works fine. Thanks for the video :)

  • @Rickenbacker451
    @Rickenbacker451 1 year ago +4

    I once had a 6800 with 128 MB; it served me well for quite some time and was a nice upgrade from my previous FX 5900 XT. Now I own a 6800 GT for one of my retro PCs. It has to serve as a replacement for my amazing Gainward 7800 GS 512MB AGP with unlocked pixel pipes that sadly began to show artifacts.

  • @Wil3vlbc9gvk604
    @Wil3vlbc9gvk604 1 year ago +3

    Good to see you back.

  • @VdWck
    @VdWck 1 year ago +8

    Love that time. New generations of GPU, amazing games like Far Cry, HL2, Doom 3....

  • @adamk3026
    @adamk3026 1 year ago +3

    Awesome! A new video. Thanks for making my day.

  • @wasitacatisaw83
    @wasitacatisaw83 1 year ago +3

    I had an AGP 6800GS and I flashed the 6800GT BIOS to it to unlock the 4 locked pipelines (12 to 16). Free huge performance boost.

    • @GrumpyWolfTech
      @GrumpyWolfTech 1 year ago

      you sure you don't mean the 6800 LE? I remember doing it with a few of them.

  • @Najmods
    @Najmods 1 year ago +4

    Love this segment, and I could recall what you were saying as you went along, like it was yesterday. I loved this time; I actually bought a 6800 NU back then that was capable of unlocking to 16 pipes. The only bottleneck was the 128 MB of VRAM, but it was fast. Nowadays I buy bucketloads of X800 cards because of one feature: ATi TruForm :)

  • @AlValentyn
    @AlValentyn 1 year ago +8

    I was only able to afford the 6600GT at the time, but it was a fantastic generation of cards and games.

    • @GrumpyWolfTech
      @GrumpyWolfTech 1 year ago +2

      I had a 6600 GT before upgrading to a 6800 Ultra; the card was great bang for the buck. The x60-class NVIDIA cards don't compare now.

    • @foxxy46213
      @foxxy46213 5 months ago

      Great card, that... overclocked, it was as fast as a 6800 GT at all but huge resolutions with AA and AF.

  • @otaviosakay3133
    @otaviosakay3133 10 months ago +1

    Thanks for that. I really enjoy your channel; it brings us back and lets us get excited about little things, like running Doom 3 or Call of Duty.

  • @dannybarker632
    @dannybarker632 1 year ago +2

    I just wanna say what a blessing it always is to see new content from you, I bought one of your "Maximum Rage" shirts, love it dude. Keep doing what you love, and me along with everyone else here will keep enjoying it. :)

    • @PixelPipes
      @PixelPipes 1 year ago +1

      Hey thank you! Hope you've been enjoying the shirt!

  • @Yellowswift3
    @Yellowswift3 1 year ago +2

    Good video, Nathan. Around this time, I believe I had the X800 Pro Vivo, though looking back, I think my CPU was restricting its performance by a fair margin.

  • @JamesSmith-sw3nk
    @JamesSmith-sw3nk 1 year ago +2

    Good video. The 6800 wins because you can play another 5 years' worth of games thanks to Shader Model 3 support.

  • @dualpapayas
    @dualpapayas 1 year ago +3

    I never knew that the plain 6800 was native PCIe and didn't use an AGP bridge chip, neat!

    • @classic_jam
      @classic_jam 1 year ago +1

      This is a later variant; there are also 6800s which use NV45 (NV40 + HSI bridge), while the original 6800 was NV40 and native AGP.

  • @framebuffer.10
    @framebuffer.10 1 year ago +3

    I'd have added a little overclocking to the comparison to spice things up; sometimes these upper-mid-range cards have quite a bit of room for improvement, especially if they use underclocked memory modules ;)

  • @blackholeretro
    @blackholeretro 1 year ago +1

    @PixelPipes I'm Soooooooooooo Glad to see you making gpu videos again Keep them coming and know that your contribution to this hobby/pastime is greatly appreciated !

  • @stanleysmith7551
    @stanleysmith7551 1 year ago +5

    These were the cards I always wanted as a kid, but never could afford. Instead I was blessed with the affordable/infamous FX 5200.😂😂 Although it was the 128 MB version and it ran Call of Duty significantly better than the X800...ok, the resolution was just 1024x768 but my frame rates weren't that much lower (around 60-65 fps, but AA wasn't set to 4 for sure)... luck I guess.😅

    • @blakegriplingph
      @blakegriplingph 1 year ago +3

      I had it even worse some fifteen years ago. My first discrete GPU was a 7300LE which had the so-called TurboCache feature where they had a scant amount of built-in video RAM and would take the rest of it from system RAM as needed. ATI also had their own take on it called HyperMemory too.

    • @PixelPipes
      @PixelPipes 1 year ago +5

      I always thought the TurboCache and Hyper Memory cards might make a good Halloween episode lol

  • @dabombinablemi6188
    @dabombinablemi6188 1 year ago +2

    The flipside of the 6600GT, where the AGP card had slower memory.

    • @PixelPipes
      @PixelPipes 1 year ago +2

      Yes...I'll never understand why that is

  • @SKHYJINX
    @SKHYJINX 1 year ago +3

    Flashback memories... man, so long ago. I'm not 100% sure, but I think I bought the X800 GTO? It definitely had GTO in its title/marketing.

    • @TheVanillatech
      @TheVanillatech 1 year ago +1

      GTO came late to the game, but was slightly faster than the GT if I recall.

  • @MultiTelan
    @MultiTelan 1 year ago +3

    Related to the X800...I actually miss my X700 pro. That was the first non-crap ATI card I owned.

    • @oddballhippie7363
      @oddballhippie7363 1 year ago +1

      9800 pro says hello...

    • @PixelPipes
      @PixelPipes 1 year ago +2

      X700 Pro was really good

    • @retro4fun358
      @retro4fun358 1 year ago

      @@PixelPipes About 15% slower than the GF 6600 GT in my benchmark results with an Athlon 3700+ and 2 GB RAM, but still a decent mid-range card for its day. It beats the 6600 non-GT by far. The 6600 GT is the clear winner against the X700 Pro, but the Radeon X700 Pro was a bit cheaper. ATI had planned to release the X700 XT, a higher-clocked X700 Pro that came closer to the 6600 GT, but the 6600 GT still won and production costs were too high for ATI, so they decided to skip that card and not bring it to market. Instead they focused on the Radeon X800 non-Pro, which beats the 6600 GT, and reduced its price so that the X800 became the competitor to the 6600 GT, even though technically it is the equal of the GeForce 6800 (which is not much faster than the 6600 GT); the 6800 was much more expensive for a little more performance than the 6600 GT. The 6600 GT had the best price/performance for most gamers, and it was fast enough to handle 2004 and even 2005 games at 1280x resolution with maximum details and anti-aliasing.

  • @MasticinaAkicta
    @MasticinaAkicta 1 year ago +3

    Ah yes, remember when $199 graphics cards still came with a 256-bit memory bus?
    Boy, that is long ago.

    • @PixelPipes
      @PixelPipes 1 year ago +2

      We're already at the point of "Back in my day" stories lol

  • @geofrancis2001
    @geofrancis2001 1 year ago +3

    I used to have an X800 AIW with a digital TV tuner and video capture. I only stopped using it as a second card when I moved to Vista, as there were no drivers for the tuner. I had a 6800 at one point and it was awful; it only had 128 MB of DDR1, so it was painfully slow for a 6800. Ditched that for an X1900 XT AIW. I think this was back when I was playing UT2004.

  • @TheVanillatech
    @TheVanillatech 1 year ago +2

    Back in the day, I bought a 6800XT AGP on a great deal, a freaky deal, just £25 more than a 6600GT. It also turned out to be an A0 revision PCB, which let me unlock all the masked vertex/pixel pipes with RivaTuner. Was a great card! Insane for what I paid for it! Had been set on a 6600GT but managed to get super lucky on eBuyer that day. Played Riddick: Butcher Bay amazingly well.
    Now I have a ton of X800 cards. X800XL, X800GT, X800GTO, X800Pro... all great cards!

    • @PixelPipes
      @PixelPipes 1 year ago +1

      The era of unlockable NV40 chips was a great time

    • @TheVanillatech
      @TheVanillatech 1 year ago +2

      @@PixelPipes Yeah I mean, I hoped - but I never expected to get that lucky. Read an article online about how it was possible, but only on certain boards. When it arrived, I was overjoyed. It was a leadtek model. Said "Rev. A0" on the corner of the board!
      First pass of 3Dmark 03 netted me around what a 6600GT got. But I unlocked just half the masked pipes, and it went up to 9000. Then I unlocked all of them, and scored over 11,000. No artifacts! But out of caution, I settled on leaving half of them locked. Performance was still great, and I didn't wanna stress my new purchase. Still plenty of power! For a little more than the cost of a 6600GT, the card I was about to buy, I ended up with essentially a 6800GT.
      Shoulda bought a lottery ticket that week! XD

    • @technooby220
      @technooby220 1 year ago +1

      I thought I got lucky with a 6800 vanilla that was unlockable. I was able to unlock and overclock it. I miss the days of those kinds of tricks. It is, or seems, so different now.

    • @TheVanillatech
      @TheVanillatech 1 year ago

      @@technooby220 Yeah the XT was the step down from the vanilla. Unlocking half the masked pipes brought me to, essentially, a vanilla 6800. But mine even let me unlock ALL the pipes. I did settle on just half though.
      Interestingly, the 6600GT beat the 6800XT in lower resolutions. It wasn't until you went up to 1280x1024 and beyond that the 6800 pulled away. But once I'd unlocked it, totally different tier of card. For just £25 more than a 6600GT!
      Saved a fortune really! At first I was reluctant to ask my Dad for the extra £25, things were tight back then, and I'd saved forever for a 6600GT (to replace my Geforce 4 Ti 4800SE). When I saw that insane offer on the 6800XT though, I had to visit the royal bank of Dad!
      I'm trying to remember, and I think the 6600GT was £145, and they had this 6800XT in stock for £169. At the same time, the top end cards like the 6800GT were listed for £230+ . Well out of my price range! But I got it in the end! Thanks to Rivatuner and my Dad! :D

    • @technooby220
      @technooby220 1 year ago +1

      @@TheVanillatech I love hearing old stories like these. I think I got a good deal on my vanilla 6800 on eBay... maybe £110 second hand, an early-hours auction finish. Where did you purchase your card? The GeForce 6-8 series were awesome cards. I also have fond memories of my GeForce 2 MX200... by the time I could afford one, the GeForce 3 was being released. I do miss the old days. There was something more organic about those times.

  • @GrumpyWolfTech
    @GrumpyWolfTech 1 year ago +1

    Had the 6800 ultra, upgraded from a 6600gt when I found it open box dirt cheap. Was a great card for the time.

  • @cyphaborg6598
    @cyphaborg6598 1 year ago +2

    I like this format.
    :D

  • @wishusknight3009
    @wishusknight3009 1 year ago +2

    1600x1200 with 4x AA is going to be about 114 MB just for the frame buffer, so I'm not surprised it caused the ATI card to dump on CoD, as it also uses larger textures than earlier Q3-engine games. The result at 1024 is interesting to me. It would also be interesting to see this without AA to free up even more VRAM, but I suspect the NVIDIA card would still poke ahead. CoD seemed to favor NVIDIA cards a small bit.
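    (The ~114 MB figure can be sanity-checked with quick arithmetic. The sketch below is my own rough estimate; the exact total depends on how the driver allocates buffers, e.g. double vs. triple buffering and whether the depth buffer is multisampled.)

```python
# Rough frame-buffer footprint at 1600x1200 with 4x MSAA.
# Assumptions (mine, for illustration): 32-bit color, 24/8 depth-stencil,
# double-buffered multisampled color, one multisampled depth buffer,
# and one resolved front buffer.

WIDTH, HEIGHT = 1600, 1200
SAMPLES = 4                 # 4x MSAA
BYTES_COLOR = 4             # 32-bit color
BYTES_DEPTH = 4             # 24-bit depth + 8-bit stencil
MB = 1024 * 1024

pixels = WIDTH * HEIGHT
msaa_color = pixels * SAMPLES * BYTES_COLOR / MB    # ~29.3 MB per buffer
msaa_depth = pixels * SAMPLES * BYTES_DEPTH / MB    # ~29.3 MB
resolved   = pixels * BYTES_COLOR / MB              # ~7.3 MB

total = 2 * msaa_color + msaa_depth + resolved
print(f"{total:.1f} MB")    # ~95 MB; triple buffering or extra resolve
                            # buffers push the total toward the ~114 MB ballpark
```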

    • @PixelPipes
      @PixelPipes 1 year ago +1

      Yeah I don't know why the framerate was lower at 1024x768. Without AA I'd expect both cards to max out the engine at 90fps (tho at this point I've already been surprised). I know ATI had a reputation for worse OpenGL performance, particularly in idTech3, but it was still peculiar

    • @wishusknight3009
      @wishusknight3009 1 year ago +1

      @@PixelPipes Driver overhead may also still be a slight culprit, and the significant differences in how the cards handle AA may play a role. It was sometimes a practice for reviewers to run a baseline without AA and AF to remove some variables with drivers, as the two took vastly different approaches to each. Though it often didn't really mean a whole lot... lol.

  • @MadsonOnTheWeb
    @MadsonOnTheWeb 1 year ago +2

    Those are the most desirable from that era

  • @BenWillock
    @BenWillock 1 year ago +2

    Oh wow you're alive

  • @ZeroHourProductions407
    @ZeroHourProductions407 1 year ago +2

    Interesting stuff, but if I _may_ chime in, since it's going to be a fair bit. For point of reference, my Win98 PC uses:
    A64 3700+
    512MB DDR400
    MSI K8N Neo2 Platinum motherboard
    SB Audigy 2 (platinum, according to drivers, lol)
    1280x1024 monitor
    Through earlier efforts, when they were more affordable, I ended up with an X800 XT PE (entirely because it was mislabeled in the listing), an X850 XT (which confirms they both perform identically), and a GeForce 6800 GT, since the 6800 XT I had on hand refused to work with the final 98SE drivers from NVIDIA. Moving on...
    In 3DMark 01's default benchmark runs, the 6800 GT is, similarly, about 1000 points faster on the same build.
    In Aquamark3, which I _think_ is a more OpenGL-suited benchmark program, the results were much more dramatic, in favor of NVIDIA. When I still had my 6600 GT, I was getting higher scores from it than I did even from the X850 XT Platinum.
    In Quake 3 Arena, with max in-game settings and the framerate cap disabled (com_maxfps = 0), I'd see about 400 fps with my remaining Radeon cards, but more like 600 with the GeForce 6800 GT. It's so obscene I'm literally having to enable the cap because the tearing becomes that distracting.
    One intangible in favor of the Radeon is that the install size and the resources used in a Win9x environment are simply a bit leaner with the Radeon cards. The downer at present is that selecting _reboot to MS-DOS mode_ just returns a blank screen for me with the ATI cards; no such issues with the NVIDIA card(s). If I had the room for a _second_ 98SE PC like this one, I'd drop in the GeForce card in a heartbeat.

    • @PixelPipes
      @PixelPipes 1 year ago +1

      Interesting. I wonder why the X850XT is performing so low?

    • @ZeroHourProductions407
      @ZeroHourProductions407 1 year ago +1

      @@PixelPipes Wish I knew, but since I can't get MSI Afterburner to work on Windows 98, I really don't have a way to know. I was wondering how you handle benchmarking in that OS, to be able to offer more insight?
      I can also say that on 98SE, the GeForce 6 cards all had support for temperature monitoring. Of my remaining Radeon cards, only the X800 XT PE gets any temperature monitoring support at all in the drivers.

    • @classic_jam
      @classic_jam 1 year ago

      None of these cards are really ideally used for Windows 98. The nVIDIA stuff tends to fare better. These really do their best in Windows XP. @@ZeroHourProductions407

  • @SUCRA
    @SUCRA 1 year ago +2

    Great comparison! I love card battles and the results make sense. When you're a regular gamer you're not choosing your game based on the use of VRAM or shaders or anything, you're just looking at the game screenshots and things related to the content in that game. Anyway, I'm glad you're back.

  • @ADSLPL
    @ADSLPL 1 year ago +1

    2004 nostalgia

  • @harryshuman9637
    @harryshuman9637 1 year ago +1

    I used to have an X850 GT, I think. The card was alright, although the lack of SM3.0 kinda ruined it for me since I couldn't play the latest games that came out later. Still, it played the X3 game very well. At least that's how I remember it; it's hard to tell, since "playing well" meant anywhere between 30 and 60 FPS.

  • @gamergod9182
    @gamergod9182 1 year ago +3

    I miss Ruby. AMD should bring her back.

  • @AmstradExin
    @AmstradExin 1 year ago +1

    I've never seen one of those at the dump. Also, none of the X800 AGP cards I ever found worked.

  • @thepcenthusiastchannel2300

    Another one!!!! Woot!!!!

  • @Warfrog101
    @Warfrog101 1 year ago +3

    Saw the thumbnail.. PERSONAL INTEREST! lol... have an X700 AGP in a drawer.

    • @AliceC993
      @AliceC993 1 year ago

      For better or worse I still have an X1650 Pro AGP laying around somewhere, it needs to have its fan repaired/replaced but it does still work. I also need to figure out some sort of a solution for the bridge chip because it gets extremely hot and I don't want to kill the card.

    • @Warfrog101
      @Warfrog101 1 year ago

      @@AliceC993 also had the x1950 pro...so I finally relinquished my hold on AGP then and got the pci express. In any case, good luck!

    • @amdintelxsniperx
      @amdintelxsniperx 1 year ago

      @@AliceC993 If the bridge chip is exposed you can either (1) put a thermal pad on it so it touches the heatsink, or (2) put a small, thin RAM sink on it.

    • @AliceC993
      @AliceC993 1 year ago

      @@amdintelxsniperx It is, it's on the back of the card I believe. Not entirely sure as I haven't messed around with it in a few years.
      Edit: Just found the card; it's an HIS 512 MB model, the bridge chip is indeed on the back. There's a thermal pad already on it but I'm not sure how much good that will do, so I think I will try to find a heatsink for it.

    • @TheVanillatech
      @TheVanillatech 1 year ago +1

      The X700 Pro is the one card I've never been able to find or acquire in ALL these years. I have around 6-7 X800-based cards, but not a single X700.

  • @NTGTechnology
    @NTGTechnology 1 year ago +1

    I have a few GeForce 6800s on both AGP and PCIe and I'm pretty happy with all of them. Good enough performance to play most XP games from 2001-2006. It's a really nice card, plus most of them are single slot which is a nice bonus. If I ended up with one back in the day I don't think I would've been disappointed at all.

  • @ShadowsBehindU
    @ShadowsBehindU 1 year ago +2

    NV41/NV42 have the same quad cluster render pipeline design as the NV40 -> 3 clusters = 12TMUs & 12 ROPs.

    • @PixelPipes
      @PixelPipes 1 year ago +1

      They have decoupled ROP units, similar to NV43. Unfortunately TPU (and by extension GPU-Z) has it wrong, but fillrate tests accurately show 8 ROPs.
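      (For context, a synthetic fillrate test can distinguish the two ROP counts because peak pixel fillrate is roughly core clock × ROPs. A minimal sketch of that arithmetic follows; the 325 MHz stock core clock of the plain GeForce 6800 is an assumption on my part, not a figure from the thread.)

```python
# Theoretical peak pixel fillrate = core clock * number of ROPs.
# 325 MHz is assumed here as the stock core clock of the plain GeForce 6800.

CORE_CLOCK_HZ = 325e6

for rops in (8, 12, 16):
    print(f"{rops:2d} ROPs -> {CORE_CLOCK_HZ * rops / 1e9:.1f} Gpixels/s")

# 8 ROPs  -> 2.6 Gpixels/s
# 12 ROPs -> 3.9 Gpixels/s
# 16 ROPs -> 5.2 Gpixels/s
# A synthetic fillrate result topping out near the 8-ROP figure is what
# the comment above is pointing to.
```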

    • @ShadowsBehindU
      @ShadowsBehindU 1 year ago +1

      @@PixelPipes Sorry, but I think you're wrong on that. The results of synthetic fillrate tests obviously don't have to equal the theoretical maximums. NV43's ROPs weren't decoupled; it was a different setup entirely: 2x2 ALUs / 4 TMUs / 2 ROPs per quad. The ROPs first moved outside the pipeline, with a crossbar interconnect, with NV47.
      Edit: A 6800GS wouldn't run almost equal to a 6800GT if NV42 didn't have at least 12 raster units ;)

    • @PixelPipes
      @PixelPipes 1 year ago

      @ShadowsBehindU ROPs were situated the same for NV41/42, outside the pipeline. You can also see 8 ROPs in die shots of the NV42 posted by Martijn Boer. Evidence trumps assumptions on your part.

    • @ShadowsBehindU
      @ShadowsBehindU 1 year ago +1

      ​@@PixelPipes I know of Martin Boer's die shots and I count twelve :)
      Unfortunately I don't have a block diagram for 41/42 specifically.

  • @Henk717
    @Henk717 1 year ago +2

    For my build, the reason I picked the 6800GT was Shader Model 3 support while simultaneously having excellent DOS support. Windows 98 leaves a bit to be desired, but I plug that hole with a Voodoo 2.
    The ATI cards I know have less DOS compatibility, and the ones with 98 support lack the newer shader model. So for my as-many-eras-as-possible build, the 6800GT made it a DOS PC that can run Crysis.

    • @TheVanillatech
      @TheVanillatech 1 year ago +1

      "DOS" support with a 6800GT? Are you mental? :D
      Day of the Tentacle on a Geforce 6800GT :DDDDD

    • @Henk717
      @Henk717 1 year ago +1

      @@TheVanillatech I am more mental than that. I played Sopwith on it too :P. But the theme of the build is to be slow enough to play 80s games and modern enough to play the 2005 era up to Crysis 1. This GPU is perfect for that :D

    • @harryshuman9637
      @harryshuman9637 1 year ago +1

      ATI drivers from the era had notoriously poor driver support.

    • @TheVanillatech
      @TheVanillatech 1 year ago

      @@Henk717 Without software, you can't play a HUGE number of 80s and early 90s DOS games on anything except a 486 or lower. And that software isn't reliable. I always find DOSBox beats any Pentium build, barring a handful of motherboards that have a ton of customizable DIP switches, when it comes to pure DOS builds.
      I have a K6-III+ machine for DOS games, runs Carmageddon and Quake and TombRaider etc perfectly, also lets me play earlier games by disabling cache and running at 75Mhz, giving me roughly 386DX speeds. Because I never managed to build a decent 486 machine, and I don't have space! XD
      I only have 3 retro rigs, thanks to DOSBOX Staging.

    • @TheVanillatech
      @TheVanillatech 1 year ago +1

      @@harryshuman9637 Apparently, drivers have bad driver support....
      The stupidity of Nvidia fans knows no bounds! XD

  • @postanimus8989
    @postanimus8989 1 year ago +1

    Looks like you have a pretty nice audio setup. What equipment do you use?

    • @PixelPipes
      @PixelPipes 1 year ago +3

      Yeah I've been a Head-Fi member since 05. Right now I primarily use a Bifrost 2 & Lyr+ with my Arya Stealth/ZMF Auteur/HEDDphone, but I also have a couple vintage pairs

    • @postanimus8989
      @postanimus8989 1 year ago +2

      @@PixelPipes Thanks for your answer!

  • @PinkFloydFreak55
    @PinkFloydFreak55 1 year ago +2

    These were the kings when I got into gaming... tried an X800 XL in my brother's computer but absolutely hated the driver and software experience. Ended up with an AGP 8x 6800XL on my Sempron system, liked the experience much, much better, and I've been buying NVIDIA cards since...

  • @KomradeMikhail
    @KomradeMikhail 1 year ago +3

    GF 6800: "Time for some Win98 gaming!?"
    R X800: "Nope."

    • @greatwavefan397
      @greatwavefan397 5 months ago +1

      There are official, but beta Win98 drivers for the respective Catalyst versions for that card's generation.

  • @RiasatSalminSami
    @RiasatSalminSami 1 year ago +2

    To the people who say 8 GB of VRAM is fine in 2023, take a look at this. Low VRAM was always a problem.

    • @cyphaborg6598
      @cyphaborg6598 1 year ago +1

      It really depends on what settings and resolution you're going for.
      Also what games you primarily play and whether you buy every game that comes out.
      4K is just pushing way too much VRAM on PC.

    • @RiasatSalminSami
      @RiasatSalminSami 1 year ago

      @@cyphaborg6598 It's not going to be an issue for just 4K. It's affecting 1440p performance already, and in some cases 1080p ray tracing is becoming a problem too.
      The same happened with those 2 GB GPUs back in the early PS4-era days. People said 2 GB was fine for 1080p, but within a few years those cards couldn't even match the PS4's texture quality.
      You say it depends on settings and resolution. Well, obviously. But what should our standard be? I believe the consoles' graphical settings and resolution should be our bare minimum for a standard, and these 8 GB GPUs won't be able to match current-gen consoles' textures and resolution.
      8 GB was simply inexcusable for these recent GPUs. There is no defending this planned obsolescence.

  • @aaldrich1982
    @aaldrich1982 1 year ago +1

    What are your favourite cards - be it for performance, looks, weirdness, overclocking/unlocking ability, or whatever you like - for Win9x, XP and Win 7?

    • @PixelPipes
      @PixelPipes 1 year ago +2

      The GF6 cards rank highly. I do like the weirdness of NV41/42 specifically. But V5 5500, Voodoo 1, 9700 Pro, Kyro 2, and Rendition V1000(E/L-P) are also among my favorites. Picking one is impossible and my answer can change week to week lol

    • @aaldrich1982
      @aaldrich1982 1 year ago +2

      @@PixelPipes thanks for replying! And you know, that's the real joy of retro - not necessarily having the rarest tech but being able to explore the history of what top end gaming would have been like and how buying choices played out.

  • @ricardobarros1090
    @ricardobarros1090 1 year ago +2

    I have 3 beautiful cards: an ATI X800 XL, an X850, and a GeForce 6800 GS called an NVIDIA Quadro. Is the ATI X850 the best card?

    • @classic_jam
      @classic_jam 1 year ago +2

      It depends on what variant of X850. X800XL is a 16-pipe full card. The X850 could be a variety of things. If it's a PRO it's 12 pipe. If it's an XT or XTP it's 16 pipes. The XL and PRO will trade blows, PRO has higher clocks and will almost always be ahead in older fixed-function games, but the XL has a more powerful GPU so will outperform it in many shader-heavy games. The XT is faster than both of them.

    • @ricardobarros1090
      @ricardobarros1090 1 year ago +1

      @@classic_jam I have ATI X850 XT

    • @classic_jam
      @classic_jam 1 year ago

      It is definitely the fastest then. Only the X850XT Platinum Edition and X800XT Platinum Edition are faster and the difference isn't significant. Even the 6800Ultra Extreme doesn't stand a chance. @@ricardobarros1090

  • @foxxy46213
    @foxxy46213 5 months ago

    I remember my 6600 GT being quicker than an overclocked 6800 and just as quick as a stock 6800 GT at resolutions up to 1280x1024. Didn't need higher on a 21-inch CRT. On a Sempron overclocked to 2.7 GHz it beat rigs costing twice as much... and they laughed at my 35-quid 1.7 GHz Sempron next to their Prescotts and high-clocked Athlon 3000s. They weren't laughing when a stable 2.7 GHz overclock left them in the dust, though. Best bang-for-buck rig I built.

  • @thomas.becker
    @thomas.becker 1 year ago

    Best thing about the 6800 generation was that you could unlock the NV40 chips on AGP cards via software. That way you could turn a 6800LE into a 6800GT and almost double the performance for free. Nvidia never made that mistake again ;)

  • @ALPHABYTE1994
    @ALPHABYTE1994 1 year ago +3

    The late era of "pixel pipes" GPUs was really interesting.

  • @OldCircuitGaming
    @OldCircuitGaming 1 year ago

    audio is a bit peaky at times but otherwise great video =^-^=

    • @PixelPipes
      @PixelPipes 1 year ago

      Yeah the audio in this is garbage, sorry about that. It will be better next time

  • @Michael500ca
    @Michael500ca 1 year ago +2

    I had to unsubscribe from Adrian's Digital Basement Channel because he stopped doing shorter videos. I just prefer shorter videos because of time.

  • @ltlk937
    @ltlk937 7 months ago

    I had a 256 MB X850 GTO (AGP) version.

  • @finco7726
    @finco7726 1 month ago

    Ruby on cooler, yeah!

  • @garyc7569
    @garyc7569 6 months ago

    My X800 Pro was 256 MB?

  • @xBruceLee88x
    @xBruceLee88x 1 year ago +1

    Hello Nathan! See you next live stream at Rik's Random Retro

  • @beardedgaming1337
    @beardedgaming1337 8 months ago

    I've never seen a 128 MB X800. I had one when I was a kid and it was a 256 MB VIVO that I modded to an X850.

  • @ytreview4390
    @ytreview4390 1 year ago +1

    Sorry, but the background music is irritating and doesn't let me concentrate on the speech.

  • @Sitharii
    @Sitharii 1 year ago

    18:55: "Halo historically favours ATi architectures"?????
    Excuse me, but that's... how can I put it... *technically impossible*!!!
    The original Halo was produced as a *launch title* for showcasing/promoting the original Xbox over the PlayStation 2. *The original Xbox was based on a GeForce 3 architecture*, so really, what you say is simply technically impossible *by default*.

    • @ALPHABYTE1994
      @ALPHABYTE1994 1 year ago +2

      it can be accidentally the case

    • @NoToeLong
      @NoToeLong 1 year ago +2

      Halo was originally designed for Mac, which came with ATI GPUs by default. Bungie didn't completely rebuild their engine from the ground up when it shifted to Xbox, so it naturally still favours ATI's architecture.

    • @PixelPipes
      @PixelPipes 1 year ago +1

      Yeah I don't think it was intentional, but the PS2.0 codepath that they wrote for the PC version just runs exceptionally well on ATI's R3XX-R5XX architectures

    • @cyphaborg6598
      @cyphaborg6598 1 year ago +1

      Not really technically impossible depending how well they optimized the game for ATI cards on PC as they were very popular cards during that time period.

  • @mariusz76a
    @mariusz76a 1 month ago

    Nvidia cards are terrible under Windows 98, that's why I choose ATi.

  • @sonyericssoner
    @sonyericssoner 1 year ago

    Bump

  • @draganraxrax7497
    @draganraxrax7497 1 year ago

    The 6800 was a much better card back then, all versions. I don't know which drivers were used here.

    • @PixelPipes
      @PixelPipes 1 year ago +2

      Not really. If you go back and look at old reviews, as long as the number of pipes is the same, the X800 series generally won. The X800 Pro with 12 pipes did lose to the 6800GT with its 16 pipes, though.