New GPUs are Bad??... "F**k it, I'll Do it Myself."

  • Published 7 Jun 2024
  • Check out Dylan's DIY GPU!: www.furygpu.com/
    • FuryGPU displaying Win...
    Dylan Barrie made his OWN graphics card. It's wild. Making GPU hardware and software is extremely difficult; we've seen companies like Moore Threads enter the space recently and their launch has been quite rocky. Intel's GPUs have improved SIGNIFICANTLY, but it is clear they still have hurdles to overcome. These are companies that have tons of resources and insider information... so how did Dylan manage to do it on his own? Let's take a look!
    ==JOIN THE DISCORD!==
    / discord
    GN: • China's Moore Threads ...
    • Intel Arc 2024 Revisit...
    LTT: • China doesn't want me ...
    store.steampowered.com/hwsurv...
    www.mthreads.com/
    www.xilinx.com/products/silic...
    www.amd.com/en/corporate/xili...
    0:00- New GPUs are....
    0:51- Making GPUs is hard
    4:12- The ULTIMATE DIY Graphics Card
    6:46- The Challenges
    8:10- What about the chip?
    10:12- How does this fit in?
  • Science & Technology

Comments • 754

  • @sanishnaik2040
    @sanishnaik2040 Před 2 měsíci +1910

    Im from nvidia. This guy has to be stopped.

    • @AbbasDalal1000
      @AbbasDalal1000 Před 2 měsíci +524

      Im from this guy nvidia has to be stopped

    • @KianFloppa
      @KianFloppa Před 2 měsíci +71

      amd better

    • @e....3235
      @e....3235 Před 2 měsíci +40

      Nice and AMD has to be stopped

    • @danieltoth9742
      @danieltoth9742 Před 2 měsíci +146

      I'm from stop. Nvidia and AMD needs to be this guy'd.

    • @udaysingh9_11
      @udaysingh9_11 Před 2 měsíci +75

      I'm in this guy, he wants me not to stop.

  • @ClamChowder95
    @ClamChowder95 Před 2 měsíci +758

    I think he is underselling his work. This could be a stepping stone for future open source GPUs. He should be incredibly proud of what he did all by himself.

    • @k_kubes
      @k_kubes Před měsícem +65

      This feels awfully similar to how Linux started: someone toying around with a concept, not expecting the project to become that big of a deal even after going open source, and it ended up exploding in popularity

    • @jasonstephens6109
      @jasonstephens6109 Před měsícem +35

      Yeah, this guy is laying the groundwork for something with huge potential. Even if an open source GPU never competes with the big boys, it could pioneer niche features that become mainstream.
      There's also the potential for custom instruction sets that give new life to defunct GPUs

    • @jnharton
      @jnharton Před měsícem +11

      @@k_kubes Linux was more than just "toying around", but it was definitely a personal project that was never originally intended to go public, let alone become a mainstream OS.
      I actually saw some files once that were supposedly part or all of Linux (maybe just the kernel) prior to version 1.0.

    • @harrydean9723
      @harrydean9723 Před měsícem +2

      this should be a thing

    • @xavierrodriguez2463
      @xavierrodriguez2463 Před měsícem +8

      Open source GPU to go with RISC-V

  • @AnimeGIFfy
    @AnimeGIFfy Před 2 měsíci +714

    open source software + open source hardware. this needs to happen if you want anything good to happen in your life.

    • @DengueBurger
      @DengueBurger Před 2 měsíci +24

      At-cost open-source GPUs when

    • @DengueBurger
      @DengueBurger Před 2 měsíci +29

      The Linux of GPUs

    • @TheDoomsdayzoner
      @TheDoomsdayzoner Před měsícem +16

      That's how we evolved as a species: when "secret" sciences like algebra and geometry became available to everyone, and when "secret" tech like household appliances became available to everyone.

    • @wwrye929
      @wwrye929 Před měsícem

      Making the hardware would be pricey, and that would make it harder for the game makers

    • @AnimeGIFfy
      @AnimeGIFfy Před měsícem +4

      @@wwrye929 im not talking about everyone making their own hardware from scratch

  • @SIedgeHammer83
    @SIedgeHammer83 Před 2 měsíci +557

    Voodoo and Kyro GPUs need to make a comeback.

    • @bryndaldwyre3099
      @bryndaldwyre3099 Před 2 měsíci +17

      Imagine this gpu being able to use a glidewrapper.

    • @xgamer25125
      @xgamer25125 Před 2 měsíci +70

      wasn't 3dfx (maker of the Voodoos) bought and absorbed into Nvidia..?

    • @RAgames-mc3kp
      @RAgames-mc3kp Před 2 měsíci

      Yes i think ​@@xgamer25125

    • @gh975223
      @gh975223 Před 2 měsíci +31

      @@xgamer25125 it was

    • @itsnot1673
      @itsnot1673 Před 2 měsíci +9

      And matrox
      maybe even trident

  • @Karti200
    @Karti200 Před 2 měsíci +609

    I saw the news day one when it came out - and it just pissed me off how many out-of-touch people there were about it…
    Like… People literally roasted the creator of it because "it is too weak"... like c'mon, what the heck is wrong with some people
    This is literally an openware / shareware version of a GPU made by a community - this is an amazing milestone if you ask me

    • @elysian3623
      @elysian3623 Před 2 měsíci +78

      Let's be real, consumers themselves have no idea what stuff is, what is good for them, or how stuff works. They just consume, and they're easily fooled into consuming stuff they don't need as well. 90% of Nvidia cards have probably never been used for a CUDA workload, but they were sold based on their dominance in certain tasks.
      I still live in hope that AMD makes their software stack fully open and somebody comes along with a working prototype of something absolutely game-changing, and they work with them to actually use their combined technology to advance GPUs. Currently the stagnation in GPU performance is because of die shrinks and ramming things like AI acceleration into cards that really just need to be affordable and play games well.

    • @Dingbat1967
      @Dingbat1967 Před 2 měsíci

      The average lambda person is an idiot. That's pretty much why. The intertubes just made it more obvious.

    • @TechBuild
      @TechBuild Před 2 měsíci +30

      People who have some idea how GPUs work, and how they differ from CPUs, will easily understand the work this person has done. Making a GPU yourself that does 3D rendering, even a basic one, this well is a phenomenal task. In the CPU space there are lots of architectures available to build upon, but not in the GPU space.

    • @tsl_finesse_6025
      @tsl_finesse_6025 Před 2 měsíci +22

      People don't understand that Nvidia has around 30k employees while he did a full project all by himself 😬😬. Bro is fire and a genius 💪🏾

    • @rextrowbridge8386
      @rextrowbridge8386 Před 2 měsíci +7

      Because they are ignorant of how hard it is to make a gpu from the ground up.

  • @POLARTTYRTM
    @POLARTTYRTM Před 2 měsíci +239

    The guy who made the gpu is the real "he's good with computers" guy.

    • @thevoidteb1285
      @thevoidteb1285 Před měsícem

      Did you listen to him in this video? Hes a novice.

    • @akazza69
      @akazza69 Před měsícem +1

      Why the heck am i experiencing a deja-vu
      This comment
      This reply above me
      I am getting crazy

    • @POLARTTYRTM
      @POLARTTYRTM Před měsícem

      @@akazza69 normal we all have time from time to time.

    • @0xD1CE
      @0xD1CE Před měsícem +7

      @@thevoidteb1285 Being a novice with HDL programming and writing windows kernel drivers does not mean he's not good with computers. That's like saying I'm not good with music because I never played the Pikasso guitar...

    • @yoshi596
      @yoshi596 Před měsícem +2

      @@thevoidteb1285 Oh is that so? Then go ahead and do it better than him. Notify me when you upload a video about your custom made GPU, I'll wait.

  • @crumbman2065
    @crumbman2065 Před 2 měsíci +135

    Really interesting video.
    As an electrical engineer working with FPGAs, I can assure you it takes a heck of a lot of (probably Verilog) code to get this thing to work as it's supposed to. The biggest issue with doing this on an FPGA is that they run at really low clock speeds (typically ~100, max ~250 MHz). So you can't really get speed just by increasing the clock speed (like NV and AMD have been doing more aggressively recently).
    Props to this man
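    As a rough sketch of the clock-speed point above: assuming a hypothetical pipeline that writes one pixel per clock (the actual FuryGPU clock and pixels-per-clock aren't stated in this thread), you can compare typical FPGA fabric clocks against the raw fill rate 720p60 needs.

    ```python
    # Back-of-envelope: FPGA clock vs. the fill rate 720p60 needs.
    # Assumes a hypothetical 1-pixel-per-clock pipeline; real figures for the
    # FuryGPU are not stated here.
    def required_pixel_rate(width, height, fps):
        """Pixels per second needed for a given resolution and frame rate."""
        return width * height * fps

    target = required_pixel_rate(1280, 720, 60)     # ~55.3 million pixels/s
    for clock_mhz in (100, 250):                    # typical FPGA fabric clocks
        pixels_per_sec = clock_mhz * 1_000_000      # one pixel per clock cycle
        print(f"{clock_mhz} MHz -> {pixels_per_sec / target:.1f}x of 720p60 fill rate")
    ```

    At 100 MHz the raw margin over 720p60 is only ~1.8x, which overdraw, texturing, and memory stalls eat quickly; that is why simply raising the clock isn't an option the way it is on an ASIC.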

    • @kesslerdupont6023
      @kesslerdupont6023 Před 2 měsíci +2

      Are consumer FPGAs big enough to scale to entire architectures or are they typically cut-down and more of an evaluation tool?

    • @noth606
      @noth606 Před 2 měsíci +6

      @@kesslerdupont6023 Well, you wouldn't be able to make a full-size flagship FPGA-based GPU that competes with the big boys, if that's what you mean. What I think this is, without looking deep into it, is a rendering pipeline built mostly from scratch; my guess is that it's OpenGL-oriented, based on choosing Quake to test it, so it's unlikely to support anything approaching DirectX 10-type stuff in terms of GPU functions, because that is at least one if not multiple orders of magnitude more complex.
      So it's definitely impressive work, but I don't think Nvidia or AMD are shaking in their boots.
      I'd guesstimate, based on what I know off the top of my head, that you'd need a few FPGAs, like 3-6, maybe more, to build up a full DX10/11-type unit with enough ROPs, shaders, etc. to do something useful, including all the jank you have to have around it like memory management, things to handle texture/geometry/shader-code RAM, plus output handling. It kind of depends on how 'strict' a model you aim for, because to a point you can choose to do a lot on the host system, or not. The more you do on the host in code, the less dedicated hardware/FPGA space you need.
      It could be that this is a DX10+ model project just not completed far enough to currently run more than basic stuff equivalent to OpenGL. I hope so.

    • @myne00
      @myne00 Před 2 měsíci +4

      @kesslerdupont6023 They are absolutely used by Intel and AMD to test design changes. They probably have really big ones that can do an entire (single) core.
      Nvidia probably does use them too, but it would most likely be a very cut-down GPU.
      They would absolutely perform horribly, but it's about comparing different hardware approaches and validation.
      E.g. if FPU design X does 100 FLOPS and design Z does 102 FLOPS, you have your architectural comparison.
      Then you run through a bunch of tests to validate the results. You don't want a chip that gives incorrect answers.
      FPGAs are used in the real world in applications like telecommunications signal processing, where a new technique could be released every year or so.
      I'm not aware of any other real-world applications aside from the "MiSTer", which is mostly used to emulate old game consoles.

    • @kesslerdupont6023
      @kesslerdupont6023 Před 2 měsíci

      @@noth606 Thanks for the helpful comment. I don't know much about DirectX but maybe I should look more into it.

    • @kesslerdupont6023
      @kesslerdupont6023 Před 2 měsíci

      @@myne00 thanks for the info. During validation is it normal to have to fuzz the chip across all possible combinations of input/output or are there shortcuts to validation?

  • @grgn5412
    @grgn5412 Před 2 měsíci +141

    This is WAY better than you may think.
    By simply converting his FPGA design into an ASIC, you get a roughly 10x performance increase.
    ASICs are expensive (from one to a few million dollars to make the mask which allows mass production), and FPGAs have long been used to prototype them: the languages to program the first or design the second are the same (VHDL or Verilog), so this conversion is very common in the industry (it happened for crypto mining, for instance).

    • @SUCHMISH
      @SUCHMISH Před 2 měsíci +3

      I looked the chips up, and the good news is that you can buy them in bulk for a low price used... The only problem is that they are used... But I feel like this idea has some merit to it!!

    • @aymenninja8120
      @aymenninja8120 Před měsícem +1

      I saw a video of a guy who made his own ASIC, and he didn't sound super rich; I think the technology for making ASICs is getting more affordable.

    • @Endgame901
      @Endgame901 Před měsícem +5

      @@aymenninja8120 He didn't really make his own ASIC, he basically got in on a group buy for space on the silicon for an ASIC. Still pretty dope, but not quite the same.

    • @aymenninja8120
      @aymenninja8120 Před měsícem

      @@Endgame901 And that group is not a big corporation or something. My point is that it's now possible to make ASICs for clients other than big companies, if I got things right.

    • @Endgame901
      @Endgame901 Před měsícem +1

      @@aymenninja8120 you're not wrong, per se, but an ASIC like tinytapeout isn't really in the same scope as this, even if you purchased every single "block" of silicon space.
      The type of chip you'd need _would_ cost Big Company money.

  • @lineandaction
    @lineandaction Před 2 měsíci +135

    We need open source gpu

    • @namebutworse
      @namebutworse Před měsícem +6

      Im getting flash backs the moment i hear "source"
      (I am Team Fortress 2 player)

    • @thatonegoogleuser4144
      @thatonegoogleuser4144 Před 5 dny +1

      ​@@namebutworse I can hear that sound bro

  • @toblobs
    @toblobs Před 2 měsíci +160

    I can only imagine what Dylan could do with Nvidia's budget with his talent

    • @kaptenhiu5623
      @kaptenhiu5623 Před 2 měsíci +34

      Becomes a trillionaire and dominate the AI market like NVIDIA does rn?

    • @blanketmobbypants
      @blanketmobbypants Před 2 měsíci +3

      But I would definitely buy it

    • @josephdias5859
      @josephdias5859 Před 2 měsíci +4

      or just enough of a budget to produce cards to play all the 90s games and early 2000s games

    • @ThreaT650
      @ThreaT650 Před 2 měsíci +2

      And a team of engineers, YUP!

    • @ThreaT650
      @ThreaT650 Před 2 měsíci

      Seems to have his head screwed on straight too. Would be great at the consumer level.

  • @ThreaT650
    @ThreaT650 Před 2 měsíci +30

    Respect for putting me on to the FuryGPU, this is dope! That thing is performing around the level of an old Radeon 8500 or something! Impressive!

  • @baget6593
    @baget6593 Před měsícem +31

    I dont care if fury gpu succeeds, i just want nvidia to lose

  • @liutaurasleonavicius2680
    @liutaurasleonavicius2680 Před 2 měsíci +32

    there was also one person who literally combined an Nvidia and an AMD GPU and was able to use DLSS and AFMF together, this must be dark magic

    • @user-xe6sm4jv8f
      @user-xe6sm4jv8f Před 2 měsíci +10

      No magic, you literally just put 2 cards into 2 slots on your motherboard and they work🤣

    • @callyral
      @callyral Před měsícem

      ​@@user-xe6sm4jv8fIt just working makes it seem more like magic

    • @shiro3146
      @shiro3146 Před měsícem +3

      @@user-xe6sm4jv8f i dont think it was as easy as that bruh

    • @sirseven3
      @sirseven3 Před měsícem +2

      @@shiro3146 It totally can work just like that, just as you can have two NICs operating at the same time. The main thing to worry about is the drivers for the cards. You won't be able to pool the memory space like NVLink, but you do get an additional processor. Using Profile Inspector you technically could get it to work like NVLink, but it takes manual config. I've run different RAM types with XMP and sometimes I got bluescreens, but it was semi-stable with a solid overclock of said RAM

    • @granatengeorg
      @granatengeorg Před měsícem +2

      I do the same in blender, rendering on an rx while using optix denoising on my gtx, all together in realtime in the viewport. Was also quite surprised that it just worked lol.

  • @POLARTTYRTM
    @POLARTTYRTM Před 2 měsíci +33

    I've had people telling me that the guy's GPU is not impressive at all because writing drivers, APIs, and all that, including the card, is the "bare minimum" they would expect from any software engineer apprentice or professional, yet I don't see new cards coming out every week made by software engineers.

    • @taxa1569
      @taxa1569 Před 2 měsíci +3

      DO NOT believe them. Doing from scratch something that massive companies have been iterating on over and over for the past 20 years is like saying the person who added the sauce to a meal that's been in preparation for 3 hours is now the chef, and these 'chefs' can ALL supposedly do what this guy did. Except he prepared the whole meal and THEY just put on the sauce.
      The bare minimum was in fact coming into the GPU development space and adding on to the well-prepared, already existing meal.

    • @POLARTTYRTM
      @POLARTTYRTM Před 2 měsíci +5

      @@taxa1569 they gave every excuse possible, that the guy just messed with an FPGA and sourced the other parts... I was like in my mind... tf do it then, I want to see if it's that easy you can do it, so? Anyone can if they are given the tools.

  • @tech6294
    @tech6294 Před 2 měsíci +55

    8:57 The VRAM GDDR chip is to the lower left of the fan cooler. That mini storage card might be BIOS? Not sure lol. Great video! ;)

    • @jashaswimalyaacharjee9585
      @jashaswimalyaacharjee9585 Před 2 měsíci +9

      Yup, I guess he is hot loading the vBIOS via SD Card.

    • @dbarrie
      @dbarrie Před 2 měsíci +11

      That’s the DisplayPort splitter, which takes the DP signal from the FPGA and splits it into DP/HDMI to supply the outputs. All of the (slow, DDR3!) RAM on the device is part of the Kria SoM, underneath the fan!
      SD card is there to update the firmware when I’m not running with the card hooked up to the dev machine!

    • @greyhope-loveridge6126
      @greyhope-loveridge6126 Před měsícem +3

      @@dbarrie That's a really good way of updating the firmware on the fly - I'm amazed you got this running!
      DDR3 isn't actually the worst memory you could've used, and maybe you could use some older, broken GPUs and de-solder some GDDR6 or GDDR5 to transplant while you get stuff working too? I am fascinated to see how far this comes.

    • @Vifnis
      @Vifnis Před měsícem +1

      @@greyhope-loveridge6126 I doubt this would even work since GPUs aren't out-of-the-box FPGAs... might need to check JEDEC standards first to even see if that's possible, and iirc they only started with GDDR4 and up, wasn't everything before that 666MHz DDR3 VRAM?

    • @proudyy
      @proudyy Před měsícem

      @@greyhope-loveridge6126 Definitely, he has to keep going. The amount of potential in this is crazy. And even though he said in this video that the goal was never competition, he could still become a competitor in the future :P
      Or even found a company which competes in the future, whatever...

  • @jungle2460
    @jungle2460 Před 2 měsíci +13

    If that SD card actually turns out to be the VRAM, that's genius. I'd love to be able to swap SD cards to upgrade VRAM

    • @HamguyBacon
      @HamguyBacon Před 2 měsíci +14

      SD cards are not fast enough to be vram, thats just the bios.

    • @barela3018
      @barela3018 Před měsícem +2

      Way too slow. RAM and SSDs are made for different purposes; that's why there is a loading time before games: the SSD loads data into RAM, and then, when the GPU needs it, information is pulled from the loaded program in RAM.

  • @tropixi5336
    @tropixi5336 Před 2 měsíci +106

    "YoU DiDnT mEnTiOn InTeL"

    • @AryanBajpai_108
      @AryanBajpai_108 Před 2 měsíci +19

      He did 😂

    • @Prince-ox5im
      @Prince-ox5im Před 2 měsíci +12

      @@neon_arch he's mocking someone's comment, not saying it

    • @ranjitmandal1612
      @ranjitmandal1612 Před 2 měsíci

      😂

    • @tropixi5336
      @tropixi5336 Před 2 měsíci

      @@AryanBajpai_108 im talking about the start where he said "NVidia and amd are top contenders" ....

    • @urnoob5528
      @urnoob5528 Před měsícem +1

      @@ranjitmandal1612 smh

  • @oglothenerd
    @oglothenerd Před měsícem +12

    Someday we will have open source GPU instruction sets. I know it. The community always wins.

    • @fatemanasrin7579
      @fatemanasrin7579 Před měsícem +1

      And they'll be 9-year-old spoiled white kids who will buy these thinking they're smart, and break them or set them on fire..

    • @xeschire706
      @xeschire706 Před měsícem +1

      Or we can just take either RISC-V, a custom and extended version of the 6502 that supports a 64-bit architecture, Nyuzi, or even the MIAOW ISAs, and modify and optimize them for efficient graphics processing, for use in our custom open source GPUs instead, which I think would be a far better route in my opinion.

    • @oglothenerd
      @oglothenerd Před měsícem +1

      @@xeschire706 I like that idea.

  • @oktc68
    @oktc68 Před 2 měsíci +6

    This is the most interesting PC oriented video I've seen for ages. Nice1 Vex, nice change of pace.

  • @vishalkumar-dr8wq
    @vishalkumar-dr8wq Před 2 měsíci +15

    It's amazing what he was able to achieve. I used FPGAs during my undergraduate study, and what he achieved takes an amazing amount of skill and work.

    • @jnharton
      @jnharton Před měsícem +1

      True, but don't discount his 20-30 years of writing code for software rendering.
      Not only does that mean he has a considerable foundation in understanding what the hardware needed to be capable of, it also meant he could write his own kernel driver and an interface API to make his GPU usable under a modern Windows OS!

    • @cooldudep
      @cooldudep Před měsícem

      ​@@jnharton what part of op's comment discounts the guy's decades of work?

    • @mrnorthz9373
      @mrnorthz9373 Před měsícem

      ​@@cooldudep i dont think he means the comment discounted his skill, he means to anyone that may think this is a miracle or something unexpected from a guy of this magnitude

  • @StuffIThink
    @StuffIThink Před 2 měsíci +6

    I've been watching him try to make this thing work forever; super cool to see someone else giving him some exposure.

  • @Drunken_Hamster
    @Drunken_Hamster Před 2 měsíci +3

    The future where I can piece together and upgrade my GPU like I can the rest of my system would be lit. NGL I'd love for the GPU scene to instead have motherboard chip slots similar to the CPU socket, with their own section for memory, special compute units (to improve ray tracing or AI separately from rasterized processing), and output pathways so you never have to worry about finding a card with the types and quantities of outputs that you want.
    It'd also make cooling simpler and likely more compact, kinda like how it is for CPUs with semi-universal setups that only require certain amounts of height as the sole variable. And it'd DEFINITELY make liquid cooling more accessible, not that I want to do that as much as I once used to.

  • @rainbye4291
    @rainbye4291 Před 2 měsíci +9

    My man just did the unthinkable. Great effort for making a gpu this good ALONE.

  • @maxcarter5922
    @maxcarter5922 Před 2 měsíci +19

    What a great contribution! Crowdsource this guy?

    • @arenzricodexd4409
      @arenzricodexd4409 Před 2 měsíci

      To play Quake at 60FPS?

    • @mrnorthz9373
      @mrnorthz9373 Před měsícem +6

      @@arenzricodexd4409 Quake at 60 FPS today, Cyberpunk at 60 FPS tomorrow.

    • @scudsturm1
      @scudsturm1 Před měsícem

      @@arenzricodexd4409 dont complain if u cant build a gpu yourself and write the driver yourself

    • @arenzricodexd4409
      @arenzricodexd4409 Před měsícem +1

      @@scudsturm1 nah this guy do it for fun. crowd source? that is an attempt to take away his passion for this.

  • @cashmoney2159
    @cashmoney2159 Před 2 měsíci

    I just got a laptop and it has a 12 GB DDR5 RAM stick, but the problem is that in my country only 8 GB and 16 GB RAM sticks are sold. Now what should I do? Which RAM would be the better option for getting that dual-channel boost, 8 GB or 16 GB?

  • @PeterPauls
    @PeterPauls Před 2 měsíci +3

    My first GPU was a 3dfx Voodoo 3 3000. 3dfx made the first GPU available to the masses (AFAIK), and they disappeared around 2000-2002.

  • @AleksanderFimreite
    @AleksanderFimreite Před měsícem

    I would assume the rates displayed around 5:30 indicate the speed at which the internal game updates (ticks) are running.
    One render seems to take around 25 - 45 ms to draw, and another 5 - 10 ms to clear the data for the next render. This indicates a total range of 30 - 55 ms per update.
    The formula to calculate the rate per second would be (1000 ms / x), which becomes roughly a 33 - 18 range. Which seems accurate to how choppy the enemies move around.
    Camera motion is much smoother than their movement. Despite this, I'm also impressed by the efforts of individuals trying to tackle such a daring project.
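    Worked out with the comment's own formula (rate = 1000 ms / frame time), treating the 25-45 ms draw and 5-10 ms clear figures read off the on-screen counter as the inputs:

    ```python
    # Frame time to frame rate, per the 1000 / x formula above.
    def fps(frame_time_ms):
        return 1000.0 / frame_time_ms

    best  = fps(25 + 5)     # fastest frame: 30 ms total
    worst = fps(45 + 10)    # slowest frame: 55 ms total
    print(f"best ~{best:.0f} FPS, worst ~{worst:.0f} FPS")   # ~33 and ~18 FPS
    ```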

  • @BOZ_11
    @BOZ_11 Před 2 měsíci +29

    Fury?? ATI Technologies 'bout to make a complaint

    • @core36
      @core36 Před 2 měsíci +6

      I don’t think ATI is going to make any complaints anytime soon

    • @BOZ_11
      @BOZ_11 Před 2 měsíci +2

      @@core36 so i see sarcasm isn't your strongest suit

    • @spooderderg4077
      @spooderderg4077 Před 2 měsíci +3

      AMD may still own the trademark though. Might not since they haven't even used the ATI brand in a decade. But they potentially could.

    • @urnoob5528
      @urnoob5528 Před měsícem +2

      @@BOZ_11 tell that to urself

    • @MarioSantoro-ig5qh
      @MarioSantoro-ig5qh Před měsícem

      Unless he starts selling them they probably wont do anything.

  • @Revoku
    @Revoku Před měsícem

    The CPU/GPU on a Raspberry Pi is an ARM CPU plus whatever internal GPU; it has fixed instructions/pathways for both. An FPGA is a chip where you can program the gates/instructions/pathways yourself:
    you run code that changes the configuration of the chip

  • @sturmim
    @sturmim Před 2 měsíci +7

    He could solder on some old GDDR6 chips from broken or old GPUs. Would like to see that.

    • @kesslerdupont6023
      @kesslerdupont6023 Před 2 měsíci +5

      It may be good enough to just put some DDR5 on there depending on what speed the GPU is currently using.

    • @jcoyplays
      @jcoyplays Před měsícem +1

      He could've used DDR3 and been on par/overkill. (800-2133 MT/s, or about 400-1066 MHz, which would match/exceed the FPGA clock speed)
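      For reference, the MT/s-to-clock conversion behind those numbers (DDR transfers on both clock edges, so the I/O clock is half the transfer rate):

      ```python
      # DDR ("double data rate") memory moves data on both clock edges,
      # so I/O clock in MHz = transfer rate in MT/s divided by 2.
      def ddr_io_clock_mhz(mt_per_s):
          return mt_per_s / 2

      for mt in (800, 1600, 2133):   # common DDR3 speed grades
          print(f"DDR3-{mt}: {ddr_io_clock_mhz(mt):.0f} MHz I/O clock")
      # 400, 800 and ~1067 MHz, at or above the ~100-250 MHz FPGA fabric
      # clocks mentioned earlier in the thread.
      ```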

    • @kesslerdupont6023
      @kesslerdupont6023 Před měsícem

      @@jcoyplays yeah that is true

  • @miguelcollado5438
    @miguelcollado5438 Před 2 měsíci +16

    Real3D, Mellanox, Realtek made decent GPU's in the 90's as well... but they have eventually all been absorbed by the same 3 major brands in the 2000's...
    Dylan Barrie deserves our community's full support for his work.

  • @Desaved
    @Desaved Před 2 měsíci +10

    We're at the point where GPUs are more expensive than the entire rest of the computer!

  • @tukebox-mf9bo
    @tukebox-mf9bo Před 15 dny

    we learn how to program FPGA boards like the MAX 10 in school in Vienna, Austria; it's such a complicated process, props to the guy

  • @guarand6329
    @guarand6329 Před měsícem

    I bet if he took the transistor design and converted that to dedicated silicon vs the fpga, it would run faster.
    Pretty cool that he created a gpu design, also wrote the driver, and it's working.

  • @zawadlttv
    @zawadlttv Před měsícem +1

    the SD card probably holds the programming of the chip; probably the easiest way to update it

  • @Hemeltijd
    @Hemeltijd Před 2 měsíci +2

    This is so informative and cool. If you find any more topics like this, can you make more videos on them?

  • @Md.ImranHossain-uq4gu

    How can I replace the M.2 slot with a Thunderbolt port in a laptop in order to add an external GPU?

  • @Powerman293
    @Powerman293 Před 2 měsíci +1

    I could see this project eventually turn into the PC equivalent of those FPGA clone consoles but for 90s GPUs.
    A very cool demonstration of tech that fills a niche market but ultimately is not threatening to the big players.

  • @UltraVegito-1995
    @UltraVegito-1995 Před 2 měsíci +53

    *If only Moore Threads became a successful AI GPU maker in China, causing them to neglect Nvidia or AMD....*

    • @arthurwintersight7868
      @arthurwintersight7868 Před 2 měsíci +9

      I just want to see more actual competition, to drive down prices.

    • @zerocal76
      @zerocal76 Před 2 měsíci +4

      😅😅 You must know very little about China to make a comment like that. The last thing anyone in the world wants is a government like China's becoming completely tech-independent, especially in the hardware acceleration & AI space!

    • @arthurwintersight7868
      @arthurwintersight7868 Před 2 měsíci +5

      @@zerocal76 - China is highly likely to implode under their own pressure at some point. Especially if their shoddy construction work at the Three Gorges ends up being as bad as people think. In the meantime they can drive down GPU and NAND prices.

    • @arenzricodexd4409
      @arenzricodexd4409 Před 2 měsíci

      Still does not make them to ignore AI.

    • @hey01e5
      @hey01e5 Před 2 měsíci +1

      unfortunately, if moore threads became competitive they'd get sanctioned for "national security reasons", leaving us westerners stuck with nvidia and AMD who will just price gouge the GPUs

  • @sannyassi73
    @sannyassi73 Před 2 měsíci

    I wonder how well those MTT GPUs perform with AI - 16 GB of memory is decent. That might be their main purpose.
    It's also becoming easy to make/mill your own circuit boards with different combinations of certain CNC machines and lasers. Most chips/capacitors/etc. on a board are very cheap to buy individually, and then you make your own actual board. It's pretty neat. I don't have the tools to do it, but I've been looking into it and it's getting affordable to do at home for prototyping and even some small-scale production of simpler parts.

  • @SL4PSH0CK
    @SL4PSH0CK Před 2 měsíci +1

    Unironically, the Chinese market has been a blessing for budget builders, notably the e-sports-aimed GPUs with the "SP" variants, especially when the brand they represent is a sister brand like Palit (Taiwan) and Inno3D

  • @cybernit3
    @cybernit3 Před 2 měsíci +10

    The biggest hurdle to making a GPU is that you need lots of money to make the GPU chip if it's an ASIC, though at volume the ASIC would end up cheaper than using an FPGA. I wish they could make high-performance FPGAs that are cheap, not so expensive. I have to give this FuryGPU guy some credit for making it; this could lead to something decent in the future or inspire future GPU designers. There is also the Vampire Amiga project, which made an extension of the AGA Amiga graphics chipset.

    • @Tyrian3k
      @Tyrian3k Před 2 měsíci +2

      It simply can't be as cheap as a chip that is tailor made for the specific desired purpose.
      It's like wanting a van that can perform like an F1 car without it ending up costing more than the F1 car.

  • @jameshadaway8621
    @jameshadaway8621 Před 2 měsíci +1

    Great video. I remember the longing for 3dfx cards in the '90s; I always wanted to work in IT, and it's good people can build their own cards as a hobby.

  • @CrowandTalbot
    @CrowandTalbot Před měsícem +1

    Wasn't Quake the game that used to prove or break computer builds back in the day? And his GPU handles it gorgeously? That's enough for me to know he's onto something

  • @Ele20002
    @Ele20002 Před měsícem

    I love projects like this. Making a driver fully compatible with windows, and actually creating all the required ports to connect via PCIe is insanely impressive. Using an existing graphics API would be even more impressive, but modern APIs are so complex these days it'd be a hell of a task for one person, so I can understand skipping that step.
    It'd really be great to get more GPU designs into the open though. GPU architecture isn't really shared in that much detail - everyone does their own thing, so you can only take inspiration from the higher level concepts.
    There's a good reason to hide GPU ISAs behind the driver though - a lot of optimisations can be enabled by the compiler, and features can be integrated into each new version that would otherwise require convincing developers to add support into their games.
    Breaking into the GPU space on performance is also difficult because so many optimisations are targeted specifically at a certain GPU, forcing newcomers to support hardware acceleration of that feature as well so as not to fall behind in benchmarks, even if there's another way to achieve the same effect that's more efficient on their hardware.

  • @bubmario
    @bubmario Před 2 měsíci

    I think what is important about something like this is that it can open gateways which are normally locked and up to the vendor to approve. If there is something that can't be done on an Nvidia or AMD GPU, maybe this card can be a solution to that in years' time. Open source stuff is really important for that.

  • @jaywhite15_AL
    @jaywhite15_AL Před měsícem

    LOVE the Meze's you're running.

  • @histerical90
    @histerical90 Před 2 měsíci +1

    You know the problem with that survey? It also counts people with Steam Decks, ROG Allys, and other APUs; I think that percentage is mostly from there, while for Nvidia those are just proper full GPUs.

  • @xeschire706
    @xeschire706 Před měsícem

    I've been thinking about doing something like this for a while, but instead taking a pre-existing ISA like RISC-V, or an open source, upgraded implementation of the 6502 ISA, then modifying and optimizing either architecture for efficient graphics processing in order to create a custom open source GPU. Of course my target would strictly be microcontrollers, embedded devices, Arduinos, retro-style game consoles and handhelds, and also retro PCs for now, as little baby steps before I take anything like that any further.

  • @aleanscm9350
    @aleanscm9350 Před měsícem

    I am working on optimizing the performance of low-end GPUs and other components, but if there were open source GPUs where you could optimize from the metal, it would beat beefy stuff.

  • @poohbear4702
    @poohbear4702 Před 2 měsíci

    I've been wondering for a few years whether someone would do this. Very cool!

  • @ohnoitsaninja
    @ohnoitsaninja Před 2 měsíci +3

    It's not hard to make a graphics card if we abandon our current software library.
    It's very hard to make a graphics card that's compatible and performant on every version of OpenGL, DirectX, and Vulkan that has ever come out.

  • @zealotoffire3833
    @zealotoffire3833 Před měsícem

    I'm into designing simple computer architectures and then running programs on them. Even for CPUs it took a lot of research to get something working, and I couldn't find anything detailed enough to be designed from just logic gates. Then I tried to make a simple GPU and I was done; the closest thing to any details was a university paper someone wrote about the order in which it interprets instructions, which was barely anything, so I pretty much have to reinvent the wheel (I haven't even attempted to make a GPU from logic since I need to plan first, and I don't know how to do that at this point lol)

  • @noth606
    @noth606 Před 2 měsíci +1

    The Raspberry Pi has zero to do with FPGAs, just a relatively important point; Pis use off-the-shelf chips, slap them on a custom board with some RAM and whatnot, connectors etc., and call it a day. The FuryGPU is a significantly more involved project than that. I could throw together a Raspberry Pi sort of equivalent in a couple of weeks for the hardware, and a month or two for adapting the codebase to the board, assuming the chosen CPU is reasonably well supported. It would take me years to put out something approaching the FuryGPU if I'm lucky and have loads of resources. I am not bullshitting; I have done sort of similar things to the Raspberry Pi, but for a different purpose and lower power. I ended up not proceeding beyond making a first series of fully fabbed and tested boards, but that didn't have anything to do with the hardware or software; it was me splitting up with my then GF, having an argument with my business partner at the same time, having to move, and within less than a year finding a new GF, deciding to get married and getting married. My new wife soon got pregnant and priorities shifted from messing around with tech to different things. 10 years ago.

  • @Ninetails94
    @Ninetails94 Před 2 měsíci

    it wouldn't be too hard to make the GPU itself; the self-written code is the hardest part,
    so the fact that some dude made a GPU from scratch is pretty neat. Hope someday we can get custom GPUs that outperform the major players.

  • @andreasmeow452
    @andreasmeow452 Před 2 měsíci

    3:28 It's been a known issue that with MSAA turned on in GTA V, Arc performs very poorly. Maybe Steve had that enabled, as it does say it is "VeryHigh-Ultra-Custom". But I mean, it is also 4K, so it'd be a bit hard to run super well regardless

  • @KaitsuYT
    @KaitsuYT Před měsícem

    I'm curious, would it be any easier to write the drivers for Linux? He should try it out, maybe

  • @melexxa
    @melexxa Před 2 měsíci

    Man, I really want to mod my GPU by adding more VRAM chips in a clamshell configuration, but I'm pretty scared to do so.

  • @aggressivefox454
    @aggressivefox454 Před 23 dny

    For how customizable and “freeing” the PC market is, with a wide variety of interchangeable parts, operating systems, etc., I would have expected there to be less of a monopoly on GPUs. I always got the impression they were just mini computers, so I kind of thought you might be able to build them yourself (granted, not as easily as a PC). Open source and custom-built GPUs would be awesome to see though. I'd love to have a multi-GPU setup that I can build myself

  • @dennisestenson7820
    @dennisestenson7820 Před měsícem

    About 10-15 years ago I worked on a product that used an FPGA to generate video output. It's impressive, but definitely not unheard of.

  • @Skullkid16945
    @Skullkid16945 Před měsícem

    I hope there is an open source hardware boom soon, or at the least more big names getting involved in finding ways to make things like this more available to the open source community as a whole.

  • @tristan6509
    @tristan6509 Před 16 dny

    you should take a look at the MiSTer, they not only emulate the GPU, but also CPU, RAM, firmware, etc of old consoles with an FPGA. it's some wild stuff.

  • @YuNherd
    @YuNherd Před 2 měsíci

    i am waiting for ARM-based GPUs, hope someone will make one
    edit:
    he could pitch this GPU as modular, where you can change the core and the amount of RAM, with changeable heatsinks too

  • @deathVIAavatar
    @deathVIAavatar Před 2 měsíci

    I'm already imagining a cool, very affordable project PC bundle that could have this in it. Something that includes projects for people to make. Almost like a modern Commodore 64, but with a dedicated GPU as opposed to an SBC, and with plenty of ram to allow all kinds of editing of beginner projects. Maybe something that could render up through PS1/Nintendo 64 style graphics without sweating it. I know this sort of thing kinda exists, but with the open source nature of this GPU, that would align with the tinkering factor.

  • @JohnBernas-ll2si
    @JohnBernas-ll2si Před 21 dnem

    The frame time for 60 FPS with working vsync should be about 16.67 ms, so considering what we saw on the counter, it runs at about 35-40 FPS at maximum, or I'm not getting something about his GPU.
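    The arithmetic behind that estimate, treating the implied frame times (roughly 25-28 ms, which is what a 35-40 FPS reading works out to) as assumptions:

    ```python
    # A locked 60 FPS allows 1000 / 60 = ~16.67 ms per frame.
    def frame_budget_ms(target_fps):
        return 1000.0 / target_fps

    print(f"60 FPS budget: {frame_budget_ms(60):.2f} ms per frame")
    for frame_ms in (25, 28):            # assumed observed frame times
        print(f"{frame_ms} ms per frame -> {1000 / frame_ms:.0f} FPS")  # 40 and ~36 FPS
    ```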

  • @Retrojd93
    @Retrojd93 Před 2 měsíci +1

    Which GPU should I pick, the 3060 12 GB or the RX 7600? Any suggestions?

  • @litlle_elctro_engeneer
    @litlle_elctro_engeneer Před měsícem

    i think that microSD card is there for:
    * storing the program (because it's so large)
    and, as he says, for storage

  • @softwarelivre2389
    @softwarelivre2389 Před 2 měsíci +1

    We need to run Super Tux Kart on that ASAP

  • @paradox8425
    @paradox8425 Před měsícem

    Does anyone know why he didn't implement an existing API to work with his hardware? Like OpenGL or Vulkan
    Edit: btw I think the SD card is there to store the firmware, which is basically something like a BIOS for that specific hardware

  • @phillangstrom8693
    @phillangstrom8693 Před měsícem

    I would like to see one of the companies make a video card with not only 32 GB of fast RAM but also a high-speed 200 GB M.2 drive for caching the entire game, so it doesn't have to use the CPU as much to read from the main drive. Or I am still waiting for game makers to adopt the new particle physics engines that would make high-end GPUs unnecessary, because high-quality graphics at a high frame rate would be possible on standard integrated APUs and low-end GPUs like a GT 980

  • @kaseyboles30
    @kaseyboles30 Před 2 měsíci

    If this FPGA design were transferred to a full-blown ASIC of its own, it could probably take that 720p 60 FPS and turn it into 1440p 60 FPS, especially on a more recent node. It could probably double the number of cores on top of the speed-up. Heck, the design might scale up to 4K60 for the game.
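    The pixel-count arithmetic behind that scaling guess (the 10x FPGA-to-ASIC figure quoted earlier in the thread is a rule of thumb, not a measurement):

    ```python
    # All else equal, sustaining the same frame rate at a higher resolution
    # needs fill rate proportional to the pixel count.
    def pixels(width, height):
        return width * height

    base = pixels(1280, 720)   # 720p
    for name, (w, h) in {"1440p": (2560, 1440), "4K": (3840, 2160)}.items():
        print(f"{name}: {pixels(w, h) / base:.0f}x the pixels of 720p")
    # 1440p is 4x the pixels of 720p and 4K is 9x, so a ~10x ASIC speed-up
    # plausibly covers 1440p60 with headroom, while 4K60 would be marginal.
    ```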

  • @drenewoo_irl
    @drenewoo_irl Před 17 dny

    I would contribute to this project; I already made a kind of custom GPU over USB for Linux, with a modified DXVK, so it runs pretty well

  • @TheCustomFHD
    @TheCustomFHD Před měsícem

    I "know" someone that has been reversing WDDM, the video driver stack since Win Vista. That work would help this gpu probably a bit.

  • @Meilk27
    @Meilk27 Před 13 dny

    is this the first time something like this was done and documented?

  • @legiongaming99
    @legiongaming99 Před měsícem

    I don't mind if they hit the technical limit; more GPUs on the market means cheaper GPUs for all, and if we did hit a hard limit, that's good for competitive games

  • @roythunderplump
    @roythunderplump Před 2 měsíci

    Louis Rossmann would love this story piece, hope more jump on board with these electronic projects.

  • @adilam6128
    @adilam6128 Před 2 měsíci

    Hey VEX! I need your help! I built a PC, and when I launch an AAA game it gets too loud 5 minutes after launch. I disabled the case fans, set silent profiles for the CPU/GPU fans, and nothing helped!

    • @GTORazor
      @GTORazor Před měsícem

      Do you have CPU core temp monitoring? Sounds like the CPU is getting very hot if the fans ramp up when launching games.

    • @adilam6128
      @adilam6128 Před měsícem

      @@GTORazor when I play god of war cpu temp doesn't go above 70-72c max and still loud!

    • @GTORazor
      @GTORazor Před měsícem

      @@adilam6128 what CPU cooler are you running? Are all fans tied into motherboard or case fan controller? My new build runs 25C or less at idle and never over 50C during extended gaming with near silent operation. i5-12600K, Peerless Assassin 120 cooler. 5 case fans, Zalman ZL1 case.

  • @andrewvader1955
    @andrewvader1955 Před měsícem

    This is super cool! I would be happy if it runs old titles.

  • @LuGaKi
    @LuGaKi Před 9 dny +1

    "imagine having a pre-build GPU"

  • @SalveMonesvol
    @SalveMonesvol Před měsícem

    This line of work could be amazing to emulate old consoles.

  • @pauloisip3458
    @pauloisip3458 Před 2 měsíci +7

    I can see this guy becoming successful unless nvidia makes a move on the guy

    • @AssassinIsAfk
      @AssassinIsAfk Před měsícem +1

      One of two things will happen:
      1) Nvidia goes Nintendo/Sony and sends a cease and desist, or
      2) they send him an opportunity to work for them to fix their budget cards, or AMD/Intel send him an opportunity to work for them.

    • @NicCrimson
      @NicCrimson Před 29 dny

      @@AssassinIsAfk A cease and desist for what?

    • @AssassinIsAfk
      @AssassinIsAfk Před 29 dny

      @@NicCrimson I don't think you understand the joke

  • @sanketsbrush8790
    @sanketsbrush8790 Před měsícem

    I always wanted to make my own GPU with my own bare hands, but I don't have any knowledge of it

  • @shiro3146
    @shiro3146 Před měsícem

    can't wait for open source CPUs and GPUs
    it would be crazy cool considering there's Linux in the software space but nothing like it in hardware
    sure, hardware isn't as easy as software and it would cost unimaginable money, but hey, this kind of niche has already proven it can become the start of a standard too

  • @c.n.crowther438
    @c.n.crowther438 Před měsícem

    I will be following Dylan Barrie's work with great interest.

  • @emiljagnic2101
    @emiljagnic2101 Před měsícem

    Awesome, thank you for reporting about this!

  • @marisbarkans9251
    @marisbarkans9251 Před 13 dny

    Now I finally understand how people heard and saw me when I was younger. You CAN make a computer just on test boards with wires and ICs. You can make a GPU from the same. You can re-solder memory, CPUs, GPUs and make Frankenstein cards. You can power-mod GPUs. You can LN2 overclock. You can do all sorts of things, but it's all for fun and learning. Even upgraded cooling solutions are usually useless unless your card has very poor cooling. The most useful thing I found was modding laptop heatsinks, if you didn't care about looks or noise. All of this has already been done. I would say it would be more interesting to combine GPUs and write drivers to get them to work than to make your own; in the end you don't really make your own, because you use an off-the-shelf chip anyway. So why not use old AMD or Nvidia cards to swap GPU dies, RAM, etc.?

  • @jasont80
    @jasont80 Před měsícem

    He's basically using a single-board computer to render graphics in software. It will never be close to a modern GPU, but this level of tinkering is amazing. Love it!

  • @J-Ernie
    @J-Ernie Před 2 měsíci

    I have the same microphone as you. Can you share how you power it and if you use the goxlr could you please share the settings?

    • @vextakes
      @vextakes  Před 2 měsíci +1

      I might make a video on setup stuff in the future. Mine is pretty simple… the Shure MV7 runs XLR into the Behringer UMC204HD interface, and I use a FetHead to boost the signal a little bit for less signal noise. The interface is quite cheap, same with the FetHead.
      I record the microphone raw and do all the processing in post, like the EQ, compression, de-essing, expansion, and limiting. Although if you use a GoXLR I do something a little similar. I use Elgato's Wave Link software to put in-software live processing on my mic for use in calls and while live-streaming. The difference with the GoXLR is I can still capture the raw microphone for recording; you would need something like an XLR splitter if you want to do that on the GoXLR while using its live processing
      Hope this helps!

    • @J-Ernie
      @J-Ernie Před 2 měsíci

      @@vextakes It would be amazing if you could make a video. I'll subscribe so I don't miss it. Thank you for your reply.

    • @vextakes
      @vextakes  Před 2 měsíci +1

      Yeah np
      I don’t think it would go on main channel, or it might be members-only or something. Just a heads up

    • @J-Ernie
      @J-Ernie Před 2 měsíci

      @@vextakes Ok, Thanks for the heads up. If it's okay, I have one more question. I have researched the Fetheads and saw several Triton Audio FetHeads. Could you kindly let me know which one is recommended for our microphone? I appreciate your help. Thank you!

    • @vextakes
      @vextakes  Před 2 měsíci +1

      There’s only 1 fethead- it’s a brand name. I use that triton audio one. You can also use a cloud lifter or many cheap generic ones I’ve actually heard are decent. You use the 48v phantom power that u usually use for condenser mics to boost the signal
      Honestly tho, the gain lifter doesn’t make a huge difference and I wouldn’t stress about it. It takes away a little bit of noise, but I used mine for months without one and nobody could tell especially watching thru a CZcams video on a phone speaker, tv, whatever. Nobody can tell man. Just make sure your voice sounds clear in the video and is pleasant to listen to. Almost no one is listening on studio level equipment, those that do, you shouldn’t listen to. I’ve had people try to tell me shit and they’re just nitpicking.

  • @CipherDiaz
    @CipherDiaz Před měsícem +1

    An FPGA is *NOT* in a Raspberry Pi. They are quite expensive. Also, programming these FPGAs is not anywhere near similar to C/C++ or anything like that; you are basically writing the logic for how electricity flows between components. An FPGA also has a clock rate, and the higher the rate, the more these things cost. So for him to get, say, 150 FPS, he would most likely need a fast onboard clock, and most likely more gates to work with, since the only way to truly optimize anything on an FPGA is by heavy use of tables. Which might not apply to a GPU, since it's basically moving a ton of data around memory as quickly as possible.
    But yeah, awesome project!

  • @user-cx6rg6mr7d
    @user-cx6rg6mr7d Před 2 měsíci

    kudos to DIY project

  • @Wobble2007
    @Wobble2007 Před 2 měsíci

    Amazing what he has achieved, though MiSTer has already done this with Groovy-Mame & MiSTerCast, which lets you use your MiSTer as a GPU.

  • @gravecode
    @gravecode Před měsícem

    I'm praying an open-source GPU community develops; low-key the world needs one now more than ever.

  • @thevikinggamer454
    @thevikinggamer454 Před 2 měsíci

    Loved the video. It would be really good for us consumers if more companies tried to get into the GPU game; it would force prices down and accessibility up, and it's impressive to see one person actually make a functional one. When it comes to Intel GPUs and Starfield, it works like a charm, btw. One of the kids' gaming PCs is running a Sparkle Intel Arc A770 TITAN OC 16GB, and it runs both Starfield and Cyberpunk 2077 at well over 100+ FPS on max settings, with lows well over 80+ FPS, so I don't understand how testers get such low numbers on those 2 in tests. Intel is also putting out almost a new driver update every week now, making the card better and better, and in some games my kid actually outperforms my 4080. So Intel is well on their way to making at least the Arc A770 a really solid contender and card, for a fraction of the price of my 4080

  • @HyperDev00
    @HyperDev00 Před měsícem

    Basically an FPGA is a chip that can be programmed (Intel is one of the vendors that makes them): you can specify gates, decoders, multiplexers, and all that without dealing with the hardware directly, because it is programmed more like software.

  • @CheesyX2
    @CheesyX2 Před 2 měsíci

    Seriously impressive stuff!

  • @kaimanic1406
    @kaimanic1406 Před 2 měsíci +1

    I can't even imagine to build my own GPU. This guy is amazing!

  • @Mantrevouir
    @Mantrevouir Před 2 měsíci

    Your haircut looks so badass keep it mate👍

  • @reincarnatedTohere
    @reincarnatedTohere Před měsícem

    As an amateur programmer who codes for fun, I just wanna know how many lines that guy had to write to make a custom GPU API, because that sounds insane

  • @flakes369
    @flakes369 Před 2 měsíci +3

    TempleOS energy

  • @keiyano
    @keiyano Před měsícem

    The hardest part of making new devices is always the Windows drivers. It's indeed a lot better nowadays, since Windows has support for user-mode drivers written in C#, which is much more understandable, and they do give examples of writing them. But they need more tutorials and examples to help devs who aren't familiar with writing drivers. There is also driver signing, which is confusing