Game devs are drowning in complicated proprietary systems (Jonathan Blow)

  • Uploaded 23. 01. 2024
  • Jonathan Blow's Twitch: / j_blow
    Tip me: ko-fi.com/blowfan
    Jonathan Blow talks about how game developers are forced to deal with big, complicated systems such as the DirectX, Vulkan, OpenGL and Metal graphics APIs and their shading languages. This leaves less time to do actual, real work. Nobody feels like they have the time to fix or question the situation. Vendors wanting to keep things proprietary contributes to the problem. Just watch the video instead of reading my clumsy attempt to summarize it.
  • Science & Technology

Comments • 261

  • @augustday9483
    @augustday9483 Před 5 měsíci +153

    I don't work in gamedev, but I do work in an enterprise setting as a software engineer. A depressingly small amount of my team's developer hours goes towards writing useful features or functionality. Over half, sometimes over three quarters, is spent updating dependencies, fixing issues in our deployment pipeline, and migrating from one deprecated API/technology to a newer system. API migrations especially have been killing us this past year; it feels like every month some other team is contacting us to say "this API you're using is going away, you have to move to this other system with a totally different set of endpoints with different response objects, oh and by the way a feature you relied on no longer exists".
    On top of that, our team's budget has been cut in half. So we're being asked to do tons of work in the backend which doesn't deliver any value to our customers, AND we have half the developers we used to. If gamedev is anything like that, I totally understand why AAA companies spend years making a mediocre product. Modern software development is a dependency nightmare.

    • @Ghorda9
      @Ghorda9 Před 5 měsíci +4

      this is why you should never switch to a new engine or API in the middle of a project, just wait for the next one.

    • @HunterD510
      @HunterD510 Před 5 měsíci +3

      Why aren't they using versioning in their APIs? You're normally supposed to support the old one as well as the new one.

    • @CottidaeSEA
      @CottidaeSEA Před 5 měsíci +8

      @@HunterD510 That's not how the world works. In that case we'd still have Windows 95 for example.
      There is a limit to how much you can support and sometimes you need to restructure things in order to avoid complexity and scale to the extent that the demand requires. It could be that the cost to benefit ratio was also too low for some features, leading to their removal.

    • @guylianwashier4583
      @guylianwashier4583 Před 5 měsíci +9

      @@Ghorda9 I'm not sure that what you're saying is even possible for the commenter. It sounds like they work in enterprise web development. If that's the case, what would your solution be for an API vendor that gives a few months' notice and then shuts down the server where the API is running? In the shitshow that is webdev that's a very real problem, where your only choice is to switch to a new API in the middle of a project. Also, a project within webdev is rarely finished, since there's always something to change, fix, or add.

    • @Ghorda9
      @Ghorda9 Před 4 měsíci +1

      @@guylianwashier4583 wasn't really aimed at online stuff, just everything else. Not really familiar with "enterprise"

  • @post-humanentity8206
    @post-humanentity8206 Před 5 měsíci +57

    Bro casually found a red dot sight in his cake

  • @todorsamardzhiev144
    @todorsamardzhiev144 Před měsícem +4

    This is so spot on. Every time I ask a graphics dev this question, I get the response "uhh I don't know man, I haven't thought about it tbh".
    Also I'm glad that influential people such as JB are finally talking about all the nasty ways proprietary systems get in the way of good engineering. The future is open software running on open hardware. This is obvious, but it's also EXTREMELY hard to make it work in the modern world! Humanity will probably lose centuries figuring this out and it just breaks my heart every time I think of it...

  • @TheEvertw
    @TheEvertw Před 4 měsíci +16

    I remember the days of the C-compiler lock-in. When I found out about gcc (7 years after it first came out, shame on me), that was a revolution!
    We should learn to NEVER depend on vendor-owned programming languages. It is the ultimate lock-in.
    The problem with the GLs is that they are not compiled languages but interpreted, meaning that an intermediate layer that translates an open GL (pun intended) into a vendor-specific one will always suffer a performance hit.

  • @perfectionbox
    @perfectionbox Před 5 měsíci +18

    John: "... shitty home computers..."
    Retro PC enthusiasts: WHAT

  • @Casgen
    @Casgen Před 5 měsíci +94

    I am programming in Vulkan right now for my thesis, and the burden that Vulkan brings compared to OpenGL is unbearable. The fact that you have to write a Vulkan program with 1500+ lines of code just to show a single triangle is kind of insane to me. At university we had to write a software renderer, and I feel like I coded that faster than I got Vulkan up and running with the triangle. Don't get me wrong, I love learning about all the low-level stuff, but because of all of this, Vulkan and DX12 are so complex.
    Don't even get me started on trying to debug a case where I left out a single sType (structure type) assignment, which can throw off the whole application (at least the validation layers can help with this). There is just so much information that you have to keep in your head, and it really distances you from actually solving the problem. That could of course be a skill issue, but that's just how I see it. When I programmed in OpenGL everything seemed much clearer to me and I could focus more on the problem.
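
To make the sType complaint above concrete for readers who haven't touched Vulkan, here is a minimal sketch (not code from the thesis) of creating a VkInstance with the Khronos validation layer enabled. Every create-info struct carries an sType tag; leave one zero-initialized and the driver sees a garbage struct, which is exactly the class of bug the validation layers exist to catch:

```cpp
#include <vulkan/vulkan.h>
#include <cstdio>

int main() {
    // Every Vulkan "CreateInfo" struct must carry its own sType tag.
    VkApplicationInfo app{};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO; // forget this and the struct is garbage to the driver
    app.pApplicationName = "triangle";
    app.apiVersion = VK_API_VERSION_1_0;

    // The validation layer is what reports the "missing sType" class of mistakes.
    const char* layers[] = { "VK_LAYER_KHRONOS_validation" };

    VkInstanceCreateInfo ci{};
    ci.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    ci.pApplicationInfo = &app;
    ci.enabledLayerCount = 1;
    ci.ppEnabledLayerNames = layers;

    VkInstance instance = VK_NULL_HANDLE;
    if (vkCreateInstance(&ci, nullptr, &instance) != VK_SUCCESS) {
        std::fprintf(stderr, "vkCreateInstance failed\n");
        return 1;
    }
    // ...hundreds more lines before a triangle appears: device, swapchain,
    // render pass, pipeline, command buffers, synchronization...
    vkDestroyInstance(instance, nullptr);
    return 0;
}
```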

    • @doltBmB
      @doltBmB Před 5 měsíci +28

      and even the very best AAA games specifically optimized for Vulkan have really only a minimal benefit compared to GL/DX11. So the complexity is high, the room for error is high, it is easy to make something worse than the old runtimes, and the benefit to the consumer even when it's done right is minimal. Low-level APIs have been a complete disaster, just another exercise in ladder pulling.

    • @Casgen
      @Casgen Před 5 měsíci +7

      @@doltBmB Yes, that is one of the most mind-boggling things too.

    • @liviusolcovenco6575
      @liviusolcovenco6575 Před 5 měsíci +16

      And Vulkan is still less complex than the GPU drivers or the GPU hardware architecture itself

    • @tempname8263
      @tempname8263 Před 5 měsíci +1

      There are quite a few ways to make Vulkan less painful of an experience though

    • @SETHthegodofchaos
      @SETHthegodofchaos Před 5 měsíci +3

      ​@doltBmB i dont know. Vulkan had run my games more smoothly and with higher FPS. In Path of Exile I can even switch the render API and I can directly feel it runs better.
      I mean, Vulkan probably has lots of ways to do things wrong, so that you also have the capability to do them extra right.

  • @JH-pe3ro
    @JH-pe3ro Před 5 měsíci +20

    I have similar sentiments to Blow here, but I concluded, "fuck it, do some modern retrogame coding. Agon Light looks nice." So now I am coding a mix of Lua and Forth. Lua is the entire build system and provides all higher-order metaprogramming, asset handling, etc. Forth is the compilation target, assembler and REPL debugging environment - I do not write complex Forth words, I write fairly dumb ones that Lua adds static checks for and copy-pastes variants of, so that hardly anything confusing or hard to read is done with Forth(and Forth itself is no slouch, but it doesn't have garbage collection, runtime type checks, string manipulation functions, etc. - it is a macroassembler operating over static memory structures, that bootstraps into a Lisp if you work at it; vs Lua being the most popular "embedded Lisp" around, one that is Forth-like in its dedication to a small core). The Agon is an eZ80 driving ESP32 graphics firmware over a serial protocol, so it has a good deal more power than vintage 8-bits, with an open design. There is a working emulator, multiple vendors making boards. You can reasonably understand 100% of the system and make targeted low-level optimizations without going waist-deep in assembly. If I want to retarget off of Agon, I just adapt it to a different Forth system and I/O layer.
    This is the stuff you do when you want to enjoy programming again, I think. Don't do modern connected things - they are poisoned by complex protocols and vendors who want to sell them. This is something the Forthers are right about. Chuck Moore will tell you that floating point is there to sell complex FPUs.

  • @rightwingsafetysquad9872
    @rightwingsafetysquad9872 Před 5 měsíci +54

    Not saying he's wrong, but he's kinda missing the mark. You shouldn't write for a GPU architecture directly because they change very often in ways that make them not backward compatible. For example, AMD Rx 5000 series had very little in common with its immediate predecessor, the Rx 500 series. Same with Nvidia between the 1000 and 2000 series. Compare that to CPUs: anything written for the 80386 back in 1985 would still work on a processor made in 2024. Half the point of games on PC is that I can still play games I bought in 2003 when I had Windows XP on a brand new computer today.

    • @Ghorda9
      @Ghorda9 Před 4 měsíci +3

      Should have told that to BioWare for the original Neverwinter Nights 2, which could only run on a very specific GPU.

    • @kahaneck
      @kahaneck Před 4 měsíci +1

      @@Ghorda9 That's not true at all.

    • @Ghorda9
      @Ghorda9 Před 4 měsíci +2

      @@kahaneck When I had the game it couldn't run on anyone's system, and when I looked it up it mentioned that it used a feature of a graphics card that was deprecated as soon as the next one came out.

    • @Ghorda9
      @Ghorda9 Před 4 měsíci +1

      @@kahaneck this was also in the mid 2000s.

    • @dacomputernerd4096
      @dacomputernerd4096 Před 4 měsíci +1

      Which, to be fair, they could do in the graphics market. They could just make one ISA and continue extending it like has been done for x86. But they don't. So why is that?

  • @Domarius64
    @Domarius64 Před 4 měsíci +7

    Apple doesn't want you to use Metal on other hardware. IBM open-sourced their architecture; no one attributes PCs to IBM anymore, and in fact IBM nearly went under. These companies do need a return on their investment, and part of that is securing their IP to some degree. Of course you can go too far, like Apple with their home button thing with Samsung.

    • @Domarius64
      @Domarius64 Před 2 měsíci

      It's funny because the net result for IBM is that they essentially gave us the PC architecture out of the goodness of their heart, and nobody remembers them for it!

    • @MadsterV
      @MadsterV Před měsícem

      @@Domarius64 because it was mostly by mistake, not out of the goodness of their heart. The current state of affairs was not their plan.

    • @Domarius64
      @Domarius64 Před měsícem

      @MadsterV Yes, the goal was that if they open-sourced the architecture, it would become more popular and they would make more money by being the biggest suppliers of that hardware, but it didn't work out. I use them as an example of why Metal was invented specifically to be different and closed off.

  • @ragequitredux
    @ragequitredux Před 5 měsíci +50

    I don't fully understand what these companies get from that kind of control. Like, nobody is going to develop exclusively for macOS, so it would seem to me that Metal does nothing except create barriers to entry, and no barriers to exit, because people are already writing their games in DX12 or whatever.

    • @pokefreak2112
      @pokefreak2112 Před 5 měsíci +28

      You'd be very wrong on that actually! Sure most gamedevs won't exclusively develop for macOS because the target audience is tiny, but there are quite a few successful mobile games that only get released for iPhone. And you've absolutely seen the inverse happen where a game only releases on windows or on a specific console.
      I've also worked for a company doing sports computer vision stuff, and a lot of that software is written to run smoothly on a macbook by exclusively using all the proprietary apple tech. It's all swift and it would be impossible to port to anything else.
      A large portion of b2b software in general doesn't need to care about lock-in

    • @majoolwip9513
      @majoolwip9513 Před 5 měsíci +10

      Apple likes having complete control over their ecosystem as that core idea has gotten them to where they are today. They don't care if it makes porting a game to their system hard since developers will do it anyways to get access to their millions of users.

    • @monad_tcp
      @monad_tcp Před 5 měsíci +10

      you answered your question. "does nothing except create barriers to entry", that's what they gain, maybe they want their products to be perceived as exclusive

    • @PRIMARYATIAS
      @PRIMARYATIAS Před 5 měsíci +1

      @@pokefreak2112 Can you name some of those iPhone-specific games (so I could check them out)? Thanks.

    • @gruntaxeman3740
      @gruntaxeman3740 Před 5 měsíci +1

      Game developers write games against a game engine, or at least some higher-level layer, so that they don't need to care about maintaining the Metal/Vulkan/DirectX/OpenGL mess.

  • @N7sensei
    @N7sensei Před 4 měsíci +5

    I don't think WFH is a hindrance. Sure, there are slacking asses, but there was a ton of slacking in the office too. What the office did not have is peace and quiet. I envy you if while being constantly interrupted you are able to entertain complex thought, and keep in your mind thousands of lines of code at the same time. I surely can not. I surely am far more productive at home than at the office.

    • @smergthedargon8974
      @smergthedargon8974 Před 2 měsíci +1

      100% agree. Being in an office just changes how people slack off. They'll learn to LOOK busy, not actually be busy. Slacking off is a sign of the company being bad, not the employees. If the employees felt like you treated them well, they'd feel obliged to return the favor and work harder.

  • @andyhu9542
    @andyhu9542 Před 4 měsíci +3

    I don't think it is 'they want to be in control of what languages run on their thing'; instead it is 'they need to control what languages run on their thing to raise the money to hire literally thousands of highly talented people to make the thing at all.' That's why we don't have community-driven 4090s. No one would spend a billion dollars on design, synthesis, layout, verification, and tapeout of a new GPU just to improve the environment programmers work in. Especially not the people who actually have that billion dollars, since if they spent it on this, they wouldn't have it anymore. With that said, tech companies are laying off massively in 2024. There are tons of talented people who don't work at a tech giant anymore. Maybe this is the chance for the community-owned GPU.

  • @MenkoDany
    @MenkoDany Před 5 měsíci +11

    03:32 There's Shady, which compiles into (Vulkan's) SPIR-V. It's the most advanced one I know of. There are other projects that compile into GLSL(ES)+HLSL+SPIR-V, but they all have tradeoffs.
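
For concreteness, one widely used tool in the category the comment gestures at (not Shady itself) is SPIRV-Cross, which takes a compiled SPIR-V module and emits GLSL, HLSL or MSL source. A minimal sketch of the SPIR-V-to-GLSL-ES direction, assuming the SPIRV-Cross C++ API:

```cpp
#include <spirv_cross/spirv_glsl.hpp> // header path depends on how SPIRV-Cross is installed
#include <cstdint>
#include <string>
#include <vector>

// spirv: the raw SPIR-V words, e.g. loaded from a .spv file produced by glslang.
std::string spirv_to_glsl_es(std::vector<uint32_t> spirv) {
    spirv_cross::CompilerGLSL compiler(std::move(spirv));

    // Target OpenGL ES 3.1 instead of desktop GLSL.
    spirv_cross::CompilerGLSL::Options opts;
    opts.version = 310;
    opts.es = true;
    compiler.set_common_options(opts);

    // Emits GLSL source text reconstructed from the SPIR-V module.
    return compiler.compile();
}
```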

  • @austinzobel4613
    @austinzobel4613 Před 4 měsíci +2

    Abstraction from Comprehension is the modern problem.
    All of these kids need to not be immersed in technology without being taught its foundations FIRST.

  • @Xerophun
    @Xerophun Před 5 měsíci +14

    Standardization comes with commodification. We won't see a uniform approach until these things become nothing more than appliances.

    • @MadsterV
      @MadsterV Před měsícem

      also standardization comes with stagnation. Don't want that myself.

  • @orlovskyconsultinggbr2849
    @orlovskyconsultinggbr2849 Před 5 měsíci +3

    The problem is the streamlining of interfaces. As for graphics card vendors, well, it's their intellectual property; sure, if the stuff were not so complex, most graphics card vendors would have open-sourced the shader work. I personally don't care, I just need an API at the OS level that simply works.

  • @felderup
    @felderup Před 5 měsíci +4

    Well, I looked up open GPUs... pretty dismal. There are a LOT of projects, but probably nothing that'll even run Half-Life. I figure the real problem is lack of fab access; if it were easy to get a design into real silicon, we'd have hundreds of halfway-decent open GPUs on the jungle site and Nvidia's newest cards would be half the price they are now.

  • @globalistgamer6418
    @globalistgamer6418 Před 5 měsíci +3

    These proprietary systems are at least trying to achieve something meaningful, even in a sub-par way. But then there are proprietary systems, like most modern console TRCs, that don't provide any actual value whatsoever to the player or developer in what they are trying to do and are purely a sink for development and computing resources.

  • @madpuppet666
    @madpuppet666 Před 4 měsíci +2

    The biggest problem we faced on the last 2 AAA titles I had to work on from home was that the sheer amount of data to crunch and build was prohibitively large.
    One game had 300 gigs of data to cook, and Unreal Engine took 24+ hours to cook it. Just changing a shader could take 4 hours to cook due to a terrible dependency system. The best you could hope for were some changes that you could do in 60 seconds, but that's still terrible, because you have to put your mind on hold for 60 seconds while you are waiting for the result of your last change... and 60 seconds is a long time to do nothing and not get distracted with something else.
    As for languages... I liked a lot of stuff about Jonathan's new language, but it's missing some key features like inheritance. But worse than that: he has an amazing feature that lets the compiler run code to generate code, and this makes the language too easily abused. Codebases in that language will be unreadable because of how much people will abuse the generated code, and how much you need to use the generated code to get around missing features in the language.

    • @SpeedfreakUK
      @SpeedfreakUK Před 2 měsíci +1

      The decision to not include inheritance is extremely deliberate, he does not like inheritance and considers OO a bad path to go down. I’m starting to agree, it’s slow and makes code horrible to read.

    • @madpuppet666
      @madpuppet666 Před 2 měsíci

      @@SpeedfreakUK I agree it was deliberate, but a language that denies people a very commonly used pattern is unlikely to catch on. The mess you can make with his compile-time #insert shenanigans is far beyond any mess people make with a hierarchy. And the function polymorphism is going to be a lot messier than a cleanly designed low-depth hierarchy.

    • @oscarsmith-jones4108
      @oscarsmith-jones4108 Před 2 měsíci

      I think Jonathan Blow made an alternative to inheritance: The "using" keyword. This works just like inheritance except you can choose to inherit by using a pointer instead of in-lining the members. This means that you can mitigate some of the bloat of your application while still using inheritance. This is like the flyweight design pattern except there's minimal refactoring cost when introducing it to your code.
      Unreal is also a great example of a project that is negatively affected by the lack of reflection/meta programming support in C++. To integrate your changes with the editor, Unreal needs to first generate an "intermediate" directory. This process requires running a parser through your code and traversing the output AST. While this works reliably in Unreal, this is an added layer of complexity that results in a longer build process than normal. The program needs to be run through a parser twice. Once for the reflection and again for the final build. The build process would probably be a lot more straightforward if Unreal was made in Jai instead, which supports this kind of meta-programming stuff out of the box. You simply cannot make a game engine like Unreal without reflection/meta programming.
      I'm really looking forward to this language anyway, if it ever releases. It probably won't get wider adoption though like you said.
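
As a rough illustration of the two layouts described above, here is plain C++ composition, not Jai; Jai's `using` additionally forwards member access so the wrapped fields can be referenced as if they were the struct's own, which ordinary C++ composition does not do. Names here are invented for the sketch:

```cpp
// Shared fields that several entity types want to "inherit".
struct Base {
    float x = 0, y = 0;
};

// In-line variant: the Base members are stored inside the entity itself,
// analogous to the `using base: Base;` form described in the comment.
struct EntityInline {
    Base base;
    int hp = 100;
};

// By-pointer variant: many entities can share one Base block elsewhere in
// memory, analogous to `using base: *Base;` (the flyweight-style option).
struct EntityShared {
    Base* base = nullptr;
    int hp = 100;
};
```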

  • @adamgustafsson7703
    @adamgustafsson7703 Před 3 měsíci +1

    Little did he know, but in the near future he would install Windows 11 and find out he can no longer move the start menu list. Win 11 is for me the practical example of "software in decline".

  • @andyhu9542
    @andyhu9542 Před 4 měsíci +2

    I don't think the 'C runs on everything' argument is true. Other than maybe the most basic stuff, you can't take a VAX program and run it on an Atari 130XE. If the goal were to have one syntax for all GPU programming, instead of ensuring that 90% of all games run smoothly on DAY ONE of a new GPU launch, we would have a great unified GPU programming framework. Game programming is far from the most severe vendor lock-in on GPUs; general-purpose computing is.

  • @justincaceres6487
    @justincaceres6487 Před 2 měsíci

    Leverage AI LLMs as a translator and reference puller, bonus if you have an in house all-accessible custom AI to bridge the gaps.

  • @snorman1911
    @snorman1911 Před 4 měsíci +1

    An ARM licensing style GPU would be awesome.

  • @TricoliciSerghei
    @TricoliciSerghei Před 5 měsíci +4

    I work from home better than in the office.

    • @Bobo-ox7fj
      @Bobo-ox7fj Před 4 měsíci

      Only true if your office is poorly designed (see: modern)

    • @smergthedargon8974
      @smergthedargon8974 Před 2 měsíci

      @@Bobo-ox7fj Offices suck in general. Being at home is way nicer.

  • @zdspider6778
    @zdspider6778 Před 4 měsíci +1

    1:17 'Murica.

  • @spicynoodle7419
    @spicynoodle7419 Před 5 měsíci +3

    Jon is turning into an open-source fanboy? Yay!

  • @rp627
    @rp627 Před 9 dny

    jon's rants are satisfying

  • @MsDuketown
    @MsDuketown Před 5 měsíci +7

    Regarding game systems:
    Endless worlds and adaptive gameplay already ran on DOS 3. OK, sometimes they required 8 floppies with the various levels, but nevertheless... Still, Goldrush 2, King's Quest IV and Monkey Island 2 had very decent graphics in the days when Voodoo cards, AdLib sound cards and EGA monitors made up the top game systems on PC.
    At that time Amigas were more hyped, and consoles like Sega and Nintendo got introduced, but PCs always stood their ground, especially since the Xboxes with AMD chips. And Linux, without support from the major gamedev studios, is also able to run most games.
    And game development is mostly language agnostic, except for the hardware-accelerated stuff. It needs drivers and can do without a specific chipset.

  • @waltarron
    @waltarron Před 4 měsíci

    Tactical Cake!

  • @anon_y_mousse
    @anon_y_mousse Před 5 měsíci +5

    I think the problem with Intel's compiler is that it's no longer the best at optimization and they make it annoying to acquire. If I could just go to their website and click a link to download it without any other garbage, then others would use it too. Granted, I've only ever used it to test things, because it's exceedingly slow and even GCC has surpassed it in terms of optimizing, let alone all the LLVM-based languages out there. It just doesn't have the competitive edge that it had 25 years ago.
    As for proprietary systems, unless you want absolute control over everything, just use an existing game engine and all of these problems are handled for you. Though I do wish that someone would develop an open-source GPU. There's really not a lot of open-source hardware out there that's any good, and there's just about zero chance you can have something "printed" up at a moment's notice, because it takes so much infrastructure to actually manufacture chips. What we really need is a (3D printer)-style revolution in hardware production. If you could just download a new processor design, stick in a silicon wafer or some such nonsense, hit print and be testing it by the end of the day, it would revolutionize computing.

    • @thewhitefalcon8539
      @thewhitefalcon8539 Před 5 měsíci +3

      An FPGA GPU would be an intermediate step

    • @anon_y_mousse
      @anon_y_mousse Před 5 měsíci +2

      @@thewhitefalcon8539 Indeed, and FPGAs in general are a great step because they allow faster iteration than a printer would, but they don't allow for the degree of complexity that such a device would bring to the table. Although it'd be interesting to see someone make a more generic device that could be used in that way. Of course, the computing power of a modern GPU can't realistically be tested with any current FPGA technology, but it'd still be a useful tool in developing something new.

    • @Muskar2
      @Muskar2 Před 5 měsíci +1

      Photolithography is pretty cool

    • @anon_y_mousse
      @anon_y_mousse Před 5 měsíci

      @@Muskar2 Yeah, even if you're not making a processor it can be fun.

    • @jacksonlevine9236
      @jacksonlevine9236 Před 5 měsíci +3

      The "Use a game engine" thing breaks my heart, it's like "Don't have the liberty to make apis that are pleasant for you to use, use these garbage designed-by-committee apis with shitty unintuitive graphical interfaces that, while being intended to make it easier, make it orders of magnitude more tedious and hellish". Like, NO.
      And then to add insult to injury, except for rare cases, you can almost always tell a game was made in an engine cause it has a certain uninspired generic look and feel to everything.
      It's like the consensus is "Home cooked meals are too much of a hassle, just use TV dinner trays and add your own sauce packets to it if you want to change it up a little"

  • @peterjansen4826
    @peterjansen4826 Před 2 měsíci

    Part of the solution: go fully open source for the development stack of games. That way you know what code runs, and you can much more easily optimize, build an interface, et cetera. You can keep the assets of the game proprietary, that is fine. Developers need to get paid, everybody understands that. But there is no good reason to have secret code in the development stack. I wholeheartedly agree that the GPU should use open-source software; AMD's drivers are open source (on Linux you can use them fully open source with good performance), but the firmware is not yet. It is a good thing that Unreal and Unity opened up their code (through licensing, which I don't mind), and of course we have Godot. It makes me somewhat optimistic about the future of software becoming more open source.
    In regard to Intel's C compiler, that led to misleading benchmarks. We can speculate whether Intel intended for that to happen, but the compiler's output was not properly optimized for AMD, using the wrong instruction paths on AMD CPUs back then. Regardless, it is better to just use open-source tools when those are available; it protects us from getting locked in.
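
The technical core of that benchmark controversy was runtime dispatch keyed on the CPU vendor string rather than on the features the CPU actually reports. As a hedged sketch of the feature-based alternative (GCC/Clang builtins; the `sum` routines here are invented for illustration):

```cpp
#include <cstdio>

// Two implementations of the same routine; the fast one is meant for AVX2.
static int sum_scalar(const int* v, int n) {
    int s = 0;
    for (int i = 0; i < n; ++i) s += v[i];
    return s;
}

static int sum_avx2(const int* v, int n) {
    // Placeholder: a real AVX2 path would use intrinsics here.
    // Reusing the scalar loop keeps this sketch self-contained.
    return sum_scalar(v, n);
}

int sum(const int* v, int n) {
    // Dispatch on what the CPU actually supports, not on whether the
    // vendor string happens to say "GenuineIntel".
    if (__builtin_cpu_supports("avx2"))
        return sum_avx2(v, n);
    return sum_scalar(v, n);
}

int main() {
    int data[4] = {1, 2, 3, 4};
    std::printf("%d\n", sum(data, 4));
    return 0;
}
```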

  • @EddieCensored
    @EddieCensored Před 5 měsíci +1

    Laughing in SNES

  • @storojilo
    @storojilo Před 5 měsíci +4

  • @MadsterV
    @MadsterV Před měsícem

    Coding straight to hardware sounds like a nightmare, nothankyouverymuch.
    Would you like to have a different target per vendor, per year?
    "I wish someone somewhere would have written a shading language that compiles to anything": Unreal and Unity both do it. They have their own (graph-based) language and it compiles to many different targets: Metal, OpenGL, DirectX, Vulkan, etc. These in turn are interpreted by drivers, which you can install for your specific card, so you don't need to go back to the engine every time your card's architecture changes (which is, what, yearly?).
    I thought by proprietary systems he meant libraries, because yeah, I'd agree there. PhysX turned into a lot of features that most people never saw, for example, back when it was locked to a single card series, for no reason other than marketing.

  • @carljosephyounger
    @carljosephyounger Před 5 měsíci +2

    Metal and Swift (with C, C++, Objective-C and ARM64 interop) on Apple Silicon, running macOS, is an awesome platform. Xcode sucks (but you don't have to use it), and the market's far smaller than on Windows, but if you wanted a modernized Amiga 500, Apple Silicon is pretty close.
    You can't have your cake and eat it.

    • @meanmole3212
      @meanmole3212 Před 4 měsíci

      No thanks, when the modernized Amiga costs billions and the manufacturer forces its shitty online spying services on you in order to get basic system updates.
      Image search "apple tech support azis" for visual representation of the situation.

    • @blipblap614
      @blipblap614 Před 4 měsíci

      > Xcode sucks (but you don't have to use it)
      We don't??

  • @dacianbonta2840
    @dacianbonta2840 Před 4 měsíci

    good

  • @0x0michael
    @0x0michael Před 5 měsíci

    We need some LLGPU project, like LLVM but for GPUs

    • @trejohnson7677
      @trejohnson7677 Před 4 měsíci +1

      what the fuck do you think Vulkan is?

    • @0x0michael
      @0x0michael Před 4 měsíci

      @@trejohnson7677 OMG, I just thought it was focused on graphics

    • @ShadowriverUB
      @ShadowriverUB Před 4 měsíci

      @@trejohnson7677 It's an API specification more than a software stack; it's more like the C standard library, you need a platform willing to implement it.

  • @doofmoney3954
    @doofmoney3954 Před 2 měsíci

    Intellectual Property and its consequences

  • @Godzillaaaaa11
    @Godzillaaaaa11 Před 4 měsíci

    It sounds like the problem is just the amount of expertise you need to work with these systems, and not whether vendors are acting as middlemen.

  • @lifeartstudios6207
    @lifeartstudios6207 Před 4 měsíci

    This is my assumption as to why Apple is so uptight about proprietary hardware and Metal. They're trying to force their way out of it.

  • @IPlayKindred
    @IPlayKindred Před 5 měsíci +1

    OK, but what's the problem with these libraries? They handle the link between the game and the hardware; they figure out 3D for you, shaders and all that stuff.
    They are big and complicated for a reason, and they aren't even close to the problem that's plaguing triple-A games.

  • @fluxem3908
    @fluxem3908 Před 5 měsíci +21

    "Let me write my own programming language and OS" - JB. "Why can't I finish an indie game in 8 years" - also JB

    • @thewhitefalcon8539
      @thewhitefalcon8539 Před 5 měsíci +8

      When he does finish a game it's the best game ever. He's done it twice.

    • @SETHthegodofchaos
      @SETHthegodofchaos Před 5 měsíci

      @@thewhitefalcon8539 True. Still funny to me watching his reaction in the Noclip documentary on The Witness to the question of whether his next game will take 8 years :D

  • @BlowFan
    @BlowFan  Před 5 měsíci +1

    5:53 I see an editing mistake. That one frame of misplaced image. Annoying. . .

  • @tamtrinh174
    @tamtrinh174 Před 4 měsíci

    Set your expectations lower! As low as possible

  • @llamatronian101
    @llamatronian101 Před 4 měsíci

    If you can't have a universal shading language I guess WGSL just doesn't exist. These languages are so similar that you can transpile between them pretty easily.
    And if you can't have C for GPUs someone should tell Embark Studios to stop working on rust-gpu.
    A lot of this sounds like a problem of constantly chasing cutting edge graphics rather than good gameplay. The best game I've played in the last few years is still Tunic, not some graphical behemoth.

  • @HyperMario64
    @HyperMario64 Před 5 měsíci +4

    I completely disagree with this. One could roll with Vulkan, which is the industry effort between all vendors except Apple, and call it a day. SPIR-V is an awesome standard that can be compiled from many different shading and compute languages. There are proprietary extensions, but they are not necessary to make a game renderer with average needs. Really, the frustrating part of the industry is the CUDA and Metal ecosystem, which is proprietary by nature. That wastes resources. Open stuff like GL (which sucked) and Vulkan is totally different.

  • @CaleMcCollough
    @CaleMcCollough Před 4 měsíci +1

    Golden rule: You never ever make a library that should be and is made by a chip manufacturer. It's not that they don't want you to have control, try to write one of those libraries, and see how much your life sucks when the chip manufacturer releases their next major version. This same logic applies to native OS APIs and languages like Swift and Windows Native API.

  • @nidavelliir
    @nidavelliir Před 5 měsíci +6

    I work in backend APIs and dependency management is a NIGHTMARE. Changing the version of any dependency breaks ten more dependencies.

  • @nonetrix3066
    @nonetrix3066 Před 4 měsíci

    I honestly disagree with the idea that it would be completely impossible to make your own shader language or rendering API; at least on the AMD Linux stack it's completely possible, since it's so damn open. Hell, there is already someone doing just that: there was a project I heard about, named OpenDX, that's trying to implement DirectX natively on Linux without any Vulkan translation. There is zero reason you couldn't just skip implementing DirectX and do your own thing instead. But why? Honestly, I don't think the APIs are the problem; they ARE mostly standardized. The only weird exceptions are consoles and Macs; everything else just uses Vulkan and OpenGL, simple as that, and even the proprietary APIs on those consoles are just clones of Vulkan plus extra stuff. If you want the APIs to be simpler, just use a wrapper around the more complex APIs, like wgpu or WebGPU. I am not a game dev, but that's just how I see it; I could be wrong.

  • @WaitButHow
    @WaitButHow Před 2 měsíci

    I absolutely love Battlefield 2042 and COD MW2 he mentioned. Completely disagree with his "did not produce anything useful" take on them. They are technical and artistic marvels IMO!

  • @doltBmB
    @doltBmB Před 5 měsíci +7

    jon blows tactical cake

  • @kristofkiekens902
    @kristofkiekens902 Před 3 měsíci

    Everybody is drowning in complexity. You have to draw a line and quit.

  • @ujin981
    @ujin981 Před 5 měsíci +6

    so writing a universal shading language is allegedly impossible, but then the SDL3 team is totally insane to try it...

    • @youtubesuresuckscock
      @youtubesuresuckscock Před 5 měsíci +2

      There are already dozens of cross-language shader compilers that let you convert any shader into any language with zero changes. This is a complete non-issue and doesn't matter at all.
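
One concrete such pipeline, sketched here with shaderc as the front end (the comment doesn't name specific tools, so treat this as an illustration): compile GLSL to SPIR-V, then hand the words to a SPIR-V backend such as SPIRV-Cross (sketched earlier in the comments) to get HLSL, MSL or GLSL ES back out.

```cpp
#include <shaderc/shaderc.hpp>
#include <cstdio>
#include <string>
#include <vector>

// Compile a GLSL fragment shader to SPIR-V. The resulting words can then be
// fed to a SPIR-V-to-HLSL/MSL/GLSL backend to finish the cross-compilation.
std::vector<uint32_t> glsl_to_spirv(const std::string& source) {
    shaderc::Compiler compiler;
    shaderc::CompileOptions options;

    auto result = compiler.CompileGlslToSpv(
        source, shaderc_fragment_shader, "shader.frag", options);

    if (result.GetCompilationStatus() != shaderc_compilation_status_success) {
        std::fprintf(stderr, "%s\n", result.GetErrorMessage().c_str());
        return {};
    }
    return { result.cbegin(), result.cend() };
}
```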

    • @ujin981
      @ujin981 Před 5 měsíci

      @@youtubesuresuckscock and yet this channel exists to spread crap over blow fans

    • @thewhitefalcon8539
      @thewhitefalcon8539 Před 5 měsíci +1

      GLSL is the universal shading language

    • @BlowFan
      @BlowFan  Před 5 měsíci +2

      ​@@ujin981 This channel exists to share the ideas, thoughts and personality of Jonathan Blow. I also want to save those moments from oblivion. You're free to think that it's crap. Thanks for watching!

    • @ujin981
      @ujin981 Před 5 měsíci +4

      @@BlowFan And you're free to delete comments with legitimate criticism. By the way, there are no ideas in this video, but you're a Blow fan, so you won't get it. Congrats, you've earned a "don't recommend this channel".

  • @kuklama0706
    @kuklama0706 Před 5 měsíci +2

    How about community owned... software renderer?

    • @gruntaxeman3740
      @gruntaxeman3740 Před 5 měsíci

      I don't think that is necessary.
      OpenGL ES 3.0 is available everywhere anyway, as every browser uses it, and that is... nice. In fact, in my opinion the old OpenGL ES 2.0 was good enough, once we understand that it is the assets that matter.
      If someone wants to work with a latest-gen low-level API with minimal effort, I think the way to go would be a Vulkan subset plus some wrappers to translate the Vulkan API to DirectX and Metal.
      We kind of need a "Vulkan ES".

    • @candoriano4420
      @candoriano4420 Před 5 měsíci +1

      Would that be the start of communism again ?

    • @kuklama0706
      @kuklama0706 Před 5 měsíci

      @@gruntaxeman3740 Here in the comments someone said how much you have to write to output a triangle in Vulkan.
      In another video Jonathan was complaining about having to mail Nvidia and pray for them to fix something stupid that an intern did and that people now blame on you. OpenGL wouldn't fix that.

    • @gruntaxeman3740
      @gruntaxeman3740 Před 5 měsíci

      @@kuklama0706
      The fix is to not use an API that is not yet a commodity. OpenGL ES 3.0 is something that has worked everywhere for two years. OpenGL ES 2.0 has been everywhere for over a decade.
      If those limitations are not an option, then it means more complexity: using middleware, or manually porting code everywhere and solving the problems yourself.

    • @gruntaxeman3740
      @gruntaxeman3740 Před 5 měsíci

      @@kuklama0706
      Also, writing code to a subset of Vulkan is one option: no fancy features, just the basics. Just like we got "miniGL" in the 90s, we could have a "miniVulkan" and port that everywhere.

  • @Krshk999
    @Krshk999 Před 5 měsíci +12

    Game dev is more about tying different APIs together than actually doing some thinking and applying logic to write something sensible.

  • @parassaini9748
    @parassaini9748 Před 5 měsíci +2

    The companies don't want to travel the path of Nokia; or we can assume innovation is treated as blasphemy in the tech space.

  • @abstractpoet8977
    @abstractpoet8977 Před 5 měsíci +8

    Is Jon a gun guy? That's rad

    • @ramireini
      @ramireini Před 5 měsíci +4

      We memed on his stream saying if someone wants to steal the compiler they would have to gunfight him :D He shows what weapons he has on stream sometimes.

  • @wyleong4326
    @wyleong4326 Před 4 měsíci +1

    I think language is very important, living in the world today. Back then, we had programmers: people who wrote the code that "talked" to the computer.
    Today, we have developers: dealing with dependencies, libraries, configs and a crazy amount of documentation to go through to get a grasp of the software. The things that were supposed to help become a burden, since those who made them will not understand problems they have never encountered.

  • @space_cowboy007
    @space_cowboy007 Před 5 měsíci

    For simple games, why not bring back software renderers and bypass all this crap? Surely CPUs can do circa-2000 graphics now.
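
To make the suggestion concrete: the core of a software renderer is just a CPU-side pixel buffer you fill yourself and then hand to the windowing layer. A minimal, self-contained sketch that writes one "frame" to a PPM file instead of opening a window:

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

int main() {
    const int W = 320, H = 240;
    std::vector<uint8_t> fb(W * H * 3, 0); // RGB framebuffer, plain CPU memory

    // "Rendering" is just writing bytes: a horizontal/vertical gradient with a
    // filled white rectangle on top, no GPU or graphics API involved.
    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            uint8_t* p = &fb[(y * W + x) * 3];
            p[0] = static_cast<uint8_t>(255 * x / W); // red ramps left to right
            p[1] = static_cast<uint8_t>(255 * y / H); // green ramps top to bottom
            p[2] = 64;
            if (x > 100 && x < 220 && y > 80 && y < 160)
                p[0] = p[1] = p[2] = 255;
        }
    }

    // Dump as a PPM image; a real game would instead blit fb to a window
    // every frame (e.g. via an SDL texture upload).
    std::FILE* f = std::fopen("frame.ppm", "wb");
    if (!f) return 1;
    std::fprintf(f, "P6\n%d %d\n255\n", W, H);
    std::fwrite(fb.data(), 1, fb.size(), f);
    std::fclose(f);
    return 0;
}
```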

  • @PantheonDev
    @PantheonDev Před 3 měsíci

    Basically skill issue

  • @bobweiram6321
    @bobweiram6321 Před 5 měsíci +1

    The answer is simple. A commercial vendor is financially incentivized to support their products. They can afford to pay their employee to devote their full time to work on those products. If their game engines suck, they go out of business. Open source products, on the other hand, are not beholden to anything. They have little to no accountability and their developers are usually unpaid volunteers. If you ran a gaming company where its survival depended on an external product, you'd be a fool to develop your game using open source products.

    • @ince55ant
      @ince55ant Před 5 měsíci +4

      I agree, but you're leaving out the anti-competitive nature of these companies. Like how Facebook has most people locked into their services. There's a huge cost to changing technology, so a declining product (or a less rapidly improving one) won't necessarily lose customers (what company is willing to take the hit of their most expensive workers needing time out to retrain?). And you'd have to be a fool not to realise these companies, and more specifically their shareholders, know this. So when quarterly profits are the game, the end user won't win.
      No one in actual control is worried about being the bag-holder at the end, since they are all protected from any consequences.

  • @breadtoucher
    @breadtoucher Před 5 měsíci +2

    Shady and VCC coming to the rescue! I hope we can all ditch the atrocity that is DXC, support SPIR-V more, and write C programs instead of stupid shading languages.

    • @PRIMARYATIAS
      @PRIMARYATIAS Před 5 měsíci +1

      Also check out Slang, a C#-like shading language.

    • @PRIMARYATIAS
      @PRIMARYATIAS Před 5 měsíci +1

      Also there is rust-gpu; I guess all of those systems still need tons of work.

  • @dave7244
    @dave7244 Před 5 měsíci +6

    2:26 I've never seen any concrete evidence that WFH decreases productivity. If you are talking about people just not working, they do that in an office too. I once walked around a large organisation at about 10:30 and half of one department was reading Reddit. If you are talking about communication, a huge amount of it happens over Teams/Slack/email anyway.
    3:52 It isn't an "excuse". That is someone being an actual software engineer and looking at what they have to work with and trying to make the best decision for the project.
    How many games has this guy actually released? Two and a bit games?

    • @Hwioo
      @Hwioo Před 5 měsíci +2

      He's made tons of games. It's just those two that were successful

    • @dave7244
      @dave7244 Před 5 měsíci

      @@Hwioo Ok. Thanks for the correction.

    • @fourscoreand9884
      @fourscoreand9884 Před 5 měsíci +2

      After Blow closed his first studio, he worked as a contractor for game studios with large budgets. Games he worked on include Oddworld: Munch's Oddysee, Deus Ex: Invisible War and Thief: Deadly Shadows.

    • @Wobbothe3rd
      @Wobbothe3rd Před 5 měsíci

      There is TONS of evidence that work from home is basically bullshit. It's been covered in many major journalistic outlets, and it's also obvious common sense.

    • @roadent217
      @roadent217 Před 4 měsíci +1

      @@fourscoreand9884 "Games he worked on include Deus Ex: Invisible War"
      Oof. You mean the game that, whenever you transition through a level, just loads a new Windows process of itself, and then kills the old one? And I'm supposed to learn about optimization from him?
      I'm still waiting on Jai to release, btw.

  • @PRIMARYATIAS
    @PRIMARYATIAS Před 5 měsíci +6

    The programming world these days is too fragmented: multiple shading languages (MSL, HLSL, GLSL etc.), multiple frontend frameworks (React, Solid, Vue, Svelte etc.), multiple programming languages for each platform (Kotlin for Android, Swift for iOS, C# for M$ etc.), multiple programming-language paradigms (Clojure, Rust, C++, Go, Haskell, OCaml etc.). While all of that is good to have, the incompatibilities and "platform-specificness" of each of those make the whole field a real big headache for programmers and leave it very fragmented.

    • @drygordspellweaver8761
      @drygordspellweaver8761 Před 5 měsíci +1

      It’s not even that. In the 70s there was far more fragmentation but the spirit of free software allowed for interoperability.
      The problem today is corporate greed and special interest groups. Locking source code in a cage while holding monopolies.
      It’s a lot like the engineered, faux diversity movement, that leads to destruction and fragmentation of actual diversity. Real diversity is something organic, free and beautiful.

    • @majorhumbert676
      @majorhumbert676 Před 5 měsíci +4

      Idk, wouldn't the alternative be to just have C everywhere?
      But I kind of agree as well. I think that React is good enough for frontend development, and that any alternative needs to have a competitive advantage. The only one that I found was Elm.

    • @gruntaxeman3740
      @gruntaxeman3740 Před 5 měsíci +1

      @@majorhumbert676
      In the early 80s, C compilers were expensive and slow. Turbo Pascal was the way to go for a very long time.
      C, however, started to show its age in the mid-90s. That was clearly seen in web development, where every request that triggered cgi-bin launched a new process. That is when scripting languages and Java started to become popular: scripts ran in the same process as the HTTP server, and Java application servers worked the same way.
      In frontend development, Svelte and SolidJS have a competitive advantage: they don't use a virtual DOM. Elm is also good.
      I use React, but the reason is the toolkits made for it. If I need to create a product without existing toolkits, I will use something else.

    • @youtubesuresuckscock
      @youtubesuresuckscock Před 5 měsíci +2

      Nonsense. There have always been a ton of different languages, and as even chrome dome says in this video, hardware used to be weirder and more dissimilar in the past, meaning that simply porting something was a non-trivial task.
      For all the hype about there being so many different languages, HLSL and GLSL for example are so close that automatic machine conversion from one to the other is trivial AND ALREADY EXISTS. Good luck converting an Atari Jaguar game to the Sega Saturn as easily as that.
      This is much ado about nothing. Not even remotely a problem.

    • @gruntaxeman3740
      @gruntaxeman3740 Před 5 měsíci +1

      ​@@youtubesuresuckscock
      Agreed. This is not new, and things are easier now. In the past we really had, at best, raw pixel data and text, and design documents on paper.
      Today, we can create a whole game in a fully portable way without building different versions. Just make it WebAssembly / OpenGL ES 3.0, and limit WebAssembly memory usage to 256 MB. We can also have something running on the DOM side as long as we keep total memory consumption below 384 MB.
      If someone wants more, the cost is adding middleware or manually writing low-level code.

  • @smarttarded
    @smarttarded Před 5 měsíci +1

    I never use 3rd party assets (specifically for Unity). I code anything that I need to.

  • @avhd187
    @avhd187 Před 5 měsíci

    Stuck with dialects of C++? At some point many just do C.

    • @llothar68
      @llothar68 Před 5 měsíci

      What dialects? We have only three C++ compilers left: Clang, GCC and MSVC. And they are pretty compatible. In production you never want to use the latest features anyway.

    • @avhd187
      @avhd187 Před 5 měsíci

      @@llothar68 He mentioned shader languages. But I'm referring to the C++-based dialects, like Apple's Metal vs OpenGL.

  • @dmembrane_
    @dmembrane_ Před 5 měsíci +8

    Drivers were a mistake. If user-space processes were able to interact with the hardware by simply writing data into memory-mapped registers and issuing interrupts (that's all a driver really is), developers simply wouldn't put up with crappy high-level APIs, undocumented/non-free GPU architectures and proprietary shading languages any longer.
    We traded freedom and low complexity for developer convenience.
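
For readers unfamiliar with the phrase, "writing data into memory-mapped registers" looks roughly like the sketch below. The device, addresses and register layout are invented purely for illustration; on real hardware you would need the vendor's register documentation and either kernel mode or an mmap of the device region:

```cpp
#include <cstdint>

// Hypothetical register block for an imaginary display device; the addresses
// and bit layout here are made up for the sake of the example.
constexpr uintptr_t DEVICE_BASE = 0xFE000000;
constexpr uintptr_t REG_FB_ADDR = DEVICE_BASE + 0x04; // where the framebuffer lives
constexpr uintptr_t REG_CONTROL = DEVICE_BASE + 0x00; // control bits ("go", etc.)

inline void mmio_write(uintptr_t addr, uint32_t value) {
    // volatile: the compiler must not reorder or elide hardware accesses.
    *reinterpret_cast<volatile uint32_t*>(addr) = value;
}

void kick_off_scanout(uint32_t framebuffer_phys_addr) {
    mmio_write(REG_FB_ADDR, framebuffer_phys_addr); // tell the device where pixels live
    mmio_write(REG_CONTROL, 0x1);                   // set the (hypothetical) "go" bit
}
```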

    • @youtubesuresuckscock
      @youtubesuresuckscock Před 5 měsíci +2

      No they weren't. Anyone who lived through early computing knows how completely broken it used to be when game developers actually had to either buy or WRITE THEIR OWN GAMEPAD DRIVERS for SPECIFIC GAMEPADS just to get support into their game. It was a joke. Guess what happened? Games constantly shipped that didn't work with your gamepad. It was a complete disaster.
      High level APIs supported by vendor drivers are one of the biggest success stories in the history of computing. Things are better than they've ever been. Only clowns who know nothing about the history of computers think things are worse.

    • @youtubesuresuckscock
      @youtubesuresuckscock Před 5 měsíci +2

      Mike Abrash literally spent MONTHS on a "to the metal" port of Quake 1 for the Verite graphics card because it didn't support a generic API with a vendor supplied driver. Guess what? It was a COMPLETE WASTE OF TIME and no one even played that worthless port.
      It was an utter disaster.
      High level APIs implemented by vendor drivers are far more awesome than anything the dude in this video will ever be responsible for.

    • @youtubesuresuckscock
      @youtubesuresuckscock Před 5 měsíci +1

      We have an actual HISTORY of companies doing exactly what you're suggesting. S3 Virge and Verite graphics let you write bare metal software for them and you had complete access to everything transparently, and they were UTTER FAILURES.

    • @monkev1199
      @monkev1199 Před 5 měsíci +1

      Even on platforms where you have direct access to submitting commands by hand on Linux, the amount of work you'd need is still insanity.

    • @thewhitefalcon8539
      @thewhitefalcon8539 Před 5 měsíci +6

      OK but you be the one to port Cyberpunk 2077 to 30 different GPU architectures

  • @MotraZotra
    @MotraZotra Před 4 měsíci

    "Big and compllicated system" he is more projecting than "giving advice".
    OpenGL is not really that compllicated.
    DirectX also not that complicated.
    It does require a level of comptetence.
    For those who can't deal with those, they have game engines.

    • @mousepotatoliteratureclub
      @mousepotatoliteratureclub Před 3 měsíci

      Not only for those that can't, but also for those that don't want to because it would take a lot of time that can otherwise be used on other parts of game development (or just not used, and the game released sooner/at all)

    • @MotraZotra
      @MotraZotra Před 3 měsíci

      @@mousepotatoliteratureclub Sure, but him saying the graphics APIs are too complicated is just wrong.

  • @ArthurSchoppenweghauer
    @ArthurSchoppenweghauer Před 5 měsíci +7

    Sounds like a problem with (catabolic) capitalism and the declining rate of innovation. Why would hardware manufacturers want to take up an otherwise counterproductive, intermediary position between the users and the hardware they themselves pay for? To squeeze out additional profit, because hardware manufacturers are too cynical about the potential profitability of future innovations (assuming they have any innovations at all). There seems to be a general creative and innovative bankruptcy across the West and the increasingly parasitic nature of proprietary hardware indicates that.

    • @Bobo-ox7fj
      @Bobo-ox7fj Před 4 měsíci

      "capitalism bad because computers not getting faster no more"

  • @lorenzozapaton4031
    @lorenzozapaton4031 Před 5 měsíci +74

    Didn't he whine about open source software?

    • @bogdan2529
      @bogdan2529 Před 5 měsíci +7

      Well, he said in this video that it's probably impossible.

    • @lorenzozapaton4031
      @lorenzozapaton4031 Před 5 měsíci +33

      @@bogdan2529 He sounds like a doomer to me. I still like him though.

    • @limarchenko96
      @limarchenko96 Před 5 měsíci +15

      He is right, open source software is awful.

    • @Tezla0
      @Tezla0 Před 5 měsíci +63

      @@limarchenko96 software in general is awful

    • @lorenzozapaton4031
      @lorenzozapaton4031 Před 5 měsíci +4

      @@Tezla0 Software made by AI will be the solution.

  • @NameUserOf
    @NameUserOf Před 3 měsíci

    So in one video you're bashing open source by painting it badly (even though you were empirically proven wrong), and now you're bashing proprietary.
    Pick a side already.

  • @AnastasisGrammenos
    @AnastasisGrammenos Před 5 měsíci

    Do you guys also like Musk?

  • @vanillagorilla8696
    @vanillagorilla8696 Před 4 měsíci

    So, the downfall of the free market of both ideas and profit is Cronyism.

  • @aftalavera
    @aftalavera Před 5 měsíci +4

    This is a very sad person!

    • @majorhumbert676
      @majorhumbert676 Před 5 měsíci +6

      He's just of a different personality type than you

  • @MonsSeptime
    @MonsSeptime Před 5 měsíci +11

    "F**k Nvidia" (c) Linus

  • @soma5216
    @soma5216 Před 5 měsíci +3

    "Capitalism is trash, let me eat cake" by Jonathan Blow.
    Proprietary this and that complicating any real, meaningful improvements; people's time literally being FULL of learning and wading through all the bad decisions, systems and restrictions created in the past/present.
    P.S. @someone else with your garbage defenses of 'this system': you're wrong and Idgaf if you wanna argue, stay clownin' for THE major impediment to human progress. K thnx bai.

    • @majorhumbert676
      @majorhumbert676 Před 5 měsíci +16

      A little bit more than one year ago, I would have been certain that this was written by an AI. Now however, I am certain of the opposite; an AI would never have written a text this badly.

  • @wowsnav
    @wowsnav Před 4 měsíci +4

    Lol, I don't know why youtube is recommending me Jon Blow videos. This guy hasn't made a single good game EVER, so listening to him critique other devs' programming skills is somewhat amusing. Worse than that though, he was a programmer on Deus Ex Invisible War, the worst Deus Ex game, so he contributed to making the sequel to one of my favorite games of all time much, much worse than the first one. For someone who has been in game dev for about 30 years, he never really picked up what it takes to make a quality game. Meanwhile timeless classics like Doom were made by a handful of nerds screwing around.

  • @youtubesuresuckscock
    @youtubesuresuckscock Před 5 měsíci +1

    You can already cross compile shaders to any of the vendors' language. This is NOTHING. It literally doesn't matter at all.

  • @youtubesuresuckscock
    @youtubesuresuckscock Před 5 měsíci +7

    This is a total non issue. It's easier to make games today than it's ever been. He's a clown.

    • @Muskar2
      @Muskar2 Před 5 měsíci +3

      Easy to mold to your needs and optimize ≠ easy to get started and ship a product of arbitrary quality

    • @youtubesuresuckscock
      @youtubesuresuckscock Před 5 měsíci

      @@Muskar2 It IS easy to mold to your needs.

    • @llothar68
      @llothar68 Před 5 měsíci

      @@Muskar2 I agree that getting the foundation of a software skeleton up nowadays is very hard. I'm just doing this on a cross-platform app, and I see that there is just not enough cross-platform support anywhere. This weekend I'm working on update and crash handling: fundamental issues, and every platform has its own implementation. I already had to write my own unit testing framework because there is no valid C++ one that works on Android (the NDK on Android totally sucks; I still have no idea how I will port i18n/l10n to Android when gettext is not working).