Why was the Atari Jaguar so Difficult to Develop on?

  • Added 22 Jun 2024
  • Another installment in the difficult development series!
    This one took a while; I hope you enjoy! Next up, the Sega Saturn!
    Subscribe for more stuff.
    Follow me on Social Media!
    Instagram: @jamezsgames
    Twitch: jamezsGames / jamezsgames
    Stream Schedule:
    Monday, Tuesday, Wednesday 6:30 PM EST
  • Science & Technology

Comments • 380

  • @bryede
    @bryede Před rokem +121

    I was a Jag developer. The biggest problem is that the fastest resources are extremely hamstrung so you either write something slow on the 68000 and get the job done in time, or carefully craft small optimized GPU assembly routines like a demoscene coder, swap them in and out as needed, and take years to get it running properly. I remember we wrote all kinds of test benchmarks because it was impossible to know what the savings would really be without doing it both ways.
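
    A minimal C sketch of the overlay approach described above, assuming the documented Tom GPU register layout (G_PC, G_CTRL, a GPUGO run bit, and 4 KB of local RAM at 0xF03000); treat those addresses and bit names as assumptions rather than verified values. The idea: copy one small, hand-optimized JRISC routine into the GPU's fast internal RAM, point the GPU at it, start it, and wait.

    ```c
    #include <stdint.h>

    #define G_RAM   ((volatile uint32_t *)0xF03000)  /* 4 KB GPU local RAM (assumed address) */
    #define G_PC    (*(volatile uint32_t *)0xF02110) /* GPU program counter (assumed) */
    #define G_CTRL  (*(volatile uint32_t *)0xF02114) /* GPU control register (assumed) */
    #define GPUGO   0x00000001u                      /* run bit in G_CTRL (assumed) */

    /* Copy one pre-assembled JRISC routine into local RAM and run it to completion. */
    static void run_gpu_overlay(const uint32_t *code, unsigned words)
    {
        if (words > 1024)          /* each overlay must fit the 4 KB internal RAM */
            return;

        for (unsigned i = 0; i < words; i++)
            G_RAM[i] = code[i];    /* GPU RAM written as 32-bit longs */

        G_PC   = 0xF03000;         /* start of the overlay just copied in */
        G_CTRL = GPUGO;            /* set the GPU running */

        while (G_CTRL & GPUGO)     /* the routine stops itself when finished */
            ;                      /* the 68000 busy-waits; real code would keep working */
    }
    ```

    Swapping different routines through the same 4 KB window, and timing each variant against the straightforward 68000 version, is essentially the "benchmark it both ways" workflow the comment describes.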

    • @XfromDarkHorse
      @XfromDarkHorse Před rokem +7

      Thanks for the information, but I have a question: what was the name of the video game that you developed for the Atari Jaguar?

    • @eriks3260
      @eriks3260 Před rokem +3

      What game did you program???

    • @enhancementtank5876
      @enhancementtank5876 Před rokem

      Sounds like a sorrowful excuse. You failed and handed the market to Sony, a consumer-appliance maker whose hardware was cheaper and weaker than any other console of its generation, right up to the PS2, which was criminally weaker than the Dreamcast, GameCube and Xbox.

    • @Lyricaldeamin
      @Lyricaldeamin Před rokem +1

      What are some titles you worked on?

    • @Stabby666
      @Stabby666 Před rokem

      @@enhancementtank5876
      ^ does anyone know what this idiot is talking about?

  • @FZuloaga
    @FZuloaga Před 2 lety +40

    So Atari came up with the phrase "do the math"... how can I do the math with flawed hardware? xD Great video.

  • @dyscotopia
    @dyscotopia Před 2 lety +25

    It seems like the Jaguar was difficult to develop for for some of the same reasons as the Saturn, though Sega had more resources to develop its own games and create better tools.

  • @sunnohh
    @sunnohh Před 3 lety +12

    “Just throw in some null ops” that is some old school problem solving....lol

    • @ZygalStudios
      @ZygalStudios  Před 3 lety

      🤣🤣

    • @SianaGearz
      @SianaGearz Před 2 lety +1

      This nop-after-jump to clear pipeline isn't actually unique. Hitachi SuperH used on... well everything SEGA 32-bit, has 16-bit instructions but 32-bit memory bus; so usually it has the next instruction already preloaded, and they decided, let's make it 'always' instead of 'usually', and let's just start executing it anyway, regardless of any jump that happens; so canonically you nop after a jump, or alternatively, you can just shove an extra instruction from before the jump into there to be executed while the jump is happening. The time would be wasted otherwise anyway! The CPU wasn't designed with assembly programming in mind, but explicitly for C compilers.
      But buffering a div with nops is not what one may call good design... and a div usually has variable latency, so fixed padding wastes time.

  • @charlesjmouse
    @charlesjmouse Před 2 lety +7

    The Jaguar: A console that was pushed out far too quickly.
    -The excellent hardware wasn't properly integrated and implemented - a couple of tweaks could have made it more robust and powerful.
    -The support libraries were somewhere between useless and non-existent - this made it a pig to develop for on stupidly short timelines.
    -Atari wasn't in a position to provide sufficient support or marketing - there was too much to do in too little time with insufficient money.
    It may have bombed anyway, but at least it could have gone down fighting.

    • @maroon9273
      @maroon9273 Před 2 lety +3

      The CoJag arcade board is the what-if scenario, had Atari used a 68020 or the great MIPS R3000 CPU. With the R3000 it would've rivaled PS1 and Saturn performance, with tweaks on the hardware as well, plus fixing the bugs in the coprocessors.

  • @CharlesHepburn2
    @CharlesHepburn2 Před 2 lety +41

    Sounds to me like the hardware and software development tools needed more time in the oven; as they were a bit under-baked. Probably Atari trying to rush the system out and save the company. I actually was one of the first people to own a Jag in ‘93… I had hopes Atari would reclaim their spot on top of the video game world… at that age, I totally bought into the 64-bit marketing stuff. Lol… hindsight!

    • @SianaGearz
      @SianaGearz Před 2 lety +7

      I think it's the wrong approach altogether. Sure you can design an elegant processor; but can you spin up the whole ecosystem around it? Especially if you're a small engineering firm? Can you make sure it doesn't have critical bugs? Sure it would have all been shaken out eventually, given enough time and resources; but if you have little to work with, the last thing you do is build this sort of failure mode into your design proposal.
      Imagine what if they did a SENSIBLE thing. Like a couple commodity RISC cores, SuperH or whatever, or DSPs, and a self-designed jungle chip in the middle to do the line blits, scratch RAM, bus management, DAC and IO operations, the usual. Then they could focus on the core competence of delivering a chip, it would be structurally simpler so it would likely be free of critical errata, and there would be debug boards available for the commodity chips used, which you just solder in place of the chip, instant debugging capability, plus more debugging capabilities foreseen in the CPU core proper. Besides, the 68k crutch can be ditched then, potentially making the whole thing cheaper too. And not necessarily weaker or slower than with these custom DSPs.
      Sure, less ambitious, but ambition means work, and you want to do the least possible amount of work to deliver a given or best possible result. Less work means more opportunity to do the work at high quality, on time and under budget.

    • @maroon9273
      @maroon9273 Před 2 lety +7

      @@SianaGearz Even an ARMv3 or an NEC V-series CPU would've been a great RISC CPU for the Jaguar. The 68K is the worst main CPU to use for a 32-bit console.

    • @werpu12
      @werpu12 Před 8 měsíci

      @@maroon9273 The 68k was a known entity, so not a bad choice per se; its assembler was widely known and it had good tools. The Jag's problem was the custom chips: while powerful, Atari lacked the time and money to provide proper dev tools for them. The company behind the custom chips survived another console lifecycle, and the successors to the Tom and Jerry chips made it into the Nuon console almost no one knows about, but with the PSX the methodology was shifting and those chips did not cut it anymore.
      Atari had good foresight, but it took too long and they ran out of money!

    • @theobserver4214
      @theobserver4214 Před 11 dny

      ⁠@@maroon9273These would’ve been expensive compared to a 68K. The problem was that Atari opted for a the regular 68000 when the 68020 would’ve removed a giant bottleneck on the system.

  • @Piss_Pistofferson
    @Piss_Pistofferson Před 3 lety +18

    I love the "Jag-WIRE".

  • @twh563
    @twh563 Před 3 lety +6

    I absolutely have no idea what was presented here but sat on my couch, watching the whole video as if I knew what was being explained. 🤣🤣🤣 You a smart dude. Great video!!

  • @amare65
    @amare65 Před 3 lety +45

    Sooooo, programming a game on an Atari Jaguar was equivalent to trying to eat a hot dog stuck in the spokes of a bicycle wheel whilst moving at 60 mph during an earthquake. 🤔

    • @ZygalStudios
      @ZygalStudios  Před 3 lety +10

      Maybe a little easier than that 🤣, but yes, difficult.

    • @richhutnik2477
      @richhutnik2477 Před rokem +6

      So they said F it and coded to the 68000 instead.

  • @Kevin_40
    @Kevin_40 Před 3 lety +12

    "jagwire" eh, oh boy

    • @ZygalStudios
      @ZygalStudios  Před 3 lety +3

      JAG-U-ARE ;)

    • @blahdelablah
      @blahdelablah Před 3 lety +1

      @@ZygalStudios It sounds like "Jag-u-wire" in the video. Thanks for the video though!

  • @finburkard4732
    @finburkard4732 Před rokem +2

    "So I decided to add it to the stack"
    ...I see what you did there.

  • @EugenioAngueira
    @EugenioAngueira Před 3 lety +6

    That was really interesting! Loved how you explained how the Jaguar works and how objective you were in your commentary! Nicely done!

  • @mbe102
    @mbe102 Před 3 lety +2

    Just stumbled on this, but holy damn is this ever in my wheelhouse of enjoyment! Good stuff man, very good stuff! One of the easier subs I've ever had the delight of making!

  • @paulpicillo8337
    @paulpicillo8337 Před 3 lety +7

    Jaguar is one of my top 5 systems of all time. It's an interesting machine that has some great original games, some amazing ports and a vibrant homebrew scene. Thanks for taking the time to showcase this underappreciated segment of console gaming history

    • @ZygalStudios
      @ZygalStudios  Před 3 lety +3

      Of course Paul! It's a great honor to cover a system like this. The architecture is a work of art. And with a few tweaks, and maybe some more engineers working on this design, originally, it could have been pretty competitive. It's cool to see it where it's at now though.

    • @SpandoSpando
      @SpandoSpando Před 3 lety +1

      Which are your favorite games on it?

    • @willman85
      @willman85 Před 4 měsíci

      You like the controller?

  • @gametourny4ever627
    @gametourny4ever627 Před 3 lety +19

    I just finished watching your N64, PS3, and now this episode. Great series! After you do the Saturn, which is another great choice, I would love a video on the 32X: developing with that while still having the bottleneck of the Genesis, and, if you had the Sega CD as well, what could be possible with all 3. That would be pretty fascinating.

    • @twh563
      @twh563 Před 3 lety +4

      The 32x would be great! Good idea.

  • @sbanner428
    @sbanner428 Před rokem

    Your videos are absolutely incredible and I love how digestible they are! They get me interested further in CS through things I already love; good stuff!

  • @user-gf3vb7xj3h
    @user-gf3vb7xj3h Před 3 lety +10

    Atari Jaguar was awesome

  • @jerrycox9618
    @jerrycox9618 Před 3 lety +1

    Just ran across your channel. I bought my first Jaguar in 1995, when it was on its way out. Bought the CD add-on also. Still love both of them. Thanks for taking me back to my late teens/early 20s. I subscribed. Thanks 😊!

  • @przemekkobel4874
    @przemekkobel4874 Před 2 lety +4

    That took some effort to make, thank you for that. I didn't closely follow every bit of information you presented, but two things stood out: the UART bug seems like a problem in comms between the Jag and another external device (like a modem or another Jaguar in some kind of network), not inter-chip communications. The MC68K 32-bit issue, from my understanding, means that you just cannot trust some long-word operations in two memory areas. The rest of the memory is fine, other '.l' assembly instructions seem fine, and non-long-word operations are fine (so you can use two move.w or four move.b instead of one move.l). Annoying, but not a deal-breaker.
    Again, thanks for the interesting vid.
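
    A small C sketch of the word-splitting workaround mentioned above: in the affected address ranges, issue two 16-bit writes instead of one 32-bit write. The address range used by in_buggy_region() is purely a placeholder, not the real erratum range.

    ```c
    #include <stdint.h>

    /* Placeholder predicate for "is this address in one of the two affected areas?" */
    static int in_buggy_region(uint32_t addr)
    {
        return addr >= 0x00200000UL && addr < 0x00400000UL; /* illustrative bounds only */
    }

    /* Store a 32-bit value safely: two word writes where a single move.l is untrustworthy. */
    static void store32_safe(uint32_t addr, uint32_t value)
    {
        if (in_buggy_region(addr)) {
            volatile uint16_t *p = (volatile uint16_t *)(uintptr_t)addr;
            p[0] = (uint16_t)(value >> 16);     /* high word first (the 68000 is big-endian) */
            p[1] = (uint16_t)(value & 0xFFFF);  /* then the low word */
        } else {
            *(volatile uint32_t *)(uintptr_t)addr = value; /* a normal move.l elsewhere is fine */
        }
    }
    ```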

  • @Joshua-fm1nh
    @Joshua-fm1nh Před rokem +6

    I just bought my third Jaguar. The only reason was Alien vs Predator. It cost $400 this time for a new one; the last one I bought before that was in the mid-90s at Walmart for $50.

    • @chuco915C
      @chuco915C Před rokem

      How much would a sega cd go for right now? I still have mine and it’s the one you would connect on the side of the genesis.

    • @randyfemrite7525
      @randyfemrite7525 Před 15 dny +1

      Did you know they were thinking about using the Lynx for AvP radar/sonar? How awesome would that have been...They had the machine but didn't know how to use it properly.

  • @larrytron1992
    @larrytron1992 Před 2 lety +9

    I've always wanted to know how the Jaguar was put together and why it was so hard to develop for

  • @WhitePointerGaming
    @WhitePointerGaming Před rokem +5

    Cool video. The weird hybrid of two 64-bit processors, two 32-bit processors and one 16-bit processor still to this day makes it unclear if the system as a whole is 64-bit as Atari claimed it was.

    • @LakeHowellDigitalVideo
      @LakeHowellDigitalVideo Před rokem

      System bus size was usually how they determined it back then. Like how the 386sx was 16 bit and 386dx was 32 bit

    • @thefurthestmanfromhome1148
      @thefurthestmanfromhome1148 Před 10 měsíci +2

      It was basically just marketing talk; because it had 64-bit architecture, it could be marketed as a 64-bit console, but it wasn't a 'true' 64-bit system as we would know it.

  • @Rocky1138
    @Rocky1138 Před 3 lety +11

    Excellent video. Another annoying issue is the fact that the act of the GPU processing its object list is destructive to the list, so it must be recreated every time you use it :(

  • @randyfemrite7525
    @randyfemrite7525 Před 15 dny +2

    I remember Sam Tramiel saying "if the programmers aren't smart enough to figure out how to program for our machine we don't want them making games". That attitude didn't work too well....

  • @VPSantiago
    @VPSantiago Před 3 lety +7

    This should be a project for CS students; it would be better than my computer architecture classes from two decades back.

    • @ZygalStudios
      @ZygalStudios  Před 3 lety +2

      Funny, because my degree is in Electrical Engineering:)
      CS students don't typically like hardware stuff.

    • @Z3llix
      @Z3llix Před 3 lety +1

      @@ZygalStudios Good point. This is the type of stuff we looked at in an embedded systems MSc, and that was hosted by the department of engineering. I took a CS BSc at the same University and nothing like this was covered. Excellent video btw!

  • @thefurthestmanfromhome1148

    In terms of the 68000 being used as the Manager chip..
    The 68020 and 68030 chips were both considered for that role at one point, but dropped due to how much they would push up the manufacturing costs.

    • @PlasticCogLiquid
      @PlasticCogLiquid Před rokem +5

      I remember reading somewhere, a programmer said "IF ONLY THEY ADDED 1 MORE REGISTER!!" then everything would've been so much better. :P

    • @MaxAbramson3
      @MaxAbramson3 Před rokem +3

      That's tragic, because the 68EC020/030 chips were only about $5-10 at that point and could've made the development of games much easier.

    • @ArneChristianRosenfeldt
      @ArneChristianRosenfeldt Před rokem +1

      @@MaxAbramson3 don't they even have instruction cache? Together with the large register file on a 68k this would allow a lot of code to run without congesting the system bus or eating into JRISC cache.

    • @MaxAbramson3
      @MaxAbramson3 Před rokem +1

      @@ArneChristianRosenfeldt The 68EC030 has a 256-byte ICache and a 256-byte DCache. While those only manage a 50% hit rate on benchmarks, they can manage about 80% on gaming code that's written to take advantage of them.

    • @ArneChristianRosenfeldt
      @ArneChristianRosenfeldt Před 11 měsíci

      @@MaxAbramson3 From an overall perspective, I would invest in the gaming code running on JRISC and the rest running on the 68k. Still, a 50% hit rate means that the 68k can run 2 instructions before it has to wait for the bus, and with 32-bit it is twice as fast off the bus again.
      Still, I would rearrange the main board more like this: Tom in the center of the memory (like PS4). Small cartridge slot and associated area. Point2Point DDR connection to Jerry. And uh, a 6502 core inside Jerry as the friendly face. PCB would look as clean as the N64. I really don't see a path forward with a discrete 68k. 50% hit rate really isn't great, considering that Doom manages the JRISC cache so much better.
      I would rather invest in a package to move the color conversion off Tom... uh, but it is too small to justify a whole custom chip. Not like in the VGA days with the palette, or the character ROM in MDA.

  • @Ayrshore
    @Ayrshore Před 2 lety +4

    Cool. I've got to get myself one of these Atari Jagwires to go with my Atari Jaguar. Good video. I knew these were awkward - just not how awkward. I had friends who worked on some cool stuff on the 8-bit Ataris; there were some good tricks with those to improve things, but it seems you needed something like those tricks just to make it work like it's supposed to. I had a friend import one of the very first ones (so early we had to solder the RGB cables onto the edge connector, we couldn't get a cable here) from the US to Scotland, and although it had a few great games, overall it was a huge disappointment.

  • @douglas9265
    @douglas9265 Před 2 měsíci

    That was a great video, I have a Jag and love to hear about the hardware and history of it. Right now I ordered an SCable and found an old flat screen TV. I am pumped about that and the fact that I can buy new games that run better, have save options and improve frames. Also, they have a lot of options of either buying a new controller or getting one custom made. All good things!

  • @RISCGames
    @RISCGames Před 3 lety +19

    P.S. - when are you programming your first Jaguar game? ;-)

    • @ZygalStudios
      @ZygalStudios  Před 3 lety +13

      Haha, that's a good one ;) maybe I will take a look sometime in the future

    • @Tolbat
      @Tolbat Před 2 lety

      @@ZygalStudios reach out to songbird productions for help

    • @tutoriais5266
      @tutoriais5266 Před 2 lety

      @@Tolbat Jaguar freaking sucks bro

  • @sshrugg
    @sshrugg Před 2 lety +1

    What a neat series!

  • @RISCGames
    @RISCGames Před 3 lety +6

    Black Ice/White Noise held the highest promise IMO. Could have been groundbreaking for its time, GTA3 before GTA3.

  • @1337Shockwav3
    @1337Shockwav3 Před 2 měsíci

    Love how the intro acknowledges that the Jaguar had good hardware but really was hell for developers from different angles. As a hardware enthusiast I've had to learn the hard way how important it is to properly document your stuff and make it as accessible as possible if you want people to use it.

  • @SonicBoone56
    @SonicBoone56 Před 9 měsíci +2

    So in summary:
    Jaguar: Hardware bugs requiring wait states and screwing up regular commands
    Saturn: Too many processor subsystems with no easy way to access them all at once
    Both: Terrible first party development tools and documentation.

  • @clover831
    @clover831 Před 6 měsíci +1

    I would like to see a video on what you think about the canceled Panasonic M2 video game console. I really enjoyed this video, and the one about the Sega Saturn.👍

  • @franesustic988
    @franesustic988 Před 3 lety +10

    Amazing and in-depth! Great job. N64 had bottlenecks, ps3 was arcane, but this one is straight up bugged haha. No wonder it was left in the dust.
    ps. weird pronunciation, Jaguayer

    • @ZygalStudios
      @ZygalStudios  Před 3 lety +7

      Thank you!!!
      Should be 'Jag-U-Are' I suppose.

    • @robertfoxworthy5503
      @robertfoxworthy5503 Před 3 lety +1

      It's the North American pronunciation.

    • @Ayrshore
      @Ayrshore Před 2 lety

      @@robertfoxworthy5503 so, the wrong one then.

    • @robertfoxworthy5503
      @robertfoxworthy5503 Před 2 lety +1

      @@Ayrshore The root word is from a South American language.

    • @Ayrshore
      @Ayrshore Před 2 lety +1

      @@robertfoxworthy5503 Time the yanks learned to speak English.

  • @BubblegumCrash332
    @BubblegumCrash332 Před 3 lety +17

    Great video, I love this series!! All those great chips, and devs just went for the familiar 68000; I guess that was Atari's fault. It explains why so many Jaguar games didn't look 64-bit, hell, not even 32-bit.

    • @ZygalStudios
      @ZygalStudios  Před 3 lety +10

      Thank you! And yes, exactly!
      The architecture was phenomenal, but the manpower behind the design was small and the time was short. With more developers on the system, no doubt it would have done better. And yes, it definitely had an effect on the games. :)

    • @jerrycox9618
      @jerrycox9618 Před 3 lety +3

      I read somewhere back in the day the 68k was chosen to give developers a warm feeling when taking it on. I still love my Jaguar.

    • @Corsa15DT
      @Corsa15DT Před 3 lety +1

      Like the Falcon, the Jaguar hasn't fulfilled its destiny. Hell, even the A1200 didn't show anything new or better than the A500. Developers probably jumped ship to PC.

    • @so-vp2qh
      @so-vp2qh Před rokem

      @@Corsa15DT Accelerated A1200s could run Doom.

    • @Corsa15DT
      @Corsa15DT Před rokem

      @@so-vp2qh if by accelerated you mean some vampire, then yes

  • @thefurthestmanfromhome1148

    Credit has to go to the developers who did use the hardware wisely and discovered workarounds to get the best from it.
    Eclipse, for discovering the small area of memory that could be used as a texture source at the same speed as flat and Gouraud shading (twice as fast as normal textures).
    ATD, for using the GPU to get a better framerate and draw distance in Battlemorph over Cybermorph, and for ignoring Leonard Tramiel's demands to fully texture-map the game.
    Rebellion, for Skyhammer.
    Ubisoft, for Rayman and showcasing the machine's 2D abilities.

  • @crazyivan030983
    @crazyivan030983 Před rokem +1

    One bug here, one bug there, one bug for everyone everywhere... Bugmambo no 6 :D

  • @guaposneeze
    @guaposneeze Před 3 lety +12

    This is probably the deepest dive video on the Jaguar anywhere on the Internet. I've always been curious about that beast, so thanks for that.
    That said, I feel like you are putting a lot of weight on pipeline stalls and related issues. Anybody familiar with MIPS would have been pretty well versed in needing to throw in no-ops and arrange instructions to be pipeline friendly. And pretty much every gamedev in that era had to be somewhat familiar with MIPS, given it was used in the PS1 and N64. The odd mishmash of ISAs and bus-sharing behavior was probably a much bigger issue. I feel like if you do a followup video, you should try to actually write some simple software for it. By the time you get all the various dev tools set up to do a build, you'll be weeping for something like an N64.

    • @ZygalStudios
      @ZygalStudios  Před 3 lety +3

      Thanks for the in-depth comment! I appreciate it! So the 3DO and Jag were some of the first consoles to use RISC architectures. The high instruction throughput and scheduling issues were fairly new in the software development world at the time. IIRC, the Jaguar line was one of the first game-console designs to use a MIPS processor, though only in a later revision of the hardware; there were no MIPS processors on the original board. So that was definitely a challenge. But yes, I do agree with you. The largest challenge was using the hardware to the fullest extent, as with most of these consoles. And yes, as I mentioned, no real-time debugger. But the PS1 and N64 were much different: more refined, mature, and more efficient. They also had much more in the way of R&D resources, and Nintendo had the best 3D hardware company in the world on their side, so those environments were much more mature and developer friendly. The Jaguar was not, and that's definitely something to consider. Excellent points!

    • @jc_dogen
      @jc_dogen Před 2 lety

      A series where James tries to write simple demos for all the systems he talks about would be Kino 👌

  • @axlcastellanoskuwae6733

    Excellent video!!

  • @davidt3563
    @davidt3563 Před 3 lety

    Awesome video. Also I bet you could do an awesome impression of Robin from Teen Titans/GO.

    • @ZygalStudios
      @ZygalStudios  Před 3 lety

      Thank you!!
      Haven't heard that one before 🤣🤣 gonna have to look this up

  • @JeffersonHumber
    @JeffersonHumber Před 3 lety +2

    Great video, I remember this console well from my youth. Question: could the hardware bugs have been corrected with a BIOS patch?

    • @ZygalStudios
      @ZygalStudios  Před 3 lety +13

      Hey Jefferson,
      So unfortunately no. The hardware itself had faults in the processor design and the UART hardware did too.
      This is cast into the dies and cannot be changed unless you physically rework and redesign the board.

  • @thegardner80
    @thegardner80 Před 3 lety +11

    I remember hearing at the time that if not for these hardware bugs the Jaguar would've been much more powerful. Someone (Minter?) said that one of the bugs caused texture mapping to take 11x as long as a plain polygon, for example. Even as it was, it was a worthy competitor for the 3DO, and not that much worse than the Saturn.

    • @thefurthestmanfromhome1148
      @thefurthestmanfromhome1148 Před 2 lety +5

      You might be thinking of Martin Brownlaw, coder of Missile Command 3D, not Jeff Minter.
      John Carmack also commented on how slow texture mapping was on Jaguar

  • @starcatdevocean19
    @starcatdevocean19 Před 3 lety +15

    Another thing worth mentioning is that the object processor needs a new object list every frame (50/60 Hz). So unless you buffer the list, fix the old one or create a new one, you end up with a corrupted list which will likely crash the OP. All of that requires other processors (68k, Tom or Jerry) to fix up a list. That CPU time could have been used otherwise. Also depending on how your object list is organized, performance can vary quite a bit. Not only the GPU/DSP have bugs. The blitter can be tricky to use. You need a processor to tell it what to do as well. There are so many other fun things to look out for. :) That being said, many consoles at the time had their peculiarities.

    • @ZygalStudios
      @ZygalStudios  Před 3 lety +2

      Yikes, sounds like a fun challenge though! :)

    • @korsibat
      @korsibat Před rokem +1

      The object list had to be rebuilt every frame because the OP modified parts of it when processing objects. Chances are it would not corrupt the list, but the object's Data and Height fields would get updated - so next time around it would be 0 pixels high and thus there would be nothing to display.
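
      To make that concrete, here is a conceptual C sketch of the per-frame fix-up: keep a pristine master copy of each bitmap object and, in the vertical-blank handler, write the consumed Data and Height fields back into the list the Object Processor walks. The struct is a simplified stand-in, not the real packed 64-bit phrase layout.

      ```c
      #include <stdint.h>

      #define MAX_OBJECTS 32

      typedef struct {              /* illustrative fields only */
          uint32_t data;            /* address of the next line of pixel data */
          uint16_t height;          /* lines remaining; the OP counts this down to 0 */
          uint16_t ypos;
          uint32_t link;            /* address of the next object in the list */
      } bitmap_object_t;

      static bitmap_object_t live_list[MAX_OBJECTS];   /* what the OP actually walks */
      static bitmap_object_t master_list[MAX_OBJECTS]; /* pristine copy, never consumed */
      static int num_objects;

      /* Called from the vertical-blank interrupt, before the OP starts the next frame. */
      void refresh_object_list(void)
      {
          for (int i = 0; i < num_objects; i++) {
              /* Restore the fields the OP consumed while drawing the last frame;
               * without this, height stays 0 and nothing gets displayed. */
              live_list[i].data   = master_list[i].data;
              live_list[i].height = master_list[i].height;
          }
      }
      ```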

  • @sulrich70
    @sulrich70 Před 10 měsíci

    Good work

  • @SoulforSale
    @SoulforSale Před 3 lety +5

    Where did you learn to fly?

  • @ALM1GHTY.PEANUT
    @ALM1GHTY.PEANUT Před měsícem

    Great video! Background music was a bit too loud but awesome content!

  • @Waccoon
    @Waccoon Před 3 lety +16

    To be fair, many systems of the early 90's were grappling with the concept of superscalar pipelining, and the Jaguar just had more problems than average. It was fashionable for companies to attempt an in-house RISC CPU back then, rather than license a well-tested core. In terms of system architecture, the Jaguar was actually quite nice, and the N64 and Saturn were both far more difficult to program.
    Also, *please* turn down the music, or at least use something less melodramatic.

    • @crazedlunatic43
      @crazedlunatic43 Před rokem +1

      Unlike those systems, the Jaguar's main chipset was held back by bugs. If Atari had left its development tool set and chipset in the oven a little longer - say, delaying the launch to 1994 - that would've been preferable to an early launch with half-baked tools and hardware designs.

  • @oxcellent
    @oxcellent Před 3 lety +10

    This is easily one of the best Jaguar-related videos I've seen on CZcams to date. Thank you so much for all the insights and facts rather than the usual folklore stories out there. Followed

    • @ZygalStudios
      @ZygalStudios  Před 3 lety +2

      Thanks for the feedback!
      No problem, it was a good time learning about it.
      I don't like engineering by storytelling, so being able to review this documentation, I prepared myself as if I was actually going to be writing a production game on it before I made this video.
      Comments like this are why I'm continuing to make video game hardware architecture videos, so thank you! No folklore here :)

  • @hugoace1
    @hugoace1 Před 2 lety +8

    Dumb non-programmer question: Instead of workarounds, what would it take to actually FIX these bugs? A redesign and manufacture of chips? I don't know anything about programming but it just sounds like no one will ever be able to use the hardware's full potential.

    • @ZygalStudios
      @ZygalStudios  Před 2 lety +4

      Not a dumb question at all!
      Most of these bugs would require new hardware spins, which is definitely not ideal from a cost standpoint. This also means in some cases that software developed on old chips is not compatible with the new hardware either.

    • @thefurthestmanfromhome1148
      @thefurthestmanfromhome1148 Před 2 lety +5

      @@ZygalStudios at the time, Rob Nicholoson of Handmade Software said the Jaguar chipset needed a further 2 revisions to fix the bugs alone.

    • @FantasyVisuals
      @FantasyVisuals Před 2 lety +2

      The Jaguar GPU had 4KB of internal zero-wait-state RAM; due to a bug, you had to write everything in 4KB chunks. They could not change it to 64KB of normal RAM as the design was complete.
      The system had countless design faults.

  • @tangreen7267
    @tangreen7267 Před 2 lety +1

    Holy Crap ! I had no idea this system was so complicated !

  • @zorromagico4534
    @zorromagico4534 Před 2 lety +2

    It's funny how everyone is focusing on the way you say 'Jaguar', but I appreciate the entire video.

    • @ZygalStudios
      @ZygalStudios  Před 2 lety

      Lol hey it's fair game 🤣
      Thank you for the support!

  • @KarlHamilton
    @KarlHamilton Před 2 lety

    Excellent video, that really was a deep dive! Tell me, are these kind of flaws/bugs present in other systems too or is the Jaguar just bad for these? Cheers!

    • @SianaGearz
      @SianaGearz Před 2 lety +2

      There is no system without 'errata', hardware bugs that have been documented and just remain unfixed. Every PC has several thousand of these bugs, with corresponding workarounds in the system software and drivers - the workaround doesn't mean the hardware needs to be detected explicitly, nor do the workarounds usually limit performance, but the errata impose a particular order of doing things. On modern systems, these plentiful errata are low severity and are well hidden by the thicc system software, so on the application side, you don't need to be aware of them.
      PS2 has a few notorious errata, like WHO IN THEIR RIGHT MIND designed the interlacing flicker filter as 1:1 instead of 1:2:1? This is just so useless, makes for a jumpy and blurry image, so games didn't even bother using it. There is something wrong with edge rule on PS1 and MIP selection rules on many systems, and you can trip out the graphics chips in most systems a little bit when approaching numerical limits even when at a glance it looks like you might have some latitude. Odd little bits of weirdness. And of course peripheral device weirdness is pretty much expected, but nothing particularly detrimental, like there is a way of doing what you need to do, and robustly.
      However there's exactly a reason nobody, and i mean absolutely nobody, in their right mind, designs the high complexity main programmable processors themselves if they can help it. License existing logic. Leave it to a company who has a lineup of CPUs or DSPs already going and have them make at most minor alterations. Like one of the main things you do is assess risks, and try to make utter engineering fiasco as unlikely as possible. Jaguar went with self-designed chips which are de facto the main processors of the system.
      Jaguar brings the bugginess to a completely unprecedented and absurd level. It's right on the knife's edge where if it was just a tiny bit buggier, it couldn't have booted at all, and honestly no other company, not even Atari in a better functioning state, could have deemed the Jag even as it stands now releasable. You don't generally find errata which make the system borderline useless or eat up pretty much all of its performance. Of course design fitness varies - many systems don't reach as much performance in real uses as they seem to on paper, but then that's just generally due to almost inevitable compromises or the fundamental uncertainty as to what the actual usage of the system will be like.
      Microcontrollers, CPUs and DSPs that you can buy for your product or project actually all come with errata sheets in the documentation, and even comparatively simple ones have quite extensive ones, and you really need to consider errata in the system design, as some of them can be limiting as to how you design the system, some of them can actually hurt. And their designers actually know what they're doing and they have iterations upon iterations under the belt, so to an extent this is inevitable. But it's a million times better you design against errata that you know, than against ones you don't.

  • @Corsa15DT
    @Corsa15DT Před 3 lety +7

    Nobody knows the 68000 from the Mac; it is known from the Amiga and the Atari.

    • @watchm4ker
      @watchm4ker Před 3 lety +3

      And the Mega Drive / Genesis

    • @Corsa15DT
      @Corsa15DT Před 3 lety

      @@watchm4ker oh yes, many console used it and almost all of the arcade machines, but I was referring to home computers only.

    • @watchm4ker
      @watchm4ker Před 3 lety

      @@Corsa15DT Oh. Sharp X68000

    • @Corsa15DT
      @Corsa15DT Před 3 lety

      @@watchm4ker yes, the X68000 is a great machine, but it was not a commercial success, very few sold...

    • @watchm4ker
      @watchm4ker Před 3 lety

      @@Corsa15DT Not massively successful, no, since it never left Japan, but it lasted until 93 and had numerous models made.

  •  Před 3 lety +6

    Makes you wonder why they stuck a 68k in there instead of a PowerPC or a MIPS like the Cojag arcade board used

    • @ZygalStudios
      @ZygalStudios  Před 3 lety +3

      My guess is cost. One of the major selling points when the project was pitched to Sega was the cost. The other two processors (Tom and Jerry) were also Motorola chips. If they purchased an older 68000 along with the other two custom chips, they'd be able to save a large amount of cost due to volume. Plus, the 68k was a pretty well-known system at the time, so they'd be able to market it that way to developers.

    • @maroon9273
      @maroon9273 Před 3 lety +2

      @@ZygalStudios The 68020 would have been better and would have worked better with the Jag coprocessors than the 68000 did.

    • @erikkarsies4851
      @erikkarsies4851 Před 3 lety

      @@maroon9273 Motorola wasn't keen on mass-producing the 68020 or 68030 at the cheap price needed for the Jaguar, or even for other Atari desktop computers. That would have ruined their lucrative workstation and server markets. I don't know if they even could produce it that cheaply, but those CPUs would have made more possible for the Jaguar and would have held the custom chips back less. Hell... it was designed for the 68020. I still get annoyed by people calling it the general-purpose processor. It was the central processing unit, and it limited so many things: the total amount of memory in the cartridge, the complexity of the development tools (because of the lack of memory management), and efficient control of the custom chips, to name a few.
      Basically, at that time Atari no longer had the credibility to get big companies like Motorola really backing them up, nor the capital to do stuff without them.

    • @maroon9273
      @maroon9273 Před 3 lety

      @@erikkarsies4851 The lowest-cost 68020 was the EC, which is a cut-down version of the CPU. I don't know how much it cost during the development of the Jaguar hardware.

    • @thefurthestmanfromhome1148
      @thefurthestmanfromhome1148 Před 2 lety

      @@ZygalStudios cost and as John Maitheson explained, they wanted to put something in there that people were familiar with.
      Atari were keen to use a 68K family device, and we looked closely at various members. We did actually build a couple of 68030 versions of the early beta developers systems, and for a while were going to use a 68020. However, this turned out too expensive. We also considered the possibility of no [Motorola 680x0 chip] at all. I always felt it was important to have some normal processor, to give developers a warm feeling when they start.
      The 68K is inexpensive and does that job well.
      - John Mathieson

  • @HeathenDeluxe
    @HeathenDeluxe Před 2 lety

    Good content

  • @Tolbat
    @Tolbat Před 2 lety +2

    This is the only console whose rights were released to the public; no one has taken advantage of this.

  • @inigomontoya4198
    @inigomontoya4198 Před 2 lety

    Cool video

  • @linkovitch
    @linkovitch Před 3 lety +5

    I think you have misunderstood the JUMP recommendations.. you certainly don't have to prefix a JUMP with NOPs, and you don't HAVE to postfix them with NOPs either, but you do need to be aware of them with the pipelining. As the instruction after the jump will be loaded into the pipeline before the jump is executed (hence the suggestion of using a NOP after), but there is nothing to stop you adding an instruction immediately after a JUMP as long as it is contained within the 16bits of an instruction (no MOVEI's), and a few other gotchas (Indexed offsets go boom :D ).
    I use instructions after a JUMP quite a lot in my code on the Jag and it works without issue. Obviously these are jumps in SRAM; I don't bother with any of the GPU-in-main craziness, life's too short.

    • @ZygalStudios
      @ZygalStudios  Před 3 lety +2

      Now remind me, is this the jump recommendations I reference around 13:50?
      Or is it somewhere else? Either way thank you for pointing that out and commenting! :)

    • @ZygalStudios
      @ZygalStudios  Před 3 lety +1

      Just kidding I see it at 12:19!
      You're right! Thank you. Just an effect of pipeline systems! :D

    • @linkovitch
      @linkovitch Před 3 lety +2

      @@ZygalStudios Don't know if you came across it or not, but it is touched on in the Tech Manual (there are at least 3 versions of that, each with its own extras and corrections; the one dated 1995 I think is the most complete). The "Jaguar" part of the system is just Tom & Jerry with an additional generic general-purpose CPU. In the arcade versions of the system, the 68K is replaced with either a 68030 or an R3K or R4K, as well as an increased ROM size and the DSP bus connection being full 32-bit.

    • @ZygalStudios
      @ZygalStudios  Před 3 lety

      @@linkovitch Did not know about the different hardware in the cabs! That's neat they used a MIPS CPU :) smart move on that one and actual 64-bit

    • @RayR
      @RayR Před 2 lety +1

      @@linkovitch the Cojag cpu hardware is a 68EC020 or R3K.. no R4K.

  • @mcopanzan
    @mcopanzan Před 2 lety

    Looks like the Jag went from a headache to a full-blown migraine.

  • @zbdot73
    @zbdot73 Před rokem +2

    It would be really interesting if a programmable logic array were used to model the Jaguar's chip(s), but with most of these bugs worked out, to see how fast this chipset could have been at the released MHz rating - essentially making the one more revision of the silicon that was evidently needed. Also, was this rush to market driven by marketing or by the dollars running out for the company?

    • @vidjenko8349
      @vidjenko8349 Před 8 měsíci +1

      Cool idea and yes. They pulled it out of the oven too soon due to Atari rapidly bleeding money at the time.

  • @theannoyedmrfloyd3998
    @theannoyedmrfloyd3998 Před 3 lety +7

    The Jag Gwar was hard to develop for because most developers used an IBM PC-based SDK. If you really wanted to tap its true potential, you needed an Atari TT030 computer.
    It's not pronounced 'Jag Gwire.'

    • @ZygalStudios
      @ZygalStudios  Před 3 lety +6

      Did the JAG-U-ARE have an SDK for the Atari TT030?
      If not, this would appear to make the task of development more difficult.
      The SDK/Environment cannot mask the system architecture bottlenecks and hardware bugs, which, overwhelmingly, were the main issues with it.

    • @Sinn0100
      @Sinn0100 Před 2 lety +1

      You know words are pronounced differently across cultures right? What he said was absolutely correct. It is pronounced 'Jag Gwire' here in the states. Just as we wouldn't be caught dead saying something as ridiculous as "Zedbra." Thank you and have a pleasant week.

    • @polytrashed
      @polytrashed Před rokem +1

      @@Sinn0100 Uhhh, not in any of the states I’ve ever been in, lmao 😂

    • @Sinn0100
      @Sinn0100 Před rokem

      @@polytrashed
      I've been to every state in the Union and not one time have I heard anyone pronounce Jaguar like British people. If you want to hear the most common way a word is spoken look at Television or film as they tend to match the most popular or frequently used dialect...especially commercials advertising a product. One example of this is the Atari Jaguar commercials that ran from 1993-1996. I encourage you to look it up as it will show you the correct pronunciation here in the states.
      Before you tell me the UK this or that...remember, every country has its way of doing things. We might all predominantly speak "English" but it is not the same as "English" in the UK. Often times their words have different meanings than ours do and spelling can be radically different. Just look at the word color and colour. Both mean the same thing but are spelled completely different. This phenomenon is very prevalent in Spanish speaking countries as well. Spanish in Spain is very different than Spanish from Puerto Rico.

    • @polytrashed
      @polytrashed Před rokem +3

      @@Sinn0100 What? I think there’s been a misunderstanding; everywhere I’ve been it’s been pronounced “jag-gwar” (similar to “bar”), while the stereotypical English version is the three-syllable variant “jag-yew-are”… but never before today have I heard it pronounced it as “jag-wire” lol

  • @timothypeters7160
    @timothypeters7160 Před 3 lety +2

    Love your channel...please do the atomiswave...or the neo geo pocket...thx

    • @ZygalStudios
      @ZygalStudios  Před 3 lety +1

      Thank you!!!
      Planning on doing both :)
      Thanks for stopping by!

  • @MrChuckGrape
    @MrChuckGrape Před 3 lety

    Angry dachshund? Now you're speaking my language!

  • @crazedlunatic43
    @crazedlunatic43 Před rokem +2

    A delayed launch to ensure that the chipset's hardware bugs had been fixed, with Atari providing proper development tools, would've been preferable to pushing the system out for a '93 launch.

    • @thefurthestmanfromhome1148
      @thefurthestmanfromhome1148 Před rokem

      Rob Nicholson of Handmade Software, said the chipsets needed another 2 revisions, just to get the bugs out.

    • @watchm4ker
      @watchm4ker Před rokem +3

      Atari didn't have the money. Any revisions to the hardware would have meant cooking up more silicon, which was millions upon millions of dollars per revision, at a time when Atari Games hadn't seen success since the Crash.

    • @ArneChristianRosenfeldt
      @ArneChristianRosenfeldt Před rokem

      show me any bug which does not need more transistors ( and potentially more bugs ) to solve. Feature creep is the problem. All those special JRISC instructions for JPEG and mp3! Multilevel interrupts on all 3 processors! Two audio DACs ( why not stick to external from the start ? ). Complicated backwards compatibility to Atari controllers instead of simpler Japanese design. Vertical scaling should never have been done by the OP. Why can the blitter scan for values, but at an unusable slow pace? Why do we have OP and blitter? Really, the Jaguar needs one simple 64 bit memcopy unit with access to the bus and then one pixel pusher with private memory. Bringing a 16 or 32 bit framebuffer to the screen is just a matter of memcopy. Don't confuse it with sprite or palette or scaling or shadows, all of which are too slow for the memory bandwidth. The OP already has its own (private) linebuffer and palette. Just unify it with the blitter . Who ever needs this exploding pixel mode of the blitter? Why have to relearn everything when going from 2d to 3d? Why 32bit cartridge if the clock rate is too low and the bits too expensive for live data anyway? Who needs more than a pi/2 cos table in DSP ROM?

  • @Jolly-Green-Steve
    @Jolly-Green-Steve Před 5 měsíci

    Trying to make a game for this thing seems a lot like Sully following you in the parking garage. From here it looks like a nightmare.

  • @A31Chris
    @A31Chris Před 3 lety +3

    There was always debate about whether the Jaguar should be considered a 64-bit system. It's been argued, and pointed at, and laughed about, et cetera. But someone posted a thought on it during one of the discussions that many consider a very reasonable assessment:
    'I am in agreement with this much. To me, if the ALU doesn't do 64-bit processing internally, and the registers aren't 64 bits, the chip isn't a 64-bit chip. I also happen to think your earlier interpretation of Mathieson's design intent of the 64-bit capabilities is quite reasonable.
    But the Jaguar is unusual precisely because of its hybrid nature.
    It clearly has some 64-bit read/write capabilities. Regardless of whether it's a "parlor trick" to get more instruction bandwidth out of cheaper RAM, it's reading and writing at 64 bits from the system bus. It has instructions to do so, and as Gorf proved, the Blitter/OPL have functions that effectively operate on 64-bit memory chunks as a "true" 64-bit processor would, despite the reality of the internals. Because there are multiple components of the same system dealing with data this way, I don't believe it's unreasonable to call it part of a 64-bit *architecture*.
    As for which one I agree with more, well damn...I never was that good at math 😛'

    • @thefurthestmanfromhome1148
      @thefurthestmanfromhome1148 Před 2 lety +2

      Trip Hawkins at the time kicked off about the 64-bit marketing by Atari, saying the only 64-bit aspect was the memory bus, and that the 3DO had more bandwidth, a 32-bit CPU, and custom chips better at producing sound in-game than the Jaguar..
      Atari's Bill Rehbock fired back, with a lot of technical breakdown of the Jaguar and closed with...
      Trip spent a year promising the world a Holodeck with the 3DO 😂

    • @KarlHamilton
      @KarlHamilton Před 2 lety +1

      @@thefurthestmanfromhome1148 what a burn hahahah

    • @Optimus6128
      @Optimus6128 Před 2 lety +2

      It was all marketing at the time. Back then, a 16-bit console would usually look quite a bit better than an 8-bit one. Then the trend continued and some 32-bit consoles had early 3D graphics, so people thought 64-bit meant the next generation of graphics/technology. But it's moot. The N64 has a 64-bit CPU, but so does my modern PC that can run Unreal Engine 5. Not a good metric.
      And even back then, you could have a system with, say, a 32-bit CPU and no custom chips to render even the basic graphics, let alone 3D. If the CPU was fast enough you could software-render everything in time (assuming the video bandwidth didn't restrict you, like in old PCs with slow ISA cards). And then a 16-bit system with crazy custom chips could render amazing things even if the CPU was slow. So the CPU bits alone don't tell you much. Now, the Jaguar is a hybrid approach: the 68000 might be a slow 16/32-bit hybrid (some people consider it 16/32 because it had 32-bit registers but a 16-bit bus), but the other 32-bit chips are also programmable, and the bandwidth between two of the graphics-related chips was 64-bit. That's for faster-throughput communication of data between the object processor and the blitter.
      I would need to program this console to know more specifically, but from what I hear from other Jaguar programmers, to draw triangles you would typically have to program one processor to prepare scanline information and send it to the blitter. That means extra work that wasn't handled by the hardware: sorting the 3 triangle points from top to bottom, interpolating along the edges of the triangle, and generating lists of the type "from X0 to X1, interpolate C0 to C1" (or Z0 to Z1 too, as the blitter also supported Z-buffering). So someone had to do the work of converting a triangle into its list of edges with information for the blitter. That's half the job of a software polygon renderer. Looking a bit at the manuals, I also don't see any mention of texture-mapping coordinates or sampling from a texture on the blitter page. A Jaguar programmer could confirm, but that tells me that for texture mapping, the processor that prepared data for the blitter possibly also had to interpolate the scanline texture data in software and send it to the blitter to write sequentially. I know there are hardware sprites and scaling, so for the case of Doom, where the scanlines in the wall columns are always vertical, this additional texture interpolation wouldn't be necessary; just stretch them as Nx1 vertical sprites. But for a fully texture-mapped polygon, maybe things weren't easy.
      And here is the difference: on systems like the 3DO it's in the hardware, and with a few lines of code you have a texture-mapped quad. On the Jaguar, coders had to write their own triangle rasterizer (with some of the work taken over by the blitter, but maybe only the Gouraud/Z-buffer interpolation, not necessarily the texture interpolation, if I'm correct). That's possibly why most Jaguar games were flat or Gouraud shaded, fewer did texture mapping in real 3D scenes (as opposed to Doom or AVP, which are simpler vertical-column bitmap stretching), and a lot of them were slow even then. The Jaguar is more flexible, though; one could write more interesting things. But when a system has the full functionality for such things in hardware, it ends up being easier to just use it (most programmers will show some decent early 3D without much effort), and it's usually faster than you could ever manage on the programmable CPUs alone.
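
      As an illustration of the span set-up work described above, here is a small C sketch: sort the triangle's vertices by Y, walk the edges in fixed point, and emit one "fill from X0 to X1 on line Y" span per scanline. blit_span() is a stand-in that just prints each span; real code would program the blitter registers instead, which this sketch does not attempt.

      ```c
      #include <stdint.h>
      #include <stdio.h>

      typedef struct { int x, y; } vertex_t;

      /* Stand-in for the blitter: in real code this would set the blitter's
       * address and count registers; here it just reports each span. */
      static void blit_span(int y, int x0, int x1)
      {
          printf("line %3d: fill x=%d..%d\n", y, x0, x1);
      }

      static void swap_v(vertex_t *a, vertex_t *b) { vertex_t t = *a; *a = *b; *b = t; }

      /* Flat-shaded triangle: generate left/right X for every scanline it covers. */
      static void draw_triangle(vertex_t v0, vertex_t v1, vertex_t v2)
      {
          /* Sort so v0 is topmost and v2 is bottommost. */
          if (v1.y < v0.y) swap_v(&v0, &v1);
          if (v2.y < v0.y) swap_v(&v0, &v2);
          if (v2.y < v1.y) swap_v(&v1, &v2);
          if (v0.y == v2.y) return;                      /* degenerate: zero height */

          /* 16.16 fixed-point X along the long edge (v0 -> v2). */
          int32_t x_long  = v0.x * 65536;
          int32_t dx_long = (v2.x - v0.x) * 65536 / (v2.y - v0.y);

          for (int half = 0; half < 2; half++) {         /* upper half, then lower half */
              vertex_t a = half ? v1 : v0;
              vertex_t b = half ? v2 : v1;
              if (a.y == b.y) continue;
              int32_t x_short  = a.x * 65536;
              int32_t dx_short = (b.x - a.x) * 65536 / (b.y - a.y);

              for (int y = a.y; y < b.y; y++) {
                  int xl = x_long >> 16, xs = x_short >> 16;
                  blit_span(y, xl < xs ? xl : xs, xl < xs ? xs : xl);
                  x_long  += dx_long;
                  x_short += dx_short;
              }
          }
      }

      int main(void)
      {
          vertex_t a = { 10, 5 }, b = { 120, 40 }, c = { 60, 90 };
          draw_triangle(a, b, c);   /* prints one span per scanline from y=5 to y=89 */
          return 0;
      }
      ```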

  • @ravingrays248
    @ravingrays248 Před 4 měsíci

    To quote Scotty from Star Trek III: "the more they overthink the plumbing, the easier it is to stuff up the drain".

  • @A31Chris
    @A31Chris Před 3 lety +4

    Here is a link to programmer startup info, including a GCC for the GPU, along with the story of an LRU caching system High Voltage Software developed and their success using it and the RISC GCC.
    Tips and tricks learned through the years, etc. Might be interesting reading.
    www.3do.cdinteractive.co.uk/viewforum.php?f=35&sid=8d6d2048716aa3a38c455547b6ac65f7

  • @lelsewherelelsewhere9435
    @lelsewherelelsewhere9435 Před 2 lety +1

    14:28 I wonder if this is a flaw in the divide scoreboarding itself, where maybe the divide's access isn't considered a read by the scoreboard, so the first divide isn't scoreboarded (or not at all) and a second divide can be issued falsely.
    That would also make the previous bug, of the divide needing a dummy wait, make sense: the divide isn't seen as doing a read by the scoreboard and so is ignored by it.
    If the divide takes a long time between reading its operands and producing a result, the scoreboard may only see some weird, intentionally placed null wait (to account for the long divide time, the divide may include a built-in dummy wait). But a "dummy wait that ends with the divide result being stored" doesn't contain a read and is treated as a dummy, so the scoreboard treats the end of the divide, and therefore the whole divide, differently and ignores it.
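
    To make the hazard being described easier to follow, here is a toy C model (not a description of the real JRISC pipeline): a scoreboard bit per register, a divide with long latency, and a consumer that should stall on that bit. If the divide never marks its destination busy, the consumer reads the stale value.

    ```c
    #include <stdbool.h>
    #include <stdint.h>
    #include <stdio.h>

    #define DIV_LATENCY 16            /* arbitrary latency in cycles */

    typedef struct {
        uint32_t value;
        bool     busy;                /* scoreboard bit: result still in flight */
    } reg_t;

    static reg_t    regs[32];
    static int      div_cycles_left = 0;
    static int      div_dest = -1;
    static uint32_t div_result;

    /* Issue a divide; mark_busy=false models the suspected bug. */
    static void issue_div(int dest, uint32_t a, uint32_t b, bool mark_busy)
    {
        div_result      = a / b;
        div_dest        = dest;
        div_cycles_left = DIV_LATENCY;
        regs[dest].busy = mark_busy;  /* if false, a following reader is never stalled */
    }

    static void clock_cycle(void)
    {
        if (div_cycles_left > 0 && --div_cycles_left == 0) {
            regs[div_dest].value = div_result;  /* late writeback */
            regs[div_dest].busy  = false;
        }
    }

    /* A consuming instruction: stalls while the register is marked busy. */
    static uint32_t read_reg(int r)
    {
        while (regs[r].busy)
            clock_cycle();
        return regs[r].value;
    }

    int main(void)
    {
        regs[2].value = 0xDEADBEEF;                   /* stale contents of r2 */
        issue_div(2, 100, 4, false);                  /* bug: result not scoreboarded */
        printf("read right after div: %08X\n", (unsigned)read_reg(2)); /* stale value */
        for (int i = 0; i < DIV_LATENCY; i++)
            clock_cycle();                            /* the equivalent of padding with nops */
        printf("after waiting:        %08X\n", (unsigned)read_reg(2)); /* correct 25 */
        return 0;
    }
    ```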

  • @roxynano
    @roxynano Před 3 lety +14

    I do not know most of the jargon you were saying but it is still extremely interesting! Honestly I would love to watch more and see if I’m able to learn.
    I’m intrigued on Jaguar games that actually achieved the hardwares true potential.

    • @BHGMediaGroup
      @BHGMediaGroup Před 3 lety +1

      I don't know if there are any that TRULY did reach full hardware potential. One thing, though, that I find interesting is that if you look at the two games that were on every system at the time, NBA Jam TE and Doom, it's the Jaguar versions that are regarded as being the best, even better than the PS1 and Saturn versions. So who knows what could've been with the Jaguar.

    • @roxynano
      @roxynano Před 3 lety

      @@BHGMediaGroup Who knows?

    • @BHGMediaGroup
      @BHGMediaGroup Před 3 lety +2

      @@roxynano I think the one telling thing is that there's still a fairly large group of homebrew developers making new games that you can buy on AtariAge, but those also don't showcase the full power, most of them not even coming close to what was already commercially available. So after all this time, if homebrew coders can't figure out how to squeeze out the full power, it's unlikely anybody ever would have.

    • @AP-mw9oj
      @AP-mw9oj Před 2 lety +1

      I know this is a super late reply, but there wasn't much in the way of "2nd (or 3rd) generation" software for the system, which successful consoles get, with teams producing 2, 3, 4 or more games for the same system. John Carmack, who coded Wolf 3D and Doom for the Jaguar, did say that if he had done Doom again, there are several things he would have done differently and more efficiently, so it's safe to assume that had they done Doom II, it would have been better than the Doom port was.
      That all said, the best examples of what the Jaguar could do were: Iron Soldier II, BattleSphere, Wolf 3D, Super Burnout, Doom, Missile Command 3D, Zero 5, Skyhammer, BattleMorph, NBA Jam, Tempest 2000, Alien Vs. Predator, Hover Strike CD, Tube SE, and Primal Rage. There were also some unreleased games that showed off some near-PSX potential, such as BlackICE/White Noise, Phear (which actually became Tetrisphere on the N64), Phaze Zero, Native, Conan, and the AtariOwl RPG game. :)

    • @roxynano
      @roxynano Před 2 lety +1

      @@AP-mw9oj Will note! Thank you for the late reply!

  • @dmblan82
    @dmblan82 Před 3 lety +1

    Were there any games that took advantage of the Jaguar's system? Have there been any from the homebrew scene that push the system?

    • @ZygalStudios
      @ZygalStudios  Před 3 lety +4

      AVP stands out I would say per the time period and same with Missile Command 3D.
      As far as homebrew is concerned, I'm sure there have been some too!

    • @thefurthestmanfromhome1148
      @thefurthestmanfromhome1148 Před 2 lety +1

      @@ZygalStudios AVP, Iron Soldier 1 and 2, Battlesphere, Skyhammer, Rayman.. Zero 5..
      Probably the best examples of Jaguar hardware being used wisely, maybe Battlemorph as well.

  • @Tolbat
    @Tolbat Před rokem

    Zygal my friend, we are still waiting on you to make a Jaguar game sir.

  • @leonardharris9678
    @leonardharris9678 Před 11 měsíci +1

    It could still do good graphics; its other handicap was the cartridge. I saw an article talking with Rebellion Software, who did the game Alien vs Predator, which was a nice-looking game, and they said specifically that they cut back on the texture quality considerably due to having to cram it onto a cartridge.

    • @thefurthestmanfromhome1148
      @thefurthestmanfromhome1148 Před 10 měsíci

      And they had to practically beg the Tramiels to increase the cartridge size over the originally allocated size.
      A full orchestral musical score was intended for the title, but no room for it on the final cartridge.

    • @ArneChristianRosenfeldt
      @ArneChristianRosenfeldt Před 9 měsíci

      @@thefurthestmanfromhome1148 With music, you mean wavetable?

  • @Darth001
    @Darth001 Před 2 lety

    Got a sub from me. Went and watched some of your other videos, and it seems the music in this video overtakes your voice at times, where in other videos it's fine. I'd like you to cover the 3DO console and/or the 32X add-on next.

    • @ZygalStudios
      @ZygalStudios  Před 2 lety +1

      Funny enough, the 32x was my last console based video a few weeks back. Stay tuned for the 3DO! :)

    • @Darth001
      @Darth001 Před 2 lety

      @@ZygalStudios Brill, I will check it out. Keep up the good work!

  • @Paul29Esx
    @Paul29Esx Před 3 lety

    Jag - you - are!

  • @SamMcDonald83
    @SamMcDonald83 Před 3 lety +5

    Extremely interesting analysis. I recall hearing on a podcast a while ago that John Carmack was a fan of the Jaguar. So much so that he coded up Doom for it in a weekend. In fact, the Jaguar port remains the only version made by id Software themselves in that era.
    I also seem to remember that Carmack flagged up the register bug you mentioned and was able to code a fix. However, supposedly, if those issues were not present, the Jaguar could have been close to a PS1 in respect to performance.

    • @Ehal256
      @Ehal256 Před 3 lety

      I'm pretty sure that Carmack also worked on the 32X port of Doom.

    • @SamMcDonald83
      @SamMcDonald83 Před 3 lety +2

      @@Ehal256 the only console port made by id themselves was the Jaguar one, which supposedly Carmack coded personally.
      The other ports were all handled by third-party companies, which is why the quality varied so much.
      Actually, over the years several programmers of these ports have come out and discussed their experiences. One of the most notorious ports was for the 3DO, and its lead programmer, Rebecca Heineman, has discussed at length the challenges she faced czcams.com/video/rBbIil2HPSU/video.html
      Similarly, the programmer of the Sega Saturn version has gone on record that his port was so bad because Carmack refused to let him use the Saturn's GPU doom.fandom.com/wiki/Sega Saturn
      Finally, on a more positive note, the programmer of the SNES version, who managed to rewrite the game in assembler, has discussed his more pleasant experience: czcams.com/video/P5PknJvplKg/video.html

    • @robertfoxworthy5503
      @robertfoxworthy5503 Před 3 lety

      They all used the Jag's source code.

    • @SamMcDonald83
      @SamMcDonald83 Před 3 lety

      @@robertfoxworthy5503 ok so what's your point?

    • @robertfoxworthy5503
      @robertfoxworthy5503 Před 3 lety +1

      @@SamMcDonald83 Nothing, just that it was the code base for all the other ports.

  • @finaltheorygames1781
    @finaltheorygames1781 Před 2 lety +1

    Just remember that nobody cares about a console if there is no software for it. The reason why people loved the NES and SNES was because those machines had great games and people loved those games. The Atari Jaguar just didn't have any super amazing games that would have allowed it to push console units. This doesn't mean that you can't develop great games for the console. It also doesn't mean that the console itself isn't any good. It just means that at the time (1993-1996) the Jaguar didn't have enough killer software apps to gain interest and to make it a success. Remember how Halo put the original Xbox on the map. The Jaguar needed some killer software and it just didn't have it. This is also why hardware is nothing without software.

  • @antayat
    @antayat Před 3 měsíci

    Great research dude! It really is too bad that Atari shipped this system with inherent bugs. It's my understanding that they rushed this thing out the door as a Hail Mary to save the company, not realizing the bugs were there.
    There is a 2600 game that builds a unique map on each play. No one has access to the precompiled data. No one knows how it was accomplished. Even the original programmer forgot the algorithm that he arrived at.
    Might be a neat puzzle to look into even though it's not really hardware related.

  • @randyfemrite7525
    @randyfemrite7525 Před 15 dny

    Alien vs. Predator and SkyHammer are excellent proof of what could have been. Apparently some programmers/developers/game designers knew what to do. Battlemorph and Hoverstrike also...did I miss anything?

  • @SerdceDanko
    @SerdceDanko Před rokem +1

    I miss these times of unique custom hardware; nowadays it's all the same.

    • @chloemarietaylor4036
      @chloemarietaylor4036 Před 3 měsíci

      I guess Atari, much like Sega with the Saturn, approached it as arcade developers, where the boards were often cutting edge, expensive and modular, i.e. separate systems joined together. It took Sony entering without this pattern of working to apply corporate minimum-viable-product thinking to it. Sega tried to make the Saturn great at absolutely everything; Sony said just make it do 3D simply. Sadly this corporate mentality has homogenised all consoles to be near identical, basically scaled-down PCs. I miss the old days where systems had a character of their own: the Saturn with its quad polygons, the PS1 with its quirky integer use, the N64 with its muddy anti-aliasing. While hardware is arguably much better these days, as you'd expect, you only have to go to the game store app on the device to realise it is 95% crap and 99% identical across the 3 big systems.

  • @paulwestphal7336
    @paulwestphal7336 Před 10 měsíci

    Please do a video on the NUON system.

  • @mylessmith8980
    @mylessmith8980 Před 2 lety +5

    The Atari Jaguar was like a system from the future way ahead of its time.

    • @VOAN
      @VOAN Před rokem

      There's really nothing ahead of its time about the Jaguar; a lot of its games are just worse versions of either PC or console games of the time. Even the Super NES and Sega Genesis had better 3D capabilities than the Jaguar. Companies need to know that games are made to be played, not just looked at. A lot of the Jaguar's intent was to make games that are good to look at regardless of how they play, and this is why not everyone wanted to buy one.

  • @atomicskull6405
    @atomicskull6405 Před rokem +1

    With some good middleware the Jaguar could have done some amazing 2D games like platformers and the like. It was never going to compete with the likes of the Saturn and PlayStation though.
    I've always thought that they should have just released the Panther in '91 to compete with the Mega Drive and SNES, tried to build up a stable of third-party developers, and put the Jaguar on the back burner. People always say "oh but the Panther doesn't compare to the Jaguar", and no, it didn't, but it still exceeded the 2D arcade PCBs of the late '80s and early '90s and was easier to write software for than the Jaguar, which was good enough.
    The mid to late 90s were a dark time for video games, comparable to the "iron age" of comics. If you showed someone beautiful pixel art vs ugly blocky 3D of that time, 9 out of 10 people would prefer the 3D.

    • @ArneChristianRosenfeldt
      @ArneChristianRosenfeldt Před rokem

      What would the middleware do that the SDK does not? Tile-based backgrounds? Box2D-style collision and physics? Did people need this middleware on 16-bit consoles? A better audio driver?

    • @atomicskull6405
      @atomicskull6405 Před 11 měsíci

      @@ArneChristianRosenfeldt From what I have read, the Jaguar SDK was terrible and didn't make it any easier to develop for. Middleware would give you a software framework to start with, with all the heavy ASM lifting done already.

    • @ArneChristianRosenfeldt
      @ArneChristianRosenfeldt Před 11 měsíci

      @@atomicskull6405 the framework did not even give you a printf. I think that this is called the C stdlib. People complain that the compiler did not check for overflow; I did not know that C can throw that exception, and I don't understand how you'd overflow 32 bits. A much bigger problem is that multiply factors need to be 16-bit. Probably, if the OP were not able to write to DRAM, development would have had an easier start. Middleware to draw a background from tiles would be nice.
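
To make the "factors need to be 16 bit" point concrete: when the multiplier only looks at 16-bit operands, a full 32-bit product has to be stitched together from partial products. A minimal sketch in C, where mul16 is an invented stand-in for a 16x16 -> 32-bit hardware multiply (not an SDK function):

```c
#include <stdint.h>

/* Invented helper standing in for a 16x16 -> 32-bit hardware multiply. */
static uint32_t mul16(uint16_t a, uint16_t b)
{
    return (uint32_t)a * (uint32_t)b;
}

/* Compose the low 32 bits of a 32x32 product from 16-bit partial products:
 * a*b = aH*bH*2^32 + (aH*bL + aL*bH)*2^16 + aL*bL, and the aH*bH term
 * falls entirely above bit 31, so it drops out of the low word. */
uint32_t mul32_from_16(uint32_t a, uint32_t b)
{
    uint16_t aL = (uint16_t)a, aH = (uint16_t)(a >> 16);
    uint16_t bL = (uint16_t)b, bH = (uint16_t)(b >> 16);

    uint32_t low = mul16(aL, bL);
    uint32_t mid = mul16(aH, bL) + mul16(aL, bH);   /* to be shifted up 16 */

    return low + (mid << 16);
}
```

This is just the schoolbook decomposition; on real hardware the same thing would be a handful of multiply/shift/add instructions rather than a C function.
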

    • @thefurthestmanfromhome1148
      @thefurthestmanfromhome1148 Před 10 měsíci

      According to Atari UK, had Panther been released, there would only have been around a 9-month window before Jaguar was ready.
      People who worked on the Panther, like Jeff Minter and Rob Nicholson from HMS, stated that the actual sprite abilities weren't anything like the numbers Atari were pushing.

  • @KungKras
    @KungKras Před 3 lety

    Was it really that Tom tried to implement hazard detection and failed, or was it that it used pipelining but no hazard detection?

    • @ZygalStudios
      @ZygalStudios  Před 3 lety

      There were no bugs with this. If there were, the processor wouldn't have even gotten off the ground. It had hardware to detect hazards and then register scoreboarding logic to correct when a pipeline stall occurred.
      The purpose of explaining this was to show some of the design challenges in assembly that you would have as a programmer.

  • @A31Chris
    @A31Chris Před 3 lety

    Maybe a partial solution to the GPU consecutive divide hazard would be to have the dsp do the second one. Would that be feasible?

    • @ZygalStudios
      @ZygalStudios  Před 3 lety +1

      Perhaps! I'd have to look into the tech manuals to see if there would be any heartburn from that method

    • @A31Chris
      @A31Chris Před 3 lety

      @@ZygalStudios also in some instances it seems that bugs that apply to the GPU don't necessarily apply to the DSP. And vice versa.
      So weird.

    • @Electrodudimanche
      @Electrodudimanche Před 3 lety +5

      There would be a cost: first in the assembly you write, adding unneeded complexity. Then you have to move the results around, and that takes cycles. And if you move the result around via the UART, you have the unreliable bug shifting the result that you need to take into account.
      That's a nightmare just thinking about it...
      I would need to study the documentation and the hardware to get a grasp of the thing, but yeah, chip-to-chip communication isn't cheap. Usually you have a chip listening on a set of pins for a signal, then the other chip sends the signal plus a clock signal between both chips to synchronise them; then the communication can initiate. Everything costs cycles, because in assembly you set up the communication: the registers, the data to send, the speed of the link.
      Just from that, it would be cheaper in terms of cycles to throw in some NOPs and divide again than to do the divide on another chip.
      The best way to avoid 2 consecutive divides would be to multiply by the inverse (instead of dividing by 12 you multiply by 0.083, which is 1/12) or use bitwise operations to get something near the result you want that's good enough. You lose precision, but you gain speed and avoid pesky bugs.
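
A rough sketch of the multiply-by-the-inverse trick described above, written the way it would actually be coded on integer hardware (fixed point rather than a 0.083 float). The constant and function names are invented for illustration, not taken from any shipped Jaguar code:

```c
#include <stdint.h>

/* Divide by a known constant (12) without touching the divide unit:
 * multiply by a precomputed 16.16 fixed-point reciprocal and shift down.
 * 5462 = ceil(65536 / 12), so the result is a close approximation (exact
 * for multiples of 12 over screen-sized ranges). Keep x below ~780,000 so
 * x * 5462 still fits in 32 bits, or widen the intermediate. */
#define RECIP_12_FP16  5462u

static uint32_t div_by_12_approx(uint32_t x)
{
    return (x * RECIP_12_FP16) >> 16;
}
```

The trick only works when the divisor is known ahead of time; a genuinely run-time divisor still needs the divide unit or a reciprocal lookup table.
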

    • @A31Chris
      @A31Chris Před 3 lety

      @@Electrodudimanche That's very interesting.
      Programmers have used the DSP to help the GPU before to great effect, rather than the 68k. It's considered the more advanced way of doing things. Though I don't know if they have used it in this manner.
      Thank you for your input.

    • @Electrodudimanche
      @Electrodudimanche Před 3 lety +1

      @@A31Chris There were tricks of this kind on the Saturn with great success. On the Jaguar, there's so much broken stuff, like the MOV instruction that is essential for a lot of things, like initialising the chips to a known state so they boot properly on start-up (on start-up, a chip gets random values in its registers because of static electricity, electrical noise, residual charge).
      And I'm just programming small 8-bit microcontrollers... I'd like to program a console later; I still have my old Genesis and am keeping it for that purpose. Maybe I'll go the Saturn route, but that thing is just crazy to program...
      For sharing the divide operations, you need to think in advance about how your code works and how you process things. It REALLY isn't simple and adds a good amount of preemptive work to avoid slowdown, and remember, you'll need to use the not-working-properly MOV instruction to do it...

  • @misterbean971
    @misterbean971 Před rokem +1

    The music is loud enough to drown out your otherwise perfect explanation.

  • @johnjay6370
    @johnjay6370 Před 2 lety +3

    Good video, I always wondered about the bugs in the hardware that John Carmack talked about. That said, I think the bugs were NOT a show-stopper for the hardware. Yes, they are an issue and will slow things down during development, but if all the bugs are flagged and addressed, I do think the Jag had some clever ideas. I had one back in the day, and Doom, AVP, and Iron Soldier were some good games.

  • @Skullkan6
    @Skullkan6 Před 2 lety +1

    Could you go into how this affected the programming of Jaguar Doom?
    Carmack has been quoted as saying "Coding for the Jaguar is fun!"
    And nobody at id wanted to work on it. So basically only he and Sandy Petersen ended up working on Jaguar Doom.

    • @thefurthestmanfromhome1148
      @thefurthestmanfromhome1148 Před 2 lety +5

      John Romero and Dave Taylor also worked on Jaguar Doom, but they weren't fans of the hardware, Taylor in particular

  • @Tolbat
    @Tolbat Před 2 lety

    Ahh yes, so you have a very in depth understanding of this, what would it take to make a clone console of this system?

  • @TheSocialGamer
    @TheSocialGamer Před 3 lety +2

    When did the JAG-WIRE come out? I only ever heard of the Jaguar? 🤣🤣🤣

    • @ZygalStudios
      @ZygalStudios  Před 3 lety +1

      Oh you didn't hear, it came out last year 🤣🤣

  • @VOAN
    @VOAN Před rokem

    This complicated hardware is why Doom on the Jaguar can't have music during the actual levels. Instead you get sound effects alongside the better graphics, and music only on fixed screens such as the title and results screens.

    • @ArneChristianRosenfeldt
      @ArneChristianRosenfeldt Před rokem +1

      Did you check the ROM hack which corrects this? Does it slow down Doom? Doom crams some game logic and engine stuff together with the wavetable code into the tiny 8 kB of the DSP.
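
For readers wondering what the "wavetable code" on the DSP boils down to, the heart of it is a small per-voice mixing loop along these lines. This is a generic illustrative sketch, not code from Jaguar Doom; every name and field here is invented:

```c
#include <stdint.h>
#include <stddef.h>

/* One playing sound: sample data, a 16.16 fixed-point play position and
 * pitch step, plus a volume. */
typedef struct {
    const int8_t *sample;   /* 8-bit sample data                    */
    uint32_t pos;           /* 16.16 fixed-point position           */
    uint32_t step;          /* 16.16 step per output sample (pitch) */
    uint32_t length;        /* sample length in frames              */
    int16_t  volume;        /* 0..255                               */
} Voice;

/* Accumulate one voice into the output buffer. A real mixer would loop
 * over all active voices and then hand the buffer to the audio DAC. */
void mix_voice(Voice *v, int16_t *out, size_t frames)
{
    for (size_t i = 0; i < frames && (v->pos >> 16) < v->length; i++) {
        int32_t s = v->sample[v->pos >> 16] * v->volume;  /* scale by volume        */
        out[i] += (int16_t)(s >> 2);                      /* crude mixing headroom  */
        v->pos += v->step;                                /* advance at voice pitch */
    }
}
```

Even a loop this small has to share the DSP's few kilobytes of local RAM with whatever engine code gets offloaded there, which is exactly the squeeze described above.
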

  • @kusumayogi7956
    @kusumayogi7956 Před rokem +1

    I've heard that the Jaguar still uses a 16-bit CPU and only the GPU and sound processor are 32-bit (combined with a 64-bit memory bus).
    That sounds like combining an RTX GPU with an Intel Atom CPU.
    Also, they said the CPU does nothing but give commands to the GPU and sound processor; all processing is done by the GPU and sound processor. Sounds like science fiction to me.

    • @humansrants1694
      @humansrants1694 Před rokem

      One of the custom processors is also supposed to act as the main CPU, and it's 32-bit with some 64-bit parts, but most devs used the 16-bit Motorola 68000 as the main CPU as it was easier to work with.
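
A sketch of what "the CPU only gives commands to the GPU" tends to look like in practice: the 68000 copies a small routine into the GPU's local SRAM, points the GPU at it, and waits. The addresses and the GO bit below are quoted from memory of the Jaguar documentation and should be treated as assumptions rather than a verified reference:

```c
#include <stdint.h>
#include <stddef.h>

#define GPU_RAM_BASE  0xF03000u                       /* GPU local SRAM (assumed)       */
#define G_PC          ((volatile uint32_t *)0xF02110) /* GPU program counter (assumed)  */
#define G_CTRL        ((volatile uint32_t *)0xF02114) /* GPU control register (assumed) */
#define GPUGO         0x00000001u                     /* run bit in G_CTRL (assumed)    */

/* Load a GPU routine into scratchpad and run it from the 68000 side. */
void run_gpu_routine(const uint32_t *code, size_t longs)
{
    volatile uint32_t *dst = (volatile uint32_t *)GPU_RAM_BASE;

    /* GPU local RAM is only addressable as 32-bit longs, so copy in longs. */
    for (size_t i = 0; i < longs; i++)
        dst[i] = code[i];

    *G_PC   = GPU_RAM_BASE;     /* point the GPU at the routine        */
    *G_CTRL = GPUGO;            /* start it running                    */

    while (*G_CTRL & GPUGO)     /* idle until the routine stops itself */
        ;
}
```

In real projects the 68000 often ran game logic as well rather than just idling like this, which is one reason so many titles never leaned fully on the custom chips.
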

  • @thefurthestmanfromhome1148
    @thefurthestmanfromhome1148 Před 11 měsíci

    Jez San, Argonaut, who were working on the cancelled Jag CD version of Creature Shock:
    "A true 64 bit chip would have a 64 bit ALU and 64 bit registers. The Jaguar does not. Simply having two 32 bit chips does not constitute a 64 bit cpu. Having a 64 bit databus does technically allow you to call it a 64 bit system if you must, but 64 bit cpus? no way."

    • @ArneChristianRosenfeldt
      @ArneChristianRosenfeldt Před 9 měsíci

      On the N64 the CPU was usually switched into 32-bit mode because no game needs 64 bits (unless there are vector instructions).
      The so-so graphic design on the GBA from Argonaut tells me that they are not competent.

  • @rockyroad90
    @rockyroad90 Před 3 lety +1

    So what would you say caused the "hardware bugs": the design or the manufacturing?
    I ask this because I find most are quick to jump on Atari for the lack of dev support for the complex and/or flawed system, but NEVER, to my knowledge, place any responsibility on the manufacturer, IBM.

    • @ZygalStudios
      @ZygalStudios  Před 3 lety +2

      Well, Flare Technology was contracted to design it, so it would be on them for sure. But it was really just two guys, and they designed this within just a couple of years, so it makes sense: two people designing this system in a couple of years of development explains why it was merely functional and didn't have any pomp and frills.
      But if we're talking specifically, Atari contracted the resources for the project, so they are certainly responsible for the lack of resources.

    • @rockyroad90
      @rockyroad90 Před 3 lety +2

      @@ZygalStudios
      Thanks for the response. But I remember an interview with one of the Flare team in which, to paraphrase, someone asked him why they chose assembly instead of C for coding, and about what John Carmack said regarding scratchpad memory and no virtualization in the blitter. He replied that it was supposed to have those things, but there were defects in the hardware (he may have said silicon), and he blamed the Tramiels for rushing the project.
      My point is Atari made mistakes, there's no doubt, but if there was a defect in the manufacturing process then that is on IBM.

  • @soccerojdoj
    @soccerojdoj Před rokem

    The biggest bug on the Atari Jaguar? All the experts I've heard pronounce the Atari Jaguar "The Atari JagWIRE"...lol

  • @JunkerDC
    @JunkerDC Před rokem

    poor further ado, he never gets to go anywhere

    • @ZygalStudios
      @ZygalStudios  Před rokem +1

      This one took me a good minute to understand 🤣🤣 I love this.
      Good one!

  • @demonology2629
    @demonology2629 Před 2 lety +1

    Wish you could have shown us some of the games that had terrible problems with these bugs 🕹🐆📺

    • @korsibat
      @korsibat Před rokem +1

      The games were written with the bugs in mind so the programmers worked around them.

  • @MatthewHolevinski
    @MatthewHolevinski Před 2 lety

    What would it have been like without those problems?

    • @SianaGearz
      @SianaGearz Před 2 lety +1

      Difficult to say. I think there's zero chance it would approach the PS1 in a head-on comparison, or even the 3DO. But it seems like a really flexible system by design, so maybe some interesting software solutions would have been found, something nobody thought of before, something unusual and impressive? Demoscene tends to be somewhat good at exploring such concepts, but developing against a fantasy console is discouraged by the competition structure. So we may never find out. I think fixing errata would have had to be only the first pass; if it was a real effort, the second pass should be experience-based improvement, at which point the original design documents need not be limiting, so all bets are off.

    • @thefurthestmanfromhome1148
      @thefurthestmanfromhome1148 Před rokem

      @@SianaGearz Probably better frame rates, more games running with in-game music, Doom running in a higher resolution mode with in-game music.
      Better games, but it wouldn't have turned the system into something that could match, say, the Saturn.

    • @SianaGearz
      @SianaGearz Před rokem +1

      @@thefurthestmanfromhome1148 Of course. There have been a number of rendering methods, though, that went underexplored, because polygonal 3D rendering hardware has pushed things in one given direction. I think it's an easy guess that someone would have tried S-buffer (span buffer) rendering. Next is voxel rendering, which was used in the PC game Comanche, for example, and also in the 32X demo by Zyrinx I think; it can look quite impressive. Then there was a game series, Ecstatica, which used spheroid rendering instead of polygons. You could also see things like 2D games enhanced with real-time lighting and normal mapping on top of sprites; an example of this technique is the Blade Runner adventure game on PC. A flexible DSP-based console is a perfect host to these sorts of experimental technologies that don't directly invite comparisons to anything on fixed-function hardware.
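
Since Comanche-style voxel terrain comes up here as one of those underexplored directions, the core of that technique is small enough to sketch. This is the generic textbook version in C with invented helper names (heightmap, colourmap, draw_vline), not code from any Jaguar or 32X title:

```c
#include <stdint.h>
#include <math.h>

#define MAP_W  1024          /* power-of-two map so & can wrap coordinates */
#define MAP_H  1024
#define SCR_W  320
#define SCR_H  200

extern uint8_t heightmap[MAP_H][MAP_W];   /* terrain altitude per cell */
extern uint8_t colourmap[MAP_H][MAP_W];   /* terrain colour per cell   */
extern void    draw_vline(int x, int ytop, int ybottom, uint8_t colour);

/* For each screen column, march a ray across the maps front to back,
 * project each sample to a screen row, and draw only what rises above
 * everything already drawn in that column. */
void render_terrain(float camx, float camy, float camz, float angle)
{
    for (int x = 0; x < SCR_W; x++) {
        float ray = angle + ((float)x / SCR_W - 0.5f) * 1.05f;  /* ~60 deg FOV */
        float dx = cosf(ray), dy = sinf(ray);
        int ybuffer = SCR_H;               /* lowest screen row still empty */

        for (float dist = 1.0f; dist < 400.0f; dist += 1.0f) {
            unsigned mx = (unsigned)(long)(camx + dx * dist) & (MAP_W - 1);
            unsigned my = (unsigned)(long)(camy + dy * dist) & (MAP_H - 1);

            /* perspective projection of the terrain height at this cell */
            int h = (int)((camz - heightmap[my][mx]) * 120.0f / dist) + SCR_H / 2;
            if (h < 0) h = 0;

            if (h < ybuffer) {
                draw_vline(x, h, ybuffer, colourmap[my][mx]);
                ybuffer = h;
            }
        }
    }
}
```

A production version would step the ray more coarsely with distance and use fixed point throughout, but the column-by-column "draw what pokes above the ybuffer" idea is the whole trick.
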

  • @finaltheorygames1781
    @finaltheorygames1781 Před 2 lety +1

    If there are hardware bugs, why not do what Nintendo did: put expansion chips into the cartridges themselves, with other processors on them, so that you didn't have to use the processors with the bugs in them on the actual console. Like you could put a faster 68000 processor into the cartridge and just use that.

    • @GreyMatterShades
      @GreyMatterShades Před 2 lety +3

      That would massively increase the cost of producing those carts, and considering Atari was in some pretty dire financial circumstances at the time, I doubt they could've afforded it.
      Realistically if they were going to eat extra costs, they would have been better served by delaying the Jaguar's launch until they had fixed at least some of the more serious hardware bugs. Really a shame to see some otherwise brilliant architecture cut down by being rushed to market.
      Another possible solution is if they'd beefed up the Jaguar CD with some actual power, and maybe did a hardware revision that incorporated it. As far as I understand, all the Jaguar CD does is allow for bigger games due to the increased capacity of CDs (unlike the Sega CD, which actually has extra processing power). Seems like a wasted opportunity.
      But as a hypothetical solution to the problem of the system being on the market in a buggy state, your solution could have worked if cost wasn't such an issue.

    • @thefurthestmanfromhome1148
      @thefurthestmanfromhome1148 Před 10 měsíci

      ​@@GreyMatterShadesSam Tramiel didn't want devs using bigger cartridges, he was never going to sign off on putting DSP chips on carts to help with 3D etc.
      That's the reason the Jag CD is a dumb drive, costs had to be kept to bare minimum

    • @GreyMatterShades
      @GreyMatterShades Před 10 měsíci

      @@thefurthestmanfromhome1148 I totally agree that Sam Tramiel wouldn't sign off on extra chips on carts.
      As for the Jag CD, I still think they could have done something. I'm not sure they could have (or should have) gone as far as the Sega CD, but it was clear developers were struggling to release visually impressive games on the Jaguar/JagCD. Even some relatively cheap additional RAM or processing might have helped the system perform better without raising the cost too much.
      Of course the real solution to their problems was to fix their hardware issues before launching. Anything else would have just been a band-aid, but sometimes band-aids are helpful.