World's worst video card gets better?

  • Published 4 Dec 2020
  • Schematics and more info: eater.net/vga
    Support these videos on Patreon: / beneater or eater.net/support for other ways to support.
    ------------------
    Social media:
    Website: www.eater.net
    Twitter: / ben_eater
    Patreon: / beneater
    Reddit: / beneater
    Special thanks to these supporters for making this video possible:
    Aaron Todd, Alexander Wendland, Andrew Vauter, Andrew Young, Anson VanDoren, Anthanasius, Armin Brauns, bapu, Ben Cochran, Ben Kamens, Ben Williams, Bill Cooksey, Binh Tran, Bouke Groenescheij, Bradley Pirtle, Bradley Stach, Brent Reusing, Brian T Hoover, Bryan Brickman, Bryan Glezerson, CaffeinatedClockmaker, Carlos Ambrozak, Christopher Blackmon, Dale Andrew Darling, Daniel Jeppsson, Daniel Tang, Dave Burley, Dave Walter, David Brown, David Clark, David Cox, David House, David Sastre Medina, David Turner, David Worsham, Dean Winger, Dissy, dko, Dmitry Guyvoronsky, Dušan Dželebdžić, Dzevad Trumic, Emilio Mendoza, Eric Brummer, Eric Busalacchi, Eric Dynowski, Erik Broeders, Eugene Bulkin, Frédéric Segard, George Foot, George Miroshnykov, Harry McDow, HaykH, Hidde de Jong, Ingo Eble, Ivan Sorokin, Jason DeStefano, Jason Specland, JavaXP, Jay Binks, Jayne Gabriele, Jeffery Springs, Jeremy A., Jim Kelly, Jim Knowler, Jim Van Meggelen, Joe Beda, Joe OConnor, Joe Pregracke, Joel Jakobsson, Joel Miller, Johannes Lundberg, John Fenwick, John Meade, Jon Dugan, Joseph Portaro, Joshua King, Kefen, Kenneth Christensen, Kent Collins, Koreo, Lambda GPU Workstations, Larry, Lucas Nestor, Lukasz Pacholik, Maksym Zavershynskyi, Marcus Classon, Martin Roth, Mats Fredriksson, Matthäus Pawelczyk, melvin2001, Michael Garland, Michael Tedder, Michael Timbrook, Miguel Ríos, mikebad, Mikel Lindsaar, Miles Macchiaroli, Nicholas Counts, Nicholas Moresco, Örn Arnarson, Paul Pluzhnikov, Paul Randal, Pete Dietl, Phil Dennis, Philip Hofstetter, Phillip Glau, PixelSergey, Randal Masutani, Randy True, Raoul van Prooijen, Ric King, Richard Wells, Rob Bruno, Robert Diaz, Roland Bobek, Ryan Trunck, Sabagmn, sam raza, Sergey Kruk, SonOfSofaman, SquirrellyCat.io, Stefan Nesinger, Stefanus Du Toit, Stephen, Stephen Riley, Stephen Smithstone, Steve Jones, Tayler Porter, TheWebMachine, Tom, Tom Burns, Tyler Latham, Vincent Bernat, Vladimir Solomatin, Walter Montalvo, Warren Miller, xisente, Yee Lam 
Wan

Comments • 1.7K

  • @Oxazepam65
    @Oxazepam65 3 years ago +4106

    I think I missed a few episodes of Primitive Technology.

    • @CAME18471
      @CAME18471 3 years ago +46

      Hahahahahahahaha omg!

    • @CAME18471
      @CAME18471 3 years ago +25

      @@user-ug9nn Seems like you missed the reference to another YouTube channel called "Primitive Technology"... I guess the "dump" is not necessary 😉

    • @user-ug9nn
      @user-ug9nn 3 years ago +29

      @@CAME18471 Ok sorry, I deleted my shitty comment...

    • @GerardMenvussa
      @GerardMenvussa 3 years ago +44

      He has invented speech at some point apparently :p

    • @strayling1
      @strayling1 3 years ago +6

      The secret is to bang the rocks together.

  • @akashkumar121213
    @akashkumar121213 3 years ago +2518

    All graphics cards are out of stock.
    Ben: Sorry, what?

    • @techleontius9161
      @techleontius9161 3 years ago +142

      I like how his video card costs about as much as some GTX cards

    • @weepgamer
      @weepgamer 3 years ago +11

      @@techleontius9161 lol

    • @miallo
      @miallo 3 years ago +23

      I thought the crazy hype about block-chain mining was over, but I guess Ben might be onto something here...

    • @galladeblade6001
      @galladeblade6001 3 years ago +17

      Ben: Fine. I'll do it myself

    • @ethanpschwartz
      @ethanpschwartz 3 years ago +43

      @@techleontius9161 The VGA Kit or a Quadro P400. Sorry, Nvidia, my 6502 doesn't support PCI. Yet.

  • @blasterTC
    @blasterTC 3 years ago +217

    I really love the "some assembly required" on the box.

    • @ianwyrdness1380
      @ianwyrdness1380 3 years ago +25

      Ha, yes! In both senses of the word 'assembly'.

    • @object-official
      @object-official 9 months ago

      @@ianwyrdness1380 x86 and building

    • @lythd
      @lythd 3 months ago +3

      @@object-official it's not x86, it's 6502 assembly

  • @pvic6959
    @pvic6959 3 years ago +231

    He talks like he knows EXACTLY what's going on 100% of the time. A true master, a legend

    • @akhilaryappatt7209
      @akhilaryappatt7209 3 years ago +18

      and explains each step to the last detail

    • @tombola9445
      @tombola9445 3 years ago +44

      Editing is a wonderful way to do this. Not taking anything away from him, but editing lets him get his ducks in a row before he adds the sound to his video. I do think this series is great, though. I design FPGAs, so I do what he's doing in a somewhat software-based environment, and seeing it in hardware in breadboard format really gives a physical visual of what I have to map out in my head.

    • @progamer1125
      @progamer1125 2 years ago +6

      Man speaking full enchanting table

  • @dralfonzo24
    @dralfonzo24 3 years ago +3932

    Not done with the episode yet, but I'm eager to see you running doom on this soon.

    • @sadmac356
      @sadmac356 3 years ago +443

      World's worst sound card first?

    • @richardlighthouse5328
      @richardlighthouse5328 3 years ago +78

      @@sadmac356 Just a DAC

    • @techleontius9161
      @techleontius9161 3 years ago +137

      @@sadmac356 Just a counter, memory, an 8 kHz clock, 8 resistors, and a bunch of logic to control it, and you're good to go.

    • @jorenheit
      @jorenheit 3 years ago +59

      How about crysis?

    • @weepgamer
      @weepgamer 3 years ago +137

      @@jorenheit nah, doom is cooler

  • @geetanshgautam
    @geetanshgautam 3 years ago +1468

    You know a breadboard is fully populated when Eater doesn't neatly bend his wires

    • @DingleFlop
      @DingleFlop 3 years ago +92

      I absolutely love how cleanly he does these. I've been prototyping stuff recently and I'm trying to get them even a quarter as clean as he's doing. And I'm cheating and using an ESP32 so I've got about 1% the complexity of those two boards he's doing....

    • @AftercastGames
      @AftercastGames 3 years ago +76

      Yeah. Forget about selling the kits... I’d pay to just have my wires pre-cut to length and bent at perfect 90 degree angles. :)

    • @GRBtutorials
      @GRBtutorials 3 years ago +9

      @@AftercastGames You can! Not bent, but you can buy precut wires for breadboards, I have a box of those. Just search for them on Aliexpress or similar.

    • @torade100
      @torade100 3 years ago +16

      I liked how he tried to put his hands over the cables in an attempt to make them tidier but then realized that was not possible.

    • @alfredorotondo
      @alfredorotondo 3 years ago +16

      @@DingleFlop When I was studying telecommunications at school, it was strictly forbidden to bend cables like this, because the teacher was dumb and hated them, even though it makes the breadboard cleaner.
      I wish I'd had Ben as a teacher

  • @AntonyTCurtis
    @AntonyTCurtis 3 years ago +430

    The BBC Micro had a cunning strategy of running the memory at twice the speed of the CPU so that the video and CPU accessed the memory on alternate cycles. This allows its CPU to work at full 2 MHz.

    • @2thinkcritically
      @2thinkcritically 3 years ago +13

      The Dragon 32/64 computers certainly did that. I believe the memory access was controlled by the SAM chip so the CPU wasn't directly connected to the RAM.

    • @johndododoe1411
      @johndododoe1411 3 years ago +24

      @Hans J Another fav technique was for the video card to only access memory once or twice every 8 pixels, hence the weird restrictions on number of colours in a small block of screen area.

    • @mojoblues66
      @mojoblues66 3 years ago +11

      The Apple ][ had this "cunning strategy" already, 5 years before the BBC Micro was introduced.

    • @akersmc
      @akersmc 3 years ago +4

      8 bit commodores did this as well I think?

    • @tonysofla
      @tonysofla 2 years ago +12

      The Commodore 64 works like that: the VIC-II was designed from scratch to be the master bus controller, and it even refreshes the cheaper DRAM in between cycles.
      The 6510 is only locked out every 8th line, when the VIC-II needs to fetch the next block of background/foreground color data.
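The interleaving trick described in this thread can be sketched in a few lines. This is a toy model, not Ben's circuit or the actual BBC Micro logic: RAM is clocked at twice the CPU rate, video gets the even memory cycles, and the CPU gets the odd ones, so neither ever waits. All names (`memory_cycle`, the 256-byte `ram`) are invented for illustration.

```python
# Toy model of BBC-Micro-style bus sharing: RAM runs at twice the CPU clock,
# and video/CPU take strictly alternating memory cycles, so the CPU is never
# stalled by the video circuitry.
ram = [0] * 256          # stand-in for shared framebuffer/program memory

def memory_cycle(t, cpu_req, video_addr):
    """One fast RAM cycle: even ticks belong to video, odd ticks to the CPU.

    cpu_req is (address, value_to_write_or_None)."""
    if t % 2 == 0:                       # video's slot: read-only scanout
        return ("video", ram[video_addr])
    addr, write = cpu_req                # CPU's slot: read or write
    if write is not None:
        ram[addr] = write
    return ("cpu", ram[addr])

# CPU wants to write 0xAA to address 5; video is scanning out address 0.
owner0, _ = memory_cycle(0, (5, 0xAA), 0)    # cycle 0 goes to video
owner1, val = memory_cycle(1, (5, 0xAA), 0)  # cycle 1 performs the CPU write
print(owner0, owner1, hex(val))
```

Each "slot" here corresponds to one half of the doubled memory clock; in the real machines this arbitration is done with the clock phases rather than a modulo test.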

  • @375-Productions
    @375-Productions 3 years ago +248

    I love how you always solve the problems that appear during the video, rather than just making a video showing the final product. You show the entire process of the thing that you're making. Brilliant video as always!

  • @zv0n
    @zv0n 3 years ago +1816

    Next couple of videos:
    - writing a C compiler for my CPU
    - running DOOM
    - porting Linux kernel
    - intro to quantum computation

    • @ReneSchickbauer
      @ReneSchickbauer 3 years ago +145

      Quantum computer on a breadboard :-)

    • @Mauricetz
      @Mauricetz 3 years ago +85

      @@ReneSchickbauer A breadboard at nearly 0 Kelvin lol

    • @Bunkers-Boys
      @Bunkers-Boys 3 years ago +26

      cc65 should work as a C compiler. I assume a library could be made with functions and resources for this computer.

    • @kas-lw7xz
      @kas-lw7xz 3 years ago +39

      @@d.6325 the java virtual machine runs on C++ which is based on C, so no, and also, the java virtual machine is unoptimized and slow, relatively

    • @wompastompa3692
      @wompastompa3692 3 years ago +68

      HolyC is all you need.

  • @TimothyChapman
    @TimothyChapman 3 years ago +737

    Alternate title: "How to get two independent computers to share the same set of system resources without fighting over them."

    • @DingleFlop
      @DingleFlop 3 years ago +91

      "....Too much..."

    • @dubbynelson
      @dubbynelson 3 years ago +17

      nice profile picture

    • @pqrstzxerty1296
      @pqrstzxerty1296 3 years ago +6

      CPU offloading..... Ngpu Vgpu AGPU APU. Offloading is the key.

    • @josugambee3701
      @josugambee3701 3 years ago +43

      It's certainly much easier to do with computers than with people.

    • @shinyhappyrem8728
      @shinyhappyrem8728 3 years ago +18

      What I don't get is why he doesn't just remove the CPU clock and use the 10 MHz clock and a counter to clock the CPU.

  • @MsMaciek
    @MsMaciek 3 years ago +66

    I can't wait for "World's worst video card gets raytracing support"

    • @DaedalusYoung
      @DaedalusYoung 3 years ago +5

      "World's worst video card surpasses all other video cards"

    • @akersmc
      @akersmc 3 years ago +4

      Ben: and you see now that the reflections are kind of slow, but if I move this jumper over to the output of the RTX enabled NAND gate...

    • @the-pink-hacker
      @the-pink-hacker 3 years ago +2

      ASCII ray tracing is a thing.

    • @Arctic_silverstreak
      @Arctic_silverstreak 2 years ago +2

      Dude, don't go that high yet. I'm gonna be blown away if he can make a "smooth" 12 fps animation at 640x480 with this kind of thing

    • @progamer1125
      @progamer1125 2 years ago

      If he adds raytracing support I'm buying this for my pc

  • @SirFloIII
    @SirFloIII 3 years ago +381

    BROKE: getting a widescreen monitor
    WOKE: chopping off the bottom 11 pixels to make your aspect ratio wide

  • @ScottiStudios
    @ScottiStudios 3 years ago +527

    43:25 So glad you showed us the comparison in processing time without that extra blanking interval being used, or it would have left a big hole in my life 😂

    • @HKlink
      @HKlink 3 years ago +12

      Definitely thinking the same! I was really hoping for it, and it made me very happy that it happened.

    • @ikatib
      @ikatib 3 years ago +1

      Me too ahahahah

    • @shanesgettinghandy
      @shanesgettinghandy 2 years ago +1

      Me too! I was getting anxious until he did it. Thank you!

    • @GanUnita
      @GanUnita a month ago

      Another way to do the video signal is to put the VGA Hsync and Vsync on the least significant bits and use the rest of the address bus for the color.

  • @marvintpandroid2213
    @marvintpandroid2213 3 years ago +457

    Nand gates, when they go low we go high.

    • @wolfpriest6954
      @wolfpriest6954 3 years ago +16

      underrated

    • @bugglest0n
      @bugglest0n 3 years ago +13

      I will buy this t-shirt

    • @_c_e_
      @_c_e_ 3 years ago +5

      *YOU*
      [QR_CODE_LINKING_TO_SUPPLIER]
      *HIGH_IQ*
      Address(bool):
          push rbp
          mov rbp, rsp
          mov eax, edi
          mov BYTE PTR [rbp-20], al
          cmp BYTE PTR [rbp-20], 0
          je .L2
          mov BYTE PTR [rbp-1], 1
          movzx eax, BYTE PTR [rbp-1]
          jmp .L1
      .L2:
      .L1:
          pop rbp
          ret
      *MID_IQ*
      bool Address(bool high) {
          if (high) {
              bool flying = true;
              return flying;
          }
      }
      *LOW_IQ*
      Dope!

    • @user-me7hx8zf9y
      @user-me7hx8zf9y 3 years ago

      @@_c_e_ 😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂😂

    • @tootaashraf1
      @tootaashraf1 3 years ago +4

      You can make a computer with *only* NAND gates

  • @bob2859
    @bob2859 3 years ago +235

    Ben: Insightful process demonstration on creating a computer-controlled graphics device from components
    Army of commenters: D O O M

  • @tihmstar
    @tihmstar 3 years ago +258

    Once the video card is done, he should build a sound card next

    • @ownpj
      @ownpj 3 years ago +23

      The VIA chip he uses for the LCD has a square-wave output

    • @ReneKnuvers74rk
      @ReneKnuvers74rk 3 years ago +23

      There have been a lot of sound card designs based on R2R networks connected to a parallel port. That would be an easy hack to control, although timing would become difficult.
      An autonomous sound card that works like this video card would be relatively easy: it would be the same design, but with a 10 kHz clock and without all the pixel counting. The output stage would be very similar, and it would only need to read a single byte of memory. Maybe set a flag in memory once the sample has been read. A sample-and-hold circuit would keep the poor man's DAC at a value for the length of a sample.
      Feasibility: doable.

    • @stargazer7644
      @stargazer7644 3 years ago +7

      @@ReneKnuvers74rk That's a lot of hardware overhead for nothing, because the CPU still has to keep that memory location populated and you somehow have to time it. You simply need a timer to periodically interrupt the CPU 8000 times per second (or whatever sample rate you want); the CPU writes the next sound byte to a parallel port, and you use a resistor ladder to drive an audio amp. Easy.

    • @wizdude
      @wizdude 3 years ago +1

      Build an R2R network mapped to an output port. With a D2A converter you can synthesise multi-part audio in software. I did something very similar on my TRS-80 many years ago, based on a design from my father. Granted, the Z80 was running at 1.77 MHz, but the original implementation was done on an 8080 running around 1 MHz, so it should be totally doable on a 6502 with this design. As long as the processor isn't paused so often :-)

    • @user-ir2fu4cx6p
      @user-ir2fu4cx6p 3 years ago

      I made a sound card with an ATtiny85 and an STM32: czcams.com/video/vfcCpQyBtAc/video.html
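The interrupt-driven scheme sketched in this thread (a timer fires at the sample rate, the interrupt routine writes a byte to a parallel port, and an R2R resistor ladder converts it to a voltage) can be modeled in a few lines. This is a hypothetical sketch, not anyone's actual hardware; the 440 Hz tone, 5 V reference, and all function names are assumptions for illustration.

```python
# Model of a timer-interrupt sound output: 8000 "interrupts" per second, each
# one computes the next 8-bit sample and an ideal R2R ladder scales it to a
# voltage (Vref * byte / 256).
import math

SAMPLE_RATE = 8000      # Hz, the rate suggested in the thread
VREF = 5.0              # assumed ladder reference/supply voltage

def next_sample(t):
    """8-bit unsigned sample of a 440 Hz sine (what the ISR would compute)."""
    s = math.sin(2 * math.pi * 440 * t / SAMPLE_RATE)
    return int((s + 1) / 2 * 255)       # rescale -1..1 to 0..255

def r2r_output(byte):
    """Ideal R-2R ladder: output voltage is Vref scaled by byte/256."""
    return VREF * byte / 256

port = [next_sample(t) for t in range(8)]          # first 8 timer ticks
volts = [round(r2r_output(b), 3) for b in port]    # what the amp would see
```

On the real 6502 build, `next_sample` would be the interrupt handler's table lookup and the `port` write would be a store to the VIA's output register; this model only shows the data flow.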

  • @farhanyousaf5616
    @farhanyousaf5616 3 years ago +256

    Every time he jiggles the bus jumpers, I panic a little bit...

    • @skellious
      @skellious 3 years ago +25

      He uses good quality breadboards; they hold the pins well.

  • @jonathanfaber3291
    @jonathanfaber3291 3 years ago +341

    Ben's getting some real Bob Ross vibes in this one

    • @genjii931
      @genjii931 3 years ago +51

      “and here we'll just use a happy little nand gate...”

    • @kuhljager2429
      @kuhljager2429 3 years ago +20

      @@genjii931 There are no such things as bugs, just happy little circuits

    • @ChristopherBergSmiet
      @ChristopherBergSmiet 3 years ago +10

      It's not a bug, it's a happy little feature

    • @nxtktube
      @nxtktube 2 years ago +1

      @@kuhljager2429 Just happy little interferences

  • @bassett_green
    @bassett_green 3 years ago +53

    Through this series, I was initially surprised by how relatively uncomplicated it was to get the video card running with the ROM image.
    Now I'm even more surprised that having it use the RAM added so much extra complication.

  • @greglilly3866
    @greglilly3866 8 months ago +6

    Ben... I started out my career in electronics and engineering building a lot of circuits like you demonstrate here, a lot of the time just for my own learning, so I'm very familiar with how most of what you're doing works. But the way you present this information makes it so interesting to watch; even when I know what you're doing and how you're doing it, it's captivating. Thank you for making these videos... they bring back so many memories. I may even get one of your kits just to play around with and remind myself how much fun it is to build something so rudimentary and see it work.

  • @pspdsRobin2
    @pspdsRobin2 3 years ago +239

    This is such a great series! When you feel done, you should design your own PCB with this setup and talk us through some design decisions :) Order it and solder it together! It'd be really cool if you later published those PCB schematics as well!

    • @poptartmcjelly7054
      @poptartmcjelly7054 3 years ago +16

      Yeah, I think this would make a great computer kit, sort of like the Gigatron.

    • @xentropian6341
      @xentropian6341 3 years ago +13

      I would love that. Especially if he walks through the design process (Fusion 360 or similar). Would be cool to see how he'd approach it!

    • @Rx7man
      @Rx7man 3 years ago +4

      @@xentropian6341 would probably be done in Eagle or Kicad

    • @xentropian6341
      @xentropian6341 3 years ago +1

      @@Rx7man I thought Fusion360 has Eagle now built-in? Could be wrong!

    • @Rx7man
      @Rx7man 3 years ago +2

      @@xentropian6341 I hadn't heard of it, but possibly..
      I just say Eagle because his current schematics look like they're done in Eagle

  • @adityadhar1358
    @adityadhar1358 3 years ago +72

    This man troubleshoots his invigilator during exams

  • @Hobbitstomper
    @Hobbitstomper 3 years ago +42

    You have secured yourself a seat in the doomsday bunker. If there's an apocalyptic event and humanity needs to restart with a handful of people who have in-depth knowledge about everyday things, you sir are needed!

  • @sheik124
    @sheik124 3 years ago +64

    I love how you, in a way, arrived at one of the workarounds they used in Sonic the Hedgehog on the Genesis. I think transparency effects in that game were done with some clever palette swapping during the VBI, and on a monitor with a small enough amount of overscan you can see some colored dots towards the bottom of the screen that are framebuffer artifacts of that palette swap. This was the coolest 2-3 hour detour I've ever taken on YouTube.

    • @Wolfe16
      @Wolfe16 3 months ago

      I thought the transparency was just dithering 😅

  • @flightvision
    @flightvision 3 years ago +64

    When seeing a 100x64 pixel screen, I think everybody just screams "SNAKE"! :) Absolutely wonderful to follow this series. Thank you!

    • @dan_loup
      @dan_loup 3 years ago +12

      I bet it is already fast enough for snake/tetris

    • @jonnypista52
      @jonnypista52 3 years ago +3

      @@dan_loup You won't even need timing: when it's done drawing, draw the next move. With some optimization so it only writes to places that change it would be faster, but even like this it would be playable

    • @user-me7hx8zf9y
      @user-me7hx8zf9y 3 years ago +2

      @@jonnypista52 Some people have even built Minecraft redstone snake machines :)

    • @Fogolol
      @Fogolol 3 years ago +6

      @@user-me7hx8zf9y Well, people have built computers in Minecraft redstone that are legitimately more powerful than this one.
      Which is actually really amazing

    • @victordonchenko4837
      @victordonchenko4837 a year ago +2

      I can already imagine how it could work. There could be a circular buffer holding the coordinates of each pixel of the snake, and each move the back pixel would be erased and removed and a new front pixel added and drawn. Then it could check for collisions by reading the video buffer back, lol.
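The circular-buffer snake described in the comment above can be sketched directly. A deque stands in for the ring buffer and a set stands in for reading the video memory back; the 100x64 bounds match the resolution mentioned in this thread, and everything else (`step`, the starting body) is invented for illustration.

```python
# Sketch of the circular-buffer snake: the body is a queue of pixel
# coordinates. Each move erases the tail pixel, draws a new head pixel, and
# checks the new head against the currently drawn pixels for collisions.
from collections import deque

snake = deque([(2, 5), (3, 5), (4, 5)])   # tail .. head, on a 100x64 grid
occupied = set(snake)                      # proxy for reading the framebuffer

def step(dx, dy, grow=False):
    """Advance one move; return False on self- or wall-collision."""
    hx, hy = snake[-1]
    head = (hx + dx, hy + dy)
    if not grow:
        occupied.discard(snake.popleft())  # "erase" the tail pixel
    if head in occupied or not (0 <= head[0] < 100 and 0 <= head[1] < 64):
        return False                       # ran into itself or the border
    snake.append(head)                     # "draw" the new head pixel
    occupied.add(head)
    return True
```

On the breadboard machine, `occupied` would be replaced by reading the video RAM back, exactly as the comment suggests, so no extra game-state memory is needed.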

  • @remyterjanian8602
    @remyterjanian8602 3 years ago +98

    YES! I'm so excited for this, I just got myself the VGA kit!

    • @ReidCaptain
      @ReidCaptain 3 years ago +4

      Haha was thinking of getting it too!

    • @first-last557
      @first-last557 3 years ago +2

      @@ReidCaptain quick word of advice for the coilgun: a high current relay might be good for a power switch, but I'm not sure.

  • @misaalanshori
    @misaalanshori 3 years ago +28

    I love the "Some assembly required" on the box, makes it sound like there is just a bit of assembly needed, you know some cable here and some cable there...

  • @WalterMiller
    @WalterMiller 7 months ago +2

    Old video cards always struggled with full memory rewrites. That's why they used things like sprites, tiles, and palette effects. You should add a colour palette, then do some fun animations with colour cycling.

  • @randomd2146
    @randomd2146 3 years ago +98

    20 years in the future:
    How to build a gaming PC from EPROM chips

    • @n00blamer
      @n00blamer 3 years ago +5

      FPGAs are now affordable to any hobbyist. Next: cheap ASICs; you just "order" a 100x faster-clocked ASIC of your design for $50 (still expensive per run, but incredibly cheap per chip... paradox). Then 10 years later you can print the ASIC at home with a Samsung Print-A-Chip that costs $220. That'll be cool. I'll be dead by then, but at least someone will have fun...

    • @jamesparker8529
      @jamesparker8529 3 years ago

      @@n00blamer wait, you can get ASICs that inexpensively? From which company?

    • @n00blamer
      @n00blamer 3 years ago

      @@jamesparker8529 We were just having thought-play about what it'll be like 20-30 years from now... An ASIC is more like 100 grand and up, and if you need more than one round, you'd better have deep pockets... cya in Rev. B, mate.

    • @jamesparker8529
      @jamesparker8529 3 years ago

      @@n00blamer Ah, I thought you were saying that you could get custom ASICs at that price now.

  • @thefrub
    @thefrub 3 years ago +10

    25:05 I recognize this pattern. This is what my screen looked like when my video card died, except it was this pattern overlaid on the regular image.

    • @will5948
      @will5948 3 years ago +3

      My ATI 4850 died that way. Actual video playing but green spots all over the screen.

  • @Icelink256
    @Icelink256 3 years ago +11

    That comb effect is common when you write Atari 2600 games that do just a little too much per scanline.
    The most well-known game with that artifact is Atari's own Pac-Man.

    • @eekee6034
      @eekee6034 2 years ago +1

      Good point! It's odd no-one else brought that up.

  • @charlesjmouse
    @charlesjmouse 3 years ago +11

    An awesome series, thank you very much!
    For anyone interested:
    This scheme of only processing during the blanking intervals is what you might call the standard "retro" method for sharing RAM access between the CPU and the "GPU". It was commonly used in early home computers because the implementation is reasonably straightforward and not too costly in parts. As Ben alluded to, there are other ways. A few of the more common alternatives to consider:
    -Costly: Use dual-ported RAM; the video and CPU then have their own RAM buses and won't clash - great for speed - old-school workstations
    -Custom: Give the "video processor" its own RAM - no clashes and more system RAM, but communication is more complicated - e.g. TMS9918
    -Fast RAM: Clock the RAM twice as fast as the CPU and "GPU" and give each access on alternate cycles - e.g. BBC Micro
    Most "classic" micros used either a variant of Ben's approach or one of the schemes I've mentioned. These days, since SRAM is cheap and fast, if I were smart enough to make a VGA board such as Ben's I'd use the last option and run everything off one crystal for easy synchronisation...
    ...or if I were interested in something more "retro", especially if hacking colour GFX into a Z80 system, I'd absolutely use a TMS9918 or one of its variants.
    P.S.
    I particularly like using TI's chips to augment B/W CP/M systems, as all you need to do is make a TI-based GFX board, choose some appropriate ports to access it, and run the machine's native output through the TI chip's built-in mixing feature. The normal output becomes a layer in the TI chip's output, and you can use the TI chip to add further colour GFX and sprites as you wish. I never understood why a TI GFX board wasn't an option for many such machines; I have kind of done the reverse with my COLECO ADAM to give it 80-column CP/M functionality.

    • @crazythunderchief
      @crazythunderchief 4 months ago

      The TI chip won't allow you to draw pixels directly though...

    • @grrey01
      @grrey01 4 months ago

      It was very common in Arcade machines.
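For a feel of what the blanking-interval scheme discussed in this thread costs, here are rough numbers using the standard 640x480@60 Hz VGA timing (800x525 total clock periods per frame). Ben's circuit uses different counts, so this is only a ballpark comparison with the roughly 70% stall he measures in the video:

```python
# How much CPU time is left if the CPU may only touch RAM during blanking?
# Standard 640x480@60 VGA timing: 800 pixel clocks per line, 525 lines per
# frame, of which 640x480 are active (video owns the bus).
H_TOTAL, H_ACTIVE = 800, 640       # pixel clocks per scanline
V_TOTAL, V_ACTIVE = 525, 480       # scanlines per frame

active = H_ACTIVE * V_ACTIVE                # clocks where the CPU is halted
total = H_TOTAL * V_TOTAL
cpu_fraction = 1 - active / total           # the CPU gets everything else
print(f"CPU can run {cpu_fraction:.1%} of the time")   # roughly 27%
```

The other schemes in the list above (dual-port RAM, separate video RAM, double-speed RAM) all exist precisely to recover that lost ~70%.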

  • @akshanshkmr
    @akshanshkmr 3 years ago +201

    After watching the whole video without skipping, I feel like I have grown new brain cells 😂

    • @dan2800
      @dan2800 3 years ago +9

      Wait thats illegal

    • @Meskalin_
      @Meskalin_ 3 years ago +9

      @@dan2800 wait, that's an outdated meme
      and it doesn't even fit, good job fam

    • @Gamer-uf1kl
      @Gamer-uf1kl 3 years ago +1

      I think you are always "growing" them. They die and reproduce all the time.

    • @RealAndroidTurorials
      @RealAndroidTurorials 3 years ago +2

      @Sin Midani bs check it out, things have evolved

    • @user-me7hx8zf9y
      @user-me7hx8zf9y 3 years ago +1

      @Sin Midani neurogenesis definitely can occur.

  • @nickroach6270
    @nickroach6270 6 months ago +3

    Pressing play on this video invoked the same feeling as opening a Christmas present when I was 8 years old

  • @Sparkette
    @Sparkette 3 years ago +74

    42:57 If the CPU is halted 70% of the time, you'd be getting 30% efficiency, not 70%, right?

    • @Fogolol
      @Fogolol 3 years ago +28

      The man built an entire computer; he's tired, so he stumbled over his words

    • @tonysofla
      @tonysofla 2 years ago +23

      70% inefficiency.

  • 3 years ago +10

    After a workday spent on much higher-level languages, cloud, and basically nothing that feels handcrafted anymore, this is so relaxing to watch.
    Thank you! I saw a cable-stripper machine built with an Arduino the other day. Now I have an appreciation for why it's a cool thing to have. So many tiny wires with exact lengths.

  • @DanielCharry1025
    @DanielCharry1025 3 years ago +11

    Please consider using the VIA timer for a buzzer or something. This absolute unit is ready for a video game.

  • @theshosher
    @theshosher 3 years ago +12

    The ending, where you used up the extra processor time, was a really cool visual demonstration of how much slower the processor runs with that time taken away. Very cool!

  • @UnidayStudio
    @UnidayStudio 3 years ago +13

    The best part of my IT degree is that I can watch and enjoy this masterpiece of a video. Amazing work, keep it up!

  • @Aithan83
    @Aithan83 3 years ago +12

    This might be one of the most interesting video series I've ever seen on YouTube. Hope you'll keep them coming for a long time!

    • @opendstudio7141
      @opendstudio7141 3 years ago

      Check out Sebastian Lague as he explores computers. Interesting since he's a programmer. czcams.com/video/QZwneRb-zqA/video.html

  • @williamsquires3070
    @williamsquires3070 3 years ago +3

    Hi Ben, this is an interesting series, as I learned 6502 assembler on an Apple ][+ way back in '78/'79, so the 6502 has a special place in my heart. Some alternate strategies for this:
    1) use a dedicated "video/CRT-controller" chip to do the heavy lifting (e.g. the VIC-II (Commodore VIC-20/C64), TMS9918 (TI-99/4A), Propeller (PE6502 kit), or PPU (Nintendo NES)), or
    2) interlace frames so only every other line gets drawn, giving the 6502 more processing time (used by the Gigatron), or
    3) a separate RAM chip that shares the same address bus, but whose data bus is used solely by the video circuitry. This would require some sort of video synchronization buffer that takes write requests from the 6502 in the range $2000-$3FFF and stuffs the data into a FIFO buffer; then, in the time interval currently used by the present design, the bytes would be taken out of the FIFO buffer and stuffed into the video RAM while the video circuitry isn't drawing anything (end of each horizontal line). This way, the 6502 can run at full speed. A problem with the current design is that it's questionable what would happen if an interrupt (NMI or IRQ) were to occur while DMA-bar was asserted, halting the 6502.
    4) bite the bullet and use dynamic RAM instead of static RAM; then the video counters could be used to generate the RAS/CAS lines needed by dynamic RAM, and you could interleave memory access between the 6502 (clock phase 0) and the dynamic RAM refresh/video circuitry (clock phase 1).
    On a related note, you could use a 74LS90 decade counter to divide the video clock by 10 to get the CPU clock! This would eliminate the 1 MHz crystal oscillator and synchronize the two clock signals.
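The divide-by-10 suggestion in the comment above is easy to model. This sketch only demonstrates the 10:1 frequency relationship; a real 74LS90's output waveform is asymmetric, and the function name here is invented for illustration.

```python
# Toy model of a decade (divide-by-10) counter: clock it with the 10 MHz
# video clock and it emits one CPU clock edge every 10 video cycles, keeping
# the two clock domains inherently synchronized.
def decade_divider(video_ticks):
    """Count divider-wrap events ("CPU clock edges") over `video_ticks` cycles."""
    count, cpu_edges = 0, 0
    for _ in range(video_ticks):
        count = (count + 1) % 10
        if count == 0:          # counter wraps: one CPU clock edge
            cpu_edges += 1
    return cpu_edges

print(decade_divider(1000))     # 1000 video cycles -> 100 CPU edges
```

With 10,000,000 video ticks per second in, this yields 1,000,000 CPU edges per second out, i.e. the 1 MHz CPU clock without a second oscillator.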

  • @simonscott1121
    @simonscott1121 3 years ago +4

    I didn't think you could instantly silence a 6502 like that, Ben. The C64 has a bunch of logic around it (for sprite DMA, which occurs on the first half of the cycle, and also for reading the character pointers) because the 6502 will only pause on the next *read* cycle... if it is writing, it will continue. In fact, the writes may take 1-3 cycles depending on the instruction that is completing, which would kinda match what you're seeing on the left-hand side of the screen?
    Also, might it be easier to divide down the video clock to run the CPU?
    Awesome series man, loving it.

  • @AsraelDragon
    @AsraelDragon 2 years ago +3

    Seeing this system go from "ultra basic" to "dynamically drawing onto a screen" has been quite a journey! It really makes me feel like I'm in the pioneering age of computers again.

  • @nickstuffinc
    @nickstuffinc 3 years ago +24

    I would love to see the "game of life" on here.

    • @Roxor128
      @Roxor128 3 years ago +2

      The required double-buffering might be a bit tricky to implement, though.

    • @user-me7hx8zf9y
      @user-me7hx8zf9y 3 years ago +1

      Conway would weep.

    • @ianwyrdness1380
      @ianwyrdness1380 3 years ago +1

      @@Roxor128 You can do it without double buffering. Back in the '80s, I wrote a quite efficient version in Motorola 68000 assembly that avoided the need for double buffering by only redrawing the cells that had changed.

  • @smbrown
    @smbrown 3 years ago +33

    How about an RC circuit on reset? A charging capacitor holds reset low long enough after power is applied. Pressing the reset button isn't difficult, but I find the RC reset more elegant.
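The RC power-on-reset idea above can be sanity-checked with the capacitor-charging equation V(t) = Vcc·(1 - e^(-t/RC)). The threshold, resistor, and capacitor values below are illustrative assumptions, not values from the video:

```python
# How long does an RC power-on-reset hold the reset line low? The line
# crosses the logic-high threshold Vth at t = RC * ln(Vcc / (Vcc - Vth)).
import math

VCC, VTH = 5.0, 2.0        # supply and an assumed logic-high threshold
R, C = 100e3, 1e-6         # example values: 100 kOhm, 1 uF

t_reset = R * C * math.log(VCC / (VCC - VTH))
print(f"reset held low for ~{t_reset * 1000:.0f} ms")
```

With these values the reset line stays low for roughly 50 ms, comfortably longer than the couple of clock cycles the 6502's reset input needs; in practice a diode across R is usually added so the capacitor discharges quickly at power-off.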

  • @joejia1410
    @joejia1410 3 years ago +9

    Ben must have seen the stock of other graphics cards, pooped himself, then proceeded to make his own. Neatly.

  • @devonmcnealy8900
    @devonmcnealy8900 3 years ago +1

    Love the project! Just watched from the start to here. I feel like this is a better format for learning the basics than any school lol. I appreciate the recapping of everything; I needed it. Keep up the good work!

  • @celsowebber7252
    @celsowebber7252 2 years ago +5

    Your videos are always inspiring, Ben! I am really impressed by how you make it simple to comprehend the innermost concepts of what happens inside a modern computer. I really love how you link a logic diagram to the physical logic chips on the breadboards. Congratulations!

  • @bertholtappels1081
    @bertholtappels1081 3 years ago +7

    This is extraordinarily good content, both entertaining and informative. There’s no equivalent out there. Thank you for taking the time to do this.

  • @Squib
    @Squib 3 years ago +8

    The output at 25:03 is such an aesthetic

    • @Sotch_Nam
      @Sotch_Nam 3 years ago +1

      I would love to have a high-resolution screenshot of it, but then I remembered we're working with this breadboard PC

  • @Taurickk
    @Taurickk 3 years ago +3

    It's amazing seeing the issues you ran into building this and realizing how much work went into the computer I'm typing this comment on.
    Also really cool seeing the CPU bottleneck here and an explanation of how/why that happens.

  • @scottlarson1548
    @scottlarson1548 3 years ago +6

    Now I know why old computers never used separate clocks for the video and the CPU. It's a quick way to make small problems that are hard to solve.

    • @Havron
      @Havron 3 months ago +1

      Indeed. I was wondering why he didn't just use the 2's place on the horizontal counter as the clock for the computer part. Since the computer is running at 10 MHz and the video card at 40 MHz, and the horizontal counter is running continuously and its max count (264) is divisible by 2, that pin is running at a perfect 10 MHz already and is also synced with the video card's clock. So, in theory, he could have avoided all that clock sync logic by simply using that pin from the counter as the computer's clock.
      I suppose the video would have been less edifying and fun without all the clock sync troubleshooting, though.
      EDIT: Whoops, I made a boo-boo: The video card is actually running at 10 MHz and the computer at 1 MHz. So, no, those won't divide evenly as a power of two, although they do divide evenly in general. So, you could solve this by setting up a little decade counter circuit, or just run the CPU at 1.25 MHz via the 4's place in the horizontal counter (since 264 is also divisible by 4). I would have given the latter a try, assuming there isn't anything particular about the 6502 that requires it to run at exactly 1 MHz. Indeed, according to a quick Google search, the 6502 can run at any clock speed between 1 MHz and 3 MHz, so you could even run it at 2.5 MHz using the same 2's place pin as I had originally suggested, and get a 2.5x boost in processing speed to boot!
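The counter-bit arithmetic in the comment above is easy to verify (a quick sanity check using the comment's 10 MHz dot clock and 264-count horizontal counter; a binary counter bit of weight w toggles once every 2·w input clocks):

```python
def derived_clock(pixel_clock_hz, bit_weight):
    """Frequency of a binary counter bit: the bit of weight w toggles
    once every 2 * w input clocks."""
    return pixel_clock_hz / (2 * bit_weight)

PIXEL_CLOCK = 10_000_000  # video dot clock from the comment
H_TOTAL = 264             # horizontal counter wraps at 264

for w in (1, 2, 4):
    f = derived_clock(PIXEL_CLOCK, w)
    # The derived clock stays phase-locked to each scan line only if
    # the counter's wrap value is a multiple of the bit's full period.
    aligned = H_TOTAL % (2 * w) == 0
    print(f"bit weight {w}: {f / 1e6:.2f} MHz, line-aligned: {aligned}")
```

This confirms the comment's figures: the 2's-place bit yields 2.5 MHz and the 4's-place bit 1.25 MHz, and both stay aligned per scan line because 264 is divisible by 4 and by 8.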

  • @leobottaro
    @leobottaro 3 years ago +16

    This is awesome, Ben! As an embedded systems developer, I am amazed by what you achieved! Please program Pong on it :)

  • @pseudocup2848
    @pseudocup2848 3 years ago +7

    This new Dr. Stone episode is looking epic

  • @MrOnosa
    @MrOnosa 1 year ago

    43:00 I am glad you switched the configuration to show the cleaner but slower processing method. I was really wanting to see that.
    Fantastic video series, thank you.

  • @Myndale
    @Myndale 3 years ago +31

    Ben: what kind of keyboard are you using? Sounds like a good programmer keyboard.

    • @Artillect0
      @Artillect0 3 years ago +3

      It sounds a lot like an IBM Model M, but it could be a keyboard with blue switches. I'm sure someone more knowledgeable than me about mechanical keyboards could tell you exactly what it is

    • @roninkoi
      @roninkoi 3 years ago

      I have a Model M replica, but this doesn't sound like a Model M. The sound is too high-pitched and clicky

    • @simeondermaats
      @simeondermaats 3 years ago

      It definitely sounds like a keyboard with blue switches. I find browns a bit nicer for typing, but that's mostly personal preference.

    • @Setsuna_Kyoura
      @Setsuna_Kyoura 3 years ago +2

      I think it's definitely an IBM Model M. I have an original Model M from 1987 (gray badge) and it sounds EXACTLY like his in the video.

  • @karmanyaahm
    @karmanyaahm 3 years ago +19

    AMD showing off their Smart Access Memory in 2020, while this 6502 supports direct memory access
    I know one is GPU-to-CPU vs. CPU-to-GPU

  • @der.Schtefan
    @der.Schtefan 3 years ago +12

    I could listen to him for hours, and every time an episode ends I panic and want more ;)

  • @crazyivan7884
    @crazyivan7884 3 years ago

    Every time you post a video there's a little bit of joy that bubbles up in me. As ever - great video. The waiting time between each is really worth it. I wonder if you will ever couple this to your 8-bit breadboard computer.

  • @michaelwoods7144
    @michaelwoods7144 3 years ago

    I just recently found your channel and I love your content. It also makes me really appreciate that we used Logisim instead of breadboards when I took computer organization and architecture!

  • @Passiday
    @Passiday 3 years ago +6

    Well, that was one hell of a hack. Even the wires got messy.

  • @bustaballs
    @bustaballs 3 years ago +13

    Next video: Ben is thrown on a desert island and has to make a computer from scratch purely out of the resources available on the island.

  • @MrMaxeemum
    @MrMaxeemum 3 years ago

    I absolutely love this video series. I find it so easy to follow your electronics explanations, but I fall behind a bit with the software side. I understand eventually, but I find "8-Bit Show And Tell" easier to understand, which I think is due to the pace rather than the different architecture. But I still love your work. Thumbs up.

  • @alvamiga
    @alvamiga 8 months ago +2

    I was only casually interested in this, but it proved to be very informative, explaining a lot of things I knew about from my old computers but had never known the exact reasons for.
    There's obviously a lot that can be done to improve it, but better people than me have already done the hard work! ;)

  • @Fuartianer
    @Fuartianer 3 years ago +3

    I think Ben is just short for "bending all these cables".
    Nice video!

  • @dexterman6361
    @dexterman6361 3 years ago +11

    36:24 That's the cutest wire I've ever seen!

  • @GeertKok
    @GeertKok 3 years ago +2

    I am glad you took the challenge to connect the projects. Very wonderful to see you solve the issues

  • @killerguppy2988
    @killerguppy2988 3 years ago

    Another absolutely amazing video. Informative and entertaining. Can't wait to see what you do with this next!

  • @AL_O0
    @AL_O0 3 years ago +54

    Could using the "back porch" signal to take control of the bus, rather than the vertical blanking signal, be a better way of dealing with the artifacting at the start of each scan line, or would that be too impractical?

    • @rickdearman9992
      @rickdearman9992 3 years ago +2

      Yeah that is what I was thinking.

    • @josugambee3701
      @josugambee3701 3 years ago +17

      Seems like you could divide the graphics clock down to drive the CPU. That way you can decode the pixel counter and halt the CPU when the graphics card is about to grab the bus.

    • @cezarydudek6156
      @cezarydudek6156 3 years ago +4

      @@josugambee3701 Would it be appropriate then to call this a CPU with integrated video card, or rather a video card with integrated CPU?

    • @shinyhappyrem8728
      @shinyhappyrem8728 3 years ago +4

      @@cezarydudek6156 It becomes one system with integrated CPU and graphics card.

    • @ronaldddoooo
      @ronaldddoooo 3 years ago +8

      @@cezarydudek6156 2020: integrated graphics card
      2030: integrated CPU

  • @phip1611
    @phip1611 3 years ago +16

    Hi Ben! Greetings from Germany! I just want to say thank you for your high-quality content! I'm a CS student at TU Dresden. The courses are generally pretty good, but nothing compared to the kind of videos you make. So thanks! :)

    • @Anvilshock
      @Anvilshock 3 years ago

      A CS student who voluntarily exposes his identity on the internet. Maybe change your field into Applied Arts and develop the next Fontus, why don't you?

    • @phip1611
      @phip1611 3 years ago

      @@Anvilshock Thank you for your good proposal. I'm definitely going to think about it ;)

    • @user-ld7vl9sk4s
      @user-ld7vl9sk4s 3 years ago +1

      finally a normal commenter

  • @bitrot42
    @bitrot42 3 years ago

    You are a master of minimalist design. There are of course a thousand ways to do this “better”, but you choose the simplest, easiest to understand solution, with just a few hints at other approaches.
    The extremely slow screen updates show why early 80s games didn’t usually have the CPU writing to a large, raw frame buffer. The video hardware would draw background tiles and foreground sprites, and the CPU would write to a much smaller block of RAM to control what gets drawn where. It could pretty easily update this info on every vsync, so the motion was at a rock-solid 60fps.
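The tile-based scheme bitrot42 describes can be sketched as a toy renderer (illustrative 2x2 tiles, not any particular machine's layout): the CPU touches only the small tile map, and the "video hardware" expands it to pixels on every frame.

```python
# Toy tile renderer: the CPU only updates a small tile map; the video
# side expands tile IDs into pixels on the fly during scan-out.
TILES = {
    0: [[0, 0], [0, 0]],   # blank 2x2 tile
    1: [[1, 1], [1, 1]],   # solid 2x2 tile
}

def render(tile_map, tile_px=2):
    rows = len(tile_map) * tile_px
    cols = len(tile_map[0]) * tile_px
    frame = [[0] * cols for _ in range(rows)]
    for ty, row in enumerate(tile_map):
        for tx, tile_id in enumerate(row):
            tile = TILES[tile_id]
            for y in range(tile_px):
                for x in range(tile_px):
                    frame[ty * tile_px + y][tx * tile_px + x] = tile[y][x]
    return frame

# Moving an object is one byte written to the tile map, not thousands
# of pixel writes into a raw frame buffer.
print(render([[0, 1], [0, 0]]))
```

That one-write-per-tile property is what made rock-solid 60 fps motion feasible on CPUs far too slow to repaint a whole frame buffer.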

  • @ScientificSolutions
    @ScientificSolutions 3 years ago

    Love this video stuff, thanks for the nice descriptive tour! Hmmm, I don’t see enough decoupling capacitors though; there should be one at every chip.

  • @charlesjmouse
    @charlesjmouse 3 years ago +3

    Thank you for this interesting and enjoyable series, much appreciated.
    If you happen to develop this theme further might I suggest a version where the RAM is driven twice as fast as the VGA logic and CPU, the two synchronised to the same clock? CPU and VGA can then alternate access to RAM so simplifying the logic and allowing the CPU to run at full speed...
    -Current design a method for sharing access where RAM/ROM is slow (typical of more basic retro systems)
    -Suggested scheme being one method to simplify / gain efficiency where RAM is fast enough (BBC Micro an example)
    -It might then be a "fun" jumping-off point for what to do if RAM is fast enough but ROM isn't
    Thanks again.
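The interleaved-access scheme suggested above reduces to a fixed bus schedule (a toy timeline, not a timing-accurate model): with RAM clocked at twice the CPU/VGA rate, even RAM slots always belong to the video side and odd slots to the CPU, so neither ever waits.

```python
# Toy timeline of double-rate RAM shared between VGA and CPU: the bus
# owner alternates every RAM cycle, so each side sees its own
# uncontended half-rate memory.
def bus_owner(ram_cycle):
    return "VGA" if ram_cycle % 2 == 0 else "CPU"

timeline = [bus_owner(t) for t in range(6)]
print(timeline)  # -> ['VGA', 'CPU', 'VGA', 'CPU', 'VGA', 'CPU']
```

This is essentially the arrangement the comment attributes to the BBC Micro: no halt logic at all, just a rigid time-slice.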

  • @jakesnell7707
    @jakesnell7707 3 years ago +5

    COVID-19: *starts*
    Ben: "time to build a computer and graphics card from scratch"

  • @xxlabratxx01
    @xxlabratxx01 3 years ago

    Well worth the watch. Diagnostic section in particular. Thank you

  • @AlexanderMLarsen
    @AlexanderMLarsen 1 year ago

    I've been looking for this channel for so long!

  • @Skyliner_369
    @Skyliner_369 3 years ago +2

    The next upgrade would be to make the video memory separate. Still part of the address bus, but separate, so that execution can keep going during the frame, even when it's not a display instruction

  • @retroand
    @retroand 3 years ago +41

    What about having isolated VRAM and using 74157 to select what address/control signals are going in? That would solve the problem.

    • @aydna3317
      @aydna3317 3 years ago +32

      That is of course a better approach and closer to what modern video cards are doing. But I think Ben here is trying to make something that is similar to early computers and is as simple as possible.

    • @retroand
      @retroand 3 years ago +12

      @@aydna3317 As modern as a Commodore PET... 😁

    • @toto123456ish
      @toto123456ish 3 years ago +4

      @@retroand Doesn't he mention this possibility in the previous video?

    • @retroand
      @retroand 3 years ago +2

      @@toto123456ish Maybe, but has he used the approach Commodore and many, many other manufacturers took 40+ years ago?
      I think it would be great to see the 6502 syncing with that circuit without flaws. He has already done the most challenging part, which is the video system itself. For this reason I still don't understand why he wouldn't improve the interface just a little more. It could also be useful as an example of shared memory.

    • @retroand
      @retroand 3 years ago +7

      @lass kinn The contest was lost before he started building this. Later PETs can destroy the monitor (killer poke), and I've heard the Sinclair QL video system can be destroyed just by unplugging the video connector while the computer is running (I have a couple of them but don't pretend I'm going to try 😁). So, nothing worse than those.
      His series about the video circuit is inspiring; it's just that I feel this could be improved greatly without being that expensive on the component side.
      As a remark, this can't be called a GPU (too primitive). Video system (or subsystem) is a better term.
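The 74157 arrangement suggested at the top of this thread boils down to a 2-to-1 address mux in front of a dedicated VRAM (a behavioral sketch, not a pin-accurate model): the select line routes either the video counters or the CPU address to the VRAM pins.

```python
# Behavioral model of a 74157-style 2-to-1 mux feeding a dedicated
# VRAM: during active video the select line routes the video counters
# to the address pins; during blanking (or CPU access slots) it routes
# the CPU address instead.
def vram_address(select_cpu, cpu_addr, video_addr):
    return (cpu_addr if select_cpu else video_addr) & 0xFFFF

print(hex(vram_address(False, 0x1234, 0x00A0)))  # -> '0xa0'
print(hex(vram_address(True, 0x1234, 0x00A0)))   # -> '0x1234'
```

With the VRAM isolated behind the mux, the CPU's main bus never has to be tri-stated at all; only writes into the video window need arbitration.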

  • @meowcula
    @meowcula 7 months ago

    "... just to clean this up a bit" - you already are the world record holder for neatest breadboard prototyping, I'm very sure!

  • @bugmuff
    @bugmuff 5 months ago +1

    This blows my mind. So awesome. I've learned so much from your videos, and every time I rewatch I learn more.
    Could you ever run BASIC on this and output a Commodore 64-style user interface?

  • @byronwatkins2565
    @byronwatkins2565 3 years ago +9

    Actually, the "repainting" anomaly is due to the lack of synchronization between the video card painting and the processor writing pixels. The processor will be ahead of the video card but, due to the small interval at the end of a line devoted to the processor, the video card overtakes the processor and begins using "old" pixels that have not been re-written yet. Incorporating a wait loop until the blanking at the bottom is reached (sensed by an input) before beginning to write a new frame will eliminate this. After writing the new frame, you need to wait for the video card to begin painting the new frame before entering the first wait loop again.

    • @thargy
      @thargy 3 years ago +1

      You’d be right on a modern architecture, or even a 1980s ‘hi-performance’ architecture where the frame time is comparable with the cpu frame generation time. That’s not the case here, as the CPU is running much more slowly, and takes numerous frames to cycle through the video buffer. So Ben’s description is entirely correct.

    • @Eagle0600
      @Eagle0600 3 years ago

      @@thargy Apart from the part where he says we're only getting 70% efficiency out of the computer. We're actually only getting 30% efficiency, but I'm sure that was just a verbal slip.

    • @thargy
      @thargy 3 years ago

      @@Eagle0600 yeah I spotted that, it’s really hard talking to camera!
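The wait-for-vblank loop proposed at the top of this thread looks roughly like this (Python-flavoured sketch with hypothetical read_vblank/write_frame helpers; on the real board, vblank would be a status bit the 6502 polls):

```python
def draw_loop(read_vblank, write_frame, frames):
    """Synchronize frame updates to vertical blanking: wait for
    blanking, write the new frame, then wait for painting to resume."""
    for _ in range(frames):
        while not read_vblank():   # spin until the beam is off screen
            pass
        write_frame()              # safe to rewrite pixels now
        while read_vblank():       # wait until scan-out restarts
            pass

# Demo with a canned vblank signal (False = painting, True = blanking).
signal = iter([False, True, True, False, True, False])
writes = []
draw_loop(lambda: next(signal), lambda: writes.append("frame"), frames=2)
print(writes)  # -> ['frame', 'frame']
```

As the replies note, this only eliminates tearing when the CPU can finish its writes within one blanking interval, which this 1 MHz machine cannot do for a full buffer.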

  • @lemonglataitor2123
    @lemonglataitor2123 3 years ago +9

    I have been watching for years now, and I think I finally need to get one of these amazing kits this Christmas!

  • @sebastianweigand
    @sebastianweigand 1 year ago +1

    Thanks! Love the content! I'd love to see a follow-up video that updates the output protocol from VGA to DVI, just to show the evolution of electronic standards. Might be annoyingly difficult, but then again, so is building a computer and video card using breadboards!

    • @KirbyZhang
      @KirbyZhang 1 year ago

      My guess is it will have much faster timings than the 6502

  • @dedr4m
    @dedr4m 3 years ago

    Absolutely loving this series. I'd not have thought of how to do the video bit; I'd probably have used a different approach with two banks of RAM, memory-mapped to a byte controlling which bank is written to, or made a DMA circuit as you did and added a quick way to mirror the current memory into the current bank as a kind of local cache (kinda how old-school VGA did it, with the bank-flipping and bank-scrolling modes whose onboard RAM mirrored system RAM at location 0x0000000Ah of the main memory map).
    .
    Also, in theory, your VGA+CPU could be combined into one GPU processing unit to let an independent CPU do more independent processing, i.e. fill GPU RAM banks, then recall individual banks when needed, leaving the main CPU free from doing all the rendering.
    Can't wait to see where you take this next :)

  • @no1leader135
    @no1leader135 3 years ago +3

    What an amazing video. I love how you combined this video card with a 6502 and a little bit of electronics.

  • @RamiSlicer
    @RamiSlicer 3 years ago +15

    I guess it's confusion time for me

  • @SytheZN
    @SytheZN 3 years ago

    Loving this series!
    Please add dedicated video memory and buffer it in so the CPU and VPU can work (mostly) simultaneously

  • @danny_racho
    @danny_racho 3 years ago +2

    Hey Ben! I really love watching your videos, since you go through every detail and, most importantly, you read the datasheets, which can be very tricky to decipher sometimes. I am not supporting you on Patreon or anything, since I am still a student, but I would like to kindly ask if you could make one or more videos about op-amps, for instance the non-inverting amplifier or the integrator, and go through some datasheets, telling us how to pick the right resistors wisely, cascade stages, or even use them as analog controllers!
    Nobody stresses this subject anymore, since we nowadays have all the digital controllers, which take away all the fun of building one on our own! :) Maybe just take that into consideration.

  • @thargy
    @thargy 3 years ago +7

    I love that you’re sharing this skill set that will too easily become a lost art. I particularly like how you work through the problem solving, showing the problem, proposing a solution and testing. These skills are so vital even in today’s modern development teams. Thanks for your efforts!

  • @weepgamer
    @weepgamer 3 years ago +4

    I've been checking your channel so much recently to see when this video comes out, and it's finally here!!!

  • @dudeyouhavenoidea
    @dudeyouhavenoidea 3 years ago

    I always get motivation from this channel, man I love you ❤

  • @stylesoftware
    @stylesoftware 3 years ago

    I've been watching for a while, and I enjoy your videos. They actually help me understand programming VGA on FPGAs, which I'm new at. Very cool.

  • @johnsherby9130
    @johnsherby9130 3 years ago +4

    95% of this is going over my head but it’s still so damn cool.

  • @blahdelablah
    @blahdelablah 3 years ago +13

    To fix the visual issues and improve performance, could you have two display buffers, have one you're displaying and another you're updating, and switch between these buffers at the end of the display cycle?

    • @researchandbuild1751
      @researchandbuild1751 1 year ago

      Exactly what I was thinking. It would actually be pretty easy to implement too, with less coordination required between the CPU and video: just a simple memory "swap" clock cycle.
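The page-flip scheme this thread proposes can be sketched in a few lines (a conceptual model; in hardware the "swap" would be a select line steering two RAM chips between the CPU and the video counters):

```python
# Double buffering: the CPU draws into the back buffer while the video
# side scans out the front buffer; swapping just exchanges which
# buffer each side points at.
class DoubleBuffer:
    def __init__(self, size):
        self.front = [0] * size   # being scanned out to the monitor
        self.back = [0] * size    # being drawn into by the CPU

    def swap(self):
        """Called at vsync: the finished frame becomes visible."""
        self.front, self.back = self.back, self.front

fb = DoubleBuffer(4)
fb.back[0] = 1            # CPU writes never touch the visible buffer
fb.swap()                 # at vsync, flip buffers
print(fb.front)           # -> [1, 0, 0, 0]
```

The cost is doubling the video RAM, which is why many early machines preferred the tile-map approach mentioned elsewhere in these comments.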

  • @XCATX25
    @XCATX25 3 years ago

    Your videos kinda have everything: production value (preparing everything needed beforehand, including cables already cut and ready to plug in), educational value, crazy shenanigans. It's cool, it's crazy, it's great.

  • @Astr0-1408
    @Astr0-1408 1 year ago +1

    When you tell your IT manager you need something that "can just output an image"...

  • @TekgraFX101
    @TekgraFX101 3 years ago +7

    Listening to you makes me believe I actually understand what's going on here.

  • @Akumetsu02
    @Akumetsu02 3 years ago +5

    Make it render the Mandelbrot set... it needs to happen :D

  • @anthonyortiz7924
    @anthonyortiz7924 1 year ago

    Ben, this is amazing stuff... I love these videos, keep up the great work!

  • @Marci124
    @Marci124 3 years ago +1

    With the LCD, this is a dual monitor, multi-resolution system. Your platform can do something that many desktop environments still struggle with.