I Designed My Own 16-bit CPU

Comments • 1K

  • @AstroSamDev
    @AstroSamDev  1 year ago +566

    It sure has been a while since my last video, and there are so many new faces here!
    If you want to interact more with the community, I have a discord server where we talk about cool and random programming things, and you can see sneak peeks of future videos! discord.gg/9p82dTEdkN
    I will also start streaming programming soon on my second channel here: czcams.com/channels/QSOig4wEV_pAFPRg0qtSmQ.html

  • @davidm7333
    @davidm7333 1 year ago +1290

    As a programmer, nothing makes you feel quite as much like a fraud as seeing someone create their own CPU, build an emulator for it, create a programming language and compiler, then write a game in it 😅
    Well done and keep up the good work!

    • @AstroSamDev
      @AstroSamDev  1 year ago +227

      Thank you!
      Also, I think the biggest problem with this project is time. I believe anyone can do it, easily, given there are so many resources online. But finding time to do the research and make the design can be difficult. I didn't know anything about computer hardware, but I did research (and watched a lot of Ben Eater), and then a month later I made this CPU. It is relatively basic, but it is a start.

    • @seannibecker5500
      @seannibecker5500 8 months ago +8

      Wow, are you just super smart or something @@AstroSamDev

    • @Thiquid
      @Thiquid 7 months ago +13

      ​@AstroSamDev Even still, it takes incredible dedication to complete. Even if there's enjoyment to it, what you've done is beyond impressive!

    • @monad_tcp
      @monad_tcp 6 months ago +1

      Do you use programming languages that were not made for browsers? (aka, not JS)
      If you answer yes to that, you're not a fraud.

    • @okie9025
      @okie9025 5 months ago +7

      @@monad_tcp you wouldn't be able to write this comment without JS

  • @ByWire-yk8eh
    @ByWire-yk8eh 1 year ago +1404

    I designed a 16-bit CPU (microprocessor) in 1977 that was used in a mainframe as the I/O processor. It had a unique instruction set (RISC). Lots of fun, and a lot of work.

    • @helloworld145
      @helloworld145 1 year ago +79

      16 bit in 1977? That's insane

    • @OpenGL4ever
      @OpenGL4ever 1 year ago +144

      @@helloworld145 Not for a mainframe. But the price tag sure was insane.

    • @CaptianTitan
      @CaptianTitan 10 months ago +12

      Cool

    • @herrbonk3635
      @herrbonk3635 10 months ago +57

      @@helloworld145 There were 64-bit computers in the early 1960s.

    • @KuroParasite
      @KuroParasite 9 months ago +22

      @@herrbonk3635 64-bit computers must have been crazy expensive back then lol.

  • @timedmondson1307
    @timedmondson1307 1 year ago +156

    I have been involved in computer engineering since the 90's, and seeing how easily accessible the knowledge is now is mind blowing. I had to spend years on what he has done in a few months. This is truly the best time to be alive.

    • @handlehaggler
      @handlehaggler 5 months ago +9

      I can't imagine what it must be like for the people who were there from around the beginning of all this craze to where we are now. At some point I imagine you knew everything there was to know, but as time moved on new products, programs and such became available, and after a while it just becomes too much to learn and you just watch in awe at how dense and complex it's become.

    • @MarkieMark-vy7hg
      @MarkieMark-vy7hg 2 months ago +1

      @@handlehaggler the lady who helped invent VoIP is still alive…she was one of the hallmark individuals in the creation of the World Wide Web. It’s so wild to me.

  • @Roter_Wolf
    @Roter_Wolf 1 year ago +124

    When I was 16 I had just started learning my first programming language. Meanwhile this guy is casually creating his own languages and also designing a computer from scratch while he's at it.
    You're going places, man

    • @angryvaultguy
      @angryvaultguy 2 days ago

      You think you feel underskilled? I'm 22 and just learning Visual Studio Code and Django.

  • @maindepth8830
    @maindepth8830 1 year ago +4034

    The fact he did all this in under 4 months is impressive

    • @dimi5862
      @dimi5862 1 year ago +51

      I can do it in one

    • @maindepth8830
      @maindepth8830 1 year ago +238

      @@dimi5862 go on, show me, everything from scratch, from nothing. As of this comment the date is Saturday 23rd July 2022.
      You can work during live streams to make sure you didn't cheat.

    • @AkariInsko
      @AkariInsko 1 year ago +160

      i can do it in one
      (millenium).

    • @brysonw7270
      @brysonw7270 1 year ago +6

      @@maindepth8830 lol but true

    • @brysonw7270
      @brysonw7270 1 year ago +8

      @@dimi5862 prove it

  • @le9038
    @le9038 1 year ago +5514

    >>creates own programming language
    >>learns and creates a CPU with its own assembly and microcode
    this man is becoming a god.

  • @nichonaugle5659
      @nichonaugle5659 1 year ago +5

      Incredible!! This is the most complex thing I've ever picked up on so quickly. You literally made this sound so straightforward, I love it. Thank you so much!!!

  • @asyncawait5335
      @asyncawait5335 1 year ago +7

      Congrats on everything man, I'm glad to see the progress you are making!

  • @Azeal
    @Azeal 1 year ago +1713

    You have a level of knowledge about low-level computing that I wish I had in, well... anything. Incredible work, continue to hone your talents and I'm looking forward to that build when the day arrives!

    • @alperkaya8919
      @alperkaya8919 1 year ago +22

      Just read textbooks

    • @elakstein
      @elakstein 1 year ago +12

      These things are taught in engineering.

    • @greatbullet7372
      @greatbullet7372 1 year ago +28

      @@elakstein no they don't!

    • @elakstein
      @elakstein 1 year ago +5

      @@greatbullet7372 ok. I will correct myself. I learnt these things in college. Obviously not as deep as he showed in this video.

    • @vishnuskandata2355
      @vishnuskandata2355 1 year ago +3

      @@elakstein true... but we somehow don't understand or learn that, yet a person like this can teach it in just hours...

  • @theburntcrumpet8371
    @theburntcrumpet8371 1 year ago +912

    I've been programming for 8 years. I have recently fallen out of love with it. You have reminded me why I loved it in the first place - the ingenuity of simple yet expansible design, the thrill of problem solving and the marriage of hardware and software. Thank you.

    • @memeology4138
      @memeology4138 1 year ago +1

      Hello, can u help me? I'm new to coding, can u link me to any good sources for learning Python?

    • @elakstein
      @elakstein 1 year ago +36

      @@memeology4138 learn C first. Python abstracts just too much. Your first language should be C if you want to understand things. I would recommend "Let Us C" for learning C. We used this to learn C in college, then I learnt Java for understanding object-oriented languages, then Python.

    • @abcdefghjijklfgu
      @abcdefghjijklfgu 1 year ago +2

      @@memeology4138 I'm new too but I'm doing C and JS

    • @memeology4138
      @memeology4138 1 year ago +2

      @@elakstein thx for ur help

    • @tacorito1809
      @tacorito1809 1 year ago +6

      @@abcdefghjijklfgu why C and JS? Learn one language first, master it. Plus, JS is usually nothing without HTML/CSS, unless you're learning React and stuff.

  • @UNIT398
    @UNIT398 1 year ago +24

    Simply amazing. The fact that you did all this for fun (mostly) speaks to your intelligence and understanding of how technology works. Yes, my brain didn't understand most of it, but seeing you construct each layer, part, and program was fascinating! I hope to see that creation of the CPU and other parts!

    • @JesusisJesus
      @JesusisJesus 9 months ago +2

      I understood about 0000000000000000% of it.
      Wondering WHY he did it, when there’s stuff already available and cheap these days. I guess it boils down to “Because I can”.

  • @amardevsharma1223
    @amardevsharma1223 1 year ago +2

    Yo, this helped so much. I always appreciate the content, and ever since I found the channel and got the energy from you in the previous video, you've been nothing but real. I can vouch for the amazing content and how down to earth you are with everything! All the love, respect, and appreciation.

  • @Auxilor
    @Auxilor 1 year ago +583

    Amazing. Deserves far more views! Would be great to see you make this CPU for real like the jdh-8

    • @AstroSamDev
      @AstroSamDev  1 year ago +126

      I was planning on doing just that; it might just take some time though. I also want to do a design that doesn't use breadboards, and instead have a few custom PCBs printed.

    • @AstroSamDev
      @AstroSamDev  1 year ago +59

      @@avinadadmendez4019 Cool, I have never heard of that program, I'll look into it more. Thanks!

    • @pixobit5882
      @pixobit5882 1 year ago +14

      @@AstroSamDev I would buy it if it's not too expensive! I really like this project and I think it would be really fun to develop my own programs that run on real hardware!

    • @cemalettincembelentepe8943
      @cemalettincembelentepe8943 1 year ago +20

      @@AstroSamDev @AstroSam You can also put your design on an FPGA; I don't think I have ever seen that done on YouTube before for a homebrew CPU. For those who don't know: FPGAs are chips onto which you can load any digital circuit. For example, you can put your circuit on an FPGA and map the input/output pins of the top module, and the chip now behaves as the circuit you created.

    • @xanderkyron
      @xanderkyron 1 year ago +7

      @@cemalettincembelentepe8943 FPGAs have been used for soft CPUs for a very long time now (well over a decade), it would be more interesting to see a hard CPU manufacturing approach and it would be unique for these homebrew CPUs unlike using an FPGA. Granted he'd be stuck getting them as samples on a really old process node (>=28nm), but that's more than good enough for this purpose

  • @D.a.n_D
    @D.a.n_D 1 year ago +227

    So I just barely passed my computer architecture course. We mostly did a 32-bit MIPS processor, and we also had to design one ourselves, but on 16 bits, and we didn't have to add multiplication or division. We only implemented about 15 instructions, and I can tell you what this man did right here is no easy work. You deserve way more subscribers for the amount of work you put in man. Great work!

  • @snowflakezzonPC
    @snowflakezzonPC 1 year ago +4

    I really feel like it's understated how incredible this video really is. Amazing work, man!

  • @daviddafitt
    @daviddafitt 1 year ago +2

    This is amazing. I've always wanted to try things like this but I'm just a busy CS student for now. I'll come back to this if I do get time to start.

  • @nathanaeltrimm2720
    @nathanaeltrimm2720 1 year ago +396

    Man, the sequel to templeOS is looking wild, I’m just waiting for the twist the writers have planned.

    • @BigOrse
      @BigOrse 1 year ago +100

      It's just TempleOS minus the schizophrenia

    • @HBP27
      @HBP27 1 year ago +35

      @@BigOrse so it's just boring then

    • @BigOrse
      @BigOrse 1 year ago +11

      @@test-zg4hv What so I should praise the man for being completely insane?

    • @tsclly2377
      @tsclly2377 1 year ago +3

      @@BigOrse I'll wait on that diagnosis..

    • @BigOrse
      @BigOrse 1 year ago +12

      Not coming here to defend myself, I don't care if what I said was stupid. To anyone saying I'm disrespectful though I suggest you maybe actually look into Terry Davis as a person, because he was a lot more disrespectful than I am.
      Anyway RIP Terry. I may not agree with his ideology but can't deny the man was a pretty legendary programmer.

  • @polic72andDrD3ath
    @polic72andDrD3ath 1 year ago +92

    It's been a while since I've taken a comp org class; this was the perfect bridge of software and hardware that reminded me why I loved it so much!

  • @3v068
    @3v068 1 year ago +2

    The first video of yours I'm watching and I love it! I've always been interested in computers and specifically how they work, and I've also wanted to get into circuit building, so this is RIGHT up my alley. Your editing is fucking amazing too! Keep it up dude!

  • @rezqy_
    @rezqy_ 1 year ago +9

    "I built my own earth"

    • @SusDoctor
      @SusDoctor 23 days ago +2

      I built the entire observable universe from scratch with real breathing life forms in it.

    • @Anirudhfou4
      @Anirudhfou4 14 days ago

      👍

  • @lewisd56
    @lewisd56 1 year ago +293

    I did something similar to this a good few years ago: I designed an 8-bit CPU in logicSim, wrote an emulator in C#, and designed an assembly language and assembler. Unfortunately, I made a lot of mistakes in designing the architecture, things like fixed instruction lengths and no way to store or perform operations on values larger than 8 bits (technically it was possible, but with 256 bytes of memory and 8 of them reserved for registers it wasn't very useful anyway). Having recently written a Z80 simulator, and learning a lot about CPU architecture in the process, helped me identify all of the issues in my initial design. With that said, it wasn't bad for an A-level project that confused the hell out of everyone who tried to mark it.

    • @absalomdraconis
      @absalomdraconis 1 year ago +3

      256 addresses and some external logic should be enough to implement a microcode architecture.

    • @TechTino
      @TechTino 1 year ago +13

      A-level??? Dude that's unreal. I built an sql table viewer in VB...

    • @shinyhappyrem8728
      @shinyhappyrem8728 1 year ago +4

      Fixed instruction length isn't that bad, it's what RISC uses.

    • @lewisd56
      @lewisd56 1 year ago +2

      @@shinyhappyrem8728 considering the memory limitations, every instruction being 3 bytes long is an issue (command operand1 operand2), some instructions didn't need operands, such as halt, but would still need 3 bytes in RAM.

    • @lewisd56
      @lewisd56 1 year ago +4

      @Ramtin Javanmardi If you understand how the architecture works, writing an emulator isn't really too hard. You could even write a simulator if you wanted, but then you need to get instruction timings and all sorts of hardware stuff simulated too, which can be a hassle.
      Basically, you really need 2 core things for an emulator: an ISA, which you should have if you designed the microarchitecture, and some way to interpret the ISA; really this will probably be high-level logic in your chosen programming language.
      I simply used a switch on binary opcodes that called methods with the operands (remember, because this is an emulator, memory read and write timings don't matter), but a better approach (and the one I used on my Z80 simulator) would be to use instruction classes and loop through an array of these classes (well, cast to interfaces) testing if the opcode can execute a given function in the class.
      How you handle IO is up to you (I didn't), but memory-mapped IO is pretty easy to implement and can be done by simply accessing the RAM object of your emulator.
      My Z80 simulator is open source and was written for a university project, so it might help; it can be found here: github.com/Lewinator56/z80Sim (you will need .NET Core 3.1)
      Writing an assembler is kind of different; in reality, at the easiest level you are simply going to have a defined assembly language and just convert it into opcodes, which is pretty easy to do. If you want to include functions and variables (by name) in the ASM then you need to process symbols too. I simply stuck with a conversion from ASM to binary.
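
      To make the switch-on-opcode recipe above concrete, here is a minimal C++ sketch of that kind of dispatch loop. The opcode layout, register count, and tiny instruction set are invented purely for illustration; they are not the ISA from the video or from lewisd56's project.

          #include <array>
          #include <cstdint>
          #include <cstdio>
          #include <vector>

          // Hypothetical 16-bit machine: word-addressed memory, registers A, B, C.
          enum Op : uint16_t { OP_LDI = 0, OP_ADD = 1, OP_STORE = 2, OP_HALT = 3 };

          struct Machine {
              std::array<uint16_t, 4> reg{};   // A, B, C (4 slots so the 2-bit field can't index out of range)
              std::vector<uint16_t> mem;
              uint16_t pc = 0;
              bool halted = false;

              Machine() : mem(1 << 16, 0) {}

              void step() {                                // fetch, decode, dispatch one instruction
                  uint16_t instr  = mem[pc++];
                  uint16_t opcode = instr >> 11;           // top 5 bits: opcode
                  uint16_t r      = (instr >> 9) & 0x3;    // next 2 bits: register field
                  switch (opcode) {
                      case OP_LDI:   reg[r] = mem[pc++];       break;  // load immediate from the next word
                      case OP_ADD:   reg[0] += reg[1];         break;  // A = A + B
                      case OP_STORE: mem[mem[pc++]] = reg[r];  break;  // store reg at the address in the next word
                      case OP_HALT:
                      default:       halted = true;            break;
                  }
              }
          };

          int main() {
              Machine m;
              // Tiny program: A = 2, B = 3, A = A + B, store A at 0x100, halt.
              uint16_t prog[] = {
                  uint16_t(OP_LDI << 11 | 0 << 9), 2,
                  uint16_t(OP_LDI << 11 | 1 << 9), 3,
                  uint16_t(OP_ADD << 11),
                  uint16_t(OP_STORE << 11 | 0 << 9), 0x100,
                  uint16_t(OP_HALT << 11),
              };
              for (unsigned i = 0; i < sizeof prog / sizeof prog[0]; ++i) m.mem[i] = prog[i];
              while (!m.halted) m.step();
              std::printf("mem[0x100] = %u\n", (unsigned)m.mem[0x100]);  // prints 5
          }

      The same structure scales to a full ISA by adding cases (or a table of handler objects, as described above) and, if needed, per-instruction cycle counting.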

  • @comandercrypto1318
    @comandercrypto1318 1 year ago +177

    You could learn Verilog or VHDL to create the CPU on an FPGA, then use off-the-shelf ICs for RAM and ROM. FPGAs, if you don't know, are programmable hardware that rewires itself based on code. Intel, AMD, and other companies use multi-million-dollar ones for prototyping chips before sending them off to print. Consumer models range from $50 to $300, and have been used for software implementations of retro systems and modern 32- and 64-bit RISC-V based CPUs.

    • @proxy1035
      @proxy1035 1 year ago +13

      Alternatively you can also be lazy like me and simply build your circuits in a logic simulator like "Digital" (it's like Logisim but faster and with extra features) and then just export them as Verilog/VHDL.
      It works surprisingly well actually.

    • @hedgehuug1603
      @hedgehuug1603 1 year ago

      yes this

    • @ArneChristianRosenfeldt
      @ArneChristianRosenfeldt 1 year ago +3

      And at this point I would have thought that chips from Intel and AMD are pure evolution, no revolutionary steps. Steps are not agile. Maybe they include experimental modules in every chip, test them, and if it works, you get the next iteration (more expensive). An FPGA and a chip from Intel have nothing in common. Intel chips are optimized down to the photolithography layout and full analog simulation (SPICE) of the transistors. An FPGA does not give you this. The days of the Pentium FDIV bug, where a professor from a university used an ad hoc script to get his algorithm from university software into some proprietary CAD at Intel, are long gone.

    • @OpenGL4ever
      @OpenGL4ever 1 year ago

      @@ArneChristianRosenfeldt The advantage of using an FPGA instead of individual chips for this project is that others could make use of it too by simply reprogramming their FPGA. There is already a real CPU design called Magic-1 from Bill Buzbee. The problem is, others have to recreate it if they want to have a copy of it. Search for Magic-1 on YT, there are some videos about this impressive project.

    • @darkcoeficient
      @darkcoeficient 7 months ago

      @@ArneChristianRosenfeldt wouldn't an FPGA be a bit more like a drawing board at that level?

  • @plk173
    @plk173 1 year ago +3

    When the video was suggested I was like ooo interesting, then you introduced it and I was like oh no, another meme project, but instead you made an extremely difficult topic accessible and interesting. Can't wait to see more from you.

  • @raychrash5139
    @raychrash5139 1 year ago +2

    Dude, legit, it's my first year at computer science university and the fact that I understood everything you said in this video makes me happy. You inspired me to try messing around with CPU design.

  • @martinooliveri7310
    @martinooliveri7310 1 year ago +66

    Amazing, and actually inspiring. I'm looking forward to making my own operating system in the future, and these videos bring me so much joy. Cheers!

    • @WTBuilder
      @WTBuilder 1 year ago

      Is it Linux-based?

    • @4.0.4
      @4.0.4 1 year ago +1

      @@WTBuilder _"Interjection!!"_
      - Stallman Ace Attorney

  • @markify8019
    @markify8019 1 year ago +7

    Your videos are extremely well edited and your explanations are in depth and easy to understand. Can’t wait to see you hit 100k soon!

  • @dubtokermusic950
    @dubtokermusic950 1 year ago

    Awesomeness!! Always wondered how this works, very well laid out explanation dood

  • @gegaomega5038
    @gegaomega5038 1 year ago +363

    I am too stupid to understand it

    • @Schnickenpick
      @Schnickenpick 1 month ago +26

      Finally something I can relate to

    • @0x_nietoh
      @0x_nietoh 1 month ago +6

      At least you’re not in denial. Less competition for us folks!

    • @filipec0rreia
      @filipec0rreia 1 month ago +3

      ​@@0x_nietoh "folks"

    • @kiwi7556
      @kiwi7556 1 month ago

      Womp womp

    • @aleclandman8194
      @aleclandman8194 1 month ago +10

      Noooo. You just haven't gone deep enough to understand. Not understanding YET is totally different from being stupid.
      :)

  • @lucarosania1358
    @lucarosania1358 1 year ago +12

    This is probably the most impressive thing I've ever seen on YouTube, very very good job! Thank you so much for sharing this 😊

  • @FloatSamplesGT710
    @FloatSamplesGT710 28 days ago +1

    You deserve way more subs dude, very underrated and hardworking/smart working and dedicated too

  • @evandossett3897
    @evandossett3897 2 months ago

    Love the video and the content! The only issue I had was getting caught off guard by the change in music or the selection; it startled me a few times because it would change out of nowhere while I wasn't fully focused on it, and I thought something was making noise in a different tab or process lol.

  • @M3LP
    @M3LP 1 year ago +7

    This is both beautiful and confusing, it's truly art. I wish I could truly understand everything here, but this flies way over my head.

  • @leeaffdraws8502
    @leeaffdraws8502 1 year ago +7

    Woah, I remember last year, one of the subjects I had during the semester was about computer architecture. We saw how the ALU works, the RAM, the control unit, instruction and microinstruction sets and even peripheral handling. Our tests were about designing some use cases for each component, which was a demanding process. Seeing you put that knowledge to work makes me remember how amazed I was while taking those classes. Nice video! And excellently explained too.

    • @mathismartin3092
      @mathismartin3092 1 year ago +1

      I did that last year too! I don't know what software you used, but here we did it with diglog and it was a lot of fun ^^

    • @leeaffdraws8502
      @leeaffdraws8502 1 year ago

      @@mathismartin3092 Our exams were mainly done on paper, but we used logisim for practicing! Of course we stuck to no more than 4 or 5 bits for the instruction set, and mostly doing just read, write and math instructions, but it was really interesting to figure out how to do certain things :D

  • @zencibalina2926
    @zencibalina2926 1 year ago

    Thanks for the motivation. I wasn't sure if I could do it, but I might try it eventually.

  • @kakes_
    @kakes_ 1 year ago +3

    Literally finishing up my build of Ben Eater's 8-bit breadboard computer as I write this. I'm already itching for 16 bits, among many other improvements. (Thinking I might slap a Pi Pico on there so I can control it from a web interface - could be interesting.)
    Looks like a fun project! I'll definitely be following your progress with this. I admire that you're designing it yourself from the ground up, something I kind of wish I'd done myself tbh, haha.
    (Also, love the Wintergatan!)

  • @ggre55
    @ggre55 1 year ago +7

    Damn bro
    I can't believe how smart u actually are, I never thought someone like u would exist, really really awesome
    Damn

  • @monemperor1559
    @monemperor1559 1 year ago +66

    this is insane. Making a CPU, then your own compiler specifically to run programs on it and then an emulator for it.
    Cool stuff

  • @jayanthpandit3437
    @jayanthpandit3437 5 months ago +1

    Watched this video once before college and didn't understand much. Watched it again after my first semester as an ECE major and understood a lot more. Thanks!

  • @pan2285
    @pan2285 11 months ago

    Thanks brother, your content helps me a lot! Keep up the great work! This is exactly what yt is made for!!

  • @GodofWar1515
    @GodofWar1515 1 year ago +3

    Really awesome work! I've been really interested in this for a very long time. Really glad to see more people getting into this as well.
    Keep up the great work!

  • @Bobbias
    @Bobbias 1 year ago +54

    For smoother movement of the paddles, you could have created 2 more characters (either added to the charset, or overwriting some unused characters) with half-height paddle shapes, so your paddle could move in half-tile increments.

    • @Maykeye
      @Maykeye 1 year ago +4

      If it supports color for text, 1 character is enough: you simply swap background and foreground color, which turns ▀ into ▄.
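
      As a rough illustration of the half-tile trick described above, here is a small C++ sketch that draws a one-column strip of a character display, tracking the paddle position in half-tile units and choosing a blank, half-block, or full-block character per cell. The grid size and characters are made up; the video's actual charset and display hardware may differ.

          #include <cstdio>

          // One column of a character display, 12 text rows tall. The paddle's position and
          // height are tracked in half-tile units, so it can move in half-character steps.
          // Each cell is drawn as a space, an upper half block, a lower half block, or a full block.
          int main() {
              const int rows = 12;
              int paddle_top = 5;   // in half-tiles (0 .. 2*rows - 1)
              int paddle_len = 5;   // in half-tiles

              for (int row = 0; row < rows; ++row) {
                  bool top    = 2 * row     >= paddle_top && 2 * row     < paddle_top + paddle_len;
                  bool bottom = 2 * row + 1 >= paddle_top && 2 * row + 1 < paddle_top + paddle_len;
                  const char* cell = top && bottom ? "\u2588"    // full block
                                   : top           ? "\u2580"    // upper half block
                                   : bottom        ? "\u2584"    // lower half block
                                   :                 " ";
                  std::printf("|%s|\n", cell);
              }
          }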

    • @morgan0
      @morgan0 1 year ago +1

      or a way to write arbitrary data into unused characters and then display that with the faster character rendering

    • @OpenGL4ever
      @OpenGL4ever 1 year ago

      This would make the drawing routine more complex and thus slower. Better design a bit blitter and let the bit blitter hardware do the rest.

  • @mixmaster5150
    @mixmaster5150 1 year ago +2

    I love that you're using the Game Boy Sims music for this video. I mean, those are incredibly underrated games, but they made my childhood.

  • @thomasvogel4340
    @thomasvogel4340 1 year ago

    I've always wanted to know how computers work at lower levels. Thanks a lot, this was very insightful, and was very entertaining to watch. You got a new subscriber now.

  • @kingofthegoats7032
    @kingofthegoats7032 1 year ago +29

    What's next? Your own OS on this CPU, programmed in Z#? Lol
    Great vid btw, worth the wait

    • @AstroSamDev
      @AstroSamDev  1 year ago +15

      Probably, that text editor is just begging to be transformed into an interactive command line.

    • @kingofthegoats7032
      @kingofthegoats7032 1 year ago +8

      @@AstroSamDev AstroOS coming :000

    • @haralabospap7091
      @haralabospap7091 1 year ago +3

      @@AstroSamDev rewrite the Linux kernel in Z# and make z#/Linux
      /s of course

    • @Maykeye
      @Maykeye 1 year ago +1

      Lol aside, that's actually not impossible: see Collapse OS for how to bootstrap from almost nothing to working OS.

  • @totallynotabot151
    @totallynotabot151 1 year ago +34

    Now grab an FPGA devkit and turn it into real hardware! (In case you haven't done that before, designing circuitry with Verilog / VHDL is actually quite an interesting puzzle).

    • @AxelMontini
      @AxelMontini 1 year ago +7

      Agree. My university had us build a single cycle 8-bit MIPS processor on a Xilinx FPGA. A very simple processor, yet probably the best project we've done so far.

  • @jimenezharlenjoyf.8158

    You are literally the best, I've been looking for a tutorial for three days and yours works

  • @robmartin7873
    @robmartin7873 1 year ago

    Great work! Thank you for documenting this for us

  • @fillupdidier5514
    @fillupdidier5514 1 year ago +28

    The quality of your videos is like that of a YouTuber with a million views per video. I guess I found your channel at the right time, as I am getting more into computers but not even close to where you're at.

  • @adama7752
    @adama7752 1 year ago +61

    Man relives computer history. Punch cards, assembly, etc..

  • @alexjohnson5677
    @alexjohnson5677 11 months ago

    I had a similar project during my computer science degree. This was a good refresher on the different individual components of a CPU. Makes me want to fire up Logisim again! Great video.

  • @MultiSciGeek
    @MultiSciGeek 9 months ago

    Amazing! I'd love to see you make actual hardware for this as well as some OS/UI.

  • @ad.i
    @ad.i 1 year ago +4

    THIS WAS EXPLAINED SO WELL WHAT THE HELL? SERIOUSLY KUDOS TO YOU MAN

  • @Koljon
    @Koljon 1 year ago +4

    Very cool, can't wait to see you building it irl ;)

  • @Random-os5rd
    @Random-os5rd 1 year ago +1

    Great tutorial, links and program worked fine for me. Thanks for sharing.

  • @Gghhhngbvg
    @Gghhhngbvg 1 year ago

    I just found your channel today and you're crazy, dude! I'm just getting into computers. But making a language AND creating a whole new CPU is amazing, man! If you were to ask a professor how to do this, he would look at you like you were insane! Keep up the great work man!!

  • @Hylianmonkeys
    @Hylianmonkeys 1 year ago +4

    HELL YEAH.
    I unknowingly taught myself how to build computers using redstone in Minecraft.
    At this point I've now built dozens of redstone machines, including an entire game system in Minecraft. It has a 7x7 pixel screen, a 7-bit CPU, more than 80 4-bit registers, and a ROM size of currently only 5.456KB or 496 bytes. However, the ROM can be expanded to indefinite sizes.

  • @estebanod
    @estebanod 1 year ago +9

    This guy is building his whole DIY computer and I struggle to write a three-line-long Python code to download pictures from a website ...

    • @eekee6034
      @eekee6034 1 year ago

      Yeah. I've wanted to build my own computer for decades, but my brain just goes, "what are these facts NOPE!" lol but it's getting better slowly.

  • @Azurdos
    @Azurdos 1 year ago

    My goodness, your content is awesome!
    I think that a video explaining your path to achieving such detailed knowledge about low-level computing would also be very interesting :)

  • @ArtemArist
    @ArtemArist 11 months ago

    What the hell, man, this is nuts..
    Great work, I cannot imagine how tough it was to get to this point, damn..

  • @jeffparent2159
    @jeffparent2159 1 year ago +7

    Very cool project. This was the stuff I loved doing at uni. Sadly FPGAs are still crazy expensive; the CPUs we designed there were all in HDL languages, which allowed you to run them on hardware. Makes me wanna dig out that code again.

  • @atomspalter2090
    @atomspalter2090 1 year ago +4

    Congrats on that achievement. You actually designed your own computer. Wow!

  • @morchedlafferty8614
    @morchedlafferty8614 6 months ago

    Impressed!!! Great content! Well presented, I loved it!

  • @Gailon1000
    @Gailon1000 1 year ago

    YouTube needs more content like this!!! Thank you for your dedication and for educating us.

  • @CYON4D
    @CYON4D 1 year ago +8

    Does your emulator skip the logic layer and just execute instructions on the list?
    You did an amazing job with your custom CPU.
    By the way I noticed the original Sims OST playing in the background. I love that OST.

  • @AshtonSnapp
    @AshtonSnapp 1 year ago +40

    Using a 16-bit address bus would get you 64 kibibytes. Although, since you’re storing a 16-bit word at each address rather than a byte, you have 64 kibiwords - or 128 kibibytes - of memory to work with.
    Le edit: 1 kibibyte = 1024 bytes. 1 kilobyte = 1000 bytes. One is a binary prefix, the other is a metric prefix.
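
    A quick check of the arithmetic above, as a small C++ snippet (the memory size assumes one 16-bit word per address, as described in the comment):

        #include <cstdio>

        int main() {
            const unsigned addresses      = 1u << 16;  // 16-bit address bus: 65,536 locations
            const unsigned bytes_per_word = 2;         // each location holds one 16-bit word
            std::printf("%u addresses, %u bytes, %u KiB\n",
                        addresses,
                        addresses * bytes_per_word,           // 131,072 bytes
                        addresses * bytes_per_word / 1024);   // 128 KiB
        }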

    • @axmoylotl
      @axmoylotl 1 year ago +16

      how many habibibytes is that

    • @DeeezNuts
      @DeeezNuts 1 year ago +8

      @@axmoylotl 2 hamoodibytes

    • @anon_y_mousse
      @anon_y_mousse 1 year ago

      I'm never going to use that childish sounding garbage. 1024 bytes is a kilobyte and SI has no business even being included in the discussion, I don't even care that they're using the same letters for prefixing, it's still going to be powers of two. Hard drive manufacturers have caused and are continuing to cause harm to the industry by being such cheats and now we've got douche-nozzles who want to redefine all of our terms instead of taking the manufacturers to task. It's almost as bad as changing BC/AD to BCE/CE because they don't like the implications of how the names originated. Stop redefining terms.

    • @tissuepaper9962
      @tissuepaper9962 1 year ago +8

      Nobody cares about kibi, kilobyte is 1024 in my heart.

    • @Anon.G
      @Anon.G 1 year ago +2

      @@tissuepaper9962 same here

  • @ana303_
    @ana303_ 1 year ago

    I had a microprocessor course last semester; you did well explaining everything as simply as possible.

  • @suvetar
    @suvetar 2 months ago

    Thanks for this! Very inspirational and please keep up the good work!

  • @gawkersdeathrattle1759
    @gawkersdeathrattle1759 1 year ago +9

    You may want to expand the architecture to add interrupts, although it does complicate things somewhat given that you'll probably want a stack to handle those, and you currently don't have anything stack-ish like jump-to-subroutine or returns.

    • @Ehal256
      @Ehal256 1 year ago

      Was going to mention this. It would make timing quite a bit simpler, no more arbitrary wait N cycle loops, just execute game logic once on every vertical sync interrupt. :)
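
      A hypothetical sketch of what that could look like in an emulator's main loop, in C++. None of this reflects the actual CPU from the video (which has no interrupts); the cycle count, vector address, and stack here are made up just to show the shape of the idea:

          #include <cstdint>
          #include <cstdio>

          const long     CYCLES_PER_FRAME = 1000;    // made-up frame length in cycles
          const uint16_t VSYNC_VECTOR     = 0x0010;  // made-up handler address

          struct Cpu {
              uint16_t pc = 0;
              uint16_t sp = 0;
              uint16_t stack[16] = {};
              bool     in_handler = false;

              void step() { ++pc; }            // stand-in for fetch/decode/execute of one instruction

              void raise_vsync() {             // "hardware" asserts this once per frame
                  if (in_handler) return;      // this sketch doesn't nest interrupts
                  stack[sp++ & 15] = pc;       // save the return address
                  pc = VSYNC_VECTOR;           // jump to the per-frame handler
                  in_handler = true;
              }
              void ret_from_interrupt() {      // the handler would end with an instruction doing this
                  pc = stack[--sp & 15];
                  in_handler = false;
              }
          };

          int main() {
              Cpu cpu;
              for (long cycle = 1; cycle <= 5000; ++cycle) {
                  cpu.step();
                  if (cycle % CYCLES_PER_FRAME == 0) {
                      cpu.raise_vsync();
                      std::printf("frame interrupt at cycle %ld, pc -> 0x%04X\n", cycle, (unsigned)cpu.pc);
                      cpu.ret_from_interrupt();   // pretend the handler ran and returned immediately
                  }
              }
          }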

  • @joshuadelawarr1027
    @joshuadelawarr1027 1 year ago +3

    AstroSam in 10 years: I made my own OS that simulates our universe

  • @nsr-ints
    @nsr-ints 8 months ago

    I love how intuitive your operand table is.

  • @MotownBatman
    @MotownBatman 1 year ago

    This is Crazy Hangover type Witchcraft!
    Epic Project; Great Job!

  • @Frankabyte
    @Frankabyte 1 year ago +18

    This is seriously impressive, and I'm blown away by the amount of time and effort that went into this...
    ...but can it run DOOM?

  • @nullmemaddress
    @nullmemaddress 1 year ago +45

    Let’s get this strait:
    - Man makes own cpu
    - Man then makes own language
    - Man then makes own compiler
    - Man then makes his own emulator
    WTF

    • @msmith2961
      @msmith2961 1 year ago +4

      Don't forget he also wrote a port of Pong to run on said computer...

    • @notsa_s
      @notsa_s 1 year ago +2

      Bro this is like jdh shit right here

    • @DumDoger
      @DumDoger 1 year ago

      *straight

    • @PiyushBhakat
      @PiyushBhakat 1 year ago +1

      And he's ONLY 16. When I was 16, I was barely writing hello world in C++.

    • @alexyo2440
      @alexyo2440 10 months ago +1

      @@PiyushBhakat Bits per age. This man will only grow stronger in time and consume more bits

  • @neomage2021
    @neomage2021 1 year ago

    Fun project! I remember having to do this (but 8 bit) in digital electronics class in undergrad for EE.

  • @mafin_official
    @mafin_official 1 year ago +1

    You sir are very entertaining and smart. You made this video better than most YouTubers I know. You actually put effort into these videos. You deserve my sub.

  • @janikarkkainen3904
    @janikarkkainen3904 1 year ago +4

    Awesome project. I love stuff like this; I even started my own project. Though I've already been working on and off on it for a few years, using a 6502 instead of just logic stuff, and on breadboards. Fun stuff, and great work getting so deep with this stuff!
    Now make it in real hardware! 😏 ...I mean, you used Logisim, so IC counterparts should be relatively easy to come by. VGA might pose some problems, but the VGA signal is quite simple in the end...

  • @gachastorys5129
    @gachastorys5129 1 year ago +12

    Hey so I have an idea! I think that Z# would fit perfectly as an embedded language for the computer. It should be its own native language, kind of like Batch is for Windows. I’m very impressed by this! Hope to see more videos from you in the future.

  • @jigneshsolanki2486
    @jigneshsolanki2486 1 year ago

    Been watching your vids for a good few weeks now, learning new shit each day. My workflow has improved so much since watching.

  • @codeplayer8575
    @codeplayer8575 1 year ago +1

    My friend, you are truly a legend. Congrats!

  • @hamburgerhamburgerv2
    @hamburgerhamburgerv2 1 year ago +4

    2025: I designed the world’s most powerful computer

  • @Bobbias
    @Bobbias 1 year ago +7

    One small thing about how you explained your emulator: you conveniently forgot to mention that part of why Logisim is so much slower is that it actually simulates things in far more depth than your emulator. A very minor thing, but the way you glossed over it gives the impression that Logisim is slow because... it's slow, rather than because it's doing way more work than your emulator.

    • @DFX2KX
      @DFX2KX 1 year ago +1

      and Logisim's speed does depend on whether or not one is using the default 'chips' inside it or not, it has math operators and register modules, and whatnot, but if you eschew that and go down to the individual gates in sub-circuits....

    • @Ehal256
      @Ehal256 1 year ago +1

      He does say that logisim is simulating every logic gate, and that's why it's slow. Seemed sufficient to me.

    • @Costelad
      @Costelad 1 year ago

      Yeah I was a bit confused at that. Why would they differ in speed so much when they’re both emulators?

    • @Ehal256
      @Ehal256 1 year ago +1

      @@Costelad Imagine instead of executing an add instruction directly, you simulate the propagation of electricity through each part of the circuit implementing an add operation.
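
      To illustrate the difference in the amount of work, here is a small C++ comparison: the first function is how an instruction-level emulator might perform a 16-bit ADD, while the second evaluates a ripple-carry adder bit by bit, which is closer in spirit to what a gate-level simulator has to do for every gate and wire. Real Logisim does considerably more than this (signal propagation, timing, the rest of the circuit), so this only shows the flavour of the gap.

          #include <cstdint>
          #include <cstdio>

          // Instruction-level emulation: one host operation per emulated ADD.
          uint16_t add_emulated(uint16_t a, uint16_t b) { return a + b; }

          // The same ADD at the gate level: a 16-bit ripple-carry adder evaluated bit by bit,
          // the way a circuit simulator has to propagate every individual gate.
          uint16_t add_gate_level(uint16_t a, uint16_t b) {
              uint16_t sum = 0, carry = 0;
              for (int i = 0; i < 16; ++i) {
                  uint16_t ai = (a >> i) & 1, bi = (b >> i) & 1;
                  uint16_t s  = ai ^ bi ^ carry;                   // sum bit: two XOR gates
                  carry       = (ai & bi) | (carry & (ai ^ bi));   // carry out: two AND gates + one OR gate
                  sum        |= s << i;
              }
              return sum;
          }

          int main() {
              std::printf("%u %u\n", (unsigned)add_emulated(1234, 4321),
                                     (unsigned)add_gate_level(1234, 4321));  // both print 5555
          }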

  • @tekkiech3485
    @tekkiech3485 1 year ago

    This video really helped me. Thank you very much!

  • @shift3yep679
    @shift3yep679 1 year ago

    Holy shit my man, you're a fucking genius.
    Everything from using C++ to make the compiler, to the emulator to increase processing speed. This is fucking gorgeous!! I would love to pick your brain on ideas and concepts.

  • @omegablade89
    @omegablade89 1 year ago +3

    Great video, I don’t know how old you are, but it’s clear you know your stuff. I hope Intel or AMD contacts you for a job.

    • @JtagSheep
      @JtagSheep 1 year ago +1

      Intel x86 is a little more complex than making a 16 bit cpu.

  • @7.12_am
    @7.12_am 1 year ago +9

    Bet in 2030 he'll be competition for Intel, AMD and NVIDIA after he figures out how to chemically make transistors

    • @LuaanTi
      @LuaanTi 1 year ago

      You want Sam Zeloof for that :P czcams.com/video/s1MCi7FliVY/video.html

  • @stephencarter2041
    @stephencarter2041 1 year ago

    very interesting. love your explanation on everything.

  • @ramdxm84
    @ramdxm84 1 year ago

    Your tutorial videos are amazing. I decided to go back to creating soft after 16 years. soft soft is so easy to get into, but also offers

  • @The_hot_blue_fire_guy
    @The_hot_blue_fire_guy 1 year ago +4

    Now you need to build this with actual hardware and add long-term storage to it, like an SD card slot or something that could be used to store data even when the system is turned off.

  • @therealyojames
    @therealyojames 1 year ago +3

    Hmmmm jdh.

  • @trupalpatel2329
    @trupalpatel2329 1 year ago

    Thank you so much, very direct link and it works for me, love the way you explained the installation.

  • @xXDiggityDawgXx
    @xXDiggityDawgXx 10 months ago

    I love love love when people mention how they had xxx subs, then I see the video a year later and they have 10x more. This was so cool, time to binge!

  • @anasmhiri
    @anasmhiri 1 year ago

    Bro this is amazing!!
    Keep going

  • @ingenierocantor
    @ingenierocantor 1 year ago

    What an awesome project! And very good music.

  • @batt3ryac1d
    @batt3ryac1d 1 year ago

    Great video you did a fantastic job explaining everything I swear I could almost follow what you were talking about 😅

  • @Chrls5
    @Chrls5 1 year ago

    Oh my God, this project is superb, beautiful, great job ❤️

  • @kevincozens6837
    @kevincozens6837 8 months ago

    A nice fun little(?) project. I would make some changes to the instruction set. Most instructions had the register after the mnemonic but you have AIN, BIN, CIN. They should be INA, INB, and INC. However, you have 5 bits for the op code and 11 spare bits. You can use the spare bits to indicate source and/or destination register as that can be coded in only two bits each. For jump instructions the spare bits can indicate which flag(s) to check. That would free up 7 instructions for other possible op-codes and make IN, LDI, SWP, and the math operations able to work with any (pair of) registers.
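
    A small C++ sketch of the kind of encoding suggested above: a 5-bit opcode with 2-bit destination and source register fields packed into the spare bits. The field positions and the example opcode value are made up for illustration and are not the video's actual instruction format.

        #include <cstdint>
        #include <cstdio>

        const unsigned OPC_ADD = 0x05;   // hypothetical opcode number, not from the video's ISA

        uint16_t encode(unsigned opcode, unsigned dst, unsigned src) {
            return (uint16_t)((opcode & 0x1F) << 11    // 5-bit opcode in the top bits
                            | (dst    & 0x3)  << 9     // 2-bit destination register
                            | (src    & 0x3)  << 7);   // 2-bit source register
        }

        void decode(uint16_t instr, unsigned& opcode, unsigned& dst, unsigned& src) {
            opcode = (instr >> 11) & 0x1F;
            dst    = (instr >> 9)  & 0x3;
            src    = (instr >> 7)  & 0x3;
        }

        int main() {
            uint16_t word = encode(OPC_ADD, 2, 1);   // e.g. "ADD C, B" with registers numbered A=0, B=1, C=2
            unsigned op, d, s;
            decode(word, op, d, s);
            std::printf("word=0x%04X opcode=%u dst=%u src=%u\n", (unsigned)word, op, d, s);
        }

    With one opcode plus explicit register fields, a single ADD entry covers every register pair, which is how the spare bits free up mnemonic slots as described in the comment.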

  • @petercortens6019
    @petercortens6019 1 year ago

    Very nice work! There is still a use case for this line of work, albeit a niche one: powering new games for retro console cartridges. Powerful for retro purposes, but with a reasonably small footprint & power consumption < 1W, and so with a specific reduced, optimized instruction set to complement the original hardware and/or game type as much as possible.

  • @hikaroto2791
    @hikaroto2791 1 year ago

    Dopest video ever, period. My respect, sir! Hope one day I can do something like you did!

  • @LegendOfKarogane
    @LegendOfKarogane 5 months ago

    Bruh THIS IS AMAZINGLY HELPFUL THANKS 🙏🏿