I designed my own 8-bit computer just to play PONG
- added 12 Jun 2021
- welcome to this week's episode of "Reinventing the Wheel" with jdh
CODE: github.com/jdah/jdh-8
TWITTER: / jdah__
PATREON: / jdah
EDITOR: NeoVim
VIMRC: gist.github.com/jdah/4b4d98c2...
THEME: palenight
CIRCUIT DESIGNER: Logisim-Evolution (github.com/logisim-evolution/...)
RESOURCES:
The Elements of Computing Systems by Noam Nisan and Shimon Schocken
@BenEater 's Channel: / eaterbc
@Esperantanaso 's DUO Adept: • An 8-Bit TTL CPU + GPU - Science & Technology
Next: Moving electrons by hand like a real programmer
*for playing space invaders
XD
yes
@@licklack159 no they use very very basic python lol I suck (at least rn I will be the next jdh)
@@red-52 Python is okay but definitely not as good as other languages like C or C++
@@JuanR4140 ya ik
Next: So I decided that I'm just too good for electricity, so I built a steam-powered turing machine
Inb4 he makes the Charles Babbage Computational Engine
That's actually not a bad idea, I know I've seen a water-pump-based calculator device on YouTube.
after that comes a better wheel
Bruh
He's too cool for machines now, so he just imagines it.
Next: Too good for keyboards, or software, or languages: Just write code in binary with hole-punched notecards like a real programmer
blink for 1, don't blink for 0
If you aren't writing programs for a Jacquard loom, I don't wanna hear it!
well, after watching this episode... your idea sounds easy to do
@@przemcio6867 satire?
Punch cards? The first computer I worked with had 16 switches for the address bus, 8 switches for the data bus, a deposit button, a reset button and a run/stop switch. Based on an 8080. Hand assembly and switch input. It taught me to become a wiz in binary.
Look at him, using C. The only acceptable thing now would be to see him make his own programming language from scratch.
Exactly what I thought xD
He already wrote a parser / lexer for the assembler part which is not far from what a C compiler does.
His To Do list looks like this:
* Create a "simple" computer.
* create an assembly language for it.
* Use this assembly language to create a low level language.
* Use this low level language to create an operating system and all the essential tools for the OS.
and make pong with it
make his own programming language with only ASM :flushed:
Next video: I went to a mine and gathered raw resources to build my own CPU
After that: custom mining tools for doing it.
Next video after that: I reverse engineered imperialism and have established the means to exploit local ethnic conflicts for the sake of extracting conflict minerals for my CPU
Oh no....
*HE LIKED YOUR COMMENT. THAT MEANS YOU GUESSED RIGHT.😅*
@@astroblurf2513 after that: using a large stockpile of up quarks, down quarks, and electrons to create ethnic groups to fight and exploit for materials to make a particle accelerator with which I create silicon to make transistors that I put together to make a CPU
@@matthewe3813 next video after that: I’m sick of exploiting premade humans. I made my own human clones army
He's evolving, just... backwards
I think they’ve got a word for that
@@maxwellclark1615 yes, but you see, it's a reference
@@maxwellclark1615 devolving
@@maxwellclark1615 it's just called evolving. Evolution isn't linear.
@You thought it was a joke? Terry Davis who developed TempleOS?
After all of this he still doesn’t have his own programming language
yeah, he's done so much crazy shit that at this point a compiler for a custom programming language would be almost disappointing.
Can't wait to run hello world on jdh
@@TheEnderLeader1 If he did this he'd probably just make his own unique version of binary or smth
@@fr4781 he did. It's the machine code of the computer he invented and built. And he's also got an assembly language and assembler for it.
technically speaking, he did
As electrical engineer I actually think you used logisim for its intended purpose, education. The visual representation of gates as well as real-time status updates will make the circuit much more intuitive to understand than 2 pages of VHDL or verilog.
Also watching Ben Eaters series is an additional treat.
then you do it like that
not a student but I play around a lot in Logisim and made my own 16x16 video player, with another circuit dedicated to animating every frame of it.
years passed and I've forgotten how I was even able to do that.
I'm honestly surprised he didn't write a tool to convert from Verilog to a logisim diagram
Next video: **LOGIC GATES ARE TOO HIGH LEVEL, NEED TO MAKE IT TRANSISTOR LEVEL**
No he’s going to have to go the vacuum tube route and program the computer with punch cards
I’m fed up with depending on transistor companies, I’m manufacturing my own
@@theoarcher896 I'm fed up with depending on transistor manufacturing machine companies, I'm building my own machines xD
I need to make my own universe.
Video after that: *TRANSISTORS ARE TOO HIGH LEVEL, AM NOW PUTTING P AND N SILICON TOGETHER*
next video: I build my own universe to recreate life
bro pls no spoilers
Video after that: inventing time travel and breaking previous video’s universe
@@jdh nice se
"If you wish to make an apple pie from scratch, you must first invent the universe"
*this comment was hearted by jdh*
**vsauce bg music intensifies**
"If you want to make a game from scratch, create a universe first."
or it won't run!
Him : "I am too cool for universe, so I decided to make multiverse"
Respect+
@@InnerEagle you shall optimise after all!
I completely relate to the StackOverflow bit about lexical analyzer generators. I eventually managed to build a working compiler from scratch but initially, all my questions were met with "use this existing tool that does all the work for you".
my ben eater binging is paying off and i can now understand 15% of what this dude's saying
omg, I thought I was the only one thinking that. lmao
@@atraps7882 me too haha
Soo true
Lmao same
Same.
Me: hehe im making a platformer game yeah im a programmer
jdh: proceeds to create a computer from scratch
I spent a few months trying to grasp the basics of programming, decided I'm too stupid, then I got this in my recommended feed. YouTube is bullying me.
@@plebisMaximus lmao noice
LMAO i feel u dude
@@plebisMaximus You're absolutely not too stupid. It can be hard, and it could be that it's not something you really want to do (the result is VERY different from the method), but find the right guide and you'll be able to do simple programs 2 hours from now.
Another 10 hours and you could be making a website.
Another 7 days and you could be making a game in unity.
If you have a clear (and small) goal in mind, it can be surprisingly fast to learn.
@@tokeivo Thanks a lot for the pep talk, mate! I'll give it another shot.
See ya next time when: Mining copper with hands to make everything myself.
In my senior year integrated circuits course in undergrad, I was given an integrated circuit architecture and was instructed to program a simulation in c-spice and do some benchmarks. Going from the block diagram to c-spice code directly was not something that I had done before, so I recreated the circuitry in Logisim. I included the Logisim macros in the report and expressed how much easier the project was for me to complete with it. It is a great program for simplification and understanding.
This is absolutely mind blowing. As someone who is studying EE and CS the amount of work here is crazy! The range of skills is nutty! Big props to this dude
Similarly studying computer engineering. In theory I understand all of this, but in practice, holy shit.
CS student here did my compiler project last semester and a VM project a year before (which the compiler's asm output runs on), and dear god this guy is pushing himself way too hard.
@@eebilu idk man kinda sounds like you aren't pushing yourself hard enough so maybe work on that ??
@@arc-sd8sk idk man, it kinda shows you don't recognize different levels of intelligence. This guy is not normal, statistically speaking. He's for sure way above 135 IQ (SB), I'd bet 140+. He doesn't even have enough views/subs to make a living from YouTube, so he does this part-time.
I'm a CS graduate, I've built a compiler from scratch, lexer and parser in C (among other "low level" stuff, like a 3D engine). I've spent countless hours studying (you go through a lot of maths that doesn't add up to practical skills if one's not invested in research later in life). I've met much smarter people than me, and they went a step further by creating their own OS. Great for them, loved learning from smarter peers.
@@rubiskelter it was a joke-I was just busting his balls :)
but I appreciate such a detailed and thoughtful reply so thanks for that
this man is about to invent the transistor
I invented frequency and a way to store sound in a computer, but then I came to learn that that's how things already work today.
I was sad
*reinvent and make it better xD
who needs three connectors in a transistor? it should have one and output the unchanged input. the program runs in your imagination
Transistors are waaaaayyyyy too fancy and fast.
Look, I'm playing with electromechanical relays ;-)
@@shailmurtaza9082 damn you invented frequency? sick
Imagine my face, trying to construct my own CPU in breadboard, thinking me as a God, seeing your video. Man you are astonishing. Best video that I watched in YT in my entire life. Thank you man!
Damn, never say "I use Arch" to this guy, he'll completely humiliate you
Good thing I'm putting Fedora back on my laptop as I'm typing this then.
Frankly, I'm surprised that you didn't go "Wait, logisim doesn't have a dark mode? Hold on, I gotta fix this."
I mean, this would take one minute max. Just need to search for a setting to invert the colors of the display :)
@@Agnostic080 If you're as clever as him, at the very least, hex edit the exe to invert the color on the program itself, or maybe even create a custom theme
@@YOEL_44 Although these options sound like fun, it's so unnecessary. I just have a Keyboard shortcut to switch to inverted screen colors whenever I want. But I guess, most of the things on this channel could be considered 'unnecessary' :D
Falstad's Circuitjs has a night mode ;-)
@@Agnostic080 I thought it was very obviously a joke reply that you replied to
Looking forward to seeing how you do with your processor. Have nearly finished my own and it's taken a year of spare time to do it.
Amazing. Reminds me of my EE degree ! We had to do such things in the early 90s. Going down memory lane A. Hope your next step is building your processor using a fpga or something like that ! The most incredible thing is you probably managed to design and build everything faster than it would have taken in the 80s to do it !
Next: Designing my own universe to play Sonic the Hedgehog
Green hill zone music intensifies
What timing! I just finished my own CPU 2 days ago (it was an extra project for my second semester of IoT Engineering). I made a proper design in Verilog so I could get it manufactured, but it turned out that no manufacturers take any orders under $2,000,000, so I just got it to run on a $5 FPGA (Tang Nano). Having physical hardware do something is much cooler than emulation, and it's actually not much harder than Logisim. To be honest it was a lot easier than I expected. I think you will be pleasantly surprised!
sweet! I'm thinking about ordering an FPGA first to simulate this thing before I build it for real. I've heard of people getting a small number of PCBs manufactured and putting it together with ICs to get something in between a manufacturer-made CPU and a breadboard machine, if you can invest the time. and it's much cheaper than $2 million to get the circuits :)
@@jdh That's so cool!!! Yeah fpga's are great and while doing the project I learned that they are used not only for prototyping but you can actually find them inside cars, excavators and other industrial machines. Also I cant wait to see how you approach making the "PCB computer". It seems very hard to find a sweet spot in terms of how high level circuits you want to use. It's a weird balance between full on cpu ICs and raw transistors.
@@jdh You should check out Ben Eater's series about building a computer on a breadboard! He even sells kits if you want to make your own.
You should make a video on it
15:43 While I do much more simple coding compared to this, this little montage really hit home. Sometimes it feels like everything starts blurring together, I really felt that.
This is actually something I hope to do one day as well, I want to make a machine entirely made by me, that runs games/programs made by me. Just for the learning experience, and to say I did it I guess, but this gives me some hope.
This is super fun. When I was in university, we had to make an ALU (among other things) on a breadboard directly from basic gates (and, or, not). We also had to design those gates at the CMOS logic level, and of course we studied the physics of semiconductors, but we didn't do any manufacturing at that level, haha. Most people reading this will already know, but it's cool that each individual component of the circuit in this video could be implemented using nothing but a bunch of NAND gates, and each NAND gate can of course be made using two PMOS and two NMOS transistors. For students interested in this type of thing, there is actually a game online called, unsurprisingly, NAND Game, where you build up towards a "computer" from simple gates.
Very cool video, I'm excited to see what comes next! :)
"If you wish to make an apple pie from scratch, you must first invent the universe. - Carl Sagan" - jdh
"If you wish to make an apple pie from scratch, you must first invent the universe. - Carl Sagan
- jdh"
-therealsome1
@@randomguy-gb9ge """If you wish to make an apple pie from scratch, you must first invent the universe." - Carl Sagan" -jdh" -therealsome1" -random guy
@@gigabit6226 """"if you wish to make an apple pie from scratch, you must first invent the universe." - Carl Sagan" -jdh" -therealsome1" - random guy" -Gigabit
@@dontsmi1e """"""if you wish to make an apple pie from scratch, you must first invent the universe." - Carl Sagan" -jdh" -therealsome1" - random guy" -Gigabit" - Kazat0" -DonTSmi1e
@@GalaxyStudios0 """""""if you wish to make an apple pie from scratch, you must first invent the universe." - Carl Sagan" -jdh" -therealsome1" - random guy" -Gigabit" - Kazat0" -DonTSmi1e" - GalaxyStudios0
Next: So I decided that I am just too good to use the standard molecules and laws of nature, so I decided to create my own universe!
dont encourage him
Well, it’s an emulator of another universe, yet...
One of my classes in college was creating a MIPS CPU with SystemVerilog and getting it working on an FPGA with keyboard input and VGA output. I ended up writing 2048 in ~500 lines of MIPS assembly. Probably my favorite class in all my education! Great video :)
"So today we're going to write our own matrix."
The amount of knowledge and dedication required to make this is mind-blowing
not really, he uses his knowledge to make a *good* turing machine
@@PFnove huh? Designing a circuit for any architecture and writing Pong in assembly for it is already quite a challenge.
this man is just reinventing modern computing systems again
"modern"
"just"
"man"
literally just for gits and shiggles too
@@matthewe3813 Oh, forgot using the two genders was sexist and transphobic.
Just wanted to say, you're extremely underrated, and I can't wait for your future success. You're an inspiration, and a perfect mix of learning and insanity, and its addicting!
“If you wish to make an apple pie from scratch, you must first invent the universe.”
― Carl Sagan
next video:
so here i have:
- 4 transistors
- some cables
- a waterfall
- a forest & an axe
and now, we're going to make pacman! :)
"in a cave...with a box of scraps"
@@gladepowderfresh
I have:
>matchbox (for phosphorus dopant)
>borax (for boron dopant)
>sand
>fuselage of the plane I crashed in
_and today we'll be running crysis!_
This guy: Makes a god dang computer
Me: Googles "how to declare variable in HTML"
You cannot declare variables in HTML, dude. HTML is not a programming language and there are no variables. :D
@@1kvolt1978 And that's the joke.
@@1kvolt1978 someone is probably going to be really mad at what you just said.
(I meant that HTML is not a programming language)
@@hetsmiecht1029 But it is really not. What's the point to be mad about the fact?
@@1kvolt1978 Learn when to take words as sarcasm. 😉
Ahhh man logisim brings me back. The logic projects I got to do in uni were my absolute favorite. Writing quick sort in MIPS is a close second. Love everything about this project
That's gonna be one expensive computer.
Also massive props, not only did you do all this, you also had to sift through hours of recording and commentate for a 20m video while editing (rather masterfully might I add) and overlapping all of it.
I would not have had the patience. Not to deal with strings for hours in C just to make an assembler, and much less to do all that in logisim. Hell, even display stuff has gotten too cumbersome for me to not get bored with. And maybe I'm just not a video guy, or maybe I'm just disorganized with footage, but much less would I have felt like reliving the whole thing while having to also edit it and comment over it later.
Massive respect and gratitude from me! Looking forward to the next video and seeing you bleed all over your breadboards from stripping wire and moving around ICs for hours haha.
Alternative title: "how to get rid of Intel Management Engine without actually getting rid of Intel Management Engine"
Now Intel buys RISC V, which was a promising alternative
😭
@@nicolefischer1504 Well, thanks for the clarification but that's not much better 😂
Amd
@Marco Bonera Intel bought SiFive, a RISC-V manufacturer; the ISA itself is still open for anyone to use
You're literally doing everything I wanted to do but didn't have enough motivation to finish.
I LOVE IT
sounds kinda depressing tbh😂
@@Agnostic080 swap "motivation" for "desire" and its less depressing imo
@@ciankiwi7753 you're right. But I think it creates another problem of the action being not desirable = the result is not worth the effort. Which might be interpreted as saying that the author of the video is wasting time on pointless projects.
I like how you’ve come from web to os to CPU ARCHITECTURE!! love your videos btw
"We're rolling with only 16 instructions"
Dude just took "Reduced Instruction Set" to the extremes
I can't wait for the Ben Eater collaboration
Next: creating Super Mario Bros. on a lamp
How 'bout Metal Gear on a calculator?
@@philosophicalearthworm6819 it's not universal enough. Surprisingly, calculators actually change a lot overseas. Most of Europe uses Casio because Texas Instruments are way too expensive
@@philosophicalearthworm6819 that would be easier
@@InnerEagle too easy for jdh? lol
I am genuinely curious about calculators(despite not being a math person), like ways that they could be re-purposed.
Nice! Your work brings back fond memories of early 8 bit electronics for me.
YEESSS YEEESSS YESSS Man you are a beast! So much time spent on this, it's almost incredible! Make that circuit and the screen too, for fuck's sake! Amazing work! I'd reeeeallly love to know where you learned or found all the info you needed to make such a thing, because if you're not an electrical/electronics engineer you must have worked your brain AF to make this happen. Congrats once more, and I'll be anxiously awaiting the next episode!!
one of youtube's ten most underrated channels
list 'em pls
@@szigo3211 suckerpinch needs to go on the list, if you want channels that put too much effort into useless computing tasks.
@@thomaspeck4537 Thanks
I know those Ben Eater tutorials carried you so hard
These kinds of videos inspired me to make my own. Great stuff
He set the bar really high with his first video but he never disappoints
Gotta say, I was very shocked and very impressed when you whipped out the "I don't have any formal education in digital electronics" during the hardware design phase. I would NEVER have attempted something like this without the university classes I took on Verilog and digital systems design. Mad props! Can't wait to see the final hardware!
If you really want to implement your CPU IRL I would recommend you use an FPGA; most FPGA boards include an editor similar to Logisim, so it shouldn't take you long to replicate your design.
To add on to this, I think logisim-evolution supports exporting your design in Verilog and/or VHDL, which are perfect for use with FPGA design tools.
But that’s boring!
Up next uploading it on an FPGA, making your own internet and then deep diving into VLSI design to get down to CMOS transistors and clock generation. This is fun!!
The next video he makes, he is going to create a universe for the earth to form, for humans to evolve, create everything we have in the early Information Age, just to play candy crush.
I never thought I would be seeing Logisim ever again, but god damn was I happy when I saw it. It might be an old and discontinued (at least the original) piece of shit, but it just works™
I love Logisim, holds a very special place in my heart. don't know any other programs like it and don't want to.
@@jdh Same, I can't really use anything else because I don't know of another simple piece of software that is as complete as Logisim.
Though every time I have to rotate components or choose the number of inputs of a component from a list, I die a little bit inside
There actually is a fork of Logisim, called Logisim-Evolution, that is still maintained
There is an open source program similar to Logisim that is being actively developed; you can find it at hneemann/Digital on GitHub. Don't know how it compares to Logisim though
@@jdh mulimedia logic (yes mulimedia)
next next video: I mined and refined silicon into wafers to build my own SoC
Very nice project, I'm impressed can't wait to see more!
This is really cool, this popped up in my recommended and I already love it. But you should make your own OS to run on the CPU, which can run a basic word processor that you made.
What is an OS? Why do consoles tend to only have copy protection code in ROM, but no OS? Home computers at least had BASIC. Unix started as a file system. So on a console with only ROM, or only a CD-ROM, do you not need an OS?
I understood very little of this more than superficially but the coding montages are very cool to watch, so I had to leave a like.
*”mumbo jumbo says the redstone is simple when it sounds really complicated!”*
People who understand this video: *amateurs*
Mumbo's redstone skills are overrated as fuck :D Just saying, I like his vids. I always cringe when somebody says he's the best at redstone. But he's not even saying it himself, people are.
I grew up watching redstone computers and stuff; compared to this, they kinda sound trivial.
@@aviko9560 redstone computers are different in my opinion. What Mumbo Jumbo does is actually practical and useful, but who's actually going to use a redstone computer? It's impressive, but practically, Mumbo Jumbo's redstone is more useful.
@@aviko9560 Mumbo actually once said that he isn’t the best at redstone
Legends who watched all of crash course computer science and binged all of ben eater's videos: noobs
@@aviko9560 yeah, he literally says that what he's doing is simple.
we all know you were the kid writing TI-BASIC programs in your calculator in highschool
The life of a developer is this loop:
1 BUGGING
2 DEBUGGING
3 GOTO 1
"I designed my own custom computer just to play PONG"
They did this in the 70s. Quite a lot, actually.
*So make pong with literal atoms.*
*make a computer with air molecules and stuff*
Or just p a r t i c l e s
Real question: where did you learn this stuff?
on the internets !
@@casimirwallwitz8646 the osdev wiki won't be too much help if you're making your own cpu.
His Tetris OS honestly isn't that hard to do: he basically just got a basic booting OS and then started on graphics right away after input was working.
If you were creating an OS meant to be a proper OS, you would also need to worry about paging, user mode, system calls, executable formats, etc.
he is rich
@@leyasep5919 People using internet for the purposes it was likely originally intended... It is strangely hypnotizing.
@@pythonista_4382 And that has to do what, exactly?
I cant wait to see you construct that machine, its truly inspiring.
I have always wanted to make my own os running on my own cpu, but you have actually done it. Thank you, that was very inspiring.
spoiler if you are like me: don't worry, he renames the second E instruction to F in the doc at 5:27
I couldn't take my eyes off that second E over there
phew* thank goodness
Possible Next Video: I created my own TV that can only let you play tic-tac-toe
The point of an LL(1) language is that you can easily build a recursive descent parser for it by hand. Perhaps you mean LALR(1) languages, which tools such as Yacc and Bison can automatically generate parsers for?
As for recursive descent parsers, you just need one function per nonterminal in the grammar of your language, and in each of those, one clause for each production that can replace that nonterminal.
And at the start of each function you can pick which clause to use based on the next input token. That's what LL(1) stands for. Left to right scanning of the parser across the input tokens, doing Leftmost expansions. With one token of lookahead.
Each clause is straightforward as well. For each symbol in the corresponding production, we treat them, in the order listed in the production, in the following way:
A nonterminal means call the function for that nonterminal.
A terminal means consume a token from the input stream. If the token read isn't the same type that the production specifies, there's an error and the program doesn't parse.
If the token is of the required type, you consume it somehow. Minimally remove it from the token stream. If the goal is to just check programs are correct, you can just ignore the token after you remove it from the token stream. Your parser will simply do nothing except empty the input of tokens for a correct program but produce an error if you input an incorrect program.
If you want to do something more interesting like generate code you can have your tokenizer add more information to your token, and that information will be in its proper context, and not only that you can leave breadcrumbs for subsequent steps.
For example you could tag variable name token types with the actual variable name. Or some computer readable equivalent. As you go along you can build up a symbol table containing such things as the type for type checking and the memory allocated to the variable.
You could implement block scoping by having a stack of or chain of symbol tables. And if a variable name isn't found in the deepest nested block scoping you look for it in the next highest scope and symbol table.
You can also do something with it like generate code for an assignment. Which would simply be a store into the memory location of whatever the right hand side evaluates to. Which would be calculated recursively and available when the function for the last nonterminal returns.
That said an assembler doesn't likely need a parser, just a lexer or regular expression matching/FSM because there's likely no nested language constructs.
And two passes, so you can find/compute the address of labels when you see the definition of a label and then come back and replace any forward references.
You're a genius artist. Thanks for making these great videos, keep up the good work.
I have huge respect for all the low level C/C++ and assembly programmers, as I am a Python programmer and find these languages really difficult
I am a Python programmer, but I am switching to C now. I don't really code in Python much anymore, because of C. Believe me, C/C++ is easy; you just need to learn the syntax. Assembly is easy as well; you just need to understand how your CPU works, what memory addresses are, etc.
@@dominikmazurek753 how do you learn this lower level stuff? I have Java exp but want to learn computers at a lower level, as that'll help me be a better dev tbh
@@avidreader6534 Watch ben eater
@@neekap5987 will do, thank you. Any more advice would be greatly appreciated too lol
@@avidreader6534 Sure, I would advise you to try the Harvard CS50 course for learning the C language; you can skim over concepts that you already know from Python, but you should focus on pointers and memory management
This low-level stuff is super impressive
Crazy how some logic gates arranged in the right way can make literally anything happen
4:27 16 instructions? That's a lot... you can do the same with just one!
subleq
Off-the-shelf ALUs have input pins. Why do you want to waste them?
This guy just gonna casually remake the wheel
So I've been thinking and came to the conclusion that the world isn't my tier, so I remade the *whole world atom by atom*
Next video: I decided to make my own universe to house my custom computer…
Ben Eater: starts from the bottom and works upwards. jdh: starts from the top and works downwards
I took a class in college where we described a computer's operations and then used "Register transfer language" to express all the operations; these were then collected and used to generate the logic for each left-hand-side of those expressions.
The computer happened to be a PDP-8.
Note that there were no anachronistic 64K ROMs for decoding!
I would be more impressed if you actually expressed the design using only basic logic gates, preferably as "sum of products" or "product of sums" subexpressions wherever possible.
(This was in the late 1980's)
Absolute insanity, absolutely amazing; massive kudos to you for doing this.
Next video:
“I’ve decided that I’m too good for computers”
Lovely, mate! What I really envy is your dedication! Keep going! Good job! :D
Using particles that some other being created seems too cheaty, you've got to create your own universe next.
Goes to show perfection gets in the way of progress. Great job with this "computing independence" project!
This is the closest thing I can find on the entire internet to a tutorial on making a computer from scratch
I found this to be incredibly fascinating. Subscribed!
jdh in 20 years : designing my custom robot with my custom language from scratch
next video: I realised Assembly is too high level, so I coded Breakout in binary
Next: i simulated every form of life while only using 1 byte of RAM
LOVE THIS. Makes me wanna dive in and do the same myself! 😅also can't wait to see the next part in the series!
Next: I designed my own universe just to design my own PC just to run Minesweeper
The fact that this video does not have over a million views already is criminal. Great video!
A day with a jdh upload is a good day
I love your videos!
They inspired me to try making simple 2D games in C++ last weekend
i feel proud of having already done that, tho jeez, congrats, the amount of work must've been insane, I mean it took me days to work on my own 16-bit computer design
also good luck with the next video: "I built my own computer in real life on a breadboard"
Studying computer architecture for fun at home, so this video is awesome atm 👍🏽 Also, you are a madman to do this in your free time and I love it lol. I hope you make a lot of money at your day job, because if not, I don't know what to expect to do with my CS degree when I'm done lol
Man I'm addicted to this channel...😳😳
The legend has returned
That was a very surprising ending, excited for part 2.
Bro wrote a custom architecture, an EMULATOR, and a fricking ASSEMBLER? Holy cow, this guy's unstoppable. I don't care if you say you don't have "formal education" in this stuff, you're a whole hecka lot better than me.