I Made a Neural Network with just Redstone!
- Uploaded 16 Jun 2024
- To try everything Brilliant has to offer, free for a full 30 days, visit brilliant.org/mattbatwings You'll also get 20% off an annual premium subscription.
Patreon: / mattbatwings
Discord: / discord
My socials: linktr.ee/mattbatwings
My texture pack: modrinth.com/resourcepack/mat...
World Download: (JAVA 1.18.2) www.planetminecraft.com/proje...
IMPORTANT NOTE ABOUT USING THE WORLD DOWNLOAD, PLEASE READ!!! When drawing, make sure to make the numbers SMALL and centered. If you fill the screen with a number, it will not work. This is because the images in the MNIST dataset are small, and that’s what it was trained on.
Code used in this video: github.com/mattbatwings/neura...
Yannic's video: • I BUILT A NEURAL NETWO...
Redstone Team: • 【Minecraft】World first...
3b1b's much better MLP explanation: • But what is a neural n...
My explanation of MLPs was inspired by • I made a Neural Networ...
0:00 Intro
0:23 Backstory
2:02 MLP or CNN?
2:43 MLP Explanation
5:19 The Plan
5:39 Python Simulation
7:45 Input Layer
8:43 Hidden Layer
11:37 ReLU
11:58 Output Layer
13:09 Softmax (Kinda)
15:25 Showcase
16:10 Sponsor
Music (in order):
Astrophysics - Sweden (C418 synthwave/80s) • Sweden (C418 synthwave...
HOME - Still Life • HOME - Still Life
Helynt - Continue • Continue
LitKidBeats - GOOD VIBES • [FREE] Happy Type Beat...
Harris Heller - Guilty Spark • Guilty Spark
Harris Heller - Streamline • Streamline
Infraction - Photograph • Vlog lo-fi Anime Fashi...
Harris Heller - Iridescent • Iridescent
Harris Heller - Path Less Traveled • Path Less Traveled
unfeel - Deep Blue • Deep Blue
LuKremBo - biscuit • (no copyright music) l...
Helynt - Moog City • Moog City
LAKEY INSPIRED - Chill Day • LAKEY INSPIRED - Chill...
Infraction - Serotonin • Vlog Lo-Fi Chill by In...
HOME - We're Finally Landing • We're Finally Landing
Blue Wednesday - New Shoes • Blue Wednesday - New S...
Harris Heller - Plethora • Plethora
LAKEY INSPIRED - The Process • LAKEY INSPIRED - The P...
unfeel - Kinda Love • Kinda Love
HOME - Head First • HOME - Head First
Infraction - Jeju • Vlog Chill Hip-Hop by ...
This video was sponsored by Brilliant
To try everything Brilliant has to offer, free for a full 30 days, visit brilliant.org/mattbatwings
You’ll also get 20% off an annual premium subscription.
ok man we got to address your genius 💀😭🙏
"Why should we try brilliant if we have you?"-me
i mean, neural networks, brilliant, it's all connected
Did you use a cnn to make the mnist image reduced and input the weights in a feed forward neural network?
Thanks I love brilliant
if you guys think this is insane, it took this guy like 2 weeks to make this all start to finish this man is a MACHINE
lol
cwaftymwastewman:3
Crazy 🎉
im sorry WHAT
@@imsaturncat Ew
ChatGPT playing minecraft: ❌️
Minecraft running ChatGPT: ✅️
Yeah bro, they'll make a server, represent the internet, someone will then recreate ChatGPT with redstone, make it learn a lot, and people would be able to use it. But the problem is redstone is very slow, so they'd have to speed up time so much that it even responds in an "ok" time.
Someone NEEDS to make a chat GPT in minecraft, I Don't care if it uses command blocks, it would be so cool!
@@Centorym The GPT language models are so huge that, if we convert the whole model into redstone, the scale of the redstone machine will be so large that it will not even fit within render distance!
For comparison, the ChatGPT model is somewhere around 10 million to 10 billion times larger than the number-recognition model.
Yeah I think command blocks is the only way to go, but even that the amount of command blocks would be monumental!
And the labor of copying the entire model by hand...
I think the conversion process has to be automated to be feasible
@@tung-hsinliu861 then we must settle for a very barebones version that has predetermined responses - although that'll be more of a magic 8ball ngl
@@Ari_Fudu but then it's not a neural network
the ONLY person on youtube that managed to explain neural networks in seconds. it took me days of research to understand them, be able to make them, and explain them
I will likely never fully understand these videos, but man are they impressive 👏
Incredible work! My brain is fried
ive never seen people not reply to a famous youtuber lol
Hi knarfy
Are you gonna be doing "Breaking a neural network with your dumb ideas"?
Fried brain 🤤
@@ThatGuyNyan run knarfy RUN before this guy makes a 3 course meal from you
This was 100% a brilliant partnership
It was 💀
😂
you were right !!
bad pun (·n·)-p
LMAO
We're getting to the point where pretty soon someone is gonna recreate the NES in Minecraft, or make Doom in Minecraft. I'm betting that within 10 years someone will get either Doom, Super Mario Bros, or The Legend of Zelda running just off redstone
Idk about other games, but Doom already exists, someone ran it on his redstone computer (I believe it was called IRIS). I'll come back and edit this comment with the code of the video
(edit) _SvLXy74Jr4 Also I have no idea if this has been done before
such an original comment
Modpunchtree already ran doom on his cpu iris you can look up the video
@@proceduralism376 yeah and its only 28~32s for each frame
A NES Emulator in minecraft would be CRAZY
We got real AI in Minecraft before GTA 6
Diabolical
😭😭WE ONLY HAVE A COUPLE YEARS TO MAKE THESE JOKES; EVERYTHING WILL STOP BEING IMPRESSIVE SINCE ITS AFTER GTA 6
i came here looking for this comment LMFAO
@@goldfishglory we got gta 6 before gta 7 - some guy in 2093
@@NolanHOfficial true
Bro, people out there creating neural networks in Minecraft, and I'm struggling opening a chocolate bar while watching them
Bruh
I'm struggling on a 2x2 and this dude's making a neural network.
dont worry dude! it just takes time! You should watch his logical redstone reloaded series. (both new and old). they’re really helpful in understanding how computational redstone works. After that, just try to make an ALU. Its an amazing starting goal and once you’ve made your own, you can confidently say you’re proficient. I wish you luck on your journey
its 4
@@takyc7883 he is talking about the door
@@takyc7883 god damnit i laughed way too hard at that
@@MrFiveHimself That's assuming the commenter is not on bedrock.
my brother in christ, IT TOOK ME TWO MONTHS TO MAKE A NETWORK FROM SCRATCH THAT SOLVED THE MNIST DATASET IN PYTHON AND YOU DID IT IN REDSTONE IN 2 WEEKS, i applaud you, you redstone genius
lol there's a video i love of some bloke just writing it in like half an hour :> watching it is a great way to lose confidence in your abilities
@@WoolyCow Link?
@@GustvandeWal yt doesn't play nice with links, but its called "Building a neural network FROM SCRATCH (no Tensorflow/Pytorch, just numpy & math)"
@@WoolyCow Thx!
(Just copy the part after /watch?v= 🙂)
@@GustvandeWal oh lol i shouldve thought of that! thanks for the tip :D
I just did a machine learning course last semester, and your 2 minute explanation of an MLP network was way easier to understand than our textbook's chapter that covered it. This entire build is insane, amazing work!
We got AI in Minecraft before GTA 6
You solved a number of difficult problems elegantly, but your amazing ability to communicate those ideas both visually and with narrative ease really stands out. Fantastic piece of content my dude.
The internet is such a cool place. Imagine having a degree and choosing to build real video games and software inside Minecraft and share it for a job, instead of actually building the video games and software and making a living from that. The internet is so cool.
Gaming companies are so scummy and exploitative that honestly that ain't really a bad deal after all
A forum for all ppl from stupid kids to Elon Musk
@@VortexFlickens Not much of a flattering comparison for stupid kids don't ya think?
@@VortexFlickens you said stupid kids twice
Wow look at the stupid kids hating on Elon cause he’s successful. Someone made a joke, cope
mattbatwings in 1 year: I Made a Technological Singularity with just Redstone!
Your transcript for college, internships, and future jobs in computer science is gonna be so stacked
In 16-bit logic, you can replace division by 15 by a multiplication by -30583 (32 bit result), three shifts, and two addition operations. You can easily figure this out by compiling a function that returns its 16-bit argument divided by 15 on clang with -O2, and what's efficient to do on silicon fabric (integers over floats, and multiplication over division) is almost always efficient in minecraft too.
As for softmax, in 2021, researchers at nvidia created a hardware-efficient softmax replacement called "softermax" that is realistically implementable in minecraft.
I'm not a minecraft expert, but I love seeing hardware implementations of functions, and minecraft is no exception.
Just because a function is hardware-efficient doesn't necessarily mean it can be easily or efficiently implemented in Minecraft, but it's an interesting point.
@@law1337 "what's efficient to do on silicon ... is almost always efficient in Minecraft too."
Java: *raises eyebrow*
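The multiply-and-shift trick in the thread above can be sketched in Python. This is a simplified variant, not the exact three-shift sequence the commenter's clang output would use: 34953 is the 16-bit unsigned view of -30583 (it equals ceil(2**19 / 15)), so one multiply and one shift reproduce division by 15 for every 16-bit input. The function name `div15` is just an illustration.

```python
def div15(x):
    """Unsigned 16-bit division by 15 via multiply + shift.

    34953 == ceil(2**19 / 15), the unsigned reading of the signed
    constant -30583 mentioned in the comment. The result is exact
    for every x in [0, 65536)."""
    assert 0 <= x < (1 << 16)
    return (x * 34953) >> 19

# spot-check against true integer division
for x in (0, 14, 15, 255, 65535):
    assert div15(x) == x // 15
```

The same recipe (a "magic number" and a shift) works for any constant divisor; compilers pick the smallest constant and shift pair that stays exact over the input range, which is why it is cheap on silicon and, as the commenter notes, plausibly cheap in redstone too.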
14:19 Exponentiation is pretty simple: convert the exponent to binary, then for each bit that is set, multiply in the corresponding power. To get the list of powers, start with the base and square it each step. Here's an example: for 5^7, convert 7 to binary, which is 111, then multiply 5, 25, and 625 to get 78,125, which is the correct answer.
Also known as Square-And-Multiply algorithm
@@skaleee1207 Nice! I didn't know its official name. Originally, I thought I was the first person to come up with it, I remember being quite proud of it, later on I learned that it already existed, but I didn't know the name until now! That name is a lot simpler than my explanation and will allow people to find more information on it too, thanks!
This is an exponential with Euler's number. Any output would be irrational and very messy. I understand why he would avoid this.
You could do it with base 2 (or 4), it's just changing the "temperature". In that case exponentiation is trivial (a bitshift). But you still have to do division.
That's like shift-and-add but for exp instead of mul
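The square-and-multiply algorithm described above, as a minimal Python sketch (the function name `ipow` is just for illustration):

```python
def ipow(base, exp):
    """Square-and-multiply: O(log exp) multiplications.

    Walk the exponent's bits from least significant to most;
    keep squaring `square` so it holds base^(2^k) at step k,
    and multiply it into the result when that bit is set."""
    result = 1
    square = base
    while exp:
        if exp & 1:            # this exponent bit is set
            result *= square
        square *= square       # base^(2^k) -> base^(2^(k+1))
        exp >>= 1
    return result

# the example from the comment: 7 = 0b111, so 5 * 25 * 625
assert ipow(5, 7) == 78125
```

For 5^7 this does three multiplies into the result instead of six repeated multiplications, and the gap widens quickly for larger exponents.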
These projects of yours are insane, and the fact you choose a project AND actually push yourself to do it is super admirable
This is such a good demonstration that every hard problem is just a ton of smaller easier problems.
off topic, but I recently started the second semester of my computer science degree in college and was like "omg it's a mattbatwings thing" the whole lecture, because I'd already learned most of the stuff they were talking about from you 💀
It’s always a good day when a mattbatwings Video is on my recommended
Bro you could not get recommended this before the premiere
Same
@@CubeXC you can. before a premiere starts, it can be recommended
i'm amazed how well explained everything in this video is! keep on rocking
I was trying to come up with a project to add to my resume and you just made it simple for me: focus on ML. Thank you! :)
At this rate, in 5 years I'm going to see a video on my homepage from mattbatwings where he ports the entire Linux kernel into Minecraft
Well, they do say that Linux runs on just about anything
Actually you only need to be continuous for training, for deployment you can drastically decrease the precision
Without losing accuracy, if you do it right
There's a paper where they reduce it all the way to one bit per neuron, which is a perfect fit for minecraft
(And I'm pretty sure also to 4 bits, which would fit signal strength applications)
quantization baby
@@user-yi8uz2ph1y Imagine getting a binary quantization good at mnist lol
Even for training, you can use quantization-aware or non-differentiable methods and meet parity on inference during training.
@@whatisrokosbasilisk80 That's what I'm talking about
I'm guessing this is the BNN paper by Courbariaux et al. from 2016? I'm skimming through the claims and it's insane what quantization can theoretically do.
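The "multiply by 100 and round" trick from the video is close to the symmetric post-training quantization this thread describes. A minimal sketch (per-tensor scale, int8; the function name and constants are illustrative, not the video's actual pipeline):

```python
import numpy as np

def quantize(w, bits=8):
    """Symmetric post-training quantization of a weight tensor.

    Maps floats onto signed integers with one scale per tensor;
    the original value is approximately q * scale."""
    qmax = 2 ** (bits - 1) - 1        # 127 for int8
    scale = np.abs(w).max() / qmax
    q = np.round(w / scale).astype(np.int8)
    return q, scale

w = np.array([0.31, -0.92, 0.05])
q, scale = quantize(w)
# q * scale recovers each weight to within half a quantization step
```

Replacing the fixed x100 with a per-layer scale like this is what lets 8-bit (or even lower) deployments keep most of the floating-point accuracy.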
This guy in 2030: Building robots to colonize all solar system planets with just redstone!
This is absolutely insane, keep up the good work!
The first thing that comes to mind is a recent cutting edge implementation of QAT (quantization aware training) called Bitnet 1.58; it operates on different principles than a standard MLP. It replaces the Matrix multiplication with binary operators (addition, subtraction, or no-ops), so it's fast in inference deployment and cheap in that you can sort of fit a single "unit" of weights into 1.58 bits (though it's easier to just do it as a 2bit implementation with one state unused). It'd probably be way faster in a Minecraft context as one of the biggest disadvantages in IRL deployment, that you need custom hardware to take full advantage of the speed improvements, isn't really a disadvantage in a bespoke system.
Anyway, the biggest difference is in the training process; it's trained at Int8 or FP8 (if memory serves, it's been a little while), and is then downscaled to the 1.58bit representation, but the information lost in that conversion to ternary values is preserved in a weight reconstruction matrix, basically. The end goal is that the network is made aware that it will be converted to a ternary representation. Hence, "quantization *aware* training", so you might be able to preserve more of the accuracy of the floating point model than you thought.
Strictly speaking, the full bitnet implementation is a Transformer network, but it should still apply to raw MLPs given that they started with the FFN (essentially an MLP placed inside a more complex network with self attention and a language head).
I never thought a Minecraft video would teach me neural networks better than my teacher, thanks for the upload
The way you explained all of those deep learning terms in simple words is just marvelous!
You just reinvented the integer quantization! Nice job🎉
Brother.
I spent a while learning how to make neural networks as a school project, and just doing this from scratch, in redstone, is absolutely astonishing. Legend, Mattbat.
That's incredible! Combining neural networks with Minecraft is pure genius. Keep up the amazing work!
Amazing project, congrats. Note: instead of multiplying the weights by 100, you can perform post-training int8 quantization to maintain most of the original accuracy.
I remember dabbling a bit in neural networks years ago and also went with integers instead of floating point, it was just 5x easier to code in the ancient software I was using on a school PC...
Very nice redstone.
Nice, I also thought at first that you were going to train the model in Minecraft, but it seems that if that's going to happen, it's going to be a whole other story
wait 2 more weeks lol
Okay, this just looks bonkers insane. Awesome work.
This taught me about implementing neural networks better than a lot of learning resources I've watched. Good work
Now please make a calculator where you can draw the numbers yourself (using a neural network and calculator) that would be awesome
Ok, now make an AI assisted shape drawing tool for your paint program.
e.g. draw a bad square, it draws a good square with the same width and height.
draw an ugly number, it fixes it by converting it to the closest possible number with correct dimensions.
that sounds like pure hell
I love it
nah
@@alluseri i like how google translate assertively translates this to "Now"
dude! seriously, from the bottom of my heart, you deserve sooooo much more recognition! thank you for this fantastic content
This was so cool man. Great job!
1 step closer to google in minecraft
There is a mod that uses blocks as a screen and connects to Google's URL, so technically you can watch YouTube in Minecraft
Amazing
I hope you mention the first guy who did a neural network thing in minecraft, recognising numbers
Edit: he did
Feels good to comment before watching the video...
Why would you comment this before watching the video. He mentioned the other guy very early on in the video
Twitter rot
Great work! You helped me gain a better understanding of the weights. I don’t know why i was having a hard time grasping how the weights worked. Thank you
That’s a marvelous creation to be able to feature on your portfolios.
Bro is bout to build a quantum computer in Minecraft… 💀
respect for the sponsor's dish at the end of the episode
Congratulations, that's so cool! I used to do a lot of redstone back then, so I love seeing people pushing the limits further and further with it :)
massive kudos dude, seriously impressive!!!
Wow, that's actually crazy, good on you!
Congrats, you passed your PhD thesis.
This is amazing bro. Great job!
Dude your project just made me fully understand MLPs and neural networks thank you.
I just started the first few seconds of the video... the Minecraft soundtrack remix is awesome! Gives you a liiitle bit of the nostalgia of the original soundtrack but it feels so cool
WHAT THIS IS THE VIDEO I’VE BEEN WAITING FOR OMGGGG
I'm always happy to see what wizardry you come up with. Keep being awesome man!
This was such an entertaining video yet i still understood what you were talking about. This was amazing!
This is great work! I never thought we would have machine learning with just Redstone.
There's a guy who made this in Minecraft a year ago lol
@@bintangramadan3217 Yeah but Mattbatwings is aware of that so maybe there will be something new?
You could not have seen it yet, stop saying stuff just to get likes. It was before the premiere
it's not machine learning, he just pasted the weights and biases into the neural network, instead of making it learn by itself like a machine learning algorithm would
@@mineq4967 yeah that's a bit deceptive tbh
This is honestly incredible. I wish this was around when I was studying these concepts, would have helped me understand back propagation and softmax so much quicker
yooooo this is such a big inspiration since for a year or two now ive been working on and off on an AI that speedruns minecraft and i kinda stopped working on it for awhile but i think i might go back to it bc of this! this video brought it all back
Dude its sick and u are on next level , this one got u a subscriber. Keep going!
Can't say enough about how great it is that you showed prior work from others in the community before digging in to your version. That's what we want to see in the community ❤
at this point bro is gonna make hooman brain in redstone dang good job
Cringe
@@ziphy_6471 omg its linus no way 🔥🔥🔥
@@ALPRNX422 I have several children in my basement
@@ziphy_6471 cool
@@ALPRNX422 Will you be my next OwO UwU * turns up bulge *
Man, that looks so fun.
Congratulations.
bro what you are a genuine genius. I do not mean this non-literally, you are a genius
We are getting AI in minecraft before GTA VI comes out 💀💀
NO DONT TAKE OUR REDSTONE ENGINEERS JOBS
This is really good bro for visualising how computers work deep down in their tiny chips. Like ur essentially blowing up a cpu to its full size and literally WALKING thru the details and wiring. U can be a goated CS major bro, u have so much f**ING talent bro. How old are you dude? Did you do UNI, or are you currently doing uni? Like bro, go do a CS major or smth, you could make a shit ton of money from just research and development. U got like bottomless talent levels bro
This video was awesome!! It taught me the basics of machine learning but related it all to Minecraft. The perfect combo!
How did u get around the network being bad at actual digit recognition, due to the MNIST data set all being perfectly centered?
A simple MLP can learn a pretty good representation already for this dataset, but one easy approach would be to transform the input images (e.g. skew, rotate) and add these as additional training samples, this makes the learned representations even more robust :)
@@ferguspick6845 tysm
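The augmentation idea in the reply above can be sketched with the simplest possible transform, a random shift (a stand-in for the skew/rotate the reply mentions), assuming 28x28 MNIST-style arrays:

```python
import numpy as np

def augment(img, max_shift=2):
    """Shift a 28x28 digit by a small random offset.

    Adding shifted copies to the training set makes the learned
    features less dependent on the digit being perfectly centered."""
    dx, dy = np.random.randint(-max_shift, max_shift + 1, size=2)
    return np.roll(np.roll(img, dy, axis=0), dx, axis=1)
```

In practice `np.roll` wraps pixels around the edge, which is harmless for small shifts on centered digits; padding and cropping would avoid even that.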
Wait what!???? Please tell me that this is just uploading the model into redstone and not all complex things like backpropagation to train the NN inside Minecraft...
For simple neural networks you don't really need backpropagation, you can just randomize the values until it gets better. It'll take longer to train and won't be as efficient, but it's way easier to do
Bro there's a Chinese guy who made a neural network in Minecraft a year ago lol
@@bintangramadan3217Send the vid pls
yes it is uploading the model into redstone dw
At least...
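The "randomize the values until it gets better" training the reply above describes is random-search hill climbing. A toy sketch on a hypothetical 2-parameter fit (not the video's network):

```python
import random

def hill_climb(loss, weights, steps=1000, sigma=0.1):
    """Random-perturbation training: propose a Gaussian mutation of
    every weight and keep it only if the loss improves."""
    best = loss(weights)
    for _ in range(steps):
        trial = [w + random.gauss(0, sigma) for w in weights]
        l = loss(trial)
        if l < best:
            weights, best = trial, l
    return weights

# toy target: fit y = 2x + 1 with a line (slope, intercept)
random.seed(0)
data = [(x, 2 * x + 1) for x in range(-5, 6)]
mse = lambda w: sum((w[0] * x + w[1] - y) ** 2 for x, y in data) / len(data)
trained = hill_climb(mse, [0.0, 0.0])
```

As the reply says, this needs no gradients at all, which is why it is attractive for an in-Minecraft trainer, but it scales very poorly as the number of weights grows, which is why real training uses backpropagation.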
Really great video. Also you demonstrate how easy making a neural network can be. You just explained everything very well. Will inspire people. Inspired me.
So a really cool detail is how you handled the floating point limitation; this is actually really close to some quantization solutions. Look at the paper "The Era of 1-bit LLMs: All Large Language Models are in 1.58 Bits" if you have the time, you might find further optimizations there
I’m a time traveler and mattbatt has recently made a human brain in Minecraft
He also made a Time Machine in Minecraft which is how you’re here I assume
@@Meyer-gp7nq Naturally
So... For all of you without experience with neural networks, this isn't the whole thing (not even close). He is placing weights which he already got from training in Python. Even though this is impressive, it's way less impressive than some of his other builds. Training a neural network would be impossible in Minecraft because of all the math it requires, which isn't possible (or is extremely hard and slow) in Minecraft... I hope I cleared things up :)
You're right, but had to do that and the result is still pretty cool
It's not impossible, maybe with a lighter neural network the training process will be possible.
it's a neural network
I mean it pretty obviously isn't the whole thing, and regardless it is still impressive. Logical redstone is pretty much "just" how you chain different circuits together so saying that it is any less impressive doesn't really make sense. Also even if it is just hard coding the weights, it is STILL a valid neural network model. Weird comment.
I'm not extremely familiar with Minecraft, but I suspect that second half of what you said, "extremely hard or slow" is more accurate, though there would of course be memory limitations of the computer itself being unable to store all of the redstone. However, assuming the world file isn't too large Minecraft should be Turing complete.
Dude this is so amazing. To have the skill to make a machine like this, understand the math and computations behind it, minecraft knowledge, and the video production after it all? That's amazing
You’ve outdone yourself again, great Job! 🙌🏻
bro casually invented quantization by himself 💀
so basically you're creating AIs from binary 💀
This is such amazing work, I even applauded when the video ended. While watching the showcase I noticed that the 8 was always in the same confidence range, could this be a bug? Anyway this is incredible, congratulations!!
Okay the bit shift caught me off guard. Absolutely amazing work
this is absolutely crazy, as someone who took machine learning in uni i never thought that this was possible in minecraft, mind blown!
You are crazy. Amazing work. And also getting a sponsor.
Nice work man
The first time I understood such a video! thanks! and please keep going!
Finally found the perfect build for my base's front door combination lock
you never fail to amaze us, insane work!
this is an amazing showcase of what a NN is and how it works. i worked with NNs for years and i still struggle sometimes lol
Yep, definitely cool. You'd just said integers were needed for Minecraft and I was thinking "so you just multiply it up"... sounds obvious, but only because you were already talking about it. So good.
It felt like I was watching Alan Turing cracking the Enigma. Keep creating this kind of content.
I'm an IT student too and it's so amazing to watch!
I finally understand how these neural networks work. Thank you!
Before we know it someone's gonna create actual LIFE from just redstone
The machine looks like the deciphering machine from the "the imitation game"
You are good man.
insane love your work
Although I didn't actually understand what you were doing, it's always fascinating how people like you have pushed the Minecraft redstone community so far. Keep up the good work!
In simple words, he made AI
this is so cool, if there's even the slightest possibility to collaborate on a similar idea or project I'd be honored at such an opportunity
Awesome job and great explanation