How a CPU Instruction Decoder and Instruction Execution Works
- Added 6 Aug 2024
- In this video, we investigate how Instruction Decoding and Instruction Execution gets carried out inside a CPU or Microprocessor.
@stem.odyssey
00:00 Introduction
01:20 Fetch Instruction from Memory
01:50 Decode the Instruction
02:48 The Boolean Logic
04:55 The CPU Internal Data Bus
05:25 To the Control Unit...
06:36 Memory Types Used in Computers
08:46 Implementing the Control Unit via a ROM Array
12:22 CPU Microprogramming
12:45 The Microcode or Microinstructions for the Add Instruction
13:11 Summary & Outro
The Videos of this Series :
1: How a Computer Works: Introduction to Playlist -Understanding Core Fundamentals of Computer Hardware
• How a Computer Works: ...
2: How a Computer CPU or Microprocessor Works - I - (Core Fundamentals of the Electrical Circuit of a CPU)
• How a Computer CPU or ...
3: How a CPU Instruction Decoder and Instruction Execution Works
[you are here]
4: How Memory Address Decoding Works
• How Memory Address Dec...
5: How Memory Address Decoding Works (Part 2)
• How Memory Address Dec...
6: How Computer Memory Works (The Core Fundamentals of Electrical Circuit of DRAM)
• How Computer Memory Wo...
7: How Flat Panel Display Electrical Circuit Works. (Understanding the Core Fundamentals)
• How Flat Panel Display...
8: Basic Fundamentals of Number Systems
• Number Systems: The Ba...
NOTES 1:
Those who are familiar with Assembly Language might have noticed that the Add instruction we implemented in our CPU is a bit different from, say, a standard Add instruction, which would usually Add either two values already present in Internal Registers, or Add a Value stored at a Memory Location to a Value stored in a Register, etc.
The reasons for implementing a somewhat non-standard Add construct were:
1.1. To highlight that there are ample cases where the Program Counter has to be incremented more than once during the execution of a Single Instruction. In other words, the Next Instruction to be Executed does not necessarily have to be at the Next Memory Location.
1.2. To highlight that when one goes for a CPU design, there are no hard and fast rules one is required to follow, UNLESS one is implementing an Industry Standard Instruction Set.
The issues associated with implementing a non-Standard Instruction Set are a totally different topic altogether. For example, most existing software might have to be re-implemented or re-written to run on the new Instruction Set Architecture (ISA), etc.
Thus in many cases, when a new ISA is implemented, one of the first pieces of software to be ported or implemented in that environment would be a C/C++ compiler, such as GCC. And then, using that C/C++ compiler, one can compile and port the other software onto the new ISA.
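The point in 1.1 can be sketched in code. This is a minimal, hypothetical simulation (the opcode values and memory layout are assumptions for illustration, not the video's exact circuit): an ADD whose two operand bytes follow the opcode in memory, so the Program Counter is incremented three times while executing one instruction.

```python
# Hypothetical opcodes (assumed for this sketch, not the video's encoding)
ADD = 0x01   # add the two bytes that follow the opcode, result into ACC
HALT = 0xFF  # stop and return the accumulator

def run(memory):
    pc = 0   # Program Counter
    acc = 0  # Accumulator
    while True:
        opcode = memory[pc]    # fetch the instruction
        pc += 1                # PC increment #1
        if opcode == ADD:
            a = memory[pc]     # fetch first operand byte
            pc += 1            # PC increment #2
            b = memory[pc]     # fetch second operand byte
            pc += 1            # PC increment #3, now past the whole instruction
            acc = (a + b) & 0xFF  # 8-bit add, wrap on overflow
        elif opcode == HALT:
            return acc

program = [ADD, 5, 7, HALT]
print(run(program))  # -> 12
```

Note how the "next instruction" (HALT) sits three memory locations past the ADD opcode, not at the next location.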
NOTES 2:
Many Instruction Decoders found in Production-Level or Industrial Microprocessors do employ a few Optimization Techniques to quickly identify whether an Instruction is an Arithmetic Operation, a Logic Operation, etc.
These optimizations play a role in speeding up the execution process, as well as in streamlining the Logic Designs.
Our Bare-Bones Basic CPU will not go into these optimization techniques at this stage, not until the Core Fundamentals are covered in sufficient detail...
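One common form of the optimization mentioned above is grouping opcodes so that a couple of bits identify the operation class. The bit layout below is an assumption for illustration (not the video's design, and real ISAs differ): the top two bits of an 8-bit opcode name the group, so the Control Unit can route the instruction without decoding every bit.

```python
# Hypothetical 8-bit opcode layout (assumed for this sketch):
#   bits 7-6 : operation group
#   bits 5-0 : the specific operation within that group
GROUPS = {
    0b00: "move",        # data movement
    0b01: "arithmetic",  # add, subtract, ...
    0b10: "logic",       # and, or, xor, ...
    0b11: "control",     # jumps, halt, ...
}

def classify(opcode):
    group = (opcode >> 6) & 0b11  # inspect only the two class bits
    return GROUPS[group]

print(classify(0b01_000010))  # -> arithmetic
print(classify(0b10_000001))  # -> logic
```

With this layout, a tiny amount of logic (two bits into a 2-to-4 decoder) steers the instruction toward the right functional unit before the full opcode is decoded.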
This is so helpful. I've been looking for a more hardware-focused explanation of instructions but none of them have gone as in-depth as this video.
Glad it was helpful!
Branch Education has hardware focused videos. 😊. czcams.com/video/wtdnatmVdIg/video.htmlsi=VXDoe9ixKTkAF5q3
Many great engineers developed this technology in the 1960s-1970s, many thanks to you all!
One of the best channels in terms of explanation and clarity.
We wish you progress and prosperity ❤❤
In 1984, during my electronic engineering studies, we were given the task to develop and build a 4-bit, 16-instruction-set microprocessor using macro-components (TTL gates, FFs, diodes, etc). The micro-instructions were the most difficult part, since it is like designing and building a city traffic-light system... We realized then the beauty and magic of micro-instructions coded using actual diodes (1N4000... around 2000 of them) in a matrix of copper wires, just like your simulation... We ended up with a board 1.5m x 3m in size and meters and meters of copper wire... a total mess... The clock was a big pushbutton and a T-FF to advance each micro-instruction step by step... Good memories come to my mind!!
So true... Now just imagine how "fun" it would be to implement a Hardwired control unit, say for a pipelined superscalar architecture or an SMP Architecture... 😎😁. So I believe the micro-instruction approach is the one chosen by most projects nowadays. Glad this video brought back good old memories...😎
This video makes the CPU, Memory, etc. very easy to understand. I love this video very much. Thank you for creating it!
Glad you enjoyed it!
Wow! Keep it up! Waiting for the complete playlist. Subscribed to you as well.
Awesome, thank you!
Excellent video.
Niiiiiiiice!
I should have discovered this channel a long time ago.
You are fantastic, keep it up with these detailed pieces of information. Success will come to you very quickly. I saw a video with this information in 3D detail. You are fantastic; more videos with details, please.
Awesome, thank you!
@@StemOdyssey please make a tetris game or atari game in detail.
can't promise that...😊
this channel is amazing 💖
The "and" logic gate looks like a tasty gum drop
🤗
Great video, but I just can't with the AI-generated voice. :(
I intend to change the AI voice, will see how it goes...
Yes.. same issue
youtube GOLD
No doubt great content... but in this video you didn't explain that instruction circuit...
Yes, the issue was: to prevent the video from getting too long, I had to skip the internal details of quite a few circuits...
I really love learning how computers work, and I learned how to make a 24-hour digital clock using Logisim, from a 4-bit counter built with JK flip-flops in logic-gate form and a 7-segment decoder in logic-gate form.
But I want to understand how a computer really executes instructions and programs.
I have searched YouTube many times but I can't find it.
Can I ask a favor?
Can you make an 8-bit computer for Snake and Tetris games, composed of logic gates?
It will be a big help and pleasure to me and to us computer enthusiasts.
A Snake/Tetris 8-bit computer in logic-gate form. Thanks
can't promise that... will see if time permits...
If you're using an AI voice, you might as well have ChatGPT take a look at your script and make it more fluent and idiomatic.
That actually might be a good idea... Will give it a try...
so it is all factorio? Always has been.
Turing Complete or Logic World on Steam.
I wish you had listened to the audio before publishing this video.
Are there any technical faults in the text?
@@StemOdyssey I did hear some things I'd like you to watch out for in your future videos.
0:18 It sounds to me like "to numbers", when I believe it should have been "2 numbers".
0:23 Here, I hear "fetch it" instead of "fetched".
1:00 I don't even know how to describe what happened here.
In short, I'm not hearing big mistakes in the information being conveyed, but I do hear some things I suggest you watch out for.
I intend to change the AI voice used, will see how it goes...