GPUs: Explained
- Added Jun 5, 2024
- Read the blog to learn how IBM Cloud enhances GPUs → ibm.biz/ibm-cloud-enhances-gpu...
Check out IBM Cloud for GPUs → ibm.biz/BdPSfV
In the latest in our series of lightboarding explainer videos, Alex Hudak is going to tackle the subject of GPUs.
What is a GPU? What's the difference between a GPU and CPU? What are the most relevant use cases for GPUs, and how do GPUs figure into your cloud strategy?
Get started for free on IBM Cloud → ibm.biz/sign-up-today
#GPU #HPC #AI - Science & Technology
Out of the many GPU/CPU videos, this one saved my time and I loved the simplified story. Thanks Alex & IBM
"Gaming is no longer the focus of GPUs anymore"!
Thanks guys
Lmao I paused the video just to check on this statement.
No mention of ⛏
Finally a more layman's terms video on the subject, thanks!
I was a layman, then I took an arrow to the knee to become a Standingman who can use complex terminology
Can we all just take a second to appreciate how easy this woman makes writing reversed look?
She doesn't even bat an eye!
Don't you think that it's much more likely that they simply mirrored the video along the vertical axis?
I think so, because it's much more likely that she's right-handed instead of left-handed.
BRAINZ
All these IBM explanation videos are made that way. It's a simple horizontal flip.
......
My thoughts exactly
@@47Mortuus wooooosh
IBM dev rels are the best when it comes to explaining stuff...
Facts
Fax mon
Agreed 🤝
Having an exam in 2 hours' time and I'm here for a killer explanation of a GPU. Thanks for this.
This educational video is one of the reasons I love the Internet. Thank you, Alex, for doing this; you answered the questions: 1) what is a GPU, and 2) why is parallel processing important.
Would love to see this presentation further explain: 1) what does my solution delivery team need to do in order to leverage parallel GPU processing, 2) how do I integrate GPU processing within my current architecture, and 3) what challenges do traditional solutions have with leveraging GPU capabilities?
Hey Alex! Looks like you are doing very well since your days in Troy. Of course, I'm not surprised. Keep up the amazing work!
The best and easiest-to-understand explanation so far. Thanks!
We're glad you found it useful, Kiran! 🙂
I've been looking for a good video
And finally I found the best one
Thanks a lot 💓
Thanks a lot mam for this video. It has been explained nicely.
Wow - I can't begin to tell you how enlightening this short video was. It has, in 8 minutes, shed great light onto two MSc modules I'm working through right now: Blockchain (PoW mining) and Cloud Systems. I came for the blockchain mining but stayed for the relevance to cloud and virtualization. 5 *s.
Good speech presentation, good visual presentation with the colored markers & screen layout. I am impressed!
And I as well!
Explained well, thank you.
Amazing explanation. Thank you so much for bringing in the clarity on CPU vs GPU.
Great presentation and a clear explanation of the basics of GPUs etc.; it makes sense in a simple manner. I'm loving the IBM Technology videos...
so comprehensive,
thanks
I love IBM. I'm holding the IBM Certified Cyber Security Analyst Professional Certificate, and the experience was awesome throughout the 8 courses
Well... this explains what I needed, thanks
Wow, I had a very vague (and wrong, lol) idea of what a GPU was. It actually makes sense to me now.
Simple, but effective. The drawings help a lot. Thank you!
You are most welcome.
Great info, and also such a cool presentation setup.
Tremendously Helpful Thx!
The most impressive thing about this video is how you had to draw everything backwards. How did you do that without it looking like crap?
Well actually we don't write backward. Here is a blog post we wrote that explains how we do it, with a photo. ibm.co/2LTPMjo
@@IBMTechnology that link doesn't work :(
(at least from mobile)
She writes it the regular way and the video itself is mirrored.
Lol, it leads me to a channel called "post" but nothing is posted.
I think they basically invert the video after shooting.
You Rock! Alex
Writing backwards in real time whilst multitasking and multiprocessing is seriously impressive
She is a human GPU :3
it's just a horizontally reflected video.
@@porrasbrand good call.
Thanks very much for simplifying. This helps a lot.
Excellent presentation...many thanks! HAL would approve :-)
I study this and nowhere have I found videos this good. IBM is awesome
Great explanation
Good Explanation. Nice Job
Excellent lecture! Thanks a lot, ma'am
Thank you. That was a great explanation. Do you have a video with data that shows the speed differences of CPUs vs. GPUs, or a CPU without a GPU? Thanks
Hi Julius! At this time that kind of content does not exist, but it is something we are working on for future videos.
Thank you for your good explanation. Now maybe I can understand the difference between CPU & GPU.
But I have one more question: from your direction, was writing the keywords very difficult?
That's cool how you write backwards so neat
This is a great introduction
Subbed! Great content! Love it!
Great explanation 👏🏿
Brilliant video post
Good video, but I wanted to know: is a laptop with a small CPU and a beast GPU good?
Excellent.
well explained. Like it a lot.
Glad it was clear, Immanuel. Thanks for visiting.
very informative, hats off
Amazing video.
Sooo cool video! Thx
i love the way she says virtual desktop infrastructure
good video thanks
I recently looked at a brand new CPU; they are a fascinating piece of kit!
Speechless
Thanks
Informative video
Thank you, ma'am, for clearing up these runaway computer technology components and their effects on users.
GPUs are more than gaming; finally people nowadays have realised it. Thanks for making it easy to understand, IBM and Alex.
You're welcome, thanks for watching! 🙏
What is VRAM, what does it do for the GPU, and which one matters the most, GPU or VRAM?
Great thank you
good job
I admire the way you can write things behind the glass like that. I tried and it's so HARD!!
She never actually wrote backwards; they just flipped the video around in post-production
Well actually we don't write backward. Here is a blog post we wrote that explains how we do it, with a photo. ibm.co/2LTPMjo
Ok, but my question is how in God's good world she's writing in reverse so smoothly while explaining alongside
See ibm.biz/write-backwards for details
thank u
This is the coolest whiteboard I've ever seen!
Are both necessary for a gaming pc? If so what brand and capacity do you recommend?
yeah, both are quite necessary I'd say; just type in your budget and a PC build related to it should show up
In summary, GPUs are focused on matrix computations in order to create images.
CPUs execute more general and more complex instructions.
A CPU is like a musician that can read and write music.
A GPU is like a musician that knows specific chords.
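To make the summary above concrete, here's a toy sketch in plain Python (not real GPU code; the matrices are made up for illustration): a matrix operation is just the same tiny calculation repeated for every element, and because each output cell depends only on its own inputs, a GPU can hand one cell to each of its thousands of simple cores.

```python
# Toy illustration of why matrix math suits GPUs: elementwise work
# with no dependency between output cells.

a = [[0, 1, 2],
     [3, 4, 5]]
b = [[1, 1, 1],
     [1, 1, 1]]

# Elementwise add: every output cell c[i][j] needs only a[i][j] and
# b[i][j], so all six additions could happen simultaneously on a GPU.
c = [[a[i][j] + b[i][j] for j in range(3)] for i in range(2)]

print(c)  # [[1, 2, 3], [4, 5, 6]]
```

On real hardware the same idea is expressed with a library such as CUDA or NumPy rather than Python loops, but the independence of the per-element work is what makes the parallelism possible.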
I need a book where I can dig into the technical details. Anyone know a good book?
Good Presentation
Thank you!
But if GPUs are so massively more powerful, why not switch everything to them and go completely parallel for all applications?
draw , write backwards. . .
wow
i subscribed, and liked already
LOL I noticed that too.
Good!
Good intro video, but why did IBM not invest in the GPU industry?
Can you explain GPU names like RTX, GTX, etc.?
...so how do you choose the optimum GPU for an existing CPU?
Great video, more impressive how well she writes backwards.
lol
The video is horizontally flipped. She is writing normally
Even a basic computer needs a GPU.
Why don't GPUs replace CPUs? Why do we need both? What does the CPU do that the GPU can't?
Does IBM offer training for reverse writing?
good content
Hi Alex!
When a CPU does a computation, it does it in parallel, right??? Each core takes a part of the computation
The CPU also works in parallel, but not to the extent of the GPU. Also, as mentioned in the video, the GPU has many cores that work in lockstep, unlike CPU cores, which each do different work. The video is an oversimplification for laymen.
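The distinction in that reply can be sketched in a few lines of Python (a loose simulation only; the function names are invented, and real GPU lockstep is hardware SIMT, not Python lists): CPU cores can each run a completely different task at the same time, while GPU-style parallelism applies one operation to many data items.

```python
# Task parallelism (CPU-like) vs. data parallelism (GPU-like), simulated.
from concurrent.futures import ThreadPoolExecutor

def render_ui():    # one core can be drawing the screen...
    return "ui drawn"

def fetch_data():   # ...while another handles network I/O.
    return "data fetched"

# CPU-like: two *different* functions running concurrently.
with ThreadPoolExecutor(max_workers=2) as pool:
    futures = (pool.submit(render_ui), pool.submit(fetch_data))
    results = [f.result() for f in futures]

# GPU-like (simulated): one *identical* operation over many inputs,
# as if each "core" doubled one pixel in lockstep.
pixels = [10, 20, 30, 40]
brightened = [p * 2 for p in pixels]

print(results)     # ['ui drawn', 'data fetched']
print(brightened)  # [20, 40, 60, 80]
```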
Great video on clarification!
...Also are we NOT going to address the fact that the presenter has been writing in REVERSE for the purpose of this lesson?
It's flipped in post-production. See ibm.biz/write-backwards for details.
Hmm, described this way, GPUs also include iGPUs (rightly). So Intel is the biggest producer of GPUs.
Good video... although...
It needs just a 'minor' correction---
CPUs can have only 1 core... the cores run in parallel (not in series)
the cores in the GPU usually run at a slower speed than the CPU cores...
(there are a number of technical reasons for this limitation...)
The big difference is the specialization:
the GPU was initially developed for a specific task (as the name states),
and as the goal was mainly graphics calculation (although it can be used for others... now AI, ML, ...),
it has a small, specific, very math-oriented instruction set...
It shines in any task where there is the possibility of using a distributed load
(a bunch of simple tasks; remember, these are simplistic cores)
that can run in parallel and spread the load across the GPU...
;-)
Your first paragraph is exactly what I was thinking; thanks for the additional info too.
I agree that mistakes were made, but it's not right to say _"CPUs can have only 1 core... the cores run in parallel"_. As she said, a CPU can have one or many cores, but usually far fewer than a GPU. Also, the parallel vs. series argument is quite strange. The OS manages multithreading, which means a program can run in parallel on a CPU. The same is true for a GPU (this is where I disagree with the lady in the video). But I agree with her in saying that GPUs are specifically designed and suited to tasks that run in parallel, while CPUs can do both well.
Good comment! It makes clear to me the point that if the GPU is so fast, why we still need the CPU: the main difference is that the GPU's cores are simplistic, doing specific tasks and running at a slower speed than the CPU's.
What is offering manager?
Hi Tom! An Offering Manager is responsible for the full lifecycle of an offering (for a product or service) and owns the strategy and execution for bringing that offer to market.
You can read this article about a day in the life of an offering manager at IBM 👉 ibm.co/3CfytBe 🙂
Got to get this question off my mind: why don't we just have a processor that has the capability of a GPU and a CPU at the same time? Like a god processor or something
For a short moment the world had a "god processor" that had parallelism built in; it didn't need tens or hundreds of physical cores because a single instruction could do things in parallel (that is, if the code was written to correctly support it and was not just a lazy port from a traditional CPU architecture). It was called Itanium, and it was as glorious as you would imagine. But due to reasons it got discontinued, so it never had a chance to mature and become the GPU killer.
edit: Just to be pedantic, my "single instruction does parallel stuff" is not totally correct, but in layman's terms it should be good enough. If you want to actually know how it works, read up on "IA-64" and "very long instruction word."
3:44 “[AI] is something that a CPU cannot do on its own.” That's not off the wall, but it's not really true. I ran Mistral 7B on a CPU using Hugging Face. If I build it from source, I do need NVIDIA GPUs based on the way the code is set up. You can do a lot of great AI on a CPU, though.
I liked the video.
👍👍👍
And all of a sudden I want to work at IBM 💥
If a GPU is so great at everything, then why is it not used in place of a CPU?
Is it because the OS is stuck with some architecture that only works with a CPU or CPU brands (Intel / AMD), or is there some other reason?
Hi @indrajeet500, thanks for reaching out. GPUs have the capability of processing particular applications far faster than CPUs, but GPUs lack some of the core functionality of CPUs that is needed for modern operating systems. GPUs are best suited for intensive compute applications such as low-latency graphics and deep learning, whereas CPUs are built for everyday computing. For now, GPUs and CPUs are a robust team! --Alex
From what I understand, and how I explain it to people in a CliffsNotes version: the CPU is quicker at writing out the equation; the GPU is quicker at coming up with the answer. I know it is a little thin, but it seems to fit the description.
@@IBMTechnology can you tell us more about the core functions that CPUs can do but GPUs can't?
Technically, when you make an operating system, everything must be executed in order. You can't do that on a GPU because it's a parallel processor. Therefore, you need a serial processor (in this case, a CPU).
@@neamam9228 Complex arithmetic calculations. GPUs are best for simple calculations that are computationally intensive, such as graph traversal, encryption, and decryption.
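The "must be executed in order" point from this thread can be sketched in a few lines (a toy example with made-up numbers, not anything from the video): a dependency chain forces serial execution, which is CPU territory, while independent per-element work can be spread across many cores, which is GPU territory.

```python
# Serial: each step needs the previous result, so no amount of cores
# helps -- the work cannot be split.
state = 1
for step in [3, 5, 7]:
    state = state * 2 + step   # depends on the previous `state`

# Parallel-friendly: every output depends only on its own input, so
# each element could go to a separate core at the same time.
inputs = [3, 5, 7]
outputs = [x * x for x in inputs]

print(state)    # 37
print(outputs)  # [9, 25, 49]
```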
When will AI be able to think
in 3D
Is writing in reverse part of IBM training because...
👍
Whether it's true or not, considering GPUs as overpowered CPUs is a little bit disappointing.
Maria Sharapova explaining computers
you can write entire Unicode on base
I love learning with her.
This is more of a B2B ad for cloud compute GPUs than education on GPUs.
I only came for the thumbnail; I already know what GPUs are
Simp
Basic
If a CPU has multiple physical cores, meaning multiple ALUs, can't it also do multiple calculations at the same time? Or does she mean that it will split a single "task" for the same "program", so to speak, into multiple parallel calculations? But the CPU also does that, I suppose. So is the difference just that the GPU does a lot of it because it has more cores? Why would a GPU then not always be better? I think I'm not fully understanding the difference between a multicore CPU calculating things in parallel and a GPU calculating things in parallel. It can't be only the number of cores.
It's oversimplified. The CPU cores are like master chefs. The GPU cores are like line chefs.
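A playful sketch of that chef analogy (everything here is invented for illustration, not real processor behavior): one flexible "master chef" core handles varied, branching work end to end, while many identical "line chef" cores each repeat one simple step across lots of inputs.

```python
# CPU core as "master chef": general-purpose, handles any order with
# branching, varied logic.
def master_chef(order):
    if order == "steak":
        return "sear, rest, plate"
    elif order == "soup":
        return "simmer, season, serve"
    return "improvise"

# GPU cores as "line chefs": many identical simple workers, all
# applying the same step to their own ingredient at once.
def line_chefs(potatoes):
    return ["chopped " + p for p in potatoes]

print(master_chef("soup"))             # simmer, season, serve
print(line_chefs(["p1", "p2", "p3"]))  # ['chopped p1', 'chopped p2', 'chopped p3']
```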
The day is approaching where GPUs will have more cores than the 8086 had transistors.
If I understood correctly, GPUs have much more computing power, so why use CPUs at all? Because they are cheaper?
I would love this so much if it were in Spanish, since there aren't many who do this, but oh well.