threading vs multiprocessing in python
- added 22. 06. 2024
- A comparative look between threading and multiprocessing in python.
I will show activity plots of 4, 8, 16 threads vs 4, 8, 16 processes and discuss the differences between the two modules.
In summary: threads in Python are concurrent and not parallel, so no two threads can execute at the same time. The way to get around this is to use the core module multiprocessing and spawn child Python processes that each run work in parallel.
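The summary above can be sketched in a few lines. This is a minimal illustration, not the video's actual benchmark code: a CPU-bound counting task run under a thread pool and then a process pool, where the GIL serializes the threads but the processes run truly in parallel.

```python
import time
from concurrent.futures import ProcessPoolExecutor, ThreadPoolExecutor

def count(n: int) -> int:
    # CPU-bound busy loop; under threads, the GIL serializes this work
    total = 0
    for _ in range(n):
        total += 1
    return total

def timed(executor_cls, workers: int, n: int) -> float:
    # Run `workers` copies of the task and return the wall-clock time
    start = time.perf_counter()
    with executor_cls(max_workers=workers) as ex:
        list(ex.map(count, [n] * workers))
    return time.perf_counter() - start

if __name__ == "__main__":  # guard required for multiprocessing on Windows/macOS
    n = 2_000_000
    print(f"threads:   {timed(ThreadPoolExecutor, 4, n):.2f}s")
    print(f"processes: {timed(ProcessPoolExecutor, 4, n):.2f}s")
```

On a multi-core machine the process-pool run should finish in roughly a quarter of the thread-pool time for this kind of pure-CPU work.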
I can't imagine the effort and time you have invested for making this video.. Very informative
Thank you very much!
I fully agree.
It is one of the best videos showing the issues with multithreading and how it compares to multiprocessing. It really deserves a higher priority in YouTube search.
@@ErikS- yup, 18:24 cleared my mind about it.
I was always wondering about its real impact.
Damn right, I was all for using threading, but now I understand the utility of multiprocessing.
+1
🦾
Where can we have full code? Can you give a GitHub link plz? Also explain any risk of doing this multiprocessing
Too bad this video is incorrect on almost every level.
Very nice, full presentation. The short of it is that "Python" doesn't support parallel execution. For most programmers, when you talk about having multiple threads, the assumption is that those threads can and will execute in parallel. Unfortunately, Python was designed with a single-core CPU in mind, so even though the idea of threads has existed for a while in computing, code wasn't likely to be run on a multithreaded/multicore/multi-CPU machine to do anything in parallel. It was just the operating system giving out small slices of time to execute one thread or another, and it was perceived as if both were happening at the same time -- very much like your graphs show.
Python, like most interpreted languages, cannot get over this problem because of the synchronization and locking needed to share access to data across threads, so it can inherently only allow one "Python interpreted" thread to run at a time. Only library implementations in C can get around this under the hood by spawning real threads on Python's behalf to do work. Or this "multiprocess" approach, which creates a new process and an independent Python interpreter with entirely separate program state and memory. This approach isn't really a Python solution, because any programming language can spawn a new OS process (provided a library is available to access the fork() and exec*() system calls) and then the OS will execute that process in parallel on a multicore machine. But the thing about multiple processes is that it's harder and slower to share and synchronize data between processes than between threads. It may not be an issue in some cases if not much synchronization is needed (the case if only an end result matters at the end of parallel work), but it can be a severe limitation.
The last thing I'll say is that oftentimes IO-driven or IO-heavy applications don't really need the performance boost of true parallel execution. Waits for IO (disk and network, for example) are so slow compared to CPU execution that most threads would be waiting for IO anyway. With a proper async-IO setup (kqueue, select, epoll, IO completion ports) you can use a single thread to handle and dispatch thousands of IO requests and still be bottlenecked by IO. This is how/why people can still write "performance intensive" applications with interpreted languages and compete with a language like C or C++. Maximizing IO efficiency is simply something that sometimes C/C++ won't offer any benefit for, so much "slower" languages appear to be just as fast.
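The single-thread async-IO pattern described above can be sketched with Python's asyncio. A minimal illustration, with asyncio.sleep() standing in for the real network or disk waits the commenter mentions:

```python
import asyncio
import time

async def fetch(i: int) -> int:
    # Stand-in for a network/disk wait; the event loop runs other
    # coroutines while this one is suspended
    await asyncio.sleep(0.1)
    return i

async def main() -> None:
    start = time.perf_counter()
    # 100 overlapping 0.1s waits complete together on one thread,
    # rather than sequentially
    results = await asyncio.gather(*(fetch(i) for i in range(100)))
    elapsed = time.perf_counter() - start
    print(f"{len(results)} requests in {elapsed:.2f}s on a single thread")

asyncio.run(main())
```

The event loop here plays the same dispatcher role as the select/epoll loops named in the comment: one thread multiplexes many in-flight IO operations.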
I was able to combine multithreading under multiprocessing using ThreadPoolExecutor and ProcessPoolExecutor, thinking I could achieve true parallelism. But you're right, there is no real benefit to multiprocessing 99% of the time if your program is IO-bound. The extra overhead and slowdown from spawning processes is simply not worth it.
What I can't understand from the video is why multi-threading (even if non-parallel) should help, in theory, in IO-heavy applications. Can you help me?
@@alessandropolidori9895 When it comes to IO, the operating system does the work in the background. This means you can schedule another thread while the OS keeps working, and by the time your thread is scheduled again, the OS may have finished its work and you can continue yours.
@@alessandropolidori9895 In theory, one place where it can help is when the hot data being processed can mostly be cached in a CPU core's own L1 or L2 cache. An example could be a Redis-like implementation with a bloom filter (a very small in-memory structure that can definitely deny that data is in the slow data store behind it, and is 99% or so sure if it is there). For such a scenario it's of course helpful if each data store works on a different CPU core, so that the bloom filter is already sitting in the ultrafast L1 or L2 cache. To be honest, for Python scenarios this is a bit far off - you would usually implement such things in a systems language like C, C++, Rust, Ada, or even Golang (a Redis clone indeed exists there). The latter is an example of a language that still has its own runtime and garbage collection, but is optimized for such tasks.
The more practical example is that in IO-heavy tasks, some individual tasks will block (classic example: fetching data from a SQL database or a URL). Now you certainly don't want all other tasks to wait for it. The modern approach is async - but this is relatively new in Python (something like 4 years "young"), and multithreading was the answer before the async implementations were available and production-ready.
It's also nowadays a simple way (though slightly less performant than the async implementations in most cases) to alter code and get this (mostly) non-blocking behaviour if you don't want to or can't refactor the implementation.
In general, nowadays I'd recommend either optimizing the program to run on just one CPU core or to run on all of them, nothing in between. You can't really mix the use cases and still be performant anyway. In Python, you'd end up fighting a lot with the GIL (global interpreter lock), and if you have to put both use cases into one application, I'd suggest having two different programs that communicate asynchronously, e.g. via a message queue. I remember a lot of headaches with machine-learning-optimized implementations (e.g. spaCy) in combination with a web server (running with WSGI). Short story: don't do it - separate them 🙂
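The "wrap blocking calls in threads" approach described in this thread can be sketched as follows. A minimal example with a hypothetical blocking_query() standing in for a blocking SQL or HTTP call; time.sleep releases the GIL, so the waits overlap just as real IO waits would:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def blocking_query(i: int) -> str:
    # Stand-in for a blocking SQL/HTTP call; the GIL is released
    # while the thread waits in the OS, so the calls overlap
    time.sleep(0.2)
    return f"row-{i}"

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=8) as pool:
    rows = list(pool.map(blocking_query, range(8)))
elapsed = time.perf_counter() - start
# eight 0.2s waits overlap instead of running back-to-back
print(f"{len(rows)} queries in {elapsed:.2f}s")
```

This is the pre-async pattern the comment describes: no code refactoring into coroutines, at the cost of one OS thread per in-flight call.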
I notice a 20-50% slower file copy with Python compared to the system, for example with shutil.move() on Windows. I'm running the file copy on a separate PyQt thread... just renaming the file, which should not take any noticeable processing time.
Do you think another method might be faster? I'm asking because I expect C++ would be as fast as the system in this case, not 20-50% slower...
What an incredible video. I’ve just been blindly picking one or the other, not sure the differences between either one, but this makes everything so clear. I’m so glad I found it!!
this video is amazing, honestly one of the best I've ever seen, thank you from the bottom of my heart for dedicating so much time to creating it❤️
Thank you Dave, I'm so glad YouTube's algorithm brought your video to my daily feed. A really fascinating insight into threads and processes, and the presentation style was perfect. Best wishes.
Excellent explanation of one of the most complicated questions that I have ever come across in an interview setting. Even though this is after the fact, I am super glad to learn it with such clarity of thought. This is how you become fearless!!! Thank you Dave ❤!
Absolutely the best video on YouTube describing how threading works in Python, with concise demonstrations and a well-thought-out script and presentation. 10/10, subscribed
Brilliant work!! Best video on multithreading/processing I've seen in a while
Brilliant representation of the concept. Thanks for all your effort.
This is hands down the most thorough video on the topic. And YouTube shows me this exactly 1 year after I desperately needed it.
Better late than never, I guess.
The amount of work done here is unbelievable. Thank you so much
2 minutes into this I already understand it better than all other readings I did online. Nice!
Massively underrated video. Saved it in my library. Thank you sir.
This is the most comprehensive video I've ever seen on multithreading and multiprocessing. Great job!
Incredible video and crystal clear explanations. Hope to see more !
Literally the best video ive seen yet on this topic. Keep posting man!
The best video on YouTube explaining the concept! Thanks
This is by far the most comprehensive and easily consumable video on any CS learning I've ever seen. Great job! Giving you a sub for sure.
Keep it up Dave!
This video was so informative, even for someone who is unfamiliar with the concept
You deserve a lot more recognition
This is the very best explanation of threading vs multiprocessing that I have ever seen. Well done!
Very well explained :) I can see your number of subscribers growing at a steady pace mate. Keep it up! Good stuff!
Thank you very much! Yes steady growth is encouraging 😊
The best I have ever watched on multiprocessing v/s threading!! The visualizations were a complete treat ❤
Yay, it's so interesting to see a visual representation of something that I have been figuring out at work for a few years with threading/multiprocessing.
When you understand it on instinct but haven't seen it so visualized and vivid.
Woah, just found your channel. This is truly a goldmine.
This channel is a hidden gem!
This is my first time figuring out multiprocessing and threading in Python. Thanks a lot.
Glad it helped!
This is sooooooo great... probably the best explanation on YouTube
Awesome, no pressure and yet informative! Good work! Thanks a lot! Although I knew the topic well from uni, I could deepen my understanding with this!
One of the best lectures on multiprocessing and threading that I ever saw. Thanks for the guide and info, this will help me improve my own lectures on the subject
Great video. The best multiprocessing vs threading graphical explanation on the whole internet. Thanks for the dedication. New subscriber.
Thanks for this -- looking forward to more of your work!
I really, really don’t understand why you don’t have more followers. Keep up the good work. This is really well done! Informative and straight out fun!
Hands down best video on python multithreading and multiprocessing.
Outstanding video. “Like” is an understatement. So clear and informative.
I am impressed with your use of visual aids in explaining how all this works. It definitely makes a lot more sense.
As someone who does data analysis and plotting with Python, thank you. So much.
Thank you so much for your content! very useful and I really enjoyed the way you structure and visualize your video! Thank you!
fantastic data visualization with the activity charts, i will be checking out more of your videos
This was a super quality educational video, thanks so much!
This is the best video explanation on this topic, WOW
Excellent work, very informative! Thanks a ton for your time!
Excellent video, superbly made. Thanks for posting.
This video is a marvellous craftsmanship
One of the best videos that I have seen on the internet... This video forced me to subscribe to this channel.
🎯 Key Takeaways for quick navigation:
00:15 🐍 *Python is multi-threaded but not simultaneously multi-threaded, meaning two threads cannot execute simultaneously within the same process.*
03:36 🚧 *Python's Global Interpreter Lock (GIL) ensures thread safety by allowing only one thread to execute at a time within a process.*
06:00 🔄 *Multi-threading is suitable for I/O-bound tasks, where threads can perform other tasks while waiting for I/O operations.*
11:12 🚀 *Multi-processing is effective for CPU-bound tasks, allowing processes to run simultaneously and utilize multiple CPU cores.*
18:11 📊 *Choose multi-threading for I/O-bound tasks and multi-processing for CPU-bound tasks, considering the nature of your application.*
Made with HARPA AI
I wish your channel growth. A very informative video with amazing visualization. I'd love more of this in my recommendations.
Awesome video and so very well explained. Thank you so very much. It was excellent.
This was incredibly helpful!
Very informative video. Thanks a lot !
Very nice explanation . keep up the good work.
I love this video, the explanation is perfect
Great explanation! Thanks for clarifying.
Awesome video. Those visuals are helpful. Thank you
Glad you like them!
Excellent video! Thank you!
Thanks for the really great information.❤
amazing explanation!!
thank you!!
This is a brilliant video, thank you
Thanks for the detailing. Excellent
Unbelievable that I found this video! It really opened my mind about Python! Please make videos like this again!
It was a very good and impressive presentation. Listening to it made me feel as if David Attenborough was describing the lyrebird, like in the BBC documentary. :) Thank you for your effort...
This video is amazing! I don't usually go to YouTube for programming content because it's all just copy-paste. This is one of the most informative and useful vids I've come across in a long time. I love the graphics/visuals. I don't know how you managed to make multithreading and multiprocessing so engaging, but bravo! 👏 Keep up the great work and thank you for the content!
Wow! Really well explained
Congratulations, this is one of the most didactic videos I've seen about Python. Good work, and I will certainly watch more of your videos!
Awesome attention to details 😀
A few things to note:
The GIL (and therefore sequential thread execution within a process) is only an issue in CPython, not in (most) other Python interpreters.
Jython, for example, has true parallel threads. Also, most other languages have them. This is mostly a Python problem
Thanks for sharing this nice presentation!
One major improvement I've found is taking your CPU intensive Python code and writing it in this language called "C". Joking aside, great video!
Very high quality!
Thank you so much for this!
Is there a preferred way to have the OS do the multiprocessing for you? Meaning, not using one controlling Python process to kick off all processes and wait for them, but rather starting them loosely (e.g. using os.fork(), os.setsid, function() and then sys.exit)? I want to avoid the controlling process getting stuck waiting for the threads/processes.
Great job and thank you so much !
This is a masterpiece, honestly. Content-wise it's very informative, but the way you present everything is like watching a sci-fi movie.
Great description!
Amazing video! Thanks a lot!
Excellent content!
First-class work, congratz
thank you so much, it was a great video
Good explanation, next time I'll know exactly which one is better for my purpose.
That explains the one intern I had, who wouldn't want to believe that threads are simultaneous. He said he had some python experience, but we use java.
You can run parallel threads using PDP (Parallel Distributed Processing) if you have a process that can run non-serially... obviously there is networking overhead. Great video -- lots of ground to cover.
Awesome video. Thanks
This has been the best explanation of the differences between the two I have seen.
My only gripe is I really wanted to see this same data with a column added for true single-threaded work, with no threading or multiprocessing enabled. How much lower than the 2 million is it?
That would have been helpful to see. Otherwise this was excellent and helped clarify what I need to use when. Thanks so much.
You should expect it to be more than 2 million, not less. Threading has some overhead, and since none of the operations in this example are I/O-bound, you never get that overhead back.
HOLY CRAP THIS IS A GREAT USE CASE!
Fantastic video.
Really nice explanation.
PEP 703 go brrr! I'm excited to try it on python 3.13
excellent info .. Thank You .. Cheers :)
Knew all of that already (wish it was more tl;dw - like 2mins) but think it's super extensive + informative for a beginner.
Really informative video!! I struggled a bit with the accent and speed but it's really good!
Great video!
I love your channel :) you are a 3 blue 1 brown in the making, if not better
i appreciate this work
Thanks, such a good video
This video is completely underrated.
Just a few weeks ago I went through this discovery myself when writing a wordle solver in python. This video would have been very helpful at that time. Everything explained here is spot on.
This is helpful ❤
Amazing video
My rule of thumb from trial and error is that you should always leave about 1 core free for each set of 8 (using Python and Linux). So 2 cores free for 16 cores would be 14 max. Otherwise the system just bogs down and you get less performance and a greater chance of hanging.
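That rule of thumb can be encoded when sizing a pool. A small sketch of the commenter's 1-in-8 heuristic (their personal rule, not an official guideline), with a hypothetical worker_count() helper:

```python
import os
from multiprocessing import Pool

def worker_count() -> int:
    # Leave roughly 1 core free per set of 8, per the heuristic above
    cores = os.cpu_count() or 1
    reserved = max(1, cores // 8)
    return max(1, cores - reserved)

def square(x: int) -> int:
    return x * x

if __name__ == "__main__":
    # e.g. 16 cores -> 14 workers, 8 cores -> 7 workers
    with Pool(processes=worker_count()) as pool:
        print(pool.map(square, range(10)))
```

Keeping a core or two free leaves headroom for the OS and the parent Python process itself, which is one plausible reason the fully-loaded machine "bogs down" in the commenter's experience.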
Perfect visualisation and well presented content. Thank you for your efforts!
Glad you enjoyed it!
Very impressive!