Python Celery Distributed Task Queue | End to End Application with Celery
- added 27 Jul 2024
- Modern software systems make use of distributed systems for tasks that need to be executed outside the core system functionality and are not real-time in nature. These systems communicate with each other using message queues, event streaming, and/or publish-subscribe architectures.
There are many software options for this, including but not limited to Apache Kafka, RabbitMQ, Kinesis, and Google Pub/Sub, and all of them can be used effectively and efficiently here.
However, in this video I'm introducing you to Celery: a simple, flexible, fast, and pure-Python system for creating scalable distributed task queues, where tasks execute on different distributed systems in an asynchronous and/or synchronous way.
In this video, I've talked about the basics of task queues and the scenarios in which we can create them. This is followed by using RabbitMQ as a message broker with a simple Python client and server.
After watching this video, you'll understand how to
- Create a Celery program
- Work with the workers of a Celery program
- Get the results back in the Python code
- Trigger asynchronous execution of tasks
I do hope that this video will help you in your learning journey. Thanks for watching!
Timecodes
00:00 Apache Kafka, RabbitMQ, Kinesis and Pub Sub
00:57 Why we use Distributed Systems
02:43 The Concept of Task Queues
03:24 Celery Distributed Task Queue
04:22 Celery Version and Python Compatibility
04:36 Using Celery with RabbitMQ
05:36 RabbitMQ Management Using a Docker Container
05:58 Creating a Basic Celery Task
07:21 Celery Client
07:41 Running Celery with Workers
08:21 Running a Celery Task from a Python Client
08:48 Calling Celery with the Delay Function
10:23 Celery Workers and Multiple Processes
14:01 Celery Concurrency
15:43 Reliability of Celery - Restarting Celery and Finishing Items in RabbitMQ
17:11 Returning Results from Celery to the Client
19:03 Waiting for Results in Celery
#celery #distributedsystems #python #pythonprogramming
#CognitiveProgrammer
The part where you explained how Celery uses all the cores of the machine by default to support multiprocessing is very well done. Thank you, sir.
Thank you so much for your kind comment 🙏
this was an incredibly well articulated video!
Thank you so much 🙏
Very easy to understand, thank you
Thanks for your comment 🙏
I have one silly doubt at 19:45. When you ran the script, why did the second print statement for write_log also take 5 seconds if sleep(5) was implemented for TaskQueue?
Another question: how can I call an async function from inside a Celery task function? Is it possible?
How are you running it on the VS Code terminal? It's very confusing.
What is that long command you are running on the VS Code terminal at 11:00?
Great explanation, sir. But what if I want one task to call another task, passing a JSON object as a message between them? Is it possible, and how? Thank you.
Sir, can this task queue use Kafka as well?
Not officially supported, but I believe there are some ways to implement it. I'm sorry, I'm not so sure about this.
Hi, how can I access the material you use in Pycon24? Thank you.
Thanks for your comment - Here is the GitHub link for the same github.com/CognitiveProgrammer/PyCon2024-Celery
@@Cognitive-Programmer thank you so much
CFBR
Thank you so much 🙏