Airflow tutorial - DAGs, Operators, Tasks, Providers & airflow.cfg
- Published Oct 1, 2023
- Airflow Fundamentals
Topics covered:
- Operator types and categories: Action Operators, Transfer Operators, Sensors
- Task vs Operator: the difference between a Task and an Operator
- Provider packages: what they are, the providers list, and how to install them (e.g. pip install apache-airflow-providers-github)
- airflow.cfg: example, defaults, file location/path, and use with Docker
- Fixing ERROR - No module named 'email_backend = airflow'
- DAGs: what a DAG is and what it is used for, DAG examples, DAG parameters (schedule, start_date, catchup, depends_on_past, timeout), DAG dependencies, dag_run.conf, DagBag, TaskGroups
- Airflow interview questions and answers
- Apache Airflow basics: tutorial, documentation, GitHub, Docker
DAG Parameters:
airflow.apache.org/docs/apach...
Provider packages reference:
airflow.apache.org/docs/apach...
Full Code and Resources:
sleek-data.blogspot.com/2023/...
I've made it to this third video of the series and quite enjoying it. It's the perfect balance between a quick start and just enough to get you going. Thanks for creating this. You got a new sub.
Dear Friend - Thank you so much for your kind words!
To stay updated with our future content, I kindly request you to consider subscribing and liking our videos.
If you could share our video/channel with your friends, we would be so grateful!
Once again thank you for your encouragement.
Very helpful videos! Thank you!!
Dear Friend - Thank you so much for your kind words!
Is there a way to install packages at runtime? Or is adding them to requirements and rebuilding the image each time required?
Bro - You can install packages from the Docker terminal (you may need to switch to root), but they will be gone once you restart Docker.
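To make packages survive restarts, the usual approach is to extend the official image and rebuild once. A minimal sketch of such a Dockerfile, assuming Airflow 2.7 and a requirements.txt next to it (the version tag and file name are assumptions, not from the video):

```dockerfile
# Hypothetical Dockerfile extending the official image so installed
# packages persist across container restarts.
FROM apache/airflow:2.7.3
COPY requirements.txt /requirements.txt
RUN pip install --no-cache-dir -r /requirements.txt
```

Then point your docker-compose service at it with `build: .` instead of `image: apache/airflow:...`, and rebuild with `docker compose build` whenever requirements.txt changes.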
I have an issue: when the exchange_rate_pipeline is included as a 2nd DAG and Airflow is refreshed, it shows up in the UI, but when I trigger the DAG it doesn't start the process; the first DAG "welcome_dag" also doesn't start. What am I missing?
Btw, the only changes I made to airflow.cfg were "smtp_host = sleek-smtp" and "smtp_starttls = False", as you mentioned.
Dude - please follow the video side by side by opening it on another device, maybe your mobile... good luck.
I notice that there are wavy lines under "from airflow.utils.dates import days_ago" and other "from airflow... import..." lines. How do I remove such warnings? Thanks
Dude - run pip install apache-airflow in the environment your editor uses.
If it's still not working, please close and reopen the editor.
Good luck!
The best Airflow course
Dear Friend -
Thank you for your kind words! We're delighted to hear that you're enjoying our content. To stay updated with our future videos, consider subscribing and sharing with your friends. Your support means a lot to us. Thanks again for your encouragement!
I am using GCP Composer, so I don't have a Dockerfile. Where and how do I install my MongoDB provider?
Dude -
1. In the Google Cloud console, go to the Environments page.
2. In the list of environments, click the name of your environment. The Environment details page opens.
3. Go to the PyPI packages tab.
4. Click Edit.
5. Click Add package.
6. In the PyPI packages section, specify package names, with optional version specifiers and extras.
7. Click Save.
For more details - pls refer:
cloud.google.com/composer/docs/how-to/using/installing-python-dependencies#console_1
Thank you for your encouragement.
I am getting an error on the Docker side: it says the clean_data module in the exchange_rate_pipeline is not defined, even after creating the clean_data file in the plugins folder.
I have solved it. I placed the clean_data file in the same folder as the exchange_rate_pipeline. Thanks
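The video's actual clean_data code isn't shown in this thread, but a minimal sketch of what such a callable might look like is below. The payload shape ("base", "rates") follows a typical exchange-rate API response and is an assumption, not the video's actual schema:

```python
# Hypothetical clean_data helper for the exchange-rate pipeline.
# Field names ("base", "rates") are assumptions for illustration only.

def clean_data(payload: dict) -> dict:
    """Keep only positive numeric exchange rates, rounded to 4 places."""
    rates = payload.get("rates", {})
    cleaned = {
        currency: round(rate, 4)
        for currency, rate in rates.items()
        if isinstance(rate, (int, float)) and rate > 0
    }
    return {"base": payload.get("base"), "rates": cleaned}

if __name__ == "__main__":
    sample = {"base": "USD", "rates": {"EUR": 0.91234567, "BAD": -1}}
    print(clean_data(sample))
```

Placing this file next to the DAG file (as the commenter did) or in a folder on Airflow's Python path both work; the key is that the module must be importable from the DAG file.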
Dear Friend - Thank you for your kind feedback! If you ever need technical support, please feel free to leave a comment on any of our videos, and we'll be sure to respond promptly or create a video addressing the issue.
How did you create the plugins folder? I don't have it by default.
Dear Friend - please refer to the line below in the Docker Compose file; it mounts the entire airflow folder on your local machine as /opt/airflow in the Airflow Docker instance:
./airflow:/opt/airflow
With this, you can manually create the plugins folder in your local airflow folder, place whatever .py files are needed (don't worry about the underscore-prefixed files created automatically), and restart Docker; everything should work as expected.
Good luck, if helpful please don't forget to like and subscribe.
I have the same issue, can you please help me?
On the git_repo_dag, the list_repo_tags task is failing in the graph. Please give some tips to resolve it.
Dude - please post the error message as a separate comment (I don't get notifications for replies to existing comments).
If, like me, you're stuck with the exchange-rate tasks showing "success" while nothing actually happens: remove the "end_date" param in dag = DAG(...), because it is preventing the DAG from running at all.
dag = DAG(
    'exchange_rate_etl',
    start_date=datetime(2023, 10, 1),
    # end_date=datetime(2023, 12, 31),
    schedule_interval='0 22 * * *',
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
    catchup=False,
)
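On the catchup=False line above: with catchup enabled, Airflow would backfill one run per missed schedule interval between start_date and now; with it disabled, only the most recent interval runs. A stdlib-only sketch of that idea (this mimics the concept for daily 22:00 schedules only; it is not Airflow's actual scheduler code):

```python
from datetime import datetime, timedelta

def backfill_run_dates(start, now, catchup):
    """Illustrate catchup: list the daily 22:00 runs that would be created.

    A toy model of '0 22 * * *' scheduling, not Airflow internals.
    """
    runs = []
    tick = start.replace(hour=22, minute=0, second=0, microsecond=0)
    if tick < start:
        tick += timedelta(days=1)
    while tick <= now:
        runs.append(tick)
        tick += timedelta(days=1)
    if not catchup and runs:
        runs = runs[-1:]  # only the most recent interval runs
    return runs

start = datetime(2023, 10, 1)
now = datetime(2023, 10, 5, 23, 0)
print(len(backfill_run_dates(start, now, catchup=True)))   # backfills every day
print(len(backfill_run_dates(start, now, catchup=False)))  # only the latest run
```

This also shows why a past end_date stops the DAG entirely: once `now` is beyond end_date, no interval is eligible to run at all.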
I also have a problem with sending email, but that's not a super important part of the video, so that's fine.
Dear Friend - Thank you so much for your feedback!
Thanks for helping out mate
While clicking on the localhost link (port 8080), I am encountering the error "This page isn't working - localhost didn't send any data. ERR_EMPTY_RESPONSE". Please help!!
Dear Friend
Please double-check the Docker Compose file. You should expose port 8080 from the container to the host; you can refer to the previous video in the playlist.
Try all possibilities:
127.0.0.1:8080
localhost:8080
Also, please open the command prompt, type ipconfig, and hit Enter. You will get your IPv4 address. For example, if it is 192.168.56.1, then also try:
192.168.56.1:8080
Good luck.
Hi Team. Your full code link forwards me to the same YouTube video link every time.
Dude - you have to scroll down, good luck.
sleek-data.blogspot.com/2023/10/airflow-dags-operators-tasks-providers.html
I don't fully understand your tutorial because of your code document (the clean_data def). Also, when implementing git_repo_dag.py, I get an error: "airflow.exceptions.AirflowException: GitHub operator error: The conn_id `github_default` isn't defined". A short video may be good, but without enough knowledge and explanation it falls short.
Dear friend - Thanks for the feedback.
I will add the clean_data code to the code link shortly.
As for your other issue, please try creating a default GitHub connection; refer to the video below to create and understand connections:
czcams.com/video/-fjAchpM4mc/video.html
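Besides the UI, Airflow can also pick up a connection from an environment variable named AIRFLOW_CONN_<CONN_ID>. A sketch for github_default, assuming the GitHub provider takes a personal access token in the password slot (the exact URI format is an assumption - check the provider's connection docs):

```python
import os

# Airflow resolves connections from AIRFLOW_CONN_<CONN_ID> env vars.
# "YOUR_TOKEN" is a placeholder, not a real credential.
os.environ["AIRFLOW_CONN_GITHUB_DEFAULT"] = "github://:YOUR_TOKEN@"

print(os.environ["AIRFLOW_CONN_GITHUB_DEFAULT"])
```

In Docker setups this is typically set in the compose file's environment section rather than in Python, so it is available to the scheduler and workers.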
@SleekData Thanks for your response! Hope you will give us more awesome videos.
Localhost error
Dear Friend
Please double-check the Docker Compose file: port 8080 must be exposed from the container to the host (see the previous video in the playlist). Then try 127.0.0.1:8080 and localhost:8080, as well as your machine's IPv4 address with :8080 (run ipconfig in the command prompt to find it, e.g. 192.168.56.1:8080).
Good luck, please like and subscribe.