Hello @Suraj_Kamath
When creating the PythonOperator in line 100 of book_of_the_day.py, you should pass the get_random_book method itself as the python_callable input for the PythonOperator. Instead, you have called the get_random_book method and passed its return value for this input, and when the function is called outside the PythonOperator like that, there is no ds key in the context.
In summary, python_callable=get_random_book is the right answer for line 104, not python_callable=get_random_book().
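To make the difference concrete, here is a minimal sketch of what the relevant part of book_of_the_day.py might look like. The DAG id, schedule, and surrounding setup are assumptions for illustration, not the lab's exact code:

```python
from airflow import DAG
from airflow.operators.python import PythonOperator
import pendulum


def get_random_book(**context):
    # Airflow only injects context keys such as "ds" when it calls
    # this function at task run time.
    ds = context["ds"]
    print(f"Picking a book for {ds}")


with DAG(
    dag_id="book_of_the_day",           # assumed name for illustration
    start_date=pendulum.datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
):
    # Wrong: get_random_book() executes immediately while the file is
    # parsed, the empty context has no "ds" key, a KeyError is raised,
    # and the whole file fails to import.
    # pick_book = PythonOperator(
    #     task_id="pick_book",
    #     python_callable=get_random_book(),
    # )

    # Right: pass the function object; Airflow calls it when the task runs.
    pick_book = PythonOperator(
        task_id="pick_book",
        python_callable=get_random_book,
    )
```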
I get it. Thank you!
Why, though, does this situation throw a DAG import error rather than importing the DAG and showing the run as failed (as the lab seemed to indicate it would)?
You are welcome, @Suraj_Kamath.
When you upload the Python file to the DAG bucket, Airflow reads the file and works out what it will need to do when the DAG is triggered. We have used different operators such as EmptyOperator and PythonOperator, and these operator objects are created while the file is being imported. However, the operators are only executed when Airflow actually runs the DAG.

The error we had at the beginning raised an exception while creating the PythonOperator, so Airflow was not able to create the operators and the DAG failed to import. On the other hand, the error the lab mentioned happens inside the operator itself, so Airflow is not aware of it until the operator is triggered, and the DAG imports fine but the run is marked as failed.
Hope this helps.
Thank you Amir. Very clear and helpful explanations.