When I attempt to import the DAGs into Airflow per Section 6 of the assignment, I get a broken DAG error in Airflow.
At first I was getting the error for all three DAGs, but I found and corrected the vendor_name variable definition by enclosing the Jinja template in quotes.
When I re-run generate_dags.py and re-sync the folders with AWS, two of the DAGs import, but the "to_my_place_ai" DAG does not. All the DAGs in the DAGs bucket show the vendor_name variable correctly defined in the two places the variable is referenced.
The JSON config file also shows the correct variable name:
So what gives? Why is this one DAG not importing? Have I missed something?
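For reference, the re-render step I repeated does roughly the following. This is a minimal sketch with assumed file names and a simplified one-line template, not the lab's exact code; the str.replace call stands in for the Jinja rendering the real generate_dags.py performs:

```python
import json
from pathlib import Path

# Simplified stand-in for the template: the {{ vendor_name }} placeholder
# must sit inside quotes so the generated DAG file gets a string literal.
TEMPLATE = 'vendor_name = "{{ vendor_name }}"\n'

def generate_dags(config_dir: Path, out_dir: Path) -> list[str]:
    """Render one DAG file per vendor config JSON (sketch, assumed layout)."""
    written = []
    for cfg_path in sorted(config_dir.glob("*.json")):
        cfg = json.loads(cfg_path.read_text())
        # str.replace stands in for the Jinja rendering the real script does
        dag_source = TEMPLATE.replace("{{ vendor_name }}", cfg["vendor_name"])
        out_file = out_dir / f'{cfg["vendor_name"]}_dag.py'
        out_file.write_text(dag_source)
        written.append(out_file.name)
    return written
```

After regenerating, the output folder is synced to the DAGs bucket with aws s3 sync, which only uploads files whose content actually changed.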
Hello @DRybski,
Your config JSON and the dependencies look good. As you noticed from the other DAGs, the issue was the missing quotes on line 36. There shouldn't be an issue after you fixed the quotes in template.py
and repeated steps 5.3.1 to 5.3.3. Afterwards you should get the updated output from the aws s3 sync
command for ALL three DAGs:

I could reproduce your issue from your description, but I get a slightly different import error: vendor_name=to_my_place_ai
instead of vendor_name="to_my_place_ai":
To get your exact error, I had to add quotation marks around the vendor_name value to_my_place_ai:
Sometimes Airflow might fail to pick up the differences, but I suggest making all the changes in the template.py
file so that ALL three DAG files are re-uploaded at the same time. Hope it helps.
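To see why the quotes matter: without them, the rendered DAG file contains a bare name, which fails the moment Airflow imports the module. A minimal sketch, where the two rendered lines are simplified stand-ins for the generated DAG source:

```python
# Simplified stand-ins for the rendered line in the generated DAG file
rendered_unquoted = 'vendor_name = to_my_place_ai'    # template missing quotes
rendered_quoted = 'vendor_name = "to_my_place_ai"'    # template with quotes

def imports_cleanly(source: str) -> bool:
    """Mimic Airflow importing the DAG file by exec'ing the module source."""
    try:
        exec(compile(source, "<dag_file>", "exec"), {})
        return True
    except NameError:
        # Bare to_my_place_ai is an undefined Python name -> broken DAG
        return False

print(imports_cleanly(rendered_unquoted))  # False -> broken DAG on import
print(imports_cleanly(rendered_quoted))    # True  -> imports fine
```

This is why the error message shows vendor_name=to_my_place_ai: the rendered value landed outside any quotes.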
Thanks Georgios!
I re-saved the template.py file (without making any changes) just before my lab session timed out. However, when I rebooted and re-synchronized, all the DAGs were imported right away… so maybe it was just a timing issue. Thanks again for confirming that my code was good!