C3_W4_Lab2 - tfx installation taking very long time

INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. If you want to abort this run, you can press Ctrl + C to do so. To improve how pip performs, tell us what happened here: https://pip.pypa.io/surveys/backtracking

I get this message repeatedly while pip downloads prompt-toolkit and pexpect as part of installing tfx.

Because of this, I started getting a Colab notification about the session running too long without using the GPU.

Please resolve the issue.
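As a workaround while this is being fixed, one way to cut down pip's backtracking is to give the resolver stricter constraints up front, e.g. by pinning the package versions instead of letting pip search through every candidate. The exact version numbers below are illustrative assumptions, not the lab's official pins; check the lab notebook for the versions it expects.

```shell
# Pinning tfx (and its heaviest dependency, tensorflow) gives the
# resolver stricter constraints, so it does not backtrack through
# many candidate versions. Version numbers here are illustrative.
pip install "tfx==1.2.0" "tensorflow==2.5.1"
```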


I also have the same issue, in C3_W1_Lab_2.

I have already restarted the runtime several times, but it just hangs at this point:

Do I need to wait for the installation to be completed?

Hi! Thank you for reporting. This might be due to the TF version used by Colab, which was very recently updated to TF 2.6. I will check this and let you know of the fix ASAP. Thanks again!


I’m having the same issue. Yesterday I waited for more than 3 hours and couldn’t get it installed.


Hi everyone! The lab has now been updated to use the latest version of TFX and to disable the GPU accelerator. TFX currently uses TensorFlow 2.5, and the Colab documentation discourages rolling back TensorFlow versions when using the GPU. I just tested it now and it should work. Please reopen this lab from the classroom to see the revision. Thank you, and I hope it also works on your end!


Thank you Chris.

It’s working fine now.

Hi Chris, when I import tfdv, I get the “AttributeError: module ‘apache_beam’ has no attribute ‘metrics’” error. Is it also a version issue? How can I solve it? Thank you!
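One quick diagnostic for errors like this is to print the installed versions of the packages involved, since a mismatch between the apache-beam version tfdv expects and the one actually installed is a common cause of import-time AttributeErrors. This is only a sketch; the package list is an assumption, not a confirmed fix.

```python
import importlib.metadata

def installed_versions(packages):
    """Return a dict mapping each package name to its installed
    version string, or None if the package is not installed."""
    versions = {}
    for pkg in packages:
        try:
            versions[pkg] = importlib.metadata.version(pkg)
        except importlib.metadata.PackageNotFoundError:
            versions[pkg] = None
    return versions

# Packages whose version mismatch commonly breaks the tfdv import:
print(installed_versions(
    ("tensorflow-data-validation", "apache-beam", "tensorflow")))
```

Comparing the printed versions against tfdv's release notes (or reinstalling with compatible pins) usually narrows down whether it is a version issue.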

Hi Ran! Welcome to Discourse! Your query seems to be related to a different topic than what is originally in this thread. To keep the forum organized, kindly repost this as a new topic under MLEP Course 4 with steps on how you arrived at the issue. This lab (C3 W4 Lab 2) doesn’t use TFDV by default so please show where you placed the command so I or other mentors will know how to replicate it. Thank you and looking forward to the new topic!