C4_W1_Lab_1_tfserving_hello_world

Hello everyone,
I’m having a problem making an inference request with either localhost or 127.0.0.1, as seen in the attached image. Any help with overcoming this issue would be appreciated.


Any help regarding the above problem, please?

Please help with this issue. @balaji.ambresh, can you please help?

Hi Nitish. I don’t have access to the latest course material. Please reach out to a course mentor for help.

@balaji.ambresh Can you please tell me who the course mentor for this is? How do I reach out to them?

Here’s a mentor for data and deployment:

You can find other mentors based on their profile: Mentor - Tensorflow D&D

@gopiramena, please help with this issue.


Hello Nitish,
From the logs in the image, I see that an old version of the assignment is being used. You can get the latest version from the following link: tensorflow-2-public/C4_W1_Assignment.ipynb at main · https-deeplearning-ai/tensorflow-2-public · GitHub

If the issue is still occurring in the latest version, you can check the following things at your end:

  1. Make sure the tensorflow-model-server is launched and running, as described under the heading “Run the TensorFlow Model Server” in the above notebook.
  2. If you’re still getting the Connection refused error, try replacing localhost with 127.0.0.1 in your requests.post call, as in the sketch below this list. This should fix the connection refused error.
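
For reference, here is a minimal sketch of such a request. The model name helloworld, port 8501, and the sample instances are placeholders I’m assuming here, not taken from the lab; use whatever you passed to the server via --model_name and --rest_api_port:

import json
import requests

# 127.0.0.1 instead of localhost; helloworld is a placeholder model name
url = 'http://127.0.0.1:8501/v1/models/helloworld:predict'
data = json.dumps({"signature_name": "serving_default", "instances": [[9.0], [10.0]]})
headers = {"content-type": "application/json"}
json_response = requests.post(url, data=data, headers=headers)
print(json_response.json())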

Hello @gopiramena, I am using the latest version of the notebook and I still face the issue. I also tried replacing localhost with 127.0.0.1 and with 192.168.0.1, but the issue persists.

Just so you know, I also tried running this notebook from the TensorFlow website on Colab: Train and serve a TensorFlow model with TensorFlow Serving | TFX

That is a similar TensorFlow Serving tutorial, and I am facing the same issue there. There too, I tried replacing localhost with 127.0.0.1, but the error did not go away.

I also posted this issue on the TensorFlow forum. Currently my post is under review (it is not visible yet; it is under moderation check, I guess).

Please help me with this.


If you’re sure your request is correct, then the only reason I can think of is a proxy or some network setting on your PC.

You can try it once on Google Colab directly, by starting a TF Model Server from Colab itself (reference: Google Colab). Make sure the port number is correct.
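
For reference, a typical way to start the server in the background from a Colab cell looks roughly like this (this is the %%bash --bg pattern used in the TFX serving tutorial; MODEL_DIR, the model name fashion_model, and port 8501 are assumptions from that tutorial, so adjust them to your setup):

%%bash --bg
nohup tensorflow_model_server \
  --rest_api_port=8501 \
  --model_name=fashion_model \
  --model_base_path="${MODEL_DIR}" >server.log 2>&1

Then, in a separate cell, check that the server actually came up:

!tail server.log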


@gopiramena I have been trying this on Google Colab all along. I ran the notebook you shared in Colab as well, and I face the same issue when running it.

This is what comes up when !tail server.log is run:

After this, when I run the following:

import json
import requests

headers = {"content-type": "application/json"}
json_response = requests.post('http://localhost:8501/v1/models/fashion_model:predict', data=data, headers=headers)
predictions = json.loads(json_response.text)['predictions']

The same error comes up:


The notebook is one I found on GitHub as an example of TF Serving. One thing I notice is that you started the server on port 8601 but are using port 8501 in the requests call. But I presume that’s not the main issue.
Your server log is showing errors, which means the server hasn’t started.
The Colab glibc version is not the 2.28 needed by the tf model server (the latest server version, I suppose); you can check it with !ldd --version. I would suggest downgrading the tensorflow-model-server installation. You may need to play around with different versions to check which one starts up fine.
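
For reference, you can check it like this (the error text below is the usual glibc linker message, shown only for illustration):

# Check the glibc version available in the Colab runtime
!ldd --version | head -1

If the runtime’s glibc is older than what the server binary was built against, server.log will typically contain a line like: version `GLIBC_2.28' not found (required by tensorflow_model_server).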

You can start the server in the foreground, without nohup, to check that:

!tensorflow_model_server --rest_api_port=8501 --model_name=your_model --model_base_path="your_model_dir"

Version compatibility in Python can be a pain sometimes, so just go by the errors thrown when starting the server, and either upgrade the dependent packages or downgrade the tf model server version. I’m not an expert at running this in Colab, but a way to downgrade in Colab would be as follows:

# Fetch and install the specific .deb for the version you want
# (replace <down_version> with the actual version string)
!wget 'http://storage.googleapis.com/tensorflow-serving-apt/pool/tensorflow-model-server-<down_version>/t/tensorflow-model-server/tensorflow-model-server_<down_version>_all.deb'
!dpkg -i tensorflow-model-server_<down_version>_all.deb
!pip3 install tensorflow-serving-api==<down_version>
# Note: don't run apt-get install tensorflow-model-server afterwards;
# that would pull the latest version again and undo the downgrade.

@Nitish_Pandey I tried and found that version 1.15.0 works fine. To use it, don’t install tensorflow-model-server via apt-get install; use the following instead, and then start your server.

!wget 'http://storage.googleapis.com/tensorflow-serving-apt/pool/tensorflow-model-server-1.15.0/t/tensorflow-model-server/tensorflow-model-server_1.15.0_all.deb'

!dpkg -i tensorflow-model-server_1.15.0_all.deb
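
As a quick sanity check after installing, you can confirm the downgraded binary is the one being picked up (tensorflow_model_server prints its version string with --version):

# Should now report 1.15.0 rather than the latest release
!tensorflow_model_server --version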

Thank you very much for your efforts @gopiramena

I also posted this issue on the official TensorFlow forum, and someone from the TensorFlow team suggested a workaround. They suggested the same thing you did.

Here’s a link to the topic: Official tensorflow serving tutorial is not working in colab - General Discussion - TensorFlow Forum

If anyone else faces this issue, please refer to the above-mentioned link.

Thanks
