Error when running Jupyter Code

I get the following error when running lesson 3 and thus can’t continue:
{'error': {'message': 'Unable to access non-serverless model togethercomputer/llama-2-7b-chat. Please visit hxxps://api.together.ai/models/togethercomputer/llama-2-7b-chat to create and start a new dedicated endpoint for the model.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_available'}}


Facing the same error here.

Hi, after spending considerable hours trying to make this work, I found a small workaround.
Later in the class, we are shown how to change the Llama model when requesting a response.

Please try to change the code from
response = llama(prompt)

to
response = llama(prompt, model="META-LLAMA/LLAMA-3-8B-CHAT-HF")

This swaps the llama-2-7b-chat model, which is causing the error, for llama-3-8b-chat.
It seems the Llama 2 7B Chat model referenced in the utils.py file has been removed from the together.ai website and is hence no longer accessible.
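For context, here is a minimal sketch of what the llama() helper in utils.py presumably looks like (the real course file may differ): the `model` argument is just a string forwarded to the Together API, so the workaround above simply overrides the retired default.

```python
# Hypothetical sketch of the course's llama() helper -- not the actual
# utils.py source. The key point: `model` is a plain string passed through
# to the API, so any currently available model name can be substituted.
def llama(prompt,
          model="togethercomputer/llama-2-7b-chat",  # retired default that triggers the error
          max_tokens=1024):
    payload = {
        "model": model,
        "prompt": f"[INST]{prompt}[/INST]",
        "max_tokens": max_tokens,
    }
    # In the real helper, a POST to Together's inference endpoint follows,
    # authorized with the TOGETHER_API_KEY environment variable; omitted here.
    return payload
```

Calling `llama(prompt, model="META-LLAMA/LLAMA-3-8B-CHAT-HF")` then just changes the `"model"` field in that payload.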

I hope this is helpful.

edit: It does not give the best responses, nowhere near comparable to the instructor's screen, and the responses are noticeably buggier for some reason.

The model ends up having a conversation with itself in the response :joy:


Hey, it looks like the issue is because the model you’re trying to use, togethercomputer/llama-2-7b-chat, isn’t set up for serverless access.

You’ll need to create a dedicated endpoint for it.

  1. Go to the link in the error (change hxxps to https): https://api.together.ai/models/togethercomputer/llama-2-7b-chat
  2. Follow the steps there to create a new dedicated endpoint for the model.
  3. Once you’ve got that set up, update your code to use the new endpoint.
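If you are unsure which model names your key can actually reach, you can query Together's model listing first. This is a sketch assuming the GET /v1/models route returning a JSON array of objects with an `id` field; verify the exact route and response shape against Together's API docs.

```python
import json
import urllib.request

TOGETHER_MODELS_URL = "https://api.together.xyz/v1/models"  # assumed endpoint path

def model_ids(models_json: str) -> list:
    """Extract the model id strings from the models-endpoint JSON."""
    return [m["id"] for m in json.loads(models_json)]

def fetch_model_ids(api_key: str) -> list:
    """Fetch the list of model ids your API key can see."""
    req = urllib.request.Request(
        TOGETHER_MODELS_URL,
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        return model_ids(resp.read().decode())
```

Any id in that list should be usable directly as the `model` argument without a dedicated endpoint.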

Let me know how it goes or if you run into any other issues!

Here is the simplest hack/workaround I have found:

  1. Click File > Open and go to the utils.py file.
  2. In the llama function (and any other similar functions), change the model from "togethercomputer/llama-2-7b-chat" to "meta-llama/Llama-2-7b-chat-hf", or to any other model, since the names the models are called by from the code have probably been changed by now.

Make sure you save utils.py, then restart the kernel in the main Jupyter notebook file.
This workaround is working at least through lesson 3; I will update on how far it keeps working.
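The edit to utils.py boils down to replacing one string. A small sketch of that substitution (the surrounding function signature is assumed, not copied from the course file):

```python
# Before (roughly, in utils.py):
#     def llama(prompt, model="togethercomputer/llama-2-7b-chat", ...):
# After:
#     def llama(prompt, model="meta-llama/Llama-2-7b-chat-hf", ...):

OLD_MODEL = "togethercomputer/llama-2-7b-chat"
NEW_MODEL = "meta-llama/Llama-2-7b-chat-hf"

def patch_utils(source: str) -> str:
    """Replace the retired model name wherever utils.py references it."""
    return source.replace(OLD_MODEL, NEW_MODEL)
```

Running this over the file contents (or doing the same find-and-replace by hand in the editor) covers every helper that hard-codes the old name.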

The new model name I gave is just copied from the list of models on together.ai; the model itself is the same Llama 2 7B Chat.

Update: I am now on lesson 4, where you need to use the Llama 2 Chat 70B model explicitly in the model parameter of the llama function, and it is working fine. Just visit the Together.AI website, select this model, copy its model key/name (whatever it is called), and paste it in place of the old name.
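For the lesson 4 case, the call looks the same as the earlier override, just with the 70B name. Note the id below is an assumption patterned after the other renamed models; copy the exact key from together.ai as described above.

```python
# Assumed current Together id for Llama 2 Chat 70B -- verify on together.ai.
LLAMA2_70B = "meta-llama/Llama-2-70b-chat-hf"

def lesson4_kwargs(prompt, model=LLAMA2_70B):
    """Keyword arguments one would pass as llama(prompt, model=model) in lesson 4."""
    return {"prompt": prompt, "model": model}
```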

@Amit_Sangani I get the following error when running lesson 3 and thus can't continue:
{'error': {'message': 'Unable to access non-serverless model togethercomputer/llama-2-7b-chat. Please visit hxxps://api.together.ai/models/togethercomputer/llama-2-7b-chat to create and start a new dedicated endpoint for the model.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_available'}}
Would you help me resolve the issue so I can continue the short course?