I get the following error when running lesson 3 and thus can’t continue:
{'error': {'message': 'Unable to access non-serverless model togethercomputer/llama-2-7b-chat. Please visit hxxps://api.together.ai/models/togethercomputer/llama-2-7b-chat to create and start a new dedicated endpoint for the model.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_available'}}
Facing the same error here.
Hi, after spending considerable hours trying to make it work, I found a small workaround.
Later in the class, we are shown how to change the Llama model when requesting a response.
Please try to change the code from
response = llama(prompt)
to
response = llama(prompt, model="meta-llama/Llama-3-8b-chat-hf")
This replaces the llama-2-7b-chat model, which is causing the error, with llama-3-8b-chat.
It seems the llama-2-7b-chat model referenced in the utils file is no longer available as a serverless model on together.ai, and is hence not accessible.
I hope this is helpful.
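For anyone who wants to see what the change amounts to, here is a minimal sketch of what a `llama()` helper with an overridable model might look like. The helper name, defaults, and field names are assumptions based on Together's OpenAI-compatible chat-completions API, not the actual contents of the course's utils file:

```python
# Sketch of a llama() helper with an overridable model (assumed names,
# based on Together's OpenAI-compatible chat-completions API).
import json
import os
import urllib.request

TOGETHER_CHAT_URL = "https://api.together.xyz/v1/chat/completions"

def build_payload(prompt, model="meta-llama/Llama-3-8b-chat-hf"):
    # Package a single user prompt in the chat format Together accepts.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def llama(prompt, model="meta-llama/Llama-3-8b-chat-hf"):
    # Send the request with your Together API key and return the reply text.
    req = urllib.request.Request(
        TOGETHER_CHAT_URL,
        data=json.dumps(build_payload(prompt, model)).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {os.environ['TOGETHER_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

With a helper like this, `response = llama(prompt)` uses the Llama 3 serverless model by default, and you can still pass `model=` to try other serverless models.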
edit: The responses are not as good as the ones on the instructor's screen, and for some reason they are noticeably buggier: the model sometimes carries on a conversation with itself in the response.
Hey, it looks like the issue is that the model you're trying to use, togethercomputer/llama-2-7b-chat, isn't set up for serverless access.
You’ll need to create a dedicated endpoint for it.
- Go to the link in the error (change hxxps to https): https://api.together.ai/models/togethercomputer/llama-2-7b-chat
- Follow the steps there to create a new dedicated endpoint for the model.
- Once you’ve got that set up, update your code to use the new endpoint.
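Assuming the dedicated endpoint serves the same OpenAI-compatible chat API as the serverless models, the only code change is the model string. A rough sketch (`endpoint_payload` is a hypothetical helper, not part of the course code):

```python
# Rough sketch (assumed names): once the dedicated endpoint is running,
# the same chat request works with the endpoint's model id swapped in.
def endpoint_payload(prompt, model="togethercomputer/llama-2-7b-chat"):
    # Build the request body for Together's chat-completions API.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# In the lesson code, that would look like:
# response = llama(prompt, model="togethercomputer/llama-2-7b-chat")
```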
Let me know how it goes or if you run into any other issues!