Inference with trained model failing

I'd appreciate any help with this; any fixes you can suggest are much appreciated. Thank you in advance.

After completing training in tutorial #5 (Training Process), I am getting the following status 500 error:
APIError Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/lamini/api/rest_requests.py in make_web_request(key, url, http_method, json)
    204     if description == {"detail": ""}:
    205         raise APIError("500 Internal Server Error")
--> 206     raise APIError(f"API error {description}")
    207
    208     return resp.json()

APIError: API error {'detail': 'error_id: 291160229988797889814082428996867401138: Internal server error'}

The code I have been using on Google Colab works perfectly with meta-llama/Meta-Llama-3-8B-Instruct, but it fails when I replace the model name with the 64-character model ID from Lamini (the failing variant is shown after the working snippet below).

Here is the Python code I used for inference on Colab:
import lamini

# API key redacted in this post; substitute your own.
lamini.api_key = "<YOUR_API_KEY>"

llm = lamini.Lamini(model_name="meta-llama/Meta-Llama-3-8B-Instruct")
query = "Why is the sky blue?"
response = llm.generate(query, output_type={"Response": "str"})
print(response["Response"])
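
And for completeness, here is the variant that fails. The model_name value below is a placeholder standing in for the actual 64-character model ID my training job returned, which I'm not posting here:

import lamini

# API key redacted in this post; substitute your own.
lamini.api_key = "<YOUR_API_KEY>"

# Identical call, but pointed at the fine-tuned model. The model_name below is
# a placeholder for the 64-character model ID Lamini returned after training.
llm = lamini.Lamini(model_name="<64-character model ID from Lamini>")
query = "Why is the sky blue?"
# This generate() call is where the APIError (500 Internal Server Error)
# shown in the traceback above is raised.
response = llm.generate(query, output_type={"Response": "str"})
print(response["Response"])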