L_5 notebooks not working

Groq calls to llama-3.1-70b-versatile fail because the model is deprecated. The entire "Kicking off the Crew" cell crashes.

I am not able to replace it with version 3.3 or other models, as that also throws errors. The notebook fills with lengthy tracebacks.

2025-01-25 12:25:15,792 - 139754766441344 - llm.py-llm:161 - ERROR: LiteLLM call failed: litellm.BadRequestError: GroqException - Error code: 400 - {'error': {'message': 'The model llama-3.1-70b-versatile has been decommissioned and is no longer supported. Please refer to GroqCloud for a recommendation on which model to use instead.', 'type': 'invalid_request_error', 'code': 'model_decommissioned'}}

Unfortunately, this course has turned out to be a disappointment, as the L_4 and L_5 notebooks just can't run.


Is this error encountered in the course environment, or when you run the code locally?

In the course environment

@Mubsi

Please look into this issue.

Thank you for reporting! We’re updating the llama versions!

The Groq llama-3.1-70b-versatile model is deprecated in favor of llama-3.3-70b-versatile.
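
For anyone who wants to patch the notebook themselves before the official update lands, here is a minimal sketch of the kind of change involved. It assumes the notebook builds its agents with CrewAI's LLM wrapper (which routes calls through LiteLLM) and that GROQ_API_KEY is already set in the environment; the agent fields below are placeholders, not the course's actual agents.

```python
from crewai import Agent, LLM

# Point at the replacement model instead of the decommissioned one.
# The "groq/" prefix tells LiteLLM to route the request to Groq.
llm = LLM(model="groq/llama-3.3-70b-versatile")

# Hypothetical example agent, only to show where the llm parameter goes.
researcher = Agent(
    role="Researcher",
    goal="Summarize a topic",
    backstory="An example agent used to illustrate swapping the model.",
    llm=llm,
)
```

Any other agent in the notebook that still references llama-3.1-70b-versatile would need the same model-string change.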

Hi! llama-3.1-70b-versatile has been upgraded to llama-3.3-70b-versatile. These changes are now live in the notebook!
