Groq calls to llama-3.1-70b-versatile fail because the model is deprecated. The entire cell “Kicking off the Crew” crashes.
Replacing it with version 3.3 or other models also throws errors, so the notebook fills with lengthy tracebacks.
2025-01-25 12:25:15,792 - 139754766441344 - llm.py-llm:161 - ERROR: LiteLLM call failed: litellm.BadRequestError: GroqException - Error code: 400 - {'error': {'message': 'The model llama-3.1-70b-versatile has been decommissioned and is no longer supported. Please refer to GroqCloud for a recommendation on which model to use instead.', 'type': 'invalid_request_error', 'code': 'model_decommissioned'}}
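One detail worth checking when swapping models: LiteLLM (which CrewAI uses under the hood, per the traceback) routes requests by a provider prefix in the model string, so Groq models must be written as "groq/<model-name>". A missing prefix is a common reason a replacement model still errors out. This is a minimal sketch of that idea; "llama-3.3-70b-versatile" is an assumption here, and the currently supported model names should be confirmed on GroqCloud.

```python
# Sketch: build a LiteLLM-style model identifier for Groq.
# LiteLLM expects the "groq/" provider prefix before the model name;
# "llama-3.3-70b-versatile" is assumed to be a supported replacement --
# verify against GroqCloud's current model list.

def groq_model_string(model_name: str) -> str:
    """Return a LiteLLM model identifier with the Groq provider prefix."""
    if model_name.startswith("groq/"):
        return model_name
    return "groq/" + model_name

# Replacement candidate for the decommissioned llama-3.1-70b-versatile:
new_model = groq_model_string("llama-3.3-70b-versatile")
```

The resulting string (e.g. "groq/llama-3.3-70b-versatile") would then be passed wherever the notebook originally configured llama-3.1-70b-versatile for its agents.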
Unfortunately, this course has turned out to be a disappointment, as the L_4 and L_5 notebooks just can't run.