I am trying to run the course exercises locally using my own JupyterLab instance and an LLM running on a local server. The course has the following endpoints configured.
Thank you. I tinkered with this further and am now rewriting the LLM helper functions (generate_with_single_input, etc.) so that they call Ollama instead of the Coursera or Together endpoints. I'll be happy to share them with others if needed; a rough sketch of the approach is below.
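In case it helps, here is a minimal sketch of what such a replacement could look like. The signature and return shape here are assumptions, not the course's actual code: I'm guessing the helper takes a prompt string plus optional role/model/temperature/max_tokens arguments and returns a dict with the assistant's reply, so you would adjust it to match whatever the notebooks expect.

```python
# Minimal sketch of an Ollama-backed replacement for the course helper.
# Assumptions (not from the course code): the helper accepts a prompt plus
# optional role/model/temperature/max_tokens arguments and returns a dict
# with 'role' and 'content' keys. Adapt to the notebook's real interface.
import ollama

# Default local Ollama server; change the host if yours runs elsewhere.
client = ollama.Client(host="http://localhost:11434")

def generate_with_single_input(
    prompt: str,
    role: str = "user",
    model: str = "llama3.1",      # any model you have pulled locally
    temperature: float = 0.7,
    max_tokens: int = 500,
) -> dict:
    response = client.chat(
        model=model,
        messages=[{"role": role, "content": prompt}],
        options={
            "temperature": temperature,
            "num_predict": max_tokens,  # Ollama's counterpart to max_tokens
        },
    )
    message = response["message"]
    return {"role": message["role"], "content": message["content"]}

# Example call:
# print(generate_with_single_input("Say hello in one sentence.")["content"])
```

If the course helpers wrap an OpenAI-style client instead, another option is Ollama's OpenAI-compatible endpoint at http://localhost:11434/v1, which may need even fewer changes to the original functions.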