Running Locally Error

Hello, I’m trying to run the code locally using Ollama, but I get an AttributeError after running:

llm_router = llm.with_structured_output(Router)


AttributeError Traceback (most recent call last)
Cell In[37], line 1
----> 1 llm_router = llm.with_structured_output(Router)

AttributeError: 'str' object has no attribute 'with_structured_output'


This is how I load my LLM:

from ollama import chat

# Model configurations
llm = 'gemma3'

def generate_response(prompt):
    """Generate a response using the specified LLM model."""
    response = chat(model=llm, messages=[{'role': 'user', 'content': prompt}])
    return response['message']['content']

# Example usage
if __name__ == "__main__":
    response = generate_response("Hello, how are you?")
    print(response)
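
For context, with_structured_output is a method on LangChain chat model objects, while llm here is just the string 'gemma3', which is why Python raises the AttributeError. A minimal sketch of loading the model as a chat object instead, assuming the langchain-ollama package is installed and using a hypothetical stand-in for the course's Router schema:

from pydantic import BaseModel, Field
from langchain_ollama import ChatOllama

# Hypothetical stand-in for the Router schema defined in the course code
class Router(BaseModel):
    reasoning: str = Field(description="Step-by-step reasoning behind the classification")
    classification: str = Field(description="Category assigned to the incoming message")

# Load the model as a LangChain chat object (not a plain string),
# so that it exposes with_structured_output
llm = ChatOllama(model="gemma3")
llm_router = llm.with_structured_output(Router)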

You haven’t downloaded all the required files to run the code locally.

Also, make sure you go through the requirements.txt file, which lists the module versions you need for the code to run successfully in your local environment.

Hi again, I’m sorry for getting back to you so late; I did not receive a notification of your response.

I have installed the requirements, but later on it doesn’t recognize any memory_store in langmem, even though I’ve installed both langmem and langgraph.

Do the installed versions match the course-provided versions for both modules?
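
A quick way to check, using only the standard library, is to print the installed versions and compare them with the ones pinned in requirements.txt:

from importlib.metadata import version

# Print installed versions to compare against requirements.txt
for pkg in ("langmem", "langgraph"):
    print(pkg, version(pkg))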