Hello, I'm trying to run the code locally using Ollama, but I get an AttributeError after running:
llm_router = llm.with_structured_output(Router)
```
AttributeError                            Traceback (most recent call last)
Cell In[37], line 1
----> 1 llm_router = llm.with_structured_output(Router)

AttributeError: 'str' object has no attribute 'with_structured_output'
```
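For what it's worth, the error itself seems to be plain Python rather than anything Ollama-specific: at that point `llm` is just the string `'gemma3'`, and strings have no `with_structured_output` method (as far as I can tell, that method belongs to LangChain chat-model objects). A minimal reproduction:

```python
# 'llm' is just a string here, not a chat-model object
llm = "gemma3"

# Attribute lookup on a str fails, matching the traceback above
print(hasattr(llm, "with_structured_output"))  # False
```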
This is how I load my LLM:
```python
from ollama import chat

# Model configurations
llm = 'gemma3'

def generate_response(prompt):
    """Generate a response using the specified LLM model."""
    response = chat(model=llm, messages=[{'role': 'user', 'content': prompt}])
    return response['message']['content']

# Example usage
if __name__ == "__main__":
    response = generate_response("Hello, how are you?")
    print(response)
```