How do I run llama3.2 locally with crewai? How do I import LLM from crewai?

I’m currently taking the multi-agent systems with crewAI course and could really use some help. Has anyone successfully run crewai locally with llama3.2?

I keep getting stuck at “from crewai import LLM”, where I get a ModuleNotFoundError.

I have since reinstalled setuptools and double-checked my virtual environment, but I can’t figure out why I am unable to import the LLM class.
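In case it helps, this is roughly the kind of check I’ve been running inside the virtual environment to confirm what is actually installed (the exact output will obviously depend on your setup):

```python
# Run inside the same virtual environment used for the course.
from importlib.metadata import version

import crewai

print(version("crewai"))        # installed crewai version
print(hasattr(crewai, "LLM"))   # False would explain the failing import
```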

I have moved your post to the forum area for that course.

Thanks Tom,

That sounds great.

Cheers,

Tayo

Please see this link:

Connect to any LLM - CrewAI

Hope this helps!
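Following that page, a minimal sketch of pointing crewai at a locally served llama3.2 (assuming you are running it through Ollama on its default port, 11434) would look something like this:

```python
from crewai import LLM

# Assumes llama3.2 is being served locally by Ollama,
# e.g. started with `ollama run llama3.2`.
local_llm = LLM(
    model="ollama/llama3.2",
    base_url="http://localhost:11434",
)
```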

Thank you so much, Michaela.

The issue I’m facing is that, for some reason, my installation of crewai does not include the LLM class. As a result, whenever I try to point to my locally installed llama3.2, I get the NameError captured in the screenshot.

I have reinstalled crewai several times and even installed more recent versions, but nothing seems to work.

My suspicion is that my crewai installation was somehow unsuccessful or incomplete, but I’m not sure how to fix it. Everything else seems to work.

Update:

I found out what I was doing wrong: I needed to have llama3.2 running before attempting “from crewai import LLM”. Things are looking good now.
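In case it helps anyone else, here is a minimal sketch of wiring the local model into a crew (placeholder role/goal/task text, and assuming llama3.2 is already running via Ollama on the default port):

```python
from crewai import Agent, Crew, LLM, Task

# llama3.2 must already be running locally, e.g. via `ollama run llama3.2`.
local_llm = LLM(model="ollama/llama3.2", base_url="http://localhost:11434")

writer = Agent(
    role="Writer",                                    # placeholder role
    goal="Summarise a topic in two sentences",        # placeholder goal
    backstory="A concise technical writer.",
    llm=local_llm,
)

task = Task(
    description="Summarise what multi-agent systems are.",
    expected_output="A two-sentence summary.",
    agent=writer,
)

crew = Crew(agents=[writer], tasks=[task])
print(crew.kickoff())
```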

Thank you so much guys!