Congratulations @kronsteen
You could share your thoughts with the community on:
- Your objective in creating this chatbot
- Planning
- Challenges faced or overcome
- The overall outcome
Happy learning!!!
Regards
DP
Hi, I have just bought an M2 Mac and I was not sure if it was good enough to work with AI applications.
Hi,
I own a MacBook Pro M2 and here’s a little tip:
If you add device="mps", the GPU is used and inference is much faster.
In your example:

from transformers import pipeline

chatbot = pipeline(task="text-generation",
                   model="facebook/blenderbot-400M-distill",
                   device="mps")
The same applies if you use Stable Diffusion, but the speedup will be much greater than for the small chat model above:
import torch
from diffusers import DiffusionPipeline

pipeline = DiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-3-medium-diffusers",
    torch_dtype=torch.float16,
).to("mps")
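One caveat: hard-coding device="mps" will fail on machines without Apple Silicon. A small sketch of a fallback helper (the function name `pick_device` is my own, not from any library) shows the idea; in real code you would pass in `torch.backends.mps.is_available()` and `torch.cuda.is_available()`:

```python
def pick_device(mps_available: bool, cuda_available: bool) -> str:
    # Prefer Apple's Metal backend, then CUDA, then plain CPU.
    if mps_available:
        return "mps"
    if cuda_available:
        return "cuda"
    return "cpu"

# In practice (requires torch):
#   device = pick_device(torch.backends.mps.is_available(),
#                        torch.cuda.is_available())
print(pick_device(True, False))   # → mps
print(pick_device(False, False))  # → cpu
```

That way the same script runs on an M2 Mac, a CUDA box, or a plain CPU machine without edits.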
→ The MacBook Pro M2 (and up) is just perfect for working with AI