Installing PyTorch, running BlenderBot on M2 Mac - what a great feeling!


Congratulations @kronsteen

You could share your thoughts with the community on:

  1. Your objective in creating this chatbot
  2. Your planning
  3. Challenges faced or overcome
  4. The overall outcome

Happy learning!!!

Regards
DP

Hi, I have just bought an M2 Mac and I was not sure if it was good enough to work with AI applications.

Hi,
I own a MacBook Pro M2 and here’s a little tip:

If you add device="mps", the GPU is used and inference is much faster:
In your example:
chatbot = pipeline(task="text-generation",
                   model="facebook/blenderbot-400M-distill",
                   device="mps")
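
By the way, if you want the same script to also run on machines without Apple-silicon GPU support, you can check availability first and fall back to the CPU. A minimal sketch (the fallback logic is just a suggestion; torch.backends.mps.is_available() is standard PyTorch):

import torch

# Use the Apple-silicon GPU when PyTorch was built with MPS support,
# otherwise fall back to the CPU
device = "mps" if torch.backends.mps.is_available() else "cpu"
print(f"Using device: {device}")

Then pass device=device to pipeline() instead of the hard-coded string.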

The same applies if you use Stable Diffusion, but the speedup will be much greater than for the small chat model above:

import torch
from diffusers import DiffusionPipeline

pipeline = DiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-3-medium-diffusers",
    torch_dtype=torch.float16,
).to("mps")
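
Once the pipeline is on "mps", generating an image looks roughly like this (the prompt and filename are just examples, and you need access to the SD3 weights on the Hub):

# Generate one image from a text prompt on the Apple-silicon GPU and save it to disk
image = pipeline("a photo of an astronaut riding a horse on the beach").images[0]
image.save("astronaut.png")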

→ The MacBook Pro M2 (and up) is just perfect for working with AI
