Using MCP Servers with Claude Code

Hey deeplearning.ai community!

I’m wondering if anyone else has run into this problem: you connect MCP servers to Claude Code and your token usage skyrockets to the point where the session burns through its entire token budget.

I’m guessing some MCPs are set up in a way that keeps context leaner, while others come with a massive payload of context?
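
For what it’s worth, my rough understanding so far is that every connected MCP server’s tool definitions get loaded into the context at session start, so servers that expose lots of tools (or very verbose schemas) eat tokens before I even send a prompt. I believe the `/context` slash command inside a session shows a breakdown of where the tokens are going, including MCP tools. Here’s a sketch of what I’ve been trying, assuming the CLI still has the `claude mcp` subcommands and project-scoped `.mcp.json` support (the commands and the Supabase package name are from memory, so double-check against the docs):

```bash
# See which MCP servers are currently configured
claude mcp list

# Remove a server that's only needed occasionally, so its tool schemas
# stop being loaded into every session
claude mcp remove supabase

# Keep heavier servers in a project-scoped .mcp.json so they only load
# in the repos that actually need them (hypothetical example config)
cat > .mcp.json <<'EOF'
{
  "mcpServers": {
    "supabase": {
      "command": "npx",
      "args": ["-y", "@supabase/mcp-server-supabase"]
    }
  }
}
EOF
```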

Just wanted to pose a general question here and see if anyone has any good tips or workarounds for this problem. Thanks!

Yeah, I started using Skills way more and can definitely recommend them. They work quite well and are much less token-heavy. I still run MCPs occasionally, e.g. for Supabase-related tasks, but Skills definitely help with token usage.
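
In case it helps, here’s a minimal sketch of what a Skill looks like (paths and frontmatter fields are from memory, so check the docs). As I understand it, only the name and description from the frontmatter sit in context until Claude actually decides to use the skill, which is why it’s so much lighter than an always-loaded MCP server. E.g. a hypothetical `.claude/skills/supabase-queries/SKILL.md`:

```markdown
---
name: supabase-queries
description: How to run read-only queries against our Supabase project, including connection details and table naming conventions.
---

# Supabase queries

1. Use the Supabase CLI (or psql) with the read-only connection string in `.env.local`.
2. Stick to `select` statements; never run migrations from this skill.
3. Table names follow `app_<domain>_<entity>`, e.g. `app_billing_invoices`.
```

Everything below the frontmatter only gets pulled into context when the skill fires, whereas an MCP server’s full tool list is in context for the whole session.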