Lesson Description

The "MCP Overview" Lesson is part of the full, Cursor & Claude Code: Professional AI Setup course featured in this preview video. Here's what you'd learn in this lesson:

Steve discusses the features of MCP servers and shares some use cases for MCP related to GitHub repos or Figma designs. MCP behaves like an API but for LLMs. It allows applications like Cursor or Claude Code to interact with other external tools through a consistent protocol.


Transcript from the "MCP Overview" Lesson

[00:00:00]
>> Steve Kinney: The other popular thing that comes up a lot is this idea of what's called an MCP server, or Model Context Protocol. To be clear, Model Context Protocol servers are not unique to just the AI coding agents. You can use them in the Claude desktop app, and definitely with the OpenAI SDK as well as the Gemini one.

[00:00:30]
It is a standard from Anthropic that has gained some amount of foothold across all of the other providers as well. And the best way to think about this, and we'll see it in a slide in a second, is to think of them as ways to extend the functionality of a given LLM that you're working with.

[00:00:50]
Here's a collection of me taking various runs at metaphors all on one slide. Choose the metaphor that makes you the happiest: you can consider them like APIs for LLMs. The same way you would hit the Twilio API or the Stripe API, you can also hit MCP server APIs as well.

[00:01:08]
In fact, a lot of MCP servers are wrappers over APIs. They're basically saying, hey, Claude or ChatGPT or Gemini, tell me that you want to hit the GitHub API, I'll do it and give you the result, effectively. That's a better metaphor than I wrote on any of these slides.

[00:01:25]
Basically, it is the core way that any of these coding agent framework thingies can connect to other services and do additional things outside of what they can do, right. And they are relatively easy to do. Again, we are not doing a full talk on everything MCP today, we're doing just enough so we can go use one.
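
If you want a sense of what an MCP server actually looks like under the hood, here is a minimal sketch in TypeScript, assuming the official @modelcontextprotocol/sdk package and zod for the input schema; the server name and the greet tool are made up purely for illustration.

```typescript
// Minimal MCP server sketch (assumes @modelcontextprotocol/sdk and zod are installed).
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// The host (Cursor, Claude Code, the Claude desktop app) launches this process
// and speaks the protocol with it over stdio.
const server = new McpServer({ name: "demo-server", version: "1.0.0" });

// Register a tool. The name, description, and input schema are what the host
// advertises to the model; the handler runs when the model decides to call it.
server.tool(
  "greet",
  "Say hello to someone (illustrative only, not a real integration)",
  { name: z.string() },
  async ({ name }) => ({
    content: [{ type: "text", text: `Hello, ${name}!` }],
  })
);

// Wire the server up to stdin/stdout so an MCP host can talk to it.
await server.connect(new StdioServerTransport());
```

The host asks this process what tools it offers and forwards those descriptions to the model, which is exactly the "APIs for LLMs" framing above.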

[00:01:47]
And so, there's some kind of host, which in this case is Cursor or Claude Code for us. What are some examples? Okay, I don't have Jira anymore, but I would have totally used one that can pull in stories from Jira, do stuff with Jira, and manage all my Jira tickets.

[00:02:04]
Because I'm writing all these documents now about features, I could have written them in Jira and pulled them in. I understand, you either die a hero or live long enough to become the villain. But you can have it reach out to Jira and pull in the next story with all of its acceptance criteria and stuff along those lines.

[00:02:21]
Or GitHub, to pull in issues, which Cursor has built in, as we saw before, so I could just connect it. And then in Claude Code, if you have that GitHub command line tool, the gh tool can also do most of the GitHub stuff. I don't tend to install that GitHub one too often because each one has its own built-in way to do that.
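
For what it's worth, registering one of these servers with a host is usually just a small config entry. The sketch below writes the mcpServers shape that Cursor reads from .cursor/mcp.json; the GitHub server package name and environment variable follow how that server has commonly been published, but treat the exact values as assumptions and check the README of whatever server you actually install.

```typescript
// Sketch: generate a .cursor/mcp.json entry for a GitHub MCP server.
// The mcpServers shape is what Cursor (and, similarly, the Claude desktop app)
// reads; package name and env var are assumptions to verify against the server's docs.
import { mkdirSync, writeFileSync } from "node:fs";

const config = {
  mcpServers: {
    github: {
      command: "npx",
      args: ["-y", "@modelcontextprotocol/server-github"],
      env: {
        // Personal access token scoped to the repos you want the agent to touch.
        GITHUB_PERSONAL_ACCESS_TOKEN: "<your-token-here>",
      },
    },
  },
};

mkdirSync(".cursor", { recursive: true });
writeFileSync(".cursor/mcp.json", JSON.stringify(config, null, 2));
```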

[00:02:42]
The other ones that a lot of people use are things like one that just wraps a Postgres driver. Let this one sink in for a second: you can trust your LLM agent to execute Postgres commands directly. Remember that screenshot I showed before, where it dropped our entire database? You can do that too.

[00:03:08]
The Supabase one is actually super interesting because, like the PostgreSQL one, you can pull in the schemas, right? Yeah, you could have gotten the schema yourself and put it in a Markdown or JSON file, but having the ability to pull the schemas of your data structures automatically can be useful.
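
To make the schema point concrete, here is a rough sketch of the kind of read-only tool such a server exposes, assuming the pg client and the same @modelcontextprotocol/sdk as before; the get_schema tool name is hypothetical, since the real Postgres and Supabase servers have their own tool catalogs.

```typescript
// Sketch of a read-only "give me the schema" tool, the kind of thing a
// Postgres or Supabase MCP server exposes. Assumes `pg` and @modelcontextprotocol/sdk.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { Client } from "pg";

const server = new McpServer({ name: "schema-demo", version: "1.0.0" });

// Hypothetical tool name; substitute whatever the real server actually calls it.
server.tool("get_schema", "List tables and columns in the public schema", async () => {
  const db = new Client({ connectionString: process.env.DATABASE_URL });
  await db.connect();
  const { rows } = await db.query(
    `SELECT table_name, column_name, data_type
       FROM information_schema.columns
      WHERE table_schema = 'public'
      ORDER BY table_name, ordinal_position`
  );
  await db.end();
  return { content: [{ type: "text", text: JSON.stringify(rows, null, 2) }] };
});

await server.connect(new StdioServerTransport());
```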

[00:03:28]
Another one that is useful is the ability to drive Playwright or Puppeteer, or Firecrawl, which is a web scraper. So it can spin up a Puppeteer browser and do stuff like scrape the web. The main limitation that we need to deal with is the thing that I care the most about, which is keeping the context window tidy.

[00:03:55]
The way that these tools work is you register an MCP server and it goes, I can do all these things: I can open up GitHub issues, and I can close pull requests, and I can drop your database from the Postgres one, and I can get your Supabase schema, and so on. Cool.

[00:04:14]
You need to tell the LLM all the tools it has available, and guess what that eats into again: the context window, in tokens. So if you are using Max Mode or one of these fancier models, or, in the case of Claude Code, if you are on one of the fancy plans where you can switch between Opus and Sonnet, your rate limits are different.

[00:04:36]
They are cutting into your rate limits. So going crazy and installing all of them just because is going to hurt you, cost you, or be a waste, whichever one of those you choose. So yeah, some of the ones that exist: as I said before, the GitHub one is a thing.
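
If you want to see exactly what a server pushes into the model's context, you can ask it for its tool list yourself. A rough sketch, assuming the SDK's client API and, purely as an example target, the filesystem server package; the four-characters-per-token estimate is a crude rule of thumb, not a real count.

```typescript
// Sketch: connect to an MCP server and dump the tool definitions a host would
// forward to the model. Assumes @modelcontextprotocol/sdk; the filesystem
// server is just an example target.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const client = new Client({ name: "tool-audit", version: "1.0.0" });
await client.connect(
  new StdioClientTransport({
    command: "npx",
    args: ["-y", "@modelcontextprotocol/server-filesystem", "."],
  })
);

const { tools } = await client.listTools();
let characters = 0;
for (const tool of tools) {
  const line = `${tool.name}: ${tool.description ?? ""} ${JSON.stringify(tool.inputSchema)}`;
  characters += line.length;
  console.log(line);
}

// Very rough: ~4 characters per token. This is roughly what the server costs
// you on every request, before the model has done anything at all.
console.log(`~${Math.round(characters / 4)} tokens of tool definitions`);

await client.close();
```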

[00:04:57]
It can do everything. You can go in there and grab a personal access token for whatever you wanted. And I think there's some really powerful stuff with this one. As somebody who used to run an open-source project and dreaded clicking on the issues and having to triage them, I like the idea of being able to pull them into Jira or something like that.

[00:05:20]
Or do some amount of triage, at least automatically tag them and put them in categories, so I can process a given one rather than just going through in chronological order. Totally, I mean, you can have it change GitHub workflows. Anything you can do with the GitHub API, it's a wrapper over the API.

[00:05:35]
So if you can do it with the GitHub API, you can do it with the GitHub MCP server, with the token as you set it up. The one that I have not used, simply because right now I don't have a designer, but that I definitely would have used if I still worked at my last job, hooks into Figma's Dev Mode.

[00:05:53]
That kind of gives you the shape and structure of the components in a given design and lets it query that. So you can literally build, probably not to my standards, but things that look like the Figma design. Presently, for the first time in four years, I don't use Figma.

[00:06:11]
But I will, and I will use it again, and you can pull that stuff in, which I think is super cool. Notion: I keep a lot of my documents in Markdown files right now, but historically Ryl and I have used Notion as the place where we kept all of our documentation.

[00:06:26]
We probably will again at some point. So this allows it to hook into a Notion database, query Notion, and get documents. So you can start to keep either your product requirements or your implementation plans outside of Markdown files, somewhere else, and let the model go in and pull all that stuff out of Notion, right?

[00:06:41]
Again, it's a way to connect outside services to your LLM, and not all of these will be useful to you. And that cursor.directory site, guess what, has hundreds of them, right? There's also an awesome-MCP list, and so you can go find the ones you need.

[00:07:01]
I would say, again, they have a token cost, so just installing any one you want, just because, is dumb, right? Hey, there's a tool that would actually change my workflow? Install it. Jira has one; I would use it if I still used Jira, so no judgment. So this is the Cursor Directory MCP list. Cursor's got that web search built in, but if you wanted to use Brave, great.

[00:07:28]
Cursor's got a Slack integration, but there is stuff like the Stripe API or anything along those lines that you wanted to hook into; if you find a use for it, it's definitely worth a look and experimentation. We'll show how to install one in a second. One I use a lot, even though I probably don't need to, is Context7, which is basically a website

[00:07:53]
where somebody has ingested the docs for tons of libraries and frameworks and cut them up into smaller pieces that are LLM-friendly. So if you wanted just the Svelte docs on using Svelte runes, you could say, hey Claude, or whatever, Cursor, go get me those docs and write them to this Markdown file so I can reference them.

[00:08:17]
Or ephemerally go get them and put them into the context, right? And so I've been using it a lot. Could I also, since it's a website, totally go copy and paste that stuff myself? Yes, so you've got to decide. But it's also nice to get up-to-date docs, and again, this is where the background agents we talked about before come in.

[00:08:36]
Could you have a scheduled task that said, hey, go look at my code, figure out what relevant documentation we would want, and go add Markdown files for it? Like, I keep a folder called reference; you can keep them in another one called docs, I don't care what you call it, and go pull those into the repo.
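
As a sketch of what that kind of scheduled job could look like, here's the general shape, assuming the same SDK client as above; the server command, the get-docs tool name, and its arguments are placeholders for whatever documentation server (Context7 or otherwise) you actually use, so check its real tool list first.

```typescript
// Sketch of a scheduled "pull docs into the repo" job. Assumes the
// @modelcontextprotocol/sdk client; the server command, tool name, and
// arguments are placeholders, not a real server's API.
import { mkdirSync, writeFileSync } from "node:fs";
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const client = new Client({ name: "docs-sync", version: "1.0.0" });
await client.connect(
  new StdioClientTransport({ command: "npx", args: ["-y", "<your-docs-mcp-server>"] })
);

// Hypothetical tool name and arguments; substitute the real ones from the server's tool list.
const result = await client.callTool({
  name: "get-docs",
  arguments: { library: "svelte", topic: "runes" },
});

// Collect the text blocks and drop them into the reference/ folder the lesson mentions.
const text = (result.content as Array<{ type: string; text?: string }>)
  .filter((block) => block.type === "text")
  .map((block) => block.text)
  .join("\n");

mkdirSync("reference", { recursive: true });
writeFileSync("reference/svelte-runes.md", text);

await client.close();
```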
