Modern white-collar work is shifting away from knowing things to conjuring them. We’re part scribe, part sorcerer. We don’t just write code anymore; we summon it into existence, shaping it to fit whatever spell (or requirements spec) we’re under.
Models like Claude and ChatGPT have become essential to my daily workflow: brainstorming ideas, tightening drafts, even finding open taco spots at 12:30 a.m.
But for all their power, I still spend too much time ferrying information back and forth. Every Substack post I write gets bounced between different models—for grammar, for structure, for code. I scaffold ideas with outlines, upload reference files, and re-explain the same context again and again.
⌘C. ⌘V. Repeat.
The unlock? Surprisingly simple: I gave Claude access to my local files. No uploads. No summaries. Just a shared workspace. And with that, I found a new way to work—one that might be useful even if you’ve never written a line of code.
Why Use Claude?
Anthropic introduced the Model Context Protocol (MCP) in November 2024: a standardized way for language models to call external tools safely and predictably. If you read my previous post (yes, the one with the confused fish order in Croatia), you know MCP is all about giving models structured menus instead of making them guess at every step.
Claude supports this natively, which makes it the easiest way to start experimenting—especially compared to ChatGPT, which still relies on a more closed, plugin-based system. That’s why for this demo, I’m using Claude: it’s not just capable, it’s aligned with the standard that’s actually making tool use practical.
Getting Started
The details of how MCP servers work under the hood are fascinating (and surprisingly simple), but for now we’re going to stay high-level and practical.
Here’s what we’ll do:
Use UV, a Python package manager, to install a filesystem MCP server
Wire Claude up to talk to it
Run simple, powerful file operations via natural language
First, let's get UV set up. Run this in a terminal window:
pip install uv
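If you want to confirm the install landed, uv ships two executables, `uv` and `uvx`, and both need to be on your PATH for the config below to work. Here's a quick sanity check (a sketch, not part of uv itself):

```python
import shutil

def check_tools(tools=("uv", "uvx")):
    """Return a mapping of tool name -> resolved executable path (or None if missing)."""
    return {tool: shutil.which(tool) for tool in tools}

for tool, path in check_tools().items():
    print(f"{tool}: {path or 'not found (is pip script dir on your PATH?)'}")
```

If either comes back as not found, your shell probably isn't picking up pip's script directory yet.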
Next, let's update Claude by opening its config file (the path below is macOS-specific):
open -e ~/Library/"Application Support"/Claude/claude_desktop_config.json
and adding this as the entire file contents:
{
  "mcpServers": {
    "mcp-python-filesystem": {
      "command": "uvx",
      "args": [
        "--from",
        "git+https://github.com/dpkirschner/mcp-python-filesystem@main",
        "filesystem-server-start",
        "/opt/claude"
      ]
    }
  }
}
This tells Claude to use UV to pull down and run the filesystem MCP server automatically at startup.
🔒 A note on the final "args" entry: it defines which directories Claude has access to. I gave it an external drive mounted at /opt/claude, just to keep things tidy, but this can be any directory. (One caveat: Claude's config is strict JSON, so don't leave a comment on that line when you change it.)
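Under the hood, a filesystem server typically enforces that boundary by resolving every requested path and refusing anything that escapes the allowed root, including "../" traversal tricks. This isn't the actual server's code, just a minimal sketch of the idea with a hypothetical `ALLOWED_ROOT`:

```python
from pathlib import Path

# Hypothetical allowed root; a real server would take this from the config's args.
ALLOWED_ROOT = Path("/opt/claude")

def is_allowed(requested: str, root: Path = ALLOWED_ROOT) -> bool:
    """True if `requested` resolves to a location inside `root`."""
    resolved = Path(requested).resolve()
    # Path.is_relative_to catches "../" traversal after resolution (Python 3.9+).
    return resolved.is_relative_to(root.resolve())

print(is_allowed("/opt/claude/notes.txt"))      # inside the sandbox -> True
print(is_allowed("/opt/claude/../etc/passwd"))  # traversal attempt  -> False
```

The key move is resolving before comparing: a naive string-prefix check would happily wave `/opt/claude/../etc/passwd` through.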
What Can Claude Do in 30 Seconds?
Big Picture: Why This Matters
It’s easy to dream up fun ways to use AI. You could write a screenplay, build a startup, or simulate a drunken pirate debate on nuclear fission. These are novel, entertaining, and undeniably clever.
But they’re also isolated. Little demo moments. One-offs.
What I’ve been searching for, and what MCP starts to deliver, is continuity. A way to do interesting things productively. To build tools that fit into my life and workflows, instead of sitting beside them like a party trick I can never find the right moment to pull out.
That’s what’s so powerful about the Model Context Protocol. It gives language models a structured interface to the tools I already rely on. And because it’s modular, I can wire up new capabilities in minutes—often with nothing more than a config change.
I shared a broader list in my last post, but here’s a quick sampler of the types of products you can connect to Claude:
For Developers
AWS / Azure / GCP – Query and manage your cloud infra, storage, and databases via natural language.
Postgres / Redis / ClickHouse – Perform queries and updates on popular databases, fast.
For AI Builders
Needle – Ready-to-use RAG pipeline for structured document QA.
Unstructured – Parse messy PDFs, HTML, and more into clean model-ready input.
For Knowledge Work
Notion / Linear – Search, update, and automate your team’s favorite tools.
Zapier – Instantly plug into 8,000+ apps using natural-language flows.
For Browser & Web Automation
Browserbase – Automate browsing, scraping, and data entry tasks.
Apify – Tap into 3,000+ prebuilt web data extraction tools.
Want to wire any of these up?
You already saw the pattern—just point to a different MCP server in your config and restart Claude.
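Concretely, each new capability is just another entry under "mcpServers". If you'd rather not hand-edit JSON, here's a sketch that patches the config programmatically (the server name and args in the commented example are hypothetical placeholders, not a real MCP server):

```python
import json
from pathlib import Path

# macOS location of Claude's desktop config.
CONFIG = Path.home() / "Library" / "Application Support" / "Claude" / "claude_desktop_config.json"

def add_server(config_path: Path, name: str, command: str, args: list) -> dict:
    """Add (or overwrite) an MCP server entry and return the updated config."""
    config = json.loads(config_path.read_text()) if config_path.exists() else {}
    config.setdefault("mcpServers", {})[name] = {"command": command, "args": args}
    config_path.write_text(json.dumps(config, indent=2))
    return config

# Hypothetical example; swap in a real server's package and entry point:
# add_server(CONFIG, "my-second-server", "uvx", ["--from", "some-mcp-package", "server-start"])
```

Restart Claude afterward so it picks up the new entry.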
You could have this running in under 30 seconds. No joke.
What’s Next?
The filesystem MCP server I used in this demo? I built it myself using Windsurf and their new frontier SWE-1 model. Claude, Gemini, and ChatGPT all chimed in, of course. It felt more like orchestrating a team of interns than writing software from scratch.
And the wild part? It worked. A lot more smoothly than I initially expected.
In the next post, I’ll walk you through exactly how I built it—step by step. If you’ve ever wanted to create your own LLM-powered toolchain, this is your invitation.
Until then, I’m curious:
Have you given your chatbot assistant any real tools yet? Or is it still just answering questions?