In this talk, Max Pilżys, ML Engineer at deepsense.ai, walks through how he built a production-ready ChatGPT connector using FastMCP, tested it with MCP Inspector, and wired everything end-to-end with OAuth (including PKCE, scopes, audiences, and redirect handling).
You’ll learn:
- how a connector differs from an MCP server,
- how to design schema-correct search and fetch tools for deep research connectors (see the sketch below),
- when to use RemoteAuthProvider, OAuthProxy, or OIDCProxy,
- how to fix the most common issues (PKCE, missing scopes, wrong audience, discovery failures),
- deployment options: Cloud Run, AWS Lambda + API Gateway, Cloudflare Workers — and which one works best for MCP,
- how to build secure ChatGPT integrations without fighting the protocol.
If you’re exploring ChatGPT API engineering, OpenAI MCP tutorials, or shipping production-ready MCP connectors, this walkthrough gives you a full end-to-end view — including the mistakes worth avoiding.
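For orientation before watching, here is a minimal sketch of the search/fetch pair in FastMCP, without OAuth. The server name, in-memory corpus, result fields, and URLs are illustrative assumptions, not the speaker's code; the exact response schema ChatGPT expects and the auth wiring are covered in the video.
```python
from fastmcp import FastMCP

mcp = FastMCP("demo-connector")

# Hypothetical in-memory corpus standing in for a real retrieval backend.
DOCS = {
    "doc-1": {
        "id": "doc-1",
        "title": "Example document",
        "text": "Full text of the example document.",
        "url": "https://example.com/doc-1",
    }
}

@mcp.tool()
def search(query: str) -> dict:
    """Return lightweight result stubs for a query."""
    # A real connector would run actual retrieval here; this just echoes the corpus.
    return {
        "results": [
            {"id": d["id"], "title": d["title"], "url": d["url"]}
            for d in DOCS.values()
        ]
    }

@mcp.tool()
def fetch(id: str) -> dict:
    """Return the full document for an id previously returned by search."""
    return DOCS[id]

if __name__ == "__main__":
    # Serve over HTTP so the server can be registered as a ChatGPT connector;
    # transport, host, and port follow FastMCP 2.x conventions.
    mcp.run(transport="streamable-http", host="0.0.0.0", port=8000)
```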
Timeline
01:06 Why Connectors Matter
02:45 Connectors vs MCP Servers vs Tools
06:22 Full Connector Flow Explained
08:30 FastMCP & MCP Inspector Overview
11:38 Building the Server: Search & Fetch + OAuth
23:09 Pitfalls & Lessons Learned
Speaker
Max Pilżys
Machine Learning Engineer at deepsense.ai