How I Improved My Productivity with Cursor and the Heroku MCP Server
- Last Updated: May 05, 2025
Generative AI has been an incredible tool for improving my productivity, not only at work but in personal projects too. I use it every day, from generating stories and images for my online role-playing games to solving code and engineering problems and building demos. Lately I’ve leaned into Cursor as my go-to AI coding companion: its inline suggestions and quick edits keep me moving without context-switching. Connecting Cursor to my apps through the Heroku MCP Server lets me perform actions like deploying or scaling without leaving my code editor, making AI a first-class citizen in the Heroku AI PaaS developer toolset. Using it alongside the Heroku Extension for VS Code is a total win. In this article, I’ll show you how tying Cursor and MCP together saved me time and helped me focus on the parts of development I actually enjoy.
What is Model Context Protocol?
Model Context Protocol (MCP) is an open standard from Anthropic that defines a uniform way for an AI assistant (like Cursor) to talk to external tools and data sources. Instead of juggling custom APIs or integrations, MCP wraps up both the “context” my code assistant needs (code snippets, environment state, database schema) and the “instructions” it should follow (fetch logs, run queries, deploy apps) into a single, predictable format. Much like a USB‑C port lets any device plug into any charger without extra adapters, MCP is the universal connector for your AI tools and services.
Under the hood, MCP follows a simple client–server model:
- Host: my editor or chat interface (e.g., Cursor) that decides what my assistant can access
- Client: the small bridge component that keeps a live connection open
- Server: a lightweight service exposing specific capabilities (APIs, database calls, shell commands) in MCP’s schema
When I ask Cursor to “scale my Heroku dynos” or “pull the latest customer records,” it sends an MCP request to the right server, gets back a structured response, and I can keep coding without switching contexts or writing new integration code.
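To make that concrete, MCP messages are framed as JSON-RPC 2.0. A tool invocation from the client to a server might look roughly like the sketch below; the tool name `ps_scale` and its arguments are hypothetical here and depend on what the specific server exposes:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "ps_scale",
    "arguments": { "app": "my-demo-app", "dynoType": "web", "quantity": 2 }
  }
}
```

The server answers with a structured JSON-RPC response carrying the tool’s result, which the host (Cursor) then feeds back to the model as context.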
AI Dev Tools I Use Every Day
When I’m not on stage presenting or behind a mic recording a podcast, I’m usually in VS Code building JavaScript demos that highlight Heroku’s capabilities and best practices. Backend work is my comfort zone; front-end and design aren’t, so I lean on AI to bridge those gaps. Given a design spec (from Figma, for example), I can get a frontend prototype in minutes instead of writing HTML/CSS by hand, which makes the interaction with the design team straightforward. I’ve tried Gemini for ideation, and ChatGPT and Claude for debugging and refactoring code.
Lately, though, Cursor has become my go-to IDE. Its inline LLM suggestions and agentic features let me write, test, design, and even deploy code without leaving the editor. Pairing Cursor with different MCP servers means I can stay in the IDE: it keeps me focused, cuts out needless context-switching, and helps me ship demos faster.
Here is a list of the MCP servers I use and how they improve my productivity:
Heroku MCP Server
All my demos go straight to Heroku. With the Heroku extension for VS Code, I rarely leave my editor to manage apps. And thanks to the Heroku MCP Server, my AI assistant now deploys, scales dynos, fetches logs, and updates config, all without opening the dashboard or terminal.
To install it in your IDE, start by generating a Heroku Authorization token:
heroku authorizations:create --description "Heroku MCP IDE"
Alternatively, you can generate a token in the Heroku Dashboard:
- Go to Account Settings → Applications → Authorizations and click Create new authorization.
- Copy the token you receive.
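If you prefer to stay in the terminal end to end, you can capture the token straight into the environment variable the MCP server reads. This is a sketch assuming the Heroku CLI is installed and you’re logged in; the `--short` flag should print just the token value, but double-check against your CLI version:

```shell
# Create a long-lived authorization and capture only the token
# (--short prints just the token instead of the full details).
export HEROKU_API_KEY=$(heroku authorizations:create \
  --description "Heroku MCP IDE" --short)
```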
Then open your Cursor mcp.json and add the following JSON configuration with the previously generated Heroku Authorization token:
Note: Make sure you have npx installed as a global command on your operating system; npx is part of Node.js.
{
"mcpServers": {
"heroku-mcp-server": {
"command": "npx",
"args": [
"-y",
"@heroku/mcp-server"
],
"env": {
"HEROKU_API_KEY": ""
}
}
}
}
Check the project README for setup instructions on Claude Desktop, Zed, Cline, Windsurf, and VS Code.
LangChain MCPDoc
Many projects have started to adopt the /llms.txt file, which serves as a website index for LLMs, providing background information, guidance, and links to detailed markdown files. Cursor and other AI IDEs can use the llms.txt file to retrieve context for their tasks. The LangChain MCPDoc offers a convenient way to load llms.txt files, whether they are located remotely or locally, making them available to your agents.
Depending on the project I’m working on, I rely on this MCP to fetch documentation. When I’m building other MCPs, I use the recommended https://modelcontextprotocol.io/llms.txt file; when I’m using LangChain JS to build agentic applications with Node.js, I use https://js.langchain.com/llms.txt.
I have also created my own Heroku llms.txt file, which you can download locally and use for your Heroku-related projects.
Here is how you can set up the LangChain MCPDoc in Cursor:
Note: Make sure you have uvx installed as a global command on your operating system; uvx is part of uv, a Python package manager.
{
"mcpServers": {
"heroku-docs-mcp": {
"command": "uvx",
"args": [
"--from",
"mcpdoc",
"mcpdoc",
"--urls",
"HerokuDevCenter:file:///Users/jduque/AI/llmstxt/heroku/llms.txt",
"--allowed-domains",
"*",
"--transport",
"stdio"
]
},
"modelcontextprotocol-docs-mcp": {
"command": "uvx",
"args": [
"--from",
"mcpdoc",
"mcpdoc",
"--urls",
"ModelContextProtocol:https://modelcontextprotocol.io/llms.txt",
"--allowed-domains",
"*",
"--transport",
"stdio"
]
}
}
}
Figma MCP Server
Another one of my favorites is the Figma MCP Server. It allows Cursor to download design data from Figma. I just copy the link of the Figma frame I want to implement, paste it into my Cursor chat, and with the right prompt, it does the magic. For example, I recently had to implement our brand guidelines in a demo I’m working on, so I pasted the frame that contains the Heroku color palette, and it created a Tailwind CSS theme with the right styles. Without this tool, I’d have to copy all the colors from the Figma file and organize them in the JSON structure that Tailwind expects.
Here is how you can set up the Figma MCP Server in Cursor:
{
"mcpServers": {
"figma-mcp-server": {
"command": "npx",
"args": [
"-y",
"figma-developer-mcp",
"--figma-api-key=",
"--stdio"
]
}
}
}
Conclusion
Adding the Heroku MCP Server to Cursor transformed my editor into a powerful development tool. I stopped jumping between terminals, dashboards, and code. Instead, I write a prompt, and Cursor handles the rest: running queries, deploying apps, scaling dynos, or pulling logs.
This shift improved my productivity and shaved minutes off every task, cutting down on errors from running commands from memory or context-switching. More importantly, it lets me stay in flow longer, so I can focus on the parts of coding I enjoy the most.
If you’re already using Cursor or another AI coding tool, give MCP a try. Also, take a look at this quick demo where I use the Heroku MCP Server and Cursor to build and deploy a simple web app.