AI First Application Development

We've seen the rise of MCP servers over the last year, first as purely local tool calling for software engineers, then increasingly in business and enterprise use cases. What I believe we'll see in the near future is consumer-facing MCP servers as the next frontier of how everyone interacts with many of their applications. We may even begin to see a wave of applications where MCP is the primary way users interact with them, with mobile, desktop and web all being second-class citizens.

The use cases for consumer MCP servers are reasonably apparent: connecting multiple services you would normally interact with separately into one conversation orchestrated by an LLM. To give a high-level idea, here are some theoretical prompts that would work across multiple MCP servers:

  • Book a dinner reservation near my 5pm meeting and make it a table for 4 (calendar + dinner reservation MCPs)
  • Use this recipe <url> to add ingredients to my grocery cart (web fetching + supermarket ordering MCP)
  • Book me a flight to Chicago for the product launch next week, and make sure it's under $500 (calendar + flight booking MCPs)
  • If it's going to rain tomorrow, text the party group chat to let them know it'll be moved to <my house>. (weather forecast MCP + messaging MCP).

What's the Hold Up?

One of the biggest gaps currently is that consumers can't easily access remote MCP servers!

  • Anthropic/Claude - pioneered MCP, but connecting to a remote MCP server is only available to paid subscribers
  • OpenAI/ChatGPT - has MCP integrations available in developer mode, and offers MCP connectors via ChatGPT Apps to paid subscribers
  • Google/Gemini - as of writing, has no way for its chat users to connect to MCP servers

With all this gated behind developer options or paid subscriptions, remote MCP really isn't accessible to the majority of users. But supporting a remote MCP server isn't that complicated; it's just an HTTP connection to a web server!
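To make "it's just HTTP" concrete: under the hood, MCP is JSON-RPC 2.0, and with the Streamable HTTP transport a client simply POSTs these messages to the server's endpoint. Here is a minimal sketch of the two core messages a client exchanges; the tool name and arguments are hypothetical, for illustration only.

```python
import json

def jsonrpc(method: str, params: dict, msg_id: int) -> dict:
    """Build a JSON-RPC 2.0 request envelope, as used by MCP."""
    return {"jsonrpc": "2.0", "id": msg_id, "method": method, "params": params}

# First message of any MCP session: negotiate protocol version and capabilities.
init = jsonrpc("initialize", {
    "protocolVersion": "2025-03-26",
    "capabilities": {},
    "clientInfo": {"name": "example-client", "version": "0.1"},
}, msg_id=1)

# After initialization, the client can list tools and call them.
call = jsonrpc("tools/call", {
    "name": "get_forecast",            # hypothetical tool name
    "arguments": {"city": "Chicago"},  # tool-specific arguments
}, msg_id=2)

# This string is the HTTP request body a client would POST to the server.
body = json.dumps(call)
```

Everything else (auth, streaming responses, notifications) layers on top of this same request/response shape, which is why adding remote MCP support to a chat client is more plumbing than rocket science.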

Introducing: Joey MCP Client

With the current lack of available options for consumers (without forking over for a subscription), I decided to create my own chat interface that provides MCP support out of the box!

The app is called Joey MCP Client, and the way it works is very simple:

  • Connecting to LLMs is handled by authenticating with your OpenRouter account
  • You manually specify the remote MCP servers in settings
  • You can pick exactly which OpenRouter LLM + MCP Servers to use for every conversation!
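A server entry in settings might look something like the following. This is a hypothetical shape for illustration, not necessarily Joey's actual settings schema:

```json
{
  "mcpServers": [
    {
      "name": "Weather",
      "url": "https://example.com/mcp",
      "auth": "oauth"
    }
  ]
}
```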

OpenRouter was chosen as the model provider for the initial implementation because:

  • It provides access to virtually every major LLM out there
  • It has rich privacy settings over how your prompts are used
  • It is pay-per-use rather than subscription-based
  • It supports OAuth for securely providing access to Joey with spending limits and token expiry
  • It abstracts the implementation differences between different proprietary models

The app even supports some neat chat features like:

  • Picture support, both from MCP responses and user uploads
  • Multiple MCP servers at the same time
  • OAuth to MCP servers - so you can connect to your accounts securely

And for my developer and privacy-conscious readers:

  • The app itself is source available (FSL + time-deferred MIT), so you can run it for free by building from source.
  • The app itself includes no telemetry (what OpenRouter or the connected MCP servers do with any data is, of course, outside the app's control)
  • No ads!

There are also loads of features that could be added to future versions of the app too:

  • Server discovery / MCP server index
  • Rich server-provided UI with MCP Apps / MCP-UI
  • Support for locally hosted LLM servers (I hear you r/LocalLLaMA) and third-party OpenAI compatible APIs

You can also build the app from source on GitHub.

Example MCP Server with Joey: Mob CRM

To test out Joey, I built an "AI first" application to pair it with. I've long been a fan of Monica Personal CRM for helping me keep track of my social connections, but I always dreaded the data-entry aspect of it. This happens to be an absolutely perfect fit for an LLM-integrated application, so I built Mob CRM, which is essentially a personal CRM you access almost exclusively via MCP.

It is far more natural to recount an interaction as:

"I met with John today, and he introduced me to his friend from work Jane. We all talked about our shared obsession with slurpees"

This is very easy to write (or dictate), and it takes the LLM only a few tool calls to Mob CRM to create the contacts, the connections and the activity notes; way easier than doing 30 different clicks via the UI of Monica.
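The single sentence above might fan out into a tool-call sequence like this. The tool names and argument shapes are hypothetical, not Mob CRM's actual API, but they show the flavor of what the LLM does on your behalf:

```python
# Hypothetical Mob CRM tool calls an LLM might emit for the sentence above.
tool_calls = [
    # Jane is new, so she becomes a contact first.
    {"tool": "create_contact",
     "arguments": {"name": "Jane", "met_through": "John"}},
    # Record the relationship between the two contacts.
    {"tool": "link_contacts",
     "arguments": {"a": "John", "b": "Jane", "relation": "coworkers"}},
    # Log the interaction itself against both contacts.
    {"tool": "log_activity",
     "arguments": {
         "participants": ["John", "Jane"],
         "note": "Talked about our shared obsession with slurpees",
     }},
]
```

One sentence of natural language becomes several structured writes, which is exactly the data entry I dreaded doing by hand.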

Wins for Developers: Skipping the UI Layer

Developers who have worked with AI-assisted coding over the last couple of years know that some of the easiest tools to build are CLI tools, primarily because they are:

  • Simple input/output
  • Basically text-only
  • Easily testable
  • Easy for an LLM to interpret the results / self-evaluate when things are working

This is in contrast to web development with LLMs, which often requires a human in the loop to communicate nuances in behavior, styling issues, etc. These aren't beyond multi-modal LLMs, but there is definitely an extra step there.

Looking at that list, MCP server development shares many parallels with CLI tools: it's text-heavy for the primary use cases, it's easily testable just like any other web API, and LLMs can easily evaluate their own behavior against it.

This means that to develop new applications in an "AI first" way with an MCP server, the barrier is even lower than building a new mobile or web app.

Hopping Into the Future

Once MCP access is democratized, whether by the big players adding support for everyone or by users adopting MCP via their own chat apps, we may begin to see a shift in usage. The drain in online activity (see: Google search result clicks) that has been sucked into the LLM vortex will only continue to spread to other applications as the friction lowers for users to perform their actions from within the comfort of chat.

Just as every serious business™ today needs a mobile app, tomorrow they will need an MCP server to stay relevant. In an AI-first world, if an LLM can't talk to your business, your customers won't either.