An MCP server is a program that exposes tools, resources, or prompts to AI assistants using the Model Context Protocol standard.
How MCP servers work
An MCP server is the provider side of the Model Context Protocol. It sits between an AI assistant and some external capability, whether that is a database, an API, a local file system, or a specialised tool like bookmark search.
When an MCP client connects, the server goes through a handshake process. It tells the client what capabilities it supports: which tools are available, what resources can be read, and what prompt templates exist. The client stores this information and uses it to decide when to invoke the server during conversations.
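As a rough illustration of what the server hands back during that handshake, here is a self-contained TypeScript sketch of a capability-advertisement response. The field names are modelled on the protocol's JSON structures but simplified, and the server name and version string are made-up values for this example:

```typescript
// Hypothetical, simplified shape of the information an MCP server
// returns during the initialize handshake (field names illustrative).
interface ServerCapabilities {
  tools?: Record<string, unknown>;     // present => server offers tools
  resources?: Record<string, unknown>; // present => server offers resources
  prompts?: Record<string, unknown>;   // present => server offers prompts
}

interface InitializeResult {
  protocolVersion: string;
  serverInfo: { name: string; version: string };
  capabilities: ServerCapabilities;
}

// Build the handshake response the client stores and consults later
// when deciding whether to invoke this server in a conversation.
function buildInitializeResult(): InitializeResult {
  return {
    protocolVersion: "2024-11-05", // version string is an assumption
    serverInfo: { name: "bookmark-server", version: "1.0.0" },
    capabilities: {
      tools: {},     // "I can list and execute tools"
      resources: {}, // "I can serve readable resources"
    },
  };
}

console.log(JSON.stringify(buildInitializeResult(), null, 2));
```

The empty objects under `capabilities` act as presence flags: the client only needs to know which categories the server supports before asking for details with follow-up requests.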
For example, when you ask Claude “find my saved articles about TypeScript”, Claude checks its connected MCP servers, sees that ContextBolt offers a bookmark search tool, calls it with your query, and weaves the results into its response.
What an MCP server can expose
MCP servers can provide three types of capabilities:
Tools are functions the AI can call. A bookmark search tool, a database query tool, or a code execution tool are all examples. Tools accept parameters and return results. They are the most common capability.
Resources are data the AI can read. A list of available bookmark collections, a project’s file tree, or a set of configuration values could all be exposed as resources. Resources are read-only and help give the AI context.
Prompts are pre-built templates for common tasks. A server might offer a “summarise bookmarks” prompt that structures the AI’s response in a specific way. Prompts are optional and less commonly used than tools or resources.
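To make the three capability types concrete, here is a hedged TypeScript sketch of their shapes, with one example of each. The interfaces are simplified approximations of the protocol's JSON structures, and the specific names and URI scheme (`search_bookmarks`, `bookmarks://collections`) are invented for illustration:

```typescript
// Simplified shapes for the three MCP capability types (illustrative).
interface Tool {
  name: string;
  description: string;
  inputSchema: object; // JSON Schema describing the tool's parameters
}

interface Resource {
  uri: string;       // e.g. "bookmarks://collections" (invented scheme)
  name: string;
  mimeType?: string; // resources are read-only data the AI can load
}

interface Prompt {
  name: string;
  description: string;
}

// One of each, as a hypothetical bookmark server might advertise them:
const searchTool: Tool = {
  name: "search_bookmarks",
  description: "Search saved bookmarks by keyword",
  inputSchema: { type: "object", properties: { query: { type: "string" } } },
};

const collectionsResource: Resource = {
  uri: "bookmarks://collections",
  name: "Bookmark collections",
  mimeType: "application/json",
};

const summarisePrompt: Prompt = {
  name: "summarise_bookmarks",
  description: "Summarise a set of bookmarks in a fixed structure",
};

console.log(searchTool.name, collectionsResource.uri, summarisePrompt.name);
```

Note the division of labour: tools carry a schema because the AI must construct valid arguments before calling them, while resources only need an address the client can read from.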
Building an MCP server
The barrier to building an MCP server is low. The official SDKs handle protocol details, transport, and error handling. You focus on defining what your server does.
A minimal TypeScript server looks like this: you create a Server instance, register your tools with their input schemas, implement the handler functions, and connect the server to a transport (stdio for local use, HTTP for remote). The SDK handles JSON-RPC messaging, capability negotiation, and connection lifecycle.
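The register-and-dispatch pattern the SDK implements for you can be sketched in plain TypeScript. This is not the SDK's actual API; the names `registerTool` and `callTool`, and the bookmark tool itself, are hypothetical, standing in for what happens when a client sends a tool-call request:

```typescript
// Self-contained sketch of the register-and-dispatch pattern an MCP
// server SDK handles for you. Names are illustrative, not the SDK's API.
type ToolHandler = (args: Record<string, unknown>) => string;

interface RegisteredTool {
  description: string;
  inputSchema: object; // JSON Schema for the tool's parameters
  handler: ToolHandler;
}

// The server's tool registry: name -> schema + handler.
const tools = new Map<string, RegisteredTool>();

function registerTool(name: string, tool: RegisteredTool): void {
  tools.set(name, tool);
}

// Roughly what the server does when a tool-call request arrives:
// look up the tool by name and run its handler with the arguments.
function callTool(name: string, args: Record<string, unknown>): string {
  const tool = tools.get(name);
  if (!tool) throw new Error(`Unknown tool: ${name}`);
  return tool.handler(args);
}

// Register a hypothetical bookmark search tool.
registerTool("search_bookmarks", {
  description: "Search saved bookmarks by keyword",
  inputSchema: { type: "object", properties: { query: { type: "string" } } },
  handler: (args) => `Results for "${args.query}"`,
});

console.log(callTool("search_bookmarks", { query: "TypeScript" }));
// → Results for "TypeScript"
```

In a real server the SDK wraps this dispatch in JSON-RPC framing and validates the incoming arguments against the tool's schema before your handler ever runs; here the handler is invoked directly to keep the sketch short.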
Most servers start with one or two tools and grow from there. ContextBolt’s MCP server, for instance, exposes bookmark search and collection listing tools that any compatible AI client can use.
Real-world examples
MCP servers power a growing ecosystem of AI integrations:
- ContextBolt exposes your social media bookmarks from Twitter/X, Reddit, and LinkedIn as searchable tools for Claude Desktop, Cursor, and other clients
- File system servers let AI assistants read and write local files safely
- Database servers provide natural language query access to PostgreSQL, SQLite, or other databases
- Git servers expose repository history, diffs, and branch information
- API wrapper servers turn any REST API into an AI-accessible tool
The pattern is consistent: take something useful, wrap it in an MCP server, and every compatible AI assistant can use it immediately.