What is a Connector?

A connector is a standalone microservice that provides a standardized interface to interact with a specific type of data source (databases, APIs, etc.). Each connector implements the Model Context Protocol (MCP) to expose tools that can be invoked by users through the platform.

Key Characteristics

  • Independent Service: each connector runs as its own service on a dedicated port
  • MCP Protocol: implements standard MCP for tool registration and execution
  • Connection Pooling: manages connection pools for efficient resource usage
  • Async Operations: built on async/await for high concurrency
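The async design is what lets one connector process serve many tool calls at once. A minimal illustration of that overlap, using stdlib asyncio with a sleep as a stand-in for real database I/O (function names are hypothetical):

```python
import asyncio

async def run_tool(name: str, delay: float) -> str:
    # Stand-in for an async tool call; a real connector would
    # await a pooled database connection here instead of sleeping.
    await asyncio.sleep(delay)
    return f"{name}: done"

async def main() -> list[str]:
    # The three calls overlap instead of running back to back.
    return await asyncio.gather(
        run_tool("list_tables", 0.01),
        run_tool("execute_query", 0.01),
        run_tool("test_connection", 0.01),
    )

results = asyncio.run(main())
print(results)
```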

Available Connectors

PostgreSQL Connector

Production-ready connector for PostgreSQL databases. Features:
  • AsyncPG-based connection pooling
  • Full schema introspection
  • Parameterized query execution
  • Transaction support
  • SSL/TLS encryption
Port: 8027

MSSQL Connector

Microsoft SQL Server and Azure SQL Database connector. Features:
  • ODBC Driver 18 support
  • Azure SQL authentication
  • Async operations with aioodbc
  • Connection encryption
  • Schema introspection
Port: 8028

Connector Architecture

Connector Service (FastMCP)
├── main.py              # Entry point, tool definitions
├── schema.py            # Pydantic models for config
├── db_manager.py        # Connection pooling logic
├── Dockerfile.dev       # Container configuration
├── pyproject.toml       # Dependencies (UV)
└── media/               # Logos and assets

Tools vs Templates

Tools

Tools are callable operations that perform specific actions:
@mcp.tool()
async def list_tables() -> list[str]:
    """List all tables in the database"""
    return await pool_manager.get_tables(server_id, server_config)
Examples:
  • list_tables() - List database tables
  • execute_query(query, params) - Run SQL queries
  • test_connection() - Verify connectivity

Templates

Templates are pre-configured tool patterns:
@mcp.template(name="select_query", params_model=SelectQueryTemplate)
async def select_query(params: SelectQueryTemplate) -> str:
    """Execute SELECT with automatic LIMIT"""
    # ...implementation
Use Cases:
  • Common query patterns
  • Reusable workflows
  • Parameter validation
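The "automatic LIMIT" behavior mentioned in the template docstring can be sketched as a small helper. This is illustrative only (the name `apply_limit` and the string-based check are assumptions; real template logic would likely use a SQL parser):

```python
def apply_limit(query: str, max_rows: int = 100) -> str:
    """Append a LIMIT clause unless the query already has one.

    Sketch only: a naive word check, not a real SQL parser.
    """
    stripped = query.strip().rstrip(";")
    if "limit" in stripped.lower().split():
        return stripped
    return f"{stripped} LIMIT {max_rows}"

print(apply_limit("SELECT * FROM users"))
# → SELECT * FROM users LIMIT 100
```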

Connection Lifecycle

1. Connector Startup: service starts and loads configuration
2. Server Creation: user creates a server instance with specific config (host, database, credentials)
3. Pool Initialization: connection pool created for the server
4. Tool Execution: tools use pooled connections for operations
5. Pool Cleanup: idle connections automatically closed after TTL
6. Server Destruction: pool closed when the server is deleted
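The per-server part of this lifecycle (steps 2–6) can be sketched with a dict of pools keyed by server ID. Everything here is illustrative; `_connect` is a stand-in for the real driver call (e.g. a pool factory like asyncpg's), and the class/method names are assumptions:

```python
import asyncio

class PoolManager:
    """Sketch of the per-server pool lifecycle."""

    def __init__(self) -> None:
        self._pools: dict[str, object] = {}

    async def get_pool(self, server_id: str, config: dict) -> object:
        # Step 3: lazily create one pool per server instance,
        # then reuse it for every subsequent tool call (step 4).
        if server_id not in self._pools:
            self._pools[server_id] = await self._connect(config)
        return self._pools[server_id]

    async def _connect(self, config: dict) -> object:
        # Stand-in for the real pool factory.
        return {"host": config.get("host", "localhost"), "open": True}

    async def close(self, server_id: str) -> None:
        # Step 6: drop the pool when the server is deleted.
        self._pools.pop(server_id, None)
```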

Configuration Model

Each connector defines a Pydantic configuration model:
from typing import Any, Dict, Optional

from pydantic import BaseModel, Field

class PostgresConfig(BaseModel):
    host: Optional[str] = Field(default=None)
    port: Optional[int] = Field(default=5432)
    database: str = Field(description="Database name")
    username: Optional[str] = Field(default=None)
    password: Optional[str] = Field(default=None)
    pool_size: int = Field(default=5)
    max_overflow: int = Field(default=10)
    additional_params: Optional[Dict[str, Any]] = None
Benefits:
  • Type validation
  • Auto-generated forms in dashboard
  • Default values
  • Field descriptions
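Once validated, config fields like these are typically assembled into a connection string for the driver. A sketch of that step, mirroring the `PostgresConfig` field names above (whether the actual connector builds a DSN or passes fields to the driver individually is an assumption):

```python
from typing import Optional

def build_dsn(
    database: str,
    host: Optional[str] = None,
    port: int = 5432,
    username: Optional[str] = None,
    password: Optional[str] = None,
) -> str:
    """Assemble a postgresql:// DSN from validated config fields."""
    auth = ""
    if username:
        # Include the password only when one is configured.
        auth = username + (f":{password}" if password else "") + "@"
    return f"postgresql://{auth}{host or 'localhost'}:{port}/{database}"

print(build_dsn("production", host="192.168.1.10", username="app"))
# → postgresql://app@192.168.1.10:5432/production
```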

Multi-Server Support

A single connector can manage multiple server instances:
PostgreSQL Connector (:8027)
├── prod-db-server
│   └── Pool (host: 192.168.1.10, db: production)
├── staging-db-server
│   └── Pool (host: 192.168.1.11, db: staging)
└── dev-db-server
    └── Pool (host: localhost, db: development)
Each server maintains its own connection pool with independent configuration.

Performance Features

Connection Pooling

PoolManager:
├── Global Limit: 500 connections
├── Per-Server Limit: 20 connections
├── Idle TTL: 300 seconds
└── LRU Eviction: Automatic cleanup
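The eviction policy above combines a pool cap with idle-TTL cleanup. A minimal sketch using an `OrderedDict` as the LRU structure (class and method names are assumptions, and real pools would be closed rather than just dropped):

```python
import time
from collections import OrderedDict

class LRUPools:
    """Sketch: per-connector pool cap plus idle-TTL cleanup."""

    def __init__(self, max_pools: int = 20, idle_ttl: float = 300.0) -> None:
        self.max_pools = max_pools
        self.idle_ttl = idle_ttl
        # server_id -> (pool, last-used timestamp), oldest first
        self._pools: OrderedDict = OrderedDict()

    def touch(self, server_id: str, pool: object) -> None:
        # Record use; most recently used entries move to the end.
        self._pools[server_id] = (pool, time.monotonic())
        self._pools.move_to_end(server_id)
        while len(self._pools) > self.max_pools:
            self._pools.popitem(last=False)  # evict least recently used

    def evict_idle(self) -> None:
        # Drop pools whose last use is older than the TTL.
        now = time.monotonic()
        for sid in [s for s, (_, t) in self._pools.items()
                    if now - t > self.idle_ttl]:
            del self._pools[sid]
```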

Async Operations

All database operations use async/await:
async def execute_query(self, server_id: str, server_config: dict,
                        query: str, params: dict):
    # Method on the pool manager: server_id/server_config select
    # which server's pool to use.
    pool = await self.get_pool(server_id, server_config)
    async with pool.acquire() as conn:
        async with conn.cursor() as cursor:
            await cursor.execute(query, params)
            return await cursor.fetchall()

Security Considerations

  • Credentials stored encrypted in backend database
  • Never logged or exposed in responses
  • Passed securely to connector services
  • SSL/TLS support for database connections
  • Certificate validation options
  • Encrypted data in transit
  • Parameterized queries prevent SQL injection
  • Input validation via Pydantic
  • Query timeout limits
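The injection protection comes from binding user input as parameters rather than splicing it into SQL. The connectors use driver-level placeholders (asyncpg and ODBC each have their own syntax); the principle can be demonstrated with stdlib sqlite3:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

# Attacker-controlled input is bound as data, never spliced into SQL.
malicious = "alice' OR '1'='1"
rows = conn.execute(
    "SELECT name FROM users WHERE name = ?", (malicious,)
).fetchall()
print(rows)  # → [] : the injection attempt matches no row

safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", ("alice",)
).fetchall()
print(safe)  # → [('alice',)]
```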

Connector Registry

When a connector starts, it registers with the backend:
{
  "name": "postgresql",
  "type": "database",
  "endpoint": "http://localhost:8027",
  "config_schema": {...},
  "tools": [
    {
      "name": "list_tables",
      "description": "List all tables",
      "parameters": {}
    }
  ]
}
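Assembling that registration payload is straightforward; a sketch using the field names from the JSON above (the helper name is hypothetical, and the HTTP call that would deliver it to the backend is omitted):

```python
import json

def build_registration(name: str, port: int, tools: list) -> str:
    """Serialize a startup registration payload like the one above."""
    payload = {
        "name": name,
        "type": "database",
        "endpoint": f"http://localhost:{port}",
        "config_schema": {},  # the connector's config schema, omitted here
        "tools": tools,
    }
    return json.dumps(payload)

body = build_registration("postgresql", 8027, [
    {"name": "list_tables", "description": "List all tables", "parameters": {}},
])
```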

Development Workflow

1. Choose Template: start with an existing connector (PostgreSQL/MSSQL) as a template
2. Create Structure: copy the connector directory and update the naming
3. Define Schema: create the configuration model in schema.py
4. Implement Manager: write the connection pooling logic in db_manager.py
5. Add Tools: define tools in main.py
6. Test Locally: run the connector and test it with the dashboard
7. Containerize: build a Docker image for deployment

Best Practices

✅ DO

  • Use async/await throughout
  • Implement connection pooling
  • Add comprehensive error handling
  • Validate inputs with Pydantic
  • Include health check endpoint
  • Document tool parameters
  • Use parameterized queries

❌ DON'T

  • Block async operations with sync calls
  • Create connections without pooling
  • Hard-code credentials
  • Ignore connection limits
  • Skip input validation
  • Expose sensitive data in logs

Next Steps