
How to Connect OpenAI to Real-Time SaaS Data with Unified.to MCP Server


September 16, 2025

OpenAI's LLMs can power rich workflows — but most products still need access to live customer data from CRMs, ATSs, HRIS, or accounting systems. That usually means writing brittle glue code, normalizing APIs, and maintaining webhook jobs.

With Unified.to's MCP server, you can skip that complexity. Unified.to exposes 317+ SaaS integrations as real-time, callable tools that OpenAI can use directly — no custom integration logic required.

In this guide, we'll walk through how to connect OpenAI to Unified.to MCP so your application can give OpenAI real-time access to customer SaaS data and actions.

Authentication

Every request to the Unified.to MCP server must include a token. You can pass it either as a URL parameter (?token=...) or in an Authorization: Bearer {token} header.
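Both styles can be sketched in a few lines of Python. The base URL below is a placeholder, not a documented endpoint — use the MCP URL from your Unified.to workspace:

```python
import os

# Placeholder base URL; substitute the MCP server URL from your workspace.
MCP_BASE = "https://mcp.unified.to/sse"
token = os.getenv("UNIFIED_TOKEN", "example-token")

# Option 1: token as a URL parameter
url_with_token = f"{MCP_BASE}?token={token}"

# Option 2: token in the Authorization header
headers = {"Authorization": f"Bearer {token}"}
```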

There are two authentication flows:

  • Private (workspace key + connection)
    Use your Unified.to workspace API key and add a connection parameter. This should never be exposed publicly.
  • Public (end-user safe)
    Generate a token in the format {connectionID}-{nonce}-{signature} using your workspace secret. Safe to share with customers.
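To make the public-token format concrete, here is a sketch of assembling {connectionID}-{nonce}-{signature}. The nonce format and the signing scheme (HMAC-SHA256 over the connection ID and nonce) are illustrative assumptions only — consult the Unified.to documentation for the actual algorithm:

```python
import hashlib
import hmac
import secrets


def make_public_token(connection_id: str, workspace_secret: str) -> str:
    # ASSUMPTION: a random hex nonce; the required format may differ.
    nonce = secrets.token_hex(8)
    # ASSUMPTION: HMAC-SHA256 over "{connectionID}-{nonce}" keyed with the
    # workspace secret. The real signature scheme is defined by Unified.to.
    signature = hmac.new(
        workspace_secret.encode(),
        f"{connection_id}-{nonce}".encode(),
        hashlib.sha256,
    ).hexdigest()
    return f"{connection_id}-{nonce}-{signature}"
```

Because the signature is derived from the workspace secret, the secret itself never leaves your backend; only the assembled token is handed to the customer.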

Integrating with OpenAI (LLM API)

Once your MCP client is set up, you can connect it to OpenAI's API, which natively supports remote MCP servers.

Configure OpenAI to Use Unified MCP

When you send a request through OpenAI's Responses API, specify the MCP server as a tool source.

Here's a candidate assessment example:

import os

from openai import OpenAI

client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))

response = client.responses.create(
    model="gpt-4o",
    input="Score this candidate for the Software Engineer job.",
    tools=[{
        "type": "mcp",
        "server_label": "unifiedMCP",
        "server_url": MCP_URL,  # The Unified MCP URL with your token
        "require_approval": "never",
    }],
)

print(response.output_text)

No backend glue code required—OpenAI orchestrates the tool calls via Unified MCP.

Advanced MCP Options

Unified.to MCP gives you granular control over how tools are exposed to OpenAI:

  • permissions → restrict available scopes.
  • tools → allowlist specific tool IDs.
  • aliases → add synonyms so OpenAI better matches tool names.
  • hide_sensitive=true → automatically strip PII (emails, phone numbers, etc).
  • include_external_tools=true → expose all vendor API endpoints, not just Unified.to's normalized models.

These options help you keep OpenAI outputs predictable and secure in production-grade workflows.
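As an illustration, these options can be passed as query parameters on the MCP server URL, as the sample snippet below does. The host is a placeholder, and the tool IDs are examples:

```python
from urllib.parse import urlencode

token = "YOUR_UNIFIED_TOKEN"  # workspace key or public token

params = urlencode({
    "token": token,
    # Allowlist of specific tool IDs (example IDs)
    "tools": "get_messaging_message,list_messaging_message",
    # Strip PII (emails, phone numbers, etc.) from tool responses
    "hide_sensitive": "true",
    # Also expose raw vendor API endpoints, not just normalized models
    "include_external_tools": "true",
})

# Placeholder host; use your workspace's MCP URL.
mcp_url = f"https://mcp.unified.to/sse?{params}"
```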

Sample Snippet

import OpenAI from 'openai';

// OpenAI model version ('latest' resolves dynamically below)
const modelVersion = 'latest';
// Unified.to connection ID whose tools the MCP server should expose
const connection = 'UNIFIED_CONNECTION_ID';
// Data center where your Unified.to account was created
const dc = 'us';
// Optional: restrict the MCP server to specific tool IDs
const toolIds = ['get_messaging_message', 'list_messaging_message'];
// Whether to also expose raw vendor API endpoints
const includeExternal = false;
// The user prompt to send to the model (example placeholder)
const message = 'Summarize my latest messages.';

const openai = new OpenAI({
    apiKey: process.env.OPENAI_API_KEY || '',
});

const params = new URLSearchParams({
    token: process.env.UNIFIED_API_KEY || '',
    type: 'openai',
    dc,
    connection,
    include_external_tools: includeExternal ? 'true' : 'false',
});

if (toolIds.length > 0) {
    params.append('tools', toolIds.join(','));
}

const serverUrl = `${process.env.UNIFIED_MCP_URL}/sse?${params.toString()}`;

let latestModel;
if (modelVersion === 'latest') {
    // Resolve 'latest' by asking OpenAI for its model list
    const models = await openai.models.list();
    latestModel = models.data[0].id;
} else {
    latestModel = modelVersion;
}

const completion = await openai.responses.create({
    model: latestModel,
    tools: [
        {
            type: 'mcp',
            server_label: 'unifiedMCP',
            server_url: serverUrl, // change URL as needed
            require_approval: 'never',
        },
    ],
    instructions: 'You are a helpful assistant',
    input: message,
});

for (const item of completion.output) {
    console.log(JSON.stringify(item, null, 2));
}

Coverage and Infrastructure

Unified.to MCP is the most complete hosted MCP server available today:

  • 20,421+ real-time tools (growing weekly)
  • 335+ integrations across 21 categories (ATS, CRM, HRIS, Accounting, Messaging, File Storage, and more)
  • Zero-storage architecture — no caching, no liability
  • Scoped security controls — permissions, aliases, and PII redaction
  • Multi-region deployment — US, EU, and AU data centers for compliance

This ensures your OpenAI workflows aren't just demos — they're secure, scalable, and designed for production-grade use cases.

Unified.to MCP works across all major LLM providers: OpenAI, Anthropic, Google Gemini, and Cohere. That means you can build once and connect to any agent client.

Note: Unified.to MCP is currently in beta and should not be used in production systems yet. Contact us if you'd like to explore production use.
