
How to Connect LLMs to Real-Time SaaS Data with Unified.to MCP Server


September 3, 2025

Unified's MCP server lets your application give an LLM API real-time access to your customers' SaaS data and act on that data, so you don't have to write messy custom business-logic code for every integration.

For example, imagine you're building a candidate assessment workflow. Normally, you'd call the API to fetch the data and then feed it into the LLM for analysis. With the Unified MCP server, OpenAI (or another LLM API) can access the data and perform actions directly, making the workflow much simpler and cleaner.

## The Flow

Here's how an LLM API connects to Unified's MCP server:

  • Your app connects to the MCP server using the HTTP endpoint.
  • You authenticate with your Unified.to API token and provide one of your customers' connection IDs.
  • The LLM can then discover and call tools (like "fetch candidate", "score candidate", "update job status") in real time.

## Authentication

You must provide a token to the MCP server, either as a URL parameter (?token={token}) or in the Authorization header (Authorization: Bearer {token}).

Private (Direct LLM API):

Use your Unified.to workspace API key as the token, and include a connection parameter for the customer's connection ID.

Example:

```
https://mcp-api.unified.to/mcp?token=<YOUR_API_KEY>&connection=<CONNECTION_ID>
```


_Note: Do not expose this token publicly._
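
For illustration, here's a minimal sketch of both authentication styles using the requests library. The tools/list payload is the standard JSON-RPC discovery method from the MCP specification, not something Unified-specific, and a production client should use an MCP SDK (such as mcp-use, shown below), which also performs the protocol's initialize handshake for you:

```python
import requests

BASE_URL = "https://mcp-api.unified.to/mcp"
API_KEY = "<YOUR_API_KEY>"          # your Unified.to workspace API key
CONNECTION_ID = "<CONNECTION_ID>"   # one customer's connection ID

# "tools/list" is the standard MCP JSON-RPC method for tool discovery.
payload = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

# Option 1: token as a URL parameter.
r1 = requests.post(
    BASE_URL,
    params={"token": API_KEY, "connection": CONNECTION_ID},
    json=payload,
)

# Option 2: token in the Authorization header.
r2 = requests.post(
    BASE_URL,
    params={"connection": CONNECTION_ID},
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
)

print(r1.status_code, r2.status_code)
```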


## Building the application with an LLM API and MCP


If your application is using an LLM API that supports MCP (like OpenAI), you can use the [`mcp-use`](https://github.com/mcp-use/mcp-use) Python package to connect to Unified MCP and access your customer's data and actions.


### Dependencies


Here's the full list of dependencies and the setup you need to get things running:


```bash
mkdir unified-mcp-client
cd unified-mcp-client
python -m venv .venv
source .venv/bin/activate
pip install mcp-use python-dotenv requests openai
touch client.py
```

### Setting up your secrets

Add your Unified.to workspace API key, your customer's connection ID, and (for the OpenAI example later) your OpenAI API key to a .env file, and keep that file out of version control:

```bash
echo "UNIFIED_API_KEY=<your Unified API key here>" >> .env
echo "CONNECTION_ID=<your Unified connection ID here>" >> .env
echo "OPENAI_API_KEY=<your OpenAI API key here>" >> .env
echo ".env" >> .gitignore
```

### The MCP Client Class

Here's an example using the mcp-use package:

```python
# client.py
import os

from dotenv import load_dotenv
from mcp_use import MCPClient

load_dotenv()

CONNECTION_ID = os.getenv("CONNECTION_ID")
TOKEN = os.getenv("UNIFIED_API_KEY")

# The Unified MCP endpoint, with the token and connection as URL parameters
MCP_URL = f"https://mcp-api.unified.to/mcp?token={TOKEN}&connection={CONNECTION_ID}"

client = MCPClient(MCP_URL)

# List available tools
tools = client.list_tools()
print("Available tools:", [tool["id"] for tool in tools])

# Call a tool (example: call the first tool with no arguments)
if tools:
    tool_id = tools[0]["id"]
    result = client.call_tool(tool_id, {})
    print("Tool result:", result)
```

## Integrating with OpenAI (LLM API)

Once your MCP URL is set up, you can hand it to OpenAI's API, which natively supports remote MCP servers.

### Configure OpenAI to Use Unified MCP

When you send a request, specify the MCP server as a tool source. Note that OpenAI's hosted MCP tool is part of the Responses API, so the example below uses responses.create rather than chat.completions.

Here's a candidate assessment example:

```python
# openai_example.py
import os

from dotenv import load_dotenv
from openai import OpenAI

load_dotenv()

openai_client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))

MCP_URL = (
    "https://mcp-api.unified.to/mcp"
    f"?token={os.getenv('UNIFIED_API_KEY')}&connection={os.getenv('CONNECTION_ID')}"
)

response = openai_client.responses.create(
    model="gpt-4o",
    input="Score this candidate for the Software Engineer job.",
    tools=[{
        "type": "mcp",
        "server_label": "unified",
        "server_url": MCP_URL,  # the Unified MCP URL with your token
        "require_approval": "never",
    }],
)

print(response.output_text)
```

No backend glue code is required: OpenAI orchestrates the tool calls via Unified MCP.

## Integrating with Anthropic API

Manual Tool Orchestration (see the full sketch after the steps):

  1. Call Unified's /tools endpoint and pass the tool list to Claude:
    resp = client.messages.create(
        model="claude-3-5-sonnet-20241022",
        max_tokens=1024,
        tools=tools,
        messages=[{
            "role": "user",
            "content": "List the candidates and then analyze the resumes from their applications.",
        }],
    )
    
  2. Claude will return a tool_use block:
    [
      {
        "type": "tool_use",
        "id": "toolu_...",
        "name": "list_candidates",
        "input": { "limit": "100" }
      }
    ]
    
  3. Call Unified's /tools/{id}/call endpoint with the arguments.
  4. Return the result to Claude as a tool_result block in your next message:
    [
      {
        "type": "tool_result",
        "tool_use_id": "toolu_...",
        "content": "..."
      }
    ]
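
Putting the four steps together, here's a consolidated sketch (assuming pip install anthropic and an ANTHROPIC_API_KEY in your environment). The Claude message shapes follow Anthropic's documented tool-use flow; the https://api.unified.to/tools paths and the idea of calling a tool by its name are inferred from the steps above, so check the exact URLs, auth, and parameters (such as the connection ID) against Unified's API reference:

```python
# anthropic_loop.py
import os

import anthropic
import requests

UNIFIED_HEADERS = {"Authorization": f"Bearer {os.getenv('UNIFIED_API_KEY')}"}

def call_unified_tool(name: str, arguments: dict):
    # Hypothetical wrapper for step 3; assumes the tool's name doubles as
    # its {id} in /tools/{id}/call.
    r = requests.post(f"https://api.unified.to/tools/{name}/call",
                      headers=UNIFIED_HEADERS, json=arguments)
    return r.text

# Step 1: fetch the tool list and hand it to Claude.
tools = requests.get("https://api.unified.to/tools", headers=UNIFIED_HEADERS).json()

client = anthropic.Anthropic()
messages = [{
    "role": "user",
    "content": "List the candidates and then analyze the resumes from their applications.",
}]

while True:
    resp = client.messages.create(
        model="claude-3-5-sonnet-20241022",
        max_tokens=1024,
        tools=tools,
        messages=messages,
    )
    if resp.stop_reason != "tool_use":
        break
    # Steps 2-4: execute each tool_use block via Unified and reply with
    # matching tool_result blocks.
    messages.append({"role": "assistant", "content": resp.content})
    messages.append({
        "role": "user",
        "content": [{
            "type": "tool_result",
            "tool_use_id": block.id,
            "content": call_unified_tool(block.name, block.input),
        } for block in resp.content if block.type == "tool_use"],
    })

print(resp.content[0].text)
```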
    

## Integrating with Google Gemini API

Manual Tool Orchestration (see the full sketch after the steps):

  1. Call Unified's /tools endpoint and pass the tool list as function_declarations to Gemini.
  2. Gemini will return a function call request:
    function_call {
      name: "list_candidates"
      args { fields { key: "limit" value { string_value: "100" } } }
    }
    
  3. Call Unified's /tools/{id}/call endpoint.
  4. Respond to Gemini with the tool result as a functionResponse in your next message:
    {
      "role": "user",
      "parts": [
        {
          "functionResponse": {
            "name": "list_candidates",
            "response": { ... }
          }
        }
      ]
    }
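
The same loop against Gemini's REST API needs only the requests package installed earlier. The generateContent request and response shapes follow Google's function-calling documentation; GEMINI_API_KEY, the assumption that Unified's /tools list is already in Gemini's functionDeclarations schema (check whether a type parameter is needed, as in the Cohere section), and the Unified endpoint details are assumptions as in the Anthropic sketch:

```python
# gemini_loop.py
import os

import requests

GEMINI_URL = (
    "https://generativelanguage.googleapis.com/v1beta/models/"
    f"gemini-1.5-pro:generateContent?key={os.getenv('GEMINI_API_KEY')}"
)
UNIFIED_HEADERS = {"Authorization": f"Bearer {os.getenv('UNIFIED_API_KEY')}"}

# Step 1: Unified's tool list becomes Gemini function declarations.
declarations = requests.get("https://api.unified.to/tools",
                            headers=UNIFIED_HEADERS).json()

contents = [{"role": "user", "parts": [{"text": "List the candidates."}]}]

while True:
    resp = requests.post(GEMINI_URL, json={
        "contents": contents,
        "tools": [{"functionDeclarations": declarations}],
    }).json()
    parts = resp["candidates"][0]["content"]["parts"]
    calls = [p["functionCall"] for p in parts if "functionCall" in p]
    if not calls:
        break

    # Steps 2-4: execute each requested tool via Unified, then reply
    # with a functionResponse part.
    contents.append({"role": "model", "parts": parts})
    responses = []
    for call in calls:
        result = requests.post(
            f"https://api.unified.to/tools/{call['name']}/call",
            headers=UNIFIED_HEADERS, json=call.get("args", {}),
        ).json()
        responses.append({"functionResponse": {"name": call["name"],
                                               "response": result}})
    contents.append({"role": "user", "parts": responses})

print(parts[0]["text"])
```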
    

## Integrating with Cohere

Manual Tool Orchestration (see the full sketch after the steps):

  1. Call Unified's /tools?type=cohere endpoint and pass the tool list to Cohere:
    response = co.chat(
        model="command-a-03-2025", messages=messages, tools=tools
    )
    
  2. When Cohere requests a tool call, call Unified's /tools/{id}/call endpoint.
  3. Pass the tool result back to Cohere in your next message.
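
And a matching sketch with Cohere's v2 Python SDK (assuming pip install cohere and a COHERE_API_KEY). The assistant and tool message shapes follow Cohere's tool-use documentation; the Unified endpoints carry the same assumptions as the earlier sketches:

```python
# cohere_loop.py
import json
import os

import cohere
import requests

UNIFIED_HEADERS = {"Authorization": f"Bearer {os.getenv('UNIFIED_API_KEY')}"}

# Step 1: fetch the tool list in Cohere's schema.
tools = requests.get("https://api.unified.to/tools?type=cohere",
                     headers=UNIFIED_HEADERS).json()

co = cohere.ClientV2(api_key=os.getenv("COHERE_API_KEY"))
messages = [{"role": "user", "content": "List the candidates."}]

while True:
    response = co.chat(model="command-a-03-2025", messages=messages, tools=tools)
    if not response.message.tool_calls:
        break

    # Step 2: echo the assistant turn, then satisfy each tool call (step 3)
    # and return the results as tool messages.
    messages.append({
        "role": "assistant",
        "tool_calls": response.message.tool_calls,
        "tool_plan": response.message.tool_plan,
    })
    for tc in response.message.tool_calls:
        result = requests.post(
            f"https://api.unified.to/tools/{tc.function.name}/call",
            headers=UNIFIED_HEADERS,
            json=json.loads(tc.function.arguments),
        ).json()
        messages.append({
            "role": "tool",
            "tool_call_id": tc.id,
            "content": [{"type": "document",
                         "document": {"data": json.dumps(result)}}],
        })

print(response.message.content[0].text)
```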

## Example Prompt and Response

Prompt:

"Score this candidate for the Software Engineer job."

What happens:

  • Your chosen LLM discovers the available tools from Unified MCP (e.g., fetch-candidate, fetch-job, score-candidate).
  • The LLM calls fetch-candidate and fetch-job tools to get the data.
  • The LLM calls score-candidate with the data.
  • The LLM returns a response like:

Response:

Candidate Jane Doe scored 92/100 for the Software Engineer job. Strengths: Python, distributed systems. Recommended for interview.