# About
A lightweight wrapper around mcp-swift-sdk that integrates with LLM Swift libraries such as SwiftOpenAI and SwiftAnthropic to easily create your own MCP Clients in macOS applications.
# What is MCP?
The Model Context Protocol (MCP) is an open protocol that enables AI models to securely interact with local and remote resources through standardized server implementations. MCP allows AI models to:
- Discover available tools
- Call these tools with parameters
- Receive responses in a standardized format
By using MCP, developers can create applications that give AI models controlled access to functionality like accessing files, making API calls, or running code.
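Concretely, that discover/call/respond cycle maps onto this wrapper's API roughly as follows. This is a sketch, not runnable as-is: it assumes `client` is an already-initialized `MCPClient` (see Getting Started), and the tool name and input shown are illustrative values, not guaranteed to match the GitHub server's actual schema.

```swift
// 1. Discover available tools (here, converted to Anthropic's format)
let tools = try await client.anthropicTools()

// 2. Call one of those tools with parameters
//    ("search_repositories" and its input are hypothetical examples)
let result = await client.anthropicCallTool(
    name: "search_repositories",
    input: ["query": "mcp"],
    debug: true)

// 3. `result` holds the tool's response in a standardized format
```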
# Requirements
- macOS 14.0+
- Swift 6.0+
- Xcode 16.0+
- npm/npx installed on your system
# Installing npm/npx

`npx` is included with npm since npm version 5.2.0. To install npm, you can:

- Install via Homebrew:

  ```shell
  brew install node
  ```

- Or download and install directly from the Node.js website

After installation, verify it's working:

```shell
npx --version
```
# Installation

## Swift Package Manager

Add the following dependency to your `Package.swift` file:

```swift
dependencies: [
    .package(url: "https://github.com/jamesrochabrun/MCPSwiftWrapper", from: "1.0.0")
]
```

Or add it directly in Xcode:

- File > Add Package Dependencies
- Enter the GitHub URL: `https://github.com/jamesrochabrun/MCPSwiftWrapper`
- Click Add Package
> ⚠️ **Important:** When building a client macOS app, you need to disable app sandboxing, because the app must launch a subprocess for each MCP server. You can disable sandboxing in your app's entitlements file.
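For reference, disabling the sandbox means setting the `com.apple.security.app-sandbox` entitlement to `false` in your app's `.entitlements` file (the key name comes from Apple's App Sandbox documentation; the surrounding plist structure below is the standard boilerplate Xcode generates):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- App Sandbox disabled so the app can spawn MCP server processes via npx -->
    <key>com.apple.security.app-sandbox</key>
    <false/>
</dict>
</plist>
```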
# Getting Started

## Initializing a Client
To use MCP tools, you need to initialize an MCP client that connects to a tool provider. The example below shows one way to create a GitHub MCP client:
```swift
import Foundation
import MCPSwiftWrapper

final class GithubMCPClient {

    init() {
        Task {
            do {
                self.client = try await MCPClient(
                    info: .init(name: "GithubMCPClient", version: "1.0.0"),
                    transport: .stdioProcess(
                        "npx",
                        args: ["-y", "@modelcontextprotocol/server-github"],
                        verbose: true),
                    capabilities: .init())
                clientInitialized.continuation.yield(self.client)
                clientInitialized.continuation.finish()
            } catch {
                print("Failed to initialize MCPClient: \(error)")
                clientInitialized.continuation.yield(nil)
                clientInitialized.continuation.finish()
            }
        }
    }

    /// Get the initialized client using Swift async/await
    func getClientAsync() async throws -> MCPClient? {
        for await client in clientInitialized.stream {
            return client
        }
        return nil // Stream completed without a client
    }

    private var client: MCPClient?
    private let clientInitialized = AsyncStream.makeStream(of: MCPClient?.self)
}
```
# Usage with Anthropic

## Getting Tools for Anthropic
```swift
// Initialize an MCP client
let githubClient = GithubMCPClient()

// Get the MCP client and fetch available tools
if let client = try await githubClient.getClientAsync() {
    // Get available tools from MCP and convert them to Anthropic format
    let tools = try await client.anthropicTools()

    // Now you can use these tools with Anthropic's API
    // Pass them in your AnthropicParameters when making requests
}
```
## Handling Tool Calls for Anthropic

When an Anthropic model requests a tool, you need to handle the tool use and call the tool via MCP:
```swift
// When processing a message from Anthropic that contains tool usage
switch contentItem {
case .text(let text, _):
    // Handle regular text response...
    break

case .toolUse(let tool):
    print("Tool use detected - Name: \(tool.name), ID: \(tool.id)")

    // Update UI or state to show tool use
    // ...

    // Call the tool via MCP
    let toolResponse = await mcpClient.anthropicCallTool(
        name: tool.name,   // Name of the tool from Anthropic's response
        input: tool.input, // Input parameters from Anthropic's request
        debug: true)       // Enable debug logging

    if let toolResult = toolResponse {
        // Add the tool result back to the conversation
        anthropicMessages.append(AnthropicMessage(
            role: .user,
            content: .list([.toolResult(tool.id, toolResult)])))

        // Continue the conversation with the tool result
        // ...
    }
}
```
# Usage with OpenAI

## Getting Tools for OpenAI
```swift
// Initialize an MCP client
let githubClient = GithubMCPClient()

// Get the MCP client and fetch available tools
if let client = try await githubClient.getClientAsync() {
    // Get available tools from MCP and convert them to OpenAI format
    let tools = try await client.openAITools()

    // Now you can use these tools with OpenAI's API
    // Pass them in your OpenAIParameters when making requests
}
```
## Handling Tool Calls for OpenAI

When an OpenAI model requests a tool, you need to extract the tool information and call it via MCP:
```swift
// If the message contains tool calls
if let toolCalls = message.toolCalls, !toolCalls.isEmpty {
    for toolCall in toolCalls {
        let function = toolCall.function
        guard
            let id = toolCall.id,
            let name = function.name,
            let argumentsData = function.arguments.data(using: .utf8)
        else {
            continue
        }

        // Parse arguments from a JSON string into a dictionary
        let arguments: [String: Any]
        do {
            guard let parsedArgs = try JSONSerialization.jsonObject(with: argumentsData) as? [String: Any] else {
                continue
            }
            arguments = parsedArgs
        } catch {
            print("Error parsing tool arguments: \(error)")
            continue
        }

        // Call tool via MCP
        let toolResponse = await mcpClient.openAICallTool(
            name: name,       // Name of the tool from OpenAI's response
            input: arguments, // Parsed arguments from OpenAI's request
            debug: true)      // Enable debug logging

        if let toolResult = toolResponse {
            // Add the tool result as a tool message
            openAIMessages.append(OpenAIMessage(
                role: .tool,
                content: .text(toolResult),
                toolCallID: id))

            // Continue the conversation with the tool result
            // ...
        }
    }
}
```
# Complete Example

Check out the included example application in the `Example/MCPClientChat` directory for a full implementation of a chat application using MCP with both Anthropic and OpenAI models.
# License
This project is available under the MIT license. See the LICENSE file for more info.
# Recommend MCP