Connecting to most available LLM Providers
This post summarizes the video "How we connect different model providers".
It provides the necessary NuGet packages, setup steps, and C# code for integrating various LLMs into the Microsoft Agent Framework.
Core Architecture
The framework is designed around the IChatClient interface (from Microsoft.Extensions.AI). Any provider that can be wrapped in this interface can power a Microsoft Agent.
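The pattern repeated in every section below can be captured in one provider-agnostic helper. This is a minimal sketch, assuming the ChatClientAgent and RunAsync APIs used in the snippets that follow; the Text property on the run result is an assumption about the response type:

```csharp
using Microsoft.Agents.AI;
using Microsoft.Extensions.AI;

// Works with any IChatClient, regardless of which LLM provider backs it.
static async Task<string> AskAsync(IChatClient chatClient, string prompt)
{
    var agent = new ChatClientAgent(chatClient);
    var response = await agent.RunAsync(prompt);
    return response.Text; // assumed: the run result exposes the reply as Text
}
```

Every provider section below is really just a different way of obtaining the IChatClient to pass into this helper.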
1. OpenAI
Source: platform.openai.com
NuGet: Microsoft.Agents.AI.OpenAI
Code:
var client = new OpenAIClient("YOUR_API_KEY");
IChatClient chatClient = client.AsChatClient("gpt-4o");
var agent = new ChatClientAgent(chatClient);
await agent.RunAsync("Hello!");
2. Google Gemini
Source: aistudio.google.com
NuGet: Unofficial community packages (Official coming soon).
Code:
var chatClient = new GoogleAiChatClient(apiKey: "YOUR_KEY", modelId: "gemini-1.5-pro");
var agent = new ChatClientAgent(chatClient);
3. Anthropic
Source: console.anthropic.com
NuGet: Community Anthropic SDK + Microsoft.Extensions.AI.
Note: MaxTokens is mandatory in ChatOptions.
var client = new AnthropicClient("YOUR_KEY");
var chatClient = client.AsChatClient();
var agent = new ChatClientAgent(chatClient, new ChatOptions {
ModelId = "claude-3-5-sonnet",
MaxTokens = 1000
});
4. Mistral
Source: admin.mistral.ai
NuGet: Mistral SDK.
var client = new MistralClient("YOUR_KEY");
var chatClient = client.AsChatClient();
var agent = new ChatClientAgent(chatClient, new ChatOptions { ModelId = "mistral-large-latest" });
5. xAI (Grok)
Source: x.ai
NuGet: Microsoft.Agents.AI.OpenAI (the API is OpenAI-compatible).
var options = new OpenAIClientOptions { Endpoint = new Uri("https://api.x.ai/v1") };
var client = new OpenAIClient("YOUR_KEY", options);
IChatClient chatClient = client.AsChatClient("grok-beta");
6. Azure OpenAI
Source: ai.azure.com
NuGet: Azure.AI.OpenAI
Note: Requires a "Deployment Name" instead of just a model ID.
var client = new AzureOpenAIClient(new Uri("YOUR_ENDPOINT"), new AzureKeyCredential("YOUR_KEY"));
IChatClient chatClient = client.AsChatClient("YOUR_DEPLOYMENT_NAME");
7. Microsoft Foundry (Cloud)
Source: Azure AI Foundry.
NuGet: Microsoft.Agents.AI.AzureAI
Note: Uses Azure credentials (CLI / Managed Identity) instead of API keys.
var projectClient = new AIProjectClient("YOUR_CONNECTION_STRING", new DefaultAzureCredential());
var agent = await projectClient.GetAgentAsync("AGENT_ID");
// Turn Foundry Agent into Framework Agent
var frameworkAgent = agent.AsAgent();
8. Amazon Bedrock
Source: aws.amazon.com/bedrock
NuGet: Official AWS Bedrock packages for .NET.
// Set AWS_BEARER_TOKEN_BEDROCK environment variable
var runtime = new AmazonBedrockRuntimeClient();
IChatClient chatClient = runtime.AsChatClient("anthropic.claude-3-sonnet");
9. Ollama (Local)
Source: Local installation of Ollama.
NuGet: OllamaSharp
var uri = new Uri("http://localhost:11434");
var client = new OllamaApiClient(uri);
IChatClient chatClient = client.AsChatClient("llama3");
10. Microsoft Foundry Local
Source: WinGet install of Microsoft Foundry Local.
NuGet: Microsoft.AI.FoundryLocal
Note: Runs models locally on the system NPU/GPU/CPU.
var model = await FoundryLocal.StartModelAsync("phi3");
var client = new OpenAIClient(new OpenAIClientOptions { Endpoint = model.Endpoint });
IChatClient chatClient = client.AsChatClient(model.ModelId);
11-14. Multi-Model Portals (OpenRouter, Together.ai, Cohere, HuggingFace)
These providers use the OpenAI-compatible standard. Use Microsoft.Agents.AI.OpenAI and change the endpoint.
| Provider | Endpoint |
|---|---|
| OpenRouter | https://openrouter.ai/api/v1 |
| Together.ai | https://api.together.xyz/v1 |
| Cohere | https://api.cohere.ai/compatibility/v1 |
| HuggingFace | https://router.huggingface.co/v1 |
Generic Code Pattern:
var options = new OpenAIClientOptions { Endpoint = new Uri("PROVIDER_ENDPOINT") };
var client = new OpenAIClient("YOUR_API_KEY", options);
IChatClient chatClient = client.AsChatClient("MODEL_NAME");
15. GitHub Models
Source: GitHub Personal Access Token (PAT) with model access.
NuGet: Azure.AI.Inference + Microsoft.Extensions.AI.
var client = new ChatCompletionsClient(new Uri("https://models.inference.ai.azure.com"), new AzureKeyCredential("GITHUB_PAT"));
IChatClient chatClient = client.AsChatClient();
var agent = new ChatClientAgent(chatClient, new ChatOptions { ModelId = "gpt-4o" });
Key Takeaway: The "magic" is the IChatClient abstraction. Once you have an IChatClient, you can pass it to any Agent in the framework, regardless of the underlying LLM provider.
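To make the takeaway concrete, here is a closing sketch that runs the same agent loop against a cloud model and a local one simply by swapping the IChatClient. It reuses the OpenAI and Ollama setup shown above; keys, model names, and the local port are the placeholder values from those sections:

```csharp
using Microsoft.Agents.AI;
using Microsoft.Extensions.AI;
using OllamaSharp;
using OpenAI;

IChatClient cloud = new OpenAIClient("YOUR_API_KEY").AsChatClient("gpt-4o");
IChatClient local = new OllamaApiClient(new Uri("http://localhost:11434")).AsChatClient("llama3");

foreach (var chatClient in new[] { cloud, local })
{
    // Identical agent code for both providers.
    var agent = new ChatClientAgent(chatClient);
    await agent.RunAsync("Hello!");
}
```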