Crawler Summary
MCP server for querying LLMs. LLM MCP Server: A Model Context Protocol (MCP) server that enables AI clients to query multiple Large Language Models (OpenAI GPT, Anthropic Claude, and Google Gemini) through a unified interface. Prerequisites: Node.js (v16 or higher recommended), npm (comes with Node.js), and API keys for the LLM providers you want to use: an OpenAI API key (for GPT models), an Anthropic API key (for Claude models), and a Google API key (for Gemini models). Published capability contract available. No trust telemetry is available yet. Last updated 2/24/2026.
Freshness
Last checked 2/22/2026
Best For
A capability contract is available with explicit auth modes and schema references.
Not Ideal For
llm-mcp is not ideal for teams that need stronger public trust telemetry, lower setup complexity, or more explicit contract coverage before production rollout.
Evidence Sources Checked
editorial-content, capability-contract, runtime-metrics, public facts pack
Public facts
6
Change events
1
Artifacts
0
Freshness
Feb 22, 2026
Published capability contract available. No trust telemetry is available yet. Last updated 2/24/2026.
Trust score
Unknown
Compatibility
MCP
Freshness
Feb 22, 2026
Vendor
Qduc
Artifacts
0
Benchmarks
0
Last release
1.0.0
Key links, install path, and a quick operational read before the deeper crawl record.
Summary
Published capability contract available. No trust telemetry is available yet. Last updated 2/24/2026.
Setup snapshot
git clone https://github.com/qduc/llm-mcp.git
Setup complexity is MEDIUM. Standard integration tests and API key provisioning are required before connecting this to production workloads.
Final validation: Expose the agent to a mock request payload inside a sandbox and trace the network egress before allowing access to real customer data.
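The sandbox check above can be sketched as a small stdio probe. This is an illustrative sketch, not vendor code: it assumes the server entry point sits at a placeholder path (path/to/llm-mcp/index.js) and speaks newline-delimited JSON-RPC as in the standard MCP stdio transport; it deliberately runs with an empty environment so no real API keys are exposed during the trial run.

```typescript
// Sandbox smoke test: drive the server over stdio with a mock MCP
// initialize request and observe its reply before granting real access.
// The server path and client name below are hypothetical placeholders.
import { spawn } from "node:child_process";

// Build a JSON-RPC 2.0 message for the MCP initialize handshake.
function buildInitialize(id: number): string {
  return JSON.stringify({
    jsonrpc: "2.0",
    id,
    method: "initialize",
    params: {
      protocolVersion: "2024-11-05",
      capabilities: {},
      clientInfo: { name: "sandbox-probe", version: "0.0.1" },
    },
  });
}

function probe(serverPath: string): void {
  // Empty env: no provider API keys leak into the sandboxed run.
  const child = spawn("node", [serverPath], {
    env: {},
    stdio: ["pipe", "pipe", "inherit"],
  });
  child.stdout.on("data", (chunk) =>
    console.log("server replied:", chunk.toString()),
  );
  child.stdin.write(buildInitialize(1) + "\n");
  setTimeout(() => child.kill(), 5000); // end the probe after 5s
}

// probe("path/to/llm-mcp/index.js");
```

Pair this with a network trace (for example, running the probe in a namespace with egress logging) to confirm the server only contacts the provider endpoints you expect.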
Everything public we have scraped or crawled about this agent, grouped by evidence type with provenance.
Vendor
Qduc
Protocol compatibility
MCP
Auth modes
mcp, api_key
Machine-readable schemas
OpenAPI or schema references published
Handshake status
UNKNOWN
Crawlable docs
6 indexed pages on the official domain
Merged public release, docs, artifact, benchmark, pricing, and trust refresh events.
Extracted files, examples, snippets, parameters, dependencies, permissions, and artifact metadata.
Extracted files
0
Examples
3
Snippets
0
Languages
typescript
bash
npm install
json
{
"ask-llm": {
"command": "node",
"args": [
"path/to/llm-mcp/index.js"
],
"env": {
"OPENAI_API_KEY": "your-openai-key",
"ANTHROPIC_API_KEY": "your-anthropic-key",
"GOOGLE_API_KEY": "your-google-key",
"OPENROUTER_API_KEY": "your-openrouter-key"
}
}
}
json
{
"question": "Summarize the README",
"model": "llama3.2",
"base_url": "http://localhost:1234/v1"
}
Full documentation captured from public sources, including the complete README when available.
Docs source
GITHUB MCP
Editorial quality
ready
A Model Context Protocol (MCP) server that enables AI clients to query multiple Large Language Models (OpenAI GPT, Anthropic Claude, and Google Gemini) through a unified interface.
Clone the repository and install dependencies:
npm install
This MCP server provides five tools for querying different AI models:
ask_gpt - Query OpenAI GPT models (default: gpt-4o-2024-11-20)
ask_claude - Query Anthropic Claude models (default: claude-4-sonnet)
ask_gemini - Query Google Gemini models (default: gemini-2.5-flash)
ask_qwen - Query Qwen models via OpenRouter (default: qwen/qwq-32b-preview)
ask_deepseek - Query DeepSeek models via OpenRouter (default: deepseek/deepseek-chat-v3-0324)
Each tool accepts the following parameters:
OpenAI GPT (ask_gpt):
gpt-4o-2024-11-20 (default) - Latest GPT-4 model
o3 - Reasoning model for complex problems
o4-mini - Faster reasoning model
Anthropic Claude (ask_claude):
claude-sonnet-4-20250514 (default) - Balanced performance
claude-3-5-haiku-20241022 - Speed optimized
Google Gemini (ask_gemini):
gemini-2.5-flash (default) - Price/performance balance
gemini-2.5-pro - Complex problems
gemini-2.5-flash-lite - Speed/cost optimized
Qwen via OpenRouter (ask_qwen):
qwen/qwq-32b-preview (default) - Reasoning tasks
qwen/qwen-2.5-72b-instruct - General tasks
DeepSeek via OpenRouter (ask_deepseek):
deepseek/deepseek-chat-v3-0324 (default) - Advanced reasoning and general tasks
This server is designed to work with MCP-compatible clients like Claude Desktop. Add it to your MCP client configuration to access the LLM querying tools.
{
"ask-llm": {
"command": "node",
"args": [
"path/to/llm-mcp/index.js"
],
"env": {
"OPENAI_API_KEY": "your-openai-key",
"ANTHROPIC_API_KEY": "your-anthropic-key",
"GOOGLE_API_KEY": "your-google-key",
"OPENROUTER_API_KEY": "your-openrouter-key"
}
}
}
You only need API keys for the models you plan to use.
## Local LLM (per-request base_url)
The `ask_local_llm` and `list_local_models` tools support an optional per-request `base_url` parameter. This lets you point a single MCP server at different OpenAI-compatible local LLM servers without changing environment variables. Example tool input for `ask_local_llm`:
```json
{
"question": "Summarize the README",
"model": "llama3.2",
"base_url": "http://localhost:1234/v1"
}
```
For `list_local_models`, pass `{ "base_url": "http://other-host:1234/v1" }` to list models from an alternate local server.
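The per-request base_url routing described above can be sketched as a small client helper that forwards a question to whichever OpenAI-compatible server the caller names. This is an illustrative sketch, not llm-mcp's actual implementation: the function names and the fallback default URL are assumptions; only the input fields (question, model, base_url) and the standard /chat/completions request shape come from the documentation.

```typescript
// Illustrative sketch: route a question to an OpenAI-compatible server
// chosen per request via base_url. Helper names are hypothetical.

interface LocalLlmInput {
  question: string;
  model: string;
  base_url?: string; // optional per-request override, e.g. http://localhost:1234/v1
}

// Build the /chat/completions URL and payload for a given tool input.
// The fallback default below is an assumption, not llm-mcp's documented default.
function buildChatRequest(input: LocalLlmInput) {
  return {
    url: `${input.base_url ?? "http://localhost:1234/v1"}/chat/completions`,
    body: {
      model: input.model,
      messages: [{ role: "user", content: input.question }],
    },
  };
}

async function askLocalLlm(input: LocalLlmInput): Promise<string> {
  const { url, body } = buildChatRequest(input);
  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
  const data = await res.json();
  return data.choices[0].message.content; // OpenAI-compatible response shape
}
```

Because the target URL is resolved per call, one running MCP server can serve requests against several local backends without a restart or environment change.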
Pull requests are welcome! For major changes, please open an issue first to discuss what you would like to change.
ISC
Machine endpoints, protocol fit, contract coverage, invocation examples, and guardrails for agent-to-agent use.
Contract coverage
Status
ready
Auth
mcp, api_key
Streaming
No
Data region
global
Protocol support
Requires: mcp, lang:typescript
Forbidden: none
Guardrails
Operational confidence: medium
curl -s "https://xpersona.co/api/v1/agents/mcp-qduc-llm-mcp/snapshot"
curl -s "https://xpersona.co/api/v1/agents/mcp-qduc-llm-mcp/contract"
curl -s "https://xpersona.co/api/v1/agents/mcp-qduc-llm-mcp/trust"
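The invocation guide for this listing publishes a retry policy (up to 3 attempts, backoff of 500, 1500, and 3500 ms, retrying on HTTP_429, HTTP_503, and NETWORK_TIMEOUT). A minimal client honoring that policy around the endpoints above can be sketched as follows; the helper names are illustrative, not part of any published SDK.

```typescript
// Sketch of a fetch wrapper honoring the listing's published retry policy:
// up to 3 attempts, backoff 500/1500/3500 ms, retry on 429/503/timeouts.

const BACKOFF_MS = [500, 1500, 3500];

// Retry only on the conditions the invocation guide marks retryable.
function shouldRetry(status: number): boolean {
  return status === 429 || status === 503;
}

async function fetchWithRetry(url: string, maxAttempts = 3): Promise<unknown> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    let res: Response | null = null;
    try {
      res = await fetch(url);
    } catch (err) {
      lastError = err; // network failure/timeout: retryable
    }
    if (res) {
      if (res.ok) return res.json();
      if (!shouldRetry(res.status)) throw new Error(`HTTP ${res.status}`);
      lastError = new Error(`HTTP ${res.status}`); // retryable status
    }
    if (attempt < maxAttempts - 1) {
      await new Promise((r) => setTimeout(r, BACKOFF_MS[attempt]));
    }
  }
  throw lastError;
}

// fetchWithRetry("https://xpersona.co/api/v1/agents/mcp-qduc-llm-mcp/snapshot");
```

Non-retryable statuses (e.g. 404) fail immediately, matching the guide's explicit retryableConditions list.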
Trust and runtime signals, benchmark suites, failure patterns, and practical risk constraints.
Trust signals
Handshake
UNKNOWN
Confidence
unknown
Attempts 30d
unknown
Fallback rate
unknown
Runtime metrics
Observed P50
unknown
Observed P95
unknown
Rate limit
unknown
Estimated cost
unknown
Every public screenshot, visual asset, demo link, and owner-provided destination tied to this agent.
Neighboring agents from the same protocol and source ecosystem for comparison and shortlist building.
Rank
83
A Model Context Protocol (MCP) server for GitLab
Traction
No public download signal
Freshness
Updated 2d ago
Rank
80
A Model Context Protocol (MCP) server for GitLab
Traction
No public download signal
Freshness
Updated 2d ago
Rank
74
Expose OpenAPI definition endpoints as MCP tools using the official Rust SDK for the Model Context Protocol (https://github.com/modelcontextprotocol/rust-sdk)
Traction
No public download signal
Freshness
Updated 2d ago
Rank
72
An actix_web backend for the official Rust SDK for the Model Context Protocol (https://github.com/modelcontextprotocol/rust-sdk)
Traction
No public download signal
Freshness
Updated 2d ago
Contract JSON
{
"contractStatus": "ready",
"authModes": [
"mcp",
"api_key"
],
"requires": [
"mcp",
"lang:typescript"
],
"forbidden": [],
"supportsMcp": true,
"supportsA2a": false,
"supportsStreaming": false,
"inputSchemaRef": "https://github.com/qduc/llm-mcp#input",
"outputSchemaRef": "https://github.com/qduc/llm-mcp#output",
"dataRegion": "global",
"contractUpdatedAt": "2026-02-24T19:44:40.169Z",
"sourceUpdatedAt": "2026-02-24T19:44:40.169Z",
"freshnessSeconds": 4434566
}
Invocation Guide
{
"preferredApi": {
"snapshotUrl": "https://xpersona.co/api/v1/agents/mcp-qduc-llm-mcp/snapshot",
"contractUrl": "https://xpersona.co/api/v1/agents/mcp-qduc-llm-mcp/contract",
"trustUrl": "https://xpersona.co/api/v1/agents/mcp-qduc-llm-mcp/trust"
},
"curlExamples": [
"curl -s \"https://xpersona.co/api/v1/agents/mcp-qduc-llm-mcp/snapshot\"",
"curl -s \"https://xpersona.co/api/v1/agents/mcp-qduc-llm-mcp/contract\"",
"curl -s \"https://xpersona.co/api/v1/agents/mcp-qduc-llm-mcp/trust\""
],
"jsonRequestTemplate": {
"query": "summarize this repo",
"constraints": {
"maxLatencyMs": 2000,
"protocolPreference": [
"MCP"
]
}
},
"jsonResponseTemplate": {
"ok": true,
"result": {
"summary": "...",
"confidence": 0.9
},
"meta": {
"source": "GITHUB_MCP",
"generatedAt": "2026-04-17T03:34:06.951Z"
}
},
"retryPolicy": {
"maxAttempts": 3,
"backoffMs": [
500,
1500,
3500
],
"retryableConditions": [
"HTTP_429",
"HTTP_503",
"NETWORK_TIMEOUT"
]
}
}
Trust JSON
{
"status": "unavailable",
"handshakeStatus": "UNKNOWN",
"verificationFreshnessHours": null,
"reputationScore": null,
"p95LatencyMs": null,
"successRate30d": null,
"fallbackRate": null,
"attempts30d": null,
"trustUpdatedAt": null,
"trustConfidence": "unknown",
"sourceUpdatedAt": null,
"freshnessSeconds": null
}
Capability Matrix
{
"rows": [
{
"key": "MCP",
"type": "protocol",
"support": "supported",
"confidenceSource": "contract",
"notes": "Confirmed by capability contract"
},
{
"key": "mcp",
"type": "capability",
"support": "supported",
"confidenceSource": "profile",
"notes": "Declared in agent profile metadata"
},
{
"key": "llm",
"type": "capability",
"support": "supported",
"confidenceSource": "profile",
"notes": "Declared in agent profile metadata"
},
{
"key": "openai",
"type": "capability",
"support": "supported",
"confidenceSource": "profile",
"notes": "Declared in agent profile metadata"
}
],
"flattenedTokens": "protocol:MCP|supported|contract capability:mcp|supported|profile capability:llm|supported|profile capability:openai|supported|profile"
}
Facts JSON
[
{
"factKey": "docs_crawl",
"category": "integration",
"label": "Crawlable docs",
"value": "6 indexed pages on the official domain",
"href": "https://github.com/login?return_to=https%3A%2F%2Fgithub.com%2Fopenclaw%2Fskills%2Ftree%2Fmain%2Fskills%2Fasleep123%2Fcaldav-calendar",
"sourceUrl": "https://github.com/login?return_to=https%3A%2F%2Fgithub.com%2Fopenclaw%2Fskills%2Ftree%2Fmain%2Fskills%2Fasleep123%2Fcaldav-calendar",
"sourceType": "search_document",
"confidence": "medium",
"observedAt": "2026-04-15T05:03:46.393Z",
"isPublic": true
},
{
"factKey": "protocols",
"category": "compatibility",
"label": "Protocol compatibility",
"value": "MCP",
"href": "https://xpersona.co/api/v1/agents/mcp-qduc-llm-mcp/contract",
"sourceUrl": "https://xpersona.co/api/v1/agents/mcp-qduc-llm-mcp/contract",
"sourceType": "contract",
"confidence": "high",
"observedAt": "2026-02-24T19:44:40.169Z",
"isPublic": true
},
{
"factKey": "auth_modes",
"category": "compatibility",
"label": "Auth modes",
"value": "mcp, api_key",
"href": "https://xpersona.co/api/v1/agents/mcp-qduc-llm-mcp/contract",
"sourceUrl": "https://xpersona.co/api/v1/agents/mcp-qduc-llm-mcp/contract",
"sourceType": "contract",
"confidence": "high",
"observedAt": "2026-02-24T19:44:40.169Z",
"isPublic": true
},
{
"factKey": "schema_refs",
"category": "artifact",
"label": "Machine-readable schemas",
"value": "OpenAPI or schema references published",
"href": "https://github.com/qduc/llm-mcp#input",
"sourceUrl": "https://xpersona.co/api/v1/agents/mcp-qduc-llm-mcp/contract",
"sourceType": "contract",
"confidence": "high",
"observedAt": "2026-02-24T19:44:40.169Z",
"isPublic": true
},
{
"factKey": "vendor",
"category": "vendor",
"label": "Vendor",
"value": "Qduc",
"href": "https://github.com/qduc/llm-mcp",
"sourceUrl": "https://github.com/qduc/llm-mcp",
"sourceType": "profile",
"confidence": "medium",
"observedAt": "2026-02-24T19:43:14.176Z",
"isPublic": true
},
{
"factKey": "handshake_status",
"category": "security",
"label": "Handshake status",
"value": "UNKNOWN",
"href": "https://xpersona.co/api/v1/agents/mcp-qduc-llm-mcp/trust",
"sourceUrl": "https://xpersona.co/api/v1/agents/mcp-qduc-llm-mcp/trust",
"sourceType": "trust",
"confidence": "medium",
"observedAt": null,
"isPublic": true
}
]
Change Events JSON
[
{
"eventType": "docs_update",
"title": "Docs refreshed: Sign in to GitHub · GitHub",
"description": "Fresh crawlable documentation was indexed for the official domain.",
"href": "https://github.com/login?return_to=https%3A%2F%2Fgithub.com%2Fopenclaw%2Fskills%2Ftree%2Fmain%2Fskills%2Fasleep123%2Fcaldav-calendar",
"sourceUrl": "https://github.com/login?return_to=https%3A%2F%2Fgithub.com%2Fopenclaw%2Fskills%2Ftree%2Fmain%2Fskills%2Fasleep123%2Fcaldav-calendar",
"sourceType": "search_document",
"confidence": "medium",
"observedAt": "2026-04-15T05:03:46.393Z",
"isPublic": true
}
]