Crawler Summary
Universal OpenAI Vector Store MCP Server - Deploy via Cloudflare Workers, NPM package, or local development. Complete 21-tool coverage with file management capabilities and enhanced Roo compatibility. Universal OpenAI Vector Store MCP Server A production-ready Model Context Protocol (MCP) server that provides comprehensive OpenAI Vector Store API access through multiple deployment options. This server enables AI assistants like Claude, Roo, and other MCP clients to manage vector stores, files, and batch operations seamlessly. Universal MCP Server - Three Ways to Connect Choose the deployment option that best fi Published capability contract available. No trust telemetry is available yet. 6 GitHub stars reported by the source. Last updated 2/24/2026.
Freshness
Last checked 2/22/2026
Best For
Contract is available with explicit auth and schema references.
Not Ideal For
universal-openai-vector-store-mcp is not ideal for teams that need stronger public trust telemetry, lower setup complexity, or more explicit contract coverage before production rollout.
Evidence Sources Checked
editorial-content, capability-contract, runtime-metrics, public facts pack
Public facts
7
Change events
1
Artifacts
0
Freshness
Feb 22, 2026
Published capability contract available. No trust telemetry is available yet. 6 GitHub stars reported by the source. Last updated 2/24/2026.
Trust score
Unknown
Compatibility
MCP
Freshness
Feb 22, 2026
Vendor
Jezweb
Artifacts
0
Benchmarks
0
Last release
1.2.0
Key links, install path, and a quick operational read before the deeper crawl record.
Summary
Published capability contract available. No trust telemetry is available yet. 6 GitHub stars reported by the source. Last updated 2/24/2026.
Setup snapshot
git clone https://github.com/jezweb/openai-vector-assistant-mcp.git
Setup complexity is MEDIUM. Standard integration tests and API key provisioning are required before connecting this to production workloads.
Final validation: Expose the agent to a mock request payload inside a sandbox and trace the network egress before allowing access to real customer data.
Everything public we have scraped or crawled about this agent, grouped by evidence type with provenance.
Vendor
Jezweb
Protocol compatibility
MCP
Auth modes
mcp, api_key
Machine-readable schemas
OpenAPI or schema references published
Adoption signal
6 GitHub stars
Handshake status
UNKNOWN
Crawlable docs
6 indexed pages on the official domain
Merged public release, docs, artifact, benchmark, pricing, and trust refresh events.
Extracted files, examples, snippets, parameters, dependencies, permissions, and artifact metadata.
Extracted files
0
Examples
6
Snippets
0
Languages
typescript
bash
# Option A: Use directly with npx (recommended for latest fixes)
npx openai-vector-store-mcp@latest
# Option B: Install globally
npm install -g openai-vector-store-mcp@latest
# Option C: Install locally in your project
npm install openai-vector-store-mcp@latest
json
{
"mcpServers": {
"openai-vector-store": {
"command": "npx",
"args": ["openai-vector-store-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key-here"
}
}
}
}
json
{
"mcpServers": {
"openai-vector-store": {
"command": "npx",
"args": ["openai-vector-store-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key-here"
},
"alwaysAllow": [
"vector-store-create",
"vector-store-list",
"vector-store-get",
"vector-store-delete",
"vector-store-modify",
"file-upload",
"file-list",
"file-get",
"file-delete",
"file-content",
"upload-create",
"vector-store-file-add",
"vector-store-file-list",
"vector-store-file-get",
"vector-store-file-content",
"vector-store-file-update",
"vector-store-file-delete",
"vector-store-file-batch-create",
"vector-store-file-batch-get",
"vector-store-file-batch-cancel",
"vector-store-file-batch-files"
]
}
}
}
bash
# Add the MCP server with local scope (default - available only in current project)
claude mcp add openai-vector-store -- npx openai-vector-store-mcp@latest --env OPENAI_API_KEY="your-openai-api-key-here"
# Add with project scope (shared with team via .mcp.json file)
claude mcp add --scope project openai-vector-store -- npx openai-vector-store-mcp@latest --env OPENAI_API_KEY="your-openai-api-key-here"
# Add with user scope (available across all your projects)
claude mcp add --scope user openai-vector-store -- npx openai-vector-store-mcp@latest --env OPENAI_API_KEY="your-openai-api-key-here"
bash
# List all configured servers
claude mcp list
# Get details for the server
claude mcp get openai-vector-store
# Remove the server
claude mcp remove openai-vector-store
# Check server status within Claude Code
/mcp
json
{
"mcpServers": {
"openai-vector-store": {
"command": "npx",
"args": ["openai-vector-store-mcp@latest"],
"env": {
"OPENAI_API_KEY": "${OPENAI_API_KEY}"
}
}
}
}
Full documentation captured from public sources, including the complete README when available.
Docs source
GITHUB MCP
Editorial quality
ready
A production-ready Model Context Protocol (MCP) server that provides comprehensive OpenAI Vector Store API access through multiple deployment options. This server enables AI assistants like Claude, Roo, and other MCP clients to manage vector stores, files, and batch operations seamlessly.
Choose the deployment option that best fits your needs:
Production URL: https://vectorstore.jezweb.com
Package: openai-vector-store-mcp
Local Build: Clone and run locally
✅ Phase 2 v1.2.0 Complete - Complete file-to-vector-store workflow with 21 tools
✅ Production Deployment - Live on Cloudflare Workers with global edge distribution
✅ Client Integration - Working with Claude Desktop, Roo, and all MCP clients
✅ End-to-End Workflow - Upload files directly from local filesystem to vector stores
✅ Real-World Ready - Solves the 470 PDF files scenario and similar use cases
Before using this MCP server, you'll need to set up your OpenAI account and API access:
Your API key will start with sk-proj- or sk-.
# Option A: Use directly with npx (recommended for latest fixes)
npx openai-vector-store-mcp@latest
# Option B: Install globally
npm install -g openai-vector-store-mcp@latest
# Option C: Install locally in your project
npm install openai-vector-store-mcp@latest
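Before wiring the key into any of these install paths, a quick format sanity check can catch copy-paste mistakes. This is a hedged sketch based on the prefix rule above (keys start with sk-proj- or sk-); the key value is a placeholder, not a real credential:

```shell
# Placeholder key for illustration only -- substitute your real key.
key="sk-proj-example-not-a-real-key"

# OpenAI project keys start with sk-proj-; legacy keys start with sk-.
case "$key" in
  sk-proj-*) echo "project key format" ;;
  sk-*)      echo "legacy key format" ;;
  *)         echo "unexpected key format" >&2; exit 1 ;;
esac
```

A check like this will not confirm the key is valid, only that it is shaped like one; the curl tests later in this document verify the key actually works.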
💡 Why use @latest?
Add to your claude_desktop_config.json:
{
"mcpServers": {
"openai-vector-store": {
"command": "npx",
"args": ["openai-vector-store-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key-here"
}
}
}
}
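If claude_desktop_config.json already lists other servers, merging the entry above rather than overwriting the file avoids clobbering them. A hedged sketch using python3 (the file path here is shortened for illustration; use your platform's real path, and the "existing-server" entry is a stand-in for whatever you already have):

```shell
cfg="claude_desktop_config.json"   # substitute your platform's full path

# Illustrative starting point: a config that already has another server.
printf '%s' '{"mcpServers":{"existing-server":{"command":"foo"}}}' > "$cfg"

# Merge in the openai-vector-store entry without touching other servers.
python3 - "$cfg" <<'EOF'
import json, sys

path = sys.argv[1]
with open(path) as f:
    cfg = json.load(f)
cfg.setdefault("mcpServers", {})["openai-vector-store"] = {
    "command": "npx",
    "args": ["openai-vector-store-mcp@latest"],
    "env": {"OPENAI_API_KEY": "your-openai-api-key-here"},
}
with open(path, "w") as f:
    json.dump(cfg, f, indent=2)
EOF

# Confirm the result is still parseable before restarting Claude Desktop.
python3 -m json.tool "$cfg" > /dev/null && echo "merged config is valid JSON"
```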
Add to your Roo configuration file:
{
"mcpServers": {
"openai-vector-store": {
"command": "npx",
"args": ["openai-vector-store-mcp@latest"],
"env": {
"OPENAI_API_KEY": "your-openai-api-key-here"
},
"alwaysAllow": [
"vector-store-create",
"vector-store-list",
"vector-store-get",
"vector-store-delete",
"vector-store-modify",
"file-upload",
"file-list",
"file-get",
"file-delete",
"file-content",
"upload-create",
"vector-store-file-add",
"vector-store-file-list",
"vector-store-file-get",
"vector-store-file-content",
"vector-store-file-update",
"vector-store-file-delete",
"vector-store-file-batch-create",
"vector-store-file-batch-get",
"vector-store-file-batch-cancel",
"vector-store-file-batch-files"
]
}
}
}
For users of Claude Code (CLI), you can add the MCP server using the command line interface:
# Add the MCP server with local scope (default - available only in current project)
claude mcp add openai-vector-store -- npx openai-vector-store-mcp@latest --env OPENAI_API_KEY="your-openai-api-key-here"
# Add with project scope (shared with team via .mcp.json file)
claude mcp add --scope project openai-vector-store -- npx openai-vector-store-mcp@latest --env OPENAI_API_KEY="your-openai-api-key-here"
# Add with user scope (available across all your projects)
claude mcp add --scope user openai-vector-store -- npx openai-vector-store-mcp@latest --env OPENAI_API_KEY="your-openai-api-key-here"
--scope local (default): Available only to you in the current project
--scope project: Shared with everyone in the project via .mcp.json file
--scope user: Available to you across all projects
# List all configured servers
claude mcp list
# Get details for the server
claude mcp get openai-vector-store
# Remove the server
claude mcp remove openai-vector-store
# Check server status within Claude Code
/mcp
When using --scope project, Claude Code creates a .mcp.json file in your project root:
{
"mcpServers": {
"openai-vector-store": {
"command": "npx",
"args": ["openai-vector-store-mcp@latest"],
"env": {
"OPENAI_API_KEY": "${OPENAI_API_KEY}"
}
}
}
}
Environment Variable Expansion: Claude Code supports ${VAR} syntax for environment variables in .mcp.json files.
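One way to confirm that the literal ${OPENAI_API_KEY} placeholder stays in the committed .mcp.json while the launching shell supplies the real value is sketched below; the exported key is a placeholder, and the quoted heredoc is what keeps the variable unexpanded in the file:

```shell
export OPENAI_API_KEY="sk-proj-example-not-a-real-key"   # placeholder

# The quoted 'JSON' delimiter prevents shell expansion, so the file keeps
# the literal ${OPENAI_API_KEY} and no secret lands in version control.
cat > .mcp.json <<'JSON'
{
  "mcpServers": {
    "openai-vector-store": {
      "command": "npx",
      "args": ["openai-vector-store-mcp@latest"],
      "env": { "OPENAI_API_KEY": "${OPENAI_API_KEY}" }
    }
  }
}
JSON

grep -F '${OPENAI_API_KEY}' .mcp.json && echo "placeholder intact"
python3 -m json.tool .mcp.json > /dev/null && echo "valid JSON"
```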
npm install -g mcp-proxy
Add to your claude_desktop_config.json:
{
"mcpServers": {
"openai-vector-store": {
"command": "npx",
"args": [
"mcp-proxy",
"https://vectorstore.jezweb.com/mcp/YOUR_OPENAI_API_KEY_HERE"
]
}
}
}
npm install -g mcp-proxy
{
"mcpServers": {
"openai-vector-store": {
"command": "npx",
"args": [
"mcp-proxy",
"https://vectorstore.jezweb.com/mcp/YOUR_OPENAI_API_KEY_HERE"
],
"alwaysAllow": [
"vector-store-create",
"vector-store-list",
"vector-store-get",
"vector-store-delete",
"vector-store-modify",
"file-upload",
"file-list",
"file-get",
"file-delete",
"file-content",
"upload-create",
"vector-store-file-add",
"vector-store-file-list",
"vector-store-file-get",
"vector-store-file-content",
"vector-store-file-update",
"vector-store-file-delete",
"vector-store-file-batch-create",
"vector-store-file-batch-get",
"vector-store-file-batch-cancel",
"vector-store-file-batch-files"
]
}
}
}
git clone https://github.com/jezweb/openai-vector-assistant-mcp.git
cd openai-vector-assistant-mcp
npm install
# Add your OpenAI API key to wrangler.toml or use wrangler secrets
wrangler secret put OPENAI_API_KEY
npm run dev
Use the local server URL in your MCP client configuration, replacing the Cloudflare Workers URL with your local development server URL (typically http://localhost:8787).
Configuration file locations:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
Linux: ~/.config/Claude/claude_desktop_config.json
Roo: ~/.config/roo/config.json or similar
Replace YOUR_OPENAI_API_KEY_HERE with your actual OpenAI API key. The alwaysAllow array is crucial for Roo to automatically approve tool usage.
The Problem Solved: Previously, users had to manually upload files to OpenAI before using vector store tools. Now you can upload files directly from your local filesystem!
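A small sketch that picks the Claude Desktop config path for the current platform, using the locations listed above (treating MINGW/MSYS/CYGWIN as Windows is an assumption for Git Bash users; other shells on Windows would use %APPDATA% directly):

```shell
# Resolve the Claude Desktop config path for this platform.
case "$(uname -s)" in
  Darwin)               cfg="$HOME/Library/Application Support/Claude/claude_desktop_config.json" ;;
  Linux)                cfg="$HOME/.config/Claude/claude_desktop_config.json" ;;
  MINGW*|MSYS*|CYGWIN*) cfg="$APPDATA/Claude/claude_desktop_config.json" ;;
  *)                    cfg="" ;;
esac

if [ -n "$cfg" ]; then
  echo "config path: $cfg"
else
  echo "unknown platform: set the path manually" >&2
fi
```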
# 1. Upload a local file to OpenAI
"Upload the file ./documents/report.pdf to OpenAI"
# 2. Create a vector store for the uploaded files
"Create a vector store named 'Project Documents' that expires in 7 days"
# 3. Add the uploaded file to the vector store
"Add file file-abc123 to vector store vs_def456"
# 4. Now you can query the documents using OpenAI's Assistants API!
# Upload multiple files from a directory
"Upload all PDF files from ./research-papers/ to OpenAI"
# Create a dedicated vector store
"Create a vector store named 'Research Papers Collection' that expires in 30 days"
# Batch add all uploaded files to the vector store
"Create a batch to add all uploaded PDF files to the Research Papers Collection vector store"
# Monitor batch processing
"Get the status of the batch operation"
# Query your knowledge base
"Search for papers about machine learning in the Research Papers Collection"
# For files larger than 25MB, use multipart upload
"Create a multipart upload for the file ./large-dataset.zip"
# Upload the large file in chunks
"Upload the large file ./large-dataset.zip using multipart upload"
# Add to vector store once upload completes
"Add the uploaded large file to vector store vs_def456"
# Upload local files
"Upload the file ./data/research.txt to OpenAI"
"Upload all PDF files from ./documents/ to OpenAI"
# List uploaded files
"List all my uploaded files"
"List files uploaded in the last 7 days"
# Get file information
"Get details of file file-abc123"
# Download file content
"Get the content of file file-abc123"
# Delete files
"Delete file file-abc123 from OpenAI"
# List existing vector stores
"List my vector stores"
# Create a new vector store
"Create a vector store named 'Project Documents' that expires in 7 days"
# Get details of a specific vector store
"Get details of vector store vs_abc123"
# Delete a vector store
"Delete vector store vs_abc123"
# Add a file to a vector store
"Add file file-abc123 to vector store vs_def456"
# List files in a vector store
"List all files in vector store vs_def456"
# Get file content from vector store
"Get the content of file file-abc123 in vector store vs_def456"
# Create a batch operation
"Create a batch to add files file-1, file-2, file-3 to vector store vs_def456"
# Check batch status
"Get status of batch batch_abc123 in vector store vs_def456"
# Legal Document Analysis
1. "Upload all PDF files from ./legal-documents/ to OpenAI"
2. "Create a vector store named 'Legal Document Analysis' that expires in 90 days"
3. "Create a batch to add all uploaded legal documents to the vector store"
4. "Monitor batch processing status until complete"
5. "Search for clauses about liability in the legal documents"
6. "Find all references to termination conditions"
# Academic Research Processing
1. "Upload ./research-papers/paper1.pdf to OpenAI"
2. "Upload ./research-papers/paper2.pdf to OpenAI"
3. "Create a vector store named 'Literature Review' with metadata for project tracking"
4. "Add both uploaded papers to the Literature Review vector store"
5. "Query the papers for methodology comparisons"
# Codebase Documentation and Analysis
1. "Upload all Python files from ./src/ to OpenAI"
2. "Upload all JavaScript files from ./frontend/ to OpenAI"
3. "Create a vector store named 'Codebase Analysis' that expires in 14 days"
4. "Batch add all uploaded source files to the codebase vector store"
5. "Find all functions that handle user authentication"
6. "Search for security-related code patterns"
7. "Identify API endpoints and their documentation"
# Code Review Preparation
1. "Upload changed files from ./src/modified/ to OpenAI"
2. "Add uploaded files to existing 'Code Review' vector store"
3. "Search for similar code patterns in the existing codebase"
# Company Documentation System
1. "Upload all documentation files from ./company-docs/ to OpenAI"
2. "Upload policy files from ./policies/ to OpenAI"
3. "Create a vector store named 'Company Knowledge Base' that expires in 365 days"
4. "Create a batch operation to add all documentation to the knowledge base"
5. "Monitor batch processing and handle any failed uploads"
6. "Query the knowledge base for HR policies"
7. "Search for specific procedures and guidelines"
# Customer Support Knowledge Base
1. "Upload FAQ documents from ./support-docs/ to OpenAI"
2. "Upload troubleshooting guides from ./troubleshooting/ to OpenAI"
3. "Create a vector store named 'Support Knowledge Base'"
4. "Add all support documents to the vector store"
5. "Search for solutions to specific customer issues"
# Market Research Analysis
1. "Upload market reports from ./market-research/ to OpenAI"
2. "Upload competitor analysis from ./competitor-data/ to OpenAI"
3. "Create a vector store named 'Market Intelligence' with expiration in 180 days"
4. "Batch add all research documents to the vector store"
5. "Search for market trends and opportunities"
6. "Compare competitor strategies across documents"
# Scientific Literature Review
1. "Upload research papers from ./literature/ to OpenAI"
2. "Create a vector store named 'Literature Review 2024'"
3. "Add papers to vector store with metadata tracking"
4. "Search for specific methodologies across papers"
5. "Find contradictory findings in the literature"
# Large Dataset Processing (470 PDF Scenario)
1. "List all PDF files in ./research-collection/ directory"
2. "Upload all 470 PDF files from ./research-collection/ to OpenAI"
3. "Create a vector store named 'Comprehensive Research Database'"
4. "Create batch operations to add files in groups of 50"
5. "Monitor all batch operations until completion"
6. "Verify all 470 files are successfully added to vector store"
7. "Search across entire collection for specific research topics"
8. "Generate summaries of key findings across all documents"
# Content Management System
1. "Upload articles from ./content/articles/ to OpenAI"
2. "Upload blog posts from ./content/blog/ to OpenAI"
3. "Create vector stores for different content categories"
4. "Organize content by topic using separate vector stores"
5. "Search for content gaps and opportunities"
Authentication uses the /mcp/{api-key} URL pattern.
src/
├── worker.ts             # Main Cloudflare Worker entry point
├── mcp-handler.ts        # MCP protocol implementation
├── types.ts              # TypeScript type definitions
└── services/
    └── openai-service.ts # OpenAI API client wrapper
Test the server directly with curl:
# List available tools
curl -X POST "https://vectorstore.jezweb.com/mcp/YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{"jsonrpc":"2.0","id":1,"method":"tools/list","params":{}}'
# Create a vector store
curl -X POST "https://vectorstore.jezweb.com/mcp/YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"jsonrpc": "2.0",
"id": 1,
"method": "tools/call",
"params": {
"name": "vector-store-create",
"arguments": {
"name": "Test Store",
"expires_after_days": 1
}
}
}'
Use the provided test scripts:
test-mcp-client.js - Node.js MCP client test
test-mcp-http-client.js - Direct HTTP API test
demo-vector-store-mcp.js - Comprehensive demo
git clone https://github.com/jezweb/openai-vector-assistant-mcp.git
cd openai-vector-assistant-mcp
npm install
# Add your OpenAI API key to wrangler.toml or use wrangler secrets
wrangler secret put OPENAI_API_KEY
npm run dev
Deploy to Cloudflare Workers:
npm run deploy
The server will be available at your Cloudflare Workers domain.
MIT License - see LICENSE for details.
If you're experiencing issues with outdated versions, use @latest to bypass cache:
# Clear NPM cache and use @latest
npm cache clean --force
npx openai-vector-store-mcp@latest
# For global installations
npm uninstall -g openai-vector-store-mcp
npm install -g openai-vector-store-mcp@latest
# Check current version
npx openai-vector-store-mcp@latest --version
# Force latest version in configuration
# Update your config to use @latest instead of version numbers
Configuration file issues:
# Verify configuration file syntax
cat ~/Library/Application\ Support/Claude/claude_desktop_config.json | python -m json.tool
# Ensure @latest is used in args
"args": ["openai-vector-store-mcp@latest"]
Server not connecting:
# Check server status
claude mcp list
claude mcp get openai-vector-store
# Re-add with @latest
claude mcp remove openai-vector-store
claude mcp add openai-vector-store -- npx openai-vector-store-mcp@latest --env OPENAI_API_KEY="your-key"
Scope-related issues:
# Check which scope the server is in
claude mcp list
# Add to correct scope
claude mcp add --scope user openai-vector-store -- npx openai-vector-store-mcp@latest --env OPENAI_API_KEY="your-key"
Project .mcp.json issues:
# Reset project choices if needed
claude mcp reset-project-choices
# Verify .mcp.json syntax
cat .mcp.json | python -m json.tool
For NPM Package Users:
# Always use @latest for most reliable experience
npx openai-vector-store-mcp@latest
# Test the server directly
OPENAI_API_KEY="your-key" npx openai-vector-store-mcp@latest
For Cloudflare Workers Users:
# Test the server directly
curl -X POST "https://vectorstore.jezweb.com/mcp/YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{"jsonrpc":"2.0","id":1,"method":"tools/list","params":{}}'
For Local Development:
# Ensure the server is running
npm run dev
# Check if the server is accessible
curl -X POST "http://localhost:8787/mcp/YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{"jsonrpc":"2.0","id":1,"method":"tools/list","params":{}}'
"Permission denied" errors:
Ensure alwaysAllow is configured for all 21 tools, and verify the tool names in alwaysAllow match exactly.
Tool approval prompts:
Add the tools to the alwaysAllow array, including the file tools: file-upload, file-list, file-get, file-delete, file-content, upload-create.
# Install Node.js if not present
brew install node
# Install the MCP package with @latest
npm install -g openai-vector-store-mcp@latest
# Configuration file location
~/Library/Application Support/Claude/claude_desktop_config.json
# Install Node.js from nodejs.org
# Then install the MCP package with @latest
npm install -g openai-vector-store-mcp@latest
# Configuration file location
%APPDATA%\Claude\claude_desktop_config.json
# Install Node.js (Ubuntu/Debian)
sudo apt update && sudo apt install nodejs npm
# Install the MCP package with @latest
npm install -g openai-vector-store-mcp@latest
# Configuration file location
~/.config/Claude/claude_desktop_config.json
DEBUG=* OPENAI_API_KEY="your-key" npx openai-vector-store-mcp@latest
# Test with verbose curl output
curl -v -X POST "https://vectorstore.jezweb.com/mcp/YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{"jsonrpc":"2.0","id":1,"method":"tools/list","params":{}}'
If upgrading from version 1.1.x to 1.2.0:
Add the new file management tools to your alwaysAllow array for seamless operation.
If upgrading from version 1.0.x:
# Check current version
npx openai-vector-store-mcp@latest --version
# Clear cache and update
npm cache clean --force
npm install -g openai-vector-store-mcp@latest
# Pin to specific version (if needed)
npm install -g openai-vector-store-mcp@1.2.3
Use @latest in configuration files.
--scope local: Personal development
--scope project: Team collaboration
--scope user: Cross-project utilities
Share team configuration via .mcp.json, use the /mcp command to check server status, and keep all tools in the alwaysAllow array.
| Issue | Claude Desktop | Claude Code CLI | Roo |
|-------|---------------|-----------------|-----|
| Outdated Version | Use @latest in config | claude mcp remove then re-add with @latest | Use @latest in config |
| Cache Issues | npm cache clean --force | npm cache clean --force | npm cache clean --force |
| Permission Denied | Check API key in env | Check --env flag | Add tools to alwaysAllow |
| Server Not Found | Restart client | claude mcp list to verify | Restart Roo |
| Config Syntax | Validate JSON | cat .mcp.json \| python -m json.tool | Validate JSON |
API Key Management
Configuration Security
Network Security
alwaysAllow configuration and troubleshooting
Ready to get started? Choose your preferred installation method from the Quick Start guide above or check out the Universal MCP Server Guide for complete documentation.
Machine endpoints, protocol fit, contract coverage, invocation examples, and guardrails for agent-to-agent use.
Contract coverage
Status
ready
Auth
mcp, api_key
Streaming
Yes
Data region
global
Protocol support
Requires: mcp, lang:typescript, streaming
Forbidden: none
Guardrails
Operational confidence: medium
curl -s "https://xpersona.co/api/v1/agents/mcp-jezweb-openai-vector-assistant-mcp/snapshot"
curl -s "https://xpersona.co/api/v1/agents/mcp-jezweb-openai-vector-assistant-mcp/contract"
curl -s "https://xpersona.co/api/v1/agents/mcp-jezweb-openai-vector-assistant-mcp/trust"
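The invocation guide later in this record publishes a retry policy for these endpoints (maxAttempts 3, backoff 500/1500/3500 ms, retrying on HTTP_429, HTTP_503, or NETWORK_TIMEOUT). A minimal shell sketch of that schedule, with the HTTP call stubbed out by a hypothetical fake_request (a real client would substitute the curl calls above):

```shell
# Map attempt number (1-based) to the published backoff delay in ms; -1 = give up.
backoff_ms() {
  case "$1" in
    1) echo 500 ;;
    2) echo 1500 ;;
    3) echo 3500 ;;
    *) echo -1 ;;
  esac
}

# Retry only on the published retryable conditions.
is_retryable() {
  case "$1" in
    HTTP_429|HTTP_503|NETWORK_TIMEOUT) return 0 ;;
    *) return 1 ;;
  esac
}

# Stub: always reports rate limiting, so the loop exercises every backoff step.
fake_request() { echo "HTTP_429"; }

attempt=1
while [ "$attempt" -le 3 ]; do
  status=$(fake_request)
  if ! is_retryable "$status"; then
    echo "done with status $status"
    break
  fi
  delay=$(backoff_ms "$attempt")
  echo "attempt $attempt got $status; backing off ${delay}ms"
  # A real client would sleep for $delay milliseconds here.
  attempt=$((attempt + 1))
done
```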
Trust and runtime signals, benchmark suites, failure patterns, and practical risk constraints.
Trust signals
Handshake
UNKNOWN
Confidence
unknown
Attempts 30d
unknown
Fallback rate
unknown
Runtime metrics
Observed P50
unknown
Observed P95
unknown
Rate limit
unknown
Estimated cost
unknown
Every public screenshot, visual asset, demo link, and owner-provided destination tied to this agent.
Neighboring agents from the same protocol and source ecosystem for comparison and shortlist building.
Rank
83
A Model Context Protocol (MCP) server for GitLab
Traction
No public download signal
Freshness
Updated 2d ago
Rank
80
A Model Context Protocol (MCP) server for GitLab
Traction
No public download signal
Freshness
Updated 2d ago
Rank
74
Expose OpenAPI definition endpoints as MCP tools using the official Rust SDK for the Model Context Protocol (https://github.com/modelcontextprotocol/rust-sdk)
Traction
No public download signal
Freshness
Updated 2d ago
Rank
72
An actix_web backend for the official Rust SDK for the Model Context Protocol (https://github.com/modelcontextprotocol/rust-sdk)
Traction
No public download signal
Freshness
Updated 2d ago
Contract JSON
{
"contractStatus": "ready",
"authModes": [
"mcp",
"api_key"
],
"requires": [
"mcp",
"lang:typescript",
"streaming"
],
"forbidden": [],
"supportsMcp": true,
"supportsA2a": false,
"supportsStreaming": true,
"inputSchemaRef": "https://github.com/jezweb/openai-vector-assistant-mcp#input",
"outputSchemaRef": "https://github.com/jezweb/openai-vector-assistant-mcp#output",
"dataRegion": "global",
"contractUpdatedAt": "2026-02-24T19:46:31.748Z",
"sourceUpdatedAt": "2026-02-24T19:46:31.748Z",
"freshnessSeconds": 4434824
}
Invocation Guide
{
"preferredApi": {
"snapshotUrl": "https://xpersona.co/api/v1/agents/mcp-jezweb-openai-vector-assistant-mcp/snapshot",
"contractUrl": "https://xpersona.co/api/v1/agents/mcp-jezweb-openai-vector-assistant-mcp/contract",
"trustUrl": "https://xpersona.co/api/v1/agents/mcp-jezweb-openai-vector-assistant-mcp/trust"
},
"curlExamples": [
"curl -s \"https://xpersona.co/api/v1/agents/mcp-jezweb-openai-vector-assistant-mcp/snapshot\"",
"curl -s \"https://xpersona.co/api/v1/agents/mcp-jezweb-openai-vector-assistant-mcp/contract\"",
"curl -s \"https://xpersona.co/api/v1/agents/mcp-jezweb-openai-vector-assistant-mcp/trust\""
],
"jsonRequestTemplate": {
"query": "summarize this repo",
"constraints": {
"maxLatencyMs": 2000,
"protocolPreference": [
"MCP"
]
}
},
"jsonResponseTemplate": {
"ok": true,
"result": {
"summary": "...",
"confidence": 0.9
},
"meta": {
"source": "GITHUB_MCP",
"generatedAt": "2026-04-17T03:40:16.090Z"
}
},
"retryPolicy": {
"maxAttempts": 3,
"backoffMs": [
500,
1500,
3500
],
"retryableConditions": [
"HTTP_429",
"HTTP_503",
"NETWORK_TIMEOUT"
]
}
}
Trust JSON
{
"status": "unavailable",
"handshakeStatus": "UNKNOWN",
"verificationFreshnessHours": null,
"reputationScore": null,
"p95LatencyMs": null,
"successRate30d": null,
"fallbackRate": null,
"attempts30d": null,
"trustUpdatedAt": null,
"trustConfidence": "unknown",
"sourceUpdatedAt": null,
"freshnessSeconds": null
}
Capability Matrix
{
"rows": [
{
"key": "MCP",
"type": "protocol",
"support": "supported",
"confidenceSource": "contract",
"notes": "Confirmed by capability contract"
},
{
"key": "mcp",
"type": "capability",
"support": "supported",
"confidenceSource": "profile",
"notes": "Declared in agent profile metadata"
},
{
"key": "model-context-protocol",
"type": "capability",
"support": "supported",
"confidenceSource": "profile",
"notes": "Declared in agent profile metadata"
},
{
"key": "openai",
"type": "capability",
"support": "supported",
"confidenceSource": "profile",
"notes": "Declared in agent profile metadata"
},
{
"key": "vector-store",
"type": "capability",
"support": "supported",
"confidenceSource": "profile",
"notes": "Declared in agent profile metadata"
},
{
"key": "cloudflare-workers",
"type": "capability",
"support": "supported",
"confidenceSource": "profile",
"notes": "Declared in agent profile metadata"
},
{
"key": "serverless",
"type": "capability",
"support": "supported",
"confidenceSource": "profile",
"notes": "Declared in agent profile metadata"
},
{
"key": "universal",
"type": "capability",
"support": "supported",
"confidenceSource": "profile",
"notes": "Declared in agent profile metadata"
},
{
"key": "roo",
"type": "capability",
"support": "supported",
"confidenceSource": "profile",
"notes": "Declared in agent profile metadata"
},
{
"key": "claude-desktop",
"type": "capability",
"support": "supported",
"confidenceSource": "profile",
"notes": "Declared in agent profile metadata"
},
{
"key": "stdio",
"type": "capability",
"support": "supported",
"confidenceSource": "profile",
"notes": "Declared in agent profile metadata"
},
{
"key": "typescript",
"type": "capability",
"support": "supported",
"confidenceSource": "profile",
"notes": "Declared in agent profile metadata"
}
],
"flattenedTokens": "protocol:MCP|supported|contract capability:mcp|supported|profile capability:model-context-protocol|supported|profile capability:openai|supported|profile capability:vector-store|supported|profile capability:cloudflare-workers|supported|profile capability:serverless|supported|profile capability:universal|supported|profile capability:roo|supported|profile capability:claude-desktop|supported|profile capability:stdio|supported|profile capability:typescript|supported|profile"
}
Facts JSON
[
{
"factKey": "docs_crawl",
"category": "integration",
"label": "Crawlable docs",
"value": "6 indexed pages on the official domain",
"href": "https://github.com/login?return_to=https%3A%2F%2Fgithub.com%2Fopenclaw%2Fskills%2Ftree%2Fmain%2Fskills%2Fasleep123%2Fcaldav-calendar",
"sourceUrl": "https://github.com/login?return_to=https%3A%2F%2Fgithub.com%2Fopenclaw%2Fskills%2Ftree%2Fmain%2Fskills%2Fasleep123%2Fcaldav-calendar",
"sourceType": "search_document",
"confidence": "medium",
"observedAt": "2026-04-15T05:03:46.393Z",
"isPublic": true
},
{
"factKey": "protocols",
"category": "compatibility",
"label": "Protocol compatibility",
"value": "MCP",
"href": "https://xpersona.co/api/v1/agents/mcp-jezweb-openai-vector-assistant-mcp/contract",
"sourceUrl": "https://xpersona.co/api/v1/agents/mcp-jezweb-openai-vector-assistant-mcp/contract",
"sourceType": "contract",
"confidence": "high",
"observedAt": "2026-02-24T19:46:31.748Z",
"isPublic": true
},
{
"factKey": "auth_modes",
"category": "compatibility",
"label": "Auth modes",
"value": "mcp, api_key",
"href": "https://xpersona.co/api/v1/agents/mcp-jezweb-openai-vector-assistant-mcp/contract",
"sourceUrl": "https://xpersona.co/api/v1/agents/mcp-jezweb-openai-vector-assistant-mcp/contract",
"sourceType": "contract",
"confidence": "high",
"observedAt": "2026-02-24T19:46:31.748Z",
"isPublic": true
},
{
"factKey": "schema_refs",
"category": "artifact",
"label": "Machine-readable schemas",
"value": "OpenAPI or schema references published",
"href": "https://github.com/jezweb/openai-vector-assistant-mcp#input",
"sourceUrl": "https://xpersona.co/api/v1/agents/mcp-jezweb-openai-vector-assistant-mcp/contract",
"sourceType": "contract",
"confidence": "high",
"observedAt": "2026-02-24T19:46:31.748Z",
"isPublic": true
},
{
"factKey": "vendor",
"category": "vendor",
"label": "Vendor",
"value": "Jezweb",
"href": "https://github.com/jezweb/openai-vector-assistant-mcp",
"sourceUrl": "https://github.com/jezweb/openai-vector-assistant-mcp",
"sourceType": "profile",
"confidence": "medium",
"observedAt": "2026-02-24T19:43:14.176Z",
"isPublic": true
},
{
"factKey": "traction",
"category": "adoption",
"label": "Adoption signal",
"value": "6 GitHub stars",
"href": "https://github.com/jezweb/openai-vector-assistant-mcp",
"sourceUrl": "https://github.com/jezweb/openai-vector-assistant-mcp",
"sourceType": "profile",
"confidence": "medium",
"observedAt": "2026-02-24T19:43:14.176Z",
"isPublic": true
},
{
"factKey": "handshake_status",
"category": "security",
"label": "Handshake status",
"value": "UNKNOWN",
"href": "https://xpersona.co/api/v1/agents/mcp-jezweb-openai-vector-assistant-mcp/trust",
"sourceUrl": "https://xpersona.co/api/v1/agents/mcp-jezweb-openai-vector-assistant-mcp/trust",
"sourceType": "trust",
"confidence": "medium",
"observedAt": null,
"isPublic": true
}
]
Change Events JSON
[
{
"eventType": "docs_update",
"title": "Docs refreshed: Sign in to GitHub ยท GitHub",
"description": "Fresh crawlable documentation was indexed for the official domain.",
"href": "https://github.com/login?return_to=https%3A%2F%2Fgithub.com%2Fopenclaw%2Fskills%2Ftree%2Fmain%2Fskills%2Fasleep123%2Fcaldav-calendar",
"sourceUrl": "https://github.com/login?return_to=https%3A%2F%2Fgithub.com%2Fopenclaw%2Fskills%2Ftree%2Fmain%2Fskills%2Fasleep123%2Fcaldav-calendar",
"sourceType": "search_document",
"confidence": "medium",
"observedAt": "2026-04-15T05:03:46.393Z",
"isPublic": true
}
]