Rank
70
AI Agents & MCPs & AI Workflow Automation • (~400 MCP servers for AI agents) • AI Automation / AI Agent with MCPs • AI Workflows & AI Agents • MCPs for AI Agents
Traction
No public download signal
Freshness
Updated 2d ago
Crawler Summary
ASR router skill — reads asr_config.json to determine the active transcription mode (speaches or whisperx), then delegates to the corresponding sub-skill. Supports Google Drive links, Telegram audio/video files, and local file paths. Part of openclaw-local-asr-skill. Triggers on keywords: 轉逐字稿, 轉文字, transcribe, transcript, 語音轉文字, ASR, 字幕, subtitle, 辨識成文字, 語音辨識. Capability contract not published. No trust telemetry is available yet. Last updated 4/14/2026.
Freshness
Last checked 4/14/2026
Best For
asr-local is best for general automation workflows where OpenClaw compatibility matters.
Not Ideal For
Workflows that need deterministic execution: the capability contract metadata is missing or unavailable.
Evidence Sources Checked
editorial-content, GITHUB OPENCLEW, runtime-metrics, public facts pack
Public facts
4
Change events
1
Artifacts
0
Freshness
Apr 14, 2026
Capability contract not published. No trust telemetry is available yet. Last updated 4/14/2026.
Trust score
Unknown
Compatibility
OpenClaw
Freshness
Apr 14, 2026
Vendor
Kinolian1107
Artifacts
0
Benchmarks
0
Last release
Unpublished
Key links, install path, and a quick operational read before the deeper crawl record.
Summary
Capability contract not published. No trust telemetry is available yet. Last updated 4/14/2026.
Setup snapshot
```bash
git clone https://github.com/Kinolian1107/openclaw-local-asr-skill.git
```

Setup complexity is LOW. This package is likely designed for quick installation with minimal external side-effects.
Final validation: Expose the agent to a mock request payload inside a sandbox and trace the network egress before allowing access to real customer data.
Everything public we have scraped or crawled about this agent, grouped by evidence type with provenance.
Vendor
Kinolian1107
Protocol compatibility
OpenClaw
Handshake status
UNKNOWN
Crawlable docs
6 indexed pages on the official domain
Merged public release, docs, artifact, benchmark, pricing, and trust refresh events.
Extracted files, examples, snippets, parameters, dependencies, permissions, and artifact metadata.
Extracted files
0
Examples
5
Snippets
0
Languages
typescript
Parameters
```bash
# Just pass the URL as the input argument — gdown is called internally
```

```bash
cp "/home/kino/.openclaw/media/inbound/{openclaw_file}" "/home/kino/asr/downloads/{original_filename}"
```

```bash
FILE_PATH=$(bash ~/.openclaw/skills/tg-dl-localapi/scripts/tg-download.sh "{file_id}" -o /home/kino/asr/downloads)
```

```text
/home/kino/asr/
├── downloads/           ← Downloaded source files (mp3, mp4, etc.)
├── tmp/                 ← Intermediate files (WAV, chunks) — auto-cleaned
├── output/              ← Final output (SRT, TXT, JSON)
├── speaker_embeddings/  ← Registered speaker voice prints
├── speaker_samples/     ← Extracted speaker audio samples
├── .venv/               ← speaches Python venv
└── .venv-whisperx/      ← whisperx Python venv
```

```text
Input (any source) → local file path
        ↓
Read config/asr_config.json
        ↓
mode == "speaches"?
 ├─ YES → Read speaches/SKILL.md → Follow it (skip Step 1 if file is local)
 └─ NO  → Read whisperx/SKILL.md → Follow it (skip Step 1 if file is local)
```

Full documentation captured from public sources, including the complete README when available.
Docs source
GITHUB OPENCLEW
Editorial quality
ready
```yaml
---
name: asr-local
description: >
  ASR router skill — reads asr_config.json to determine the active
  transcription mode (speaches or whisperx), then delegates to the
  corresponding sub-skill. Supports Google Drive links, Telegram
  audio/video files, and local file paths. Part of openclaw-local-asr-skill.
  Triggers on keywords: 轉逐字稿, 轉文字, transcribe, transcript, 語音轉文字,
  ASR, 字幕, subtitle, 辨識成文字, 語音辨識.
metadata:
  openclaw: e
```
This is the entry point for all ASR (speech-to-text) tasks. It handles file acquisition from multiple sources, then delegates transcription to the configured engine.
Read config/asr_config.json (resolve relative to this SKILL.md's directory) and check its "mode" field:
- "speaches" → read and follow speaches/SKILL.md (in this same repo)
- "whisperx" → read and follow whisperx/SKILL.md (in this same repo)

Activate when ANY of the following are true:
(<telegram_large_file> tag)

The script handles downloads automatically into /home/kino/asr/downloads/:
```bash
# Just pass the URL as the input argument — gdown is called internally
```
When a user sends an audio/video file via Telegram, OpenClaw downloads it automatically (via Local Bot API if configured). The file path appears in the conversation as a media attachment (e.g. /home/kino/.openclaw/media/inbound/file_2---xxxx.mp3).
IMPORTANT: Do NOT use the OpenClaw media path directly. Instead:
1. Determine the original filename (the file_name field in the attachment info, or the filename the user uploaded). If no original filename is available, derive a readable name from the user's message context (e.g. topic or description).
2. Copy the file to /home/kino/asr/downloads/{original_filename}:

```bash
cp "/home/kino/.openclaw/media/inbound/{openclaw_file}" "/home/kino/asr/downloads/{original_filename}"
```

3. Use /home/kino/asr/downloads/{original_filename} as the input for transcription.

This ensures all source files are centralized in downloads/ and output files are named after the original upload, not OpenClaw's internal file IDs.
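The copy step above can be wrapped in a small helper. This is a hypothetical sketch, not part of the skill: `stage_inbound` is an invented name, and only the directory layout comes from the documentation.

```shell
# Hypothetical helper (not part of the skill): copy an OpenClaw inbound
# media file into downloads/ under its original filename, then print
# the staged path so it can be fed to the transcription step.
stage_inbound() {
  inbound_path=$1                          # e.g. /home/kino/.openclaw/media/inbound/file_2---xxxx.mp3
  original_name=$2                         # e.g. the attachment's file_name field
  downloads_dir=${3:-/home/kino/asr/downloads}
  mkdir -p "$downloads_dir"
  cp "$inbound_path" "$downloads_dir/$original_name"
  echo "$downloads_dir/$original_name"
}
```

The third argument defaults to the documented downloads directory but is overridable, which keeps the helper testable outside the skill's environment.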
(<telegram_large_file> tag)

If OpenClaw cannot download the file (e.g. >20MB without Local Bot API), a <telegram_large_file> tag is injected into the message text containing file_id, file_size, file_name, and mime_type.
Extract the file_id and download using the tg-dl-localapi skill:
```bash
FILE_PATH=$(bash ~/.openclaw/skills/tg-dl-localapi/scripts/tg-download.sh "{file_id}" -o /home/kino/asr/downloads)
```
Use the path directly — no download needed.
The working directory /home/kino/asr/ is organized as:
```text
/home/kino/asr/
├── downloads/           ← Downloaded source files (mp3, mp4, etc.)
├── tmp/                 ← Intermediate files (WAV, chunks) — auto-cleaned
├── output/              ← Final output (SRT, TXT, JSON)
├── speaker_embeddings/  ← Registered speaker voice prints
├── speaker_samples/     ← Extracted speaker audio samples
├── .venv/               ← speaches Python venv
└── .venv-whisperx/      ← whisperx Python venv
```
Intermediate WAV files and chunk directories in tmp/ are automatically cleaned up after transcription completes.
Both sub-skills automatically convert video files to WAV (16kHz mono) using ffmpeg before transcription. No manual conversion is needed. Supported video formats: MP4, MKV, AVI, MOV, WebM, FLV.
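That conversion step can be sketched as follows. The exact ffmpeg invocation the sub-skills use is not published; the flags here are assumptions that match the stated 16 kHz mono WAV target, and `DRY_RUN` is an invented switch so the command can be inspected without ffmpeg installed.

```shell
# Sketch of the video→WAV step (assumed flags: -vn drops the video
# stream, -ar 16000 -ac 1 gives 16 kHz mono). With DRY_RUN=1 the
# command is printed instead of executed.
to_wav() {
  cmd=(ffmpeg -y -i "$1" -vn -ar 16000 -ac 1 "$2")
  if [ "${DRY_RUN:-0}" = "1" ]; then
    echo "${cmd[*]}"
  else
    "${cmd[@]}"
  fi
}
```

Building the argument list in an array keeps paths with spaces intact when the command actually runs.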
```text
Input (any source) → local file path
        ↓
Read config/asr_config.json
        ↓
mode == "speaches"?
 ├─ YES → Read speaches/SKILL.md → Follow it (skip Step 1 if file is local)
 └─ NO  → Read whisperx/SKILL.md → Follow it (skip Step 1 if file is local)
```
After determining the mode, read the corresponding sub-skill's SKILL.md and follow its instructions completely. Do not mix instructions from different modes. If the file is already downloaded locally, skip the sub-skill's Step 1 (download) and go directly to Step 2 (transcription).
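The routing decision itself is simple enough to sketch in shell. This is hypothetical (the real router is the agent reading SKILL.md, not a script), and it extracts "mode" with sed rather than a JSON parser, assuming the field is a plain quoted string:

```shell
# Hypothetical sketch of the router's decision tree: read "mode" from
# asr_config.json and print which sub-skill's SKILL.md to follow.
route_asr() {
  config=$1
  mode=$(sed -n 's/.*"mode"[[:space:]]*:[[:space:]]*"\([^"]*\)".*/\1/p' "$config")
  if [ "$mode" = "speaches" ]; then
    echo "speaches/SKILL.md"
  else
    # Any non-speaches value falls through to whisperx, as in the diagram.
    echo "whisperx/SKILL.md"
  fi
}
```

Note the else-branch mirrors the diagram's NO arm: only "speaches" selects the speaches sub-skill.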
When user types /asrmode (with or without argument):

Without argument (/asrmode):
- Read config/asr_config.json and report the current "mode" field

With argument (/asrmode speaches or /asrmode whisperx):
- Update the config/asr_config.json "mode" field directly

| Mode | Engine | Key Features |
|------|--------|-------------|
| speaches | speaches Docker (faster-whisper) | ffmpeg silencedetect, hallucination filtering, no speaker ID |
| whisperx | WhisperX (local Python) | word-level timestamps, speaker diarization, hotwords, corrections |
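The mode-switch path of /asrmode can be sketched as an in-place edit of the config file. This is a hypothetical helper, assuming the "mode" value is a simple quoted string:

```shell
# Hypothetical sketch of "/asrmode <mode>": rewrite the "mode" field in
# asr_config.json in place (sed keeps a .bak copy of the original).
set_asr_mode() {
  config=$1
  new_mode=$2   # expected: speaches or whisperx
  sed -i.bak "s/\"mode\"[[:space:]]*:[[:space:]]*\"[^\"]*\"/\"mode\": \"$new_mode\"/" "$config"
}
```

A real implementation would validate the argument against the two known modes before writing.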
All config files are in config/ relative to this skill's directory:
| File | Purpose |
|------|---------|
| config/asr_config.json | Mode selection & paths |
| config/corrections.json | Post-processing word replacements |
| config/hotwords.txt | Hotword list for WhisperX accuracy boost |
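The hotword list above (config/hotwords.txt) can be maintained with a small helper. This is a hypothetical sketch that assumes the documented one-word-per-line format and skips duplicates:

```shell
# Hypothetical sketch of the "add hotword" flow: append the word to
# hotwords.txt only if that exact line is not already present.
add_hotword() {
  hotwords_file=$1
  word=$2
  grep -qxF -- "$word" "$hotwords_file" 2>/dev/null || printf '%s\n' "$word" >> "$hotwords_file"
}
```

`grep -qxF` matches the whole line literally, so substrings of existing hotwords are not treated as duplicates.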
When user says "增加熱詞 XXX" (add hotword XXX):
- Append the word to config/hotwords.txt (one word per line)

When user says "把 A 改成 B" (change A to B):
- Add "A": "B" to config/corrections.json

Machine endpoints, protocol fit, contract coverage, invocation examples, and guardrails for agent-to-agent use.
Contract coverage
Status
missing
Auth
None
Streaming
No
Data region
Unspecified
Protocol support
Requires: none
Forbidden: none
Guardrails
Operational confidence: low
```bash
curl -s "https://xpersona.co/api/v1/agents/kinolian1107-openclaw-local-asr-skill/snapshot"
curl -s "https://xpersona.co/api/v1/agents/kinolian1107-openclaw-local-asr-skill/contract"
curl -s "https://xpersona.co/api/v1/agents/kinolian1107-openclaw-local-asr-skill/trust"
```
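The profile's published retryPolicy (maxAttempts 3, backoff 500/1500/3500 ms, retry on HTTP 429/503 and network timeouts) can be applied to those calls with a small wrapper. `with_retry` is a hypothetical name; using `curl -sf` makes HTTP error statuses count as failures:

```shell
# Hypothetical wrapper applying the published retryPolicy: up to 3
# attempts with 500/1500 ms backoff between tries (the 3500 ms slot
# would only be reached if maxAttempts were raised above 3).
with_retry() {
  attempt=1
  while :; do
    "$@" && return 0
    [ "$attempt" -ge 3 ] && return 1
    case $attempt in 1) sleep 0.5 ;; 2) sleep 1.5 ;; *) sleep 3.5 ;; esac
    attempt=$((attempt + 1))
  done
}

# Example:
# with_retry curl -sf "https://xpersona.co/api/v1/agents/kinolian1107-openclaw-local-asr-skill/snapshot"
```

This sketch retries on any nonzero exit; a stricter version would inspect the HTTP status and retry only on the policy's retryable conditions.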
Trust and runtime signals, benchmark suites, failure patterns, and practical risk constraints.
Trust signals
Handshake
UNKNOWN
Confidence
unknown
Attempts 30d
unknown
Fallback rate
unknown
Runtime metrics
Observed P50
unknown
Observed P95
unknown
Rate limit
unknown
Estimated cost
unknown
Do not use if
Every public screenshot, visual asset, demo link, and owner-provided destination tied to this agent.
Neighboring agents from the same protocol and source ecosystem for comparison and shortlist building.
Rank
70
AI productivity studio with smart chat, autonomous agents, and 300+ assistants. Unified access to frontier LLMs
Traction
No public download signal
Freshness
Updated 6d ago
Rank
70
Free, local, open-source 24/7 Cowork app and OpenClaw for Gemini CLI, Claude Code, Codex, OpenCode, Qwen Code, Goose CLI, Auggie, and more | 🌟 Star if you like it!
Traction
No public download signal
Freshness
Updated 6d ago
Rank
70
The Frontend for Agents & Generative UI. React + Angular
Traction
No public download signal
Freshness
Updated 23d ago
Contract JSON
{
"contractStatus": "missing",
"authModes": [],
"requires": [],
"forbidden": [],
"supportsMcp": false,
"supportsA2a": false,
"supportsStreaming": false,
"inputSchemaRef": null,
"outputSchemaRef": null,
"dataRegion": null,
"contractUpdatedAt": null,
"sourceUpdatedAt": null,
"freshnessSeconds": null
}

Invocation Guide
{
"preferredApi": {
"snapshotUrl": "https://xpersona.co/api/v1/agents/kinolian1107-openclaw-local-asr-skill/snapshot",
"contractUrl": "https://xpersona.co/api/v1/agents/kinolian1107-openclaw-local-asr-skill/contract",
"trustUrl": "https://xpersona.co/api/v1/agents/kinolian1107-openclaw-local-asr-skill/trust"
},
"curlExamples": [
"curl -s \"https://xpersona.co/api/v1/agents/kinolian1107-openclaw-local-asr-skill/snapshot\"",
"curl -s \"https://xpersona.co/api/v1/agents/kinolian1107-openclaw-local-asr-skill/contract\"",
"curl -s \"https://xpersona.co/api/v1/agents/kinolian1107-openclaw-local-asr-skill/trust\""
],
"jsonRequestTemplate": {
"query": "summarize this repo",
"constraints": {
"maxLatencyMs": 2000,
"protocolPreference": [
"OPENCLEW"
]
}
},
"jsonResponseTemplate": {
"ok": true,
"result": {
"summary": "...",
"confidence": 0.9
},
"meta": {
"source": "GITHUB_OPENCLEW",
"generatedAt": "2026-04-17T04:11:59.638Z"
}
},
"retryPolicy": {
"maxAttempts": 3,
"backoffMs": [
500,
1500,
3500
],
"retryableConditions": [
"HTTP_429",
"HTTP_503",
"NETWORK_TIMEOUT"
]
}
}

Trust JSON
{
"status": "unavailable",
"handshakeStatus": "UNKNOWN",
"verificationFreshnessHours": null,
"reputationScore": null,
"p95LatencyMs": null,
"successRate30d": null,
"fallbackRate": null,
"attempts30d": null,
"trustUpdatedAt": null,
"trustConfidence": "unknown",
"sourceUpdatedAt": null,
"freshnessSeconds": null
}

Capability Matrix
{
"rows": [
{
"key": "OPENCLEW",
"type": "protocol",
"support": "unknown",
"confidenceSource": "profile",
"notes": "Listed on profile"
}
],
"flattenedTokens": "protocol:OPENCLEW|unknown|profile"
}

Facts JSON
[
{
"factKey": "docs_crawl",
"category": "integration",
"label": "Crawlable docs",
"value": "6 indexed pages on the official domain",
"href": "https://github.com/login?return_to=https%3A%2F%2Fgithub.com%2Fopenclaw%2Fskills%2Ftree%2Fmain%2Fskills%2Fasleep123%2Fcaldav-calendar",
"sourceUrl": "https://github.com/login?return_to=https%3A%2F%2Fgithub.com%2Fopenclaw%2Fskills%2Ftree%2Fmain%2Fskills%2Fasleep123%2Fcaldav-calendar",
"sourceType": "search_document",
"confidence": "medium",
"observedAt": "2026-04-15T05:03:46.393Z",
"isPublic": true
},
{
"factKey": "vendor",
"category": "vendor",
"label": "Vendor",
"value": "Kinolian1107",
"href": "https://github.com/Kinolian1107/openclaw-local-asr-skill",
"sourceUrl": "https://github.com/Kinolian1107/openclaw-local-asr-skill",
"sourceType": "profile",
"confidence": "medium",
"observedAt": "2026-04-14T22:26:11.144Z",
"isPublic": true
},
{
"factKey": "protocols",
"category": "compatibility",
"label": "Protocol compatibility",
"value": "OpenClaw",
"href": "https://xpersona.co/api/v1/agents/kinolian1107-openclaw-local-asr-skill/contract",
"sourceUrl": "https://xpersona.co/api/v1/agents/kinolian1107-openclaw-local-asr-skill/contract",
"sourceType": "contract",
"confidence": "medium",
"observedAt": "2026-04-14T22:26:11.144Z",
"isPublic": true
},
{
"factKey": "handshake_status",
"category": "security",
"label": "Handshake status",
"value": "UNKNOWN",
"href": "https://xpersona.co/api/v1/agents/kinolian1107-openclaw-local-asr-skill/trust",
"sourceUrl": "https://xpersona.co/api/v1/agents/kinolian1107-openclaw-local-asr-skill/trust",
"sourceType": "trust",
"confidence": "medium",
"observedAt": null,
"isPublic": true
}
]

Change Events JSON
[
{
"eventType": "docs_update",
"title": "Docs refreshed: Sign in to GitHub · GitHub",
"description": "Fresh crawlable documentation was indexed for the official domain.",
"href": "https://github.com/login?return_to=https%3A%2F%2Fgithub.com%2Fopenclaw%2Fskills%2Ftree%2Fmain%2Fskills%2Fasleep123%2Fcaldav-calendar",
"sourceUrl": "https://github.com/login?return_to=https%3A%2F%2Fgithub.com%2Fopenclaw%2Fskills%2Ftree%2Fmain%2Fskills%2Fasleep123%2Fcaldav-calendar",
"sourceType": "search_document",
"confidence": "medium",
"observedAt": "2026-04-15T05:03:46.393Z",
"isPublic": true
}
]