Crawler Summary

reachy-mini answer-first brief

Complete SDK for controlling the Reachy Mini robot: head movement, antennas, camera, audio, and motion recording/playback. Covers architecture (daemon/client), deployment modes (USB, wireless, simulation, on-Pi), and app distribution, plus advanced application patterns: MovementManager, layered motion, audio-reactive movement, face tracking, LLM tool systems, and OpenAI realtime integration. Use when: (1) writing code to control Reachy Mini, (2) moving the robot head or antennas, (3) accessing camera/video, (4) playing/recording audio, (5) recording or playing back motions, (6) looking at points in image or world space, (7) understanding robot capabilities, (8) connecting to a real or simulated robot, (9) building conversational AI apps, (10) integrating with LLMs/OpenAI, (11) deploying apps to the robot, (12) any robotics task with Reachy Mini. Capability contract not published. No trust telemetry is available yet. Last updated 4/15/2026.

Freshness

Last checked 4/15/2026

Best For

reachy-mini is best for general automation workflows where OpenClaw compatibility matters.

Not Ideal For

Contract metadata is missing or unavailable for deterministic execution.

Evidence Sources Checked

editorial-content, GITHUB OPENCLAW, runtime-metrics, public facts pack

Agent Dossier · GitHub · Safety: 80/100

reachy-mini

Complete SDK for controlling the Reachy Mini robot: head movement, antennas, camera, audio, and motion recording/playback. Covers architecture (daemon/client), deployment modes (USB, wireless, simulation, on-Pi), and app distribution, plus advanced application patterns: MovementManager, layered motion, audio-reactive movement, face tracking, LLM tool systems, and OpenAI realtime integration.

OpenClaw (self-declared)

Public facts

4

Change events

1

Artifacts

0

Freshness

Apr 15, 2026

Verified: editorial-content. No verified compatibility signals.

Capability contract not published. No trust telemetry is available yet. Last updated 4/15/2026.

Trust evidence available

Trust score

Unknown

Compatibility

OpenClaw

Freshness

Apr 15, 2026

Vendor

Gary149

Artifacts

0

Benchmarks

0

Last release

Unpublished

Executive Summary

Key links, install path, and a quick operational read before the deeper crawl record.

Verified: editorial-content

Summary

Capability contract not published. No trust telemetry is available yet. Last updated 4/15/2026.

Setup snapshot

git clone https://github.com/gary149/reachy-mini-skill.git
  1. Setup complexity is LOW. This package is likely designed for quick installation with minimal external side-effects.

  2. Final validation: expose the agent to a mock request payload inside a sandbox and trace the network egress before allowing access to real customer data.

Evidence Ledger

Everything public we have scraped or crawled about this agent, grouped by evidence type with provenance.

Verified: editorial-content
Vendor (1)

Vendor

Gary149

profile · medium confidence
Observed Apr 15, 2026 · Source link · Provenance
Compatibility (1)

Protocol compatibility

OpenClaw

contract · medium confidence
Observed Apr 15, 2026 · Source link · Provenance
Security (1)

Handshake status

UNKNOWN

trust · medium confidence
Observed: unknown · Source link · Provenance
Integration (1)

Crawlable docs

6 indexed pages on the official domain

search_document · medium confidence
Observed Apr 15, 2026 · Source link · Provenance

Release & Crawl Timeline

Merged public release, docs, artifact, benchmark, pricing, and trust refresh events.

Self-declared: agent-index

Artifacts Archive

Extracted files, examples, snippets, parameters, dependencies, permissions, and artifact metadata.

Self-declared: GITHUB OPENCLAW

Extracted files

0

Examples

6

Snippets

0

Languages

typescript

Parameters

Executable Examples

python

from reachy_mini import ReachyMini
from reachy_mini.utils import create_head_pose
import numpy as np

with ReachyMini() as robot:
    robot.wake_up()

    # Move head
    pose = create_head_pose(x=0, y=0, z=0, roll=0, pitch=10, yaw=20, degrees=True)
    robot.goto_target(head=pose, antennas=[0.3, -0.3], duration=1.0)

    # Get camera frame
    frame = robot.media.get_frame()  # Returns BGR numpy array

    robot.goto_sleep()

python

# Local USB connection (default)
ReachyMini()

# Network discovery
ReachyMini(localhost_only=False)

# Simulation mode
ReachyMini(use_sim=True)

# Auto-spawn daemon
ReachyMini(spawn_daemon=True)

# Full options
ReachyMini(
    robot_name="reachy_mini",       # Robot identifier
    localhost_only=True,            # True=local daemon, False=network discovery
    spawn_daemon=False,             # Auto-spawn daemon process
    use_sim=False,                  # Use MuJoCo simulation
    timeout=5.0,                    # Connection timeout (seconds)
    automatic_body_yaw=True,        # Auto body yaw in IK
    log_level="INFO",               # "DEBUG", "INFO", "WARNING", "ERROR"
    media_backend="default"         # "default", "gstreamer", "webrtc", "no_media"
)

python

from reachy_mini.utils import create_head_pose

# By position and rotation (degrees by default)
pose = create_head_pose(x=0, y=0, z=0, roll=0, pitch=15, yaw=-10, degrees=True)

# In radians
pose = create_head_pose(pitch=0.26, yaw=-0.17, degrees=False)

# Position in millimeters
pose = create_head_pose(x=50, y=0, z=30, mm=True)

python

from reachy_mini.motion.goto_move import InterpolationTechnique

# Immediate position (no interpolation)
robot.set_target(head=pose, antennas=[0.5, -0.5], body_yaw=0.1)

# Smooth motion with duration
robot.goto_target(
    head=pose,
    antennas=[0.5, -0.5],  # [right, left] in radians
    duration=1.0,
    method=InterpolationTechnique.MIN_JERK,
    body_yaw=0.0
)

python

# Look at pixel coordinates in camera image
robot.look_at_image(u=320, v=240, duration=0.5)

# Look at 3D world point (meters from robot origin)
robot.look_at_world(x=0.5, y=0.1, z=0.3, duration=0.5)

# Get pose without moving
pose = robot.look_at_image(u=320, v=240, perform_movement=False)

python

# Current head pose (4x4 matrix)
pose = robot.get_current_head_pose()

# Joint positions
head_joints, antenna_joints = robot.get_current_joint_positions()
# head_joints: 7 values (body_rotation + 6 stewart platform)
# antenna_joints: 2 values [right, left]

# Antenna positions only
antennas = robot.get_present_antenna_joint_positions()  # [right, left]

Docs & README

Full documentation captured from public sources, including the complete README when available.

Self-declared: GITHUB OPENCLAW

Docs source

GITHUB OPENCLAW

Editorial quality

ready

Complete SDK for controlling the Reachy Mini robot: head movement, antennas, camera, audio, and motion recording/playback. Covers architecture (daemon/client), deployment modes (USB, wireless, simulation, on-Pi), and app distribution, plus advanced application patterns: MovementManager, layered motion, audio-reactive movement, face tracking, LLM tool systems, and OpenAI realtime integration.

Full README

name: reachy-mini
description: |
  Complete SDK for controlling Reachy Mini robot - head movement, antennas, camera, audio, motion recording/playback. Covers architecture (daemon/client), deployment modes (USB, wireless, simulation, on-Pi), and app distribution. Also includes advanced application patterns: MovementManager, layered motion, audio-reactive movement, face tracking, LLM tool systems, and OpenAI realtime integration. Use when: (1) Writing code to control Reachy Mini, (2) Moving the robot head or antennas, (3) Accessing camera/video, (4) Playing/recording audio, (5) Recording or playing back motions, (6) Looking at points in image or world space, (7) Understanding robot capabilities, (8) Connecting to real or simulated robot, (9) Building conversational AI apps, (10) Integrating with LLMs/OpenAI, (11) Deploying apps to robot, (12) Any robotics task with Reachy Mini.

Reachy Mini SDK

Quick Start

from reachy_mini import ReachyMini
from reachy_mini.utils import create_head_pose
import numpy as np

with ReachyMini() as robot:
    robot.wake_up()

    # Move head
    pose = create_head_pose(x=0, y=0, z=0, roll=0, pitch=10, yaw=20, degrees=True)
    robot.goto_target(head=pose, antennas=[0.3, -0.3], duration=1.0)

    # Get camera frame
    frame = robot.media.get_frame()  # Returns BGR numpy array

    robot.goto_sleep()

Connection Options

# Local USB connection (default)
ReachyMini()

# Network discovery
ReachyMini(localhost_only=False)

# Simulation mode
ReachyMini(use_sim=True)

# Auto-spawn daemon
ReachyMini(spawn_daemon=True)

# Full options
ReachyMini(
    robot_name="reachy_mini",       # Robot identifier
    localhost_only=True,            # True=local daemon, False=network discovery
    spawn_daemon=False,             # Auto-spawn daemon process
    use_sim=False,                  # Use MuJoCo simulation
    timeout=5.0,                    # Connection timeout (seconds)
    automatic_body_yaw=True,        # Auto body yaw in IK
    log_level="INFO",               # "DEBUG", "INFO", "WARNING", "ERROR"
    media_backend="default"         # "default", "gstreamer", "webrtc", "no_media"
)

Head & Antenna Control

Creating Poses

from reachy_mini.utils import create_head_pose

# By position and rotation (degrees by default)
pose = create_head_pose(x=0, y=0, z=0, roll=0, pitch=15, yaw=-10, degrees=True)

# In radians
pose = create_head_pose(pitch=0.26, yaw=-0.17, degrees=False)

# Position in millimeters
pose = create_head_pose(x=50, y=0, z=30, mm=True)
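create_head_pose returns a 4x4 homogeneous transform. As a rough numpy sketch of what such a pose matrix contains (the Rz·Ry·Rx rotation order here is an illustrative assumption, not confirmed SDK behavior):

```python
import numpy as np

def head_pose(x=0.0, y=0.0, z=0.0, roll=0.0, pitch=0.0, yaw=0.0, degrees=True):
    """Build a 4x4 homogeneous transform from a translation plus roll/pitch/yaw.

    Illustrative only: the rotation convention (Rz @ Ry @ Rx) is an assumption.
    """
    if degrees:
        roll, pitch, yaw = np.radians([roll, pitch, yaw])
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx   # rotation block
    T[:3, 3] = [x, y, z]       # translation block (meters)
    return T
```

Reading the top-left 3x3 block gives orientation and the last column gives position, which is also how get_current_head_pose() results can be interpreted.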

Moving the Robot

from reachy_mini.motion.goto_move import InterpolationTechnique

# Immediate position (no interpolation)
robot.set_target(head=pose, antennas=[0.5, -0.5], body_yaw=0.1)

# Smooth motion with duration
robot.goto_target(
    head=pose,
    antennas=[0.5, -0.5],  # [right, left] in radians
    duration=1.0,
    method=InterpolationTechnique.MIN_JERK,
    body_yaw=0.0
)

Interpolation methods:

  • LINEAR - Linear interpolation
  • MIN_JERK - Default, smoothest motion
  • EASE_IN_OUT - Smooth start and end
  • CARTOON - Exaggerated animation style
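For intuition, the classic minimum-jerk profile (which MIN_JERK presumably implements; the SDK's exact formula is not documented here) is the fifth-order polynomial with zero velocity and acceleration at both ends:

```python
import numpy as np

def min_jerk(t, duration):
    """Minimum-jerk position profile from 0 to 1 over `duration` seconds.

    s(0) = 0, s(T) = 1, with zero velocity and acceleration at both endpoints.
    """
    tau = np.clip(np.asarray(t, dtype=float) / duration, 0.0, 1.0)
    return 10 * tau**3 - 15 * tau**4 + 6 * tau**5

# Interpolate a head yaw from 0 to 20 degrees over 1 second
t = np.linspace(0.0, 1.0, 5)
yaw = 20.0 * min_jerk(t, duration=1.0)
```

This is why MIN_JERK motions start and stop gently instead of snapping like LINEAR.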

Look-At Functions

# Look at pixel coordinates in camera image
robot.look_at_image(u=320, v=240, duration=0.5)

# Look at 3D world point (meters from robot origin)
robot.look_at_world(x=0.5, y=0.1, z=0.3, duration=0.5)

# Get pose without moving
pose = robot.look_at_image(u=320, v=240, perform_movement=False)

Antenna Values

Antennas are [right_angle, left_angle] in radians:

  • 0.0 = antennas down/closed
  • Positive = antennas up/open
  • Typical range: -0.5 to 1.0 radians

State Queries

# Current head pose (4x4 matrix)
pose = robot.get_current_head_pose()

# Joint positions
head_joints, antenna_joints = robot.get_current_joint_positions()
# head_joints: 7 values (body_rotation + 6 stewart platform)
# antenna_joints: 2 values [right, left]

# Antenna positions only
antennas = robot.get_present_antenna_joint_positions()  # [right, left]

Motor Control

# Enable/disable all motors
robot.enable_motors()
robot.disable_motors()

# Specific motors
robot.enable_motors(ids=["right_antenna", "left_antenna"])
robot.disable_motors(ids=["body_rotation"])

Motor IDs: "body_rotation", "stewart_1" through "stewart_6", "right_antenna", "left_antenna"

# Gravity compensation (requires Placo kinematics)
robot.enable_gravity_compensation()
robot.disable_gravity_compensation()

Behaviors

robot.wake_up()     # Wake animation + sound
robot.goto_sleep()  # Sleep position + sound

Camera

# Get frame (BGR numpy array, or None if unavailable)
frame = robot.media.get_frame()

# Camera properties
width, height = robot.media.camera.resolution
fps = robot.media.camera.framerate
K = robot.media.camera.K  # 3x3 intrinsic matrix
D = robot.media.camera.D  # Distortion coefficients

# Change resolution
from reachy_mini.media.camera.camera_constants import CameraResolution
robot.media.camera.set_resolution(CameraResolution.R1920x1080at30fps)

Common resolutions: R1280x720at30fps, R1280x720at60fps, R1920x1080at30fps, R1920x1080at60fps, R3840x2160at30fps

Audio

# Play sound file
robot.media.play_sound("wake_up.wav")

# Record audio
robot.media.start_recording()
sample = robot.media.get_audio_sample()  # numpy array
robot.media.stop_recording()

# Stream audio output
robot.media.start_playing()
robot.media.push_audio_sample(audio_data)
robot.media.stop_playing()

# Audio specs
sample_rate = robot.media.get_input_audio_samplerate()  # 16000 Hz
channels = robot.media.get_input_channels()  # 2

# Direction of Arrival (ReSpeaker only)
angle, valid = robot.media.get_DoA()  # angle in radians (0=left, pi/2=front, pi=right)
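A common use of DoA is turning toward a speaker. The mapping below follows the stated convention (0=left, pi/2=front, pi=right); the assumption that positive yaw turns the robot to its left is illustrative, not confirmed by the SDK docs:

```python
import numpy as np

def doa_to_yaw(angle):
    """Map a ReSpeaker DoA angle (0=left, pi/2=front, pi=right) to a body-yaw
    offset in radians, assuming positive yaw turns the robot to its left."""
    return np.pi / 2 - angle

# Hedged usage sketch (requires a connected robot):
# angle, valid = robot.media.get_DoA()
# if valid:
#     robot.goto_target(body_yaw=doa_to_yaw(angle), duration=0.6)
```

Sound dead ahead maps to a zero offset; sound fully to the left or right maps to a quarter-turn in the corresponding direction.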

Motion Recording & Playback

Recording

robot.start_recording()
# ... perform motions manually or via code ...
recorded_data = robot.stop_recording()
# Returns list of dicts with timestamps, poses, joint positions
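Since the result is a plain list of dicts, persisting a take is ordinary JSON serialization (the field names in the sample record are illustrative, not the SDK's actual schema):

```python
import json
import os
import tempfile

def save_recording(recorded_data, path):
    """Write a recorded motion (list of per-sample dicts) to a JSON file."""
    with open(path, "w") as f:
        json.dump(recorded_data, f, indent=2)

def load_recording(path):
    """Read a previously saved recording back into a list of dicts."""
    with open(path) as f:
        return json.load(f)

# Illustrative sample; real records come from robot.stop_recording()
take = [{"t": 0.0, "antennas": [0.3, -0.3]},
        {"t": 0.1, "antennas": [0.35, -0.25]}]
path = os.path.join(tempfile.gettempdir(), "reachy_take.json")
save_recording(take, path)
```

This keeps takes versionable alongside app code, independent of the HuggingFace move libraries.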

Playing Recorded Moves

from reachy_mini.motion.recorded_move import RecordedMoves

# Load move library from HuggingFace
moves = RecordedMoves("pollen-robotics/reachy-mini-dances-library")

# List available moves
print(moves.list_moves())

# Play a move
move = moves.get("dance_name")
robot.play_move(move, initial_goto_duration=1.0, sound=True)

# Async playback
await robot.async_play_move(move)

Kinematics

Three engines available:

| Engine | Install | Speed | Features |
|--------|---------|-------|----------|
| AnalyticalKinematics | Default | Fast | Always available |
| PlacoKinematics | pip install reachy_mini[placo_kinematics] | Medium | Collision checking, gravity compensation |
| NNKinematics | pip install reachy_mini[nn_kinematics] | Very fast | Neural network based |

# Direct kinematics access
from reachy_mini.kinematics.analytical import AnalyticalKinematics

kin = AnalyticalKinematics()
joint_angles = kin.ik(pose, body_yaw=0.0)  # Inverse kinematics: pose -> joints
pose = kin.fk(joint_angles)                 # Forward kinematics: joints -> pose

Simulation

# Start with simulation
robot = ReachyMini(use_sim=True)

# Or via daemon CLI
# reachy-mini-daemon --sim

Common Patterns

Face Tracking

# Detect face, get center coordinates (u, v)
robot.look_at_image(u=face_center_x, v=face_center_y, duration=0.3)
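Raw detector output jitters frame to frame, so smoothing the face center before each look_at_image call keeps the head steady. A generic exponential-smoothing sketch, independent of whichever face detector you use:

```python
class SmoothedTarget:
    """Exponential smoothing for a 2D pixel target (e.g. a face center)."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha  # 0 < alpha <= 1; higher = more responsive, less smooth
        self.uv = None

    def update(self, u, v):
        """Blend the new measurement into the running estimate and return it."""
        if self.uv is None:
            self.uv = (u, v)
        else:
            pu, pv = self.uv
            self.uv = (pu + self.alpha * (u - pu), pv + self.alpha * (v - pv))
        return self.uv

# Hedged usage sketch (requires a detector and a connected robot):
# tracker = SmoothedTarget()
# u, v = tracker.update(face_center_x, face_center_y)
# robot.look_at_image(u=u, v=v, duration=0.3)
```

Tuning alpha trades responsiveness against jitter rejection.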

Expressive Movements

# Happy - antennas up
robot.goto_target(antennas=[0.8, 0.8], duration=0.3)

# Sad - antennas down
robot.goto_target(antennas=[-0.3, -0.3], duration=0.5)

# Curious tilt
pose = create_head_pose(roll=15, pitch=10, degrees=True)
robot.goto_target(head=pose, duration=0.4)

Idle Animation Loop

import time
while True:
    robot.goto_target(head=create_head_pose(yaw=10, degrees=True), duration=2.0)
    time.sleep(2.0)
    robot.goto_target(head=create_head_pose(yaw=-10, degrees=True), duration=2.0)
    time.sleep(2.0)
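The loop above snaps between endpoints; sampling a slow sine wave gives a smoother sway. The 10 Hz set_target rate in the sketch is a guess at a comfortable update rate, not an SDK requirement:

```python
import math

def idle_yaw(t, amplitude_deg=10.0, period_s=4.0):
    """Yaw angle in degrees at time t (seconds) for a gentle sinusoidal sway."""
    return amplitude_deg * math.sin(2 * math.pi * t / period_s)

# Hedged usage sketch (requires a connected robot):
# import time
# start = time.time()
# while True:
#     yaw = idle_yaw(time.time() - start)
#     robot.set_target(head=create_head_pose(yaw=yaw, degrees=True))
#     time.sleep(0.1)
```

Because set_target applies immediately, frequent small updates trace the waveform without queueing goto moves.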

Reference Documentation

  • Architecture & Deployment - Daemon/client split, deployment modes (USB, wireless, simulation), running code, app distribution
  • API Reference - Complete method signatures, all parameters and return types
  • Motion Reference - Interpolation details, Move classes, GotoMove, RecordedMove
  • Media Reference - All camera resolutions, audio specs, backend options
  • Application Patterns - Advanced patterns: MovementManager, layered motion, audio-reactive movement, face tracking, LLM tool systems, OpenAI realtime integration

Contract & API

Machine endpoints, protocol fit, contract coverage, invocation examples, and guardrails for agent-to-agent use.

Missing: GITHUB OPENCLAW

Contract coverage

Status

missing

Auth

None

Streaming

No

Data region

Unspecified

Protocol support

OpenClaw: self-declared

Requires: none

Forbidden: none

Guardrails

Operational confidence: low

No positive guardrails captured.
Invocation examples
curl -s "https://xpersona.co/api/v1/agents/gary149-reachy-mini-skill/snapshot"
curl -s "https://xpersona.co/api/v1/agents/gary149-reachy-mini-skill/contract"
curl -s "https://xpersona.co/api/v1/agents/gary149-reachy-mini-skill/trust"

Reliability & Benchmarks

Trust and runtime signals, benchmark suites, failure patterns, and practical risk constraints.

Missing: runtime-metrics

Trust signals

Handshake

UNKNOWN

Confidence

unknown

Attempts 30d

unknown

Fallback rate

unknown

Runtime metrics

Observed P50

unknown

Observed P95

unknown

Rate limit

unknown

Estimated cost

unknown

Do not use if

Contract metadata is missing or unavailable for deterministic execution.
No benchmark suites or observed failure patterns are available.

Media & Demo

Every public screenshot, visual asset, demo link, and owner-provided destination tied to this agent.

Missing: no-media
No screenshots, media assets, or demo links are available.

Related Agents

Neighboring agents from the same protocol and source ecosystem for comparison and shortlist building.

Self-declared: protocol-neighbors
GITHUB_REPOS: activepieces

Rank

70

AI Agents & MCPs & AI Workflow Automation • (~400 MCP servers for AI agents) • AI Automation / AI Agent with MCPs • AI Workflows & AI Agents • MCPs for AI Agents

Traction

No public download signal

Freshness

Updated 2d ago

OPENCLAW
GITHUB_REPOS: cherry-studio

Rank

70

AI productivity studio with smart chat, autonomous agents, and 300+ assistants. Unified access to frontier LLMs

Traction

No public download signal

Freshness

Updated 5d ago

MCP · OPENCLAW
GITHUB_REPOS: AionUi

Rank

70

Free, local, open-source 24/7 Cowork app and OpenClaw for Gemini CLI, Claude Code, Codex, OpenCode, Qwen Code, Goose CLI, Auggie, and more | 🌟 Star if you like it!

Traction

No public download signal

Freshness

Updated 6d ago

MCP · OPENCLAW
GITHUB_REPOS: CopilotKit

Rank

70

The Frontend for Agents & Generative UI. React + Angular

Traction

No public download signal

Freshness

Updated 23d ago

OPENCLAW
Machine Appendix

Contract JSON

{
  "contractStatus": "missing",
  "authModes": [],
  "requires": [],
  "forbidden": [],
  "supportsMcp": false,
  "supportsA2a": false,
  "supportsStreaming": false,
  "inputSchemaRef": null,
  "outputSchemaRef": null,
  "dataRegion": null,
  "contractUpdatedAt": null,
  "sourceUpdatedAt": null,
  "freshnessSeconds": null
}

Invocation Guide

{
  "preferredApi": {
    "snapshotUrl": "https://xpersona.co/api/v1/agents/gary149-reachy-mini-skill/snapshot",
    "contractUrl": "https://xpersona.co/api/v1/agents/gary149-reachy-mini-skill/contract",
    "trustUrl": "https://xpersona.co/api/v1/agents/gary149-reachy-mini-skill/trust"
  },
  "curlExamples": [
    "curl -s \"https://xpersona.co/api/v1/agents/gary149-reachy-mini-skill/snapshot\"",
    "curl -s \"https://xpersona.co/api/v1/agents/gary149-reachy-mini-skill/contract\"",
    "curl -s \"https://xpersona.co/api/v1/agents/gary149-reachy-mini-skill/trust\""
  ],
  "jsonRequestTemplate": {
    "query": "summarize this repo",
    "constraints": {
      "maxLatencyMs": 2000,
      "protocolPreference": [
        "OPENCLEW"
      ]
    }
  },
  "jsonResponseTemplate": {
    "ok": true,
    "result": {
      "summary": "...",
      "confidence": 0.9
    },
    "meta": {
      "source": "GITHUB_OPENCLEW",
      "generatedAt": "2026-04-17T00:56:07.166Z"
    }
  },
  "retryPolicy": {
    "maxAttempts": 3,
    "backoffMs": [
      500,
      1500,
      3500
    ],
    "retryableConditions": [
      "HTTP_429",
      "HTTP_503",
      "NETWORK_TIMEOUT"
    ]
  }
}

Trust JSON

{
  "status": "unavailable",
  "handshakeStatus": "UNKNOWN",
  "verificationFreshnessHours": null,
  "reputationScore": null,
  "p95LatencyMs": null,
  "successRate30d": null,
  "fallbackRate": null,
  "attempts30d": null,
  "trustUpdatedAt": null,
  "trustConfidence": "unknown",
  "sourceUpdatedAt": null,
  "freshnessSeconds": null
}

Capability Matrix

{
  "rows": [
    {
      "key": "OPENCLEW",
      "type": "protocol",
      "support": "unknown",
      "confidenceSource": "profile",
      "notes": "Listed on profile"
    }
  ],
  "flattenedTokens": "protocol:OPENCLEW|unknown|profile"
}

Facts JSON

[
  {
    "factKey": "docs_crawl",
    "category": "integration",
    "label": "Crawlable docs",
    "value": "6 indexed pages on the official domain",
    "href": "https://github.com/login?return_to=https%3A%2F%2Fgithub.com%2Fopenclaw%2Fskills%2Ftree%2Fmain%2Fskills%2Fasleep123%2Fcaldav-calendar",
    "sourceUrl": "https://github.com/login?return_to=https%3A%2F%2Fgithub.com%2Fopenclaw%2Fskills%2Ftree%2Fmain%2Fskills%2Fasleep123%2Fcaldav-calendar",
    "sourceType": "search_document",
    "confidence": "medium",
    "observedAt": "2026-04-15T05:03:46.393Z",
    "isPublic": true
  },
  {
    "factKey": "vendor",
    "category": "vendor",
    "label": "Vendor",
    "value": "Gary149",
    "href": "https://github.com/gary149/reachy-mini-skill",
    "sourceUrl": "https://github.com/gary149/reachy-mini-skill",
    "sourceType": "profile",
    "confidence": "medium",
    "observedAt": "2026-04-15T04:12:19.789Z",
    "isPublic": true
  },
  {
    "factKey": "protocols",
    "category": "compatibility",
    "label": "Protocol compatibility",
    "value": "OpenClaw",
    "href": "https://xpersona.co/api/v1/agents/gary149-reachy-mini-skill/contract",
    "sourceUrl": "https://xpersona.co/api/v1/agents/gary149-reachy-mini-skill/contract",
    "sourceType": "contract",
    "confidence": "medium",
    "observedAt": "2026-04-15T04:12:19.789Z",
    "isPublic": true
  },
  {
    "factKey": "handshake_status",
    "category": "security",
    "label": "Handshake status",
    "value": "UNKNOWN",
    "href": "https://xpersona.co/api/v1/agents/gary149-reachy-mini-skill/trust",
    "sourceUrl": "https://xpersona.co/api/v1/agents/gary149-reachy-mini-skill/trust",
    "sourceType": "trust",
    "confidence": "medium",
    "observedAt": null,
    "isPublic": true
  }
]

Change Events JSON

[
  {
    "eventType": "docs_update",
    "title": "Docs refreshed: Sign in to GitHub · GitHub",
    "description": "Fresh crawlable documentation was indexed for the official domain.",
    "href": "https://github.com/login?return_to=https%3A%2F%2Fgithub.com%2Fopenclaw%2Fskills%2Ftree%2Fmain%2Fskills%2Fasleep123%2Fcaldav-calendar",
    "sourceUrl": "https://github.com/login?return_to=https%3A%2F%2Fgithub.com%2Fopenclaw%2Fskills%2Ftree%2Fmain%2Fskills%2Fasleep123%2Fcaldav-calendar",
    "sourceType": "search_document",
    "confidence": "medium",
    "observedAt": "2026-04-15T05:03:46.393Z",
    "isPublic": true
  }
]
