OpenClaw (formerly Moltbot/Clawdbot) is a self-hosted personal AI assistant that connects to your messaging apps (WhatsApp, Telegram, Discord, iMessage, etc.) and can take actions on your behalf. With screenpipe, OpenClaw can recall what you’ve seen on screen, reference past conversations, and answer questions about your digital history.

quick start (one command)

```bash
# full setup with morning summaries at 8am
bunx @screenpipe/agent --setup user@your-server --morning 08:00

# that's it! you'll get daily briefings via telegram/whatsapp/etc
```
this single command:
  1. syncs your screen data (permanent daemon, survives reboot)
  2. installs screenpipe skills (recall, search, digest, context)
  3. schedules morning summary cron job

manual setup (if you prefer)

```bash
# step 1: sync data
bunx @screenpipe/sync --daemon --remote user@your-server:~/.screenpipe/

# step 2: install skills
bunx @screenpipe/skills install --remote user@your-server
```

setup options

option 1: custom skill (advanced)

for full control, create a skill that queries screenpipe’s SQLite database directly:
  1. create ~/openclaw/skills/screenpipe/skill.md:
---
name: screenpipe
description: Search screen recordings and audio transcriptions from the user's computer
tools:
  - Bash
---

# screenpipe skill

query the user's screen history via SQLite at ~/.screenpipe/db.sqlite.

## search content

```bash
# full-text search
sqlite3 ~/.screenpipe/db.sqlite "
  SELECT f.timestamp, o.app_name, substr(o.text, 1, 200)
  FROM ocr_text o
  JOIN frames f ON o.frame_id = f.id
  WHERE o.text LIKE '%QUERY%'
  ORDER BY f.timestamp DESC
  LIMIT 20;
"

# app usage today
sqlite3 ~/.screenpipe/db.sqlite "
  SELECT o.app_name, COUNT(*) as frames
  FROM ocr_text o
  JOIN frames f ON o.frame_id = f.id
  WHERE date(f.timestamp) = date('now')
  GROUP BY o.app_name
  ORDER BY frames DESC;
"

2. restart OpenClaw to load the skill
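before wiring these queries into a skill, you can sanity-check them against a scratch database with the same two tables (the frames/ocr_text schema below is inferred from the queries above — verify it against your actual db.sqlite):

```bash
# build a throwaway db matching the schema the skill assumes,
# then run the same full-text search against it
DB=$(mktemp)
sqlite3 "$DB" <<'SQL'
CREATE TABLE frames (id INTEGER PRIMARY KEY, timestamp TEXT);
CREATE TABLE ocr_text (frame_id INTEGER, app_name TEXT, text TEXT);
INSERT INTO frames VALUES (1, '2025-01-01 09:00:00');
INSERT INTO ocr_text VALUES (1, 'Chrome', 'quarterly budget spreadsheet');
SQL
sqlite3 "$DB" "
  SELECT f.timestamp, o.app_name, substr(o.text, 1, 200)
  FROM ocr_text o
  JOIN frames f ON o.frame_id = f.id
  WHERE o.text LIKE '%budget%'
  ORDER BY f.timestamp DESC
  LIMIT 20;
"
# → 2025-01-01 09:00:00|Chrome|quarterly budget spreadsheet
rm -f "$DB"
```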

option 2: REST API (requires tunnel)

if OpenClaw runs on the same machine as screenpipe, query the REST API directly:

```bash
# search content
curl -s "http://localhost:3030/search?q=QUERY&limit=20"

# filter by type: ocr, audio, ui
curl -s "http://localhost:3030/search?q=QUERY&content_type=ocr"

# filter by app
curl -s "http://localhost:3030/search?q=QUERY&app_name=Chrome"
```

for remote setups, use a secure tunnel (tailscale, cloudflare tunnel).
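the search endpoint returns JSON; a sketch of pulling fields out with jq. the response shape below is an assumption based on the fields used elsewhere on this page — check your screenpipe version's actual output before relying on it:

```bash
# in practice, pipe `curl -s "http://localhost:3030/search?q=QUERY"` into jq;
# the echoed sample stands in for a real response (shape is an assumption)
echo '{"data":[{"content":{"timestamp":"2025-01-01T09:00:00Z","app_name":"Chrome","text":"budget spreadsheet"}}]}' |
  jq -r '.data[].content | "\(.timestamp) \(.app_name): \(.text)"'
# → 2025-01-01T09:00:00Z Chrome: budget spreadsheet
```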

option 3: MCP

OpenClaw supports MCP via the mcporter skill:

```bash
# test the MCP server
npx @modelcontextprotocol/inspector npx screenpipe-mcp
```

add to your OpenClaw MCP config:

```json
{
  "screenpipe": {
    "command": "npx",
    "args": ["-y", "screenpipe-mcp"]
  }
}
```
MCP uses more context tokens than skills. the OpenClaw team recommends skills for most use cases.

example prompts

once configured, message OpenClaw from any chat app:
  • “what was I reading about yesterday afternoon?”
  • “find the slack message from john about the deployment”
  • “what code was I looking at in cursor this morning?”
  • “summarize my meetings from last week”
  • “what tabs did I have open when researching that bug?”
  • “when did I last see the budget spreadsheet?”
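under the hood, prompts like “yesterday afternoon” have to resolve to a concrete timestamp window before they can filter frames. a pure-SQL sketch of that resolution (no database file needed; the 12:00–18:00 bounds for “afternoon” are an assumption):

```bash
# resolve "yesterday afternoon" with SQLite date math; plug the two bounds
# into a BETWEEN clause on frames.timestamp
sqlite3 :memory: "
  SELECT datetime('now', '-1 day', 'start of day', '+12 hours') AS window_start,
         datetime('now', '-1 day', 'start of day', '+18 hours') AS window_end;
"
```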

requirements

  • screenpipe running locally on your computer
  • SSH access to your OpenClaw server (for sync)
  • or: screenpipe + OpenClaw on same machine (for REST API)

troubleshooting

sync not working?
  • check SSH access: ssh user@openclaw-server "echo ok"
  • check daemon: launchctl list | grep screenpipe
  • check logs: tail /tmp/screenpipe-sync.log
skills not loading?
  • check files exist: ssh server "ls ~/openclaw/skills/screenpipe/"
  • restart OpenClaw to reload skills
no results from queries?
  • verify screenpipe is running locally: curl http://localhost:3030/health
  • check data exists: sqlite3 ~/.screenpipe/db.sqlite "SELECT COUNT(*) FROM ocr_text;"
  • ensure screenpipe has screen recording permissions

packages

| Package | Description |
| --- | --- |
| @screenpipe/agent | One-liner full setup (recommended) |
| @screenpipe/sync | Sync screen data to remote servers |
| @screenpipe/skills | Pre-built skills for AI agents |