how it works
- continuous capture — screenpipe records your screen at configurable intervals (default: 1 frame/second)
- OCR extraction — text is extracted from every frame using native OCR engines
- local storage — everything stored in a local SQLite database
- search API — query via `localhost:3030/search` with filters
search examples
find by text
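a minimal sketch of a plain text search, assuming the default `localhost:3030/search` endpoint from above; the query string and the printed response are illustrative, since the exact response shape isn't shown here:

```typescript
// find frames whose OCR text mentions "deployment"
const params = new URLSearchParams({
  q: "deployment",
  content_type: "ocr",
  limit: "10",
});

const res = await fetch(`http://localhost:3030/search?${params}`);
console.log(await res.json());
```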
find by app
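a sketch filtering with `app_name`; the app name and query text are example values:

```typescript
// restrict results to text captured from the Slack window
const params = new URLSearchParams({
  q: "standup",
  app_name: "Slack",
  content_type: "ocr",
});

const res = await fetch(`http://localhost:3030/search?${params}`);
console.log(await res.json());
```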
find by time range
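a sketch using `start_time` and `end_time` in ISO 8601; the timestamps are arbitrary examples:

```typescript
// everything captured during a morning work session
const params = new URLSearchParams({
  q: "invoice",
  start_time: "2024-07-01T09:00:00Z",
  end_time: "2024-07-01T12:00:00Z",
});

const res = await fetch(`http://localhost:3030/search?${params}`);
console.log(await res.json());
```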
find by browser URL
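a sketch filtering with `browser_url`; the URL is an example value, and whether matching is exact or partial isn't specified here:

```typescript
// only matches captured while a github.com page was open
const params = new URLSearchParams({
  q: "pull request",
  browser_url: "github.com",
});

const res = await fetch(`http://localhost:3030/search?${params}`);
console.log(await res.json());
```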
combine filters
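a sketch combining keyword, app, and time range in one query, using parameters from the table below; all values are illustrative:

```typescript
// app + time range + keyword, narrowing results as much as possible
const params = new URLSearchParams({
  q: "deployment",
  app_name: "Slack",
  start_time: "2024-07-01T00:00:00Z",
  end_time: "2024-07-02T00:00:00Z",
  content_type: "ocr",
  limit: "20",
});

const res = await fetch(`http://localhost:3030/search?${params}`);
console.log(await res.json());
```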
search parameters
| param | type | description |
|---|---|---|
| q | string | search query |
| limit | int | max results (default 20) |
| offset | int | pagination offset |
| content_type | string | ocr, audio, ui, all |
| start_time | ISO 8601 | filter by start time |
| end_time | ISO 8601 | filter by end time |
| app_name | string | filter by app name |
| window_name | string | filter by window title |
| browser_url | string | filter by browser URL |
| min_length | int | minimum text length |
| max_length | int | maximum text length |
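the table above maps naturally onto a small typed wrapper; the parameter names come from the table, while the `SearchParams` interface and `search` helper below are an illustrative sketch, not part of screenpipe itself:

```typescript
// illustrative wrapper around GET localhost:3030/search; field names follow the table above
interface SearchParams {
  q?: string;
  limit?: number;
  offset?: number;
  content_type?: "ocr" | "audio" | "ui" | "all";
  start_time?: string; // ISO 8601
  end_time?: string;   // ISO 8601
  app_name?: string;
  window_name?: string;
  browser_url?: string;
  min_length?: number;
  max_length?: number;
}

async function search(params: SearchParams): Promise<unknown> {
  const qs = new URLSearchParams();
  for (const [key, value] of Object.entries(params)) {
    if (value !== undefined) qs.set(key, String(value));
  }
  const res = await fetch(`http://localhost:3030/search?${qs}`);
  if (!res.ok) throw new Error(`search failed: ${res.status}`);
  return res.json(); // response shape not documented here, so left as unknown
}

// example: recent Slack messages mentioning "deployment"
console.log(await search({ q: "deployment", app_name: "Slack", limit: 5 }));
```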
using the desktop app
the easiest way to search is the built-in search in the screenpipe desktop app:

- open screenpipe
- use the search bar or timeline view
- scroll through your day visually
- select content to chat with AI about it
search tips
- be specific: “slack message from john about deployment” > “deployment”
- use time context: combine `start_time` and `end_time` for precision
- combine filters: app name + time range + keywords
privacy
- all search happens locally on your device
- no data leaves your machine
- control what’s recorded with `--ignored-windows` and `--included-windows`