SDK
screenpipe provides two sdk packages:
- @screenpipe/js: for node.js environments (next.js api routes, etc.)
- @screenpipe/browser: for browser environments
both sdks provide type-safe interfaces to interact with screenpipe’s core functionality.
feel free to use our docs as context in cursor agent through MCP
installation
node.js sdk
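install the node package (assuming npm; any package manager works):

```shell
npm install @screenpipe/js
```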
browser sdk
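install the browser package:

```shell
npm install @screenpipe/browser
```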
basic usage
search api
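a minimal sketch of a search query. the query object is built by a small pure helper; the parameter names (`q`, `contentType`, `limit`, `startTime`) and the `pipe.queryScreenpipe` call are assumptions based on @screenpipe/js, so check the sdk types:

```typescript
// build the query object for a basic search (pure, easy to unit-test);
// parameter names are assumptions based on @screenpipe/js
function buildSearchQuery(q: string, now: number = Date.now()) {
  return {
    q,                           // full-text query
    contentType: "ocr" as const, // "ocr" | "audio" | "all"
    limit: 10,
    startTime: new Date(now - 60 * 60 * 1000).toISOString(), // last hour
  };
}

// usage with the node sdk (assumed API, requires a running screenpipe instance):
//   import { pipe } from "@screenpipe/js";
//   const results = await pipe.queryScreenpipe(buildSearchQuery("standup notes"));
//   for (const item of results.data) console.log(item.content);
```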
common usage patterns
fetching recent screen activity
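a sketch for the "last N minutes" pattern: compute an explicit time window and pass it as `startTime`/`endTime`. the field names are assumptions based on the sdk's search parameters:

```typescript
// pure helper: query params for "everything captured in the last N minutes"
function recentWindow(minutes: number, now: number = Date.now()) {
  return {
    contentType: "all" as const,
    startTime: new Date(now - minutes * 60_000).toISOString(),
    endTime: new Date(now).toISOString(),
    limit: 50,
  };
}

// usage (assumed API):
//   const res = await pipe.queryScreenpipe(recentWindow(15));
```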
searching for specific content
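a sketch for finding specific content: query broadly, then filter hits client-side for the exact term. the result shape (`content.text`) is a simplified assumption:

```typescript
// simplified shape of one search hit (assumption)
interface Hit {
  type: string;
  content: { text: string };
}

// pure helper: keep only hits whose text mentions the term
function matching(hits: Hit[], term: string): Hit[] {
  const t = term.toLowerCase();
  return hits.filter((h) => h.content.text.toLowerCase().includes(t));
}

// usage (assumed API):
//   const res = await pipe.queryScreenpipe({ q: "invoice", contentType: "ocr", limit: 100 });
//   const invoices = matching(res.data, "invoice");
```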
building a timeline of user activity
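a sketch of the timeline pattern: bucket captured items by minute and record which apps were active in each bucket. the item shape (`timestamp`, `appName`) is a simplified assumption:

```typescript
interface TimelineItem {
  timestamp: string; // ISO timestamp from a search result (assumption)
  appName: string;
}

// bucket items into minute-level timeline entries per app (pure)
function buildTimeline(items: TimelineItem[]): { minute: string; apps: string[] }[] {
  const byMinute = new Map<string, Set<string>>();
  for (const it of items) {
    const minute = it.timestamp.slice(0, 16); // "YYYY-MM-DDTHH:MM"
    if (!byMinute.has(minute)) byMinute.set(minute, new Set());
    byMinute.get(minute)!.add(it.appName);
  }
  return [...byMinute.entries()]
    .sort(([a], [b]) => a.localeCompare(b))
    .map(([minute, apps]) => ({ minute, apps: [...apps] }));
}
```

feed it the `data` array from a time-windowed search to get a per-minute view of user activity.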
fetching website-specific content
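a sketch for site-specific content; the `browserUrl` filter is an assumption based on screenpipe's search parameters, so verify it against the sdk types:

```typescript
// query params for content captured while browsing a given site (pure helper)
function siteQuery(url: string) {
  return {
    contentType: "ocr" as const,
    browserUrl: url, // assumed filter parameter
    limit: 20,
  };
}

// usage (assumed API):
//   const res = await pipe.queryScreenpipe(siteQuery("github.com"));
```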
vercel-like crons
you need to add a pipe.json file to your pipe folder with this config, for example:
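a sketch of the shape (field names follow the obsidian pipe's config; verify against your screenpipe version):

```json
{
  "crons": [
    {
      "path": "/api/log",
      "schedule": "*/5 * * * *"
    }
  ]
}
```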
this will run the /api/log route every 5 minutes.
check how the obsidian pipe implements this; see its route and pipe.json for a complete example.
we recommend using the CLI to add the update-pipe-config server action to your pipe. this lets you update the pipe's cron schedule from a server action. adjust its code to your needs, as some things are hardcoded in it.
realtime streams
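a minimal sketch, assuming the sdk exposes an async iterator like `pipe.streamTranscriptions()` (check the sdk types for the exact name and chunk shape):

```typescript
// shape of one transcription chunk (simplified assumption)
interface TranscriptionChunk {
  transcription: string;
  timestamp: string;
}

// pure helper: merge streamed chunks into one running transcript
function appendChunk(transcript: string, chunk: TranscriptionChunk): string {
  return transcript ? `${transcript} ${chunk.transcription}` : chunk.transcription;
}

// usage (assumed API, requires realtime transcription enabled):
//   import { pipe } from "@screenpipe/js";
//   let transcript = "";
//   for await (const chunk of pipe.streamTranscriptions()) {
//     transcript = appendChunk(transcript, chunk);
//   }
```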
react hooks sdk support ⚛️
screenpipe provides first-class support for React applications through custom hooks, enabling seamless integration with your React components. while you can manually create hooks using libraries like React Query, we recommend leveraging our built-in CLI to quickly add pre-built, optimized hooks to your pipes.
using the cli to add hooks
the fastest way to integrate react hooks into your pipe is through our CLI:
select from the interactive menu to add hooks such as:
- use-pipe-settings: manage pipe-specific and global app settings
- use-health: monitor pipe health and status
- use-ai-provider: integrate seamlessly with ai providers
- use-sql-autocomplete: provide sql query assistance
these hooks follow best practices, ensuring type safety, efficient state management, and easy integration with your existing React components.
real-time meeting summarizer
⚠️ make sure to enable real time transcription in the screenpipe app settings or CLI args.
real-time screen activity monitor
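a sketch of a screen activity monitor, assuming a vision stream like `pipe.streamVision()` (name and event shape are assumptions; check the sdk types):

```typescript
// one vision event (simplified assumption)
interface VisionEvent {
  appName: string;
  text: string;
}

// pure helper: count how often each app appears in a window of events
function appUsage(events: VisionEvent[]): Record<string, number> {
  const counts: Record<string, number> = {};
  for (const e of events) counts[e.appName] = (counts[e.appName] ?? 0) + 1;
  return counts;
}

// usage (assumed API):
//   const buffer: VisionEvent[] = [];
//   for await (const event of pipe.streamVision()) {
//     buffer.push(event); // inspect appUsage(buffer) periodically
//   }
```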
settings management
pipes can access and modify screenpipe app settings through the SDK. this is useful for storing pipe-specific configuration and accessing global app settings.
quick start with CLI
the fastest way to add settings management to your pipe is using our CLI:
this will add the following components to your pipe:
- use-pipe-settings hook for react components
- get-screenpipe-app-settings server action
- required typescript types
manual setup
- create types for your settings
- create a server action to access settings
- create a react hook for settings management
- use them in your components
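the steps above can be sketched like this. the type and field names are illustrative, and `pipe.settings.getAll` is the assumed sdk call for reading app settings:

```typescript
// example pipe settings type (field names are hypothetical)
interface MyPipeSettings {
  interval: number;     // minutes between runs
  vaultPath: string;    // illustrative pipe-specific field
}

const DEFAULTS: MyPipeSettings = { interval: 5, vaultPath: "" };

// merge stored partial settings over defaults (pure, handles missing settings)
function withDefaults(stored: Partial<MyPipeSettings> | undefined): MyPipeSettings {
  return { ...DEFAULTS, ...(stored ?? {}) };
}

// in a server action you would read the app settings and pull your pipe's
// namespace from customSettings (assumed API):
//   const app = await pipe.settings.getAll();
//   const mine = withDefaults(app.customSettings?.myPipe);
```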
best practices
- store pipe-specific settings under customSettings.yourPipeName in screenpipe app settings
- use typescript for type safety
- provide default values for all settings
- handle loading and error states
- validate settings before saving
- use server actions for settings operations
- consider using shadcn/ui components for consistent UI
see the obsidian pipe for a complete example of settings management.
integrating with llms
screenpipe’s sdk can be easily integrated with various ai providers to analyze and generate insights from screen activity. here are common patterns for connecting context data with llms.
if the user is using screenpipe-cloud, you can use claude-3-7-sonnet, gemini-2.0-flash-lite, or gpt-4o (Anthropic, Google, OpenAI), or local models.
vercel ai sdk integration
the vercel ai sdk offers a streamlined way to work with llms. this example from the obsidian pipe demonstrates how to generate structured work logs:
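a sketch of the pattern: a pure helper turns screen activity into a prompt, then the vercel ai sdk's `generateObject` produces a structured log. the schema and field names here are illustrative, not the obsidian pipe's exact implementation:

```typescript
// simplified shape of one activity row (assumption)
interface Activity {
  timestamp: string;
  appName: string;
  text: string;
}

// pure helper: turn screen-activity rows into a work-log prompt
function buildWorkLogPrompt(items: Activity[]): string {
  const lines = items.map((i) => `${i.timestamp} [${i.appName}] ${i.text}`);
  return `summarize this screen activity into a structured work log:\n${lines.join("\n")}`;
}

// with the vercel ai sdk (sketch; schema and model are illustrative):
//   import { generateObject } from "ai";
//   import { z } from "zod";
//   const { object } = await generateObject({
//     model,
//     schema: z.object({
//       title: z.string(),
//       summary: z.string(),
//       tags: z.array(z.string()),
//     }),
//     prompt: buildWorkLogPrompt(items),
//   });
```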
see the obsidian pipe for the complete implementation.
or irs-agent for a more complex example.
deduplicating context with embeddings
when working with large amounts of screen data, it’s useful to remove duplicates before sending to an llm:
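one simple approach: if you have an embedding vector per item (producing them, e.g. with an embedding model, is outside this sketch), drop items whose embedding is nearly identical to one already kept. the 0.95 threshold is an arbitrary starting point:

```typescript
// cosine similarity between two embedding vectors
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// keep each item only if it is not too similar to any item already kept
function dedupe<T>(items: T[], embeddings: number[][], threshold = 0.95): T[] {
  const kept: number[] = [];
  for (let i = 0; i < items.length; i++) {
    if (kept.every((j) => cosine(embeddings[i], embeddings[j]) < threshold)) {
      kept.push(i);
    }
  }
  return kept.map((i) => items[i]);
}
```

note this is O(n²) in the number of items; fine for a few hundred screen captures, but use an approximate nearest-neighbor index for much larger batches.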
streaming transcriptions to llms
for real-time analysis of audio:
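a sketch: buffer streamed transcription chunks and only call the llm once enough words have accumulated, so you are not hitting the model on every chunk. the `pipe.streamTranscriptions()` call and `analyzeWithLlm` helper are assumptions:

```typescript
// pure helper: decide when a rolling transcript buffer is big enough to analyze
function readyToAnalyze(buffer: string[], minWords = 50): boolean {
  const words = buffer.join(" ").split(/\s+/).filter(Boolean).length;
  return words >= minWords;
}

// usage (assumed sdk API; analyzeWithLlm is your own llm call):
//   const buffer: string[] = [];
//   for await (const chunk of pipe.streamTranscriptions()) {
//     buffer.push(chunk.transcription);
//     if (readyToAnalyze(buffer)) {
//       await analyzeWithLlm(buffer.join(" "));
//       buffer.length = 0; // reset for the next batch
//     }
//   }
```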
check out our production pipe examples on github to see more ai integration patterns.
notifications (desktop)
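a sketch of sending a desktop notification from the node sdk; `pipe.sendDesktopNotification` is the assumed API, and the truncation helper just keeps long bodies readable:

```typescript
// pure helper: truncate long notification bodies
function truncate(text: string, max = 120): string {
  return text.length <= max ? text : text.slice(0, max - 1) + "…";
}

// usage (assumed API, node sdk only, requires the screenpipe app running):
//   import { pipe } from "@screenpipe/js";
//   await pipe.sendDesktopNotification({
//     title: "meeting summary ready",
//     body: truncate(summaryText),
//   });
```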
node.js specific features
the node sdk includes additional features not available in the browser:
LLM links
paste these links into your Cursor chat for context: