Building automations with the SereneReader API
Why automate your reading?
RSS already solves the hard part: getting content from hundreds of sources into one place. The SereneReader API lets you pull that content into scripts, cron jobs, and other tools programmatically.
No webhooks to configure. No OAuth dance. Generate an API key, make HTTP requests, get JSON back.
Getting started
API access requires a Pro or Team plan.
Create an API key
Head to your settings page in SereneReader and scroll to the API section. Create a new key, give it a name you'll recognize later, and copy it somewhere safe. The key is only shown once.
Authentication
Every request needs a Bearer token in the Authorization header. That's it. No client IDs, no refresh tokens, no scopes. Your API settings page has the base URL, endpoint documentation, and example requests.
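A minimal sketch of what that looks like in Python. The base URL and the `/articles` path below are placeholders, not real endpoints; your API settings page has the actual values.

```python
import json
import urllib.request

API_KEY = "YOUR_API_KEY"  # paste the key from your settings page
BASE_URL = "https://api.serenereader.example"  # placeholder base URL

def build_request(path):
    """Build a GET request with the Bearer token in the Authorization header."""
    return urllib.request.Request(
        BASE_URL + path,
        headers={"Authorization": f"Bearer {API_KEY}"},
    )

def get(path):
    """Send the request and return the parsed JSON response."""
    with urllib.request.urlopen(build_request(path)) as resp:
        return json.load(resp)
```

That's the whole authentication story: one header on every request.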
Rate limits
The API allows 30 requests per minute per account, which is more than enough for scripts that run on a schedule. If you hit the limit, back off and retry after a few seconds.
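A simple backoff loop covers this. The sketch below assumes the API signals a blown limit with HTTP 429 (the conventional status for rate limiting); check the API docs for the actual behavior.

```python
import time
import urllib.error
import urllib.request

def backoff_delays(retries=3, base=2.0):
    """Linear backoff schedule: 2s, 4s, 6s for the default three retries."""
    return [base * (i + 1) for i in range(retries)]

def urlopen_with_retry(req, retries=3):
    """Open a request, pausing and retrying when the server says slow down.
    Assumes a 429 status on rate limiting -- verify against the API docs."""
    for delay in backoff_delays(retries) + [None]:
        try:
            with urllib.request.urlopen(req) as resp:
                return resp.read()
        except urllib.error.HTTPError as err:
            if err.code != 429 or delay is None:
                raise  # not a rate limit, or out of retries
            time.sleep(delay)
```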
What you can do
The API gives you read access to your articles and subscriptions. Filter articles by feed, folder, or publish date. List your subscriptions with unread counts. All responses are JSON with titles, URLs, published dates, snippets, and feed metadata.
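If you'd rather filter on your side than via query parameters, a few lines of Python do it. The `published` field name below is an assumption based on the response fields described above; check the response shape in the API docs.

```python
from datetime import datetime, timedelta, timezone

def published_since(articles, days=1):
    """Keep articles whose 'published' ISO 8601 timestamp is within the window.
    The 'published' field name is assumed from the fields listed above."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=days)
    return [
        a for a in articles
        if datetime.fromisoformat(a["published"]) >= cutoff
    ]
```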
The full list of endpoints, parameters, and response shapes is in the API docs section of your settings page.
Use cases
Daily digest email
Write a script that runs every morning at 7am, grabs yesterday's articles, and sends you a summary email. Filter by date, pipe the JSON through jq to extract titles and URLs, and send the result to whatever notification system you prefer.
The data is there in a clean format, ready to be shaped however you want.
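The shaping step might look like this: a small formatter that turns a list of article dicts into an email body. The `title` and `url` field names are assumptions based on the response fields described above.

```python
def format_digest(articles):
    """Render a plain-text digest: one bullet per article, title plus URL.
    Field names ('title', 'url') assumed from the response fields above."""
    lines = [f"- {a['title']} ({a['url']})" for a in articles]
    return "\n".join(lines) if lines else "No new articles yesterday."
```

Pipe the result into `mail`, a Slack webhook, or whatever notification system you already use.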
Feed health monitor
Feeds break. Domains expire, paths change, servers go down. A weekly script can list your subscriptions and check which ones haven't published in a while.
When a feed goes quiet, you'll know about it before you start wondering why that blog stopped showing up.
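The check itself is a one-liner over your subscription list. The sketch assumes each subscription dict carries a `last_published` ISO timestamp and a `title`; the actual field names are in the API docs.

```python
from datetime import datetime, timedelta, timezone

def quiet_feeds(subscriptions, days=30):
    """Return titles of feeds with nothing published in the last `days` days.
    Assumes 'title' and 'last_published' fields -- check the API docs."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=days)
    return [
        s["title"] for s in subscriptions
        if datetime.fromisoformat(s["last_published"]) < cutoff
    ]
```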
Export to other tools
Pull articles into a local database, a Notion page, an Obsidian vault, or a spreadsheet. The API gives you structured data, so the transformation step is usually just a few lines of jq or a short Python script.
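As an example of that transformation step, here is a short function that flattens article dicts into a CSV you can open in any spreadsheet. Column names match the response fields described above; any extra fields are simply dropped.

```python
import csv
import io

def articles_to_csv(articles):
    """Flatten article dicts into CSV with title, url, published columns.
    Extra fields in each dict (snippets, feed metadata) are ignored."""
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf,
        fieldnames=["title", "url", "published"],
        extrasaction="ignore",
    )
    writer.writeheader()
    writer.writerows(articles)
    return buf.getvalue()
```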
Combine with AI agents
The API pairs well with AI tools that can make HTTP requests. If you're using an agent framework like OpenClaw, you can point it at SereneReader and let it process your feeds on its own. We wrote a separate post on that: How to connect your RSS feeds to AI agents.
Keep it simple
The best automations are short scripts that do one thing. A cron job that runs once a day. A shell alias that shows your unread count. A webhook relay that forwards articles to a Slack channel.
You don't need a framework for this. curl and jq cover most cases. If you need something more structured, any language with an HTTP client will work.
API keys and endpoint docs are in your SereneReader settings.