Two tools for every workflow.

A CLI for automation and scripting. A desktop app for visual crawling and analysis. Choose the interface that fits your workflow.

Crawl up to 10,000 pages per session with concurrent requests, automatic retry with backoff, and built-in respect for robots.txt directives. Extract clean Markdown content, run 23 SEO checks across every page, and export results as JSON, Sitemap XML, or CSV. Built in Rust for speed and reliability - typical crawls finish in seconds, not minutes.

crawler.sh CLI and desktop app

CLI Tool

4 subcommands for every workflow

Crawl websites with crawl, inspect .crawl files with info, export to JSON or Sitemap XML with export, and run SEO analysis with seo, exporting reports as CSV or TXT. All from the terminal.
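
A minimal terminal sketch of the four subcommands. The binary name (crawler) and the flag names shown are assumptions for illustration; only the subcommand names, the .crawl file format, and the export formats come from the description above.

    # Crawl a site and save the session (binary name assumed)
    crawler crawl https://example.com

    # Inspect the resulting .crawl session file
    crawler info example.crawl

    # Export the session as JSON or Sitemap XML (flag names are illustrative)
    crawler export example.crawl --format json
    crawler export example.crawl --format sitemap

    # Run the SEO checks and save the report as CSV or TXT
    crawler seo example.crawl --format csv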

Desktop App

8 dashboard cards for visual crawling

Live Feed, SEO Issues, Page Status, Settings, Downloads, Content viewer, Newsletter, and Premium — all in a responsive card grid with interactive overlays.

Content Extraction

Readable content as clean Markdown

Automatically extracts the main article content from any page and converts it to clean Markdown. Includes a word count, author byline, and excerpt for every page.

SEO Analysis

23 checks across all crawled pages

Detects missing titles, duplicate descriptions, noindex directives, thin content, broken links, long URLs, non-self canonicals, and more. Export as CSV or TXT.

Workflow Examples

From quick crawl to full pipeline

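One possible end-to-end pipeline, from a quick crawl to an exported SEO report. As in the sketch above, the binary name and flags are assumptions; the steps chain the crawl, seo, and export subcommands described earlier.

    # 1. Crawl the site into a .crawl session file (path is illustrative)
    crawler crawl https://example.com

    # 2. Audit the session with the 23 SEO checks and export findings as CSV
    crawler seo example.crawl --format csv

    # 3. Generate a Sitemap XML from the same session
    crawler export example.crawl --format sitemap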

Start crawling for free

Download the desktop app or install the CLI with a single command. No account required for the free tier - crawl up to 600 pages per session, run full SEO audits, and export to any format. Upgrade to Pro for 10,000 pages and Content Archive exports.

Download