crawler.sh is a fast, concurrent web crawler. It provides a CLI tool, a native desktop app, and an upcoming cloud API for crawling websites, extracting content, analyzing SEO issues, and exporting results in multiple formats.
Key capabilities:
Crawler CLI
Command-line interface with four subcommands: crawl, info, export, and
seo. Features progress bars, color-coded output, and auto-generated
filenames. Get started →
Crawler Desktop
Native desktop application with an interactive dashboard. Eight cards, including live feed, SEO analysis, status charts, content browsing, and exports. Get started →
Crawler Cloud
Hosted crawling API with scheduled crawls, webhooks, and a web dashboard. Coming soon. Learn more →
curl -fsSL https://install.crawler.sh | sh
Downloads the correct binary for your platform and adds it to your PATH.
crawler crawl https://example.com
Crawls the site (up to 100 pages by default) and saves results to example-com.crawl.
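The output filename above is auto-generated from the URL (example.com becomes example-com.crawl). A minimal Python sketch of that convention, assuming the name is derived from the hostname with dots replaced by hyphens; the CLI's actual rules may differ:

```python
from urllib.parse import urlparse

def crawl_filename(url: str) -> str:
    """Derive an output filename from a URL's hostname.

    Assumed convention: dots in the hostname become hyphens,
    and the result gets a .crawl extension.
    """
    host = urlparse(url).hostname or "output"
    return host.replace(".", "-") + ".crawl"

print(crawl_filename("https://example.com"))      # example-com.crawl
print(crawl_filename("https://docs.crawler.sh"))  # docs-crawler-sh.crawl
```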
crawler info example-com.crawl
Shows summary statistics: page count, status code distribution, and response times.
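The summary that info prints amounts to simple aggregation over per-page records. A sketch under assumed data, where each record carries a status code and a response time in milliseconds; the actual .crawl file format is not documented here:

```python
from collections import Counter

# Hypothetical per-page records; the real .crawl format may differ.
pages = [
    {"url": "https://example.com/", "status": 200, "ms": 120},
    {"url": "https://example.com/about", "status": 200, "ms": 95},
    {"url": "https://example.com/old", "status": 404, "ms": 40},
]

status_counts = Counter(p["status"] for p in pages)
avg_ms = sum(p["ms"] for p in pages) / len(pages)

print(f"pages: {len(pages)}")
print(f"status: {dict(status_counts)}")
print(f"avg response: {avg_ms:.0f} ms")
```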
crawler seo example-com.crawl
Runs 16 automated SEO checks and groups issues by category.
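To make "automated SEO checks" concrete, here is a minimal sketch of two such checks (missing title tag, missing meta description) over raw HTML, using only the standard library. These two checks are illustrative assumptions; the CLI's actual 16 checks are not listed here:

```python
from html.parser import HTMLParser

class SEOProbe(HTMLParser):
    """Collects just enough of the document to run two checks."""
    def __init__(self):
        super().__init__()
        self.has_title = False
        self.has_meta_description = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.has_title = True
        if tag == "meta" and dict(attrs).get("name") == "description":
            self.has_meta_description = True

def seo_issues(html: str) -> list[str]:
    probe = SEOProbe()
    probe.feed(html)
    issues = []
    if not probe.has_title:
        issues.append("missing <title>")
    if not probe.has_meta_description:
        issues.append("missing meta description")
    return issues

print(seo_issues("<html><head><title>Hi</title></head></html>"))
# ['missing meta description']
```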
Download the macOS DMG (universal binary; runs on both Apple Silicon and Intel).
Open the .dmg file and drag Crawler to your Applications folder.
Launch Crawler, enter a URL in the top input bar, configure settings, and click Start Crawl.
Browse the dashboard cards: Live Feed for real-time progress, SEO Issues for automated analysis, Content for extracted Markdown, and Downloads for exports.