# crawler.sh

> A fast, local-first web crawler and SEO analysis tool. Crawl any website in seconds, run 23 automated SEO checks, extract content as clean Markdown, and export to JSON, CSV, or Sitemap XML - from the terminal or the desktop app. Runs entirely on your machine.

For full documentation including complete CLI usage, desktop app usage, all flags, data models, and examples, see [llms-full.txt](https://crawler.sh/llms-full.txt).

## Project details

- **Name**: crawler.sh
- **URL**: https://crawler.sh
- **Category**: Developer Tools / SEO
- **License**: Proprietary (freemium)
- **Platforms**: macOS (Universal - Apple Silicon and Intel), Linux (x64 and ARM64, .deb). Windows coming soon.
- **Install**: `curl -fsSL https://install.crawler.sh | sh`
- **Pricing**: Free (600 pages/crawl, all features) - Pro $99/year (10,000 pages/crawl, Content Archive export)
- **Author**: Mehmet Kose
- **Contact**: https://crawler.sh/contact/

## What crawler.sh does

crawler.sh crawls websites using a breadth-first search algorithm with configurable concurrency, depth limits, and polite delays. It stays within the same domain and records every page's status code, response time, title, meta description, headings, canonical URL, word count, and more.

After crawling, a built-in SEO analysis engine runs 23 automated checks across every page: missing/short/long/duplicate titles; missing/short/long/duplicate meta descriptions; missing/multiple/empty/long/short/duplicate H1 tags; thin content; missing content; long URLs; noindex pages; nofollow pages; non-self canonicals; paginated pages; and broken outgoing links.

Content extraction converts the main body of any page to clean Markdown, stripping boilerplate automatically. Results export to NDJSON, JSON, CSV, TXT, and W3C-compliant Sitemap XML.

## Interfaces

- **CLI** - Four subcommands: `crawl` (site crawling), `seo` (SEO analysis with CSV/TXT export), `info` (crawl file stats), `export` (format conversion).
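The crawl strategy described above - breadth-first, same-domain, depth-limited - can be sketched in a few lines. This is an illustrative sketch only, not crawler.sh's implementation; `fetch_links` is a hypothetical stand-in for a real HTTP fetch plus link extraction, and the defaults mirror the free tier's 600-page cap purely as an example.

```python
from collections import deque
from urllib.parse import urlparse

def crawl(start_url, fetch_links, max_depth=2, max_pages=600):
    """Breadth-first crawl sketch: same-domain only, depth- and page-limited."""
    domain = urlparse(start_url).netloc
    seen = {start_url}                 # dedupe URLs before queueing
    queue = deque([(start_url, 0)])    # (url, depth) pairs, FIFO = BFS
    order = []
    while queue and len(order) < max_pages:
        url, depth = queue.popleft()
        order.append(url)
        if depth == max_depth:
            continue                   # don't expand links past the depth limit
        for link in fetch_links(url):
            if urlparse(link).netloc == domain and link not in seen:
                seen.add(link)
                queue.append((link, depth + 1))
    return order

# Tiny in-memory "site" standing in for real HTTP responses.
site = {
    "https://example.com/": ["https://example.com/a", "https://other.com/x"],
    "https://example.com/a": ["https://example.com/b"],
    "https://example.com/b": [],
}
pages = crawl("https://example.com/", lambda u: site.get(u, []))
print(pages)  # breadth-first order; https://other.com/x is filtered out
```

A real engine would add the concurrency and polite delays mentioned above; the queue-plus-seen-set shape stays the same.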
  Install with `curl -fsSL https://install.crawler.sh | sh`.
- **Desktop app** - macOS app with a live crawl feed, SEO issue dashboard, page-status charts, and a content viewer.

## Pricing

- Free tier: up to 600 pages per crawl, all features including SEO analysis and all export formats.
- Pro ($99/year): up to 10,000 pages per crawl, plus Content Archive export.

## Platform availability

macOS (Universal - Apple Silicon and Intel), Linux (x64 and ARM64, .deb). Windows coming soon.

## Documentation

- [Getting Started](https://crawler.sh/docs/docs.md): Overview and quick start guide
- [Installation](https://crawler.sh/docs/installation.md): How to install and uninstall the CLI and desktop app
- [CLI Overview](https://crawler.sh/docs/cli.md): CLI architecture and usage
- [CLI Features](https://crawler.sh/docs/cli/features.md): Crawl engine, subcommands, SEO checks, content extraction, .crawl format
- [CLI Reference](https://crawler.sh/docs/cli/reference.md): All flags, page data model, crawl events, output formats
- [Desktop Overview](https://crawler.sh/docs/desktop.md): Desktop app architecture and usage
- [Desktop Features](https://crawler.sh/docs/desktop/features.md): Dashboard cards, SEO analysis, content viewer
- [Desktop Reference](https://crawler.sh/docs/desktop/reference.md): Settings, data model, export formats

## Product pages

- [Homepage](https://crawler.sh/): Main landing page
- [Product](https://crawler.sh/product.md): Feature overview for the CLI and desktop app
- [Technical SEO Audit](https://crawler.sh/technical-seo-audit.md): SEO audit feature page with all 23 checks
- [Download](https://crawler.sh/download.md): Desktop app download and CLI install command
- [Roadmap](https://crawler.sh/roadmap.md): Planned features and development timeline
- [FAQ](https://crawler.sh/faq.md): Common questions about installation, configuration, and usage

## Blog

- [The Complete Guide to Technical SEO Audits](https://crawler.sh/blog/technical-seo-audits.md): Covers crawlability,
  indexation, site speed, structured data, and how to automate audits with crawler.sh

## Changelog

- [Changelog](https://crawler.sh/changelog/): Release notes, new features, and improvements

## Company

- [About](https://crawler.sh/about.md): Mission, team, and background
- [Contact](https://crawler.sh/contact.md): Bug reports, feature requests, and questions
- [Privacy Policy](https://crawler.sh/privacy-policy/)
- [Terms of Service](https://crawler.sh/terms-of-service/)
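As a closing illustration, the per-page title checks listed under "What crawler.sh does" (missing/short/long/duplicate titles) boil down to simple rules over crawled page records. The sketch below is not crawler.sh's code, and the 30/60-character thresholds are assumptions chosen for the example, not the tool's documented limits.

```python
from collections import Counter

def check_titles(pages, min_len=30, max_len=60):
    """Flag missing, short, long, and duplicate titles across crawled pages.

    Thresholds are illustrative assumptions, not crawler.sh's actual limits.
    """
    issues = []
    # Count how often each non-empty title appears, to detect duplicates.
    counts = Counter(p["title"] for p in pages if p["title"])
    for p in pages:
        title = p["title"]
        if not title:
            issues.append((p["url"], "missing title"))
        elif len(title) < min_len:
            issues.append((p["url"], "short title"))
        elif len(title) > max_len:
            issues.append((p["url"], "long title"))
        if title and counts[title] > 1:
            issues.append((p["url"], "duplicate title"))
    return issues

pages = [
    {"url": "/", "title": "Home"},
    {"url": "/about", "title": "Home"},
    {"url": "/blog", "title": ""},
]
issues = check_titles(pages)
for url, issue in issues:
    print(url, issue)
```

The meta-description and H1 checks follow the same pattern: a length rule per page plus a frequency count across the crawl.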