Every website deserves a thorough audit

crawler.sh is a web crawler that runs locally, works offline, and outputs standard formats — no setup required.

Mission

Most web crawling tools are too complex, too expensive, or locked behind cloud services. crawler.sh started as a simple idea: what if you could have a fast, local-first web crawler that outputs standard formats and works offline?

We built crawler.sh to give developers a single tool for crawling, content extraction, and SEO analysis — with the flexibility of powerful, local-first software.

The Project

crawler.sh ships as two tools: a CLI for automation and scripting, and a desktop app for visual analysis.

The CLI provides 4 subcommands (crawl, info, export, seo) and the desktop app features 8 interactive dashboard cards for visual crawling and analysis — all running locally on your machine.

The Story

crawler.sh started as a side project — a fast, local-first web crawler that does one thing well. We were tired of crawling tools that were slow, complex to set up, or unable to run offline.

Today, crawler.sh is used by SEO professionals, content teams, and developers who want a reliable tool that runs on their machine, respects their privacy, and exports data in standard formats.

Use Cases

Run automated SEO analysis across entire websites. Detect missing titles, duplicate meta descriptions, noindex directives, thin content, and 12 more issue categories. Export findings as CSV or TXT for your team.
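Because findings are exported as plain CSV, they are easy to post-process with standard tooling. Below is a minimal sketch of tallying exported issues by category in Python — note that the column names (`url`, `issue`) and the sample rows are illustrative assumptions, not the actual crawler.sh export schema.

```python
# Hypothetical example: summarizing a crawler.sh CSV export by issue category.
# The "url" and "issue" column names below are assumptions; check the real
# export for its actual schema.
import csv
import io
from collections import Counter

# Stand-in for a real export file, e.g. open("findings.csv")
sample_export = io.StringIO(
    "url,issue\n"
    "https://example.com/,missing title\n"
    "https://example.com/about,thin content\n"
    "https://example.com/blog,missing title\n"
)

reader = csv.DictReader(sample_export)
counts = Counter(row["issue"] for row in reader)

for issue, n in counts.most_common():
    print(f"{issue}: {n}")
```

The same pattern works for any of the exported issue categories; swap `io.StringIO` for a file handle on the real export.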
