Guides
March 6, 2026

How to Find Redirect Chains for a Website with CLI

Learn how to detect and analyze HTTP redirect chains using crawler.sh CLI. Find 301/302 chains, identify loops, and export results to fix SEO issues.

Mehmet Kose
4 min read

Redirect chains occur when a URL redirects to another URL, which then redirects to yet another URL before reaching the final destination. Each hop in the chain adds latency, wastes crawl budget, and dilutes link equity. Search engines stop following chains after a limited number of hops (Googlebot gives up after 10 redirects), so pages at the end of long chains may never get indexed.
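To make the mechanics concrete, here is a small Python sketch that follows a toy redirect mapping hop by hop and flags loops, the same logic a crawler applies to Location headers (the mapping and hop limit here are illustrative, not part of crawler.sh):

```python
# Simulate following a redirect chain over a toy {source: target} mapping.
def follow_chain(url, redirects, max_hops=10):
    """Return (final_url, hops, looped) for a URL in a redirect mapping."""
    seen = [url]
    while url in redirects:
        url = redirects[url]
        if url in seen:                # revisiting a URL means a loop
            return url, len(seen), True
        seen.append(url)
        if len(seen) - 1 >= max_hops:  # give up, as search engine crawlers do
            break
    return url, len(seen) - 1, False

redirects = {
    "/old-page": "/new-page",          # hop 1 (e.g. a 301)
    "/new-page": "/final-page",        # hop 2 -- a two-hop chain
}
print(follow_chain("/old-page", redirects))  # ('/final-page', 2, False)
```

A two-hop result like this is exactly what the SEO report later flags as a chain worth collapsing into a single hop.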

This guide walks you through finding and analyzing redirect chains on any website using the crawler.sh CLI.

Step 1: Install crawler.sh CLI

Install the CLI with a single command:

curl -fsSL https://install.crawler.sh | sh

This downloads the correct binary for your operating system and architecture, places it in ~/.crawler/bin/, and adds it to your PATH. Restart your terminal or run source ~/.bashrc (or ~/.zshrc) to pick up the new PATH entry.

Verify the installation:

crawler --version

Step 2: Crawl the target website

Run a crawl against the website you want to audit. The crawler follows links within the same domain and records every redirect it encounters:

crawler crawl https://example.com

By default, the crawl saves results in NDJSON format (.crawl file) in the current directory. The filename is generated from the domain name. For larger sites, you can increase the page limit:

crawler crawl https://example.com --max-pages 5000

The crawler records the full redirect chain for every URL, including the HTTP status code at each hop (301, 302, 307, 308), whether a redirect loop was detected, and the final destination URL.
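Because the output is NDJSON (one JSON object per line), you can also inspect the raw data yourself. The sketch below filters records with two or more hops; the field names ("url", "redirect_chain") are assumptions for illustration, so check the actual .crawl schema before relying on them:

```python
import json

# Hypothetical .crawl records, one JSON object per line (NDJSON).
sample = "\n".join([
    json.dumps({"url": "/old", "status": 301,
                "redirect_chain": ["/old", "/interim", "/final"]}),
    json.dumps({"url": "/home", "status": 200, "redirect_chain": []}),
])

def chained_urls(ndjson_text, min_hops=2):
    """Yield (url, hops) for records whose chain has min_hops or more hops."""
    for line in ndjson_text.splitlines():
        rec = json.loads(line)
        hops = max(len(rec.get("redirect_chain", [])) - 1, 0)
        if hops >= min_hops:
            yield rec["url"], hops

print(list(chained_urls(sample)))  # [('/old', 2)]
```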

Step 3: View redirect information

Once the crawl completes, use the info command to get an overview of the crawl results, including redirect statistics:

crawler info example-com.crawl

This shows a summary of all crawled pages broken down by status code, response time statistics, and a redirect audit section. The redirect audit highlights how many pages involved redirects and flags any redirect chains or loops.

Step 4: Analyze redirect chains

For a detailed SEO analysis that includes redirect-specific checks, run the seo command:

crawler seo example-com.crawl

The SEO report includes dedicated checks for:

  • Redirect chains: URLs that pass through two or more redirects before reaching the final destination
  • Redirect loops: URLs that redirect back to themselves or create circular redirect paths
  • Mixed redirect types: Chains that mix 301 (permanent) and 302 (temporary) redirects, which can confuse search engines about which URL to index

Each issue is listed with the full redirect path so you can see exactly where the chain starts, which intermediate URLs it passes through, and where it ends.
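The three checks above can be sketched in a few lines of Python. This mirrors the report's logic rather than reproducing it, and it assumes each hop is a (status code, URL) pair:

```python
# Classify one redirect path against the three checks described above.
def classify(hops, final_url):
    """Return the set of issues for one redirect path."""
    issues = set()
    if len(hops) >= 2:
        issues.add("chain")               # two or more redirects
    urls = [u for _, u in hops] + [final_url]
    if len(urls) != len(set(urls)):
        issues.add("loop")                # a URL repeats in the path
    codes = {code for code, _ in hops}
    if 301 in codes and 302 in codes:
        issues.add("mixed")               # permanent and temporary mixed
    return issues

path = [(301, "/a"), (302, "/b")]         # /a -> /b -> /c
print(sorted(classify(path, "/c")))       # ['chain', 'mixed']
```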

Step 5: Export results

Export the SEO report to a file for sharing with your team or importing into a spreadsheet:

crawler seo example-com.crawl --format csv --output redirects.csv

You can also export as plain text:

crawler seo example-com.crawl --format txt --output redirects.txt

The CSV format works well for filtering and sorting in Excel or Google Sheets, making it easy to prioritize which redirect chains to fix first based on the number of hops or the importance of the source URL.
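That prioritization step can also be scripted. Here is a Python sketch that sorts an exported CSV by hop count; the column names ("source_url", "hops") are assumptions, so match them to the real header in your export:

```python
import csv
import io

# Stand-in for the exported redirects.csv; column names are hypothetical.
sample_csv = """source_url,hops
/about,2
/pricing,4
/blog/post,3
"""

rows = list(csv.DictReader(io.StringIO(sample_csv)))
rows.sort(key=lambda r: int(r["hops"]), reverse=True)  # longest chains first
print([r["source_url"] for r in rows])  # ['/pricing', '/blog/post', '/about']
```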

What to do about redirect chains

Once you have identified redirect chains, here is how to fix them:

  • Update the source URL to point directly to the final destination. If page A redirects to B, which redirects to C, update all links and references to A so they point directly to C.
  • Consolidate redirects on the server. Update your server configuration (nginx, Apache, or CDN rules) so that each old URL redirects directly to the final URL in a single hop.
  • Fix redirect loops immediately. Loops prevent pages from loading entirely. These are the highest priority to resolve.
  • Prefer 301 over 302 for permanent moves. If a URL has permanently moved, use a 301 redirect. Reserve 302 for genuinely temporary redirects.
  • Re-crawl after fixing. Run crawler crawl again after making changes to verify that the chains have been resolved.
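The "consolidate" step above amounts to mapping every old URL straight to the end of its chain. Here is a minimal Python sketch of that flattening, assuming the chains are loop-free (fix loops manually first, as noted):

```python
# Collapse multi-hop chains into single hops: each source URL points
# directly at its final destination. Assumes no redirect loops.
def flatten(redirects):
    """Map each source URL directly to the end of its chain."""
    flat = {}
    for src in redirects:
        dest, seen = redirects[src], {src}
        while dest in redirects and dest not in seen:
            seen.add(dest)
            dest = redirects[dest]
        flat[src] = dest
    return flat

chain = {"/a": "/b", "/b": "/c"}      # /a -> /b -> /c (two hops)
print(flatten(chain))                 # {'/a': '/c', '/b': '/c'}
```

The flattened mapping is what you would translate into nginx, Apache, or CDN rewrite rules so every old URL reaches its destination in one hop.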