
How to run a site audit project on Ahrefs

Intermediate · 8 min read · Updated 2026-03-13
Quick Answer

Ahrefs Site Audit detects 170+ technical SEO issues, such as redirect chains, duplicate content, and Core Web Vitals failures. Configure the project settings, start the crawl, review the health score and issues on the Overview dashboard, then prioritize fixes starting with Errors. For faster follow-through, export issues to CSV for task management and use tools like NitroPack for quick speed wins.

Prerequisites

  • Active Ahrefs subscription with sufficient crawl credits
  • Site ownership or admin access for verification and tweaks
  • Whitelist AhrefsBot and AhrefsSiteAudit user-agents and IPs
  • Updated sitemap.xml with live 200-status URLs
  • Low-traffic window for large sites
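The bot-whitelisting prerequisite can be sketched in robots.txt (a minimal example; IP allowlisting is done separately in your firewall or CDN dashboard):

```text
# robots.txt — allow the Ahrefs crawlers
User-agent: AhrefsBot
Allow: /

User-agent: AhrefsSiteAudit
Allow: /
```

Keep any existing disallow rules for other agents; these entries only ensure the Ahrefs crawlers aren't caught by a blanket block.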

Step-by-Step Instructions

1

Log in and Navigate to Site Audit

Access the Ahrefs dashboard at ahrefs.com and select Site Audit from the left-hand sidebar. This opens the overview showing existing projects or an option to create new ones.
2

Create a New Project

Click the orange + New project button (or + Add site). Enter the root domain or full URL (e.g., example.com). Verify ownership via HTML tag, Google Analytics, Google Search Console, DNS record, or authorize Google for free Ahrefs Webmaster Tools audits.
For free audits, use Ahrefs Webmaster Tools first.
3

Configure Project Settings

Click the gear icon next to the project name and select Open project settings. Set the Max number of internal pages (default 5,000 on Lite); this caps how many crawl credits are spent on HTML pages, redirects, broken pages, and resources. Choose a crawl speed (Normal by default; Slow for shared hosting) and a user-agent such as Googlebot desktop or mobile. Include/exclude patterns accept regex (e.g., exclude <code>/blog/.*</code>). Enable Core Web Vitals by adding a Google PageSpeed Insights API key, and toggle sitemap-only crawling if needed. Save the settings; they apply to all future crawls.
Execute JavaScript if your site relies on it for content.
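Because the include/exclude fields take regular expressions rather than globs, it's worth sanity-checking your patterns before a crawl. A minimal sketch in Python, using hypothetical example patterns:

```python
import re

# Hypothetical exclude patterns, written as regular expressions the way
# Site Audit's include/exclude fields expect them (not shell globs).
EXCLUDE_PATTERNS = [
    r"/blog/.*",          # skip the whole blog section
    r".*\?sessionid=.*",  # skip session-ID query strings
]

def is_excluded(path: str) -> bool:
    """Return True if any exclude pattern matches the URL path."""
    return any(re.search(p, path) for p in EXCLUDE_PATTERNS)

print(is_excluded("/blog/how-to-audit"))  # True  — blog pages skipped
print(is_excluded("/pricing"))            # False — still crawled
```

Testing patterns locally like this avoids burning crawl credits on a scope you didn't intend.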
4

Optional: Import from Google Search Console

In project settings or setup wizard, click Import from Google Search Console, authorize your Google account, select the property, and confirm to pre-populate crawl scope from GSC data.
5

Start the Initial Crawl

Return to the project dashboard and click Start crawl (it may also auto-start). Monitor progress in the audit log table, which shows status, health score, and issue counts (Errors in red, Warnings in yellow, Notices in blue). Crawls prioritize important pages first; expect an email when the crawl completes.
For large sites, run the crawl overnight; it can take anywhere from 1 to 24+ hours.
6

Review the Overview Dashboard

Once complete, click a crawl entry in the audit log table to open the Overview. Check the health score (0-100) and the issue distribution by category (Internal Pages, Indexability, Links, etc.). Click the question marks (?) next to issues for fix instructions.
Segment crawls for sections like <code>/blog</code> or subdomains.
7

Customize Global Issues

On the main page, click Issue settings to adjust error/warning/notice severity levels, toggle individual issues on or off globally, or apply your presets to new projects. Tailor the list to your site (e.g., demote orphan-page warnings for a large ecommerce catalog).
8

Analyze and Drill Down into Issues

Scroll to the issue lists and click View All Issues (or an issue count) to open the report. Use Page Explorer to list affected URLs, open URL details for per-page tabs such as structured data, and apply filters to isolate schema issues. Compare changes across crawls (Added, New, Removed).
Prioritize Errors first, then Warnings, Notices.
9

Schedule Regular Crawls

In settings, configure Schedule with a day, time, and timezone. Toggle Run scheduled crawls on or off. Choose URL sources: Website, sitemaps, custom URL lists, or backlinks.
Use default settings for full crawls if new to Site Audit.
10

Export and Prioritize Fixes

Export CSVs from the issue reports into task-management tools like Asana. Fix crawlability issues first, then duplicates and page speed. NitroPack can deliver quick Core Web Vitals improvements.
Whitelist bots to avoid access blocks.
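The triage order (Errors, then Warnings, then Notices, biggest impact first) can be scripted against an exported CSV. A sketch in Python, using hypothetical column names; a real Site Audit export may label its columns differently:

```python
import csv
from io import StringIO

# Hypothetical sample export — real column names may differ.
SAMPLE_CSV = """Issue,Severity,URLs affected
Redirect chain,Error,42
Missing meta description,Warning,310
Page has no outgoing links,Notice,7
4XX page,Error,12
"""

SEVERITY_RANK = {"Error": 0, "Warning": 1, "Notice": 2}

rows = list(csv.DictReader(StringIO(SAMPLE_CSV)))
# Errors first; within a severity, largest URL count first.
rows.sort(key=lambda r: (SEVERITY_RANK[r["Severity"]], -int(r["URLs affected"])))

for r in rows:
    print(f'{r["Severity"]:<8} {r["Issue"]} ({r["URLs affected"]} URLs)')
```

Feed the sorted list into Asana (or any task manager) so fixes land in priority order rather than spreadsheet order.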

Common Issues & Troubleshooting

"409 Conflict"

A firewall or security plugin (Wordfence, Sucuri, Cloudflare) is blocking the Ahrefs bots. Whitelist the <code>AhrefsSiteAudit</code> and <code>AhrefsBot</code> user-agents and the IP ranges published in the Ahrefs help docs.
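If Cloudflare is doing the blocking, a WAF skip rule along these lines can let the crawlers through (a sketch in Cloudflare's rules expression language; combine with IP checks per your own security policy):

```text
(http.user_agent contains "AhrefsBot") or (http.user_agent contains "AhrefsSiteAudit")
```

Attach this expression to a rule whose action is to skip bot-fight or managed-challenge features for matching requests.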

Crawl timeouts on large sites

Run during low-traffic overnight window, set slower crawl speed, limit max internal pages, or use sitemap-only crawl.

Insufficient crawl credits

Check the dashboard for remaining credits; usage depends on site size and plan. Upgrade your subscription or reduce the max internal pages setting.

Sitemap errors or disallowed pages

Update <code>sitemap.xml</code> so it lists only live URLs that return a 200 status. Check <code>robots.txt</code> for disallow rules and whitelist the Ahrefs bots.
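Extracting the URLs from your sitemap is the first step in verifying they all return 200. A minimal Python sketch using the standard library (the inline sample stands in for your real sitemap.xml):

```python
import xml.etree.ElementTree as ET

# Minimal sample; replace with the contents of your own sitemap.xml.
SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/pricing</loc></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
urls = [loc.text for loc in ET.fromstring(SITEMAP).findall(".//sm:loc", NS)]
print(urls)

# In a real check, request each URL and flag anything that is not a 200, e.g.:
#   import urllib.request
#   status = urllib.request.urlopen(url).status
```

Removing redirecting or dead URLs from the sitemap before crawling keeps the audit's scope (and crawl credits) focused on pages that matter.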

Password-protected site access

Enable Authentication in settings (Advanced+ plans) and enter HTTP credentials.