How to run a site audit on Semrush
Semrush Site Audit crawls your website to detect errors, warnings, and notices like broken links and duplicate content. Configure settings for domain, crawl source, and schedule, then start the audit to get a Site Health score. Review issues, fix them, and re-run for improvements.
Prerequisites
- Active Semrush account with Site Audit access
- Website ownership verification in Semrush
- Website accessible to crawlers (no robots.txt blocks)
- Valid sitemap.xml (optional but recommended)
- Basic technical SEO knowledge
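Before starting an audit, it helps to confirm the crawler isn't locked out at the root. This is a minimal pre-flight sketch using only the Python standard library; the "SiteAuditBot" user-agent string is an illustrative assumption, not Semrush's actual crawler name.

```python
# Check whether a robots.txt file blocks a crawler from the site root.
# NOTE: "SiteAuditBot" is a hypothetical user agent for illustration only.
from urllib.robotparser import RobotFileParser

def root_is_crawlable(robots_txt: str, user_agent: str = "SiteAuditBot") -> bool:
    """Return True if the given robots.txt text allows user_agent to fetch '/'."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(user_agent, "/")

blocked   = "User-agent: *\nDisallow: /\n"        # blocks everything
open_site = "User-agent: *\nDisallow: /admin/\n"  # blocks only /admin/
print(root_is_crawlable(blocked))    # False
print(root_is_crawlable(open_site))  # True
```

If this returns False for your live robots.txt, the audit will stall before it starts, so fix the Disallow rules first.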
Step-by-Step Instructions
Access Site Audit
Create New Folder
Enter Domain and Settings
Enter your domain (e.g., example.com). Choose whether to audit domains or subdomains, and set the limit of pages to crawl (default ranges from 100 to 10,000+ depending on site size).
Configure Pages to Crawl
Set Crawl Masks and Parameters
Set include masks (e.g., /blog/*) or exclude masks (e.g., /admin/*), and define URL parameters to include or exclude (e.g., ?utm_source=). By default, the full site is crawled without masks.
Choose Crawl Mode
Configure Schedule
Review Advanced Options
Monitor and View Results
Re-run and Export
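The crawl masks and URL-parameter filters described in the steps above can be approximated with glob matching. This is a hedged sketch of the general idea; Semrush's actual matching rules may differ, and the mask and parameter values are examples only.

```python
# Approximate include/exclude crawl masks with glob-style path matching,
# and strip tracking parameters from URLs before comparing them.
from fnmatch import fnmatch
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def should_crawl(url, allow_masks=None, disallow_masks=None):
    """Exclude masks win; allow masks restrict; no masks means crawl everything."""
    path = urlparse(url).path
    if disallow_masks and any(fnmatch(path, m) for m in disallow_masks):
        return False
    if allow_masks:
        return any(fnmatch(path, m) for m in allow_masks)
    return True  # default: full site, no masks

def strip_params(url, remove=("utm_source", "utm_medium", "utm_campaign")):
    """Drop excluded query parameters so tracking variants collapse to one URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in remove]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(should_crawl("https://example.com/blog/post-1", allow_masks=["/blog/*"]))    # True
print(should_crawl("https://example.com/admin/login", disallow_masks=["/admin/*"]))  # False
print(strip_params("https://example.com/page?utm_source=x&ref=1"))  # ...?ref=1
```

Treating exclude masks as higher priority than include masks is a design assumption here; verify against the tool's behavior before relying on it.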
Common Issues & Troubleshooting
Broken Links (4xx/5xx Errors)
Remove or replace links that point to error pages. Use Semrush's 'View broken links' feature; if a page loads fine in the browser but Semrush reports it broken, check whether DDoS protection or a firewall is blocking the crawler.
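Triaging broken links mostly comes down to sorting status codes into fix buckets. This is a hedged sketch; the `statuses` dict is hypothetical sample data, not a Semrush export format.

```python
# Sort crawl results by HTTP status class to decide the fix for each link.
def triage(code: int) -> str:
    if 400 <= code < 500:
        return "broken link (4xx): remove or replace the link"
    if code >= 500:
        return "server error (5xx): investigate the target server"
    return "ok"

statuses = {  # hypothetical crawl output: URL -> HTTP status
    "https://example.com/old-page": 404,    # deleted page still linked internally
    "https://example.com/api/report": 503,  # intermittent server failure
    "https://example.com/": 200,
}
broken = {url: triage(code) for url, code in statuses.items() if code >= 400}
print(broken)
```

4xx errors are usually fixed on your own pages (the link itself), while 5xx errors point at the target server, which is why the two classes get different advice.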
Broken Internal JS/CSS Files
Verify file paths and directories. Use absolute URLs in references so pages render properly and Core Web Vitals are measured correctly.
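Converting relative asset references to absolute ones is exactly what the standard-library `urljoin` does. A small sketch, with a hypothetical page URL:

```python
# Resolve relative JS/CSS references against the page URL so asset paths
# stay unambiguous regardless of where the page lives in the site tree.
from urllib.parse import urljoin

page = "https://example.com/blog/post/"  # hypothetical page URL
print(urljoin(page, "../../static/app.css"))  # https://example.com/static/app.css
print(urljoin(page, "/js/main.js"))           # https://example.com/js/main.js
```

Running every asset reference through this kind of resolution (or hard-coding the absolute result in templates) prevents the path mistakes that produce broken JS/CSS reports.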
Duplicate Content
Identify via warnings; implement canonical tags or 301 redirects to consolidate duplicates.
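A crude way to find duplicate candidates yourself is to hash normalized page text and group URLs with identical fingerprints. This is a hedged stand-in for how duplicate-content checks flag pages; the `pages` dict is hypothetical crawl output.

```python
# Group URLs whose normalized body text is byte-identical after hashing.
import hashlib
from collections import defaultdict

def fingerprint(text: str) -> str:
    normalized = " ".join(text.lower().split())  # collapse case and whitespace
    return hashlib.sha256(normalized.encode()).hexdigest()

pages = {  # hypothetical crawl output: URL -> extracted body text
    "/a": "Welcome to our   store!",
    "/a?ref=mail": "welcome to our store!",
    "/b": "Contact us",
}
groups = defaultdict(list)
for url, body in pages.items():
    groups[fingerprint(body)].append(url)
duplicates = [urls for urls in groups.values() if len(urls) > 1]
print(duplicates)  # [['/a', '/a?ref=mail']]
```

Each duplicate group is then a candidate for a canonical tag (pointing variants at one URL) or a 301 redirect.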
Missing Meta Tags
Add required meta descriptions, titles, and other tags to improve indexing and SEO.
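Missing titles and meta descriptions can be spotted with the standard-library HTML parser before an audit ever flags them. A minimal sketch:

```python
# Flag pages missing a <title> or a meta description tag.
from html.parser import HTMLParser

class MetaCheck(HTMLParser):
    def __init__(self):
        super().__init__()
        self.has_title = False
        self.has_description = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.has_title = True
        if tag == "meta" and dict(attrs).get("name") == "description":
            self.has_description = True

def missing_meta(html: str) -> list:
    checker = MetaCheck()
    checker.feed(html)
    issues = []
    if not checker.has_title:
        issues.append("missing <title>")
    if not checker.has_description:
        issues.append("missing meta description")
    return issues

print(missing_meta("<html><head><title>Hi</title></head></html>"))
# ['missing meta description']
```

The same pattern extends to other required tags (canonical links, Open Graph tags) by adding branches in `handle_starttag`.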
Slow Page Speed or Crawlability Issues
Optimize images/assets, check robots.txt, and ensure site accessibility to crawlers.
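For the image/asset side of page speed, a simple size budget helps prioritize what to optimize first. A hedged sketch; the 200 KB threshold is an arbitrary illustration, not a Semrush rule, and the `assets` dict is hypothetical crawl data.

```python
# Flag assets above a size budget, largest first, as optimization candidates.
SIZE_BUDGET = 200 * 1024  # 200 KB, an illustrative threshold

assets = {  # hypothetical crawl data: URL -> transfer size in bytes
    "/img/hero.jpg": 850_000,
    "/js/app.js": 120_000,
    "/img/icon.svg": 4_000,
}
oversized = sorted(
    ((url, size) for url, size in assets.items() if size > SIZE_BUDGET),
    key=lambda item: item[1],
    reverse=True,
)
for url, size in oversized:
    print(f"{url}: {size / 1024:.0f} KB, consider compressing or lazy-loading")
```

Working down this list (compressing images, deferring scripts) typically moves page-speed checks more than micro-optimizations elsewhere.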