Performing an SEO audit manually becomes difficult when you have hundreds of pages on your site.
Having the right tool will help you analyze broken links, broken images, missing meta descriptions, H1 tags, titles, and other SEO-related issues.
Have you ever tried a tool that can crawl your full website and generate a report of all the issues?
If not, you should give the Screaming Frog SEO Spider tool a try.
A tool that helps you identify flaws that can affect your site's rankings lets you take corrective action.
In this Screaming Frog SEO Spider review, I will look at the tool and explain how I used it to audit redirects when I migrated my website.
Screaming Frog SEO Spider is a desktop-based crawling tool that allows you to crawl your website to analyze and audit technical & onsite SEO. If you want a cloud-based crawling tool, then read our in-depth review of DeepCrawl.
From now onwards, I will use SEO Spider instead of Screaming Frog SEO Spider.
SEO Spider supports macOS, Windows, and Ubuntu. You can download and install it on your desktop, and it will automatically detect your operating system as soon as you click on Download.
You might be wondering why anyone would use a desktop-based tool to find errors and issues within a site.
To be honest, I also thought that, until I used its free version.
It presents complex SEO data in an easy-to-understand format. The lack of a cloud-based option is a real drawback, but the tool is still well worth using.
Once you open the SEO Spider tool, you will see various tabs in the top, lower, and right-hand windows. To understand what each tab does, you can read the SEO Spider's user guide.
Now, we will see some of its best features and how they can be used for your site.
SEO Spider offers various features that even experienced users quite often miss. I cannot cover them all, so I will discuss my favourite and most essential highlights.
For some of the features mentioned below, I have shared links to pages that show how to use that particular feature within the tool.
I hope some of these features will save you countless hours.
When you migrate your site, you have to ensure permanent 301 redirects from the old URLs to the new URLs. At that time, it’s crucial to audit redirects on a site to prevent wrong destinations and errors.
You can upload the list of old URLs and crawl them using the SEO Spider tool. The tool follows each redirect chain until it reaches a final URL that returns no response or a 2XX, 4XX, or 5XX status.
Once the crawl is finished, you can check the reports, which list the start and final URLs, status codes, and other information.
You can visit their site to get started with auditing redirects in a site migration.
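The idea behind the redirect audit can be sketched outside the tool as well: request each old URL, follow the chain, and record the final URL and status. This is a minimal illustration using only the standard library, with a hypothetical local server standing in for a migrated site.

```python
# Sketch of a redirect check: follow an old URL until a final status is
# reached. The local server and the /old-page -> /new-page mapping are
# hypothetical stand-ins for your migrated site.
import http.server
import threading
import urllib.request

class RedirectHandler(http.server.BaseHTTPRequestHandler):
    """Serves /old-page as a 301 to /new-page, which returns 200."""
    def do_GET(self):
        if self.path == "/old-page":
            self.send_response(301)
            self.send_header("Location", "/new-page")
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"migrated content")

    def log_message(self, *args):  # silence request logging
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# urllib follows redirects automatically; resp.url is the final destination.
resp = urllib.request.urlopen(f"http://127.0.0.1:{port}/old-page")
print(resp.url, resp.status)  # final URL and its status code
server.shutdown()
```

In a real audit you would loop this over the whole list of old URLs and flag anything whose final status is not 200 or whose final URL is not the intended new destination.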
The canonical tag helps search engines understand your preferred URLs (the web pages you want indexed) and prevents duplicate or similar content from being indexed.
A canonical tag can cause serious SEO problems if you don’t use it properly – for example, if you canonicalize an entire site to your home page, or point canonicals at URLs that redirect or return 404s.
Navigate to Reports > Canonical Errors. You can select between “Canonical Chains” and “Non-Indexable Canonicals.” After exporting the report, you can view canonical errors detected by SEO Spider.
You can visit their site to get started with auditing canonicals.
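At its core, a canonical audit extracts the `rel="canonical"` link from each page's HTML and flags pages that canonicalize somewhere unexpected. The snippet below is a simplified, standard-library-only sketch of that check; the page URL and HTML are made up for illustration.

```python
# Extract rel="canonical" from a page's HTML and flag pages that
# canonicalize elsewhere (e.g. an entire site pointing at the home page).
from html.parser import HTMLParser

class CanonicalParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

def canonical_of(html):
    """Return the canonical URL declared in the HTML, or None."""
    parser = CanonicalParser()
    parser.feed(html)
    return parser.canonical

# Hypothetical example: a blog post that canonicalizes to the home page.
page_url = "https://example.com/blog/post-1"
html = '<html><head><link rel="canonical" href="https://example.com/"></head></html>'

canonical = canonical_of(html)
if canonical and canonical != page_url:
    print(f"{page_url} canonicalizes to {canonical}")  # worth reviewing
```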
Currently, you cannot crawl more than one site at a time in the SEO Spider tool. However, you can work around this by opening multiple instances of the software.
On Windows, you can open a second instance directly from the shortcut. On macOS, it’s not that straightforward.
To open multiple instances of the tool on macOS, you have to launch Terminal on your Mac and type:
```shell
open -n /Applications/Screaming\ Frog\ SEO\ Spider.app/
```
The Crawl Path Report gives quick insight into how the crawler discovered a particular page, i.e., the path the crawler took to reach the destination URL.
This report is especially useful for pages with a high crawl depth. It can also be used to identify how the crawler picked up a problematic URL.
Right-click on any URL, click on “Export,” and then select the “Crawl Path Report.” The report will be downloaded in CSV format.
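Conceptually, the crawl path is the shortest chain of links from the start URL to the destination, which is exactly what a breadth-first search over the site's link graph produces. Here is a small sketch of that idea; the link graph below is a hypothetical toy site, not output from the tool.

```python
# Reproduce the idea behind a crawl path: breadth-first search from the
# start URL, then walk parents back to reconstruct the shortest path.
from collections import deque

# Hypothetical link graph: page -> pages it links to.
links = {
    "/": ["/blog", "/about"],
    "/blog": ["/blog/post-1"],
    "/about": [],
    "/blog/post-1": ["/blog/post-1/comments"],
    "/blog/post-1/comments": [],
}

def crawl_path(start, target):
    """Return the shortest chain of URLs from start to target, or None."""
    parent = {start: None}
    queue = deque([start])
    while queue:
        url = queue.popleft()
        if url == target:
            path = []
            while url is not None:
                path.append(url)
                url = parent[url]
            return path[::-1]
        for nxt in links.get(url, []):
            if nxt not in parent:
                parent[nxt] = url
                queue.append(nxt)
    return None

print(crawl_path("/", "/blog/post-1/comments"))
# A URL's crawl depth is simply len(path) - 1.
```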
SEO Spider provides interactive website visualizations to understand your site’s structure in different ways.
Visualizations present data that is already available in a crawl. They are useful for understanding issues within the site, and many people prefer them to digging through spreadsheets.
Right-click on a URL, click on “Visualizations,” and then select the type of graph or diagram from the following:
- Crawl Tree Graph
- Directory Tree Graph
- Force-Directed Crawl Diagram
- Force-Directed Directory Tree Diagram
Crawl Tree Graph and Force-Directed Crawl Diagram generate visualizations that show how the SEO Spider crawled your site (by shortest path).
Force-Directed Crawl Diagram
The diagram looks like a heat map, with the homepage shown as the darkest green node. This indicates that the crawl started from the website’s homepage, and the lines between nodes represent links between two URLs.
As the crawl depth increases, the nodes get smaller and lighter. You can easily understand it by viewing the below image.
From the top-right corner, you can click on the ‘i’ icon to get more information about the nodes’ colors.
Crawl Tree Graph
The purpose of the Crawl Tree Graph is the same as that of the Force-Directed Crawl Diagram, but the representation differs.
URLs are represented as small circles, and the lines represent links along the shortest path between two URLs. Crawl depth can be read from the graph from left to right.
Sometimes, the SEO Spider reports titles, meta descriptions, or canonical tags as missing even though they clearly exist when viewed in a browser. This can happen because the site responds differently to the crawler than to the browser.
To diagnose exactly what the SEO Spider is seeing, you have to save the HTML returned by the server.
Navigate to Spider Configuration > Advanced and tick both “Store HTML” and “Store Rendered HTML.”
After configuring this, the SEO Spider stores the exact HTML the server first returned for each crawled URL. You can view it in the “View Source” tab.
Now, you can analyze the issues by reviewing the returned HTML. This feature will help you to diagnose the issues quickly and easily.
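To see why the stored HTML can differ from what a browser shows, note that a server can vary its response on the request itself, for instance on the User-Agent header. The contrived local server below demonstrates this; real sites may also vary on cookies, IP, or JavaScript rendering.

```python
# Demonstrate a server returning different HTML depending on User-Agent,
# which is one reason a crawler's stored HTML can differ from a browser's.
import http.server
import threading
import urllib.request

class UAHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        ua = self.headers.get("User-Agent", "")
        # Browsers (whose UA contains "Mozilla") get a title; others don't.
        body = b"<title>Hello</title>" if "Mozilla" in ua else b"<html></html>"
        self.send_response(200)
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence request logging
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), UAHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/"

def fetch(user_agent):
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode()

browser_html = fetch("Mozilla/5.0")   # what a browser would see
crawler_html = fetch("SomeCrawler")   # what a crawler might receive
print(browser_html != crawler_html)
server.shutdown()
```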
Note: You may have read about many of this tool’s standard features in other articles. I have left those out of my review because you can uncover them quickly on your own once you start using the tool.
SEO Spider offers two plans, in which one is free, and the other is paid. For the paid version, you have to purchase a license that costs £149 per year.
If you want to identify broken links and analyze site metadata, you don’t have to purchase a license. But when it comes to advanced features, sticking with the free version is not a great idea.
You can only crawl 500 URLs with the free version. To compare all the features of the free and paid version of Screaming Frog SEO Spider, you can visit their official pricing page.
Each user needs their own license, so if you want to allow two users, you’ll have to buy two licenses. Multiple licenses are available at a discounted price, but only when you purchase five or more.
Screaming Frog SEO Spider is not the only crawling tool; there are others on the market. Here, we will discuss two of the best alternatives to SEO Spider: one cloud-based, the other desktop-based.
DeepCrawl is a cloud-based crawling tool that can run large crawls to monitor and analyze issues within your website. You can also preview issues that might arise after you migrate or launch the site.
If you want to crawl more than once, you can schedule crawls to run hourly, daily, weekly, fortnightly, monthly, or quarterly.
When DeepCrawl identifies an issue, you can add it to the task manager and assign it to your team members. Within the task manager, you can describe the problem in detail and set a deadline for completion.
DeepCrawl’s paid plans start at $14 per month, which lets you crawl 10,000 active URLs and gives you full API access. It also offers a 14-day free trial that doesn’t require payment details.
Check our complete review and walkthrough of DeepCrawl here to learn more about this web-based crawling tool.
Netpeak Spider is a website crawler that helps you to discover various SEO issues within the site. By identifying the potential SEO issues, you can improve your site’s visibility in the search engine.
Netpeak Spider also checks pages, titles, meta descriptions, headers, broken images, duplicate content, the robots.txt file, and more. Fixing these issues can help increase your site’s organic traffic.
Netpeak Spider PageRank Checker helps you optimize your backlink profile by displaying dead ends, redirect pages, orphan pages, and other issues with each URL.
You can also generate white-label reports covering various aspects of on-page analysis and export them as PDFs. However, this feature is only available in the Pro and Custom subscription plans.
The cheapest Netpeak Spider plan starts at $19 per month and offers various features, including website crawling, site optimization, and SEO auditing. You’ll have to upgrade your plan if you want to crawl more than one site simultaneously.
At first glance, you might be put off by Screaming Frog SEO Spider’s interface, but once you start getting in-depth data about your site’s SEO issues and performance, you’ll forgive the user interface.
Despite being a desktop-based crawling tool, it generates valuable insights that some web-based crawling tools cannot.
I wouldn’t say you should invest your money in Screaming Frog SEO Spider right away; instead, I suggest you use its free version first.