What makes a site the best it can be? Healthy, functional sites that have reached their full SEO potential have been optimized based on market and keyword research, E-A-T, content relevance to search intent, backlink profiles, and more. But they all have one thing in common: their technical SEO needs are met.
Your site’s technical SEO needs form a hierarchy. If needs lower in the hierarchy aren’t met, needs on the next level are difficult to fulfill. Each level responds to a different requirement in the world of search engines: crawlability, indexability, accessibility, rankability, and clickability.
Understanding what each level of the pyramid involves helps make technical SEO look less intimidating without oversimplifying its role in making a website great.
The foundations of technical SEO: crawlability
At the foundation of the pyramid of technical SEO needs is a URL’s crawlability.
Crawlability concerns a URL’s ability to be discovered by search engine bots. URLs that are not crawlable might still be accessible to users navigating your website, but because they are invisible to bots, they can’t appear in search results.
Crawlable URLs, therefore, are:
- Known to search engines. Search engines discover URLs by crawling links and reading sitemaps.
- Not forbidden to bots. Most search engine bots respect directives in a robots.txt file that ask them not to crawl certain pages or directories, as well as meta robots instructions such as nofollow.
- Covered by the website’s crawl budget. Less commonly, the “budget” accorded by Google’s algorithms is spent on other parts of a site, causing delays or problems in getting a specific URL crawled.
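To make the second point concrete, here is a minimal robots.txt sketch; the directory names and sitemap URL are hypothetical. Rules under `User-agent: *` apply to all compliant bots, and the `Sitemap` line supports the first point by helping search engines discover URLs:

```
User-agent: *
Disallow: /admin/
Disallow: /search-results/

Sitemap: https://www.example.com/sitemap.xml
```

The file lives at the root of the host (e.g. `https://www.example.com/robots.txt`) and only asks bots to comply; it is a directive, not an enforcement mechanism.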
The first step in a technical SEO audit, for example, is to uncover pages that can’t be crawled, and to determine why. Sometimes the block is intentional; sometimes it’s an error, and fixing it is a quick win for SEO.
Similarly, while crawl budget may seem esoteric and difficult to quantify, the basic principle is simple: when the cost of crawling is optimized and priority pages are presented first, search engines can send more traffic. Technical SEO shapes how pages are discovered and prioritized to promote better crawling, and it leverages historical data on crawl frequency, along with past events that provoked increased crawling activity, to improve current crawl rates.
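The crawl-permission side of this can be checked programmatically. The sketch below uses Python’s standard-library robots.txt parser against a hypothetical file; the URLs and rules are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; in practice you would fetch
# the live file from https://www.example.com/robots.txt instead.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A normal content page is crawlable...
print(parser.can_fetch("Googlebot", "https://www.example.com/blog/post"))       # True
# ...while anything under the disallowed directory is not.
print(parser.can_fetch("Googlebot", "https://www.example.com/private/report"))  # False
```

Crawl-audit tools apply the same matching logic at scale to flag URLs that bots are forbidden to fetch.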
Figure: distribution of newly crawled pages by page group. The grey “Other” category collects garbage pages, on which a lot of crawl budget has been wasted. Source: OnCrawl.
The next level: indexability

Just above crawlability in the hierarchy of technical SEO needs is indexability.
Indexable URLs are URLs that a search engine can include in its catalog of pages available to be shown in search results. Even when a URL has been crawled, various properties can prevent it from being added to the index.
In the most straightforward situations, pages are kept out of the index by a noindex directive, delivered in a meta robots tag or an X-Robots-Tag HTTP header; a robots.txt block, by contrast, prevents crawling rather than indexing.
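As a sketch of the straightforward case, a noindex directive can be placed in the page’s HTML head; the tags below are standard meta robots syntax, applied here to a hypothetical page:

```html
<!-- Ask all search engine bots not to index this page -->
<meta name="robots" content="noindex">

<!-- Or target a specific bot by name -->
<meta name="googlebot" content="noindex">
```

For non-HTML resources such as PDFs, the equivalent signal is sent as an `X-Robots-Tag: noindex` HTTP response header.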