Average Reading Time: 2-5 minutes
The words “SEO audit” may send chills down the spine of the uninitiated, but an audit can be tremendously helpful in improving your site’s performance in organic search. Google collects all the data on the pages it’s crawled into a massive database known as the index. Performing an audit lets you see your site the way Google does, allowing you to make sure the proper pages are being indexed. Identifying red flags early will save you time, money, and headaches in the long run.
Once a page has been indexed, Google applies its algorithms to determine that page’s rank for a particular keyword. This is why it’s so important to audit your site and confirm it’s being crawled and indexed correctly. Pages that shouldn’t be indexed often end up there anyway, and you’ll want to know if that’s happening on your own site.
Beginning Your Audit
Fortunately, starting the audit process is very easy. Running a simple site: search on Google instantly yields useful information. Just type “site:yourdomain.com” into the search bar and see what comes up.
You should be able to tell whether too many pages are being indexed, or too few. Do you notice duplicate pages in the search results? Anything else look out of place? This will give you a good idea of where the potential issues are for your site.
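For example, with yourdomain.com standing in as a placeholder for your own domain, you can narrow the view section by section:

```
site:yourdomain.com         every page Google has indexed for your domain
site:yourdomain.com/blog    indexed pages within a single section
```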
What Pages Should Be Indexed?
All the main types of pages we covered previously should be indexed: the home page, category pages, product pages, and information pages. If your home page isn’t being indexed, that’s a pretty clear sign you have a major problem on your hands. You may also want subdomains, landing pages, archive pages, and other media like PDFs to be indexed.
What Pages Should NOT Be Indexed?
However, there are many kinds of pages that you don’t want to see indexed. Here are several to watch out for:
- Dynamic URLs or parameters (think ?id=1234)
- Filtered URLs (think ?size=large&color=blue)
- Tracking URLs (think ?source=facebook)
- Backend or customer login pages
- Site search results
- Low-value pages
- Duplicate pages
- Checkout/confirmation pages
- Standalone images/media attachments (optional)
Helpful Search Operators
When auditing your site, you can combine the following search operators to get more specific results. Just type any of them into the search bar on Google along with your query; a few combined examples follow the list.
- site: Show only indexed pages from this domain
- inurl: Return only results with a specific word in the URL
- -keyword (minus + word): Exclude results containing a specific word
- filetype: Filter for specific file types, like PDFs
- *: Wildcard (stands in for unknown or variable words in a phrase, e.g. “best * for ecommerce”)
- $: Search for a specific price
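Here are a few combined examples (again, yourdomain.com is a placeholder):

```
site:yourdomain.com inurl:sale       indexed pages with "sale" in the URL
site:yourdomain.com -checkout        indexed pages that don't mention "checkout"
site:yourdomain.com filetype:pdf     indexed PDFs on your domain
```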
Crawling Your Own Site
Once you’ve audited the index, it’s time to focus on your site itself. There are a variety of free and paid tools you can use to crawl your site. Some of the most popular include SEMrush, Screaming Frog, Moz, DeepCrawl, and Botify.
There are plenty of reasons to crawl your site. You’ll locate areas where you need to improve, find any technical gaps that may have been missed during development, and match your findings to the index audit performed earlier. Most importantly, you can optimize your crawl budget.
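If you’re curious what those tools do under the hood, here’s a minimal same-domain crawler sketch in Python. It uses only the standard library, the start URL is a placeholder, and it deliberately omits the politeness controls, robots.txt handling, and reporting that the dedicated tools provide:

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects href values from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=50):
    domain = urlparse(start_url).netloc
    seen, queue = set(), deque([start_url])
    while queue and len(seen) < max_pages:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            with urlopen(url, timeout=10) as resp:
                # Skip non-HTML assets (images, PDFs, etc.)
                if "text/html" not in resp.headers.get("Content-Type", ""):
                    continue
                html = resp.read().decode("utf-8", errors="replace")
        except OSError as exc:  # covers URLError and HTTPError
            print(f"FAILED  {url}  ({exc})")
            continue
        print(f"OK      {url}")
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href).split("#")[0]
            # Stay on the same domain; everything else is out of scope.
            if urlparse(absolute).netloc == domain:
                queue.append(absolute)
    return seen

# Placeholder URL -- swap in your own homepage.
crawl("https://yourdomain.com/")
```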
Crawl Budget
Google allocates limited time and resources to crawling any given site; that allocation is known as your crawl budget. Each time Google crawls your site, it doesn’t necessarily visit every individual page. Google’s index is only as accurate as its most recent crawl, so you want to ensure the most important pages of your site get crawled often.
Conversely, you don’t want Google to waste time crawling pages that don’t add value to your site. These include broken links, redirect chains, duplicate content, and other low-value assets.
What to Look for in a Crawl
When crawling your site, pay close attention to these items.
HTML Tags & Meta Data
Make sure there are no duplicate or missing tags. Check individual tags for the following:
- Page titles: Character length; content matches page
- Meta descriptions: Length; persuasive copy
- Headers: Proper usage and hierarchy
- Image tags: Alt text; titles; descriptive copy
Avoid keyword stuffing and “over-optimizing.” You want your content to be readable and to use natural language.
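As a reference point, here’s what clean tags might look like on a hypothetical product page (all names, URLs, and copy are made up, and the character counts are common guidelines rather than hard limits):

```html
<head>
  <!-- Unique per page; roughly 50-60 characters displays fully in results -->
  <title>Blue Cotton T-Shirt | Example Store</title>

  <!-- Persuasive summary; roughly 150-160 characters before truncation -->
  <meta name="description" content="Soft, breathable blue cotton t-shirt
    in sizes S-XXL. Free shipping on orders over $50.">
</head>

<body>
  <!-- One H1 per page, with H2s/H3s nested beneath it in order -->
  <h1>Blue Cotton T-Shirt</h1>
  <h2>Fabric and Fit</h2>

  <!-- Alt text describes the image for users and crawlers alike -->
  <img src="/images/blue-tshirt.jpg" alt="Men's blue cotton t-shirt, front view">
</body>
```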
Response Codes
A response code is the status a server returns when a browser or crawler requests your page. The sketch after this list shows one way to spot-check codes in bulk.
- 200s: Everything’s good!
- 300s: Redirects. Aim for 301s (permanent) and avoid 302s (temporary). Update internal links to point directly to the new URLs.
- 400s: Client errors, most often broken links (404s). These should be redirected or removed.
- 500s: Server errors. Alert your development team and flag for redirects if needed.
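Here’s one way to spot-check codes in bulk: this Python sketch (standard library only; the URLs are placeholders) prints each URL’s raw status without following redirects, so 301s and 302s show up as themselves:

```python
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # returning None makes urllib raise instead of following

opener = urllib.request.build_opener(NoRedirect)

# Placeholder URLs -- swap in pages from your own site.
urls = [
    "https://yourdomain.com/",
    "https://yourdomain.com/old-category/",
]

for url in urls:
    try:
        with opener.open(url, timeout=10) as resp:
            print(url, resp.status)  # 2xx: page served directly
    except urllib.error.HTTPError as err:
        # 3xx/4xx/5xx arrive here; show the redirect target if there is one
        print(url, err.code, err.headers.get("Location", ""))
    except urllib.error.URLError as err:
        print(url, "unreachable:", err.reason)
```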
On-Page Directives
You can use on-page directives to give crawlers specific instructions about what to do with certain pages. Examples of each follow the list.
- Noindex: Keeps the page out of the index, even though it can still be crawled.
- Canonical: If you have duplicates of the same page, this identifies the “master” copy.
- Nofollow: Tells crawlers not to follow a link (or, in its meta-tag form, any links on the page). Helpful for paid links, comment sections, affiliate links, and login or private page links.
- Robots.txt: A site-level file rather than an on-page tag, used to block entire sections or set crawl rules for your domain. For example, you can block dynamic URLs, filters, site search results, and more.
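Here’s what each directive looks like in practice. The first three live in a page’s HTML; robots.txt is a plain-text file at the root of your domain. All URLs and paths below are hypothetical, so adapt the patterns to your own site:

```html
<!-- Keep this page out of the index -->
<meta name="robots" content="noindex">

<!-- Point duplicate pages at the master copy -->
<link rel="canonical" href="https://yourdomain.com/product/blue-tshirt/">

<!-- Don't follow (or pass credit through) this specific link -->
<a href="https://partner.example.com/offer" rel="nofollow">Partner offer</a>
```

```
# robots.txt -- block crawl of low-value sections and parameterized URLs
User-agent: *
Disallow: /checkout/
Disallow: /search/
Disallow: /*?id=
Disallow: /*?size=
```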
How Should I Block My Page?
Determining the best technique for blocking a particular page on your site may seem confusing. Keep the following rules of thumb in mind.
- Robots.txt: “I don’t want the search engine to crawl my page at all.”
- Noindex: “I want Google to crawl the page but keep it out of the index.”
- Password protected: “This is private and I don’t want anyone accidentally (or purposefully) seeing this page.”
Secure vs. Insecure Content
Google places heavy emphasis on the HTTPS protocol, especially for sites that collect private information. (That would be you, ecommerce sites.) If you haven’t done so already, make sure to migrate your site to HTTPS.
Your crawl will let you know whether your insecure content is being redirected to secure content. It will also identify insecure assets on pages, like images or third-party content.
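That second check is easy to approximate yourself. This sketch fetches a page and flags any src or href attributes that still point at plain-HTTP assets (standard library only; the URL is a placeholder):

```python
import re
import urllib.request

# Placeholder URL -- swap in a page from your own site.
url = "https://yourdomain.com/"

with urllib.request.urlopen(url, timeout=10) as resp:
    html = resp.read().decode("utf-8", errors="replace")

# Flag attributes that still reference assets over plain HTTP.
insecure = re.findall(r'(?:src|href)="(http://[^"]+)"', html)
for asset in sorted(set(insecure)):
    print("insecure reference:", asset)
```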
TL;DR
An SEO audit will give you valuable information about how your site appears to search engines, and the process is easier than you might think. It’s important to know whether the right pages are being indexed and how to correct issues with redirects, broken links, site security, and more.
That may all seem a bit overwhelming, but a technical audit really can do wonders for your SEO. To make it even easier, we’re offering free technical SEO audits just for you! Visit netelixir.com/contact and tell us a little about yourself to get started.
If you missed any of our #SEOWeek webinars, you can view them on-demand at https://www.netelixir.com/university/all-webinars/