Your Ultimate Guide to Technical SEO 2018: Part 1

When technical search engine optimization (SEO) strategists begin working on a website, the job can feel overwhelming. With so many tactics to choose from, it’s difficult to know which ones will have the biggest impact. On top of that, resource constraints, such as limited front-end and back-end developer availability, can reduce the number of actions you can take.

Rather than get overwhelmed with the endless tactical options you could execute, focus on the core SEO principles—then create a hyper-focused roadmap to fulfill those principles.

Here are three technical SEO principles that, if incorporated into your overall SEO strategy, will improve the technical health of your sites and offer them the best chance to rank and perform in Google:

  • If Google can’t see your page, no one else will.
  • Technical SEOs serve two primary users: searchers and bots. Optimize for both.
  • Words are just 1s and 0s to machines and bots. Explicitly create meaning and context for them.

This three-part series will arm you with specific strategies and tactics to help you adopt each of these technical SEO principles.

Principle 1. If Google can’t see your page, no one else will.

You may be able to create the most helpful, authoritative content in your industry—but if Google can’t see it, no one else will.

Your primary goals should be (1) to ensure that every page you want ranked is accessible to Google and (2) to block the pages you want kept away from Google’s crawler (Googlebot).

Tactic #1: Fix Links to Broken Pages

On any given day, Googlebot will crawl a limited number of pages on your site. The amount crawled daily is relatively consistent, but links to broken pages waste that “crawl budget” and make it harder for Google to find new pages or detect updates to existing ones.

Broken pages are a normal part of the web. Google understands this and does not penalize sites for having broken pages. But a site free of broken pages is a more efficient site for Googlebot to crawl.

How to Fix Broken Links

Using your favorite web crawler, perform a crawl on your site and export all links that point to broken pages. This includes pages generating 4xx (such as 404) and 5xx (such as 502) errors. Repeat this process for all external links pointing to broken pages.

Provide your developer or content manager with a list of all broken links and the recommended action to fix them (remove the link or redirect the link to a different URL).
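
If you want to spot-check link targets without waiting on a full recrawl, a short script can do it. The example below is a minimal sketch in Python using the requests library; the URL list is a placeholder for the link targets exported from your crawl.

  # Spot-check link targets for broken (4xx/5xx) responses.
  # Minimal sketch: the URL list stands in for link targets exported from a site crawl.
  import requests

  link_targets = [
      "https://www.example.com/",
      "https://www.example.com/old-page/",
  ]

  for url in link_targets:
      try:
          # HEAD keeps the check lightweight; some servers only respond correctly to GET
          response = requests.head(url, allow_redirects=False, timeout=10)
          if response.status_code >= 400:
              print(f"BROKEN ({response.status_code}): {url}")
      except requests.RequestException as exc:
          print(f"FAILED: {url} ({exc})")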

Tactic #2: Clean Up Redirected Links

Linking to redirected URLs will not trigger a devaluation of your site, but it can affect your crawl budget. In fact, Google has said that long redirect chains (chains in which a URL passes through multiple redirects before reaching its destination) have a negative impact on your crawl budget.

How to Fix Redirected Links

Perform a crawl on your site and export all links that point to redirected pages. This includes pages generating 3xx (such as 301) status codes. Repeat this process for all external links pointing to redirected pages.

Provide your developer or content manager with a list of all redirected links and the recommended action to fix them (remove the link or redirect the link to the proper destination URL).

You’ll also want to provide your developer a list of redirect chains found in your crawl, with instructions to remove the extra hops in the redirect.
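
To see how many hops a given link passes through, you can simply follow it and inspect the redirect history. Below is a rough sketch in Python using the requests library; the starting URL is a placeholder for a redirected link from your crawl export.

  # Follow a redirecting URL and report every hop in the chain.
  # Rough sketch: replace the example URL with redirected links from your crawl export.
  import requests

  start_url = "http://www.example.com/old-path/"

  response = requests.get(start_url, allow_redirects=True, timeout=10)
  hops = response.history  # each intermediate 3xx response, in order

  if len(hops) > 1:
      print(f"Redirect chain ({len(hops)} hops) starting at {start_url}")
      for hop in hops:
          print(f"  {hop.status_code}  {hop.url}  ->  {hop.headers.get('Location')}")
      print(f"Final destination: {response.url} ({response.status_code})")
  elif hops:
      print(f"Single redirect: {start_url} -> {response.url}")
  else:
      print(f"No redirect: {start_url} ({response.status_code})")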

Tactic #3: Optimize Internal Linking

Googlebot crawls your site and discovers new pages through internal links. Google will have difficulty finding and ranking pages on a site with an inconsistent internal linking structure.

How to Optimize Your Internal Linking

The first step is to determine your base internal linking structure. Your structure should be determined by the site type (e-commerce, blog, company “brochure” site, etc.) and your content strategy.

A good place to start is the Classic SEO Structure created by Zac Heinrichs, SEO team lead at Portent:

  1. Every page links to the home page.
  2. Every page links to every category level page. This is your main navigation.
  3. Sub-category pages link to other sub-categories within the same category only.
  4. Sub-category pages link to their own topic pages only.
  5. Topic pages link to sub-category pages within the same category only.
  6. Topic pages link to their own article pages only.
  7. Article pages link to each sub-category page within the same category only.
  8. Article pages link to each topic page within its own sub-category.
  9. Article pages link to other article pages within the same topic.

Portent’s “Classic Internal Linking for SEO” (image source: https://www.portent.com/blog/seo/smart-internal-linking-for-seo.htm)

After determining your base internal linking structure, ensure every page template (category page, sub-category page, topic page, article, etc.) meets those requirements. For example, using the Classic SEO Structure, you would link every page to the home page (typically via the header logo).

The last step is to identify and fix orphan pages (pages that have no internal links pointing to them). Here are three ways to find orphan pages on your site:

  1. Compare the list of pages within your XML sitemap with the pages crawled by Screaming Frog (a short sketch of this comparison follows the list).
  2. Compare the list of pages that received visits in Google Analytics with the pages crawled by your web crawler.
  3. Compare the list of pages that received hits in your server logs with the pages crawled by your web crawler.
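
The first comparison is a straightforward set difference: any URL listed in your XML sitemap that never shows up in your crawl export has no internal path leading to it. Here is a minimal sketch in Python; the sitemap URL and the "crawled_urls.txt" export file are assumptions you would swap for your own.

  # Flag likely orphan pages: URLs in the XML sitemap that the crawler never reached.
  # Minimal sketch: "crawled_urls.txt" stands in for a URL list exported from your crawler.
  import xml.etree.ElementTree as ET
  import requests

  SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
  NAMESPACE = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

  # URLs the sitemap says exist
  root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).text)
  sitemap_urls = {loc.text.strip() for loc in root.findall(".//sm:loc", NAMESPACE)}

  # URLs your crawler actually discovered via internal links (one per line)
  with open("crawled_urls.txt") as f:
      crawled_urls = {line.strip() for line in f if line.strip()}

  for url in sorted(sitemap_urls - crawled_urls):
      print(f"Possible orphan page: {url}")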

Tactic #4: Ensure JavaScript Is Crawlable

As long as you are not blocking JavaScript resources, Google should be able to crawl, render, and index most JavaScript content. However, this is not true for every site; there are still cases where JavaScript content is not crawlable.

How to Ensure Google Can Crawl Your Site’s JavaScript

When optimizing your site’s JavaScript content, I recommend following these five rules from Justin Briggs of Briggsby Consulting:

  • Content loaded in by the load event (or a 5-second timeout) is indexable.
  • Content dependent on user events is not indexable.
  • Pages require an indexable URL, with server-side support.
  • Audit rendered HTML (Inspect Element) using the same SEO best practices you use on traditional pages.
  • Avoid contradictions between versions.
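
A practical check for the first and fourth rules is to compare the raw HTML your server returns with the HTML that exists after the browser fires the load event, and confirm that your important content appears in the rendered version. The sketch below uses Python with requests and Selenium; it assumes headless Chrome and chromedriver are installed, and the URL and key phrase are placeholders.

  # Compare raw server HTML with rendered (post-load-event) HTML for a key piece of content.
  # Rough sketch: assumes headless Chrome + chromedriver; the URL and phrase are placeholders.
  import requests
  from selenium import webdriver

  URL = "https://www.example.com/js-powered-page/"
  KEY_PHRASE = "Free shipping on all orders"  # content that must be indexable

  # 1. Raw HTML, as a non-rendering crawler would see it
  raw_html = requests.get(URL, timeout=10).text

  # 2. Rendered HTML after the load event
  options = webdriver.ChromeOptions()
  options.add_argument("--headless")
  driver = webdriver.Chrome(options=options)
  driver.get(URL)  # returns after the page's load event
  rendered_html = driver.page_source
  driver.quit()

  print(f"Key phrase in raw HTML:      {KEY_PHRASE in raw_html}")
  print(f"Key phrase in rendered HTML: {KEY_PHRASE in rendered_html}")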

Tactic #5: Block URLs and Pages You Don’t Want Google to See

You will have some pages that you don’t want Google to crawl and index. These could include paid search landing pages, low-quality content, empty category pages, auto-pagination URLs, and similar content. Ideally, only the pages you want indexed appear in Google’s index.

Failing to block Google from crawling and indexing these pages creates “index bloat” (a Google index cluttered with low-value pages).

How to Block URLs and Pages from Being Crawled and Indexed

First, you’ll want to see what’s currently indexed in Google. To see what pages are currently indexed, go to Google and enter site:yourdomain.com. You can also see the pages indexed for your site in Google Search Console, under Status >> Index Coverage.

The method you use to block pages from Google’s index will depend on the page/URL type. Here are some common scenarios and suggestions:

  • Paid search and other channel landing pages: use robots.txt or meta robots tags.
  • Auto-pagination URLs: use meta robots tags and canonical URLs.
  • Low quality content: use meta robots tags.
  • Empty category, tag, and author pages: use robots.txt or meta robots tags.
  • URLs with additional parameters: use robots.txt, meta robots tags, canonical URLs, or Google Search Console’s URL Parameters tool.
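
Whichever method you choose, verify that it is actually taking effect. Keep in mind that robots.txt prevents crawling, a meta robots noindex tag prevents indexing, and Googlebot must be able to crawl a page to see its noindex tag. The example below is a minimal sketch in Python (standard library plus requests) that checks both signals for a single URL; the URL is a placeholder.

  # Check whether a URL is disallowed in robots.txt and whether it carries a noindex meta tag.
  # Minimal sketch: the URL is a placeholder for a page you want kept out of the index.
  import re
  import requests
  from urllib import robotparser
  from urllib.parse import urljoin, urlparse

  url = "https://www.example.com/ppc-landing-page/"

  # robots.txt check (controls crawling)
  parts = urlparse(url)
  parser = robotparser.RobotFileParser()
  parser.set_url(urljoin(f"{parts.scheme}://{parts.netloc}", "/robots.txt"))
  parser.read()
  print(f"Crawlable by Googlebot: {parser.can_fetch('Googlebot', url)}")

  # meta robots check (controls indexing); simple pattern, attribute order may vary on real pages
  html = requests.get(url, timeout=10).text
  noindex = re.search(r'<meta[^>]+name=["\']robots["\'][^>]*content=["\'][^"\']*noindex', html, re.I)
  print(f"Has noindex meta tag:   {noindex is not None}")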

Tactic #6: Ensure Your Site Has Proper Sitemaps

Sitemaps are the most efficient way for Google to find all pages on your site. However, sitemaps often run into two common problems.

First, when sitemaps are not continually updated with new content, Google must crawl through your site’s internal links to find those pages. Google will have a harder time finding pages buried deep within your site architecture.

Second, when sitemaps are not continually updated and pruned, Google will crawl redirected or deleted pages that provide no value.

How to Ensure Your Site Has Proper Sitemaps

Most content management systems (CMSs) such as WordPress and Drupal will automatically update sitemaps after any changes are made. If you are on a custom CMS or your site was built with static HTML, you, your content manager, or your developer must manually update sitemaps each time a change occurs on the site. This includes excluding blocked, removed, or redirected pages from your sitemaps.
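
A periodic audit catches stale entries in either setup. Below is a minimal sketch in Python with requests that pulls a sitemap and flags every entry that no longer returns a 200 status code, making it a candidate to prune or update; the sitemap URL is a placeholder.

  # Flag sitemap entries that redirect or error so they can be pruned or updated.
  # Minimal sketch: replace the sitemap URL with your own.
  import xml.etree.ElementTree as ET
  import requests

  SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
  NAMESPACE = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

  root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).text)
  for loc in root.findall(".//sm:loc", NAMESPACE):
      url = loc.text.strip()
      response = requests.head(url, allow_redirects=False, timeout=10)
      if response.status_code != 200:
          print(f"Prune or update: {url} returned {response.status_code}")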

Also, ensure all sitemaps are added to Google Search Console, and resubmit them whenever URL changes occur.

What’s Next

In Part 2 of our series on technical SEO tactics, we’ll cover how to optimize for both searchers and bots.

Dwayne Hogan

Dwayne has been developing SEO strategies for local and national brands since 2008. As an SEO Manager at Clearlink, he is integral to the development of the SEO channel and provides oversight of ongoing SEO strategies across multiple web properties.
