You could have an awesome-looking website with well-thought-out content and still not rank highly in search results. Often, the culprit is a technical issue that prevents a search engine from fully understanding your content, or possibly from even seeing it in the first place.

By the end of this blog post, you will have a better understanding of why technical SEO issues matter, which ones are most common, and what questions you should ask yourself when auditing your own website.

We like to refer to these issues as “technical debt.” This debt hinders a search engine’s ability to crawl, index, render, and rank the content on your site, and together these issues keep your site from ranking higher in search results. If you’re having trouble understanding why your content isn’t ranking higher, the questions outlined below might help you figure out where to focus.

1. Do you have a clean internal link structure?

Having a good internal linking structure is crucial for crawling and indexing the content on your website. Crawlers like Googlebot are constantly visiting your website, and they use your internal links to discover new content. If you aren’t linking to a valuable resource on your website, it’s unlikely Google will ever find it.

Why you should be thinking about this.

You want to create the path of least resistance for any crawler that visits your website. While crawlers are getting more intuitive, they won’t be able to reach every single page on your website without a little help. On the flip side, there are some sections of your website that you might not want crawlers to find and spend time understanding.

For example, one of our clients had an event calendar with a “View Next Month’s Events” button. This button automatically created a link to the next month even if no events were scheduled, which generated links all the way out to the year 2147, just in case you wanted to know what events were going on in the next 120 years. When the crawler spent time on these pages, the client’s more valuable pages weren’t crawled as often, and their rankings dropped.

By creating a good internal linking structure, you open the doors for crawlers to the content you want indexed and close the doors to the rest. Restricting the crawling of your site is an important element, but also a delicate process.

Examples of what can go wrong.

Having a good internal linking structure doesn’t mean you want 50 links coming from one service page or blog post. Adding internal links should be a well-thought-out process. If you are writing a blog post, it’s acceptable to have roughly ten internal links throughout your post. Just because a word or phrase matches one of your other pages or services doesn’t mean you should link to it.

Make sure your anchor text matches what you’re linking to. If you want to link to a service page about your inbound marketing, don’t use the phrase “technical SEO,” because that’s misleading to the user.

On the flip side, you want to be careful about restricting content. If you see a crawler spending a lot of time crawling paginated content, think strategically about that content. Is it valuable? Can I optimize these pages? If the answer is no, be careful about how you restrict it, especially if you plan to use a robots.txt rule.
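Before deploying a robots.txt rule, it helps to test it against real URLs. Here is a minimal sketch using Python’s standard-library robots.txt parser; the rules and URLs are hypothetical examples, not recommendations for your site.

```python
# Sketch: test a candidate robots.txt rule against specific URLs before
# deploying it. The rules and URLs below are made-up examples.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /events/calendar/
""".strip().splitlines()

parser = RobotFileParser()
parser.parse(rules)

# can_fetch() reports whether a given crawler may request a URL.
print(parser.can_fetch("*", "https://example.com/events/calendar/2147-01/"))  # blocked
print(parser.can_fetch("*", "https://example.com/blog/technical-seo/"))       # allowed
```

Note that the standard-library parser handles plain path prefixes; wildcard patterns like `Disallow: /*?sort=` are a search-engine extension it does not support.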

How to Audit Without a Tool – 17 Questions to Ask Yourself

  1. Do you have any URLs that contain no Google Analytics code?
  2. Do you have any URLs that contain uppercase characters?
  3. Do any URLs contain whitespace?
  4. Do any query strings contain pagination parameters?
  5. Are any URLs missing a Google Tag Manager code?
  6. Do any query strings contain more than three parameters?
  7. Do you have any broken internal URLs?
  8. Do any query strings contain repetitive parameters?
  9. Do any query strings contain search or filter parameters?
  10. Do any query strings themselves contain a question mark (a second “?” in the URL)?
  11. Do any query strings contain sort parameters?
  12. Do any URLs contain a double slash?
  13. Do any URLs contain non-ASCII characters?
  14. Do any URLs contain repetitive elements?
  15. Do any URLs resolve under both HTTP and HTTPS?
  16. Do any URLs contain more than one Google Analytics code?
  17. Do any URLs contain more than one Google Tag Manager code?
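Several of the URL checks above can be automated with a short script. The sketch below covers a handful of them (uppercase characters, whitespace, double slashes, too many or repeated query parameters); the function name and example URL are invented for illustration.

```python
# Sketch: automate a few of the URL hygiene checks from the list above.
from urllib.parse import urlsplit, parse_qsl

def audit_url(url):
    issues = []
    parts = urlsplit(url)
    if any(c.isupper() for c in parts.path):
        issues.append("uppercase characters in path")
    if " " in url or "%20" in url:
        issues.append("whitespace in URL")
    if "//" in parts.path:
        issues.append("double slash in path")
    params = parse_qsl(parts.query)
    if len(params) > 3:
        issues.append("more than three query parameters")
    keys = [k for k, _ in params]
    if len(keys) != len(set(keys)):
        issues.append("repeated query parameters")
    return issues

print(audit_url("https://example.com/Blog/My%20Post//?a=1&a=2&b=3&c=4"))
```

Run it over every URL from a crawl export and any non-empty result is a candidate for cleanup.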

2. Are Your Internal & External Links Clean?

I know “clean” is pretty vague, but it was the best way I knew to describe this next section. Having clean internal and external URLs can mean a lot of things, and it ties back to the section above. You want to make sure that every page you’re linking to, whether internal or external, is easy to get to, looks clean, has optimized anchor text, and more.

Why you should be thinking about this.

Having clean internal and external links adds another layer of clarity for your users and crawlers. For example, really long URLs or URLs with whitespace can appear odd to your users and create confusion for crawlers.

Clarity and connectivity are key. Creating a descriptive path to your optimized internal linking structure will create less friction for anyone trying to understand your content.

Examples of what can go wrong.

Let’s say you were adding links to one of your blog posts and accidentally included a space in the middle of a URL, which will either break the URL or create whitespace. Whitespace is just a fancy word for a space where one isn’t needed, and it’s usually encoded as “%20” in a URL. So your internal link went from looking like this, “/example/internal- link”, to this, “/example/internal-%20link”.

Another possibility is that you wanted to link to some references at the bottom of a case study you’ve pulled together, but instead of linking with anchor text, you just placed the full URLs at the bottom of the post. Using anchor text adds that layer of clarity; it’s easier for users and crawlers to understand an article title than a full URL.

How to Audit Without a Tool – 17 Questions to Ask Yourself

  1. Do you have any links with whitespace in href attribute?
  2. Are any URLs orphaned?
  3. Do any URLs only have one followed internal link?
  4. Do you have any outgoing links with malformed href data?
  5. Do you have any internal links with no anchor text?
  6. Do you have any paginated URLs with no incoming internal links?
  7. Any pages with one or more outgoing followed links with non-descriptive anchor text?
  8. Any pages with incoming followed links that do not use descriptive anchor text?
  9. Any pages that only receive nofollow links or links from canonicalized URLs?
  10. Do URLs receive both follow & nofollow internal links?
  11. Do you have any links with a URL referencing LocalHost or 127.0.0.1?
  12. Do you have links with a URL referencing a local or UNC file path?
  13. Do you have any links with a URL in onclick attribute?
  14. Do you have any links with an empty href attribute?
  15. Do you have any anchored images with missing alt text?
  16. Any pages with no outgoing links?
  17. Any pages with a link to a non-HTTP protocol?
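A few of the link checks above can be scripted against a page’s HTML. This sketch uses Python’s standard-library HTML parser to flag empty hrefs, whitespace in hrefs, and links with no anchor text; the class name and HTML snippet are invented examples.

```python
# Sketch: scan a page's HTML for some of the link problems listed above.
from html.parser import HTMLParser

class LinkAuditor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.issues = []
        self._href = None   # href of the <a> tag currently open, if any
        self._text = ""     # anchor text collected inside it

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href", "")
            self._text = ""

    def handle_data(self, data):
        if self._href is not None:
            self._text += data

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            if not self._href:
                self.issues.append("empty href")
            elif " " in self._href or "%20" in self._href:
                self.issues.append(f"whitespace in href: {self._href}")
            if not self._text.strip():
                self.issues.append("link with no anchor text")
            self._href = None

auditor = LinkAuditor()
auditor.feed('<a href="/example/internal-%20link">Guide</a><a href=""></a>')
print(auditor.issues)
```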

3. Are all sections of your website indexable?

Just because your website is live doesn’t mean that all sections are indexable. You may have put rules in place to keep sections of your website hidden during development, or a robots.txt rule may be in place that is actually hurting your presence in search results. Sometimes it takes a while for crawlers to find new sections of your website, but there are things you can do to help the process.

Why you should be thinking about this.

If your goal is ultimately to grow your traffic, indexability is going to play a big factor. Let’s say you want to start blogging to drive additional traffic to your website. That’s great! But if your site has been around for a while, you have trained Googlebot and other crawlers that the content currently on your site is the extent of what you offer. Now you have to train them to find this new content and to keep coming back to see what’s been updated.

Examples of what can go wrong.

One of the issues we see most often is the improper use of canonical tags. A canonical tag tells search engines that a page should pass all of its rank to another. So if you canonicalize a blog post to your service page, all ranking and authority would go to your service page. This can be troublesome for a number of reasons: when you canonicalize a URL, you’re taking away its ability to rank and compete with other content in search results.

Proper use of canonical tags is crucial when trying to pass authority in the right direction. They are best used when you have variations of one thing and you want the focus and rank to go to the main version.

How to Audit Without a Tool – 41 Questions To Ask Yourself

  1. Do you have URLs that contain a form with a GET method?
  2. Do you have canonicals that point to a different internal URL?
  3. Do you have canonical URLs with no incoming internal links?
  4. Do you have canonicals that point to another canonicalized URL?
  5. Any canonicalized URLs that are noindex, nofollow?
  6. Does the <head> contain invalid HTML elements?
  7. Any pages with noindex and nofollow directives?
  8. Any pages with a canonical that points to a noindex URL?
  9. Any pages with a canonical that is a relative URL?
  10. Any canonicals that are malformed or empty?
  11. Any canonical loops?
  12. Any pages that contain canonicals outside of the <head>?
  13. Any canonicals that point to a disallowed URL?
  14. Do you have any canonicals pointing to a redirecting URL?
  15. Do you have any canonicals pointing to a URL that is Error (5XX)?
  16. Do you have any canonicals pointing to a URL that is Not Found (404)?
  17. Do you have any canonicals pointing to an external URL?
  18. Do you have any canonicals pointing to HTTP version?
  19. Do you have any canonicals pointing to HTTPS version?
  20. Do you have a canonical tag in HTML and HTTP header?
  21. Do you have any mismatched canonical tags in HTML or HTTP header?
  22. Do you have any mismatched nofollow directives in HTML or header?
  23. Do you have any mismatched noindex directives in HTML or header?
  24. Do you have any pages with multiple canonical tags?
  25. Do you have any pages with multiple nofollow directives?
  26. Do you have any pages with multiple noindex directives?
  27. Do you have any pages with multiple, mismatched canonical tags?
  28. Do you have a Nofollow in HTML or HTTP header?
  29. Do you have a Noindex in HTML or HTTP header?
  30. Do you have any disallowed images?
  31. Do you have any disallowed JavaScript files?
  32. Do you have any disallowed Style Sheets?
  33. Do you have any Internal Disallowed URLs?
  34. Do you have any URLs with only nofollow incoming internal links?
  35. Do you have any pages with meta robots found outside of the <head>?
  36. Do you have canonicals only found in rendered DOM?
  37. Do you have any base URLs that are malformed or empty?
  38. Do you have any pages with multiple base URLs?
  39. Do you have any page with multiple, mismatched base URLs?
  40. Do any pages contain a “noscript” tag which includes an image?
  41. Do any pages have a <head> that contains a noscript tag?
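Some of the canonical checks above can be scripted. The sketch below pulls canonical tags out of a page’s HTML with a simple regular expression (a real audit should use a proper HTML parser) and flags multiple canonical tags and relative canonical URLs; the function name and HTML are made-up examples.

```python
# Sketch: flag two canonical problems from the list above — multiple
# canonical tags, and canonicals that are relative or empty URLs.
import re

def audit_canonicals(html):
    # Rough extraction; assumes rel="canonical" appears before href.
    hrefs = re.findall(
        r'<link[^>]*rel=["\']canonical["\'][^>]*href=["\']([^"\']*)["\']',
        html, flags=re.I)
    issues = []
    if len(hrefs) > 1:
        issues.append("multiple canonical tags")
    for href in hrefs:
        if not href:
            issues.append("empty canonical")
        elif not href.startswith(("http://", "https://")):
            issues.append(f"relative canonical: {href}")
    return issues

page = ('<link rel="canonical" href="/services/">'
        '<link rel="canonical" href="https://example.com/services/">')
print(audit_canonicals(page))
```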

4. Do you have redirects throughout the site?

Redirects can really slow down your site and create a frustrating experience for your users. Redirects can create endless loops of loading and rendering of pages that lead your website visitors to a blank page or a render time-out page.

Why you should be thinking about this.

It’s easy to create redirects without even knowing it. Your CMS might automatically create a redirect when you change the slug of a URL. If that’s true for you, great! But the work doesn’t stop there. Now that a redirect is set up, every place the original URL appears on your site goes through that redirect, and you should go through and update all of those links.

Examples of what can go wrong.

Let’s say you want to change a URL in your main navigation because you feel it doesn’t directly speak to what that service offers. If you update that URL and have a CMS that automatically creates a redirect for you, you should be in good shape. But now you have a redirect located in the main navigation. If you don’t update the link, you’ll have a redirect on every single page where the main navigation appears.

Another scenario that can cause issues is linking to an external source. Linking to external sources is good practice, but it requires more upkeep. You don’t have direct control over what they choose to do with their URLs, so when they change a slug or move a page completely, you could be leading your users to a broken URL.

How to Audit Without a Tool – 10 Questions to Ask

  1. Do you have any internal redirected URLs?
  2. Do you have any external redirected URLs?
  3. Are any of your external URL redirects broken (4XX or 5XX)?
  4. Do you have any internal URL redirects that are broken (4XX or 5XX)?
  5. Do you have any internal URL redirects that redirect back to themselves?
  6. Do you have any redirects using a Meta refresh?
  7. Are any of your resource URL redirects broken (4XX or 5XX)?
  8. Are any of your page resource URLs redirecting back to themselves?
  9. Do you have any redirected page resource URLs?
  10. Are any of your internal URLs part of a chained redirect loop?
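Chains and loops (questions 5 and 10 above) are easy to detect once you have a map of which URL redirects where. This sketch works over a plain dictionary; in practice you would build that map from your server config or a crawl, and the URLs here are invented.

```python
# Sketch: given a map of "URL -> redirect target", flag chained redirects
# (two or more hops) and redirect loops.
def audit_redirects(redirects):
    issues = []
    for start in redirects:
        seen, url = [start], redirects[start]
        while url in redirects:
            if url in seen:
                issues.append(f"redirect loop starting at {start}")
                break
            seen.append(url)
            url = redirects[url]
        else:
            if len(seen) > 1:
                issues.append(f"chained redirect: {' -> '.join(seen + [url])}")
    return issues

redirects = {
    "/old-service": "/services",   # fine: a single hop
    "/a": "/b", "/b": "/c",        # chain: /a -> /b -> /c
    "/x": "/y", "/y": "/x",        # loop
}
print(audit_redirects(redirects))
```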

5. Are your on-page elements well optimized?

On-page elements make up a large portion of ranking factors. Most people are concerned with their title tags and meta descriptions but there are many other elements you should pay attention to.

Why you should be thinking about this.

Your title tags and meta descriptions are your very first interaction with your audience and you’re competing for their attention with tons of other listings. You want to make sure your information is relevant enough for them to click through to your website.

Beyond title tags and meta descriptions, you want to think about the structure of your page. Is it easy for Google and users to identify the most important elements of the page? Are you using different header sizes to break up the content? If you are, are these headers well optimized? All of these things are ranking factors.

Examples of what can go wrong.

We take a holistic look at on-page elements when considering optimization, but we always start with title tags and meta descriptions. We have seen people with lengthy title tags that try to communicate too much value rather than the information their audience wants most. A lengthy title tag or meta description can cause your listing to be truncated in search results.

You might even be missing an H1 tag from your page completely. This can make it difficult for search engines to understand the main idea of your content which can further hurt your rankings.

For example, we had a client missing an H1 tag from all of their blog content. After adding this element to the blog page templates, we saw their rankings improve because search engines had more context about the content.

If you start with your title tags and meta descriptions and work your way through the page all the way to the images, you should have a fully optimized piece of content.

How to Audit Without a Tool – 17 Questions to Ask

  1. Do you have any images with missing alt text?
  2. Are you missing <h1> tags?
  3. Are your <h1> tags too short?
  4. Do any pages have multiple <h1> tags?
  5. Are your title tags too long?
  6. Are any pages missing a meta description?
  7. Are your title tags too short?
  8. Are any of your meta descriptions too long?
  9. Are any of your meta descriptions too short?
  10. Are any of your pages missing a title tag?
  11. Any HTML elements missing or empty?
  12. Any pages with an empty <h1> tag?
  13. Any pages with an empty meta description?
  14. Any pages with multiple meta descriptions?
  15. Any pages with multiple title tags?
  16. Any pages with a lengthy title tag?
  17. Any pages with an empty title tag?
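The length checks above are simple to script. The character limits below are common rules of thumb, not official numbers (Google actually truncates by pixel width, which varies); the function name is invented.

```python
# Sketch: rule-of-thumb length checks for a title tag and meta description.
def audit_snippet(title, meta_description):
    issues = []
    if not title:
        issues.append("missing or empty title tag")
    elif len(title) < 30:
        issues.append("title tag may be too short")
    elif len(title) > 60:
        issues.append("title tag may be truncated in search results")
    if not meta_description:
        issues.append("missing or empty meta description")
    elif len(meta_description) > 160:
        issues.append("meta description may be truncated")
    return issues

print(audit_snippet("Technical SEO Audit: 10 Questions to Ask Yourself", ""))
```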

6. Do you have any duplicate content?

Duplicate content is mostly a search-engine-related issue. Search engines are the best at spotting this kind of content; some users will spot it too, but that’s less likely unless it’s blatant. Duplicate content can trigger Google to select its own canonical, which means it thinks two or more pages are too similar to live separately, so it treats them as canonicalized content. When Google selects its own canonical, it might not always point to the page you want.

Why you should be thinking about this

Duplication on your site can be perceived as fluff. When communicating with your target audience, you want to eliminate all of the fluff you can and get straight to the point. And duplicate page content shouldn’t be your only concern: all of your on-page elements can be perceived as duplicate content too.

Examples of what can go wrong.

We once saw an issue where a client of ours offered a variety of programs across many states and cities. Google would crawl this content and see that only a few elements differed between a large number of pages. It selected its own canonicals and treated this large pool of program pages as only a few worth crawling.

When faced with a problem like this, you really have two choices. Option one is to make enough changes that these pages are worthy of indexing as standalone content. Option two is to untangle the canonicals and simplify your pages with your own set of canonical tags that group them in an effective way.

How to Audit Without a Tool – 7 Questions to Ask

  1. Do you have any URLs with duplicate h1s?
  2. Do you have any URLs with duplicate meta descriptions?
  3. Do you have any URLs with duplicate page titles?
  4. Do you have any URLs with duplicate title and meta descriptions?
  5. Do you have any Duplicate URLs (technical duplicates)?
  6. Do you have any URLs with similar content?
  7. Do you have any URLs with duplicate content?
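Duplicate titles and meta descriptions (questions 1–4 above) are straightforward to surface once you have crawl data. This sketch groups URLs by title; the URL/title pairs are invented, and in practice they would come from a crawl export.

```python
# Sketch: group crawled URLs by title to surface duplicates.
from collections import defaultdict

def find_duplicate_titles(pages):
    by_title = defaultdict(list)
    for url, title in pages:
        by_title[title].append(url)
    # Keep only titles shared by more than one URL.
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

pages = [
    ("/programs/ohio/columbus", "Our Programs"),
    ("/programs/ohio/cleveland", "Our Programs"),
    ("/about", "About Us"),
]
print(find_duplicate_titles(pages))
```

The same grouping works for meta descriptions or H1s by swapping in the relevant field.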

7. Have you updated and cleaned your XML Sitemaps?

An XML sitemap is an additional layer of clarity you can give to search engines. Your sitemap is a collection of your website’s URLs. If your sitemap is accessible to crawlers, they are more likely to discover all of your content efficiently.

Why you should be thinking about this.

Clarity and efficiency are the name of the game. If crawlers have a hard time finding your content, won’t searchers have the same problem? If it’s hard to find your content, why should you outrank someone who has easy-to-find content?

Having a clean XML sitemap opens the door for search engines to find isolated content they might not discover on their own through regular crawling.

Examples of what can go wrong.

If you have any redirecting, canonicalized, noindexed, or forbidden URLs in your sitemap, you can confuse search engines, and they may ditch your sitemap as a crawling resource. Crawlers can fall back on traditional crawling if they think your sitemap file is misleading them.

How to Audit Without a Tool – 9 Questions to Ask

  1. Do you have any Redirect (3XX) URLs in your XML Sitemap?
  2. Do you have any Canonicalized URLs in your XML Sitemap?
  3. Do you have any Disallowed URLs in your XML Sitemaps?
  4. Do you have any Error (5XX) URLs in your XML Sitemaps?
  5. Do you have any Forbidden (403) URLs in your XML Sitemap?
  6. Do you have any Noindex URLs in your XML Sitemaps?
  7. Do you have any Not Found (4XX) URLs in your XML Sitemaps?
  8. Do you have any Timed out URLs in your XML Sitemaps?
  9. Do you have any URLs in multiple XML Sitemaps?
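To run the checks above, you first need the list of URLs your sitemap declares. This sketch parses a sitemap with the standard library and flags duplicate entries (question 9); the sitemap XML is a minimal made-up example, and the other checks would cross-reference these URLs against your crawl data.

```python
# Sketch: parse a sitemap, list its URLs, and flag duplicates.
import xml.etree.ElementTree as ET

SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/blog/</loc></url>
  <url><loc>https://example.com/blog/</loc></url>
</urlset>"""

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(SITEMAP)
locs = [loc.text for loc in root.findall(".//sm:loc", ns)]
duplicates = {u for u in locs if locs.count(u) > 1}
print(len(locs), duplicates)
```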

8. Do you have any website security risks?

Given that Google uses HTTPS as a ranking factor, security is another important consideration. Security has become a major topic in the last few years, and not only for marketers. With massive security breaches and leaked information in the news, security is more important than ever.

Why you should be thinking about this.

Google has started to roll out some pretty significant warnings for website visitors when things aren’t as secure as they should be. For example, if your site isn’t served over HTTPS with an SSL certificate, Google will display a warning to visitors that your site isn’t secure.

If you have an image on your webpage that is being hosted from an HTTP source, Google will display a mixed content warning within the browser.

Examples of what can go wrong.

We had a client that was missing a rule in their .htaccess file that would have forced all URLs to use HTTPS. Without that rule, users may be able to access your webpages via both HTTP and HTTPS, which creates duplicate URLs. It also poses a security risk, since traffic over plain HTTP is unencrypted.

How to Audit Without a Tool – 16 Questions to Ask

  1. Are any pages missing Strict-Transport-Security HTTP (HSTS) in the header?
  2. Are any pages missing the Referrer-Policy HTTP in the header?
  3. Are you missing an X-Content-Type-Options HTTP header?
  4. Do you have an invalid or missing Content-Security-Policy HTTP header?
  5. Do you have an invalid or missing X-Frame-Options HTTP header?
  6. Do you have an invalid or missing X-XSS-Protection HTTP header?
  7. Do pages leak server information useful for compromising servers?
  8. Do you have external opener links vulnerable to tabnabbing?
  9. Do you have style sheets served via a CDN without subresource integrity?
  10. Do you have JavaScript served via a CDN without subresource integrity?
  11. Do you have any pages that load page resources using protocol-relative URLs?
  12. Do you have any instances of mixed content (loads HTTP resources on HTTPS URL)?
  13. Any instances of HTTP URLs?
  14. Do you have any HTTPS URLs linking to an HTTP URL?
  15. Do you have any HTTP URLs containing a password input field?
  16. Do you have any HTTPS URLs containing a form posting to HTTP?
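Checking for missing security headers (questions 1–5 above) is easy to script against a response’s headers. The headers dict below is an invented example; in practice it would come from an HTTP response, and header-name matching should be case-insensitive, as sketched here.

```python
# Sketch: report which of the security headers named above are missing
# from a response's headers.
EXPECTED = [
    "Strict-Transport-Security",
    "Referrer-Policy",
    "X-Content-Type-Options",
    "Content-Security-Policy",
    "X-Frame-Options",
]

def missing_security_headers(headers):
    present = {k.lower() for k in headers}  # HTTP header names are case-insensitive
    return [h for h in EXPECTED if h.lower() not in present]

response_headers = {
    "Content-Type": "text/html",
    "Strict-Transport-Security": "max-age=31536000",
}
print(missing_security_headers(response_headers))
```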

9. Do you have slow page speeds?

Page speed reporting was another consideration Google brought to the forefront. The addition of speed reports in Google Search Console has webmasters thinking of ways to speed up their sites. Page speed hasn’t always been a ranking factor, per se, but Google has stated that it will become one in 2021.

Why you should be thinking about this.

Giving your website visitors a good experience should come above all else. Even though speed will soon be a ranking factor, you want to make sure your audience can explore everything on your website with ease.

Tackling speed-related issues is a delicate process and should be done in stages. One thing we turn to a lot for speed analysis is Google’s Lighthouse report. It can give you a good indication of what’s standing in the way of a faster website.

Examples of what can go wrong.

You may see an issue bubble up that your CSS is oversized and you should inline your critical CSS in the <head> and defer the rest. Analyzing and finding your critical CSS can be difficult. If you don’t get all of the critical elements into the head, your page might not load correctly, and then you’ve inadvertently created a worse user experience for your audience.

Maybe you added some images within a collapsible navigation. Now those images are being loaded on every page and taking more time to fully render. Should you compress them? Lazy-load them? Get rid of them altogether? The tactics change depending on the specific challenge, which is why speed issues are so delicate.

How to Audit Without a Tool – 38 Questions to Ask

  1. Do you have a total combined CSS content size that is too big (over 500KB)?
  2. Are pages missing Critical (Above-the-fold) CSS?
  3. Do any of your pages contain one or more single point of failure?
  4. Do any of your pages load oversized images which are scaled in the browser?
  5. Do any of your pages have a total combined image content size that is too big (over 1MB)?
  6. Do you have any resources that do not specify long cache headers?
  7. Do you have uncompressed text content resources?
  8. Do any pages have a total combined JavaScript content size that is too big (over 500KB)?
  9. Do you have any resources that do not specify cache headers?
  10. Do you have any static resources with a private cache-control header?
  11. Does the DOM width exceed the recommended 60 nodes wide?
  12. Do you have an excessive number of DOM nodes (greater than 1500)?
  13. Is your JavaScript content minified?
  14. Does the DOM depth exceed the recommended 32 nodes deep?
  15. Is your CSS content minified?
  16. Is your HTML content minified?
  17. Is your character set specified in the <head> or HTTP headers?
  18. Do you load too many webfonts?
  19. Do you load offscreen images that appear below the fold?
  20. Is your server response too slow with a Time-to-First-Byte greater than 600ms?
  21. Do any pages have a transferred image size over 100KB?
  22. Do any pages load hidden images?
  23. Any pages with too many synchronous JavaScript requests?
  24. Are any pages with a character set not UTF-8?
  25. Any pages where the meta charset is not the first element in the <head>?
  26. Any pages where the character set is missing from the HTTP headers?
  27. Do any URLs load one or more identical CSS resources?
  28. Any instances where style sheets are loaded after JavaScript resources in the <head>?
  29. Any instances where critical (above-the-fold) CSS was found in the <head> but not loaded early enough?
  30. Are there any instances of a page resource URL redirect?
  31. Are there any instances where a URL loads in one or more identical JavaScript resources?
  32. Are there any pages where the downloaded HTML is greater than 500KB?
  33. Are there any pages where the stylesheet is loaded in with media=’print’?
  34. Are there any pages where unoptimized JPEG images could be compressed further?
  35. Are there any pages where the total combined size of the webfonts used is too big (over 200KB)?
  36. Are there any pages that load overweight webfonts?
  37. Are there any pages that load duplicate Javascript files?
  38. Are there any pages that load duplicate style sheets?
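The size budgets named in the list above (CSS and JavaScript over 500KB, images over 1MB, webfonts over 200KB) can be checked mechanically once you know each page’s per-type resource weight. The sizes and function name below are invented for illustration.

```python
# Sketch: compare per-type resource weights against the budgets named in
# the list above. Sizes are in kilobytes.
BUDGETS_KB = {"css": 500, "js": 500, "images": 1024, "fonts": 200}

def over_budget(sizes_kb):
    # Return only the resource types whose weight exceeds its budget.
    return {kind: size for kind, size in sizes_kb.items()
            if size > BUDGETS_KB.get(kind, float("inf"))}

page_weights = {"css": 120, "js": 780, "images": 2048, "fonts": 150}
print(over_budget(page_weights))
```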

10. Is your site mobile friendly?

Mobile-first indexing has been around for over a year now, and many sites have been hit with penalties and drops in rankings. More and more people are using their phones for web searches, so it makes sense that there is such a big emphasis on mobile-friendly sites.

Why you should be thinking about this.

If your site isn’t well optimized for mobile searches, there’s a good chance your audience may never make it to your site at all. Having a mobile-friendly site means that it’s easy to understand, elements on the page aren’t too small or close together, it’s fast, and much more.

Most people forget to check newly launched pages or interactive elements of their site on a mobile device. Many of these issues can be overlooked in the design or building process. Having gone through a recent redesign, we chose to start with our mobile design first, then built out our desktop version.

Examples of what can go wrong.

If you have a slow site to begin with, it’s possible that people using 3G connections won’t be able to load any of your content if your page size is too large. This can create frustration among your users and cause them to leave angry or disappointed. This might mean they chose a competitor of yours that has invested in optimizing their site for mobile searches.

Another big issue is when tap targets are too close. This means that you have clickable elements that are too close together. People have a hard time clicking on these elements on mobile devices because their fingers are just too large. This again creates frustration and a potential loss in customers.

How to Audit Without a Tool – 15 Questions to Ask

  1. Do you have any pages with a total page size too big for 3G connections?
  2. Any pages where the content does not size correctly to viewport?
  3. Do you have any font sizes too small for mobile devices?
  4. Do any pages have multiple viewport meta tags in the <head>?
  5. Are any pages missing a viewport meta tag in the <head>?
  6. Any pages where the viewport meta tag does not have a width set?
  7. Any pages where the viewport meta tag has a specific width set?
  8. Any pages where the viewport meta tag is missing an initial-scale?
  9. Any pages where the viewport meta tag initial-scale is incorrect?
  10. Any pages where the viewport meta tag prevents the user from scaling?
  11. Any pages where the viewport meta tag has a maximum-scale set?
  12. Any pages where the viewport meta tag has a minimum-scale set?
  13. Any pages where the tap targets are too small or too close together?
  14. Any pages with one or more image-map tags?
  15. Any pages with unsupported browser plugins?
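The viewport checks above (questions 5–12) boil down to inspecting the viewport meta tag’s content string. This sketch covers a few of them; the function name and content values are invented examples.

```python
# Sketch: flag some of the viewport problems above from a page's
# viewport meta "content" string (None means the tag is missing).
def audit_viewport(content):
    if content is None:
        return ["missing viewport meta tag"]
    parts = dict(p.strip().split("=", 1) for p in content.split(",") if "=" in p)
    issues = []
    if parts.get("width") != "device-width":
        issues.append("viewport width is not device-width")
    if "initial-scale" not in parts:
        issues.append("missing initial-scale")
    if "maximum-scale" in parts or parts.get("user-scalable") == "no":
        issues.append("viewport prevents the user from scaling")
    return issues

print(audit_viewport("width=device-width, initial-scale=1"))
print(audit_viewport("width=600, maximum-scale=1"))
```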

Know where you stand: get a full website audit.

Our team of analysts can give you a full, holistic view of what is holding your website back from ranking higher and getting more traffic. We can give you a full plan of what you should be tackling first in order to get quick traffic and SEO wins.

We blend the industry’s top tools with our consultants’ trained eyes to uncover a comprehensive list of your website’s issues, both large and small. We then prepare a detailed document that outlines each issue, why it matters, and how to fix it.
