Mobile-friendly; what next? It’s time for a website audit

If you’ve already made your website mobile-friendly in preparation for “Mobilegeddon” (as some are calling it), there’s still more to do. It’s a bold statement to say that auditing your website regularly is a business-critical activity, but it’s one I’m prepared to stand by. Why do I believe this is true? Because an in-depth audit is a bit like having a thorough health check. We notice the little things that aren’t quite right, but a detailed review can uncover the root causes. I actually find website auditing for clients very rewarding because we can find the problems and then, once they’re fixed, we can really see the benefits to their businesses.

What are the SEO killers I’ve uncovered during website audits?

  1. You think your site is mobile-friendly but it isn’t really. This one is horrible – you pass the Google Mobile-Friendly Test but your site doesn’t actually meet all the criteria. The problem is that unless you have a Google Webmaster Tools account (which we recommend for all our clients), you might not realise.
    [Image: mobile-friendly tag on a Homebuilder directory site]
  2. Seeing double. No, tempting as it is, I haven’t been on the gin! The duplicate content catastrophe relates to the different ways your site can be loaded and how many versions of identical text and images exist. This might happen if you’ve tried to take advantage of keyword-rich domain names in the past and copied some or all of your main website onto a second domain. There are other ways that duplicate sites can appear, too. It’s particularly common if sites load both with and without www., or with both http and https. I’ve also seen it happen when a web developer has inadvertently allowed search engines to spider development versions of the website and then left the development site online. Why does it matter? It matters because Google doesn’t know that all these versions of the website belong to you. It can’t tell whether you are deliberately trying to exploit the algorithm, have been badly advised or have made an innocent mistake. Whatever the reasons behind the duplicate content, it can trigger a Google Panda penalty, and that will cause you a big problem. Finding and fixing the duplicates before Google discovers them is by far the best strategy.
  3. Robots.txt files can be a very useful part of a website. However, if they are set up incorrectly, you can find Google is unable to “see” your site properly, and this will affect your rankings. Google say they want to crawl everything on your site.
    [Image: “Description not available because of robots.txt file” message in the search results]
  4. Broken links are the bane of every site owner and, try as we might, it’s almost impossible to avoid them. Unfortunately, visitors and search engines hate them, so a thorough “spring clean” is a great way to get them tidied up and to make sure that any pages you are redirecting use a 301 (permanent) redirect rather than a 302 (temporary), so that the old page’s authority is passed on to the new one.
  5. No one likes slow pages. How fast your pages load will affect your mobile rankings and your bounce rate. Within an audit, we look at all the elements of your website design that affect your page speed. Pages which take more than 20 seconds to load will suffer a penalty, and we have seen “Slow” tags appearing in the search engine results. Although Google’s target of under 2 seconds is often unachievable, we should be doing everything possible to get pages to load in under 4 seconds.
    [Image: “Slow” tag in the search results]
  6. If your site is hacked, how will you know? When we audit sites, we look for clues that show if your site has a problem with hacking or malware.  No one wants to have their site filled with this sort of rubbish and it will send you plunging down the rankings at warp speed!
  7. Malicious SEO attacks do exist, and one strategy that people use is backlink dumping. This means your site suddenly gains a huge number of backlinks from toxic sites. Because Google has no way of knowing that these were created by someone else, it is your site that gets associated with this “bad link neighbourhood” and YOU get the penalty. We run backlink checks as part of our audit, and we monitor sudden increases in backlinks for our regular clients on a monthly basis.
  8. Search engines are looking for sites that are nurtured, and their ranking algorithms check for quality signals that show you care about your site. One of the elements the spiders check is duplication – too much content duplicated from one page to another, along with duplicated title tags and meta descriptions, can give search engines the wrong impression. This is a particularly big problem with sites built in Kentico, but it’s something we all need to keep an eye on.
  9. Are you making the most of the opportunities to improve your search engine rankings? Only around 0.35% of sites use structured data, but those sites make up about 30% of the first-page results on Google. Looking at this example, which uses ratings data to create a “rich snippet”, it is easy to understand why sites with structured data often enjoy better clickthrough rates. As clickthrough rates from the SERPs (search engine results pages) are another ranking factor, this is a valuable strategy. An audit will identify ways you can take full advantage of this legitimate technique to support your rankings.
    [Image: ratings rich snippet in the SERP]
  10. Search engine guidelines change all the time, so an audit is a great way to make sure your site still meets the standards required. For example, recent news from Google includes changes to landing pages (are yours set up correctly?), and Google is adding more and more tags to search engine results pages, including “mobile-friendly”, “slow”, “robots file problems” and “security certificate weak or nearly expired”.
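As an example of the robots.txt issue in point 3, a single misplaced rule can hide an entire site. The fragment below is illustrative, not a template: the first rule accidentally blocks every crawler from everything, while the second shows what was probably intended.

```
# Blocks ALL crawlers from the ENTIRE site -- a common accident:
User-agent: *
Disallow: /

# What was probably intended -- block only a private folder:
User-agent: *
Disallow: /private/
```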
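Points 2 and 4 are usually fixed at the server. On Apache, for instance, a rewrite rule like this (a sketch, assuming mod_rewrite is enabled and example.com stands in for your own domain) sends every http and non-www request to a single canonical version with a permanent 301 rather than a temporary 302:

```apache
RewriteEngine On
# Redirect any request that is not already https://www.example.com
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```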
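The rich snippet in point 9 is produced with structured data. A minimal JSON-LD sketch of aggregate-ratings markup (the product name and rating figures here are made up for illustration) looks like this:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "112"
  }
}
</script>
```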
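To illustrate the duplicate-content problem in point 2, here is a minimal sketch (in Python, using a hypothetical example.com domain) of how an audit tool might normalise URLs so that the www/non-www and http/https variants of a homepage are grouped together as one page:

```python
from urllib.parse import urlsplit

def canonical_key(url):
    """Collapse scheme, leading www. and trailing slash so that
    every variant of the same page maps to one key."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    path = parts.path.rstrip("/") or "/"
    return host + path

# Four ways the "same" homepage can load -- a search engine may
# treat each one as a separate, duplicated page.
variants = [
    "http://example.com",
    "http://www.example.com/",
    "https://example.com/",
    "https://www.example.com",
]
print({canonical_key(u) for u in variants})  # prints {'example.com/'}
```

In a real audit, the next step would be to fetch each variant and check that all but one of them 301-redirect to the canonical version.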

If Local Search is important to your business, the audit should be adjusted to look at factors that have an impact on your rankings in the Google Local Pack and on Yahoo and Apple too.  Performing well for Local requires you to meet all the standards set for SEO and then some!
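For Local Search, the same structured-data approach applies: marking up your name, address and phone number (NAP) consistently helps the engines match your site to your local listings. A sketch with placeholder business details:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Builders Ltd",
  "telephone": "+44 1234 567890",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 High Street",
    "addressLocality": "Exampletown",
    "postalCode": "EX1 2AB",
    "addressCountry": "GB"
  }
}
</script>
```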

It is important for every site to have regular audits. With our own website audit process, we’ve uncovered sites that have lost all their image-search rankings because of an error in their robots.txt file, and other sites with up to 6 versions of their homepage. Even brand new sites can have problems with page load speed and thin or duplicate content. With an audit we’ve been able to help clients find out what is holding their site back and address these problems. They then get more clickthroughs from the search engines, improved bounce rates, increased conversion rates and better rankings. The culmination of all this work has been a website that supports their business growth – instead of acting as a barrier. Isn’t that what you’d like for your website?
