
The most common on-site SEO fails and how to avoid them

So you’ve got a great website and it’s full of wonderful content. However, it is not ranking on the major search engines. Here is a selection of major SEO fails, with hints for checking whether your website falls foul of them and handy tips on how to fix them.

1) Your website is not responsive/mobile compliant

Burying your head in the sand won’t help. Over 75% of website traffic is on a hand-held device; for some websites it is over 90%. So if your website is not responsive (it works and adapts to the screen size of the device it is viewed on), prepare for a fail. The major search engines now use a website’s responsiveness as a ranking factor, and Google is even working towards splitting its index between “mobile” and “desktop”. So if your website is non-responsive, you are never going to achieve the rankings you want.

How do you know if your site is responsive? The simple answer is to pop along to https://search.google.com/test/mobile-friendly and enter your URL. It’s a simple yes/no test which will give you the answer you need. If your site is not responsive, you will need to address that with your website developers or in-house team.

[Screenshot: Google’s mobile-friendly test result for our website]
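If the test says no, the fix sits with whoever builds your site, but for the curious, the two core ingredients of a responsive build are a viewport meta tag and CSS media queries. Here is a minimal sketch only – the .sidebar and .content classes and the 768px breakpoint are purely illustrative:

    <!-- In the page <head>: tell browsers to scale to the device width -->
    <meta name="viewport" content="width=device-width, initial-scale=1">

    /* In the stylesheet: adapt the layout below a chosen breakpoint */
    @media (max-width: 768px) {
      .sidebar { display: none; }   /* hide the sidebar on small screens */
      .content { width: 100%; }     /* let the main content use the full width */
    }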

2) The site is slow to load

A major ranking factor is how quickly your site loads. The search engines want to list sites which give the best experience to their users, and fast-loading sites are more attractive, so they rank higher. Various factors affect site speed – hosting, the site build and image usage are common culprits.

Again, there are handy web tools which quickly give you the answer. Pop along to https://developers.google.com/speed/pagespeed/insights/ and enter your URL, and a quick report will be generated. It’s quite tech-heavy, but everyone understands a score out of 100.

[Screenshot: a PageSpeed Insights report showing a poor score]

If you get a report like this, what can you do? It depends entirely on how your website has been built and where it is hosted. Better hosting, the use of caching, and optimisation of images and scripts are a few suggestions. Implementing these is something you can discuss with your web team, or you can contact us for an informed review.
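Images are often one of the cheapest wins. As a small, hedged example, modern browsers support native lazy loading, and giving each image explicit dimensions avoids the page jumping about while it loads – the file name below is made up:

    <!-- Serve an appropriately sized image, state its dimensions and defer it until it is needed -->
    <img src="hero-1200.jpg" alt="Our shop front"
         width="1200" height="600" loading="lazy">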

3) Robots are not allowed to index the website

This one kills a site dead. I’ve nothing against robots, but what are they? Robots are the automated scripts which the search engines use to index (or spider) websites. They periodically visit your website, take copies of its pages and update their index of it. They are essential to getting your site listed on search engines – so how can you check they aren’t blocked?

Visit your main website URL and add “/robots.txt” to the end of the address. So for our website that would be https://www.itcdigitalservice.co.uk/robots.txt

You should see one of three things:

  1. Nothing (which is a good thing)
  2. A bunch of text with no “Disallow” in it (which is a good thing)
  3. A bunch of text with “Disallow” in it (which requires further investigation)

So I got some Disallow text, now what?

It’s not always bad news. Disallow tells robots not to bother indexing certain pages, and usually that’s a good thing. In the example from our own website:

[Screenshot: the Disallow line in our own robots.txt file]

There is a disallow line.  However, it is telling the search engines not to index the administration pages of our CMS, which is very sensible.
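For illustration (not necessarily our exact file), a sensible robots.txt of that kind looks something like this – the path depends on your CMS, and /wp-admin/ is simply the WordPress convention:

    User-agent: *
    Disallow: /wp-admin/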

What you need to be on the lookout for are global rules such as the following:

    User-agent: *
    Disallow: /

This is bad.  This is telling the search engines not to index any pages.  You may as well not have any website at all.

Again, fixing this will depend on how your website is built, but the two main ways of fixing the issue are simple. 1) You can edit the robots.txt file and remove the offending lines. 2) If the robots.txt file does not physically exist (and is generated by your CMS, such as WordPress), change the relevant settings.

NB: sites can also block pages using meta tags. There is more reading here if you think you may have fallen foul of this.
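For reference, the meta tag version of a block looks like the line below; if you find it in the <head> of pages you actually want indexed, it needs removing:

    <meta name="robots" content="noindex, nofollow">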

4) Your site does not use SEO-friendly URLs

Search engines love knowing what your site is about. The URLs a site uses tell the engines a lot about the structure of the site – how it is built and which pages sit within other pages. URLs should be clean and make sense when read. For example, our site uses URLs such as:

  • https://www.itcdigitalservice.co.uk/services/
  • https://www.itcdigitalservice.co.uk/services/digital-marketing-seo/
  • https://www.itcdigitalservice.co.uk/contact-us/
  • https://www.itcdigitalservice.co.uk/careers/php-developer/

So even without visiting the pages, both users and search engines have an idea of the page’s nature and place within the site.

What constitutes an unfriendly URL? Those website addresses with lots of “?”, “&” and numbers are considered unfriendly. Whilst these websites will still be indexed, Google does not rank them as well, as it is harder for it to work out how important (or not) each page is. Also, these symbols and numbers give the search engines no clue about the nature of the page. A URL such as:

  • http://www.mysite.com/products.php?id=200&code=14&cat_id=13

is not as nice as:

  • http://www.mysite.com/shop/t-shirts/polo-shirt-black

If your site uses unfriendly URLs, the solution again lies in how the site has been built. Some CMS engines can simply be switched to friendly URLs; other systems will need to be rebuilt. A word of caution, though, against making this change on a whim: when changing URLs on any website, ensure that users aren’t left with a bunch of 404 pages. If they come via a search engine from a now-defunct web link, make sure they are redirected to the correct page.
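How you set up those redirects depends on your server. As a sketch only – assuming an Apache server and using made-up paths – a permanent (301) redirect for a single retired URL can be added to the .htaccess file like this:

    # Send visitors (and search engines) from the old address to the new one
    Redirect 301 /old-shop/polo-shirt.html http://www.mysite.com/shop/t-shirts/polo-shirt-black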

5) Incoherent content

The search engines love rich, resourceful, coherent, well written, unique content.  And lots of it.  Their sophisticated algorithms spot patterns in content and use them to help rank sites.  Whilst this is a blog topic in its own right, here are a few examples of winning content:

  1. Rich content – content with plenty of words (ideally 400+), plus the use of images, video and links to internal and external pages
  2. Structured content – like a good book, content is structured by topic and subtopic and has a flow
  3. Consistency of subject matter – sites rank well if they have common themes and language. If your site is about pigeons and you suddenly start to talk about milkshakes, the search engines will just get confused
  4. Low-text pages – content is king. If you have a bunch of pages with little text content, try combining them into bigger, better pages
  5. Language and writing style – make sure you adopt the same style throughout the site. This covers everything from grammar, the person you write in and the formal or informal mode to the use of sentences and bullet points. It’s never a bad thing to use short sentences, which make content easier to read.
  6. Uniqueness – search engines are massive on penalising “copy and paste” culture. Your site’s content will rank better the more unique it is. So if you use content from other sources, i.e. other businesses in your group or product information from suppliers, try to change it to give it your own spin

6) Acting like it’s 1999

There is no technical area of digital marketing fuller of utter rubbish than SEO. The search engines state clearly what they are looking for. However, SEO has become mired in quirky and often “black hat” techniques, where the search engines are cheated and manipulated, against their terms of use, to inflate rankings.

Google, for example, is now a multi-billion dollar company whose success is based on the quality of its search engine results. It has a sophistication in ranking that we developers can only dream of. It cannot be cheated or conned as it could 20 years ago when it first launched. However, the hangover from that time is that some of the old “black hat” tactics are still being promoted by out-of-date practitioners. So make sure that your site does not indulge in crackers such as:

  1. Meta Tag Stuffing – the Keywords tag is no longer used, and the Description tag – whilst important – will be ignored if the search engines find it differs from the page’s content
  2. Link Building – a link to your site used to count as a vote of approval for it, so the more links, the better a site ranked. Not anymore! The search engines now assess the quality of each link into a site – too many low-quality links in from irrelevant websites will result in the site being penalised
  3. Keyword Stuffing – remember the joke “How many SEOs does it take to change a light bulb, light bulbs, energy saving light bulbs, low energy light bulbs, light bulbs uk, led light bulb, cheap light bulbs, led light bulbs…”? The search engines aren’t dumb machines anymore; they use artificial intelligence to spot trends in text and judge how “readable” it is. Stuffing content with keywords and phrases will result in negative rankings.

Conclusion

So if you’ve gone through the tips above, you are slightly wiser and more informed. Handy as they are, there are about a trillion other website issues which affect SEO. So if you’re still having trouble, why not speak to our friendly team, who can perform a full site audit and help pave the way to improved SEO success.