
Real Life E-Commerce SEO Case Study: Part 2 – Technical Improvements

The second part of our series following a real-life SEO project looks at technical improvements.

In Part 1 of our SEO case study, we performed a site audit on a real e-Commerce website. Now it's time to act on the audit's findings and improve the site's technical SEO performance.

Duplicate title tags

First on the list is the issue of duplicate title tags. This is common on CMS-based sites (this one runs on WordPress), as tags for products and posts can be overused. In this project, the same tags were being used for products, categories, and blog posts.

The solution is the Taxonomy settings in the popular Yoast SEO plugin, which let website managers set an SEO title template for each tag type. This helps us avoid duplicate titles, although the sheer number of tag pages still leaves the potential for duplicate content on the site.

As you can see, Yoast SEO was using the same title tag template for both categories and blog post tags, so the fix is to create a unique SEO title for each taxonomy type. We can also switch off some of the unneeded types, as e-commerce platforms like WooCommerce can generate far too many classifications, such as those for shipping classes and product variations.

Yoast SEO taxonomy settings: product tags
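To make that concrete, here's a quick sketch, in Python and purely for illustration (Yoast itself is configured through its settings screens, not code), of what "a unique title template per taxonomy" boils down to. The template strings use Yoast-style replacement variables such as %%term_title%%, %%sitename%% and %%sep%%; the wording of each template is just an example.

```python
# Illustrative only: Yoast SEO stores one SEO title template per taxonomy.
# The goal is simply that no two taxonomy types share the same template,
# so their pages can no longer produce identical title tags.
TITLE_TEMPLATES = {
    "product_cat": "%%term_title%% Products %%sep%% %%sitename%%",    # WooCommerce product categories
    "product_tag": "Shop by Tag: %%term_title%% %%sep%% %%sitename%%",
    "category":    "%%term_title%% Articles %%sep%% %%sitename%%",    # blog categories
    "post_tag":    "Posts tagged %%term_title%% %%sep%% %%sitename%%",
}

def render_title(taxonomy: str, term_title: str,
                 sitename: str = "Example Store", sep: str = "|") -> str:
    """Expand the Yoast-style variables for a given term (for illustration)."""
    template = TITLE_TEMPLATES[taxonomy]
    return (template.replace("%%term_title%%", term_title)
                    .replace("%%sitename%%", sitename)
                    .replace("%%sep%%", sep))

if __name__ == "__main__":
    # The same term name now yields a distinct title on each taxonomy type.
    print(render_title("product_tag", "Gifts"))
    print(render_title("post_tag", "Gifts"))
```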

Speeding up the site

A key finding in the first report was that the site was slow to load. This has two main impacts: visitors get a poor experience and may grow impatient and leave, and site speed is an SEO ranking factor.

So, to speed up the site, we used a variety of caching and optimisation tools:

  • Added caching – a static copy of each page is saved on the server and served to visitors, rather than being generated afresh on every request
  • Compressed and deferred scripts – the site now delivers a single compressed file each for CSS and JS, rather than many separate files
  • Long lifespans for static assets – images and script files are cached in the visitor's browser, so they aren't re-downloaded on every page view (the header behind this is sketched after the list)
  • Image lazy loading – images are only downloaded and displayed once the visitor scrolls to the part of the page where they appear
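On this project the long asset lifespans were set by the caching plugin and the server configuration, but the mechanism is simply a far-future Cache-Control header on static files. Here's a minimal sketch of that header in a hypothetical Python (Flask) app; the /static/ path and the one-year lifetime are just example choices:

```python
# Hypothetical sketch of long-lived caching for static assets.
# The real site uses a WordPress caching plugin and server config, not Flask;
# this only shows the Cache-Control header that does the work.
from flask import Flask, request

app = Flask(__name__)

ONE_YEAR = 60 * 60 * 24 * 365  # seconds

@app.after_request
def add_cache_headers(response):
    # Only static assets (images, CSS, JS) get the far-future lifetime;
    # HTML pages are left alone so content updates appear immediately.
    if request.path.startswith("/static/"):
        response.headers["Cache-Control"] = f"public, max-age={ONE_YEAR}, immutable"
    return response
```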

Once this is in place, we run a crawler tool that visits every page of the website in turn, to make sure a cached copy of each one has been generated and saved.
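Our crawler is an off-the-shelf tool, but a bare-bones version of the idea looks like the sketch below. It assumes the site publishes a standard XML sitemap (the usual WordPress/Yoast setup); the domain and sitemap path are placeholders.

```python
# Minimal cache-warming crawler: fetch every URL in the sitemap once so the
# caching plugin has a static copy of each page ready for real visitors.
# The domain and sitemap path below are placeholders.
import time
import xml.etree.ElementTree as ET
import requests

SITEMAP_URL = "https://www.example-store.co.uk/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(sitemap_url: str) -> list[str]:
    """Return every <loc> URL listed in the sitemap (or sitemap index)."""
    root = ET.fromstring(requests.get(sitemap_url, timeout=30).content)
    if root.tag.endswith("sitemapindex"):
        # Sitemap index: recurse into each child sitemap.
        urls = []
        for loc in root.findall("sm:sitemap/sm:loc", NS):
            urls.extend(sitemap_urls(loc.text.strip()))
        return urls
    return [loc.text.strip() for loc in root.findall("sm:url/sm:loc", NS)]

if __name__ == "__main__":
    for url in sitemap_urls(SITEMAP_URL):
        status = requests.get(url, timeout=30).status_code
        print(status, url)
        time.sleep(0.5)  # be polite to the server
```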

Image Optimisation

Images are one of the biggest factors in slowing a site down. In the past, the usual fix was to optimise them so they were compressed without losing any fine detail. However, modern browsers can handle the next generation of image formats: JPEGs, PNGs, and GIFs are giving way to WebP and HEIF.

Owners of the latest iPhones will notice that their photos aren't always saved as JPEGs but in one of these newer formats. They promise files around half the size of the older formats, so in effect images load roughly twice as quickly, which is good news for our site and for SEO.
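On this site the conversion is handled by an image-optimisation plugin rather than by hand, but the idea is simple enough to sketch. The example below uses the Pillow library to write a WebP copy alongside each existing JPEG and PNG; the uploads path and quality setting are assumptions, and Pillow needs to be installed with WebP support.

```python
# Illustrative bulk conversion of existing JPEG/PNG images to WebP copies.
# The original files are kept so older browsers can still be served them.
# Requires Pillow (pip install Pillow) built with WebP support.
from pathlib import Path
from PIL import Image

UPLOADS = Path("wp-content/uploads")  # typical WordPress media folder

for src in list(UPLOADS.rglob("*.jpg")) + list(UPLOADS.rglob("*.png")):
    dest = src.with_suffix(".webp")
    if dest.exists():
        continue  # already converted
    with Image.open(src) as im:
        im.save(dest, "WEBP", quality=80)  # lossy WebP; ~80 is a common trade-off
    saving = 1 - dest.stat().st_size / src.stat().st_size
    print(f"{src.name}: {saving:.0%} smaller as WebP")
```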

The website also has a fallback for users on older browsers, serving the JPEG or PNG equivalent in those cases.
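There are a couple of common ways to do this: a <picture> element in the HTML with the older format as the fallback, or a server-side check. As a purely illustrative sketch of the server-side approach (not the plugin this site actually uses), browsers that understand WebP say so in their Accept header, so the server can pick which file to send:

```python
# Illustrative content negotiation: serve the .webp copy only to browsers
# that advertise WebP support in their Accept header, otherwise fall back
# to the original JPEG/PNG. The real site does this via its image plugin.
from pathlib import Path

def choose_image(requested: str, accept_header: str) -> str:
    """Return the path to serve for a requested JPEG/PNG image."""
    webp_variant = Path(requested).with_suffix(".webp")
    if "image/webp" in accept_header and webp_variant.exists():
        return str(webp_variant)
    return requested  # older browser: serve the original format

# Example:
# choose_image("wp-content/uploads/hero.jpg", "image/avif,image/webp,*/*")
#   -> "wp-content/uploads/hero.webp" (if that file exists)
```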

Image compression comparison: PNG vs WebP

Broken Links and Dead Files

We found that a few pages were linking to pages that no longer existed. This was an easy fix: we edited the pages containing the bad links and pointed them at the correct URLs.
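Our audit software finds these for us, but the check itself is simple: crawl a page, collect its internal links, and flag any that return an error status. A bare-bones sketch (the domain is a placeholder):

```python
# Bare-bones broken-link check: fetch a page, collect its internal links,
# and report any that no longer resolve (4xx/5xx). Placeholder domain.
# Requires: pip install requests beautifulsoup4
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

PAGE = "https://www.example-store.co.uk/"

html = requests.get(PAGE, timeout=30).text
soup = BeautifulSoup(html, "html.parser")

for a in soup.find_all("a", href=True):
    link = urljoin(PAGE, a["href"])
    if urlparse(link).netloc != urlparse(PAGE).netloc:
        continue  # only check internal links
    status = requests.head(link, allow_redirects=True, timeout=30).status_code
    if status >= 400:
        print(f"Broken link on {PAGE}: {link} -> HTTP {status}")
```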

Other fixes and changes

We found one page, served from an https:// URL, that linked out to an http:// address; this was fixed, as it could trigger a mixed-content browser warning, which is bad for both user experience and SEO. We also removed some duplicated heading tags from a page and added meta tags to pages that lacked them. Finally, the site needed an automatic redirect from non-SSL (http) to SSL (https) addresses.
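Mixed-content links like that first one are also easy to catch programmatically: on a page served over https://, any href or src pointing at an http:// URL is a candidate warning. A rough sketch, again with a placeholder domain:

```python
# Quick mixed-content check: list http:// references found in the HTML of a
# page that is itself served over https://. Placeholder domain.
import re
import requests

PAGE = "https://www.example-store.co.uk/"
html = requests.get(PAGE, timeout=30).text

# Look for http:// URLs in link/script/image attributes.
for match in sorted(set(re.findall(r'(?:href|src)="(http://[^"]+)"', html))):
    print(f"Insecure reference on {PAGE}: {match}")
```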

The End Results

So, we re-ran the audit and the speed tests to see where the metrics now stood and what effect each improvement had made.

Site audit #2 results

As a start, this SEO project has been massively optimised: the major errors have been cleared, the site loads much faster, and the basics are now set up correctly. The one remaining error is a false positive from our audit software, so there is nothing there to fix.

However, it doesn't and shouldn't end here: there are many further improvements both we and the site owner can make. The site has a high number of pages (320), which dilutes its content, so as a minimum we would suggest removing some pages or blocking others, such as tag index pages, from being indexed. Adding more content will also be key, as many pages have very little text (304 of them, to be precise), although again much of this can be put down to the number of tag/index pages, which aren't all required.

Part 3 will focus on Keyword Research and Optimisation.
