
SEO for Website Redesign & Migration


In SEO, website migration is an important topic. Managed well, a migration can increase the organic traffic to your website and bring about massive change for an online business. Managed poorly, however, a migration can spell disaster for your website.

We have put together a guide on how to gain better visibility in Search Engine Results Pages (SERPs), illustrated with case studies of successful site migration projects. Since migrations and redesigns often happen at the same time, we will cover the key actions you need to take into account for both a site redesign and a migration.

First of all, before you plan any website redesign, it is highly important that you recognise the obstacles that could undermine your efforts. A great visual design can be very appealing to visitors with fast internet connections and modern computers.

However, these sorts of designs are often not compatible with older computers, slower connection speeds or simpler search engine spider technology. Moreover, a poorly configured content management system can block visibility for your core keywords. With that in mind, here are some things you should avoid:


  • JavaScript Navigation

In case you didn’t know, search engine spiders rarely execute JavaScript files; therefore, website content that is accessible only through JavaScript-based navigation won’t be indexed. Essentially, this means that relying on such navigation reduces your chances of being crawled by search engines and therefore seen by your audience. Moreover, such links do not contribute to the website’s overall link popularity. CSS and DHTML can replace most JavaScript navigation.
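
To see why this matters, here is a minimal sketch using Python’s built-in html.parser as a stand-in for a simple spider (the page markup and URLs are invented for illustration): links injected by a script at runtime never reach the parser.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags -- roughly what a simple spider sees."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

# Hypothetical page: two plain-text links plus one created by JavaScript.
html = """
<nav>
  <a href="/jeans/">Jeans</a>
  <a href="/about/">About</a>
</nav>
<script>
  // A link written here at runtime is invisible to the parser below.
  document.write('<a href="/hidden-page/">Hidden</a>');
</script>
"""

parser = LinkExtractor()
parser.feed(html)
print(parser.links)  # ['/jeans/', '/about/'] -- the /hidden-page/ link is missing
```

The parser only ever discovers the plain `<a href>` links, which is why content reachable solely through script-driven navigation risks staying out of the index.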

  • Dynamic URLs

It is no secret that URLs are very important in search engine optimisation, so they should be rewritten to include relevant keywords. Dynamic URLs are tricky because they do not provide enough information about the page itself.
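
As a rough sketch of such a rewrite, the following maps a dynamic product URL to a keyword-rich one. The `SLUGS` table, URLs and product IDs are assumptions for illustration; in practice the slugs would come from your CMS or product catalogue.

```python
from urllib.parse import urlsplit, parse_qs

# Hypothetical lookup table: product ID -> keyword slug.
SLUGS = {"123": "slim-fit-jeans", "124": "regular-fit-jeans"}

def rewrite_dynamic_url(url: str) -> str:
    """Turn a dynamic URL like /product.php?id=123 into a keyword-rich one."""
    parts = urlsplit(url)
    params = parse_qs(parts.query)
    product_id = params.get("id", [None])[0]
    slug = SLUGS.get(product_id)
    # Fall back to the original URL if we have no slug for this ID.
    return f"/products/{slug}/" if slug else url

print(rewrite_dynamic_url("/product.php?id=123"))  # /products/slim-fit-jeans/
```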

  • Session IDs

If you are using session IDs in your URL structure, they will confuse search engine spiders. Every time a spider visits your page it sees a different URL, so the page is treated as duplicate content and those pages may be removed from the index. In addition, session IDs dilute link popularity, since links accumulate against many URL variants. What you should do is identify search spiders such as Googlebot and serve them URLs with these IDs detected and removed.
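
As a rough sketch of that clean-up step, the following uses Python’s standard urllib.parse to strip session-ID parameters from a URL. The parameter names and example URL are assumptions for illustration; adjust the list for your own platform.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Common session-ID parameter names (illustrative, not exhaustive).
SESSION_PARAMS = {"sid", "sessionid", "phpsessid", "jsessionid"}

def strip_session_id(url: str) -> str:
    """Return the URL with any session-ID query parameters removed."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in SESSION_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(strip_session_id("https://example.com/jeans?sid=abc123&colour=blue"))
# https://example.com/jeans?colour=blue
```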

  • Unnecessary folders

Directory structures should be kept shallow – preferably one to two folders deep – with your most important pages at the top level. If additional folders cannot be cleared away, they should be named accordingly.

  • Non-permanent Redirects

If a page needs to be redirected and the move is permanent, you should use a permanent 301 redirect. Temporary alternatives – a 302 redirect, or redirects implemented with JavaScript or a META refresh – send users and search engines to the new page only for a limited time, until the redirect is removed, and they do not pass full link equity.

  • User-Action Requirements

Search engine spiders cannot perform user actions required to reach information – such as age verification, entering a zip code or using an internal search box – so any content behind them is invisible to crawlers. Therefore, you should give spiders access to this information through plain text links. If necessary, user-agent detection allows spiders to bypass the user-action requirement.

  • Broken links

This is a complication that should be avoided at all costs, as broken links in your content send a bad signal about your website to search engines. We recommend that you check your website for broken links frequently, which can be done with tools such as Screaming Frog.

  • Redirect Chains

A redirect chain occurs when there is more than one redirect between the initial URL and the destination URL. Chains create crawl limitations and reduce the amount of link equity that can be transferred.
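
One way to catch chains ahead of a migration is to flatten your redirect map so every old URL points directly at its final destination. A minimal sketch, assuming the map is a simple old-to-new dictionary (the URLs are invented):

```python
# Hypothetical redirect map (old URL -> new URL), e.g. exported from your
# server config or a crawl. A chain exists when a target is itself redirected.
redirects = {
    "/old-shop/": "/shop/",
    "/shop/": "/store/",       # /old-shop/ -> /shop/ -> /store/ is a chain
    "/about-us/": "/about/",
}

def flatten(redirect_map: dict) -> dict:
    """Point every source directly at its final destination (max one hop)."""
    flat = {}
    for src in redirect_map:
        dst, seen = redirect_map[src], {src}
        # Follow the chain to its end, guarding against redirect loops.
        while dst in redirect_map and dst not in seen:
            seen.add(dst)
            dst = redirect_map[dst]
        flat[src] = dst
    return flat

print(flatten(redirects))
# {'/old-shop/': '/store/', '/shop/': '/store/', '/about-us/': '/about/'}
```

Deploying the flattened map means each legacy URL answers with a single 301 straight to its final destination.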

  • Content in Images

Content cannot be crawled if it is embedded within an image file. You can create a similar visual effect using plain text, style sheets and absolute positioning.

  • Lack of Content

You may be familiar with the phrase ‘content is king’, so it is highly important that relevant content appears on every page of your website. At the very least, each page should have one paragraph of unique content that emphasises your target keyword phrases.

  • Robots.txt File

Search engine spiders use the robots.txt file to learn which pages of your website they may crawl. Avoid blocking pages with valuable content. To allow all search engine spiders access to your whole website, place the following two lines into a text file, name it robots.txt and add it at the root level of your domain:

User-agent: *
Disallow:
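
If you want to sanity-check a robots.txt before deploying it, Python’s standard urllib.robotparser can parse the rules and answer access questions – here applied to the allow-everything example above (the test URL is hypothetical):

```python
from urllib.robotparser import RobotFileParser

# The allow-everything robots.txt from above: empty Disallow means no
# path is blocked for any user agent.
rules = """\
User-agent: *
Disallow:
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/jeans/"))  # True
```

Swapping in your real robots.txt and a list of your most valuable URLs makes this a quick pre-launch regression check.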

  • Incorrect Robots Meta Tag

A robots Meta tag with noindex blocks a page from being indexed, and nofollow stops spiders from following its links – make sure these values are not left on pages you want indexed. Below is an example of the tag:

<meta name="robots" content="noindex,nofollow">

Now that we’ve established which pitfalls to avoid during a migration, it’s time to move on to our best practices for success. When expanding the structure of your website, keep the following key factors in mind and build these optimisation elements into the structure itself.

Best Practices:

  • Keywords

Folders (or directories) should be named with relevant keywords according to the theme of each section. Base these names on your keyword research and be as specific as possible, as they should mirror the structure of your website. Every file name should likewise include relevant keywords, based on the unique theme of that page.

  • Keyword Structure

Separate keyword phrases with dashes in file and folder names, as search engines treat a dash as a space between two keywords.
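
This convention is easy to automate. A minimal slug helper, sketched in Python (the example phrase is invented):

```python
import re

def slugify(phrase: str) -> str:
    """Lower-case a keyword phrase and join its words with dashes."""
    words = re.findall(r"[a-z0-9]+", phrase.lower())
    return "-".join(words)

print(slugify("Sustainable Denim Jeans"))  # sustainable-denim-jeans
```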

  • Global Navigational Elements

Create clean and simple global navigation elements that are as descriptive as possible. Breadcrumbs make your pages more visible to spiders. Enable them on all levels of the website if possible – they reduce factors that might negatively affect your site’s SEO and allow users to retrace their steps as they progress through your website from the homepage.

  • Canonical Tags

Use canonical tags when deciding whether you want visitors to see your domain with or without www. This will help you avoid duplicate content and will show search engines which URL is the original source of the content and should appear in search results. The version you do not choose should be 301-redirected to the one you do. Also place all JavaScript and CSS in external files.
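
As an illustration of settling on one host variant, here is a small Python sketch that normalises a URL to the chosen www or non-www form – the result is what you would use in the canonical tag and as the 301 target (the domain is hypothetical; whether you prefer www is a site-level decision):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url: str, prefer_www: bool = False) -> str:
    """Normalise the host to the chosen www / non-www variant."""
    parts = urlsplit(url)
    host = parts.netloc
    bare = host[4:] if host.startswith("www.") else host
    host = f"www.{bare}" if prefer_www else bare
    return urlunsplit(parts._replace(netloc=host))

print(canonical_url("https://www.example.com/jeans/"))
# https://example.com/jeans/
```

The returned URL is what belongs in `<link rel="canonical" href="…">` on every variant of the page.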

Having considered the tips above, let’s move on to our case study. Migrating to a different eCommerce platform can be very challenging, and with less than six months until Magento 1 reaches end of life, now is the best time to migrate. One example of a website that migrated from Magento 1 is Re-Hash.

Re-Hash creates high-quality jeans made from sustainable materials and is recognised among the major players in that market segment. Re-Hash jeans are hand cut by master craftsmen to adapt to individual body shapes, and the brand sought a website that was just as adaptable.

On December 18th 2019, Re-Hash fully migrated to the Kooomo platform.

The migration was executed with SEO activities managed by Kooomo both before and after going live, and after only one month it returned very promising results. As you’ll see below, revenue, traffic and impressions all increased significantly over this short space of time – improvements that are a direct result of the SEO migration and custom strategy that Kooomo delivered for Re-Hash.

ReHash Stats

You can view the Re-Hash website here and you can discuss your pending migration with our team at:
