This article serves as an SEO checklist for websites moving to a new design as well as for fully revamped websites.

Business owners consider redesigning their websites once in a while in order to:

  • Meet the marketing requirements
  • Improve the user experience
  • Have a more flexible backend or content management system

Or for some other reason. What business owners have to keep in mind is that, over a period of time, many SEO implementations and optimizations will have been done from the marketing side to keep the website SEO-optimized, and any mess in these would disturb the system in terms of:

  • Tracking
  • Leads/Sales
  • Performance of the website on search engines, and more

Hence it's crucial to make sure that the migration to the new design is smooth and successful.

After migrating to a new design, some websites perform exceptionally well and some lose traffic, for the following reasons:

The table below explains when websites lose traffic after a redesign or revamp. Not all websites lose traffic after a revamp; the websites that fall into the "Traffic Drop" column are the ones prone to losing traffic.


Websites that fall into the "Traffic Increase" column would see an improvement in traffic.

Website redesign: traffic drop vs traffic increase

| No | Traffic Drop | Traffic Increase |
|----|--------------|-------------------|
| 1 | Increased load times | Improved load times |
| 2 | Moving to bad hosting/servers | Good hosting/servers |
| 3 | Heavy-design site | Lightweight design |
| 4 | Disturbed internal linking on the system | Good internal linking |
| 5 | Change of domain name | No change in domain name |
| 6 | Old-to-new URL or domain mapping not handled properly | Old-to-new URL or domain mapping handled well |
| 7 | Default content blocks or meta tags (canonical, index, noindex) messed up | Index, noindex, canonicals and pagination handled well |
| 8 | Robots.txt not cross-checked against the new URLs, domain or design | Robots.txt verified |
| 9 | Sitemaps not verified for the appropriate URLs | Sitemaps verified for the appropriate URLs |
| 10 | Crawlable and non-crawlable content not handled | Crawlable and non-crawlable content handled well |
| 11 | Content got archived | All content is accessible to bots and users |
| 12 | SEO implementations missed | SEO checklist taken care of |


Most of the time it is the SEO issues that have the bigger impact and lead to major losses. SEO implementations are not done on the system just once; they are added whenever a problem is found, and over a period of time the system matures in terms of SEO implementations. Hence it is really important that you take the utmost care over all of those implementations when migrating to a new design.

This article covers a few SEO implementations that need to be handled with great attention; anything going wrong around them would have an adverse impact and lead to the following losses:

  1. Crawl resource wastage
  2. Content duplication
  3. Thin web pages
  4. Poor crawling of web pages
  5. Drop in CTR
  6. Drop in Keyword rankings
  7. And many more associated problems

All of the above problems ultimately mean a drop in leads, sales and revenue for the business.

Hence the teams responsible for handling all of these should pay attention during any major change to the system, or even during the release of a small implementation.

Whenever there is a design change (implementing a new design) or any other major change, make sure the following are taken care of:


Tracking pixels:

Usually, a variety of tracking pixels will be implemented on the system, including:

Analytics tracking (the usual metrics, event tracking, e-commerce data layers and others), AdWords tracking, social media pixels and pixels from any other marketing tools. Make sure you keep track of every kind of pixel, and understand every single pixel placed on the website, so that you can migrate all of them exactly as they are on the old design.

Note: If you are a developer, make sure you involve all the marketing teams, or at least get everything checked by them.
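As a reference point, a standard Google Analytics (gtag.js) snippet looks roughly like the block below; the measurement ID is a placeholder, and your site may carry several other pixels in the same way. It is exactly this kind of head-level script that tends to silently disappear from new templates.

```html
<!-- Google Analytics (gtag.js) snippet; G-XXXXXXXXXX is a placeholder ID -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'G-XXXXXXXXXX');
</script>
```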

Handling Duplicate Content:

Many dynamic websites have a few hundred web pages created by default on the addition of products or categories, and most of them will probably carry duplicate content. To handle this duplicate content, various implementations may have been put in place, such as noindex/nofollow tags, canonical tags, pagination tags and more.


Index/No-index status:

Let's take the example of a service website that by default also creates city pages, all with the same content apart from minor changes in currencies, region-specific images, dates or other small bits of information. Meanwhile, these pages may be used for marketing purposes other than SEO, so they have to be dealt with effectively without wasting crawl resources, and that's when a noindex, nofollow tag is ideal.

An example from one of our clients,

The system by default creates a few hundred city pages for every service added, which leads to a few lakh web pages (city pages). Allowing all of them to be crawled and indexed by search engines is not good practice, so we developed content for around 50K city pages; only those are allowed to be crawled and indexed, and everything else carries a noindex, nofollow tag. Not having this in place holds back SEO progress.

  • Web pages that have unique content hold an index, follow tag
  • Web pages that don't have unique content hold a noindex, nofollow tag
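In HTML these two cases come down to the meta robots tag in the page head, roughly as shown below:

```html
<!-- City page with unique, SEO-developed content: allow crawl and index -->
<meta name="robots" content="index, follow">

<!-- Auto-generated city page with duplicated content: keep bots out -->
<meta name="robots" content="noindex, nofollow">
```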


Canonical Tags:

This is slightly different from the case above. Here the pages hold duplicate content, but instead of blocking all of it from search engines, we still prefer search bots to access the content on these pages and crawl all the hyperlinks placed on them, and that's when a canonical tag is implemented instead of noindex, nofollow.

On a dynamic system there is a possibility of duplicate pages being created for filters or any other parameter in the URL, so a self-referencing canonical should be the default implementation on all web pages to avoid content duplication.

It's good to make sure that all of your web pages have self-referencing canonical tags.
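A minimal sketch of the tag, assuming placeholder URLs: the clean page carries a self-referencing canonical, and any parameterised variants point back to it.

```html
<!-- Self-referencing canonical on the clean page -->
<link rel="canonical" href="https://www.example.com/courses/data-science">

<!-- The same tag on a parameterised variant such as /courses/data-science?sort=price
     points back to the clean URL, so signals are consolidated there -->
<link rel="canonical" href="https://www.example.com/courses/data-science">
```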


Block access of common content for bots:

This is quite unlike the canonical implementation: here the web pages are allowed to be crawled and indexed, but certain blocks of content on the page are restricted.

Let's consider the example below from an education provider and see how it has been implemented.

There are a few hundred city pages for every course, and most of them hold the same content as the course page, because it is practically impossible to develop unique content for those hundreds of city pages. Even when a section is rewritten it may still overlap heavily (a curriculum, for instance, reads much the same despite rewriting), which brings the unique-content ratio down. Meanwhile, all of those blocks are very informative to users, so retaining them is good, but not at the cost of duplication. Hence we have done the following:

Every single piece of content on the course pages is allowed to be crawled and indexed by search engines, whereas the same content on the city pages is blocked from being crawled and indexed.

This is implemented by distinguishing bots from real users: if the visitor is a bot, those duplicated blocks are made inaccessible to it.
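A minimal server-side sketch of that idea, assuming a simple user-agent check; the bot patterns and the page-assembly function are illustrative, not any specific framework's API.

```python
import re

# Illustrative list of crawler user-agent fragments; extend as needed
BOT_PATTERN = re.compile(r"googlebot|bingbot|slurp|duckduckbot|baiduspider", re.IGNORECASE)

def is_search_bot(user_agent: str) -> bool:
    """Return True if the User-Agent string looks like a search engine crawler."""
    return bool(user_agent and BOT_PATTERN.search(user_agent))

def render_city_page(user_agent: str, unique_blocks: list[str], common_blocks: list[str]) -> str:
    """Bots receive only the unique blocks; real users see everything."""
    blocks = unique_blocks if is_search_bot(user_agent) else unique_blocks + common_blocks
    return "\n".join(blocks)
```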


Unique content blocks on web pages:

The unique blocks are the placeholders for SEO content, i.e. the unique content developed from an SEO perspective. These blocks are very much needed, especially when other content sections are blocked for search engines.


Breadcrumb & Rich snippets Implementation:

Rich snippets are implemented to improve the CTR and the user experience on the search results page. During a change of web design there is a very high chance of this rich snippet code getting removed, so make sure rich snippets are added to your SEO checklist while migrating to the new design.

Here are a few to name (a sample snippet follows the list):

  • Product snippets
  • Event snippets
  • Article snippets
  • Author snippets
  • Video snippets
  • And many more
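For reference, a product rich snippet is typically added as a JSON-LD block in the page head; all values below are placeholders.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "image": "https://www.example.com/images/example-product.jpg",
  "description": "Placeholder description used only to illustrate the markup.",
  "offers": {
    "@type": "Offer",
    "priceCurrency": "USD",
    "price": "49.00",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```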


Breadcrumb:

Breadcrumbs mainly help in:

  • Improving internal linking (improving the crawl rate)
  • Helping bots and users understand the hierarchy of the product catalog and the relative importance of web pages

As part of SEO, this implementation will certainly have been done by the SEO professionals, and you have to make sure it is not missed during the migration to a new design, system or domain.
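A breadcrumb trail is usually backed by BreadcrumbList structured data as well; a minimal sketch with placeholder names and URLs:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://www.example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Courses", "item": "https://www.example.com/courses/" },
    { "@type": "ListItem", "position": 3, "name": "Data Science" }
  ]
}
</script>
```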


Internal linking block:

Every web page on the system should be accessible to bots so that they can crawl and index it and rank it for the keywords it is eligible for. Hence SEO professionals will have implemented a specific block that includes hyperlinks to a specific set of web pages.

Every single page should have this block, and each block includes at least 5-15 links; the exact number is purely based on the math around how many optimized pages need to be allowed for search engine crawling and indexing.

These hyperlinks, or anchor tags, will be designed by the marketing team, and they will provide the Excel sheets for bulk-uploading them.

Ideally, these sections should be dynamic and built on some logic, such as equivalent products, nearby regions and more, but a dynamic implementation would impact load times. Hence the same logic is hardcoded, static blocks are implemented instead, and these blocks are redone once in a while. So make sure there is a place for this block in the new design, and don't forget to upload these hyperlinks onto the web pages.
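In its simplest form, such a block is just a static list of links dropped into the template; the anchors and URLs below are placeholders for whatever the marketing team's sheet specifies.

```html
<!-- Static internal-linking block; 5-15 links per page, as per the sheet -->
<nav class="related-links">
  <ul>
    <li><a href="/courses/data-science-bangalore">Data Science course in Bangalore</a></li>
    <li><a href="/courses/data-science-hyderabad">Data Science course in Hyderabad</a></li>
    <li><a href="/courses/machine-learning-bangalore">Machine Learning course in Bangalore</a></li>
  </ul>
</nav>
```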


Default Content:

There is a possibility of custom default content getting replaced with system defaults, or with no content at all, especially when the backend is restructured along with the design, and this would impact SEO severely. So make sure the following are taken care of during the migration of the website from the old system or design to the new one.


Titles and Descriptions (Meta Tags):

Whenever a product is uploaded to the system, it creates a few hundred equivalent web pages for locations, product variations and more. Updating custom titles and descriptions for all of these pages is a great SEO advantage; missing them would impact the CTR, which in turn means no improvement in the SERPs, or a drop for keywords already ranking. We suggest having default titles and descriptions for all of these pages, usually implemented by the tech teams based on templates provided by the marketing teams.
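A sketch of what such a default template might render for an auto-generated city page; the pattern and values are illustrative, not a prescribed format.

```html
<!-- Rendered from a default template such as "{product} in {city} | {brand}" -->
<title>Data Science Course in Bangalore | Example Academy</title>
<meta name="description"
      content="Looking for a Data Science course in Bangalore? Example Academy offers classroom and online batches with placement support.">
```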


Heading tags:

Heading tags hold a prominent place on a web page from an SEO perspective, so not having them, or having random content in them, would hold back SEO progress. I suggest having default heading tags across the system, so that whenever a web page is created automatically it picks up the name of the product or category into its heading tags. If you are migrating to a new design, make sure this is not missed and is migrated as it is.
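For example, an auto-generated product or category page might simply pick up its name as the main heading; the name here is a placeholder.

```html
<!-- H1 filled automatically from the product/category name -->
<h1>Data Science Course in Bangalore</h1>
```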


Image Alt Attribute:

Image tags hold an alt attribute, and during the migration this may slip through, leading to a drop in SERPs for image results, or the image traffic to your website may go down. So it's crucial to cross-check this implementation on the new design before you proceed with uploading the new design to the website.
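A quick check is simply that every meaningful image in the new templates still carries descriptive alt text, along these lines (file name and text are placeholders):

```html
<img src="/images/data-science-classroom-bangalore.jpg"
     alt="Data Science classroom training session in Bangalore">
```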


Sitemaps:

During migration you may mess up the system, clean up some redundant web pages or make other changes, so you have to make sure the sitemap implementation is in place, and you should check it against removed web pages and changed status codes (a minimal sitemap entry is shown after the list below):

  • Whenever a web page is removed from the system it also has to be removed from the sitemap
  • Whenever a web page is blocked for search engines crawl and index it has to be removed from the sitemap
  • Whenever a web page is redirected the URL has to be removed and only the destination URL has to be in the sitemap
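For reference, a sitemap entry should only ever point at a live, indexable, 200-status destination URL; a minimal example with a placeholder URL and date:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/courses/data-science</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```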


Robots.txt:

If you are just adding a new design to the existing website, there is little chance of the sitemaps and robots.txt being removed or changed, but you should still add them to the SEO checklist during migration to ensure a smooth transition.

Robots.txt holds sets of directives controlling the crawling of web pages, and it always needs to be checked against the list provided by the marketing team.
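A small robots.txt sketch, assuming placeholder paths; the actual allow/disallow rules must come from the marketing team's list.

```
User-agent: *
Disallow: /search/
Disallow: /*?sort=
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```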


Load times:

This is another very important metric to pay attention to, as any regression here affects several other metrics, which in turn leads to a drop in leads/sales, keyword rankings and crawl stats in no time. Unlike the implementations above, which usually take some time to show a negative effect, load times affect almost immediately every single metric that search engines consider when ranking your website.

Ideally, the suggested load time is no more than 2-3 seconds, but it's always good to aim lower: the faster the website loads, the better.


Web design practices:

A website with less HTML code and fewer HTTP requests is good practice, so I suggest keeping this at the top of the checklist if you are changing the design of the website.

In order to have fewer HTTP requests (a small illustration follows the list):

  • Combine multiple external CSS files into a few
  • Combine multiple external JS files into a few
  • Use sprites for images
  • Handle CDN requests, API calls and other requests properly on the web page
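A small illustration of the first three points, with placeholder file names: one combined stylesheet, one combined script, and a sprite-based icon set.

```html
<!-- One combined, minified stylesheet and one combined script instead of many small files -->
<link rel="stylesheet" href="/assets/site.min.css">
<script src="/assets/site.min.js" defer></script>

<style>
  /* All icons share a single sprite image; only the background position changes */
  .icon        { background-image: url("/assets/sprite.png"); width: 24px; height: 24px; }
  .icon-search { background-position: 0 0; }
  .icon-cart   { background-position: -24px 0; }
</style>
```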


Summary:

This article discusses the SEO implementations to be taken care of when the website design is being changed or a new design is added to an existing website. It is not an SEO checklist for brand-new websites, only for existing websites changing their design.

This article doesn't include the SEO checklist for a change in:

  1. URL structure
  2. Domain names
  3. Merging with other domains

If you are planning to implement any of the above three, the points in this article still apply, but other implementations take the highest priority in SEO, such as:

  • URL mapping
  • HTTP status codes for the URLs (301, 302, 410)
  • Dealing with the backlinks and more.

Manjunath Chowdary

Digital Marketing Consultant

-Kandra Digital

An agency built with the core purpose of delivering quality digital marketing in an era where digital marketing services are treated as just business, rather than as value for the business, the business owners, and their resources and time.
