4 Easy Fixes to Restore Traffic Post-Redesign
Redesigning a website is a big effort. The intent could be to improve content, design, or product offerings to make the site more user-friendly, all in the effort to attract more visits. However, even though the purpose of a redesign is to improve traffic, user experience, or both, your website can actually see a decline in traffic right afterward. A temporary dip after a redesign is normal, but a drop greater than 10% is cause for concern.
Why does redesign cause a traffic drop?
To serve the new versions of your web pages, Google needs to crawl and index them, so a temporary drop in traffic is expected. If the drop is significant, however, something likely needs fixing.
Redirects: Unless your URLs are exactly the same as before, failing to redirect old URLs to new ones is the most likely cause of a traffic drop. Redirects tell search engines and browsers where a page has moved; without them, a search engine assumes the page has been removed or doesn't exist.
The fix: If your URL structure changed or old pages moved to new locations, make sure proper redirects are in place. Log into Google's Search Console, select Crawl > Crawl Errors, and click the "Not Found" tab: pages without redirects will show up there.
Then implement 301 redirects for your old web pages. A 301 redirect tells search engine bots that an old URL has permanently moved to a new one. With these in place, your traffic should return to normal within a couple of weeks.
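As a minimal sketch of how those redirect rules might be assembled, the snippet below generates Apache-style `Redirect 301` directives from an old-to-new URL mapping. The mapping and the example.com domain are purely illustrative assumptions; substitute your site's actual old and new paths.

```python
# Sketch: generate Apache-style 301 redirect rules from an old-to-new URL map.
# The URL mapping below is hypothetical; replace it with your site's real paths.

def build_redirect_rules(url_map):
    """Return one 'Redirect 301 <old-path> <new-url>' line per mapping entry."""
    return [f"Redirect 301 {old} {new}" for old, new in url_map.items()]

url_map = {
    "/old-about.html": "https://example.com/about",
    "/blog/old-post": "https://example.com/blog/new-post",
}

for rule in build_redirect_rules(url_map):
    print(rule)
```

The resulting lines could be dropped into an Apache `.htaccess` file; other servers (e.g. nginx) use a different directive syntax, so the mapping itself is the portable part.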
Site Structure / Sitemap: Another input Google uses during indexing is your sitemap, a basic listing of all your web pages. If your redesign is extensive enough that your site structure changes, you'll need to update the sitemap as well to prevent traffic drops. Again, submitting it through the Search Console can speed up crawling and indexing and help restore traffic.
The fix: Create a new site structure with all the important pages in place. Try to keep the listing to a maximum of 100 pages; beyond that limit, crawlers may treat it as a general overview rather than a complete listing.
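To illustrate what an updated sitemap contains, here is a small sketch that builds a minimal XML sitemap in the standard sitemaps.org format from a list of page URLs. The page URLs are placeholder assumptions; list your site's real post-redesign pages instead.

```python
# Sketch: build a minimal XML sitemap from a list of page URLs.
# The example.com URLs are placeholders, not real pages.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return a sitemap XML string in the standard sitemaps.org format."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

pages = ["https://example.com/", "https://example.com/about"]
print(build_sitemap(pages))
```

Saving this output as `sitemap.xml` at your site root and submitting it in Search Console is the usual way to prompt re-crawling after a restructure.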
Content: A redesign can include improvements to content as well. Problems arise when you drop content or keywords that were on the old website or that were bringing traffic to your site; this, too, leads to a drop in traffic.
The fix: Give SEO priority. At CloudRocket, even though our website has undergone a couple of redesigns, we ensure our SEO specialists are included in the redesign process right from the start. An SEO specialist will help you identify keywords and advise on how to increase traffic.
Robots.txt file blocking: The robots.txt file tells search engine crawlers which pages they may and may not crawl. It's good practice to block crawling while your site is under development or being redesigned, but it's critical to update the robots.txt file once your new site is live. Failing to do so will keep search engines from crawling and indexing it.
The fix: Make sure your robots.txt file is updated before you launch your redesigned site. Keep tabs on which pages are allowed and which are disallowed by checking the file in a browser: in the address bar, enter your website's domain name followed by /robots.txt to view its contents.
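You can also check the rules programmatically. The sketch below uses Python's standard-library robots.txt parser against a hypothetical rule set (the `/staging/` disallow and the example.com paths are assumptions for illustration) to confirm which pages crawlers are allowed to fetch.

```python
# Sketch: verify which pages a robots.txt file allows crawlers to fetch.
# The rules and paths below are hypothetical examples.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /staging/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A page you expect to be crawlable after launch:
print(parser.can_fetch("*", "https://example.com/blog/"))
# A path deliberately left blocked:
print(parser.can_fetch("*", "https://example.com/staging/draft"))
```

If a live page unexpectedly comes back as not fetchable, a leftover development-era `Disallow` rule is the usual culprit.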
Keep these best practices in mind and you should see traffic to your website restored. If you're still seeing a drop, check Google Analytics, which can give you insight into where the drop-offs are happening.
Comment & let us know if this helped!