Alien Road Company

Redirecting URLs is the practice of resolving an existing URL to a different one, effectively telling your visitors and Google Search that a page has a new location. Redirects are particularly useful in the following circumstances:

  • You’ve moved your site to a new domain, and you want to make the transition as seamless as possible.
  • People access your site through several different URLs. If, for example, your home page can be reached in multiple ways (for instance, https://example.com/home, https://home.example.com, or https://www.example.com), it’s a good idea to pick one of those URLs as your preferred (canonical) destination, and use redirects to send traffic from the other URLs to your preferred URL.
  • You’re merging two websites and want to make sure that links to outdated URLs are redirected to the correct pages.
  • You removed a page and you want to send users to a new page.

If you’re using a platform like Blogger or Shopify, the platform may already have built-in redirect solutions. Try searching for help articles (for example, search for “blogger redirects”).

Overview of redirect types

While your users generally won’t be able to tell the difference between the different types of redirects, Google Search uses redirects as a strong or weak signal that the redirect target should be canonical. Choosing a redirect depends on how long you expect the redirect will be in place and what page you want Google Search to show in search results:

  • Permanent redirects: Show the new redirect target in search results.
  • Temporary redirects: Show the source page in search results.

The following table explains the various methods you can use to set up permanent and temporary redirects, ordered by how likely Google is to interpret them correctly (for example, a server side redirect has the highest chance of being interpreted correctly by Google). Choose the redirect type that works for your situation and site:

Redirect types
Permanent redirects
Googlebot follows the redirect, and the indexing pipeline uses the redirect as a strong signal that the redirect target should be canonical. Use permanent redirects when you’re sure that the redirect won’t be reverted.

  • HTTP 301 (moved permanently): set up server side redirects.
  • HTTP 308 (moved permanently): set up server side redirects.
  • meta refresh (0 seconds): set up meta refresh redirects.
  • HTTP refresh (0 seconds): set up meta refresh redirects.
  • JavaScript location: set up JavaScript redirects. Only use JavaScript redirects if you can’t do server side or meta refresh redirects.
  • Crypto redirect: learn more about crypto redirects. Don’t rely on crypto redirects for letting search engines know that your content has moved unless you have no other choice.

Temporary redirects
Googlebot follows the redirect, and the indexing pipeline uses the redirect as a weak signal that the redirect target should be canonical.

  • HTTP 302 (found): set up server side redirects.
  • HTTP 303 (see other): set up server side redirects.
  • HTTP 307 (temporary redirect): set up server side redirects.
  • meta refresh (>0 seconds): set up meta refresh redirects.
  • HTTP refresh (>0 seconds): set up meta refresh redirects.
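The mapping in the table can be sketched as a simple lookup. This is a hypothetical helper for illustration, not part of any Google API; the status-code sets follow the table above.

```python
# Hypothetical helper mapping redirect methods to the canonicalization
# signal described in the table above.
PERMANENT_CODES = {301, 308}       # strong signal: target should be canonical
TEMPORARY_CODES = {302, 303, 307}  # weak signal: source may remain canonical

def redirect_signal(status_code, meta_refresh_delay=None):
    """Classify a redirect as 'permanent' or 'temporary'.

    meta_refresh_delay models the seconds value of a meta refresh or
    HTTP Refresh redirect: 0 seconds counts as permanent, anything
    greater as temporary.
    """
    if status_code in PERMANENT_CODES:
        return "permanent"
    if status_code in TEMPORARY_CODES:
        return "temporary"
    if meta_refresh_delay is not None:
        return "permanent" if meta_refresh_delay == 0 else "temporary"
    raise ValueError("not a redirect: %s" % status_code)

print(redirect_signal(301))                        # permanent
print(redirect_signal(200, meta_refresh_delay=5))  # temporary
```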

Server side redirects

Setting up server side redirects requires access to the server configuration files (for example, the .htaccess file on Apache) or setting the redirect headers with server side scripts (for example, PHP). You can create both permanent and temporary redirects on the server side.

Permanent server side redirects

If you need to change the URL of a page as it is shown in search engine results, we recommend that you use a permanent server side redirect whenever possible. This is the best way to ensure that Google Search and people are directed to the correct page. The 301 and 308 status codes mean that a page has permanently moved to a new location.

Temporary server side redirects

If you just want to send users to a different page temporarily, use a temporary redirect. This will also ensure that Google keeps the old URL in its results for a longer time. For example, if a service your site offers is temporarily unavailable, you can set up a temporary redirect to send users to a page that explains what’s happening, without compromising the original URL in search results.

Implement server side redirects

The implementation of server side redirects depends on your hosting and server environment, or the scripting language of your site’s backend.

To set up a permanent redirect with PHP, use the header() function. You must set the headers before sending any other output to the browser:

header('HTTP/1.1 301 Moved Permanently');
header('Location: https://www.example.com/new-page');

Similarly, here’s an example of how to set up a temporary redirect with PHP:

header('HTTP/1.1 302 Found');
header('Location: https://www.example.com/new-page');

If you have access to your web server configuration files, you may be able to write the redirect rules yourself. Follow your web server’s guides:

  • Apache: Consult the Apache .htaccess Tutorial, the Apache URL Rewriting Guide, and the Apache mod_alias documentation. For example, you can use mod_alias to set up the simplest form of redirects (the target URLs here are placeholders):

    # Permanent redirect:
    Redirect permanent "/old" "https://www.example.com/new"
    # Temporary redirect:
    Redirect temp "/two-old" "https://www.example.com/two-new"

    For more complex redirects, use mod_rewrite. For example:

    RewriteEngine on
    # redirect the service page to a new page with a permanent redirect
    RewriteRule "^/service$" "/about/service" [R=301]
    # redirect the service page to a new page with a temporary redirect
    RewriteRule "^/service$" "/about/service" [R]
  • NGINX: Read about Creating NGINX Rewrite Rules on the NGINX blog. As with Apache, you have multiple choices to create redirects. For example (the target URLs are placeholders):

    location = /service {
        # for a permanent redirect
        return 301 $scheme://www.example.com/about/service;
        # for a temporary redirect, use instead:
        # return 302 $scheme://www.example.com/about/service;
    }

    For more complex redirects, use the rewrite directive, which takes the pattern first and the replacement second:

    location /service {
        # for a permanent redirect
        rewrite ^/service/offline/([a-z]+)/?$ /service?name=$1 permanent;
        # for a temporary redirect, use instead:
        # rewrite ^/service/offline/([a-z]+)/?$ /service?name=$1 redirect;
    }
  • For all other web servers, check with your server manager or hoster, or search for guides on your favorite search engine (for example, search for “LiteSpeed redirects”).
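If your backend is a small custom service rather than PHP or a configurable web server, the same idea applies: send the status code and a Location header. A minimal sketch with the Python standard library; the paths and target URLs are made-up examples:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Made-up example mapping of old paths to (target URL, status code).
REDIRECTS = {
    "/old": ("https://www.example.com/new", 301),          # permanent
    "/two-old": ("https://www.example.com/two-new", 302),  # temporary
}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = REDIRECTS.get(self.path)
        if target is None:
            self.send_error(404)
            return
        url, status = target
        self.send_response(status)         # 301 or 302
        self.send_header("Location", url)  # the redirect target
        self.end_headers()

    def log_message(self, *args):
        # keep the example quiet
        pass

# To run standalone:
# HTTPServer(("localhost", 8000), RedirectHandler).serve_forever()
```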

meta refresh and its HTTP equivalent

If server side redirects aren’t possible to implement on your platform, meta refresh redirects may be a viable alternative. Google differentiates between two kinds of meta refresh redirects:

  • Instant meta refresh redirect: Triggers as soon as the page is loaded in a browser. Google Search interprets instant meta refresh redirects as permanent redirects.
  • Delayed meta refresh redirect: Triggers only after an arbitrary number of seconds set by the site owner. Google Search interprets delayed meta refresh redirects as temporary redirects.

Place the meta refresh redirect either in the head section of the HTML or in the HTTP header with server side code. For example, here’s an instant meta refresh redirect in the head section of the HTML:

<!doctype html>
  <meta http-equiv="refresh" content="0; url=https://www.example.com/new-location">
  <title>Example title</title>

Here’s an example of the HTTP header equivalent, which you can inject with server side scripts:

HTTP/1.1 200 OK
Refresh: 0; url=https://www.example.com/new-location

To create a delayed redirect, which is interpreted as a temporary redirect by Google, set the content attribute to the number of seconds that the redirect should be delayed:

<!doctype html>
  <meta http-equiv="refresh" content="5; url=https://www.example.com/new-location">
  <title>Example title</title>
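If your backend templates these pages, generating them is mechanical. A sketch (the target URL is a placeholder):

```python
# Sketch: render a meta refresh redirect page. A delay of 0 seconds is
# interpreted by Google Search as a permanent redirect; any larger
# delay is interpreted as a temporary redirect.
def meta_refresh_page(url, delay=0):
    return (
        "<!doctype html>\n"
        f'<meta http-equiv="refresh" content="{delay}; url={url}">\n'
        "<title>Example title</title>\n"
    )

print(meta_refresh_page("https://www.example.com/new-location"))
```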

JavaScript location redirects

Google Search interprets and executes JavaScript using the Web Rendering Service once crawling of the URL has completed. Only use JavaScript redirects if you can’t implement server side or meta refresh redirects. While Google attempts to render every URL Googlebot crawls, rendering may fail for various reasons. This means that if you set a JavaScript redirect, Google might never see it if rendering of the content fails.

To set up a JavaScript redirect, set the location property to the redirect target URL in a script block in the HTML head. For example:

<!doctype html>
  <script>
    window.location.href = "https://www.example.com/new-location";
  </script>
  <title>Example title</title>

Crypto redirects

If you can’t implement any of the traditional redirect methods, you should still make an effort to let your users know that the page or its content has moved. The simplest way to do this is to add a link pointing to the new page accompanied by a short explanation. For example:

<a href="">We moved! Find the content on our new site!</a>

This helps users find your new site, and Google may understand it as a crypto redirect. Don’t rely on crypto redirects for letting search engines know that your content has moved unless you have no other choice. Contact your hosting provider for help with traditional redirects before resorting to crypto redirects.

Alternate versions of a URL

When you redirect a URL, Google keeps track of both the redirect source (the old URL) and the redirect target (the new URL). One of the URLs will be the canonical; which one depends on signals such as whether the redirect was temporary or permanent. The other URL becomes an alternate name of the canonical URL. Alternate names are different versions of a canonical URL that users might recognize and trust more. Alternate names may appear in search results when a user’s query hints that they might trust the old URL more.

For example, if you moved to a new domain name, it’s very likely that Google will continue to occasionally show the old URLs in the results, even though the new URLs are already indexed. This is normal and as users get used to the new domain name, the alternate names will fade away without you doing anything.

What is a site move?

As a site owner, it is possible that at some point you’ll want to move your site to a different URL or different infrastructure. This page covers the different scenarios of site moves and gives you tips about how to prepare, implement, and monitor the move.

In this documentation, a site redesign is not considered a site move, even if it involves adding extra URLs. A redesign is changing the layout of existing pages, or adding pages of new content. A move is moving existing pages in one of the following ways:

  • Site move without URL changes
    The underlying infrastructure serving the website is changed, but there are no visible changes to the URL. For example, if you move to a different hosting provider while keeping the same URLs for your site.
  • Site move with URL changes
    The page URLs change. For example:
    • Protocol changes: http://www.example.com to https://www.example.com
    • Domain name changes: example.com to example.net
    • URL path changes: example.com/page.html to example.com/products/page.html

To change how your site serves mobile versus desktop users, see the mobile-friendly site design guide.

Recommendations for all site moves

  • Split your move into smaller steps, if that makes sense for your site.
    We recommend initially moving just a piece of the site to test any effects on traffic and search indexing. After that you can move the rest of your site all at once or in chunks. When choosing the initial test section of the site, pick a section that changes less frequently and isn’t significantly affected by frequent or unpredictable events. Also keep in mind that while moving just one section is a great way to test your move, it’s not necessarily representative of a whole site move when it comes to search. The more pages that you move, the more likely you’ll encounter additional problems to solve. Careful planning can minimize problems.
  • Time your move to coincide with lower traffic, if possible.
    If your traffic is seasonal or dips on certain weekdays, it makes sense to move your site during the recurring traffic dips. This means that fewer people will be affected by potential issues that can happen during the site move, and more of your server’s resources can be dedicated to Googlebot crawling your site.
  • Expect temporary fluctuation in site ranking during the move.
    With any significant change to a site, you may experience ranking fluctuations while Google recrawls and reindexes your site. As a general rule, a medium-sized website can take a few weeks for most pages to move in our index; larger sites can take longer. The speed at which Googlebot and our systems discover and process moved URLs largely depends on the number of URLs and your server speed. Submitting a sitemap can help make the discovery process quicker, and it’s fine to move your site in sections.
  • Ask questions on Google Search Central.
    There is plenty of good advice on our help page and specific cases answered in our user forums. If you can’t find an answer, you can ask a live question to one of our Google Search specialists during our SEO office hours.
  • If it involves a URL change, you might consider an A/B test or trial run.
    Plan for a few weeks to allow for crawling and indexing to pick up changes, plus time to monitor traffic.

Move a site without URL changes

This guide shows how to minimize the impact on your Google Search performance when migrating your site’s hosting infrastructure, for example when switching hosting providers or moving to a content distribution network (CDN). This guide is only for migrations that don’t affect the user-visible URLs. If you’re making visible URL changes, start with Site moves with URL changes instead.


  1. Review FAQs and basic information about site moves. Know what to expect, and how it might affect your users and rankings.
  2. Prepare the new hosting infrastructure. Upload your content to the new servers or configure your CDN and your origin servers, and test it.
  3. Start the site move. Change the DNS settings of your domain name to point to the new hosting infrastructure. This step is the actual site move step that starts the process of sending your traffic to the new infrastructure.
  4. Monitor traffic. Keep tabs on the traffic served by the old and new hosting.
  5. Shut down. Shut down the old hosting infrastructure when you’re confident that all users are receiving content correctly from the new infrastructure and no one is using the old infrastructure.

Prepare the new hosting infrastructure 

This section covers steps to take before you start the actual site move.

Copy and test your new site

First, upload a copy of your site to your new hosting provider. Once you do that, verify that it works as expected by thoroughly testing all aspects of how your users interact with your site. Here are a few suggestions:

  • Open your new site in a web browser and review all elements of your site: webpages, images, forms, and downloads (such as PDF files).
  • Create a testing environment, perhaps with IP-restricted access, through which you test all of the features before the website goes live.
  • Allow for public testing with a temporary hostname for your new infrastructure (such as beta.example.com) so you can test accessibility by browsers. A temporary hostname can help you test whether Googlebot can reach your site or not.
  • Test the new site with a small portion of your live traffic if possible.

Check that Googlebot is able to access the new hosting infrastructure

If you don’t already have a Search Console account, create one for your site to help you monitor Google access and traffic. If you created a temporary hostname for your new site, create an account for that hostname as well. Check that Googlebot can access your new infrastructure using the URL Inspection Tool in Search Console. Caution: check your firewall configuration and denial of service (DoS) protection, and make sure they don’t block Googlebot’s ability to reach the DNS or the hosting provider’s servers.

Lower the TTL value for your DNS records

You can help make your site move go faster if you lower your site DNS records’ TTL value, which will allow the new settings to propagate to ISPs faster. DNS settings are usually cached by ISPs based on the specified Time to Live (TTL) setting. Consider lowering the TTL to a conservative low value (for example, a few hours) at least a week in advance of the site move to refresh DNS caches faster.

Review Search Console verification

Make sure your Search Console verification will continue to work after the site move.

If you’re using the HTML file method to verify ownership of your site in Search Console, make sure you don’t forget to include your current verification file in your new copy of the site.

Likewise, if you verify ownership with a meta tag or Google Analytics snippet included in your content management system’s templates, ensure the new CMS copy includes these as well.

Start the site move 

The move process is as follows.

  1. Remove any temporary blocks to crawling. While building the new copy of a site, some site owners use a robots.txt file to disallow all crawling by Googlebot and other crawlers, or use noindex meta tags or HTTP headers to block indexing of content. Be sure to remove any such blocks from the new copy of the site when you’re ready to start the move.
  2. Update the DNS settings. You start the site move by updating the DNS records to point to the new hosting provider. Check with your DNS provider for how to do that. Because DNS records are cached, the new settings take some time to fully propagate to all users on the internet.

Monitor traffic 

Here are three things you can do to make sure your move is going smoothly:

  • Keep an eye on the server logs on both new and old servers.
    As the DNS settings propagate and site traffic moves over, you’ll notice a drop in traffic logged on the old servers and a corresponding increase in traffic on the new servers.
  • Use different public DNS checking tools.
    Check that different ISPs around the world are updating to your new DNS settings correctly.
  • Monitor crawling.
    Monitor the Index coverage graphs in Search Console. Rather than manually setting a maximum crawl rate in Search Console, keep the setting at Let Googlebot decide. This way, settings designed only for the old system won’t interfere with the requirements of your new system.

A note about Googlebot’s crawl rate

It’s normal to see a temporary drop in Googlebot’s crawl rate immediately after the launch, followed by a steady increase over the next few weeks, potentially to rates that may be higher than before the move.

This fluctuation occurs because we determine crawl rate for a site based on many signals, and these signals change when your hosting changes. As long as Googlebot doesn’t encounter any serious problems or slowdowns when accessing your new serving infrastructure, it will try to crawl your site as fast as necessary and possible.

Shut down old hosting 

Check the server logs on the old provider and, once the traffic to the old provider reaches zero, you can shut down your old hosting infrastructure. This completes the site move.

Move a site with URL changes

This article describes how to change the URLs of existing pages on your site with minimal impact on your Google Search results. Examples of this kind of site move include:

  • URL changes from HTTP to HTTPS
  • Domain name changes, such as example.com to example.net, or merging multiple domains or hostnames
  • URL path changes, such as example.com/page.php?id=1 to example.com/widget

Not changing the URLs? If you are making site changes without visible URL changes, start here instead.


  1. Review basic information about site moves. Know what to expect, and how it might affect your users and rankings. If moving from HTTP to HTTPS, review the best practices for HTTPS.
  2. Prepare the new site and test it thoroughly.
  3. Prepare a URL mapping from the current URLs to their corresponding new format.
  4. Start the site move by configuring the server to redirect from the old URLs to the new ones.
  5. Monitor the traffic on both the old and new URLs.

FAQs for all site moves with URL changes

  • Does Google recommend that you move everything together, or is it fine to move in sections?
    Moving in sections is fine.
  • How can you test how many pages were indexed?
    Verify data for each property separately in Search Console. Use the Index Status report for a broad look. Use the Sitemaps report to view how many URLs submitted in a sitemap have been indexed.
  • How long will it take for Google to recognize URL changes?
    There are no fixed crawl frequencies; it depends on the size of your site, and the speed of crawling that’s possible. The move takes place on a per-URL basis.
  • Do you lose credit for links when you redirect to new URLs?
    No, 301 or 302 redirects do not cause a loss in PageRank.

Migrating from HTTP to HTTPS

  • Review the best practices for HTTPS.
  • Be sure to add the HTTPS property to Search Console. Search Console treats HTTP and HTTPS separately; data for these properties is not shared in Search Console. So if you have pages in both protocols, you must have a separate Search Console property for each one.

HTTP to HTTPS migration FAQs

Will this HTTPS migration affect ranking?

As with all migrations, you may experience some ranking fluctuation during a migration. However, to avoid HTTPS-specific pitfalls, review the best practices for HTTPS pages.

HTTPS sites receive a small ranking boost, but don’t expect a visible change. Google uses HTTPS as a positive ranking signal. This signal is one amongst many others, and currently carries less weight than high-quality site content; you should not expect a major SEO advantage for moving to HTTPS in the short term. In the longer term, Google may increase the strength of the HTTPS boost.

Is it okay to move just some pages to HTTPS?

Yes, that’s okay. Start with a part, test it, then move more at your own pace.

If you are migrating from HTTP to HTTPS in pieces, and you want to avoid early indexing of the staged URLs, we recommend using rel=canonical rather than redirects. If you use redirects, you won’t be able to test the redirected pages.

Will the rel=canonical tag guarantee that the HTTP URL is indexed?

No, but it’s a very strong signal when picking the indexed URL.

Which certificate does Google recommend?

For Google Search, any modern certificate that’s accepted by modern browsers is acceptable.

Do search keywords change after a move to HTTPS?

This won’t change with HTTPS; you can still see search queries in Search Console.

How can I test how many pages were indexed?

Verify HTTP and HTTPS separately in Search Console, and use the Index Coverage report to see which pages have been indexed.

How long will a move from HTTP to HTTPS take?

There are no fixed crawl frequencies; it depends on the size of your site and the speed of crawling that’s possible. The move takes place on a per-URL basis.

We reference our HTTP sitemaps in robots.txt. Should we update the robots.txt to include our new HTTPS sitemaps?

We recommend that you update your robots.txt file to point to the HTTPS version of your sitemap files. We also recommend listing only the HTTPS URLs in your sitemap.

Keep in mind that if you’re redirecting each URL on your site from HTTP to HTTPS, you will still have only one robots.txt file accessible to crawlers. For example, if http://example.com/robots.txt redirects to https://example.com/robots.txt, crawlers only ever see the HTTPS version; a separate HTTP robots.txt won’t be visible to Google and other search engines.

Which sitemap should map the section in the HTTPS trial?

You can create a separate sitemap just for the updated section of your site. This will enable you to track indexing of the trial section more precisely. Be sure not to duplicate these URLs in any other sitemaps, though.

What URLs should our sitemaps list if we have redirects (from HTTP to HTTPS or the reverse)?

List all the new HTTPS URLs in your sitemap, and remove the old HTTP URLs. If you prefer creating a new sitemap, list only the new HTTPS URLs in it.
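Rewriting a sitemap from HTTP to HTTPS URLs is mechanical and easy to automate. A sketch using the Python standard library; the URLs are placeholders:

```python
from xml.etree import ElementTree as ET

def https_sitemap(old_urls):
    """Build a sitemap listing only the new HTTPS versions of the URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in old_urls:
        # Rewrite the scheme; HTTPS URLs pass through unchanged.
        if url.startswith("http://"):
            url = "https://" + url[len("http://"):]
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

print(https_sitemap(["http://www.example.com/", "http://www.example.com/widgets"]))
```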

Are there any other specific things we need to add to the robots.txt for the HTTPS version?
No.


Should we support HSTS?

HSTS increases security, but adds complexity to your rollback strategy. See HTTPS best practices for more information.

We use a single Google News sitemap for our entire site. What do we do if we’re migrating our site piece by piece? 

If you want to use a Google News sitemap for the new HTTPS section, you will have to contact the News team to let them know about the protocol change, and then in your HTTPS property in Search Console you can submit a new Google News sitemap as you migrate each section of your site to HTTPS.

Are there any specific recommendations for Google News Publisher Center with HTTPS migration? 

Google News Publisher Center handles HTTP to HTTPS moves transparently. In general you don’t have to do anything from a Google News perspective, unless you’re also making use of News sitemaps. In that case, contact the News team and let them know about the change. You can also let the team know about changing sections, for example if you’re moving a section of your site to HTTPS.

Prepare the new site

The details of site preparation vary for each site move, but typically you’ll do one or more of the following:

  • Set up a new content management system (CMS) and add content to it.
  • Transfer images and downloads (such as PDF documents) that you currently host.
    These might already be getting traffic from Google Search or links, and it’s useful to tell users and Googlebot about their new location.
  • For a move to HTTPS, get and configure the required TLS certificates on your server.

Set up a robots.txt for your new site

The robots.txt file for a site controls which areas Googlebot can crawl. Make sure the directives in the new site’s robots.txt file correctly reflect the parts you want blocked from crawling.

Note that some site owners block all crawling while in development. If you follow this strategy, make sure you prepare what the robots.txt file should look like once the site move starts. Likewise, if you use noindex directives during development, prepare a list of URLs from which you’ll remove the noindex directives when you start the site move.
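For example, a development copy is often blocked entirely, and the block must be lifted when the move starts. Hypothetical robots.txt fragments:

```
# robots.txt while the new site is still in development: block all crawling.
User-agent: *
Disallow: /

# robots.txt once the site move starts: allow crawling again
# (adjust the Disallow rules to the parts you actually want blocked).
User-agent: *
Disallow:
```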

Provide errors for deleted or merged content

For content on the old site that won’t be transferred to the new site, make sure those orphaned URLs correctly return an HTTP 404 or 410 error response code. You can return the error response code at the old URL in the configuration panel for your new site, or you can create a redirect to a new URL that returns the HTTP error code.

Avoid irrelevant redirects

Don’t redirect many old URLs to one irrelevant destination, such as the home page of the new site. This can confuse users and might be treated as a soft 404 error. However, if you have consolidated content previously hosted on multiple pages to a new single page, it is acceptable to redirect the older URLs to that new, consolidated page.

Ensure correct Search Console settings

A successful site move depends on correct—and up to date—Search Console settings.

If you haven’t already, verify that you own both the old and new sites in Search Console. Be sure to verify all variants of both the old and new sites. For example, verify both www.example.com and example.com, and include both the HTTPS and HTTP variants if you use HTTPS URLs. Do this for both the old and new sites.

Review the Search Console verification

Make sure your Search Console verification will continue to work after the site move. If you’re using a different method of verification, keep in mind that verification tokens may be different when the URL changes.

If you’re using the HTML file method to verify ownership of your site in Search Console, make sure you don’t forget to include your current verification file in your new copy of the site.

Likewise, if you verify ownership with an include file that references a meta tag or Google Analytics, ensure the new CMS copy includes these as well.

Review any configured settings in Search Console

If you had changed some of the configuration settings in Search Console for your old site, make sure the new site’s settings are updated to reflect those changes as well. For example:

  • URL parameters: If you’ve configured URL parameters to control the crawling or indexing of your old URLs, make sure the settings are also applied to the new site if needed.
  • Geotargeting: Your old site might have explicit geotargeting, such as a geotargeting setting in Search Console or a country-code top-level domain. Apply the same setting to the new site if you want to continue targeting the same region. However, if your site move is meant to help your business expand globally and you don’t want your site associated with any country or region, select Unlisted in the drop-down list of the Site Settings page.
  • Crawl rate: We recommend not limiting Googlebot’s crawl rate in Search Console for either the old or new URLs; only configure a crawl rate limit if you know your site cannot handle Googlebot’s volume of crawling. If you have already limited Googlebot’s crawl rate for your old site, consider removing the limit. Google has algorithms that automatically detect that a site move has been implemented, and we alter Googlebot’s crawling behavior so that our indexing quickly reflects the move.
  • Disavowed backlinks: If you’ve uploaded a file to disavow links on your old site, we recommend that you re-upload it again using the Search Console account of the new site.

Clean up your recently purchased domain

If your new site is for a recently purchased domain, you’ll want to make sure it’s clean of any outstanding issues from the previous owner. Check the following settings:

  • Manual action for previous spam. For sites that don’t comply with our spam policies, Google is willing to take manual action, such as demoting them or even removing them from our search results altogether. Check the Manual Actions page in Search Console to see if any manual actions have been applied to the new site, and address any problems listed there before filing a reconsideration request.
  • Removed URLs. Make sure that there aren’t any URL removals left over from the previous owner, especially a site-wide URL removal. Also, before submitting URL removal requests for your content, make sure that you understand when not to use the URL removals tool.

Use web analytics

During a site move, it’s important to analyze usage on both the old and new sites. Web analytics software can help with this. Typically, web analytics configuration consists of a piece of JavaScript embedded in your pages. The details for tracking different sites varies depending on your analytics software and its logging, processing, or filtering settings. Check with your analytics software provider for help. Additionally, if you have been planning to make any configuration changes to your analytics software, now is a good time. If you use Google Analytics, consider creating a new profile for your new site if you want clean separation in your content reports.

Ensure that your server has enough computing resources

After a migration, Google will crawl your new site more heavily than usual. This is because your site redirects traffic from the old to the new site, and any crawls of the old site will be redirected to the new site, in addition to any other crawling. Ensure that your new site has sufficient capacity to handle the increased traffic from Google.

Update Data Highlighter

If you used Data Highlighter to map your old pages, be sure to redo the mapping for your new site.

Update app links

As soon as your HTTPS pages are ready, update any app links intended to open your web pages in an app when displayed in Google Search results. Update these links to point to the new HTTPS URLs. Redirects won’t work for these links; unless you update your app link handling, clicks will open the page in the mobile browser instead of the app.

Prepare URL mapping 

It’s important to map your old site’s URLs to the URLs for the new site. This section describes a number of general approaches you can take to correctly assess the URLs on your two sites and facilitate mapping. The exact details of how you generate this mapping will vary depending on your current website infrastructure and the details of the site move.

Determine your old URLs

In the simplest of site moves, you may not need to generate a list of your old URLs. For example, if you’re changing your site’s domain, you can use a wildcard server-side redirect that sends every path on the old domain to the same path on the new one.
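
In practice a wildcard redirect like this lives in your server configuration, but the underlying logic is simple enough to sketch. The hostnames below are hypothetical; the point is that the path and query string are preserved unchanged:

```python
from urllib.parse import urlsplit, urlunsplit

OLD_HOST = "www.old-example.com"   # hypothetical hostnames
NEW_HOST = "www.new-example.com"

def wildcard_target(url):
    """Rewrite any URL on the old host to the same path and query on the new host."""
    parts = urlsplit(url)
    if parts.hostname != OLD_HOST:
        return None  # not ours to redirect
    return urlunsplit(("https", NEW_HOST, parts.path, parts.query, parts.fragment))
```

Because every old path maps to the same path on the new host, no per-URL mapping table is needed in this case.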

In more complex site moves, you will need to generate a list of old URLs and map them to their new destinations. How you get a listing of old URLs depends on your current website’s configuration, but here are some handy tips:

  • Start with your important URLs. To find them:
    • Look in your sitemaps; your most important URLs have likely already been submitted to Search Console that way
    • Check your server logs or analytics software for the URLs that get the most traffic
    • Check the Links to your site feature in Search Console for pages that have internal and external links
  • Use your content management system, which can typically provide an easy way to get a listing of all URLs that host content.
  • Check your server logs for URLs that were visited at least once recently. Pick a time period that makes sense for your site, keeping in mind seasonal variation of traffic.
  • Include images and videos—Make sure that you include URLs of embedded content in your site move plans: videos, images, JavaScript, and CSS files. These URLs need to be moved in the same way as all other content on your website.
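
As one illustration of the server-log tip above, a short script can pull the set of unique requested paths out of access-log lines. The sample lines below are hypothetical and assume the common log format; adapt the regular expression to whatever format your server actually writes:

```python
import re

# Hypothetical access-log lines in the common log format.
LOG_LINES = [
    '203.0.113.5 - - [10/Oct/2023:13:55:36 +0000] "GET /products/widget HTTP/1.1" 200 2326',
    '203.0.113.9 - - [10/Oct/2023:13:56:01 +0000] "GET /about HTTP/1.1" 200 512',
    '203.0.113.5 - - [10/Oct/2023:13:57:12 +0000] "GET /products/widget HTTP/1.1" 200 2326',
]

# Matches the request path inside the quoted request line.
REQUEST_RE = re.compile(r'"(?:GET|POST|HEAD) (\S+) HTTP/')

def extract_paths(lines):
    """Return the set of unique URL paths requested in the log lines."""
    paths = set()
    for line in lines:
        match = REQUEST_RE.search(line)
        if match:
            paths.add(match.group(1))
    return paths
```

Running this over a few weeks of logs gives you a de-duplicated starting list of old URLs to feed into the mapping.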

Create a mapping of old to new URLs

Once you have the listing of old URLs, decide where each one should redirect to. How you store this mapping depends on your servers and the site move. You might use a database, or configure some URL rewriting rules on your system for common redirect patterns.
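
If you store the mapping in code or a database, it’s worth resolving multi-step moves up front so that every old URL points straight at its final destination, which avoids redirect chains later. A minimal sketch, using a hypothetical mapping:

```python
# Hypothetical one-hop mapping gathered during planning. Note that one
# target ("/about") was itself moved later, creating a potential chain.
RAW_MAPPING = {
    "/old-about.html": "/about",
    "/about": "/company/about",
    "/old-news": "/blog",
}

def flatten(mapping):
    """Point every old URL at its final destination so no redirect chains remain."""
    flat = {}
    for old, new in mapping.items():
        seen = {old}
        while new in mapping and new not in seen:
            seen.add(new)          # guard against redirect loops
            new = mapping[new]
        flat[old] = new
    return flat
```

After flattening, "/old-about.html" redirects directly to "/company/about" in one hop instead of passing through "/about".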

Update all URL details

Once you have your URL mapping defined, you’ll want to do three things to get the pages ready for the move.

  1. Update annotations to point to the new URLs in the HTML or sitemaps entry for each page:
    1. Each destination URL should have a self-referencing rel="canonical" <link> tag.
    2. If the site you moved has multilingual or multinational pages annotated using rel-alternate-hreflang annotations, be sure to update the annotations to use the new URLs.
    3. If the site you moved has a mobile counterpart, make sure you update the rel-alternate-media annotations to use the new URLs. Learn more in our smartphone websites guidelines.
  2. Update internal links.
    Change the internal links on the new site from the old URLs to the new URLs. You can use the mapping generated earlier to help find and update the links as needed.
  3. Save the following lists for your final move:
    • A sitemap file containing the new URLs in the mapping. See our documentation about building a sitemap.
    • A list of sites linking to your old URLs. You can find the links to your site in Search Console.
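
For the first saved list, the sitemap of new URLs can be generated directly from your mapping. A minimal sketch using Python’s standard library (the URLs are hypothetical):

```python
from xml.etree import ElementTree as ET

# Hypothetical final destination URLs taken from the old-to-new mapping.
NEW_URLS = [
    "https://www.new-example.com/",
    "https://www.new-example.com/about",
]

def build_sitemap(urls):
    """Return a minimal sitemap XML document listing the given URLs."""
    root = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        entry = ET.SubElement(root, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(root, encoding="unicode")
```

Write the resulting string to a file such as sitemap.xml and submit it in Search Console during the move.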

Prepare for 301 redirects

Once you have a mapping and your new site is ready, the next step is to set up HTTP 301 redirects on your server from the old URLs to the new URLs as you indicated in your mapping.

Keep in mind the following:

  • Use HTTP 301 redirects. Although Googlebot supports several kinds of redirects, we recommend that you use HTTP 301 redirects if possible.
  • Avoid chaining redirects. While Googlebot can follow up to 10 hops in a “chain” of multiple redirects (for example, Page 1 > Page 2 > Page 3), we advise redirecting to the final destination. If this is not possible, keep the number of redirects in the chain low, ideally no more than 3 and fewer than 5. Chaining redirects adds latency for users, and not all user agents and browsers support long redirect chains.
  • Test the redirects. You can use the URL Inspection tool to test individual URLs, or command line tools or scripts to test large numbers of URLs.
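
For the scripted approach to testing many redirects, something like the following can work. It spins up a throwaway local server as a stand-in for the old site so the sketch is self-contained; in practice you would point check_redirect at your real old URLs, and the mapping here is hypothetical:

```python
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical old-to-new mapping; in practice, load your real mapping.
MAPPING = {"/old-page": "/new-page", "/old-blog/post": "/news/post"}

class OldSiteHandler(BaseHTTPRequestHandler):
    """Stand-in for the old site: 301s mapped paths, 404s everything else."""
    def do_HEAD(self):
        if self.path in MAPPING:
            self.send_response(301)
            self.send_header("Location", MAPPING[self.path])
        else:
            self.send_response(404)
        self.end_headers()

    def log_message(self, *args):
        pass  # keep output quiet

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so each hop can be inspected."""
    def redirect_request(self, *args, **kwargs):
        return None

def check_redirect(base_url, path):
    """Return (status code, Location header) for a single HEAD request."""
    opener = urllib.request.build_opener(NoRedirect)
    request = urllib.request.Request(base_url + path, method="HEAD")
    try:
        response = opener.open(request)
        return response.status, response.headers.get("Location")
    except urllib.error.HTTPError as err:
        return err.code, err.headers.get("Location")

# Throwaway local server standing in for the old site.
server = HTTPServer(("127.0.0.1", 0), OldSiteHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_address[1]}"
```

Each old URL can then be checked with check_redirect and compared against the expected target from your mapping, flagging wrong status codes or wrong destinations in bulk.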

Start the site move

Once the URL mapping is accurate and the redirects work, you’re ready to move.

  1. Decide how you will move your site — all at once, or in sections:
    • Small or medium sites: We recommend moving all URLs on your site simultaneously instead of moving one section at a time. This helps users interact with the site better in its new form, and helps our algorithms detect the site move and update our index faster.
    • Large sites: You can choose to move larger sites one section at a time. This can make it easier to monitor, detect, and fix problems faster.
  2. Update your robots.txt files:
    • On the old site, remove all robots.txt directives. This allows Googlebot to discover all redirects to the new site and update our index. Keep in mind that crawlers will no longer see the contents of the old robots.txt file once you enable the redirects.
    • On the new site, make sure the robots.txt file allows all crawling. This includes crawling of images, CSS, JavaScript, and other page assets, apart from the URLs you are certain you do not want crawled.
  3. Configure the old website to redirect users and Googlebot to the new site based on the URL mapping.
  4. Submit a Change of Address in Search Console for the old site. If you’re moving your site from HTTP to HTTPS, you don’t need to use the Change of Address tool.
  5. Keep the redirects for as long as possible, generally at least 1 year. This timeframe allows Google to transfer all signals to the new URLs, including recrawling and reassigning links on other sites that point to your old URLs. From the users’ perspective, consider keeping redirects indefinitely. However, redirects add latency, so try to update your own links and any high-volume links from other websites to point to the new URLs.
  6. Submit the new sitemap in Search Console. This will help Google learn about the new URLs. At this point you can remove your old sitemap, since Google will use the new sitemap going forward.

The time it takes Googlebot and our systems to discover and process all URLs in the site move depends on how fast your servers are and how many URLs are involved. As a general rule, a medium-sized website can take a few weeks for most pages to move, and larger sites take longer. Note that the visibility of your content in web search may fluctuate temporarily during the move. This is normal, and a site’s rankings will settle down over time.

Immediately after the site move is started, try to update as many incoming links as possible to improve the user experience and reduce your server load. These include:

  • External links: Try to contact the sites in the saved list of sites linking to your old content, asking them to update their links to point to your new site. Consider prioritizing by the number of visits each link brings in.
  • Profile links, for example on Facebook, Twitter, and LinkedIn.
  • Ad campaigns: Update them to point to the new landing pages.

Monitor traffic

Once you’ve started the site move, monitor how user and crawler traffic changes on both the new site and the old site. Ideally, traffic on the old site will drop while traffic on the new site rises. You can monitor user and crawler activity on the sites with Search Console and other tools.

Use Search Console to monitor traffic

Many features of Search Console help you monitor a site move, including:

  • Sitemaps: Submit the two sitemaps you saved earlier from the mapping. Initially, the sitemap containing the new URLs will show zero pages indexed, while the sitemap of the old URLs will show many pages indexed. Over time, the number of pages indexed from the old sitemap will drop to zero, with a corresponding increase in indexing of the new URLs.
  • Index Coverage report: The graphs will reflect the site move, showing a drop in indexed URL counts on the old site and a corresponding increase on the new site. Check regularly for any unexpected crawl errors.
  • Search queries: As more pages of the new site get indexed and start ranking, the search queries reports will start showing the URLs on the new site getting impressions and clicks.

Use other tools to monitor traffic

Keep an eye on your server access and error logs. In particular, check for crawling by Googlebot, any URLs that unexpectedly return HTTP error status codes, and normal user traffic.
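
As a sketch of what scanning the logs for unexpected error statuses might look like, the following pulls out 4xx and 5xx hits. The sample lines and combined log format are assumptions; adjust the parsing to your server’s format:

```python
import re

# Hypothetical combined-log-format lines.
LOG_LINES = [
    '66.249.66.1 - - [12/Oct/2023:08:01:00 +0000] "GET /new-page HTTP/1.1" 200 1024 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [12/Oct/2023:08:01:05 +0000] "GET /ghost-page HTTP/1.1" 404 512 "-" "Googlebot/2.1"',
]

# Matches the status code that follows the quoted request line.
STATUS_RE = re.compile(r'" (\d{3}) ')

def error_hits(lines):
    """Return the lines whose HTTP status code is a 4xx or 5xx error."""
    return [line for line in lines
            if (m := STATUS_RE.search(line)) and m.group(1)[0] in "45"]
```

Reviewing the flagged lines regularly after the move surfaces redirects that point at non-existent pages before they affect indexing.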

If you installed any web analytics software on your site, or if your CMS provides analytics, it’s also recommended that you review traffic this way so that you can see the progress of traffic from your old to new site. In particular, Google Analytics offers real-time reporting, and this is a handy feature to use during the initial site move phase. You should expect to see traffic drop on the old site and rise on the new site.

Troubleshooting your site move

Here are some common mistakes when migrating a site with URL changes (including HTTP to HTTPS). These mistakes can prevent your new site from being indexed completely.

Common mistakes

  • noindex or robots.txt blocks: Don’t forget to remove any noindex directives or robots.txt blocks that were only needed for the migration. It’s fine not to have a robots.txt file at all, but be sure to quickly return a proper 404 if the file is requested but not provided. To test, examine the robots.txt file on your new site to see if anything needs to change, and use the URL Inspection tool for any pages that seem to be missing from Google on the new site.
  • Incorrect redirects: Check your redirects from the old site to the new one. We frequently see people redirecting to the wrong (non-existent) URLs on the new site.
  • Other crawl errors: Examine the Index Coverage report for a spike in other errors on your new site during the migration.
  • Insufficient capacity: After a migration, Google crawls your new site more heavily than usual, because crawls of the old site are redirected to the new site in addition to any other crawling. Ensure that your site has sufficient capacity to handle the increased traffic from Google.
  • Not updating app links: If you open your web pages within your app, update the app links to the new URLs before you implement the redirects from old to new pages. Otherwise, Google won’t suggest opening the new URLs in the app from search results, and will direct users to the website in the browser instead.
  • Not updating sitemaps: Be sure that your sitemaps are all updated with the new URLs.
  • Not updating Data Highlighter: If you used Data Highlighter to map your old pages, redo the mappings for your new site.

This page covers how to ensure that testing variations in page content or page URLs has minimal impact on your Google Search performance. It does not give instructions on how to build or design tests, but you can find more resources about testing at the end of this page.

Overview of testing

Website testing is when you try out different versions of your website (or a part of your website) and collect data about how users react to each version.

  • A/B testing is where you test two (or more) variations of a change. For example, you may test different fonts on a button to see if you can increase button clicks.
  • Multivariate testing is where you test more than one type of change at a time, looking for the impact of each change as well as potential synergies between the changes. For example, you might try several fonts for a button, but also try changing (and not changing) the font of the rest of the page at the same time. Is a new font easier to read and so should be used everywhere? Or is the benefit that the button font looks different to the rest of the page, helping it draw attention?

You can use software to compare behavior with different variations of your pages (parts of a page, entire pages, or entire multi-page flows), and track which version is most effective with your users.

You can run tests by creating multiple versions of a page, each with its own URL. When users try to access the original URL, you redirect some of them to each of the variation URLs and then compare users’ behavior to see which page is most effective.

You can also run tests without changing the URL by inserting variations dynamically on the page. You can use JavaScript to decide which variation to display.

Depending on what types of content you’re testing, it may not even matter much if Google crawls or indexes some of your content variations while you’re testing. Small changes, such as the size, color, or placement of a button or image, or the text of your “call to action” (“Add to cart” vs. “Buy now!”), can have a surprising impact on users’ interactions with your page, but often have little or no impact on that page’s search result snippet or ranking.

In addition, if we crawl your site often enough to detect and index your experiment, we’ll probably index the eventual updates you make to your site fairly quickly after you’ve concluded the experiment.

Best practices when testing

Here is a list of best practices to avoid any bad effects on your Google Search behavior while testing site variations:

Don’t cloak your test pages

Don’t show one set of URLs to Googlebot and a different set to humans. This is called cloaking, and it’s against our spam policies, whether you’re running a test or not. Remember that infringing our spam policies can get your site demoted or removed from Google Search results, which is probably not the desired outcome of your test.

Cloaking counts regardless of how you do it, whether by server logic, by robots.txt, or by any other method. Instead, use links or redirects as described next.

If you’re using cookies to control the test, keep in mind that Googlebot generally doesn’t support cookies. This means it will only see the content version that’s accessible to users with browsers that don’t accept cookies.

If you’re running a test with multiple URLs, add the rel="canonical" link annotation to all of your alternate URLs, pointing at the original URL as the preferred version. We recommend rel="canonical" rather than a noindex meta tag because it more closely matches your intent in this situation: if you’re testing variations of your home page, you don’t want search engines to skip indexing it; you just want them to understand that all the test URLs are close duplicates or variations of the original URL and should be grouped together, with the original URL as the canonical. Using noindex rather than rel="canonical" in this situation can sometimes have unexpected bad effects.

Use 302 redirects, not 301 redirects

If you’re running a test that redirects users from the original URL to a variation URL, use a 302 (temporary) redirect, not a 301 (permanent) redirect. This tells search engines that the redirect is temporary (it will only be in place as long as you’re running the experiment) and that they should keep the original URL in their index rather than replacing it with the target of the redirect (the test page). JavaScript-based redirects are also fine.
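
The bucketing logic behind such a test can be as simple as the following sketch. The split ratio and variant URL are hypothetical, and a real testing tool would normally handle this assignment for you:

```python
import random

# Hypothetical experiment: half of users see a variant home page.
VARIANT_URL = "/home-variant-b"
TEST_FRACTION = 0.5

def choose_redirect(in_test=None):
    """Decide whether a request gets a temporary redirect to the variant.

    Returns (302, variant URL) for users in the test bucket, or None to
    serve the original page. A 302 (not 301) keeps the original URL in
    the search index while the experiment runs.
    """
    if in_test is None:
        in_test = random.random() < TEST_FRACTION
    return (302, VARIANT_URL) if in_test else None
```

A production version would also pin each user to a bucket (for example via a cookie) so repeat visits see a consistent variation.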

Run the experiment only as long as necessary

The amount of time required for a reliable test will vary depending on factors like your conversion rates, and how much traffic your website gets; a good testing tool tells you when you’ve gathered enough data to draw a reliable conclusion. Once you’ve concluded the test, update your site with the desired content variation(s) and remove all elements of the test as soon as possible, such as alternate URLs or testing scripts and markup. If we discover a site running an experiment for an unnecessarily long time, we may interpret this as an attempt to deceive search engines and take action accordingly. This is especially true if you’re serving one content variant to a large percentage of your users.

More information about testing

Temporarily pause or disable a website

If you’re unable to fulfill orders, or many of your products are out of stock, you may be considering temporarily closing your online business. If the situation is temporary, meaning you expect to be able to sell products again in the coming weeks or months, we recommend taking action that preserves as much of your site’s standing in Search as possible. This guide explains how you can safely pause your online business.

Limit your site’s functionality (recommended) 

If your situation is temporary and you plan to reopen your online business, we recommend that you keep your site online and limit its functionality. This is the recommended approach because it minimizes any negative effects on your site’s presence in Search. People can still find your products, read reviews, and add products to wishlists so they can purchase later. We recommend doing the following:

  • Disable the cart functionality: Disabling the cart functionality is the simplest approach, and doesn’t change anything for your site’s visibility in Search.
  • Display a banner or popup: A banner or popup div on all pages including the landing page quickly makes the status clear to users. Mention any known and unusual delays, shipping times, pick-up or delivery options, so that users continue with the right expectations. To prevent the content in the banner or popup from being shown in a snippet in Search results, use the data-nosnippet HTML attribute. Make sure to follow our guidelines on popups and banners.
  • Update your structured data: If your site uses structured data (for example, Product, Book, or Event), make sure to adjust it appropriately, reflecting the current product availability or changing events to cancelled. If your business has a physical storefront, update your Local Business structured data to reflect current opening hours.
  • Check your Merchant Center feed: If you use Merchant Center, follow the best practices for the availability attribute.
  • Tell Google about your updates: To ask Google to recrawl a limited number of pages (for example, the homepage), use Search Console. For a larger number of pages (for example, all of your product pages), use sitemaps.

Not recommended: Disable the whole website 

Warning: Google’s systems are designed to be robust and to help websites recover from temporary issues. However, removing a site completely from Google’s index is a significant change that can take quite some time to recover from. There’s no fixed time for a recovery from a complete removal, and there’s no mechanism to speed that up. This is why we strongly recommend limiting functionality instead of removing the site from Search.

You may decide to disable the whole website. This is an extreme measure that should only be taken for a very short period of time (a few days at most), as it will otherwise have significant effects on the website in Search, even when implemented properly.

Make sure that you consider the following side effects of disabling your entire site:

  • Your customers won’t know what’s happening with your business if they can’t find your business online at all.
  • Your customers can’t find or read first-hand information about your business and its products or services. For example, reviews, specs, repair guides, or manuals won’t be findable. Third-party information may not be as correct or comprehensive as what you can provide, and this often also affects future purchase decisions.
  • Knowledge Panels may lose information, like contact phone numbers and your site’s logo.
  • Search Console verification will fail, and you will lose all access to information about your business in Search. Aggregate reports in Search Console will lose data as pages are dropped from the index.
  • Ramping back up after a prolonged period of time will be significantly harder if your website needs to be reindexed first. Additionally, it’s uncertain how long this would take, and whether the site would appear similarly in Search afterwards.

If you decide that you need to do this (again, not recommended), here are some options:

Best practices for disabling a site 

Warning: Keep in mind that it’s not possible for Google’s systems to refresh titles, descriptions, metadata, or structured data included on a website if a page returns a 503 HTTP response status code.

While we don’t recommend disabling your site, here are some best practices if you decide to do this:

  • Continue to allow crawling through the robots.txt file. Don’t return a 503 HTTP response status code for the robots.txt file because this blocks all crawling.
  • Confirm a 503 HTTP response status code locally by using curl or a similar tool. For example:

    curl -I -X GET "https://example.com/"

    HTTP/1.1 503 Service Unavailable
    Mime-Version: 1.0
    Content-Type: text/html
    (…)

  • To minimize the server-side and client-side load of a 503 error page, follow these best practices:
    • Use the Retry-After HTTP header with a best-effort date or duration.
    • Use static HTML.
    • Minimize off-page resources; use inline CSS and base64-encoded images.
  • Give your users clear guidance on future steps within the content of the error page. This could include:
    • Links to more information
    • The date when you expect the website to be online again, or when the information will be updated
    • How to contact customer service
  • Don’t disallow all crawling in the robots.txt file. Returning a valid robots.txt file that disallows all crawling may remove the website’s content, and potentially its URLs, from Google Search.
  • Don’t block the website by returning 403, 404, or 410 HTTP status codes, or with a noindex robots meta tag or X-Robots-Tag HTTP header. These will remove the website’s URLs from Google Search.
  • Don’t use the temporary website removal tool in Search Console for closures. Doing so prevents users from finding your website at all, including to learn its status. Also, potential resellers or affiliates of your business’s products may continue to appear in Search.
  • Don’t block your robots.txt file with a 503 HTTP response status code.


FAQs

  • What if I only close the site for a few weeks?
  • What if I want to exclude all non-essential products?
  • Can I ask Google to crawl less while my site is temporarily closed?
  • How do I get a page indexed or updated quickly?
  • What if I block a specific region from accessing my site?
  • Should I use the Removals Tool to remove out-of-stock products?