Temporarily shut down a site
-
What would be the best way to temporarily shut down a site without having a negative impact on SEO?
-
I asked the Q&A associates their opinion, and several people also responded that a 503 would be the way to go.
-
It is due to a legal matter, so we need to shut it down.
-
Can you give us some more details about the shutdown (the reasons, why it needs to be so long, etc)? We can help you a bit better if we know more information.
When we switched from SEOmoz.org to a-moz.groupbuyseo.org, we were only down for half an hour, if that. If this is about upgrading, is there a testing server that you can use to get the website rebuilt and tested on the testing/staging server before you make it live? We used multiple staging servers to test out the site and did lots of checks so that we had minimal downtime when it came time to move the site.
-
What if it is more than a week?
-
I'm also assuming that you're talking about just a day or two, and not two months. There was a post on Moz last year about this that can also help, in addition to the good info provided by CleverPhD: http://a-moz.groupbuyseo.org/blog/how-to-handle-downtime-during-site-maintenance
-
Appreciate the positive comment, EGOL!
-
That was a great answer. Thanks. I didn't know that.
-
Thank you - please mark my response as Good Answer if it helps.
Cheers!
-
Thank you
-
According to Matt Cutts
"According to Google's Distinguished Engineer Matt Cutts if your website is down just for a day, such as your host being down or a server transfer, there shouldn't be any negative impact to your search rankings. However, if the downtime is extended, such as for two weeks, it could have impact on your search rankings because Google doesn't necessarily want to send the user to a website that they know has been down, because it provides the user with a poor user experience.
Google does make allowances for websites that are sporadically having downtime, so Googlebot will visit again 24 hours later so and see if the site is accessible."
That said, what should you show Google?
http://yoast.com/http-503-site-maintenance-seo/
According to Yoast, you should not show Google a 200 (OK) or a 404 (Not Found), but a 503 code on all pages, along with a Retry-After header.
The 503 (http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html) tells Google: "The server is currently unable to handle the request due to a temporary overloading or maintenance of the server. The implication is that this is a temporary condition which will be alleviated after some delay. If known, the length of the delay MAY be indicated in a Retry-After header. If no Retry-After is given, the client SHOULD handle the response as it would for a 500 response."
The Retry-After header tells Google when to come back. Set it to a generous value so you have plenty of time to get everything back up and running, as in the sketch below.
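To make this concrete, here is a minimal sketch (not Moz's or Yoast's code) of a maintenance responder that answers every request with a 503 and a Retry-After header, using only Python's standard library. The port and the 24-hour retry window are assumptions; adjust them to your own setup and expected downtime.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

RETRY_AFTER_SECONDS = 86400  # assumed 24-hour window; set to your expected downtime

class MaintenanceHandler(BaseHTTPRequestHandler):
    BODY = b"<html><body><h1>Down for maintenance</h1></body></html>"

    def _send_503_headers(self):
        self.send_response(503)  # Service Unavailable
        self.send_header("Retry-After", str(RETRY_AFTER_SECONDS))  # when crawlers should return
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(self.BODY)))
        self.end_headers()

    def do_GET(self):   # same 503 for every path, including /robots.txt
        self._send_503_headers()
        self.wfile.write(self.BODY)

    def do_HEAD(self):  # headers only for HEAD requests
        self._send_503_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), MaintenanceHandler).serve_forever()
```

In practice you would usually configure the 503 and Retry-After at the web server or load balancer rather than run a separate process, but the response headers it sends are the important part.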
Another point from Yoast, linking to Pierre Far's post (https://plus.google.com/+PierreFar/posts/Gas8vjZ5fmB): if the robots.txt file returns a 503, Google will stop wasting crawl time on the rest of your pages until it sees a 200 on robots.txt again. So it is key that robots.txt also returns the 503 with the proper Retry-After header.
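As a quick sanity check that the maintenance setup behaves as described, here is a small sketch that requests robots.txt and the homepage and reports the status code and Retry-After header it gets back. "example.com" and the paths are placeholders for your own site.

```python
import urllib.error
import urllib.request

def check_maintenance(url):
    """Return True if the URL answers 503 with a Retry-After header."""
    request = urllib.request.Request(url, method="HEAD")
    try:
        response = urllib.request.urlopen(request)
        status, headers = response.status, response.headers
    except urllib.error.HTTPError as err:  # urllib raises a 503 as an HTTPError
        status, headers = err.code, err.headers
    retry_after = headers.get("Retry-After")
    print(f"{url} -> {status}, Retry-After: {retry_after}")
    return status == 503 and retry_after is not None

if __name__ == "__main__":
    for path in ("/robots.txt", "/"):
        check_maintenance("https://example.com" + path)  # placeholder domain
```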
Cheers!