Temporarily shut down a site
-
What would be the right way to temporarily shut down a site without having a negative impact on SEO?
-
I asked the Q&A associates their opinion, and several people also responded that a 503 would be the way to go.
-
It is due to a legal matter, so we need to shut it down.
-
Can you give us some more details about the shutdown (the reasons, why it needs to be so long, etc)? We can help you a bit better if we know more information.
When we switched from SEOmoz.org to a-moz.groupbuyseo.org, we were only down for half an hour, if that. If this is about upgrading, is there a testing server that you can use to get the website rebuilt and tested on the testing/staging server before you make it live? We used multiple staging servers to test out the site and did lots of checks so that we had minimal downtime when it came time to move the site.
-
What if it is more than a week?
-
I'm also assuming that you're talking about just a day or two, and not two months. There was a post on Moz last year about this that can also help, in addition to the good info provided by CleverPhD: http://a-moz.groupbuyseo.org/blog/how-to-handle-downtime-during-site-maintenance
-
Appreciate the positive comment, EGOL!
-
That was a great answer. Thanks. I didn't know that.
-
Thank you - please mark my response as Good Answer if it helps.
Cheers!
-
Thank you
-
According to Matt Cutts
"According to Google's Distinguished Engineer Matt Cutts if your website is down just for a day, such as your host being down or a server transfer, there shouldn't be any negative impact to your search rankings. However, if the downtime is extended, such as for two weeks, it could have impact on your search rankings because Google doesn't necessarily want to send the user to a website that they know has been down, because it provides the user with a poor user experience.
Google does make allowances for websites that are sporadically having downtime, so Googlebot will visit again 24 hours later so and see if the site is accessible."
That said, what should you show Google?
http://yoast.com/http-503-site-maintenance-seo/
According to Yoast, you should not show Google a 200 (OK) or a 404 (Not Found), but a 503 code on all pages, along with a Retry-After header.
The 503 status (http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html) tells Google: "The server is currently unable to handle the request due to a temporary overloading or maintenance of the server. The implication is that this is a temporary condition which will be alleviated after some delay. If known, the length of the delay MAY be indicated in a Retry-After header. If no Retry-After is given, the client SHOULD handle the response as it would for a 500 response."
The Retry-After header tells Google when to come back. Set it to a generous value so you have plenty of time to get everything back up and running.
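To make that concrete, here is a minimal sketch of a maintenance responder that answers every request with a 503 and a Retry-After header. It uses only the Python standard library and is purely illustrative; in practice you would normally configure this in your web server or CMS rather than run a standalone script, and the port, the 24-hour retry value and the placeholder HTML are assumptions rather than anything Google or Yoast prescribe.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

RETRY_AFTER_SECONDS = 86400  # ask crawlers to retry in 24 hours; set this generously

BODY = b"<html><body><h1>Down for maintenance</h1></body></html>"  # placeholder page


class MaintenanceHandler(BaseHTTPRequestHandler):
    def _send_503(self, include_body=True):
        # 503 signals a temporary outage; Retry-After hints when to come back.
        self.send_response(503)
        self.send_header("Retry-After", str(RETRY_AFTER_SECONDS))
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(BODY)))
        self.end_headers()
        if include_body:
            self.wfile.write(BODY)

    def do_GET(self):
        # Every path, including /robots.txt, gets the same 503 + Retry-After.
        self._send_503()

    def do_HEAD(self):
        self._send_503(include_body=False)


if __name__ == "__main__":
    # Port 8080 is an arbitrary placeholder for this sketch.
    HTTPServer(("", 8080), MaintenanceHandler).serve_forever()
```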
Another point from Yoast, linking to https://plus.google.com/+PierreFar/posts/Gas8vjZ5fmB: if the robots.txt file itself returns a 503, Google will stop crawling all your pages until it sees a 200 on robots.txt again, so it doesn't waste crawl budget on a site that is down. It is therefore key that the 503 and Retry-After are set properly on robots.txt as well.
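If you want to double-check that the 503 and Retry-After are actually being served, including on robots.txt, a quick standard-library check like the sketch below prints the status code and header for each path. The example.com domain is a placeholder for your own site.

```python
import urllib.error
import urllib.request

# example.com is a placeholder for your own domain.
for path in ("/", "/robots.txt"):
    url = "https://example.com" + path
    try:
        urllib.request.urlopen(url, timeout=10)
        print(f"{url}: got a successful response, expected 503")
    except urllib.error.HTTPError as err:
        # A correct maintenance setup shows 503 plus a Retry-After header here.
        print(f"{url}: status={err.code} Retry-After={err.headers.get('Retry-After')}")
```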
Cheers!