Best way to block a sub-domain from being indexed
-
Hello,
The search engines have indexed sub-domains I did not want indexed: they're on old.domain.com and dev.domain.com. I was going to password-protect them, but is there a best-practice way to block them?
My main domain's default robots.txt says:
Sitemap: http://www.domain.com/sitemap.xml
# global
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/cache/
Disallow: /wp-content/themes/
Disallow: /trackback/
Disallow: /feed/
Disallow: /comments/
Disallow: /category/*/*
Disallow: */trackback/
Disallow: */feed/
Disallow: */comments/
Disallow: /*?
-
Hi,
CleverPhD has some interesting ideas with robots.txt and Google Webmaster Tools, but simply password-protecting all dev pages should keep them out of Google's index. There's no extra best practice needed here, since a password wall will keep Googlebot out on its own.
To be doubly safe, you can also include a meta noindex tag on dev pages.
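For example, a minimal version of that tag in each dev page's <head> would be:
<meta name="robots" content="noindex, nofollow">
If editing the dev templates is awkward, the same directive can be sent as an HTTP response header from the dev server instead: X-Robots-Tag: noindex.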
Keep in mind that once a page is in Google's index, it's going to take a while for it to leave (unless you use CleverPhD's method). But having a blank page in Google's index really isn't all that bad. It's there, but it won't rank for much.
Hope this helps,
Kristina
-
I've never tried a method like this - FreshFireOne, did you?
-
First and foremost, when you finish all this: password-protect your dev instances. A URL will leak out eventually, and then this happens. I know it is a PIA, but it is worth it.
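On Apache, for example, a minimal sketch of that password wall in the subdomain's .htaccess would be the following (the .htpasswd path is a placeholder; create the file with the htpasswd utility):
AuthType Basic
AuthName "Dev site - keep out"
# Placeholder path - point this at your real htpasswd file
AuthUserFile /path/to/.htpasswd
Require valid-user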
To remove the subdomains, go into GWT and register them as separate websites. Create a robots.txt for each subdomain (not the one you mention; you need a robots.txt specific to that subdomain that disallows all files, like the two-line sketch below). If you can't do that, have your subdomains include a noindex meta tag on all pages. You have to be careful with either approach, as you do not want to push the dev robots.txt or the noindex meta tags out to your production server, but it can be done. Talk to your devs. Then go into GWT and use the URL removal tool. Just leave it blank and it will remove the whole site.
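As a sketch, that catch-all robots.txt served from each subdomain (e.g. at dev.domain.com/robots.txt) only needs two lines:
User-agent: *
Disallow: /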
Poof. Gone. You can then watch the GWT accounts. They will show errors for the dev site like "Severe health issues are found on your site - Some important page has been removed by request." This is a good error, as it confirms that the subdomain has been removed.
We actually used this not on a dev site but on our www1 server, which had been indexed. We use a load balancer with multiple copies of the site, and www1 was competing with www. The approach above did the trick.