Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Subdomain replaced domain in Google SERP
-
Good morning,
This is my first post. I found many Q&As here that mostly answer my question, but just to be sure we do this right, I'm hoping the community can take a peek at my thinking below:
Problem: We rank #1 for "custom poker chips", for example. We have a development website on a subdomain (http://dev.chiplab.com). On Saturday our live 'chiplab.com' main domain was replaced by 'dev.chiplab.com' in the SERP.
Expected Cause: We did not add a NOFOLLOW meta tag to the page header. We also did not DISALLOW the subdomain in robots.txt. We could have also put the 'dev.chiplab.com' subdomain behind a password wall.
Solution: Add a NOFOLLOW header tag, and update the subdomain's robots.txt to disallow crawling/indexing.
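[Editor's note: for reference, the meta tag version of this solution is a sketch like the following, placed on every page served from the dev subdomain. As the replies in this thread point out, NOINDEX, not NOFOLLOW, is the directive that actually keeps pages out of the index.]

```html
<!-- On every page served from dev.chiplab.com -->
<meta name="robots" content="noindex, nofollow">
```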
Question: If we remove the subdomain from Google using WMT, will this drop us completely from the SERP? In other words, we would ideally like our root chiplab.com domain to replace the subdomain to get us back to where we were before Saturday. If the removal tool in WMT just removes the link completely, then is the only solution to wait until the site is recrawled and reindexed and hope the root chiplab.com domain ranks in place of the subdomain again?
Thank you for your time,
Chase
-
Hi Chase,
Removing dev via Webmaster Tools should do the trick for now. Since Google won't get to dev anymore, you should be safe.
Adding both noindex and password protection is not needed: since the site is password protected, Google won't get to see the noindex on the pages anyway. So you should only do one of the two. No need to change anything now; the password protection is safe.
> As expected 'dev.chiplab.com' was removed from the SERP. Now, I'm a bit worried that the link equity was transferred for good to the subdomain from 'www.chiplab.com'. That's not possible, right?
Correct, that's not possible, so you are good.
Google only passes equity through 301 redirects, so all good.
-
No worries, that's what this community is here for!
Google views subdomains as different entities. They have different authority metrics and therefore different ranking power. Removing a URL on a subdomain won't have any effect on its sibling over on a different subdomain (for example: dev. and www.).
Good call to keep the disallow: / in the dev.chiplab.com/robots.txt file - I forgot to mention that you should leave it there, for anti-crawling purposes.
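[Editor's note: the disallow being described here is a two-line dev.chiplab.com/robots.txt. Remember it only blocks crawling, not indexing.]

```
User-agent: *
Disallow: /
```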
This is the query you'll want to keep an eye on. The info: operator can be used to show you what Google has indexed as your 'canonical' homepage.
-
Hi Logan,
Last follow-up. I swear.
Since I'm pretty new to this, I got scared and cancelled the 'dev.chiplab.com' link removal request. I did this because I didn't want to go up to 14 days without any traffic (that's the estimated time I found that the Google SERP can take to update, even though we used "Fetch as Googlebot" in GWT). Am I wrong on the SERP update time?
So what I did was add a 301 permanent redirect from 'dev.chiplab.com' to 'www.chiplab.com'. I've kept the NOFOLLOW/NOINDEX header on all 'dev' subdomain pages, of course, and I've kept the DISALLOW in robots.txt for the dev.chiplab.com site specifically. So now I just plan on doing work in the 'dev' site (because I can't test anything with the redirects happening), and then hopefully in 14 days or so the domain name will change gracefully in the Google SERP from dev.chiplab.com to www.chiplab.com. I did all of this because of how many sales we would lose if it took 14 days to start ranking again for this term. Good?
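[Editor's note: the redirect described here would look something like the following sketch, assuming an Apache server with mod_rewrite enabled; the actual server setup isn't stated in the thread.]

```apache
# Hypothetical .htaccess on dev.chiplab.com
RewriteEngine On
RewriteCond %{HTTP_HOST} ^dev\.chiplab\.com$ [NC]
RewriteRule ^(.*)$ http://www.chiplab.com/$1 [R=301,L]
```

One caveat worth knowing: a robots.txt disallow can prevent Googlebot from fetching the dev URLs at all, in which case it may never see the 301.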
Best,
Chase
-
You should be all set! I wouldn't worry about link equity, but it certainly wouldn't hurt to keep an eye on your domain authority over the next few days.
-
Hi Logan,
Thanks for fast reply!
We did the following:
- Added NOINDEX on the entire subdomain
- Temporarily removed 'dev.chiplab.com' using Google Webmaster Tools
- Password protected 'dev.chiplab.com'
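[Editor's note: for anyone replicating the password protection step, HTTP basic auth is the usual approach; a sketch assuming Apache, with a hypothetical credentials file path:]

```apache
# .htaccess on dev.chiplab.com
AuthType Basic
AuthName "Development site"
AuthUserFile /etc/apache2/.htpasswd
Require valid-user
```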
As expected 'dev.chiplab.com' was removed from the SERP. Now, I'm a bit worried that the link equity was transferred for good to the subdomain from 'www.chiplab.com'. That's not possible, right? Do we now just wait until GoogleBot crawls 'www.chiplab.com' and hope that it is restored to #1?
Thank you for your time (+Shawn, +Matt, +eyqpaq),
Chase
-
noindex would be the easiest way.
I've seen some people with the same issue fix it by adding rel=canonical tags on the dev pages pointing to the main site, so the main site came back step by step with no interruptions...
Cheers.
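[Editor's note: the rel=canonical approach described above is a tag on each dev page pointing at its live equivalent; the path below is illustrative.]

```html
<!-- On dev.chiplab.com/custom-poker-chips (hypothetical path) -->
<link rel="canonical" href="http://www.chiplab.com/custom-poker-chips">
```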
-
Just like Chase said, noindex your dev site to let the search engines know that it should not show in search. I do this on my dev sites every time.
-
The ideal method would be to password protect the dev site. What I would do is 301 redirect the dev pages to the corresponding live site pages, and then once the SERP refreshes, make the dev site password protected.
-
Hi Chase,
Removing the subdomain within Search Console (WMT) will not remove the rest of your WWW URLs. Since you have different properties in Search Console for each, they are treated separately. That removal is only temporary though.
The most sure-fire way to ensure you don't get dev. URLs indexed is to put a NOINDEX tag on that entire subdomain. NOFOLLOW simply means that links on whatever page that tag is on won't be followed by bots.
Remember, crawling and indexing are different things. For example, if on your live www. site you had an absolute link somewhere in the mix that had dev.chiplab.com in it, since you presumably haven't nofollowed your live site, a bot will still access that page. The same situation goes for a robots.txt disallow. That only prevents crawling, not indexing. In theory, a bot can get to a disallowed URL and still index it. See this query for an example.
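[Editor's note: one way to apply NOINDEX across an entire subdomain without editing every page template is an HTTP response header; a sketch assuming Apache with mod_headers enabled:]

```apache
# Hypothetical site-wide config for dev.chiplab.com
Header set X-Robots-Tag "noindex, nofollow"
```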