Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will remain viewable), we have locked both new posts and new replies. More details here.
Multiple Domains on 1 IP Address
-
We have multiple domains on the same C-block IP address. Our main site is an eCommerce site, and we have separate domains for each of the following: our company blog (and other niche blogs), a forum site, an articles site, and a corporate site. They are all on the same server, hosted by the same web-hosting company.
They all have unique, different content. Speaking strictly from a technical standpoint, could this be hurting us? Can you recommend best practices for multiple domains like these, with regard to having separate or shared IP addresses?
Thank you!
-
Sorry, I'm confused about the setup. Hosts routinely run multiple sites off of shared IPs, but each domain name resolves as itself. Users and search bots should never see that redirection at all and shouldn't be crawling the IPs. This isn't an SEO issue so much as a setup issue. Likewise, any rel=canonical tags on each site would be tied to that site's specific domain name.
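To illustrate the point about canonicals above, here is a sketch of what a rel=canonical tag tied to a site's own hostname looks like (the URL is an example, matching the hypothetical domain used later in this thread):

```html
<!-- In the <head> of www.domainname.com/product-page: the canonical
     references this site's own hostname, never the shared IP. -->
<link rel="canonical" href="http://www.domainname.com/product-page" />
```

Each site on the shared IP would carry tags pointing at its own domain, so the tags are not duplicates of each other even though the sites share hosting.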
-
Hello Peter,
We have three sites hosted on the same server with the same IP address. For SEO reasons (to avoid duplicate content) we need to redirect the IP address to the site, but there are three different sites. If we add rel=canonical tags to the websites, those tags will be mirrored too, since every page is accessible at both the domain and the IP, e.g. www.domainname.com/product-page and 23.34.45.99/product-page. What's the best way to solve these duplicate content issues in this case? Many thanks!
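One common way to handle the IP-vs-domain duplication described above is a server-level 301 redirect. This is a sketch only, assuming Apache with mod_rewrite enabled and using the example IP and domain from the question; note that requests to a bare IP only reach whichever site is the server's default virtual host, so the rule belongs in that default site's configuration:

```apache
# .htaccess sketch: 301-redirect any request that arrives under the
# bare IP (23.34.45.99, from the example above) to the canonical host.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^23\.34\.45\.99$
RewriteRule ^(.*)$ http://www.domainname.com/$1 [R=301,L]
```

Combined with rel=canonical tags that point at each site's own domain, search engines should consolidate signals on the domain URLs rather than the IP versions.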
-
I think that situation's a bit different - if you aren't interlinking and the sites are very different (your site vs. customer sites), there's no harm in shared hosting. If you share the IP and one site is hit with a severe penalty, there's a small chance of bleedover, but we don't even see that much these days. Now that we're running out of IPv4 addresses, shared IPs are a lot more common (by necessity).
-
I have something similar. I'm with HostGator on a level 5 VPS. It comes with 4 IP addresses, and I have about 15 sites, some mine and some customer sites, spread out over the addresses. There is very little interlinking between the sites, but I was concerned too. I have read that add-on sites are bad for SEO, but as long as you aren't building crappy sites purely to link to your main site, you should be fine.
-
I think @cgman and @Nakul are both right, to a point. Technically, it's fine. Google doesn't penalize shared IPs (they're fairly common). If you're cross-linking your sites, though, it's very likely Google will devalue those links. That tactic has just been abused too much, and a shared IP is a dead giveaway.
Now, is it worth splitting all these out to gain a little more link-juice? In most cases, probably not. Google knows you own the sites, and may devalue them anyway. Chances are, they've already been devalued a bit. So, I don't think it's worth hours and hours and thousands of dollars to give them all their own homes, in most cases (it is highly situational, though).
The only other potential problem is if one site were penalized - there have been cases where that impacted sites on the same IP, especially cross-linked sites. It's not common, and you may not be at any risk, but it's not unheard of. As @Nakul said, it's a risk calculation.
-
I am presuming all those domains are linking to each other, correct?
Are they regular or nofollow links? It boils down to how much authority you have on your main domain, as well as on the other domains. If I were you, I would keep the main eCommerce website on one server and everything else, including the niche blogs, on a different server. It's not just an SEO consideration, but also a security one.
Essentially, to answer your question: it may not be hurting you to have the niche blogs, the user-generated forum, the articles site and the corporate site on the same IP/server, but it would help you a lot more if they were on a different server, possibly on different C-block IPs. You would gain more from those links if they came from a different server. Keep in mind that these links are important to you, and it's worth increasing their value by hosting the sites separately, because they are links your competition can never get. I would also consider adding nofollow to them; that's just my preference, because I favor lower risk. Again, it depends on your eCommerce website's link profile.
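For reference, the nofollow suggestion above is just an attribute on the cross-site links; a minimal sketch, with a hypothetical store URL:

```html
<!-- A nofollowed cross-site link from a niche blog to the main
     eCommerce site (URL is hypothetical) -->
<a href="http://www.example-store.com/" rel="nofollow">Visit our store</a>
```

This tells search engines not to pass link equity through the cross-links between your own properties, trading away any ranking benefit for lower risk.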
-
There is nothing wrong with having multiple sites or blogs on the same C-block IP address. However, if you're trying to use your blogs to link to your products to boost SEO, you might want to consider other link building techniques in addition. Building backlinks from sites on the same IP is okay, but you'll see greater benefit from links on sites hosted on other servers.