Multiple Domains on 1 IP Address
-
We have multiple domains on the same C-block IP address. Our main site is an eCommerce site, and we have separate domains for each of the following: our company blog (and other niche blogs), a forum site, an articles site and a corporate site. They are all on the same server and hosted by the same web-hosting company.
They all have unique, different content. Speaking strictly from a technical standpoint, could this be hurting us? Can you recommend best practices for multiple domains like these: should they share an IP address or have separate ones?
Thank you!
-
Sorry, I'm confused about the setup. Hosts routinely run multiple sites off of shared IPs, but each domain name resolves as itself. Users and search bots should never see that redirection at all and shouldn't be crawling the IPs. This isn't an SEO issue so much as a setup issue. Likewise, any rel=canonical tags on each site would be tied to that site's specific domain name.
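What Peter describes is, in server terms, name-based virtual hosting: several domains share one IP and the web server routes each request by its Host header, so every site is reached only under its own name. A minimal sketch, assuming Apache and using hypothetical hostnames and document roots:

# Name-based virtual hosts: the domains share one IP,
# but each request is served by the site matching its Host header.
<VirtualHost *:80>
    ServerName www.domainname.com
    DocumentRoot /var/www/store
</VirtualHost>

<VirtualHost *:80>
    # A separate blog domain on the same IP (hypothetical name)
    ServerName www.exampleblog.com
    DocumentRoot /var/www/blog
</VirtualHost>

With this setup neither visitors nor crawlers ever need to request the bare IP, which is why a shared IP on its own is not an SEO problem.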
-
Hello Peter,
We have three sites hosted on the same server with the same IP address. For SEO reasons (to avoid duplicate content) we need to redirect the IP address to the site, but there are three different sites. If we use rel=canonical tags on the websites, those tags will be duplicated too, as the websites are mirrored versions of the IP-address URLs, e.g. www.domainname.com/product-page and 23.34.45.99/product-page. What's the best way to solve these duplicate content issues in this case? Many thanks!
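One common fix, sketched here on the assumption that the sites run on Apache and that the virtual host Apache serves for bare-IP requests can carry an .htaccess file (the IP and domain below are the placeholders from the question):

RewriteEngine On
# Requests addressed to the bare IP rather than a hostname...
RewriteCond %{HTTP_HOST} ^23\.34\.45\.99$
# ...are 301-redirected to the same path on the canonical domain.
RewriteRule ^(.*)$ http://www.domainname.com/$1 [R=301,L]

A rel=canonical tag on each page pointing at its own domain's URL is a reasonable belt-and-braces addition, but the 301 alone takes the IP-based duplicates out of play.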
-
I think that situation's a bit different - if you aren't interlinking and the sites are very different (your site vs. customer sites), there's no harm in shared hosting. If you share the IP and one site is hit with a severe penalty, there's a small chance of bleedover, but we don't even see that much these days. Now that we're running out of IPv4 addresses, shared IPs are a lot more common (by necessity).
-
I have something similar. I'm with HostGator on a level 5 VPS. It comes with 4 IP addresses, and I have about 15 sites, some mine, some customer sites, spread out over those addresses. There is very little interlinking between the sites, but I was concerned too. I have read that add-on sites are bad for SEO, but as long as you aren't building crappy sites and linking them to your main site, you should be fine.
-
I think @cgman and @Nakul are both right, to a point. Technically, it's fine. Google doesn't penalize shared IPs (they're fairly common). If you're cross-linking your sites, though, it's very likely Google will devalue those links. That tactic has just been abused too much, and a shared IP is a dead giveaway.
Now, is it worth splitting all these out to gain a little more link-juice? In most cases, probably not. Google knows you own the sites, and may devalue them anyway. Chances are, they've already been devalued a bit. So, I don't think it's worth hours and hours and thousands of dollars to give them all their own homes, in most cases (it is highly situational, though).
The only other potential problem is if one site were penalized - there have been cases where that impacted sites on the same IP, especially cross-linked sites. It's not common, and you may not be at any risk, but it's not unheard of. As @Nakul said, it's a risk calculation.
-
I am presuming all those domains are linking to each other, correct?
Are they regular or nofollow links? It boils down to how much authority you have on your main domain as well as on the other domains. If I were you, I would keep the main e-commerce website on one server and everything else, including the niche blogs, on a different server. It's not just an SEO question; there are security considerations too.
Essentially, to answer your question: it may not be hurting you to have the niche blogs, the forum with user-generated content, the articles site and the corporate site on the same IP/server, but it would help you a lot more if they were on a different server, possibly on different Class C IPs. You would gain from those links coming from a different server. Keep in mind that these links are important to you, and it's worth increasing their value by hosting the sites separately, because they are links your competition can never get. I would also consider making them nofollow; that's just my preference, as I like lower risk. Again, it depends on what your e-commerce website's link profile looks like.
-
There is nothing wrong with having multiple sites and blogs on the same C-block IP address. However, if you're trying to use your blogs to link to your products to boost SEO, you might want to consider other link building techniques in addition. Building backlinks from sites on the same IP is okay, but you'll see greater benefit from links on sites hosted on other servers.
Related Questions
-
GoogleBot still crawling HTTP/1.1 years after website moved to HTTP/2
Whole website moved to the https://www. HTTP/2 version 3 years ago. When we review log files, it is clear that, for the home page, GoogleBot continues to access only via the HTTP/1.1 protocol.
- Robots file is correct (simply allowing all and referring to the https://www. sitemap)
- Sitemap is referencing https://www. pages, including the homepage
- Hosting provider has confirmed the server is correctly configured to support HTTP/2 and provided evidence of accessing via HTTP/2 working
- 301 redirects set up for non-secure and non-www versions of the website, all to the https://www. version
- Not using a CDN or proxy
GSC reports the home page as correctly indexed (with the https://www. version canonicalised) but does still have the non-secure version of the website as the referring page in the Discovery section. GSC also reports the homepage as being crawled every day or so. Totally understand it can take time to update the index, but we are at a complete loss to understand why GoogleBot continues to go only through HTTP/1.1, not HTTP/2. A possibly related issue, and of course what is causing concern, is that new pages of the site seem to index and perform well in the SERP... except the home page. This never makes it to page 1 (other than for the brand name) despite rating multiples higher in terms of content, speed etc. than other pages which still get indexed in preference to the home page. Any thoughts, further tests, ideas, direction or anything will be much appreciated!
Technical SEO | | AKCAC1 -
Redirect typo domains
Hi, what's the "correct" way of redirecting typo domains?
- DNS A record goes to the same IP address as the correct domain name
- Then 301 redirects for each typo domain in the .htaccess
- Subdomains on typo URLs still redirect to www, or should they redirect to the subdomain on the correct URL in case the subdomain exists?
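A minimal sketch of the .htaccess part, assuming Apache and using made-up names (examp1e.com as the typo domain, example.com as the correct one); it catches the typo domain and any subdomain of it and sends the visitor to the same path on the canonical www host:

RewriteEngine On
# Match the typo domain and any subdomain of it (hypothetical names).
RewriteCond %{HTTP_HOST} (^|\.)examp1e\.com$ [NC]
# 301 to the same path on the canonical host.
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]

Whether typo subdomains should go to www or to the matching subdomain on the correct domain is a judgement call; sending everything to www is the simpler choice unless those subdomains carry distinct content.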
Technical SEO | | kuchenchef0 -
Tool to Generate All the URLs on a Domain
Hi all, I've been using xml-sitemaps.com for a while to generate a list of all the URLs that exist on a domain. However, this tool only works for websites with under 500 URLs on a domain. The paid tool doesn't offer what we are looking for either. I'm hoping someone can help with a recommendation. We're looking for a tool that can:
- Crawl, and list, all the indexed URLs on a domain, including .pdf and .doc files (ideally in a .xls or .txt file)
- Crawl multiple domains with unlimited URLs (we have 5 websites with 500+ URLs on them)
Seems pretty simple, but we haven't been able to find something that isn't tailored toward management of a single domain or that can crawl a huge volume of content.
Technical SEO | | timfrick0 -
Umbrella company and multiple domains
I'm really sorry for asking this question yet again. I have searched through previous answers but couldn't see something exactly like this, I think. There is a website called example.com. It is a sort of umbrella company for 4 other separate domains within it (4 separate companies). The home page of the "umbrella" company website is example.com. It is just an image with no content except navigation on it to direct to the 4 company websites. The other pages of website example.com are the 4 separate companies' domains. So on the navigation bar there is:
- Home page = example.com
- company1page = company1domain.com
- company2page = company2domain.com
- etc.
Clicking "home" will take you back to example.com (which is just an image). How bad or good is this structure for SEO? Would you recommend any changes to help them rank better? The "home" page has no authority or links, and neither do 3 out of the 4 other domains. The 4 companies' websites are independent in content (although the theme is the same). What's bringing them all together is being under this umbrella website, example.com. Thank you
Technical SEO | | AL123al0 -
Correct linking to the /index of a site and subfolders: what's the best practice? Link to domain.com/ or domain.com/index.html?
Dear all, starting with my .htaccess file:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^www.inlinear.com$ [NC]
RewriteRule ^(.*)$ http://inlinear.com/$1 [R=301,L]

RewriteCond %{THE_REQUEST} ^.*/index.html
RewriteRule ^(.*)index.html$ http://inlinear.com/ [R=301,L]

1. I redirect all URL requests with www. to the non-www version...
2. All requests for "index.html" are redirected to "domain.com/"

My questions are:
A) When linking from a page to my front page (home), the best practice is "http://domain.com/" and NOT "http://domain.com/index.php", right?
B) When linking to the index of a subfolder, e.g. "http://domain.com/products/index.php", I should also link to "http://domain.com/products/" and not include the index.php, right?
C) When I define the canonical URL, should I define it as just "http://domain.com/products/", or should I in this case point to the definite file "http://domain.com/products/index.php"?
Is A) B) the best practice? And C)? Thanks for all replies! 🙂
Holger
Technical SEO | | inlinear0 -
Transfer a Main Domain to a Sub-Domain
My IT department tells me they want to transfer my main site domain, which has been in existence since 1999 as an e-commerce site (maindomain.com) to a sub-domain (www2.maindomain.com) or a completely new domain (newdomain.net). This is because we are launching a new website and B2C e-commerce engine, but we still have to maintain the legacy B2B e-commerce engine which contains hard-coded URLs, and both systems can't use the same domain. I've been researching the issue across SEOmoz, but I haven't come across this exact type of scenario (mostly I've seen a sub-domain to new domain). I see major problems with their proposal, including negative SEO impact, loss of domain authority/ranking and issues with branding. Does anyone know the exact type of impact I can expect to see in this scenario and specific steps I should go about to minimize the impact? Btw, I will be using Danny Dover's guide on properly moving domains where appropriate. Thanks!
Technical SEO | | AscendLearning0 -
What is the best method to block a sub-domain, e.g. staging.domain.com/ from getting indexed?
Now that Google considers subdomains as part of the TLD, I'm a little leery of testing robots.txt on staging.domain.com with something like:

User-agent: *
Disallow: /

in fear it might get www.domain.com blocked as well. Has anyone had any success using robots.txt to block sub-domains? I know I could add a meta robots tag to the staging.domain.com pages, but that would require a lot more work.
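One lower-effort alternative to per-page meta robots tags, sketched on the assumption that the staging site runs on Apache with mod_headers and has its own vhost or document root, is to send the directive as an HTTP header for the whole staging host:

# In the staging host's own vhost or .htaccess (never the live site's):
<IfModule mod_headers.c>
    # Ask crawlers not to index or follow anything served by this host.
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>

Unlike a robots.txt Disallow, this lets crawlers fetch the staging pages and see the noindex, so any staging URLs that have already been picked up can drop out of the index, and it cannot accidentally block www.domain.com.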
Technical SEO | | fthead90 -
Issue with .uk.com domain
Hi, I have rockshore.uk.com which is not indexing properly. The internal pages do not show up for the text they have on them, or for their title tags. The site is on the aekmps shops platform. I understand that a .uk.com is not a proper TLD, but I think I have a subdomain of .uk.com. Can anyone help? Thanks
Technical SEO | | Turkey0