Using GeoDNS across 3 server locations
-
Hi,
I have multiple servers across the UK and USA. I have a website that serves both regions, and I was looking at cloning my sites and using GeoDNS to route visitors to the closest server to improve speed and user experience.
So UK visitors would connect to the UK dedicated server, North American visitors to the New York server, and so on.
Is this a good approach, or would it affect SEO negatively?
Cheers
Keith
-
Hi Keith,
I meant the figurative bandwidth - i.e. your time. I probably should have been clearer in a technical forum!
For the architecture, there are a few common setups. What I'm in the middle of doing at my company runs through Google Cloud: duplicating the website app or script (e.g. WordPress, Ghost, Drupal, another CMS, a Python app, a Rails app) across several servers and using a load balancer to pick the fastest one. In each app's configuration I point to a single database server, also set up on Google Cloud, so when one server writes a change, it is reflected for all users on all servers. If you sync your servers with cron jobs but have no common database, you're going to have integrity issues, with some servers carrying certain comments or edits and others not.
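As a concrete illustration - a sketch with placeholder values, not my exact setup - in a WordPress deployment every front-end server's wp-config.php would point at the same database host:

```php
<?php
// Hypothetical wp-config.php excerpt: all web servers share one database
// instance, so a write on any server is visible on all of them.
define( 'DB_NAME',     'sitedb' );      // placeholder database name
define( 'DB_USER',     'siteuser' );    // placeholder user
define( 'DB_PASSWORD', 'change-me' );   // placeholder password
define( 'DB_HOST',     '10.0.0.5' );    // private IP of the shared database instance
```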
-
Hi,
I have quite a lot of servers dotted around the UK and USA, so hosting and bandwidth are no big issue. If I host solely in the UK, the ping time is a whopping 100ms+ from the USA (and vice versa), so that leads me to hosting in at least both countries, keeping latency at 10-20ms and TTFB nice and low.
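As a quick way to check this, curl's timing variables report DNS lookup time and TTFB from wherever the command is run (the URL is a placeholder):

```bash
# Show DNS lookup time, TTFB, and total load time for a page
curl -o /dev/null -s -w "DNS: %{time_namelookup}s  TTFB: %{time_starttransfer}s  Total: %{time_total}s\n" https://www.xyz.com/
```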
I like the idea of creating and maintaining one major site, as everything will be English-based, and any backlinks will always point to the .com rather than being split across multiple domains. SEO-wise I'm not too bothered; I will be focusing on speed and entertaining people with info on what they're looking for - to me this is more important than the rest.
All servers are cPanel-based, so I will try to find a solution to replicate the sites in real time or at cron-based intervals. This will be the next challenge.
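A simple starting point might be a cron-based rsync push from a master server - a rough sketch with hypothetical paths and hostnames:

```bash
# Crontab entry: every 15 minutes, mirror the document root to a
# regional server over SSH (paths and hosts are placeholders)
*/15 * * * * rsync -az --delete -e ssh /home/xyz/public_html/ xyz@us1.example.com:/home/xyz/public_html/
```

That would cover the files; any database-driven content would still need a shared or replicated database.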
If I can pull this off, it will be great for other sites I have too.
Regards
Keith
-
Personally, I would use the one domain. And from what you've said, you would prefer it as well.
Thankfully, rankings are on a domain basis and not an IP basis, so there would be no issue in the first scenario. If you are duplicating and synchronizing the servers, you are better off using the one domain because you aren't creating two separate websites with differing content (UK English vs US English).
Do you have the bandwidth or ability to produce separate versions (for each domain) for each area you want to target? If not, you are best off generalizing your website to target all English users instead of en-US, en-GB, etc. You're going to have to evaluate your geotargeting goals and budget.
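If you did go the separate-versions route, hreflang annotations are the standard way to tell Google which version targets which audience - a minimal sketch with placeholder URLs:

```html
<link rel="alternate" hreflang="en-us" href="https://www.xyz.com/" />
<link rel="alternate" hreflang="en-gb" href="https://www.xyz.co.uk/" />
<link rel="alternate" hreflang="x-default" href="https://www.xyz.com/" />
```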
-
Hi,
Many thanks for your input.
I was planning to use ClouDNS GeoIP to send visitors to the server for their region.
Option one: have one website - www.xyz.com - duplicated across three servers (locations), so everyone sees the same site. This would maintain the backlinks, and whether Google crawls from the USA or the UK, it will see it as one domain, just with three IPs in use. Option two: have www.xyz.com and www.xyz.co.uk as duplicates, set the geotargeting in Google Webmaster Tools, and set the languages to en-US and en-GB.
Not sure which is the best solution. www.xyz.com has the most backlinks and DA, whereas www.xyz.co.uk has zero and would be brand new to the world.
I would rather people generate backlinks for the one domain as well.
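With option one, the same hostname would simply resolve to a different address depending on where the lookup comes from - conceptually something like this (illustrative RFC 5737 documentation IPs):

```
; One hostname, region-dependent answers
www.xyz.com.   300   IN   A   203.0.113.10    ; answer served to UK visitors
www.xyz.com.   300   IN   A   198.51.100.20   ; answer served to US visitors
```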
Your thoughts are welcome
Regards
Keith
-
The way GeoDNS works is through one of two methods: split DNS or load balancing. The end result is the same: the user is directed to the closest or fastest available server.
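With a managed provider this is configured through their control panel, but for a self-hosted equivalent, a split-DNS setup might look like the following BIND sketch (an assumption-laden example: it requires BIND built with GeoIP support, and the domain and zone file names are placeholders):

```
acl "uk-clients" { geoip country GB; };

view "uk" {
    match-clients { "uk-clients"; };
    zone "example.com" { type master; file "db.example.com.uk"; };  # A record points at the UK server
};

view "default" {
    match-clients { any; };
    zone "example.com" { type master; file "db.example.com.us"; };  # A record points at the US server
};
```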
Theoretically, this helps achieve a major goal of technical SEO - great site speed.
With this year's Google Core Web Vitals update, site speed and user experience have been further notched up as ranking factors. To get more technical: LCP (Largest Contentful Paint) measures how long the largest element on a page takes to render, and FCP (First Contentful Paint) measures how long until the first legible content appears on screen; both are site-speed signals used in Google's ranking algorithm. By connecting a user to the closest/fastest available server, you can bring down LCP and FCP times and thereby improve your rankings. The change may not be immediately noticeable, depending on the competitiveness of your keywords and industry. You can measure these and other variables here: https://developers.google.com/speed/pagespeed/insights/
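You can also pull these metrics programmatically from the PageSpeed Insights API (v5) - for example (placeholder URL):

```bash
# Fetch Lighthouse lab metrics (FCP, LCP, etc.) for a page
curl "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=https://www.example.com/&strategy=mobile"
```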
In short: no, your SEO won't be negatively impacted - if anything, it will be positively impacted by these optimizations.