Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Why is my server's IP address showing up in Webmaster Tools?
-
In the "Links to your site" report in Google Webmaster Tools, I am seeing over 28,000 links from an IP address. The IP address is the one my server is hosted on. For example, it shows 200.100.100.100/help, almost as if there are two copies of my site: one under the domain name and one under the IP address. Is this bad? Or is it just showing up there, and Google knows it is the same site since the IP and domain point to the same server?
-
Hmmm, this is a weird one. My guess is that since Google originally found those links through the IP address (maybe before your site launched, while the pages were already live and linked to), it keeps returning to them. In that case, there's not much you can do but keep those canonicals on.
Canonicals really can save you from duplicate content problems: I've had clients with multiple versions of every page depending on the path you take to reach it, and canonicals have allowed them to rank well and avoid penalties entirely. As long as you're doing everything else right, this hopefully shouldn't be too much of an issue.
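For anyone reading along who hasn't set one up, a canonical is a single tag in each page's head. A minimal sketch (the domain and path here are placeholders, using the example IP's /help page from the question):

```html
<!-- Placed in the <head> of the page, on both the domain and IP versions.
     Replace www.example.com with your real domain. -->
<link rel="canonical" href="https://www.example.com/help" />
```

Whichever URL Google crawls (domain or IP), this tells it which single version you want indexed.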
Sorry this ended up falling on you!
-
According to the latest links in Webmaster Tools, the first time it happened was October 2012, which is before the site launched. It seems to have accelerated this year. It is a total of 16,341 links, but under linked pages it only shows 27.
-
Hmm, it could have, though. When did you first notice these backlinks from the IP address in GWT?
-
I am unsure, to be honest. We had an organic traffic drop in 2012 the week of the Penguin release. We launched a new site last year, which killed organic traffic, so I am trying to improve our rankings. I can say confidently that we have had no warnings in Webmaster Tools, but maybe it has hurt traffic.
-
Well, from an SEO perspective, this hasn't led to any penalties or reduced rankings, right?
-
Recently we switched to HTTPS, so I started using a self-referential rel="canonical" on all my pages. I can't figure this out, and nobody else can either. I am on all sorts of boards, forums, and groups, and nobody has ever heard of this. I just don't get it.
-
Did you add canonicals, at least, to make sure that Google wouldn't find duplicate content? That's what I'd be most worried about, from an SEO perspective.
-
I never solved the problem. I made a new post to see if anything has changed. It seems strange that nobody else has ever had this problem. I looked all over Google and found nothing. I just ran Screaming Frog and nothing showed up.
-
How is this going? Did you solve the problem?
One quick note: if you can't find a link to the IP address on your site (or a link to a broken URL or an old domain), run a Screaming Frog or Xenu crawl and look at all the external links. There's probably a surprise footer link or something similar causing the problem, and it would be easy to miss manually. But the tools find everything!
Good luck.
-
Yeah, it's generally a DNS setup issue. If you're hosting with a company, the best thing to do is open a ticket and have them walk through it with you. Most providers have their own admin panels.
-
I have looked and can't find anything on the site that links to the IP. I have looked in Webmaster Tools and it doesn't show any duplicate content. We are on a Windows server; do you think it would be pretty easy to redirect the IP to the domain?
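On a Windows server, this kind of redirect would normally live in web.config. A rough sketch, assuming IIS with the URL Rewrite module installed (200.100.100.100 is the example IP from the question, and www.example.com is a placeholder for the real domain):

```xml
<!-- web.config sketch: requires the IIS URL Rewrite module.
     301-redirects any request whose Host header is the bare IP
     to the same path on the canonical domain. -->
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="Redirect IP to domain" stopProcessing="true">
          <match url="(.*)" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="^200\.100\.100\.100$" />
          </conditions>
          <action type="Redirect" url="https://www.example.com/{R:1}"
                  redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

If the URL Rewrite module isn't available, the hosting company can usually set up the same host-header redirect from their side.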
-
There might be a link or something directing the crawlers to your site's IP address instead of the original domain. There is potential for getting flagged for duplicate content, but I feel it's fairly unlikely. You do want to fix this, though, as it would hamper your backlink efforts. These steps will correct the issue.
1. Set up canonical tags on all your pages. This tells Google which single URL should be indexed for each page, whether the crawler reaches it via the IP or the domain.
2. Set your host up so that anything arriving via the IP is automatically redirected to the domain. This can be done through your hosting company, through .htaccess, or through PHP. I suggest you do it with the hosting company.
3. Check through your site and make sure no links point to the IP address. If there are no links pointing to the IP, the crawler shouldn't follow them.
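For step 2 on an Apache host, the .htaccess version might look like the sketch below (this assumes mod_rewrite is enabled; 200.100.100.100 is the example IP from the question and www.example.com is a placeholder for the real domain):

```apacheconf
# .htaccess sketch: 301-redirect requests that arrive via the bare IP
# to the same path on the canonical domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^200\.100\.100\.100$
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

On a Windows/IIS server, the equivalent rule would go in web.config instead, or the hosting company can apply it at their level.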