Google Indexed a version of my site w/ MX record subdomain
-
We're doing a site audit and found "internal" links in Search Console that appear to come from a subdomain of our site based on our MX record. We use Google Mail internally. The links ultimately redirect to our correct preferred subdomain, "www", but I am concerned about why this is happening and whether it can have any negative SEO implications.
Example of one of the links:
aspmx3.googlemail.com.sullivansolarpower.com/about/solar-power-blog/daniel-sullivan/renewable-energy-and-electric-cars-are-not-political-footballs
I also did a site: operator search on Google, site:aspmx3.googlemail.com.sullivansolarpower.com, and it returns several results.
-
You appear to have the MX sub-domain also set up as an A record.
If you have a Mac or Linux machine, you can run the command: host aspmx3.googlemail.com.sullivansolarpower.com
Currently that returns: aspmx3.googlemail.com.sullivansolarpower.com has address 72.10.48.198
whereas it should return "not found".
I think you want to delete that A record (though check your email provider's documentation first). Mail delivery only needs the MX records; the A record shouldn't be necessary. A quick way to compare the two record types is sketched below.
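For reference, a rough way to compare the two record types from a terminal (dig and host are standard DNS lookup tools; the hostnames are simply the ones from this thread, so adjust as needed):

dig +short MX sullivansolarpower.com
(lists the MX records that mail delivery actually needs)

dig +short A aspmx3.googlemail.com.sullivansolarpower.com
(should show the same 72.10.48.198 address as the host output above; once the stray A record is removed it should return nothing)

host -t A aspmx3.googlemail.com.sullivansolarpower.com
(the same check with host; after the fix it should report that the host is not found)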
You've done the right thing by setting up the redirect, which should mean that the pages drop out of the index and those links disappear. (Note that there is also an HTTPS error on the aspmx3 sub-domain, but given that you don't actually want that sub-domain, I don't suppose it matters much.)
Hope that helps.
-
I did not explain the problem thoroughly. The problem is that the link does not actually exist anywhere. To make a very long story short, there was an issue with our server configuration for a couple of months. During that time, an unknown number of non-existent subdomains got indexed. Basically, if anyone made a typo in the subdomain when accessing our site, that typo hostname would get cached, and if Google crawled our site before we cleared the cache, the typo subdomain would get indexed. Over a couple of months, many bad subdomains were accidentally created and indexed by Google, and we have no way of compiling a comprehensive list of all of them. The configuration problem has been resolved, so no new bad subdomains are being created and indexed, but the damage has been done.
The way our site is set up currently, any attempt to reach it on a subdomain other than "www" gets redirected to "www.sullivan..." Also, any non-secure request gets redirected to https://.
The actual problem, simply put, is this: Google's index contains some unknown number of non-existent subdomains of our site. We need to get rid of them and cannot figure out how.
Example: Copy and paste the following into Google and search it:
site:aspmx3.googlemail.com.sullivansolarpower.com
Google will return two results. If you click on either, it resolves to the "https://www." version of the page (a command-line check of the same behaviour is sketched below).
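As a rough illustration (assuming curl is available; any of the indexed URLs should behave the same way):

curl -sI http://aspmx3.googlemail.com.sullivansolarpower.com/about/solar-power-blog/daniel-sullivan/renewable-energy-and-electric-cars-are-not-political-footballs
(the headers should show a 3xx status and a Location header pointing at the https://www. version of the page)

Adding -L, i.e. curl -sIL, follows the redirect chain through to the final 200 response on the https://www. page.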
I know it is confusing, but does that make sense? I have searched everywhere, but the reason this happened was a perfect storm of server configuration issues, and I cannot find anyone else who has had the same problem.
If it were only one or two bad subdomains, we would just add each one as a property in Search Console and use "Remove URL" for the entire subdomain. But it is not one or two; it is at least ten that I know of, and it could be hundreds for all I know. For the ones we do know about, a quick check is sketched below.
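A minimal sketch of that check (known-bad-subdomains.txt is a hypothetical file listing the bad hostnames we know about, one per line):

while read host; do
  echo "== $host"
  curl -sI "http://$host/" | grep -iE '^(HTTP|Location)'
done < known-bad-subdomains.txt

Each hostname should print a 3xx status and a Location header pointing at the https://www. version, which matches the behaviour described above; the open question is what to do about the subdomains we cannot enumerate.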
Does anyone have any ideas? Any and all would be welcome.
Thank you.
-
You should find the locations of those links and correct them to point to the proper URLs. I find Screaming Frog's crawl the easiest way to do this: it finds every link and shows you which pages they are located on.
Related Questions
-
Site Audit Tools Not Picking Up Content Nor Does Google Cache
Hi Guys, Got a site I am working with on the Wix platform. However, site audit tools such as Screaming Frog, Ryte, and even Moz's on-page crawler show the pages as having no content, despite them having 200+ words. Fetching the site as Google clearly shows the rendered page with content; however, when I look at the Google cached pages, they also show just blank pages. I have had issues with nofollow and noindex on here, but it shows the meta tags correctly, just zero content. What would you look at to diagnose this? I am guessing some rogue JS, but why wasn't this picked up on the "fetch as Google"?
Technical SEO | nezona
-
How is my preproduction website getting indexed in Google?
Hi team, Can anybody please help me figure out how my preproduction website and URLs are getting indexed in Google?
Technical SEO | nlogix
-
Does Google index internal anchors as separate pages?
Hi, Back in September, I added a function that sets an anchor on each subheading (h2-h6) and creates a table of contents that links to each of those anchors. These anchors did show up in the SERPs as jump-to links. Fine. Back then I also changed the canonicals to a slightly different structure, and meanwhile there was a massive increase in the number of indexed pages - way over the top - which has since been fixed by removing (410) a complete section of the site. However, there are still ~34,000 pages indexed where there should really be more like 4,000 (all properly canonicalised). Naturally I am wondering what Google thinks it is indexing; the number is just way off and quite inexplicable. So I was wondering: does Google save jump-to links as unique pages? Also, does anybody know a method of actually getting all the pages in the Google index? (Not the actually existing pages via Screaming Frog etc., but the actual pages in the index - all methods I found sadly do not work.) Finally: does anybody have any other explanation for the incongruency between indexed and actual pages? Thanks for your replies! Nico
Technical SEO | netzkern_AG
-
My video sitemap is not being indexed by Google
Dear friends, I have a video portal. I created a video sitemap.xml and submitted it to GWT, but after 20 days it has not been indexed. I have verified it in Bing Webmaster Tools as well. All videos are dynamically fetched from the server. All my static pages have been indexed, but not the videos. There are no separate pages for single videos; all the content comes dynamically from the server. Please help me figure out where I am making a mistake. Your answers will be much appreciated. Thanks
Technical SEO | docbeans
-
Removed Subdomain Sites Still in Google Index
Hey guys, I've got kind of a strange situation going on and I can't seem to find it addressed anywhere. I have a site that at one point had several development sites set up at subdomains. Those sites have since launched on their own domains, but the subdomain sites are still showing up in the Google index. However, if you look at the cached version of pages on these non-existent subdomains, the blurb that says "This is Google's cached version of www.correcturl.com" lists the NEW URL, not the dev one. Clearly Google recognizes that the content resides at the new location, so how come the old pages are still in the index? Attempting to visit one of them gives a "Server Not Found" error, so they are definitely gone. This is happening to a couple of sites, one of which was launched over a year ago, so "wait and see" does not appear to be the solution. Any suggestions would be a huge help. Thanks!!
Technical SEO | SarahLK
-
Redirecting HTTP to HTTPS - How long does it take Google to re-index the site?
Hello Moz, We know that this year Moz changed its domain to a-moz.groupbuyseo.org from www.seomoz.org; however, when you type "site:seomoz.org" you can still find old URLs indexed on Google (on page 7 and above). We also changed our site from http://www.example.com to https://www.example.com, and Google is indexing both versions even though we did a proper 301 redirection via htaccess. How long would it take Google to refresh the index? Should we just not worry about it? Say we redirect our entire site: what is going to happen to those websites that copied and pasted our content? We have already filed DMCA complaints against their webpages, but making our site https would mean that their website now looks more original than our site, so would Google assume that we copied their site? (Google is very slow in responding to our DMCA complaints.) Thank you in advance for your reply.
Technical SEO | joony
-
Is Google's caching date the same as its crawling/indexing date?
If a site is cached on, say, 9 Oct 2012, doesn't that also mean that Google crawled it on the same date? And indexed it on the same date?
Technical SEO | Personnel_Concept
-
/index.php in sitemap? take it out?
Hi Everyone, The following was automatically generated at xml-sitemaps.com:
<url><loc>http://www.mydomain.ca/</loc></url>
<url><loc>http://www.mydomain.ca/index.php</loc></url>
Should I get rid of the index.php URL from my sitemap? If so, how do I go about redirecting it in my htaccess? Thank you in advance, Martin
Technical SEO | RogersSEO