How to remove my CDN subdomains from Google search results?
-
A few months ago I moved all my WordPress images onto a subdomain. After I purchased a CDN service, I moved those images back to my root domain, and I added User-agent: * Disallow: / to the robots.txt on my CDN domain.
But now, when I do a site: search on Google, I see that my CDN subdomains are still indexed. I think this will create a duplicate content issue, and I have already been hit by Penguin. How do I remove these results from Google?
Should I add my CDN domain to Webmaster Tools and submit a URL removal request?
The problem is that if I visit cdn.mydomain.com, it shows the same content as www.mydomain.com.
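For what it's worth, this is roughly how I confirmed the CDN is mirroring the whole site and not just the images (a rough Python sketch; cdn.mydomain.com and www.mydomain.com are placeholders for my real hostnames):

    import urllib.request

    # Placeholders -- swap in the real hostnames.
    CDN_URL = "http://cdn.mydomain.com/"
    WWW_URL = "http://www.mydomain.com/"

    def fetch(url):
        # Send a simple GET and return the raw response body.
        req = urllib.request.Request(url, headers={"User-Agent": "dup-check/0.1"})
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.read()

    # If the two hosts return identical HTML, the CDN subdomain is a full
    # mirror of the main site rather than only serving the image files.
    print("Identical content:", fetch(CDN_URL) == fetch(WWW_URL))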
My blog: http://goo.gl/58Utt
Site search result: http://goo.gl/ElNwc
-
Hey,
Glad to hear it worked out!
Mark
-
Update!
In less than 12 hours, my CDN domain was de-indexed. Thanks, Mark, for your support!
-
Blocking with robots.txt doesn't remove pages or files from the search engines; it only prevents them from crawling the subdomain. If they have already crawled those resources, as they have in your case, the robots.txt will just stop them from visiting again, but it will not remove what is already indexed.
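To make the distinction concrete, here is a minimal sketch using Python's standard-library robotparser; the blanket Disallow only answers "may a crawler fetch this URL again?" and says nothing about copies already in the index (cdn.mydomain.com is a placeholder):

    from urllib.robotparser import RobotFileParser

    # The rule currently in place on the CDN subdomain, as described above.
    rules = ["User-agent: *", "Disallow: /"]

    parser = RobotFileParser()
    parser.parse(rules)

    # can_fetch() is all that robots.txt controls: whether a crawler may
    # request the URL again. It returns False here, but anything Google has
    # already indexed stays in the index until it is explicitly removed.
    print(parser.can_fetch("*", "http://cdn.mydomain.com/wp-content/image.jpg"))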
What you should do is verify the subdomain in Webmaster Tools and then remove it via the URL removal tool, as you asked. That is the way to go about it.
To verify the subdomain, there are multiple options: you can go through your host, or you can modify the DNS to prove you own the subdomain. It really depends on your setup and what you need to do.
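For example, if you go the DNS route, Webmaster Tools gives you a verification record to add. A quick sketch to check it has propagated, assuming the third-party dnspython package and a made-up record value:

    import dns.resolver  # third-party: pip install dnspython

    # Hypothetical TXT value -- the real one comes from the Webmaster Tools
    # verification screen for the cdn.mydomain.com property.
    EXPECTED = "google-site-verification=EXAMPLE_TOKEN"

    answers = dns.resolver.resolve("cdn.mydomain.com", "TXT")
    found = any(EXPECTED in rdata.to_text() for rdata in answers)
    print("Verification record visible:", found)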
But to remove these results permanently, in combination with your robots.txt block, the URL removal tool is the way to do it.
Good luck,
Mark
Related Questions
-
Errors In Search Console
Hi All, I am hoping someone might be able to help with this. Last week one of my sites dropped from mid first page to the bottom of page 1. We had not been link building as such, and it only seems to have affected a single search term and the ranking page (which happens to be the home page). When I was going through everything I checked Search Console, and in crawl errors there are 2 errors that showed up as detected 3 days before the drop: wp-admin/admin-ajax.php showing as response code 400, and also xmlrpc.php showing as response code 405. robots.txt is as follows: user-agent: * disallow: /wp-admin/ allow: /wp-admin/admin-ajax.php Any help with what is wrong here and how to fix it would be greatly appreciated. Many Thanks
Technical SEO | DaleZon | 0
-
Abnormally high internal link count reported in Google Search Console not matching Moz reports
If I'm looking at our internal link count and structure on Google Search Console, some pages are listed as having over a thousand internal links within our site. I've read that having too many internal links on a page devalues that page's PageRank, because the value is divided amongst the pages it links out to. Likewise, I've heard having too many internal links is just bad in general for SEO. Is that true? The problem I'm facing is determining how Google is "discovering" these internal links. If I look at one single page reported with, say, 1,350 links and inspect the code, it may only have 80 or 90 actual links. Moz confirms this as well. So why would Google Search Console report differently? Should I be concerned about this?
Technical SEO | Closetstogo | 0
-
Omitted results
Hello, we have been facing a loss in rankings and organic traffic for 3 months on our ecommerce website. We have mostly lost rankings on our product pages; these pages have disappeared into Google's "omitted results". It all started 3 months ago, when we had a duplicate content issue due to a technical problem with our servers: 2 other domains that we own were pushed live on Google when they shouldn't have been. They created millions of links to our main domain in a few days, plus duplicate versions / redirections to our main website. We fixed this a long time ago, but in GWT we still see that these domains are bringing links to our main ecommerce site. It has dropped from 36 million links to 3 million, even though today there are no links at all. We have done a lot of on-site optimization, like adding specific content to our most important pages, rebuilding the navigation, adding microdata, and adding canonical URLs on product pages that we found were very similar (we sell very technical products, and many products resemble each other; we have now chosen 1 product to use as the canonical each time it was necessary). But still our product pages don't rank in Google. They stay in the "omitted results", whereas before they ranked very well on the first page of Google's results. We have also noticed that some ads we put on ad listing websites are now ranking well in Google's results, as if the ads had more authority on the subject than our own webpages. We have started to delete some of these ads, but it's not always possible, and 2-3 of them are still online. Any advice on how to get our most important webpages back to the top of Google's results? Regards
Technical SEO | Poptafic | 0
-
Pages removed from Google index?
Hi All, I had around 2,300 pages in the Google index until a week ago. The index dropped a load of them and left me with 152 submitted, 152 indexed. I have just re-submitted my sitemap and will wait to see what happens. Any idea why it has done this? I have seen a drop in my rankings since. Thanks
Technical SEO | TomLondon | 0
-
No Search Results Found - Should this return status code 404?
A question came up today on how to correctly serve the right status code on pages where no search results are found. I did a couple of searches on some major ecommerce and news sites and they were ALL serving status code 200 for No Search Results Found: http://www.zappos.com/dsfasdgasdgadsg http://www.amazon.com/s/ref=nb_sb_noss?url=search-alias%3Daps&field-keywords=sdafasdklgjasdklgjsjdjkl http://www.ebay.com/sch/i.html?_trksid=p5197.m570.l1313&_nkw=dfjakljgdkslagklasd&_sacat=0 http://www.cnn.com/search/?query=sdgadgdsagas&x=0&y=0&primaryType=mixed&sortBy=date&intl=false http://www.seomoz.org/pages/search_results?q=sdagasdgasdgasg I thought I read somewhere that it was recommended to serve a status code 404 on these types of pages. Based on what I found above, all of these sites serve a 200, so it appears that may not be the accepted practice. Any thoughts?
Technical SEO | WEB-IRS | 0
-
How to push down outdated images in Google image search
When you do a Google image search for one of my client's products, you see a lot of first-generation hardware (the product is now in its third generation). The client wants to know what they can do to push those images down so that current product images rise to the top. FYI: the client's own image files on their site aren't very well optimized with keywords. My thinking is to have the client optimize their own images, and the ones they give to the media, with relevant keywords in file names, alt text, etc. Eventually, this should help push down the outdated images. Any other suggestions? Thanks so much.
Technical SEO | jimmartin_zoho.com | 0
-
Does a CDN affect search rankings?
I feel kind of stupid asking this, but if I use one it would speed things up quite a bit. It is for an ecommerce website; any guidance on this would be awesome!
Technical SEO | Hyrule | 0
-
Should I set up a disallow in the robots.txt for catalog search results?
When the crawl diagnostics came back for my site, they showed around 3,000 pages of duplicate content. Almost all of them are catalog search results pages. I also did a site: search on Google and they have most of those results pages in their index too. I think I should just disallow the bots from the /catalogsearch/ subfolder, but I'm not sure if this will have any negative effect?
Technical SEO | JordanJudson | 0