Limit on Google Removal Tool?
-
I'm dealing with thousands of duplicate URLs caused by the CMS, so I am using some automation to get through them.
What is the daily limit on removal requests? Weekly? Monthly?
Any ideas??
thanks,
Ben
-
Hi. I was curious to know if you were able to find all the URLs that needed to be removed, if they were effectively removed, and if your traffic came back. I'm assuming you were dealing with Panda issues.
-
Thanks @Baldea - yes, you are correct: as soon as it processes the pending removal requests, it allows more to be submitted.
In one sitting, I managed to get 1,000 URLs removed, and within a few hours I could process more.
-
It's been a while since I've used GWT, but I think you have a limit of 500 requests per day, or something like that.
As soon as those requests are processed, the counter resets.
-
Thanks Baldea...
Yes, I have done all of the above, but some pages still got into the Google index and are stuck there - measures are now in place to stop that happening in the future.
In the meantime, I'm using the batch removal process in GWT to remove the URLs that already made it into the index.
So how many URLs can I remove with GWT?
-
Hi BJ,
I'd use the URL removal tool only after making sure that I'd set a rel=canonical tag (to point out which URL Google should consider the "legit" one) and a noindex, follow meta tag on the pages you don't want Google to index - like tag or category pages or subpages.
You can also remove pages by URL prefix rather than one by one, which may be far more effective: e.g. submit http://www.example.com/category/ and Google will remove all matching URLs.
But I strongly advise you to minimise duplicate content generation as much as you can and use the two methods highlighted above.
Hope it helps.
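For reference, a minimal sketch of the two tags mentioned above (the URLs are placeholders, not from the original thread):

<!-- In the <head> of each duplicate URL: points Google at the version to treat as "legit" -->
<link rel="canonical" href="http://www.example.com/category/blue-widgets/" />

<!-- On pages you want crawled but kept out of the index, e.g. tag or category subpages -->
<meta name="robots" content="noindex, follow" />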
Related Questions
-
Can you index a Google doc?
We have updated and added completely new content to our state pages. Our old state content is sitting in our Google Drive. Can I make these docs public to get them indexed and provide a link back to our state pages? In theory it sounds like a great link building strategy... TIA!
Intermediate & Advanced SEO | | LindsayE1 -
Does google ignore ? in url?
Hi Guys, I have a site where every URL ends in ?v=6cc98ba2045f. Example: https://domain.com/products/cashmere/robes/?v=6cc98ba2045f Just wondering, does Google ignore what comes after the question mark? Also, any ideas what that parameter is? Cheers.
Intermediate & Advanced SEO | | CarolynSC0 -
Alternative Link Detox tools?
My company is conducting a link detox for a client, and it seems like every tool we utilize gives us a different answer on how many links we actually have. The numbers range anywhere from 4,000 to 200,000. Does anyone have any suggestions as to which tools will give us an accurate count and will also email the webmasters on our behalf requesting link removal? We are trying to make this process as automated as possible to save time on our end.
Intermediate & Advanced SEO | | lightwurx0 -
How to NOT appear in Google results in other countries?
I have ecommerce sites that only serve the US and Canada. Is there a way to prevent a site from appearing in the Google results in foreign countries? The reason I ask is that we also have a lot of informational pages that folks in other countries are visiting, then leaving right after reading. This is making our overall Bounce Rate very high (64%). When we segment the GA data to look at just our US visitors, the Bounce Rate drops a lot (to 48%). Thanks!
Intermediate & Advanced SEO | | GregB1230 -
Buying a domain banned by google
Hi, I came across a super domain for my business and found that it is a great domain with hundreds of backlinks, but it is now banned by Google, meaning Google does not index content from that domain. Since the domain's backlinks are from my domain, does it make sense to buy that domain and 301 redirect those backlinks to another domain, hoping the new domain gets some juice? I know it sounds crazy and may not be the best thing to do ethically, but I still wanted to check if it's possible to get some juice. Rgds Avinash
Intermediate & Advanced SEO | | Avinashmb0 -
Disavow Tool - WWW or Not?
Hi All, Just a quick question... A shady domain linking to my website is indexed in Google for both example.com and www.example.com. If I want to disavow the entire domain, do I need to submit both: domain:www.example.com domain:example.com or just: domain:example.com Cheers!
Intermediate & Advanced SEO | | Carlos-R0 -
Multiple Authors Google + Authorship
Hello, I took a look through past questions but can't seem to find a definitive answer on setting up Google+ Authorship credit (for multiple authors) on a WordPress blog. Has anyone had experience setting this up? Or could you recommend solid reading/research? I took a look at a couple of WordPress plug-ins but found them very confusing (so did our IT contact, who will ultimately be setting up the code for this). Any direction or advice is appreciated.
Intermediate & Advanced SEO | | SEOSponge0 -
Best practice for removing indexed internal search pages from Google?
Hi Mozzers, I know that it's best practice to block Google from indexing internal search pages, but what's best practice when "the damage is done"? I have a project where a substantial part of our visitors and income lands on internal search pages, because Google has indexed them (about 3%). I would like to block Google from indexing the search pages via the meta noindex, follow tag because of: the Google Guidelines ("Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines." http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769); the bad user experience; the search pages (probably) stealing rankings from our real landing pages; and the Webmaster Notification "Googlebot found an extremely high number of URLs on your site" with links to our internal search results. I want to use the meta tag to keep the link juice flowing. Do you recommend using robots.txt instead? If yes, why? Should we just go dark on the internal search pages, or how should we proceed with blocking them? I'm looking forward to your answer! Edit: Google has currently indexed several million of our internal search pages.
Intermediate & Advanced SEO | | HrThomsen0