Google Is Indexing My 301 Redirects to Other Sites
-
Long story short: I now have a few links on my site that 301 redirect to YouTube videos or eCommerce stores. They carry a considerable amount of traffic that I benefit from, so I can't take them down, and that traffic comes from other websites. So basically I have backlinks from places I don't own pointing to my redirect URLs (e.g. http://example.com/redirect).
My problem is that Google is indexing these redirect URLs and won't let them go. I have tried blocking the URL in robots.txt, but Google keeps it indexed, uncrawled. I have also tried allowing Google to crawl it and adding a noindex directive via robots.txt, and I have tried removing it in GWT, but it pops back up again after a few days.
Any ideas? Thanks!
-
Let me know how it turns out. If the problem persists, I'm glad to help.
Good luck!
-
I just tried removing them manually from GWT. I'll wait until they pop back up again and then update the question with the URLs. For now I'll mark this as answered. Thanks for the help!
-
Hey,
Can you point out an example URL? (If you don't want to disclose the website URL here, you can do it via a personal message.) That way we can debug an exact URL rather than just a theory.
Regarding blocking via robots.txt: it is never a good idea to block a search engine from URLs you want deindexed. If Google's crawlers can't fetch and process the page, they never see the redirect (or any noindex directive), and the URLs stay in the search index.
Just check: https://support.google.com/webmasters/answer/6062608?hl=en
"While Google won't crawl or index the content blocked by
robots.txt
, we might still find and index a disallowed URL if it is linked from other places on the web. As a result, the URL address and, potentially, other publicly available information such as anchor text in links to the page can still appear in Google search results."In case of 301 redirects (make sure you are not using 302), if the crawler can access the page, you should have the old URL removed from the index.
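For reference, here is a minimal sketch of what that setup could look like in an Apache .htaccess file. The /redirect path and the target URL are placeholders based on the question, not the asker's real configuration:

    # Minimal sketch (Apache, mod_alias); path and target are placeholders.
    # 1. robots.txt must NOT contain "Disallow: /redirect" - Google has to be
    #    able to crawl the URL in order to see the 301 at all.
    # 2. Serve a true permanent redirect (301), not a temporary one (302):
    Redirect 301 /redirect https://www.youtube.com/watch?v=PLACEHOLDER

Once Google can follow the 301 to its external destination, the old /redirect URL should drop out of the index on its own over subsequent crawls.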
Related Questions
-
301 Redirects for Multiple Language Sites in htaccess File
Hi everyone, I have a site on a subdomain that has multiple languages set up at the domain level: https://mysite.site.com, https://mysite.site.fr , https://mysite.site.es , https://mysite.site.de , etc. We are migrating to a new subdomain and I am trying to create 301 redirects within the htaccess file, but I am a bit lost on how to do this as it seems you have to go from a relative url to an absolute - which would be fine if I was only doing this for the english site, but I'm not. It doesn't seem like I can go from absolute url to an absolute url - but I could be wrong. I am new to editing the htaccess file - so I could definitely use some advice here. Thanks.
Intermediate & Advanced SEO | amberprata
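Not part of the original thread, but a hedged sketch of the kind of host-based rules this question describes, assuming Apache with mod_rewrite. It also assumes each old language domain maps to a matching new host; the new hostnames (newsite.site.*) are placeholders, not real targets:

    # Sketch only: the old hosts are taken from the question; the new hosts are
    # placeholders. The source pattern in .htaccess is always relative, but the
    # target can (and here must) be an absolute URL on the new host.
    RewriteEngine On

    RewriteCond %{HTTP_HOST} ^mysite\.site\.com$ [NC]
    RewriteRule ^(.*)$ https://newsite.site.com/$1 [R=301,L]

    RewriteCond %{HTTP_HOST} ^mysite\.site\.fr$ [NC]
    RewriteRule ^(.*)$ https://newsite.site.fr/$1 [R=301,L]

    RewriteCond %{HTTP_HOST} ^mysite\.site\.es$ [NC]
    RewriteRule ^(.*)$ https://newsite.site.es/$1 [R=301,L]

    RewriteCond %{HTTP_HOST} ^mysite\.site\.de$ [NC]
    RewriteRule ^(.*)$ https://newsite.site.de/$1 [R=301,L]
-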
Mass Removal Request from Google Index
Hi, I am trying to clean up a news website. When this website was first made, the people who set it up copied in all kinds of articles they had as a newspaper, including tests, internal communication, and drafts. This site has lots of junk, but all of that junk came from the initial backup, i.e. before 1st June 2012. So, by removing all mixed content prior to that date, we can have pure articles starting 1st June 2012. Therefore, my dynamic sitemap now contains only articles with a release date between 1st June 2012 and now, and any article with a release date prior to 1st June 2012 returns a custom 404 page with a "noindex" meta tag instead of the actual content of the article. The question is how I can remove all this junk from the Google index as fast as possible, given that it is no longer on the site but still appears in Google results. I know that for individual URLs I can request removal at https://www.google.com/webmasters/tools/removals; the problem is doing this in bulk, as there are tens of thousands of URLs I want to remove. Should I put the articles back in the sitemap so the search engines crawl it and see all the 404s? I believe this is very wrong: as far as I know it will cause problems, because search engines will try to access non-existent content that the sitemap declares as existing, and will report errors in Webmaster Tools. Should I submit a deleted-items sitemap using the <expires> tag (https://developers.google.com/custom-search/docs/indexing#on-demand-indexing)? I think that is for custom search engines only, not the generic Google search engine. The site unfortunately doesn't use any kind of "folder" hierarchy in its URLs, but rather ugly GET params, so a folder-based removal pattern is impossible, since all articles (removed junk and actual articles alike) are of the form http://www.example.com/docid=123456. So, how can I bulk remove the junk from the Google index relatively fast?
Intermediate & Advanced SEO | ioannisa
-
My site shows a 503 error to Googlebot, but I can see the site fine. Not indexing in Google. Help!
Hi, this site is not indexed on Google at all: http://www.thethreehorseshoespub.co.uk. Looking into it, it seems to be giving a 503 error to Googlebot, although I can see the site fine. I have checked the source code and checked robots.txt; it did have a sitemap parameter, but I removed that for testing. GWMT shows "unreachable" if I submit a sitemap or fetch the site. Any ideas on how to remove this error? Many thanks in advance.
Intermediate & Advanced SEO | SolveWebMedia
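Not from the thread, but a hypothetical illustration of the kind of rule that produces exactly this symptom (site fine in a browser, 503 for crawlers). If something like this, or an equivalent firewall or maintenance-mode setting, exists in the server config or .htaccess, it would explain the error:

    # Hypothetical example only: a user-agent based maintenance/blocking rule.
    # A rule like this serves 503 to crawlers while normal visitors see the site.
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} (Googlebot|bingbot) [NC]
    RewriteRule ^ - [R=503,L]

Fetching the site as Google in GWMT shows what the crawler actually receives, which helps narrow down where the 503 is coming from.
-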
Google indexing pages from Chrome history?
We have pages that are not linked from the site, yet they are indexed in Google. This could be possible if Google got these pages from the browser. Does Google take data from Chrome?
Intermediate & Advanced SEO | vivekrathore
-
.htaccess 301 Redirect Help! Specific Redirects and Blanket Rule
Hi there, I have the following domains: OLD DOMAIN: domain1.co.uk NEW DOMAIN: domain2.co.uk I need to create a .htaccess file that 301 redirects specific, individual pages on domain1.co.uk to domain2.co.uk I've searched for hours to try and find a solution, but I can't find anything that will do what I need. The pages on domain1.co.uk are all kinds of filenames and extensions, but they will be redirected to a Wordpress website that has a clean folder structure. Some example URL's to be redirected from the old website: http://www.domain1.co.uk/charitypage.php?charity=357 http://www.domain1.co.uk/adopt.php http://www.domain1.co.uk/register/?type=2 These will need to be redirected to the following URL types on the new domain: http://www.domain2.co.uk/charities/ http://www.domain2.co.uk/adopt/ http://www.domain2.co.uk/register/ I would also like a blanket/catch-all redirect from anything else on www.domain1.co.uk to the homepage of www.domain2.co.uk if there isn't a specific individual redirect in place. I'm literally tearing my hair out with this, so any help would be greatly appreciated! Thanks
Intermediate & Advanced SEO | Townpages
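Not an answer from the original thread, but a hedged .htaccess sketch based on the example URLs in the question; the exact patterns would need checking against the real site:

    # Sketch only (Apache .htaccess on domain1.co.uk, mod_rewrite).
    RewriteEngine On

    # Specific redirects. Query strings cannot be matched in the RewriteRule
    # pattern itself, so they go in a RewriteCond; the trailing "?" on the
    # target drops the old query string from the new URL.
    RewriteCond %{QUERY_STRING} ^charity=357$
    RewriteRule ^charitypage\.php$ http://www.domain2.co.uk/charities/? [R=301,L]

    RewriteRule ^adopt\.php$ http://www.domain2.co.uk/adopt/ [R=301,L]

    RewriteCond %{QUERY_STRING} ^type=2$
    RewriteRule ^register/?$ http://www.domain2.co.uk/register/? [R=301,L]

    # Catch-all: anything not matched above goes to the new homepage.
    RewriteRule ^ http://www.domain2.co.uk/? [R=301,L]

The order matters: mod_rewrite processes rules top to bottom, so the specific rules must come before the blanket catch-all, which would otherwise swallow every request.
-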
Our login pages are being indexed by Google - How do you remove them?
Each of our login pages shows up under a different subdomain of our website. Currently these are accessible to Google, which is a huge competitive advantage for our competitors looking for our client list. We've done a few things to try to rectify the problem:
- added noindex/noarchive to each login page
- added robots.txt to all subdomains to block search engines
- gone into Webmaster Tools, added the subdomain of one of our bigger clients, and requested its removal from Google (this would be great to do for every subdomain, but we have a LOT of clients and it would require tons of backend work)
Other than the last option, is there something we can do that will remove these subdomains from search engines? We know the robots.txt is working, since the message on the search results says: "A description for this result is not available because of this site's robots.txt – learn more." But we'd like the whole link to disappear. Any suggestions?
Intermediate & Advanced SEO | desmond.liang
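Not part of this thread, but a sketch of the header-based approach often suggested for this situation: serve an X-Robots-Tag noindex header for the login pages, and stop blocking them in robots.txt so crawlers can actually see that header. The Apache vhost context and the /login path below are assumptions, not details from the question:

    # Sketch (Apache vhost config for each client subdomain, mod_headers).
    # The "/login" path is a placeholder; adjust to the real URL structure.
    # Note: robots.txt must NOT block these URLs, or crawlers never see the header.
    <LocationMatch "^/login">
        Header set X-Robots-Tag "noindex, nofollow"
    </LocationMatch>
-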
Multiple 301 Redirects for the Same Page
Hi Mozzers, what happens if I have a trail of 301 redirects for the same page? For example:
SiteA.com/10 --> SiteA.com/11 --> SiteA.com/13 --> SiteA.com/14
I know I lose a little bit of link juice by 301 redirecting. The question is, would the link juice for the example above look like this: 100% --> 90% --> 81% --> 72.9%? Or just 100% --> 90%? And does this link juice refer to juice from inbound links, or to links between internal pages on my site? Thanks!
Intermediate & Advanced SEO | Travis-W
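Not an answer from the thread, but the usual practical fix for a chain like this is to collapse it so that every retired URL redirects to the final destination in a single hop. A minimal .htaccess sketch using the example paths above:

    # Sketch: point each retired URL straight at the final destination
    # instead of chaining 10 -> 11 -> 13 -> 14.
    Redirect 301 /10 /14
    Redirect 301 /11 /14
    Redirect 301 /13 /14
-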
How do I go about changing a 302 redirect to a 301?
Hello friends! Thanks for viewing my question. OK, my question today is: how do I go about changing a 302 redirect into a 301? I understand the benefits of doing this as far as link juice goes, and how the search engines view the two types of redirect. What I want to know is where to start. Thank you in advance for any help or suggestions!
Intermediate & Advanced SEO | FrontlineMobility
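Not from the thread, but a hedged illustration of how small the change usually is when the redirect lives in an Apache .htaccess file (the details differ if the redirect is set in application code or a CMS plugin instead); the paths are placeholders:

    # A temporary redirect is often declared like this:
    #   Redirect 302 /old-page http://www.example.com/new-page
    # Making it permanent is usually just a matter of changing the status code:
    Redirect 301 /old-page http://www.example.com/new-page

    # The mod_rewrite equivalent uses the R flag:
    # RewriteRule ^old-page$ http://www.example.com/new-page [R=301,L]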