Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Robots.txt on pages with a 301 redirect
-
We currently have a series of help pages that we would like to disallow in our robots.txt.
The thing is that these help pages are located on our old website, which now has a 301 redirect to the current site.
What is the proper way to go about this?
1- Add the pages we want to disallow to the robots.txt of the new website?
2- Break the redirect momentarily and add the pages to the robots.txt of the old one?
Thanks
-
In that case, you'd need to add the robots meta tag at the page level, before the closing </head> tag.
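For illustration, that tag usually takes one of these two forms (a sketch; the right choice depends on whether the links on those pages should still be followed):
<meta name="robots" content="noindex, follow">
or
<meta name="robots" content="noindex, nofollow">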
-
Hey, for some time we will keep the files on the old domain. Should we break the redirect and add the disallows to the robots.txt of the old site?
-
So, the problem is that the robots.txt file can't be accessed because of the 301 redirect to the new domain?
Do you plan to keep the help files on the old domain, or will they be removed completely?
-
Hi Laura,
Thanks for your reply. I don't want to disallow the URLs these pages are being redirected to. Actually, these URLs are on the old version but can still be accessed. So, to put it simply, this is my case:
1- This was our website: www.kilgray.com (now with a 301 redirect)
2- This is our new website: www.memoq.com
3- I would like to disallow the following links on the old website that are still accessible (they haven't been redirected):
http://kilgray.com/memoq/2015-100/help-en/index.html
http://kilgray.com/memoq/2014/help-en/
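For illustration, if the redirect rule were set up to exempt /robots.txt and those help paths, the old domain could block them with directory rules like these (a sketch based on the paths above):
User-agent: *
Disallow: /memoq/2015-100/help-en/
Disallow: /memoq/2014/help-en/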
-
Do you want to disallow the URLs that these pages are being redirected to? If not, there's no need to add anything to the robots.txt file.
If you do want to disallow the URLs that these pages are being redirected to, use relative URLs in your robots.txt file. For example, let's say olddomain.com/old-help-page/ is being redirected to newdomain.com/new-help-page/. In that case, add the following to the new domain's robots.txt file:
Disallow: /new-help-page/
There's no need to disallow the specific URLs that are being redirected to something else. Are you trying to get them removed from Google's index? If so, Google will update its index eventually based on your 301 redirects.
Related Questions
-
Delay in 301 redirects being picked up
Hi, I have been involved in the redesign/development of a website which has, up until now, had a lot of international traffic. On the day of migration I uploaded all the 301 redirects to the website (WordPress) using the Simple 301 Redirects plugin. I tested a number of them and they appeared to be working. I also submitted the new sitemaps to Search Console. Since migration, international traffic - particularly from countries such as India, the Philippines and Sri Lanka - has significantly dropped off, whereas the local traffic and some of the international traffic, such as from the USA, has remained fairly consistent. Looking at Analytics entrances recently, it appears as though search results are/were showing a number of pages with 404s (one in particular which received significant traffic and for which I had created a 301 redirect). I have checked this page using the old URL and it redirects correctly for me, and today I asked a colleague in India to also check - he gets the redirect fine. Does Google.in take significantly longer to pick these up in search results? Or am I missing something?
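As a quick sanity check (a sketch with a hypothetical URL), fetching the headers directly shows exactly what a crawler gets back, and the -A flag lets you send Googlebot's user-agent string:
curl -I -A "Googlebot" "http://www.example.com/old-page/"
# Expected output includes:
# HTTP/1.1 301 Moved Permanently
# Location: http://www.example.com/new-page/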
Technical SEO | musthavemarketing
-
Is a sitemap required in my robots.txt?
Hi, I know that linking your sitemap from your robots.txt file is good practice. OK, but... may I just send my sitemap to Search Console and forget about adding it to my robots.txt? That's my situation: one multilang platform, which means... two sets of pages, one for each lang, of course. But my CMS (Magento) only allows me to have one robots.txt file. So, again: may I have a robots.txt file with no sitemap AND not suffer any potential SEO loss? Thanks in advance, Juan Vicente Mañanas Abad
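For reference, a single robots.txt can declare multiple Sitemap lines, so one file can cover both language sets (a sketch with hypothetical URLs):
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap_en.xml
Sitemap: https://www.example.com/sitemap_es.xml
That said, the Sitemap directive is optional; submitting the sitemaps in Search Console is a valid discovery channel on its own.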
Technical SEO | Webicultors
-
How do I redirect the author archive page in WordPress?
If you do a search for my name on Google, the first result is the author archive page of my WordPress blog. I would like to redirect the author page to my "about me" page but cannot add a 301, as the author page is created dynamically in WordPress. Anyone know how I can do this?
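One common approach (a sketch for a theme's functions.php, assuming a hypothetical /about-me/ page) is to hook template_redirect, which fires before WordPress renders the dynamic archive:
add_action( 'template_redirect', function () {
    // Send any author archive to the About page with a permanent redirect.
    if ( is_author() ) {
        wp_redirect( home_url( '/about-me/' ), 301 );
        exit;
    }
} );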
Technical SEO | richdan
-
301 Redirects in subfolders
Hi, we're making our site into a static site, but I would like to transfer the Google juice. Most of the links and the database exist in subfolders, though. Could I simply do 301 redirects on the subfolders and retain the value, or does it have to be on the full domain?
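Per-path 301s are enough here, since Google follows redirects URL by URL rather than domain-wide. For illustration, with Apache's mod_alias in .htaccess (a sketch with hypothetical paths):
Redirect 301 /old-subfolder/ http://www.example.com/new-static-section/
This prefix-matches, so /old-subfolder/page.html is redirected to /new-static-section/page.html and keeps its link value at the page level.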
Technical SEO | Therealmattyd
-
Is there any value in having a blank robots.txt file?
I've read an audit where the writer recommended creating and uploading a blank robots.txt file; there was no current file in place. Is there any merit in having a blank robots.txt file? What is the minimum you would include in a basic robots.txt file?
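For reference, the conventional minimum is an allow-everything file, which behaves the same as a blank one but makes the intent explicit (a minimal sketch):
User-agent: *
Disallow: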
Technical SEO | | NicDale0 -
Googlebot does not obey robots.txt disallow
Hi Mozzers! We are trying to get Googlebot to steer away from our internal search results pages by adding a parameter "nocrawl=1" to facet/filter links and then disallowing all URLs containing that parameter in robots.txt. We implemented this in late August and, after that, the GWMT message "Googlebot found an extremely high number of URLs on your site" stopped coming. But today we received yet another one. The weird thing is that Google gives many of our now robots.txt-disallowed URLs as examples of URLs that may cause us problems. What could be the reason? Best regards, Martin
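For reference, a wildcard rule that blocks any URL containing that parameter would look something like this (a sketch; the actual file wasn't shown):
User-agent: *
Disallow: /*nocrawl=1
Worth keeping in mind: robots.txt stops crawling, not discovery, so URLs Google already knows about can keep showing up in warnings for some time.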
Technical SEO | TalkInThePark
-
How to redirect all inactive feeds to a specific WordPress page
Hi guys, I've been doing a lot of cleaning on my blog lately and deleted numerous categories, including their posts with low-quality content. After deleting the categories, Google Webmaster Tools is reporting some 404 errors for the RSS feeds of the deleted categories. I've created a 404.php file inside my theme and placed the following code in it:

<?php
header("HTTP/1.1 301 Moved Permanently");
header("Location: http://www.mysite.com/My404Page/", true, 301);
exit();
?>

This has caught all 404 errors and redirected them to the specific page. Unfortunately, it could not catch the inactive feed URLs. Is there a way to do this so that all inactive feeds will be redirected to my 404 page? Thanks in advance...
Technical SEO | Trigun
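If the feed requests never reach the theme's 404.php, one possible fix (a sketch for the theme's functions.php, assuming WordPress still flags the dead feed URLs as 404s) is to redirect earlier, at the template_redirect stage:
add_action( 'template_redirect', function () {
    // Catch every 404 here, including feed URLs that bypass 404.php.
    if ( is_404() ) {
        wp_redirect( home_url( '/My404Page/' ), 301 );
        exit;
    }
} );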