One robots.txt file for multiple sites?
-
I have 2 sites hosted with Blue Host and was told to put the robots.txt in the root folder and just use the one robots.txt for both sites. Is this right? It seems wrong. I want to block certain things on one site.
Thanks for the help,
Rena
-
Hi Rena. Yes, if both sites are separate domains that you want to use in different ways, then you should place a different robots.txt file in each domain root so that they're accessible at xyz.com/robots.txt and abc.com/robots.txt. Cheers!
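To make that concrete, a minimal sketch (the rules below are placeholders, and the exact folders depend on how your host maps each domain to its document root):

# robots.txt uploaded to the document root of xyz.com
User-agent: *
Disallow:

# robots.txt uploaded to the document root of abc.com
User-agent: *
Disallow: /members/

Each file only applies to the domain it is served from, so one site's rules never affect the other.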
-
Hi Rena,
You technically can do that, but it's not recommended - for the exact reason you state above. More often than not, 2 sites aren't going to have the same set of disallow rules.
You should also be using each robots.txt file to point search engines to that site's XML sitemap, and if you're sharing one robots file you can't specify 2 different sitemaps for 2 different domains.
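To illustrate (the Disallow rules and sitemap URLs here are invented for the example), each site's own robots.txt can carry its own rules and reference its own sitemap:

# robots.txt for the first site
User-agent: *
Disallow: /checkout/
Sitemap: https://www.site-one.com/sitemap.xml

# robots.txt for the second site
User-agent: *
Disallow:
Sitemap: https://www.site-two.com/sitemap.xml

The Sitemap directive takes an absolute URL, so each file points crawlers to the sitemap for its own domain.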
-
Each individual website (and any subdomain you add) needs its own robots.txt file. The contents can be identical, so you can copy the same file and reuse it on each site, but every site has to serve its own robots.txt.
Related Questions
-
Two websites, one company, one physical address - how to make the best of it in terms of local visibility?
Hello! I have one company which will be operating in two markets: printing and website design / development. I'm planning on building two websites, one for each market, but I'm a bit confused about how to optimize these websites locally. My thought is to use my physical address for one website (build citations, get listed in directories, etc.) and a PO Box for the other. Do you think there is a better approach?
Technical SEO | | VELV1 -
Multiple CMS on one website / domain & SEO
For a client we would like to work with a content hub, but their website is built on a custom CMS, so we are limited in our options, and if we ask their web developers they quote crazy prices to help us. So now we have the idea to build the content hub with WordPress and run it alongside their current CMS, for example at www.website.com/contenthub/. As far as I know this is technically possible and there are no negative SEO effects as long as we link the two sitemaps together. Am I right, or am I missing something here?
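As a rough illustration of the "link the two sitemaps together" part (www.website.com is the placeholder domain from the question, and the exact sitemap filenames depend on what the custom CMS and the WordPress install actually generate), the robots.txt on the main domain could simply list both sitemaps:

User-agent: *
Disallow:
# Sitemap produced by the existing custom CMS
Sitemap: https://www.website.com/sitemap.xml
# Sitemap produced by the WordPress install under /contenthub/
Sitemap: https://www.website.com/contenthub/sitemap.xml

Multiple Sitemap lines in one robots.txt are valid, and each must be an absolute URL.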
Technical SEO | | Siphoplait0 -
Role of Robots.txt and Search Console parameters settings
Hi, wondering if anyone can point me to resources or explain the difference between these two. If a site has URL parameters disallowed in robots.txt, is it redundant to set the Search Console parameter settings to anything other than "Let Googlebot Decide"?
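For reference, a hedged sketch of what disallowing parameters in robots.txt usually looks like (the parameter names here are invented placeholders):

User-agent: *
# block any URL whose query string contains these parameters
Disallow: /*?sessionid=
Disallow: /*&sort=

Broadly, robots.txt blocks crawling of the matching URLs outright, while the Search Console parameter settings are more of a hint about how Googlebot should handle parameterised URLs it is still allowed to crawl.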
Technical SEO | | LivDetrick0 -
Robots.txt for pages with a 301 redirect
We currently have a series of help pages that we would like to disallow in our robots.txt. The thing is that these help pages are located on our old website, which now 301 redirects to the current site. What is the proper way to go about it? 1- Add the pages we want to disallow to the robots.txt of the new website? 2- Break the redirect momentarily and add the pages to the robots.txt of the old one? Thanks
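One point worth illustrating (the /help/ path is a placeholder standing in for wherever those pages actually live): robots.txt rules only apply to the hostname the file is served from, so a rule meant for URLs on the old domain would need to sit in a robots.txt answered by the old domain, for example:

# robots.txt served by the old domain
User-agent: *
Disallow: /help/

A matching rule in the new site's robots.txt has no effect on URLs that still resolve on the old hostname.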
Technical SEO | | Kilgray0 -
Robots.txt on subdomains
Hi guys! I keep reading conflicting information on this and it's left me a little unsure. Am I right in thinking that a website with a subdomain of shop.sitetitle.com will share the same robots.txt file as the root domain?
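For context, crawlers request robots.txt separately for each hostname, so the subdomain and the root domain each have their own file unless the server is set up to serve identical content for both. A quick sketch with placeholder rules:

# https://sitetitle.com/robots.txt
User-agent: *
Disallow:

# https://shop.sitetitle.com/robots.txt
User-agent: *
Disallow: /cart/

The two files are evaluated independently by search engines.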
Technical SEO | | Whittie0 -
Are robots.txt wildcards still valid? If so, what is the proper syntax for setting this up?
I've got several URLs that I need to disallow in my robots.txt file. For example, I've got several documents that I don't want indexed and filters that are getting flagged as duplicate content. Rather than typing in thousands of URLs, I was hoping that wildcards are still valid.
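As far as the major engines go, Google and Bing both document support for the * and $ wildcards in robots.txt. A hedged sketch with placeholder patterns:

User-agent: *
# block every PDF document anywhere on the site
Disallow: /*.pdf$
# block any URL containing a filter parameter
Disallow: /*?filter=

Here * matches any sequence of characters and $ anchors the pattern to the end of the URL.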
Technical SEO | | mkhGT0 -
Can hotlinking images from multiple sites be bad for SEO?
Hi, There's a very similar question already being discussed here, but it deals with hotlinking from a single site that is owned by the same person. I'm interested in whether hotlinking images from multiple sites can be bad for SEO. The issue is that one of our bloggers has been hotlinking all the images he uses; sometimes there are 3 or 4 images per blog post from different domains. We know that hotlinking is frowned upon, but can it affect us in the SERPs? Thanks, James
Technical SEO | | OptiBacUK0 -
Google insists robots.txt is blocking... but it isn't.
I recently launched a new website. During development, I'd enabled the option in WordPress to prevent search engines from indexing the site. When the site went public (over 24 hours ago), I cleared that option. At that point, I added a specific robots.txt file that only disallows a couple of directories of files. You can view the robots.txt at http://photogeardeals.com/robots.txt Google (via Webmaster Tools) is insisting that my robots.txt file contains a "Disallow: /" on line 2 and that it's preventing Google from indexing the site and preventing me from submitting a sitemap. These errors show up both in the sitemap section of Webmaster Tools and in the Blocked URLs section. Bing's webmaster tools can read the site and sitemap just fine. Any idea why Google insists I'm disallowing everything even after telling it to re-fetch?
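For comparison, here is the difference being described (the directory names in the second file are placeholders, since the real file is described but not quoted):

# What Google reported seeing: a file that blocks the whole site
User-agent: *
Disallow: /

# A file that only blocks a couple of directories
User-agent: *
Disallow: /example-private/
Disallow: /example-downloads/

WordPress's "discourage search engines" setting serves the first form from its virtual robots.txt, and Google caches robots.txt (typically for up to a day or so), so a stale copy from before launch is one plausible explanation for the mismatch.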
Technical SEO | | ahockley0