Will a disclaimer affect crawling?
-
Hello everyone!
Under German law, I will have to show my German users a disclaimer, so my question is the following:
Will a disclaimer affect crawling? What's the best practice here? Is there anything I should take special care with? And what's the best disclaimer technique: a plain HTML page, or something overlaying the site?
Thank you all!
-
Hi friend, you can display the disclaimer using a JavaScript overlay, and this would be absolutely fine. Bots won't have any trouble crawling the website behind the JS overlay because they won't see it. This is a very common practice among websites that display an age-verification gate, such as porn sites and sites that discuss or sell liquor.
This technique is not considered cloaking, as the intention is neither malicious nor deceptive, and Google handles these cases normally. Hope it helps, and good luck.
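A minimal sketch of that kind of script-injected overlay is below. The element IDs, styling, and localStorage key are all hypothetical; the point is that the page's real content stays in the plain HTML, so crawlers that don't render JavaScript never encounter the overlay at all:

```javascript
// Minimal sketch of a script-injected disclaimer overlay. The IDs,
// styling, and localStorage key are placeholders, not a real API.
document.addEventListener('DOMContentLoaded', function () {
  // Skip the overlay for visitors who have already accepted it.
  if (localStorage.getItem('disclaimerAccepted') === 'yes') return;

  var overlay = document.createElement('div');
  overlay.id = 'disclaimer-overlay';
  overlay.style.cssText =
    'position:fixed;top:0;left:0;width:100%;height:100%;' +
    'background:rgba(0,0,0,0.8);color:#fff;z-index:9999;' +
    'display:flex;align-items:center;justify-content:center;';
  overlay.innerHTML =
    '<div style="max-width:480px;text-align:center;">' +
    '<p>Disclaimer text required by law goes here.</p>' +
    '<button id="disclaimer-accept">I accept</button>' +
    '</div>';
  document.body.appendChild(overlay);

  // Dismiss the overlay and remember the choice for future visits.
  document.getElementById('disclaimer-accept')
    .addEventListener('click', function () {
      localStorage.setItem('disclaimerAccepted', 'yes');
      overlay.remove();
    });
});
```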
I addressed a similar question here on Moz:
http://a-moz.groupbuyseo.org/community/q/different-user-experience-with-javascript-on-off
Best regards,
Devanur Rafi
-
Maybe I will try it as you said; I'll just wait and see if someone else responds so I can gather more ideas. Thanks though!
About cookies: yes, that's a Europe-wide thing, but in Germany you also have to display a disclaimer if you run an adult site, if you sell certain types of products, and so on.

-
Hmm, I honestly don't know in this situation. One thing you might try is a modal that blocks the page with a semi-transparent layer, but check whether it is Googlebot accessing the site and, if so, skip the modal (see the sketch below).
But honestly, I thought this was an EU cookie requirement, so I am not an expert in this area.
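For what it's worth, a crude client-side version of that check might look like this. The helper function is hypothetical, user-agent sniffing is easy to defeat, and Googlebot can execute JavaScript, so treat it as a heuristic rather than a guarantee:

```javascript
// Rough sketch: skip the modal for known crawlers based on the
// user-agent string. User agents can be spoofed, and Googlebot is
// capable of executing JavaScript, so this is a heuristic only.
var isBot = /bot|crawl|spider|slurp/i.test(navigator.userAgent);
if (!isBot) {
  showDisclaimerModal(); // hypothetical function that renders the modal
}
```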
-
Thanks for the input!
While the site will not be pornographic, it will include artistic nudity, and I want a disclaimer that covers at least a portion of the page.
-
As long as you don't totally block the site, it won't really matter. A lot of people in the e-commerce world do it like in this demo, with just a small bar at the bottom of the page: http://warehouse.iqit-commerce.com/selector/?theme=warehouse2 If you wanted to get even more clever, you could geo-target users, show the bar based on their location, and exclude bots from seeing it. But if this is about cookies, I would not suggest blocking the whole page the way an adult site does. If it is an adult site that needs a fully blocking disclaimer, I have no experience in that area.
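A minimal sketch of that kind of non-blocking bottom bar follows; the ID, styling, and copy are placeholders. Nothing is hidden from users or crawlers, and the page content remains fully accessible underneath:

```javascript
// Sketch of a small, non-blocking disclaimer bar pinned to the bottom
// of the viewport (hypothetical ID, styling, and copy). The bar simply
// sits on top of the page; no content is blocked or hidden.
var bar = document.createElement('div');
bar.id = 'disclaimer-bar';
bar.style.cssText =
  'position:fixed;bottom:0;left:0;width:100%;padding:10px;' +
  'background:#222;color:#fff;text-align:center;z-index:9999;';
bar.innerHTML =
  'Short legal notice goes here. ' +
  '<button onclick="this.parentNode.remove()">OK</button>';
document.body.appendChild(bar);
```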