How to block text on a page from being indexed?
-
I would like to block spiders from indexing a block of text inside a page; however, I do not want to block the whole page with, for example, a noindex tag.
I have already tried with a tag like this:
<!--googleoff: index-->chocolate pudding<!--googleon: index-->
However, this is not working in my case (a travel-related website).
Thanks in advance for your support.
Best regards,
Gianluca
-
Gianluca,
Rand's Whiteboard Friday from a couple of weeks ago may help you: http://a-moz.groupbuyseo.org/blog/handling-duplicate-content-across-large-numbers-of-urls
Though that Whiteboard Friday is about duplicate content issues, one piece you can probably use from it is this: embed the content in an iframe on the page. Content pulled in through an iframe is not treated as part of the parent URL, so it stays out of that page's entry in the index. Add a "noindex" robots meta tag to the HTML document loaded inside the iframe to be 100% sure that search engines do not index it.
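As a rough sketch of that setup (untested, and the /snippets/ path and file name are just placeholders):

On the travel page itself, pull the block in through an iframe so the text is no longer treated as part of that URL:

  <iframe src="/snippets/hotel-description.html" width="100%" height="300"></iframe>

And in /snippets/hotel-description.html, add the robots meta tag so the fragment cannot be indexed on its own URL either:

  <!DOCTYPE html>
  <html>
  <head>
    <meta name="robots" content="noindex">
    <title>Hotel description</title>
  </head>
  <body>
    <p>The block of text you want kept out of the index goes here.</p>
  </body>
  </html>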
-
There aren't too many ways to achieve this without it looking a little odd to Google. Using images is probably the only real-world way, but do remember that Google can read images well, and I have always advised anyone wanting to do this to avoid it.
I haven't tried this myself, but I can see it working by using iframes and then disallowing them in robots.txt:
http://stackoverflow.com/questions/15685205/noindex-tag-for-google

Andy
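Roughly, the setup Andy describes would look like this, assuming the iframed fragments live under a placeholder /private-snippets/ folder:

  # robots.txt
  User-agent: *
  Disallow: /private-snippets/

  <!-- on the page -->
  <iframe src="/private-snippets/block.html"></iframe>

Bear in mind robots.txt only blocks crawling - a disallowed URL can still end up indexed from links alone - so the noindex meta tag inside the iframed document (as described above) is the safer belt-and-braces option.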
-
@chris - Thanks for your reply. Yes, I realised only after using it that this solution doesn't apply to web search. Putting the text in an image is a possibility; however, since there will be a lot of text across many different product pages, I was looking for something easier to automate. Any other possibilities through tags?
-
That was a good line; I will try to remember to give you attribution. Like your stuff on here.
Best -
Unfortunately, I haven't had the opportunity. I'd love to get my hands on one though - it'd be like holding a baby Google in your arms.
-
Chris,
Do you work with the Search Appliance? Would love to speak with you about it if so.
Thanks, great answer.
Robert
-
Gianluca,
The googleoff: snippet is not used for web search; it's only used with the Google Search Appliance. Could you put the text you want to keep out of the index into an image?
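For reference, the googleoff/googleon markup is a pair of HTML comments that only the Search Appliance acts on; public Google web search just ignores them:

  <!--googleoff: index-->
  chocolate pudding
  <!--googleon: index-->

(The Appliance also understands the anchor, snippet and all variants, e.g. <!--googleoff: all-->.)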
Related Questions
-
Page Indexing without content
Hello. I have a problem with a page being indexed without its content. I have a website in 3 different languages; 2 of the language versions are indexing just fine, but one language (the most important one) is being indexed without content. When searching with site: the page comes up, but when searching for unique keywords for which I should rank 100%, nothing comes up. This page was indexing just fine, and the problem arose a couple of days ago after a Google update finished. Looking further, the problem is language-related: every page in the given language that is newly indexed has this problem, while pages that were last crawled around one week ago are just fine. Has anyone run into this type of problem?
Technical SEO | AtuliSulava -
Google Not Indexing Pages (Wordpress)
Hello, recently I started noticing that Google is not indexing our new pages or our new blog posts. We are simply getting a "Discovered - Currently Not Indexed" message on all new pages. When I click "Request Indexing" it takes a few days, but eventually the page does get indexed and appears on Google. This is very strange, as our website has been around since the late 90s and the quality of the new content is neither duplicate nor "low quality". We started noticing this happening around February. We also do not have many pages - maybe 500 maximum. I have looked at all the obvious answers (allowing for indexing, etc.), but just can't seem to pinpoint a reason why. Has anyone had this happen recently? It is getting very annoying having to manually go in and request indexing for every page, and it makes me think there may be some underlying issues with the website that should be fixed.
Technical SEO | Hasanovic -
Discrepancy in actual indexed pages vs search console
Hi support, I checked my Search Console. It said that 8344 pages from www.printcious.com/au/sitemap.xml are indexed by Google. However, if I search for site:www.printcious.com/au it only returns 79 results. See http://imgur.com/a/FUOY2 and https://www.google.com/search?num=100&safe=off&biw=1366&bih=638&q=site%3Awww.printcious.com%2Fau&oq=site%3Awww.printcious.com%2Fau&gs_l=serp.3...109843.110225.0.110430.4.4.0.0.0.0.102.275.1j2.3.0....0...1c.1.64.serp..1.0.0.htlbSGrS8p8 Could you please advise why there is a discrepancy? Thanks.
Technical SEO | Printcious -
Home Page Ranking Instead of Service Pages
Hi everyone! I've noticed that many of our clients have pages addressing specific queries related to specific services on their websites, but that the Home Page is increasingly showing as the "ranking" page. For example, a plastic surgeon we work with has a page specifically talking about his breast augmentation procedure for Miami, FL but instead of THAT page showing in the search results, Google is using his home page. Noticing this across the board. Any insights? Should we still be optimizing these specific service pages? Should I be spending time trying to make sure Google ranks the page specifically addressing that query because it SHOULD perform better? Thanks for the help. Confused SEO :/, Ricky Shockley
Technical SEO | RickyShockley -
How to stop Google from indexing specific sections of a page?
I'm currently trying to find a way to stop Googlebot from indexing specific areas of a page. Long ago, Yahoo Search created the class="robots-nocontent" tag for this, and I'm trying to see if there is a similar mechanism for Google, or whether they have adopted the same tag. Any help would be much appreciated.
Technical SEO | Iamfaramon -
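For reference, the Yahoo markup that question refers to was simply a class wrapped around the block to exclude; Google never adopted it for web search, so on Google it has no effect:

  <div class="robots-nocontent">
    Boilerplate you did not want Yahoo Search to index or use in its abstracts.
  </div>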
De-indexing millions of pages - would this work?
Hi all, We run an e-commerce site with a catalogue of around 5 million products. Unfortunately, we have let Googlebot crawl and index tens of millions of search URLs, the majority of which are very thin on content or duplicates of other URLs. In short: we are in deep. Our bloated Google index is hampering our real content's ability to rank; Googlebot does not bother crawling our real content (product pages specifically) and hammers the life out of our servers. Since having Googlebot crawl and de-index tens of millions of old URLs would probably take years (?), my plan is this:
1. 301 redirect all old SERP URLs to a new SERP URL.
2. If the new URL should not be indexed, add a meta robots noindex tag to the new URL.
3. When it is evident that Google has indexed most "high quality" new URLs, disallow crawling of the old SERP URLs in robots.txt.
4. Then remove all old SERP URLs directory-style in the GWT URL Removal Tool.
This would be an example of an old URL: www.site.com/cgi-bin/weirdapplicationname.cgi?word=bmw&what=1.2&how=2
This would be an example of a new URL: www.site.com/search?q=bmw&category=cars&color=blue
I have two specific questions: Would Google both de-index the old URL and not index the new URL after 301 redirecting the old URL to the new URL (which is noindexed), as described in point 2 above? And what risks are associated with removing tens of millions of URLs directory-style in the GWT URL Removal Tool? I have done this before, but then I removed "only" some 50,000 useless "add to cart" URLs. Google themselves say that you should not remove duplicate/thin content this way and that using the tool this way "may cause problems for your site". And yes, these tens of millions of SERP URLs are the result of a faceted navigation/search function let loose for far too long. And no, we cannot wait for Googlebot to crawl all these millions of URLs in order to discover the 301s. By then we would be out of business. Best regards, TalkInThePark
Technical SEO | TalkInThePark -
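A rough sketch of steps 2 and 3 of that plan, using the example URLs from the question:

  <!-- in the <head> of the new /search template, while it should stay out of the index -->
  <meta name="robots" content="noindex">

  # robots.txt, added only once Google has processed the 301s from the old URLs
  User-agent: *
  Disallow: /cgi-bin/weirdapplicationname.cgi

Ordering matters here: if the old cgi-bin URLs are disallowed too early, Googlebot can no longer fetch them and so never sees the 301 redirects.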
How to block "print" pages from indexing
I have a fairly large FAQ section and every article has a "print" button. Unfortunately, this is creating a page for every article which is muddying up the index - especially on my own site using Google Custom Search. Can you recommend a way to block this from happening? Example Article: http://www.knottyboy.com/lore/idx.php/11/183/Maintenance-of-Mature-Locks-6-months-/article/How-do-I-get-sand-out-of-my-dreads.html Example "Print" page: http://www.knottyboy.com/lore/article.php?id=052&action=print
Technical SEO | dreadmichael
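One possible approach, assuming every print version goes through article.php with action=print as in the example URL: either add a robots meta tag to the print template's <head>:

  <meta name="robots" content="noindex">

or keep crawlers out of the print URLs in robots.txt (Googlebot understands the * wildcard):

  User-agent: *
  Disallow: /*action=print

Just don't combine the two for the same URLs: once a page is blocked in robots.txt, crawlers can no longer fetch it to see the noindex tag.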