How to stop google from indexing specific sections of a page?
-
I'm currently trying to find a way to stop Googlebot from indexing specific areas of a page. A while back, Yahoo Search introduced the class="robots-nocontent" attribute for this, and I'm trying to find out whether Google offers something similar or has adopted the same markup.
Any help would be much appreciated.
-
Unfortunately, there is no officially sanctioned method for blocking just a portion of a page from the index. As others have mentioned, there are tricks that might do it, but their effectiveness is inconsistent, and most of them will run the risk that Google could treat it as a red flag of some sort. More often, the results just end up being unpredictable (especially with JavaScript) and end up causing additional grief for your developers and visitors.
Most of the time, if you're dealing with substantial amounts of content you don't want indexed, I'd look for other solutions, such as grouping that content or making sure more of your content on any given page is unique. Unfortunately, that depends a lot on why you want it blocked, so it's hard to give a one-size-fits-all answer.
-
We have just faced a similar conundrum and opted for the iframe option, blocking the iframe's source URL in robots.txt.
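For anyone wanting to try the same thing, here's a minimal sketch of that setup; the folder and file names are placeholders, and the exact markup will depend on your site:
<code>
<!-- robots.txt at the site root (shown here as a comment; it is a separate plain-text file):
     User-agent: *
     Disallow: /no-index-snippets/
-->

<!-- In the parent page, load the content you want kept out of the index from the disallowed folder -->
<iframe src="/no-index-snippets/boilerplate-text.html" width="100%" height="300" title="Supplementary text"></iframe>
</code>
One caveat: a robots.txt-disallowed URL can still appear in the index as a URL-only result if it is linked elsewhere, so this hides the text from the parent page rather than guaranteeing Google never references it.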
-
I don't know this for a fact, but I wouldn't be surprised if hiding specific content on a page from Google turned out to be a poor trust signal and had its own downsides.
-
Google is getting much better at reading JavaScript, however.
-
I'm going to avoid iframes, but the JavaScript route does sound like the best option so far. Thank you!
-
You might try inserting your text via JavaScript, or maybe placing it in an iframe.
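If it helps, here's a rough sketch of the JavaScript version, with made-up element IDs and placeholder text; as noted elsewhere in the thread, Google renders JavaScript increasingly well, so treat this as obfuscation rather than a guarantee:
<code>
<!-- Empty container: the text is not present in the page's initial HTML source -->
<div id="extra-text"></div>

<script>
  // Inject the text only after the DOM has loaded, so it never appears in the raw markup.
  document.addEventListener("DOMContentLoaded", function () {
    document.getElementById("extra-text").textContent =
      "Repeated boilerplate you would prefer Google not to index.";
  });
</script>
</code>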
-
Ah OK, looks like I still need to look into this further. If you do find anything, I'd love to hear how it can be achieved, as I think it would be a useful technique to implement in some projects.
-
Ahhh, unfortunately the googleon / googleoff tags only work in conjunction with the Google Search Appliance. If that's changed, though, it would be incredibly useful.
-
Here is the article this was taken from: http://perishablepress.com/tell-google-to-not-index-certain-parts-of-your-page/
-
This is a good question and something I haven't looked into. From articles I've read, I think this may be what you are searching for.
<code>
This is normal (X)HTML content that will be indexed by Google.
<!--googleoff: index-->
This (X)HTML content will NOT be indexed by Google.
<!--googleon: index-->
</code>
Related Questions
-
Page Indexing without content
Hello. I have a problem with a page indexing without content. I have a website in 3 different languages; 2 of the language versions are indexing just fine, but one language (the most important one) is being indexed without content. When searching using site:, the page comes up, but when searching unique keywords for which I should rank 100%, nothing comes up. The page was indexing just fine, and the problem arose a couple of days ago after a Google update finished. Looking further, the problem is language related: every page in the given language that is newly indexed has this problem, while pages that were last crawled around one week ago are just fine. Has anyone run into this type of problem?
Technical SEO | AtuliSulava -
How to index e-commerce marketplace product pages
Hello! We are an online marketplace that submitted our sitemap through Google Search Console 2 weeks ago. Although the sitemap was submitted successfully, out of ~10,000 links (we have ~10,000 product pages), only 25 have been indexed. I've attached screenshots of the reasons given for not indexing the pages. How would we go about fixing this?
Technical SEO | fbcosta -
Subdomain as News Section instead of Source in Google News?
Hi, trying to dig into Google News for a large site, mostly containing news.
The structure of the site network is subdomain.domain.se, and each subdomain has its own brand with its own news:
x.domain.se
y.domain.se
z.domain.se
etc. Each brand/subdomain more or less equates to its own subject field/section. In Google News, every subdomain is configured with its own source URL, but each also has one section set up with the same URL. It seems like this creates conflicts in Google News: Google can't always match a news article to the right brand. Example: an article owned by brand A sometimes gets labeled as brand B in the news SERP, though the link correctly takes you to brand A. I am thinking that this configuration in the News Publisher Center may be the problem. Does anyone have thoughts on whether it would be better to delete all source URLs except the domain.se brand and then add all the other subdomains as sections (www.domain.se, x.domain.se, y.domain.se, z.domain.se)? Any smart thoughts on this one? Or anything else that could cause the wrong labeling (all content, including images, is hosted on the same domain, for example)? Regards, Magnus
Technical SEO | m.m -
How To Cleanup the Google Index After a Website Has Been HACKED
We have a client whose website was hacked, and some troll created thousands of viagra pages, which were all indexed by Google (see the screenshot for an example). The site has been cleaned up completely, but I wanted to know if anyone can weigh in on how we can clean up the Google index. Are there extra steps we should take? So far we have gone into Webmaster Tools and submitted a new sitemap.
Technical SEO | yoursearchteam -
How to stop my webmail pages from being indexed on Google?
When I did a search in Google for site:mywebsite.com to get a list of indexed pages, surprisingly the following came up: "Webmail - Login". Although this is associated with the domain, it is on a completely different server; it is the Rackspace email server's browser interface. I am sure there is nothing on the website that links or points to this. So why is Google indexing it, and how do I get it out of there? I tried in Webmaster Tools but I could not, as it seems to be a sub-domain. Any ideas? Thanks, Naresh Sadasivan
Technical SEO | UIPL -
De-indexed from Google
Hi Search Experts! We are just launching a new site for a client with a completely new URL. The client cannot provide any access details for their existing site. Any ideas on how we can get the existing site de-indexed from Google? Thanks guys!
Technical SEO | rikmon -
De-indexing millions of pages - would this work?
Hi all, We run an e-commerce site with a catalogue of around 5 million products. Unfortunately, we have let Googlebot crawl and index tens of millions of search URLs, the majority of which are very thin on content or duplicates of other URLs. In short: we are in deep. Our bloated Google index is hampering our real content from ranking; Googlebot does not bother crawling our real content (product pages specifically) and hammers the life out of our servers. Since having Googlebot crawl and de-index tens of millions of old URLs would probably take years (?), my plan is this:
1. 301 redirect all old SERP URLs to a new SERP URL.
2. If the new URL should not be indexed, add a meta robots noindex tag on the new URL.
3. When it is evident that Google has indexed most "high quality" new URLs, disallow crawling of the old SERP URLs in robots.txt.
4. Then remove all old SERP URLs directory-style in the GWT URL Removal Tool.
(A sketch of what steps 2 and 3 might look like follows after this question.) This would be an example of an old URL:
www.site.com/cgi-bin/weirdapplicationname.cgi?word=bmw&what=1.2&how=2
This would be an example of a new URL:
www.site.com/search?q=bmw&category=cars&color=blue
I have two specific questions: Would Google both de-index the old URL and not index the new URL after 301 redirecting the old URL to the new URL (which is noindexed), as described in point 2 above? What risks are associated with removing tens of millions of URLs directory-style in the GWT URL Removal Tool? I have done this before, but then I removed "only" some useless 50,000 "add to cart" URLs. Google says themselves that you should not remove duplicate/thin content this way and that using the tool this way "may cause problems for your site". And yes, these tens of millions of SERP URLs are the result of a faceted navigation/search function let loose all too long. And no, we cannot wait for Googlebot to crawl all these millions of URLs in order to discover the 301s. By then we would be out of business. Best regards, TalkInThePark
Technical SEO | TalkInThePark -
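For illustration, here is a minimal sketch of what steps 2 and 3 of the plan above might look like, using the example URL patterns from the question; it only shows the noindex/robots.txt mechanics, not a judgement on whether the overall approach is advisable:
<code>
<!-- Step 2: on the new search URL (e.g. /search?q=bmw&category=cars), add a meta robots noindex
     so Google can follow the 301 and then drop the page from the index -->
<head>
  <meta name="robots" content="noindex, follow">
</head>

<!-- Step 3: only once the old URLs have been re-crawled and dropped, block the old pattern
     in robots.txt (shown here as a comment; robots.txt is a separate plain-text file):
     User-agent: *
     Disallow: /cgi-bin/
-->
</code>
The ordering matters: if the old /cgi-bin/ URLs are disallowed in robots.txt before Googlebot has re-crawled them, it will never see the 301s or the noindex on the destination, and the old URLs can linger in the index.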
Dynamically-generated .PDF files, instead of normal pages, indexed by and ranking in Google
Hi, I've come across a tough problem. I am working on an online store website which includes the functionality of viewing product details in .PDF format (by the way, the website is built on the Joomla CMS). Now, when I search my site's name in Google, the SERP simply displays my .PDF files in the first couple of positions (shown in the normal [PDF] result format), and I cannot find the normal pages on SERP #1 unless I search the full site domain in Google. I really don't want this! Would you please tell me how to figure out and solve the problem? I can actually remove the corresponding component (Virtuemart) that is in charge of generating the .PDF files. For now I am trying to redirect all the .PDF pages ranking in Google to a 404 page and remove the functionality, and I plan to regenerate a sitemap of my site and submit it to Google. Will that work for me? I'd really appreciate it if you could help solve this problem. Thanks very much. Sincerely
Technical SEO | fugu