Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Prevent link juice from flowing to low-value pages
- Hello there! Most websites have links to low-value pages in their main navigation (header or footer), which makes those pages reachable from every other page. I'm thinking especially of "Conditions of Use" or "Privacy Notice" pages, which have no value for SEO. What I would like is to prevent link juice from flowing into those pages, while still keeping the links for visitors. What is the best way to achieve this?
- Put a rel="nofollow" attribute on those links?
- Put a "robots" meta tag containing "noindex,nofollow" on those pages?
- Put a "Disallow" for those pages in a "robots.txt" file?
- Use JavaScript links that crawlers won't be able to follow?
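For concreteness, options 1 and 4 would look roughly like this in the page markup (a sketch only; the /privacy-notice URL is a made-up example):

```html
<!-- Option 1: keep the link for visitors but mark it nofollow -->
<a href="/privacy-notice" rel="nofollow">Privacy Notice</a>

<!-- Option 4: a JavaScript "link" with no crawlable href;
     works for visitors with JS enabled, invisible to most crawlers -->
<span class="nav-link" onclick="window.location='/privacy-notice'">Privacy Notice</span>
```

Note that the JavaScript variant breaks for visitors without JavaScript and for assistive technology that expects a real anchor element.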
 
- Mmh, good point. I'd never heard that a "privacy policy page" could be a trust signal. Is there an article somewhere that talks about this? Well, I took those two pages as an example, but my question was about preventing link juice from flowing to non-SEO pages in general. Thanks a lot for your answers!
- Exactly, and what I also try to explain to people is that a privacy-policy-type page is an additional signal for Google when it tries to understand what type of site you are and how trustworthy it is. Why in the world would you noindex something like that?
- As I understand it, nofollow still dilutes your link juice even though it does not pass PageRank (theoretically). Google made this announcement in 2009 to combat PageRank sculpting. Here is a post from Rand about it. Unless something has changed that I am not aware of, you could place the link in an iframe and Google will not see it, nor will it dilute the PageRank you pass out.
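The iframe idea mentioned here would look something like the sketch below (the /footer-links.html path is hypothetical; the framed document would contain only the legal links):

```html
<!-- parent page: the low-value links live in a separate framed document -->
<iframe src="/footer-links.html" title="Legal links" height="40"></iframe>
```

Disallowing /footer-links.html in robots.txt as well would keep the framed document itself from being crawled.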
- Great suggestions. I've recently combined some pages (login/register, about/contact/ToS/privacy, and a few others) and have been very happy with the results. I removed 8 links from every page. I am also thinking about removing some more links from my product pages, to try and keep the most juice on those pages. Those pages don't need the same navigation as the homepage.
- It depends on what your purpose is. If you want them totally blocked from being indexed, then putting the page in the robots.txt file or using a robots meta tag would work fine. If you just want to de-emphasize the page to the search engines, you can use nofollows or JavaScript links on footer/header links. One thing that we have done is to combine some of these pages (terms and privacy) into one page to cut down on the number of total links on each page. You could also not include the privacy page link on every page (depending on your site) but just link it from certain pages that collect sensitive data (near the form). I hope this helps. The main thing to remember is that each site is different, so you will have to adjust your tactics depending on precisely what you are trying to accomplish.
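As a concrete sketch of the two "totally blocked" options in this answer (the paths are hypothetical):

```
# robots.txt — stops compliant crawlers from fetching the pages at all
User-agent: *
Disallow: /terms
Disallow: /privacy
```

Or, per page, the meta tag variant, which allows crawling but forbids indexing:

```html
<meta name="robots" content="noindex">
```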
Related Questions
- Should we rename and update a page or create a new page entirely?
 Hi Moz Peoples! We have a small site with a simple site navigation, with only a few links on the nav bar. We have been doing some work to create a new page, which will eventually replace one of the links on the nav bar. The question we are having is: is it better to rename the existing page and replace its content and then wait for the great indexer to do its thing, or permanently delete the page and replace it with the new page and content? Or is this a case where it really makes no difference as long as the redirects are set up correctly? On-Page Optimization | | Parker8180
- Why are http and https pages showing different domain/page authorities?
 My website www.aquatell.com was recently moved to the Shopify platform. We chose to use the http domain, because we didn't want to change too much, too quickly by moving to https. Only our shopping cart is using https protocol. We noticed however, that https versions of our non-cart pages were being indexed, so we created canonical tags to point the https version of a page to the http version. What's got me puzzled though, is when I use open site explorer to look at domain/page authority values, I get different scores for the http vs. https version. And the https version is always better. Example: http://www.aquatell.com DA = 21 and https://www.aquatell.com DA = 27. Can somebody please help me make sense of this? Thanks, On-Page Optimization | | Aquatell1
- Should I optimize my home-page or a sub-page for my most important keyword
 Quick question: When choosing the most important keyword set that I would like to rank for, would I be better off optimizing my homepage, or a sub page for this keyword. My thinking goes as follows: The homepage (IE www.mysite.com) naturally has more backlinks and thus a better Google Page Rank. However, there are certain things I could do to a subpage (IE www.mysite.com/green-widgets-los-angeles ) that I wouldn't want to do to the homepage, which might be more "optimal" overall. Option C, I suppose, would be to optimize both the homepage, and a single sub-page, which is seeming like a pretty good solution, but I have been told that having multiple pages optimized for the same keywords might "confuse" search engines. Would love any insight on this! On-Page Optimization | | Jacob_A2
- Mega Menus? A good or bad idea for link juice.
 Hi, just wondering what people think of using mega menus for navigation? We have used them on our new site http://nicontrols.com/uk/ When I run the site through the excellent SEOMoz campaign tools I see that we have too many on-page links. I now believe the menu is good for customers but maybe not for link juice. Anyone got any ideas? Do I remove the mega menu or just reduce the number of links? Many thanks, David On-Page Optimization | | DavidLenehan0
- 301 redirects from several sub-pages to one sub-page
 Hi! I have 14 sub-pages I deleted earlier today. But of course Google can still find them, and gives everyone that gives them a go a 404 error. I have come to the understanding that this will hurt the rest of my site, at least as long as Google has them indexed. These sub-pages lie in 3 different folders, and I want to redirect them to a sub-page in a fourth folder. I already have an htaccess file, but I simply can't get it to work! It is the same file I use for redirecting traffic from mydomain.no to www.mydomain.no, and I have tried every variation I can think of with the sub-pages. Has anyone perhaps had the same problem before, or for any other reason has the solution, and can help me with how to compose the htaccess file? 🙂 You have to excuse me if I'm using the wrong terms, missing something I should have seen underwater while wearing a blindfold, or misspelling anything. I am neither very experienced with anything surrounding SEO or anything else to do with the internet, nor am I from an English-speaking country. Hope someone here can light up my path 🙂 That's at least something you can say in Norwegian... On-Page Optimization | | MarieA1
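The question above never got a concrete rule set, so here is a hedged sketch: assuming the deleted pages lived in hypothetical folders /old-a/, /old-b/ and /old-c/, and the target page is /new/landing/, mod_alias rules in .htaccess could look like:

```apache
# Redirect individual deleted pages (301 = moved permanently)
Redirect 301 /old-a/page1.html /new/landing/
Redirect 301 /old-b/page2.html /new/landing/

# Or catch everything left in one folder with a single pattern
RedirectMatch 301 ^/old-c/ /new/landing/
```

One caveat: if the existing www redirect is written with mod_rewrite (RewriteRule), mixing it with mod_alias directives can interact in surprising ways; writing the 301s as RewriteRule lines instead keeps everything in one module.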
- Creating New Pages Versus Improving Existing Pages
 What are some things to consider or things to evaluate when deciding whether you should focus resources on creating new pages (to cover more related topics) versus improving existing pages (adding more useful information, etc.)? On-Page Optimization | | SparkplugDigital0
- Tag clouds: good for internal linking and increase of keyword relevant pages?
 As Matt Cutts explained, tag clouds are OK if you're not engaged in keyword stuffing (http://www.youtube.com/watch?v=bYPX_ZmhLqg) - i.e. if you're not putting in 500 tags. I'm currently creating tags for an online-bookseller; just like Amazon this e-commerce-site has potentially a couple of million books. Tag clouds will be added to each book detail page in order to enrich each of these pages with relevant keywords both for search engines and users (get a quick overview over the main topics of the book; navigate the site and find other books associated with each tag). Each of these book-specific tag clouds will hold up to 50 tags max, typically rather in the range of up to 10-20. From an SEO perspective, my question is twofold: 1. Does the site benefit from these tag clouds by improving the internal linking structure? 2. Does the site benefit from creating lots of additional tag-specific-pages (up to 200k different tags) or can these pages become a problem, as they don't contain a lot of rich content as such but rather lists of books associated with each tag? Thanks in advance! On-Page Optimization | | semantopic0
- Avoiding "Duplicate Page Title" and "Duplicate Page Content" - Best Practices?
 We have a website with a searchable database of recipes. You can search the database using an online form with dropdown options for: Course (starter, main, salad, etc), Cooking Method (fry, bake, boil, steam, etc), and Preparation Time (Under 30 min, 30min to 1 hour, Over 1 hour). Here are some examples of how URLs may look when searching for a recipe: find-a-recipe.php?course=starter, find-a-recipe.php?course=main&preperation-time=30min+to+1+hour, find-a-recipe.php?cooking-method=fry&preperation-time=over+1+hour. There is also pagination of search results, so the URL could also have the variable "start", e.g. find-a-recipe.php?course=salad&start=30. There can be any combination of these variables, meaning there are hundreds of possible search results URL variations. This all works well on the site, however it gives multiple "Duplicate Page Title" and "Duplicate Page Content" errors when crawled by SEOmoz. I've searched online and found several possible solutions for this, such as: setting a canonical tag; adding these URL variables to Google Webmasters to tell Google to ignore them; changing the title tag in the head dynamically based on what URL variables are present. However I am not sure which of these would be best. As far as I can tell the canonical tag should be used when you have the same page available at two separate URLs, but this isn't the case here as the search results are always different. Adding these URL variables to Google Webmasters won't fix the problem in other search engines, and we will presumably continue to get these errors in our SEOmoz crawl reports. Changing the title tag each time can lead to very long title tags, and it doesn't address the problem of duplicate page content. I had hoped there would be a standard solution for problems like this, as I imagine others will have come across this before, but I cannot find the ideal solution. Any help would be much appreciated. Kind Regards On-Page Optimization | | smaavie5
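For the canonical option discussed in this question, every filtered result page would carry a tag pointing at one preferred URL, e.g. (the domain and the choice of target are hypothetical):

```html
<link rel="canonical" href="https://www.example.com/find-a-recipe.php">
```

Canonical tags fit best when the variants are near-duplicates; since these result sets genuinely differ, putting `<meta name="robots" content="noindex,follow">` on the parameterized URLs is another common pattern that removes the duplicates from the index while still letting crawlers follow links to the recipes.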