Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
How to fix issues regarding URL parameters?
- Today, I was reading Google's help article about URL parameters: http://www.google.com/support/webmasters/bin/answer.py?answer=1235687. I came to know that Google gives weight to URLs whose parameters change or determine the content of a page. There are too many pages on my website with similar values for name, price, and number of products, but I have restricted all of those pages in robots.txt with the following syntax.
 URLs:
 http://www.vistastores.com/table-lamps?dir=asc&order=name
 http://www.vistastores.com/table-lamps?dir=asc&order=price
 http://www.vistastores.com/table-lamps?limit=100
 Syntax in robots.txt:
 Disallow: /?dir=
 Disallow: /?p=
 Disallow: /*?limit=
 Now I am confused. Which is the best solution to get the maximum SEO benefit?
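One thing worth noting about the rules quoted above: robots.txt patterns match from the start of the URL path, with `*` as a wildcard and `$` as an end anchor, so `Disallow: /?dir=` only matches query strings on the root path, while `Disallow: /*?dir=` matches them on any path. A minimal Python sketch of that Google-style matching (an illustration of the semantics, not Google's actual code):

```python
import re

def robots_rule_matches(rule_path: str, url_path: str) -> bool:
    """Check a Google-style robots.txt Disallow pattern against a URL path
    (including its query string): the pattern anchors at the start of the
    path, '*' matches any run of characters, and a trailing '$' anchors
    the end of the URL."""
    pattern = re.escape(rule_path).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    return re.match(pattern, url_path) is not None

# The rules from the question, tried against the listed URLs:
print(robots_rule_matches("/?dir=", "/table-lamps?dir=asc&order=name"))   # False: matches the root path only
print(robots_rule_matches("/*?dir=", "/table-lamps?dir=asc&order=name"))  # True: wildcard covers any path
print(robots_rule_matches("/*?limit=", "/table-lamps?limit=100"))         # True
```

Under this matching, the `/?dir=` and `/?p=` rules would not actually block the table-lamps URLs listed above; a `/*?dir=` and `/*?p=` form would.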
- No, I don't think so. Even if Google thought they were duplicates, it would pick one as the original, so one of them would still rank. If you are still concerned, use the canonical tag rather than removing them from the index.
- So your point is that Google will crawl all of the following pages if I do nothing with them, right?
 http://www.vistastores.com/table-lamps
 http://www.vistastores.com/table-lamps?limit=100&p=2
 http://www.vistastores.com/table-lamps?limit=60&p=2
 http://www.vistastores.com/table-lamps?limit=40&p=2
 Right now my website is on the 3rd page of Google for the keyword "Discount Table Lamps". My fear is that if Google crawls multiple pages with duplicate title tags, it may mess up my current ranking for that keyword. What do you think?
- If the content is different, then don't do anything; but if it is duplicate, use the canonical tag. The meta tags are not a problem - you are not going to get flagged for that. It would be better if you could make them unique, but this is a very small problem.
- Will it really work? Both pages have different content: http://www.vistastores.com/table-lamps has 100 products and http://www.vistastores.com/table-lamps?limit=100&p=2 has a different, unique 100 products. Another problem is the meta information: both pages have the same meta info. If Google indexes both pages, it may raise warnings about duplicate meta info across too many pages.
- That's the advice I gave you: put a canonical tag in the page, <link rel="canonical" href="http://www.vistastores.com/table-lamps" />, so that if Google finds http://www.vistastores.com/table-lamps?dir=asc&order=name it will know it is meant to be http://www.vistastores.com/table-lamps.
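To emit that tag automatically across a catalogue, the canonical URL can be computed by stripping the presentation-only query parameters. A minimal Python sketch, assuming dir, order, and limit are the only presentation parameters; the p pagination parameter is kept, since the thread notes page 2 holds a different set of products:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameters that only change presentation (sort order, page size),
# not which content the page belongs to.
PRESENTATION_PARAMS = {"dir", "order", "limit"}

def canonical_url(url: str) -> str:
    """Drop presentation-only query parameters, keeping the rest."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in PRESENTATION_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), parts.fragment))

print(canonical_url("http://www.vistastores.com/table-lamps?dir=asc&order=name"))
# http://www.vistastores.com/table-lamps
print(canonical_url("http://www.vistastores.com/table-lamps?limit=100&p=2"))
# http://www.vistastores.com/table-lamps?p=2
```

Whether p should also be stripped depends on whether the paginated pages are meant to rank on their own, which is exactly the judgment call debated above.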
- Honestly, I am not getting it, because the Google help article about URL parameters that I read says something different: Google suggests handling this through Google Webmaster Tools. But I have already restricted all dynamic pages in robots.txt. So I want to know the best practice that will help me improve my crawling and number of indexed pages.
- I would simply put a rel=canonical in the page pointing to the true URL, so search engines will see them as one page. It is better to use the canonical tag for the reasons in the Google doc you posted: Google may not pick the best URL to be the canonical, so you should make that choice for it.
Related Questions
- Mass URL changes and redirecting those old URLs to the new. What is the SEO risk and what are best practices?
 Hello good people of the MOZ community, I am looking to do a mass edit of URLs on content pages within our sites. The way these were initially set up was to be unique by having the date in the URL, which was a few years ago and can make evergreen content now seem dated. The new URLs would follow a better folder-path naming convention and would be much better URLs overall. Some examples of the old URLs would be:
 https://www.inlineskates.com/Buying-Guide-for-Inline-Skates/buying-guide-9-17-2012,default,pg.html
 https://www.inlineskates.com/Buying-Guide-for-Kids-Inline-Skates/buying-guide-11-13-2012,default,pg.html
 https://www.inlineskates.com/Buying-Guide-for-Inline-Hockey-Skates/buying-guide-9-3-2012,default,pg.html
 https://www.inlineskates.com/Buying-Guide-for-Aggressive-Skates/buying-guide-7-19-2012,default,pg.html
 The new URLs would look like this, which would be a great improvement:
 https://www.inlineskates.com/Learn/Buying-Guide-for-Inline-Skates,default,pg.html
 https://www.inlineskates.com/Learn/Buying-Guide-for-Kids-Inline-Skates,default,pg.html
 https://www.inlineskates.com/Learn/Buying-Guide-for-Inline-Hockey-Skates,default,pg.html
 https://www.inlineskates.com/Learn/Buying-Guide-for-Aggressive-Skates,default,pg.html
 My worry is that we do rank fairly well organically for some of the content and I don't want to anger the Google machine. The way I would do it is to edit the URLs to the new layout, then set up the redirects and push live. Is there a great SEO risk to doing this? Is there a way to do a mass "Fetch as Googlebot" to reindex these if I do, say, 50 a day? I only see the ability to do 1 URL at a time in the Webmaster backend. Is there anything else I am missing? I believe this change would overall be good in the long run, but I do not want to take a huge hit initially by doing something incorrectly. This would be done on 5 to a couple hundred links across various sites I manage. Thanks in advance, Chris Gorski
 Intermediate & Advanced SEO | kirin44355
- If I block a URL via robots.txt, how long will it take for Google to stop indexing that URL?
 Intermediate & Advanced SEO | Gabriele_Layoutweb
- Inactive Products - Inactive URLs
 Hi, on our website www.viatrading.com we have many products that might be in stock or not depending on availability. Until now, when a product was no longer available, we took its page down (and redirected to its product category page), and only if the product became available again did we re-activate the URL - sometimes days, months or even years later. To make this more SEO-friendly, we have now decided that while a product is unavailable, instead of deactivating/redirecting the page, we will leave it online and just add a message saying "This product is currently not available". If we do this, we will automatically re-activate about 500 product pages at once. 1. Just to make sure, is it harmful for SEO to keep activating/deactivating URLs this way? 2. Since most of these pages have been deindexed for a long time due to being redirected, have they lost all their SEO juice? 3. How can we best re-activate these old 500 pages - is it OK to activate them all at once? Thank you,
 Intermediate & Advanced SEO | viatrading1
- Will disallowing URLs in the robots.txt file stop those URLs being indexed by Google?
 I found a lot of duplicate title tags showing in Google Webmaster Tools. When I visited the URLs that these duplicates belonged to, I found that they were just images from a gallery that we didn't particularly want Google to index. There is no benefit to the end user in these image pages being indexed in Google. Our developer has told us that these URLs are created by a module and are not "real" pages in the CMS. They would like to add the following to our robots.txt file: Disallow: /catalog/product/gallery/ QUESTION: If these pages are already indexed by Google, will this adjustment to the robots.txt file help to remove the pages from the index? We don't want these pages to be found.
 Intermediate & Advanced SEO | andyheath
- Should I disallow all URL query strings/parameters in robots.txt?
 Webmaster Tools correctly identifies the query strings/parameters used in my URLs, but still reports duplicate title tags and meta descriptions for the original URL and the versions with parameters. For example, Webmaster Tools would report duplicates for the following URLs, despite correctly identifying the "cat_id" and "kw" parameters:
 /Mulligan-Practitioner-CD-ROM
 /Mulligan-Practitioner-CD-ROM?cat_id=87
 /Mulligan-Practitioner-CD-ROM?kw=CROM
 Additionally, these pages have self-referential canonical tags, so I would think I'd be covered, but I recently read that another Mozzer saw a great improvement after disallowing all query/parameter URLs, despite Webmaster Tools not reporting any errors. As I see it, I have two options: 1. Manually tell Google that these parameters have no effect on page content via the URL Parameters section in Webmaster Tools (in case Google is unable to detect this automatically and I am being penalized as a result). 2. Add "Disallow: *?" to hide all query/parameter URLs from Google. My concern here is that most backlinks include the parameters, and in some cases these parameter URLs outrank the original. Any thoughts?
 Intermediate & Advanced SEO | jmorehouse
- Do UTM URL parameters hurt SEO backlink value?
 Do www.example.com and www.example.com/?utm_source=Google&utm_medium=Press+Release&utm_campaign=Google have the same SEO backlink value? I would assume that Google knows the difference.
 Intermediate & Advanced SEO | mkhGT
- Is it safe to redirect multiple URLs to a single URL?
 Hi, I have an old WordPress website with about 300-400 original pages of content on it, all relating to my company's industry: travel in Africa. It's a legitimate site with travel stories, photos, advice, etc. Nothing spammy about it. No adverts on it. No affiliates. The site hasn't been updated for a couple of years and we no longer have a need for it. Many of the stories on it are quite out of date. The site has built up a modest mozRank value over the last 5 years, and has a few hundred organically achieved inbound links. Recently I set up a swanky new branded website on ExpressionEngine on a new domain. My intention is to: shut down the old site; focus all attention on building up content on the new website; ask the people linking to the old site to link to my new site instead (I wonder how many will actually do so...); where possible, set up a 301 redirect from pages on the old site to their closest match on the new site; and set up a 301 redirect from the old site's home page to the new site's homepage. Sounds good, right? But there is one issue I need some advice on... The old site has about 100 pages that do not have a good match on the new site. These pages are outdated or of inferior quality, so it doesn't really make sense to rewrite them and put them on the new site. I call these my "black sheep pages". So... for these "black sheep pages" should I (A) redirect the URLs to the new site's homepage, (B) redirect the URLs to the old site's home page (which in turn redirects to the new site's homepage), or (C) not redirect the URLs, and let them die a lonely 404 death?
 OPTION A:
 oldsite.com/page1.php -> newsite.com
 oldsite.com/page2.php -> newsite.com
 oldsite.com/page3.php -> newsite.com
 oldsite.com/page4.php -> newsite.com
 oldsite.com/page5.php -> newsite.com
 oldsite.com -> newsite.com
 OPTION B:
 oldsite.com/page1.php -> oldsite.com
 oldsite.com/page2.php -> oldsite.com
 oldsite.com/page3.php -> oldsite.com
 oldsite.com/page4.php -> oldsite.com
 oldsite.com/page5.php -> oldsite.com
 oldsite.com -> newsite.com
 OPTION C:
 oldsite.com/page1.php : do not redirect, let page 404 and disappear forever
 oldsite.com/page2.php : do not redirect, let page 404 and disappear forever
 oldsite.com/page3.php : do not redirect, let page 404 and disappear forever
 oldsite.com/page4.php : do not redirect, let page 404 and disappear forever
 oldsite.com/page5.php : do not redirect, let page 404 and disappear forever
 oldsite.com -> newsite.com
 My intuition tells me that Option A would pass the most "link juice" to my new site, but I am concerned that it could also be seen by Google as a spammy redirect technique. What would you do? Help 😐
 Intermediate & Advanced SEO | AndreVanKets
- URL length or exact breadcrumb-navigation URL: what's more important?
 Basically my question is as follows, what's better: www.romancingdiamonds.com/gemstone-rings/amethyst-rings/purple-amethyst-ring-14k-white-gold (this would fully match the breadcrumbs), or www.romancingdiamonds.com/amethyst-rings/purple-amethyst-ring-14k-white-gold (cutting out the first-level folder to keep the URL shorter and bring the important keywords closer to the root domain)? In this question http://www.seomoz.org/qa/discuss/37982/url-length-vs-url-keywords I was advised to drop a folder in my URL because it may be too long. That's why I'm hesitant to keep the breadcrumb structure the same. To the best of your knowledge, do you think it's best to drop a folder in the URL to keep it shorter and sweeter, or to have a longer URL that matches the breadcrumb structure? Please advise, Shawn
 Intermediate & Advanced SEO | Romancing