Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
How to fix issues regarding URL parameters?
- Today, I was reading Google's help article on URL parameters:
 http://www.google.com/support/webmasters/bin/answer.py?answer=1235687
 From it I learned that Google gives weight to URL parameters that change or determine the content of a page. My website has many pages that differ only in sort order (name, price) and in the number of products shown, but I have blocked all of those pages in robots.txt with the following syntax.
 URLs:
 http://www.vistastores.com/table-lamps?dir=asc&order=name
 http://www.vistastores.com/table-lamps?dir=asc&order=price
 http://www.vistastores.com/table-lamps?limit=100
 Syntax in robots.txt:
 Disallow: /?dir=
 Disallow: /?p=
 Disallow: /*?limit=
 Now I am confused. Which is the best solution to get the maximum SEO benefit?
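A note on the robots.txt syntax quoted above: in Google's robots.txt matching, a Disallow value is treated as a prefix of the URL path plus query string, so Disallow: /?dir= and Disallow: /?p= only block URLs that literally begin with /?dir= or /?p= and would not block /table-lamps?dir=asc&order=name. If the goal really were to block every parameterised listing URL, wildcard rules along the following lines would be needed (a sketch only; note that a page blocked from crawling can never pass on signals such as a canonical tag):
 User-agent: *
 # Block sort-order and page-size variants anywhere on the site
 Disallow: /*?dir=
 Disallow: /*?limit=
 # p= can appear as the first or a later parameter
 Disallow: /*?p=
 Disallow: /*&p=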
- No, I don't think so. Even if Google thought they were duplicates, it would pick one as the original, so one of them will still rank. If you are still concerned, use the canonical tag rather than removing them from the index.
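For reference, the canonical tag mentioned in this reply is a link element placed in the head of each parameterised page, pointing at the preferred URL. Using the URLs from the question, it would look like this:
 <!-- in the <head> of http://www.vistastores.com/table-lamps?dir=asc&order=name -->
 <link rel="canonical" href="http://www.vistastores.com/table-lamps" />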
- So your suggestion is that Google will crawl all of the following pages if I do not do anything with them, right?
 http://www.vistastores.com/table-lamps
 http://www.vistastores.com/table-lamps?limit=100&p=2
 http://www.vistastores.com/table-lamps?limit=60&p=2
 http://www.vistastores.com/table-lamps?limit=40&p=2
 My website is currently on the 3rd page of Google for the keyword "Discount Table Lamps". My fear is that if Google crawls multiple pages with duplicate title tags, it may mess up my current ranking for that keyword. What do you think?
- If the content is different, then don't do anything; if it is duplicate, use the canonical tag. The duplicate meta tags are not a problem and you are not going to get flagged for them. It would be better if you could make them unique, but this is a very small issue.
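On the "make them unique" point: one common approach (a sketch, not something prescribed in this thread, and the titles below are hypothetical) is to work the page number or sort order into each variant's title and meta description, for example:
 <!-- http://www.vistastores.com/table-lamps -->
 <title>Discount Table Lamps | Vista Stores</title>
 <!-- http://www.vistastores.com/table-lamps?limit=100&p=2 -->
 <title>Discount Table Lamps - Page 2 | Vista Stores</title>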
- Will it really work? Both pages have different content: http://www.vistastores.com/table-lamps has 100 products and http://www.vistastores.com/table-lamps?limit=100&p=2 has a different, unique set of 100 products. Another problem concerns the meta information: both pages have the same meta info, so if Google indexes both, it may raise warnings about duplicate meta information across too many pages.
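For paginated listings like these, where each page carries a genuinely different set of products, pointing every page's canonical at page 1 is debatable. Around the time of this thread Google also documented rel="prev"/"next" link elements for paginated series (support for them was dropped years later); on page 2 they looked roughly like this, with the neighbouring URLs assumed here purely for illustration:
 <!-- in the <head> of http://www.vistastores.com/table-lamps?limit=100&p=2 -->
 <link rel="prev" href="http://www.vistastores.com/table-lamps?limit=100" />
 <link rel="next" href="http://www.vistastores.com/table-lamps?limit=100&p=3" />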
- That's the advice I gave you: put a canonical tag in the page, <link rel="canonical" href="http://www.vistastores.com/table-lamps" />. Then if Google finds http://www.vistastores.com/table-lamps?dir=asc&order=name, it will know it is meant to be http://www.vistastores.com/table-lamps.
- Honestly, I am not getting it, because the Google help article about URL parameters that I read suggests something different: it says to handle the parameters in Google Webmaster Tools. But I have blocked all dynamic pages with robots.txt. So I want to know the best practice that will help my crawling and my number of indexed pages.
- I would simply put a rel="canonical" in the page pointing to the true URL, so search engines will see them as one page. It is better to use the canonical tag for the reasons in the Google doc you posted: Google may not pick the best URL to be the canonical, so you should make that choice for it.
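One practical detail worth adding to this exchange: a page that is blocked in robots.txt cannot be crawled, so Google will never see a canonical tag placed on it. For the canonical approach to work, the parameterised URLs have to stay crawlable; where a variant should be kept out of the index entirely but still crawled, a meta robots tag is the usual alternative to a robots.txt block:
 <!-- on the parameterised variant, instead of a robots.txt Disallow -->
 <meta name="robots" content="noindex, follow" />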
Related Questions
- SEO effect of URL with subfolder versus parameters?
 I'll make this quick and simple. Let's say you have a business located in several cities, and you've built individual pages for each city (linked to from a master list of your locations). For SEO purposes, is it better for the URL to be a subfolder or a parameter off the home page URL: https://www.mysite.com/dallas (which is essentially https://www.mysite.com/dallas/index.php) or http://www.mysite.com/?city=dallas (which is essentially https://www.mysite.com/index.php?city=dallas)?
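For what it's worth, the two URL styles usually point at the same underlying script, and the clean subfolder form can be served with a rewrite rule. A minimal Apache sketch (assuming an .htaccess-enabled Apache setup, which the question does not specify, and with an illustrative pattern that would need tightening on a real site):
 RewriteEngine On
 # Serve /dallas and other single-segment city slugs from the underlying script
 RewriteRule ^([a-z-]+)/?$ index.php?city=$1 [L,QSA]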
- Mass URL changes and redirecting those old URLs to the new: what is the SEO risk and what are the best practices?
 Hello good people of the Moz community, I am looking to do a mass edit of URLs on content pages within our sites. The way these were initially set up, a few years ago, was to make them unique by putting the date in the URL, which can now make evergreen content seem dated. The new URLs would follow a better folder-path naming convention and would be much better URLs overall. Some examples of the old URLs:
 https://www.inlineskates.com/Buying-Guide-for-Inline-Skates/buying-guide-9-17-2012,default,pg.html
 https://www.inlineskates.com/Buying-Guide-for-Kids-Inline-Skates/buying-guide-11-13-2012,default,pg.html
 https://www.inlineskates.com/Buying-Guide-for-Inline-Hockey-Skates/buying-guide-9-3-2012,default,pg.html
 https://www.inlineskates.com/Buying-Guide-for-Aggressive-Skates/buying-guide-7-19-2012,default,pg.html
 The new URLs would look like this, which would be a great improvement:
 https://www.inlineskates.com/Learn/Buying-Guide-for-Inline-Skates,default,pg.html
 https://www.inlineskates.com/Learn/Buying-Guide-for-Kids-Inline-Skates,default,pg.html
 https://www.inlineskates.com/Learn/Buying-Guide-for-Inline-Hockey-Skates,default,pg.html
 https://www.inlineskates.com/Learn/Buying-Guide-for-Aggressive-Skates,default,pg.html
 My worry is that we rank fairly well organically for some of this content and I don't want to anger the Google machine. My process would be to change the URLs to the new layout, set up the redirects for them, and push live. Is there a great SEO risk in doing this?
 Is there a way to do a mass "Fetch as Googlebot" to reindex these if I do, say, 50 a day? I only see the ability to do one URL at a time in the Webmaster Tools backend.
 Is there anything else I am missing? I believe this change would be good in the long run, but I do not want to take a huge hit initially by doing something incorrectly. This would be done on 5 to a couple of hundred links across the various sites I manage. Thanks in advance,
 Chris Gorski
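Since the question already lists an exact old-to-new mapping, the redirects themselves are one-to-one 301s. On an Apache front end (the question does not say what the sites actually run on) the first two would look roughly like this, with the remaining guides following the same pattern:
 # Permanent redirects from the dated URLs to the new /Learn/ URLs (mapping taken from the question)
 Redirect 301 /Buying-Guide-for-Inline-Skates/buying-guide-9-17-2012,default,pg.html https://www.inlineskates.com/Learn/Buying-Guide-for-Inline-Skates,default,pg.html
 Redirect 301 /Buying-Guide-for-Kids-Inline-Skates/buying-guide-11-13-2012,default,pg.html https://www.inlineskates.com/Learn/Buying-Guide-for-Kids-Inline-Skates,default,pg.html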
- How do I know if I am correctly solving an uppercase URL issue that may be affecting Googlebot?
 We have a large e-commerce site (10k+ SKUs): https://www.flagandbanner.com. As I have begun analyzing how to improve it, I have discovered that we have thousands of URLs containing uppercase characters, for instance https://www.flagandbanner.com/Products/patriotic-paper-lanterns-string-lights.asp. This is inconsistently applied throughout the site. I directed our website vendor to fix the issue and they placed 301 redirects via a rule in the web.config file, so any URL containing an uppercase character now redirects to the lowercase version. However, as I use Screaming Frog to monitor our site, I see all these 301 redirects, thousands of them, and the XML sitemap still shows the uppercase versions. We have had indexing issues as well. So I'm wondering: what is the most effective way to make sure I'm not placing an extra burden on Googlebot when it indexes our site? Should I have just left the uppercase issue alone?
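For context, a typical IIS URL Rewrite lowercase rule, the kind of thing the vendor most likely added, looks roughly like this (a sketch; the vendor's actual rule may differ):
 <!-- inside <system.webServer><rewrite><rules> in web.config -->
 <rule name="ToLowerCase" stopProcessing="true">
   <match url="[A-Z]" ignoreCase="false" />
   <action type="Redirect" url="{ToLower:{URL}}" redirectType="Permanent" />
 </rule>
Separately, regenerating the XML sitemap so that it lists only the lowercase URLs would stop sending crawlers through those redirects in the first place.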
- Attack of the dummy URLs -- what to do?
 It occurs to me that a malicious program could set up thousands of links to dummy pages on a website: www.mysite.com/dynamicpage/dummy123, www.mysite.com/dynamicpage/dummy456, etc. How is this normally handled? Does a developer have to look at all the parameters to see if they are valid and, if not, automatically issue a 301 redirect or a 404 Not Found? This requires a table lookup of acceptable URL parameters for all new visitors. I was thinking that bad URL names would be rare, so it would be OK to just stop the program with a message, until I realized someone could intentionally set up links to non-existent pages on a site.
- Internal links and URL shorteners
 Hi guys, what are your thoughts on using bit.ly links as internal links in blog posts on a website? Some posts have 4-5 bit.ly links going to other pages of our website (noindexed pages). I have nofollowed them so no SEO value is lost; the links also go to noindexed pages, so there is no need to pass SEO value directly. However, what are your thoughts on how Google will see internal links which have essentially become redirect links? They are bit.ly links going to result pages, basically. Am I also right to assume that tracking for internal links would be better using Google Analytics functionality? Is bit.ly accurate for tracking clicks? Any advice much appreciated, I just wanted to double-check this.
- Do UTM URL parameters hurt SEO backlink value?
 Do www.example.com and www.example.com/?utm_source=Google&utm_medium=Press+Release&utm_campaign=Google have the same SEO backlink value? I would assume that Google knows the difference.
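A common way this is handled (not stated in the question) is a self-referencing canonical on the untagged URL; because UTM parameters do not change the page that is served, the tagged variants carry the same tag and consolidate to the clean URL:
 <!-- on https://www.example.com/ -->
 <link rel="canonical" href="https://www.example.com/" />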
- Is it safe to redirect multiple URLs to a single URL?
 Hi, I have an old WordPress website with about 300-400 original pages of content on it, all relating to my company's industry: travel in Africa. It's a legitimate site with travel stories, photos, advice etc. Nothing spammy about it. No adverts on it. No affiliates. The site hasn't been updated for a couple of years and we no longer have a need for it. Many of the stories on it are quite out of date. The site has built up a modest mozRank value over the last 5 years and has a few hundred organically achieved inbound links. Recently I set up a swanky new branded website on ExpressionEngine on a new domain. My intention is to:
 Shut down the old site
 Focus all attention on building up content on the new website
 Ask the people linking to the old site to link to my new site instead (I wonder how many will actually do so...)
 Where possible, set up a 301 redirect from pages on the old site to their closest match on the new site
 Set up a 301 redirect from the old site's home page to the new site's homepage
 Sounds good, right? But there is one issue I need some advice on... The old site has about 100 pages that do not have a good match on the new site. These pages are outdated or of inferior quality, so it doesn't really make sense to rewrite them and put them on the new site. I call these my "black sheep pages". So... for these "black sheep pages", should I (A) redirect the URLs to the new site's homepage, (B) redirect the URLs to the old site's home page (which in turn redirects to the new site's homepage), or (C) not redirect the URLs and let them die a lonely 404 death?
 OPTION A:
 oldsite.com/page1.php -> newsite.com
 oldsite.com/page2.php -> newsite.com
 oldsite.com/page3.php -> newsite.com
 oldsite.com/page4.php -> newsite.com
 oldsite.com/page5.php -> newsite.com
 oldsite.com -> newsite.com
 OPTION B:
 oldsite.com/page1.php -> oldsite.com
 oldsite.com/page2.php -> oldsite.com
 oldsite.com/page3.php -> oldsite.com
 oldsite.com/page4.php -> oldsite.com
 oldsite.com/page5.php -> oldsite.com
 oldsite.com -> newsite.com
 OPTION C:
 oldsite.com/page1.php : do not redirect, let the page 404 and disappear forever
 oldsite.com/page2.php : do not redirect, let the page 404 and disappear forever
 oldsite.com/page3.php : do not redirect, let the page 404 and disappear forever
 oldsite.com/page4.php : do not redirect, let the page 404 and disappear forever
 oldsite.com/page5.php : do not redirect, let the page 404 and disappear forever
 oldsite.com -> newsite.com
 My intuition tells me that Option A would pass the most "link juice" to my new site, but I am concerned that it could also be seen by Google as a spammy redirect technique. What would you do? Help 😐
- URL with hyphens or .co?
 Given a choice, for your #1 keyword, would you pick a .com with one or two hyphens (chicago-real-estate.com) or a .co with the full name as the URL (chicagorealestate.co)? Is there an accepted best practice regarding hyphenated URLs, and/or decent results regarding the effectiveness of the .co? Thank you in advance!