Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Special characters in URL
Hi there, We're in the process of changing our URL structure to be more SEO friendly. Right now I'm struggling to find a good way to handle slashes that are part of a targeted keyword. For example, if I have a product page and my product title is "1/2 ct Diamond Earrings in 14K Gold", which of the following URLs is the right way to go if I'm targeting the product title as the search keyword?

- example.com/jewelry/1-2-ct-diamond-earrings-in-14k-gold
- example.com/jewelry/12-ct-diamond-earrings-in-14k-gold
- example.com/jewelry/1_2-ct-diamond-earrings-in-14k-gold
- example.com/jewelry/1%2F2-ct-diamond-earrings-in-14k-gold

Thanks!
Jonaz, just to add to what others have said: #1 would be the most logical answer. A forward slash (/) indicates a new directory level, so you can't use that. The percent sign (%) is reserved for character encoding, so you shouldn't use that either. An underscore (_) joins the parts into one word, and "12ct" would simply be wrong.
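To make the encoding point concrete, here's a quick sketch using Python's standard `urllib.parse` module. It shows that the `%2F` in option #4 is literally just a percent-encoded slash, which is why it's treated as a reserved character rather than ordinary slug text:

```python
from urllib.parse import quote, unquote

# "/" is a reserved path delimiter in URLs, so it must be
# percent-encoded to appear literally inside one path segment.
encoded = quote("1/2 ct", safe="")
print(encoded)  # 1%2F2%20ct

# Decoding reverses it: %2F in a slug (option #4) is just a
# disguised slash, and %20 a disguised space.
print(unquote(encoded))  # 1/2 ct
```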
Doesn't seem to, no.
Quick follow-up question: Does Google treat the phrases "half" and "1/2" as the same?
You could totally replace common occurrences:

- 1/2 = half
- 1/4 = quarter
- 1/3 = third
- etc.

Then just remove the less common ones entirely.
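A minimal sketch of that idea in Python (the `FRACTION_WORDS` mapping and `slugify` helper are illustrative names, not from any library): spell out the common fractions, drop any less common ones, then hyphenate the rest.

```python
import re

# Hypothetical mapping of common fractions to words.
FRACTION_WORDS = {
    "1/2": "half",
    "1/4": "quarter",
    "1/3": "third",
}

def slugify(title: str) -> str:
    """Build a URL slug, spelling out common fractions and dropping the rest."""
    for fraction, word in FRACTION_WORDS.items():
        title = title.replace(fraction, word)
    # Remove any remaining, less common fractions entirely (e.g. "3/8").
    title = re.sub(r"\b\d+/\d+\b", "", title)
    # Keep letters and digits; collapse everything else to single hyphens.
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

print(slugify("1/2 ct Diamond Earrings in 14K Gold"))
# half-ct-diamond-earrings-in-14k-gold
```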
I personally would go with #1. Definitely not #4; you never want special characters in a URL. The reason I say #1 is that it separates the 1 from the 2 in your "1/2". #2 could be confused for a 12 ct diamond earring, wow. As for #3, I typically avoid underscores in all URLs. To sum up, my choice is #1: it looks cleanest, and when you optimize your page with the "1/2 ct" wording, Google is smart enough to see that. Overall, it probably won't make a huge difference in the end.
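For what it's worth, the option #1 style slug falls out of a single substitution, sketched here in Python (the `slugify` name is just an example helper):

```python
import re

def slugify(title: str) -> str:
    # Lowercase, then replace every run of non-alphanumeric characters
    # (spaces, slashes, quotes) with a single hyphen, as in option #1.
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

print(slugify("1/2 ct Diamond Earrings in 14K Gold"))
# 1-2-ct-diamond-earrings-in-14k-gold
```

The slash and the spaces all collapse to hyphens, so "1/2" naturally becomes "1-2" with no special-casing needed.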