How much impact does bad html coding really have on SEO?
My client has a site that we are trying to optimise, but the code is really pretty bad. There are 205 errors showing when W3C validating. The title, meta description and meta keywords tags are appearing twice, there is truly excessive JavaScript, and everything has been laid out in tables. How much do you think this is really impacting the opportunity to rank? There has been quite a bit of discussion recently along the lines of whether on-page SEO still has much impact. I just want to be sure before I recommend a whole heap of code changes that could cost her a lot - especially if the impact/return could be minuscule. Should it all be cleaned up? Many thanks
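For illustration only, a hypothetical head section with the kind of duplication described above might look something like this (a sketch, not the client's actual markup - the page title, meta values and script contents are made up):

```html
<!-- Hypothetical example of the problem: two <title> tags, two sets of
     meta keywords/description, and large inline scripts bloating the head. -->
<head>
  <title>Acme Widgets | Home</title>
  <meta name="keywords" content="widgets, acme, buy widgets">
  <meta name="description" content="Buy widgets from Acme.">
  <!-- a second template include repeats the same block -->
  <title>Acme Widgets | Home</title>
  <meta name="keywords" content="widgets, acme, buy widgets">
  <meta name="description" content="Buy widgets from Acme.">
  <script>/* large inline script #1 */</script>
  <script>/* large inline script #2 */</script>
</head>
```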
Hi Chammy, I inherited a site that reported 3,184 crawl errors in Moz, and a significant number of them (nearly 600) were duplicate titles and content. I have that down to under 1,000 total errors and only 86 critical errors. I have seen my ranking grow pretty substantially, and in one week had 6 pages increase over 20 positions in rank. I can share the Moz Rank Report if you would like to see it. So yes, it does have an impact.
I'm sorry, I don't have any evidence from the user experience point of view, although I would also be interested to see the results of any studies. I will say that from a site management/maintenance point of view it makes sense to try and keep the code as clean as possible. I've been involved in projects where a considerable chunk of the cost was incurred due to the amount of time and effort required to unravel the mess even before any new changes were made!
Thanks very much everyone - very helpful. Good point re page speed - the pages are certainly slow to load, so this could well be due to the huge amount of JavaScript and bad code. And yes, I think the duplicate tags should be sorted - this shouldn't be difficult. Has anyone got any tangible results that they've seen as a result of cleaning up JavaScript and code?
If you've got things like duplicate title and meta description tags going on then I'd certainly take a look at fixing those. Being able to manage these two tags is vital to managing the way your pages will appear in the search results (and your title tag is an important ranking factor). Normally, if your page doesn't validate it's not a major problem and search engines won't penalise you for it. If, however, your page is so badly crafted that the HTML errors and general page structure make it difficult for the search engines (and humans) to read your page, then you're going to suffer. The key is to make sure that your site/page content is accessible. How accessible is your page to someone with disabilities, using a screen reader, etc.? You've got to make sure that the search engines can understand what your page is about, or your page won't be seen as a relevant page for any search terms. How bad is it? How does Google render the page in its instant previews? (You can check this in Google Webmaster Tools.)
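As a rough illustration of the accessibility point, the same content in a table-based layout versus semantic markup might look like this (hypothetical markup for a simple product page, not taken from the site in question):

```html
<!-- Table-based layout: the structure says nothing about what the content is. -->
<table>
  <tr><td><font size="5">Blue Widgets</font></td></tr>
  <tr><td>Our blue widgets are hand-made in the UK...</td></tr>
</table>

<!-- Semantic markup: headings and elements tell crawlers and screen readers
     what each piece of content is and how important it is. -->
<article>
  <h1>Blue Widgets</h1>
  <p>Our blue widgets are hand-made in the UK...</p>
</article>
```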
I personally don't worry about bad code unless it slows down my page or could make things confusing for search engines or readers. If the title and meta are appearing twice, this could be confusing for search engines, so I would change that. But if you've got things like an unclosed tag here and there, I personally don't think that's going to be much of a factor.
Invalid code has a small effect on ranking. However, if the invalid code causes usability issues such as slow load times or a high bounce rate, then it can lower your rankings and of course cut back on conversions. Some of it is a higher priority than the rest. I would definitely remove the meta keywords and combine the JS files. The tables, while out of date, are not a big issue. If you have the time and resources then yes, it should all be cleaned up. If not, then clean up the major problems.
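A minimal sketch of what that clean-up might look like in the head, assuming the duplicate tags are removed, meta keywords dropped, and the inline scripts consolidated into a single external file (the file name and the use of defer are illustrative choices, not a prescription):

```html
<head>
  <title>Acme Widgets | Home</title>
  <meta name="description" content="Buy widgets from Acme.">
  <!-- meta keywords removed entirely; the inline scripts are combined into
       one external file, loaded with defer so it doesn't block rendering -->
  <script src="/js/site.min.js" defer></script>
</head>
```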