How much impact does bad HTML coding really have on SEO?
- My client has a site that we are trying to optimise, but the code is really pretty bad. There are 205 errors showing in the W3C validator, the <title>, meta description and meta keywords tags are each appearing twice, there is truly excessive JavaScript, and everything has been laid out in tables. How much do you think this is really impacting the opportunity to rank? There has been quite a bit of discussion recently about whether on-page SEO still has much impact. I just want to be sure before I recommend a whole heap of code changes that could cost her a lot - especially if the impact/return could be minuscule. Should it all be cleaned up? Many thanks
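Before recommending a full rebuild, it can help to confirm exactly which head tags are duplicated. A minimal sketch of such a check, assuming the requests and beautifulsoup4 packages are installed and using https://www.example.com/ purely as a stand-in for the client's page:

    import requests
    from bs4 import BeautifulSoup

    # Hypothetical placeholder URL - swap in the page being audited.
    html = requests.get("https://www.example.com/", timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    checks = {
        "title": soup.find_all("title"),
        "meta description": soup.find_all("meta", attrs={"name": "description"}),
        "meta keywords": soup.find_all("meta", attrs={"name": "keywords"}),
    }

    for label, tags in checks.items():
        status = "duplicated" if len(tags) > 1 else "ok"
        print(f"{label}: {len(tags)} found ({status})")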
- Hi Chammy, I inherited a site that reported 3,184 crawl errors in MOZ and a significant number of them (nearly 600) were duplicate titles and content. I have that down to under 1,000 total errors and only 86 critical errors. I have seen my ranking grow pretty substantially and in one week had 6 pages increase over 20 positions in rank. I can share the MOZ Rank Report if you would like to see it. So yes, it does have an impact.
- I'm sorry, I don't have any evidence from the user experience point of view, although I would also be interested to see the results of any studies. I will say that from a site management/maintenance point of view it makes sense to try and keep the code as clean as possible. I've been involved in projects where a considerable chunk of the cost was incurred due to the amount of time and effort required to unravel the mess even before any new changes were made!
- Thanks very much everyone - very helpful. Good point re page speed - the pages are certainly slow to load, so this could well be due to the huge amount of JS and bad code. And yes, I think the duplicate tags should be sorted - this shouldn't be difficult. Has anyone seen tangible results from cleaning up JS and code?
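One rough way to put a number on the "slow to load" feeling before and after a clean-up is to time the raw HTML response. A crude sketch (it measures only the HTML document, not scripts, images or rendering, which dedicated page-speed tools cover); the URL is a placeholder:

    import requests

    # Placeholder URL for the page being tested.
    response = requests.get("https://www.example.com/", timeout=30)

    print(f"Status: {response.status_code}")
    print(f"HTML response time: {response.elapsed.total_seconds():.2f} s")
    print(f"HTML size: {len(response.content) / 1024:.0f} KB")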
- If you've got things like duplicate title and meta description tags going on then I'd certainly take a look at fixing those. Being able to manage these two tags is vital to managing the way your pages will appear in the search results (and your title tag is an important ranking factor). Normally, if your page doesn't validate it's not a major problem and search engines won't penalise you for it. If, however, your page is so badly crafted that the HTML errors and general page structure make it difficult for the search engines (and humans) to read your page, then you're going to suffer. The key is to make sure that your site/page content is accessible. How accessible is your page to someone with disabilities, using a screen reader etc.? You've got to make sure that the search engines can understand what your page is about, or your page won't be seen as relevant for any search terms. How bad is it? How does Google render the page in its Instant Previews? (You can check this in Google Webmaster Tools.)
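One quick way to judge the "how accessible is it" question above is to strip the markup and look at the plain text a crawler or screen reader is left with. A small sketch, assuming requests and beautifulsoup4 are installed and using a placeholder URL:

    import requests
    from bs4 import BeautifulSoup

    # Placeholder URL - replace with the page being checked.
    html = requests.get("https://www.example.com/", timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # Drop script/style blocks so only human-readable content remains.
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()

    readable = " ".join(soup.get_text(separator=" ").split())
    print(readable[:500])  # the first 500 characters of what is actually readable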
- I personally don't worry about bad code unless it slows down my page or can possibly make things confusing for search engines or readers. If the title and meta tags are appearing twice, this could be confusing for search engines, so I would change that. But if you've got things like an unclosed tag here and there, I personally don't think that's going to be much of a factor.
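If you want a rough local count of how many markup errors there actually are (short of re-running the full W3C validator), a lenient parser will report what it had to recover from. A sketch along those lines, assuming requests and lxml are installed; the URL is a placeholder:

    import requests
    from lxml import etree

    # Placeholder URL - replace with the page being audited.
    html_bytes = requests.get("https://www.example.com/", timeout=10).content

    # Parse leniently and collect the problems the parser recovered from.
    parser = etree.HTMLParser(recover=True)
    etree.fromstring(html_bytes, parser)

    print(f"{len(parser.error_log)} markup issues reported")
    for error in list(parser.error_log)[:10]:  # show the first ten
        print(f"line {error.line}: {error.message}")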
- Invalid code has a small effect on ranking. However, if the invalid code causes usability issues such as slow load times or a high bounce rate, then it can lower your rankings and of course cut back on conversions. Some fixes are a higher priority than others: I would definitely remove the meta keywords and combine the JS files. The tables, while out of date, are not a big issue. If you have the time and resources then yes, it should all be cleaned up. If not, then clean up the major problems.
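As a quick way to size up the "combine the JS" suggestion, you can list how many external script files the page requests. A minimal sketch, assuming requests and beautifulsoup4 are installed; the URL is again a placeholder:

    import requests
    from bs4 import BeautifulSoup

    # Placeholder URL - replace with the page being audited.
    html = requests.get("https://www.example.com/", timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    scripts = [tag["src"] for tag in soup.find_all("script", src=True)]

    print(f"{len(scripts)} external script files requested:")
    for src in scripts:
        print(" -", src)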
Related Questions
- SEO - New URL structure
  Hi, currently we have the following URL structure for all pages, regardless of the hierarchy: domain.co.uk/page, such as domain/blog-name. Can you please confirm the following: 1. What is the benefit of organising the pages as a hierarchy, i.e. domain/features/feature-name, domain/industries/industry-name or domain/blog/blog-name etc.? 2. This will create too many 301s - what is Google's tolerance of redirects? Is it worth changing the URL structure, or would you only recommend adding breadcrumbs? Many thanks, Katarina (Technical SEO | Katarina-Borovska1)
- Are Expires Headers Detrimental to SEO Health?
  My dev was looking into Expires headers to increase speed, but she doesn't know the ramifications behind them for SEO. What I found online is really old: https://a-moz.groupbuyseo.org/blog/expires-headers-for-seo-why-you-should-think-twice-before-using-them What do SEOs think? Thanks in advance! ~Dana (Technical SEO | dklarse0)
- Word mentioned twice in URL? Bad for SEO?
  Is a URL like the one below going to hurt SEO for this page? /healthcare-solutions/healthcare-identity-solutions/laboratory-management.html I like to match the URL and H1s as closely as possible, but in this case it looks a bit funky. (Technical SEO | jsilapas0)
- Non-Existent Parent Pages SEO Impact
  Hello, I'm working with a client that is creating a new site. They currently are using the following URL structure: http://clientname.com/products/furry-cat-muffins/ But the landing page for the /products/ directory does not actually have any content. They have a similar issue for the /about/ directory, where the menu actually sends you to /about/our-story/ instead of /about/. Does it hurt SEO to have the URL structure set up this way, and does it also make sense to create 301 redirects from /about/ to /about/our-story/? (Technical SEO | Alder0)
- SEO value of InDesign pages?
  Hi there, my company is exploring creating an online magazine built with Adobe's InDesign toolset. If we proceeded with this, could we make these pages as spiderable as normal HTML/CSS webpages? Or are we limited to them being less spiderable, or not spiderable at all? (Technical SEO | TheaterMania1)
- Will blocking the Wayback Machine (archive.org) have any impact on Google crawl and indexing/SEO?
  Will blocking the Wayback Machine (archive.org) by adding the code they give have any impact on Google crawl and indexing/SEO? Anyone know? Thanks! ~Brett (Technical SEO | BBuck0)
- Are 404 Errors a bad thing?
  Good morning... I am trying to clean up my e-commerce site and I created a lot of new categories for my parts. I've made the old category pages (which have had their content removed) "hidden" to anyone who visits the site and starts browsing. The only way you could get to those "hidden" pages is either by knowing the URLs that I used to use, or if for some reason one of them still appears in Google. Since I'm trying to clean up the site and get rid of any duplicate content issues, would I be better served by adding those "hidden" pages that don't have much or any content to the robots.txt file, or should I just deactivate them so that even if you type the old URL you will get a 404 page? In this case, are 404 pages bad? You're typically not going to find those pages in the SERPs, so the only way you'd land on these 404 pages is to know the old URL I was using that has been disabled. Please let me know if you think I should be 404'ing them or adding them to robots.txt. Thanks (Technical SEO | Prime850)
- How do I redirect index.html to the root / ?
  The site I've inherited had operated on index.html at one point, and now uses index.php for the home page, which resolves to the / page. The index.html was lost in migrating server hosts. How do I redirect index.html to the / page? I've tried different options that keep ending up with the same 404 error. I tried a redirect from index.html to index.php, which ended in an infinite loop. Because the index.html no longer exists in the root, should I create it and then add a redirect to it? Can I avoid this by editing the .htaccess? Any help is appreciated, thanks in advance! (Technical SEO | NetPicks0)