The W3C Markup Validation Service - Good, Bad or Impartial?
-
Hi guys,
It seems that nowadays it is almost impossible to achieve zero errors when testing a site with the W3C Markup Validation Service (https://validator.w3.org). With analytics codes, pixels, and all kinds of tracking and social media scripts running, it seems to be an unachievable task.
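Just as an illustration (the domain and ID below are made-up placeholders, not any real vendor's tag), a fairly typical third-party pixel like this sketch will usually get flagged by the validator even though every browser handles it without complaint:

<!-- Hypothetical tracking pixel; the domain and ID are placeholders, not a real vendor snippet -->
<noscript>
  <!-- The missing alt attribute and the obsolete border attribute are the sort of
       thing the validator typically reports as errors, yet the pixel loads and
       fires exactly as intended in every major browser. -->
  <img src="https://tracking.example.com/pixel?id=0000000000"
       height="1" width="1" border="0" style="display:none">
</noscript>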
My two questions to you fellow SEOs out there are:
1. How important is validation to you, and how much effort do you put in when you technically review a site and decide what needs to be fixed and what isn't worth bothering with?
2. How do you argue your corner when explaining to your clients that it's impossible to achieve 100% validation?
*As a note, I will say that I am mostly referring to WordPress-driven sites.
Would love to hear your take.
Daniel.
-
I am my own client, so I can be as picky as I want, and I take care of the details that I feel are important.
I pay close attention to how the site is responding and rendering when I pretend that I am a visitor. I pay even more attention when a customer or visitor writes to me with a complaint. In my opinion, if the site is working great then all is good.

W3C validation seems to be of jugular importance to W3C evangelists. They will tell you that you will burn in Hell if you don't achieve it with flying colors. People who want to sell you their services will point at any fault that can be detected.
Practical people have a different opinion. I try to be as practical as possible.
-
I agree with Andy,
I use it as a guidance tool on any website I build. It serves a purpose: to check that things are understood the way they should be against a predetermined standard. But like any other automated tool, it compares against a fixed set of requirements that cannot always be met, and it cannot recognise and OK the legitimate exceptions.
As long as you understand the error it's pointing out and why, and you know that despite it the code is rendering correctly and everything is working as expected, then there is no problem.
From an SEO standpoint, as long as Google sees your site the way you want it to, I think it is a very, very minor factor. Hell, even Google's own pages return errors of some variety.
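For example (this is a made-up widget snippet, not any particular vendor's embed code), a non-standard attribute like the one below gets reported along the lines of "attribute not allowed on this element", even though browsers simply ignore it and the widget works:

<!-- Hypothetical embed code; the attribute, class and URL are placeholders -->
<div class="social-share-widget" widget-theme="dark">
  <!-- "widget-theme" is not a standard HTML attribute, so the validator flags it;
       renaming it to a data-* attribute (data-widget-theme) would satisfy the
       validator without changing how the page renders. -->
</div>
<script async src="https://widgets.example.com/share.js"></script>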
-
Hi Yiannis,
I tend to add these in as an advisory to my clients because for the most part, and unless I see something specific, the results have absolutely no effect on SEO. If they wish to act on them, it is for their developers to handle.
I don't argue my corner really - never had to. I just tell them how it is: the site is rendering fine in everything, with no issues, so fix the errors if you have the time and resources.
As I said, unless I spot something that is an actual problem, then it tends to just get bypassed.
-Andy
Related Questions
-
Good to use disallow or noindex for these?
Hello everyone, I am reaching out to seek your expert advice on a few technical SEO aspects of my website. Below are the specific areas I would like to discuss:
a. Double and triple filter pages: I have identified certain URLs on my website that have a canonical tag pointing to the main /quick-ship page. These URLs are as follows:
https://www.interiorsecrets.com.au/collections/lounge-chairs/quick-ship+black
https://www.interiorsecrets.com.au/collections/lounge-chairs/quick-ship+black+fabric
Considering the need to optimize my crawl budget, would it be advisable to disallow or noindex these pages? My understanding is that by disallowing or noindexing these URLs, search engines can avoid wasting resources on crawling and indexing duplicate or filtered content.
b. Page URLs with parameters: I have noticed that some of my page URLs include parameters such as ?variant and ?limit. Although these URLs already have canonical tags in place, is it still recommended to disallow or noindex them to further conserve crawl budget? My understanding is that by doing so, search engines can avoid spending resources on indexing redundant variations of the same content.
Additionally, I would appreciate any suggestions regarding internal linking strategies tailored to my website's structure and content. Thank you in advance for your time and expertise. If you require any further information or clarification, please let me know. Cheers!
Technical SEO | | williamhuynh
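For context, a sketch of what the noindex route would look like on one of those filtered URLs (illustrative only; the exact canonical target is assumed from the question). Note that a page blocked in robots.txt can never have its noindex or canonical seen, because crawlers are not allowed to fetch it.

<!-- Sketch of the meta-robots option on a filtered URL such as /quick-ship+black; URLs are illustrative -->
<head>
  <!-- the existing canonical described in the question, pointing at the main quick-ship page -->
  <link rel="canonical"
        href="https://www.interiorsecrets.com.au/collections/lounge-chairs/quick-ship">
  <!-- noindex keeps the filtered page out of the index, while "follow" still lets
       crawlers pass through the links on it -->
  <meta name="robots" content="noindex, follow">
</head>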
-
Schema Markup Warning "Missing field "url" (optional)"
Hello Moz Team, I hope everyone is doing well. I need a bit of help regarding schema markup. I am facing an issue with the schema markup on my blog posts: on the majority of my posts I find the warning "Missing field "url" (optional)". As this schema is generated by the Yoast plugin, I haven't applied any custom steps. Recently I published a post, https://dailycontributors.com/kisscartoon-alternatives-and-complete-review/, and tested it on two platforms:
1. Validator.Schema.org
2. Search.google.com/test/rich-results
The schema.org validator shows no errors, but the Google rich results test gives me the warning "Missing field "url" (optional)". Is this really going to be an issue for my ranking? Please help, thanks!
Technical SEO | | JoeySolicitor
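Since the field is flagged as optional, it is a recommended property rather than a requirement, so on its own it should not block rich result eligibility. For anyone who wants to clear the warning anyway, a minimal sketch of an Article node with the url field present looks roughly like this (values are placeholders, and the real Yoast-generated graph contains more nodes and properties than shown):

<!-- Minimal illustrative JSON-LD; values are placeholders, not Yoast's actual output -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example post title",
  "url": "https://dailycontributors.com/kisscartoon-alternatives-and-complete-review/",
  "mainEntityOfPage": "https://dailycontributors.com/kisscartoon-alternatives-and-complete-review/",
  "author": { "@type": "Person", "name": "Author Name" },
  "datePublished": "2024-01-01"
}
</script>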
-
Using 2 cache plugins - good or not?
Hi, can anyone tell me whether using 2 cache plugins helps, or whether it causes any issues? Also, when I used the W3 cache plugin in WordPress, it flagged inline CSS that needed to be cleared, so I tried an auto-optimize plugin, but my website, Soc prollect, crashed while using it. Is there any solution, and can anyone tell me which plugin is best for speeding up the site by handling JavaScript and inline CSS at the same time?
Technical SEO | | nazfazy
-
Spammers created bad links to old hacked domain, now redirected to our new domain. Advice?
My client had an old site hacked (let's call it "myolddomain.com") and the hackers created many links in other hacked sites with links such as http://myolddomain.com/styless.asp?jordan-12-taxi-kids-cheap-T8927.html. The old myolddomain.com site was redirected to a different new site since then, but we still see over a thousand spam links showing up in the new site's Search Console 404 crawl errors report. Also, using the links: operator in Google search, we see many results of spam links. Should we be worried about these bad links pointing to our old site and redirecting to 404s on the new site? What is the best recommendation to clean them up? Ignore? 410s? Other? I'm seeing conflicting advice out there. The old site is hosted by the client's previous web developer, who doesn't want to clean anything up on their end without an ongoing hosting contract. The client doesn't want to pay for any additional hosting, so beyond turning redirects on or off, we don't have much control over anything related to "myolddomain.com". 😞 Thanks in advance for any assistance!
Technical SEO | | usDragons
-
Schema markup for products is missing "price": Is this bad?
Hey guys, a current client of mine has an e-commerce shop with a few hundred products. They purposely choose to keep prices off their website, which is causing errors in Google Webmaster Tools. Basically the error shows:
Error: Structured Data > Product (markup: schema.org)
Error type: missing price
208 items with error
Is this a huge deal? Or are we allowed to have non-numerical prices in the schema, i.e. "call for quote"?
Technical SEO | | tbinga
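For reference, when the price is present it sits on an Offer nested inside the Product markup, roughly like the sketch below (all values are placeholders). Google's product rich results generally expect a numeric price, so a text value like "call for quote" is unlikely to satisfy the check.

<!-- Illustrative Product markup with an Offer; name, image and values are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "image": "https://www.example.com/images/example-product.jpg",
  "offers": {
    "@type": "Offer",
    "priceCurrency": "USD",
    "price": "49.00",
    "availability": "https://schema.org/InStock"
  }
}
</script>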
-
Website credits for designers - good or bad?
Hi. My core service is web design and development. I often place a credit on my clients' websites pointing back to my web design or web development pages. Is this a wise practice given the Penguin and Panda updates? Could it also pull my own rankings down?
Technical SEO | | Cocoonfxmedia
-
Move established site from .co.uk to .org - good or bad idea?
I am currently considering moving our site from the current .co.uk domain to the .org version, which we also own. The site is established and has been indexed for 7 years, ranks well and has circa 10k visits per month, mainly UK and US traffic.
The reason for the change to the .org domain is to make the site more global-facing and give us the opportunity to develop it into multiple languages within directories (.org/es/ etc.) and then target those to the local search engines. For the kind of site it is (community based) it wouldn't really work to split this into lots of separate country-targeted domains. So the choice is either to stick with the .co.uk and add the other language-specific content in directories within it, or move to the .org and do the same (there is also a potential third option of purchasing the .com, which is currently unused, but that could be pricey!).
We are also planning a big overhaul of the site with a redesign, lots of added content and a reorganisation, but we are thinking it would be better to move the domain on a 1:1 basis first, with the current design, content and URL structure in place, and then make the other changes 2 or 3 months down the line.
I have read up on SEOmoz, Google guidelines etc. on moving a site to a new domain and understand the theoretical approach and the steps to take (1:1 301 redirects, sitemaps on old and new, etc.), and I will retain ownership of the .co.uk so the redirects can remain in place indefinitely. However, having worked so hard to get the site to where it is in the search engines and traffic levels, I am very worried about whether the domain change is a good move. I am more than happy to accept a temporary fluctuation in rankings and traffic for 1-4 weeks, as reported may happen, as long as I can be sure it will return after a temporary period and be as strong (or almost as strong) as before.
Looking for people's experiences to give me the confidence / reassurance to go ahead with this, or any info on why I shouldn't. Thanks in advance for your advice. Adrian.
Technical SEO | | Zilla
-
How much impact does bad HTML coding really have on SEO?
My client has a site that we are trying to optimise, but the code is really pretty bad. There are 205 errors showing when validating with the W3C validator. The <title> and <keywords> tags, among others, are appearing twice, there is truly excessive JavaScript, and everything has been put in tables. How much do you think this is really impacting the opportunity to rank? There has been quite a bit of discussion recently along the lines of whether on-page SEO still has much impact. I just want to be sure before I recommend a whole heap of code changes that could cost her a lot - especially if the impact/return could be minuscule. Should it all be cleaned up? Many thanks
Technical SEO | | Chammy
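For what it's worth, the duplicate-tag problem described above tends to look something like the sketch below (the content is invented), and a duplicate <title> is exactly the sort of thing the validator will complain about, even though browsers and search engines will generally just use one of them.

<!-- Hypothetical example of duplicated head tags, e.g. one set hard-coded in the template
     and a second set injected by a theme or plugin; the fix is to remove one source -->
<head>
  <title>Widgets | Example Co</title>
  <meta name="description" content="Buy widgets online.">
  <meta name="keywords" content="widgets, buy widgets">
  <!-- second, duplicated set -->
  <title>Example Co - Widgets</title>
  <meta name="description" content="Example Co widget store.">
  <meta name="keywords" content="widgets">
</head>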