Valid sites work better(?)
I learned HTML at a time when some people were still building several versions of their site. I'm not talking about the web, mobile and iPhone versions – more like the Netscape and IE3 versions. That was a time when writing “standard” HTML was still a fairly novel idea, but a powerful one. It made sense: the alternative was “write standard code or risk having browsers crash miserably on your web page”.
That was more than a decade ago. Browsers, meanwhile, have made incredible progress at gracefully rendering even the most broken web page. And that is a good thing.
Does this make validation and quality checking of Web pages moot? Of course not. There are many more incentives to build great standards-compliant websites: ease of maintenance, a show of professionalism, or, in the words of Zeldman: “Client who saves $5,000 buying cut-rate non-semantic HTML will later spend $25,000 on SEO consultant to compensate.”
It makes me curious, however, to know what the real-life arguments in favor of valid, standard code are today. Do you have an untold story of validation ridding you of an awful rendering glitch? A real-life account of a search-engine bump achieved by fixing the syntax of your HTML <head>? A typo in a CSS stylesheet that hours of glancing at code didn't show, but the validator did? A forgotten alt that would have lowered your search rank for an important keyword, or cost a big fee for non-accessibility?
Use the comments below to share and discuss your experience - we'll update our outdated “Why Validate?” doc with the best examples.
Why do I ensure my pages are valid? Because I serve my pages as application/xhtml+xml and a kitten is drop kicked every time the Yellow Screen of Death displays.
not a "i validated my pages and came top on google" story, but i can't even count the times people at work have asked me in desperation why their pages just seem broken, or why their css just doesn't do what it's supposed to, and the root cause of the problem turned out to be invalid html. as a rule i now don't troubleshoot any css / layout issues unless i know that at least the markup that's being styled, as well as the css of course, is valid. it saves hours of troubleshooting or working around the different error-handling and compensation approaches browsers fall back on when presented with broken code.
actually, to expand on the markup part: it's not so much making sure that it's valid, but rather that it's well-formed (or, to use old wcag 2 parlance, that it can be "parsed unambiguously")...but certainly having pages that also validate is a sign of good QA, unless there's a good reason for not validating (use of non-standard, but otherwise harmless, attributes as scripting hooks, or a shoddy CMS/backend/aggregator that doesn't properly escape ampersands and such).
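To make the well-formedness distinction concrete, here is a minimal sketch of what "parsed unambiguously" means, using Python's standard-library XML parser purely as an illustration (a real markup validator checks the full grammar, not just tag balancing):

```python
# Sketch: well-formedness vs. validity.
# A well-formed document parses unambiguously; a *valid* one additionally
# conforms to a DTD or schema. This checks only the former.
import xml.etree.ElementTree as ET

def is_well_formed(markup: str) -> bool:
    """Return True if the markup parses without ambiguity."""
    try:
        ET.fromstring(markup)
        return True
    except ET.ParseError:
        return False

well_formed = "<p>Hello, <em>world</em></p>"
tag_soup = "<p>Hello, <em>world</p>"  # <em> is never closed

print(is_well_formed(well_formed))  # True
print(is_well_formed(tag_soup))     # False
```

A browser will happily recover from the second snippet; an XML pipeline (or a page served as application/xhtml+xml) will not, which is exactly why well-formedness matters before validity even enters the picture.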
I tried to avoid using the Markup Validator. I considered it a last-chance tool. But to be more precise…
Local installation of the validators is a bit painful… but you do not do it often. You can then use them offline, on resources with confidential content, without being a victim of network outages, and you can combine them with a local install of the LogValidator.
… and with all of that you save kittens.
There are two great things about validation: Validating supports and accelerates learning, so it contributes to awareness of respective specifications, and releasing formally valid code is a sign of professionalism.
Put another way, developers who don't validate most likely learn more slowly, and invalid code can in most cases truly be considered unprofessional.
However, invalid code doesn't have to mean inaccessible or unmaintainable code. That's a myth. You can invert that statement too, though: valid code does not (by far) mean accessible or maintainable code, or even efficient, fast code. People stating otherwise are wrong or have different motives; the real advantages of validation are the ones listed above.
Jens, I'll take the liberty to quote another of your blog posts here.
Coding with XHTML 1.0 strict, a reset css file and making sure it's all valid is a great way to ensure that the website I build will look the same way on IE 6 and pretty much any browser published after that point.
As soon as I don't bother using valid code, certain browsers gracefully handle the problem... but usually one browser will end up doing something wacky (they call it quirks mode for a reason!).
I don't like wacky unless I styled it that way :)
I'm not a web designer or developer, but the validator is there to tell me if something goes wrong with my web presentation.
Then I can fix the specific problem faster, and the result should be standards-compliant. That makes maintenance and development easier.
Validators create a teachable moment. A moment of great opportunity. A time to educate. A time to get action. A time to get people to actually fix their pages. Flagging errors and giving warnings is a very good thing indeed.
The W3C HTML/XHTML validator is currently used as a web accessibility teaching tool. For instance, I have my students use it in the accessibility classes that I teach to flag missing alt attributes. One of their first lessons is to validate HTML on the W3C site to ensure that it is error-free and that they have indeed examined each image. It makes a BIG impression that text alternatives are mandatory not just for WCAG but also for valid HTML4 and XHTML. It is an undeniable advertisement of the need for them. It is a first step in getting that vital message across.
The W3C CSS and (X)HTML validators are also used in lessons to help teach how to debug CSS. Many cases of "it works in one browser but not another" are caused by silly author errors. Typos and wrong syntax can cause different problems in different browsers. Small mistakes may be difficult for students to spot in their own code, but the validator excels at pinpointing them immediately. If a person uses incorrect syntax in their CSS or (X)HTML, there is a far greater chance that style sheets won't work as expected. Any number of strange behaviors can be triggered in various browsers if there are syntax errors. Validation isn't a magic bullet that will automatically solve all problems. But validating pages helps eliminate many of them, leaving a more manageable subset to address.
The W3C validators do far more than the specifications themselves to educate and increase the quality of HTML documents on the web. They are indispensable for student learning. Thank you for providing these vital tools.
Are there actually real-life arguments in favor of non-valid code?
Hi,
What is HTML's logo/icon?
I have found that having both valid XHTML and CSS, while not necessarily helping you in the search engines, definitely doesn't hurt, and bad code, broken CSS or missing alt tags will hurt you in the search engines.
I can't think of a single instance where invalid code would help anyone.
Using valid semantic markup helps make websites and web applications more accessible and easier to use. Valid markup is also a LOT easier to style (with consistent results), maintain, and test!
I disagree with "And that is a good thing" - IMO it just encourages more website developers to be lazy. Moreover, not all browsers recover from errors in the same way, and so validation is a great help to cross-browser compatibility.
Tim - you shouldn't have any "alt tags" in your code, because there's no such thing. They're alt attributes. If you want to be understood clearly, correct use of terms is as important as correct HTML.
I believe the more we use compliant code, the faster browser companies will start making the right decisions with their products. We shouldn't have to fix our code for use with multiple browsers. Browser companies should fix their applications to be compliant and valid across ALL versions. Not just the new ones.
i think its a pile of crap to be very honest.
an easy solution is to use a script to detect the browser and then redirect to an
appropriate page. i have been messing with w3c's auto validation,
and things that they teach on w3schools don't even pass validation.
VERY POOR.
if you think im moaning: it is like 1:30am and i'm still trying to
finish my site up to w3c standards as it's part of an assignment.
I have built four websites without any formal training, and all have passed W3C validation after I learned how to correct the errors and warnings. I believe this tool is very beneficial to those of us who may miss an important aspect which could lead to lower rankings. I like having the peace of mind of knowing that one aspect of website development is optimized instead of simply guessed at. I did, however, hope that it would be weighted a bit more heavily in the search engines' algorithms.
Is it just on my screen, or did anyone else notice that the sentence above ("A typo in a CSS stylesheet that hours of glancing at code didn't show, but the validator did? A forgotten alt that would have lowered your search rank for an important keyword, or cost a big fee for non-accessibility?") runs off the page? It seems rather ironic!
Anyway, keep up the good work, I believe validation is a valuable asset.
Thank you all so much for all the great, thought provoking feedback on this post. We are getting really close to having critical mass to rewrite the outdated “why validate” document!
I'll take the opportunity to follow-up on some of the comments, and try to summarize some of the ideas.
I see quite a few of you are using validation as a “maintenance” or debugging tool. This is interesting, and I think we could both stress that validating can be a time-saving reflex whenever a site's display or interactions show bugs on some platforms, and advocate for valid, well-formed, rich code as insurance against bad surprises.
Accessibility, too, seems to play a very important role. We are touching a controversial but fascinating area here. Some may remember the hard-fought consensus in WCAG2 about the need for valid markup. What I read in @Laura's comment is an interesting reversal of that discussion: educating (future) professionals about valid markup as a good practice can be a gentle introduction to the wider scope of accessibility.
@charlie asks, tongue in cheek perhaps, “Are there actually real-life arguments in favor of non-valid code?”. I think “laziness” is the quick answer I can give here – longer answers may digress into pop psychology ;) – but it is very important that we know what motivates people to think “I don't care”. If we advocate for Web quality with the wrong arguments, some people will refute the arguments and get entrenched in a “my tag soup works just fine, why should I waste my time?” standpoint. This also joins @Stewart's comment. I honestly think that laziness is not inherently bad. Laziness is what causes us to not bother with what is not important, and to focus on elegant solutions for the rest.
Finally, since many like “TheRealNeO Linux hax0r” get confused, do note that w3schools and w3c are not related. w3schools is an often useful, sometimes mistaken resource made by people with no ties whatsoever with w3c. If their site is in error, please complain to them, not the W3C :).
Why I think valid code is important? Because I'm not just publishing style, I'm publishing data. Markup is about data, and I want my data to be open not to just human eyes, but all sorts of agents. The semantic web is coming, eventually, and I don't want to revisit the work I do now. I already work so hard to share my data and have people interested in my data, I'm all happy to have help from external sources. It would be akin to fixing up old pages that have been so badly written that Google can't index them. Like microformats: they are an intermediate step on the way to the semantic web. That's why valid code is important.
With the recent news about IE 8 supporting standards by default, we are going to be entering an age where the predominant browsers will be standards-loving.
This means that when you code to standards and validate, the appearance of your site will, more and more, look the same from one standards-loving browser to the next. However, until HTML 5 hits critical mass, these standards browsers still render invalid code in different ways. This means your website might look fine in the one browser you are testing in, but the other popular browser you didn't code for ends up breaking the appearance of your masthead and navigation bar, for example.
Whether in quirks mode or DOCTYPE-switched standards mode, different browsers render invalid code in different ways. But standards-loving browsers will, more often than not, render your valid code in a much more predictable way.
True, w3schools is a different organization, and it has outdated documentation. While it offers much more user-friendly guides to writing code (compared to the W3C), it still has work to do to properly represent such a sensitive area of development, where everyone can use validation tools to check their pages.
Now, about standards and browsers: it is my personal opinion that respecting standards should be the most important aspect of any web development project, because it's much easier and less time-consuming to make browsers work with valid code than to blow your mind rendering pages made by people who don't care about or know the web standards. After all, a browser should, in theory, consume more resources to display a non-valid document, because it needs a little more time to interpret the code, while standard pages are simply displayed by the rules. Take Yahoo! Mail or Google Mail, for example: they really ignore the web standards and, because of that, their web interfaces are a bit slower than other pages of comparable size.
Actually, following the standards could in the end be the best way for everyone, because rules make things simpler. Writing code randomly makes it confusing for early developers as well, because they just don't know what is and isn't correct unless they learn the standards and base their work on them. In the end, aiming towards a valid WWW is something that will ultimately affect servers as well, because clients busy rendering "crap" will stress the servers for a longer period of time while all content loads. So, considering the whole equation, it is cheaper to go with the standards than otherwise.
One very simple point.
Who needs the long doctype? The validator.
Before any HTML5, placing what you would call an "incorrect" short doctype at the beginning would trigger good standards-mode HTML 4.01 Strict rendering in IE6 and every other browser. This makes all the other validator-oriented doctype machinery unneeded, like the public/system identifier links in the doctype, which were only ever useful to the validator itself.
The browser is the validator.
We live in the real world: as an editor creates a page, he obviously MUST check how it looks in all popular browsers. Conforming to standards gives a perfectly high chance that the page will still look like that in the future, but he must still check it now.
Do search engines raise the rank of "valid" pages? No.
A page doesn't have to be some kind of mathematically proven valid. The purpose of a page is to present information. If the information is shown correctly, then the page is valid for me and for every other user. If a page has bad markup, obviously the browser must try its best to recover and show as much as it can. It would be better if the webmaster could get a notification that his page renders incorrectly. But the validator is about conforming to strict rules, while browsers are not.
So making a page "valid for the validator" is pointless; it's a stupid machine. Make pages valid for users: they are the only ones we make pages for, and the only ones who could thank you for them.
The single best reason for valid markup, in my experience, will always be the fact that it loads faster. Errors in your markup can seriously penalize load times in pretty much all browsers, though it's more noticeable on larger pages than small ones.
@Allan, do you have stats on that one? Something in me would like to believe that well-formed markup is faster to parse and render than tag soup, but I'd rather see actual numbers than believe with my eyes closed.
If there is a speed gain, is it mostly related to parsing? Or to rendering? I suspect that the latter would make a much bigger difference, since the end user would feel rendering delay (or re-rendering of the page, or rendering after a delay) more acutely.
If the difference is significant, it would be very interesting to compare speed gained in parsing well-formed and valid markup to the time gained by retrieving shorter, but invalid, pages.
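For what it's worth, the kind of micro-benchmark that could start to answer this could be sketched as follows. This is a hypothetical experiment using Python's standard-library parser, not browser data; it isolates parsing only and says nothing about rendering, which is likely where users would actually feel a difference:

```python
# Sketch: time the parsing step alone on well-formed markup vs. tag soup.
# A browser's parser differs, but the experimental shape is the same:
# same content, one version with every tag closed, one with none closed.
import timeit
from html.parser import HTMLParser

class NullParser(HTMLParser):
    """Parses and discards everything; isolates raw parse cost."""

well_formed = "<ul>" + "<li><em>item</em></li>" * 500 + "</ul>"
tag_soup = "<ul>" + "<li><em>item" * 500  # nothing is ever closed

def parse(markup):
    p = NullParser()
    p.feed(markup)
    p.close()

for name, markup in [("well-formed", well_formed), ("tag soup", tag_soup)]:
    t = timeit.timeit(lambda m=markup: parse(m), number=50)
    print(f"{name}: {t:.4f}s for 50 parses")
```

Note that the tag-soup version is also shorter in bytes, which is exactly the parsing-time vs. transfer-size trade-off the comment above asks about.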
Olivier,
after discussions with some browser implementers, the processing time for a Web page runs from the quickest to slowest
John Resig has an article about performance, and there is also Browser Page Load Performance
Microsoft also has an article on issues in assessing browser performance.
@karl, many thanks for the pointers!
I started adding the contributions from these comments into the "why validate" document on the validator. Work in progress here: http://qa-dev.w3.org/wmvs/HEAD/docs/why
Sergey Shepelev, you do have a point about real-world application. It's true that it makes sense to make pages look good for the users, but you missed a few points regarding validation. While it seems absurd to write code for the "validators", it is actually about the browsers. While parsing the HTML content, browsers switch the rendering engine into Quirks Mode if they find the document not to be valid.
Obviously, Quirks Mode takes a bit longer to process, because the clients must show a normal page even if the HTML is "broken". But the problem is not rendering time on the client. For visitors, the rendering mode is transparent: they don't see anywhere whether they are viewing a website in Standards Compliance Mode or Quirks Mode. They just wait a fraction longer – maybe unnoticeably.
So the problem is not on the clients but on the server. Because clients take a bit longer to process web data, they also load the content at a slower pace, along with their rendering "speed". Sum that up, and the server will have many clients with longer sessions, which means the server gets stressed more than it needs to be. This is why, in my opinion, writing valid code is very important.
The question was: "Valid sites work better?"
The answer is: "They do work better, but never forget to view them in different browsers to adjust whatever doesn't look as it should."
Nicolae Crefelean, thanks for the answer. Even more, I think the user must not see a normal page if the HTML is really broken. If you open a broken .rtf in a text editor, it will be just a mess, and that's normal, because nobody produces wrong RTF. Producers of really wrong, unrenderable HTML must be punished and made to fix their errors.
But I completely hate HTML's "close tags" rule. It's fairer to say that HTML syntax as a whole is hostile to humans, and thus people should use proper software to generate renderable HTML. And writing schema addresses in the DOCTYPE is completely pointless – browsers don't fetch them. Their only effect is on the validator.
That said, I can imagine such a nice future where HTML was generated by software (and thus contained no syntax errors, like RTF/ODT/etc.) and browsers failed loudly on really unrenderable HTML. Perhaps I should try to create a proper editor.
The server problem is just funny and contrived. Session time doesn't consume proper (asynchronous) server resources: a single commodity box can hold 45K keep-alive sessions at almost zero CPU usage.
It's sad to have to agree on "never forget to check in different browsers". It would be nice if the same HTML looked the same everywhere.
I have found the sea change in clients adopting Web Standards – i.e., semantically written HTML/CSS code which passes validation and meets Accessibility requirements – ironic.
Three years ago, clients would reject any suggestion of semantics and validation. However, after Search Marketing experts agreed that semantically written HTML/CSS code which passes validation and meets Accessibility – i.e., Web Standards – benefits search engine result positions, clients want valid sites.
Return on investment through Web Standards.
I would love to have a browser that, when encountering the doctype, could optionally perform a validation on a page and if not valid refuse to display the page, offering the option then to ignore the required validation and display the page anyway.
As a career software developer, I shudder to think what life would have been like if the FORTRAN compiler said in effect "Eh, that syntax is pretty close...let's give you a number that might be what you want".
I think it's time for browsers to be cruelly resistant to non-standards-compliant pages. Of course... with the standards being open to interpretation, that's an entirely new argument to be had.
I feel that without browsers that are as strict as a compiler there will be too much art in the coding of a Web page. Save the art for the style and appearance of a page, yes, but the implementation of that artistic choice should be as rigorous as C code.
My two cents...I'd be happy to discuss.
Ahoy
I am disabled with multiple chronic illnesses which impair my eyesight, hearing, speech and motor skills. (But I look good when I'm very still and quiet)
I became involved with W3C in 1998 with my first attempt at building a simple Accessible webpage. Tim Berners-Lee's words inspired me. (And still do.)
By following W3C guidelines I found I could write HTML with only a text editor and explore my creativity by "seeing with my mind's eye". I didn't need the popular WYSIWYGs. The W3C validator assured me that my inexperience and (perhaps dubious) self-taught methods were not an issue and that I was achieving my goal: Accessibility plus Cross-Browser Interoperability (be that as it may).
I continue to faithfully use only a text editor and the Validator.
I am on a fixed (and quite meager) income. Someday Santa will bring me Adobe Flash and more importantly a new computer to replace my 5 year old refurbished eMac (hee hee) and I can do some really exciting things! Regardless, I know my webpages are compliant and render properly (as browsers will permit). The W3C Validator will always be my most important tool.
I received a grant in 1998 to build my first website for a local Art Foundation, to serve as its virtual gallery. The Foundation's building was too small to exhibit their most valuable and historic works.
Naturally it was imperative the website be attractive, easy to navigate and globally interoperable. Within weeks of its upload, the other major Art Institutes in my city shut down their websites and posted "Under Construction" notices (as was popular in the late '90s). Much to my surprise and sorrow, eleven sites, known nationally, still do not pass validation today. Where have they been?
The Art website is still on-line. (I often call it mine, since my name is still in the source code.) Someone else is charged with the upkeep, and in the last 4 years many areas are no longer compliant; the source contains proprietary code (and there are the mysterious dead links). They removed all the Valid Markup badges rather than investigate and correct simple errors, many of which are due to that proprietary code. (The "lazy" factor?) Yet I'm pretty proud of my first attempt, and the Foundation has not desired to change its design. Sadly, subsequent websites I was commissioned to build are no longer on-line due to the businesses closing. (Of course I have copies.)
You may not be able to understand how difficult it is for someone like myself to stay in sync with new W3C guidelines, upgrades and innovations in web design. There are entire years I have not been able to work on the computer due to illness. Everything has become more exciting, but definitely more complicated, in the past decade. Software for Accessibility is inaccessible to me due to cost, OS and, more often, interface. Many things are Windows-only. For instance, I cannot use MAGpie because of my handicaps. I must code text captions tediously, word by word, millisecond by millisecond. But I have been able to keep abreast of most changes simply by validating. For myself especially, the validator has also become an essential and invaluable learning tool. And I definitely perform cross platform/browser/SEO checks, because sometimes I am so far behind.
For someone like myself, limited and cut off physically from the world, the WWW opened a door for me I thought closed. It is my greatest escape and I am beyond thankful for the opportunity. I became productive and contributing citizen again and am only limited by my imagination. To me this is a gift. Care and attention to detail must be taken. Procedures set in place through the efforts of hundreds of people and thousands of man-hours demand respect. There is responsibility and accountability in all we do. Our webpages speak volumes of how we think and how we think about others.
To sum it up (finally) the W3C Validator does more than validate markup. It validates ... me!
~CAVU~
-LK
A little too sappy and philosophical? Yeah, I know ... someone else always has a far better sob story!!
Thank you so much!!! Great!!
I have been following the story all over the Internet. I started back in 1999, but unfortunately I failed to keep it up and stopped in 2003.
The idea of making a standard – I like that.
I come here regularly to inform myself about it.
If you look at what has been done in the field of computing, and what is possible on the Internet today, you really can only marvel.
Many thanks to all who made this possible.
So if it is OK, I will place a link to my site; if not, please remove it :-)
One simple reason for writing valid code was not mentioned: why ignore the orthography of markup when we don't do the same with written human language (maybe in IM chats we do, but not in official documents)?