It seems that I keep revisiting technologies or parts of projects I’ve worked on in the past. Recently, I had to deal with websites and HTML. At one point in my past, I wrote a web server and learned a bit about HTML and how all that worked. While I was still in college, I learned HTML and hand-coded my website (no more of that…too many other things to do, and it is far too complicated for me).

My recent project caused me to look at websites and run them through the W3C Validator. The results are kind of scary; most of the sites I looked at failed validation. What does this mean? It means that web browsers (and other tools that look at websites) have to work extra hard to handle non-standards-based sites. That extra effort is prone to errors and causes the software to not work well. Then the user blames the web browser or other software instead of putting the blame squarely where it belongs: on the website author. Every Tom, Dick, or Harry can put up a website. It sure doesn’t mean he has a clue what he is doing.

Even I have to periodically check my site to make sure it is valid; unfortunately for me, I’m still used to how things were done 10 years ago, much of which is no longer valid, so I make a lot of mistakes. Someday I’ll try to learn the right way to do things. By using blogging software and a web authoring tool, I don’t have to deal with it much. However, there are a few things I shove in that seem to cause problems, such as aligning text and graphics, setting borders on pictures, and using the target attribute in links, all of which are no longer valid.
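To give a sense of what the validator complains about, here is a rough sketch of the kind of substitution it pushes you toward under a Strict doctype. The file names and the class name are invented for illustration; the general idea is that the old presentational attributes move into CSS.

```html
<!-- Old-style presentational markup (fails validation under HTML 4.01/XHTML 1.0 Strict) -->
<img src="photo.jpg" align="left" border="2" alt="A photo">
<a href="page.html" target="_blank">A link</a>

<!-- Standards-based equivalent: presentation moves into a stylesheet -->
<style type="text/css">
  img.floated {
    float: left;            /* replaces align="left" */
    border: 2px solid #000; /* replaces border="2" */
  }
</style>
<img src="photo.jpg" class="floated" alt="A photo">
<a href="page.html">A link</a>
```

There is no direct Strict-doctype replacement for target="_blank"; forcing a new window generally means dropping the attribute or resorting to a script, which is part of why hand-coding to the standard feels like a chore.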
I don’t have an answer on how to solve this, but it gives me a newfound respect for the non-leading web browsers. Many web designers make their sites work in the leading browser and say tough luck to everyone else, even when their sites aren’t valid.