Validating your HTML

Back to my journey... I read somewhere that search engine robots can get confused for all sorts of reasons, one of them being badly written HTML.
I started to look behind the scenes at the HTML being generated by the WYSIWYG (what you see is what you get) editors I'd been experimenting with, and I was horrified!
There was so much duplication: empty table cells, oddly fixed-width columns, and a whole host of other bits and pieces.
That experience started me off looking at "real" HTML.
Every now and then I came across little W3C HTML Markup Validated symbols, so I thought I should check them out.
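To give a flavour of the problem, here is a made-up sketch of the kind of table soup a WYSIWYG editor of that era might generate for a single line of text (the widths and nesting are invented for illustration), followed by the hand-written equivalent:

    <!-- WYSIWYG-style output: nested tables, empty cells, fixed widths -->
    <table width="760" border="0" cellpadding="0" cellspacing="0">
      <tr>
        <td width="12">&nbsp;</td>
        <td width="748">
          <table width="748" border="0" cellpadding="0" cellspacing="0">
            <tr>
              <td><font face="Arial" size="2">Hello, world</font></td>
            </tr>
          </table>
        </td>
      </tr>
    </table>

    <!-- the "real" HTML equivalent -->
    <p>Hello, world</p>

Those little symbols, by the way, come from the free W3C validation service at http://validator.w3.org, which checks a page's markup against the published HTML standard.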
I found that my code was pretty invalid, but it still worked in Netscape (Mozilla) and Internet Explorer, so I decided I'd have to put validation on the back burner for a while.
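
If you'd rather not repeat my mistake, starting from valid markup is easier than retrofitting it. The skeleton below is a minimal HTML 4.01 Transitional page that should pass the validator as-is (the title and body text are just placeholders):

    <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
        "http://www.w3.org/TR/html4/loose.dtd">
    <html>
    <head>
      <!-- the validator needs to know the character encoding -->
      <meta http-equiv="Content-Type" content="text/html; charset=utf-8">
      <title>A minimal valid page</title>
    </head>
    <body>
      <p>Hello, world</p>
    </body>
    </html>

The DOCTYPE line matters: it tells the validator (and browsers) which version of HTML the page claims to follow.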
 