I am looking for a no-cost tool (or tools) that will crawl my website, validate my XHTML and CSS, and check all of my links. If I had a real Content Management System(tm), I am sure it would handle this sort of thing for me, but as I do not, and this must be a common problem, there has to be a solution out there somewhere…
Update: REL Link Checker Lite looks useful, but I am still looking for a total solution.
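To make the link-checking half concrete, here is a minimal sketch of the kind of crawl I mean, assuming Python with the third-party requests and beautifulsoup4 packages. The start URL and page limit are placeholders, and this is not a substitute for a real crawler:

```python
# Minimal link-checker sketch: crawls pages under START_URL, follows <a href>
# links, and reports anything that errors out or returns a 4xx/5xx status.
from urllib.parse import urljoin, urldefrag
import requests
from bs4 import BeautifulSoup

START_URL = "http://example.com/"   # hypothetical site root -- replace with your own

def check_links(start_url, max_pages=50):
    seen, queue, broken = set(), [start_url], []
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException as exc:
            broken.append((url, str(exc)))
            continue
        if resp.status_code >= 400:
            broken.append((url, resp.status_code))
            continue
        # Only parse and queue further links from pages on the same site.
        if not url.startswith(start_url):
            continue
        soup = BeautifulSoup(resp.text, "html.parser")
        for a in soup.find_all("a", href=True):
            link, _ = urldefrag(urljoin(url, a["href"]))  # drop #fragments
            if link.startswith("http") and link not in seen:
                queue.append(link)
    return broken

if __name__ == "__main__":
    for url, problem in check_links(START_URL):
        print(problem, url)
```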
5 Comments
See http://mindprod.com/jgloss/htmlvalidator.html
CSE HTMLValidator is the HTML validator I use to check my *.html web pages for syntax errors. Starting with version 5, it also checks *.css style sheets. See the HTMLValidator FAQ to get a hint of what it can do, or download the free light version. One thing the validator will complain about is bare &s in your CGI URLs; it wants you to spell them out as &amp;. The CGI server will still see them as plain & since your browser converts them back before sending the request.
Also, since you are validating XHTML, which is XML, it should be easy to do with a standard XML tool that checks a file against its DTD.
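For example, here is a small sketch of DTD validation using lxml (a third-party Python package wrapping libxml2); "page.xhtml" is a placeholder filename, and the DOCTYPE declared in the file determines which DTD gets fetched and used:

```python
# Validate an XHTML file against the DTD declared in its DOCTYPE.
from lxml import etree

parser = etree.XMLParser(dtd_validation=True,  # validate against the declared DTD
                         no_network=False)     # allow fetching the DTD over the network

try:
    etree.parse("page.xhtml", parser)
    print("valid")
except etree.XMLSyntaxError as err:
    print("invalid:", err)
```

The command-line equivalent with libxml2 installed would be xmllint --valid --noout page.xhtml.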
Google PageRank calculators (that do not use the Google Toolbar)
http://pagerank.walidator.com/
http://www.top25web.com/pagerank.php
Note that with both of these, it is possible to check multiple pages at once.
Some useful web services for validating various types of content are listed below; a small scripting sketch follows the list:
http://feedvalidator.org/ (validates RSS feeds)
http://jigsaw.w3.org/css-validator/ (W3C CSS validator)
http://validator.w3.org/ (W3C HTML/XHTML validator)
http://www.htmlhelp.com/tools/validator/batch.html.en (The WDG HTML Validator has a useful batch mode for validating multiple pages at once)
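If you want to script a batch check of your own pages against the W3C markup validator, a sketch like the following may work. It assumes Python with the requests package, and relies on the validator's "uri" query parameter and its X-W3C-Validator-Status response header; if the service has changed, adjust accordingly. The page list is a placeholder:

```python
# Batch-check a list of pages against the W3C markup validator.
import time
import requests

VALIDATOR = "http://validator.w3.org/check"
PAGES = [
    "http://example.com/",            # hypothetical URLs -- replace with your own
    "http://example.com/about.html",
]

for page in PAGES:
    resp = requests.get(VALIDATOR, params={"uri": page}, timeout=30)
    status = resp.headers.get("X-W3C-Validator-Status", "Unknown")
    errors = resp.headers.get("X-W3C-Validator-Errors", "?")
    print(f"{page}: {status} ({errors} errors)")
    time.sleep(1)  # be polite to the shared service
```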
This is a useful online tool to look at the HTTP headers that are returned from a page request.
http://web-sniffer.net/
I found this useful to diagnose problems with my RSS feeds. I could not see them properly in my browser because the server was returning a 304 (Not Modified) response.
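You can also reproduce that kind of conditional-request behaviour yourself. Here is a sketch, assuming Python with the requests package; the feed URL is a placeholder:

```python
# Inspect response headers and reproduce a 304 with a conditional request.
import requests

FEED = "http://example.com/rss.xml"   # hypothetical feed URL -- replace with your own

# First request: note the caching validators the server hands back.
first = requests.get(FEED, timeout=10)
print(first.status_code)
print(first.headers.get("Last-Modified"), first.headers.get("ETag"))

# Second request: send the validators back, as a browser or feed reader
# would, to see whether the server answers 304 Not Modified.
conditional = {}
if "Last-Modified" in first.headers:
    conditional["If-Modified-Since"] = first.headers["Last-Modified"]
if "ETag" in first.headers:
    conditional["If-None-Match"] = first.headers["ETag"]

second = requests.get(FEED, headers=conditional, timeout=10)
print(second.status_code)   # 304 here means the cached copy is still current
```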
This, I think, is the best site to validate your XHTML and CSS:
http://validator.w3.org/