Simplicity, carried to an extreme, becomes elegance.

— Jon Franklin

In this modern world of FooBooks and BarSpaces, personal web pages might look like an abandoned technology, but I like them. I encourage everyone to have one.


I am not a web designer, as should be painfully obvious from these pages, nor do I intend to become one. Indeed, I wish most web designers were doing less design and focusing more on helping people deliver accessible content. That said, I do care about typesetting and typography and I'm aiming for a slightly retro, very clean look.

Content, in my case, is mostly text, so that's what I'm trying to deliver. I try to follow standards and try to avoid using images for decoration or for navigational purposes. The images that I use on my pages, mostly photos, carry at least some meaning themselves.

There is usually no Livescr... Javascr... um... ECMAscript or whatever it's called this week on my pages. Almost all pages are static files containing a small subset of HTML 5.

I don't do cookies, unless you count the ones baked in an oven.

I use a simple, CSS-compliant stylesheet, but all pages should be perfectly viewable without CSS support. If your browser doesn't support CSS or doesn't support the HTML5 <nav> tag, the navigation above will look like an ordinary bulleted list. It should still be usable, though.

A lot of the information available on the interwebs is out of date, but even more annoying is that it is so hard to tell whether what you're looking at is out of date or was updated yesterday. To remedy this problem, at least for my own content, I include a timestamp at the bottom of each ordinary HTML file unless it is otherwise dated.
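The stamping step can be sketched as a shell snippet; this is just an illustration of the idea, not the actual build step, and the file name and date format here are examples.

```shell
#!/bin/sh
# Hypothetical sketch: append a "Last updated" line to a page when it is
# (re)generated. The file name and markup are examples, not the real setup.
f=page.html
printf '<p>Example page</p>\n' > "$f"        # stand-in for the generated HTML
printf '<p>Last updated: %s</p>\n' "$(date '+%Y-%m-%d %H:%M:%S')" >> "$f"
```

In the real pipeline a step like this would naturally run last, after the page body has been produced.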

Tools Used

Everything on the web site, including my web pages, is served as static content, except for server-side includes. No use of CGI is allowed. The main reason for this is that the host serving the web site is often the target of security attacks, probably because of a misunderstanding about what the word “hack” really means. For more about this, see What is a hacker?

Some of the static HTML files under my web pages were generated by a modified copy of an older version of txt2tags. A few lists of files produced by a trivial Perl script called flist might still be around.

An increasing number of pages were generated by a small Perl script I wrote, mdn, which I use together with the CommonMark Markdown parser to generate a small subset of validating HTML5. Yes, really, HTML5! Not that I use many of its features…

The HTML files are generated from plain text files with minimal markup, called Markdown, which is easy to write, at least compared with the HTML nightmare. A complete specification of this Markdown flavour is available.

mdn adds an HTML header with an optional server-side include to a navigation bar, an optional link to a stylesheet, uses the first line of the source file as the HTML title and closes the HTML page. It also removes any comments I may have added to the source file.
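The wrapping step described above can be sketched as a small shell function. This is only an illustration of the idea, not the real mdn script: the function names, paths, and the way the title is passed (as an argument rather than read from the first line of the source) are all invented for the example.

```shell
#!/bin/sh
# Sketch of an mdn-like wrapper (illustrative, not the real script).
# Reads an already-converted HTML body on stdin and emits a complete
# HTML5 page with a title, a stylesheet link, and an SSI navigation bar.
wrap_page() {
    title=$1
    printf '<!DOCTYPE html>\n<html lang="en">\n<head>\n'
    printf '<meta charset="utf-8">\n<title>%s</title>\n' "$title"
    printf '<link rel="stylesheet" href="/mc/nav.css">\n'
    printf '</head>\n<body>\n'
    printf '<!--#include virtual="/mc/nav.html" -->\n'  # server-side include
    cat                                                 # the converted body
    printf '</body>\n</html>\n'
}

# Example: wrap a one-paragraph body.
printf '<p>Hello</p>\n' | wrap_page "Hello page" > hello.html
```

In the real pipeline the body would come from cmark, as shown in the Makefile below.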

The original source used to produce the HTML of most files is available as either foo.t2t (for files intended for txt2tags) or foo.mdu (for Markdown encoded in UTF-8). The source file for this page, for instance, is colophon.mdu.

The blog and the accompanying feeds are generated by the Hugo static web generator. I used to use a slightly hacked Blosxom for a long time but converted to Pelican in early 2015 and then to Hugo in 2016. Before that the blog was just static HTML pages I wrote manually.

My photo albums were generated by a small shell script I wrote, simgal, but most of the real work was done by ImageMagick and jhead.
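The album step can be sketched roughly as below. This is not the real simgal script: the file names are examples, and the ImageMagick convert commands are only echoed here (a dry run) so the sketch runs anywhere; the real work would execute them.

```shell
#!/bin/sh
# Sketch of a simgal-like album step (illustrative, not the real script):
# for each JPEG, plan a thumbnail via ImageMagick and write an index entry.
: > index.html
for img in photo1.jpg photo2.jpg; do
    # Dry run: in a real album build this command would be executed.
    echo "convert '$img' -thumbnail 200x200 'tn_$img'"
    printf '<a href="%s"><img src="tn_%s" alt="%s"></a>\n' \
        "$img" "$img" "$img" >> index.html
done
```

jhead would fit in the same loop, for example to read EXIF data or to losslessly rotate the JPEGs before thumbnailing.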

The rather minimal CSS stylesheets were written by hand.

The HTML files have been validated with W3C's HTML Validator, the CSS files with the CSS Validator and the RSS and Atom feeds with the W3C Feed Validator. The last time I checked, everything validated correctly. This is important to me, so if you find something strange, please report it to me.

You should probably validate all your own material if you publish anything on the web. If you are thinking about adopting a new tool or a Content Management System, be sure to validate its output before investing time and money in it.

Ordinary Unix make from FreeBSD was used to automatically rebuild HTML files if anything had changed. The Makefile looks something like this (shortened for your reading pleasure):

TARGETS = index.html computers.html

all: $(TARGETS)

.SUFFIXES: .t2t .html .mdu

.mdu.html:
	cmark $< | mdn -s /mc/nav.css -n /mc/nav.html > $@

.t2t.html:
	txt2tags -t html --style "/mc/default.css" $<

sitemap.html: $(TARGETS)
	flist /home/mc/public_www > $@

sync: $(TARGETS) sitemap.html
	rsync -a --delete .

clean:
	rm $(TARGETS)

PDF and PostScript files were usually produced with either TeX, using the LaTeX macros, or Groff, a free implementation of troff, with my own macros based on ms.

Some of the PostScript files I wrote directly in PostScript, mostly just for the fun of it.

The images on my web pages may have been manipulated with programs such as ImageMagick, NetPBM or xv.

All text was written with GNU Emacs, a Lisp based operating system cunningly disguised as a text editor.

Last updated: <2018-04-20 08:54:45 MEST>