
It's a stunning example of why people don't even know what fast is. The use of framesets delays the loading of both frames by a full handshake on a cache-empty first load, and needs three handshakes in total JUST to load what would be two as a monolithic HTML file plus a stylesheet.
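To illustrate (the filenames are hypothetical, the mechanics are not), the frameset document has to arrive before the browser even learns the frame URLs:

<!-- handshake 1: the frameset page, which contains no content at all -->
<frameset cols="142,*">
<frame src="menu.htm" name="left">
<frame src="top.htm" name="right">
</frameset>
<!-- handshakes 2 and 3: menu.htm and top.htm, neither of which can even START until handshake 1 completes -->

With a monolithic page, handshake 1 delivers the actual content and handshake 2 delivers a stylesheet that then caches for every page after it.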

The menu sidebar by itself is larger than the markup for the entire home page (menu AND content) has any reason to be. I'd ballpark the entire page as currently designed to warrant maybe 2.5k of HTML, but he's blowing 7.12k across three files.

And honestly, given the site's simplicity, I very much doubt it would take more than 5k of CSS to implement.

The kicker being the subpages are so bloated they are NOT faster, particularly with the overhead of frames and tables in the mix. If you used a monolithic stylesheet and a third of the markup, you could precache the appearance of the subpages, making them load many times faster than what he currently has.

When you're reducing:

<table border="0" width="142">
<tr>
<td width="15"> </td>
<td width="10"> </td>
<td width="148"> </td>
</tr>
<tr>
<td width="15"> </td>
<td width="10"><font color="#FFCCCC">?</font></td>
<td width="148">
<p><a href="top.htm" target="right">???</a></p>
</td>
</tr>

To:

<nav id="mainMenu">
<ul>
<li><a href="./">Top</a></li>
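Plus a few lines in that monolithic stylesheet to recreate what those spacer columns were faking. A minimal sketch, widths lifted from his table, selectors assumed:

#mainMenu {
width: 142px; /* his table says 142 while its columns total 173, so pick one */
}

#mainMenu ul {
list-style: none;
margin: 0;
padding: 0 0 0 25px; /* the 15px + 10px spacer columns, in one declaration */
}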

You can see how it is nowhere NEAR as fast as one might think. The only REAL reason it's "fast" is that it's utterly devoid of significant content. It is easily consuming, on average, twice the bandwidth needed and 50% more handshakes/server requests.

And that's before we talk about the GIANT MIDDLE FINGER to usability and accessibility.

I might write an article about that, making a workalike just to show it can be done, because it is a prime example of everything wrong with the presentational markup mindset. I'd have to make faux content though, since he has a disclaimer about copying the site's content, and I try to honor those... usually.

As to browser support, there’s really no reason not to maintain it. Visual user agents — aka browsers — really don’t need to care what the markup is; it’s all just either block or inline level tags. Maintaining their functionality shouldn’t be a big deal in terms of code or processing overhead, given this is stuff we used to run on 386s running Windows 3.1.
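Which is roughly all a UA's default stylesheet does with the old presentational tags anyway. A sketch of the idea (illustrative, not any browser's actual internal sheet):

center { display: block; text-align: center; }
font { display: inline; }
u { display: inline; text-decoration: underline; }
big { display: inline; font-size: larger; }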

Though I do think that legacy features should be blocked if you put a modern doctype on it. It’s also why I think it was utterly dumbass of the WHATWG to get rid of version tracking with their whole “living document” idiocy.

The mere idea that code I ran past the validator two years ago as “HTML 5”, completely valid at the time, can be invalid today, and what’s valid today can be invalid tomorrow, with ZERO way to track what “version” of HTML 5 was used in the document? Utterly dumbass.
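Compare what we threw away. An HTML 4.01 doctype told you exactly which ruleset the document was written against:

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd">

The living-standard replacement tells you nothing but “HTML, whatever that means this week”:

<!DOCTYPE html>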

Just like how it’s dumbass to need a “validation service” instead of having the user agent just go “NO”, stop dead in its tracks, and tell the developer to fix things. This “blindly plod on as if nothing is wrong” parser methodology sucks out loud.

But at the same time, the fact that 80%+ of all websites would break overnight if browser makers implemented such blocks, just because some numbnuts framework or off-the-shelf blog plugin moron put a <style> tag inside <body>… that wouldn’t go over well.
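If you’ve somehow never seen it, this is the sort of markup (illustrative, not lifted from any one framework) that countless live pages depend on parsers silently swallowing:

<body>
<style>
/* styles injected mid-document instead of living in the head or an external sheet */
.widget { color: #F00; }
</style>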
