It's been a few years since my last whinge about website fouls, and the web landscape has changed a bit since then. While the fouls listed on that page are still unacceptable, most are found far less often than they used to be.
This page is a new list of website crimes that I've encountered more recently, whether they be poor design decisions, software bugs, or simple oversights.
Severity: high. Prevalence: moderate.
On any page which sends personal data to or receives it from a visitor, such as their home address, date of birth, credit card or bank details, or any other uniquely identifying data such as an NI number or SSN, you must make sure that the connection is encrypted. Otherwise you're taking chances with your visitors' data security.
Unfortunately I keep finding websites, especially small shopping sites, which do not use encryption to protect personal data in transit. Some sites use an encrypted connection for the pages which take credit card details, but forget to protect pages which involve the visitor's full name, address, and shopping cart contents. Other sites make an attempt to use SSL, but my browser warns that the certificate is invalid or expired. In any of these cases, I exit the site and make a mental note never to go back.
Two examples from organisations which are really too big to be excused for this mistake are jacobsdigital.co.uk (which currently [September 2010] does not use encryption for the stage in its order process when your name and address are being transmitted) and the jobs pages at Wandsworth Council (which have a perfectly functioning encryption certificate, but did not use it by default last time I checked in January 2010).
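For what it's worth, forcing encrypted connections is usually a one-off server change rather than a page-by-page job. As a minimal sketch, assuming an Apache server with mod_rewrite enabled and a working certificate already installed, a .htaccess file can bounce every plain HTTP request over to HTTPS:

    RewriteEngine On
    # If the request arrived unencrypted, redirect permanently to HTTPS.
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]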
Severity: moderate. Prevalence: moderate.
A quality site will use standards-compliant HTML with accessibility in mind, and will only use JavaScript to add convenience features, such as suggesting ways to complete a text field in a form, or warning visitors when they've typed invalid data into one. A quality site will still be perfectly readable and accessible even if JavaScript is disabled in a visitor's web browser.
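As a minimal sketch of that approach (the form and wording here are illustrative, not taken from any particular site), the form below submits and works perfectly well without JavaScript; the script merely adds a courtesy warning about an empty query:

    <form id="search" action="/search" method="get">
      <input type="text" name="q">
      <input type="submit" value="Search">
    </form>
    <script type="text/javascript">
      // Convenience only: warn about an empty query before submitting.
      // With JavaScript disabled, the form still submits to /search,
      // and the server remains responsible for handling the empty case.
      document.getElementById("search").onsubmit = function () {
        if (this.elements["q"].value.replace(/\s/g, "") === "") {
          alert("Please type something to search for.");
          return false; // cancel this submission
        }
        return true;
      };
    </script>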
Unfortunately, a large number of websites now lazily rely on JavaScript to do more or less everything, and these websites become a broken heap without it. For instance, the online store scan.co.uk currently [September 2010] has a search engine which does nothing unless you have JavaScript enabled. It doesn't even tell you that there's a problem: you just click on Search and get nothing whatsoever.
Because JavaScript is used as an attack vector in so many hacked websites, I use browser plugins which make it impossible for a website to run JavaScript in my browser unless I specifically mark the domain as trusted first, and I recommend this approach to everyone. So the prevalence of JavaScript-dependent sites is frustrating.
Severity: moderate. Prevalence: low.
As with JavaScript, the popularity of Adobe Flash has soared, and there are a few websites which show you nothing unless you have Flash installed and enabled. While Flash is suitable for streaming video and offering web-based games and software tools, it's really not the best medium for textual content, and trapping text inside a Flash interface just makes that content far less accessible to people with sight impairment.
There is also a fair amount of concern that almost every browser has this one brand of software installed, and that it executes automatically when requested by any website. This makes it far easier for producers of malware to attack people's machines through their web browsers, because there's no diversity in the software ecosystem: if a weakness is found in a recent version of Adobe Flash, the vast majority of visitors will be vulnerable. (For instance, of all the visitors to my site who are using Flash, 77.56% are using one of three recent releases.) Just to be safe, I use browser plugins which ban Flash from running on any site until I mark its domain as trusted.
Severity: moderate. Prevalence: moderate.
Worth mentioning while I have my ranting hat on: the website which is suddenly reorganised so that every page gets a new URL.
It's certainly the case that every now and again a page, or a whole section of a site, will have to be moved, resulting in new URLs. But handled well, with HTTP 301 status codes used to redirect requests to the new URL (at least until requests for the old URL have died off), it's not a big problem.
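Setting up such redirects takes only a couple of lines of server configuration. A minimal sketch in Apache's mod_alias syntax, with made-up paths:

    # A single page that has moved:
    Redirect 301 /old-page.html /new-section/new-page.html
    # A whole section that has moved, matched by pattern:
    RedirectMatch 301 ^/reviews/(.*)$ /articles/reviews/$1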
It becomes a problem when a site suddenly gets a taste for it, and rejigs its structure every twelve or eighteen months, with no redirects. In the past I've found NVIDIA guilty of this practice, with pages far too often vanishing, moving, or changing content completely.
From a visitor's point of view this is a mild annoyance, but from a webmaster's point of view it's a real hassle, hunting down broken links every six months. If I notice that a URL has changed more than once in a few years, I tend to give up and remove the link altogether, and I'm guessing that I'm not alone, because rumour has it that Google punishes pages which contain broken links.
Severity: low. Prevalence: high.
This one is not a major crime, but still disappointing from big companies. When you're looking for the specifications, or the user manual, or other information about a product you own, it's frustrating to find that the manufacturer's website has already deleted the pages relating to your product just because a new model has been released.
The best companies have product pages for their entire back catalogue of products, and it's a pity that most companies don't follow this model.
Severity: moderate. Prevalence: low.
This is far less common than it used to be, but now and again I find a website that seems promising, but refuses to show you much unless you register with the website.
This makes no sense. Why would I surrender my email address, then go to the trouble of reading the site's terms and conditions and choosing a strong password, in order to create an account with a site when I've no idea whether or not the site is any good?
One example of this is at guru.com (a freelance project auction site), where you currently [September 2010] can't read the full project requirements unless you create an account. It would be very frustrating to spend time creating an account, only to log in and find that the details you previously could not see actually rule you out of the running for that project. So long as the full details page offers no way of contacting the project owner outside of the website, there's no good reason to hide it from unregistered visitors.
Severity: high. Prevalence: low.
This complaint is actually just an update of an old complaint from my previous list of fouls, and luckily it's nowhere near as common as it used to be. But late last year [December 2009] I found that the Microsoft adCenter website stated, on its "system requirements" page, that it only worked with Internet Explorer. Looking at the same site today [September 2010], the system requirements now specify either Internet Explorer or Firefox, but you must be using the Windows operating system, which is pretty poor form.
With HTML so firmly established now, no major website should require a specific browser or operating system. A serious website should work on any operating system, when viewed with any web browser that supports the bulk of the current HTML standards, such as Opera, Safari, Firefox, and even Konqueror (which scores a lot higher on the Acid3 test than Internet Explorer 8).
Note that I don't agree that a website should work with all versions of every browser. Trying to support ancient versions of a browser when there are freely available updates is madness, and it's worth noting that Google has effectively announced the end of support for the ancient, non-standards-compliant, always-rubbish Internet Explorer 6.
Severity: low. Prevalence: moderate.
Now that widescreen computer displays and televisions are common, most web pages constrict the width of their body element, so that the visitor's eyes don't have to scan across the entire width of their screen to follow each line of text. This is a great idea, unless the page designer forgets to centre the body element in the page, and leaves it hugging the extreme left edge of the web browser window, giving the visitor a crick in their neck.
While the visitor can work around this by adjusting the width and position of their web browser window, there's really no excuse for having the web page laid out like this when it's so easy to centre it. (Just add "margin: auto;" to the CSS styles for the body element.)
Severity: moderate. Prevalence: low.
Don't use client-side redirects. And, if you do, test to make sure they don't make it impossible to use the Back button.
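The usual trap is that the redirecting page stays in the browser's history, so pressing Back lands the visitor on it and it immediately bounces them forward again. If a script-driven redirect really is unavoidable, replacing the current history entry avoids the trap; a minimal sketch, with an illustrative target path:

    <script type="text/javascript">
      // replace() swaps the current history entry for the new page, so
      // the Back button skips over the redirecting page entirely.
      window.location.replace("/new-location/");
      // Assigning window.location.href instead would add a new history
      // entry, leaving the redirecting page behind to trap the visitor.
    </script>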
Severity: low. Prevalence: low.
There seems to be a growing fashion for serving websites from multiple domains. I know that Google recommend using a separate domain for cacheable, static content, but I believe they also recommend keeping the number of domains small, perhaps two or three.
A few sites are getting carried away, however, and going all out to split their content, images, scripts, Flash movies, analytics, and advertising across as many domains as possible. For instance, gamespot.com holds the record in my book: some of its pages (such as its video reviews) currently [September 2010] require your browser to send requests to no fewer than fourteen separate domains to load fully.
Such pages seem to take forever to finish loading, because if any one domain hits a snag it delays completion. Plus, one of the security plugins I use makes it impossible for one domain to request content from another domain unless I mark the connection as trusted. This becomes such a hassle for sites that are split across a large number of domains that I simply give up and go elsewhere.
Severity: low. Prevalence: low.
If your search engine has encountered a problem, or there are no results to return, then tell the visitor what's going on. It's confusing and annoying to click on Search, only to see the page remain stubbornly blank. And it makes the website look like the work of amateurs.
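Covering the empty case needs only a line or two in the results template; a minimal sketch, with illustrative wording and class name:

    <!-- Always render something in the results area, even when the
         search found nothing or the search service failed. -->
    <div class="results">
      <p>No results were found for your search.
         Try fewer, or more general, search terms.</p>
    </div>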