By Scott Abel, TheContentWrangler

It’s frustrating. Spammers keep finding sneakier ways to populate web pages and blogs with those ever-so-irritating “spamvertisements.” Although it would be convenient to blame the spammers for the junk you find on blogs and websites, it’s actually the site owners who are responsible. Take the Organization for the Advancement of Structured Information Standards (OASIS). While the standards body is good at many things, building websites that keep out spam is not one of them.

[Image: screenshot of a spam post on the OASIS DITA site]

Consider dita.xml.org, home to information about the Darwin Information Typing Architecture (DITA). The site is nicely designed and loaded with useful information. However, because OASIS has not put proper protections in place (a standard practice in most organizations), it has to spend time removing trash postings like the one above. And, perhaps more importantly, site users have to navigate around the trash in order to get to the good stuff.

[Image: example of a CAPTCHA challenge with distorted text]
To prevent such garbage posts in the future, the organization should employ a security feature known as a CAPTCHA, a program that protects websites from spammers and their spambots by generating and grading tests that humans can pass but current computer programs cannot. For example, humans can read distorted text (as shown above), but current computer programs can’t. OASIS could also empower site editors to monitor and clean up any trash left behind by spammers savvy enough to sneak through security.
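For readers curious about the mechanics, here is a minimal sketch in Python of that generate-and-grade idea. The function names, the in-memory challenge store, and the text-only challenge are illustrative assumptions, not any real CAPTCHA service’s API; a production site would use an established service (such as reCAPTCHA) and render the answer as a distorted image rather than plain text.

```python
import secrets
import string

# Illustrative sketch only: the server generates a challenge it can grade
# later, shows the user a distorted rendering of it (rendering not shown),
# and accepts the post only if the typed answer matches.

_pending_challenges = {}  # challenge_id -> expected answer (hypothetical in-memory store)

def issue_challenge(length: int = 6) -> tuple[str, str]:
    """Create a random challenge string and remember the expected answer."""
    answer = "".join(
        secrets.choice(string.ascii_uppercase + string.digits) for _ in range(length)
    )
    challenge_id = secrets.token_urlsafe(16)
    _pending_challenges[challenge_id] = answer
    return challenge_id, answer  # the answer is what would be drawn as distorted text

def grade_response(challenge_id: str, user_input: str) -> bool:
    """Accept the submission only if the answer matches; each challenge is single-use."""
    expected = _pending_challenges.pop(challenge_id, None)
    return expected is not None and user_input.strip().upper() == expected

if __name__ == "__main__":
    cid, answer = issue_challenge()
    print("Challenge (would be shown as a distorted image):", answer)
    print("Accepted correct answer?", grade_response(cid, answer))
    print("Accepted replayed answer?", grade_response(cid, answer))  # False: already used
```

The point of the sketch is simply that the test is cheap for the site to generate and grade, but (when properly distorted) expensive for a spambot to solve automatically.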

Spammers are an ingenious lot. They’ll keep finding new ways to get their often convoluted messages out. It’s our job to do our best to keep them from succeeding.