Email stamps are a good example of a solution doomed to failure. Why do solutions like this fail?
Avoiding resistance from users is the first key to success. Email stamps make sending email more complicated. They also take something that was free and add a price tag to it.
Taking something that is already widely implemented and adding counter-intuitive terms to its use is always going to encounter resistance. Users don't want additional complexity added to email.
In much the same way, taking something and raising its cost, or worse, making something previously free cost money, is one of the reasons so many Internet start-ups have failed over the past ten years (if we are to listen to social psychology). People do not like feeling cheated out of something that is already theirs.
Needless complexity and charging for something that was previously free, especially in a competitive marketplace where users can simply switch providers, are paths to failure.
Tying these together leads us to the concept of naivete: how practical is it for a system to reach professionals, be accepted by them, and then be implemented?
The technology for email stamps requires a large part of the world to implement it before it works at all. The world is a web of complex, interconnected systems, and expecting everyone, or even a large portion of them, to do as you ask (if you can even reach them) is simply not plausible.
My second example from the previous blog, blocking port 25--a very different approach--worked immediately for those who did implement it.
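As an illustration of why port 25 blocking works unilaterally, it is typically just a firewall rule at the provider's edge; no one else on the Internet needs to cooperate. A minimal sketch using Linux iptables (the relay address here is hypothetical, standing in for the provider's own sanctioned mail server) might look like:

```shell
# Permit outbound SMTP only from the ISP's own mail relay
# (203.0.113.10 is a placeholder documentation address).
iptables -A FORWARD -p tcp --dport 25 -s 203.0.113.10 -j ACCEPT

# Drop all other outbound SMTP from customer hosts, cutting off
# direct-to-MX spamming by infected machines on the network.
iptables -A FORWARD -p tcp --dport 25 -j DROP
```

Legitimate customers are unaffected because modern mail clients submit mail via the provider's relay (or port 587), so the benefit arrives the moment a single network deploys the rule.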
Anti-spam work introduced the world to the FUSSP, the Final Ultimate Solution to the Spam Problem, or sarcastically, the "perfect solution": You Might Be An Anti-Spam Kook If... http://www.rhyolite.com/anti-spam/you-might-be.html
It enumerates the ways in which "new" and "amazing" suggestions for solving the spam problem go wrong: if only "everyone" (or most people) used their solution, or if users were "forced" to act counter-intuitively (and similar truisms), spam would be "gone." It is well worth a read.
Mapping why some solutions work while others can't even get off the ground, and seeing how communities and social systems change, is fascinating. The examples above, and many other lessons from fighting cybercrime, are illuminating, especially when we consider that most of them derive from failed attempts to solve a human problem with a technical solution, a common design fallacy in this day and age.
In the next blog in this series, I will discuss security by obscurity as used by attackers.
Follow Gadi Evron on Twitter: http://twitter.com/gadievron
Gadi Evron is an independent security strategist based in Israel. Special to Dark Reading.