One advantage of technology adoption in the developing world is that it often means installing the most modern equipment and software, bypassing legacy systems that are difficult to replace and expensive to support.
A case in point is the mobile phone network. In most of the developing world, telecom infrastructure is minimal, particularly in rural areas, and the only phones available are mobile ones. In much of Cambodia, people use mobile phones for personal communications, and there are also services that rent out phone time for a small fee. Installing a few mobile towers in remote areas can give most of the population good access to communications with minimal infrastructure investment. Most of the rural population simply skips hard-wired line technology altogether.
There are advantages to incremental change over such large technological leaps, however. In the developing-world mobile phone example, power systems and education need to advance in parallel with the phone infrastructure, but often don't. If these regions lack rural electrical systems or inexpensive energy in any form, powering the cell towers, charging phones, and running switching equipment is expensive.
And if they haven't had experience with the basic technologies, they end up with advanced technology that nobody really understands, and must bring in external help for even the most basic problems. Certainly, the role of the consultant (I am but one of many) is significantly more important in developing nations than in the developed world.
This phenomenon applies to security worldwide. Take Web 2.0: We've got a whole new generation of Websites coming online built on extremely advanced technology, which makes it easy to write code with little knowledge of what actually constitutes good code. Developers who followed the trail of CGI, ASP/JSP, servlets, and finally Ajax and Rails typically have the depth of knowledge to write code properly. But an organization moving directly to a Web 2.0 presence without the Web 1.0 experience will likely hit hidden pitfalls in the transition.
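One classic lesson from the Web 1.0 trail that a team jumping straight to a fancy Ajax front end can miss is output escaping. As a minimal sketch (the handler names here are hypothetical, not from any particular framework), compare splicing raw user input into HTML against escaping it first:

```python
import html

def naive_greeting(name: str) -> str:
    # The pitfall: raw user input spliced straight into markup.
    # A name like "<script>...</script>" becomes live code in the page.
    return "<p>Hello, " + name + "!</p>"

def safer_greeting(name: str) -> str:
    # Escaping turns markup characters into inert entities,
    # so the same input renders as harmless text.
    return "<p>Hello, " + html.escape(name) + "!</p>"

payload = "<script>alert('xss')</script>"
print(naive_greeting(payload))  # script tag survives intact: injectable
print(safer_greeting(payload))  # &lt;script&gt;...: displayed, not executed
```

A developer who lived through hand-rolled CGI scripts tends to reach for the second form by reflex; one who started with a framework that "just works" may never have seen why it matters.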
The Web is certainly not the only domain in which a problem like this may arise. Think about wireless networks, data warehousing, VPNs, etc. Somebody who has worked with predecessors of these technologies (wired networks, small databases, dial-in access) has knowledge that they may not even realize is relevant here.
What's the solution to avoiding these pitfalls? Education and measurement. I have a computer science degree from a liberal arts college, where I learned not just how to write code, but how to think about code, and, to some extent, the history and theory of computers. We had lots of math. We wrote proofs of correctness. All of this was frustrating at the time, but the education taught me how to think about problems in the context of computation.
Then there's measurement. I've been reading Security Metrics: Replacing Fear, Uncertainty, and Doubt by Andrew Jaquith. The premise of the book is that we should be able to use real information security metrics to answer questions about security. Certainly, we can determine how much implementing a particular technology will cost, but without measuring the security impact, you won't know the true cost in the long run. Sure, you can let the marketing folks implement a fancy Ajax Website, but hiring an experienced developer to do the same thing might wind up being cheaper.
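That trade-off can be put in rough numbers using the classic annualized loss expectancy (ALE) metric from risk analysis. A minimal sketch, with purely hypothetical figures chosen for illustration:

```python
def total_cost(build_cost: float, annual_loss_expectancy: float,
               years: int = 3) -> float:
    """Up-front build cost plus expected security losses over the horizon.

    ALE is the classic risk metric: single-loss expectancy
    multiplied by the annual rate of occurrence.
    """
    return build_cost + annual_loss_expectancy * years

# Hypothetical figures: a cheap, inexperienced build with high expected
# breach losses versus a pricier, experienced build with low losses.
quick_ajax_site = total_cost(build_cost=20_000, annual_loss_expectancy=15_000)
experienced_dev = total_cost(build_cost=50_000, annual_loss_expectancy=2_000)
print(quick_ajax_site)  # 65000
print(experienced_dev)  # 56000: the pricier build is cheaper once risk counts
```

The point isn't these particular numbers; it's that without a measured estimate of the security impact, the comparison never gets made at all.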
Nathan Spande has implemented security in medical systems during the dotcom boom and bust, and suffered through federal government security implementations. Special to Dark Reading