Wednesday, March 17, 2010

Our Users aren’t as Dumb as We Think

For system administrators and security professionals, one of the most common complaints is that end users don’t care, don’t take the right precautions, or don’t listen to what we tell them.  All of that may be true to some extent, but maybe we’re missing the obvious as well.  While complying with the law is not optional, the methods we use to secure our data often give us more flexibility than we think.  An alert reader on the Security Metrics list noted an article entitled “So Long, and No Thanks for the Externalities: The Rational Rejection of Security Advice by Users,” which discusses the results of work done by Microsoft Research.  The article’s abstract notes:

“It is often suggested that users are hopelessly lazy and unmotivated on security questions. They choose weak passwords, ignore security warnings, and are oblivious to certificate errors. We argue that users' rejection of the security advice they receive is entirely rational
from an economic perspective. The advice offers to shield them from the direct costs of attacks, but burdens them with far greater indirect costs in the form of effort.  Looking at various examples of security advice we find that the advice is complex and growing, but the benefit is largely speculative or moot. For example, much of the
advice concerning passwords is outdated and does little to address actual threats, and fully 100% of certificate error warnings appear to be false positives. Further, if users spent even a minute a day reading URLs to avoid phishing, the cost (in terms of user time) would be two orders of magnitude greater than all phishing losses.
Thus we find that most security advice simply offers a poor cost-benefit tradeoff to users and is rejected. Security advice is a daily burden, applied to the whole population, while an upper bound on the benefit is the harm suffered by the fraction that become victims annually. When that fraction is small, designing security advice that is beneficial is very hard. For example, it makes little sense to burden all users with a daily task to spare 0.01% of them a modest annual pain.”
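The phishing example in the abstract is easy to sanity-check with back-of-envelope arithmetic. The sketch below uses illustrative numbers of my own choosing (the population size, wage, and loss figures are assumptions, not values taken from the paper) to show how “a minute a day” compounds into an aggregate cost far exceeding the losses it prevents:

```python
# Back-of-envelope sketch of the abstract's phishing cost-benefit argument.
# All numbers below are illustrative assumptions, not figures from the paper.

users = 180e6                  # assumed online population
minutes_per_day = 1            # time spent inspecting URLs per user
hourly_wage = 15.0             # assumed value of a user's time, $/hour
annual_phishing_losses = 60e6  # assumed total annual phishing losses, $

# Aggregate annual cost of following the advice, priced as user time
advice_cost = users * (minutes_per_day / 60) * hourly_wage * 365

ratio = advice_cost / annual_phishing_losses
print(f"advice cost: ${advice_cost:,.0f}/yr")
print(f"cost/benefit ratio: {ratio:,.0f}x")
```

Under these assumptions the time cost runs to two orders of magnitude more than the losses, which is the shape of the tradeoff the authors describe: a small, certain daily cost for everyone against a small chance of a modest loss for a few.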

In our private lives, we can follow these security and safety precautions as strictly as we like, because not all activities are measured in dollars and cents.  If someone derives some utility from washing his hands every fifteen minutes on the off chance that some invisible deadly germ might suddenly appear, who are we to argue?  There is an infinitesimally small chance that failing to wash at that frequency could lead to death, but few for-profit businesses could survive if they required their employees to follow that regimen.  People often misperceive risks, but many perceive risks correctly and still engage in abnormally risky or cautious behavior.  To them, it’s simply a matter of utility.  For example, the very small risk of dying in a plane crash can be enough for someone to forgo flying if the need or desire to fly is minimal.  However, driving down a highway at 10 mph for fear of car crashes would be inappropriate even if it only impeded other drivers and didn’t increase the risk of an accident (which very slow driving has been shown to do).

When our actions work in tandem with others’, some happy medium is needed.  Hence the risk/reward tradeoff we find in business.  When one chooses to work for or do business with a company, one necessarily accepts its risk model.  Consequently, a super-secret intelligence agency may find it worth the cost to carefully read every URL to avoid phishing.  For the rest of us, it’s just not worth it.  And yet the tips keep coming, and organizations plug them into training materials without any thought to their own risk profile (or bottom line).  We lock people out of their accounts after three unsuccessful complex-password tries even though there isn’t a scintilla of evidence to suggest that a higher number would be too risky.  We simply take what we receive from some official-sounding source as gospel and implement it, because making a risk-based decision would mean someone would have to be accountable.  Someday people will realize that deferring every decision to some unknown entity who knows little about your environment or risk tolerance is not only economically silly; it may be riskier.  I relish the thought of a lawsuit in which a company is successfully sued for having security policies so strict that any reasonable person in that environment would choose to ignore them.  Now, who was it that needed security training?
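The lockout claim above can also be sanity-checked. The sketch below makes two illustrative assumptions of mine: an online attacker guessing uniformly at random, and an 8-character complex-password policy over roughly 72 characters (real attackers use wordlists, so treat these as rough lower bounds). It estimates how much raising the retry threshold actually changes the odds that one lockout window’s worth of guesses succeeds:

```python
# Sketch: does raising the lockout threshold meaningfully raise the
# chance an online guessing attack succeeds? Assumed policy: 8-char
# passwords drawn from ~72 characters; attacker guesses at random.

charset = 72            # upper + lower + digits + ~10 symbols (assumed)
length = 8              # assumed minimum password length
space = charset ** length

for tries in (3, 10, 100):
    p = tries / space   # chance the guesses in one lockout window succeed
    print(f"{tries:>3} tries before lockout: p(success) = {p:.1e}")
```

Under these assumptions the odds at three tries and at ten tries are both vanishingly small, and the gap between them has no practical effect. That is exactly the kind of risk-based comparison the threshold deserves, rather than a number copied from an official-sounding source.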
