In the 1953 movie “Houdini,” Bess Houdini cryptically explains how her husband was able to escape from a locked safe: safes are made to keep people out, not to keep people in. That would not have been enough information to coax me to walk into a safe and shut the door behind me, of course. Houdini used other tricks that Bess was not revealing. According to J.C. Cannell, author of “The Secrets of Houdini,” the magician was able to escape from the safe because he was given access to it prior to his performance and simply rigged it to permit his escape.
That is something like a person who has access to information in an encrypted file. Sure, the file is “locked.” But people who are given access to the opened file can copy and paste its contents onto a USB drive. Or e-mail it. Or print it. Or click “Save as…” Once they are given access to information, they can successfully make their escape.
One job of the auditor is to learn whether, and how, they do it.
It should be no surprise that people who are granted access to sensitive information are largely responsible for its exposure. Information security professionals often say that people are the weakest link in the security chain. But I’m convinced that a good part of the time, people are not careless with information for bad reasons. In fact, in my experience, people are often careless with information for good reasons. Take, for instance, a company that was trying to secure sensitive information on its network. Many of its staff used high-end desktop computers to conduct complex calculations on large data sets, while others used more expensive, encrypted laptops for lighter work.
It came to management’s attention that during off-hours, staff who were assigned desktops were connecting their home computers to the VPN to stay productive at home. They were paid by the hour. Management implemented, and technically enforced, a policy that prevented home PCs from using the VPN. This was wise because it prevented staff from relocating sensitive information onto home computers that the company did not control.
However, a few weeks later, a little interviewing on my part revealed that the staff, who still had every incentive to remain productive off-hours, had started copying unencrypted information from their office workstations to pen drives and taking it home to work on. There were no reports of lost pen drives, but the staff clearly faced a conflict between their incentives and the controls. They were careless with sensitive information, and had a good reason to be.
When I audit for information security, I check for technical controls, management controls, IT governance, etc. But I also keep in mind that quote attributed to Bess Houdini: safes were made to keep people out, not in. My audit interviews, therefore, focus heavily on the people who are given access to information. When I interview them, I do not treat them as suspects, but as conscientious people who really do want to do the right thing, and who are put into a double bind.
Consider two studies conducted last year. “Data Breaches: Trends, Costs and Best Practices,” published by IT Governance, reported that 68 percent of information handlers surveyed violated their companies’ security controls so they could get their jobs done. A survey conducted by RSA in the same year corroborates those findings, reporting that “over 50%” of staff violated security rules.
As information security professionals, we need to be aware that no matter how good our security controls and recommendations are, they necessarily interfere with people getting their jobs done. Just as a lock on your car door slows you down when getting into your own car, restrictions on VPN access, encrypted files, and the like slow down our work. And the people who are subject to those interfering controls have incentives to work around them, much as Houdini had an incentive to break out of the safe he was given access to.
In the case of the company I described, management kept the policy that prevented home computers from joining the VPN, but it also dealt with the policy’s unintended consequence: it began providing high-end, encrypted laptops to staff who needed to work out of the office and earn more money for the company. Because the careless use of USB pen drives to carry unencrypted data was an unintended consequence of the new policy, it came to management’s attention only through careful interviews with the people who were given access to that data.
Here are some tips for learning the data handling secrets of the Houdinis you encounter:
Interview as many people who handle information as you can. If one person knows the secret to data escape, then a few will. One may be able to keep a secret, but not all.
Earn a reputation as an auditor who makes your interview subjects’ lives better. If people want to do the right thing while also acting on their incentives, they will welcome sympathetic auditors as a sounding board for their complaints.
Do not champion a security control just because it blocks a known risk. Think about the additional risks it may create if people feel compelled to work around it. A professional or personal incentive will usually trump a security incentive.
What are some of the methods you use to audit people?