The Modest Auditor

May 28, 2009

I frighten people. I don’t mean to, but I do. It usually happens when I tell them what I do for a living. I hack corporate networks, and then show them how to keep me from succeeding next time. It’s a more glamorous and descriptive way of saying “information security auditor.”


But while it is part of our public lives to be a little unnerving to the uninitiated, among our audit subjects we must be seen as colleagues who audit for their benefit.


I have seen two basic causes for auditors to take less-than-modest stances. One is that they do not want to work without administrative or root privileges on the systems they audit; the other is that they are overly impressed with having technical proficiency that others lack.


Regarding the first cause: a common path to becoming a security auditor is a stint as a systems administrator. Systems administrators are used to having elevated privileges and experience a feeling of power (and, hopefully, responsibility) as a result. When they transition to the role of auditor, however, they must relinquish this power. They give up root and administrative privileges. Dropping those privileges is necessary because an auditor must not have the ability to administer what they audit; the conflict of interest is apparent and would make their audit findings unreliable. Some security professionals hold onto their privileges jealously, psychologically unprepared to relinquish power, as if doing so were a demotion of sorts. But beyond the appropriate segregation of duties, there are other good reasons for information security auditors to hold nothing more than normal user or guest privileges on the systems they audit:


  1. Audit results are very impressive when vulnerabilities and exploits are demonstrated with normal user access. For a knock-out example of this principle, watch this presentation by Marcus Murray on how to own a Microsoft network without needing an administrator account.

  2. It’s a great opportunity to teach systems administrators the principles of security. If you rely on administrators to conduct the tests that require administrator access, you’ve got a great opportunity to increase their security and risk awareness. When you sit with a database administrator and a well-developed audit plan, for instance, your conversation can go something like this: “We want to prevent user accounts from executing arbitrary code on your database server, so let’s see if xp_cmdshell is enabled. Here’s where the setting is, and here’s the SQL injection code we’ll use to test it. You keep a copy of the test script so you can run it whenever you set up a new database service.” Voila! An educated administrator with a tool set. (A sketch of what such a scripted check might look like follows this list.)
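
As an illustration of that kind of reusable test script, here is a minimal sketch in Java of a check on the xp_cmdshell setting. It inspects the configuration directly rather than running the injection test itself, it assumes a SQL Server target reachable through Microsoft’s JDBC driver, and the host, login and password are hypothetical placeholders:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class XpCmdshellCheck {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection details; note the ordinary, non-admin login.
        String url = "jdbc:sqlserver://dbhost;databaseName=master;"
                + "user=audit_user;password=changeme";
        // sys.configurations is readable without elevated privileges on
        // SQL Server 2005 and later; value_in_use is 1 when enabled.
        String query = "SELECT CAST(value_in_use AS int) "
                + "FROM sys.configurations WHERE name = 'xp_cmdshell'";
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(query)) {
            if (rs.next() && rs.getInt(1) == 1) {
                System.out.println("FINDING: xp_cmdshell is enabled.");
            } else {
                System.out.println("OK: xp_cmdshell is disabled.");
            }
        }
    }
}

Handing the administrator something like this to rerun whenever a new database service is set up is exactly the habit the conversation above is meant to build.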


Regarding the second cause of immodesty among information security auditors: some technical people just like to be elite. They like having some superiority over others. This is a bad tendency. The auditor is the conscience of the organization, not its bully. Auditors must see themselves as the people who educate, coach, remind and recommend. But auditors also need to be really good at listening. An auditor’s recommendations may make little sense, or be of little value, to some organizations. I was in a situation recently in which a third-party auditor insisted that their audit subject encrypt a database to meet a certain security standard. The database did not contain sensitive information, and encrypting that data set would have come at a high and unnecessary cost to the company. Rather than encrypt non-sensitive information, the firm more clearly defined its data classification standards and got the auditors to withdraw their requirement. Days of hand-wringing and wasted effort were spent because the auditors were sticking to their “superior” and “strict” standards. Had the auditors started by understanding their audit subject’s use of data, and considered the less glamorous, less elite solution of a better-defined data classification method, the problem would have been solved much more quickly.


Finally, your audit subjects will come to you more willingly when they spot trouble (in the form of an incident or a control that is not working as intended) if they think of you as a reasonable and approachable person. People tend to spend as little time as possible helping those who are cranky and superior.


Now, my recommendations are likely not universal. Can you think of exceptions to this general rule?


How Do You Audit Security Awareness?

May 26, 2009

OK, auditors! We’ve heard it said as many times as we’ve said it ourselves, so let’s all say it together now. One, two, three: “People are the weakest link in security.”


Right? Right. All agreed.


So now that we agree, how do you audit for security awareness?

I tend to think about security awareness in three distinct categories, each presenting its own control and auditing challenges: Enterprise Security Awareness, in which general principles of information security are communicated to a general audience; Administrative Security Awareness, in which controls must be aligned with known risks and administrators must demonstrate proficiency with those controls; and finally Information Handling Awareness, in which people know what they must or must not do with the specific information they have access to.


Enterprise Security Awareness seems easy enough to dispatch and audit. Microsoft, in fact, provides much of what an organization needs to deploy a general security awareness program, free of charge, with their Security Awareness Program Tool Kit. The tool kit provides guidance for creating a security awareness program, including template forms, skeleton presentations, posters and loads of material to present to a general or management audience. Topics such as password strength, shoulder surfing and physical security are covered in the presentations for a general audience, with legal and risk issues covered for the management audience. Following NIST’s SP 800-50, “Building an Information Technology Security Awareness and Training Program,” you can almost plug in Microsoft’s documents and templates and have Enterprise Security Awareness covered. It will still take some work to see the program through, but Microsoft has done much of the thinking for you. Take advantage of their offering.


Administrative Security Awareness can likely be audited through a standard technology security audit. Whether you are using the ISO 27000 series, BS 7799, SP 800-53, or CobiT as your security controls framework, each of your technical tests can be matched with an awareness test. For example: are domain account login failures reviewed? Yes. Are account managers aware of the unusual login-failure patterns that would initiate an incident handling procedure? That’s a good awareness question. Having one or more awareness tests for each control test can give you a good handle on this second security awareness category.
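
As a purely illustrative sketch (the control names and questions are invented examples, not drawn from any of the frameworks above), pairing the two kinds of tests can be as simple as keeping them side by side in your checklist:

import java.util.Map;

public class AwarenessChecklist {
    public static void main(String[] args) {
        // Each technical control test (key) is paired with an awareness
        // test (value). Both columns here are hypothetical examples.
        Map<String, String> checklist = Map.of(
            "Are domain account login failures reviewed?",
            "Can account managers describe the login-failure pattern "
                + "that should trigger the incident handling procedure?",
            "Are firewall rule changes logged?",
            "Do firewall administrators know who must approve a change?");
        checklist.forEach((control, awareness) -> {
            System.out.println("Control test:   " + control);
            System.out.println("Awareness test: " + awareness);
        });
    }
}

However you record the pairing, the point is that no technical test is complete without its awareness counterpart.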


It is the third category, Information Handling Awareness, that I most often think about. How do we get people to know what they must and must not do with the information they are given access to? And how do we audit their awareness of, and compliance with, those rules?


I approach Information Handling Awareness by recommending that specific instructions be provided to information handlers. I’ll explain a mechanism for doing this in a moment. First, I’ll tell you why I think the approach is important.


When I review the records within the DataLossDB (at http://www.datalossdb.org), I notice that of the 1,900 or so disclosures reported to date, about 600 (roughly a third) are apparently due to people mishandling personally identifying information: putting Social Security numbers in e-mails and on web pages, losing devices and print-outs, or disposing of data in unsafe ways. If we also count as “mishandling” the saving of clear-text confidential information to unencrypted devices that are later stolen, the share of disclosures attributable to mishandling could approach half of all reported cases (though I’m not asserting that it is half).


From my experience, only a few people within any organization handle the highest-risk information. And when they do, it is often for reasons that could not have been foreseen. Let’s consider an Enterprise Security Awareness program at a university. The awareness training will be presented to all staff, and will tell employees to use passwords of a certain type, to not let people tailgate into locked offices, to handle laptops and storage devices with due care, etc. There may even be generalized policies against exposing Social Security numbers, student IDs, credit card numbers, etc. The session is meant for all employees, from custodians to secretaries, admissions officers and professors. But will the training also include statements such as, “Don’t put the following information on web pages: Social Security numbers, credit card numbers, student IDs, checking account IDs, student visa numbers. Don’t e-mail the following . . . Don’t send via postal mail the following . . . When disposing of the following on paper, you must use a shredder that cross-cuts the documents in diamond shapes . . . When storing a CD that contains the following information, you must . . .”?


You get the picture. Most of this highly detailed, specific instruction is of no use to most of the audience, and the sheer volume causes even the relevant instructions to be forgotten. A distinction should be made between generalized security training and specific training for information handling.


To meet this challenge, I proposed a framework a year ago for designing security controls based on information handling instructions. The framework was first described in my paper “The Controlled Event Framework,” published to satisfy my GSNA Gold certification and available in the SANS Reading Room. The framework is basically a mechanism for categorizing information based on risk, then developing a set of instructions for the people who will handle the information. The instructions are broken down by task, such as “Copy,” “Send,” “Dispose” and “Save,” and written specifically for the media that contain the information. Going back to our university example: if HR deals with health claims and Social Security numbers of employees, and handles databases, spreadsheets and paper reports, HR staff can be given a set of instructions for copying, saving, printing or disposing of that information. Only they need to know those instructions, because only they will handle that information. The Admissions department will use different processes, information and media, so its handling instructions would be tailored to the information its staff will be exposed to.
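
To make the mechanism concrete, here is a hypothetical sketch of the kind of lookup the framework describes: handling instructions indexed first by media, then by task. The media, tasks and instruction text are invented for illustration; the paper defines the actual structure.

import java.util.Map;

public class HandlingInstructions {
    // Instructions for one information category (HR confidential data),
    // keyed by media and then by task. All entries are invented examples.
    static final Map<String, Map<String, String>> HR_CONFIDENTIAL = Map.of(
        "spreadsheet", Map.of(
            "Save", "Save only to the encrypted HR share.",
            "Send", "Send only via the approved secure file transfer service."),
        "paper report", Map.of(
            "Copy", "Do not photocopy; request a fresh print from HR systems.",
            "Dispose", "Use the cross-cut shredder in the HR office."));

    public static void main(String[] args) {
        // A handler about to dispose of a paper report sees only the one
        // instruction that applies to that task and that media.
        System.out.println(HR_CONFIDENTIAL.get("paper report").get("Dispose"));
    }
}

Because each department’s table covers only the media and tasks it actually encounters, the instructions stay short enough to be remembered and specific enough to be audited.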


According to the framework, technical controls are then set up to automate the instructions, or to prevent or detect violations of them. Finally, auditors can work with technology administrators and information handlers to determine whether the instructions are being followed. In a way, the Controlled Events Framework drives security by awareness. The thinking behind this approach has several parts:


  1. Information technology does not yet make information secure from cradle to grave – by design or implementation – so we rely on people’s conscience and awareness to protect information that is not secured for them.

  2. If people are overwhelmed by a large set of instructions, many of which do not apply to their work, they will forget them. But if people are given brief instructions about how to “Print” when they need to print, or “Dispose” when they need to dispose, their awareness materials can be closely tailored to their needs.

  3. When security auditors conduct interviews with information handlers about specific instructions, they will find awareness to be very auditable and measurable.

  4. And auditors will be able to address the specific controls that information handlers confess to working around because they find them too onerous.


This framework was implemented at an international consultancy and withstood audits by clients and federal agencies, who consistently showed strong enthusiasm for its direct and auditable way of instituting awareness where it mattered most.


Take a look at the Controlled Events Framework materials and feel free to comment, either here or at the site. If you think it has merit or room for improvement, I’d like to know.



Auditing People

May 24, 2009

In the 1953 movie “Houdini,” Bess Houdini cryptically explains how her husband was able to escape from a locked safe: safes are made to keep people out, she says, not to keep people in. That would not have been enough information to coax me to walk into a safe and shut the door behind me, of course; Houdini used other tricks that Bess was not revealing. According to J.C. Cannell, author of “The Secrets of Houdini,” the magician was able to escape from the safe because he was given access to it prior to his performance and simply rigged it to permit his escape.


That is something like a person who has access to information in an encrypted file. Sure, the file is “locked.” But a person who is given access to the opened file can copy and paste its contents onto a USB drive. Or e-mail it. Or print it. Or click “Save as . . .” Once they are given access to information, they can successfully make their escape.


One job of the auditor is to learn whether and how they do it.


It should be no surprise that people who are granted access to sensitive information are largely responsible for its exposure. Information security professionals often say that people are the weakest link in the security chain. But I’m convinced that much of the time, people are not careless with information for bad reasons. In fact, in my experience, people are often careless with information for good reasons. Take, for instance, a company that was trying to secure sensitive information on its network. Many of its staff used high-end desktop computers to conduct complex calculations on large data sets, while others used more expensive, encrypted laptops for lighter work.


It came to management’s attention that during off-hours, staff who were assigned desktops were connecting their home computers to the VPN to stay productive while at home. They were paid by the hour. Management implemented a policy, and enforced it technically, that prevented home PCs from using the VPN. This was wise because it prevented their staff from relocating sensitive information onto home computers that the company did not control.


However, a few weeks later, a little interviewing on my part revealed that the staff, who still had every incentive to remain productive off-hours, had started copying unencrypted information from their office workstations to pen drives and taking it home to work on. There were no reports of lost pen drives, but clearly incentives and controls were in conflict: staff were careless with sensitive information, and they had a good reason to be.


When I audit for information security, I check for technical controls, management controls, IT governance, and so on. But I also keep in mind that quote attributed to Bess Houdini: safes were made to keep people out, not in. My audit interviews, therefore, focus heavily on the people who are given access to information. When I interview them, I do not treat them as suspects, but as conscientious people who really do want to do the right thing, and who are put into a double bind.


Consider two studies conducted last year. “Data Breaches: Trends, Costs and Best Practices,” published by IT Governance, reported that 68% of information handlers surveyed had violated their companies’ security controls in order to get their jobs done. A survey conducted by RSA in the same year points the same way, with “over 50%” of staff admitting to violating security rules.


As information security professionals, we need to be aware that no matter how good our security controls and recommendations are, they necessarily interfere with people getting their jobs done. Just as a lock on your car door slows you down when you’re getting into your own car, so restrictions on VPN access, encrypted files, and the like slow down our work. And the people who are subject to those interfering controls have incentives to work around them, much as Houdini had an incentive to break out of the safe he was given access to.


In the case of the company I described, management kept the policy that prevented home computers from joining the VPN, but they also dealt with its unintended consequence: they started providing high-end, encrypted laptops to staff who needed to work out of the office and earn more money for the company. Because the careless use of USB pen drives to carry unencrypted data was an unintended consequence of the new policy, it was brought to management’s attention only through careful interviews with the people who were given access to that data.


Here are some tips for learning the data-handling secrets of the Houdinis you encounter:


  1. Interview as many people who handle information as you can. If one person knows the secret to data escape, then several do. One person may be able to keep a secret, but not all of them.

  2. Earn a reputation as an auditor who makes your interview subjects’ lives better. If people want to do the right thing while also acting on their incentives, they will welcome sympathetic auditors as a sounding board for their complaints.

  3. Do not champion a security control just because it blocks a known risk. Think about the additional risks it may create if people feel compelled to work around it. A professional or personal incentive will usually trump a security incentive.


What are some of the methods you use to audit people?


Twitter!

May 16, 2009

AuditExperts is now on Twitter!  If you’d like to keep up with what’s happening without having to check back at the blog, you can follow us there!


WebScarab Search Plugin & Auditing

May 13, 2009

If you’ve been following our series on Web Application Security Auditing, then you know that we’re fans of WebScarab from OWASP.  I’m also a pretty big fan of the Burp Suite, but today I want to tell you about a WebScarab feature that you may not have used and that can make your life easier.

When you’re performing testing, especially fuzzing, you may want to answer a few questions without having to personally look at thousands of pages that have been collected by the fuzzer.  For instance, perhaps I’d like to automatically identify any queries that resulted in SQL errors or perhaps I’d like to find any and all conversations that include HTML comments.  It turns out that this is pretty easy.

If you have a look at the far right-hand side, you’ll find a tab labeled “Search.”  The Search tab allows you to add custom searches easily, but the interface may not be what you expect.  Rather than entering keywords to look for, you actually have to add a line of Java-style code.  Have no fear!  While these expressions can get complicated, there’s a super simple recipe that can be used to do quick searches easily!  The recipe is:

new String(response.getContent()).indexOf("SQL") > -1

That’s it!  Let me explain what this does.  It takes the content returned in the response and turns it into a string, then searches that string for the keyword “SQL” and returns the offset where the keyword first appears.  If that offset is zero or higher, “SQL” appears in the response; if the result is -1, it does not.  You’re done!
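
The same recipe answers the other question posed above.  To flag every conversation whose response contains an HTML comment, swap in the comment opener as the keyword:

new String(response.getContent()).indexOf("<!--") > -1

Any keyword works the same way, so it’s worth keeping a small library of these expressions for the checks you run most often.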

With a search like this in place, WebScarab will automatically filter all of the conversations/requests that have occurred and display only those that match the expression.  Voila!


PCI/DSS Self Assessment Tools Update!

May 8, 2009

Some time ago, Cyber-Defense, the free software arm of Enclave Forensics, released a set of free scripts that automate self-assessment of the technical controls in PCI/DSS.  Specifically, the toolkit allows you to verify the security configuration of the firewall and the SSL configuration of web servers, analyze vulnerability reports, and validate that the information flow requirements specified in sections two and three of the PCI/DSS have been properly implemented.

Cyber-Defense and Enclave Forensics today announced an updated version of the toolkit.  The new version addresses questions that have been asked about identifying more precisely the types of information that are permitted in and out through the organization’s firewall: the analysis tool now allows the auditor, QSA, or someone performing a self-assessment to specify exactly which ports and protocols are permitted, both inbound and outbound.  The resulting report remains in an easy-to-read HTML format.
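
To picture what such a specification amounts to, think of it as a small table of permitted flows that observed traffic is checked against.  The sketch below is invented for illustration and is not the toolkit’s actual input format or code:

import java.util.Set;

public class PermittedFlows {
    // A declared flow: direction, protocol and port. Invented example types.
    record Flow(String direction, String protocol, int port) {}

    public static void main(String[] args) {
        // The assessor declares exactly what is permitted, in and out.
        Set<Flow> permitted = Set.of(
            new Flow("inbound", "tcp", 443),  // HTTPS to the web tier
            new Flow("outbound", "tcp", 25)); // SMTP from the mail relay

        // Anything observed that is not in the declared list is a finding.
        Flow observed = new Flow("inbound", "tcp", 23); // telnet
        System.out.println(permitted.contains(observed)
                ? "permitted" : "FINDING: flow not in permitted list");
    }
}

The toolkit’s job is then to test what the firewall actually allows against the declared list and report the differences.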

For this toolkit to function, there are a few software requirements that are easy to meet.  The toolkit expects to find NMap, Nemesis and TCPDump installed for the firewall validations.  To perform the SSL cipher analysis, OpenSSL must be installed.  And since the scripts are Perl based, Perl is also required.

For more information and to obtain a copy of the toolkit for free, please see here!

Also available through SANS is a complete course on how to implement and manage PCI/DSS compliance (including how to perform technical self-assessments) using this toolkit, with a 200+ page coursebook.  In fact, there’s a conference planned for this June: the hands-on PCI/DSS 1.2 compliance course is being offered on June 21 and 22.  There is also a hands-on Advanced IT Audit course running from June 15 through June 20, covering all you need to know about firewalls, routers, web application security, Windows auditing and UNIX auditing!