Ten claims that scare security pros
Telltale phrases that signal security trouble could be boiling over
By Jon Espenschied, Computerworld
August 14, 2007
A child with a chocolate-smeared shirt says, "I didn't do it." The phone rings, and Mom assures you, "There's nothing to worry about." A systems administrator carrying a box of tapes says, "We'll have everything back up in a few minutes." Sometimes the first words you hear -- despite their distance from the truth -- tell you everything you need to know.
That's so with information security as well. Some words sound reassuring at first glance, but I've found they often point to problems safeguarding internal information assets and technical resources, or with the people and processes that protect them. Here are a few of the telltale phrases signaling that security trouble could be boiling over.
"We have a culture of security."
No, you don't.
I hear this most often from enterprises that started as a five-person mom-and-pop shop, went corporate as they grew, then blinked and found themselves operating with a thousand people and no governance or policies. Three dollars and their "culture of security" will get you a fancy cup of coffee in a quiet cafe, where you can contemplate how much work there is to do.
The simple fact is that without supporting directives or a mechanism for feedback, security is defined differently by each person and verified by no one. There is no metric for compliance with a "culture," and a "culture of security" is overridden by a culture of "get the job done" every time.
If there are rules, write them down. If technology is put in place to implement or monitor the rules, write that down too. If people break the rules, follow up. If the rules prevent legitimate business from getting done, change them. It's that simple.
"IT security is information security here."
Information security is not the same thing as security in information technology. If the term "information security" is used interchangeably with "IT security," it invariably means that no one has made fundamental nontechnical security decisions and that affected departments -- IT, human resources, legal, audit and perhaps others in your organization -- are guessing what the others mean.
Get together with those who have influence in the departments above, and decide whether information (not paper documents or equipment) is an asset of the company, just like computers and paper clips. Decide whether the company authorizes people for jobs, physical access and information as individuals. Make these policy decisions as a group, and have them signed by those with authority. Then perhaps there will be more time in the day for deciding how to manage security instead of guessing at what to manage.
"That doesn't apply to the boss."
Though it's becoming less of a problem in public companies thanks to the Sarbanes-Oxley Act, occasionally an executive simply refuses to follow security or privacy directives he approved. Unless you're prepared to meticulously document misdeeds in a forensically sound manner and then take them to the board of directors or the police (or quit), you'll just have to work around it.
Most of these bad apples can be managed by appealing to their Machiavellian sense of influencing others' behavior: that they at least ought to appear to be leading by example, while continuing to do whatever it is they do with the door closed. Few would admit it, but I've run across many IT organizations that simply budget a DSL line for "guest" access in the executive's office, turn a blind eye to whatever gets plugged in and chalk up support time to the test lab. It's not a desirable solution, but if the executive's still willing to sign a Sarbanes-Oxley attestation, the rest comes down to plausible deniability.
"Our information security officer is on the IT staff."
Titles don't matter. A security expert reporting to the director of IT is a security administrator (specialized systems administrator), even if that person holds the title of information security officer.
The problem is that the word officer usually connotes authority to verify and monitor whether all technical and process controls protecting information are effective. An IT security administrator is usually involved in designing technical controls, and therefore can't self-audit or certify that IT is doing the right thing, particularly if he reports within IT. Outside of a military context, a company security officer ought to report as a peer or senior to the IT director.
"We have a password policy."
Strictly speaking, a document that specifies password length and complexity is a technical standard or procedure, not a policy. A policy is a container for a business directive, such as "individuals must be identified uniquely and authenticated prior to being granted access to company assets." Note that this example of policy concerns what to do regarding people and access, not how to construct a sequence of typeable characters.
Despite the efforts of certain software vendors to confuse the language, technical controls such as "group policy objects" are not policies. However, I'm not such a stickler that I'll dig into a fight over terminology without beer. What's important during a business-hours discussion is that the password standard (or "policy") is backed up by a real policy so that IT spends its time implementing controls instead of trying to defend the reasoning or demand compliance without authority. Without a supporting reason and directive, expect to spend a lot of time repeating the same instructions to no avail.
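To make the distinction concrete, here is a minimal sketch (in Python, with invented names and thresholds) of what the technical standard looks like once it's separated from the directive it implements. The policy supplies the "why"; the standard supplies the checkable parameters.

```python
# A minimal sketch of the policy/standard distinction described above.
# The directive is the policy; the constants and function below are one
# hypothetical technical standard that implements it. Names and thresholds
# are illustrative only.

import re

# Policy (business directive, signed by someone with authority):
#   "Individuals must be identified uniquely and authenticated prior to
#    being granted access to company assets."

# Standard (the kind of detail a "password policy" document usually holds):
MIN_LENGTH = 12
REQUIRED_CLASSES = [r"[a-z]", r"[A-Z]", r"[0-9]", r"[^a-zA-Z0-9]"]

def meets_password_standard(candidate: str) -> bool:
    """Return True if the candidate satisfies the technical standard."""
    if len(candidate) < MIN_LENGTH:
        return False
    return all(re.search(pattern, candidate) for pattern in REQUIRED_CLASSES)

if __name__ == "__main__":
    print(meets_password_standard("Correct Horse Battery Staple 9!"))  # True
    print(meets_password_standard("secret"))                           # False
```

If the standard needs to change later -- say, to favor long passphrases over complexity rules -- only the parameters move; the directive above it stays put.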
"Our managers have copies of all passwords."
Though the idea makes freshman CISSPs faint, there are indeed managers who demand that their direct reports disclose their individual passwords. The explanation for this aggressively dumb demand is always "What if someone quits or is sick? How would we get their documents?"
I last encountered one of these organizational coelacanths in a small state agency full of lawyers who ought to know better. However, when I confronted the offender with the idea that multiuser permissions would allow him access, I might as well have been shouting "Evolve!" The only effective strategy I've found is to say "If you do that, then you're a suspect in anything bad they do. You'll never be able to fire them because you'll be a suspect too." Encouraging early retirement is worth a shot as well.
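For what it's worth, the fix the offender couldn't hear is mundane: give the manager's group read access to the shared document area and leave individual credentials alone. A rough sketch, assuming a POSIX file server and hypothetical path and group names:

```python
# A minimal sketch of the alternative to collecting passwords: grant the
# manager's group read access to the shared documents instead. The path and
# group name are hypothetical; run this as a suitably privileged account.

import os
import grp
import stat

SHARED_DOCS = "/srv/projects/contracts"   # hypothetical shared document area
MANAGER_GROUP = "legal-managers"          # hypothetical group for the managers

gid = grp.getgrnam(MANAGER_GROUP).gr_gid

# Collect the tree, including the top-level directory itself.
targets = [SHARED_DOCS]
for dirpath, dirnames, filenames in os.walk(SHARED_DOCS):
    targets.extend(os.path.join(dirpath, name) for name in dirnames + filenames)

for path in targets:
    os.chown(path, -1, gid)                # keep the owner, change the group
    mode = os.stat(path).st_mode
    extra = stat.S_IRGRP | (stat.S_IXGRP if os.path.isdir(path) else 0)
    os.chmod(path, mode | extra)           # group can read files, traverse dirs
```

When someone quits or is out sick, the manager already has access through group membership, and every action is still attributable to an individual account.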
"The Web app only runs if we ... "
It's possible to browse the Web securely using IE or to expose a homebrew client/server application on the public Internet, but finding an application designed for only a single client configuration usually means security principles were considered last in a hurried process -- if at all. Most well-designed Web applications are cross-platform simply by virtue of comprehensive testing.
A poorly designed Web application is a particular concern because Windows "security zone" settings for corporate browsers may be set lower than appropriate, just to make internal applications work. What happens when those users load up MySpace add-ons and browse their malware-laden Web mail? Treat these applications like the legacy cruft they are, and work with the quality assurance department to include basic security standards in the testing process.
"Brand X is our standard."
I have nothing against Dell Inc., but when the hardware guys at a good-sized organization say "Our standard is Dell" (or any other brand name), what they're really saying is, "We threw our technical standards out the window in exchange for a purchasing discount, and now we buy whatever the vendor offers." This is equivalent to my aunt shopping at a store with inflated prices and stuff she doesn't need, getting excited because "this one's 75 percent off!"
The caveat, of course, is that both my aunt and IT people in the real world have other decisions to make, and PCs are pretty much commodity items at this point. It's OK to choose a vendor's product and keep ordering it for a while, as it's unlikely they will substitute cleverly disguised waffle irons for corporate laptops in the space of a year.
Yet a vendor is not a technical standard, and there's trouble brewing if no one did the homework. When a vendor makes changes to software or a product line -- especially when it comes to network and security device vendors such as Cisco Systems Inc. -- it's important to have functional requirements in hand to see if it still works as desired. When customers don't know what they want, every bargain looks like what they need.
"Hey, where'd that come from?"
It's conceivable that those highly technical users were supposed to supply their own equipment and support themselves. Otherwise it means an organization's IT and help desk service level has just been beaten by consumer hardware vendors with 800 numbers that no one answers. Security policies in the organization might as well be in the bathroom stall, tucked behind the toilet paper: they might get read, but only for amusement, and nowhere near where the work gets done.
Resolving this is a problem of fundamental respect. Start over with an eye toward basic governance and establish that "we've got some rules around here." With a lot of effort and communication, this will at least turn into the "culture of security" problem above.
"We sent the firewall rules out to ... "
Most network administrators cringe at the above-mentioned password disclosure, yet many will freely e-mail copies of their firewall rules. Worse yet, they have a device vendor or freelance consultant configure the firewall and keep the only copy of the rules. These rules, if there's any complexity at all, provide a detailed map of your organization's security, significant information about the identity of internal networks and services, and how to target them.
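To see why, consider how little effort it takes to turn a leaked rule set into a target list. The rules and addresses below are invented for illustration, but any copy floating around in e-mail supports the same exercise:

```python
# A minimal sketch of why a leaked rule set matters: even a trivial parser
# recovers internal hosts and the services listening on them. The rules and
# addresses below are invented for illustration, not taken from any real
# firewall.

leaked_rules = [
    "permit tcp any -> 10.20.1.15 port 1433  # ERP database",
    "permit tcp any -> 10.20.1.22 port 3389  # finance terminal server",
    "permit tcp 192.0.2.0/24 -> 10.20.2.5 port 22  # vendor VPN to backup host",
]

def map_targets(rules):
    """Pull (internal host, port, comment) tuples out of a simple rule format."""
    targets = []
    for rule in rules:
        body, _, comment = rule.partition("#")
        parts = body.split()
        host = parts[parts.index("->") + 1]
        port = parts[parts.index("port") + 1]
        targets.append((host, port, comment.strip()))
    return targets

for host, port, note in map_targets(leaked_rules):
    print(f"{host}:{port}  ({note})")
```

A copy of the rules, plus the annotations most administrators helpfully include, is a ready-made reconnaissance report.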
No serious security professional wants to walk away with a copy of someone else's firewall rules without a specific requirement to retain them. A competent Certified Information Systems Auditor or other auditor will review firewall rules on the administrator's system, not his own. If you see a copy of your enterprise firewall rules pasted into an audit report, especially a public one, prepare for a re-IP project ... and call your lawyers.
Jon Espenschied has been at play in the security industry for enough years to become enthusiastic, blasé, cynical, jaded, content and enthusiastic again. He manages information governance reform for a refugee aid organization, and continues to have his advice ignored by CEOs, auditors and sysadmins alike.
Source: infoworld.com