Archive for February, 2009

8 Problems and 9 Solutions to College Information Security

This article originally appeared on the Security Catalyst Blog.

Colleges and universities store employment data, financial records, transcripts, credit histories, medical histories, contact information, social security numbers and other types of personal information. Although higher-education institutions should be forums where information and knowledge are easily exchanged, “sometimes the free flow of information is unintentional.” Here are eight policies and behaviors that put personal information at risk:

  1. Administrative Decentralization
  2. Naive Office Culture
  3. Unprotected “Old” Data
  4. Shadow Systems
  5. Unregulated Servers
  6. Unsophisticated Privacy Policies
  7. Improper Use of the SSN
  8. Unsanitized Hard Drives

Administrative Decentralization

In a university setting, each college, each department, and often each professor operates nearly autonomously. In an environment where knowledge must flow freely, decentralization is a must. However, it also means that new, centralized information-security policies are difficult to implement.

Naive Office Culture

A closely related risk factor is office culture. Accidental information leaks can occur even in the most secure IT environment, and office cultures resist changing any process, no matter how inefficient. In one example, I called my law school to discuss financial aid. After I identified myself by nothing more than my last name, the staff member read my social security number to me over the phone, unprompted.

Unprotected “Old” Data

Colleges do a pretty good job of guarding current personal information, but fail to protect older information, which is especially risky if the old data includes social security numbers.

Almost every week a faculty member backs up an old hard drive to his personal web space, unaware that the drive contains legacy student grades and social security numbers. Occasionally the professor is aware of the information but mistakenly believes that his university-provided web space is not available to the public. Often the data sit on the institutional server, undetected and forgotten, for up to five years, until the information turns up on Google.

Shadow Systems

“Shadow Systems” are copies of personal information from the core system which professors, colleges, departments, and even student organizations maintain independently. Shadow systems can be sophisticated databases under high security or simple Excel spreadsheets on personal laptops. They multiply at an alarming rate because faculty members with administrative access can create their own databases at any time.

Thus, even though a small army of information-technology professionals may guard a college’s core systems, the security perimeter extends much further. And despite strict policies governing information control, employee turnover makes training about privacy and security issues a continual struggle.

Unregulated Servers

Faculty members and third-party vendors also set up their own unregulated servers outside university firewalls, often for legitimate academic use. Those servers are particularly vulnerable to hackers and accidental online exposure. In one security audit, a private university uncovered 250 unauthorized servers connected to its public network, each containing sensitive student information.

Unsophisticated Privacy Policies

Colleges’ privacy policies often demonstrate a basic lack of understanding of the law and, more importantly, of how the institution carries out the law through internal processes. Many policies say nothing more than “We follow the law,” without explaining what the law is or how the institution follows it. Even worse, some simply say, in essence, “Trust us, we’ll be good.”

Many institutions’ privacy policies also erroneously mimic commercial policies, which are narrowly tailored to cover only information collected online. Those policies are deficient in a college setting because only a small fraction of the personal information that colleges maintain is collected online.

Further, a single institution may have dozens or hundreds of separate privacy policies, each dealing with a different, and incomplete, set of issues. At some highly decentralized institutions, for example, each college and department, and even some facilities like student unions, has its own privacy policy. While privacy policies should reflect the practices of each group, inconsistent policies can create confusion among the staff members who must explain or carry them out.

Improper Use of the SSN

Even though many colleges no longer use social security numbers to identify students, they once did, and those old records sit like land mines on old servers. In addition, some universities print the numbers on academic transcripts and official documents. Although the American Association of Collegiate Registrars and Admissions Officers recommends printing the social security number on transcripts, my January 2007 study indicates that, fortunately, most don’t.

Unsanitized Hard Drives

Deleted files remain almost unchanged on a hard drive until the data are overwritten or the drive is physically destroyed. Once unsanitized hard drives are resold, sensitive personal and corporate information can be easily retrieved. Though most universities have a sanitization protocol for retiring old hard drives, enforcing the policy can be challenging.

Solutions

College administrators should consider the following:

  • Regularly scan institutional networks for sensitive information, such as social security numbers, grades, and financial information. Use a combination of public search engines and internal text- and file-scanning software (see the sketch after this list).
  • Automatically retire “old” data on institutional servers but allow faculty members to un-retire old data they still use. Forgotten information is dangerous information.
  • Establish a “radioactive date”: the date your institution last used social security numbers as identifiers. Files last modified before this date should be presumed dangerous (also illustrated in the sketch below).
  • Create permissions-based access to core systems. Sensitive personal information should be available to faculty members and departments only on a need-to-know basis.
  • Establish a data-retention-and-access policy by balancing the threats, benefits, and risks of maintaining the data.
  • Coordinate interdepartmental privacy and security practices with a special committee of information security professionals.
  • Update your privacy policy to reflect all privacy issues arising in a university setting. Explain privacy rights and practices that protect offline employment information and sensitive student records. Also explain work-flow protections (for example, “only director-level employees have access to social security numbers”) and technical practices (for example, “employee data is stored on encrypted hard drives”). Privacy policies should deal with more than just cookies and Web forms.
  • Eliminate social security numbers from official records where possible, or establish a policy whereby students can opt to omit their numbers from transcripts or other records.
  • Physically destroy all old hard drives.
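
As an illustration of the first two recommendations and the “radioactive date,” here is a minimal sketch in Python, not part of the original article, that walks a directory tree, flags files containing SSN-formatted strings, and flags files last modified before a hypothetical cutoff date. The root path (/srv/faculty-webspace), the 2005 cutoff, and the regular expression are assumptions for illustration only; a production scanner would also need to handle binary file formats, false positives, and access controls.

    # ssn_scan.py: illustrative sketch only; the path, date, and pattern are hypothetical
    import os
    import re
    from datetime import datetime, timezone

    # Formatted SSNs (e.g., 123-45-6789); bare 9-digit runs produce too many false positives.
    SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

    # Hypothetical "radioactive date": the last time SSNs were used as identifiers.
    RADIOACTIVE_DATE = datetime(2005, 1, 1, tzinfo=timezone.utc)

    def scan(root):
        """Yield (path, reason) pairs for files that look risky."""
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    mtime = datetime.fromtimestamp(os.path.getmtime(path), tz=timezone.utc)
                    if mtime < RADIOACTIVE_DATE:
                        yield path, "last modified before the radioactive date"
                    with open(path, "r", errors="ignore") as handle:
                        if SSN_PATTERN.search(handle.read()):
                            yield path, "contains an SSN-formatted string"
                except OSError:
                    continue  # unreadable file; a real scanner would log this and move on

    if __name__ == "__main__":
        for path, reason in scan("/srv/faculty-webspace"):  # hypothetical mount point
            print(reason + ": " + path)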

Institutions of higher education must promote the free exchange of ideas while protecting sensitive personal information. Although the academic environment can seem at odds with information security, appropriate practices and procedures can balance information freedom and personal privacy.

Aaron Titus is the Privacy Director for the Liberty Coalition, and runs National ID Watch. A version of this article originally appeared in the October 24, 2008 edition of the Chronicle of Higher Education, and is republished here by arrangement.


Cost of Data Breaches Rises

Note: This post originally appeared on JeffreyNeu.com.

ZDNet reports that the cost of a data breach rose 2.5% from 2007, according to research published by the Ponemon Institute.

After comparing data from 43 companies (including several repeat offenders), the researchers found that companies lose just over $200 per compromised record. Significantly, lost business due to a loss of customer trust and brand diminishment comprises 69% of that cost.

Forget about the cost of postage… businesses stand to lose much more in sales from customers who read, “We regret to inform you…”
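
For a rough sense of scale, here is a back-of-the-envelope sketch, not from the Ponemon report, assuming a hypothetical breach of 100,000 records and taking “just over $200” per record as $202:

    # Back-of-the-envelope breach cost estimate; the record count is hypothetical.
    records = 100_000                # hypothetical number of compromised records
    cost_per_record = 202            # "just over $200" per record (Ponemon, 2008)
    lost_business_share = 0.69       # 69% of the cost attributed to lost business

    total_cost = records * cost_per_record              # 20,200,000
    lost_business = total_cost * lost_business_share    # 13,938,000

    print(f"Total estimated cost: ${total_cost:,}")
    print(f"Of which lost business: ${lost_business:,.0f}")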


What Do You Call a “Data Self”?

On page 2 of his book, The Digital Person, Professor Daniel Solove posits that each individual is comprised of “an electronic collage of bits of information, a digital person composed in the collective computer networks of the world.” In other words, a person may now be defined as just a few pieces of data.

A few months ago I argued that this electronic collage of information comprises a “Data Self.” It was my rather ungraceful attempt to articulate: “Hey! You know all that ‘stuff’ out there? That’s not ‘stuff’ out ‘there.’ That’s you.” Me? “Yeah, YOU.”

In an enlightening discussion, my very astute friend Greg Ceton and I identified other possibilities, each of which has its own set of problems.

  • Data Self: Nobody thinks of themselves as a “Self.”
  • Digital Identity: Although most people now understand “Identity Theft,” nobody thinks of themselves as an “Identity.”
  • Digital Self: Same problem as above, and the information doesn’t have to be digital.
  • Information Self: Even more abstract than “Data Self.”
  • Digital Clone: Better, because Clone connotes both “me” and “other” simultaneously. However, it’s pretty sci-fi.
  • Digital Me/You: “You/Me” is better than “Self,” but the concept is still too abstract to immediately grasp.
  • Digital/Data Double: Although slightly easier to grasp, the “Double,” as in “stunt double” distances a person from their Data Double. After all, the whole purpose of having a stunt double is so that you don’t get hurt.
  • Digital Twin: Same strengths and weaknesses as above. A “twin” is not “you.”
  • Digital Alter-Ego: A subtle improvement, but still suffers from the problem of detachment.
  • Your Digital Copy: Ditto.
  • Shadow Self: The strength of this phrase is that you can never get rid of your shadow. But the analogy breaks down because (1) you always know exactly where your shadow is, (2) your shadow can’t act on its own, and (3) your shadow can’t harm or be harmed.
  • Identity Hostage: I think this hits both points, naming a data alter-ego whose actions affect you. But as @caparsons put it, the term is loaded and implies that the only thing a digital identity is good for is stealing.

I’ll add more as I think of them. I’d appreciate your thoughts here, or on Twitter.


The Top 5 Reasons You Won’t Hear About a Breach

Note: This article originally appeared on the Security Catalyst Blog.

I have personally discovered more than a hundred data breaches by schools, companies, doctors’ offices, tax professionals, government agencies, and individuals over the past several years. Unfortunately, very few of the breaching entities proactively announce a typical breach, regardless of what the law requires. Here are the most common reasons:

  1. Failure to Detect
  2. Market Devaluation of Privacy
  3. Poor Communication
  4. Ignorance of Law
  5. Notification Difficulty

Failure to Detect

Many organizations do not have proper diagnostic processes to detect breaches when they occur, and many do not keep proper logs. Thus, when a press release reads, “we have no evidence that the sensitive information was accessed…” it may simply mean that the organization did not keep any records, and thus literally has “no evidence.”

Market Devaluation of Privacy

The market does not value privacy. Ensuring privacy is expensive, but the costs of violating privacy are small. Doing a simple cost/benefit analysis, organizations often come to the logical conclusion that the PR ‘costs’ of announcing a breach (especially when no hard proof of access exists) far outweigh any benefits.

In addition, most data breach notification laws only require an organization to say, “Oops.” If the organization is feeling nice, it’ll say, “Oops, sorry.” And if it’s feeling gregarious, it’ll say, “Oops, sorry, and here’s a free report of how much damage has been done to your credit. You’ll still be at risk for years to come, though, so stay vigilant. Good luck.” But the organization has no responsibility to help you recover from financial identity theft, medical identity theft, or criminal identity theft. Merely getting a credit report does not protect against any of these risks.

Poor Communication

A cruel irony of data breaches is that the only source of information about a breach is filtered, packaged, and presented by the organization with the most incentive to skew the details. The breaching entity’s concern is to minimize perceived liability; therefore, it is in its best interest to restrict the flow of information about the breach as much as possible.

I have read dozens of breach announcements, and they almost write themselves: “On X date, we discovered that some personal information was compromised. We acted immediately to make the information unavailable, and we have no evidence that anyone accessed it for inappropriate reasons. You should get a credit report as a precaution.” Keeping a victim in the dark about the details protects only the breaching entity.

Ignorance of Law

Even in states where breach notification laws exist, smaller organizations often assume that the law only applies in limited circumstances, to larger companies, or to particularly large breaches.

Notification Difficulty

For the most part, organizations that choose not to report breaches get away with it. But even under good circumstances, 100% victim notification is impossible. People move, phone numbers change, and addresses are incomplete or not on file. Letters that do arrive at the proper address may be ignored. Multiple contact strategies should be applied over long periods of time to reasonably ensure that most victims are notified.

I have suggested solutions to some of these problems here and through the creation of National ID Watch.

Aaron Titus is the Privacy Director for the Liberty Coalition, and runs National ID Watch.
