By Kent Wada
A slew of pending legislation and industry regulations will force changes to the security policies and values on your campus. Will your security bubble burst?
The Internet’s ubiquity has blurred the lines between cyberspace and the physical world—the nation’s power grid, water supplies, and other critical infrastructure—raising cybersecurity risks to unprecedented heights. As a result, universities and colleges must now deal with infrastructure security as well as the traditional defense against hackers breaking into their systems and gaining unauthorized access to protected data.
In some quarters, however, we in the university community are considered as much a part of the problem as part of the solution. Consider the first large-scale distributed denial-of-service attack, launched in February 2000 against prominent Internet sites such as eBay.com, Amazon.com, and Dell.com: Many university computers were used in the attack. At the time, there was speculation about the possibility of negligence lawsuits against institutions that had not properly secured their computers and thus made the attack possible. After all, a 16-year-old perpetrated the attack using well-known computer vulnerabilities. With our largest institutions having tens of thousands of unevenly managed computers and very big pipes to the Internet, our systems are often used not as targets, but as launch points for attacks against others.
Providing appropriate levels of security in the higher education environment is no easy task. Our institutions are often decentralized environments, akin to small cities, with hundreds of departments and diffuse lines of authority. Cybersecurity is complex to begin with and becoming more so every day. IT staffs are already overloaded and underfunded. And there is often a lack of understanding about the importance and complexity of IT security. How many people, for example, faced with a barrage of passwords to remember, write them down on a sticky note and leave it in a desk drawer?
More importantly, our core values—including academic freedom, freedom of speech, and respect for individual privacy—encourage the open exchange of information and ideas, not exactly a security-minded moral framework. Balancing these imperatives is at the heart of the debate we must now engage in and act upon.
Where to Begin
This challenge is becoming sharper and more vexing by the day as pressures mount from outside the campus. For instance, a slew of legislation now pending will affect higher education’s security, which is inextricably intertwined with—and sometimes on a collision course with—our values. Organizations with whom we work, such as credit giant Visa U.S.A., impose IT security restrictions. And more may be coming, especially if we appear reluctant or unwilling to do our part—whether or not that is actually the case.
We need to think hard about how we balance these conflicting goals as well as how we articulate what we decide. To that end, an understanding of some of the requirements being imposed externally on colleges and universities in the IT security area is a good place to begin.
Digital Millennium Copyright Act
In the raging national debate about copyright, particularly as it applies to cyberspace, one aspect is continually linked in the media to universities and their students: the "piracy" of music, videos, and software on a large scale, thanks to newer generations of peer-to-peer (P2P) file-sharing software. Industry representatives from the Recording Industry Association of America (RIAA) and the Motion Picture Association of America are aggressively pushing campuses to do more to curb this file sharing. For example, in Australia, "recording companies have asked the Federal Court to allow their computer experts to scan computers at the University of Melbourne for sound files and e-mail accounts, so they may gather evidence of claimed widespread breaches of copyright," according to an article in the Sydney Morning Herald. And it is not just the entertainment industries that are focusing on higher education; there is now congressional concern. In a recent hearing on the topic of digital piracy, Rep. John Conyers (D-Mich.) warned that universities should take aggressive measures to police their own networks or Congress would do it for them.
Beyond the threat of intellectual property theft, universities and colleges have other reasons to be concerned about file sharing: P2P networks are considered to be a largely unexploited vector for spreading computer viruses. And unconstrained bandwidth use due to file sharing is straining technical and fiscal capacity. For these reasons, institutions are experimenting with a wide variety of techniques, tools, and policies to manage bandwidth and appropriate use.
Also, privacy continues to be a dominant concern. A recent court case between Verizon and the RIAA is testing whether the former must hand over subscriber information about a suspected copyright infringer. Will institutions one day be required to hand over the names of students, faculty, and staff for copyright infringement? Perhaps even thornier from a privacy perspective is a trial at one university to evaluate software that examines every file being transmitted across the network. The software purports to determine whether a file is copyrighted material being transmitted without permission. Such invasive monitoring clearly has the potential to chill academic freedom, a special concern given the increasing interest and academic research in the area of P2P networks.
USA PATRIOT Act
The USA PATRIOT Act—Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism Act of 2001—was intended in part to update wiretap and surveillance laws for the Internet era. Wide in scope, it amends more than 15 statutes, including the Family Educational Rights and Privacy Act (FERPA), the fundamental privacy protection for students. Universities and colleges should expect an increase in requests for records from law enforcement agencies, given the larger number of circumstances under which such requests can now be made.
Existing policies and procedures (especially those regarding the handling of subpoenas and search warrants) will need to be reviewed with legal counsel in light of the USA PATRIOT Act.
For example, if a law enforcement officer were to approach a systems administrator or front-desk staff member with a request for computer logs, would he or she know what to do? Would he or she, in good faith, give away records that should not have been disclosed? Who in the institution would be notified? What if it is outside normal business hours and the officer is demanding immediate access?
Working through these issues beforehand helps ensure both compliance with the law and minimization of unnecessary privacy breaches born of ignorance. A process for tracking such requests is also important, both to preserve institutional memory and to prevent abuse.
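One lightweight way to support such a tracking process is a structured record for each request. The sketch below is purely illustrative: the field names and example values are assumptions, not anything prescribed by the USA PATRIOT Act or any institutional policy.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class LawEnforcementRequest:
    """One entry in an institutional log of law enforcement record requests."""
    agency: str            # requesting agency
    instrument: str        # e.g., "subpoena", "search warrant", "court order"
    records_sought: str    # description of the records requested
    received_by: str       # staff member or office that took the request
    counsel_notified: bool # was legal counsel consulted before any disclosure?
    disclosed: bool        # were any records actually handed over?
    received_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

# A campus office might append each request to a running log for later review.
log: list[LawEnforcementRequest] = []
log.append(LawEnforcementRequest(
    agency="FBI field office",
    instrument="search warrant",
    records_sought="web server access logs for one host",
    received_by="front desk, central IT",
    counsel_notified=True,
    disclosed=True,
))
```

Even a record this simple answers the questions above after the fact: who asked, who responded, and whether counsel was in the loop.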
The USA PATRIOT Act also highlights records retention as an area for reconsideration. It is challenging enough to give guidance on how long e-mail should be retained, if for no other reason than individuals have differing needs. The balancing act is to keep relevant data only as long as it is legitimately needed, and no longer, lest it become a liability.
This is also true for electronic records of another sort: computer transaction logs. Web servers, e-mail servers, and other network devices all automatically note when services are used. Though the data kept is not necessarily immediately linked to individuals, often it can be. This can be helpful to systems administrators in tracing the origins of a computer security incident and to law enforcement during the course of an investigation. But privacy should also be considered in determining retention. Many libraries, for example, keep borrower information only for as long as a book is checked out, retaining only aggregate statistics for the longer term. Electronic records have characteristics that are different from paper records (for one thing, it is easy to keep massive amounts of the stuff), but policies should be viewed in the larger records management context rather than as a separate effort.
In addition to privacy issues, responding to subpoenas can be time-consuming and difficult, especially if backup tapes are involved. Determining how long such logs should be saved and under what circumstances is crucial.
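Once a retention window has been decided, it can be enforced mechanically. The sketch below prunes log files older than a cutoff; the 90-day window is an illustrative assumption, since the article's point is precisely that each institution must set its own.

```python
import os
import time

RETENTION_DAYS = 90  # illustrative only: each institution must choose its own window

def prune_old_logs(log_dir: str, retention_days: int = RETENTION_DAYS) -> list[str]:
    """Delete files in log_dir older than the retention window.

    Returns the names of the files removed, so the run can itself be logged.
    """
    cutoff = time.time() - retention_days * 86400
    removed = []
    for name in os.listdir(log_dir):
        path = os.path.join(log_dir, name)
        # Only plain files are considered; modification time stands in for age.
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            os.remove(path)
            removed.append(name)
    return removed
```

A scheduled job running this keeps data "only as long as it is legitimately needed, and no longer," though any real deployment would also need to account for backup tapes, which outlive the files they copy.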
Of course, this law has many ramifications beyond those considered here. Several organizations have published analyses of the USA PATRIOT Act, including the American Library Association and the Electronic Frontier Foundation. An excellent article by Tracy Mitrano, Cornell University’s IT Policy Advisor, is particularly recommended (see "References").
Health Insurance Portability and Accountability Act
There is growing awareness of the Health Insurance Portability and Accountability Act of 1996 (HIPAA), a complex piece of legislation that defines privacy and security standards for personally identifiable patient information. University hospitals will already be intimately familiar with HIPAA, but it is less obvious that it can also affect other campus offices if protected patient data is stored or transmitted there. For example, a personnel office may be affected if employees are able to file insurance claims through that office.
Implementation of HIPAA is occurring in phases. Two of the key phases are privacy standards, which went into effect April 14, 2003, and the newly finalized security standards, which are effective April 21, 2005.
HIPAA compliance is a complex and time-consuming effort, and it is still too early to know where, if anywhere, there may be friction with existing institutional privacy and security policies. Yet HIPAA provides for both institutional and individual criminal penalties for some types of noncompliance, so understanding how it may affect your institution is important. Fortunately, a number of good resources are available (EDUCAUSE; Goldsmith, 2001).
California Senate Bill 1386
In California, Senate Bill 1386 adds a new provision to the California Information Practices Act requiring disclosure of computer security breaches in order to better protect California residents from identity theft. SB1386 requires state agencies holding personal information to disclose any breach of a system to any California resident whose personal information was, or is reasonably believed to have been, acquired by an unauthorized person. With identity theft topping the list of consumer fraud complaints at the Federal Trade Commission, similar legislation may soon appear in other states.
Though the definition of "personal information" is narrow and the concept straightforward, implementing SB1386 is potentially nontrivial. For one thing, book-of-record data may be kept centrally, but individuals may routinely be allowed to download slices of confidential data to their desktop workstations for analysis. What if such data is downloaded to a Palm Pilot, which is subsequently lost?
Obviously, it would be desirable to proactively protect against such incidents to the extent possible; following good security practices will help significantly and so will the development and promulgation of security policy. But not all incidents are preventable. Thus, the formation of a Computer Emergency Response Team (CERT), with expertise not only from technical groups, but also media relations, legal counsel, and campus police, is essential. This team can quickly sort through the issues and coordinate whatever institutional actions may be necessary when a breach or other incident is identified. A clear understanding of roles, responsibilities, and procedures for a campus CERT can help to minimize institutional impact—be it loss of data, negative media exposure, or extended system downtime—in a crisis.
Visa U.S.A. Cardholder Information Security Program
The Visa U.S.A. Cardholder Information Security Program (CISP, see "References") defines a standard of due care and enforcement for protecting sensitive information associated with credit cards. Currently, it applies to eCommerce merchants accepting online Visa transactions, which includes some colleges and universities. Among other things, CISP specifies the "Digital Dozen," a list of 12 basic security requirements with which all Visa payment system constituents must comply (e.g., requiring a firewall to protect data, encryption of data sent across public networks, and use of regularly updated antivirus software).
eMerchants falling into the "high volume transaction" category (unlikely for institutions of higher education) must undergo an annual on-site review. But even without the mandated annual review, the Digital Dozen can be used as a checklist against which to compare an institution’s security policies. In some cases, specific technologies or techniques may reasonably be argued to provide equal or better protection than what Visa requires.
NASA IT Security Clause
A draft prior to the final version of the National Strategy to Secure Cyberspace asked the question: "Should consideration be given to tying State or Federal funding to [institutions of higher education] to compliance with certain cybersecurity benchmarks?" Effectively, this has already begun.
In July 2002, the IT Security Clause was published in the Federal Register as a Final Rule. It applies to any National Aeronautics and Space Administration (NASA) contract in which IT resources (e.g., data, information, applications, and systems) are integrated into and support NASA’s missions; it does not matter whether the contractor is a commercial entity or a university, and there is no minimum dollar threshold for applicability. The clause mandates that an IT security plan be submitted to NASA, along with a project bid, detailing how IT security requirements are to be met.
Though the clause does not currently apply to grants, "… guidance will be forthcoming that will require some IT Security to apply to grants and cooperative agreements." It would not be surprising if other agencies began stipulating IT security as part of funding requirements. Such requirements will surely have implications for our institutions’ infrastructure, staff, and services. If a researcher were to ask what IT security infrastructure your institution provides to fulfill the IT security requirements of a grant, what would you answer?
Ideas for Action
The ways in which universities and colleges respond to these new external drivers are likely to be extensions to what we already do for IT security (see "Some Specific Recommendations"). We engage in dialogue with the campus community about difficult issues; we develop policies and procedures to smooth a crisis; we constantly evaluate tools and techniques to enhance security; and we raise awareness of these important issues in our communities. We worry about funding and about being overwhelmed.
It may be tempting to believe these external drivers are just more of the same. But that would be imprudent, because the world around us is changing, not least because of 9/11.
First, we must ensure that our decisions carefully weigh all arguments, balancing the conflicting needs and viewpoints of our campus communities. Determining how much monitoring is "appropriate," for example, is made even more challenging by ambiguity in, and national controversy over, untested new laws and shifting expectations. Each institution will likely come up with a different answer, as local cultural values will always frame the discussion.
Second, these decisions must be carefully articulated. In the past, a dependence on facile and over-broad arguments invoking principles such as freedom of speech may have been sufficient. Today that does a disservice to higher education, particularly when such arguments are picked up and disseminated out of context by the media. Whether we are talking about university computers being exploited to attack others or about "rampant digital piracy" due to our "negligence," touchy issues full of emotional overtones are tough to present analytically to begin with. Thoughtful crafting of our arguments, and working with media relations staff, will help minimize the perception that we are insensitive to these national security concerns.
Third, though our institutions span a diversity of sizes, geographies, and many other factors, we need to work with others to avoid reinventing the wheel and to ensure that our collective voice is heard in the right places. Our concerns, our decisions, and our many initiatives must be understood.
We are already sophisticated at collaboration. Let us use it to our advantage, lest others take control on our behalf.