Peter Sommer says that the biggest security problem for academic computers is neither hackers nor viruses but how to devolve control and responsibility.
The word Internet has many different resonances: at one level, it is a series of links between a large number of computers and a protocol - TCP/IP - describing how they communicate; at another, it is the totality of all the resources available on all those linked computers.
To some it is a series of applications - universal email, the World-Wide Web, the Usenet news groups, the mail-lists, file transfer protocol (ftp) and gopher for publishing and retrieving files, telnet for remote access to computer services - or perhaps the virtual institutions and new media which these applications create.
Then again, it is the way in which users of all these facilities interact, are regulated and regulate themselves - a vast social science lab for anthropologists, jurisprudence scholars, information systems students and business schools.
To others, it is a highly significant global culture, a pointer to a transformed society of individuals and entities beyond the need for nation states; and its pessimistic obverse, a future world of the information rich and information poor.
The many Internet security conferences thus start out with a confusion of purpose: trying to secure the whole network makes as much sense as seeking to secure London's North Circular Road and main arterial routes.
For computer users and owners, the only achievable aim is to try to protect the resources, processes and transactions for which one is directly responsible. In the context of a higher education institution, that is not easy. Traditional commercial computer security, as it was taught a decade ago, worked on the mediaeval castle model - concentric rings of protection afforded either by physical barriers or by logical guards, of which password checking was the most obvious.
In those days, when computing power was only found in a central processor - a mainframe or mini - and when users had simple dumb terminals, that approach worked. A system manager could prescribe and then dictate computer usage.
But the Internet is not like that - and neither are the computer systems you will find in most larger commercial organisations. Computing power, and data files, are on every desk; individual computers are linked together in complex ways and exchange information and resources with one another all the time, often invisibly to the casual user.
The big problem for computer security professionals these days is not simply the hackers or viruses of the headlines, but how to advise on the management of inherently complex situations.
This takes computer security - information security as we prefer to call it - away from its technological preoccupations with definitive encryption algorithms, secure operating systems, and person-recognition devices - towards management issues of identifying purposes and risks and assigning responsibilities.
The task is difficult enough in a commercial business, particularly if one of the key effects of the computer systems has been to alter and flatten the organisational structure.
But in a higher education institution there will be large numbers of autonomous departmental empires, each with its own urgent and imperative requirements.
Governing bodies usually do not arbitrate on matters of detail; the managers of the IT facilities tend to lack clout in any discussions. So, typically, there will be a handful of networked central computers, a link to JANET (and thence to the Internet), and large numbers of individual workstations and PCs hanging off the local network.
Among the academics and students there will be pockets of considerable skill and knowledge about computers. Whereas in the commercial world the range of applications can be limited by a company's management, academic researchers are expected to push the envelope in all fields, including the exploitation of computers.
Machines are added to the network without much reference to local system managers; useful bits of software are acquired (or written) and set to work.
Again, unlike the world of commercial IT, testing and consideration of the implications for other users is rare.
The typical Internet-connected computer runs large numbers of applications simultaneously. The UNIX operating system makes this easy to set up. The obvious applications will be a web browser (Netscape or Mosaic), email and telnet. Many departments will also want to operate as a server, using ftp, gopher or the web.
Even if such facilities are not to be offered to outsiders, they provide an excellent way of distributing reading and lecture lists - and it is easy and fun to add to these.
The chances are that in addition to these obvious applications a number of others will actually be running and accessible to outsiders, even if the computer owners do not realise it. These could include ping, which enables anyone on the Internet to see whether your computer is present (and how long a packet takes to travel from you to them), and finger, which enables others to see whatever announcement you have made about yourself - and which responds even if you have never created any such file.
These provide some of the hackers' opportunities. Most forms of UNIX-based Internet client and server arrangements are originally set up as open; it is up to the user to close them off.
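By way of illustration only - a minimal sketch in Python, assuming a Unix host you are entitled to probe, with a purely illustrative list of well-known ports - an owner can check which of these services their own machine is quietly offering to the world:

# Sketch: probe your own machine for well-known services that may be
# listening without your knowledge. Host and port list are illustrative.
import socket

SERVICES = {
    21: "ftp",
    23: "telnet",
    25: "smtp (mail)",
    70: "gopher",
    79: "finger",
    80: "http (web server)",
}

def open_services(host="localhost", timeout=1.0):
    """Return the well-known services that accept a TCP connection."""
    found = []
    for port, name in SERVICES.items():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the port answered
                found.append((port, name))
    return found

if __name__ == "__main__":
    for port, name in open_services():
        print(f"port {port:>3} is open: {name}")

Anything found listening that you never intended to offer is a candidate for being switched off or restricted.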
It is the twin features of permanent connection and the possibility of multiple programs running at once that make Internet security diverge so much from traditional mainframe security. Sloppiness of installation can mean that every Internet user has access to every last byte on your computer.
For each of the many applications that exist on a typical Internet-connected machine, the security and access requirements will be distinct: you want as many as possible to see your web pages but only you should be able to alter them; you may wish only designated classes of individual to be able to ftp some files, while others are available to anyone.
You want to exchange messages with the world, but you do not want your own mail read in transit and you certainly do not want people forging your name on their messages. You may be prepared to let a designated outsider call in to use your facilities via telnet, but no one else.
Somehow that diversity of requirement has to be controlled and managed.
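As a small sketch of what that control looks like at the level of an individual machine - assuming a Unix-style file system, with hypothetical paths - the owner might make the web pages world-readable but writable only by themselves, and a directory of restricted ftp material visible only to a designated group:

# Sketch: Unix file permissions for the two cases described above.
# The paths are hypothetical; 644 and 750 are the resulting octal modes.
import os
import stat

WEB_PAGE = "public_html/reading-list.html"   # anyone may read, only you may alter
RESTRICTED_DIR = "ftp/restricted"            # only a designated group may enter

# Owner may read and write; group and world may only read (mode 644).
os.chmod(WEB_PAGE,
         stat.S_IRUSR | stat.S_IWUSR | stat.S_IRGRP | stat.S_IROTH)

# Owner has full access, the designated group may read and enter,
# and the rest of the world gets nothing at all (mode 750).
os.chmod(RESTRICTED_DIR,
         stat.S_IRWXU | stat.S_IRGRP | stat.S_IXGRP)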
Do not expect some miracle technology-based solution. The oft-heralded firewall is a metaphor for what needs to be done at a technical level, not an off-the-shelf product you can buy.
On a conventional computer, a password request screen is merely the visible manifestation of what ought to be going on underneath: an operating system that keeps individual users, and the tasks they wish to run, separate from each other; facilities that keep the password file encrypted, require passwords to be of a certain length, reject obviously guessable names and make users change them every month or so; and, most important of all, careful decisions about who should be allowed to do what inside the computer. How you achieve this is a matter of detail in managerial decisions and technical implementation.
Similar considerations apply, but with many more problems, to Internet firewalls. The security product that reliably guesses how you really want to control access to your computer - and then carries out your unarticulated wishes unfailingly - will never exist.
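To make the conventional-computer part of that concrete, here is a minimal sketch of a password-acceptance check along the lines just described; the minimum length and the list of forbidden words are invented for illustration, not a recommended policy:

# Sketch: enforce a minimum length and reject obviously guessable choices.
MIN_LENGTH = 8
GUESSABLE = {"password", "secret", "letmein", "qwerty"}

def acceptable(password: str, username: str) -> bool:
    """Return True if the proposed password passes the basic rules."""
    lowered = password.lower()
    if len(password) < MIN_LENGTH:
        return False                   # too short
    if lowered == username.lower():
        return False                   # same as the login name
    if lowered in GUESSABLE:
        return False                   # an obvious guess
    return True

# Example: acceptable("letmein", "asmith") is False; a long, unguessable
# phrase passes. Forcing a change every month or so is a separate matter
# of system administration, not of this check.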
Where, then, are the security solutions? The route surely has to be to devolve responsibility away from the centre and towards departments, individual staff members and students. That means that central IT facilities units in higher education will offer networks and internal email (which they will monitor for abnormal activity), perhaps a local Usenet news server and archive - and advice.
But no more. Departments and individuals will tend to run their own computers (no longer for most purposes a particularly difficult task) and accept responsibility for what happens on them. They will be accountable for what they put up on their computers - or, through carelessness, let others put up.
Individuals can protect their messages - and authenticate them using the encryption and digital signature facilities of programs like Pretty Good Privacy.
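A sketch of what that might look like today - assuming the python-gnupg wrapper around a locally installed GnuPG, a descendant of PGP, and with invented key and recipient details:

# Sketch: sign a message so the recipient can verify it came from you,
# then encrypt it so nobody can read it in transit. Key ID, passphrase
# and recipient address are placeholders.
import gnupg

gpg = gnupg.GPG()          # uses the default keyring in your home directory
message = "Draft reading list for next term attached."

signed = gpg.sign(message, keyid="YOUR-KEY-ID", passphrase="your passphrase")
encrypted = gpg.encrypt(str(signed), "colleague@example.ac.uk")

if encrypted.ok:
    print(str(encrypted))  # ASCII-armoured ciphertext, safe to send by email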
The prices of PCs and workstations have dropped to the point at which an individual box can run just one important Internet application - and the machine can be set up for the precise security requirements appropriate to it.
Technically and managerially this is far easier than using a large machine to host many different programs all with different risk profiles.
Such an approach also has easy enforcement mechanisms: first, direct inconvenience to those most immediately involved if they have been careless - their computers become unusable.
Second, authorities in higher education can employ their standard codes of conduct, which enjoin individuals not to bring the institution into disrepute - either by deliberate action or by negligence.
Finally, there is the law: typically computer misuse and defamation. Most of the classic worldwide hacks you have ever read about relied on extensive abuse of Internet facilities, and such exploits will recur. University and college computers are a little like libraries - their prime function is to be open, and they must accept occasional mistreatment.
The choice is between imposing heavy restrictions on academic usage of computers and making individuals properly responsible for the machines they use.
Peter Sommer is at the Computer Security Research Centre, London School of Economics, and the author of The Hackers Handbook, DataTheft and The Industrial Espionage Handbook. Contact him at P.M.Sommer@lse.ac.uk.