Ever get the feeling you’re being watched? That may be because you are. It’s clear that in the online age great swathes of our lives are monitored, crunched through algorithms and assigned a value by the tech companies that rule the digital world. Police, spooks and criminals all seem to be piling in, poring and pawing over our digital snail-trail to extract value from our data.
So it’s a relief for those working in academia to know that theirs is a simpler life, in which the privacy of the individual is sacrosanct. Right?
OK, it’s a bit of a stretch to apply the Edward Snowden view of the world to the monitoring of academics. But it’s also clear that universities monitor and measure their staff and output to a degree that significantly affects the working culture.
You may say “quite right, too” – universities are in the evidence business, so they should prod, poke and measure outputs to inform their every decision. At a time of fierce competition for resources, that’s what is required to make the case for investment.
But there are risks associated with this “audit culture”, and one is that worshipping at the altar of efficiency and accountability eats away at the self-regulating model of collegiality, which has always been the underlying principle of “academic professionalism”.
In our cover feature this week, we consider the extent to which monitoring is now a fundamental part of academic life.
There are convincing arguments for gathering data on what universities, and their staff, are up to.
Take, for example, the constant threat of funding cuts; at present the argument (and I paraphrase, but not by much) goes like this: universities to minister – “If the £9,000 fee cap isn’t linked to inflation, we’re not going to be able to deliver the quality of teaching we do today”; minister to universities – “That’s all very well, but £9,000 feels about right to me.”
The way to counter the “feels about right” argument is with evidence, so in this context having good data is a vital weapon in the armoury.
Taken too far, though, data gathering becomes a bureaucratic burden for some and, for others, an insidious example of Big Brotheritis. There’s also a fairly obvious danger of disaffecting students if we keep spreadsheets suggesting that some are getting “more for their money” than others (this is particularly true of a focus on contact hours, which are and will remain a key metric for prospective students and their parents, whether universities like it or not).
There is also a great deal of tension over the use of research metrics to determine promotions and redundancies, and Sisyphus would have blanched at the research excellence framework.
There’s no point in sticking our heads in the sand and ignoring the part that data play in academia and the way our universities are run, but it is crucial to make the argument for them to be handled in a way that adds value rather than wastes time, distorts or disengages. With this in mind, it’s worth reviving a point made by Vernon Bogdanor, professor of government at the University of Oxford, in an article for Times Higher Education some years ago. “Auditing”, he said, “is not a neutral process: it imposes its own values on the activities which it regulates.” Let’s not forget it.