Forging ahead with technological advances in society without the input of the humanities leads to situations like the plight of the Uighurs in China or “obscene” military uses for technology, according to a groundbreaking scientist.
Geoffrey Hinton (pictured below), seen as one of the pioneers of modern artificial intelligence for his decades-old research on deep learning and neural networks, also told Times Higher Education’s World Academic Summit – held online in partnership with the University of Toronto on 1-3 September – that he was “very happy” if universities use big science grants to help fund the humanities.
The distinguished emeritus professor at Toronto – who was hired part time by Google in 2013 and is now a vice-president and engineering fellow at the tech giant – said that although technology “allows us to create lots of goodies”, other disciplines were vital for helping society determine how to use such advances.
“How those goodies get distributed and used depends on things that aren’t technology, that depends on social decisions about how we should divide things up and those are really important,” he said in an interview with THE editor John Gill.
A “technologically advanced society but without the humanities” then leads to problems, he said, adding that “modern China is a bit like that; you get things like the Uighurs in western China”, subjected to high-tech and intensive surveillance by the Chinese state.
Another potential issue was “if you have too much military funding”, with Professor Hinton pointing to an example of “self-healing” minefields that had been developed in the US where mines re-establish themselves after being detonated.
“This seemed just completely obscene to me, to talk about healing from the point of view of the mines – and that’s what you get if you don’t have enough humanities, I think,” he added.
He said he was “very happy with the idea that most universities use the big grants for science to help subsidise the humanities, I think that’s a good thing to do”, while criticising governments that sought to tie non-science funding to economic impact.
Professor Hinton was also highly critical of politicians “deciding winners in science” by tying all funding to particular missions and said research systems needed a culture of allowing for failure.
On this latter point, he said the UK – where he completed his doctorate in the 1970s – had not always had the best approach, referring to governments in the past pushing for PhDs to be completed in a set amount of time.
“A lot of the best PhDs take five or more years and involve graduate students trying a bunch of things and failing and then trying something else,” he said.
“That is how it has to be when you’re doing an apprenticeship, and Britain doesn’t seem to allow for that well enough,” he continued, adding that the “net result” of trying to “regiment” a doctorate is that “you can’t afford to do original research for a PhD”.
Meanwhile, Professor Hinton said he was not too concerned that industry would starve universities of AI talent.
This was because there was now a steady pipeline of researchers coming through in the field, and among them there would be scientists for whom ideas mattered more than earning a high salary.
Universities would also continue to be the best places for developing “radical new ideas” that might rethink the foundations of AI, even if firms were better equipped for large computational experiments, and they would remain “essential” for handling ethical questions.
“I think issues like fairness are probably best studied in universities where researchers don’t have any links to a company, so they can be critical of a company without creating problems,” he said.
Print headline: ‘Tech without humanities leads to problems’