Legislation aimed at eradicating contract cheating has not stopped a proliferation of new low-cost artificial intelligence essay-writing services, and universities are being urged to work together to target the major platforms that continue to allow these companies to advertise.
Laws passed in recent years in England, Australia and elsewhere that ban businesses from completing assignments on behalf of students – and from advertising such services – are often so broadly worded that they could effectively criminalise the likes of ChatGPT and Google’s Bard, a new report from legal experts at UCL concludes.
It finds that, although prosecutions of these large language models are highly unlikely, the laws could also be used to target “AI-assisted” services that claim to be able to do things such as provide real, bona fide references for AI-generated text or rewrite content to evade AI detectors.
The Digital Services Act, which came into force across the European Union in August and is also being applied in the UK, requires that online advertising archives be made available, explained Michael Veale, associate professor in digital rights and regulation, who co-wrote the report with research assistant Noëlle Gaumann. This has made it possible, for the first time, to understand the breadth of contract cheating services in operation.
“There is a huge array of AI essay mill adverts bombarding students on TikTok and Meta services, in a way only visible now that we can look through ad archives,” said Dr Veale.
“They are designed to target the methods universities currently use to spot AI content and are effectively tools whose purpose is to assist plagiarism. Advertising them is illegal in many parts of the world, and it is illegal for TikTok or Meta to carry such adverts.”
These services are often far cheaper than a traditional essay mill, which might charge hundreds of pounds to write an assignment on behalf of a student, Dr Veale said, potentially opening up their use to a larger cohort of students unless more is done to stop them advertising.
“A lot of the time these laws were seen as symbolic rather than practical or enforceable. Good luck getting these shady businesses into court if you find them,” said Dr Veale.
“Universities could have a go at shutting some of the bigger or more egregious ones, but they have never really had much of a chance of doing that without the support of police or prosecutors.
“Advertising is a different ball game. You can choke off a key point in the system. TikTok and Meta can be rendered criminally liable if they do not take this down, once they are told about it explicitly.
“Once you pull the blindfold off and say the advert exists and you report it to them, they assume criminal liability if they refuse to take it down.”
Universities, Dr Veale said, should therefore look to pool resources to enforce the advertising laws, or designate a sector body with responsibility for monitoring and flagging adverts as they arise.
“If students aren’t seeing these adverts on a social media platform, they are far less likely to find out about these services,” he said.
“You are not protecting them very well by putting them through academic misconduct procedures. Protecting them requires that they aren’t being bombarded by adverts throughout their university experience.”