
Student retention software comes under microscope

By Carl Straumsheim

November 11, 2013

Purdue University in Indiana has for years touted the ability of its early-warning system, Signals, to improve student retention, but a series of blog entries analysing the institution’s claims has found no causal connection between using the system and students’ tendency to stick with their studies.

Purdue’s method of structuring its early-warning system has permeated the industry, and research invalidating its results could have sent shockwaves through its competitors. The university’s approach is not being called into question, however - only its claim to boost retention. On one hand, that is likely to come as a relief to the many software providers that have attempted to recreate what Signals does; on the other, the rationale many of them give for using such early-warning systems is precisely to improve retention.

Signals combines demographic information with data on online engagement and produces a red, yellow or green light to show students how well they are doing in their courses - it also gives that information to their professors so they can intervene before students drop out or fail. Ellucian, which provides administrative software, sells it as the commercial product Course Signals, while educational software providers Blackboard and Desire2Learn offer many of the same features through Retention Center and Student Success System, respectively.

Michael Caulfield, director of blended and networked learning at Washington State University at Vancouver, decided to take a closer look at Signals after Purdue claimed in a September press release that taking two Signals-enabled courses increased students’ six-year graduation rate by 21.48 per cent. Mr Caulfield described Purdue research scientist Matt Pistilli’s statement that “two courses is the magic number” as “maddening”.

Comparing the retention rates of the 2007 and 2009 cohorts, Mr Caulfield suggested that much of what Purdue described as data analysis simply measured how many courses students took. Once Signals left its pilot phase in 2008 and more students across campus enrolled in at least one such course, he found, the retention effect “disappeared completely”.

Put another way, “students are taking more…Signals courses because they persist, rather than persisting because they are taking more Signals courses”, Mr Caulfield wrote.

His findings were last month corroborated by Alfred Essa, McGraw-Hill Education’s vice-president of research and development and analytics, who wrote a simulation that substituted “received a piece of chocolate” for “took a Signals-enabled class”.

“The simulation data shows us that the retention gain for students is not a real gain (i.e., causal) but an artifact of the simple fact that students who stay longer in college are more likely to receive more chocolates,” he concluded. “So, the answer to the question we started off with is ‘No.’ You can’t improve retention rates by giving students chocolates.”
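Mr Essa’s simulation is not reproduced here, but the artefact he describes is straightforward to sketch. The short Python script below is an illustration written for this argument, not his actual code; the dropout and “chocolate” probabilities are assumed values. Chocolates are handed out at random to enrolled students, dropout ignores chocolates entirely, and yet students who end up with two or more chocolates still graduate at a visibly higher rate.

```python
# A minimal sketch of the survivorship artefact Mr Essa describes (not his
# actual simulation): chocolates are handed out at random each semester and
# never influence dropout, yet students with two or more chocolates show a
# higher graduation rate simply because persisting longer gives more chances
# to receive one.
import random

random.seed(0)

N_STUDENTS = 100_000
N_SEMESTERS = 8          # assumed length of a degree programme
DROPOUT_PROB = 0.10      # assumed per-semester dropout probability (independent of chocolates)
CHOCOLATE_PROB = 0.30    # assumed chance of receiving a chocolate in any enrolled semester

def simulate_student():
    """Return (chocolates_received, graduated) for one simulated student."""
    chocolates = 0
    for _ in range(N_SEMESTERS):
        if random.random() < CHOCOLATE_PROB:
            chocolates += 1
        if random.random() < DROPOUT_PROB:
            return chocolates, False   # dropped out: no more semesters, no more chocolates
    return chocolates, True            # survived every semester

students = [simulate_student() for _ in range(N_STUDENTS)]

def graduation_rate(group):
    return sum(graduated for _, graduated in group) / len(group)

two_plus = [s for s in students if s[0] >= 2]   # analogue of "took two Signals-enabled courses"
fewer = [s for s in students if s[0] < 2]

print(f"Graduation rate, 2+ chocolates: {graduation_rate(two_plus):.1%}")
print(f"Graduation rate, <2 chocolates: {graduation_rate(fewer):.1%}")
# The gap appears even though chocolates never enter the dropout model:
# persistence causes the chocolate count, not the other way around.
```

The gap arises purely from survivorship - staying enrolled longer gives more opportunities to collect chocolates - which is the same reverse-causality point Mr Caulfield made about Signals courses.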

Mr Essa helped design the Student Success System as a strategy director for Desire2Learn, but said that it and other products inspired by Signals do not face an existential crisis.

“The aim of these early warning alert systems at the course level is just to make sure that students are performing well,” Mr Essa said. “It’s a huge leap to go from that and say, ‘Oh, and we’re also going to improve your retention rates directly.’ ”

Dr Pistilli defended the claims about Signals’ ability to increase retention - with the caveat that more research needs to be done. “The analysis that we did was just a straightforward analysis of retention rates,” he said. “There’s nothing else to it.”

To ensure an empirically grounded analysis of Signals, Mr Essa urged Purdue to give researchers access to as much data as possible. Dr Pistilli said he is open to participating in that conversation, but pointed out that granting open access could violate students’ privacy rights.

With Signals marking its fifth anniversary this year, Dr Pistilli said “it was probably just a matter of time for people to start looking for these pieces and begin to draw conclusions”. In that sense, the discussion about early-warning systems resembles that around other ed-tech innovations, such as flipping the classroom and massive open online courses, where hype can drown out serious criticism.

“I think part of the answer is we’re really bad at statistical reasoning,” Mr Essa said. “Even experts get tripped up by statistics, and it’s very easy to make claims like this, but it’s difficult to dig in and try to make sense of it.”

He added: “Maybe one of the conclusions that could be derived from this is that we really don’t have a strong community to test and validate these claims? Maybe that’s really the starting point of discussion in the academic community. As we move forward with new technologies in learning analytics, how and who will be evaluating the claims that people put forward?”
