Thursday, March 30, 2017

Investigation Reveals That Wikipedia's Bots Are in a Silent, Never-Ending War With Each Other


Via sciencealert.com by Peter Dockrill


Wikipedia, the fifth most popular website on the internet – according to Wikipedia – has amassed an amazing 40 million entries since its launch in 2001, but underneath all that free information, a cold cyber war has been raging.

A new analysis of the first 10 years of Wikipedia has found that huge numbers of automated software 'bots' – editing algorithms powered by artificial intelligence (AI) – are embroiled in epic, ongoing disputes over articles, continually reverting each other's edits in a desperate bid to have the last word.

"The fights between bots can be far more persistent than the ones we see between people," researcher Taha Yasseri from the University of Oxford in the UK told The Guardian.

"Humans usually cool down after a few days, but the bots might continue for years."

In their study, Yasseri and fellow researchers tracked edits made on Wikipedia between 2001 and 2010.




While the amount of bot activity in the website's early years was low, it skyrocketed as the platform and bot technology matured, with bots responsible for about 15 percent of all edits across all language editions of the encyclopaedia in 2014 – even though these algorithms only make up about 0.1 percent of Wikipedia editors.

Editing bots perform a number of content-maintenance roles on the site, including undoing vandalism, enforcing bans, checking spelling, creating links, and importing new content automatically.
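For a sense of how straightforward most of these maintenance bots are, here is a minimal sketch of a spell-checking bot built on the pywikibot library; the target page title and the single misspelling it corrects are hypothetical placeholders, not anything taken from the study.

# Minimal sketch of a spell-checking maintenance bot, assuming the pywikibot
# library. The page title and the one-word fix below are purely illustrative.
import pywikibot

site = pywikibot.Site('en', 'wikipedia')        # connect to English Wikipedia
page = pywikibot.Page(site, 'Example article')  # hypothetical target page

text = page.text
fixed = text.replace('recieve', 'receive')      # toy spelling correction

if fixed != text:                               # only save when something changed
    page.text = fixed
    page.save(summary='Bot: fixing common misspelling')

In practice a bot like this runs over thousands of pages, which is exactly why two bots with overlapping rules can end up stepping on each other's edits.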

These bots are designed to be benevolent, supporting human Wikipedia users and cooperating with them – but that benevolence doesn't always extend to their own kind.

Two bots in particular, Xqbot and Darknessbot, clashed on over 3,600 articles on a range of topics, from Alexander of Greece to Aston Villa football club.

Between 2009 and 2010, Xqbot reverted more than 2,000 of Darknessbot's edits – with Darknessbot returning the favour with some 1,700 of its own changes to Xqbot's edits.

Another epic clash, between bots called Tachikoma and Russbot, saw each undo more than 1,000 edits made by the other.

These kinds of stoushes came as a shock to the researchers, given that the bots aren't intended to conflict with one another – instead, they were accidentally caught in loops where their programming made editorial combat inevitable.
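To see how such a loop can arise, here is a toy Python simulation with two invented bots whose rules directly contradict each other; on every pass, each bot "fixes" the article and undoes the other's last edit, so the text never stabilises.

# Toy model of a bot revert loop: two rule-based bots with conflicting
# preferences repeatedly "correct" the same sentence. The bot names and
# rules here are invented purely for illustration.

RULES = {
    'SpellBotUS': lambda text: text.replace('colour', 'color'),  # prefers US spelling
    'SpellBotUK': lambda text: text.replace('color', 'colour'),  # prefers UK spelling
}

article = 'The colour of the flag'
reverts = 0

for _ in range(3):                       # three patrol passes over the same article
    for name, rule in RULES.items():
        fixed = rule(article)
        if fixed != article:             # the bot sees an "error" and edits
            print(f'{name} rewrites the text -> {fixed!r}')
            article = fixed
            reverts += 1

print(f'{reverts} reverts and still no stable version.')

Neither simulated bot is malicious; each simply keeps enforcing its own rule indefinitely, which is essentially the dynamic the researchers observed at a much larger scale.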

"We had very low expectations to see anything interesting. When you think about them they are very boring," Yasseri told Ian Sample at The Guardian.

"The very fact that we saw a lot of conflict among bots was a big surprise to us. They are good bots, they are based on good intentions, and they are based on same open source technology."

Another surprise for the researchers was how differently bot conflict played out across the various language versions of the site.

The German edition of Wikipedia had the fewest bot conflicts – with 24 reversions per bot on average over the 10-year study period. The Portuguese Wikipedia had the most clashes – an average of 185 reversions per bot – while the English-language edition fell in the middle, with bots altering each other's edits 105 times on average over the 10 years.

"We find that bots behave differently in different cultural environments and their conflicts are also very different to the ones between human editors," explains one of the team, Milena Tsvetkova, in a press release.

"This has implications not only for how we design artificial agents but also for how we study them. We need more research into the sociology of bots."

As automated AI agents become increasingly prevalent and more powerful, the cultural (and potentially combative) dispositions built into their programming are something we'll have to pay a lot more attention to.

Otherwise, the future is going to end up looking way too much like this:

Source
