In an effort to better understand how trolls and scammers operate on social media, Facebook is building bots that imitate such bad actors, helping the company identify them on its main apps.
Facebook has built a scaled-down version of its main platform to simulate user behaviour.
According to a research paper describing the “Web-Enabled Simulation” (WES) approach behind the shadow platform, the move will help Facebook engineers identify and fix the undesired consequences of new updates before they are deployed.
The platform, called “WW”, also automatically recommends changes to improve the community experience, reports MIT Technology Review.
Like any software company, the tech giant needs to test its product any time it pushes updates.
“But the sorts of debugging methods that normal-size companies use aren’t really enough when you’ve got 2.5 billion users. Such methods usually focus on checking how a single user might experience the platform and whether the software responds to those individual users’ actions as expected,” said the MIT report.
In contrast, as many as 25 per cent of Facebook’s major issues emerge only when users begin interacting with one another.
Facebook simulates hundreds to thousands of its users at a time with a mix of hard-coded and machine-learning-based bots.
The bots are made to play out different scenarios, such as a scammer trying to exploit other users or a hacker trying to access someone’s private photos.
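Facebook has not released WW’s code, but a hard-coded bot of this kind can be pictured as a small agent loop. The Python sketch below is purely illustrative: the World and ScammerBot classes, and every method on them, are invented names, not anything taken from the WES paper.

```python
# Hypothetical sketch of a rule-based scenario bot. Facebook has not
# published WW's code; all names here are invented for illustration.
import random

class World:
    """Tiny stand-in for the simulated platform's user graph."""
    def __init__(self, n_users):
        self.users = list(range(n_users))
        self.messages = []  # (sender, recipient, text)

    def send_message(self, sender, recipient, text):
        self.messages.append((sender, recipient, text))

class ScammerBot:
    """Hard-coded bot whose objective is to find targets to scam."""
    def __init__(self, bot_id, world):
        self.bot_id, self.world = bot_id, world

    def step(self):
        # Pick a visible user and open with a typical scam overture.
        target = random.choice(self.world.users)
        self.world.send_message(self.bot_id, target,
                                "You've won a prize! Click here...")

world = World(n_users=100)
bots = [ScammerBot(f"scammer-{i}", world) for i in range(3)]
for _ in range(10):            # play the scenario out for 10 ticks
    for bot in bots:
        bot.step()
print(f"{len(world.messages)} scam messages sent in simulation")
```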
In a scamming scenario, for example, the scammer bots are given the objective of finding the best targets to scam.
“While the scenarios play out, the system automatically adjusts different parameters in the simulation, such as the bots’ privacy settings or the constraints on their actions,” the report mentioned.
With every adjustment, it evaluates which combination of parameters achieves the most desired community behaviour, and then recommends the best version to Facebook’s platform developers.
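In effect, this is a search over simulation parameters. As a rough, hypothetical illustration, the sketch below sweeps two invented privacy settings, scores each run by how many simulated scam attempts succeed, and reports the combination with the fewest; the metric, the parameters, and the run_simulation stub are all assumptions, not WW’s actual machinery.

```python
# Hypothetical parameter sweep: try combinations of simulated privacy
# settings, score each run, and recommend the best combination.
import itertools
import random

def run_simulation(messages_from_strangers, profile_visibility):
    """Stand-in for one WW run: returns the number of successful
    scams as a toy function of the two settings."""
    random.seed(0)  # common random numbers across combinations
    attempts = 100
    p_success = 0.3
    # Stricter settings make each scam attempt less likely to land.
    if not messages_from_strangers:
        p_success *= 0.2
    if profile_visibility == "friends_only":
        p_success *= 0.5
    return sum(random.random() < p_success for _ in range(attempts))

grid = itertools.product([True, False], ["public", "friends_only"])
results = {(msgs, vis): run_simulation(msgs, vis) for msgs, vis in grid}
best = min(results, key=results.get)  # fewest successful scams wins
print("Recommended settings:", best, "->", results[best], "scams")
```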
Another key difference from most testing schemes is that WW is built directly on the live platform rather than on a separate testing version.
The bots, however, stay behind the scenes.
“While a typical user interacts with Facebook through a front-end user interface, such as a profile and other website features, fake bot users can interact directly with the back-end code,” the report said.
This allows the bots to coexist with real users and simulate different scenarios more accurately, without real users mistakenly interacting with them.
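One way to picture that isolation, purely as an assumption about how such a guard might work, is a back-end that flags bot accounts and refuses to route their actions to real users. The BackEnd class and its rule below are invented for illustration.

```python
# Hypothetical sketch of bots acting through back-end calls while
# staying invisible to real users. The isolation rule and all names
# are assumptions, not Facebook's actual implementation.

class BackEnd:
    def __init__(self):
        self.bot_ids = set()  # accounts flagged as WW bots

    def register_bot(self, user_id):
        self.bot_ids.add(user_id)

    def send_message(self, sender, recipient, text):
        # Isolation rule: a bot may only interact with other bots,
        # so real users never see (or reply to) simulated activity.
        if sender in self.bot_ids and recipient not in self.bot_ids:
            raise PermissionError("bots cannot contact real users")
        print(f"{sender} -> {recipient}: {text}")

backend = BackEnd()
backend.register_bot("bot-1")
backend.register_bot("bot-2")
backend.send_message("bot-1", "bot-2", "hello")    # allowed
try:
    backend.send_message("bot-1", "alice", "hi")   # blocked
except PermissionError as err:
    print("blocked:", err)
```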
This could help Facebook detect bugs faster in the future.