Facebook launches ‘War Room’ to combat manipulation
By AFP - Oct 18, 2018 - Last updated at Oct 18, 2018
Employees work in Facebook's ‘War Room’, during a media demonstration on Wednesday, in Menlo Park, California. Facebook's Menlo Park headquarters is the nerve centre for the fight against misinformation and manipulation of the social network by foreign actors trying to influence elections in the US and elsewhere (AFP photo)
MENLO PARK, United States — In Facebook's "War Room," a nondescript space adorned with American and Brazilian flags, a team of 20 people monitors computer screens for signs of suspicious activity.
The freshly launched unit at Facebook's Menlo Park headquarters in California is the nerve centre for the fight against misinformation and manipulation of the largest social network by foreign actors trying to influence elections in the United States and elsewhere.
Inside, the walls have clocks showing the time in various regions of the US and Brazil, maps and TV screens showing CNN, Fox News and Twitter, and other monitors showing graphs of Facebook activity in real time.
Facebook, which has been blamed for doing too little to prevent misinformation efforts by Russia and others in the 2016 US election, now wants the world to know it is taking aggressive steps with initiatives like the war room.
"Our job is to detect... anyone trying to manipulate the public debate," said Nathaniel Gleicher, a former White House cybersecurity policy director for the National Security Council who is now heading Facebook's cybersecurity policy. "We work to find and remove these actors."
Facebook has been racing to get measures in place and began operating this nerve centre — with a hastily taped “War Room” sign on the glass door — for the first round of the presidential vote in Brazil on October 7.
It did not take long to find false information and rumours spreading that could have influenced voters in Brazil.
"On election day, we saw a spike in voter suppression [messages] saying the election was delayed due to protests. That was not a true story," said Samidh Chakrabarti, Facebook's head of civic engagement.
Chakrabarti said Facebook was able to remove these posts within a couple of hours, before they went viral.
Humans and machines
At the unveiling of the war room for a small group of journalists, including AFP, this week, a man in a gray pork pie hat kept his eyes glued to his screen, to which a Brazilian flag was attached.
He said nothing but his mission was obvious — watching for any hints of interference with the second round of voting in Brazil on October 28.
The war room, which will ramp up activity for the November 6 midterm US elections, is the most concrete sign of Facebook's efforts to weed out misinformation.
Staffed with experts in computer science and cybersecurity as well as legal specialists, the centre currently operates during peak hours for the US and Brazil, with plans to eventually run 24/7.
The war room adds a human dimension to the artificial intelligence tools Facebook has already deployed to detect inauthentic or manipulative activity.
"Humans can adapt quickly to new threats," Gleicher said of the latest effort.
Chakrabarti said the new centre is an important part of coordinating activity, even for a company that has been built on remote communications among people in various parts of the world.
"There's no substitute for face-to-face interactions," he said.
The war room was activated just weeks ahead of the US vote, amid persistent fears of manipulation by Russia and other state entities, or efforts to polarise or inflame tensions.
The war room is part of stepped-up security measures announced by Facebook, which is adding some 20,000 employees.
"With elections we need people to detect and remove [false information] as quickly as possible," Chakrabarti said.
The human and computerised efforts to weed out bad information complement each other, according to Chakrabarti.
"If an anomaly is detected in an automated way, then a data scientist will investigate, will see if there is really a problem," he said.
The efforts are also coordinated with Facebook's fact-checking partners around the world including media organisations such as AFP and university experts.
Gleicher said the team will remain on high alert for any effort that could lead to false information going viral and potentially impacting the result of an election.
"We need to stay ahead of bad actors," he said. "We keep shrinking the doorway. They keep trying to get in."