Facebook's security and product leads have been working globally in the company's effort to regain user trust and protect the integrity of elections worldwide. The world's largest social network cannot take back what occurred on its platform during the 2016 presidential election, but the team has been trying to prevent bad actors, like Russian trolls, from using Facebook to nefariously affect election results in the future.
"It's really important for us while we're solving the problems of the 2016 election that we're not getting tunnel vision there," Facebook's Chief Security Officer Alex Stamos told reporters on Thursday. "We don't just want to be fighting the last war."
Facebook's main goals include "combating foreign interference, removing fake accounts, increasing ad transparency, and reducing the spread of false news," Stamos said on the call. But as Stamos shared on Thursday, fake news is more than just falsified stories shared on Facebook. His team is looking at fake identities, fake audiences, false facts, and false narratives. One major product change, for example, is allowing fact-checking partners to review and identify stories, photos, and videos as false on their own, rather than waiting for Facebook to prompt them to do so.
Of course, election integrity isn't the only problem on Facebook executives' minds. Facebook is also in the midst of a privacy scandal involving data firm Cambridge Analytica, which obtained data on 50 million unsuspecting Facebook users. That scandal has fueled a movement to #DeleteFacebook and forced Facebook to release more data privacy protections.
"We know we have work to do to earn people's trust back," Guy Rosen, Facebook's vice president of product, said when asked how many people have deleted Facebook since the Cambridge Analytica scandal. "That's why we're here today."
Meanwhile, the National Fair Housing Alliance sued Facebook this week for enabling discrimination via housing ads on its platforms. Beyond advertising, Facebook also has been blamed for spreading disinformation more generally, with deadly consequences in Myanmar. Facebook didn't say exactly how it's measuring progress but told reporters it will hold more of these conversations.
Stamos, who is rumored to be leaving Facebook in August, and Rosen were two of several Facebook executives who spoke during a phone briefing on Thursday to share prepared remarks as well as answer reporters' questions about the company's election work. The main takeaway: Facebook is invested in protecting election integrity all over the world. The company has been prepping for the 2018 U.S. midterms while also deploying staff in countries like Germany, France, Italy, and Kenya to localize its efforts.
"Each country we operate in will have a different range of actors," Stamos said. "We're working with countries to understand the particular actors."
Money on the line
A Facebook executive told Mashable that election integrity could affect the company's bottom line, similar to Facebook's earlier willingness to lower profitability in order to make its products more enjoyable and less addictive to users.