Former Facebook Security Chief: Increased Privacy Complicates Safety

Former Facebook Security Chief Alex Stamos (Right) was interviewed by Casey Newton (Left) at SXSW.

This difficulty, he explained, comes from users who are “opting into conversations” and already want to talk about these issues. Fortunately, Stamos says Facebook’s pivot will still allow for protecting children “from sexual predators looking to groom targets,” as that’s a one-directional action.

Zuckerberg’s change of mind, according to Stamos, comes from being hit on both sides of the debate. Depending on who you ask, Stamos said, Zuckerberg and Facebook are either not doing enough to “stop kids from live streaming suicide attempts,” or are interfering too much in those same incidents.

So to try to cut down the number of people attacking Facebook, Stamos said, Zuckerberg is leaning hard into privacy, seemingly accepting that he won’t win on every safety situation. Stamos described this decision as “punting” on stopping conversations between everyone from antivaxxers to terrorists, and referenced communal violence in India.

Stamos is pro-conversation, and raised the possibility of the supposedly liberal Zuckerberg surrendering Facebook and it being taken over by Peter Thiel, whose politics are on the other side of the spectrum.

The Verge’s Casey Newton, who was interviewing Stamos, pushed back a little on this, saying Facebook could also work to prevent the sharing of malicious viral content and stop recommending nefarious groups.

As for how Facebook should proceed toward privacy? Stamos suggested that the company have the discussion in public, to avoid it being clouded and ruined by back-room lobbying.

Stamos added that “giving up on news feed and public activity,” means “giving up on the web.” What does that mean? A future where Facebook isn’t mobile first, but “mobile only.”

On Elizabeth Warren’s proposal to split up Facebook: Stamos said breaking things up gives people choice, but the downside is that smaller companies would have a harder time catching bad actors; he suggested facilitating companies pooling resources to stop them.

Regarding the conditions of content moderators:

“You can always pay people more,” Stamos explained, before saying that the company “needs to focus on emotional and psychological impact.” That means shifting to in-house talent, people who work directly for Facebook, rather than outsourcing to companies such as Accenture.

Source: Tom’s Guide