Author Topic: Ex-Facebook employee says child safety team in disarray  (Read 113 times)

Offline javajolt

Ex-Facebook employee says child safety team in disarray
« on: November 23, 2021, 04:02:43 PM »
We reported recently that Facebook/Meta has postponed its plans for end-to-end encryption (E2EE) on Messenger and Instagram until 2023, as the company works to resolve child safety issues.

Many critics noted that WhatsApp already uses E2EE and felt the delays were unjustified.

Today David Thiel, a former Facebook employee on its child safety team and now Chief Technology Officer of the Stanford Internet Observatory, posted a thread on Twitter criticizing both Facebook and its critics over the push towards E2EE on the service.

He noted that Facebook’s motives to push for E2EE were far from pure, and had more to do with “preempting anti-trust action, less interaction with LE (law enforcement), significantly scaled back safety teams, and good marketing.”

In short, by hiding communication on Facebook from Facebook itself, the company would be less responsible for the content it transmits.
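
To make that trade-off concrete, here is a minimal sketch of the E2EE idea in Python using the PyNaCl library (an illustrative assumption on our part; Messenger's planned E2EE is built on the Signal protocol, not this code). The point is simply that the server only ever relays ciphertext it cannot read.

Code
# Minimal E2EE illustration with PyNaCl -- not Facebook's actual protocol.
from nacl.public import PrivateKey, Box

# Each user generates a key pair on their own device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts a message that only Bob's private key can decrypt.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"see you at 6")

# The server (Facebook) only ever sees and relays this ciphertext.
print(ciphertext.hex())

# Bob decrypts on his own device.
receiving_box = Box(bob_key, alice_key.public_key)
print(receiving_box.decrypt(ciphertext))  # b'see you at 6'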

Thiel said the plan was announced without any roadmap to implement it, and with “an absurdly accelerated timeline,” before Facebook had even decided how to integrate its three messaging platforms (WhatsApp, Messenger, and Instagram).

Worse, tests showed that when Facebook did not inspect the content of messages (as would be the case with E2EE), it detected “child grooming, sextortion and CSAM distribution” only 10% as well as when it did examine message content, meaning the majority of harm would escape detection.

With no clear plan to address this, and a “gotta break a few eggs” attitude from management, Thiel said many prominent members of the child safety teams resigned.

Thiel said the only way to maintain E2EE and still safeguard children would be client-side inspection, as Apple recently proposed, but, as Apple discovered, that approach is anathema to most security researchers and to the public in general.
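
For readers unfamiliar with the term, client-side inspection means the user's own device checks content against known-bad fingerprints before it is encrypted and sent. The sketch below is a deliberately simplified Python illustration using plain SHA-256 hashes and a hypothetical blocklist; Apple's proposed system actually used perceptual hashing (NeuralHash) and private set intersection, so this is only the shape of the idea, not anyone's real implementation.

Code
# Simplified client-side scanning illustration -- not Apple's or Facebook's design.
import hashlib

# Hypothetical blocklist of fingerprints distributed to the client
# (placeholder value, here just the SHA-256 of the string "test").
KNOWN_BAD_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def scan_before_send(payload: bytes) -> bool:
    """Return True if the payload is clear to encrypt and send."""
    fingerprint = hashlib.sha256(payload).hexdigest()
    return fingerprint not in KNOWN_BAD_HASHES

if scan_before_send(b"holiday photo"):
    print("ok to encrypt and send")
else:
    print("flagged on-device before encryption")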

Thiel noted that WhatsApp does of course already use E2EE, but due to the 1:1 nature of the network, it does not give predators ready access to children, unlike Facebook, which is designed to introduce people to new friends and expand their social networks.

Thiel explained:

Quote
WhatsApp doesn’t recommend people to befriend and interact with. It doesn’t host secret groups of unlimited size. It doesn’t provide a global search of every user. It doesn’t group people by location or institutions like high schools. Whereas Facebook tries to take existing social networks, merge them, and build new ones. This has led to wildly inappropriate situations (including literally recommending victims to abusers) particularly when combined with contact sync and offsite pixel tracking.

Even then a lot of abuse was still missed on the platform.

Thiel noted that until social networks themselves can be made inherently safe from child predators, E2EE should not be added to them.

Read his full thread here.

source
« Last Edit: November 23, 2021, 04:03:50 PM by javajolt »