Francie Latour was picking out produce in a suburban Boston grocery store when a white man leaned toward her two young sons and, just loudly enough for the boys to hear, unleashed a profanity-laced racist epithet.

Reeling, Latour, who is black, turned to Facebook[1] to vent, in a post that was explicit about the hateful words hurled at her 8- and 12-year-olds on a Sunday evening in July.

"I couldn't tolerate just sitting with it and being silent," Latour said in an interview. "I felt like I was going to jump out of my skin, like my kids' innocence was stolen in the blink of an eye."

But within 20 minutes, Facebook deleted her post, sending Latour a cursory message that her content had violated company standards. Only two friends had gotten the chance to voice their disbelief and outrage.

Experiences like Latour's exemplify the challenges Facebook chief executive Mark Zuckerberg[2] confronts as he tries to rebrand his company as a safe space for community, expanding on its earlier goal of connecting friends and family.

But in making decisions about the limits of free speech, Facebook often fails the racial, religious and sexual minorities Zuckerberg says he wants to protect.

The 13-year-old social network is wrestling with the hardest questions it has ever faced as the de facto arbiter of speech for the third of the world's population that now logs on each month.

In February[3], amid mounting concerns over Facebook's role in the spread of violent live videos and fake news[4], Zuckerberg said the platform had a responsibility to "mitigate the bad" effects of the service in a more dangerous and divisive political era. In June, he officially ...