Facebook to reexamine how videos are flagged after Christchurch shooting

The first user report alerting Facebook to grisly footage of the New Zealand terrorist attack came 29 minutes after the video began and 12 minutes after it ended. Had it been flagged while the feed was live, Facebook said early Thursday, the social network might have moved faster to remove it.

Now the social network says it will reexamine how it reacts to live and recently aired videos.

To alert first responders to an emergency as fast as possible, Facebook says it prioritizes user reports of a livestream for "accelerated review."

"We do this because when a video is still live, if there is real-world harm we have a better chance to alert first responders and try to get help on the ground," the company said in an update about the terrorist attacks in Christchurch.

Last year, Facebook said it extended this expedited review process to videos that had recently been live. That meant users who saw a potentially violent or abusive livestream after it had aired could quickly alert Facebook moderators. But Facebook said this expanded acceleration process covered only recently live videos that were flagged for suicide. Other types of recently live videos that users flagged, such as the mass killing at a mosque, were not covered by this expedited review.

Facebook said this may change.

"As a learning from this, we are re-examining our reporting logic and experiences for both live and recently live videos in order to expand the categories that would get to accelerated review," the company said.
