Within hours of the police shooting of Philando Castile on Wednesday, the Facebook Live video showing Castile's dying moments disappeared from the platform. Facebook later reinstated the video and offered an apology. “We’re very sorry that the video was temporarily inaccessible,” Facebook told BuzzFeed News in a statement. “It was down due to a technical glitch and restored as soon as we were able to investigate.”
If that explanation sounds familiar, it should.
In April, when a handful of Facebook groups supporting Bernie Sanders also temporarily disappeared, Facebook said essentially the same thing. “A number of groups were inaccessible for a brief period after one of our automated policies was applied incorrectly,” a Facebook spokesperson told Recode. “We corrected the problem within hours and are working to improve our tools.”
Chalking the issue up to a vague "technical glitch" or an incorrect application of an automated policy effectively absolves Facebook of any editorial responsibility. A glitch implies a malfunction in the system, something irregular and outside the norm. Yet at a certain point it's worth asking whether this is not an irregularity at all, but rather a flawed system, one poorly designed to deal with difficult issues.
When asked by BuzzFeed News, Facebook could not or would not say why the video came down, but there are only a few possibilities.
One is that the video was reported to Facebook by users who believed it was too violent, or who perhaps had political motivations. (When the pro-Sanders groups went offline, some surmised that Sanders' enemies had reported the groups to Facebook.) No matter the motive, however, reporting content can start a process within Facebook that results in its removal.
If the video was removed because of user flagging, it means that at some point along Facebook's content moderation supply chain, a person or process acting on those complaints took down an important, nationally significant document, one so significant that the President himself addressed it today.
Alternatively, the takedown could have resulted from another faulty application of a different automated policy, in which case the video was flagged by Facebook itself, not by users. If that's the case, then Facebook's automated content moderation now has a record of causing important content to be removed from the platform.
Of course, an unexplained, never-before-seen technical problem could have randomly removed a video that's central to a national news story. That would indeed merit the "technical glitch" explanation. But that seems highly unlikely from a company as technologically sophisticated as Facebook.