Facebook yanked a beheading video from the social network late Tuesday following outrage over its lifting of a ban on the gory imagery, AFP reports.
The flip-flop came as Facebook sought to balance the diverse sensitivities of its billion-plus members with its desire to be a platform for free speech and real-world news.
"People turn to Facebook to share their experiences and to raise awareness about issues important to them," it said in a statement emailed to AFP.
"Sometimes, those experiences and issues involve graphic content that is of public interest or concern, such as human rights abuses, acts of terrorism, and other violence," the California-based company added.
"When people share this type of graphic content, it is often to condemn it. If it is being shared for sadistic pleasure or to celebrate violence, Facebook removes it."
Facebook was adamant that it had not reversed or changed any policies as a result of the controversy, saying instead that criticism of the video had prompted it to scrutinize the post more closely in the context of its existing terms of service.
But it said that, as part of an effort to "combat the glorification of violence" on the social network, it was "strengthening" enforcement of its policies.
Facebook had introduced a temporary ban on videos of beheadings in May following complaints that the graphic footage could cause users long-term psychological harm.
But it confirmed on Monday that it had reversed the decision on the grounds that the site is used to share information about world events, including terrorist attacks and human rights abuses.
According to screenshots, Facebook had added a warning to the beheading video stating that it "contains extremely graphic content and may be disturbing" before re-evaluating the post and removing it.
British Prime Minister David Cameron on Tuesday condemned Facebook as "irresponsible" and said "worried parents" needed to hear an explanation from the tech giant.
"It's irresponsible of Facebook to post beheading videos, especially without a warning," Cameron said on Twitter.
Facebook had reasoned that it would allow such material because "people are sharing this video on Facebook to condemn it."
It has been criticized for allowing this type of violent content while banning other material, such as nudity.
On its standards page, Facebook says "we remove content and may escalate to law enforcement when we perceive a genuine risk of physical harm, or a direct threat to public safety... Organizations with a record of terrorist or violent criminal activity are not allowed to maintain a presence on our site."
The world's biggest social network said it seeks to avoid censorship and its policy notes that "graphic imagery is a regular component of current events, but must balance the needs of a diverse community."
"When we review content that is reported to us, we will take a more holistic look at the context surrounding a violent image or video, and will remove content that celebrates violence," it said.
Facebook will also evaluate whether posted content is being shared responsibly, for example by adding warning messages or age restrictions for audiences.
"Based on these enhanced standards, we have re-examined recent reports of graphic content and have concluded that this content improperly and irresponsibly glorifies violence," it said of the beheading video.
"For this reason, we have removed it."