Facebook is reviewing child abuse video policy in latest challenge
August 01 2018 09:03 PM
A Facebook Inc logo is on display at a mega-campus for startups in Paris. Facebook is reviewing its policy of not removing so-called non-sexual child abuse videos that are shared to condemn such behaviour when the child in question is still at risk.


Facebook Inc is reviewing how it handles child abuse videos and how it prevents underage children from holding accounts, as the social network grapples with concerns that it isn’t doing enough to protect users.
The tech giant is reviewing its policy of not removing so-called non-sexual child abuse videos that are shared to condemn such behaviour when the child in question is still at risk, Facebook Ireland’s head of public policy Niamh Sweeney was expected to tell an Irish parliamentary hearing yesterday, according to a copy of her testimony seen by Bloomberg News.
The company is also working to update “guidance for reviewers to put a hold on any account they encounter if they have a strong indication” that it belongs to a child under 13 years of age.
Sweeney will appear before lawmakers in Dublin alongside the firm’s head of content policy for Europe, Siobhan Cummiskey, amid alleged issues around Facebook’s content moderation policies highlighted in a documentary broadcast by the UK’s Channel Four Television Corp. She will reiterate the company’s apology for “failings” identified in the documentary.
Sweeney’s comments come as the world’s biggest social network faces weak user growth and criticism of its content policies and data privacy issues. Its shares fell by about a fifth last week after second-quarter user numbers and revenue missed market expectations.
It was a “mistake” not to remove a video of “a three-year-old child being physically assaulted by an adult,” Sweeney will say. Facebook only allows that type of video to be shared if it is “to condemn the behaviour and the child is still at risk and there is a chance the child and perpetrator could be identified to local law enforcement as a result of awareness being raised.”
On how it handles hate speech, Facebook is “increasingly using technology to detect hate speech on our platform which means we are no longer relying on user reports alone,” Sweeney will say.
“Of the 2.5mn pieces of hate speech we removed from Facebook in the first three months of 2018, 38% of it was flagged by our technology,” she will add.
The company is also updating oversight of training for staff who review content on the website.

Facebook unveils tools to tell users when to stop scrolling

San Francisco

Facebook Inc, during all its years of expansion, has been focused on one thing above all else: getting people to spend more time on its social network.
Now, as tech giants face increasing criticism over the addictive nature of their products, the company is releasing features that do the opposite. Facebook and Instagram, its photo-sharing app, will add controls to help people measure how much time they’re spending on the sites, so they can dial it back if they want to. Users can also mute notifications on the apps for a certain period of time, or sign up to get an alert when they’ve been scrolling for too long.
“It’s not just about the time people spend on Facebook and Instagram but how they spend that time,” Facebook said in a blog post yesterday. “It’s our responsibility to talk openly about how time online impacts people — and we take that responsibility seriously.”
Most companies haven’t focused on that issue until recently, following concerns from mental-health experts and industry critics about internet and device addiction, and the way technology is designed to keep users coming back for more. In June, for example, Apple Inc introduced “Screen Time,” an activity report that will show how much time users are spending on individual apps and how often they pick up their iPhones. Google announced similar controls in May.
Facebook has been working on improving the way people feel about its website, which has been a destination for political bickering, misinformation, clickbait and viral videos. The social network earlier this year pledged to change the mix of its news feed to emphasise meaningful conversations between friends and family, as opposed to content designed specifically to go viral. The changes have affected how much time people spend on the site, which could in turn affect Facebook’s ad revenues. The company has said that it expects sales growth to slow in the coming years — and revenue fell short of estimates in the second quarter, sending Facebook stock down 19% in a day last week.
“We want the time people spend on Facebook and Instagram to be intentional, positive and inspiring,” the company said.
As part of this push, Facebook said it convened a summit with online safety experts, researchers and teens in March to talk about technology and how it’s influencing well-being. It plans to tweak its products to further address concerns like a lack of kindness online.
Meanwhile, the company is grappling with its impact on society in other ways. It disclosed on Tuesday that it had identified an ongoing effort to use its platforms to influence the US midterm elections via a network of false-identity accounts and pages. The company says it doesn’t yet know who is behind the co-ordinated campaign, which follows a similar effort, linked to Russia, ahead of the 2016 US presidential election.
