European data laws bar use of Facebook's suicide-prevention tool

30 Nov 2017


European data laws have prevented Facebook from introducing a tool designed to identify users at risk of suicide.

The social media firm has announced that it will use artificial intelligence to spot posts and video comments that could be linked to expressions of suicidal thoughts.

However, data protection laws across the EU, which bar the processing of individuals' sensitive personal data without their explicit consent, mean that the update will not be rolled out in EU countries.

Facebook already allows users to report posts by people who may be at risk of suicide, alerting a moderator. The moderator can then choose to offer the person help, such as helpline numbers or the chance to talk to a friend. However, according to the social network, many red flags go unreported.

Facebook chief Mark Zuckerberg said the company has now started to introduce "proactive detection", which automatically looks for trigger phrases, such as "are you OK?" and "can I help?", in posts or comments on videos.
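The announcement describes this only at a high level. As a purely illustrative sketch, the snippet below shows the simplest form of trigger-phrase matching over comments, using a small hard-coded phrase list; Facebook's actual system relies on machine-learned pattern recognition and its details are not public.

```python
import re

# Hypothetical trigger phrases of the kind cited in the announcement.
# This fixed list is an assumption for illustration only.
TRIGGER_PHRASES = [
    "are you ok",
    "can i help",
]

def flag_for_review(comment: str) -> bool:
    """Return True if a comment contains any trigger phrase (case-insensitive)."""
    normalized = re.sub(r"[^a-z\s]", "", comment.lower())
    return any(phrase in normalized for phrase in TRIGGER_PHRASES)

# Example: comments on a video that might prompt a reviewer to take a closer look.
comments = [
    "Great video!",
    "Are you OK? Please message me.",
    "Can I help in any way?",
]
flagged = [c for c in comments if flag_for_review(c)]
print(flagged)  # ['Are you OK? Please message me.', 'Can I help in any way?']
```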

Following tests in the US, the tool will be introduced worldwide except in the EU, according to the post. A spokesman said the issue was sensitive in Europe and that the feature would therefore not be introduced there yet.

Facebook said in its post: "When someone is expressing thoughts of suicide, it's important to get them help as quickly as possible.

"Facebook is a place where friends and family are already connected and we are able to help connect a person in distress with people who can support them. It's part of our ongoing effort to help build a safe community on and off Facebook.

"Today, we are sharing additional work we're doing to help people who are expressing thoughts of suicide, including:

  • Using pattern recognition to detect posts or live videos where someone might be expressing thoughts of suicide, and to help respond to reports faster
  • Improving how we identify appropriate first responders
  • Dedicating more reviewers from our Community Operations team to review reports of suicide or self harm

"Over the last month, we've worked with first responders on over 100 wellness checks based on reports we received via our proactive detection efforts. This is in addition to reports we received from people in the Facebook community. We also use pattern recognition to help accelerate the most concerning reports.''
