Facebook yesterday said it was ramping up the use of artificial intelligence as it pushed to make the social network "a hostile place" for extremists to spread messages of hate.
Pressure had been mounting on Facebook, along with other internet giants, which stand accused of doing too little, too late to eliminate hate speech and jihadist recruiters from their platforms.
In a joint blog post, Monika Bickert, the global policy management director of Facebook, and counter-terrorism policy manager Brian Fishman said Facebook was committed to tackling the issue "head-on."
"In the wake of recent terror attacks, people have questioned the role of tech companies in fighting terrorism online," Bickert and Fishman said in the post.
"We want Facebook to be a hostile place for terrorists," they said, adding: "We believe technology, and Facebook, can be part of the solution."
They described how the network was automating the process of identifying and removing jihadist content linked to the Islamic State group, Al-Qaeda and their affiliates, and intended to add other extremist organisations over time.
Facebook said it uses artificial intelligence to recognise when a freshly posted image or video matches one previously removed from the social network, which has nearly two billion users posting in more than 80 languages.
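The matching step described above can be illustrated with a minimal sketch. This is a hypothetical, simplified illustration only, not Facebook's actual system: it uses exact cryptographic hashing from Python's standard library, whereas production systems use perceptual hashes that survive re-encoding and cropping. All names and data below are invented for the example.

```python
import hashlib

def media_hash(data: bytes) -> str:
    """Return a hex digest used as a content fingerprint (exact match only)."""
    return hashlib.sha256(data).hexdigest()

# Toy database of fingerprints of content already removed from the platform.
removed_hashes = {media_hash(b"previously-removed-video-bytes")}

def should_block(upload: bytes) -> bool:
    """Flag a new upload whose fingerprint matches previously removed content."""
    return media_hash(upload) in removed_hashes

print(should_block(b"previously-removed-video-bytes"))  # re-upload of known content: True
print(should_block(b"unrelated-holiday-photo-bytes"))   # no match: False
```

The key design point is that the platform never has to re-classify known material: once an item is removed, any byte-identical re-upload is caught by a cheap set lookup rather than a fresh review.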
"Our stance is simple: There's no place on Facebook for terrorism," the company said. "We remove terrorists and posts that support terrorism whenever we become aware of them."
According to commentators, the forceful response was meant to ward off criticism from politicians such as UK prime minister Theresa May and president Emmanuel Macron of France, who met earlier this week to discuss "a joint campaign to ensure that the internet cannot be used as a safe space for terrorists and criminals."
Facebook said it was already doing what May wanted; she has called on internet companies to "deprive the extremists of their safe spaces online," so it was not clear whether yesterday's response would satisfy her, commentators said.