Hey. Facebook, Twitter, YouTube. Get in here... so we can shake your hands – US Senate cyber-terror panel

So much for that grilling

By Kieren McCarthy in San Francisco

The US Senate's commerce committee basically gave executives from Twitter, Facebook and YouTube a back-rub at a hearing on Wednesday morning.

The session was titled "Terrorism and Social Media: #IsBigTechDoingEnough?" and the answer from the majority of lawmakers was "yes." Unfortunately.

Under intense pressure in Europe over how their systems have been used to disseminate violent and hateful comment, as well as propaganda and misinformation, the internet giants have started taking the problem seriously.

It's amazing what real laws passed in legislative bodies can do. A new law came into effect in Germany in October that allows officials to fine social media companies up to €50m ($60m, £44m) if they fail to remove hate speech and illegal content within 24 hours. The next month, Facebook opened a second content review center in Germany with 500 staff.

With similar laws proposed in the UK and the European Union overall, suddenly a range of real programs to tackle cyber-bile appeared. In the United States, however – where the companies are all headquartered – the legislative threat has been beaten back by an army of well-funded lobbyists.

In testimony today, the corporations' representatives outlined measures focused on actively reducing or isolating troublesome content – which is significant progress from last year when they repeatedly refused to accept there was a problem at all, and argued that freedom of speech was paramount.

Great job

But, as happened last year, the companies with hundreds of millions of users and billions of dollars are confident about one thing: they are doing a great job. They are always doing a great job. Even when the same bomb-making video appears over and over again on their websites. Even when fake news and conspiracies override legitimate sources of news on their platforms.

They are always doing a great job because they are always working on the problem. The past is the past. Even a week ago is old news; it all moves so fast. Let's look to the future and in future these companies will be doing more. Ipso facto, they are doing a great job.

How do we know they're doing a great job? They have stats. Stats that they themselves developed, and compiled, and the parameters of which are secret but they're still great stats.

"Last June, only 40 per cent of the videos we removed for violent extremism were identified by our algorithms," explained [PDF] YouTube's director of public policy Juniper Downs. "Today, that number is 98 per cent. Our advances in machine learning let us now take down nearly 70 per cent of violent extremism content within eight hours of upload and nearly half of it in two hours."

Facebook's got stats too – and they're even better. "Ninety-nine per cent of the ISIS and Al Qaeda-related terror content that we remove from Facebook is detected and removed before anyone in our community reports it, and in some cases, before it goes live on the site," explained [PDF] its head of product policy and counterterrorism Monika Bickert. "Once we are aware of a piece of terrorist content, we remove 83 per cent of subsequently uploaded copies within one hour of upload."

Not to be outdone, Twitter has stats, too [PDF]. Its "in-house proprietary" technology was used to flag "more than 90 percent of suspensions" and "three-quarters of those suspensions were flagged before the account had a chance to tweet even once."

And behind the numbers

Of course, those figures also reveal that nearly a third of violent extremist videos on YouTube are not taken down within eight hours of upload, and that Facebook fails to catch nearly a fifth of reuploaded copies of videos it has already banned within an hour. Facebook also makes no mention of how many videos it missed altogether. And by piecing together Twitter's information, it's possible to figure out that there are still hundreds of thousands of fake or abused accounts on its system.

Fortunately, senators were able to immediately parse those careful statements and dig into the significant remaining problems, threatening to impose legislative corrections unless the companies did more. Only kidding. Of course they didn't.

They thanked the internet goliaths for the tremendous efforts they are undertaking – even though most of the programs are still in their early stages and remain untested. One senator even actively argued that their responses – which in some cases comprise little more than words – meant that the issue of extremist content was done and dusted.

When they were occasionally put under pressure – such as when Senator Cortez Masto (D-NV) asked pointedly about the fake news that spread in the aftermath of the October mass shooting in Las Vegas – the tech execs fell back on a well-worn piece of redirection: their platforms are also used to do wonderful things.

And speaking of redirection, it is the new policy kid on the block for dealing with extremist content.

"YouTube is teaming up with Jigsaw, the in-house think tank of Google’s parent company Alphabet, to test a new method of counter-radicalization referred to as the 'Redirect Method'," read testimony to the hearing. "Seeking to 'redirect' or re-focus potential terrorists at an earlier stage in the radicalization process, YouTube offers users searching for specific terrorist information additional videos made specifically to deter them from becoming radicalized."

Which company representative flagged up this untested, unmonitored, largely theoretical program? None of them – it was the chairman of the committee himself, Senator John Thune (R-SD) in his official opening statement.

Senators even let the Silicon Valley executives get away with claiming that cleaning up their services and platforms from such content was an intrinsic part of their business, that they had a clear "commercial incentive" to make sure it didn't appear.

No doubt it is pure coincidence that the new programs, such as they are, only arrived on the scene after specific laws were passed and more were threatened.

It is, of course, absolute bunkum that Facebook et al are at a commercial disadvantage if they host violent content: it won't make the slightest difference to the vast majority of their users whether the content is there or not, and their entire business model is built on the simple sharing of content.

The sole voice of dissent invited to the panel was that of Clint Watts from the Foreign Policy Research Institute. He warned in his testimony [PDF] that more needed to be done: "Social media companies realize the damage of these bad actors far too late. They race to implement policies to prevent the last information attack, but have yet to anticipate the next abuse of their social media platforms by emerging threats."

It costs the social media giants time and money to restrict content – which is why they are so keen on automated ways of removing it. They already have engineers writing code and tweaking existing algorithms – let’s use them rather than hire an expensive army of video and post reviewers.

Culture wars

But even harder to fix is the work culture of these moneybags businesses that were formed from an engineering and technology utopian standpoint. They are inherently and fiercely protective of their own view of themselves as wonderful enablers of communication where any problems can be solved with a software tweak rather than requiring a cultural overhaul.

That was evident when Senator Maggie Hassan (D-NH) argued that the companies would inevitably have to adopt the generally accepted concept of "see something, say something," where the public is encouraged to report anything suspicious to the cops or Feds. She gave a clear example of where YouTube's failure to report extremist content on an account run by Boston marathon bomber Tamerlan Tsarnaev meant that the FBI missed his Al-Qaeda sympathies when vetting him for US citizenship.

Each of the company reps gave the same answer: that they work with law enforcement when it goes through the correct legal process and will report content if it fits certain specific criteria – mostly a threat to life.

Hassan responded: "I will say that the 'see something, say something' campaign is premised on something a little bit different. It is premised on if you think you see something; not 'does this meet my definition of imminent danger'? If they see something suspicious, they are asked to step up."

The cultural gulf between a senator who sees YouTube and others as the gatekeepers, overseers, and ultimately controllers of content, and the company representatives who see themselves as simply providing the means by which netizens can share information, could not have been more stark.

That is not to say that Facebook, Twitter and YouTube are not serious about tackling the issue – they are. And they are making changes that should significantly reduce the problem – at least to levels where they won't be hauled in front of a congressional committee to explain themselves.

Premature

But it is far too premature for senators to take the threat of legislation off the table as some have already done. Senator Mike Lee (R-UT), for example, called even the idea of fining the corporations for failing to remove illegal content "distressing."

It is possible that Facebook, Twitter, and chums will slowly reform themselves to suit their new environment – but it is a virtual certainty that such change will only come about thanks to external threats.

As for Congress, it has its own dysfunctions to deal with. Despite the hearing dealing with the serious issue of terrorism and extremist content, the highest ranking Democrat on the committee was unable to prevent himself from making it a partisan political issue.

Senator Bill Nelson (D-FL) focused his statement on Russian interference in the presidential election and the recent FCC decision on net neutrality.

Likewise, Senator Ted Cruz (R-TX) could not stop himself from turning the whole thing into another one of his conspiratorial mini-campaigns, accusing the tech companies of expressing political bias against right-wing ideologies, wrongly asserting that anti-abortion groups were being silenced and that Representative Marsha Blackburn (R-TN) was censored. Moments after he finished ignoring the responses to his false statements, Cruz stood up and left the committee room. Brief closing statements then began.

Unfortunately for the billions of users of these companies' services – and for broader society – it is only likely to be in the halls of Congress that the commercial imperatives of internet giants are tempered by the societal demands of the country as a whole.

We are not there yet, despite the stats and the wonderful array of new policies outlined today by social media heavyweights. ®
