Big biz: Algorithms are too complicated, but also too easy to game, to open the black box

What you really want is translucency

Large tech company reps quizzed over their use of algorithms have said they'd like to be transparent with users... just not too transparent.

Four representatives from corporates whose business models rely on data slurping and processing today gave evidence to the House of Commons Science and Technology Committee as part of its inquiry into algorithmic decision-making.

The essence of the debate – which was at times frustrated by the MPs' lack of technical expertise – centred on the idea that "it depends".

Which is probably a fair cop, given how complex an area the committee is delving into, but the firms' failure to acknowledge their sometimes contradictory arguments – and the genuine concerns people have about automated decisions – won't win them any friends.

For instance, the big biz session saw plenty of pearl-clutching about the importance of being open with users about how their data is used, but the companies ultimately sided against full transparency, for two reasons: algorithms are either too complicated to explain or too easy to game.


"Instead of thinking about algorithmic transparency, we should think about algorithmic translucency," said Martin Wattenberg, senior staff research scientist from Google's AI team. "It's like the frosted glass in a bathroom window, it lets light in but not certain details."

Wattenberg argued that the main issue was what users "really want in terms of an explanation", saying it was hard to know the right level of detail to provide.

To demonstrate his point, he said that if someone had asked him why he'd picked up his glass of water, the full answer would include details of the chemical reactions happening in his brain, but that probably wasn't the answer you were looking for – and such information wouldn't be that useful.

Carolyn Nguyen, director of technology policy at Microsoft, echoed this point.

"The primary purpose of transparency is to enable understanding, and that's really not done by sharing the code and sharing the data," she said. "It takes a lot of data scientists to understand what's going on there, and even then it's impossible."

Instead, Microsoft focuses on the "principle of explainability", in terms of how the algorithm is trained, and what is used to test, monitor and maintain it.

Meanwhile, Nick Pickles, Twitter's head of public policy for UK and Israel, said that opening up information would leave the systems vulnerable to gaming.

"A big concern for us is around how we use algorithms in a defensive way to protect users," he said. "How do you stop bad actors gaming your algorithm?"

Charles Butterworth of Experian agreed with this, saying that if his company was to expose its underlying algorithm then it would be "to the detriment of the credit industry".

But Labour MP Martin Whitfield questioned this apparent dichotomy: the committee had heard plenty of evidence that handing over information on an algorithm meant someone could game it – but also that the systems were so complex nobody could understand them.

The witnesses countered that sometimes the first problem is the main issue, and sometimes it's the second.


"Different algorithms have different properties," Wattenberg said: those with a very clear, simple way of working are easier to game.

Take Google's PageRank, he said. When its academic founders published a paper describing the algorithm, spammers used it to create networks of synthetic sites. "Since then we've learnt ways to deal with it, but we keep it very close to our chest... it's a good example of just how contextual [this question] is."

A similar situation arose when the panel was asked about audits. Nguyen said there was a "spectrum" that ran from self-auditing to a third-party investigation, adding that it wasn't clear the criteria for validation and testing were well enough understood to allow this.

Wattenberg echoed this, saying that allowing third parties in to audit would have to be "thought through very carefully because there are implications for privacy and security" for people who aren't part of the organisation having access to that information.

The committee's inquiry is ongoing. ®
