
House of Lords push internet legend on greater openness and transparency from Google. Nope, says Vint Cerf

And he tells peers: 'I'm not sure showing you a neural network would be helpful'


The reverence in the House of Lords was palpable as Vint Cerf, a Google grandee and one of the, er, elders of the internet, was described during a committee meeting as technology's answer to Sir David Attenborough.

However, that did not stop the British Parliament's second chamber asking some pressing questions regarding the internet titan's transparency.

Having danced around the thorny problem of how de-facto standard search engine Google, and dominant video sharing platform YouTube, both owned by Alphabet, might influence public opinion in an internet age, Lord Mitchell finally got to the point: "I've been listening to this for nearly an hour. Now, one of the feelings I get from the responses is [you're saying] 'We're Google, you should trust us.'"

The House of Lords' Democracy and Digital Technology Committee met on Monday to probe the search and advertising giant's use of algorithms and the controversy over its misreporting of advertising spending in the UK General Election.

Cerf and his colleague Katie O'Donovan, Google's head of UK government affairs and public policy, spent the hearing fending off two main lines of attack.

First, that Google and YouTube should open more information about the workings of their search and recommendation algorithms to regulators' scrutiny. Second, that where human moderators are involved, either in selecting training data or deciding the fate of flagged content, regulators should be able to interview them and see their working.

To both questions Cerf returned a polite but firm no. Publishing information about algorithms and neural networks would be too complicated, and regulators wouldn't know what to do with it, was the response to the first probe. Or to give Cerf's charming answer:

"Basically, it's a complex interconnection of weights that take input in and pop something out to tell us you know what quality a particular web page is. I'm not sure that showing you a neural network would be helpful.


"It's not like a recipe that you would normally think of when you write a computer program. It's not an if-then-else kind of structure, it's a much more complex mathematical structure. And so, that particular manifestation of the decision making, may not be particularly helpful to look at. So, the real question is what else... would be useful in order to establish the trust that we've been talking about," he said.

Takedowns - for why?

Google and Alphabet employ moderators, 10,000 of them by the company's own count, who help ensure only "good quality" websites or videos become part of the training data, and who adjudicate whether content should be taken down or removed from recommendations. The Lords therefore thought speaking to a few of them might be useful.

Committee chairman Lord Puttnam pressed on. "What would be your resistance to having your moderators talking about ... the way they see the world and the kind of decisions they have to make? That is how you build trust, by having people on the ground, fairly consistently and openly talking about how difficult that job is. You don't build trust by ... creating a wall around the people and saying they mustn't talk to the public," he said.

But Google's representatives were largely unmoved. The moderators' guidelines were public, as were the search results; by putting the two together, authorities could assess whether the platforms were trustworthy. In any case, competition in the market, Cerf insisted, would keep Alphabet's two largest products honest.

As far as competition goes, the company commands a 90 per cent market share in search, according to Statcounter, and around a 70 per cent market share in video sharing, according to Datanyze. With popularity like that, the organisation must be doing something right, Cerf implied.

The Chocolate Factory is currently battling the EU in court over a €2.4bn fine for allegedly promoting its shopping search engine over smaller rivals. And over in the US last month, the Federal Trade Commission ordered Amazon, Apple, Facebook, Google and Microsoft to provide detailed information about their acquisitions of small companies.

EU competition chief Margrethe Vestager said last year, in relation to EU action around regulating Google: "[W]hen platforms do act as regulators, they ought to set the rules in a way that keeps markets open for competition. But experience shows that instead, some platforms use that power to harm competition, by helping their own services." ®
