
Is this paragraph from Trump or an AI bot? You decide, plus buy your own AI for $399

Also Uber to Waymo - I wish I could quit you!

Roundup Hello, and welcome to this week’s roundup of AI news. Read on for a fun and, frankly, worrying quiz that tests whether you can tell if something was made up by an AI text-generation model or said by Trump, and more.

AI computation from 1959 to 2018: OpenAI has analysed the amount of computing power needed to build AI systems in research labs from 1959 to the present day. Although the field has been through periodic cycles of booms and busts, one thing has remained constant: the need for ever more hardware.

The graph below shows how many petaflop/s-days of compute were required to train various models. Follow the line from the 1959 Perceptron in the bottom left-hand corner and you can see the growth is steady until a sudden sharp rise in 2012 with AlexNet.

[Graph: Compute used to train notable AI systems, 1959 to 2018]

Image credit: OpenAI

“Starting from the perceptron in 1959, we see a ~2-year doubling time for the compute used in these historical results—with a 3.4-month doubling time starting in ~2012,” OpenAI said.

“It’s difficult to draw a strong conclusion from this data alone, but we believe that this trend is probably due to a combination of the limits on the amount of compute that was possible to use for those results and the willingness to spend on scaling up experiments.”

The two-year doubling is consistent with Moore’s Law, the observation that the number of transistors on a microchip doubles roughly every 18 months to two years. That progression has since slowed, and whether the so-called law is truly dead remains hotly debated, but the slowdown hasn’t curbed AI’s appetite for compute.

AlexNet, a convolutional neural network, burst onto the scene in 2012, marking the rise of neural networks. These architectures require crunching through vast numbers of matrix calculations, and as they’ve grown in size, so has their appetite for CPUs, GPUs, TPUs, FPGAs - you name it.

“The trend represents an increase by roughly a factor of 10 each year. It’s been partly driven by custom hardware that allows more operations to be performed per second for a given price (GPUs and TPUs), but it’s been primarily propelled by researchers repeatedly finding ways to use more chips in parallel and being willing to pay the economic cost of doing so,” it added.
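
For a sense of scale, a two-year doubling time compounds to roughly 1.4x more compute per year, while a 3.4-month doubling compounds to a little over 10x per year, which is where that “factor of 10” figure comes from. Here’s the back-of-the-envelope arithmetic (ours, not OpenAI’s), as a quick Python sketch:

    # Back-of-the-envelope arithmetic (not OpenAI's code): convert a doubling
    # time into the equivalent annual growth factor for training compute.

    def annual_growth_factor(doubling_time_months: float) -> float:
        """Compute grows by a factor of 2 ** (12 / doubling_time_months) per year."""
        return 2 ** (12 / doubling_time_months)

    print(f"2-year doubling:    {annual_growth_factor(24):.2f}x per year")   # ~1.41x
    print(f"3.4-month doubling: {annual_growth_factor(3.4):.1f}x per year")  # ~11.6x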

You can read more about that here.

Uber is still using Waymo self-driving tech: Uber will probably have to strike a financial deal with Waymo to license its rival’s self-driving car software and keep using it.

The ride-hailing giant filed a quarterly securities report this week, revealing that it is still using knowledge taken from its competitor’s efforts to build a fully autonomous vehicle. Having to start from scratch “could limit or delay [its] production of autonomous vehicle technologies,” according to Reuters.

Both companies have been embroiled in a string of legal battles over intellectual property, with one engineer, Anthony Levandowski, right at the centre. Although Uber and Waymo privately settled the matter in February last year, a separate lawsuit accuses Levandowski of 33 counts of theft and attempted theft of trade secrets. The former Waymo engineer is accused of downloading 14,000 files describing the self-driving car biz’s LiDAR sensing technology before he left to found the self-driving truck startup Otto, which Uber then bought, bringing him on board. https://www.theregister.com/2019/08/27/anthony_levandowski_33_criminal_charges_uber/

As part of the lawsuit that ended in the settlement, an expert conducted a review of Uber’s internal software and confirmed that the company is still using Waymo’s technology.

The smallest AI edge computer will cost you $399: Nvidia has launched the Jetson Xavier NX, a tiny computer that can run multiple neural networks at the same time, or so the company claims.

The credit-card-sized board delivers up to 21 trillion operations per second (TOPS) at 15 watts of power. It can also run at a more modest 14 TOPS at 10 watts, while still handling more than one neural network at a time.
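
Incidentally, both of those operating points work out to the same efficiency, about 1.4 TOPS per watt. A quick sketch of the arithmetic (ours, derived from the figures Nvidia quotes, not from the spec sheet):

    # Efficiency check using the performance figures quoted above
    # (our arithmetic, not Nvidia's spec sheet).
    for tops, watts in [(21, 15), (14, 10)]:
        print(f"{tops} TOPS at {watts} W -> {tops / watts:.1f} TOPS per watt")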

Here are its specs:

  • GPU: NVIDIA Volta with 384 NVIDIA CUDA cores and 48 Tensor Cores, plus 2x NVDLA
  • CPU: 6-core Carmel Arm 64-bit CPU, 6MB L2 + 4MB L3
  • Video: 2x 4K30 Encode and 2x 4K60 Decode
  • Camera: Up to six CSI cameras (36 via virtual channels); 12 lanes (3x4 or 6x2) MIPI CSI-2
  • Memory: 8GB 128-bit LPDDR4x; 51.2GB/second
  • Connectivity: Gigabit Ethernet
  • OS Support: Ubuntu-based Linux
  • Module Size: 70x45mm

The Jetson Xavier NX is the newest addition to Nvidia’s Jetson series, and is useful for machine-learning enthusiasts tinkering around with robots and drones. It’s expected to be available in March next year.

You can read more about it here.

Hundreds duped by an AI model trained to mimic Donald Trump’s speech: Did the US president or a machine write this?

“So, that’s why I’ve been saying all along that, yes, I’d love to win. But, boy, do these guys want me to. These guys, they don’t talk about it. They say, “Donald Trump, please, please run.” Because they’ll take away your tax cuts, because they’ll take away your regulation cuts. They’ll take them away. And, frankly, really, really bad things will happen with our country. Our country would go down very quickly. Very quickly, very, very rapidly. The Democrats want to turn back the clock, which is, essentially, what they’ve done. They’ve turned back. We’ve gone much further left than anybody thought possible.”

It’s difficult to judge, and if you guessed Donald Trump then you’re not alone. In an experiment, 1,000 participants were asked to decide whether each of ten paragraphs of text was written by a language-generation model created by Salesforce or taken from the US President’s speeches. Only 40 per cent of participants were able to tell the difference between the two.

Text-generation models spit out sentences given a prompt. As you can see, the one fine-tuned to produce Trump-like text, known as RoboTrump, is pretty good. You could argue that since the President isn’t exactly known for his linguistic prowess, it’s probably quite simple to mimic his style of speech and vocabulary.
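
The underlying recipe is standard: take a pretrained language model, fine-tune it on the target’s speech, and sample continuations from a prompt. Here’s a minimal sketch of that sampling step using the Hugging Face transformers library, with off-the-shelf GPT-2 standing in for the Salesforce-built RoboTrump model, which isn’t something we have access to:

    # Minimal text-generation sketch. GPT-2 is used purely as a stand-in; the
    # actual RoboTrump model was built by Salesforce and is assumed not to be
    # publicly available.
    from transformers import pipeline

    generator = pipeline("text-generation", model="gpt2")

    prompt = "So, that's why I've been saying all along that"
    result = generator(prompt, max_length=60, do_sample=True, num_return_sequences=1)
    print(result[0]["generated_text"])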

The folks over at Lawsuit.org, a legal advice site, have developed the Trump vs RoboTrump test for people to take. You can try your hand at the quiz here.

Although it’s a bit of fun, it’s also a worrying sign of how these text-generation models can be tweaked to churn out disinformation. After the test, 43 per cent of participants said they were more concerned about how fake content produced by AI models could affect the upcoming Presidential election in 2020. ®
