Father of Lisp and AI John McCarthy has died

Early computing pioneer undergoes final upload

Stanford University has confirmed that John McCarthy, the inventor of the LISP programming language and one of the pioneers of artificial intelligence (AI), has died at the age of 84.

Among developers, McCarthy may be best known as the inventor of Lisp, which he devised in 1958 while at MIT and published in the seminal paper "Recursive Functions of Symbolic Expressions and Their Computation by Machine, Part I". Lisp was originally developed for AI applications, but it was quickly adopted by the wider industry, gained enormous popularity among developers, and lives on today in dialects such as Common Lisp and Scheme.

McCarthy developed Lisp between 1956 and 1958, while trying to build an algebraic list-processing language for artificial intelligence work on the IBM 704 computer. He wanted to compute with symbolic expressions rather than numbers, and to use that as the basis for building AI systems.
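For readers who never wrote any Lisp, here is a minimal sketch, in modern Common Lisp rather than anything McCarthy actually wrote, of the sort of symbolic computation he was after: taking the derivative of a simple algebraic expression held as a nested list. The deriv function and the sample expression are illustrative assumptions, not code from his papers.

;; Differentiate a symbolic expression (a nested list) with respect to var.
(defun deriv (expr var)
  (cond ((numberp expr) 0)                  ; numeric constants differentiate to 0
        ((eq expr var) 1)                   ; d var / d var = 1
        ((symbolp expr) 0)                  ; other symbols are treated as constants
        ((eq (first expr) '+)               ; sum rule: (+ u v) -> (+ u' v')
         (list '+ (deriv (second expr) var)
                  (deriv (third expr) var)))
        ((eq (first expr) '*)               ; product rule: (* u v) -> (+ (* u' v) (* u v'))
         (list '+
               (list '* (deriv (second expr) var) (third expr))
               (list '* (second expr) (deriv (third expr) var))))
        (t (error "unrecognised expression: ~a" expr))))

;; (deriv '(* x (+ x 3)) 'x)
;; => (+ (* 1 (+ x 3)) (* x (+ 1 0)))

The point is that the program's input, its output and its own source code all share the same list notation, which is what made Lisp such a natural fit for manipulating symbolic expressions.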

John McCarthy, father of Lisp and AI pioneer, has died at the age of 84

“It became clear that this combination of ideas made an elegant mathematical system as well as a practical programming language,” he later wrote. “Then mathematical neatness became a goal and led to pruning some features from the core of the language. This was partly motivated by esthetic reasons and partly by the belief that it would be easier to devise techniques for proving programs correct if the semantics were compact and without exceptions.”

McCarthy also coined the term AI, describing it in 1955 as "the science and engineering of making intelligent machines". He was one of the most active academics in the field, publishing numerous papers on the topic and founding the Stanford Artificial Intelligence Laboratory, also known as SAIL, in 1962.

He was certainly optimistic about the future of AI, predicting that a computer would beat a human at chess by the 1970s, something that wasn't achieved until decades later. He was also instrumental in bringing together academic talent in the area, and in snarkily debunking some AI claims.

“It's difficult to be rigorous about whether a machine really 'knows', 'thinks', etc., because we're hard put to define these things,” he wrote in 1979. “We understand human mental processes only slightly better than a fish understands swimming.”

In 1971 he was awarded the Turing Award for his pioneering work in AI, and he continued working in the field until his semi-retirement from Stanford in 2000. He also published a small amount of science fiction and commented on future technologies, predicting that the achievement of AI systems and the ability to manipulate genetic code would be the leading scientific developments of the 21st century.

However, he was also alert to the dangers of pseudoscience, warning that humanity was leaving itself vulnerable by neglecting key areas of development such as nuclear power and stem cell research, while the general populace was being led astray by poor scientific understanding.

“He who refuses to do arithmetic is doomed to talk nonsense,” he wrote in a 1995 paper on progress and sustainability.

Born to socialist immigrant parents, McCarthy taught himself mathematics at an early age, earned his PhD from Princeton at the age of 24, and worked with some of the giants of the field, including Marvin Minsky and John Nash. He is survived by his wife. ®
