Added by Erik West on October 25, 2011
The Stanford University School of Engineering confirmed yesterday that John McCarthy, the inventor of the LISP programming language and widely considered the father of artificial intelligence, died on Sunday, October 23.
McCarthy is credited with coining the term “artificial intelligence” (AI) in 1955, founding a new area of research along with other researchers including Marvin Minsky and Claude Shannon. While today’s AI work focuses largely on machine learning, McCarthy’s goal was a machine that could pass the Turing test; either way, AI continues to be a rich source of research and development.
Just three years later, in 1958, McCarthy specified the LISP programming language, which is often referred to as the “greatest single programming language ever designed”. The name LISP comes from combining the words “LISt” and “Processing”, and the language was heavily influenced by lambda calculus. LISP is distinctive in that it uses lists both as its major data structure and as the representation of its own source code – the programming language itself.
LISP was the first programming language that could interpret itself, meaning that a LISP program can read and evaluate its own programming instructions the same way it handles data – without requiring programmers to write a lot of code capable of parsing the language’s structure. That capability led to the creation of a LISP compiler – a program that converts human-readable source code into executable applications – written in the LISP programming language itself. LISP also laid the groundwork for “domain specific languages” (DSLs), which have only recently become known to mainstream computer programmers.
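The “code as data” idea can be sketched in a few lines of Python (used here as an illustrative stand-in, since the point survives translation): programs are nested lists, and a tiny evaluator walks those lists directly. The operator set and environment handling below are hypothetical simplifications for illustration, not McCarthy’s actual LISP.

```python
def evaluate(expr, env):
    """Evaluate a LISP-like expression represented as nested Python lists."""
    if isinstance(expr, (int, float)):   # numbers evaluate to themselves
        return expr
    if isinstance(expr, str):            # symbols are looked up in the environment
        return env[expr]
    op, *args = expr                     # a list is an operator applied to arguments
    if op == "quote":                    # quote returns its argument unevaluated: code as data
        return args[0]
    if op == "if":                       # (if test then else)
        test, then_branch, else_branch = args
        return evaluate(then_branch if evaluate(test, env) else else_branch, env)
    if op == "+":
        return sum(evaluate(a, env) for a in args)
    raise ValueError(f"unknown operator: {op}")

# The same nested list can be run as code, or handed around as plain data via quote:
program = ["if", ["+", 1, -1], ["quote", "yes"], ["+", "x", 2]]
print(evaluate(program, {"x": 40}))   # the (+ 1 -1) test is 0, so the else branch runs: 42
```

Because the program is just a list, other code can build, inspect, or transform it before evaluating it – which is exactly what makes a LISP interpreter written in LISP so compact.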
McCarthy created the Stanford AI Laboratory, which is credited with many innovations, and he continued to serve there for more than 40 years.
McCarthy also proposed a time-sharing model of computing, in which computing capability could be sold much as utilities like water and electricity are. The basic idea of time-sharing is embedded in modern technologies like the Internet and, since about the year 2000, cloud computing.
McCarthy received a number of awards, including the Turing Award in 1971 and, more recently, the Benjamin Franklin Medal in Computer and Cognitive Science in 2003.
In 2005, McCarthy was quoted as saying, “Understanding intelligence is a difficult scientific problem, but lots of difficult scientific problems have been solved. There’s nothing humans can do that humans can’t make computers do” – an optimism shared by many computer researchers, scientists, and programmers around the world.