LISP and Artificial Intelligence


About LISP and Artificial Intelligence

IPL (Information Processing Language), an early functional programming language created in 1956, was based on the "lambda calculus" and the recursive functions proposed by Church and Kleene in the 1930s. Following it, LISP was born from John McCarthy's research in 1958. LISP has since been reincarnated as a number of languages and is still in use today.

From "First LISP Functional Programming: From Lambda Calculus to Refactoring".

McCarthy, the creator of LISP, believed that artificial intelligence could be built with a language that could both manipulate functions themselves and reason logically. The original LISP was created as a procedural language that realized the former.

In its earliest form, LISP's cons cells, the origin of today's binary tree structure, had fields that depended on the hardware architecture of the time, such as car, cdr, cpr, cir, csr, and ctr, so the structure was not yet a clean binary tree. As abstraction progressed from these machine-dependent structures toward a high-level language, programs came to be written as M-expressions, which resemble mathematical function notation, while data and the internal structure of a program were written as S-expressions. As the language evolved toward simpler forms, M-expressions were eliminated, and both programs and data came to be written as S-expressions.

The structure of S-expressions makes it easy to approach the semantics of a language directly, without fussing over "grammatical design" that makes only superficial differences when creating an experimental language. Having only the minimum grammatical rules needed to express symbolic structure makes the language semantically transparent: almost any linguistic semantics can easily be layered on top of S-expressions, and once you are used to S-expressions you can get straight to the essence of a programming language.

I believe the above can be likened to chapter 42 of the Lao Tzu, "The Way gives rise to the One, the One gives rise to the Two, the Two gives rise to the Three, and the Three gives rise to all things": the Way gives rise to nil, nil gives rise to atoms, atoms give rise to S-expressions, and S-expressions give rise to all things.

In this way, the S-expression used to express a LISP program is, as it is, also the notation for the data LISP handles. In other words, programs and data are expressed in the same form ("program as data"). S-expressions written as programs can be processed directly as data, and S-expressions that were data can (suddenly) be interpreted as programs.
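The program-as-data idea can be sketched in a few lines. The following is a minimal illustration in Python (used here, as elsewhere on this blog, alongside LISP): S-expressions are modeled as nested lists, and the very same list can be returned untouched as data (via `quote`) or evaluated as a program.

```python
def s_eval(expr, env):
    """Evaluate an S-expression represented as nested Python lists."""
    if isinstance(expr, str):          # a symbol: look it up
        return env[expr]
    if not isinstance(expr, list):     # a literal, e.g. a number
        return expr
    op, *args = expr
    if op == "quote":                  # (quote x) returns x unevaluated: data
        return args[0]
    vals = [s_eval(a, env) for a in args]
    return env[op](*vals)

env = {"+": lambda a, b: a + b, "*": lambda a, b: a * b, "x": 10}

program = ["+", "x", ["*", 2, 3]]      # the program (+ x (* 2 3)) ...
print(s_eval(program, env))            # ... evaluated: 16
print(s_eval(["quote", program], env)) # ... or returned unchanged, as data
```

The same list literal plays both roles; whether it is "program" or "data" is decided only by how it is handed to the evaluator.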

The first language with the program-as-data property was machine language. Some machine-language programs realized subroutine calls by rewriting themselves; other techniques included tricky self-rewriting to reduce instruction counts or to achieve complex behavior.

It can also be said that program as data faithfully carries the true meaning of von Neumann's stored-program concept into high-level languages. The "von Neumann bottleneck" that John Backus criticized in his Turing Award lecture is the access bottleneck that arises because program space and data space share a single path to the CPU; it is not a criticism of the stored-program idea itself.

The fact that von Neumann knew the proof of Gödel's incompleteness theorem and was thinking about self-reproducing automata suggests that he understood the essence of the program-as-data principle: that programs and data can mutually exchange positions.

Program as data is also useful for extending the language itself directly or indirectly.

The significance of LISP's self-extensibility lies in its power of abstraction. Defining a new function or macro directly enlarges the language's abstract vocabulary and structure, so conceptual abstraction can advance rapidly without changing the skeleton of the language.
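At its core, a macro is simply a function from code to code. As a rough sketch (again using nested Python lists to stand in for S-expressions, with a hypothetical `when` form of my own choosing rather than any particular Lisp's definition), a new `when` construct can be added by rewriting it into the core `if` form before evaluation, extending the language's vocabulary without touching its skeleton:

```python
def expand_when(form):
    """Rewrite (when test body...) into (if test (progn body...) nil)."""
    _, test, *body = form
    return ["if", test, ["progn", *body], "nil"]

code = ["when", ["hungry?"], ["eat"], ["nap"]]
print(expand_when(code))
# → ['if', ['hungry?'], ['progn', ['eat'], ['nap']], 'nil']
```

Because code is just list data, the macro expander needs no special machinery: it is ordinary list manipulation performed before evaluation.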

The following topics are discussed.

General

  • Overview of Code as Data and Examples of Algorithms and Implementations

"Code as Data" is a concept or approach that treats a program's code itself as data, allowing programs to be manipulated, analyzed, transformed, and processed as data structures. Normally, a program receives an input, executes a specific procedure or algorithm on it, and outputs the result. In "Code as Data," on the other hand, the program itself is treated as data and manipulated by other programs. This allows programs to be handled more flexibly, dynamically, and abstractly.
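Even outside Lisp, this approach is available wherever a program can be reified as a data structure. As one illustrative sketch, Python's standard `ast` module parses source text into a syntax tree that another program can walk and transform before executing the result (the source string and the Add-to-Mult rewrite here are purely illustrative):

```python
# "Code as Data": parse source into an AST, transform it as a data
# structure, then compile and execute the transformed tree.
import ast

source = "result = 2 + 3"
tree = ast.parse(source)

# Walk the tree as data: replace every addition with multiplication.
for node in ast.walk(tree):
    if isinstance(node, ast.BinOp) and isinstance(node.op, ast.Add):
        node.op = ast.Mult()

ast.fix_missing_locations(tree)
namespace = {}
exec(compile(tree, "<transformed>", "exec"), namespace)
print(namespace["result"])  # → 6
```

The difference from Lisp is one of ergonomics rather than principle: in Lisp the program already is a list, while here it must first be parsed into a tree.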

Technical Topics

In order to program, it is necessary to set up a development environment for each language. This section describes how to set up specific development environments for Python, Clojure, C, Java, R, LISP, Prolog, Javascript, and PHP, as described in this blog. Each language has its own platform to facilitate development, which makes it easy to set up an environment, but this section focuses on the simplest case.

From “Practical Common Lisp,” a reference book on LISP, a programming language for artificial intelligence technologies. Reading notes are included.

Using ChatGPT, various algebraic problems can be solved. Since ChatGPT not only gives simple answers but also shows solutions by applying various formulas, it can give the illusion of being a universal AI.

However, when we actually check the calculations, we find that the answers given by ChatGPT are sometimes incorrect. This is because ChatGPT only predicts which token is likely to appear next based on a vast amount of training data; it does not perform the underlying calculation.

From "First LISP Functional Programming", a reference book on LISP, a programming language for artificial intelligence technology. Reading notes are included.

Lisp has recently been re-evaluated as a powerful and practical programming language. The original of this book, "On Lisp", is a famous text that thoroughly explains macro programming, the source of Lisp's power. It was translated by Kai Noda with the permission of the author, Paul Graham, and the Japanese translation was made available on the Internet.

Eliza is a "classic" AI system developed by Joseph Weizenbaum at MIT and published in 1966. It is among the oldest AI systems, following the General Problem Solver (GPS) developed by Herbert Simon and Allen Newell in 1957.

ELIZA was one of the first natural language processing programs to feature input and output in English, and was named after the female protagonist in the play Pygmalion.
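ELIZA's core technique, matching keyword patterns in the input and filling the captures into canned response templates, can be sketched in a few lines. The rules below are illustrative inventions, not Weizenbaum's actual DOCTOR script:

```python
import re

# Each rule: a regex pattern over the (normalized) input, and a
# response template that receives the pattern's captured groups.
rules = [
    (r"i am (.*)", "Why do you say you are {0}?"),
    (r"i feel (.*)", "Do you often feel {0}?"),
    (r"(.*) mother(.*)", "Tell me more about your family."),
]

def respond(utterance):
    text = utterance.lower().rstrip(".!?")
    for pattern, template in rules:
        m = re.match(pattern, text)
        if m:
            return template.format(*m.groups())
    return "Please go on."   # default when no rule matches

print(respond("I am tired."))  # → Why do you say you are tired?
print(respond("I feel lost"))  # → Do you often feel lost?
```

A fuller version would also swap pronouns ("my" to "your", and so on) in the captured text, which is what gives ELIZA much of its conversational illusion.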

  • General Problem Solver and Application Examples, Implementation Examples in LISP and Python

The general problem solver specifically takes as input the description of the problem and constraints, and operates to execute algorithms to find an optimal or valid solution. These algorithms vary depending on the nature and constraints of the problem, and there are a variety of general problem-solving methods, including numerical optimization, constraint satisfaction, machine learning, and search algorithms. This section describes examples of implementations in LISP and Python for this GPS.
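The means-ends analysis at the heart of GPS can be sketched very compactly, in the spirit of the simplified GPS in Norvig's Paradigms of Artificial Intelligence Programming: each operator carries preconditions plus add/delete lists, and a goal is achieved by recursively achieving the preconditions of some operator that adds it. The operators and states below are illustrative, and this naive version ignores interactions between goals:

```python
# Each operator: preconditions that must hold, facts it adds, facts it removes.
ops = [
    {"action": "drive-to-school", "pre": {"car-works"},
     "add": {"at-school"}, "del": {"at-home"}},
    {"action": "fix-car", "pre": {"have-tools"},
     "add": {"car-works"}, "del": set()},
]

def achieve(state, goal, plan):
    """Achieve `goal` by recursive means-ends analysis, mutating state/plan."""
    if goal in state:
        return True
    for op in ops:
        if goal in op["add"]:
            if all(achieve(state, p, plan) for p in op["pre"]):
                state -= op["del"]
                state |= op["add"]
                plan.append(op["action"])
                return True
    return False

state = {"at-home", "have-tools"}
plan = []
achieve(state, "at-school", plan)
print(plan)  # → ['fix-car', 'drive-to-school']
```

The solver never plans "drive" directly; it first reduces the difference between the current state and the operator's preconditions, which is exactly the means-ends idea.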

This is a book about functional programming in Scheme (a dialect of LISP), and it has a sequel, "The Seasoned Schemer". It is followed by "The Reasoned Schemer", a book explaining miniKanren, a DSL for logic programming, which was previously mentioned in "Clojure core.logic and miniKanren".

The contents include recursion, anonymous (lambda) functions, the Y combinator, and a simple interpreter.
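Of those topics, the Y combinator is the most striking: it derives recursion from anonymous functions alone. A sketch in Python (using the applicative-order Z variant, since Python evaluates arguments strictly):

```python
# Z combinator: a fixed-point combinator usable in a strict language.
Y = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

# Factorial defined with no self-reference anywhere: `self` is supplied
# by the combinator, not by a named recursive definition.
fact = Y(lambda self: lambda n: 1 if n == 0 else n * self(n - 1))
print(fact(5))  # → 120
```

The same construction, written in Scheme, is one of the payoffs the book builds up to chapter by chapter.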

The idea that "thinking about computers is one of the most exciting things the human mind can do" is what sets The Little Schemer (formerly known as The Little LISPer) and its new sister volume, The Seasoned Schemer, apart from other books on LISP. The authors' enthusiasm for the subject is evident as they present abstract concepts in a humorous and understandable way. This book will open new doors of thinking for anyone who wants to know what a computer is.

The purpose of The Reasoned Schemer is to help functional programmers think logically and logic programmers think functionally. The authors believe that logic programming is a natural extension of functional programming, and by extending Scheme, a functional language, with logical constructs, they show that the advantages of both styles can be combined. This extension encompasses most of the ideas of the logic programming language Prolog.

A post about GPS from Seiji Koide's blog, Semantic Web Diary.

A blog about various Lisp-based development projects, including automated browser operation using Selenium and LISP, an implementation of the tableau method, and prototyping and benchmarking of Lisp processors in various languages.
