Hardware approaches to machine learning – FPGA, optical computing, quantum computing


A computer can be described as “a machine that handles information represented by mathematics.”

Here, “mathematics” does not mean symbols such as “1” or “2” themselves, but the abstract concepts they represent. The written characters “1” and “2” are numerals; the “numbers” they stand for are things that cannot be seen (abstract concepts).

In his “History of Numbers,” Denis Guedj describes numbers as follows.

“The human eye cannot grasp, at a glance, quantities of more than about five things. Even those who laugh when they hear that a young child’s concept of number is ‘one,’ ‘two,’ and ‘many’ cannot distinguish five things from six at a glance. Perhaps man invented number because of this weakness of the eye.”

This would mean that number was necessary for humans to deal with what they could not grasp with their senses alone. Before the invention of numbers, people could understand and communicate the quantity and size of things only to the extent that their own senses allowed.

In contrast, with the acquisition of the concept of number, a concept that can be shared with others, it became possible to obtain precise (numerical) knowledge and information beyond one’s own senses. Information represented by numbers can describe, through calculation, not only the present and the past but also the future and even hypothetical situations.

Currently, most computers use electricity to represent and calculate numbers for ease of control, but other media such as steam, light, magnetism, and quantum mechanics have also been used, each with its own characteristics.

In this blog, after describing the basic principles of computing, starting with how computers work, we will focus on machine learning as the target of computation: first machine learning on the simplest of machines, the Raspberry Pi, and then the use of FPGAs and ASICs as computers dedicated to machine learning. Finally, I will discuss quantum computers and optical computers as next-generation computers.

The details are as follows.

The English mathematician Turing came up with the “Turing machine,” a mathematical model of a universal computer, to solve Hilbert’s “decision problem” (Entscheidungsproblem). This Turing machine became the mathematical basis for guaranteeing the universal applicability of computers.
Using the Turing machine, Turing thoroughly examined the act of computation. He showed that giving a procedure and being able to compute are the same thing. Such a procedure is called an algorithm, and its concrete expression is what we now call software.
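To make the idea concrete, here is a minimal sketch of a Turing machine in Python (an illustration I am adding, not taken from the book): a tape, a head, a state, and a transition table are enough to express any algorithm, which is precisely the point of Turing’s model.

```python
# Minimal one-tape Turing machine simulator (illustrative sketch).
# transitions maps (state, symbol) -> (new_state, write_symbol, move),
# where move is -1 (left), +1 (right), or 0 (stay).

def run_turing_machine(tape, transitions, state="start", blank="_", max_steps=1000):
    """Run the machine until it reaches the 'halt' state, then return the tape."""
    tape = dict(enumerate(tape))  # sparse tape indexed by head position
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        state, write, move = transitions[(state, symbol)]
        tape[head] = write
        head += move
    cells = [tape[i] for i in sorted(tape)]
    return "".join(cells).strip(blank)

# Example machine: invert every bit of the input, halting at the first blank.
invert = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("halt", "_", 0),
}

print(run_turing_machine("1011", invert))  # -> 0100
```

The transition table plays the role of the “procedure”: changing it, not the machine, changes what is computed, which is the sense in which the procedure is the software.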
This book explains the Turing machine as the principle underlying computers, gives an easy-to-understand account of the famous “halting problem,” by which Turing settled the decision problem in the negative, and also covers computational complexity and the “P = NP problem,” one of the seven Millennium Prize Problems.

  • Thinking Machine Computers
  • Machine Learning Starting with the Raspberry Pi

Going one step beyond “Introduction to FPGAs for Software Engineers: Machine Learning,” I would like to discuss the process of designing semiconductor chips, as described in “Computational Elements of Computers and Semiconductor Chips,” and semiconductor chips specialized for AI applications.

FPGA stands for Field Programmable Gate Array and refers to reconfigurable hardware comprising logic blocks, DSP blocks, RAM, and dedicated hardware macros. Two vendors, Xilinx and Intel (formerly Altera, acquired by Intel in 2015), dominate the market.

The technique of developing hardware such as FPGAs and ASICs from software descriptions has been around for a long time and is commonly referred to in the hardware industry as high-level synthesis.

In recent years, FPGA-related community workshops and AI/deep-learning workshops have featured presentations in which FPGAs are developed from software using high-level synthesis. High-level synthesis has become more prominent because the price of the tools themselves has come down and because tools have appeared that accept not only C but also languages such as Java, Python, and Ruby. As a result, it is becoming less and less necessary to use the dedicated hardware-description languages (such as Verilog HDL and VHDL) that FPGAs conventionally required.
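The core idea behind high-level synthesis can be sketched in plain Python (this is an illustration of the concept, not the API of any particular HLS tool): the tool takes an ordinary loop and, under an “unroll” directive, flattens it into a fixed dataflow that maps onto parallel hardware.

```python
# Illustrative sketch of what an HLS "loop unroll" transformation does.
# dot_loop is the software form; dot_unrolled is the dataflow an HLS tool
# would derive from it, corresponding to four parallel multipliers
# feeding a two-level adder tree on the FPGA fabric.

def dot_loop(a, b):
    """Sequential form: one multiply-accumulate per iteration."""
    acc = 0
    for x, y in zip(a, b):
        acc += x * y
    return acc

def dot_unrolled(a, b):
    """Fully unrolled form for fixed length 4: the four products are
    independent (parallel in hardware), then reduced by an adder tree."""
    p0, p1, p2, p3 = a[0] * b[0], a[1] * b[1], a[2] * b[2], a[3] * b[3]
    return (p0 + p1) + (p2 + p3)

a, b = [1, 2, 3, 4], [5, 6, 7, 8]
print(dot_loop(a, b), dot_unrolled(a, b))  # both 70
```

Dot products of exactly this shape are the inner kernel of neural-network inference, which is why loop unrolling is the first optimization applied when mapping machine learning onto FPGAs.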

Conventional computer architectures, especially microprocessor architectures, have relied on Moore’s law, which predicts that the number of transistors on a die doubles roughly every 18 months to two years as semiconductor technology improves. Because transistor gate lengths shrink, the clock cycle time can be shortened, and execution performance (the time required for execution) scales accordingly. In other words, one could roughly estimate that in a given number of years this much die area would be available, wiring delays would permit this operating clock frequency, and therefore a processor implementing this many functions would deliver this much performance.
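The scaling estimate above is simple compound growth; a back-of-the-envelope calculation (my own illustration, with the doubling period as a parameter since quoted values range from 18 to 24 months) shows how quickly it adds up.

```python
# Back-of-the-envelope Moore's-law scaling: transistor count grows by a
# factor of 2^(years / doubling_period).

def transistor_growth(years, doubling_period=2.0):
    """Factor by which transistor count grows over `years`."""
    return 2 ** (years / doubling_period)

print(round(transistor_growth(10)))  # 2^5 = 32x over a decade
```

This kind of predictability is exactly what made performance roadmaps possible, and its breakdown is one motivation for the specialized hardware discussed in this post.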

This focuses on quantum computers that use the quantum annealing method, explaining in an easy-to-understand manner how this new type of quantum computer works, what kinds of calculations it performs, and how it can be applied to artificial intelligence.

    Quantum computers have attracted renewed attention since a Google research team reported a demonstration of “quantum supremacy.” How could they change our world? This special issue provides an overview of the current state of quantum information science, from its history and theoretical foundations to its latest achievements, and considers the future of the quantum age from a variety of perspectives, including political economy, philosophy, and literature.

    An optical computer is a computer that performs calculations using light instead of electrons. Its advantage over conventional computers is that optical processing consumes far less power than electronic processing, which generates heat and consumes enormous amounts of energy. Another advantage is that spatially parallel processing is possible, which can dramatically increase processing speed.

    On the other hand, it is difficult to build large-scale logic circuits comparable to electronic ones, so a practical optical computer has not yet been realized.

    One of the most famous examples of an optical computer in science fiction is the crystal technology of the Kryptonians, Superman’s home planet: in the scene where Superman builds his Fortress of Solitude in the Arctic, the crystals multiply themselves to construct the base using this technology. The underlying elemental technologies are theoretically real, and I will discuss them individually below.

    An introduction to the optical neural-network chip technology announced in 2017, on the basis of which two ventures, Lightelligence and Lightmatter, were founded. The chips are characterized by energy consumption several orders of magnitude lower than that of ordinary chips and speeds tens to hundreds of times faster, and working chips have actually been built.
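In a simplified model of how such chips compute (my own illustration, idealized and lossless), each Mach-Zehnder interferometer acts on two optical modes roughly like a 2x2 rotation, and a mesh of such devices composes into a larger matrix. The chip therefore performs matrix-vector products, the dominant operation in neural networks, simply as light propagates.

```python
import math

# Idealized model of a photonic mesh element: one Mach-Zehnder
# interferometer behaves like a 2x2 rotation of the two optical modes.

def mzi(theta):
    """2x2 rotation implemented by one interferometer (idealized)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

def apply(matrix, vec):
    """Matrix-vector product, the operation light performs in the mesh."""
    return [sum(m * v for m, v in zip(row, vec)) for row in matrix]

# Two cascaded MZIs: the rotations compose, so the angles add.
x = [1.0, 0.0]
y = apply(mzi(math.pi / 8), apply(mzi(math.pi / 8), x))
# Equivalent to a single pi/4 rotation of the input signal:
print(y)  # ~[0.7071, 0.7071]
```

Larger meshes compose many such 2x2 stages to realize arbitrary unitary matrices, which is the scheme behind the 2017 optical neural-network demonstration.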

    A paper published in 1978 proposed multilayer learning in the volume direction using photorefractive crystals, whose internal electric field changes and persists depending on the input light. Photorefractive materials include ferroelectrics, compound semiconductors, and organic polymers (e.g., liquid crystals). Their response time is slow (milliseconds to seconds) because it depends on internal charge transfer.

    An explanation of the “Colloidal diamond” paper published in Nature in 2020, in which photonic crystals that control the propagation of light at the nanoscale are generated and controlled by DNA-guided self-assembly.
