Technology Discussion – A look at some of the latest technologies


This section discusses various topics found on the web about the latest technologies (biotechnology, energy, physics, agriculture, chemistry, astronomy, brain science, quantum sensing, robotics, etc.) other than the artificial intelligence technologies that are the subject of this blog.

  • Issues and trends in semiconductor technology and AI technology

Chip miniaturisation, the most significant challenge in semiconductor technology, involves many issues in both the manufacturing process and design. As described in "Overview of semiconductor manufacturing technology and application of AI technology", a semiconductor device is structured so that switching is controlled by applying a voltage to a gate, formed via an insulator above the channel through which current flows between the drain and the source.
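
As a rough illustration of this gate-controlled switching, the sketch below evaluates the textbook square-law model of an n-channel MOSFET; the threshold voltage and transconductance values are invented for the example.

```python
import numpy as np

# Rough square-law model of an n-channel MOSFET (illustrative values only):
# below the threshold voltage V_th the channel is off; above it, the drain
# current grows with the gate-source voltage, which is the switching
# behaviour described above.

V_TH = 0.7   # threshold voltage [V] (hypothetical)
K = 2e-3     # transconductance parameter [A/V^2] (hypothetical)

def drain_current(v_gs: float, v_ds: float) -> float:
    """Drain current in the square-law model (saturation vs. triode)."""
    if v_gs <= V_TH:
        return 0.0                         # channel off: no conduction
    v_ov = v_gs - V_TH                     # overdrive voltage
    if v_ds >= v_ov:                       # saturation region
        return 0.5 * K * v_ov ** 2
    return K * (v_ov - 0.5 * v_ds) * v_ds  # triode (linear) region

for v_gs in np.linspace(0.0, 2.0, 5):
    print(f"V_GS={v_gs:.2f} V -> I_D={drain_current(v_gs, 1.5):.6f} A")
```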

I will discuss the process of designing semiconductor chips, as described in "Computational Elements of Computers and Semiconductor Chips", and semiconductor chips specialised for AI applications, which go one step beyond "Introduction to FPGAs for Software Engineers: Machine Learning".

Turing's theory of computation, proposed by Alan Turing, formalises the fundamental concepts of computing. It provides the basis for understanding how computers work and what computation is, centred on the abstract machine now known as the Turing machine.
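
As a minimal illustration of the idea, here is a sketch of a Turing machine simulator in Python; the transition table implements a toy unary-increment machine and is purely illustrative.

```python
# Minimal Turing machine simulator: a finite control reads/writes a tape
# and moves left or right. The transition table below implements a toy
# machine that appends a '1' to a block of 1s (unary increment).

def run_turing_machine(tape, transitions, state="q0", halt="halt"):
    tape = dict(enumerate(tape))  # sparse tape, blank symbol = "_"
    head = 0
    while state != halt:
        symbol = tape.get(head, "_")
        state, write, move = transitions[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape))

# (state, read) -> (next state, write, head move)
transitions = {
    ("q0", "1"): ("q0", "1", "R"),    # scan right over the 1s
    ("q0", "_"): ("halt", "1", "R"),  # write one more 1, then halt
}

print(run_turing_machine("111", transitions))  # -> "1111"
```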

A neural Turing machine is a model of computation that combines a neural network controller with the external-memory mechanism of a Turing machine.

A quantum computer is a form of computer that uses the principles of quantum mechanics to process information. The difference from a conventional computer is that, whereas conventional computers process information in binary units called "bits", a quantum computer uses a unit of information with quantum-mechanical properties called a "qubit" (quantum bit).
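
As a minimal sketch of what a qubit is, using plain numpy state vectors: applying a Hadamard gate to |0> yields an equal superposition, something a classical bit cannot represent.

```python
import numpy as np

# A qubit is a unit vector in C^2; unlike a classical bit it can be in a
# superposition of |0> and |1>. Applying a Hadamard gate to |0> gives a
# state that measures 0 or 1 with probability 1/2 each.

ket0 = np.array([1, 0], dtype=complex)          # |0>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate

state = H @ ket0                                # (|0> + |1>) / sqrt(2)
probs = np.abs(state) ** 2                      # Born rule
print("amplitudes:", state)                     # [0.707, 0.707]
print("P(0), P(1):", probs)                     # [0.5, 0.5]
```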

  • Soft machines and biocomputers

In the film Terminator, a shape-shifting robot, the T-1000, made of a fluid polycrystalline metal, appears. Many aspects of its working principle are unclear: it has no skeleton, and the location of its power source and the nature of its CPU are unknown. Soft machines are being studied as a step toward such devices.

Quantum physics is one of the fields of physics developed to elucidate phenomena and behaviors that cannot be explained within the framework of classical mechanics, and it is a theory that describes physical phenomena at microscopic scales (such as atoms and molecules). The connection between quantum physics and artificial intelligence technology has attracted much attention in recent years. In this article, I would like to discuss some of these perspectives.

Quantum information processing is a field that uses the principles of quantum mechanics to process information. Unlike conventional classical information processing, it uses qubits, basic units of information with quantum-mechanical properties. In this article, we will discuss quantum entanglement and quantum teleportation in the context of quantum communication.
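
As a small illustration of entanglement, the sketch below builds the Bell state (|00> + |11>)/sqrt(2) with numpy; the measurement probabilities show that outcomes on the two qubits are perfectly correlated.

```python
import numpy as np

# Quantum entanglement in state-vector form: Hadamard on qubit 0 followed
# by a CNOT produces the Bell state (|00> + |11>)/sqrt(2), whose
# measurement outcomes on the two qubits always agree.

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

psi = np.zeros(4); psi[0] = 1.0            # start in |00>
psi = CNOT @ np.kron(H, I2) @ psi          # Bell state
probs = np.abs(psi) ** 2                   # over |00>, |01>, |10>, |11>
print(dict(zip(["00", "01", "10", "11"], probs.round(3))))
# {'00': 0.5, '01': 0.0, '10': 0.0, '11': 0.5} -> perfectly correlated
```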

Brain-machine interface (BMI) is a generic term for devices that connect the brain and computers, either by detecting brain waves and other signals or, conversely, by stimulating the brain. OpenBCI is an open-source BMI platform.
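
As a hypothetical first step of such a pipeline (not OpenBCI's actual API), the sketch below estimates alpha-band power from a synthetic EEG signal with numpy; a real system would stream samples from the board instead.

```python
import numpy as np

# Toy first stage of an EEG-based BMI: estimate the power in the alpha
# band (8-12 Hz) of one channel via an FFT. The signal here is synthetic
# (a 10 Hz rhythm plus noise), standing in for streamed board samples.

fs = 250                                   # sampling rate [Hz]
t = np.arange(0, 4, 1 / fs)                # 4 seconds of data
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)

freqs = np.fft.rfftfreq(eeg.size, 1 / fs)
power = np.abs(np.fft.rfft(eeg)) ** 2 / eeg.size

alpha = (freqs >= 8) & (freqs <= 12)       # alpha-band bins
print(f"alpha-band power: {power[alpha].sum():.1f}")
```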

Although there seems to be no relationship between dreams and data science, dreams have been one of the sources of ideas in the development of machine learning and brain theory. Recently, data analysis using machine learning has made it possible to analyze (decode) the content of dreams from brain activity patterns during sleep. In this section, we trace the footsteps of dream research leading to dream decoding.
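
A toy version of this decoding idea, assuming scikit-learn is available: train a classifier to predict a content label from activity patterns. The "voxel" data below is synthetic, standing in for real fMRI recordings made during sleep.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Toy dream decoding: predict a content category (e.g. "face seen" or
# not) from brain activity patterns. The data is entirely synthetic.

rng = np.random.default_rng(0)
n_trials, n_voxels = 200, 50
labels = rng.integers(0, 2, n_trials)                   # content category
signal = np.outer(labels, rng.normal(0, 1, n_voxels))   # label-linked pattern
patterns = signal + rng.normal(0, 2, (n_trials, n_voxels))

decoder = LogisticRegression(max_iter=1000)
scores = cross_val_score(decoder, patterns, labels, cv=5)
print(f"decoding accuracy: {scores.mean():.2f}")        # well above chance
```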

Various models for emotion recognition have been proposed, as described in “Emotion recognition, Buddhist philosophy and AI”. In addition, a number of AI technologies such as speech recognition, image recognition, natural language processing and bioinformation analysis have been used to extract emotions. This section describes the details of these technologies.

This is an article about a small nuclear power generation module, a so-called "mini-reactor", being developed by the U.S. venture company NuScale Power. The concept is to combine modules that are 19.8 meters high and 2.7 meters in diameter, each with an output of 60 megawatts (one-tenth of a typical small nuclear power plant); a few units are enough for a small city.

Along with micro nuclear power generation, the fusion of nuclear-fusion technology and AI technology has become a hot topic in recent years. The basic principle is to use a model trained on simulation data and to control the fusion reactor through reinforcement learning, feeding back the impedance and current of the coils and the values of sensors installed in the reactor (presumably optical sensors that measure the shape and temperature of the plasma).
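
As a heavily simplified sketch of this idea, the loop below replaces the trained RL policy with plain proportional feedback on a made-up plasma response; all dynamics, gains and sensor noise are invented for illustration.

```python
import numpy as np

# Highly simplified control loop: a stand-in "reactor" maps coil current
# to a plasma shape parameter, and a feedback controller drives the
# measured value toward a target. Everything here is hypothetical.

rng = np.random.default_rng(1)

def plasma_response(coil_current):
    """Stand-in for the real reactor: shape parameter vs. coil current."""
    return 0.8 * coil_current + rng.normal(0, 0.02)  # noisy sensor reading

target = 1.0      # desired shape parameter (e.g. elongation)
current = 0.0     # coil current actuator
gain = 0.5        # feedback gain (hypothetical)

for step in range(20):
    measured = plasma_response(current)   # optical-sensor measurement
    error = target - measured
    current += gain * error               # feedback update of the actuator

print(f"final coil current: {current:.3f}, measurement: {measured:.3f}")
```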

  • Combining 3D printers with generative AI and applying GNNs

A 3D printer is a device that creates a three-dimensional object from a digital model: a computer-designed 3D model is built up layer by layer from material to produce the object, a process called additive manufacturing. The most common materials are plastics, but metals, ceramics, resins, foodstuffs and even biomaterials are also used. Combining GNNs, generative AI and 3D printers can enable complex structures and dynamic optimisation, creating new design and manufacturing processes.
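
As a small illustration of the GNN side, the sketch below runs one round of mean-aggregation message passing over a toy four-node lattice cell; the features, weights and graph are all invented for the example.

```python
import numpy as np

# One round of graph message passing over a toy lattice, the basic
# operation a GNN would use to predict properties of a printed structure.
# Nodes are lattice joints with one feature (e.g. local strut thickness);
# each node averages its neighbours' features (illustrative only).

# 4-node square lattice cell: adjacency matrix with self-loops
A = np.array([[1, 1, 0, 1],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [1, 0, 1, 1]], dtype=float)
A_norm = A / A.sum(axis=1, keepdims=True)    # mean aggregation

X = np.array([[1.0], [0.5], [0.8], [0.2]])   # node features (thickness)
W = np.array([[0.9]])                        # learnable weight (1 feature)

H = np.maximum(A_norm @ X @ W, 0)            # message passing + ReLU
print(H.ravel())                             # updated node embeddings
```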

Solar cell technology, which has flourished since the Sunshine Project started in Japan in 1973, has reached a plateau after several decades of technological accumulation, and is about to take the next step forward. In this article, I would like to give an overview of the solar cell and its future.

In this issue, we will discuss the lithium iron phosphate battery, which has attracted much attention in recent years.

  • History of science for young readers

This book enables young readers to learn the principles of science along with its history. Even at the high-school level, science spans physics, biology, chemistry and geology, and it is difficult for beginning students to grasp how science as a whole has developed. The book vividly depicts the dynamic changes in science as once-established theories, from ancient times to the present day, are successively overturned. It recounts episodes from famous scientists such as Aristotle, Galen, Galileo, Harvey, Bacon, Newton, Einstein and Berners-Lee, tracing the trajectory of development from ancient civilisations to modern science.

In animals, muscles contain a large amount of protein. In fact, proteins serve as major components in all parts of life, in animals and plants alike, acting as the catalysts (enzymes) of chemical reactions within organisms and as the receptors on biological membranes. The statements that "information is transcribed from DNA to RNA, from which the sequence of amino acids in a protein (its primary structure) is determined" and that "the three-dimensional shape a protein takes (its higher-order structure) is essential to its function as a component" are the starting point of molecular biology.

The next question, then, is how the higher-order structure is determined from the primary structure. For proteins that are not very large molecules, the answer is that they fold spontaneously into their higher-order structures. This is called protein folding.

To reproduce this phenomenon on a computer, realistic simulations of proteins have been constructed based on Newton's equations of motion. This technique, called molecular dynamics (MD), is similar to Hamiltonian MCMC in data science.
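
A minimal sketch of the MD idea: integrating Newton's equations for one particle in a harmonic potential with the velocity Verlet scheme, the same time-stepping used (with far richer force fields) in protein MD and in the leapfrog step of Hamiltonian MCMC.

```python
# Velocity Verlet integration of Newton's equations for a single particle
# in a harmonic potential U(x) = k x^2 / 2. Total energy should stay
# close to its initial value of 0.5, illustrating why this integrator is
# the workhorse of molecular dynamics.

k, m, dt = 1.0, 1.0, 0.01          # spring constant, mass, time step
x, v = 1.0, 0.0                    # initial position and velocity

def force(x):
    return -k * x                  # F = -dU/dx

for step in range(1000):
    a = force(x) / m
    x += v * dt + 0.5 * a * dt**2  # position update
    a_new = force(x) / m
    v += 0.5 * (a + a_new) * dt    # velocity update

energy = 0.5 * m * v**2 + 0.5 * k * x**2
print(f"x={x:.3f}, v={v:.3f}, energy={energy:.6f}")  # energy ~ 0.5
```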

When I hear that the age of the universe is 13.8 billion years, I feel that the time scale of the universe and celestial objects is incredibly long. No one would expect the sun, which set in the western sky yesterday evening, to rise this morning a million times brighter. But in the real universe, setting aside ordinary stars like the sun, we frequently witness astronomical phenomena that change on a human time scale.

The first reported outburst of V455 Andromedae was on September 4, 2007. The possibility of an outburst of this star had been pointed out for some time. Because the object is relatively close to the Earth, Hiroshima University's 1.5-meter "Kanata" telescope was pointed at it to look for a brightness oscillation with a period of about 80 minutes, which is observed only in the early stages of an outburst, and the oscillation was successfully detected. The data showed that the object becomes cooler when it is brighter; such fluctuations are unusual for a celestial body, but the results were exactly as expected for this phenomenon.

Here we describe the results of tomographic reconstruction of the shape of the accretion disc from the data obtained in these studies.

In a meta-analysis, data from multiple trials can be integrated to evaluate trial effects based on a larger amount of information, even when the amount of information in individual trials is insufficient to make inferences with sufficient precision.

The problem here is the assumption that "the relative risk in all trials is the same". In general, the trials collected in this way are conducted at different times, in different regions (countries) and at different sites, and it is common for various factors, such as the backgrounds of the participants and the definitions of study-drug doses and outcomes, to differ, strictly speaking, across trials.
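
As a small worked example of this pooling, the sketch below computes a fixed-effect inverse-variance estimate and a DerSimonian-Laird random-effects estimate, which relaxes the equal-effect assumption; the trial data are invented.

```python
import numpy as np

# Inverse-variance pooling of log relative risks from several trials:
# the fixed-effect estimate assumes one common effect, while the
# DerSimonian-Laird random-effects estimate allows it to vary across
# trials. The numbers below are invented for illustration.

log_rr = np.array([-0.22, -0.10, -0.35, -0.05])   # per-trial log RR
var = np.array([0.02, 0.05, 0.04, 0.03])          # per-trial variances

# Fixed effect: weights are inverse variances
w = 1 / var
fixed = np.sum(w * log_rr) / np.sum(w)

# Random effects: estimate between-trial variance tau^2 (DerSimonian-Laird)
q = np.sum(w * (log_rr - fixed) ** 2)
tau2 = max(0.0, (q - (len(log_rr) - 1)) / (w.sum() - np.sum(w**2) / w.sum()))
w_re = 1 / (var + tau2)
random_eff = np.sum(w_re * log_rr) / np.sum(w_re)

print(f"fixed-effect pooled RR:   {np.exp(fixed):.3f}")
print(f"random-effects pooled RR: {np.exp(random_eff):.3f}")
```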

Explainable machine learning (EML) refers to methods and approaches that explain the predictions and decisions of machine learning models in an understandable way. In many real-world tasks, model explainability is important. This can be seen, for example, in finance, where it is necessary to explain which factors a credit-scoring model bases its decisions on, or in medical diagnostics, where it is important to explain to patients the basis and reasons for a prediction.

In this section, we discuss various algorithms for explainable machine learning, along with examples of Python implementations.
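
As a taste of such an implementation, the sketch below uses scikit-learn's permutation importance on synthetic data: it measures how much accuracy drops when each feature is shuffled, indicating which factors the model actually relies on.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Permutation importance: shuffle one feature at a time on held-out data
# and record the accuracy drop, a simple model-agnostic explanation of
# which inputs the model depends on (synthetic data for brevity).

X, y = make_classification(n_samples=500, n_features=6,
                           n_informative=3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
result = permutation_importance(model, X_te, y_te, n_repeats=10,
                                random_state=0)

for i in np.argsort(result.importances_mean)[::-1]:
    print(f"feature {i}: importance {result.importances_mean[i]:.3f}")
```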
