Information Theory Math & Science Reading Notes

Machine Learning Technology, Artificial Intelligence Technology, Digital Transformation Technology, Natural Language Processing Technology, Deep Learning, Information Theory & Computer Engineering, Mathematics

Reading notes from “Information Theory Math & Science.”

It was in 1948 that Claude Shannon published his monumental paper, “A Mathematical Theory of Communication.” More than 60 years later, information theory is applied not only to information and communication, but also to a wide range of fields, including the life sciences, brain science, and the social sciences. Although information theory uses advanced mathematics, its essence can be grasped clearly by understanding the “law of large numbers.” From Shannon’s ideas to the fundamentals of information geometry, this book provides a clear explanation that even beginners can understand, and it is an introduction by a leading expert to intuitively understanding the ideas and mechanisms of information theory.
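
The claim that the essence of the theory lies in the law of large numbers can be made concrete with a small numerical experiment of my own (not from the book): for a long i.i.d. sequence, the per-symbol value of -log2 of the sequence’s probability concentrates around the entropy H, which is exactly what allows typical sequences to be compressed down to about H bits per symbol.

```python
import numpy as np

# Illustrative sketch (not from the book): the asymptotic equipartition
# property as a consequence of the law of large numbers.
rng = np.random.default_rng(0)
p = np.array([0.7, 0.2, 0.1])               # assumed source distribution
H = -np.sum(p * np.log2(p))                 # entropy in bits/symbol

n = 10_000
x = rng.choice(len(p), size=n, p=p)         # one long i.i.d. sequence
per_symbol_info = -np.log2(p[x]).mean()     # (1/n) * (-log2 p(x_1, ..., x_n))

print(f"entropy H             = {H:.4f} bits/symbol")
print(f"empirical -log2(p) / n = {per_symbol_info:.4f} bits/symbol")
# For large n the two numbers agree closely: typical sequences all have
# probability close to 2^(-nH), so about nH bits suffice to index them.
```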

Chapter 1: Quantitative Recognition of Information

Section 1 Quantity of Information and Entropy

1.1 What determines the quantity of information
1.2 Entropy
1.3 Entropy of compound events
1.4 Conditional entropy
1.5 Mutual Information (see the sketch after this list)
1.6 Asides
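
As a quick companion to this section (entropy, conditional entropy, mutual information), here is a minimal sketch that evaluates the defining formulas on a small joint distribution of my own choosing, using the chain rule H(X|Y) = H(X,Y) - H(Y) and I(X;Y) = H(X) - H(X|Y).

```python
import numpy as np

# Minimal sketch of the Section 1 quantities on an assumed 2x2 joint distribution.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])                # P(X=x, Y=y), rows = x, cols = y

p_x = p_xy.sum(axis=1)                       # marginal P(X)
p_y = p_xy.sum(axis=0)                       # marginal P(Y)

def H(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))           # entropy in bits

H_x = H(p_x)
H_xy = H(p_xy.ravel())                       # joint entropy H(X,Y)
H_x_given_y = H_xy - H(p_y)                  # chain rule: H(X|Y) = H(X,Y) - H(Y)
I_xy = H_x - H_x_given_y                     # mutual information I(X;Y)

print(f"H(X) = {H_x:.3f}, H(X|Y) = {H_x_given_y:.3f}, I(X;Y) = {I_xy:.3f} bits")
```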

Section 2 Information Sources

2.1 Markov Model of Information Sources
2.2 Markov Information Sources
2.3 Redundancy of Information Sources
2.4 Law of Large Numbers for Information Sources
2.5 Ergodicity of Information Sources
2.6 Asides
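
For the material in this section, a small sketch with an assumed two-state Markov source (my own example): its entropy rate is the stationary-distribution average of the per-state entropies, and its redundancy measures how far that rate falls below the maximum of log2 of the alphabet size.

```python
import numpy as np

# Sketch: entropy rate and redundancy of an assumed two-state Markov source.
P = np.array([[0.9, 0.1],                    # P[i, j] = P(next = j | now = i)
              [0.3, 0.7]])

# Stationary distribution: left eigenvector of P for eigenvalue 1.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.isclose(evals, 1)].ravel())
pi = pi / pi.sum()

def H(row):
    row = row[row > 0]
    return -np.sum(row * np.log2(row))       # entropy of one transition distribution

entropy_rate = sum(pi[i] * H(P[i]) for i in range(len(pi)))   # bits/symbol
max_entropy = np.log2(P.shape[0])            # a memoryless uniform source of the same alphabet
redundancy = 1 - entropy_rate / max_entropy

print(f"stationary distribution = {pi}")
print(f"entropy rate = {entropy_rate:.3f} bits/symbol, redundancy = {redundancy:.3f}")
```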

Chapter 2 Information Transmission over Noiseless Channels

Section 1 Noiseless Discrete Channels

1.1 Information Transmission Model
1.2 Capacity of a Noiseless Discrete Channel
1.3 Coding Theorem for Noiseless Discrete Channels
1.4 Asides

Section 2 Removal of Redundancy by Coding

2.1 Coding and Redundancy Removal
2.2 Simple Coding Methods
2.3 Optimal coding method (see the Huffman sketch after this list)
2.4 Asides
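
The “optimal coding method” of 2.3 is, in most textbooks, Huffman coding; assuming that is what is meant here, the sketch below builds a Huffman code for an assumed source distribution and compares the average code length with the entropy, the lower bound given by the noiseless coding theorem.

```python
import heapq
import numpy as np

def huffman_code(probs):
    """Return a prefix code (symbol -> bit string) built by Huffman's algorithm."""
    # Heap items: (probability, tie-breaker, {symbol: partial codeword})
    heap = [(p, i, {i: ""}) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)      # two least probable groups
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

probs = [0.4, 0.3, 0.2, 0.1]                 # assumed source distribution
code = huffman_code(probs)
avg_len = sum(p * len(code[s]) for s, p in enumerate(probs))
entropy = -sum(p * np.log2(p) for p in probs)

print(code)                                  # prints {0: '0', 1: '10', 3: '110', 2: '111'}
print(f"average length = {avg_len:.3f} bits, entropy = {entropy:.3f} bits")
```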

Chapter 3 Information Transmission over Noisy Channels

Section 1 Capacity of Noisy Discrete Channels

1.1 Information Transmission under Noise
1.2 Channel Capacity (see the sketch after this list)
1.3 Error-Free Information Transmission over a Noisy Channel
1.4 Asides
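
As a concrete instance of channel capacity (1.2), here is a tiny sketch of my own for the binary symmetric channel: maximizing I(X;Y) over input distributions gives the closed form C = 1 - H(p), where p is the crossover probability.

```python
import numpy as np

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

# Capacity of a binary symmetric channel with crossover probability p:
# C = max over P(X) of I(X;Y) = 1 - H(p), attained by the uniform input distribution.
for p in (0.0, 0.01, 0.1, 0.5):
    print(f"p = {p:4}:  C = {1 - h2(p):.4f} bits per channel use")
```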

Section 2 Error Correcting Codes

2.1 Error Correction Mechanisms
2.2 Hamming distance
2.3 Hamming code (see the sketch after this list)
2.4 Linear codes
2.5 Cyclic codes
2.6 Asides
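
For 2.2–2.3 (Hamming distance and the Hamming code), a compact sketch of the standard (7,4) Hamming code with syndrome decoding; the particular systematic generator and parity-check matrices are my own arrangement. Any single-bit error makes the syndrome equal to the corresponding column of H, so the error can be located and corrected.

```python
import numpy as np

# Sketch of the (7,4) Hamming code in systematic form: G = [I | P], H = [P^T | I].
P = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1],
              [1, 1, 1]])
G = np.hstack([np.eye(4, dtype=int), P])           # generator matrix, 4 x 7
H = np.hstack([P.T, np.eye(3, dtype=int)])         # parity-check matrix, 3 x 7

def encode(msg):
    return (msg @ G) % 2                           # 4 message bits -> 7 code bits

def decode(received):
    syndrome = (H @ received) % 2                  # zero iff no detectable error
    for i in range(7):                             # match the syndrome to a column of H
        if np.array_equal(syndrome, H[:, i]):
            received = received.copy()
            received[i] ^= 1                       # flip the single erroneous bit
            break
    return received[:4]                            # systematic code: message = first 4 bits

msg = np.array([1, 0, 1, 1])
code = encode(msg)
noisy = code.copy()
noisy[5] ^= 1                                      # inject one bit error
print("decoded:", decode(noisy), "original:", msg) # the single error is corrected
```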

Chapter 4 Continuous Information and Signal Space

Section 1 Entropy of continuous signals

1.1 Definition of entropy of continuous signals
1.2 Entropy of various signals
1.3 Conditional Entropy and Mutual Information
1.4 Signal Transformations and Entropy
1.5 Asides
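
One concrete case for this section (my own example, not the book’s): the differential entropy of a Gaussian signal with variance σ² is h = (1/2) log2(2πeσ²) bits, and among all densities with that variance the Gaussian has the largest differential entropy.

```python
import numpy as np

# Sketch: differential entropy of a Gaussian, closed form vs. numerical integration.
sigma = 2.0
closed_form = 0.5 * np.log2(2 * np.pi * np.e * sigma**2)     # h(X) in bits

x = np.linspace(-10 * sigma, 10 * sigma, 200_001)
f = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
numerical = np.sum(-f * np.log2(f)) * (x[1] - x[0])          # Riemann sum of -∫ f log2 f dx

print(f"closed form = {closed_form:.6f} bits")
print(f"numerical   = {numerical:.6f} bits")
```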

Section 2 Construction of Signal Space

2.1 Signal Space
2.2 Sampling Theorem (see the sketch after this list)
2.3 Relationship between time-domain and frequency-domain representations
2.4 Asides
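
The sampling theorem of 2.2 can be checked numerically: a signal band-limited to W Hz is determined by samples taken every 1/(2W) seconds and is recovered by sinc interpolation (the cardinal series). A minimal sketch with an assumed test signal:

```python
import numpy as np

# Sketch: Nyquist-rate sampling and sinc (cardinal series) reconstruction
# of an assumed band-limited test signal.
W = 4.0                                    # bandwidth in Hz (highest frequency present)
T = 1.0 / (2 * W)                          # Nyquist sampling interval: 1/(2W) seconds

def signal(t):
    # Test signal containing only frequencies (1 Hz and 3 Hz) below W.
    return np.sin(2 * np.pi * 1.0 * t) + 0.5 * np.cos(2 * np.pi * 3.0 * t)

n = np.arange(-2000, 2001)                 # sample indices (finite truncation of the series)
samples = signal(n * T)

def reconstruct(t):
    # x(t) = sum_n x(nT) * sinc((t - nT)/T); np.sinc is the normalized sinc.
    return np.sum(samples * np.sinc((t - n * T) / T))

t_test = 0.137                             # arbitrary time between sample points
print(f"true value          = {signal(t_test):.4f}")
print(f"reconstructed value = {reconstruct(t_test):.4f}")
# Any remaining discrepancy comes only from truncating the infinite sum.
```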

Section 3 Continuous Channels

3.1 Information Transmission over a Continuous Channel
3.2 Capacity of a Channel with White Gaussian Noise (see the sketch after this list)
3.3 Error-Free Communication over Continuous Channels
3.4 Asides
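
For 3.2, the capacity of a band-limited channel with additive white Gaussian noise has the closed form C = W log2(1 + S/N) bits per second (the Shannon-Hartley formula); a quick sketch with assumed numbers:

```python
import numpy as np

# Sketch: Shannon-Hartley capacity of a band-limited white-Gaussian-noise channel.
def awgn_capacity(bandwidth_hz, snr_linear):
    return bandwidth_hz * np.log2(1 + snr_linear)      # bits per second

W = 3_000.0                                            # assumed bandwidth: 3 kHz
for snr_db in (0, 10, 20, 30):
    snr = 10 ** (snr_db / 10)                          # convert dB to a power ratio S/N
    print(f"SNR = {snr_db:2d} dB:  C = {awgn_capacity(W, snr) / 1000:.2f} kbit/s")
```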

Section 4 Information Geometry of Signal Space

4.1 Noise and Metrics in Signal Space
4.2 Riemannian signal space
4.3 Information Theory of Signal Space
4.4 Asides

Chapter 5 Mapping of Signal Space and Theory of Communication Systems

Section 1 Structure of Communication Systems

1.1 Structure of communication system
1.2 Coded transmission formats
1.3 Communication system using continuous mapping
1.4 Quantized communication system
1.5 Asides

Section 2 Theory of Continuous Communication Systems

2.1 Insertion maps in signal space
2.2 Degenerate maps in signal space
2.3 Change of noise structure in degenerate maps
2.4 Optimal degenerate maps
2.5 Asides
