Overview of Shannon’s Information Theory and Reference Books

Overview of Shannon’s Information Theory

Shannon’s information theory, proposed by Claude Shannon in 1948, deals quantitatively with the amount of information and the reliability of its transmission.

This theory treats information as a “signal” or “data” and accounts for the errors and noise that arise during its transmission. The amount of information is measured in “bits” (short for “binary digit”), where one bit distinguishes two possible states. For example, a coin toss has two possible outcomes, heads or tails, so its result carries one bit of information.

Shannon’s information theory defines the amount of information as a measure of the uncertainty of the messages a source outputs: the greater the uncertainty of a message, the more information it carries.

The amount of information is related to the “number of choices” a message has. For example, if a message has two choices, “yes” and “no,” the amount of information is one bit, because one bit can represent exactly two choices.

In general, if a source outputs n different messages with equal probability, the information content H is given by the following equation.

\[H = \log_2 n\]

This formula gives the number of bits needed to represent n equally likely choices. For example, to communicate the result of a die roll, there are six choices from 1 to 6, so the amount of information is about 2.58 bits (H = log2 6 ≈ 2.58).
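
As a minimal sketch (plain Python, not from the original article), this formula can be evaluated directly:

```python
import math

# Information content of n equally likely choices: H = log2(n)
for n in (2, 6, 26):
    print(f"n = {n:2d} -> H = {math.log2(n):.2f} bits")
# n =  2 -> H = 1.00 bits  (yes/no)
# n =  6 -> H = 2.58 bits  (one die roll)
# n = 26 -> H = 4.70 bits  (one uniformly random letter)
```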

Since the amount of information represents the uncertainty of the information the source possesses, the more predictable the message the source outputs, the smaller the amount of information. Conversely, the more difficult the message is to predict, the greater the amount of information.
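
This can be made concrete with the general formula for information entropy, H = −Σ pᵢ log2 pᵢ, which reduces to H = log2 n when all n messages are equally likely. A minimal Python sketch (not from the original article):

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) in bits; terms with p = 0 contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # fair coin: 1.00 bit (maximally unpredictable)
print(entropy([0.9, 0.1]))  # biased coin: ~0.47 bits (more predictable, less information)
print(entropy([1.0]))       # certain outcome: 0 bits (nothing new is learned)
```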

Shannon’s information theory also considers the reliability of information transmission. Since information transmission is subject to errors and noise, there is no guarantee that the transmitted information will be received accurately. To cope with such cases, methods to improve the reliability of information transmission have been studied.

The error rate in information transmission is the probability that transmitted information is received incorrectly. In Shannon’s information theory, error-correcting codes and channel coding have been studied as methods to reduce the error rate.

Error-correcting codes add redundant information to the transmitted data so that it can be recovered accurately even when errors occur, while coding converts the information to be transmitted into a different format that is less affected by errors. Converting voice data into a digital signal, which is less susceptible to noise, is one example.
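
As a hedged illustration of the error-correction idea (a triple-repetition code is one of the simplest error-correcting codes, chosen here for brevity rather than practicality):

```python
import random

def encode(bits, r=3):
    """Repetition code: transmit each bit r times."""
    return [b for b in bits for _ in range(r)]

def noisy_channel(bits, p=0.1):
    """Binary symmetric channel: flip each bit with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def decode(received, r=3):
    """Majority vote over each group of r received bits."""
    return [int(sum(received[i:i + r]) > r // 2)
            for i in range(0, len(received), r)]

random.seed(0)
message = [random.randint(0, 1) for _ in range(1000)]
recovered = decode(noisy_channel(encode(message)))
errors = sum(m != r_ for m, r_ in zip(message, recovered))
print(f"residual error rate: {errors / len(message):.3f}")  # well below the raw 0.1
```

The redundancy cuts the bit error rate from 10% to roughly 3%, at the cost of tripling the data sent: exactly the reliability-versus-volume trade-off discussed next.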

The error rate can thus be reduced by a variety of methods. However, it can never be made exactly zero, and pushing reliability higher through added redundancy comes at the cost of information volume and transmission speed, so the two must be optimized together.
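
Shannon’s channel coding theorem, covered in Chapter 5 of the book below, makes this trade-off precise. For a binary symmetric channel that flips each bit with probability p, reliable transmission is possible at any rate below the channel capacity:

\[C = 1 - H(p), \qquad H(p) = -p \log_2 p - (1 - p) \log_2 (1 - p)\]

For example, at p = 0.1 the capacity is C ≈ 1 − 0.469 = 0.531, so at most about 53% of the raw bits transmitted can carry usable information.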

Application examples

Shannon’s information theory has been widely applied in the field of information and communication technology. The following are some specific examples of its application.

  • Voice Communications: Shannon’s information theory has been useful in developing compression techniques for voice communications. Raw voice data requires a high bit rate, but Shannon’s information theory can be used to compress the data and reduce the bit rate, enabling high-quality voice communications while saving bandwidth.
  • Data Compression: Shannon’s information theory has been instrumental in the development of compression algorithms. Data compression saves storage and bandwidth by reducing file size, and Shannon’s information theory can be used to optimize it (see the entropy sketch after this list).
  • Internet Communications: Shannon’s information theory has also been instrumental in the development of error correction techniques in Internet communications. Since Internet communications are prone to noise and errors, error correction techniques are necessary, and Shannon’s information theory can be used to optimize error correction techniques.
  • Encryption: Shannon’s information theory has also been applied to the development of encryption techniques. Encryption is a technique for keeping the contents of communications secret, and Shannon’s theory of information content can be used to optimize encryption techniques.
  • Life Sciences: Shannon’s information theory has been applied to the analysis of genetic information in the life sciences; DNA and RNA sequence information can be analyzed and compared using Shannon’s information content theory.
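
As a hedged sketch of the compression point above (plain Python, illustrative only): the empirical entropy of a message is a lower bound on the average bits per symbol achievable by any symbol-by-symbol lossless code, which is the limit that algorithms such as Huffman coding (see Chapter 4 of the book below) approach.

```python
import math
from collections import Counter

def entropy_bits_per_symbol(data):
    """Empirical Shannon entropy: a lower bound (bits/symbol) on any
    symbol-by-symbol lossless code for data with these frequencies."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

text = "abracadabra"
print(f"{entropy_bits_per_symbol(text):.2f} bits/symbol")  # ~2.04, vs. 8 bits/char in ASCII
```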
Shannon’s Information Theory and AI Technology

Shannon’s information theory has been widely applied in the field of AI technology. The following are some specific examples of its application.

  • Information Retrieval: Information retrieval is a technique for extracting specific information from large amounts of data, and AI techniques such as machine learning and natural language processing are applied. Shannon’s theory of information content can be used to optimize the ranking of search results.
  • Data compression: AI techniques are also applied in data compression. Data compression techniques using deep learning have been developed to achieve higher compression ratios. In addition, Shannon’s theory of information content can be used to optimize data compression techniques.
  • Natural Language Processing: Natural language processing is a technology in which computers process human language, and AI technology is indispensable to it. Shannon’s theory of information content can be used here, for example, to score how informative words are when extracting features from documents (see the mutual-information sketch after this list).
  • Data Transfer: Shannon’s theory of information content can also be applied to data transfer using AI technology. For example, combining data compression techniques with deep learning can enable high-speed data transfer.
  • Image Processing: Shannon’s theory of information content is also applied to image processing, for example to optimize compression ratios in image compression and in image recognition techniques using deep learning.
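
As a hedged sketch tying these together (toy data, plain Python; the variables are illustrative): mutual information, treated in Chapter 5 of the book below, measures how much knowing one variable reduces uncertainty about another, and is one common information-theoretic score for ranking features in retrieval and NLP.

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """I(X;Y) in bits, estimated from paired samples: how much observing X
    reduces uncertainty about Y."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# Toy example: presence of a word (1/0) in a document vs. a spam label (1/0)
word_present = [1, 1, 1, 0, 0, 0, 1, 0]
is_spam      = [1, 1, 0, 0, 0, 0, 1, 0]
print(f"I(word; label) = {mutual_information(word_present, is_spam):.3f} bits")  # ~0.549
```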
Reference Book

“An Introduction to Shannon’s Information Theory” is an introductory text on Shannon’s information theory. From its description:

“What is information? How do we measure it? What is information entropy? What is compression? This is an introduction to Shannon’s information theory that even high school students can understand. Why can information be digitized? The foundation of information science, which supports today’s vast information society, was created by Shannon. How can we represent formless information, and how can we express its value? This book provides an easy-to-understand explanation of the information theory for which Shannon laid the groundwork.”

An Introduction to Shannon's Information Theory
	Preface
	Chapter 1: History of Information Science
		When Computers Were People
		The History of Computers as Computing Machines
		Information Scientists in the First Half of the 20th Century
			Shannon's Definition of Information
			Shannon's Achievements
				Encoding information (converting it from its original form for efficient processing, transmission, etc.)
				Proposed the bit as the unit of information
				All information can be replaced by numerical values
				We want to send valuable information at high speed and accuracy
	Chapter 2: What is Information?
		What is information?
		Definition of information
		The smallest unit of information
		The Essence of Shannon's Information Theory: The Actors in Fast and Accurate Communication
		Information Coding
		Communication channel coding
		Receiver and recipient
	Chapter 3: Value of Information?
		How to express "valuable information"?
		Expected value
		Information entropy
		The size of information: the amount of information
		Probability of occurrence of information
		Information entropy and communication capacity
		Probability of appearance of alphabetic symbols
	Chapter 4: Reducing Communication Charges? Information Coding Theorem
		Coding and information capacity
		Uniquely decodable codes and instantaneous codes
		Average code length
		Information source coding theorem and various coding methods
		Shannon-Fano coding method
		Huffman coding method
		Summary of the source coding theorem
	Chapter 5: We Can't Afford a Game of Telephone: Reducing Errors
		5-1 How much processing speed does a communication channel have?
			Communication channels and mutual information
			Mutual information and data mining
			How much processing speed does a communication channel have? Channel capacity
			Mutual information as a criterion for evaluating a communication channel
			How to calculate mutual information
			Communication channel capacity
		5-2 How to reduce errors
			Communication channel coding theorem
			Encoding a communication channel
			The essence of communication channel coding theorem
		5-3 Handling continuous information: The sampling theorem
			Waves: viewed in frequency or in time?
			Sampling theorem
			Why do we need to sample at more than twice the frequency?
			Application to continuous quantities
			Essence of Shannon's Information Theory
	Chapter 6: Information Theory in the History of Information Science
		Information Theory in Information Science
		History of Computers
		Turing and Shannon
		Turing Machine and Computability
		From Turing Machines to Von Neumann Computers
		The "First Computer"?
		The Versatility of Computers
		Applications of Shannon's Information Theory
	Afterword
