
COMP2610 / COMP6261 Information Theory
Lecture 1: Introduction
Robert C. Williamson
Research School of Computer Science, The Australian National University
July 23, 2018

What is the world made of?
Ancient times: Matter — atoms
20th Century: Energy — mass = energy
21st Century: Information — ????

Information underpins
Physics (energy needs of computing limited by cost of erasing information)
Chemistry (pattern recognition by molecules)
Biology (genetic code)
Immunology (pattern recognition of self from non-self)
Economics (price, markets, the economics of information)
Sociology (media, social networks)
Philosophy (ontology, epistemology, morality)
Engineering (your telephone, for example)
Computing (what is it that computers do? They process information)

References for the curious . . . for interest only!
1. Tom Siegfried, The Bit and the Pendulum: From Quantum Computing to M Theory, the New Physics of Information, Wiley, 2000
2. Is Information the Key?, Nature Physics 1, 1-4, October 2006
3. John Archibald Wheeler, Information, Physics, Quantum: The Search for Links, in Proceedings of the 3rd International Symposium on the Foundations of Quantum Mechanics, Tokyo, 1989
4. John Archibald Wheeler with Kenneth Ford, Geons, Black Holes, and Quantum Foam: A Life in Physics, W. W. Norton and Company, 1998; Chapter 15, "It from Bit"
5. Paul Davies and Niels Henrik Gregersen (Eds.), Information and the Nature of Reality, Cambridge University Press, 2010
6. Andreas Wagner, From bit to it: how a complex metabolic network transforms information into living matter, BMC Systems Biology, 1(33), 2007
7. Hector Zenil (Ed.), A Computable Universe: Understanding and Exploring Nature as Computation, World Scientific, 2013
8. Uncertainty principle and minimal energy dissipation in the computer, International Journal of Theoretical Physics, 21(3/4), 283-297, 1982
9. Rolf Landauer, The physical nature of information, Physics Letters A, 217, 188-193, 1996
10. Antoine Bérut et al., Experimental verification of Landauer's principle linking information and thermodynamics, Nature, 483, 187-190, 8 March 2012
11. J. M. R. Parrondo, J. M. Horowitz and T. Sagawa, Thermodynamics of Information, Nature Physics, 11, 131-139, February 2015
12. Jean-Marie Lehn, Perspectives in Supramolecular Chemistry: From Molecular Recognition towards Molecular Information Processing and Self-Organization, Angewandte Chemie International Edition in English, 29(11), 1304-1319, November 1990
13. Jean-Marie Lehn, Supramolecular chemistry: scope and perspectives; molecules, supermolecules, molecular devices, Nobel Lecture, 8 December 1987
14. John Maynard Smith, The concept of information in biology, Philosophy of Science, 67(2), 177-194, 2000
15. Information and knowledge in biology: time for reappraisal, Plant Signaling and Behavior, 2(2), 65-73, 2007
16. David Easley and Jon Kleinberg, Networks, Crowds, and Markets: Reasoning About a Highly Connected World, Cambridge University Press, 2010
17. F. A. Hayek, The use of knowledge in society, The American Economic Review, 35(4), 519-530, 1945
18. George J. Stigler, The Economics of Information, The Journal of Political Economy, 69(3), 213-225, 1961
19. Joseph E. Stiglitz, Information and the change in the paradigm in economics, Nobel Prize Lecture, 8 December 2001
20. Warwick Anderson and Ian R. Mackay, Fashioning the immunological self: the biological individuality of F. Macfarlane Burnet, Journal of the History of Biology, 47, 147-175, 2014

What Is Information? (1)
According to a dictionary definition, information can mean
1 Facts provided or learned about something or someone:
a vital piece of information.
2 What is conveyed or represented by a particular arrangement or sequence of things:
genetically transmitted information.
Important!
It is usually unhelpful to ask "What is X?" questions ("essentialism").
Better to ask: what happens to it? (Grothendieck's "relative method")

What is Information? (2)
In this course: information in the context of communication (includes information storage).
Explicitly include uncertainty — indeed, rather than deriving information from probability theory, one can start with information and derive probability theory from that!
Shannon (1948): "amount of unexpected data a message contains"
• A theory of information transmission

What is Information? (3)

[Figure: INFORMATION SOURCE → TRANSMITTER → (SIGNAL, corrupted by a NOISE SOURCE into the RECEIVED SIGNAL) → RECEIVER → DESTINATION]
Fig. 1 — Schematic diagram of a general communication system.

From Claude Shannon, A Mathematical Theory of Communication, Bell System Technical Journal (1948): "By a communication system we will mean a system of the type indicated schematically in Fig. 1."

What Is Information? (4)
Information is a message that is uncertain to receivers:
If we receive something that we already knew with absolute certainty
then it is non-informative
Uncertainty is crucial in measuring information content
We will deal with uncertainty using probability theory
Information Theory
Information theory is the study of the fundamental limits and potential of the representation and transmission of information.
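
A taste of how probability will make this precise (the formal definitions come later in the course): a standard measure assigns an outcome of probability p an information content of −log2 p bits, so an outcome known with absolute certainty (p = 1) carries zero information. A minimal sketch in Python, for illustration only:

```python
import math

def information_content(p: float) -> float:
    """Information content, in bits, of an outcome with probability p."""
    return -math.log2(p)

# A certain message is non-informative; surprising messages carry more bits.
for p in [1.0, 0.5, 0.25, 0.001]:
    print(f"p = {p:5} -> {information_content(p):.3f} bits")
# p =   1.0 -> 0.000 bits   (already known with certainty: no information)
# p =   0.5 -> 1.000 bits   (one fair yes/no answer's worth)
# p =  0.25 -> 2.000 bits
# p = 0.001 -> 9.966 bits
```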

Example 1: What Number Am I Thinking of?
I have in mind a number that is between 1 and 20
You are allowed to ask me one question at a time
I can only answer yes/no
Your goal is to figure out the number as quickly as possible
What strategy would you follow?
Your strategy + my answers = a code for each number
Some variants:
What if you knew I never chose prime numbers?
What if you knew I was twice as likely to pick numbers more than 10?
What if you knew I only ever chose one of 7 or 13?
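
One concrete strategy (a sketch, not the only one): halve the remaining candidates with each yes/no question, so any number between 1 and 20 is pinned down in at most ⌈log2 20⌉ = 5 questions. The sequence of yes/no answers is then the "code" for the number. Illustrative Python, with made-up function names:

```python
import math

def questions_needed(secret: int, lo: int = 1, hi: int = 20) -> int:
    """Count the yes/no questions a halving strategy asks to find `secret`."""
    questions = 0
    while lo < hi:
        mid = (lo + hi) // 2
        questions += 1              # ask: "Is the number <= mid?"
        if secret <= mid:           # answer: yes
            hi = mid
        else:                       # answer: no
            lo = mid + 1
    return questions

worst_case = max(questions_needed(n) for n in range(1, 21))
print(worst_case, math.ceil(math.log2(20)))   # 5 5
```

The variants hint at what comes later: when some numbers are more likely than others, a cleverer questioning order needs fewer questions on average (if only 7 or 13 is possible, a single question suffices).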

Example 2: How Much Is Information Worth?
Simplified version of "Deal or No Deal"
$1000 hidden in one of 16 cases.
All equally likely to contain the prize.
How much would you pay to know:
1 Exactly which case contains the money?
2 Whether the case holding the money is numbered less than 8?
3 … is less than 12?
4 Which range out of 0–3, 4–7, 8–11, or 12–15 the money case is in?
Key Question:
Can we use these ideas to quantify information?
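
Anticipating the formal definition of entropy, one way to compare the four answers is to compute the entropy (in bits) of each answer's distribution; more bits means more uncertainty removed, and plausibly a higher price. A sketch, assuming the 16 cases are numbered 0-15 and equally likely:

```python
from math import log2

def entropy(probs) -> float:
    """Entropy in bits of a distribution given as a list of probabilities."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Each question's answer is a random variable over the 16 equally likely cases.
answers = {
    "exact case":      [1/16] * 16,    # sixteen equally likely answers
    "is it < 8?":      [8/16, 8/16],   # balanced yes/no
    "is it < 12?":     [12/16, 4/16],  # unbalanced yes/no
    "which quartile?": [4/16] * 4,     # four equally likely ranges
}
for name, dist in answers.items():
    print(f"{name:16s} {entropy(dist):.3f} bits")
# exact case       4.000 bits
# is it < 8?       1.000 bits
# is it < 12?      0.811 bits
# which quartile?  2.000 bits
```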

Example 3: Redundancy and Compression
Cn y rd ths sntnc wtht ny vwls?
Can you read this sentence without any vowels?
Written English (and other languages) has much redundancy:
Approximately 1 bit of information per letter
Naïvely there should be almost 5 bits per letter
(For the moment think of "bit" as "number of yes/no questions")
Key Question:
How much redundancy can we safely remove?
(Note: "rd" could be "read", "red", "road", etc.)
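
To see where "almost 5 bits" comes from: with 26 letters all equally likely, each letter needs log2 26 ≈ 4.7 yes/no questions. Counting single-letter frequencies already lowers that estimate; getting near 1 bit per letter requires exploiting context (which letters and words follow which), which is exactly the redundancy that lets you drop the vowels. A rough sketch (the sample text is illustrative only):

```python
from collections import Counter
from math import log2

text = "can you read this sentence without any vowels".replace(" ", "")
counts = Counter(text)
total = sum(counts.values())

naive = log2(26)  # all 26 letters treated as equally likely
single_letter = -sum(c / total * log2(c / total) for c in counts.values())

print(f"naive: {naive:.2f} bits/letter")                   # ~4.70
print(f"single-letter: {single_letter:.2f} bits/letter")   # lower, but well above 1
```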

Example 4: Error Correction
Hmauns hvae the aitliby to cerroct for eorrrs in txet and iegmas.
Key Question:
How much noise is it possible to correct for and how?
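
The course develops this properly, but the simplest scheme gives the flavour: a repetition code sends each bit several times and the receiver takes a majority vote, trading transmission rate for reliability. A minimal sketch (repetition codes are only the starting point; Hamming codes, covered later, do better):

```python
import random

def encode(bits, n=3):
    """Repetition code: transmit each bit n times."""
    return [b for b in bits for _ in range(n)]

def corrupt(bits, flip_prob=0.1):
    """Binary symmetric channel: flip each bit independently with prob. flip_prob."""
    return [b ^ (random.random() < flip_prob) for b in bits]

def decode(received, n=3):
    """Majority vote over each block of n received copies."""
    return [int(sum(received[i:i + n]) > n // 2)
            for i in range(0, len(received), n)]

message = [1, 0, 1, 1, 0, 0, 1, 0]
received = corrupt(encode(message))
print(decode(received) == message)  # usually True: one flip per block is corrected
```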

1 Information and the Nature of the Universe
2 A Brief History
3 Course Overview
4 Logistics and Expectations
5 What’s Next

A Summary of the History of Information Theory
1920s : Nyquist & Hartley at Bell Labs
1940 : Turing and Good at Bletchley Park (WWII)
1942 : and
1948 : Shannon: "A Mathematical Theory of Communication"
1958 : Elias: "Two Famous Papers"
1970 : "Coding is Dead"
1970- : Revival with the advent of digital computing
CDs, DVDs, MP3s, Digital TV, Mobiles, Internet, Deep-space comms (Voyager), …

More on the History of Information Theory
Information Theory and the Digital Age
by Aftab, Cheung, Kim, Thakkar, and Yeddanapudi.
http://web.mit.edu/6.933/www/Fall2001/Shannon2.pdf

1 Information and the Nature of the Universe
2 A Brief History
3 Course Overview
4 Logistics and Expectations
5 What’s Next

Brief Overview of Course
How can we quantify information?
• Basic Definitions and Key Concepts
• Probability, Entropy & Information
How can we make good guesses?
• Probabilistic Inference
• Bayes' Theorem
How much redundancy can we safely remove?
• Compression
• Source Coding Theorem
• Block, Huffman, and Lempel-Ziv Coding
How much noise can we correct and how?
• Noisy-Channel Coding
• Repetition Codes, Hamming Codes
What is randomness?
• Kolmogorov Complexity
• Algorithmic Information Theory

COMP2610/COMP6261 (Information Theory)
We will study the fundamental limits and potential of the representation and transmission of information.
Mathematical Foundations
Probabilistic Inference
Coding and Compression
Communication
Kolmogorov Complexity (Guest Lecture)

Learning Outcomes
From https://wattlecourses.anu.edu.au/course/view.php?id=25550:
1 Understand and apply fundamental concepts in information theory such as probability, entropy, information content and their inter-relationships
2 Understand the principles of data compression
3 Compute entropy and mutual information of random variables
4 Implement and analyse basic coding and compression algorithms
5 Understand the relationship of information theoretical principles and Bayesian inference in data modelling and pattern recognition
6 Understand some key theorems and inequalities that quantify essential limitations on compression, communication and inference
7 Know the basic concepts regarding communications over noisy channels

What Tools Will We Use?
Elementary probability theory
• "What's the probability of rolling an odd number using a fair die?"
• http://www.khanacademy.org/math/probability
Elementary linear algebra
• "If x = (1, 1, 0) and y = (−2, 0, 1), what is x · y and 3x + 2y?"
• http://www.khanacademy.org/math/linear-algebra
Basic programming skills
• "Do you know your for loops from your while loops?"
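
If you want to check yourself on the two warm-up questions, each takes a couple of lines (a quick sketch in plain Python):

```python
# Fair die: 3 odd outcomes {1, 3, 5} out of 6 equally likely outcomes.
p_odd = 3 / 6
print(p_odd)  # 0.5

# With x = (1, 1, 0) and y = (-2, 0, 1):
x, y = (1, 1, 0), (-2, 0, 1)
dot = sum(a * b for a, b in zip(x, y))               # x . y = -2
combo = tuple(3 * a + 2 * b for a, b in zip(x, y))   # 3x + 2y = (-1, 3, 2)
print(dot, combo)
```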

1 Information and the Nature of the Universe
2 A Brief History
3 Course Overview
4 Logistics and Expectations
5 What’s Next


Course Overview
See Wattle site (authoritative)
Lectures: 23 × 1 hour (two lectures per week); one public holiday
By me, except one guest lecture. (Aside: about me.)
Tutorials: starting week 2; schedule up shortly.
Assignments: 3, worth 0% (optional), 20%, and 20% (the 0% is explained below)
Final Exam (60%): hurdle assessment! You have to pass the exam to pass the course. (New this year!)
Late Submission Policy: late submissions get zero marks — 100% penalty.

Expectations
See the newly published expectations document:
https://wattlecourses.anu.edu.au/pluginfile.php/1760092/course/section/423322/Learning%20expectations.pdf
Key points:
You are expected to have familiarity and ability with elementary probability theory. “Assignment 0” is designed to help you check whether your background is sufficient.
As per the expectations document: you are responsible for your learning. I am here to assist.
