ECE566: Information Theory – Fall 2019 – Dr. Thinh Nguyen
Final Project
Due Date: March 21, 2019
Guidelines:
This project is an individual effort. You may discuss the ideas and solutions with others, but you must implement the project and write the report yourself. The report must be typed. Please indicate the total time you spent on the project, any interesting thoughts, and any confusions or obstacles that might prevent you from doing any part of the project. I will use these to improve the current and future classes.
Grading method:
The project will be graded based on a written report from each individual student. The report will be evaluated on: 1) correctness (80%) and 2) clarity and presentation (20%).
What to turn in:
1. A typed project report that contains all the graphs, the answers to the questions, and discussions.
2. All the source code and the instructions for running it to reproduce the results in the report.
3. The instructions on how to submit the project will be posted shortly.
Project Description:
This project aims to enhance your theoretical and practical knowledge of Shannon information theory by implementing a compression system. For this task, download the file 'data.bin' from the class website. Each data sample is of type int32 (4 bytes). For part one of the project, your task is to estimate various statistics about the file. For part two of the project, your task is to implement Huffman coding schemes with different parameters. Note that the file 'data.bin' is for you to test your program. For the actual grading, individual students might receive a different file, so the answers in each student's report will likely differ from those of others.
1 Part One: Estimating Entropies
Write a program to estimate the following quantities:
1. How many discrete values are there in the file?
2. Let Xi be the random variable denoting the i-th discrete sample in the data file. Assuming the Xi are i.i.d., plot the histogram of Xi and estimate its probability distribution and entropy.
3. Let Yi = (Xi, Xi+1). Assuming the Yi are i.i.d., compute H(Y)/2.
4. Let Zi = (Xi, Xi+1, Xi+2). Assuming the Zi are i.i.d., compute H(Z)/3. (A sketch of these estimates appears after this list.)
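The quantities above can all be computed from empirical block frequencies; for a stationary source, the per-symbol block entropies H(Y)/2 and H(Z)/3 are successively tighter approximations of the entropy rate. Below is a minimal Python sketch of these estimates, shown only as one possible approach: NumPy, native byte order, and non-overlapping blocks are assumptions, since the handout does not fix them, and the histogram plot of item 2 is omitted.

    import sys
    from collections import Counter
    import numpy as np

    def empirical_entropy(blocks):
        # Entropy (in bits) of the empirical distribution of the block sequence.
        counts = Counter(blocks)
        n = sum(counts.values())
        return -sum((c / n) * np.log2(c / n) for c in counts.values())

    def main(filename):
        # Each sample is a 4-byte int32; native byte order is assumed,
        # since the handout does not specify endianness.
        x = np.fromfile(filename, dtype=np.int32).tolist()
        print("|X| =", len(set(x)))
        print("H(X) =", empirical_entropy(x))
        # Non-overlapping pairs and triples; overlapping blocks are another
        # reasonable reading of Yi = (Xi, Xi+1), which the handout leaves open.
        pairs = list(zip(x[0::2], x[1::2]))
        triples = list(zip(x[0::3], x[1::3], x[2::3]))
        print("H(Y)/2 =", empirical_entropy(pairs) / 2)
        print("H(Z)/3 =", empirical_entropy(triples) / 3)

    if __name__ == "__main__":
        main(sys.argv[1])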

2 Part Two: Compression
1. Write a symbol Huffman compression algorithm and a corresponding decompression algorithm to compress Xi. Make sure that the output of your decompression algorithm matches the original uncompressed version. (A sketch of the shared code-construction step appears after this list.)
2. Write a block Huffman compression algorithm and a corresponding decompression algorithm to compress the sequence Yi.
3. Write a block Huffman compression algorithm and a corresponding decompression algorithm to compress the sequence Zi.
4. Which of the 3 compressed sequences above has the smallest size?
5. Write a program to estimate the entropy rate of the sequence Xi. Explain what the program does.
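Items 1-3 share the same core step: building a Huffman code over the chosen symbols (single samples for Xi, pairs for Yi, triples for Zi). The sketch below shows one way to build such a code in Python using the standard heap-based merging construction; the function name huffman_code and the dictionary representation of codewords are illustrative choices, not a prescribed implementation. A complete encoder must also store the code table (or the symbol counts) in the output file so the decoder can rebuild the same code.

    import heapq
    from collections import Counter

    def huffman_code(symbols):
        # Build a prefix code (symbol -> bit string) from empirical frequencies.
        counts = Counter(symbols)
        if len(counts) == 1:
            # Degenerate case: a single distinct symbol still needs one bit.
            return {sym: "0" for sym in counts}
        # Heap entries: (frequency, tie-breaker, partial code for the subtree).
        heap = [(freq, i, {sym: ""}) for i, (sym, freq) in enumerate(counts.items())]
        heapq.heapify(heap)
        tie = len(heap)
        while len(heap) > 1:
            f0, _, c0 = heapq.heappop(heap)
            f1, _, c1 = heapq.heappop(heap)
            merged = {s: "0" + w for s, w in c0.items()}
            merged.update({s: "1" + w for s, w in c1.items()})
            heapq.heappush(heap, (f0 + f1, tie, merged))
            tie += 1
        return heap[0][2]

For block coding, the same function applies unchanged: pass the list of pairs (for Yi) or triples (for Zi) as the symbol sequence.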
Syntax usage for part one:
estimate entropy "filename"
Input: "filename"
Output: |X|, H(X), H(Y)/2, H(Z)/3.
Syntax usage for part two:
1. Encoding algorithm: "Encode" "input filename" "n" "output filename"
2. Decoding algorithm: "Decode" "input filename" "n" "output filename"
Make sure that your compression algorithm followed by your decompression algorithm reproduces the original file exactly.
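One way to verify this, assuming your programs are invoked as in the syntax above, is a small round-trip check; the file names and executable paths below are placeholders.

    import filecmp
    import subprocess

    # Hypothetical round trip with n = 1 (symbol-level coding); adjust to your setup.
    subprocess.run(["./Encode", "data.bin", "1", "data.huf"], check=True)
    subprocess.run(["./Decode", "data.huf", "1", "data.out"], check=True)
    ok = filecmp.cmp("data.bin", "data.out", shallow=False)
    print("round trip OK" if ok else "MISMATCH: decoded file differs from original")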
3 Bonus
Devise any compression/decompression techniques to further improve the compression ratio. Explain the reasoning behind such techniques. The number of bonus points is proportional to the compression ratio (original file size / compressed file size) obtained.