Fundamentals of Information Theory (信息理論基礎, English Edition)
- Price: 29 yuan
- Series: Ministry of Industry and Information Technology "Twelfth Five-Year Plan" textbook series
- Authors: 陳杰, 孫兵, 于澤, 周蔭清
- Publication date: 2016/2/1
- ISBN: 9787512419728
- Publisher: Beihang University Press (北京航空航天大學出版社)
- Chinese Library Classification: TN911.2
- Pages:
- Paper: offset paper
- Edition: 1
- Format: 16K
This book systematically introduces the fundamentals of information theory, focusing on the basic model of communication systems. It consists of 11 chapters organized in two parts, namely the basic concepts of information theory and the related applications of information theory. The first part covers the statistical measure of information, discrete sources, lossless source coding and data compression, discrete channels and their capacity, and channel coding, while the second part includes rate distortion, continuous sources, continuous channels and their capacity, maximum entropy and spectrum estimation, as well as computer simulation experiments relevant to information theory.
This book summarizes the authors' experience in teaching the information theory course in English and Chinese, together with the understanding gained from years of scientific research practice. It can be used as a textbook for information theory courses for both international undergraduate and graduate students.
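As a brief illustration of the kind of calculation the book develops (this sketch is not taken from the text itself; the function names and example probabilities are assumptions chosen for demonstration), the following minimal Python snippet computes the entropy of a discrete memoryless source (Chapter 2) and the capacity of a binary symmetric channel (Chapter 5), in the spirit of the simulation experiments in Chapter 11.

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum p*log2(p), in bits (hypothetical helper)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(eps):
    """Capacity of a binary symmetric channel with crossover probability eps:
    C = 1 - H(eps) bits per channel use."""
    return 1.0 - entropy([eps, 1.0 - eps])

# Example: a binary source with P = {0.25, 0.75} and a BSC with eps = 0.1
print(entropy([0.25, 0.75]))   # ~0.811 bits/symbol
print(bsc_capacity(0.1))       # ~0.531 bits/channel use
```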
Chapter 1 Introduction ………… 1
1.1 Concept of information ………… 1
1.2 History of information theory ………… 2
1.3 Information, messages and signals ………… 3
1.4 Communication system model ………… 3
1.5 Information theory applications ………… 4
1.5.1 Electrical engineering (communication theory) ………… 4
1.5.2 Computer science (algorithmic complexity) ………… 5
Exercises ……………………………………………………………………………………… 5
Chapter 2 Statistical Measure of Information ………… 6
2.1 Information of random events ………… 6
2.1.1 Self-information ………… 6
2.1.2 Conditional self-information ………… 6
2.1.3 Mutual information of events ………… 7
2.2 Information of discrete random variables ………… 8
2.2.1 Entropy of discrete random variables ………… 8
2.2.2 Joint entropy ………… 11
2.2.3 Conditional entropy ………… 11
2.2.4 Mutual information of discrete random variables ………… 11
2.3 Relationship between entropy and mutual information ………… 11
2.4 Mutual information and entropy of continuous random variables ………… 12
2.4.1 Mutual information of continuous random variables ………… 12
2.4.2 Entropy of continuous random variables ………… 13
Exercises ……………………………………………………………………………………… 14
Chapter 3 Discrete Source and Its Entropy Rate ………… 15
3.1 Mathematical model of source ………… 15
3.1.1 Discrete source and continuous source ………… 16
3.1.2 Simple discrete source and its extension ………… 17
3.1.3 Memoryless source and source with memory ………… 18
3.2 Discrete memoryless source ………… 18
3.2.1 Definition ………… 18
3.2.2 Extension of discrete source ………… 19
3.3 Discrete stationary source ………… 20
3.3.1 Definition ………… 20
3.3.2 Entropy rate of discrete stationary source ………… 21
3.4 Discrete Markov source ………… 24
3.4.1 Markov chain ………… 24
3.4.2 Transition probability ………… 25
3.4.3 Markov source and its entropy rate ………… 28
Exercises ……………………………………………………………………………………… 31
Chapter 4 Lossless Source Coding and Data Compression ………… 33
4.1 Asymptotic equipartition property and typical sequences ………… 33
4.2 Lossless source coding ………… 34
4.2.1 Encoder ………… 34
4.2.2 Block code ………… 35
4.2.3 Fixed length code ………… 36
4.2.4 Variable length code ………… 42
4.3 Data compression ………… 48
4.3.1 Shannon coding ………… 49
4.3.2 Huffman coding ………… 50
4.3.3 Fano coding ………… 52
Exercises ……………………………………………………………………………………… 52
Chapter 5 Discrete Channel and Its Capacity ………… 54
5.1 Mathematical model of channel ………… 54
5.2 Discrete memoryless channel ………… 55
5.2.1 Mathematical model of discrete memoryless channel ………… 55
5.2.2 Simple DMC ………… 56
5.2.3 Extension of discrete memoryless channel ………… 60
5.3 Channel combination ………… 65
5.4 Channel capacity ………… 70
5.4.1 Concept of channel capacity ………… 70
5.4.2 Channel capacity of several special discrete channels ………… 71
5.4.3 Channel capacity of symmetric channels ………… 73
5.4.4 Channel capacity of extended DMC ………… 75
5.4.5 Channel capacity of independent parallel DMC ………… 76
5.4.6 Channel capacity of the sum channel ………… 77
5.4.7 Channel capacity of general discrete channels ………… 78
Exercises ……………………………………………………………………………………… 79
Chapter 6 Noisy-channel Coding ………… 81
6.1 Probability of error ………… 81
6.2 Decoding rules ………… 83
6.3 Channel coding ………… 84
6.3.1 Simple repetition code ………… 84
6.3.2 Linear code ………… 87
6.4 Noisy-channel coding theorem ………… 91
Exercises ……………………………………………………………………………………… 91
Chapter 7 Rate Distortion ………… 93
7.1 Quantization ………… 93
7.2 Distortion definition ………… 94
7.2.1 Distortion function ………… 94
7.2.2 Mean distortion ………… 95
7.3 Rate distortion function ………… 97
7.3.1 Fidelity criterion for given channel ………… 97
7.3.2 Definition of rate distortion function ………… 98
7.3.3 Property of rate distortion function ………… 98
7.4 Rate distortion theorem and the converse ………… 101
7.5 The calculation of rate distortion function ………… 103
Exercises …………………………………………………………………………………… 105
Chapter 8 Continuous Source and Its Entropy Rate ………… 107
8.1 Continuous source ………… 107
8.2 Entropy of continuous source ………… 107
8.3 Maximum entropy of continuous source ………… 109
8.4 Joint entropy, conditional entropy and mutual information for continuous random variables ………… 109
8.5 Entropy rate of continuous source ………… 111
8.6 Rate distortion for continuous source ………… 114
Exercises …………………………………………………………………………………… 117
Chapter 9 Continuous Channel and Its Capacity ………… 119
9.1 Capacity of continuous channel ………… 119
9.1.1 Capacity of discrete-time channel ………… 119
9.1.2 Capacity of continuous-time channel ………… 120
9.2 The Gaussian channel ………… 121
9.3 Band-limited channels ………… 122
9.4 Coding theorem for continuous channel ………… 123
Exercises …………………………………………………………………………………… 124
Chapter 10 Maximum Entropy and Spectrum Estimation ………… 125
10.1 Maximum entropy probability distribution ………… 125
10.1.1 Maximum entropy distribution ………… 125
10.1.2 Examples ………… 126
10.2 Maximum entropy spectrum estimation ………… 127
10.2.1 Burg's maximum entropy theorem ………… 127
10.2.2 Maximum entropy spectrum estimation ………… 130
Exercises …………………………………………………………………………………… 137
Chapter 11 Experiments of Information Theory ………… 138
11.1 Measure of information ………… 138
11.1.1 Information calculator ………… 138
11.1.2 Properties of entropy ………… 139
11.2 Simulation of Markov source ………… 140
11.3 Performance simulation for source coding ………… 141
11.3.1 Shannon coding ………… 142
11.3.2 Huffman coding ………… 143
11.3.3 Fano coding ………… 144
11.4 Simulation of BSC ………… 144
11.5 Simulation of the cascade channel ………… 145
11.6 Calculation of channel capacity ………… 148
11.7 Decoding rules ………… 149
11.8 Performance demonstration of channel coding ………… 150
References ……………………………………………………………………………………… 152