Why did Shannon only consider sequences drawn from the same distribution when computing capacity?

It is also the case that, for a given mean A and standard deviation S, the Gaussian probability distribution for random noise is itself the maximum entropy solution. Any other probability distribution has lower entropy. For a simple proof see (or DIY it with the calculus of variations).


Other noise probability distributions would just make things worse.
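This is easy to sanity-check numerically rather than variationally: for a fixed variance, the closed-form differential entropies of a few common distributions can be compared directly against the Gaussian's 0.5*ln(2*pi*e*var). A minimal sketch (function names are mine, not from the thread):

```python
import math

def gaussian_entropy(var):
    # Differential entropy of N(mean, var): 0.5*ln(2*pi*e*var).
    # Note it depends only on the variance, not the mean.
    return 0.5 * math.log(2 * math.pi * math.e * var)

def uniform_entropy(var):
    # Uniform on an interval of width w has variance w^2/12
    # and differential entropy ln(w).
    w = math.sqrt(12 * var)
    return math.log(w)

def laplace_entropy(var):
    # Laplace with scale b has variance 2*b^2
    # and differential entropy 1 + ln(2*b).
    b = math.sqrt(var / 2)
    return 1 + math.log(2 * b)

var = 1.0
print("Gaussian:", gaussian_entropy(var))
print("Laplace: ", laplace_entropy(var))
print("Uniform: ", uniform_entropy(var))
```

For unit variance this gives roughly 1.419 nats for the Gaussian versus about 1.347 for the Laplace and 1.243 for the uniform, consistent with the Gaussian being the maximum entropy distribution at fixed variance.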

The long and the short of it is that Shannon is about as unassailable on the fundamentals of information theory as Einstein is on relativity.

Sadly this does not stop cranks from regularly popping up.

--
Regards, 
Martin Brown

Like I said, if you don't like the conversation, why not just keep out of it? Others here are happy to help those asking questions.

--

Rick C

Thanks, Rick.

Yan

On February 22, 2017 at 5:21:51 UTC+8, rickman wrote:

> At 3:04:20 UTC-5, Tim Wescott wrote:


