ENTROPY REDUCTION
INFORMATION THEORY
Thus p(x) and p(y) refer to two different random variables and are in fact different probability mass functions, p_X(x) and p_Y(y), respectively.
Entropy is a measure of the uncertainty of a random variable.

The entropy H(X) of a discrete random variable X is defined by

    H(X) = -\sum_x p(x) \log p(x).

We also write H(p) for the above quantity. The log is to the base 2 and entropy is expressed in bits. For example, the entropy of a fair coin toss is 1 bit. We will use the convention that 0 log 0 = 0, which is justified by continuity since x log x → 0 as x → 0. Thus adding terms of zero probability does not change the entropy.

If the base of the logarithm is b, we will denote the entropy as H_b(X). If the base of the logarithm is e, the entropy is measured in nats. Unless otherwise specified, we will take all logarithms to base 2, and hence all the entropies will be measured in bits.
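
As an illustration added to this text (not part of the original), here is a minimal Python sketch of the definition above; the function name entropy and its argument format are choices made for this example only.

    import math

    def entropy(pmf, base=2):
        # Shannon entropy: H(p) = -sum_x p(x) log p(x),
        # using the convention 0 log 0 = 0 (zero-probability terms are skipped).
        return -sum(p * math.log(p, base) for p in pmf if p > 0)

    # A fair coin toss has entropy 1 bit.
    print(entropy([0.5, 0.5]))  # -> 1.0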
The binary entropy function H(p) = -p log p - (1 - p) log(1 - p) illustrates some of the basic properties of entropy: it is a concave function of the distribution and equals 0 when p = 0 or 1. This makes sense, because when p = 0 or 1 the variable is not random and there is no uncertainty. Similarly, the uncertainty is maximum when p = 1/2, which also corresponds to the maximum value of the entropy.
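
As a quick check (added here, not part of the original text), evaluating the binary entropy function at its extremes and at its midpoint:

    H(1/2) = -\frac{1}{2}\log_2\frac{1}{2} - \frac{1}{2}\log_2\frac{1}{2} = 1 \text{ bit},

the maximum value, while H(0) = H(1) = 0 by the convention 0 log 0 = 0.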
Example 2.1.2: Let

    X = a with probability 1/2,
        b with probability 1/4,
        c with probability 1/8,
        d with probability 1/8.

The entropy of X is

    H(X) = -\frac{1}{2}\log\frac{1}{2} - \frac{1}{4}\log\frac{1}{4} - \frac{1}{8}\log\frac{1}{8} - \frac{1}{8}\log\frac{1}{8} = \frac{7}{4} \text{ bits}.
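
Checking the example numerically (this snippet is added here and is not from the original):

    import math

    pmf = [1/2, 1/4, 1/8, 1/8]              # distribution from Example 2.1.2
    h = -sum(p * math.log2(p) for p in pmf)  # 0.5 + 0.5 + 0.375 + 0.375
    print(h)                                 # -> 1.75, i.e. 7/4 bits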
Change: what is the entropy if the distribution is changed to p(a) = 0, p(b) = p(c) = 1/2, p(d) = 0?
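
A worked answer, added here for clarity (assuming the question asks for the entropy of the new distribution): with the zero-probability terms dropped by the convention 0 log 0 = 0,

    H(X) = -\frac{1}{2}\log\frac{1}{2} - \frac{1}{2}\log\frac{1}{2} = 1 \text{ bit},

which is lower than the 7/4 bits of the original distribution; the change reduces the entropy.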