---
title: "Entropy"
language: "en"
type: "Symbol"
summary: "Entropy[list] gives the base e information entropy of the values in list. Entropy[k, list] gives the base k information entropy."
keywords: 
- information entropy
- Shannon entropy
- entropy
canonical_url: "https://reference.wolfram.com/language/ref/Entropy.html"
source: "Wolfram Language Documentation"
related_guides: 
  - 
    title: "Descriptive Statistics"
    link: "https://reference.wolfram.com/language/guide/DescriptiveStatistics.en.md"
related_functions: 
  - 
    title: "Tally"
    link: "https://reference.wolfram.com/language/ref/Tally.en.md"
  - 
    title: "Union"
    link: "https://reference.wolfram.com/language/ref/Union.en.md"
  - 
    title: "EntropyFilter"
    link: "https://reference.wolfram.com/language/ref/EntropyFilter.en.md"
---
# Entropy

Entropy[list] gives the base $e$ information entropy of the values in list.

Entropy[k, list] gives the base $k$ information entropy.

## Details and Options

* ``Entropy[string]`` computes the information entropy of the characters in ``string``.

* ``Entropy`` can handle symbolic data.

* With the option setting ``SameTest -> f``, ``Entropy[list, …]`` applies ``f`` to pairs of elements in list to determine whether they should be considered equivalent.

* The default setting for ``SameTest`` is ``SameQ``.
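
For instance (a sketch, not from the original page), ``SameTest -> Equal`` makes an exact and an approximate number that are numerically equal count as the same value, so the list below behaves like ``{1, 1, 2}``:

```wl
In[1]:= Entropy[{1, 1., 2}]

Out[1]= Log[3]

In[2]:= Entropy[{1, 1., 2}, SameTest -> Equal] == Entropy[{1, 1, 2}]

Out[2]= True
```

With the default ``SameQ``, ``1`` and ``1.`` are distinct, so all three elements are counted separately.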

## Examples (8)

### Basic Examples (4)

Entropy of a list of data:

```wl
In[1]:= Entropy[{0, 1, 1, 4, 1, 1}]

Out[1]= (2/3) Log[(3/2)] + (Log[6]/3)
```
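
As a cross-check (a sketch using ``Tally``, listed under See Also), the result matches the standard definition $-\sum_i p_i \log p_i$ built from the tallied frequencies:

```wl
In[2]:= p = Tally[{0, 1, 1, 4, 1, 1}][[All, 2]]/6;

In[3]:= FullSimplify[Entropy[{0, 1, 1, 4, 1, 1}] == -Total[p Log[p]]]

Out[3]= True
```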

---

Entropy of a symbolic list:

```wl
In[1]:= Entropy[{a, b, c, d, e}]

Out[1]= Log[5]
```

---

Entropy of a string:

```wl
In[1]:= Entropy["A quick brown fox jumps over the lazy dog"]

Out[1]= (8/41) Log[(41/8)] + (4/41) Log[(41/4)] + (6/41) Log[(41/2)] + (23 Log[41]/41)

In[2]:= %//N

Out[2]= 3.07114
```
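
The same entropy can be expressed in bits per character by using base 2; dividing the natural-log value by $\log 2 \approx 0.6931$ gives about 4.43 bits:

```wl
In[3]:= Entropy[2, "A quick brown fox jumps over the lazy dog"]//N
```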

---

Entropy of a random list of zeros and ones:

```wl
In[1]:= Entropy[RandomInteger[1, 1000]]

Out[1]= (64/125) Log[(125/64)] + (61/125) Log[(125/61)]

In[2]:= %//N

Out[2]= 0.692859
```
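
For a fair coin the entropy tends to $\log 2$ as the sample grows, and the numerical value above is already close:

```wl
In[3]:= N[Log[2]]

Out[3]= 0.693147
```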

### Scope (2)

Calculate the entropy of an ``EventSeries``:

```wl
In[1]:=
es = TemporalData[EventSeries, {{{5, 4, 5, 6, 3, 3, 3, 6, 6, 6, 3, 3, 3, 6, 6, 6, 3, 6, 4, 4, 6, 6, 4, 
    5, 3, 5, 3, 4, 5, 6, 5, 3, 6, 3, 6, 5, 5, 4, 3, 5, 3, 5, 4, 5, 3, 3, 4, 6, 4, 5, 6, 5, 3, 4, 6, 
    4, 5, 3, 3, 6, 4, 4, 3, 4, 6, 3, 3, 4, 3, 3, ... 6, 3, 5, 3, 3, 3, 3, 4, 5, 4, 6, 6, 3, 5, 5, 5, 6, 5, 6, 4, 3, 4, 4, 4, 
    5, 4, 3, 3, 3, 3, 3, 6, 4, 4, 5, 5, 3, 3, 4, 4, 5, 3, 4, 3, 4, 6, 3}}, {{1, 174, 1}}, 1, 
  {"Discrete", 1}, {"Discrete", 1}, 1, {ResamplingMethod -> None}}, False, 10.1];

In[2]:= ListPlot[es]

Out[2]= [image]

In[3]:= Entropy[es]

Out[3]= (1/174) (-39 Log[39] - 82 Log[41] - 53 Log[53]) + Log[174]
```

---

Specify the base:

```wl
In[1]:= list = {0, 1, 1, 4, 1, 1};

In[2]:= Entropy[10, list]//Simplify

Out[2]= (Log[(27/2)]/Log[1000])

In[3]:= Entropy[2, list]//Simplify

Out[3]= (Log[(27/2)]/Log[8])
```

Default base:

```wl
In[4]:= Entropy[E, list] == Entropy[list]

Out[4]= True
```
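
More generally, the base-$k$ entropy is the natural-log entropy divided by $\log k$ (a quick check of this relation for ``list`` as defined above):

```wl
In[5]:= Simplify[Entropy[2, list] == Entropy[list]/Log[2]]

Out[5]= True
```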

### Applications (1)

Calculate the entropy for a path of a ``TelegraphProcess``:

```wl
In[1]:= data = RandomFunction[TelegraphProcess[1.3], {1, 10^2}]

Out[1]=
TemporalData[Automatic, {CompressedData["«50»"], 
  CompressedData["«1430»"], 1, {"Discrete", 1}, {"Continuous", 1}, 1, 
  {ValueDimensions -> 1}}, False, 14.3]

In[2]:= ListPlot[data]

Out[2]= [image]

In[3]:= Entropy[data]

Out[3]= (1/125) (-62 Log[62] - 63 Log[63]) + Log[125]
```

### Neat Examples (1)

Entropy of a text:

```wl
In[1]:= Entropy[ExampleData[{"Text", "Hamlet"}]]//Short

Out[1]//Short= (31648 Log[(171937/31648)]/171937) + (14768 «1»/171937) + «57» + («1»/«6») + (6 Log[(«6»/6)]/171937)

In[2]:= %//N

Out[2]= 3.14712

In[3]:= Entropy[ExampleData[{"Text", "ShakespearesSonnets"}]]//N

Out[3]= 3.06625
```

## See Also

* [`Tally`](https://reference.wolfram.com/language/ref/Tally.en.md)
* [`Union`](https://reference.wolfram.com/language/ref/Union.en.md)
* [`EntropyFilter`](https://reference.wolfram.com/language/ref/EntropyFilter.en.md)

## Related Guides

* [Descriptive Statistics](https://reference.wolfram.com/language/guide/DescriptiveStatistics.en.md)

## History

* [Introduced in 2008 (7.0)](https://reference.wolfram.com/language/guide/SummaryOfNewFeaturesIn70.en.md)