Entropy Formula:
Definition: Shannon Entropy measures the uncertainty or information content in a probability distribution.
Purpose: Used in information theory, data compression, cryptography, and machine learning to quantify information.
The calculator uses the formula:
H(X) = −Σ pᵢ log₂(pᵢ)
Where: pᵢ is the probability of the i-th outcome and the sum runs over all possible outcomes.
Explanation: Higher entropy means more uncertainty/unpredictability. Maximum entropy occurs when all outcomes are equally likely.
Details: Entropy helps determine the minimum number of bits needed to encode information, assess system randomness, and evaluate information security.
Tips: Enter comma-separated probabilities (they must sum to 1). Example: 0.5,0.5 for a fair coin or 0.25,0.25,0.25,0.25 for a fair four-sided die.
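As an illustration, the formula above can be computed in a few lines of Python (a minimal sketch with an assumed function name and tolerance, not the calculator's actual implementation):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), ignoring zero-probability outcomes."""
    if abs(sum(probs) - 1.0) > 1e-9:          # assumed tolerance for "sums to 1"
        raise ValueError("probabilities must sum to 1")
    return sum(-p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))                # fair coin      -> 1.0 bit
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # four-sided die -> 2.0 bits
```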
Q1: What does 0 entropy mean?
A: Zero entropy means no uncertainty - one outcome has probability 1 (completely predictable).
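For example (a self-contained sketch, not the calculator's code), a distribution with one certain outcome gives exactly zero bits:

```python
import math

certain = [1.0, 0.0, 0.0]  # one outcome is guaranteed, the rest are impossible
h = sum(-p * math.log2(p) for p in certain if p > 0)
print(h)  # 0.0 bits: a guaranteed outcome carries no information
```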
Q2: What's the maximum possible entropy?
A: For N events, maximum is log₂(N) bits, achieved when all probabilities are equal (1/N).
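This identity can be checked numerically (illustrative sketch only):

```python
import math

# For N equally likely outcomes, H = -N * (1/N) * log2(1/N) = log2(N)
for n in (2, 4, 8):
    probs = [1.0 / n] * n
    h = sum(-p * math.log2(p) for p in probs)
    print(n, h, math.log2(n))  # prints 2 1.0 1.0, 4 2.0 2.0, 8 3.0 3.0
```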
Q3: Why base 2 logarithm?
A: Base 2 gives entropy in bits, the fundamental unit of information. Other bases give different units: the natural logarithm gives nats and base 10 gives hartleys.
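For instance, entropies computed with different log bases differ only by a constant factor (a sketch with an arbitrarily chosen distribution):

```python
import math

probs = [0.5, 0.3, 0.2]
h_bits = sum(-p * math.log2(p) for p in probs)    # base 2  -> bits
h_nats = sum(-p * math.log(p) for p in probs)     # base e  -> nats
h_hart = sum(-p * math.log10(p) for p in probs)   # base 10 -> hartleys
print(h_bits, h_nats / math.log(2), h_hart / math.log10(2))  # all three agree (about 1.485 bits)
```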
Q4: What if probabilities don't sum to 1?
A: The calculator requires valid probabilities (sum = 1). Normalize your inputs if needed.
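If your inputs are raw counts or weights, they can be normalized first (an assumed helper, not part of the calculator):

```python
def normalize(weights):
    """Scale nonnegative weights so they sum to 1."""
    total = sum(weights)
    if total <= 0:
        raise ValueError("need at least one positive weight")
    return [w / total for w in weights]

print(normalize([3, 1, 1]))  # [0.6, 0.2, 0.2] -- now a valid probability distribution
```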
Q5: How is this different from thermodynamic entropy?
A: The two are mathematically similar, but Shannon entropy measures information content while thermodynamic entropy measures physical disorder.