Probabilities
The Probability Density Function¶
The Probability Density Function (PDF) of a continuous random variable describes the relative likelihood of each value this variable can take. It is the standard representation of random processes and systems.
The probability $P$ that a random variable takes a value between $a$ and $b$ can be derived from the PDF:
$$ P(a \leq x \leq b) = \int_{a}^{b} f(x) dx$$
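This integral can be evaluated numerically. The following is a minimal sketch using scipy (an assumption, not used elsewhere in this text), with a standard normal distribution and the interval $[-1, 1]$ chosen for illustration:

```python
from scipy.integrate import quad
from scipy.stats import norm

# Probability mass between a and b for a standard normal
# distribution, via numerical integration of its PDF.
a, b = -1.0, 1.0
p, _ = quad(norm.pdf, a, b)

print(p)  # ~0.6827: about 68% of values fall within one standard deviation
```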
The probability that a value is less than or equal to a given value $a$:
$$ P(x \leq a) = \int_{-\infty}^{a} f(x) dx$$
Note that the PDF itself gives only the relative likelihood of a value, not a probability: for $a=b$, the integral above vanishes, so the probability of any single exact value is zero.
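Both points can be checked numerically. A minimal sketch, again assuming a standard normal distribution and an arbitrary threshold $a = 0.5$:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

a = 0.5  # arbitrary threshold

# P(x <= a): integrate the PDF from -inf to a.
p_tail, _ = quad(norm.pdf, -np.inf, a)
print(p_tail)  # ~0.6915

# The probability of one exact value is zero:
# the integral over [a, a] vanishes.
p_point, _ = quad(norm.pdf, a, a)
print(p_point)  # 0.0
```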
The Cumulative Distribution Function¶
The CDF of a random variable is defined as the integral of the PDF up to the value $x$:
$$ F(x) = \int_{-\infty}^{x} f(t) \, dt$$
The CDF is monotonically increasing. As the integral of the PDF over all $x$ has to be $1$, the CDF's maximum value is $1$.
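As an illustration, a minimal numerical sketch (assuming a standard normal PDF) that approximates the CDF by a cumulative sum and shows both properties:

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import norm

x = np.linspace(-4, 4, 400)

# Approximate the CDF as the running integral (cumulative sum) of the PDF.
cdf = np.cumsum(norm.pdf(x)) * (x[1] - x[0])

plt.plot(x, cdf)                       # monotonically increasing
plt.axhline(1, color='gray', ls='--')  # saturates at 1
plt.xlabel('x')
plt.ylabel('F(x)')
plt.show()
```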
Typical Distributions¶
Normal Distribution¶
The Normal (or Gaussian) Distribution is one of the most widely used probability distributions in many fields. It is parameterized by the mean $\mu$ and the variance $\sigma^2$:
$$ p(x) = \frac{1}{\sqrt{2 \pi \sigma^2}} ~ e^{-\frac{(x-\mu)^2}{2 \sigma^2}}$$
The square root of the variance is the standard deviation:
$$ \sigma = \sqrt{\sigma^2} $$
While $\mu$ moves the center of the distribution, $\sigma^2$ controls the peakiness: the smaller the variance, the sharper the peak. The following plot shows normal distributions with different variances:
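Such a plot can be reproduced with a sketch like the following (the specific variances are arbitrary; `scale` in scipy is the standard deviation, hence the square root):

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import norm

x = np.linspace(-5, 5, 500)

# Same mean, different variances: larger variance flattens the peak.
for var in [0.5, 1.0, 2.0]:
    plt.plot(x, norm.pdf(x, loc=0, scale=np.sqrt(var)),
             label=f'$\\sigma^2 = {var}$')

plt.xlabel('x')
plt.ylabel('p(x)')
plt.legend()
plt.show()
```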
Uniform Distribution¶
In a uniform distribution, all values between $a$ and $b$ have the same probability density:
$$ f(x) = \begin{cases} \frac{1}{b-a}, & a \leq x \leq b \\ 0, & \text{otherwise} \end{cases}$$
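A minimal sketch that plots this density, assuming $a=2$ and $b=5$:

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import uniform

a, b = 2.0, 5.0
x = np.linspace(0, 7, 500)

# scipy parameterizes the uniform distribution as [loc, loc + scale].
plt.plot(x, uniform.pdf(x, loc=a, scale=b - a))
plt.xlabel('x')
plt.ylabel('f(x)')
plt.show()
```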
Constant¶
For $a = b$, the distribution collapses to a single value, which occurs with probability $1$.
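This degenerate case can be sketched as a single spike of probability $1$ (the value $a = b = 3$ is an arbitrary choice for illustration):

```python
import matplotlib.pyplot as plt

a = 3.0  # the single admissible value

# Degenerate distribution: all probability mass sits on one value.
plt.stem([a], [1.0])
plt.xlim(0, 7)
plt.ylim(0, 1.2)
plt.xlabel('x')
plt.ylabel('p(x)')
plt.show()
```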
Probability Mass Function¶
The discrete counterpart of the PDF is called the Probability Mass Function (PMF). It corresponds to a normalized histogram: the relative frequency of each observed outcome of a process.
$$ p_X(x) = P(X=x)$$
Similar to the PDF, which integrates to $1$, all values of a PMF must sum to $1$:
$$ \sum_x p_X(x) = 1$$
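A minimal sketch that estimates a PMF from simulated observations (here, 1000 fair die rolls, an assumed example) and checks the normalization:

```python
import numpy as np

# Estimate a PMF from observations: count each outcome
# and normalize by the total number of samples.
samples = np.random.randint(1, 7, size=1000)  # 1000 simulated die rolls
values, counts = np.unique(samples, return_counts=True)
pmf = counts / len(samples)

print(dict(zip(values, pmf)))  # relative frequency per outcome
print(pmf.sum())               # 1.0
```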