The distribution of the sum of two independent random variables can be found either by convolution [^1] or by using moment generating functions.

## Sum of Random Variables as Convolution

### Discrete

The sum $Z = X + Y$ of two independent discrete random variables $X$ and $Y$ has probability

$$P(Z=z)=\sum_{x} P(X=x)\,P(Y=z-x)$$

If we write this in terms of PMFs, we can see that the PMF of the sum is the discrete convolution of the individual PMFs:

$$p_Z(z)=\sum_{x} p_X(x)\,p_Y(z-x)=(p_X * p_Y)(z)$$
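As a quick numerical sketch of the discrete case (the two fair six-sided dice are an illustrative assumption, not from the note), `np.convolve` applied to two PMFs gives the PMF of the sum:

```python
import numpy as np

# PMFs of two independent fair six-sided dice (faces 1..6)
p_x = np.full(6, 1 / 6)
p_y = np.full(6, 1 / 6)

# Discrete convolution: PMF of the sum, supported on 2..12
p_sum = np.convolve(p_x, p_y)

print(p_sum[5])  # index 5 corresponds to a sum of 7: P = 6/36
```

The convolved array sums to 1, as a PMF must.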

### Continuous

Similarly, for continuous random variables, the PDF of the sum is the continuous convolution of the two input PDFs:

$$f_Z(z)=\int_{-\infty}^{\infty} f_X(x)\,f_Y(z-x)\,dx=(f_X * f_Y)(z)$$
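The integral can be approximated on a grid; a minimal sketch (the Uniform(0, 1) inputs and the grid spacing are illustrative assumptions) recovers the classic triangular PDF of the sum of two uniforms:

```python
import numpy as np

dx = 0.001
grid = np.arange(0.0, 1.0, dx)
f_x = np.ones_like(grid)  # Uniform(0, 1) PDF sampled on the grid
f_y = np.ones_like(grid)

# Riemann-sum approximation of the continuous convolution integral
f_sum = np.convolve(f_x, f_y) * dx
z = np.arange(f_sum.size) * dx  # support of the sum: (0, 2)

# The sum of two Uniform(0,1) variables has the triangular PDF peaking near z = 1
print(z[f_sum.argmax()], f_sum.max())
print(f_sum.sum() * dx)  # integrates to approximately 1
```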

Note that both cases above require the random variables to be independent; in other words, the joint distribution must factor into the product of the marginals: $P(X=x, Y=y)=P(X=x)\,P(Y=y)$.

Another interesting consequence is the central limit theorem: since convolution tends to smooth functions, repeatedly convolving a distribution with itself eventually produces the bell-curve shape.
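This smoothing can be seen numerically. A sketch (the flat starting PMF and the 30 summands are illustrative assumptions): repeatedly convolving a rectangular PMF with itself yields something very close to a normal density with the matching mean and variance:

```python
import numpy as np

# Start from a flat ("rectangle") PMF and convolve it with itself repeatedly
p = np.full(6, 1 / 6)
pmf = p
for _ in range(29):  # PMF of the sum of 30 i.i.d. copies
    pmf = np.convolve(pmf, p)

# Compare against the normal density with the same mean and variance
k = np.arange(pmf.size)
mean = (k * pmf).sum()
var = ((k - mean) ** 2 * pmf).sum()
normal = np.exp(-((k - mean) ** 2) / (2 * var)) / np.sqrt(2 * np.pi * var)

print(np.abs(pmf - normal).max())  # small: the repeated convolution is nearly Gaussian
```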

![[convolution rectangle.png]]

## Examples

### The Sum of Independent Poissons is Poisson

$$X \sim \operatorname{Poisson}(\lambda_{1}),\ Y \sim \operatorname{Poisson}(\lambda_{2}) \text{ independent} \Rightarrow X+Y \sim \operatorname{Poisson}(\lambda_{1}+\lambda_{2})$$

and more generally: independent $X_{i} \sim \operatorname{Poisson}(\lambda_{i})$ implies

$$\sum_{i=1}^{n} X_{i} \sim \operatorname{Poisson}\left(\sum_{i=1}^{n} \lambda_{i}\right)$$
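The Poisson closure claim can be verified by direct convolution of the PMFs. A sketch (the rates 2 and 3 and the truncation at 60 terms are illustrative assumptions; the truncated convolution is exact on the first 60 entries):

```python
import numpy as np
from math import exp, factorial

def poisson_pmf(lam, n):
    """PMF of Poisson(lam) on 0..n-1 (truncated; fine when n is much larger than lam)."""
    return np.array([exp(-lam) * lam**k / factorial(k) for k in range(n)])

lam1, lam2, n = 2.0, 3.0, 60
p = np.convolve(poisson_pmf(lam1, n), poisson_pmf(lam2, n))[:n]

# The convolution matches the Poisson(lam1 + lam2) PMF term by term
print(np.abs(p - poisson_pmf(lam1 + lam2, n)).max())  # floating-point small
```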

### The Sum of [[independent and identically distributed random variables|i.i.d.]] [[exponential random variable|Exponentials]] is [[gamma random variable|Gamma]]

$$X_{1}, X_{2}, \ldots, X_{n} \stackrel{\text{iid}}{\sim} \operatorname{Exp}(\text{rate}=\lambda) \Rightarrow \sum_{i=1}^{n} X_{i} \sim \Gamma(n, \lambda)$$
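A quick Monte Carlo sanity check of this fact (the choices $n=5$, $\lambda=2$, and the trial count are illustrative assumptions): the sampled sums should match the $\Gamma(n,\lambda)$ mean $n/\lambda$ and variance $n/\lambda^{2}$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, lam, trials = 5, 2.0, 200_000

# Each row: n i.i.d. Exp(rate=lam) draws; sum across the row
sums = rng.exponential(scale=1 / lam, size=(trials, n)).sum(axis=1)

# Gamma(n, rate=lam) has mean n/lam = 2.5 and variance n/lam**2 = 1.25
print(sums.mean())  # close to 2.5
print(sums.var())   # close to 1.25
```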

### The Sum of i.i.d. Gammas is Gamma

$$X_{1}, X_{2}, \ldots, X_{n} \stackrel{\text{iid}}{\sim} \Gamma(\alpha, \beta) \Rightarrow \sum_{i=1}^{n} X_{i} \sim \Gamma(n \alpha, \beta)$$
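This closure fact follows from the moment-generating-function route mentioned at the top of the note: the MGF of a sum of independent variables is the product of their MGFs. A small symbolic sketch (SymPy assumed available; $(\beta/(\beta-t))^{\alpha}$ is the standard rate-parameterized Gamma MGF for $t<\beta$):

```python
import sympy as sp

t, alpha, beta = sp.symbols('t alpha beta', positive=True)
n = 3  # any fixed number of i.i.d. terms

# MGF of Gamma(alpha, rate beta) for t < beta
mgf = (beta / (beta - t)) ** alpha

# Product of n identical Gamma MGFs equals the MGF of Gamma(n*alpha, beta)
diff = sp.simplify(mgf ** n - (beta / (beta - t)) ** (n * alpha))
print(diff)  # 0
```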