
Creative uses of the Moment Generating Function of a Random Variable


The theorem below obtains the raw moments of the $\text{Lognormal}$ distribution by using the MGF of the $\text{Normal}$ distribution. I found this quite creative, since it differs from the usual uses of the MGF (e.g. for sums of independent random variables):

$\text{Given } X \sim \text{Lognormal}(\mu,\sigma^2)$, so that $\ln X \sim N(\mu, \sigma^2)$, the raw moment $E[X^t]$ can be written in terms of the moment generating function of $Z \sim N(\mu, \sigma^2)$ (Thm 6.9, Sahoo, Probability and Mathematical Statistics):$$E[X^{t}]=E[e^{\ln X^{t}}]=E[e^{t(\ln X)}]=M_{(\ln X)}(t)=M_Z(t)=e^{\mu t+\frac{1}{2}\sigma^2 t^2}$$
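As a quick numerical sanity check (not part of the theorem, just a sketch assuming NumPy is available and with illustrative parameter values), a Monte Carlo estimate of $E[X^t]$ can be compared with $M_Z(t)=e^{\mu t+\sigma^2 t^2/2}$:

```python
import numpy as np

# Check E[X^t] = M_Z(t) = exp(mu*t + sigma^2*t^2/2)
# for X ~ Lognormal(mu, sigma^2), i.e. Z = ln X ~ N(mu, sigma^2).
# mu, sigma, t below are illustrative values, not from the question.
rng = np.random.default_rng(0)
mu, sigma, t = 0.3, 0.5, 1.7

ln_x = rng.normal(mu, sigma, size=10**7)      # Z = ln X
x = np.exp(ln_x)                              # X ~ Lognormal(mu, sigma^2)

mc_estimate = np.mean(x**t)                   # Monte Carlo estimate of E[X^t]
mgf_value = np.exp(mu*t + 0.5*(sigma*t)**2)   # M_Z(t)
print(mc_estimate, mgf_value)                 # the two should agree closely
```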

For the standard uniform distribution, $E[X^n]$ can be computed directly from the integral definition (with $f(x)=1$ on $[0,1]$):$$E(X^n)=\int_{0}^{1}x^n \cdot f(x) \,dx = \int_{0}^{1}x^n \, dx = \left[\dfrac{x^{n+1}}{n+1}\right]_{0}^{1}=\dfrac{1}{n+1}$$

I tried a method similar to the one in Sahoo, using the transformation $X \sim \mathcal{U}(0,1) \implies -\ln X \sim \text{Exp}(\lambda=1)$ to get, for $T<1$:$$E[X^{-T}]=E[e^{\ln X^{-T}}]=E[e^{T(-\ln X)}]=M_{(-\ln X)}(T)=\frac{1}{1-T}$$With the change of variable $t=-T$, this recovers the same result as above:$$E[X^{t}]=\frac{1}{1+t}$$
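The same identity can also be checked numerically; here is a short sketch (again assuming NumPy, with an illustrative choice of $t$) comparing a Monte Carlo estimate of $E[X^t]$ for $X \sim \mathcal{U}(0,1)$ with $1/(1+t)$:

```python
import numpy as np

# Check E[X^t] = 1/(1+t) for X ~ U(0,1), t > -1,
# via the MGF of -ln X ~ Exp(1): M_{-ln X}(T) = 1/(1-T) for T < 1.
rng = np.random.default_rng(0)
t = 0.8                              # illustrative exponent, t > -1
x = rng.uniform(0.0, 1.0, size=10**7)

mc_estimate = np.mean(x**t)          # Monte Carlo estimate of E[X^t]
T = -t                               # change of variable t = -T
mgf_value = 1.0 / (1.0 - T)          # M_{-ln X}(T), i.e. 1/(1+t)
print(mc_estimate, mgf_value, 1.0 / (1.0 + t))
```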

Are there any other creative, interesting and elegant ways in which the MGF can be used to obtain results?

Note: In the interest of clarity, I am asking for specific numeric and algebraic answers, just as in the examples I have given.

