Gaussian Distribution
Named after Carl Friedrich Gauss, this distribution describes the uncertainty of samples under some assumptions on the error.
Distribution of Errors
Assumptions
- errors sum up to zero (the average of the samples is the true value)
- smaller errors are more likely: if $|z_1| < |z_2|$, then $f(z_1) \ge f(z_2)$
- errors are independent
Solving for the probability density of the error
Suppose:
- the probability density function of the error is $f(z)$
- the true value is $L$
- the observed values are $X = \{ x_i \}$
- the errors are $\{ z = x - L \mid x \in X \}$
Then the Maximum Likelihood Estimation of the true value gives the first constraint:
$$\begin{align}
\ln P &= \sum_{i=1}^n \ln f(x_i - L) & \text{log-likelihood} \\
\frac{\partial \ln P}{\partial L} &= \sum_{i=1}^n \frac{f'(x_i - L)}{f(x_i - L)} (-1) \\
&= \sum_{i=1}^n g(x_i - L) = 0 & \text{MLE: } L \text{ s.t. } P \text{ is maximized}
\end{align}$$

This process says: to maximize the likelihood $P$, the function $g(z) = -\frac{f'(z)}{f(z)}$ should sum up to $0$ over the errors.
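This constraint can be checked numerically. A minimal sketch, assuming a Gaussian error density with known scale (all parameters below are illustrative): the $L$ that maximizes the log-likelihood lands on the sample mean.

```python
import numpy as np

# Numerical sketch (assumes a Gaussian error density with known sigma):
# the L maximizing ln P = sum ln f(x_i - L) should be the sample mean,
# i.e. the L at which sum g(x_i - L) = 0.
rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=2.0, size=1000)    # hypothetical observations

def log_likelihood(L, x, sigma=2.0):
    z = x - L                                     # errors for candidate L
    return np.sum(-0.5 * (z / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi)))

grid = np.linspace(4.0, 6.0, 2001)                # candidate true values
best = grid[np.argmax([log_likelihood(L, x) for L in grid])]
print(best, x.mean())                             # maximizer ≈ sample mean
```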
Now with the assumption that all errors sum to $0$, we have two constraints:

$$\begin{cases} \sum_{i=1}^n g(x_i - L) = 0 \\ \sum_{i=1}^n (x_i - L) = 0 \end{cases}$$
$g(z) = Cz$ can be one solution, thus:

$$-\frac{f'(z)}{f(z)} = Cz$$
Solve $f$ from the differential equation:

$$f'(z) = -Cz f(z) \implies f(z) = A e^{-\frac{C}{2}z^2}$$
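A quick numerical check (with illustrative constants $A$ and $C$) that $f(z) = A e^{-\frac{C}{2}z^2}$ indeed satisfies the differential equation:

```python
import numpy as np

# Compare a numerical derivative of f(z) = A exp(-C z^2 / 2)
# with the right-hand side -C z f(z); constants are illustrative.
C, A = 2.0, 1.0
z = np.linspace(-3.0, 3.0, 601)
f = A * np.exp(-C / 2 * z**2)
df = np.gradient(f, z)           # central-difference derivative
rhs = -C * z * f
print(np.max(np.abs(df - rhs)))  # small discretization error
```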
So the distribution of the error is parameterized by $A$ and $C$, and the parameterized probability density is:

$$f(z) = A e^{-\frac{C}{2}z^2}$$
why not $e^{+\frac{C}{2}z^2}$?
The exponent must be negative (i.e. $C > 0$): otherwise $f$ would blow up as $|z| \to \infty$ and could not integrate to $1$, so it could not be a probability density.
Solving for $A$
As the probability density is the derivative of the cumulative distribution function, the integral of $f$ on $(-\infty, +\infty)$ should be $1$:
$$\begin{align}
1 &= \int_{-\infty}^{+\infty} f(z)\, dz \\
&= A \int_{-\infty}^{+\infty} e^{-\frac{C}{2}z^2}\, dz \\
&\overset{z = \sqrt{\frac{2}{C}} t}{=} A\sqrt{\frac{2}{C}} \int_{-\infty}^{+\infty} e^{-t^2}\, dt
\end{align}$$

with the Gaussian Integral $\int_{-\infty}^{+\infty} e^{-t^2}\, dt = \sqrt{\pi}$:

$$1 = A\sqrt{\frac{2\pi}{C}} \implies A = \sqrt{\frac{C}{2\pi}}$$
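The normalization can also be verified numerically for an arbitrary $C$ (the value below is illustrative):

```python
import numpy as np

# With A = sqrt(C / (2*pi)), f(z) = A exp(-C/2 z^2) should integrate to 1.
C = 0.5                                    # illustrative value
A = np.sqrt(C / (2 * np.pi))
z = np.linspace(-50.0, 50.0, 200001)       # wide enough to hold ~all the mass
f = A * np.exp(-C / 2 * z**2)
total = np.sum(f) * (z[1] - z[0])          # Riemann-sum approximation
print(total)                               # ≈ 1.0
```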
Interpreting $C$
Variance of Error
$$\begin{align}
\sigma^{2} &= \int_{-\infty}^{+\infty} z^{2} f(z)\, dz \\
&= -\frac{1}{C} \int_{-\infty}^{+\infty} z \left( (-Cz) A \exp\left( -\frac{Cz^{2}}{2} \right) \right) dz \\
&= -\frac{1}{C} \int_{-\infty}^{+\infty} z \left[ A \exp\left( -\frac{Cz^{2}}{2} \right) \right]' dz \\
&= -\frac{1}{C} \left\{ \left[ z f(z) \right]_{-\infty}^{+\infty} - \int_{-\infty}^{+\infty} f(z)\, dz \right\} & \text{integration by parts} \\
&= -\frac{1}{C} \left[ 0 - 1 \right] = \frac{1}{C}
\end{align}$$

With $C = \frac{1}{\sigma^{2}}$ and $A = \sqrt{\frac{C}{2\pi}} = \frac{1}{\sigma\sqrt{2\pi}}$, then:

$$f(z) = \frac{1}{\sigma\sqrt{2\pi}} e^{-\frac{z^{2}}{2\sigma^{2}}}$$
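Again a numerical check (with an illustrative $C$): the second moment of $f$ should equal $1/C$.

```python
import numpy as np

# Second moment of f(z) = A exp(-C/2 z^2) should be 1/C.
C = 4.0                                   # illustrative value
A = np.sqrt(C / (2 * np.pi))
z = np.linspace(-20.0, 20.0, 400001)
f = A * np.exp(-C / 2 * z**2)
dz = z[1] - z[0]
var = np.sum(z**2 * f) * dz               # integral of z^2 f(z) dz
print(var, 1 / C)                         # both ≈ 0.25
```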
Expectation of Error
For any function anti-symmetric w.r.t. the origin (an odd function), its integral on $(-\infty, +\infty)$ is $0$:

$$\int_{-\infty}^{+\infty} h(z)\, dz = 0 \quad \text{if } h(-z) = -h(z)$$
as $z f(z)$ is anti-symmetric w.r.t. the origin:

$$E[z] = \int_{-\infty}^{+\infty} z f(z)\, dz = 0$$
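This cancellation is easy to see numerically (the $\sigma$ below is illustrative):

```python
import numpy as np

# z * f(z) is odd, so its integral over a symmetric grid vanishes.
sigma = 1.5                                # illustrative value
z = np.linspace(-30.0, 30.0, 600001)       # symmetric grid around 0
f = np.exp(-(z / sigma) ** 2 / 2) / (sigma * np.sqrt(2 * np.pi))
mean = np.sum(z * f) * (z[1] - z[0])       # approximates E[z]
print(mean)                                # ≈ 0
```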
Distribution of the Samples
PDF of Samples
Substituting the observed variable $x = z + L$ (i.e. $z = x - L$) back, the density function of the sample distribution is:

$$f(x) = \frac{1}{\sigma\sqrt{2\pi}} e^{-\frac{(x - L)^{2}}{2\sigma^{2}}}$$
Expectation of Samples
The expectation of $x$ would be:

$$E[x] = E[z + L] = E[z] + L = L$$
Variance of Samples
According to the shift transformation $x = z + L$, the variance of $x$ would be the same as that of $z$, which is $\sigma^{2}$:

$$\mathrm{Var}[x] = \mathrm{Var}[z + L] = \mathrm{Var}[z] = \sigma^{2}$$
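The shift-invariance of variance can be confirmed empirically (the $\sigma$ and $L$ below are illustrative):

```python
import numpy as np

# Shifting a random variable by a constant leaves its variance unchanged.
rng = np.random.default_rng(42)
z = rng.normal(0.0, 2.0, size=100_000)    # errors, sigma = 2 (illustrative)
L = 10.0                                  # hypothetical true value
x = z + L                                 # observed samples
print(z.var(), x.var())                   # identical up to float error
```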
Conclusion
In conclusion, with the assumptions on the error, the distribution of the samples is described as:

$$x \sim \mathcal{N}(\mu, \sigma^{2}), \qquad f(x) = \frac{1}{\sigma\sqrt{2\pi}} e^{-\frac{(x - \mu)^{2}}{2\sigma^{2}}}$$

where $\mu = L$ is the expectation and $\sigma^{2}$ is the variance.
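As a final sanity check, samples drawn from $\mathcal{N}(\mu, \sigma^{2})$ should have an empirical mean close to $\mu$ and an empirical variance close to $\sigma^{2}$ ($\mu$ and $\sigma$ below are illustrative):

```python
import numpy as np

# Empirical check: sample mean ≈ mu and sample variance ≈ sigma^2.
rng = np.random.default_rng(7)
mu, sigma = 3.0, 0.5                      # illustrative parameters
x = rng.normal(mu, sigma, size=200_000)
print(x.mean(), x.var())                  # ≈ 3.0 and ≈ 0.25
```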