Why does the error decrease as the sample size increases?


I have some data in hand, and I increase the sample size by increasing the sampling frequency while the variance stays fixed. As the sample size increases, the mean squared error decreases.

What could be the reason for this? Why is it decreasing?


There is 1 answer below.


The estimation variance is usually inversely proportional to the sample size (the relationship need not be exactly linear in general). For example, when estimating a mean, the variance of the sample mean is $\operatorname{Var}[\bar{x}]=\frac{\sigma^2}{n}$, where $\sigma$ is the noise standard deviation and $n$ is the sample size. For an unbiased estimator the MSE equals this variance, so as $n$ grows the MSE shrinks. Here you can see a short example.
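The example linked above is not reproduced here, so the following is a minimal simulation sketch of the same idea: it repeatedly estimates a mean from $n$ noisy observations and shows the empirical MSE tracking $\sigma^2/n$. The values of `true_mean`, `sigma`, and `n_trials` are illustrative choices, not taken from the question.

```python
import numpy as np

rng = np.random.default_rng(0)

true_mean = 5.0     # quantity being estimated (illustrative value)
sigma = 2.0         # fixed noise standard deviation (illustrative value)
n_trials = 10_000   # Monte Carlo repetitions per sample size

for n in [10, 100, 1000, 10000]:
    # draw n noisy observations per trial and take the sample mean of each trial
    samples = rng.normal(true_mean, sigma, size=(n_trials, n))
    estimates = samples.mean(axis=1)

    # empirical MSE of the sample mean vs. the theoretical value sigma^2 / n
    mse = np.mean((estimates - true_mean) ** 2)
    print(f"n = {n:6d}:  empirical MSE = {mse:.5f},  sigma^2/n = {sigma**2 / n:.5f}")
```

Each tenfold increase in $n$ should cut the printed MSE by roughly a factor of ten, matching the $\sigma^2/n$ prediction.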