Definition
Inference comes from the word ‘infer’, which means to draw a conclusion about a phenomenon on the basis of the information at hand. Drawing a conclusion about a phenomenon through a statistical procedure is called statistical inference. Inference may be divided into two categories:
- Inductive Inference
- Deductive Inference
Inductive Inference
By inductive inference we mean drawing a conclusion about a small part of a phenomenon and generalizing it to the whole phenomenon. The research worker performs an experiment and obtains some data; on the basis of these data, certain conclusions are drawn and generalized to the whole population. Examples include:
- Test of hypothesis
- Estimation
- Decision theory
- Classification etc.
Deductive Inference
By deductive inference we mean drawing a conclusion about a small part of a phenomenon from the whole, known phenomenon. For example, suppose the distribution of a population is known and we want to find the distribution of a small part (sample) of that population, i.e., a sampling distribution; this is a problem of deductive inference. In this process we use the following terms:
- Estimator: An estimator is a function of the sample observations used to find an estimate. Practically, we can say that an estimator is the instrument for finding an estimate.
- Estimate: An estimate is the value taken by the estimator for a particular sample. Practically, we can say that an estimate is the value produced by the estimator.
- Estimation: The process of finding an estimate by means of an estimator is called estimation. Practically, we can say that estimation is the procedure of finding an estimate; a brief illustration follows this list.
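As a concrete illustration (a minimal sketch in Python; the sample values are made up for demonstration), the sample mean is an estimator of the population mean, and the number it returns for a given sample is the estimate:

```python
import numpy as np

# Hypothetical sample of five observations (values chosen only for illustration)
sample = np.array([4.2, 5.1, 3.8, 4.9, 5.3])

# The sample mean is the ESTIMATOR: a function of the observations.
def sample_mean(x):
    return x.sum() / len(x)

# The number it returns for this particular sample is the ESTIMATE.
estimate = sample_mean(sample)
print(f"Estimate of the population mean: {estimate:.2f}")
```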
Generally, a statistical inference consists of two parts:
- Estimation
- Test of hypothesis
Estimation is further divided into two categories; a short sketch contrasting them follows the list:
- Point estimation
- Interval estimation
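To make the distinction concrete, here is a minimal sketch (assuming a normally distributed sample and a 95% confidence level) that contrasts a point estimate of the mean with an interval estimate:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sample = rng.normal(loc=10.0, scale=2.0, size=50)  # hypothetical data

# Point estimation: a single value proposed for the population mean.
point_estimate = sample.mean()

# Interval estimation: a range intended to cover the mean with 95% confidence,
# using the t-distribution since the population variance is unknown.
sem = stats.sem(sample)
low, high = stats.t.interval(0.95, df=len(sample) - 1,
                             loc=point_estimate, scale=sem)

print(f"Point estimate: {point_estimate:.3f}")
print(f"95% interval estimate: ({low:.3f}, {high:.3f})")
```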
Methods of point estimation
Some important methods of point estimation are given below; a brief maximum-likelihood sketch follows the list:
- Method of moments
- Method of maximum likelihood
- Method of minimum chi-square
- Method of least squares
- Bayesian method
- Method of minimum distance
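As an illustration of one of these methods, here is a minimal maximum-likelihood sketch (assuming Bernoulli data, for which the MLE of the success probability p is the sample proportion):

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.binomial(n=1, p=0.3, size=200)  # hypothetical Bernoulli sample

# For Bernoulli(p) the log-likelihood is
#   l(p) = sum(x) * log(p) + (n - sum(x)) * log(1 - p),
# which is maximized at p_hat = sample proportion.
p_hat = data.mean()
print(f"Maximum likelihood estimate of p: {p_hat:.3f}")

# Numerical check: the sample proportion maximizes the log-likelihood on a grid.
grid = np.linspace(0.01, 0.99, 99)
loglik = data.sum() * np.log(grid) + (len(data) - data.sum()) * np.log(1 - grid)
print(f"Grid maximizer: {grid[np.argmax(loglik)]:.2f}")
```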
Properties of a good estimator
A good estimator is one whose distribution is concentrated near the true value of the parameter. To judge this, the following properties have been developed:
- Unbiasedness
- Consistency
- Efficiency or minimum variance
- Minimum variance bound
- Sufficiency
- Mean square error etc.
Unbiasedness of an estimator
Let tₙ be any estimator or statistic calculated from a sample of size n drawn from a population with density f(x; θ). If E(tₙ) = θ for every value of θ, then tₙ is said to be an unbiased estimator of θ; otherwise, tₙ is said to be a biased estimator of θ. The bias of an estimator is measured by the difference B(tₙ) = E(tₙ) − θ (a simulation illustrating bias follows the list).
- If the bias is positive, the estimator has an upward bias.
- If the bias is negative, the estimator has a downward bias.
- If the bias is 0, the estimator is unbiased.
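A classic illustration (a minimal simulation sketch, assuming a normal population with σ² = 4) is the sample variance: dividing the sum of squared deviations by n gives a downward-biased estimator of σ², while dividing by n − 1 gives an unbiased one:

```python
import numpy as np

rng = np.random.default_rng(2)
true_var = 4.0            # sigma^2 of the assumed N(0, 2^2) population
n, reps = 10, 100_000     # sample size and number of simulated samples

samples = rng.normal(0.0, 2.0, size=(reps, n))
var_n   = samples.var(axis=1, ddof=0)  # divisor n   (biased)
var_nm1 = samples.var(axis=1, ddof=1)  # divisor n-1 (unbiased)

# B(t_n) = E(t_n) - theta, approximated by averaging over many samples
print(f"Bias with divisor n  : {var_n.mean() - true_var:+.3f}")   # about -0.4
print(f"Bias with divisor n-1: {var_nm1.mean() - true_var:+.3f}") # about  0.0
```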
Consistency of an estimator
An estimator tₙ is said to be a consistent estimator of the population parameter θ if tₙ converges in probability to θ as n → ∞. Consistency is a large-sample property: a consistent estimator approaches the corresponding parameter as the sample size grows. For example, the sample mean, sample variance, and sample moments are consistent estimators of the population mean, population variance, and population moments, respectively.
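The following minimal sketch (assuming an exponential population with mean 2) illustrates consistency: the sample mean drifts toward the population mean as n grows:

```python
import numpy as np

rng = np.random.default_rng(3)
true_mean = 2.0  # mean of the assumed exponential population

# The sample mean gets closer to the population mean as n increases,
# illustrating convergence in probability.
for n in (10, 100, 10_000, 1_000_000):
    xbar = rng.exponential(scale=true_mean, size=n).mean()
    print(f"n = {n:>9}: sample mean = {xbar:.4f}")
```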
Sufficiency of an estimator
An estimator is said to be a sufficient estimator of a population parameter θ, θ ∈ H (where H is the parameter space), if it gives the maximum information about the unknown parameter θ; that is, no other statistic calculated from the same sample provides any additional information about θ.
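For instance (a minimal sketch assuming Bernoulli data), the sum Σxᵢ is sufficient for the success probability p: any two samples with the same sum have exactly the same likelihood, so the data enter the likelihood only through that sum:

```python
import numpy as np

def bernoulli_likelihood(x, p):
    # L(p; x) = p^sum(x) * (1 - p)^(n - sum(x)): depends on x only via sum(x)
    s, n = x.sum(), len(x)
    return p**s * (1 - p)**(n - s)

# Two different samples with the same sufficient statistic sum(x) = 3
x1 = np.array([1, 1, 1, 0, 0, 0])
x2 = np.array([0, 1, 0, 1, 0, 1])

for p in (0.3, 0.5, 0.7):
    print(p, bernoulli_likelihood(x1, p), bernoulli_likelihood(x2, p))
```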