A Minimum Power Divergence Class of CDFs and Estimators for the Binary Choice Model

   This paper uses information-theoretic methods, in the form of the Cressie-Read (CR) family of power divergence measures, to introduce a new class of probability distributions and estimators for competing explanations of the data in the binary choice model. The statistical model specifies no explicit parameterization of the function connecting the data to the Bernoulli probabilities. A large class of probability density functions emerges that includes the conventional logit model. The resulting class of statistical models and estimators requires minimal a priori model structure and non-sample information, and provides a basis for a range of model and estimator extensions.
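As background, the Cressie-Read family referred to above can be sketched numerically. The following is a minimal illustration of the standard (1984) power divergence statistic for discrete distributions, not the paper's exact specification; the function name `cressie_read` and the treatment of the limiting cases are assumptions made for this sketch:

```python
import numpy as np

def cressie_read(p, q, lam):
    """Cressie-Read power divergence I_lambda(p || q) between two
    discrete probability distributions p and q.

    General form: (1 / (lam * (lam + 1))) * sum_i p_i * ((p_i / q_i)^lam - 1).
    The cases lam = 0 and lam = -1 are defined as limits, giving the
    Kullback-Leibler divergence and its reverse, respectively.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    if np.isclose(lam, 0.0):       # limit lam -> 0: KL divergence sum p log(p/q)
        return float(np.sum(p * np.log(p / q)))
    if np.isclose(lam, -1.0):      # limit lam -> -1: reverse KL sum q log(q/p)
        return float(np.sum(q * np.log(q / p)))
    return float(np.sum(p * ((p / q) ** lam - 1.0)) / (lam * (lam + 1.0)))
```

Varying `lam` traces out the family: for example, `lam = 1` recovers one half of the Pearson chi-square discrepancy, while `lam = 0` recovers the (empirical-likelihood-related) Kullback-Leibler divergence.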

Keywords: semiparametric binary response models and estimators, conditional moment equations, squared error loss, Cressie-Read statistic, information theoretic methods, minimum power divergence
AMS 1991 Classification: Primary 62E20
JEL Classifications: C10, C2.