Related Publications
Likelihood Ratio Tests for Model Selection and Non-Nested Hypotheses
In this paper, we develop a classical approach to model selection. Using the Kullback-Leibler Information Criterion to measure the closeness of a model to the truth, we propose ...
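The truncated abstract above refers to measuring a model's closeness to the truth with the Kullback-Leibler Information Criterion and building likelihood ratio tests on that basis. As a rough, hedged illustration only (not the paper's own procedure), a Vuong-type standardised likelihood-ratio statistic can be computed from per-observation log-likelihoods of two competing non-nested models; every name below is illustrative.

```python
# Minimal sketch of a Vuong-type model-selection statistic, assuming you already
# have the per-observation log-likelihoods of two fitted, non-nested models.
# Function and argument names (vuong_statistic, loglik_a, loglik_b) are illustrative.
import numpy as np
from scipy import stats

def vuong_statistic(loglik_a, loglik_b):
    """Standardised log-likelihood-ratio statistic; approximately N(0, 1) under
    the null that both models are equally close to the truth in KLIC terms."""
    d = np.asarray(loglik_a) - np.asarray(loglik_b)   # pointwise LR contributions
    n = d.size
    omega = d.std()                                   # sample s.d. of the contributions
    z = d.sum() / (np.sqrt(n) * omega)                # standardised LR statistic
    p_value = 2 * stats.norm.sf(abs(z))               # two-sided p-value
    return z, p_value
```

A large positive (negative) value of the statistic favours the first (second) model; values near zero do not discriminate between them.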
Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
Deep learning tools have gained tremendous attention in applied machine learning. However, such tools for regression and classification do not capture model uncertainty. In com...
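The truncated abstract above concerns obtaining model uncertainty from dropout networks. As a hedged sketch of the Monte Carlo dropout idea (keep dropout active at prediction time and average several stochastic forward passes), the snippet below uses an illustrative architecture and function names that are assumptions, not code from the paper.

```python
# Minimal sketch of Monte Carlo dropout prediction: dropout stays stochastic at
# inference time, and repeated forward passes give a predictive mean and variance.
import torch
import torch.nn as nn

# Illustrative regression network with a dropout layer.
model = nn.Sequential(nn.Linear(10, 64), nn.ReLU(), nn.Dropout(p=0.5), nn.Linear(64, 1))

def mc_dropout_predict(model, x, n_samples=100):
    model.train()  # keeps dropout layers stochastic during prediction
    with torch.no_grad():
        samples = torch.stack([model(x) for _ in range(n_samples)])
    # Predictive mean and a simple spread estimate across the stochastic passes.
    return samples.mean(dim=0), samples.var(dim=0)

x = torch.randn(5, 10)
mean, variance = mc_dropout_predict(model, x)
```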
Stability of multiagent systems with time-dependent communication links
We study a simple but compelling model of a network of agents interacting via time-dependent communication links. The model finds application in a variety of fields including sync...
Redundancy reduction with information-preserving nonlinear maps
The basic idea of linear principal component analysis (PCA) involves decorrelating coordinates by an orthogonal linear transformation. In this paper we generalize this i...
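The truncated abstract above starts from the linear case: decorrelating coordinates with an orthogonal transformation derived from the data covariance. The sketch below illustrates only that plain linear PCA step, under assumed names, not the paper's nonlinear generalization.

```python
# Minimal sketch of linear PCA decorrelation: rotate centred data onto the
# orthogonal eigenbasis of its covariance so the new coordinates are uncorrelated.
import numpy as np

def pca_decorrelate(X):
    """Return decorrelated coordinates of the rows of X (samples x features)."""
    Xc = X - X.mean(axis=0)                     # centre the data
    cov = np.cov(Xc, rowvar=False)              # sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)      # orthogonal eigenbasis
    order = np.argsort(eigvals)[::-1]           # largest variance first
    return Xc @ eigvecs[:, order]               # coordinates with diagonal covariance
```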
A Least Squares Correction for Selectivity Bias
When estimating regression models it is very nearly always assumed that the sample is random. The recent literature has begun to deal with the problems which arise when estimati...
Publication Info
- Year: 1998
- Type: article
- Volume: 66
- Issue: 2
- Pages: 333-333
- Citations: 327
- Access: Closed
Identifiers
- DOI: 10.2307/2998561