A search is undertaken for an estimator of β in the Normal Linear Model with better mean squared error properties than the usual least squares estimator. The properties of some classical techniques, such as restricted least squares (which includes selection of a subset of the independent variables), are examined, along with more recent techniques such as ridge regression and Bayesian estimators. Most of these can be shown analytically to improve on least squares only when the true parameter vector β lies in some subspace of the parameter space. Empirical Bayes estimators are in general difficult to handle analytically, and so several of these are studied by Monte Carlo methods. A particular modification of one of these empirical Bayes estimators is found to improve on least squares over a large region of the parameter space, and its use is demonstrated on a small data set. Some suggestions for further improvement of this estimator are given, and some techniques for further study of estimators by Monte Carlo methods are recommended.
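The kind of Monte Carlo comparison described above can be illustrated with a small sketch (not the code used in the thesis): it simulates the Normal Linear Model y = Xβ + ε with ε ~ N(0, σ²I), and compares the total mean squared error of ordinary least squares with that of a ridge estimator. The sample size, dimension, ridge penalty k, and the true β near the origin are all illustrative assumptions; they are chosen to land in the region of the parameter space where shrinkage estimators typically improve on least squares.

```python
import numpy as np

# Illustrative Monte Carlo sketch: MSE of OLS vs. ridge regression in the
# Normal Linear Model y = X beta + eps, eps ~ N(0, sigma^2 I).
# All constants below (n, p, sigma, k, the true beta) are assumptions.
rng = np.random.default_rng(0)
n, p, sigma, k, reps = 50, 5, 1.0, 2.0, 2000
beta = np.array([0.5, -0.3, 0.2, 0.0, 0.1])  # true beta near the origin,
                                             # where ridge tends to win
X = rng.standard_normal((n, p))              # fixed design across replications
XtX = X.T @ X

mse_ols = mse_ridge = 0.0
for _ in range(reps):
    y = X @ beta + sigma * rng.standard_normal(n)
    b_ols = np.linalg.solve(XtX, X.T @ y)                  # least squares
    b_ridge = np.linalg.solve(XtX + k * np.eye(p), X.T @ y)  # ridge, penalty k
    mse_ols += np.sum((b_ols - beta) ** 2)
    mse_ridge += np.sum((b_ridge - beta) ** 2)

mse_ols /= reps     # Monte Carlo estimate of E||b_ols - beta||^2
mse_ridge /= reps   # Monte Carlo estimate of E||b_ridge - beta||^2
```

For a true β this close to the origin the ridge estimator usually shows the smaller total MSE; moving β far from the origin (or shrinking the penalty k toward zero) reverses or erases the advantage, which mirrors the abstract's point that such improvements hold only over part of the parameter space.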