In recent years, biased estimators have received a great deal of attention because they can often produce more accurate estimates in multiparameter problems. One sense in which biased estimators are often more accurate is that their mean square error is smaller. In this work several parametric families of estimators are examined, and good values of the parameters are sought by approximate analytical arguments. These parameter values are then tested by computing and plotting graphs of the mean square error. In this way the risks of various estimators can be seen, and estimators with large risk can be discarded.

The risk functions are computed by numerical integration, a method faster and more accurate than the usual simulation studies. This makes it possible to evaluate a greater number of estimators; the method, however, applies only to spherically symmetric estimators.

The relationship of biased estimation to the use of prior information is made clear. This leads to a discussion of partially spherically symmetric estimators and to the observation that, although they are not uniformly better than spherically symmetric ones, they are usually better in a practical sense. It is shown how the theoretical results may be applied to the linear model, which is discussed in the very general case where it is not of full rank and there are linear restrictions on the parameter. A kind of weak prior knowledge often assumed for such a model makes the partially symmetric estimators attractive. Distributions of spherically symmetric estimators are briefly discussed.
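The central claim, that a biased estimator can have smaller mean square error than the usual unbiased one in a multiparameter problem, can be illustrated with a James-Stein-type shrinkage estimator. The sketch below is not from the work itself: it uses Monte Carlo simulation rather than the numerical integration described above, and the dimension, the true mean vector, and the choice to shrink toward the origin are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 5                      # dimension; James-Stein dominance requires p >= 3
theta = np.full(p, 1.0)    # hypothetical true mean vector (an assumption)
n_rep = 20000              # number of Monte Carlo replications

# Draw X ~ N(theta, I_p), one observation per replication
x = rng.normal(theta, 1.0, size=(n_rep, p))

# Usual unbiased estimator: the observation itself
mse_mle = np.mean(np.sum((x - theta) ** 2, axis=1))

# James-Stein estimator shrinking toward the origin:
#   (1 - (p - 2) / ||x||^2) * x
norms2 = np.sum(x ** 2, axis=1)
js = (1.0 - (p - 2) / norms2)[:, None] * x
mse_js = np.mean(np.sum((js - theta) ** 2, axis=1))

print(mse_mle, mse_js)  # the shrinkage estimator has smaller total MSE
```

For the unbiased estimator the total MSE equals the dimension p, while the shrinkage estimator trades a small bias for a uniformly smaller risk, which is the sense of "more accurate" used throughout.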