This paper treats an abstract parametric family of symmetric linear estimators for the mean vector of a standard linear model. The estimator in this family that has smallest estimated quadratic risk is shown to attain, asymptotically, the smallest risk achievable over all candidate estimators in the family. The asymptotic analysis is carried out under a strong Gauss-Markov form of the linear model in which the dimension of the regression space tends to infinity. Leading examples to which the results apply include: (a) penalized least squares fits constrained by multiple, weighted, quadratic penalties; and (b) running, symmetrically weighted, means. In both instances, the weights define a parameter vector whose natural domain is a continuum.
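The following is a minimal sketch, not the paper's construction, of the selection principle the abstract describes: among symmetric linear fits, choose the one minimizing an estimated quadratic risk. It uses example (a) in its simplest form, a penalized least squares fit with a single quadratic penalty weight t rather than the multiple weighted penalties of the paper, and a Mallows-type unbiased risk estimate with known noise variance. The design X, penalty P, sigma, and the grid over t are assumptions made only for illustration.

```python
# Sketch: pick a symmetric linear estimator by minimizing estimated quadratic risk.
# Assumed setting (illustrative only): y = mu + noise, sigma^2 known, candidate fits
# muhat_t = A_t y with A_t = X (X'X + t P)^{-1} X', a penalized least-squares smoother
# with one quadratic penalty P and weight t >= 0.
import numpy as np

rng = np.random.default_rng(0)

n, p, sigma = 200, 60, 1.0
X = rng.standard_normal((n, p))
beta = np.concatenate([np.linspace(2.0, 0.0, 20), np.zeros(p - 20)])
mu = X @ beta                            # true mean vector (known here by simulation)
y = mu + sigma * rng.standard_normal(n)  # observed response

# Quadratic roughness penalty on the coefficients (second differences).
D = np.diff(np.eye(p), n=2, axis=0)
P = D.T @ D

def smoother(t):
    """Symmetric hat matrix A_t of the penalized least-squares fit."""
    return X @ np.linalg.solve(X.T @ X + t * P, X.T)

def estimated_risk(A):
    """Mallows-type unbiased estimate of E||A y - mu||^2 / n when sigma^2 is known."""
    resid = y - A @ y
    return (resid @ resid + (2.0 * np.trace(A) - n) * sigma**2) / n

def true_risk(A):
    """Exact quadratic risk E||A y - mu||^2 / n, computable because mu is known here."""
    bias = A @ mu - mu
    return (bias @ bias + sigma**2 * np.trace(A @ A.T)) / n

grid = np.geomspace(1e-3, 1e3, 60)
est = np.array([estimated_risk(smoother(t)) for t in grid])
tru = np.array([true_risk(smoother(t)) for t in grid])

t_hat = grid[est.argmin()]  # penalty weight chosen by minimizing estimated risk
t_opt = grid[tru.argmin()]  # oracle weight minimizing the true risk

print(f"estimated-risk choice t = {t_hat:.3g}, true risk {tru[est.argmin()]:.4f}")
print(f"oracle choice         t = {t_opt:.3g}, true risk {tru.min():.4f}")
```

In the paper's setting the weights form a parameter vector ranging over a continuum and the regression dimension grows, under which the estimated-risk minimizer is shown to match the oracle's risk asymptotically; the grid search over a scalar t above merely illustrates the mechanism.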