Theoretical procedures are developed for comparing the performance of arbitrarily selected admissible feedback controls among themselves and with the optimal solution of a nonlinear optimal stochastic control problem. Iterative design schemes are proposed for successively improving the performance of a controller until a satisfactory design is achieved. Specifically, the exact design procedure is based on the generalized Hamilton-Jacobi-Bellman equation for the cost function of nonlinear stochastic systems, and the approximate design procedure for the infinite-time nonlinear stochastic regulator problem is developed using upper and lower bounds on the cost functions. Stability of this problem is also considered. For a given controller, both upper and lower bounds on its cost function can be obtained by solving a partial differential inequality. These bounds, constructed without knowledge of the optimal controller, are used as a measure of the acceptability of suboptimal controllers. These results establish an approximation theory of optimal stochastic control and provide a practical procedure for selecting effective controls for nonlinear stochastic systems. An entropy reformulation of the generalized Hamilton-Jacobi-Bellman equation is also presented.
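As a brief sketch of the central object (the notation below is assumed for illustration and does not appear in the abstract): for a diffusion $dx = \bigl(f(x) + g(x)\,u(x)\bigr)\,dt + \sigma(x)\,dw$ with running cost $l(x,u)$, the generalized Hamilton-Jacobi-Bellman equation associated with a *fixed* admissible feedback control $u(x)$ may be written as

```latex
% Sketch under assumed notation: V^u denotes the cost of a fixed
% admissible feedback u(x); the equation applies the generator of the
% closed-loop diffusion to V^u.
\begin{equation*}
  \frac{\partial V^{u}}{\partial x}\bigl(f(x) + g(x)\,u(x)\bigr)
  + \tfrac{1}{2}\operatorname{tr}\!\bigl(\sigma(x)\sigma^{\top}(x)\,
      \nabla^{2} V^{u}(x)\bigr)
  + l\bigl(x, u(x)\bigr) = 0 .
\end{equation*}
```

The optimal cost satisfies the usual HJB equation, with a minimization over $u$ replacing the fixed control. Relaxing the equality above to $\le 0$ or $\ge 0$ gives the kind of partial differential inequalities whose solutions furnish upper and lower bounds on a given controller's cost, as described in the abstract.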