This paper is concerned with optimization and minimization problems that are governed by operator equations, such as partial differential or integral equations, and are therefore naturally posed in an infinite-dimensional function space $V$. We first construct a prototype algorithm of steepest descent type in $V$ and prove its convergence. Using a Riesz basis of $V$, we transform the minimization problem into an equivalent one posed in a sequence space of type $\ell_p$, and we convert the prototype algorithm into an adaptive method in $\ell_p$. This adaptive algorithm is shown to converge under mild conditions on the parameters appearing in it. Under more restrictive assumptions we also establish the rate of convergence of the algorithm and prove that its work/accuracy balance is asymptotically optimal. Finally, we present two concrete examples.
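To fix ideas, the following is a minimal sketch of the two ingredients named above; the functional $J$, the step sizes $\alpha_n$, and the basis $\{\psi_\lambda\}_{\lambda\in\Lambda}$ are illustrative assumptions and not the paper's exact choices.

\[
  u_{n+1} \;=\; u_n - \alpha_n \nabla J(u_n), \qquad n = 0, 1, 2, \dots
  % generic steepest descent for a differentiable functional J on V,
  % with assumed step sizes \alpha_n (a sketch, not the paper's precise algorithm)
\]
\[
  u \;=\; \sum_{\lambda \in \Lambda} u_\lambda \psi_\lambda
  \quad\Longrightarrow\quad
  \min_{u \in V} J(u)
  \;=\;
  \min_{(u_\lambda)_{\lambda\in\Lambda}} J\Big(\sum_{\lambda \in \Lambda} u_\lambda \psi_\lambda\Big)
  % expanding u in an assumed Riesz basis \{\psi_\lambda\} of V identifies the
  % problem with an equivalent minimization over coefficient sequences,
  % posed in a sequence space of \ell_p type
\]

The adaptive method then works with finitely supported approximations of the coefficient sequence $(u_\lambda)_{\lambda\in\Lambda}$.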