Instabilities in neuromorphic machine learning can occur when synaptic updates meant to encode matrix transforms are not normalized. This phenomenon is encountered in Hebbian learning [5], where, as a synapse's strength grows, post-synaptic activity increases, further enhancing the synaptic strength and leading to a runaway condition in which the synaptic weights saturate [3, 7]. A number of mechanisms have been suggested for regulating and stabilizing this phenomenon [1, 2, 4, 6]. Here, we present a new neuromorphic algorithm that directly normalizes synaptic connectivity. The algorithm is based on an information control network built from synfire-gated synfire chains acting in concert with Hebbian synapses [8, 10], and it normalizes the synaptic weights by minimizing a cost function. We demonstrate its effectiveness as a component of a Locally Competitive Algorithm (LCA) [9] with dictionary learning, which exhibits runaway weight growth in the absence of an effective normalization procedure [15].

LA-UR-21-31884
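To make the runaway phenomenon concrete, the sketch below (not the paper's synfire-gated algorithm) contrasts plain Hebbian updates with the same updates followed by an explicit L2 renormalization of the weight vector. The learning rate, dimensions, and input distribution are illustrative assumptions only.

```python
import numpy as np

# Minimal illustration of Hebbian runaway vs. explicit weight normalization.
# This is a generic demonstration, not the synfire-gated normalization
# algorithm presented in the paper; eta and the sizes are arbitrary choices.
rng = np.random.default_rng(0)
x_data = rng.normal(size=(1000, 16))   # presynaptic activity samples
eta = 0.01                             # learning rate (assumed)

w_plain = 0.1 * rng.normal(size=16)    # weights updated by plain Hebbian rule
w_norm = w_plain.copy()                # weights renormalized after each update

for x in x_data:
    # Plain Hebbian rule: dw = eta * y * x, with postsynaptic activity y = w.x;
    # on average this multiplies w by (I + eta*C), so |w| grows without bound.
    w_plain += eta * (w_plain @ x) * x

    # Same Hebbian step, followed by direct normalization of the weight vector.
    w_norm += eta * (w_norm @ x) * x
    w_norm /= np.linalg.norm(w_norm)

print(f"unnormalized |w| = {np.linalg.norm(w_plain):.3e}")  # runaway growth
print(f"normalized   |w| = {np.linalg.norm(w_norm):.3e}")   # stays at 1
```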