Bayesian inference has become a powerful and popular technique for understanding psychological phenomena. However, compared with frequentist statistics, current methods employing Bayesian statistics typically require time-intensive computations, often hindering our ability to evaluate alternatives thoroughly. In this article, we advocate for an alternative strategy for performing Bayesian inference, called variational Bayes (VB). VB methods posit a parametric family of distributions that could conceivably contain the target posterior distribution, and then attempt to identify the parameters within that family that best match the target. In this sense, acquiring the posterior becomes an optimization problem rather than a complex integration problem. VB methods have enjoyed considerable success in fields such as neuroscience and machine learning, yet have received surprisingly little attention in fields such as psychology. Here, we identify and discuss both the advantages and disadvantages of using VB methods. In considering possible strategies for making VB methods appropriate for psychological models, we develop the differential evolution variational inference algorithm and compare its performance with that of a widely used VB algorithm. As test problems, we evaluate the algorithms on their ability to recover the posterior distribution of the linear ballistic accumulator model and a hierarchical signal detection model. Although we cannot endorse VB methods in their current form as a complete replacement for conventional methods, we argue that their accuracy and speed warrant inclusion within the cognitive scientist's toolkit.

Translational Abstract

Bayesian statistics is an alternative statistical framework that has become popular for understanding psychological phenomena. In contrast to the point estimates and confidence intervals of classical statistics, the Bayesian framework provides a distribution (the posterior) that describes our uncertainty about the parameters of interest. Closed-form solutions for deriving the posterior are rare, and Bayesians therefore typically rely on computational methods to approximate it. The time-intensive nature of these computational methods can prohibit the application of the Bayesian framework. In this article, we advocate for an alternative, and often more efficient, strategy for performing Bayesian inference called variational Bayes (VB). VB methods make assumptions about the functional form of the posterior distribution, and then systematically morph the approximating function's parameters so that it best matches the target posterior. VB methods have enjoyed considerable success in fields such as neuroscience and machine learning, yet have received surprisingly little attention in fields such as psychology. Here, we identify and discuss both the advantages and disadvantages of using VB methods relative to conventional posterior approximation methods. We investigate a series of algorithmic components to determine which, if any, can be packaged into a general-purpose algorithm for problems often encountered when fitting psychological models to data, testing them on two popular models from psychology. Although we cannot endorse VB methods in their current form as a complete replacement for conventional methods, we argue that their accuracy and speed warrant inclusion within the cognitive scientist's toolkit.
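
To make the optimization view described above concrete, the standard VB formulation (stated here with notation not used in the abstracts: data y, model parameters θ, and an approximating family q_λ indexed by variational parameters λ) chooses λ to minimize the Kullback–Leibler divergence from the approximation to the posterior, which is equivalent to maximizing the evidence lower bound (ELBO):

\[
\lambda^{*} \;=\; \arg\min_{\lambda}\; \mathrm{KL}\big(q_{\lambda}(\theta)\,\|\,p(\theta \mid y)\big)
\;=\; \arg\max_{\lambda}\; \mathbb{E}_{q_{\lambda}}\big[\log p(y,\theta) - \log q_{\lambda}(\theta)\big].
\]

Particular VB algorithms, such as the differential evolution variational inference algorithm developed in this article, then correspond to particular choices of the approximating family q_λ and of the optimizer used to maximize this bound.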