To address the slow convergence, low accuracy, and tendency of the slime mould algorithm (SMA) to become trapped in local optima, we propose an improved SMA (OJESMA). OJESMA improves performance by combining opposition-based learning, joint opposite selection, and the equilibrium optimizer. First, we introduce opposition-based learning into the generation of the initial slime mould population. Second, we incorporate a joint opposite selection strategy, comprising selective leading opposition and dynamic opposite. Finally, we introduce the equilibrium candidate principle of the equilibrium optimizer into SMA, which strengthens the algorithm's global search capability and resistance to stagnation. We conducted optimization experiments on 29 test functions from CEC2017 and 10 benchmark functions from CEC2020, together with nonparametric statistical analyses (Friedman and Wilcoxon tests). The experimental results and nonparametric tests show that OJESMA achieves better optimization accuracy, convergence performance, and stability. To further validate the effectiveness of the algorithm, we also performed optimization tests on six engineering problems and the variable-index Muskingum model. In summary, OJESMA demonstrates its practical value and advantages in solving a variety of complex optimization problems, offering new perspectives and methods for the development of optimization algorithms.
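The abstract refers to opposition-based learning (OBL) for generating the initial population. As an illustration only, the following minimal Python sketch shows one common way such an OBL initialization is implemented: a random population and its opposite (lb + ub - x) are both evaluated, and the fittest half is retained. The function names and the sphere test function are our own assumptions, not taken from the paper.

```python
import numpy as np

def obl_initialization(pop_size, dim, lb, ub, fitness, rng=None):
    """Illustrative opposition-based learning (OBL) initialization.

    Generates a random population and its opposite population
    (x_opp = lb + ub - x), evaluates both, and keeps the best
    pop_size individuals.
    """
    rng = np.random.default_rng() if rng is None else rng
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)

    pop = lb + rng.random((pop_size, dim)) * (ub - lb)   # random candidates
    opp = lb + ub - pop                                   # opposite candidates
    merged = np.vstack([pop, opp])

    scores = np.apply_along_axis(fitness, 1, merged)      # evaluate all 2*pop_size
    best = np.argsort(scores)[:pop_size]                  # keep the fittest half
    return merged[best], scores[best]

# Example: initialize 30 candidates for a 10-D sphere function (hypothetical test case).
if __name__ == "__main__":
    pop, scores = obl_initialization(
        pop_size=30, dim=10, lb=-100, ub=100,
        fitness=lambda x: float(np.sum(x**2)),
    )
    print(pop.shape, scores.min())
```

This sketch only covers the initialization step; the joint opposite selection and equilibrium candidate mechanisms described in the paper operate during the subsequent iterations and are not shown here.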