Large-scale sparse multi-objective optimization problems (LSSMOPs) are common in the real world, arising in portfolio optimization, neural network training, and other applications. In recent years, a number of multi-objective evolutionary algorithms (MOEAs) have been proposed to deal with LSSMOPs. Among the dimensionality reduction strategies in sparse MOEAs, one approach uses unsupervised neural networks to shrink the search space and thereby improve the search efficiency of the genetic operators. However, existing algorithms of this kind are inefficient because they spend considerable time training the networks in every generation. In addition, most sparse MOEAs ignore the relationship between the binary vectors and the real vectors that together determine the decision variables. This paper therefore proposes an evolutionary algorithm for solving LSSMOPs. The proposed algorithm adopts an adaptive dimensionality reduction method to balance convergence and efficiency: it groups the binary vectors and adaptively uses a restricted Boltzmann machine to reduce the search space of the binary vectors. The generation of the real vectors is then guided by the binary vectors, which strengthens the relationship between the two parts of the decision variables. Experimental results on eight benchmark problems and on neural network training problems show that the proposed algorithm outperforms existing state-of-the-art evolutionary algorithms for LSSMOPs.
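
To make the abstract's encoding concrete, the sketch below illustrates the common sparse-MOEA convention in which a solution is a real vector masked by a binary vector, and shows how a small restricted Boltzmann machine can compress the binary vectors into a lower-dimensional space. This is a hedged illustration under assumed conventions (the class `TinyRBM`, its CD-1 training loop, and all hyperparameters are our own, not the paper's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class TinyRBM:
    """Minimal Bernoulli RBM; illustrative only, not the paper's model."""
    def __init__(self, n_visible, n_hidden):
        self.W = rng.normal(0, 0.1, (n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)
        self.b_h = np.zeros(n_hidden)

    def encode(self, v):
        # Binary mask -> hidden (reduced) representation probabilities.
        return sigmoid(v @ self.W + self.b_h)

    def decode(self, h):
        # Hidden code -> reconstructed mask probabilities.
        return sigmoid(h @ self.W.T + self.b_v)

    def cd1(self, v, lr=0.1):
        # One step of contrastive divergence (CD-1) on a batch of masks.
        h0 = self.encode(v)
        h0_sample = (h0 > rng.random(h0.shape)).astype(float)
        v1 = (self.decode(h0_sample) > rng.random(v.shape)).astype(float)
        h1 = self.encode(v1)
        self.W += lr * (v.T @ h0 - v1.T @ h1) / len(v)
        self.b_v += lr * (v - v1).mean(axis=0)
        self.b_h += lr * (h0 - h1).mean(axis=0)

# Sparse encoding: decision variables = real vector elementwise-masked
# by a sparse binary vector, so most variables are exactly zero.
D = 20
mask = (rng.random(D) < 0.2).astype(float)   # sparse binary vector
dec = rng.uniform(-1, 1, D)                  # real vector
x = mask * dec                               # actual decision variables

# Train the RBM on a population of sparse masks, then encode one mask
# into a 5-dimensional latent space (the reduced search space).
rbm = TinyRBM(n_visible=D, n_hidden=5)
masks = (rng.random((50, D)) < 0.2).astype(float)
for _ in range(20):
    rbm.cd1(masks)
code = rbm.encode(mask[None, :])
print(code.shape)  # (1, 5)
```

The key point the abstract makes is that the binary part (`mask`) decides which variables are nonzero, so it is natural to let it guide how the real part (`dec`) is varied; searching in the RBM's latent space instead of the full D-dimensional binary space is what reduces the search space.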