The dynamic evolution of the modern energy landscape has driven the integration of renewable energy sources (RESs) into power generation, catalyzing a paradigm shift towards converter-dominated power systems. This paper addresses the critical challenge of optimizing power-generating technologies within these complex systems to ensure both stability and sustainability. The underlying issue lies in effectively managing the energy-mix proportion, balancing the total electricity generated against the overall system load, while accounting for the intermittent nature of RESs. Motivated by the pressing need for adaptive and efficient power system management, this paper presents a pioneering deep reinforcement learning (DRL) approach for modern power system applications. Leveraging the Deep Q-Network (DQN) paradigm, the proposed methodology estimates the energy-mix proportion through a data-driven lens. Offline training and online testing, implemented by coupling MATLAB/Simulink with Python libraries, validate the approach's applicability in real-world scenarios. Results from comprehensive experiments on an IEEE 9-bus system underscore the efficacy of the DRL-based framework. Notably, the online short-circuit level (SCL) emerges as a robust indicator of power system security, a significant innovation in stability assessment. The model demonstrates remarkable responsiveness to load fluctuations, optimizing energy generation while respecting operational constraints. Furthermore, the adaptability of grid-forming (GFM) and grid-following (GFL) converters is showcased, highlighting their resilience in converter-dominated power systems. The study offers a promising avenue for future research and underscores the potential of DRL in power system optimization and operation for a sustainable energy future.
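As a point of reference for the DQN setup outlined above, the following is a minimal, self-contained sketch of how such an agent could map load measurements to a discrete energy-mix action. The state definition, action granularity, reward handling, network size, and hyperparameters shown here are illustrative assumptions rather than the paper's actual configuration.

```python
# Minimal DQN sketch (assumptions: state = per-bus load readings, actions = discrete
# renewable shares; all dimensions and hyperparameters are illustrative only).
import random
from collections import deque

import torch
import torch.nn as nn
import torch.optim as optim

STATE_DIM = 9    # e.g. load readings on an IEEE 9-bus system (assumed)
N_ACTIONS = 11   # candidate renewable shares 0%, 10%, ..., 100% (assumed)

class QNetwork(nn.Module):
    """Maps a system state to Q-values, one per candidate energy-mix action."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, N_ACTIONS),
        )

    def forward(self, x):
        return self.net(x)

q_net, target_net = QNetwork(), QNetwork()
target_net.load_state_dict(q_net.state_dict())
optimizer = optim.Adam(q_net.parameters(), lr=1e-3)
replay = deque(maxlen=10_000)   # stores (state, action, reward, next_state) tuples
gamma, epsilon = 0.99, 0.1

def select_action(state):
    """Epsilon-greedy choice of an energy-mix proportion for the given state."""
    if random.random() < epsilon:
        return random.randrange(N_ACTIONS)
    with torch.no_grad():
        return q_net(torch.as_tensor(state, dtype=torch.float32)).argmax().item()

def train_step(batch_size=64):
    """One DQN update from a random minibatch of replayed transitions."""
    if len(replay) < batch_size:
        return
    batch = random.sample(replay, batch_size)
    s, a, r, s2 = (torch.as_tensor(x, dtype=torch.float32) for x in zip(*batch))
    a = a.long()
    q = q_net(s).gather(1, a.unsqueeze(1)).squeeze(1)
    with torch.no_grad():
        target = r + gamma * target_net(s2).max(1).values
    loss = nn.functional.mse_loss(q, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

In an offline-training loop of the kind described above, transitions observed from the simulated plant would be appended to `replay`, and `target_net` would periodically be synchronized with `q_net`; the trained policy could then be queried online via `select_action`.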