This paper presents a machine learning (ML) model designed to track the maximum power point (MPP) of standalone photovoltaic (PV) systems. Because power generation in PV systems is nonlinear and driven by fluctuating weather conditions, handling this nonlinear data effectively remains a challenge, which makes ML techniques well suited to operating PV systems at their MPP. To this end, the research explores several ML algorithms, namely Linear Regression (LR), Ridge Regression (RR), Lasso Regression (Lasso R), Bayesian Regression (BR), Decision Tree Regression (DTR), Gradient Boosting Regression (GBR), and Artificial Neural Networks (ANN), to predict the MPP of PV systems. The model is trained on data derived from the PV unit's technical specifications, allowing the algorithms to forecast the maximum power, current, and voltage for given irradiance and temperature inputs. The predicted values are also used to determine the duty cycle of the boost converter. The simulation was conducted on a 100 kW PV array whose panels have an open-circuit voltage of 64.2 V and a short-circuit current of 5.96 A. Model performance was evaluated using Root Mean Square Error (RMSE), Mean Absolute Error (MAE), and the coefficient of determination (R²). In addition, correlation and feature-importance analyses were carried out to assess model suitability and the factors affecting the predictive accuracy of the ML models. The results show that the DTR algorithm outperformed LR, RR, Lasso R, BR, GBR, and ANN in predicting the maximum current (Im), voltage (Vm), and power (Pm) of the PV system, achieving RMSE, MAE, and R² values of 0.006, 0.004, and 0.99999 for Im; 0.015, 0.0036, and 0.99999 for Vm; and 2.36, 0.871, and 0.99999 for Pm. The size of the training dataset, the operating conditions of the PV system, the model type, and the data preprocessing were all found to significantly influence prediction accuracy.
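The workflow described above can be sketched in a few lines of scikit-learn code. The snippet below is an illustrative reconstruction, not the paper's actual pipeline: the closed-form MPP expressions, panel coefficients, and DC-link voltage `V_bus` are invented placeholders standing in for the real dataset, and only the overall pattern (fit a Decision Tree Regressor on irradiance and temperature, score it with RMSE/MAE/R², then map the predicted MPP voltage to a boost-converter duty cycle via D = 1 − Vin/Vout) follows the text.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, mean_absolute_error, r2_score

# Synthetic stand-in for the PV dataset (illustrative assumption, not the
# paper's data): irradiance G in W/m^2 and cell temperature T in deg C.
rng = np.random.default_rng(42)
n = 2000
G = rng.uniform(100.0, 1000.0, n)
T = rng.uniform(10.0, 50.0, n)

# Toy closed-form MPP model used only to generate training targets:
Vm = 54.7 - 0.15 * (T - 25.0) + 2.0 * np.log(G / 1000.0)  # MPP voltage (V)
Im = 5.58 * (G / 1000.0) * (1.0 + 0.0005 * (T - 25.0))    # MPP current (A)
Pm = Vm * Im                                              # MPP power (W)

X = np.column_stack([G, T])
y = np.column_stack([Im, Vm, Pm])   # multi-output regression targets
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

model = DecisionTreeRegressor(random_state=0).fit(X_tr, y_tr)
y_hat = model.predict(X_te)

# Evaluate each target with the three metrics used in the paper.
for i, name in enumerate(["Im", "Vm", "Pm"]):
    rmse = np.sqrt(mean_squared_error(y_te[:, i], y_hat[:, i]))
    mae = mean_absolute_error(y_te[:, i], y_hat[:, i])
    r2 = r2_score(y_te[:, i], y_hat[:, i])
    print(f"{name}: RMSE={rmse:.4f}  MAE={mae:.4f}  R2={r2:.5f}")

# Ideal boost converter: Vout = Vin / (1 - D), hence D = 1 - Vm_pred / V_bus.
V_bus = 250.0  # assumed DC-link voltage (hypothetical value)
Vm_pred = model.predict(np.array([[800.0, 25.0]]))[0, 1]
duty = 1.0 - Vm_pred / V_bus
print(f"Predicted Vm = {Vm_pred:.1f} V -> duty cycle D = {duty:.3f}")
```

Because `DecisionTreeRegressor` natively supports multi-output targets, a single fitted tree predicts Im, Vm, and Pm together; the same scaffold could be reused for the other six regressors the paper compares, swapping only the estimator.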