In Federated Learning (FL), personalization-based solutions have emerged to improve clients' performance in the presence of statistically heterogeneous local datasets. However, these methods are designed for static environments, and the previously learned model becomes obsolete as the local data distribution changes over time. This problem, known as concept drift, is widespread in many scenarios (e.g., changes in user habits, differing characteristics of visited geolocations, and seasonality effects, among others), yet it remains unaddressed by most solutions in the literature. In this work, we present FedPredict-Dynamic, a plugin that enables FL solutions to support statistically heterogeneous local data that is either stationary or non-stationary. The proposed method is a lightweight, reproducible, and modular plugin that can be added to a variety of FL solutions. Unlike state-of-the-art concept drift techniques, it rapidly adapts clients to the new data context at the prediction stage without requiring additional training. Results show that when the context changes, FedPredict-Dynamic achieves accuracy improvements of up to 195% over concept drift-aware solutions and 210.7% over traditional FL methods.