Deep reinforcement learning (DRL) algorithms have demonstrated impressive performance in developing optimal energy management strategies (EMSs) for individual hybrid electric vehicles (HEVs) under predefined driving cycles. However, in this area of research, the impact of thermal loads and thermal management (TM) is often overlooked. Moreover, HEVs may encounter unseen driving patterns that can hinder the overall performance of the EMS. Connected HEVs (C-HEVs) offer a promising solution; however, issues such as privacy, security, and communication load remain. This paper proposes a novel integrated thermal and energy management (ITEM) approach based on federated reinforcement learning (FRL) for achieving a generalized policy across multiple C-HEVs. This framework broadens learning across multiple environments while preserving the privacy and security of local HEV data. The proposed FRL algorithm is executed iteratively between multiple HEVs and a cloud-based center to develop global policies for all ITEMs. For each ITEM, two DRL agents, one for cabin TM and one for the EMS, build their local policies from recorded driving data. Only local and global models are exchanged between the cloud-based center and the ITEMs, which reduces communication overhead and preserves the privacy of driving data. Our findings demonstrate that this approach accelerates convergence and achieves total rewards comparable to a DRL strategy with advance access to driving cycle information. Furthermore, we demonstrate that the proposed approach maintains its performance even when additional DRL agents join the FRL network. The implementation capability is also verified on a hardware-in-the-loop (HIL) test setup.
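To illustrate the communication pattern described above, the following is a minimal sketch of the cloud-side aggregation loop, assuming a FedAvg-style average of policy parameters. The paper's exact aggregation rule, agent architectures, and update schedule are not given here, so all function and variable names below are illustrative placeholders.

```python
import numpy as np

def local_update(global_params, local_data, lr=1e-3):
    """Placeholder for one ITEM's on-vehicle DRL update (TM or EMS agent).

    In the actual framework each agent would run its DRL algorithm on
    locally recorded driving data; here we only mimic a parameter change.
    """
    rng = np.random.default_rng(abs(hash(local_data)) % (2**32))
    return {k: v - lr * rng.standard_normal(v.shape) for k, v in global_params.items()}

def aggregate(client_params):
    """Average the parameters uploaded by all ITEMs (FedAvg-style)."""
    keys = client_params[0].keys()
    return {k: np.mean([p[k] for p in client_params], axis=0) for k in keys}

# Two agents per ITEM: cabin thermal management (TM) and energy management (EMS).
global_models = {
    "tm":  {"w": np.zeros((4, 2)), "b": np.zeros(2)},
    "ems": {"w": np.zeros((6, 3)), "b": np.zeros(3)},
}
vehicles = ["hev_1", "hev_2", "hev_3"]          # connected HEVs in the FRL network

for round_idx in range(5):                      # communication rounds
    for agent in global_models:
        # Each vehicle trains locally on its own driving data; only the
        # resulting model parameters (never the raw data) are uploaded.
        uploads = [local_update(global_models[agent], f"{v}-{agent}") for v in vehicles]
        global_models[agent] = aggregate(uploads)  # cloud builds the global policy
```

The key property conveyed by this sketch is that raw driving data never leaves the vehicle: only model parameters travel between each ITEM and the cloud-based center, which is what limits communication overhead and preserves data privacy.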