The escalating intensity and frequency of floods, exacerbated by global climate change, underscore the urgent need to address growing flood risks. Rapid and precise flood detection is paramount for responding efficiently to emergencies and executing disaster relief measures, enabling swift reactions to flood disasters and minimizing the resulting losses. The 2024 IEEE GRSS Data Fusion Contest Track 1 centers on leveraging multi-source remote sensing data, particularly synthetic aperture radar (SAR) data, to classify flood and non-flood areas. For this contest, we recognize the importance of managing uncertain predictions and present an efficient Uncertainty-Aware Fusion Network (UAFNet). Specifically, we build on the traditional encoder-decoder architecture, first employing the Pyramid Vision Transformer (PVT) as a feature extractor. We then apply a standard decoding strategy, the feature pyramid network, to obtain an initial flood extraction map with relatively high uncertainty. Leveraging this uncertain extraction map, we introduce an Uncertainty Rank Algorithm to quantify the uncertainty level of each foreground and background pixel. We seamlessly integrate this algorithm with our proposed Uncertainty-Aware Fusion Module, enabling level-by-level feature refinement and ultimately yielding a refined extraction map with minimal uncertainty. Using the proposed UAFNet, we train multiple models with different PVT variants as encoders, and further apply test-time augmentation and a multi-model fusion strategy to improve the final flood extraction accuracy. Our technical solution achieved the first-place ranking in the 2024 IEEE GRSS Data Fusion Contest Track 1, with an F1 score of 82.985% on the official test set.
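The abstract does not give the exact formulation of the Uncertainty Rank Algorithm; the following is only a minimal illustrative sketch of one plausible pixel-wise ranking, assuming the decoder produces a sigmoid foreground-probability map. Here uncertainty peaks where the probability is near 0.5 and vanishes near 0 or 1, and pixels are then discretized into rank levels (the function name, level count, and linear-confidence measure are assumptions, not the authors' published design):

```python
import numpy as np

def uncertainty_rank(prob, n_levels=4):
    """Hypothetical pixel-wise uncertainty ranking.

    prob: array of predicted foreground probabilities in [0, 1].
    Returns (uncertainty, level):
      uncertainty -- 1 - |2p - 1|, maximal (1.0) at p = 0.5;
      level       -- integer rank in {0, ..., n_levels - 1},
                     higher meaning more uncertain.
    """
    uncertainty = 1.0 - np.abs(2.0 * prob - 1.0)
    # Discretize into n_levels ranks; clip so that
    # uncertainty == 1.0 still maps to the top rank.
    level = np.minimum((uncertainty * n_levels).astype(int), n_levels - 1)
    return uncertainty, level
```

Ranks like these could then gate which pixels each fusion stage refines, with the most uncertain level revisited first.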