Nuclei segmentation is of pivotal importance in pathology diagnosis and treatment. Accurate and automated segmentation of nuclei not only significantly reduces manual labor but also advances the development of intelligent medical systems. Over the past decade, with the emergence of deep learning techniques, deep convolutional neural networks (CNNs) have demonstrated remarkable efficacy in nuclei segmentation tasks. Nevertheless, as the field progresses, the limitations of CNNs have become increasingly apparent. Although CNNs excel at feature extraction, they focus mainly on local details, overlooking crucial global information and failing to capture long-range dependencies. The transformer architecture, by introducing self-attention, compensates for this shortcoming of CNNs, encoding long-range dependencies within images and extracting high-dimensional information. However, the reliance of transformers on large training datasets poses a significant challenge, particularly in medical imaging, where dataset sizes are typically limited. In this paper, we propose TC-Former, a novel and efficient transformer-CNN architecture that arranges a transformer branch and a CNN branch in parallel. A feature exchange module is designed to exchange information between the two branches, thereby synergizing global dependencies with low-level spatial details. In addition, drawing inspiration from ASPP, we introduce MWPP to effectively capture multi-scale information, enhancing generalization across nuclei of various sizes and shapes. We evaluate the proposed method on the Multi-Organ Nuclei Segmentation (MoNuSeg) dataset. Extensive experiments demonstrate that TC-Former not only achieves state-of-the-art segmentation performance but also exhibits good generalization capabilities across various organs.
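To make the parallel two-branch idea concrete, the following is a minimal PyTorch sketch, not the authors' released code, of one encoder stage that pairs a convolutional block with a self-attention block and exchanges features between them. The module names, the additive 1x1-convolution exchange scheme, and all hyperparameters are illustrative assumptions; the actual TC-Former feature exchange module and MWPP are defined in the paper itself.

```python
# Minimal sketch (assumed, not the authors' implementation): one parallel
# CNN/transformer stage with a feature-exchange step, assuming both branches
# produce feature maps of the same shape (B, C, H, W).
import torch
import torch.nn as nn


class FeatureExchange(nn.Module):
    """Hypothetical exchange module: each branch adds a 1x1-projected copy of
    the other branch's features, fusing global context with local detail."""

    def __init__(self, channels):
        super().__init__()
        self.cnn_to_trans = nn.Conv2d(channels, channels, kernel_size=1)
        self.trans_to_cnn = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, cnn_feat, trans_feat):
        cnn_out = cnn_feat + self.trans_to_cnn(trans_feat)
        trans_out = trans_feat + self.cnn_to_trans(cnn_feat)
        return cnn_out, trans_out


class ParallelStage(nn.Module):
    """One encoder stage: a convolutional block and a self-attention block run
    in parallel, then exchange information."""

    def __init__(self, channels, num_heads=4):
        super().__init__()
        self.conv_block = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )
        self.attn = nn.MultiheadAttention(channels, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(channels)
        self.exchange = FeatureExchange(channels)

    def forward(self, cnn_feat, trans_feat):
        b, c, h, w = trans_feat.shape
        # CNN branch: local spatial features
        cnn_feat = self.conv_block(cnn_feat)
        # Transformer branch: flatten to tokens, self-attention, reshape back
        tokens = trans_feat.flatten(2).transpose(1, 2)      # (B, H*W, C)
        attn_out, _ = self.attn(tokens, tokens, tokens)
        tokens = self.norm(tokens + attn_out)
        trans_feat = tokens.transpose(1, 2).reshape(b, c, h, w)
        # Exchange information between the two branches
        return self.exchange(cnn_feat, trans_feat)


if __name__ == "__main__":
    x = torch.randn(1, 64, 32, 32)
    stage = ParallelStage(64)
    cnn_out, trans_out = stage(x, x.clone())
    print(cnn_out.shape, trans_out.shape)  # (1, 64, 32, 32) for both branches
```

In this sketch the exchange is a simple residual addition after per-branch processing; the point is only to illustrate how global attention features and local convolutional features can be mixed at every stage rather than only at the final fusion.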