Knowledge graph embedding excels at capturing the intrinsic relations and semantics within large volumes of information for link prediction. In recent years, embedding methods, especially those based on convolutional neural networks, have achieved impressive results. However, many previous approaches focus on interactions between relations and entities while ignoring the interactions among internal embedding elements and the crucial role of high-frequency features. In this paper, we propose EHE, a novel knowledge graph Embedding model for link prediction that uses 2D convolution operations and integrates an Embedding permutation strategy with a High-frequency feature fusion mechanism. First, we design an embedding permutation mechanism that rearranges the internal elements of the embedding vectors, efficiently broadening local interactions, especially among elements that are far apart in the one-dimensional space. Subsequently, a high-frequency feature fusion module is proposed to capture high-frequency feature representations using Sobel and Laplacian operators. Additionally, a projection attention mechanism is employed to emphasize the distinctive semantic regions of interest in entities and relations. We evaluate our approach on several benchmark link prediction datasets. On the key metrics MRR and H@1, our method achieves the best overall performance against existing state-of-the-art methods on five public datasets, demonstrating its superior capacity for link prediction.
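
To make the high-frequency feature fusion idea mentioned above more concrete, the following is a minimal sketch, not the authors' implementation: it reshapes an embedding vector into a 2D grid and applies fixed Sobel and Laplacian kernels via 2D convolution, then fuses the resulting edge-like responses. All names (`HighFreqFusionSketch`, `height`, `width`, the 1x1 fusion layer) are illustrative assumptions.

```python
# Illustrative sketch only: fixed Sobel/Laplacian filtering over a 2D-reshaped
# embedding, loosely following the high-frequency feature fusion idea in the
# abstract. Module and parameter names are assumptions, not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class HighFreqFusionSketch(nn.Module):
    """Extracts edge-like (high-frequency) responses from a reshaped embedding."""

    def __init__(self, height: int, width: int):
        super().__init__()
        self.height, self.width = height, width
        sobel_x = torch.tensor([[-1., 0., 1.],
                                [-2., 0., 2.],
                                [-1., 0., 1.]])
        sobel_y = sobel_x.t()
        laplacian = torch.tensor([[0., 1., 0.],
                                  [1., -4., 1.],
                                  [0., 1., 0.]])
        # Stack the three fixed kernels as non-trainable convolution filters.
        kernels = torch.stack([sobel_x, sobel_y, laplacian]).unsqueeze(1)  # (3,1,3,3)
        self.register_buffer("kernels", kernels)
        # A 1x1 convolution fuses the three high-frequency maps into one channel.
        self.fuse = nn.Conv2d(3, 1, kernel_size=1)

    def forward(self, emb: torch.Tensor) -> torch.Tensor:
        # emb: (batch, height * width) embedding vector, reshaped to a 2D "image".
        x = emb.view(-1, 1, self.height, self.width)
        high_freq = F.conv2d(x, self.kernels, padding=1)   # (batch, 3, H, W)
        fused = self.fuse(high_freq) + x                    # residual fusion with input
        return fused.view(emb.size(0), -1)


if __name__ == "__main__":
    emb = torch.randn(4, 200)                  # e.g. 200-dimensional entity embeddings
    module = HighFreqFusionSketch(height=10, width=20)
    print(module(emb).shape)                   # torch.Size([4, 200])
```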