Knowledge graph reasoning, or completion, aims to infer missing facts by reasoning over the information already present in the knowledge graph. In this work, we explore the problem of temporal knowledge graph reasoning, which performs inference on the graph over time. Most existing reasoning models ignore time information when learning entity and relation representations. For example, the fact (Scarlett Johansson, spouseOf, Ryan Reynolds) was true only during 2008-2011. To facilitate temporal reasoning, we present TA-TransR(ILP), which incorporates temporal information using RNNs and takes advantage of Integer Linear Programming (ILP). Specifically, we utilize a character-level long short-term memory network to encode relations together with sequences of temporal tokens, and combine it with a common reasoning model. To achieve more accurate reasoning, we further impose temporal consistency constraints on the basic model, which help assess the validity of a fact more reliably. We conduct entity prediction and relation prediction on the YAGO11k and Wikidata12k datasets. Experimental results demonstrate that TA-TransR(ILP) makes more accurate predictions by taking time information and temporal consistency constraints into account, and outperforms existing methods with a significant improvement of about 6-8% in Hits@10.
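To make the time-aware relation encoding concrete, the following is a minimal PyTorch sketch of one plausible reading of that step: a relation token followed by the characters of a timestamp is fed through an LSTM, and the final hidden state serves as the time-aware relation embedding. The class name, vocabulary layout, and token ids below are illustrative assumptions, not the exact components of the model described above.

```python
import torch
import torch.nn as nn


class TimeAwareRelationEncoder(nn.Module):
    """Encode a relation plus a sequence of character-level temporal
    tokens with an LSTM, yielding a single time-aware relation vector."""

    def __init__(self, num_tokens, embed_dim):
        super().__init__()
        self.token_embed = nn.Embedding(num_tokens, embed_dim)
        self.lstm = nn.LSTM(embed_dim, embed_dim, batch_first=True)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) -- a relation token followed by
        # the digit tokens of the timestamp, e.g. [spouseOf, 2, 0, 0, 8]
        x = self.token_embed(token_ids)
        _, (h_n, _) = self.lstm(x)
        return h_n.squeeze(0)  # (batch, embed_dim) time-aware relation


# Toy usage (assumed vocabulary): digit tokens 0-9 plus one relation token.
encoder = TimeAwareRelationEncoder(num_tokens=11, embed_dim=64)
# "spouseOf" is given id 10 here; the year 2008 becomes tokens 2, 0, 0, 8.
seq = torch.tensor([[10, 2, 0, 0, 8]])
r_t = encoder(seq)
print(r_t.shape)  # torch.Size([1, 64])
```

In a full model, this time-aware relation vector would replace the static relation embedding inside the underlying scoring function (here, a TransR-style model), and the ILP step would then filter candidate facts that violate temporal consistency constraints.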