In this article, we try to answer the simple question: "What is the optimal growth rate of the dimension $p$ as a function of the sample size $n$ for which the Central Limit Theorem (CLT) holds uniformly over the collection of $p$-dimensional hyper-rectangles?" Specifically, we are interested in the normal approximation of suitably scaled versions of the sum $\sum_{i=1}^{n} X_i$ in $\mathbb{R}^p$ uniformly over the class of hyper-rectangles $\mathcal{A}^{re} = \big\{\prod_{j=1}^{p}\big([a_j, b_j] \cap \mathbb{R}\big) : -\infty \le a_j \le b_j \le \infty,\ j = 1, \dots, p\big\}$, where $X_1, \dots, X_n$ are independent $p$-dimensional random vectors, each having independent and identically distributed (iid) components. We investigate the optimal cut-off rate of $\log p$ below which the uniform CLT holds and above which it fails. By recent results of Chernozhukov et al. [Ann. Probab. 45 (2017), pp. 2309-2352], the CLT holds uniformly over $\mathcal{A}^{re}$ if $\log p = o(n^{1/7})$. They also conjectured that the optimal rate for the CLT to hold uniformly over $\mathcal{A}^{re}$ is $\log p = o(n^{1/3})$. We show instead that, under suitable conditions on the even moments and vanishing odd moments, the CLT holds uniformly over $\mathcal{A}^{re}$ when $\log p = o(n^{1/2})$. More precisely, we show that if $\log p = \epsilon \sqrt{n}$ for some sufficiently small $\epsilon > 0$, then the normal approximation is valid with an error $\epsilon$, uniformly over $\mathcal{A}^{re}$. Further, we show by an example that the uniform CLT over $\mathcal{A}^{re}$ fails if $\limsup_{n \to \infty} n^{-(1/2+\delta)} \log p > 0$ for some $\delta > 0$. Therefore, under these moment conditions, the optimal growth rate of $p$ for the validity of the CLT is $\log p = o(n^{1/2})$.
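To make the notion of a uniform CLT over $\mathcal{A}^{re}$ concrete, the display below is a schematic sketch of the quantity being controlled; the symbols $S_n$, $Z$, and $\Delta_n$ are introduced here only for illustration, and the exact centering, scaling, and covariance assumptions are those stated in the article.
% Schematic of the uniform normal approximation error over hyper-rectangles.
% S_n, Z, and \Delta_n are illustrative notation, not taken verbatim from the paper;
% Z denotes a centered Gaussian vector in R^p whose covariance matches that of S_n
% (assuming mean-zero X_i).
\[
  \Delta_n \;=\; \sup_{A \in \mathcal{A}^{re}}
    \Big| \mathbb{P}\big( S_n \in A \big) \;-\; \mathbb{P}\big( Z \in A \big) \Big|,
  \qquad
  S_n \;=\; n^{-1/2} \sum_{i=1}^{n} X_i .
\]
In this notation, the uniform CLT over $\mathcal{A}^{re}$ means $\Delta_n \to 0$ as $n \to \infty$, and the result described above gives an error of order $\epsilon$ for $\Delta_n$ when $\log p = \epsilon \sqrt{n}$ for sufficiently small $\epsilon > 0$, under the stated moment conditions.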