This paper is concerned with designing minimum average length uniquely decodable codes for a family of source distributions whose relative entropy with respect to a given nominal distribution is bounded above by a fixed number. This formulation leads to a minimax source coding problem, in which the minimizing player chooses the codeword lengths, while the maximizing player chooses the uncertain source distribution. It is further shown, via a Large Deviations duality relation, that the minimax source coding problem is equivalent to a coding problem that minimizes the average of an exponential pay-off instead of the conventional average-length pay-off. Both the Shannon and Huffman coding methods are generalized to this new criterion of performance, leading to robust codes that perform well with respect to an uncountable family of source distributions. The code lengths, however, are designed with respect to a known nominal distribution, obtained either through modeling assumptions or via empirical techniques. Simulations are also presented comparing the Shannon/Huffman codes with the robust codes when the source distribution is uncertain, and a sensitivity analysis shows that the new codes are much less sensitive to the uncertainty description. In addition, a fixed-length source coding theorem is derived in which the encoding error tends to zero as the length of the codes tends to infinity, uniformly over the set of uncertain distributions satisfying the relative entropy bound. Finally, generalizations to uncertainty described by the total variation norm are discussed.
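As a guide to the formulation, the following is a minimal sketch of the minimax problem and its Large Deviations dual, written in generic notation: the source distribution $p$, nominal distribution $q$, relative entropy bound $R$, and tilting parameter $t$ are illustrative symbols and not necessarily those used in the paper. The codeword lengths $\{l_i\}$ solve
\[
\min_{\{l_i\}} \; \max_{p \,:\, D(p\|q) \le R} \; \sum_i p_i \, l_i ,
\]
and the Donsker--Varadhan variational formula, $\log \sum_i q_i \, e^{t l_i} = \max_p \bigl\{\, t \sum_i p_i l_i - D(p\|q) \,\bigr\}$, yields, under mild conditions, the dual representation
\[
\max_{p \,:\, D(p\|q) \le R} \sum_i p_i \, l_i
 \;=\; \min_{t>0} \; \frac{1}{t} \Bigl( \log \sum_i q_i \, e^{t l_i} + R \Bigr),
\]
so the worst-case average length is governed by an exponential pay-off of the kind studied by Campbell. For a fixed $t$, lengths of the form $l_i = -\log \bigl( q_i^{\beta} / \sum_j q_j^{\beta} \bigr)$ with $\beta = 1/(1+t)$ minimize such a pay-off subject to the Kraft inequality, which is the tilted-distribution structure behind the generalized Shannon and Huffman constructions.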