In recent years, several differentiable histogram layers have been proposed that are suitable for end-to-end training in a convolutional neural network (CNN) model. In this paper, we examine the use of a closely related kernel density estimation (KDE) layer to represent a probability distribution, in combination with two novel layers for principled probabilistic reasoning: one based on naive Bayes classification, the other on the nonparametric Kolmogorov-Smirnov (KS) hypothesis test. We show excellent texture classification results using compact "NBKDE-CNN" and "Kaskade-CNN" networks that combine KDE and probabilistic reasoning layers.
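To make the core idea concrete, the sketch below shows one common way such a differentiable KDE layer can be realized with Gaussian kernels in JAX. This is a minimal illustration, not the paper's implementation: the function name `kde_layer`, the fixed evaluation grid, and the bandwidth value are illustrative assumptions.

```python
import jax
import jax.numpy as jnp

def kde_layer(features, centers, bandwidth):
    """Differentiable KDE: summarize N scalar feature responses as a
    density estimate evaluated at K fixed sample points `centers`.
    Gaussian kernels keep the operation smooth, so gradients flow
    back through it to earlier CNN layers during training."""
    # Pairwise differences between every feature and every center.
    diffs = features[:, None] - centers[None, :]            # shape (N, K)
    kernels = jnp.exp(-0.5 * (diffs / bandwidth) ** 2)      # Gaussian kernel
    density = kernels.mean(axis=0) / (bandwidth * jnp.sqrt(2 * jnp.pi))
    return density / (density.sum() + 1e-8)                 # normalize to a distribution

# Example: 256 feature responses summarized as a 32-point distribution.
key = jax.random.PRNGKey(0)
feats = jax.random.normal(key, (256,))
centers = jnp.linspace(-3.0, 3.0, 32)
p = kde_layer(feats, centers, bandwidth=0.25)

# Gradients flow through the layer, e.g. w.r.t. the bandwidth:
grad_bw = jax.grad(lambda b: kde_layer(feats, centers, b)[16])(0.25)
```

Unlike a hard histogram, whose binning operation has zero gradient almost everywhere, the Gaussian kernels make the output differentiable in both the inputs and the bandwidth, which is what permits end-to-end training of such a layer inside a CNN.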