The goal of regression analysis is to describe the relationship between an output y and a vector of inputs x. Least squares regression describes how the mean of y changes with x, i.e. it estimates the conditional mean function. Estimating a set of conditional quantile functions provides a more complete view of the relationship between y and x. Quantile regression [1] is one of the promising approaches to estimating conditional quantile functions. Several types of quantile regression estimators have been studied in the literature. In this paper, we are particularly concerned with kernel-based nonparametric quantile regression, formulated as a quadratic programming problem similar to those in the support vector machine literature [2]. A group of conditional quantile functions, say at the orders q = 0.1, 0.2, ..., 0.9, can provide a nonparametric description of the conditional probability density p(y | x). This requires solving many quadratic programming problems, which can be computationally demanding for large-scale problems. In this paper, inspired by the recently developed path-following strategy [3][4], we derive an algorithm that solves the sequence of quadratic programming problems for the entire range of quantile orders q ∈ (0, 1). Besides its computational efficiency, the derived algorithm provides a full nonparametric description of the conditional distribution p(y | x). A few examples are given to illustrate the algorithm.
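As a minimal illustration of the loss that underlies quantile regression [1] (the pinball, or check, loss), the following Python sketch shows that minimizing the average pinball loss over a constant recovers the empirical q-th quantile of y; the function names here are ours, chosen for illustration, and the full kernel-based method of this paper replaces the constant with a function of x.

```python
import numpy as np

def pinball_loss(residual, q):
    # Pinball (check) loss: q * r for r >= 0, (q - 1) * r for r < 0.
    return np.where(residual >= 0, q * residual, (q - 1.0) * residual)

def fit_constant_quantile(y, q):
    # Minimize the average pinball loss over candidate constants c;
    # a minimizer always lies on a data point, so searching the
    # sorted sample suffices. The result is the empirical q-quantile.
    grid = np.sort(y)
    losses = [pinball_loss(y - c, q).mean() for c in grid]
    return grid[int(np.argmin(losses))]

rng = np.random.default_rng(0)
y = rng.normal(size=2001)
for q in (0.1, 0.5, 0.9):
    print(q, fit_constant_quantile(y, q), np.quantile(y, q))
```

Repeating this fit for a grid of orders q = 0.1, ..., 0.9 is exactly the "many problems" situation described above, which motivates following the solution path in q instead of solving each problem from scratch.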