Least squares estimation, which directly yields the analytical solution that minimizes the mean square error (MSE), is one of the most effective regression tools. It has also been applied to many classification tasks because of its simplicity, clear physical interpretation, and tractability. However, there is a fundamental contradiction between regression on continuous samples and classification on discrete category labels, i.e., the output metric spaces are different. To address this contradiction in regression-based classification, this paper presents a new linear classifier, termed the penalized least squares classifier (PLSC), which gradually adds to the loss function a penalty on the distance between misclassified samples and the decision boundary. Fundamentally, the decision boundary is obtained by minimizing the MSE through iterative cost-sensitive learning. An enhanced nonlinear neural network classifier, PLSC-BP, is then formulated, in which the penalized least squares mechanism tunes the learning strategy of a conventional neural network by adjusting the cost factor of each sample. Extensive experiments on six synthetic datasets and eleven publicly available datasets show that classification tasks that cannot be handled by traditional least squares can be solved by a regression algorithm with iterative cost-sensitive learning. The proposed enhanced algorithm outperforms traditional neural network classifiers and other classifiers in terms of classification accuracy.
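The core idea described above, iteratively reweighting the least squares fit so that misclassified samples carry a growing cost, can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's actual implementation: the function name `fit_plsc`, the parameters `n_iters` and `penalty`, and the multiplicative cost-update rule are all illustrative choices.

```python
import numpy as np

def fit_plsc(X, y, n_iters=10, penalty=1.5):
    """Sketch of a penalized least squares classifier via
    iterative cost-sensitive (weighted) least squares.

    X: (n, d) feature matrix; y: (n,) labels in {-1, +1}.
    Returns the weight vector of a linear decision boundary
    (last entry is the bias term).
    """
    n, d = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])  # append a bias column
    c = np.ones(n)                        # per-sample cost factors
    w = None
    for _ in range(n_iters):
        # Weighted least squares: minimize sum_i c_i * (x_i^T w - y_i)^2,
        # solved in closed form via the weighted normal equations.
        W = np.diag(c)
        w = np.linalg.solve(Xb.T @ W @ Xb, Xb.T @ W @ y)
        # Cost-sensitive step: increase the penalty on samples that the
        # current decision boundary misclassifies (illustrative rule).
        miss = np.sign(Xb @ w) != y
        c[miss] *= penalty
    return w

# Usage on a toy two-class problem
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.concatenate([-np.ones(50), np.ones(50)])
w = fit_plsc(X, y)
Xb = np.hstack([X, np.ones((100, 1))])
acc = np.mean(np.sign(Xb @ w) == y)
```

The closed-form inner solve keeps the analytical tractability of least squares, while the outer loop supplies the discrete, classification-aware feedback that plain regression lacks.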