Inspired by the generalization efficiency of the affinity and class probability-based fuzzy support vector machine (ACFSVM), a pair of class affinity and nonlinearly transformed class probability-based fuzzy least squares support vector machine (LS-SVM) approaches is proposed. The proposed approaches handle the class imbalance problem by employing cost-sensitive learning and by utilizing each sample's class probability, determined using a novel nonlinear probability equation that adapts to the class size. Further, sensitivity to outliers and noise is reduced by means of each sample's affinity to its own class, obtained with a least squares one-class support vector machine. The first proposed approach incorporates fuzzy membership values, computed from the transformed class probability and the class affinity, into the objective function of an LS-SVM-type formulation, and introduces a new cost-sensitive term based on the class cardinalities to counteract the effect of class imbalance. The inherent noise and outlier sensitivity of the quadratic least squares loss function of the first approach is further reduced in the second approach by truncating the quadratic growth of the loss function beyond a specified score, so that the concerns due to noise and outliers are also handled at the optimization level. However, the truncated loss function of the second approach is non-convex, which is resolved using the ConCave-Convex Procedure (CCCP) for global convergence. Numerical experiments on artificial and real-world datasets with different imbalance ratios establish the effectiveness of the proposed approaches. (C) 2022 Elsevier B.V. All rights reserved.
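
As a point of reference, the first approach's formulation resembles the standard weighted (fuzzy) LS-SVM primal shown below; here the membership s_i and the cost factor c_{y_i} play the roles the abstract assigns to the fuzzy membership values and the class-cardinality-based cost-sensitive term, but the paper's exact formulation may differ.

```latex
\min_{w,\, b,\, e}\ \frac{1}{2}\lVert w\rVert^{2}
  + \frac{C}{2}\sum_{i=1}^{N} c_{y_i}\, s_i\, e_i^{2}
\quad \text{s.t.}\quad
  y_i\bigl(w^{\top}\phi(x_i) + b\bigr) = 1 - e_i,\ \ i = 1,\dots,N,
```

where s_i is the fuzzy membership of sample i and c_{y_i} is inversely related to the cardinality of class y_i.

The following is a minimal sketch, not the authors' method, of how such per-sample weights could be assembled: scikit-learn's OneClassSVM stands in for the least squares one-class SVM, and the functions knn_class_probability, fuzzy_membership, and sample_weights, together with the class-size-dependent exponent, are hypothetical placeholders for the paper's nonlinear probability equation and membership design.

```python
# Sketch: fuzzy membership from (i) class affinity and (ii) a class-size-
# adjusted class probability, combined with an inverse-cardinality cost factor.
# Assumptions are noted inline; this is illustrative, not the paper's equations.
import numpy as np
from sklearn.svm import OneClassSVM
from sklearn.neighbors import NearestNeighbors

def class_affinity(X_cls, gamma=0.5, nu=0.1):
    """Affinity of each sample to its own class, mapped into (0, 1].
    (OneClassSVM is an assumed stand-in for the least squares one-class SVM.)"""
    oc = OneClassSVM(kernel="rbf", gamma=gamma, nu=nu).fit(X_cls)
    s = oc.decision_function(X_cls)        # larger = deeper inside the class
    return np.exp(s - s.max())             # monotone rescaling into (0, 1]

def knn_class_probability(X, y, label, k=5):
    """Fraction of a sample's k nearest neighbours sharing its label."""
    idx = np.where(y == label)[0]
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    _, nbrs = nn.kneighbors(X[idx])        # first neighbour is the sample itself
    return (y[nbrs[:, 1:]] == label).mean(axis=1)

def fuzzy_membership(X, y, label, n_total):
    """Membership = affinity * nonlinearly transformed class probability.
    Assumed transform: the exponent shrinks with class size, so minority
    samples are penalised less harshly for low neighbourhood purity."""
    idx = np.where(y == label)[0]
    aff = class_affinity(X[idx])
    p = knn_class_probability(X, y, label, k=5)
    power = len(idx) / n_total             # smaller class -> smaller exponent
    return aff * np.power(p, power)

def sample_weights(X, y):
    """Per-sample weights: fuzzy membership scaled by an inverse class-frequency
    cost factor, ready to multiply the squared-error terms of a weighted LS-SVM."""
    n = len(y)
    w = np.empty(n)
    for label in np.unique(y):
        idx = np.where(y == label)[0]
        cost = n / (2.0 * len(idx))        # cost-sensitive, cardinality-based factor
        w[idx] = cost * fuzzy_membership(X, y, label, n)
    return w
```

On an imbalanced toy set (e.g., make_classification with weights=[0.9, 0.1]), sample_weights assigns relatively larger weights to minority-class samples lying near their class core, which is the qualitative behaviour the abstract describes; the truncated-loss variant of the second approach would additionally cap each weighted squared-error term before the CCCP iterations.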