We analyze the convergence of the sign algorithm for adaptive filtering when the input processes are uncorrelated and Gaussian and a fixed step size mu > 0 is used. Exact recursive equations for the covariance matrix of the deviation error are derived, and asymptotic time-averaged convergence is established for the mean-absolute deviation error, the mean-square deviation error, and the mean-square estimation error of the signal. All of these results hold for arbitrary step size mu > 0.
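For reference, the sign algorithm updates the filter weights using only the sign of the estimation error; a standard form of the recursion (the notation below is assumed for illustration and is not taken from the paper) is

\[
w_{n+1} = w_n + \mu \,\operatorname{sign}(e_n)\, x_n,
\qquad
e_n = d_n - w_n^{\top} x_n,
\]

where $x_n$ is the input vector, $d_n$ the desired response, $w_n$ the weight estimate, and $\mu > 0$ the fixed step size. The deviation error analyzed in the paper is the gap between $w_n$ and the optimal weight vector.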